Sample records for functional programming model

  1. A Linear Programming Model to Optimize Various Objective Functions of a Foundation Type State Support Program.

    ERIC Educational Resources Information Center

    Matzke, Orville R.

    The purpose of this study was to formulate a linear programming model to simulate a foundation type support program and to apply this model to a state support program for the public elementary and secondary school districts in the State of Iowa. The model was successful in producing optimal solutions to five objective functions proposed for…

  2. Model driver screening and evaluation program. Volume 1, Project summary and model program recommendations

    DOT National Transportation Integrated Search

    2003-05-01

    This research project studied the feasibility as well as the scientific validity and utility of performing functional capacity screening with older drivers. A Model Program was described encompassing procedures to detect functionally impaired drivers...

  3. Fuzzy bi-objective linear programming for portfolio selection problem with magnitude ranking function

    NASA Astrophysics Data System (ADS)

    Kusumawati, Rosita; Subekti, Retno

    2017-04-01

    The fuzzy bi-objective linear programming (FBOLP) model is a bi-objective linear programming model over fuzzy numbers, in which the coefficients of the equations are fuzzy numbers. The model is proposed to solve the portfolio selection problem, generating an asset portfolio with the lowest risk and the highest expected return. The FBOLP model with normal fuzzy numbers for the risk and expected return of stocks is transformed into a linear programming (LP) model using a magnitude ranking function.

  4. Possible Content Areas for Implementation of the Basic Life Functions Instructional Program Model.

    ERIC Educational Resources Information Center

    Wisconsin State Dept. of Public Instruction, Madison. Div. for Handicapped Children.

    Identified are curricular items intended to develop skills pertinent to the 12 broad instructional objectives of the Basic Life Functions Instructional Program Model, a program for trainable mentally retarded children. The 12 instructional objectives are: communicating ideas, self-understanding, interacting with others, traveling, adapting to and…

  5. Model driver screening and evaluation program. Volume 2, Maryland pilot older driver study

    DOT National Transportation Integrated Search

    2003-05-01

    This research project studied the feasibility as well as the scientific validity and utility of performing functional capacity screening with older drivers. A Model Program was described encompassing procedures to detect functionally impaired drivers...

  6. Scheduling Algorithm for Mission Planning and Logistics Evaluation (SAMPLE). Volume 3: The GREEDY algorithm

    NASA Technical Reports Server (NTRS)

    Dupnick, E.; Wiggins, D.

    1980-01-01

    The functional specifications, functional design and flow, and the program logic of the GREEDY computer program are described. The GREEDY program is a submodule of the Scheduling Algorithm for Mission Planning and Logistics Evaluation (SAMPLE) program and has been designed as a continuation of the shuttle Mission Payloads (MPLS) program. The MPLS uses input payload data to form a set of feasible payload combinations; from these, GREEDY selects a subset of combinations (a traffic model) so all payloads can be included without redundancy. The program also provides a tutorial option so that the user can choose an alternate traffic model in case a particular traffic model is unacceptable.

  7. Transfer-function-parameter estimation from frequency response data: A FORTRAN program

    NASA Technical Reports Server (NTRS)

    Seidel, R. C.

    1975-01-01

    A FORTRAN computer program designed to fit a linear transfer function model to given frequency response magnitude and phase data is presented. A conjugate gradient search is used that minimizes the integral of the absolute value of the error squared between the model and the data. The search is constrained to ensure model stability. A scaling of the model parameters by their own magnitude aids search convergence. Efficient computer algorithms result in a small and fast program suitable for a minicomputer. A sample problem with different model structures and parameter estimates is reported.
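    The fitting idea in this record can be sketched compactly. The Python fragment below is an illustrative stand-in for the FORTRAN program: it fits the magnitude of a first-order transfer function K/(τs + 1) to synthetic frequency-response data using a crude grid search instead of the constrained conjugate-gradient search the paper describes; the parameter ranges and function names are assumptions, not the paper's.

```python
import math

def model_mag(k_gain, tau, w):
    # magnitude of the first-order transfer function K / (tau*s + 1) at s = j*w
    return k_gain / math.sqrt((w * tau) ** 2 + 1.0)

def fit_first_order(freqs, mags):
    """Crude grid search minimizing the summed squared magnitude error --
    a stand-in for the paper's constrained conjugate-gradient search.
    Restricting tau > 0 plays the role of the stability constraint."""
    best = None
    for i in range(1, 51):
        k_gain = 0.1 * i            # candidate gains 0.1 .. 5.0
        for j in range(1, 51):
            tau = 0.01 * j          # candidate time constants 0.01 .. 0.5
            err = sum((model_mag(k_gain, tau, w) - m) ** 2
                      for w, m in zip(freqs, mags))
            if best is None or err < best[0]:
                best = (err, k_gain, tau)
    return best[1], best[2]

# synthetic "measured" magnitude data generated from K = 2.0, tau = 0.25
freqs = [0.5 * k for k in range(1, 20)]
mags = [model_mag(2.0, 0.25, w) for w in freqs]
k_hat, tau_hat = fit_first_order(freqs, mags)
```

    With noise-free data the grid search recovers the generating parameters; a gradient search like the paper's would do the same with far fewer function evaluations.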

  8. Using State Merging and State Pruning to Address the Path Explosion Problem Faced by Symbolic Execution

    DTIC Science & Technology

    2014-06-19

    urgent and compelling. Recent efforts in this area automate program analysis techniques using model checking and symbolic execution [2, 5–7]. These...bounded model checking tool for x86 binary programs developed at the Air Force Institute of Technology (AFIT). Jiseki creates a bit-vector logic model based...assume there are n different paths through the function foo. The program could potentially call the function foo a bounded number of times, resulting in n

  9. An Overview of R in Health Decision Sciences.

    PubMed

    Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam

    2017-10-01

    As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. A large community of users who have generated an extensive collection of well-documented packages and functions supports it. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.

  10. A Functional Model for Management of Large Scale Assessments.

    ERIC Educational Resources Information Center

    Banta, Trudy W.; And Others

    This functional model for managing large-scale program evaluations was developed and validated in connection with the assessment of Tennessee's Nutrition Education and Training Program. Management of such a large-scale assessment requires the development of a structure for the organization; distribution and recovery of large quantities of…

  11. Automation of reliability evaluation procedures through CARE - The computer-aided reliability estimation program.

    NASA Technical Reports Server (NTRS)

    Mathur, F. P.

    1972-01-01

    Description of an on-line interactive computer program called CARE (Computer-Aided Reliability Estimation) which can model self-repair and fault-tolerant organizations and perform certain other functions. Essentially CARE consists of a repository of mathematical equations defining the various basic redundancy schemes. These equations, under program control, are then interrelated to generate the desired mathematical model to fit the architecture of the system under evaluation. The mathematical model is then supplied with ground instances of its variables and is then evaluated to generate values for the reliability-theoretic functions applied to the model.
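    CARE's repository of redundancy equations can be illustrated with two textbook schemes. The sketch below is not CARE's actual equation library; it simply evaluates the standard reliability formulas for triple modular redundancy (with a perfect voter) and simple parallel redundancy, given a module reliability R.

```python
def r_simplex(r):
    # reliability of a single non-redundant (simplex) module
    return r

def r_tmr(r):
    # triple modular redundancy with a perfect voter:
    # the system works if at least 2 of the 3 modules work
    return 3 * r**2 - 2 * r**3

def r_parallel(r, n):
    # n independent modules in parallel; the system fails only if all fail
    return 1 - (1 - r) ** n
```

    A program like CARE interrelates such closed-form schemes to build a model of a full architecture; note that TMR improves on a simplex module only when R exceeds 0.5.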

  12. Systems approach provides management control of complex programs

    NASA Technical Reports Server (NTRS)

    Dudek, E. F., Jr.; Mc Carthy, J. F., Jr.

    1970-01-01

    Integrated program management process provides management visual assistance through three interrelated charts: a system model that identifies each function to be performed, a matrix that identifies personnel responsibilities for these functions, and a process chart that breaks the functions down into discrete tasks.

  13. MODFLOW Ground-Water Model - User Guide to the Subsidence and Aquifer-System Compaction Package (SUB-WT) for Water-Table Aquifers

    USGS Publications Warehouse

    Leake, S.A.; Galloway, D.L.

    2007-01-01

    A new computer program was developed to simulate vertical compaction in models of regional ground-water flow. The program simulates ground-water storage changes and compaction in discontinuous interbeds or in extensive confining units, accounting for stress-dependent changes in storage properties. The new program is a package for MODFLOW, the U.S. Geological Survey modular finite-difference ground-water flow model. Several features of the program make it useful for application in shallow, unconfined flow systems. Geostatic stress can be treated as a function of water-table elevation, and compaction is a function of computed changes in effective stress at the bottom of a model layer. Thickness of compressible sediments in an unconfined model layer can vary in proportion to saturated thickness.
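    The stress treatment described here rests on the Terzaghi relation (effective stress = geostatic stress minus pore pressure). The sketch below is a hedged, minimal illustration of that relation for a falling water table, not SUB-WT's actual formulation; the unit weights are assumed values.

```python
GAMMA_W = 9.81      # unit weight of water, kN/m^3
GAMMA_MOIST = 17.0  # assumed moist unit weight above the water table, kN/m^3
GAMMA_SAT = 20.0    # assumed saturated unit weight below the water table, kN/m^3

def effective_stress(depth, wt_depth):
    """Vertical effective stress (kPa) at `depth` (m below land surface)
    for a water table at `wt_depth` (m). Terzaghi: sigma' = sigma - u."""
    if depth <= wt_depth:
        return GAMMA_MOIST * depth
    total = GAMMA_MOIST * wt_depth + GAMMA_SAT * (depth - wt_depth)
    pore = GAMMA_W * (depth - wt_depth)
    return total - pore
```

    Lowering the water table raises effective stress at the bottom of a model layer, which is what drives the compaction the package computes.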

  14. Human operator identification model and related computer programs

    NASA Technical Reports Server (NTRS)

    Kessler, K. M.; Mohr, J. N.

    1978-01-01

    Four computer programs which provide computational assistance in the analysis of man/machine systems are reported. The programs are: (1) Modified Transfer Function Program (TF); (2) Time Varying Response Program (TVSR); (3) Optimal Simulation Program (TVOPT); and (4) Linear Identification Program (SCIDNT). The TF program converts the time-domain state-variable system representation to a frequency-domain transfer-function representation. The TVSR program computes time histories of the input/output responses of the human operator model. The TVOPT program is an optimal simulation program and is similar to TVSR in that it produces time histories of system states associated with an operator-in-the-loop system. The differences between the two programs are presented. The SCIDNT program is an open-loop identification code which operates on the simulated data from TVOPT (or TVSR) or real operator data from motion simulators.

  15. Software For Integer Programming

    NASA Technical Reports Server (NTRS)

    Fogle, F. R.

    1992-01-01

    Improved Exploratory Search Technique for Pure Integer Linear Programming Problems (IESIP) program optimizes objective function of variables subject to confining functions or constraints, using discrete optimization or integer programming. Enables rapid solution of problems up to 10 variables in size. Integer programming required for accuracy in modeling systems containing small number of components, distribution of goods, scheduling operations on machine tools, and scheduling production in general. Written in Borland's TURBO Pascal.
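    Because IESIP targets pure integer programs of at most 10 variables, even exhaustive enumeration is tractable. The sketch below is a brute-force stand-in for IESIP's exploratory search (the problem data are invented for illustration):

```python
from itertools import product

def solve_ilp(c, A, b, bounds):
    """Maximize c.x subject to A x <= b with x integer within bounds.
    Brute-force enumeration -- feasible for the small problems IESIP targets."""
    best_x, best_val = None, None
    ranges = [range(lo, hi + 1) for lo, hi in bounds]
    for x in product(*ranges):
        feasible = all(sum(a_i * x_i for a_i, x_i in zip(row, x)) <= rhs
                       for row, rhs in zip(A, b))
        if feasible:
            val = sum(ci * xi for ci, xi in zip(c, x))
            if best_val is None or val > best_val:
                best_x, best_val = x, val
    return best_x, best_val

# maximize 3x + 2y  subject to  x + y <= 4,  x <= 3,  0 <= x, y <= 4
x, v = solve_ilp([3, 2], [[1, 1], [1, 0]], [4, 3], [(0, 4), (0, 4)])
```

    A real integer-programming code prunes this search (branch and bound, cutting planes); enumeration just makes the problem statement concrete.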

  16. Inexact nonlinear improved fuzzy chance-constrained programming model for irrigation water management under uncertainty

    NASA Astrophysics Data System (ADS)

    Zhang, Chenglong; Zhang, Fan; Guo, Shanshan; Liu, Xiao; Guo, Ping

    2018-01-01

    An inexact nonlinear mλ-measure fuzzy chance-constrained programming (INMFCCP) model is developed for irrigation water allocation under uncertainty. Techniques of inexact quadratic programming (IQP), mλ-measure, and fuzzy chance-constrained programming (FCCP) are integrated into a general optimization framework. The INMFCCP model can deal with not only nonlinearities in the objective function, but also uncertainties presented as discrete intervals in the objective function, variables and left-hand side constraints and fuzziness in the right-hand side constraints. Moreover, this model improves upon the conventional fuzzy chance-constrained programming by introducing a linear combination of possibility measure and necessity measure with varying preference parameters. To demonstrate its applicability, the model is then applied to a case study in the middle reaches of Heihe River Basin, northwest China. An interval regression analysis method is used to obtain interval crop water production functions in the whole growth period under uncertainty. Therefore, more flexible solutions can be generated for optimal irrigation water allocation. The variation of results can be examined by giving different confidence levels and preference parameters. Besides, it can reflect interrelationships among system benefits, preference parameters, confidence levels and the corresponding risk levels. Comparison between interval crop water production functions and deterministic ones based on the developed INMFCCP model indicates that the former is capable of reflecting more complexities and uncertainties in practical application. These results can provide more reliable scientific basis for supporting irrigation water management in arid areas.

  17. Application of digital computer APU modeling techniques to control system design.

    NASA Technical Reports Server (NTRS)

    Bailey, D. A.; Burriss, W. L.

    1973-01-01

    Study of the required controls for a H2-O2 auxiliary power unit (APU) technology program for the Space Shuttle. A steady-state system digital computer program was prepared and used to optimize initial system design. Analytical models of each system component were included. The program was used to solve a nineteen-dimensional problem, and then time-dependent differential equations were added to the computer program to simulate transient APU system and control. Some system parameters were considered quasi-steady-state, and others were treated as differential variables. The dynamic control analysis proceeded from initial ideal control modeling (which considered one control function and assumed the others to be ideal), stepwise through the system (adding control functions), until all of the control functions and their interactions were considered. In this way, the adequacy of the final control design over the required wide range of APU operating conditions was established.

  18. Use of DAVID algorithms for gene functional classification in a non-model organism, rainbow trout

    USDA-ARS?s Scientific Manuscript database

    Gene functional clustering is essential in transcriptome data analysis but software programs are not always suitable for use with non-model species. The DAVID Gene Functional Classification Tool has been widely used for soft clustering in model species, but requires adaptations for use in non-model ...

  19. An Evidence-Based Construction of the Models of Decline of Functioning. Part 1: Two Major Models of Decline of Functioning

    ERIC Educational Resources Information Center

    Okawa, Yayoi; Nakamura, Shigemi; Kudo, Minako; Ueda, Satoshi

    2009-01-01

    The purpose of this study is to confirm the working hypothesis on two major models of functioning decline and two corresponding models of rehabilitation program in an older population through detailed interviews with the persons who have functioning declines and on-the-spot observations of key activities on home visits. A total of 542…

  20. The VIS-AD data model: Integrating metadata and polymorphic display with a scientific programming language

    NASA Technical Reports Server (NTRS)

    Hibbard, William L.; Dyer, Charles R.; Paul, Brian E.

    1994-01-01

    The VIS-AD data model integrates metadata about the precision of values, including missing data indicators and the way that arrays sample continuous functions, with the data objects of a scientific programming language. The data objects of this data model form a lattice, ordered by the precision with which they approximate mathematical objects. We define a similar lattice of displays and study visualization processes as functions from data lattices to display lattices. Such functions can be applied to visualize data objects of all data types and are thus polymorphic.

  1. Development of an Operational Model for the Application of Planning-Programming-Budgeting Systems in Local School Districts. Program Budgeting Note 1, Introduction to Program Budgeting.

    ERIC Educational Resources Information Center

    State Univ. of New York, Buffalo. Western New York School Study Council.

    Although the public is best served by governmental agencies which have integrated the major functions of planning, managing, and budgeting, it can be asserted that the planning function is paramount. A review of the evolution of public agency administration in the U.S. reveals that until recent years the planning function has been largely…

  2. Research Program for Vibration Control in Structures

    NASA Technical Reports Server (NTRS)

    Mingori, D. L.; Gibson, J. S.

    1986-01-01

    Purpose of program to apply control theory to large space structures (LSS's) and design practical compensator for suppressing vibration. Program models LSS as distributed system. Control theory applied to produce compensator described by functional gains and transfer functions. Used for comparison of robustness of low- and high-order compensators that control surface vibrations of realistic wrap-rib antenna. Program written in FORTRAN for batch execution.

  3. LENMODEL: A forward model for calculating length distributions and fission-track ages in apatite

    NASA Astrophysics Data System (ADS)

    Crowley, Kevin D.

    1993-05-01

    The program LENMODEL is a forward model for annealing of fission tracks in apatite. It provides estimates of the track-length distribution, fission-track age, and areal track density for any user-supplied thermal history. The program approximates the thermal history, in which temperature is represented as a continuous function of time, by a series of isothermal steps of various durations. Equations describing the production of tracks as a function of time and annealing of tracks as a function of time and temperature are solved for each step. The step calculations are summed to obtain estimates for the entire thermal history. Computational efficiency is maximized by performing the step calculations backwards in model time. The program incorporates an intuitive and easy-to-use graphical interface. Thermal history is input to the program using a mouse. Model options are specified by selecting context-sensitive commands from a bar menu. The program allows for considerable selection of equations and parameters used in the calculations. The program was written for PC-compatible computers running DOS 3.0 or above (and Windows 3.0 or above) with VGA or SVGA graphics and a Microsoft-compatible mouse. Single copies of a runtime version of the program are available from the author by written request as explained in the last section of this paper.
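    The isothermal-step scheme can be sketched generically. The fragment below uses a simple first-order Arrhenius decay as the annealing law, which is an illustrative assumption, not one of the annealing models LENMODEL actually offers; the activation energy and frequency factor are invented. It shows how a cohort of tracks formed during step i accumulates annealing through every later step.

```python
import math

def anneal_fraction(temp_c, dt_myr, e_kj=40.0, a_freq=1.0e4):
    """Fractional track-length retention over one isothermal step,
    using a first-order Arrhenius decay law (an illustrative stand-in
    for LENMODEL's annealing equations)."""
    temp_k = temp_c + 273.15
    rate = a_freq * math.exp(-e_kj * 1000.0 / (8.314 * temp_k))  # per Myr
    return math.exp(-rate * dt_myr)

def length_distribution(history, l0=16.0):
    """history: (temperature degC, duration Myr) isothermal steps, oldest
    first. A cohort of tracks formed during step i anneals through steps
    i..end, so older cohorts end up shorter."""
    lengths = []
    for i in range(len(history)):
        frac = 1.0
        for temp_c, dt in history[i:]:
            frac *= anneal_fraction(temp_c, dt)
        lengths.append(l0 * frac)
    return lengths

# a monotonic cooling history: 120 degC -> 80 degC -> 40 degC
history = [(120.0, 10.0), (80.0, 10.0), (40.0, 10.0)]
lengths = length_distribution(history)
```

    Running the cohort loop from the youngest step backwards, as LENMODEL does, lets the per-step factors be reused rather than recomputed for each cohort.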

  4. STEP and STEPSPL: Computer programs for aerodynamic model structure determination and parameter estimation

    NASA Technical Reports Server (NTRS)

    Batterson, J. G.

    1986-01-01

    The successful parametric modeling of the aerodynamics for an airplane operating at high angles of attack or sideslip is performed in two phases. First the aerodynamic model structure must be determined and second the associated aerodynamic parameters (stability and control derivatives) must be estimated for that model. The purpose of this paper is to document two versions of a stepwise regression computer program which were developed for the determination of airplane aerodynamic model structure and to provide two examples of their use on computer generated data. References are provided for the application of the programs to real flight data. The two computer programs that are the subject of this report, STEP and STEPSPL, are written in FORTRAN IV (ANSI 1966) compatible with a CDC FTN4 compiler. Both programs are adaptations of a standard forward stepwise regression algorithm. The purpose of the adaptation is to facilitate the selection of an adequate mathematical model of the aerodynamic force and moment coefficients of an airplane from flight test data. The major difference between STEP and STEPSPL is in the basis for the model. The basis for the model in STEP is the standard polynomial Taylor's series expansion of the aerodynamic function about some steady-state trim condition. Program STEPSPL utilizes a set of spline basis functions.
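    The core of forward stepwise regression is a greedy loop: at each step, add the candidate regressor that most reduces the residual sum of squares. The sketch below is a simplified version of that idea (it removes only the univariate projection at each step and omits the F-test entry/exit criteria a program like STEP uses); the synthetic data are invented for illustration.

```python
def forward_stepwise(X, y, k=2):
    """Greedy forward selection: at each step add the regressor giving the
    largest reduction in residual sum of squares. A simplification of the
    standard stepwise algorithm STEP and STEPSPL adapt."""
    def center(v):
        m = sum(v) / len(v)
        return [vi - m for vi in v]

    Xc = [center(col) for col in X]   # center so the intercept drops out
    r = center(y)                     # current residual
    selected = []
    for _ in range(k):
        best_j, best_red, best_b = None, -1.0, 0.0
        for j, col in enumerate(Xc):
            if j in selected:
                continue
            sxx = sum(c * c for c in col)
            sxy = sum(c * ri for c, ri in zip(col, r))
            red = sxy * sxy / sxx     # drop in residual sum of squares
            if red > best_red:
                best_j, best_red, best_b = j, red, sxy / sxx
        selected.append(best_j)
        col = Xc[best_j]
        r = [ri - best_b * c for ri, c in zip(r, col)]
    return selected

# y depends on columns 0 and 2 only; column 1 is an alternating distractor
xs = list(range(20))
X = [xs, [(-1) ** i for i in xs], [i * i for i in xs]]
y = [2 * i - 3 * i * i for i in xs]
sel = forward_stepwise(X, y, k=2)
```

    With correlated regressors the greedy residual update is not a full least-squares refit, which is one reason production codes recompute the regression and apply an F-test at every step.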

  5. Development of Regional Supply Functions and a Least-Cost Model for Allocating Water Resources in Utah: A Parametric Linear Programming Approach.

    DTIC Science & Technology

    SYSTEMS ANALYSIS, * WATER SUPPLIES, MATHEMATICAL MODELS, OPTIMIZATION, ECONOMICS, LINEAR PROGRAMMING, HYDROLOGY, REGIONS, ALLOCATIONS, RESTRAINT, RIVERS, EVAPORATION, LAKES, UTAH, SALVAGE, MINES(EXCAVATIONS).

  6. The Component Model of Infrastructure: A Practical Approach to Understanding Public Health Program Infrastructure

    PubMed Central

    Snyder, Kimberly; Rieker, Patricia P.

    2014-01-01

    Functioning program infrastructure is necessary for achieving public health outcomes. It is what supports program capacity, implementation, and sustainability. The public health program infrastructure model presented in this article is grounded in data from a broader evaluation of 18 state tobacco control programs and previous work. The newly developed Component Model of Infrastructure (CMI) addresses the limitations of a previous model and contains 5 core components (multilevel leadership, managed resources, engaged data, responsive plans and planning, networked partnerships) and 3 supporting components (strategic understanding, operations, contextual influences). The CMI is a practical, implementation-focused model applicable across public health programs, enabling linkages to capacity, sustainability, and outcome measurement. PMID:24922125

  7. Investigation of the applicability of a functional programming model to fault-tolerant parallel processing for knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Harper, Richard

    1989-01-01

    In a fault-tolerant parallel computer, a functional programming model can facilitate distributed checkpointing, error recovery, load balancing, and graceful degradation. Such a model has been implemented on the Draper Fault-Tolerant Parallel Processor (FTPP). When used in conjunction with the FTPP's fault detection and masking capabilities, this implementation results in a graceful degradation of system performance after faults. Three graceful degradation algorithms have been implemented and are presented. A user interface has been implemented which requires minimal cognitive overhead by the application programmer, masking such complexities as the system's redundancy, distributed nature, variable complement of processing resources, load balancing, fault occurrence and recovery. This user interface is described and its use demonstrated. The applicability of the functional programming style to the Activation Framework, a paradigm for intelligent systems, is then briefly described.

  8. User’s Guide for COMBIMAN Programs (COMputerized BIomechanical MAN-Model). Version 4.

    DTIC Science & Technology

    1981-01-01

    Excerpts from the table of contents: 2.2.23 STATE SWITCH Function (PFK29); 2.2.24 RESTART PROGRAM Function (PFK30); 2.2.25 END PROGRAM Function (PFK31); 2.3 EXECUTING THE JOB; 40 Transformation Equation Developed for Positioning Stomach Link (Set State Switch 72 ON); 41 Job Control Cards to Execute CBM04; ... 50b Program CBMAM Survey Member Dimension Definition Cards; 51 Example of Survey, or Type 1, Member; 52 Job Control Cards to Execute CBMAM.

  9. Simulation program for estimating statistical power of Cox's proportional hazards model assuming no specific distribution for the survival time.

    PubMed

    Akazawa, K; Nakamura, T; Moriguchi, S; Shimada, M; Nose, Y

    1991-07-01

    Small sample properties of the maximum partial likelihood estimates for Cox's proportional hazards model depend on the sample size, the true values of regression coefficients, covariate structure, censoring pattern and possibly baseline hazard functions. Therefore, it would be difficult to construct a formula or table to calculate the exact power of a statistical test for the treatment effect in any specific clinical trial. The simulation program, written in SAS/IML, described in this paper uses Monte-Carlo methods to provide estimates of the exact power for Cox's proportional hazards model. For illustrative purposes, the program was applied to real data obtained from a clinical trial performed in Japan. Since the program does not assume any specific function for the baseline hazard, it is, in principle, applicable to any censored survival data as long as they follow Cox's proportional hazards model.

  10. Nonlinear and Digital Man-machine Control Systems Modeling

    NASA Technical Reports Server (NTRS)

    Mekel, R.

    1972-01-01

    An adaptive modeling technique is examined by which controllers can be synthesized to provide corrective dynamics to a human operator's mathematical model in closed-loop control systems. The technique utilizes a class of Liapunov functions formulated for this purpose, Liapunov's stability criterion, and a model-reference system configuration. The Liapunov function is formulated to possess variable characteristics that take the identification dynamics into consideration. The time derivative of the Liapunov function generates the identification and control laws for the mathematical model system. These laws permit the realization of a controller which updates the human operator's mathematical model parameters so that model and human operator produce the same response when subjected to the same stimulus. A very useful feature is the development of a digital computer program which is easily implemented and modified concurrently with experimentation. The program permits the modeling process to interact with the experimentation process in a mutually beneficial way.
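    The parameter-update loop at the heart of such a scheme can be sketched with a simple discrete-time gradient (LMS) rule, theta <- theta + mu*e*x, which is an illustrative analogue of the Liapunov-derived update laws in the paper, not the paper's own algorithm; the "operator" here is a hypothetical linear map with invented parameters.

```python
import random

def lms_identify(inputs, outputs, n_params, mu=0.05, passes=50):
    """Least-mean-squares identification: drive the model output toward the
    observed output by updating parameters along the error gradient."""
    theta = [0.0] * n_params
    for _ in range(passes):
        for x, y in zip(inputs, outputs):
            y_hat = sum(t * xi for t, xi in zip(theta, x))
            e = y - y_hat                       # model-vs-operator error
            theta = [t + mu * e * xi for t, xi in zip(theta, x)]
    return theta

rng = random.Random(0)
true_theta = [1.5, -0.8]   # hypothetical operator-model parameters
xs = [[rng.uniform(-1, 1), rng.uniform(-1, 1)] for _ in range(200)]
ys = [sum(t * xi for t, xi in zip(true_theta, x)) for x in xs]
theta = lms_identify(xs, ys, 2)
```

    As the error shrinks, model and "operator" respond identically to the same stimulus, which is exactly the matching condition the adaptive controller enforces.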

  11. Computing quantum hashing in the model of quantum branching programs

    NASA Astrophysics Data System (ADS)

    Ablayev, Farid; Ablayev, Marat; Vasiliev, Alexander

    2018-02-01

    We investigate the branching program complexity of quantum hashing. We consider a quantum hash function that maps elements of a finite field into quantum states. We require that this function is preimage-resistant and collision-resistant. We consider two complexity measures for Quantum Branching Programs (QBP): a number of qubits and a number of computational steps. We show that the quantum hash function can be computed efficiently. Moreover, we prove that such QBP construction is optimal. That is, we prove lower bounds that match the constructed quantum hash function computation.
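    One common amplitude form of a quantum hash maps a field element a to a state (1/sqrt(d)) * sum_i (cos(2*pi*b_i*a/q)|0> + sin(2*pi*b_i*a/q)|1>)|i>, so the overlap of two hashes reduces to an average of cosines. The sketch below computes that overlap numerically; the field size and the parameter set b_i are arbitrary assumptions for illustration, not the paper's construction.

```python
import math

def hash_overlap(a1, a2, q, bs):
    """|<psi(a1)|psi(a2)>| for the amplitude-form quantum hash above.
    Using cos(x)cos(y) + sin(x)sin(y) = cos(x - y), the inner product
    collapses to an average of cosines of the input difference."""
    d = len(bs)
    return abs(sum(math.cos(2 * math.pi * b * (a1 - a2) / q) for b in bs) / d)

q = 97                                 # illustrative prime field size
bs = [1, 5, 17, 29, 41, 63, 74, 88]    # assumed parameter set
same = hash_overlap(13, 13, q, bs)     # identical inputs -> overlap 1
diff = max(hash_overlap(a, 0, q, bs) for a in range(1, q))
```

    Collision resistance corresponds to the worst-case overlap between distinct inputs staying bounded away from 1; good parameter sets make that bound small.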

  12. Nutritional programming of gastrointestinal tract development. Is the pig a good model for man?

    PubMed

    Guilloteau, Paul; Zabielski, Romuald; Hammon, Harald M; Metges, Cornelia C

    2010-06-01

    The consequences of early-life nutritional programming in man and other mammalian species have been studied chiefly at the metabolic level. Very few studies, if any, have been performed in the gastrointestinal tract (GIT) as the target organ, but extensive GIT studies are needed since the GIT plays a key role in nutrient supply and has an impact on functions of the entire organism. The possible deleterious effects of nutritional programming at the metabolic level were discovered following epidemiological studies in human subjects, and confirmed in animal models. Investigating the impact of programming on GIT structure and function would need appropriate animal models due to ethical restrictions in the use of human subjects. The aim of the present review is to discuss the use of pigs as an animal model as a compromise between ethically acceptable animal studies and the requirement of data which can be interpolated to the human situation. In nutritional programming studies, rodents are the most frequently used model for man, but GIT development and digestive function in rodents are considerably different from those in man. In that aspect, the pig GIT is much closer to the human than that of rodents. The swine species is closely comparable with man in many nutritional and digestive aspects, and thus provides ample opportunity to be used in investigations on the consequences of nutritional programming for the GIT. In particular, the 'sow-piglets' dyad could be a useful tool to simulate the 'human mother-infant' dyad in studies which examine short-, middle- and long-term effects and is suggested as the reference model.

  13. A Teaching Aid for Physiologists--Simulation of Kidney Function

    ERIC Educational Resources Information Center

    Packer, J. S.; Packer, J. E.

    1977-01-01

    Presented is the development of a simulation model of the facultative water transfer mechanism of the mammalian kidney. Discussion topics include simulation philosophy, simulation facilities, the model, and programming the model as a teaching aid. Graphs illustrate typical program displays. A listing of references concludes the article. (MA)

  14. Basic Life Functions Instructional Program Model. Field Copy.

    ERIC Educational Resources Information Center

    Wisconsin State Dept. of Public Instruction, Madison. Div. for Handicapped Children.

    Presented is a model, designed by the Wisconsin Department of Public Instruction, for development of an instructional program in basic living skills for trainable mentally retarded children (2- to 20-years-old). The model identifies the following instructional goals: to communicate ideas, to understand one's self and interact with others, to…

  15. Planning for School Transition: An Ecological-Developmental Approach.

    ERIC Educational Resources Information Center

    Diamond, Karen E.; And Others

    1988-01-01

    The paper describes an ecological-developmental model for planning a child's transition from a preschool special education program to a public school classroom. The model stresses interactions between the various environments in which the child functions. A description of a preschool transition program based on the model is also included.…

  16. Difficulties with True Interoperability in Modeling & Simulation

    DTIC Science & Technology

    2011-12-01

    2009. Programming Scala: Scalability = Functional Programming + Objects. 1st ed. O'Reilly Media. ...that develops a model or simulation has a specific purpose, set of requirements, and limited funding. These programs cannot afford to coordinate with...implementation. The program offices should budget and plan for coordination across domain projects within a limited scope to improve interoperability with

  17. FACTOR - FACTOR II. Departmental Program and Model Documentation 71-3.

    ERIC Educational Resources Information Center

    Wilson, Stanley; Billingsley, Ray

    This computer program is designed to optimize a Cobb-Douglas type of production function. The user of this program may choose isoquants and/or the expansion path for a Cobb-Douglas type of production function with up to nine resources. An expansion path is the combination of quantities of each resource that minimizes the cost at each production…
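    The least-cost bundle the expansion path traces has a closed form: minimizing total cost subject to Cobb-Douglas output Q = prod x_i^a_i gives x_i proportional to a_i/w_i, with the scale fixed by the output constraint. The sketch below implements that formula; the elasticities and prices are invented for illustration.

```python
import math

def least_cost_inputs(a, w, q_out):
    """Cost-minimizing input bundle for Cobb-Douglas output q = prod x_i^a_i
    at input prices w_i. The Lagrange conditions give x_i proportional to
    a_i / w_i; the common scale factor is fixed by the output constraint."""
    total_a = sum(a)
    base = math.prod((ai / wi) ** ai for ai, wi in zip(a, w))
    scale = (q_out / base) ** (1.0 / total_a)
    return [(ai / wi) * scale for ai, wi in zip(a, w)]

a = [0.3, 0.7]     # output elasticities (hypothetical)
w = [2.0, 5.0]     # input prices (hypothetical)
q_out = 10.0
x = least_cost_inputs(a, w, q_out)
```

    Along the expansion path, w_i * x_i / a_i is equal across all inputs; sweeping q_out traces out the path, which is what the FACTOR programs tabulate for up to nine resources.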

  18. Understanding Portability of a High-Level Programming Model on Contemporary Heterogeneous Architectures

    DOE PAGES

    Sabne, Amit J.; Sakdhnagool, Putt; Lee, Seyong; ...

    2015-07-13

    Accelerator-based heterogeneous computing is gaining momentum in the high-performance computing arena. However, the increased complexity of heterogeneous architectures demands more generic, high-level programming models. OpenACC is one such attempt to tackle this problem. Although the abstraction provided by OpenACC offers productivity, it raises questions concerning both functional and performance portability. In this article, the authors propose HeteroIR, a high-level, architecture-independent intermediate representation, to map high-level programming models, such as OpenACC, to heterogeneous architectures. They present a compiler approach that translates OpenACC programs into HeteroIR and accelerator kernels to obtain OpenACC functional portability. They then evaluate the performance portability obtained by OpenACC with their approach on 12 OpenACC programs on Nvidia CUDA, AMD GCN, and Intel Xeon Phi architectures. They study the effects of various compiler optimizations and OpenACC program settings on these architectures to provide insights into the achieved performance portability.

  19. Representing and reasoning about program in situation calculus

    NASA Astrophysics Data System (ADS)

    Yang, Bo; Zhang, Ming-yi; Wu, Mao-nian; Xie, Gang

    2011-12-01

    Situation calculus is an expressive tool for modeling dynamical systems in artificial intelligence; changes in a dynamical world are represented naturally by the notions of action, situation, and fluent. A program can be viewed as a discrete dynamical system, so it is possible to model programs with situation calculus. To model programs written in a small core programming language CL, the notion of fluent is extended to represent the value of an expression. Together with some functions returning the relevant objects from expressions, a basic action theory of CL programming is constructed. Under such a theory, properties of programs such as correctness and termination can be reasoned about.

  20. Construction of Hierarchical Models for Fluid Dynamics in Earth and Planetary Sciences : DCMODEL project

    NASA Astrophysics Data System (ADS)

    Takahashi, Y. O.; Takehiro, S.; Sugiyama, K.; Odaka, M.; Ishiwatari, M.; Sasaki, Y.; Nishizawa, S.; Ishioka, K.; Nakajima, K.; Hayashi, Y.

    2012-12-01

    Toward the understanding of fluid motions of planetary atmospheres and planetary interiors through multiple numerical experiments with multiple models, we are proceeding with the ``dcmodel project'', in which a series of hierarchical numerical models of varying complexity is developed and maintained. In the ``dcmodel project'', the numerical models are developed with attention to the following points: 1) a common ``style'' of program code assuring readability of the software, 2) release of the models' source code to the public, 3) scalability of the models assuring execution on various scales of computational resources, and 4) thorough documentation, together with a method for writing reference manuals. The lineup of the models and utility programs of the project is as follows: Gtool5, ISPACK/SPML, SPMODEL, Deepconv, Dcpam, and Rdoc-f95. The features of each component are briefly described below. Gtool5 (Ishiwatari et al., 2012) is a Fortran90 library that provides data input/output interfaces and various utilities commonly used in the models of the dcmodel project. The self-descriptive data format netCDF is adopted as the IO format of Gtool5. The interfaces of the gtool5 library reduce the number of operation steps for data IO in the program code of the models compared with the interfaces of the raw netCDF library. Further, with the gtool5 library, procedures for data IO and the addition of metadata for post-processing can easily be implemented in the program code in a consolidated form independent of the size and complexity of the models. ``ISPACK'' is the spectral transformation library, and ``SPML (SPMODEL library)'' (Takehiro et al., 2006) is its wrapper library. The most prominent feature of SPML is a series of array-handling functions with systematic naming rules, which enables us to write code in a form easily deduced from the mathematical expressions of the governing equations. ``SPMODEL'' (Takehiro et al., 2006) is a collection of various sample programs using ``SPML''. These sample programs provide a starting kit for simple numerical experiments in geophysical fluid dynamics. For example, SPMODEL includes a 1-dimensional KdV equation model; 2-dimensional barotropic, shallow-water, and Boussinesq models; and 3-dimensional MHD dynamo models in rotating spherical shells. These models are written in a common style in harmony with the SPML functions. ``Deepconv'' (Sugiyama et al., 2010) and ``Dcpam'' are a cloud-resolving model and a general circulation model, respectively, intended for application to planetary atmospheres. ``Deepconv'' includes several physical processes appropriate for simulations of the Jupiter and Mars atmospheres, while ``Dcpam'' does so for simulations of Earth, Mars, and Venus-like atmospheres. ``Rdoc-f95'' is an automatic generator of reference manuals for Fortran90/95 programs, an extension of the ruby documentation toolkit ``rdoc''. It analyzes the dependencies of modules, functions, and subroutines across multiple program source files, and it can also list the namelist variables in the programs.

  1. Evaluating the Facilities Planning, Design, and Construction Department: The Capital Programs Management Audit.

    ERIC Educational Resources Information Center

    Kaiser, Harvey H.; Kirkwood, Dennis M.

    2000-01-01

    Presents a diagnostic model for assessing the state of an institution's capital programs management (CPM) by delineating "work processes" which comprise that function. What capital programs management is, its resources, and its phases and work processes are described, followed by case studies of the CPM Process Model as an assessment tool. (GR)

  2. Gstat: a program for geostatistical modelling, prediction and simulation

    NASA Astrophysics Data System (ADS)

    Pebesma, Edzer J.; Wesseling, Cees G.

    1998-01-01

    Gstat is a computer program for variogram modelling, and geostatistical prediction and simulation. It provides a generic implementation of the multivariable linear model with trends modelled as a linear function of coordinate polynomials or of user-defined base functions, and independent or dependent, geostatistically modelled, residuals. Simulation in gstat comprises conditional or unconditional (multi-) Gaussian sequential simulation of point values or block averages, or (multi-) indicator sequential simulation. Besides many of the popular options found in other geostatistical software packages, gstat offers the unique combination of (i) an interactive user interface for modelling variograms and generalized covariances (residual variograms), which uses the device-independent plotting program gnuplot for graphical display, (ii) support for several ASCII and binary data and map file formats for input and output, (iii) a concise, intuitive and flexible command language, (iv) user customization of program defaults, (v) no built-in limits, and (vi) free, portable ANSI-C source code. This paper describes the class of problems gstat can solve, and addresses aspects of efficiency and implementation, managing geostatistical projects, and relevant technical details.
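
    The variogram modelling step that gstat performs interactively can be approximated in a few lines: fit a standard model (here the spherical model) to empirical semivariances by least squares. A sketch under assumed lag data (the numbers are illustrative, and this is not gstat's own fitting code):

```python
import numpy as np
from scipy.optimize import curve_fit

def spherical(h, nugget, sill, rng):
    """Spherical variogram model, one of the standard models
    geostatistical packages fit to empirical semivariances."""
    h = np.asarray(h, float)
    return np.where(
        h < rng,
        nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3),
        sill,  # beyond the range, the semivariance plateaus at the sill
    )

# Hypothetical empirical semivariances at a few lag distances.
lags = np.array([50.0, 100.0, 200.0, 300.0, 400.0, 600.0])
gamma = np.array([0.30, 0.55, 0.85, 0.95, 1.00, 1.00])

# Least-squares fit of nugget, sill, and range.
params, _ = curve_fit(spherical, lags, gamma, p0=[0.1, 1.0, 300.0])
```

    The fitted sill should land near the plateau of the empirical values (about 1.0 here); gstat additionally supports weighted fits and interactive adjustment.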

  3. Diet models with linear goal programming: impact of achievement functions.

    PubMed

    Gerdessen, J C; de Vries, J H M

    2015-11-01

    Diet models based on goal programming (GP) are valuable tools in designing diets that comply with nutritional, palatability and cost constraints. Results derived from GP models are usually very sensitive to the type of achievement function that is chosen. This paper aims to provide a methodological insight into several achievement functions. It describes the extended GP (EGP) achievement function, which enables the decision maker to use either a MinSum achievement function (which minimizes the sum of the unwanted deviations) or a MinMax achievement function (which minimizes the largest unwanted deviation), or a compromise between both. An additional advantage of EGP models is that multiple solutions can be obtained from one set of data and weights. We use small numerical examples to illustrate the 'mechanics' of achievement functions. Then, the EGP achievement function is demonstrated on a diet problem with 144 foods, 19 nutrients and several types of palatability constraints, in which the nutritional constraints are modeled with fuzzy sets. Choice of achievement function affects the results of diet models. MinSum achievement functions can give rise to solutions that are sensitive to weight changes, and that pile all unwanted deviations on a limited number of nutritional constraints. MinMax achievement functions spread the unwanted deviations as evenly as possible, but may create many (small) deviations. EGP comprises both types of achievement functions, as well as compromises between them. It can thus, from one data set, find a range of solutions with various properties.
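
    The MinSum and MinMax achievement functions differ only in the objective placed over the same deviation variables. A small sketch with two foods and two nutrient goals (illustrative numbers; `solve_gp` is a hypothetical helper, not the paper's 144-food model):

```python
import numpy as np
from scipy.optimize import linprog

# Two foods, two nutrient goals; a budget row caps the total amount of
# food so the goals cannot all be met exactly (illustrative numbers).
A = np.array([[2.0, 1.0],   # nutrient 1 per unit of each food
              [1.0, 3.0]])  # nutrient 2 per unit of each food
g = np.array([4.0, 6.0])    # nutrient goals
m, n = A.shape

def solve_gp(achievement="minsum"):
    """Return (sum of deviations, largest deviation) at the optimum."""
    # Variables: x (n foods), dneg (m), dpos (m) [, z for MinMax].
    nv = n + 2 * m + (1 if achievement == "minmax" else 0)
    c = np.zeros(nv)
    if achievement == "minsum":
        c[n:n + 2 * m] = 1.0  # minimize the sum of deviations
    else:
        c[-1] = 1.0           # minimize the largest deviation z
    # Goal rows: A x + dneg - dpos = g.
    A_eq = np.zeros((m, nv))
    A_eq[:, :n] = A
    A_eq[:, n:n + m] = np.eye(m)
    A_eq[:, n + m:n + 2 * m] = -np.eye(m)
    # Budget: x1 + x2 <= 1 makes the goals unattainable.
    A_ub = [np.r_[np.ones(n), np.zeros(nv - n)]]
    b_ub = [1.0]
    if achievement == "minmax":
        for j in range(2 * m):  # each deviation must stay below z
            row = np.zeros(nv)
            row[n + j] = 1.0
            row[-1] = -1.0
            A_ub.append(row)
            b_ub.append(0.0)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  A_eq=A_eq, b_eq=g, bounds=[(0, None)] * nv)
    d = res.x[n:n + 2 * m]
    return d.sum(), d.max()
```

    By construction, the MinSum solution has the smallest total deviation and the MinMax solution the smallest worst-case deviation; an EGP-style compromise would mix both terms in one objective.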

  4. Nonlinear-programming mathematical modeling of coal blending for power plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang Longhua; Zhou Junhu; Yao Qiang

    At present, most coal-blending work is guided by experience or by linear programming (LP), which cannot properly reflect coal's complicated characteristics. Experimental and theoretical research shows that most properties of a coal blend cannot be expressed as a linear function of the properties of the individual coals in the blend. The authors introduce nonlinear functions and processes (including neural networks and fuzzy mathematics), based on experiments conducted by the authors and other researchers, to quantitatively describe the complex parameters of coal blends. Finally, a nonlinear-programming (NLP) mathematical model of coal blending is introduced and used in the Hangzhou Coal Blending Center. Predictions based on the new method differ from those based on LP modeling. The authors conclude that it is very important to introduce NLP modeling, instead of LP modeling, into the work of coal blending.

  5. Legato: Personal Computer Software for Analyzing Pressure-Sensitive Paint Data

    NASA Technical Reports Server (NTRS)

    Schairer, Edward T.

    2001-01-01

    'Legato' is personal computer software for analyzing radiometric pressure-sensitive paint (PSP) data. The software is written in the C programming language and executes under Windows 95/98/NT operating systems. It includes all operations normally required to convert pressure-paint image intensities to normalized pressure distributions mapped to physical coordinates of the test article. The program can analyze data from both single- and bi-luminophore paints and provides for both in situ and a priori paint calibration. In addition, there are functions for determining paint calibration coefficients from calibration-chamber data. The software is designed as a self-contained, interactive research tool that requires as input only the bare minimum of information needed to accomplish each function, e.g., images, model geometry, and paint calibration coefficients (for a priori calibration) or pressure-tap data (for in situ calibration). The program includes functions that can be used to generate needed model geometry files for simple model geometries (e.g., airfoils, trapezoidal wings, rotor blades) based on the model planform and airfoil section. All data files except images are in ASCII format and thus are easily created, read, and edited. The program does not use database files. This simplifies setup but makes the program inappropriate for analyzing massive amounts of data from production wind tunnels. Program output consists of Cartesian plots, false-colored real and virtual images, pressure distributions mapped to the surface of the model, assorted ASCII data files, and a text file of tabulated results. Graphical output is displayed on the computer screen and can be saved as publication-quality (PostScript) files.

  6. Metacognition as a Mediating Variable Between Neurocognition and Functional Outcome in First Episode Psychosis.

    PubMed

    Davies, Geoff; Fowler, David; Greenwood, Kathryn

    2017-07-01

    Neurocognitive and functional outcome deficits have long been acknowledged in schizophrenia, and neurocognition has been found to account for functional disability to a greater extent than psychopathology. Much of the variance in functional outcome, however, still remains unexplained, and metacognition may mediate the relationship between neurocognition, functional capacity, and self-reported social and occupational function. Eighty first-episode psychosis participants were recruited and completed measures of neurocognition (memory, executive function, and intelligence quotient), metacognition (Beck Cognitive Insight Scale, Metacognitive Awareness Interview), psychopathology (PANSS), and both functional capacity (UPSA) and real-life social and occupational function (the Time Use Survey). Path analyses investigated the relationships between variables through structural equation modeling. A series of path models demonstrated that metacognition partially mediates the relationship between neurocognition and functional capacity, and fully mediates the relationship between functional capacity and social and occupational function. The present study findings identify that metacognition may be critical to translating cognitive and functional skills into real-world contexts, and this relationship is found at early stages of illness. Understanding how individuals translate cognitive and functional skills into the real world (the competence-performance gap) may offer valuable guidance to intervention programs. This finding is important to models of recovery, as it suggests that intervention programs focusing on enhancing metacognitive abilities may have a greater impact on social and occupational outcomes than traditional rehabilitation programs focusing on cognitive abilities.

  7. A model for estimating the cost impact of schedule perturbations on aerospace research and development programs

    NASA Technical Reports Server (NTRS)

    Bishop, D. F.

    1972-01-01

    The problem of determining the cost impact attributable to perturbations in an aerospace R&D program schedule is discussed in terms of the diminishing availability of funds. A methodology and model for updating R&D cost estimates as a function of perturbations in program time are presented.

  8. A Model of Self-Explanation Strategies of Instructional Text and Examples in the Acquisition of Programming Skills.

    ERIC Educational Resources Information Center

    Recker, Margaret M.; Pirolli, Peter

    Students learning to program recursive LISP functions in a typical school-like lesson on recursion were observed. The typical lesson contains text and examples and involves solving a series of programming problems. The focus of this study is on students' learning strategies in new domains. In this light, a Soar computational model of…

  9. A Microdata Model of Delayed Entry Program (DEP) Behavior. Technical Report 666.

    ERIC Educational Resources Information Center

    Phillips, Chester E.; Schmitz, Edward J.

    High personnel loss rates among recruits who have signed up for the Army's Delayed Entry Program (DEP) are becoming an increasing problem for DEP program managers. Therefore, a research project was conducted to examine DEP loss as a function of sociodemography and policy variables at the microdata level. Two DEP loss models were created. The first…

  10. Developmental programming: the concept, large animal models, and the key role of uteroplacental vascular development.

    PubMed

    Reynolds, L P; Borowicz, P P; Caton, J S; Vonnahme, K A; Luther, J S; Hammer, C J; Maddock Carlin, K R; Grazul-Bilska, A T; Redmer, D A

    2010-04-01

    Developmental programming refers to the programming of various bodily systems and processes by a stressor of the maternal system during pregnancy or during the neonatal period. Such stressors include nutritional stress, multiple pregnancy (i.e., increased numbers of fetuses in the gravid uterus), environmental stress (e.g., high environmental temperature, high altitude, prenatal steroid exposure), gynecological immaturity, and maternal or fetal genotype. Programming refers to impaired function of numerous bodily systems or processes, leading to poor growth, altered body composition, metabolic dysfunction, and poor productivity (e.g., poor growth, reproductive dysfunction) of the offspring throughout their lifespan and even across generations. A key component of developmental programming seems to be placental dysfunction, leading to altered fetal growth and development. We discuss various large animal models of developmental programming and how they have contributed, and will continue to contribute, to our understanding of the mechanisms underlying altered placental function and developmental programming, and, further, how large animal models will also be critical to the identification and application of therapeutic strategies that alleviate the negative consequences of developmental programming to improve offspring performance in livestock production and human medicine.

  11. Modeling and Analysis of Power Processing Systems (MAPPS). Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Lee, F. C.; Radman, S.; Carter, R. A.; Wu, C. H.; Yu, Y.; Chang, R.

    1980-01-01

    The computer programs and derivations generated in support of the modeling and design optimization program are presented. Programs for the buck regulator, boost regulator, and buck-boost regulator are described, along with the computer program for the design optimization calculations. Constraints for the boost and buck-boost converters were derived. Derivations of state-space equations and transfer functions are presented. Computer listings for the converters are presented, and the input parameters are justified.

  12. An Effective Model of the Retinoic Acid Induced HL-60 Differentiation Program.

    PubMed

    Tasseff, Ryan; Jensen, Holly A; Congleton, Johanna; Dai, David; Rogers, Katharine V; Sagar, Adithya; Bunaciu, Rodica P; Yen, Andrew; Varner, Jeffrey D

    2017-10-30

    In this study, we present an effective model of All-Trans Retinoic Acid (ATRA)-induced differentiation of HL-60 cells. The model describes reinforcing feedback between an ATRA-inducible signalsome complex involving many proteins, including Vav1, a guanine nucleotide exchange factor, and the activation of the mitogen-activated protein kinase (MAPK) cascade. We decomposed the effective model into three modules: a signal initiation module that sensed and transformed an ATRA signal into program activation signals; a signal integration module that controlled the expression of upstream transcription factors; and a phenotype module that encoded the expression of functional differentiation markers from the ATRA-inducible transcription factors. We identified an ensemble of effective model parameters using measurements taken from ATRA-induced HL-60 cells. Using these parameters, model analysis predicted that MAPK activation was bistable as a function of ATRA exposure. Confirmatory experiments supported ATRA-induced bistability. Additionally, the model captured intermediate and phenotypic gene expression data. Knockout analysis suggested that Gfi-1 and PPARg were critical to the ATRA-induced differentiation program. These findings, combined with other literature evidence, suggest that reinforcing feedback is central to hyperactive signaling in a diversity of cell fate programs.

  13. Model modifications for simulation of flow through stratified rocks in eastern Ohio

    USGS Publications Warehouse

    Helgesen, J.O.; Razem, A.C.; Larson, S.P.

    1982-01-01

    A quasi three-dimensional groundwater flow model is being used as part of a study to determine the impacts of coal strip mining on local hydrologic systems. Modifications to the model were necessary to simulate local hydrologic conditions properly. Perched water tables required that the method of calculating vertical flow rate be changed. A head-dependent spring-discharge function and a head-dependent stream-aquifer interchange function were added to the program. Modifications were also made to allow recharge from precipitation to any layer. The modified program, data deck instructions, and sample input and output are presented. (USGS)

  14. The reduction, verification and interpretation of MAGSAT magnetic data over Canada

    NASA Technical Reports Server (NTRS)

    Coles, R. L. (Principal Investigator); Haines, G. V.; Vanbeek, G. J.; Walker, J. K.; Newitt, L. R.

    1982-01-01

    Consideration is being given to representing the magnetic field in the area 40 deg N to 83 deg N by means of functions in spherical coordinates. A solution to Laplace's equation for the magnetic potential over a restricted area was found, and programming and testing are currently being carried out. Magnetic anomaly modelling is proceeding. The program SPHERE, which was adapted to function correctly on the Cyber computer, is now operational for deriving gravity and magnetic models in a spherical coordinate system.

  15. CHARMM: The Biomolecular Simulation Program

    PubMed Central

    Brooks, B.R.; Brooks, C.L.; MacKerell, A.D.; Nilsson, L.; Petrella, R.J.; Roux, B.; Won, Y.; Archontis, G.; Bartels, C.; Boresch, S.; Caflisch, A.; Caves, L.; Cui, Q.; Dinner, A.R.; Feig, M.; Fischer, S.; Gao, J.; Hodoscek, M.; Im, W.; Kuczera, K.; Lazaridis, T.; Ma, J.; Ovchinnikov, V.; Paci, E.; Pastor, R.W.; Post, C.B.; Pu, J.Z.; Schaefer, M.; Tidor, B.; Venable, R. M.; Woodcock, H. L.; Wu, X.; Yang, W.; York, D.M.; Karplus, M.

    2009-01-01

    CHARMM (Chemistry at HARvard Molecular Mechanics) is a highly versatile and widely used molecular simulation program. It has been developed over the last three decades with a primary focus on molecules of biological interest, including proteins, peptides, lipids, nucleic acids, carbohydrates and small molecule ligands, as they occur in solution, crystals, and membrane environments. For the study of such systems, the program provides a large suite of computational tools that include numerous conformational and path sampling methods, free energy estimators, molecular minimization, dynamics, and analysis techniques, and model-building capabilities. In addition, the CHARMM program is applicable to problems involving a much broader class of many-particle systems. Calculations with CHARMM can be performed using a number of different energy functions and models, from mixed quantum mechanical-molecular mechanical force fields, to all-atom classical potential energy functions with explicit solvent and various boundary conditions, to implicit solvent and membrane models. The program has been ported to numerous platforms in both serial and parallel architectures. This paper provides an overview of the program as it exists today with an emphasis on developments since the publication of the original CHARMM paper in 1983. PMID:19444816

  16. Multidisciplinary team functioning.

    PubMed

    Kovitz, K E; Dougan, P; Riese, R; Brummitt, J R

    1984-01-01

    This paper advocates the need to move beyond interdisciplinary team composition as a minimum criterion for multidisciplinary functioning in child abuse treatment. Recent developments within the field reflect the practice of shared professional responsibility for detection, case management and treatment. Adherence to this particular model for intervention requires cooperative service planning and implementation as task-related functions. Implicitly, this model also carries the potential to incorporate the supportive functioning essential to effective group process. However, explicit attention to the dynamics and process of small groups has been neglected in prescriptive accounts of multidisciplinary child abuse team organization. The present paper therefore focuses upon the maintenance and enhancement aspects of multidisciplinary group functioning. First, the development and philosophy of service for the Alberta Children's Hospital Child Abuse Program are reviewed. Second, the composition of the team, its mandate for service, and the population it serves are briefly described. Third, the conceptual framework within which the program functions is outlined. Strategies for effective group functioning are presented and the difficulties encountered with this model are highlighted. Finally, recommendations are offered for planning and implementing a multidisciplinary child abuse team and for maintaining its effective group functioning.

  17. A class of stochastic optimization problems with one quadratic & several linear objective functions and extended portfolio selection model

    NASA Astrophysics Data System (ADS)

    Xu, Jiuping; Li, Jun

    2002-09-01

    In this paper, a class of stochastic multiple-objective programming problems with one quadratic objective function, several linear objective functions, and linear constraints is introduced. The model is transformed into a deterministic multiple-objective nonlinear programming model by taking the expectation of the random variables. The reference direction approach is used to handle the linear objectives, yielding a linear parametric optimization problem with a single linear objective function. This objective function is combined with the quadratic function using weighted sums. The quadratic problem is transformed into a linear (parametric) complementarity problem, the basic formulation of the proposed approach. Sufficient and necessary conditions for (properly, weakly) efficient solutions and some structural characteristics of (weakly) efficient solution sets are obtained. An interactive algorithm based on reference directions and weighted sums is proposed. By varying the parameter vector on the right-hand side of the model, the decision maker can freely search the efficient frontier. An extended portfolio selection model is formed when liquidity is considered as another objective to be optimized besides expectation and risk. The interactive approach is illustrated with a practical example.
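
    The weighted-sums step, combining one quadratic and one linear objective into a single function, can be sketched on a two-asset portfolio (illustrative data; this is a generic scalarization, not the authors' parametric complementarity formulation):

```python
import numpy as np
from scipy.optimize import minimize

Sigma = np.array([[0.04, 0.00],
                  [0.00, 0.09]])  # illustrative covariance (risk) matrix
mu = np.array([0.08, 0.12])       # illustrative expected returns

def weighted_sum_portfolio(lam):
    """Scalarize (min risk, max return) with weight lam in [0, 1]:
    minimize lam * x'Sigma x - (1 - lam) * mu'x over the simplex."""
    obj = lambda x: lam * x @ Sigma @ x - (1 - lam) * mu @ x
    cons = ({'type': 'eq', 'fun': lambda x: x.sum() - 1.0},)
    res = minimize(obj, x0=np.array([0.5, 0.5]),
                   bounds=[(0, 1), (0, 1)],
                   constraints=cons, method='SLSQP')
    return res.x
```

    Sweeping lam from 1 toward 0 traces the efficient frontier from the minimum-variance portfolio toward the maximum-return portfolio, which is the kind of exploration the interactive algorithm offers the decision maker.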

  18. System support documentation: IDIMS FUNCTION AMOEBA

    NASA Technical Reports Server (NTRS)

    Bryant, J.

    1982-01-01

    A listing is provided for AMOEBA, a clustering program based on a spatial-spectral model for image data. The program is fast and automatic (in the sense that no parameters are required), and classifies each picture element into classes which are determined internally. As an IDIMS function, no limit on the size of the image is imposed.

  19. mfpa: Extension of mfp using the ACD covariate transformation for enhanced parametric multivariable modeling.

    PubMed

    Royston, Patrick; Sauerbrei, Willi

    2016-01-01

    In a recent article, Royston (2015, Stata Journal 15: 275-291) introduced the approximate cumulative distribution (acd) transformation of a continuous covariate x as a route toward modeling a sigmoid relationship between x and an outcome variable. In this article, we extend the approach to multivariable modeling by modifying the standard Stata program mfp. The result is a new program, mfpa, that has all the features of mfp plus the ability to fit a new model for user-selected covariates that we call fp1(p1, p2). The fp1(p1, p2) model comprises the best-fitting combination of a dimension-one fractional polynomial (fp1) function of x and an fp1 function of acd(x). We describe a new model-selection algorithm, the function-selection procedure with acd transformation, which uses significance testing to attempt to simplify an fp1(p1, p2) model to a submodel: an fp1 or linear model in x or in acd(x). The function-selection procedure with acd transformation is related in concept to the fsp (fp function-selection procedure), which is an integral part of mfp and is used to simplify a dimension-two (fp2) function. We describe the mfpa command and give univariable and multivariable examples with real data to demonstrate its use.

  20. The National Shipbuilding Research Program: Employee Involvement/Safety

    DTIC Science & Technology

    1990-06-01

    THE NATIONAL SHIPBUILDING RESEARCH PROGRAM Employee Involvement/Safety U.S. DEPARTMENT OF TRANSPORTATION Maritime Administration and U.S. NAVY in...to and sought assistance either directly or through the Program Manager or the MTC Safety Chairman from individual members who had functional respon...carpenters in the Model Shop. The training program to be developed and taught by the SP-5 Team. (The employees in the Model Shop were selected

  1. A New Model of Governance: One University's Journey.

    PubMed

    Brooks, Beth A; Scanlan, Therese

    2015-01-01

    In the nearly 50 years since the Medicare Program established funding for nursing education in the United States, there has been a steady migration away from hospital-controlled programs toward those that function as wholly owned subsidiaries within larger health care systems. Private-sector health care organizations in particular are under increasing pressure to adapt, at the risk of losing all of their funding. However, accomplishing this presents multiple challenges for today's nursing education programs in terms of regulatory compliance, accreditation, autonomy, and, above all, governance model. The authors outline the journey toward, and the specific challenges involved in, creating, implementing, and administering a new governance model that sustains the overall mission and vision of the educational institution while functioning seamlessly within a modern corporate health care system.

  2. Optimal investment in a portfolio of HIV prevention programs.

    PubMed

    Zaric, G S; Brandeau, M L

    2001-01-01

    In this article, the authors determine the optimal allocation of HIV prevention funds and investigate the impact of different allocation methods on health outcomes. The authors present a resource allocation model that can be used to determine the allocation of HIV prevention funds that maximizes quality-adjusted life years (or life years) gained or HIV infections averted in a population over a specified time horizon. They apply the model to determine the allocation of a limited budget among 3 types of HIV prevention programs in a population of injection drug users and nonusers: needle exchange programs, methadone maintenance treatment, and condom availability programs. For each prevention program, the authors estimate a production function that relates the amount invested to the associated change in risky behavior. The authors determine the optimal allocation of funds for both objective functions for a high-prevalence population and a low-prevalence population. They also consider the allocation of funds under several common rules of thumb that are used to allocate HIV prevention resources. It is shown that simpler allocation methods (e.g., allocation based on HIV incidence or notions of equity among population groups) may lead to allocations that do not yield the maximum health benefit. The optimal allocation of HIV prevention funds in a population depends on HIV prevalence and incidence, the objective function, the production functions for the prevention programs, and other factors. Consideration of cost, equity, and social and political norms may be important when allocating HIV prevention funds. The model presented in this article can help decision makers determine the health consequences of different allocations of funds.
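
    The core of such a resource allocation model is maximizing total health benefit over diminishing-returns production functions subject to a budget. A minimal sketch with made-up production-function parameters (not the authors' estimates; the saturating-exponential form is an assumption for illustration):

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical production functions: infections averted by each of three
# programs as a saturating function of dollars invested.
a = np.array([120.0, 80.0, 50.0])    # saturation levels (infections averted)
s = np.array([400.0, 150.0, 300.0])  # spending scales (in $1000s)
B = 500.0                            # total budget (in $1000s)

def averted(b):
    """Total infections averted for allocation vector b."""
    return float((a * (1.0 - np.exp(-b / s))).sum())

# Maximize averted infections subject to spending exactly the budget.
res = minimize(lambda b: -averted(b), x0=np.full(3, B / 3),
               bounds=[(0, B)] * 3,
               constraints=({'type': 'eq', 'fun': lambda b: b.sum() - B},),
               method='SLSQP')
best = res.x
```

    Comparing `averted(best)` with the value of a rule-of-thumb allocation (e.g., an equal split) reproduces the article's point that simple rules can leave health benefit on the table.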

  3. Application-oriented programming model for sensor networks embedded in the human body.

    PubMed

    Barbosa, Talles M G de A; Sene, Iwens G; da Rocha, Adson F; Nascimento, Fransisco A de O; Carvalho, Hervaldo S; Camapum, Juliana F

    2006-01-01

    This work presents a new programming model for sensor networks embedded in the human body, based on the concept of multi-programming application-oriented software. The model was conceived with a top-down approach of four layers, and its main goal is to allow healthcare professionals to program and reconfigure the network locally or over the Internet. To evaluate this hypothesis, a benchmark was executed to assess the mean time spent programming a multi-functional sensor node used for the measurement and transmission of the electrocardiogram.

  4. Automating Partial Period Bond Valuation with Excel's Day Counting Functions

    ERIC Educational Resources Information Center

    Vicknair, David; Spruell, James

    2009-01-01

    An Excel model for calculating the actual price of bonds under a 30 day/month, 360 day/year day counting assumption by nesting the DAYS360 function within the PV function is developed. When programmed into an Excel spreadsheet, the model can accommodate annual and semiannual payment bonds sold on or between interest dates using six fundamental…
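
    The nesting described above can be mimicked outside Excel. The sketch below implements a simplified 30/360 day count (Excel's DAYS360 adds NASD end-of-month edge cases not reproduced here) and feeds the resulting fraction of a semiannual period into a partial-period present-value formula; the dates and parameter names are illustrative.

```python
def days360(y1, m1, d1, y2, m2, d2):
    # Simplified 30/360 day count: the 31st is treated as the 30th.
    # Excel's DAYS360 applies extra NASD end-of-month rules omitted here.
    return (y2 - y1) * 360 + (m2 - m1) * 30 + (min(d2, 30) - min(d1, 30))

def bond_price(face, annual_coupon, annual_yield, w, n, freq=2):
    """Price a bond with n coupons remaining and fraction w of a coupon
    period (in periods) until the next coupon; this mirrors nesting a
    day-count result inside a present-value calculation."""
    c = annual_coupon / freq * face          # coupon per period
    v = 1.0 / (1.0 + annual_yield / freq)    # per-period discount factor
    pv_coupons = sum(c * v ** (w + k) for k in range(n))
    pv_face = face * v ** (w + n - 1)
    return pv_coupons + pv_face

# Fraction of a semiannual period (180 days under 30/360) from a
# hypothetical settlement date to the next coupon date:
w = days360(2009, 4, 15, 2009, 7, 1) / 180.0
price = bond_price(100.0, 0.06, 0.06, w, 10)
```

On a coupon date (w = 1) with coupon rate equal to yield, the formula prices the bond at par, a quick sanity check for the discounting logic.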

  5. Methods and Models for the Construction of Weakly Parallel Tests. Research Report 90-4.

    ERIC Educational Resources Information Center

    Adema, Jos J.

    Methods are proposed for the construction of weakly parallel tests, that is, tests with the same test information function. A mathematical programming model for constructing tests with a prespecified test information function and a heuristic for assigning items to tests such that their information functions are equal play an important role in the…
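
    For concreteness, under a two-parameter logistic (2PL) IRT model (a standard choice; the report's exact model is not restated in this summary), a test information function is simply the sum of item information functions, which is the quantity that "weakly parallel" constrains to be equal across tests:

```python
import math

def p_2pl(theta, a, b):
    """2PL item response function: probability of a correct response."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = p_2pl(theta, a, b)
    return a * a * p * (1.0 - p)

def test_information(theta, items):
    """Test information function: the sum of item informations."""
    return sum(item_information(theta, a, b) for a, b in items)

# Two tests are weakly parallel when their information functions coincide
# at every theta; here two distinct item sets with matched (a, b)
# parameter pairs (hypothetical values) trivially satisfy this.
test_a = [(1.2, -0.5), (0.8, 0.0), (1.5, 0.7)]
test_b = [(0.8, 0.0), (1.5, 0.7), (1.2, -0.5)]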

  6. Cellular automata with object-oriented features for parallel molecular network modeling.

    PubMed

    Zhu, Hao; Wu, Yinghui; Huang, Sui; Sun, Yan; Dhar, Pawan

    2005-06-01

    Cellular automata are an important modeling paradigm for studying the dynamics of large, parallel systems composed of multiple, interacting components. However, to model biological systems, cellular automata need to be extended beyond the large-scale parallelism and intensive communication in order to capture two fundamental properties characteristic of complex biological systems: hierarchy and heterogeneity. This paper proposes extensions to a cellular automata language, Cellang, to meet this purpose. The extended language, with object-oriented features, can be used to describe the structure and activity of parallel molecular networks within cells. Capabilities of this new programming language include object structure to define molecular programs within a cell, floating-point data type and mathematical functions to perform quantitative computation, message passing capability to describe molecular interactions, as well as new operators, statements, and built-in functions. We discuss relevant programming issues of these features, including the object-oriented description of molecular interactions with molecule encapsulation, message passing, and the description of heterogeneity and anisotropy at the cell and molecule levels. By enabling the integration of modeling at the molecular level with system behavior at cell, tissue, organ, or even organism levels, the program will help improve our understanding of how complex and dynamic biological activities are generated and controlled by parallel functioning of molecular networks. Index Terms-Cellular automata, modeling, molecular network, object-oriented.

  7. Joint pricing and production management: a geometric programming approach with consideration of cubic production cost function

    NASA Astrophysics Data System (ADS)

    Sadjadi, Seyed Jafar; Hamidi Hesarsorkh, Aghil; Mohammadi, Mehdi; Bonyadi Naeini, Ali

    2015-06-01

    Coordination and harmony between different departments of a company can be an important factor in achieving competitive advantage if the company aligns the strategies of those departments. This paper presents an integrated decision model based on recent advances in geometric programming techniques. The demand for a product is modelled as a power function of factors such as the product's price, marketing expenditures, and consumer service expenditures. Furthermore, the production cost is modelled as a cubic function of output. The model is solved using recent advances in convex optimization tools. Finally, the solution procedure is illustrated by a numerical example.
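
    The structure of the objective can be sketched directly: power-function demand, cubic production cost, and a profit to maximize. The exponents and coefficients below are invented for illustration, and a crude grid search over price stands in for the geometric-programming solution the paper actually uses.

```python
def demand(price, marketing, service, k=1000.0, e_p=1.8, e_m=0.15, e_s=0.1):
    # Power-function demand in price, marketing and service expenditures
    # (hypothetical elasticities).
    return k * price ** (-e_p) * marketing ** e_m * service ** e_s

def production_cost(q, c1=2.0, c2=0.01, c3=0.0005):
    # Cubic production cost in output, as in the model described above
    # (hypothetical coefficients).
    return c1 * q + c2 * q ** 2 + c3 * q ** 3

def profit(price, marketing=50.0, service=20.0):
    q = demand(price, marketing, service)
    return price * q - production_cost(q) - marketing - service

# Grid search over price as a stand-in for the GP solver.
best_price = max((p / 10.0 for p in range(10, 501)), key=profit)
```

In the paper the same posynomial structure is exploited so that the joint pricing/production problem can be handled with convex (geometric programming) machinery instead of a search.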

  8. Preparing New Teachers: Operating Successful Field Experience Programs.

    ERIC Educational Resources Information Center

    Slick, Gloria Appelt, Ed.

    This is the second in a series of four books presenting a variety of field experience program models and philosophies that drive the programs provided to preservice teachers during their undergraduate teacher preparation. This publication addresses the operational aspects of a successfully functioning field experience program and office. The…

  9. Modeling and prototyping of biometric systems using dataflow programming

    NASA Astrophysics Data System (ADS)

    Minakova, N.; Petrov, I.

    2018-01-01

    The development of biometric systems is a labor-intensive process, so the creation and analysis of supporting approaches and techniques is a pressing task. This article presents a technique for modeling and prototyping biometric systems based on dataflow programming. The technique includes three main stages: the development of functional blocks, the creation of a dataflow graph and the generation of a prototype. A specially developed software modeling environment that implements this technique is described. As an example of its use, the implementation of an iris localization subsystem is demonstrated. A variant of modification of dataflow programming is suggested to solve the problem related to the undefined order of block activation. The main advantage of the presented technique is the ability to visually display and design the model of the biometric system, the rapid creation of a working prototype and the reuse of previously developed functional blocks.

  10. Reciprocal Relations between Coalition Functioning and the Provision of Implementation Support

    PubMed Central

    Brown, Louis D.; Feinberg, Mark E.; Shapiro, Valerie B.; Greenberg, Mark T.

    2014-01-01

    Community coalitions have been promoted as a strategy to help overcome challenges to the dissemination and implementation of evidence-based prevention programs. This paper explores the characteristics of coalitions that enable the provision of implementation support for prevention programs in general, and for the implementation of evidence-based prevention programs with fidelity. Longitudinal cross-lagged panel models were used to study 74 Communities That Care (CTC) coalitions in Pennsylvania. These analyses provide evidence of a unidirectional influence of coalition functioning on the provision of implementation support. Coalition member knowledge of the CTC model best predicted the coalition’s provision of support for evidence-based program implementation with fidelity. Implications for developing and testing innovative methods for delivering training and technical assistance to enhance coalition member knowledge are discussed. PMID:24323363

  11. Deployment strategy for battery energy storage system in distribution network based on voltage violation regulation

    NASA Astrophysics Data System (ADS)

    Wu, H.; Zhou, L.; Xu, T.; Fang, W. L.; He, W. G.; Liu, H. M.

    2017-11-01

    In order to improve the situation of voltage violation caused by the grid-connection of photovoltaic (PV) systems in a distribution network, a bi-level programming model is proposed for battery energy storage system (BESS) deployment. The objective function of the inner-level program is to minimize voltage violation, with the power of the PV and BESS as the variables. The objective function of the outer-level program is to minimize a comprehensive function derived from the inner-level program and all the BESS operating parameters, with the capacity and rated power of the BESS as the variables. The differential evolution (DE) algorithm is applied to solve the model. Based on distribution network operation scenarios with photovoltaic generation under multiple alternative output modes, the simulation results of the IEEE 33-bus system show that the BESS deployment strategy proposed in this paper adapts well to voltage violation regulation in varied distribution network operation scenarios. It contributes to regulating voltage violation in the distribution network, as well as improving the utilization of PV systems.
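
    The DE solver named above can be sketched in a few lines. This is a generic DE/rand/1/bin loop run on a toy objective, not the paper's BESS model; all parameter values are conventional defaults, not the authors' settings.

```python
import random

def differential_evolution(f, bounds, pop_size=30, f_weight=0.7, cr=0.9,
                           generations=200, seed=1):
    """Minimal DE/rand/1/bin minimizer over box constraints."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct donors different from the target vector.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)  # ensure at least one mutated gene
            trial = []
            for j, (lo, hi) in enumerate(bounds):
                if rng.random() < cr or j == j_rand:
                    v = pop[a][j] + f_weight * (pop[b][j] - pop[c][j])
                    trial.append(min(max(v, lo), hi))  # clip to bounds
                else:
                    trial.append(pop[i][j])
            f_trial = f(trial)
            if f_trial <= fit[i]:  # greedy selection
                pop[i], fit[i] = trial, f_trial
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]

# Toy objective: a 3-D sphere function with minimum 0 at the origin.
best_x, best_f = differential_evolution(
    lambda v: sum(t * t for t in v), [(-5.0, 5.0)] * 3)
```

In the bi-level setting of the paper, `f` would evaluate the outer-level comprehensive function, which itself requires solving the inner voltage-violation program for each candidate BESS sizing.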

  12. The MSFC Program Control Development Program

    NASA Technical Reports Server (NTRS)

    1994-01-01

    It is the policy of the Marshall Space Flight Center (MSFC) that employees be given the opportunity to develop their individual skills and realize their full potential consistent with their selected career path and with the overall Center's needs and objectives. The MSFC Program Control Development Program has been designed to assist individuals who have selected Program Control or Program Analyst Program Control as a career path to achieve their ultimate career goals. Individuals selected to participate in the MSFC Program Control Development Program will be provided with development training in the various Program Control functional areas identified in the NASA Program Control Model. The purpose of the MSFC Program Control Development Program is to develop individual skills in the various Program Control functions by on-the-job and classroom instructional training on the various systems, tools, techniques, and processes utilized in these areas.

  13. Calculating the renormalisation group equations of a SUSY model with Susyno

    NASA Astrophysics Data System (ADS)

    Fonseca, Renato M.

    2012-10-01

    Susyno is a Mathematica package dedicated to the computation of the 2-loop renormalisation group equations of a supersymmetric model based on any gauge group (the only exception being multiple U(1) groups) and for any field content. Program summary Program title: Susyno Catalogue identifier: AEMX_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEMX_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 30829 No. of bytes in distributed program, including test data, etc.: 650170 Distribution format: tar.gz Programming language: Mathematica 7 or higher. Computer: All systems that Mathematica 7+ is available for (PC, Mac). Operating system: Any platform supporting Mathematica 7+ (Windows, Linux, Mac OS). Classification: 4.2, 5, 11.1. Nature of problem: Calculating the renormalisation group equations of a supersymmetric model involves using long and complicated general formulae [1, 2]. In addition, to apply them it is necessary to know the Lagrangian in its full form. Building the complete Lagrangian of models with small representations of SU(2) and SU(3) might be easy but in the general case of arbitrary representations of an arbitrary gauge group, this task can be hard, lengthy and error prone. Solution method: The Susyno package uses group theoretical functions to calculate the super-potential and the soft-SUSY-breaking Lagrangian of a supersymmetric model, and calculates the two-loop RGEs of the model using the general equations of [1, 2]. Susyno works for models based on any representation(s) of any gauge group (the only exception being multiple U(1) groups). Restrictions: As the program is based on the formalism of [1, 2], it shares its limitations. Running time can also be a significant restriction, in particular for models with many fields. 
Unusual features: Susyno contains functions that (a) calculate the Lagrangian of supersymmetric models and (b) calculate some group theoretical quantities. Some of these functions are available to the user and can be freely used. A built-in help system provides detailed information. Running time: Tests were made using a computer with an Intel Core i5 760 CPU, running under Ubuntu 11.04 and with Mathematica 8.0.1 installed. Using the option to suppress printing, the one- and two-loop beta functions of the MSSM were obtained in 2.5 s (NMSSM: 5.4 s). Note that the running time scales up very quickly with the total number of fields in the model. References: [1] S.P. Martin and M.T. Vaughn, Phys. Rev. D 50 (1994) 2282. [Erratum-ibid D 78 (2008) 039903] [arXiv:hep-ph/9311340]. [2] Y. Yamada, Phys. Rev. D 50 (1994) 3537 [arXiv:hep-ph/9401241].

  14. A dynamic model of functioning of a bank

    NASA Astrophysics Data System (ADS)

    Malafeyev, Oleg; Awasthi, Achal; Zaitseva, Irina; Rezenkov, Denis; Bogdanova, Svetlana

    2018-04-01

    In this paper, dynamic programming is analyzed as a novel approach to maximizing the profit of a bank. The mathematical model of the problem and a description of the bank's operation are given. The optimization process is set up as a discrete multi-stage decision process and solved by dynamic programming, which ensures that the solutions obtained are globally optimal and numerically stable.
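
    The discrete multi-stage setup can be illustrated with the classic resource-allocation recursion: at each stage, choose how many units of capital to commit, and the value function carries the remaining budget forward. The profit tables below are hypothetical, not the bank model's actual data.

```python
from functools import lru_cache

# Hypothetical stage payoffs: profit of committing k units (k = 0..4)
# to each of three business lines.
RETURNS = [
    [0, 4, 7, 9, 10],   # line 0
    [0, 3, 6, 10, 11],  # line 1
    [0, 5, 8, 9, 9],    # line 2
]

@lru_cache(maxsize=None)
def best_profit(stage, remaining):
    """Value function of the discrete multi-stage decision process:
    the best total profit from `stage` onward with `remaining` units."""
    if stage == len(RETURNS):
        return 0
    return max(RETURNS[stage][k] + best_profit(stage + 1, remaining - k)
               for k in range(remaining + 1))
```

Memoizing the value function is what makes the recursion globally optimal yet cheap: each (stage, remaining) state is solved once, exactly the guarantee the abstract attributes to dynamic programming.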

  15. AiGERM: A logic programming front end for GERM

    NASA Technical Reports Server (NTRS)

    Hashim, Safaa H.

    1990-01-01

    AiGerm (Artificially Intelligent Graphical Entity Relation Modeler) is a relational data base query and programming language front end for MCC (Mission Control Center)/STP's (Space Test Program) Germ (Graphical Entity Relational Modeling) system. It is intended as an add-on component of the Germ system to be used for navigating very large networks of information. It can also function as an expert system shell for prototyping knowledge-based systems. AiGerm provides an interface between the programming language and Germ.

  16. Research on an augmented Lagrangian penalty function algorithm for nonlinear programming

    NASA Technical Reports Server (NTRS)

    Frair, L.

    1978-01-01

    The augmented Lagrangian (ALAG) Penalty Function Algorithm for optimizing nonlinear mathematical models is discussed. The mathematical models of interest are deterministic in nature and finite dimensional optimization is assumed. A detailed review of penalty function techniques in general and the ALAG technique in particular is presented. Numerical experiments are conducted utilizing a number of nonlinear optimization problems to identify an efficient ALAG Penalty Function Technique for computer implementation.

  17. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation, volume 2, part 1. Appendix A: Software documentation

    NASA Technical Reports Server (NTRS)

    Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.

    1982-01-01

    Documentation of the preliminary software developed as a framework for a generalized integrated robotic system simulation is presented. The program structure is composed of three major functions controlled by a program executive. The three major functions are: system definition, analysis tools, and post processing. The system definition function handles user input of system parameters and definition of the manipulator configuration. The analysis tools function handles the computational requirements of the program. The post processing function allows for more detailed study of the results of analysis tool function executions. Also documented is the manipulator joint model software to be used as the basis of the manipulator simulation which will be part of the analysis tools capability.

  18. Program for computer aided reliability estimation

    NASA Technical Reports Server (NTRS)

    Mathur, F. P. (Inventor)

    1972-01-01

    A computer program for estimating the reliability of self-repair and fault-tolerant systems with respect to selected system and mission parameters is presented. The computer program is capable of operation in an interactive conversational mode as well as in a batch mode and is characterized by maintenance of several general equations representative of basic redundancy schemes in an equation repository. Selected reliability functions applicable to any mathematical model formulated with the general equations, used singly or in combination with each other, are separately stored. One or more system and/or mission parameters may be designated as a variable. Data in the form of values for selected reliability functions is generated in a tabular or graphic format for each formulated model.
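
    Typical of the basic redundancy schemes such an equation repository would hold (the program's actual stored equations are not listed in this summary) are the series, parallel, and triple-modular-redundancy reliability functions, each parameterized by a unit reliability that could be designated a variable and tabulated:

```python
def series(r, n):
    """Series system: all n units must work."""
    return r ** n

def parallel(r, n):
    """Parallel redundancy: at least one of n units must work."""
    return 1.0 - (1.0 - r) ** n

def tmr(r):
    """Triple modular redundancy with a perfect voter:
    at least 2 of 3 units must work."""
    return 3.0 * r ** 2 - 2.0 * r ** 3
```

Tabulating any of these over a range of `r` reproduces, in miniature, the program's generation of reliability data as a function of a designated mission parameter.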

  19. Cache Locality Optimization for Recursive Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lifflander, Jonathan; Krishnamoorthy, Sriram

    We present an approach to optimize the cache locality for recursive programs by dynamically splicing--recursively interleaving--the execution of distinct function invocations. By utilizing data effect annotations, we identify concurrency and data reuse opportunities across function invocations and interleave them to reduce reuse distance. We present algorithms that efficiently track effects in recursive programs, detect interference and dependencies, and interleave execution of function invocations using user-level (non-kernel) lightweight threads. To enable multi-core execution, a program is parallelized using a nested fork/join programming model. Our cache optimization strategy is designed to work in the context of a random work stealing scheduler. We present an implementation using the MIT Cilk framework that demonstrates significant improvements in sequential and parallel performance, competitive with a state-of-the-art compile-time optimizer for loop programs and a domain-specific optimizer for stencil programs.

  20. Solving bi-level optimization problems in engineering design using kriging models

    NASA Astrophysics Data System (ADS)

    Xia, Yi; Liu, Xiaojie; Du, Gang

    2018-05-01

    Stackelberg game-theoretic approaches are applied extensively in engineering design to handle distributed collaboration decisions. Bi-level genetic algorithms (BLGAs) and response surfaces have been used to solve the corresponding bi-level programming models. However, the computational costs for BLGAs often increase rapidly with the complexity of lower-level programs, and optimal solution functions sometimes cannot be approximated by response surfaces. This article proposes a new method, namely the optimal solution function approximation by kriging model (OSFAKM), in which kriging models are used to approximate the optimal solution functions. A detailed example demonstrates that OSFAKM can obtain better solutions than BLGAs and response surface-based methods, and at the same time reduce the workload of computation remarkably. Five benchmark problems and a case study of the optimal design of a thin-walled pressure vessel are also presented to illustrate the feasibility and potential of the proposed method for bi-level optimization in engineering design.

  1. Discovering rules for protein-ligand specificity using support vector inductive logic programming.

    PubMed

    Kelley, Lawrence A; Shrimpton, Paul J; Muggleton, Stephen H; Sternberg, Michael J E

    2009-09-01

    Structural genomics initiatives are rapidly generating vast numbers of protein structures. Comparative modelling is also capable of producing accurate structural models for many protein sequences. However, for many of the known structures, functions are not yet determined, and in many modelling tasks, an accurate structural model does not necessarily tell us about function. Thus, there is a pressing need for high-throughput methods for determining function from structure. The spatial arrangement of key amino acids in a folded protein, on the surface or buried in clefts, is often the determinant of its biological function. A central aim of molecular biology is to understand the relationship between such substructures or surfaces and biological function, leading both to function prediction and to function design. We present a new general method for discovering the features of binding pockets that confer specificity for particular ligands. Using a recently developed machine-learning technique which couples the rule-discovery approach of inductive logic programming with the statistical learning power of support vector machines, we are able to discriminate, with high precision (90%) and recall (86%) between pockets that bind FAD and those that bind NAD on a large benchmark set given only the geometry and composition of the backbone of the binding pocket without the use of docking. In addition, we learn rules governing this specificity which can feed into protein functional design protocols. An analysis of the rules found suggests that key features of the binding pocket may be tied to conformational freedom in the ligand. The representation is sufficiently general to be applicable to any discriminatory binding problem. All programs and data sets are freely available to non-commercial users at http://www.sbg.bio.ic.ac.uk/svilp_ligand/.

  2. FeynArts model file for MSSM transition counterterms from DREG to DRED

    NASA Astrophysics Data System (ADS)

    Stöckinger, Dominik; Varšo, Philipp

    2012-02-01

    The FeynArts model file MSSMdreg2dred implements MSSM transition counterterms which can convert one-loop Green functions from dimensional regularization to dimensional reduction. They correspond to a slight extension of the well-known Martin/Vaughn counterterms, specialized to the MSSM, and can serve also as supersymmetry-restoring counterterms. The paper provides full analytic results for the counterterms and gives one- and two-loop usage examples. The model file can simplify combining MS¯-parton distribution functions with supersymmetric renormalization or avoiding the renormalization of ɛ-scalars in dimensional reduction. Program summaryProgram title:MSSMdreg2dred.mod Catalogue identifier: AEKR_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEKR_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: LGPL-License [1] No. of lines in distributed program, including test data, etc.: 7600 No. of bytes in distributed program, including test data, etc.: 197 629 Distribution format: tar.gz Programming language: Mathematica, FeynArts Computer: Any, capable of running Mathematica and FeynArts Operating system: Any, with running Mathematica, FeynArts installation Classification: 4.4, 5, 11.1 Subprograms used: Cat Id Title Reference ADOW_v1_0 FeynArts CPC 140 (2001) 418 Nature of problem: The computation of one-loop Feynman diagrams in the minimal supersymmetric standard model (MSSM) requires regularization. Two schemes, dimensional regularization and dimensional reduction are both common but have different pros and cons. In order to combine the advantages of both schemes one would like to easily convert existing results from one scheme into the other. Solution method: Finite counterterms are constructed which correspond precisely to the one-loop scheme differences for the MSSM. They are provided as a FeynArts [2] model file. 
Using this model file together with FeynArts, the (ultra-violet) regularization of any MSSM one-loop Green function is switched automatically from dimensional regularization to dimensional reduction. In particular the counterterms serve as supersymmetry-restoring counterterms for dimensional regularization. Restrictions: The counterterms are restricted to the one-loop level and the MSSM. Running time: A few seconds to generate typical Feynman graphs with FeynArts.

  3. A Model for Determining Information Diffusion in a Family Planning Program

    ERIC Educational Resources Information Center

    Jackson, Audrey R.

    1972-01-01

    Knowledge of the existence of birth control clinics is seen as a function of proximity to clinics, friendliness of neighborhood, and propensity to discuss birth control with neighbors. A conceptual model is developed to illustrate variables contributing to the diffusion of birth control information in a public health family planning program.…

  4. Nuclear Engine System Simulation (NESS) version 2.0

    NASA Technical Reports Server (NTRS)

    Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.

    1993-01-01

    The topics are presented in viewgraph form and include the following; nuclear thermal propulsion (NTP) engine system analysis program development; nuclear thermal propulsion engine analysis capability requirements; team resources used to support NESS development; expanded liquid engine simulations (ELES) computer model; ELES verification examples; NESS program development evolution; past NTP ELES analysis code modifications and verifications; general NTP engine system features modeled by NESS; representative NTP expander, gas generator, and bleed engine system cycles modeled by NESS; NESS program overview; NESS program flow logic; enabler (NERVA type) nuclear thermal rocket engine; prismatic fuel elements and supports; reactor fuel and support element parameters; reactor parameters as a function of thrust level; internal shield sizing; and reactor thermal model.

  5. The Development of a Model for Estimating the Costs Associated with the Delivery of a Metals Cluster Program.

    ERIC Educational Resources Information Center

    Hunt, Charles R.

    A study developed a model to assist school administrators to estimate costs associated with the delivery of a metals cluster program at Norfolk State College, Virginia. It sought to construct the model so that costs could be explained as a function of enrollment levels. Data were collected through a literature review, computer searches of the…

  6. Evaluation of Markov-Decision Model for Instructional Sequence Optimization. Semi-Annual Technical Report for the period 1 July-31 December 1975. Technical Report No. 76.

    ERIC Educational Resources Information Center

    Wollmer, Richard D.; Bond, Nicholas A.

    Two computer-assisted instruction programs were written in electronics and trigonometry to test the Wollmer Markov Model for optimizing hierarchical learning; calibration samples totalling 110 students completed these programs. Since the model postulated that transfer effects would be a function of the amount of practice, half of the students were…

  7. i-SVOC -- A simulation program for indoor SVOCs (Version 1.0)

    EPA Science Inventory

    Program i-SVOC estimates the emissions, transport, and sorption of semivolatile organic compounds (SVOCs) in the indoor environment as functions of time when a series of initial conditions is given. This program implements a framework for dynamic modeling of indoor SVOCs develope...

  8. Software For Computing Reliability Of Other Software

    NASA Technical Reports Server (NTRS)

    Nikora, Allen; Antczak, Thomas M.; Lyu, Michael

    1995-01-01

    Computer Aided Software Reliability Estimation (CASRE) computer program developed for use in measuring reliability of other software. Easier for non-specialists in reliability to use than many other currently available programs developed for same purpose. CASRE incorporates mathematical modeling capabilities of public-domain Statistical Modeling and Estimation of Reliability Functions for Software (SMERFS) computer program and runs in Windows software environment. Provides menu-driven command interface; enabling and disabling of menu options guides user through (1) selection of set of failure data, (2) execution of mathematical model, and (3) analysis of results from model. Written in C language.

  9. Linearized Programming of Memristors for Artificial Neuro-Sensor Signal Processing

    PubMed Central

    Yang, Changju; Kim, Hyongsuk

    2016-01-01

    A linearized programming method of memristor-based neural weights is proposed. The memristor is known as an ideal element to implement a neural synapse due to its embedded functions of analog memory and analog multiplication. Its resistance variation with a voltage input is generally a nonlinear function of time. Linearization of memristance variation with time is very important for the ease of memristor programming. In this paper, a method utilizing an anti-serial architecture for linear programming is proposed. The anti-serial architecture is composed of two memristors with opposite polarities. It linearizes the variation of memristance due to the complementary actions of the two memristors. To program a memristor, an additional memristor with opposite polarity is employed. The linearization effect of weight programming of an anti-serial architecture is investigated, and a memristor bridge synapse, built with two sets of the anti-serial memristor architecture, is taken as an application example of the proposed method. Simulations are performed with memristors of both the linear drift model and the nonlinear model. PMID:27548186

  10. Linearized Programming of Memristors for Artificial Neuro-Sensor Signal Processing.

    PubMed

    Yang, Changju; Kim, Hyongsuk

    2016-08-19

    A linearized programming method of memristor-based neural weights is proposed. The memristor is known as an ideal element to implement a neural synapse due to its embedded functions of analog memory and analog multiplication. Its resistance variation with a voltage input is generally a nonlinear function of time. Linearization of memristance variation with time is very important for the ease of memristor programming. In this paper, a method utilizing an anti-serial architecture for linear programming is proposed. The anti-serial architecture is composed of two memristors with opposite polarities. It linearizes the variation of memristance due to the complementary actions of the two memristors. To program a memristor, an additional memristor with opposite polarity is employed. The linearization effect of weight programming of an anti-serial architecture is investigated, and a memristor bridge synapse, built with two sets of the anti-serial memristor architecture, is taken as an application example of the proposed method. Simulations are performed with memristors of both the linear drift model and the nonlinear model.
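
    The linear drift model used in the simulations above can be sketched with the standard HP-style formulation: memristance is a linear mixture of R_on and R_off weighted by the dopant-boundary state variable, whose drift rate is proportional to the current. The constants below are illustrative, not the paper's device parameters, and a hard window clamps the state to [0, 1].

```python
# Illustrative HP-style linear drift memristor model (hypothetical constants).
R_ON, R_OFF = 100.0, 16_000.0  # fully-doped / undoped resistances (ohm)
D = 10e-9                      # device thickness (m)
MU = 1e-14                     # dopant mobility (m^2 s^-1 V^-1)

def simulate(v=1.0, t_end=1.0, steps=20_000, x0=0.1):
    """Euler-integrate the state x = w/D under a constant bias v and
    return the memristance trace over time."""
    dt = t_end / steps
    x = x0
    trace = []
    k = MU * R_ON / D ** 2     # drift coefficient
    for _ in range(steps):
        m = R_ON * x + R_OFF * (1.0 - x)  # memristance
        trace.append(m)
        x += k * (v / m) * dt             # linear dopant drift: dx/dt ∝ i
        x = min(max(x, 0.0), 1.0)         # hard window on the state
    return trace

trace = simulate()
```

Under constant bias the memristance changes nonlinearly in time (the drift accelerates as resistance falls), which is exactly the nonlinearity the anti-serial architecture is designed to cancel.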

  11. Space-time modeling in EPA's Ecosystem Services Research Program

    EPA Science Inventory

    The US EPA is conducting a long-term research program on the effects of human actions on ecosystem services. Ecosystem services are defined in this program as “the products of ecological functions or processes that directly or indirectly contribute to human well-being.” Modelin...

  12. Towards aspect-oriented functional--structural plant modelling.

    PubMed

    Cieslak, Mikolaj; Seleznyova, Alla N; Prusinkiewicz, Przemyslaw; Hanan, Jim

    2011-10-01

    Functional-structural plant models (FSPMs) are used to integrate knowledge and test hypotheses of plant behaviour, and to aid in the development of decision support systems. A significant amount of effort is being put into providing a sound methodology for building them. Standard techniques, such as procedural or object-oriented programming, are not suited for clearly separating aspects of plant function that criss-cross between different components of plant structure, which makes it difficult to reuse and share their implementations. The aim of this paper is to present an aspect-oriented programming approach that helps to overcome this difficulty. The L-system-based plant modelling language L+C was used to develop an aspect-oriented approach to plant modelling based on multi-modules. Each element of the plant structure was represented by a sequence of L-system modules (rather than a single module), with each module representing an aspect of the element's function. Separate sets of productions were used for modelling each aspect, with context-sensitive rules facilitated by local lists of modules to consider/ignore. Aspect weaving or communication between aspects was made possible through the use of pseudo-L-systems, where the strict-predecessor of a production rule was specified as a multi-module. The new approach was used to integrate previously modelled aspects of carbon dynamics, apical dominance and biomechanics with a model of a developing kiwifruit shoot. These aspects were specified independently and their implementation was based on source code provided by the original authors without major changes. This new aspect-oriented approach to plant modelling is well suited for studying complex phenomena in plant science, because it can be used to integrate separate models of individual aspects of plant development and function, both previously constructed and new, into clearly organized, comprehensive FSPMs. In future work, this approach could be further extended into an aspect-oriented programming language for FSPMs.
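    The multi-module idea generalizes beyond L+C. The following Python sketch is hypothetical (the rule functions, field names, and coefficients are invented for illustration): each structural element carries state for several aspects, each aspect's rule set touches only its own state, and "weaving" is approximated by sequential passes over the structure.

```python
# Hypothetical sketch of aspect-separated plant rules (not L+C code).
# Each structural element is a dict holding one "module" per aspect.

def carbon_rules(elem):
    # Carbon-dynamics aspect: invented assimilation proportional to leaf area
    elem["carbon"] += 0.1 * elem["leaf_area"]

def biomechanics_rules(elem):
    # Biomechanics aspect: invented deflection proportional to mass
    elem["bend"] = 0.01 * elem["mass"]

shoot = [{"leaf_area": 2.0, "carbon": 0.0, "mass": 5.0, "bend": 0.0}]

# "Aspect weaving": each aspect's production set scans the structure in turn
for rules in (carbon_rules, biomechanics_rules):
    for elem in shoot:
        rules(elem)
```

    Keeping each aspect's rules in a separate function mirrors the paper's separate production sets, so one aspect can be swapped or reused without touching the others.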

  13. PyCOOL — A Cosmological Object-Oriented Lattice code written in Python

    NASA Astrophysics Data System (ADS)

    Sainio, J.

    2012-04-01

    There are a number of different phenomena in the early universe that have to be studied numerically with lattice simulations. This paper presents a graphics processing unit (GPU) accelerated Python program called PyCOOL that solves the evolution of scalar fields in a lattice with very precise symplectic integrators. The program has been written with the intention to hit a sweet spot of speed, accuracy and user friendliness. This has been achieved by using the Python language with the PyCUDA interface to make a program that is easy to adapt to different scalar field models. In this paper we derive the symplectic dynamics that govern the evolution of the system and then present the implementation of the program in Python and PyCUDA. The functionality of the program is tested in a chaotic inflation preheating model, a single field oscillon case and in a supersymmetric curvaton model which leads to Q-ball production. We have also compared the performance of a consumer graphics card to a professional Tesla compute card in these simulations. We find that the program is not only accurate but also very fast. To further increase the usefulness of the program we have equipped it with numerous post-processing functions that provide useful information about the cosmological model. These include various spectra and statistics of the fields. The program can be additionally used to calculate the generated curvature perturbation. The program is publicly available under GNU General Public License at https://github.com/jtksai/PyCOOL. Some additional information can be found from http://www.physics.utu.fi/tiedostot/theory/particlecosmology/pycool/.
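    The symplectic stepping at the heart of such lattice codes can be illustrated with a minimal kick-drift-kick (leapfrog) integrator for a single homogeneous field value; this is not PyCOOL code, and the harmonic potential and parameters are chosen only for illustration.

```python
def leapfrog(phi, pi, dt, steps, dV):
    """Second-order symplectic (kick-drift-kick) evolution of a field
    value phi and its conjugate momentum pi under potential gradient dV."""
    for _ in range(steps):
        pi -= 0.5 * dt * dV(phi)   # half kick
        phi += dt * pi             # full drift
        pi -= 0.5 * dt * dV(phi)   # half kick
    return phi, pi

# Harmonic potential V(phi) = phi**2 / 2, so dV/dphi = phi
phi, pi = leapfrog(phi=1.0, pi=0.0, dt=0.01, steps=10_000, dV=lambda p: p)
energy = 0.5 * pi ** 2 + 0.5 * phi ** 2   # conserved quantity, initially 0.5
```

    The long-run boundedness of the energy error is the property that makes symplectic integrators "very precise" for this kind of evolution.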

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sainio, J., E-mail: jani.sainio@utu.fi; Department of Physics and Astronomy, University of Turku, FI-20014 Turku

    There are a number of different phenomena in the early universe that have to be studied numerically with lattice simulations. This paper presents a graphics processing unit (GPU) accelerated Python program called PyCOOL that solves the evolution of scalar fields in a lattice with very precise symplectic integrators. The program has been written with the intention to hit a sweet spot of speed, accuracy and user friendliness. This has been achieved by using the Python language with the PyCUDA interface to make a program that is easy to adapt to different scalar field models. In this paper we derive the symplectic dynamics that govern the evolution of the system and then present the implementation of the program in Python and PyCUDA. The functionality of the program is tested in a chaotic inflation preheating model, a single field oscillon case and in a supersymmetric curvaton model which leads to Q-ball production. We have also compared the performance of a consumer graphics card to a professional Tesla compute card in these simulations. We find that the program is not only accurate but also very fast. To further increase the usefulness of the program we have equipped it with numerous post-processing functions that provide useful information about the cosmological model. These include various spectra and statistics of the fields. The program can be additionally used to calculate the generated curvature perturbation. The program is publicly available under GNU General Public License at https://github.com/jtksai/PyCOOL. Some additional information can be found from http://www.physics.utu.fi/tiedostot/theory/particlecosmology/pycool/.

  15. Documenting AUTOGEN and APGEN Model Files

    NASA Technical Reports Server (NTRS)

    Gladden, Roy E.; Khanampompan, Teerapat; Fisher, Forest W.; DelGuericio, Chris C.

    2008-01-01

    A computer program called "autogen hypertext map generator" satisfies a need for documenting and assisting in visualization of, and navigation through, model files used in the AUTOGEN and APGEN software mentioned in the two immediately preceding articles. This program parses autogen script files, autogen model files, PERL scripts, and apgen activity-definition files and produces a hypertext map of the files to aid in the navigation of the model. This program also provides a facility for adding notes and descriptions beyond what is in the source model represented by the hypertext map. Further, this program provides access to a summary of the model through variable, function, subroutine, activity, and resource declarations, as well as providing full access to the source model and source code. The use of the tool enables easy access to the declarations and the ability to traverse routines and calls while analyzing the model.
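    The parsing step behind such a map generator can be approximated in a few lines. The sketch below is not the actual autogen tool; the regular expression and HTML layout are invented. It indexes PERL-style `sub` declarations and emits a linked list of anchors.

```python
import re

def declaration_index(source):
    """Collect PERL-style 'sub' declaration names with line numbers --
    the raw material for a hypertext map (pattern is illustrative only)."""
    decls = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        m = re.match(r"^\s*sub\s+(\w+)", line)
        if m:
            decls.append((m.group(1), lineno))
    return decls

def to_html(decls):
    """Render the index as an HTML list linking to per-line anchors."""
    items = "".join(f'<li><a href="#L{n}">{name}</a></li>' for name, n in decls)
    return f"<ul>{items}</ul>"

script = "sub init {\n}\nsub run_model {\n}\n"
index = declaration_index(script)
```

    A real tool would add similar patterns for variables, activities, and resources, and cross-link call sites to declarations.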

  16. Building Interactive Simulations in Web Pages without Programming.

    PubMed

    Mailen Kootsey, J; McAuley, Grant; Bernal, Julie

    2005-01-01

    A software system is described for building interactive simulations and other numerical calculations in Web pages. The system is based on a new Java-based software architecture named NumberLinX (NLX) that isolates each function required to build the simulation so that a library of reusable objects could be assembled. The NLX objects are integrated into a commercial Web design program for coding-free page construction. The model description is entered through a wizard-like utility program that also functions as a model editor. The complete system permits very rapid construction of interactive simulations without coding. A wide range of applications are possible with the system beyond interactive calculations, including remote data collection and processing and collaboration over a network.

  17. IRFK2D: a computer program for simulating intrinsic random functions of order k

    NASA Astrophysics Data System (ADS)

    Pardo-Igúzquiza, Eulogio; Dowd, Peter A.

    2003-07-01

    IRFK2D is an ANSI Fortran-77 program that generates realizations of an intrinsic random function of order k (with k equal to 0, 1 or 2) with a permissible polynomial generalized covariance model. The realizations may be non-conditional or conditioned to the experimental data. The turning bands method is used to generate realizations in 2D and 3D from simulations of an intrinsic random function of order k along lines that span the 2D or 3D space. The program generates two output files, the first containing the simulated values and the second containing the theoretical generalized variogram for different directions together with the theoretical model. The experimental variogram is calculated from the simulated values while the theoretical variogram is the specified generalized covariance model. The generalized variogram is used to assess the quality of the simulation as measured by the extent to which the generalized covariance is reproduced by the simulation. The examples given in this paper indicate that IRFK2D is an efficient implementation of the methodology.

  18. Empirical valence bond models for reactive potential energy surfaces: a parallel multilevel genetic program approach.

    PubMed

    Bellucci, Michael A; Coker, David F

    2011-07-28

    We describe a new method for constructing empirical valence bond potential energy surfaces using a parallel multilevel genetic program (PMLGP). Genetic programs can be used to perform an efficient search through function space and parameter space to find the best functions and sets of parameters that fit energies obtained by ab initio electronic structure calculations. Building on the traditional genetic program approach, the PMLGP utilizes a hierarchy of genetic programming on two different levels. The lower level genetic programs are used to optimize coevolving populations in parallel while the higher level genetic program (HLGP) is used to optimize the genetic operator probabilities of the lower level genetic programs. The HLGP allows the algorithm to dynamically learn the mutation or combination of mutations that most effectively increase the fitness of the populations, causing a significant increase in the algorithm's accuracy and efficiency. The algorithm's accuracy and efficiency are tested against a standard parallel genetic program with a variety of one-dimensional test cases. Subsequently, the PMLGP is utilized to obtain an accurate empirical valence bond model for proton transfer in 3-hydroxy-gamma-pyrone in the gas phase and in protic solvent. © 2011 American Institute of Physics.

  19. Data Programming: Creating Large Training Sets, Quickly.

    PubMed

    Ratner, Alexander; De Sa, Christopher; Wu, Sen; Selsam, Daniel; Ré, Christopher

    2016-12-01

    Large labeled training sets are the critical building blocks of supervised learning methods and are key enablers of deep learning techniques. For some applications, creating labeled training sets is the most time-consuming and expensive part of applying machine learning. We therefore propose a paradigm for the programmatic creation of training sets called data programming in which users express weak supervision strategies or domain heuristics as labeling functions, which are programs that label subsets of the data, but that are noisy and may conflict. We show that by explicitly representing this training set labeling process as a generative model, we can "denoise" the generated training set, and establish theoretically that we can recover the parameters of these generative models in a handful of settings. We then show how to modify a discriminative loss function to make it noise-aware, and demonstrate our method over a range of discriminative models including logistic regression and LSTMs. Experimentally, on the 2014 TAC-KBP Slot Filling challenge, we show that data programming would have led to a new winning score, and also show that applying data programming to an LSTM model leads to a TAC-KBP score almost 6 F1 points over a state-of-the-art LSTM baseline (and into second place in the competition). Additionally, in initial user studies we observed that data programming may be an easier way for non-experts to create machine learning models when training data is limited or unavailable.
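    A labeling function is just a small program that votes on a label or abstains. The sketch below (heuristics and thresholds invented for illustration) combines three labeling functions by an unweighted vote; the paper instead learns per-function accuracies with a generative model to denoise the result.

```python
# Each labeling function returns +1, -1, or 0 (abstain) -- noisy heuristics.
def lf_keyword(text):
    return 1 if "excellent" in text else 0

def lf_negation(text):
    return -1 if "not" in text else 0

def lf_length(text):
    return 1 if len(text) > 40 else 0

def vote_label(text, lfs):
    """Unweighted vote over labeling functions (a crude stand-in for the
    generative-model denoising described in the paper)."""
    votes = sum(lf(text) for lf in lfs)
    return (votes > 0) - (votes < 0)   # sign of the vote: +1, -1, or 0
```

    Conflicting or abstaining functions simply cancel out here; the generative model instead weighs each function by its estimated accuracy.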

  20. Data Programming: Creating Large Training Sets, Quickly

    PubMed Central

    Ratner, Alexander; De Sa, Christopher; Wu, Sen; Selsam, Daniel; Ré, Christopher

    2018-01-01

    Large labeled training sets are the critical building blocks of supervised learning methods and are key enablers of deep learning techniques. For some applications, creating labeled training sets is the most time-consuming and expensive part of applying machine learning. We therefore propose a paradigm for the programmatic creation of training sets called data programming in which users express weak supervision strategies or domain heuristics as labeling functions, which are programs that label subsets of the data, but that are noisy and may conflict. We show that by explicitly representing this training set labeling process as a generative model, we can “denoise” the generated training set, and establish theoretically that we can recover the parameters of these generative models in a handful of settings. We then show how to modify a discriminative loss function to make it noise-aware, and demonstrate our method over a range of discriminative models including logistic regression and LSTMs. Experimentally, on the 2014 TAC-KBP Slot Filling challenge, we show that data programming would have led to a new winning score, and also show that applying data programming to an LSTM model leads to a TAC-KBP score almost 6 F1 points over a state-of-the-art LSTM baseline (and into second place in the competition). Additionally, in initial user studies we observed that data programming may be an easier way for non-experts to create machine learning models when training data is limited or unavailable. PMID:29872252

  1. Modeling small-scale dairy farms in central Mexico using multi-criteria programming.

    PubMed

    Val-Arreola, D; Kebreab, E; France, J

    2006-05-01

    Milk supply from Mexican dairy farms does not meet demand and small-scale farms can contribute toward closing the gap. Two multi-criteria programming techniques, goal programming and compromise programming, were used in a study of small-scale dairy farms in central Mexico. To build the goal and compromise programming models, 4 ordinary linear programming models were also developed, which had objective functions to maximize metabolizable energy for milk production, to maximize margin of income over feed costs, to maximize metabolizable protein for milk production, and to minimize purchased feedstuffs. Neither multi-criteria approach was significantly better than the other; however, by applying both models it was possible to perform a more comprehensive analysis of these small-scale dairy systems. The multi-criteria programming models affirm findings from previous work and suggest that a forage strategy based on alfalfa, ryegrass, and corn silage would meet nutrient requirements of the herd. Both models suggested that there is an economic advantage in rescheduling the calving season to the second and third calendar quarters to better synchronize higher demand for nutrients with the period of high forage availability.

  2. An Ecological Model of the Coordinated School Health Program: A Commentary

    ERIC Educational Resources Information Center

    Fetro, Joyce V.

    2010-01-01

    In his article, Lohrmann clearly traces the evolution of school health programs from 3 traditional components to Allensworth and Kolbe's expanded concept of comprehensive school health programs (CSHP), comprising 8 interrelated and synergistic components that historically functioned independently in schools. With completion of "Health Is Academic,"…

  3. [Study on the 3D mathematical model of the muscle groups applied to the human mandible by a linear programming method].

    PubMed

    Wang, Dongmei; Yu, Liniu; Zhou, Xianlian; Wang, Chengtao

    2004-02-01

    Four types of 3D mathematical model of the muscle groups applied to the human mandible have been developed. One is based on electromyography (EMG) and the others are based on linear programming with different objective functions. Each model contains 26 muscle forces and two joint forces, allowing simulation of static bite forces and concomitant joint reaction forces for various bite point locations and mandibular positions. In this paper, an image-processing method was used to measure the positions and directions of the muscle forces on a 3D CAD model built from CT data. The MATLAB optimization toolbox was applied to solve the three models based on linear programming. Results show that the model with an objective function requiring a minimum sum of the tensions in the muscles is reasonable and agrees very well with normal physiological activity.

  4. Study on multimodal transport route under low carbon background

    NASA Astrophysics Data System (ADS)

    Liu, Lele; Liu, Jie

    2018-06-01

    Low-carbon environmental protection is a worldwide concern, and researchers continue to study carbon emissions from both production and daily life. There is, however, little literature, domestic or international, on multimodal transportation that accounts for carbon emissions. This paper first introduces the theory of multimodal transportation and analyzes multimodal transport models both with and without carbon-emission considerations. On this basis, a multi-objective 0-1 programming model is proposed that minimizes both total transportation cost and total carbon emissions. Weights are introduced into the ideal-point method so that the multi-objective program is reduced to a single-objective function, and the trade-off between carbon emissions and transportation cost is determined by varying the weights. Based on the model and algorithm, an example is given and the results are analyzed.
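    The weighted single-objective reduction can be sketched by enumerating candidate routes; the route names, costs, and emissions below are invented for illustration.

```python
# Candidate multimodal routes: (transport cost, carbon emission), invented.
routes = {
    "road-only":  (100.0, 55.0),
    "rail-road":  (120.0, 30.0),
    "rail-water": (140.0, 18.0),
}

def best_route(w_cost, w_emis):
    """Weighted-sum scalarization: pick the route minimizing
    w_cost * cost + w_emis * emission."""
    return min(routes, key=lambda r: w_cost * routes[r][0] + w_emis * routes[r][1])
```

    Sweeping the weights traces out the cost-emission trade-off: a cost-heavy weighting selects the cheap road route, while an emission-heavy weighting selects the rail-water route.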

  5. Frail elderly patients. New model for integrated service delivery.

    PubMed Central

    Hébert, Rejean; Durand, Pierre J.; Dubuc, Nicole; Tourigny, André

    2003-01-01

    PROBLEM BEING ADDRESSED: Given the complex needs of frail older people and the multiplicity of care providers and services, care for this clientele lacks continuity. OBJECTIVE OF PROGRAM: Integrated service delivery (ISD) systems have been developed to improve continuity and increase the efficacy and efficiency of services. PROGRAM DESCRIPTION: The Program of Research to Integrate Services for the Maintenance of Autonomy (PRISMA) is an innovative ISD model based on coordination. It includes coordination between decision makers and managers of different organizations and services; a single entry point; a case-management process; individualized service plans; a single assessment instrument based on clients' functional autonomy, coupled with a case-mix classification system; and a computerized clinical chart for communicating between institutions and professionals for client monitoring. CONCLUSION: Preliminary results on the efficacy of this model showed a decreased incidence of functional decline, a decreased burden for caregivers, and a smaller proportion of older people wishing to enter institutions. PMID:12943358

  6. The Secret Agent Society Social Skills Program for Children with High-Functioning Autism Spectrum Disorders: A Comparison of Two School Variants

    ERIC Educational Resources Information Center

    Beaumont, Renae; Rotolone, Cassie; Sofronoff, Kate

    2015-01-01

    School is often considered an ideal setting for child social skills training due to the opportunities it provides for skills teaching, modeling, and practice. The current study evaluated the effectiveness of two variants of the Secret Agent Society social skills program for children with high-functioning autism spectrum disorders (HFASD) in a…

  7. Neural networks with multiple general neuron models: a hybrid computational intelligence approach using Genetic Programming.

    PubMed

    Barton, Alan J; Valdés, Julio J; Orchard, Robert

    2009-01-01

    Classical neural networks are composed of neurons whose nature is determined by a certain function (the neuron model), usually pre-specified. In this paper, a type of neural network (NN-GP) is presented in which: (i) each neuron may have its own neuron model in the form of a general function, (ii) any layout (i.e., network interconnection) is possible, and (iii) no bias nodes or weights are associated with the connections, neurons or layers. The general functions associated to a neuron are learned by searching a function space. They are not provided a priori, but are rather built as part of an Evolutionary Computation process based on Genetic Programming. The resulting network solutions are evaluated based on a fitness measure, which may, for example, be based on classification or regression errors. Two real-world examples are presented to illustrate the promising behaviour on classification problems via construction of a low-dimensional representation of a high-dimensional parameter space associated to the set of all network solutions.

  8. PACE and the Medicare+Choice risk-adjusted payment model.

    PubMed

    Temkin-Greener, H; Meiners, M R; Gruenberg, L

    2001-01-01

    This paper investigates the impact of the Medicare principal inpatient diagnostic cost group (PIP-DCG) payment model on the Program of All-Inclusive Care for the Elderly (PACE). Currently, more than 6,000 Medicare beneficiaries who are nursing home certifiable receive care from PACE, a program poised for expansion under the Balanced Budget Act of 1997. Overall, our analysis suggests that the application of the PIP-DCG model to the PACE program would reduce Medicare payments to PACE, on average, by 38%. The PIP-DCG payment model bases its risk adjustment on inpatient diagnoses and does not capture adequately the risk of caring for a population with functional impairments.

  9. A new neural network model for solving random interval linear programming problems.

    PubMed

    Arjmandzadeh, Ziba; Safi, Mohammadreza; Nazemi, Alireza

    2017-05-01

    This paper presents a neural network model for solving random interval linear programming problems. The original problem involving random interval variable coefficients is first transformed into an equivalent convex second order cone programming problem. A neural network model is then constructed for solving the obtained convex second order cone problem. Employing a Lyapunov function approach, it is also shown that the proposed neural network model is stable in the sense of Lyapunov and globally convergent to an exact satisfactory solution of the original problem. Several illustrative examples are solved in support of this technique. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. New experimental models of skin homeostasis and diseases.

    PubMed

    Larcher, F; Espada, J; Díaz-Ley, B; Jaén, P; Juarranz, A; Quintanilla, M

    2015-01-01

    Homeostasis, whose regulation at the molecular level is still poorly understood, is intimately related to the functions of epidermal stem cells. Five research groups have been brought together to work on new in vitro and in vivo skin models through the SkinModel-CM program, under the auspices of the Spanish Autonomous Community of Madrid. This project aims to analyze the functions of DNA methyltransferase 1, endoglin, and podoplanin in epidermal stem cell activity, homeostasis, and skin cancer. These new models include 3-dimensional organotypic cultures, immunodeficient skin-humanized mice, and genetically modified mice. Another aim of the program is to use skin-humanized mice to model dermatoses such as Gorlin syndrome and xeroderma pigmentosum in order to optimize new protocols for photodynamic therapy. Copyright © 2013 Elsevier España, S.L.U. and AEDV. All rights reserved.

  11. Development of a multihospital pharmacy quality assurance program.

    PubMed

    Hoffmann, R P; Ravin, R; Colaluca, D M; Gifford, R; Grimes, D; Grzegorczyk, R; Keown, F; Kuhr, F; McKay, R; Peyser, J; Ryan, R; Zalewski, C

    1980-07-01

    Seven community hospitals have worked cooperatively for 18 months to develop an initial hospital pharmacy quality assurance program. Auditing criteria were developed for nine service areas corresponding to the model program developed by the American Society of Hospital Pharmacists. Current plans are to implement and modify this program as required at each participating hospital. Follow-up programs will also be essential to a functional, ongoing program, and these will be developed in the future.

  12. Development of an Algorithm for Automatic Analysis of the Impedance Spectrum Based on a Measurement Model

    NASA Astrophysics Data System (ADS)

    Kobayashi, Kiyoshi; Suzuki, Tohru S.

    2018-03-01

    A new algorithm for the automatic estimation of an equivalent circuit and the subsequent parameter optimization is developed by combining the data-mining concept and complex least-squares method. In this algorithm, the program generates an initial equivalent-circuit model based on the sampling data and then attempts to optimize the parameters. The basic hypothesis is that the measured impedance spectrum can be reproduced by the sum of the partial-impedance spectra presented by the resistor, inductor, resistor connected in parallel to a capacitor, and resistor connected in parallel to an inductor. The adequacy of the model is determined by using a simple artificial-intelligence function, which is applied to the output function of the Levenberg-Marquardt module. From the iteration of model modifications, the program finds an adequate equivalent-circuit model without any user input to the equivalent-circuit model.
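    The measurement-model hypothesis, that the spectrum is a sum of partial impedances, is straightforward to express with complex arithmetic. The element values below are invented, and no fitting is shown.

```python
import math

def z_model(f, r_s, r_p, c_p):
    """Impedance of a series resistor plus one parallel RC element:
    Z(f) = R_s + R_p / (1 + j*2*pi*f*R_p*C_p). Further parallel RC or RL
    elements would simply be added as extra terms in the sum."""
    w = 2.0 * math.pi * f
    return r_s + r_p / (1.0 + 1j * w * r_p * c_p)

# Limiting behaviour of the invented circuit (R_s=10, R_p=100, C_p=1 uF):
z_lo = z_model(1e-3, 10.0, 100.0, 1e-6)   # capacitor blocks: Z -> R_s + R_p
z_hi = z_model(1e9, 10.0, 100.0, 1e-6)    # capacitor shorts R_p: Z -> R_s
```

    An automatic algorithm like the one described would propose such sums of terms, fit the parameters by complex least squares, and let the residuals decide whether to add or drop elements.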

  13. User's manual for a parameter identification technique. [with options for model simulation for fixed input forcing functions and identification from wind tunnel and flight measurements

    NASA Technical Reports Server (NTRS)

    Kanning, G.

    1975-01-01

    A digital computer program written in FORTRAN is presented that implements the system identification theory for deterministic systems using input-output measurements. The user supplies programs simulating the mathematical model of the physical plant whose parameters are to be identified. The user may choose any one of three options. The first option allows for a complete model simulation for fixed input forcing functions. The second option identifies up to 36 parameters of the model from wind tunnel or flight measurements. The third option performs a sensitivity analysis for up to 36 parameters. The use of each option is illustrated with an example using input-output measurements for a helicopter rotor tested in a wind tunnel.

  14. Overview of the Space Launch System Transonic Buffet Environment Test Program

    NASA Technical Reports Server (NTRS)

    Piatak, David J.; Sekula, Martin K.; Rausch, Russ D.; Florance, James R.; Ivanco, Thomas G.

    2015-01-01

    Fluctuating aerodynamic loads are a significant concern for the structural design of a launch vehicle, particularly while traversing the transonic flight environment. At these trajectory conditions, unsteady aerodynamic pressures can excite the vehicle dynamic modes of vibration and result in high structural bending moments and vibratory environments. To ensure that vehicle structural components and subsystems possess adequate strength, stress, and fatigue margins in the presence of buffet and other environments, buffet forcing functions are required to conduct the coupled load analysis of the launch vehicle. The accepted method to obtain these buffet forcing functions is to perform wind-tunnel testing of a rigid model that is heavily instrumented with unsteady pressure transducers designed to measure the buffet environment within the desired frequency range. Two wind-tunnel tests of a 3 percent scale rigid buffet model have been conducted at the Langley Research Center Transonic Dynamics Tunnel (TDT) as part of the Space Launch System (SLS) buffet test program. The SLS buffet models have been instrumented with as many as 472 unsteady pressure transducers to resolve the buffet forcing functions of this multi-body configuration through integration of the individual pressure time histories. This paper will discuss test program development, instrumentation, data acquisition, test implementation, data analysis techniques, and several methods explored to mitigate the high buffet environments encountered during the test program. Preliminary buffet environments will be presented and compared using normalized sectional buffet forcing function root-mean-squared levels along the vehicle centerline.

  15. Multi-Objective Programming for Lot-Sizing with Quantity Discount

    NASA Astrophysics Data System (ADS)

    Kang, He-Yau; Lee, Amy H. I.; Lai, Chun-Mei; Kang, Mei-Sung

    2011-11-01

    Multi-objective programming (MOP) is one of the popular methods for decision making in a complex environment. In a MOP, decision makers try to optimize two or more objectives simultaneously under various constraints. A complete optimal solution seldom exists, and a Pareto-optimal solution is usually used. Some methods, such as the weighting method, which assigns priorities and aspiration levels to the objectives, are used to derive a compromise solution. The ɛ-constraint method is a modified weighting method: one of the objective functions is optimized while the other objective functions are treated as constraints and incorporated into the constraint part of the model. This research considers a stochastic lot-sizing problem with multiple suppliers and quantity discounts. The model is then transformed into a mixed integer programming (MIP) model based on the ɛ-constraint method. An illustrative example demonstrates the practicality of the proposed model. The results show that the model is an effective and accurate tool for determining the replenishment of a manufacturer from multiple suppliers over multiple periods.
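    For a toy 0-1 instance, the ɛ-constraint idea can be shown by brute force: cost is minimized while the second objective (here CO2) is capped at ɛ. The supplier data and the demand constraint below are invented for illustration.

```python
from itertools import product

# Hypothetical supplier options: (cost, co2) incurred if that supplier is used.
options = [(4.0, 3.0), (6.0, 1.0), (5.0, 2.0)]

def eps_constraint(eps, demand=2):
    """Minimize total cost over 0-1 selections of exactly `demand`
    suppliers, subject to total CO2 <= eps (the epsilon constraint)."""
    best = None
    for x in product((0, 1), repeat=len(options)):
        if sum(x) != demand:
            continue
        cost = sum(xi * c for xi, (c, _) in zip(x, options))
        co2 = sum(xi * e for xi, (_, e) in zip(x, options))
        if co2 <= eps and (best is None or cost < best[0]):
            best = (cost, x)
    return best
```

    Tightening ɛ forces low-cost but high-CO2 selections out of the feasible set, raising the minimum cost; sweeping ɛ traces the same trade-off curve a MIP solver would explore at scale.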

  16. The UMO (University of Maine, Orono) Teacher Training Program: A Case Study and a Model.

    ERIC Educational Resources Information Center

    Miller, James R.; McNally, Harry

    This case study presents a model of the University of Maine, Orono, pre-service program for preparing secondary social studies teachers. Focus is on the Foundations Component and the Methods Component, either of which can function independently of the other. Only brief mention is made of either the Exploratory Field Experience Component or the…

  17. Investigation of surface fluctuating pressures on a 1/4 scale YC-14 upper surface blown flap model

    NASA Technical Reports Server (NTRS)

    Pappa, R. S.

    1979-01-01

    Fluctuating pressures were measured at 30 positions on the surface of a 1/4-scale YC-14 wing and fuselage model during an outdoor static testing program. These data were obtained as part of a NASA program to study the fluctuating loads imposed on STOL aircraft configurations and to further the understanding of the scaling laws of unsteady surface pressure fields. Fluctuating pressure data were recorded at several discrete engine thrust settings for each of 16 configurations of the model. These data were reduced using the technique of random data analysis to obtain auto- and cross-spectral density functions and coherence functions for frequencies from 0 to 10 kHz, and cross-correlation functions for time delays from 0 to 10.24 ms. Results of this program provide the following items of particular interest: (1) Good collapse of normalized PSD functions on the USB flap was found using a technique applied by Lilley and Hodgson to data from a laboratory wall-jet apparatus. (2) Results indicate that the fluctuating pressure loading on surfaces washed by the jet exhaust flow was dominated by hydrodynamic pressure variations, that loading on surfaces well outside the flow region was dominated by acoustic pressure variations, and that loading near the flow boundaries arose from a mixture of the two.

  18. Radio-science performance analysis software

    NASA Astrophysics Data System (ADS)

    Morabito, D. D.; Asmar, S. W.

    1995-02-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio-science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussions on operating the program set on Galileo and Ulysses data will be presented.
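    The Allan deviation mentioned above has a compact non-overlapping estimator at the basic sampling interval; the fractional-frequency samples below are invented for illustration.

```python
import math

def allan_deviation(y):
    """Non-overlapping Allan deviation of fractional-frequency samples y:
    sigma_y = sqrt( sum_i (y[i+1] - y[i])^2 / (2 * (N - 1)) )."""
    n = len(y)
    avar = sum((b - a) ** 2 for a, b in zip(y, y[1:])) / (2.0 * (n - 1))
    return math.sqrt(avar)

# Alternating unit swings: every first difference is +-1
sigma = allan_deviation([0.0, 1.0, 0.0, 1.0, 0.0])
```

    A production tool such as STBLTY would evaluate this statistic at many averaging times tau to characterize USO stability, not just at the basic interval shown here.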

  19. Radio-Science Performance Analysis Software

    NASA Astrophysics Data System (ADS)

    Morabito, D. D.; Asmar, S. W.

    1994-10-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussion on operating the program set on Galileo and Ulysses data will be presented.

  20. Radio-science performance analysis software

    NASA Technical Reports Server (NTRS)

    Morabito, D. D.; Asmar, S. W.

    1995-01-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio-science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussions on operating the program set on Galileo and Ulysses data will be presented.
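
    The frequency stability that STBLTY reports "in terms of Allan deviation" is the standard measure of fractional frequency stability. As an illustration only (STBLTY's internal algorithms are not documented here; the function name and interface below are invented for the sketch), the non-overlapping Allan deviation can be computed from evenly spaced frequency residuals:

```python
import math

def allan_deviation(freq_residuals, f0, tau0):
    """Standard (non-overlapping) Allan deviation from frequency residuals.

    freq_residuals : frequency residuals [Hz], one sample every tau0 seconds
    f0             : nominal carrier frequency [Hz], used to normalize
    tau0           : sample interval [s] (kept for the caller's reference)
    """
    # Convert to dimensionless fractional frequency y_k = delta_f / f0
    y = [df / f0 for df in freq_residuals]
    # sigma_y^2(tau0) = 1/(2(M-1)) * sum_k (y_{k+1} - y_k)^2
    diffs = [(y[k + 1] - y[k]) ** 2 for k in range(len(y) - 1)]
    var = sum(diffs) / (2 * (len(y) - 1))
    return math.sqrt(var)
```

    A perfectly stable signal (constant residual) gives zero deviation; for white frequency noise the deviation falls as the square root of the averaging time when the computation is repeated on decimated samples.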

  1. Occupant behavior models: A critical review of implementation and representation approaches in building performance simulation programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Tianzhen; Chen, Yixing; Belafi, Zsofia

Occupant behavior (OB) is a leading factor influencing energy use in buildings. Quantifying this influence requires the integration of OB models with building performance simulation (BPS). This study reviews approaches to representing and implementing OB models in today’s popular BPS programs, and discusses weaknesses and strengths of these approaches and key issues in integrating OB models with BPS programs. Two of the key findings are: (1) a common data model is needed to standardize the representation of OB models, enabling their flexibility and exchange among BPS programs and user applications; the data model can be implemented using a standard syntax (e.g., in the form of an XML schema), and (2) a modular software implementation of OB models, such as functional mock-up units for co-simulation, adopting the common data model, has advantages in providing a robust and interoperable integration with multiple BPS programs. Such common OB model representation and implementation approaches help standardize the input structures of OB models, enable collaborative development of a shared library of OB models, and allow for rapid and widespread integration of OB models with BPS programs to improve the simulation of occupant behavior and the quantification of its impact on building performance.

  2. Occupant behavior models: A critical review of implementation and representation approaches in building performance simulation programs

    DOE PAGES

    Hong, Tianzhen; Chen, Yixing; Belafi, Zsofia; ...

    2017-07-27

Occupant behavior (OB) is a leading factor influencing energy use in buildings. Quantifying this influence requires the integration of OB models with building performance simulation (BPS). This study reviews approaches to representing and implementing OB models in today’s popular BPS programs, and discusses weaknesses and strengths of these approaches and key issues in integrating OB models with BPS programs. Two of the key findings are: (1) a common data model is needed to standardize the representation of OB models, enabling their flexibility and exchange among BPS programs and user applications; the data model can be implemented using a standard syntax (e.g., in the form of an XML schema), and (2) a modular software implementation of OB models, such as functional mock-up units for co-simulation, adopting the common data model, has advantages in providing a robust and interoperable integration with multiple BPS programs. Such common OB model representation and implementation approaches help standardize the input structures of OB models, enable collaborative development of a shared library of OB models, and allow for rapid and widespread integration of OB models with BPS programs to improve the simulation of occupant behavior and the quantification of its impact on building performance.

  3. Discovering Link Communities in Complex Networks by an Integer Programming Model and a Genetic Algorithm

    PubMed Central

    Li, Zhenping; Zhang, Xiang-Sun; Wang, Rui-Sheng; Liu, Hongwei; Zhang, Shihua

    2013-01-01

Identification of communities in complex networks is an important topic in many fields such as sociology, biology, and computer science. Communities are often defined as groups of related nodes or links that correspond to functional subunits in the corresponding complex systems. While most conventional approaches have focused on discovering communities of nodes, some recent studies have instead partitioned links, which yields overlapping communities in a straightforward way. In this paper, we propose a new quantity function for link community identification in complex networks. Based on this quantity function we formulate the link community partition problem as an integer programming model, which allows us to partition a complex network into overlapping communities. We further propose a genetic algorithm for link community detection that can partition a network into overlapping communities without knowing the number of communities in advance. We test our model and algorithm on both artificial networks and real-world networks. The results demonstrate that the model and algorithm are efficient in detecting overlapping community structure in complex networks. PMID:24386268
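
    The abstract does not reproduce the proposed quantity function. As a stand-in illustration, the sketch below computes the partition density of Ahn et al. (2010), a common objective for evaluating a partition of links into communities; the function name and input convention here are invented:

```python
def partition_density(link_communities):
    """Partition density (Ahn et al. 2010) of a set of link communities.

    link_communities : list of edge sets, each edge a frozenset of 2 nodes.
    Returns the link-averaged density; a community that is a single link
    (or any tree) contributes zero, a clique contributes the maximum.
    """
    M = sum(len(c) for c in link_communities)  # total number of links
    total = 0.0
    for edges in link_communities:
        m_c = len(edges)
        nodes = {u for e in edges for u in e}
        n_c = len(nodes)
        if n_c <= 2:
            continue  # a single link spans 2 nodes and scores zero
        # m_c - (n_c - 1) links beyond a spanning tree, normalized
        total += m_c * (m_c - (n_c - 1)) / ((n_c - 2) * (n_c - 1))
    return 2.0 * total / M if M else 0.0
```

    A community whose induced subgraph is a clique scores 1; a tree scores 0, so the measure rewards densely interconnected link groups.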

  4. Plate/shell structure topology optimization of orthotropic material for buckling problem based on independent continuous topological variables

    NASA Astrophysics Data System (ADS)

    Ye, Hong-Ling; Wang, Wei-Wei; Chen, Ning; Sui, Yun-Kang

    2017-10-01

The purpose of the present work is to study the buckling problem in plate/shell topology optimization of orthotropic material. A model of buckling topology optimization is established based on the independent, continuous, and mapping method, which takes structural mass as the objective and buckling critical loads as constraints. First, a composite exponential function (CEF) and a power function (PF) are introduced as filter functions to recognize the element mass, the element stiffness matrix, and the element geometric stiffness matrix. The filter functions for the orthotropic material stiffness are deduced. These filter functions are then introduced into the buckling topology optimization formulation to analyze the design sensitivity. Furthermore, the buckling constraints are approximately expressed as explicit functions of the design variables based on a first-order Taylor expansion, and the objective function is standardized based on a second-order Taylor expansion. The optimization model is thereby translated into a quadratic program. Finally, the dual sequence quadratic programming (DSQP) algorithm and the globally convergent method of moving asymptotes algorithm, each with the two filter functions (CEF and PF), are applied to solve the optimization model. Three numerical examples show that DSQP&CEF performs best in terms of structural mass and the discreteness of the resulting topology.

  5. Training the secretary in community mental health: a second model for integrating secretaries into the therapeutic team in community mental health.

    PubMed

    Nyman, G W; Watson, D; Schmidt, D; James, S E

    1975-01-01

    The secretaries in community mental health centers have functions that transcend their job descriptions. Their performance of these functions contributes to the success or failure of their centers' therapeutic programs. The Mental Health Training Institute of North Carolina initiated two separate pilot training programs within 1971-1972, aimed at heightening the secretaries' appreciation of their role within their centers and at facilitating their integration into the therapeutic team. This paper is a discussion of the second of these two programs.

  6. Integrated Electronic Warfare System Advanced Development Model (ADM); Appendix 1 - Functional Requirement Specification.

    DTIC Science & Technology

    1977-10-01

[Only OCR fragments of the specification's front matter survive: a revision/approval sheet, a table of contents listing the computer program documentation deliverables (Computer Data Base Design Document (CDBDD), Computer Program Package (CPP), Computer Program Operator's Manual (CPOM), Computer Program Test Plan (CPTPL)), and a list of figures including "IEWS Simplified Block Diagram" and "System Controller Architecture".]

  7. Reframing Resilience: Pilot Evaluation of a Program to Promote Resilience in Marginalized Older Adults

    ERIC Educational Resources Information Center

    Fullen, Matthew C.; Gorby, Sean R.

    2016-01-01

    Resilience has been described as a paradigm for aging that is more inclusive than models that focus on physiological and functional abilities. We evaluated a novel program, Resilient Aging, designed to influence marginalized older adults' perceptions of their resilience, self-efficacy, and wellness. The multiweek group program incorporated an…

  8. A Simulation of AI Programming Techniques in BASIC.

    ERIC Educational Resources Information Center

    Mandell, Alan

    1986-01-01

    Explains the functions of and the techniques employed in expert systems. Offers the program "The Periodic Table Expert," as a model for using artificial intelligence techniques in BASIC. Includes the program listing and directions for its use on: Tandy 1000, 1200, and 2000; IBM PC; PC Jr; TRS-80; and Apple computers. (ML)

  9. Wheat forecast economics effect study. [value of improved information on crop inventories, production, imports and exports

    NASA Technical Reports Server (NTRS)

    Mehra, R. K.; Rouhani, R.; Jones, S.; Schick, I.

    1980-01-01

A model to assess the value of improved information regarding the inventories, production, exports, and imports of crops on a worldwide basis is discussed. A previously proposed model is interpreted in a stochastic control setting and the underlying assumptions of the model are revealed. In solving the stochastic optimization problem, the Markov programming approach is more powerful and exact than the dynamic programming-simulation approach of the original model. The convergence of a dual-variable Markov programming algorithm is shown to be fast and efficient. A computer program for the general multicountry, multiperiod model is developed. As an example, the one-country, two-period case is treated and the results are presented in detail. A comparison with the original model's results reveals certain interesting aspects of the algorithms and the dependence of the value of information on the incremental cost function.

  10. An Eight-Parameter Function for Simulating Model Rocket Engine Thrust Curves

    ERIC Educational Resources Information Center

    Dooling, Thomas A.

    2007-01-01

    The toy model rocket is used extensively as an example of a realistic physical system. Teachers from grade school to the university level use them. Many teachers and students write computer programs to investigate rocket physics since the problem involves nonlinear functions related to air resistance and mass loss. This paper describes a nonlinear…

  11. Water resources planning and management : A stochastic dual dynamic programming approach

    NASA Astrophysics Data System (ADS)

    Goor, Q.; Pinte, D.; Tilmant, A.

    2008-12-01

Allocating water between different users and uses, including the environment, is one of the most challenging tasks facing water resources managers and has always been at the heart of Integrated Water Resources Management (IWRM). As water scarcity is expected to increase over time, allocation decisions among the different uses will have to be made while taking into account the complex interactions between water and the economy. Hydro-economic optimization models can capture those interactions while prescribing efficient allocation policies. Many hydro-economic models found in the literature are formulated as large-scale nonlinear optimization problems (NLP), seeking to maximize net benefits from system operation while meeting operational and/or institutional constraints and describing the main hydrological processes. However, those models rarely incorporate the uncertainty inherent in the availability of water, essentially because of the computational difficulties associated with stochastic formulations. The purpose of this presentation is to describe a stochastic programming model that can identify economically efficient allocation policies in large-scale multipurpose multireservoir systems. The model is based on stochastic dual dynamic programming (SDDP), an extension of traditional SDP that is not affected by the curse of dimensionality. SDDP identifies efficient allocation policies while considering hydrologic uncertainty. The objective function includes the net benefits from the hydropower and irrigation sectors, as well as penalties for not meeting operational and/or institutional constraints. To implement the efficient decomposition scheme that removes the computational burden, the one-stage SDDP problem has to be a linear program. Recent developments improve the representation of the nonlinear and mildly non-convex hydropower function through a convex hull approximation of the true hydropower function.
This model is illustrated on a cascade of 14 reservoirs on the Nile river basin.
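
    The convex hull approximation mentioned in this abstract can be illustrated in one dimension: to keep the one-stage problem linear, a mildly non-concave hydropower benefit curve is replaced by its tightest piecewise-linear concave overestimate. A minimal sketch (illustrative only; a production SDDP code works with multidimensional cutting planes):

```python
def concave_envelope(points):
    """Upper concave envelope of 1-D sample points (x, f(x)).

    Returns the subset of points forming the upper hull, i.e. the
    breakpoints of the tightest piecewise-linear concave overestimate
    of a mildly non-concave function (monotone-chain construction).
    """
    hull = []
    for p in sorted(points):
        # Pop while the last hull point lies on or below the chord
        # from its predecessor to p (a non-concave turn).
        while len(hull) >= 2:
            (x1, y1), (x2, y2) = hull[-2], hull[-1]
            if (x2 - x1) * (p[1] - y1) - (p[0] - x1) * (y2 - y1) >= 0:
                hull.pop()
            else:
                break
        hull.append(p)
    return hull
```

    Points below a chord between their neighbors are discarded, so the resulting piecewise-linear function never underestimates the sampled curve.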

  12. A fortran program for Monte Carlo simulation of oil-field discovery sequences

    USGS Publications Warehouse

    Bohling, Geoffrey C.; Davis, J.C.

    1993-01-01

We have developed a program for performing Monte Carlo simulation of oil-field discovery histories. A synthetic parent population of fields is generated as a finite sample from a distribution of specified form. The discovery sequence then is simulated by sampling without replacement from this parent population in accordance with a probabilistic discovery process model. The program computes a chi-squared deviation between synthetic and actual discovery sequences as a function of the parameters of the discovery process model, the number of fields in the parent population, and the distributional parameters of the parent population. The program employs the three-parameter log gamma model for the distribution of field sizes and employs a two-parameter discovery process model, allowing the simulation of a wide range of scenarios. © 1993.
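
    The abstract does not spell out the two-parameter discovery process model. A common assumption in discovery-process modeling, used here purely for illustration (the exponent `beta` and the function name are hypothetical), is that each undiscovered field is found with probability proportional to its size raised to a power:

```python
import random

def simulate_discovery_sequence(field_sizes, beta, rng=None):
    """Sample a discovery order without replacement, each undiscovered
    field chosen with probability proportional to size**beta.

    beta = 0 gives uniform random discovery; large beta finds the
    biggest fields first (a hypothetical discoverability exponent).
    """
    rng = rng or random.Random()
    remaining = list(field_sizes)
    sequence = []
    while remaining:
        weights = [s ** beta for s in remaining]
        r = rng.random() * sum(weights)
        acc = 0.0
        for i, w in enumerate(weights):
            acc += w
            if r <= acc:
                sequence.append(remaining.pop(i))
                break
    return sequence
```

    With large beta the biggest fields are discovered first, reproducing the size-biased order typical of real discovery histories.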

  13. Fuzzy Energy and Reserve Co-optimization With High Penetration of Renewable Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Cong; Botterud, Audun; Zhou, Zhi

In this study, we propose a fuzzy-based energy and reserve co-optimization model with consideration of high penetration of renewable energy. Under the assumption of a fixed uncertainty set of renewables, a two-stage robust model is proposed for clearing energy and reserves in the first stage and checking the feasibility and robustness of re-dispatches in the second stage. Fuzzy sets and their membership functions are introduced into the optimization model to represent the satisfaction degree of the variable uncertainty sets. The lower bound of the uncertainty set is expressed as fuzzy membership functions. The solutions are obtained by transforming the fuzzy mathematical programming formulation into traditional mixed integer linear programming problems.

  14. Fuzzy Energy and Reserve Co-optimization With High Penetration of Renewable Energy

    DOE PAGES

    Liu, Cong; Botterud, Audun; Zhou, Zhi; ...

    2016-10-21

In this study, we propose a fuzzy-based energy and reserve co-optimization model with consideration of high penetration of renewable energy. Under the assumption of a fixed uncertainty set of renewables, a two-stage robust model is proposed for clearing energy and reserves in the first stage and checking the feasibility and robustness of re-dispatches in the second stage. Fuzzy sets and their membership functions are introduced into the optimization model to represent the satisfaction degree of the variable uncertainty sets. The lower bound of the uncertainty set is expressed as fuzzy membership functions. The solutions are obtained by transforming the fuzzy mathematical programming formulation into traditional mixed integer linear programming problems.

  15. Profiler - A Fast and Versatile New Program for Decomposing Galaxy Light Profiles

    NASA Astrophysics Data System (ADS)

    Ciambur, Bogdan C.

    2016-12-01

    I introduce Profiler, a user-friendly program designed to analyse the radial surface brightness profiles of galaxies. With an intuitive graphical user interface, Profiler can accurately model galaxies of a broad range of morphological types, with various parametric functions routinely employed in the field (Sérsic, core-Sérsic, exponential, Gaussian, Moffat, and Ferrers). In addition to these, Profiler can employ the broken exponential model for disc truncations or anti-truncations, and two special cases of the edge-on disc model: along the disc's major or minor axis. The convolution of (circular or elliptical) models with the point spread function is performed in 2D, and offers a choice between Gaussian, Moffat or a user-provided profile for the point spread function. Profiler is optimised to work with galaxy light profiles obtained from isophotal measurements, which allow for radial gradients in the geometric parameters of the isophotes, and are thus often better at capturing the total light than 2D image-fitting programs. Additionally, the 1D approach is generally less computationally expensive and more stable. I demonstrate Profiler's features by decomposing three case-study galaxies: the cored elliptical galaxy NGC 3348, the nucleated dwarf Seyfert I galaxy Pox 52, and NGC 2549, a double-barred galaxy with an edge-on, truncated disc.
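
    Of the parametric functions Profiler employs, the Sérsic profile is the workhorse. A sketch of its standard form, using the common b_n ≈ 2n − 1/3 approximation (Profiler itself may evaluate b_n more precisely; the function name and interface below are ours):

```python
import math

def sersic(R, I_e, R_e, n):
    """Sersic surface-brightness profile.

    I(R) = I_e * exp(-b_n * ((R / R_e)**(1/n) - 1)), where I_e is the
    intensity at the effective (half-light) radius R_e and n is the
    Sersic index (n = 1 exponential disc, n = 4 de Vaucouleurs spheroid).
    """
    b_n = 2.0 * n - 1.0 / 3.0  # widely used approximation, good for n >~ 0.5
    return I_e * math.exp(-b_n * ((R / R_e) ** (1.0 / n) - 1.0))
```

    By construction the profile passes through I_e at R = R_e and declines monotonically outward.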

  16. Quality assessment of protein model-structures based on structural and functional similarities.

    PubMed

    Konopka, Bogumil M; Nebel, Jean-Christophe; Kotulska, Malgorzata

    2012-09-21

Experimental determination of protein 3D structures is expensive, time consuming and sometimes impossible. The gap between the number of protein structures deposited in the Worldwide Protein Data Bank and the number of sequenced proteins is constantly broadening. Computational modeling is deemed to be one of the ways to deal with the problem. Although protein 3D structure prediction is a difficult task, many tools are available. These tools can model a structure from a sequence or from partial structural information, e.g. contact maps. Consequently, biologists have the ability to generate automatically a putative 3D structure model of any protein. However, the main issue becomes evaluation of the model quality, which is one of the most important challenges of structural biology. GOBA (Gene Ontology-Based Assessment) is a novel protein model quality assessment program. It estimates the compatibility between a model-structure and its expected function. GOBA is based on the assumption that a high-quality model is expected to be structurally similar to proteins functionally similar to the prediction target. Whereas DALI is used to measure structure similarity, protein functional similarity is quantified using the standardized and hierarchical description of proteins provided by Gene Ontology, combined with Wang's algorithm for calculating semantic similarity. Two approaches are proposed to express the quality of protein model-structures. One is a single model quality assessment method; the other is its modification, which provides a relative measure of model quality. Exhaustive evaluation is performed on data sets of model-structures submitted to the CASP8 and CASP9 contests. The validation shows that the method is able to discriminate between good and bad model-structures. The best of the tested GOBA scores achieved mean Pearson correlations of 0.74 and 0.80 with the observed quality of models in our CASP8- and CASP9-based validation sets. GOBA also obtained the best result for two targets of CASP8 and one of CASP9, compared with the contest participants. Consequently, GOBA offers a novel single model quality assessment program that addresses the practical needs of biologists. In conjunction with other Model Quality Assessment Programs (MQAPs), it would prove useful for the evaluation of single protein models.

  17. Turbulence simulation mechanization for Space Shuttle Orbiter dynamics and control studies

    NASA Technical Reports Server (NTRS)

    Tatom, F. B.; King, R. L.

    1977-01-01

The current version of the NASA turbulence simulation model, in the form of a digital computer program, TBMOD, is described. The logic of the program is discussed and all inputs and outputs are defined. An alternate method of shear simulation suitable for incorporation into the model is presented. The simulation is based on a von Karman spectrum and the assumption of isotropy. The resulting spectral density functions for the shear model are included.
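
    The von Karman spectrum underlying the simulation has a standard textbook form; the sketch below gives the longitudinal component (the constants and normalization used by TBMOD itself are not given in the abstract, so treat this as a generic illustration):

```python
import math

def von_karman_longitudinal(omega, sigma_u, L):
    """Longitudinal von Karman turbulence power spectral density.

    omega   : spatial frequency [rad/m]
    sigma_u : RMS longitudinal turbulence intensity
    L       : turbulence length scale [m]
    Standard textbook form: sigma^2 * (2L/pi) / (1 + (1.339*L*omega)^2)^(5/6).
    """
    return (sigma_u ** 2 * (2.0 * L / math.pi)
            * (1.0 + (1.339 * L * omega) ** 2) ** (-5.0 / 6.0))
```

    The spectrum is flat at low spatial frequency and rolls off with the -5/3 inertial-range slope (the exponent 5/6 applied to the squared term) at high frequency.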

  18. The Routine Fitting of Kinetic Data to Models

    PubMed Central

    Berman, Mones; Shahn, Ezra; Weiss, Marjory F.

    1962-01-01

    A mathematical formalism is presented for use with digital computers to permit the routine fitting of data to physical and mathematical models. Given a set of data, the mathematical equations describing a model, initial conditions for an experiment, and initial estimates for the values of model parameters, the computer program automatically proceeds to obtain a least squares fit of the data by an iterative adjustment of the values of the parameters. When the experimental measures are linear combinations of functions, the linear coefficients for a least squares fit may also be calculated. The values of both the parameters of the model and the coefficients for the sum of functions may be unknown independent variables, unknown dependent variables, or known constants. In the case of dependence, only linear dependencies are provided for in routine use. The computer program includes a number of subroutines, each one of which performs a special task. This permits flexibility in choosing various types of solutions and procedures. One subroutine, for example, handles linear differential equations, another, special non-linear functions, etc. The use of analytic or numerical solutions of equations is possible. PMID:13867975
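
    The iterative least-squares adjustment of parameter values described above is, in modern terms, a Gauss-Newton scheme. A self-contained sketch (generic, not the paper's formalism; it uses a forward-difference Jacobian and a dense normal-equation solve, both invented for this illustration):

```python
def gauss_newton(model, params, t_data, y_data, iters=20, eps=1e-6):
    """Iteratively adjust parameters for a least-squares fit of data
    to a model. `model(params, t)` returns the predicted value at t."""
    p = list(params)
    for _ in range(iters):
        r = [y - model(p, t) for t, y in zip(t_data, y_data)]  # residuals
        # Forward-difference Jacobian J[i][j] = d model(t_i) / d p_j
        J = []
        for t in t_data:
            row = []
            for j in range(len(p)):
                q = list(p)
                q[j] += eps
                row.append((model(q, t) - model(p, t)) / eps)
            J.append(row)
        # Normal equations (J^T J) dp = J^T r
        n = len(p)
        A = [[sum(J[k][i] * J[k][j] for k in range(len(t_data)))
              for j in range(n)] for i in range(n)]
        b = [sum(J[k][i] * r[k] for k in range(len(t_data))) for i in range(n)]
        for i in range(n):            # Gaussian elimination (J^T J is SPD)
            for j in range(i + 1, n):
                f = A[j][i] / A[i][i]
                for k in range(i, n):
                    A[j][k] -= f * A[i][k]
                b[j] -= f * b[i]
        dp = [0.0] * n
        for i in range(n - 1, -1, -1):  # back substitution
            dp[i] = (b[i] - sum(A[i][k] * dp[k] for k in range(i + 1, n))) / A[i][i]
        p = [pi + di for pi, di in zip(p, dp)]
    return p
```

    The same routine fits nonlinear models, e.g. `model = lambda p, t: p[0] * math.exp(-p[1] * t)`, provided the initial parameter estimates are reasonable.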

  19. Program Synthesizes UML Sequence Diagrams

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Osborne, Richard N.

    2006-01-01

A computer program called "Rational Sequence" generates Unified Modeling Language (UML) sequence diagrams of a target Java program running on a Java virtual machine (JVM). Rational Sequence thereby performs a reverse-engineering function that aids in the design documentation of the target Java program. Whereas previously the construction of sequence diagrams was a tedious manual process, Rational Sequence generates UML sequence diagrams automatically from the running Java code.

  20. 75 FR 42760 - Statement of Organization, Functions, and Delegations of Authority

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-22

    ... accounting reports and invoices, and monitoring all spending. The Team develops, defends and executes the... results; performance measurement; research and evaluation methodologies; demonstration testing and model... ACF programs; strategic planning; performance measurement; program and policy evaluation; research and...

  1. Propulsive Reaction Control System Model

    NASA Technical Reports Server (NTRS)

    Brugarolas, Paul; Phan, Linh H.; Serricchio, Frederick; San Martin, Alejandro M.

    2011-01-01

    This software models a propulsive reaction control system (RCS) for guidance, navigation, and control simulation purposes. The model includes the drive electronics, the electromechanical valve dynamics, the combustion dynamics, and thrust. This innovation follows the Mars Science Laboratory entry reaction control system design, and has been created to meet the Mars Science Laboratory (MSL) entry, descent, and landing simulation needs. It has been built to be plug-and-play on multiple MSL testbeds [analysis, Monte Carlo, flight software development, hardware-in-the-loop, and ATLO (assembly, test and launch operations) testbeds]. This RCS model is a C language program. It contains two main functions: the RCS electronics model function that models the RCS FPGA (field-programmable-gate-array) processing and commanding of the RCS valve, and the RCS dynamic model function that models the valve and combustion dynamics. In addition, this software provides support functions to initialize the model states, set parameters, access model telemetry, and access calculated thruster forces.
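
    The valve-dynamics portion of such a model can be illustrated with a toy first-order lag (every state, constant, and name below is hypothetical; the MSL model's internals are not described in this abstract beyond its two main functions):

```python
def rcs_dynamics_step(valve_state, command, dt, tau=0.01, max_thrust=250.0):
    """One integration step of a toy RCS valve/thrust model.

    valve_state : current valve opening fraction in [0, 1]
    command     : commanded opening in [0, 1] (from the electronics model)
    dt          : integration step [s]
    tau         : hypothetical first-order valve time constant [s]
    max_thrust  : hypothetical full-open thrust [N]
    Returns (new_valve_state, thrust_N), thrust proportional to opening.
    """
    # First-order lag of the valve opening toward the commanded value
    new_state = valve_state + (command - valve_state) * (dt / tau)
    new_state = min(1.0, max(0.0, new_state))  # valve travel limits
    return new_state, max_thrust * new_state
```

    Stepped repeatedly with a constant command, the valve state converges exponentially to the command and the thrust to the corresponding fraction of full thrust.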

  2. Design Through Manufacturing: The Solid Model - Finite Element Analysis Interface

    NASA Technical Reports Server (NTRS)

    Rubin, Carol

    2003-01-01

    State-of-the-art computer aided design (CAD) presently affords engineers the opportunity to create solid models of machine parts which reflect every detail of the finished product. Ideally, these models should fulfill two very important functions: (1) they must provide numerical control information for automated manufacturing of precision parts, and (2) they must enable analysts to easily evaluate the stress levels (using finite element analysis - FEA) for all structurally significant parts used in space missions. Today's state-of-the-art CAD programs perform function (1) very well, providing an excellent model for precision manufacturing. But they do not provide a straightforward and simple means of automating the translation from CAD to FEA models, especially for aircraft-type structures. The research performed during the fellowship period investigated the transition process from the solid CAD model to the FEA stress analysis model with the final goal of creating an automatic interface between the two. During the period of the fellowship a detailed multi-year program for the development of such an interface was created. The ultimate goal of this program will be the development of a fully parameterized automatic ProE/FEA translator for parts and assemblies, with the incorporation of data base management into the solution, and ultimately including computational fluid dynamics and thermal modeling in the interface.

  3. Toward a New Model for Promoting Urban Children's Mental Health: Accessible, Effective, and Sustainable School-Based Mental Health Services

    ERIC Educational Resources Information Center

    Atkins, Marc S.; Graczyk, Patricia A.; Frazier, Stacy L.; Abdul-Adil, Jaleel

    2003-01-01

    A program of research related to school-based models for urban children's mental health is described, with a particular focus on improving access to services, promoting children's functioning, and providing for program sustainability. The first study in this series responded to the urgent need to engage more families in mental health services, and…

  4. AITRAC: Augmented Interactive Transient Radiation Analysis by Computer. User's information manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1977-10-01

AITRAC is a program designed for on-line, interactive, DC, and transient analysis of electronic circuits. The program solves the linear and nonlinear simultaneous equations which characterize the mathematical models used to predict circuit response. The program features 100-external-node, 200-branch capability; a conversational, free-format input language; built-in junction, FET, MOS, and switch models; a sparse matrix algorithm with extended-precision H matrix and T vector calculations, for fast and accurate execution; linear transconductances: beta, GM, MU, ZM; accurate and fast radiation effects analysis; a special interface for user-defined equations; selective control of multiple outputs; graphical outputs in wide and narrow formats; and on-line parameter modification capability. The user describes the problem by entering the circuit topology and part parameters. The program then automatically generates and solves the circuit equations, providing the user with printed or plotted output. The circuit topology and/or part values may then be changed by the user, and a new analysis requested. Circuit descriptions may be saved on disk files for storage and later use. The program contains built-in standard models for resistors, voltage and current sources, capacitors, inductors including mutual couplings, switches, junction diodes and transistors, FETs, and MOS devices. Nonstandard models may be constructed from standard models or by using the special equations interface. Time functions may be described by straight-line segments or by sine, damped sine, and exponential functions. 42 figures, 1 table. (RWR)
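
    At its core, the DC analysis such a simulator performs reduces to solving the node-admittance equations G·v = i. A dense-matrix sketch (AITRAC itself uses a sparse-matrix algorithm and nonlinear device models; this stand-in handles only linear resistive circuits and is ours for illustration):

```python
def solve_dc_nodal(conductances, currents):
    """Solve G * v = i for the node voltages of a linear resistive circuit.

    conductances : dense node-admittance matrix G (ground node eliminated),
                   symmetric and diagonally dominant for passive networks
    currents     : injected node currents i [A]
    Returns the node voltages v [V] by Gaussian elimination.
    """
    n = len(currents)
    A = [row[:] for row in conductances]  # work on copies
    b = list(currents)
    for i in range(n):                    # forward elimination
        for j in range(i + 1, n):
            f = A[j][i] / A[i][i]
            for k in range(i, n):
                A[j][k] -= f * A[i][k]
            b[j] -= f * b[i]
    v = [0.0] * n
    for i in range(n - 1, -1, -1):        # back substitution
        v[i] = (b[i] - sum(A[i][k] * v[k] for k in range(i + 1, n))) / A[i][i]
    return v
```

    For 1 A injected into a chain of two 1 Ω resistors to ground, the solver returns node voltages of 2 V and 1 V.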

  5. On 3-D inelastic analysis methods for hot section components. Volume 1: Special finite element models

    NASA Technical Reports Server (NTRS)

    Nakazawa, S.

    1987-01-01

    This Annual Status Report presents the results of work performed during the third year of the 3-D Inelastic Analysis Methods for Hot Section Components program (NASA Contract NAS3-23697). The objective of the program is to produce a series of new computer codes that permit more accurate and efficient three-dimensional analysis of selected hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The computer codes embody a progression of mathematical models and are streamlined to take advantage of geometrical features, loading conditions, and forms of material response that distinguish each group of selected components. This report is presented in two volumes. Volume 1 describes effort performed under Task 4B, Special Finite Element Special Function Models, while Volume 2 concentrates on Task 4C, Advanced Special Functions Models.

  6. Policy Gradient Adaptive Dynamic Programming for Data-Based Optimal Control.

    PubMed

    Luo, Biao; Liu, Derong; Wu, Huai-Ning; Wang, Ding; Lewis, Frank L

    2017-10-01

The model-free optimal control problem of general discrete-time nonlinear systems is considered in this paper, and a data-based policy gradient adaptive dynamic programming (PGADP) algorithm is developed to design an adaptive optimal control method. By using offline and online data rather than a mathematical system model, the PGADP algorithm improves the control policy with a gradient descent scheme. The convergence of the PGADP algorithm is proved by demonstrating that the constructed Q-function sequence converges to the optimal Q-function. Based on the PGADP algorithm, the adaptive control method is developed with an actor-critic structure and the method of weighted residuals. Its convergence properties are analyzed, where the approximate Q-function converges to its optimum. Computer simulation results demonstrate the effectiveness of the PGADP-based adaptive control method.
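
    The data-based flavor of such algorithms can be illustrated with a tabular sketch: fit a Q-function from logged transitions by repeated gradient-style updates on the temporal-difference error. This is generic batch Q-learning, not the paper's PGADP (which adds an actor-critic structure and the method of weighted residuals):

```python
def q_learning_from_data(transitions, n_states, n_actions, gamma=0.9,
                         alpha=0.1, sweeps=200):
    """Fit a tabular Q-function from a batch of (s, a, r, s') data.

    transitions : list of (state, action, reward, next_state) tuples
    gamma       : discount factor; alpha : step size; sweeps : data passes
    """
    Q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(sweeps):
        for s, a, r, s2 in transitions:
            target = r + gamma * max(Q[s2])        # bootstrapped target
            Q[s][a] += alpha * (target - Q[s][a])  # gradient step on TD error
    return Q
```

    Because only logged data are replayed, no system model is needed, which is the sense in which such schemes are "data-based".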

  7. Public Relations for Brazilian Libraries: Process, Principles, Program Planning, Planning Techniques and Suggestions.

    ERIC Educational Resources Information Center

    Kies, Cosette N.

    A brief overview of the functions of public relations in libraries introduces this manual, which provides an explanation of the public relations (PR) process, including fact-finding, planning, communicating, evaluating, and marketing; some PR principles; a 10-step program that could serve as a model for planning a PR program; a discussion of PR…

  8. The rehabilitation enhancing aging through connected health (REACH) study: study protocol for a quasi-experimental clinical trial.

    PubMed

    Ni, Meng; Brown, Lorna G; Lawler, Danielle; Ellis, Terry D; Deangelis, Tamara; Latham, Nancy K; Perloff, Jennifer; Atlas, Steve J; Percac-Lima, Sanja; Bean, Jonathan F

    2017-09-20

    Mobility limitations among older adults increase the risk for disability and healthcare utilization. Rehabilitative care is identified as the most efficacious treatment for maintaining physical function. However, there is insufficient evidence identifying a healthcare model that targets prevention of mobility decline among older adults. The objective of this study is to evaluate the preliminary effectiveness of a physical therapy program, augmented with mobile tele-health technology, on mobility function and healthcare utilization among older adults. This is a quasi-experimental 12-month clinical trial conducted within a metropolitan-based healthcare system in the northeastern United States. It runs in parallel with an existing longitudinal cohort study evaluating mobility decline among community-dwelling older adult primary care patients over one year. Seventy-five older adults (65-95 years) are being recruited using inclusion/exclusion criteria identical to those of the cohort study. Three aims will be evaluated: the effect of our program on 1) physical function, 2) healthcare utilization, and 3) healthcare costs. Changes in patient-reported function over 1 year in those receiving the intervention (aim 1) will be compared to propensity score matched controls (N = 150) from the cohort study. For aims 2 and 3, propensity scores, derived from a logistic regression model that includes demographic and diagnostic information available through claims and enrollment information, will be used to match treatment and control patients in a ratio of 1:2 or 1:3 from a Medicare claims registry derived from the same geographic region. The intervention consists of a one-year physical therapy program that is divided between a combination of outpatient and home visits (6-10 total visits) and is augmented with a commercially available application on a computerized tablet to deliver a progressive home-based exercise program emphasizing lower-extremity function and a walking program. 
Incorporating mobile health into current healthcare models of rehabilitative care has the potential to decrease hospital visits and provide a longer duration of care. If the hypotheses are supported and demonstrate improved mobility and reduced healthcare utilization, this innovative care model would be applicable for optimizing the maintenance of functional independence among community-dwelling older adults. ClinicalTrials.gov Identifier: NCT02580409 (Date of registration: October 14, 2015).

  9. Enabling Data Fusion via a Common Data Model and Programming Interface

    NASA Astrophysics Data System (ADS)

    Lindholm, D. M.; Wilson, A.

    2011-12-01

    Much progress has been made in scientific data interoperability, especially in the areas of metadata and discovery. However, while a data user may have improved techniques for finding data, there is often a large chasm to span when it comes to acquiring the desired subsets of various datasets and integrating them into a data processing environment. Some tools such as OPeNDAP servers and the Unidata Common Data Model (CDM) have introduced improved abstractions for accessing data via a common interface, but they alone do not go far enough to enable fusion of data from multidisciplinary sources. Although data from various scientific disciplines may represent semantically similar concepts (e.g. time series), the user may face widely varying structural representations of the data (e.g. row versus column oriented), not to mention radically different storage formats. It is not enough to convert data to a common format. The key to fusing scientific data is to represent each dataset with consistent sampling. This can best be done by using a data model that expresses the functional relationship that each dataset represents. The domain of those functions determines how the data can be combined. The Visualization for Algorithm Development (VisAD) Java API has provided a sophisticated data model for representing the functional nature of scientific datasets for well over a decade. Because VisAD is largely designed for its visualization capabilities, the data model can be cumbersome to use for numerical computation, especially for those not comfortable with Java. Although both VisAD and the implementation of the CDM are written in Java, neither defines a pure Java interface that others could implement and program to, further limiting potential for interoperability. 
In this talk, we will present a solution for data integration based on a simple discipline-agnostic scientific data model and programming interface that enables a dataset to be defined in terms of three variable types: Scalar (a), Tuple (a,b), and Function (a -> b). These basic building blocks can be combined and nested to represent any arbitrarily complex dataset. For example, a time series of surface temperature and pressure could be represented as: time -> ((lon,lat) -> (T,P)). Our data model is expressed in UML and can be implemented in numerous programming languages. We will demonstrate an implementation of our data model and interface using the Scala programming language. Given its functional programming constructs, sophisticated type system, and other language features, Scala enables us to construct complex data structures that can be manipulated using natural mathematical expressions while taking advantage of the language's ability to operate on collections in parallel. This API will be applied to the problem of assimilating various measurements of the solar spectrum and other proxies from multiple sources to construct a composite Lyman-alpha irradiance dataset.
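    The three building blocks described in this abstract (Scalar, Tuple, and Function a -> b) can be sketched as nested objects. This is a minimal illustration of the modeling idea, not the authors' Scala API; the class names and the sample values for the time -> ((lon, lat) -> (T, P)) example are assumptions made for the sketch.

```python
class Scalar:
    """A named scalar variable, e.g. time, lon, lat, T, P."""
    def __init__(self, name):
        self.name = name

class Tup:
    """A tuple of variables, e.g. (lon, lat) or (T, P)."""
    def __init__(self, *elements):
        self.elements = elements

class Function:
    """A sampled functional relationship: domain value -> codomain value."""
    def __init__(self, domain, codomain, samples):
        self.domain, self.codomain = domain, codomain
        self.samples = dict(samples)        # consistent sampling over the domain
    def __call__(self, x):
        return self.samples[x]

# A gridded field of surface temperature and pressure: (lon, lat) -> (T, P)
grid_at_t0 = Function(Tup(Scalar("lon"), Scalar("lat")),
                      Tup(Scalar("T"), Scalar("P")),
                      {(0.0, 0.0): (288.1, 1013.2), (1.0, 0.0): (287.6, 1012.8)})

# Nesting gives the time series: time -> ((lon, lat) -> (T, P))
series = Function(Scalar("time"), None, {0: grid_at_t0})  # codomain is itself a Function

print(series(0)((1.0, 0.0)))    # → (287.6, 1012.8)
```

    Because the domain of each Function is explicit, two datasets can be fused by resampling onto a shared domain before combining their codomains, which is the point the abstract makes about consistent sampling.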

  10. On 3D inelastic analysis methods for hot section components

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.; Chen, P. C.; Dame, L. T.; Holt, R. V.; Huang, H.; Hartle, M.; Gellin, S.; Allen, D. H.; Haisler, W. E.

    1986-01-01

    Accomplishments are described for the two-year program to develop advanced 3-D inelastic structural stress analysis methods and solution strategies for more accurate and cost-effective analysis of combustors, turbine blades, and vanes. The approach was to develop a matrix of formulation elements and constitutive models. Three constitutive models were developed in conjunction with optimized iterating techniques, accelerators, and convergence criteria within a framework of dynamic time incrementing. Three formulation models were developed: an eight-noded mid-surface shell element, a nine-noded mid-surface shell element, and a twenty-noded isoparametric solid element. A separate computer program was developed for each combination of constitutive model and formulation model. Each program provides a functional stand-alone capability for performing cyclic nonlinear structural analysis. In addition, the analysis capabilities incorporated into each program can be abstracted in subroutine form for incorporation into other codes or to form new combinations.

  11. The 3D inelastic analysis methods for hot section components

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.; Maffeo, R. J.; Tipton, M. T.; Weber, G.

    1992-01-01

    A two-year program to develop advanced 3D inelastic structural stress analysis methods and solution strategies for more accurate and cost-effective analysis of combustors, turbine blades, and vanes is described. The approach was to develop a matrix of formulation elements and constitutive models. Three constitutive models were developed in conjunction with optimized iterating techniques, accelerators, and convergence criteria within a framework of dynamic time incrementing. Three formulation models were developed: an eight-noded midsurface shell element; a nine-noded midsurface shell element; and a twenty-noded isoparametric solid element. A separate computer program has been developed for each combination of constitutive model and formulation model. Each program provides a functional stand-alone capability for performing cyclic nonlinear structural analysis. In addition, the analysis capabilities incorporated into each program can be abstracted in subroutine form for incorporation into other codes or to form new combinations.

  12. Manual of phosphoric acid fuel cell power plant optimization model and computer program

    NASA Technical Reports Server (NTRS)

    Lu, C. Y.; Alkasab, K. A.

    1984-01-01

    An optimized cost and performance model for a phosphoric acid fuel cell power plant system was derived and developed into a modular FORTRAN computer code. Cost, energy, mass, and electrochemical analyses were combined to develop a mathematical model for optimizing the steam-to-methane ratio in the reformer, hydrogen utilization in the PAFC, and the number of plates per stack. The nonlinear programming code, COMPUTE, was used to solve this model, in which the method of mixed penalty function combined with the Hooke and Jeeves pattern search was chosen to evaluate this specific optimization problem.
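    The combination reported here, a penalty function minimized by Hooke and Jeeves pattern search, can be sketched on a toy problem. The objective and constraint below are illustrative stand-ins, not the fuel-cell model, and the exterior quadratic penalty is one simple member of the mixed-penalty family the abstract mentions: minimize (x-3)^2 + (y-2)^2 subject to x + y <= 4.

```python
def penalized(x, y, r):
    g = x + y - 4.0                          # constraint violation when positive
    return (x - 3)**2 + (y - 2)**2 + r * max(0.0, g)**2

def explore(f, point, step):
    """Try +/- step along each coordinate, keeping any improvement."""
    pt = list(point)
    for i in range(len(pt)):
        for d in (step, -step):
            cand = list(pt)
            cand[i] += d
            if f(*cand) < f(*pt):
                pt = cand
                break
    return pt

def hooke_jeeves(f, x0, step=0.5, tol=1e-6, max_iter=50000):
    base = list(x0)
    for _ in range(max_iter):
        if step < tol:
            break
        trial = explore(f, base, step)
        if f(*trial) < f(*base):
            # pattern move: extrapolate along the successful direction
            pattern = [2 * t - b for t, b in zip(trial, base)]
            probe = explore(f, pattern, step)
            base = probe if f(*probe) < f(*trial) else trial
        else:
            step *= 0.5                      # no improvement: refine the mesh
    return base

r = 1e4                                      # penalty weight (assumed value)
xy = hooke_jeeves(lambda x, y: penalized(x, y, r), [0.0, 0.0])
print(xy)    # should land near the constrained optimum (2.5, 1.5)
```

    The pattern move is what lets the search follow the narrow valley the penalty term creates along the constraint boundary.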

  13. New Careers in Nursing: An Effective Model for Increasing Nursing Workforce Diversity.

    PubMed

    Craft-Blacksheare, Melva

    2018-03-01

    The Robert Wood Johnson Foundation and the American Association of Colleges of Nursing developed the New Careers in Nursing (NCIN) program to address the nursing shortage, increase workforce diversity, and raise the profession's educational level. The program provided scholarships to second-degree underrepresented or economically disadvantaged (UED) students attending an accelerated nursing program to earn a Bachelor of Science in Nursing degree. A midwestern university received three academic-year cycles of NCIN funding. The program's model, resources, and functioning are described. The NCIN provided exceptional financial and program support that received high marks from participants. During the three award cycles, 20 UED scholars graduated with a Bachelor of Science in Nursing degree. Nineteen of the 20 scholars passed the NCLEX-RN on the first attempt. While the NCIN program has ended, nursing school administrators and faculty wishing to promote UED student success should consider using the program's model and resources as the basis for their own program. [J Nurs Educ. 2018;57(3):178-183.]. Copyright 2018, SLACK Incorporated.

  14. Neutron Scattering Software

    Science.gov Websites

    Large Array Manipulation Program (LAMP): IDL-based data analysis and visualization. Open Genie: interactive data analysis. ORTEP: Oak Ridge Thermal Ellipsoid Plot program for crystal structure illustrations. aClimax: modeling of inelastic neutron spectroscopy using Density Functional Theory.

  15. Improved Evolutionary Programming with Various Crossover Techniques for Optimal Power Flow Problem

    NASA Astrophysics Data System (ADS)

    Tangpatiphan, Kritsana; Yokoyama, Akihiko

    This paper presents an Improved Evolutionary Programming (IEP) for solving the Optimal Power Flow (OPF) problem, which is considered a non-linear, non-smooth, and multimodal optimization problem in power system operation. The total generator fuel cost is regarded as the objective function to be minimized. The proposed method is an Evolutionary Programming (EP)-based algorithm that makes use of various crossover techniques normally applied in Real Coded Genetic Algorithms (RCGA). The effectiveness of the proposed approach is investigated on the IEEE 30-bus system with three different types of fuel cost functions: the quadratic cost curve, the piecewise quadratic cost curve, and the quadratic cost curve superimposed by a sine component. These three cost curves represent the generator fuel cost functions of a simplified model and of more accurate models of a combined-cycle generating unit and a thermal unit with the valve-point loading effect, respectively. The OPF solutions by the proposed method and Pure Evolutionary Programming (PEP) are observed and compared. The simulation results indicate that IEP requires less computing time than PEP, with better solutions in some cases. Moreover, the influences of important IEP parameters on the OPF solution are described in detail.
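    The core idea, EP's Gaussian mutation augmented with an RCGA-style crossover, can be sketched on a toy dispatch problem. Everything below is an illustrative assumption (two generators with made-up quadratic costs and a fixed total output of 10, blend crossover, population and generation counts); it is not the IEEE 30-bus case or the authors' parameter settings.

```python
import random
random.seed(1)

# Toy two-generator economic dispatch: p2 is eliminated via p1 + p2 = 10.
def cost(p1):
    p2 = 10.0 - p1
    return (0.1 * p1**2 + 2.0 * p1) + (0.15 * p2**2 + 1.5 * p2)

def clip(p):
    return min(10.0, max(0.0, p))

pop = [random.uniform(0, 10) for _ in range(20)]
for _ in range(200):
    # EP step: Gaussian mutation of every individual
    offspring = [clip(p + random.gauss(0, 1.0)) for p in pop]
    # IEP addition: blend (BLX-alpha style) crossover, as used in real-coded GAs
    for _ in range(10):
        a, b = random.sample(pop, 2)
        lo, hi = min(a, b), max(a, b)
        span = (hi - lo) * 0.5
        offspring.append(clip(random.uniform(lo - span, hi + span)))
    # (mu + lambda) elitist selection: keep the 20 cheapest of parents + offspring
    pop = sorted(pop + offspring, key=cost)[:20]

best = pop[0]
print(best, cost(best))    # analytic optimum of this toy problem is p1 = 5
```

    The crossover draws recombine good parents directly, which is the mechanism the paper credits for faster convergence than mutation-only PEP.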

  16. How does the motor relearning program improve neurological function of brain ischemia monkeys?☆

    PubMed Central

    Yin, Yong; Gu, Zhen; Pan, Lei; Gan, Lu; Qin, Dongdong; Yang, Bo; Guo, Jin; Hu, Xintian; Wang, Tinghua; Feng, Zhongtang

    2013-01-01

    The motor relearning program can significantly improve various functional disturbances induced by ischemic cerebrovascular diseases. However, its mechanism of action remains poorly understood. In injured brain tissues, changes in glial fibrillary acidic protein and neurofilament protein can reflect the condition of injured neurons and astrocytes, while changes in vascular endothelial growth factor and basic fibroblast growth factor can indicate angiogenesis. In the present study, we induced ischemic brain injury in the rhesus macaque by electrocoagulation of the M1 segment of the right middle cerebral artery. The motor relearning program was conducted for 60 days from the third day after model establishment. Immunohistochemistry and single-photon emission CT showed that the numbers of glial fibrillary acidic protein-, neurofilament protein-, vascular endothelial growth factor- and basic fibroblast growth factor-positive cells were significantly increased in the infarcted side compared with the contralateral hemisphere following the motor relearning program. Moreover, cerebral blood flow in the infarcted side was significantly improved. The clinical rating scale for stroke was used to assess neurological function changes in the rhesus macaque following the motor relearning program. Results showed that motor function was improved, and problems with consciousness, self-care ability and balance function were significantly ameliorated. These findings indicate that the motor relearning program significantly promoted neuronal regeneration, repair and angiogenesis in the surroundings of the infarcted hemisphere, and improved neurological function in the rhesus macaque following brain ischemia. PMID:25206440

  17. Neurite, a Finite Difference Large Scale Parallel Program for the Simulation of Electrical Signal Propagation in Neurites under Mechanical Loading

    PubMed Central

    García-Grajales, Julián A.; Rucabado, Gabriel; García-Dopico, Antonio; Peña, José-María; Jérusalem, Antoine

    2015-01-01

    With the growing body of research on traumatic brain injury and spinal cord injury, computational neuroscience has recently focused its modeling efforts on neuronal functional deficits following mechanical loading. However, in most of these efforts, cell damage is generally only characterized by purely mechanistic criteria, functions of quantities such as stress, strain or their corresponding rates. The modeling of functional deficits in neurites as a consequence of macroscopic mechanical insults has been rarely explored. In particular, a quantitative mechanically based model of electrophysiological impairment in neuronal cells, Neurite, has only very recently been proposed. In this paper, we present the implementation details of this model: a finite difference parallel program for simulating electrical signal propagation along neurites under mechanical loading. Following the application of a macroscopic strain at a given strain rate produced by a mechanical insult, Neurite is able to simulate the resulting neuronal electrical signal propagation, and thus the corresponding functional deficits. The simulation of the coupled mechanical and electrophysiological behaviors requires computationally expensive calculations that increase in complexity as the network of the simulated cells grows. The solvers implemented in Neurite (explicit and implicit) were therefore parallelized using graphics processing units in order to reduce the burden of the simulation costs of large scale scenarios. Cable Theory and Hodgkin-Huxley models were implemented to account for the electrophysiological passive and active regions of a neurite, respectively, whereas a coupled mechanical model accounting for the neurite mechanical behavior within its surrounding medium was adopted as a link between electrophysiology and mechanics. 
This paper provides the details of the parallel implementation of Neurite, along with three different application examples: a long myelinated axon, a segmented dendritic tree, and a damaged axon. The capabilities of the program to deal with large scale scenarios, segmented neuronal structures, and functional deficits under mechanical loading are specifically highlighted. PMID:25680098
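    The passive (cable-theory) part of such a solver reduces to a finite-difference update of the cable equation. The sketch below is a minimal explicit scheme in normalized units with arbitrary grid and parameter values; it illustrates the kind of discretization involved, not Neurite's actual solver, units, or coupling to mechanics.

```python
# Passive cable equation (normalized): tau * dV/dt = lambda^2 * d2V/dx2 - V
nx, dx, dt = 101, 0.1, 0.001      # grid and time step (dt chosen for stability)
lam2, tau = 1.0, 1.0              # normalized space and time constants (assumed)

V = [0.0] * nx
V[nx // 2] = 1.0                  # initial voltage bump mid-neurite

for _ in range(2000):
    Vn = V[:]
    for i in range(1, nx - 1):
        d2v = (V[i + 1] - 2 * V[i] + V[i - 1]) / dx**2   # second spatial difference
        Vn[i] = V[i] + (dt / tau) * (lam2 * d2v - V[i])  # explicit Euler step
    Vn[0], Vn[-1] = Vn[1], Vn[-2]                        # sealed-end (zero-flux) boundaries
    V = Vn

# The bump should both spread (diffusion term) and decay (leak term).
print(max(V), sum(V))
```

    The explicit scheme is stable here because dt * lam2 / dx^2 = 0.1 is well below the usual 0.5 limit; active Hodgkin-Huxley regions would add voltage-dependent currents to the same update.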

  18. SYSTID - A flexible tool for the analysis of communication systems.

    NASA Technical Reports Server (NTRS)

    Dawson, C. T.; Tranter, W. H.

    1972-01-01

    Description of the System Time Domain Simulation (SYSTID) computer-aided analysis program which is specifically structured for communication systems analysis. The SYSTID program is user oriented so that very little knowledge of computer techniques and very little programming ability are required for proper application. The program is designed so that the user can go from a system block diagram to an accurate simulation by simply programming a single English language statement for each block in the system. The mathematical and functional models available in the SYSTID library are presented. An example problem is given which illustrates the ease of modeling communication systems. Examples of the outputs available are presented, and proposed improvements are summarized.

  19. A generalized fuzzy linear programming approach for environmental management problem under uncertainty.

    PubMed

    Fan, Yurui; Huang, Guohe; Veawab, Amornvadee

    2012-01-01

    In this study, a generalized fuzzy linear programming (GFLP) method was developed to deal with uncertainties expressed as fuzzy sets that exist in the constraints and objective function. A stepwise interactive algorithm (SIA) was advanced to solve the GFLP model and generate solutions expressed as fuzzy sets. To demonstrate its application, the developed GFLP method was applied to a regional sulfur dioxide (SO2) control planning model to identify effective SO2 mitigation policies with a minimized system performance cost under uncertainty. The results were obtained to represent the amount of SO2 allocated to different control measures from different sources. Compared with the conventional interval-parameter linear programming (ILP) approach, the solutions obtained through GFLP were expressed as fuzzy sets, which can provide intervals for the decision variables and objective function, as well as related possibilities. Therefore, decision makers can make a tradeoff between model stability and plausibility based on solutions obtained through GFLP and then identify desired policies for SO2-emission control under uncertainty.

  20. Space station interior noise analysis program

    NASA Technical Reports Server (NTRS)

    Stusnick, E.; Burn, M.

    1987-01-01

    Documentation is provided for a microcomputer program which was developed to evaluate the effect of the vibroacoustic environment on speech communication inside a space station. The program, entitled Space Station Interior Noise Analysis Program (SSINAP), combines a Statistical Energy Analysis (SEA) prediction of sound and vibration levels within the space station with a speech intelligibility model based on the Modulation Transfer Function and the Speech Transmission Index (MTF/STI). The SEA model provides an effective analysis tool for predicting the acoustic environment based on proposed space station design. The MTF/STI model provides a method for evaluating speech communication in the relatively reverberant and potentially noisy environments that are likely to occur in space stations. The combination of these two models provides a powerful analysis tool for optimizing the acoustic design of space stations from the point of view of speech communications. The mathematical algorithms used in SSINAP are presented to implement the SEA and MTF/STI models. An appendix provides an explanation of the operation of the program along with details of the program structure and code.

  1. Nonlinear Structured Growth Mixture Models in Mplus and OpenMx

    PubMed Central

    Grimm, Kevin J.; Ram, Nilam; Estabrook, Ryne

    2014-01-01

    Growth mixture models (GMMs; Muthén & Muthén, 2000; Muthén & Shedden, 1999) are a combination of latent curve models (LCMs) and finite mixture models to examine the existence of latent classes that follow distinct developmental patterns. GMMs are often fit with linear, latent basis, multiphase, or polynomial change models because of their common use, flexibility in modeling many types of change patterns, the availability of statistical programs to fit such models, and the ease of programming. In this paper, we present additional ways of modeling nonlinear change patterns with GMMs. Specifically, we show how LCMs that follow specific nonlinear functions can be extended to examine the presence of multiple latent classes using the Mplus and OpenMx computer programs. These models are fit to longitudinal reading data from the Early Childhood Longitudinal Study-Kindergarten Cohort to illustrate their use. PMID:25419006

  2. The analytical representation of viscoelastic material properties using optimization techniques

    NASA Technical Reports Server (NTRS)

    Hill, S. A.

    1993-01-01

    This report presents a technique to model viscoelastic material properties with a function of the form of the Prony series. Generally, the method employed to determine the function constants requires assuming values for the exponential constants of the function and then resolving the remaining constants through linear least-squares techniques. The technique presented here allows all the constants to be analytically determined through optimization techniques. This technique is employed in a computer program named PRONY and makes use of a commercially available optimization tool developed by VMA Engineering, Inc. The PRONY program was utilized to compare the technique against previously determined models for solid rocket motor TP-H1148 propellant and V747-75 Viton fluoroelastomer. In both cases, the optimization technique generated functions that modeled the test data with at least an order of magnitude better correlation. This technique has demonstrated the capability to use small or large data sets and to use data sets that have uniformly or nonuniformly spaced data pairs. The reduction of experimental data to accurate mathematical models is a vital part of most scientific and engineering research. This technique of regression through optimization can be applied to other mathematical models that are difficult to fit to experimental data through traditional regression techniques.
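    The contrast the report draws, guessing the exponential constants versus optimizing over them, can be sketched for a one-term Prony series G(t) = c + a*exp(-t/tau). The synthetic data and the coarse grid search below are illustrative assumptions, not the PRONY program or the VMA optimization tool: tau is treated as a free variable while a and c are solved in closed form for each candidate.

```python
import math

ts = [0.1 * i for i in range(50)]
ys = [2.0 + 3.0 * math.exp(-t / 0.7) for t in ts]     # synthetic data: c=2, a=3, tau=0.7

def linear_fit(tau):
    """Closed-form least squares for y = c + a*x with x = exp(-t/tau)."""
    xs = [math.exp(-t / tau) for t in ts]
    n = len(ts)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    c = (sy - a * sx) / n
    sse = sum((c + a * x - y) ** 2 for x, y in zip(xs, ys))
    return a, c, sse

# Optimize over tau as well (here by scan; a real tool would use a smarter search)
best = min((linear_fit(tau) + (tau,) for tau in
            [0.05 * k for k in range(1, 100)]), key=lambda r: r[2])
a, c, sse, tau = best
print(round(a, 2), round(c, 2), round(tau, 2))    # → 3.0 2.0 0.7
```

    Fixing tau at a wrong guess and solving only a and c leaves a large residual; letting the optimizer move tau recovers the generating parameters, which is the report's point about improved correlation.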

  3. Evolution of a minimal parallel programming model

    DOE PAGES

    Lusk, Ewing; Butler, Ralph; Pieper, Steven C.

    2017-04-30

    Here, we take a historical approach to our presentation of self-scheduled task parallelism, a programming model with its origins in early irregular and nondeterministic computations encountered in automated theorem proving and logic programming. We show how an extremely simple task model has evolved into a system, asynchronous dynamic load balancing (ADLB), and a scalable implementation capable of supporting sophisticated applications on today’s (and tomorrow’s) largest supercomputers; and we illustrate the use of ADLB with a Green’s function Monte Carlo application, a modern, mature nuclear physics code in production use. Our lesson is that by surrendering a certain amount of generality and thus applicability, a minimal programming model (in terms of its basic concepts and the size of its application programmer interface) can achieve extreme scalability without introducing complexity.
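    The minimal model described, workers that both put tasks into and get tasks from a shared pool, can be sketched with a thread-safe queue. This is a toy single-process illustration of the self-scheduling concept, not the ADLB library or its MPI-based API; the recursive split-a-number workload is invented for the example.

```python
import queue, threading

work = queue.Queue()      # shared task pool (ADLB-style put/get, in one process)
results = queue.Queue()

def worker():
    while True:
        task = work.get()
        if task is None:                 # poison pill: shut down this worker
            work.task_done()
            return
        n = task
        if n > 1:                        # workers create subtasks dynamically
            work.put(n // 2)
            work.put(n - n // 2)
        else:
            results.put(1)               # leaf unit of work completed
        work.task_done()

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
work.put(16)                             # seed task: splits into 16 leaves
work.join()                              # block until the pool drains
for _ in threads:
    work.put(None)
for t in threads:
    t.join()

total = 0
while not results.empty():
    total += results.get()
print(total)    # → 16
```

    No worker is assigned work in advance; each schedules itself by pulling the next available task, which is the load-balancing property that lets the model scale on irregular workloads.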

  4. The Restorative Healing Model: Implementation at the Woodbourne Center

    ERIC Educational Resources Information Center

    Park, Juyoung; Carlson, George; Weinstein, Stanley; Lee, Bethany

    2008-01-01

    This study describes the Restorative Healing Model used at the Woodbourne Center (Baltimore) to improve socially adaptive functioning and behaviors among youth residing in a residential treatment center. This treatment model requires collaborative work with youth, their families, staff members, and community members. Unlike program models built on…

  5. SSME model, engine dynamic characteristics related to Pogo

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A linear model of the space shuttle main engine for use in Pogo studies was presented. A digital program is included from which engine transfer functions are determined relative to the engine operating level.

  6. An implementation framework for wastewater treatment models requiring a minimum programming expertise.

    PubMed

    Rodríguez, J; Premier, G C; Dinsdale, R; Guwy, A J

    2009-01-01

    Mathematical modelling in environmental biotechnology has traditionally been a difficult resource to access for researchers and students without programming expertise. The great degree of flexibility required of model implementation platforms for research applications restricts their use to expert programmers, while more user-friendly software packages do not normally incorporate the flexibility needed for most research applications. This work presents a methodology based on Excel and Matlab-Simulink for flexible and accessible implementation of mathematical models by researchers with and without programming expertise. The models are almost fully defined in an Excel file in which the names and values of the state variables and parameters are easily created. This information is automatically processed in Matlab to create the model structure, and almost immediate model simulation is possible after only a minimal Matlab code definition. The framework proposed also provides programming-expert researchers with a highly flexible and modifiable platform on which to base more complex model implementations. The method takes advantage of structural generalities in most mathematical models of environmental bioprocesses while enabling the integration of advanced elements (e.g. heuristic functions, correlations). The methodology has already been successfully used in a number of research studies.

  7. F-Nets and Software Cabling: Deriving a Formal Model and Language for Portable Parallel Programming

    NASA Technical Reports Server (NTRS)

    DiNucci, David C.; Saini, Subhash (Technical Monitor)

    1998-01-01

    Parallel programming is still based upon antiquated, sequence-based definitions of the terms "algorithm" and "computation", resulting in programs which are architecture dependent and difficult to design and analyze. By focusing on obstacles inherent in existing practice, a more portable model is derived here, which is then formalized into a model called F-Nets that utilizes a combination of imperative and functional styles. This formalization suggests more general notions of algorithm and computation, as well as insights into the meaning of structured programming in a parallel setting. To illustrate how these principles can be applied, a very-high-level graphical architecture-independent parallel language, called Software Cabling, is described, with many of the features normally expected from today's computer languages (e.g., data abstraction, data parallelism, and object-based programming constructs).

  8. An Ada Linear-Algebra Software Package Modeled After HAL/S

    NASA Technical Reports Server (NTRS)

    Klumpp, Allan R.; Lawson, Charles L.

    1990-01-01

    New avionics software written more easily. Software package extends Ada programming language to include linear-algebra capabilities similar to those of HAL/S programming language. Designed for such avionics applications as Space Station flight software. In addition to built-in functions of HAL/S, package incorporates quaternion functions used in Space Shuttle and Galileo projects and routines from LINPACK for solving systems of equations involving general square matrices. Contains two generic programs: one for floating-point computations and one for integer computations. Written on IBM PC/AT personal computer running under PC DOS, v.3.1.

  9. Improved functions and reduced length of stay after inpatient rehabilitation programs in older adults with stroke: A systematic review and meta-analysis of randomized controlled trials.

    PubMed

    Bindawas, Saad M; Vennu, Vishal; Moftah, Emad

    2017-01-01

    OBJECTIVE: To examine the effects of inpatient rehabilitation programs on function and length of stay in older adults with stroke. METHODS: A total of five electronic databases were searched for relevant randomized controlled trials that examined the effects of inpatient rehabilitation programs on functional recovery, as measured by the functional independence measure, and on length of stay, measured in days. We included full-text articles written in English, with no time limit. The methodological quality and risk of bias were assessed using the Physiotherapy Evidence Database Scale and the Cochrane collaboration tools, respectively. The effect sizes and confidence intervals were estimated using fixed-effect models. RESULTS: Eight randomized controlled trials involving 1,910 patients with stroke were included in the meta-analysis, which showed that patients who participated in the inpatient rehabilitation programs had significantly (p < 0.05) higher functional independence measure scores (effect size = 0.10; 95 percent confidence interval = 0.01, 0.22) and shorter length of stay (effect size = 0.14; 95 percent confidence interval = 0.03, 0.22). This systematic review provided evidence that inpatient rehabilitation programs have beneficial effects, improving function and reducing length of stay for older adults with stroke.
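    The fixed-effect pooling used in meta-analyses like this one is inverse-variance weighting, which can be sketched in a few lines. The four (effect size, standard error) pairs below are made up for illustration and are not the trial data from this review.

```python
import math

# Hypothetical per-study results: (standardized effect size d, standard error)
studies = [(0.12, 0.08), (0.05, 0.10), (0.18, 0.12), (0.09, 0.06)]

# Fixed-effect model: weight each study by the inverse of its variance
weights = [1.0 / se**2 for _, se in studies]
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
se_pooled = math.sqrt(1.0 / sum(weights))
ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

print(round(pooled, 3), tuple(round(c, 3) for c in ci))
```

    Precise studies (small standard errors) dominate the pooled estimate, and the pooled standard error shrinks as studies accumulate, which is why the combined confidence intervals in the review are tighter than those of the individual trials.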

  10. Design strategies and functionality of the Visual Interface for Virtual Interaction Development (VIVID) tool

    NASA Technical Reports Server (NTRS)

    Nguyen, Lac; Kenney, Patrick J.

    1993-01-01

    Development of interactive virtual environments (VE) has typically consisted of three primary activities: model (object) development, model relationship tree development, and environment behavior definition and coding. The model and relationship tree development activities are accomplished with a variety of well-established graphic library (GL) based programs, most utilizing graphical user interfaces (GUI) with point-and-click interactions. Because of this GUI format, little programming expertise on the part of the developer is necessary to create the 3D graphical models or to establish interrelationships between the models. However, the third VE development activity, environment behavior definition and coding, has generally required the greatest amount of time and programmer expertise. Behaviors, characteristics, and interactions between objects and the user within a VE must be defined via command-line C coding prior to rendering the environment scenes. In an effort to simplify this environment behavior definition phase for non-programmers, and to provide easy access to model and tree tools, a graphical interface and development tool has been created. The principal thrust of this research is to effect rapid development and prototyping of virtual environments. This presentation will discuss the Visual Interface for Virtual Interaction Development (VIVID) tool, an X-Windows-based system employing drop-down menus for user selection of program access, models and trees, behavior editing, and code generation. Examples of these selections will be highlighted in this presentation, as will the currently available program interfaces. The functionality of this tool allows non-programming users access to all facets of VE development while providing experienced programmers with a collection of pre-coded behaviors. In conjunction with its existing interfaces and predefined suite of behaviors, future development plans for VIVID will be described. 
These include incorporation of dual user virtual environment enhancements, tool expansion, and additional behaviors.

  11. BACHSCORE. A tool for evaluating efficiently and reliably the quality of large sets of protein structures

    NASA Astrophysics Data System (ADS)

    Sarti, E.; Zamuner, S.; Cossio, P.; Laio, A.; Seno, F.; Trovato, A.

    2013-12-01

    In protein structure prediction it is of crucial importance, especially at the refinement stage, to score efficiently large sets of models by selecting the ones that are closest to the native state. We here present a new computational tool, BACHSCORE, that allows its users to rank different structural models of the same protein according to their quality, evaluated by using the BACH++ (Bayesian Analysis Conformation Hunt) scoring function. The original BACH statistical potential was already shown to discriminate with very good reliability the protein native state in large sets of misfolded models of the same protein. BACH++ features a novel upgrade in the solvation potential of the scoring function, now computed by adapting the LCPO (Linear Combination of Pairwise Overlaps) algorithm. This change further enhances the already good performance of the scoring function. BACHSCORE can be accessed directly through the web server: bachserver.pd.infn.it. Catalogue identifier: AEQD_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEQD_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: GNU General Public License version 3 No. of lines in distributed program, including test data, etc.: 130159 No. of bytes in distributed program, including test data, etc.: 24 687 455 Distribution format: tar.gz Programming language: C++. Computer: Any computer capable of running an executable produced by a g++ compiler (4.6.3 version). Operating system: Linux, Unix OSes. RAM: 1 073 741 824 bytes Classification: 3. Nature of problem: Evaluate the quality of a protein structural model, taking into account the possible “a priori” knowledge of a reference primary sequence that may be different from the amino-acid sequence of the model; the native protein structure should be recognized as the best model.
    Solution method: The contact potential scores the occurrence of any given type of residue pair in 5 possible contact classes (α-helical contact, parallel β-sheet contact, anti-parallel β-sheet contact, side-chain contact, no contact). The solvation potential scores the occurrence of any residue type in 2 possible environments: buried and solvent exposed. Residue environment is assigned by adapting the LCPO algorithm. Residues present in the reference primary sequence but not present in the model structure contribute to the model score as solvent exposed and as non-contacting all other residues. Restrictions: Input file format according to the Protein Data Bank standard. Additional comments: Parameter values used in the scoring function can be found in the file /folder-to-bachscore/BACH/examples/bach_std.par. Running time: Roughly one minute to score one hundred structures on a desktop PC, depending on their size.
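    The pairwise contact-potential idea described above can be sketched in a few lines. In the toy scorer below, the contact cutoff and the pair energies are invented for illustration only (the real BACH++ parameters live in bach_std.par); it simply sums an energy term over every residue pair found in contact:

```python
import math

# Toy knowledge-based potential in the spirit of BACH-style contact scoring.
# All numeric values here are assumptions for illustration.
CUTOFF = 6.5  # angstroms, assumed contact distance cutoff

# Hypothetical pair energies: lower (more negative) = more favourable.
PAIR_ENERGY = {
    frozenset(["LEU", "ILE"]): -0.8,   # hydrophobic packing
    frozenset(["LYS", "GLU"]): -0.5,   # salt bridge
    frozenset(["LYS", "LYS"]): +0.6,   # like-charge repulsion
}
DEFAULT_ENERGY = 0.0

def distance(a, b):
    """Euclidean distance between residues given as (name, x, y, z)."""
    return math.dist(a[1:], b[1:])

def score_model(residues):
    """Sum pair energies over all residue pairs in contact."""
    total = 0.0
    for i in range(len(residues)):
        for j in range(i + 1, len(residues)):
            if distance(residues[i], residues[j]) <= CUTOFF:
                key = frozenset([residues[i][0], residues[j][0]])
                total += PAIR_ENERGY.get(key, DEFAULT_ENERGY)
    return total
```

    A real statistical potential derives these energies from observed frequencies in a structural database; the sketch only shows the scoring loop.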

  12. [A Structural Equation Model on Family Strength of Married Working Women].

    PubMed

    Hong, Yeong Seon; Han, Kuem Sun

    2015-12-01

    The purpose of this study was to identify the effect of predictive factors related to family strength and to develop a structural equation model that explains family strength among married working women. A hypothesized model was developed based on literature reviews and predictors of family strength by Yoo. The constructed model was built of an eight-pathway form. Two exogenous variables included in this model were ego-resilience and family support. Three endogenous variables included in this model were functional couple communication, family stress, and family strength. Data were collected using a self-report questionnaire from 319 married working women who were 30 to 40 years of age and lived in cities of Chungnam province in Korea. Data were analyzed with the PASW/WIN 18.0 and AMOS 18.0 programs. Family support had a positive direct, indirect, and total effect on family strength. Family stress had a negative direct, indirect, and total effect on family strength. Functional couple communication had a positive direct and total effect on family strength. These predictive variables explained 61.8% of the variance in family strength. The results show a structural equation model for the family strength of married working women in which the predicting factors are family support, family stress, and functional couple communication. To improve the family strength of married working women, the results of this study suggest nursing interventions and mediating programs to improve family support and functional couple communication and to reduce family stress.

  13. Cost estimation model for advanced planetary programs, fourth edition

    NASA Technical Reports Server (NTRS)

    Spadoni, D. J.

    1983-01-01

    The development of the planetary program cost model is discussed. The model was updated to incorporate cost data from the most recent US planetary flight projects and extensively revised to more accurately capture the information in the historical cost data base, which comprises the historical cost data for 13 unmanned lunar and planetary flight programs. The revision was made with a twofold objective: to increase the flexibility of the model in dealing with the broad scope of scenarios under consideration for future missions, and to maintain, and possibly improve upon, confidence in the model's capabilities, with an expected accuracy of 20%. The model development included a labor/cost proxy analysis, selection of the functional forms of the estimating relationships, and test statistics. An analysis of the model is discussed and two sample applications of the cost model are presented.

  14. A Model for Microcontroller Functionality Upset Induced by External Pulsed Electromagnetic Irradiation

    DTIC Science & Technology

    2016-11-21

    AFRL-RD-PS-TN-2016-0003. A Model for Microcontroller Functionality Upset Induced by External Pulsed Electromagnetic Irradiation (Contract FA9451-15-C-0004). This technical note develops a model for a microcontroller (µC) subjected to external irradiation by a narrowband electromagnetic (EM) pulse. In this model, the state of a µC is completely specified by

  15. Probability and Statistics in Sensor Performance Modeling

    DTIC Science & Technology

    2010-12-01

    The report covers probability and statistics in sensor performance modeling, including a decision-support software program called Environmental Awareness for Sensor and Emitter Employment (EASEE), important numerical issues in its implementation, and statistical analysis for measuring sensor performance (including cumulative and complementary cumulative distribution functions).

  16. Abdominal surgery process modeling framework for simulation using spreadsheets.

    PubMed

    Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja

    2015-08-01

    We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, so programming experience is not essential for replicating or upgrading the model. Unlike existing methods, the proposed solution employs a modular approach to modeling the activities, which ensures better readability, the possibility of easily upgrading the model with other activities, and easy extension and connectivity with other similar models. We propose a first-in-first-served approach for the simulation of servicing multiple patients. The uncertain time duration of the activities is modeled using the function "rand()". The patients' movement from one activity to the next is tracked with nested "if()" functions, thus allowing easy re-creation of the process without the need for complex programming. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
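    The spreadsheet mechanics described above (rand()-style random durations and first-in-first-served patient flow) can be sketched in Python. This is a toy three-activity cycle with invented duration ranges, not the paper's 28-element model:

```python
import random

# Minimal analogue of the spreadsheet simulation logic: uniform random
# durations play the role of rand(); the FIFO queue plays the role of
# the nested if() patient tracking. Activities and ranges are assumptions.
ACTIVITIES = ["admission", "surgery", "recovery"]
DURATION_RANGE = {"admission": (1, 2), "surgery": (2, 5), "recovery": (3, 6)}

def simulate(n_patients, seed=0):
    """First-in-first-served flow: each activity must be free before the
    next queued patient can enter it. Returns each patient's finish time."""
    rng = random.Random(seed)
    free_at = {a: 0.0 for a in ACTIVITIES}   # when each activity is idle again
    finish = []
    for _ in range(n_patients):
        t = 0.0  # all patients queue from time 0, served FIFO
        for a in ACTIVITIES:
            start = max(t, free_at[a])
            t = start + rng.uniform(*DURATION_RANGE[a])
            free_at[a] = t
        finish.append(t)
    return finish
```

    Because servicing is FIFO and each activity is a single resource, finish times are non-decreasing across patients, which is easy to verify on the output.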

  17. Time-dependent Data System (TDDS); an interactive program to assemble, manage, and appraise input data and numerical output of flow/transport simulation models

    USGS Publications Warehouse

    Regan, R.S.; Schaffranek, R.W.; Baltzer, R.A.

    1996-01-01

    A system of functional utilities and computer routines, collectively identified as the Time-Dependent Data System (TDDS), has been developed and documented by the U.S. Geological Survey. The TDDS is designed for processing time sequences of discrete, fixed-interval, time-varying geophysical data, in particular hydrologic data. Such data include the various dependent variables and related parameters typically needed as input for execution of one-, two-, and three-dimensional hydrodynamic/transport and associated water-quality simulation models; they can also include time sequences of results generated by numerical simulation models. Specifically, TDDS provides the functional capabilities to process, store, retrieve, and compile data in a Time-Dependent Data Base (TDDB) in response to interactive user commands or pre-programmed directives. Thus, the TDDS, in conjunction with a companion TDDB, provides a ready means for processing, preparation, and assembly of time sequences of data for input to models; collection, categorization, and storage of simulation results from models; and intercomparison of field data and simulation results. The TDDS can be used to edit and verify prototype time-dependent data to affirm that selected sequences of data are accurate, contiguous, and appropriate for numerical simulation modeling. It can be used to prepare time-varying data in a variety of formats, such as tabular lists, sequential files, arrays, and graphical displays, as well as line-printer plots of single- or multiparameter data sets. The TDDB is organized and maintained as a direct-access data base by the TDDS, thus providing simple yet efficient data management and access. A single, easily used program interface that provides all access to and from a particular TDDB is available for use directly within models, other user-provided programs, and other data systems.
    This interface, together with each major functional utility of the TDDS, is described and documented in this report.

  18. ScrumPy: metabolic modelling with Python.

    PubMed

    Poolman, M G

    2006-09-01

    ScrumPy is a software package for the definition and analysis of metabolic models. It is written in the Python programming language, which also serves as its user interface. ScrumPy has features for both kinetic and structural modelling, but the emphasis is on structural modelling and on those features of most relevance to the analysis of large (genome-scale) models. The aim here is to describe ScrumPy's functionality for readers with some knowledge of metabolic modelling; implementation, programming, and other computational details are omitted. ScrumPy is released under the GNU Public Licence and is available for download from http://mudshark.brookes.ac.uk/ScrumPy.

  19. Enabling complex queries to drug information sources through functional composition.

    PubMed

    Peters, Lee; Mortensen, Jonathan; Nguyen, Thang; Bodenreider, Olivier

    2013-01-01

    Our objective was to enable an end-user to create complex queries to drug information sources through functional composition, by creating sequences of functions from application program interfaces (APIs) to drug terminologies. The development of a functional composition model seeks to link functions from two distinct APIs. An ontology was developed using Protégé to model the functions of the RxNorm and NDF-RT APIs by describing the semantics of their input and output. A set of rules was developed to define the interoperable conditions for functional composition. The operational definition of interoperability between function pairs is established by executing the rules on the ontology. We illustrate that the functional composition model supports common use cases, including checking interactions for RxNorm drugs and deploying allergy lists defined in reference to drug properties in NDF-RT. This model supports the RxMix application (http://mor.nlm.nih.gov/RxMix/), an application we developed for enabling complex queries to the RxNorm and NDF-RT APIs.
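    The composition rule (two functions are interoperable when one's output type matches the other's input type) can be illustrated with a minimal sketch. The function names and semantic types below are hypothetical stand-ins, not the actual RxNorm or NDF-RT API calls:

```python
from dataclasses import dataclass

# Sketch of functional composition over typed API functions.
# All names/types here are invented for illustration.

@dataclass(frozen=True)
class ApiFunction:
    name: str
    input_type: str
    output_type: str
    fn: callable

def composable(f, g):
    """Interoperability rule: f's output can feed g's input."""
    return f.output_type == g.input_type

def compose(f, g):
    """Chain two API functions into one, if the rule allows it."""
    if not composable(f, g):
        raise TypeError(f"{f.name} -> {g.name}: type mismatch")
    return ApiFunction(f"{f.name}|{g.name}", f.input_type,
                       g.output_type, lambda x: g.fn(f.fn(x)))

# Hypothetical stand-ins for terminology-service calls:
name_to_id = ApiFunction("findDrugByName", "drug_name", "drug_id",
                         lambda name: {"aspirin": "1191"}.get(name))
id_to_class = ApiFunction("getDrugClass", "drug_id", "class_name",
                          lambda i: {"1191": "NSAID"}.get(i))
```

    Composing `name_to_id` with `id_to_class` yields a single name-to-class query, while composing them in the other order is rejected by the rule, mirroring how the ontology-driven rules gate which API sequences RxMix may build.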

  20. U.S. intelligence system: model for corporate chiefs?

    PubMed

    Gilad, B

    1991-01-01

    A fully dedicated intelligence support function for senior management is no longer a luxury but a necessity. Companies can enhance their intelligence capabilities by using the government model as a rough blueprint to structure such a program.

  1. ENHANCING HSPF MODEL CHANNEL HYDRAULIC REPRESENTATION

    EPA Science Inventory

    The Hydrological Simulation Program - FORTRAN (HSPF) is a comprehensive watershed model, which employs depth-area-volume-flow relationships known as the hydraulic function table (FTABLE) to represent stream channel cross-sections and reservoirs. An accurate FTABLE determination for a...
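    The FTABLE concept is simple to illustrate: each row relates a depth to surface area, volume, and outflow, and values at intermediate depths are obtained by interpolating between rows. A minimal sketch with invented values (not from any HSPF dataset):

```python
from bisect import bisect_right

# Toy FTABLE: rows of (depth_ft, surface_area_ac, volume_ac_ft, flow_cfs).
# The numbers are illustrative assumptions only.
FTABLE = [
    (0.0, 0.0, 0.0,   0.0),
    (2.0, 1.0, 1.5,  40.0),
    (4.0, 1.4, 4.0, 160.0),
    (6.0, 1.8, 7.2, 380.0),
]

def ftable_lookup(depth):
    """Linearly interpolate (area, volume, flow) at a given depth
    between FTABLE rows; clamp outside the table's depth range."""
    depths = [row[0] for row in FTABLE]
    if depth <= depths[0]:
        return FTABLE[0][1:]
    if depth >= depths[-1]:
        return FTABLE[-1][1:]
    i = bisect_right(depths, depth) - 1
    d0, d1 = depths[i], depths[i + 1]
    w = (depth - d0) / (d1 - d0)
    return tuple(a + w * (b - a)
                 for a, b in zip(FTABLE[i][1:], FTABLE[i + 1][1:]))
```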

  2. MPS Solidification Model. Volume 2: Operating guide and software documentation for the unsteady model

    NASA Technical Reports Server (NTRS)

    Maples, A. L.

    1981-01-01

    The operation of solidification Model 2 is described and documentation of the software associated with the model is provided. Model 2 calculates the macrosegregation in a rectangular ingot of a binary alloy as a result of unsteady horizontal axisymmetric bidirectional solidification. The solidification program allows interactive modification of calculation parameters as well as selection of graphical and tabular output. In batch mode, parameter values are input in card image form and output consists of printed tables of solidification functions. The operational aspects of Model 2 that differ substantially from Model 1 are described. The global flow diagrams and data structures of Model 2 are included. The primary program documentation is the code itself.

  3. Reduction of shock induced noise in imperfectly expanded supersonic jets using convex optimization

    NASA Astrophysics Data System (ADS)

    Adhikari, Sam

    2007-11-01

    Imperfectly expanded jets generate screech noise. The imbalance between the back pressure and the exit pressure of imperfectly expanded jets produces shock cells and expansion or compression waves from the nozzle, and the instability waves and the shock cells interact to generate the screech sound. The mathematical model consists of cylindrical-coordinate-based full Navier-Stokes equations and large-eddy-simulation turbulence modeling. Analytical and computational analysis of the three-dimensional helical effects provides a model that relates several parameters to the shock cell patterns, screech frequency, and distribution of shock generation locations. Convex optimization techniques minimize the shock cell patterns and the instability waves. The objective functions are (convex) quadratic and the constraint functions are affine. In the quadratic optimization programs, minimization of the quadratic functions over polyhedral sets provides the optimal result. Various industry-standard methods, such as regression analysis, distance between polyhedra, bounding variance, Markowitz optimization, and second-order cone programming, are used for the quadratic optimization.
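    As a minimal illustration of the quadratic-programming step (not the paper's jet-noise formulation), a convex quadratic can be minimized over a box, the simplest polyhedron, by projected gradient descent; the matrix, vector, and bounds below are invented:

```python
import numpy as np

# min 0.5 x^T Q x + c^T x  subject to  lo <= x <= hi,
# solved by gradient steps followed by projection onto the box.
# Q symmetric positive semidefinite makes the problem convex.

def solve_box_qp(Q, c, lo, hi, steps=2000):
    x = np.clip(np.zeros_like(c), lo, hi)
    lr = 1.0 / np.linalg.norm(Q, 2)          # step size <= 1/L for convergence
    for _ in range(steps):
        grad = Q @ x + c                     # gradient of the quadratic
        x = np.clip(x - lr * grad, lo, hi)   # gradient step + projection
    return x

# Illustrative problem: unconstrained optimum [1, 2], box caps x2 at 1.5.
Q = np.array([[2.0, 0.0], [0.0, 4.0]])
c = np.array([-2.0, -8.0])
lo = np.array([0.0, 0.0])
hi = np.array([1.5, 1.5])
x_opt = solve_box_qp(Q, c, lo, hi)
```

    General polyhedral constraints need a more elaborate projection or an interior-point/SOCP solver, but the structure of the subproblem is the same.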

  4. A Structural Equation Model on Korean Adolescents' Excessive Use of Smartphones.

    PubMed

    Lee, Hana; Kim, JooHyun

    2018-03-31

    We develop a unified structural model that defines the relationships between systematic factors causing excessive use of smartphones and the corresponding results. We conducted a survey with adolescents living in Seoul, Pusan, Gangneung, Donghae, and Samcheok from February to March 2016. We used SPSS Ver. 22 and Amos Ver. 22 to analyze the survey results at a 0.05 significance level. To investigate demographic characteristics of the participants and their variations, we employed descriptive analysis. We adopted the maximum likelihood estimation method to verify the fitness of the hypothetical model and the hypotheses therein. We used χ² statistics, GFI, AGFI, CFI, NFI, IFI, RMR, and RMSEA to verify the fitness of our structural model. (1) Our proposed structural model demonstrated a fine fitness level. (2) Our proposed structural model could describe the excessive use of a smartphone with 88.6% accuracy. (3) The absence of family function and of relationships between friends, impulsiveness, and low self-esteem were confirmed as key factors that cause excessive use of smartphones. (4) Further, impulsiveness and low self-esteem are closely related to the absence of family functions and relations between friends, by 68.3% and 54.4%, respectively. We suggest that nursing intervention programs from various angles are required to reduce adolescents' excessive use of smartphones. For example, family communication programs would be helpful for both parents and children, and counseling programs on friendship would also be meaningful. Copyright © 2018. Published by Elsevier B.V.

  5. DockoMatic 2.0: high throughput inverse virtual screening and homology modeling.

    PubMed

    Bullock, Casey; Cornia, Nic; Jacob, Reed; Remm, Andrew; Peavey, Thomas; Weekes, Ken; Mallory, Chris; Oxford, Julia T; McDougal, Owen M; Andersen, Timothy L

    2013-08-26

    DockoMatic is a free and open source application that unifies a suite of software programs within a user-friendly graphical user interface (GUI) to facilitate molecular docking experiments. Here we describe the release of DockoMatic 2.0; significant software advances include the ability to (1) conduct high throughput inverse virtual screening (IVS); (2) construct 3D homology models; and (3) customize the user interface. Users can now efficiently set up, start, and manage IVS experiments through the DockoMatic GUI by specifying receptor(s), ligand(s), grid parameter file(s), and docking engine (either AutoDock or AutoDock Vina). DockoMatic automatically generates the needed experiment input files and output directories and allows the user to manage and monitor job progress. Upon job completion, a summary of results is generated by DockoMatic to facilitate interpretation by the user. DockoMatic functionality has also been expanded to facilitate the construction of 3D protein homology models using the Timely Integrated Modeler (TIM) wizard. The TIM wizard provides an interface that accesses the basic local alignment search tool (BLAST) and MODELER programs and guides the user through the necessary steps to easily and efficiently create 3D homology models for biomacromolecular structures. The DockoMatic GUI can be customized by the user, and the software design makes it relatively easy to integrate additional docking engines, scoring functions, or third-party programs. DockoMatic is a free, comprehensive molecular docking software program for all levels of scientists in both research and education.

  6. A Community Health Worker "logic model": towards a theory of enhanced performance in low- and middle-income countries.

    PubMed

    Naimoli, Joseph F; Frymus, Diana E; Wuliji, Tana; Franco, Lynne M; Newsome, Martha H

    2014-10-02

    There has been a resurgence of interest in national Community Health Worker (CHW) programs in low- and middle-income countries (LMICs). A lack of strong research evidence persists, however, about the most efficient and effective strategies to ensure optimal, sustained performance of CHWs at scale. To facilitate learning and research to address this knowledge gap, the authors developed a generic CHW logic model that proposes a theoretical causal pathway to improved performance. The logic model draws upon available research and expert knowledge on CHWs in LMICs. Construction of the model entailed a multi-stage, inductive, two-year process. It began with the planning and implementation of a structured review of the existing research on community and health system support for enhanced CHW performance. It continued with a facilitated discussion of review findings with experts during a two-day consultation. The process culminated with the authors' review of consultation-generated documentation, additional analysis, and production of multiple iterations of the model. The generic CHW logic model posits that optimal CHW performance is a function of high quality CHW programming, which is reinforced, sustained, and brought to scale by robust, high-performing health and community systems, both of which mobilize inputs and put in place processes needed to fully achieve performance objectives. Multiple contextual factors can influence CHW programming, system functioning, and CHW performance. The model is a novel contribution to current thinking about CHWs. It places CHW performance at the center of the discussion about CHW programming, recognizes the strengths and limitations of discrete, targeted programs, and is comprehensive, reflecting the current state of both scientific and tacit knowledge about support for improving CHW performance. The model is also a practical tool that offers guidance for continuous learning about what works. 
Despite the model's limitations and several challenges in translating the potential for learning into tangible learning, the CHW generic logic model provides a solid basis for exploring and testing a causal pathway to improved performance.

  7. MaMR: High-performance MapReduce programming model for material cloud applications

    NASA Astrophysics Data System (ADS)

    Jing, Weipeng; Tong, Danyu; Wang, Yangang; Wang, Jingyuan; Liu, Yaqiu; Zhao, Peng

    2017-02-01

    With the increasing data size in materials science, existing programming models no longer satisfy the application requirements. MapReduce is a programming model that enables the easy development of scalable parallel applications to process big data on cloud computing systems. However, this model does not directly support the processing of multiple related data, and its processing performance does not reflect the advantages of cloud computing. To enhance the capability of workflow applications in material data processing, we defined a programming model for material cloud applications, called MaMR, that supports multiple different Map and Reduce functions running concurrently on a hybrid shared-memory BSP model. An optimized data-sharing strategy to supply the shared data to the different Map and Reduce stages was also designed. We added a new merge phase to MapReduce that can efficiently merge data from the map and reduce modules. Experiments showed that the model and framework provide effective performance improvements over previous work.
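    The idea of several Map/Reduce jobs over shared data plus a final merge phase can be sketched in a single-process toy (an illustration of the programming model only, not the paper's BSP framework; the material records are invented):

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Classic two-phase skeleton: map each record to (key, value) pairs,
    group by key, then reduce each group."""
    groups = defaultdict(list)
    for rec in records:
        for key, value in mapper(rec):
            groups[key].append(value)
    return {k: reducer(k, vs) for k, vs in groups.items()}

def merge(*results):
    """Merge phase: combine the keyed outputs of several jobs."""
    merged = defaultdict(dict)
    for name, result in results:
        for key, value in result.items():
            merged[key][name] = value
    return dict(merged)

# Two jobs sharing the same records of (element, property, value):
records = [("Fe", "density", 7.9), ("Fe", "density", 7.8),
           ("Al", "density", 2.7), ("Fe", "melting", 1811.0)]

counts = map_reduce(records,
                    lambda r: [((r[0], r[1]), 1)],
                    lambda k, vs: sum(vs))
means = map_reduce(records,
                   lambda r: [((r[0], r[1]), r[2])],
                   lambda k, vs: sum(vs) / len(vs))
combined = merge(("count", counts), ("mean", means))
```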

  8. Fatigue strength reduction model: RANDOM3 and RANDOM4 user manual, appendix 2

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Lovelace, Thomas B.

    1989-01-01

    The FORTRAN programs RANDOM3 and RANDOM4 are documented. They are based on fatigue strength reduction, using a probabilistic constitutive model, and predict the random lifetime of an engine component to reach a given fatigue strength. Included in this user manual are details regarding the theoretical backgrounds of RANDOM3 and RANDOM4. Appendix A gives information on the physical quantities, their symbols, FORTRAN names, and both SI and U.S. Customary units. Appendices B and C include photocopies of the actual computer printout corresponding to the sample problems. Appendices D and E detail the IMSL, Version 10(1), subroutines and functions called by RANDOM3 and RANDOM4 and the SAS/GRAPH(2) programs that can be used to plot both the probability density functions (p.d.f.) and the cumulative distribution functions (c.d.f.).
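    The p.d.f./c.d.f. outputs mentioned above are easy to illustrate with a minimal Monte Carlo sketch; the lognormal lifetime distribution here is purely an assumption for illustration, not the constitutive model used by RANDOM3/RANDOM4:

```python
import math
import random
from bisect import bisect_right

def simulate_lifetimes(n, mu=10.0, sigma=0.4, seed=1):
    """Draw n random component lifetimes (cycles) from an assumed
    lognormal distribution and return them sorted."""
    rng = random.Random(seed)
    return sorted(math.exp(rng.gauss(mu, sigma)) for _ in range(n))

def empirical_cdf(samples):
    """Return F where F(t) is the fraction of lifetimes <= t
    (samples must be sorted)."""
    def F(t):
        return bisect_right(samples, t) / len(samples)
    return F
```

    A histogram of the samples approximates the p.d.f., and `empirical_cdf` gives the corresponding c.d.f., the same pair of curves the SAS/GRAPH programs plot from RANDOM3/RANDOM4 output.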

  9. Propagation of a radio-frequency pulsed signal over the Earth. The JOLLY programs

    NASA Astrophysics Data System (ADS)

    Carroll, D.; Detch, J. L.; Malik, J.

    1983-07-01

    The interpretation of observed radioflash/electromagnetic pulse (EMP) signals from nuclear detonations in terms of theoretical models, or their extrapolation to signals expected at military systems, requires correction for ground-wave propagation effects. For most applications, previously developed programs have been adequate; problems have arisen, however, when these techniques were applied in the near-tangent regime, where considerable concern exists. The difficulty in predicting propagation response functions in the near-tangent regime has been traced to inconsistent derivation of the equations. Resolving this problem led to a program that better predicts ground-wave propagation. The method and the programs are described in detail for propagation over both realistic-earth and sea-water paths. Results can be given as amplitude and phase versus frequency, or as amplitude versus time, the usual Green's or resolution function.

  10. SAS macro programs for geographically weighted generalized linear modeling with spatial point data: applications to health research.

    PubMed

    Chen, Vivian Yi-Ju; Yang, Tse-Chuan

    2012-08-01

    An increasing interest in exploring spatial non-stationarity has generated several specialized analytic software programs; however, few of these programs can be integrated natively into a well-developed statistical environment such as SAS. We not only developed a set of SAS macro programs to fill this gap, but also expanded the geographically weighted generalized linear modeling (GWGLM) by integrating the strengths of SAS into the GWGLM framework. Three features distinguish our work. First, the macro programs of this study provide more kernel weighting functions than the existing programs. Second, with our codes the users are able to better specify the bandwidth selection process compared to the capabilities of existing programs. Third, the development of the macro programs is fully embedded in the SAS environment, providing great potential for future exploration of complicated spatially varying coefficient models in other disciplines. We provided three empirical examples to illustrate the use of the SAS macro programs and demonstrated the advantages explained above. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
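    The geographically weighted idea underlying GWGLM can be sketched outside SAS as well: observations are weighted by distance to a focal location via a kernel, and a local weighted least-squares fit is computed there. The kernels and bandwidth below are illustrative choices, not the authors' macros:

```python
import numpy as np

# Two common kernel weighting functions used in geographically
# weighted regression; bw is the bandwidth.
def gaussian_kernel(d, bw):
    return np.exp(-0.5 * (d / bw) ** 2)

def bisquare_kernel(d, bw):
    w = (1 - (d / bw) ** 2) ** 2
    return np.where(d < bw, w, 0.0)   # compact support: zero beyond bw

def local_wls(X, y, coords, focal, bw, kernel=gaussian_kernel):
    """Local regression coefficients at `focal` by weighted least squares:
    beta = (X^T W X)^{-1} X^T W y with W from the distance kernel."""
    d = np.linalg.norm(coords - focal, axis=1)
    w = kernel(d, bw)
    XtW = X.T * w
    return np.linalg.solve(XtW @ X, XtW @ y)
```

    Repeating `local_wls` at every location yields the spatially varying coefficient surfaces; bandwidth selection (e.g., by cross-validation) is the step the macros let users control.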

  11. TRIGRS - A Fortran Program for Transient Rainfall Infiltration and Grid-Based Regional Slope-Stability Analysis, Version 2.0

    USGS Publications Warehouse

    Baum, Rex L.; Savage, William Z.; Godt, Jonathan W.

    2008-01-01

    The Transient Rainfall Infiltration and Grid-Based Regional Slope-Stability Model (TRIGRS) is a Fortran program designed for modeling the timing and distribution of shallow, rainfall-induced landslides. The program computes transient pore-pressure changes, and attendant changes in the factor of safety, due to rainfall infiltration. The program models rainfall infiltration, resulting from storms that have durations ranging from hours to a few days, using analytical solutions for partial differential equations that represent one-dimensional, vertical flow in isotropic, homogeneous materials for either saturated or unsaturated conditions. Use of step-function series allows the program to represent variable rainfall input, and a simple runoff routing model allows the user to divert excess water from impervious areas onto more permeable downslope areas. The TRIGRS program uses a simple infinite-slope model to compute factor of safety on a cell-by-cell basis. An approximate formula for effective stress in unsaturated materials aids computation of the factor of safety in unsaturated soils. Horizontal heterogeneity is accounted for by allowing material properties, rainfall, and other input values to vary from cell to cell. This command-line program is used in conjunction with geographic information system (GIS) software to prepare input grids and visualize model results.
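    The cell-by-cell computation can be sketched directly. The formula below is a common form of the infinite-slope factor of safety with a pore-pressure-head term, of the kind used in TRIGRS-type analyses; the parameter values are illustrative only:

```python
import math

# Infinite-slope factor of safety with transient pore pressure:
#   FS = tan(phi)/tan(beta) + (c - psi*gw*tan(phi)) / (gs*z*sin(beta)*cos(beta))
# A rising pressure head psi (rainfall infiltration) lowers FS.
GAMMA_W = 9.81  # unit weight of water, kN/m^3

def factor_of_safety(c, phi_deg, beta_deg, gamma_s, z, psi):
    """c: cohesion (kPa); phi: friction angle (deg); beta: slope angle (deg);
    gamma_s: soil unit weight (kN/m^3); z: slip depth (m);
    psi: pressure head (m)."""
    phi = math.radians(phi_deg)
    beta = math.radians(beta_deg)
    frictional = math.tan(phi) / math.tan(beta)
    cohesive = (c - psi * GAMMA_W * math.tan(phi)) / (
        gamma_s * z * math.sin(beta) * math.cos(beta))
    return frictional + cohesive

# Illustrative cell: stable when dry, unstable after infiltration.
dry = factor_of_safety(c=4.0, phi_deg=32.0, beta_deg=35.0,
                       gamma_s=20.0, z=2.0, psi=0.0)
wet = factor_of_safety(c=4.0, phi_deg=32.0, beta_deg=35.0,
                       gamma_s=20.0, z=2.0, psi=1.5)
```

    TRIGRS evaluates psi(z, t) from its infiltration solutions and applies this kind of formula on every grid cell at every output time.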

  12. CIRCUS--A digital computer program for transient analysis of electronic circuits

    NASA Technical Reports Server (NTRS)

    Moore, W. T.; Steinbert, L. L.

    1968-01-01

    Computer program simulates the time domain response of an electronic circuit to an arbitrary forcing function. CIRCUS uses a charge-control parameter model to represent each semiconductor device. Given the primary photocurrent, the transient behavior of a circuit in a radiation environment is determined.
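    The kind of time-domain transient analysis CIRCUS performs can be illustrated on the simplest possible circuit. This explicit-Euler RC step response is a toy sketch of simulating a circuit's response to a forcing function, not the charge-control semiconductor model itself:

```python
# Explicit-Euler integration of an RC low-pass circuit:
#   C dVc/dt = (Vs(t) - Vc) / R
# Component values are illustrative assumptions.

def rc_transient(v_source, R=1e3, C=1e-6, t_end=5e-3, dt=1e-6):
    """Return the capacitor voltage sampled every dt up to t_end,
    for an arbitrary forcing function v_source(t)."""
    n = int(round(t_end / dt))
    v = 0.0
    v_cap = []
    for k in range(n):
        t = k * dt
        v += (v_source(t) - v) / (R * C) * dt  # one Euler step
        v_cap.append(v)
    return v_cap

v_cap = rc_transient(lambda t: 5.0)  # 5 V step forcing function
```

    With R*C = 1 ms, the response climbs to about 63% of the step after one time constant and is essentially settled by 5 ms, matching the analytic exponential.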

  13. Association Between Sleep and Physical Function in Older Veterans in an Adult Day Healthcare Program.

    PubMed

    Song, Yeonsu; Dzierzewski, Joseph M; Fung, Constance H; Rodriguez, Juan C; Jouldjian, Stella; Mitchell, Michael N; Josephson, Karen R; Alessi, Cathy A; Martin, Jennifer L

    2015-08-01

    To examine whether sleep disturbance is associated with poor physical function in older veterans in an adult day healthcare (ADHC) program. Cross-sectional. One ADHC program in a Veterans Affairs Ambulatory Care Center. Older veterans (N = 50) enrolled in a randomized controlled trial of a sleep intervention program who had complete baseline data. Information on participant characteristics (e.g., age, depression, relationship to caregiver, pain, comorbidity) was collected using appropriate questionnaires. Physical function was measured using activity of daily living (ADL) and instrumental ADL (IADL) total scores from the Older Americans Resources and Services Multidimensional Functional Assessment Questionnaire. Sleep was assessed subjectively (Pittsburgh Sleep Quality Index, Insomnia Severity Index) and objectively (wrist actigraphy). Participants required substantial assistance with ADLs and IADLs. A regression model showed that participant characteristics (marital status, use of sleep medication, comorbidity, posttraumatic stress disorder) and living arrangement (living with a spouse or others) were significantly associated with poor physical function. Poorer objective sleep (total sleep time, total number of awakenings, total wake time) was significantly associated with poor physical function, accounting for a significant proportion of the variance beyond that explained by participant characteristics. Objective measures of nighttime sleep disturbance were associated with poor physical function in older veterans in an ADHC program. Further research is needed to determine whether interventions to improve sleep will delay functional decline in this vulnerable population. © 2015, Copyright the Authors Journal compilation © 2015, The American Geriatrics Society.

  14. A facility location model for municipal solid waste management system under uncertain environment.

    PubMed

    Yadav, Vinay; Bhurjee, A K; Karmakar, Subhankar; Dikshit, A K

    2017-12-15

    In a municipal solid waste management system, decision makers must develop an insight into the processes of waste generation, collection, transportation, processing, and disposal. Many parameters in this system (e.g., waste generation rate, functioning costs of facilities, transportation cost, and revenues) are associated with uncertainties. Often these uncertain parameters must be modeled under data scarcity, where only their extreme variations are known, which hampers generating the probability distribution functions needed for stochastic mathematical programming or the membership functions needed for fuzzy mathematical programming. Moreover, if uncertainties are ignored, problems such as insufficient capacities of waste management facilities or improper utilization of available funds may arise. To tackle the uncertainties of these parameters more efficiently, an algorithm based on interval analysis has been developed. This algorithm is applied to find optimal solutions for a facility location model, which is formulated to select the economically best locations of transfer stations in a hypothetical urban center. Transfer stations are an integral part of contemporary municipal solid waste management systems, and economic siting of transfer stations ensures the financial sustainability of the system. The model is written in the mathematical programming language AMPL with KNITRO as a solver. The developed model selects the five economically best locations out of ten potential locations, with an optimum overall cost of approximately [394,836, 757,440] Rs./day ([5906, 11,331] USD/day). Further, the requirement of uncertainty modeling is explained based on the results of a sensitivity analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
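    Interval analysis itself is easy to illustrate: arithmetic on [lo, hi] pairs propagates the known extreme variations through a cost expression, yielding an interval-valued total like the cost range quoted above. The cost structure and numbers below are invented, not the paper's model:

```python
# Minimal interval arithmetic for propagating parameter uncertainty.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = min(lo, hi), max(lo, hi)

    def __add__(self, other):
        # [a,b] + [c,d] = [a+c, b+d]
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # [a,b] * [c,d] spans the min/max of the four corner products
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# Hypothetical daily cost = transport_rate * tonnage + operating cost:
transport_rate = Interval(30.0, 40.0)   # Rs per tonne
tonnage = Interval(90.0, 110.0)         # tonnes per day
operating = Interval(2000.0, 2500.0)    # Rs per day
total = transport_rate * tonnage + operating
```

    An interval-aware optimizer compares such interval totals across candidate locations instead of single point estimates.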

  15. A One-Layer Recurrent Neural Network for Real-Time Portfolio Optimization With Probability Criterion.

    PubMed

    Liu, Qingshan; Dang, Chuangyin; Huang, Tingwen

    2013-02-01

    This paper presents a decision-making model described by a recurrent neural network for dynamic portfolio optimization. The portfolio-optimization problem is first converted into a constrained fractional programming problem. Since the objective function of this problem is not convex, traditional optimization techniques are no longer applicable. Fortunately, the objective function of the fractional program is pseudoconvex on the feasible region, which leads to a one-layer recurrent neural network modeled by means of a discontinuous dynamic system. To ensure optimal solutions for portfolio optimization, the convergence of the proposed neural network is analyzed and proved. In fact, the neural network is guaranteed to obtain optimal solutions for portfolio-investment advice if some mild conditions are satisfied. A numerical example with simulation results substantiates the effectiveness and illustrates the characteristics of the proposed neural network.

  16. ENHANCING HYDROLOGICAL SIMULATION PROGRAM - FORTRAN MODEL CHANNEL HYDRAULIC REPRESENTATION

    EPA Science Inventory

    The Hydrological Simulation Program–FORTRAN (HSPF) is a comprehensive watershed model that employs depth-area-volume-flow relationships, known as the hydraulic function table (FTABLE), to represent the hydraulic characteristics of stream channel cross-sections and reservoirs. ...

  17. The NASTRAN user's manual

    NASA Technical Reports Server (NTRS)

    1983-01-01

    All information directly associated with problem solving using the NASTRAN program is presented. This structural analysis program uses the finite element approach to structural modeling, wherein the distributed physical properties of a structure are represented by a finite number of structural elements which are interconnected at a finite number of grid points, to which loads are applied and for which displacements are calculated. Procedures are described for defining and loading a structural model. Functional references are given for every card used in structural modeling; the NASTRAN data deck and control cards, problem solution sequences (rigid formats), use of the plotting capability, writing of a direct matrix abstraction program, and diagnostic messages are also explained. A dictionary of mnemonics, acronyms, phrases, and other commonly used NASTRAN terms is included.

  18. A universal Model-R Coupler to facilitate the use of R functions for model calibration and analysis

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shuguang; Yan, Wende

    2014-01-01

    Mathematical models are useful in various fields of science and engineering. However, it is a challenge for a model to take advantage of the open and growing set of functions (e.g., for model inversion) on the R platform, because doing so normally requires accessing and revising the model's source code. To overcome this barrier, we developed a universal tool that converts a model developed in any computer language into an R function, using the template-and-instruction concept of the Parameter ESTimation program (PEST) and the operational structure of the R-Soil and Water Assessment Tool (R-SWAT). The developed tool (Model-R Coupler) is promising because users of any model can connect an external algorithm (written in R) with their model to implement various model behavior analyses (e.g., parameter optimization, sensitivity and uncertainty analysis, performance evaluation, and visualization) without accessing or modifying the model's source code.
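
    The template concept that such couplers borrow from PEST can be illustrated with a minimal sketch: markers in a template of the model's input file are replaced by the current parameter values, so an external optimizer can drive any model without touching its source code (PEST-like `@name@` markers; the parameter names here are hypothetical):

```python
# Minimal sketch of template-based parameter substitution (illustrative
# only; real PEST templates have their own delimiter and width rules).

def fill_template(template: str, params: dict) -> str:
    """Replace @name@ markers with formatted parameter values."""
    out = template
    for name, value in params.items():
        out = out.replace(f"@{name}@", f"{value:.4f}")
    return out

template = "infiltration = @k_inf@\nrouting_lag = @lag@\n"
print(fill_template(template, {"k_inf": 0.35, "lag": 2.5}))
# infiltration = 0.3500
# routing_lag = 2.5000
```

    An R-side wrapper would write the filled file, run the model executable, and read the outputs back, making the whole model look like an ordinary function to the calling algorithm.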

  19. Quality assessment of protein model-structures based on structural and functional similarities

    PubMed Central

    2012-01-01

    Background Experimental determination of protein 3D structures is expensive, time consuming and sometimes impossible. The gap between the number of protein structures deposited in the Worldwide Protein Data Bank and the number of sequenced proteins is constantly broadening. Computational modeling is deemed to be one of the ways to deal with the problem. Although protein 3D structure prediction is a difficult task, many tools are available. These tools can build a model from a sequence or from partial structural information, e.g. contact maps. Consequently, biologists have the ability to generate a putative 3D structure model of any protein automatically. However, the main issue becomes evaluation of the model quality, which is one of the most important challenges of structural biology. Results GOBA - Gene Ontology-Based Assessment is a novel Protein Model Quality Assessment Program. It estimates the compatibility between a model-structure and its expected function. GOBA is based on the assumption that a high quality model is expected to be structurally similar to proteins functionally similar to the prediction target. Whereas DALI is used to measure structure similarity, protein functional similarity is quantified using the standardized and hierarchical description of proteins provided by Gene Ontology, combined with Wang's algorithm for calculating semantic similarity. Two approaches are proposed to express the quality of protein model-structures. One is a single model quality assessment method; the other is its modification, which provides a relative measure of model quality. Exhaustive evaluation is performed on data sets of model-structures submitted to the CASP8 and CASP9 contests. Conclusions The validation shows that the method is able to discriminate between good and bad model-structures. The best of the tested GOBA scores achieved mean Pearson correlations of 0.74 and 0.8 with the observed quality of models in our CASP8- and CASP9-based validation sets. 
GOBA also obtained the best result for two targets of CASP8, and one of CASP9, compared to the contest participants. Consequently, GOBA offers a novel single model quality assessment program that addresses the practical needs of biologists. In conjunction with other Model Quality Assessment Programs (MQAPs), it would prove useful for the evaluation of single protein models. PMID:22998498

  20. Application of GA package in functional packaging

    NASA Astrophysics Data System (ADS)

    Belousova, D. A.; Noskova, E. E.; Kapulin, D. V.

    2018-05-01

    An approach is proposed that applies a genetic algorithm to the problem of configuring the elements of a commutation circuit in the design of radio-electronic equipment. The efficiency of the approach is demonstrated for commutation circuits with different characteristics in computer-aided design for radio-electronic manufacturing. A prototype of the computer-aided design subsystem has been programmed on the basis of the GA package for R, with its set of general-purpose functions for optimizing multivariate models.

  1. Exploration Supply Chain Simulation

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The Exploration Supply Chain Simulation project was chartered by the NASA Exploration Systems Mission Directorate to develop a software tool, with proper data, to quantitatively analyze supply chains for future program planning. This tool is a discrete-event simulation that uses the basic supply chain concepts of planning, sourcing, making, delivering, and returning. This supply chain perspective is combined with other discrete or continuous simulation factors. Discrete resource events (such as launch or delivery reviews) are represented as organizational functional units. Continuous resources (such as civil service or contractor program functions) are defined as enabling functional units. Concepts of fixed and variable costs are included in the model to allow the discrete events to interact with cost calculations. The definition file is intrinsic to the model, but a blank start can be initiated at any time. The current definition file describes an Orion Ares I crew launch vehicle. Parameters stretch from Kennedy Space Center across and into other program entities (Michoud Assembly Facility, Alliant Techsystems, Stennis Space Center, Johnson Space Center, etc.), though these will only gain detail as the definition file continues to evolve. Analysis from this tool is expected in 2008. This is the first application of such business-driven modeling to a NASA/government and aerospace-contractor endeavor.
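
    The event-queue core of any such discrete-event simulation can be sketched in a few lines (a generic illustration, not the NASA tool; the event names are hypothetical):

```python
import heapq

# Generic discrete-event skeleton: events are (time, name) pairs kept in a
# priority queue and processed strictly in time order, as in supply chain
# planning/sourcing/making/delivering/returning steps.

def run(events):
    """Process scheduled events in time order; return the visit sequence."""
    heapq.heapify(events)
    log = []
    while events:
        t, name = heapq.heappop(events)
        log.append((t, name))        # a real simulation would also schedule
    return log                       # follow-on events and accrue costs here

schedule = [(30, "delivery review"), (10, "sourcing"), (20, "assembly")]
print(run(schedule))
# [(10, 'sourcing'), (20, 'assembly'), (30, 'delivery review')]
```

    Fixed costs would accrue per simulated time unit and variable costs per processed event, which is how discrete events can interact with cost calculations.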

  2. 77 FR 21565 - Statement of Organization, Functions and Delegations of Authority

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-10

    ... promote early identification of people living with HIV, linking them to care and retaining them in care... Program) including, Planning and Capacity Development programs (Part C), HIV Early Intervention Services... strategies and innovative models for the development and provision of HIV primary care services; (3...

  3. A Model for Integrating Low Vision Services into Educational Programs.

    ERIC Educational Resources Information Center

    Jose, Randall T.; And Others

    1988-01-01

    A project integrating low-vision services into children's educational programs comprised four components: teacher training, functional vision evaluations for each child, a clinical examination by an optometrist, and follow-up visits with the optometrist to evaluate the prescribed low-vision aids. Educational implications of the project and project…

  4. VISUAL PLUMES CONCEPTS TO POTENTIALLY ADAPT OR ADOPT IN MODELING PLATFORMS SUCH AS VISJET

    EPA Science Inventory

    Windows-based programs share many familiar features and components. For example, file dialogue windows are familiar to most Windows-based personal computer users. Such program elements are desirable because the user is already familiar with how they function, obviating the need f...

  5. Academic Planning through Program Review: Can It Work?

    ERIC Educational Resources Information Center

    Fernandez, Thomas V.; Raab, Marjorie K.

    Nassau Community College (NCC) is currently working with a program evaluation model in which faculty from one department serve as peer evaluation consultants to direct the self-evaluations of other departments. The four functional objectives initially motivating the development of NCC's plan directed that: real decisions about academic programs…

  6. The Organization as Client: Broadening the Concept of Employee Assistance Programs.

    ERIC Educational Resources Information Center

    Googins, Bradley; Davidson, Bruce N.

    1993-01-01

    Notes that many employee assistance programs (EAPs) are broadening their function to address rapidly changing human and social issues of environments in which they operate, refocusing practice to include organization as the client. Discusses traditional EAP practice, evolution of EAPs, changes confronting corporations, and alternative model in…

  7. Reasoning about Function Objects

    NASA Astrophysics Data System (ADS)

    Nordio, Martin; Calcagno, Cristiano; Meyer, Bertrand; Müller, Peter; Tschannen, Julian

    Modern object-oriented languages support higher-order implementations through function objects such as delegates in C#, agents in Eiffel, or closures in Scala. Function objects bring a new level of abstraction to the object-oriented programming model, and require a comparable extension to specification and verification techniques. We introduce a verification methodology that extends function objects with auxiliary side-effect free (pure) methods to model logical artifacts: preconditions, postconditions and modifies clauses. These pure methods can be used to specify client code abstractly, that is, independently from specific instantiations of the function objects. To demonstrate the feasibility of our approach, we have implemented an automatic prover, which verifies several non-trivial examples.
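
    The idea of attaching side-effect-free contract methods to a function object can be sketched as follows (an assumed design for illustration only; the paper's methodology targets Eiffel-style agents, C# delegates, and an automatic prover rather than runtime checks):

```python
# Sketch of a function object that carries its own contract as pure
# (side-effect-free) callables, so client code can be specified
# independently of the concrete body.

class Agent:
    def __init__(self, body, pre, post):
        self.body, self.pre, self.post = body, pre, post

    def __call__(self, x):
        assert self.pre(x), "precondition violated"
        result = self.body(x)
        assert self.post(x, result), "postcondition violated"
        return result

# A client relying only on the contract, not on the concrete body:
sqrt_agent = Agent(body=lambda x: x ** 0.5,
                   pre=lambda x: x >= 0,
                   post=lambda x, r: abs(r * r - x) < 1e-9)
print(sqrt_agent(9.0))  # 3.0
```

    A verifier can then reason about any call site using only `pre` and `post`, abstracting over which function object is actually plugged in.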

  8. Generalized three-dimensional experimental lightning code (G3DXL) user's manual

    NASA Technical Reports Server (NTRS)

    Kunz, Karl S.

    1986-01-01

    Information concerning the programming, maintenance and operation of the G3DXL computer program is presented and the theoretical basis for the code is described. The program computes time domain scattering fields and surface currents and charges induced by a driving function on and within a complex scattering object which may be perfectly conducting or a lossy dielectric. This is accomplished by modeling the object with cells within a three-dimensional, rectangular problem space, enforcing the appropriate boundary conditions and differencing Maxwell's equations in time. In the present version of the program, the driving function can be either the field radiated by a lightning strike or a direct lightning strike. The F-106 B aircraft is used as an example scattering object.

  9. An inexact mixed risk-aversion two-stage stochastic programming model for water resources management under uncertainty.

    PubMed

    Li, W; Wang, B; Xie, Y L; Huang, G H; Liu, L

    2015-02-01

    Uncertainties exist in water resources systems, while traditional two-stage stochastic programming is risk-neutral and compares random variables (e.g., total benefit) to identify the best decisions. To deal with risk issues, a risk-aversion inexact two-stage stochastic programming model is developed for water resources management under uncertainty. The model is a hybrid of interval-parameter programming, the conditional value-at-risk measure, and a general two-stage stochastic programming framework. The method extends traditional two-stage stochastic programming by enabling uncertainties presented as probability density functions and discrete intervals to be effectively incorporated within the optimization framework. It can not only provide decision makers with information on the benefits of the allocation plan but also measure the extreme expected loss in the second-stage penalty cost. The developed model was applied to a hypothetical case of water resources management. Results showed that the model could help managers generate feasible and balanced risk-aversion allocation plans and analyze the trade-offs between system stability and economy.
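
    The conditional value-at-risk measure at the heart of the model can be illustrated with its standard sample estimator (a sketch only; the paper's interval-parameter and two-stage coupling is not reproduced, and the penalty values are hypothetical):

```python
# Sample estimator of conditional value-at-risk (CVaR): the mean of the
# worst (1 - alpha) fraction of scenario losses.

def cvar(losses, alpha=0.8):
    """Average the worst (1 - alpha) share of the given losses."""
    n_tail = max(1, int(round(len(losses) * (1 - alpha))))
    worst = sorted(losses, reverse=True)[:n_tail]
    return sum(worst) / n_tail

# Hypothetical second-stage penalty costs across ten equally likely scenarios:
scenario_penalties = [10, 12, 15, 20, 22, 25, 30, 45, 60, 80]
print(cvar(scenario_penalties, alpha=0.8))  # 70.0 (mean of the two worst)
```

    A risk-averse plan penalizes this tail average alongside expected cost, which is what lets managers trade system economy against stability.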

  10. Gaussian-Beam Laser-Resonator Program

    NASA Technical Reports Server (NTRS)

    Cross, Patricia L.; Bair, Clayton H.; Barnes, Norman

    1989-01-01

    Gaussian Beam Laser Resonator Program models laser resonators by use of Gaussian-beam-propagation techniques. Used to determine radii of beams as functions of position in laser resonators. Algorithm used in program has three major components. First, ray-transfer matrix for laser resonator must be calculated. Next, initial parameters of beam calculated. Finally, propagation of beam through optical elements computed. Written in Microsoft FORTRAN (Version 4.01).
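
    The propagation step can be sketched with the standard complex-q (ABCD) relations for Gaussian beams (illustrative only; the wavelength and waist are assumptions, and the original program is FORTRAN rather than Python):

```python
import math

# Standard Gaussian-beam propagation via the complex beam parameter q:
# q transforms through an optical element as q' = (A q + B) / (C q + D).

WAVELENGTH = 1.064e-6  # m (assumed Nd:YAG line)

def propagate(q, abcd):
    """Transform the complex beam parameter q through an ABCD element."""
    (A, B), (C, D) = abcd
    return (A * q + B) / (C * q + D)

def radius(q):
    """Beam radius w from 1/q = 1/R - i*lambda/(pi*w^2)."""
    return math.sqrt(-WAVELENGTH / (math.pi * (1 / q).imag))

w0 = 0.5e-3                                   # waist radius, m
q = complex(0, math.pi * w0**2 / WAVELENGTH)  # q at the waist
q = propagate(q, ((1, 2.0), (0, 1)))          # 2 m of free space
print(round(radius(q) * 1e3, 3))              # beam radius in mm after 2 m
```

    Chaining `propagate` over each element of the resonator gives the beam radius as a function of position, which is what the program reports.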

  11. Empirical Network Model of Human Higher Cognitive Brain Functions

    DTIC Science & Technology

    1990-03-31

    Supported by AFOSR grant F49620-87-0047 (USAF/AFSC, Air Force Office of Scientific Research). Products include the "Workbench", an interactive exploratory data analysis and display program. Other technical developments include methods and programs for [...] feedback. References: Electroencephalogr. clin. Neurophysiol., 74:147-160; Illes, J. (1989) Neurolinguistic features of spontaneous language production.

  12. Requirements Modeling with Agent Programming

    NASA Astrophysics Data System (ADS)

    Dasgupta, Aniruddha; Krishna, Aneesh; Ghose, Aditya K.

    Agent-oriented conceptual modeling notations are highly effective in representing requirements from an intentional stance and answering questions such as what goals exist, how key actors depend on each other, and what alternatives must be considered. In this chapter, we review an approach to executing i* models by translating them into a set of interacting agents implemented in the CASO language, and we suggest how reasoning can be performed with the modeled requirements (both functional and non-functional). We also incorporate deliberation into the agent design, which allows us to benefit from the complementary representational capabilities of the two frameworks.

  13. Application of positive-real functions in hyperstable discrete model-reference adaptive system design.

    NASA Technical Reports Server (NTRS)

    Karmarkar, J. S.

    1972-01-01

    Proposal of an algorithmic procedure, based on mathematical programming methods, to design compensators for hyperstable discrete model-reference adaptive systems (MRAS). The objective of the compensator is to render the MRAS insensitive to initial parameter estimates within a maximized hypercube in the model parameter space.

  14. An Integrated Enrollment Forecast Model. IR Applications, Volume 15, January 18, 2008

    ERIC Educational Resources Information Center

    Chen, Chau-Kuang

    2008-01-01

    Enrollment forecasting is the central component of effective budget and program planning. The integrated enrollment forecast model is developed to achieve a better understanding of the variables affecting student enrollment and, ultimately, to perform accurate forecasts. The transfer function model of the autoregressive integrated moving average…

  15. Nonlinear Growth Models in M"plus" and SAS

    ERIC Educational Resources Information Center

    Grimm, Kevin J.; Ram, Nilam

    2009-01-01

    Nonlinear growth curves or growth curves that follow a specified nonlinear function in time enable researchers to model complex developmental patterns with parameters that are easily interpretable. In this article we describe how a variety of sigmoid curves can be fit using the M"plus" structural modeling program and the nonlinear…

  16. An Early Childhood Movement Laboratory Model: Kindergym

    ERIC Educational Resources Information Center

    Marston, Rip

    2004-01-01

    Early childhood motor activity programs at institutions of higher learning can operate within the tripartite mission of the university while serving a vital function in providing leadership and guidance to educators. This article describes the University of Northern Iowa's Kindergym model. Within this model, curricular areas of games/sports,…

  17. Solid/FEM integration at SNLA

    NASA Technical Reports Server (NTRS)

    Chavez, Patrick F.

    1987-01-01

    The effort at Sandia National Labs. on the methodologies and techniques being used to generate strict hexahedral finite element meshes from a solid model is described. The functionality of the modeler is used to decompose the solid into a set of nonintersecting meshable finite element primitives. The description of the decomposition is exported, via a Boundary Representative format, to the meshing program which uses the information for complete finite element model specification. Particular features of the program are discussed in some detail along with future plans for development which includes automation of the decomposition using artificial intelligence techniques.

  18. TI-59 Programs for Multiple Regression.

    DTIC Science & Technology

    1980-05-01

    The general linear hypothesis model of full rank [Graybill, 1961] can be written as Y = Xβ + ε, ε ~ N(0, σ²I), where Y is the n×1 vector of observations, X is the n×k design matrix, β is the k×1 vector of coefficients, and ε is the n×1 error vector. A "reduced model" solution, and confidence intervals for linear functions of the coefficients, can be obtained using (X'X)⁻¹ and σ̂², based on the t-distribution.

  19. Functional language and data flow architectures

    NASA Technical Reports Server (NTRS)

    Ercegovac, M. D.; Patel, D. R.; Lang, T.

    1983-01-01

    This is a tutorial article about language and architecture approaches for highly concurrent computer systems based on the functional style of programming. The discussion concentrates on the basic aspects of functional languages and on sequencing models, such as data-flow, demand-driven, and reduction, which are essential at the machine-organization level. Several examples of highly concurrent machines are described.
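
    The demand-driven sequencing model mentioned above can be illustrated with a minimal thunk, where a value is computed only when it is first demanded and then cached (a generic sketch, not tied to any machine described in the article):

```python
# Demand-driven (lazy) evaluation sketch: the suspended computation runs
# only when forced, and the result is cached for later demands.

class Thunk:
    def __init__(self, compute):
        self.compute, self.value, self.forced = compute, None, False

    def force(self):
        if not self.forced:
            self.value, self.forced = self.compute(), True
        return self.value

trace = []
t = Thunk(lambda: trace.append("evaluated") or 42)
print(trace)      # [] -- nothing demanded yet
print(t.force())  # 42
print(t.force())  # 42 (cached; the body ran exactly once)
print(trace)      # ['evaluated']
```

    Data-flow sequencing is the dual arrangement: an operation fires as soon as all of its input values have arrived, rather than when its output is demanded.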

  20. 77 FR 47077 - Statement of Organization, Functions, and Delegations of Authority; Office of Planning, Research...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-07

    ...; surveys, research and evaluation methodologies; demonstration testing and model development; synthesis and..., policy and program analysis; surveys, research and evaluation methodologies; demonstration testing and... Organization, Functions, and Delegations of Authority; Office of Planning, Research and Evaluation AGENCY...

  1. GIANT API: an application programming interface for functional genomics

    PubMed Central

    Roberts, Andrew M.; Wong, Aaron K.; Fisk, Ian; Troyanskaya, Olga G.

    2016-01-01

    GIANT API provides biomedical researchers programmatic access to tissue-specific and global networks in humans and model organisms, and associated tools, which includes functional re-prioritization of existing genome-wide association study (GWAS) data. Using tissue-specific interaction networks, researchers are able to predict relationships between genes specific to a tissue or cell lineage, identify the changing roles of genes across tissues and uncover disease-gene associations. Additionally, GIANT API enables computational tools like NetWAS, which leverages tissue-specific networks for re-prioritization of GWAS results. The web services covered by the API include 144 tissue-specific functional gene networks in human, global functional networks for human and six common model organisms and the NetWAS method. GIANT API conforms to the REST architecture, which makes it stateless, cacheable and highly scalable. It can be used by a diverse range of clients including web browsers, command terminals, programming languages and standalone apps for data analysis and visualization. The API is freely available for use at http://giant-api.princeton.edu. PMID:27098035

  2. A distributed model: redefining a robust research subject advocacy program at the Harvard Clinical and Translational Science Center.

    PubMed

    Winkler, Sabune J; Cagliero, Enrico; Witte, Elizabeth; Bierer, Barbara E

    2014-08-01

    The Harvard Clinical and Translational Science Center ("Harvard Catalyst") Research Subject Advocacy (RSA) Program has reengineered subject advocacy, distributing the delivery of advocacy functions through a multi-institutional, central platform rather than vesting these roles and responsibilities in a single individual functioning as a subject advocate. The program is process-oriented and output-driven, drawing on the strengths of participating institutions to engage local stakeholders both in the protection of research subjects and in advocacy for subjects' rights. The program engages stakeholder communities in the collaborative development and distributed delivery of accessible and applicable educational programming and resources. The Harvard Catalyst RSA Program identifies, develops, and supports the sharing and distribution of expertise, education, and resources for the benefit of all institutions, with a particular focus on the frontline: research subjects, researchers, research coordinators, and research nurses. © 2014 Wiley Periodicals, Inc.

  3. A space transportation system operations model

    NASA Technical Reports Server (NTRS)

    Morris, W. Douglas; White, Nancy H.

    1987-01-01

    Presented is a description of a computer program which permits assessment of the operational support requirements of space transportation systems functioning in both a ground- and space-based environment. The scenario depicted provides for the delivery of payloads from Earth to a space station and beyond using upper stages based at the station. Model results are scenario dependent and rely on the input definitions of delivery requirements, task times, and available resources. Output is in terms of flight rate capabilities, resource requirements, and facility utilization. A general program description, program listing, input requirements, and sample output are included.

  4. Modeling the GPR response of leaking, buried pipes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Powers, M.H.; Olhoeft, G.R.

    1996-11-01

    Using a 2.5D, dispersive, full waveform GPR modeling program that generates complete GPR response profiles in minutes on a Pentium PC, the effects of leaking versus non-leaking buried pipes are examined. The program accounts for the dispersive, lossy nature of subsurface materials to GPR wave propagation, and accepts complex functions of dielectric permittivity and magnetic permeability versus frequency through Cole-Cole parameters fit to laboratory data. Steel and plastic pipes containing a DNAPL chlorinated solvent, an LNAPL hydrocarbon, and natural gas are modeled in a surrounding medium of wet, moist, and dry sand. Leaking fluids are found to be more detectable when the sand around the pipes is fully water saturated. The short runtimes of the modeling program and its execution on a PC make it a useful tool for exploring various subsurface models.

  5. Programming of a flexible computer simulation to visualize pharmacokinetic-pharmacodynamic models.

    PubMed

    Lötsch, J; Kobal, G; Geisslinger, G

    2004-01-01

    Teaching pharmacokinetic-pharmacodynamic (PK/PD) models can be made more effective using computer simulations. We propose the programming of educational PK or PK/PD computer simulations as an alternative to the use of pre-built simulation software. This approach has the advantage of adaptability to non-standard or complicated PK or PK/PD models. Simplicity of the programming procedure was achieved by selecting the LabVIEW programming environment. An intuitive user interface to visualize the time courses of drug concentrations or effects can be obtained with pre-built elements. The environment uses a wiring analogy that resembles electrical circuit diagrams rather than abstract programming code. The goal of high interactivity was attained by allowing the program to run in continuously repeating loops, which makes it respond flexibly to user input. The programming is described with the aid of a 2-compartment PK simulation. Examples of more sophisticated simulation programs are also given, in which the PK/PD simulation shows drug input, concentrations in plasma and at the effect site, and the effects themselves as a function of time. A multi-compartmental model of morphine, including metabolite kinetics and effects, is also included. The programs are available for download from the World Wide Web at http://www.klinik.uni-frankfurt.de/zpharm/klin/PKPDsimulation/content.html. For pharmacokineticists who program only occasionally, this offers the possibility of building the computer simulation, together with the flexible interactive simulation algorithm, for clinical pharmacological teaching in the field of PK/PD models.
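
    A bare-bones version of such a 2-compartment simulation loop can be sketched as follows (Euler stepping with assumed rate constants; the article's LabVIEW implementation is not reproduced):

```python
# Euler integration of a 2-compartment PK model after an IV bolus.
# Rate constants (per hour) are illustrative, not from the article.

def simulate(dose=100.0, k10=0.1, k12=0.05, k21=0.03, dt=0.01, t_end=24.0):
    """Return central-compartment amounts sampled each hour after a bolus."""
    a1, a2 = dose, 0.0          # amounts in central / peripheral compartment
    samples = []
    steps_per_hour = int(1 / dt)
    for step in range(int(t_end / dt) + 1):
        if step % steps_per_hour == 0:
            samples.append(a1)
        da1 = (-(k10 + k12) * a1 + k21 * a2) * dt   # elimination + exchange
        da2 = (k12 * a1 - k21 * a2) * dt            # exchange only
        a1, a2 = a1 + da1, a2 + da2
    return samples

curve = simulate()
print(curve[0], curve[-1] < curve[0])  # 100.0 True -- drug is eliminated
```

    An interactive teaching version would recompute this loop continuously as the student drags sliders for dose and rate constants, which is the behavior the repeating-loop design provides.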

  6. Eye growth and myopia development: Unifying theory and Matlab model.

    PubMed

    Hung, George K; Mahadas, Kausalendra; Mohammad, Faisal

    2016-03-01

    The aim of this article is to present an updated unifying theory of the mechanisms underlying eye growth and myopia development. A series of model simulation programs were developed to illustrate the mechanism of eye growth regulation and myopia development. Two fundamental processes are presumed to govern the relationship between physiological optics and eye growth: genetically pre-programmed signaling and blur feedback. Cornea/lens is considered to have only a genetically pre-programmed component, whereas eye growth is considered to have both a genetically pre-programmed and a blur feedback component. Moreover, based on the Incremental Retinal-Defocus Theory (IRDT), the rate of change of blur size provides the direction for blur-driven regulation. The various factors affecting eye growth are shown in 5 simulations: (1 - unregulated eye growth): blur feedback is rendered ineffective, as in the case of form deprivation, so there is only genetically pre-programmed eye growth, generally resulting in myopia; (2 - regulated eye growth): blur feedback regulation demonstrates the emmetropization process, with abnormally excessive or reduced eye growth leading to myopia and hyperopia, respectively; (3 - repeated near-far viewing): simulation of large-to-small change in blur size as seen in the accommodative stimulus/response function, and via IRDT as well as nearwork-induced transient myopia (NITM), leading to the development of myopia; (4 - neurochemical bulk flow and diffusion): release of dopamine from the inner plexiform layer of the retina, and the subsequent diffusion and relay of neurochemical cascade show that a decrease in dopamine results in a reduction of proteoglycan synthesis rate, which leads to myopia; (5 - Simulink model): model of genetically pre-programmed signaling and blur feedback components that allows for different input functions to simulate experimental manipulations that result in hyperopia, emmetropia, and myopia. 
These model simulation programs (available upon request) can provide a useful tutorial for the general scientist and serve as a quantitative tool for researchers in eye growth and myopia. Copyright © 2016 Elsevier Ltd. All rights reserved.
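
    The blur-feedback regulation described in simulation 2 can be caricatured in a few lines (hypothetical parameters and a deliberately simplified growth law; this is not the authors' Simulink model):

```python
# Toy emmetropization loop: axial growth combines a genetically
# pre-programmed term with a blur-feedback term, and growth stops once
# hyperopic defocus is eliminated.

def grow(target=24.0, length=20.0, genetic=0.02, gain=0.005, steps=1000):
    """Iterate axial growth until the eye reaches its focal target."""
    for _ in range(steps):
        defocus = target - length        # >0 means hyperopic blur
        if defocus <= 0:
            break                        # emmetropia reached: growth stops
        length += genetic + gain * defocus
    return length

final = grow()
print(round(final, 2))  # settles close to the 24.0 mm focal target
```

    Disabling the feedback term (as in form deprivation) would leave only the pre-programmed growth, letting the length overshoot the target, which is the myopic outcome of simulation 1.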

  7. The impact of premorbid adjustment, neurocognition, and depression on social and role functioning in patients in an early psychosis treatment program.

    PubMed

    Minor, Kyle S; Friedman-Yakoobian, Michelle; Leung, Y Jude; Meyer, Eric C; Zimmet, Suzanna V; Caplan, Brina; Monteleone, Thomas; Bryant, Caitlin; Guyer, Margaret; Keshavan, Matcheri S; Seidman, Larry J

    2015-05-01

    Functional impairments are debilitating concomitants of psychotic disorders and are present early in the illness course and, commonly, prior to psychosis onset. The factors affecting social and role functioning in early psychosis (EP) following treatment are unclear. We evaluated whether six months of participation in the PREP(R), Boston, EP treatment program, part of a public-academic community mental health center, was related to improvements in social and role functioning and whether premorbid adjustment in adolescence, baseline neurocognition, and depression symptoms predicted functional improvement. The Global Functioning Social and Role scales, MATRICS neurocognitive battery, and Calgary Depression Scale were assessed at baseline and six months during naturalistic treatment, while premorbid adjustment was measured at baseline. All participants were psychotic disorder patients in PREP(R) (n = 46 with social functioning and 47 with role functioning measures at both time points). Large improvements were observed in role functioning (d = 0.84) and medium to large improvements were observed in social functioning (d = 0.70). Models consisting of adolescent premorbid adjustment and change in depression symptoms predicted social and role functioning change, whereas neuropsychological functioning did not. Substantial improvements in social and role functioning were observed among this sample participating in a recovery-based EP program. The impact of clinical factors on social and role functioning was highlighted. Further studies of premorbid adjustment in adolescence and the treatment of depression in EP programs in controlled treatment trials are needed to confirm these findings. © The Royal Australian and New Zealand College of Psychiatrists 2015.

  8. Development of a model of machine hand eye coordination and program specifications for a topological machine vision system

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A unified approach to computer vision and manipulation is developed which is called choreographic vision. In the model, objects to be viewed by a projected robot in the Viking missions to Mars are seen as objects to be manipulated within choreographic contexts controlled by a multimoded remote supervisory control system on Earth. A new theory of context relations is introduced as a basis for choreographic programming languages. A topological vision model is developed for recognizing objects by shape and contour. This model is integrated with a projected vision system consisting of a multiaperture image dissector TV camera and a ranging laser system. System program specifications integrate eye-hand coordination and topological vision functions, and an aerospace multiprocessor implementation is described.

  9. Model-Driven Approach for Body Area Network Application Development.

    PubMed

    Venčkauskas, Algimantas; Štuikys, Vytautas; Jusas, Nerijus; Burbaitė, Renata

    2016-05-12

    This paper introduces the sensor-networked IoT model as a prototype to support the design of Body Area Network (BAN) applications for healthcare. Using the model, we analyze the synergistic effect of the functional requirements (data collection from the human body and transferring it to the top level) and non-functional requirements (trade-offs between energy-security-environmental factors, treated as Quality-of-Service (QoS)). We use feature models to represent the requirements at the earliest stage for the analysis and describe a model-driven methodology to design the possible BAN applications. Firstly, we specify the requirements as the problem domain (PD) variability model for the BAN applications. Next, we introduce the generative technology (meta-programming as the solution domain (SD)) and the mapping procedure to map the PD feature-based variability model onto the SD feature model. Finally, we create an executable meta-specification that represents the BAN functionality to describe the variability of the problem domain through transformations. The meta-specification (along with the meta-language processor) is a software generator for multiple BAN-oriented applications. We validate the methodology with experiments and a case study to generate a family of programs for the BAN sensor controllers. This makes it possible to obtain an adequate measure of QoS efficiently through the interactive adjustment of the meta-parameter values and re-generation process for the concrete BAN application.

  10. Model-Driven Approach for Body Area Network Application Development

    PubMed Central

    Venčkauskas, Algimantas; Štuikys, Vytautas; Jusas, Nerijus; Burbaitė, Renata

    2016-01-01

    This paper introduces the sensor-networked IoT model as a prototype to support the design of Body Area Network (BAN) applications for healthcare. Using the model, we analyze the synergistic effect of the functional requirements (data collection from the human body and transferring it to the top level) and non-functional requirements (trade-offs between energy-security-environmental factors, treated as Quality-of-Service (QoS)). We use feature models to represent the requirements at the earliest stage for the analysis and describe a model-driven methodology to design the possible BAN applications. Firstly, we specify the requirements as the problem domain (PD) variability model for the BAN applications. Next, we introduce the generative technology (meta-programming as the solution domain (SD)) and the mapping procedure to map the PD feature-based variability model onto the SD feature model. Finally, we create an executable meta-specification that represents the BAN functionality to describe the variability of the problem domain through transformations. The meta-specification (along with the meta-language processor) is a software generator for multiple BAN-oriented applications. We validate the methodology with experiments and a case study to generate a family of programs for the BAN sensor controllers. This makes it possible to obtain an adequate measure of QoS efficiently through the interactive adjustment of the meta-parameter values and re-generation process for the concrete BAN application. PMID:27187394
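    The meta-programming step described in the abstract, resolving a problem-domain feature selection into solution-domain meta-parameter values and emitting concrete controller code, can be illustrated with a minimal sketch. The feature names, template, and generated C-like functions below are hypothetical illustrations, not the authors' actual meta-language:

    ```python
    # Hypothetical template for a generated BAN sensor controller; the
    # doubled braces escape literal C braces in str.format().
    TEMPLATE = """\
    // auto-generated BAN sensor controller
    #define SAMPLE_PERIOD_MS {period}
    #define ENCRYPTION {encryption}
    void loop() {{ read_sensor(); {send} }}
    """

    def generate_controller(features):
        """Map a problem-domain feature selection onto solution-domain
        meta-parameters and emit controller source text (a toy generator)."""
        # Energy trade-off: slower sampling saves power, faster sampling
        # improves data quality (illustrative values).
        period = {"low_power": 1000, "high_rate": 100}[features["energy"]]
        # Security trade-off: encryption costs energy but raises QoS.
        encryption = 1 if features["secure"] else 0
        send = "send_encrypted();" if features["secure"] else "send_plain();"
        return TEMPLATE.format(period=period, encryption=encryption, send=send)
    ```

    Adjusting a meta-parameter (say, switching the energy feature) and re-running the generator corresponds to the interactive re-generation process the abstract describes.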

  11. The effects of display and autopilot functions on pilot workload for Single Pilot Instrument Flight Rule (SPIFR) operations

    NASA Technical Reports Server (NTRS)

    Hoh, Roger H.; Smith, James C.; Hinton, David A.

    1987-01-01

    An analytical and experimental research program was conducted to develop criteria for pilot interaction with advanced controls and displays in single pilot instrument flight rules (SPIFR) operations. The analytic phase reviewed fundamental considerations of pilot workload, taking existing data into account and using those data to develop a divided-attention SPIFR pilot workload model. The pilot model was used to interpret the two experimental phases. The first experimental phase was a flight test program that evaluated pilot workload in the presence of current and near-term displays and autopilot functions. The second experiment was conducted on a King Air simulator, investigating the effects of co-pilot functions in the presence of very high SPIFR workload. The results indicate that the simplest displays tested were marginal for SPIFR operations. A moving map display aided mental orientation the most, but had inherent deficiencies as a stand-alone replacement for an HSI. Autopilot functions were highly effective for reducing pilot workload. The simulator tests showed that extremely high workload situations can be handled adequately when co-pilot functions are provided.

  12. A MATLAB®-based program for 3D visualization of stratigraphic setting and subsidence evolution of sedimentary basins: example application to the Vienna Basin

    NASA Astrophysics Data System (ADS)

    Lee, Eun Young; Novotny, Johannes; Wagreich, Michael

    2015-04-01

    In recent years, 3D visualization of sedimentary basins has become increasingly popular. Stratigraphic and structural mapping is highly important for understanding the internal setting of sedimentary basins, and subsequent subsidence analysis provides significant insights into basin evolution. This study focused on developing a simple, user-friendly program that allows geologists to analyze and model sedimentary basin data. The program is aimed at stratigraphic and subsidence modelling of sedimentary basins from well or stratigraphic profile data. It is based on two numerical methods: surface interpolation and subsidence analysis. For surface visualization, four interpolation techniques (Linear, Natural, Cubic Spline, and Thin-Plate Spline) are provided. The subsidence analysis consists of decompaction and backstripping techniques. The numerical methods are computed in MATLAB®, a multi-paradigm numerical computing environment used extensively in academic, research, and industrial fields. The program consists of five main processing steps: 1) setup (study area and stratigraphic units), 2) loading of well data, 3) stratigraphic modelling (depth distribution and isopach plots), 4) subsidence parameter input, and 5) subsidence modelling (subsided depth and subsidence rate plots). The graphical user interface intuitively guides users through all process stages and provides tools to analyse and export the results. Interpolation and subsidence results are cached to minimize redundant computations and improve the interactivity of the program. All 2D and 3D visualizations are created using MATLAB plotting functions, which enables users to fine-tune the results with the full range of available plot options. All functions of the program are illustrated with a case study of Miocene sediments in the Vienna Basin. The basin is an ideal place to test the program because sufficient data are available to analyse and model its stratigraphic setting and subsidence evolution. The study area covers approximately 1200 km², including 110 data points in the central part of the Vienna Basin.
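    The decompaction step underlying such subsidence analysis can be illustrated with a short sketch. Assuming an exponential porosity-depth law φ(z) = φ₀·exp(−cz) (the surface porosity φ₀ and compaction coefficient c below are generic illustrative values, not parameters taken from the program), the solid grain thickness of a layer is conserved, and the decompacted thickness follows from a fixed-point iteration on the layer's new base depth:

    ```python
    import math

    def decompact(z1, z2, z1_new, phi0=0.63, c=0.51):
        """Thickness (km) of a layer originally between depths z1 and z2
        after its top is restored to depth z1_new, assuming porosity
        phi(z) = phi0 * exp(-c*z). phi0 and c are illustrative values."""
        # Solid (grain) thickness is conserved during decompaction.
        solid = (z2 - z1) - (phi0 / c) * (math.exp(-c * z1) - math.exp(-c * z2))
        # Fixed-point iteration for the new base depth z2_new; the map is a
        # contraction because phi0 * exp(-c*z) < 1 for z > 0.
        z2_new = z1_new + (z2 - z1)  # initial guess: compacted thickness
        for _ in range(50):
            z2_new = z1_new + solid + (phi0 / c) * (
                math.exp(-c * z1_new) - math.exp(-c * z2_new))
        return z2_new - z1_new
    ```

    Restoring a deeply buried layer toward the surface re-inflates its pore space, so the decompacted thickness exceeds the present-day (compacted) thickness; backstripping then applies an isostatic correction to the decompacted column.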

  13. Designing post-graduate Master's degree programs: the advanced training program in Dental Functional Analysis and Therapy as one example.

    PubMed

    Ratzmann, Anja; Ruge, Sebastian; Ostendorf, Kristin; Kordass, Bernd

    2014-01-01

    The decision to consolidate European higher education was reached at the Bologna Conference. Based on the Anglo-American system, a two-cycle degree program (Bachelor and Master) has been introduced. Subjects culminating in a state examination, such as Medicine and Dentistry, were excluded from this reform. Since the state examination is already comparable in caliber to a Master's degree in Medicine or Dentistry, only advanced Master's degree programs with post-graduate specializations come into consideration for these subjects. In the field of dentistry, numerous post-graduate study programs are coming into existence, and many different models and approaches are being pursued. Since the 2004-2005 winter semester, the University of Greifswald has offered a Master's degree program in Dental Functional Analysis and Therapy. Two and a half years in duration, this program is structured to allow participation while working and targets licensed dentists who wish to attain certified skills in state-of-the-art functional analysis and therapy. The design of this post-graduate program and the initial results of the evaluation by alumni are presented here. Our experience shows that the conceptual idea of an advanced Master's program has proved successful. The program covers a specialty that leads to increased confidence in handling challenging patient cases. The sharing of experiences among colleagues was evaluated as especially important.

  14. A computer program to calculate zeroes, extrema, and interval integrals for the associated Legendre functions. [for estimation of bounds of truncation error in spherical harmonic expansion of geopotential

    NASA Technical Reports Server (NTRS)

    Payne, M. H.

    1973-01-01

    A computer program is described for the calculation of the zeroes of the associated Legendre functions, Pnm, and their derivatives, for the calculation of the extrema of Pnm and also the integral between pairs of successive zeroes. The program has been run for all n,m from (0,0) to (20,20) and selected cases beyond that for n up to 40. Up to (20,20), the program (written in double precision) retains nearly full accuracy, and indications are that up to (40,40) there is still sufficient precision (4-5 decimal digits for a 54-bit mantissa) for estimation of various bounds and errors involved in geopotential modelling, the purpose for which the program was written.
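    The core task the program performs, locating zeros of Legendre functions by scanning for sign changes and refining each bracket, can be sketched in a few lines. This toy version handles only the ordinary Legendre polynomials (m = 0) via the standard three-term recurrence; the original FORTRAN program covers the full associated functions Pnm, their derivatives, extrema, and interval integrals:

    ```python
    def legendre(n, x):
        """Legendre polynomial P_n(x) via the three-term recurrence
        (k+1) P_{k+1} = (2k+1) x P_k - k P_{k-1}."""
        p0, p1 = 1.0, x
        if n == 0:
            return p0
        for k in range(1, n):
            p0, p1 = p1, ((2 * k + 1) * x * p1 - k * p0) / (k + 1)
        return p1

    def zeros_by_bisection(f, a=-1.0, b=1.0, samples=1999, tol=1e-12):
        """Scan (a, b) for sign changes of f and refine each bracket by
        bisection. The odd sample count avoids landing exactly on x = 0."""
        xs = [a + (b - a) * i / samples for i in range(samples + 1)]
        roots = []
        for lo, hi in zip(xs, xs[1:]):
            if f(lo) * f(hi) < 0:
                while hi - lo > tol:
                    mid = 0.5 * (lo + hi)
                    if f(lo) * f(mid) <= 0:
                        hi = mid
                    else:
                        lo = mid
                roots.append(0.5 * (lo + hi))
        return roots
    ```

    For P₃(x) = (5x³ − 3x)/2 this recovers the three zeros 0 and ±√(3/5); extrema could be bracketed the same way by applying the scan to a finite-difference derivative.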

  15. Fatigue crack growth model RANDOM2 user manual, appendix 1

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Lovelace, Thomas B.

    1989-01-01

    The FORTRAN program RANDOM2 is documented. RANDOM2 is based on fracture mechanics and uses a probabilistic fatigue crack growth model. It predicts the random lifetime of an engine component to reach a given crack size. This user manual includes details regarding the theoretical background of RANDOM2, input data, instructions, and a sample problem illustrating the use of RANDOM2. Appendix A gives information on the physical quantities, their symbols, FORTRAN names, and both SI and U.S. Customary units. Appendix B includes photocopies of the actual computer printout corresponding to the sample problem. Appendices C and D detail the IMSL (Ver. 10) subroutines and functions called by RANDOM2 and a SAS/GRAPH program that can be used to plot both the probability density function (p.d.f.) and the cumulative distribution function (c.d.f.).
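    The probabilistic crack growth idea can be illustrated with a toy Monte Carlo sketch. RANDOM2's actual growth law and random-variable models are not reproduced here; this sketch assumes a simple Paris law da/dN = C(Δσ√(πa))^m with lognormal scatter on the coefficient C, so that the sorted sampled lifetimes form an empirical c.d.f.:

    ```python
    import math
    import random

    def paris_life(a0, af, dsigma, C, m):
        """Cycles to grow a crack from a0 to af under the Paris law
        da/dN = C * (dsigma * sqrt(pi*a))**m, integrated in closed form
        (valid for m != 2)."""
        e = 1.0 - m / 2.0
        return (af**e - a0**e) / (C * dsigma**m * math.pi**(m / 2.0) * e)

    def random_lifetimes(n=1000, seed=1):
        """Monte Carlo lifetimes with lognormal scatter on the Paris
        coefficient (all numerical values are illustrative only)."""
        rng = random.Random(seed)
        lives = []
        for _ in range(n):
            C = 1e-11 * math.exp(rng.gauss(0.0, 0.3))  # scattered coefficient
            lives.append(paris_life(1e-3, 1e-2, 100.0, C, 3.0))
        return sorted(lives)  # sorted samples give the empirical c.d.f.
    ```

    The fraction of sorted lifetimes below a given cycle count estimates the probability that the crack reaches the critical size by then, which is the quantity RANDOM2's c.d.f. output reports.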

  16. A Guide to Computer Simulations of Three Adaptive Instructional Models for the Advanced Instructional System Phases II and III. Final Report.

    ERIC Educational Resources Information Center

    Hansen, Duncan N.; And Others

    Computer simulations of three individualized adaptive instructional models (AIM) were undertaken to determine if these models function as prescribed in Air Force technical training programs. In addition, the project sought to develop a user's guide for effective understanding of adaptive models during field implementation. Successful simulations…

  17. A Cognitive Processing Account of Individual Differences in Novice Logo Programmers' Conceptualisation and Use of Recursion.

    ERIC Educational Resources Information Center

    Gibbons, Pamela

    1995-01-01

    Describes a study that investigated individual differences in the construction of mental models of recursion in LOGO programming. The learning process was investigated from the perspective of Norman's mental models theory and employed diSessa's ontology regarding distributed, functional, and surrogate mental models, and the Luria model of brain…

  18. Blueprint XAS: a Matlab-based toolbox for the fitting and analysis of XAS spectra.

    PubMed

    Delgado-Jaime, Mario Ulises; Mewis, Craig Philip; Kennepohl, Pierre

    2010-01-01

    Blueprint XAS is a new Matlab-based program developed to fit and analyze X-ray absorption spectroscopy (XAS) data, most specifically in the near-edge region of the spectrum. The program is based on a methodology that introduces a novel background model into the complete fit model and that is capable of generating any number of independent fits with minimal introduction of user bias [Delgado-Jaime & Kennepohl (2010), J. Synchrotron Rad. 17, 119-128]. The functions and settings on the five panels of its graphical user interface are designed to suit the needs of those analyzing near-edge XAS data. A batch function allows multiple jobs to be set up and run with Matlab in the background. A unique statistics panel allows the user to analyze a family of independent fits, to evaluate fit models, and to draw statistically supported conclusions. The version introduced here (v0.2) is currently a toolbox for Matlab. Future stand-alone versions of the program will incorporate several other new features to create a full package of tools for XAS data processing.

  19. Distributed Function Mining for Gene Expression Programming Based on Fast Reduction.

    PubMed

    Deng, Song; Yue, Dong; Yang, Le-chan; Fu, Xiong; Feng, Ya-zhou

    2016-01-01

    For high-dimensional and massive data sets, traditional centralized gene expression programming (GEP) and its improved variants suffer from increased run time and decreased prediction accuracy. To solve this problem, this paper proposes a new improved algorithm called distributed function mining for gene expression programming based on fast reduction (DFMGEP-FR). In DFMGEP-FR, fast attribution reduction in binary search algorithms (FAR-BSA) is proposed to quickly find the optimal attribution set, and a function consistency replacement algorithm is given to solve integration of the local function models. Thorough comparative experiments for DFMGEP-FR, centralized GEP, and the parallel gene expression programming algorithm based on simulated annealing (parallel GEPSA) are included in this paper. For the waveform, mushroom, connect-4, and musk datasets, the comparative results show that the average time consumption of DFMGEP-FR drops by 89.09%, 88.85%, 85.79%, and 93.06%, respectively, in contrast to centralized GEP, and by 12.5%, 8.42%, 9.62%, and 13.75%, respectively, compared with parallel GEPSA. Six well-studied UCI test data sets demonstrate the efficiency and capability of the proposed DFMGEP-FR algorithm for distributed function mining.

  20. DockoMatic 2.0: High Throughput Inverse Virtual Screening and Homology Modeling

    PubMed Central

    Bullock, Casey; Cornia, Nic; Jacob, Reed; Remm, Andrew; Peavey, Thomas; Weekes, Ken; Mallory, Chris; Oxford, Julia T.; McDougal, Owen M.; Andersen, Timothy L.

    2013-01-01

    DockoMatic is a free and open source application that unifies a suite of software programs within a user-friendly Graphical User Interface (GUI) to facilitate molecular docking experiments. Here we describe the release of DockoMatic 2.0; significant software advances include the ability to: (1) conduct high throughput Inverse Virtual Screening (IVS); (2) construct 3D homology models; and (3) customize the user interface. Users can now efficiently set up, start, and manage IVS experiments through the DockoMatic GUI by specifying a receptor(s), ligand(s), grid parameter file(s), and docking engine (either AutoDock or AutoDock Vina). DockoMatic automatically generates the needed experiment input files and output directories, and allows the user to manage and monitor job progress. Upon job completion, a summary of results is generated by DockoMatic to facilitate interpretation by the user. DockoMatic functionality has also been expanded to facilitate the construction of 3D protein homology models using the Timely Integrated Modeler (TIM) wizard. The TIM wizard provides an interface that accesses the basic local alignment search tool (BLAST) and MODELLER programs, and guides the user through the necessary steps to easily and efficiently create 3D homology models for biomacromolecular structures. The DockoMatic GUI can be customized by the user, and the software design makes it relatively easy to integrate additional docking engines, scoring functions, or third party programs. DockoMatic is a free comprehensive molecular docking software program for all levels of scientists in both research and education. PMID:23808933

  1. Heliocentric interplanetary low thrust trajectory optimization program, supplement 1, part 2

    NASA Technical Reports Server (NTRS)

    Mann, F. I.; Horsewood, J. L.

    1978-01-01

    The improvements made to the HILTOP electric propulsion trajectory computer program are described. A more realistic propulsion system model was implemented in which the various thrust subsystem efficiencies and the specific impulse are modeled as functions of the power available to the propulsion system. The number of operating thrusters is staged, and the beam voltage is selected from a set of five (or fewer) constant voltages, based upon the application of variational calculus. The constant beam voltages may be optimized individually or collectively. The propulsion system logic is activated by a single program input key in such a manner as to preserve the existing HILTOP logic. An analysis describing these features, a complete description of program input quantities, and sample cases of computer output illustrating the program's capabilities are presented.

  2. Construction of SO(5)⊃SO(3) spherical harmonics and Clebsch-Gordan coefficients

    NASA Astrophysics Data System (ADS)

    Caprio, M. A.; Rowe, D. J.; Welsh, T. A.

    2009-07-01

    The SO(5)⊃SO(3) spherical harmonics form a natural basis for expansion of nuclear collective model angular wave functions. They underlie the recently proposed algebraic method for diagonalization of the nuclear collective model Hamiltonian in an SU(1,1)×SO(5) basis. We present a computer code for explicit construction of the SO(5)⊃SO(3) spherical harmonics and use them to compute the Clebsch-Gordan coefficients needed for collective model calculations in an SO(3)-coupled basis. With these Clebsch-Gordan coefficients it becomes possible to compute the matrix elements of collective model observables by purely algebraic methods. Program summary Program title: GammaHarmonic Catalogue identifier: AECY_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AECY_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 346 421 No. of bytes in distributed program, including test data, etc.: 16 037 234 Distribution format: tar.gz Programming language: Mathematica 6 Computer: Any which supports Mathematica Operating system: Any which supports Mathematica; tested under Microsoft Windows XP and Linux Classification: 4.2 Nature of problem: Explicit construction of SO(5) ⊃ SO(3) spherical harmonics on the four-sphere S⁴. Evaluation of SO(3)-reduced matrix elements and SO(5) ⊃ SO(3) Clebsch-Gordan coefficients (isoscalar factors). Solution method: Construction of SO(5) ⊃ SO(3) spherical harmonics by orthonormalization, obtained from a generating set of functions, according to the method of Rowe, Turner, and Repka [1]. Matrix elements and Clebsch-Gordan coefficients follow by construction and integration of SO(3) scalar products. Running time: Depends strongly on the maximum SO(5) and SO(3) representation labels involved; a few minutes for the calculation in the Mathematica notebook. References: [1] D.J. Rowe, P.S. Turner, J. Repka, J. Math. Phys. 45 (2004) 2761.

  3. Noise produced by turbulent flow into a rotor: Users manual for noise calculation

    NASA Technical Reports Server (NTRS)

    Amiet, R. K.; Egolf, C. G.; Simonich, J. C.

    1989-01-01

    A user's manual for a computer program that calculates the noise produced by turbulent flow into a helicopter rotor is presented. Inputs to the program are obtained from the atmospheric turbulence model and mean-flow distortion calculation described in another volume of this set of reports. Descriptions of the various program modules and subroutines, their functions, programming structure, and the required input and output variables are included. The routine is incorporated as one module of NASA's ROTONET helicopter noise prediction program.

  4. User's Manual for FSLIP-3, FLEXSTAB Loads Integration Program

    NASA Technical Reports Server (NTRS)

    Sims, R. L.

    1981-01-01

    The FSLIP program documentation and user's manual is presented. As a follow on program to the FLEXSTAB computer analysis system, the primary function of this FORTRAN IV program is to integrate panel pressure coefficients computed by FLEXSTAB to obtain total shear, bending, and torque airloads on various surfaces, summed relative to user specified axes. The program essentially replaces the ALOADS module in FLEXSTAB with expanded capabilities and flexibility. As such, FSLIP is generalized to work on any FLEXSTAB model or other pressure data if in a compatible format.
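    The program's core operation, integrating panel pressure coefficients into total shear, bending, and torque about user-specified axes, can be sketched as follows. The panel representation here (pressure coefficient, area, centroid coordinates) is a simplification chosen for illustration and does not reflect the actual FLEXSTAB data format:

    ```python
    def integrate_loads(panels, q, x_ref=0.0, y_ref=0.0):
        """Sum panel loads into total shear, bending, and torque.

        panels : iterable of (cp, area, x, y) tuples, where x and y are the
                 chordwise and spanwise centroid locations of each panel
        q      : dynamic pressure
        x_ref, y_ref : user-specified reference axes for the moments
        """
        shear = bending = torque = 0.0
        for cp, area, x, y in panels:
            n = q * cp * area           # normal force on the panel
            shear += n
            bending += n * (y - y_ref)  # moment about the chordwise axis
            torque += n * (x - x_ref)   # moment about the spanwise axis
        return shear, bending, torque
    ```

    Summing relative to user-specified axes rather than a fixed root station is what lets a program of this kind report loads at arbitrary cuts along a surface.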

  5. Resource Manual on the Use of Computers in Schooling.

    ERIC Educational Resources Information Center

    New York State Education Dept., Albany. Bureau of Technology Applications.

    This resource manual is designed to provide educators with timely information on the use of computers and related technology in schools. Section one includes a review of the new Bureau of Technology Applications' goal, functions, and major programs and activities; a description of the Model Schools Program, which has been conceptually derived from…

  6. Critical Elements of Student Assistance Programs: A Qualitative Study

    ERIC Educational Resources Information Center

    Torres-Rodriguez, Leslie; Beyard, Karen; Goldstein, Marc B.

    2010-01-01

    Student assistance programs (SAPs) are one approach for using teams to respond to student needs, but there is little research on SAP implementation and whether SAPs function as intended. The authors present findings from a study of two SAPs that use a model developed by Connecticut's Governor's Prevention Partnership. The study focused on…

  7. Strategic Leadership: A Model for Promoting, Sustaining, and Advancing Institutional Significance

    ERIC Educational Resources Information Center

    Scott, Kenneth E.; Johnson, Mimi

    2011-01-01

    This article presents the methods, materials, and manpower required to create a strategic leadership program for promoting, sustaining, and advancing institutional significance. The functionality of the program is based on the Original Case Study Design (OCSD) methodology, in which participants are given actual college issues to investigate from a…

  8. JobTIPS: A Transition to Employment Program for Individuals with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Strickland, Dorothy C.; Coles, Claire D.; Southern, Louise B.

    2013-01-01

    This study evaluated the effectiveness of an internet accessed training program that included Theory of Mind-based guidance, video models, visual supports, and virtual reality practice sessions in teaching appropriate job interview skills to individuals with high functioning Autism Spectrum Disorders. In a randomized study, twenty-two youth, ages…

  9. The Contribution of the Insula to Motor Aspects of Speech Production: A Review and a Hypothesis

    ERIC Educational Resources Information Center

    Ackermann, Hermann; Riecker, Axel

    2004-01-01

    Based on clinical and functional imaging data, the left anterior insula has been assumed to support prearticulatory functions of speech motor control such as the ''programming'' of vocal tract gestures. In order to further elucidate this model, a recent functional magnetic resonance imaging (fMRI) study of our group (Riecker, Ackermann,…

  10. Clipping in neurocontrol by adaptive dynamic programming.

    PubMed

    Fairbank, Michael; Prokhorov, Danil; Alonso, Eduardo

    2014-10-01

    In adaptive dynamic programming, neurocontrol, and reinforcement learning, the objective is for an agent to learn to choose actions so as to minimize a total cost function. In this paper, we show that when discretized time is used to model the motion of the agent, it can be very important to do clipping on the motion of the agent in the final time step of the trajectory. By clipping, we mean that the final time step of the trajectory is to be truncated such that the agent stops exactly at the first terminal state reached, and no distance further. We demonstrate that when clipping is omitted, learning performance can fail to reach the optimum, and when clipping is done properly, learning performance can improve significantly. The clipping problem we describe affects algorithms that use explicit derivatives of the model functions of the environment to calculate a learning gradient. These include backpropagation through time for control and methods based on dual heuristic programming. However, the clipping problem does not significantly affect methods based on heuristic dynamic programming, temporal differences learning, or policy-gradient learning algorithms.
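    The clipping idea can be made concrete with a one-dimensional toy rollout: without clipping, the discretized agent overshoots the terminal state and accrues the full cost of its last step; with clipping, the final step is truncated at the boundary and its cost is scaled by the fraction of the step actually taken. The dynamics and cost below are illustrative and not taken from the paper:

    ```python
    def rollout(x0, v, dt, goal=1.0, step_cost=1.0, clip=True):
        """Integrate x' = v with time step dt until x reaches goal.

        With clip=True the final time step is truncated so the agent stops
        exactly at the first terminal state reached, and the cost of that
        step is scaled by the fraction of the step actually taken."""
        x, total = x0, 0.0
        while x < goal:
            nxt = x + v * dt
            if clip and nxt > goal:
                frac = (goal - x) / (v * dt)  # fraction of the step used
                total += step_cost * frac
                x = goal
            else:
                total += step_cost
                x = nxt
        return x, total
    ```

    For v = 0.3 and dt = 1 the unclipped rollout takes four full-cost steps and stops beyond the boundary, while the clipped rollout stops exactly at it with cost 3⅓; gradients computed through the clipped trajectory reflect the true cost-to-go at the terminal state.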

  11. GDF v2.0, an enhanced version of GDF

    NASA Astrophysics Data System (ADS)

    Tsoulos, Ioannis G.; Gavrilis, Dimitris; Dermatas, Evangelos

    2007-12-01

    An improved version of the function estimation program GDF is presented. The main enhancements of the new version include multi-output function estimation, the capability of defining custom functions in the grammar, and selection of the error function. The new version has been evaluated on a series of classification and regression datasets that are widely used for the evaluation of such methods. It is compared to two known neural networks and outperforms them on 5 (out of 10) datasets. Program summary Title of program: GDF v2.0 Catalogue identifier: ADXC_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXC_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 98 147 No. of bytes in distributed program, including test data, etc.: 2 040 684 Distribution format: tar.gz Programming language: GNU C++ Computer: The program is designed to be portable to all systems running the GNU C++ compiler Operating system: Linux, Solaris, FreeBSD RAM: 200000 bytes Classification: 4.9 Does the new version supersede the previous version?: Yes Nature of problem: The technique of function estimation tries to discover, from a series of input data, a functional form that best describes them. This can be performed with the use of parametric models whose parameters can adapt according to the input data. Solution method: Functional forms are created by genetic programming as approximations for the symbolic regression problem. Reasons for new version: The GDF package was extended in order to be more flexible and user-customizable than the old package. The user can extend the package by defining his own error functions, and he can extend the grammar of the package by adding new functions to the function repertoire. Also, the new version can perform function estimation of multi-output functions and can be used for classification problems. Summary of revisions: The following features have been added to the package GDF. Multi-output function approximation: the package can now approximate any function f: Rⁿ → Rᵐ, which also gives it the capability of performing classification and not only regression. User-defined functions can be added to the repertoire of the grammar, extending the regression capabilities of the package; this feature is limited to 3 functions, but this number can easily be increased. Capability of selecting the error function: apart from the mean square error, the package now offers other error functions, such as the mean absolute square error and the maximum square error; user-defined error functions can also be added to the set of error functions. More verbose output: the main program displays more information to the user, as well as the default values of the parameters, and the package lets the user define an output file where the output of the gdf program for the testing set will be stored after the termination of the process. Additional comments: A technical report describing the revisions, experiments and test runs is packaged with the source code. Running time: Depends on the training data.

  12. Eliciting Responsivity: Exploring Programming Interests of Federal Inmates as a Function of Security Classification.

    PubMed

    Neller, Daniel J; Vitacco, Michael J; Magaletta, Philip R; Phillips-Boyles, A Brooke

    2016-03-01

    Research supports the effectiveness of the Risk-Needs-Responsivity model for reducing criminal recidivism. Yet programming interests of inmates--one facet of responsivity--remain an understudied phenomenon. In the present study, we explored the programming interests of 753 federal inmates housed across three levels of security. Results suggest that inmates, as a group, prefer specific programs over others, and that some of their interests may differ by security level. We discuss possible implications of these findings. © The Author(s) 2014.

  13. Research-Based Model for Adult Consumer-Homemaking Education.

    ERIC Educational Resources Information Center

    Ball State Univ., Muncie, IN.

    This model is designed to be used as a guide by all teachers and designers of adult vocational consumer and homemaking courses who usually function as program planners. Chapter 1 contains an operational definition, the rationale, and description of intended users. Chapter 2 presents the model description with an overview and discussion of the…

  14. A cardiovascular system model for lower-body negative pressure response

    NASA Technical Reports Server (NTRS)

    Mitchell, B. A., Jr.; Giese, R. P.

    1971-01-01

    Mathematical models used to study complex physiological control systems are discussed. Efforts were made to modify a model of the cardiovascular system for use in studying lower body negative pressure. A computer program was written which allows orderly, straightforward expansion to include exercise, metabolism (thermal stress), respiration, and other body functions.

  15. Torn in Two: An Examination of Elementary School Counselors' Perceptions on Self-Efficacy

    ERIC Educational Resources Information Center

    Sesto, Casper

    2013-01-01

    The American School Counselor Association (ASCA; The ASCA National Model: A Framework for School Counseling Programs, 2005) developed the ASCA National Model to define the prescribed roles and functions of the professional school counselor. Although the national model initially defines school counselors' roles, counselors find it difficult to…

  16. A study of satisfaction of medical students on their mentoring programs at one medical school in Korea

    PubMed Central

    2017-01-01

    Purpose The purpose of this study was to investigate medical students' awareness of the characteristics of each function within a mentoring program conducted at Kyung Hee University and, ultimately, to suggest points for reform. Medical students' awareness levels were determined using a 29-item questionnaire. Methods The questionnaire was administered to 347 medical students, excluding 25 students who either marked multiple answers or did not reply. The program was assessed with a 5-point Likert scale questionnaire analyzed using SPSS version 22.0. Multiple regression was conducted to examine the relationship between satisfaction with the functions of the mentoring programs and the characteristics of those programs. Interviews were conducted to supplement information that was hard to obtain from the questionnaire. Results The results on demographic and functional characteristics revealed no statistically significant differences in satisfaction levels across gender, whereas there were significant differences across grade levels. In addition, there were significant differences across the frequency of meetings and topics of conversation, while the length of meetings and the meeting place did not differ significantly. Conclusion To improve mentoring programs for medical students, the program should focus on the frequency of meetings and the topics of conversation. Furthermore, mentoring programs of high quality can be expected if professors take the interview results into consideration. Students also want to receive psychosocial advice from mentors in various ways, such as through the mentor's role-model function. PMID:29207456

  17. Structure and Functions of Pediatric Aerodigestive Programs: A Consensus Statement.

    PubMed

    Boesch, R Paul; Balakrishnan, Karthik; Acra, Sari; Benscoter, Dan T; Cofer, Shelagh A; Collaco, Joseph M; Dahl, John P; Daines, Cori L; DeAlarcon, Alessandro; DeBoer, Emily M; Deterding, Robin R; Friedlander, Joel A; Gold, Benjamin D; Grothe, Rayna M; Hart, Catherine K; Kazachkov, Mikhail; Lefton-Greif, Maureen A; Miller, Claire Kane; Moore, Paul E; Pentiuk, Scott; Peterson-Carmichael, Stacey; Piccione, Joseph; Prager, Jeremy D; Putnam, Philip E; Rosen, Rachel; Rutter, Michael J; Ryan, Matthew J; Skinner, Margaret L; Torres-Silva, Cherie; Wootten, Christopher T; Zur, Karen B; Cotton, Robin T; Wood, Robert E

    2018-02-07

    Aerodigestive programs provide coordinated interdisciplinary care to pediatric patients with complex congenital or acquired conditions affecting breathing, swallowing, and growth. Although there has been a proliferation of programs, as well as national meetings, interest groups and early research activity, there is, as of yet, no consensus definition of an aerodigestive patient, standardized structure, and functions of an aerodigestive program or a blueprint for research prioritization. The Delphi method was used by a multidisciplinary and multi-institutional panel of aerodigestive providers to obtain consensus on 4 broad content areas related to aerodigestive care: (1) definition of an aerodigestive patient, (2) essential construct and functions of an aerodigestive program, (3) identification of aerodigestive research priorities, and (4) evaluation and recognition of aerodigestive programs and future directions. After 3 iterations of survey, consensus was obtained by either a supermajority of 75% or stability in median ranking on 33 of 36 items. This included a standard definition of an aerodigestive patient, level of participation of specific pediatric disciplines in a program, essential components of the care cycle and functions of the program, feeding and swallowing assessment and therapy, procedural scope and volume, research priorities and outcome measures, certification, coding, and funding. We propose the first consensus definition of the aerodigestive care model with specific recommendations regarding associated personnel, infrastructure, research, and outcome measures. We hope that this may provide an initial framework to further standardize care, develop clinical guidelines, and improve outcomes for aerodigestive patients. Copyright © 2018 by the American Academy of Pediatrics.

  18. Ideas for the rapid development of the structural models in mechanical engineering

    NASA Astrophysics Data System (ADS)

    Oanta, E.; Raicu, A.; Panait, C.

    2017-08-01

    Conceiving computer-based instruments is a long-run concern of the authors. Some of the original solutions are: optimal processing of large matrices, interfaces between programming languages, approximation theory using spline functions, and increased numerical accuracy based on extended arbitrary-precision libraries. For the rapid development of models, we identified the following directions: atomization, ‘librarization’, parameterization, automatization and integration. Each of these directions has particular aspects depending on whether we approach mechanical design problems or software development. Atomization means a thorough top-down decomposition analysis which offers insight into the basic features of the phenomenon. Creating libraries of reusable mechanical parts and libraries of programs (data types, functions) saves time, cost and effort when a new model must be conceived. Parameterization leads to flexible definition of mechanical parts, the values of the parameters being changed either by a dimensioning program or in accordance with other parts belonging to the same assembly. The resulting templates may also be included in libraries. Original software applications are useful for generating the model’s input data, for feeding the data into commercial CAD/FEA applications, and for integrating the data of the various types of studies included in the same project.

  19. Probabilistic material degradation model for aerospace materials subjected to high temperature, mechanical and thermal fatigue, and creep

    NASA Technical Reports Server (NTRS)

    Boyce, L.

    1992-01-01

    A probabilistic general material strength degradation model has been developed for structural components of aerospace propulsion systems subjected to diverse random effects. The model has been implemented in two FORTRAN programs, PROMISS (Probabilistic Material Strength Simulator) and PROMISC (Probabilistic Material Strength Calibrator). PROMISS calculates the random lifetime strength of an aerospace propulsion component due to as many as eighteen diverse random effects. Results are presented in the form of probability density functions and cumulative distribution functions of lifetime strength. PROMISC calibrates the model by calculating the values of empirical material constants.

  20. H-Bridge Inverter Loading Analysis for an Energy Management System

    DTIC Science & Technology

    2013-06-01

    In order to accomplish the stated objectives, a physics-based model of the system was developed in MATLAB/Simulink. The system was also implemented ...functional architecture and then compile the high level design down to VHDL in order to program the designed functions to the FPGA. B. INSULATED

  1. Monte Carlo-based searching as a tool to study carbohydrate structure

    USDA-ARS?s Scientific Manuscript database

    A torsion angle-based Monte-Carlo searching routine was developed and applied to several carbohydrate modeling problems. The routine was developed as a Unix shell script that calls several programs, which allows it to be interfaced with multiple potential functions and various functions for evaluat...
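The core of a torsion-angle Monte Carlo search like the one described above is a Metropolis acceptance loop over randomly perturbed angles. The sketch below uses a toy cosine torsion potential as a stand-in energy function; the manuscript's routine calls external programs and real force fields, neither of which is reproduced here.

```python
import math
import random

# Hedged sketch of a torsion-angle Metropolis Monte Carlo search.
# The energy function is a toy stand-in (a threefold cosine torsion term),
# not the carbohydrate potential functions used by the actual routine.

def torsion_energy(angles):
    """Toy potential: sum of 1 + cos(3*phi) terms; minimum 0 per torsion."""
    return sum(1.0 + math.cos(3.0 * a) for a in angles)

def mc_search(n_torsions=3, steps=2000, max_move=0.5, kT=0.6, seed=1):
    """Return (best_energy, best_angles) found by a Metropolis walk."""
    rng = random.Random(seed)
    angles = [rng.uniform(-math.pi, math.pi) for _ in range(n_torsions)]
    energy = torsion_energy(angles)
    best = (energy, list(angles))
    for _ in range(steps):
        i = rng.randrange(n_torsions)          # perturb one torsion at random
        trial = list(angles)
        trial[i] += rng.uniform(-max_move, max_move)
        e_trial = torsion_energy(trial)
        # Metropolis criterion: accept downhill moves always, uphill sometimes.
        if e_trial <= energy or rng.random() < math.exp(-(e_trial - energy) / kT):
            angles, energy = trial, e_trial
            if energy < best[0]:
                best = (energy, list(angles))
    return best

best_energy, best_angles = mc_search()
```

Interfacing with multiple potential functions, as the abstract notes, amounts to swapping `torsion_energy` for a call into an external evaluation program.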

  2. Hopping out of the Swamp: Management of Change in a Downsizing Environment.

    ERIC Educational Resources Information Center

    Burton, Jennus L.

    1993-01-01

    Arizona State University has developed a model for managing declining resources in administrative service functions. A variant of Total Quality Management, it involves clarification of administrative unit functions, unit self-examination, establishment of program priorities, environmental scanning, creation of an infrastructure to manage change,…

  3. Public Relations Roles and Systems Theory: Functional and Historicist Causal Models.

    ERIC Educational Resources Information Center

    Broom, Glen M.

    The effectiveness of an organizations's adaptive behavior depends on the extent to which public relations concerns are considered in goal setting and program planning. The following five open systems propositions, based on a "functional" paradigm, address the complex relationship between public relations and organizational intelligence and do not…

  4. Mechanism to support generic collective communication across a variety of programming models

    DOEpatents

    Almasi, Gheorghe [Ardsley, NY; Dozsa, Gabor [Ardsley, NY; Kumar, Sameer [White Plains, NY

    2011-07-19

    A system and method for supporting collective communications on a plurality of processors that use different parallel programming paradigms, in one aspect, may comprise a schedule defining one or more tasks in a collective operation, an executor that executes the task, a multisend module to perform one or more data transfer functions associated with the tasks, and a connection manager that controls one or more connections and identifies an available connection. The multisend module uses the available connection in performing the one or more data transfer functions. A plurality of processors that use different parallel programming paradigms can use a common implementation of the schedule module, the executor module, the connection manager and the multisend module via a language adaptor specific to a parallel programming paradigm implemented on a processor.
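The schedule / executor / multisend / connection-manager decomposition in the abstract can be illustrated with a minimal in-process sketch. All class and method names below are assumptions for illustration; the patented system targets real parallel message-passing hardware, which this toy version only mimics.

```python
# Illustrative sketch of the component decomposition described in the patent
# abstract: a schedule of tasks, an executor, a multisend data-transfer
# module, and a connection manager. Names and behavior are hypothetical.

class ConnectionManager:
    """Controls connections and identifies an available one."""
    def __init__(self, connections):
        self.connections = list(connections)
    def available(self):
        return self.connections[0]      # trivial policy: first connection

class Multisend:
    """Performs the data-transfer function for a task over one connection."""
    def __init__(self, manager):
        self.manager = manager
    def send(self, data, destinations):
        conn = self.manager.available()
        return [(conn, dest, data) for dest in destinations]

class Executor:
    """Executes each task of a schedule through the multisend module."""
    def __init__(self, multisend):
        self.multisend = multisend
    def run(self, schedule):
        sent = []
        for data, destinations in schedule:
            sent.extend(self.multisend.send(data, destinations))
        return sent

# A broadcast "schedule": one task sending the same payload to three ranks.
schedule = [("payload", [1, 2, 3])]
executor = Executor(Multisend(ConnectionManager(["conn0"])))
result = executor.run(schedule)
```

A language adaptor for a given programming paradigm, per the abstract, would sit in front of this common machinery rather than reimplementing it.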

  5. Exercise and end-stage kidney disease: functional exercise capacity and cardiovascular outcomes.

    PubMed

    Parsons, Trisha L; King-Vanvlack, Cheryl E

    2009-11-01

    This review examined published reports of the impact of extradialytic and intradialytic exercise programs on physiologic aerobic exercise capacity, functional exercise endurance, and cardiovascular outcomes in individuals with ESKD. Studies spanning 30 years from the first published report of exercise in the ESKD population were reviewed. Studies conducted in the first half of the publication record focused on the efficacy of exercise training programs performed "off"-dialysis with respect to the modification of traditional cardiovascular risk factors, aerobic capacity, and its underlying determinants. In the latter half of the record, there had been a shift to include other client-centered goals such as physical function and quality of life. There is evidence that both intra- and extradialytic programs can significantly enhance aerobic exercise capacity, but moderate-intensity extradialytic programs may result in greater gains in those individuals who initially have extremely poor aerobic capacity. Functionally, substantive improvements in exercise endurance in excess of the minimum clinical significant difference can occur following either low- or moderate-intensity exercise regardless of the initial level of performance. Reductions in blood pressure and enhanced vascular functioning reported after predominantly intradialytic exercise programs suggest that either low- or moderate-intensity exercise programs can confer cardiovascular benefit. Regardless of prescription model, there was an overall lack of evidence regarding the impact of exercise-induced changes in exercise capacity, endurance, and cardiovascular function on a number of relevant health outcomes (survival, morbidity, and cardiovascular risk), and, more importantly, there is no evidence on the long-term impact of exercise and/or physical activity interventions on these health outcomes.

  6. Catalog of Residential Depth-Damage Functions Used by the Army Corps of Engineers in Flood Damage Estimation

    DTIC Science & Technology

    1992-05-01

    regression analysis. The strength of any one variable can be estimated along with the strength of the entire model in explaining the variance of percent... applicable a set of damage functions is to a particular situation. Sometimes depth- damage functions are embedded in computer programs which calculate...functions. Chapter Six concludes with recommended policies on the development and application of depth-damage functions. 5 6 CHAPTER TWO CONSTRUCTION OF

  7. Lidar performance analysis

    NASA Technical Reports Server (NTRS)

    Spiers, Gary D.

    1994-01-01

    Section 1 details the theory used to build the lidar model, provides results of using the model to evaluate AEOLUS design instrument designs, and provides snapshots of the visual appearance of the coded model. Appendix A contains a Fortran program to calculate various forms of the refractive index structure function. This program was used to determine the refractive index structure function used in the main lidar simulation code. Appendix B contains a memo on the optimization of the lidar telescope geometry for a line-scan geometry. Appendix C contains the code for the main lidar simulation and brief instruction on running the code. Appendix D contains a Fortran code to calculate the maximum permissible exposure for the eye from the ANSI Z136.1-1992 eye safety standards. Appendix E contains a paper on the eye safety analysis of a space-based coherent lidar presented at the 7th Coherent Laser Radar Applications and Technology Conference, Paris, France, 19-23 July 1993.

  8. Investment portfolio of a pension fund: Stochastic model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosch-Princep, M.; Fontanals-Albiol, H.

    1994-12-31

    This paper presents a stochastic programming model that aims at obtaining the optimal investment portfolio of a pension fund. The model has been designed bearing in mind the liabilities of the fund to its members. The essential characteristic of the objective function and the constraints is the randomness of the coefficients and of the right-hand side of the constraints, so it is necessary to use techniques of stochastic mathematical programming to determine the amount of money that should be assigned to each sort of investment. It is important to know the attitude towards risk of the person who has to make the decisions. The model incorporates the relation between the different coefficients of the objective function and constraints of each period of the temporal horizon through linear and discrete random processes. Likewise, it includes hypotheses related to Spanish law concerning pension funds.

  9. Comparison of two non-convex mixed-integer nonlinear programming algorithms applied to autoregressive moving average model structure and parameter estimation

    NASA Astrophysics Data System (ADS)

    Uilhoorn, F. E.

    2016-10-01

    In this article, the stochastic modelling approach proposed by Box and Jenkins is treated as a mixed-integer nonlinear programming (MINLP) problem solved with a mesh adaptive direct search and a real-coded genetic class of algorithms. The aim is to estimate the real-valued parameters and non-negative integer, correlated structure of stationary autoregressive moving average (ARMA) processes. The maximum likelihood function of the stationary ARMA process is embedded in Akaike's information criterion and the Bayesian information criterion, whereas the estimation procedure is based on Kalman filter recursions. The constraints imposed on the objective function enforce stability and invertibility. The best ARMA model is regarded as the global minimum of the non-convex MINLP problem. The robustness and computational performance of the MINLP solvers are compared with brute-force enumeration. Numerical experiments are done for existing time series and one new data set.
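The brute-force enumeration baseline mentioned above can be illustrated on the autoregressive special case: fit AR(p) by ordinary least squares for each candidate order and keep the order that minimizes AIC. The article's full treatment (ARMA structure, Kalman-filter likelihood, MADS and genetic MINLP solvers) is far richer; this is only a sketch of the order-selection idea, with made-up synthetic data.

```python
import numpy as np

# Hedged sketch of brute-force model-order selection by AIC for AR(p) models,
# fitted by ordinary least squares. This is a simplified stand-in for the
# article's ARMA/Kalman-filter likelihood machinery.

def fit_ar_aic(y, p):
    """OLS fit of an AR(p) model; returns AIC = n*log(RSS/n) + 2*(p + 1)."""
    y = np.asarray(y, dtype=float)
    target = y[p:]                                   # y[t] for t = p..N-1
    X = np.column_stack([y[p - k : len(y) - k]       # lagged regressors y[t-k]
                         for k in range(1, p + 1)])
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)
    rss = float(np.sum((target - X @ coef) ** 2))
    n = len(target)
    return n * np.log(rss / n) + 2 * (p + 1)

def select_order(y, max_p=5):
    """Enumerate candidate orders and return the AIC-minimizing one."""
    return min(range(1, max_p + 1), key=lambda p: fit_ar_aic(y, p))

# Synthetic AR(2) data for illustration (invented coefficients).
rng = np.random.default_rng(0)
e = rng.standard_normal(500)
y = np.zeros(500)
for t in range(2, 500):
    y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + e[t]
best_p = select_order(y)
```

Treating the order p as a non-negative integer decision variable alongside the real-valued coefficients is exactly what makes the full problem a mixed-integer nonlinear program.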

  10. RINGMesh: A programming library for developing mesh-based geomodeling applications

    NASA Astrophysics Data System (ADS)

    Pellerin, Jeanne; Botella, Arnaud; Bonneau, François; Mazuyer, Antoine; Chauvin, Benjamin; Lévy, Bruno; Caumon, Guillaume

    2017-07-01

    RINGMesh is a C++ open-source programming library for manipulating discretized geological models. It is designed to ease the development of applications and workflows that use discretized 3D models. It is neither a geomodeler nor a meshing software. RINGMesh implements functionalities to read discretized surface-based or volumetric structural models and to check their validity. The models can then be exported in various file formats. RINGMesh provides data structures to represent geological structural models, defined by their discretized boundary surfaces and/or by discretized volumes. A programming interface allows developers to implement new geomodeling methods and to plug in external software. The goal of RINGMesh is to help researchers focus on the implementation of their specific method rather than on tedious tasks common to many applications. The documented code is open-source and distributed under the modified BSD license. It is available at https://www.ring-team.org/index.php/software/ringmesh.

  11. pyres: a Python wrapper for electrical resistivity modeling with R2

    NASA Astrophysics Data System (ADS)

    Befus, Kevin M.

    2018-04-01

    A Python package, pyres, was written to handle common as well as specialized input and output tasks for the R2 electrical resistivity (ER) modeling program. Input steps including handling field data, creating quadrilateral or triangular meshes, and data filtering allow repeatable and flexible ER modeling within a programming environment. pyres includes non-trivial routines and functions for locating and constraining specific known or separately-parameterized regions in both quadrilateral and triangular meshes. Three basic examples of how to run forward and inverse models with pyres are provided. The importance of testing mesh convergence and model sensitivity are also addressed with higher-level examples that show how pyres can facilitate future research-grade ER analyses.

  12. SCI Identification (SCIDNT) program user's guide. [maximum likelihood method for linear rotorcraft models

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The computer program Linear SCIDNT which evaluates rotorcraft stability and control coefficients from flight or wind tunnel test data is described. It implements the maximum likelihood method to maximize the likelihood function of the parameters based on measured input/output time histories. Linear SCIDNT may be applied to systems modeled by linear constant-coefficient differential equations. This restriction in scope allows the application of several analytical results which simplify the computation and improve its efficiency over the general nonlinear case.

  13. A new implementation of the programming system for structural synthesis (PROSSS-2)

    NASA Technical Reports Server (NTRS)

    Rogers, James L., Jr.

    1984-01-01

    This new implementation of the PROgramming System for Structural Synthesis (PROSSS-2) combines a general-purpose finite element computer program for structural analysis, a state-of-the-art optimization program, and several user-supplied, problem-dependent computer programs. The results are flexibility of the optimization procedure and organization, and versatility in the formulation of constraints and design variables. The analysis-optimization process results in a minimized objective function, typically the mass. The analysis and optimization programs are executed repeatedly by looping through the system until the process is stopped by a user-defined termination criterion. However, some of the analysis, such as model definition, need only be done once, and the results are saved for future use. The user must write some small, simple FORTRAN programs to interface between the analysis and optimization programs. One of these programs, the front processor, converts the design variables output from the optimizer into a format suitable for input to the analyzer. Another, the end processor, retrieves the behavior variables and, optionally, their gradients from the analysis program and evaluates the objective function and constraints and, optionally, their gradients. These quantities are output in a format suitable for input to the optimizer. These user-supplied programs are problem-dependent because they depend primarily upon which finite elements are being used in the model. PROSSS-2 differs from the original PROSSS in that the optimizer and the front and end processors have been integrated into the finite element computer program. This was done to reduce the complexity and increase the portability of the system, and to take advantage of the data handling features found in the finite element program.
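The front-processor / analyzer / end-processor loop described above can be sketched abstractly. Everything below is an invented stand-in (a one-variable sizing problem and a crude feasibility-driven update), not the actual FORTRAN interfaces or finite element analysis.

```python
# Toy sketch of a PROSSS-style analysis-optimization loop: the front
# processor formats design variables for the analyzer, the end processor
# turns behavior variables into an objective and constraints, and the loop
# repeats until a termination criterion is met. All functions are invented.

def front_processor(design_vars):
    """Convert optimizer output into the analyzer's input format."""
    return {"thickness": design_vars[0]}

def analyzer(model_input):
    """Stand-in 'finite element analysis': stress falls as thickness grows."""
    t = model_input["thickness"]
    return {"mass": 2.0 * t, "stress": 10.0 / t}

def end_processor(behavior):
    """Objective = mass; constraint g <= 0 enforces stress <= 5.0 allowable."""
    return behavior["mass"], behavior["stress"] - 5.0

def optimizer_step(design_vars, constraint):
    """Crude update in place of a real optimizer: grow if infeasible."""
    step = 0.1 if constraint > 0 else -0.01
    return [max(design_vars[0] + step, 0.1)]

x = [1.0]
for _ in range(200):            # termination criterion: fixed iteration cap
    obj, g = end_processor(analyzer(front_processor(x)))
    x = optimizer_step(x, g)
final_obj, final_g = end_processor(analyzer(front_processor(x)))
```

The design settles near the thickness where the stress constraint becomes active, which is the point of looping analysis and optimization until termination.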

  14. Towards aspect-oriented functional–structural plant modelling

    PubMed Central

    Cieslak, Mikolaj; Seleznyova, Alla N.; Prusinkiewicz, Przemyslaw; Hanan, Jim

    2011-01-01

    Background and Aims Functional–structural plant models (FSPMs) are used to integrate knowledge and test hypotheses of plant behaviour, and to aid in the development of decision support systems. A significant amount of effort is being put into providing a sound methodology for building them. Standard techniques, such as procedural or object-oriented programming, are not suited for clearly separating aspects of plant function that criss-cross between different components of plant structure, which makes it difficult to reuse and share their implementations. The aim of this paper is to present an aspect-oriented programming approach that helps to overcome this difficulty. Methods The L-system-based plant modelling language L+C was used to develop an aspect-oriented approach to plant modelling based on multi-modules. Each element of the plant structure was represented by a sequence of L-system modules (rather than a single module), with each module representing an aspect of the element's function. Separate sets of productions were used for modelling each aspect, with context-sensitive rules facilitated by local lists of modules to consider/ignore. Aspect weaving or communication between aspects was made possible through the use of pseudo-L-systems, where the strict-predecessor of a production rule was specified as a multi-module. Key Results The new approach was used to integrate previously modelled aspects of carbon dynamics, apical dominance and biomechanics with a model of a developing kiwifruit shoot. These aspects were specified independently and their implementation was based on source code provided by the original authors without major changes. Conclusions This new aspect-oriented approach to plant modelling is well suited for studying complex phenomena in plant science, because it can be used to integrate separate models of individual aspects of plant development and function, both previously constructed and new, into clearly organized, comprehensive FSPMs. 
In future work, this approach could be further extended into an aspect-oriented programming language for FSPMs. PMID:21724653

  15. Effects of classroom animal-assisted activities on social functioning in children with autism spectrum disorder.

    PubMed

    O'Haire, Marguerite E; McKenzie, Samantha J; McCune, Sandra; Slaughter, Virginia

    2014-03-01

    The objective of this study was to implement and evaluate a classroom-based Animal-Assisted Activities (AAA) program on social functioning in children with autism spectrum disorder (ASD). This was a multisite, control-to-intervention design study. The study was conducted in 41 classrooms in 15 schools in Brisbane, Australia. Sixty-four (64) 5- to 12-year-old children diagnosed with ASD comprised the study group. The AAA program consisted of 8 weeks of animal exposure in the school classroom in addition to 16 20-minute animal-interaction sessions. Teacher- and parent-reported child behavior and social functioning were assessed through standardized instruments at three time points: upon study entry (Time 1), after an 8-week waiting period during the week prior to the AAA program (Time 2), and during the week following the 8-week AAA program (Time 3). Significant improvements were identified in social functioning, including increases in social approach behaviors and social skills, and decreases in social withdrawal behaviors, from before to after the AAA program, but not during the waitlist period. Over half of parents also reported that participants demonstrated an increased interest in attending school during the program. Results demonstrate the feasibility and potential efficacy of a new classroom-based Animal-Assisted Activities model, which may provide a relatively simple and cost-effective means of helping educators and families to improve the social functioning of children with ASD.

  16. Space Generic Open Avionics Architecture (SGOAA): Overview

    NASA Technical Reports Server (NTRS)

    Wray, Richard B.; Stovall, John R.

    1992-01-01

    A space generic open avionics architecture created for NASA is described. It will serve as the basis for entities in spacecraft core avionics, capable of being tailored by NASA for future space program avionics ranging from small vehicles such as Moon ascent/descent vehicles to large ones such as Mars transfer vehicles or orbiting stations. The standard consists of: (1) a system architecture; (2) a generic processing hardware architecture; (3) a six class architecture interface model; (4) a system services functional subsystem architectural model; and (5) an operations control functional subsystem architectural model.

  17. Clinical Assessment of Family Caregivers in Dementia.

    ERIC Educational Resources Information Center

    Rankin, Eric D.; And Others

    1992-01-01

    Evaluated development of integrated family assessment inventory based on Double ABCX and Circumplex models of family functioning and its clinical utility with 121 primary family caregivers from cognitive disorders program. Proposed model predicted significant proportion of variance associated with caregiver stress and strain. Several aspects of…

  18. Evaluation of space shuttle main engine fluid dynamic frequency response characteristics

    NASA Technical Reports Server (NTRS)

    Gardner, T. G.

    1980-01-01

    In order to determine the POGO stability characteristics of the space shuttle main engine (SSME) liquid oxygen (LOX) system, the fluid dynamic frequency response functions between elements in the SSME LOX system were evaluated, both analytically and experimentally. For the experimental data evaluation, a software package was written for the Hewlett-Packard 5451C Fourier analyzer. The POGO analysis software is documented and consists of five separate segments. Each segment is stored on the 5451C disc as an individual program and performs its own unique function. The package includes two separate data reduction methods, a signal calibration, frequency response function blanking based on coherence or on the pulser signal, and automatic plotting features. The 5451C allows variable parameter transfer from program to program. This feature is used to advantage and requires only minimal user interaction during the data reduction process. Experimental results are included and compared with the analytical predictions in order to adjust the general model and arrive at a realistic simulation of the POGO characteristics.

  19. Measuring effectiveness of a university by a parallel network DEA model

    NASA Astrophysics Data System (ADS)

    Kashim, Rosmaini; Kasim, Maznah Mat; Rahman, Rosshairy Abd

    2017-11-01

    Universities contribute significantly to the development of human capital and the socio-economic improvement of a country. Because of that, Malaysian universities have carried out various initiatives to improve their performance. Most studies have used the Data Envelopment Analysis (DEA) model to measure efficiency rather than effectiveness, even though measuring effectiveness is important for understanding how effective a university is in achieving its ultimate goals. A university system has two major functions, namely teaching and research, and each function has different resources based on its emphasis. Therefore, a university is actually structured as a parallel production system whose overall effectiveness is the aggregated effectiveness of teaching and research. Hence, this paper proposes a parallel network DEA model to measure the effectiveness of a university. This model takes the internal operations of both the teaching and research functions into account in computing the effectiveness of a university system. In the literature, graduates and the number of programs offered are defined as the outputs, while employed graduates and the number of programs accredited by professional bodies are considered the outcomes for measuring teaching effectiveness. The amount of grants is regarded as the output of research, while publications of different quality levels are considered the outcomes of research. A system is considered effective only if all functions are effective. The model has been tested on a hypothetical data set covering 14 faculties at a public university in Malaysia. The results show that none of the faculties is relatively effective in overall performance; three faculties are effective in teaching and two are effective in research. The potential applications of the parallel network DEA model allow the top management of a university to identify weaknesses in either function of their university and take rational steps for improvement.
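The optimization underlying DEA scores can be illustrated with a plain (non-network) input-oriented CCR model solved as a linear program. The network decomposition into teaching and research sub-processes that the paper proposes is not reproduced here, and the faculty data below are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Hedged sketch of the basic input-oriented CCR DEA model in multiplier form:
# maximize u.y_o subject to v.x_o = 1 and u.y_j - v.x_j <= 0 for all DMUs j.
# This is the plain DEA building block, not the paper's parallel network model.

def ccr_efficiency(X, Y, o):
    """Efficiency of DMU o. X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs)."""
    n, m = Y.shape
    s = X.shape[1]
    c = np.concatenate([-Y[o], np.zeros(s)])          # minimize -u.y_o
    A_ub = np.hstack([Y, -X])                          # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(m), X[o]])[None]   # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (m + s), method="highs")
    return -res.fun

# Three hypothetical faculties: one input (staff), one output (graduates).
X = np.array([[10.0], [20.0], [30.0]])
Y = np.array([[100.0], [150.0], [300.0]])
scores = [ccr_efficiency(X, Y, o) for o in range(3)]
```

With one input and one output, the CCR score reduces to each unit's output/input ratio divided by the best ratio, so the middle faculty scores 0.75 here while the other two are efficient.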

  20. Frequency Domain Identification Toolbox

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Juang, Jer-Nan; Chen, Chung-Wen

    1996-01-01

    This report documents software written in MATLAB programming language for performing identification of systems from frequency response functions. MATLAB is a commercial software environment which allows easy manipulation of data matrices and provides other intrinsic matrix functions capabilities. Algorithms programmed in this collection of subroutines have been documented elsewhere but all references are provided in this document. A main feature of this software is the use of matrix fraction descriptions and system realization theory to identify state space models directly from test data. All subroutines have templates for the user to use as guidelines.

  1. Smoke Alarm Giveaway and Installation Programs

    PubMed Central

    Liu, Ying; Mack, Karin A.; Diekman, Shane T.

    2015-01-01

    Background The burden of residential fire injury and death is substantial. Targeted smoke alarm giveaway and installation programs are popular interventions used to reduce residential fire mortality and morbidity. Purpose To evaluate, through a decision-analysis model, the cost-effectiveness and cost-benefit of implementing a giveaway or installation program in a small hypothetical community with a high risk of fire death and injury. Methods Model inputs included program costs; program effectiveness (life-years and quality-adjusted life-years saved); and monetized program benefits (medical costs, productivity, property loss, and quality-of-life losses averted), and were identified through structured reviews of the existing literature (done in 2011), supplemented by expert opinion. Future costs and effectiveness were discounted at a rate of 3% per year. All costs were expressed in 2011 U.S. dollars. Results Cost-effectiveness analysis (CEA) resulted in an average cost-effectiveness ratio (ACER) of $51,404 per quality-adjusted life-year (QALY) saved for the giveaway program and $45,630 per QALY for the installation program. Cost-benefit analysis (CBA) showed that both programs were associated with a positive net benefit, with benefit-cost ratios of 2.1 and 2.3, respectively. The smoke alarm functional rate, the baseline prevalence of functional alarms, and the baseline home fire death rate were among the most influential factors for the CEA and CBA results. Conclusions Both giveaway and installation programs have an average cost-effectiveness ratio similar to or lower than the median cost-effectiveness ratio reported for other interventions to reduce fatal injuries in homes. Although more effort is required, installation programs result in a lower cost per outcome achieved compared with giveaways. PMID:22992356
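The quantities reported above (3% annual discounting, the ACER in dollars per QALY, and the benefit-cost ratio) are simple arithmetic, sketched below with invented numbers; these are not the study's actual inputs or results.

```python
# Worked sketch of the cost-effectiveness quantities used in the study.
# All numbers below are invented for illustration only.

def present_value(amount, year, rate=0.03):
    """Discount a future amount back to year 0 at the given annual rate."""
    return amount / (1.0 + rate) ** year

def acer(net_cost, qalys_saved):
    """Average cost-effectiveness ratio: dollars per QALY saved."""
    return net_cost / qalys_saved

def benefit_cost_ratio(monetized_benefits, program_cost):
    """Ratio of monetized benefits to program cost; > 1 means net benefit."""
    return monetized_benefits / program_cost

# Invented example: a $500,000 program whose 10 QALYs all accrue in year 5,
# with $1.1M in monetized benefits (medical costs, productivity, property
# loss, and quality-of-life losses averted).
qalys = present_value(10.0, year=5)
ratio = acer(500_000.0, qalys)
bcr = benefit_cost_ratio(1_100_000.0, 500_000.0)
```

Discounting the QALYs raises the cost per QALY relative to the undiscounted figure, which is why the discount rate is among the influential model inputs.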

  2. Object-oriented approach for gas turbine engine simulation

    NASA Technical Reports Server (NTRS)

    Curlett, Brian P.; Felder, James L.

    1995-01-01

    An object-oriented gas turbine engine simulation program was developed. This program is a prototype for a more complete, commercial-grade engine performance program now being proposed as part of the Numerical Propulsion System Simulator (NPSS). This report discusses architectural issues of this complex software system and the lessons learned from developing the prototype code. The prototype code is a fully functional, general-purpose engine simulation program; however, only the component models necessary to model a transient compressor test rig have been written. The production system will be capable of steady-state and transient modeling of almost any turbine engine configuration. Chief among the architectural considerations for this code was the framework in which the various software modules interact. These modules include the equation solver, simulation code, data model, event handler, and user interface. Also documented in this report are the component-based design of the simulation module and the inter-component communication paradigm. Object class hierarchies for some of the code modules are given.

  3. Enhancements to the SSME transfer function modeling code

    NASA Technical Reports Server (NTRS)

    Irwin, R. Dennis; Mitchell, Jerrel R.; Bartholomew, David L.; Glenn, Russell D.

    1995-01-01

    This report details the results of a one-year effort by Ohio University to apply the transfer function modeling and analysis tools developed under NASA Grant NAG8-167 (Irwin, 1992), (Bartholomew, 1992) to the generation of Space Shuttle Main Engine High Pressure Turbopump transfer functions from time-domain data. In addition, new enhancements that extend the functionality of the transfer function modeling codes are presented, along with some ideas for improved modeling methods and future work. Section 2 contains a review of the analytical background used to generate transfer functions with the SSME transfer function modeling software. Section 2.1 presents the 'ratio method' developed for obtaining models of systems that are subject to a single unmeasured excitation source and have two or more measured output signals. Since most of the models developed during the investigation use the Eigensystem Realization Algorithm (ERA) for model generation, Section 2.2 presents an introduction to ERA, and Section 2.3 describes how it can be used to model spectral quantities. Section 2.4 details the Residue Identification Algorithm (RID), including the use of Constrained Least Squares (CLS) and Total Least Squares (TLS). Most of this information can be found in the earlier report and is repeated here for convenience. Section 3 chronicles the effort of applying the SSME transfer function modeling codes to the a51p394.dat and a51p1294.dat time data files to generate transfer functions from the unmeasured input to the 129.4-degree sensor output. Included are transfer function modeling attempts using five methods. The first method is a direct application of the SSME codes to the data files, and the second method uses the underlying trends in the spectral density estimates to form transfer function models with less clustering of poles and zeros than the models obtained by the direct method. In the third approach, the time data are low-pass filtered prior to the modeling process in an effort to remove high-frequency characteristics. The fourth method removes the presumed system excitation and its harmonics in order to investigate the effects of the excitation on the modeling process. The fifth method applies constrained RID to obtain better transfer functions through more accurate modeling over certain frequency ranges. Section 4 presents some new C main files which were created to round out the functionality of the existing SSME transfer function modeling code. It is now possible to go from time data to transfer function models using only the C codes; it is not necessary to rely on external software. The new C main files and instructions for their use are included. Section 5 presents current and future enhancements to the XPLOT graphics program which was delivered with the initial software. Several new features which have been added to the program are detailed in the first part of that section; the remainder of Section 5 lists some possible features which may be added in the future. Section 6 concludes the report: Section 6.1 gives an overview of the work, including a summary and observations related to finding transfer functions with the SSME code, and Section 6.2 discusses future work on the project.
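
The Eigensystem Realization Algorithm introduced in Section 2.2 can be sketched compactly: stack the Markov parameters into two shifted Hankel matrices, truncate an SVD, and read off a balanced state-space realization. A minimal SISO version (textbook ERA, assuming NumPy is available; this is not the SSME code itself):

```python
import numpy as np

def era(markov, n):
    """Textbook ERA sketch: realize (A, B, C) of order n from
    SISO Markov parameters markov[k] = C A^k B."""
    m = (len(markov) - 1) // 2
    H0 = np.array([[markov[i + j] for j in range(m)] for i in range(m)])
    H1 = np.array([[markov[i + j + 1] for j in range(m)] for i in range(m)])
    U, s, Vt = np.linalg.svd(H0)
    U, s, Vt = U[:, :n], s[:n], Vt[:n, :]          # rank-n truncation
    S_isq, S_sq = np.diag(1.0 / np.sqrt(s)), np.diag(np.sqrt(s))
    A = S_isq @ U.T @ H1 @ Vt.T @ S_isq            # shifted-Hankel identity
    B = (S_sq @ Vt)[:, :1]                         # first input column
    C = (U @ S_sq)[:1, :]                          # first output row
    return A, B, C

# Recover a known first-order system with A = 0.5, B = C = 1:
A, B, C = era([0.5 ** k for k in range(9)], 1)
```

With noisy data, the singular value spectrum of H0 guides the choice of model order n, which is how ERA is applied to the spectral quantities of Section 2.3.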

  4. Function-based payment model for inpatient medical rehabilitation: an evaluation.

    PubMed

    Sutton, J P; DeJong, G; Wilkerson, D

    1996-07-01

    To describe the components of a function-based prospective payment model for inpatient medical rehabilitation that parallels diagnosis-related groups (DRGs), to evaluate this model in relation to stakeholder objectives, and to detail the components of a quality-of-care incentive program that, when combined with this payment model, creates an incentive for providers to maximize functional outcomes. This article describes a conceptual model, involving no data collection or data synthesis. The basic payment model described parallels DRGs. Information on the potential impact of this model on medical rehabilitation is gleaned from the literature evaluating the impact of DRGs. The conceptual model described is evaluated against the results of a Delphi survey of rehabilitation providers, consumers, policymakers, and researchers previously conducted by members of the research team. The major shortcoming of a function-based prospective payment model for inpatient medical rehabilitation is that it contains no inherent incentive to maximize functional outcomes. Linking reimbursement to outcomes, however, by withholding a fixed proportion of the standard FRG (function-related group) payment amount, placing that amount in a "quality of care" pool, and distributing that pool annually among providers whose predesignated, facility-level, case-mix-adjusted outcomes are attained, may be one strategy for maximizing outcome goals.

  5. Evaluating the Bias of Alternative Cost Progress Models: Tests Using Aerospace Industry Acquisition Programs

    DTIC Science & Technology

    1992-12-01

    ...to what extent prediction bias was positively correlated among the various models: the random walk, learning curve, fixed-variable, and Bemis...Functions, Production Rate Adjustment Model, Learning Curve Model, Random Walk Model, Bemis Model, Evaluating Model Bias, Cost Prediction Bias, Cost...of four cost progress models--a random walk model, the traditional learning curve model, a production rate model (fixed-variable model), and a model

  6. Designing post-graduate Master's degree programs: the advanced training program in Dental Functional Analysis and Therapy as one example

    PubMed Central

    Ratzmann, Anja; Ruge, Sebastian; Ostendorf, Kristin; Kordaß, Bernd

    2014-01-01

    Introduction: The decision to consolidate European higher education was reached by the Bologna Conference. Based on the Anglo-American system, a two-cycle degree program (Bachelor and Master) has been introduced. Subjects culminating in a state examination, such as Medicine and Dentistry, were excluded from this reform. Since the state examination is already comparable in its caliber to a Master's degree in Medicine or Dentistry, only advanced Master's degree programs with post-graduate specializations come into consideration for these subjects. In the field of dentistry, numerous post-graduate study programs are emerging, and many different models and approaches are being pursued. Method: Since the 2004-2005 winter semester, the University of Greifswald has offered the Master's degree program in Dental Functional Analysis and Therapy. The two-and-a-half-year program is structured so that participants can continue working, and targets licensed dentists who wish to attain certified skills in state-of-the-art functional analysis and therapy. Aim: The design of this post-graduate program and the initial results of the evaluation by alumni are presented here. Conclusion: Our experiences show that the conceptual idea of an advanced Master's program has proved successful. The program covers a specialty which leads to increased confidence in handling challenging patient cases. The sharing of experiences among colleagues was evaluated as being especially important. PMID:24872853

  7. Non-linear Growth Models in Mplus and SAS

    PubMed Central

    Grimm, Kevin J.; Ram, Nilam

    2013-01-01

    Non-linear growth curves, or growth curves that follow a specified non-linear function in time, enable researchers to model complex developmental patterns with parameters that are easily interpretable. In this paper we describe how a variety of sigmoid curves can be fit using the Mplus structural modeling program and the non-linear mixed-effects modeling procedure NLMIXED in SAS. Using longitudinal achievement data collected as part of a study examining the effects of preschool instruction on academic gain, we illustrate the procedures for fitting growth models of logistic, Gompertz, and Richards functions. Brief notes regarding the practical benefits, limitations, and choices faced in the fitting and estimation of such models are included. PMID:23882134
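
The three sigmoid families named in the abstract can be written out directly. A sketch in one common parameterization (asymptote, rate, and inflection timing; the parameter names are illustrative, not Mplus or NLMIXED syntax):

```python
import math

def logistic(t, asym, rate, t_mid):
    """Symmetric S-curve; reaches asym/2 at t = t_mid."""
    return asym / (1.0 + math.exp(-rate * (t - t_mid)))

def gompertz(t, asym, rate, t_mid):
    """Asymmetric S-curve; reaches asym/e at t = t_mid."""
    return asym * math.exp(-math.exp(-rate * (t - t_mid)))

def richards(t, asym, rate, t_mid, shape):
    """Generalization with a shape parameter; shape = 1 recovers the logistic."""
    return asym / (1.0 + shape * math.exp(-rate * (t - t_mid))) ** (1.0 / shape)
```

In a growth model these become the within-person trajectory, with the asymptote, rate, and timing parameters allowed to vary across individuals as random effects.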

  8. TEAM-HF Cost-Effectiveness Model: A Web-Based Program Designed to Evaluate the Cost-Effectiveness of Disease Management Programs in Heart Failure

    PubMed Central

    Reed, Shelby D.; Neilson, Matthew P.; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H.; Polsky, Daniel E.; Graham, Felicia L.; Bowers, Margaret T.; Paul, Sara C.; Granger, Bradi B.; Schulman, Kevin A.; Whellan, David J.; Riegel, Barbara; Levy, Wayne C.

    2015-01-01

    Background Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. Methods We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics, use of evidence-based medications, and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model (SHFM). Projections of resource use and quality of life are modeled using relationships with time-varying SHFM scores. The model can be used to evaluate parallel-group and single-cohort designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. Results The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. Conclusion The TEAM-HF Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. PMID:26542504
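
The incremental cost-effectiveness ratio the model reports has a one-line definition: extra cost divided by extra effectiveness between the two virtual cohorts. A sketch with made-up numbers, not TEAM-HF outputs:

```python
def icer(cost_program, cost_control, effect_program, effect_control):
    """Incremental cost-effectiveness ratio: extra dollars per extra
    unit of effectiveness (e.g., QALY) for the program vs. control."""
    return (cost_program - cost_control) / (effect_program - effect_control)

# e.g., $20,000 more spent for 0.5 extra QALYs -> $40,000 per QALY.
```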

  9. Multiple utility constrained multi-objective programs using Bayesian theory

    NASA Astrophysics Data System (ADS)

    Abbasian, Pooneh; Mahdavi-Amiri, Nezam; Fazlollahtabar, Hamed

    2018-03-01

    A utility function is an important tool for representing a decision maker's (DM's) preference. We adjoin utility functions to multi-objective optimization problems. In current studies, usually one utility function is used for each objective function. Situations may arise, however, in which a goal has multiple utility functions. Here, we consider a constrained multi-objective problem with each objective having multiple utility functions. We induce the probability of the utilities for each objective function using Bayesian theory. Illustrative examples considering dependence and independence of variables are worked through to demonstrate the usefulness of the proposed model.
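
The core construction, several candidate utility functions per objective weighted by Bayesian probabilities, can be sketched in a few lines. This is a minimal illustration of the idea, not the authors' exact formulation:

```python
def posterior(priors, likelihoods):
    """Bayes' rule over candidate utility functions for one objective."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

def mixed_utility(x, utilities, priors, likelihoods):
    """Posterior-weighted mixture of the candidate utilities at point x."""
    weights = posterior(priors, likelihoods)
    return sum(w * u(x) for w, u in zip(weights, utilities))
```

Each objective's mixed utility can then be maximized, or scalarized together with the other objectives, by a standard multi-objective solver.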

  10. PyMCT: A Very High Level Language Coupling Tool For Climate System Models

    NASA Astrophysics Data System (ADS)

    Tobis, M.; Pierrehumbert, R. T.; Steder, M.; Jacob, R. L.

    2006-12-01

    At the Climate Systems Center of the University of Chicago, we have been examining strategies for applying agile programming techniques to complex high-performance modeling experiments. While the "agile" development methodology differs from a conventional requirements process and its associated milestones, the process remains a formal one. It is distinguished by continuous improvement in functionality, large numbers of small releases, extensive and ongoing testing strategies, and a strong reliance on very high level languages (VHLL). Here we report on PyMCT, which we intend as a core element in a model ensemble control superstructure. PyMCT is a set of Python bindings for MCT, the Fortran-90 based Model Coupling Toolkit, which forms the infrastructure for the inter-component communication in the Community Climate System Model (CCSM). MCT provides a scalable model communication infrastructure. In order to take maximum advantage of agile software development methodologies, we exposed MCT functionality to Python, a prominent VHLL. We describe how the scalable architecture of MCT allows us to overcome the relatively weak runtime performance of Python, so that the performance of the combined system is not severely impacted. To demonstrate these advantages, we reimplemented the CCSM coupler in Python. While this alone offers no new functionality, it does provide a rigorous test of PyMCT functionality and performance. We reimplemented the CPL6 library, presenting an interesting case study of the comparison between conventional Fortran-90 programming and the higher abstraction level provided by a VHLL. The powerful abstractions provided by Python will allow much more complex experimental paradigms. In particular, we hope to build on the scriptability of our coupling strategy to enable systematic sensitivity tests.
Our most ambitious objective is to combine our efforts with Bayesian inverse modeling techniques toward objective tuning at the highest level, across model architectures.

  11. Energy-state formulation of lumped volume dynamic equations with application to a simplified free piston Stirling engine

    NASA Technical Reports Server (NTRS)

    Daniele, C. J.; Lorenzo, C. F.

    1979-01-01

    Lumped volume dynamic equations are derived using an energy state formulation. This technique requires that kinetic and potential energy state functions be written for the physical system being investigated. To account for losses in the system, a Rayleigh dissipation function is formed. Using these functions, a Lagrangian is formed and using Lagrange's equation, the equations of motion for the system are derived. The results of the application of this technique to a lumped volume are used to derive a model for the free piston Stirling engine. The model was simplified and programmed on an analog computer. Results are given comparing the model response with experimental data.
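
For a single lumped mass-spring-damper the recipe above is easy to trace: with kinetic energy T = m*v^2/2, potential energy V = k*x^2/2, and Rayleigh dissipation D = c*v^2/2, Lagrange's equation yields m*a + c*v + k*x = 0. A sketch that integrates that equation of motion numerically (semi-implicit Euler standing in for the paper's analog computer; parameter values are illustrative):

```python
def simulate(m, c, k, x0, v0, dt=1e-3, steps=5000):
    """Integrate m*a + c*v + k*x = 0 from (x0, v0); returns final (x, v)."""
    x, v = x0, v0
    for _ in range(steps):
        a = -(c * v + k * x) / m    # equation of motion from the Lagrangian
        v += a * dt                 # semi-implicit Euler update
        x += v * dt
    return x, v
```

Because the Rayleigh term only removes energy, the trajectory decays toward rest, which is the qualitative behavior checked against experimental data in the abstract.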

  13. Longitudinal analysis of pulmonary dysfunction in the initial years of employment in the grain industry.

    PubMed

    Olfert, S M; Pahwa, P; Dosman, J A

    2005-11-01

    The negative health effects of exposure to grain dust have previously been examined, but few studies have observed the effects on newly hired employees. Young grain workers are of interest because changes in pulmonary function may occur after a short duration of employment, and because older grain workers may represent a survivor population. The New Grain Workers Study (NGWS), a longitudinal study of 299 newly hired male grain industry workers, was conducted between 1980 and 1985. The objectives were to determine the effects of employment in the grain industry on pulmonary function. Pre-employment physical examinations and pulmonary function tests were conducted on subjects at the Division of Respiratory Medicine, Department of Medicine, Royal University Hospital, University of Saskatchewan. The Grain Dust Medical Surveillance Program (GDMSP) was a Labour Canada program that began in 1978. All subjects were grain workers employed in the grain industry in Saskatchewan. All subjects completed a respiratory symptoms questionnaire and underwent pulmonary function testing. Baseline observations were recorded every three years between 1978 and 1993. Data were available on 2184 grain workers. Generalized estimating equations were used to fit marginal and transitional multivariable regression models to determine the effects of grain dust exposure on pulmonary function. Marginal and transitional models were then compared. Height, exposure weeks, and previous FVC were predictive of FVC in the NGWS, while exposure weeks and previous FEV1 were predictive of FEV1. These models, as well as a transitional regression model built using the GDMSP data, were used to compute predicted mean annual decline in pulmonary function. Non-smoking grain workers in the NGWS had the highest pulmonary function test values, but also had the greatest predicted annual decline in pulmonary function. Ever-smoking grain workers in the GDMSP had the lowest pulmonary function test values.
Non-smoking grain workers in the GDMSP had the least predicted annual decline in pulmonary function.

  14. Role of conceptual models in a physical therapy curriculum: application of an integrated model of theory, research, and clinical practice.

    PubMed

    Darrah, Johanna; Loomis, Joan; Manns, Patricia; Norton, Barbara; May, Laura

    2006-11-01

    The Department of Physical Therapy, University of Alberta, Edmonton, Alberta, Canada, recently implemented a Master of Physical Therapy (MPT) entry-level degree program. As part of the curriculum design, two models were developed, a Model of Best Practice and the Clinical Decision-Making Model. Both models incorporate four key concepts of the new curriculum: 1) the concept that theory, research, and clinical practice are interdependent and inform each other; 2) the importance of client-centered practice; 3) the terminology and philosophical framework of the World Health Organization's International Classification of Functioning, Disability, and Health; and 4) the importance of evidence-based practice. In this article the general purposes of models for learning are described; the two models developed for the MPT program are described; and examples of their use with curriculum design and teaching are provided. Our experiences with both the development and use of models of practice have been positive. The models have provided both faculty and students with a simple, systematic structured framework to organize teaching and learning in the MPT program.

  15. A Case Study of Undergraduate Women in a Leadership Development Program at a Coeducational Institution

    ERIC Educational Resources Information Center

    Haight, Lori P.

    2010-01-01

    The purpose of this interpretive case study was to explore the collegiate experiences of undergraduate women participating in a cohort women's-only leadership development program at a coeducational institution. Using a framework based on Kurt Lewin's psycho-social model of behavior being the function of a person interacting with the environment…

  16. Structural Equation Modeling of Multivariate Time Series

    ERIC Educational Resources Information Center

    du Toit, Stephen H. C.; Browne, Michael W.

    2007-01-01

    The covariance structure of a vector autoregressive process with moving average residuals (VARMA) is derived. It differs from other available expressions for the covariance function of a stationary VARMA process and is compatible with current structural equation methodology. Structural equation modeling programs, such as LISREL, may therefore be…

  17. Improvements and new features in the PDF module

    NASA Technical Reports Server (NTRS)

    Norris, Andrew T.

    1995-01-01

    This viewgraph presentation discusses what models are used in this package and what their advantages and disadvantages are, how the probability density function (PDF) model is implemented and the features of the program, and what can be expected in the future from the NASA Lewis PDF code.

  18. The Rape Prevention and Education (RPE) Theory Model of Community Change: Connecting Individual and Social Change

    ERIC Educational Resources Information Center

    Cox, Pamela J.; Lang, Karen S.; Townsend, Stephanie M.; Campbell, Rebecca

    2010-01-01

    Social work practice has long focused on the connections between an individual and the social environment that affect the individual's social functioning. The Rape Prevention and Education (RPE) Program's theory model, Creating Safer Communities: The Rape Prevention and Education Model of Community Change, provides family social workers with a…

  19. [The application of gene expression programming in the diagnosis of heart disease].

    PubMed

    Dai, Wenbin; Zhang, Yuntao; Gao, Xingyu

    2009-02-01

    GEP (gene expression programming) is a new genetic algorithm that has proved excellent at function finding. In this paper, for the purpose of setting up a diagnostic model, GEP is used to deal with the data of heart disease. Eight variables, Sex, Chest pain, Blood pressure, Angina, Peak, Slope, Colored vessels and Thal, are picked out of thirteen variables to form a classification function. This function is used to predict a forecasting set of 100 samples, and the accuracy is 87%. Other algorithms, such as SVM (support vector machine), are applied to the same data, and the forecasting results show that GEP performs better than the other algorithms.

  20. Performance and Architecture Lab Modeling Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-06-19

    Analytical application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult. Furthermore, models are frequently expressed in forms that are hard to distribute and validate. The Performance and Architecture Lab Modeling tool, or Palm, is a modeling tool designed to make application modeling easier. Palm provides a source code modeling annotation language. Not only does the modeling language divide the modeling task into subproblems, it formally links an application's source code with its model. This link is important because a model's purpose is to capture application behavior. Furthermore, this link makes it possible to define rules for generating models according to source code organization. Palm generates hierarchical models according to well-defined rules. Given an application, a set of annotations, and a representative execution environment, Palm will generate the same model. A generated model is an executable program whose constituent parts directly correspond to the modeled application. Palm generates models by combining top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. A model's hierarchy is defined by static and dynamic source code structure. Because Palm coordinates models and source code, Palm's models are 'first-class' and reproducible. Palm automates common modeling tasks. For instance, Palm incorporates measurements to focus attention, represent constant behavior, and validate models. Palm's workflow is as follows. The workflow's input is source code annotated with Palm modeling annotations. The most important annotation models an instance of a block of code. Given annotated source code, the Palm Compiler produces executables and the Palm Monitor collects a representative performance profile. The Palm Generator synthesizes a model based on the static and dynamic mapping of annotations to program behavior. The model -- an executable program -- is a hierarchical composition of annotation functions, synthesized functions, statistics for runtime values, and performance measurements.

  1. Use of a Microsoft Excel based add-in program to calculate plasma sinistrin clearance by a two-compartment model analysis in dogs.

    PubMed

    Steinbach, Sarah M L; Sturgess, Christopher P; Dunning, Mark D; Neiger, Reto

    2015-06-01

    Assessment of renal function by means of plasma clearance of a suitable marker has become standard procedure for estimation of glomerular filtration rate (GFR). Sinistrin, a polyfructan solely cleared by the kidney, is often used for this purpose. Pharmacokinetic modeling using adequate software is necessary to calculate the disappearance rate and half-life of sinistrin. The purpose of this study was to describe the use of a Microsoft Excel-based add-in program to calculate plasma sinistrin clearance, as well as additional pharmacokinetic parameters such as transfer rates (k), half-life (t1/2), and volume of distribution (Vss), for sinistrin in dogs with varying degrees of renal function. Copyright © 2015 Elsevier Ltd. All rights reserved.
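
In a two-compartment fit, plasma concentration follows a biexponential C(t) = A*exp(-a*t) + B*exp(-b*t), and clearance follows from dose over the area under that curve. A sketch with illustrative macro-constants, not values from the study:

```python
import math

def auc_biexp(A, a, B, b):
    """Area under C(t) = A*exp(-a*t) + B*exp(-b*t) from 0 to infinity."""
    return A / a + B / b

def clearance(dose, A, a, B, b):
    """Plasma clearance = dose / AUC(0->infinity)."""
    return dose / auc_biexp(A, a, B, b)

def half_life(rate_const):
    """Half-life associated with an exponential rate constant."""
    return math.log(2.0) / rate_const
```

The add-in's job is essentially fitting A, a, B, and b to the measured concentration-time points; the quantities above then follow in closed form.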

  2. Software to model AXAF image quality

    NASA Technical Reports Server (NTRS)

    Ahmad, Anees

    1993-01-01

    This draft final report describes the work performed under this delivery order from May 1992 through June 1993. The purpose of this contract was to enhance and develop an integrated optical performance modeling software for complex x-ray optical systems such as AXAF. The GRAZTRACE program developed by the MSFC Optical Systems Branch for modeling VETA-I was used as the starting baseline program. The original program was a large single file program and, therefore, could not be modified very efficiently. The original source code has been reorganized, and a 'Make Utility' has been written to update the original program. The new version of the source code consists of 36 small source files to make it easier for the code developer to manage and modify the program. A user library has also been built and a 'Makelib' utility has been furnished to update the library. With the user library, the users can easily access the GRAZTRACE source files and build a custom library. A user manual for the new version of GRAZTRACE has been compiled. The plotting capability for the 3-D point spread functions and contour plots has been provided in the GRAZTRACE using the graphics package DISPLAY. The Graphics emulator over the network has been set up for programming the graphics routine. The point spread function and the contour plot routines have also been modified to display the plot centroid, and to allow the user to specify the plot range, and the viewing angle options. A Command Mode version of GRAZTRACE has also been developed. More than 60 commands have been implemented in a Code-V like format. The functions covered in this version include data manipulation, performance evaluation, and inquiry and setting of internal parameters. The user manual for these commands has been formatted as in Code-V, showing the command syntax, synopsis, and options. 
An interactive on-line help system for the command mode has also been implemented to allow the user to find valid commands, command syntax, and command functions. A translation program has been written to convert FEA output from structural analysis to a GRAZTRACE surface deformation file (.dfm file). The program can accept standard output files and list files from the COSMOS/M and NASTRAN finite element analysis programs. Some interactive options are also provided, such as Cartesian or cylindrical coordinate transformation, coordinate shift and scale, and axial length change. A computerized database for technical documents relating to the AXAF project has been established. Over 5000 technical documents have been entered into the master database. A user can now rapidly retrieve the desired documents relating to the AXAF project. A summary of the work performed under this contract is given.

  3. A Combination of Hand-held Models and Computer Imaging Programs Helps Students Answer Oral Questions about Molecular Structure and Function: A Controlled Investigation of Student Learning

    PubMed Central

    Peck, Ronald F.; Colton, Shannon; Morris, Jennifer; Chaibub Neto, Elias; Kallio, Julie

    2009-01-01

    We conducted a controlled investigation to examine whether a combination of computer imagery and tactile tools helps introductory cell biology laboratory undergraduate students better learn about protein structure/function relationships as compared with computer imagery alone. In all five laboratory sections, students used the molecular imaging program, Protein Explorer (PE). In the three experimental sections, three-dimensional physical models were made available to the students, in addition to PE. Student learning was assessed via oral and written research summaries and videotaped interviews. Differences between the experimental and control group students were not found in our typical course assessments such as research papers, but rather were revealed during one-on-one interviews with students at the end of the semester. A subset of students in the experimental group produced superior answers to some higher-order interview questions as compared with students in the control group. During the interview, students in both groups preferred to use either the hand-held models alone or in combination with the PE imaging program. Students typically did not use any tools when answering knowledge (lower-level thinking) questions, but when challenged with higher-level thinking questions, students in both the control and experimental groups elected to use the models. PMID:19255134

  4. PharmDock: a pharmacophore-based docking program

    PubMed Central

    2014-01-01

    Background Protein-based pharmacophore models are enriched with information about potential interactions between ligands and the protein target. We have shown in a previous study that protein-based pharmacophore models can be applied for ligand pose prediction and pose ranking. In this publication, we present a new pharmacophore-based docking program PharmDock that combines pose sampling and ranking based on optimized protein-based pharmacophore models with local optimization using an empirical scoring function. Results Tests of PharmDock on ligand pose prediction, binding affinity estimation, compound ranking and virtual screening yielded comparable or better performance to existing and widely used docking programs. The docking program comes with an easy-to-use GUI within PyMOL. Two features have been incorporated in the program suite that allow for user-defined guidance of the docking process based on previous experimental data. Docking with those features demonstrated superior performance compared to unbiased docking. Conclusion A protein pharmacophore-based docking program, PharmDock, has been made available with a PyMOL plugin. PharmDock and the PyMOL plugin are freely available from http://people.pharmacy.purdue.edu/~mlill/software/pharmdock. PMID:24739488

  5. GIANT API: an application programming interface for functional genomics.

    PubMed

    Roberts, Andrew M; Wong, Aaron K; Fisk, Ian; Troyanskaya, Olga G

    2016-07-08

    GIANT API provides biomedical researchers programmatic access to tissue-specific and global networks in humans and model organisms, and associated tools, including functional re-prioritization of existing genome-wide association study (GWAS) data. Using tissue-specific interaction networks, researchers are able to predict relationships between genes specific to a tissue or cell lineage, identify the changing roles of genes across tissues and uncover disease-gene associations. Additionally, GIANT API enables computational tools like NetWAS, which leverages tissue-specific networks for re-prioritization of GWAS results. The web services covered by the API include 144 tissue-specific functional gene networks in human, global functional networks for human and six common model organisms and the NetWAS method. GIANT API conforms to the REST architecture, which makes it stateless, cacheable and highly scalable. It can be used by a diverse range of clients including web browsers, command terminals, programming languages and standalone apps for data analysis and visualization. The API is freely available for use at http://giant-api.princeton.edu. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
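    A stateless REST service like this is easy to script. The sketch below builds a query URL against the published base address and ranks interaction partners from a canned JSON response; the `/networks/edges/` path, parameter names, and response shape are assumptions for illustration, not the documented API:

    ```python
    import json
    from urllib.parse import urlencode

    BASE_URL = "http://giant-api.princeton.edu"  # from the abstract; paths below are assumptions

    def build_edge_query(tissue, genes, limit=10):
        """Build a hypothetical REST query URL for tissue-specific interactions of a gene list."""
        params = urlencode({"tissue": tissue, "genes": ",".join(genes), "limit": limit})
        return f"{BASE_URL}/networks/edges/?{params}"

    def top_partners(response_text, n=3):
        """Parse a JSON edge list [{'gene': ..., 'weight': ...}, ...] and return the
        n partners with the strongest predicted functional relationship."""
        edges = json.loads(response_text)
        return [e["gene"] for e in sorted(edges, key=lambda e: e["weight"], reverse=True)[:n]]

    # A canned response standing in for the server's JSON (illustrative values only).
    sample = json.dumps([
        {"gene": "BRCA2", "weight": 0.91},
        {"gene": "RAD51", "weight": 0.88},
        {"gene": "TP53", "weight": 0.75},
        {"gene": "ATM", "weight": 0.60},
    ])
    url = build_edge_query("breast_epithelium", ["BRCA1"])
    partners = top_partners(sample, n=2)
    ```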

  6. Comparison of the Effects of Video Modeling with Narration vs. Video Modeling on the Functional Skill Acquisition of Adolescents with Autism

    ERIC Educational Resources Information Center

    Smith, Molly; Ayres, Kevin; Mechling, Linda; Smith, Katie

    2013-01-01

    The purpose of this study was to compare the effects of two forms of video modeling: video modeling that includes narration (VMN) and video models without narration (VM) on skill acquisition of four adolescent boys with a primary diagnosis of autism enrolled in an Extended School Year (ESY) summer program. An adapted alternating treatment design…

  7. An export coefficient based inexact fuzzy bi-level multi-objective programming model for the management of agricultural nonpoint source pollution under uncertainty

    NASA Astrophysics Data System (ADS)

    Cai, Yanpeng; Rong, Qiangqiang; Yang, Zhifeng; Yue, Wencong; Tan, Qian

    2018-02-01

    In this research, an export coefficient based inexact fuzzy bi-level multi-objective programming (EC-IFBLMOP) model was developed by integrating an export coefficient model (ECM), interval parameter programming (IPP) and fuzzy parameter programming (FPP) within a bi-level multi-objective programming framework. The proposed EC-IFBLMOP model can effectively deal with multiple uncertainties expressed as discrete intervals and fuzzy membership functions. Also, the complexities in agricultural systems, such as the cooperation and gaming relationship between decision makers at different levels, can be fully considered in the model. The developed model was then applied to identify the optimal land use patterns and BMP implementation levels for agricultural nonpoint source (NPS) pollution management in a subcatchment of the upper stream watershed of the Miyun Reservoir in north China. The model results identified the desired optimal land use patterns and implementation levels of best management practices (BMPs), which represent the gaming outcome between the upper- and lower-level decision makers when the allowable discharge amounts of NPS pollutants are limited. Moreover, results corresponding to different decision scenarios could provide a set of decision alternatives for the upper- and lower-level decision makers to identify the most appropriate management strategy. The model has good applicability and can be effectively utilized for agricultural NPS pollution management.
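    The export-coefficient and bi-level ideas can be illustrated with a toy: the upper level fixes an allowable NPS discharge, and the lower level chooses the land-use mix maximizing profit subject to that limit. All coefficients below are hypothetical, and brute-force search stands in for the paper's programming model:

    ```python
    def pollutant_load(areas, export_coeff):
        """Export coefficient model: total NPS load = sum over land uses of coefficient * area."""
        return sum(export_coeff[u] * a for u, a in areas.items())

    def lower_level(limit, total_area, export_coeff, profit):
        """Lower-level response: choose the cropland share (grid search) maximizing
        profit while respecting the upper level's discharge limit."""
        best = None
        for k in range(101):
            crop = total_area * k / 100.0
            areas = {"cropland": crop, "forest": total_area - crop}
            if pollutant_load(areas, export_coeff) <= limit:
                p = sum(profit[u] * a for u, a in areas.items())
                if best is None or p > best[1]:
                    best = (areas, p)
        return best

    # Hypothetical export coefficients (kg/ha/yr) and profits (currency/ha); illustrative only.
    coeff = {"cropland": 29.0, "forest": 2.4}
    profit = {"cropland": 900.0, "forest": 150.0}
    areas, p = lower_level(limit=15000.0, total_area=1000.0, export_coeff=coeff, profit=profit)
    ```

    The upper level would in turn tune `limit` against its own environmental objective, which is where the gaming relationship described above arises.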

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lusk, Ewing; Butler, Ralph; Pieper, Steven C.

    Here, we take a historical approach to our presentation of self-scheduled task parallelism, a programming model with its origins in early irregular and nondeterministic computations encountered in automated theorem proving and logic programming. We show how an extremely simple task model has evolved into a system, asynchronous dynamic load balancing (ADLB), and a scalable implementation capable of supporting sophisticated applications on today's (and tomorrow's) largest supercomputers; and we illustrate the use of ADLB with a Green's function Monte Carlo application, a modern, mature nuclear physics code in production use. Our lesson is that by surrendering a certain amount of generality, and thus applicability, a minimal programming model (in terms of its basic concepts and the size of its application programmer interface) can achieve extreme scalability without introducing complexity.
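    The self-scheduled model can be sketched minimally: idle workers repeatedly pull the next task from a shared pool, so load balances itself dynamically. This is a toy analogue of the idea behind ADLB, not its API:

    ```python
    import queue
    import threading

    def run_self_scheduled(tasks, worker_fn, n_workers=4):
        """Minimal self-scheduled task pool: each worker pulls the next available
        task from a shared queue as soon as it becomes idle."""
        work = queue.Queue()
        for t in tasks:
            work.put(t)
        results = []
        lock = threading.Lock()

        def worker():
            while True:
                try:
                    t = work.get_nowait()  # self-scheduling: grab whatever is next
                except queue.Empty:
                    return
                r = worker_fn(t)
                with lock:
                    results.append(r)

        threads = [threading.Thread(target=worker) for _ in range(n_workers)]
        for th in threads:
            th.start()
        for th in threads:
            th.join()
        return results

    squares = run_self_scheduled(range(10), lambda x: x * x)
    ```

    Completion order is nondeterministic, which is exactly the irregularity the model absorbs.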

  9. Simulation of mixing in the quick quench region of a rich burn-quick quench mix-lean burn combustor

    NASA Technical Reports Server (NTRS)

    Shih, Tom I.-P.; Nguyen, H. Lee; Howe, Gregory W.; Li, Z.

    1991-01-01

    A computer program was developed to study the mixing process in the quick quench region of a rich burn-quick quench mix-lean burn combustor. The computer program developed was based on the density-weighted, ensemble-averaged conservation equations of mass, momentum (full compressible Navier-Stokes), total energy, and species, closed by a k-epsilon turbulence model with wall functions. The combustion process was modeled by a two-step global reaction mechanism, and NO(x) formation was modeled by the Zeldovich mechanism. The formulation employed in the computer program and the essence of the numerical method of solution are described. Some results obtained for nonreacting and reacting flows with different main-flow to dilution-jet momentum flux ratios are also presented.

  10. Approaches in highly parameterized inversion - GENIE, a general model-independent TCP/IP run manager

    USGS Publications Warehouse

    Muffels, Christopher T.; Schreuder, Willem A.; Doherty, John E.; Karanovic, Marinko; Tonkin, Matthew J.; Hunt, Randall J.; Welter, David E.

    2012-01-01

    GENIE is a model-independent suite of programs that can be used to distribute, manage, and execute multiple model runs via the TCP/IP infrastructure. The suite consists of a file distribution interface, a run manager, a run executer, and a routine that can be compiled as part of a program and used to exchange model runs with the run manager. Because communication is via a standard protocol (TCP/IP), any computer connected to the Internet can serve in any of the capacities offered by this suite. Model independence is consistent with the existing template and instruction file protocols of the widely used PEST parameter estimation program. This report describes (1) the problem addressed; (2) the approach used by GENIE to queue, distribute, and retrieve model runs; and (3) the user instructions, classes, and functions developed. It also includes (4) an example to illustrate the linking of GENIE with Parallel PEST using the interface routine.
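    The run-manager/run-executer division of labor can be sketched with raw TCP sockets: the manager queues runs and hands one out to whichever client connects, and the executer evaluates the model and sends the result back. The message format below is invented for illustration; GENIE's actual protocol and its PEST interface differ:

    ```python
    import socket
    import threading

    def run_manager(srv, runs, results):
        """Toy TCP run manager: hand out queued runs to connecting clients, collect
        results, and answer NONE once every run has been distributed and returned."""
        pending = list(runs)
        outstanding = 0
        finished = False
        while not finished:
            conn, _ = srv.accept()
            with conn:
                msg = conn.recv(1024).decode().strip()
                if msg == "GET":
                    if pending:
                        run_id = pending.pop(0)
                        outstanding += 1
                        conn.sendall(f"RUN {run_id}\n".encode())
                    else:
                        conn.sendall(b"NONE\n")
                        finished = outstanding == 0
                else:  # "RESULT <run_id> <value>"
                    _, run_id, value = msg.split()
                    results[int(run_id)] = float(value)
                    outstanding -= 1
                    conn.sendall(b"OK\n")
        srv.close()

    def run_executer(host, port, model):
        """Run executer: request runs until the manager reports NONE, evaluate the
        model, and return each result over a fresh connection."""
        while True:
            with socket.create_connection((host, port)) as c:
                c.sendall(b"GET\n")
                reply = c.recv(1024).decode().strip()
            if reply == "NONE":
                return
            run_id = int(reply.split()[1])
            with socket.create_connection((host, port)) as c:
                c.sendall(f"RESULT {run_id} {model(run_id)}\n".encode())
                c.recv(1024)

    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))  # let the OS pick a free port
    srv.listen()
    port = srv.getsockname()[1]
    results = {}
    manager = threading.Thread(target=run_manager, args=(srv, [0, 1, 2], results))
    manager.start()
    run_executer("127.0.0.1", port, lambda p: p * 2.0)
    manager.join()
    ```

    Because the transport is plain TCP/IP, the executer could equally well run on any machine that can reach the manager, which is the point of the design.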

  11. Semilinear programming: applications and implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohan, S.

    Semilinear programming is a method of solving optimization problems with linear constraints where the non-negativity restrictions on the variables are dropped and the objective function coefficients can take on different values depending on whether the variable is positive or negative. The simplex method for linear programming is modified in this thesis to solve general semilinear and piecewise linear programs efficiently without having to transform them into equivalent standard linear programs. Several models in widely different areas of optimization, such as production smoothing, facility location, goal programming, and L/sub 1/ estimation, are presented first to demonstrate the compact formulation that arises when such problems are formulated as semilinear programs. A code, SLP, is constructed using the semilinear programming techniques. Problems in aggregate planning and L/sub 1/ estimation are solved using SLP and as equivalent linear programs using a linear programming simplex code. Comparisons of CPU times and numbers of iterations indicate SLP to be far superior. The semilinear programming techniques are extended to piecewise linear programming in the implementation of the code PLP. Piecewise linear models in aggregate planning are solved using PLP and as equivalent standard linear programs using a simple upper-bounded linear programming code, SUBLP.
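    The transformation the thesis avoids still clarifies what a semilinear objective is: splitting a free variable into nonnegative parts reduces the sign-dependent cost to a standard linear one. A sketch with hypothetical coefficients (for minimization the piecewise cost is convex when the positive-side coefficient is at least the negative-side one):

    ```python
    def semilinear_cost(x, c_pos, c_neg):
        """Objective term with a sign-dependent coefficient: c_pos applies when
        x >= 0, c_neg when x < 0 (the semilinear objective described above)."""
        return c_pos * x if x >= 0 else c_neg * x

    def split_cost(x, c_pos, c_neg):
        """Equivalent standard-LP form: write x = xp - xn with xp, xn >= 0 and
        cost c_pos*xp - c_neg*xn; at most one of xp, xn is nonzero."""
        xp, xn = (x, 0.0) if x >= 0 else (0.0, -x)
        return c_pos * xp - c_neg * xn
    ```

    The split doubles the variable count, which is the overhead the modified simplex method in the thesis is designed to avoid.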

  12. Evaluation of a home-based exercise program in the treatment of Alzheimer's disease: the Maximizing Independence in Dementia (MIND) study.

    PubMed

    Steinberg, Martin; Leoutsakos, Jeannie-Marie Sheppard; Podewils, Laura Jean; Lyketsos, C G

    2009-07-01

    To determine the feasibility and efficacy of a home-based exercise intervention program to improve the functional performance of patients with Alzheimer's disease (AD). Twenty-seven home-dwelling patients with AD were randomized to either an exercise intervention program delivered by their caregivers or a home safety assessment control. Measures of functional performance (primary), cognition, neuropsychiatric symptoms, quality of life, and caregiver burden (secondary) were obtained at baseline and at 6 and 12 weeks following randomization. For each outcome measure, intent-to-treat analyses using linear random effects models were performed. Feasibility and adverse events were also assessed. Adherence to the exercise program was good. On the primary outcomes (functional performance), patients in the exercise group demonstrated a trend for improved performance on measures of hand function and lower extremity strength. On secondary outcome measures, trends toward worse depression and lower quality of life ratings were noted. The physical exercise intervention developed for the study, delivered by caregivers to home-dwelling patients with AD, was feasible and was associated with a trend for improved functional performance in this group of frail patients. Given the limited efficacy to date of pharmacotherapies for AD, further study of exercise interventions, in a variety of care settings, is warranted.

  13. Programmed disorders of beta-cell development and function as one cause for type 2 diabetes? The GK rat paradigm.

    PubMed

    Portha, Bernard

    2005-01-01

    Now that the reduction in beta-cell mass has been clearly established in humans with type 2 diabetes mellitus (T2DM) 1-4, the debate focuses on the possible mechanisms responsible for decreased beta-cell number and impaired beta-cell function and their multifactorial etiology. Appropriate inbred rodent models are essential tools for identification of genes and environmental factors that increase the risk of abnormal beta-cell function and of T2DM. The information available on the Goto-Kakizaki (GK) rat, one of the best characterized animal models of spontaneous T2DM, is reviewed in such a perspective. We propose that the defective beta-cell mass and function in the GK model reflect the complex interactions of three pathogenic players: (1) several independent loci containing genes causing impaired insulin secretion; (2) gestational metabolic impairment inducing a programming of the endocrine pancreas (decreased beta-cell neogenesis) which is transmitted to the next generation; and (3) secondary (acquired) loss of beta-cell differentiation due to chronic exposure to hyperglycemia (glucotoxicity). An important message is that the 'heritable' determinants of T2DM are not simply dependent on genetic factors, but probably involve transgenerational epigenetic responses. Copyright (c) 2005 John Wiley & Sons, Ltd.

  14. Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model: A Web-based program designed to evaluate the cost-effectiveness of disease management programs in heart failure.

    PubMed

    Reed, Shelby D; Neilson, Matthew P; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H; Polsky, Daniel E; Graham, Felicia L; Bowers, Margaret T; Paul, Sara C; Granger, Bradi B; Schulman, Kevin A; Whellan, David J; Riegel, Barbara; Levy, Wayne C

    2015-11-01

    Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics; use of evidence-based medications; and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model. Projections of resource use and quality of life are modeled using relationships with time-varying Seattle Heart Failure Model scores. The model can be used to evaluate parallel-group and single-cohort study designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. The Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. Copyright © 2015 Elsevier Inc. All rights reserved.
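    The paired-virtual-cohort simulation idea can be sketched as follows; every parameter and distribution below is hypothetical, standing in for the model's Seattle Heart Failure Model-based projections:

    ```python
    import random

    random.seed(42)

    def paired_cohort_icer(n_pairs, base_cost, base_qaly, d_cost, d_qaly, sd_cost, sd_qaly):
        """Monte Carlo sketch of the paired-cohort idea: draw n_pairs of (control,
        program) cohort outcomes and form the incremental cost-effectiveness ratio
        from the mean cost and QALY differences."""
        dc = dq = 0.0
        for _ in range(n_pairs):
            dc += random.gauss(base_cost + d_cost, sd_cost) - random.gauss(base_cost, sd_cost)
            dq += random.gauss(base_qaly + d_qaly, sd_qaly) - random.gauss(base_qaly, sd_qaly)
        return (dc / n_pairs) / (dq / n_pairs)  # dollars per QALY gained

    icer = paired_cohort_icer(10000, base_cost=30000.0, base_qaly=4.0,
                              d_cost=5000.0, d_qaly=0.25, sd_cost=8000.0, sd_qaly=0.8)
    ```

    With these inputs the true ratio is $5,000 / 0.25 QALY = $20,000 per QALY; the simulated estimate fluctuates around it, which is why the tool runs 10,000 cohort pairs.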

  15. 3D Printing of Protein Models in an Undergraduate Laboratory: Leucine Zippers

    ERIC Educational Resources Information Center

    Meyer, Scott C.

    2015-01-01

    An upper-division undergraduate laboratory experiment is described that explores the structure/function relationship of protein domains, namely leucine zippers, through a molecular graphics computer program and physical models fabricated by 3D printing. By generating solvent accessible surfaces and color-coding hydrophobic, basic, and acidic amino…

  16. Preparation of X-ray astronomy satellite experiment Development of computer programs for the Salyut-HEXE X-ray experiment ground station

    NASA Astrophysics Data System (ADS)

    Petrik, J.

    The engineering model of the Salyut-HEXE experiment is described. The detector system, electronics box, and ground station are addressed. The microprocessor system is considered, discussing the cards and presenting block diagrams of their functions. The telemetry is examined, including the various modes and the direct and indirect transmission modes. The ground station programs are discussed, including the tasks, program development, input and output programs, status, power supply, count rates, telemetry dump, hard copy, and checksum.

  17. GEODYN programmers guide, volume 2, part 1

    NASA Technical Reports Server (NTRS)

    Mullins, N. E.; Goad, C. C.; Dao, N. C.; Martin, T. V.; Boulware, N. L.; Chin, M. M.

    1972-01-01

    A guide to the GEODYN Program is presented. The program estimates orbit and geodetic parameters. It possesses the capability to estimate that set of orbital elements, station positions, measurement biases, and a set of force model parameters such that the orbital tracking data from multiple arcs of multiple satellites best fit the entire set of estimated parameters. GEODYN consists of 113 different program segments, including the main program, subroutines, functions, and block data routines. All are in G or H level FORTRAN and are currently operational on GSFC's IBM 360/95 and IBM 360/91.

  18. SICONID: a FORTRAN-77 program for conditional simulation in one dimension

    NASA Astrophysics Data System (ADS)

    Pardo-Igúzquiza, E.; Chica-Olmo, M.; Delgado-García, J.

    1992-07-01

    The SICONID program, written in FORTRAN 77 for the conditional simulation of geological variables in one dimension, is presented. The program permits all the steps necessary to obtain a simulated series from the experimental data to be carried out. These steps are: acquisition of the experimental values, modeling of the anamorphosis function, variogram of the normal scores, conditional simulation, and restoration of the experimental histogram. A practical case, simulating the evolution of the groundwater level in a survey, is given to show the operation of the program.

  19. Effects of theory of mind performance training on reducing bullying involvement in children and adolescents with high-functioning autism spectrum disorder.

    PubMed

    Liu, Meng-Jung; Ma, Le-Yin; Chou, Wen-Jiun; Chen, Yu-Min; Liu, Tai-Ling; Hsiao, Ray C; Hu, Huei-Fan; Yen, Cheng-Fang

    2018-01-01

    Bullying involvement is prevalent among children and adolescents with autism spectrum disorder (ASD). This study examined the effects of theory of mind performance training (ToMPT) on reducing bullying involvement in children and adolescents with high-functioning ASD. Children and adolescents with high-functioning ASD completed ToMPT (n = 26) and social skills training (SST; n = 23) programs. Participants in both groups and their mothers rated the pretraining and posttraining bullying involvement of participants on the Chinese version of the School Bullying Experience Questionnaire. The paired t test was used to evaluate changes in bullying victimization and perpetration between the pretraining and posttraining assessments. Furthermore, the linear mixed-effect model was used to examine the difference in the training effect between the ToMPT and SST groups. The paired t test indicated that in the ToMPT group, the severities of both self-reported (p = .039) and mother-reported (p = .003) bullying victimization significantly decreased from the pretraining to posttraining assessments, whereas in the SST group, only self-reported bullying victimization significantly decreased (p = .027). The linear mixed-effect model indicated that compared with the SST program, the ToMPT program significantly reduced the severity of mother-reported bullying victimization (p = .041). The present study supports the effects of ToMPT on reducing mother-reported bullying victimization in children and adolescents with high-functioning ASD.

  20. Continuous-time quantum Monte Carlo impurity solvers

    NASA Astrophysics Data System (ADS)

    Gull, Emanuel; Werner, Philipp; Fuchs, Sebastian; Surer, Brigitte; Pruschke, Thomas; Troyer, Matthias

    2011-04-01

    Continuous-time quantum Monte Carlo impurity solvers are algorithms that sample the partition function of an impurity model using diagrammatic Monte Carlo techniques. The present paper describes codes that implement the interaction expansion algorithm originally developed by Rubtsov, Savkin, and Lichtenstein, as well as the hybridization expansion method developed by Werner, Millis, Troyer, et al. These impurity solvers are part of the ALPS-DMFT application package and are accompanied by an implementation of dynamical mean-field self-consistency equations for (single orbital single site) dynamical mean-field problems with arbitrary densities of states. Program summary: Program title: dmft. Catalogue identifier: AEIL_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIL_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: ALPS LIBRARY LICENSE version 1.1. No. of lines in distributed program, including test data, etc.: 899 806. No. of bytes in distributed program, including test data, etc.: 32 153 916. Distribution format: tar.gz. Programming language: C++. Operating system: the ALPS libraries have been tested on the following platforms and compilers: Linux with GNU Compiler Collection (g++ version 3.1 and higher) and Intel C++ Compiler (icc version 7.0 and higher); MacOS X with GNU Compiler (g++ Apple-version 3.1, 3.3 and 4.0); IBM AIX with Visual Age C++ (xlC version 6.0) and GNU (g++ version 3.1 and higher) compilers; Compaq Tru64 UNIX with Compaq C++ Compiler (cxx); SGI IRIX with MIPSpro C++ Compiler (CC); HP-UX with HP C++ Compiler (aCC); Windows with Cygwin or coLinux platforms and GNU Compiler Collection (g++ version 3.1 and higher). RAM: 10 MB-1 GB. Classification: 7.3. External routines: ALPS [1], BLAS/LAPACK, HDF5. Nature of problem: (See [2].) Quantum impurity models describe an atom or molecule embedded in a host material with which it can exchange electrons. 
They are basic to nanoscience as representations of quantum dots and molecular conductors and play an increasingly important role in the theory of "correlated electron" materials as auxiliary problems whose solution gives the "dynamical mean field" approximation to the self-energy and local correlation functions. Solution method: Quantum impurity models require a method of solution which provides access to both high and low energy scales and is effective for wide classes of physically realistic models. The continuous-time quantum Monte Carlo algorithms for which we present implementations here meet this challenge. Continuous-time quantum impurity methods are based on partition function expansions of quantum impurity models that are stochastically sampled to all orders using diagrammatic quantum Monte Carlo techniques. For a review of quantum impurity models and their applications and of continuous-time quantum Monte Carlo methods for impurity models we refer the reader to [2]. Additional comments: Use of dmft requires citation of this paper. Use of any ALPS program requires citation of the ALPS [1] paper. Running time: 60 s-8 h per iteration.

  1. SEISRISK II; a computer program for seismic hazard estimation

    USGS Publications Warehouse

    Bender, Bernice; Perkins, D.M.

    1982-01-01

    The computer program SEISRISK II calculates probabilistic ground motion values for use in seismic hazard mapping. SEISRISK II employs a model that allows earthquakes to occur as points within source zones and as finite-length ruptures along faults. It assumes that earthquake occurrences have a Poisson distribution, that occurrence rates remain constant during the time period considered, that ground motion resulting from an earthquake is a known function of magnitude and distance, that seismically homogeneous source zones are defined, that fault locations are known, that fault rupture lengths depend on magnitude, and that earthquake rates as a function of magnitude are specified for each source. SEISRISK II calculates, for each site on a grid of sites, the level of ground motion that has a specified probability of being exceeded during a given time period. The program was designed to process a large (essentially unlimited) number of sites and sources efficiently and has been used to produce regional and national maps of seismic hazard. It is a substantial revision of an earlier program, SEISRISK I, which was never documented. SEISRISK II runs considerably faster and gives more accurate results than the earlier program and in addition includes rupture length and acceleration variability, which were not contained in the original version. We describe the model and how it is implemented in the computer program and provide a flowchart and listing of the code.
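    The Poisson occurrence assumption makes the core hazard computation compact: sum the rates of sources whose predicted motion at the site exceeds a level, then convert that rate to an exceedance probability. The sketch below uses an invented attenuation law and source list, far simpler than SEISRISK II's source zones and finite-length ruptures:

    ```python
    import math

    def annual_exceedance_rate(sources, motion, site, level):
        """Sum, over point sources, of the annual rates of earthquakes whose ground
        motion at the site meets or exceeds `level` (motion is assumed to be a known
        function of magnitude and distance, as in the model described above)."""
        total = 0.0
        for rate, magnitude, position in sources:
            if motion(magnitude, abs(position - site)) >= level:
                total += rate
        return total

    def exceedance_probability(rate, years):
        """Poisson occurrence model: P(at least one exceedance in `years`)."""
        return 1.0 - math.exp(-rate * years)

    # Hypothetical attenuation law and source list, for illustration only.
    atten = lambda m, d: m - 0.5 * math.log10(d * d + 1.0)
    sources = [(0.02, 6.0, 10.0), (0.005, 7.0, 50.0)]  # (annual rate, magnitude, position in km)
    rate = annual_exceedance_rate(sources, atten, site=0.0, level=4.5)
    p50 = exceedance_probability(rate, 50)
    ```

    Mapping proceeds by repeating this at each grid site and inverting the relation to find the motion level with, say, a 10% chance of exceedance in 50 years.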

  2. Results of prevention programs with adolescents.

    PubMed

    Perry, C L

    1987-09-01

    Programs for preventing smoking and alcohol and drug abuse have radically changed in the past decade. Instead of being regarded as a health or discipline problem that involves only a few deviant adolescents, drug use has begun to be viewed as social behavior that is functional for adolescents, not capricious, and is normative for that population. The most successful prevention programs have sought to delay the onset of tobacco use. Based on theoretical and etiological research, these programs target factors that have repeatedly been predictive of adolescent smoking, alcohol and drug use. The programs teach adolescents (1) why people their age smoke tobacco or use alcohol and drugs; (2) how these meanings get established by peers, older role models and advertising; (3) how to resist these influences to smoke or to use alcohol and drugs; and (4) life skills and competencies to counterbalance the functions that drug use serves. Because of the association with the onset of smoking and the onset of using other drugs, these strategies are being studied for alcohol use and other drugs. In addition, elected peer leaders are trained to conduct these activities with their classmates and act as new role models for non-use. Evaluations of these approaches are optimistic. Studies in northern California and Minnesota reveal 50-70% reductions in the onset of smoking. Botvin's 'Life Skills Training' program demonstrates success in delaying heavy alcohol and marijuana use.

  3. Combatting Global Infectious Diseases: A Network Effect of Specimen Referral Systems.

    PubMed

    Fonjungo, Peter N; Alemnji, George A; Kebede, Yenew; Opio, Alex; Mwangi, Christina; Spira, Thomas J; Beard, R Suzanne; Nkengasong, John N

    2017-02-13

    The recent Ebola virus outbreak in West Africa clearly demonstrated the critical role of laboratory systems and networks in responding to epidemics. Because of the huge challenges in establishing functional laboratories at all tiers of health systems in developing countries, strengthening specimen referral networks is critical. In this review article, we propose a platform strategy for developing specimen referral networks based on 2 models: centralized and decentralized laboratory specimen referral networks. These models have been shown to be effective in patient management in programs in resource-limited settings. Both models lead to reduced turnaround time and retain flexibility for integrating different specimen types. In Haiti, decentralized specimen referral systems resulted in a 182% increase in patients enrolling in human immunodeficiency virus treatment programs within 6 months. In Uganda, cost savings of up to 62% were observed with a centralized model. A platform strategy will create a network effect that will benefit multiple disease programs.

  4. A Multiobjective Interval Programming Model for Wind-Hydrothermal Power System Dispatching Using 2-Step Optimization Algorithm

    PubMed Central

    Jihong, Qu

    2014-01-01

    Wind-hydrothermal power system dispatching has received intensive attention in recent years because it can help develop reasonable plans to schedule power generation efficiently. But future data such as wind power output and power load cannot be accurately predicted, and the complex multiobjective scheduling model is inherently nonlinear; achieving an accurate solution to such a problem is therefore very difficult. This paper presents an interval programming model with a 2-step optimization algorithm to solve the multiobjective dispatching problem. Initially, we represented the future data as interval numbers and simplified the objective function to a linear programming problem to search for feasible preliminary solutions with which to construct the Pareto set. Then the simulated annealing method was used to search for the optimal solution of the initial model. Thorough experimental results suggest that the proposed method performed reasonably well in terms of both operating efficiency and precision. PMID:24895663
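    The 2-step idea, collapsing interval data to obtain a preliminary solution and then refining it by simulated annealing, can be sketched on a toy single-objective dispatch problem. The cost function, interval data, and preliminary-solution rule below are illustrative, not the paper's model:

    ```python
    import math
    import random

    random.seed(1)

    def midpoint(interval):
        """Collapse an interval number [lo, hi] to its midpoint (a crude stand-in
        for the paper's interval-number handling)."""
        lo, hi = interval
        return (lo + hi) / 2.0

    def dispatch_cost(thermal, demand, wind):
        """Toy single-period objective: quadratic thermal fuel cost plus a heavy
        penalty for unserved load."""
        shortfall = max(0.0, demand - wind - thermal)
        return 0.01 * thermal ** 2 + 100.0 * shortfall

    def simulated_annealing(x0, f, steps=3000, t0=10.0):
        """Step 2: refine a preliminary solution by simulated annealing."""
        x, fx = x0, f(x0)
        best_x, best_f = x, fx
        for k in range(steps):
            t = t0 * (1.0 - k / steps) + 1e-9
            cand = max(0.0, x + random.gauss(0.0, 1.0))
            fc = f(cand)
            if fc < fx or random.random() < math.exp(-(fc - fx) / t):
                x, fx = cand, fc
                if fx < best_f:
                    best_x, best_f = x, fx
        return best_x, best_f

    demand = midpoint((90.0, 110.0))  # interval load forecast  -> 100
    wind = midpoint((20.0, 40.0))     # interval wind forecast  -> 30
    x0 = 110.0 - 20.0                 # step 1 stand-in: a conservative feasible schedule
    best_x, best_f = simulated_annealing(x0, lambda x: dispatch_cost(x, demand, wind))
    ```

    Here the optimum is thermal output 70 (cost 49); the annealer should improve markedly on the conservative starting schedule (cost 81).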

  5. A multiobjective interval programming model for wind-hydrothermal power system dispatching using 2-step optimization algorithm.

    PubMed

    Ren, Kun; Jihong, Qu

    2014-01-01

    Wind-hydrothermal power system dispatching has received intensive attention in recent years because it can help develop reasonable plans to schedule power generation efficiently. But future data such as wind power output and power load cannot be accurately predicted, and the complex multiobjective scheduling model is inherently nonlinear; achieving an accurate solution to such a problem is therefore very difficult. This paper presents an interval programming model with a 2-step optimization algorithm to solve the multiobjective dispatching problem. Initially, we represented the future data as interval numbers and simplified the objective function to a linear programming problem to search for feasible preliminary solutions with which to construct the Pareto set. Then the simulated annealing method was used to search for the optimal solution of the initial model. Thorough experimental results suggest that the proposed method performed reasonably well in terms of both operating efficiency and precision.

  6. Optimal Investment in HIV Prevention Programs: More Is Not Always Better

    PubMed Central

    Brandeau, Margaret L.; Zaric, Gregory S.

    2008-01-01

    This paper develops a mathematical/economic framework to address the following question: Given a particular population, a specific HIV prevention program, and a fixed amount of funds that could be invested in the program, how much money should be invested? We consider the impact of investment in a prevention program on the HIV sufficient contact rate (defined via production functions that describe the change in the sufficient contact rate as a function of expenditure on a prevention program), and the impact of changes in the sufficient contact rate on the spread of HIV (via an epidemic model). In general, the cost per HIV infection averted is not constant as the level of investment changes, so the fact that some investment in a program is cost effective does not mean that more investment in the program is cost effective. Our framework provides a formal means for determining how the cost per infection averted changes with the level of expenditure. We can use this information as follows: When the program has decreasing marginal cost per infection averted (which occurs, for example, with a growing epidemic and a prevention program with increasing returns to scale), it is optimal either to spend nothing on the program or to spend the entire budget. When the program has increasing marginal cost per infection averted (which occurs, for example, with a shrinking epidemic and a prevention program with decreasing returns to scale), it may be optimal to spend some but not all of the budget. The amount that should be spent depends on both the rate of disease spread and the production function for the prevention program. We illustrate our ideas with two examples: that of a needle exchange program, and that of a methadone maintenance program. PMID:19938440
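    The framework's central quantity, cost per infection averted as a function of expenditure, can be illustrated with a toy epidemic and an invented production function with diminishing returns (none of the numbers below come from the paper):

    ```python
    def infections_with_spending(budget, beta0, production, s0=990.0, i0=10.0,
                                 steps=100, dt=0.1):
        """Toy susceptible-infected epidemic in discrete time: spending on prevention
        scales the sufficient contact rate through a production function; we count
        cumulative new infections over the horizon."""
        beta = beta0 * production(budget)
        s, i, cum = s0, i0, 0.0
        n = s0 + i0
        for _ in range(steps):
            new = min(beta * s * i / n * dt, s)
            s -= new
            i += new
            cum += new
        return cum

    # Hypothetical production function with diminishing returns to expenditure.
    prod = lambda b: 1.0 / (1.0 + 0.001 * b)

    base = infections_with_spending(0.0, beta0=0.5, production=prod)
    spend = 500.0
    treated = infections_with_spending(spend, beta0=0.5, production=prod)
    cost_per_averted = spend / (base - treated)
    ```

    Evaluating `cost_per_averted` across a grid of budgets reveals whether the marginal cost per infection averted rises or falls with spending, which is what determines whether partial or all-or-nothing investment is optimal in the paper's framework.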

  7. A computer program for analyzing channel geometry

    USGS Publications Warehouse

    Regan, R.S.; Schaffranek, R.W.

    1985-01-01

    The Channel Geometry Analysis Program (CGAP) provides the capability to process, analyze, and format cross-sectional data for input to flow/transport simulation models or other computational programs. CGAP allows for a variety of cross-sectional data input formats through use of variable format specification. The program accepts data from various computer media and provides for modification of machine-stored parameter values. CGAP has been devised to provide a rapid and efficient means of computing and analyzing the physical properties of an open-channel reach defined by a sequence of cross sections. CGAP's 16 options provide a wide range of methods by which to analyze and depict a channel reach and its individual cross-sectional properties. The primary function of the program is to compute the area, width, wetted perimeter, and hydraulic radius of cross sections at successive increments of water surface elevation (stage) from data that consist of coordinate pairs of cross-channel distances and land surface or channel bottom elevations. Longitudinal rates-of-change of cross-sectional properties are also computed, as are the mean properties of a channel reach. Output products include tabular lists of cross-sectional area, channel width, wetted perimeter, hydraulic radius, average depth, and cross-sectional symmetry computed as functions of stage; plots of cross sections; plots of cross-sectional area and (or) channel width as functions of stage; tabular lists of cross-sectional area and channel width computed as functions of stage for subdivisions of a cross section; plots of cross sections in isometric projection; and plots of cross-sectional area at a fixed stage as a function of longitudinal distance along an open-channel reach. A Command Procedure Language program and Job Control Language procedure exist to facilitate program execution on the U.S. Geological Survey Prime and Amdahl computer systems, respectively.
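    The primary computation, section properties as functions of stage from coordinate pairs, reduces to clipping each ground segment at the water surface. A simplified sketch of that geometry (not CGAP's actual algorithm or input format):

    ```python
    import math

    def section_properties(coords, stage):
        """Cross-sectional area, top width, wetted perimeter, and hydraulic radius
        at a given stage, from (station, elevation) coordinate pairs, by clipping
        each ground segment at the water surface."""
        area = width = perim = 0.0
        for (x1, z1), (x2, z2) in zip(coords, coords[1:]):
            if z1 >= stage and z2 >= stage:
                continue  # segment entirely above water
            if z1 > stage:    # water's edge lies within the segment, on the left
                x1 = x1 + (x2 - x1) * (z1 - stage) / (z1 - z2)
                z1 = stage
            elif z2 > stage:  # water's edge lies within the segment, on the right
                x2 = x1 + (x2 - x1) * (z1 - stage) / (z1 - z2)
                z2 = stage
            area += 0.5 * ((stage - z1) + (stage - z2)) * (x2 - x1)  # trapezoid rule
            width += x2 - x1
            perim += math.hypot(x2 - x1, z2 - z1)
        return area, width, perim, (area / perim if perim else 0.0)

    # Trapezoidal channel at stage 1.0: area 7.0, top width 8.0, perimeter 6 + 2*sqrt(2)
    a, w, p, r = section_properties([(0, 2), (2, 0), (8, 0), (10, 2)], 1.0)
    ```

    Repeating this at successive stage increments yields the stage-property tables the program outputs.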

  8. Non-linear dynamics in muscle fatigue and strength model during maximal self-perceived elbow extensors training.

    PubMed

    Gacesa, Jelena Popadic; Ivancevic, Tijana; Ivancevic, Nik; Paljic, Feodora Popic; Grujic, Nikola

    2010-08-26

Our aim was to determine the dynamics of muscle strength increase and fatigue development during repetitive maximal contraction in a specific maximal self-perceived elbow extensors training program. We derive our functional model for m. triceps brachii in the spirit of the traditional Hill two-component muscle model and, after fitting our data, develop a prediction tool for this specific training system. Thirty-six healthy young men (21 +/- 1.0 y, BMI 25.4 +/- 7.2 kg/m(2)), who did not take part in any formal resistance exercise regime, volunteered for this study. The training protocol was performed on an isoacceleration dynamometer and lasted for 12 weeks, with a frequency of five sessions per week. Each training session included five sets of 10 maximal contractions (elbow extensions) with a 1 min resting period between sets. The non-linear dynamic system model was fitted to our data using the Levenberg-Marquardt regression algorithm. As a proper dynamical system, our functional model of m. triceps brachii can be used for prediction and control: it can predict muscular fatigue within a single series, cumulative daily muscular fatigue, and muscular growth throughout the training process. In conclusion, the application of non-linear dynamics to this particular training model allows us to explain mathematically some functional changes in the skeletal muscle as a result of its adaptation to programmed physical activity (training). 2010 Elsevier Ltd. All rights reserved.
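The study fits its nonlinear model with the Levenberg-Marquardt algorithm. As a much simpler hedged stand-in for that fitting step, a single exponential fatigue-decay curve F(t) = F0 * exp(-k * t) can be fitted in closed form by linear least squares on log F; the model form, names, and data below are illustrative, not the authors' model.

```python
import math

def fit_exponential_decay(times, forces):
    """Fit F(t) = F0 * exp(-k * t) by least squares on log(F).
    A linearized stand-in for the nonlinear (Levenberg-Marquardt)
    regression used in the study; assumes all force values are positive."""
    n = len(times)
    ys = [math.log(f) for f in forces]          # log F = log F0 - k * t
    t_mean = sum(times) / n
    y_mean = sum(ys) / n
    sxx = sum((t - t_mean) ** 2 for t in times)
    sxy = sum((t - t_mean) * (y - y_mean) for t, y in zip(times, ys))
    slope = sxy / sxx                            # = -k
    intercept = y_mean - slope * t_mean          # = log F0
    return math.exp(intercept), -slope           # (F0, k)

# Fit a synthetic fatigue series: initial force 100, decay rate 0.07 per contraction
F0, k = fit_exponential_decay([0, 1, 2, 3, 4],
                              [100 * math.exp(-0.07 * t) for t in range(5)])
```

Because the synthetic data are exactly exponential, the linearized fit recovers F0 = 100 and k = 0.07 to floating-point precision.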

  9. YASEIS: Yet Another computer program to calculate synthetic SEISmograms for a spherically multi-layered Earth model

    NASA Astrophysics Data System (ADS)

    Ma, Yanlu

    2013-04-01

Although most research nowadays focuses on the lateral heterogeneity of the 3-D Earth, a spherically multi-layered model whose parameters depend only on depth still represents a good first-order approximation of the real Earth. Such 1-D models can be used as starting models for seismic tomographic inversion or as background models when source mechanisms are inverted. The problem of wave propagation in a spherically layered model was solved theoretically long ago (Takeuchi and Saito, 1972), and existing computer programs such as Mineos (developed by G. Masters, J. Woodhouse and F. Gilbert), Gemini (Friederich and Dalkolmo 1995), DSM (Kawai et al. 2006) and QSSP (Wang 1999) tackle the computational aspects of the problem. A new simple and fast program for computing the Green's function of a stack of spherical dissipative layers is presented here. The analytical solutions within each homogeneous spherical layer are joined through the continuity boundary conditions and propagated from the center of the model up to the source depth. Another solution is built by propagating downward from the free surface of the model to the source level. The final solution is then constructed in the frequency domain from the previous two solutions so as to satisfy the discontinuities of displacements and stresses at the source level required by the focal mechanism. The numerical instability of the propagator approach is resolved by complementing the matrix propagation with an orthonormalization procedure (Wang 1999). Another difficulty, instability caused by the high attenuation in the upper-mantle low-velocity zone, is overcome by switching the bases of the solutions from spherical Bessel functions to spherical Hankel functions when necessary. We compared the synthetic seismograms obtained from the new program YASEIS with those computed by Gemini and QSSP. At near distances, synthetics from a reflectivity code for horizontally layered media are also compared with those from YASEIS. Finally, the static displacements in the source region are computed by choosing a very small frequency in YASEIS (which is designed to compute the dynamic response) and are compared with results for a homogeneous half-space model (Okada 1992). [1] Friederich, W. and J. Dalkolmo (1995). Complete synthetic seismograms for a spherically symmetric Earth by a numerical computation of the Green's function in the frequency domain, Geophys. J. Int., vol. 122, 537-550. [2] Kawai, K., N. Takeuchi, and R.J. Geller (2006). Complete synthetic seismograms up to 2 Hz for transversely isotropic spherically symmetric media, Geophys. J. Int., vol. 164, 411-424. [3] Okada, Y. (1992). Internal deformation due to shear and tensile faults in a half-space, Bull. Seismol. Soc. Am., vol. 82, no. 2, 1018-1040. [4] Takeuchi, H. and M. Saito (1972). Seismic surface waves, Methods in Computational Physics, vol. 11, 217-295. [5] Wang, R. (1999). A simple orthonormalization method for stable and efficient computation of Green's functions, Bull. Seismol. Soc. Am., vol. 89, no. 3, 733-741.

  10. A Bayesian framework based on a Gaussian mixture model and radial-basis-function Fisher discriminant analysis (BayGmmKda V1.1) for spatial prediction of floods

    NASA Astrophysics Data System (ADS)

    Tien Bui, Dieu; Hoang, Nhat-Duc

    2017-09-01

In this study, a probabilistic model named BayGmmKda is proposed for flood susceptibility assessment in a study area in central Vietnam. The new model is a Bayesian framework constructed from a combination of a Gaussian mixture model (GMM), radial-basis-function Fisher discriminant analysis (RBFDA), and a geographic information system (GIS) database. In the Bayesian framework, the GMM is used to model the data distribution of flood-influencing factors in the GIS database, whereas RBFDA is used to construct a latent variable that enhances model performance. The posterior probabilistic output of the BayGmmKda model is then used as the flood susceptibility index. Experimental results showed that the proposed hybrid framework is superior to other benchmark models, including the adaptive neuro-fuzzy inference system and the support vector machine. To facilitate model implementation, a software program for BayGmmKda has been developed in MATLAB. The BayGmmKda program can accurately establish a flood susceptibility map for the study region, which local authorities can overlay onto various land-use maps for land-use planning or management.
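The Bayesian core of the model can be illustrated in one dimension, replacing the GMM-plus-RBFDA machinery with single Gaussian class-conditional densities: the posterior probability of "flood" given an observed factor value serves as the susceptibility index. All parameters below are invented for illustration.

```python
import math

def gauss_pdf(x, mu, sigma):
    """Univariate normal density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def flood_posterior(x, prior_flood, flood_params, nonflood_params):
    """Posterior P(flood | x) via Bayes' rule with Gaussian class-conditional
    densities. A one-dimensional sketch of the probabilistic output used as a
    susceptibility index; the real model fits a Gaussian mixture over many
    GIS-derived factors plus an RBFDA latent variable."""
    mu_f, sd_f = flood_params
    mu_n, sd_n = nonflood_params
    joint_f = gauss_pdf(x, mu_f, sd_f) * prior_flood
    joint_n = gauss_pdf(x, mu_n, sd_n) * (1 - prior_flood)
    return joint_f / (joint_f + joint_n)
```

With a flat prior and symmetric classes N(1, 1) and N(-1, 1), an observation at x = 0 yields posterior 0.5, and observations nearer the flood class push the index above 0.5.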

  11. The NIAID Radiation Countermeasures Program Business Model

    PubMed Central

    Hafer, Nathaniel; Maidment, Bert W.

    2010-01-01

    The National Institute of Allergy and Infectious Diseases (NIAID) Radiation/Nuclear Medical Countermeasures Development Program has developed an integrated approach to providing the resources and expertise required for the research, discovery, and development of radiation/nuclear medical countermeasures (MCMs). These resources and services lower the opportunity costs and reduce the barriers to entry for companies interested in working in this area and accelerate translational progress by providing goal-oriented stewardship of promising projects. In many ways, the radiation countermeasures program functions as a “virtual pharmaceutical firm,” coordinating the early and mid-stage development of a wide array of radiation/nuclear MCMs. This commentary describes the radiation countermeasures program and discusses a novel business model that has facilitated product development partnerships between the federal government and academic investigators and biopharmaceutical companies. PMID:21142762

  12. Multiprocessor speed-up, Amdahl's Law, and the Activity Set Model of parallel program behavior

    NASA Technical Reports Server (NTRS)

    Gelenbe, Erol

    1988-01-01

An important issue in the effective use of parallel processing is the estimation of the speed-up one may expect as a function of the number of processors used. Amdahl's Law has traditionally provided a guideline here, although it appears excessively pessimistic in the light of recent experimental results. In this note, Amdahl's Law is amended by giving greater weight to the capacity of a program to make effective use of parallel processing, while also recognizing that imbalance in the workload of the processors is bound to occur. An activity set model of parallel program behavior is then introduced, along with the corresponding parallelism index of a program, leading to upper and lower bounds on the speed-up.
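Amdahl's Law, and the kind of imbalance-aware amendment the note argues for, can be sketched directly. The classical formula is standard; the imbalance variant below is only an illustrative stand-in, not Gelenbe's activity-set bounds.

```python
def amdahl_speedup(p, n):
    """Classical Amdahl's Law: speed-up with n processors when a
    fraction p of the work is parallelizable."""
    return 1.0 / ((1.0 - p) + p / n)

def speedup_with_imbalance(p, n, imbalance):
    """Hypothetical imbalance-aware variant (not the activity-set bounds):
    the parallel phase finishes at the pace of the most loaded processor,
    which carries (1 + imbalance) times the average share."""
    return 1.0 / ((1.0 - p) + p * (1.0 + imbalance) / n)
```

For p = 0.9 and n = 10 the classical speed-up is 1/0.19, about 5.26, and it saturates below 1/(1-p) = 10 no matter how many processors are added; any workload imbalance pushes the estimate lower still.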

  13. The NIAID Radiation Countermeasures Program business model.

    PubMed

    Hafer, Nathaniel; Maidment, Bert W; Hatchett, Richard J

    2010-12-01

    The National Institute of Allergy and Infectious Diseases (NIAID) Radiation/Nuclear Medical Countermeasures Development Program has developed an integrated approach to providing the resources and expertise required for the research, discovery, and development of radiation/nuclear medical countermeasures (MCMs). These resources and services lower the opportunity costs and reduce the barriers to entry for companies interested in working in this area and accelerate translational progress by providing goal-oriented stewardship of promising projects. In many ways, the radiation countermeasures program functions as a "virtual pharmaceutical firm," coordinating the early and mid-stage development of a wide array of radiation/nuclear MCMs. This commentary describes the radiation countermeasures program and discusses a novel business model that has facilitated product development partnerships between the federal government and academic investigators and biopharmaceutical companies.

  14. CIRCAL-2 - General-purpose on-line circuit design.

    NASA Technical Reports Server (NTRS)

    Dertouzos, M. L.; Jessel, G. P.; Stinger, J. R.

    1972-01-01

    CIRCAL-2 is a second-generation general-purpose on-line circuit-design program with the following main features: (1) multiple-analysis capability; (2) uniform and general data structures for handling text editing, network representations, and output results, regardless of analysis; (3) special techniques and structures for minimizing and controlling user-program interaction; (4) use of functionals for the description of hysteresis and heat effects; and (5) ability to define optimization procedures that 'replace' the user. The paper discusses the organization of CIRCAL-2, the aforementioned main features, and their consequences, such as a set of network elements and models general enough for most analyses and a set of functions tailored to circuit-design requirements. The presentation is descriptive, concentrating on conceptual rather than on program implementation details.

  15. Rapid tooling for functional prototyping of metal mold processes. CRADA final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zacharia, T.; Ludtka, G.M.; Bjerke, M.A.

    1997-12-01

The overall scope of this endeavor was to develop an integrated computer system, running on a network of heterogeneous computers, that would allow the rapid development of tool designs and then use process models to determine whether the initial tooling would have the characteristics needed to produce the prototype parts. The major thrust of this program for ORNL was the definition of the requirements for the development of the integrated die design system, whose functional purpose is to link part design, tool design, and component fabrication through a seamless software environment. The principal product would be a system control program that would coordinate the various application programs and implement the data transfer so that any networked workstation would be usable. The overall system control architecture was required to easily facilitate any changes, upgrades, or replacements of the model from either the manufacturing end or the design criteria standpoint. The initial design of such a program is described in the section labeled "Control Program Design". A critical aspect of this research was the design of the system flow chart showing the exact system components and the data to be transferred. All of the major system components would have been configured to ensure data file compatibility and transferability across the Internet. The intent was to use commercially available packages to model the various manufacturing processes for creating the die and die inserts, in addition to modeling the processes for which these parts were to be used. In order to meet all of these requirements, investigative research was conducted to determine the system flow features and software components within the various organizations contributing to this project. This research is summarized.

  16. Pathways of Risk and Resilience: Impact of a Family Resilience Program on Active-Duty Military Parents.

    PubMed

    Saltzman, William R; Lester, Patricia; Milburn, Norweeta; Woodward, Kirsten; Stein, Judith

    2016-12-01

Over the past decade, studies into the impact of wartime deployment and related adversities on service members and their families have offered empirical support for systemic models of family functioning and a more nuanced understanding of the mechanisms by which stress and trauma reverberate across family and partner relationships. They have also advanced our understanding of the ways in which families may contribute to the resilience of children and parents contending with the stressors of serial deployments and parental physical and psychological injuries. This study is the latest in a series designed to further clarify the systemic functioning of military families and to explicate the role of resilient family processes in reducing symptoms of distress and poor adaptation among family members. Drawing upon the implementation of the Families Overcoming Under Stress (FOCUS) Family Resilience Program at 14 active-duty military installations across the United States, structural equation modeling was conducted with data from 434 Marine and Navy active-duty families who participated in the FOCUS program. The goal was to better understand the ways in which parental distress reverberates across military family systems and, through longitudinal path analytic modeling, determine the pathways of program impact on parental distress. The findings indicated that distress cross-influences the military and civilian parents within families, that families with more distressed military parents were more likely to sustain participation in the program, and that reductions in distress among both military and civilian parents were significantly mediated by improvements in resilient family processes. These results are consistent with family systems and resilience models that support preventive interventions designed to enhance resilient family processes as an important part of comprehensive services for distressed military families. © 2016 Family Process Institute.

  17. Developmental Programming of Renal Function and Re-Programming Approaches.

    PubMed

    Nüsken, Eva; Dötsch, Jörg; Weber, Lutz T; Nüsken, Kai-Dietrich

    2018-01-01

Chronic kidney disease affects more than 10% of the population. Programming studies have examined the interrelationship between environmental factors in early life and differences in morbidity and mortality between individuals. A number of important principles have been identified, namely permanent structural modifications of organs and cells, long-lasting adjustments of endocrine regulatory circuits, as well as altered gene transcription. Risk factors include intrauterine deficiencies caused by disturbed placental function or maternal malnutrition, prematurity, intrauterine and postnatal stress, intrauterine and postnatal overnutrition, as well as dietary imbalances in postnatal life. This mini-review discusses critical developmental periods and long-term sequelae of renal programming in humans and presents studies examining the underlying mechanisms as well as interventional approaches to "re-program" renal susceptibility toward disease. Clinical manifestations of programmed kidney disease include arterial hypertension, proteinuria, aggravation of inflammatory glomerular disease, and loss of kidney function. Nephron number, regulation of the renin-angiotensin-aldosterone system, renal sodium transport, vasomotor and endothelial function, myogenic response, and tubuloglomerular feedback have been identified as being vulnerable to environmental factors. Oxidative stress levels, metabolic pathways, including insulin, leptin, steroids, and arachidonic acid, DNA methylation, and histone configuration may be significantly altered by adverse environmental conditions. Studies on re-programming interventions have so far focused on dietary or anti-oxidative approaches. Further studies that broaden our understanding of renal programming mechanisms are needed to ultimately develop preventive strategies. Targeted re-programming interventions in animal models focusing on known mechanisms will contribute to new concepts which finally will have to be translated to human application.
Early nutritional concepts with specific modifications in macro- or micronutrients are among the most promising approaches to improve future renal health.

  18. Development of Flight Safety Prediction Methodology for U. S. Naval Safety Center. Revision 1

    DTIC Science & Technology

    1970-02-01

Safety Center. The methodology developed encompassed functional analysis of the F-4J aircraft and assessment of the importance of safety-sensitive functions. (Recoverable table-of-contents fragments: 4.4 Function Sensitivity; 4.5 Model Implementation; 4.5.1 Functional Analysis; 4.5.2 Major Function Sensitivity Assignment; 4.5.3 Link Dependency Assignment; 4.5.4 Computer Program for Sensitivity.)

  19. From non-preemptive to preemptive scheduling using synchronization synthesis.

    PubMed

    Černý, Pavol; Clarke, Edmund M; Henzinger, Thomas A; Radhakrishna, Arjun; Ryzhyk, Leonid; Samanta, Roopsha; Tarrach, Thorsten

    2017-01-01

    We present a computer-aided programming approach to concurrency. The approach allows programmers to program assuming a friendly, non-preemptive scheduler, and our synthesis procedure inserts synchronization to ensure that the final program works even with a preemptive scheduler. The correctness specification is implicit, inferred from the non-preemptive behavior. Let us consider sequences of calls that the program makes to an external interface. The specification requires that any such sequence produced under a preemptive scheduler should be included in the set of sequences produced under a non-preemptive scheduler. We guarantee that our synthesis does not introduce deadlocks and that the synchronization inserted is optimal w.r.t. a given objective function. The solution is based on a finitary abstraction, an algorithm for bounded language inclusion modulo an independence relation, and generation of a set of global constraints over synchronization placements. Each model of the global constraints set corresponds to a correctness-ensuring synchronization placement. The placement that is optimal w.r.t. the given objective function is chosen as the synchronization solution. We apply the approach to device-driver programming, where the driver threads call the software interface of the device and the API provided by the operating system. Our experiments demonstrate that our synthesis method is precise and efficient. The implicit specification helped us find one concurrency bug previously missed when model-checking using an explicit, user-provided specification. We implemented objective functions for coarse-grained and fine-grained locking and observed that different synchronization placements are produced for our experiments, favoring a minimal number of synchronization operations or maximum concurrency, respectively.
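The effect of synthesized synchronization can be illustrated with a toy sketch (not the paper's synthesis tool): a read-modify-write that is atomic under a friendly non-preemptive scheduler is protected with a lock, so that preemptive interleavings produce the same external behavior. Thread counts and names are illustrative.

```python
import threading

# Two driver-like threads update a shared request counter. Under a
# non-preemptive scheduler the read-modify-write below runs atomically;
# a preemptive scheduler can interleave it and lose updates. Inserting
# a lock (the kind of synchronization the synthesis procedure places)
# restores the non-preemptive behavior under preemption.

counter = 0
lock = threading.Lock()

def submit_requests(n):
    global counter
    for _ in range(n):
        with lock:               # synthesized synchronization point
            counter += 1         # read-modify-write kept atomic

threads = [threading.Thread(target=submit_requests, args=(10000,))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# With the lock, counter is deterministically 4 * 10000 = 40000;
# without it, some increments could be lost under preemption.
```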

  20. An innovative approach for modeling and simulation of an automated industrial robotic arm operated electro-pneumatically

    NASA Astrophysics Data System (ADS)

    Popa, L.; Popa, V.

    2017-08-01

The article focuses on modeling an electro-pneumatically operated automated industrial robotic arm and on simulating its operation. The graphic language FBD (Function Block Diagram) is used to program the robotic arm on a Zelio Logic automation module. The innovative modeling and simulation procedures address specific problems in the development of a new type of technical product in the field of robotics. In this way, new applications of a Programmable Logic Controller (PLC) were identified, as a specialized computer performing control functions at a variety of high levels of complexity.
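The function-block style of an FBD program can be sketched in ordinary code: each block is a small stateful unit evaluated once per PLC scan cycle, and blocks are wired by passing outputs to inputs. The blocks and wiring below are illustrative, not the article's Zelio Logic program.

```python
# FBD-style sketch: each function block is evaluated once per scan cycle.

class AndBlock:
    """Boolean AND block."""
    def update(self, a, b):
        return a and b

class OnDelayTimer:
    """TON-style on-delay timer: output goes true once the input has
    been held true for `preset` consecutive scan cycles."""
    def __init__(self, preset):
        self.preset = preset
        self.elapsed = 0
    def update(self, enable):
        self.elapsed = self.elapsed + 1 if enable else 0
        return self.elapsed >= self.preset

# Wire the blocks: energize an (imaginary) pneumatic valve only when
# both sensors are on and have stayed on for 3 consecutive scans.
and_block, timer = AndBlock(), OnDelayTimer(preset=3)
outputs = [timer.update(and_block.update(True, True)) for _ in range(4)]
# outputs: [False, False, True, True]
```

The scan-cycle loop mirrors how a PLC repeatedly evaluates the block diagram; dropping either sensor input resets the timer on the next scan.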

  1. The Weak Link HP-41C hand-held calculator program

    Treesearch

    Ross A. Phillips; Penn A. Peters; Gary D. Falk

    1982-01-01

The Weak Link hand-held calculator program (HP-41C) quickly analyzes a system for logging production and costs. The production equations model conventional chain saw, skidder, loader, and tandem-axle truck operations in eastern mountain areas. Production of each function of the logging system may be determined so that the system may be balanced for minimum cost. The...

  2. Conventions and workflows for using Situs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wriggers, Willy, E-mail: wriggers@biomachina.org

    2012-04-01

Recent developments of the Situs software suite for multi-scale modeling are reviewed. Typical workflows and conventions encountered during processing of biophysical data from electron microscopy, tomography or small-angle X-ray scattering are described. Situs is a modular program package for the multi-scale modeling of atomic resolution structures and low-resolution biophysical data from electron microscopy, tomography or small-angle X-ray scattering. This article provides an overview of recent developments in the Situs package, with an emphasis on workflows and conventions that are important for practical applications. The modular design of the programs facilitates scripting in the bash shell, allowing specific programs to be combined in creative ways that go beyond the original intent of the developers. Several scripting-enabled functionalities, such as flexible transformations of data type, the use of symmetry constraints or the creation of two-dimensional projection images, are described. The processing of low-resolution biophysical maps in such workflows follows not only first principles but often relies on implicit conventions. Situs conventions related to map formats, resolution, correlation functions and feature detection are reviewed and summarized. The compatibility of the Situs workflow with CCP4 conventions and programs is discussed.

  3. IFSM fractal image compression with entropy and sparsity constraints: A sequential quadratic programming approach

    NASA Astrophysics Data System (ADS)

    Kunze, Herb; La Torre, Davide; Lin, Jianyi

    2017-01-01

We consider the inverse problem associated with IFSM: given a target function f, find an IFSM such that its fixed point f̄ is sufficiently close to f in the L^p distance. Forte and Vrscay [1] showed how to reduce this problem to a quadratic optimization model. In this paper, we extend the collage-based method developed by Kunze, La Torre and Vrscay [2][3][4] by proposing minimization of the ℓ1-norm instead of the ℓ0-norm. Optimization problems involving the ℓ0-norm are combinatorial in nature, and hence in general NP-hard; to overcome this difficulty, we introduce the ℓ1-norm and propose a Sequential Quadratic Programming algorithm to solve the corresponding inverse problem. As in Kunze, La Torre and Vrscay [3], in our formulation the minimization of collage error is treated as a multi-criteria problem that includes three different and conflicting criteria: collage error, entropy, and sparsity. This multi-criteria program is solved by means of a scalarization technique which reduces the model to a single-criterion program by combining all objective functions with different trade-off weights. The results of some numerical computations are presented.
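The scalarization step can be illustrated on a toy problem: conflicting quadratic criteria combined with trade-off weights reduce to a single-criterion program with a closed-form minimizer. The criteria here are stand-ins, not the actual collage-error, entropy, and sparsity functionals.

```python
def scalarize_and_minimize(targets, weights):
    """Weighted-sum scalarization of conflicting quadratic criteria
    f_i(x) = (x - a_i)**2: the single-criterion program
        min_x  sum_i w_i * (x - a_i)**2
    has the closed-form solution x* = sum(w_i * a_i) / sum(w_i),
    i.e. the weighted mean of the individual minimizers. A toy stand-in
    for trading off collage error, entropy and sparsity with weights."""
    total = sum(weights)
    x_star = sum(w * a for w, a in zip(weights, targets)) / total
    value = sum(w * (x_star - a) ** 2 for w, a in zip(weights, targets))
    return x_star, value
```

Shifting weight toward one criterion pulls the compromise solution toward that criterion's own minimizer, which is exactly the role of the trade-off weights in the abstract.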

  4. Bereavement Support in an Acute Hospital: An Irish Model

    ERIC Educational Resources Information Center

    Walsh, Trish; Foreman, Maeve; Curry, Philip; O'Driscoll, Siobhan; McCormack, Martin

    2008-01-01

    In the first Irish study to examine a hospital-based bereavement care program, 1 year's cohort of bereaved people was surveyed. A response rate of over 40% provided 339 completed questionnaires from bereaved next-of-kin. The findings suggest that a tiered pyramid model of bereavement care (the Beaumont model) may be functional in a number of ways.…

  5. A recurrent neural network for solving bilevel linear programming problem.

    PubMed

    He, Xing; Li, Chuandong; Huang, Tingwen; Li, Chaojie; Huang, Junjian

    2014-04-01

In this brief, based on the method of penalty functions, a recurrent neural network (NN) modeled by means of a differential inclusion is proposed for solving the bilevel linear programming problem (BLPP). Compared with existing NNs for the BLPP, the model has the fewest state variables and a simple structure. Using nonsmooth analysis, the theory of differential inclusions, and a Lyapunov-like method, the equilibrium point sequence of the proposed NN can be shown to converge approximately to an optimal solution of the BLPP under certain conditions. Finally, numerical simulations of a supply chain distribution model show the excellent performance of the proposed recurrent NN.

  6. Computing Reliabilities Of Ceramic Components Subject To Fracture

    NASA Technical Reports Server (NTRS)

    Nemeth, N. N.; Gyekenyesi, J. P.; Manderscheid, J. M.

    1992-01-01

CARES calculates the fast-fracture reliability or failure probability of macroscopically isotropic ceramic components. The program uses results from a commercial structural-analysis program (MSC/NASTRAN or ANSYS) to evaluate the reliability of a component in the presence of inherent surface- and/or volume-type flaws. It computes a measure of reliability by use of a finite-element mathematical model applicable to multiple materials, in the sense that the model is made a function of the statistical characterizations of many ceramic materials. The reliability analysis uses element stress, temperature, area, and volume outputs obtained from two-dimensional shell and three-dimensional solid isoparametric or axisymmetric finite elements. Written in FORTRAN 77.
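The weakest-link reliability calculation such a program performs can be sketched under strongly simplifying assumptions (uniform stress per element, a two-parameter Weibull volume-flaw model, no sub-element integration or multiaxial treatment). This is an illustrative sketch, not the CARES algorithm.

```python
import math

def fast_fracture_reliability(elements, m, sigma0):
    """Two-parameter Weibull fast-fracture sketch: an element of volume V
    at uniform tensile stress sigma survives with probability
    exp(-V * (sigma / sigma0)**m), and the component reliability is the
    product over elements (weakest-link assumption).
    elements: list of (stress, volume) pairs; compressive stresses ignored.
    m: Weibull modulus; sigma0: Weibull scale parameter."""
    log_r = 0.0
    for stress, volume in elements:
        if stress > 0:
            log_r -= volume * (stress / sigma0) ** m
    return math.exp(log_r)   # reliability; failure probability is 1 - this
```

A single unit-volume element stressed exactly at sigma0 has reliability exp(-1), and raising the stress of any element lowers the component reliability, as the weakest-link model requires.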

  7. Multivariate Heteroscedasticity Models for Functional Brain Connectivity.

    PubMed

    Seiler, Christof; Holmes, Susan

    2017-01-01

Functional brain connectivity is the co-occurrence of brain activity in different areas during rest and while doing tasks. The data of interest are multivariate time series measured simultaneously across brain parcels using resting-state fMRI (rfMRI). We analyze functional connectivity using two heteroscedasticity models. Our first model is low-dimensional and scales linearly in the number of brain parcels. Our second model scales quadratically. We apply both models to data from the Human Connectome Project (HCP) comparing connectivity between short and conventional sleepers. We find stronger functional connectivity in short than conventional sleepers in brain areas consistent with previous findings. This might be due to subjects falling asleep in the scanner. Consequently, we recommend the inclusion of average sleep duration as a covariate to remove unwanted variation in rfMRI studies. A power analysis using the HCP data shows that a sample size of 40 detects 50% of the connectivity at a false discovery rate of 20%. We provide implementations using R and the probabilistic programming language Stan.

  8. Parameter estimation in a human operator describing function model for a two-dimensional tracking task

    NASA Technical Reports Server (NTRS)

    Vanlunteren, A.

    1977-01-01

A previously described parameter-estimation program was applied to a number of control tasks, each involving a human operator model consisting of more than one describing function. One of these experiments is treated in more detail: a two-dimensional tracking task with identical controlled elements. The tracking errors were presented on one display as two vertically moving horizontal lines, and each loop had its own manipulator. The two forcing functions were mutually independent and each consisted of 9 sine waves. A human operator model was chosen consisting of 4 describing functions, thus taking into account possible linear cross-couplings. From the Fourier coefficients of the relevant signals, the model parameters were estimated after alignment, averaging over a number of runs, and decoupling. The results show that for the elements in the main loops the crossover model applies. A weak linear cross-coupling existed with the same dynamics as the elements in the main loops but with a negative sign.
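The Fourier-coefficient step can be sketched: the describing function at one forcing frequency is the ratio of the output to input Fourier coefficients, computed by correlating the sampled signals with a complex exponential over an integer number of periods. The signals and frequency below are illustrative, not the experiment's forcing functions.

```python
import cmath
import math

def fourier_coefficient(samples, dt, freq):
    """Complex Fourier coefficient of a sampled signal at one frequency,
    by correlation with exp(-j*2*pi*f*t) over an integer number of periods."""
    n = len(samples)
    acc = sum(s * cmath.exp(-2j * math.pi * freq * k * dt)
              for k, s in enumerate(samples))
    return 2.0 * acc / n

def describing_function(u, y, dt, freq):
    """Estimated operator gain and phase at one forcing-function frequency:
    the ratio of output to input Fourier coefficients. A sketch of the
    Fourier-coefficient step described in the abstract."""
    return fourier_coefficient(y, dt, freq) / fourier_coefficient(u, dt, freq)

# Example: the output lags the input by 0.5 rad and is amplified by 2 at 1 Hz
dt, f, n = 0.01, 1.0, 400   # 4 full periods of a 1 Hz sine at 100 Hz sampling
u = [math.sin(2 * math.pi * f * k * dt) for k in range(n)]
y = [2 * math.sin(2 * math.pi * f * k * dt - 0.5) for k in range(n)]
H = describing_function(u, y, dt, f)
# abs(H) is about 2 and its phase about -0.5 rad
```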

  9. Smoke alarm giveaway and installation programs: an economic evaluation.

    PubMed

    Liu, Ying; Mack, Karin A; Diekman, Shane T

    2012-10-01

The burden of residential fire injury and death is substantial. Targeted smoke alarm giveaway and installation programs are popular interventions used to reduce residential fire mortality and morbidity. The objective was to evaluate the cost effectiveness and cost benefit of implementing a giveaway or installation program in a small hypothetical community with a high risk of fire death and injury, using a decision-analysis model. Model inputs included program costs; program effectiveness (life-years and quality-adjusted life-years saved); and monetized program benefits (medical cost, productivity, property loss, and quality-of-life losses averted) and were identified through structured reviews of existing literature (done in 2011), supplemented by expert opinion. Future costs and effectiveness were discounted at a rate of 3% per year. All costs were expressed in 2011 U.S. dollars. Cost-effectiveness analysis (CEA) resulted in an average cost-effectiveness ratio (ACER) of $51,404 per quality-adjusted life-year (QALY) saved and $45,630 per QALY for the giveaway and installation programs, respectively. Cost-benefit analysis (CBA) showed that both programs were associated with a positive net benefit, with benefit-cost ratios of 2.1 and 2.3, respectively. Smoke alarm functional rate, baseline prevalence of functional alarms, and baseline home fire death rate were among the most influential factors for the CEA and CBA results. Both giveaway and installation programs have an average cost-effectiveness ratio similar to or lower than the median cost-effectiveness ratio reported for other interventions to reduce fatal injuries in homes. Although more effort is required, installation programs result in a lower cost per outcome achieved compared with giveaways. Published by Elsevier Inc.
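The two summary measures can be reproduced arithmetically: discount future QALYs and monetized benefits at 3% per year, then form the average cost-effectiveness ratio (cost per QALY) and the benefit-cost ratio. All input numbers below are made up for illustration, not the study's inputs.

```python
def discounted(values, rate=0.03):
    """Present value of a stream of yearly values at a 3% discount rate
    (year 0 undiscounted)."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(values))

def evaluate_program(cost, qalys_per_year, benefits_per_year, rate=0.03):
    """Average cost-effectiveness ratio (cost per QALY saved) and
    benefit-cost ratio, with future QALYs and benefits discounted.
    Illustrative sketch of the CEA/CBA arithmetic, not the study's model."""
    qalys = discounted(qalys_per_year, rate)
    benefits = discounted(benefits_per_year, rate)
    acer = cost / qalys          # $ per QALY saved
    bcr = benefits / cost        # benefit-cost ratio; > 1 means net benefit
    return acer, bcr
```

With a hypothetical $1000 program saving 1 QALY per year for two years and averting $1500 of losses per year, the benefit-cost ratio comes out above 1, the study's criterion for a positive net benefit.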

  10. The Impact of a Participatory Care Model on Work Satisfaction of Care Workers and the Functionality, Connectedness, and Mental Health of Community-Dwelling Older People.

    PubMed

    Bernoth, Maree; Burmeister, Oliver K; Morrison, Mark; Islam, Md Zahidul; Onslow, Fiona; Cleary, Michelle

    2016-06-01

    This study describes and evaluates an innovative program designed to reduce functional decline among seniors, using a participatory care approach and integrated health teams. The evaluation provides older people and community support workers (CSWs) with the opportunity to share their experiences of being involved with an innovative program to reduce functional decline (mobility, skin integrity, nutrition, mental health, continence) of older, community dwelling adults implemented by a Nursing Service in a major capital city in Australia. As part of the program, CSWs were trained to provide care that aimed to reduce functional decline, and improve the quality of life for the care recipients. Data were collected through in-depth interviews with older people receiving care and a focus group (FG) was conducted with CSWs. Seven themes emerged during data analysis: 1) functionality/independence; 2) prevention; 3) confidence; 4) connection; 5) the approach; 6) care plans; and 7) the role of the CSWs. The relationship built between care giver and receiver and the mutual respect facilitated through adopting a participatory care approach was crucial. This relationship-focused care contributed to improved functionality and consequently quality of life for the older person, and for the CSW professional it contributed to their development, improved satisfaction with their role, and increased pride in the difference they make in the lives of their clients. Opportunities for improvement of the program included ensuring that participants understood the rationale for all aspects of the program, including regular reminders, as well as the use of regular reviews of individual outcomes.

  11. French research program on the physiological problems caused by weightlessness. Use of the primate model

    NASA Astrophysics Data System (ADS)

    Pesquies, P. C.; Milhaud, C.; Nogues, C.; Klein, M.; Cailler, B.; Bost, R.

The need for better knowledge of the main biological problems induced by microgravity implies, in addition to human experimentation, the use of animal models, and primates seem particularly well adapted to this type of research. The major areas of investigation are phospho-calcium metabolism and the metabolism of supporting tissues, hydroelectrolytic metabolism, cardiovascular function, wakefulness and sleep-wake cycles, the physiology of equilibrium, and the pathophysiology of space sickness. For this program, the Centre d'Etudes et de Recherches de Medecine Aerospatiale, under the sponsorship of the Centre National d'Etudes Spatiales, developed a program of research on restrained primates for both French-U.S. space cooperation (Spacelab program) and French-Soviet space cooperation (Bio-cosmos program), together with simulation of the effects of microgravity by head-down bedrest. The program's major characteristics are discussed in the study.

  12. A simplified computer program for the prediction of the linear stability behavior of liquid propellant combustors

    NASA Technical Reports Server (NTRS)

    Mitchell, C. E.; Eckert, K.

    1979-01-01

A program for predicting the linear stability of liquid propellant rocket engines is presented, together with the underlying model assumptions and the analytical steps needed to understand the program and its input and output. The rocket engine is modeled as a right circular cylinder with an injector, a concentrated combustion zone, a nozzle with an acoustic admittance, and finite mean flow; the combustion response is represented by the sensitive time lag theory. The resulting partial differential equations are combined into two governing integral equations by the Green's function method. These equations are solved with a successive-approximation technique for the small-amplitude (linear) case. The computational method and the various user options are discussed. Finally, a flow diagram, sample input and output for a typical application, and a complete listing of program MODULE are presented.

  13. Tree-Structured Digital Organisms Model

    NASA Astrophysics Data System (ADS)

    Suzuki, Teruhiko; Nobesawa, Shiho; Tahara, Ikuo

Tierra and Avida are well-known models of digital organisms. They describe a life process as a sequence of computation codes. A linear sequence model, though simple to implement on a computer, may not be the only way to describe a digital organism. We therefore propose a new digital organism model based on a tree structure, similar in spirit to genetic programming. In our model, a life process is a combination of various functions, much as life in the real world is. The model can thus easily describe the hierarchical structure of life, and it can simulate evolutionary computation through the mutual interaction of functions. Simulations verified that our model satisfies the definitions of a digital organism, and it even succeeded in creating species such as viruses and parasites.

  14. An Educational Programming Framework for a Subset of Students with Diverse Learning Needs: Borderline Intellectual Functioning

    ERIC Educational Resources Information Center

    Shaw, Steven R.

    2008-01-01

    Students with intelligence test scores between 70 and 85 frequently fall into the gap between general and special education. Students with borderline intellectual functioning are a large population at-risk for school failure. Recent educational trends (e.g., the use of response to intervention models of special education eligibility,…

  15. Computing the Power-Density Spectrum for an Engineering Model

    NASA Technical Reports Server (NTRS)

    Dunn, H. J.

    1982-01-01

A computer program calculates the power-density spectrum (PDS) from a data base generated by the Advanced Continuous Simulation Language (ACSL), using an algorithm that employs the fast Fourier transform (FFT). The PDS of a variable is obtained by first estimating its autocovariance function and then taking the FFT of the smoothed autocovariance function. The FFT technique conserves computer resources.
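The two-step recipe in this abstract (estimate the autocovariance, then FFT the smoothed estimate) can be sketched in Python. The function below is illustrative and is not the ACSL-based program itself; the Hann lag window and the scaling are assumed smoothing choices.

```python
import numpy as np

def power_density_spectrum(x, dt=1.0):
    """Estimate a PDS by taking the FFT of the smoothed autocovariance.
    Illustrative only -- the Hann lag window and scaling are assumed
    choices, not details of the original program."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    # Biased autocovariance estimate at lags 0..n-1
    acov = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(n)])
    # Smooth the autocovariance with a Hann lag window before transforming
    acov *= np.hanning(2 * n)[n:]
    # PDS = real part of the one-sided FFT of the smoothed autocovariance
    freqs = np.fft.rfftfreq(n, d=dt)
    pds = np.real(np.fft.rfft(acov)) * dt
    return freqs, pds

# A 5 Hz sine sampled at 100 Hz should produce a spectral peak near 5 Hz
t = np.arange(1000) * 0.01
freqs, pds = power_density_spectrum(np.sin(2 * np.pi * 5.0 * t), dt=0.01)
```

Windowing the autocovariance before transforming is what trades variance for bias in classical (Blackman-Tukey-style) spectral estimation.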

  16. Cost Estimation of Software Development and the Implications for the Program Manager

    DTIC Science & Technology

    1992-06-01

Software Lifecycle Model (SLIM), the Jensen System-4 model, the Software Productivity, Quality, and Reliability Estimator (SPQR/20), the Constructive...function models in current use are the Software Productivity, Quality, and Reliability Estimator (SPQR/20) and the Software Architecture Sizing and...Estimator (SPQR/20) was developed by T. Capers Jones of Software Productivity Research, Inc., in 1985. The model is intended to estimate the outcome

  17. A linear goal programming model for human resource allocation in a health-care organization.

    PubMed

    Kwak, N K; Lee, C

    1997-06-01

This paper presents the development of a goal programming (GP) model as an aid to strategic planning and the allocation of limited human resources in a health-care organization. The purpose of the study is to assign personnel to the proper shift hours so that management can minimize total payroll costs while keeping patients satisfied. The GP model is illustrated using data provided by a health-care organization in the Midwest. The goals are identified and prioritized, the model result is examined, and a sensitivity analysis is performed to improve the model's applicability. The GP model adds insight to the resource-allocation planning functions of health-care organizations, and the proposed model is easily applicable to other human resource planning processes.
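The core idea of the abstract — prioritized goals, with payroll cost minimized only after staffing goals are met — can be sketched as a toy preemptive goal program. The shifts, wages, and demands below are invented for illustration; the paper's actual model is a full linear GP with many prioritized goals.

```python
from itertools import product

# Hypothetical shift data (invented for illustration, not from the paper):
# hourly wage per worker and the number of workers each shift needs.
cost = {"day": 22, "evening": 25, "night": 30}     # $/hour per worker
demand = {"day": 6, "evening": 4, "night": 3}      # workers required

def goal_program(max_staff=8):
    """Tiny preemptive goal program solved by brute-force enumeration:
    priority 1 minimizes under-staffing deviations from the demand goals,
    priority 2 minimizes total payroll cost."""
    best = None
    for staffing in product(range(max_staff + 1), repeat=len(cost)):
        plan = dict(zip(cost, staffing))
        under = sum(max(demand[s] - plan[s], 0) for s in cost)   # goal deviations
        payroll = sum(cost[s] * plan[s] * 8 for s in cost)       # 8-hour shifts
        key = (under, payroll)          # lexicographic (preemptive) priorities
        if best is None or key < best[0]:
            best = (key, plan)
    return best[1]
```

A real GP would introduce explicit deviation variables and solve an LP; the lexicographic tuple comparison here plays the same role as preemptive priority levels.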

  18. Modeling Operations Costs for Human Exploration Architectures

    NASA Technical Reports Server (NTRS)

    Shishko, Robert

    2013-01-01

Operations and support (O&S) costs for human spaceflight have not received the same attention in the cost-estimating community as development costs. This is unfortunate, as O&S costs typically comprise a majority of life-cycle cost (LCC) in programs such as the International Space Station (ISS) and the now-cancelled Constellation Program. Recognizing this, the Constellation Program and NASA Headquarters supported the development of an O&S cost model specifically for human spaceflight. This model, known as the Exploration Architectures Operations Cost Model (ExAOCM), provided the operations cost estimates for a variety of alternative human missions to the moon, Mars, and Near-Earth Objects (NEOs) in architectural studies. ExAOCM is philosophically based on the DoD Architecture Framework (DoDAF) concepts of operational nodes, systems, operational functions, and milestones. This paper presents the historical background of the model's development and discusses its underlying structure, its unusual user interface, and previous examples of its use in the aforementioned architectural studies.

  19. Saul: Towards Declarative Learning Based Programming

    PubMed Central

    Kordjamshidi, Parisa; Roth, Dan; Wu, Hao

    2015-01-01

    We present Saul, a new probabilistic programming language designed to address some of the shortcomings of programming languages that aim at advancing and simplifying the development of AI systems. Such languages need to interact with messy, naturally occurring data, to allow a programmer to specify what needs to be done at an appropriate level of abstraction rather than at the data level, to be developed on a solid theory that supports moving to and reasoning at this level of abstraction and, finally, to support flexible integration of these learning and inference models within an application program. Saul is an object-functional programming language written in Scala that facilitates these by (1) allowing a programmer to learn, name and manipulate named abstractions over relational data; (2) supporting seamless incorporation of trainable (probabilistic or discriminative) components into the program, and (3) providing a level of inference over trainable models to support composition and make decisions that respect domain and application constraints. Saul is developed over a declaratively defined relational data model, can use piecewise learned factor graphs with declaratively specified learning and inference objectives, and it supports inference over probabilistic models augmented with declarative knowledge-based constraints. We describe the key constructs of Saul and exemplify its use in developing applications that require relational feature engineering and structured output prediction. PMID:26635465

  20. Saul: Towards Declarative Learning Based Programming.

    PubMed

    Kordjamshidi, Parisa; Roth, Dan; Wu, Hao

    2015-07-01

We present Saul, a new probabilistic programming language designed to address some of the shortcomings of programming languages that aim at advancing and simplifying the development of AI systems. Such languages need to interact with messy, naturally occurring data, to allow a programmer to specify what needs to be done at an appropriate level of abstraction rather than at the data level, to be developed on a solid theory that supports moving to and reasoning at this level of abstraction and, finally, to support flexible integration of these learning and inference models within an application program. Saul is an object-functional programming language written in Scala that facilitates these by (1) allowing a programmer to learn, name and manipulate named abstractions over relational data; (2) supporting seamless incorporation of trainable (probabilistic or discriminative) components into the program, and (3) providing a level of inference over trainable models to support composition and make decisions that respect domain and application constraints. Saul is developed over a declaratively defined relational data model, can use piecewise learned factor graphs with declaratively specified learning and inference objectives, and it supports inference over probabilistic models augmented with declarative knowledge-based constraints. We describe the key constructs of Saul and exemplify its use in developing applications that require relational feature engineering and structured output prediction.

  1. Structured Analysis/Design - LSA Task 101, Early Logistic Support Analysis Strategy, Subtask 101.2.1, Develop Early LSA Strategy

    DTIC Science & Technology

    1990-07-01

replacing "logic diagrams" or "flow charts") to aid in coordinating the functions to be performed by a computer program and its associated inputs...the analysis. Both the logical model and detailed procedures are used to develop the application software programs which will be provided to the Government

  2. Wind Factor Simulation Model: User’s Manual.

    DTIC Science & Technology

    1980-04-01

computer program documentation; computerized simulation; equivalent headwind technique; great circle; great circle distance; great circle equation...equation of a great circle. Program listing and flow chart are included.

3. Deep anisotropic shell program for tire analysis

    NASA Technical Reports Server (NTRS)

    Andersen, C. M.

    1981-01-01

A finite element program was constructed to model the mechanical response of a tire, treated as a deep anisotropic shell, to specified static loads. The program is based on a Sanders-Budiansky-type shell theory with the effects of transverse shear deformation and bending-extensional coupling included. A displacement formulation is used together with a total Lagrangian description of the deformation. Sixteen-node quadrilateral elements with bicubic shape functions are employed. The Noor basis-reduction technique and various types of symmetry considerations serve to improve the computational efficiency.

4. NLSCIDNT user's guide maximum likelihood parameter identification computer program with nonlinear rotorcraft model

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A nonlinear, maximum likelihood, parameter identification computer program (NLSCIDNT) is described which evaluates rotorcraft stability and control coefficients from flight test data. The optimal estimates of the parameters (stability and control coefficients) are determined (identified) by minimizing the negative log likelihood cost function. The minimization technique is the Levenberg-Marquardt method, which behaves like the steepest descent method when it is far from the minimum and behaves like the modified Newton-Raphson method when it is nearer the minimum. Twenty-one states and 40 measurement variables are modeled, and any subset may be selected. States which are not integrated may be fixed at an input value, or time history data may be substituted for the state in the equations of motion. Any aerodynamic coefficient may be expressed as a nonlinear polynomial function of selected 'expansion variables'.
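The Levenberg-Marquardt behavior the abstract describes — steepest descent far from the minimum, Gauss-Newton near it — can be sketched in a few lines. Everything below (the toy model y = a·e^(bt), the synthetic data, the damping schedule) is illustrative and is unrelated to the actual NLSCIDNT rotorcraft model.

```python
import numpy as np

def levenberg_marquardt(residual, jac, p0, n_iter=50, lam=1e-2):
    """Minimal Levenberg-Marquardt loop (an illustrative sketch, not
    NLSCIDNT itself): large lam behaves like steepest descent, small
    lam like Gauss-Newton, and lam adapts to the success of each step."""
    p = np.asarray(p0, dtype=float)
    cost = np.sum(residual(p) ** 2)
    for _ in range(n_iter):
        r, J = residual(p), jac(p)
        # Damped normal equations: (J'J + lam*I) step = -J'r
        step = np.linalg.solve(J.T @ J + lam * np.eye(len(p)), -J.T @ r)
        new_cost = np.sum(residual(p + step) ** 2)
        if new_cost < cost:          # accept step: relax toward Gauss-Newton
            p, cost, lam = p + step, new_cost, lam * 0.5
        else:                        # reject step: damp toward steepest descent
            lam *= 2.0
    return p

# Identify a and b in the toy model y = a * exp(b * t) from synthetic data
t = np.linspace(0.0, 2.0, 40)
y = 2.0 * np.exp(-1.3 * t)
resid = lambda p: p[0] * np.exp(p[1] * t) - y
jacob = lambda p: np.column_stack([np.exp(p[1] * t),
                                   p[0] * t * np.exp(p[1] * t)])
p_hat = levenberg_marquardt(resid, jacob, p0=[1.0, -0.5])
```

For maximum likelihood with Gaussian measurement noise, minimizing the negative log likelihood reduces to exactly this weighted sum-of-squares minimization, which is why LM is the workhorse in such identification programs.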

  5. MHSS: a material handling system simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pomernacki, L.; Hollstien, R.B.

    1976-04-07

A Material Handling System Simulator (MHSS) program is described that provides specialized functional blocks for modeling and simulation of nuclear material handling systems. Models of nuclear fuel fabrication plants may be built using functional blocks that simulate material receiving, storage, transport, inventory, processing, and shipping operations, as well as the control and reporting tasks of operators or on-line computers. Blocks are also provided that allow the user to observe and gather statistical information on the dynamic behavior of simulated plants over single or replicated runs. Although it is currently being developed for the nuclear materials handling application, MHSS can be adapted to other industries in which material accountability is important. In this paper, emphasis is on the simulation methodology of the MHSS program with application to the nuclear material safeguards problem.

  6. Defining pediatric inpatient cardiology care delivery models: A survey of pediatric cardiology programs in the USA and Canada.

    PubMed

    Mott, Antonio R; Neish, Steven R; Challman, Melissa; Feltes, Timothy F

    2017-05-01

The treatment of children with cardiac disease is one of the most prevalent and costly pediatric inpatient conditions, yet the design of inpatient medical services for children admitted to and discharged from noncritical cardiology care units is undefined. North American pediatric cardiology programs were surveyed to define the noncritical cardiac care unit models in current practice. An online survey exploring institutional and functional domains for noncritical cardiac care units was crafted. All questions were multiple-choice with comment boxes for further explanation, and the survey was distributed by email four times over a 5-month period. Most programs (n = 45, 60%) exist in free-standing children's hospitals. Most programs cohort cardiac patients on noncritical cardiac care units that are restricted to cardiac patients in 39 (54%) programs or to cardiac and other subspecialty patients in 23 (32%) programs. The most common frontline providers are categorical pediatric residents (n = 58, 81%) and nurse practitioners (n = 48, 67%); however, nurse practitioners are autonomous providers in only 21 (29%) programs. Only 33% of programs use a postoperative fast-track protocol. When transitioning care to referring physicians, most programs (n = 53, 72%) use facsimile to deliver pertinent patient information; 22 programs (31%) use email, and 18 (25%) use verbal communication. In summary, most programs exist in free-standing children's hospitals in which the noncritical cardiac care units are in some form restricted to cardiac patients. While nurse practitioners are used on most noncritical cardiac care units, they rarely function as autonomous providers. The majority of programs in this survey do not incorporate postoperative fast-track protocols, and, given the current era of focused handoffs within hospital systems, relatively few programs use verbal handoffs to the referring pediatric cardiologist/pediatrician. © 2016 Wiley Periodicals, Inc.

  7. Imaging regional renal function parameters using radionuclide tracers

    NASA Astrophysics Data System (ADS)

    Qiao, Yi

A compartmental model is given for evaluating kidney function accurately and noninvasively. The model is cast into a parallel multi-compartment structure in which each pixel region (picture element) of the kidneys is treated as a single kidney compartment. The loss of radionuclide tracers from the blood to the kidney, and from the kidney to the bladder, is modelled in detail. Both the uptake function and the excretion function of the kidneys can be evaluated pixel by pixel, yielding regional diagnostic information on renal function. Gamma camera image data are required by the model, and a screening-test-based renal function measurement is provided. The regional blood background is subtracted from the kidney region of interest (ROI), and the kidney regional rate constants are estimated analytically using the Kuhn-Tucker multiplier method in convex programming by considering the input/output behavior of the kidney compartments. A detailed physiological model of the peripheral compartments of the system, which is not available for most radionuclide tracers, is not required to determine the kidney regional rate constants and the regional blood background factors within the kidney ROI. Moreover, the statistical significance of measurements is considered to assure improved statistical properties of the estimated kidney rate constants. The relations between various renal function parameters and the kidney rate constants are established, and multiple renal function measurements can be derived from the renal compartmental model. The blood radioactivity curve and the regional (or total) radiorenogram, which describes the regional (or total) summed behavior of the kidneys, are obtained analytically, with consideration of the statistical significance of measurements, using convex programming methods for a single peripheral compartment system. In addition, a new technique for determining 'initial conditions' in both the blood compartment and the kidney compartment is presented. The blood curve and the radiorenogram are analyzed in detail, and a physiological analysis from the radiorenogram is given. Applications of Kuhn-Tucker multiplier methods are illustrated for the renal compartmental model in the field of nuclear medicine. Conventional kinetic data analysis methods, the maximum likelihood method, and the weighted integration method are investigated and used for comparison. Moreover, the effect of blood background subtraction is shown using gamma camera images in man. Several functional images are calculated, and the functional imaging technique is applied to evaluating renal function in man quantitatively and visually and compared with comments from a physician.
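A minimal version of the compartmental idea — tracer leaving the blood for a kidney region at one rate constant and leaving the kidney for the bladder at another — can be written as two coupled first-order equations. The rate constants and bolus size below are invented for illustration and are not values from the dissertation.

```python
import math

# Hypothetical rate constants (illustrative, not from the dissertation):
# tracer leaves the blood at rate k1 and leaves the kidney at rate k2.
k1, k2, b0, dt = 0.12, 0.05, 1.0, 0.01

def simulate(t_end):
    """Forward-Euler integration of db/dt = -k1*b, dq/dt = k1*b - k2*q,
    where b is blood activity and q is kidney-region activity."""
    b, q, t = b0, 0.0, 0.0
    while t < t_end:
        b, q = b - k1 * b * dt, q + (k1 * b - k2 * q) * dt
        t += dt
    return q

def analytic(t):
    """Closed-form kidney activity for a single bolus in the blood."""
    return b0 * k1 * (math.exp(-k1 * t) - math.exp(-k2 * t)) / (k2 - k1)
```

Fitting k1 and k2 to a measured pixel-region time-activity curve is the estimation problem the dissertation attacks with Kuhn-Tucker conditions in a convex program.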

  8. Intergenerational Long-Term Effects of Preschool - Structural Estimates from a Discrete Dynamic Programming Model*

    PubMed Central

    Heckman, James J.; Raut, Lakshmi K.

    2015-01-01

    This paper formulates a structural dynamic programming model of preschool investment choices of altruistic parents and then empirically estimates the structural parameters of the model using the NLSY79 data. The paper finds that preschool investment significantly boosts cognitive and non-cognitive skills, which enhance earnings and school outcomes. It also finds that a standard Mincer earnings function, by omitting measures of non-cognitive skills on the right-hand side, overestimates the rate of return to schooling. From the estimated equilibrium Markov process, the paper studies the nature of within generation earnings distribution, intergenerational earnings mobility, and schooling mobility. The paper finds that a tax-financed free preschool program for the children of poor socioeconomic status generates positive net gains to the society in terms of average earnings, higher intergenerational earnings mobility, and schooling mobility. PMID:26709326
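The omitted-variable point can be made precise with the textbook Mincer specification (a standard form, not copied from the paper), here with a scalar non-cognitive skill index $\theta_i$ and, for simplicity, ignoring the experience terms in the bias formula:

```latex
\ln w_i = \beta_0 + \beta_1 s_i + \beta_2 x_i + \beta_3 x_i^2
          + \gamma\,\theta_i + \varepsilon_i,
\qquad
\operatorname{plim}\hat{\beta}_1^{\mathrm{OLS}}
  = \beta_1 + \gamma\,\frac{\operatorname{Cov}(s_i,\theta_i)}
                           {\operatorname{Var}(s_i)}
```

where $w_i$ is earnings, $s_i$ schooling, and $x_i$ experience. If $\theta_i$ is omitted and positively correlated with schooling, the schooling coefficient is biased upward, which is the overestimation of the return to schooling the abstract describes.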

  9. A class of finite-time dual neural networks for solving quadratic programming problems and its k-winners-take-all application.

    PubMed

    Li, Shuai; Li, Yangming; Wang, Zheng

    2013-03-01

    This paper presents a class of recurrent neural networks to solve quadratic programming problems. Different from most existing recurrent neural networks for solving quadratic programming problems, the proposed neural network model converges in finite time and the activation function is not required to be a hard-limiting function for finite convergence time. The stability, finite-time convergence property and the optimality of the proposed neural network for solving the original quadratic programming problem are proven in theory. Extensive simulations are performed to evaluate the performance of the neural network with different parameters. In addition, the proposed neural network is applied to solving the k-winner-take-all (k-WTA) problem. Both theoretical analysis and numerical simulations validate the effectiveness of our method for solving the k-WTA problem. Copyright © 2012 Elsevier Ltd. All rights reserved.
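The k-WTA application can be illustrated with the standard one-variable dual formulation used in this literature: find a threshold y (the dual variable) so that the saturated activations sum to k. Here the recurrent network dynamics are replaced by a simple bisection on y — an illustrative shortcut, not the paper's finite-time recurrent network.

```python
def k_wta(v, k, tol=1e-9):
    """k-winners-take-all via a dual threshold variable: choose y so that
    sum_i g(v_i - y) = k, where g saturates to [0, 1]. Bisection on y
    stands in for the recurrent dynamics (illustrative sketch only)."""
    g = lambda u: min(max(u * 1e6, 0.0), 1.0)   # steep saturation ~ hard limit
    lo, hi = min(v) - 1.0, max(v) + 1.0
    while hi - lo > tol:
        y = 0.5 * (lo + hi)
        if sum(g(vi - y) for vi in v) > k:      # too many winners: raise y
            lo = y
        else:                                   # k or fewer winners: lower y
            hi = y
    y = 0.5 * (lo + hi)
    return [round(g(vi - y)) for vi in v]
```

Note that the paper's contribution is precisely that its network reaches this solution in finite time without requiring a hard-limiting activation; the steep-but-finite gain above mimics that relaxation.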

  10. 12 CFR Appendix A to Subpart C of... - Model Stipulation for Protective Order and Model Protective Order

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., DEPARTMENT OF THE TREASURY ORGANIZATION AND FUNCTIONS, AVAILABILITY AND RELEASE OF INFORMATION, CONTRACTING OUTREACH PROGRAM, POST-EMPLOYMENT RESTRICTIONS FOR SENIOR EXAMINERS Release of Non-Public OCC Information... such records and any information contained in such records confidential and shall in no way divulge the...

  11. 12 CFR Appendix A to Subpart C of... - Model Stipulation for Protective Order and Model Protective Order

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., DEPARTMENT OF THE TREASURY ORGANIZATION AND FUNCTIONS, AVAILABILITY AND RELEASE OF INFORMATION, CONTRACTING OUTREACH PROGRAM, POST-EMPLOYMENT RESTRICTIONS FOR SENIOR EXAMINERS Release of Non-Public OCC Information... shall keep such records and any information contained in such records confidential and shall in no way...

  12. 12 CFR Appendix A to Subpart C of... - Model Stipulation for Protective Order and Model Protective Order

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., DEPARTMENT OF THE TREASURY ORGANIZATION AND FUNCTIONS, AVAILABILITY AND RELEASE OF INFORMATION, CONTRACTING OUTREACH PROGRAM, POST-EMPLOYMENT RESTRICTIONS FOR SENIOR EXAMINERS Release of Non-Public OCC Information... such records and any information contained in such records confidential and shall in no way divulge the...

  13. Integrated Services for Frail Elders (SIPA): A Trial of a Model for Canada

    ERIC Educational Resources Information Center

    Beland, Francois; Bergman, Howard; Lebel, Paule; Dallaire, Luc; Fletcher, John; Contandriopoulos, Andre-Pierre; Solidage, Tousignant Pierre

    2006-01-01

    The complex formed by chronic illness, episodes of acute illness, physiological disabilities, functional limitations, and cognitive problems is prevalent among frail elderly persons. These individuals rely on assistance from social and health care programs, which in Canada are still fragmented. SIPA is an integrated service model based on…

  14. Observed-Score Equating as a Test Assembly Problem.

    ERIC Educational Resources Information Center

    van der Linden, Wim J.; Luecht, Richard M.

    1998-01-01

    Derives a set of linear conditions of item-response functions that guarantees identical observed-score distributions on two test forms. The conditions can be added as constraints to a linear programming model for test assembly. An example illustrates the use of the model for an item pool from the Law School Admissions Test (LSAT). (SLD)

  15. Evaluation of Two Teaching Programs Based on Structural Learning Principles.

    ERIC Educational Resources Information Center

    Haussler, Peter

    1978-01-01

    Structural learning theory and the Rasch model measured learning gain, retention, and transfer in 1,037 students, grades 7-10. Students learned nine functional relationships with either spontaneous or synthetic algorithms. The Rasch model gave the better description of the data. The hypothesis that the synthetic method was superior was refuted.…

  16. Equilibrator: Modeling Chemical Equilibria with Excel

    ERIC Educational Resources Information Center

    Vander Griend, Douglas A.

    2011-01-01

    Equilibrator is a Microsoft Excel program for learning about chemical equilibria through modeling, similar in function to EQS4WIN, which is no longer supported and does not work well with newer Windows operating systems. Similar to EQS4WIN, Equilibrator allows the user to define a system with temperature, initial moles, and then either total…

  17. Portfolio Optimization with Stochastic Dividends and Stochastic Volatility

    ERIC Educational Resources Information Center

    Varga, Katherine Yvonne

    2015-01-01

    We consider an optimal investment-consumption portfolio optimization model in which an investor receives stochastic dividends. As a first problem, we allow the drift of stock price to be a bounded function. Next, we consider a stochastic volatility model. In each problem, we use the dynamic programming method to derive the Hamilton-Jacobi-Bellman…
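For context, the dynamic programming method referred to leads to a Hamilton-Jacobi-Bellman equation of the generic Merton consumption-investment form below (a standard template, not the dissertation's specific equation, which adds stochastic dividend and volatility terms):

```latex
0 = \max_{c,\pi}\Big\{\, u(c) - \rho V + V_t
      + \big[r x + \pi(\mu - r) - c\big] V_x
      + \tfrac{1}{2}\pi^2 \sigma^2 V_{xx} \Big\}
```

where $V(t,x)$ is the value function over wealth $x$, $c$ is consumption, $\pi$ the amount invested in the risky asset, $\mu$ and $\sigma$ the stock drift and volatility, $r$ the risk-free rate, and $\rho$ the discount rate.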

  18. WRAP-RIB antenna technology development

    NASA Technical Reports Server (NTRS)

    Freeland, R. E.; Garcia, N. F.; Iwamoto, H.

    1985-01-01

The wrap-rib deployable antenna concept development is based on a combination of hardware development and testing, along with extensive supporting analysis. The proof-of-concept hardware models are large enough to address the same basic problems of design, fabrication, assembly, and test as the full-scale systems, which were selected to be 100 meters at the beginning of the program. The hardware evaluation program consists of functional performance tests, design verification tests, and analytical model verification tests. Functional testing covers kinematic deployment, mesh management, and verification of mechanical packaging efficiencies. Design verification covers rib contour precision measurement, rib cross-section variation evaluation, rib materials characterization, and manufacturing imperfections assessment. Analytical model verification and refinement include mesh stiffness measurement, rib static and dynamic testing, mass measurement, and rib cross-section characterization. The concept has been considered for a number of potential applications, including mobile communications, VLBI, and aircraft surveillance; in fact, baseline system configurations using the appropriate wrap-rib antenna were developed by JPL for all three classes of applications.

  19. A dynamic model for plant growth: validation study under changing temperatures

    NASA Technical Reports Server (NTRS)

Wann, M.; Raper, C. D., Jr. (Principal Investigator)

    1984-01-01

    A dynamic simulation model to describe vegetative growth of plants, for which some functions and parameter values have been estimated previously by optimization search techniques and numerical experimentation based on data from constant temperature experiments, is validated under conditions of changing temperatures. To test the predictive capacity of the model, dry matter accumulation in the leaves, stems, and roots of tobacco plants (Nicotiana tabacum L.) was measured at 2- or 3-day intervals during a 5-week period when temperatures in controlled-environment rooms were programmed for changes at weekly and daily intervals and in ascending or descending sequences within a range of 14 to 34 degrees C. Simulations of dry matter accumulation and distribution were carried out using the programmed changes for experimental temperatures and compared with the measured values. The agreement between measured and predicted values was close and indicates that the temperature-dependent functional forms derived from constant-temperature experiments are adequate for modelling plant growth responses to conditions of changing temperatures with switching intervals as short as 1 day.

  20. Designers' unified cost model

    NASA Technical Reports Server (NTRS)

    Freeman, W.; Ilcewicz, L.; Swanson, G.; Gutowski, T.

    1992-01-01

    The Structures Technology Program Office (STPO) at NASA LaRC has initiated development of a conceptual and preliminary designers' cost prediction model. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. This paper presents the team members, approach, goals, plans, and progress to date for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  1. Supporting university students with autism spectrum disorder.

    PubMed

    Hillier, Ashleigh; Goldstein, Jody; Murphy, Deirdra; Trietsch, Rhoda; Keeves, Jacqueline; Mendes, Eva; Queenan, Alexa

    2018-01-01

    Increasing numbers of students with autism spectrum disorder are entering higher education. Their success can be jeopardized by organizational, social/emotional, and academic challenges if appropriate supports are not in place. Our objective was to evaluate the effectiveness of a support group model for university students with autism spectrum disorder in improving psychological and functional outcomes. A curriculum guided the weekly discussions and consisted of topics such as time and stress management, managing group work, and social communication. Efficacy was assessed through pre- and post self-report measures focused on self-esteem, loneliness, anxiety, and depression. Functional changes in academic and social skills were examined through qualitative analysis of focus groups. Findings from the self-report measures indicated significant reductions in feelings of loneliness and general anxiety, and a significant increase in self-esteem at the end of the program compared to the beginning. Five prominent themes were identified in the focus-group analysis and reflected how the program had positively impacted participants' skills and coping: executive functioning; goal setting; academics and resources; stress and anxiety; and social. Given the cost effectiveness of "in-house" interventions and the potential for improving academic outcomes and retention of students with autism spectrum disorder, further research examining similar program models is warranted.

  2. Psychological health of military children: Longitudinal evaluation of a family-centered prevention program to enhance family resilience

    PubMed Central

    Lester, Patricia; Stein, Judith A.; Saltzman, William; Woodward, Kirsten; MacDermid, Shelley W.; Milburn, Norweeta; Mogil, Catherine; Beardslee, William

    2014-01-01

    Family-centered preventive interventions have been proposed as relevant to mitigating psychological health risk and promoting resilience in military families facing wartime deployment and reintegration. This study evaluates the impact of a family-centered prevention program, Families OverComing Under Stress Family Resilience Training (FOCUS), on the psychological adjustment of military children. Two primary goals include: 1) Understanding the relationships of distress among family members using a longitudinal path model to assess relations at the child and family level, and 2) Determining pathways of program impact on child adjustment. Multilevel data analysis using structural equation modeling was conducted with de-identified service delivery data from 280 families (505 children ages 3-17) in two follow-up assessments. Standardized measures included Service Member and Civilian parental distress (Brief Symptom Inventory, PTSD Checklist – Military), child adjustment (Strengths and Difficulties Questionnaire), and family functioning (McMaster Family Assessment Device). Distress was significantly related among the service member parent, civilian parent and children. FOCUS improved family functioning, which in turn significantly reduced child distress at follow-up. Salient components of improved family functioning in reducing child distress mirrored resilience processes targeted by FOCUS. These findings underscore the public health potential of family-centered prevention for military families, and suggest areas for future research. PMID:23929043

  3. The Fixed-Point Theory of Strictly Causal Functions

    DTIC Science & Technology

    2013-06-09

    functions were defined to be the functions that are strictly contracting with respect to the Cantor metric (also called the Baire distance) on signals...of Lecture Notes in Computer Science, pages 447–484. Springer Berlin / Heidelberg, 1992. [36] George Markowsky. Chain-complete posets and directed...Journal of Logic Programming, 42(2):59–70, 2000. [52] George M. Reed and A. William Roscoe. A timed model for communicating sequential processes. In Laurent
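    The record's central result rests on the Banach fixed-point theorem: a strictly contracting function has a unique fixed point, reached by iterating the function from any starting point. A minimal illustration on ordinary reals (the paper itself works with the Cantor metric on signals, not the real line):

```python
import math

def fixed_point(f, x0, tol=1e-12, max_iter=1000):
    """Iterate x <- f(x); for a contraction this converges to the
    unique fixed point guaranteed by the Banach fixed-point theorem."""
    x = x0
    for _ in range(max_iter):
        nxt = f(x)
        if abs(nxt - x) < tol:
            return nxt
        x = nxt
    raise RuntimeError("iteration did not converge")

# cos is contracting on the interval the iteration stays in, so the
# sequence converges to the unique solution of x = cos(x) (~0.739085)
root = fixed_point(math.cos, 1.0)
print(round(root, 6))
```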

  4. Optimization of neural network architecture using genetic programming improves detection and modeling of gene-gene interactions in studies of human diseases

    PubMed Central

    Ritchie, Marylyn D; White, Bill C; Parker, Joel S; Hahn, Lance W; Moore, Jason H

    2003-01-01

    Background Appropriate definition of neural network architecture prior to data analysis is crucial for successful data mining. This can be challenging when the underlying model of the data is unknown. The goal of this study was to determine whether optimizing neural network architecture using genetic programming as a machine learning strategy would improve the ability of neural networks to model and detect nonlinear interactions among genes in studies of common human diseases. Results Using simulated data, we show that a genetic programming optimized neural network approach is able to model gene-gene interactions as well as a traditional back propagation neural network. Furthermore, the genetic programming optimized neural network is better than the traditional back propagation neural network approach in terms of predictive ability and power to detect gene-gene interactions when non-functional polymorphisms are present. Conclusion This study suggests that a machine learning strategy for optimizing neural network architecture may be preferable to traditional trial-and-error approaches for the identification and characterization of gene-gene interactions in common, complex human diseases. PMID:12846935
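    The evolutionary search underlying this approach can be sketched generically. The toy genetic algorithm below is a deliberate simplification: real genetic programming evolves tree-structured architectures, and the fitness here is just the count of 1-bits rather than a neural network's predictive accuracy. It shows only the selection/crossover/mutation loop:

```python
import random

def evolve(fitness, genome_len, pop_size=30, generations=60, seed=0):
    """Minimal genetic search: tournament selection, one-point
    crossover, and bit-flip mutation over fixed-length bitstrings."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, genome_len)      # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.1:                  # occasional mutation
                child[rng.randrange(genome_len)] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve(sum, 20)   # OneMax: fitness = number of 1-bits
print(sum(best))
```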

  5. The Boston Health Care for the Homeless Program: a public health framework.

    PubMed

    O'Connell, James J; Oppenheimer, Sarah C; Judge, Christine M; Taube, Robert L; Blanchfield, Bonnie B; Swain, Stacy E; Koh, Howard K

    2010-08-01

    During the past 25 years, the Boston Health Care for the Homeless Program has evolved into a service model embodying the core functions and essential services of public health. Each year the program provides integrated medical, behavioral, and oral health care, as well as preventive services, to more than 11 000 homeless people. Services are delivered in clinics located in 2 teaching hospitals, 80 shelters and soup kitchens, and an innovative 104-bed medical respite unit. We explain the program's principles of care, describe the public health framework that undergirds the program, and offer lessons for the elimination of health disparities suffered by this vulnerable population.

  6. Gait Speed among Older Participants Enrolled in an Evidence-Based Fall Risk Reduction Program: A Subgroup Analysis.

    PubMed

    Cho, Jinmyoung; Smith, Matthew Lee; Shubert, Tiffany E; Jiang, Luohua; Ahn, SangNam; Ory, Marcia G

    2015-01-01

    Functional decline is a primary risk factor for institutionalization and mortality among older adults. Although community-based fall risk reduction programs have been widely disseminated, little is known about their impact on gait speed, a key indicator of functional performance. Changes in functional performance between baseline and post-intervention were examined by means of the timed up and go (TUG) test, a standardized functional assessment administered to participants enrolled in the A Matter of Balance/Volunteer Lay Leader (AMOB/VLL) model, an evidence-based fall risk reduction program. This study included 71 participants enrolled in an AMOB/VLL program in the Brazos Valley and South Plains regions of Texas. Paired t-tests were employed to assess program effects on gait speed at baseline and post-intervention for all participants and by subgroups of age, sex, living status, delivery sites, and self-rated health. The Bonferroni correction was applied to adjust the inflated Type I error rate associated with performing multiple t-tests, for which p-values <0.0042 (i.e., 0.05/12 comparisons) were deemed statistically significant. Overall, gait speed of enrolled participants improved from baseline to post-intervention (t = 3.22, p = 0.002). Significant changes in TUG scores were observed among participants who lived with others (t = 4.45, p < 0.001), rated their health as excellent, very good, or good (t = 3.05, p = 0.003), and attended program workshops at senior centers (t = 3.52, p = 0.003). Findings suggest community-based fall risk reduction programs can improve gait speed for older adults. More translational research is needed to understand factors related to the effectiveness of fall risk reduction programs in various populations and settings.
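    The analysis pairs each participant's baseline and post-intervention scores and compares the mean difference to its standard error, with the significance threshold divided by the number of comparisons. A minimal sketch with hypothetical TUG times (the study's raw data are not given in the record):

```python
import math

def paired_t(before, after):
    """Paired t statistic: mean of within-subject differences
    divided by its standard error (df = n - 1)."""
    diffs = [b - a for b, a in zip(before, after)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / math.sqrt(var / n)

# Hypothetical TUG times (seconds): lower is better, so improvement
# shows up as a positive baseline-minus-post difference.
baseline = [14.2, 12.8, 15.1, 13.5, 16.0, 12.1, 14.8, 13.9]
post     = [12.9, 12.5, 13.8, 13.0, 14.6, 12.3, 13.5, 13.1]

t = paired_t(baseline, post)
alpha_bonferroni = 0.05 / 12   # 12 subgroup comparisons -> 0.00417
print(round(t, 2), round(alpha_bonferroni, 5))
```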

  7. Developmental Programming of Renal Function and Re-Programming Approaches

    PubMed Central

    Nüsken, Eva; Dötsch, Jörg; Weber, Lutz T.; Nüsken, Kai-Dietrich

    2018-01-01

    Chronic kidney disease affects more than 10% of the population. Programming studies have examined the interrelationship between environmental factors in early life and differences in morbidity and mortality between individuals. A number of important principles have been identified, namely permanent structural modifications of organs and cells, long-lasting adjustments of endocrine regulatory circuits, as well as altered gene transcription. Risk factors include intrauterine deficiencies caused by disturbed placental function or maternal malnutrition, prematurity, intrauterine and postnatal stress, intrauterine and postnatal overnutrition, as well as dietary imbalances in postnatal life. This mini-review discusses critical developmental periods and long-term sequelae of renal programming in humans and presents studies examining the underlying mechanisms as well as interventional approaches to “re-program” renal susceptibility toward disease. Clinical manifestations of programmed kidney disease include arterial hypertension, proteinuria, aggravation of inflammatory glomerular disease, and loss of kidney function. Nephron number, regulation of the renin–angiotensin–aldosterone system, renal sodium transport, vasomotor and endothelial function, myogenic response, and tubuloglomerular feedback have been identified as being vulnerable to environmental factors. Oxidative stress levels, metabolic pathways, including insulin, leptin, steroids, and arachidonic acid, DNA methylation, and histone configuration may be significantly altered by adverse environmental conditions. Studies on re-programming interventions have so far focused on dietary or anti-oxidative approaches. Further studies that broaden our understanding of renal programming mechanisms are needed to ultimately develop preventive strategies. Targeted re-programming interventions in animal models focusing on known mechanisms will contribute to new concepts which finally will have to be translated to human application. Early nutritional concepts with specific modifications in macro- or micronutrients are among the most promising approaches to improve future renal health. PMID:29535992

  8. PYFLOW_2.0: a computer program for calculating flow properties and impact parameters of past dilute pyroclastic density currents based on field data

    NASA Astrophysics Data System (ADS)

    Dioguardi, Fabio; Mele, Daniela

    2018-03-01

    This paper presents PYFLOW_2.0, a hazard tool for the calculation of the impact parameters of dilute pyroclastic density currents (DPDCs). DPDCs represent the dilute turbulent type of gravity flows that occur during explosive volcanic eruptions; their hazard is the result of their mobility and the capability to laterally impact buildings and infrastructures and to transport variable amounts of volcanic ash along the path. Starting from data coming from the analysis of deposits formed by DPDCs, PYFLOW_2.0 calculates the flow properties (e.g., velocity, bulk density, thickness) and impact parameters (dynamic pressure, deposition time) at the location of the sampled outcrop. Given the inherent uncertainties related to sampling, laboratory analyses, and modeling assumptions, the program provides ranges of variations and probability density functions of the impact parameters rather than single specific values; from these functions, the user can interrogate the program to obtain the value of the computed impact parameter at any specified exceedance probability. In this paper, the sedimentological models implemented in PYFLOW_2.0 are presented, program functionalities are briefly introduced, and two application examples are discussed so as to show the capabilities of the software in quantifying the impact of the analyzed DPDCs in terms of dynamic pressure, volcanic ash concentration, and residence time in the atmosphere. The software and user's manual are made available as a downloadable electronic supplement.
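    Reporting an impact parameter at a specified exceedance probability amounts to reading the (1 − p) quantile of its distribution. A generic sketch of that lookup from Monte Carlo samples (hypothetical dynamic-pressure values; not PYFLOW_2.0's actual routine):

```python
def value_at_exceedance(samples, p_exceed):
    """Return the value exceeded with probability p_exceed,
    i.e. the (1 - p_exceed) empirical quantile, with linear
    interpolation between order statistics."""
    s = sorted(samples)
    idx = (1.0 - p_exceed) * (len(s) - 1)
    lo = int(idx)
    hi = min(lo + 1, len(s) - 1)
    frac = idx - lo
    return s[lo] * (1 - frac) + s[hi] * frac

# Hypothetical dynamic-pressure samples (kPa) from a Monte Carlo run
pressures = [0.8, 1.1, 1.3, 1.6, 1.9, 2.2, 2.6, 3.1, 3.7, 4.5]
print(value_at_exceedance(pressures, 0.10))  # 10% exceedance level
```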

  9. Antimicrobial Stewardship Programs: Comparison of a Program with Infectious Diseases Pharmacist Support to a Program with a Geographic Pharmacist Staffing Model

    PubMed Central

    Ma, Andrew; Clegg, Daniel; Fugit, Randolph V.; Pepe, Anthony; Goetz, Matthew Bidwell; Graber, Christopher J.

    2015-01-01

    Background: Stewardship of antimicrobial agents is an essential function of hospital pharmacies. The ideal pharmacist staffing model for antimicrobial stewardship programs is not known. Objective: To inform staffing decisions for antimicrobial stewardship teams, we aimed to compare an antimicrobial stewardship program with a dedicated Infectious Diseases (ID) pharmacist (Dedicated ID Pharmacist Hospital) to a program relying on ward pharmacists for stewardship activities (Geographic Model Hospital). Methods: We reviewed a randomly selected sample of 290 cases of inpatient parenteral antibiotic use. The electronic medical record was reviewed for compliance with indicators of appropriate antimicrobial stewardship. Results: At the hospital staffed by a dedicated ID pharmacist, 96.8% of patients received initial antimicrobial therapy that adhered to local treatment guidelines compared to 87% of patients at the hospital that assigned antimicrobial stewardship duties to ward pharmacists (P < .002). Therapy was modified within 24 hours of availability of laboratory data in 86.7% of cases at the Dedicated ID Pharmacist Hospital versus 72.6% of cases at the Geographic Model Hospital (P < .03). When a patient’s illness was determined not to be caused by a bacterial infection, antibiotics were discontinued in 78.0% of cases at the Dedicated ID Pharmacist Hospital and in 33.3% of cases at the Geographic Model Hospital (P < .0002). Conclusion: An antimicrobial stewardship program with a dedicated ID pharmacist was associated with greater adherence to recommended antimicrobial therapy practices when compared to a stewardship program that relied on ward pharmacists. PMID:26405339

  10. Accurate Estimation of Target amounts Using Expanded BASS Model for Demand-Side Management

    NASA Astrophysics Data System (ADS)

    Kim, Hyun-Woong; Park, Jong-Jin; Kim, Jin-O.

    2008-10-01

    The electricity demand in Korea has increased rapidly along with steady economic growth since the 1970s. Korea has therefore actively pursued not only SSM (Supply-Side Management) but also DSM (Demand-Side Management) activities, to reduce the investment cost of generating units and to lower electricity supply costs through the enhancement of national energy utilization efficiency. However, the study of rebates, which strongly influence the success or failure of DSM programs, has been insufficient. This paper mathematically formulates an expanded Bass model that accounts for rebates, which influence penetration amounts in DSM programs. To reflect the rebate effect more precisely, the pricing function used in the expanded Bass model directly reflects the response of potential participants to the rebate level.
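    The abstract does not give the authors' pricing function, but the underlying diffusion dynamics can be sketched with a standard discrete Bass model in which the innovation coefficient is scaled by a hypothetical rebate multiplier standing in for that function:

```python
def bass_adoption(m, p, q, periods, rebate_factor=1.0):
    """Discrete Bass diffusion. m: market potential, p: innovation
    coefficient, q: imitation coefficient. rebate_factor (>1) is a
    hypothetical multiplier standing in for the pricing function."""
    adopters = 0.0
    path = []
    for _ in range(periods):
        remaining = m - adopters
        new = (p * rebate_factor + q * adopters / m) * remaining
        adopters += new
        path.append(adopters)
    return path

base = bass_adoption(10000, 0.03, 0.38, 12)
with_rebate = bass_adoption(10000, 0.03, 0.38, 12, rebate_factor=1.5)
print(round(base[-1]), round(with_rebate[-1]))
```

A larger rebate factor raises early adoption, so the cumulative penetration curve shifts upward at every period.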

  11. Reprogramming: A Preventive Strategy in Hypertension Focusing on the Kidney

    PubMed Central

    Tain, You-Lin; Joles, Jaap A.

    2015-01-01

    Adulthood hypertension can be programmed in response to a suboptimal environment in early life. However, developmental plasticity also implies that one can prevent hypertension in adult life by administrating appropriate compounds during early development. We have termed this reprogramming. While the risk of hypertension has been assessed in many mother-child cohorts of human developmental programming, interventions necessary to prove causation and provide a reprogramming strategy are lacking. Since the developing kidney is particularly vulnerable to environmental insults and blood pressure is determined by kidney function, renal programming is considered key in developmental programming of hypertension. Common pathways, whereby both genetic and acquired developmental programming converge into the same phenotype, have been recognized. For instance, the same reprogramming interventions aimed at shifting nitric oxide (NO)-reactive oxygen species (ROS) balance, such as perinatal citrulline or melatonin supplements, can be protective in both genetic and developmentally programmed hypertension. Furthermore, a significantly increased expression of gene Ephx2 (soluble epoxide hydrolase) was noted in both genetic and acquired animal models of hypertension. Since a suboptimal environment is often multifactorial, such common reprogramming pathways are a practical finding for translation to the clinic. This review provides an overview of potential clinical applications of reprogramming strategies to prevent programmed hypertension. We emphasize the kidney in the following areas: mechanistic insights from human studies and animal models to interpret programmed hypertension; identified risk factors of human programmed hypertension from mother-child cohorts; and the impact of reprogramming strategies on programmed hypertension from animal models. It is critical that the observed effects on developmental reprogramming in animal models are replicated in human studies. PMID:26712746

  12. ACIRF user's guide: Theory and examples

    NASA Astrophysics Data System (ADS)

    Dana, Roger A.

    1989-12-01

    Design and evaluation of radio frequency systems that must operate through ionospheric disturbances resulting from high altitude nuclear detonations requires an accurate channel model. This model must include the effects of high gain antennas that may be used to receive the signals. Such a model can then be used to construct realizations of the received signal for use in digital simulations of trans-ionospheric links or for use in hardware channel simulators. The FORTRAN channel model ACIRF (Antenna Channel Impulse Response Function) generates random realizations of the impulse response function at the outputs of multiple antennas. This user's guide describes the FORTRAN program ACIRF (version 2.0) that generates realizations of channel impulse response functions at the outputs of multiple antennas with arbitrary beamwidths, pointing angles, and relatives positions. This channel model is valid under strong scattering conditions when Rayleigh fading statistics apply. Both frozen-in and turbulent models for the temporal fluctuations are included in this version of ACIRF. The theory of the channel model is described and several examples are given.
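    Under strong scattering the complex channel samples are circular complex Gaussian, so their envelope follows Rayleigh statistics. A minimal sketch of generating such realizations (illustrative only, not ACIRF's FORTRAN algorithm, which also models antenna gain and temporal correlation):

```python
import numpy as np

def rayleigh_taps(n, mean_power=1.0, seed=0):
    """Generate n complex channel samples whose envelope is Rayleigh
    distributed: independent Gaussian in-phase and quadrature parts,
    each with variance mean_power / 2."""
    rng = np.random.default_rng(seed)
    sigma = np.sqrt(mean_power / 2.0)
    i = rng.normal(0.0, sigma, n)
    q = rng.normal(0.0, sigma, n)
    return i + 1j * q

taps = rayleigh_taps(100000)
# Sample mean power should be close to the requested mean_power
print(round(float(np.mean(np.abs(taps) ** 2)), 2))
```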

  13. ELRIS2D: A MATLAB Package for the 2D Inversion of DC Resistivity/IP Data

    NASA Astrophysics Data System (ADS)

    Akca, Irfan

    2016-04-01

    ELRIS2D is an open source code written in MATLAB for the two-dimensional inversion of direct current resistivity (DCR) and time domain induced polarization (IP) data. The user interface of the program is designed for functionality and ease of use. All available settings of the program can be reached from the main window. The subsurface is discretized using a hybrid mesh generated by the combination of structured and unstructured meshes, which reduces the computational cost of the whole inversion procedure. The inversion routine is based on the smoothness constrained least squares method. In order to verify the program, responses of two test models and field data sets were inverted. The models inverted from the synthetic data sets are consistent with the original test models in both DC resistivity and IP cases. A field data set acquired in an archaeological site is also used for the verification of outcomes of the program in comparison with the excavation results.
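    Smoothness-constrained least squares minimizes the data misfit plus a roughness penalty, solving (GᵀG + λLᵀL)m = Gᵀd with L a first-difference operator. A generic NumPy sketch on a tiny synthetic problem (not ELRIS2D's MATLAB code, which works on a 2D hybrid mesh):

```python
import numpy as np

def smooth_lsq(G, d, lam):
    """Smoothness-constrained least squares:
    minimize ||G m - d||^2 + lam * ||L m||^2,
    with L the first-difference (roughness) operator."""
    n = G.shape[1]
    L = np.diff(np.eye(n), axis=0)          # (n-1) x n first differences
    A = G.T @ G + lam * L.T @ L
    return np.linalg.solve(A, G.T @ d)

# Tiny synthetic example: a smooth model recovered from noisy data
rng = np.random.default_rng(1)
G = rng.normal(size=(30, 10))               # hypothetical forward operator
m_true = np.linspace(1.0, 2.0, 10)          # smooth "resistivity" profile
d = G @ m_true + 0.01 * rng.normal(size=30)
m_est = smooth_lsq(G, d, lam=0.1)
print(round(float(np.max(np.abs(m_est - m_true))), 3))
```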

  14. Computer-Aided Modeling and Analysis of Power Processing Systems (CAMAPPS). Phase 1: Users handbook

    NASA Technical Reports Server (NTRS)

    Kim, S.; Lee, J.; Cho, B. H.; Lee, F. C.

    1986-01-01

    The EASY5 macro component models developed for the spacecraft power system simulation are described. A brief explanation about how to use the macro components with the EASY5 Standard Components to build a specific system is given through an example. The macro components are ordered according to the following functional group: converter power stage models, compensator models, current-feedback models, constant frequency control models, load models, solar array models, and shunt regulator models. Major equations, a circuit model, and a program listing are provided for each macro component.

  15. PROTO-PLASM: parallel language for adaptive and scalable modelling of biosystems.

    PubMed

    Bajaj, Chandrajit; DiCarlo, Antonio; Paoluzzi, Alberto

    2008-09-13

    This paper discusses the design goals and the first developments of PROTO-PLASM, a novel computational environment to produce libraries of executable, combinable and customizable computer models of natural and synthetic biosystems, aiming to provide a supporting framework for predictive understanding of structure and behaviour through multiscale geometric modelling and multiphysics simulations. Admittedly, the PROTO-PLASM platform is still in its infancy. Its computational framework--language, model library, integrated development environment and parallel engine--intends to provide patient-specific computational modelling and simulation of organs and biosystem, exploiting novel functionalities resulting from the symbolic combination of parametrized models of parts at various scales. PROTO-PLASM may define the model equations, but it is currently focused on the symbolic description of model geometry and on the parallel support of simulations. Conversely, CellML and SBML could be viewed as defining the behavioural functions (the model equations) to be used within a PROTO-PLASM program. Here we exemplify the basic functionalities of PROTO-PLASM, by constructing a schematic heart model. We also discuss multiscale issues with reference to the geometric and physical modelling of neuromuscular junctions.

  16. Proto-Plasm: parallel language for adaptive and scalable modelling of biosystems

    PubMed Central

    Bajaj, Chandrajit; DiCarlo, Antonio; Paoluzzi, Alberto

    2008-01-01

    This paper discusses the design goals and the first developments of Proto-Plasm, a novel computational environment to produce libraries of executable, combinable and customizable computer models of natural and synthetic biosystems, aiming to provide a supporting framework for predictive understanding of structure and behaviour through multiscale geometric modelling and multiphysics simulations. Admittedly, the Proto-Plasm platform is still in its infancy. Its computational framework—language, model library, integrated development environment and parallel engine—intends to provide patient-specific computational modelling and simulation of organs and biosystem, exploiting novel functionalities resulting from the symbolic combination of parametrized models of parts at various scales. Proto-Plasm may define the model equations, but it is currently focused on the symbolic description of model geometry and on the parallel support of simulations. Conversely, CellML and SBML could be viewed as defining the behavioural functions (the model equations) to be used within a Proto-Plasm program. Here we exemplify the basic functionalities of Proto-Plasm, by constructing a schematic heart model. We also discuss multiscale issues with reference to the geometric and physical modelling of neuromuscular junctions. PMID:18559320

  17. Feasibility of a Brief Community-Based Train-the-Trainer Lesson to Reduce the Risk of Falls among Community Dwelling Older Adults

    ERIC Educational Resources Information Center

    Gunter, Katherine B.; John, Deborah H.

    2014-01-01

    The Better Balance, Better Bones, Better Bodies (B-Better©) program was developed to disseminate simple home-based strategies to prevent falls and improve functional health of older adults using a train-the-trainer model. Delivered by Family & Community Education Study Group program volunteers, the lesson stresses the importance of a…

  18. Mathematical model of ambulance resources in Saint-Petersburg

    NASA Astrophysics Data System (ADS)

    Shavidze, G. G.; Balykina, Y. E.; Lejnina, E. A.; Svirkin, M. V.

    2016-06-01

    Emergency medical systems are among the main elements of city infrastructure. The article analyzes the existing system of ambulance resource distribution and considers the idea of using multiperiodicity as a tool to increase the efficiency of the Emergency Medical Services. A program developed in the Matlab programming environment helps to evaluate changes in the functioning of the emergency medical service system.

  19. An improved risk-explicit interval linear programming model for pollution load allocation for watershed management.

    PubMed

    Xia, Bisheng; Qian, Xin; Yao, Hong

    2017-11-01

    Although the risk-explicit interval linear programming (REILP) model has solved the problem of having interval solutions, it has an equity problem, which can lead to unbalanced allocation between different decision variables. Therefore, an improved REILP model is proposed. This model adds an equity objective function and three constraint conditions to overcome the equity problem. In this case, pollution reduction is in proportion to pollutant load, which supports balanced development between different regional economies. The model is used to solve the problem of pollution load allocation in a small transboundary watershed. Compared with the result of the original REILP model, our model achieves equity between the upstream and downstream pollutant loads; it also avoids assigning the greatest pollution reduction to the sources nearest the control section. The model provides a better solution to the problem of pollution load allocation than previous versions.
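    The equity rule the improved model enforces, reduction proportional to current load, can be stated in a few lines. This is an illustrative allocation with hypothetical loads, not the REILP optimization itself, which also trades off risk and system objectives:

```python
def proportional_allocation(loads, total_reduction):
    """Allocate a required total pollutant reduction among sources
    in proportion to each source's current load, so every source
    cuts the same fraction of its own load."""
    total_load = sum(loads)
    return [total_reduction * x / total_load for x in loads]

loads = [120.0, 80.0, 50.0]         # hypothetical loads (t/yr)
cuts = proportional_allocation(loads, 50.0)
print([round(c, 1) for c in cuts])  # -> [24.0, 16.0, 10.0], each 20% of its load
```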

  20. Recognition of prokaryotic and eukaryotic promoters using convolutional deep learning neural networks.

    PubMed

    Umarov, Ramzan Kh; Solovyev, Victor V

    2017-01-01

    Accurate computational identification of promoters remains a challenge as these key DNA regulatory regions have variable structures composed of functional motifs that provide gene-specific initiation of transcription. In this paper we utilize Convolutional Neural Networks (CNN) to analyze sequence characteristics of prokaryotic and eukaryotic promoters and build their predictive models. We trained a similar CNN architecture on promoters of five distant organisms: human, mouse, plant (Arabidopsis), and two bacteria (Escherichia coli and Bacillus subtilis). We found that CNN trained on sigma70 subclass of Escherichia coli promoter gives an excellent classification of promoters and non-promoter sequences (Sn = 0.90, Sp = 0.96, CC = 0.84). The Bacillus subtilis promoters identification CNN model achieves Sn = 0.91, Sp = 0.95, and CC = 0.86. For human, mouse and Arabidopsis promoters we employed CNNs for identification of two well-known promoter classes (TATA and non-TATA promoters). CNN models nicely recognize these complex functional regions. For human promoters Sn/Sp/CC accuracy of prediction reached 0.95/0.98/0.90 on TATA and 0.90/0.98/0.89 for non-TATA promoter sequences, respectively. For Arabidopsis we observed Sn/Sp/CC 0.95/0.97/0.91 (TATA) and 0.94/0.94/0.86 (non-TATA) promoters. Thus, the developed CNN models, implemented in CNNProm program, demonstrated the ability of deep learning approach to grasp complex promoter sequence characteristics and achieve significantly higher accuracy compared to the previously developed promoter prediction programs. We also propose random substitution procedure to discover positionally conserved promoter functional elements. As the suggested approach does not require knowledge of any specific promoter features, it can be easily extended to identify promoters and other complex functional regions in sequences of many other and especially newly sequenced genomes. The CNNProm program is available to run at web server http://www.softberry.com.
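    Sequence CNNs such as these conventionally take one-hot-encoded DNA as input, one channel per base. The record does not describe CNNProm's preprocessing, so the following is just the standard encoding step:

```python
def one_hot_dna(seq):
    """Map a DNA string to a list of 4-channel one-hot vectors
    (A, C, G, T), the usual input layout for sequence CNNs."""
    table = {"A": 0, "C": 1, "G": 2, "T": 3}
    out = []
    for base in seq.upper():
        row = [0, 0, 0, 0]
        if base in table:          # unknown bases (e.g. N) stay all-zero
            row[table[base]] = 1
        out.append(row)
    return out

encoded = one_hot_dna("TATAAT")    # the E. coli sigma70 -10 box
print(encoded[0])  # T -> [0, 0, 0, 1]
```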

  1. GTOOLS: an Interactive Computer Program to Process Gravity Data for High-Resolution Applications

    NASA Astrophysics Data System (ADS)

    Battaglia, M.; Poland, M. P.; Kauahikaua, J. P.

    2012-12-01

    An interactive computer program, GTOOLS, has been developed to process gravity data acquired by the Scintrex CG-5 and LaCoste & Romberg EG, G and D gravity meters. The aim of GTOOLS is to provide a validated methodology for computing relative gravity values in a consistent way accounting for as many environmental factors as possible (e.g., tides, ocean loading, solar constraints, etc.), as well as instrument drift. The program has a modular architecture. Each processing step is implemented in a tool (function) that can be either run independently or within an automated task. The tools allow the user to (a) read the gravity data acquired during field surveys completed using different types of gravity meters; (b) compute Earth tides using an improved version of Longman's (1959) model; (c) compute ocean loading using the HARDISP code by Petit and Luzum (2010) and ocean loading harmonics from the TPXO7.2 ocean tide model; (d) estimate the instrument drift using linear functions as appropriate; and (e) compute the weighted least-square-adjusted gravity values and their errors. The corrections are performed up to microGal (μGal) precision, in accordance with the specifications of high-resolution surveys. The program has the ability to incorporate calibration factors that allow for surveys done using different gravimeters to be compared. Two additional tools (functions) allow the user to (1) estimate the instrument calibration factor by processing data collected by a gravimeter on a calibration range; (2) plot gravity time-series at a chosen benchmark. The interactive procedures and the program output (jpeg plots and text files) have been designed to ease data handling and archiving, to provide useful information for future data interpretation or modeling, and facilitate comparison of gravity surveys conducted at different times. All formulas have been checked for typographical errors in the original reference. GTOOLS, developed using Matlab, is open source and machine independent. We will demonstrate program use and utility with data from multiple microgravity surveys at Kilauea volcano, Hawai'i.
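    Linear drift estimation, step (d) in the record, fits a straight line to repeated readings at a base station and subtracts the trend from the survey. A hypothetical sketch with made-up reoccupation readings (GTOOLS itself is a MATLAB package):

```python
def fit_linear_drift(times, readings):
    """Least-squares fit of readings = g0 + rate * t at a repeatedly
    occupied base station; returns (g0, rate in reading-units/hour)."""
    n = len(times)
    mt = sum(times) / n
    mr = sum(readings) / n
    num = sum((t - mt) * (r - mr) for t, r in zip(times, readings))
    den = sum((t - mt) ** 2 for t in times)
    rate = num / den
    return mr - rate * mt, rate

# Hypothetical base-station reoccupations: hours since start, mGal
t = [0.0, 2.0, 4.0, 6.0]
g = [1000.000, 1000.012, 1000.024, 1000.036]   # 0.006 mGal/h drift
g0, rate = fit_linear_drift(t, g)
corrected = [r - rate * ti for ti, r in zip(t, g)]
print(round(rate, 3))
```

After removing the fitted trend, all four corrected base readings agree, which is the consistency check a drift correction is meant to deliver.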

  2. φ-evo: A program to evolve phenotypic models of biological networks.

    PubMed

    Henry, Adrien; Hemery, Mathieu; François, Paul

    2018-06-01

    Molecular networks are at the core of most cellular decisions, but are often difficult to comprehend. Reverse engineering of network architecture from their functions has proved fruitful to classify and predict the structure and function of molecular networks, suggesting new experimental tests and biological predictions. We present φ-evo, an open-source program to evolve in silico phenotypic networks performing a given biological function. We include implementations for evolution of biochemical adaptation, adaptive sorting for immune recognition, metazoan development (somitogenesis, hox patterning), as well as Pareto evolution. We detail the program architecture based on C, Python 3, and a Jupyter interface for project configuration and network analysis. We illustrate the predictive power of φ-evo by first recovering the asymmetrical structure of the lac operon regulation from an objective function with symmetrical constraints. Second, we use the problem of hox-like embryonic patterning to show how a single effective fitness can emerge from multi-objective (Pareto) evolution. φ-evo provides an efficient approach and user-friendly interface for the phenotypic prediction of networks and the numerical study of evolution itself.

  3. Enhanced TCAS 2/CDTI traffic Sensor digital simulation model and program description

    NASA Technical Reports Server (NTRS)

    Goka, T.

    1984-01-01

    Digital simulation models of enhanced TCAS 2/CDTI traffic sensors are developed, based on actual or projected operational and performance characteristics. Two enhanced Traffic (or Threat) Alert and Collision Avoidance Systems are considered. A digital simulation program is developed in FORTRAN. The program contains an executive with a semireal time batch processing capability. The simulation program can be interfaced with other modules with a minimum requirement. Both the traffic sensor and CAS logic modules are validated by means of extensive simulation runs. Selected validation cases are discussed in detail, and capabilities and limitations of the actual and simulated systems are noted. The TCAS systems are not specifically intended for Cockpit Display of Traffic Information (CDTI) applications. These systems are sufficiently general to allow implementation of CDTI functions within the real systems' constraints.

  4. General MACOS Interface for Modeling and Analysis for Controlled Optical Systems

    NASA Technical Reports Server (NTRS)

    Sigrist, Norbert; Basinger, Scott A.; Redding, David C.

    2012-01-01

    The General MACOS Interface (GMI) for Modeling and Analysis for Controlled Optical Systems (MACOS) enables the use of MATLAB as a front-end for JPL's critical optical modeling package, MACOS. MACOS is JPL's in-house optical modeling software, which has proven to be a superb tool for advanced systems engineering of optical systems. GMI, coupled with MACOS, allows for seamless interfacing with modeling tools from other disciplines to make possible integration of dynamics, structures, and thermal models with the addition of control systems for deformable optics and other actuated optics. This software package is designed as a tool for analysts to quickly and easily use MACOS without needing to be an expert at programming MACOS. The strength of MACOS is its ability to interface with various modeling/development platforms, allowing evaluation of system performance with thermal, mechanical, and optical modeling parameter variations. GMI provides an improved means for accessing selected key MACOS functionalities. The main objective of GMI is to marry the vast mathematical and graphical capabilities of MATLAB with the powerful optical analysis engine of MACOS, thereby providing a useful tool to anyone who can program in MATLAB. GMI also improves modeling efficiency by eliminating the need to write an interface function for each task/project, reducing error sources, speeding up user/modeling tasks, and making MACOS well suited for fast prototyping.

  5. ProQ3: Improved model quality assessments using Rosetta energy terms

    PubMed Central

    Uziela, Karolis; Shu, Nanjiang; Wallner, Björn; Elofsson, Arne

    2016-01-01

    Quality assessment of protein models using no other information than the structure of the model itself has been shown to be useful for structure prediction. Here, we introduce two novel methods, ProQRosFA and ProQRosCen, inspired by the state-of-the-art method ProQ2 but using a completely different description of a protein model. ProQ2 uses contacts and other features calculated from a model, while the new predictors are based on Rosetta energies: ProQRosFA uses the full-atom energy function that takes into account all atoms, while ProQRosCen uses the coarse-grained centroid energy function. The two new predictors also include residue conservation and terms corresponding to the agreement of a model with predicted secondary structure and surface area, as in ProQ2. We show that the performance of these predictors is on par with ProQ2 and significantly better than all other model quality assessment programs. Furthermore, we show that by combining the input features from all three predictors, the resulting predictor, ProQ3, performs better than any of the individual methods. ProQ3, ProQRosFA and ProQRosCen are freely available both as a webserver and stand-alone programs at http://proq3.bioinfo.se/. PMID:27698390

  6. Matrix management for aerospace 2000

    NASA Technical Reports Server (NTRS)

    Mccarthy, J. F., Jr.

    1980-01-01

    The matrix management approach to program management is an organized effort for attaining program objectives by defining and structuring all elements so as to form a single system whose parts are united by interaction. The objective of the systems approach is uncompromisingly complete coverage of the program management endeavor. Starting with an analysis of the functions necessary to carry out a given program, a model must be defined; a matrix of responsibility assignment must be prepared; and each operational process must be examined to establish how it is to be carried out and how it relates to all other processes.

  7. 45 CFR 310.10 - What are the functional requirements for the Model Tribal IV-D System?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Number; and (E) Participant Identification Number; (ii) Delinquency and enforcement activities; (iii... operations and to assess program performance through the audit of financial and statistical data maintained...

  8. Multi-Modal Transportation System Simulation

    DOT National Transportation Integrated Search

    1971-01-01

    The present status of a laboratory being developed for real-time simulation of command and control functions in transportation systems is discussed. Details are given on the simulation models and on programming techniques used in defining and evaluat...

  9. Geometric models, antenna gains, and protection ratios as developed for BC SAT-R2 conference software

    NASA Technical Reports Server (NTRS)

    Miller, E. F.

    1982-01-01

    Mathematical models are described that were used in the software package developed for the 1983 Regional Administrative Radio Conference on broadcasting satellites. The models described are those used in the Spectrum Orbit Utilization Program (SOUP) analysis. The geometric relationships necessary to model broadcasting satellite systems are discussed. Antenna models represent copolarized and cross-polarized performance as functions of the off-axis angle. The protection ratio is modelled as a co-channel value and a template representing systems with frequency offsets.
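Reference antenna patterns of this kind are typically gain masks in the off-axis angle: a parabolic main-lobe rolloff that is clipped to a sidelobe floor far off boresight. The sketch below illustrates that common shape; the function name, the -12(θ/θ0)² rolloff coefficient, and all default values are illustrative assumptions, not the actual SOUP conference parameters.

```python
def copolar_gain_dbi(theta_deg, g_max_dbi=40.0, theta_0_deg=1.0, floor_dbi=0.0):
    """Reference-style copolar pattern (all values in dB):
    parabolic rolloff g_max - 12*(theta/theta_0)^2 near boresight,
    clipped to a constant far-sidelobe floor."""
    rolloff = g_max_dbi - 12.0 * (theta_deg / theta_0_deg) ** 2
    return max(rolloff, floor_dbi)

# On boresight the pattern returns the peak gain; far off axis it hits the floor.
on_axis = copolar_gain_dbi(0.0)      # peak gain
half_beam = copolar_gain_dbi(0.5)    # 3 dB below peak at theta_0/2
far_off = copolar_gain_dbi(10.0)     # sidelobe floor
```

A cross-polarized template would follow the same structure with a lower peak and its own rolloff coefficients.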

  10. Software for Building Models of 3D Objects via the Internet

    NASA Technical Reports Server (NTRS)

    Schramer, Tim; Jensen, Jeff

    2003-01-01

    The Virtual EDF Builder (where EDF signifies Electronic Development Fixture) is a computer program that facilitates the use of the Internet for building and displaying digital models of three-dimensional (3D) objects that ordinarily comprise assemblies of solid models created previously by use of computer-aided-design (CAD) programs. The Virtual EDF Builder resides on a Unix-based server computer. It is used in conjunction with a commercially available Web-based plug-in viewer program that runs on a client computer. The Virtual EDF Builder acts as a translator between the viewer program and a database stored on the server. The translation function includes the provision of uniform resource locator (URL) links to other Web-based computer systems and databases. The Virtual EDF builder can be used in two ways: (1) If the client computer is Unix-based, then it can assemble a model locally; the computational load is transferred from the server to the client computer. (2) Alternatively, the server can be made to build the model, in which case the server bears the computational load and the results are downloaded to the client computer or workstation upon completion.

  11. Quantitative reactive modeling and verification.

    PubMed

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  12. Train repathing in emergencies based on fuzzy linear programming.

    PubMed

    Meng, Xuelei; Cui, Bingmou

    2014-01-01

    Train pathing is the problem of assigning train trips to sets of rail segments, such as rail tracks and links. This paper focuses on the train pathing problem of determining the paths of train trips in emergencies. We analyze the influencing factors of train pathing, such as transfer cost, running cost, and social adverse-effect cost. With overall consideration of segment and station capacity constraints, we build a fuzzy linear programming model to solve the train pathing problem. We design fuzzy membership functions to describe the fuzzy coefficients. Furthermore, contraction-expansion factors are introduced to contract or expand the value ranges of the fuzzy coefficients, coping with the uncertainty of those value ranges. We propose a method based on triangular fuzzy coefficients that transforms the train pathing model (a fuzzy linear programming model) into a determinate linear model. An emergency scenario is constructed based on real data from the Beijing-Shanghai Railway. The model was solved, and the computational results demonstrate the validity of the model and the efficiency of the algorithm.
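The key transformation step, replacing triangular fuzzy coefficients with crisp ones so the model becomes an ordinary LP, can be sketched as follows. The graded-mean value (l + 4m + u)/6 used here is one standard crisp representative of a triangular fuzzy number; the paper's exact ranking function and the path costs below are illustrative assumptions.

```python
def graded_mean(tfn):
    """Crisp representative of a triangular fuzzy number (l, m, u)
    via the graded mean integration value (l + 4m + u) / 6."""
    l, m, u = tfn
    return (l + 4 * m + u) / 6.0

def defuzzify_costs(fuzzy_costs):
    """Turn fuzzy per-path cost coefficients into crisp ones so the
    train-pathing model can be solved as a determinate LP."""
    return [graded_mean(c) for c in fuzzy_costs]

# Hypothetical fuzzy running costs for three candidate paths, each (low, mode, high):
fuzzy = [(8, 10, 13), (9, 9, 9), (7, 12, 20)]
crisp = defuzzify_costs(fuzzy)
best_path = min(range(len(crisp)), key=crisp.__getitem__)
```

With the coefficients defuzzified, the segment and station capacity constraints stay linear and any LP solver applies directly.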

  13. Using Cultural Modeling to Inform a NEDSS-Compatible System Functionality Evaluation

    PubMed Central

    Anderson, Olympia; Torres-Urquidy, Miguel

    2013-01-01

    Objective The culture by which public health professionals work defines their organizational objectives, expectations, policies, and values. These aspects of culture are often intangible and difficult to qualify. The introduction of an information system could further complicate the culture of a jurisdiction if the intangibles of a culture are not clearly understood. This report describes how cultural modeling can be used to capture intangible elements or factors that may affect NEDSS-compatible (NC) system functionalities within the culture of public health jurisdictions. Introduction The National Notifiable Disease Surveillance System (NNDSS) comprises many activities including collaborations, processes, standards, and systems which support gathering data from US states and territories. As part of NNDSS, the National Electronic Disease Surveillance System (NEDSS) provides the standards, tools, and resources to support reporting public health jurisdictions (jurisdictions). The NEDSS Base System (NBS) is a CDC-developed, software application available to jurisdictions to collect, manage, analyze and report national notifiable disease (NND) data. An evaluation of NEDSS with the objective of identifying the functionalities of NC systems and the impact of these features on the user’s culture is underway. Methods We used cultural models to capture additional NC system functionality gaps within the culture of the user. Cultural modeling is a process of graphically depicting people and organizations referred to as influencers and the intangible factors that affect the user’s operations or work as influences. Influencers are denoted as bubbles while influences are depicted as arrows penetrating the bubbles. In the cultural model, influence can be seen by the size and proximity (or lack of) in the model. We restricted the models to secondary data sources and interviews of CDC programs (data users) and public health jurisdictions (data reporters). 
Results Three cultural models were developed from the secondary information sources; these models include the NBS vendor, public health jurisdiction (jurisdiction) activities, and NEDSS technical consultants. The vendor cultural model identified channels of communication about functionalities flowing from the vendor and the NBS users, with CDC as the approval mechanism. The jurisdiction activities model highlighted perceived issues external to the organization that had some impact on the organization. Key disconnecting issues in the jurisdiction model included situational awareness, data competency, and bureaucracy. This model also identified poor coordination as a major influencer of the jurisdiction’s activities. The NEDSS technical model identified major issues and disconnects among data access, capture and reporting, processing, and ELR functionalities (Figure 1). The data processing functionality emerged as the largest negative influence, with issues that included loss of data specificity, lengthy submission strategies, and risk of data use. Collectively, the models depict issues with system functionality but mostly identify other factors that may influence how jurisdictions use the system and, moreover, which functionalities should be included. Conclusions By using the cultural model as a guide, we are able to clarify complex relationships using multiple data sources and improve our understanding of the impacts of the NC system functionalities on users’ operations. Modeling the recipients of the data (e.g. CDC programs) will provide insight into additional factors that may inform the NEDSS evaluation.

  14. Analysis of economics of a TV broadcasting satellite for additional nationwide TV programs

    NASA Technical Reports Server (NTRS)

    Becker, D.; Mertens, G.; Rappold, A.; Seith, W.

    1977-01-01

    The influence of a TV broadcasting satellite, transmitting four additional TV networks was analyzed. It is assumed that the cost of the satellite systems will be financed by the cable TV system operators. The additional TV programs increase income by attracting additional subscribers. Two economic models were established: (1) each local network is regarded as an independent economic unit with individual fees (cost price model) and (2) all networks are part of one public cable TV company with uniform fees (uniform price model). Assumptions are made for penetration as a function of subscription rates. Main results of the study are: the installation of a TV broadcasting satellite improves the economics of CTV-networks in both models; the overall coverage achievable by the uniform price model is significantly higher than that achievable by the cost price model.

  15. Genetic programming-based mathematical modeling of influence of weather parameters in BOD5 removal by Lemna minor.

    PubMed

    Chandrasekaran, Sivapragasam; Sankararajan, Vanitha; Neelakandhan, Nampoothiri; Ram Kumar, Mahalakshmi

    2017-11-04

    This study, through extensive experiments and mathematical modeling, reveals that, in addition to retention time and wastewater temperature (T_w), atmospheric parameters also play an important role in the effective functioning of an aquatic macrophyte-based treatment system. The duckweed species Lemna minor is considered in this study. It is observed that the combined effect of atmospheric temperature (T_atm), wind speed (U_w), and relative humidity (RH) can be reflected through one parameter, namely the "apparent temperature" (T_a). A total of eight different models are considered based on combinations of input parameters, and the best mathematical model is arrived at, which is validated through a new experimental set-up outside the modeling period. The validation results are highly encouraging. Genetic programming (GP)-based models are found to reveal deeper understanding of the wetland process.

  16. Wing Leading Edge RCC Rapid Response Damage Prediction Tool (IMPACT2)

    NASA Technical Reports Server (NTRS)

    Clark, Robert; Cotter, Paul; Michalopoulos, Constantine

    2013-01-01

    This rapid response computer program predicts Orbiter Wing Leading Edge (WLE) damage caused by ice or foam impact during a Space Shuttle launch (Program "IMPACT2"). The program was developed after the Columbia accident in order to assess quickly WLE damage due to ice, foam, or metal impact (if any) during a Shuttle launch. IMPACT2 simulates an impact event in a few minutes for foam impactors, and in seconds for ice and metal impactors. The damage criterion is derived from results obtained from one sophisticated commercial program, which requires hours to carry out simulations of the same impact events. The program was designed to run much faster than the commercial program, with prediction of projectile threshold velocities within 10 to 15% of commercial-program values. The mathematical model involves coupling of Orbiter wing normal modes of vibration to nonlinear or linear spring-mass models. IMPACT2 solves nonlinear or linear impact problems using classical normal modes of vibration of a target, and nonlinear/linear time-domain equations for the projectile. Impact loads and stresses developed in the target are computed as functions of time. This model is novel because of its speed of execution. A typical model of foam, or another projectile characterized by material nonlinearities, impacting an RCC panel is executed in minutes instead of the hours needed by the commercial programs. Target damage due to impact can be assessed quickly, provided that target vibration modes and allowable stress are known.
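The coupling described, target normal modes driven through a contact spring by a projectile mass, can be illustrated with a single-mode toy version: one modal oscillator, one contact spring that only pushes, and explicit time integration of the load history. Every parameter below is invented for illustration; none are Orbiter or IMPACT2 values.

```python
def impact_load_history(m_proj=1.0, v0=10.0, k_contact=1.0e4,
                        m_modal=5.0, k_modal=4.0e4, dt=1.0e-5, t_end=0.05):
    """Semi-implicit Euler integration of a projectile (m_proj, speed v0)
    striking one target vibration mode through a unilateral contact spring.
    Returns the peak contact force over the simulated interval."""
    x_p, v_p = 0.0, v0          # projectile position / velocity
    x_t, v_t = 0.0, 0.0         # modal coordinate / velocity of the target
    peak = 0.0
    for _ in range(int(t_end / dt)):
        pen = x_p - x_t                         # penetration depth
        f = k_contact * pen if pen > 0.0 else 0.0  # contact only pushes
        peak = max(peak, f)
        a_p = -f / m_proj                       # projectile decelerates
        a_t = (f - k_modal * x_t) / m_modal     # mode driven by contact load
        v_p += a_p * dt; x_p += v_p * dt
        v_t += a_t * dt; x_t += v_t * dt
    return peak
```

In this linear toy model the peak load scales with impact speed, which is the kind of threshold-velocity relationship the damage criterion exploits.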

  17. Launch and Landing Effects Ground Operations (LLEGO) Model

    NASA Technical Reports Server (NTRS)

    2008-01-01

    LLEGO is a model for understanding recurring launch and landing operations costs at Kennedy Space Center for human space flight. Launch and landing operations are often referred to as ground processing, or ground operations. Currently, this function is specific to the ground operations for the Space Shuttle Space Transportation System within the Space Shuttle Program. The Constellation system to follow the Space Shuttle consists of the crewed Orion spacecraft atop an Ares I launch vehicle and the uncrewed Ares V cargo launch vehicle. The Constellation flight and ground systems build upon many elements of the existing Shuttle flight and ground hardware, as well as upon existing organizations and processes. In turn, the LLEGO model builds upon past ground operations research, modeling, data, and experience in estimating for future programs. Rather than simply providing estimates, the LLEGO model's main purpose is to improve expenses by relating the complex interactions among functions (ground operations contractor, subcontractors, civil service technical, center management, operations, etc.) to tangible drivers. Drivers include flight system complexity and reliability, as well as operations and supply chain management processes and technology. Together these factors define the operability and potential improvements for any future system, from the most direct to the least direct expenses.

  18. Computer modelling of cyclic deformation of high-temperature materials. Technical progress report, 1 September-30 November 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duesbery, M.S.

    1993-11-30

    This program aims at improving current methods of lifetime assessment by building in the characteristics of the micro-mechanisms known to be responsible for damage and failure. The broad approach entails the integration and, where necessary, augmentation of the micro-scale research results currently available in the literature into a macro-scale model with predictive capability. In more detail, the program will develop a set of hierarchically structured models at different length scales, from atomic to macroscopic, at each level taking as parametric input the results of the model at the next smaller scale. In this way the known microscopic properties can be transported by systematic procedures to the unknown macro-scale region. It may not be possible to eliminate empiricism completely, because some of the quantities involved cannot yet be estimated to the required degree of precision. In this case the aim will be at least to eliminate functional empiricism. Restriction of empiricism to the choice of parameters to be input to known functional forms permits some confidence in extrapolation procedures and has the advantage that the models can readily be updated as better estimates of the parameters become available.

  19. Three-level global resource allocation model for hiv control: A hierarchical decision system approach.

    PubMed

    Kassa, Semu Mitiku

    2018-02-01

    Funds from various global organizations, such as The Global Fund, The World Bank, etc., are not directly distributed to the targeted risk groups. Especially in so-called third-world countries, the major part of the funding for HIV prevention programs comes from these global funding organizations. The allocations of these funds usually pass through several levels of decision-making bodies that have their own specific parameters to control and specific objectives to achieve. However, these decisions are made mostly in a heuristic manner, and this may lead to a non-optimal allocation of the scarce resources. In this paper, a hierarchical mathematical optimization model is proposed to solve such a problem. Combining existing epidemiological models with the kinds of interventions in practice, a 3-level hierarchical decision-making model for optimally allocating such resources has been developed and analyzed. When the impact of antiretroviral therapy (ART) is included in the model, it has been shown that the objective function of the lower-level decision-making structure is a non-convex minimization problem in the allocation variables, even if all the production functions for the intervention programs are assumed to be linear.

  20. Specification for a standard radar sea clutter model

    NASA Astrophysics Data System (ADS)

    Paulus, Richard A.

    1990-09-01

    A model for the average sea clutter radar cross section is proposed for the Oceanographic and Atmospheric Master Library. This model is a function of wind speed (or sea state), wind direction relative to the antenna, refractive conditions, radar antenna height, frequency, polarization, horizontal beamwidth, and compressed pulse length. The model is fully described, a FORTRAN 77 computer listing is provided, and test cases are given to demonstrate the proper operation of the program.

  1. Aircraft model prototypes which have specified handling-quality time histories

    NASA Technical Reports Server (NTRS)

    Johnson, S. H.

    1976-01-01

    Several techniques for obtaining linear constant-coefficient airplane models from specified handling-quality time histories are discussed. One technique, the pseudodata method, solves the basic problem, yields specified eigenvalues, and accommodates state-variable transfer-function zero suppression. The method is fully illustrated for a fourth-order stability-axis small-motion model with three lateral handling-quality time histories specified. The FORTRAN program which obtains and verifies the model is included and fully documented.

  2. Modeling Limited Foresight in Water Management Systems

    NASA Astrophysics Data System (ADS)

    Howitt, R.

    2005-12-01

    The inability to forecast future water supplies means that their management inevitably occurs under situations of limited foresight. Three modeling problems arise: first, what type of objective function is a manager with limited foresight optimizing? Second, how can we measure these objectives? Third, can objective functions that incorporate uncertainty be integrated within the structure of optimizing water management models? The paper reviews the concepts of relative risk aversion and intertemporal substitution that underlie stochastic dynamic preference functions. Some initial results from the estimation of such functions for four different dam operations in northern California are presented and discussed. It appears that the path of previous water decisions and states influences the decision-makers' willingness to trade off water supplies between periods. A compromise modeling approach that incorporates carry-over value functions under limited foresight within a broader network optimal water management model is developed. The approach uses annual carry-over value functions derived from small-dimension stochastic dynamic programs embedded within a larger-dimension water allocation network. The disaggregation of the carry-over value functions to the broader network is extended using the space rule concept. Initial results suggest that the solution of such annual nonlinear network optimizations is comparable to, or faster than, the solution of linear network problems over long time series.
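A carry-over value function of the kind described can be computed by backward induction on a small discretized storage grid: the value of ending a year with storage s is the expected best trade-off between releasing water now and carrying it over. The sketch below is a minimal such stochastic DP; the storage grid, two-point inflow distribution, and concave square-root benefit are all invented for illustration.

```python
def carryover_values(n_stages=12, storages=range(0, 5),
                     inflows=((1, 0.5), (2, 0.5)), capacity=4):
    """Backward-induction stochastic DP over a storage grid:
    V[s] = E_inflow[ max over releases of benefit(release) + V(carryover) ],
    with a concave (sqrt) release benefit and spill above capacity."""
    V = {s: 0.0 for s in storages}          # terminal carry-over values
    for _ in range(n_stages):
        V_new = {}
        for s in storages:
            ev = 0.0
            for inflow, prob in inflows:
                water = min(s + inflow, capacity)   # spill above capacity
                best = max(rel ** 0.5 + V[water - rel]
                           for rel in range(water + 1))
                ev += prob * best
            V_new[s] = ev
        V = V_new
    return V

values = carryover_values()
```

In the compromise approach described above, a small table like `values` would then be attached as the terminal value of storage nodes inside the larger allocation network.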

  3. Imfit: A Fast, Flexible Program for Astronomical Image Fitting

    NASA Astrophysics Data System (ADS)

    Erwin, Peter

    2014-08-01

    Imfit is an open-source astronomical image-fitting program, specialized for galaxies but potentially useful for other sources, which is fast, flexible, and highly extensible. Its object-oriented design allows new types of image components (2D surface-brightness functions) to be easily written and added to the program. Image functions provided with Imfit include Sersic, exponential, and Gaussian profiles for galaxy decompositions, along with Core-Sersic and broken-exponential profiles, elliptical rings, and three components that perform line-of-sight integration through 3D luminosity-density models of disks and rings seen at arbitrary inclinations. Available minimization algorithms include Levenberg-Marquardt, Nelder-Mead simplex, and Differential Evolution, allowing trade-offs between speed and decreased sensitivity to local minima in the fit landscape. Minimization can be done using the standard chi^2 statistic (using either data or model values to estimate per-pixel Gaussian errors, or else user-supplied error images) or the Cash statistic; the latter is particularly appropriate for cases of Poisson data in the low-count regime. The C++ source code for Imfit is available under the GNU Public License.

  4. Use of nonlinear programming to optimize performance response to energy density in broiler feed formulation.

    PubMed

    Guevara, V R

    2004-02-01

    A nonlinear programming optimization model was developed to maximize margin over feed cost in broiler feed formulation and is described in this paper. The model identifies the optimal feed mix that maximizes profit margin. The optimum metabolizable energy level and performance were found by using Excel Solver nonlinear programming. Data from an energy density study with broilers were fitted to quadratic equations to express weight gain, feed consumption, and the objective function, income over feed cost, in terms of energy density. Nutrient:energy ratio constraints were transformed into equivalent linear constraints. National Research Council nutrient requirements and feeding program were used for examining changes in variables. The nonlinear programming feed formulation method was used to illustrate the effects of changes in different variables on the optimum energy density, performance, and profitability, and was compared with conventional linear programming. To demonstrate the capabilities of the model, I determined the impact of variation in prices. Prices for broiler, corn, fish meal, and soybean meal were increased and decreased by 25%. Formulations were identical in all other respects. Energy density, margin, and diet cost changed compared with the conventional linear programming formulation. This study suggests that nonlinear programming can be more useful than conventional linear programming to optimize performance response to energy density in broiler feed formulation because an energy level does not need to be set.
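Because gain, intake, and income over feed cost are each fitted as quadratics in energy density, the margin objective is itself quadratic, and the unconstrained optimum sits at the vertex -b/(2a), clipped to the feasible energy range. The sketch below shows that closed-form step; the coefficients and the 2.8-3.4 Mcal/kg range are invented for illustration, not the paper's fitted values.

```python
def optimal_energy_density(a, b, c, lo, hi):
    """Maximize margin(E) = a*E^2 + b*E + c over [lo, hi].
    For a concave curve (a < 0) the optimum is the vertex -b/(2a),
    clipped to the feasible range; otherwise it is an endpoint."""
    if a >= 0:
        # Convex or linear objective: the maximum lies at an endpoint.
        f = lambda e: a * e * e + b * e + c
        return lo if f(lo) >= f(hi) else hi
    return min(max(-b / (2.0 * a), lo), hi)

# Hypothetical fitted margin curve peaking at E = 3.1 Mcal/kg:
e_star = optimal_energy_density(a=-2.0, b=12.4, c=-10.0, lo=2.8, hi=3.4)
```

This is the advantage the abstract points to: linear programming must fix the energy level in advance, while the quadratic objective lets the optimal level fall out of the optimization.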

  5. Proceedings of the First Joint NASA Cardiopulmonary Workshop

    NASA Technical Reports Server (NTRS)

    Fortney, Suzanne M. (Editor); Hargens, Alan R. (Editor)

    1991-01-01

    The topics covered include the following: flight echocardiography, pulmonary function, central hemodynamics, glycerol hyperhydration, spectral analysis, lower body negative pressure countermeasures, orthostatic tolerance, autonomic function, cardiac deconditioning, fluid and renal responses to head-down tilt, local fluid regulation, endocrine regulation during bed rest, autogenic feedback, and chronic cardiovascular measurements. The program ended with a general discussion of weightlessness models and countermeasures.

  6. Equations for predicting uncompacted crown ratio based on compacted crown ratio and tree attributes.

    Treesearch

    Vicente J. Monleon; David Azuma; Donald Gedney

    2004-01-01

    Equations to predict uncompacted crown ratio as a function of compacted crown ratio, tree diameter, and tree height are developed for the main tree species in Oregon, Washington, and California using data from the Forest Health Monitoring Program, USDA Forest Service. The uncompacted crown ratio was modeled with a logistic function and fitted using weighted, nonlinear...
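A logistic model is a natural choice here because it keeps the predicted crown ratio inside (0, 1). The sketch below shows the functional form; the coefficient values are placeholders, not the fitted Forest Health Monitoring values.

```python
import math

def uncompacted_crown_ratio(ccr, dbh_cm, height_m,
                            b0=-0.5, b1=3.0, b2=0.01, b3=-0.02):
    """Logistic prediction UCR = 1 / (1 + exp(-(b0 + b1*CCR + b2*DBH + b3*H))),
    bounding the prediction in (0, 1) as a crown ratio must be.
    Coefficients are hypothetical placeholders for a fitted species model."""
    z = b0 + b1 * ccr + b2 * dbh_cm + b3 * height_m
    return 1.0 / (1.0 + math.exp(-z))

# With these placeholder coefficients, a larger compacted crown ratio
# yields a larger predicted uncompacted ratio, as expected.
u_low = uncompacted_crown_ratio(0.3, 30.0, 20.0)
u_high = uncompacted_crown_ratio(0.6, 30.0, 20.0)
```

In practice one such set of coefficients would be fitted per species, using weighted nonlinear regression as the abstract notes.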

  7. Minitrack tracking function description, volume 2

    NASA Technical Reports Server (NTRS)

    Englar, T. S.; Mango, S. A.; Roettcher, C. A.; Watters, D. L.

    1973-01-01

    The minitrack tracking function is described and specific operations are identified. The subjects discussed are: (1) preprocessor listing, (2) minitrack hardware, (3) system calibration, (4) quadratic listing, and (5) quadratic flow diagram. Detailed information is provided on the construction of the tracking system and its operation. The calibration procedures are supported by mathematical models to show the application of the computer programs.

  8. Effects of theory of mind performance training on reducing bullying involvement in children and adolescents with high-functioning autism spectrum disorder

    PubMed Central

    Chen, Yu-Min; Liu, Tai-Ling; Hsiao, Ray C.; Hu, Huei-Fan

    2018-01-01

    Bullying involvement is prevalent among children and adolescents with autism spectrum disorder (ASD). This study examined the effects of theory of mind performance training (ToMPT) on reducing bullying involvement in children and adolescents with high-functioning ASD. Children and adolescents with high-functioning ASD completed ToMPT (n = 26) and social skills training (SST; n = 23) programs. Participants in both groups and their mothers rated the pretraining and posttraining bullying involvement of participants on the Chinese version of the School Bullying Experience Questionnaire. The paired t test was used to evaluate changes in bullying victimization and perpetration between the pretraining and posttraining assessments. Furthermore, the linear mixed-effect model was used to examine the difference in the training effect between the ToMPT and SST groups. The paired t test indicated that in the ToMPT group, the severities of both self-reported (p = .039) and mother-reported (p = .003) bullying victimization significantly decreased from the pretraining to posttraining assessments, whereas in the SST group, only self-reported bullying victimization significantly decreased (p = .027). The linear mixed-effect model indicated that compared with the SST program, the ToMPT program significantly reduced the severity of mother-reported bullying victimization (p = .041). The present study supports the effects of ToMPT on reducing mother-reported bullying victimization in children and adolescents with high-functioning ASD. PMID:29342210

  9. LIGO detector characterization with genetic programming

    NASA Astrophysics Data System (ADS)

    Cavaglia, Marco; Staats, Kai; Errico, Luciano; Mogushi, Kentaro; Gabbard, Hunter

    2017-01-01

    Genetic Programming (GP) is a supervised approach to Machine Learning. GP has for two decades been applied to a diversity of problems, from predictive and financial modelling to data mining, from code repair to optical character recognition and product design. GP uses a stochastic search, tournaments, and a fitness function to explore a solution space. GP evolves a population of individual programs, through multiple generations, following the principles of biological evolution (mutation and reproduction) to discover a model that best fits or categorizes features in a given data set. We apply GP to the categorization of LIGO noise and show that it can effectively be used to characterize the detector's non-astrophysical noise in both low-latency and offline searches. National Science Foundation award PHY-1404139.
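The tournament-plus-fitness loop just described can be sketched in a few lines. Real GP evolves program trees with crossover; the toy below evolves a single number with mutation only, as a stand-in for the noise-classification models, and all names and rates are illustrative.

```python
import random

def tournament_select(population, fitness, k=3, rng=random):
    """Draw k random individuals and return the fittest (lower = better)."""
    contenders = [rng.choice(population) for _ in range(k)]
    return min(contenders, key=fitness)

def evolve(population, fitness, mutate, generations=20, rng=random):
    """Minimal generational loop: fill each new population by
    tournament selection followed by mutation."""
    for _ in range(generations):
        population = [mutate(tournament_select(population, fitness, rng=rng))
                      for _ in population]
    return min(population, key=fitness)

# Toy problem: evolve a number toward 0 (stand-in for a classifier's error).
rng = random.Random(0)
best = evolve([rng.uniform(-10.0, 10.0) for _ in range(30)],
              fitness=abs,
              mutate=lambda x: x + rng.gauss(0.0, 0.5),
              rng=rng)
```

Swapping the float individuals for expression trees and adding crossover alongside mutation turns this skeleton into the full GP scheme the abstract describes.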

  10. Reuseable Objects Software Environment (ROSE): Introduction to Air Force Software Reuse Workshop

    NASA Technical Reports Server (NTRS)

    Cottrell, William L.

    1994-01-01

The Reusable Objects Software Environment (ROSE) is a common, consistent, consolidated implementation of software functionality using modern object-oriented software engineering, including designed-in reuse and adaptable requirements. ROSE is designed to minimize abstraction and reduce complexity. A planning model for the reverse engineering of selected objects through object-oriented analysis is depicted. Dynamic and functional modeling are used to develop the system design, the object design, the language, and a database management system. The return on investment and timelines for a ROSE pilot program are charted.

  11. High-performance space shuttle auxiliary propellant valve system

    NASA Technical Reports Server (NTRS)

    Smith, G. M.

    1973-01-01

    Several potential valve closures for the space shuttle auxiliary propulsion system (SS/APS) were investigated analytically and experimentally in a modeling program. The most promising of these were analyzed and experimentally evaluated in a full-size functional valve test fixture of novel design. The engineering investigations conducted for both model and scale evaluations of the SS/APS valve closures and functional valve fixture are described. Preliminary designs, laboratory tests, and overall valve test fixture designs are presented, and a final recommended flightweight SS/APS valve design is presented.

  12. QEDMOD: Fortran program for calculating the model Lamb-shift operator

    NASA Astrophysics Data System (ADS)

    Shabaev, V. M.; Tupitsyn, I. I.; Yerokhin, V. A.

    2018-02-01

We present the Fortran package QEDMOD for computing the model QED operator hQED, which can be used to account for the Lamb shift in accurate atomic-structure calculations. The package routines calculate the matrix elements of hQED with user-specified one-electron wave functions. The operator can be used to calculate the Lamb shift in many-electron atomic systems with a typical accuracy of a few percent, either by evaluating the matrix element of hQED with the many-electron wave function or by adding hQED to the Dirac-Coulomb-Breit Hamiltonian.

  13. An inverse problem strategy based on forward model evaluations: Gradient-based optimization without adjoint solves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aguilo Valentin, Miguel Alejandro

    2016-07-01

    This study presents a new nonlinear programming formulation for the solution of inverse problems. First, a general inverse problem formulation based on the compliance error functional is presented. The proposed error functional enables the computation of the Lagrange multipliers, and thus the first order derivative information, at the expense of just one model evaluation. Therefore, the calculation of the Lagrange multipliers does not require the solution of the computationally intensive adjoint problem. This leads to significant speedups for large-scale, gradient-based inverse problems.

  14. Training of attention functions in children with attention deficit hyperactivity disorder.

    PubMed

    Tucha, Oliver; Tucha, Lara; Kaumann, Gesa; König, Sebastian; Lange, Katharina M; Stasik, Dorota; Streather, Zoe; Engelschalk, Tobias; Lange, Klaus W

    2011-09-01

    Pharmacological treatment of children with ADHD has been shown to be successful; however, medication may not normalize attention functions. The present study was based on a neuropsychological model of attention and assessed the effect of an attention training program on attentional functioning of children with ADHD. Thirty-two children with ADHD and 16 healthy children participated in the study. Children with ADHD were randomly assigned to one of the two conditions, i.e., an attention training program which trained aspects of vigilance, selective attention and divided attention, or a visual perception training which trained perceptual skills, such as perception of figure and ground, form constancy and position in space. The training programs were applied in individual sessions, twice a week, for a period of four consecutive weeks. Healthy children did not receive any training. Alertness, vigilance, selective attention, divided attention, and flexibility were examined prior to and following the interventions. Children with ADHD were assessed and trained while on ADHD medications. Data analysis revealed that the attention training used in the present study led to significant improvements of various aspects of attention, including vigilance, divided attention, and flexibility, while the visual perception training had no specific effects. The findings indicate that attention training programs have the potential to facilitate attentional functioning in children with ADHD treated with ADHD drugs.

  15. AESOP- INTERACTIVE DESIGN OF LINEAR QUADRATIC REGULATORS AND KALMAN FILTERS

    NASA Technical Reports Server (NTRS)

    Lehtinen, B.

    1994-01-01

    AESOP was developed to solve a number of problems associated with the design of controls and state estimators for linear time-invariant systems. The systems considered are modeled in state-variable form by a set of linear differential and algebraic equations with constant coefficients. Two key problems solved by AESOP are the linear quadratic regulator (LQR) design problem and the steady-state Kalman filter design problem. AESOP is designed to be used in an interactive manner. The user can solve design problems and analyze the solutions in a single interactive session. Both numerical and graphical information are available to the user during the session. The AESOP program is structured around a list of predefined functions. Each function performs a single computation associated with control, estimation, or system response determination. AESOP contains over sixty functions and permits the easy inclusion of user defined functions. The user accesses these functions either by inputting a list of desired functions in the order they are to be performed, or by specifying a single function to be performed. The latter case is used when the choice of function and function order depends on the results of previous functions. The available AESOP functions are divided into several general areas including: 1) program control, 2) matrix input and revision, 3) matrix formation, 4) open-loop system analysis, 5) frequency response, 6) transient response, 7) transient function zeros, 8) LQR and Kalman filter design, 9) eigenvalues and eigenvectors, 10) covariances, and 11) user-defined functions. The most important functions are those that design linear quadratic regulators and Kalman filters. The user interacts with AESOP when using these functions by inputting design weighting parameters and by viewing displays of designed system response. Support functions obtain system transient and frequency responses, transfer functions, and covariance matrices. 
AESOP can also provide the user with open-loop system information including stability, controllability, and observability. The AESOP program is written in FORTRAN IV for interactive execution and has been implemented on an IBM 3033 computer using TSS 370. As currently configured, AESOP has a central memory requirement of approximately 2 megabytes. Memory requirements can be reduced by redimensioning arrays in the AESOP program. Graphical output requires adaptation of the AESOP plot routines to whatever device is available. The AESOP program was developed in 1984.
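The central LQR computation AESOP performs — solving the algebraic Riccati equation for a given plant and designer-chosen weighting matrices — can be sketched with modern tools. The double-integrator plant and weights below are hypothetical, and SciPy stands in for AESOP's FORTRAN routines:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical plant: a double integrator (position and velocity states).
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)          # state weighting chosen by the designer
R = np.array([[1.0]])  # control-energy weighting

# Solve the continuous-time algebraic Riccati equation, then form the gain.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)   # optimal state feedback u = -K x

# LQR guarantees a stable closed loop: all eigenvalues in the left half plane.
closed_loop_eigs = np.linalg.eigvals(A - B @ K)
print(np.all(closed_loop_eigs.real < 0))
```

The interactive loop AESOP supports amounts to adjusting Q and R, re-solving, and inspecting the resulting closed-loop response.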

  16. Analysis of the Multi Strategy Goal Programming for Micro-Grid Based on Dynamic ant Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Qiu, J. P.; Niu, D. X.

Micro-grid is one of the key technologies of future energy supply. Taking the economic planning, reliability, and environmental protection of a micro-grid as the basis, we analyze multi-strategy objective programming problems for a micro-grid containing wind power, solar power, battery storage, and a micro gas turbine. We establish a mathematical model of each source's generation characteristics and energy dissipation, and convert the micro-grid planning multi-objective function under different operating strategies into a single-objective model based on the AHP method. An example analysis shows that a dynamic ant mixed genetic algorithm can obtain the optimal power output for this model.
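The AHP step mentioned — collapsing the multi-objective function into a single objective — is commonly done by deriving criterion weights from a pairwise-comparison matrix (its principal eigenvector) and forming a weighted sum. A sketch with hypothetical comparison values:

```python
import numpy as np

# Hypothetical pairwise comparisons among three criteria
# (economy, reliability, environment): A[i, j] = importance of i over j.
A = np.array([[1.0,   3.0,   5.0],
              [1/3.0, 1.0,   2.0],
              [1/5.0, 1/2.0, 1.0]])

# AHP priority weights: the principal eigenvector, normalized to sum to 1.
vals, vecs = np.linalg.eig(A)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()

def single_objective(economy_cost, reliability_penalty, emissions):
    """Collapse the three (suitably scaled) objectives into one weighted sum."""
    return float(w @ np.array([economy_cost, reliability_penalty, emissions]))
```

Because the weights sum to one, equal objective scores scalarize to that common score, which gives a quick sanity check on the weighting.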

  17. Utility of an emulation and simulation computer model for air revitalization system hardware design, development, and test

    NASA Technical Reports Server (NTRS)

    Yanosy, J. L.; Rowell, L. F.

    1985-01-01

Efforts to make increasing use of suitable computer programs in the design of hardware have the potential to reduce expenditures. In this context, NASA has evaluated the benefits provided by software tools through an application to the Environmental Control and Life Support (ECLS) system. The present paper is concerned with the benefits obtained by employing simulation tools in the case of the Air Revitalization System (ARS) of a Space Station life support system. Attention is given to the ARS functions and components, a computer program overview, a SAND (solid amine water desorbed) bed model description, a model validation, and details regarding the simulation benefits.

  18. Algebraic tools for dealing with the atomic shell model. I. Wavefunctions and integrals for hydrogen-like ions

    NASA Astrophysics Data System (ADS)

    Surzhykov, Andrey; Koval, Peter; Fritzsche, Stephan

    2005-01-01

Today, the 'hydrogen atom model' is known to play its role not only in teaching the basic elements of quantum mechanics but also in building up effective theories in atomic and molecular physics, quantum optics, plasma physics, and even in the design of semiconductor devices. Therefore, the analytical as well as numerical solutions of the hydrogen-like ions are frequently required, both for analyzing experimental data and for carrying out quite advanced theoretical studies. In order to support fast and consistent access to these (Coulomb-field) solutions, we present here the DIRAC program, which was originally developed for studying the properties and dynamical behavior of hydrogen-like ions. In the present version, a set of MAPLE procedures is provided for the Coulomb wave and Green's functions by applying the (wave) equations from both the nonrelativistic and relativistic theory. Apart from interactive access to these functions, a number of radial integrals are also implemented in the DIRAC program, which may help the user construct transition amplitudes and cross sections as they occur frequently in the theory of ion-atom and ion-photon collisions. Program summary: Title of program: DIRAC. Catalogue number: ADUQ. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADUQ. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Licensing provisions: None. Computer for which the program is designed and has been tested: All computers with a license of the computer algebra package MAPLE [1]. Program language used: Maple 8 and 9. No. of lines in distributed program, including test data, etc.: 2186. No. of bytes in distributed program, including test data, etc.: 162 591. Distribution format: tar gzip file. CPC Program Library subprograms required: None. Nature of the physical problem: Analytical solutions of the hydrogen atom are widely used in very different fields of physics [2,3]. 
Despite the rather simple structure of the hydrogen-like ions, however, the underlying 'mathematics' is not always easy to deal with. Apart from the well-known level structure of these ions as obtained from either the Schrödinger or Dirac equation, many other properties are often needed. These properties are related to the interaction of the bound electron(s) with external particles and fields and, hence, require the evaluation of transition amplitudes, including wavefunctions and (transition) operators of quite different complexity. Although various special functions, such as the Laguerre polynomials, spherical harmonics, Whittaker functions, or the hypergeometric functions of various kinds, can be used in most cases to express these amplitudes in a concise form, their derivation is time-consuming and prone to error. Moreover, there exist a large number of mathematical relations among these functions which are difficult to remember in detail and which have often hampered quantitative studies in the past. Method of solution: A set of MAPLE procedures is developed which provides both the nonrelativistic and relativistic (analytical) solutions of the 'hydrogen atom model' and which facilitates the symbolic evaluation of various transition amplitudes. Restrictions on the complexity of the problem: Over the past decades, a large number of representations have been worked out for the hydrogenic wave and Green's functions, using different variables and coordinates [2]. Of these, the position-space representation in spherical coordinates is certainly of most practical interest and has been used as the basis of the present implementation. No attempt has been made by us so far to provide the wave and Green's functions also in momentum space, for which the relativistic momentum functions would have to be constructed numerically. 
Although the DIRAC program supports both symbolic and numerical computations, the latter are based on MAPLE's standard software floating-point algorithms and on the (attempted) precision as defined by the global Digits variable. Although the default, Digits = 10, appears sufficient for many computations, it often leads to a rather dramatic loss in the accuracy of the relativistic wave functions and integrals, mainly owing to MAPLE's imprecise internal evaluation of the corresponding special functions. Therefore, in order to avoid such computational difficulties, the Digits variable is set to 20 whenever the DIRAC program is (re-)loaded. Unusual features of the program: The DIRAC program has been designed for interactive work which, apart from the standard solutions and integrals of the hydrogen atom, also supports the use of (approximate) semirelativistic wave functions for both the bound and continuum states of the electron. To provide fast and accurate access to a number of radial integrals which arise frequently in applications, analytical expressions for these integrals have been implemented for the one-particle operators r, e, d/dr, j(kr) as well as for the (so-called) two-particle Slater integrals, which are needed to describe the Coulomb repulsion among the electrons. Further procedures of the DIRAC program concern, for instance, the conversion of physical results between different unit systems or different sets of quantum numbers. A brief description of all procedures available in the present version of the DIRAC program is given in the user manual Dirac-commands.pdf, which is distributed together with the code. Typical running time: Although the program replies promptly to most requests, the running time depends on the particular task. References: [1] Maple is a registered trademark of Waterloo Maple Inc. [2] H.A. Bethe and E.E. Salpeter, Quantum Mechanics of One- and Two-Electron Atoms, Springer, Berlin, 1957. [3] J. Eichler and W. Meyerhof, Relativistic Atomic Collisions, Academic Press, New York, 1995.

  19. The tailored activity program (TAP) to address behavioral disturbances in frontotemporal dementia: a feasibility and pilot study.

    PubMed

    O'Connor, Claire M; Clemson, Lindy; Brodaty, Henry; Low, Lee-Fay; Jeon, Yun-Hee; Gitlin, Laura N; Piguet, Olivier; Mioshi, Eneida

    2017-10-15

To explore the feasibility of implementing the Tailored Activity Program with a cohort of people with frontotemporal dementia and their carers (dyads). The Tailored Activity Program is an occupational therapy based intervention that involves working collaboratively with family carers and prescribes personalized activities for behavioral management in people with dementia. Twenty dyads were randomized into the study (Tailored Activity Program: n = 9; Control: n = 11) and assessed at baseline and 4 months. Qualitative analyses evaluated the feasibility and acceptability of the program for the frontotemporal dementia cohort, and quantitative analyses (linear mixed model analyses, Spearman's rho correlations) measured the impact of the program on the dyads. The Tailored Activity Program was an acceptable intervention for the frontotemporal dementia dyads. Qualitative analyses identified five themes: "carer perceived benefits", "carer readiness to change", "strategies used by carer to engage person with dementia", "barriers to the Tailored Activity Program uptake/implementation", and "person with dementia engagement". Quantitative outcomes showed an overall reduction of behavioral symptoms (F(18.34) = 8.073, p = 0.011) and maintenance of functional performance in the person with dementia (F(18.03) = 0.375, p = 0.548). This study demonstrates the potential for using an activity-based intervention such as the Tailored Activity Program in frontotemporal dementia. Service providers should recognize that while people with frontotemporal dementia present with challenging issues, tailored therapies may support their function and reduce their behavioral symptoms. Implications for rehabilitation The Tailored Activity Program is an occupational therapy based intervention that involves prescribing personalized activities for behavioral management in dementia. 
The Tailored Activity Program is an acceptable and feasible intervention approach to address some of the unique behavioral and functional impairments inherent in frontotemporal dementia.

  20. Operations research investigations of satellite power stations

    NASA Technical Reports Server (NTRS)

    Cole, J. W.; Ballard, J. L.

    1976-01-01

    A systems model reflecting the design concepts of Satellite Power Stations (SPS) was developed. The model is of sufficient scope to include the interrelationships of the following major design parameters: the transportation to and between orbits; assembly of the SPS; and maintenance of the SPS. The systems model is composed of a set of equations that are nonlinear with respect to the system parameters and decision variables. The model determines a figure of merit from which alternative concepts concerning transportation, assembly, and maintenance of satellite power stations are studied. A hybrid optimization model was developed to optimize the system's decision variables. The optimization model consists of a random search procedure and the optimal-steepest descent method. A FORTRAN computer program was developed to enable the user to optimize nonlinear functions using the model. Specifically, the computer program was used to optimize Satellite Power Station system components.
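The hybrid optimization strategy described — a random search phase followed by steepest descent — can be sketched as follows; the two-variable objective is a hypothetical stand-in for the SPS figure-of-merit equations:

```python
import numpy as np

# Hypothetical smooth figure-of-merit surrogate in two decision variables.
def f(x):
    return (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 0.5) ** 2

def num_grad(f, x, h=1e-6):
    """Central-difference gradient estimate."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

rng = np.random.default_rng(0)

# Phase 1: random search over the feasible box picks a promising start point.
candidates = rng.uniform(-5.0, 5.0, size=(200, 2))
x = min(candidates, key=f)

# Phase 2: steepest descent polishes the best random point.
for _ in range(500):
    x = x - 0.1 * num_grad(f, x)
```

The random phase guards against poor local starts on a nonlinear surface, while the descent phase supplies fast local convergence — the same division of labor the abstract describes.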

  1. A progressive 5-week exercise therapy program leads to significant improvement in knee function early after anterior cruciate ligament injury.

    PubMed

    Eitzen, Ingrid; Moksnes, Håvard; Snyder-Mackler, Lynn; Risberg, May Arna

    2010-11-01

    Prospective cohort study without a control group. Firstly, to present our 5-week progressive exercise therapy program in the early stage after anterior cruciate ligament (ACL) injury. Secondly, to evaluate changes in knee function after completion of the program for patients with ACL injury in general and also when classified as potential copers or noncopers, and, finally, to examine potential adverse events. Few studies concerning early-stage ACL rehabilitation protocols exist. Consequently, little is known about the tolerance for, and outcomes from, short-term exercise therapy programs in the early stage after injury. One-hundred patients were included in a 5-week progressive exercise therapy program, within 3 months after injury. Knee function before and after completion of the program was evaluated from isokinetic quadriceps and hamstrings muscle strength tests, 4 single-leg hop tests, 2 different self-assessment questionnaires, and a global rating of knee function. A 2-way mixed-model analysis of variance was conducted to evaluate changes from pretest to posttest for the limb symmetry index for muscle strength and single-leg hop tests, and the change in scores for the patient-reported questionnaires. In addition, absolute values and the standardized response mean for muscle strength and single-leg hop tests were calculated at pretest and posttest for the injured and uninjured limb. Adverse events during the 5-week period were recorded. The progressive 5-week exercise therapy program led to significant improvements (P<.05) in knee function from pretest to posttest both for patients classified as potential copers and noncopers. Standardized response mean values for changes in muscle strength and single-leg hop performance from pretest to posttest for the injured limb were moderate to strong (0.49-0.84), indicating the observed improvements to be clinically relevant. Adverse events occurred in 3.9% of the patients. 
Short-term progressive exercise therapy programs are well tolerated and should be incorporated in early-stage ACL rehabilitation, either to improve knee function before ACL reconstruction or as a first step in further nonoperative management. Therapy, level 2b.

  2. RADC Multi-Dimensional Signal-Processing Research Program.

    DTIC Science & Technology

    1980-09-30

Fragments from the report's table of contents and text: Formulation; 3.2.2 Methods of Accelerating Convergence; 3.2.3 Application to Image Deblurring; 3.2.4 Extensions; 3.3 Convergence of Iterative Signal... Noise-driven linear filters permit development of the joint probability density function, or likelihood function, for the image. With an expression... spatial linear filter driven by white noise (see Fig. 1). If the probability density function for the white noise is known... Fig. 1: Model for image

  3. Analysis of Potentially Hazardous Asteroids

    NASA Technical Reports Server (NTRS)

    Arnold, J. O.; Burkhard, C. D.; Dotson, J. L.; Prabhu, D. K.; Mathias, D. L.; Aftosmis, M. J.; Venkatapathy, Ethiraj; Morrison, D. D.; Sears, D. W. G.; Berger, M. J.

    2015-01-01

The National Aeronautics and Space Administration initiated a new project focused on Planetary Defense on October 1, 2014. The new project is funded by NASA's Near Earth Object Program (Lindley Johnson, Program Executive). This presentation describes the objectives, functions, and plans of the four tasks encompassed in the new project and their interrelations. Additionally, this project provides for outreach to facilitate partnerships with other organizations to help meet the objectives of the planetary defense community. The four tasks are (1) Characterization of Near Earth Asteroids, (2) Physics-Based Modeling of Meteor Entry and Breakup, (3) Surface Impact Modeling, and (4) Physics-Based Impact Risk Assessment.

  4. Validation of a Scalable Solar Sailcraft

    NASA Technical Reports Server (NTRS)

    Murphy, D. M.

    2006-01-01

The NASA In-Space Propulsion (ISP) program sponsored intensive solar sail technology and systems design, development, and hardware demonstration activities over the past 3 years. Efforts to validate a scalable solar sail system by functional demonstration in relevant environments, together with test-analysis correlation activities, have recently been successfully completed. A review of the program is presented, with descriptions of the design, results of testing, and analytical model validations of component and assembly functional, strength, stiffness, shape, and dynamic behavior. The scaled performance of the validated system is projected to demonstrate applicability to flight demonstration and important NASA road-map missions.

  5. Preliminary design data package, appendices C1 and C2

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The HYBRID2 program which computes the fuel and energy consumption of a hybrid vehicle with a bi-modal control strategy over specified component driving cycles is described. Fuel and energy consumption are computed separately for the two modes of operation. The program also computes yearly average fuel and energy consumption using a composite driving cycle which varies as a function of daily travel. The modelling techniques are described, and subroutines and their functions are given. The composition of modern automobiles is discussed along with the energy required to manufacture an American automobile. The energy required to scrap and recycle automobiles is also discussed.

  6. CAD of control systems: Application of nonlinear programming to a linear quadratic formulation

    NASA Technical Reports Server (NTRS)

    Fleming, P.

    1983-01-01

    The familiar suboptimal regulator design approach is recast as a constrained optimization problem and incorporated in a Computer Aided Design (CAD) package where both design objective and constraints are quadratic cost functions. This formulation permits the separate consideration of, for example, model following errors, sensitivity measures and control energy as objectives to be minimized or limits to be observed. Efficient techniques for computing the interrelated cost functions and their gradients are utilized in conjunction with a nonlinear programming algorithm. The effectiveness of the approach and the degree of insight into the problem which it affords is illustrated in a helicopter regulation design example.
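The recasting described — one quadratic cost as the design objective and another (e.g., control energy) as a constraint handled by a nonlinear programming algorithm — can be sketched with an off-the-shelf NLP solver (SLSQP here). The weighting matrices and energy limit are hypothetical:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical quadratic design objective and quadratic constraint.
Q1 = np.diag([1.0, 4.0])   # e.g. model-following error weighting
Q2 = np.diag([2.0, 1.0])   # e.g. control-energy weighting
limit = 1.0                # allowed control energy

def objective(k):
    """Quadratic design cost in the free design parameters k."""
    return k @ Q1 @ k - 2.0 * k.sum()

# Inequality constraint: feasible when fun(k) >= 0.
energy_con = {'type': 'ineq', 'fun': lambda k: limit - k @ Q2 @ k}

res = minimize(objective, np.zeros(2), method='SLSQP', constraints=[energy_con])
```

Separating objectives from limits this way is the practical appeal of the formulation: the designer minimizes one cost while the solver enforces the others as hard bounds.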

  7. Analysis of Effects of Organizational Behavior on Evolving System of Systems Acquisition Programs Through Agent Based Modeling

    DTIC Science & Technology

    2013-03-01

Fragments from the report: ...function is based on how individualistic or collectivistic a system is. Low individualism values mean the system is more collective and is less likely... Using Hofstede's cultural dimensions, integrated with a modified version of the Bak-Sneppen biological evolutionary model, this research highlights which set... (Hofstede's Cultural Dimensions)

  8. Operations Research techniques in the management of large-scale reforestation programs

    Treesearch

    Joseph Buongiorno; D.E. Teeguarden

    1978-01-01

    A reforestation planning system for the Douglas-fir region of the Western United States is described. Part of the system is a simulation model to predict plantation growth and to determine economic thinning regimes and rotation ages as a function of site characteristics, initial density, reforestation costs, and management constraints. A second model estimates the...

  9. Dynamic reserve selection: Optimal land retention with land-price feedbacks

    Treesearch

    Sandor F. Toth; Robert G. Haight; Luke W. Rogers

    2011-01-01

    Urban growth compromises open space and ecosystem functions. To mitigate the negative effects, some agencies use reserve selection models to identify conservation sites for purchase or retention. Existing models assume that conservation has no impact on nearby land prices. We propose a new integer program that relaxes this assumption via adaptive cost coefficients. Our...

  10. Beyond the Model: Building an Effective and Dynamic IT Curriculum

    ERIC Educational Resources Information Center

    Brewer, Jeffrey; Harriger, Alka; Mendonca, John

    2006-01-01

    A model curriculum, such as that developed by the ACM/SIGITE Curriculum Committee (2005), has two important functions. First, it provides a base structure for newly developing programs that can use it as a platform for articulating a curriculum. Second, it offers an existing curriculum framework that can be used for validation by existing…

  11. A Dynamic Process Model for Optimizing the Hospital Environment Cash-Flow

    NASA Astrophysics Data System (ADS)

    Pater, Flavius; Rosu, Serban

    2011-09-01

This article presents a new approach to some fundamental techniques for solving dynamic programming problems through the use of functional equations. We analyze the problem of minimizing the cost of treatment in a hospital environment. Mathematical modeling of this process leads to an optimal control problem with a finite horizon.
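The functional-equation technique referred to is the Bellman recursion V_t(s) = min_a [c(s, a) + V_{t+1}(s')]. A minimal finite-horizon sketch, with made-up treatment costs and transitions standing in for the hospital model:

```python
from functools import lru_cache

# Hypothetical stage costs and transitions for a two-state toy model.
cost = {('sick', 'treat'): 4, ('sick', 'wait'): 1,
        ('well', 'treat'): 2, ('well', 'wait'): 0}
step = {('sick', 'treat'): 'well', ('sick', 'wait'): 'sick',
        ('well', 'treat'): 'well', ('well', 'wait'): 'sick'}

@lru_cache(maxsize=None)  # memoization: each (state, t) is solved once
def V(state, t, horizon=3):
    """Bellman functional equation: optimal cost-to-go from (state, t)."""
    if t == horizon:
        return 0.0
    return min(cost[(state, a)] + V(step[(state, a)], t + 1, horizon)
               for a in ('treat', 'wait'))

print(V('sick', 0))  # → 3.0
```

The memoized recursion is exactly the functional-equation view of dynamic programming: the value function is defined implicitly by its own equation and solved by backward induction.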

  12. An integer programming model to optimize resource allocation for wildfire containment.

    Treesearch

    Geoffrey H. Donovan; Douglas B. Rideout

    2003-01-01

    Determining the specific mix of fire-fighting resources for a given fire is a necessary condition for identifying the minimum of the Cost Plus Net Value Change (C+NVC) function. Current wildland fire management models may not reliably do so. The problem of identifying the most efficient wildland fire organization is characterized mathematically using integer-...

  13. For the Arts To Have Meaning...A Model of Adult Education in Performing Arts Organizations.

    ERIC Educational Resources Information Center

    Kitinoja, L.; Heimlich, J. E.

    A model of adult education appears to function in the outreach programs of three Columbus (Ohio) performing arts organizations. The first tier represents the arts organization's board of trustees, and the second represents the internal administration of the company. Two administrative bodies are arbitrarily labelled as education and marketing,…

  14. Effects of an Updated Preventive Home Visit Program Based on a Systematic Structured Assessment of Care Needs for Ambulatory Frail Older Adults in Japan: A Randomized Controlled Trial.

    PubMed

    Kono, Ayumi; Izumi, Kyoko; Yoshiyuki, Noriko; Kanaya, Yukiko; Rubenstein, Laurence Z

    2016-12-01

    The aim of this randomized controlled trial was to determine the effects on functional parameters of an updated preventive home visit program for frail older adults in the Japanese Long-term Care Insurance (LTCI) system. The program included home visits by nurses or care managers every 3 months for 24 months, with a systematic assessment of care needs to prevent functional decline. Eligible participants (N = 360) were randomly assigned to the visit (VG: n = 179) or control group (CG: n = 181). Functional parameters were gathered via mail questionnaires at baseline and at 12- and 24-month follow-ups. Care-need levels in the LTCI were obtained at 12-, 24-, and 36-month follow-ups and the utilization of the LTCI service through 36 months. Participants in VG were significantly more likely to maintain their activities of daily living (ADL) functioning (p = .0113) and less likely to increase care-needs level, compared with CG participants, over 24 months. A generalized linear model showed that the estimate of the effect on increase in care-needs level (ie, functional decline) was -0.53 (p = .042) over 36 months. These results suggest that the updated preventive home visit program could be effective for the prevention of ADL and care-needs deterioration, and these effects could continue up to 1 year after program completion. © The Author 2016. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  15. Branch and bound algorithm for accurate estimation of analytical isotropic bidirectional reflectance distribution function models.

    PubMed

    Yu, Chanki; Lee, Sang Wook

    2016-05-20

We present a reliable and accurate global optimization framework for estimating the parameters of isotropic analytical bidirectional reflectance distribution function (BRDF) models. The approach is based on a branch and bound strategy with linear programming and interval analysis. Conventional local optimization is often very inefficient for BRDF estimation, since its fitting quality is highly dependent on initial guesses due to the nonlinearity of analytical BRDF models. The algorithm presented here employs L1-norm error minimization to estimate BRDF parameters in a globally optimal way, and interval arithmetic to derive the feasibility problem and lower-bounding function. Our method is developed for the Cook-Torrance model with several normal distribution functions, such as the Beckmann, Berry, and GGX functions. Experiments have been carried out to validate the presented method using 100 isotropic materials from the MERL BRDF database, and our experimental results demonstrate that L1-norm minimization provides a more accurate and reliable solution than L2-norm minimization.
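The branch and bound strategy described — recursively splitting the parameter domain and pruning branches whose lower bound cannot beat the incumbent — can be sketched in one dimension. The Lipschitz-based lower bound below is a simple stand-in for the paper's interval-arithmetic bounds, and the objective is hypothetical:

```python
import heapq

def branch_and_bound(f, lo, hi, lipschitz, tol=1e-4):
    """Globally minimize f on [lo, hi], given a Lipschitz constant for f."""
    best_x, best_f = lo, f(lo)
    heap = [(f((lo + hi) / 2) - lipschitz * (hi - lo) / 2, lo, hi)]
    while heap:
        bound, a, b = heapq.heappop(heap)
        if bound > best_f - tol:
            continue                       # prune: branch cannot beat incumbent
        mid = (a + b) / 2
        if f(mid) < best_f:                # update incumbent solution
            best_x, best_f = mid, f(mid)
        for c, d in ((a, mid), (mid, b)):  # branch: split the interval
            m = (c + d) / 2
            lower = f(m) - lipschitz * (d - c) / 2  # valid lower bound on [c, d]
            if lower < best_f - tol:
                heapq.heappush(heap, (lower, c, d))
    return best_x, best_f

# Nonconvex objective with two local minima; |f'| <= 25 on [-2, 2].
x, fx = branch_and_bound(lambda x: (x * x - 1) ** 2 + x, -2.0, 2.0, lipschitz=30.0)
```

Unlike a local fit, the result does not depend on an initial guess: the bound-and-prune logic certifies that no discarded branch can contain a better minimum than the one returned (to within tol).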

  16. The Importance of Information Requirements in Designing Acquisition to Information Systems

    NASA Technical Reports Server (NTRS)

    Davis, Bruce A.; Hill, Chuck; Maughan, Paul M.

    1998-01-01

    The partnership model used by NASA's Commercial Remote Sensing Program has been successful in better defining remote sensing functional requirements and translating them into technical specifications that address the environmental needs of the 21st century.

  17. Learning during Group Therapy Leadership Training.

    ERIC Educational Resources Information Center

    Stone, Walter N.; Green, Bonnie L.

    1978-01-01

    Examined factors affecting cognitive learning during a combined experiential-didactic group therapy training program. The overall goal for trainees was the acquisition of a cognitive model of group functioning that can be translated into consistent leadership techniques. (Author/PD)

  18. 45 CFR 310.10 - What are the functional requirements for the Model Tribal IV-D System?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... plan, including: (1) Identifying information such as Social Security numbers, names, dates of birth... operations and to assess program performance through the audit of financial and statistical data maintained...

  19. A Model Program for Translational Medicine in Epilepsy Genetics

    PubMed Central

    Smith, Lacey A.; Ullmann, Jeremy F. P.; Olson, Heather E.; El Achkar, Christelle M.; Truglio, Gessica; Kelly, McKenna; Rosen-Sheidley, Beth; Poduri, Annapurna

    2017-01-01

    Recent technological advances in gene sequencing have led to a rapid increase in gene discovery in epilepsy. However, the ability to assess pathogenicity of variants, provide functional analysis, and develop targeted therapies has not kept pace with rapid advances in sequencing technology. Thus, although clinical genetic testing may lead to a specific molecular diagnosis for some patients, test results often lead to more questions than answers. As the field begins to focus on therapeutic applications of genetic diagnoses using precision medicine, developing processes that offer more than equivocal test results is essential. The success of precision medicine in epilepsy relies on establishing a correct genetic diagnosis, analyzing functional consequences of genetic variants, screening potential therapeutics in the preclinical laboratory setting, and initiating targeted therapy trials for patients. We describe the structure of a comprehensive, pediatric Epilepsy Genetics Program that can serve as a model for translational medicine in epilepsy. PMID:28056630

  20. Measuring infrastructure: A key step in program evaluation and planning.

    PubMed

    Schmitt, Carol L; Glasgow, LaShawn; Lavinghouze, S Rene; Rieker, Patricia P; Fulmer, Erika; McAleer, Kelly; Rogers, Todd

    2016-06-01

    State tobacco prevention and control programs (TCPs) require a fully functioning infrastructure to respond effectively to the Surgeon General's call for accelerating the national reduction in tobacco use. The literature describes common elements of infrastructure; however, a lack of valid and reliable measures has made it difficult for program planners to monitor relevant infrastructure indicators and address observed deficiencies, or for evaluators to determine the association among infrastructure, program efforts, and program outcomes. The Component Model of Infrastructure (CMI) is a comprehensive, evidence-based framework that facilitates TCP program planning efforts to develop and maintain their infrastructure. Measures of CMI components were needed to evaluate the model's utility and predictive capability for assessing infrastructure. This paper describes the development of CMI measures and results of a pilot test with nine state TCP managers. Pilot test findings indicate that the tool has good face validity and is clear and easy to follow. The CMI tool yields data that can enhance public health efforts in a funding-constrained environment and provides insight into program sustainability. Ultimately, the CMI measurement tool could facilitate better evaluation and program planning across public health programs.

  1. Cognitive and Social Functioning Correlates of Employment Among People with Severe Mental Illness.

    PubMed

    Saavedra, Javier; López, Marcelino; González, Sergio; Arias, Samuel; Crawford, Paul

    2016-10-01

    We assess how social and cognitive functioning is associated with gaining employment for 213 people diagnosed with severe mental illness taking part in employment programs in Andalusia (Spain). We used the Repeatable Battery for the Assessment of Neuropsychological Status and the Social Functioning Scale and conducted two binary logistic regression analyses. Response variables were: having a job or not, in ordinary companies (OCs) and social enterprises, and working in an OC or not. There were two variables with significant adjusted odds ratios for having a job: "Attention" and "Educational level". There were five variables with significant odds ratios for having a job in an OC: "Sex", "Educational level", "Attention", "Communication", and "Independence-competence". The study looks at the possible benefits of combining employment with support and social enterprises in employment programs for these people and underlines how both social and cognitive functioning are central to developing employment models.

  2. Factors leading to different viability predictions for a grizzly bear data set

    USGS Publications Warehouse

    Mills, L.S.; Hayes, S.G.; Wisdom, M.J.; Citta, J.; Mattson, D.J.; Murphy, K.

    1996-01-01

    Population viability analysis programs are being used increasingly in research and management applications, but there has not been a systematic study of the congruence of different program predictions based on a single data set. We performed such an analysis using four population viability analysis computer programs: GAPPS, INMAT, RAMAS/AGE, and VORTEX. The standardized demographic rates used in all programs were generalized from hypothetical increasing and decreasing grizzly bear (Ursus arctos horribilis) populations. Idiosyncrasies of input format for each program led to minor differences in intrinsic growth rates that translated into striking differences in estimates of extinction rates and expected population size. In contrast, the addition of demographic stochasticity, environmental stochasticity, and inbreeding costs caused only a small divergence in viability predictions. However, the addition of density dependence caused large deviations between the programs despite our best attempts to use the same density-dependent functions. Population viability programs differ in how density dependence is incorporated, and the necessary functions are difficult to parameterize accurately. Thus, we recommend that unless data clearly suggest a particular density-dependent model, predictions based on population viability analysis should include at least one scenario without density dependence. Further, we describe output metrics that may differ between programs; development of future software could benefit from standardized input and output formats across different programs.
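    The sensitivity to density dependence reported above can be reproduced in miniature. The toy projection below is none of the four PVA packages, and all vital rates are invented; it runs the same stochastic growth process with and without a simple ceiling form of density dependence. Capping growth at a carrying capacity K removes the buffer of large population sizes, so the two scenarios diverge sharply in estimated extinction risk.

```python
import numpy as np

rng = np.random.default_rng(1)

def extinction_prob(K=None, n0=50.0, r=0.05, sd=0.3, years=100, reps=2000):
    """Fraction of replicate projections hitting a quasi-extinction threshold."""
    extinct = 0
    for _ in range(reps):
        n = n0
        for _ in range(years):
            n *= np.exp(rng.normal(r, sd))  # environmental stochasticity
            if K is not None:
                n = min(n, K)               # ceiling-type density dependence
            if n < 2.0:                     # quasi-extinction threshold
                extinct += 1
                break
    return extinct / reps

p_exponential = extinction_prob()           # no density dependence
p_ceiling = extinction_prob(K=60.0)         # ceiling at K = 60
```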

  3. R-SWAT-FME user's guide

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shu-Guang

    2012-01-01

    R program language-Soil and Water Assessment Tool-Flexible Modeling Environment (R-SWAT-FME) (Wu and Liu, 2012) is a comprehensive modeling framework that adopts an R package, Flexible Modeling Environment (FME) (Soetaert and Petzoldt, 2010), for the Soil and Water Assessment Tool (SWAT) model (Arnold and others, 1998; Neitsch and others, 2005). This framework provides the functionalities of parameter identifiability, model calibration, and sensitivity and uncertainty analysis with instant visualization. This user's guide shows how to apply this framework for a customized SWAT project.

  4. System cost/performance analysis (study 2.3). Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Kazangey, T.

    1973-01-01

    The relationships between performance, safety, cost, and schedule parameters were identified and quantified in support of an overall effort to generate program models and methodology that provide insight into a total space vehicle program. A specific space vehicle system, the attitude control system (ACS), was used, and a modeling methodology was selected that develops a consistent set of quantitative relationships among performance, safety, cost, and schedule, based on the characteristics of the components utilized in candidate mechanisms. These descriptive equations were developed for a three-axis, earth-pointing, mass expulsion ACS. A data base describing typical candidate ACS components was implemented, along with a computer program to perform sample calculations. This approach, implemented on a computer, is capable of determining the effect of a change in functional requirements to the ACS mechanization and the resulting cost and schedule. By a simple extension of this modeling methodology to the other systems in a space vehicle, a complete space vehicle model can be developed. Study results and recommendations are presented.

  5. Probabilistic analysis for fatigue strength degradation of materials

    NASA Technical Reports Server (NTRS)

    Royce, Lola

    1989-01-01

    This report presents the results of the first year of a research program conducted for NASA-LeRC by the University of Texas at San Antonio. The research included development of methodology that provides a probabilistic treatment of lifetime prediction of structural components of aerospace propulsion systems subjected to fatigue. Material strength degradation models, based on primitive variables, include both a fatigue strength reduction model and a fatigue crack growth model. Linear elastic fracture mechanics is utilized in the latter model. Probabilistic analysis is based on simulation, and both maximum entropy and maximum penalized likelihood methods are used for the generation of probability density functions. The resulting constitutive relationships are included in several computer programs, RANDOM2, RANDOM3, and RANDOM4. These programs determine the random lifetime of an engine component, in mechanical load cycles, to reach a critical fatigue strength or crack size. The material considered was a cast nickel base superalloy, one typical of those used in the Space Shuttle Main Engine.
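    A hedged sketch of the general approach (not the RANDOM2-RANDOM4 codes, and with illustrative rather than superalloy parameters): Monte Carlo sampling of a Paris-law crack-growth model, da/dN = C (ΔK)^m with ΔK = Δσ·sqrt(πa), yields a distribution of component lifetimes in load cycles. The closed-form integral of the Paris law is used so each sample is one arithmetic evaluation.

```python
import numpy as np

rng = np.random.default_rng(7)

def paris_life(C, a0, a_crit, dsigma, m):
    """Cycles to grow a crack from a0 to a_crit under the Paris law (m != 2)."""
    p = 1.0 - m / 2.0
    return (a_crit**p - a0**p) / (C * (dsigma * np.sqrt(np.pi))**m * p)

reps = 100_000
C = rng.lognormal(np.log(1e-12), 0.3, reps)   # scatter in the material constant
a0 = rng.lognormal(np.log(1e-4), 0.2, reps)   # scatter in initial flaw size (m)
lives = paris_life(C, a0, a_crit=0.02, dsigma=200.0, m=3.0)

median_life = np.median(lives)                # central lifetime estimate
p_early = np.mean(lives < 2e6)                # probability of early failure
```

    From such a sample one can read off any lifetime percentile, which is the quantity of interest when setting inspection or retirement intervals.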

  6. Structure, Function, and Applications of the Georgetown-Einstein (GE) Breast Cancer Simulation Model.

    PubMed

    Schechter, Clyde B; Near, Aimee M; Jayasekera, Jinani; Chandler, Young; Mandelblatt, Jeanne S

    2018-04-01

    The Georgetown University-Albert Einstein College of Medicine breast cancer simulation model (Model GE) has evolved over time in structure and function to reflect advances in knowledge about breast cancer, improvements in early detection and treatment technology, and progress in computing resources. This article describes the model and provides examples of model applications. The model is a discrete events microsimulation of single-life histories of women from multiple birth cohorts. Events are simulated in the absence of screening and treatment, and interventions are then applied to assess their impact on population breast cancer trends. The model accommodates differences in natural history associated with estrogen receptor (ER) and human epidermal growth factor receptor 2 (HER2) biomarkers, as well as conventional breast cancer risk factors. The approach for simulating breast cancer natural history is phenomenological, relying on dates, stage, and age of clinical and screen detection for a tumor molecular subtype without explicitly modeling tumor growth. The inputs to the model are regularly updated to reflect current practice. Numerous technical modifications, including the use of object-oriented programming (C++), and more efficient algorithms, along with hardware advances, have increased program efficiency permitting simulations of large samples. The model results consistently match key temporal trends in US breast cancer incidence and mortality. The model has been used in collaboration with other CISNET models to assess cancer control policies and will be applied to evaluate clinical trial design, recurrence risk, and polygenic risk-based screening.

  7. Functional Limitations and Nativity Status among Older Arab, Asian, Black, Hispanic, and White Americans

    PubMed Central

    Dallo, Florence J.; Booza, Jason; Nguyen, Norma D.

    2013-01-01

    Background: To examine the association of nativity status (foreign- and US-born) and race/ethnicity (Arab, Asian, black, Hispanic, white) with having a functional limitation. Methods: We used American Community Survey data (2001-2007; n=1,964,777; 65+ years) and estimated odds ratios (95% confidence intervals). Results: In the crude model, foreign-born Blacks, Hispanics and Arabs were more likely, while Asians were less likely, to report having a functional limitation compared to whites. In the fully adjusted model, Blacks, Hispanics, and Asians were less likely, while Arabs were more likely, to report having a functional limitation. In both the crude and fully adjusted models, US-born Blacks and Hispanics were more likely, while Asians and Arabs were less likely, to report having a functional limitation compared to whites. Discussion: Policies and programs tailored to foreign-born Arab Americans may help prevent or delay the onset of disability, especially when initiated shortly after their arrival to the US. PMID:24165988

  8. Mathematical model of the current density for the 30-cm engineering model thruster

    NASA Technical Reports Server (NTRS)

    Cuffel, R. F.

    1975-01-01

    Mathematical models are presented for both the singly and doubly charged ion current densities downstream of the 30-cm engineering model thruster with 0.5% compensated dished grids. These models are based on the experimental measurements of Vahrenkamp at a 2-amp ion beam operating condition. The cylindrically symmetric beam of constant velocity ions is modeled with continuous radial source and focusing functions across 'plane' grids with similar angular distribution functions. A computer program is used to evaluate the double integral for current densities in the near field and to obtain a far field approximation beyond 10 grid radii. The utility of the model is demonstrated for (1) calculating the directed thrust and (2) determining the impingement levels on various spacecraft surfaces from a two-axis gimballed, 2 x 3 thruster array.

  9. Are Delinquents Different? Predictive Patterns for Low, Mid and High Delinquency Levels in a General Youth Sample via the HEW Youth Development Model's Impact Scales.

    ERIC Educational Resources Information Center

    Truckenmiller, James L.

    The Health, Education and Welfare (HEW) Office of Youth Development's National Strategy for Youth Development model was promoted as a community-based planning and procedural tool for enhancing positive youth development and reducing delinquency. To test the applicability of the model as a function of delinquency level, the program's Impact Scales…

  10. A Maximin Model for Test Design with Practical Constraints. Project Psychometric Aspects of Item Banking No. 25. Research Report 87-10.

    ERIC Educational Resources Information Center

    van der Linden, Wim J.; Boekkooi-Timminga, Ellen

    A "maximin" model for item response theory based test design is proposed. In this model only the relative shape of the target test information function is specified. It serves as a constraint subject to which a linear programming algorithm maximizes the information in the test. In the practice of test construction there may be several…

  11. Modeling Ocean Ecosystems: The PARADIGM Program

    DTIC Science & Technology

    2006-03-01

    of biological reality: the wonderful complexity of ocean ecosystems will never be fully described with numerical models ... our concept of a species (e.g., Venter et al., 2004; Doney et al., 2004; DeLong and ...). Specifying "Functional Groups" ... In applying numerical models, we are confronted ... 2. Nitrogen-fixing bacteria and archaea (diazotrophs), which convert atmospheric ... ocean inventory of nitrogen nutrients. Some diazotrophs fix both CO2 and ...

  12. Non Linear Programming (NLP) Formulation for Quantitative Modeling of Protein Signal Transduction Pathways

    PubMed Central

    Morris, Melody K.; Saez-Rodriguez, Julio; Lauffenburger, Douglas A.; Alexopoulos, Leonidas G.

    2012-01-01

    Modeling of signal transduction pathways plays a major role in understanding cells' function and predicting cellular response. Mathematical formalisms based on a logic formalism are relatively simple but can describe how signals propagate from one protein to the next, and they have led to the construction of models that simulate the cell's response to environmental or other perturbations. Constrained fuzzy logic was recently introduced to train models against cell-specific data, resulting in quantitative pathway models of specific cellular behavior. There are two major issues in this pathway optimization: i) excessive CPU time requirements and ii) a loosely constrained optimization problem due to a lack of data with respect to large signaling pathways. Herein, we address both issues: the former by reformulating the pathway optimization as a regular nonlinear optimization problem, and the latter by enhanced algorithms that pre/post-process the signaling network to remove parts that cannot be identified given the experimental conditions. As a case study, we tackle the construction of cell-type-specific pathways in normal and transformed hepatocytes using medium- and large-scale functional phosphoproteomic datasets. The proposed Non Linear Programming (NLP) formulation allows for fast optimization of signaling topologies by combining the versatile nature of logic modeling with state-of-the-art optimization algorithms. PMID:23226239

  13. Non Linear Programming (NLP) formulation for quantitative modeling of protein signal transduction pathways.

    PubMed

    Mitsos, Alexander; Melas, Ioannis N; Morris, Melody K; Saez-Rodriguez, Julio; Lauffenburger, Douglas A; Alexopoulos, Leonidas G

    2012-01-01

    Modeling of signal transduction pathways plays a major role in understanding cells' function and predicting cellular response. Mathematical formalisms based on a logic formalism are relatively simple but can describe how signals propagate from one protein to the next, and they have led to the construction of models that simulate the cell's response to environmental or other perturbations. Constrained fuzzy logic was recently introduced to train models against cell-specific data, resulting in quantitative pathway models of specific cellular behavior. There are two major issues in this pathway optimization: i) excessive CPU time requirements and ii) a loosely constrained optimization problem due to a lack of data with respect to large signaling pathways. Herein, we address both issues: the former by reformulating the pathway optimization as a regular nonlinear optimization problem, and the latter by enhanced algorithms that pre/post-process the signaling network to remove parts that cannot be identified given the experimental conditions. As a case study, we tackle the construction of cell-type-specific pathways in normal and transformed hepatocytes using medium- and large-scale functional phosphoproteomic datasets. The proposed Non Linear Programming (NLP) formulation allows for fast optimization of signaling topologies by combining the versatile nature of logic modeling with state-of-the-art optimization algorithms.
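    The optimization at the heart of the two records above can be illustrated on a single signaling edge. The sketch below is a toy stand-in, not the authors' formulation or code: it fits the parameters of a normalized-Hill transfer function, the building block of constrained fuzzy logic models, by ordinary nonlinear programming on synthetic, noiseless data.

```python
import numpy as np
from scipy.optimize import minimize

x = np.linspace(0.01, 1.0, 30)   # normalized upstream activity levels

def hill(x, n, k):
    """Normalized Hill transfer function mapping [0,1] to [0,1] with f(1) = 1."""
    return x**n * (1.0 + k**n) / (x**n + k**n)

y_obs = hill(x, 3.0, 0.5)        # synthetic downstream response

def sse(p):
    n, k = p
    return np.sum((hill(x, n, k) - y_obs) ** 2)

# Continuous decision variables (n, k) with simple bounds: a regular NLP.
res = minimize(sse, x0=[1.0, 0.3], bounds=[(1.0, 10.0), (0.01, 1.0)],
               method="L-BFGS-B")
n_hat, k_hat = res.x
```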

  14. An Operational Model for the Application of Planning-Programming-Budgeting Systems to Local School Districts. Post-Pilot-Test Version. Parts One and Two.

    ERIC Educational Resources Information Center

    Kiser, Chester; And Others

    This 2-part document is designed to aid school districts in the implementation of a planning programing budgeting system. The first part of the manual contains (1) statements of policy, (2) a master flowchart, (3) organization and functions of a PPBS system, (4) a flowscript of procedures, (5) job outlines, and (6) supplementary appendix material.…

  15. Neuro Emotional Literacy Program: Does Teaching the Function of Affect and Affect Regulation Strategies Improve Affect Management and Well-Being?

    ERIC Educational Resources Information Center

    Patten, Kathryn E.; Campbell, Stephen R.

    2016-01-01

    Although research on Emotion Regulation (ER) is developing at a rapid rate, much of it lacks a clear theoretical framework and most focuses on a narrow set of ER strategies. This work presents the details of a pilot project, the Neuro Emotional Literacy Program (NELP), designed for parents and based on the Somatic Appraisal Model of Affect (SAMA).…

  16. Parton distribution functions with QED corrections in the valon model

    NASA Astrophysics Data System (ADS)

    Mottaghizadeh, Marzieh; Taghavi Shahri, Fatemeh; Eslami, Parvin

    2017-10-01

    The parton distribution functions (PDFs) with QED corrections are obtained by solving the QCD ⊗QED DGLAP evolution equations in the framework of the "valon" model at the next-to-leading-order QCD and the leading-order QED approximations. Our results for the PDFs with QED corrections in this phenomenological model are in good agreement with the newly related CT14QED global fits code [Phys. Rev. D 93, 114015 (2016), 10.1103/PhysRevD.93.114015] and the APFEL (NNPDF2.3QED) program [Comput. Phys. Commun. 185, 1647 (2014), 10.1016/j.cpc.2014.03.007] in the wide ranges x = [10^-5, 1] and Q^2 = [0.283, 10^8] GeV^2. The model calculations agree rather well with those codes. In the latter part of this work, we proposed a new method for studying the symmetry breaking of the sea quark distribution functions inside the proton.

  17. Achieving Developmental Synchrony in Young Children With Hearing Loss

    PubMed Central

    Mellon, Nancy K.; Ouellette, Meredith; Greer, Tracy; Gates-Ulanet, Patricia

    2009-01-01

    Children with hearing loss, with early and appropriate amplification and intervention, demonstrate gains in speech, language, and literacy skills. Despite these improvements many children continue to exhibit disturbances in cognitive, behavioral, and emotional control, self-regulation, and aspects of executive function. Given the complexity of developmental learning, educational settings should provide services that foster the growth of skills across multiple dimensions. Transdisciplinary intervention services that target the domains of language, communication, psychosocial functioning, motor, and cognitive development can promote academic and social success. Educational programs must provide children with access to the full range of basic skills necessary for academic and social achievement. In addition to an integrated curriculum that nurtures speech, language, and literacy development, innovations in the areas of auditory perception, social emotional learning, motor development, and vestibular function can enhance student outcomes. Through ongoing evaluation and modification, clearly articulated curricular approaches can serve as a model for early intervention and special education programs. The purpose of this article is to propose an intervention model that combines best practices from a variety of disciplines that affect developmental outcomes for young children with hearing loss, along with specific strategies and approaches that may help to promote optimal development across domains. Access to typically developing peers who model age-appropriate skills in language and behavior, small class sizes, a co-teaching model, and a social constructivist perspective of teaching and learning, are among the key elements of the model. PMID:20150187

  18. Biocontrol in an impulsive predator-prey model.

    PubMed

    Terry, Alan J

    2014-10-01

    We study a model for biological pest control (or "biocontrol") in which a pest population is controlled by a program of periodic releases of a fixed yield of predators that prey on the pest. Releases are represented as impulsive increases in the predator population. Between releases, predator-pest dynamics evolve according to a predator-prey model with some fairly general properties: the pest population grows logistically in the absence of predation; the predator functional response is either of Beddington-DeAngelis type or Holling type II; the predator per capita birth rate is bounded above by a constant multiple of the predator functional response; and the predator per capita death rate is allowed to be decreasing in the predator functional response and increasing in the predator population, though the special case in which it is constant is permitted too. We prove that, when the predator functional response is of Beddington-DeAngelis type and the predators are not sufficiently voracious, then the biocontrol program will fail to reduce the pest population below a particular economic threshold, regardless of the frequency or yield of the releases. We prove also that our model possesses a pest-eradication solution, which is both locally and globally stable provided that predators are sufficiently voracious and that releases occur sufficiently often. We establish, curiously, that the pest-eradication solution can be locally stable whilst not being globally stable, the upshot of which is that, if we delay a biocontrol response to a new pest invasion, then this can change the outcome of the response from pest eradication to pest persistence. Finally, we state a number of specific examples for our model, and, for one of these examples, we corroborate parts of our analysis by numerical simulations.
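    The release scheme described above is straightforward to simulate numerically. The sketch below uses invented parameter values, not ones taken from the article: it integrates logistic pest growth with a Holling type II functional response between pulses and adds a fixed predator yield sigma at each release, which for sufficiently voracious predators and frequent releases drives the pest toward the eradication solution.

```python
import numpy as np
from scipy.integrate import solve_ivp

r, K = 1.0, 10.0        # pest intrinsic growth rate and carrying capacity
c, h = 1.0, 0.2         # attack rate and handling time (Holling type II)
e, d = 0.5, 0.4         # predator conversion efficiency and death rate
sigma, tau = 2.0, 1.0   # release yield and release period

def rhs(t, z):
    x, y = z                            # pest, predator
    f = c * x / (1.0 + c * h * x)       # Holling type II functional response
    return [r * x * (1.0 - x / K) - f * y,
            e * f * y - d * y]

z = [5.0, 0.0]
ts, xs = [0.0], [z[0]]
for k in range(40):                     # 40 release periods
    sol = solve_ivp(rhs, (k * tau, (k + 1) * tau), z, max_step=0.01)
    ts.extend(sol.t[1:])
    xs.extend(sol.y[0][1:])
    z = [sol.y[0][-1], sol.y[1][-1] + sigma]   # impulsive predator release
final_pest = xs[-1]
```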

  19. THERMUS—A thermal model package for ROOT

    NASA Astrophysics Data System (ADS)

    Wheaton, S.; Cleymans, J.; Hauer, M.

    2009-01-01

    THERMUS is a package of C++ classes and functions allowing statistical-thermal model analyses of particle production in relativistic heavy-ion collisions to be performed within the ROOT framework of analysis. Calculations are possible within three statistical ensembles: a grand-canonical treatment of the conserved charges B, S and Q, a fully canonical treatment of the conserved charges, and a mixed-canonical ensemble combining a canonical treatment of strangeness with a grand-canonical treatment of baryon number and electric charge. THERMUS allows for the assignment of decay chains and detector efficiencies specific to each particle yield, which enables sensible fitting of model parameters to experimental data.
    Program summary
    Program title: THERMUS, version 2.1
    Catalogue identifier: AEBW_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEBW_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 17 152
    No. of bytes in distributed program, including test data, etc.: 93 581
    Distribution format: tar.gz
    Programming language: C++
    Computer: PC, Pentium 4, 1 GB RAM (not hardware dependent)
    Operating system: Linux: FEDORA, RedHat, etc.
    Classification: 17.7
    External routines: Numerical Recipes in C [1], ROOT [2]
    Nature of problem: Statistical-thermal model analyses of heavy-ion collision data require the calculation of both primordial particle densities and contributions from resonance decay. A set of thermal parameters (the number depending on the particular model imposed) and a set of thermalized particles, with their decays specified, is required as input to these models. The output is then a complete set of primordial thermal quantities for each particle, together with the contributions to the final particle yields from resonance decay.
In many applications of statistical-thermal models it is required to fit experimental particle multiplicities or particle ratios. In such analyses, the input is a set of experimental yields and ratios, a set of particles comprising the assumed hadron resonance gas formed in the collision and the constraints to be placed on the system. The thermal model parameters consistent with the specified constraints leading to the best-fit to the experimental data are then output. Solution method: THERMUS is a package designed for incorporation into the ROOT [2] framework, used extensively by the heavy-ion community. As such, it utilizes a great deal of ROOT's functionality in its operation. ROOT features used in THERMUS include its containers, the wrapper TMinuit implementing the MINUIT fitting package, and the TMath class of mathematical functions and routines. Arguably the most useful feature is the utilization of CINT as the control language, which allows interactive access to the THERMUS objects. Three distinct statistical ensembles are included in THERMUS, while additional options to include quantum statistics, resonance width and excluded volume corrections are also available. THERMUS provides a default particle list including all mesons (up to the K4∗ (2045)) and baryons (up to the Ω) listed in the July 2002 Particle Physics Booklet [3]. For each typically unstable particle in this list, THERMUS includes a text-file listing its decays. With thermal parameters specified, THERMUS calculates primordial thermal densities either by performing numerical integrations or else, in the case of the Boltzmann approximation without resonance width in the grand-canonical ensemble, by evaluating Bessel functions. Particle decay chains are then used to evaluate experimental observables (i.e. particle yields following resonance decay). Additional detector efficiency factors allow fine-tuning of the model predictions to a specific detector arrangement. 
    When parameters are required to be constrained, use is made of the 'Numerical Recipes in C' [1] function which applies the Broyden globally convergent secant method of solving nonlinear systems of equations. Since the NRC software is not freely available, it has to be purchased by the user. THERMUS provides the means of imposing a large number of constraints on the chosen model (amongst others, THERMUS can fix the baryon-to-charge ratio of the system, the strangeness density of the system and the primordial energy per hadron). Fits to experimental data are accomplished in THERMUS by using the ROOT TMinuit class. In its default operation, the standard χ² function is minimized, yielding the set of best-fit thermal parameters. THERMUS allows the assignment of separate decay chains to each experimental input. In this way, the model is able to match the specific feed-down corrections of a particular data set. Running time: Depending on the analysis required, run-times vary from seconds (for the evaluation of particle multiplicities given a set of parameters) to several minutes (for fits to experimental data subject to constraints). References: W.H. Press, S.A. Teukolsky, W.T. Vetterling, B.P. Flannery, Numerical Recipes in C: The Art of Scientific Computing, Cambridge University Press, Cambridge, 2002. R. Brun, F. Rademakers, Nucl. Inst. Meth. Phys. Res. A 389 (1997) 81. See also http://root.cern.ch/. K. Hagiwara et al., Phys. Rev. D 66 (2002) 010001.
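    The Boltzmann-approximation density that THERMUS evaluates via Bessel functions has the standard closed form n = g/(2π²) · m²T · K₂(m/T) · e^(μ/T) in natural units. A hedged one-function sketch (this is not THERMUS code; the pion example is only illustrative):

```python
import numpy as np
from scipy.special import kn

HBARC = 0.19733  # hbar*c in GeV*fm, converting GeV^3 densities to fm^-3

def boltzmann_density(g, m, T, mu=0.0):
    """Grand-canonical Boltzmann number density (fm^-3), no width or
    quantum-statistics corrections; g is the degeneracy, m and T in GeV."""
    n_gev3 = g / (2.0 * np.pi**2) * m**2 * T * kn(2, m / T) * np.exp(mu / T)
    return n_gev3 / HBARC**3

# Example: a single pion charge state at a chemical freeze-out temperature.
n_pi = boltzmann_density(g=1, m=0.13957, T=0.160)
```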

  20. Modeling and Simulation Behavior Validation Methodology and Extension Model Validation for the Individual Soldier

    DTIC Science & Technology

    2015-03-01

    domains. Major model functions include:
    • Ground combat: light and heavy forces.
    • Air mobile forces.
    • Future forces.
    • Fixed-wing and rotary-wing ...
    Constraints:
    • Study must be completed no later than 31 December 2014.
    • Entity behavior limited to select COMBATXXI Mobility, Unmanned Aerial System ... and SQL backend, as well as any open application programming interface (API).
    • Allows data transparency and data-driven navigation through the model

  1. SAI (Systems Applications, Incorporated) Urban Airshed Model. Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schere, K.L.

    1985-06-01

    This magnetic tape contains the FORTRAN source code, sample input data, and sample output data for the SAI Urban Airshed Model (UAM). The UAM is a 3-dimensional gridded air-quality simulation model that is well suited for predicting the spatial and temporal distribution of photochemical pollutant concentrations in an urban area. The model is based on the equations of conservation of mass for a set of reactive pollutants in a turbulent-flow field. To solve these equations, the UAM uses numerical techniques set in a 3-D finite-difference grid array of cells, each about 1 to 10 kilometers wide and 10 to several hundred meters deep. As output, the model provides the calculated pollutant concentrations in each cell as a function of time. The chemical species of prime interest included in the UAM simulations are O3, NO, NO2, and several organic compounds and classes of compounds. The UAM system contains at its core the Airshed Simulation Program that accesses input data consisting of 10 to 14 files, depending on the program options chosen. Each file is created by a separate data-preparation program. There are 17 programs in the entire UAM system. The services of a qualified dispersion meteorologist, a chemist, and a computer programmer will be necessary to implement and apply the UAM and to interpret the results. Software Description: The program is written in the FORTRAN programming language for implementation on a UNIVAC 1110 computer under the UNIVAC 1100 operating system level 38R5A. Memory requirement is 80K.
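
    The mass-conservation equations on a finite-difference grid mentioned above can be illustrated with a one-dimensional upwind advection step (a hedged sketch only; the UAM's actual numerics, chemistry, and 3-D grid are far more elaborate):

```python
import numpy as np

# Illustrative 1-D explicit upwind step for a pollutant concentration
# field, in the spirit of finite-difference transport in gridded
# air-quality models (a sketch only, not the UAM's actual numerics).
def advect_upwind(c, u, dx, dt):
    """One explicit upwind step of dc/dt + u * dc/dx = 0, for u > 0."""
    c_new = c.copy()
    c_new[1:] = c[1:] - u * dt / dx * (c[1:] - c[:-1])
    c_new[0] = c[0]                    # fixed inflow boundary
    return c_new

dx, dt, u = 1000.0, 60.0, 5.0          # 1 km cells, 60 s step, 5 m/s wind
c = np.zeros(50)
c[10] = 100.0                          # initial pollutant puff in one cell
for _ in range(100):
    c = advect_upwind(c, u, dx, dt)
# The CFL number u*dt/dx = 0.3 < 1, so the scheme is stable and positive;
# the puff drifts 30 cells downwind with some numerical diffusion, and
# mass is conserved apart from a small outflow at the right boundary.
```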

  2. SEQassembly: A Practical Tools Program for Coding Sequences Splicing

    NASA Astrophysics Data System (ADS)

    Lee, Hongbin; Yang, Hang; Fu, Lei; Qin, Long; Li, Huili; He, Feng; Wang, Bo; Wu, Xiaoming

    A CDS (coding sequence) is the portion of an mRNA sequence that is composed of a number of exon sequence segments. The construction of the CDS sequence is important for in-depth genetic analyses such as genotyping. A program in the MATLAB environment is presented, which can process batches of sample sequences into coding segments under the guidance of reference exon models and splice the coding segments from the same sample source into a CDS according to the exon order in a queue file. This program is useful in transcriptional polymorphism detection and gene function studies.
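
    The splicing step described in the abstract (concatenating coding segments in the exon order given by a queue file) can be sketched in a few lines; the function and data below are hypothetical illustrations, not the MATLAB program itself:

```python
# A minimal sketch of the splicing idea: concatenate per-sample exon
# segments into a CDS following an exon-order queue (hypothetical Python
# illustration, not the MATLAB program described in the record).
def splice_cds(segments, exon_order):
    """segments: dict mapping exon name -> sequence for one sample;
    exon_order: list of exon names in their CDS order."""
    missing = [e for e in exon_order if e not in segments]
    if missing:
        raise ValueError(f"sample lacks exon(s): {missing}")
    return "".join(segments[e] for e in exon_order)

sample = {"exon2": "GGTACC", "exon1": "ATG", "exon3": "TAA"}
cds = splice_cds(sample, ["exon1", "exon2", "exon3"])
# cds == "ATGGGTACCTAA"
```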

  3. Checking Equivalence of SPMD Programs Using Non-Interference

    DTIC Science & Technology

    2010-01-29

    with it hopes to go beyond the limits of Moore’s law, but also worries that programming will become harder [5]. One of the reasons why parallel...array name in G or L, and e is an arithmetic expression of integer type. In the CUDA code shown in Section 3, b and t are represented by coreId and...b + t. A second, optimized version of the program (using function “reverse2”, see Section 3) can be modeled as a tuple P2 = (G, L2, F2), with G same

  4. Highlights of the SEASAT-SASS program - A review

    NASA Technical Reports Server (NTRS)

    Pierson, W. J., Jr.

    1983-01-01

    Some important concepts of the SEASAT-SASS program are described and some of the decisions made during the program as to methods for relating wind to backscatter are discussed. The radar scatterometer design is analyzed along with the model function, which is an empirical relationship between the backscatter value and the wind speed, wind direction, and incidence angle of the radar beam with the sea surface. The results of Monte Carlo studies of mesoscale turbulence and of studies of wind stress on the sea surface involving SASS are reviewed.

  5. NASIS data base management system: IBM 360 TSS implementation. Volume 4: Program design specifications

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The design specifications for the programs and modules within the NASA Aerospace Safety Information System (NASIS) are presented. The purpose of the design specifications is to standardize the preparation of the specifications and to guide the program design. Each major functional module within the system is a separate entity for documentation purposes. The design specifications contain a description of, and specifications for, all detail processing which occurs in the module. Sub-models, reference tables, and data sets which are common to several modules are documented separately.

  6. Heat generation in Aircraft tires under yawed rolling conditions

    NASA Technical Reports Server (NTRS)

    Dodge, Richard N.; Clark, Samuel K.

    1987-01-01

    An analytical model was developed for approximating the internal temperature distribution in an aircraft tire operating under conditions of yawed rolling. The model employs an assembly of elements to represent the tire cross section and treats the heat generated within the tire as a function of the change in strain energy associated with predicted tire flexure. Special contact scrubbing terms are superimposed on the symmetrical free rolling model to account for the slip during yawed rolling. An extensive experimental program was conducted to verify temperatures predicted from the analytical model. Data from this program were compared with calculation over a range of operating conditions, namely, vertical deflection, inflation pressure, yaw angle, and direction of yaw. Generally the analytical model predicted overall trends well and correlated reasonably well with individual measurements at locations throughout the cross section.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chao, Tian-Jy; Kim, Younghun

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.

  8. The Boston Health Care for the Homeless Program: A Public Health Framework

    PubMed Central

    Oppenheimer, Sarah C.; Judge, Christine M.; Taube, Robert L.; Blanchfield, Bonnie B.; Swain, Stacy E.; Koh, Howard K.

    2010-01-01

    During the past 25 years, the Boston Health Care for the Homeless Program has evolved into a service model embodying the core functions and essential services of public health. Each year the program provides integrated medical, behavioral, and oral health care, as well as preventive services, to more than 11 000 homeless people. Services are delivered in clinics located in 2 teaching hospitals, 80 shelters and soup kitchens, and an innovative 104-bed medical respite unit. We explain the program's principles of care, describe the public health framework that undergirds the program, and offer lessons for the elimination of health disparities suffered by this vulnerable population. PMID:20558804

  9. Three essays on multi-level optimization models and applications

    NASA Astrophysics Data System (ADS)

    Rahdar, Mohammad

    The general form of a multi-level mathematical programming problem is a set of nested optimization problems, in which each level controls a series of decision variables independently. However, the value of decision variables may also impact the objective function of other levels. A two-level model is called a bilevel model and can be considered as a Stackelberg game with a leader and a follower. The leader anticipates the response of the follower and optimizes its objective function, and then the follower reacts to the leader's action. The multi-level decision-making model has many real-world applications such as government decisions, energy policies, market economy, network design, etc. However, there is a lack of capable algorithms for solving medium- and large-scale problems of this type. The dissertation is devoted to both theoretical research and applications of multi-level mathematical programming models and consists of three parts, each in a paper format. The first part studies the renewable energy portfolio under two major renewable energy policies. The potential competition for biomass for the growth of the renewable energy portfolio in the United States and other interactions between the two policies over the next twenty years are investigated. This problem mainly has two levels of decision makers: the government/policy makers and biofuel producers/electricity generators/farmers. We focus on the lower-level problem to predict the amount of capacity expansions, fuel production, and power generation. In the second part, we address uncertainty over demand and lead time in a multi-stage mathematical programming problem. We propose a two-stage tri-level optimization model based on a rolling-horizon approach to reduce the dimensionality of the multi-stage problem. In the third part of the dissertation, we introduce a new branch-and-bound algorithm to solve bilevel linear programming problems. 
The total time is reduced by solving a smaller relaxation problem in each node and decreasing the number of iterations. Computational experiments show that the proposed algorithm is faster than the existing ones.
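
    The leader-follower (Stackelberg) structure described above can be illustrated with a toy bilevel problem in which the follower's best response has a closed form; the tax-setting model and all numbers below are assumptions for illustration, unrelated to the dissertation's actual formulations:

```python
import numpy as np

# Bilevel (Stackelberg) toy: a leader sets a per-unit tax t, then a
# follower firm chooses output q to maximize its own profit. The leader
# anticipates the follower's best response. The quadratic-cost model and
# the numbers are assumed purely for illustration.
p, c = 10.0, 2.0                        # price and unit cost (assumed)

def follower_best_response(t):
    # Follower: max_q (p - c - t)*q - 0.5*q**2  =>  q*(t) = max(0, p - c - t)
    return max(0.0, p - c - t)

def leader_objective(t):
    # Leader's tax revenue, anticipating the follower's reaction.
    return t * follower_best_response(t)

taxes = np.linspace(0.0, p - c, 801)
revenues = np.array([leader_objective(t) for t in taxes])
t_star = taxes[np.argmax(revenues)]
# Analytically, revenue t*(8 - t) peaks at t* = (p - c)/2 = 4,
# giving q* = 4 and revenue 16.
```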

  10. Parser Combinators: a Practical Application for Generating Parsers for NMR Data

    PubMed Central

    Fenwick, Matthew; Weatherby, Gerard; Ellis, Heidi JC; Gryk, Michael R.

    2013-01-01

    Nuclear Magnetic Resonance (NMR) spectroscopy is a technique for acquiring protein data at atomic resolution and determining the three-dimensional structure of large protein molecules. A typical structure determination process results in the deposition of a large data set to the BMRB (Bio-Magnetic Resonance Data Bank). This data is stored and shared in a file format called NMR-Star. This format is syntactically and semantically complex, making it challenging to parse. Nevertheless, parsing these files is crucial to applying the vast amounts of biological information stored in NMR-Star files, allowing researchers to harness the results of previous studies to direct and validate future work. One powerful approach for parsing files is to apply a Backus-Naur Form (BNF) grammar, which is a high-level model of a file format. Translation of the grammatical model to an executable parser may be automatically accomplished. This paper will show how we applied a model BNF grammar of the NMR-Star format to create a free, open-source parser, using a method that originated in the functional programming world known as “parser combinators”. This paper demonstrates the effectiveness of a principled approach to file specification and parsing. This paper also builds upon our previous work [1], in that 1) it applies concepts from Functional Programming (which is relevant even though the implementation language, Java, is more mainstream than Functional Programming), and 2) all work and accomplishments from this project will be made available under standard open source licenses to provide the community with the opportunity to learn from our techniques and methods. PMID:24352525
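
    The parser-combinator idea, building a large parser by composing small parser functions, can be sketched briefly; the paper's parser is written in Java, so the minimal Python below only illustrates the technique and is not the NMR-Star parser itself:

```python
# Minimal parser combinators: a parser is a function that maps input text
# to (value, remaining_text), or None on failure. Combinators assemble
# bigger parsers from smaller ones (illustrative sketch only).
def char(c):
    def parse(s):
        return (c, s[1:]) if s.startswith(c) else None
    return parse

def seq(p1, p2):                       # run p1, then p2 on the remainder
    def parse(s):
        r1 = p1(s)
        if r1 is None:
            return None
        v1, rest = r1
        r2 = p2(rest)
        if r2 is None:
            return None
        v2, rest2 = r2
        return ((v1, v2), rest2)
    return parse

def alt(p1, p2):                       # try p1; on failure, try p2
    def parse(s):
        return p1(s) or p2(s)
    return parse

def many(p):                           # zero or more repetitions of p
    def parse(s):
        values = []
        r = p(s)
        while r is not None:
            v, s = r
            values.append(v)
            r = p(s)
        return (values, s)
    return parse

digit = alt(char("0"), char("1"))      # toy alphabet: binary digits
bits = many(digit)
pair = seq(char("a"), char("b"))
# bits("1011x") -> (["1", "0", "1", "1"], "x")
# pair("abc")   -> (("a", "b"), "c")
```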

  11. A portal for the ocean biogeographic information system

    USGS Publications Warehouse

    Zhang, Yunqing; Grassle, J. F.

    2002-01-01

    Since its inception in 1999 the Ocean Biogeographic Information System (OBIS) has developed into an international science program as well as a globally distributed network of biogeographic databases. An OBIS portal at Rutgers University provides the links and functional interoperability among member database systems. Protocols and standards have been established to support effective communication between the portal and these functional units. The portal provides distributed data searching, a taxonomy name service, a GIS with access to relevant environmental data, biological modeling, and education modules for mariners, students, environmental managers, and scientists. The portal will integrate Census of Marine Life field projects, national data archives, and other functional modules, and provides for network-wide analyses and modeling tools.

  12. Care satisfaction, hope, and life functioning among adults with bipolar disorder: data from the first 1000 participants in the Systematic Treatment Enhancement Program.

    PubMed

    Morris, Chad D; Miklowitz, David J; Wisniewski, Stephen R; Giese, Alexis A; Thomas, Marshall R; Allen, Michael H

    2005-01-01

    The Systematic Treatment Enhancement Program for Bipolar Disorder (STEP-BD) is designed to evaluate the longitudinal outcome of patients with bipolar disorder. The STEP-BD disease-management model is built on evidence-based practices and a collaborative care approach designed to maximize specific and nonspecific treatment mechanisms. This prospective study examined the longitudinal relationships between patients' satisfaction with care, levels of hope, and life functioning in the first 1000 patients to enter STEP-BD. The study used scores from the Care Satisfaction Questionnaire, Beck Hopelessness Scale, Range of Impaired Functioning Tool, Young Mania Rating Scale, and Montgomery-Asberg Depression Rating Scale at 5 time points during a 1-year interval. Analyses tested mediational pathways between care satisfaction, hope, and life functioning, depression, and mania using mixed-effects (random and fixed) regression models. Increases in care satisfaction were associated with decreased hopelessness (P < .01) but not related to symptoms of depression or mania. Similarly, decreased hopelessness was associated with better life functioning (P < .01) but not related to symptoms of depression or mania. Depression was independently associated with poorer life functioning (P < .0001). This study provided support for the hypothesized mediational pathway between care satisfaction, hopelessness, and life functioning. Findings suggest that providing care that maximizes patient hope may be important. By so doing, patients might overcome the learned helplessness/hopelessness that often accompanies a cyclical illness and build a realistic illness-management strategy.

  13. The Effects of an Exercise Program on Anxiety Levels and Metabolic Functions in Patients With Anxiety Disorders.

    PubMed

    Ma, Wei-Fen; Wu, Po-Lun; Su, Chia-Hsien; Yang, Tzu-Ching

    2017-05-01

    The purpose of this study was to evaluate the effects of a home-based (HB) exercise program on anxiety levels and metabolic functions in patients with anxiety disorders in Taiwan. Purposive sampling was used to recruit 86 participants for this randomized, experimental study. Participants were asked to complete a pretest before the 3-month exercise program, a posttest at 1 week, and a follow-up test at 3 months after the exercise program. Study measures included four Self-Report Scales and biophysical assessments to collect and assess personal data, lifestyle behaviors, anxiety levels, and metabolic control functions. Of the 86 study participants, 83 completed the posttest and the 3-month follow-up test, including 41 in the experimental group and 42 in the control group. Participants in the experimental group showed significant improvements in body mass index, high-density lipoprotein cholesterol levels, and the level of moderate exercise after the program relative to the control group, as analyzed by generalized estimating equations mixed-model repeated measures. State and trait anxiety levels were also significantly improved from pretest to follow-up test in the experimental group. Finally, the prevalence of metabolic syndrome declined for participants in the experimental group. The HB exercise program produced positive effects on the metabolic indicators and anxiety levels of Taiwanese adults with anxiety disorders. Health providers should consider using similar HB exercise programs to help improve the mental and physical health of patients with anxiety disorders in their communities.

  14. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, namely frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and provides a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  15. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, bivariate statistical modeler (BSM), for the BSA technique is proposed. Three popular BSA techniques, namely frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and provided with a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
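
    The frequency-ratio model named in both records has a simple core: for each class of a conditioning factor, it compares the class's share of hazard occurrences with its share of total area. A hedged numpy sketch with toy data (not the BSM tool's implementation):

```python
import numpy as np

# Frequency ratio (FR) per factor class:
#   FR = (share of hazard occurrences in the class) / (share of area in it).
# FR > 1 indicates a positive association with the hazard. Toy data only.
factor_class = np.array([0, 0, 1, 1, 1, 2, 2, 2, 2, 2])  # class id per pixel
hazard       = np.array([0, 0, 1, 1, 0, 1, 0, 0, 0, 0])  # 1 = hazard pixel

def frequency_ratio(factor_class, hazard):
    fr = {}
    for k in np.unique(factor_class):
        in_class = factor_class == k
        hazard_share = hazard[in_class].sum() / hazard.sum()
        area_share = in_class.sum() / in_class.size
        fr[int(k)] = hazard_share / area_share
    return fr

fr = frequency_ratio(factor_class, hazard)
# Class 1 holds 2 of 3 hazard pixels on 30% of the area -> FR = (2/3)/0.3
```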

  16. An Optimization Code for Nonlinear Transient Problems of a Large Scale Multidisciplinary Mathematical Model

    NASA Astrophysics Data System (ADS)

    Takasaki, Koichi

    This paper presents a program for the multidisciplinary optimization and identification problem of the nonlinear model of large aerospace vehicle structures. The program constructs the global matrix of the dynamic system in the time direction by the p-version finite element method (pFEM), and the basic matrix for each pFEM node in the time direction is described by a sparse matrix similarly to the static finite element problem. The algorithm used by the program does not require the Hessian matrix of the objective function and so has low memory requirements. It also has a relatively low computational cost, and is suited to parallel computation. The program was integrated as a solver module of the multidisciplinary analysis system CUMuLOUS (Computational Utility for Multidisciplinary Large scale Optimization of Undense System) which is under development by the Aerospace Research and Development Directorate (ARD) of the Japan Aerospace Exploration Agency (JAXA).

  17. Conceptual ecological models to guide integrated landscape monitoring of the Great Basin

    USGS Publications Warehouse

    Miller, D.M.; Finn, S.P.; Woodward, Andrea; Torregrosa, Alicia; Miller, M.E.; Bedford, D.R.; Brasher, A.M.

    2010-01-01

    The Great Basin Integrated Landscape Monitoring Pilot Project was developed in response to the need for a monitoring and predictive capability that addresses changes in broad landscapes and waterscapes. Human communities and needs are nested within landscapes formed by interactions among the hydrosphere, geosphere, and biosphere. Understanding the complex processes that shape landscapes and deriving ways to manage them sustainably while meeting human needs require sophisticated modeling and monitoring. This document summarizes current understanding of ecosystem structure and function for many of the ecosystems within the Great Basin using conceptual models. The conceptual ecosystem models identify key ecological components and processes, identify external drivers, develop a hierarchical set of models that address both site and landscape attributes, inform regional monitoring strategy, and identify critical gaps in our knowledge of ecosystem function. The report also illustrates an approach for temporal and spatial scaling from site-specific models to landscape models and for understanding cumulative effects. Eventually, conceptual models can provide a structure for designing monitoring programs, interpreting monitoring and other data, and assessing the accuracy of our understanding of ecosystem functions and processes.

  18. The NHV rehabilitation services program improves long-term physical functioning in survivors of the 2008 Sichuan earthquake: a longitudinal quasi experiment.

    PubMed

    Zhang, Xia; Reinhardt, Jan D; Gosney, James E; Li, Jianan

    2013-01-01

    Long-term disability following natural disasters significantly burdens survivors and the impacted society. Nevertheless, medical rehabilitation programming has been historically neglected in disaster relief planning. 'NHV' is a rehabilitation services program comprising non-governmental organizations (NGOs) (N), local health departments (H), and professional rehabilitation volunteers (V) which aims to improve long-term physical functioning in survivors of the 2008 Sichuan earthquake. We aimed to evaluate the effectiveness of the NHV program. 510 of 591 enrolled earthquake survivors participated in this longitudinal quasi-experimental study (86.3%). The early intervention group (NHV-E) consisted of 298 survivors who received institutional-based rehabilitation (IBR) followed by community-based rehabilitation (CBR); the late intervention group (NHV-L) was comprised of 101 survivors who began rehabilitation one year later. The control group of 111 earthquake survivors did not receive IBR/CBR. Physical functioning was assessed using the Barthel Index (BI). Data were analyzed with a mixed-effects Tobit regression model. Physical functioning was significantly increased in the NHV-E and NHV-L groups at follow-up but not in the control group after adjustment for gender, age, type of injury, and time to measurement. We found significant effects of both NHV (11.14, 95% CI 9.0-13.3) and spontaneous recovery (5.03; 95% CI 1.73-8.34). The effect of NHV-E (11.3, 95% CI 9.0-13.7) was marginally greater than that of NHV-L (10.7, 95% CI 7.9-13.6). It could not, however, be determined whether specific IBR or CBR program components were effective, since individual component exposures were not evaluated. Our analysis shows that the NHV program improved the long-term physical functioning of Sichuan earthquake survivors with disabling injuries. 
The comprehensive rehabilitation program benefitted the individual and society, rehabilitation services in China, and international rehabilitation disaster relief planning. Similar IBR/CBR programs should therefore be considered for future large-scale rehabilitation disaster relief efforts.

  19. Requirements for implementation of Kuessner and Wagner indicial lift growth functions into the FLEXSTAB computer program system for use in dynamic loads analyses

    NASA Technical Reports Server (NTRS)

    Miller, R. D.; Rogers, J. T.

    1975-01-01

    General requirements for dynamic loads analyses are described. The indicial lift growth function unsteady subsonic aerodynamic representation is reviewed, and the FLEXSTAB CPS is evaluated with respect to these general requirements. The effects of residual flexibility techniques on dynamic loads analyses are also evaluated using a simple dynamic model.

  20. A Combination of Hand-Held Models and Computer Imaging Programs Helps Students Answer Oral Questions about Molecular Structure and Function: A Controlled Investigation of Student Learning

    ERIC Educational Resources Information Center

    Harris, Michelle A.; Peck, Ronald F.; Colton, Shannon; Morris, Jennifer; Neto, Elias Chaibub; Kallio, Julie

    2009-01-01

    We conducted a controlled investigation to examine whether a combination of computer imagery and tactile tools helps introductory cell biology laboratory undergraduate students better learn about protein structure/function relationships as compared with computer imagery alone. In all five laboratory sections, students used the molecular imaging…
