Security and Cloud Outsourcing Framework for Economic Dispatch
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sarker, Mushfiqur R.; Wang, Jianhui; Li, Zuyi
The computational complexity and problem sizes of power grid applications have increased significantly with the advent of renewable resources and smart grid technologies. The current paradigm of solving these problems relies on in-house high-performance computing infrastructures, which carry high capital expenditures, maintenance burdens, and limited scalability. Cloud computing is an ideal alternative due to its powerful computational capacity, rapid scalability, and high cost-effectiveness. A major challenge, however, remains: highly confidential grid data is susceptible to potential cyberattacks when outsourced to the cloud. In this work, a security and cloud outsourcing framework is developed for the Economic Dispatch (ED) linear programming application. The security framework transforms the ED linear program into a confidentiality-preserving linear program that masks both the data and the problem structure, thus enabling secure outsourcing to the cloud. Results show that for large grid test cases the performance gains and costs outperform the in-house infrastructure.
2017-04-24
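The confidentiality-preserving transformation described above can be illustrated with a small sketch. The paper's exact masking scheme is not reproduced here; the code below shows one standard way to hide an LP's data and structure, via an assumed secret random invertible change of variables, before sending the problem to an untrusted solver.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(42)

# Original (confidential) LP: minimize c^T x subject to A x <= b.
# x >= 0 is encoded as explicit rows so the mask can hide it too.
c = np.array([-1.0, -2.0])
A = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([4.0, 0.0, 0.0])

# Mask: substitute x = Q y with a secret invertible Q. The cloud sees
# only (Q^T c, A Q, b), which reveals neither c, A, nor the meaning of y.
while True:
    Q = rng.normal(size=(2, 2))
    if abs(np.linalg.det(Q)) > 1e-3:  # keep Q comfortably invertible
        break

masked = linprog(Q.T @ c, A_ub=A @ Q, b_ub=b,
                 bounds=[(None, None)] * 2, method="highs")

# The data owner recovers the true solution locally: x* = Q y*.
x_recovered = Q @ masked.x
print(x_recovered)  # optimum of the original LP, here (0, 4)
```

Because the substitution is a bijection on the feasible set and preserves the objective up to the same bijection, the cloud's answer maps back exactly to the original optimum.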
NASA Technical Reports Server (NTRS)
Nakazawa, Shohei
1991-01-01
Formulations and algorithms implemented in the MHOST finite element program are discussed. The code uses a novel mixed iterative solution technique for efficient 3-D computation of turbine engine hot section components. The general framework of the variational formulation and solution algorithms, derived from the mixed three-field Hu-Washizu principle, is discussed. This formulation enables the use of nodal interpolation for coordinates, displacements, strains, and stresses. The algorithmic description of the mixed iterative method includes variations for quasi-static, transient dynamic, and buckling analyses. The global-local analysis procedure, referred to as subelement refinement, is developed within the mixed iterative solution framework and presented in detail. The numerically integrated isoparametric elements implemented in the framework are discussed. Methods to filter certain parts of the strain and to project element-discontinuous quantities to the nodes are developed for a family of linear elements. Integration algorithms are described for the linear and nonlinear equations included in the MHOST program.
A Mixed Integer Linear Programming Approach to Electrical Stimulation Optimization Problems.
Abouelseoud, Gehan; Abouelseoud, Yasmine; Shoukry, Amin; Ismail, Nour; Mekky, Jaidaa
2018-02-01
Electrical stimulation optimization is a challenging problem. Even when a single region is targeted for excitation, the problem remains a constrained multi-objective optimization problem. The constrained nature of the problem results from safety concerns while its multi-objectives originate from the requirement that non-targeted regions should remain unaffected. In this paper, we propose a mixed integer linear programming formulation that can successfully address the challenges facing this problem. Moreover, the proposed framework can conclusively check the feasibility of the stimulation goals. This helps researchers to avoid wasting time trying to achieve goals that are impossible under a chosen stimulation setup. The superiority of the proposed framework over alternative methods is demonstrated through simulation examples.
Runtime Analysis of Linear Temporal Logic Specifications
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Havelund, Klaus
2001-01-01
This report presents an approach to checking a running program against its Linear Temporal Logic (LTL) specifications. LTL is a widely used logic for expressing properties of programs viewed as sets of executions. Our approach consists of translating LTL formulae to finite-state automata, which are used as observers of the program behavior. The translation algorithm we propose modifies standard LTL to Büchi automata conversion techniques to generate automata that check finite program traces. The algorithm has been implemented in a tool, which has been integrated with the generic JPaX framework for runtime analysis of Java programs.
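The paper's contribution is an LTL-to-automata translation tuned to finite traces. As a much simpler illustration of finite-trace LTL semantics, the sketch below evaluates formulae directly on a recorded trace by structural recursion; the tuple encoding of formulae and the trace format are invented for this example.

```python
# Finite-trace LTL evaluation by structural recursion (illustrative only;
# an automaton-based checker, as in the paper, is far more efficient online).
# A trace is a list of sets of atomic propositions; formulae are nested tuples.

def holds(f, trace, i=0):
    op = f[0]
    if op == "ap":   return f[1] in trace[i]
    if op == "not":  return not holds(f[1], trace, i)
    if op == "and":  return holds(f[1], trace, i) and holds(f[2], trace, i)
    if op == "or":   return holds(f[1], trace, i) or holds(f[2], trace, i)
    if op == "X":    # next: fails at the end of a finite trace
        return i + 1 < len(trace) and holds(f[1], trace, i + 1)
    if op == "F":    # eventually, within the remaining trace
        return any(holds(f[1], trace, k) for k in range(i, len(trace)))
    if op == "G":    # always, over the remaining trace
        return all(holds(f[1], trace, k) for k in range(i, len(trace)))
    if op == "U":    # f[1] holds until f[2] does
        return any(holds(f[2], trace, k)
                   and all(holds(f[1], trace, j) for j in range(i, k))
                   for k in range(i, len(trace)))
    raise ValueError(op)

# G(req -> F ack): every request is eventually acknowledged.
spec = ("G", ("or", ("not", ("ap", "req")), ("F", ("ap", "ack"))))
trace = [{"req"}, set(), {"ack"}]
print(holds(spec, trace))  # True
```

Note how the finite-trace reading resolves the operators at the end of the trace: X fails past the last state, while G and U quantify only over the states actually observed.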
Low-rank regularization for learning gene expression programs.
Ye, Guibo; Tang, Mengfan; Cai, Jian-Feng; Nie, Qing; Xie, Xiaohui
2013-01-01
Learning gene expression programs directly from a set of observations is challenging due to the complexity of gene regulation, the high noise of experimental measurements, and the insufficient number of experimental measurements. Imposing additional constraints with strong and biologically motivated regularizations is critical in developing reliable and effective algorithms for inferring gene expression programs. Here we propose a new form of regularization that constrains the number of independent connectivity patterns between regulators and targets, motivated by the modular design of gene regulatory programs and the belief that the total number of independent regulatory modules should be small. We formulate a multi-target linear regression framework to incorporate this type of regularization, in which the number of independent connectivity patterns is expressed as the rank of the connectivity matrix between regulators and targets. We then generalize the linear framework to nonlinear cases, and prove that the generalized low-rank regularization model is still convex. Efficient algorithms are derived to solve both the linear and nonlinear low-rank regularized problems. Finally, we test the algorithms on three gene expression datasets, and show that the low-rank regularization improves the accuracy of gene expression prediction in these datasets.
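A rank penalty of the kind described above is typically relaxed to a nuclear-norm regularizer, whose proximal operator is singular value thresholding. The following is a minimal proximal-gradient sketch for the linear case; the dimensions, step size, regularization weight, and synthetic data are invented for illustration and are not the paper's algorithm.

```python
import numpy as np

def svt(M, tau):
    """Proximal operator of tau * nuclear norm: soft-threshold singular values."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def low_rank_regression(X, Y, lam, iters=500):
    """Minimize 0.5*||XW - Y||_F^2 + lam*||W||_* by proximal gradient."""
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the gradient
    W = np.zeros((X.shape[1], Y.shape[1]))
    for _ in range(iters):
        grad = X.T @ (X @ W - Y)
        W = svt(W - grad / L, lam / L)     # gradient step, then shrink rank
    return W

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))                               # regulator levels
W_true = rng.normal(size=(10, 2)) @ rng.normal(size=(2, 5))  # rank-2 "program"
Y = X @ W_true                                               # target expression
W = low_rank_regression(X, Y, lam=0.1)
print(np.linalg.svd(W, compute_uv=False))  # only ~2 significant singular values
```

On this synthetic rank-2 problem the recovered connectivity matrix is effectively rank 2, mirroring the "few regulatory modules" assumption in the abstract.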
Establishing a conceptual framework for handoffs using communication theory.
Mohorek, Matthew; Webb, Travis P
2015-01-01
A significant consequence of the 2003 Accreditation Council for Graduate Medical Education duty hour restrictions has been the dramatic increase in patient care handoffs. Ineffective handoffs have been identified as the third most common cause of medical error. However, research into health care handoffs lacks a unifying foundational structure. We sought to identify a conceptual framework that could be used to critically analyze handoffs. A scholarly review focusing on communication theory as a possible conceptual framework for handoffs was conducted. A PubMed search of published handoff research was also performed, and the literature was analyzed and matched to the most relevant theory for health care handoff models. The Shannon-Weaver Linear Model of Communication was identified as the most appropriate conceptual framework for health care handoffs. The Linear Model describes communication as a linear process. A source encodes a message into a signal, the signal is sent through a channel, and the signal is decoded back into a message at the destination, all in the presence of internal and external noise. The Linear Model identifies 3 separate instances in handoff communication where error occurs: the transmitter (message encoding), channel, and receiver (signal decoding). The Linear Model of Communication is a suitable conceptual framework for handoff research and provides a structured approach for describing handoff variables. We propose the Linear Model should be used as a foundation for further research into interventions to improve health care handoffs. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Robust nonlinear control of vectored thrust aircraft
NASA Technical Reports Server (NTRS)
Doyle, John C.; Murray, Richard; Morris, John
1993-01-01
An interdisciplinary program in robust control for nonlinear systems with applications to a variety of engineering problems is outlined. Major emphasis will be placed on flight control, with both experimental and analytical studies. This program builds on recent new results in control theory for stability, stabilization, robust stability, robust performance, synthesis, and model reduction in a unified framework using Linear Fractional Transformations (LFTs), Linear Matrix Inequalities (LMIs), and the structured singular value μ. Most of these new advances have been accomplished by the Caltech controls group independently or in collaboration with researchers in other institutions. These recent results offer a new and remarkably unified framework for all aspects of robust control, but what is particularly important for this program is that they also have important implications for system identification and control of nonlinear systems. This combines well with Caltech's expertise in nonlinear control theory, both in geometric methods and in methods for systems with constraints and saturations.
Shillcutt, Samuel D; LeFevre, Amnesty E; Fischer-Walker, Christa L; Taneja, Sunita; Black, Robert E; Mazumder, Sarmila
2017-01-01
This study evaluates the cost-effectiveness of the DAZT program for scaling up treatment of acute child diarrhea in Gujarat, India, using a net-benefit regression framework. Costs were calculated from the societal and caregivers' perspectives, and effectiveness was assessed in terms of coverage of zinc and of both zinc and Oral Rehydration Salts (ORS). Regression models were tested as simple linear regression, with a specified set of covariates, and with a specified set of covariates plus interaction terms; linear regression with endogenous treatment effects was used as the reference case. The DAZT program was cost-effective with over 95% certainty above $5.50 and $7.50 per appropriately treated child in the unadjusted and adjusted models, respectively, with specifications including interaction terms being cost-effective with 85-97% certainty. Findings from this study should be combined with other evidence when considering decisions to scale up programs such as DAZT to promote the use of ORS and zinc to treat child diarrhea.
NASA Astrophysics Data System (ADS)
Guo, Sangang
2017-09-01
There are two stages in solving security-constrained unit commitment (SCUC) problems within a Lagrangian framework: one is to obtain feasible unit states (UC); the other is economic dispatch (ED) for each unit. Solving ED accurately is particularly important for enhancing the efficiency of the overall SCUC solution once feasible unit states are fixed. Two novel methods, named the Convex Combinatorial Coefficient Method and the Power Increment Method, are proposed; both solve ED as a linear programming problem obtained by piecewise linear approximation of the nonlinear convex fuel cost functions. Numerical testing results show that the methods are effective and efficient.
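The piecewise linear approximation underlying such methods can be sketched as follows: each convex quadratic fuel cost is cut into fixed-width segments with increasing slopes, so ED becomes an LP over segment outputs. The cost coefficients, demand, and segment width below are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Two units with convex quadratic costs C_i(P) = a_i*P^2 + b_i*P, P in [0, 100].
units = [(0.01, 10.0), (0.02, 8.0)]
demand, pmax, width = 100.0, 100.0, 5.0
nseg = int(pmax / width)

# One LP variable per segment; its cost coefficient is the segment's slope.
# Convexity makes the slopes increase, so segments fill in the correct order.
slopes, bounds = [], []
for a, b in units:
    for k in range(nseg):
        lo, hi = k * width, (k + 1) * width
        slopes.append(((a * hi**2 + b * hi) - (a * lo**2 + b * lo)) / width)
        bounds.append((0.0, width))

# Equality constraint: all segment outputs together meet the demand.
res = linprog(slopes, A_eq=[np.ones(len(slopes))], b_eq=[demand],
              bounds=bounds, method="highs")
P = res.x.reshape(len(units), nseg).sum(axis=1)
print(P)  # per-unit dispatch, within one segment width of the exact
          # equal-marginal-cost dispatch (33.3, 66.7)
```

Shrinking the segment width drives the LP solution toward the exact equal-incremental-cost dispatch, at the price of more variables.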
SLFP: a stochastic linear fractional programming approach for sustainable waste management.
Zhu, H; Huang, G H
2011-12-01
A stochastic linear fractional programming (SLFP) approach is developed for supporting sustainable municipal solid waste management under uncertainty. The SLFP method can solve ratio optimization problems associated with random information, where chance-constrained programming is integrated into a linear fractional programming framework. It has advantages in: (1) comparing objectives of two aspects, (2) reflecting system efficiency, (3) dealing with uncertainty expressed as probability distributions, and (4) providing optimal-ratio solutions under different system-reliability conditions. The method is applied to a case study of waste flow allocation within a municipal solid waste (MSW) management system. The obtained solutions are useful for identifying sustainable MSW management schemes with maximized system efficiency under various constraint-violation risks. The results indicate that SLFP can support in-depth analysis of the interrelationships among system efficiency, system cost and system-failure risk. Copyright © 2011 Elsevier Ltd. All rights reserved.
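A standard way to solve the deterministic core of such linear fractional programs is the Charnes-Cooper transformation, which turns the ratio objective into an ordinary LP. The sketch below is illustrative only, with invented coefficients; it is not the paper's SLFP formulation, which additionally handles chance constraints.

```python
import numpy as np
from scipy.optimize import linprog

# Maximize (2x1 + x2) / (x1 + x2 + 1)  subject to  x1 + x2 <= 3, x1 <= 2, x >= 0.
c, d, beta = np.array([2.0, 1.0]), np.array([1.0, 1.0]), 1.0
A = np.array([[1.0, 1.0], [1.0, 0.0]])
b = np.array([3.0, 2.0])

# Charnes-Cooper: with t = 1/(d^T x + beta) and y = t*x, maximize c^T y
# subject to A y - b t <= 0 and d^T y + beta*t = 1, with y >= 0, t >= 0.
n = len(c)
obj = np.append(-c, 0.0)                       # linprog minimizes
A_ub = np.hstack([A, -b.reshape(-1, 1)])
A_eq = np.append(d, beta).reshape(1, -1)
res = linprog(obj, A_ub=A_ub, b_ub=[0.0, 0.0], A_eq=A_eq, b_eq=[1.0],
              bounds=[(0.0, None)] * (n + 1), method="highs")

y, t = res.x[:n], res.x[n]
x = y / t                                      # recover the original variables
print(x, c @ x / (d @ x + beta))               # optimal ratio 4/3 at x = (2, 0)
```

The transformation is valid when the denominator is positive over the feasible set, which is exactly the regime in which ratio objectives such as cost-per-unit-efficiency are meaningful.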
Hogan, Vijaya; Rowley, Diane L; White, Stephanie Baker; Faustin, Yanica
2018-02-01
Introduction Existing health disparities frameworks do not adequately incorporate the unique interacting contributing factors leading to health inequities among African Americans, resulting in public health stakeholders' inability to translate these frameworks into practice. Methods We developed dimensionality and R4P to integrate multiple theoretical perspectives into a framework of action to eliminate health inequities experienced by African Americans. Results The dimensional framework incorporates Critical Race Theory and intersectionality, and includes the dimensions of time: past, present, and future. Dimensionality captures the complex linear and non-linear array of influences that cause health inequities, but these pathways do not lend themselves to the development of empirically derived programs, policies, and interventions to promote health equity. R4P provides a framework for addressing the scope of actions needed. The five components of R4P are (1) Remove, (2) Repair, (3) Remediate, (4) Restructure and (5) Provide. Conclusion R4P is designed to translate complex causality into a public health equity planning, assessment, evaluation, and research tool.
NASA Astrophysics Data System (ADS)
Mortensen, Mikael; Langtangen, Hans Petter; Wells, Garth N.
2011-09-01
Finding an appropriate turbulence model for a given flow case usually calls for extensive experimentation with both models and numerical solution methods. This work presents the design and implementation of a flexible, programmable software framework for assisting with numerical experiments in computational turbulence. The framework targets Reynolds-averaged Navier-Stokes models, discretized by finite element methods. The novel implementation makes use of Python and the FEniCS package, the combination of which leads to compact and reusable code, where model- and solver-specific code resemble closely the mathematical formulation of equations and algorithms. The presented ideas and programming techniques are also applicable to other fields that involve systems of nonlinear partial differential equations. We demonstrate the framework in two applications and investigate the impact of various linearizations on the convergence properties of nonlinear solvers for a Reynolds-averaged Navier-Stokes model.
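The trade-off the study examines, namely how the choice of linearization affects nonlinear-solver convergence, can be seen even on a scalar model problem. The equation and starting guess below are invented for illustration: Picard (fixed-point) linearization converges linearly, while Newton linearization converges quadratically.

```python
# Solve u^3 + u = 1 two ways and count iterations to the same tolerance.
tol = 1e-10

def picard(u=0.5):
    # Picard / fixed-point linearization: rewrite the equation as u = 1/(1+u^2).
    for k in range(1000):
        u_new = 1.0 / (1.0 + u * u)
        if abs(u_new - u) < tol:
            return u_new, k + 1
        u = u_new

def newton(u=0.5):
    # Newton linearization: use the full derivative 3u^2 + 1 at each step.
    for k in range(1000):
        step = (u**3 + u - 1.0) / (3.0 * u * u + 1.0)
        u -= step
        if abs(step) < tol:
            return u, k + 1

(u_p, n_p), (u_n, n_n) = picard(), newton()
print(n_p, n_n)  # Picard needs far more iterations than Newton
```

In a PDE setting the same dichotomy appears between Picard-linearized and Newton-linearized Reynolds-averaged systems, with robustness and per-iteration cost pulling in opposite directions.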
Automata-Based Verification of Temporal Properties on Running Programs
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Havelund, Klaus; Lan, Sonie (Technical Monitor)
2001-01-01
This paper presents an approach to checking a running program against its Linear Temporal Logic (LTL) specifications. LTL is a widely used logic for expressing properties of programs viewed as sets of executions. Our approach consists of translating LTL formulae to finite-state automata, which are used as observers of the program behavior. The translation algorithm we propose modifies standard LTL to Büchi automata conversion techniques to generate automata that check finite program traces. The algorithm has been implemented in a tool, which has been integrated with the generic JPaX framework for runtime analysis of Java programs.
Automatic design of synthetic gene circuits through mixed integer non-linear programming.
Huynh, Linh; Kececioglu, John; Köppe, Matthias; Tagkopoulos, Ilias
2012-01-01
Automatic design of synthetic gene circuits poses a significant challenge to synthetic biology, primarily due to the complexity of biological systems and the lack of rigorous optimization methods that can cope with the combinatorial explosion as the number of biological parts increases. Current optimization methods for synthetic gene design rely on heuristic algorithms that are usually not deterministic, deliver sub-optimal solutions, and provide no guarantees on convergence or error bounds. Here, we introduce an optimization framework for the problem of part selection in synthetic gene circuits that is based on mixed integer non-linear programming (MINLP), a deterministic method that finds the globally optimal solution and guarantees convergence in finite time. Given a synthetic gene circuit, a library of characterized parts, and user-defined constraints, our method can find the optimal selection of parts that satisfies the constraints and best approximates the objective function given by the user. We evaluated the proposed method in the design of three synthetic circuits (a toggle switch, a transcriptional cascade, and a band detector), with both experimentally constructed and synthetic promoter libraries. Scalability and robustness analysis shows that the proposed framework scales well with the library size and the solution space. The work described here is a step towards a unifying, realistic framework for the automated design of biological circuits. PMID:22536398
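As a toy illustration of the part-selection problem (not the authors' MINLP encoding; the part libraries, the multiplicative expression model, and the burden constraint are all invented), a small library can simply be enumerated exhaustively. This brute force is exactly what becomes infeasible as libraries grow, and what a MINLP solver with optimality guarantees avoids.

```python
from itertools import product

# Invented toy model: predicted expression = promoter strength * RBS strength.
promoters = {"pLow": 0.5, "pMed": 1.0, "pHigh": 2.0}
rbs_lib   = {"rbsA": 1.0, "rbsB": 3.0, "rbsC": 5.0}

target, max_load = 4.5, 6.0   # desired expression level and a burden constraint

def best_parts():
    # Enumerate all part combinations that satisfy the constraint...
    feasible = [(p, r) for p, r in product(promoters, rbs_lib)
                if promoters[p] * rbs_lib[r] <= max_load]
    # ...and pick the one with minimal squared deviation from the target.
    return min(feasible,
               key=lambda pr: (promoters[pr[0]] * rbs_lib[pr[1]] - target) ** 2)

print(best_parts())  # ('pMed', 'rbsC'): predicted expression 5.0
```

With 9 combinations enumeration is instant; with tens of parts per slot and several slots, the search space explodes combinatorially, motivating the deterministic MINLP formulation of the paper.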
Chen, Vivian Yi-Ju; Yang, Tse-Chuan
2012-08-01
An increasing interest in exploring spatial non-stationarity has generated several specialized analytic software programs; however, few of these programs can be integrated natively into a well-developed statistical environment such as SAS. We not only developed a set of SAS macro programs to fill this gap, but also expanded the geographically weighted generalized linear modeling (GWGLM) by integrating the strengths of SAS into the GWGLM framework. Three features distinguish our work. First, the macro programs of this study provide more kernel weighting functions than the existing programs. Second, with our codes the users are able to better specify the bandwidth selection process compared to the capabilities of existing programs. Third, the development of the macro programs is fully embedded in the SAS environment, providing great potential for future exploration of complicated spatially varying coefficient models in other disciplines. We provided three empirical examples to illustrate the use of the SAS macro programs and demonstrated the advantages explained above. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
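The kernel weighting at the core of geographically weighted regression can be sketched in a few lines. This is a generic Python illustration with invented data, not the SAS macros themselves; the Gaussian and bisquare kernels below are two of the standard choices such programs offer.

```python
import numpy as np

def gaussian_w(d, bw):
    """Gaussian kernel: weight decays smoothly with distance d, bandwidth bw."""
    return np.exp(-0.5 * (d / bw) ** 2)

def bisquare_w(d, bw):
    """Bisquare kernel: compact support, zero weight beyond the bandwidth."""
    return np.where(d < bw, (1 - (d / bw) ** 2) ** 2, 0.0)

def gwr_fit(coords, X, y, focal, bw, kernel=gaussian_w):
    """Local weighted least squares at one focal location."""
    d = np.linalg.norm(coords - focal, axis=1)
    W = np.diag(kernel(d, bw))
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

rng = np.random.default_rng(1)
coords = rng.uniform(0, 10, size=(50, 2))   # invented observation locations
x = rng.normal(size=50)
X = np.column_stack([np.ones(50), x])
y = 1.0 + 2.0 * x                           # a globally linear surface
beta = gwr_fit(coords, X, y, focal=np.array([5.0, 5.0]), bw=2.0)
print(beta)  # ~[1, 2]: on spatially stationary data the local fit
             # recovers the global coefficients
```

Spatial non-stationarity shows up when the fitted coefficients vary with the focal location; bandwidth selection, which the macros automate, controls how local each fit is.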
Consideration in selecting crops for the human-rated life support system: a Linear Programming model
NASA Technical Reports Server (NTRS)
Wheeler, E. F.; Kossowski, J.; Goto, E.; Langhans, R. W.; White, G.; Albright, L. D.; Wilcox, D.; Henninger, D. L. (Principal Investigator)
1996-01-01
A Linear Programming model has been constructed which aids in selecting appropriate crops for CELSS (Controlled Environment Life Support System) food production. A team of Controlled Environment Agriculture (CEA) faculty, staff, graduate students, and invited experts representing more than a dozen disciplines provided a wide range of expertise in developing the model and the crop production program. The model incorporates the nutritional content and controlled-environment production yields of carefully chosen crops into a framework in which a crop mix can be constructed to suit the astronauts' needs. The crew's nutritional requirements can be adequately satisfied with only a few crops (assuming vitamin and mineral supplements are provided), but this will not be satisfactory from a culinary standpoint. The model is flexible enough that taste- and variety-driven food choices can be built into it.
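The crop-mix selection described above is a classic diet-style LP. A stripped-down sketch follows; the crops, per-area nutrient yields, and requirements are invented, and the real model carries many more crops, nutrients, and culinary constraints.

```python
from scipy.optimize import linprog

# Invented per-square-meter yields: columns are crops (e.g. wheat, soybean),
# rows are nutrients (energy, protein), in arbitrary units.
yields = [[2.0, 1.0],    # energy per m^2 of each crop
          [1.0, 3.0]]    # protein per m^2 of each crop
req = [10.0, 9.0]        # minimum daily nutrient requirements

# Minimize total growing area subject to meeting the requirements.
# linprog wants A_ub x <= b_ub, so the >= constraints are negated.
res = linprog([1.0, 1.0],
              A_ub=[[-v for v in row] for row in yields],
              b_ub=[-r for r in req],
              bounds=[(0.0, None)] * 2, method="highs")
print(res.x, res.fun)  # area split (4.2, 1.6) m^2, total 5.8 m^2
```

Taste and variety preferences enter the same framework naturally as extra linear constraints, for example minimum areas for favored crops.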
Approximate labeling via graph cuts based on linear programming.
Komodakis, Nikos; Tziritas, Georgios
2007-08-01
A new framework is presented for both understanding and developing graph-cut-based combinatorial algorithms suitable for the approximate optimization of a very wide class of Markov Random Fields (MRFs) that are frequently encountered in computer vision. The proposed framework utilizes tools from the duality theory of linear programming in order to provide an alternative and more general view of state-of-the-art techniques like the α-expansion algorithm, which is included merely as a special case. Moreover, contrary to α-expansion, the derived algorithms generate solutions with guaranteed optimality properties for a much wider class of problems, for example, even for MRFs with nonmetric potentials. In addition, they are capable of providing per-instance suboptimality bounds on all occasions, including discrete MRFs with an arbitrary potential function. These bounds prove to be very tight in practice (that is, very close to 1), which means that the resulting solutions are almost optimal. Our algorithms' effectiveness is demonstrated by presenting experimental results on a variety of low-level vision tasks, such as stereo matching, image restoration, image completion, and optical flow estimation, as well as on synthetic problems.
Palazuelos, Daniel; DaEun Im, Dana; Peckarsky, Matthew; Schwarz, Dan; Farmer, Didi Bertrand; Dhillon, Ranu; Johnson, Ari; Orihuela, Claudia; Hackett, Jill; Bazile, Junior; Berman, Leslie; Ballard, Madeleine; Panjabi, Raj; Ternier, Ralph; Slavin, Sam; Lee, Scott; Selinsky, Steve; Mitnick, Carole Diane
2013-01-01
Introduction Despite decades of experience with community health workers (CHWs) in a wide variety of global health projects, there is no established conceptual framework that structures how implementers and researchers can understand, study and improve their respective programs based on lessons learned by other CHW programs. Objective To apply an original, non-linear framework and case study method, 5-SPICE, to multiple sister projects of a large, international non-governmental organization (NGO), and other CHW projects. Design Engaging a large group of implementers, researchers and the best available literature, the 5-SPICE framework was refined and then applied to a selection of CHW programs. Insights gleaned from the case study method were summarized in a tabular format named the ‘5×5-SPICE chart’. This format graphically lists the ways in which essential CHW program elements interact, both positively and negatively, in the implementation field. Results The 5×5-SPICE charts reveal a variety of insights that come from a more complex understanding of how essential CHW project elements interact and influence each other in their unique context. Some have been well described in the literature previously, while others are exclusive to this article. An analysis of how best to compensate CHWs is also offered as an example of the type of insights that this method may yield. Conclusions The 5-SPICE framework is a novel instrument that can be used to guide discussions about CHW projects. Insights from this process can help guide quality improvement efforts, or be used as hypotheses that will form the basis of a program's research agenda. Recent experience with research protocols embedded into successfully implemented projects demonstrates how such hypotheses can be rigorously tested. PMID:23561023
A framework for reporting tree cover attributes in agricultural landscapes
Dacia M. Meneguzzo; Greg C. Liknes
2012-01-01
The definition of forest land used by the USDA Forest Service's Forest Inventory and Analysis program includes area, width, and density requirements. These requirements frequently exclude from the inventory any trees occupying narrow riparian corridors or linear tree plantings (e.g., windbreaks and shelterbelts). With recent attention being paid to such topics as bio-...
The integration of claims to health-care: a programming approach.
Anand, Paul
2003-09-01
The paper contributes to the use of social choice and welfare theory in health economics by developing and applying the integration of claims framework to health-care rationing. Related to Sen's critique of neo-classical welfare economics, the integration of claims framework recognises three primitive sources of claim: consequences, deontology and procedures. A taxonomy is presented with the aid of which it is shown that social welfare functions reflecting these claims individually or together, can be specified. Some of the resulting social choice rules can be regarded as generalisations of health-maximisation and all have normative justifications, though the justifications may not be universally acceptable. The paper shows how non-linear programming can be used to operationalise such choice rules and illustrates their differential impacts on the optimal provision of health-care. Following discussion of relations to the capabilities framework and the context in which rationing occurs, the paper concludes that the integration of claims provides a viable framework for modelling health-care rationing that is technically rigorous, general and tractable, as well as being consistent with relevant moral considerations and citizen preferences.
Aether: leveraging linear programming for optimal cloud computing in genomics.
Luber, Jacob M; Tierney, Braden T; Cofer, Evan M; Patel, Chirag J; Kostic, Aleksandar D
2018-05-01
Across biology, we are seeing rapid developments in the scale of data production without a corresponding increase in data analysis capabilities. Here, we present Aether (http://aether.kosticlab.org), an intuitive, easy-to-use, cost-effective and scalable framework that uses linear programming to optimally bid on and deploy combinations of underutilized cloud computing resources. Our approach simultaneously minimizes the cost of data analysis and provides an easy transition from users' existing HPC pipelines. Data utilized are available at https://pubs.broadinstitute.org/diabimmune and with EBI SRA accession ERP005989. Source code is available at https://github.com/kosticlab/aether. Examples, documentation and a tutorial are available at http://aether.kosticlab.org. Contact: chirag_patel@hms.harvard.edu or aleksandar.kostic@joslin.harvard.edu. Supplementary data are available at Bioinformatics online.
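The core allocation idea can be sketched as a small LP: choose an instance mix that covers resource requirements at minimum cost. This is an illustration with invented instance types and prices, not Aether's actual model, which additionally handles spot-market bidding; instance counts are also relaxed to continuous values here, whereas the real decision is integral.

```python
from scipy.optimize import linprog

# Invented instance catalog: (hourly price, vCPUs, RAM in GB).
catalog = [(0.10, 4, 16), (0.06, 2, 4)]
need_cpu, need_ram = 32, 64

prices = [p for p, _, _ in catalog]
# Coverage constraints in <= form: -cpu_i * n_i <= -need, likewise for RAM.
A_ub = [[-cpu for _, cpu, _ in catalog],
        [-ram for _, _, ram in catalog]]
b_ub = [-need_cpu, -need_ram]

res = linprog(prices, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None)] * len(catalog), method="highs")
print(res.x, res.fun)  # cheapest mix: 8 large instances, $0.80/hour
```

In practice the same structure scales to many instance types and fluctuating spot prices, which is where an LP solver pays off over manual provisioning.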
NASA Technical Reports Server (NTRS)
Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.
1982-01-01
A variety of artificial intelligence techniques which could be used with regard to NASA space applications and robotics were evaluated. The techniques studied were decision tree manipulators, problem solvers, rule based systems, logic programming languages, representation language languages, and expert systems. The overall structure of a robotic simulation tool was defined and a framework for that tool developed. Nonlinear and linearized dynamics equations were formulated for n link manipulator configurations. A framework for the robotic simulation was established which uses validated manipulator component models connected according to a user defined configuration.
NASA Astrophysics Data System (ADS)
Zhang, Chenglong; Guo, Ping
2017-10-01
Vague and fuzzy parametric information is a challenging issue in irrigation water management problems. In response, a generalized fuzzy credibility-constrained linear fractional programming (GFCCFP) model is developed for optimal irrigation water allocation under uncertainty. The model is derived by integrating generalized fuzzy credibility-constrained programming (GFCCP) into a linear fractional programming (LFP) optimization framework. It can therefore solve ratio optimization problems with fuzzy parameters, and examine how results vary under different credibility levels and weight coefficients of possibility and necessity. It has advantages in: (1) balancing economic and resource objectives directly; (2) analyzing system efficiency; (3) generating more flexible decision solutions by varying credibility levels and the weight coefficients of possibility and necessity; and (4) supporting in-depth analysis of the interrelationships among system efficiency, credibility level and weight coefficient. The model is applied to a case study of irrigation water allocation in the middle reaches of the Heihe River Basin, northwest China, from which optimal irrigation water allocation solutions are obtained. Moreover, factorial analysis of the two parameters (i.e., λ and γ) indicates that the weight coefficient is the main factor for system efficiency, compared with the credibility level. These results can effectively support reasonable irrigation water resources management and agricultural production.
2013-01-01
Background Optimization procedures to identify gene knockouts for targeted biochemical overproduction have been widely used in modern metabolic engineering. The flux balance analysis (FBA) framework has provided conceptual simplifications for genome-scale dynamic analysis at steady states. Based on FBA, many current optimization methods for targeted bio-productions have been developed under the maximum cell growth assumption. The optimization problem to derive gene knockout strategies has recently been formulated as a bi-level programming problem in OptKnock for maximum targeted bio-productions with maximum growth rates. However, it has been shown that knockout mutants in fact reach steady states with the minimization of metabolic adjustment (MOMA) from the corresponding wild-type strains instead of having maximal growth rates after genetic or metabolic intervention. In this work, we propose a new bi-level computational framework, MOMAKnock, which can derive robust knockout strategies under the MOMA flux distribution approximation. Methods In this new bi-level optimization framework, we aim to maximize the production of targeted chemicals by identifying candidate knockout genes or reactions under phenotypic constraints approximated by the MOMA assumption. Hence, the targeted chemical production is the primary objective of MOMAKnock, while the MOMA assumption is formulated as the inner problem of constraining the knockout metabolic flux to be as close as possible to the steady-state phenotypes of wild-type strains. As this inner problem becomes a quadratic programming problem, a novel adaptive piecewise linearization algorithm is developed in this paper to obtain the exact optimal solution to this new bi-level integer quadratic programming problem for MOMAKnock. Results Our new MOMAKnock model and the adaptive piecewise linearization solution algorithm are tested with a small E. coli core metabolic network and a large-scale iAF1260 E. coli metabolic network. 
The derived knockout strategies are compared with those from OptKnock. Our preliminary experimental results show that MOMAKnock can provide improved targeted productions with more robust knockout strategies. PMID:23368729
Ren, Shaogang; Zeng, Bo; Qian, Xiaoning
2013-01-01
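The inner MOMA objective is quadratic; the toy below gives a rough sense of how piecewise linearization makes such a term LP-compatible. It is not the paper's adaptive algorithm, just a fixed-breakpoint approximation of the squared-deviation penalty.

```python
# Illustrative sketch (not MOMAKnock's adaptive scheme): approximating the
# quadratic penalty d -> d**2 by a piecewise-linear interpolant over fixed
# breakpoints, as one would embed it in a linear program.
import numpy as np

def pwl_quadratic(d, breakpoints):
    """Evaluate the piecewise-linear interpolant of x**2 at d."""
    return np.interp(d, breakpoints, breakpoints ** 2)

bp = np.linspace(-2.0, 2.0, 9)          # breakpoints every 0.5
xs = np.linspace(-2.0, 2.0, 401)
err = np.max(np.abs(pwl_quadratic(xs, bp) - xs ** 2))
# For segment width h, chord interpolation of x**2 errs by at most h**2/4:
print(err <= 0.5 ** 2 / 4 + 1e-12)  # True
```

Halving the breakpoint spacing quarters the worst-case error, which is why an adaptive refinement of the breakpoints (as the paper proposes) can recover the exact quadratic optimum.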
Glocker, Ben; Paragios, Nikos; Komodakis, Nikos; Tziritas, Georgios; Navab, Nassir
2007-01-01
In this paper we propose a novel non-rigid volume registration based on discrete labeling and linear programming. The proposed framework reformulates registration as a minimal path extraction in a weighted graph. The space of solutions is represented using a set of labels which are assigned to predefined displacements. The graph topology corresponds to a regular grid superimposed onto the volume. Links between neighborhood control points introduce smoothness, while links between the graph nodes and the labels (end-nodes) measure the cost induced to the objective function through the selection of a particular deformation for a given control point once projected to the entire volume domain. Higher-order polynomials are used to express the volume deformation from those of the control points. Efficient linear programming that can guarantee the optimal solution up to a (user-defined) bound is used to recover the optimal registration parameters. The method is therefore gradient-free, can encode various similarity metrics (through simple changes to the graph construction), can guarantee a solution within a known bound of the global optimum, and is computationally tractable. Experimental validation using simulated data with known deformation, as well as manually segmented data, demonstrates the strong potential of our approach.
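The discrete-labeling idea can be illustrated on a 1-D chain of control points, where the label assignment (unary matching cost plus pairwise smoothness) is exactly solvable by dynamic programming rather than the paper's LP relaxation; all costs below are invented.

```python
# 1-D toy of discrete-labeling registration: each control point picks a
# displacement label minimizing unary matching cost plus a smoothness
# penalty between neighbours; exact on a chain via dynamic programming.
import numpy as np

labels = np.array([-1.0, 0.0, 1.0])            # candidate displacements
unary = np.array([[0.0, 2.0, 3.0],             # matching cost per point/label
                  [2.0, 0.5, 2.0],
                  [3.0, 2.0, 0.0]])
smooth = 0.4 * np.abs(labels[:, None] - labels[None, :])

n, L = unary.shape
cost = unary[0].copy()
back = np.zeros((n, L), dtype=int)
for i in range(1, n):
    total = cost[:, None] + smooth + unary[i][None, :]
    back[i] = np.argmin(total, axis=0)
    cost = np.min(total, axis=0)

path = [int(np.argmin(cost))]                  # backtrack the optimal labels
for i in range(n - 1, 0, -1):
    path.append(int(back[i][path[-1]]))
displacements = labels[path[::-1]]
print(displacements)  # per-control-point displacement field
```

On grids (rather than chains) this exact recursion is unavailable, which is why the paper resorts to LP with a bounded-suboptimality guarantee.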
Aether: leveraging linear programming for optimal cloud computing in genomics
Luber, Jacob M; Tierney, Braden T; Cofer, Evan M; Patel, Chirag J
2018-01-01
Abstract Motivation Across biology, we are seeing rapid developments in scale of data production without a corresponding increase in data analysis capabilities. Results Here, we present Aether (http://aether.kosticlab.org), an intuitive, easy-to-use, cost-effective and scalable framework that uses linear programming to optimally bid on and deploy combinations of underutilized cloud computing resources. Our approach simultaneously minimizes the cost of data analysis and provides an easy transition from users’ existing HPC pipelines. Availability and implementation Data utilized are available at https://pubs.broadinstitute.org/diabimmune and with EBI SRA accession ERP005989. Source code is available at (https://github.com/kosticlab/aether). Examples, documentation and a tutorial are available at http://aether.kosticlab.org. Contact chirag_patel@hms.harvard.edu or aleksandar.kostic@joslin.harvard.edu Supplementary information Supplementary data are available at Bioinformatics online. PMID:29228186
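The resource-selection core of such a framework reduces to a small linear program; the hypothetical sketch below (instance specs and prices are made up, and the real Aether bids on spot markets with many more constraints) picks fractional amounts of instance types to cover CPU and RAM needs at minimum cost.

```python
# Toy version of LP-based cloud-resource selection: meet total CPU and RAM
# requirements at minimum hourly cost. Specs/prices are hypothetical.
from scipy.optimize import linprog

prices = [0.10, 0.17, 0.35]          # $/hr per instance type (invented)
cpus   = [4, 8, 16]                  # vCPUs per type
ram    = [16, 32, 128]               # GiB per type

need_cpu, need_ram = 64, 512
res = linprog(
    c=prices,
    A_ub=[[-c for c in cpus], [-r for r in ram]],   # >= becomes <= after negation
    b_ub=[-need_cpu, -need_ram],
    bounds=[(0, None)] * 3,
)
print(res.status)  # 0 => an optimal fractional allocation was found
```

A production system would add integrality and per-market availability constraints, but the objective and constraint structure stay linear.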
Spiral: Automated Computing for Linear Transforms
NASA Astrophysics Data System (ADS)
Püschel, Markus
2010-09-01
Writing fast software has become extraordinarily difficult. For optimal performance, programs and their underlying algorithms have to be adapted to take full advantage of the platform's parallelism, memory hierarchy, and available instruction set. To make things worse, the best implementations are often platform-dependent and platforms are constantly evolving, which quickly renders libraries obsolete. We present Spiral, a domain-specific program generation system for important functionality used in signal processing and communication including linear transforms, filters, and other functions. Spiral completely replaces the human programmer. For a desired function, Spiral generates alternative algorithms, optimizes them, compiles them into programs, and intelligently searches for the best match to the computing platform. The main idea behind Spiral is a mathematical, declarative, domain-specific framework to represent algorithms and the use of rewriting systems to generate and optimize algorithms at a high level of abstraction. Experimental results show that the code generated by Spiral competes with, and sometimes outperforms, the best available human-written code.
NASA Astrophysics Data System (ADS)
Mimasu, Ken; Sanz, Verónica; Williams, Ciaran
2016-08-01
We present predictions for the associated production of a Higgs boson at NLO+PS accuracy, including the effect of anomalous interactions between the Higgs and gauge bosons. We present our results in different frameworks, one in which the interaction vertex between the Higgs boson and Standard Model W and Z bosons is parameterized in terms of general Lorentz structures, and one in which Electroweak symmetry breaking is manifestly linear and the resulting operators arise through a six-dimensional effective field theory framework. We present analytic calculations of the Standard Model and Beyond the Standard Model contributions, and discuss the phenomenological impact of the higher order pieces. Our results are implemented in the NLO Monte Carlo program MCFM, and interfaced to shower Monte Carlos through the Powheg box framework.
Dynamic constitutional frameworks for DNA biomimetic recognition.
Catana, Romina; Barboiu, Mihail; Moleavin, Ioana; Clima, Lilia; Rotaru, Alexandru; Ursu, Elena-Laura; Pinteala, Mariana
2015-02-07
Linear and cross-linked dynamic constitutional frameworks generated from reversibly interacting linear PEG/core constituents and cationic sites shed light on the dominant coiling versus linear DNA binding behaviours, closer to the histone DNA binding wrapping mechanism.
LQR-Based Optimal Distributed Cooperative Design for Linear Discrete-Time Multiagent Systems.
Zhang, Huaguang; Feng, Tao; Liang, Hongjing; Luo, Yanhong
2017-03-01
In this paper, a novel linear quadratic regulator (LQR)-based optimal distributed cooperative design method is developed for synchronization control of general linear discrete-time multiagent systems on a fixed, directed graph. Sufficient conditions are derived for synchronization, which restrict the graph eigenvalues into a bounded circular region in the complex plane. The synchronizing speed issue is also considered, and it turns out that the synchronizing region reduces as the synchronizing speed becomes faster. To obtain more desirable synchronizing capacity, the weighting matrices are selected by sufficiently utilizing the guaranteed gain margin of the optimal regulators. Based on the developed LQR-based cooperative design framework, an approximate dynamic programming technique is successfully introduced to overcome the (partially or completely) model-free cooperative design for linear multiagent systems. Finally, two numerical examples are given to illustrate the effectiveness of the proposed design methods.
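As a sketch of the LQR machinery underlying such designs, the generic textbook Riccati iteration below computes a single-agent discrete-time LQR gain (it is not the paper's distributed, graph-dependent scheme; the system matrices are illustrative).

```python
# Minimal discrete-time LQR: iterate the Riccati difference equation to a
# fixed point, then form the feedback gain K. Matrices are illustrative.
import numpy as np

A = np.array([[1.0, 1.0], [0.0, 1.0]])   # double integrator
B = np.array([[0.0], [1.0]])
Q = np.eye(2)
R = np.array([[1.0]])

P = Q.copy()
for _ in range(500):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)

# Closed loop A - B K should be Schur stable (eigenvalues inside unit circle)
eigs = np.linalg.eigvals(A - B @ K)
print(np.all(np.abs(eigs) < 1.0))  # True
```

The gain margin of this optimal regulator is what the paper exploits when choosing weighting matrices to enlarge the synchronizing region.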
A symbiotic approach to fluid equations and non-linear flux-driven simulations of plasma dynamics
NASA Astrophysics Data System (ADS)
Halpern, Federico
2017-10-01
The fluid framework is ubiquitous in studies of plasma transport and stability. Typical forms of the fluid equations are motivated by analytical work dating from several decades ago, before computer simulations were indispensable, and can therefore be suboptimal for numerical computation. We demonstrate a new first-principles approach to obtaining manifestly consistent, skew-symmetric fluid models, ensuring internal consistency and conservation properties even in discrete form. Mass, kinetic, and internal energy become quadratic (and always positive) invariants of the system. The model lends itself to a robust, straightforward discretization scheme with inherent non-linear stability. A simpler, drift-ordered form of the equations is obtained, and first results of their numerical implementation as a binary framework for bulk-fluid global plasma simulations are demonstrated. This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Fusion Energy Sciences, Theory Program, under Award No. DE-FG02-95ER54309.
Klamt, Steffen; Müller, Stefan; Regensburger, Georg; Zanghellini, Jürgen
2018-05-01
The optimization of metabolic rates (as linear objective functions) represents the methodical core of flux-balance analysis techniques which have become a standard tool for the study of genome-scale metabolic models. Besides (growth and synthesis) rates, metabolic yields are key parameters for the characterization of biochemical transformation processes, especially in the context of biotechnological applications. However, yields are ratios of rates, and hence the optimization of yields (as nonlinear objective functions) under arbitrary linear constraints is not possible with current flux-balance analysis techniques. Despite the fundamental importance of yields in constraint-based modeling, a comprehensive mathematical framework for yield optimization is still missing. We present a mathematical theory that allows one to systematically compute and analyze yield-optimal solutions of metabolic models under arbitrary linear constraints. In particular, we formulate yield optimization as a linear-fractional program. For practical computations, we transform the linear-fractional yield optimization problem to a (higher-dimensional) linear problem. Its solutions determine the solutions of the original problem and can be used to predict yield-optimal flux distributions in genome-scale metabolic models. For the theoretical analysis, we consider the linear-fractional problem directly. Most importantly, we show that the yield-optimal solution set (like the rate-optimal solution set) is determined by (yield-optimal) elementary flux vectors of the underlying metabolic model. However, yield- and rate-optimal solutions may differ from each other, and hence optimal (biomass or product) yields are not necessarily obtained at solutions with optimal (growth or synthesis) rates. Moreover, we discuss phase planes/production envelopes and yield spaces, in particular, we prove that yield spaces are convex and provide algorithms for their computation. 
We illustrate our findings by a small example and demonstrate their relevance for metabolic engineering with realistic models of E. coli. We develop a comprehensive mathematical framework for yield optimization in metabolic models. Our theory is particularly useful for the study and rational modification of cell factories designed under given yield and/or rate requirements. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
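The rate-to-yield transformation described above is the classical Charnes–Cooper substitution (y = t·x with the denominator normalized to 1). A minimal sketch with invented numbers, not a metabolic model:

```python
# Charnes–Cooper sketch: turn  max (3*x1 + x2) / (x1 + x2 + 1)  subject to
# 0 <= x1 <= 2, 0 <= x2 <= 3  into an ordinary LP in (y, t) with y = t*x.
from scipy.optimize import linprog

res = linprog(
    c=[-3.0, -1.0, 0.0],                       # maximize 3*y1 + y2
    A_ub=[[1.0, 0.0, -2.0],                    # y1 <= 2*t
          [0.0, 1.0, -3.0]],                   # y2 <= 3*t
    b_ub=[0.0, 0.0],
    A_eq=[[1.0, 1.0, 1.0]],                    # denominator: y1 + y2 + t = 1
    b_eq=[1.0],
    bounds=[(0, None)] * 3,
)
y1, y2, t = res.x
print(round(-res.fun, 6), round(y1 / t, 6))    # optimal ratio 2.0 at x1 = 2.0
```

Dividing the LP solution by t recovers the yield-optimal point of the original fractional program, mirroring the paper's higher-dimensional linear reformulation.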
A Regularized Linear Dynamical System Framework for Multivariate Time Series Analysis.
Liu, Zitao; Hauskrecht, Milos
2015-01-01
Linear Dynamical System (LDS) is an elegant mathematical framework for modeling and learning Multivariate Time Series (MTS). However, in general, it is difficult to set the dimension of an LDS's hidden state space. A small number of hidden states may not be able to model the complexities of an MTS, while a large number of hidden states can lead to overfitting. In this paper, we study learning methods that impose various regularization penalties on the transition matrix of the LDS model and propose a regularized LDS learning framework (rLDS) which aims to (1) automatically shut down LDSs' spurious and unnecessary dimensions, and consequently, address the problem of choosing the optimal number of hidden states; (2) prevent the overfitting problem given a small amount of MTS data; and (3) support accurate MTS forecasting. To learn the regularized LDS from data, we incorporate a second-order cone program and a generalized gradient descent method into the Maximum a Posteriori framework and use Expectation Maximization to obtain a low-rank transition matrix of the LDS model. We propose two priors for modeling the matrix which lead to two instances of our rLDS. We show that our rLDS is able to recover well the intrinsic dimensionality of the time series dynamics and that it improves predictive performance over baselines on both synthetic and real-world MTS datasets.
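For flavour, here is a heavily simplified stand-in for regularized LDS learning: a ridge-regularized least-squares fit of the transition matrix on simulated data. The paper's rLDS uses richer priors, EM, and hidden states; this sketch only conveys the regularized-transition idea.

```python
# Fit the transition matrix of x_{t+1} = A x_t + noise by ridge-regularized
# least squares (a crude stand-in for the paper's priors and EM).
import numpy as np

rng = np.random.default_rng(0)
A_true = np.array([[0.9, 0.1], [0.0, 0.8]])
X = np.zeros((2, 200))
X[:, 0] = [1.0, -1.0]
for t in range(199):
    X[:, t + 1] = A_true @ X[:, t] + 0.01 * rng.standard_normal(2)

X0, X1 = X[:, :-1], X[:, 1:]                   # lagged / led snapshots
lam = 1e-3                                     # ridge penalty on A
A_hat = X1 @ X0.T @ np.linalg.inv(X0 @ X0.T + lam * np.eye(2))
print(np.max(np.abs(A_hat - A_true)) < 0.1)    # recovered dynamics are close
```

Replacing the ridge penalty with a nuclear-norm-style prior is what drives the rank (and hence the effective hidden dimension) down in the paper's formulation.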
NASA Astrophysics Data System (ADS)
Sun, Jingliang; Liu, Chunsheng
2018-01-01
In this paper, the problem of intercepting a manoeuvring target within a fixed final time is posed in a non-linear constrained zero-sum differential game framework. The Nash equilibrium solution is found by solving the finite-horizon constrained differential game problem via an adaptive dynamic programming technique. In addition, a suitable non-quadratic functional is utilised to encode the control constraints into the differential game problem. A single critic network with constant weights and time-varying activation functions is constructed to approximate the solution of the associated time-varying Hamilton-Jacobi-Isaacs equation online. To properly satisfy the terminal constraint, an additional error term is incorporated in a novel weight-updating law such that the terminal constraint error is also minimised over time. By utilising Lyapunov's direct method, the closed-loop differential game system and the estimation weight error of the critic network are proved to be uniformly ultimately bounded. Finally, the effectiveness of the proposed method is demonstrated by using a simple non-linear system and a non-linear missile-target interception system, assuming first-order dynamics for the interceptor and target.
Warid, Warid; Hizam, Hashim; Mariun, Norman; Abdul-Wahab, Noor Izzri
2016-01-01
This paper proposes a new formulation for the multi-objective optimal power flow (MOOPF) problem for meshed power networks considering distributed generation. An efficacious multi-objective fuzzy linear programming optimization (MFLP) algorithm is proposed to solve the aforementioned problem with and without considering the distributed generation (DG) effect. A variant combination of objectives is considered for simultaneous optimization, including power loss, voltage stability, and shunt capacitors MVAR reserve. Fuzzy membership functions for these objectives are designed with extreme targets, whereas the inequality constraints are treated as hard constraints. The multi-objective fuzzy optimal power flow (OPF) formulation was converted into a crisp OPF in a successive linear programming (SLP) framework and solved using an efficient interior point method (IPM). To test the efficacy of the proposed approach, simulations are performed on the IEEE 30-bus and IEEE 118-bus test systems. The MFLP optimization is solved for several optimization cases. The obtained results are compared with those presented in the literature. A unique solution with a high satisfaction for the assigned targets is gained. Results demonstrate the effectiveness of the proposed MFLP technique in terms of solution optimality and rapid convergence. Moreover, the results indicate that using the optimal DG location with the MFLP algorithm provides the solution with the highest quality.
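The max–min step that converts fuzzy objective memberships into a single crisp LP can be sketched as follows; the membership targets and the hard constraint below are invented, not the IEEE test-system data.

```python
# Max-min fuzzy LP sketch: each fuzzy objective gets a linear membership
# function, and one LP maximizes the smallest satisfaction level lam.
from scipy.optimize import linprog

# memberships: mu1 = x1/10, mu2 = x2/8; hard constraint x1 + x2 <= 10
res = linprog(
    c=[0.0, 0.0, -1.0],                        # maximize lam
    A_ub=[[-1.0, 0.0, 10.0],                   # x1 >= 10*lam
          [0.0, -1.0, 8.0],                    # x2 >= 8*lam
          [1.0, 1.0, 0.0]],                    # x1 + x2 <= 10
    b_ub=[0.0, 0.0, 10.0],
    bounds=[(0, None), (0, None), (0, 1)],
)
print(round(res.x[2], 4))  # common satisfaction level, 5/9 ≈ 0.5556
```

Both objectives end up equally satisfied, which is the characteristic compromise the max–min operator produces before any SLP refinement.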
A Current Logical Framework: The Propositional Fragment
2003-01-01
Under the Curry-Howard isomorphism, M can also be read as a proof term, and A as a proposition of intuitionistic linear logic in its formulation as DILL...the obligation to ensure that the underlying logic (via the Curry-Howard isomorphism, if you like) is sensible. In particular, the principles of...Proceedings of the International Logic Programming Symposium (ILPS), pages 51-65, Portland, Oregon, December 1995. MIT Press. 6. G. Bellin and P. J
NASA Astrophysics Data System (ADS)
Milgram, David L.; Kahn, Philip; Conner, Gary D.; Lawton, Daryl T.
1988-12-01
The goal of this effort is to develop and demonstrate prototype processing capabilities for a knowledge-based system to automatically extract and analyze features from Synthetic Aperture Radar (SAR) imagery. This effort constitutes Phase 2 funding through the Defense Small Business Innovative Research (SBIR) Program. Previous work examined the feasibility of and technology issues involved in the development of an automated linear feature extraction system. This final report documents this examination and the technologies involved in automating this image understanding task. In particular, it reports on a major software delivery containing an image processing algorithmic base, a perceptual structures manipulation package, a preliminary hypothesis management framework and an enhanced user interface.
Klein-Weyl's program and the ontology of gauge and quantum systems
NASA Astrophysics Data System (ADS)
Catren, Gabriel
2018-02-01
We distinguish two orientations in Weyl's analysis of the fundamental role played by the notion of symmetry in physics, namely an orientation inspired by Klein's Erlangen program and a phenomenological-transcendental orientation. By privileging the former to the detriment of the latter, we sketch a group(oid)-theoretical program-that we call the Klein-Weyl program-for the interpretation of both gauge theories and quantum mechanics in a single conceptual framework. This program is based on Weyl's notion of a "structure-endowed entity" equipped with a "group of automorphisms". First, we analyze what Weyl calls the "problem of relativity" in the frameworks provided by special relativity, general relativity, and Yang-Mills theories. We argue that both general relativity and Yang-Mills theories can be understood in terms of a localization of Klein's Erlangen program: while the latter describes the group-theoretical automorphisms of a single structure (such as homogenous geometries), local gauge symmetries and the corresponding gauge fields (Ehresmann connections) can be naturally understood in terms of the groupoid-theoretical isomorphisms in a family of identical structures. Second, we argue that quantum mechanics can be understood in terms of a linearization of Klein's Erlangen program. This stance leads us to an interpretation of the fact that quantum numbers are "indices characterizing representations of groups" ((Weyl, 1931a), p.xxi) in terms of a correspondence between the ontological categories of identity and determinateness.
NASA Technical Reports Server (NTRS)
Folta, David C.; Carpenter, J. Russell
1999-01-01
A decentralized control is investigated for applicability to the autonomous formation flying control algorithm developed by GSFC for the New Millennium Program Earth Observing-1 (EO-1) mission. This decentralized framework has the following characteristics: The approach is non-hierarchical, and coordination by a central supervisor is not required; Detected failures degrade the system performance gracefully; Each node in the decentralized network processes only its own measurement data, in parallel with the other nodes; Although the total computational burden over the entire network is greater than it would be for a single, centralized controller, fewer computations are required locally at each node; Requirements for data transmission between nodes are limited to only the dimension of the control vector, at the cost of maintaining a local additional data vector. The data vector compresses all past measurement history from all the nodes into a single vector of the dimension of the state; and The approach is optimal with respect to standard cost functions. The current approach is valid for linear time-invariant systems only. Similar to the GSFC formation flying algorithm, the extension to linear LQG time-varying systems requires that each node propagate its filter covariance forward (navigation) and controller Riccati matrix backward (guidance) at each time step. Extension of the GSFC algorithm to non-linear systems can also be accomplished via linearization about a reference trajectory in the standard fashion, or linearization about the current state estimate as with the extended Kalman filter. To investigate the feasibility of the decentralized integration with the GSFC algorithm, an existing centralized LQG design for a single spacecraft orbit control problem is adapted to the decentralized framework while using the GSFC algorithm's state transition matrices and framework. 
The existing GSFC design uses both reference trajectories of each spacecraft in formation and by appropriate choice of coordinates and simplified measurement modeling is formulated as a linear time-invariant system. Results for improvements to the GSFC algorithm and a multiple satellite formation will be addressed. The goal of this investigation is to progressively relax the assumptions that result in linear time-invariance, ultimately to the point of linearization of the non-linear dynamics about the current state estimate as in the extended Kalman filter. An assessment will then be made about the feasibility of the decentralized approach to the realistic formation flying application of the EO-1/Landsat 7 formation flying experiment.
Quantifying circular RNA expression from RNA-seq data using model-based framework.
Li, Musheng; Xie, Xueying; Zhou, Jing; Sheng, Mengying; Yin, Xiaofeng; Ko, Eun-A; Zhou, Tong; Gu, Wanjun
2017-07-15
Circular RNAs (circRNAs) are a class of non-coding RNAs that are widely expressed in various cell lines and tissues of many organisms. Although the exact function of many circRNAs is largely unknown, the cell type- and tissue-specific circRNA expression has implicated their crucial functions in many biological processes. Hence, the quantification of circRNA expression from high-throughput RNA-seq data is becoming an important task. Although many model-based methods have been developed to quantify linear RNA expression from RNA-seq data, these methods are not applicable to circRNA quantification. Here, we propose a novel strategy that transforms circular transcripts to pseudo-linear transcripts and estimates the expression values of both circular and linear transcripts using an existing model-based algorithm, Sailfish. The new strategy can accurately estimate transcript expression of both linear and circular transcripts from RNA-seq data. Several factors, such as gene length, amount of expression and the ratio of circular to linear transcripts, had impacts on the quantification performance for circular transcripts. In comparison to count-based tools, the new computational framework had superior performance in estimating the amount of circRNA expression from both simulated and real ribosomal-RNA-depleted (rRNA-depleted) RNA-seq datasets. On the other hand, considering circular transcripts in expression quantification from rRNA-depleted RNA-seq data substantially increased the accuracy of linear transcript expression estimates. Our proposed strategy is implemented in a program named Sailfish-cir. Sailfish-cir is freely available at https://github.com/zerodel/Sailfish-cir . tongz@medicine.nevada.edu or wanjun.gu@gmail.com. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
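The circular-to-pseudo-linear transformation itself is simple string surgery: appending the start of the circular sequence to its end lets reads spanning the back-splice junction map contiguously. The sketch below is a toy (function name, sequence, and read length are all illustrative, not Sailfish-cir's implementation).

```python
# Toy pseudo-linear transform: append the first (read_len - 1) bases of a
# circular transcript to its end so back-splice-junction reads align
# contiguously on the resulting linear reference.
def pseudo_linear(circ_seq: str, read_len: int) -> str:
    return circ_seq + circ_seq[: read_len - 1]

circ = "ACGTACGTGG"
print(pseudo_linear(circ, 4))  # ACGTACGTGGACG
```

Any read of length 4 starting in the last 3 bases of the circle now has an exact contiguous match in the pseudo-linear sequence.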
Kapinos, Kandice A; Caloyeras, John P; Liu, Hangsheng; Mattke, Soeren
2015-12-01
This article aims to test whether a workplace wellness program reduces health care costs for higher-risk employees or employees with greater participation. The program effect on costs was estimated using a generalized linear model with a log-link function in a difference-in-difference framework, applied to a propensity-score-matched sample of employees, using claims and program data from a large US firm from 2003 to 2011. The program targeting higher-risk employees did not yield cost savings. Employees participating in five or more sessions aimed at encouraging more healthful living had about $20 lower per member per month costs relative to matched comparisons (P = 0.002). Our results add to the growing evidence base that workplace wellness programs aimed at primary prevention do not reduce health care costs, with the exception of those employees who choose to participate more actively.
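The underlying difference-in-difference contrast is easy to state on made-up numbers (the real analysis used a log-link GLM on matched claims data, not raw means):

```python
# Difference-in-difference contrast on invented per-member-per-month costs.
pre_treat, post_treat = 412.0, 405.0    # participants, $ PMPM (hypothetical)
pre_ctrl, post_ctrl = 410.0, 423.0      # matched comparisons, $ PMPM

did = (post_treat - pre_treat) - (post_ctrl - pre_ctrl)
print(did)  # -20.0 => $20 lower PMPM relative to comparisons
```

The matching step matters: without comparable controls, the second difference would absorb secular cost trends rather than remove them.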
MM Algorithms for Geometric and Signomial Programming
Lange, Kenneth; Zhou, Hua
2013-01-01
This paper derives new algorithms for signomial programming, a generalization of geometric programming. The algorithms are based on a generic principle for optimization called the MM algorithm. In this setting, one can apply the geometric-arithmetic mean inequality and a supporting hyperplane inequality to create a surrogate function with parameters separated. Thus, unconstrained signomial programming reduces to a sequence of one-dimensional minimization problems. Simple examples demonstrate that the MM algorithm derived can converge to a boundary point or to one point of a continuum of minimum points. Conditions under which the minimum point is unique or occurs in the interior of parameter space are proved for geometric programming. Convergence to an interior point occurs at a linear rate. Finally, the MM framework easily accommodates equality and inequality constraints of signomial type. For the most important special case, constrained quadratic programming, the MM algorithm involves very simple updates. PMID:24634545
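A minimal worked instance of the MM idea described above, on a two-variable signomial (not one of the paper's examples): the AM-GM inequality majorizes the coupling term, x*y <= (y_t/x_t)*x**2/2 + (x_t/y_t)*y**2/2 with equality at the current iterate (x_t, y_t), so each MM update separates into one-dimensional minimizations with closed-form solutions.

```python
# MM sketch: minimize f(x, y) = x*y + 1/x + 1/y over x, y > 0.
# Majorizing x*y via AM-GM separates the surrogate; its per-variable
# minimizers give the update x <- (x/y)**(1/3), y <- (y/x)**(1/3).
x, y = 2.0, 1.0
for _ in range(60):
    x, y = (x / y) ** (1.0 / 3.0), (y / x) ** (1.0 / 3.0)

print(round(x, 6), round(y, 6), round(x * y + 1 / x + 1 / y, 6))  # 1.0 1.0 3.0
```

Each update provably decreases f (the surrogate touches f at the current point and dominates it elsewhere), and the iterates converge to the minimizer (1, 1) with f = 3.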
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kulvatunyou, Boonserm; Wysk, Richard A.; Cho, Hyunbo
2004-06-01
In today's global manufacturing environment, manufacturing functions are distributed as never before. Design, engineering, fabrication, and assembly of new products are done routinely in many different enterprises scattered around the world. Successful business transactions require the sharing of design and engineering data on an unprecedented scale. This paper describes a framework that facilitates the collaboration of engineering tasks, particularly process planning and analysis, to support such globalized manufacturing activities. The information models of data and the software components that integrate those information models are described. The integration framework uses an Integrated Product and Process Data (IPPD) representation called a Resource Independent Operation Summary (RIOS) to facilitate the communication of business and manufacturing requirements. Hierarchical process modeling, process planning decomposition and an augmented AND/OR directed graph are used in this representation. The Resource Specific Process Planning (RSPP) module assigns required equipment and tools, selects process parameters, and determines manufacturing costs based on two-level hierarchical RIOS data. The shop floor knowledge (resource and process knowledge) and a hybrid approach (heuristic and linear programming) to linearize the AND/OR graph provide the basis for the planning. Finally, a prototype system is developed and demonstrated with an exemplary part. Java and XML (Extensible Markup Language) are used to ensure software and information portability.
Cheng, Guanhui; Huang, Guohe; Dong, Cong; Xu, Ye; Chen, Xiujuan; Chen, Jiapei
2017-03-01
Due to complexities of heterogeneity, hierarchy, discreteness, and interactions in municipal solid waste management (MSWM) systems such as that of Beijing, China, a series of socio-economic and eco-environmental problems may emerge or worsen and result in irredeemable damages in the following decades. Meanwhile, existing studies, especially ones focusing on MSWM in Beijing, could hardly reflect these complexities in system simulations and provide reliable decision support for management practices. Thus, a framework of distributed mixed-integer fuzzy hierarchical programming (DMIFHP) is developed in this study for MSWM under these complexities. Beijing is selected as a representative case. The Beijing MSWM system is comprehensively analyzed in many aspects such as socio-economic conditions, natural conditions, spatial heterogeneities, treatment facilities, and system complexities, building a solid foundation for system simulation and optimization. Correspondingly, the MSWM system in Beijing is discretized into 235 grid cells to reflect spatial heterogeneity. A DMIFHP model, which is a nonlinear programming problem, is constructed to parameterize the Beijing MSWM system. To solve it, an algorithm is proposed that couples fuzzy programming with mixed-integer linear programming. Innovations and advantages of the DMIFHP framework are discussed. The optimal MSWM schemes and mechanistic insights will be presented in a companion paper due to length limitations.
jInv: A Modular and Scalable Framework for Electromagnetic Inverse Problems
NASA Astrophysics Data System (ADS)
Belliveau, P. T.; Haber, E.
2016-12-01
Inversion is a key tool in the interpretation of geophysical electromagnetic (EM) data. Three-dimensional (3D) EM inversion is very computationally expensive and practical software for inverting large 3D EM surveys must be able to take advantage of high performance computing (HPC) resources. It has traditionally been difficult to achieve those goals in a high level dynamic programming environment that allows rapid development and testing of new algorithms, which is important in a research setting. With those goals in mind, we have developed jInv, a framework for PDE constrained parameter estimation problems. jInv provides optimization and regularization routines, a framework for user defined forward problems, and interfaces to several direct and iterative solvers for sparse linear systems. The forward modeling framework provides finite volume discretizations of differential operators on rectangular tensor product meshes and tetrahedral unstructured meshes that can be used to easily construct forward modeling and sensitivity routines for forward problems described by partial differential equations. jInv is written in the emerging programming language Julia. Julia is a dynamic language targeted at the computational science community with a focus on high performance and native support for parallel programming. We have developed frequency and time-domain EM forward modeling and sensitivity routines for jInv. We will illustrate its capabilities and performance with two synthetic time-domain EM inversion examples. First, in airborne surveys, which use many sources, we achieve distributed memory parallelism by decoupling the forward and inverse meshes and performing forward modeling for each source on small, locally refined meshes. Secondly, we invert grounded source time-domain data from a gradient array style induced polarization survey using a novel time-stepping technique that allows us to compute data from different time-steps in parallel. 
These examples both show that it is possible to invert large scale 3D time-domain EM datasets within a modular, extensible framework written in a high-level, easy to use programming language.
Molenaar, Dylan; Tuerlinckx, Francis; van der Maas, Han L J
2015-01-01
A generalized linear modeling framework for the analysis of responses and response times is outlined. In this framework, referred to as bivariate generalized linear item response theory (B-GLIRT), separate generalized linear measurement models are specified for the responses and the response times that are subsequently linked by cross-relations. The cross-relations can take various forms. Here, we focus on cross-relations with a linear or interaction term for ability tests, and cross-relations with a curvilinear term for personality tests. In addition, we discuss how popular existing models from the psychometric literature are special cases in the B-GLIRT framework depending on restrictions in the cross-relation. This allows us to compare existing models conceptually and empirically. We discuss various extensions of the traditional models motivated by practical problems. We also illustrate the applicability of our approach using various real data examples, including data on personality and cognitive ability.
Runtime Verification of C Programs
NASA Technical Reports Server (NTRS)
Havelund, Klaus
2008-01-01
We present in this paper a framework, RMOR, for monitoring the execution of C programs against state machines, expressed in a textual (nongraphical) format in files separate from the program. The state machine language has been inspired by a graphical state machine language RCAT recently developed at the Jet Propulsion Laboratory, as an alternative to using Linear Temporal Logic (LTL) for requirements capture. Transitions between states are labeled with abstract event names and Boolean expressions over such events. The abstract events are connected to code fragments using an aspect-oriented pointcut language similar to ASPECTJ's or ASPECTC's pointcut language. The system is implemented in the C analysis and transformation package CIL, and is programmed in OCAML, the implementation language of CIL. The work is closely related to the notion of stateful aspects within aspect-oriented programming, where pointcut languages are extended with temporal assertions over the execution trace.
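The core monitoring idea can be sketched in a few lines. This is a hedged illustration of state-machine monitoring in general, not RMOR's input language or its aspect-oriented instrumentation; the `Monitor` class and the acquire/use/release property are invented for the example.

```python
# Sketch of state-machine monitoring: a monitor consumes abstract events
# emitted by the instrumented program and steps a state machine; reaching
# the designated error state flags a requirement violation.

class Monitor:
    def __init__(self, transitions, start, error):
        self.transitions = transitions  # (state, event) -> next state
        self.state = start
        self.error = error

    def emit(self, event):
        # unknown (state, event) pairs leave the state unchanged
        self.state = self.transitions.get((self.state, event), self.state)
        return self.state != self.error  # False once the property is violated

# Requirement: a resource must be acquired before it is used.
m = Monitor(
    transitions={("idle", "acquire"): "held",
                 ("held", "release"): "idle",
                 ("idle", "use"): "violation"},
    start="idle", error="violation")

ok = all(m.emit(e) for e in ["acquire", "use", "release"])  # trace respects the property
```

In RMOR the event emission points are woven into the C program by pointcuts rather than called explicitly as here.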
NASA Astrophysics Data System (ADS)
Conner, Gary D.; Milgram, David L.; Lawton, Daryl T.; McConnell, Christopher C.
1988-04-01
The goal of this effort is to develop and demonstrate prototype processing capabilities for a knowledge-based system to automatically extract and analyze linear features from synthetic aperture radar (SAR) imagery. This effort constitutes Phase 2 funding through the Defense Small Business Innovative Research (SBIR) Program. Previous work examined the feasibility and technology issues involved in the development of an automated linear feature extraction system. This Option 1 Final Report documents this examination and the technologies involved in automating this image understanding task. In particular, it reports on a major software delivery containing an image processing algorithmic base, a perceptual structures manipulation package, a preliminary hypothesis management framework and an enhanced user interface.
Towards Stability Analysis of Jump Linear Systems with State-Dependent and Stochastic Switching
NASA Technical Reports Server (NTRS)
Tejada, Arturo; Gonzalez, Oscar R.; Gray, W. Steven
2004-01-01
This paper analyzes the stability of hierarchical jump linear systems where the supervisor is driven by a Markovian stochastic process and by the values of the supervised jump linear system's states. The stability framework for this class of systems is developed over infinite and finite time horizons. The framework is then used to derive sufficient stability conditions for a specific class of hybrid jump linear systems with performance supervision. New sufficient stochastic stability conditions for discrete-time jump linear systems are also presented.
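A standard second-moment stability test for the simplest (scalar, purely Markovian, discrete-time) jump linear system illustrates what such sufficient conditions look like. This sketch is not the paper's hierarchical state-dependent setting; the modes and transition matrix are illustrative.

```python
# For a scalar jump linear system x[k+1] = a[theta_k] * x[k] with Markov
# mode process theta (P[i][j] = Prob(i -> j)), the conditional second
# moments q_i[k] = E[x_k^2 ; theta_k = i] satisfy the linear recursion
#   q_j[k+1] = sum_i P[i][j] * a[i]**2 * q_i[k],
# so mean-square stability holds iff the spectral radius of the matrix
# M[j][i] = P[i][j] * a[i]**2 is below one.

import math

def spectral_radius_2x2(m):
    tr = m[0][0] + m[1][1]
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    disc = tr * tr - 4.0 * det
    if disc >= 0.0:
        r = math.sqrt(disc)
        return max(abs((tr + r) / 2.0), abs((tr - r) / 2.0))
    return math.sqrt(det)  # complex eigenvalue pair: |eig| = sqrt(det)

def mean_square_stable(a, P):
    M = [[P[i][j] * a[i] ** 2 for i in range(2)] for j in range(2)]
    return spectral_radius_2x2(M) < 1.0

# One unstable mode (|a| > 1) can still yield a mean-square stable
# system if the chain leaves that mode quickly enough.
a = [1.2, 0.2]
P = [[0.1, 0.9], [0.5, 0.5]]
```

The same lifted-matrix construction extends to vector states via Kronecker products, at the cost of larger matrices.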
Koopman Operator Framework for Time Series Modeling and Analysis
NASA Astrophysics Data System (ADS)
Surana, Amit
2018-01-01
We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator, which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties without requiring explicit knowledge of the generative model. We also introduce different notions of distance on the space of such model forms, which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in a power grid application.
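The simplest instance of identifying a linear model directly from data can be sketched in a single observable. This is a toy stand-in for the Koopman machinery (no lifting, no spectral decomposition); the series, outlier, and threshold are invented for illustration.

```python
# One-observable sketch of Koopman-style modeling: fit a linear operator
# x[t+1] ≈ k * x[t] by least squares directly from data, forecast with
# it, and flag anomalies where the one-step residual is large.

def fit_linear_operator(series):
    num = sum(series[t + 1] * series[t] for t in range(len(series) - 1))
    den = sum(series[t] ** 2 for t in range(len(series) - 1))
    return num / den

def anomalies(series, k, tol):
    return [t + 1 for t in range(len(series) - 1)
            if abs(series[t + 1] - k * series[t]) > tol]

clean = [1.0 * 0.9 ** t for t in range(20)]   # generated by x[t+1] = 0.9 x[t]
k = fit_linear_operator(clean)                 # recovers 0.9
corrupted = clean[:10] + [5.0] + clean[11:]    # inject one outlier
flagged = anomalies(corrupted, k, 0.5)         # the outlier breaks two transitions
```

The paper's framework replaces the scalar `k` with Koopman model forms identified in a lifted observable space, but the identify-then-check-residual pattern is the same.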
Multivariate quadrature for representing cloud condensation nuclei activity of aerosol populations
Fierce, Laura; McGraw, Robert L.
2017-07-26
Sparse representations of atmospheric aerosols are needed for efficient regional- and global-scale chemical transport models. Here we introduce a new framework for representing aerosol distributions, based on the quadrature method of moments. Given a set of moment constraints, we show how linear programming, combined with an entropy-inspired cost function, can be used to construct optimized quadrature representations of aerosol distributions. The sparse representations derived from this approach accurately reproduce cloud condensation nuclei (CCN) activity for realistically complex distributions simulated by a particle-resolved model. Additionally, the linear programming techniques described in this study can be used to bound key aerosol properties, such as the number concentration of CCN. Unlike the commonly used sparse representations, such as modal and sectional schemes, the maximum-entropy approach described here is not constrained to pre-determined size bins or assumed distribution shapes. This study is a first step toward a particle-based aerosol scheme that will track multivariate aerosol distributions with sufficient computational efficiency for large-scale simulations.
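A toy version of the moment-constrained quadrature idea: choose weights on fixed abscissas that reproduce a set of moments. With as many abscissas as moments the Vandermonde system is square and is solved directly below; the paper's LP with an entropy-inspired cost handles the underdetermined case and enforces nonnegativity. Abscissas and target moments here are illustrative.

```python
# Moment-matching sketch: find quadrature weights on fixed abscissas
# reproducing the 0th (number), 1st (mean), and 2nd moments.

def solve3(a, b):
    # Gaussian elimination with partial pivoting for a 3x3 system.
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, 3):
            f = m[r][col] / m[col][col]
            m[r] = [x - f * y for x, y in zip(m[r], m[col])]
    w = [0.0] * 3
    for r in (2, 1, 0):
        w[r] = (m[r][3] - sum(m[r][c] * w[c] for c in range(r + 1, 3))) / m[r][r]
    return w

abscissas = [0.0, 1.0, 2.0]
moments = [1.0, 1.0, 1.4]                      # illustrative target moments
V = [[x ** p for x in abscissas] for p in range(3)]
weights = solve3(V, moments)                    # nonnegative here: [0.2, 0.6, 0.2]
```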
Control of Distributed Parameter Systems
1990-08-01
A variant of the general Lotka-Volterra model for interspecific competition was studied; the variant described the emergence of one subpopulation from another. A unified approximation framework for parameter estimation in general linear PDE models has been completed. This framework has provided the theoretical basis for a number of
Khammash, Mustafa
2014-01-01
Reaction networks are systems in which the populations of a finite number of species evolve through predefined interactions. Such networks are found as modeling tools in many biological disciplines such as biochemistry, ecology, epidemiology, immunology, systems biology and synthetic biology. It is now well-established that, for small population sizes, stochastic models for biochemical reaction networks are necessary to capture randomness in the interactions. The tools for analyzing such models, however, still lag far behind their deterministic counterparts. In this paper, we bridge this gap by developing a constructive framework for examining the long-term behavior and stability properties of the reaction dynamics in a stochastic setting. In particular, we address the problems of determining ergodicity of the reaction dynamics, which is analogous to having a globally attracting fixed point for deterministic dynamics. We also examine when the statistical moments of the underlying process remain bounded with time and when they converge to their steady state values. The framework we develop relies on a blend of ideas from probability theory, linear algebra and optimization theory. We demonstrate that the stability properties of a wide class of biological networks can be assessed from our sufficient theoretical conditions that can be recast as efficient and scalable linear programs, well-known for their tractability. It is notably shown that the computational complexity is often linear in the number of species. We illustrate the validity, the efficiency and the wide applicability of our results on several reaction networks arising in biochemistry, systems biology, epidemiology and ecology. The biological implications of the results as well as an example of a non-ergodic biological network are also discussed. PMID:24968191
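A minimal ergodic example of the class of systems analyzed is the birth-death (production-degradation) network, whose stationary distribution is Poisson with mean k/g, so its moments stay bounded. The Gillespie-style sketch below, with illustrative rates and a fixed seed, shows the time-averaged copy number settling near that bounded value; it illustrates the system class, not the paper's linear-programming certificates.

```python
# Gillespie simulation of the birth-death network:  0 -> X at rate k,
# X -> 0 at rate g*x.  The stationary law is Poisson(k/g), so the
# time-averaged copy number should hover near k/g = 10 here.

import random

def gillespie_birth_death(k, g, steps, seed=0):
    rng = random.Random(seed)
    x, t, area = 0, 0.0, 0.0          # area accumulates the integral of x dt
    for _ in range(steps):
        total = k + g * x             # total propensity (x = 0 forces a birth)
        dt = rng.expovariate(total)
        area += x * dt
        t += dt
        if rng.random() < k / total:
            x += 1
        else:
            x -= 1
    return area / t                    # time-averaged copy number

avg = gillespie_birth_death(k=10.0, g=1.0, steps=20000)
```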
NASA Astrophysics Data System (ADS)
Iwasawa, Masaki; Tanikawa, Ataru; Hosono, Natsuki; Nitadori, Keigo; Muranushi, Takayuki; Makino, Junichiro
2016-08-01
We present the basic idea, implementation, measured performance, and performance model of FDPS (Framework for Developing Particle Simulators). FDPS is an application-development framework which helps researchers to develop simulation programs using particle methods for large-scale distributed-memory parallel supercomputers. A particle-based simulation program for distributed-memory parallel computers needs to perform domain decomposition, exchange of particles which are not in the domain of each computing node, and gathering of the particle information in other nodes which are necessary for interaction calculation. Also, even if distributed-memory parallel computers are not used, in order to reduce the amount of computation, algorithms such as the Barnes-Hut tree algorithm or the Fast Multipole Method should be used in the case of long-range interactions. For short-range interactions, some methods to limit the calculation to neighbor particles are required. FDPS provides all of these functions which are necessary for efficient parallel execution of particle-based simulations as "templates," which are independent of the actual data structure of particles and the functional form of the particle-particle interaction. By using FDPS, researchers can write their programs with the amount of work necessary to write a simple, sequential and unoptimized program of O(N^2) calculation cost, and yet the program, once compiled with FDPS, will run efficiently on large-scale parallel supercomputers. A simple gravitational N-body program can be written in around 120 lines. We report the actual performance of these programs and the performance model. The weak scaling performance is very good, and almost linear speed-up was obtained for up to the full system of the K computer. The minimum calculation time per timestep is in the range of 30 ms (N = 10^7) to 300 ms (N = 10^9).
These are currently limited by the time for the calculation of the domain decomposition and communication necessary for the interaction calculation. We discuss how we can overcome these bottlenecks.
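The user-side code such a framework asks for can be sketched as a plain O(N^2) interaction kernel over a user-defined particle type. The `Particle` class, softening, and units below are illustrative (duck-typed attributes stand in for FDPS's C++ templates), and all of the tree/domain-decomposition machinery the framework would supply is omitted.

```python
# Sketch of the simple, sequential O(N^2) kernel a user would write:
# softened pairwise gravity over a user-defined particle type.

import math

class Particle:
    def __init__(self, mass, pos):
        self.mass, self.pos = mass, pos
        self.acc = [0.0, 0.0, 0.0]

def compute_gravity(particles, eps=1.0e-2, G=1.0):
    for p in particles:
        p.acc = [0.0, 0.0, 0.0]
    for i, pi in enumerate(particles):
        for pj in particles[i + 1:]:
            d = [b - a for a, b in zip(pi.pos, pj.pos)]
            r2 = sum(c * c for c in d) + eps * eps   # softened distance^2
            inv_r3 = 1.0 / (math.sqrt(r2) * r2)
            for c in range(3):                        # Newton's third law pair
                pi.acc[c] += G * pj.mass * inv_r3 * d[c]
                pj.acc[c] -= G * pi.mass * inv_r3 * d[c]
    return particles

pair = compute_gravity([Particle(1.0, [0.0, 0.0, 0.0]),
                        Particle(1.0, [1.0, 0.0, 0.0])])
```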
Optimizing financial effects of HIE: a multi-party linear programming approach.
Sridhar, Srikrishna; Brennan, Patricia Flatley; Wright, Stephen J; Robinson, Stephen M
2012-01-01
To describe an analytical framework for quantifying the societal savings and financial consequences of a health information exchange (HIE), and to demonstrate its use in designing pricing policies for sustainable HIEs. We developed a linear programming model to (1) quantify the financial worth of HIE information to each of its participating institutions and (2) evaluate three HIE pricing policies: fixed-rate annual, charge per visit, and charge per look-up. We considered three desired outcomes of HIE-related emergency care (modeled as parameters): preventing unrequired hospitalizations, reducing duplicate tests, and avoiding emergency department (ED) visits. We applied this framework to 4639 ED encounters over a 12-month period in three large EDs in Milwaukee, Wisconsin, using Medicare/Medicaid claims data, public reports of hospital admissions, published payer mix data, and use data from a not-for-profit regional HIE. For this HIE, data accesses produced net financial gains for all providers and payers. Gains, due to HIE, were more significant for providers with more health maintenance organization patients. Reducing unrequired hospitalizations and avoiding repeat ED visits were responsible for more than 70% of the savings. The results showed that fixed annual subscriptions can sustain this HIE, while ensuring financial gains to all participants. Sensitivity analysis revealed that the results were robust to uncertainties in modeling parameters. Our specific HIE pricing recommendations depend on the unique characteristics of this study population. However, our main contribution is the modeling approach, which is broadly applicable to other populations.
Terry, Claire; Hays, Sean; McCoy, Alene T; McFadden, Lisa G; Aggarwal, Manoj; Rasoulpour, Reza J; Juberg, Daland R
2016-03-01
A strategic and comprehensive program in which toxicokinetic (TK) measurements are made for all agrochemicals undergoing toxicity testing (both new compounds and compounds already registered for use) is described. This approach provides the data to more accurately assess the toxicokinetics of agrochemicals and their metabolites in laboratory animals and humans. Having this knowledge provides the ability to conduct more insightful toxicity studies, refine and interpret exposure assessments and reduce uncertainty in risk assessments. By developing a better understanding of TK across species, including humans via in vitro metabolism studies, any differences across species in TK can be identified early and the most relevant species can be selected for toxicity tests. It also provides the ability to identify any non-linearities in TK as a function of dose, which in turn can be used to identify a kinetically derived maximum dose (KMD) and avoid dosing inappropriately outside of the kinetic linear range. Measuring TK in key life stages also helps to identify changes in ADME parameters from in utero to adults. A robust TK database can also be used to set internal concentration based "Reference Concentrations" and Biomonitoring Equivalents (BE), and support selection of Chemical Specific Adjustment Factors (CSAF). All of these factors support the reduction of uncertainty throughout the entire risk assessment process. This paper outlines how a TK research strategy can be integrated into new agrochemical toxicity testing programs, together with a proposed Framework for future use. Copyright © 2015 Elsevier Inc. All rights reserved.
A penalized framework for distributed lag non-linear models.
Gasparrini, Antonio; Scheipl, Fabian; Armstrong, Ben; Kenward, Michael G
2017-09-01
Distributed lag non-linear models (DLNMs) are a modelling tool for describing potentially non-linear and delayed dependencies. Here, we illustrate an extension of the DLNM framework through the use of penalized splines within generalized additive models (GAM). This extension offers built-in model selection procedures and the possibility of accommodating assumptions on the shape of the lag structure through specific penalties. In addition, this framework includes, as special cases, simpler models previously proposed for linear relationships (DLMs). Alternative versions of penalized DLNMs are compared with each other and with the standard unpenalized version in a simulation study. Results show that this penalized extension to the DLNM class provides greater flexibility and improved inferential properties. The framework exploits recent theoretical developments of GAMs and is implemented using efficient routines within freely available software. Real-data applications are illustrated through two reproducible examples in time series and survival analysis. © 2017 The Authors Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society.
NASA Astrophysics Data System (ADS)
Ziegler, Benjamin; Rauhut, Guntram
2016-03-01
The transformation of multi-dimensional potential energy surfaces (PESs) from a grid-based multimode representation to an analytical one is a standard procedure in quantum chemical programs. Within the framework of linear least squares fitting, a simple and highly efficient algorithm is presented, which relies on a direct product representation of the PES and a repeated use of Kronecker products. It shows the same scalings in computational cost and memory requirements as the potfit approach. In comparison to customary linear least squares fitting algorithms, this corresponds to a speed-up and memory saving by several orders of magnitude. Different fitting bases are tested, namely, polynomials, B-splines, and distributed Gaussians. Benchmark calculations are provided for the PESs of a set of small molecules.
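The saving comes from never forming the large Kronecker matrix; for a two-factor case the standard identity (A ⊗ B) vec(C) = vec(B C Aᵀ) does the work. A pure-Python sketch with 2x2 factors (illustrative matrices, column-major `vec`):

```python
# The identity behind Kronecker-structured least-squares fitting:
#   (A ⊗ B) vec(C) = vec(B @ C @ A^T),
# so products with the big Kronecker matrix reduce to two small
# matrix-matrix products.

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(r) for r in zip(*X)]

def kron(A, B):
    n, m = len(B), len(B[0])
    return [[A[i // n][j // m] * B[i % n][j % m]
             for j in range(len(A[0]) * m)] for i in range(len(A) * n)]

def vec(C):  # column-major stacking
    return [C[i][j] for j in range(len(C[0])) for i in range(len(C))]

A = [[1.0, 2.0], [3.0, 4.0]]
B = [[0.0, 1.0], [1.0, 1.0]]
C = [[2.0, 0.0], [1.0, 3.0]]

direct = [sum(u * v for u, v in zip(row, vec(C))) for row in kron(A, B)]
fast = vec(matmul(matmul(B, C), transpose(A)))   # identical, no big matrix
```

With d modes of size n each, repeating this trick replaces one n^d-by-n^d matrix with d small n-by-n factors, which is where the orders-of-magnitude savings in the abstract originate.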
Generalized Multilevel Structural Equation Modeling
ERIC Educational Resources Information Center
Rabe-Hesketh, Sophia; Skrondal, Anders; Pickles, Andrew
2004-01-01
A unifying framework for generalized multilevel structural equation modeling is introduced. The models in the framework, called generalized linear latent and mixed models (GLLAMM), combine features of generalized linear mixed models (GLMM) and structural equation models (SEM) and consist of a response model and a structural model for the latent…
Cloud-based large-scale air traffic flow optimization
NASA Astrophysics Data System (ADS)
Cao, Yi
The ever-increasing traffic demand makes the efficient use of airspace an imperative mission, and this paper presents an effort in response to this call. Firstly, a new aggregate model, called Link Transmission Model (LTM), is proposed, which models the nationwide traffic as a network of flight routes identified by origin-destination pairs. The traversal time of a flight route is assumed to be the mode of distribution of historical flight records, and the mode is estimated by using Kernel Density Estimation. As this simplification abstracts away physical trajectory details, the complexity of modeling is drastically decreased, resulting in efficient traffic forecasting. The predictive capability of LTM is validated against recorded traffic data. Secondly, a nationwide traffic flow optimization problem with airport and en route capacity constraints is formulated based on LTM. The optimization problem aims at alleviating traffic congestion with minimal global delays. This problem is intractable due to millions of variables. A dual decomposition method is applied to decompose the large-scale problem such that the subproblems are solvable. However, the whole problem is still computationally expensive to solve, since each subproblem is a smaller integer programming problem that requires integer solutions. Solving an integer programming problem is known to be far more time-consuming than solving its linear relaxation. In addition, sequential execution on a standalone computer leads to runtime that grows linearly with problem size. To address the computational efficiency problem, a parallel computing framework is designed which accommodates concurrent executions via multithreaded programming. The multithreaded version is compared with its monolithic version to show decreased runtime. Finally, an open-source cloud computing framework, Hadoop MapReduce, is employed for better scalability and reliability.
This framework is an "off-the-shelf" parallel computing model that can be used for both offline historical traffic data analysis and online traffic flow optimization. It provides an efficient and robust platform for easy deployment and implementation. A small cloud consisting of five workstations was configured and used to demonstrate the advantages of cloud computing in dealing with large-scale parallelizable traffic problems.
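Dual decomposition can be sketched on a toy continuous problem (not the integer program above): a single coupling capacity constraint is priced by a multiplier, the subproblems then separate and could run on different threads or map tasks, and the price follows a projected subgradient. All numbers are illustrative.

```python
# Toy dual decomposition: minimize sum_i (x_i - d_i)^2 over x_i >= 0
# subject to the coupling capacity sum_i x_i <= C.  Dualizing the
# capacity constraint with price lam decouples the problem: each
# subproblem has the closed form x_i(lam) = max(0, d_i - lam/2).

def solve_subproblem(d_i, lam):
    return max(0.0, d_i - lam / 2.0)

def dual_decomposition(d, C, step=0.05, iters=4000):
    lam = 0.0
    for _ in range(iters):
        x = [solve_subproblem(di, lam) for di in d]   # independent: parallelizable
        lam = max(0.0, lam + step * (sum(x) - C))     # projected subgradient price update
    return x, lam

x, lam = dual_decomposition(d=[3.0, 2.0, 1.0], C=3.0)  # converges to x = [2, 1, 0], lam = 2
```

In the traffic problem each subproblem is itself an integer program, so a duality gap can remain; the decoupling-and-price-update structure, however, is the same.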
Efficient computation of optimal actions.
Todorov, Emanuel
2009-07-14
Optimal choice of actions is a fundamental problem relevant to fields as diverse as neuroscience, psychology, economics, computer science, and control engineering. Despite this broad relevance the abstract setting is similar: we have an agent choosing actions over time, an uncertain dynamical system whose state is affected by those actions, and a performance criterion that the agent seeks to optimize. Solving problems of this kind remains hard, in part, because of overly generic formulations. Here, we propose a more structured formulation that greatly simplifies the construction of optimal control laws in both discrete and continuous domains. An exhaustive search over actions is avoided and the problem becomes linear. This yields algorithms that outperform Dynamic Programming and Reinforcement Learning, and thereby solve traditional problems more efficiently. Our framework also enables computations that were not possible before: composing optimal control laws by mixing primitives, applying deterministic methods to stochastic systems, quantifying the benefits of error tolerance, and inferring goals from behavioral data via convex optimization. Development of a general class of easily solvable problems tends to accelerate progress--as linear systems theory has done, for example. Our framework may have similar impact in fields where optimal choice of actions is relevant.
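The "problem becomes linear" claim can be sketched in the first-exit setting: the desirability function satisfies a linear fixed-point equation, and the optimal policy simply reweights the passive dynamics, with no search over actions. The chain, costs, and passive dynamics below are invented for illustration and greatly simplify the paper's formulation.

```python
# First-exit sketch of a linearly-solvable MDP on a 4-state chain with
# goal state 3: the desirability z = exp(-cost-to-go) satisfies the
# LINEAR fixed point  z(i) = exp(-q(i)) * sum_j p(j|i) z(j),
# and the optimal policy reweights the passive dynamics p by z.

import math

passive = {0: {0: 0.5, 1: 0.5},
           1: {0: 0.5, 2: 0.5},
           2: {1: 0.5, 3: 0.5}}          # passive random walk; state 3 absorbs
q = {0: 1.0, 1: 1.0, 2: 1.0}             # state cost per step at non-goal states

def desirability(iters=200):
    z = {0: 0.0, 1: 0.0, 2: 0.0, 3: 1.0}  # z = 1 at the goal (zero exit cost)
    for _ in range(iters):
        z_new = {i: math.exp(-q[i]) * sum(p * z[j] for j, p in nxt.items())
                 for i, nxt in passive.items()}
        z_new[3] = 1.0
        z = z_new
    return z

def policy(z, i):
    w = {j: p * z[j] for j, p in passive[i].items()}  # optimal dynamics ∝ p * z
    s = sum(w.values())
    return {j: v / s for j, v in w.items()}

z = desirability()
pi2 = policy(z, 2)    # at state 2, most probability mass goes to the goal
```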
Study of the linearity of CABRI experimental ionization chambers during RIA transients
NASA Astrophysics Data System (ADS)
Lecerf, J.; Garnier, Y.; Hudelot, JP.; Duc, B.; Pantera, L.
2018-01-01
CABRI is an experimental pulse reactor operated by CEA at the Cadarache research center and funded by the French Nuclear Safety and Radioprotection Institute (IRSN). For the purpose of the CABRI International Program (CIP), operated and managed by IRSN under an OECD/NEA framework, it has been refurbished since 2003 to be able to provide experiments in prototypical PWR conditions (155 bar, 300 °C) in order to study the fuel behavior under Reactivity Initiated Accident (RIA) conditions. This paper first recalls the objectives of the power commissioning tests performed on the CABRI facility. The design and location of the neutron detectors monitoring the core power are also presented. Then it focuses on the different methodologies used to calibrate the detectors and check the consistency and co-linearity of the measurements. Finally, it presents the methods used to check the linearity of the neutron detectors up to the high power levels (~20 GW) reached during power transients. Some results obtained during the power tests campaign are also presented.
NASA Technical Reports Server (NTRS)
Gibson, J. S.; Rosen, I. G.
1986-01-01
An abstract approximation framework is developed for the finite and infinite time horizon discrete-time linear-quadratic regulator problem for systems whose state dynamics are described by a linear semigroup of operators on an infinite dimensional Hilbert space. The schemes included in the framework yield finite dimensional approximations to the linear state feedback gains which determine the optimal control law. Convergence arguments are given. Examples involving hereditary and parabolic systems and the vibration of a flexible beam are considered. Spline-based finite element schemes for these classes of problems, together with numerical results, are presented and discussed.
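For a finite-dimensional (indeed scalar) system, the feedback gains such approximations target come from the familiar Riccati recursion. The sketch below is the textbook discrete-time LQR calculation with illustrative numbers, not the paper's infinite-dimensional schemes.

```python
# Scalar discrete-time LQR: for x[k+1] = a x[k] + b u[k] with stage cost
# q x^2 + r u^2, iterate the Riccati recursion
#   p <- q + a^2 p - (a b p)^2 / (r + b^2 p)
# to its fixed point and form the optimal state feedback u = -k x with
#   k = a b p / (r + b^2 p).

def lqr_gain(a, b, q, r, iters=500):
    p = q
    for _ in range(iters):
        p = q + a * a * p - (a * b * p) ** 2 / (r + b * b * p)
    return a * b * p / (r + b * b * p)

k = lqr_gain(a=1.5, b=1.0, q=1.0, r=1.0)  # stabilizes the unstable a = 1.5
closed_loop = 1.5 - 1.0 * k               # |a - b k| < 1
```

The paper's spline-based schemes produce finite matrix versions of this recursion whose gains converge to the true infinite-dimensional ones.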
A Framework for Mathematical Thinking: The Case of Linear Algebra
ERIC Educational Resources Information Center
Stewart, Sepideh; Thomas, Michael O. J.
2009-01-01
Linear algebra is one of the unavoidable advanced courses that many mathematics students encounter at university level. The research reported here was part of the first author's recent PhD study, where she created and applied a theoretical framework combining the strengths of two major mathematics education theories in order to investigate the…
Gate Set Tomography on a trapped ion qubit
NASA Astrophysics Data System (ADS)
Nielsen, Erik; Blume-Kohout, Robin; Gamble, John; Rundinger, Kenneth; Mizrahi, Jonathan; Sterk, Johathan; Maunz, Peter
2015-03-01
We present enhancements to gate-set tomography (GST), which is a framework in which an entire set of quantum logic gates (including preparation and measurement) can be fully characterized without need for pre-calibrated operations. Our new method, ``extended Linear GST'' (eLGST) uses fast, reliable analysis of structured long gate sequences to deliver tomographic precision at the Heisenberg limit with GST's calibration-free framework. We demonstrate this precision on a trapped-ion qubit, and show significant (orders of magnitude) advantage over both standard process tomography and randomized benchmarking. This work was supported by the Laboratory Directed Research and Development (LDRD) program at Sandia National Laboratories. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000.
Menu-Driven Solver Of Linear-Programming Problems
NASA Technical Reports Server (NTRS)
Viterna, L. A.; Ferencz, D.
1992-01-01
Program assists inexperienced users in formulating linear-programming problems. A Linear Program Solver (ALPS) is a full-featured LP analysis program. Solves plain linear-programming problems as well as more-complicated mixed-integer and pure-integer programs. Also contains efficient technique for solution of purely binary linear-programming problems. Written entirely in IBM's APL2/PC software, Version 1.01. Packed program contains licensed material, property of IBM (copyright 1988, all rights reserved).
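For toy 2-variable problems of the kind such a program accepts, the optimum can even be found by brute-force vertex enumeration; the sketch below is that textbook method with an illustrative LP, not ALPS's own algorithm.

```python
# Vertex enumeration for tiny 2-variable LPs: maximize c.x subject to
# A x <= b, x >= 0.  Every optimum of a bounded LP sits at a vertex,
# i.e. an intersection of two constraint boundaries, so for toy sizes
# we can simply enumerate all intersections and keep the best feasible one.

from itertools import combinations

def solve_lp_2d(c, A, b):
    rows = A + [[-1.0, 0.0], [0.0, -1.0]]   # append x >= 0 as -x <= 0
    rhs = b + [0.0, 0.0]
    best, best_x = None, None
    for i, j in combinations(range(len(rows)), 2):
        (a1, b1), (a2, b2) = rows[i], rows[j]
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue                          # parallel boundaries
        x = ((rhs[i] * b2 - rhs[j] * b1) / det,
             (a1 * rhs[j] - a2 * rhs[i]) / det)
        if all(r[0] * x[0] + r[1] * x[1] <= v + 1e-9
               for r, v in zip(rows, rhs)):
            val = c[0] * x[0] + c[1] * x[1]
            if best is None or val > best:
                best, best_x = val, x
    return best, best_x

# maximize 3x + 2y  s.t.  x + y <= 4,  x <= 2,  x, y >= 0
val, x = solve_lp_2d([3.0, 2.0], [[1.0, 1.0], [1.0, 0.0]], [4.0, 2.0])
```

Simplex-based solvers like ALPS walk between adjacent vertices instead of enumerating all of them, which is what makes larger problems tractable.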
Local linear discriminant analysis framework using sample neighbors.
Fan, Zizhu; Xu, Yong; Zhang, David
2011-07-01
The linear discriminant analysis (LDA) is a very popular linear feature extraction approach. The algorithms of LDA usually perform well under the following two assumptions. The first assumption is that the global data structure is consistent with the local data structure. The second assumption is that each class of input data follows a Gaussian distribution. However, in real-world applications, these assumptions are not always satisfied. In this paper, we propose an improved LDA framework, the local LDA (LLDA), which can perform well without needing to satisfy the above two assumptions. Our LLDA framework can effectively capture the local structure of samples. According to different types of local data structure, our LLDA framework incorporates several different forms of linear feature extraction approaches, such as the classical LDA and principal component analysis. The proposed framework includes two LLDA algorithms: a vector-based LLDA algorithm and a matrix-based LLDA (MLLDA) algorithm. MLLDA is directly applicable to image recognition, such as face recognition. Our algorithms need to train only a small portion of the whole training set before testing a sample. They are suitable for learning large-scale databases, especially when the input data dimensions are very high, and can achieve high classification accuracy. Extensive experiments show that the proposed algorithms can obtain good classification results.
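A minimal sketch of the classical LDA building block that the LLDA framework generalizes, using scikit-learn on synthetic two-class Gaussian data (the data are illustrative, not from the paper):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# two Gaussian classes -- the setting in which classical LDA's assumptions hold
X = np.vstack([rng.normal([0, 0], 1.0, size=(100, 2)),
               rng.normal([3, 3], 1.0, size=(100, 2))])
y = np.repeat([0, 1], 100)

lda = LinearDiscriminantAnalysis(n_components=1)
Z = lda.fit_transform(X, y)       # 1-D discriminant projection
accuracy = lda.score(X, y)        # high for well-separated Gaussian classes
```

When the local structure deviates from this global Gaussian picture, the paper's LLDA swaps in locally fitted projections instead.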
Data Structures for Extreme Scale Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kahan, Simon
As computing problems of national importance grow, the government meets the increased demand by funding the development of ever larger systems. The overarching goal of the work supported in part by this grant is to increase the efficiency of programming and performing computations on these large computing systems. In past work, we have demonstrated that some of these computations, once thought to require expensive hardware designs and/or complex, special-purpose programming, may be executed efficiently on low-cost commodity cluster computing systems using a general-purpose “latency-tolerant” programming framework. One important developed application of the ideas underlying this framework is graph database technology supporting social network pattern matching, used by US intelligence agencies to more quickly identify potential terrorist threats. This database application has been spun out by the Pacific Northwest National Laboratory, a Department of Energy laboratory, into a commercial start-up, Trovares Inc. We explore an alternative application of the same underlying ideas to a well-studied challenge arising in engineering: solving unstructured sparse linear equations. Solving these equations is key to predicting the behavior of large electronic circuits before they are fabricated. Predicting that behavior ahead of fabrication means that designs can be optimized and errors corrected before the expense of manufacture.
Semilinear programming: applications and implementation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohan, S.
Semilinear programming is a method of solving optimization problems with linear constraints where the non-negativity restrictions on the variables are dropped and the objective function coefficients can take on different values depending on whether the variable is positive or negative. The simplex method for linear programming is modified in this thesis to solve general semilinear and piecewise linear programs efficiently without having to transform them into equivalent standard linear programs. Several models in widely different areas of optimization, such as production smoothing, facility location, goal programming and L/sub 1/ estimation, are presented first to demonstrate the compact formulation that arises when such problems are formulated as semilinear programs. A code SLP is constructed using the semilinear programming techniques. Problems in aggregate planning and L/sub 1/ estimation are solved using SLP and as equivalent linear programs using a linear programming simplex code. Comparisons of CPU times and numbers of iterations indicate SLP to be far superior. The semilinear programming techniques are extended to piecewise linear programming in the implementation of the code PLP. Piecewise linear models in aggregate planning are solved using PLP and as equivalent standard linear programs using the simple upper bounded linear programming code SUBLP.
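The L/sub 1/ estimation model mentioned above illustrates why semilinear structure arises: the cost of a residual depends on its sign. A minimal sketch of least-absolute-deviations regression solved via the standard variable-splitting transformation into an ordinary LP (SciPy stands in for the thesis's SLP code; the data are made up):

```python
import numpy as np
from scipy.optimize import linprog

# small regression problem (made-up data close to the line y = x)
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([0.1, 1.0, 2.1, 2.9])
m, k = X.shape

# split the residual r = y - X b into r = u - v with u, v >= 0;
# minimizing sum(u + v) then minimizes sum(|r|), an ordinary LP
c = np.concatenate([np.zeros(k), np.ones(m), np.ones(m)])
A_eq = np.hstack([X, np.eye(m), -np.eye(m)])      # X b + u - v = y
bounds = [(None, None)] * k + [(0, None)] * (2 * m)

res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
beta = res.x[:k]                                  # L1-optimal coefficients
l1_error = res.fun                                # minimized sum of |residuals|
```

A semilinear simplex method avoids this doubling of variables by handling the sign-dependent costs directly, which is the efficiency the thesis reports.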
ERIC Educational Resources Information Center
Chen, Haiwen
2012-01-01
In this article, linear item response theory (IRT) observed-score equating is compared under a generalized kernel equating framework with Levine observed-score equating for nonequivalent groups with anchor test design. Interestingly, these two equating methods are closely related despite being based on different methodologies. Specifically, when…
ERIC Educational Resources Information Center
Moody, John Charles
Assessed were the effects of linear and modified linear programed materials on the achievement of slow learners in tenth grade Biological Sciences Curriculum Study (BSCS) Special Materials biology. Two hundred and six students were randomly placed into four programed materials formats: linear programed materials, modified linear program with…
NASA Astrophysics Data System (ADS)
Sharqawy, Mostafa H.
2016-12-01
Pore network models (PNM) of Berea and Fontainebleau sandstones were constructed using nonlinear programming (NLP) and optimization methods. The constructed PNMs are considered a digital representation of the rock samples, based on matching the macroscopic properties of the porous media, and are used to conduct fluid transport simulations including single and two-phase flow. The PNMs consisted of cubic networks of randomly distributed pore and throat sizes with various connectivity levels. The networks were optimized such that the upper and lower bounds of the pore sizes are determined using the capillary tube bundle model and the Nelder-Mead method instead of guessing them, which reduces the optimization computational time significantly. An open-source PNM framework was employed to conduct transport and percolation simulations such as invasion percolation and Darcian flow. The PNM was subsequently used to compute the macroscopic properties: porosity, absolute permeability, specific surface area, breakthrough capillary pressure, and primary drainage curve. The pore networks were optimized so that the simulation results for the macroscopic properties are in excellent agreement with the experimental measurements. This study demonstrates that non-linear programming and optimization methods provide a promising approach to pore network modeling when computed tomography imaging may not be readily available.
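A hedged sketch of the bound-fitting idea: use the Nelder-Mead method to choose pore-radius bounds so that a capillary-tube-bundle surrogate reproduces a target porosity. The surrogate model and every number here are illustrative assumptions, not the paper's actual formulation:

```python
import numpy as np
from scipy.optimize import minimize

# capillary-tube-bundle surrogate: n parallel tubes with radii distributed
# uniformly on [r_lo, r_hi]; porosity = n * pi * E[r^2] * L / V_bulk
n_tubes, tube_length, bulk_volume = 1000, 1.0, 1.0
target_porosity = 0.20

def porosity(bounds_vec):
    r_lo, r_hi = np.sort(np.abs(bounds_vec))
    mean_r2 = (r_lo**2 + r_lo * r_hi + r_hi**2) / 3.0   # E[r^2] for uniform radii
    return n_tubes * np.pi * mean_r2 * tube_length / bulk_volume

def objective(bounds_vec):
    return (porosity(bounds_vec) - target_porosity) ** 2

res = minimize(objective, x0=[0.001, 0.02], method="Nelder-Mead")
r_lo_opt, r_hi_opt = np.sort(np.abs(res.x))
```

Starting Nelder-Mead from analytically motivated bounds, rather than arbitrary guesses, is what the abstract credits for the reduced optimization time.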
Smart-Grid Backbone Network Real-Time Delay Reduction via Integer Programming.
Pagadrai, Sasikanth; Yilmaz, Muhittin; Valluri, Pratyush
2016-08-01
This research investigates an optimal delay-based virtual topology design using integer linear programming (ILP), which is applied to current backbone networks such as smart-grid real-time communication systems. A network traffic matrix is applied and the corresponding virtual topology problem is solved using ILP formulations that include a network delay-dependent objective function and lightpath routing, wavelength assignment, wavelength continuity, flow routing, and traffic loss constraints. The proposed optimization approach provides an efficient deterministic integration of intelligent sensing and decision making, and network learning features for superior smart grid operations by adaptively responding to the time-varying network traffic data as well as operational constraints to maintain optimal virtual topologies. A representative optical backbone network has been utilized to demonstrate the proposed optimization framework, whose simulation results indicate that superior smart-grid network performance can be achieved using commercial networks and integer programming.
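A toy ILP in the same spirit, choosing one candidate lightpath per demand so that total delay is minimized under a shared wavelength budget. The topology, delays, and budget are invented for illustration; SciPy's `milp` stands in for a commercial ILP solver:

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# x[d, p] = 1 if demand d uses candidate lightpath p (2 demands x 2 paths,
# flattened to [d0pA, d0pB, d1pA, d1pB]); all numbers are invented
delay = np.array([3.0, 5.0, 4.0, 2.0])
wavelengths = np.array([2, 1, 1, 2])      # wavelengths each path consumes
budget = 3

one_path_per_demand = LinearConstraint(
    np.array([[1, 1, 0, 0], [0, 0, 1, 1]]), lb=1, ub=1)
wavelength_budget = LinearConstraint(wavelengths, ub=budget)

res = milp(c=delay, constraints=[one_path_per_demand, wavelength_budget],
           integrality=np.ones(4), bounds=Bounds(0, 1))
choice = res.x.round().astype(int)        # selected paths
total_delay = res.fun
```

The full formulation in the paper adds wavelength-continuity, flow-routing, and traffic-loss constraints of the same linear form.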
Programming Models for Concurrency and Real-Time
NASA Astrophysics Data System (ADS)
Vitek, Jan
Modern real-time applications are increasingly large, complex and concurrent systems which must meet stringent performance and predictability requirements. Programming those systems requires fundamental advances in programming languages and runtime systems. This talk presents our work on Flexotasks, a programming model for concurrent, real-time systems inspired by stream-processing and concurrent active objects. Among the key innovations in Flexotasks is that it supports both real-time garbage collection and region-based memory with an ownership type system for static safety. Communication between tasks is performed by channels with a linear type discipline to avoid copying messages, and by a non-blocking transactional memory facility. We have evaluated our model empirically within two distinct implementations, one based on Purdue’s Ovm research virtual machine framework and the other on Websphere, IBM’s production real-time virtual machine. We have written a number of small programs, as well as a 30 KLOC avionics collision detector application. We show that Flexotasks are capable of executing periodic threads at 10 kHz with a standard deviation of 1.2 μs and have performance competitive with hand-coded C programs.
ERIC Educational Resources Information Center
Chen, Haiwen; Holland, Paul
2010-01-01
In this paper, we develop a new curvilinear equating for the nonequivalent groups with anchor test (NEAT) design under the assumption of the classical test theory model, that we name curvilinear Levine observed score equating. In fact, by applying both the kernel equating framework and the mean preserving linear transformation of…
Work cost of thermal operations in quantum thermodynamics
NASA Astrophysics Data System (ADS)
Renes, Joseph M.
2014-07-01
Adopting a resource theory framework of thermodynamics for quantum and nano systems pioneered by Janzing et al. (Int. J. Th. Phys. 39, 2717 (2000)), we formulate the cost in the useful work of transforming one resource state into another as a linear program of convex optimization. This approach is based on the characterization of thermal quasiorder given by Janzing et al. and later by Horodecki and Oppenheim (Nat. Comm. 4, 2059 (2013)). Both characterizations are related to an extended version of majorization studied by Ruch, Schranner and Seligman under the name mixing distance (J. Chem. Phys. 69, 386 (1978)).
A new line-of-sight approach to the non-linear Cosmic Microwave Background
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fidler, Christian; Koyama, Kazuya; Pettinari, Guido W., E-mail: christian.fidler@port.ac.uk, E-mail: kazuya.koyama@port.ac.uk, E-mail: guido.pettinari@gmail.com
2015-04-01
We develop the transport operator formalism, a new line-of-sight integration framework to calculate the anisotropies of the Cosmic Microwave Background (CMB) at the linear and non-linear level. This formalism utilises a transformation operator that removes all inhomogeneous propagation effects acting on the photon distribution function, thus achieving a split between perturbative collisional effects at recombination and non-perturbative line-of-sight effects at later times. The former can be computed in the framework of standard cosmological perturbation theory with a second-order Boltzmann code such as SONG, while the latter can be treated within a separate perturbative scheme allowing the use of non-linear Newtonian potentials. We thus provide a consistent framework to compute all physical effects contained in the Boltzmann equation and to combine the standard remapping approach with Boltzmann codes at any order in perturbation theory, without assuming that all sources are localised at recombination.
Unified Framework for Deriving Simultaneous Equation Algorithms for Water Distribution Networks
The known formulations for steady state hydraulics within looped water distribution networks are re-derived in terms of linear and non-linear transformations of the original set of partly linear and partly non-linear equations that express conservation of mass and energy. All of ...
Trading strategies for distribution company with stochastic distributed energy resources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Chunyu; Wang, Qi; Wang, Jianhui
2016-09-01
This paper proposes a methodology to address the trading strategies of a proactive distribution company (PDISCO) engaged in the transmission-level (TL) markets. A one-leader multi-follower bilevel model is presented to formulate the gaming framework between the PDISCO and markets. The lower-level (LL) problems include the TL day-ahead market and scenario-based real-time markets, respectively with the objectives of maximizing social welfare and minimizing operation cost. The upper-level (UL) problem is to maximize the PDISCO’s profit across these markets. The PDISCO’s strategic offers/bids interactively influence the outcomes of each market. Since the LL problems are linear and convex, while the UL problem is non-linear and non-convex, an equivalent primal–dual approach is used to reformulate this bilevel model to a solvable mathematical program with equilibrium constraints (MPEC). The effectiveness of the proposed model is verified by case studies.
An Instructional Note on Linear Programming--A Pedagogically Sound Approach.
ERIC Educational Resources Information Center
Mitchell, Richard
1998-01-01
Discusses the place of linear programming in college curricula and the advantages of using linear-programming software. Lists important characteristics of computer software used in linear programming for more effective teaching and learning. (ASK)
Harrison, Roger A; Gemmell, Isla; Reed, Katie
2015-01-01
(1) To quantify the effect of using different public health competence frameworks to audit the curriculum of an online distance learning MPH program, and (2) to measure variation in the outcomes of the audit depending on which competence framework is used. Retrospective audit. We compared the teaching content of an online distance learning MPH program against each competence listed in different public health competence frameworks relevant to an MPH. We then compared the number of competences covered in each module in the program's teaching curriculum and in the program overall, for each of the competence frameworks used in this audit. A comprehensive search of the literature identified two competence frameworks specific to MPH programs and two for public health professional/specialty training. The number of individual competences in each framework were 32 for the taught aspects of the UK Faculty of Public Health Specialist Training Program, 117 for the American Association of Public Health, 282 for the exam curriculum of the UK Faculty of Public Health Part A exam, and 393 for the European Core Competencies for MPH Education. This gave a total of 824 competences included in the audit. Overall, the online MPH program covered 88-96% of the competences depending on the specific framework used. This fell when the audit focused on just the three mandatory modules in the program, and the variation between the different competence frameworks was much larger. Using different competence frameworks to audit the curriculum of an MPH program can give different indications of its quality, especially as it fails to capture teaching considered to be relevant, yet not included in an existing competence framework. The strengths and weaknesses of using competence frameworks to audit the content of an MPH program have largely been ignored. These debates are vital given that external organizations responsible for accreditation specify a particular competence framework to be used. 
Our study found that each of four different competence frameworks suggested different levels of quality in our teaching program, at least in terms of the competences included in the curriculum. Relying on just one established framework missed some aspects of the curriculum included in other frameworks used in this study. Conversely, each framework included items not covered by the others. Thus, levels of agreement between the content of our MPH and established areas of competence were, in part, dependent on the competence framework used for the comparison. While not an entirely surprising finding, this study makes explicit the challenges of selecting an appropriate competence framework to inform MPH programs, especially those recruiting students from around the world.
Piecewise Linear-Linear Latent Growth Mixture Models with Unknown Knots
ERIC Educational Resources Information Center
Kohli, Nidhi; Harring, Jeffrey R.; Hancock, Gregory R.
2013-01-01
Latent growth curve models with piecewise functions are flexible and useful analytic models for investigating individual behaviors that exhibit distinct phases of development in observed variables. As an extension of this framework, this study considers a piecewise linear-linear latent growth mixture model (LGMM) for describing segmented change of…
NASA Astrophysics Data System (ADS)
Bekri, Eleni; Yannopoulos, Panayotis; Disse, Markus
2013-04-01
In the present study, a combined linear programming methodology, based on Li et al. (2010) and Bekri et al. (2012), is employed for optimizing water allocation under uncertain system conditions in the Alfeios River Basin, in Greece. The Alfeios River is a water resources system of great natural, ecological, social and economic importance for Western Greece, since it has the longest and highest flow rate watercourse in the Peloponnisos region. Moreover, the river basin was exposed in the last decades to a plethora of environmental stresses (e.g. hydrogeological alterations, intensively irrigated agriculture, surface and groundwater overexploitation and infrastructure developments), resulting in the degradation of its quantitative and qualitative characteristics. As in most Mediterranean countries, water resource management in Alfeios River Basin has been focused up to now on an essentially supply-driven approach. It is still characterized by a lack of effective operational strategies. Authority responsibility relationships are fragmented, and law enforcement and policy implementation are weak. The present regulated water allocation puzzle entails a mixture of hydropower generation, irrigation, drinking water supply and recreational activities. Under these conditions its water resources management is characterised by high uncertainty and by vague and imprecise data. The considered methodology has been developed in order to deal with uncertainties expressed as either probability distributions, or/and fuzzy boundary intervals, derived by associated α-cut levels. In this framework a set of deterministic submodels is studied through linear programming. The ad hoc water resources management and alternative management patterns in an Alfeios subbasin are analyzed and evaluated under various scenarios, using the above mentioned methodology, aiming to promote a sustainable and equitable water management. Li, Y.P., Huang, G.H. 
and S.L., Nie, (2010), Planning water resources management systems using a fuzzy-boundary interval-stochastic programming method, Elsevier Ltd, Advances in Water Resources, 33: 1105-1117. doi:10.1016/j.advwatres.2010.06.015 Bekri, E.S., Disse, M. and P.C.,Yannopoulos, (2012), Methodological framework for correction of quick river discharge measurements using quality characteristics, Session of Environmental Hydraulics - Hydrodynamics, 2nd Common Conference of Hellenic Hydrotechnical Association and Greek Committee for Water Resources Management, Volume: 546-557 (in Greek).
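A minimal sketch of the fuzzy-boundary interval idea underlying the methodology: at a given α-cut level, the fuzzy supply collapses to an interval, and a pair of deterministic LPs (pessimistic and optimistic) brackets the optimal allocation. All coefficients are invented for illustration and are not from the Alfeios case study:

```python
import numpy as np
from scipy.optimize import linprog

# triangular fuzzy water supply (low, mode, high); an alpha-cut yields an interval
low, mode, high = 60.0, 100.0, 140.0
alpha = 0.5
supply_lo = low + alpha * (mode - low)       # pessimistic supply
supply_hi = high - alpha * (high - mode)     # optimistic supply

benefit = np.array([-4.0, -2.0])             # maximize 4*irrigation + 2*hydropower
caps = [(0, 70), (0, 80)]                    # per-use allocation caps

bounds_on_benefit = {}
for name, supply in (("pessimistic", supply_lo), ("optimistic", supply_hi)):
    res = linprog(benefit, A_ub=[[1.0, 1.0]], b_ub=[supply],
                  bounds=caps, method="highs")
    bounds_on_benefit[name] = -res.fun       # maximized total benefit

lower_bound = bounds_on_benefit["pessimistic"]
upper_bound = bounds_on_benefit["optimistic"]
```

Sweeping α from 0 to 1 reproduces the set of deterministic submodels the study solves through linear programming.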
Guo, P; Huang, G H
2009-01-01
In this study, an inexact fuzzy chance-constrained two-stage mixed-integer linear programming (IFCTIP) approach is proposed for supporting long-term planning of waste-management systems under multiple uncertainties in the City of Regina, Canada. The method improves upon the existing inexact two-stage programming and mixed-integer linear programming techniques by incorporating uncertainties expressed as multiple uncertainties of intervals and dual probability distributions within a general optimization framework. The developed method can provide an effective linkage between the predefined environmental policies and the associated economic implications. Four special characteristics of the proposed method make it unique compared with other optimization techniques that deal with uncertainties. Firstly, it provides a linkage to predefined policies that have to be respected when a modeling effort is undertaken; secondly, it is useful for tackling uncertainties presented as intervals, probabilities, fuzzy sets and their incorporation; thirdly, it facilitates dynamic analysis for decisions of facility-expansion planning and waste-flow allocation within a multi-facility, multi-period, multi-level, and multi-option context; fourthly, the penalties are exercised with recourse against any infeasibility, which permits in-depth analyses of various policy scenarios that are associated with different levels of economic consequences when the promised solid waste-generation rates are violated. In a companion paper, the developed method is applied to a real case for the long-term planning of waste management in the City of Regina, Canada.
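A minimal deterministic-equivalent sketch of the two-stage-with-recourse ingredient of IFCTIP: first-stage treatment capacity is chosen before waste generation is known, and each scenario's excess incurs a recourse penalty. The numbers are invented for illustration:

```python
import numpy as np
from scipy.optimize import linprog

scenarios = np.array([80.0, 100.0, 130.0])   # possible waste amounts
prob = np.array([0.3, 0.5, 0.2])             # scenario probabilities
capacity_cost, penalty = 2.0, 5.0            # recourse penalty > capacity cost

# variables [x, y_1, y_2, y_3]: capacity x, unmet waste y_s >= w_s - x
c = np.concatenate([[capacity_cost], penalty * prob])
A_ub = np.hstack([-np.ones((3, 1)), -np.eye(3)])   # -x - y_s <= -w_s
res = linprog(c, A_ub=A_ub, b_ub=-scenarios,
              bounds=[(0, None)] * 4, method="highs")
x_capacity = res.x[0]
expected_cost = res.fun
```

The full IFCTIP model layers interval coefficients, fuzzy chance constraints, and integer expansion decisions on top of this recourse structure.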
NASA Astrophysics Data System (ADS)
Papagiannopoulou, Christina; Decubber, Stijn; Miralles, Diego; Demuzere, Matthias; Dorigo, Wouter; Verhoest, Niko; Waegeman, Willem
2017-04-01
Satellite data provide an abundance of information about crucial climatic and environmental variables. These data - consisting of global records, spanning up to 35 years and having the form of multivariate time series with different spatial and temporal resolutions - enable the study of key climate-vegetation interactions. Although methods based on correlations and linear models are typically used for this purpose, their linearity assumptions about the climate-vegetation relationships are too simplistic. Therefore, we adopt a recently proposed non-linear Granger causality analysis [1], in which we incorporate spatial information, concatenating data from neighboring pixels and training a joint model on the combined data. Experimental results based on global data sets show that considering non-linear relationships leads to a higher explained variance of past vegetation dynamics, compared to simple linear models. Our approach consists of several steps. First, we compile an extensive database [1], which includes multiple data sets for land surface temperature, near-surface air temperature, surface radiation, precipitation, snow water equivalents and surface soil moisture. Based on this database, high-level features are constructed and considered as predictors in our machine-learning framework. These high-level features include (de-trended) seasonal anomalies, lagged variables, past cumulative variables, and extreme indices, all calculated based on the raw climatic data. Second, we apply a spatiotemporal non-linear Granger causality framework - in which the linear predictive model is substituted for a non-linear machine learning algorithm - in order to assess which of these predictor variables Granger-cause vegetation dynamics at each 1° pixel. We use the de-trended anomalies of Normalized Difference Vegetation Index (NDVI) to characterize vegetation, being the target variable of our framework.
Experimental results indicate that climate strongly (Granger-)causes vegetation dynamics in most regions globally. More specifically, water availability is the dominant vegetation driver in 54% of the vegetated surface. Furthermore, our results show that precipitation and soil moisture have prolonged impacts on vegetation in semiarid regions, with up to 10% of additional explained variance on the vegetation dynamics occurring three months later. Finally, hydro-climatic extremes seem to have a remarkable impact on vegetation, since they also explain up to 10% of additional variance of vegetation in certain regions despite their infrequent occurrence. References [1] Papagiannopoulou, C., Miralles, D. G., Verhoest, N. E. C., Dorigo, W. A., and Waegeman, W.: A non-linear Granger causality framework to investigate climate-vegetation dynamics, Geosci. Model Dev. Discuss., doi:10.5194/gmd-2016-266, in review, 2016.
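The linear-versus-non-linear comparison can be sketched on synthetic data: when the target responds non-linearly to a lagged driver, a non-linear learner explains more variance than a linear model, mirroring the study's motivation. The data, the saturating response, and the choice of a random forest are illustrative stand-ins:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)
n = 600
driver = rng.normal(size=n)                    # e.g. a soil-moisture anomaly
lagged = np.roll(driver, 3)                    # three-step lagged predictor
target = np.tanh(2 * lagged) + 0.1 * rng.normal(size=n)  # saturating response

X = lagged[3:].reshape(-1, 1)                  # drop wrap-around samples
y = target[3:]
train, test = slice(0, 400), slice(400, None)

r2_linear = r2_score(
    y[test], LinearRegression().fit(X[train], y[train]).predict(X[test]))
r2_forest = r2_score(
    y[test], RandomForestRegressor(random_state=0).fit(X[train], y[train]).predict(X[test]))
```

The gap `r2_forest - r2_linear` plays the role of the "additional explained variance" the study attributes to non-linear climate-vegetation relationships.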
Chen, Xiujuan; Huang, Guohe; Zhao, Shan; Cheng, Guanhui; Wu, Yinghui; Zhu, Hua
2017-11-01
In this study, a stochastic fractional inventory-theory-based waste management planning (SFIWP) model was developed and applied for supporting long-term planning of the municipal solid waste (MSW) management in Xiamen City, the special economic zone of Fujian Province, China. In the SFIWP model, the techniques of inventory model, stochastic linear fractional programming, and mixed-integer linear programming were integrated in a framework. Issues of waste inventory in MSW management system were solved, and the system efficiency was maximized through considering maximum net-diverted wastes under various constraint-violation risks. Decision alternatives for waste allocation and capacity expansion were also provided for MSW management planning in Xiamen. The obtained results showed that about 4.24 × 10⁶ t of waste would be diverted from landfills when pᵢ is 0.01, which accounted for 93% of waste in Xiamen City, and the waste diversion per unit of cost would be 26.327 × 10³ t per $10⁶. The capacities of MSW management facilities including incinerators, composting facility, and landfills would be expanded due to increasing waste generation rate.
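A minimal sketch of the linear fractional programming ingredient: the Charnes-Cooper transformation turns a ratio objective, such as diverted waste per unit cost, into an ordinary LP. All coefficients are invented for illustration:

```python
import numpy as np
from scipy.optimize import linprog

c = np.array([3.0, 1.0])        # "diverted waste" coefficients (numerator)
d = np.array([2.0, 1.0])        # "cost" coefficients (denominator, > 0 when feasible)
A = np.array([[1.0, 1.0]])      # x1 + x2 <= 4
b = np.array([4.0])

# Charnes-Cooper: with y = t*x and d.y = 1, max (c.x)/(d.x) becomes
# max c.y  s.t.  A y - b t <= 0,  d.y = 1,  y >= 0, t >= 0
c_lp = np.concatenate([-c, [0.0]])                  # linprog minimizes
A_ub = np.hstack([A, -b.reshape(-1, 1)])
A_eq = np.concatenate([d, [0.0]]).reshape(1, -1)
res = linprog(c_lp, A_ub=A_ub, b_ub=[0.0], A_eq=A_eq, b_eq=[1.0],
              bounds=[(0, None)] * 3, method="highs")
y_opt, t = res.x[:2], res.x[2]
x_frac = y_opt / t                                  # recover the original variables
ratio = (c @ x_frac) / (d @ x_frac)
```

The SFIWP model combines this fractional step with stochastic constraint-violation levels and integer expansion variables.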
Varadarajan, Divya; Haldar, Justin P
2017-11-01
The data measured in diffusion MRI can be modeled as the Fourier transform of the Ensemble Average Propagator (EAP), a probability distribution that summarizes the molecular diffusion behavior of the spins within each voxel. This Fourier relationship is potentially advantageous because of the extensive theory that has been developed to characterize the sampling requirements, accuracy, and stability of linear Fourier reconstruction methods. However, existing diffusion MRI data sampling and signal estimation methods have largely been developed and tuned without the benefit of such theory, instead relying on approximations, intuition, and extensive empirical evaluation. This paper aims to address this discrepancy by introducing a novel theoretical signal processing framework for diffusion MRI. The new framework can be used to characterize arbitrary linear diffusion estimation methods with arbitrary q-space sampling, and can be used to theoretically evaluate and compare the accuracy, resolution, and noise-resilience of different data acquisition and parameter estimation techniques. The framework is based on the EAP, and makes very limited modeling assumptions. As a result, the approach can even provide new insight into the behavior of model-based linear diffusion estimation methods in contexts where the modeling assumptions are inaccurate. The practical usefulness of the proposed framework is illustrated using both simulated and real diffusion MRI data in applications such as choosing between different parameter estimation methods and choosing between different q-space sampling schemes. Copyright © 2017 Elsevier Inc. All rights reserved.
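A one-dimensional toy of the Fourier relationship described above: for a Gaussian propagator the q-space signal is an analytic Gaussian, and a linear inverse-DFT estimator recovers the propagator from the samples. The units and parameters are illustrative, not a realistic diffusion-MRI protocol:

```python
import numpy as np

sigma = np.sqrt(2 * 1e-3 * 50.0)      # Gaussian propagator width (illustrative units)
n, dx = 256, 0.01
q = np.fft.fftfreq(n, d=dx)           # sampled q-space locations
signal = np.exp(-2 * np.pi**2 * q**2 * sigma**2)   # E(q) = FT of the Gaussian EAP

eap = np.fft.fftshift(np.fft.ifft(signal)).real    # linear (inverse-DFT) reconstruction
eap /= eap.sum()                                    # normalize to a probability mass
x = np.fft.fftshift(np.fft.fftfreq(n, d=q[1] - q[0]))  # displacement grid
variance = float((eap * x**2).sum())                # second moment of recovered EAP
```

Because the estimator is linear, its resolution and noise behavior follow directly from Fourier sampling theory, which is the kind of analysis the proposed framework generalizes to arbitrary q-space schemes.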
NASA Astrophysics Data System (ADS)
Stoitsov, M. V.; Schunck, N.; Kortelainen, M.; Michel, N.; Nam, H.; Olsen, E.; Sarich, J.; Wild, S.
2013-06-01
We describe the new version 2.00d of the code HFBTHO that solves the nuclear Skyrme-Hartree-Fock (HF) or Skyrme-Hartree-Fock-Bogoliubov (HFB) problem by using the cylindrical transformed deformed harmonic oscillator basis. In the new version, we have implemented the following features: (i) the modified Broyden method for non-linear problems, (ii) optional breaking of reflection symmetry, (iii) calculation of axial multipole moments, (iv) finite temperature formalism for the HFB method, (v) linear constraint method based on the approximation of the Random Phase Approximation (RPA) matrix for multi-constraint calculations, (vi) blocking of quasi-particles in the Equal Filling Approximation (EFA), (vii) framework for generalized energy density with arbitrary density-dependences, and (viii) shared memory parallelism via OpenMP pragmas. Program summary. Program title: HFBTHO v2.00d Catalog identifier: ADUI_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADUI_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License version 3 No. of lines in distributed program, including test data, etc.: 167228 No. of bytes in distributed program, including test data, etc.: 2672156 Distribution format: tar.gz Programming language: FORTRAN-95. Computer: Intel Pentium-III, Intel Xeon, AMD-Athlon, AMD-Opteron, Cray XT5, Cray XE6. Operating system: UNIX, LINUX, WindowsXP. RAM: 200 Mwords Word size: 8 bits Classification: 17.22. Does the new version supersede the previous version?: Yes Catalog identifier of previous version: ADUI_v1_0 Journal reference of previous version: Comput. Phys. Comm. 167 (2005) 43 Nature of problem: The solution of self-consistent mean-field equations for weakly-bound paired nuclei requires a correct description of the asymptotic properties of nuclear quasi-particle wave functions.
In the present implementation, this is achieved by using the single-particle wave functions of the transformed harmonic oscillator, which allows for an accurate description of deformation effects and pairing correlations in nuclei arbitrarily close to the particle drip lines. Solution method: The program uses the axial Transformed Harmonic Oscillator (THO) single- particle basis to expand quasi-particle wave functions. It iteratively diagonalizes the Hartree-Fock-Bogoliubov Hamiltonian based on generalized Skyrme-like energy densities and zero-range pairing interactions until a self-consistent solution is found. A previous version of the program was presented in: M.V. Stoitsov, J. Dobaczewski, W. Nazarewicz, P. Ring, Comput. Phys. Commun. 167 (2005) 43-63. Reasons for new version: Version 2.00d of HFBTHO provides a number of new options such as the optional breaking of reflection symmetry, the calculation of axial multipole moments, the finite temperature formalism for the HFB method, optimized multi-constraint calculations, the treatment of odd-even and odd-odd nuclei in the blocking approximation, and the framework for generalized energy density with arbitrary density-dependences. It is also the first version of HFBTHO to contain threading capabilities. Summary of revisions: The modified Broyden method has been implemented, Optional breaking of reflection symmetry has been implemented, The calculation of all axial multipole moments up to λ=8 has been implemented, The finite temperature formalism for the HFB method has been implemented, The linear constraint method based on the approximation of the Random Phase Approximation (RPA) matrix for multi-constraint calculations has been implemented, The blocking of quasi-particles in the Equal Filling Approximation (EFA) has been implemented, The framework for generalized energy density functionals with arbitrary density-dependence has been implemented, Shared memory parallelism via OpenMP pragmas has been implemented. 
Restrictions: Axial- and time-reversal symmetries are assumed. Unusual features: The user must have access to the LAPACK subroutines DSYEVD, DSYTRF and DSYTRI, and their dependences, which compute eigenvalues and eigenfunctions of real symmetric matrices, the LAPACK subroutines DGETRI and DGETRF, which invert arbitrary real matrices, and the BLAS routines DCOPY, DSCAL, DGEMM and DGEMV for double-precision linear algebra (or provide another set of subroutines that can perform such tasks). The BLAS and LAPACK subroutines can be obtained from the Netlib Repository at the University of Tennessee, Knoxville: http://netlib2.cs.utk.edu/. Running time: Highly variable, as it depends on the nucleus, size of the basis, requested accuracy, requested configuration, compiler and libraries, and hardware architecture. An order of magnitude would be a few seconds for ground-state configurations in small bases N≈8-12, to a few minutes in very deformed configuration of a heavy nucleus with a large basis N>20.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gearhart, Jared Lee; Adair, Kristin Lynn; Durfee, Justin David.
When developing linear programming models, issues such as budget limitations, customer requirements, or licensing may preclude the use of commercial linear programming solvers. In such cases, one option is to use an open-source linear programming solver. A survey of linear programming tools was conducted to identify potential open-source solvers. From this survey, four open-source solvers were tested using a collection of linear programming test problems and the results were compared to IBM ILOG CPLEX Optimizer (CPLEX) [1], an industry standard. The solvers considered were: COIN-OR Linear Programming (CLP) [2], [3], GNU Linear Programming Kit (GLPK) [4], lp_solve [5] and Modular In-core Nonlinear Optimization System (MINOS) [6]. As no open-source solver outperforms CPLEX, this study demonstrates the power of commercial linear programming software. CLP was found to be the top-performing open-source solver considered in terms of capability and speed. GLPK also performed well but cannot match the speed of CLP or CPLEX. lp_solve and MINOS were considerably slower and encountered issues when solving several test problems.
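The kind of test problem used in such comparisons can be reproduced with any LP interface. As a small illustration (using SciPy's `linprog` rather than any of the surveyed solvers, and with made-up coefficients), a two-variable LP looks like:

```python
from scipy.optimize import linprog

# Maximize 3x + 5y subject to x <= 4, 2y <= 12, 3x + 2y <= 18, x, y >= 0.
# linprog minimizes, so the objective is negated; the coefficients are
# purely illustrative, not from the survey's test collection.
c = [-3, -5]
A_ub = [[1, 0], [0, 2], [3, 2]]
b_ub = [4, 12, 18]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)  # optimum at (2, 6), objective value 36
```

The surveyed solvers all accept this same standard form (objective, inequality constraints, variable bounds), which is what makes a head-to-head benchmark on a shared problem collection possible.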
NASA Astrophysics Data System (ADS)
Vukics, András
2012-06-01
C++QED is a versatile framework for simulating open quantum dynamics. It allows arbitrarily complex quantum systems to be built from elementary free subsystems and interactions, and their time evolution to be simulated with the available time-evolution drivers. Through this framework, we introduce a design which should be generic for high-level representations of composite quantum systems. It relies heavily on the object-oriented and generic programming paradigms on the one hand, and on compile-time algorithms, in particular C++ template-metaprogramming techniques, on the other. The core of the design is the data structure which represents the state vectors of composite quantum systems. This data structure models the multi-array concept. The use of template metaprogramming is not only crucial to the design; with its use, all computations pertaining to the layout of the simulated system can be shifted to compile time, hence reducing runtime. Program summary: Program title: C++QED Catalogue identifier: AELU_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELU_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: http://cpc.cs.qub.ac.uk/licence/aelu_v1_0.html. The C++QED package contains other software packages, Blitz, Boost and FLENS, all of which may be distributed freely but have individual license requirements. Please see individual packages for license conditions. No. of lines in distributed program, including test data, etc.: 597 974 No. of bytes in distributed program, including test data, etc.: 4 874 839 Distribution format: tar.gz Programming language: C++ Computer: i386-i686, x86_64 Operating system: In principle cross-platform, as yet tested only on UNIX-like systems (including Mac OS X). RAM: The framework itself takes about 60 MB, which is fully shared. The additional memory taken by the program which defines the actual physical system (script) is typically less than 1 MB.
The memory storing the actual data scales with the system dimension for state-vector manipulations, and with the square of the dimension for density-operator manipulations. This might easily be gigabytes, and often the memory of the machine limits the size of the simulated system. Classification: 4.3, 4.13, 6.2, 20 External routines: Boost C++ libraries (http://www.boost.org/), GNU Scientific Library (http://www.gnu.org/software/gsl/), Blitz++ (http://www.oonumerics.org/blitz/), Flexible Library for Efficient Numerical Solutions (http://flens.sourceforge.net/). Nature of problem: Definition of (open) composite quantum systems out of elementary building blocks [1]. Manipulation of such systems, with emphasis on dynamical simulations such as Master-equation evolution [2] and Monte Carlo wave-function simulation [3]. Solution method: Master equation, Monte Carlo wave-function method. Restrictions: Total dimensionality of the system: a few thousand for the Master equation; several million for a Monte Carlo wave-function trajectory. Unusual features: Because of the heavy use of compile-time algorithms, compilation of programs written in the framework may take a long time and much memory (up to several GBs). Additional comments: The framework is not a program, but provides and implements an application-programming interface for developing simulations in the indicated problem domain. Supplementary information: http://cppqed.sourceforge.net/. Running time: Depending on the magnitude of the problem, can vary from a few seconds to weeks.
Business Education. Vocational Education Program Courses Standards.
ERIC Educational Resources Information Center
Florida State Dept. of Education, Tallahassee. Div. of Vocational, Adult, and Community Education.
This document contains vocational education program courses standards (curriculum frameworks and student performance standards) for business technology education programs in Florida. Each program courses standard is composed of two parts: a curriculum framework and student performance standards. The curriculum framework includes four major…
NASA Technical Reports Server (NTRS)
Ferencz, Donald C.; Viterna, Larry A.
1991-01-01
ALPS is a computer program which can be used to solve general linear programming (optimization) problems. ALPS was designed for those who have minimal linear programming (LP) knowledge and features a menu-driven scheme to guide the user through the process of creating and solving LP formulations. Once created, the problems can be edited and stored in standard DOS ASCII files to provide portability to various word processors or even other linear programming packages. Unlike many math-oriented LP solvers, ALPS contains an LP parser that reads through the LP formulation and reports several types of errors to the user. ALPS provides a large amount of solution data which is often useful in problem solving. In addition to pure linear programs, ALPS can solve integer, mixed-integer, and binary problems. Pure linear programs are solved with the revised simplex method. Integer or mixed-integer programs are solved initially with the revised simplex method and then completed using the branch-and-bound technique. Binary programs are solved with the method of implicit enumeration. This manual describes how to use ALPS to create, edit, and solve linear programming problems. Instructions for installing ALPS on a PC-compatible computer are included in the appendices, along with a general introduction to linear programming. A programmer's guide is also included for assistance in modifying and maintaining the program.
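The two-phase strategy described above (solve the LP relaxation with simplex, then branch and bound on fractional variables) can be sketched compactly. This is a minimal illustration of the technique on top of SciPy's `linprog`, not ALPS itself:

```python
from math import floor, ceil
from scipy.optimize import linprog

def branch_and_bound(c, A_ub, b_ub, int_tol=1e-6):
    """Minimize c.x over integer x >= 0: LP relaxation plus branch and bound."""
    best_x, best_val = None, float("inf")
    stack = [[(0, None)] * len(c)]             # per-variable bounds to explore
    while stack:
        bounds = stack.pop()
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
        if not res.success or res.fun >= best_val:
            continue                           # infeasible, or pruned by bound
        frac = next((i for i, v in enumerate(res.x)
                     if abs(v - round(v)) > int_tol), None)
        if frac is None:                       # integer-feasible: new incumbent
            best_x, best_val = [round(v) for v in res.x], res.fun
            continue
        lo, hi = bounds[frac]                  # branch on the fractional variable
        left, right = list(bounds), list(bounds)
        left[frac] = (lo, floor(res.x[frac]))
        right[frac] = (ceil(res.x[frac]), hi)
        stack += [left, right]
    return best_x, best_val

# maximize 5x + 4y s.t. 6x + 4y <= 24, x + 2y <= 6, x, y nonnegative integers
x, val = branch_and_bound([-5, -4], [[6, 4], [1, 2]], [24, 6])
print(x, -val)  # integer optimum (4, 0), value 20; the LP relaxation peaks at (3, 1.5)
```

Pruning a node whenever its relaxation value is no better than the incumbent is what keeps the tree small, which is the same reason ALPS starts from the revised simplex solution before branching.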
76 FR 38602 - Bovine Tuberculosis and Brucellosis; Program Framework
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-01
...] Bovine Tuberculosis and Brucellosis; Program Framework AGENCY: Animal and Plant Health Inspection Service... framework being developed for the bovine tuberculosis and brucellosis programs in the United States. This... proposed revisions to its programs regarding bovine tuberculosis (TB) and bovine brucellosis in the United...
NASA Astrophysics Data System (ADS)
Levin, Alan R.; Zhang, Deyin; Polizzi, Eric
2012-11-01
In a recent article Polizzi (2009) [15], the FEAST algorithm has been presented as a general purpose eigenvalue solver which is ideally suited for addressing the numerical challenges in electronic structure calculations. Here, FEAST is presented beyond the “black-box” solver as a fundamental modeling framework which can naturally address the original numerical complexity of the electronic structure problem as formulated by Slater in 1937 [3]. The non-linear eigenvalue problem arising from the muffin-tin decomposition of the real-space domain is first derived and then reformulated to be solved exactly within the FEAST framework. This new framework is presented as a fundamental and practical solution for performing both accurate and scalable electronic structure calculations, bypassing the various issues of using traditional approaches such as linearization and pseudopotential techniques. A finite element implementation of this FEAST framework along with simulation results for various molecular systems is also presented and discussed.
User's manual for LINEAR, a FORTRAN program to derive linear aircraft models
NASA Technical Reports Server (NTRS)
Duke, Eugene L.; Patterson, Brian P.; Antoniewicz, Robert F.
1987-01-01
This report documents a FORTRAN program that provides a powerful and flexible tool for the linearization of aircraft models. The program LINEAR numerically determines a linear system model using nonlinear equations of motion and a user-supplied nonlinear aerodynamic model. The system model determined by LINEAR consists of matrices for both state and observation equations. The program has been designed to allow easy selection and definition of the state, control, and observation variables to be used in a particular model.
Robustness Analysis of Integrated LPV-FDI Filters and LTI-FTC System for a Transport Aircraft
NASA Technical Reports Server (NTRS)
Khong, Thuan H.; Shin, Jong-Yeob
2007-01-01
This paper proposes an analysis framework for robustness analysis of a nonlinear dynamic system that can be represented by a polynomial linear parameter varying (PLPV) system with constant bounded uncertainty. The proposed analysis framework contains three key tools: 1) a function substitution method, which can convert a nonlinear system in polynomial form into a PLPV system; 2) a matrix-based linear fractional transformation (LFT) modeling approach, which can convert a PLPV system into an LFT system with a delta block that includes the key uncertainty and scheduling parameters; and 3) μ-analysis, a well-known robustness analysis tool for linear systems. The proposed analysis framework is applied to evaluating the performance of the LPV fault detection and isolation (FDI) filters of the closed-loop system of a transport aircraft in the presence of unmodeled actuator dynamics and sensor gain uncertainty. The robustness analysis results are compared with nonlinear time simulations.
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the horticulture technology programs cluster. Presented in the introductory section are a framework of programs and courses, description of the programs, and suggested course sequences for…
Recent Updates to the GEOS-5 Linear Model
NASA Technical Reports Server (NTRS)
Holdaway, Dan; Kim, Jong G.; Errico, Ron; Gelaro, Ronald; Mahajan, Rahul
2014-01-01
The Global Modeling and Assimilation Office (GMAO) is close to having a working 4DVAR system and has developed a linearized version of GEOS-5. This talk outlines a series of improvements made to the linearized dynamics, physics, and trajectory. Of particular interest is the development of linearized cloud microphysics, which provides the framework for 'all-sky' data assimilation.
On the linear programming bound for linear Lee codes.
Astola, Helena; Tabus, Ioan
2016-01-01
Based on an invariance-type property of the Lee-compositions of a linear Lee code, additional equality constraints can be introduced to the linear programming problem of linear Lee codes. In this paper, we formulate this property in terms of an action of the multiplicative group of the field [Formula: see text] on the set of Lee-compositions. We show some useful properties of certain sums of Lee-numbers, which are the eigenvalues of the Lee association scheme, appearing in the linear programming problem of linear Lee codes. Using the additional equality constraints, we formulate the linear programming problem of linear Lee codes in a very compact form, leading to fast execution and allowing bounds to be computed efficiently for large parameter values of the linear codes.
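The linear programming problem referred to here follows the Delsarte scheme: maximize the total distance distribution subject to nonnegativity of its transform under the scheme's eigenvalues. As a hedged sketch, here is the Hamming-metric analogue built on Krawtchouk numbers; the Lee-metric version in the paper replaces these with the Lee-numbers and adds the extra equality constraints, which this sketch omits:

```python
import numpy as np
from math import comb
from scipy.optimize import linprog

def krawtchouk(n, k, i):
    # eigenvalues of the binary Hamming association scheme
    return sum((-1) ** j * comb(i, j) * comb(n - i, k - j) for j in range(k + 1))

def lp_bound(n, d):
    """Delsarte LP upper bound on the size of a binary code of length n,
    minimum distance d (Hamming metric; illustrative stand-in for Lee)."""
    # variables A_1..A_n; A_0 = 1 is fixed, so maximize 1 + sum A_i
    c = -np.ones(n)
    # transform positivity: sum_i A_i K_k(i) >= -K_k(0) for k = 1..n
    A_ub = np.array([[-krawtchouk(n, k, i) for i in range(1, n + 1)]
                     for k in range(1, n + 1)], dtype=float)
    b_ub = np.array([float(krawtchouk(n, k, 0)) for k in range(1, n + 1)])
    # minimum distance: A_i = 0 for 0 < i < d
    bounds = [(0, 0) if i < d else (0, None) for i in range(1, n + 1)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return 1 - res.fun

print(lp_bound(5, 5))  # 2.0: the repetition-code case, matching A(5,5) = 2
```

Each extra equality constraint of the kind derived in the paper shrinks this feasible region further, which is what tightens the bound and speeds up the solve.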
NASA Astrophysics Data System (ADS)
Amsallem, David; Tezaur, Radek; Farhat, Charbel
2016-12-01
A comprehensive approach for real-time computations using a database of parametric, linear, projection-based reduced-order models (ROMs) based on arbitrary underlying meshes is proposed. In the offline phase of this approach, the parameter space is sampled and linear ROMs defined by linear reduced operators are pre-computed at the sampled parameter points and stored. Then, these operators and associated ROMs are transformed into counterparts that satisfy a certain notion of consistency. In the online phase of this approach, a linear ROM is constructed in real-time at a queried but unsampled parameter point by interpolating the pre-computed linear reduced operators on matrix manifolds and therefore computing an interpolated linear ROM. The proposed overall model reduction framework is illustrated with two applications: a parametric inverse acoustic scattering problem associated with a mockup submarine, and a parametric flutter prediction problem associated with a wing-tank system. The second application is implemented on a mobile device, illustrating the capability of the proposed computational framework to operate in real-time.
EZLP: An Interactive Computer Program for Solving Linear Programming Problems. Final Report.
ERIC Educational Resources Information Center
Jarvis, John J.; And Others
Designed for student use in solving linear programming problems, the interactive computer program described (EZLP) permits the student to input the linear programming model in exactly the same manner in which it would be written on paper. This report includes a brief review of the development of EZLP; narrative descriptions of program features,…
VENVAL : a plywood mill cost accounting program
Henry Spelter
1991-01-01
This report documents a package of computer programs called VENVAL. These programs prepare plywood mill data for a linear programming (LP) model that, in turn, calculates the optimum mix of products to make, given a set of technologies and market prices. (The software to solve a linear program is not provided and must be obtained separately.) Linear programming finds...
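The product-mix LP that VENVAL prepares for an external solver has the classic form: maximize revenue over product quantities subject to resource limits. A small hypothetical sketch (all numbers invented; VENVAL's actual mill data preparation is far richer) using SciPy:

```python
from scipy.optimize import linprog

# Hypothetical product-mix LP in the spirit of VENVAL's output: choose panel
# quantities x to maximize revenue subject to veneer supply and press hours.
prices = [42.0, 55.0]            # $/panel for two hypothetical products
veneer_use = [1.2, 1.8]          # m^3 of veneer per panel
press_use = [0.5, 0.9]           # press hours per panel

res = linprog(
    c=[-p for p in prices],      # linprog minimizes, so negate revenue
    A_ub=[veneer_use, press_use],
    b_ub=[600.0, 280.0],         # available veneer and press hours
    bounds=[(0, None), (0, None)],
)
print("optimal mix:", res.x, "revenue:", -res.fun)  # (500, 0), revenue 21000
```

The dual values of such a solve price out the scarce resources (here, veneer), which is the usual managerial payoff of casting the mill's product decision as an LP.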
Ranking Forestry Investments With Parametric Linear Programming
Paul A. Murphy
1976-01-01
Parametric linear programming is introduced as a technique for ranking forestry investments under multiple constraints; it combines the advantages of simple ranking and linear programming as capital budgeting tools.
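In parametric linear programming, a right-hand side such as the capital budget is swept and the LP re-solved, tracing out the piecewise-linear optimal value function; its breakpoints mark where the ranking of investments changes. A toy sketch (invented returns and caps, not forestry data) using SciPy:

```python
from scipy.optimize import linprog

# Sweep the budget of a toy capital-budgeting LP and record the optimal
# return at each level; all figures are made up for illustration.
returns = [0.08, 0.12, 0.15]           # per-dollar return of three investments
caps = [(0, 40), (0, 30), (0, 20)]     # per-investment dollar caps

values = []
for budget in [20, 40, 60, 80]:
    res = linprog([-r for r in returns],
                  A_ub=[[1, 1, 1]], b_ub=[budget], bounds=caps)
    values.append(-res.fun)            # optimal return at this budget level
print(values)  # [3.0, 5.4, 7.4, 9.0]: slopes fall as cheaper options cap out
```

The declining slope between budget levels (0.15, then 0.12, then 0.08) is exactly the ranking information the abstract refers to: each segment of the value function corresponds to the next-best investment absorbing marginal capital.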
Public health program capacity for sustainability: a new framework.
Schell, Sarah F; Luke, Douglas A; Schooley, Michael W; Elliott, Michael B; Herbers, Stephanie H; Mueller, Nancy B; Bunger, Alicia C
2013-02-01
Public health programs can only deliver benefits if they are able to sustain activities over time. There is a broad literature on program sustainability in public health, but it is fragmented and there is a lack of consensus on core constructs. The purpose of this paper is to present a new conceptual framework for program sustainability in public health. This developmental study uses a comprehensive literature review, input from an expert panel, and the results of concept-mapping to identify the core domains of a conceptual framework for public health program capacity for sustainability. The concept-mapping process included three types of participants (scientists, funders, and practitioners) from several public health areas (e.g., tobacco control, heart disease and stroke, physical activity and nutrition, and injury prevention). The literature review identified 85 relevant studies focusing on program sustainability in public health. Most of the papers described empirical studies of prevention-oriented programs aimed at the community level. The concept-mapping process identified nine core domains that affect a program's capacity for sustainability: Political Support, Funding Stability, Partnerships, Organizational Capacity, Program Evaluation, Program Adaptation, Communications, Public Health Impacts, and Strategic Planning. Concept-mapping participants further identified 93 items across these domains that have strong face validity: 89% of the individual items composing the framework had specific support in the sustainability literature. The sustainability framework presented here suggests that a number of selected factors may be related to a program's ability to sustain its activities and benefits over time. These factors have been discussed in the literature, but this framework synthesizes and combines the factors and suggests how they may be interrelated with one another. 
The framework presents domains for public health decision makers to consider when developing and implementing prevention and intervention programs. The sustainability framework will be useful for public health decision makers, program managers, program evaluators, and dissemination and implementation researchers.
A Pareto analysis approach to assess relevant marginal CO₂ footprint for petroleum products
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tehrani, Nejad M. Alireza, E-mail: alireza.tehraninejad@gmail.com
2015-07-15
Recently, linear programming (LP) models have been extended to track the marginal CO₂ intensity of automotive fuels at the refinery gate. The obtained CO₂ data are recommended for policy making because they capture the economic and environmental tensions as well as the processing effects related to oil products. However, they are proven to be extremely sensitive to small perturbations and therefore useless in practice. In this paper, we first investigate the theoretical reasons for this drawback. Then, we develop a multiple-objective LP framework to assess relevant marginal CO₂ footprints that preserve both defensibility and stability at a satisfactory level of acceptance. A case study illustrates this new methodology. - Highlights: • Refining LP models have limitations to provide useful marginal CO₂ footprints. • A multi-objective optimization framework is developed to assess relevant CO₂ data. • Within the European refining industry, diesel is more CO₂ intensive than gasoline.
A spatial operator algebra for manipulator modeling and control
NASA Technical Reports Server (NTRS)
Rodriguez, G.; Kreutz, Kenneth; Jain, Abhinandan
1989-01-01
A recently developed spatial operator algebra, useful for modeling, control, and trajectory design of manipulators, is discussed. The elements of this algebra are linear operators whose domain and range spaces consist of forces, moments, velocities, and accelerations. The effect of these operators is equivalent to a spatial recursion along the span of a manipulator. Inversion of operators can be efficiently obtained via techniques of recursive filtering and smoothing. The operator algebra provides a high-level framework for describing the dynamic and kinematic behavior of a manipulator and control and trajectory design algorithms. The interpretation of expressions within the algebraic framework leads to enhanced conceptual and physical understanding of manipulator dynamics and kinematics. Furthermore, implementable recursive algorithms can be immediately derived from the abstract operator expressions by inspection. Thus, the transition from an abstract problem formulation and solution to the detailed mechanization of specific algorithms is greatly simplified. The analytical formulation of the operator algebra, as well as its implementation in the Ada programming language, is discussed.
Faizullah, Faiz
2016-01-01
The aim of the current paper is to present path-wise and moment estimates for solutions to stochastic functional differential equations (SFDEs) with a non-linear growth condition in the framework of G-expectation and G-Brownian motion. Under the non-linear growth condition, the pth moment estimates for solutions to SFDEs driven by G-Brownian motion are proved. The properties of G-expectations, Hölder's inequality, Bihari's inequality, Gronwall's inequality and the Burkholder-Davis-Gundy inequalities are used to develop the above-mentioned theory. In addition, the path-wise asymptotic estimates and continuity of the pth moment for the solutions to SFDEs in the G-framework, with the non-linear growth condition, are shown.
Massively parallel sparse matrix function calculations with NTPoly
NASA Astrophysics Data System (ADS)
Dawson, William; Nakajima, Takahito
2018-04-01
We present NTPoly, a massively parallel library for computing the functions of sparse, symmetric matrices. The theory of matrix functions is a well-developed framework with a wide range of applications including differential equations, graph theory, and electronic structure calculations. One particularly important application area is diagonalization-free methods in quantum chemistry. When the input and output of the matrix function are sparse, methods based on polynomial expansions can be used to compute matrix functions in linear time. We present a library based on these methods that can compute a variety of matrix functions. Distributed memory parallelization is based on a communication-avoiding sparse matrix multiplication algorithm. OpenMP task parallelization is utilized to implement hybrid parallelization. We describe NTPoly's interface and show how it can be integrated with programs written in many different programming languages. We demonstrate the merits of NTPoly by performing large-scale calculations on the K computer.
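The polynomial-expansion idea can be illustrated in dense form; NTPoly's point is to run such recurrences on distributed sparse matrices so the cost stays linear in the number of nonzeros. The Chebyshev recurrence below is a generic numpy sketch of the technique, not NTPoly's API:

```python
import numpy as np

def chebyshev_matrix_function(A, f, order=30):
    """Approximate f(A) for symmetric A by a truncated Chebyshev series
    on the interval spanned by A's spectrum (dense illustrative sketch)."""
    lo, hi = np.linalg.eigvalsh(A)[[0, -1]]            # spectral bounds
    B = (2 * A - (hi + lo) * np.eye(len(A))) / (hi - lo)  # map spectrum to [-1, 1]
    # Chebyshev coefficients of f on [lo, hi] via the cosine (DCT) rule
    k = np.arange(order + 1)
    nodes = np.cos(np.pi * (k + 0.5) / (order + 1))
    fvals = f(0.5 * (hi - lo) * nodes + 0.5 * (hi + lo))
    c = (2.0 / (order + 1)) * np.cos(np.pi * np.outer(k, k + 0.5) / (order + 1)) @ fvals
    c[0] /= 2
    # accumulate with the three-term recurrence T_{k+1} = 2 B T_k - T_{k-1}:
    # only matrix-matrix products, which stay sparse for well-localized A
    Tkm1, Tk = np.eye(len(A)), B
    F = c[0] * Tkm1 + c[1] * Tk
    for ck in c[2:]:
        Tkm1, Tk = Tk, 2 * B @ Tk - Tkm1
        F += ck * Tk
    return F

A = np.array([[2.0, 1.0], [1.0, 3.0]])
F = chebyshev_matrix_function(A, np.exp)
```

Because the recurrence uses nothing but matrix multiplications and additions, swapping the dense arrays for distributed sparse matrices (as NTPoly does) changes the cost model without changing the algorithm.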
Computational efficiency improvements for image colorization
NASA Astrophysics Data System (ADS)
Yu, Chao; Sharma, Gaurav; Aly, Hussein
2013-03-01
We propose an efficient algorithm for colorization of greyscale images. As in prior work, colorization is posed as an optimization problem: a user specifies the color for a few scribbles drawn on the greyscale image, and the color image is obtained by propagating color information from the scribbles to surrounding regions while maximizing the local smoothness of colors. In this formulation, colorization is obtained by solving a large sparse linear system, which normally requires substantial computation and memory resources. Our algorithm improves the computational performance through three innovations over prior colorization implementations. First, the linear system is solved iteratively without explicitly constructing the sparse matrix, which significantly reduces the required memory. Second, we formulate each iteration in terms of integral images obtained by dynamic programming, reducing repetitive computation. Third, we use a coarse-to-fine framework, where a lower-resolution subsampled image is first colorized and this low-resolution color image is upsampled to initialize the colorization process for the fine level. The improvements we develop provide significant speedup and memory savings compared to the conventional approach of solving the linear system directly using off-the-shelf sparse solvers, and allow us to colorize images with typical sizes encountered in realistic applications on typical commodity computing platforms.
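The second innovation, integral images obtained by dynamic programming, is easy to sketch: one cumulative-sum pass makes every subsequent rectangular sum an O(1) lookup, which is what keeps the per-iteration smoothness terms cheap to evaluate.

```python
import numpy as np

def integral_image(img):
    # dynamic-programming pass: ii[r, c] = sum of img[:r+1, :c+1]
    return np.cumsum(np.cumsum(np.asarray(img, dtype=float), axis=0), axis=1)

def box_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1+1, c0:c1+1] in O(1) from the integral image."""
    s = ii[r1, c1]
    if r0 > 0:
        s -= ii[r0 - 1, c1]
    if c0 > 0:
        s -= ii[r1, c0 - 1]
    if r0 > 0 and c0 > 0:
        s += ii[r0 - 1, c0 - 1]  # added back: subtracted twice above
    return s

img = np.arange(12).reshape(3, 4)
ii = integral_image(img)
print(box_sum(ii, 1, 1, 2, 3))  # equals img[1:3, 1:4].sum() = 48
```

Any box-filtered quantity in the iteration can be expressed this way, so the per-pixel work no longer depends on the neighborhood size.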
A FRAMEWORK FOR A COMPUTATIONAL TOXICOLOGY RESEARCH PROGRAM IN ORD
"A Framework for a Computational Toxicology Research Program in ORD" was drafted by a Technical Writing Team having representatives from all of ORD's Laboratories and Centers. The document describes a framework for the development of an program within ORD to utilize approaches d...
Investigating Integer Restrictions in Linear Programming
ERIC Educational Resources Information Center
Edwards, Thomas G.; Chelst, Kenneth R.; Principato, Angela M.; Wilhelm, Thad L.
2015-01-01
Linear programming (LP) is an application of graphing linear systems that appears in many Algebra 2 textbooks. Although not explicitly mentioned in the Common Core State Standards for Mathematics, linear programming blends seamlessly into modeling with mathematics, the fourth Standard for Mathematical Practice (CCSSI 2010, p. 7). In solving a…
Conceptual framework for development of comprehensive e-health evaluation tool.
Khoja, Shariq; Durrani, Hammad; Scott, Richard E; Sajwani, Afroz; Piryani, Usha
2013-01-01
The main objective of this study was to develop an e-health evaluation tool based on a conceptual framework including relevant theories for evaluating use of technology in health programs. This article presents the development of an evaluation framework for e-health programs. The study was divided into three stages: Stage 1 involved a detailed literature search of different theories and concepts on evaluation of e-health, Stage 2 plotted e-health theories to identify relevant themes, and Stage 3 developed a matrix of evaluation themes and stages of e-health programs. The framework identifies and defines different stages of e-health programs and then applies evaluation theories to each of these stages for development of the evaluation tool. This framework builds on existing theories of health and technology evaluation and presents a conceptual framework for developing an e-health evaluation tool to examine and measure different factors that play a definite role in the success of e-health programs. The framework on the horizontal axis divides e-health into different stages of program implementation, while the vertical axis identifies different themes and areas of consideration for e-health evaluation. The framework helps understand various aspects of e-health programs and their impact that require evaluation at different stages of the life cycle. The study led to the development of a new and comprehensive e-health evaluation tool, named the Khoja-Durrani-Scott Framework for e-Health Evaluation.
User's manual for interactive LINEAR: A FORTRAN program to derive linear aircraft models
NASA Technical Reports Server (NTRS)
Antoniewicz, Robert F.; Duke, Eugene L.; Patterson, Brian P.
1988-01-01
An interactive FORTRAN program that provides the user with a powerful and flexible tool for the linearization of aircraft aerodynamic models is documented in this report. The program LINEAR numerically determines a linear system model using nonlinear equations of motion and a user-supplied linear or nonlinear aerodynamic model. The nonlinear equations of motion used are six-degree-of-freedom equations with stationary atmosphere and flat, nonrotating earth assumptions. The system model determined by LINEAR consists of matrices for both the state and observation equations. The program has been designed to allow easy selection and definition of the state, control, and observation variables to be used in a particular model.
Helitzer, Deborah L; Sussman, Andrew L; Hoffman, Richard M; Getrich, Christina M; Warner, Teddy D; Rhyne, Robert L
2014-08-01
Conceptual frameworks (CF) have historically been used to develop program theory. We re-examine the literature about the role of CF in this context, specifically how they can be used to create descriptive and prescriptive theories, as building blocks for a program theory. Using a case example of colorectal cancer screening intervention development, we describe the process of developing our initial CF, the methods used to explore the constructs in the framework and revise the framework for intervention development. We present seven steps that guided the development of our CF: (1) assemble the "right" research team, (2) incorporate existing literature into the emerging CF, (3) construct the conceptual framework, (4) diagram the framework, (5) operationalize the framework: develop the research design and measures, (6) conduct the research, and (7) revise the framework. A revised conceptual framework depicted more complicated inter-relationships of the different predisposing, enabling, reinforcing, and system-based factors. The updated framework led us to generate program theory and serves as the basis for designing future intervention studies and outcome evaluations. A CF can build a foundation for program theory. We provide a set of concrete steps and lessons learned to assist practitioners in developing a CF. Copyright © 2014 Elsevier Ltd. All rights reserved.
Machine Learning-based discovery of closures for reduced models of dynamical systems
NASA Astrophysics Data System (ADS)
Pan, Shaowu; Duraisamy, Karthik
2017-11-01
Despite the successful application of machine learning (ML) in fields such as image processing and speech recognition, only a few attempts have been made toward employing ML to represent the dynamics of complex physical systems. Previous attempts mostly focus on parameter calibration or data-driven augmentation of existing models. In this work we present a ML framework to discover closure terms in reduced models of dynamical systems and provide insights into potential problems associated with data-driven modeling. Based on exact closure models for linear systems, we propose a general linear closure framework from the viewpoint of optimization. The framework is based on a trapezoidal approximation of the convolution term. Hyperparameters that need to be determined include the temporal length of the memory effect, the number of sampling points, and the dimensions of hidden states. To circumvent the explicit specification of the memory effect, a general framework inspired by neural networks is also proposed. We conduct both a priori and a posteriori evaluations of the resulting model on a number of non-linear dynamical systems. This work was supported in part by AFOSR under the project ``LES Modeling of Non-local effects using Statistical Coarse-graining'' with Dr. Jean-Luc Cambier as the technical monitor.
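The trapezoidal approximation of the memory (convolution) term mentioned above amounts to standard composite-trapezoid quadrature over the stored history; a hedged numpy sketch (symbol names are ours, not the paper's):

```python
import numpy as np

def memory_term(kernel, history, dt):
    """Trapezoidal approximation of the closure convolution
    integral_0^T K(tau) x(t - tau) dtau, sampled every dt.
    kernel[i] holds K(i*dt); history[i] holds x(t - i*dt)."""
    w = np.ones(len(history))
    w[0] = w[-1] = 0.5                 # trapezoid endpoint weights
    return dt * np.sum(w * kernel * history)

# sanity check against a known integral: K = e^{-tau}, x constant 1 on [0, 1]
tau = np.linspace(0.0, 1.0, 1001)
approx = memory_term(np.exp(-tau), np.ones_like(tau), tau[1] - tau[0])
print(approx)  # close to 1 - e^{-1} ≈ 0.6321
```

The hyperparameters the abstract lists map directly onto this sketch: the temporal length of the memory effect is `len(history) * dt`, and the number of sampling points sets the quadrature resolution.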
Lu, Zhao; Sun, Jing; Butts, Kenneth
2014-05-01
Support vector regression for approximating nonlinear dynamic systems is more delicate than the approximation of indicator functions in support vector classification, particularly for systems that involve multitudes of time scales in their sampled data. The kernel used for support vector learning determines the class of functions from which a support vector machine can draw its solution, and the choice of kernel significantly influences the performance of a support vector machine. In this paper, to bridge the gap between wavelet multiresolution analysis and kernel learning, the closed-form orthogonal wavelet is exploited to construct new multiscale asymmetric orthogonal wavelet kernels for linear programming support vector learning. The closed-form multiscale orthogonal wavelet kernel provides a systematic framework to implement multiscale kernel learning via dyadic dilations and also enables us to represent complex nonlinear dynamics effectively. To demonstrate the superiority of the proposed multiscale wavelet kernel in identifying complex nonlinear dynamic systems, two case studies are presented that aim at building parallel models on benchmark datasets. The development of parallel models that address the long-term/mid-term prediction issue is more intricate and challenging than the identification of series-parallel models where only one-step ahead prediction is required. Simulation results illustrate the effectiveness of the proposed multiscale kernel learning.
A depth-first search algorithm to compute elementary flux modes by linear programming.
Quek, Lake-Ee; Nielsen, Lars K
2014-07-30
The decomposition of complex metabolic networks into elementary flux modes (EFMs) provides a useful framework for exploring reaction interactions systematically. Generating a complete set of EFMs for large-scale models, however, is near impossible. Even for moderately-sized models (<400 reactions), existing approaches based on the Double Description method must iterate through a large number of combinatorial candidates, thus imposing an immense processor and memory demand. Based on an alternative elementarity test, we developed a depth-first search algorithm using linear programming (LP) to enumerate EFMs in an exhaustive fashion. Constraints can be introduced to directly generate a subset of EFMs satisfying the set of constraints. The depth-first search algorithm has a constant memory overhead. Using flux constraints, a large LP problem can be massively divided and parallelized into independent sub-jobs for deployment into computing clusters. Since the sub-jobs do not overlap, the approach scales to utilize all available computing nodes with minimal coordination overhead or memory limitations. The speed of the algorithm was comparable to efmtool, a mainstream Double Description method, when enumerating all EFMs; the attrition power gained from performing flux feasibility tests offsets the increased computational demand of running an LP solver. Unlike the Double Description method, the algorithm enables accelerated enumeration of all EFMs satisfying a set of constraints.
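The feasibility test at each node of such a depth-first search is itself a small LP: pin a set of reactions to zero and ask whether any nonzero steady-state flux remains. A minimal sketch (toy stoichiometry, SciPy in place of a production LP solver):

```python
import numpy as np
from scipy.optimize import linprog

def feasible_support(S, banned):
    """LP feasibility: does a flux v >= 0 with S v = 0 and sum(v) = 1 exist
    while every reaction index in `banned` is pinned to zero?"""
    m, n = S.shape
    bounds = [(0.0, 0.0) if j in banned else (0.0, None) for j in range(n)]
    res = linprog(np.zeros(n),                       # pure feasibility problem
                  A_eq=np.vstack([S, np.ones(n)]),   # steady state + normalization
                  b_eq=np.append(np.zeros(m), 1.0),
                  bounds=bounds)
    return bool(res.success)

# toy network: r0 uptakes A, r1 converts A -> B, r2 secretes B, r3 secretes A
S = np.array([[1.0, -1.0, 0.0, -1.0],   # balance of metabolite A
              [0.0, 1.0, -1.0, 0.0]])   # balance of metabolite B
print(feasible_support(S, {2}))     # True: the mode {r0, r3} survives
print(feasible_support(S, {1, 3}))  # False: with r1, r3 off, nothing can flow
```

Because each such sub-problem is defined purely by its banned set, the sub-jobs are independent, which is what lets the authors' scheme parallelize across cluster nodes with essentially no coordination.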
BigQ: a NoSQL based framework to handle genomic variants in i2b2.
Gabetta, Matteo; Limongelli, Ivan; Rizzo, Ettore; Riva, Alberto; Segagni, Daniele; Bellazzi, Riccardo
2015-12-29
Precision medicine requires the tight integration of clinical and molecular data. To this end, it is mandatory to define proper technological solutions able to manage the overwhelming amount of high throughput genomic data needed to test associations between genomic signatures and human phenotypes. The i2b2 Center (Informatics for Integrating Biology and the Bedside) has developed a widely adopted international framework to use existing clinical data for discovery research that can help the definition of precision medicine interventions when coupled with genetic data. i2b2 can be significantly advanced by designing efficient management solutions for Next Generation Sequencing data. We developed BigQ, an extension of the i2b2 framework, which integrates patient clinical phenotypes with genomic variant profiles generated by Next Generation Sequencing. A visual programming i2b2 plugin allows retrieving variants belonging to the patients in a cohort by applying filters on genomic variant annotations. We report an evaluation of the query performance of our system on more than 11 million variants, showing that the implemented solution scales linearly in terms of query time and disk space with the number of variants. In this paper we describe a new i2b2 web service composed of an efficient and scalable document-based database that manages annotations of genomic variants and a visual programming plug-in designed to dynamically perform queries on clinical and genetic data. The system therefore allows managing the fast-growing volume of genomic variants and can be used to integrate heterogeneous genomic annotations.
Many-core graph analytics using accelerated sparse linear algebra routines
NASA Astrophysics Data System (ADS)
Kozacik, Stephen; Paolini, Aaron L.; Fox, Paul; Kelmelis, Eric
2016-05-01
Graph analytics is a key component in identifying emerging trends and threats in many real-world applications. Large-scale graph analytics frameworks provide a convenient and highly scalable platform for developing algorithms to analyze large datasets. Although conceptually scalable, these techniques exhibit poor performance on modern computational hardware. Another model of graph computation has emerged that promises improved performance and scalability by using abstract linear algebra operations as the basis for graph analysis, as laid out by the GraphBLAS standard. By using sparse linear algebra as the basis, existing highly efficient algorithms can be adapted to perform computations on the graph. This approach, however, is often less intuitive to graph analytics experts, who are accustomed to vertex-centric APIs such as Giraph, GraphX, and TinkerPop. We are developing an implementation of the high-level operations supported by these APIs in terms of linear algebra operations. This implementation is backed by many-core implementations of the fundamental GraphBLAS operations required, and offers the advantages of both the intuitive programming model of a vertex-centric API and the performance of a sparse linear algebra implementation. This technology can reduce the number of nodes required, as well as the run-time for a graph analysis problem, enabling customers to perform more complex analysis with less hardware at lower cost. All of this can be accomplished without requiring the customer to make any changes to their analytics code, thanks to the compatibility with existing graph APIs.
A python framework for environmental model uncertainty analysis
White, Jeremy; Fienen, Michael N.; Doherty, John E.
2016-01-01
We have developed pyEMU, a python framework for Environmental Modeling Uncertainty analyses: an open-source tool that is non-intrusive, easy to use, computationally efficient, and scalable to highly parameterized inverse problems. The framework implements several types of linear (first-order, second-moment (FOSM)) and non-linear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and non-linear analyses are documented in example notebooks implemented using Jupyter that are available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insights into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
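The FOSM machinery behind such analyses reduces to a single covariance update. The sketch below shows the textbook first-order, second-moment posterior parameter covariance with an illustrative Jacobian and diagonal covariances; it is not pyEMU's API.

```python
import numpy as np

def fosm_posterior_cov(J, prior_cov, obs_cov):
    """First-order, second-moment (FOSM) posterior parameter covariance:
    Sigma_post = (J^T C_obs^-1 J + C_prior^-1)^-1,
    where J is the observation sensitivity (Jacobian) matrix."""
    Jt_Ci_J = J.T @ np.linalg.inv(obs_cov) @ J
    return np.linalg.inv(Jt_Ci_J + np.linalg.inv(prior_cov))

# Two parameters observed through three observations (toy numbers)
J = np.array([[1.0, 0.0], [0.5, 0.5], [0.0, 1.0]])
prior = np.diag([4.0, 4.0])     # prior parameter variances
obs = np.diag([0.25, 0.25, 0.25])  # observation noise variances
post = fosm_posterior_cov(J, prior, obs)
```

Data-worth analysis follows by dropping or adding rows of J and comparing the resulting posterior variances, which is the kind of workflow the notebooks describe.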
Development and validation of a general purpose linearization program for rigid aircraft models
NASA Technical Reports Server (NTRS)
Duke, E. L.; Antoniewicz, R. F.
1985-01-01
A FORTRAN program that provides the user with a powerful and flexible tool for the linearization of aircraft models is discussed. The program LINEAR numerically determines a linear systems model using nonlinear equations of motion and a user-supplied, nonlinear aerodynamic model. The system model determined by LINEAR consists of matrices for both the state and observation equations. The program has been designed to allow easy selection and definition of the state, control, and observation variables to be used in a particular model. Also included in the report is a comparison of linear and nonlinear models for a high-performance aircraft.
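The core of a tool like LINEAR is numerical Jacobian extraction about an operating point. A minimal central-difference sketch on a pendulum-like toy model (not the aircraft equations of motion) illustrates how the state matrix A and control matrix B are obtained:

```python
import numpy as np

def linearize(f, x0, u0, eps=1e-6):
    """Numerically linearize xdot = f(x, u) about (x0, u0) using central
    differences; returns the state matrix A and control matrix B."""
    n, m = len(x0), len(u0)
    A = np.zeros((n, n))
    B = np.zeros((n, m))
    for j in range(n):
        dx = np.zeros(n); dx[j] = eps
        A[:, j] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
    for j in range(m):
        du = np.zeros(m); du[j] = eps
        B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
    return A, B

# Toy nonlinear model: xdot = [x2, -sin(x1) + u]
f = lambda x, u: np.array([x[1], -np.sin(x[0]) + u[0]])
A, B = linearize(f, np.zeros(2), np.zeros(1))
```

About the origin this recovers A = [[0, 1], [-1, 0]] and B = [[0], [1]], matching the analytic linearization.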
The Alberta K-9 Mathematics Program of Studies with Achievement Indicators
ERIC Educational Resources Information Center
Alberta Education, 2007
2007-01-01
The "Alberta K-9 Mathematics Program of Studies with Achievement Indicators" has been derived from "The Common Curriculum Framework for K-9 Mathematics: Western and Northern Canadian Protocol," May 2006 (the Common Curriculum Framework). The program of studies incorporates the conceptual framework for Kindergarten to Grade 9…
Linearized Programming of Memristors for Artificial Neuro-Sensor Signal Processing
Yang, Changju; Kim, Hyongsuk
2016-01-01
A linearized programming method of memristor-based neural weights is proposed. The memristor is known as an ideal element to implement a neural synapse due to its embedded functions of analog memory and analog multiplication. Its resistance variation with a voltage input is generally a nonlinear function of time. Linearizing the memristance variation over time is very important for the ease of memristor programming. In this paper, a method utilizing an anti-serial architecture for linear programming is proposed. The anti-serial architecture is composed of two memristors with opposite polarities. It linearizes the variation of memristance due to the complementary actions of the two memristors. For programming a memristor, an additional memristor with opposite polarity is employed. The linearization effect of weight programming of an anti-serial architecture is investigated, and the memristor bridge synapse, which is built with two sets of anti-serial memristor architecture, is taken as an application example of the proposed method. Simulations are performed with memristors of both the linear drift model and a nonlinear model. PMID:27548186
Optimization Research of Generation Investment Based on Linear Programming Model
NASA Astrophysics Data System (ADS)
Wu, Juan; Ge, Xueqian
Linear programming is an important branch of operational research and a mathematical method to assist people in carrying out scientific management. GAMS is an advanced simulation and optimization modeling language that combines complex mathematical programming formulations, such as linear programming (LP), nonlinear programming (NLP), and mixed-integer programming (MIP), with system simulation. In this paper, based on the linear programming model, the optimized investment decision-making of generation is simulated and analyzed. At last, the optimal installed capacity of power plants and the final total cost are obtained, which provides a rational decision-making basis for optimized investments.
Public health program capacity for sustainability: a new framework
2013-01-01
Background Public health programs can only deliver benefits if they are able to sustain activities over time. There is a broad literature on program sustainability in public health, but it is fragmented and there is a lack of consensus on core constructs. The purpose of this paper is to present a new conceptual framework for program sustainability in public health. Methods This developmental study uses a comprehensive literature review, input from an expert panel, and the results of concept-mapping to identify the core domains of a conceptual framework for public health program capacity for sustainability. The concept-mapping process included three types of participants (scientists, funders, and practitioners) from several public health areas (e.g., tobacco control, heart disease and stroke, physical activity and nutrition, and injury prevention). Results The literature review identified 85 relevant studies focusing on program sustainability in public health. Most of the papers described empirical studies of prevention-oriented programs aimed at the community level. The concept-mapping process identified nine core domains that affect a program’s capacity for sustainability: Political Support, Funding Stability, Partnerships, Organizational Capacity, Program Evaluation, Program Adaptation, Communications, Public Health Impacts, and Strategic Planning. Concept-mapping participants further identified 93 items across these domains that have strong face validity—89% of the individual items composing the framework had specific support in the sustainability literature. Conclusions The sustainability framework presented here suggests that a number of selected factors may be related to a program’s ability to sustain its activities and benefits over time. These factors have been discussed in the literature, but this framework synthesizes and combines the factors and suggests how they may be interrelated with one another. 
The framework presents domains for public health decision makers to consider when developing and implementing prevention and intervention programs. The sustainability framework will be useful for public health decision makers, program managers, program evaluators, and dissemination and implementation researchers. PMID:23375082
Causality Analysis of fMRI Data Based on the Directed Information Theory Framework.
Wang, Zhe; Alahmadi, Ahmed; Zhu, David C; Li, Tongtong
2016-05-01
This paper aims to conduct fMRI-based causality analysis in brain connectivity by exploiting the directed information (DI) theory framework. Unlike the well-known Granger causality (GC) analysis, which relies on the linear prediction technique, the DI theory framework does not have any modeling constraints on the sequences to be evaluated and ensures estimation convergence. Moreover, it can be used to generate the GC graphs. In this paper, first, we introduce the core concepts in the DI framework. Second, we present how to conduct causality analysis using DI measures between two time series. We provide the detailed procedure on how to calculate the DI for two finite-time series. The two major steps involved here are optimal bin size selection for data digitization and probability estimation. Finally, we demonstrate the applicability of DI-based causality analysis using both simulated data and experimental fMRI data, and compare the results with those of the GC analysis. Our analysis indicates that GC analysis is effective in detecting linear or nearly linear causal relationships, but may have difficulty in capturing nonlinear causal relationships. On the other hand, DI-based causality analysis is more effective in capturing both linear and nonlinear causal relationships. Moreover, it is observed that brain connectivity among different regions generally involves dynamic two-way information transmissions between them. Our results show that when bidirectional information flow is present, DI is more effective than GC in quantifying the overall causal relationship.
NASA Astrophysics Data System (ADS)
Fukuda, Jun'ichi; Johnson, Kaj M.
2010-06-01
We present a unified theoretical framework and solution method for probabilistic, Bayesian inversions of crustal deformation data. The inversions involve multiple data sets with unknown relative weights, model parameters that are related linearly or non-linearly through theoretic models to observations, prior information on model parameters and regularization priors to stabilize underdetermined problems. To efficiently handle non-linear inversions in which some of the model parameters are linearly related to the observations, this method combines both analytical least-squares solutions and a Monte Carlo sampling technique. In this method, model parameters that are linearly and non-linearly related to observations, relative weights of multiple data sets and relative weights of prior information and regularization priors are determined in a unified Bayesian framework. In this paper, we define the mixed linear-non-linear inverse problem, outline the theoretical basis for the method, provide a step-by-step algorithm for the inversion, validate the inversion method using synthetic data and apply the method to two real data sets. We apply the method to inversions of multiple geodetic data sets with unknown relative data weights for interseismic fault slip and locking depth. We also apply the method to the problem of estimating the spatial distribution of coseismic slip on faults with unknown fault geometry, relative data weights and smoothing regularization weight.
Portfolio optimization by using linear programming models based on genetic algorithm
NASA Astrophysics Data System (ADS)
Sukono; Hidayat, Y.; Lesmana, E.; Putra, A. S.; Napitupulu, H.; Supian, S.
2018-01-01
In this paper, we discuss investment portfolio optimization using a linear programming model based on genetic algorithms. It is assumed that the portfolio risk is measured by absolute standard deviation, and each investor has a risk tolerance on the investment portfolio. To solve the investment portfolio optimization problem, the issue is arranged into a linear programming model. Furthermore, the optimum solution of the linear program is determined by using a genetic algorithm. As a numerical illustration, we analyze some of the stocks traded on the capital market in Indonesia. Based on the analysis, it is shown that the portfolio optimization performed by the genetic algorithm approach produces a more efficient portfolio than the portfolio optimization performed by a linear programming algorithm approach. Therefore, genetic algorithms can be considered as an alternative for determining the investment portfolio optimization, particularly using linear programming models.
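The genetic-algorithm approach can be sketched on a two-asset toy instance whose LP optimum is known in closed form; the returns, risk numbers, tolerance, and GA operators below are illustrative assumptions, not the paper's data or implementation.

```python
import random

r = (0.10, 0.20)   # expected returns of the two assets (hypothetical)
d = (0.05, 0.15)   # absolute-deviation risk per asset (hypothetical)
t = 0.10           # investor's risk tolerance

def fitness(s):
    """Portfolio (1 - s, s): expected return with a penalty for
    exceeding the risk budget, so the GA handles the constraint."""
    ret = (1 - s) * r[0] + s * r[1]
    risk = (1 - s) * d[0] + s * d[1]
    return ret - 10.0 * max(0.0, risk - t)

def ga(pop_size=40, gens=60, seed=0):
    rng = random.Random(seed)
    pop = [rng.random() for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]                      # elitist selection
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            child = 0.5 * (a + b) + rng.gauss(0.0, 0.02)  # crossover + mutation
            children.append(min(1.0, max(0.0, child)))
        pop = elite + children
    return max(pop, key=fitness)

s = ga()  # analytic optimum of this toy problem: s = 0.5, fitness 0.15
```

On this instance the constraint binds at s = 0.5, so a working GA should return a portfolio weight close to that value.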
Study on Web-Based Tool for Regional Agriculture Industry Structure Optimization Using Ajax
NASA Astrophysics Data System (ADS)
Huang, Xiaodong; Zhu, Yeping
According to the research status of regional agriculture industry structure adjustment information systems and the current development of information technology, this paper takes a web-based regional agriculture industry structure optimization tool as its research target. The paper introduces Ajax technology and related application frameworks to build an auxiliary toolkit of a decision support system for agricultural policy makers and economy researchers. The toolkit includes a "one page" style component for regional agriculture industry structure optimization, which provides a flexible argument-setting method that supports sensitivity analysis and the use of data and comparative advantage analysis results, and a component that solves the linear programming model and its dual problem by the simplex method.
Development of a Rural Health Framework: Implications for Program Service Planning and Delivery
White, Deanna
2013-01-01
Purpose: To describe the development and application of an evidence-based Rural Health Framework to guide rural health program, policy and service planning. Methods: A literature review of rural health programs, focusing on health promotion, chronic disease prevention and population health, was conducted using several bibliographic databases. Findings: Thirty papers met the criteria for review, describing chronic disease interventions and public health policies in rural settings. Twenty-one papers demonstrated effective intervention programs and highlighted potential good practices for rural health programs, which were used to define key elements of a Rural Health Framework. Conclusions: The Rural Health Framework was applied to an influenza immunization program to demonstrate its utility in assisting public health providers to increase uptake of the vaccine. This Rural Health Framework provides an opportunity for program planners to reflect on the key issues facing rural communities to ensure the development of policies and strategies that will prudently and effectively meet population health needs. PMID:23968625
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the machine tool operation/machine tool and tool and die making technology programs cluster. Presented in the introductory section are a framework of courses and programs, description of the…
Aspect-object alignment with Integer Linear Programming in opinion mining.
Zhao, Yanyan; Qin, Bing; Liu, Ting; Yang, Wei
2015-01-01
Target extraction is an important task in opinion mining. In this task, a complete target consists of an aspect and its corresponding object. However, previous work has always simply regarded the aspect as the target itself and has ignored the important "object" element. Thus, these studies have addressed incomplete targets, which are of limited use for practical applications. This paper proposes a novel and important sentiment analysis task, termed aspect-object alignment, to solve the "object neglect" problem. The objective of this task is to obtain the correct corresponding object for each aspect. We design a two-step framework for this task. We first provide an aspect-object alignment classifier that incorporates three sets of features, namely, the basic, relational, and special target features. However, the objects that are assigned to aspects in a sentence often contradict each other and possess many complicated features that are difficult to incorporate into a classifier. To resolve these conflicts, we impose two types of constraints in the second step: intra-sentence constraints and inter-sentence constraints. These constraints are encoded as linear formulations, and Integer Linear Programming (ILP) is used as an inference procedure to obtain a final global decision that is consistent with the constraints. Experiments on a corpus in the camera domain demonstrate that the three feature sets used in the aspect-object alignment classifier are effective in improving its performance. Moreover, the classifier with ILP inference performs better than the classifier without it, thereby illustrating that the two types of constraints that we impose are beneficial.
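The ILP inference step can be illustrated on a tiny instance: binary alignment variables, classifier scores as the objective, and a one-object-per-aspect constraint. Exhaustive 0-1 search stands in for an ILP solver here, and the scores and constraint set are hypothetical, not the paper's features or corpus.

```python
from itertools import product

# Hypothetical classifier scores for aligning aspect i with object j
score = {(0, 0): 0.9, (0, 1): 0.4,
         (1, 0): 0.6, (1, 1): 0.7}
aspects, objects = [0, 1], [0, 1]

best, best_val = None, float("-inf")
# On 4 binary variables, brute force enumerates all 16 assignments,
# exactly the feasible region an ILP solver would search.
for x in product([0, 1], repeat=len(score)):
    assign = dict(zip(sorted(score), x))
    # Linear constraint: each aspect is aligned with exactly one object.
    if any(sum(assign[(i, j)] for j in objects) != 1 for i in aspects):
        continue
    val = sum(score[k] * assign[k] for k in score)
    if val > best_val:
        best, best_val = assign, val
```

The intra- and inter-sentence constraints of the paper would simply add further linear conditions to the feasibility check, pruning globally inconsistent alignments that the classifier alone would accept.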
Zhao, Yingfeng; Liu, Sanyang
2016-01-01
We present a practical branch and bound algorithm for globally solving the generalized linear multiplicative programming problem with multiplicative constraints. To solve the problem, a relaxation problem that is equivalent to a linear program is proposed by utilizing a new two-phase relaxation technique. In the algorithm, lower and upper bounds are simultaneously obtained by solving some linear relaxation programming problems. Global convergence has been proved, and the results of some sample examples and a small random experiment show that the proposed algorithm is feasible and efficient.
An approximation theory for the identification of linear thermoelastic systems
NASA Technical Reports Server (NTRS)
Rosen, I. G.; Su, Chien-Hua Frank
1990-01-01
An abstract approximation framework and convergence theory for the identification of thermoelastic systems is developed. Starting from an abstract operator formulation consisting of a coupled second order hyperbolic equation of elasticity and first order parabolic equation for heat conduction, well-posedness is established using linear semigroup theory in Hilbert space, and a class of parameter estimation problems is then defined involving mild solutions. The approximation framework is based upon generic Galerkin approximation of the mild solutions, and convergence of solutions of the resulting sequence of approximating finite dimensional parameter identification problems to a solution of the original infinite dimensional inverse problem is established using approximation results for operator semigroups. An example involving the basic equations of one dimensional linear thermoelasticity and a linear spline based scheme are discussed. Numerical results indicate how the approach might be used in a study of damping mechanisms in flexible structures.
Very Low-Cost Nutritious Diet Plans Designed by Linear Programming.
ERIC Educational Resources Information Center
Foytik, Jerry
1981-01-01
Provides procedural details of Linear Programming, developed by the U.S. Department of Agriculture to devise a dietary guide for consumers that minimizes food costs without sacrificing nutritional quality. Compares Linear Programming with the Thrifty Food Plan, which has been a basis for allocating coupons under the Food Stamp Program. (CS)
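A toy instance of the diet LP can be sketched with `scipy.optimize.linprog`; the food costs and nutrient numbers are illustrative, not USDA data, and the ">=" nutrient requirements are negated into the solver's "<=" form.

```python
import numpy as np
from scipy.optimize import linprog

# Minimize cost of two foods subject to minimum protein and carb intakes.
cost = [0.6, 0.9]              # $ per unit of food 1 and food 2 (toy values)
A_ub = [[-4.0, -8.0],          # protein: 4 x1 + 8 x2 >= 40, negated
        [-10.0, -5.0]]         # carbs:  10 x1 + 5 x2 >= 50, negated
b_ub = [-40.0, -50.0]
res = linprog(cost, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None)] * 2, method="highs")
```

Here the optimum mixes both foods (x1 = x2 = 10/3 at a cost of 5.0), which is the kind of least-cost blend the USDA formulation produces at national scale.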
OpenARC: Extensible OpenACC Compiler Framework for Directive-Based Accelerator Programming Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Seyong; Vetter, Jeffrey S
2014-01-01
Directive-based, accelerator programming models such as OpenACC have arisen as an alternative solution to program emerging Scalable Heterogeneous Computing (SHC) platforms. However, the increased complexity in the SHC systems incurs several challenges in terms of portability and productivity. This paper presents an open-sourced OpenACC compiler, called OpenARC, which serves as an extensible research framework to address those issues in the directive-based accelerator programming. This paper explains important design strategies and key compiler transformation techniques needed to implement the reference OpenACC compiler. Moreover, this paper demonstrates the efficacy of OpenARC as a research framework for directive-based programming study, by proposing and implementing OpenACC extensions in the OpenARC framework to 1) support hybrid programming of the unified memory and separate memory and 2) exploit architecture-specific features in an abstract manner. Porting thirteen standard OpenACC programs and three extended OpenACC programs to CUDA GPUs shows that OpenARC performs similarly to a commercial OpenACC compiler, while it serves as a high-level research framework.
Zhou, Gaochao; Tao, Xudong; Shen, Ze; Zhu, Guanghao; Jin, Biaobing; Kang, Lin; Xu, Weiwei; Chen, Jian; Wu, Peiheng
2016-01-01
We propose a general framework for the design of a perfect linear polarization converter that works in the transmission mode. Using an intuitive picture based on the method of bi-directional polarization mode decomposition, it is shown that when the device under consideration simultaneously possesses two complementary symmetry planes, with one being equivalent to a perfect electric conducting surface and the other being equivalent to a perfect magnetic conducting surface, linear polarization conversion can occur with an efficiency of 100% in the absence of absorptive losses. The proposed framework is validated by two design examples that operate near 10 GHz, where the numerical, experimental, and analytic results are in good agreement. PMID:27958313
NASA Astrophysics Data System (ADS)
Kusumawati, Rosita; Subekti, Retno
2017-04-01
The fuzzy bi-objective linear programming (FBOLP) model is a bi-objective linear programming model over fuzzy numbers, in which the coefficients of the equations are fuzzy numbers. This model is proposed to solve the portfolio selection problem, generating an asset portfolio with the lowest risk and the highest expected return. The FBOLP model with normal fuzzy numbers for the risk and expected return of stocks is transformed into a linear programming (LP) model using a magnitude ranking function.
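The transformation hinges on a ranking function that defuzzifies the fuzzy coefficients into crisp ones. The sketch below uses graded mean integration, (a + 4b + c)/6, for triangular fuzzy numbers; this is one common ranking choice, the paper's magnitude ranking function for normal fuzzy numbers may differ, and the fuzzy returns are hypothetical.

```python
def graded_mean(tfn):
    """Defuzzify a triangular fuzzy number (a, b, c) by graded mean
    integration, (a + 4b + c) / 6 -- one common ranking choice; the
    paper's magnitude ranking function may differ."""
    a, b, c = tfn
    return (a + 4 * b + c) / 6.0

# Hypothetical fuzzy expected returns for two stocks
fuzzy_return = [(0.08, 0.10, 0.12), (0.15, 0.20, 0.25)]
crisp_return = [graded_mean(t) for t in fuzzy_return]
```

The resulting crisp coefficients replace the fuzzy ones in the objective and constraints, turning the FBOLP model into an ordinary LP that any solver can handle.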
Bauer, Mark S; Krawczyk, Lois; Tuozzo, Kathy; Frigand, Cara; Holmes, Sally; Miller, Christopher J; Abel, Erica; Osser, David N; Franz, Aleda; Brandt, Cynthia; Rooney, Meghan; Fleming, Jerry; Smith, Eric; Godleski, Linda
2018-01-01
Telemental health interventions have empirical support from clinical trials and structured demonstration projects. However, their implementation and sustainability under less structured clinical conditions are not well demonstrated. We conducted a follow-up analysis of the implementation and sustainability of a clinical video teleconference-based collaborative care model for individuals with bipolar disorder treated in the Department of Veterans Affairs to (a) characterize the extent of implementation and sustainability of the program after its establishment and (b) identify barriers and facilitators to implementation and sustainability. We conducted a mixed methods program evaluation, assessing quantitative aspects of implementation according to the Reach, Efficacy, Adoption, Implementation, and Maintenance implementation framework. We conducted qualitative analysis of semistructured interviews with 16 of the providers who submitted consults, utilizing the Integrated Promoting Action on Research Implementation in the Health Services implementation framework. The program demonstrated linear growth in sites (n = 35) and consults (n = 915) from late 2011 through mid-2016. Site-based analysis indicated statistically significant sustainability beyond the first year of operation. Qualitative analysis identified key facilitators, including consult content, ease of use via electronic health record, and national infrastructure. Barriers included availability of telehealth space, equipment, and staff at the sites, as well as the labor-intensive nature of scheduling. The program achieved continuous growth over almost 5 years due to (1) successfully filling a need perceived by providers, (2) developing in a supportive context, and (3) receiving effective facilitation by national and local infrastructure. 
Clinical video teleconference-based interventions, even multicomponent collaborative care interventions for individuals with complex mental health conditions, can grow vigorously under appropriate conditions.
ERIC Educational Resources Information Center
Kane, Michael T.; Mroch, Andrew A.; Suh, Youngsuk; Ripkey, Douglas R.
2009-01-01
This paper analyzes five linear equating models for the "nonequivalent groups with anchor test" (NEAT) design with internal anchors (i.e., the anchor test is part of the full test). The analysis employs a two-dimensional framework. The first dimension contrasts two general approaches to developing the equating relationship. Under a "parameter…
Liu, Chunhua; Park, Eunsol; Jin, Yinghua; Liu, Jie; Yu, Yanxia; Zhang, Wei; Lei, Shengbin; Hu, Wenping
2018-05-31
A two-dimensional surface covalent organic framework, prepared by a surface-confined synthesis using 4,4'-azodianiline and benzene-1,3,5-tricarbaldehyde as the precursors, was used as a host network to effectively immobilize arylenevinylene macrocycles (AVMs). Thus AVMs could be separated from their linear polymer analogues, which are the common side-products in the cyclooligomerization process. Scanning tunneling microscopy investigations revealed efficient removal of linear polymers by a simple surface binding and solvent washing process. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
ERIC Educational Resources Information Center
Dyehouse, Melissa; Bennett, Deborah; Harbor, Jon; Childress, Amy; Dark, Melissa
2009-01-01
Logic models are based on linear relationships between program resources, activities, and outcomes, and have been used widely to support both program development and evaluation. While useful in describing some programs, the linear nature of the logic model makes it difficult to capture the complex relationships within larger, multifaceted…
Planning for Program Design and Assessment Using Value Creation Frameworks
ERIC Educational Resources Information Center
Whisler, Laurel; Anderson, Rachel; Brown, Jenai
2017-01-01
This article explains a program design and planning process using the Value Creation Framework (VCF) developed by Wenger, Trayner, and de Laat (2011). The framework involves identifying types of value or benefit for those involved in the program, conditions and activities that support creation of that value, data that measure whether the value was…
Framework for an Effective Assessment and Accountability Program: The Philadelphia Example
ERIC Educational Resources Information Center
Porter, Andrew C.; Chester, Mitchell D.; Schlesinger, Michael D.
2004-01-01
The purpose of this article is to put in the hands of researchers, practitioners, and policy makers a powerful framework for building and studying the effects of high-quality assessment and accountability programs. The framework is illustrated through a description and analysis of the assessment and accountability program in the School District of…
NASA Astrophysics Data System (ADS)
Laban, Shaban; El-Desouky, Aly
2014-05-01
To achieve rapid, simple, and reliable parallel processing of different types of tasks and big data processing on any compute cluster, a lightweight messaging-based distributed applications processing and workflow execution framework model is proposed. The framework is based on Apache ActiveMQ and the Simple (or Streaming) Text Oriented Message Protocol (STOMP). ActiveMQ, a popular and powerful open source persistence messaging and integration patterns server with scheduler capabilities, acts as the message broker in the framework. STOMP provides an interoperable wire format that allows framework programs to talk and interact with each other and with ActiveMQ easily. In order to use the message broker efficiently, a unified message and topic naming pattern is utilized to achieve the required operation. Only three Python programs and a simple library, used to unify and simplify the implementation of ActiveMQ and the STOMP protocol, are needed to use the framework. A watchdog program is used to monitor, remove, add, start, and stop any machine and/or its different tasks when necessary. For every machine, exactly one dedicated zookeeper program is used to start the different functions or tasks and the stompShell programs needed for executing the required workflow. The stompShell instances execute workflow jobs based on received messages. A well-defined, simple, and flexible message structure, based on JavaScript Object Notation (JSON), is used to build complex workflow systems. The JSON format is also used in configuration and in communication between machines and programs. The framework is platform independent. Although the framework is built in Python, the actual workflow programs or jobs can be implemented in any programming language.
The generic framework can be used in small national data centres for processing seismological and radionuclide data received from the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). It could also be extended to monitoring the IDC pipeline. The detailed design, implementation, conclusions, and future work of the proposed framework will be presented.
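A minimal sketch of the JSON job message described above, in Python; the field names and helper functions here are illustrative assumptions, not the framework's actual schema:

```python
import json

# Hypothetical workflow job message, sketched after the JSON structure the
# framework describes; field names are assumptions, not the real schema.
def make_job_message(job_id, machine, command, args, reply_topic):
    """Build a JSON job message for a stompShell instance."""
    return json.dumps({
        "job_id": job_id,
        "machine": machine,        # target machine running the zookeeper
        "command": command,        # program or task to execute
        "args": args,
        "reply_to": reply_topic,   # topic for status/result messages
    })

def parse_job_message(text):
    """Decode and minimally validate a received job message."""
    msg = json.loads(text)
    for field in ("job_id", "machine", "command"):
        if field not in msg:
            raise ValueError(f"missing field: {field}")
    return msg
```

In a real deployment the serialized message would be published to an ActiveMQ topic over STOMP and consumed by the target machine's stompShell.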
ERIC Educational Resources Information Center
California State Dept. of Education, Sacramento.
This booklet describes the characteristics and role of curriculum frameworks and describes how they can be used in developing educational programs. It is designed as a guide for writers of frameworks, for educators who are responsible for implementing frameworks, or for evaluators of educational programs. It provides a concise description of the…
An Integrity Framework for Image-Based Navigation Systems
2010-06-01
References cited include: Anton, H. and Rorres, C., Elementary Linear Algebra (New York: John Wiley & Sons, 2000); Arthur, T., "The Disparity of Parity, Determining…"; Spilker, James J., Digital Communications by Satellite (Englewood Cliffs, NJ: Prentice Hall, 1977); and Strang, G., Linear Algebra and Its Applications. Contents include: 2.3 The Linearized and Extended Kalman Filters; 2.3.1 State and Measurement Model Equations; 2.3.2 The Linearized Kalman Filter.
NASA Astrophysics Data System (ADS)
Xiao, Jingjie
A key hurdle for implementing real-time pricing of electricity is a lack of consumer response. Solutions to overcome the hurdle include energy management systems that automatically optimize household appliance usage, such as plug-in hybrid electric vehicle charging (and discharging with vehicle-to-grid), via two-way communication with the grid. Real-time pricing, combined with household automation devices, has the potential to accommodate an increasing penetration of plug-in hybrid electric vehicles. In addition, the intelligent energy controller on the consumer side can help increase the utilization rate of intermittent renewable resources, as demand can be managed to match the output profile of renewables, thus making intermittent resources such as wind and solar more economically competitive in the long run. One of the main goals of this dissertation is to present how real-time retail pricing, aided by control automation devices, can be integrated into the wholesale electricity market under various uncertainties through approximate dynamic programming. What distinguishes this study from existing work in the literature is that wholesale electricity prices are endogenously determined as we solve a system operator's economic dispatch problem on an hourly basis over the entire optimization horizon. This modeling and algorithm framework allows a feedback loop between electricity prices and electricity consumption to be fully captured. While we seek a near-optimal solution using approximate dynamic programming, deterministic linear programming benchmarks are used to demonstrate the quality of our solutions. The other goal of the dissertation is to use this framework to provide numerical evidence for the debate on whether real-time pricing is superior to the current flat-rate structure in terms of both economic and environmental impacts.
For this purpose, the modeling and algorithm framework is tested on a large-scale test case with hundreds of power plants based on data available for California, making our findings useful for policy makers, system operators, and utility companies seeking a concrete understanding of the scale of the impact of real-time pricing.
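The hourly dispatch step described above can be illustrated with a deliberately simplified sketch: a single-period economic dispatch with only capacity limits and a demand-balance constraint, for which the linear-programming optimum reduces to merit-order (cheapest-first) loading. The generator data are made up; the dissertation's actual model includes many more constraints and uncertainties.

```python
def economic_dispatch(generators, demand):
    """Single-period economic dispatch with capacity limits only.

    generators: list of (name, marginal_cost, capacity_mw).
    With no network or ramping constraints, the LP optimum is the
    merit order: dispatch the cheapest units first until demand is met.
    Returns (dispatch dict in MW, total cost).
    """
    dispatch, cost, remaining = {}, 0.0, demand
    for name, mc, cap in sorted(generators, key=lambda g: g[1]):
        output = min(cap, remaining)
        dispatch[name] = output
        cost += mc * output
        remaining -= output
        if remaining <= 0:
            break
    if remaining > 1e-9:
        raise ValueError("demand exceeds total capacity")
    return dispatch, cost

# Illustrative generator fleet (invented data): (name, $/MWh, MW)
fleet = [("wind", 0.0, 50), ("coal", 30.0, 200), ("gas", 60.0, 150)]
plan, total = economic_dispatch(fleet, demand=180)
```

The marginal cost of the last unit dispatched (here, coal) would set the wholesale price in the feedback loop the dissertation describes.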
Gradient design for liquid chromatography using multi-scale optimization.
López-Ureña, S; Torres-Lapasió, J R; Donat, R; García-Alvarez-Coque, M C
2018-01-26
In reversed-phase liquid chromatography, the usual solution to the "general elution problem" is the application of gradient elution with programmed changes of organic solvent (or other properties). Correct quantification of chromatographic peaks in liquid chromatography requires well-resolved signals in a proper analysis time. When the complexity of the sample is high, the gradient program should be accommodated to the local resolution needs of each analyte. This makes the optimization of such situations rather troublesome, since enhancing the resolution for a given analyte may imply a collateral worsening of the resolution of other analytes. The aim of this work is to design multi-linear gradients that maximize the resolution while fulfilling some restrictions: all peaks should be eluted before a given maximal time, the gradient should be flat or increasing, and sudden changes close to eluting peaks are penalized. Consequently, an equilibrated baseline resolution for all compounds is sought. This goal is achieved by splitting the optimization problem into a multi-scale framework. In each scale κ, an optimization problem is solved with N_κ ≈ 2^κ variables that are used to build the gradients. The N_κ variables define cubic splines written in terms of a B-spline basis. This allows expressing gradients as polygonals of M points approximating the splines. The cubic splines are built using subdivision schemes, a technique for fast generation of smooth curves, compatible with the multi-scale framework. Owing to the nature of the problem and the presence of multiple local maxima, the algorithm used in the optimization problem of each scale κ should be "global", such as the pattern-search algorithm. The multi-scale optimization approach is successfully applied to find the best multi-linear gradient for resolving a mixture of amino acid derivatives. Copyright © 2017 Elsevier B.V. All rights reserved.
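A generic pattern-search (compass search) optimizer of the kind referred to above can be sketched as follows; the toy objective stands in for the chromatographic resolution criterion and is not the authors' actual function:

```python
def pattern_search(f, x0, step=1.0, tol=1e-6, max_iter=10000):
    """Maximize f by compass (pattern) search: probe +/- step along each
    coordinate; move to any improving point, otherwise halve the step."""
    x = list(x0)
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x[:]
                trial[i] += delta
                ft = f(trial)
                if ft > fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step /= 2.0          # shrink the pattern when no probe improves
            if step < tol:
                break
    return x, fx

# Toy "resolution" surface with a single peak at (3, -1)
peak = lambda v: -((v[0] - 3.0) ** 2 + (v[1] + 1.0) ** 2)
best, val = pattern_search(peak, [0.0, 0.0])
```

Pattern search is derivative-free, which is why it suits objectives like resolution that are evaluated through simulation rather than given in closed form.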
McCallum, Meg; Carver, Janet; Dupere, David; Ganong, Sharon; Henderson, J David; McKim, Ann; McNeil-Campbell, Lisa; Richardson, Holly; Simpson, Judy; Tschupruk, Cheryl; Jewers, Heather
2018-05-15
In 2014, Nova Scotia released a provincial palliative care strategy and implementation working groups were established. The Capacity Building and Practice Change Working Group, comprised of health professionals, public advisors, academics, educators, and a volunteer supervisor, was asked to select palliative care education programs for health professionals and volunteers. The first step in achieving this mandate was to establish competencies for health professionals and volunteers caring for patients with life-limiting illness and their families and those specializing in palliative care. In 2015, a literature search for palliative care competencies and an environmental scan of related education programs were conducted. The Irish Palliative Care Competence Framework serves as the foundation of the Nova Scotia Palliative Care Competency Framework. Additional disciplines and competencies were added and any competencies not specific to palliative care were removed. To highlight interprofessional practice, the framework illustrates shared and discipline-specific competencies. Stakeholders were asked to validate the framework and map the competencies to educational programs. Numerous rounds of review refined the framework. The framework includes competencies for 22 disciplines, 9 nursing specialties, and 4 physician specialties. The framework, released in 2017, and the selection and implementation of education programs were a significant undertaking. The framework will support the implementation of the Nova Scotia Integrated Palliative Care Strategy, enhance the interprofessional nature of palliative care, and guide the further implementation of education programs. Other jurisdictions have expressed considerable interest in the framework.
NASA Technical Reports Server (NTRS)
Wright, Jeffrey; Thakur, Siddharth
2006-01-01
Loci-STREAM is an evolving computational fluid dynamics (CFD) software tool for simulating possibly chemically reacting, possibly unsteady flows in diverse settings, including rocket engines, turbomachines, oil refineries, etc. Loci-STREAM implements a pressure-based flow-solving algorithm that utilizes unstructured grids. (The benefit of low memory usage by pressure-based algorithms is well recognized by experts in the field.) The algorithm is robust for flows at all speeds from zero to hypersonic. The flexibility of arbitrary polyhedral grids enables accurate, efficient simulation of flows in complex geometries, including those of plume-impingement problems. The present version - Loci-STREAM version 0.9 - includes an interface with the Portable, Extensible Toolkit for Scientific Computation (PETSc) library for access to enhanced linear-equation-solving programs therein that accelerate convergence toward a solution. The name "Loci" reflects the creation of this software within the Loci computational framework, which was developed at Mississippi State University for the primary purpose of simplifying the writing of complex multidisciplinary application programs to run in distributed-memory computing environments including clusters of personal computers. Loci has been designed to relieve application programmers of the details of programming for distributed-memory computers.
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for certificate of residential carpentry and residential carpentry technology programs. Presented in the introductory section are program descriptions and suggested course sequences for both programs. Section I lists…
Linear Programming across the Curriculum
ERIC Educational Resources Information Center
Yoder, S. Elizabeth; Kurz, M. Elizabeth
2015-01-01
Linear programming (LP) is taught in different departments across college campuses with engineering and management curricula. Modeling an LP problem is taught in every linear programming class. As faculty teaching in Engineering and Management departments, the depth to which teachers should expect students to master this particular type of…
Fundamental solution of the problem of linear programming and method of its determination
NASA Technical Reports Server (NTRS)
Petrunin, S. V.
1978-01-01
The idea of a fundamental solution to a problem in linear programming is introduced. A method of determining the fundamental solution and of applying this method to the solution of a problem in linear programming is proposed. Numerical examples are cited.
A Sawmill Manager Adapts To Change With Linear Programming
George F. Dutrow; James E. Granskog
1973-01-01
Linear programming provides guidelines for increasing sawmill capacity and flexibility and for determining stumpage-purchasing strategy. The operator of a medium-sized sawmill implemented improvements suggested by linear programming analysis; results indicate a 45 percent increase in revenue and a 36 percent hike in volume processed.
I-81 ITS program evaluation framework
DOT National Transportation Integrated Search
2003-07-01
This document presents the evaluation framework for the I-81 ITS Model Safety Corridor Program. The objectives of the framework are threefold: 1) serve as input into the development of infrastructure in the I-81 Corridor to generate baseline data for...
ERIC Educational Resources Information Center
Lloyd-Strovas, Jenny D.; Arsuffi, Thomas L.
2016-01-01
We examined the diversity of environmental education (EE) in Texas, USA, by developing a framework to assess EE organizations and programs at a large scale: the Environmental Education Database of Organizations and Programs (EEDOP). This framework consisted of the following characteristics: organization/visitor demographics, pedagogy/curriculum,…
Decision Aids Using Heterogeneous Intelligence Analysis
2010-08-20
developing a Geocultural service, a software framework and inferencing engine for the Transparent Urban Structures program. The scope of the effort has evolved as the program has matured and now includes multiple data sources, as well as interfaces out to the ONR architectural framework.
Reyes, E Michael; Sharma, Anjali; Thomas, Kate K; Kuehn, Chuck; Morales, José Rafael
2014-09-17
Little information exists on the technical assistance needs of local indigenous organizations charged with managing HIV care and treatment programs funded by the US President's Emergency Plan for AIDS Relief (PEPFAR). This paper describes the methods used to adapt the Primary Care Assessment Tool (PCAT) framework, which has successfully strengthened HIV primary care services in the US, into one that could strengthen the capacity of local partners to deliver priority health programs in resource-constrained settings by identifying their specific technical assistance needs. Qualitative methods and inductive reasoning approaches were used to conceptualize and adapt the new Clinical Assessment for Systems Strengthening (ClASS) framework. Stakeholder interviews, comparisons of existing assessment tools, and a pilot test helped determine the overall ClASS framework for use in low-resource settings. The framework was further refined one year post-ClASS implementation. Stakeholder interviews, assessment of existing tools, a pilot process, and the one-year post-implementation assessment informed the adaptation of the ClASS framework for assessing and strengthening technical and managerial capacities of health programs at three levels: international partner, local indigenous partner, and local partner treatment facility. The PCAT focus on organizational strengths and systems strengthening was retained and implemented in the ClASS framework and approach. A modular format was chosen to allow the use of administrative, fiscal, and clinical modules in any combination and to insert new modules as needed by programs. The pilot led to refined pre-visit planning, informed review team composition, increased visit duration, and restructured modules. A web-based toolkit was developed to capture three years of experiential learning; this kit can also be used for independent implementation of the ClASS framework.
A systematic adaptation process has produced a qualitative framework that can inform implementation strategies in support of country led HIV care and treatment programs. The framework, as a well-received iterative process focused on technical assistance, may have broader utility in other global programs.
Unified Program Design: Organizing Existing Programming Models, Delivery Options, and Curriculum
ERIC Educational Resources Information Center
Rubenstein, Lisa DaVia; Ridgley, Lisa M.
2017-01-01
A persistent problem in the field of gifted education has been the lack of categorization and delineation of gifted programming options. To address this issue, we propose Unified Program Design as a structural framework for gifted program models. This framework defines gifted programs as the combination of delivery methods and curriculum models.…
Timetabling an Academic Department with Linear Programming.
ERIC Educational Resources Information Center
Bezeau, Lawrence M.
This paper describes an approach to faculty timetabling and course scheduling that uses computerized linear programming. After reviewing the literature on linear programming, the paper discusses the process whereby a timetable was created for a department at the University of New Brunswick. Faculty were surveyed with respect to course offerings…
ERIC Educational Resources Information Center
Schmitt, M. A.; And Others
1994-01-01
Compares traditional manure application planning techniques calculated to meet agronomic nutrient needs on a field-by-field basis with plans developed using computer-assisted linear programming optimization methods. Linear programming provided the most economical and environmentally sound manure application strategy. (Contains 15 references.) (MDH)
Chen, Cong; Zhu, Ying; Zeng, Xueting; Huang, Guohe; Li, Yongping
2018-07-15
The tension between increasing carbon-mitigation pressure and rising electricity demand has been aggravated significantly. A heavy emphasis is placed on analyzing the carbon mitigation potential of electric energy systems via tradable green certificates (TGC). This study proposes a tradable green certificate (TGC)-fractional fuzzy stochastic robust optimization (FFSRO) model, integrating fuzzy possibilistic, two-stage stochastic, and stochastic robust programming techniques into a linear fractional programming framework. The framework can address uncertainties expressed as stochastic and fuzzy sets, and effectively deal with multi-objective tradeoffs between the economy and the environment. The proposed model is applied to the major economic center of China, the Beijing-Tianjin-Hebei region. The results indicate that a TGC mechanism is a cost-effective pathway to cope with carbon reduction and support the sustainable development of electric energy systems. In detail, it can: (i) effectively promote renewable power development and reduce fossil fuel use; (ii) lead to higher CO2 mitigation potential than a non-TGC mechanism; and (iii) greatly alleviate financial pressure on the government to provide renewable energy subsidies. The TGC-FFSRO model can provide a scientific basis for making related management decisions for electric energy systems. Copyright © 2017 Elsevier B.V. All rights reserved.
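For the deterministic core of such a model, a linear fractional program can be reduced to an ordinary linear program via the classical Charnes-Cooper transformation; the notation below is generic and not the paper's own:

```latex
\max_{x}\; \frac{c^{\top}x + \alpha}{d^{\top}x + \beta}
\quad \text{s.t.} \quad Ax \le b,\; x \ge 0,
\qquad d^{\top}x + \beta > 0
```

Substituting t = 1/(d^T x + β) and y = t x yields an equivalent linear program:

```latex
\max_{y,\,t}\; c^{\top}y + \alpha t
\quad \text{s.t.} \quad Ay \le b\,t,\; d^{\top}y + \beta t = 1,\; y \ge 0,\; t > 0
```

Recovering x = y/t from the LP optimum gives the optimum of the original fractional objective.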
Applications of Goal Programming to Education.
ERIC Educational Resources Information Center
Van Dusseldorp, Ralph A.; And Others
This paper discusses goal programming, a computer-based operations research technique that is basically a modification and extension of linear programming. The authors first discuss the similarities and differences between goal programming and linear programming, then describe the limitations of goal programming and its possible applications for…
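Weighted goal programming, as an extension of linear programming, can be illustrated on a one-variable toy problem with two conflicting goals; here a grid search stands in for a real LP solve, purely to show how deviational variables trade off:

```python
def weighted_goal_cost(x, goals):
    """goals: list of (target, sense, weight), sense '>=' or '<='.
    Penalize only the unwanted deviation from each target
    (d- for underachieving a '>=' goal, d+ for overshooting a '<=' goal)."""
    cost = 0.0
    for target, sense, w in goals:
        if sense == ">=":
            cost += w * max(0.0, target - x)   # underachievement d-
        else:
            cost += w * max(0.0, x - target)   # overachievement d+
    return cost

# Two conflicting goals (invented): profit wants x >= 6, labor wants x <= 4.
goals = [(6.0, ">=", 2.0), (4.0, "<=", 1.0)]
# Grid search over [0, 10] stands in for the LP solve (illustration only).
candidates = [i / 100.0 for i in range(0, 1001)]
best = min(candidates, key=lambda x: weighted_goal_cost(x, goals))
```

Because the profit goal carries twice the weight, the compromise lands at x = 6: the profit goal is met exactly while the labor goal is deliberately violated.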
Notes from the field: the economic value chain in disease management organizations.
Fetterolf, Donald
2006-12-01
The disease management (DM) "value chain" is composed of a linear series of steps that include operational milestones in the development of knowledge, each stage evolving from the preceding one. As an adaptation of Michael Porter's "value chain" model, the process flow in DM moves along the following path: (1) data/information technology, (2) information generation, (3) analysis, (4) assessment/recommendations, (5) actionable customer plan, and (6) program assessment/reassessment. Each of these stages is managed as a major line of product operations within a DM company or health plan. Metrics around each of the key production variables create benchmark milestones, ongoing management insight into program effectiveness, and potential drivers for activity-based cost accounting pricing models. The value chain process must remain robust from early entry of data and information into the system, through the final presentation and recommendations for our clients if the program is to be effective. For individuals involved in the evaluation or review of DM programs, this framework is an excellent method to visualize the key components and sequence in the process. The value chain model is an excellent way to establish the value of a formal DM program and to create a consultancy relationship with a client involved in purchasing these complex services.
Framework for Development of Object-Oriented Software
NASA Technical Reports Server (NTRS)
Perez-Poveda, Gus; Ciavarella, Tony; Nieten, Dan
2004-01-01
The Real-Time Control (RTC) Application Framework is a high-level software framework written in C++ that supports the rapid design and implementation of object-oriented application programs. This framework provides built-in functionality that solves common software development problems within distributed client-server, multi-threaded, and embedded programming environments. When using the RTC Framework to develop software for a specific domain, designers and implementers can focus entirely on the details of the domain-specific software rather than on creating custom solutions, utilities, and frameworks for the complexities of the programming environment. The RTC Framework was originally developed as part of a Space Shuttle Launch Processing System (LPS) replacement project called Checkout and Launch Control System (CLCS). As a result of the framework's development, CLCS software development time was reduced by 66 percent. The framework is generic enough for developing applications outside of the launch-processing system domain. Other applicable high-level domains include command and control systems and simulation/training systems.
State-space forecasting of Schistosoma haematobium time-series in Niono, Mali.
Medina, Daniel C; Findley, Sally E; Doumbia, Seydou
2008-08-13
Much of the developing world, particularly sub-Saharan Africa, exhibits high levels of morbidity and mortality associated with infectious diseases. The incidence of Schistosoma sp.-which are neglected tropical diseases exposing and infecting more than 500 and 200 million individuals in 77 countries, respectively-is rising because of 1) numerous irrigation and hydro-electric projects, 2) steady shifts from nomadic to sedentary existence, and 3) ineffective control programs. Notwithstanding the colossal scope of these parasitic infections, less than 0.5% of Schistosoma sp. investigations have attempted to predict their spatial and/or temporal distributions. Undoubtedly, public health programs in developing countries could benefit from parsimonious forecasting and early warning systems to enhance management of these parasitic diseases. In this longitudinal retrospective (01/1996-06/2004) investigation, the Schistosoma haematobium time-series for the district of Niono, Mali, was fitted with general-purpose exponential smoothing methods to generate contemporaneous on-line forecasts. These methods, which are encapsulated within a state-space framework, accommodate seasonal and inter-annual time-series fluctuations. Mean absolute percentage error values were circa 25% for 1- to 5-month horizon forecasts. The exponential smoothing state-space framework employed herein produced reasonably accurate forecasts for this time-series, which reflects the incidence of S. haematobium-induced terminal hematuria. It obliquely captured prior non-linear interactions between disease dynamics and exogenous covariates (e.g., climate, irrigation, and public health interventions), thus obviating the need for more complex forecasting methods in the district of Niono, Mali. Therefore, this framework could assist with managing and assessing S. haematobium transmission and intervention impact, respectively, in this district and potentially elsewhere in the Sahel.
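The simplest member of the exponential smoothing family used above can be sketched in a few lines, together with the MAPE metric the study reports; the case counts below are invented for illustration, and the study's actual models also include seasonal and trend components:

```python
def ses_forecast(series, alpha):
    """Simple exponential smoothing: level l_t = alpha*y_t + (1-alpha)*l_{t-1}.
    Returns one-step-ahead forecasts; the last entry predicts the next,
    unseen period."""
    level = series[0]
    forecasts = []
    for y in series[1:]:
        forecasts.append(level)          # forecast of y before observing it
        level = alpha * y + (1 - alpha) * level
    forecasts.append(level)              # forecast for the next period
    return forecasts

def mape(actual, predicted):
    """Mean absolute percentage error, the accuracy metric quoted (~25%)."""
    return 100.0 * sum(abs((a - p) / a)
                       for a, p in zip(actual, predicted)) / len(actual)

# Illustrative monthly case counts (made-up data)
cases = [120, 135, 150, 140, 125, 118]
fc = ses_forecast(cases, alpha=0.5)
err = mape(cases[1:], fc[:-1])
```

Writing the recursion in state-space form is what lets such models produce forecast intervals as well as point forecasts.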
Correlation and simple linear regression.
Eberly, Lynn E
2007-01-01
This chapter highlights important steps in using correlation and simple linear regression to address scientific questions about the association of two continuous variables with each other. These steps include estimation and inference, assessing model fit, the connection between regression and ANOVA, and study design. Examples in microbiology are used throughout. This chapter provides a framework that is helpful in understanding more complex statistical techniques, such as multiple linear regression, linear mixed effects models, logistic regression, and proportional hazards regression.
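The estimation step for simple linear regression can be written out directly from the least-squares formulas; the data here are a contrived exact line, not an example from the chapter:

```python
def simple_linear_regression(x, y):
    """Least-squares fit y ≈ a + b*x, plus the Pearson correlation r."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                     # slope
    a = my - b * mx                   # intercept
    r = sxy / (sxx * syy) ** 0.5      # correlation coefficient
    return a, b, r

# Exact line y = 1 + 2x should give intercept 1, slope 2, r = 1
a, b, r = simple_linear_regression([0, 1, 2, 3], [1, 3, 5, 7])
```

Inference (standard errors, confidence intervals) and model-fit checks then build on these same sums of squares.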
NASA Technical Reports Server (NTRS)
Klumpp, A. R.; Lawson, C. L.
1988-01-01
Routines provided for common scalar, vector, matrix, and quaternion operations. Computer program extends Ada programming language to include linear-algebra capabilities similar to those of HAL/S programming language. Designed for such avionics applications as software for Space Station.
A Framework for Assessing Developmental Education Programs
ERIC Educational Resources Information Center
Goldwasser, Molly; Martin, Kimberly; Harris, Eugenia
2017-01-01
This paper presents a framework for educators, administrators, and researchers to assess distinct facets of developmental education programs. The researchers review the literature on best practices in developmental education with regards to program cost, program structure, and student placement procedures. This paper also identifies seven model…
Industrial Education. Vocational Education Program Courses Standards.
ERIC Educational Resources Information Center
Florida State Dept. of Education, Tallahassee. Div. of Vocational, Adult, and Community Education.
This document contains vocational education program courses standards (curriculum frameworks and student performance standards) for exploratory courses, practical arts courses, and job preparatory programs offered at the secondary or postsecondary level in Florida. Each program courses standard is composed of two parts: a curriculum framework and…
A log-linear model approach to estimation of population size using the line-transect sampling method
Anderson, D.R.; Burnham, K.P.; Crain, B.R.
1978-01-01
The technique of estimating wildlife population size and density using the belt or line-transect sampling method has been used in many past projects, such as the estimation of density of waterfowl nestling sites in marshes, and is being used currently in such areas as the assessment of Pacific porpoise stocks in regions of tuna fishing activity. A mathematical framework for line-transect methodology has only emerged in the last 5 yr. In the present article, we extend this mathematical framework to a line-transect estimator based upon a log-linear model approach.
NASA Astrophysics Data System (ADS)
Zabavnikova, T. A.; Kadashevich, Yu. I.; Pomytkin, S. P.
2018-05-01
A geometric non-linear endochronic theory of inelasticity in tensor parametric form is considered. In the framework of this theory, creep strains are modelled. The effects of various schemes of applying stresses and of changing material properties on the development of creep strains are studied. The constitutive equations of the model are represented by non-linear systems of ordinary differential equations, which are solved in the MATLAB environment by an implicit difference method. The presented results demonstrate good qualitative agreement between theoretical data and experimental observations, including the description of tertiary creep and pre-fracture of materials.
NASA Technical Reports Server (NTRS)
Tuey, R. C.
1972-01-01
Computer solutions of linear programming problems are outlined. Information covers vector spaces, convex sets, and matrix algebra elements for solving simultaneous linear equations. Dual problems, reduced cost analysis, ranges, and error analysis are illustrated.
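A two-variable linear program of the kind such a tutorial covers can be solved by enumerating vertices, since a finite LP optimum is attained at a vertex of the feasible region; the example problem is a textbook-style illustration, not taken from the report:

```python
from itertools import combinations

def solve_lp_2d(c, A, b):
    """Maximize c·x s.t. A x <= b, x >= 0, for a 2-variable LP, by
    enumerating pairwise constraint intersections (candidate vertices)."""
    # Append x >= 0 as constraints -x_i <= 0
    rows = [list(r) for r in A] + [[-1, 0], [0, -1]]
    rhs = list(b) + [0, 0]
    best, best_x = None, None
    for i, j in combinations(range(len(rows)), 2):
        a11, a12 = rows[i]
        a21, a22 = rows[j]
        det = a11 * a22 - a12 * a21
        if abs(det) < 1e-12:
            continue                       # parallel constraints, no vertex
        x = (rhs[i] * a22 - a12 * rhs[j]) / det   # Cramer's rule
        y = (a11 * rhs[j] - rhs[i] * a21) / det
        # Keep only vertices satisfying every constraint
        if all(r[0] * x + r[1] * y <= rv + 1e-9 for r, rv in zip(rows, rhs)):
            val = c[0] * x + c[1] * y
            if best is None or val > best:
                best, best_x = val, (x, y)
    return best_x, best

# max 3x + 5y  s.t.  x <= 4,  2y <= 12,  3x + 2y <= 18
opt_x, opt_val = solve_lp_2d([3, 5], [[1, 0], [0, 2], [3, 2]], [4, 12, 18])
```

For realistic problem sizes the simplex method walks between vertices instead of enumerating them, and its final tableau also yields the dual prices and reduced costs the report discusses.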
A Mixed Integer Linear Program for Solving a Multiple Route Taxi Scheduling Problem
NASA Technical Reports Server (NTRS)
Montoya, Justin Vincent; Wood, Zachary Paul; Rathinam, Sivakumar; Malik, Waqar Ahmad
2010-01-01
Aircraft movements on taxiways at busy airports often create bottlenecks. This paper introduces a mixed integer linear program to solve a Multiple Route Aircraft Taxi Scheduling Problem. The outputs of the model are optimal taxi schedules, which include routing decisions for taxiing aircraft. The model extends an existing single-route formulation to include routing decisions. A comparison framework efficiently evaluates the multi-route formulation against the single-route formulation. The multi-route model is exercised for east-side airport surface traffic at Dallas/Fort Worth International Airport to determine whether any arrival taxi time savings can be achieved by allowing arrivals two taxi routes: a route that crosses an active departure runway and a perimeter route that avoids the crossing. Results indicate that the multi-route formulation yields reduced arrival taxi times over the single-route formulation only when a perimeter taxiway is used. In conditions where the departure aircraft are given an optimal and fixed takeoff sequence, cumulative arrival taxi time savings in the multi-route formulation can be as much as 3.6 hours greater than in the single-route formulation. If the departure sequence is not optimal, the multi-route formulation yields smaller taxi time savings over the single-route formulation, but the average arrival taxi time is still significantly decreased.
A depth-first search algorithm to compute elementary flux modes by linear programming
2014-01-01
Background The decomposition of complex metabolic networks into elementary flux modes (EFMs) provides a useful framework for exploring reaction interactions systematically. Generating a complete set of EFMs for large-scale models, however, is nearly impossible. Even for moderately sized models (<400 reactions), existing approaches based on the Double Description method must iterate through a large number of combinatorial candidates, thus imposing an immense processor and memory demand. Results Based on an alternative elementarity test, we developed a depth-first search algorithm using linear programming (LP) to enumerate EFMs in an exhaustive fashion. Constraints can be introduced to directly generate a subset of EFMs satisfying the set of constraints. The depth-first search algorithm has a constant memory overhead. Using flux constraints, a large LP problem can be massively divided and parallelized into independent sub-jobs for deployment into computing clusters. Since the sub-jobs do not overlap, the approach scales to utilize all available computing nodes with minimal coordination overhead or memory limitations. Conclusions The speed of the algorithm was comparable to efmtool, a mainstream Double Description method, when enumerating all EFMs; the attrition power gained from performing flux feasibility tests offsets the increased computational demand of running an LP solver. Unlike the Double Description method, the algorithm enables accelerated enumeration of all EFMs satisfying a set of constraints. PMID:25074068
Joint tumor segmentation and dense deformable registration of brain MR images.
Parisot, Sarah; Duffau, Hugues; Chemouny, Stéphane; Paragios, Nikos
2012-01-01
In this paper we propose a novel graph-based concurrent registration and segmentation framework. Registration is modeled with a pairwise graphical model formulation that is modular with respect to the data and regularization term. Segmentation is addressed by adopting a similar graphical model, using image-based classification techniques while producing a smooth solution. The two problems are coupled via a relaxation of the registration criterion in the presence of tumors, as well as a segmentation-through-registration term aiming at the separation between healthy and diseased tissues. Efficient linear programming is used to solve both problems simultaneously. State-of-the-art results demonstrate the potential of our method on a large and challenging low-grade glioma data set.
Wasserman, Deborah L
2010-05-01
This paper offers a framework for using a systems orientation and "foundational theory" to enhance theory-driven evaluations and logic models. The framework guides the process of identifying and explaining operative relationships and perspectives within human service program systems. Self-Determination Theory exemplifies how a foundational theory can be used to support the framework in a wide range of program evaluations. Two examples illustrate how applications of the framework have improved the evaluators' abilities to observe and explain program effect. In both exemplars improvements involved addressing and organizing into a single logic model heretofore seemingly disparate evaluation issues regarding valuing (by whose values); the role of organizational and program context; and evaluation anxiety and utilization. Copyright 2009 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Rockwell, S. Kay; Albrecht, Julie A.; Nugent, Gwen C.; Kunz, Gina M.
2012-01-01
Targeting Outcomes of Programs (TOP) is a seven-step hierarchical programming model in which the program development and performance sides are mirror images of each other. It served as a framework to identify a simple method for targeting photographic events in nonformal education programs, indicating why, when, and how photographs would be useful…
Droplet Deformation in an Extensional Flow: The Role of Surfactant Physical Chemistry
NASA Technical Reports Server (NTRS)
Stebe, Kathleen J.
1996-01-01
Surfactant-induced Marangoni effects strongly alter the stresses exerted along fluid particle interfaces. In low gravity processes, these stresses can dictate the system behavior. The dependence of Marangoni effects on surfactant physical chemistry is not understood, severely impacting our ability to predict and control fluid particle flows. A droplet in an extensional flow allows the controlled study of stretching and deforming interfaces. The deformations of the drop allow both Marangoni stresses, which resist tangential shear, and Marangoni elasticities, which resist surface dilatation, to develop. This flow presents an ideal model system for studying these effects. Prior surfactant-related work in this flow considered a linear dependence of the surface tension on the surface concentration, valid only at dilute surface concentrations, or a non-linear framework at concentrations sufficiently dilute that the linear approximation was valid. The linear framework becomes inadequate for several reasons. The finite dimensions of surfactant molecules must be taken into account with a model that includes surface saturation. Nonideal interactions between adsorbed surfactant molecules alter the partitioning of surfactant between the bulk and the interface, the dynamics of surfactant adsorptive/desorptive exchange, and the sensitivity of the surface tension to adsorbed surfactant. For example, cohesion between hydrocarbon chains favors strong adsorption. Cohesion also slows the rate of desorption from interfaces, and decreases the sensitivity of the surface tension to adsorbed surfactant. Strong cohesive interactions result in first order surface phase changes with a plateau in the surface tension vs surface concentration. Within this surface concentration range, the surface tension is decoupled from surface concentration gradients.
We are engaged in the study of the role of surfactant physical chemistry in determining the Marangoni stresses on a drop in an extensional flow in a numerical and experimental program. Using surfactants whose dynamics and equilibrium behavior have been characterized in our laboratory, drop deformation will be studied in ground-based experiment. In an accompanying numerical study, predictive drop deformations will be determined based on the isotherm and equation of state determined in our laboratory. This work will improve our abilities to predict and control all fluid particle flows.
Evolution of a multilevel framework for health program evaluation.
Masso, Malcolm; Quinsey, Karen; Fildes, Dave
2017-07-01
A well-conceived evaluation framework increases understanding of a program's goals and objectives, facilitates the identification of outcomes and can be used as a planning tool during program development. Herein we describe the origins and development of an evaluation framework that recognises that implementation is influenced by the setting in which it takes place, the individuals involved and the processes by which implementation is accomplished. The framework includes an evaluation hierarchy that focuses on outcomes for consumers, providers and the care delivery system, and is structured according to six domains: program delivery, impact, sustainability, capacity building, generalisability and dissemination. These components of the evaluation framework fit into a matrix structure, and cells within the matrix are supported by relevant evaluation tools. The development of the framework has been influenced by feedback from various stakeholders, existing knowledge of the evaluators and the literature on health promotion and implementation science. Over the years, the framework has matured and is generic enough to be useful in a wide variety of circumstances, yet specific enough to focus data collection, data analysis and the presentation of findings.
General Criteria for Evaluating Social Programs.
ERIC Educational Resources Information Center
Shipman, Stephanie
1989-01-01
A framework of general evaluation criteria for ensuring the comprehensiveness of program reviews and appropriate and fair comparison of children's programs is outlined. It has two components: (1) descriptive; and (2) evaluative. The framework was developed by researchers at the General Accounting Office for evaluation of federal programs. (TJH)
Health Occupations Education. Vocational Education Program Courses Standards.
ERIC Educational Resources Information Center
Florida State Dept. of Education, Tallahassee. Div. of Vocational, Adult, and Community Education.
This document contains vocational education program courses standards for exploratory courses, practical arts courses, and job preparatory programs offered at the secondary or postsecondary level. Each program standard is composed of two parts: a curriculum framework and student performance standards. The curriculum framework includes four major…
Object matching using a locally affine invariant and linear programming techniques.
Li, Hongsheng; Huang, Xiaolei; He, Lei
2013-02-01
In this paper, we introduce a new matching method based on a novel locally affine-invariant geometric constraint and linear programming techniques. To model and solve the matching problem in a linear programming formulation, all geometric constraints should be able to be exactly or approximately reformulated into a linear form. This is a major difficulty for this kind of matching algorithm. We propose a novel locally affine-invariant constraint which can be exactly linearized and requires far fewer auxiliary variables than other linear programming-based methods do. The key idea behind it is that each point in the template point set can be exactly represented by an affine combination of its neighboring points, whose weights can be solved easily by least squares. Errors of reconstructing each matched point using such weights are used to penalize the disagreement of geometric relationships between the template points and the matched points. The resulting overall objective function can be solved efficiently by linear programming techniques. Our experimental results on both rigid and nonrigid object matching show the effectiveness of the proposed algorithm.
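The core geometric idea, representing each template point as an affine combination of its neighbors, can be sketched with plain least squares. This is a generic illustration of the constraint (invented points, not the authors' code):

```python
import numpy as np

def affine_weights(point, neighbors):
    """Solve for weights w with sum(w) = 1 such that neighbors.T @ w ~= point."""
    # Stack the affine constraint (weights sum to 1) under the coordinate rows.
    A = np.vstack([neighbors.T, np.ones(len(neighbors))])
    b = np.append(point, 1.0)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return w

template_pt = np.array([1.0, 2.0])
template_nbrs = np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 4.0]])
w = affine_weights(template_pt, template_nbrs)

# Affine invariance: applying the same affine map to the point and its
# neighbours leaves the reconstruction error unchanged (here, still ~0).
M, t = np.array([[2.0, 1.0], [0.0, 3.0]]), np.array([5.0, -1.0])
mapped_pt = M @ template_pt + t
mapped_nbrs = template_nbrs @ M.T + t
err = np.linalg.norm(mapped_nbrs.T @ w - mapped_pt)
print(round(err, 6))  # reconstruction error stays (near) zero under affine maps
```

In the matching formulation, this reconstruction error for candidate matched points becomes the linear penalty term that the LP minimizes.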
NASA Astrophysics Data System (ADS)
O'Malley, D.; Le, E. B.; Vesselinov, V. V.
2015-12-01
We present a fast, scalable, and highly implementable stochastic inverse method for characterization of aquifer heterogeneity. The method utilizes recent advances in randomized matrix algebra and exploits the structure of the Quasi-Linear Geostatistical Approach (QLGA), without requiring a structured grid like Fast-Fourier Transform (FFT) methods. The QLGA framework is a more stable version of Gauss-Newton iterates for a large number of unknown model parameters, and provides unbiased estimates. The methods are matrix-free and do not require derivatives or adjoints, and are thus ideal for complex models and black-box implementation. We also incorporate randomized least-square solvers and data-reduction methods, which speed up computation and simulate missing data points. The new inverse methodology is coded in Julia and implemented in the MADS computational framework (http://mads.lanl.gov). Julia is an advanced high-level scientific programming language that allows for efficient memory management and utilization of high-performance computational resources. Inversion results based on a series of synthetic problems with steady-state and transient calibration data are presented.
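The randomized least-squares idea mentioned here can be sketched with a Gaussian sketch-and-solve step: compress a tall system with a random projection and solve the much smaller problem. This is a generic illustration of the technique, not the MADS/Julia code (all dimensions and data are invented):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, s = 2000, 50, 400          # tall m-by-n system, sketch size s << m

A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
b = A @ x_true + 0.01 * rng.standard_normal(m)

# Gaussian sketch: solve min ||S(Ax - b)|| over the s-by-n sketched system.
S = rng.standard_normal((s, m)) / np.sqrt(s)
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)

# Compare against the full least-squares solution.
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)
rel_err = np.linalg.norm(x_sketch - x_exact) / np.linalg.norm(x_exact)
print(rel_err < 0.05)  # sketched solution is close to the full solution
```

The speed-up comes from factorizing an s-by-n matrix instead of m-by-n, at the cost of a small, controllable loss of accuracy.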
Electric train energy consumption modeling
Wang, Jinghui; Rakha, Hesham A.
2017-05-01
For this paper we develop an electric train energy consumption modeling framework considering instantaneous regenerative braking efficiency in support of a rail simulation system. The model is calibrated with data from Portland, Oregon using an unconstrained non-linear optimization procedure, and validated using data from Chicago, Illinois by comparing model predictions against the National Transit Database (NTD) estimates. The results demonstrate that regenerative braking efficiency varies as an exponential function of the deceleration level, rather than an average constant as assumed in previous studies. The model predictions are demonstrated to be consistent with the NTD estimates, producing prediction errors of 1.87% and -2.31%. The paper demonstrates that energy recovery reduces the overall power consumption by 20% for the tested Chicago route. Furthermore, the paper demonstrates that the proposed modeling approach is able to capture energy consumption differences associated with train, route and operational parameters, and thus is applicable for project-level analysis. The model can be easily implemented in traffic simulation software, used in smartphone applications and eco-transit programs given its fast execution time and easy integration in complex frameworks.
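The key finding, that recovery efficiency follows an exponential curve in deceleration rather than a flat average, can be sketched as a simple functional form. The shape and all parameter values below are invented for illustration; the paper's calibrated function and coefficients are not reproduced here.

```python
import math

def regen_efficiency(decel, alpha=0.65, beta=0.35):
    """Hypothetical fraction of braking energy recovered at deceleration
    `decel` (m/s^2): an exponential saturation curve, not a constant."""
    return alpha * (1.0 - math.exp(-beta * decel))

# Unlike a constant-efficiency assumption, recovery varies with braking level.
for d in (0.5, 1.0, 2.0):
    print(f"decel={d} m/s^2 -> efficiency={regen_efficiency(d):.3f}")
```

In a simulation loop, such a function would be evaluated at each time step from the instantaneous deceleration, then applied to the braking power to estimate recovered energy.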
Systems identification and the adaptive management of waterfowl in the United States
Williams, B.K.; Nichols, J.D.
2001-01-01
Waterfowl management in the United States is one of the nation's more visible conservation success stories. It is authorized and supported by appropriate legislative authorities, based on large-scale monitoring programs, and widely accepted by the public. The process is one of only a limited number of large-scale examples of effective collaboration between research and management, integrating scientific information with management in a coherent framework for regulatory decision-making. However, harvest management continues to face some serious technical problems, many of which focus on sequential identification of the resource system in a context of optimal decision-making. The objective of this paper is to provide a theoretical foundation of adaptive harvest management, the approach currently in use in the United States for regulatory decision-making. We lay out the legal and institutional framework for adaptive harvest management and provide a formal description of regulatory decision-making in terms of adaptive optimization. We discuss some technical and institutional challenges in applying adaptive harvest management and focus specifically on methods of estimating resource states for linear resource systems.
Bartolini, Fabio; Gallerani, Vittorio; Raggi, Meri; Viaggi, Davide
2007-10-01
The performance of different policy design strategies is a key issue in evaluating programmes for water quality improvement under the Water Framework Directive (60/2000). This issue is emphasised by information asymmetries between regulator and agents. Using an economic model under asymmetric information, the aim of this paper is to compare the cost-effectiveness of selected methods of designing payments to farmers in order to reduce nitrogen pollution in agriculture. A principal-agent model is used, based on profit functions generated through farm-level linear programming. This allows a comparison of flat rate payments and a menu of contracts developed through mechanism design. The model is tested in an area of Emilia Romagna (Italy) in two policy contexts: Agenda 2000 and the 2003 Common Agricultural Policy (CAP) reform. The results show that different policy design options lead to differences in policy costs as great as 200-400%, with clear advantages for the menu of contracts. However, different policy scenarios may strongly affect such differences. Hence, the paper calls for greater attention to the interplay between CAP scenarios and water quality measures.
Non-linear analytic and coanalytic problems (L_p-theory, Clifford analysis, examples)
NASA Astrophysics Data System (ADS)
Dubinskii, Yu A.; Osipenko, A. S.
2000-02-01
Two kinds of new mathematical model of variational type are put forward: non-linear analytic and coanalytic problems. The formulation of these non-linear boundary-value problems is based on a decomposition of the complete scale of Sobolev spaces into the "orthogonal" sum of analytic and coanalytic subspaces. A similar decomposition is considered in the framework of Clifford analysis. Explicit examples are presented.
Sridharan, Sanjeev; Jones, Bobby; Caudill, Barry; Nakaima, April
2016-10-01
This paper describes a framework that can help refine program theory through data explorations and stakeholder dialogue. The framework incorporates the following steps: a recognition that program implementation might need to be multi-phased for a number of interventions, the need to take stock of program theory, the application of pattern recognition methods to help identify heterogeneous program mechanisms, and stakeholder dialogue to refine the program. As part of the data exploration, a method known as developmental trajectories is implemented to learn about heterogeneous trajectories of outcomes in longitudinal evaluations. This method identifies trajectory clusters and also can estimate different treatment impacts for the various groups. The framework is highlighted with data collected in an evaluation of an alcohol risk-reduction program delivered in a college fraternity setting. The framework discussed in the paper is informed by a realist focus on "what works for whom under what contexts." The utility of the framework in contributing to a dialogue on heterogeneous mechanism and subsequent implementation is described. The connection of the ideas in the paper to a 'learning through principled discovery' approach is also described. Copyright © 2016. Published by Elsevier Ltd.
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the dental assisting technology program. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies. Section II…
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the welding and cutting programs cluster. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies, and…
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the surgical technology program. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies for the program,…
Discovering Knowledge from Noisy Databases Using Genetic Programming.
ERIC Educational Resources Information Center
Wong, Man Leung; Leung, Kwong Sak; Cheng, Jack C. Y.
2000-01-01
Presents a framework that combines Genetic Programming and Inductive Logic Programming, two approaches in data mining, to induce knowledge from noisy databases. The framework is based on a formalism of logic grammars and is implemented as a data mining system called LOGENPRO (Logic Grammar-based Genetic Programming System). (Contains 34…
ERIC Educational Resources Information Center
General Accounting Office, Washington, DC. Program Evaluation and Methodology Div.
This general program evaluation framework provides a wide range of criteria that can be applied in the evaluation of diverse federal programs. The framework was developed from a literature search on program evaluation methods and their use, the experiences of the United States General Accounting Office (GAO), and consideration of the types of…
ERIC Educational Resources Information Center
Bentley, Kia J.
2013-01-01
This article presents a framework for evaluation in social work doctoral education and details 10 years of successes and challenges in one PhD program's use of the framework, including planning and implementing specific assessment activities around student learning outcomes and larger program goals. The article argues that a range of…
Kassahun, Aron; Braka, Fiona; Gallagher, Kathleen; Gebriel, Aregai Wolde; Nsubuga, Peter; M’pele-Kilebou, Pierre
2017-01-01
Introduction: The World Health Organization (WHO), Ethiopia country office, introduced an accountability framework into its Polio Eradication Program in 2014 with the aim of improving the program's performance. Our study aims to evaluate staff performance and key program indicators following the introduction of the accountability framework. Methods: The impact of the WHO accountability framework was reviewed after its first year of implementation, from June 2014 to June 2015. We analyzed selected program and staff performance indicators associated with acute flaccid paralysis (AFP) surveillance from a database available at WHO. Data on managerial actions taken were also reviewed. Performance of a total of 38 staff was evaluated during our review. Results: Our review of results for the first four quarters of implementation of the polio eradication accountability framework showed improvement both at the program and individual level when compared with the previous year. Managerial actions taken during the study period based on the results from the monitoring tool included eleven written acknowledgments, six discussions regarding performance improvement, six rotations of staff, four written first-warning letters and nine non-renewals of contracts. Conclusion: The introduction of the accountability framework resulted in improvement in staff performance and overall program indicators for AFP surveillance. PMID:28890753
Registration of 4D time-series of cardiac images with multichannel Diffeomorphic Demons.
Peyrat, Jean-Marc; Delingette, Hervé; Sermesant, Maxime; Pennec, Xavier; Xu, Chenyang; Ayache, Nicholas
2008-01-01
In this paper, we propose a generic framework for intersubject non-linear registration of 4D time-series images. In this framework, spatio-temporal registration is defined by mapping trajectories of physical points as opposed to spatial registration that solely aims at mapping homologous points. First, we determine the trajectories we want to register in each sequence using a motion tracking algorithm based on the Diffeomorphic Demons algorithm. Then, we perform simultaneously pairwise registrations of corresponding time-points with the constraint to map the same physical points over time. We show this trajectory registration can be formulated as a multichannel registration of 3D images. We solve it using the Diffeomorphic Demons algorithm extended to vector-valued 3D images. This framework is applied to the inter-subject non-linear registration of 4D cardiac CT sequences.
A trajectory generation framework for modeling spacecraft entry in MDAO
NASA Astrophysics Data System (ADS)
D'Souza, Sarah N.; Sarigul-Klijn, Nesrin
2016-04-01
In this paper a novel trajectory generation framework was developed that optimizes trajectory event conditions for use in a Generalized Entry Guidance algorithm. The framework was developed to be adaptable via the use of high fidelity equations of motion and drag based analytical bank profiles. Within this framework, a novel technique was implemented that resolved the sensitivity of the bank profile to atmospheric non-linearities. The framework's adaptability was established by running two different entry bank conditions. Each case yielded a reference trajectory and set of transition event conditions that are flight feasible and implementable in a Generalized Entry Guidance algorithm.
Schelbe, Lisa; Randolph, Karen A; Yelick, Anna; Cheatham, Leah P; Groton, Danielle B
2018-01-01
Increased attention to former foster youth pursuing post-secondary education has resulted in the creation of college campus based support programs to address their needs. However, limited empirical evidence and theoretical knowledge exist about these programs. This study seeks to describe the application of systems theory as a framework for examining a college campus based support program for former foster youth. In-depth semi-structured interviews were conducted with 32 program stakeholders including students, mentors, collaborative members, and independent living program staff. Using qualitative data analysis software, holistic coding techniques were employed to analyze interview transcripts. Then, applying principles of extended case method using systems theory, data were analyzed. Findings suggest systems theory serves as a framework for understanding the functioning of a college campus based support program. The theory's concepts help delineate program components and roles of stakeholders; outline boundaries between and interactions among stakeholders; and identify program strengths and weaknesses. Systems theory plays an important role in identifying intervention components and providing a structure through which to identify and understand program elements as a part of the planning process. This study highlights the utility of systems theory as a framework for program planning and evaluation.
Development of a Neural Network-Based Renewable Energy Forecasting Framework for Process Industries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Soobin; Ryu, Jun-Hyung; Hodge, Bri-Mathias
2016-06-25
This paper presents a neural network-based forecasting framework for photovoltaic power (PV) generation as a decision-supporting tool to employ renewable energies in the process industry. The applicability of the proposed framework is illustrated by comparing its performance against other methodologies such as linear and nonlinear time series modelling approaches. A case study of an actual PV power plant in South Korea is presented.
Remigio, Roberto Di; Bast, Radovan; Frediani, Luca; Saue, Trond
2015-05-28
We present a formulation of four-component relativistic self-consistent field (SCF) theory for a molecular solute described within the framework of the polarizable continuum model (PCM) for solvation. The linear response function for a four-component PCM-SCF state is also derived, as well as the explicit form of the additional contributions to the first-order response equations. The implementation of such a four-component PCM-SCF model, as carried out in a development version of the DIRAC program package, is documented. In particular, we present the newly developed application programming interface PCMSolver used in the actual implementation with DIRAC. To demonstrate the applicability of the approach, we present and analyze calculations of solvation effects on the geometries, electric dipole moments, and static electric dipole polarizabilities for the group 16 dihydrides H2X (X = O, S, Se, Te, Po).
Muleme, James; Kankya, Clovice; Ssempebwa, John C.; Mazeri, Stella; Muwonge, Adrian
2017-01-01
Knowledge, attitude, and practice (KAP) studies guide the implementation of public health interventions (PHIs), and they are important tools for political persuasion. The design and implementation of PHIs assumes a linear KAP relationship, i.e., an awareness campaign results in the desirable societal behavioral change. However, there is no robust framework for testing this relationship before and after PHIs. Here, we use qualitative and quantitative data on pesticide usage to test this linear relationship, identify associated context-specific factors as well as assemble a framework that could be used to guide and evaluate PHIs. We used data from a cross-sectional mixed methods study on pesticide usage. Quantitative data were collected using a structured questionnaire from 167 households representing 1,002 individuals. Qualitative data were collected from key informants and focus group discussions. Quantitative data analysis was done in R 3.2.0, and qualitative data were examined through thematic analysis. Our framework shows that a KAP linear relationship only existed for households with a low knowledge score, suggesting that an awareness campaign would only be effective for ~37% of the households. Context-specific socioeconomic factors explain why this relationship does not hold for households with high knowledge scores. These findings are essential for developing targeted cost-effective and sustainable interventions on pesticide usage and other PHIs with context specific modifications. PMID:29276703
NASA Astrophysics Data System (ADS)
Berry Bertram, Kathryn
2011-12-01
The Geophysical Institute (GI) Framework for Professional Development was designed to prepare culturally responsive teachers of science, technology, engineering, and math (STEM). Professional development programs based on the framework are created for rural Alaskan teachers who instruct diverse classrooms that include indigenous students. This dissertation was written in response to the question, "Under what circumstances is the GI Framework for Professional Development effective in preparing culturally responsive teachers of science, technology, engineering, and math?" Research was conducted on two professional development programs based on the GI Framework: the Arctic Climate Modeling Program (ACMP) and the Science Teacher Education Program (STEP). Both programs were created by backward design to student learning goals aligned with Alaska standards and rooted in principles of indigenous ideology. Both were created with input from Alaska Native cultural knowledge bearers, Arctic scientists, education researchers, school administrators, and master teachers with extensive instructional experience. Both provide integrated instruction reflective of authentic Arctic research practices, and training in diverse methods shown to increase indigenous student STEM engagement. While based on the same framework, these programs were chosen for research because they offer distinctly different training venues for K-12 teachers. STEP offered two-week summer institutes on the UAF campus for more than 175 teachers from 33 Alaska school districts. By contrast, ACMP served 165 teachers from one rural Alaska school district along the Bering Strait. Due to challenges in making professional development opportunities accessible to all teachers in this geographically isolated district, ACMP offered a year-round mix of in-person, long-distance, online, and local training. 
Discussion centers on a comparison of the strategies used by each program to address GI Framework cornerstones, on methodologies used to conduct program research, and on findings obtained. Research indicates that in both situations the GI Framework for Professional Development was effective in preparing culturally responsive STEM teachers. Implications of these findings and recommendations for future research are discussed in the conclusion.
Beets, Michael W; Webster, Collin; Saunders, Ruth; Huberty, Jennifer L
2013-03-01
Afterschool programs (3-6 p.m.) are positioned to play a critical role in combating childhood obesity. To this end, state and national organizations have developed policies related to promoting physical activity and guiding the nutritional quality of snacks served in afterschool programs. No conceptual frameworks, however, are available that describe the process of how afterschool programs will translate such policies into daily practice to reach eventual outcomes. Drawing from complex systems theory, this article describes the development of a framework that identifies critical modifiable levers within afterschool programs that can be altered and/or strengthened to reach policy goals. These include the policy environment at the national, state, and local levels; individual site, afterschool program leader, staff, and child characteristics; and existing outside organizational partnerships. Use of this framework and recognition of its constituent elements have the potential to lead to the successful and sustainable adoption and implementation of physical activity and nutrition policies in afterschool programs nationwide.
Trinker, Horst
2011-10-28
We study the distribution of triples of codewords of codes and ordered codes. Schrijver [A. Schrijver, New code upper bounds from the Terwilliger algebra and semidefinite programming, IEEE Trans. Inform. Theory 51 (8) (2005) 2859-2866] used the triple distribution of a code to establish a bound on the number of codewords based on semidefinite programming. In the first part of this work, we generalize this approach for ordered codes. In the second part, we consider linear codes and linear ordered codes and present a MacWilliams-type identity for the triple distribution of their dual code. Based on the non-negativity of this linear transform, we establish a linear programming bound and conclude with a table of parameters for which this bound yields better results than the standard linear programming bound.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-06
... Availability of the Framework Document for Commercial Refrigeration Equipment AGENCY: Office of Energy... data collection process to consider amended energy conservation standards for commercial refrigeration... Energy, Building Technologies Program, Mailstop EE-2J, Framework Document for Commercial Refrigeration...
ATLAS offline software performance monitoring and optimization
NASA Astrophysics Data System (ADS)
Chauhan, N.; Kabra, G.; Kittelmann, T.; Langenberg, R.; Mandrysch, R.; Salzburger, A.; Seuster, R.; Ritsch, E.; Stewart, G.; van Eldik, N.; Vitillo, R.; Atlas Collaboration
2014-06-01
In a complex multi-developer, multi-package software environment, such as the ATLAS offline framework Athena, tracking the performance of the code can be a non-trivial task in itself. In this paper we describe improvements in the instrumentation of ATLAS offline software that have given considerable insight into the performance of the code and helped to guide the optimization work. The first tool we used to instrument the code is PAPI, a programming interface for accessing hardware performance counters. PAPI events can count floating point operations, cycles, instructions, and cache accesses. Triggering PAPI to start/stop counting for each algorithm and processed event yields a good understanding of the algorithm-level performance of ATLAS code. Further data can be obtained using Pin, a dynamic binary instrumentation tool. Pin tools can be used to obtain similar statistics to PAPI, but advantageously without requiring recompilation of the code. Fine-grained routine- and instruction-level instrumentation is also possible. Pin tools can additionally interrogate the arguments to functions, such as those in linear algebra libraries, so that a detailed usage profile can be obtained. These tools have characterized the extensive use of vector and matrix operations in ATLAS tracking. Currently, CLHEP is used here, which is not an optimal choice. To help evaluate replacement libraries, a testbed has been set up allowing comparison of the performance of different linear algebra libraries (including CLHEP, Eigen and SMatrix/SVector). Results are then presented via the ATLAS Performance Management Board framework, which runs daily with the current development branch of the code and monitors reconstruction and Monte Carlo jobs. This framework analyses the CPU and memory performance of algorithms, and an overview of the results is presented on a web page.
These tools have provided the insight necessary to plan and implement performance enhancements in ATLAS code by identifying the most common operations, with the call parameters well understood, and allowing improvements to be quantified in detail.
Aspect-Oriented Monitoring of C Programs
NASA Technical Reports Server (NTRS)
Havelund, Klaus; VanWyk, Eric
2008-01-01
The paper presents current work on extending ASPECTC with state machines, resulting in a framework for aspect-oriented monitoring of C programs. Such a framework can be used for testing purposes, or it can be part of a fault protection strategy. The long-term goal is to explore the synergy between the fields of runtime verification, focused on program monitoring, and aspect-oriented programming, focused on more general program development issues. The work is inspired by the observation that most work in this direction has been done for JAVA, partly due to the lack of easily accessible extensible compiler frameworks for C. The work is performed using the SILVER extensible attribute grammar compiler framework, in which C has been defined as a host language. Our work consists of extending C with ASPECTC, and subsequently extending ASPECTC with state machines.
NASA Technical Reports Server (NTRS)
Bhadra, Dipasis; Morser, Frederick R.
2006-01-01
In this paper, the authors review the FAA's current program investments and lay out a preliminary analytical framework to undertake projects that may address some of the noted deficiencies. By drawing upon well-developed theories from corporate finance, an analytical framework is offered that can be used for choosing the FAA's investments, taking into account risk, expected returns, and inherent dependencies across NAS programs. The framework can be expanded to take in multiple assets and realistic parameter values in drawing an efficient risk-return frontier for the FAA's entire portfolio of investment programs.
LINEAR - DERIVATION AND DEFINITION OF A LINEAR AIRCRAFT MODEL
NASA Technical Reports Server (NTRS)
Duke, E. L.
1994-01-01
The Derivation and Definition of a Linear Model program, LINEAR, provides the user with a powerful and flexible tool for the linearization of aircraft aerodynamic models. LINEAR was developed to provide a standard, documented, and verified tool to derive linear models for aircraft stability analysis and control law design. Linear system models define the aircraft system in the neighborhood of an analysis point and are determined by the linearization of the nonlinear equations defining vehicle dynamics and sensors. LINEAR numerically determines a linear system model using nonlinear equations of motion and a user-supplied linear or nonlinear aerodynamic model. The nonlinear equations of motion used are six-degree-of-freedom equations with stationary-atmosphere and flat, nonrotating-earth assumptions. LINEAR is capable of extracting linearized engine effects, such as net thrust, torque, and gyroscopic effects, and including these effects in the linear system model. The point at which this linear model is defined is determined either by completely specifying the state and control variables, or by specifying an analysis point on a trajectory and directing the program to determine the control variables and the remaining state variables. The system model determined by LINEAR consists of matrices for both the state and observation equations. The program has been designed to provide easy selection of the state, control, and observation variables to be used in a particular model. Thus, the order of the system model is completely under user control. Further, the program provides the flexibility of allowing alternate formulations of both the state and observation equations. Data describing the aircraft and the test case are input to the program through a terminal or formatted data files. All data can be modified interactively from case to case.
The aerodynamic model can be defined in two ways: as a set of nondimensional stability and control derivatives for the flight point of interest, or as a full nonlinear aerodynamic model as used in simulations. LINEAR is written in FORTRAN and has been implemented on a DEC VAX computer operating under VMS with a virtual memory requirement of approximately 296K 8-bit bytes. Both interactive and batch versions are included. LINEAR was developed in 1988.
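The linearization step LINEAR performs can be sketched numerically. Below is a minimal central-difference Jacobian of nonlinear dynamics x_dot = f(x) about an analysis point; this is a hedged illustration, not the NASA FORTRAN implementation, and f() is a hypothetical two-state toy model rather than an aircraft model.

```python
# Hedged numerical sketch of the linearization LINEAR performs: a
# central-difference Jacobian of nonlinear dynamics x_dot = f(x) about an
# analysis point. Not the NASA FORTRAN code; f() is a hypothetical toy model.

def central_diff_jacobian(f, x0, n_out, h=1e-6):
    """Jacobian of f at x0, estimated by central differences."""
    n = len(x0)
    J = [[0.0] * n for _ in range(n_out)]
    for j in range(n):
        xp, xm = list(x0), list(x0)
        xp[j] += h
        xm[j] -= h
        fp, fm = f(xp), f(xm)
        for i in range(n_out):
            J[i][j] = (fp[i] - fm[i]) / (2.0 * h)
    return J

def f(x):
    """Toy dynamics, states x = [v, theta] (hypothetical coefficients)."""
    v, theta = x
    return [-0.02 * v - 9.81 * theta, 0.5 * v - 0.8 * theta]

# State matrix A of the linear model x_dot = A x about the origin
A = central_diff_jacobian(f, [0.0, 0.0], n_out=2)
```

For a nonlinear f, the same call evaluated at a trim point yields the state matrix of the local linear model, which is the essence of what the program automates.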
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for two programs in the state's postsecondary-level computer information systems technology cluster: computer programming and network support. Presented in the introduction are program descriptions and suggested course…
Assessing Study Abroad Programs: Application of the "SLEPT" Framework through Learning Communities
ERIC Educational Resources Information Center
Tajes, Maria; Ortiz, Jamie
2010-01-01
This case study proposes a comprehensive conceptual framework for exploring student learning outcomes of short-term study abroad programs. It uses the Social, Legal, Economic, Political, and Technological framework to assess understanding of the host country before departing and after returning. Participation fostered global literacy and critical…
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the state's funeral services technology program. Presented in the introduction are a program description and suggested course sequence. Section I lists baseline competencies for the funeral…
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the two course sequences of the state's postsecondary-level drafting and design technology program: architectural drafting technology and drafting and design technology. Presented first are a program description and…
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the emergency medical technology (EMT) programs cluster. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline…
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the banking and finance technology program. Presented in the introduction are a program description and suggested course sequence. Section I is a curriculum guide consisting of outlines for…
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the automotive technology programs cluster. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies, and…
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the automotive machinist programs cluster. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies, and…
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the medical assisting technology program. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies, and…
PCC Framework for Program-Generators
NASA Technical Reports Server (NTRS)
Kong, Soonho; Choi, Wontae; Yi, Kwangkeun
2009-01-01
In this paper, we propose a proof-carrying code framework for program-generators. The enabling technique is abstract parsing, a static string analysis technique, which is used as a component for generating and validating certificates. Our framework provides an efficient solution for certifying program-generators whose safety properties are expressed in terms of the grammar representing the generated program. The fixed-point solution of the analysis is generated and attached to the program-generator on the code producer side. The consumer receives the code with the fixed-point solution and validates that the received fixed point is indeed a fixed point of the received code. This validation can be done in a single pass.
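The validate-rather-than-recompute idea can be sketched concretely: the consumer checks in a single pass that the supplied solution satisfies F(solution) == solution. The toy equation system below is a reaching-definitions-style analysis, not the abstract-parsing analysis of the paper; all names are hypothetical.

```python
# Hedged sketch of single-pass certificate validation: the consumer does not
# recompute the analysis, it only checks F(solution) == solution. The system
# below is a toy reaching-definitions style analysis, not abstract parsing.

preds = {"b": ["a"], "c": ["a", "b"]}   # predecessor nodes in a tiny flow graph
gen = {"a": {"d1"}, "b": {"d2"}}        # definitions generated at each node

def F(sol):
    """One joint application of every node's transfer function."""
    new = {}
    for node in sol:
        inputs = set()
        for p in preds.get(node, []):
            inputs |= sol[p]
        new[node] = inputs | gen.get(node, set())
    return new

def validate(sol):
    """Single-pass certificate check: is sol a fixed point of F?"""
    return F(sol) == sol

claimed = {"a": {"d1"}, "b": {"d1", "d2"}, "c": {"d1", "d2"}}
```

Computing the fixed point requires iteration, but checking one is a single application of F, which is why the certificate can be validated cheaply on the consumer side.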
ERIC Educational Resources Information Center
Office of Head Start, US Department of Health and Human Services, 2010
2010-01-01
This report presents a revision of the Head Start Child Outcomes Framework (2000), renamed The Head Start Child Development and Learning Framework: Promoting Positive Outcomes in Early Childhood Programs Serving Children 3-5 Years Old. The Framework outlines the essential areas of development and learning that are to be used by Head Start programs…
A framework for monitoring social process and outcomes in environmental programs.
Chapman, Sarah
2014-12-01
When environmental programs frame their activities as being in the service of human wellbeing, social variables need to be integrated into monitoring and evaluation (M&E) frameworks. This article draws upon ecosystem services theory to develop a framework to guide the M&E of collaborative environmental programs with anticipated social benefits. The framework has six components: program need, program activities, pathway process variables, moderating process variables, outcomes, and program value. Needs are defined in terms of ecosystem services, as well as other human needs that must be addressed to achieve outcomes. The pathway variable relates to the development of natural resource governance capacity in the target community. Moderating processes can be externalities such as the inherent capacity of the natural system to service ecosystem needs, local demand for natural resources, policy or socio-economic drivers. Internal program-specific processes relate to program service delivery, targeting and participant responsiveness. Ecological outcomes are expressed in terms of changes in landscape structure and function, which in turn influence ecosystem service provision. Social benefits derived from the program are expressed in terms of the value of the eco-social service to user-specified goals. The article provides suggestions from the literature for identifying indicators and measures for components and component variables, and concludes with an example of how the framework was used to inform the M&E of an adaptive co-management program in western Kenya. Copyright © 2014 Elsevier Ltd. All rights reserved.
Novins, Douglas K.; Moore, Laurie A.; Beals, Janette; Aarons, Gregory A.; Rieckmann, Traci; Kaufman, Carol E.
2013-01-01
Background Because of their broad geographic distribution, diverse ownership and operation, and funding instability, it is a challenge to develop a framework for studying substance abuse treatment programs serving American Indian and Alaska Native communities at a national level. This is further complicated by the historic reluctance of American Indian and Alaska Native communities to participate in research. Objectives and Methods We developed a framework for studying these substance abuse treatment programs (n = 293) at a national level as part of a study of attitudes toward, and use of, evidence-based treatments among substance abuse treatment programs serving AI/AN communities with the goal of assuring participation of a broad array of programs and the communities that they serve. Results Because of the complexities of identifying specific substance abuse treatment programs, the sampling framework divides these programs into strata based on the American Indian and Alaska Native communities that they serve: (1) the 20 largest tribes (by population); (2) urban AI/AN clinics; (3) Alaska Native Health Corporations; (4) other Tribes; and (5) other regional programs unaffiliated with a specific AI/AN community. In addition, the recruitment framework was designed to be sensitive to likely concerns about participating in research. Conclusion and Scientific Significance This systematic approach for studying substance abuse and other clinical programs serving AI/AN communities assures the participation of diverse AI/AN programs and communities and may be useful in designing similar national studies. PMID:22931088
Profiling a Mind Map User: A Descriptive Appraisal
ERIC Educational Resources Information Center
Tucker, Joanne M.; Armstrong, Gary R.; Massad, Victor J.
2010-01-01
Whether done manually or through the use of software, a non-linear information organization framework known as mind mapping offers an alternative to linear thinking modes, such as outlining, for capturing thoughts, ideas, and information. Mind mapping supports brainstorming, organizing, and problem solving. This paper examines mind mapping techniques,…
Simultaneous Optimization of Decisions Using a Linear Utility Function.
ERIC Educational Resources Information Center
Vos, Hans J.
1990-01-01
An approach is presented to simultaneously optimize decision rules for combinations of elementary decisions through a framework derived from Bayesian decision theory. The developed linear utility model for selection-mastery decisions was applied to a sample of 43 first year medical students to illustrate the procedure. (SLD)
Utilitarian Aggregation of Beliefs and Tastes.
ERIC Educational Resources Information Center
Gilboa, Itzhak; Samet, Dov; Schmeidler, David
2004-01-01
Harsanyi's utilitarianism is extended here to Savage's framework. We formulate a Pareto condition that implies that both society's utility function and its probability measure are linear combinations of those of the individuals. An indiscriminate Pareto condition has been shown to contradict linear aggregation of beliefs and tastes. We argue that…
FSILP: fuzzy-stochastic-interval linear programming for supporting municipal solid waste management.
Li, Pu; Chen, Bing
2011-04-01
Although many studies on municipal solid waste (MSW) management were conducted under uncertain conditions where fuzzy, stochastic, and interval information coexist, conventional linear programming approaches that integrate the fuzzy method with the other two were inefficient. In this study, a fuzzy-stochastic-interval linear programming (FSILP) method is developed by integrating Nguyen's method with conventional linear programming to support municipal solid waste management. Nguyen's method was used to convert the fuzzy and fuzzy-stochastic linear programming problems into conventional linear programs, by measuring the attainment values of fuzzy numbers and/or fuzzy random variables, as well as the superiority and inferiority between triangular fuzzy numbers/triangular fuzzy-stochastic variables. The developed method can effectively tackle uncertainties described in terms of probability density functions, fuzzy membership functions, and discrete intervals. Moreover, the method improves upon the conventional interval fuzzy programming and two-stage stochastic programming approaches, achieving comparable capabilities with fewer constraints and significantly reduced computation time. The developed model was applied to a case study of a municipal solid waste management system in a city. The results indicated that reasonable solutions were generated. The solution can help quantify the relationship between changes in system cost and the uncertainties, which could support further analysis of tradeoffs between the waste management cost and the system failure risk. Copyright © 2010 Elsevier Ltd. All rights reserved.
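The core move in FSILP is reducing fuzzy coefficients to crisp ones so that a conventional LP solver applies. As a hedged stand-in for Nguyen's attainment-value method, the sketch below uses simple centroid defuzzification of triangular fuzzy numbers; the cost figures are hypothetical.

```python
# Hedged sketch of the conversion at the heart of FSILP: fuzzy coefficients
# are reduced to crisp ones before a conventional LP solver is applied.
# Centroid defuzzification is a simple stand-in for Nguyen's attainment-value
# method; the cost figures are hypothetical.

def centroid(tri):
    """Centroid (crisp value) of a triangular fuzzy number (low, mode, high)."""
    low, mode, high = tri
    return (low + mode + high) / 3.0

# Fuzzy unit costs for two hypothetical waste-handling options
fuzzy_costs = [(10.0, 12.0, 17.0), (20.0, 25.0, 27.0)]
crisp_costs = [centroid(c) for c in fuzzy_costs]
# crisp_costs == [13.0, 24.0]; these become the objective vector of a crisp LP
```

Whatever conversion rule is chosen, the payoff is the same: once the coefficients are crisp, the remaining problem is an ordinary linear program.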
NASA Astrophysics Data System (ADS)
Indarsih, Indrati, Ch. Rini
2016-02-01
In this paper, we define the variance of fuzzy random variables through alpha levels. We present a theorem showing that the variance of a fuzzy random variable is a fuzzy number. We consider a multi-objective linear programming (MOLP) problem with fuzzy random objective function coefficients and solve it by a variance approach. The approach transforms the MOLP with fuzzy random objective function coefficients into an MOLP with fuzzy objective function coefficients. By the weighted method, we obtain a linear program with fuzzy coefficients, which we solve with a simplex method for fuzzy linear programming.
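The weighted method mentioned in the abstract scalarizes several objectives into one LP objective. A hedged sketch of that step follows; the coefficients and weights are hypothetical, and the fuzziness of the paper's coefficients is omitted for clarity.

```python
# Hedged sketch of the weighted method: several objective coefficient vectors
# are scalarized into a single LP objective by a weighted sum. Coefficients
# and weights are hypothetical; the fuzzy aspect of the paper is omitted.

def weighted_objective(objectives, weights):
    """Combine objective coefficient vectors with the given weights."""
    n = len(objectives[0])
    return [sum(w * obj[j] for w, obj in zip(weights, objectives))
            for j in range(n)]

c1 = [3.0, 1.0]   # coefficients of the first objective
c2 = [1.0, 4.0]   # coefficients of the second objective
c = weighted_objective([c1, c2], [0.6, 0.4])
# c is approximately [2.2, 2.2]: the single objective handed to the simplex
# method
```

Varying the weights traces out different compromise solutions of the original multi-objective problem.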
Bruhn, Peter; Geyer-Schulz, Andreas
2002-01-01
In this paper, we introduce genetic programming over context-free languages with linear constraints for combinatorial optimization, apply this method to several variants of the multidimensional knapsack problem, and discuss its performance relative to Michalewicz's genetic algorithm with penalty functions. With respect to Michalewicz's approach, we demonstrate that genetic programming over context-free languages with linear constraints improves convergence. A final result is that genetic programming over context-free languages with linear constraints is ideally suited to modeling complementarities between items in a knapsack problem: The more complementarities in the problem, the stronger the performance in comparison to its competitors.
Adaptive learning in complex reproducing kernel Hilbert spaces employing Wirtinger's subgradients.
Bouboulis, Pantelis; Slavakis, Konstantinos; Theodoridis, Sergios
2012-03-01
This paper presents a wide framework for non-linear online supervised learning tasks in the context of complex-valued signal processing. The (complex) input data are mapped into a complex reproducing kernel Hilbert space (RKHS), where the learning phase takes place. Both pure complex kernels and real kernels (via the complexification trick) can be employed. Moreover, any convex, continuous and not necessarily differentiable function can be used to measure the loss between the output of the specific system and the desired response. The only requirement is that the subgradient of the adopted loss function be available in analytic form. In order to derive the subgradients analytically, the principles of the (recently developed) Wirtinger's calculus in complex RKHS are exploited. Furthermore, both linear and widely linear (in RKHS) estimation filters are considered. To cope with the problem of increasing memory requirements, which is present in almost all online schemes in RKHS, a sparsification scheme based on projection onto closed balls has been adopted. We demonstrate the effectiveness of the proposed framework in a non-linear channel identification task, a non-linear channel equalization problem, and a quadrature phase shift keying equalization scheme, using both circular and non-circular synthetic signal sources.
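To convey the flavor of online learning in an RKHS, here is a hedged toy: a real-valued kernel LMS with a Gaussian kernel. The paper's framework is far richer; complex RKHS, Wirtinger subgradients, and projection-based sparsification are all omitted, and the step size, kernel width, and data below are illustrative choices only.

```python
import math

# Hedged sketch: real-valued kernel LMS (KLMS), i.e., online learning in an
# RKHS induced by a Gaussian kernel. The paper works in *complex* RKHS with
# Wirtinger subgradients and sparsification; this toy omits all of that.

def gauss(x, y, sigma=1.0):
    return math.exp(-((x - y) ** 2) / (2.0 * sigma ** 2))

def klms(stream, mu=0.5):
    """Return the per-sample absolute errors of an online KLMS run."""
    centers, alphas, errors = [], [], []
    for x, d in stream:
        y = sum(a * gauss(c, x) for a, c in zip(alphas, centers))
        e = d - y
        errors.append(abs(e))
        centers.append(x)        # every sample becomes a kernel center
        alphas.append(mu * e)    # (no sparsification in this toy version)
    return errors

# Five sweeps over a smooth nonlinearity d = sin(x), x in [-2, 2]
data = [(k / 5.0, math.sin(k / 5.0)) for k in range(-10, 11)] * 5
errors = klms(data)
```

The unchecked growth of `centers` with every sample is exactly the memory problem that motivates the sparsification scheme in the paper.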
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malkoske, Kyle; Nielsen, Michelle; Brown, Erika
A close partnership between the Canadian Partnership for Quality Radiotherapy (CPQR) and the Canadian Organization of Medical Physicists' (COMP) Quality Assurance and Radiation Safety Advisory Committee (QARSAC) has resulted in the development of a suite of Technical Quality Control (TQC) Guidelines for radiation treatment equipment, which outline specific performance objectives and criteria that equipment should meet in order to assure an acceptable level of radiation treatment quality. The framework includes consolidation of existing guidelines and/or literature by expert reviewers, structured stages of public review, external field-testing, and ratification by COMP. The adopted framework for the development and maintenance of the TQCs ensures that the guidelines incorporate input from the medical physics community during development, that the workload required to perform the QC tests outlined in each TQC is measured, and that the documents remain relevant (i.e., "living documents") through subsequent planned reviews and updates. This presentation will show the Multi-Leaf Linear Accelerator document as an example of how feedback and cross-national collaboration produced a robust guidance document. During field-testing, each technology was tested at multiple centres in a variety of clinic environments. As part of the defined feedback, workload data were captured, leading to average testing times being defined in each TQC document. As a result, for a medium-sized centre comprising 6 linear accelerators and a comprehensive brachytherapy program, we evaluate the physics workload at 1.5 full-time equivalent physicists per year to complete all QC tests listed in this suite.
A Mixed Integer Linear Program for Airport Departure Scheduling
NASA Technical Reports Server (NTRS)
Gupta, Gautam; Jung, Yoon Chul
2009-01-01
Aircraft departing from an airport are subject to numerous constraints while scheduling departure times. These constraints include wake-separation constraints for successive departures, miles-in-trail separation for aircraft bound for the same departure fixes, and time-window or prioritization constraints for individual flights. Besides these, emissions as well as increased fuel consumption due to inefficient scheduling need to be included. Addressing all the above constraints in a single framework while allowing for resequencing of the aircraft using runway queues is critical to the implementation of the Next Generation Air Transport System (NextGen) concepts. Prior work on airport departure scheduling has addressed some of the above. However, existing methods use pre-determined runway queues, and schedule aircraft from these departure queues. The source of such pre-determined queues is not explicit, and could potentially be a subjective controller input. Determining runway queues and scheduling within the same framework would potentially result in better scheduling. This paper presents a mixed integer linear program (MILP) for the departure-scheduling problem. The program takes as input the incoming sequence of aircraft for departure from a runway, along with their earliest departure times and an optional prioritization scheme based on time-window of departure for each aircraft. The program then assigns these aircraft to the available departure queues and schedules departure times, explicitly considering wake separation and departure fix restrictions to minimize total delay for all aircraft. The approach is generalized and can be used in a variety of situations, and allows for aircraft prioritization based on operational as well as environmental considerations. We present the MILP in the paper, along with benefits over the first-come-first-serve (FCFS) scheme for numerous randomized problems based on real-world settings. 
The MILP results in substantially reduced delays as compared to FCFS, and the magnitude of the savings depends on the queue and departure fix structure. The MILP assumes deterministic aircraft arrival times at the runway queues. However, due to taxi time uncertainty, aircraft might arrive either earlier or later than these deterministic times. Thus, to incorporate this uncertainty, we present a method for using the MILP with an "overlap discounted rolling planning horizon". The approach is based on valuing near-term decision results more than future ones. We develop a model of taxi-time uncertainty based on real-world data, and then compare the baseline FCFS delays with delays using the above MILP in a simple rolling-horizon method and in the overlap-discounted scheme.
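The combinatorial problem underlying the MILP can be illustrated on a toy instance: pick a single-runway departure order that respects wake separation and minimizes total delay. This is a hedged brute-force illustration, not the paper's MILP with queues and departure-fix restrictions; all times and weight classes are hypothetical.

```python
from itertools import permutations

# Hedged toy illustration of the scheduling problem the MILP solves: choose a
# single-runway departure order that respects wake separation and minimizes
# total delay. Not the paper's MILP; all numbers and classes are hypothetical.

earliest = {"A": 0, "B": 10, "C": 20}          # earliest runway times (s)
weight = {"A": "heavy", "B": "small", "C": "small"}
sep = {("heavy", "small"): 120}                # extra wake separation (s)
DEFAULT_SEP = 60                               # all other pairs (s)

def total_delay(order):
    """Push each flight to its earliest feasible slot; sum the delays."""
    t, total = None, 0
    for prev, cur in zip((None,) + order, order):
        gap = 0 if prev is None else sep.get((weight[prev], weight[cur]), DEFAULT_SEP)
        t = earliest[cur] if t is None else max(earliest[cur], t + gap)
        total += t - earliest[cur]
    return total

best = min(permutations(earliest), key=total_delay)
# Brute force finds ("B", "C", "A") with 180 s total delay: sequencing the
# heavy aircraft last avoids imposing its 120 s wake gap on both followers
```

Brute force is exponential in the number of flights, which is why a MILP formulation with queue structure is needed at realistic problem sizes.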
Kase, Courtney; Hoover, Sharon; Boyd, Gina; West, Kristina D; Dubenitz, Joel; Trivedi, Pamala A; Peterson, Hilary J; Stein, Bradley D
2017-07-01
There is an unmet need for behavioral health support and services among children and adolescents, which school behavioral health has the potential to address. Existing reviews and meta-analyses document the behavioral health benefits of school behavioral health programs and frameworks, but few summaries of the academic benefits of such programs exist. We provide exemplars of the academic benefits of school behavioral health programs and frameworks. A literature review identified school behavioral health-related articles and reports. Articles for inclusion were restricted to those that were school-based programs and frameworks in the United States that included an empirical evaluation of intervention academic-related outcomes. Findings from 36 primary research, review, and meta-analysis articles from the past 17 years show the benefits of school behavioral health clinical interventions and targeted interventions on a range of academic outcomes for adolescents. Our findings are consistent with reports documenting health benefits of school behavioral health frameworks and programs and can facilitate further efforts to support school behavioral health for a range of stakeholders interested in the benefits of school behavioral health programs and frameworks on academic outcomes. © 2017, American School Health Association.
An Evaluation Framework and Comparative Analysis of the Widely Used First Programming Languages
Farooq, Muhammad Shoaib; Khan, Sher Afzal; Ahmad, Farooq; Islam, Saeed; Abid, Adnan
2014-01-01
Computer programming is the core of computer science curriculum. Several programming languages have been used to teach the first course in computer programming, and such languages are referred to as first programming language (FPL). The pool of programming languages has been evolving with the development of new languages, and from this pool different languages have been used as FPL at different times. Though the selection of an appropriate FPL is very important, yet it has been a controversial issue in the presence of many choices. Many efforts have been made for designing a good FPL, however, there is no ample way to evaluate and compare the existing languages so as to find the most suitable FPL. In this article, we have proposed a framework to evaluate the existing imperative, and object oriented languages for their suitability as an appropriate FPL. Furthermore, based on the proposed framework we have devised a customizable scoring function to compute a quantitative suitability score for a language, which reflects its conformance to the proposed framework. Lastly, we have also evaluated the conformance of the widely used FPLs to the proposed framework, and have also computed their suitability scores. PMID:24586449
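The customizable scoring function described in the abstract can be sketched as a weighted sum over per-criterion scores, normalized by the total weight. The criteria, weights, and scores below are hypothetical, not the paper's actual rubric.

```python
# Hedged sketch of a customizable FPL scoring function in the spirit of the
# paper: a weighted sum over per-criterion scores, normalized by total weight.
# Criteria, weights, and scores are hypothetical, not the paper's rubric.

def suitability(scores, weights):
    """Weighted suitability score in [0, 1] for per-criterion scores in [0, 1]."""
    total_w = sum(weights.values())
    return sum(weights[c] * scores[c] for c in weights) / total_w

weights = {"readability": 3, "simplicity": 2, "tool_support": 1}
lang_a = {"readability": 0.9, "simplicity": 0.8, "tool_support": 0.6}
lang_b = {"readability": 0.5, "simplicity": 0.9, "tool_support": 0.9}
```

Changing the weights re-ranks the candidate languages, which is the sense in which such a scoring function is "customizable" to an institution's priorities.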
High Resolution, Large Deformation 3D Traction Force Microscopy
López-Fagundo, Cristina; Reichner, Jonathan; Hoffman-Kim, Diane; Franck, Christian
2014-01-01
Traction Force Microscopy (TFM) is a powerful approach for quantifying cell-material interactions that over the last two decades has contributed significantly to our understanding of cellular mechanosensing and mechanotransduction. In addition, recent advances in three-dimensional (3D) imaging and traction force analysis (3D TFM) have highlighted the significance of the third dimension in influencing various cellular processes. Yet irrespective of dimensionality, almost all TFM approaches have relied on a linear elastic theory framework to calculate cell surface tractions. Here we present a new high resolution 3D TFM algorithm which utilizes a large deformation formulation to quantify cellular displacement fields with unprecedented resolution. The results feature some of the first experimental evidence that cells are indeed capable of exerting large material deformations, which require the formulation of a new theoretical TFM framework to accurately calculate the traction forces. Based on our previous 3D TFM technique, we reformulate our approach to accurately account for large material deformation and quantitatively contrast and compare both linear and large deformation frameworks as a function of the applied cell deformation. Particular attention is paid in estimating the accuracy penalty associated with utilizing a traditional linear elastic approach in the presence of large deformation gradients. PMID:24740435
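The accuracy penalty of a linear elastic (small-strain) assumption under large deformation can be illustrated with a minimal sketch (not the paper's TFM algorithm; the stretch values are hypothetical): the infinitesimal strain and the finite Green-Lagrange strain agree for small stretches but diverge as the deformation grows.

```python
import numpy as np

def small_strain(F):
    # Infinitesimal (linear) strain: eps = 1/2 (F + F^T) - I
    return 0.5 * (F + F.T) - np.eye(F.shape[0])

def green_lagrange(F):
    # Finite (large-deformation) strain: E = 1/2 (F^T F - I)
    return 0.5 * (F.T @ F - np.eye(F.shape[0]))

# Uniaxial stretch lam along x: deformation gradient F = diag(lam, 1, 1)
for lam in (1.01, 1.5):
    F = np.diag([lam, 1.0, 1.0])
    e_lin = small_strain(F)[0, 0]      # lam - 1
    e_gl  = green_lagrange(F)[0, 0]    # (lam^2 - 1) / 2
    print(lam, e_lin, e_gl)
```

At a 1% stretch the two measures are nearly identical; at a 50% stretch the linear measure (0.5) understates the finite strain (0.625) by 20%, which is the kind of error the large-deformation formulation avoids.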
Reconstruction of three-dimensional ultrasound images based on cyclic Savitzky-Golay filters
NASA Astrophysics Data System (ADS)
Toonkum, Pollakrit; Suwanwela, Nijasri C.; Chinrungrueng, Chedsada
2011-01-01
We present a new algorithm for reconstructing a three-dimensional (3-D) ultrasound image from a series of two-dimensional B-scan ultrasound slices acquired in the mechanical linear scanning framework. Unlike most existing 3-D ultrasound reconstruction algorithms, which have been developed and evaluated in the freehand scanning framework, the new algorithm has been designed to capitalize on the regularity pattern of mechanical linear scanning, where all the B-scan slices are precisely parallel and evenly spaced. The new reconstruction algorithm, referred to as the cyclic Savitzky-Golay (CSG) reconstruction filter, improves on the original Savitzky-Golay filter in two respects: First, it is extended to accept a 3-D array of data as the filter input instead of a one-dimensional data sequence. Second, it incorporates the cyclic indicator function in its least-squares objective function so that the CSG algorithm can simultaneously perform both smoothing and interpolating tasks. The performance of the CSG reconstruction filter, compared to that of most existing reconstruction algorithms, in generating a 3-D synthetic test image and a clinical 3-D carotid artery bifurcation in the mechanical linear scanning framework is also reported.
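As background to the CSG extension, the classical 1-D Savitzky-Golay idea can be sketched as a least-squares polynomial fit over each centered window, evaluated at the window center. This is a generic illustration, not the paper's 3-D cyclic formulation; the window size and polynomial order are arbitrary choices.

```python
import numpy as np

def savgol_smooth(y, window=5, order=2):
    """Classical 1-D Savitzky-Golay smoothing: fit a low-order polynomial
    to each centered window by least squares and evaluate it at the center.
    Edge samples (less than half a window from the ends) are left unchanged."""
    half = window // 2
    y = np.asarray(y, dtype=float)
    out = y.copy()
    t = np.arange(-half, half + 1)          # local window coordinates
    for i in range(half, len(y) - half):
        coeffs = np.polyfit(t, y[i - half:i + half + 1], order)
        out[i] = np.polyval(coeffs, 0.0)    # value of the fit at window center
    return out

# A quadratic signal is reproduced exactly by an order-2 fit
x = np.linspace(0, 1, 11)
sig = 3 * x**2 + 2 * x + 1
print(np.allclose(savgol_smooth(sig), sig))   # True
```

The key property exploited here, exact reproduction of polynomials up to the fit order, is what makes Savitzky-Golay filters attractive for smoothing evenly spaced slices such as those produced by mechanical linear scanning.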
NASA Technical Reports Server (NTRS)
Lawson, C. L.; Krogh, F. T.; Gold, S. S.; Kincaid, D. R.; Sullivan, J.; Williams, E.; Hanson, R. J.; Haskell, K.; Dongarra, J.; Moler, C. B.
1982-01-01
The Basic Linear Algebra Subprograms (BLAS) library is a collection of 38 FORTRAN-callable routines for performing basic operations of numerical linear algebra. The BLAS library is a portable and efficient source of basic operations for designers of programs involving linear algebraic computations. The BLAS library is supplied in portable FORTRAN and Assembler code versions for IBM 370, UNIVAC 1100, and CDC 6000 series computers.
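To illustrate the kind of Level-1 operations such routines provide (this is a plain-Python sketch of the operations themselves, not the library's FORTRAN interface):

```python
import numpy as np

def axpy(a, x, y):
    # The AXPY operation from BLAS Level 1: returns a*x + y
    return a * np.asarray(x) + np.asarray(y)

def dot(x, y):
    # The DOT operation from BLAS Level 1: sum_i x_i * y_i
    return float(np.dot(x, y))

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])
print(axpy(2.0, x, y))   # [ 6.  9. 12.]
print(dot(x, y))         # 32.0
```

Higher-level numerical software builds more complex computations (matrix factorizations, solvers) out of exactly these primitive vector operations, which is why a portable, efficient BLAS matters.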
Towards a Theory-Based Design Framework for an Effective E-Learning Computer Programming Course
ERIC Educational Resources Information Center
McGowan, Ian S.
2016-01-01
Built on Dabbagh (2005), this paper presents a four component theory-based design framework for an e-learning session in introductory computer programming. The framework, driven by a body of exemplars component, emphasizes the transformative interaction between the knowledge building community (KBC) pedagogical model, a mixed instructional…
Penetration with Long Rods: A Theoretical Framework and Comparison with Instrumented Impacts
1981-05-01
program to begin probing the details of the interaction process. The theoretical framework underlying such a program is explained in detail. The theory of ... of the time sequence of events during penetration. Data from one series of experiments, reported in detail elsewhere, are presented and discussed within the theoretical framework.
A Framework for Analysis of Case Studies of Reading Lessons
ERIC Educational Resources Information Center
Carlisle, Joanne F.; Kelcey, Ben; Rosaen, Cheryl; Phelps, Geoffrey; Vereb, Anita
2013-01-01
This paper focuses on the development and study of a framework to provide direction and guidance for practicing teachers in using a web-based case studies program for professional development in early reading; the program is called Case Studies Reading Lessons (CSRL). The framework directs and guides teachers' analysis of reading instruction by…
A Numerical Approximation Framework for the Stochastic Linear Quadratic Regulator on Hilbert Spaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levajković, Tijana, E-mail: tijana.levajkovic@uibk.ac.at, E-mail: t.levajkovic@sf.bg.ac.rs; Mena, Hermann, E-mail: hermann.mena@uibk.ac.at; Tuffaha, Amjad, E-mail: atufaha@aus.edu
We present an approximation framework for computing the solution of the stochastic linear quadratic control problem on Hilbert spaces. We focus on the finite horizon case and the related differential Riccati equations (DREs). Our approximation framework is concerned with the so-called “singular estimate control systems” (Lasiecka in Optimal control problems and Riccati equations for systems with unbounded controls and partially analytic generators: applications to boundary and point control problems, 2004) which model certain coupled systems of parabolic/hyperbolic mixed partial differential equations with boundary or point control. We prove that the solutions of the approximate finite-dimensional DREs converge to the solution of the infinite-dimensional DRE. In addition, we prove that the optimal state and control of the approximate finite-dimensional problem converge to the optimal state and control of the corresponding infinite-dimensional problem.
2010-01-01
Background The prevention of overweight sometimes raises complex ethical questions. Ethical public health frameworks may be helpful in evaluating programs or policy for overweight prevention. We give an overview of the purpose, form and contents of such public health frameworks and investigate to what extent they are useful for evaluating programs to prevent overweight and/or obesity. Methods Our search for frameworks consisted of three steps. Firstly, we asked experts in the field of ethics and public health for the frameworks they were aware of. Secondly, we performed a search in Pubmed. Thirdly, we checked literature references in the articles on frameworks we found. In total, we thus found six ethical frameworks. We assessed the area on which the available ethical frameworks focus, the users they target, the type of policy or intervention they propose to address, and their aim. Further, we looked at their structure and content, that is, tools for guiding the analytic process, the main ethical principles or values, possible criteria for dealing with ethical conflicts, and the concrete policy issues they are applied to. Results All frameworks aim to support public health professionals or policymakers. Most of them provide a set of values or principles that serve as a standard for evaluating policy. Most frameworks articulate both the positive ethical foundations for public health and ethical constraints or concerns. Some frameworks offer analytic tools for guiding the evaluative process. Procedural guidelines and concrete criteria for solving important ethical conflicts in the particular area of the prevention of overweight or obesity are mostly lacking. Conclusions Public health ethical frameworks may be supportive in the evaluation of overweight prevention programs or policy, but seem to lack practical guidance to address ethical conflicts in this particular area. PMID:20969761
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the veterinary technology program. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies, and section II consists of…
ERIC Educational Resources Information Center
Paprzycki, Peter; Tuttle, Nicole; Czerniak, Charlene M.; Molitor, Scott; Kadervaek, Joan; Mendenhall, Robert
2017-01-01
This study investigates the effect of a Framework-aligned professional development program at the PreK-3 level. The NSF funded program integrated science with literacy and mathematics learning and provided teacher professional development, along with materials and programming for parents to encourage science investigations and discourse around…
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the collision repair technology programs cluster. Presented in the introductory section are a description of the program and suggested course sequences for 1- and 2-year certificates. Section…
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the forestry technology program cluster. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies for the…
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the ophthalmic technology program. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies, and section II…
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the health care assistant program. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies for the nurse…
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the plumber and pipefitter/steamfitter cluster. Presented in the introductory section are program descriptions and suggested course sequences for the plumbing and pipefitting programs. Section…
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the radiologic technology program. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies for the program,…
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the civil technology programs cluster. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies, and section…
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the state's marketing management technology program. Presented in the introduction are a program description and suggested course sequence. Section I lists baseline competencies for the…
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the medical laboratory technology program. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies, and…
Zörnig, Peter
2015-08-01
We present integer programming models for some variants of the farthest string problem. The number of variables and constraints is substantially less than that of the integer linear programming models known in the literature. Moreover, the solution of the linear programming-relaxation contains only a small proportion of noninteger values, which considerably simplifies the rounding process. Numerical tests have shown excellent results, especially when a small set of long sequences is given.
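To illustrate the underlying combinatorial problem (not the paper's integer programming models), the farthest string objective, finding a string that maximizes the minimum Hamming distance to a given set, can be solved by brute force for tiny instances:

```python
from itertools import product

def hamming(a, b):
    # Number of positions at which two equal-length strings differ
    return sum(c1 != c2 for c1, c2 in zip(a, b))

def farthest_string(strings, alphabet="01"):
    """Exhaustive solver for tiny instances: returns the string over
    `alphabet` maximizing the minimum Hamming distance to all inputs."""
    length = len(strings[0])
    best, best_d = None, -1
    for cand in product(alphabet, repeat=length):
        cand = "".join(cand)
        d = min(hamming(cand, s) for s in strings)
        if d > best_d:
            best, best_d = cand, d
    return best, best_d

t, d = farthest_string(["000", "011"])
print(t, d)   # 101 2
```

The search space grows as |alphabet|^length, which is exactly why integer (linear) programming formulations with compact variable and constraint sets, as in the abstract above, are of practical interest.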
Haque, Hasibul; Hill, Philip C; Gauld, Robin
2017-01-01
Against a backdrop of changing concepts of aid effectiveness, development effectiveness, health systems strengthening, and increasing emphasis on impact evaluation, this article proposes a theory-driven impact evaluation framework to gauge the effect of aid effectiveness principles on programmatic outcomes of different aid-funded programs in the health sector of a particular country. The foundation and step-by-step process of implementing the framework are described. With empirical evidence from the field, the steps involve analysis of context, program designs, implementation mechanisms, outcomes, synthesis, and interpretation of findings through the programs' underlying program theories and interactions with the state context and health system. The framework can be useful for comparatively evaluating different aid interventions in both fragile and non-fragile state contexts.
The Data-to-Action Framework: A Rapid Program Improvement Process.
Zakocs, Ronda; Hill, Jessica A; Brown, Pamela; Wheaton, Jocelyn; Freire, Kimberley E
2015-08-01
Although health education programs may benefit from quality improvement methods, scant resources exist to help practitioners apply these methods for program improvement. The purpose of this article is to describe the Data-to-Action framework, a process that guides practitioners through rapid-feedback cycles in order to generate actionable data to improve implementation of ongoing programs. The framework was designed while implementing DELTA PREP, a 3-year project aimed at building the primary prevention capacities of statewide domestic violence coalitions. The authors describe the framework's main steps and provide a case example of a rapid-feedback cycle and several examples of rapid-feedback memos produced during the project period. The authors also discuss implications for health education evaluation and practice. © 2015 Society for Public Health Education.
Zhou, Yuan; Cheng, Xinyao; Xu, Xiangyang; Song, Enmin
2013-12-01
Segmentation of carotid artery intima-media in longitudinal ultrasound images for measuring its thickness to predict cardiovascular diseases can be simplified as detecting two nearly parallel boundaries within a certain distance range, when plaque with irregular shapes is not considered. In this paper, we improve the implementation of two dynamic programming (DP) based approaches to parallel boundary detection, dual dynamic programming (DDP) and piecewise linear dual dynamic programming (PL-DDP). Then, a novel DP based approach, dual line detection (DLD), which translates the original 2-D curve position to a 4-D parameter space representing two line segments in a local image segment, is proposed to solve the problem while maintaining efficiency and rotation invariance. To apply the DLD to ultrasound intima-media segmentation, it is embedded in a framework that employs an edge map obtained from multiplication of the responses of two edge detectors with different scales and a coupled snake model that simultaneously deforms the two contours for maintaining parallelism. The experimental results on synthetic images and carotid arteries of clinical ultrasound images indicate improved performance of the proposed DLD compared to DDP and PL-DDP, with respect to accuracy and efficiency. Copyright © 2013 Elsevier B.V. All rights reserved.
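The dynamic programming core on which such boundary detectors build can be sketched as a minimal-cost left-to-right path through a cost image, one row index per column, with a smoothness constraint on vertical jumps. This is a generic single-boundary illustration, not the dual-boundary DDP or DLD formulations of the paper.

```python
import numpy as np

def dp_boundary(cost, max_jump=1):
    """Minimal-cost left-to-right boundary through a cost image:
    picks one row per column, limiting vertical moves to `max_jump`."""
    rows, cols = cost.shape
    D = np.full((rows, cols), np.inf)   # D[i, j]: best cost ending at (i, j)
    D[:, 0] = cost[:, 0]
    back = np.zeros((rows, cols), dtype=int)
    for j in range(1, cols):
        for i in range(rows):
            lo, hi = max(0, i - max_jump), min(rows, i + max_jump + 1)
            k = lo + int(np.argmin(D[lo:hi, j - 1]))
            D[i, j] = cost[i, j] + D[k, j - 1]
            back[i, j] = k
    # Trace back the optimal row index per column
    path = [int(np.argmin(D[:, -1]))]
    for j in range(cols - 1, 0, -1):
        path.append(int(back[path[-1], j]))
    return path[::-1]

# A low-cost band along row 1 should be recovered as the boundary
c = np.ones((4, 5))
c[1, :] = 0.1
print(dp_boundary(c))   # [1, 1, 1, 1, 1]
```

Dual-boundary variants such as DDP extend this state space so that two row indices are optimized per column, with a constraint keeping them within a distance range, which is what makes the intima and media boundaries come out nearly parallel.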
Zhang, Jisheng; Jia, Limin; Niu, Shuyun; Zhang, Fan; Tong, Lu; Zhou, Xuesong
2015-01-01
It is essential for transportation management centers to equip and manage a network of fixed and mobile sensors in order to quickly detect traffic incidents and further monitor the related impact areas, especially for high-impact accidents with dramatic traffic congestion propagation. As emerging small Unmanned Aerial Vehicles (UAVs) start to have a more flexible regulation environment, it is critically important to fully explore the potential of using UAVs for monitoring recurring and non-recurring traffic conditions and special events on transportation networks. This paper presents a space-time network-based modeling framework for integrated fixed and mobile sensor networks, in order to provide a rapid and systematic road traffic monitoring mechanism. By constructing a discretized space-time network to characterize not only the speed for UAVs but also the time-sensitive impact areas of traffic congestion, we formulate the problem as a linear integer programming model to minimize the detection delay cost and operational cost, subject to feasible flying route constraints. A Lagrangian relaxation solution framework is developed to decompose the original complex problem into a series of computationally efficient time-dependent and least cost path finding sub-problems. Several examples are used to demonstrate the results of proposed models in UAVs’ route planning for small and medium-scale networks. PMID:26076404
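The time-dependent least-cost path subproblem produced by such a decomposition can be sketched on a discretized space-time network; the nodes, arc costs, and waiting arc below are hypothetical and are not the paper's model.

```python
import math

def td_least_cost_path(arcs, horizon, source, sink):
    """Least-cost path in a discretized space-time network.
    arcs: dict (u, v) -> function t -> (travel_time, cost)."""
    # best[(node, t)] = minimal cost to reach `node` at time step t
    best = {(source, 0): 0.0}
    for t in range(horizon):
        for (u, v), f in arcs.items():
            if (u, t) in best:
                tt, c = f(t)           # time-dependent travel time and cost
                t2 = t + tt
                if t2 <= horizon and best[(u, t)] + c < best.get((v, t2), math.inf):
                    best[(v, t2)] = best[(u, t)] + c
    # Cheapest arrival at the sink over all time steps
    end = min((s for s in best if s[0] == sink), key=lambda s: best[s])
    return best[end], end[1]

arcs = {
    ("A", "B"): lambda t: (1, 2.0),                      # constant cost
    ("A", "C"): lambda t: (1, 1.0),
    ("C", "C"): lambda t: (1, 0.0),                      # waiting at C is free
    ("C", "B"): lambda t: (1, 0.5 if t >= 2 else 5.0),   # cheaper later
}
cost, arrival = td_least_cost_path(arcs, 5, "A", "B")
print(cost, arrival)   # 1.5 3
```

The optimum here detours through C and waits one step for the cheap arc to open, which is the kind of time-sensitive routing decision the space-time network formulation captures.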
Zhang, Liangliang; Yuan, Shuai; Feng, Liang; Guo, Bingbing; Qin, Jun-Sheng; Xu, Ben; Lollar, Christina; Sun, Daofeng; Zhou, Hong-Cai
2018-04-23
Multi-component metal-organic frameworks (MOFs) with precisely controlled pore environments are highly desired owing to their potential applications in gas adsorption, separation, cooperative catalysis, and biomimetics. A series of multi-component MOFs, namely PCN-900(RE), were constructed from a combination of tetratopic porphyrinic linkers, linear linkers, and rare-earth hexanuclear clusters (RE6) under the guidance of thermodynamics. These MOFs exhibit high surface areas (up to 2523 m² g⁻¹) and unlimited tunability by modification of metal nodes and/or linker components. Post-synthetic exchange of linear linkers and metalation of two organic linkers were realized, allowing the incorporation of a wide range of functional moieties. Two different metal sites were sequentially placed on the linear linker and the tetratopic porphyrinic linker, respectively, giving rise to an ideal platform for heterogeneous catalysis. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
A framework for the automated data-driven constitutive characterization of composites
J.G. Michopoulos; John Hermanson; T. Furukawa; A. Iliopoulos
2010-01-01
We present advances on the development of a mechatronically and algorithmically automated framework for the data-driven identification of constitutive material models based on energy density considerations. These models can capture both the linear and nonlinear constitutive response of multiaxially loaded composite materials in a manner that accounts for progressive...
A Block-LU Update for Large-Scale Linear Programming
1990-01-01
linear programming problems. Results are given from runs on the Cray Y-MP. 1. Introduction We wish to use the simplex method [Dan63] to solve the ... standard linear program: minimize cᵀx subject to Ax = b, l ≤ x ≤ u, where A is an m by n matrix and c, x, l, u, and b are of appropriate dimension. The simplex ... the identity matrix. The basis is used to solve for the search direction y and the dual variables π in the following linear systems: B_k y = a_q (1.2) and
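The two basis systems described above, one for the search direction and one for the dual variables, can be illustrated with a small numerical sketch (the basis matrix, entering column, and basic costs are made up):

```python
import numpy as np

# Hypothetical 2x2 basis B_k, entering column a_q, and basic costs c_B
B = np.array([[2.0, 1.0],
              [0.0, 3.0]])
a_q = np.array([1.0, 2.0])
c_B = np.array([4.0, 5.0])

# Search direction: solve B_k y = a_q
y = np.linalg.solve(B, a_q)
# Dual variables (simplex pricing): solve B_k^T pi = c_B
pi = np.linalg.solve(B.T, c_B)

print(y, pi)
```

In a production simplex code these systems are not solved from scratch at each iteration; instead a factorization of B_k is updated as columns enter and leave the basis, which is precisely what a block-LU update scheme provides for large-scale problems.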
A linear framework for time-scale separation in nonlinear biochemical systems.
Gunawardena, Jeremy
2012-01-01
Cellular physiology is implemented by formidably complex biochemical systems with highly nonlinear dynamics, presenting a challenge for both experiment and theory. Time-scale separation has been one of the few theoretical methods for distilling general principles from such complexity. It has provided essential insights in areas such as enzyme kinetics, allosteric enzymes, G-protein coupled receptors, ion channels, gene regulation and post-translational modification. In each case, internal molecular complexity has been eliminated, leading to rational algebraic expressions among the remaining components. This has yielded familiar formulas such as those of Michaelis-Menten in enzyme kinetics, Monod-Wyman-Changeux in allostery and Ackers-Johnson-Shea in gene regulation. Here we show that these calculations are all instances of a single graph-theoretic framework. Despite the biochemical nonlinearity to which it is applied, this framework is entirely linear, yet requires no approximation. We show that elimination of internal complexity is feasible when the relevant graph is strongly connected. The framework provides a new methodology with the potential to subdue combinatorial explosion at the molecular level.
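A minimal sketch of the framework's central computation, assuming a hypothetical three-vertex cycle with made-up rate labels: the dynamics are linear, dx/dt = Lx for a graph Laplacian L, and for a strongly connected graph the steady state spans the one-dimensional kernel of L.

```python
import numpy as np

# Hypothetical strongly connected graph (a 3-cycle) with edge labels (rates):
# 1 -a-> 2, 2 -b-> 3, 3 -c-> 1
a, b, c = 2.0, 3.0, 5.0

# Column-based graph Laplacian: dx/dt = L x, each column sums to zero
L = np.array([
    [-a,   0.0,  c  ],
    [ a,  -b,    0.0],
    [ 0.0, b,   -c  ],
])

# For a strongly connected graph ker(L) is one-dimensional; extract it via SVD
_, _, vt = np.linalg.svd(L)
x = vt[-1]
x = x / x.sum()                  # normalize to a probability-like steady state
print(np.allclose(L @ x, 0.0))   # True
```

For this cycle the steady state is proportional to (1/a, 1/b, 1/c), i.e. the flux around the cycle is constant; in the general framework the kernel basis is given by sums over spanning trees, which is where the rational algebraic expressions of Michaelis-Menten-type formulas come from.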
The Maryland Youth Suicide Prevention School Program.
ERIC Educational Resources Information Center
Maryland State Dept. of Education, Baltimore.
The Maryland State Department of Education developed this framework for a suicide prevention program. The program framework addresses the following goals: (1) increase awareness among school personnel and community leaders of the incidence of teenage suicide; (2) train school personnel in individual…
Near-miss incident management in the chemical process industry.
Phimister, James R; Oktem, Ulku; Kleindorfer, Paul R; Kunreuther, Howard
2003-06-01
This article provides a systematic framework for the analysis and improvement of near-miss programs in the chemical process industries. Near-miss programs improve corporate environmental, health, and safety (EHS) performance through the identification and management of near misses. Based on more than 100 interviews at 20 chemical and pharmaceutical facilities, a seven-stage framework has been developed and is presented herein. The framework enables sites to analyze their own near-miss programs, identify weak management links, and implement systemwide improvements.
NASA's mobile satellite communications program; ground and space segment technologies
NASA Technical Reports Server (NTRS)
Naderi, F.; Weber, W. J.; Knouse, G. H.
1984-01-01
This paper describes the Mobile Satellite Communications Program of the United States National Aeronautics and Space Administration (NASA). The program's objectives are to facilitate the deployment of the first generation commercial mobile satellite by the private sector, and to technologically enable future generations by developing advanced and high risk ground and space segment technologies. These technologies are aimed at mitigating severe shortages of spectrum, orbital slot, and spacecraft EIRP which are expected to plague the high capacity mobile satellite systems of the future. After a brief introduction of the concept of mobile satellite systems and their expected evolution, this paper outlines the critical ground and space segment technologies. Next, the Mobile Satellite Experiment (MSAT-X) is described. MSAT-X is the framework through which NASA will develop advanced ground segment technologies. An approach is outlined for the development of conformal vehicle antennas, spectrum- and power-efficient speech codecs, modulation techniques for use in non-linear faded channels, and efficient multiple-access schemes. Finally, the paper concludes with a description of the current and planned NASA activities aimed at developing complex large multibeam spacecraft antennas needed for future generation mobile satellite systems.
Xie, Y L; Li, Y P; Huang, G H; Li, Y F; Chen, L R
2011-04-15
In this study, an inexact-chance-constrained water quality management (ICC-WQM) model is developed for planning regional environmental management under uncertainty. This method is based on an integration of interval linear programming (ILP) and chance-constrained programming (CCP) techniques. ICC-WQM allows uncertainties presented as both probability distributions and interval values to be incorporated within a general optimization framework. Complexities in environmental management systems can be systematically reflected, thus applicability of the modeling process can be highly enhanced. The developed method is applied to planning chemical-industry development in Binhai New Area of Tianjin, China. Interval solutions associated with different risk levels of constraint violation have been obtained. They can be used for generating decision alternatives and thus help decision makers identify desired policies under various system-reliability constraints of water environmental capacity of pollutant. Tradeoffs between system benefits and constraint-violation risks can also be tackled. They are helpful for supporting (a) decision of wastewater discharge and government investment, (b) formulation of local policies regarding water consumption, economic development and industry structure, and (c) analysis of interactions among economic benefits, system reliability and pollutant discharges. Copyright © 2011 Elsevier B.V. All rights reserved.
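A common way a chance constraint is made deterministic, sketched here under the simplifying assumption of a normally distributed right-hand side (the paper's CCP formulation and data may differ; the capacity numbers are hypothetical): Pr(a·x ≤ b) ≥ α with b ~ Normal(μ, σ) is equivalent to a·x ≤ μ + σ·Φ⁻¹(1 − α).

```python
from statistics import NormalDist

def deterministic_rhs(mu, sigma, alpha):
    """Deterministic equivalent of the chance constraint
    Pr(a.x <= b) >= alpha with b ~ Normal(mu, sigma):
    the constraint becomes a.x <= mu + sigma * Phi^{-1}(1 - alpha)."""
    return mu + sigma * NormalDist().inv_cdf(1.0 - alpha)

# Hypothetical environmental capacity: mean 100, standard deviation 10
for alpha in (0.90, 0.95, 0.99):
    print(alpha, round(deterministic_rhs(100.0, 10.0, alpha), 2))
```

Raising the reliability level α tightens the allowable discharge, which is exactly the system-benefit versus constraint-violation-risk tradeoff the abstract describes.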
Linear System of Equations, Matrix Inversion, and Linear Programming Using MS Excel
ERIC Educational Resources Information Center
El-Gebeily, M.; Yushau, B.
2008-01-01
In this note, we demonstrate with illustrations two different ways that MS Excel can be used to solve Linear Systems of Equations, Linear Programming Problems, and Matrix Inversion Problems. The advantage of using MS Excel is its availability and transparency (the user is responsible for most of the details of how a problem is solved). Further, we…
Tackling non-linearities with the effective field theory of dark energy and modified gravity
NASA Astrophysics Data System (ADS)
Frusciante, Noemi; Papadomanolakis, Georgios
2017-12-01
We present the extension of the effective field theory framework to the mildly non-linear scales. The effective field theory approach has been successfully applied to the late time cosmic acceleration phenomenon and it has been shown to be a powerful method to obtain predictions about cosmological observables on linear scales. However, mildly non-linear scales need to be consistently considered when testing gravity theories because a large part of the data comes from those scales. Thus, non-linear corrections to predictions on observables coming from the linear analysis can help in discriminating among different gravity theories. We proceed first by identifying the necessary operators which need to be included in the effective field theory Lagrangian in order to go beyond the linear order in perturbations, and then we construct the corresponding non-linear action. Moreover, we present the complete recipe to map any single field dark energy and modified gravity model into the non-linear effective field theory framework by considering a general action in the Arnowitt-Deser-Misner formalism. In order to illustrate this recipe we proceed to map the beyond-Horndeski theory and low-energy Hořava gravity into the effective field theory formalism. As a final step we derive the fourth-order action in terms of the curvature perturbation. This allows us to identify the non-linear contributions coming from the linear-order perturbations, which at the next order act like source terms. Moreover, we confirm that the stability requirements, ensuring the positivity of the kinetic term and the speed of propagation for the scalar mode, are automatically satisfied once the viability of the theory is demanded at the linear level. The approach we present here will allow one to construct, in a model independent way, all the relevant predictions on observables at mildly non-linear scales.
High profile students’ growth of mathematical understanding in solving linear programming problems
NASA Astrophysics Data System (ADS)
Utomo; Kusmayadi, TA; Pramudya, I.
2018-04-01
Linear programming has an important role in human life. It is taught at the senior high school and college levels, and it is applied in economics, transportation, the military, and other fields. Therefore, mastering linear programming is useful preparation for life. This research describes the growth of mathematical understanding in solving linear programming problems based on the Pirie-Kieren model of the growth of understanding. Thus, this research used a qualitative approach. The subjects were students of grade XI in Salatiga city. The subjects of this study were two students who had high profiles. The researcher chose the subjects based on the growth of understanding from a test result in the classroom; the mark from the prerequisite material was ≥ 75. Both of the subjects were interviewed by the researcher to know the students’ growth of mathematical understanding in solving linear programming problems. The finding of this research showed that the subjects often folded back to the primitive knowing level before going forward to the next level. This happened because the subjects’ primitive understanding was not comprehensive.
Hybrid Metaheuristics for Solving a Fuzzy Single Batch-Processing Machine Scheduling Problem
Molla-Alizadeh-Zavardehi, S.; Tavakkoli-Moghaddam, R.; Lotfi, F. Hosseinzadeh
2014-01-01
This paper deals with the problem of minimizing the total weighted tardiness of jobs in a real-world single batch-processing machine (SBPM) scheduling problem in the presence of fuzzy due dates. First, a fuzzy mixed integer linear programming model is developed. Then, due to the complexity of the problem, which is NP-hard, we design two hybrid metaheuristics, called GA-VNS and VNS-SA, combining the advantages of the genetic algorithm (GA), variable neighborhood search (VNS), and simulated annealing (SA) frameworks. In addition, we propose three fuzzy earliest-due-date heuristics to solve the given problem. Through computational experiments with several random test problems, a robust calibration is applied to the parameters. Finally, computational results on different-scale test problems are presented to compare the proposed algorithms. PMID:24883359
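The batching and fuzzy due dates make the full model involved, but the metaheuristic core is easy to sketch. Below is a minimal simulated-annealing sketch for total weighted tardiness on a single (non-batching) machine with crisp due dates; the job data and cooling schedule are illustrative, and this is a simplification of the SA component described above, not the authors' implementation.

```python
import math
import random

def total_weighted_tardiness(seq, jobs):
    """Sum of w_j * max(0, C_j - d_j) for a job sequence on one machine."""
    t, cost = 0.0, 0.0
    for j in seq:
        p, d, w = jobs[j]
        t += p                      # completion time of job j
        cost += w * max(0.0, t - d)
    return cost

def anneal(jobs, iters=20000, t0=10.0, alpha=0.9995, seed=0):
    """Simulated annealing over job permutations using swap moves."""
    rng = random.Random(seed)
    seq = list(range(len(jobs)))
    best = cur = total_weighted_tardiness(seq, jobs)
    best_seq = seq[:]
    temp = t0
    for _ in range(iters):
        i, k = rng.sample(range(len(seq)), 2)
        seq[i], seq[k] = seq[k], seq[i]
        cand = total_weighted_tardiness(seq, jobs)
        # accept improvements always, uphill moves with Boltzmann probability
        if cand <= cur or rng.random() < math.exp((cur - cand) / temp):
            cur = cand
            if cur < best:
                best, best_seq = cur, seq[:]
        else:
            seq[i], seq[k] = seq[k], seq[i]  # undo rejected move
        temp *= alpha
    return best_seq, best

# jobs: (processing time, due date, weight) -- illustrative data
jobs = [(3, 4, 2), (2, 2, 1), (4, 12, 3), (1, 3, 4)]
order, cost = anneal(jobs)
```

Swap neighborhoods are the simplest choice; the VNS component of the hybrids would systematically vary the neighborhood structure instead.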
Full three-body problem in effective-field-theory models of gravity
NASA Astrophysics Data System (ADS)
Battista, Emmanuele; Esposito, Giampiero
2014-10-01
Recent work in the literature has studied the restricted three-body problem within the framework of effective-field-theory models of gravity. This paper extends such a program by considering the full three-body problem, when the Newtonian potential is replaced by a more general central potential which depends on the mutual separations of the three bodies. The general form of the equations of motion is written down, and they are studied when the interaction potential reduces to the quantum-corrected central potential considered recently in the literature. A recursive algorithm is found for solving the associated variational equations, which describe small departures from given periodic solutions of the equations of motion. Our scheme involves repeated application of a 2×2 matrix of first-order linear differential operators.
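The full three-body equations of motion with a modified central potential can be integrated numerically. The sketch below uses velocity-Verlet with a hypothetical correction term k/r^2 added to the Newtonian potential; the coefficient k, masses, and initial conditions are illustrative and are not values from the paper.

```python
import numpy as np

def accelerations(x, m, G=1.0, k=1e-3):
    """Pairwise accelerations for V = -G m_i m_j (1/r + k/r^2).
    k is an illustrative correction coefficient, not a derived value."""
    n = len(m)
    a = np.zeros_like(x)
    for i in range(n):
        for j in range(n):
            if i != j:
                r = x[j] - x[i]
                d = np.linalg.norm(r)
                # -dV/dr yields force magnitude G m_i m_j (1/d^2 + 2k/d^3)
                a[i] += G * m[j] * (1.0 / d**3 + 2.0 * k / d**4) * r
    return a

def verlet(x, v, m, dt=1e-3, steps=500):
    """Velocity-Verlet integration of the full three-body equations."""
    a = accelerations(x, m)
    for _ in range(steps):
        x = x + v * dt + 0.5 * a * dt**2
        a_new = accelerations(x, m)
        v = v + 0.5 * (a + a_new) * dt
        a = a_new
    return x, v

m = np.array([1.0, 1.0, 1.0])
x0 = np.array([[1.0, 0.0], [-0.5, 0.8], [-0.5, -0.8]])   # rough triangle
v0 = np.array([[0.0, 0.3], [-0.26, -0.15], [0.26, -0.15]])  # mild rotation
xf, vf = verlet(x0, v0, m)
```

Because the pairwise forces are antisymmetric, total linear momentum is conserved, which gives a quick sanity check on the integration.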
SCM: A method to improve network service layout efficiency with network evolution.
Zhao, Qi; Zhang, Chuanhao; Zhao, Zheng
2017-01-01
Network services are an important component of the Internet and are used to expand network functions for third-party developers. Network function virtualization (NFV) can improve the speed and flexibility of network service deployment. However, as the network evolves, the network service layout may become inefficient. To address this problem, this paper proposes a service chain migration (SCM) method within the "software defined network + network function virtualization" (SDN+NFV) framework, which migrates service chains to adapt to network evolution and improves the efficiency of the network service layout. SCM is modeled as an integer linear programming problem and solved via particle swarm optimization. An SCM prototype system is designed based on an SDN controller. Experiments demonstrate that SCM reduces network traffic cost and energy consumption efficiently.
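As a rough illustration of the optimization step, the sketch below implements a generic particle swarm optimizer on a stand-in continuous cost function; the actual SCM objective and integer constraints are defined in the paper, and the target placement used here is hypothetical.

```python
import numpy as np

def pso(cost, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Basic particle swarm optimization of a continuous cost function.
    Integer placements (as in the ILP) would be recovered by rounding."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_val = np.apply_along_axis(cost, 1, x)
    g = pbest[pbest_val.argmin()].copy()          # global best position
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # inertia + attraction to personal best + attraction to global best
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        vals = np.apply_along_axis(cost, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# stand-in cost: squared distance to a hypothetical optimal placement
target = np.array([1.0, -2.0, 0.5])
best, val = pso(lambda z: float(((z - target) ** 2).sum()), dim=3)
```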
A spatial operator algebra for manipulator modeling and control
NASA Technical Reports Server (NTRS)
Rodriguez, G.; Kreutz, K.; Jain, A.
1989-01-01
A spatial operator algebra for the modeling, control, and trajectory design of manipulators is discussed, with emphasis on its analytical formulation and implementation in the Ada programming language. The elements of this algebra are linear operators whose domain and range spaces consist of forces, moments, velocities, and accelerations. The effect of these operators is equivalent to a spatial recursion along the span of the manipulator. Inversion is obtained using techniques of recursive filtering and smoothing. The operator algebra provides a high-level framework for describing the dynamic and kinematic behavior of a manipulator and control and trajectory design algorithms. Implementable recursive algorithms can be immediately derived from the abstract operator expressions by inspection, thus greatly simplifying the transition from an abstract problem formulation and solution to the detailed mechanization of a specific algorithm.
Can Linear Superiorization Be Useful for Linear Optimization Problems?
Censor, Yair
2017-01-01
Linear superiorization considers linear programming problems but instead of attempting to solve them with linear optimization methods it employs perturbation resilient feasibility-seeking algorithms and steers them toward reduced (not necessarily minimal) target function values. The two questions that we set out to explore experimentally are (i) Does linear superiorization provide a feasible point whose linear target function value is lower than that obtained by running the same feasibility-seeking algorithm without superiorization under identical conditions? and (ii) How does linear superiorization fare in comparison with the Simplex method for solving linear programming problems? Based on our computational experiments presented here, the answers to these two questions are: “yes” and “very well”, respectively. PMID:29335660
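A minimal sketch of the idea, assuming a feasibility-seeking method based on sequential orthogonal projections onto half-spaces, interleaved with summable perturbation steps along the negative direction of the linear target function. The toy LP below is illustrative, not one of the experiments reported.

```python
import numpy as np

def project_halfspace(x, a, beta):
    """Orthogonal projection of x onto the half-space {y : a.y <= beta}."""
    viol = a @ x - beta
    return x - (viol / (a @ a)) * a if viol > 0 else x

def feasibility_step(x, A, b):
    """One sweep of sequential projections (a POCS-type feasibility step)."""
    for a, beta in zip(A, b):
        x = project_halfspace(x, a, beta)
    return x

def linear_superiorization(A, b, c, x0, sweeps=500, alpha=0.99):
    """Steer the feasibility-seeking iterates toward smaller c.x using
    summable perturbations of size alpha**k along -c."""
    x = x0.astype(float)
    step = 1.0
    d = -c / np.linalg.norm(c)
    for _ in range(sweeps):
        x = x + step * d          # superiorization perturbation
        step *= alpha             # summable step sizes
        x = feasibility_step(x, A, b)
    return x

# toy LP: minimize x + y subject to x >= 0, y >= 0, x + y >= 1
A = np.array([[-1.0, 0.0], [0.0, -1.0], [-1.0, -1.0]])
b = np.array([0.0, 0.0, -1.0])
c = np.array([1.0, 1.0])
x = linear_superiorization(A, b, c, np.array([2.0, 2.0]))
```

The perturbations reduce the target value while the projections restore feasibility, so the iterates settle near the facet x + y = 1 with a target value close to the LP optimum of 1.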
NASA Technical Reports Server (NTRS)
Egebrecht, R. A.; Thorbjornsen, A. R.
1967-01-01
Digital computer programs determine the steady-state performance characteristics of active and passive linear circuits. The ac analysis program solves for the basic circuit parameters. The compiler program solves for these circuit parameters and, in addition, provides a more versatile tool by allowing the user to perform mathematical and logical operations.
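As a stand-in for the kind of steady-state computation such an ac analysis program performs, the sketch below evaluates the phasor response of a simple RC low-pass divider; the component values are illustrative.

```python
import numpy as np

def rc_lowpass_response(R, C, freq_hz, vin=1.0):
    """Steady-state phasor analysis of a series-R, shunt-C divider:
    Vout = Vin * Zc / (R + Zc) with Zc = 1/(j*w*C)."""
    w = 2 * np.pi * freq_hz
    zc = 1.0 / (1j * w * C)
    return vin * zc / (R + zc)

# cutoff frequency is 1/(2*pi*R*C); evaluate the response exactly there
R, C = 1e3, 1e-6
vout = rc_lowpass_response(R, C, freq_hz=1.0 / (2 * np.pi * R * C))
gain_db = 20 * np.log10(abs(vout))          # expect about -3 dB
phase_deg = np.degrees(np.angle(vout))      # expect about -45 degrees
```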
Students' Use of Computational Thinking in Linear Algebra
ERIC Educational Resources Information Center
Bagley, Spencer; Rabin, Jeffrey M.
2016-01-01
In this work, we examine students' ways of thinking when presented with a novel linear algebra problem. Our intent was to explore how students employ and coordinate three modes of thinking, which we call computational, abstract, and geometric, following similar frameworks proposed by Hillel (2000) and Sierpinska (2000). However, the undergraduate…
Re-Mediating Classroom Activity with a Non-Linear, Multi-Display Presentation Tool
ERIC Educational Resources Information Center
Bligh, Brett; Coyle, Do
2013-01-01
This paper uses an Activity Theory framework to evaluate the use of a novel, multi-screen, non-linear presentation tool. The Thunder tool allows presenters to manipulate and annotate multiple digital slides and to concurrently display a selection of juxtaposed resources across a wall-sized projection area. Conventional, single screen presentation…
A picture of Indian adolescent mental health: an analysis from three urban secondary schools.
Long, Katelyn N G; Gren, Lisa H; Long, Paul M; Jaggi, Rachel; Banik, Srabani; Mihalopoulos, Nicole L
2017-08-01
Purpose: Mental health disorders are a pressing issue among adolescents around the world, including in India. A better understanding of the factors related to poor mental health will allow for more effective and targeted interventions for Indian adolescents. Methods: The Indian Adolescent Health Questionnaire (IAHQ), a validated questionnaire designed specifically for use in schools, was administered to approximately 1500 secondary students in three private urban Indian schools in 2012. The Strengths and Difficulties Questionnaire (SDQ) module assessed mental health. Linear regression was used to predict SDQ scores. The biopsychosocial framework was used as an organizing framework to understand how each explanatory variable in the final model might impact the SDQ score. Results: One thousand four hundred and eight students returned IAHQ surveys (93.9% response rate); 1102 students completed questions for inclusion in the regression model (78.3% inclusion rate). Statistically significant (p < 0.05) independent variables associated with SDQ scores were gender, level of overall health, negative peer pressure, insults from peers, kindness of peers, feeling safe at home, at school, or with friends, and grades. Discussion: Schools have a role to play in improving adolescent mental health. Many of the significant variables in our study can be addressed in the school environment through school-wide, long-term programs utilizing teachers and lay counselors. The IAHQ and SDQ can be used by schools to identify factors that contribute to poor mental health among students and then develop targeted programs to support improved mental health.
NASA Technical Reports Server (NTRS)
Ortiz, James N.; Scott, Kelly; Smith, Harold
2004-01-01
The assembly and operation of the ISS has generated significant challenges that have ultimately impacted resources available to the program's primary mission: research. To address this, program personnel routinely perform trade-off studies on alternative options to enhance research. The approach, content level of analysis and resulting outputs of these studies vary due to many factors, however, complicating the Program Manager's job of selecting the best option. To address this, the program requested a framework be developed to evaluate multiple research-enhancing options in a thorough, disciplined and repeatable manner, and to identify the best option on the basis of cost, benefit and risk. The resulting framework consisted of a systematic methodology and a decision-support toolset. The framework provides quantifiable and repeatable means for ranking research-enhancing options for the complex and multiple-constraint domain of the space research laboratory. This paper describes the development, verification and validation of this framework and provides observations on its operational use.
A Planning Framework for Crafting the Required-Curriculum Phase of an MBA Program
ERIC Educational Resources Information Center
Haskins, Mark E.
2005-01-01
This article introduces a planning framework for designing that part of an MBA program during which students take the bulk, if not all, of their required courses. The framework highlights three student venues that can be jointly leveraged for enhanced student learning. Those venues are the required curriculum, students' affinity groups, and the…
A framework for evaluating and designing citizen science programs for natural resources monitoring.
Chase, Sarah K; Levine, Arielle
2016-06-01
We present a framework of resource characteristics critical to the design and assessment of citizen science programs that monitor natural resources. To develop the framework we reviewed 52 citizen science programs that monitored a wide range of resources and provided insights into what resource characteristics are most conducive to developing citizen science programs and how resource characteristics may constrain the use or growth of these programs. We focused on 4 types of resource characteristics: biophysical and geographical, management and monitoring, public awareness and knowledge, and social and cultural characteristics. We applied the framework to 2 programs, the Tucson (U.S.A.) Bird Count and the Maui (U.S.A.) Great Whale Count. We found that resource characteristics such as accessibility, diverse institutional involvement in resource management, and social or cultural importance of the resource affected program endurance and success. However, the relative influence of each characteristic was in turn affected by goals of the citizen science programs. Although the goals of public engagement and education sometimes complimented the goal of collecting reliable data, in many cases trade-offs must be made between these 2 goals. Program goals and priorities ultimately dictate the design of citizen science programs, but for a program to endure and successfully meet its goals, program managers must consider the diverse ways that the nature of the resource being monitored influences public participation in monitoring. © 2016 Society for Conservation Biology.
A framework for telehealth program evaluation.
Nepal, Surya; Li, Jane; Jang-Jaccard, Julian; Alem, Leila
2014-04-01
Evaluating telehealth programs is a challenging task, yet it is the most sensible first step when embarking on a telehealth study. How can we frame and report on telehealth studies? What are the health services elements to select based on the application needs? What are the appropriate terms to use to refer to such elements? Various frameworks have been proposed in the literature to answer these questions, and each framework is defined by a set of properties covering different aspects of telehealth systems. The most common properties include application, technology, and functionality. With the proliferation of telehealth, it is important not only to understand these properties, but also to define new properties to account for a wider range of context of use and evaluation outcomes. This article presents a comprehensive framework for delivery design, implementation, and evaluation of telehealth services. We first survey existing frameworks proposed in the literature and then present our proposed comprehensive multidimensional framework for telehealth. Six key dimensions of the proposed framework include health domains, health services, delivery technologies, communication infrastructure, environment setting, and socioeconomic analysis. We define a set of example properties for each dimension. We then demonstrate how we have used our framework to evaluate telehealth programs in rural and remote Australia. A few major international studies have been also mapped to demonstrate the feasibility of the framework. The key characteristics of the framework are as follows: (a) loosely coupled and hence easy to use, (b) provides a basis for describing a wide range of telehealth programs, and (c) extensible to future developments and needs.
E2E: A Summary of the e2e Learning Framework.
ERIC Educational Resources Information Center
Learning and Skills Development Agency, London (England).
This publication is a summary of the E2E (Entry to Employment) Learning Framework that provides guidance on program implementation. (E2E is a new learning program for young people not yet ready or able to enter Modern Apprenticeship programs, a Level 2 program, or employment directly.) Section 2 highlights core values to which all involved should…
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the brick, block, and stonemasonry program. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies for the…
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the child development technology programs cluster. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies,…
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the fashion marketing technology programs cluster. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies,…
Leveraging Competency Framework to Improve Teaching and Learning: A Methodological Approach
ERIC Educational Resources Information Center
Shankararaman, Venky; Ducrot, Joelle
2016-01-01
A number of engineering education programs have defined learning outcomes and course-level competencies, and conducted assessments at the program level to determine areas for continuous improvement. However, many of these programs have not implemented a comprehensive competency framework to support the actual delivery and assessment of an…
ERIC Educational Resources Information Center
King, Gillian; Currie, Melissa; Smith, Linda; Servais, Michelle; McDougall, Janette
2008-01-01
A framework of operating models for interdisciplinary research programs in clinical service organizations is presented, consisting of a "clinician-researcher" skill development model, a program evaluation model, a researcher-led knowledge generation model, and a knowledge conduit model. Together, these models comprise a tailored, collaborative…
ERIC Educational Resources Information Center
Pitas, Nicholas; Murray, Alison; Olsen, Max; Graefe, Alan
2017-01-01
This article describes a modified importance-performance framework for use in evaluation of recreation-based experiential learning programs. Importance-performance analysis (IPA) provides an effective and readily applicable means of evaluating many programs, but the near universal satisfaction associated with recreation inhibits the use of IPA in…
NASA Technical Reports Server (NTRS)
Barry, Matthew R.
2006-01-01
The X-Windows Socket Widget Class ("Class" is used here in the object-oriented-programming sense of the word) was devised to simplify the task of implementing network connections for graphical-user-interface (GUI) computer programs. UNIX Transmission Control Protocol/Internet Protocol (TCP/IP) socket programming libraries require many method calls to configure, operate, and destroy sockets. Most X Windows GUI programs use widget sets or toolkits to facilitate management of complex objects. The widget standards facilitate construction of toolkits and application programs. The X-Windows Socket Widget Class encapsulates UNIX TCP/IP socket-management tasks within the framework of an X Windows widget. Using the widget framework, X Windows GUI programs can treat one or more network socket instances in the same manner as other graphical widgets, making it easier to program sockets. Wrapping TCP/IP socket programming libraries inside a widget framework enables a programmer to treat a network interface as though it were a GUI widget.
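A rough Python analogue of the idea, not the actual X-Windows widget API: a small class that encapsulates socket management behind a widget-like interface with a registered receive callback. All names here are hypothetical, and a local socket pair stands in for a TCP connection.

```python
import socket

class SocketWidget:
    """Minimal analogue of a GUI socket widget: wraps a connected socket
    and delivers received data to a registered callback. (Hypothetical
    API -- the real class is an X Windows widget implemented in C.)"""

    def __init__(self, sock):
        self.sock = sock
        self.on_receive = None

    def set_receive_callback(self, fn):
        self.on_receive = fn

    def send(self, data):
        self.sock.sendall(data)

    def poll(self, bufsize=4096):
        """In a GUI main loop this would be driven by the toolkit's
        file-descriptor watcher; here we read once, synchronously."""
        data = self.sock.recv(bufsize)
        if data and self.on_receive:
            self.on_receive(data)

    def destroy(self):
        self.sock.close()

# demo: a local socket pair stands in for a TCP connection
a, b = socket.socketpair()
received = []
widget = SocketWidget(a)
widget.set_receive_callback(received.append)
b.sendall(b"telemetry frame")
widget.poll()
widget.destroy()
b.close()
```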
Kannan, R; Ievlev, A V; Laanait, N; Ziatdinov, M A; Vasudevan, R K; Jesse, S; Kalinin, S V
2018-01-01
Many spectral responses in materials science, physics, and chemistry experiments can be characterized as resulting from the superposition of a number of more basic individual spectra. In this context, unmixing is defined as the problem of determining the individual spectra, given measurements of multiple spectra that are spatially resolved across samples, as well as the determination of the corresponding abundance maps indicating the local weighting of each individual spectrum. Matrix factorization is a popular linear unmixing technique that considers that the mixture model between the individual spectra and the spatial maps is linear. Here, we present a tutorial paper targeted at domain scientists to introduce linear unmixing techniques, to facilitate greater understanding of spectroscopic imaging data. We detail a matrix factorization framework that can incorporate different domain information through various parameters of the matrix factorization method. We demonstrate many domain-specific examples to explain the expressivity of the matrix factorization framework and show how the appropriate use of domain-specific constraints such as non-negativity and sum-to-one abundance result in physically meaningful spectral decompositions that are more readily interpretable. Our aim is not only to explain the off-the-shelf available tools, but to add additional constraints when ready-made algorithms are unavailable for the task. All examples use the scalable open source implementation from https://github.com/ramkikannan/nmflibrary that can run from small laptops to supercomputers, creating a user-wide platform for rapid dissemination and adoption across scientific disciplines.
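A minimal sketch of linear unmixing by matrix factorization, assuming the standard Lee-Seung multiplicative updates for the Frobenius objective rather than the nmflibrary API; the sum-to-one abundance fractions are obtained by a final normalization, and the synthetic spectra are illustrative.

```python
import numpy as np

def unmix_nmf(V, k, iters=500, seed=0, eps=1e-9):
    """Factor V (pixels x channels) as A @ S with A, S >= 0 using
    multiplicative updates for the Frobenius objective."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    A = rng.random((n, k))          # abundance maps
    S = rng.random((k, m))          # endmember spectra
    for _ in range(iters):
        S *= (A.T @ V) / (A.T @ A @ S + eps)
        A *= (V @ S.T) / (A @ S @ S.T + eps)
    # resolve the scale ambiguity: unit-sum spectra, compensate in A
    s = S.sum(axis=1, keepdims=True) + eps
    S /= s
    A *= s.T
    # per-pixel abundance fractions (sum-to-one, for interpretation)
    frac = A / (A.sum(axis=1, keepdims=True) + eps)
    return A, S, frac

# synthetic data: 40 pixels mixing two Gaussian-shaped spectra
t = np.linspace(0, 1, 50)
spectra = np.vstack([np.exp(-((t - 0.3) / 0.05) ** 2),
                     np.exp(-((t - 0.7) / 0.05) ** 2)])
weights = np.random.default_rng(1).random((40, 2))
V = weights @ spectra
A, S, frac = unmix_nmf(V, k=2)
```

The non-negativity of both factors is what makes the recovered spectra and abundance maps physically interpretable, as the tutorial emphasizes.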
Linear Programming and Its Application to Pattern Recognition Problems
NASA Technical Reports Server (NTRS)
Omalley, M. J.
1973-01-01
Linear programming and linear-programming-like techniques as applied to pattern recognition problems are discussed. Three relatively recent research articles on such applications are summarized. The main results of each paper are described, indicating the theoretical tools needed to obtain them. A synopsis of the author's comments is presented regarding the applicability or non-applicability of his methods to particular problems, including computational results wherever given.
Two Computer Programs for the Statistical Evaluation of a Weighted Linear Composite.
ERIC Educational Resources Information Center
Sands, William A.
1978-01-01
Two computer programs (one batch, one interactive) are designed to provide statistics for a weighted linear combination of several component variables. Both programs provide mean, variance, standard deviation, and a validity coefficient. (Author/JKS)
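The statistics those programs report can be sketched in a few lines. The sketch below assumes raw component scores and an external criterion variable, both synthetic; it is an illustration of the computation, not the original programs.

```python
import numpy as np

def composite_stats(X, w, criterion):
    """Statistics for the weighted linear composite c = X @ w:
    mean, variance, standard deviation, and validity coefficient
    (Pearson correlation of the composite with a criterion score)."""
    c = X @ w
    var = c.var(ddof=1)
    validity = np.corrcoef(c, criterion)[0, 1]
    return {"mean": c.mean(), "variance": var,
            "sd": np.sqrt(var), "validity": validity}

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 3))        # three component variables
w = np.array([0.5, 0.3, 0.2])        # composite weights
# synthetic criterion: the composite plus a little noise
criterion = X @ w + rng.normal(scale=0.1, size=100)
stats = composite_stats(X, w, criterion)
```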
Calculating Higher-Order Moments of Phylogenetic Stochastic Mapping Summaries in Linear Time.
Dhar, Amrit; Minin, Vladimir N
2017-05-01
Stochastic mapping is a simulation-based method for probabilistically mapping substitution histories onto phylogenies according to continuous-time Markov models of evolution. This technique can be used to infer properties of the evolutionary process on the phylogeny and, unlike parsimony-based mapping, conditions on the observed data to randomly draw substitution mappings that do not necessarily require the minimum number of events on a tree. Most stochastic mapping applications simulate substitution mappings only to estimate the mean and/or variance of two commonly used mapping summaries: the number of particular types of substitutions (labeled substitution counts) and the time spent in a particular group of states (labeled dwelling times) on the tree. Fast, simulation-free algorithms for calculating the mean of stochastic mapping summaries exist. Importantly, these algorithms scale linearly in the number of tips/leaves of the phylogenetic tree. However, to our knowledge, no such algorithm exists for calculating higher-order moments of stochastic mapping summaries. We present one such simulation-free dynamic programming algorithm that calculates prior and posterior mapping variances and scales linearly in the number of phylogeny tips. Our procedure suggests a general framework that can be used to efficiently compute higher-order moments of stochastic mapping summaries without simulations. We demonstrate the usefulness of our algorithm by extending previously developed statistical tests for rate variation across sites and for detecting evolutionarily conserved regions in genomic sequences.
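The paper's algorithm handles higher-order moments on full phylogenies; the sketch below illustrates only the simplest mapping summary, the prior mean dwelling time for a two-state chain on a single branch, checking a closed-form expression against simulation. The rates and branch length are illustrative.

```python
import math
import random

def mean_dwell_closed_form(a, b, T):
    """Expected time spent in state 0 over [0, T], starting in state 0,
    for the two-state chain with rates a (0->1) and b (1->0). Using
    exp(Qt) = Pi + exp(-(a+b)t) (I - Pi) with pi0 = b/(a+b), integrating
    the (0,0) entry gives pi0*T + (1-pi0)(1 - exp(-(a+b)T))/(a+b)."""
    pi0 = b / (a + b)
    return pi0 * T + (1 - pi0) * (1 - math.exp(-(a + b) * T)) / (a + b)

def mean_dwell_mc(a, b, T, n=100000, seed=0):
    """Monte Carlo estimate by simulating the jump chain."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        t, state = 0.0, 0
        while t < T:
            rate = a if state == 0 else b
            wait = rng.expovariate(rate)
            if state == 0:
                total += min(wait, T - t)   # dwelling time in state 0
            t += wait
            state = 1 - state
    return total / n

a_rate, b_rate, T = 1.0, 2.0, 3.0
exact = mean_dwell_closed_form(a_rate, b_rate, T)
approx = mean_dwell_mc(a_rate, b_rate, T)
```

The point of the simulation-free algorithms is precisely to replace the Monte Carlo estimate with exact recursions like the closed form above, extended to whole trees and to second moments.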
A general science-based framework for dynamical spatio-temporal models
Wikle, C.K.; Hooten, M.B.
2010-01-01
Spatio-temporal statistical models are increasingly being used across a wide variety of scientific disciplines to describe and predict spatially explicit processes that evolve over time. Correspondingly, in recent years there has been a significant amount of research on new statistical methodology for such models. Although descriptive models that approach the problem from the second-order (covariance) perspective are important, and innovative work is being done in this regard, many real-world processes are dynamic, and it can be more efficient in some cases to characterize the associated spatio-temporal dependence by the use of dynamical models. The chief challenge with the specification of such dynamical models has been related to the curse of dimensionality. Even in fairly simple linear, first-order Markovian, Gaussian error settings, statistical models are often overparameterized. Hierarchical models have proven invaluable in their ability to deal with this issue to some extent by allowing dependency among groups of parameters. In addition, this framework has allowed for the specification of science-based parameterizations (and associated prior distributions) in which classes of deterministic dynamical models (e.g., partial differential equations (PDEs), integro-difference equations (IDEs), matrix models, and agent-based models) are used to guide specific parameterizations. Most of the focus for the application of such models in statistics has been on the linear case. The problems mentioned above with linear dynamic models are compounded in the case of nonlinear models. In this sense, the need for coherent and sensible model parameterizations is not only helpful, it is essential. Here, we present an overview of a framework for incorporating scientific information to motivate dynamical spatio-temporal models. First, we illustrate the methodology with the linear case. We then develop a general nonlinear spatio-temporal framework that we call general quadratic nonlinearity and demonstrate that it accommodates many different classes of scientific-based parameterizations as special cases. The model is presented in a hierarchical Bayesian framework and is illustrated with examples from ecology and oceanography. © 2010 Sociedad de Estadística e Investigación Operativa.
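A minimal sketch of the linear case described above: a first-order Markovian dynamic model whose propagator comes from a science-based parameterization, here a finite-difference discretization of the heat equation on a 1-D lattice. The lattice size, diffusion parameter, and noise level are illustrative.

```python
import numpy as np

def diffusion_propagator(n, kappa=0.25):
    """Transition matrix M from an explicit finite-difference
    discretization of the heat equation (a PDE-motivated
    parameterization of the dynamics)."""
    M = (1 - 2 * kappa) * np.eye(n)
    idx = np.arange(n - 1)
    M[idx, idx + 1] = kappa
    M[idx + 1, idx] = kappa
    return M

def simulate(n=50, steps=100, noise_sd=0.01, seed=0):
    """First-order linear dynamic model: x_t = M x_{t-1} + eta_t."""
    rng = np.random.default_rng(seed)
    M = diffusion_propagator(n)
    x = np.zeros(n)
    x[n // 2] = 1.0                      # point release in the middle
    path = [x.copy()]
    for _ in range(steps):
        x = M @ x + rng.normal(scale=noise_sd, size=n)
        path.append(x.copy())
    return np.array(path)

path = simulate()
```

Parameterizing M through a single diffusion parameter, rather than estimating all n-squared entries, is exactly the kind of dimension reduction the hierarchical, science-based approach provides.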
Building Campus Communities Inclusive of International Students: A Framework for Program Development
ERIC Educational Resources Information Center
Jameson, Helen Park; Goshit, Sunday
2017-01-01
This chapter provides readers with a practical, how-to approach and framework for developing inclusive, intercultural training programs for student affairs professionals on college campuses in the United States.
The RANDOM computer program: A linear congruential random number generator
NASA Technical Reports Server (NTRS)
Miles, R. F., Jr.
1986-01-01
The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) the RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) the RANCYCLE and ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
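A Python sketch of the kind of generator the report implements in FORTRAN. The parameters below are the Park-Miller "minimal standard" (a = 16807, m = 2^31 - 1), a widely used choice for small machines, not necessarily the parameters selected in the report.

```python
class LCG:
    """Lehmer-style linear congruential generator x_{k+1} = a*x_k mod m."""

    def __init__(self, seed=1, a=16807, m=2**31 - 1):
        self.state, self.a, self.m = seed, a, m

    def next_int(self):
        self.state = (self.a * self.state) % self.m
        return self.state

    def next_float(self):
        """Uniform variate in (0, 1)."""
        return self.next_int() / self.m

# Park and Miller's published self-check: starting from seed 1,
# the state after 10000 steps should be 1043618065
rng = LCG(seed=1)
for _ in range(10000):
    rng.next_int()
check = rng.state
```

The 10000-step check value plays the same role as the parameter-testing programs in the report: it verifies that a given implementation of the chosen (a, m) pair is exact.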
Balfour, Margaret E; Tanner, Kathleen; Jurica, Paul J; Rhoads, Richard; Carson, Chris A
2016-01-01
Crisis and emergency psychiatric services are an integral part of the healthcare system, yet there are no standardized measures for programs providing these services. We developed the Crisis Reliability Indicators Supporting Emergency Services (CRISES) framework to create measures that inform internal performance improvement initiatives and allow comparison across programs. The framework consists of two components-the CRISES domains (timely, safe, accessible, least-restrictive, effective, consumer/family centered, and partnership) and the measures supporting each domain. The CRISES framework provides a foundation for development of standardized measures for the crisis field. This will become increasingly important as pay-for-performance initiatives expand with healthcare reform.
Dimension Reduction With Extreme Learning Machine.
Kasun, Liyanaarachchi Lekamalage Chamara; Yang, Yan; Huang, Guang-Bin; Zhang, Zhengyou
2016-08-01
Data may often contain noise or irrelevant information, which negatively affects the generalization capability of machine learning algorithms. The objective of dimension reduction algorithms, such as principal component analysis (PCA), non-negative matrix factorization (NMF), random projection (RP), and the auto-encoder (AE), is to reduce the noise or irrelevant information in the data. The features of PCA (eigenvectors) and the linear AE are not able to represent data as parts (e.g., the nose in a face image). On the other hand, NMF and the non-linear AE are hampered by slow learning speed, and RP only represents a subspace of the original data. This paper introduces a dimension reduction framework which to some extent represents data as parts, has fast learning speed, and learns the between-class scatter subspace. To this end, this paper investigates a linear and non-linear dimension reduction framework referred to as the extreme learning machine AE (ELM-AE) and sparse ELM-AE (SELM-AE). In contrast to the tied-weight AE, the hidden neurons in ELM-AE and SELM-AE need not be tuned, and their parameters (e.g., input weights in additive neurons) are initialized using orthogonal and sparse random weights, respectively. Experimental results on the USPS handwritten digit recognition data set, CIFAR-10 object recognition, and the NORB object recognition data set show the efficacy of linear and non-linear ELM-AE and SELM-AE in terms of discriminative capability, sparsity, training time, and normalized mean square error.
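A numpy sketch of the ELM-AE idea as described: random orthogonal input weights and biases that are never tuned, a sigmoid hidden layer, and output weights obtained in closed form by ridge regression. The layer sizes and regularization constant are illustrative.

```python
import numpy as np

def elm_ae(X, n_hidden, C=1e3, seed=0):
    """ELM autoencoder: random orthogonal input weights and biases
    (not tuned), output weights beta solved by ridge regression."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = np.linalg.qr(rng.normal(size=(d, n_hidden)))[0]  # orthogonal cols
    b = rng.normal(size=n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))               # sigmoid hidden layer
    # beta = (H'H + I/C)^-1 H'X reconstructs X from H in closed form
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ X)
    return beta

def embed(X, beta):
    """Dimension reduction: project the data onto the learned weights."""
    return X @ beta.T

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))
beta = elm_ae(X, n_hidden=5)
Z = embed(X, beta)      # 200 points reduced from 10 to 5 dimensions
```

The absence of iterative weight tuning is what gives ELM-AE its fast learning speed relative to the back-propagation-trained AE.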
Accommodation of practical constraints by a linear programming jet select. [for Space Shuttle
NASA Technical Reports Server (NTRS)
Bergmann, E.; Weiler, P.
1983-01-01
An experimental spacecraft control system will be incorporated into the Space Shuttle flight software and exercised during a forthcoming mission to evaluate its performance and handling qualities. The control system incorporates a 'phase space' control law to generate rate change requests and a linear programming jet select to compute jet firings. Posed as a linear programming problem, jet selection must represent the rate change request as a linear combination of jet acceleration vectors where the coefficients are the jet firing times, while minimizing the fuel expended in satisfying that request. This problem is solved in real time using a revised Simplex algorithm. In order to implement the jet selection algorithm in the Shuttle flight control computer, it was modified to accommodate certain practical features of the Shuttle such as limited computer throughput, lengthy firing times, and a large number of control jets. To the authors' knowledge, this is the first such application of linear programming. It was made possible by careful consideration of the jet selection problem in terms of the properties of linear programming and the Simplex algorithm. These modifications to the jet select algorithm may be useful for the design of reaction-controlled spacecraft.
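A toy version of the jet-selection problem: represent a two-axis rate change request as a nonnegative combination of jet acceleration vectors while minimizing total firing time. Since an optimal LP solution lies at a basic feasible solution, the tiny case below simply enumerates two-jet bases; the flight implementation uses a revised Simplex instead, and the jet data here are illustrative.

```python
import itertools
import numpy as np

def jet_select(A, b):
    """Pick firing times t >= 0 with A @ t = b minimizing sum(t).
    For this toy 2-DOF case we enumerate all 2-jet bases, relying on
    the LP fact that an optimum occurs at a basic feasible solution."""
    n = A.shape[1]
    best_t, best_cost = None, np.inf
    for i, j in itertools.combinations(range(n), 2):
        B = A[:, [i, j]]
        if abs(np.linalg.det(B)) < 1e-12:
            continue                      # singular basis, skip
        tb = np.linalg.solve(B, b)
        if (tb >= -1e-12).all():          # basic *feasible* solution
            cost = tb.sum()
            if cost < best_cost:
                best_cost = cost
                best_t = np.zeros(n)
                best_t[[i, j]] = np.clip(tb, 0.0, None)
    return best_t, best_cost

# columns: per-jet acceleration in (pitch, yaw); values illustrative
A = np.array([[1.0, 0.0, -1.0, 0.5],
              [0.0, 1.0, 0.5, -1.0]])
b = np.array([0.4, 0.6])   # requested rate change
t, fuel = jet_select(A, b)
```

With uniform fuel rates, minimizing total firing time is the fuel-minimizing objective described in the abstract; a per-jet fuel-rate weighting would simply replace `tb.sum()`.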
A quasi-likelihood approach to non-negative matrix factorization
Devarajan, Karthik; Cheung, Vincent C.K.
2017-01-01
A unified approach to non-negative matrix factorization based on the theory of generalized linear models is proposed. This approach embeds a variety of statistical models, including the exponential family, within a single theoretical framework and provides a unified view of such factorizations from the perspective of quasi-likelihood. Using this framework, a family of algorithms for handling signal-dependent noise is developed and its convergence proven using the Expectation-Maximization algorithm. In addition, a measure to evaluate the goodness-of-fit of the resulting factorization is described. The proposed methods allow modeling of non-linear effects via appropriate link functions and are illustrated using an application in biomedical signal processing. PMID:27348511
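The Gaussian (identity-link) member of this quasi-likelihood family reduces to the familiar Frobenius-norm NMF, for which the classic multiplicative updates apply. The sketch below shows only that special case (other links and variance functions in the paper change the update rules), on arbitrary toy data.

```python
import numpy as np

def nmf(V, rank, n_iter=200, eps=1e-10, seed=0):
    """Multiplicative-update NMF minimizing ||V - W H||_F^2.

    This is the Gaussian (identity-link) member of the quasi-likelihood
    family; signal-dependent noise models lead to different updates.
    """
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps
    for _ in range(n_iter):
        # Lee-Seung updates: nonnegativity is preserved automatically.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

V = np.abs(np.random.default_rng(1).standard_normal((30, 20)))
W, H = nmf(V, rank=5)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(err)
```

Each update monotonically decreases the Frobenius objective, which is the EM-style convergence argument the paper generalizes.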
Implementing Restricted Maximum Likelihood Estimation in Structural Equation Models
ERIC Educational Resources Information Center
Cheung, Mike W.-L.
2013-01-01
Structural equation modeling (SEM) is now a generic modeling framework for many multivariate techniques applied in the social and behavioral sciences. Many statistical models can be considered either as special cases of SEM or as part of the latent variable modeling framework. One popular extension is the use of SEM to conduct linear mixed-effects…
ERIC Educational Resources Information Center
Aikman, Sheila; Rao, Nitya
2012-01-01
The article draws on qualitative educational research across a diversity of low-income countries to examine the gendered inequalities in education as complex, multi-faceted and situated rather than a series of barriers to be overcome through linear input-output processes focused on isolated dimensions of quality. It argues that frameworks for…
The Effects of Routing and Scoring within a Computer Adaptive Multi-Stage Framework
ERIC Educational Resources Information Center
Dallas, Andrew
2014-01-01
This dissertation examined the overall effects of routing and scoring within a computer adaptive multi-stage framework (ca-MST). Testing in a ca-MST environment has become extremely popular in the testing industry. Testing companies enjoy its efficiency benefits as compared to traditional linear testing and its quality-control features over…
Timber management planning with timber ram and goal programming
Richard C. Field
1978-01-01
By using goal programming to enhance the linear programming of Timber RAM, multiple decision criteria were incorporated in the timber management planning of a National Forest in the southeastern United States. Combining linear and goal programming capitalizes on the advantages of the two techniques and produces operationally feasible solutions. This enhancement may...
ERIC Educational Resources Information Center
Friedrich, Philipp E.; Prøitz, Tine S.; Stensaker, Bjørn
2016-01-01
Qualification frameworks are spreading rapidly, not least in Europe following the introduction of the European Qualification Framework. The impact of such frameworks is contested, and the article contributes to this debate by analyzing how a selected group of different study programs in Norwegian higher education is adapting to the newly launched…
A comparison of Heuristic method and Llewellyn’s rules for identification of redundant constraints
NASA Astrophysics Data System (ADS)
Estiningsih, Y.; Farikhin; Tjahjana, R. H.
2018-03-01
An important technique in linear programming is the modelling and solving of practical optimization problems. Redundant constraints are considered for their effects on general linear programming problems. Identifying and removing redundant constraints avoids the calculations associated with them when solving the associated linear programming problems. Many methods have been proposed for the identification of redundant constraints. This paper presents a comparison of a heuristic method and Llewellyn's rules for the identification of redundant constraints.
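As a baseline for such comparisons, redundancy can always be decided exactly (if expensively) by solving one LP per constraint: row i of A x <= b is redundant when maximizing a_i.x over the remaining constraints cannot exceed b_i. The sketch below implements that exact test, assuming nonnegative variables; the heuristic method and Llewellyn's rules studied in the paper are cheaper screens for the same question.

```python
import numpy as np
from scipy.optimize import linprog

def redundant_rows(A, b):
    """Exact LP-based redundancy test for A x <= b with x >= 0 assumed.

    Row i is redundant if maximizing a_i . x over the remaining
    constraints cannot exceed b_i. One LP per constraint, so this is
    the expensive baseline that cheaper screens try to approximate."""
    m, n = A.shape
    redundant = []
    for i in range(m):
        mask = np.arange(m) != i
        # linprog minimizes, so maximize a_i . x by negating it.
        res = linprog(-A[i], A_ub=A[mask], b_ub=b[mask],
                      bounds=[(0, None)] * n, method="highs")
        if res.success and -res.fun <= b[i] + 1e-9:
            redundant.append(i)
    return redundant

# x + y <= 4, x <= 3, y <= 3, and x + y <= 10 (the last is redundant).
A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([4.0, 3.0, 3.0, 10.0])
print(redundant_rows(A, b))
```

Dropping the rows this test flags leaves the feasible region unchanged, which is exactly why removing them before solving saves work.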
A frequency averaging framework for the solution of complex dynamic systems
Lecomte, Christophe
2014-01-01
A frequency averaging framework is proposed for the solution of complex linear dynamic systems. It is remarkable that, while the mid-frequency region is usually very challenging, a smooth transition from low- through mid- and high-frequency ranges is possible and all ranges can now be considered in a single framework. An interpretation of the frequency averaging in the time domain is presented and it is explained that the average may be evaluated very efficiently in terms of system solutions. PMID:24910518
Conceptual Frameworks in Undergraduate Nursing Curricula: Report of a National Survey.
ERIC Educational Resources Information Center
McEwen, Melanie; Brown, Sandra C.
2002-01-01
Responses from 300 accredited nursing schools indicated that they used eclectic conceptual frameworks for curriculum; the most common component was the nursing process. Associate degree programs were more likely to use simple-to-complex organization. Diploma programs were more likely to use the medical model than baccalaureate programs. Frameworks…
The Data-to-Action Framework: A Rapid Program Improvement Process
ERIC Educational Resources Information Center
Zakocs, Ronda; Hill, Jessica A.; Brown, Pamela; Wheaton, Jocelyn; Freire, Kimberley E.
2015-01-01
Although health education programs may benefit from quality improvement methods, scant resources exist to help practitioners apply these methods for program improvement. The purpose of this article is to describe the Data-to-Action framework, a process that guides practitioners through rapid-feedback cycles in order to generate actionable data to…
Utilizing the Theoretical Framework of Collective Identity to Understand Processes in Youth Programs
ERIC Educational Resources Information Center
Futch, Valerie A.
2016-01-01
This article explores collective identity as a useful theoretical framework for understanding social and developmental processes that occur in youth programs. Through narrative analysis of past participant interviews (n = 21) from an after-school theater program, known as "The SOURCE", it was found that participants very clearly describe…
Kernel-imbedded Gaussian processes for disease classification using microarray gene expression data
Zhao, Xin; Cheung, Leo Wang-Kit
2007-01-01
Background Designing appropriate machine learning methods for identifying genes that have a significant discriminating power for disease outcomes has become more and more important for our understanding of diseases at genomic level. Although many machine learning methods have been developed and applied to the area of microarray gene expression data analysis, the majority of them are based on linear models, which however are not necessarily appropriate for the underlying connection between the target disease and its associated explanatory genes. Linear model based methods usually also bring in false positive significant features more easily. Furthermore, linear model based algorithms often involve calculating the inverse of a matrix that is possibly singular when the number of potentially important genes is relatively large. This leads to problems of numerical instability. To overcome these limitations, a few non-linear methods have recently been introduced to the area. Many of the existing non-linear methods have a couple of critical problems, the model selection problem and the model parameter tuning problem, that remain unsolved or even untouched. In general, a unified framework that allows model parameters of both linear and non-linear models to be easily tuned is always preferred in real-world applications. Kernel-induced learning methods form a class of approaches that show promising potentials to achieve this goal. Results A hierarchical statistical model named kernel-imbedded Gaussian process (KIGP) is developed under a unified Bayesian framework for binary disease classification problems using microarray gene expression data. In particular, based on a probit regression setting, an adaptive algorithm with a cascading structure is designed to find the appropriate kernel, to discover the potentially significant genes, and to make the optimal class prediction accordingly. A Gibbs sampler is built as the core of the algorithm to make Bayesian inferences. 
Simulation studies showed that, even without any knowledge of the underlying generative model, the KIGP performed very close to the theoretical Bayesian bound not only in the case with a linear Bayesian classifier but also in the case with a very non-linear Bayesian classifier. This sheds light on its broader usability for microarray data analysis problems, especially those for which linear methods work awkwardly. The KIGP was also applied to four published microarray datasets, and the results showed that the KIGP performed better than or at least as well as any of the referred state-of-the-art methods in all of these cases. Conclusion Mathematically built on the kernel-induced feature space concept under a Bayesian framework, the KIGP method presented in this paper provides a unified machine learning approach to explore both the linear and the possibly non-linear underlying relationship between the target features of a given binary disease classification problem and the related explanatory gene expression data. More importantly, it incorporates the model parameter tuning into the framework. The model selection problem is addressed in the form of selecting a proper kernel type. The KIGP method also gives Bayesian probabilistic predictions for disease classification. These properties and features are beneficial to most real-world applications. The algorithm is naturally robust in numerical computation. The simulation studies and the published data studies demonstrated that the proposed KIGP performs satisfactorily and consistently. PMID:17328811
NASA Astrophysics Data System (ADS)
Davidsen, Claus; Liu, Suxia; Mo, Xingguo; Rosbjerg, Dan; Bauer-Gottwein, Peter
2014-05-01
Optimal management of conjunctive use of surface water and groundwater has been attempted with different algorithms in the literature. In this study, a hydro-economic modelling approach to optimize conjunctive use of scarce surface water and groundwater resources under uncertainty is presented. A stochastic dynamic programming (SDP) approach is used to minimize the basin-wide total costs arising from water allocations and water curtailments. Dynamic allocation problems with inclusion of groundwater resources proved to be more complex to solve with SDP than pure surface water allocation problems due to head-dependent pumping costs. These dynamic pumping costs strongly affect the total costs and can lead to non-convexity of the future cost function. The water user groups (agriculture, industry, domestic) are characterized by inelastic demands and fixed water allocation and water supply curtailment costs. As in traditional SDP approaches, one step-ahead sub-problems are solved to find the optimal management at any time knowing the inflow scenario and reservoir/aquifer storage levels. These non-linear sub-problems are solved using a genetic algorithm (GA) that minimizes the sum of the immediate and future costs for given surface water reservoir and groundwater aquifer end storages. The immediate cost is found by solving a simple linear allocation sub-problem, and the future costs are assessed by interpolation in the total cost matrix from the following time step. Total costs for all stages, reservoir states, and inflow scenarios are used as future costs to drive a forward moving simulation under uncertain water availability. The use of a GA to solve the sub-problems is computationally more costly than a traditional SDP approach with linearly interpolated future costs. However, in a two-reservoir system the future cost function would have to be represented by a set of planes, and strict convexity in both the surface water and groundwater dimension cannot be maintained. 
The optimization framework based on the GA is still computationally feasible and represents a clean and customizable method. The method has been applied to the Ziya River basin, China. The basin is located on the North China Plain and is subject to severe water scarcity, which includes surface water droughts and groundwater over-pumping. The head-dependent groundwater pumping costs will enable assessment of the long-term effects of increased electricity prices on the groundwater pumping. The coupled optimization framework is used to assess realistic alternative development scenarios for the basin. In particular the potential for using electricity pricing policies to reach sustainable groundwater pumping is investigated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Han, M; Baek, J
2016-06-15
Purpose: To investigate the slice direction dependent detectability in cone beam CT images with anatomical background. Methods: We generated 3D anatomical background images using a breast anatomy model. To generate the 3D breast anatomy, we filtered 3D Gaussian noise with a square root of 1/f{sup 3}, and then assigned the attenuation coefficients of glandular (0.8 cm{sup −1}) and adipose (0.46 cm{sup −1}) tissues based on voxel values. Projections were acquired by forward projection, and quantum noise was added to the projection data. The projection data were reconstructed by the FDK algorithm. We compared the detectability of a 3 mm spherical signal in the images reconstructed with four different backprojection methods: Hanning-weighted ramp filter with linear interpolation (RECON1), Hanning-weighted ramp filter with Fourier interpolation (RECON2), ramp filter with linear interpolation (RECON3), and ramp filter with Fourier interpolation (RECON4). We computed task SNR of the spherical signal in transverse and longitudinal planes using a channelized Hotelling observer with Laguerre-Gauss channels. Results: The transverse plane has similar task SNR values for the different backprojection methods, while the longitudinal plane has a maximum task SNR value in RECON1. For all backprojection methods, the longitudinal plane has higher task SNR than the transverse plane. Conclusion: In this work, we investigated detectability for different slice directions in cone beam CT images with anatomical background. The longitudinal plane has a higher task SNR than the transverse plane, and backprojection with the Hanning-weighted ramp filter and linear interpolation (i.e., RECON1) produced the highest task SNR among the four backprojection methods.
This research was supported by the MSIP (Ministry of Science, ICT and Future Planning), Korea, under the IT Consilience Creative Programs (IITP-2015-R0346-15-1008) supervised by the IITP (Institute for Information & Communications Technology Promotion), Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the MSIP (2015R1C1A1A01052268) and framework of international cooperation program managed by NRF (NRF-2015K2A1A2067635).
The Use of the Data-to-Action Framework in the Evaluation of CDC's DELTA FOCUS Program.
Armstead, Theresa L; Kearns, Megan; Rambo, Kirsten; Estefan, Lianne Fuino; Dills, Jenny; Rivera, Moira S; El-Beshti, Rasha
The Centers for Disease Control and Prevention's (CDC's) Domestic Violence Prevention Enhancements and Leadership Through Alliances, Focusing on Outcomes for Communities United with States (DELTA FOCUS) program is a 5-year cooperative agreement (2013-2018) funding 10 state domestic violence coalitions and local coordinated community response teams to engage in primary prevention of intimate partner violence. Grantees' prevention strategies were often developmental and emergent; therefore, CDC's approach to program oversight, administration, and support to grantees required flexibility. CDC staff adopted a Data-to-Action Framework for the DELTA FOCUS program evaluation that supported a culture of learning to meet dynamic and unexpected information needs. Briefly, a Data-to-Action Framework involves the collection and use of information in real time for program improvement. Utilizing this framework, the DELTA FOCUS data-to-action process yielded important insights into CDC's ongoing technical assistance, improved program accountability by providing useful materials and information for internal agency leadership, and helped build a learning community among grantees. CDC and other funders, as decision makers, can promote program improvements that are data-informed by incorporating internal processes supportive of ongoing data collection and review.
A Second-Order Conditionally Linear Mixed Effects Model with Observed and Latent Variable Covariates
ERIC Educational Resources Information Center
Harring, Jeffrey R.; Kohli, Nidhi; Silverman, Rebecca D.; Speece, Deborah L.
2012-01-01
A conditionally linear mixed effects model is an appropriate framework for investigating nonlinear change in a continuous latent variable that is repeatedly measured over time. The efficacy of the model is that it allows parameters that enter the specified nonlinear time-response function to be stochastic, whereas those parameters that enter in a…
The linear regulator problem for parabolic systems
NASA Technical Reports Server (NTRS)
Banks, H. T.; Kunisch, K.
1983-01-01
An approximation framework is presented for computation (in finite dimensional spaces) of Riccati operators that can be guaranteed to converge to the Riccati operator in feedback controls for abstract evolution systems in a Hilbert space. It is shown how these results may be used in the linear optimal regulator problem for a large class of parabolic systems.
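In finite dimensions the construction is the standard LQR one: solve the algebraic Riccati equation and form the feedback gain. The sketch below uses SciPy on a toy double integrator standing in for a discretized parabolic system; the A, B, Q, R values are illustrative, not from the paper.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Finite-dimensional stand-in for the approximation idea: solve the
# algebraic Riccati equation and form the optimal feedback gain.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])   # toy double integrator (illustrative)
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)                # state weighting
R = np.eye(1)                # control weighting

P = solve_continuous_are(A, B, Q, R)   # Riccati solution
K = np.linalg.solve(R, B.T @ P)        # feedback gain: u = -K x
print(K)
```

The convergence result in the paper says, roughly, that as the finite-dimensional approximation of the parabolic system is refined, the corresponding P (and hence K) converges to the infinite-dimensional Riccati operator's feedback law.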
ERIC Educational Resources Information Center
Anderson, Daniel
2012-01-01
This manuscript provides an overview of hierarchical linear modeling (HLM), as part of a series of papers covering topics relevant to consumers of educational research. HLM is tremendously flexible, allowing researchers to specify relations across multiple "levels" of the educational system (e.g., students, classrooms, schools, etc.).…
Redwood-Campbell, Lynda; Pakes, Barry; Rouleau, Katherine; MacDonald, Colla J; Arya, Neil; Purkey, Eva; Schultz, Karen; Dhatt, Reena; Wilson, Briana; Hadi, Abdullahel; Pottie, Kevin
2011-07-22
Recognizing the growing demand from medical students and residents for more comprehensive global health training, and the paucity of explicit curricula on such issues, global health and curriculum experts from the six Ontario Family Medicine Residency Programs worked together to design a framework for global health curricula in family medicine training programs. A working group comprised of global health educators from Ontario's six medical schools conducted a scoping review of global health curricula, competencies, and pedagogical approaches. The working group then hosted a full day meeting, inviting experts in education, clinical care, family medicine and public health, and developed a consensus process and draft framework to design global health curricula. Through a series of weekly teleconferences over the next six months, the framework was revised and used to guide the identification of enabling global health competencies (behaviours, skills and attitudes) for Canadian Family Medicine training. The main outcome was an evidence-informed interactive framework http://globalhealth.ennovativesolution.com/ to provide a shared foundation to guide the design, delivery and evaluation of global health education programs for Ontario's family medicine residency programs. The curriculum framework blended a definition and mission for global health training, core values and principles, global health competencies aligning with the Canadian Medical Education Directives for Specialists (CanMEDS) competencies, and key learning approaches. The framework guided the development of subsequent enabling competencies. The shared curriculum framework can support the design, delivery and evaluation of global health curriculum in Canada and around the world, lay the foundation for research and development, provide consistency across programmes, and support the creation of learning and evaluation tools to align with the framework. 
The process used to develop this framework can be applied to other aspects of residency curriculum development. PMID:21781319
Agent-based modeling of the immune system: NetLogo, a promising framework.
Chiacchio, Ferdinando; Pennisi, Marzio; Russo, Giulia; Motta, Santo; Pappalardo, Francesco
2014-01-01
Several components that interact with each other to produce a complex and, in some cases, unexpected behavior represent one of the main and fascinating features of the mammalian immune system. Agent-based modeling and cellular automata belong to a class of discrete mathematical approaches in which entities (agents) sense local information and undertake actions over time according to predefined rules. The strength of this approach is characterized by the appearance of a global behavior that emerges from interactions among agents. This behavior is unpredictable, as it does not follow linear rules. Many works investigate the immune system with agent-based modeling and cellular automata; they have shown the ability to see clearly and intuitively into the nature of immunological processes. NetLogo is a multiagent programming language and modeling environment for simulating complex phenomena. It is designed for both research and education and is used across a wide range of disciplines and education levels. In this paper, we summarize NetLogo applications to immunology and, particularly, how this framework can help in the development and formulation of hypotheses that might drive further experimental investigations of disease mechanisms.
Basu, Protonu; Williams, Samuel; Van Straalen, Brian; ...
2017-04-05
GPUs, with their high bandwidths and computational capabilities, are an increasingly popular target for scientific computing. Unfortunately, to date, harnessing the power of the GPU has required use of a GPU-specific programming model like CUDA, OpenCL, or OpenACC. Thus, in order to deliver portability across CPU-based and GPU-accelerated supercomputers, programmers are forced to write and maintain two versions of their applications or frameworks. In this paper, we explore the use of a compiler-based autotuning framework based on CUDA-CHiLL to deliver not only portability, but also performance portability across CPU- and GPU-accelerated platforms for the geometric multigrid linear solvers found in many scientific applications. We also show that with autotuning we can attain near Roofline (a performance bound for a computation and target architecture) performance across the key operations in the miniGMG benchmark for both CPU- and GPU-based architectures as well as for multiple stencil discretizations and smoothers. We show that our technology is readily interoperable with MPI, resulting in performance at scale equal to that obtained via a hand-optimized MPI+CUDA implementation.
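The Roofline bound referenced above is simply the minimum of peak compute and bandwidth-limited throughput. A one-line model (with made-up machine numbers loosely resembling a GPU) makes the memory-bound nature of multigrid smoothers concrete.

```python
def roofline_gflops(ai, peak_gflops, peak_gbps):
    """Attainable GFLOP/s for a kernel with arithmetic intensity `ai`
    (flops per byte of DRAM traffic) on a machine with the given peak
    compute and memory bandwidth. Machine numbers below are made up."""
    return min(peak_gflops, ai * peak_gbps)

# Stencil smoothers sit at low arithmetic intensity, so they are
# bandwidth-bound; only high-intensity kernels reach peak compute.
for ai in (0.1, 1.0, 10.0):
    print(ai, roofline_gflops(ai, peak_gflops=5000.0, peak_gbps=900.0))
```

Autotuning cannot raise the roof itself; it closes the gap between achieved performance and this bound for each kernel and architecture.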
FAST Modularization Framework for Wind Turbine Simulation: Full-System Linearization
Jonkman, Jason M.; Jonkman, Bonnie J.
2016-10-03
The wind engineering community relies on multiphysics engineering software to run nonlinear time-domain simulations, e.g., for design-standards-based loads analysis. Although most physics involved in wind energy are nonlinear, linearization of the underlying nonlinear system equations is often advantageous to understand the system response and exploit well-established methods and tools for analyzing linear systems. This paper presents the development and verification of the new linearization functionality of the open-source engineering tool FAST v8 for land-based wind turbines, as well as the concepts and mathematical background needed to understand and apply it correctly.
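The idea of linearizing a nonlinear system about an operating point can be sketched generically with finite differences. FAST assembles its full-system Jacobians per module rather than this way; the pendulum-like state equation below is purely illustrative.

```python
import numpy as np

def linearize(f, x0, u0, eps=1e-6):
    """Forward-difference linearization of dx/dt = f(x, u) about an
    operating point (x0, u0), returning A = df/dx and B = df/du.
    A generic numerical sketch of the linearization concept only."""
    n, m = len(x0), len(u0)
    f0 = f(x0, u0)
    A_lin = np.empty((n, n))
    B_lin = np.empty((n, m))
    for j in range(n):
        dx = np.zeros(n); dx[j] = eps
        A_lin[:, j] = (f(x0 + dx, u0) - f0) / eps
    for j in range(m):
        du = np.zeros(m); du[j] = eps
        B_lin[:, j] = (f(x0, u0 + du) - f0) / eps
    return A_lin, B_lin

# Toy nonlinear system: pendulum-like dynamics (illustrative only).
f = lambda x, u: np.array([x[1], -np.sin(x[0]) + u[0]])
A_lin, B_lin = linearize(f, np.zeros(2), np.zeros(1))
print(A_lin)
```

The resulting A and B matrices are what standard linear-systems tools (eigenanalysis, frequency response, controller design) consume, which is the motivation stated in the abstract.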
FAST modularization framework for wind turbine simulation: full-system linearization
NASA Astrophysics Data System (ADS)
Jonkman, J. M.; Jonkman, B. J.
2016-09-01
The wind engineering community relies on multiphysics engineering software to run nonlinear time-domain simulations e.g. for design-standards-based loads analysis. Although most physics involved in wind energy are nonlinear, linearization of the underlying nonlinear system equations is often advantageous to understand the system response and exploit well-established methods and tools for analyzing linear systems. This paper presents the development and verification of the new linearization functionality of the open-source engineering tool FAST v8 for land-based wind turbines, as well as the concepts and mathematical background needed to understand and apply it correctly.
LCFIPlus: A framework for jet analysis in linear collider studies
NASA Astrophysics Data System (ADS)
Suehara, Taikan; Tanabe, Tomohiko
2016-02-01
We report on the progress in flavor identification tools developed for a future e+e- linear collider such as the International Linear Collider (ILC) and Compact Linear Collider (CLIC). Building on the work carried out by the LCFIVertex collaboration, we employ new strategies in vertex finding and jet finding, and introduce new discriminating variables for jet flavor identification. We present the performance of the new algorithms in the conditions simulated using a detector concept designed for the ILC. The algorithms have been successfully used in ILC physics simulation studies, such as those presented in the ILC Technical Design Report.
Conceptual framework for a Danish human biomonitoring program
Thomsen, Marianne; Knudsen, Lisbeth E; Vorkamp, Katrin; Frederiksen, Marie; Bach, Hanne; Bonefeld-Jorgensen, Eva Cecilie; Rastogi, Suresch; Fauser, Patrik; Krongaard, Teddy; Sorensen, Peter Borgen
2008-01-01
The aim of this paper is to present the conceptual framework for a Danish human biomonitoring (HBM) program. The EU and national science-policy interface, that is fundamental for a realization of the national and European environment and human health strategies, is discussed, including the need for a structured and integrated environmental and human health surveillance program at national level. In Denmark, the initiative to implement such activities has been taken. The proposed framework of the Danish monitoring program constitutes four scientific expert groups, i.e. i. Prioritization of the strategy for the monitoring program, ii. Collection of human samples, iii. Analysis and data management and iv. Dissemination of results produced within the program. This paper presents the overall framework for data requirements and information flow in the integrated environment and health surveillance program. The added value of an HBM program, and in this respect the objectives of national and European HBM programs supporting environmental health integrated policy-decisions and human health targeted policies, are discussed. In Denmark environmental monitoring has been prioritized by extensive surveillance systems of pollution in oceans, lakes and soil as well as ground and drinking water. Human biomonitoring has only taken place in research programs and few incidences of e.g. lead contamination. However an arctic program for HBM has been in force for decades and from the preparations of the EU-pilot project on HBM increasing political interest in a Danish program has developed. PMID:18541069
ERIC Educational Resources Information Center
Matzke, Orville R.
The purpose of this study was to formulate a linear programming model to simulate a foundation type support program and to apply this model to a state support program for the public elementary and secondary school districts in the State of Iowa. The model was successful in producing optimal solutions to five objective functions proposed for…
Redesigning a clinical mentoring program for improved outcomes in the clinical training of clerks
Lin, Chia-Der; Lin, Blossom Yen-Ju; Lin, Cheng-Chieh; Lee, Cheng-Chun
2015-01-01
Introduction Mentorship has been noted as critical to medical students adapting to clinical training in the medical workplace. A lack of infrastructure in a mentoring program might deter relationship building between mentors and mentees. This study assessed the effect of a redesigned clinical mentoring program from the perspective of clerks. The objective was to assess the benefits of the redesigned program and identify potential improvements. Methods A redesigned clinical mentoring program was launched in a medical center according to previous theoretical and practical studies on clinical training workplaces, including the elements of mentor qualifications, positive and active enhancers for mentor–mentee relationship building, the timing of mentoring performance evaluation, and financial and professional incentives. A four-wave web survey was conducted, comprising one evaluation of the former mentoring program and three evaluations of the redesigned clinical mentoring program. Sixty-four fifth-year medical students in clerkships who responded to the first wave and to at least two of the three following waves were included in the study. A structured and validated questionnaire encompassing 15 items on mentor performance and the personal characteristics of the clerks was used. Mixed linear models were developed for repeated measurements and to adjust for personal characteristics. Results The results revealed that the redesigned mentoring program improved the mentors’ performance over time for most evaluated items regarding professional development and personal support provided to the mentees. Conclusions Our findings serve as an improved framework for the role of the institution and demonstrate how institutional policies, programs, and structures can shape a clinical mentoring program. We recommend the adoption of mentorship schemes for other cohorts of medical students and for different learning and training stages involved in becoming a physician. PMID:26384479
Science-based Framework for Environmental Benefits Assessment
2013-03-01
Environmental Benefits Analysis Program, ERDC/EL TR-13-4, March 2013: Science-based Framework for Environmental Benefits Assessment. ...evaluating ecosystem restoration benefits within the context of the USACE Civil Works planning process. An emphasis is placed on knowledge gained from
NASA Astrophysics Data System (ADS)
Ebrahimnejad, Ali
2015-08-01
There are several methods in the literature for solving fuzzy variable linear programming problems (fuzzy linear programs in which the right-hand-side vectors and decision variables are represented by trapezoidal fuzzy numbers). In this paper, the shortcomings of some existing methods are pointed out, and to overcome these shortcomings a new method based on the bounded dual simplex method is proposed to determine the fuzzy optimal solution of such fuzzy variable linear programming problems in which some or all variables are restricted to lie within lower and upper bounds. To illustrate the proposed method, an application example is solved and the obtained results are given. The advantages of the proposed method over existing methods are discussed. One application of this algorithm, solving bounded transportation problems with fuzzy supplies and demands, is also dealt with. The proposed method is easy to understand and to apply for determining the fuzzy optimal solution of bounded fuzzy variable linear programming problems occurring in real-life situations.
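As a toy illustration of bounded variables with fuzzy data (not the paper's bounded dual simplex method), the sketch below represents trapezoidal fuzzy numbers as 4-tuples and solves a separable bounded LP, where each optimal component simply sits at one of its fuzzy bounds:

```python
# Sketch: separable bounded LP with trapezoidal fuzzy bounds.
# A trapezoidal fuzzy number is (a1, a2, a3, a4) with a1 <= a2 <= a3 <= a4.
# This is an illustrative simplification, not the paper's bounded dual simplex.

def fuzzy_scale(t, s):
    """Multiply a trapezoidal fuzzy number by a nonnegative scalar."""
    return tuple(s * a for a in t)

def solve_separable_bounded_lp(c, lower, upper):
    """Maximize sum(c[i] * x[i]) subject to lower[i] <= x[i] <= upper[i].
    With a separable objective, each component sits at a bound:
    the upper bound when its cost coefficient is positive, else the lower."""
    return [hi if ci > 0 else lo for ci, lo, hi in zip(c, lower, upper)]

c = [3.0, -2.0]
lower = [(0, 1, 2, 3), (0, 0, 1, 2)]   # fuzzy lower bounds
upper = [(4, 5, 6, 7), (2, 3, 4, 5)]   # fuzzy upper bounds
x_opt = solve_separable_bounded_lp(c, lower, upper)
print(x_opt)
```

A general (non-separable) fuzzy LP would of course need the pivoting machinery the paper develops; this only shows how bounds drive the solution.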
Adapting Maslow's Hierarchy of Needs as a Framework for Resident Wellness.
Hale, Andrew J; Ricotta, Daniel N; Freed, Jason; Smith, C Christopher; Huang, Grace C
2018-04-30
Burnout in graduate medical education is pervasive and has a deleterious impact on career satisfaction, personal well-being, and patient outcomes. Interventions in residency programs have often addressed isolated contributors to burnout; however, a more comprehensive framework for conceptualizing wellness is needed. In this article the authors propose Maslow's hierarchy of human needs (physiologic, safety, love/belonging, esteem, and self-actualization) as a potential framework for addressing wellness initiatives. There are numerous contributors to burnout among physician-trainees, and programs to combat burnout must be equally multifaceted. A holistic approach, considering both the trainee's personal and professional needs, is recommended. Maslow's hierarchy can be adapted to create such a framework in graduate medical education. The authors review current evidence to support this model. This work surveys current interventions to mitigate burnout and organizes them into a scaffold that can be used by residency programs interested in a complete framework for supporting wellness.
What next-generation languages can teach us about HENP frameworks in the manycore era
NASA Astrophysics Data System (ADS)
Binet, Sébastien
2011-12-01
Current High Energy and Nuclear Physics (HENP) frameworks were written before multicore systems became widely deployed. A 'single-thread' execution model naturally emerged from that environment; however, this no longer fits the processing model on the dawn of the manycore era. Although previous work focused on minimizing the changes to be applied to the LHC frameworks (because of the data taking phase) while still trying to reap the benefits of the parallel-enhanced CPU architectures, this paper explores what new languages could bring to the design of the next-generation frameworks. Parallel programming is still in an intensive phase of R&D and no silver bullet exists despite the 30+ years of literature on the subject. Yet, several parallel programming styles have emerged: actors, message passing, communicating sequential processes, task-based programming, data flow programming, ... to name a few. We present prototyping work on a next-generation framework in new and expressive languages (Python and Go) to investigate how code clarity and robustness are affected, and what the downsides are of using languages younger than FORTRAN/C/C++.
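The actor style listed above can be hinted at with Python's standard library (a hypothetical toy, unrelated to the authors' Go/Python prototype): each actor is a thread that drains its own private mailbox, so state is never shared directly.

```python
# Toy actor: a thread with a private mailbox; communication only via messages.
import queue
import threading

class SquaringActor:
    def __init__(self):
        self.mailbox = queue.Queue()
        self.results = []
        self.thread = threading.Thread(target=self._run)
        self.thread.start()

    def _run(self):
        while True:
            msg = self.mailbox.get()
            if msg is None:          # poison pill: stop the actor
                break
            self.results.append(msg * msg)

    def send(self, msg):
        self.mailbox.put(msg)

    def join(self):
        self.mailbox.put(None)
        self.thread.join()

actor = SquaringActor()
for n in range(5):
    actor.send(n)
actor.join()
print(actor.results)   # squares of 0..4, in mailbox order
```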
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the diesel equipment technology programs cluster. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies,…
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the dental hygiene technology program. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies. Section II…
ERIC Educational Resources Information Center
Peeraer, Jef; Van Petegem, Peter
2012-01-01
In the framework of a development cooperation program on quality of education in Vietnam, a professional development trajectory for teacher educators on the use of information and communication technology (ICT) in education was developed and implemented over the course of a three-year program. We describe how the framework on "Technological…
ERIC Educational Resources Information Center
Khampirat, Buratin; McRae, Norah
2016-01-01
Cooperative and Work-integrated Education (CWIE) programs have been widely accepted as educational programs that can effectively connect what students are learning to the world of work through placements. Because a global quality standards framework could be a very valuable resource and guide to establishing, developing, and accrediting quality…
Evaluating a Blended Degree Program through the Use of the NSSE Framework
ERIC Educational Resources Information Center
Vaughan, Norman; Cloutier, David
2017-01-01
The purpose of this student-faculty partnership research study was to evaluate the effectiveness of a blended four-year Bachelor of Education Elementary Program at a Canadian university using the National Survey of Student Engagement (NSSE) framework. Data was collected from the first graduating cohort of students from the B.Ed. program in…
ERIC Educational Resources Information Center
Gaudet, Cyndi H.; Annulis, Heather M.; Kmiec, John J., Jr.
2008-01-01
This article describes an ongoing project to build a comprehensive evaluation framework for the competency-based Master of Science in Workforce Training and Development (MSWTD) program at The University of Southern Mississippi (USM). First, it discusses some trends and issues in evaluating the performance of higher education programs in the United…
A Program Structure for Event-Based Speech Synthesis by Rules within a Flexible Segmental Framework.
ERIC Educational Resources Information Center
Hill, David R.
1978-01-01
A program structure based on recently developed techniques for operating system simulation has the required flexibility for use as a speech synthesis algorithm research framework. This program makes synthesis possible with less rigid time and frequency-component structure than simpler schemes. It also meets real-time operation and memory-size…
3.0 Foundation programs for the Delaware CEMRI framework
Peter S. Murdoch
2008-01-01
A complete review of all the national monitoring programs that could possibly contribute to the Delaware River Basin (DRB) CEMRI Framework is beyond the scope of this report. The U.S. Environmental Protection Agency (EPA) Mid-Atlantic Integrated Assessment developed a Web-based annotated inventory of such monitoring programs for the mid-Atlantic region. Olsen et al. (...
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the physical therapy assistant program. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies, and section…
Status Report on NEAMS System Analysis Module Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, R.; Fanning, T. H.; Sumner, T.
2015-12-01
Under the Reactor Product Line (RPL) of DOE-NE’s Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, an advanced SFR System Analysis Module (SAM) is being developed at Argonne National Laboratory. The goal of the SAM development is to provide fast-running, improved-fidelity, whole-plant transient analysis capabilities. SAM utilizes an object-oriented application framework (MOOSE) and its underlying meshing and finite-element library (libMesh), as well as linear and nonlinear solvers (PETSc), to leverage modern advanced software environments and numerical methods. It also incorporates advances in physical and empirical models and seeks closure models based on information from high-fidelity simulations and experiments. This report provides an update on the SAM development and summarizes the activities performed in FY15 and the first quarter of FY16. The tasks include: (1) implement support for 2nd-order finite elements in SAM components for improved accuracy and computational efficiency; (2) improve the conjugate heat transfer modeling and develop pseudo 3-D full-core reactor heat transfer capabilities; (3) perform verification and validation tests as well as demonstration simulations; (4) develop the coupling requirements for SAS4A/SASSYS-1 and SAM integration.
From master slave interferometry to complex master slave interferometry: theoretical work
NASA Astrophysics Data System (ADS)
Rivet, Sylvain; Bradu, Adrian; Maria, Michael; Feuchter, Thomas; Leick, Lasse; Podoleanu, Adrian
2018-03-01
A general theoretical framework is described to obtain the advantages and the drawbacks of two novel Fourier Domain Optical Coherence Tomography (OCT) methods denoted as Master/Slave Interferometry (MSI) and its extension, Complex Master/Slave Interferometry (CMSI). Instead of linearizing the digital data representing the channeled spectrum before a Fourier transform can be applied to it (as in standard OCT methods), the channeled spectrum is decomposed on a basis of local oscillations. This replaces the need for linearization, generally time consuming, before any calculation of the depth profile in the range of interest. In this model two functions, g and h, are introduced. The function g describes the modulation chirp of the channeled spectrum signal due to nonlinearities in the decoding process from wavenumber to time. The function h describes the dispersion in the interferometer. The utilization of these two functions brings two major improvements to previous implementations of the MSI method. The paper details the steps to obtain the functions g and h, and represents the CMSI in a matrix formulation that enables this method to be implemented easily in LabVIEW using parallel programming on multiple cores.
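The master/slave principle described above, reading out each depth by correlating the channeled spectrum against a stored oscillation for that depth rather than resampling and Fourier-transforming, can be sketched on synthetic data (all parameters hypothetical):

```python
# Sketch of the master/slave principle: the reflectivity at depth z is read out
# by correlating the channeled spectrum with a complex "mask" exp(-2i k z),
# so no resampling/linearization of the wavenumber axis is needed.
import cmath
import math

N = 512
k = [1000.0 + 200.0 * i / N for i in range(N)]       # wavenumber samples
z_true = 0.37                                        # mirror depth (a.u.)
spectrum = [math.cos(2.0 * ki * z_true) for ki in k] # channeled spectrum

def depth_amplitude(z):
    """Correlate the spectrum with the mask for depth z."""
    return abs(sum(s * cmath.exp(-2j * ki * z) for s, ki in zip(spectrum, k))) / N

depths = [i * 0.01 for i in range(100)]
profile = [depth_amplitude(z) for z in depths]
z_peak = depths[profile.index(max(profile))]
print(z_peak)   # close to z_true
```

The g and h functions of the CMSI formulation would enter through the masks; here the masks are ideal complex exponentials for simplicity.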
Zheng, Wenming; Lin, Zhouchen; Wang, Haixian
2014-04-01
A novel discriminant analysis criterion is derived in this paper under the theoretical framework of Bayes optimality. In contrast to the conventional Fisher discriminant criterion, the major novelty of the proposed one is the use of the L1 norm rather than the L2 norm, which makes it less sensitive to outliers. With the L1-norm discriminant criterion, we propose a new linear discriminant analysis (L1-LDA) method for the linear feature extraction problem. To solve the L1-LDA optimization problem, we propose an efficient iterative algorithm, in which a novel surrogate convex function is introduced such that the optimization problem in each iteration amounts to solving a convex programming problem for which a closed-form solution is guaranteed. Moreover, we generalize the L1-LDA method to deal with nonlinear robust feature extraction problems via the kernel trick, yielding the L1-norm kernel discriminant analysis (L1-KDA) method. Extensive experiments on simulated and real data sets are conducted to evaluate the effectiveness of the proposed method in comparison with state-of-the-art methods.
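A small 2-D sketch of the L1-norm criterion follows; direction scoring by grid search stands in for the paper's surrogate-convex iteration, and the data are made up for illustration:

```python
# Score candidate projection directions w (unit vectors in 2-D) with an
# L1-norm discriminant ratio: between-class spread / within-class spread.
# Grid search stands in for the paper's iterative surrogate-convex algorithm.
import math

class1 = [(0.0, 0.0), (1.0, 0.2), (0.5, -0.1)]
class2 = [(3.0, 0.1), (4.0, -0.2), (3.5, 0.0)]

def mean(pts):
    return (sum(p[0] for p in pts) / len(pts), sum(p[1] for p in pts) / len(pts))

def l1_criterion(w):
    m1, m2 = mean(class1), mean(class2)
    between = abs(w[0] * (m1[0] - m2[0]) + w[1] * (m1[1] - m2[1]))
    within = sum(abs(w[0] * (p[0] - m[0]) + w[1] * (p[1] - m[1]))
                 for pts, m in ((class1, m1), (class2, m2)) for p in pts)
    return between / within

best_w = max(((math.cos(t), math.sin(t))
              for t in [i * math.pi / 180 for i in range(180)]),
             key=l1_criterion)
print(best_w)   # near (1, 0): the classes separate along the x axis
```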
Performance of a reentrant cavity beam position monitor
NASA Astrophysics Data System (ADS)
Simon, Claire; Luong, Michel; Chel, Stéphane; Napoly, Olivier; Novo, Jorge; Roudier, Dominique; Rouvière, Nelly; Baboi, Nicoleta; Mildner, Nils; Nölle, Dirk
2008-08-01
The beam-based alignment and feedback systems, essential operations for the future colliders, require high resolution beam position monitors (BPMs). In the framework of the European CARE/SRF program, a reentrant cavity BPM with its associated electronics was developed by CEA/DSM/Irfu in collaboration with DESY. The design, fabrication, and beam test of this monitor are detailed in this paper. This BPM is designed to be inserted in a cryomodule and to work at cryogenic temperature in a clean environment. It has achieved a resolution better than 10 μm and can perform bunch-to-bunch measurements for the x-ray free electron laser (X-FEL) and the International Linear Collider (ILC). Its other features are a small rf cavity size, a large aperture (78 mm), and excellent linearity. A first prototype of the reentrant cavity BPM was installed in the free electron laser in Hamburg (FLASH) at Deutsches Elektronen-Synchrotron (DESY) and demonstrated operation at cryogenic temperature inside a cryomodule. The second, also installed in the FLASH linac for tests with beam, measured a resolution of approximately 4 μm over a dynamic range of ±5 mm in single bunch.
NASA Astrophysics Data System (ADS)
Hirt, Ulrike; Mewes, Melanie; Meyer, Burghard C.
The structure of a landscape is highly relevant for research and planning (such as fulfilling the requirements of the Water Framework Directive - WFD - and for implementation of comprehensive catchment planning). There is a high potential for restoration of linear landscape elements in most European landscapes. By implementing the WFD in Germany, the restoration of linear landscape elements could be a valuable measure, for example, to reduce nutrient input into rivers. Despite this importance of landscape structures for water and nutrient fluxes, biodiversity and the appearance of a landscape, specific studies of the linear elements are rare for larger catchment areas. Existing studies are limited because they either use remote sensing data, which does not adequately differentiate all types of linear landscape elements, or they focus only on a specific type of linear element. To address these limitations, we developed a framework allowing comprehensive quantification of linear landscape elements for catchment areas, using publicly available biotope type data. We analysed the dependence of landscape structures on natural regions and regional soil characteristics. Three data sets (differing in biotopes, soil parameters and natural regions) were generated for the catchment area of the middle Mulde River (2700 km²) in Germany, using overlay processes in geographic information systems (GIS), followed by statistical evaluation. The linear landscape components of the total catchment area are divided into roads (55%), flowing water (21%), tree rows (14%), avenues (5%), and hedges (2%). The occurrence of these landscape components varies regionally among natural units and different soil regions. For example, the mixed deciduous stands (3.5 m/ha) are far more frequent in foothills (6 m/ha) than in hill country (0.9 m/ha). In contrast, fruit trees are more frequent in hill country (5.2 m/ha) than in the cooler foothills (0.5 m/ha).
Some 70% of avenues, and 40% of tree rows, are discontinuous; in contrast, only 20% of hedges are discontinuous. Using our innovative framework, comprehensive information about landscape elements can now be obtained for regional applications. This approach can be applied to other regions and is highly relevant for landscape planning, erosion control, protection of waters and preservation of biotopes and species.
Towards an agent-oriented programming language based on Scala
NASA Astrophysics Data System (ADS)
Mitrović, Dejan; Ivanović, Mirjana; Budimac, Zoran
2012-09-01
Scala and its multi-threaded model based on actors represent an excellent framework for developing purely reactive agents. This paper presents early research on extending Scala with declarative programming constructs, which would result in a new agent-oriented programming language suitable for developing more advanced, BDI agent architectures. The main advantage of the new language over many other existing solutions for programming BDI agents is a natural and straightforward integration of imperative and declarative programming constructs, fitted under a single development framework.
Feedback control by online learning an inverse model.
Waegeman, Tim; Wyffels, Francis; Schrauwen, Benjamin
2012-10-01
A model, predictor, or error estimator is often used by a feedback controller to control a plant. Creating such a model is difficult when the plant exhibits nonlinear behavior. In this paper, a novel online learning control framework is proposed that does not require explicit knowledge about the plant. This framework uses two learning modules, one for creating an inverse model, and the other for actually controlling the plant. Except for their inputs, they are identical. The inverse model learns by the exploration performed by the not yet fully trained controller, while the actual controller is based on the currently learned model. The proposed framework allows fast online learning of an accurate controller. The controller can be applied on a broad range of tasks with different dynamic characteristics. We validate this claim by applying our control framework on several control tasks: 1) the heating tank problem (slow nonlinear dynamics); 2) flight pitch control (slow linear dynamics); and 3) the balancing problem of a double inverted pendulum (fast linear and nonlinear dynamics). The results of these experiments show that fast learning and accurate control can be achieved. Furthermore, a comparison is made with some classical control approaches, and observations concerning convergence and stability are made.
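The two-module scheme, an inverse model trained online from the controller's own exploration while the controller uses the current model, can be sketched for a trivial scalar plant (hypothetical gains; plain LMS stands in for the paper's learning modules):

```python
# Sketch: online learning of an inverse model u = w1*y + w0 for the plant
# y = 2*u + 1, from exploratory (u, y) pairs; the controller then inverts
# a reference r through the learned model. LMS stands in for the paper's modules.
import random

random.seed(0)

def plant(u):
    return 2.0 * u + 1.0

w0, w1 = 0.0, 0.0            # inverse-model weights
lr = 0.05

for _ in range(2000):
    u = random.uniform(-1.0, 1.0)   # exploration input
    y = plant(u)                    # observed plant output
    u_hat = w1 * y + w0             # inverse-model prediction
    err = u - u_hat                 # supervised target is the applied input
    w1 += lr * err * y
    w0 += lr * err

r = 0.5                             # reference to track
u_ctrl = w1 * r + w0                # controller uses the learned inverse
print(abs(plant(u_ctrl) - r))       # small tracking error
```

The learned weights approach the true inverse u = (y - 1)/2, i.e. w1 = 0.5, w0 = -0.5; the paper's nonlinear plants require the richer learning modules it describes.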
Can linear superiorization be useful for linear optimization problems?
NASA Astrophysics Data System (ADS)
Censor, Yair
2017-04-01
Linear superiorization (LinSup) considers linear programming problems but instead of attempting to solve them with linear optimization methods it employs perturbation resilient feasibility-seeking algorithms and steers them toward reduced (not necessarily minimal) target function values. The two questions that we set out to explore experimentally are: (i) does LinSup provide a feasible point whose linear target function value is lower than that obtained by running the same feasibility-seeking algorithm without superiorization under identical conditions? (ii) How does LinSup fare in comparison with the Simplex method for solving linear programming problems? Based on our computational experiments presented here, the answers to these two questions are: ‘yes’ and ‘very well’, respectively.
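The LinSup recipe can be sketched on a hypothetical toy problem (not the paper's experiments): cyclic projections onto half-spaces provide the feasibility-seeking, and shrinking perturbation steps along -c steer the iterates toward lower target values.

```python
# Linear superiorization sketch: feasibility-seeking by cyclic projections onto
# half-spaces a.x <= b, perturbed by shrinking steps along -c to lower c.x.
import math

# Constraints (a, b) meaning a[0]*x + a[1]*y <= b:  x >= 0, y >= 0, x + y >= 1.
constraints = [((-1.0, 0.0), 0.0), ((0.0, -1.0), 0.0), ((-1.0, -1.0), -1.0)]
c = (1.0, 1.0)                       # target function c.x to be reduced
c_norm = math.hypot(*c)

def project(x, a, b):
    """Orthogonal projection of x onto the half-space a.x <= b."""
    viol = a[0] * x[0] + a[1] * x[1] - b
    if viol <= 0.0:
        return x
    s = viol / (a[0] ** 2 + a[1] ** 2)
    return (x[0] - s * a[0], x[1] - s * a[1])

def run(superiorize, iters=500, alpha=0.99):
    x, beta = (2.0, 2.0), 1.0
    for _ in range(iters):
        if superiorize:              # perturbation step along -c
            x = (x[0] - beta * c[0] / c_norm, x[1] - beta * c[1] / c_norm)
            beta *= alpha
        for a, b in constraints:     # feasibility-seeking sweep
            x = project(x, a, b)
    return x

plain, sup = run(False), run(True)
obj = lambda p: c[0] * p[0] + c[1] * p[1]
print(obj(plain), obj(sup))          # superiorized run ends with lower c.x
```

The unsuperiorized run stops at the first feasible point it holds, while the superiorized run drifts along the feasible set toward (but not necessarily to) the LP minimum, matching the "reduced, not necessarily minimal" phrasing above.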
Designing the framework for competency-based master of public health programs in India.
Sharma, Kavya; Zodpey, Sanjay; Morgan, Alison; Gaidhane, Abhay; Syed, Zahiruddin Quazi; Kumar, Rajeev
2013-01-01
Competency in the practice of public health is the implicit goal of education institutions that offer master of public health (MPH) programs. With the expanding number of institutions offering courses in public health in India, it is timely to develop a common framework to ensure that graduates are proficient in critical public health competencies. Steps such as situation assessment, a survey of public health care professionals in India, and national consultation were undertaken to develop a proposed competency-based framework for MPH programs in India. The existing curricula of all 23 Indian MPH courses vary significantly in content with regard to core, concentration, and crosscutting discipline areas and course durations. The competency or learning outcome is not well defined. The findings of the survey suggest that MPH graduates in India should have competencies spanning monitoring of health problems and epidemics in the community, applying biostatistics in public health, conducting action research, understanding social and community influences on public health, developing indicators and instruments to monitor and evaluate community health programs, developing proposals, and involving the community in planning, delivery, and monitoring of health programs. Competency statements were framed and mapped with domains including epidemiology, biostatistics, social and behavioral sciences, health care system, policy, planning, and financing, and environmental health sciences, and a crosscutting domain that includes health communication and informatics, health management and leadership, professionalism, systems thinking, and public health biology. The proposed competency-based framework for Indian MPH programs can be adapted to meet the needs of diverse, unique programs. The framework ensures the uniqueness and diversity of individual MPH programs in India while contributing to measures of overall program success.
Crocker, Jonny; Shields, Katherine F; Venkataramanan, Vidya; Saywell, Darren; Bartram, Jamie
2016-10-01
Training and capacity building are long established critical components of global water, sanitation, and hygiene (WaSH) policies, strategies, and programs. Expanding capacity building support for WaSH in developing countries is one of the targets of the Sustainable Development Goals. There are many training evaluation methods and tools available. However, training evaluations in WaSH have been infrequent, have often not utilized these methods and tools, and have lacked rigor. We developed a conceptual framework for evaluating training in WaSH by reviewing and adapting concepts from the literature. Our framework includes three target outcomes: learning, individual performance, and improved programming; and two sets of influences: trainee and context factors. We applied the framework to evaluate a seven-month community-led total sanitation (CLTS) management training program delivered to 42 government officials in Kenya from September 2013 to May 2014. Trainees were given a pre-training questionnaire and were interviewed at two weeks and seven months after initial training. We qualitatively analyzed the data using our conceptual framework. The training program resulted in trainees learning the CLTS process and new skills, and improving their individual performance through application of advocacy, partnership, and supervision soft skills. The link from trainees' performance to improved programming was constrained by resource limitations and pre-existing rigidity of trainees' organizations. Training-over-time enhanced outcomes and enabled trainees to overcome constraints in their work. Training in soft skills is relevant to managing public health programs beyond WaSH. We make recommendations on how training programs can be targeted and adapted to improve outcomes. Our conceptual framework can be used as a tool both for planning and evaluating training programs in WaSH. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
McCleery, W. Tyler; Mohd-Radzman, Nadiatul A.; Grieneisen, Veronica A.
Cells within tissues can be regarded as autonomous entities that respond to their local environment and signaling from neighbors. Cell coordination is particularly important in plants, where root architecture must strategically invest resources for growth to optimize nutrient acquisition. Thus, root cells are constantly adapting to environmental cues and neighbor communication in a non-linear manner. To explain such plasticity, we view the root as a swarm of coupled multi-cellular structures, ''metamers'', rather than as a continuum of identical cells. These metamers are individually programmed to achieve a local objective - developing a lateral root primordium, which aids in local foraging of nutrients. Collectively, such individual attempts may be halted, structuring root architecture as an emergent behavior. Each metamer's decision to branch is coordinated locally and globally through hormone signaling, including processes of controlled diffusion, active polar transport, and dynamic feedback. We present a physical model of the signaling mechanism that coordinates branching decisions in response to the environment. This work was funded by the European Commission 7th Framework Program, Project No. 601062, SWARM-ORGAN.
A farm-level precision land management framework based on integer programming
Li, Qi; Hu, Guiping; Jubery, Talukder Zaki; Ganapathysubramanian, Baskar
2017-01-01
Farmland management involves several planning and decision making tasks including seed selection and irrigation management. A farm-level precision farmland management model based on mixed integer linear programming is proposed in this study. Optimal decisions are designed for pre-season planning of crops and irrigation water allocation. The model captures the effect of size and shape of decision scale as well as special irrigation patterns. The authors illustrate the model with a case study on a farm in the state of California in the U.S. and show the model can capture the impact of precision farm management on profitability. The results show that a threefold increase in annual net profit for farmers could be achieved by carefully choosing irrigation and seed selection. Although farmers could increase profits by applying precision management to seed or irrigation alone, the profit increase is more significant if farmers apply precision management to seed and irrigation simultaneously. The proposed model can also serve as a risk analysis tool for farmers facing seasonal irrigation water limits as well as a quantitative tool to explore the impact of precision agriculture. PMID:28346499
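The joint seed-selection and irrigation decision can be miniaturized into an exhaustively solvable integer program (hypothetical yields, prices, and water limits; exhaustive search stands in for the mixed-integer LP solver):

```python
# Tiny integer program in the spirit of the paper's model: for each plot pick a
# seed variety and an integer number of water units (shared limit), maximizing
# profit. Exhaustive search replaces the mixed-integer LP solver.
from itertools import product

PLOTS = 3
WATER_LIMIT = 4                                 # total water units available
SEEDS = {"drought": 5.0, "thirsty": 2.0}        # base profit per plot
WATER_GAIN = {"drought": 1.0, "thirsty": 4.0}   # extra profit per water unit

def profit(assignment):
    """assignment: list of (seed, water_units) per plot."""
    return sum(SEEDS[s] + WATER_GAIN[s] * w for s, w in assignment)

best, best_plan = float("-inf"), None
for seeds in product(SEEDS, repeat=PLOTS):
    for waters in product(range(WATER_LIMIT + 1), repeat=PLOTS):
        if sum(waters) > WATER_LIMIT:
            continue                            # shared irrigation constraint
        p = profit(list(zip(seeds, waters)))
        if p > best:
            best, best_plan = p, list(zip(seeds, waters))
print(best, best_plan)
```

Here the optimum concentrates water on one water-responsive plot and plants the drought variety elsewhere, illustrating how seed and irrigation decisions interact when optimized jointly.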
NASA Astrophysics Data System (ADS)
Moraes Rêgo, Patrícia Helena; Viana da Fonseca Neto, João; Ferreira, Ernesto M.
2015-08-01
The main focus of this article is to present a proposal to solve, via UDU^T factorisation, the convergence and numerical stability problems that are related to the covariance matrix ill-conditioning of the recursive least squares (RLS) approach for online approximations of the algebraic Riccati equation (ARE) solution associated with the discrete linear quadratic regulator (DLQR) problem formulated in the actor-critic reinforcement learning and approximate dynamic programming context. The parameterisations of the Bellman equation, utility function and dynamic system, as well as the algebra of the Kronecker product, assemble a framework for the solution of the DLQR problem. The condition number and the positivity parameter of the covariance matrix are associated with statistical metrics for evaluating the approximation performance of the ARE solution via RLS-based estimators. The performance of RLS approximators is also evaluated in terms of consistence and polarisation when associated with reinforcement learning methods. The methodology contemplates realisations of online designs for DLQR controllers that are evaluated in a multivariable dynamic system model.
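For reference, the covariance recursion whose ill-conditioning motivates the article's UDU^T factorisation is the plain RLS update sketched below (a generic 2-parameter example, not the authors' factorised implementation):

```python
# Plain RLS for y = theta1*u1 + theta2*u2: the covariance matrix P is updated
# explicitly, which is exactly the quantity whose ill-conditioning motivates
# the UDU^T factorisation discussed in the article.
import random

random.seed(1)
theta_true = (3.0, -1.5)
theta = [0.0, 0.0]
P = [[1000.0, 0.0], [0.0, 1000.0]]    # large initial covariance

def rls_update(theta, P, x, y):
    # gain k = P x / (1 + x' P x)
    Px = (P[0][0] * x[0] + P[0][1] * x[1], P[1][0] * x[0] + P[1][1] * x[1])
    denom = 1.0 + x[0] * Px[0] + x[1] * Px[1]
    k = (Px[0] / denom, Px[1] / denom)
    err = y - (theta[0] * x[0] + theta[1] * x[1])
    theta = [theta[0] + k[0] * err, theta[1] + k[1] * err]
    # P <- P - k (x' P); x' P equals Px since P is symmetric
    P = [[P[0][0] - k[0] * Px[0], P[0][1] - k[0] * Px[1]],
         [P[1][0] - k[1] * Px[0], P[1][1] - k[1] * Px[1]]]
    return theta, P

for _ in range(200):
    x = (random.uniform(-1, 1), random.uniform(-1, 1))
    y = theta_true[0] * x[0] + theta_true[1] * x[1]
    theta, P = rls_update(theta, P, x, y)
print(theta)   # close to (3.0, -1.5)
```

A UDU^T variant would propagate the factors of P instead of P itself, preserving symmetry and positivity in finite precision.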
NASA Astrophysics Data System (ADS)
Moraitis, Kostas; Archontis, Vasilis; Tziotziou, Konstantinos; Georgoulis, Manolis K.
We calculate the instantaneous free magnetic energy and relative magnetic helicity of solar active regions using two independent approaches: a) a non-linear force-free (NLFF) method that requires only a single photospheric vector magnetogram, and b) well known semi-analytical formulas that require the full three-dimensional (3D) magnetic field structure. The 3D field is obtained either from MHD simulations, or from observed magnetograms via respective NLFF field extrapolations. We find qualitative agreement between the two methods and, quantitatively, a discrepancy not exceeding a factor of 4. The comparison of the two methods reveals, as a byproduct, two independent tests for the quality of a given force-free field extrapolation. We find that not all extrapolations manage to achieve the force-free condition in a valid, divergence-free, magnetic configuration. This research has been co-financed by the European Union (European Social Fund - ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: Thales. Investing in knowledge society through the European Social Fund.
Classical and sequential limit analysis revisited
NASA Astrophysics Data System (ADS)
Leblond, Jean-Baptiste; Kondo, Djimédo; Morin, Léo; Remmal, Almahdi
2018-04-01
Classical limit analysis applies to ideal plastic materials, and within a linearized geometrical framework implying small displacements and strains. Sequential limit analysis was proposed as a heuristic extension to materials exhibiting strain hardening, and within a fully general geometrical framework involving large displacements and strains. The purpose of this paper is to study and clearly state the precise conditions permitting such an extension. This is done by comparing the evolution equations of the full elastic-plastic problem, the equations of classical limit analysis, and those of sequential limit analysis. The main conclusion is that, whereas classical limit analysis applies to materials exhibiting elasticity - in the absence of hardening and within a linearized geometrical framework -, sequential limit analysis, to be applicable, strictly prohibits the presence of elasticity - although it tolerates strain hardening and large displacements and strains. For a given mechanical situation, the relevance of sequential limit analysis therefore essentially depends upon the importance of the elastic-plastic coupling in the specific case considered.
Specification and Verification of Web Applications in Rewriting Logic
NASA Astrophysics Data System (ADS)
Alpuente, María; Ballis, Demis; Romero, Daniel
This paper presents a Rewriting Logic framework that formalizes the interactions between Web servers and Web browsers through a communicating protocol abstracting HTTP. The proposed framework includes a scripting language that is powerful enough to model the dynamics of complex Web applications by encompassing the main features of the most popular Web scripting languages (e.g. PHP, ASP, Java Servlets). We also provide a detailed characterization of browser actions (e.g. forward/backward navigation, page refresh, and new window/tab openings) via rewrite rules, and show how our models can be naturally model-checked by using the Linear Temporal Logic of Rewriting (LTLR), which is a Linear Temporal Logic specifically designed for model-checking rewrite theories. Our formalization is particularly suitable for verification purposes, since it allows one to perform in-depth analyses of many subtle aspects related to Web interaction. Finally, the framework has been completely implemented in Maude, and we report on some successful experiments that we conducted by using the Maude LTLR model-checker.
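The browser-navigation modeling idea can be miniaturized: states rewritten by rules, a breadth-first exploration of the reachable states, and a safety check over them (a toy stand-in for the Maude/LTLR machinery; pages and rules are hypothetical):

```python
# Toy rewrite-theory model check: browser sessions as states, navigation
# actions as rewrite rules, and a BFS over reachable states verifying an
# invariant -- a tiny stand-in for Maude's LTLR model checker.
from collections import deque

# State: (current_page, history_tuple). Rules rewrite one state to others.
def rules(state):
    page, hist = state
    links = {"home": ["login"], "login": ["account"], "account": []}
    nxt = [(target, hist + (page,)) for target in links[page]]  # follow a link
    if hist:                                                    # back button
        nxt.append((hist[-1], hist[:-1]))
    return nxt

def reachable(initial, limit=1000):
    seen, frontier = {initial}, deque([initial])
    while frontier and len(seen) < limit:
        for s in rules(frontier.popleft()):
            if s not in seen:
                seen.add(s)
                frontier.append(s)
    return seen

states = reachable(("home", ()))
# Safety property: 'account' is never reached without passing through 'login'.
violations = [s for s in states if s[0] == "account" and "login" not in s[1]]
print(len(states), len(violations))
```

LTLR would express the property as a temporal formula over rewrites rather than a filter over reached states, but the reachability core is the same.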
Yan, Yong; O'Connor, Alice E; Kanthasamy, Gopikkaa; Atkinson, George; Allan, David R; Blake, Alexander J; Schröder, Martin
2018-03-21
High-pressure single-crystal X-ray structural analyses of isostructural MFM-133(M) (M = Zr, Hf) of flu topology, incorporating the tetracarboxylate ligand TCHB4- [H4TCHB = 3,3',5,5'-tetrakis(4-carboxyphenyl)-2,2',4,4',6,6'-hexamethyl-1,1'-biphenyl] and {M6(μ3-OH)8(OH)8(COO)8} clusters, confirm negative linear compressibility (NLC) behavior along the c axis. This occurs via a three-dimensional wine-rack NLC mechanism leading to distortion of the octahedral cage toward a more elongated polyhedron under static compression. Despite the isomorphous nature of these two structures, MFM-133(Hf) shows a higher degree of NLC than the Zr(IV) analogue. Thus, for the first time, we demonstrate that the NLC property can be effectively tuned in a framework material by simply varying the inorganic component of the framework without changing the network topology and structure.
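Linear compressibility K_l = -(1/l)(dl/dP) can be estimated from lattice parameter versus pressure data by a least-squares slope; the sketch below uses synthetic numbers, not the MFM-133 measurements:

```python
# Estimate linear compressibility K = -(1/l) * dl/dP from (pressure, length)
# pairs via a least-squares slope; a negative K along an axis signals NLC
# (the axis lengthens under compression). Synthetic data, not MFM-133 values.

pressures = [0.0, 0.5, 1.0, 1.5, 2.0]          # GPa (hypothetical)
c_axis = [20.00, 20.03, 20.06, 20.09, 20.12]   # angstrom, grows with pressure

def slope(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

dl_dP = slope(pressures, c_axis)
K_c = -dl_dP / c_axis[0]           # per GPa, evaluated at the ambient length
print(K_c)                          # negative => negative linear compressibility
```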
Steeply dipping heaving bedrock, Colorado: Part 1 - Heave features and physical geological framework
Noe, D.C.; Higgins, J.D.; Olsen, H.W.
2007-01-01
Differentially heaving bedrock has caused severe damage near the Denver metropolitan area. This paper describes heave-feature morphologies, the underlying bedrock framework, and their inter-relationship. The heave features are linear to curvilinear and may attain heights of 0.7 m (2.4 ft), widths of 58 m (190 ft), and lengths of 1,067 m (3,500 ft). They are nearly symmetrical to highly asymmetrical in cross section, with width-to-height ratios of 45:1 to 400:1, and most are oriented parallel with the mountain front. The bedrock consists of Mesozoic sedimentary formations with dip angles ranging from 30 degrees through vertical to overturned. Mixed claystone-siltstone bedding sequences up to 36 m (118 ft) thick are common in the heave-prone areas, and interbeds of bentonite, limestone, or sandstone may be present. Highly fractured zones of weathered to variably weathered claystone extend to depths of 19.5 to 22.3 m (64 to 73 ft). Fracture spacings are 0.1 to 0.2 m (0.3 to 0.7 ft) in the weathered and variably weathered bedrock and up to 0.75 m (2.5 ft) in the underlying, unweathered bedrock. Curvilinear shear planes in the weathered claystone show thrust or reverse offsets up to 1.2 m (3.9 ft). Three associations between heave-feature morphologies and the geological framework are recognized: (1) linear, symmetrical to asymmetrical heaves are associated with primary bedding composition changes; (2) linear, highly asymmetrical heaves are associated with shear planes along bedding; and (3) curvilinear, highly asymmetrical heaves are associated with bedding-oblique shear planes.
Portfolio optimization using fuzzy linear programming
NASA Astrophysics Data System (ADS)
Pandit, Purnima K.
2013-09-01
Portfolio Optimization (PO) is a problem in finance in which an investor tries to maximize return and minimize risk by carefully choosing different assets. Expected return and risk are the most important parameters with regard to optimal portfolios. In its simple form, PO can be modeled as a quadratic programming problem, which can be put into an equivalent linear form. PO problems with fuzzy parameters can be solved as multi-objective fuzzy linear programming problems. In this paper we give the solution to such problems with an illustrative example.
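The linear reformulation described above can be made concrete with a small sketch. The asset returns and the per-asset cap below are hypothetical; with only a budget constraint and simple bounds, this particular LP has continuous-knapsack structure and can be solved greedily without an external solver (the fuzzy multi-objective case discussed in the paper would require one).

```python
def max_return_portfolio(returns, cap):
    """Maximize sum(r_i * x_i) subject to sum(x_i) == 1 and 0 <= x_i <= cap.

    With only a budget constraint and per-asset caps, this LP has the
    structure of a continuous knapsack, so greedily funding the assets
    with the highest expected return is optimal.
    """
    assert cap * len(returns) >= 1.0, "caps must admit a full allocation"
    x = [0.0] * len(returns)
    budget = 1.0
    # Fund assets in order of decreasing expected return.
    for i in sorted(range(len(returns)), key=lambda i: -returns[i]):
        x[i] = min(cap, budget)
        budget -= x[i]
        if budget <= 0:
            break
    return x

# Hypothetical expected returns for three assets, with a 50% cap per asset.
weights = max_return_portfolio([0.10, 0.03, 0.07], cap=0.5)
expected_return = sum(r * w for r, w in zip([0.10, 0.03, 0.07], weights))
```

A real portfolio model would add a risk term (e.g., a linearized mean absolute deviation), which breaks the greedy structure and calls for a general LP solver.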
Users manual for linear Time-Varying Helicopter Simulation (Program TVHIS)
NASA Technical Reports Server (NTRS)
Burns, M. R.
1979-01-01
A linear time-varying helicopter simulation program (TVHIS) is described. The program is designed as a realistic yet efficient helicopter simulation. It is based on a linear time-varying helicopter model which includes rotor, actuator, and sensor models, as well as a simulation of flight computer logic. The TVHIS can generate a mean trajectory simulation along a nominal trajectory, or propagate covariance of helicopter states, including rigid-body, turbulence, control command, controller states, and rigid-body state estimates.
NASA Astrophysics Data System (ADS)
Si, Y.; Li, X.; Li, T.; Huang, Y.; Yin, D.
2016-12-01
The cascade reservoirs in the Upper Yellow River (UYR), one of the largest hydropower bases in China, play a vital role in peak load and frequency regulation for the Northwest China Power Grid. The joint operation of this system has been put forward for years but has not come into effect due to management difficulties and inflow uncertainties, so there is still considerable room for improvement in hydropower production. This study presents a decision support framework incorporating long- and short-term operation of the reservoir system. For long-term operation, we maximize hydropower production of the reservoir system using historical hydrological data of multiple years and derive operating rule curves for storage reservoirs. For short-term operation, we develop a program consisting of three modules, namely a hydrologic forecast module, a reservoir operation module, and a coordination module. The coordination module is responsible for calling the hydrologic forecast module to acquire predicted inflow within a short-term horizon, and for transferring the information to the reservoir operation module to generate optimal release decisions. With the hydrologic forecast information updated, the rolling short-term optimization is iterated until the end of the operation period, where the long-term operating curves serve as the ending storage target. As an application, the Digital Yellow River Integrated Model (DYRIM, specially designed for runoff-sediment simulation in the Yellow River basin by Tsinghua University) is used in the hydrologic forecast module, and successive linear programming (SLP) is used in the reservoir operation module. The application to the reservoir system of the UYR demonstrates that the framework can effectively support real-time decision making and ensure both computational accuracy and speed. Furthermore, the general framework can be extended to any other reservoir system, with any hydrological model (or combination of models) for forecasting and any solver for optimizing reservoir operation.
Visual exploration of high-dimensional data through subspace analysis and dynamic projections
Liu, S.; Wang, B.; Thiagarajan, J. J.; ...
2015-06-01
Here, we introduce a novel interactive framework for visualizing and exploring high-dimensional datasets based on subspace analysis and dynamic projections. We assume the high-dimensional dataset can be represented by a mixture of low-dimensional linear subspaces with mixed dimensions, and provide a method to reliably estimate the intrinsic dimension and linear basis of each subspace extracted from the subspace clustering. Subsequently, we use these bases to define unique 2D linear projections as viewpoints from which to visualize the data. To understand the relationships among the different projections and to discover hidden patterns, we connect these projections through dynamic projections that create smooth animated transitions between pairs of projections. We introduce the view transition graph, which provides flexible navigation among these projections to facilitate an intuitive exploration. Finally, we provide detailed comparisons with related systems, and use real-world examples to demonstrate the novelty and usability of our proposed framework.
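The idea of using a subspace basis as a 2D viewpoint can be sketched as follows. The points and basis vectors here are hypothetical; a real system would obtain each basis from the subspace clustering rather than hand-pick it.

```python
def project_2d(points, b1, b2):
    """Project d-dimensional points onto the plane spanned by two
    orthonormal basis vectors b1 and b2, yielding 2D view coordinates."""
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    return [(dot(p, b1), dot(p, b2)) for p in points]

# Hypothetical 3D data viewed from the plane spanned by the x and y axes.
pts = [(1.0, 2.0, 9.0), (3.0, 4.0, -5.0)]
view = project_2d(pts, b1=(1.0, 0.0, 0.0), b2=(0.0, 1.0, 0.0))
```

A dynamic transition between two such viewpoints would then interpolate between the two pairs of basis vectors frame by frame.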
Doctor coach: a deliberate practice approach to teaching and learning clinical skills.
Gifford, Kimberly A; Fall, Leslie H
2014-02-01
The rapidly evolving medical education landscape requires restructuring the approach to teaching and learning across the continuum of medical education. The deliberate practice strategies used to coach learners in disciplines beyond medicine can also be used to train medical learners. However, these deliberate practice strategies are not explicitly taught in most medical schools or residencies. The authors designed the Doctor Coach framework and competencies in 2007-2008 to serve as the foundation for new faculty development and resident-as-teacher programs. In addition to teaching deliberate practice strategies, the programs model a deliberate practice approach that promotes the continuous integration of newly developed coaching competencies by participants into their daily teaching practice. Early evaluation demonstrated the feasibility and efficacy of implementing the Doctor Coach framework across the continuum of medical education. Additionally, the Doctor Coach framework has been disseminated through national workshops, which have resulted in additional institutions applying the framework and competencies to develop their own coaching programs. Design of a multisource evaluation tool based on the coaching competencies will enable more rigorous study of the Doctor Coach framework and training programs and provide a richer feedback mechanism for participants. The framework will also facilitate the faculty development needed to implement the milestones and entrustable professional activities in medical education.
Automatic Mrf-Based Registration of High Resolution Satellite Video Data
NASA Astrophysics Data System (ADS)
Platias, C.; Vakalopoulou, M.; Karantzalos, K.
2016-06-01
In this paper we propose a deformable registration framework for high-resolution satellite video data that can automatically and accurately co-register satellite video frames and/or register them to a reference map/image. The proposed approach performs non-rigid registration by formulating a Markov Random Field (MRF) model, while efficient linear programming is employed to reach the lowest potential of the cost function. The developed approach has been applied to and validated on satellite video sequences from Skybox Imaging and compared with a rigid, descriptor-based registration method. Regarding computational performance, both the MRF-based and the descriptor-based methods were quite efficient, with the former converging in a few minutes and the latter in a few seconds. Regarding registration accuracy, the proposed MRF-based method significantly outperformed the descriptor-based one in all the performed experiments.
Quantifying Bell nonlocality with the trace distance
NASA Astrophysics Data System (ADS)
Brito, S. G. A.; Amaral, B.; Chaves, R.
2018-02-01
Measurements performed on distant parts of an entangled quantum state can generate correlations incompatible with classical theories respecting the assumption of local causality. This is the phenomenon known as quantum nonlocality, which, apart from its fundamental role, can also be put to practical use in applications such as cryptography and distributed computing. Clearly, developing ways of quantifying nonlocality is an important primitive in this scenario. Here, we propose to quantify the nonlocality of a given probability distribution via its trace distance to the set of classical correlations. We show that this measure is a monotone under the free operations of a resource theory and, furthermore, that it can be computed efficiently with a linear program. We put our framework to use in a variety of relevant Bell scenarios, also comparing the trace distance to other standard measures in the literature.
Distance estimation and collision prediction for on-line robotic motion planning
NASA Technical Reports Server (NTRS)
Kyriakopoulos, K. J.; Saridis, G. N.
1992-01-01
An efficient method for computing the minimum distance and predicting collisions between moving objects is presented. This problem is incorporated into the framework of an on-line motion-planning algorithm to ensure collision avoidance between a robot and moving objects modeled as convex polyhedra. First, the deterministic problem, in which the information about the objects is assumed to be certain, is examined. The L(1) or L(infinity) norm is used to represent distance, and the problem becomes a linear programming problem. The stochastic problem is then formulated, where the uncertainty is induced by sensing and by the unknown dynamics of the moving obstacles. Two problems are considered: first, filtering of the distance between the robot and the moving object at the present time; second, prediction of the minimum distance in the future in order to predict the collision time.
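As a simplified illustration of the norm-based distance idea (not the paper's full polyhedral formulation, which requires solving a linear program), the L1 distance from a point to an axis-aligned box decomposes per axis into a clamped shortfall or excess:

```python
def l1_distance_point_box(p, lo, hi):
    """L1 distance from point p to the axis-aligned box [lo, hi].

    Each coordinate contributes independently: the shortfall below lo[i],
    or the excess above hi[i], and zero if p[i] lies inside the interval.
    """
    return sum(max(l - x, 0.0, x - h) for x, l, h in zip(p, lo, hi))

# Hypothetical obstacle box [0,2]x[0,2]x[0,1] and a robot reference point.
d = l1_distance_point_box((3.0, -1.0, 0.5), lo=(0.0, 0.0, 0.0), hi=(2.0, 2.0, 1.0))
```

For two general convex polyhedra the same norm choice keeps the minimum-distance problem linear, but the minimization over both bodies' points must be delegated to an LP solver.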
Optimal motion planning using navigation measure
NASA Astrophysics Data System (ADS)
Vaidya, Umesh
2018-05-01
We introduce navigation measure as a new tool to solve the motion planning problem in the presence of static obstacles. Existence of navigation measure guarantees collision-free convergence at the final destination set beginning with almost every initial condition with respect to the Lebesgue measure. Navigation measure can be viewed as a dual to the navigation function. While the navigation function has its minimum at the final destination set and peaks at the obstacle set, navigation measure takes the maximum value at the destination set and is zero at the obstacle set. A linear programming formalism is proposed for the construction of navigation measure. Set-oriented numerical methods are utilised to obtain finite dimensional approximation of this navigation measure. Application of the proposed navigation measure-based theoretical and computational framework is demonstrated for a motion planning problem in a complex fluid flow.
SCM: A method to improve network service layout efficiency with network evolution
Zhao, Qi; Zhang, Chuanhao
2017-01-01
Network services are an important component of the Internet and are used to extend network functions for third-party developers. Network function virtualization (NFV) can improve the speed and flexibility of network service deployment. However, as the network evolves, the network service layout may become inefficient. To address this problem, this paper proposes a service chain migration (SCM) method within the "software defined network + network function virtualization" (SDN+NFV) framework, which migrates service chains to adapt to network evolution and improves the efficiency of the network service layout. SCM is modeled as an integer linear programming problem and solved via particle swarm optimization. An SCM prototype system is designed based on an SDN controller. Experiments demonstrate that SCM can reduce network traffic cost and energy consumption efficiently. PMID:29267299
Managing time-substitutable electricity usage using dynamic controls
Ghosh, Soumyadip; Hosking, Jonathan R.; Natarajan, Ramesh; Subramaniam, Shivaram; Zhang, Xiaoxuan
2017-02-07
A predictive-control approach allows an electricity provider to monitor and proactively manage peak and off-peak residential intra-day electricity usage in an emerging smart energy grid using time-dependent dynamic pricing incentives. The daily load is modeled as time-shifted, but cost-differentiated and substitutable, copies of the continuously-consumed electricity resource, and a consumer-choice prediction model is constructed to forecast the corresponding intra-day shares of total daily load according to this model. This is embedded within an optimization framework for managing the daily electricity usage. A series of transformations are employed, including the reformulation-linearization technique (RLT) to obtain a Mixed-Integer Programming (MIP) model representation of the resulting nonlinear optimization problem. In addition, various regulatory and pricing constraints are incorporated in conjunction with the specified profit and capacity utilization objectives.
Linear Programming for Vocational Education Planning. Interim Report.
ERIC Educational Resources Information Center
Young, Robert C.; And Others
The purpose of the paper is to define for potential users of vocational education management information systems a quantitative analysis technique and its utilization to facilitate more effective planning of vocational education programs. Defining linear programming (LP) as a management technique used to solve complex resource allocation problems…
ERIC Educational Resources Information Center
Puhan, Gautam; Moses, Tim P.; Yu, Lei; Dorans, Neil J.
2007-01-01
The purpose of the current study was to examine whether log-linear smoothing of observed score distributions in small samples results in more accurate differential item functioning (DIF) estimates under the simultaneous item bias test (SIBTEST) framework. Data from a teacher certification test were analyzed using White candidates in the reference…
Optimization-Based Robust Nonlinear Control
2006-08-01
ABSTRACT: New control algorithms were developed for robust stabilization of nonlinear dynamical systems. Novel, linear matrix inequality-based synthesis... was to further advance optimization-based robust nonlinear control design, for general nonlinear systems (especially in discrete time), for linear... Teel, IEEE Transactions on Control Systems Technology, vol. 14, no. 3, pp. 398-407, May 2006. 3. "A unified framework for input-to-state stability in
ERIC Educational Resources Information Center
Kunina-Habenicht, Olga; Rupp, Andre A.; Wilhelm, Oliver
2012-01-01
Using a complex simulation study we investigated parameter recovery, classification accuracy, and performance of two item-fit statistics for correct and misspecified diagnostic classification models within a log-linear modeling framework. The basic manipulated test design factors included the number of respondents (1,000 vs. 10,000), attributes (3…
Integrated Analytic and Linearized Inverse Kinematics for Precise Full Body Interactions
NASA Astrophysics Data System (ADS)
Boulic, Ronan; Raunhardt, Daniel
Despite the large success of games grounded in movement-based interactions, the current state of full-body motion capture technologies still prevents the exploitation of precise interactions with complex environments. This paper focuses on ensuring a precise spatial correspondence between the user and the avatar. We build upon our past effort in human postural control with a Prioritized Inverse Kinematics framework. One of its key advantages is to ease the dynamic combination of postural and collision-avoidance constraints. However, its reliance on a linearized approximation of the problem makes it vulnerable to the well-known full-extension singularity of the limbs. In such a context the tracking performance is reduced and/or less believable intermediate postural solutions are produced. We address this issue by introducing a new type of analytic constraint that smoothly integrates within the Prioritized Inverse Kinematics framework. The paper first recalls the background of full-body 3D interactions and the advantages and drawbacks of the linearized IK solution. Then the Flexion-EXTension constraint (FLEXT for short) is introduced for the partial position control of limb-like articulated structures. Comparative results illustrate the interest of this new type of integrated analytic and linearized IK control.
NASA Astrophysics Data System (ADS)
Kiyono, Ken; Tsujimoto, Yutaka
2016-07-01
We develop a general framework to study the time and frequency domain characteristics of detrending-operation-based scaling analysis methods, such as detrended fluctuation analysis (DFA) and detrending moving average (DMA) analysis. In this framework, using either the time or frequency domain approach, the frequency responses of detrending operations are calculated analytically. Although the frequency domain approach based on conventional linear analysis techniques is only applicable to linear detrending operations, the time domain approach presented here is applicable to both linear and nonlinear detrending operations. Furthermore, using the relationship between the time and frequency domain representations of the frequency responses, the frequency domain characteristics of nonlinear detrending operations can be obtained. Based on the calculated frequency responses, it is possible to establish a direct connection between the root-mean-square deviation of the detrending-operation-based scaling analysis and the power spectrum for linear stochastic processes. Here, by applying our methods to DFA and DMA, including higher-order cases, exact frequency responses are calculated. In addition, we analytically investigate the cutoff frequencies of DFA and DMA detrending operations and show that these frequencies are not optimally adjusted to coincide with the corresponding time scale.
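A minimal first-order DFA sketch, under simplified conventions (non-overlapping windows, tail samples discarded), shows the detrending operation whose frequency response the framework above characterizes:

```python
def dfa_fluctuation(x, n):
    """First-order DFA fluctuation F(n): integrate the series, split the
    profile into windows of length n, remove a least-squares linear trend
    from each window, and return the RMS of the residuals."""
    mean = sum(x) / len(x)
    # Profile: cumulative sum of the mean-subtracted series.
    y, s = [], 0.0
    for v in x:
        s += v - mean
        y.append(s)
    residuals = []
    for start in range(0, len(y) - n + 1, n):
        w = y[start:start + n]
        t = list(range(n))
        tbar, wbar = sum(t) / n, sum(w) / n
        # Closed-form least-squares slope and intercept for this window.
        num = sum((ti - tbar) * (wi - wbar) for ti, wi in zip(t, w))
        den = sum((ti - tbar) ** 2 for ti in t)
        slope = num / den
        intercept = wbar - slope * tbar
        residuals += [(wi - (intercept + slope * ti)) ** 2 for ti, wi in zip(t, w)]
    return (sum(residuals) / len(residuals)) ** 0.5

# A constant series has a flat (zero) profile, so F(n) must vanish.
f = dfa_fluctuation([1.0] * 64, n=8)
```

Scanning F(n) over a range of window sizes n, and fitting log F(n) against log n, yields the scaling exponent that DFA is normally used to estimate.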
NASA Astrophysics Data System (ADS)
Chen, Ying-Ying; Jin, Fei-Fei
2018-03-01
The eastern equatorial Pacific has a pronounced westward-propagating SST annual cycle resulting from ocean-atmosphere interactions with equatorial semiannual solar forcing and off-equatorial annual solar forcing conveyed to the equator. In this two-part paper, a simple linear coupled framework is proposed to quantify the internal dynamics and external forcing for a better understanding of the linear dynamics of the annual cycle. It is shown that an essential internal dynamical factor is the SST damping rate, which measures the coupled stability in a similar way to the Bjerknes instability index for the El Niño-Southern Oscillation. It comprises three major negative terms (dynamic damping due to the Ekman pumping feedback, mean circulation advection, and thermodynamic feedback) and two positive terms (thermocline feedback and zonal advection). Another dynamical factor is the westward-propagation speed, which is mainly determined by the thermodynamic feedback, the Ekman pumping feedback, and the mean circulation. The external forcing is measured by the annual and semiannual forcing factors. These linear internal and external factors, which can be estimated from data, determine the amplitude of the annual cycle.
Feasibility of Decentralized Linear-Quadratic-Gaussian Control of Autonomous Distributed Spacecraft
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell
1999-01-01
A distributed satellite formation, modeled as an arbitrary number of fully connected nodes in a network, could be controlled using a decentralized controller framework that distributes operations in parallel over the network. For such problems, a solution that minimizes data transmission requirements, in the context of linear-quadratic-Gaussian (LQG) control theory, was given by Speyer. This approach is advantageous because it is non-hierarchical, detected failures gracefully degrade system performance, fewer local computations are required than for a centralized controller, and it is optimal with respect to the standard LQG cost function. Disadvantages of the approach are the need for a fully connected communications network, the total operations performed over all the nodes are greater than for a centralized controller, and the approach is formulated for linear time-invariant systems. To investigate the feasibility of the decentralized approach to satellite formation flying, a simple centralized LQG design for a spacecraft orbit control problem is adapted to the decentralized framework. The simple design uses a fixed reference trajectory (an equatorial, Keplerian, circular orbit), and by appropriate choice of coordinates and measurements is formulated as a linear time-invariant system.
An evaluation of bias in propensity score-adjusted non-linear regression models.
Wan, Fei; Mitra, Nandita
2018-03-01
Propensity score methods are commonly used to adjust for observed confounding when estimating the conditional treatment effect in observational studies. One popular method, covariate adjustment of the propensity score in a regression model, has been empirically shown to be biased in non-linear models. However, no compelling underlying theoretical reason has been presented. We propose a new framework to investigate bias and consistency of propensity score-adjusted treatment effects in non-linear models that uses a simple geometric approach to forge a link between the consistency of the propensity score estimator and the collapsibility of non-linear models. Under this framework, we demonstrate that adjustment of the propensity score in an outcome model results in the decomposition of observed covariates into the propensity score and a remainder term. Omission of this remainder term from a non-collapsible regression model leads to biased estimates of the conditional odds ratio and conditional hazard ratio, but not for the conditional rate ratio. We further show, via simulation studies, that the bias in these propensity score-adjusted estimators increases with larger treatment effect size, larger covariate effects, and increasing dissimilarity between the coefficients of the covariates in the treatment model versus the outcome model.
Probabilistic dual heuristic programming-based adaptive critic
NASA Astrophysics Data System (ADS)
Herzallah, Randa
2010-02-01
Adaptive critic (AC) methods have common roots as generalisations of dynamic programming for neural reinforcement learning approaches. Since they approximate the dynamic programming solutions, they are potentially suitable for learning in noisy, non-linear and non-stationary environments. In this study, a novel probabilistic dual heuristic programming (DHP)-based AC controller is proposed. Distinct from current approaches, the proposed probabilistic DHP AC method takes uncertainties of the forward model and inverse controller into consideration. Therefore, it is suitable for deterministic and stochastic control problems characterised by functional uncertainty. Theoretical development of the proposed method is validated by analytically evaluating the correct value of the cost function which satisfies the Bellman equation in a linear quadratic control problem. The target value of the probabilistic critic network is then calculated and shown to be equal to the analytically derived correct value. A full derivation of the Riccati solution for this non-standard stochastic linear quadratic control problem is also provided. Moreover, the performance of the proposed probabilistic controller is demonstrated on linear and non-linear control examples.
Tegegne, Sisay G.; MKanda, Pascal; Yehualashet, Yared G.; Erbeto, Tesfaye B.; Touray, Kebba; Nsubuga, Peter; Banda, Richard; Vaz, Rui G.
2016-01-01
Background. An accountability framework is a central feature of managing human and financial resources. One of its primary goals is to improve program performance through close monitoring of selected priority activities. The principal objective of this study was to determine the contribution of a systematic accountability framework to improving the performance of the World Health Organization (WHO)-Nigeria polio program staff, as well as the program itself. Methods. The effect of implementation of the accountability framework was evaluated using data on administrative actions and selected process indicators associated with acute flaccid paralysis (AFP) surveillance, routine immunization, and polio supplemental immunization activities. Data were collected in 2014 during supportive supervision, using Magpi, a mobile-phone-based data collection application. A total of 2500 staff were studied. Results. Data on administrative actions and process indicators from quarters 2-4 in 2014 were compared. With respect to administrative actions, 1631 personnel (74%) received positive feedback (written or verbal commendation) in quarter 4 through the accountability framework, compared with 1569 (73%) and 1152 (61%) during quarters 3 and 2, respectively. These findings accorded with data on process indicators associated with AFP surveillance and routine immunization, showing statistically significant improvements in staff performance at the end of quarter 4, compared with other quarters. Conclusions. Improvements in staff performance and process indicators were observed for the WHO-Nigeria polio program after implementation of a systematic accountability framework. PMID:26823334
A mathematical framework for the selection of an optimal set of peptides for epitope-based vaccines.
Toussaint, Nora C; Dönnes, Pierre; Kohlbacher, Oliver
2008-12-01
Epitope-based vaccines (EVs) have a wide range of applications: from therapeutic to prophylactic approaches, from infectious diseases to cancer. The development of an EV is based on the knowledge of target-specific antigens from which immunogenic peptides, so-called epitopes, are derived. Such epitopes form the key components of the EV. Due to regulatory, economic, and practical concerns the number of epitopes that can be included in an EV is limited. Furthermore, as the major histocompatibility complex (MHC) binding these epitopes is highly polymorphic, every patient possesses a set of MHC class I and class II molecules of differing specificities. A peptide combination effective for one person can thus be completely ineffective for another. This renders the optimal selection of these epitopes an important and interesting optimization problem. In this work we present a mathematical framework based on integer linear programming (ILP) that allows the formulation of various flavors of the vaccine design problem and the efficient identification of optimal sets of epitopes. Out of a user-defined set of predicted or experimentally determined epitopes, the framework selects the set with the maximum likelihood of eliciting a broad and potent immune response. Our ILP approach allows an elegant and flexible formulation of numerous variants of the EV design problem. In order to demonstrate this, we show how common immunological requirements for a good EV (e.g., coverage of epitopes from each antigen, coverage of all MHC alleles in a set, or avoidance of epitopes with high mutation rates) can be translated into constraints or modifications of the objective function within the ILP framework. An implementation of the algorithm outperforms a simple greedy strategy as well as a previously suggested evolutionary algorithm and has runtimes on the order of seconds for typical problem sizes.
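The simple greedy strategy that the ILP framework is reported to outperform can be sketched as follows; the epitope names and MHC allele sets below are hypothetical:

```python
def greedy_epitope_selection(coverage, k):
    """Pick up to k epitopes greedily, each time adding the epitope that
    covers the most MHC alleles not yet covered. This is the simple
    baseline the ILP framework is compared against, not the ILP itself."""
    chosen, covered = [], set()
    for _ in range(k):
        # Score each unchosen epitope by its marginal allele coverage.
        best = max(coverage,
                   key=lambda e: len(coverage[e] - covered) if e not in chosen else -1)
        if best in chosen or not (coverage[best] - covered):
            break  # no remaining epitope adds coverage
        chosen.append(best)
        covered |= coverage[best]
    return chosen, covered

# Hypothetical epitopes mapped to the MHC alleles they bind.
cov = {"EP1": {"A*02:01", "B*07:02"},
       "EP2": {"A*02:01"},
       "EP3": {"B*08:01", "A*01:01"}}
chosen, covered = greedy_epitope_selection(cov, k=2)
```

The ILP formulation replaces this myopic loop with an exact optimization, which is what allows the additional immunological requirements (antigen coverage, allele coverage, mutation-rate avoidance) to be expressed as hard constraints.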
Hermann, Gunter; Pohl, Vincent; Tremblay, Jean Christophe
2017-10-30
In this contribution, we extend our framework for analyzing and visualizing correlated many-electron dynamics to a non-variational, highly scalable electronic structure method. Specifically, an explicitly time-dependent electronic wave packet is written as a linear combination of N-electron wave functions at the configuration interaction singles (CIS) level, which are obtained from a reference time-dependent density functional theory (TDDFT) calculation. The procedure is implemented in the open-source Python program detCI@ORBKIT, which extends the capabilities of our recently published post-processing toolbox (Hermann et al., J. Comput. Chem. 2016, 37, 1511). From the output of standard quantum chemistry packages using atom-centered Gaussian-type basis functions, the framework exploits the multideterminantal structure of the hybrid TDDFT/CIS wave packet to compute fundamental one-electron quantities such as difference electronic densities, transient electronic flux densities, and transition dipole moments. The hybrid scheme is benchmarked against wave function data for the laser-driven state-selective excitation in LiH. It is shown that all features of the electron dynamics are in good quantitative agreement with the higher-level method, provided a judicious choice of functional is made. Broadband excitation of a medium-sized organic chromophore further demonstrates the scalability of the method. In addition, the time-dependent flux densities unravel the mechanistic details of the simulated charge migration process at a glance. © 2017 Wiley Periodicals, Inc.
Pey, Jon; Valgepea, Kaspar; Rubio, Angel; Beasley, John E; Planes, Francisco J
2013-12-08
The study of cellular metabolism in the context of high-throughput -omics data has allowed us to decipher novel mechanisms of importance in biotechnology and health. To continue with this progress, it is essential to efficiently integrate experimental data into metabolic modeling. We present here an in-silico framework to infer relevant metabolic pathways for a particular phenotype under study based on its gene/protein expression data. This framework is based on the Carbon Flux Path (CFP) approach, a mixed-integer linear program that expands classical path finding techniques by considering additional biophysical constraints. In particular, the objective function of the CFP approach is amended to account for gene/protein expression data and to influence the paths obtained. This approach is termed integrative Carbon Flux Path (iCFP). We show that gene/protein expression data also influences the stoichiometric balancing of CFPs, which provides a more accurate picture of active metabolic pathways. This is illustrated in both a theoretical and a real scenario. Finally, we apply this approach to find novel pathways relevant in the regulation of acetate overflow metabolism in Escherichia coli. As a result, several targets which could be relevant for a better understanding of the phenomenon leading to impaired acetate overflow are proposed. A novel mathematical framework that determines functional pathways based on gene/protein expression data is presented and validated. We show that our approach is able to provide new insights into complex biological scenarios such as acetate overflow in Escherichia coli.
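The full CFP formulation is a mixed-integer linear program with stoichiometric and biophysical constraints, which is beyond a short sketch. The toy example below captures only the expression-weighting idea: path search over a reaction graph in which each edge is penalized by the inverse of its gene-expression level, so highly expressed routes are preferred. The network, reaction names and expression values are hypothetical.

```python
import heapq

# Toy metabolic graph: edges metabolite -> (metabolite, reaction, expression).
# All names and expression levels are hypothetical, not from the paper.
edges = {
    "glucose": [("g6p", "hk", 9.0), ("g6p_alt", "alt", 1.0)],
    "g6p": [("pyruvate", "gly", 8.0)],
    "g6p_alt": [("pyruvate", "alt2", 1.5)],
    "pyruvate": [("acetate", "pta", 7.0)],
}

def best_path(source, target):
    """Dijkstra with edge cost 1/expression: prefers highly expressed reactions."""
    queue = [(0.0, source, [])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == target:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, reaction, expr in edges.get(node, []):
            heapq.heappush(queue, (cost + 1.0 / expr, nxt, path + [reaction]))
    return None

cost, path = best_path("glucose", "acetate")
print(path)  # -> ['hk', 'gly', 'pta']
```

The iCFP approach embeds this preference in a MILP objective, so mass balance and thermodynamic constraints are enforced simultaneously with the expression weighting.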
Kinjo, Ken; Uchibe, Eiji; Doya, Kenji
2013-01-01
Linearly solvable Markov Decision Process (LMDP) is a class of optimal control problems in which Bellman's equation can be converted into a linear equation by an exponential transformation of the state value function (Todorov, 2009b). In an LMDP, the optimal value function and the corresponding control policy are obtained by solving an eigenvalue problem in a discrete state space or an eigenfunction problem in a continuous state space, using the knowledge of the system dynamics and the action, state, and terminal cost functions. In this study, we evaluate the effectiveness of the LMDP framework in real robot control, in which the dynamics of the body and the environment have to be learned from experience. We first perform a simulation study of a pole swing-up task to evaluate the effect of the accuracy of the learned dynamics model on the derived action policy. The result shows that a crude linear approximation of the non-linear dynamics can still allow solution of the task, albeit at a higher total cost. We then perform real robot experiments of a battery-catching task using our Spring Dog mobile robot platform. The state is given by the position and the size of a battery in its camera view and two neck joint angles. The action is the velocities of two wheels, while the neck joints are controlled by a visual servo controller. We test linear and bilinear dynamic models in tasks with quadratic and Gaussian state cost functions. In the quadratic cost task, the LMDP controller derived from a learned linear dynamics model performed comparably to the optimal linear quadratic regulator (LQR). In the non-quadratic task, the LMDP controller with a linear dynamics model showed the best performance. The results demonstrate the usefulness of the LMDP framework in real robot control even when simple linear models are used for dynamics learning.
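A minimal sketch of the linearity the abstract describes, under assumed toy dynamics: for a discrete first-exit LMDP, the exponentiated value function (the "desirability") z = exp(-v) satisfies the linear fixed-point equation z = exp(-q) P z, which can be solved by plain iteration instead of nonlinear dynamic programming. The chain, passive dynamics P and costs q below are illustrative only.

```python
import math

# Toy first-exit LMDP on a 4-state chain; state 3 is absorbing (the goal).
# Passive dynamics P and state costs q are illustrative, not from the paper.
P = [
    [0.5, 0.5, 0.0, 0.0],
    [0.25, 0.5, 0.25, 0.0],
    [0.0, 0.25, 0.5, 0.25],
    [0.0, 0.0, 0.0, 1.0],
]
q = [1.0, 1.0, 1.0, 0.0]  # per-step state cost; zero at the goal

# Desirability z(x) = exp(-v(x)) satisfies the LINEAR equation
#   z = exp(-q) * P z   (with z fixed to 1 at the absorbing goal state),
# which we solve by fixed-point (power) iteration.
z = [1.0, 1.0, 1.0, 1.0]
for _ in range(2000):
    z = [math.exp(-q[i]) * sum(P[i][j] * z[j] for j in range(4))
         for i in range(3)] + [1.0]

v = [-math.log(zi) for zi in z]  # optimal cost-to-go
print([round(x, 3) for x in v])  # cost grows with distance from the goal
```

In the discrete case this fixed point is exactly the leading eigenvector computation the abstract refers to; the optimal controlled transition law then reweights P by z.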
Rong, Qiangqiang; Cai, Yanpeng; Chen, Bing; Yue, Wencong; Yin, Xin'an; Tan, Qian
2017-02-15
In this research, an export coefficient based dual inexact two-stage stochastic credibility constrained programming (ECDITSCCP) model was developed through integrating an improved export coefficient model (ECM), interval linear programming (ILP), fuzzy credibility constrained programming (FCCP) and a fuzzy expected value equation within a general two-stage programming (TSP) framework. The proposed ECDITSCCP model can effectively address multiple uncertainties expressed as random variables, fuzzy numbers, and pure and dual intervals. It also provides a direct linkage between pre-regulated management policies and the associated economic implications. Moreover, solutions under multiple credibility levels can be obtained, providing potential decision alternatives for decision makers. The proposed model was then applied to identify optimal land use structures for agricultural NPS pollution mitigation in a representative upstream subcatchment of the Miyun Reservoir watershed in north China. Optimal solutions of the model were successfully obtained, indicating desired land use patterns and nutrient discharge schemes that maximize agricultural system benefits under a limited discharge permit. Results under multiple credibility levels could also provide policy makers with several options, helping them strike an appropriate balance between system benefits and pollution mitigation. The developed ECDITSCCP model can be effectively applied to address uncertain information in agricultural systems and shows great applicability to land use adjustment for agricultural NPS pollution mitigation. Copyright © 2016 Elsevier B.V. All rights reserved.
The RAVE/VERTIGO vertex reconstruction toolkit and framework
NASA Astrophysics Data System (ADS)
Waltenberger, W.; Mitaroff, W.; Moser, F.; Pflugfelder, B.; Riedel, H. V.
2008-07-01
A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent the state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.
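Geometric vertex fitting with linear estimators, as mentioned in the abstract, can be illustrated in miniature: given tracks modeled as straight 2D lines, the point minimizing the summed squared perpendicular distances solves a small linear system. This is a didactic sketch, not RAVE's interface or algorithm; the toy tracks below are constructed so all three lines intersect at (1, 0).

```python
import math

# Each track: a point on the line and a unit direction (toy values; RAVE's
# actual interface works on full track parameterizations with covariances).
def unit(vx, vy):
    n = math.hypot(vx, vy)
    return (vx / n, vy / n)

tracks = [
    ((0.0, 1.0), unit(1.0, -1.0)),
    ((2.0, 1.0), unit(-1.0, -1.0)),
    ((1.0, -2.0), unit(0.0, 1.0)),
]

# Least-squares vertex: minimize the sum of squared perpendicular distances.
# Normal equations: [sum (I - d d^T)] x = sum (I - d d^T) p   (a 2x2 system).
A = [[0.0, 0.0], [0.0, 0.0]]
b = [0.0, 0.0]
for (px, py), (dx, dy) in tracks:
    m = [[1 - dx * dx, -dx * dy], [-dx * dy, 1 - dy * dy]]
    A[0][0] += m[0][0]; A[0][1] += m[0][1]
    A[1][0] += m[1][0]; A[1][1] += m[1][1]
    b[0] += m[0][0] * px + m[0][1] * py
    b[1] += m[1][0] * px + m[1][1] * py

det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
vertex = ((b[0] * A[1][1] - b[1] * A[0][1]) / det,
          (A[0][0] * b[1] - A[1][0] * b[0]) / det)
print(vertex)  # all three toy lines pass through (1, 0)
```

A Kalman-filter vertex fit reaches the same least-squares solution sequentially, adding one track at a time, which is what makes it attractive for online reconstruction.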
ERIC Educational Resources Information Center
Smith, Karan B.
1996-01-01
Presents activities which highlight major concepts of linear programming. Demonstrates how technology allows students to solve linear programming problems using exploration prior to learning algorithmic methods. (DDR)
De Neve, Jan-Walter; Boudreaux, Chantelle; Gill, Roopan; Geldsetzer, Pascal; Vaikath, Maria; Bärnighausen, Till; Bossert, Thomas J
2017-07-03
Many countries have created community-based health worker (CHW) programs for HIV. In most of these countries, several national and non-governmental initiatives have been implemented raising questions of how well these different approaches address the health problems and use health resources in a compatible way. While these questions have led to a general policy initiative to promote harmonization across programs, there is a need for countries to develop a more coherent and organized approach to CHW programs and to generate evidence about the most efficient and effective strategies to ensure their optimal, sustained performance. We conducted a narrative review of the existing published and gray literature on the harmonization of CHW programs. We searched for and noted evidence on definitions, models, and/or frameworks of harmonization; theoretical arguments or hypotheses about the effects of CHW program fragmentation; and empirical evidence. Based on this evidence, we defined harmonization, introduced three priority areas for harmonization, and identified a conceptual framework for analyzing harmonization of CHW programs that can be used to support their expanding role in HIV service delivery. We identified and described the major issues and relationships surrounding the harmonization of CHW programs, including key characteristics, facilitators, and barriers for each of the priority areas of harmonization, and used our analytic framework to map overarching findings. We apply this approach of CHW programs supporting HIV services across four countries in Southern Africa in a separate article. There is a large number and immense diversity of CHW programs for HIV. This includes integration of HIV components into countries' existing national programs along with the development of multiple, stand-alone CHW programs. 
We defined (i) coordination among stakeholders, (ii) integration into the broader health system, and (iii) assurance of a CHW program's sustainability to be priority areas of harmonization. While harmonization is likely a complex political process, with, in many cases, incremental steps toward improvement, a wide range of facilitators are available to decision-makers. These can be categorized using an analytic framework assessing the (i) health issue, (ii) intervention itself, (iii) stakeholders, (iv) health system, and (v) broad context. There is a need to address fragmentation of CHW programs to advance and sustain CHW roles and responsibilities for HIV. This study provides a narrative review and analytic framework to understand the process by which harmonization of CHW programs might be achieved and to test the assumption that harmonization is needed to improve CHW performance.
A Framework for Quality in Educational Technology Programs.
ERIC Educational Resources Information Center
Confrey, Jere; Sabelli, Nora; Sheingold, Karen
2002-01-01
Presents a framework for judging educational technology programs that was developed for the Department of Education by the Expert Panel on Educational Technology. Highlights include clearly articulated goals; developing learning and thinking skills; equity for educational excellence; promoting organizational change; measurable evidence of…
Jang, Myoungock; Chao, Ariana; Whittemore, Robin
2015-01-01
Intervention programs targeting parents to manage childhood overweight and obesity have emerged based on parents' influence on the health behaviors of their children. The purpose of this review was to systematically evaluate intervention programs targeting parents to manage childhood overweight and obesity using the Reach, Efficacy, Adoption, Implementation, and Maintenance (RE-AIM) framework. There was a moderate risk of bias across all studies. The overall proportion of studies (n=7) reporting on each dimension of the RE-AIM framework ranged from 78.6% (reach) to 23.8% (maintenance). The majority of intervention programs demonstrated improvement in child BMI. However, intervention programs did not reach families of diverse race/ethnicity, were provided by highly trained professionals, and demonstrated high attrition, thus limiting generalizability. Copyright © 2015 Elsevier Inc. All rights reserved.
CyberMedVPS: visual programming for development of simulators.
Morais, Aline M; Machado, Liliane S
2011-01-01
Computer applications based on Virtual Reality (VR) have become prominent in medical training and teaching due to their ability to simulate realistic situations in which users can practice skills and decision making. Frameworks to develop such simulators are available, but their use demands knowledge of programming, which makes interaction hard for non-programmer users. To address this problem, we present CyberMedVPS, a graphical module for the CyberMed framework that implements Visual Programming concepts to allow the development of simulators by non-programmer professionals of the medical field.
NASA Astrophysics Data System (ADS)
Zhang, Chenglong; Zhang, Fan; Guo, Shanshan; Liu, Xiao; Guo, Ping
2018-01-01
An inexact nonlinear mλ-measure fuzzy chance-constrained programming (INMFCCP) model is developed for irrigation water allocation under uncertainty. Techniques of inexact quadratic programming (IQP), mλ-measure, and fuzzy chance-constrained programming (FCCP) are integrated into a general optimization framework. The INMFCCP model can deal with not only nonlinearities in the objective function, but also uncertainties presented as discrete intervals in the objective function, variables and left-hand side constraints and fuzziness in the right-hand side constraints. Moreover, this model improves upon the conventional fuzzy chance-constrained programming by introducing a linear combination of possibility measure and necessity measure with varying preference parameters. To demonstrate its applicability, the model is then applied to a case study in the middle reaches of Heihe River Basin, northwest China. An interval regression analysis method is used to obtain interval crop water production functions in the whole growth period under uncertainty. Therefore, more flexible solutions can be generated for optimal irrigation water allocation. The variation of results can be examined by giving different confidence levels and preference parameters. Besides, it can reflect interrelationships among system benefits, preference parameters, confidence levels and the corresponding risk levels. Comparison between interval crop water production functions and deterministic ones based on the developed INMFCCP model indicates that the former is capable of reflecting more complexities and uncertainties in practical application. These results can provide more reliable scientific basis for supporting irrigation water management in arid areas.
Mathematical Modeling of Intestinal Iron Absorption Using Genetic Programming
Colins, Andrea; Gerdtzen, Ziomara P.; Nuñez, Marco T.; Salgado, J. Cristian
2017-01-01
Iron is a trace metal, key for the development of living organisms. Its absorption process is complex and highly regulated at the transcriptional, translational and systemic levels. Recently, the internalization of the DMT1 transporter has been proposed as an additional regulatory mechanism at the intestinal level, associated to the mucosal block phenomenon. The short-term effect of iron exposure in apical uptake and initial absorption rates was studied in Caco-2 cells at different apical iron concentrations, using both an experimental approach and a mathematical modeling framework. This is the first report of short-term studies for this system. A non-linear behavior in the apical uptake dynamics was observed, which does not follow the classic saturation dynamics of traditional biochemical models. We propose a method for developing mathematical models for complex systems, based on a genetic programming algorithm. The algorithm is aimed at obtaining models with a high predictive capacity, and considers an additional parameter fitting stage and an additional Jackknife stage for estimating the generalization error. We developed a model for the iron uptake system with a higher predictive capacity than classic biochemical models. This was observed both with the apical uptake dataset used for generating the model and with an independent initial rates dataset used to test the predictive capacity of the model. The model obtained is a function of time and the initial apical iron concentration, with a linear component that captures the global tendency of the system, and a non-linear component that can be associated to the movement of DMT1 transporters. The model presented in this paper allows the detailed analysis, interpretation of experimental data, and identification of key relevant components for this complex biological process. 
This general method holds great potential for application to the elucidation of biological mechanisms and their key components in other complex systems. PMID:28072870
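To give a flavor of genetic programming for model discovery, the sketch below searches over small arithmetic expression trees to fit toy data, keeping candidates that reduce squared error. It is deliberately minimal (random search with elitism rather than a full population-based GP with crossover) and omits the paper's parameter-fitting and jackknife stages; the target function and all settings are invented.

```python
import random

random.seed(7)

# Expressions are nested tuples: ('x',) | ('c', value) | (op, left, right).
def evaluate(expr, x):
    kind = expr[0]
    if kind == "x":
        return x
    if kind == "c":
        return expr[1]
    a, b = evaluate(expr[1], x), evaluate(expr[2], x)
    return a + b if kind == "+" else a * b

def random_expr(depth=2):
    if depth == 0 or random.random() < 0.3:
        return ("x",) if random.random() < 0.5 else ("c", random.uniform(-2, 2))
    op = random.choice(["+", "*"])
    return (op, random_expr(depth - 1), random_expr(depth - 1))

def error(expr, data):
    return sum((evaluate(expr, x) - y) ** 2 for x, y in data)

data = [(x, 2.0 * x + 1.0) for x in range(-5, 6)]  # toy target: y = 2x + 1

best = random_expr()
best_err = error(best, data)
for _ in range(5000):
    cand = random_expr(depth=3)  # fresh random candidate (a full GP would
    cand_err = error(cand, data)  # instead mutate/recombine a population)
    if cand_err < best_err:
        best, best_err = cand, cand_err

print(round(best_err, 3))  # squared error of the best expression found
```

The paper's algorithm additionally refits numeric constants of each candidate and estimates generalization error by jackknifing, which is what lets it outperform fixed-form biochemical models.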
ERIC Educational Resources Information Center
Palmer, Jackie; Powell, Mary Jo
The Laboratory Network Program and the National Network of Eisenhower Mathematics and Science Regional Consortia, operating as the Curriculum Frameworks Task Force, jointly convened a group of educators involved in implementing state-level mathematics or science curriculum frameworks (CF). The Hilton Head (South Carolina) conference had a dual…
ERIC Educational Resources Information Center
United Nations Children's Fund, Kabul (Afghanistan).
This Master Plan of Operation has two parts: The Framework and the Program. Part I (the Framework) sets out the articles of agreement between the Government of Afghanistan and the United Nations Children's Fund (UNICEF) concerning a 1-year program for the improvement of social services for children and mothers regarded as the most disadvantaged…
Copy Hiding Application Interface
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, Holger; Poliakoff, David; Robinson, Peter
2016-10-06
CHAI is a light-weight framework which abstracts the automated movement of data (e.g., to/from host/device) via RAJA-like performance portability programming model constructs. It can be viewed as a utility framework and an adjunct to RAJA (a performance portability framework). Performance portability is a technique that abstracts the complexities of modern heterogeneous architectures while allowing the original program to undergo incremental, minimally invasive code changes in order to adapt to newer architectures.
NASA Technical Reports Server (NTRS)
1979-01-01
The computer program Linear SCIDNT which evaluates rotorcraft stability and control coefficients from flight or wind tunnel test data is described. It implements the maximum likelihood method to maximize the likelihood function of the parameters based on measured input/output time histories. Linear SCIDNT may be applied to systems modeled by linear constant-coefficient differential equations. This restriction in scope allows the application of several analytical results which simplify the computation and improve its efficiency over the general nonlinear case.
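The core estimation step described, maximum-likelihood fitting of a linear constant-coefficient model to measured input/output time histories, can be sketched for a scalar system. With Gaussian noise, maximizing the likelihood reduces to least squares on x[k+1] = a*x[k] + b*u[k]; the system, noise level and data below are synthetic, and Linear SCIDNT itself handles full multivariable rotorcraft models.

```python
import random

random.seed(0)

# Simulate a scalar linear constant-coefficient system with process noise.
# True parameters and noise level are synthetic, chosen for illustration.
a_true, b_true = 0.8, 0.5
x = 0.0
data = []
for _ in range(500):
    u = random.uniform(-1, 1)
    x_next = a_true * x + b_true * u + random.gauss(0, 0.01)
    data.append((x, u, x_next))
    x = x_next

# Normal equations for [a, b] minimizing sum (x_next - a*x - b*u)^2,
# which is the maximum-likelihood estimate under Gaussian noise.
sxx = sum(x * x for x, u, y in data)
sxu = sum(x * u for x, u, y in data)
suu = sum(u * u for x, u, y in data)
sxy = sum(x * y for x, u, y in data)
suy = sum(u * y for x, u, y in data)
det = sxx * suu - sxu * sxu
a_hat = (sxy * suu - suy * sxu) / det
b_hat = (sxx * suy - sxu * sxy) / det
print(round(a_hat, 2), round(b_hat, 2))  # close to the true 0.8 and 0.5
```

The restriction to linear constant-coefficient dynamics is exactly what collapses the general nonlinear likelihood maximization into this closed-form computation.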
NASA Technical Reports Server (NTRS)
Pilkey, W. D.; Chen, Y. H.
1974-01-01
An indirect synthesis method is used in the efficient optimal design of multi-degree of freedom, multi-design element, nonlinear, transient systems. A limiting performance analysis which requires linear programming for a kinematically linear system is presented. The system is selected using system identification methods such that the designed system responds as closely as possible to the limiting performance. The efficiency is a result of the method avoiding the repetitive systems analyses accompanying other numerical optimization methods.
A theory-informed approach to mental health care capacity building for pharmacists.
Murphy, Andrea L; Gardner, David M; Kutcher, Stan P; Martin-Misener, Ruth
2014-01-01
Pharmacists are knowledgeable, accessible health care professionals who can provide services that improve outcomes in mental health care. Various challenges and opportunities can exist in pharmacy practice to hinder or support pharmacists' efforts. We used a theory-informed approach to development and implementation of a capacity-building program to enhance pharmacists' roles in mental health care. Theories and frameworks including the Consolidated Framework for Implementation Research, the Theoretical Domains Framework, and the Behaviour Change Wheel were used to inform the conceptualization, development, and implementation of a capacity-building program to enhance pharmacists' roles in mental health care. The More Than Meds program was developed and implemented through an iterative process. The main program components included: an education and training day; use of a train-the-trainer approach from partnerships with pharmacists and people with lived experience of mental illness; development of a community of practice through email communications, a website, and a newsletter; and use of educational outreach delivered by pharmacists. Theories and frameworks used throughout the program's development and implementation facilitated a means to conceptualize the component parts of the program as well as its overall presence as a whole from inception through evolution in implementation. Using theoretical foundations for the program enabled critical consideration and understanding of issues related to trialability and adaptability of the program. Theory was essential to the underlying development and implementation of a capacity-building program for enhancing services by pharmacists for people with lived experience of mental illness. Lessons learned from the development and implementation of this program are informing current research and evolution of the program.
NASA Astrophysics Data System (ADS)
Subramaniam, Karthigeyan; Esprívalo Harrell, Pamela; Wojnowski, David
2013-04-01
Background and purpose: This study details the use of a conceptual framework to analyze prospective teachers' images of scientists to reveal their context-specific conceptions of scientists. The conceptual framework consists of context-specific conceptions related to positive, stereotypical and negative images of scientists as detailed in the literature on the images, role and work of scientists. Sample, design and method: One hundred and ninety-six drawings of scientists, generated by prospective teachers, were analyzed using the Draw-A-Scientist-Test Checklist (DAST-C), a binary linear regression and the conceptual framework. Results: The results of the binary linear regression analysis revealed a statistically significant difference for two DAST-C elements: ethnicity differences with regard to drawing a scientist who was Caucasian and gender differences for indications of danger. Analysis using the conceptual framework helped to categorize the same drawings into positive, stereotypical, negative and composite images of a scientist. Conclusions: The conceptual framework revealed that drawings were focused on the physical appearance of the scientist, and to a lesser extent on the equipment, location and science-related practices that provided the context of a scientist's role and work. Implications for teacher educators include the need to provide tools, such as the conceptual framework used in this study, to help prospective teachers confront and engage with their multidimensional perspectives of scientists in light of current trends in perceiving and valuing scientists. In addition, teacher educators need to use the conceptual framework, which yields qualitative perspectives about drawings, together with the DAST-C, which yields a quantitative measure for drawings, to help prospective teachers gain a holistic outlook on their drawings of scientists.
A multi-GPU real-time dose simulation software framework for lung radiotherapy.
Santhanam, A P; Min, Y; Neelakkantan, H; Papp, N; Meeks, S L; Kupelian, P A
2012-09-01
Medical simulation frameworks facilitate both the preoperative and postoperative analysis of the patient's pathophysiological condition. Of particular importance is the simulation of radiation dose delivery for real-time radiotherapy monitoring and retrospective analyses of the patient's treatment. In this paper, a software framework tailored for the development of simulation-based real-time radiation dose monitoring medical applications is discussed. A multi-GPU-based computational framework coupled with inter-process communication methods is introduced for simulating the radiation dose delivery on a deformable 3D volumetric lung model and its real-time visualization. The model deformation and the corresponding dose calculation are allocated among the GPUs in a task-specific manner and executed in a pipelined fashion. Radiation dose calculations are computed on two different GPU hardware architectures. The integration of this computational framework with a front-end software layer and back-end patient database repository is also discussed. Real-time simulation of the dose delivered is achieved once every 120 ms using the proposed framework. With a linear increase in the number of GPU cores, the computational time of the simulation decreased linearly. The inter-process communication time also improved with an increase in the hardware memory. Variations in the delivered dose and computational speedup for variations in the data dimensions are investigated using D70 and D90 as well as gEUD as metrics for a set of 14 patients. Computational speed-up increased with an increase in the beam dimensions when compared with CPU-based commercial software, while the error in the dose calculation was <1%. Our analyses show that the framework applied to deformable lung model-based radiotherapy is an effective tool for performing both real-time and retrospective analyses.
Clinical time series prediction: Toward a hierarchical dynamical system framework.
Liu, Zitao; Hauskrecht, Milos
2015-09-01
Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding the patient's condition, the dynamics of a disease, the effects of various patient management interventions, and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Our hierarchical dynamical system framework for modeling clinical time series combines the advantages of two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. We tested our framework by first learning the time series model from data for the patients in the training set, and then using it to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. Copyright © 2014 Elsevier B.V. All rights reserved.
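The linear-dynamical-system half of the hierarchy can be sketched with a one-dimensional Kalman filter producing one-step-ahead predictions; the Gaussian-process lower layer that handles irregular sampling is omitted here. All model parameters and observations below are illustrative, not clinical data.

```python
# One-dimensional Kalman filter: the "linear dynamical system" component of
# the hierarchical model (the paper's lower layer of Gaussian-process
# sequences is omitted in this sketch; all numbers are illustrative).
def kalman_1d(observations, a=1.0, q=0.1, r=1.0):
    """Filter a scalar series x[k+1] = a*x[k] + noise(q), y = x + noise(r),
    returning the one-step-ahead predictions of the observations."""
    mean, var = observations[0], 1.0
    predictions = []
    for y in observations[1:]:
        # Predict one step ahead, then correct with the new observation.
        mean_p, var_p = a * mean, a * a * var + q
        predictions.append(mean_p)
        gain = var_p / (var_p + r)
        mean = mean_p + gain * (y - mean_p)
        var = (1 - gain) * var_p
    return predictions

obs = [10.0, 10.2, 9.8, 10.1, 10.0, 9.9]  # toy lab-value series
preds = kalman_1d(obs)
print([round(p, 2) for p in preds])
```

In the paper's framework the hidden state instead indexes transitions between Gaussian-process segments, so the same predict/correct recursion operates on segment-level summaries rather than raw values.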
ERIC Educational Resources Information Center
Nakhanu, Shikuku Beatrice; Musasia, Amadalo Maurice
2015-01-01
The topic Linear Programming is included in the compulsory Kenyan secondary school mathematics curriculum at form four. The topic provides skills for determining best outcomes in a given mathematical model involving some linear relationship. This technique has found application in business, economics as well as various engineering fields. Yet many…
NASA Astrophysics Data System (ADS)
Louarroudi, E.; Pintelon, R.; Lataire, J.
2014-10-01
Time-periodic (TP) phenomena occurring, for instance, in wind turbines, helicopters, anisotropic shaft-bearing systems, and cardiovascular/respiratory systems, are often not addressed when classical frequency response function (FRF) measurements are performed. As the traditional FRF concept is based on the linear time-invariant (LTI) system theory, it is only approximately valid for systems with varying dynamics. Accordingly, the quantification of any deviation from this ideal LTI framework is more than welcome. The “measure of deviation” allows us to define the notion of the best LTI (BLTI) approximation, which yields the best (in the mean-square sense) LTI description of a linear time-periodic (LTP) system. By taking into consideration the TP effects, it is shown in this paper that the variability of the BLTI measurement can be reduced significantly compared with that of classical FRF estimators. From a single experiment, the proposed identification methods can handle (non-)linear time-periodic [(N)LTP] systems in open loop with a quantification of (i) the noise and/or the NL distortions, (ii) the TP distortions and (iii) the transient (leakage) errors. In addition, a geometrical interpretation of the BLTI approximation is provided, leading to a framework called vector FRF analysis. The theory presented is supported by numerical simulations as well as real measurements mimicking the well-known mechanical Mathieu oscillator.
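The best-LTI idea can be demonstrated in a deliberately simplified static setting: for a memoryless time-periodic gain y(t) = a(t)u(t), the constant gain minimizing the mean-square error is the u²-weighted average of a(t), and the residual quantifies the time-periodic distortion. The signals below are invented; the paper's BLTI framework covers full dynamic systems, noise and transient errors.

```python
import math

# Static toy version of the BLTI approximation: y(t) = a(t) * u(t) with a
# periodic gain a(t). The mean-square-optimal constant gain is
#   g* = sum(a * u^2) / sum(u^2),
# i.e. a u^2-weighted average of a(t). All signals are illustrative.
T = 200
t = [k / T for k in range(T)]
a = [2.0 + 0.5 * math.sin(2 * math.pi * tk) for tk in t]   # periodic gain
u = [math.cos(4 * math.pi * tk) for tk in t]               # input signal
y = [ai * ui for ai, ui in zip(a, u)]

g = sum(yi * ui for yi, ui in zip(y, u)) / sum(ui * ui for ui in u)
residual = sum((yi - g * ui) ** 2 for yi, ui in zip(y, u))  # TP "distortion"
print(round(g, 3))  # the time-average gain 2.0 for these signals
```

Here the sinusoidal part of a(t) averages out over a full period, so the BLTI gain recovers the mean gain while the nonzero residual measures exactly the part no LTI description can capture.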
Keywords: systems analysis, water supplies, mathematical models, optimization, economics, linear programming, hydrology, regions, allocations, restraint, rivers, evaporation, lakes, Utah, salvage, mines (excavations).
BIODEGRADATION PROBABILITY PROGRAM (BIODEG)
The Biodegradation Probability Program (BIODEG) calculates the probability that a chemical under aerobic conditions with mixed cultures of microorganisms will biodegrade rapidly or slowly. It uses fragment constants developed using multiple linear and non-linear regressions and d...
The Use of Linear Programming for Prediction.
ERIC Educational Resources Information Center
Schnittjer, Carl J.
The purpose of the study was to develop a linear programming model to be used for prediction, test the accuracy of the predictions, and compare the accuracy with that produced by curvilinear multiple regression analysis. (Author)
75 FR 33570 - Magnuson-Stevens Act Provisions; Fishing Capacity Reduction Framework
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-14
.... 100330171-0232-01] RIN 0648-AY79 Magnuson-Stevens Act Provisions; Fishing Capacity Reduction Framework... regulations specifying procedures for implementing fishing capacity reduction programs (reduction programs) in... either to surrender their fishing permits including relevant fishing histories for that fishery, or...
Teaching Money Literacy in a Positive Youth Development Program: The Project P.A.T.H.S. in Hong Kong
Lee, Tak Yan; Law, Ben M. F.
2011-01-01
In view of the high impact of materialistic orientation among children and adolescents, financial educational programs are provided as preventive measures. Without a clear framework, it is impossible to evaluate these programs. The goals of this paper are threefold. Firstly, the phenomena related to adolescent materialistic orientation and its associated problems in Hong Kong are examined. Secondly, the concept of financial education as a preventive measure is reviewed. Both broad and narrow definitions of money literacy are examined. A framework on money literacy for children and adolescents as a founding stone for financial education is proposed. The framework is supported by a typology proposed by the authors and by an integration of research findings on the dimensions of the concepts of money and success. Finally, curriculum units for Grades 7 to 9 students in a positive youth development program (the Project P.A.T.H.S.) are developed using the framework. PMID:22194664
Rhenter, Pauline; Moreau, Delphine; Laval, Christian; Mantovani, Jean; Albisson, Amandine; Suderie, Guillaume; Boucekine, Mohamed; Tinland, Aurelie; Loubière, Sandrine; Greacen, Tim; Auquier, Pascal; Girard, Vincent
2018-03-14
This paper is a qualitative analysis of the effects of accompagnement, a support framework, on recovery trajectories of people with long-term homelessness and severe psychiatric disorders during 24 months in a Housing First-type program in France. A comprehensive methodology based on grounded theory was used to construct an interview guide, conduct multiple interviews with 35 Housing First participants sampled for heterogeneity, and produce memos on their trajectories before and after entering the program based on interview information. Thematic analysis of a representative subsample (n = 13) of memos identified 12 objective factors and 6 subjective factors key to the recovery process. An in-depth re-analysis of the memos generated four recovery themes: (1) the need for secure space favorable to self-reflexivity; (2) a "honeymoon" effect; (3) the importance of even weak social ties; (4) support from and hope among peers. Three challenges to recovery were identified: (1) finding a balance between protection and risk; (2) breaking downward spirals; (3) bifurcating the trajectory. This study provides new insight into the recovery process, understood as a non-linear transformation of an experience - the relationship between objective life conditions and subjective perception of those conditions - which reinforces protective support over risk elements. PMID:29538346
ERIC Educational Resources Information Center
Davis, LaShara A.; Morgan, Susan E.; Mobley, Amy R.
2016-01-01
Additional strategies to evaluate the impact of community nutrition education programs on low-income individuals are needed. The objective of this qualitative study was to examine the use of the Memorable Messages Framework as an intermediary nutrition education program evaluation tool to determine what fruit and vegetable messages were reported…
DataForge: Modular platform for data storage and analysis
NASA Astrophysics Data System (ADS)
Nozik, Alexander
2018-04-01
DataForge is a framework for automated data acquisition, storage and analysis that draws on modern practices in applied programming. The aim of DataForge is to automate standard tasks such as parallel data processing, logging, output sorting and distributed computing. The framework also makes extensive use of declarative programming principles via a metadata concept, which allows a degree of metaprogramming and improves the reproducibility of results.
Fulmer, Erika; Rogers, Todd; Glasgow, LaShawn; Brown, Susan; Kuiper, Nicole
2018-03-01
The outcome indicator framework helps tobacco prevention and control programs (TCPs) plan and implement theory-driven evaluations of their efforts to reduce and prevent tobacco use. Tobacco use is the single most preventable cause of morbidity and mortality in the United States. The implementation of public health best practices by comprehensive state TCPs has been shown to prevent the initiation of tobacco use, reduce tobacco use prevalence, and decrease tobacco-related health care expenditures. Achieving and sustaining program goals require TCPs to evaluate the effectiveness and impact of their programs. To guide evaluation efforts by TCPs, the Centers for Disease Control and Prevention's Office on Smoking and Health developed an outcome indicator framework that includes a high-level logic model and evidence-based outcome indicators for each tobacco prevention and control goal area. In this article, we describe how TCPs and other community organizations can use the outcome indicator framework in their evaluation efforts. We also discuss how the framework is used at the national level to unify tobacco prevention and control efforts across varying state contexts, identify promising practices, and expand the public health evidence base.
Martin-Sanchez, Fernando; Rowlands, David; Schaper, Louise; Hansen, David
2017-01-01
The Certified Health Informatician Australasia (CHIA) program consists of an online exam, which aims to test whether a candidate has the knowledge and skills that are identified in the competencies framework to perform as a health informatics professional. The CHIA Health Informatics Competencies Framework provides the context in which the questions for the exam have been developed. The core competencies for health informatics that are tested in the exam have been developed with reference to similar programs by the American Medical Informatics Association, the International Medical Informatics Association and COACH, Canada's Health Informatics Association, and build on the previous work done by the Australian Health Informatics Education Council. This paper shows how the development of this competency framework is helping to raise the profile of health informaticians in Australasia, contributing to a wider recognition of the profession, and defining more clearly the body of knowledge underpinning this discipline. This framework can also be used as a set of guidelines for recruiting purposes, definitions of career pathways, or the design of educational and training activities. We discuss here the current status of the program, its results, and prospects for the future.
Synthesizing Dynamic Programming Algorithms from Linear Temporal Logic Formulae
NASA Technical Reports Server (NTRS)
Rosu, Grigore; Havelund, Klaus
2001-01-01
The problem of testing a linear temporal logic (LTL) formula on a finite execution trace of events, generated by an executing program, occurs naturally in runtime analysis of software. We present an algorithm which takes an LTL formula and generates an efficient dynamic programming algorithm. The generated algorithm tests whether the LTL formula is satisfied by a finite trace of events given as input. The generated algorithm runs in linear time, its constant depending on the size of the LTL formula. The memory needed is constant, also depending on the size of the formula.
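The evaluation strategy the abstract describes, a single backward pass over the trace whose memory depends only on the formula size, can be sketched for a tiny LTL fragment. This is an illustrative reconstruction of the general technique, not the paper's generated code:

```python
# Backward dynamic-programming evaluation of a small LTL fragment over a
# finite, non-empty trace (a list of sets of atomic propositions).
# Formulas are nested tuples: ("p", name), ("not", f), ("and", f, g),
# ("next", f), ("until", f, g), ("eventually", f), ("always", f).

def subformulas(f):
    """Post-order list of distinct subformulas (children before parents)."""
    seen, order = set(), []
    def walk(g):
        if g in seen:
            return
        for child in g[1:]:
            if isinstance(child, tuple):
                walk(child)
        seen.add(g)
        order.append(g)
    walk(f)
    return order

def evaluate(formula, trace):
    """True iff `formula` holds at position 0 of `trace`.

    Walks the trace backwards, keeping only the truth values of every
    subformula at the current and next positions: linear time in the
    trace length, constant memory in it (both scale with formula size).
    """
    subs = subformulas(formula)
    last = len(trace) - 1
    now, nxt = {}, {}
    for i in range(last, -1, -1):
        now = {}
        for f in subs:
            op = f[0]
            if op == "p":
                now[f] = f[1] in trace[i]
            elif op == "not":
                now[f] = not now[f[1]]
            elif op == "and":
                now[f] = now[f[1]] and now[f[2]]
            elif op == "next":
                # Finite-trace semantics: "next" fails at the last position.
                now[f] = nxt[f[1]] if i < last else False
            elif op == "until":
                now[f] = now[f[2]] or (now[f[1]] and nxt.get(f, False))
            elif op == "eventually":
                now[f] = now[f[1]] or nxt.get(f, False)
            elif op == "always":
                now[f] = now[f[1]] and nxt.get(f, True)
        nxt = now
    return now[formula]
```

The recurrences for "until", "eventually" and "always" are the usual expansion laws restricted to finite traces, which is what makes a single backward sweep sufficient.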
On the stability and instantaneous velocity of grasped frictionless objects
NASA Technical Reports Server (NTRS)
Trinkle, Jeffrey C.
1992-01-01
A quantitative test for form closure valid for any number of contact points is formulated as a linear program, the optimal objective value of which provides a measure of how far a grasp is from losing form closure. Another contribution of the study is the formulation of a linear program whose solution yields the same information as the classical approach. The benefit of the formulation is that explicit testing of all possible combinations of contact interactions can be avoided by the algorithm used to solve the linear program.
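One standard way to cast such a quantitative form-closure test as a linear program (a sketch of the general technique, not necessarily the paper's exact formulation) is to maximize a margin d by which every contact intensity can exceed zero while the contact wrenches remain in equilibrium:

```python
import numpy as np
from scipy.optimize import linprog

def form_closure_measure(G):
    """LP-based quantitative form-closure test.

    G is the k x m grasp matrix whose columns are the unit contact wrenches.
    Solves:  max d  s.t.  G @ lam = 0,  lam_i >= d,  sum(lam) <= m,  lam >= 0.
    A positive optimum d*, together with G having full row rank, certifies
    form closure; the size of d* measures how far the grasp is from losing it.
    """
    k, m = G.shape
    # Decision vector x = [lam_1 .. lam_m, d]; maximize d -> minimize -d.
    c = np.zeros(m + 1)
    c[-1] = -1.0
    # lam_i >= d  rewritten as  -lam_i + d <= 0.
    A_ub = np.hstack([-np.eye(m), np.ones((m, 1))])
    b_ub = np.zeros(m)
    # Normalization sum(lam) <= m keeps d bounded.
    A_ub = np.vstack([A_ub, np.append(np.ones(m), 0.0)])
    b_ub = np.append(b_ub, float(m))
    # Wrench equilibrium: G @ lam = 0.
    A_eq = np.hstack([G, np.zeros((k, 1))])
    b_eq = np.zeros(k)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (m + 1))
    d_star = res.x[-1]
    return d_star, (d_star > 1e-9 and np.linalg.matrix_rank(G) == k)

# Toy wrench sets in a 3-dimensional wrench space:
I = np.eye(3)
closed = np.hstack([I, -I])                              # positively spans R^3
not_closed = np.hstack([I[:, :2], -I[:, :2], I[:, 2:]])  # +e3 is unopposed
```

For `closed` the margin is strictly positive, while `not_closed` forces the unopposed contact intensity to zero, so the test fails, which mirrors the measure described in the abstract.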
A novel recurrent neural network with finite-time convergence for linear programming.
Liu, Qingshan; Cao, Jinde; Chen, Guanrong
2010-11-01
In this letter, a novel recurrent neural network based on the gradient method is proposed for solving linear programming problems. Finite-time convergence of the proposed neural network is proved by using the Lyapunov method. Compared with the existing neural networks for linear programming, the proposed neural network is globally convergent to exact optimal solutions in finite time, which is remarkable and rare in the literature of neural networks for optimization. Some numerical examples are given to show the effectiveness and excellent performance of the new recurrent neural network.
Formation Flying With Decentralized Control in Libration Point Orbits
NASA Technical Reports Server (NTRS)
Folta, David; Carpenter, J. Russell; Wagner, Christoph
2000-01-01
A decentralized control framework is investigated for applicability of formation flying control in libration orbits. The decentralized approach, being non-hierarchical, processes only direct measurement data, in parallel with the other spacecraft. Control is accomplished via linearization about a reference libration orbit with standard control using a Linear Quadratic Regulator (LQR) or the GSFC control algorithm. Both are linearized about the current state estimate as with the extended Kalman filter. Based on this preliminary work, the decentralized approach appears to be feasible for upcoming libration missions using distributed spacecraft.
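The standard LQR computation referenced above can be sketched with SciPy. The double-integrator dynamics below are an illustrative stand-in for the linearized libration-orbit model, not the GSFC algorithm:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Toy relative dynamics x = [position error, velocity error] (illustrative).
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])   # penalize position error most heavily
R = np.array([[1.0]])      # control effort penalty

# Solve the continuous-time algebraic Riccati equation for P,
# then form the optimal state-feedback gain u = -K x.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

# LQR guarantees the closed-loop matrix A - B K is Hurwitz
# (all eigenvalues in the open left half-plane).
closed_loop_eigs = np.linalg.eigvals(A - B @ K)
```

In the decentralized setting described, each spacecraft would evaluate such a gain about its own current state estimate, in the spirit of the extended Kalman filter linearization mentioned in the abstract.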
SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.
Chu, Annie; Cui, Jenny; Dinov, Ivo D
2009-03-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses such as linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test, and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least-squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for the most updated information and newly added models.
ERIC Educational Resources Information Center
Marschall, Gosia; Andrews, Paul
2015-01-01
In this article we present an exploratory case study of six Polish teachers' perspectives on the teaching of linear equations to grade six students. Data, which derived from semi-structured interviews, were analysed against an extant framework and yielded a number of commonly held beliefs about what teachers aimed to achieve and how they would…
Fu, Wei; Nijhoff, Frank W
2017-07-01
A unified framework is presented for the solution structure of three-dimensional discrete integrable systems, including the lattice AKP, BKP and CKP equations. This is done through the so-called direct linearizing transform, which establishes a general class of integral transforms between solutions. As a particular application, novel soliton-type solutions for the lattice CKP equation are obtained.
ERIC Educational Resources Information Center
Dimitrov, Dimiter M.; Raykov, Tenko; AL-Qataee, Abdullah Ali
2015-01-01
This article is concerned with developing a measure of general academic ability (GAA) for high school graduates who apply to colleges, as well as with the identification of optimal weights of the GAA indicators in a linear combination that yields a composite score with maximal reliability and maximal predictive validity, employing the framework of…
The Use of Graphs in Specific Situations of the Initial Conditions of Linear Differential Equations
ERIC Educational Resources Information Center
Buendía, Gabriela; Cordero, Francisco
2013-01-01
In this article, we present a discussion on the role of graphs and its significance in the relation between the number of initial conditions and the order of a linear differential equation, which is known as the initial value problem. We propose to make a functional framework for the use of graphs that intends to broaden the explanations of the…
X-Windows Information Sharing Protocol Widget Class
NASA Technical Reports Server (NTRS)
Barry, Matthew R.
2006-01-01
The X-Windows Information Sharing Protocol (ISP) Widget Class ("Class" is used here in the object-oriented-programming sense of the word) was devised to simplify the task of implementing ISP graphical-user-interface (GUI) computer programs. ISP programming tasks require many method calls to identify, query, and interpret the connections and messages exchanged between a client and an ISP server. Most X-Windows GUI programs use widget sets or toolkits to facilitate management of complex objects. The widget standards facilitate construction of toolkits and application programs. The X-Windows Information Sharing Protocol (ISP) Widget Class encapsulates the client side of the ISP programming libraries within the framework of an X-Windows widget. Using the widget framework, X-Windows GUI programs can interact with ISP services in an abstract way and in the same manner as that of other graphical widgets, making it easier to write ISP GUI client programs. Wrapping ISP client services inside a widget framework enables a programmer to treat an ISP server interface as though it were a GUI. Moreover, an alternate subclass could implement another communication protocol in the same sort of widget.
Large-scale linear programs in planning and prediction.
DOT National Transportation Integrated Search
2017-06-01
Large-scale linear programs are at the core of many traffic-related optimization problems in both planning and prediction. Moreover, many of these involve significant uncertainty, and hence are modeled using either chance constraints, or robust optim...
Computer-aided linear-circuit design.
NASA Technical Reports Server (NTRS)
Penfield, P.
1971-01-01
Usually computer-aided design (CAD) refers to programs that analyze circuits conceived by the circuit designer. Among the services such programs should perform are direct network synthesis, analysis, optimization of network parameters, formatting, storage of miscellaneous data, and related calculations. The program should be embedded in a general-purpose conversational language such as BASIC, JOSS, or APL. Such a program is MARTHA, a general-purpose linear-circuit analyzer embedded in APL.
Latent log-linear models for handwritten digit classification.
Deselaers, Thomas; Gass, Tobias; Heigold, Georg; Ney, Hermann
2012-06-01
We present latent log-linear models, an extension of log-linear models incorporating latent variables, and we propose two applications thereof: log-linear mixture models and image deformation-aware log-linear models. The resulting models are fully discriminative, can be trained efficiently, and the model complexity can be controlled. Log-linear mixture models offer additional flexibility within the log-linear modeling framework. Unlike previous approaches, the image deformation-aware model directly considers image deformations and allows for a discriminative training of the deformation parameters. Both are trained using alternating optimization. For certain variants, convergence to a stationary point is guaranteed and, in practice, even variants without this guarantee converge and find models that perform well. We tune the methods on the USPS data set and evaluate on the MNIST data set, demonstrating the generalization capabilities of our proposed models. Our models, although using significantly fewer parameters, are able to obtain competitive results with models proposed in the literature.
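The basic construction, and the mixture extension the abstract proposes, can be sketched as follows. This is a minimal illustration of log-linear and latent log-linear class posteriors, not the authors' trained models:

```python
import numpy as np

def log_linear_posterior(W, x):
    """Class posteriors of a log-linear model: p(y|x) ∝ exp(w_y · x).

    W has shape (classes, dim); x has shape (dim,).
    """
    s = W @ x
    s = s - s.max()            # subtract max for numerical stability
    p = np.exp(s)
    return p / p.sum()

def latent_log_linear_posterior(W, x):
    """Log-linear mixture: each class y owns M latent components and
    p(y|x) ∝ Σ_m exp(w_{y,m} · x), i.e. the latent variable is
    marginalized out. W has shape (classes, M, dim)."""
    s = np.einsum('ymd,d->ym', W, x)   # score per (class, component)
    s = s - s.max()
    p = np.exp(s).sum(axis=1)          # marginalize over components m
    return p / p.sum()
```

With a single component per class the mixture collapses to the plain log-linear model, which is the sense in which the mixture "offers additional flexibility within the log-linear modeling framework".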
Framing Care for Planners of Education Programs
ERIC Educational Resources Information Center
Glowacki-Dudka, Michelle; Mullett, Cathy; Griswold, Wendy; Baize-Ward, Amy; Vetor-Suits, Crissy; Londt, Susan Cole
2018-01-01
Using a framework of care to design experiences in formal or informal learning does two things. It acknowledges intentions of reflective learning through open communication and meets expectations of scholars seeking knowledge within a learning community. This proposed framework was developed from programs involving popular education, community…
Toward a Distinctive Christian Undergraduate Management Program
ERIC Educational Resources Information Center
Smith, Thomas M.; VanderVeen, Steve
2008-01-01
We motivate and develop a theoretical framework for creating a distinctive Christian undergraduate management program that is directed toward providing (a) the necessary intellectual characteristics to do "well" and (b) the necessary emotional characteristics to do "good." This framework consists of seven propositions that connect the learning…
75 FR 62326 - Magnuson-Stevens Act Provisions; Fishing Capacity Reduction Framework
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-08
.... 100330171-0388-02] RIN 0648-AY79 Magnuson-Stevens Act Provisions; Fishing Capacity Reduction Framework... implementing fishing capacity reduction programs (reduction programs) in accordance with the Magnuson-Stevens... pays harvesters in a fishery that has more vessels than capacity either to surrender their fishing...
Growth of nanostructures with controlled diameter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pfefferle, Lisa; Haller, Gary; Ciuparu, Dragos
2009-02-03
Transition metal-substituted MCM-41 framework structures with a high degree of structural order and a narrow pore diameter distribution were reproducibly synthesized by a hydrothermal method using a surfactant and an anti-foaming agent. The pore size and the mesoporous volume depend linearly on the surfactant chain length. The transition metals, such as cobalt, are incorporated substitutionally and highly dispersed in the silica framework. Single wall carbon nanotubes with a narrow diameter distribution that correlates with the pore diameter of the catalytic framework structure were prepared by a Boudouard reaction. Nanostructures with a specified diameter or cross-sectional area can therefore be predictably prepared by selecting a suitable pore size of the framework structure.
Tierless Programming for the Internet of Things
DOE Office of Scientific and Technical Information (OSTI.GOV)
Decker, Brett
The Internet of Things (IoT) is about Internet-addressability and connectivity for everyday devices. The goal of this project was to create a framework to allow developers to more easily control IoT devices and turn their interactions into meaningful applications. We leveraged a tierless approach for Software Defined Networking (SDN) to build this framework. We expanded Flowlog, a tierless programming language for SDN controllers, to support IoT devices developed by Spark IO to build this framework.
Development of a Curricular Framework for Pediatric Hospital Medicine Fellowships.
Jerardi, Karen E; Fisher, Erin; Rassbach, Caroline; Maniscalco, Jennifer; Blankenburg, Rebecca; Chase, Lindsay; Shah, Neha
2017-07-01
Pediatric Hospital Medicine (PHM) is an emerging field in pediatrics and one that has experienced immense growth and maturation in a short period of time. Evolution and rapid expansion of the field invigorated the goal of standardizing PHM fellowship curricula, which naturally aligned with the field's evolving pursuit of a defined identity and consideration of certification options. The national group of PHM fellowship program directors sought to establish curricular standards that would more accurately reflect the competencies needed to practice pediatric hospital medicine and meet future board certification needs. In this manuscript, we describe the method by which we reached consensus on a 2-year curricular framework for PHM fellowship programs, detail the current model for this framework, and provide examples of how this curricular framework may be applied to meet the needs of a variety of fellows and fellowship programs. The 2-year PHM fellowship curricular framework was developed over a number of years through an iterative process and with the input of PHM fellowship program directors (PDs), PHM fellowship graduates, PHM leaders, pediatric hospitalists practicing in a variety of clinical settings, and other educators outside the field. We have developed a curricular framework for PHM Fellowships that consists of 8 education units (defined as 4 weeks each) in 3 areas: clinical care, systems and scholarship, and individualized curriculum. Copyright © 2017 by the American Academy of Pediatrics.
Chmiel, Aviva S; Shaha, Maya; Schneider, Daniel K
2017-01-01
The aim of this research is to develop a comprehensive evaluation framework involving all actors in a higher education blended learning (BL) program. BL evaluation usually focuses on students, on faculty, or on technological or institutional aspects. Currently, no validated comprehensive monitoring tool exists that can support introduction and further implementation of BL in a higher education context. Starting from established evaluation principles and standards, the concepts to be evaluated were first identified and grouped. In a second step, related BL evaluation tools referring to the student, faculty and institutional levels were selected. This allowed setting up and implementing an evaluation framework to monitor the introduction of BL during two succeeding recurrences of the program. The results of the evaluation allowed documenting strengths and weaknesses of the BL format in a comprehensive way, involving all actors. It has led to improvements at program, faculty and course level. The evaluation process and the reporting of the results proved to be demanding in time and personal resources. The evaluation framework allows measuring the most significant dimensions influencing the success of a BL implementation at program level. However, this comprehensive evaluation is resource intensive. Further steps will be to refine the framework towards a sustainable and transferable BL monitoring tool that finds a balance between comprehensiveness and efficiency. Copyright © 2016 Elsevier Ltd. All rights reserved.
Planning Student Flow with Linear Programming: A Tunisian Case Study.
ERIC Educational Resources Information Center
Bezeau, Lawrence
A student flow model in linear programming format, designed to plan the movement of students into secondary and university programs in Tunisia, is described. The purpose of the plan is to determine a sufficient number of graduating students that would flow back into the system as teachers or move into the labor market to meet fixed manpower…
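A student-flow allocation of this kind reduces to a small linear program. The sketch below uses hypothetical unit costs and manpower targets, not the Tunisian data:

```python
from scipy.optimize import linprog

# Toy student-flow LP: place 1000 secondary graduates into university (u),
# teacher training (t), or the labor market (l) at minimum cost while
# meeting fixed manpower targets. All numbers are illustrative.
#
#   minimize  3u + 2t + 1l
#   s.t.      u + t + l == 1000   (every graduate is placed)
#             t >= 150            (teacher requirement)
#             l >= 400            (labor-market requirement)
#             u, t, l >= 0
c = [3.0, 2.0, 1.0]
A_eq = [[1.0, 1.0, 1.0]]
b_eq = [1000.0]
A_ub = [[0.0, -1.0, 0.0],   # -t <= -150  encodes  t >= 150
        [0.0, 0.0, -1.0]]   # -l <= -400  encodes  l >= 400
b_ub = [-150.0, -400.0]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 3)
```

With these costs the optimum keeps teacher training at its floor (t = 150) and routes the remainder to the cheapest destination, the kind of trade-off a planning model of this sort makes explicit.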
NASA Astrophysics Data System (ADS)
Kathpalia, B.; Tan, D.; Stern, I.; Erturk, A.
2018-01-01
It is well known that plucking-based frequency up-conversion can enhance the power output in piezoelectric energy harvesting by enabling cyclic free vibration at the fundamental bending mode of the harvester even for very low excitation frequencies. In this work, we present a geometrically nonlinear plucking-based framework for frequency up-conversion in piezoelectric energy harvesting under quasistatic excitations associated with low-frequency stimuli such as walking and similar rigid body motions. Axial shortening of the plectrum is essential to enable plucking excitation, which requires a nonlinear framework relating the plectrum parameters (e.g. overlap length between the plectrum and harvester) to the overall electrical power output. Von Kármán-type geometrically nonlinear deformation of the flexible plectrum cantilever is employed to relate the overlap length between the flexible (nonlinear) plectrum and the stiff (linear) harvester to the transverse quasistatic tip displacement of the plectrum, and thereby the tip load on the linear harvester in each plucking cycle. By combining the nonlinear plectrum mechanics and linear harvester dynamics with two-way electromechanical coupling, the electrical power output is obtained directly in terms of the overlap length. Experimental case studies and validations are presented for various overlap lengths and a set of electrical load resistance values. Further analysis results are reported regarding the combined effects of plectrum thickness and overlap length on the plucking force and harvested power output. The experimentally validated nonlinear plectrum-linear harvester framework proposed herein can be employed to design and optimize frequency up-conversion by properly choosing the plectrum parameters (geometry, material, overlap length, etc) as well as the harvester parameters.
NASA Technical Reports Server (NTRS)
Weston, R. P.; Green, L. L.; Salas, A. O.; Samareh, J. A.; Townsend, J. C.; Walsh, J. L.
1999-01-01
An objective of the HPCC Program at NASA Langley has been to promote the use of advanced computing techniques to more rapidly solve the problem of multidisciplinary optimization of a supersonic transport configuration. As a result, a software system has been designed and is being implemented to integrate a set of existing discipline analysis codes, some of them CPU-intensive, into a distributed computational framework for the design of a High Speed Civil Transport (HSCT) configuration. The proposed paper will describe the engineering aspects of integrating these analysis codes and additional interface codes into an automated design system. The objective of the design problem is to optimize the aircraft weight for given mission conditions, range, and payload requirements, subject to aerodynamic, structural, and performance constraints. The design variables include both thicknesses of structural elements and geometric parameters that define the external aircraft shape. An optimization model has been adopted that uses the multidisciplinary analysis results and the derivatives of the solution with respect to the design variables to formulate a linearized model that provides input to the CONMIN optimization code, which outputs new values for the design variables. The analysis process begins by deriving the updated geometries and grids from the baseline geometries and grids using the new values for the design variables. This free-form deformation approach provides internal FEM (finite element method) grids that are consistent with aerodynamic surface grids. The next step involves using the derived FEM and section properties in a weights process to calculate detailed weights and the center of gravity location for specified flight conditions. The weights process computes the as-built weight, weight distribution, and weight sensitivities for given aircraft configurations at various mass cases. 
Currently, two mass cases are considered: cruise and gross take-off weight (GTOW). Weights information is obtained from correlations of data from three sources: 1) as-built initial structural and non-structural weights from an existing database, 2) theoretical FEM structural weights and sensitivities from Genesis, and 3) empirical as-built weight increments, non-structural weights, and weight sensitivities from FLOPS. For the aeroelastic analysis, a variable-fidelity aerodynamic analysis has been adopted. This approach uses infrequent CPU-intensive non-linear CFD to calculate a non-linear correction relative to a linear aero calculation for the same aerodynamic surface at an angle of attack that results in the same configuration lift. For efficiency, this nonlinear correction is applied after each subsequent linear aero solution during the iterations between the aerodynamic and structural analyses. Convergence is achieved when the vehicle shape being used for the aerodynamic calculations is consistent with the structural deformations caused by the aerodynamic loads. To make the structural analyses more efficient, a linearized structural deformation model has been adopted, in which a single stiffness matrix can be used to solve for the deformations under all the load conditions. Using the converged aerodynamic loads, a final set of structural analyses are performed to determine the stress distributions and the buckling conditions for constraint calculation. Performance constraints are obtained by running FLOPS using drag polars that are computed using results from non-linear corrections to the linear aero code plus several codes to provide drag increments due to skin friction, wave drag, and other miscellaneous drag contributions. The status of the integration effort will be presented in the proposed paper, and results will be provided that illustrate the degree of accuracy in the linearizations that have been employed.
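The alternation between corrected linear aerodynamics and linearized structural deformation described above is a fixed-point loop. The sketch below is a scalar caricature with hypothetical function names, not the HSCT analysis codes:

```python
# Fixed-point sketch of the variable-fidelity aeroelastic iteration:
# a cheap linear aero solve plus a frozen nonlinear correction (computed
# infrequently by CFD in the real system), alternated with a linearized
# structural solve, until the vehicle shape stops changing.

def aeroelastic_fixed_point(linear_aero, structure, nl_correction,
                            shape0=0.0, tol=1e-9, max_iter=100):
    shape = shape0
    for _ in range(max_iter):
        loads = linear_aero(shape) + nl_correction  # corrected linear aero
        new_shape = structure(loads)                # linearized structure
        if abs(new_shape - shape) < tol:
            return new_shape                        # converged shape
        shape = new_shape
    raise RuntimeError("aeroelastic iteration did not converge")

# Toy contraction: the loop converges to the fixed point of
#   s = 0.1 * (0.5 * s + 0.2) + 1.0
converged = aeroelastic_fixed_point(lambda s: 0.5 * s,
                                    lambda L: 0.1 * L + 1.0,
                                    nl_correction=0.2)
```

Convergence here corresponds to the abstract's criterion: the shape used for the aerodynamic calculation is consistent with the structural deformation caused by the aerodynamic loads.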
Lattice enumeration for inverse molecular design using the signature descriptor.
Martin, Shawn
2012-07-23
We describe an inverse quantitative structure-activity relationship (QSAR) framework developed for the design of molecular structures with desired properties. This framework uses chemical fragments encoded with a molecular descriptor known as a signature. It solves a system of linear constrained Diophantine equations to reorganize the fragments into novel molecular structures. The method has been previously applied to problems in drug and materials design but has inherent computational limitations due to the necessity of solving the Diophantine constraints. We propose a new approach to overcome these limitations using the Fincke-Pohst algorithm for lattice enumeration. We benchmark the new approach against previous results on LFA-1/ICAM-1 inhibitory peptides, linear homopolymers, and hydrofluoroether foam blowing agents. Software implementing the new approach is available at www.cs.otago.ac.nz/homepages/smartin.
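The core step, enumerating nonnegative integer solutions of the linear Diophantine constraints that reassemble fragments into molecules, can be illustrated with a brute-force stand-in. The paper uses Fincke-Pohst lattice enumeration for exactly this reason; exhaustive search is only viable for tiny systems:

```python
from itertools import product

def diophantine_solutions(A, b, bound):
    """Enumerate nonnegative integer vectors x with A @ x == b and
    0 <= x_i <= bound.

    A is a list of rows, b the right-hand side. This exhaustive search is
    an illustrative stand-in for lattice enumeration: its cost grows as
    (bound + 1) ** n, which is why Fincke-Pohst matters in practice.
    """
    n = len(A[0])
    sols = []
    for x in product(range(bound + 1), repeat=n):
        if all(sum(a * xi for a, xi in zip(row, x)) == bi
               for row, bi in zip(A, b)):
            sols.append(x)
    return sols
```

For example, a single constraint x1 + x2 + x3 = 3 (say, a total fragment count) has exactly ten nonnegative solutions, each a candidate fragment assembly.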
Structured functional additive regression in reproducing kernel Hilbert spaces.
Zhu, Hongxiao; Yao, Fang; Zhang, Hao Helen
2014-06-01
Functional additive models (FAMs) provide a flexible yet simple framework for regressions involving functional predictors. The utilization of data-driven basis in an additive rather than linear structure naturally extends the classical functional linear model. However, the critical issue of selecting nonlinear additive components has been less studied. In this work, we propose a new regularization framework for the structure estimation in the context of Reproducing Kernel Hilbert Spaces. The proposed approach takes advantage of the functional principal components which greatly facilitates the implementation and the theoretical analysis. The selection and estimation are achieved by penalized least squares using a penalty which encourages the sparse structure of the additive components. Theoretical properties such as the rate of convergence are investigated. The empirical performance is demonstrated through simulation studies and a real data application.
A national framework for disaster health education in Australia.
FitzGerald, Gerard J; Aitken, Peter; Arbon, Paul; Archer, Frank; Cooper, David; Leggat, Peter; Myers, Colin; Robertson, Andrew; Tarrant, Michael; Davis, Elinor R
2010-01-01
Recent events have heightened awareness of disaster health issues and the need to prepare the health workforce to plan for and respond to major incidents. This has been reinforced at an international level by the World Association for Disaster and Emergency Medicine, which has proposed an international educational framework. The aim of this paper is to outline the development of a national educational framework for disaster health in Australia. The framework was developed on the basis of the literature and the previous experience of members of a National Collaborative for Disaster Health Education and Research. The Collaborative was brought together in a series of workshops and teleconferences, utilizing a modified Delphi technique to finalize the content at each level of the framework and to assign a value to the inclusion of that content at the various levels. The framework identifies seven educational levels along with educational outcomes for each level. The framework also identifies the recommended contents at each level and assigns a rating of depth for each component. The framework is not intended as a detailed curriculum, but rather as a guide for educationalists to develop specific programs at each level. This educational framework will provide an infrastructure around which future educational programs in Disaster Health in Australia may be designed and delivered. It will permit improved articulation for students between the various levels and greater consistency between programs so that operational responders may have a consistent language and operational approach to the management of major events.
Linear decomposition approach for a class of nonconvex programming problems.
Shen, Peiping; Wang, Chunfeng
2017-01-01
This paper presents a linear decomposition approach for a class of nonconvex programming problems by dividing the input space into polynomially many grids. It shows that under certain assumptions the original problem can be transformed and decomposed into a polynomial number of equivalent linear programming subproblems. By solving a series of linear programming subproblems corresponding to those grid points, we can obtain a near-optimal solution of the original problem. Compared to existing results in the literature, the proposed algorithm does not require the assumptions of quasi-concavity and differentiability of the objective function, and it offers a distinct approach that solves the problem in reduced running time.
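A minimal sketch of the grid-decomposition idea, under assumptions of my own choosing: the nonconvex bilinear program min -x*y subject to x + y <= 2, 0 <= x, y <= 2 is split over a grid in x, and with x fixed each subproblem is a one-dimensional linear program whose optimum lies at an endpoint of the feasible interval.

```python
# Grid decomposition of an invented nonconvex bilinear program into 1-D LPs.

def solve_subproblem_lp(x_fixed):
    """With x fixed at a grid point, min -x*y over {x+y<=2, 0<=y<=2} is a
    linear program in y, so the optimum sits at a vertex of the interval."""
    y_hi = min(2.0, 2.0 - x_fixed)
    if y_hi < 0.0:
        return None                      # this grid point is infeasible
    return min(-x_fixed * y for y in (0.0, y_hi))

def grid_decomposition(n_grid):
    """Best objective over all grid subproblems (near-optimal overall)."""
    vals = [solve_subproblem_lp(2.0 * i / n_grid) for i in range(n_grid + 1)]
    return min(v for v in vals if v is not None)

best_value = grid_decomposition(10)      # grid happens to include x = 1.0
```

For this toy instance the true optimum -1 (at x = y = 1) lies on the grid, so the decomposition recovers it exactly; in general a finer grid tightens the near-optimality guarantee.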
Pörzse, Gábor
2009-08-09
Research and development (R&D) has been playing a leading role in the European Community's history since the very beginning of European integration. Its importance has grown in recent years, after the launch of the Lisbon strategy. Framework programs have always played a considerable part in community research. The aim of their introduction was to fine tune national R&D activities, and to successfully divide research tasks between the Community and the member states. The Community, from the very outset, has acknowledged the importance of life sciences. It is no coincidence that life sciences have become the second biggest priority in the last two framework programs. This study provides a historical, and at the same time analytical and evaluative, review of community R&D policy and activity from the starting point of its development until the present day. It examines in detail how the changes in structure, system of conditions, regulations and priorities of the framework programs have followed the evolution of social and economic needs. The paper puts special emphasis on the analysis of the development of life science research, presenting how they have met the challenges of the age, and how they have been built into the framework programs. Another aim of the present study is to examine how successfully Hungarian researchers have joined community research, especially the framework programs in the field of life sciences. To answer these questions, it was essential to survey, process and analyze the data available in the national and European public and closed databases. Unlike previous documents, this analysis does not concentrate on the political and scientific background. It outlines the role community research has played in sustainable social and economic development and competitiveness, how it has supported common policies and how the processes of integration have been deepening.
In addition, the present paper offers a complete review of the given field, from its foundation up until the present day, by elaborating the newest initiatives and ideas for the future. This work is also novel in its treatment of the given professional field, life sciences within the framework programs, and in its processing and evaluation of data on Hungarian participation in the 5th and 6th framework programs in the field of life sciences.
Abbass-Dick, Jennifer; Dennis, Cindy-Lee
Targeting mothers and fathers in breast-feeding promotion programs is recommended, as research has found that fathers' support positively impacts breast-feeding duration and exclusivity. Breast-feeding coparenting refers to the manner in which parents work together to achieve their breast-feeding goals. The Breast-feeding Coparenting Framework was developed on the basis of diverse coparenting models and research related to fathers' involvement with breast-feeding. This framework consists of 5 components: joint breast-feeding goal setting, shared breast-feeding responsibility, proactive breast-feeding support, father's/partner's parent-child interactions, and productive communication and problem solving. This framework may be of value to policy makers and program providers working to improve breast-feeding outcomes.
Parlesak, Alexandr; Geelhoed, Diederike; Robertson, Aileen
2014-06-01
Chronic undernutrition is prevalent in Mozambique, where children suffer from stunting, vitamin A deficiency, anemia, and other nutrition-related disorders. Complete diet formulation products (CDFPs) are increasingly promoted to prevent chronic undernutrition. This study used linear programming to investigate whether diet diversification using local foods should be prioritized in order to reduce the prevalence of chronic undernutrition. Market prices of local foods were collected in Tete City, Mozambique. Linear programming was applied to calculate the cheapest possible fully nutritious food baskets (FNFB) by stepwise addition of micronutrient-dense local foods. Only the top quintile of Mozambican households, using average expenditure data, could afford the FNFB that was designed using linear programming from a spectrum of local standard foods. The addition of beef heart or liver, dried fish, and fresh moringa leaves before applying linear programming decreased the price by a factor of up to 2.6. As a result, the top three quintiles could afford the FNFB optimized using both the diversification strategy and linear programming. CDFPs, when added to the baskets, were unable to overcome the micronutrient gaps without greatly exceeding recommended energy intakes, due to their high ratio of energy to micronutrient density. Dietary diversification strategies using local, low-cost, nutrient-dense foods can meet all micronutrient recommendations and overcome all micronutrient gaps. The success of linear programming in identifying a low-cost FNFB depends entirely on the investigators' ability to select appropriate micronutrient-dense foods. CDFPs added to food baskets are unable to overcome micronutrient gaps without greatly exceeding recommended energy intake.
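The FNFB idea lends itself to a tiny worked example. The study solved a true linear program; the sketch below substitutes an exhaustive search over integer portions so it stays dependency-free, and all food names, prices, and nutrient values are invented, not Tete City data.

```python
# Cheapest basket meeting nutrient floors and an energy ceiling, by brute
# force over integer daily portions (a stand-in for the study's LP).
from itertools import product

# Invented per-portion values: (price, energy_kcal, iron_mg, vitA_ug)
foods = {
    "maize":   (0.10, 350, 1.0,    0),
    "beans":   (0.30, 330, 5.0,    0),
    "liver":   (0.50, 130, 6.5, 6500),
    "moringa": (0.20,  60, 4.0,  750),
}
NEED = {"energy": 2100, "iron": 15, "vitA": 500}
MAX_ENERGY = 2600   # do not greatly exceed the energy recommendation

def cheapest_basket(max_portions=8):
    names = list(foods)
    best = None
    for counts in product(range(max_portions + 1), repeat=len(names)):
        price = energy = iron = vita = 0.0
        for c, name in zip(counts, names):
            p, e, fe, va = foods[name]
            price += c * p; energy += c * e; iron += c * fe; vita += c * va
        feasible = (NEED["energy"] <= energy <= MAX_ENERGY
                    and iron >= NEED["iron"] and vita >= NEED["vitA"])
        if feasible and (best is None or price < best[0]):
            best = (price, dict(zip(names, counts)))
    return best

price, basket = cheapest_basket()
```

With these invented numbers the cheapest feasible basket costs 1.10 and leans on a micronutrient-dense food (moringa), mirroring the paper's point that adding such foods before optimizing drives the basket price down.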
Improved Equivalent Linearization Implementations Using Nonlinear Stiffness Evaluation
NASA Technical Reports Server (NTRS)
Rizzi, Stephen A.; Muravyov, Alexander A.
2001-01-01
This report documents two new implementations of equivalent linearization for solving geometrically nonlinear random vibration problems of complicated structures. The implementations are given the acronym ELSTEP, for "Equivalent Linearization using a STiffness Evaluation Procedure." Both implementations of ELSTEP are fundamentally the same in that they use a novel nonlinear stiffness evaluation procedure to numerically compute otherwise inaccessible nonlinear stiffness terms from commercial finite element programs. The commercial finite element program MSC/NASTRAN (NASTRAN) was chosen as the core of ELSTEP. The FORTRAN implementation calculates the nonlinear stiffness terms and performs the equivalent linearization analysis outside of NASTRAN. The Direct Matrix Abstraction Program (DMAP) implementation performs these operations within NASTRAN. Both provide nearly identical results. Within each implementation, two error minimization approaches for the equivalent linearization procedure are available - force and strain energy error minimization. Sample results for a simply supported rectangular plate are included to illustrate the analysis procedure.
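Equivalent linearization itself reduces to a small fixed-point problem, which the following sketch works through for a Duffing oscillator x'' + c x' + k x + eps x^3 = w(t) under white noise. This is a textbook analytic stand-in for ELSTEP, which instead extracts the nonlinear stiffness terms numerically from a finite element model; all coefficients are invented.

```python
import math

# Equivalent-linearization fixed point for a Duffing oscillator under white
# noise of two-sided PSD S0. The equivalent stiffness satisfies
#   k_eq = k + 3*eps*E[x^2],  with  E[x^2] = pi*S0 / (c*k_eq)
# for the linearized system, so we iterate to self-consistency.
k, c, eps, S0 = 1.0, 0.1, 0.5, 0.01

def equivalent_stiffness(tol=1e-12, max_iter=200):
    k_eq = k
    for _ in range(max_iter):
        var = math.pi * S0 / (c * k_eq)     # response variance of linear system
        k_new = k + 3.0 * eps * var         # E[d(eps*x^3)/dx] = 3*eps*E[x^2]
        if abs(k_new - k_eq) < tol:
            return k_new
        k_eq = k_new
    return k_eq

k_eq = equivalent_stiffness()
```

For this scalar case the fixed point is available in closed form, k_eq = (k + sqrt(k^2 + 12*eps*pi*S0/c))/2, which the iteration reproduces; the hardening nonlinearity raises the effective stiffness above k.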
Linear combination reading program for capture gamma rays
Tanner, Allan B.
1971-01-01
This program computes a weighting function, Qj, which gives a scalar output value of unity when applied to the spectrum of a desired element and a minimum value (considering statistics) when applied to spectra of materials not containing the desired element. Intermediate values are obtained for materials containing the desired element, in proportion to the amount of the element they contain. The program is written in the BASIC language in a format specific to the Hewlett-Packard 2000A Time-Sharing System, and is an adaptation of an earlier program for linear combination reading for X-ray fluorescence analysis (Tanner and Brinkerhoff, 1971). Following the program is a sample run from a study of the application of the linear combination technique to capture-gamma-ray analysis for calcium (report in preparation).
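The weighting function Qj can be illustrated in a few lines: choose Q so that its dot product with the target element's spectrum is unity and with each interfering spectrum is zero, i.e. solve a small linear system. The original program is BASIC on an HP 2000A; the sketch below is a modern pure-Python equivalent with invented three-channel spectra.

```python
# Compute a linear-combination weighting Q for invented 3-channel spectra:
# Q . s_target = 1 and Q . s_interferent = 0 for each interferent.

def solve3(M, rhs):
    """Tiny Gaussian elimination with partial pivoting for a 3x3 system."""
    A = [row[:] + [r] for row, r in zip(M, rhs)]
    n = 3
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for cc in range(col, n + 1):
                A[r][cc] -= f * A[col][cc]
    x = [0.0] * n
    for r in reversed(range(n)):
        x[r] = (A[r][n] - sum(A[r][cc] * x[cc] for cc in range(r + 1, n))) / A[r][r]
    return x

s_ca  = [5.0, 2.0, 1.0]     # target element (e.g. calcium), invented
s_bg1 = [1.0, 4.0, 0.5]     # interfering material 1, invented
s_bg2 = [0.5, 1.0, 3.0]     # interfering material 2, invented

# Rows of the system: Q . s_ca = 1, Q . s_bg1 = 0, Q . s_bg2 = 0.
Q = solve3([s_ca, s_bg1, s_bg2], [1.0, 0.0, 0.0])

# A mixed spectrum containing 0.7 'units' of calcium reads back as 0.7.
mix = [0.7 * a + 1.3 * b + 0.4 * cmp for a, b, cmp in zip(s_ca, s_bg1, s_bg2)]
reading = sum(qi * mi for qi, mi in zip(Q, mix))
```

Because the readout is linear, the scalar output is proportional to the amount of the target element in any mixture, exactly the behavior the abstract describes.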
Scheduling algorithms for rapid imaging using agile Cubesat constellations
NASA Astrophysics Data System (ADS)
Nag, Sreeja; Li, Alan S.; Merrick, James H.
2018-02-01
Distributed space missions, such as formation flight and constellations, are being recognized as important Earth Observation solutions to increase measurement samples over space and time. Cubesats are increasing in size (27U, ∼40 kg in development) with increasing capabilities to host imager payloads. Given the precise attitude control systems emerging in the commercial market, Cubesats now have the ability to slew and capture images within short notice. We propose a modular framework that combines orbital mechanics, attitude control and scheduling optimization to plan the time-varying, full-body orientation of agile Cubesats in a constellation such that they maximize the number of observed images and observation time, within the constraints of Cubesat hardware specifications. The attitude control strategy combines bang-bang and PD control, with constraints such as power consumption, response time, and stability factored into the optimality computations, and a possible extension to PID control to account for disturbances. Schedule optimization is performed using dynamic programming with two levels of heuristics, verified and improved upon using mixed integer linear programming. The automated scheduler is expected to run on ground station resources and the resultant schedules uplinked to the satellites for execution; however, it can be adapted for onboard scheduling, contingent on Cubesat hardware and software upgrades. The framework is generalizable over small steerable spacecraft, sensor specifications, imaging objectives and regions of interest, and is demonstrated using multiple 20 kg satellites in Low Earth Orbit for two case studies: rapid imaging of Landsat's land and coastal images and extended imaging of global, warm water coral reefs. The proposed algorithm captures up to 161% more Landsat images than nadir-pointing sensors with the same field of view, on a 2-satellite constellation over a 12-h simulation.
Integer programming verified that the dynamic programming solution for single satellites was within 10% of optimal, and found solutions up to 5% better. The optimality gap for constellations was found to be 22% at worst, but the dynamic programming schedules were found at nearly four orders of magnitude faster computational speed than integer programming. The algorithm can include cloud cover predictions, ground downlink windows or any other spatial, temporal or angular constraints in the orbital module and be integrated into planning tools for agile constellations.
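The dynamic-programming scheduler can be sketched at toy scale: one agile satellite, discrete time slots, a one-slot slew penalty when re-pointing, and invented visibility windows. The state is the current pointing target; each slot either holds pointing (and captures if the target is visible) or spends the slot slewing.

```python
# Toy DP scheduler: maximize images captured under a one-slot slew penalty.
# Visibility windows and targets are invented for illustration.
visible = {            # target -> set of time slots in which it can be imaged
    "A": {0, 1, 2},
    "B": {2, 3},
    "C": {4, 5},
}
T = 6
targets = list(visible)

def best_schedule():
    # dp[t] = max images captured so far, ending the slot pointed at target t
    dp = {t: 0 for t in targets}
    for slot in range(T):
        new_dp = {}
        for t in targets:
            stay = dp[t] + (1 if slot in visible[t] else 0)  # keep pointing
            slew = max(dp[u] for u in targets if u != t)     # arrive, no image
            new_dp[t] = max(stay, slew)
        dp = new_dp
    return max(dp.values())

max_images = best_schedule()
```

Here the optimal plan images A in slots 0-2, spends slot 3 slewing (sacrificing B), and images C in slots 4-5 for five captures; a mixed-integer formulation of the same model could certify this optimum, which is the verification role integer programming plays in the paper.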
Evaluating forest management policies by parametric linear programing
Daniel I. Navon; Richard J. McConnen
1967-01-01
An analytical and simulation technique, parametric linear programing explores alternative conditions and devises an optimal management plan for each condition. Its application in solving policy-decision problems in the management of forest lands is illustrated in an example.
Guevara, V R
2004-02-01
A nonlinear programming optimization model was developed to maximize margin over feed cost in broiler feed formulation and is described in this paper. The model identifies the optimal feed mix that maximizes profit margin. Optimum metabolizable energy level and performance were found by using Excel Solver nonlinear programming. Data from an energy density study with broilers were fitted to quadratic equations to express weight gain, feed consumption, and the objective function income over feed cost in terms of energy density. Nutrient:energy ratio constraints were transformed into equivalent linear constraints. National Research Council nutrient requirements and feeding program were used for examining changes in variables. The nonlinear programming feed formulation method was used to illustrate the effects of changes in different variables on the optimum energy density, performance, and profitability and was compared with conventional linear programming. To demonstrate the capabilities of the model, I determined the impact of variation in prices. Prices for broiler, corn, fish meal, and soybean meal were increased and decreased by 25%. Formulations were identical in all other respects. Energy density, margin, and diet cost changed compared with conventional linear programming formulation. This study suggests that nonlinear programming can be more useful than conventional linear programming to optimize performance response to energy density in broiler feed formulation because an energy level does not need to be set.
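The core idea, letting the optimizer choose the energy density rather than fixing it up front, can be sketched with invented quadratic fits (these are not the study's coefficients, and Excel Solver is replaced by a simple scan over the allowed range).

```python
# Margin-over-feed-cost as a function of dietary energy density E (Mcal/kg),
# built from invented quadratic/linear fits in the spirit of the study.
MEAT_PRICE = 1.5            # $/kg live weight (assumed)

def weight_gain(E):         # kg gain per bird vs energy density
    return -0.5 * E**2 + 3.2 * E - 3.0

def feed_intake(E):         # kg feed per bird (denser feed -> birds eat less)
    return 8.0 - 1.5 * E

def diet_cost(E):           # $/kg of formulated feed (denser feed costs more)
    return 0.05 + 0.08 * E

def margin(E):              # the nonlinear objective: income over feed cost
    return MEAT_PRICE * weight_gain(E) - feed_intake(E) * diet_cost(E)

# margin(E) is quadratic here, so a fine scan over the permitted range finds
# the interior optimum that a fixed-energy linear formulation would miss.
E_grid = [2.8 + 0.001 * i for i in range(601)]   # 2.8 .. 3.4 Mcal/kg
E_opt = max(E_grid, key=margin)
```

With these numbers the margin-maximizing density is about 3.36 Mcal/kg, strictly inside the range: the optimum balances extra gain against dearer, denser feed, which a conventional least-cost LP at a preset energy level cannot do.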
A Framework for Implementing TQM in Higher Education Programs
ERIC Educational Resources Information Center
Venkatraman, Sitalakshmi
2007-01-01
Purpose: This paper aims to provide a TQM framework that stresses continuous improvements in teaching as a plausible means of TQM implementation in higher education programs. Design/methodology/approach: The literature survey of the TQM philosophies and the comparative analysis of TQM adoption in industry versus higher education provide the…
ERIC Educational Resources Information Center
Cassata-Widera, Amy; Century, Jeanne; Kim, Dae Y.
2011-01-01
The practical need for multidimensional measures of fidelity of implementation (FOI) of reform-based science, technology, engineering, and mathematics (STEM) instructional materials, combined with a theoretical need in the field for a shared conceptual framework that could support accumulating knowledge on specific enacted program elements across…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-01
... Efficiency Program for Commercial and Industrial Equipment: Public Meeting and Availability of the Framework Document for Commercial and Industrial Pumps AGENCY: Office of Energy Efficiency and Renewable Energy... industrial pumps. To inform interested parties and to facilitate this process, DOE has prepared a Framework...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-22
... Efficiency Program for Commercial and Industrial Equipment: Public Meeting and Availability of the Framework... the notice of public meeting and availability of the Framework Document pertaining to the development of energy conservation standards for commercial and industrial fan and blower equipment published on...
Using the RE-AIM framework to evaluate physical activity public health programs in Mexico
USDA-ARS?s Scientific Manuscript database
Physical activity (PA) public health programming has been widely used in Mexico; however, few studies have documented individual and organizational factors that might be used to evaluate their public health impact. The RE-AIM framework is an evaluation tool that examines individual and organizationa...
Optimal feedback control infinite dimensional parabolic evolution systems: Approximation techniques
NASA Technical Reports Server (NTRS)
Banks, H. T.; Wang, C.
1989-01-01
A general approximation framework is discussed for computation of optimal feedback controls in linear quadratic regulator problems for nonautonomous parabolic distributed parameter systems. This is done in the context of a theoretical framework using general evolution systems in infinite dimensional Hilbert spaces. Conditions are discussed for preservation under approximation of stabilizability and detectability hypotheses on the infinite dimensional system. The special case of periodic systems is also treated.
NASA Technical Reports Server (NTRS)
Young, Katherine C.; Sobieszczanski-Sobieski, Jaroslaw
1988-01-01
This project has two objectives. The first is to determine whether linear programming techniques can outperform the feasible directions algorithm when handling design optimization problems with a large number of design variables and constraints. The second is to determine whether using the Kreisselmeier-Steinhauser (KS) function to replace the constraints with one constraint will reduce the cost of total optimization. Comparisons are made using solutions obtained with linear and non-linear methods. The results indicate that there is no cost saving in using the linear method or in using the KS function to replace constraints.
Asymptotic structure of space-time with a positive cosmological constant
NASA Astrophysics Data System (ADS)
Kesavan, Aruna
In general relativity a satisfactory framework for describing isolated systems exists when the cosmological constant Lambda is zero. The detailed analysis of the asymptotic structure of the gravitational field, which constitutes the framework of asymptotic flatness, lays the foundation for research in diverse areas in gravitational science. However, the framework is incomplete in two respects. First, asymptotic flatness provides well-defined expressions for physical observables such as energy and momentum as 'charges' of asymptotic symmetries at null infinity, ℐ⁺. But the asymptotic symmetry group, called the Bondi-Metzner-Sachs group, is infinite-dimensional, and a tensorial expression for the 'charge' integral of an arbitrary BMS element is missing. We address this issue by providing a charge formula which is a 2-sphere integral over fields local to the 2-sphere and refers to no extraneous structure. The second, and more significant, shortcoming is that observations have established that Lambda is not zero but positive in our universe. Can the framework describing isolated systems and their gravitational radiation be extended to incorporate this fact? In this dissertation we show that, unfortunately, the standard framework does not extend from the Lambda = 0 case to the Lambda > 0 case in a physically useful manner. In particular, we do not have an invariant notion of gravitational waves in the non-linear regime, nor an analog of the Bondi 'news tensor', nor positive energy theorems. In addition, we argue that the stronger boundary condition of conformal flatness of the intrinsic metric on ℐ⁺, which reduces the asymptotic symmetry group from Diff(ℐ⁺) to the de Sitter group, is insufficient to characterize gravitational fluxes and is physically unreasonable.
To obtain guidance for the full non-linear theory with Lambda > 0, linearized gravitational waves in de Sitter space-time are analyzed in detail. i) We show explicitly that conformal flatness of the boundary removes half the degrees of freedom of the gravitational field by hand and is not justified by physical considerations; ii) We obtain gauge invariant expressions of energy-momentum and angular momentum fluxes carried by gravitational waves in terms of fields defined at ℐ⁺; iii) We demonstrate that the flux formulas reduce to the familiar ones in Minkowski spacetime in spite of the fact that the limit Lambda → 0 is discontinuous (since, in particular, ℐ⁺ changes its space-like character to null in the limit); iv) We obtain a generalization of Einstein's 1918 quadrupole formula for power emission by a linearized source to include a positive Lambda; and, finally v) We show that, although energy of linearized gravitational waves can be arbitrarily negative in general, gravitational waves emitted by physically reasonable sources carry positive energy.
NASA Technical Reports Server (NTRS)
Fleming, P.
1985-01-01
A design technique is proposed for linear regulators in which a feedback controller of fixed structure is chosen to minimize an integral quadratic objective function subject to the satisfaction of integral quadratic constraint functions. Application of a non-linear programming algorithm to this mathematically tractable formulation results in an efficient and useful computer-aided design tool. Particular attention is paid to computational efficiency and various recommendations are made. Two design examples illustrate the flexibility of the approach and highlight the special insight afforded to the designer.
50 CFR 86.100 - What is the National Framework?
Code of Federal Regulations, 2013 CFR
2013-10-01
... (BIG) PROGRAM Service Completion of the National Framework § 86.100 What is the National Framework? The... your State. Through a State survey, you must conduct a boating access needs assessment or data...
50 CFR 86.100 - What is the National Framework?
Code of Federal Regulations, 2014 CFR
2014-10-01
... (BIG) PROGRAM Service Completion of the National Framework § 86.100 What is the National Framework? The... your State. Through a State survey, you must conduct a boating access needs assessment or data...
A sequential linear optimization approach for controller design
NASA Technical Reports Server (NTRS)
Horta, L. G.; Juang, J.-N.; Junkins, J. L.
1985-01-01
A linear optimization approach with a simple real arithmetic algorithm is presented for reliable controller design and vibration suppression of flexible structures. Using first-order sensitivity of the system eigenvalues with respect to the design parameters in conjunction with a continuation procedure, the method converts a nonlinear optimization problem into a maximization problem with linear inequality constraints. The method of linear programming is then applied to solve the converted linear optimization problem. The general efficiency of the linear programming approach allows the method to handle structural optimization problems with a large number of inequality constraints on the design vector. The method is demonstrated using a truss beam finite element model for the optimal sizing and placement of active/passive structural members for damping augmentation. Results using both the sequential linear optimization approach and nonlinear optimization are presented and compared. The insensitivity to initial conditions of the linear optimization approach is also demonstrated.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-29
... the Framework Document for High-Intensity Discharge Lamps AGENCY: Office of Energy Efficiency and... availability of framework document for high-intensity discharge (HID) lamps, initiating the rulemaking and data... Availability of Framework Document Regarding Energy Conservation Standards for High-Intensity Discharge (HID...
NASA Technical Reports Server (NTRS)
Banks, H. T.; Silcox, R. J.; Keeling, S. L.; Wang, C.
1989-01-01
A unified treatment of the linear quadratic tracking (LQT) problem, in which a control system's dynamics are modeled by a linear evolution equation with a nonhomogeneous component that is linearly dependent on the control function u, is presented; the treatment proceeds from the theoretical formulation to a numerical approximation framework. Attention is given to two categories of LQT problems in an infinite time interval: the finite energy and the finite average energy. The behavior of the optimal solution for finite time-interval problems as the length of the interval tends to infinity is discussed. Also presented are the formulations and properties of LQT problems in a finite time interval.
A New Pattern of Getting Nasty Number in Graphical Method
NASA Astrophysics Data System (ADS)
Sumathi, P.; Indhumathi, N.
2018-04-01
This paper proposes a new technique for obtaining nasty numbers using the graphical method for linear programming problems, and the technique has been demonstrated on various linear programming problems. Some characterisations of nasty numbers are also discussed.
NASA Astrophysics Data System (ADS)
Pradanti, Paskalia; Hartono
2018-03-01
Determination of the insulin injection dose in diabetes mellitus treatment can be considered an optimal control problem. This article aims to simulate optimal blood glucose control for a patient with diabetes mellitus. The blood glucose regulation of a diabetic patient is represented by Ackerman’s Linear Model. The problem is then solved using a dynamic programming method. The desired blood glucose level is obtained by minimizing a performance index in Lagrange form. The results show that dynamic programming based on Ackerman’s Linear Model solves the problem well.
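The dynamic-programming solution of such a problem can be sketched as a scalar discrete-time LQR: the Ackerman model is reduced to one linear state (glucose deviation from the desired level), the Lagrange-form index is quadratic, and the backward Riccati recursion yields the feedback dosing law. All coefficients below are illustrative, not fitted patient parameters.

```python
# Scalar dynamic-programming (discrete LQR) sketch of insulin dosing:
# g_{t+1} = a*g_t + b*u_t, cost = sum(q*g^2 + r*u^2). Invented coefficients.
a, b = 0.95, -0.1        # glucose persistence; insulin lowers glucose
q, r = 1.0, 0.1          # Lagrange-form performance index weights
T = 50

def lqr_gains():
    """Backward Riccati recursion; returns feedback gains for t = 0..T-1."""
    P = q                                    # terminal cost-to-go weight
    gains = []
    for _ in range(T):
        K = a * b * P / (r + b * b * P)      # optimal gain at this stage
        P = q + a * a * P - K * a * b * P    # Riccati update
        gains.append(K)
    return list(reversed(gains))

def simulate(g0=30.0):
    """Closed-loop glucose deviation under the optimal dosing law u = -K*g."""
    g = g0
    traj = [g]
    for K in lqr_gains():
        u = -K * g
        g = a * g + b * u
        traj.append(g)
    return traj

traj = simulate()
```

Starting 30 units above the target, the closed loop contracts by a factor of roughly 0.72 per step and drives the deviation essentially to zero within the horizon, which is the qualitative result the abstract reports.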
Galka, Andreas; Siniatchkin, Michael; Stephani, Ulrich; Groening, Kristina; Wolff, Stephan; Bosch-Bayard, Jorge; Ozaki, Tohru
2010-12-01
The analysis of time series obtained by functional magnetic resonance imaging (fMRI) may be approached by fitting predictive parametric models, such as nearest-neighbor autoregressive models with exogenous input (NNARX). As a part of the modeling procedure, it is possible to apply instantaneous linear transformations to the data. Spatial smoothing, a common preprocessing step, may be interpreted as such a transformation. The autoregressive parameters may be constrained, such that they provide a response behavior that corresponds to the canonical haemodynamic response function (HRF). We present an algorithm for estimating the parameters of the linear transformations and of the HRF within a rigorous maximum-likelihood framework. Using this approach, an optimal amount of both the spatial smoothing and the HRF can be estimated simultaneously for a given fMRI data set. An example from a motor-task experiment is discussed. It is found that, for this data set, weak, but non-zero, spatial smoothing is optimal. Furthermore, it is demonstrated that activated regions can be estimated within the maximum-likelihood framework.
Improved Shaping Approach to the Preliminary Design of Low-Thrust Trajectories
NASA Astrophysics Data System (ADS)
Novak, D. M.; Vasile, M.
2011-01-01
This paper presents a general framework for the development of shape-based approaches to low-thrust trajectory design. A novel shaping method, based on a three-dimensional description of the trajectory in spherical coordinates, is developed within this general framework. Both the exponential sinusoid and the inverse polynomial shaping are demonstrated to be particular two-dimensional cases of the spherical one. The pseudoequinoctial shaping is revisited within the new framework, and the nonosculating nature of the pseudoequinoctial elements is analyzed. A two-step approach is introduced to solve the time of flight constraint, related to the design of low-thrust arcs with boundary constraints for both spherical and pseudoequinoctial shaping. The solution derived from the shaping approach is improved with a feedback linear-quadratic controller and compared against a direct collocation method based on finite elements in time. The new shaping approach and the combination of shaping and linear-quadratic controller are tested on four case studies: a mission to Mars, a mission to asteroid 1989ML, a mission to comet Tempel-1, and a mission to Neptune.
Campbell, Norm R C; Ordunez, Pedro; DiPette, Donald J; Giraldo, Gloria P; Angell, Sonia Y; Jaffe, Marc G; Lackland, Dan; Martinez, Ramón; Valdez, Yamilé; Maldonado Figueredo, Javier I; Paccot, Melanie; Santana, Maria J; Whelton, Paul K
2018-06-01
The Pan American Health Organization (PAHO)-World Hypertension League (WHL) Hypertension Monitoring and Evaluation Framework is summarized. Standardized indicators are provided for monitoring and evaluating national or subnational hypertension control programs. Five core indicators from the World Health Organization HEARTS initiative and a single PAHO-WHL core indicator are recommended to be used in all hypertension control programs. In addition, hypertension control programs are encouraged to select from 14 optional qualitative and 33 quantitative indicators to facilitate progress towards enhanced hypertension control. The intention is for hypertension programs to select quantitative indicators based on the current surveillance mechanisms that are available and what is feasible, and to use the framework process indicators as a guide to program management. Programs may wish to increase or refine the number of indicators they use over time. With adaptation, the indicators can also be implemented at a community or clinic level. The standardized indicators are being pilot tested in Cuba, Colombia, Chile, and Barbados.
SPAR reference manual. [for stress analysis
NASA Technical Reports Server (NTRS)
Whetstone, W. D.
1974-01-01
SPAR is a system of related programs which may be operated either in batch or demand (teletype) mode. Information exchange between programs is automatically accomplished through one or more direct access libraries, known collectively as the data complex. Card input is command-oriented, in free-field form. Capabilities available in the first production release of the system are fully documented, and include linear stress analysis, linear bifurcation buckling analysis, and linear vibrational analysis.
Polarization dependent photo-induced bias stress effect in organic transistors.
NASA Astrophysics Data System (ADS)
Podzorov, Vitaly; Choi, Hyun Ho; Najafov, Hikmet; Saranin, Danila; Kharlamov, Nikolai A.; Kuznetzov, Denis V.; Didenko, Sergei I.; Cho, Kilwon; Briseno, Alejandro L.; Rutgers-Misis Collaboration; Ru-P Collaboration; Ru-Um Collaboration; Um-P Collaboration
Photo-induced charge transfer between a semiconductor and a gate insulator that occurs in organic transistors operating under illumination leads to a shift of the onset gate voltage in these devices. Here we report an observation of a polarization dependent photo-induced bias-stress effect in two prototypical single-crystal organic field-effect transistors, based on rubrene and TPBIQ. We find that the rate of the effect is a periodic function of polarization angle of a linearly polarized photoexcitation, with a periodicity of π. The observed phenomenon provides an effective tool for addressing the relationship between molecular packing and parameter drift in organic transistors under illumination. The work was carried out with financial support from the Ministry of Education and Science of the Russian Federation in the framework of Increase Competitiveness Program of NUST «MISiS» (No. K3-2016-004), by gov. decree 16/03/2013, N 211.
Biomass supply chain optimisation for Organosolv-based biorefineries.
Giarola, Sara; Patel, Mayank; Shah, Nilay
2014-05-01
This work aims at providing a Mixed Integer Linear Programming modelling framework to help define planning strategies for the development of sustainable biorefineries. The up-scaling of an Organosolv biorefinery was addressed via optimisation of the whole system economics. Three real world case studies were addressed to show the high-level flexibility and wide applicability of the tool to model different biomass typologies (i.e. forest fellings, cereal residues and energy crops) and supply strategies. Model outcomes have revealed how supply chain optimisation techniques could help shed light on the development of sustainable biorefineries. Feedstock quality, quantity, temporal and geographical availability are crucial to determine biorefinery location and the cost-efficient way to supply the feedstock to the plant. Storage costs are relevant for biorefineries based on cereal stubble, while wood supply chains present dominant pretreatment operations costs.
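The siting trade-off the abstract describes (feedstock availability and transport cost versus plant location) can be illustrated with a toy brute-force sketch. This is not the paper's MILP formulation; all site names, supplies, distances, and costs below are hypothetical, and the single-site enumeration stands in for the binary location variables of a real MILP.

```python
# Toy biorefinery siting sketch (illustrative only; the paper formulates a
# full Mixed Integer Linear Program).  We pick one plant site and ship all
# biomass sources to it, trading an annualized fixed plant cost against a
# per-tonne-km transport cost.  All names and numbers are hypothetical.

sites = {"A": 500.0, "B": 650.0}          # site -> annualized fixed cost
supply = {"farm1": 40.0, "farm2": 25.0}   # source -> tonnes available
dist = {                                   # (source, site) -> distance (km)
    ("farm1", "A"): 10.0, ("farm1", "B"): 30.0,
    ("farm2", "A"): 50.0, ("farm2", "B"): 5.0,
}
transport_rate = 0.5                       # cost per tonne-km

def total_cost(site):
    haul = sum(supply[s] * dist[(s, site)] * transport_rate for s in supply)
    return sites[site] + haul

best = min(sites, key=total_cost)
print(best, total_cost(best))  # the cheaper site despite its higher fixed cost
```

With these numbers site B wins even though its fixed cost is higher, because the larger biomass source sits next to it, mirroring the abstract's point that geographical availability drives location.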
Short-term Power Load Forecasting Based on Balanced KNN
NASA Astrophysics Data System (ADS)
Lv, Xianlong; Cheng, Xingong; Yan, Shuang; Tang, Yan-mei
2018-03-01
To improve the accuracy of load forecasting, a short-term load forecasting model based on a balanced KNN algorithm is proposed. According to the load characteristics, the massive historical power load data are divided into scenes by the K-means algorithm. For unbalanced load scenes, the balanced KNN algorithm is proposed to classify scenes accurately. A locally weighted linear regression algorithm is used to fit and predict the load. Using the Apache Hadoop programming framework for cloud computing, the proposed model is parallelized and improved to enhance its ability to handle massive, high-dimensional data. Household electricity consumption data for a residential district are analyzed on a 23-node cloud computing cluster, and experimental results show that the load forecasting accuracy and execution time of the proposed model are better than those of a traditional forecasting algorithm.
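The locally weighted linear regression step named in the abstract can be sketched generically. This is not the paper's implementation: it is a minimal one-variable version with Gaussian weights, solving the weighted normal equations directly; the function name and bandwidth parameter `tau` are this sketch's own.

```python
import math

# Minimal locally weighted linear regression (LWLR): fit a straight line to
# the data, but weight each sample by its closeness to the query point x0.

def lwlr_predict(xs, ys, x0, tau=1.0):
    """Predict y at x0 from samples (xs, ys) with Gaussian locality weights."""
    w = [math.exp(-((x - x0) ** 2) / (2 * tau ** 2)) for x in xs]
    sw = sum(w)
    swx = sum(wi * x for wi, x in zip(w, xs))
    swy = sum(wi * y for wi, y in zip(w, ys))
    swxx = sum(wi * x * x for wi, x in zip(w, xs))
    swxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
    # Solve the 2x2 weighted normal equations for slope a and intercept b.
    denom = sw * swxx - swx ** 2
    a = (sw * swxy - swx * swy) / denom
    b = (swy - a * swx) / sw
    return a * x0 + b

# On exactly linear data the local fit recovers the line, whatever tau is.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2 * x + 1 for x in xs]
print(lwlr_predict(xs, ys, 2.5))  # close to 6.0
```

In the paper's pipeline this fit would be applied per scene, after the balanced-KNN classifier has selected which historical scene a new sample belongs to.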
Program Monitoring with LTL in EAGLE
NASA Technical Reports Server (NTRS)
Barringer, Howard; Goldberg, Allen; Havelund, Klaus; Sen, Koushik
2004-01-01
We briefly present a rule-based framework called EAGLE, shown to be capable of defining and implementing finite trace monitoring logics, including future and past time temporal logic, extended regular expressions, real-time and metric temporal logics (MTL), interval logics, forms of quantified temporal logics, and so on. In this paper we focus on a linear temporal logic (LTL) specialization of EAGLE. For an initial formula of size m, we establish upper bounds of O(m^2 2^m log m) and O(m^4 2^(2m) log^2 m) for the space and time complexity, respectively, of single-step evaluation over an input trace. This is close to the lower bound of O(2^(sqrt m)) presented for future-time LTL. EAGLE has been successfully used, in both LTL and metric LTL forms, to test a real-time controller of an experimental NASA planetary rover.
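Single-step evaluation of future-time LTL over a trace can be illustrated by formula progression (rewriting), a generic technique rather than EAGLE's rule-based mechanism. The tuple encoding and helper names below are this sketch's own.

```python
# Future-time LTL monitoring by formula progression.  Formulas are nested
# tuples: ("ap", name), ("not", f), ("and", f, g), ("or", f, g),
# ("next", f), ("until", f, g); the booleans True/False are literals.

def progress(f, state):
    """Rewrite formula f against one state (a set of atomic propositions)."""
    if isinstance(f, bool):
        return f
    op = f[0]
    if op == "ap":
        return f[1] in state
    if op == "not":
        return neg(progress(f[1], state))
    if op == "and":
        return conj(progress(f[1], state), progress(f[2], state))
    if op == "or":
        return disj(progress(f[1], state), progress(f[2], state))
    if op == "next":
        return f[1]
    if op == "until":  # f U g  ==  g  or  (f and X(f U g))
        return disj(progress(f[2], state),
                    conj(progress(f[1], state), f))
    raise ValueError(op)

def neg(f):  return (not f) if isinstance(f, bool) else ("not", f)

def conj(f, g):
    if f is False or g is False: return False
    if f is True: return g
    if g is True: return f
    return ("and", f, g)

def disj(f, g):
    if f is True or g is True: return True
    if f is False: return g
    if g is False: return f
    return ("or", f, g)

def monitor(f, trace):
    """Progress f through a finite trace; returns True/False once decided,
    otherwise the residual formula still awaiting future events."""
    for state in trace:
        f = progress(f, state)
        if isinstance(f, bool):
            return f
    return f

# F p (eventually p) is (true U p); it is decided as soon as p occurs.
eventually_p = ("until", True, ("ap", "p"))
print(monitor(eventually_p, [{"q"}, {"q"}, {"p"}]))  # True
```

The blow-up the complexity bounds describe shows up here as growth of the residual formula: each step can roughly double it before simplification, which is why the single-step bounds are exponential in the initial formula size m.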
Distance estimation and collision prediction for on-line robotic motion planning
NASA Technical Reports Server (NTRS)
Kyriakopoulos, K. J.; Saridis, G. N.
1991-01-01
An efficient method for computing the minimum distance and predicting collisions between moving objects is presented. This problem has been incorporated in the framework of an on-line motion planning algorithm to satisfy collision avoidance between a robot and moving objects modeled as convex polyhedra. First, the deterministic problem, where the information about the objects is assumed to be certain, is examined. If, instead of the Euclidean norm, the L1 or L-infinity norm is used to represent distance, the problem becomes a linear programming problem. The stochastic problem is then formulated, where the uncertainty is induced by sensing and the unknown dynamics of the moving obstacles. Two problems are considered: (1) filtering of the minimum distance between the robot and the moving object at the present time; and (2) prediction of the minimum distance in the future, in order to predict possible collisions with the moving obstacles and estimate the collision time.
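The L1 reformulation works because min ||x - y||_1 over x in P, y in Q becomes a linear program once each |x_i - y_i| is replaced by an auxiliary variable t_i with t_i >= x_i - y_i and t_i >= y_i - x_i. As a pure-Python sanity check (not the paper's solver), for axis-aligned boxes that LP decomposes into independent per-coordinate interval gaps:

```python
# L1 distance between two axis-aligned boxes, a closed-form special case of
# the general convex-polyhedron LP described above.

def interval_gap(lo1, hi1, lo2, hi2):
    """Distance between intervals [lo1,hi1] and [lo2,hi2] (0 if they overlap)."""
    return max(0.0, lo2 - hi1, lo1 - hi2)

def l1_box_distance(box_p, box_q):
    """Each box is a list of (lo, hi) per coordinate; returns min ||x - y||_1."""
    return sum(interval_gap(p_lo, p_hi, q_lo, q_hi)
               for (p_lo, p_hi), (q_lo, q_hi) in zip(box_p, box_q))

# Unit square [0,1]^2 vs. the box [2,3] x [0,1]: the closest points differ
# only in the first coordinate, so the L1 distance is 1.
print(l1_box_distance([(0, 1), (0, 1)], [(2, 3), (0, 1)]))  # 1.0
```

For general convex polyhedra (arbitrary linear inequalities on x and y) the same objective and auxiliary constraints would be handed to any LP solver, which is the structure the abstract exploits for on-line use.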
Performance bounds for nonlinear systems with a nonlinear ℒ2-gain property
NASA Astrophysics Data System (ADS)
Zhang, Huan; Dower, Peter M.
2012-09-01
Nonlinear ℒ2-gain is a finite-gain concept that generalises the notion of conventional (linear) finite ℒ2-gain, admitting the application of ℒ2-gain analysis tools to a broader class of nonlinear systems. The computation of tight comparison function bounds for this nonlinear ℒ2-gain property is important in applications such as small-gain design. This article presents an approximation framework for these comparison function bounds through the formulation and solution of an optimal control problem. Key to the solution of this problem is the lifting of an ℒ2-norm input constraint, which is facilitated via the introduction of an energy saturation operator. This admits the solution of the optimal control problem of interest via dynamic programming and associated numerical methods, leading to the computation of the proposed bounds. Two examples are presented to demonstrate this approach.
Typical performance of approximation algorithms for NP-hard problems
NASA Astrophysics Data System (ADS)
Takabe, Satoshi; Hukushima, Koji
2016-11-01
Typical performance of approximation algorithms is studied for randomized minimum vertex cover problems. A wide class of random graph ensembles characterized by an arbitrary degree distribution is discussed with the presentation of a theoretical framework. Herein, three approximation algorithms are examined: linear-programming relaxation, loopy-belief propagation, and the leaf-removal algorithm. The former two algorithms are analyzed using a statistical-mechanical technique, whereas the average-case analysis of the last one is conducted using the generating function method. These algorithms have a threshold in the typical performance with increasing average degree of the random graph, below which they find true optimal solutions with high probability. Our study reveals that there exist only three cases, determined by the order of the typical performance thresholds. In addition, we provide some conditions for classification of the graph ensembles and demonstrate explicitly some examples for the difference in thresholds.
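Of the three algorithms examined, leaf removal is the simplest to make concrete: repeatedly take a degree-1 vertex (a leaf), put its unique neighbor into the cover, and delete both. The sketch below is a generic implementation, not the paper's analysis machinery; when a leafless "core" remains, leaf removal alone is inconclusive, which is the regime where the typical-performance threshold appears.

```python
# Leaf-removal heuristic for minimum vertex cover.

def leaf_removal_cover(edges):
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    cover = set()
    while True:
        leaf = next((v for v, nb in adj.items() if len(nb) == 1), None)
        if leaf is None:
            break
        (neighbor,) = adj[leaf]      # a leaf has exactly one neighbor
        cover.add(neighbor)
        for w in adj.pop(neighbor):  # delete the neighbor from the graph
            adj[w].discard(neighbor)
        adj.pop(leaf, None)
        for v in [v for v, nb in adj.items() if not nb]:
            del adj[v]               # drop vertices left isolated
    core = {v for v, nb in adj.items() if nb}
    return cover, core  # nonempty core => leaf removal alone is inconclusive

edges = [(1, 2), (2, 3), (3, 4)]  # a path: an optimal cover is {2, 4}
print(leaf_removal_cover(edges))
```

On a path the core is empty and the cover found is optimal; on a triangle there is no leaf at all, so the entire graph survives as the core and a different method must finish the job.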
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramamurthy, Byravamurthy
2014-05-05
In this project, we developed scheduling frameworks and algorithms for the dynamic bandwidth demands of large-scale science applications. Apart from theoretical approaches such as Integer Linear Programming, Tabu Search, and Genetic Algorithm heuristics, we utilized practical data from the ESnet OSCARS project (from our DOE lab partners) to conduct realistic simulations of our approaches. We disseminated our work through conference paper presentations, journal papers, and a book chapter. In this project we addressed the problem of scheduling lightpaths over optical wavelength division multiplexed (WDM) networks, and published several conference and journal papers on this topic. We also addressed the problems of joint allocation of computing, storage, and networking resources in Grid/Cloud networks and proposed energy-efficient mechanisms for operating optical WDM networks.
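A simple baseline for the lightpath scheduling problem described above is first-fit wavelength assignment; the project itself used ILP, Tabu Search, and Genetic Algorithm approaches, so the sketch below is only a hedged illustration of the underlying resource-conflict structure, with hypothetical link names.

```python
# First-fit wavelength assignment for lightpath requests on a WDM network.
# A wavelength can be reused only by lightpaths whose routes share no link
# (assuming no wavelength conversion).

def first_fit_assign(requests, num_wavelengths):
    """requests: list of link-lists; returns a wavelength index (or None,
    meaning the request is blocked) per request, in order."""
    in_use = [set() for _ in range(num_wavelengths)]  # links busy per lambda
    assignment = []
    for links in requests:
        for wl, used in enumerate(in_use):
            if not used.intersection(links):  # wavelength free on all links
                used.update(links)
                assignment.append(wl)
                break
        else:
            assignment.append(None)  # blocked: no wavelength available
    return assignment

# Two routes sharing link "b-c" must take different wavelengths; a
# link-disjoint route can reuse wavelength 0.
reqs = [["a-b", "b-c"], ["b-c", "c-d"], ["d-e"]]
print(first_fit_assign(reqs, 2))  # [0, 1, 0]
```

ILP and metaheuristic schedulers improve on this baseline by choosing request order, routes, and start times jointly rather than greedily, which is where the dynamic-bandwidth scheduling contributions of the project lie.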
ERIC Educational Resources Information Center
Leff, H. Stephen; Turner, Ralph R.
This report focuses on the use of linear programming models to address the issues of how vocational rehabilitation (VR) resources should be allocated in order to maximize program efficiency within given resource constraints. A general introduction to linear programming models is first presented that describes the major types of models available,…