Discovering Knowledge from Noisy Databases Using Genetic Programming.
ERIC Educational Resources Information Center
Wong, Man Leung; Leung, Kwong Sak; Cheng, Jack C. Y.
2000-01-01
Presents a framework that combines Genetic Programming and Inductive Logic Programming, two approaches in data mining, to induce knowledge from noisy databases. The framework is based on a formalism of logic grammars and is implemented as a data mining system called LOGENPRO (Logic Grammar-based Genetic Programming System). (Contains 34…
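To make the logic-grammar idea concrete: the grammar constrains which program trees the evolutionary search may generate, so every individual is syntactically valid by construction. Below is a minimal, hedged Scala sketch of grammar-constrained derivation with an invented expression grammar; LOGENPRO's actual formalism (definite clause grammars with logic goals) is richer than this toy.

```scala
import scala.util.Random

object GrammarGP {
  // Production rules: each nonterminal expands to one of several symbol sequences.
  val grammar: Map[String, List[List[String]]] = Map(
    "EXPR"  -> List(List("(", "EXPR", "OP", "EXPR", ")"), List("VAR"), List("CONST")),
    "OP"    -> List(List("+"), List("*")),
    "VAR"   -> List(List("x")),
    "CONST" -> List(List("1"), List("2"))
  )

  // Randomly derive a sentence from a start symbol; once the depth budget is
  // spent, prefer rules that introduce the fewest nonterminals so the
  // derivation terminates.
  def derive(sym: String, depth: Int, rnd: Random): List[String] =
    grammar.get(sym) match {
      case None => List(sym) // terminal symbol
      case Some(rules) =>
        val nonRec = rules.filter(_.forall(s => !grammar.contains(s)))
        val pool =
          if (depth > 0) rules
          else if (nonRec.nonEmpty) nonRec
          else List(rules.minBy(_.count(grammar.contains)))
        pool(rnd.nextInt(pool.length)).flatMap(derive(_, depth - 1, rnd))
    }

  def main(args: Array[String]): Unit =
    println(derive("EXPR", 4, new Random(42)).mkString(" ")) // prints a grammar-conformant expression
}
```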
2013/2014 Eco-Logical program annual report
DOT National Transportation Integrated Search
2014-12-01
The Eco-Logical approach offers an ecosystem-based framework for integrated infrastructure and natural resource planning, project development, and delivery. The 2013/2014 Eco-Logical Program Annual Report provides updates on the Federal Highway Admin...
Implementing Eco-Logical 2014-2015 Annual Report
DOT National Transportation Integrated Search
2015-12-01
The Eco-Logical approach offers an ecosystem-based framework for integrated infrastructure and natural resource planning, project development, and delivery. The 2014/2015 Implementing Eco-Logical Program Annual Report provides updates on the Federal ...
Wasserman, Deborah L
2010-05-01
This paper offers a framework for using a systems orientation and "foundational theory" to enhance theory-driven evaluations and logic models. The framework guides the process of identifying and explaining operative relationships and perspectives within human service program systems. Self-Determination Theory exemplifies how a foundational theory can be used to support the framework in a wide range of program evaluations. Two examples illustrate how applications of the framework have improved the evaluators' abilities to observe and explain program effects. In both exemplars, the improvements involved organizing into a single logic model evaluation issues that had previously seemed disparate: valuing (by whose values), the role of organizational and program context, and evaluation anxiety and utilization. Copyright 2009 Elsevier Ltd. All rights reserved.
POLE.VAULT: A Semantic Framework for Health Policy Evaluation and Logical Testing.
Shaban-Nejad, Arash; Okhmatovskaia, Anya; Shin, Eun Kyong; Davis, Robert L; Buckeridge, David L
2017-01-01
The major goal of our study is to provide an automatic evaluation framework that aligns the results generated through semantic reasoning with the best available evidence regarding effective interventions to support the logical evaluation of public health policies. To this end, we have designed the POLicy EVAlUation & Logical Testing (POLE.VAULT) Framework to assist different stakeholders and decision-makers in making informed decisions about different health-related interventions, programs and ultimately policies, based on the contextual knowledge and the best available evidence at both individual and aggregate levels.
Sherman, Paul David
2016-04-01
This article presents a framework to identify key mechanisms for developing a logic model blueprint that can be used for an impending comprehensive evaluation of an undergraduate degree program in a Canadian university. The evaluation is a requirement of a comprehensive quality assurance process mandated by the university. A modified RUFDATA (Saunders, 2000) evaluation model is applied as an initiating framework to assist in decision making to provide a guide for conceptualizing a logic model for the quality assurance process. This article will show how an educational evaluation is strengthened by employing a RUFDATA reflective process in exploring key elements of the evaluation process, and then translating this information into a logic model format that could serve to offer a more focussed pathway for the quality assurance activities. Using preliminary program evaluation data from two key stakeholders of the undergraduate program as well as an audit of the curriculum's course syllabi, a case is made for (1) the importance of including key stakeholders in the design of the evaluation process, to enrich the authenticity and accuracy of program participants' feedback, and (2) the diversification of data collection methods, to ensure that stakeholders' narrative feedback is given ample exposure. It is suggested that the modified RUFDATA/logic model framework be applied to all academic programs at the university undergoing the quality assurance process at the same time so that economies of scale may be realized. Copyright © 2015 Elsevier Ltd. All rights reserved.
An interval logic for higher-level temporal reasoning
NASA Technical Reports Server (NTRS)
Schwartz, R. L.; Melliar-Smith, P. M.; Vogt, F. H.; Plaisted, D. A.
1983-01-01
Prior work explored temporal logics, based on classical modal logics, as a framework for specifying and reasoning about concurrent programs, distributed systems, and communications protocols, and reported on efforts using temporal reasoning primitives to express very high level abstract requirements that a program or system is to satisfy. Based on experience with those primitives, this report describes an Interval Logic that is more suitable for expressing such higher level temporal properties. The report provides a formal semantics for the Interval Logic, and several examples of its use. A description of decision procedures for the logic is also included.
ERIC Educational Resources Information Center
Zantal-Wiener, Kathy; Horwood, Thomas J.
2010-01-01
The authors propose a comprehensive evaluation framework to prepare for evaluating school emergency management programs. This framework involves a logic model that incorporates Government Performance and Results Act (GPRA) measures as a foundation for comprehensive evaluation that complements performance monitoring used by the U.S. Department of…
Runtime Analysis of Linear Temporal Logic Specifications
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Havelund, Klaus
2001-01-01
This report presents an approach to checking a running program against its Linear Temporal Logic (LTL) specifications. LTL is a widely used logic for expressing properties of programs viewed as sets of executions. Our approach consists of translating LTL formulae to finite-state automata, which are used as observers of the program behavior. The translation algorithm we propose modifies standard LTL to Büchi automata conversion techniques to generate automata that check finite program traces. The algorithm has been implemented in a tool, which has been integrated with the generic JPaX framework for runtime analysis of Java programs.
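The translation step can be approximated compactly by formula progression, rewriting the formula through each observed state instead of building the automaton explicitly. The Scala sketch below assumes finite-trace semantics in which Always is accepted and Eventually rejected at the end of the trace; it illustrates the general technique, not the paper's algorithm.

```scala
object LTLMonitor {
  sealed trait LTL
  case object True extends LTL
  case object False extends LTL
  final case class Prop(name: String) extends LTL
  final case class Not(f: LTL) extends LTL
  final case class And(l: LTL, r: LTL) extends LTL
  final case class Or(l: LTL, r: LTL) extends LTL
  final case class Next(f: LTL) extends LTL
  final case class Always(f: LTL) extends LTL
  final case class Eventually(f: LTL) extends LTL

  // Rewrite a formula through one observed state; the result is the
  // obligation that the remainder of the trace must satisfy.
  def step(f: LTL, state: Set[String]): LTL = f match {
    case True | False  => f
    case Prop(p)       => if (state(p)) True else False
    case Not(g)        => neg(step(g, state))
    case And(l, r)     => and(step(l, state), step(r, state))
    case Or(l, r)      => or(step(l, state), step(r, state))
    case Next(g)       => g
    case Always(g)     => and(step(g, state), f)
    case Eventually(g) => or(step(g, state), f)
  }

  // Constructors that simplify as they build, so obligations stay small.
  private def neg(f: LTL): LTL = f match { case True => False; case False => True; case g => Not(g) }
  private def and(l: LTL, r: LTL): LTL = (l, r) match {
    case (False, _) | (_, False) => False
    case (True, g)               => g
    case (g, True)               => g
    case _                       => And(l, r)
  }
  private def or(l: LTL, r: LTL): LTL = (l, r) match {
    case (True, _) | (_, True) => True
    case (False, g)            => g
    case (g, False)            => g
    case _                     => Or(l, r)
  }

  // Verdict at the end of a finite trace: Always holds, Eventually fails.
  def atEnd(f: LTL): Boolean = f match {
    case True          => true
    case False         => false
    case Prop(_)       => false
    case Not(g)        => !atEnd(g)
    case And(l, r)     => atEnd(l) && atEnd(r)
    case Or(l, r)      => atEnd(l) || atEnd(r)
    case Next(_)       => false
    case Always(_)     => true
    case Eventually(_) => false
  }

  def check(f: LTL, trace: List[Set[String]]): Boolean = atEnd(trace.foldLeft(f)(step))

  def main(args: Array[String]): Unit = {
    val spec  = Always(Or(Not(Prop("request")), Eventually(Prop("grant"))))
    val trace = List(Set("request"), Set.empty[String], Set("grant"))
    println(check(spec, trace)) // true: every request is eventually granted
  }
}
```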
Use of program logic models in the Southern Rural Access Program evaluation.
Pathman, Donald; Thaker, Samruddhi; Ricketts, Thomas C; Albright, Jennifer B
2003-01-01
The Southern Rural Access Program (SRAP) evaluation team used program logic models to clarify grantees' activities, objectives, and timelines. This information was used to benchmark data from grantees' progress reports to assess the program's successes. This article presents a brief background on the use of program logic models--essentially charts or diagrams specifying a program's planned activities, objectives, and goals--for evaluating and managing a program. It discusses the structure of the logic models chosen for the SRAP and how the model concept was introduced to the grantees to promote acceptance and use of the models. The article describes how the models helped clarify the program's objectives and helped lead agencies plan and manage the many program initiatives and subcontractors in their states. Models also provided a framework for grantees to report their progress to the National Program Office and evaluators and promoted the evaluators' visibility and acceptance by the grantees. Program logics, however, increased grantees' reporting requirements and demanded substantial time of the evaluators. Program logic models, on balance, proved their merit in the SRAP through their contributions to its management and evaluation and by providing a better understanding of the program's initiatives, successes, and potential impact.
A Concurrent Logical Framework: The Propositional Fragment
2003-01-01
Under the Curry-Howard isomorphism, M can also be read as a proof term, and A as a proposition of intuitionistic linear logic in its formulation as DILL...the obligation to ensure that the underlying logic (via the Curry-Howard isomorphism, if you like) is sensible. In particular, the principles of...Proceedings of the International Logic Programming Symposium (ILPS), pages 51-65, Portland, Oregon, December 1995. MIT Press. 6. G. Bellin and P. J
Reasoning on Weighted Delegatable Authorizations
NASA Astrophysics Data System (ADS)
Ruan, Chun; Varadharajan, Vijay
This paper studies logic-based methods for representing and evaluating complex access control policies needed by modern database applications. In our framework, authorization and delegation rules are specified in a Weighted Delegatable Authorization Program (WDAP), which is an extended logic program. We show how extended logic programs can be used to specify complex security policies which support weighted administrative privilege delegation, weighted positive and negative authorizations, and weighted authorization propagations. We also propose a conflict resolution method that enables flexible delegation control by considering priorities of authorization grantors and weights of authorizations. A number of rules are provided to achieve delegation depth control, conflict resolution, and authorization and delegation propagations.
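A drastically simplified evaluator conveys the flavour of weighted, depth-bounded delegation with weight-based conflict resolution. The Scala sketch below is not the paper's WDAP semantics; the data types and the heaviest-authorization-wins rule are invented for illustration.

```scala
object WeightedAuth {
  // sign +1 grants, -1 denies; weight encodes the grantor's priority.
  final case class Auth(grantor: String, subject: String, obj: String,
                        sign: Int, weight: Double)
  // A delegation lets the delegate issue further authorizations for obj.
  final case class Delegation(from: String, to: String, obj: String)

  // An authorization is valid if its grantor is reachable from the root
  // administrator by a delegation chain no longer than maxDepth.
  def valid(a: Auth, dels: Seq[Delegation], root: String, maxDepth: Int): Boolean = {
    def reach(who: String, depth: Int): Boolean =
      who == root || (depth < maxDepth &&
        dels.exists(d => d.obj == a.obj && d.to == who && reach(d.from, depth + 1)))
    reach(a.grantor, 0)
  }

  // Conflict resolution: among valid authorizations the heaviest wins;
  // the empty case defaults to denial.
  def decide(auths: Seq[Auth], dels: Seq[Delegation], root: String,
             maxDepth: Int, subject: String, obj: String): Boolean = {
    val relevant = auths.filter(a =>
      a.subject == subject && a.obj == obj && valid(a, dels, root, maxDepth))
    relevant.sortBy(-_.weight).headOption.exists(_.sign > 0)
  }

  def main(args: Array[String]): Unit = {
    val dels  = Seq(Delegation("admin", "alice", "file1"))
    val auths = Seq(Auth("alice", "bob", "file1", +1, 0.6),
                    Auth("admin", "bob", "file1", -1, 0.9))
    println(decide(auths, dels, "admin", 2, "bob", "file1")) // false: the heavier denial wins
  }
}
```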
Hallinan, Christine M
2010-01-01
In this paper, program logic will be used to 'map out' the planning, development and evaluation of the general practice Pap nurse program in the Australian general practice arena. The incorporation of program logic into the evaluative process supports a greater appreciation of the theoretical assumptions and external influences that underpin general practice Pap nurse activity. The creation of a program logic model is a conscious strategy that results in an explicit understanding of the challenges ahead, the resources available and the time frames for outcomes. Program logic also enables a recognition that all players in the general practice arena need to be acknowledged by policy makers, bureaucrats and program designers when addressing, through policy, issues relating to equity and accessibility of health initiatives. Logic modelling allows decision makers to consider the complexities of causal associations when developing health care proposals and programs. It enables the Pap nurse in general practice program to be represented diagrammatically by linking outcomes (short, medium and long term) with both the program activities and program assumptions. The research methodology used in the evaluation of the Pap nurse in general practice program includes a descriptive study design and the incorporation of program logic, with a retrospective analysis of Australian data from 2001 to 2009. For the purposes of gaining both empirical and contextual data for this paper, a data set analysis and literature review were performed. The application of program logic as an evaluative tool for analysis of the Pap PN incentive program facilitates a greater understanding of the complex triggers of general practice activity, and allows this greater understanding to be incorporated into policy to facilitate Pap PN activity, increase general practice cervical smear rates and ultimately decrease the burden of disease.
Petruzzello, Steven J.; Ryan, Katherine E.
2014-01-01
Transportation workers, who constitute a large sector of the workforce, have worksite factors that harm their health. Worksite wellness programs must target this at-risk population. Although physical activity is often a component of worksite wellness logic models, we consider it the cornerstone for improving the health of mass transit employees. Program theory was based on in-person interviews and focus groups of employees. We identified 4 short-term outcome categories, which provided a chain of responses based on the program activities that should lead to the desired end results. This logic model may have significant public health impact, because it can serve as a framework for other US mass transit districts and worksite populations that face similar barriers to wellness, including truck drivers, railroad employees, and pilots. The objective of this article is to discuss the development of a logic model for a physical activity–based mass-transit employee wellness program by describing the target population, program theory, the components of the logic model, and the process of its development. PMID:25032838
Das, Bhibha M; Petruzzello, Steven J; Ryan, Katherine E
2014-07-17
Transportation workers, who constitute a large sector of the workforce, have worksite factors that harm their health. Worksite wellness programs must target this at-risk population. Although physical activity is often a component of worksite wellness logic models, we consider it the cornerstone for improving the health of mass transit employees. Program theory was based on in-person interviews and focus groups of employees. We identified 4 short-term outcome categories, which provided a chain of responses based on the program activities that should lead to the desired end results. This logic model may have significant public health impact, because it can serve as a framework for other US mass transit districts and worksite populations that face similar barriers to wellness, including truck drivers, railroad employees, and pilots. The objective of this article is to discuss the development of a logic model for a physical activity-based mass-transit employee wellness program by describing the target population, program theory, the components of the logic model, and the process of its development.
Metalevel programming in robotics: Some issues
NASA Technical Reports Server (NTRS)
Kumarn, A.; Parameswaran, N.
1987-01-01
Computing in robotics has two important requirements: efficiency and flexibility. Algorithms for robot actions are usually implemented in procedural languages such as VAL and AL. But since their excessive bindings create inflexible structures of computation, it is proposed that Logic Programming is a more suitable language for robot programming due to its non-determinism, declarative nature, and provision for metalevel programming. Logic Programming, however, results in inefficient computations. As a solution to this problem, the authors discuss a framework in which controls can be described to improve efficiency. They divide controls into (1) in-code and (2) metalevel, and discuss them with reference to the selection of rules and dataflow. The merit of Logic Programming is illustrated by modelling the motion of a robot from one point to another while avoiding obstacles.
Dal Palù, Alessandro; Pontelli, Enrico; He, Jing; Lu, Yonggang
2007-01-01
The paper describes a novel framework, constructed using Constraint Logic Programming (CLP) and parallelism, to determine the association between parts of the primary sequence of a protein and alpha-helices extracted from 3D low-resolution descriptions of large protein complexes. The association is determined by extracting constraints from the 3D information, regarding length, relative position and connectivity of helices, and solving these constraints with the guidance of a secondary structure prediction algorithm. Parallelism is employed to enhance performance on large proteins. The framework provides a fast, inexpensive alternative to determine the exact tertiary structure of unknown proteins.
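The core constraint can be caricatured in a few lines: assign each helix from the low-resolution map to a distinct predicted segment of the sequence so that lengths agree within a tolerance. The real system expresses this (plus relative-position and connectivity constraints) in CLP with constraint propagation; the brute-force Scala enumeration below, with invented data, only makes the length constraint concrete.

```scala
object HelixAssign {
  final case class Helix3D(id: Int, lengthResidues: Int)  // helix seen in the 3D map
  final case class Segment(start: Int, end: Int) {        // predicted helical segment
    def len: Int = end - start + 1
  }

  // Enumerate assignments in which every helix gets a distinct segment of
  // compatible length (|length difference| <= tol).
  def assignments(hs: List[Helix3D], segs: List[Segment], tol: Int): Iterator[Map[Int, Segment]] =
    segs.permutations.collect {
      case perm if hs.zip(perm).forall { case (h, s) => (h.lengthResidues - s.len).abs <= tol } =>
        hs.zip(perm).map { case (h, s) => h.id -> s }.toMap
    }

  def main(args: Array[String]): Unit = {
    val helices  = List(Helix3D(1, 12), Helix3D(2, 20))
    val segments = List(Segment(5, 17), Segment(40, 58), Segment(70, 75))
    assignments(helices, segments, tol = 2).foreach(println)
    // Only the pairing helix1 -> (5,17), helix2 -> (40,58) satisfies the constraint.
  }
}
```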
Automata-Based Verification of Temporal Properties on Running Programs
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Havelund, Klaus; Lan, Sonie (Technical Monitor)
2001-01-01
This paper presents an approach to checking a running program against its Linear Temporal Logic (LTL) specifications. LTL is a widely used logic for expressing properties of programs viewed as sets of executions. Our approach consists of translating LTL formulae to finite-state automata, which are used as observers of the program behavior. The translation algorithm we propose modifies standard LTL to Büchi automata conversion techniques to generate automata that check finite program traces. The algorithm has been implemented in a tool, which has been integrated with the generic JPaX framework for runtime analysis of Java programs.
Assessment of Evidence-based Management Training Program: Application of a Logic Model.
Guo, Ruiling; Farnsworth, Tracy J; Hermanson, Patrick M
2016-06-01
The purposes of this study were to apply a logic model to plan and implement an evidence-based management (EBMgt) educational training program for healthcare administrators and to examine whether a logic model is a useful tool for evaluating the outcomes of the educational program. The logic model was used as a conceptual framework to guide the investigators in developing an EBMgt educational training program and evaluating the outcomes of the program. The major components of the logic model were constructed as inputs, outputs, and outcomes/impacts. The investigators delineated the logic model based on the results of the needs assessment survey. Two 3-hour training workshops were delivered to 30 participants. To assess the outcomes of the EBMgt educational program, pre- and post-tests and self-reflection surveys were conducted. The data were collected and analyzed descriptively and inferentially, using the IBM Statistical Package for the Social Sciences (SPSS) 22.0. A paired sample t-test was performed to compare the differences in participants' EBMgt knowledge and skills prior to and after the training. The assessment results showed that there was a statistically significant difference in participants' EBMgt knowledge and information searching skills before and after the training (p < 0.001). Participants' confidence in using the EBMgt approach for decision-making was significantly increased after the training workshops (p < 0.001). Eighty-three percent of participants indicated that the knowledge and skills they gained through the training program could be used for future management decision-making in their healthcare organizations. The overall evaluation results of the program were positive. It is suggested that the logic model is a useful tool for program planning, implementation, and evaluation, and it also improves the outcomes of the educational program.
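For readers who want the mechanics behind the pre/post comparison, the paired t statistic reduces to a few lines. The Scala sketch below uses invented scores; the study itself used SPSS, and the p-value lookup is omitted.

```scala
object PairedT {
  // t = mean(d) / (sd(d) / sqrt(n)), where d are the post-minus-pre differences.
  def pairedT(pre: Seq[Double], post: Seq[Double]): Double = {
    require(pre.length == post.length && pre.length > 1, "need paired samples")
    val d    = post.lazyZip(pre).map(_ - _)
    val n    = d.length
    val mean = d.sum / n
    val sd   = math.sqrt(d.map(x => (x - mean) * (x - mean)).sum / (n - 1))
    mean / (sd / math.sqrt(n))
  }

  def main(args: Array[String]): Unit = {
    val pre  = Seq(55.0, 60.0, 62.0, 48.0, 70.0) // invented pre-training scores
    val post = Seq(72.0, 68.0, 75.0, 66.0, 80.0) // invented post-training scores
    println(f"t = ${pairedT(pre, post)}%.2f (df = ${pre.length - 1})")
  }
}
```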
ERIC Educational Resources Information Center
Faw, Leyla; Hogue, Aaron; Liddle, Howard A.
2005-01-01
The authors applied contemporary methods from the evaluation literature to measure implementation in a residential treatment program for adolescent substance abuse. A logic model containing two main components was measured. Program structure (adherence to the intended framework of service delivery) was measured using data from daily activity logs…
Diehl, Glen; Major, Solomon
2015-01-01
Measuring the effectiveness of military Global Health Engagements (GHEs) has become an area of increasing interest to the military medical field. As a result, there have been efforts to more logically and rigorously evaluate GHE projects and programs; many of these have been based on the Logic and Results Frameworks. However, while these Frameworks are apt and appropriate planning tools, they are not ideally suited to measuring programs' effectiveness. This article introduces military medicine professionals to the Measures of Effectiveness for Defense Engagement and Learning (MODEL) program, which implements a new method of assessment, one that seeks to rigorously use Measures of Effectiveness (vs. Measures of Performance) to gauge programs' and projects' success and fidelity to Theater Campaign goals. While the MODEL method draws on the Logic and Results Frameworks where appropriate, it goes beyond their planning focus by using the latest social scientific and econometric evaluation methodologies to link on-the-ground GHE "lines of effort" to the realization of national and strategic goals and end-states. It is hoped these methods will find use beyond the MODEL project itself, and will catalyze a new body of rigorous, empirically based work, which measures the effectiveness of a broad spectrum of GHE and security cooperation activities. We based our strategies on the principle that it is much more cost-effective to prevent conflicts than it is to stop one once it's started. I cannot overstate the importance of our theater security cooperation programs as the centerpiece to securing our Homeland from the irregular and catastrophic threats of the 21st Century.-GEN James L. Jones, USMC (Ret.). Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.
Rethinking Social Barriers to Effective Adaptive Management.
West, Simon; Schultz, Lisen; Bekessy, Sarah
2016-09-01
Adaptive management is an approach to environmental management based on learning-by-doing, where complexity, uncertainty, and incomplete knowledge are acknowledged and management actions are treated as experiments. However, while adaptive management has received significant uptake in theory, it remains elusively difficult to enact in practice. Proponents have blamed social barriers and have called for social science contributions. We address this gap by adopting a qualitative approach to explore the development of an ecological monitoring program within an adaptive management framework in a public land management organization in Australia. We ask what practices are used to enact the monitoring program and how do they shape learning? We elicit a rich narrative through extensive interviews with a key individual, and analyze the narrative using thematic analysis. We discuss our results in relation to the concept of 'knowledge work' and Westley's (2002) framework for interpreting the strategies of adaptive managers-'managing through, in, out and up.' We find that enacting the program is conditioned by distinct and sometimes competing logics-scientific logics prioritizing experimentation and learning, public logics emphasizing accountability and legitimacy, and corporate logics demanding efficiency and effectiveness. In this context, implementing adaptive management entails practices of translation to negotiate tensions between objective and situated knowledge, external experts and organizational staff, and collegiate and hierarchical norms. Our contribution embraces the 'doing' of learning-by-doing and marks a shift from conceptualizing the social as an external barrier to adaptive management to be removed to an approach that situates adaptive management as social knowledge practice.
Learning Probabilistic Logic Models from Probabilistic Examples
Chen, Jianzhong; Muggleton, Stephen; Santos, José
2009-01-01
We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples. PMID:19888348
Learning Probabilistic Logic Models from Probabilistic Examples.
Chen, Jianzhong; Muggleton, Stephen; Santos, José
2008-10-01
We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.
Program Monitoring with LTL in EAGLE
NASA Technical Reports Server (NTRS)
Barringer, Howard; Goldberg, Allen; Havelund, Klaus; Sen, Koushik
2004-01-01
We briefly present a rule-based framework called EAGLE, shown to be capable of defining and implementing finite trace monitoring logics, including future and past time temporal logic, extended regular expressions, real-time and metric temporal logics (MTL), interval logics, forms of quantified temporal logics, and so on. In this paper we focus on a linear temporal logic (LTL) specialization of EAGLE. For an initial formula of size m, we establish upper bounds of O(m^2 2^m log m) and O(m^4 2^(2m) log^2 m) for the space and time complexity, respectively, of single-step evaluation over an input trace. This is close to the previously presented lower bound of O(2^sqrt(m)) for future-time LTL. EAGLE has been successfully used, in both LTL and metric LTL forms, to test a real-time controller of an experimental NASA planetary rover.
ERIC Educational Resources Information Center
Futris, Ted G.; Schramm, David G.
2015-01-01
What goes into designing and implementing a successful program? How do both research and practice inform program development? In this article, the process through which a federally funded training curriculum was developed and pilot tested is described. Using a logic model framework, important lessons learned are shared in defining the situation,…
How Learning Logic Programming Affects Recursion Comprehension
ERIC Educational Resources Information Center
Haberman, Bruria
2004-01-01
Recursion is a central concept in computer science, yet it is difficult for beginners to comprehend. Israeli high-school students learn recursion in the framework of a special modular program in computer science (Gal-Ezer & Harel, 1999). Some of them are introduced to the concept of recursion in two different paradigms: the procedural…
Segal, Leonie; Sara Opie, Rachelle; Dalziel, Kim
2012-01-01
Context: Home-visiting programs have been offered for more than sixty years to at-risk families of newborns and infants. But despite decades of experience with program delivery, more than sixty published controlled trials, and more than thirty published literature reviews, there is still uncertainty surrounding the performance of these programs. Our particular interest was the performance of home visiting in reducing child maltreatment. Methods: We developed a program logic framework to assist in understanding the neonate/infant home-visiting literature, identified through a systematic literature review. We tested whether success could be explained by the logic model using descriptive synthesis and statistical analysis. Findings: Having a stated objective of reducing child maltreatment—a theory or mechanism of change underpinning the home-visiting program consistent with the target population and their needs and program components that can deliver against the nominated theory of change—considerably increased the chance of success. We found that only seven of fifty-three programs demonstrated such consistency, all of which had a statistically significant positive outcome, whereas of the fifteen that had no match, none was successful. Programs with a partial match had an intermediate success rate. The relationship between program success and full, partial or no match was statistically significant. Conclusions: Employing a theory-driven approach provides a new way of understanding the disparate performance of neonate/infant home-visiting programs. Employing a similar theory-driven approach could also prove useful in the review of other programs that embody a diverse set of characteristics and may apply to diverse populations and settings. A program logic framework provides a rigorous approach to deriving policy-relevant meaning from effectiveness evidence of complex programs. For neonate/infant home-visiting programs, it means that in developing these programs, attention to consistency of objectives, theory of change, target population, and program components is critical. PMID:22428693
Segal, Leonie; Sara Opie, Rachelle; Dalziel, Kim
2012-03-01
Home-visiting programs have been offered for more than sixty years to at-risk families of newborns and infants. But despite decades of experience with program delivery, more than sixty published controlled trials, and more than thirty published literature reviews, there is still uncertainty surrounding the performance of these programs. Our particular interest was the performance of home visiting in reducing child maltreatment. We developed a program logic framework to assist in understanding the neonate/infant home-visiting literature, identified through a systematic literature review. We tested whether success could be explained by the logic model using descriptive synthesis and statistical analysis. Having a stated objective of reducing child maltreatment-a theory or mechanism of change underpinning the home-visiting program consistent with the target population and their needs and program components that can deliver against the nominated theory of change-considerably increased the chance of success. We found that only seven of fifty-three programs demonstrated such consistency, all of which had a statistically significant positive outcome, whereas of the fifteen that had no match, none was successful. Programs with a partial match had an intermediate success rate. The relationship between program success and full, partial or no match was statistically significant. Employing a theory-driven approach provides a new way of understanding the disparate performance of neonate/infant home-visiting programs. Employing a similar theory-driven approach could also prove useful in the review of other programs that embody a diverse set of characteristics and may apply to diverse populations and settings. A program logic framework provides a rigorous approach to deriving policy-relevant meaning from effectiveness evidence of complex programs. For neonate/infant home-visiting programs, it means that in developing these programs, attention to consistency of objectives, theory of change, target population, and program components is critical. © 2012 Milbank Memorial Fund.
Research Ethics Review: Identifying Public Policy and Program Gaps
Strosberg, Martin A.; Gefenas, Eugenijus; Famenka, Andrei
2014-01-01
We present an analytical framework for use by fellows of the Fogarty International Center–sponsored Advanced Certificate Program in Research Ethics for Central and Eastern Europe to identify gaps in the public policies establishing research ethics review systems that impede them from doing their job of protecting human research subjects. The framework, illustrated by examples from post-Communist countries, employs a logic model based on the public policy and public management literature. This paper is part of a collection of papers analyzing the Fogarty International Center's International Research Ethics Education and Curriculum program. PMID:24782068
William H. McWilliams; Carol L. Alerich; William A. Bechtold; Mark Hansen; Christopher M. Oswalt; Mike Thompson; Jeff Turner
2012-01-01
The U.S. Department of Agriculture, Forest Service, Forest Inventory and Analysis (FIA) program maintains the National Information Management System (NIMS) that provides the computational framework for the annual forest inventory of the United States. Questions regarding the impact of key elements of programming logic, processing criteria, and estimation procedures...
Birnbaum, Marvin L; Daily, Elaine K; O'Rourke, Ann P; Kushner, Jennifer
2016-04-01
Disaster-related interventions are actions or responses undertaken during any phase of a disaster to change the current status of an affected community or a Societal System. Interventional disaster research aims to evaluate the results of such interventions in order to develop standards and best practices in Disaster Health that can be applied to disaster risk reduction. Considering interventions as production functions (transformation processes) structures the analyses and cataloguing of interventions/responses that are implemented prior to, during, or following a disaster or other emergency. Since currently it is not possible to do randomized, controlled studies of disasters, in order to validate the derived standards and best practices, the results of the studies must be compared and synthesized with results from other studies (ie, systematic reviews). Such reviews will be facilitated by the selected studies being structured using accepted frameworks. A logic model is a graphic representation of the transformation processes of a program [project] that shows the intended relationships between investments and results. Logic models are used to describe a program and its theory of change, and they provide a method for analyzing and evaluating interventions. The Disaster Logic Model (DLM) is an adaptation of a logic model used for the evaluation of educational programs and provides the structure required for the analysis of disaster-related interventions. It incorporates: a definition of the current functional status of a community or Societal System; identification of needs; definition of goals; selection of objectives; implementation of the intervention(s); and evaluation of the effects, outcomes, costs, and impacts of the interventions. It is useful for determining the value of an intervention and it also provides the structure for analyzing the processes used in providing the intervention according to the Relief/Recovery and Risk-Reduction Frameworks.
A Logical Framework for Service Migration Based Survivability
2016-06-24
Knowledge base: fuzzy rules representing domain expert knowledge about the implications of a service migration strategy. Our approach uses expert knowledge as linguistic reasoning rules and takes service program damage assessment, service program complexity, and available network capability as input. The fuzzy inference system includes four components, as shown in Figure 5: (1) a knowledge base…
A Logic Model for Evaluating the Academic Health Department.
Erwin, Paul Campbell; McNeely, Clea S; Grubaugh, Julie H; Valentine, Jennifer; Miller, Mark D; Buchanan, Martha
2016-01-01
Academic Health Departments (AHDs) are collaborative partnerships between academic programs and practice settings. While case studies have informed our understanding of the development and activities of AHDs, there has been no formal published evaluation of AHDs, either singularly or collectively. Developing a framework for evaluating AHDs has potential to further aid our understanding of how these relationships may matter. In this article, we present a general theory of change, in the form of a logic model, for how AHDs impact public health at the community level. We then present a specific example of how the logic model has been customized for a specific AHD. Finally, we end with potential research questions on the AHD based on these concepts. We conclude that logic models are valuable tools, which can be used to assess the value and ultimate impact of the AHD.
Rethinking Social Barriers to Effective Adaptive Management
NASA Astrophysics Data System (ADS)
West, Simon; Schultz, Lisen; Bekessy, Sarah
2016-09-01
Adaptive management is an approach to environmental management based on learning-by-doing, where complexity, uncertainty, and incomplete knowledge are acknowledged and management actions are treated as experiments. However, while adaptive management has received significant uptake in theory, it remains elusively difficult to enact in practice. Proponents have blamed social barriers and have called for social science contributions. We address this gap by adopting a qualitative approach to explore the development of an ecological monitoring program within an adaptive management framework in a public land management organization in Australia. We ask what practices are used to enact the monitoring program and how do they shape learning? We elicit a rich narrative through extensive interviews with a key individual, and analyze the narrative using thematic analysis. We discuss our results in relation to the concept of 'knowledge work' and Westley's (2002) framework for interpreting the strategies of adaptive managers—'managing through, in, out and up.' We find that enacting the program is conditioned by distinct and sometimes competing logics—scientific logics prioritizing experimentation and learning, public logics emphasizing accountability and legitimacy, and corporate logics demanding efficiency and effectiveness. In this context, implementing adaptive management entails practices of translation to negotiate tensions between objective and situated knowledge, external experts and organizational staff, and collegiate and hierarchical norms. Our contribution embraces the 'doing' of learning-by-doing and marks a shift from conceptualizing the social as an external barrier to adaptive management to be removed to an approach that situates adaptive management as social knowledge practice.
McDonald, Steve; Turner, Tari; Chamberlain, Catherine; Lumbiganon, Pisake; Thinkhamrop, Jadsada; Festin, Mario R; Ho, Jacqueline J; Mohammad, Hakimi; Henderson-Smart, David J; Short, Jacki; Crowther, Caroline A; Martis, Ruth; Green, Sally
2010-07-01
Rates of maternal and perinatal mortality remain high in developing countries despite the existence of effective interventions. Efforts to strengthen evidence-based approaches to improve health in these settings are partly hindered by restricted access to the best available evidence, limited training in evidence-based practice and concerns about the relevance of existing evidence. South East Asia--Optimising Reproductive and Child Health in Developing Countries (SEA-ORCHID) was a five-year project that aimed to determine whether a multifaceted intervention designed to strengthen the capacity for research synthesis, evidence-based care and knowledge implementation improved clinical practice and led to better health outcomes for mothers and babies. This paper describes the development and design of the SEA-ORCHID intervention plan using a logical framework approach. SEA-ORCHID used a before-and-after design to evaluate the impact of a multifaceted tailored intervention at nine sites across Thailand, Malaysia, Philippines and Indonesia, supported by three centres in Australia. We used a logical framework approach to systematically prepare and summarise the project plan in a clear and logical way. The development and design of the SEA-ORCHID project was based around the three components of a logical framework (problem analysis, project plan and evaluation strategy). The SEA-ORCHID logical framework defined the project's goal and purpose (To improve the health of mothers and babies in South East Asia and To improve clinical practice in reproductive health in South East Asia), and outlined a series of project objectives and activities designed to achieve these. The logical framework also established outcome and process measures appropriate to each level of the project plan, and guided project work in each of the participating countries and hospitals. Development of a logical framework in the SEA-ORCHID project enabled a reasoned, logical approach to the project design that ensured the project activities would achieve the desired outcomes and that the evaluation plan would assess both the process and outcome of the project. The logical framework was also valuable over the course of the project to facilitate communication, assess progress and build a shared understanding of the project activities, purpose and goal.
2010-01-01
Background: Rates of maternal and perinatal mortality remain high in developing countries despite the existence of effective interventions. Efforts to strengthen evidence-based approaches to improve health in these settings are partly hindered by restricted access to the best available evidence, limited training in evidence-based practice and concerns about the relevance of existing evidence. South East Asia - Optimising Reproductive and Child Health in Developing Countries (SEA-ORCHID) was a five-year project that aimed to determine whether a multifaceted intervention designed to strengthen the capacity for research synthesis, evidence-based care and knowledge implementation improved clinical practice and led to better health outcomes for mothers and babies. This paper describes the development and design of the SEA-ORCHID intervention plan using a logical framework approach. Methods: SEA-ORCHID used a before-and-after design to evaluate the impact of a multifaceted tailored intervention at nine sites across Thailand, Malaysia, Philippines and Indonesia, supported by three centres in Australia. We used a logical framework approach to systematically prepare and summarise the project plan in a clear and logical way. The development and design of the SEA-ORCHID project was based around the three components of a logical framework (problem analysis, project plan and evaluation strategy). Results: The SEA-ORCHID logical framework defined the project's goal and purpose (To improve the health of mothers and babies in South East Asia and To improve clinical practice in reproductive health in South East Asia), and outlined a series of project objectives and activities designed to achieve these. The logical framework also established outcome and process measures appropriate to each level of the project plan, and guided project work in each of the participating countries and hospitals. Conclusions: Development of a logical framework in the SEA-ORCHID project enabled a reasoned, logical approach to the project design that ensured the project activities would achieve the desired outcomes and that the evaluation plan would assess both the process and outcome of the project. The logical framework was also valuable over the course of the project to facilitate communication, assess progress and build a shared understanding of the project activities, purpose and goal. PMID:20594325
Closing the Gap Between Specification and Programming: VDM++ and SCALA
NASA Technical Reports Server (NTRS)
Havelund, Klaus
2011-01-01
We argue that a modern programming language such as Scala offers a level of succinctness, which makes it suitable for program and systems specification as well as for high-level programming. We illustrate this by comparing the language with the Vdm++ specification language. The comparison also identifies areas where Scala perhaps could be improved, inspired by Vdm++. We furthermore illustrate Scala's potential as a specification language by augmenting it with a combination of parameterized state machines and temporal logic, defined as a library, thereby forming an expressive but simple runtime verification framework.
Fulmer, Erika; Rogers, Todd; Glasgow, LaShawn; Brown, Susan; Kuiper, Nicole
2018-03-01
The outcome indicator framework helps tobacco prevention and control programs (TCPs) plan and implement theory-driven evaluations of their efforts to reduce and prevent tobacco use. Tobacco use is the single most preventable cause of morbidity and mortality in the United States. The implementation of public health best practices by comprehensive state TCPs has been shown to prevent the initiation of tobacco use, reduce tobacco use prevalence, and decrease tobacco-related health care expenditures. Achieving and sustaining program goals requires TCPs to evaluate the effectiveness and impact of their programs. To guide evaluation efforts by TCPs, the Centers for Disease Control and Prevention's Office on Smoking and Health developed an outcome indicator framework that includes a high-level logic model and evidence-based outcome indicators for each tobacco prevention and control goal area. In this article, we describe how TCPs and other community organizations can use the outcome indicator framework in their evaluation efforts. We also discuss how the framework is used at the national level to unify tobacco prevention and control efforts across varying state contexts, identify promising practices, and expand the public health evidence base.
An Argumentation Framework based on Paraconsistent Logic
NASA Astrophysics Data System (ADS)
Umeda, Yuichi; Takahashi, Takehisa; Sawamura, Hajime
Argumentation is one of the most representative intelligent activities of humans. Therefore, it is natural to think that it could have many implications for artificial intelligence and computer science as well. Specifically, argumentation may be considered one of the most primitive capabilities for interaction among computational agents. In this paper we present an argumentation framework based on the four-valued paraconsistent logic. The tolerance and acceptance of inconsistency that this logic has as its logical feature allow for arguments on the inconsistent knowledge bases with which we are often confronted. We introduce various concepts for argumentation, such as arguments, attack relations, argument justification, and preferential criteria of arguments based on social norms, in a way proper to the four-valued paraconsistent logic. Then, we provide the fixpoint semantics and dialectical proof theory for our argumentation framework. We also give proofs of the soundness and completeness.
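The fixpoint semantics mentioned here follows the standard argumentation-theoretic pattern. The Scala sketch below computes the least fixpoint (the grounded extension) over an abstract attack relation; the four-valued paraconsistent layer from which the paper builds its arguments is not modelled.

```scala
object Grounded {
  final case class AF(args: Set[String], attacks: Set[(String, String)]) {
    private def attackers(a: String): Set[String] =
      attacks.collect { case (x, y) if y == a => x }

    // a is acceptable w.r.t. s if s attacks every attacker of a.
    def acceptable(a: String, s: Set[String]): Boolean =
      attackers(a).forall(b => attackers(b).exists(s))

    // Iterate the characteristic function from the empty set to its least fixpoint.
    def grounded: Set[String] = {
      @annotation.tailrec
      def go(s: Set[String]): Set[String] = {
        val next = args.filter(acceptable(_, s))
        if (next == s) s else go(next)
      }
      go(Set.empty)
    }
  }

  def main(args: Array[String]): Unit = {
    // c attacks b, b attacks a: c is unattacked, so c defends a.
    val af = AF(Set("a", "b", "c"), Set("c" -> "b", "b" -> "a"))
    println(af.grounded) // the grounded extension: {a, c}
  }
}
```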
Framework for End-User Programming of Cross-Smart Space Applications
Palviainen, Marko; Kuusijärvi, Jarkko; Ovaska, Eila
2012-01-01
Cross-smart space applications are specific types of software services that enable users to share information and to monitor and control the physical and logical surroundings in a way that is meaningful for the user's situation. For developing cross-smart space applications, this paper makes two main contributions: it introduces (i) a component design and scripting method for end-user programming of cross-smart space applications and (ii) a backend framework of components that interwork to support the brunt of the RDFScript translation, and the use and execution of ontology models. Before end-user programming can begin, software professionals must develop easy-to-apply Driver components for the APIs of existing software systems. Thereafter, end-users are able to create applications from the commands of the Driver components with the help of the provided toolset. The paper also introduces the reference implementation of the framework, tools for Driver component development and end-user programming of cross-smart space applications, and the first evaluation results on their application. PMID:23202169
An Evaluation of a School-Based Teenage Pregnancy Prevention Program Using a Logic Model Framework
ERIC Educational Resources Information Center
Hulton, Linda J.
2007-01-01
Teenage pregnancy and the subsequent social morbidities associated with unintended pregnancies are complex issues facing school nurses in their daily work. In contemporary practice, school nurses are being held to higher standards of accountability and being asked to demonstrate the effective outcomes of their interventions. The purpose of this…
Agent oriented programming: An overview of the framework and summary of recent research
NASA Technical Reports Server (NTRS)
Shoham, Yoav
1993-01-01
This is a short overview of the agent-oriented programming (AOP) framework. AOP can be viewed as a specialization of object-oriented programming. The state of an agent consists of components called beliefs, choices, capabilities, commitments, and possibly others; for this reason the state of an agent is called its mental state. The mental state of agents is captured formally in an extension of standard epistemic logics: besides temporalizing the knowledge and belief operators, AOP introduces operators for commitment, choice and capability. Agents are controlled by agent programs, which include primitives for communicating with other agents. In the spirit of speech-act theory, each communication primitive is of a certain type: informing, requesting, offering, etc. This document describes these features in more detail and summarizes recent results and ongoing AOP-related work.
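A cartoon of the AOP idea, mental state plus typed speech-act primitives, can be sketched in a few lines of Scala. All names below are invented; Shoham's AGENT-0 language additionally interprets commitment rules over time, which this sketch omits.

```scala
object AOPSketch {
  final case class Belief(fact: String)
  final case class Commitment(to: String, action: String, time: Int)

  // An agent's state is its "mental state": beliefs, capabilities, commitments.
  final class Agent(val name: String, capabilities: Set[String]) {
    private var beliefs     = Set.empty[Belief]
    private var commitments = List.empty[Commitment]

    // INFORM: the hearer adds the communicated fact to its beliefs.
    def inform(fact: String): Unit = beliefs += Belief(fact)

    // REQUEST: commits the agent only if it is capable of the action.
    def request(from: String, action: String, time: Int): Boolean =
      if (capabilities(action)) { commitments ::= Commitment(from, action, time); true }
      else false

    def pending(now: Int): List[Commitment] = commitments.filter(_.time >= now)
  }

  def main(args: Array[String]): Unit = {
    val robot = new Agent("robot1", capabilities = Set("move", "grasp"))
    robot.inform("door_open")
    println(robot.request("operator", "move", time = 10)) // true: capable, so committed
    println(robot.request("operator", "fly", time = 12))  // false: not capable
    println(robot.pending(now = 0))
  }
}
```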
Developing a theoretical framework for complex community-based interventions.
Angeles, Ricardo N; Dolovich, Lisa; Kaczorowski, Janusz; Thabane, Lehana
2014-01-01
Applying existing theories to research, in the form of a theoretical framework, is necessary to advance knowledge from what is already known toward the next steps to be taken. This article proposes a guide on how to develop a theoretical framework for complex community-based interventions using the Cardiovascular Health Awareness Program as an example. Developing a theoretical framework starts with identifying the intervention's essential elements. Subsequent steps include the following: (a) identifying and defining the different variables (independent, dependent, mediating/intervening, moderating, and control); (b) postulating mechanisms how the independent variables will lead to the dependent variables; (c) identifying existing theoretical models supporting the theoretical framework under development; (d) scripting the theoretical framework into a figure or sets of statements as a series of hypotheses, if/then logic statements, or a visual model; (e) content and face validation of the theoretical framework; and (f) revising the theoretical framework. In our example, we combined the "diffusion of innovation theory" and the "health belief model" to develop our framework. Using the Cardiovascular Health Awareness Program as the model, we demonstrated a stepwise process of developing a theoretical framework. The challenges encountered are described, and an overview of the strategies employed to overcome these challenges is presented.
ERIC Educational Resources Information Center
Kimpel, Lucina
2010-01-01
This research was comprised of a case study conducted at Grand View University to determine faculty perceptions and perspectives of outcomes related to a Title III grant-funded, professional development program. The conceptual framework for the study was based on a systematic process called the logic model (W. K. Kellogg Foundation, 2004). A…
Ostrowski, M; Paulevé, L; Schaub, T; Siegel, A; Guziolowski, C
2016-11-01
Boolean networks (and more general logic models) are useful frameworks to study signal transduction across multiple pathways. Logic models can be learned from a prior knowledge network structure and multiplex phosphoproteomics data. However, most efficient and scalable training methods focus on the comparison of two time points and assume that the system has reached an early steady state. In this paper, we generalize such a learning procedure to take into account the time series traces of phosphoproteomics data in order to discriminate Boolean networks according to their transient dynamics. To that end, we identify a necessary condition that must be satisfied by the dynamics of a Boolean network to be consistent with a discretized time series trace. Based on this condition, we use Answer Set Programming to compute an over-approximation of the set of Boolean networks which fit best with experimental data and provide the corresponding encodings. Combined with model-checking approaches, we end up with a global learning algorithm. Our approach is able to learn logic models with a true positive rate higher than 78% in two case studies of mammalian signaling networks; for a larger case study, our method provides optimal answers after 7 min of computation. We quantified the gain in prediction precision of our method compared with learning approaches based on static data. Finally, as an application, our method proposes erroneous time points in the time series data with respect to the optimal learned logic models. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
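Under a synchronous-update simplification, a consistency condition is easy to state: every consecutive pair of discretized measurements must be reproducible by one application of the network's update functions. The Scala sketch below checks exactly that on an invented two-node network; the paper itself works with more general semantics and Answer Set Programming encodings, which this does not attempt.

```scala
object BoolNet {
  type State   = Map[String, Boolean]
  // Each node's next value is a Boolean function of the whole current state.
  type Network = Map[String, State => Boolean]

  def step(net: Network, s: State): State = net.map { case (n, f) => n -> f(s) }

  // Necessary condition: the trace must follow the synchronous dynamics.
  def consistent(net: Network, trace: Seq[State]): Boolean =
    trace.zip(trace.tail).forall { case (a, b) => step(net, a) == b }

  def main(args: Array[String]): Unit = {
    val net: Network = Map(
      "A" -> (s => s("B")),           // A(t+1) = B(t)
      "B" -> (s => s("A") && s("B"))  // B(t+1) = A(t) AND B(t)
    )
    val trace = Seq(
      Map("A" -> true, "B" -> true),
      Map("A" -> true, "B" -> true),
      Map("A" -> true, "B" -> false)
    )
    println(consistent(net, trace)) // false: the third state breaks the dynamics
  }
}
```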
NASA Technical Reports Server (NTRS)
Havelund, Klaus
2014-01-01
The field of runtime verification has during the last decade seen a multitude of systems for monitoring event sequences (traces) emitted by a running system. The objective is to ensure correctness of a system by checking its execution traces against formal specifications representing requirements. A special challenge is data-parameterized events, where monitors have to keep track of combinations of control states as well as data constraints, relating events and the data they carry across time points. This poses a challenge with respect to the efficiency of monitors, as well as the expressiveness of logics. Data automata are a form of automaton whose states are parameterized with data, supporting monitoring of data-parameterized events. We describe the full details of a very simple API in the Scala programming language, an internal DSL (Domain-Specific Language), implementing data automata. The small implementation suggests a design pattern. Data automata allow transition conditions to refer to states other than the source state, and allow target states of transitions to be inlined, offering a temporal-logic-flavored notation. Embedding a logic in a high-level language like Scala additionally allows monitors to be programmed using all of Scala's language constructs, offering the full flexibility of a programming language. The framework is demonstrated on an XML processing scenario previously addressed in related work.
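The data-automata pattern is small enough to sketch: the monitor keeps a set of active states, each state is a partial function from events to successor-state sets, and parameterized states close over the data they track. The event types and lock-discipline property below are illustrative choices, not necessarily the paper's API or examples.

```scala
object DataAutomata {
  sealed trait Event
  final case class Acquire(thread: String, lock: String) extends Event
  final case class Release(thread: String, lock: String) extends Event

  // A state wraps a partial function from events to successor-state sets;
  // events a state does not mention leave it unchanged.
  final class State(handle: PartialFunction[Event, Set[State]]) {
    def apply(e: Event): Option[Set[State]] = handle.lift(e)
  }

  class Monitor {
    var failed = false

    // Initial state: every Acquire spawns a state parameterized by (thread, lock).
    private lazy val start: State = new State({
      case Acquire(t, l) => Set(start, locked(t, l))
    })

    // Parameterized state, closing over its data: a second acquire of the same
    // lock by a different thread is a violation; the matching release retires it.
    private def locked(t: String, l: String): State = new State({
      case Acquire(t2, `l`) if t2 != t => failed = true; Set.empty
      case Release(`t`, `l`)           => Set.empty
    })

    private var states: Set[State] = Set(start)
    def verify(e: Event): Unit =
      states = states.flatMap(s => s(e).getOrElse(Set(s)))
  }

  def main(args: Array[String]): Unit = {
    val m = new Monitor
    Seq(Acquire("T1", "L"), Acquire("T2", "L")).foreach(m.verify)
    println(m.failed) // true: L acquired by T2 while still held by T1
  }
}
```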
Rule-Based Runtime Verification
NASA Technical Reports Server (NTRS)
Barringer, Howard; Goldberg, Allen; Havelund, Klaus; Sen, Koushik
2003-01-01
We present a rule-based framework for defining and implementing finite trace monitoring logics, including future and past time temporal logic, extended regular expressions, real-time logics, interval logics, forms of quantified temporal logics, and so on. Our logic, EAGLE, is implemented as a Java library and involves novel techniques for rule definition, manipulation and execution. Monitoring is done on a state-by-state basis, without storing the execution trace.
Synthetic mixed-signal computation in living cells
Rubens, Jacob R.; Selvaggio, Gianluca; Lu, Timothy K.
2016-01-01
Living cells implement complex computations on the continuous environmental signals that they encounter. These computations involve both analogue- and digital-like processing of signals to give rise to complex developmental programs, context-dependent behaviours and homeostatic activities. In contrast to natural biological systems, synthetic biological systems have largely focused on either digital or analogue computation separately. Here we integrate analogue and digital computation to implement complex hybrid synthetic genetic programs in living cells. We present a framework for building comparator gene circuits to digitize analogue inputs based on different thresholds. We then demonstrate that comparators can be predictably composed together to build band-pass filters, ternary logic systems and multi-level analogue-to-digital converters. In addition, we interface these analogue-to-digital circuits with other digital gene circuits to enable concentration-dependent logic. We expect that this hybrid computational paradigm will enable new industrial, diagnostic and therapeutic applications with engineered cells. PMID:27255669
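In software terms the comparator construction is plain thresholding, and composing comparators yields band-pass and multi-level behaviour. The Scala analogue below uses invented thresholds and models none of the genetic-circuit dynamics; it only shows the compositional logic.

```scala
object Comparators {
  // A comparator digitizes an analogue input against one threshold.
  def comparator(threshold: Double): Double => Boolean = x => x >= threshold

  val low  = comparator(0.2) // fires above 20% of the input range
  val high = comparator(0.8) // fires above 80%

  // Composing two comparators gives a 3-level (ternary) output ...
  def ternary(x: Double): Int = (low(x), high(x)) match {
    case (false, _)    => 0 // below both thresholds
    case (true, false) => 1 // the band-pass region
    case (true, true)  => 2 // above both
  }

  // ... or a band-pass filter: respond only in the middle band.
  def bandPass(x: Double): Boolean = low(x) && !high(x)

  def main(args: Array[String]): Unit =
    Seq(0.1, 0.5, 0.9).foreach(x => println(s"x=$x ternary=${ternary(x)} band=${bandPass(x)}"))
}
```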
Logics of Business Education for Sustainability
ERIC Educational Resources Information Center
Andersson, Pernilla; Öhman, Johan
2016-01-01
This paper explores various kinds of logics of "business education for sustainability" and how these "logics" position the subject business person, based on eight teachers' reasoning of their own practices. The concept of logics developed within a discourse theoretical framework is employed to analyse the teachers' reasoning.…
A Rewriting Logic Approach to Type Inference
NASA Astrophysics Data System (ADS)
Ellison, Chucky; Şerbănuţă, Traian Florin; Roşu, Grigore
Meseguer and Roşu proposed rewriting logic semantics (RLS) as a programming language definitional framework that unifies operational and algebraic denotational semantics. RLS has already been used to define a series of didactic and real languages, but its benefits in connection with defining and reasoning about type systems have not been fully investigated. This paper shows how the same RLS style employed for giving formal definitions of languages can be used to define type systems. The same term-rewriting mechanism used to execute RLS language definitions can now be used to execute type systems, giving type checkers or type inferencers. The proposed approach is exemplified by defining the Hindley-Milner polymorphic type inferencer W as a rewrite logic theory and using this definition to obtain a type inferencer by executing it in a rewriting logic engine. The inferencer obtained this way compares favorably with other definitions or implementations of W. The performance of the executable definition is within an order of magnitude of that of highly optimized implementations of type inferencers, such as that of OCaml.
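For readers unfamiliar with W, the following compact Python sketch shows the unification core of Hindley-Milner inference for a tiny lambda calculus; it omits let-polymorphism and the occurs check, and it is not the paper's rewriting-logic encoding (which runs in a rewriting engine rather than an interpreter).

    import itertools
    fresh = (f"?t{i}" for i in itertools.count())   # type-variable supply

    def is_var(t):
        return isinstance(t, str) and t.startswith("?")

    def walk(t, s):
        """Resolve a type through the substitution s."""
        while is_var(t) and t in s:
            t = s[t]
        if isinstance(t, tuple):
            return ("->", walk(t[1], s), walk(t[2], s))
        return t

    def unify(a, b, s):
        a, b = walk(a, s), walk(b, s)
        if a == b:
            return s
        if is_var(a):
            return {**s, a: b}          # occurs check omitted in this sketch
        if is_var(b):
            return {**s, b: a}
        if isinstance(a, tuple) and isinstance(b, tuple):
            return unify(a[2], b[2], unify(a[1], b[1], s))
        raise TypeError(f"cannot unify {a} with {b}")

    def infer(term, env, s):
        kind = term[0]
        if kind == "var":                   # ("var", name)
            return env[term[1]], s
        if kind == "lam":                   # ("lam", x, body)
            tx = next(fresh)
            tb, s = infer(term[2], {**env, term[1]: tx}, s)
            return ("->", tx, tb), s
        if kind == "app":                   # ("app", fun, arg)
            tf, s = infer(term[1], env, s)
            ta, s = infer(term[2], env, s)
            tr = next(fresh)
            return tr, unify(tf, ("->", ta, tr), s)

    identity = ("lam", "x", ("var", "x"))
    t, s = infer(("app", identity, ("var", "y")), {"y": "int"}, {})
    print(walk(t, s))                       # int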
NASA Technical Reports Server (NTRS)
Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.
1982-01-01
A variety of artificial intelligence techniques which could be used with regard to NASA space applications and robotics were evaluated. The techniques studied were decision tree manipulators, problem solvers, rule based systems, logic programming languages, representation language languages, and expert systems. The overall structure of a robotic simulation tool was defined and a framework for that tool developed. Nonlinear and linearized dynamics equations were formulated for n-link manipulator configurations. A framework for the robotic simulation was established which uses validated manipulator component models connected according to a user defined configuration.
Questioning and Experimentation
ERIC Educational Resources Information Center
Mutanen, Arto
2014-01-01
The paper is a philosophical analysis of experimentation. The philosophical framework of the analysis is the interrogative model of inquiry developed by Hintikka. The basis of the model is explicit and well-formed logic of questions and answers. The framework allows us to formulate a flexible logic of experimentation. In particular, the formulated…
Designing an evaluation framework for WFME basic standards for medical education.
Tackett, Sean; Grant, Janet; Mmari, Kristin
2016-01-01
To create an evaluation plan for the World Federation for Medical Education (WFME) accreditation standards for basic medical education. We conceptualized the 100 basic standards from "Basic Medical Education: WFME Global Standards for Quality Improvement: The 2012 Revision" as medical education program objectives. Standards were simplified into evaluable items, which were then categorized as inputs, processes, outputs and/or outcomes to generate a logic model and corresponding plan for data collection. WFME standards posed significant challenges to evaluation due to complex wording, inconsistent formatting and lack of existing assessment tools. Our resulting logic model contained 244 items. Standard B 5.1.1 separated into 24 items, the most for any single standard. A large proportion of items (40%) required evaluation of more than one input, process, output and/or outcome. Only one standard (B 3.2.2) was interpreted as requiring evaluation of a program outcome. Current WFME standards are difficult to use for evaluation planning. Our analysis may guide adaptation and revision of standards to make them more evaluable. Our logic model and data collection plan may be useful to medical schools planning an institutional self-review and to accrediting authorities wanting to provide guidance to schools under their purview.
Generalized Symbolic Execution for Model Checking and Testing
NASA Technical Reports Server (NTRS)
Khurshid, Sarfraz; Pasareanu, Corina; Visser, Willem; Kofmeyer, David (Technical Monitor)
2003-01-01
Modern software systems, which often are concurrent and manipulate complex data structures, must be extremely reliable. We present a novel framework, based on symbolic execution, for automated checking of such systems. We provide a two-fold generalization of traditional symbolic execution based approaches: one, we define a program instrumentation, which enables standard model checkers to perform symbolic execution; two, we give a novel symbolic execution algorithm that handles dynamically allocated structures (e.g., lists and trees), method preconditions (e.g., acyclicity of lists), data (e.g., integers and strings) and concurrency. The program instrumentation enables a model checker to automatically explore program heap configurations (using a systematic treatment of aliasing) and manipulate logical formulae on program data values (using a decision procedure). We illustrate two applications of our framework: checking correctness of multi-threaded programs that take inputs from unbounded domains with complex structure and generation of non-isomorphic test inputs that satisfy a testing criterion. Our implementation for Java uses the Java PathFinder model checker.
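The core idea of forking on branches while accumulating a path condition can be sketched in a few lines of Python; this toy executor handles only assignments and conditionals over a made-up statement format, whereas the actual framework adds heap handling, decision procedures, and concurrency.

    # A toy symbolic executor: inputs are symbols, branches fork the state,
    # and each leaf carries a path condition over the symbols. Real systems
    # discharge path conditions with a decision procedure.

    def execute(stmts, env, pc, results):
        if not stmts:
            results.append((list(pc), dict(env)))
            return
        op, *args = stmts[0]
        rest = stmts[1:]
        if op == "assign":                  # ("assign", var, expr-string)
            var, expr = args
            env = dict(env)
            env[var] = expr
            execute(rest, env, pc, results)
        elif op == "if":                    # ("if", cond, then-stmts, else-stmts)
            cond, then_s, else_s = args
            execute(then_s + rest, env, pc + [cond], results)
            execute(else_s + rest, env, pc + [f"not ({cond})"], results)

    # Symbolically execute: if x < 0: y = -x else: y = x
    results = []
    program = [("if", "x < 0",
                [("assign", "y", "-x")],
                [("assign", "y", "x")])]
    execute(program, {"x": "x"}, [], results)
    for pc, env in results:
        print(pc, "=>", env["y"])
    # ['x < 0'] => -x
    # ['not (x < 0)'] => x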
Dal Palú, Alessandro; Spyrakis, Francesca; Cozzini, Pietro
2012-03-01
We describe the potential of a novel method, based on Constraint Logic Programming (CLP), developed for an exhaustive sampling of protein conformational space. The CLP framework proposed here has been tested and applied to the estrogen receptor, whose activity and function is strictly related to its intrinsic, and well known, dynamics. We have investigated in particular the flexibility of H12, focusing on the pathways followed by the helix when moving from one stable crystallographic conformation to the others. Millions of geometrically feasible conformations were generated and selected, and the traces connecting the different forms were determined by using a shortest path algorithm. The preliminary analyses showed a marked agreement between the crystallographic agonist-like, antagonist-like and hypothetical apo forms, and the corresponding conformations identified by the CLP framework. These promising results, together with the short computational time required to perform the analyses, make this constraint-based approach a valuable tool for the study of protein folding prediction. The CLP framework enables one to consider various structural and energetic scenarios, without changing the core algorithm. To show the feasibility of the method, we intentionally chose a pure geometric setting, neglecting the energetic evaluation of the poses, in order to be independent from a specific force field and to provide the possibility of comparing different behaviours associated with various energy models. Copyright © 2012 Elsevier Masson SAS. All rights reserved.
Mixing Categories and Modal Logics in the Quantum Setting
NASA Astrophysics Data System (ADS)
Cinà, Giovanni
The study of the foundations of Quantum Mechanics, especially after the advent of Quantum Computation and Information, has benefited from the application of category-theoretic tools and modal logics to the analysis of Quantum processes: we witness a wealth of theoretical frameworks cast in either of the two languages. This paper explores the interplay of the two formalisms in the peculiar context of Quantum Theory. After a review of some influential abstract frameworks, we show how different modal logic frames can be extracted from the category of finite dimensional Hilbert spaces, connecting the Categorical Quantum Mechanics approach to some modal logics that have been proposed for Quantum Computing. We then apply a general version of the same technique to two other categorical frameworks, the `topos approach' of Doering and Isham and the sheaf-theoretic work on contextuality by Abramsky and Brandenburger, suggesting how some key features can be expressed with modal languages.
Counter Unmanned Aerial System Decision-Aid Logic Process (C-UAS DALP)
decision-aid or logic process that bridges the middle elements of the kill chain between detection and countermeasure response. ... of use, location, general logic process, and reference mission. This is the framework for the IDEF0 functional architecture diagrams, decision-aid diagrams, logic process, and modeling and simulation. ... This capstone project creates the logic for a decision process that transitions from the ...
EAGLE can do Efficient LTL Monitoring
NASA Technical Reports Server (NTRS)
Barringer, Howard; Goldberg, Allen; Havelund, Klaus; Sen, Koushik
2003-01-01
We briefly present a rule-based framework, called EAGLE, that has been shown to be capable of defining and implementing finite trace monitoring logics, including future and past time temporal logic, extended regular expressions, real-time logics, interval logics, forms of quantified temporal logics, and so on. In this paper we show how EAGLE can do linear temporal logic (LTL) monitoring in an efficient way. We give an upper bound on the space and time complexity of this monitoring.
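One standard way to monitor LTL on finite traces, shown here as a hedged Python sketch rather than the EAGLE implementation, is formula progression: each observed state rewrites the formula into an obligation on the remainder of the trace.

    # Formulas are tuples; atoms are strings; a state is the set of true atoms.

    def prog(f, state):
        """Progress formula f through one observed state."""
        if f in (True, False):
            return f
        if isinstance(f, str):                       # atomic proposition
            return f in state
        op = f[0]
        if op == "not":  return neg(prog(f[1], state))
        if op == "and":  return conj(prog(f[1], state), prog(f[2], state))
        if op == "or":   return disj(prog(f[1], state), prog(f[2], state))
        if op == "next": return f[1]
        if op == "always":     return conj(prog(f[1], state), f)
        if op == "eventually": return disj(prog(f[1], state), f)

    def neg(a):
        return (not a) if isinstance(a, bool) else ("not", a)

    def conj(a, b):
        if a is False or b is False: return False
        if a is True: return b
        if b is True: return a
        return ("and", a, b)

    def disj(a, b):
        if a is True or b is True: return True
        if a is False: return b
        if b is False: return a
        return ("or", a, b)

    # Monitor G(request -> F grant) over a finite trace.
    f = ("always", ("or", ("not", "request"), ("eventually", "grant")))
    for state in [{"request"}, set(), {"grant"}]:
        f = prog(f, state)
    print(f is not False)   # True: no violation observed on this trace

The remaining formula after the last state is the residual obligation; a monitor reports a violation as soon as progression yields False.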
Enhancing programming logic thinking using analogy mapping
NASA Astrophysics Data System (ADS)
Sukamto, R. A.; Megasari, R.
2018-05-01
Programming logic thinking is the most important competence for computer science students. However, programming is one of the most difficult subjects in a computer science program. This paper reports our work on enhancing students' programming logic thinking using Analogy Mapping in a basic programming subject. Analogy Mapping is a computer application which converts source code into analogy images. This research used a time-series evaluation, and the results showed that Analogy Mapping can enhance students' programming logic thinking.
Devil is in the details: Using logic models to investigate program process.
Peyton, David J; Scicchitano, Michael
2017-12-01
Theory-based logic models are commonly developed as part of requirements for grant funding. As tools for communicating complex social programs, theory-based logic models are an effective form of visual communication. However, after initial development, theory-based logic models are often abandoned and remain in their initial form despite changes in the program process. This paper examines the potential benefits of committing time and resources to revising the initial theory-driven logic model and developing detailed logic models that describe key activities, so as to accurately reflect the program and assist in effective program management. The authors use a funded special education teacher preparation program to exemplify the utility of drill-down logic models. The paper concludes with lessons learned from the iterative revision process and suggests how the process can lead to more flexible and calibrated program management. Copyright © 2017 Elsevier Ltd. All rights reserved.
Designing a Software Tool for Fuzzy Logic Programming
NASA Astrophysics Data System (ADS)
Abietar, José M.; Morcillo, Pedro J.; Moreno, Ginés
2007-12-01
Fuzzy Logic Programming is an interesting and still growing research area that brings together the efforts for introducing fuzzy logic into logic programming (LP), in order to incorporate more expressive resources into such languages for dealing with uncertainty and approximate reasoning. The multi-adjoint logic programming approach is a recent and extremely flexible fuzzy logic paradigm for which, unfortunately, we have not found practical tools implemented so far. In this work, we describe a prototype system which is able to directly translate fuzzy logic programs into Prolog code in order to safely execute these residual programs inside any standard Prolog interpreter in a completely transparent way for the final user. We think that the development of such fuzzy languages and programming tools might play an important role in the design of advanced software applications for computational physics, chemistry, mathematics, medicine, industrial control and so on.
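A minimal sketch of the evaluation side of such programs, assuming a product t-norm and illustrative facts and rule weights (the actual system compiles multi-adjoint programs to Prolog rather than interpreting them in Python):

    # Rules carry truth degrees; body degrees are combined with a t-norm.

    T_NORM = lambda a, b: a * b          # product t-norm, one of several choices

    facts = {"young(anna)": 0.9, "likes_sport(anna)": 0.7}
    rules = [
        # (head, [body atoms], rule weight)
        ("active(anna)", ["young(anna)", "likes_sport(anna)"], 0.8),
    ]

    def degree(atom, depth=3):
        """Best derivable truth degree for an atom, to a bounded depth."""
        best = facts.get(atom, 0.0)
        if depth == 0:
            return best
        for head, body, w in rules:
            if head == atom:
                d = w
                for b in body:
                    d = T_NORM(d, degree(b, depth - 1))
                best = max(best, d)
        return best

    print(round(degree("active(anna)"), 3))   # 0.8 * 0.9 * 0.7 = 0.504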
Certifying Domain-Specific Policies
NASA Technical Reports Server (NTRS)
Lowry, Michael; Pressburger, Thomas; Rosu, Grigore; Koga, Dennis (Technical Monitor)
2001-01-01
Proof-checking code for compliance with safety policies potentially enables a product-oriented approach to certain aspects of software certification. To date, previous research has focused on generic, low-level programming-language properties such as memory type safety. In this paper we consider proof-checking higher-level domain-specific properties for compliance with safety policies. The paper first describes a framework, related to abstract interpretation, in which compliance with a class of certification policies can be efficiently calculated. Membership equational logic is shown to provide a rich logic for carrying out such calculations, including partiality, for certification. The architecture for a domain-specific certifier is described, followed by an implemented case study. The case study considers consistency of abstract variable attributes in code that performs geometric calculations in aerospace systems.
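The flavor of such attribute-consistency certification can be sketched as follows; the coordinate-frame attribute, variable names, and checked operation are illustrative assumptions, not the paper's policy language.

    # Propagate abstract attributes (coordinate frames of vectors) through
    # assignments and flag operations that mix inconsistent attributes.

    FRAMES = {"v1": "earth_fixed", "v2": "earth_fixed", "v3": "body_fixed"}

    def check_add(result, a, b, env):
        """Adding vectors is frame-consistent only within a single frame."""
        if env[a] != env[b]:
            raise ValueError(f"{a} ({env[a]}) + {b} ({env[b]}): frame mismatch")
        env[result] = env[a]

    env = dict(FRAMES)
    check_add("v4", "v1", "v2", env)      # fine: both earth_fixed
    try:
        check_add("v5", "v4", "v3", env)  # mixes earth_fixed and body_fixed
    except ValueError as e:
        print("certification failure:", e)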
Logic models as a tool for sexual violence prevention program development.
Hawkins, Stephanie R; Clinton-Sherrod, A Monique; Irvin, Neil; Hart, Laurie; Russell, Sarah Jane
2009-01-01
Sexual violence is a growing public health problem, and there is an urgent need to develop sexual violence prevention programs. Logic models have emerged as a vital tool in program development. The Centers for Disease Control and Prevention funded an empowerment evaluation designed to work with programs focused on the prevention of first-time male perpetration of sexual violence, and it included, as one of its goals, the development of program logic models. Two case studies are presented that describe how significant positive changes can be made to programs as a result of their developing logic models that accurately describe desired outcomes. The first case study describes how the logic model development process made an organization aware of the importance of a program's environmental context for program success; the second case study demonstrates how developing a program logic model can elucidate gaps in organizational programming and suggest ways to close those gaps.
Weighted Description Logics Preference Formulas for Multiattribute Negotiation
NASA Astrophysics Data System (ADS)
Ragone, Azzurra; di Noia, Tommaso; Donini, Francesco M.; di Sciascio, Eugenio; Wellman, Michael P.
We propose a framework to compute the utility of an agreement with respect to a preference set in a negotiation process. In particular, we refer to preferences expressed as weighted formulas in a decidable fragment of First-order Logic and agreements expressed as a formula. We ground our framework in Description Logics (DL) endowed with disjunction, to be compliant with Semantic Web technologies. A logic-based approach to preference representation allows one, when a background knowledge base is exploited, to relax the often unrealistic assumption of additive independence among attributes. We provide suitable definitions of the problem and present algorithms to compute utility in our setting. We also validate our approach through an experimental evaluation.
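A propositional simplification of the utility computation (the paper works with weighted Description Logic formulas and DL reasoning, which this sketch does not attempt):

    # An agreement is a set of true literals; its utility is the sum of the
    # weights of the preference formulas it satisfies. Preferences invented
    # for illustration (a car-buying negotiation).

    preferences = [
        (lambda a: "leather_seats" in a and "warranty" in a, 0.5),
        (lambda a: "red" in a or "black" in a,               0.2),
        (lambda a: "diesel" in a,                            0.3),
    ]

    def utility(agreement):
        return sum(w for formula, w in preferences if formula(agreement))

    print(utility({"leather_seats", "warranty", "black"}))  # 0.7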
A framework to find the logic backbone of a biological network.
Maheshwari, Parul; Albert, Réka
2017-12-06
Cellular behaviors are governed by interaction networks among biomolecules, for example gene regulatory and signal transduction networks. An often used dynamic modeling framework for these networks, Boolean modeling, can obtain their attractors (which correspond to cell types and behaviors) and their trajectories from an initial state (e.g. a resting state) to the attractors, for example in response to an external signal. The existing methods however do not elucidate the causal relationships between distant nodes in the network. In this work, we propose a simple logic framework, based on categorizing causal relationships as sufficient or necessary, as a complement to Boolean networks. We identify and explore the properties of complex subnetworks that are distillable into a single logic relationship. We also identify cyclic subnetworks that ensure the stabilization of the state of participating nodes regardless of the rest of the network. We identify the logic backbone of biomolecular networks, consisting of external signals, self-sustaining cyclic subnetworks (stable motifs), and output nodes. Furthermore, we use the logic framework to identify crucial nodes whose override can drive the system from one steady state to another. We apply these techniques to two biological networks: the epithelial-to-mesenchymal transition network corresponding to a developmental process exploited in tumor invasion, and the network of abscisic acid induced stomatal closure in plants. We find interesting subnetworks with logical implications in these networks. Using these subgraphs and motifs, we efficiently reduce both networks to succinct backbone structures. The logic representation identifies the causal relationships between distant nodes and subnetworks. This knowledge can form the basis of network control or be used in the reverse engineering of networks.
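One ingredient of the backbone, the self-sustaining cyclic subnetwork, can be checked directly; the following Python sketch uses illustrative rules and brute-forces all assignments of the outside nodes, so it only scales to toy networks.

    # A stable motif: once its nodes take the motif values, their update
    # rules reproduce those values regardless of the rest of the network.

    from itertools import product

    rules = {
        "A": lambda s: s["B"],            # A and B form a positive feedback loop
        "B": lambda s: s["A"],
        "C": lambda s: s["A"] and not s["D"],
    }

    def is_stable_motif(motif, rules, outside_nodes):
        """True if motif values are reproduced under every outside assignment."""
        for values in product([False, True], repeat=len(outside_nodes)):
            state = dict(motif)
            state.update(zip(outside_nodes, values))
            if any(rules[node](state) != value for node, value in motif.items()):
                return False
        return True

    # Once A and B are both ON, they sustain each other whatever C and D do.
    print(is_stable_motif({"A": True, "B": True}, rules, ["C", "D"]))  # True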
Rowland, Paula; McMillan, Sarah; McGillicuddy, Patti; Richards, Joy
2017-01-01
Public and patient involvement (PPI) in health care may refer to many different processes, ranging from participating in decision-making about one's own care to participating in health services research, health policy development, or organizational reforms. Across these many forms of public and patient involvement, the conceptual and theoretical underpinnings remain poorly articulated. Instead, most public and patient involvement programs rely on policy initiatives as their conceptual frameworks. This lack of conceptual clarity participates in dilemmas of program design, implementation, and evaluation. This study contributes to the development of theoretical understandings of public and patient involvement. In particular, we focus on the deployment of patient engagement programs within health service organizations. To develop a deeper understanding of the conceptual underpinnings of these programs, we examined the concept of "the patient perspective" as used by patient engagement practitioners and participants. Specifically, we focused on the way this phrase was used in the singular: "the" patient perspective or "the" patient voice. From qualitative analysis of interviews with 20 patient advisers and 6 staff members within a large urban health network in Canada, we argue that "the patient perspective" is referred to as a particular kind of situated knowledge, specifically an embodied knowledge of vulnerability. We draw parallels between this logic of patient perspective and the logic of early feminist theory, including the concepts of standpoint theory and strong objectivity. We suggest that champions of patient engagement may learn much from the way feminist theorists have constructed their arguments and addressed critique.
Computing single step operators of logic programming in radial basis function neural networks
NASA Astrophysics Data System (ADS)
Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong
2014-07-01
Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of a logic program is a mapping from ground atoms to false or true. The single step operator of any logic program is defined as a function (Tp: I → I). Logic programming is well-suited to building artificial intelligence systems. In this study, we established a new technique to compute the single step operators of logic programming in radial basis function neural networks. To do that, we proposed a new technique to generate the training data sets of single step operators. The training data sets are used to build the neural networks. We used recurrent radial basis function neural networks to reach the steady state (the fixed point of the operators). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks.
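For a propositional normal logic program, the single step (immediate consequence) operator can be computed directly; this sketch uses an illustrative three-clause program and iterates Tp to a fixed point, which this particular program happens to reach. (The paper goes on to learn this operator with an RBF network; here we only compute it.)

    # Clauses are (head, positive body atoms, negative body atoms).
    program = [
        ("p", ["q"], ["r"]),       # p :- q, not r.
        ("q", [], []),             # q.
        ("r", ["s"], []),          # r :- s.
    ]

    def tp(I):
        """Tp(I): heads of clauses whose bodies are satisfied by I."""
        return {head for head, pos, neg in program
                if all(a in I for a in pos) and all(a not in I for a in neg)}

    # Iterate from the empty interpretation to a fixed point.
    I = set()
    while True:
        J = tp(I)
        if J == I:
            break
        I = J
    print(sorted(I))   # ['p', 'q']: q holds, s does not, so r fails and p holds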
Logic Models for Program Design, Implementation, and Evaluation: Workshop Toolkit. REL 2015-057
ERIC Educational Resources Information Center
Shakman, Karen; Rodriguez, Sheila M.
2015-01-01
The Logic Model Workshop Toolkit is designed to help practitioners learn the purpose of logic models, the different elements of a logic model, and the appropriate steps for developing and using a logic model for program evaluation. Topics covered in the sessions include an overview of logic models, the elements of a logic model, an introduction to…
A Logical Framework for Distributed Data
1990-11-01
Broome, Paul; Broome, Barbara
U.S. Army Ballistic Research Laboratory
Modular Knowledge Representation and Reasoning in the Semantic Web
NASA Astrophysics Data System (ADS)
Serafini, Luciano; Homola, Martin
Construction of modular ontologies by combining different modules is becoming a necessity in ontology engineering in order to cope with the increasing complexity of the ontologies and the domains they represent. The modular ontology approach takes inspiration from software engineering, where modularization is a widely acknowledged feature. Distributed reasoning is the other side of the coin of modular ontologies: given an ontology comprising a set of modules, it is desired to perform reasoning by combination of multiple reasoning processes performed locally on each of the modules. In the last ten years, a number of approaches for combining logics have been developed in order to formalize modular ontologies. In this chapter, we survey and compare the main formalisms for modular ontologies and distributed reasoning in the Semantic Web. We select four formalisms built on formal logical grounds of Description Logics: Distributed Description Logics, ℰ-connections, Package-based Description Logics and Integrated Distributed Description Logics. We concentrate on expressivity and distinctive modeling features of each framework. We also discuss reasoning capabilities of each framework.
Harmonising Nursing Terminologies Using a Conceptual Framework.
Jansen, Kay; Kim, Tae Youn; Coenen, Amy; Saba, Virginia; Hardiker, Nicholas
2016-01-01
The International Classification for Nursing Practice (ICNP®) and the Clinical Care Classification (CCC) System are standardised nursing terminologies that identify discrete elements of nursing practice, including nursing diagnoses, interventions, and outcomes. While CCC uses a conceptual framework or model with 21 Care Components to classify these elements, ICNP, built on a formal Web Ontology Language (OWL) description logic foundation, uses a logical hierarchical framework that is useful for computing and maintenance of ICNP. Since the logical framework of ICNP may not always align with the needs of nursing practice, an informal framework may be a more useful organisational tool to represent nursing content. The purpose of this study was to classify ICNP nursing diagnoses using the 21 Care Components of the CCC as a conceptual framework to facilitate usability and inter-operability of nursing diagnoses in electronic health records. Findings resulted in all 521 ICNP diagnoses being assigned to one of the 21 CCC Care Components. Further research is needed to validate the resulting product of this study with practitioners and develop recommendations for improvement of both terminologies.
Wong, Sabrina T; Yin, Delu; Bhattacharyya, Onil; Wang, Bin; Liu, Liqun; Chen, Bowen
2010-11-18
China has had no effective and systematic information system to provide guidance for strengthening PHC (Primary Health Care) or account to citizens on progress. We report on the development of the China results-based Logic Model for Community Health Facilities and Stations (CHS) and a set of relevant PHC indicators intended to measure CHS priorities. We adapted the PHC Results Based Logic Model developed in Canada and current work conducted in the community health system in China to create the China CHS Logic Model framework. We used a staged approach by first constructing the framework and indicators and then validating their content through an interactive process involving policy analysis, critical review of relevant literature and multiple stakeholder consultation. The China CHS Logic Model includes inputs, activities, outputs and outcomes with a total of 287 detailed performance indicators. In these indicators, 31 indicators measure inputs, 64 measure activities, 105 measure outputs, and 87 measure immediate (n = 65), intermediate (n = 15), or final (n = 7) outcomes. A Logic Model framework can be useful in planning, implementation, analysis and evaluation of PHC at a system and service level. The development and content validation of the China CHS Logic Model and subsequent indicators provides a means for stronger accountability and a clearer sense of overall direction and purpose needed to renew and strengthen the PHC system in China. Moreover, this work will be useful in moving towards developing a PHC information system and performance measurement across districts in urban China, and guiding the pursuit of quality in PHC.
Deciding Full Branching Time Logic by Program Transformation
NASA Astrophysics Data System (ADS)
Pettorossi, Alberto; Proietti, Maurizio; Senni, Valerio
We present a method, based on logic program transformation, for verifying Computation Tree Logic (CTL*) properties of finite state reactive systems. The finite state systems and the CTL* properties we want to verify are encoded as logic programs on infinite lists. Our verification method consists of two steps. In the first step we transform the logic program that encodes the given system and the given property into a monadic ω-program, that is, a stratified program defining nullary or unary predicates on infinite lists. This transformation is performed by applying unfold/fold rules that preserve the perfect model of the initial program. In the second step we verify the property of interest by using a proof method for monadic ω-programs.
Specification and Verification of Web Applications in Rewriting Logic
NASA Astrophysics Data System (ADS)
Alpuente, María; Ballis, Demis; Romero, Daniel
This paper presents a Rewriting Logic framework that formalizes the interactions between Web servers and Web browsers through a communicating protocol abstracting HTTP. The proposed framework includes a scripting language that is powerful enough to model the dynamics of complex Web applications by encompassing the main features of the most popular Web scripting languages (e.g. PHP, ASP, Java Servlets). We also provide a detailed characterization of browser actions (e.g. forward/backward navigation, page refresh, and new window/tab openings) via rewrite rules, and show how our models can be naturally model-checked by using the Linear Temporal Logic of Rewriting (LTLR), which is a Linear Temporal Logic specifically designed for model-checking rewrite theories. Our formalization is particularly suitable for verification purposes, since it allows one to perform in-depth analyses of many subtle aspects related to Web interaction. Finally, the framework has been completely implemented in Maude, and we report on some successful experiments that we conducted by using the Maude LTLR model-checker.
An Automated Design Framework for Multicellular Recombinase Logic.
Guiziou, Sarah; Ulliana, Federico; Moreau, Violaine; Leclere, Michel; Bonnet, Jerome
2018-05-18
Tools to systematically reprogram cellular behavior are crucial to address pressing challenges in manufacturing, environment, or healthcare. Recombinases can very efficiently encode Boolean and history-dependent logic in many species, yet current designs are performed on a case-by-case basis, limiting their scalability and requiring time-consuming optimization. Here we present an automated workflow for designing recombinase logic devices executing Boolean functions. Our theoretical framework uses a reduced library of computational devices distributed into different cellular subpopulations, which are then composed in various manners to implement all desired logic functions at the multicellular level. Our design platform called CALIN (Composable Asynchronous Logic using Integrase Networks) is broadly accessible via a web server, taking truth tables as inputs and providing corresponding DNA designs and sequences as outputs (available at http://synbio.cbs.cnrs.fr/calin). We anticipate that this automated design workflow will streamline the implementation of Boolean functions in many organisms and for various applications.
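The distribution idea can be sketched abstractly: assign one detecting subpopulation per true row of the truth table and pool their outputs. This Python toy mirrors the spirit of the workflow, not its actual device library or the DNA designs it outputs.

    from itertools import product

    def minterms(truth_table, n_inputs):
        """Return one 'strain spec' per input combination mapped to 1."""
        strains = []
        for inputs in product([0, 1], repeat=n_inputs):
            if truth_table[inputs]:
                strains.append(inputs)   # this subpopulation detects this combo
        return strains

    # XOR across two subpopulations: a 01-detector and a 10-detector,
    # whose pooled outputs implement the function at the multicellular level.
    xor = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
    print(minterms(xor, 2))   # [(0, 1), (1, 0)]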
NASA Astrophysics Data System (ADS)
Mezentsev, Yu A.; Baranova, N. V.
2018-05-01
A universal economic-mathematical model for determining optimal strategies for managing the production and logistics subsystems (and their components) of enterprises is considered. The declared universality makes it possible to take into account, at the system level, both production components, including limitations on the ways of converting raw materials and components into sold goods, and resource and logical restrictions on input and output material flows. The presented model and the generated control problems are developed within the framework of a unified approach that allows one to implement logical conditions of any complexity and to define the corresponding formal optimization tasks. The conceptual meaning of the criteria and limitations used is explained. The generated mixed-programming tasks are shown to belong to the class NP. An approximate polynomial algorithm is proposed for solving the posed mixed-programming optimization tasks of realistic dimension and high computational complexity. Results of testing the algorithm on tasks over a wide range of dimensions are presented.
Development of Algorithms for Control of Humidity in Plant Growth Chambers
NASA Technical Reports Server (NTRS)
Costello, Thomas A.
2003-01-01
Algorithms were developed to control humidity in plant growth chambers used for research on bioregenerative life support at Kennedy Space Center. The algorithms used the computed water vapor pressure (based on measured air temperature and relative humidity) as the process variable, with time-proportioned outputs to operate the humidifier and de-humidifier. Algorithms were based upon proportional-integral-derivative (PID) and Fuzzy Logic schemes and were implemented using I/O Control software (OPTO-22) to define and download the control logic to an autonomous programmable logic controller (PLC; ultimate ethernet brain and assorted input-output modules, OPTO-22), which performed the monitoring and control logic processing, as well as the physical control of the devices that effected the targeted environment in the chamber. During limited testing, the PLCs successfully implemented the intended control schemes and attained a control resolution for humidity of less than 1%. The algorithms have potential to be used not only with autonomous PLCs but could also be implemented within network-based supervisory control programs. This report documents unique control features that were implemented within the OPTO-22 framework and makes recommendations regarding future uses of the hardware and software for biological research by NASA.
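A condensed sketch of the described scheme in Python; the PID gains, the Tetens saturation-pressure approximation, and the clipping of the duty fraction are illustrative assumptions, not the values or exact logic used in the chambers.

    import math

    def saturation_vp(temp_c):
        """Tetens approximation of saturation vapour pressure (kPa)."""
        return 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))

    def vapour_pressure(temp_c, rh_percent):
        return saturation_vp(temp_c) * rh_percent / 100.0

    class PID:
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_err = 0.0

        def update(self, setpoint, measured):
            err = setpoint - measured
            self.integral += err * self.dt
            deriv = (err - self.prev_err) / self.dt
            self.prev_err = err
            return self.kp * err + self.ki * self.integral + self.kd * deriv

    pid = PID(kp=2.0, ki=0.1, kd=0.5, dt=1.0)
    target = vapour_pressure(23.0, 70.0)       # setpoint: 70% RH at 23 C
    out = pid.update(target, vapour_pressure(23.0, 55.0))

    # Time-proportioning: a positive output drives the humidifier for that
    # fraction of the control period, a negative output the de-humidifier.
    duty = max(-1.0, min(1.0, out))
    device = "humidifier" if duty >= 0 else "de-humidifier"
    print(device, "on for fraction", round(abs(duty), 2), "of the period")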
Lai, Agnes Y; Stewart, Sunita M; Mui, Moses W; Wan, Alice; Yew, Carol; Lam, Tai Hing; Chan, Sophia S
2017-01-01
Evaluation studies on train-the-trainer workshops (TTTs) to develop family well-being interventions are limited in the literature. The Logic Model offers a framework to place some important concepts and tools of intervention science in the hands of frontline service providers. This paper reports on the evaluation of a TTT for a large community-based program to enhance family well-being in Hong Kong. The 2-day TTT introduced positive psychology themes (relevant to the programs that the trainees would deliver) and the Logic Model (which provides a framework to guide intervention development and evaluation) for social service workers to guide their community-based family interventions. The effectiveness of the TTT was examined by self-administered questionnaires that assessed trainees' changes in learning (perceived knowledge, self-efficacy, attitude, and intention), trainees' reactions to training content, knowledge sharing, and benefits to their service organizations before and after the training and then 6 months and 1 year later. Missing data were replaced by baseline values in an intention-to-treat analysis. Focus group interviews were conducted approximately 6 months after training. Fifty-six trainees (79% women) joined the TTT. Forty-four and 31 trainees completed the 6-month and 1-year questionnaires, respectively. The trainees indicated that the workshop was informative and well organized. The TTT enhanced trainees' perceived knowledge, self-efficacy, and attitudes toward the application of the Logic Model and positive psychology constructs in program design. These changes were present with small to large effect sizes that persisted to the 1-year follow-up. The skills learned were used to develop 31 family interventions that were delivered to about 1,000 families. Qualitative feedback supported the quantitative results. This TTT offers a practical example of academic-community partnerships that promote capacity among community social service workers. Goals included sharing basic tools of intervention development and evaluation, and the TTT offered, therefore, the potential of learning skills that extended beyond the lifetime of a single program. The research protocol was registered at the National Institutes of Health (identifier number: NCT01796275).
A logical foundation for representation of clinical data.
Campbell, K E; Das, A K; Musen, M A
1994-01-01
OBJECTIVE: A general framework for representation of clinical data that provides a declarative semantics of terms and that allows developers to define explicitly the relationships among both terms and combinations of terms. DESIGN: Use of conceptual graphs as a standard representation of logic and of an existing standardized vocabulary, the Systematized Nomenclature of Medicine (SNOMED International), for lexical elements. Concepts such as time, anatomy, and uncertainty must be modeled explicitly in a way that allows relation of these foundational concepts to surface-level clinical descriptions in a uniform manner. RESULTS: The proposed framework was used to model a simple radiology report, which included temporal references. CONCLUSION: Formal logic provides a framework for formalizing the representation of medical concepts. Actual implementations will be required to evaluate the practicality of this approach. PMID:7719805
NASA Technical Reports Server (NTRS)
Howard, Ayanna
2005-01-01
The Fuzzy Logic Engine is a software package that enables users to embed fuzzy-logic modules into their application programs. Fuzzy logic is useful as a means of formulating human expert knowledge and translating it into software to solve problems. Fuzzy logic provides flexibility for modeling relationships between input and output information and is distinguished by its robustness with respect to noise and variations in system parameters. In addition, linguistic fuzzy sets and conditional statements allow systems to make decisions based on imprecise and incomplete information. The user of the Fuzzy Logic Engine need not be an expert in fuzzy logic: it suffices to have a basic understanding of how linguistic rules can be applied to the user's problem. The Fuzzy Logic Engine is divided into two modules: (1) a graphical-interface software tool for creating linguistic fuzzy sets and conditional statements and (2) a fuzzy-logic software library for embedding fuzzy processing capability into current application programs. The graphical-interface tool was developed using the Tcl/Tk programming language. The fuzzy-logic software library was written in the C programming language.
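The kind of linguistic rule such an engine encodes can be illustrated with a tiny Python analogue (the actual engine pairs a Tcl/Tk interface with a C library); the membership break points and the distance-to-speed rules below are invented for the example.

    def tri(x, a, b, c):
        """Triangular membership: rises from a, peaks at b, falls to c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def fuzzy_speed(distance):
        # Linguistic sets over distance in metres; break points are invented.
        near = tri(distance, 0.0, 0.0, 5.0)
        far = tri(distance, 3.0, 10.0, 10.0)
        # Rules: IF near THEN slow (1 m/s); IF far THEN fast (4 m/s).
        # Weighted-centroid defuzzification over the rule activations.
        num = near * 1.0 + far * 4.0
        den = near + far
        return num / den if den else 0.0

    for d in (1.0, 4.0, 9.0):
        print(d, "m ->", round(fuzzy_speed(d), 2), "m/s")
    # 1.0 m -> 1.0 m/s; 4.0 m -> 2.25 m/s; 9.0 m -> 4.0 m/s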
Identifying and Mitigating Risks in Security Sector Assistance for Africa’s Fragile States
2015-01-01
The Logframe Handbook: A Logical Framework Approach to Project Cycle Management, Washington, D.C., 2005.
NASA Astrophysics Data System (ADS)
Nashiroh, Putri Khoirin; Kamdi, Waras; Elmunsyah, Hakkun
2017-09-01
Web programming is a basic subject in Computer and Informatics Engineering, a program of study in vocational high schools. It requires logical thinking ability in its learning activities. The purposes of this research were (1) to develop a web programming module that implements a scientific approach and can improve the logical thinking ability of students in vocational high schools; and (2) to test the effectiveness of the web programming module based on the scientific approach in training students' logical thinking ability. The result of this research was a web programming module that applies the scientific approach in learning activities to improve the logical thinking ability of students in vocational high schools. The effectiveness test of the web programming module supports the conclusion that it was very effective in training logical thinking ability and improving learning results: (1) the average posttest score of the students, 79.91, exceeded the minimum criterion value; (2) the average percentage of students' logical thinking scores was 82.98; and (3) the average percentage of students' positive responses to the web programming module was 81.86%.
An Arbitrary First Order Theory Can Be Represented by a Program: A Theorem
NASA Technical Reports Server (NTRS)
Hosheleva, Olga
1997-01-01
How can we represent knowledge inside a computer? For formalized knowledge, classical logic seems to be the most adequate tool. Classical logic is behind all formalisms of classical mathematics, and behind many formalisms used in Artificial Intelligence. There is only one serious problem with classical logic: due to the famous Gödel's theorem, classical logic is algorithmically undecidable; as a result, when knowledge is represented in the form of logical statements, it is very difficult to check whether, based on these statements, a given query is true or not. To make knowledge representations more algorithmic, a special field of logic programming was invented. An important portion of logic programming is algorithmically decidable. To cover knowledge that cannot be represented in this portion, several extensions of the decidable fragments have been proposed. In the spirit of logic programming, these extensions are usually introduced in such a way that even if a general algorithm is not available, good heuristic methods exist. It is important to check whether the already proposed extensions are sufficient, or whether further extensions are necessary. In the present paper, we show that one particular extension, namely, logic programming with classical negation, introduced by M. Gelfond and V. Lifschitz, can represent (in some reasonable sense) an arbitrary first-order logical theory.
Tremblay, Marie-Claude; Brousselle, Astrid; Richard, Lucie; Beaudet, Nicole
2013-10-01
Program designers and evaluators should make a point of testing the validity of a program's intervention theory before investing either in implementation or in any type of evaluation. In this context, logic analysis can be a particularly useful option, since it can be used to test the plausibility of a program's intervention theory using scientific knowledge. Professional development in public health is one field among several that would truly benefit from logic analysis, as it appears to be generally lacking in theorization and evaluation. This article presents the application of this analysis method to an innovative public health professional development program, the Health Promotion Laboratory. More specifically, this paper aims to (1) define the logic analysis approach and differentiate it from similar evaluative methods; (2) illustrate the application of this method by a concrete example (logic analysis of a professional development program); and (3) reflect on the requirements of each phase of logic analysis, as well as on the advantages and disadvantages of such an evaluation method. Using logic analysis to evaluate the Health Promotion Laboratory showed that, generally speaking, the program's intervention theory appeared to have been well designed. By testing and critically discussing logic analysis, this article also contributes to further improving and clarifying the method. Copyright © 2013 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Yu, Chong Ho
Although quantitative research methodology is widely applied by psychological researchers, there is a common misconception that quantitative research is based on logical positivism. This paper examines the relationship between quantitative research and eight major notions of logical positivism: (1) verification; (2) pro-observation; (3)…
Jeagle: a JAVA Runtime Verification Tool
NASA Technical Reports Server (NTRS)
DAmorim, Marcelo; Havelund, Klaus
2005-01-01
We introduce the temporal logic Jeagle and its supporting tool for runtime verification of Java programs. A monitor for a Jeagle formula checks if a finite trace of program events satisfies the formula. Jeagle is a programming oriented extension of the powerful rule-based Eagle logic that has been shown to be capable of defining and implementing a range of finite trace monitoring logics, including future and past time temporal logic, real-time and metric temporal logics, interval logics, forms of quantified temporal logics, and so on. Monitoring is achieved on a state-by-state basis, avoiding any need to store the input trace. Jeagle extends Eagle with constructs for capturing parameterized program events such as method calls and method returns. Parameters can be the objects that methods are called upon, arguments to methods, and return values. Jeagle allows one to refer to these in formulas. The tool performs automated program instrumentation using AspectJ. We show the transformational semantics of Jeagle.
ERIC Educational Resources Information Center
Lopez, Antonio M., Jr.
1989-01-01
Provides background material on logic programming and presents PROLOG as a high-level artificial intelligence programming language that borrows its basic constructs from logic. Suggests the language is one which will help the educator to achieve various goals, particularly the promotion of problem solving ability. (MVL)
Obfuscation Framework Based on Functionally Equivalent Combinatorial Logic Families
2008-03-01
of Defense, or the United States Government. AFIT/GCS/ENG/08-12 Obfuscation Framework Based on Functionally Equivalent Combinatorial Logic Families ... time, United States policy strongly encourages the sale and transfer of some military equipment to foreign governments and makes it easier for ... Proceedings of the International Conference on Availability, Reliability and Security, 2007. McDonald, J. Todd and Alec Yasinsac. "Of unicorns and random
Retro-causation, Minimum Contradictions and Non-locality
NASA Astrophysics Data System (ADS)
Kafatos, Menas; Nassikas, Athanassios A.
2011-11-01
Retro-causation has been experimentally verified by Bem and proposed by Kafatos in the form of space-time non-locality in the quantum framework. Every theory includes, beyond its specific axioms, the principles of logical communication (logical language), through which it is defined. This communication obeys Aristotelian logic (Classical Logic), the Leibniz Sufficient Reason Principle, and a hidden axiom, which basically states that there is an anterior-posterior relationship everywhere in communication. By means of a theorem discussed here, it can be proved that the communication mentioned implies contradictory statements, which can only be transcended through silence, i.e. the absence of any statements. Moreover, the breaking of silence is meaningful through the claim for minimum contradictions, which implies the existence of both a logical and an illogical dimension; contradictions refer to causality, implying its opposite, namely retro-causation, and to the anterior-posterior axiom, implying space-time non-locality. The purpose of this paper is to outline a framework accounting for retro-causation, from both purely theoretical and reality-based points of view.
Three Sets of Case Studies Suggest Logic and Consistency Challenges with Value Frameworks.
Cohen, Joshua T; Anderson, Jordan E; Neumann, Peter J
2017-02-01
To assess the logic and consistency of three prominent value frameworks. We reviewed the value frameworks from three organizations: the Memorial Sloan Kettering Cancer Center (DrugAbacus), the American Society of Clinical Oncologists, and the Institute for Clinical and Economic Review. For each framework, we developed case studies to explore the degree to which the frameworks have face validity in the sense that they are consistent with four important principles: value should be proportional to a therapy's benefit; components of value should matter to framework users (patients and payers); attribute weights should reflect user preferences; and value estimates used to inform therapy prices should reflect per-person benefit. All three frameworks can aid decision making by elucidating factors not explicitly addressed by conventional evaluation techniques (in particular, cost-effectiveness analyses). Our case studies identified four challenges: 1) value is not always proportional to benefit; 2) value reflects factors that may not be relevant to framework users (patients or payers); 3) attribute weights do not necessarily reflect user preferences or relate to value in ways that are transparent; and 4) value does not reflect per-person benefit. Although the value frameworks we reviewed capture value in a way that is important to various audiences, they are not always logical or consistent. Because these frameworks may have a growing influence on therapy access, it is imperative that analytic challenges be further explored. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Pitts, E. R.
1981-01-01
Program converts cell-net data into logic-gate models for use in test and simulation programs. Input consists of either Place, Route, and Fold (PRF) or Place-and-Route-in-Two-Dimensions (PR2D) layout data deck. Output consists of either Test Pattern Generator (TPG) or Logic-Simulation (LOGSIM) logic circuitry data deck. Designer needs to build only logic-gate-model circuit description since program acts as translator. Language is FORTRAN IV.
Michalowski, Martin; Wilk, Szymon; Tan, Xing; Michalowski, Wojtek
2014-01-01
Clinical practice guidelines (CPGs) implement evidence-based medicine designed to help generate a therapy for a patient suffering from a single disease. When applied to a comorbid patient, the concurrent combination of treatment steps from multiple CPGs is susceptible to adverse interactions in the resulting combined therapy (i.e., a therapy established according to all considered CPGs). This inability to concurrently apply CPGs has been shown to be one of the key shortcomings of CPG uptake in a clinical setting [1]. Several research efforts are underway to address this issue, such as the K4CARE [2] and GuideLine INteraction Detection Assistant (GLINDA) [3] projects and our previous research on applying constraint logic programming to developing a consistent combined therapy for a comorbid patient [4]. However, there is no generalized framework for mitigation that effectively captures general characteristics of the problem while handling nuances such as time and ordering requirements imposed by specific CPGs. In this paper we propose a first-order logic-based (FOL) approach for developing a generalized framework of mitigation. This approach uses a meta-algorithm and entailment properties to mitigate (i.e., identify and address) adverse interactions introduced by concurrently applied CPGs. We use an illustrative case study of a patient suffering from type 2 diabetes being treated for an onset of severe rheumatoid arthritis to show the expressiveness and robustness of our proposed FOL-based approach, and we discuss its appropriateness as the basis for the generalized theory.
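An illustrative sketch of the mitigation idea, far simpler than the paper's FOL formalization: combine steps from two guidelines, detect a known adverse interaction, and apply a substitution rule. The drug names, the interaction entry, and the alternative below are hypothetical placeholders, not clinical guidance.

    diabetes_cpg = ["metformin"]          # hypothetical therapy from CPG 1
    arthritis_cpg = ["naproxen"]          # hypothetical therapy from CPG 2

    # Hypothetical knowledge base: interacting pair -> mitigation action,
    # plus an alternative therapy usable as a substitution.
    adverse = {frozenset(["naproxen", "metformin"]): "substitute"}
    alternatives = {"naproxen": "celecoxib"}

    def mitigate(*therapies):
        """Combine therapies, then repair any flagged adverse interaction."""
        combined = [step for therapy in therapies for step in therapy]
        for pair, action in adverse.items():
            if pair <= set(combined) and action == "substitute":
                bad = next(d for d in combined if d in alternatives)
                combined[combined.index(bad)] = alternatives[bad]
        return combined

    print(mitigate(diabetes_cpg, arthritis_cpg))  # ['metformin', 'celecoxib']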
Global planning of several plants
NASA Technical Reports Server (NTRS)
Bescos, Sylvie
1992-01-01
This paper discusses an attempt to solve the problem of planning several pharmaceutical plants at a global level. The interest in planning at this level is to increase the global control over the production process, to improve its overall efficiency, and to reduce the need for interaction between production plants. In order to reduce the complexity of this problem and to make it tractable, some abstractions were made. Based on these abstractions, a prototype is being developed within the framework of the EUREKA project PROTOS, using Constraint Logic Programming techniques.
Automating Access Control Logics in Simple Type Theory with LEO-II
NASA Astrophysics Data System (ADS)
Benzmüller, Christoph
Garg and Abadi recently proved that prominent access control logics can be translated in a sound and complete way into modal logic S4. We have previously outlined how normal multimodal logics, including monomodal logics K and S4, can be embedded in simple type theory and we have demonstrated that the higher-order theorem prover LEO-II can automate reasoning in and about them. In this paper we combine these results and describe a sound (and complete) embedding of different access control logics in simple type theory. Employing this framework we show that the off-the-shelf theorem prover LEO-II can be applied to automate reasoning in and about prominent access control logics.
Software Process Assurance for Complex Electronics (SPACE)
NASA Technical Reports Server (NTRS)
Plastow, Richard A.
2007-01-01
Complex Electronics (CE) are now programmed to perform tasks that were previously handled in software, such as communication protocols. Many of the methods used to develop software bear a close resemblance to CE development. For instance, Field Programmable Gate Arrays (FPGAs) can have over a million logic gates, while system-on-chip (SOC) devices can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of software-like bugs such as incorrect design, logic errors, and unexpected interactions within the logic is great. Since CE devices are blurring the hardware/software boundary, we propose that mature software methodologies may be utilized with slight modifications in the development of these devices. Software Process Assurance for Complex Electronics (SPACE) is a research project that looks at using standardized S/W Assurance/Engineering practices to provide an assurance framework for development activities. Tools such as checklists, best practices and techniques can be used to detect missing requirements and bugs earlier in the development cycle, creating a development process for CE that will be more easily maintained, consistent and configurable based on the device used.
Research on teacher education programs: logic model approach.
Newton, Xiaoxia A; Poon, Rebecca C; Nunes, Nicole L; Stone, Elisa M
2013-02-01
Teacher education programs in the United States face increasing pressure to demonstrate their effectiveness through pupils' learning gains in classrooms where program graduates teach. The link between teacher candidates' learning in teacher education programs and pupils' learning in K-12 classrooms implicit in the policy discourse suggests a one-to-one correspondence. However, the logical steps leading from what teacher candidates have learned in their programs to what they are doing in classrooms that may contribute to their pupils' learning are anything but straightforward. In this paper, we argue that the logic model approach from scholarship on evaluation can enhance research on teacher education by making explicit the logical links between program processes and intended outcomes. We demonstrate the usefulness of the logic model approach through our own work on designing a longitudinal study that focuses on examining the process and impact of an undergraduate mathematics and science teacher education program. Copyright © 2012 Elsevier Ltd. All rights reserved.
Dogdu, Gamze; Yalcuk, Arda; Postalcioglu, Seda
2017-02-01
There are more than a hundred textile industries in Turkey that discharge large quantities of dye-rich wastewater, resulting in water pollution. Such effluents must be treated to meet discharge limits imposed by the Water Framework Directive in Turkey. Industrial treatment facilities are required to monitor operations to keep them cost-effective and to prevent operational faults, discharge-limit infringements, and water pollution. This paper proposes the treatment of actual textile wastewater by vertical flow constructed wetland (VFCW) systems, with operation and effluent wastewater quality monitored using fuzzy logic through a graphical user interface. The treatment performance of the VFCW is investigated in terms of chemical oxygen demand and ammonium nitrogen (NH4-N) content, color, and pH parameters during a 75-day period of operation. A computer program was developed with a fuzzy logic system (a decision-making tool) to graphically present (via a status analysis chart) the quality of treated textile effluent in relation to the Turkish Water Pollution Control Regulation. Fuzzy logic is used in the evaluation of data obtained from the VFCW systems and for notification of critical states exceeding the discharge limits. This creates a warning chart that reports, to the concerned party, any errors encountered in a reactor during the collection of any sample.
Verifying the Modal Logic Cube Is an Easy Task (For Higher-Order Automated Reasoners)
NASA Astrophysics Data System (ADS)
Benzmüller, Christoph
Prominent logics, including quantified multimodal logics, can be elegantly embedded in simple type theory (classical higher-order logic). Furthermore, off-the-shelf reasoning systems for simple type theory exist that can be uniformly employed for reasoning within and about embedded logics. In this paper we focus on reasoning about modal logics and exploit our framework for the automated verification of inclusion and equivalence relations between them. Related work has applied first-order automated theorem provers for the task. Our solution achieves significant improvements, most notably with respect to elegance and simplicity of the problem encodings as well as with respect to automation performance.
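The embedding idea can be sketched outside any theorem prover: a modal proposition becomes a predicate on worlds, and the box and diamond operators quantify over an accessibility relation. A toy Kripke model in Python (this illustrates the semantics only, not the paper's higher-order encodings):

    # Toy Kripke model: modal formulas as predicates on worlds.
    worlds = {0, 1, 2}
    access = {0: {1, 2}, 1: {1}, 2: set()}   # accessibility relation R
    val_p = {0: False, 1: True, 2: True}     # valuation of the atom p

    def box(phi):
        # 'Necessarily phi': phi holds in every accessible world.
        return lambda w: all(phi(v) for v in access[w])

    def dia(phi):
        # 'Possibly phi': phi holds in some accessible world.
        return lambda w: any(phi(v) for v in access[w])

    p = lambda w: val_p[w]
    print([w for w in sorted(worlds) if box(p)(w)])  # [0, 1, 2]; world 2 vacuously
    print([w for w in sorted(worlds) if dia(p)(w)])  # [0, 1]

Properties of the embedding (for instance, that reflexivity of R validates the T axiom) then become ordinary quantified statements, which is what makes off-the-shelf higher-order provers applicable.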
Alternating-Offers Protocol for Multi-issue Bilateral Negotiation in Semantic-Enabled Marketplaces
NASA Astrophysics Data System (ADS)
Ragone, Azzurra; di Noia, Tommaso; di Sciascio, Eugenio; Donini, Francesco M.
We present a semantic-based approach to multi-issue bilateral negotiation for e-commerce. We use Description Logics to model advertisements, and relations among issues as axioms in a TBox. We then introduce a logic-based alternating-offers protocol, able to handle conflicting information, that merges non-standard reasoning services in Description Logics with utility theory to find the most suitable agreements. We illustrate and motivate the theoretical framework, the logical language, and the negotiation protocol.
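The Description Logic machinery resists a short sketch, but the alternating-offers skeleton itself is compact. A minimal loop over candidate agreements, with invented utilities, a fixed acceptance threshold, and a per-round discount (none of which come from the paper):

    # Sketch of an alternating-offers loop over a finite set of candidate agreements.
    # Each tuple: (agreement name, buyer utility, seller utility); values invented.
    offers = [("fast_shipping", 0.9, 0.3), ("standard", 0.6, 0.6), ("bulk", 0.2, 0.9)]
    threshold = 0.5  # an agent accepts any offer worth at least this, after discount
    discount = 0.9   # utilities decay each round, pushing both sides to agree

    def negotiate(offers, threshold, discount):
        d = 1.0
        for rnd, (name, u_buyer, u_seller) in enumerate(offers):
            proposer_is_buyer = rnd % 2 == 0        # agents alternate as proposer
            responder_utility = (u_seller if proposer_is_buyer else u_buyer) * d
            if responder_utility >= threshold:
                return name, rnd                    # responder accepts
            d *= discount
        return None, len(offers)                    # no agreement reached

    print(negotiate(offers, threshold, discount))   # ('standard', 1)

In the paper's setting, the candidate agreements and their utilities would instead be derived from TBox reasoning over the advertisements.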
Framework for analysis of guaranteed QOS systems
NASA Astrophysics Data System (ADS)
Chaudhry, Shailender; Choudhary, Alok
1997-01-01
Multimedia data is isochronous in nature and entails managing and delivering high volumes of data. Multiprocessors, with their large processing power, vast memory, and fast interconnects, are an ideal candidate for the implementation of multimedia applications. Initially, multiprocessors were designed to execute scientific programs, and thus their architecture was optimized to provide low message latency and to efficiently support regular communication patterns. Hence, they have a regular network topology and most use wormhole routing. The design offers the benefits of a simple router, small buffer size, and network latency that is almost independent of path length. Among the various multimedia applications, a video on demand (VOD) server is well-suited for implementation using parallel multiprocessors. Logical models for VOD servers are presently mapped onto multiprocessors. Our paper provides a framework for calculating bounds on utilization of system resources with which QoS parameters for each isochronous stream can be guaranteed. Effects of the architecture of multiprocessors, and the efficiency of various logical models and mappings on particular architectures, can be investigated within our framework. Our framework is based on rigorous proofs and provides tight bounds. The results obtained may be used as the basis for admission control tests. To illustrate the versatility of our framework, we provide bounds on utilization for various logical models applied to mesh-connected architectures for a video on demand server. Our results show that wormhole routing can lead to packets waiting for transmission of other packets that apparently share no common resources, a situation analogous to head-of-the-line blocking. We find that the provision of multiple virtual channels (VCs) per link and multiple flit buffers improves utilization (even under guaranteed QoS parameters); this is analogous to parallel iterative matching.
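In practice such bounds drive an admission-control test: a new stream is admitted only if every resource along its path stays within its guaranteed-utilization bound. A schematic version of that test (the normalized bounds are placeholders, not the paper's derived constants):

    # Schematic admission-control test for isochronous streams on shared links.
    link_capacity = {("n0", "n1"): 1.0, ("n1", "n2"): 1.0}  # utilization bounds
    admitted = []  # (path, utilization) for streams already accepted

    def admit(path, utilization):
        # Admit a stream iff every link on its path stays within its bound.
        for link in path:
            load = sum(u for p, u in admitted if link in p)
            if load + utilization > link_capacity[link]:
                return False
        admitted.append((path, utilization))
        return True

    print(admit([("n0", "n1"), ("n1", "n2")], 0.6))  # True
    print(admit([("n0", "n1")], 0.5))                # False: n0-n1 would reach 1.1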
Reavley, Nicola; Livingston, Jenni; Buchbinder, Rachelle; Bennell, Kim; Stecki, Chris; Osborne, Richard Harry
2010-02-01
Despite demands for evidence-based research and practice, little attention has been given to systematic approaches to the development of complex interventions to tackle workplace health problems. This paper outlines an approach to the initial stages of a workplace program development which integrates health promotion and disease management. The approach commences with systematic and genuine processes of obtaining information from key stakeholders with broad experience of these interventions. This information is constructed into a program framework in which practice-based and research-informed elements are both valued. We used this approach to develop a workplace education program to reduce the onset and impact of a common chronic disease - osteoarthritis. To gain information systematically at a national level, a structured concept mapping workshop with 47 participants from across Australia was undertaken. Participants were selected to maximise the whole-of-workplace perspective and included health education providers, academics, clinicians and policymakers. Participants generated statements in response to a seeding statement: Thinking as broadly as possible, what changes in education and support should occur in the workplace to help in the prevention and management of arthritis? Participants grouped the resulting statements into conceptually coherent groups and a computer program was used to generate a 'cluster map' along with a list of statements sorted according to cluster membership. In combination with research-based evidence, the concept map informed the development of a program logic model incorporating the program's guiding principles, possible service providers, services, training modes, program elements and the causal processes by which participants might benefit. The program logic model components were further validated through research findings from diverse fields, including health education, coaching, organisational learning, workplace interventions, workforce development and osteoarthritis disability prevention. In summary, wide and genuine consultation, concept mapping, and evidence-based program logic development were integrated to develop a whole-of-system complex intervention in which potential effectiveness and assimilation into the workplace were optimised. Copyright 2009 Elsevier Ltd. All rights reserved.
A Logic Programming Testbed for Inductive Thought and Specification.
ERIC Educational Resources Information Center
Neff, Norman D.
This paper describes applications of logic programming technology to the teaching of the inductive method in computer science and mathematics. It discusses the nature of inductive thought and its place in those fields of inquiry, arguing that a complete logic programming system for supporting inductive inference is not only feasible but necessary.…
Hayes, Holly; Parchman, Michael L.; Howard, Ray
2012-01-01
Evaluating effective growth and development of a Practice-Based Research Network (PBRN) can be challenging. The purpose of this article is to describe the development of a logic model and how the framework has been used for planning and evaluation in a primary care PBRN. An evaluation team was formed consisting of the PBRN directors, staff and its board members. After the mission and the target audience were determined, facilitated meetings and discussions were held with stakeholders to identify the assumptions, inputs, activities, outputs, outcomes and outcome indicators. The long-term outcomes outlined in the final logic model are two-fold: 1.) Improved health outcomes of patients served by PBRN community clinicians; and 2.) Community clinicians are recognized leaders of quality research projects. The Logic Model proved useful in identifying stakeholder interests and dissemination activities as an area that required more attention in the PBRN. The logic model approach is a useful planning tool and project management resource that increases the probability that the PBRN mission will be successfully implemented. PMID:21900441
Detection of epistatic effects with logic regression and a classical linear regression model.
Malina, Magdalena; Ickstadt, Katja; Schwender, Holger; Posch, Martin; Bogdan, Małgorzata
2014-02-01
To locate multiple interacting quantitative trait loci (QTL) influencing a trait of interest within experimental populations, methods such as Cockerham's model are usually applied. Within this framework, interactions are understood as the part of the joint effect of several genes that cannot be explained as the sum of their additive effects. However, if a change in the phenotype (such as disease) is caused by Boolean combinations of genotypes of several QTLs, the Cockerham approach is often unable to identify them properly. To detect such interactions more efficiently, we propose a logic regression framework. Even though the logic regression approach requires a larger number of models to be considered (and thus a more stringent multiple testing correction), the efficient representation of higher-order logic interactions in logic regression models leads to a significant increase in power to detect such interactions compared with the Cockerham approach. The increase in power is demonstrated analytically for a simple two-way interaction model and illustrated in more complex settings with a simulation study and real data analysis.
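The core move, searching for a Boolean combination of binary genotype indicators that explains the response, can be imitated with a brute-force search over small logic expressions. Real logic regression explores logic trees with simulated annealing; the sketch below, on simulated data, only enumerates two-variable AND clauses with optional negation:

    import itertools, random
    random.seed(0)

    # Simulated binary SNP indicators; phenotype driven by the rule X0 AND (NOT X2),
    # with 10% label noise.
    n, p = 500, 4
    X = [[random.randint(0, 1) for _ in range(p)] for _ in range(n)]
    y = [int(row[0] and not row[2]) if random.random() < 0.9 else random.randint(0, 1)
         for row in X]

    def score(pred):
        # Agreement between a candidate logic predictor and the response.
        return sum(pred(row) == yi for row, yi in zip(X, y)) / n

    # Enumerate clauses (xi [negated?]) AND (xj [negated?]) and keep the best one.
    best = max(
        ((i, si, j, sj) for i, j in itertools.combinations(range(p), 2)
         for si in (0, 1) for sj in (0, 1)),
        key=lambda t: score(lambda r: (r[t[0]] ^ t[1]) and (r[t[2]] ^ t[3])),
    )
    print("best AND-clause (var, negated, var, negated):", best)  # (0, 0, 2, 1)

In the full method, the selected logic expression is embedded as a covariate in a generalized linear model, which is what allows binary, numeric, and time-to-event outcomes.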
Ratcliffe, Michelle M
2012-08-01
Farm to School programs hold promise to address childhood obesity. These programs may increase students’ access to healthier foods, increase students’ knowledge of and desire to eat these foods, and increase their consumption of them. Implementing Farm to School programs requires the involvement of multiple people, including nutrition services, educators, and food producers. Because these groups have not traditionally worked together and each has different goals, it is important to demonstrate how Farm to School programs that are designed to decrease childhood obesity may also address others’ objectives, such as academic achievement and economic development. A logic model is an effective tool to help articulate a shared vision for how Farm to School programs may work to accomplish multiple goals. Furthermore, there is evidence that programs based on theory are more likely to be effective at changing individuals’ behaviors. Logic models based on theory may help to explain how a program works, aid in efficient and sustained implementation, and support the development of a coherent evaluation plan. This article presents a sample theory-based logic model for Farm to School programs. The presented logic model is informed by the polytheoretical model for food and garden-based education in school settings (PMFGBE). The logic model has been applied to multiple settings, including Farm to School program development and evaluation in urban and rural school districts. This article also includes a brief discussion on the development of the PMFGBE, a detailed explanation of how Farm to School programs may enhance the curricular, physical, and social learning environments of schools, and suggestions for the applicability of the logic model for practitioners, researchers, and policy makers.
Procedural and Logic Programming: A Comparison.
ERIC Educational Resources Information Center
Watkins, Will; And Others
1988-01-01
Examines the similarities and fundamental differences between procedural programming and logic programming by comparing LogoWriter and PROLOG. Suggests that PROLOG may be a good first programming language for students to learn. (MVL)
The Effects of Learning a Computer Programming Language on the Logical Reasoning of School Children.
ERIC Educational Resources Information Center
Seidman, Robert H.
The research reported in this paper explores the syntactical and semantic link between computer programming statements and logical principles, and addresses the effects of learning a programming language on logical reasoning ability. Fifth grade students in a public school in Syracuse, New York, were randomly selected as subjects, and then…
ERIC Educational Resources Information Center
Martin, Ian; Carey, John
2014-01-01
A logic model was developed based on an analysis of the 2012 American School Counselor Association (ASCA) National Model in order to provide direction for program evaluation initiatives. The logic model identified three outcomes (increased student achievement/gap reduction, increased school counseling program resources, and systemic change and…
Jaegers, Lisa; Dale, Ann Marie; Weaver, Nancy; Buchholz, Bryan; Welch, Laura; Evanoff, Bradley
2014-03-01
Intervention studies in participatory ergonomics (PE) are often difficult to interpret due to limited descriptions of program planning and evaluation. In an ongoing PE program with floor layers, we developed a logic model to describe our program plan, and process and summative evaluations designed to describe the efficacy of the program. The logic model was a useful tool for describing the program elements and subsequent modifications. The process evaluation measured how well the program was delivered as intended, and revealed the need for program modifications. The summative evaluation provided early measures of the efficacy of the program as delivered. Inadequate information on program delivery may lead to erroneous conclusions about intervention efficacy due to Type III error. A logic model guided the delivery and evaluation of our intervention and provides useful information to aid interpretation of results. © 2013 Wiley Periodicals, Inc.
Parsing with logical variables (logic-based programming systems)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finin, T.W.; Stone Palmer, M.
1983-01-01
Logic-based programming systems have enjoyed an increasing popularity in applied AI work in the last few years. One of the contributions to computational linguistics made by the logic programming paradigm has been the definite clause grammar (DCG). In comparing DCGs with previous parsing mechanisms such as ATNs, certain clear advantages are seen. The authors feel that the most important of these advantages are due to the use of logical variables with unification as the fundamental operation on them. To illustrate the power of the logical variable, they have implemented an experimental ATN system which treats ATN registers as logical variables and provides a unification operation over them. They aim to simultaneously encourage the use of the powerful mechanisms available in DCGs and demonstrate that some of these techniques can be captured without reference to a resolution theorem prover. 14 references.
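What makes the logical variable powerful is unification: two terms are matched by consistently binding variables on either side. A minimal unifier over nested tuples, with '?x'-style strings as variables (a generic sketch, not the authors' ATN implementation, and without an occurs check):

    # Minimal syntactic unification; '?'-prefixed strings are logical variables.
    def walk(t, subst):
        while isinstance(t, str) and t.startswith("?") and t in subst:
            t = subst[t]
        return t

    def unify(a, b, subst=None):
        subst = dict(subst or {})
        a, b = walk(a, subst), walk(b, subst)
        if a == b:
            return subst
        if isinstance(a, str) and a.startswith("?"):
            return {**subst, a: b}
        if isinstance(b, str) and b.startswith("?"):
            return {**subst, b: a}
        if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
            for x, y in zip(a, b):
                subst = unify(x, y, subst)
                if subst is None:
                    return None
            return subst
        return None  # clash: distinct constants or different arities

    # Two register contents unify, binding each shared variable exactly once.
    print(unify(("np", "?agr", "dog"), ("np", "singular", "?head")))
    # {'?agr': 'singular', '?head': 'dog'}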
Improving the use of health data for health system strengthening.
Nutley, Tara; Reynolds, Heidi W
2013-02-13
Good quality and timely data from health information systems are the foundation of all health systems. However, too often data sit in reports, on shelves, or in databases and are not sufficiently utilised in policy and program development, improvement, strategic planning, and advocacy. Without specific interventions aimed at improving the use of data produced by information systems, health systems will never fully be able to meet the needs of the populations they serve. The objective here is to employ a logic model to describe a pathway of how specific activities and interventions can strengthen the use of health data in decision making and ultimately strengthen the health system. A logic model was developed to provide a practical strategy for developing, monitoring and evaluating interventions to strengthen the use of data in decision making. The model draws on the collective strengths and similarities of previous work and adds to those previous works by making specific recommendations about interventions and activities that are most proximate to affect the use of data in decision making. The model provides an organizing framework for how interventions and activities work to strengthen the systematic demand, synthesis, review, and use of data. The logic model and guidance are presented to facilitate its widespread use and to enable improved data-informed decision making in program review and planning, advocacy, and policy development. Real world examples from the literature support the feasible application of the activities outlined in the model. The logic model provides specific and comprehensive guidance to improve data demand and use. It can be used to design, monitor and evaluate interventions, and to improve demand for, and use of, data in decision making. As more interventions are implemented to improve use of health data, those efforts need to be evaluated.
NASA Technical Reports Server (NTRS)
1984-01-01
The work breakdown structure (WBS) for the Space Platform Expendables Resupply Concept Definition Study is described. The WBS consists of a list of WBS elements, a dictionary of element definitions, and an element logic diagram. The list and logic diagram identify the interrelationships of the elements. The dictionary defines the types of work that may be represented by or be classified under each specific element. The Space Platform Expendable Resupply WBS was selected mainly to support the program planning, scheduling, and costing performed in the programmatics task (task 3). The WBS is neither a statement-of-work nor a work authorization document. Rather, it is a framework around which to define requirements, plan effort, assign responsibilities, allocate and control resources, and report progress, expenditures, technical performance, and schedule performance. The WBS element definitions are independent of make-or-buy decisions, organizational structure, and activity locations unless exceptions are specifically stated.
The engineering of cybernetic systems
NASA Astrophysics Data System (ADS)
Fry, Robert L.
2002-05-01
This tutorial develops a logical basis for the engineering of systems that operate cybernetically. The term cybernetic system has a clear quantitative definition. It is a system that dynamically matches acquired information to selected actions relative to a computational issue that defines the essential purpose of the system or machine. This notion requires that information and control be further quantified. The logic of questions and assertions as developed by Cox provides one means of doing this. The design and operation of cybernetic systems can be understood by contrasting these kinds of systems with communication systems and information theory as developed by Shannon. The joint logic of questions and assertions can be seen to underlie and be common to both information theory as applied to the design of discrete communication systems and to a theory of discrete general systems. The joint logic captures a natural complementarity between systems that transmit and receive information and those that acquire and act on it. Specific comparisons and contrasts are made between the source rate and channel capacity of a communication system and the acquisition rate and control capacity of a general system. An overview is provided of the joint logic of questions and assertions and the ties that this logic has to both conventional information theory and to a general theory of systems. I-diagrams, the interrogative complement of Venn diagrams, are described as providing valuable reasoning tools. An initial framework is suggested for the design of cybernetic systems. Two examples are given to illustrate this framework as applied to discrete cybernetic systems. These examples include a predator-prey problem as illustrated through "The Dog Chrysippus Pursuing its Prey," and the derivation of a single-neuron system that operates cybernetically and is biologically plausible. Future areas of research are highlighted which require development for a mature engineering framework.
Runtime Verification of C Programs
NASA Technical Reports Server (NTRS)
Havelund, Klaus
2008-01-01
We present in this paper a framework, RMOR, for monitoring the execution of C programs against state machines, expressed in a textual (nongraphical) format in files separate from the program. The state machine language has been inspired by a graphical state machine language RCAT recently developed at the Jet Propulsion Laboratory, as an alternative to using Linear Temporal Logic (LTL) for requirements capture. Transitions between states are labeled with abstract event names and Boolean expressions over such. The abstract events are connected to code fragments using an aspect-oriented pointcut language similar to ASPECTJ's or ASPECTC's pointcut language. The system is implemented in the C analysis and transformation package CIL, and is programmed in OCAML, the implementation language of CIL. The work is closely related to the notion of stateful aspects within aspect-oriented programming, where pointcut languages are extended with temporal assertions over the execution trace.
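Stripped of the C/CIL tooling, the monitoring core is small: abstract events drive a state machine, and any undefined transition is a violation. A schematic monitor in Python (the event names and machine are invented, not RMOR's input format):

    # Schematic runtime monitor: a state machine driven by an abstract event stream.
    TRANSITIONS = {
        ("closed", "open"): "opened",
        ("opened", "write"): "opened",
        ("opened", "close"): "closed",
    }

    def monitor(events, state="closed"):
        for i, e in enumerate(events):
            if (state, e) not in TRANSITIONS:
                return f"violation at event {i}: '{e}' in state '{state}'"
            state = TRANSITIONS[(state, e)]
        return "trace accepted"

    print(monitor(["open", "write", "close"]))  # trace accepted
    print(monitor(["open", "close", "write"]))  # violation: write after close

In RMOR, the connection from such abstract events to concrete code points is made by aspect-oriented pointcuts rather than by hand.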
Logical Modeling and Dynamical Analysis of Cellular Networks
Abou-Jaoudé, Wassim; Traynard, Pauline; Monteiro, Pedro T.; Saez-Rodriguez, Julio; Helikar, Tomáš; Thieffry, Denis; Chaouiya, Claudine
2016-01-01
The logical (or logic) formalism is increasingly used to model regulatory and signaling networks. Complementing these applications, several groups contributed various methods and tools to support the definition and analysis of logical models. After an introduction to the logical modeling framework and to several of its variants, we review here a number of recent methodological advances to ease the analysis of large and intricate networks. In particular, we survey approaches to determine model attractors and their reachability properties, to assess the dynamical impact of variations of external signals, and to consistently reduce large models. To illustrate these developments, we further consider several published logical models for two important biological processes, namely the differentiation of T helper cells and the control of mammalian cell cycle. PMID:27303434
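For the synchronous Boolean special case of the formalism, attractors can be found by iterating the update function from every state until the trajectory revisits itself. A toy three-gene network in Python (the update rules are invented, not one of the reviewed models):

    import itertools

    # Toy synchronous Boolean network over genes (a, b, c); rules are invented.
    def step(s):
        a, b, c = s
        return (b, a, not c)

    def attractor_from(s):
        # Follow the deterministic trajectory until a state repeats;
        # the states from the first repeat onward form the attractor cycle.
        seen = {}
        while s not in seen:
            seen[s] = len(seen)
            s = step(s)
        return frozenset(t for t, i in seen.items() if i >= seen[s])

    attractors = {attractor_from(s)
                  for s in itertools.product((False, True), repeat=3)}
    print(len(attractors), "attractors")
    for a in attractors:
        print(sorted(a))

The surveyed methods replace this brute-force walk with symbolic or SAT-based analysis precisely because the state space grows as 2^n.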
Fuzzy branching temporal logic.
Moon, Seong-ick; Lee, Kwang H; Lee, Doheon
2004-04-01
Intelligent systems require a systematic way to represent and handle temporal information containing uncertainty. In particular, a logical framework is needed that can represent uncertain temporal information and its relationships with logical formulae. Fuzzy linear temporal logic (FLTL), a generalization of propositional linear temporal logic (PLTL) with fuzzy temporal events and fuzzy temporal states defined on a linear time model, was previously proposed for this purpose. However, many systems are best represented by branching time models in which each state can have more than one possible future path. In this paper, fuzzy branching temporal logic (FBTL) is proposed to address this problem. FBTL adopts and generalizes computation tree logic (CTL*), which is a classical branching temporal logic. The temporal model of FBTL is capable of representing fuzzy temporal events and fuzzy temporal states, and the order relation among them is represented as a directed graph. The utility of FBTL is demonstrated using a fuzzy job shop scheduling problem as an example.
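The flavor of branching-time evaluation over fuzzy states can be suggested with fuzzy reachability: the degree to which "on some path, eventually p" holds is a max/min fixed point over the transition graph. This sketch is an informal illustration of that flavor, not FBTL's actual semantics:

    # Fuzzy reachability sketch: degree of 'possibly eventually p' per state.
    succ = {"s0": ["s1", "s2"], "s1": ["s1"], "s2": ["s0"]}
    p = {"s0": 0.1, "s1": 0.8, "s2": 0.4}  # fuzzy truth of the atom p

    v = dict(p)
    changed = True
    while changed:  # fixed-point iteration; converges on a finite graph
        changed = False
        for s in succ:
            new = max(p[s], max(v[t] for t in succ[s]))
            if new > v[s]:
                v[s], changed = new, True
    print(v)  # {'s0': 0.8, 's1': 0.8, 's2': 0.8}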
Bamberger, Michael; Tarsilla, Michele; Hesse-Biber, Sharlene
2016-04-01
Many widely-used impact evaluation designs, including randomized control trials (RCTs) and quasi-experimental designs (QEDs), frequently fail to detect what are often quite serious unintended consequences (UCs) of development programs. This seems surprising as experienced planners and evaluators are well aware that unintended consequences frequently occur. Most evaluation designs are intended to determine whether there is credible evidence (statistical, theory-based or narrative) that programs have achieved their intended objectives, and the logic of many evaluation designs, even those that are considered the most "rigorous," does not permit the identification of outcomes that were not specified in the program design. We take the example of RCTs as they are considered by many to be the most rigorous evaluation designs. We present a number of cases to illustrate how infusing RCTs with a mixed-methods approach (sometimes called an "RCT+" design) can strengthen the credibility of these designs and can also capture important unintended consequences. We provide a Mixed Methods Evaluation Framework that identifies nine ways in which UCs can occur, and we apply this framework to two of the case studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
Logic via Computer Programming.
ERIC Educational Resources Information Center
Wieschenberg, Agnes A.
This paper poses the question "How do we teach logical thinking and sophisticated mathematics to unsophisticated college students?" One answer among many is through the writing of computer programs. The writing of computer algorithms is mathematical problem solving and logic in disguise, and it may attract students who would otherwise stop…
Cullen, Patricia; Clapham, Kathleen; Byrne, Jake; Hunter, Kate; Senserrick, Teresa; Keay, Lisa; Ivers, Rebecca
2016-08-01
Evidence indicates that Aboriginal people are underrepresented among driver licence holders in New South Wales, which has been attributed to licensing barriers for Aboriginal people. The Driving Change program was developed to provide culturally responsive licensing services that engage Aboriginal communities and build local capacity. This paper outlines the formative evaluation of the program, including logic model construction and exploration of contextual factors. Purposive sampling was used to identify key informants (n=12) from a consultative committee of key stakeholders and program staff. Semi-structured interviews were transcribed and thematically analysed. Data from interviews informed development of the logic model. Participants demonstrated a high level of support for the program and reported that it filled an important gap. The program context revealed systemic barriers to licensing that were correspondingly targeted by specific program outputs in the logic model. Addressing underlying assumptions of the program involved managing local capacity and support to strengthen implementation. This formative evaluation highlights the importance of exploring program context as a crucial first step in logic model construction. The consultation process assisted in clarifying program goals and ensuring that the program was responding to underlying systemic factors that contribute to inequitable licensing access for Aboriginal people. Copyright © 2016 Elsevier Ltd. All rights reserved.
C code generation from Petri-net-based logic controller specification
NASA Astrophysics Data System (ADS)
Grobelny, Michał; Grobelna, Iwona; Karatkevich, Andrei
2017-08-01
The article focuses on the programming of logic controllers. It is important that the program code of a logic controller executes flawlessly according to its primary specification. In the presented approach, we generate C code for an AVR microcontroller from a rule-based logical model of a control process derived from a control-interpreted Petri net. The same logical model is also used for formal verification of the specification by means of the model checking technique. The proposed rule-based logical model and formal rules of transformation ensure that the obtained implementation is consistent with the already verified specification. The approach is validated by practical experiments.
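The shape of such a rule-based logical model is easy to suggest: each rule fires a transition when its input places are marked and its guard over the controller inputs holds. A Python sketch with an invented two-state net (the paper's target is C on an AVR, where each rule would become a guarded if-statement):

    # Sketch of a rule-based logical model derived from a Petri net; invented example.
    marking = {"idle": True, "running": False}
    rules = [
        {"pre": ["idle"], "guard": lambda inp: inp["start"],
         "clear": ["idle"], "set": ["running"]},
        {"pre": ["running"], "guard": lambda inp: inp["stop"],
         "clear": ["running"], "set": ["idle"]},
    ]

    def step(inputs):
        for r in rules:
            if all(marking[p] for p in r["pre"]) and r["guard"](inputs):
                for p in r["clear"]:
                    marking[p] = False
                for p in r["set"]:
                    marking[p] = True
                break  # one transition per scan cycle, as generated code might do

    step({"start": True, "stop": False})
    print(marking)  # {'idle': False, 'running': True}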
NASA Technical Reports Server (NTRS)
Burgin, G. H.; Fogel, L. J.; Phelps, J. P.
1975-01-01
A technique for computer simulation of air combat is described. Volume 1 describes the computer program and its development in general terms. Two versions of the program exist. Both incorporate a logic for selecting and executing air combat maneuvers with performance models of specific fighter aircraft. In the batch processing version, the flight paths of two aircraft engaged in interactive aerial combat and controlled by the same logic are computed. The real-time version permits human pilots to fly air-to-air combat against the adaptive maneuvering logic (AML) in the Langley Differential Maneuvering Simulator (DMS). Volume 2 consists of a detailed description of the computer programs.
Program theory-driven evaluation science in a youth development context.
Deane, Kelsey L; Harré, Niki
2014-08-01
Program theory-driven evaluation science (PTDES) provides a useful framework for uncovering the mechanisms responsible for positive change resulting from participation in youth development (YD) programs. Yet it is difficult to find examples of PTDES that capture the complexity of such experiences. This article offers a much-needed example of PTDES applied to Project K, a youth development program with adventure, service-learning and mentoring components. Findings from eight program staff focus groups, 351 youth participants' comments, four key program documents, and results from six previous Project K research projects were integrated to produce a theory of change for the program. A direct logic analysis was then conducted to assess the plausibility of the proposed theory against relevant research literature. This demonstrated that Project K incorporates many of the best practice principles discussed in the literature that covers the three components of the program. The contributions of this theory-building process to organizational learning and development are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.
The SITE Program funded a field demonstration to evaluate the Eco Logic Gas-Phase Chemical Reduction Process developed by ELI Eco Logic International Inc. (ELI), Ontario, Canada. The Demonstration took place at the Middleground Landfill in Bay City, Michigan using landfill wa...
The Application of Logic Programming to Communication Education.
ERIC Educational Resources Information Center
Sanford, David L.
Recommending that communication students be required to learn to use computers not merely as number crunchers, word processors, data bases, and graphics generators, but also as logical inference makers, this paper examines the recently developed technology of logic programming in computer languages. It presents two syllogisms and shows how they…
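The truncated abstract evidently goes on to work through the syllogisms; their logic-programming treatment is easy to reconstruct in spirit. A hedged sketch of the classic "Socrates" syllogism as forward chaining, rendered in Python rather than a logic programming language since the paper's own examples are not shown:

    # Forward chaining over one Horn rule: 'all men are mortal'.
    facts = {("man", "socrates")}
    rules = [(("man", "?x"), ("mortal", "?x"))]  # premise pattern -> conclusion

    changed = True
    while changed:
        changed = False
        for (pred, _var), (cpred, _cvar) in rules:
            for f in list(facts):
                if f[0] == pred:
                    derived = (cpred, f[1])  # bind ?x to the fact's argument
                    if derived not in facts:
                        facts.add(derived)
                        changed = True
    print(("mortal", "socrates") in facts)  # True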
77 FR 35107 - Petition for Waiver of Compliance
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-12
... devices. CSX requests relief from 49 CFR 236.109 as it applies to variable timers within the program logic... program logic of the operating software. However, CSX notes that some microprocessor-based equipment have.../check sum/universal control number of the existing location specific application logic to the previously...
A logic-based method for integer programming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hooker, J.; Natraj, N.R.
1994-12-31
We propose a logic-based approach to integer programming that replaces traditional branch-and-cut techniques with logical analogs. Integer variables are regarded as atomic propositions. The constraints give rise to logical formulas that are analogous to separating cuts. No continuous relaxation is used. Rather, the cuts are selected so that they can be easily solved as a discrete relaxation. (In fact, defining a relaxation and generating cuts are best seen as the same problem.) We experiment with relaxations that have a k-tree structure and can be solved by nonserial dynamic programming. We also present logic-based analogs of facet-defining cuts, Chvátal rank, etc. We conclude with some preliminary computational results.
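One concrete logical analog of a cut is the cover clause of a knapsack constraint: any subset of 0-1 variables whose coefficients alone exceed the right-hand side cannot all be 1 at once. A small enumeration sketch (the paper's k-tree relaxations and dynamic programming are beyond a few lines):

    import itertools

    # From sum(a[i] * x[i]) <= b over 0-1 variables, derive minimal cover clauses:
    # if the coefficients of a subset S alone exceed b, the logical cut
    # 'at least one x[i] in S is 0' must hold.
    a, b = [6, 5, 4, 3], 10

    covers = []
    for r in range(2, len(a) + 1):
        for S in itertools.combinations(range(len(a)), r):
            if sum(a[i] for i in S) > b and not any(set(c) <= set(S) for c in covers):
                covers.append(S)
    print(covers)  # [(0, 1), (0, 2, 3), (1, 2, 3)]; e.g. x0 and x1 not both 1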
Qualitative models and experimental investigation of chaotic NOR gates and set/reset flip-flops
NASA Astrophysics Data System (ADS)
Rahman, Aminur; Jordan, Ian; Blackmore, Denis
2018-01-01
It has been observed through experiments and SPICE simulations that logical circuits based upon Chua's circuit exhibit complex dynamical behaviour. This behaviour can be used to design analogues of more complex logic families and some properties can be exploited for electronics applications. Some of these circuits have been modelled as systems of ordinary differential equations. However, as the number of components in newer circuits increases so does the complexity. This renders continuous dynamical systems models impractical and necessitates new modelling techniques. In recent years, some discrete dynamical models have been developed using various simplifying assumptions. To create a robust modelling framework for chaotic logical circuits, we developed both deterministic and stochastic discrete dynamical models, which exploit the natural recurrence behaviour, for two chaotic NOR gates and a chaotic set/reset flip-flop. This work presents a complete applied mathematical investigation of logical circuits. Experiments on our own designs of the above circuits are modelled and the models are rigorously analysed and simulated showing surprisingly close qualitative agreement with the experiments. Furthermore, the models are designed to accommodate dynamics of similarly designed circuits. This will allow researchers to develop ever more complex chaotic logical circuits with a simple modelling framework.
Ziviani, Jenny; Feeney, Rachel; Schabrun, Siobhan; Copland, David; Hodges, Paul
2014-08-01
The purpose of this study was to present the application of a logic model in depicting the underlying theory of an undergraduate research scheme for occupational therapy, physiotherapy, and speech pathology university students in Queensland, Australia. Data gathered from key written documents on the goals and intended operation of the research incubator scheme were used to create a draft (unverified) logic model. The major components of the logic model were inputs and resources, activities/outputs, and outcomes (immediate/learning, intermediate/action, and longer term/impacts). Although immediate and intermediate outcomes chiefly pertained to students' participation in honours programs, longer-term outcomes (impacts) concerned their subsequent participation in research higher-degree programs and engagement in research careers. Program logic provided an effective means of clarifying program objectives and the mechanisms by which the research incubator scheme was designed to achieve its intended outcomes. This model was developed as the basis for evaluation of the effectiveness of the scheme in achieving its stated goals.
Enrollment Logics and Discourses: Toward Developing an Enrollment Knowledge Framework
ERIC Educational Resources Information Center
Snowden, Monique L.
2013-01-01
This article brings attention to a typology of enrollment knowledge possessed and enacted by contemporary chief enrollment officers. Interview narratives are used to reveal enrollment principles and associated actions--enrollment logics--that form enrollment discourses, which in turn shape the institutionalized presence of strategic enrollment…
A hybrid nanomemristor/transistor logic circuit capable of self-programming
Borghetti, Julien; Li, Zhiyong; Straznicky, Joseph; Li, Xuema; Ohlberg, Douglas A. A.; Wu, Wei; Stewart, Duncan R.; Williams, R. Stanley
2009-01-01
Memristor crossbars were fabricated at 40 nm half-pitch, using nanoimprint lithography on the same substrate with Si metal-oxide-semiconductor field effect transistor (MOS FET) arrays to form fully integrated hybrid memory resistor (memristor)/transistor circuits. The digitally configured memristor crossbars were used to perform logic functions, to serve as a routing fabric for interconnecting the FETs and as the target for storing information. As an illustrative demonstration, the compound Boolean logic operation (A AND B) OR (C AND D) was performed with kilohertz frequency inputs, using resistor-based logic in a memristor crossbar with FET inverter/amplifier outputs. By routing the output signal of a logic operation back onto a target memristor inside the array, the crossbar was conditionally configured by setting the state of a nonvolatile switch. Such conditional programming illuminates the way for a variety of self-programmed logic arrays, and for electronic synaptic computing. PMID:19171903
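Abstracting the analog details away to ideal 0/1 switches, the demonstrated operation and the conditional-programming feedback can be caricatured in a few lines (a deliberately idealized sketch; the physical circuit computes with resistances and sense amplifiers, not Booleans):

    # Idealized sketch: crossbar switch states as 0/1, plus feedback programming.
    state = {"A": 1, "B": 1, "C": 0, "D": 1, "target": 0}  # 1 = low resistance

    # The demonstrated compound operation (A AND B) OR (C AND D).
    out = (state["A"] & state["B"]) | (state["C"] & state["D"])

    if out:  # route the result back to set a nonvolatile switch in the array
        state["target"] = 1
    print(out, state["target"])  # 1 1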
Conceptual Modeling via Logic Programming
1990-01-01
[OCR-damaged record; only fragments are recoverable.] Task outline fragments: Define User Interface and Query Language; Define Procedures for Specifying Output; Select Logic Programming Language; Develop Methodology for C3I Users. Citation fragment: "... for Conceptual Modeling Via Logic Programming. Marina del Rey, Calif."
ERIC Educational Resources Information Center
Hamilton, Jenny; Bronte-Tinkew, Jacinta
2007-01-01
A logic model, also called a conceptual model and theory-of-change model, is a visual representation of how a program is expected to "work." It relates resources, activities, and the intended changes or impacts that a program is expected to create. Typically, logic models are diagrams or flow charts with illustrations, text, and arrows that…
THRESHOLD LOGIC IN ARTIFICIAL INTELLIGENCE
COMPUTER LOGIC, ARTIFICIAL INTELLIGENCE, BIONICS, GEOMETRY, INPUT OUTPUT DEVICES, LINEAR PROGRAMMING, MATHEMATICAL LOGIC, MATHEMATICAL PREDICTION, NETWORKS, PATTERN RECOGNITION, PROBABILITY, SWITCHING CIRCUITS, SYNTHESIS
Evaluation of a Residential Mental Health Recovery Service in North Queensland.
Heyeres, Marion; Kinchin, Irina; Whatley, Elise; Brophy, Lisa; Jago, Jon; Wintzloff, Thomas; Morton, Steve; Mosby, Vinitta; Gopalkrishnan, Narayan; Tsey, Komla
2018-01-01
Evidence shows that subacute mental health recovery occurs best when a person remains active within the community and fulfils meaningful and satisfying roles of their choosing. Several residential care services that incorporate these values have been established in Australia and overseas. This study describes (a) the development of an evaluation framework for a new subacute residential mental health recovery service in regional Australia and (b) the outcomes of the formative evaluation. Continuous quality improvement and participatory research approaches informed all stages of the development of the evaluation framework. A program logic was established and subsequently tested for practicability. The resultant logic utilizes the Scottish Recovery Indicator 2 (SRI 2) service development tool, Individual Recovery Plans (IRPs), and the impact assessment of the service on psychiatric inpatient admissions (reported separately). Service strengths included a recovery-focused practice that identifies and addresses the basic needs of residents (consumers). The consumers of the service were encouraged to develop their own goals and self-manage their recovery plans. The staff of the service were identified as working effectively in the context of the recovery process; the staff were seen as supported and valued. Areas for improvement included more opportunities for self-management for residents and more feedback from residents and carers.
Fuzzy Traffic Control with Vehicle-to-Everything Communication.
Salman, Muntaser A; Ozdemir, Suat; Celebi, Fatih V
2018-01-27
Traffic signal control (TSC) with vehicle-to-everything (V2X) communication can be a very efficient solution to the traffic congestion problem. The ratio of vehicles equipped with V2X communication capability to the total number of vehicles in the traffic (called the penetration rate, PR) is still low, so V2X-based TSC systems need to be supported by some other mechanisms. PR is the major factor that affects the quality of the TSC process, along with the evaluation interval. Quality of the TSC in each direction is a function of the overall TSC quality of an intersection; hence, quality evaluation of each direction should follow the evaluation of the overall intersection. Computational intelligence, more specifically a swarm algorithm, has recently been used in this field in a European Framework Program FP7 supported project called COLOMBO. In this paper, using the COLOMBO framework, further investigations have been done and two new methodologies using simple and fuzzy logic have been proposed. To evaluate the performance of our proposed methods, a comparison with COLOMBO's approach has been realized. The results reveal that the TSC problem can be solved as a logical problem rather than an optimization problem. Performance of the proposed approaches is good enough to be suggested for future work under realistic scenarios, even under low PR.
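A single fuzzy rule of the kind such controllers use is easy to sketch. The membership breakpoints and the rule below are invented for illustration and are not COLOMBO's algorithm:

    # Fuzzy green-extension sketch: queue length and vehicle gap -> extend green?
    def tri(x, lo, peak, hi):
        # Triangular membership function on [lo, hi] peaking at 'peak'.
        if x <= lo or x >= hi:
            return 0.0
        return (x - lo) / (peak - lo) if x < peak else (hi - x) / (hi - peak)

    def extension_seconds(queue, gap):
        long_queue = tri(queue, 4, 12, 20)   # vehicles waiting
        short_gap = tri(gap, 0.0, 1.0, 3.0)  # seconds between arrivals
        # Rule: IF queue is long AND gaps are short THEN extend green (up to 10 s).
        firing = min(long_queue, short_gap)  # fuzzy AND
        return 10.0 * firing

    print(extension_seconds(queue=10, gap=1.5))  # 7.5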
Mamouras, Konstantinos; Raghothaman, Mukund; Alur, Rajeev; Ives, Zachary G; Khanna, Sanjeev
2017-06-01
Real-time decision making in emerging IoT applications typically relies on computing quantitative summaries of large data streams in an efficient and incremental manner. To simplify the task of programming the desired logic, we propose StreamQRE, which provides natural and high-level constructs for processing streaming data. Our language has a novel integration of linguistic constructs from two distinct programming paradigms: streaming extensions of relational query languages and quantitative extensions of regular expressions. The former allows the programmer to employ relational constructs to partition the input data by keys and to integrate data streams from different sources, while the latter can be used to exploit the logical hierarchy in the input stream for modular specifications. We first present the core language with a small set of combinators, formal semantics, and a decidable type system. We then show how to express a number of common patterns with illustrative examples. Our compilation algorithm translates the high-level query into a streaming algorithm with precise complexity bounds on per-item processing time and total memory footprint. We also show how to integrate approximation algorithms into our framework. We report on an implementation in Java, and evaluate it with respect to existing high-performance engines for processing streaming data. Our experimental evaluation shows that (1) StreamQRE allows more natural and succinct specification of queries compared to existing frameworks, (2) the throughput of our implementation is higher than comparable systems (for example, two-to-four times greater than RxJava), and (3) the approximation algorithms supported by our implementation can lead to substantial memory savings.
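The performance claims rest on compiling each query to an incremental evaluator whose per-item work and memory do not grow with the stream. The flavor of such an evaluator, key-based partitioning plus constant-time aggregate updates, can be sketched briefly (this is not the StreamQRE Java API, just the evaluation strategy in miniature):

    from collections import defaultdict

    # Per-key running aggregates: O(1) work per item, O(#keys) memory,
    # and no buffering of the stream itself.
    state = defaultdict(lambda: {"n": 0, "sum": 0.0})

    def on_item(key, value):
        s = state[key]            # partition the stream by key
        s["n"] += 1               # incremental update
        s["sum"] += value
        return s["sum"] / s["n"]  # running mean for this key

    for key, value in [("patient1", 72), ("patient2", 80), ("patient1", 76)]:
        print(key, on_item(key, value))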
Response-Time Tests of Logical-Rule Models of Categorization
ERIC Educational Resources Information Center
Little, Daniel R.; Nosofsky, Robert M.; Denton, Stephen E.
2011-01-01
A recent resurgence in logical-rule theories of categorization has motivated the development of a class of models that predict not only choice probabilities but also categorization response times (RTs; Fific, Little, & Nosofsky, 2010). The new models combine mental-architecture and random-walk approaches within an integrated framework and…
Query Expansion and Query Translation as Logical Inference.
ERIC Educational Resources Information Center
Nie, Jian-Yun
2003-01-01
Examines query expansion during query translation in cross language information retrieval and develops a general framework for inferential information retrieval in two particular contexts: using fuzzy logic and probability theory. Obtains evaluation formulas that are shown to strongly correspond to those used in other information retrieval models.…
Logical Demonomy Among the Ewe in West Africa
ERIC Educational Resources Information Center
Dzobo, N. K.
1974-01-01
Examines the indigenous pattern of moral behavior among the Ewe, an ethnic and linguistic group in West Africa, and assesses its role in moral education within the African context. The author develops a conceptual framework he calls "logical demonomy" that he uses to define the Ewe system of moral laws. (JT)
Organizational Politics in Schools: Micro, Macro, and Logics of Action.
ERIC Educational Resources Information Center
Bacharach, Samuel B.; Mundell, Bryan L.
1993-01-01
Develops a framework for analyzing the politics of school organizations, affirming a Weberian perspective as most appropriate. Develops "logic of action" (the implicit relationship between means and goals) as the focal point of organizational politics. Underlines the importance of analyzing interest groups and their strategies. Political…
Johnson, Victoria A; Ronan, Kevin R; Johnston, David M; Peace, Robin
2016-11-01
A main weakness in the evaluation of disaster education programs for children is evaluators' propensity to judge program effectiveness based on changes in children's knowledge. Few studies have articulated an explicit program theory of how children's education would achieve desired outcomes and impacts related to disaster risk reduction in households and communities. This article describes the advantages of constructing program theory models for the purpose of evaluating disaster education programs for children. Following a review of some potential frameworks for program theory development, including the logic model, the program theory matrix, and the stage step model, the article provides working examples of these frameworks. The first example is the development of a program theory matrix used in an evaluation of ShakeOut, an earthquake drill practiced in two Washington State school districts. The model illustrates a theory of action; specifically, the effectiveness of school earthquake drills in preventing injuries and deaths during disasters. The second example is the development of a stage step model used for a process evaluation of What's the Plan Stan?, a voluntary teaching resource distributed to all New Zealand primary schools for curricular integration of disaster education. The model illustrates a theory of use; specifically, expanding the reach of disaster education for children through increased promotion of the resource. The process of developing the program theory models for the purpose of evaluation planning is discussed, as well as the advantages and shortcomings of the theory-based approaches. © 2015 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Sheehan, T.; Baker, B.; Degagne, R. S.
2015-12-01
With the abundance of data sources, analytical methods, and computer models, land managers are faced with the overwhelming task of making sense of a profusion of data of wildly different types. Luckily, fuzzy logic provides a method to work with different types of data using language-based propositions such as "the landscape is undisturbed," and a simple set of logic constructs. Just as many surveys allow different levels of agreement with a proposition, fuzzy logic allows values reflecting different levels of truth for a proposition. Truth levels fall within a continuum ranging from Fully True to Fully False. Hence a fuzzy logic model produces continuous results. The Environmental Evaluation Modeling System (EEMS) is a platform-independent, tree-based, fuzzy logic modeling framework. An EEMS model provides a transparent definition of an evaluation model and is commonly developed as a collaborative effort among managers, scientists, and GIS experts. Managers specify a set of evaluative propositions used to characterize the landscape. Scientists, working with managers, formulate functions that convert raw data values into truth values for the propositions and produce a logic tree to combine results into a single metric used to guide decisions. Managers, scientists, and GIS experts then work together to implement and iteratively tune the logic model and produce final results. We present examples of two successful EEMS projects that provided managers with map-based results suitable for guiding decisions: sensitivity and climate change exposure in Utah and the Colorado Plateau modeled for the Bureau of Land Management; and terrestrial ecological intactness in the Mojave and Sonoran region of southern California modeled for the Desert Renewable Energy Conservation Plan.
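An EEMS-style evaluation is just raw-to-truth conversion followed by a small logic tree. A sketch with invented thresholds, using [0, 1] truth values for simplicity (EEMS models are defined collaboratively and tuned; nothing here is from the projects described):

    # Sketch of a fuzzy logic tree: raw values -> truth values -> combined metric.
    def to_truth(x, false_at, true_at):
        # Linear map onto [0, 1]: 0 = fully false, 1 = fully true.
        t = (x - false_at) / (true_at - false_at)
        return max(0.0, min(1.0, t))

    # Proposition "the landscape is undisturbed" from two invented raw inputs.
    road_density = to_truth(0.4, false_at=2.0, true_at=0.0)     # km/km^2, less is better
    intact_cover = to_truth(78.0, false_at=20.0, true_at=90.0)  # percent cover

    undisturbed = min(road_density, intact_cover)  # fuzzy AND node of the tree
    print(round(undisturbed, 2))  # 0.8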
Program Theory Evaluation: Logic Analysis
ERIC Educational Resources Information Center
Brousselle, Astrid; Champagne, Francois
2011-01-01
Program theory evaluation, which has grown in use over the past 10 years, assesses whether a program is designed in such a way that it can achieve its intended outcomes. This article describes a particular type of program theory evaluation--logic analysis--that allows us to test the plausibility of a program's theory using scientific knowledge.…
Logic regression and its extensions.
Schwender, Holger; Ruczinski, Ingo
2010-01-01
Logic regression is an adaptive classification and regression procedure, initially developed to reveal interacting single nucleotide polymorphisms (SNPs) in genetic association studies. In general, this approach can be used in any setting with binary predictors, when the interaction of these covariates is of primary interest. Logic regression searches for Boolean (logic) combinations of binary variables that best explain the variability in the outcome variable, and thus, reveals variables and interactions that are associated with the response and/or have predictive capabilities. The logic expressions are embedded in a generalized linear regression framework, and thus, logic regression can handle a variety of outcome types, such as binary responses in case-control studies, numeric responses, and time-to-event data. In this chapter, we provide an introduction to the logic regression methodology, list some applications in public health and medicine, and summarize some of the direct extensions and modifications of logic regression that have been proposed in the literature. Copyright © 2010 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Sammis, Theodore W.; Shukla, Manoj K.; Mexal, John G.; Wang, Junming; Miller, David R.
2013-01-01
Universities develop strategic planning documents, and as part of that planning process, logic models are developed for specific programs within the university. This article examines the long-standing pecan program at New Mexico State University and the deficiencies and successes in the evolution of its logic model. The university's agricultural…
ERIC Educational Resources Information Center
Wang, Zhijun; Anderson, Terry; Chen, Li; Barbera, Elena
2017-01-01
Connectivist learning is interaction-centered learning. A framework describing interaction and cognitive engagement in connectivist learning was constructed using logical reasoning techniques. The framework and analysis was designed to help researchers and learning designers understand and adapt the characteristics and principles of interaction in…
Rodríguez, Daniela C; Peterson, Lauren A
2016-05-06
Factors that influence performance of community health workers (CHWs) delivering health services are not well understood. A recent logic model proposed categories of support from both health sector and communities that influence CHW performance and program outcomes. This logic model has been used to review a growth monitoring program delivered by CHWs in Honduras, known as Atención Integral a la Niñez en la Comunidad (AIN-C). A retrospective review of AIN-C was conducted through a document desk review and supplemented with in-depth interviews. Documents were systematically coded using the categories from the logic model, and gaps were addressed through interviews. Authors reviewed coded data for each category to analyze program details and outcomes as well as identify potential issues and gaps in the logic model. Categories from the logic model were inconsistently represented, with more information available for health sector than community. Context and input activities were not well documented. Information on health sector systems-level activities was available for governance but limited for other categories, while not much was found for community systems-level activities. Most available information focused on program-level activities with substantial data on technical support. Output, outcome, and impact data were drawn from various resources and suggest mixed results of AIN-C on indicators of interest. Assessing CHW performance through a desk review left gaps that could not be addressed about the relationship of activities and performance. There were critical characteristics of program design that made it contextually appropriate; however, it was difficult to identify clear links between AIN-C and malnutrition indicators. Regarding the logic model, several categories were too broad (e.g., technical support, context) and some aspects of AIN-C did not fit neatly in logic model categories (e.g., political commitment, equity, flexibility in implementation). The CHW performance logic model has potential as a tool for program planning and evaluation but would benefit from additional supporting tools and materials to facilitate and operationalize its use.
A Self-Paced Introductory Programming Course
ERIC Educational Resources Information Center
Gill, T. Grandon; Holton, Carolyn F.
2006-01-01
In this paper, a required introductory programming course being taught to MIS undergraduates using the C++ programming language is described. Two factors make the objectives of the course--which are to provide students with an exposure to the logical organization of the computer in addition to teaching them basic programming logic--particularly…
An Overview of the Runtime Verification Tool Java PathExplorer
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)
2002-01-01
We present an overview of the Java PathExplorer runtime verification tool, in short referred to as JPAX. JPAX can monitor the execution of a Java program and check that it conforms with a set of user provided properties formulated in temporal logic. JPAX can in addition analyze the program for concurrency errors such as deadlocks and data races. The concurrency analysis requires no user provided specification. The tool facilitates automated instrumentation of a program's bytecode, which when executed will emit an event stream, the execution trace, to an observer. The observer dispatches the incoming event stream to a set of observer processes, each performing a specialized analysis, such as the temporal logic verification, the deadlock analysis and the data race analysis. Temporal logic specifications can be formulated by the user in the Maude rewriting logic, where Maude is a high-speed rewriting system for equational logic, but here extended with executable temporal logic. The Maude rewriting engine is then activated as an event driven monitoring process. Alternatively, temporal specifications can be translated into efficient automata, which check the event stream. JPAX can be used during program testing to gain increased information about program executions, and can potentially furthermore be applied during operation to survey safety critical systems.
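The observer-style checking described above can be illustrated with a small hand-written monitor over an event stream. The property, event encoding, and checker below are illustrative only and are not JPAX's actual Maude-based machinery.

```python
# A minimal online monitor for a past-time safety property over an event
# stream, in the spirit of JPAX's observer architecture. JPAX itself
# evaluates user-supplied temporal-logic formulas via Maude or generated
# automata; this hand-written checker only illustrates the idea.

def monitor_lock_discipline(trace):
    """Check: a thread never releases a lock it has not acquired, and
    never re-acquires a lock it already holds (no re-entrancy assumed)."""
    held = set()  # (thread, lock) pairs currently held
    for step, (thread, action, lock) in enumerate(trace):
        key = (thread, lock)
        if action == "acquire":
            if key in held:
                yield f"step {step}: {thread} re-acquires {lock}"
            held.add(key)
        elif action == "release":
            if key not in held:
                yield f"step {step}: {thread} releases unheld {lock}"
            held.discard(key)

# Event stream as it might be emitted by instrumented bytecode.
trace = [
    ("T1", "acquire", "L"),
    ("T2", "release", "L"),   # violation
    ("T1", "release", "L"),
]
for violation in monitor_lock_discipline(trace):
    print(violation)
```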
Morris, Melody K; Shriver, Zachary; Sasisekharan, Ram; Lauffenburger, Douglas A
2012-03-01
Mathematical models have substantially improved our ability to predict the response of a complex biological system to perturbation, but their use is typically limited by difficulties in specifying model topology and parameter values. Additionally, incorporating entities across different biological scales ranging from molecular to organismal in the same model is not trivial. Here, we present a framework called "querying quantitative logic models" (Q2LM) for building and asking questions of constrained fuzzy logic (cFL) models. cFL is a recently developed modeling formalism that uses logic gates to describe influences among entities, with transfer functions to describe quantitative dependencies. Q2LM does not rely on dedicated data to train the parameters of the transfer functions, and it permits straightforward incorporation of entities at multiple biological scales. The Q2LM framework can be employed to ask questions such as: Which therapeutic perturbations accomplish a designated goal, and under what environmental conditions will these perturbations be effective? We demonstrate the utility of this framework for generating testable hypotheses in two examples: (i) an intracellular signaling network model; and (ii) a model for pharmacokinetics and pharmacodynamics of cell-cytokine interactions; in the latter, we validate hypotheses concerning molecular design of granulocyte colony stimulating factor. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
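A cFL-style update can be sketched as logic gates whose inputs pass through transfer functions. The normalized Hill function below is a common cFL choice, but the mini-network, parameters, and species names are hypothetical.

```python
# A sketch of a constrained fuzzy logic (cFL) update: influences combine
# through logic gates, and quantitative dependencies pass through transfer
# functions. The network itself is invented for illustration.

def hill(x, k=0.5, n=3):
    """Normalized Hill transfer function mapping [0,1] -> [0,1]."""
    return (x ** n / (k ** n + x ** n)) * (k ** n + 1)

def AND(a, b):  # fuzzy AND
    return min(a, b)

def OR(a, b):   # fuzzy OR
    return max(a, b)

# Hypothetical mini-network: a ligand activates a kinase, a drug inhibits it.
ligand, drug = 0.9, 0.7
kinase = AND(hill(ligand), 1.0 - hill(drug))  # active if stimulated and not inhibited
response = hill(kinase)                       # downstream readout
print(f"kinase={kinase:.2f}, response={response:.2f}")
```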
Applying Toulmin: Teaching Logical Reasoning and Argumentative Writing
ERIC Educational Resources Information Center
Rex, Lesley A.; Thomas, Ebony Elizabeth; Engel, Steven
2010-01-01
To learn to write well-reasoned persuasive arguments, students need in situ help thinking through the complexity and complications of an issue, making inferences based on evidence, and hierarchically grouping and logically sequencing ideas. They rely on teachers to make this happen. In this article, the authors explain the framework they used and…
Disability Policy Evaluation: Combining Logic Models and Systems Thinking.
Claes, Claudia; Ferket, Neelke; Vandevelde, Stijn; Verlet, Dries; De Maeyer, Jessica
2017-07-01
Policy evaluation focuses on the assessment of policy-related personal, family, and societal changes or benefits that follow as a result of the interventions, services, and supports provided to those persons to whom the policy is directed. This article describes a systematic approach to policy evaluation based on an evaluation framework and an evaluation process that combine the use of logic models and systems thinking. The article also includes an example of how the framework and process have recently been used in policy development and evaluation in Flanders (Belgium), as well as four policy evaluation guidelines based on relevant published literature.
Navigating a Mobile Robot Across Terrain Using Fuzzy Logic
NASA Technical Reports Server (NTRS)
Seraji, Homayoun; Howard, Ayanna; Bon, Bruce
2003-01-01
A strategy for autonomous navigation of a robotic vehicle across hazardous terrain involves the use of a measure of traversability of terrain within a fuzzy-logic conceptual framework. This navigation strategy requires no a priori information about the environment. Fuzzy logic was selected as a basic element of this strategy because it provides a formal methodology for representing and implementing a human driver's heuristic knowledge and operational experience. Within a fuzzy-logic framework, the attributes of human reasoning and decision-making can be formulated by simple IF (antecedent), THEN (consequent) rules coupled with easily understandable and natural linguistic representations. The linguistic values in the rule antecedents convey the imprecision associated with measurements taken by sensors onboard a mobile robot, while the linguistic values in the rule consequents represent the vagueness inherent in the reasoning processes to generate the control actions. The operational strategies of the human expert driver can be transferred, via fuzzy logic, to a robot-navigation strategy in the form of a set of simple conditional statements composed of linguistic variables. These linguistic variables are defined by fuzzy sets in accordance with user-defined membership functions. The main advantages of a fuzzy navigation strategy lie in the ability to extract heuristic rules from human experience and to obviate the need for an analytical model of the robot navigation process.
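The rule style described above is easy to make concrete. In the sketch below, triangular membership functions fuzzify a traversability index and three IF-THEN rules are combined by weighted-average defuzzification; the membership functions, rule set, and speed values are illustrative assumptions, not the actual navigation strategy.

```python
# A sketch of IF (antecedent) THEN (consequent) fuzzy rules for terrain-
# based navigation. Membership functions and rules are illustrative.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def navigate(traversability):
    """Map a traversability index in [0,1] to a speed command."""
    # Fuzzify the sensed terrain quality.
    low    = tri(traversability, -0.5, 0.0, 0.5)
    medium = tri(traversability,  0.0, 0.5, 1.0)
    high   = tri(traversability,  0.5, 1.0, 1.5)
    # Rules: IF traversability is LOW THEN speed is SLOW (0.1), etc.
    rules = [(low, 0.1), (medium, 0.5), (high, 0.9)]
    # Weighted-average (centroid-style) defuzzification.
    total = sum(w for w, _ in rules)
    return sum(w * v for w, v in rules) / total if total else 0.0

for t in (0.2, 0.5, 0.8):
    print(f"traversability={t:.1f} -> speed={navigate(t):.2f}")
```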
OncoLogic™ - A Computer System to Evaluate the Carcinogenic Potential of Chemicals
OncoLogic™ is a software program that evaluates the likelihood that a chemical may cause cancer. OncoLogic™ has been peer reviewed and is being rele...
Hand-Held Calculator Algorithms for Coastal Engineering.
1982-01-01
…and water depth at the structure toe, ds. The development of the equation is derived on the solution sheet included with program 104R. Contents include: Limited Design Breaking Wave Height at Structure (AOS logic); 105R Wave Transmission - Fuchs' Equation (RPN logic); 105A Wave Transmission - Fuchs' Equation (AOS logic); and an appendix of blank program forms.
Coinductive Logic Programming with Negation
NASA Astrophysics Data System (ADS)
Min, Richard; Gupta, Gopal
We introduce negation into coinductive logic programming (co-LP) via what we term Coinductive SLDNF (co-SLDNF) resolution. We present the declarative and operational semantics of co-SLDNF resolution and establish their equivalence under the restriction of rationality. Co-LP with co-SLDNF resolution provides a powerful, practical and efficient operational semantics for Fitting's Kripke-Kleene three-valued logic with the restriction of rationality. Further, applications of co-SLDNF resolution are discussed and illustrated, where co-SLDNF resolution allows one to develop elegant implementations of modal logics. Moreover, it provides the capability of non-monotonic inference (e.g., predicate Answer Set Programming) that can be used to develop novel and effective first-order modal non-monotonic inference engines.
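The distinctive operational rule of co-LP, that a goal unifying with one of its ancestors succeeds coinductively, can be shown with a toy interpreter. The sketch below handles only ground propositional programs and omits negation entirely, so it illustrates plain co-SLD rather than co-SLDNF.

```python
# A toy interpreter illustrating the coinductive success rule of co-LP:
# a goal that recurs among its own ancestors succeeds (it has a rational,
# infinite proof). Ground propositional programs only; no negation.

def co_solve(goal, program, ancestors=()):
    if goal in ancestors:        # coinductive hypothesis: cycle => success
        return True
    for head, body in program:
        if head == goal and all(
            co_solve(sub, program, ancestors + (goal,)) for sub in body
        ):
            return True
    return False

# "stream" is infinite: stream :- stream. Under ordinary SLD resolution
# this loops forever; coinductively it succeeds.
program = [("stream", ["stream"]), ("p", ["q"]), ("q", [])]
print(co_solve("stream", program))  # True
print(co_solve("p", program))       # True
print(co_solve("r", program))       # False
```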
Gate Set Tomography on a trapped ion qubit
NASA Astrophysics Data System (ADS)
Nielsen, Erik; Blume-Kohout, Robin; Gamble, John; Rudinger, Kenneth; Mizrahi, Jonathan; Sterk, Jonathan; Maunz, Peter
2015-03-01
We present enhancements to gate-set tomography (GST), which is a framework in which an entire set of quantum logic gates (including preparation and measurement) can be fully characterized without need for pre-calibrated operations. Our new method, "extended Linear GST" (eLGST), uses fast, reliable analysis of structured long gate sequences to deliver tomographic precision at the Heisenberg limit with GST's calibration-free framework. We demonstrate this precision on a trapped-ion qubit, and show significant (orders of magnitude) advantage over both standard process tomography and randomized benchmarking. This work was supported by the Laboratory Directed Research and Development (LDRD) program at Sandia National Laboratories. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000.
LOGSIM user's manual. [Logic Simulation Program for computer aided design of logic circuits
NASA Technical Reports Server (NTRS)
Mitchell, C. L.; Taylor, J. F.
1972-01-01
The user's manual for the LOGSIM Program is presented. All program options are explained and a detailed definition of the format of each input card is given. LOGSIM Program operations and the preparation of LOGSIM input data are discussed, along with data card formats, postprocessor data cards, and output interpretation.
Logic Models: A Tool for Designing and Monitoring Program Evaluations. REL 2014-007
ERIC Educational Resources Information Center
Lawton, Brian; Brandon, Paul R.; Cicchinelli, Louis; Kekahio, Wendy
2014-01-01
This introduction to logic models as a tool for designing program evaluations defines the major components of education programs--resources, activities, outputs, and short-, mid-, and long-term outcomes--and uses an example to demonstrate the relationships among them. This quick…
A Logical Analysis of Quantum Voting Protocols
NASA Astrophysics Data System (ADS)
Rad, Soroush Rafiee; Shirinkalam, Elahe; Smets, Sonja
2017-12-01
In this paper we provide a logical analysis of the Quantum Voting Protocol for Anonymous Surveying as developed by Horoshko and Kilin in (Phys. Lett. A 375, 1172-1175 2011). In particular we make use of the probabilistic logic of quantum programs as developed in (Int. J. Theor. Phys. 53, 3628-3647 2014) to provide a formal specification of the protocol and to derive its correctness. Our analysis is part of a wider program on the application of quantum logics to the formal verification of protocols in quantum communication and quantum computation.
NASA Astrophysics Data System (ADS)
Wan, Danny; Manfrini, Mauricio; Vaysset, Adrien; Souriau, Laurent; Wouters, Lennaert; Thiam, Arame; Raymenants, Eline; Sayan, Safak; Jussot, Julien; Swerts, Johan; Couet, Sebastien; Rassoul, Nouredine; Babaei Gavan, Khashayar; Paredis, Kristof; Huyghebaert, Cedric; Ercken, Monique; Wilson, Christopher J.; Mocuta, Dan; Radu, Iuliana P.
2018-04-01
Magnetic tunnel junctions (MTJs) interconnected via a continuous ferromagnetic free layer were fabricated for spin torque majority gate (STMG) logic. The MTJs are biased independently and show magnetoelectric response under spin transfer torque. The electrical control of these devices paves the way to future spin logic devices based on domain wall (DW) motion. In particular, it is a significant step towards the realization of a majority gate. To our knowledge, this is the first fabrication of a cross-shaped free layer shared by several perpendicular MTJs. The fabrication process can be generalized to any geometry and any number of MTJs. Thus, this framework can be applied to other spin logic concepts based on magnetic interconnect. Moreover, it allows exploration of spin dynamics for logic applications.
Implementing neural nets with programmable logic
NASA Technical Reports Server (NTRS)
Vidal, Jacques J.
1988-01-01
Networks of Boolean programmable logic modules are presented as one purely digital class of artificial neural nets. The approach contrasts with the continuous analog framework usually suggested. Programmable logic networks are capable of handling many neural-net applications. They avoid some of the limitations of threshold logic networks and present distinct opportunities. The network nodes are called dynamically programmable logic modules. They can be implemented with digitally controlled demultiplexers. Each node performs a Boolean function of its inputs which can be dynamically assigned. The overall network is therefore a combinational circuit and its outputs are Boolean global functions of the network's input variables. The approach offers definite advantages for VLSI implementation, namely, a regular architecture with limited connectivity, simplicity of the control machinery, natural modularity, and the support of a mature technology.
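A dynamically programmable logic module can be sketched as a lookup table whose contents are reassigned at run time; the two-level wiring and truth tables below are illustrative, not the paper's architecture.

```python
# A sketch of a dynamically programmable logic module (DPLM) network.
# Each node is a 2-input lookup table whose truth table can be reassigned
# at run time, so the whole network is a reprogrammable combinational
# circuit computing a Boolean global function of its inputs.

class DPLM:
    """Two-input Boolean node; its function is the programmable state."""
    def __init__(self, truth_table):
        self.tt = truth_table            # dict: (a, b) -> output bit

    def reprogram(self, truth_table):    # "learning" = rewriting the table
        self.tt = truth_table

    def __call__(self, a, b):
        return self.tt[(a, b)]

AND = {(0,0):0, (0,1):0, (1,0):0, (1,1):1}
XOR = {(0,0):0, (0,1):1, (1,0):1, (1,1):0}

n1, n2, n3 = DPLM(AND), DPLM(AND), DPLM(AND)

def network(x0, x1, x2, x3):
    return n3(n1(x0, x1), n2(x2, x3))

print(network(1, 1, 1, 1))   # 1: the network is an AND tree
n1.reprogram(XOR)            # dynamically reassign one node's function
print(network(1, 1, 1, 1))   # 0: output is now (x0 XOR x1) AND (x2 AND x3)
```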
Constraint Logic Programming approach to protein structure prediction.
Dal Palù, Alessandro; Dovier, Agostino; Fogolari, Federico
2004-11-30
The protein structure prediction problem is one of the most challenging problems in biological sciences. Many approaches have been proposed using database information and/or simplified protein models. The protein structure prediction problem can be cast in the form of an optimization problem. Notwithstanding its importance, the problem has very seldom been tackled by Constraint Logic Programming, a declarative programming paradigm suitable for solving combinatorial optimization problems. Constraint Logic Programming techniques have been applied to the protein structure prediction problem on the face-centered cube lattice model. Molecular dynamics techniques, endowed with the notion of constraint, have also been exploited. Even using a very simplified model, Constraint Logic Programming on the face-centered cube lattice model allowed us to obtain acceptable results for a few small proteins. As a test implementation, each protein's (known) secondary structure and the presence of disulfide bridges are used as constraints. Simplified structures obtained in this way have been converted to all-atom models with plausible structure. Results have been compared with a similar approach using a well-established technique, molecular dynamics. The results obtained on small proteins show that Constraint Logic Programming techniques can be employed for studying simplified protein models, which can be converted into realistic all-atom models. The advantage of Constraint Logic Programming over other, much more explored, methodologies resides in the rapid software prototyping, in the easy way of encoding heuristics, and in exploiting all the advances made in this research area, e.g. in constraint propagation and its use for pruning the huge search space.
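The flavor of lattice-based structure prediction as constrained search can be conveyed with a toy: the sketch below enumerates self-avoiding walks of an HP-style sequence on a 2D square lattice and maximizes hydrophobic contacts. The paper's system works on the richer face-centered cube lattice with genuine constraint propagation, so this is only an analogy.

```python
# Toy constrained search for lattice protein folding: enumerate self-
# avoiding walks of an HP-style sequence on a 2D lattice and maximize
# non-bonded H-H contacts. Sequence and lattice are illustrative.

MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def contacts(path, seq):
    """Count adjacent, non-bonded H-H residue pairs."""
    c = 0
    for i in range(len(path)):
        for j in range(i + 2, len(path)):   # skip chain-bonded neighbors
            if seq[i] == seq[j] == "H":
                dx = abs(path[i][0] - path[j][0])
                dy = abs(path[i][1] - path[j][1])
                if dx + dy == 1:
                    c += 1
    return c

def fold(seq):
    """Branch over chain extensions, pruning collisions (the constraint)."""
    best = (-1, None)
    def extend(path):
        nonlocal best
        if len(path) == len(seq):
            e = contacts(path, seq)
            if e > best[0]:
                best = (e, list(path))
            return
        x, y = path[-1]
        for dx, dy in MOVES:
            nxt = (x + dx, y + dy)
            if nxt not in path:             # self-avoidance constraint
                path.append(nxt)
                extend(path)
                path.pop()
    extend([(0, 0)])
    return best

score, path = fold("HPHPPHHPH")
print("max H-H contacts:", score)
print("conformation:", path)
```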
Humanoid Robotics: Real-Time Object Oriented Programming
NASA Technical Reports Server (NTRS)
Newton, Jason E.
2005-01-01
Programming of robots in today's world is often done in a procedural oriented fashion, where object oriented programming is not incorporated. In order to keep a robust architecture allowing for easy expansion of capabilities and a truly modular design, object oriented programming is required. However, concepts in object oriented programming are not typically applied to a real time environment. The Fujitsu HOAP-2 is the test bed for the development of a humanoid robot framework abstracting control of the robot into simple logical commands in a real time robotic system while allowing full access to all sensory data. In addition to interfacing between the motor and sensory systems, this paper discusses the software which operates multiple independently developed control systems simultaneously and the safety measures which keep the humanoid from damaging itself and its environment while running these systems. The use of this software decreases development time and costs and allows changes to be made while keeping results safe and predictable.
Making It Logical: Implementation of Inclusive Education Using a Logic Model Framework
ERIC Educational Resources Information Center
Stegemann, Kim Calder; Jaciw, Andrew P.
2018-01-01
Educational inclusion of children with special learning needs is a philosophy and movement with an international presence. Though Canada is a leader in educational inclusion, many would claim that our public educational systems have not yet fully realized the dream of inclusive education. As other countries have noted, making full-fledged changes…
An iLab for Teaching Advanced Logic Concepts with Hardware Descriptive Languages
ERIC Educational Resources Information Center
Ayodele, Kayode P.; Inyang, Isaac A.; Kehinde, Lawrence O.
2015-01-01
One of the more interesting approaches to teaching advanced logic concepts is the use of online laboratory frameworks to provide student access to remote field-programmable devices. There is as yet, however, no conclusive evidence of the effectiveness of such an approach. This paper presents the Advanced Digital Lab, a remote laboratory based on…
ERIC Educational Resources Information Center
Zarcone, Alessandra; Padó, Sebastian; Lenci, Alessandro
2014-01-01
Logical metonymy resolution ("begin a book" → "begin reading a book" or "begin writing a book") has traditionally been explained either through complex lexical entries (qualia structures) or through the integration of the implicit event via post-lexical access to world knowledge. We propose that recent work within the…
Methods of Product Evaluation. Guide Number 10. Evaluation Guides Series.
ERIC Educational Resources Information Center
St. John, Mark
In this guide the logic of product evaluation is described in a framework that is meant to be general and adaptable to all kinds of evaluations. Evaluators should consider using the logic and methods of product evaluation when (1) the purpose of the evaluation is to aid evaluators in making a decision about purchases; (2) a comprehensive…
Interpreting Abstract Interpretations in Membership Equational Logic
NASA Technical Reports Server (NTRS)
Fischer, Bernd; Rosu, Grigore
2001-01-01
We present a logical framework in which abstract interpretations can be naturally specified and then verified. Our approach is based on membership equational logic which extends equational logics by membership axioms, asserting that a term has a certain sort. We represent an abstract interpretation as a membership equational logic specification, usually as an overloaded order-sorted signature with membership axioms. It turns out that, for any term, its least sort over this specification corresponds to its most concrete abstract value. Maude implements membership equational logic and provides mechanisms to calculate the least sort of a term efficiently. We first show how Maude can be used to get prototyping of abstract interpretations "for free." Building on the meta-logic facilities of Maude, we further develop a tool that automatically checks an abstract interpretation against a set of user-defined properties. This can be used to select an appropriate abstract interpretation, to characterize the specified loss of information during abstraction, and to compare different abstractions with each other.
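The "least sort as most concrete abstract value" idea can be transposed into a plain sign abstraction; the Python sketch below stands in for the Maude specification and is only an analogy, with an invented expression grammar.

```python
# Sign abstraction as an analogy for "least sort = most concrete abstract
# value": each integer gets its most concrete sign, and abstract addition
# falls back to the top sort 'any' only when information is genuinely lost.

def abstract(n):
    """Map a concrete integer to its most concrete sign 'sort'."""
    return "pos" if n > 0 else "neg" if n < 0 else "zero"

ADD = {  # abstract addition on signs; missing pairs lose information
    ("pos", "pos"): "pos", ("neg", "neg"): "neg", ("zero", "zero"): "zero",
    ("pos", "zero"): "pos", ("zero", "pos"): "pos",
    ("neg", "zero"): "neg", ("zero", "neg"): "neg",
}

def a_add(s1, s2):
    return ADD.get((s1, s2), "any")   # 'any' is the least informative sort

def eval_sign(expr):
    """Evaluate the sign of a nested ('+', lhs, rhs) / integer expression."""
    if isinstance(expr, int):
        return abstract(expr)
    _, lhs, rhs = expr
    return a_add(eval_sign(lhs), eval_sign(rhs))

print(eval_sign(("+", 3, ("+", 4, 5))))   # pos: the most concrete value survives
print(eval_sign(("+", 3, -4)))            # any: mixed signs lose precision
```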
ERIC Educational Resources Information Center
Dyehouse, Melissa; Bennett, Deborah; Harbor, Jon; Childress, Amy; Dark, Melissa
2009-01-01
Logic models are based on linear relationships between program resources, activities, and outcomes, and have been used widely to support both program development and evaluation. While useful in describing some programs, the linear nature of the logic model makes it difficult to capture the complex relationships within larger, multifaceted…
Logic integer programming models for signaling networks.
Haus, Utz-Uwe; Niermann, Kathrin; Truemper, Klaus; Weismantel, Robert
2009-05-01
We propose a static and a dynamic approach to model biological signaling networks, and show how each can be used to answer relevant biological questions. For this, we use the two different mathematical tools of Propositional Logic and Integer Programming. The power of discrete mathematics for handling qualitative as well as quantitative data has so far not been exploited in molecular biology, which is mostly driven by experimental research, relying on first-order or statistical models. The arising logic statements and integer programs are analyzed and can be solved with standard software. For a restricted class of problems the logic models reduce to a polynomial-time solvable satisfiability algorithm. Additionally, a more dynamic model enables enumeration of possible time resolutions in poly-logarithmic time. Computational experiments are included.
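The translation from propositional logic to integer programming rests on a standard 0-1 linearization of Boolean gates. The sketch below states those inequalities and brute-forces them instead of calling an ILP solver, purely to verify the encoding; the paper's actual models are far richer.

```python
# Standard 0-1 linearization of logic gates used in logic integer
# programming models: y = x1 AND x2 becomes y <= x1, y <= x2,
# y >= x1 + x2 - 1; y = x1 OR x2 becomes y >= x1, y >= x2, y <= x1 + x2.
from itertools import product

def and_constraints(y, x1, x2):
    return y <= x1 and y <= x2 and y >= x1 + x2 - 1

def or_constraints(y, x1, x2):
    return y >= x1 and y >= x2 and y <= x1 + x2

for x1, x2 in product((0, 1), repeat=2):
    y_and = [y for y in (0, 1) if and_constraints(y, x1, x2)]
    y_or  = [y for y in (0, 1) if or_constraints(y, x1, x2)]
    # each constraint set admits exactly the Boolean gate's output
    assert y_and == [x1 & x2] and y_or == [x1 | x2]
    print(f"x1={x1} x2={x2}: AND->{y_and[0]} OR->{y_or[0]}")
```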
Programming Programmable Logic Controller. High-Technology Training Module.
ERIC Educational Resources Information Center
Lipsky, Kevin
This training module on programming programmable logic controllers (PLC) is part of the memory structure and programming unit used in a packaging systems equipment control course. In the course, students assemble, install, maintain, and repair industrial machinery used in industry. The module contains description, objectives, content outline,…
ERIC Educational Resources Information Center
Welty, Gordon A.
The logic of the evaluation of educational and other action programs is discussed from a methodological viewpoint. However, no attempt is made to develop methods of evaluating programs. In Part I, the structure of an educational program is viewed as a system with three components--inputs, transformation of inputs into outputs, and outputs. Part II…
A String Search Marketing Application Using Visual Programming
ERIC Educational Resources Information Center
Chin, Jerry M.; Chin, Mary H.; Van Landuyt, Cathryn
2013-01-01
This paper demonstrates the use of programming software that provides the student programmer visual cues to construct the code for a student programming assignment. This method does not disregard or minimize the syntax or required logical constructs. The student can concentrate more on the logic and less on the language itself.
Towards An Engineering Discipline of Computational Security
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mili, Ali; Sheldon, Frederick T; Jilani, Lamia Labed
2007-01-01
George Boole ushered in the era of modern logic by arguing that logical reasoning does not fall in the realm of philosophy, as it was considered up to his time, but in the realm of mathematics. As such, logical propositions and logical arguments are modeled using algebraic structures. Likewise, we submit that security attributes must be modeled as formal mathematical propositions that are subject to mathematical analysis. In this paper, we approach this problem by attempting to model security attributes in a refinement-like framework that has traditionally been used to represent reliability and safety claims. Keywords: Computable security attributes, survivability, integrity, dependability, reliability, safety, security, verification, testing, fault tolerance.
NASA Astrophysics Data System (ADS)
Bandrowski, D.; Lai, Y.; Bradley, N.; Gaeuman, D. A.; Murauskas, J.; Som, N. A.; Martin, A.; Goodman, D.; Alvarez, J.
2014-12-01
In the field of river restoration sciences there is a growing need for analytical modeling tools and quantitative processes to help identify and prioritize project sites. 2D hydraulic models have become more common in recent years, and with the availability of robust data sets and computing technology it is now possible to evaluate large river systems at the reach scale. The Trinity River Restoration Program is now analyzing a 40-mile segment of the Trinity River to determine priority and implementation sequencing for its Phase II rehabilitation projects. A comprehensive quantitative tool, referred to as 2D-Hydrodynamic Based Logic Modeling (2D-HBLM), has recently been developed to analyze this complex river system. This tool utilizes various hydraulic output parameters combined with biological, ecological, and physical metrics at user-defined spatial scales. These metrics and their associated algorithms are the underpinnings of the 2D-HBLM habitat module used to evaluate geomorphic characteristics, riverine processes, and habitat complexity. The habitat metrics are further integrated into a comprehensive Logic Model framework to perform statistical analyses to assess project prioritization. The Logic Model will analyze various potential project sites by evaluating connectivity using principal component methods. The 2D-HBLM tool will help inform management and decision makers by using a quantitative process to optimize desired response variables while balancing important limiting factors in determining the highest-priority locations within the river corridor to implement restoration projects. Effective river restoration prioritization starts with well-crafted goals that identify the biological objectives, address underlying causes of habitat change, and recognize that social, economic, and land use limiting factors may constrain restoration options (Beechie et al. 2008). Applying natural resources management actions, like restoration prioritization, is essential for successful project implementation (Conroy and Peterson, 2013). Evaluating tradeoffs and examining alternatives to improve fish habitat through optimization modeling is not just a trend but rather a scientific strategy that management should embrace and apply in its decision framework.
Post optimization paradigm in maximum 3-satisfiability logic programming
NASA Astrophysics Data System (ADS)
Mansor, Mohd. Asyraf; Sathasivam, Saratha; Kasihmuddin, Mohd Shareduwan Mohd
2017-08-01
Maximum 3-Satisfiability (MAX-3SAT) is a counterpart of the Boolean satisfiability problem that can be treated as a constraint optimization problem. It deals with the problem of finding the maximum number of satisfiable clauses in a particular 3-SAT formula. This paper presents the implementation of an enhanced Hopfield network for accelerating Maximum 3-Satisfiability (MAX-3SAT) logic programming. Four post-optimization techniques are investigated: the Elliot symmetric activation function, the Gaussian activation function, the Wavelet activation function, and the Hyperbolic tangent activation function. The performances of these post-optimization techniques in accelerating MAX-3SAT logic programming are discussed in terms of the ratio of maximum satisfied clauses, Hamming distance, and computation time. Dev-C++ was used as the platform for training, testing, and validating our proposed techniques. The results show that the Hyperbolic tangent activation function and the Elliot symmetric activation function can be used in MAX-3SAT logic programming.
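The objective being optimized is easy to state in code: count the satisfied clauses of a 3-CNF formula. In the sketch below, a plain hill-climbing loop stands in for the enhanced Hopfield dynamics, so only the "maximize satisfied clauses" framing is faithful to the paper; the formula is invented.

```python
# The MAX-3SAT objective: count satisfied clauses of a 3-CNF formula.
# A simple hill climber stands in for the paper's Hopfield dynamics.
import random

def satisfied(clauses, assignment):
    """A clause is a tuple of literals; +i means x_i, -i means NOT x_i."""
    return sum(
        any((lit > 0) == assignment[abs(lit)] for lit in clause)
        for clause in clauses
    )

def hill_climb(clauses, n_vars, steps=1000, seed=0):
    rng = random.Random(seed)
    a = {i: rng.random() < 0.5 for i in range(1, n_vars + 1)}
    best = satisfied(clauses, a)
    for _ in range(steps):
        v = rng.randint(1, n_vars)
        a[v] = not a[v]                  # flip one variable
        score = satisfied(clauses, a)
        if score >= best:
            best = score
        else:
            a[v] = not a[v]              # revert worsening flips
    return best, a

clauses = [(1, 2, -3), (-1, 3, 4), (2, -4, -1), (-2, 3, -4)]
best, a = hill_climb(clauses, n_vars=4)
print(f"satisfied {best} of {len(clauses)} clauses")
```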
Analytical learning and term-rewriting systems
NASA Technical Reports Server (NTRS)
Laird, Philip; Gamble, Evan
1990-01-01
Analytical learning is a set of machine learning techniques for revising the representation of a theory based on a small set of examples of that theory. When the representation of the theory is correct and complete but perhaps inefficient, an important objective of such analysis is to improve the computational efficiency of the representation. Several algorithms with this purpose have been suggested, most of which are closely tied to a first order logical language and are variants of goal regression, such as the familiar explanation based generalization (EBG) procedure. But because predicate calculus is a poor representation for some domains, these learning algorithms are extended to apply to other computational models. It is shown that the goal regression technique applies to a large family of programming languages, all based on a kind of term rewriting system. Included in this family are three language families of importance to artificial intelligence: logic programming, such as Prolog; lambda calculus, such as LISP; and combinatorial based languages, such as FP. A new analytical learning algorithm, AL-2, is exhibited that learns from success but is otherwise quite different from EBG. These results suggest that term rewriting systems are a good framework for analytical learning research in general, and that further research should be directed toward developing new techniques.
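A term rewriting system of the kind taken here as the common computational model can be sketched compactly; terms are nested tuples, variables are '?'-prefixed strings, and the Peano-addition rules are illustrative.

```python
# A tiny term-rewriting system of the sort the paper treats as the shared
# computational model behind Prolog, LISP, and FP.

def match(pattern, term, env):
    """Match pattern against term, binding '?'-variables in env."""
    if isinstance(pattern, str) and pattern.startswith("?"):
        if pattern in env:
            return env[pattern] == term
        env[pattern] = term
        return True
    if isinstance(pattern, tuple) and isinstance(term, tuple):
        return len(pattern) == len(term) and all(
            match(p, t, env) for p, t in zip(pattern, term))
    return pattern == term

def substitute(pattern, env):
    if isinstance(pattern, str) and pattern.startswith("?"):
        return env[pattern]
    if isinstance(pattern, tuple):
        return tuple(substitute(p, env) for p in pattern)
    return pattern

def normalize(term, rules):
    """Innermost rewriting: normalize subterms first, then the root."""
    if isinstance(term, tuple):
        term = tuple(normalize(p, rules) for p in term)
    for lhs, rhs in rules:
        env = {}
        if match(lhs, term, env):
            return normalize(substitute(rhs, env), rules)
    return term

rules = [
    (("add", "0", "?y"), "?y"),                                # 0 + y -> y
    (("add", ("s", "?x"), "?y"), ("s", ("add", "?x", "?y"))),  # s(x) + y -> s(x + y)
]
two_plus_one = ("add", ("s", ("s", "0")), ("s", "0"))
print(normalize(two_plus_one, rules))   # ('s', ('s', ('s', '0')))
```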
Eco-logical successes : January 2011
DOT National Transportation Integrated Search
2011-01-01
This document identifies and explains each Eco-Logical signatory agency's strategic environmental programs, projects, and efforts that are either directly related to or share the vision set forth in Eco-Logical. A brief description of an agency's key...
Goodson, Patricia; Pruitt, B E; Suther, Sandy; Wilson, Kelly; Buhi, Eric
2006-04-01
Authors examined the logic (or the implicit theory) underlying 16 abstinence-only-until-marriage programs in Texas (50% of all programs funded under the federal welfare reform legislation during 2001 and 2002). Defined as a set of propositions regarding the relationship between program activities and their intended outcomes, program staff's implicit theories were summarized and compared to (a) data from studies on adolescent sexual behavior, (b) a theory-based model of youth abstinent behavior, and (c) preliminary findings from the national evaluation of Title V programs. Authors interviewed 62 program directors and instructors and employed selected principles of grounded theory to analyze interview data. Findings indicated that abstinence education staff could clearly articulate the logic guiding program activity choices. Comparisons between interview data and a theory-based model of adolescent sexual behavior revealed striking similarities. Implications of these findings for conceptualizing and evaluating abstinence-only-until-marriage (or similar) programs are examined.
ERIC Educational Resources Information Center
Stenning, Keith; van Lambalgen, Michiel
2004-01-01
Modern logic provides accounts of both interpretation and derivation which work together to provide abstract frameworks for modelling the sensitivity of human reasoning to task, context and content. Cognitive theories have underplayed the importance of interpretative processes. We illustrate, using Wason's [Q. J. Exp. Psychol. 20 (1968) 273]…
Organized Cognition: Theoretical Framework for Future C2 Research and Implementation
2011-06-01
Knowledge representation in fuzzy logic
NASA Technical Reports Server (NTRS)
Zadeh, Lotfi A.
1989-01-01
The author presents a summary of the basic concepts and techniques underlying the application of fuzzy logic to knowledge representation. He then describes a number of examples relating to its use as a computational system for dealing with uncertainty and imprecision in the context of knowledge, meaning, and inference. It is noted that one of the basic aims of fuzzy logic is to provide a computational framework for knowledge representation and inference in an environment of uncertainty and imprecision. In such environments, fuzzy logic is effective when the solutions need not be precise and/or it is acceptable for a conclusion to have a dispositional rather than categorical validity. The importance of fuzzy logic derives from the fact that there are many real-world applications which fit these conditions, especially in the realm of knowledge-based systems for decision-making and control.
EAGLE Monitors by Collecting Facts and Generating Obligations
NASA Technical Reports Server (NTRS)
Barringer, Howard; Goldberg, Allen; Havelund, Klaus; Sen, Koushik
2003-01-01
We present a rule-based framework, called EAGLE, that has been shown to be capable of defining and implementing a range of finite trace monitoring logics, including future and past time temporal logic, extended regular expressions, real-time and metric temporal logics, interval logics, forms of quantified temporal logics, and so on. A monitor for an EAGLE formula checks if a finite trace of states satisfies the given formula. We present, in detail, an algorithm for the synthesis of monitors for EAGLE. The algorithm is implemented as a Java application and involves novel techniques for rule definition, manipulation and execution. Monitoring is achieved on a state-by-state basis avoiding any need to store the input trace of states. Our initial experiments have been successful as EAGLE detected a previously unknown bug while testing a planetary rover controller.
Two autowire versions for CDC-3200 and IBM-360
NASA Technical Reports Server (NTRS)
Billingsley, J. B.
1972-01-01
A microelectronics program was initiated to evaluate circuitry, packaging methods, and fabrication approaches necessary to produce a completely procured logic system. Two autowire programs were developed for CDC-3200 and IBM-360 computers for use in designing logic systems.
How Young Children Learn to Program with Sensor, Action, and Logic Blocks
ERIC Educational Resources Information Center
Wyeth, Peta
2008-01-01
Electronic Blocks are a new programming environment designed specifically for children aged between 3 and 8 years. These physical, stackable blocks include sensor blocks, action blocks, and logic blocks. By connecting these blocks, children can program a wide variety of structures that interact with one another and the environment. Electronic…
Using logic models in a community-based agricultural injury prevention project.
Helitzer, Deborah; Willging, Cathleen; Hathorn, Gary; Benally, Jeannie
2009-01-01
The National Institute for Occupational Safety and Health has long promoted the logic model as a useful tool in an evaluator's portfolio. Because a logic model supports a systematic approach to designing interventions, it is equally useful for program planners. Undertaken with community stakeholders, a logic model process articulates the underlying foundations of a particular programmatic effort and enhances program design and evaluation. Most often presented as sequenced diagrams or flow charts, logic models demonstrate relationships among the following components: statement of a problem, various causal and mitigating factors related to that problem, available resources to address the problem, theoretical foundations of the selected intervention, intervention goals and planned activities, and anticipated short- and long-term outcomes. This article describes a case example of how a logic model process was used to help community stakeholders on the Navajo Nation conceive, design, implement, and evaluate agricultural injury prevention projects.
Stone, Vathsala I; Lane, Joseph P
2012-05-16
Government-sponsored science, technology, and innovation (STI) programs support the socioeconomic aspects of public policies, in addition to expanding the knowledge base. For example, beneficial healthcare services and devices are expected to result from investments in research and development (R&D) programs, which assume a causal link to commercial innovation. Such programs are increasingly held accountable for evidence of impact--that is, innovative goods and services resulting from R&D activity. However, the absence of comprehensive models and metrics skews evidence gathering toward bibliometrics about research outputs (published discoveries), with less focus on transfer metrics about development outputs (patented prototypes) and almost none on econometrics related to production outputs (commercial innovations). This disparity is particularly problematic for the expressed intent of such programs, as most measurable socioeconomic benefits result from the last category of outputs. This paper proposes a conceptual framework integrating all three knowledge-generating methods into a logic model, useful for planning, obtaining, and measuring the intended beneficial impacts through the implementation of knowledge in practice. Additionally, the integration of the Context-Input-Process-Product (CIPP) model of evaluation proactively builds relevance into STI policies and programs while sustaining rigor. The resulting logic model framework explicitly traces the progress of knowledge from inputs, following it through the three knowledge-generating processes and their respective knowledge outputs (discovery, invention, innovation), as it generates the intended socio-beneficial impacts. It is a hybrid model for generating technology-based innovations, where best practices in new product development merge with a widely accepted knowledge-translation approach. Given the emphasis on evidence-based practice in the medical and health fields and "bench to bedside" expectations for knowledge transfer, sponsors and grantees alike should find the model useful for planning, implementing, and evaluating innovation processes. High-cost/high-risk industries like healthcare require the market deployment of technology-based innovations to improve domestic society in a global economy. An appropriate balance of relevance and rigor in research, development, and production is crucial to optimize the return on public investment in such programs. The technology-innovation process needs a comprehensive operational model to effectively allocate public funds and thereby deliberately and systematically accomplish socioeconomic benefits.
An application of different dioids in public key cryptography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Durcheva, Mariana I., E-mail: mdurcheva66@gmail.com
2014-11-18
Dioids provide a natural framework for analyzing a broad class of discrete event dynamical systems such as the design and analysis of bus and railway timetables, scheduling of high-throughput industrial processes, solution of combinatorial optimization problems, and the analysis and improvement of flow systems in communication networks. They have appeared in several branches of mathematics such as functional analysis, optimization, stochastic systems and dynamic programming, tropical geometry, and fuzzy logic. In this paper we show how to involve dioids in public key cryptography. The main goal is to create key-exchange protocols based on dioids. Additionally, the digital signature scheme is presented.
Knebel, Ann R.; Sharpe, Virginia A.; Danis, Marion; Toomey, Lauren M.; Knickerbocker, Deborah K.
2017-01-01
During catastrophic disasters, government leaders must decide how to efficiently and effectively allocate scarce public health and medical resources. The literature about triage decision making at the individual patient level is substantial, and the National Response Framework provides guidance about the distribution of responsibilities between federal and state governments. However, little has been written about the decision-making process of federal leaders in disaster situations when resources are not sufficient to meet the needs of several states simultaneously. We offer an ethical framework and logic model for decision making in such circumstances. We adapted medical triage and the federalism principle to the decision-making process for allocating scarce federal public health and medical resources. We believe that the logic model provides a values-based framework that can inform the gestalt during the iterative decision process used by federal leaders as they allocate scarce resources to states during catastrophic disasters. PMID:24612854
Expanding a First-Order Logic Mitigation Framework to Handle Multimorbid Patient Preferences
Michalowski, Martin; Wilk, Szymon; Rosu, Daniela; Kezadri, Mounira; Michalowski, Wojtek; Carrier, Marc
2015-01-01
The increasing prevalence of multimorbidity is a challenge for physicians who have to manage a constantly growing number of patients with simultaneous diseases. Adding to this challenge is the need to incorporate patient preferences as key components of the care process, thanks in part to the emergence of personalized and participatory medicine. In our previous work we proposed a framework employing first order logic to represent clinical practice guidelines (CPGs) and to mitigate possible adverse interactions when concurrently applying multiple CPGs to a multimorbid patient. In this paper, we describe extensions to our methodological framework that (1) broaden our definition of revision operators to support required and desired types of revisions defined in secondary knowledge sources, and (2) expand the mitigation algorithm to apply revisions based on their type. We illustrate the capabilities of the expanded framework using a clinical case study of a multimorbid patient with stable cardiac artery disease who suffers a sudden onset of deep vein thrombosis. PMID:26958226
NASA Technical Reports Server (NTRS)
Burgin, G. H.; Owens, A. J.
1975-01-01
A detailed description is presented of the computer programs in order to provide an understanding of the mathematical and geometrical relationships as implemented in the programs. The individual subroutines and their underlying mathematical relationships are described, and the required input data and the output provided by the program are explained. The relationship of the adaptive maneuvering logic program with the program to drive the differential maneuvering simulator is discussed.
An Assessment of Agency Theory as a Framework for the Government-University Relationship
ERIC Educational Resources Information Center
Kivisto, Jussi
2008-01-01
The aim of this paper is to use agency theory as the theoretical framework for an examination of the government-university relationship and to assess the main strengths and weaknesses of the theory in this context. Because of its logically consistent framework, agency theory is able to manifest many of the complexities and difficulties that…
A computer program for the generation of logic networks from task chart data
NASA Technical Reports Server (NTRS)
Herbert, H. E.
1980-01-01
The Network Generation Program (NETGEN), which creates logic networks from task chart data is presented. NETGEN is written in CDC FORTRAN IV (Extended) and runs in a batch mode on the CDC 6000 and CYBER 170 series computers. Data is input via a two-card format and contains information regarding the specific tasks in a project. From this data, NETGEN constructs a logic network of related activities with each activity having unique predecessor and successor nodes, activity duration, descriptions, etc. NETGEN then prepares this data on two files that can be used in the Project Planning Analysis and Reporting System Batch Network Scheduling program and the EZPERT graphics program.
Klatt, BN; Carender, WJ; Lin, CC; Alsubaie, SF; Kinnaird, CR; Sienko, KH; Whitney, SL
2016-01-01
There is little information in peer-reviewed literature to specifically guide the choice of exercise for persons with balance and vestibular disorders. The purpose of this study is to provide a rationale for the establishment of a progression framework and propose a logical sequence in progressing balance exercises for persons with vestibular disorders. Our preliminary conceptual framework was developed by a multidisciplinary team of physical therapists and engineers with extensive experience with people with vestibular disorders. Balance exercises are grouped into six different categories: static standing, compliant surface, weight shifting, modified center of gravity, gait, and vestibulo-ocular reflex (VOR). Through a systematized literature review, interviews and focus group discussions with physical therapists and postural control experts, and pilot studies involving repeated trials of each exercise, exercise progressions for each category were developed and ranked in order of degree of difficulty. Clinical expertise and experience guided decision making for the exercise progressions. Hundreds of exercise combinations were discussed and research is ongoing to validate the hypothesized rankings. The six exercise categories can be incorporated into a balance training program and the framework for exercise progression can be used to guide less experienced practitioners in the development of a balance program. It may also assist clinicians and researchers to design, develop, and progress interventions within a treatment plan of care, or within clinical trials. A structured exercise framework has the potential to maximize postural control, decrease symptoms of dizziness/visual vertigo, and provide “rules” for exercise progression for persons with vestibular disorders. The conceptual framework may also be applicable to persons with other balance-related issues. PMID:27489886
A novel logic-based approach for quantitative toxicology prediction.
Amini, Ata; Muggleton, Stephen H; Lodhi, Huma; Sternberg, Michael J E
2007-01-01
There is a pressing need for accurate in silico methods to predict the toxicity of molecules that are being introduced into the environment or are being developed into new pharmaceuticals. Predictive toxicology is in the realm of structure activity relationships (SAR), and many approaches have been used to derive such SAR. Previous work has shown that inductive logic programming (ILP) is a powerful approach that circumvents several major difficulties, such as molecular superposition, faced by some other SAR methods. The ILP approach reasons with chemical substructures within a relational framework and yields chemically understandable rules. Here, we report a general new approach, support vector inductive logic programming (SVILP), which extends the essentially qualitative ILP-based SAR to quantitative modeling. First, ILP is used to learn rules, the predictions of which are then used within a novel kernel to derive a support-vector generalization model. For a highly heterogeneous dataset of 576 molecules with known fathead minnow fish toxicity, the cross-validated correlation coefficients (R^2_CV) from a chemical descriptor method (CHEM) and SVILP are 0.52 and 0.66, respectively. The ILP, CHEM, and SVILP approaches correctly predict 55, 58, and 73%, respectively, of toxic molecules. In a set of 165 unseen molecules, the R^2 values from the commercial software TOPKAT and SVILP are 0.26 and 0.57, respectively. In all calculations, SVILP showed significant improvements in comparison with the other methods. The SVILP approach has a major advantage in that it uses ILP automatically and consistently to derive rules, mostly novel, describing fragments that are toxicity alerts. The SVILP is a general machine-learning approach and has the potential of tackling many problems relevant to chemoinformatics including in silico drug design.
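The SVILP kernel idea can be sketched directly: each ILP-learned rule acts as a binary feature, and the kernel between two molecules is an inner product over those rule-coverage indicators. The rules and molecules below are invented stand-ins; the real system learns its rules with ILP and feeds the kernel to a support vector machine.

```python
# The kernel idea behind SVILP: rules act as binary features ("does this
# structural alert cover the molecule?"), and the kernel between two
# molecules is an inner product over those indicators. All data invented.

# Hypothetical rule coverage: rule -> set of molecule ids it covers.
rules = {
    "has_nitro_group":    {"m1", "m3"},
    "ring_with_chlorine": {"m1", "m2"},
    "long_alkyl_chain":   {"m2", "m3", "m4"},
}

def feature_vector(mol):
    return [1 if mol in covered else 0 for covered in rules.values()]

def rule_kernel(mol_a, mol_b):
    """Number of toxicity-alert rules the two molecules share."""
    return sum(a * b for a, b in zip(feature_vector(mol_a),
                                     feature_vector(mol_b)))

for pair in (("m1", "m2"), ("m1", "m3"), ("m2", "m4")):
    print(pair, "->", rule_kernel(*pair))
```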
Modelling default and likelihood reasoning as probabilistic reasoning
NASA Technical Reports Server (NTRS)
Buntine, Wray
1990-01-01
A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a qualitative default probabilistic (QDP) logic and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlights their approximate nature, the dualities, and the need for complementary reasoning about relevance.
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Barringer, Howard
2012-01-01
TraceContract is an API (Application Programming Interface) for trace analysis. A trace is a sequence of events, and can, for example, be generated by a running program, instrumented appropriately to generate events. An event can be any data object. An example of a trace is a log file containing events that a programmer has found important to record during a program execution. TraceContract takes as input such a trace together with a specification formulated using the API and reports on any violations of the specification, potentially calling code (reactions) to be executed when violations are detected. The software is developed as an internal DSL (Domain Specific Language) in the Scala programming language. Scala is a relatively new programming language that is specifically convenient for defining such internal DSLs due to a number of language characteristics. This includes Scala's elegant combination of object-oriented and functional programming, a succinct notation, and an advanced type system. The DSL offers a combination of data-parameterized state machines and temporal logic, which is novel. As an extension of Scala, it is a very expressive and convenient log file analysis framework.
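TraceContract itself is a Scala-internal DSL; the Python sketch below only mimics the data-parameterized flavor of such log analysis and is not TraceContract's API. The event shapes and the open/close property are illustrative.

```python
# A data-parameterized log-file check in the spirit of trace analysis:
# "every 'open' of a file is eventually followed by a 'close' of the same
# file, and files are not closed while unopened." Events are invented.

def check_open_close(log):
    """Return violations: bogus closes and files never closed."""
    open_files, errors = set(), []
    for i, (kind, name) in enumerate(log):
        if kind == "open":
            open_files.add(name)          # state is parameterized by data
        elif kind == "close":
            if name not in open_files:
                errors.append(f"event {i}: close of unopened {name!r}")
            open_files.discard(name)
    errors.extend(f"end of trace: {name!r} never closed"
                  for name in sorted(open_files))
    return errors

log = [("open", "a.txt"), ("open", "b.txt"), ("close", "a.txt"),
       ("close", "c.txt")]
for err in check_open_close(log):
    print(err)
```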
ERIC Educational Resources Information Center
Cooper, Jeff
2009-01-01
This dissertation addresses theory and practice of evaluation and assessment in university student affairs, by applying logic modeling/program theory to a case study. I intend to add knowledge to ongoing dialogue among evaluation scholars and practitioners on student affairs program planning and improvement as integral considerations that serve…
ERIC Educational Resources Information Center
Schultz, Leah
2011-01-01
This research investigates the implementation of the programming language Alice to teach computer programming logic to computer information systems students. Alice has been implemented in other university settings and has been reported to have many benefits including object-oriented concepts and an engaging and fun learning environment. In this…
IT0: Discrete Math and Programming Logic Topics as a Hybrid Alternative to CS0
ERIC Educational Resources Information Center
Martin, Nancy L.
2015-01-01
This paper describes the development of a hybrid introductory course for students in their first or second year of an information systems technologies degree program at a large Midwestern university. The course combines topics from discrete mathematics and programming logic and design, a unique twist on most introductory courses. The objective of…
Using the Logic Model to Plan Extension and Outreach Program Development and Scholarship
ERIC Educational Resources Information Center
Corbin, Marilyn; Kiernan, Nancy Ellen; Koble, Margaret A.; Watson, Jack; Jackson, Daney
2004-01-01
In searching for a process to help program teams of campus-based faculty and field-based educators develop five-year and annual statewide program plans, cooperative extension administrators and specialists in Penn State's College of Agricultural Sciences discovered that the use of the logic model process can influence the successful design of…
ERIC Educational Resources Information Center
Vosinakis, Spyros; Anastassakis, George; Koutsabasis, Panayiotis
2018-01-01
Logic Programming (LP) follows the declarative programming paradigm, which novice students often find hard to grasp. The limited availability of visual teaching aids for LP can lead to low motivation for learning. In this paper, we present a platform for teaching and learning Prolog in Virtual Worlds, which enables the visual interpretation and…
Reactive system verification case study: Fault-tolerant transputer communication
NASA Technical Reports Server (NTRS)
Crane, D. Francis; Hamory, Philip J.
1993-01-01
A reactive program is one which engages in an ongoing interaction with its environment. A system which is controlled by an embedded reactive program is called a reactive system. Examples of reactive systems are aircraft flight management systems, bank automatic teller machine (ATM) networks, airline reservation systems, and computer operating systems. Reactive systems are often naturally modeled (for logical design purposes) as a composition of autonomous processes which progress concurrently and which communicate to share information and/or to coordinate activities. Formal (i.e., mathematical) frameworks for system verification are tools used to increase the users' confidence that a system design satisfies its specification. A framework for reactive system verification includes formal languages for system modeling and behavior specification, together with decision procedures and/or proof systems for verifying that the system model satisfies the system specifications. Using the Ostroff framework for reactive system verification, an approach to achieving fault-tolerant communication between transputers was shown to be effective. The key components of the design, the decoupler processes, may be viewed as discrete-event controllers introduced to constrain system behavior such that the system specifications are satisfied. The Ostroff framework was also effective: the expressiveness of the modeling language permitted construction of a faithful model of the transputer network; the relevant specifications were readily expressed in the specification language; and the set of decision procedures provided was adequate to verify the specifications of interest. The need for improved support for system behavior visualization is emphasized.
Jafari, Mohieddin; Ansari-Pour, Naser; Azimzadeh, Sadegh; Mirzaie, Mehdi
2017-01-01
It is nearly half a century since the introduction of the Central Dogma (CD) of molecular biology. This biological axiom has been developed and currently appears to be all the more complex. In this study, we modified the CD by adding further species to the CD information flow and mathematically expressed the CD within a dynamic framework, using a Boolean network based on its present-day and 1965 editions. We show that the enhancement of the Dogma not only entails a higher level of complexity, but also shows a higher level of robustness, and is therefore more consistent with the nature of biological systems. Using this mathematical modeling approach, we put forward a logic-based expression of our conceptual view of molecular biology. Finally, we show that such biological concepts can be converted into dynamic mathematical models using a logic-based approach and thus may be useful as a framework for improving static conceptual models in biology. PMID:29267315
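As a rough, hedged illustration of what "expressing the Central Dogma as a Boolean network" means in practice, the fragment below updates ON/OFF species by logic rules until a fixed point is reached. The three-node network is a deliberate oversimplification, not the authors' 1965 or present-day model.

```python
# Toy Boolean-network sketch of a Central Dogma information flow: each species
# is ON/OFF and updated synchronously by a logic rule.

rules = {
    "DNA": lambda s: s["DNA"],         # genome persists
    "RNA": lambda s: s["DNA"],         # transcription requires DNA
    "protein": lambda s: s["RNA"],     # translation requires RNA
}

def step(state):
    return {node: bool(rule(state)) for node, rule in rules.items()}

state = {"DNA": True, "RNA": False, "protein": False}
for t in range(3):
    print(t, state)
    state = step(state)
# protein switches ON two steps after DNA, and the network reaches a fixed point
```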
Depicting the logic of three evaluation theories.
Hansen, Mark; Alkin, Marvin C; Wallace, Tanner Lebaron
2013-06-01
Here, we describe the development of logic models depicting three theories of evaluation practice: Practical Participatory (Cousins & Whitmore, 1998), Values-engaged (Greene, 2005a, 2005b), and Emergent Realist (Mark et al., 1998). We begin with a discussion of evaluation theory and the particular theories that were chosen for our analysis. We then outline the steps involved in constructing the models. The theoretical prescriptions and claims represented here follow a logic model template developed at the University of Wisconsin-Extension (Taylor-Powell & Henert, 2008), which also closely aligns with Mark's (2008) framework for research on evaluation. Copyright © 2012 Elsevier Ltd. All rights reserved.
Notes on stochastic (bio)-logic gates: computing with allosteric cooperativity
Agliari, Elena; Altavilla, Matteo; Barra, Adriano; Dello Schiavo, Lorenzo; Katz, Evgeny
2015-01-01
Recent experimental breakthroughs have finally made it possible to implement in-vitro reaction kinetics (the so-called enzyme-based logic) which code for two-input logic gates and mimic the stochastic AND (and NAND) as well as the stochastic OR (and NOR). This accomplishment, together with the already-known single-input gates (performing as YES and NOT), provides a logic base and paves the way to the development of powerful biotechnological devices. However, as biochemical systems are always affected by the presence of noise (e.g. thermal), standard logic is not the correct theoretical reference framework; rather, we show that statistical mechanics can work for this scope: here we formulate a complete statistical mechanical description of the Monod-Wyman-Changeux allosteric model for both single- and double-ligand systems, with the purpose of exploring their practical capabilities to express noisy logical operators and/or perform stochastic logical operations. Mixing statistical mechanics with logic, and testing the resulting findings quantitatively on the available biochemical data, we successfully revise the concept of cooperativity (and anti-cooperativity) for allosteric systems, with particular emphasis on its computational capabilities, the related ranges and scaling of the involved parameters, and its differences with classical cooperativity (and anti-cooperativity). PMID:25976626
MIRAP, microcomputer reliability analysis program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jehee, J.N.T.
1989-01-01
A program for a microcomputer is outlined that can determine minimal cut sets from a specified fault tree logic. The speed and memory limitations of the microcomputers on which the program is implemented (Atari ST and IBM) are addressed by reducing the fault tree's size and by storing the cut set data on disk. Extensive, well-proven fault tree restructuring techniques, such as the identification of sibling events and of independent gate events, reduce the fault tree's size but do not alter its logic. New methods are used for the Boolean reduction of the fault tree logic. Special criteria for combining events in the 'AND' and 'OR' logic avoid the creation of many subsuming cut sets which would all cancel out due to existing cut sets. Figures and tables illustrate these methods. 4 refs., 5 tabs.
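To make the cut-set terminology concrete, here is a minimal sketch, assuming a toy three-event fault tree, of expanding AND/OR gates into cut sets and discarding non-minimal ones. This is not MIRAP's algorithm; its restructuring techniques and disk-based storage are not reproduced.

```python
# Expand a fault tree top-down into minimal cut sets: OR gates union their
# children's cut sets, AND gates take the cross-product. Hypothetical tree.

TREE = {
    "TOP": ("OR", ["G1", "E3"]),
    "G1": ("AND", ["E1", "E2"]),
}

def cut_sets(node):
    if node not in TREE:                       # basic event (leaf)
        return [frozenset([node])]
    op, children = TREE[node]
    if op == "OR":                             # union of children's cut sets
        result = [cs for c in children for cs in cut_sets(c)]
    else:                                      # AND: combine children's cut sets
        result = [frozenset()]
        for c in children:
            result = [a | b for a in result for b in cut_sets(c)]
    # keep only minimal sets (drop any set that strictly contains another)
    return [s for s in result if not any(t < s for t in result)]

print(cut_sets("TOP"))  # -> [frozenset({'E1', 'E2'}), frozenset({'E3'})]
```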
Semantic framework for mapping object-oriented model to semantic web languages
Ježek, Petr; Mouček, Roman
2015-01-01
The article discusses two main approaches in building semantic structures for electrophysiological metadata: the use of conventional data structures, repositories, and programming languages on one hand, and the use of formal representations of ontologies, known from knowledge representation, such as description logics or semantic web languages, on the other hand. Although knowledge engineering offers languages supporting richer semantic means of expression and technologically advanced approaches, conventional data structures and repositories are still popular among developers, administrators and users because of their simplicity, overall intelligibility, and lower demands on technical equipment. The choice of conventional data resources and repositories, however, raises the question of how and where to add semantics that cannot be naturally expressed using them. As one of the possible solutions, this semantics can be added into the structures of the programming language that accesses and processes the underlying data. To support this idea we introduced a software prototype that enables its users to add semantically richer expressions into Java object-oriented code. This approach does not burden users with additional demands on the programming environment since reflective Java annotations were used as an entry for these expressions. Moreover, additional semantics need not be written by the programmer directly in the code, but can be collected from non-programmers using a graphic user interface. The mapping that allows the transformation of the semantically enriched Java code into the Semantic Web language OWL was proposed and implemented in a library named the Semantic Framework. This approach was validated by the integration of the Semantic Framework in the EEG/ERP Portal and by the subsequent registration of the EEG/ERP Portal in the Neuroscience Information Framework. PMID:25762923
Applying gene regulatory network logic to the evolution of social behavior.
Baran, Nicole M; McGrath, Patrick T; Streelman, J Todd
2017-06-06
Animal behavior is ultimately the product of gene regulatory networks (GRNs) for brain development and neural networks for brain function. The GRN approach has advanced the fields of genomics and development, and we identify organizational similarities between networks of genes that build the brain and networks of neurons that encode brain function. In this perspective, we engage the analogy between developmental networks and neural networks, exploring the advantages of using GRN logic to study behavior. Applying the GRN approach to the brain and behavior provides a quantitative and manipulative framework for discovery. We illustrate features of this framework using the example of social behavior and the neural circuitry of aggression.
[Documenting a rehabilitation program using a logic model: an advantage to the assessment process].
Poncet, Frédérique; Swaine, Bonnie; Pradat-Diehl, Pascale
2017-03-06
The cognitive and behavioral disorders after brain injury can result in severe limitations of activities and restrictions of participation. An interdisciplinary rehabilitation program was developed in physical medicine and rehabilitation at the Pitié-Salpêtrière Hospital, Paris, France. Clinicians believe this program decreases activity limitations and improves participation in patients. However, the program's effectiveness had never been assessed. To do this, we first had to define and describe the program. Rehabilitation programs, however, are holistic and thus complex, making them difficult to describe. Therefore, to facilitate the evaluation of complex programs, including those for rehabilitation, we illustrate the use of a theoretical logic model, as proposed by Champagne, through the process of documenting a specific complex and interdisciplinary rehabilitation program. Through participatory/collaborative research, the rehabilitation program was analyzed using three "submodels" of the logic model of intervention: the causal model, the intervention model and the program theory model. This should facilitate the evaluation of programs, including those for rehabilitation.
A conceptual review of decision making in social dilemmas: applying a logic of appropriateness.
Weber, J Mark; Kopelman, Shirli; Messick, David M
2004-01-01
Despite decades of experimental social dilemma research, "theoretical integration has proven elusive" (Smithson & Foddy, 1999, p. 14). To advance a theory of decision making in social dilemmas, this article provides a conceptual review of the literature that applies a "logic of appropriateness" (March, 1994) framework. The appropriateness framework suggests that people making decisions ask themselves (explicitly or implicitly), "What does a person like me do in a situation like this?" This question identifies 3 significant factors: recognition and classification of the kind of situation encountered, the identity of the individual making the decision, and the application of rules or heuristics in guiding behavioral choice. In contrast with dominant rational choice models, the appropriateness framework proposed accommodates the inherently social nature of social dilemmas, and the role of rule- and heuristic-based processing. Implications for the interpretation of past findings and the direction of future research are discussed.
Logic Design Pathology and Space Flight Electronics
NASA Technical Reports Server (NTRS)
Katz, Richard B.; Barto, Rod L.; Erickson, Ken
1999-01-01
This paper presents a look at logic design from early in the US Space Program and examines faults in recent logic designs. Most examples are based on flight hardware failures and analysis of new tools and techniques. The paper is presented in viewgraph form.
The CMS tracker control system
NASA Astrophysics Data System (ADS)
Dierlamm, A.; Dirkes, G. H.; Fahrer, M.; Frey, M.; Hartmann, F.; Masetti, L.; Militaru, O.; Shah, S. Y.; Stringer, R.; Tsirou, A.
2008-07-01
The Tracker Control System (TCS) is a distributed control software to operate about 2000 power supplies for the silicon modules of the CMS Tracker and monitor its environmental sensors. TCS must thus be able to handle about 10^4 power supply parameters, about 10^3 environmental probes from the Programmable Logic Controllers of the Tracker Safety System (TSS), and about 10^5 parameters read via DAQ from the DCUs in all front end hybrids and from CCUs in all control groups. TCS is built on top of an industrial SCADA program (PVSS) extended with a framework developed at CERN (JCOP) and used by all LHC experiments. The logical partitioning of the detector is reflected in the hierarchical structure of the TCS, where commands move down to the individual hardware devices, while states are reported up to the root, which is interfaced to the broader CMS control system. The system computes and continuously monitors the mean and maximum values of critical parameters and updates the percentage of currently operating hardware. Automatic procedures switch off selected parts of the detector using detailed granularity and avoiding widespread TSS intervention.
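The "commands down, states up" hierarchy is a generic pattern that can be sketched independently of PVSS/JCOP. In the hypothetical fragment below (invented node names, nothing CMS-specific), a node's summary state is simply the worst state in its subtree.

```python
# Hierarchical control-tree sketch: commands propagate down to leaves (e.g.,
# power supplies), summary states propagate up to the root.

SEVERITY = {"OK": 0, "WARNING": 1, "ERROR": 2}

class Node:
    def __init__(self, name, children=None, state="OK"):
        self.name, self.children, self.state = name, children or [], state

    def command(self, cmd):
        """Commands move down the tree to the individual devices."""
        for c in self.children:
            c.command(cmd)

    def summary_state(self):
        """States are reported up: a node is as bad as its worst descendant."""
        states = [self.state] + [c.summary_state() for c in self.children]
        return max(states, key=lambda s: SEVERITY[s])

leaf = Node("PS_042", state="WARNING")
root = Node("Tracker", [Node("Barrel", [leaf]), Node("Endcap")])
print(root.summary_state())  # -> WARNING
```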
The Illness Narratives of Health Managers: Developing an Analytical Framework
ERIC Educational Resources Information Center
Exworthy, Mark
2011-01-01
This paper examines the personal experience of illness and healthcare by health managers through their illness narratives. By synthesising a wider literature of illness narratives and health management, an analytical framework is presented, which considers the impact of illness narratives, comprising the logic of illness narratives, the actors…
Framework for Transforming Departmental Culture to Support Educational Innovation
ERIC Educational Resources Information Center
Corbo, Joel C.; Reinholz, Daniel L.; Dancy, Melissa H.; Deetz, Stanley; Finkelstein, Noah
2016-01-01
This paper provides a research-based framework for promoting institutional change in higher education. To date, most educational change efforts have focused on relatively narrow subsets of the university system (e.g., faculty teaching practices or administrative policies) and have been largely driven by implicit change logics; both of these…
MELD: A Logical Approach to Distributed and Parallel Programming
Goldstein, Seth Copen; Cruz, Flavio
2012-03-01
NASA Astrophysics Data System (ADS)
Ducksbury, P. G.; Kennedy, C.; Lock, Z.
2003-09-01
Grammars have been used for the formal specification of programming languages, and there are a number of commercial products which now use grammars. However, these have tended to be focused mainly on flow-control type applications. In this paper, we consider the potential use of picture grammars and inductive logic programming in generic image understanding applications, such as object recognition. A number of issues are considered, such as what type of grammar needs to be used, how to construct the grammar with its associated attributes, and the difficulties encountered in parsing grammars, followed by the issues involved in automatically learning grammars using a genetic algorithm. The concept of inductive logic programming is then introduced as a method that can overcome some of the earlier difficulties.
Logic programming and metadata specifications
NASA Technical Reports Server (NTRS)
Lopez, Antonio M., Jr.; Saacks, Marguerite E.
1992-01-01
Artificial intelligence (AI) ideas and techniques are critical to the development of intelligent information systems that will be used to collect, manipulate, and retrieve the vast amounts of space data produced by 'Missions to Planet Earth.' Natural language processing, inference, and expert systems are at the core of this space application of AI. This paper presents logic programming as an AI tool that can support inference (the ability to draw conclusions from a set of complicated and interrelated facts). It reports on the use of logic programming in the study of metadata specifications for a small problem domain of airborne sensors, and the dataset characteristics and pointers that are needed for data access.
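The inference capability this abstract refers to is essentially Horn-clause forward chaining over metadata facts. A hedged, minimal sketch follows; the sensor facts and the "hyperspectral" rule are invented, not the paper's actual airborne-sensor metadata specifications.

```python
# Forward chaining to a fixed point over metadata facts, logic-programming style.

facts = {("sensor", "AVIRIS"), ("airborne", "AVIRIS"),
         ("band_count", "AVIRIS", 224)}

def rules(fs):
    """Derive new facts: an airborne sensor with many bands is hyperspectral."""
    new = set()
    for (_, s, n) in (f for f in fs if f[0] == "band_count"):
        if ("airborne", s) in fs and n >= 100:
            new.add(("hyperspectral", s))
    return new

while True:                       # iterate until no new facts are derivable
    derived = rules(facts) - facts
    if not derived:
        break
    facts |= derived

print(("hyperspectral", "AVIRIS") in facts)  # -> True
```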
Oral health disparities and the workforce: a framework to guide innovation.
Hilton, Irene V; Lester, Arlene M
2010-06-01
Oral health disparities currently exist in the United States, and workforce innovations have been proposed as one strategy to address these disparities. A framework is needed to logically assess the possible role of the workforce as a contributor to oral health disparities and to analyze workforce strategies addressing the issue. Using an existing framework, A Strategic Framework for Improving Racial/Ethnic Minority Health and Eliminating Racial/Ethnic Health Disparities, workforce was sequentially applied across individual, environmental/community, and system levels to identify long-term problems, contributing factors, strategies/innovation, measurable outcomes/impacts, and long-term goals. Examples of current workforce innovations were applied to the framework. Contributing factors to oral health disparities included lack of racial/ethnic diversity in the workforce, lack of appropriate training, provider distribution, and a non-user-centered system. The framework was applied to selected workforce innovation models, delineating the potential impact on contributing factors across the individual, environmental/community, and system levels. The framework helps to define expected outcomes from workforce models that would contribute to the goal of reducing oral health disparities and to examine impacts across multiple levels. However, the contributing factors to oral health disparities cannot be addressed by workforce innovation alone. The Strategic Framework is a logical approach to guide workforce innovation and solutions, and to identify other aspects of the oral healthcare delivery system that need innovation in order to reduce oral health disparities.
Choi, Jeeyae; Bakken, Suzanne; Lussier, Yves A; Mendonça, Eneida A
2006-01-01
Medical logic modules are a procedural representation for sharing task-specific knowledge for decision support systems. Based on the premise that clinicians may perceive object-oriented expressions as easier to read than procedural rules in Arden Syntax-based medical logic modules, we developed a method for improving the readability of medical logic modules. Two approaches were applied: exploiting the concept-oriented features of the Medical Entities Dictionary and building an executable Java program to replace Arden Syntax procedural expressions. The usability evaluation showed that 66% of participants successfully mapped all Arden Syntax rules to Java methods. These findings suggest that these approaches can play an essential role in the creation of human readable medical logic modules and can potentially increase the number of clinical experts who are able to participate in the creation of medical logic modules. Although our approaches are broadly applicable, we specifically discuss the relevance to concept-oriented nursing terminologies and automated processing of task-specific nursing knowledge.
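The premise, procedural Arden-style rules versus object-oriented methods, can be illustrated with a hedged toy example. It is written in Python rather than the paper's Java, and the creatinine rule, threshold, and class names are invented.

```python
# The same decision logic, phrased as an object-oriented method instead of a
# procedural Arden Syntax logic slot.

class Patient:
    def __init__(self, creatinine_mg_dl):
        self.creatinine_mg_dl = creatinine_mg_dl

    def has_renal_impairment(self, threshold=1.5):
        # Object-oriented counterpart of an Arden-style rule like:
        #   IF creatinine > 1.5 THEN conclude true;
        return self.creatinine_mg_dl > threshold

def alert_if_needed(patient):
    if patient.has_renal_impairment():
        return "ALERT: consider dose adjustment"
    return "no action"

print(alert_if_needed(Patient(creatinine_mg_dl=2.1)))
```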
Interpretation of IEEE-854 floating-point standard and definition in the HOL system
NASA Technical Reports Server (NTRS)
Carreno, Victor A.
1995-01-01
The ANSI/IEEE Standard 854-1987 for floating-point arithmetic is interpreted by converting the lexical descriptions in the standard into mathematical conditional descriptions organized in tables. The standard is represented in higher-order logic within the framework of the HOL (Higher Order Logic) system. The paper is divided into two parts, the first part being the interpretation and the second part the description in HOL.
Agent Based Modeling and Simulation Framework for Supply Chain Risk Management
2012-03-01
…(Christopher and Peck 2004); macroeconomic, policy, competition, and resource (Ghoshal 1987); value chain, operational, event, and recurring (Shi 2004)… clustering algorithms in agent logic to protect company privacy (da Silva et al. 2006), aggregation of domain context in agent data analysis logic (Xiang…)… Operational Availability (OA) for FMC and PMC. Mission Capable (MICAP) Hours is the measure of total time (in a month) consumable or reparable…
Ordering Traces Logically to Identify Lateness in Message Passing Programs
Isaacs, Katherine E.; Gamblin, Todd; Bhatele, Abhinav; ...
2015-03-30
Event traces are valuable for understanding the behavior of parallel programs. However, automatically analyzing a large parallel trace is difficult, especially without a specific objective. We aid this endeavor by extracting a trace's logical structure, an ordering of trace events derived from happened-before relationships, while taking into account developer intent. Using this structure, we can calculate an operation's delay relative to its peers on other processes. The logical structure also serves as a platform for comparing and clustering processes as well as highlighting communication patterns in a trace visualization. We present an algorithm for determining this idealized logical structure from traces of message passing programs, and we develop metrics to quantify delays and differences among processes. We implement our techniques in Ravel, a parallel trace visualization tool that displays both logical and physical timelines. Rather than showing the duration of each operation, we display where delays begin and end, and how they propagate. As a result, we apply our approach to the traces of several message passing applications, demonstrating the accuracy of our extracted structure and its utility in analyzing these codes.
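A much-simplified sketch of the underlying idea, assigning each event a logical timestep from its happened-before predecessors, is below. The events and edges are invented, and the paper's actual algorithm additionally accounts for developer intent and communication patterns.

```python
# Longest-path layering over a happened-before DAG: each event's logical step
# is one more than the latest step among its predecessors.

from collections import defaultdict

# edges (a, b): event a happened before event b (program order or send->recv)
edges = [("p0.send", "p1.recv"), ("p1.recv", "p1.send"), ("p1.send", "p0.recv")]

def logical_steps(edges):
    preds = defaultdict(set)
    events = set()
    for a, b in edges:
        preds[b].add(a)
        events |= {a, b}
    step = {}
    while len(step) < len(events):
        for e in events:
            if e not in step and all(p in step for p in preds[e]):
                step[e] = 1 + max((step[p] for p in preds[e]), default=-1)
    return step

print(logical_steps(edges))
# -> {'p0.send': 0, 'p1.recv': 1, 'p1.send': 2, 'p0.recv': 3}
```

Delays then show up as events whose physical time lags well behind peers at the same logical step.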
Chen, Dong; Giampapa, Mark; Heidelberger, Philip; Ohmacht, Martin; Satterfield, David L; Steinmacher-Burow, Burkhard; Sugavanam, Krishnan
2013-05-21
A system and method for enhancing performance of a computer, which includes a computer system with a data storage device. The computer system includes a program stored in the data storage device, and steps of the program are executed by a processor. The processor processes instructions from the program. A wait state in the processor waits for receiving specified data. A thread in the processor has a pause state wherein the processor waits for specified data. A pin in the processor initiates a return to an active state from the pause state for the thread. A logic circuit is external to the processor, and the logic circuit is configured to detect a specified condition. The pin initiates a return to the active state of the thread when the specified condition is detected using the logic circuit.
Answer Sets in a Fuzzy Equilibrium Logic
NASA Astrophysics Data System (ADS)
Schockaert, Steven; Janssen, Jeroen; Vermeir, Dirk; de Cock, Martine
Since its introduction, answer set programming has been generalized in many directions, to cater to the needs of real-world applications. As one of the most general “classical” approaches, answer sets of arbitrary propositional theories can be defined as models in the equilibrium logic of Pearce. Fuzzy answer set programming, on the other hand, extends answer set programming with the capability of modeling continuous systems. In this paper, we combine the expressiveness of both approaches, and define answer sets of arbitrary fuzzy propositional theories as models in a fuzzification of equilibrium logic. We show that the resulting notion of answer set is compatible with existing definitions, when the syntactic restrictions of the corresponding approaches are met. We furthermore locate the complexity of the main reasoning tasks at the second level of the polynomial hierarchy. Finally, as an illustration of its modeling power, we show how fuzzy equilibrium logic can be used to find strong Nash equilibria.
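To ground the fuzzy side of the construction, here is a minimal, hedged sketch of evaluating one fuzzy rule under Łukasiewicz connectives. The truth values are invented, and fuzzy equilibrium logic itself adds a here-and-there minimality construction on top of this kind of valuation.

```python
# Degree to which the rule  c <- a and b  holds under a fuzzy interpretation I,
# using Łukasiewicz connectives.

def t_and(x, y):   # Łukasiewicz t-norm
    return max(0.0, x + y - 1.0)

def t_impl(x, y):  # Łukasiewicz residual implication
    return min(1.0, 1.0 - x + y)

I = {"a": 0.8, "b": 0.6, "c": 0.3}

print(t_impl(t_and(I["a"], I["b"]), I["c"]))
# -> 0.9 (up to float rounding): body degree 0.4, head degree 0.3
```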
Programming Rational Agents in GOAL
NASA Astrophysics Data System (ADS)
Hindriks, Koen V.
The agent programming language GOAL is a high-level programming language to program rational agents that derive their choice of action from their beliefs and goals. The language provides the basic building blocks to design and implement rational agents by means of a set of programming constructs. These programming constructs allow and facilitate the manipulation of an agent's beliefs and goals and the structuring of its decision-making. GOAL agents are called rational because they satisfy a number of basic rationality constraints and because they decide to perform actions to further their goals based upon a reasoning scheme derived from practical reasoning. The programming concepts of belief and goal incorporated into GOAL provide the basis for this form of reasoning and are similar to their common sense counterparts used every day to explain the actions that we perform. In addition, GOAL provides the means for agents to focus their attention on specific goals and to communicate at the knowledge level. This provides an intuitive basis for writing high-level agent programs. At the same time these concepts and programming constructs have a well-defined, formal semantics. The formal semantics provides the basis for defining a verification framework for GOAL for verifying and reasoning about GOAL agents which is similar to some of the well-known agent logics introduced in the literature.
'Healthy Eating and Lifestyle in Pregnancy (HELP)' trial: Process evaluation framework.
Simpson, Sharon A; Cassidy, Dunla; John, Elinor
2014-07-01
We developed and tested in a cluster RCT a theory-driven group-based intervention for obese pregnant women. It was designed to support women to moderate weight gain during pregnancy and reduce BMI one year after birth, in addition to targeting secondary health and wellbeing outcomes. In line with MRC guidance on developing and evaluating complex interventions in health, we conducted a process evaluation alongside the trial. This paper describes the development of the process evaluation framework. This cluster RCT recruited 598 pregnant women. Women in the intervention group were invited to attend a weekly weight-management group. Following a review of relevant literature, we developed a process evaluation framework which outlined the key process indicators that we wanted to address and how we would measure them. Central to the process evaluation was understanding the mechanism of effect of the intervention. We utilised a logic-modelling approach to describe the intervention, which helped us focus on what potential mediators of intervention effect to measure, and how. The resulting process evaluation framework was designed to address nine core elements, including context, reach, exposure, recruitment, fidelity, retention, contamination and theory-testing. These were assessed using a variety of qualitative and quantitative approaches. The logic model explained the processes by which intervention components bring about change in target outcomes through various mediators and theoretical pathways, including self-efficacy, social support, self-regulation and motivation. Process evaluation is a key element in assessing the effect of any RCT. We developed a process evaluation framework and logic model, and the results of analyses using these will offer insights into why the intervention is or is not effective. Copyright © 2014.
Monitoring Java Programs with Java PathExplorer
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)
2001-01-01
We present recent work on the development of Java PathExplorer (JPAX), a tool for monitoring the execution of Java programs. JPAX can be used during program testing to gain increased information about program executions, and can potentially furthermore be applied during operation to survey safety critical systems. The tool facilitates automated instrumentation of a program's byte code, which will then emit events to an observer during its execution. The observer checks the events against user-provided high level requirement specifications, for example temporal logic formulae, and against lower level error detection procedures, for example concurrency-related ones such as deadlock and data race algorithms. High level requirement specifications together with their underlying logics are defined in the Maude rewriting logic, and can then either be directly checked using the Maude rewriting engine, or be first translated to efficient data structures and then checked in Java.
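A monitor for a past-time temporal property, the kind of check such an observer performs event by event, can be sketched in a few lines. This is illustrative only: JPAX instruments Java byte code and defines its logics in Maude, and the "every write is preceded by a login" property is invented.

```python
# Check the past-time property "every 'write' is preceded by some 'login'"
# over a finite execution trace, one event at a time.

def monitor(trace):
    seen_login = False
    for i, event in enumerate(trace):
        if event == "login":
            seen_login = True
        if event == "write" and not seen_login:
            return f"violation at event {i}: write before any login"
    return "trace OK"

print(monitor(["read", "write", "login"]))   # -> violation at event 1: ...
print(monitor(["login", "write"]))           # -> trace OK
```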
NASA Astrophysics Data System (ADS)
Lanzalaco, Felix; Pissanetzky, Sergio
2013-12-01
A recent theory of physical information based on the fundamental principles of causality and thermodynamics has proposed that a large number of observable life and intelligence signals can be described in terms of the Causal Mathematical Logic (CML), which is proposed to encode the natural principles of intelligence across any physical domain and substrate. We attempt to expound the current definition of CML, the "Action functional", as a theory in terms of its ability to possess a superior explanatory power for the current neuroscientific data we use to measure the mammalian brain's "intelligence" processes at its most general biophysical level. Brain simulation projects define their success partly in terms of the emergence of "non-explicitly programmed" complex biophysical signals such as self-oscillation and spreading cortical waves. Here we propose to extend the causal theory to predict and guide the understanding of these more complex emergent "intelligence signals". To achieve this we review whether causal logic is consistent with, can explain, and can predict the function of complete perceptual processes associated with intelligence. Primarily those are defined as the range of Event Related Potentials (ERP), which include their primary subcomponents, Event Related Desynchronization (ERD) and Event Related Synchronization (ERS). This approach aims for a universal and predictive logic for neurosimulation and AGI. The result of this investigation has produced a general "Information Engine" model from translation of the ERD and ERS. The CML algorithm, run in terms of action cost, predicts ERP signal contents and is consistent with the fundamental laws of thermodynamics. A working substrate-independent natural information logic would be a major asset. An information theory consistent with fundamental physics can be an AGI. It can also operate within genetic information space and provides a roadmap to understand the live biophysical operation of the phenotype.
ERIC Educational Resources Information Center
Straumanis, Joan
A major problem in teaching symbolic logic is that of providing individualized and early feedback to students who are learning to do proofs. To overcome this difficulty, a computer program was developed which functions as a line-by-line proof checker in Sentential Calculus. The program, DEMON, first evaluates any statement supplied by the student…
DOE Office of Scientific and Technical Information (OSTI.GOV)
McHale, M.L.
The field of artificial intelligence strives to produce computer programs that exhibit intelligent behavior. One of the areas of interest is the processing of natural language. This report discusses the role of the computer language PROLOG in Natural Language Processing (NLP), both from theoretic and pragmatic viewpoints. The reasons for using PROLOG for NLP are numerous. First, linguists can write natural-language grammars almost directly as PROLOG programs; this allows fast prototyping of NLP systems and facilitates analysis of NLP theories. Second, semantic representations of natural-language texts that use logic formalisms are readily produced in PROLOG because of PROLOG's logical foundations. Third, PROLOG's built-in inferencing mechanisms are often sufficient for inferences on the logical forms produced by NLP systems. Fourth, the logical, declarative nature of PROLOG may make it the language of choice for parallel computing systems. Finally, the fact that PROLOG has a de facto standard (Edinburgh) makes the porting of code from one computer system to another virtually trouble free. Perhaps the strongest tie one could make between NLP and PROLOG was stated by John Stuart Mill in his inaugural address at St. Andrews: "The structure of every sentence is a lesson in logic."
1981-01-01
… work in the area of artificial intelligence and those used in general program development into a … logic programming with LISP for implementing intelligent data base query systems. Continued developments will allow for enhancements to be made to the …
Logic Models: A Tool for Effective Program Planning, Collaboration, and Monitoring. REL 2014-025
ERIC Educational Resources Information Center
Kekahio, Wendy; Lawton, Brian; Cicchinelli, Louis; Brandon, Paul R.
2014-01-01
A logic model is a visual representation of the assumptions and theory of action that underlie the structure of an education program. A program can be a strategy for instruction in a classroom, a training session for a group of teachers, a grade-level curriculum, a building-level intervention, or a district-or statewide initiative. This guide, an…
Burnett, E; Curran, E; Loveday, H P; Kiernan, M A; Tannahill, M
2014-01-01
Healthcare is delivered in a dynamic environment with frequent changes in populations, methods, equipment and settings. Infection prevention and control practitioners (IPCPs) must ensure that they are competent in addressing the challenges they face and are equipped to develop infection prevention and control (IPC) services in line with a changing world of healthcare provision. A multifaceted Framework was developed to assist IPCPs to enhance competence at an individual, team and organisational level to enable quality performance and improved quality of care. However, if these aspirations are to be met, it is vital that competency frameworks are fit for purpose or they risk being ignored. The aim of this unique study was to evaluate short and medium term outcomes as set out in the Outcome Logic Model to assist with the evaluation of the impact and success of the Framework. This study found that while the Framework is being used effectively in some areas, it is not being used as much or in the ways that were anticipated. The findings will enable future work on revision, communication and dissemination, and will provide intelligence to those initiating education and training in the utilisation of the competences.
Intelligent manipulation technique for multi-branch robotic systems
NASA Technical Reports Server (NTRS)
Chen, Alexander Y. K.; Chen, Eugene Y. S.
1990-01-01
New analytical development in kinematics planning is reported. The INtelligent KInematics Planner (INKIP) consists of the kinematics spline theory and the adaptive logic annealing process. Also, a novel framework of robot learning mechanism is introduced. The FUzzy LOgic Self Organized Neural Networks (FULOSONN) integrates fuzzy logic in commands, control, searching, and reasoning, the embedded expert system for nominal robotics knowledge implementation, and the self organized neural networks for the dynamic knowledge evolutionary process. Progress on the mechanical construction of SRA Advanced Robotic System (SRAARS) and the real time robot vision system is also reported. A decision was made to incorporate the Local Area Network (LAN) technology in the overall communication system.
Cognitive pathways and historical research.
Sutherland, J A
1997-01-01
The nursing literature is replete with articles detailing the logical reasoning processes required by the individual scientist to implement the rigors of research and theory development. Much less attention has been focused on creative and critical thinking as modes for deriving explanations, inferences, and conclusions essential to science as a product. Historical research, as a particular kind of qualitative research, is dependent on and compatible with such mental strategies as logical, creative, and critical thinking. These strategies depict an intellectual framework for the scientist examining archival data and offer a structure for such inquiry. A model for analyzing historical data delineating the cognitive pathways of logical reasoning, creative processing, and critical thinking is proposed.
Neural networks and logical reasoning systems: a translation table.
Martins, J; Mendes, R V
2001-04-01
A correspondence is established between the basic elements of logic reasoning systems (knowledge bases, rules, inference and queries) and the structure and dynamical evolution laws of neural networks. The correspondence is pictured as a translation dictionary which might allow one to go back and forth between symbolic and network formulations, a desirable step in learning-oriented systems and multicomputer networks. In the framework of Horn clause logics, it is found that atomic propositions with n arguments correspond to nodes with nth-order synapses, rules to synaptic intensity constraints, forward chaining to synaptic dynamics, and queries either to simple node activation or to a query tensor dynamics.
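One row of such a translation table can be made concrete: a Horn rule behaves like a threshold unit over its body atoms, and forward chaining like repeated synchronous network updates. The sketch below is a hedged toy with invented rules and no learned synaptic weights.

```python
# "c :- a, b" as a unit that fires when all its body atoms are active;
# forward chaining = iterating the network dynamics to a stable point.

rules = {"c": [["a", "b"]], "d": [["c"]]}   # head -> list of alternative bodies

def step(active):
    """One synchronous update: fire every unit with a fully active body."""
    fired = {h for h, bodies in rules.items()
             if any(all(atom in active for atom in body) for body in bodies)}
    return active | fired

state = {"a", "b"}          # input activation (the query facts)
while True:
    nxt = step(state)
    if nxt == state:        # fixed point of the dynamics = logical closure
        break
    state = nxt
print(sorted(state))        # -> ['a', 'b', 'c', 'd']
```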
[Psychodrama as a pedagogical teaching strategy about worker's health].
Martins, Júlia Trevisan; Opitz, Simone Perufo; Robazzi, Maria Lúcia do Carmo
2004-04-01
This study reports the experience of using pedagogic psychodrama as a teaching and learning strategy about workers' health. It was developed with 18 students from the Master's Program of the School of Nursing of the University of São Paulo at Ribeirão Preto, during the second semester of 2002. Interactive, dynamic and interpersonal activities and role playing were initially conducted, seeking the students' and educator's spontaneity. Moreno's psychodramatic theory was the theoretical framework used. Creativity, logical reasoning, involvement with learning, and organization of concepts based on the students' own lived experience were observed, contributing to the experience as a whole. The experiment was therefore considered successful.
Technical Assistance Model for Long-Term Systems Change: Three State Examples
ERIC Educational Resources Information Center
Kasprzak, Christina; Hurth, Joicey; Lucas, Anne; Marshall, Jacqueline; Terrell, Adriane; Jones, Elizabeth
2010-01-01
The National Early Childhood Technical Assistance Center (NECTAC) Technical Assistance (TA) Model for Long-Term Systems Change (LTSC) is grounded in conceptual frameworks in the literature on systems change and systems thinking. The NECTAC conceptual framework uses a logic model approach to change developed specifically for states' infant and…
Usage Intention Framework Model: A Fuzzy Logic Interpretation of the Classical Utaut Model
ERIC Educational Resources Information Center
Sandaire, Johnny
2009-01-01
A fuzzy conjoint analysis (FCA: Turksen, 1992) model for enhancing management decision in the technology adoption domain was implemented as an extension to the UTAUT model (Venkatesh, Morris, Davis, & Davis, 2003). Additionally, a UTAUT-based Usage Intention Framework Model (UIFM) introduced a closed-loop feedback system. The empirical evidence…
Mediation Analysis in a Latent Growth Curve Modeling Framework
ERIC Educational Resources Information Center
von Soest, Tilmann; Hagtvet, Knut A.
2011-01-01
This article presents several longitudinal mediation models in the framework of latent growth curve modeling and provides a detailed account of how such models can be constructed. Logical and statistical challenges that might arise when such analyses are conducted are also discussed. Specifically, we discuss how the initial status (intercept) and…
Programmable Logic Controllers. Teacher Edition.
ERIC Educational Resources Information Center
Rauh, Bob; Kaltwasser, Stan
These materials were developed for a seven-unit secondary or postsecondary education course on programmable logic controllers (PLCs) that treats most of the skills needed to work effectively with PLCs as programming skills. The seven units of the course cover the following topics: fundamentals of programmable logic controllers; contacts, timers,…
Putting time into proof outlines
NASA Technical Reports Server (NTRS)
Schneider, Fred B.; Bloom, Bard; Marzullo, Keith
1991-01-01
A logic for reasoning about timing of concurrent programs is presented. The logic is based on proof outlines and can handle maximal parallelism as well as resource-constrained execution environments. The correctness proof for a mutual exclusion protocol that uses execution timings in a subtle way illustrates the logic in action.
Conceptualising the effectiveness of impact assessment processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chanchitpricha, Chaunjit, E-mail: chaunjit@g.sut.ac.th; Bond, Alan, E-mail: alan.bond@uea.ac.uk; Unit for Environmental Sciences and Management School of Geo and Spatial Sciences, Internal Box 375, North West University
2013-11-15
This paper aims at conceptualising the effectiveness of impact assessment processes through the development of a literature-based framework of criteria to measure impact assessment effectiveness. Four categories of effectiveness were established: procedural, substantive, transactive and normative, each containing a number of criteria; no studies have previously brought together all four of these categories into such a comprehensive, criteria-based framework and undertaken systematic evaluation of practice. The criteria can be mapped within a cycle (or cycles) of evaluation, based on the 'logic model', at the stages of input, process, output and outcome to enable the identification of connections between the criteria across the categories of effectiveness. This framework is considered to have potential application in measuring the effectiveness of many impact assessment processes, including strategic environmental assessment (SEA), environmental impact assessment (EIA), social impact assessment (SIA) and health impact assessment (HIA). -- Highlights: • Conceptualising effectiveness of impact assessment processes. • Identification of factors influencing effectiveness of impact assessment processes. • Development of criteria within a framework for evaluating IA effectiveness. • Applying the logic model to examine connections between effectiveness criteria.
The development of a digital logic concept inventory
NASA Astrophysics Data System (ADS)
Herman, Geoffrey Lindsay
Instructors in electrical and computer engineering and in computer science have developed innovative methods to teach digital logic circuits. These methods attempt to increase student learning, satisfaction, and retention. Although there are readily accessible and accepted means for measuring satisfaction and retention, there are no widely accepted means for assessing student learning. Rigorous assessment of learning is elusive because differences in topic coverage, curriculum and course goals, and exam content prevent direct comparison of two teaching methods when using tools such as final exam scores or course grades. Because of these difficulties, computing educators have issued a general call for the adoption of assessment tools to critically evaluate and compare the various teaching methods. Science, Technology, Engineering, and Mathematics (STEM) education researchers commonly measure students' conceptual learning to compare how much different pedagogies improve learning. Conceptual knowledge is often preferred because all engineering courses should teach a fundamental set of concepts even if they emphasize design or analysis to different degrees. Increasing conceptual learning is also important, because students who can organize facts and ideas within a consistent conceptual framework are able to learn new information quickly and can apply what they know in new situations. If instructors can accurately assess their students' conceptual knowledge, they can target instructional interventions to remedy common problems. To properly assess conceptual learning, several researchers have developed concept inventories (CIs) for core subjects in engineering sciences. CIs are multiple-choice assessment tools that evaluate how well a student's conceptual framework matches the accepted conceptual framework of a discipline or common faulty conceptual frameworks. We present how we created and evaluated the digital logic concept inventory (DLCI). We used a Delphi process to identify the important and difficult concepts to include on the DLCI. To discover and describe common student misconceptions, we interviewed students who had completed a digital logic course. Students vocalized their thoughts as they solved digital logic problems. We analyzed the interview data using a qualitative grounded theory approach. We have administered the DLCI at several institutions and have checked the validity, reliability, and bias of the DLCI with classical testing theory procedures. These procedures consisted of follow-up interviews with students, analysis of administration results with statistical procedures, and expert feedback. We discuss these results and present the DLCI's potential for providing a meaningful tool for comparing student learning at different institutions.
UTP and Temporal Logic Model Checking
NASA Astrophysics Data System (ADS)
Anderson, Hugh; Ciobanu, Gabriel; Freitas, Leo
In this paper we give an additional perspective to the formal verification of programs through temporal logic model checking, which uses Hoare and He's Unifying Theories of Programming (UTP). Our perspective emphasizes the use of UTP designs, an alphabetised relational calculus expressed as a pre/post condition pair of relations, to verify state or temporal assertions about programs. The temporal model checking relation is derived from a satisfaction relation between the model and its properties. The contribution of this paper is that it shows a UTP perspective on temporal logic model checking. The approach includes the notion of efficiency found in traditional model checkers, which reduce the state explosion problem through the use of efficient data structures.
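For readers without a model checking background, the underlying task can be grounded with a few lines of explicit-state reachability checking. The toy two-process system and mutual exclusion property below are invented; UTP designs and the paper's satisfaction relation are not reproduced here.

```python
# Verify a safety assertion ("never both processes in the critical section")
# by exhaustively exploring the reachable states of a toy transition system.

def successors(s):
    moves = {"idle": "trying", "trying": "crit", "crit": "idle"}
    for i in (0, 1):
        nxt = moves[s[i]]
        # guard: may only enter the critical section if the other is outside it
        if nxt == "crit" and s[1 - i] == "crit":
            continue
        t = list(s)
        t[i] = nxt
        yield tuple(t)

def check(init, bad):
    seen, stack = {init}, [init]
    while stack:
        s = stack.pop()
        if bad(s):
            return f"property violated in state {s}"
        for n in successors(s):
            if n not in seen:
                seen.add(n)
                stack.append(n)
    return "property holds in all reachable states"

print(check(("idle", "idle"), lambda s: s[0] == "crit" and s[1] == "crit"))
```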
Huang, Wei Tao; Luo, Hong Qun; Li, Nian Bing
2014-05-06
The most serious, and yet unsolved, problem of constructing molecular computing devices consists in connecting all of the individual molecular events into a usable device. This report demonstrates the use of a Boolean logic tree for analyzing the chemical event network based on graphene, organic dye, thrombin aptamer, and the Fenton reaction, organizing and connecting these basic chemical events. This chemical event network can be utilized to implement fluorescent combinatorial logic (including basic logic gates and complex integrated logic circuits) and fuzzy logic computing. On the basis of the Boolean logic tree analysis and logic computing, these basic chemical events can be considered as programmable "words" and chemical interactions as "syntax" logic rules to construct a molecular search engine for performing intelligent molecular search queries. Our approach is helpful in developing advanced logic programs based on molecules for application in biosensing, nanotechnology, and drug delivery.
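The "logic tree" reading can be made concrete with a hedged sketch: leaves are yes/no chemical inputs, internal nodes are gates, and the root is the observable output (e.g., fluorescence on/off). The particular tree and input names below are invented, not the paper's graphene/aptamer/Fenton network.

```python
# Evaluate a Boolean logic tree whose leaves are named chemical inputs.

def evaluate(node, inputs):
    if isinstance(node, str):                 # leaf: a named chemical input
        return inputs[node]
    op, children = node[0], node[1:]
    vals = [evaluate(c, inputs) for c in children]
    if op == "AND": return all(vals)
    if op == "OR":  return any(vals)
    if op == "NOT": return not vals[0]
    raise ValueError(op)

tree = ("AND", "dye_bound", ("NOT", ("OR", "thrombin", "fenton_quench")))
print(evaluate(tree, {"dye_bound": True, "thrombin": False,
                      "fenton_quench": False}))
# -> True (fluorescence on)
```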
A Framework for Building and Reasoning with Adaptive and Interoperable PMESII Models
2007-11-01
…Description Logic; SOA, Service Oriented Architecture; SPARQL, Simple Protocol And RDF Query Language; SQL, Structured Query Language; SROM, Stability and…another by providing a more expressive ontological structure for one of the models, e.g., semantic networks can be mapped to first-order logical…Pellet is an open-source reasoner that works with OWL-DL. It accepts the SPARQL protocol and RDF query language (SPARQL) and provides a Java API to
ERIC Educational Resources Information Center
Ragonis, Noa; Shilo, Gila
2014-01-01
The paper presents a theoretical investigational study of the potential advantages that secondary school learners may gain from learning two different subjects, namely, logic programming within computer science studies and argumentation texts within linguistics studies. The study suggests drawing an analogy between the two subjects since they both…
Semi-Structured Interview Protocol for Constructing Logic Models
ERIC Educational Resources Information Center
Gugiu, P. Cristian; Rodriguez-Campos, Liliana
2007-01-01
This paper details a semi-structured interview protocol that evaluators can use to develop a logic model of a program's services and outcomes. The protocol presents a series of questions, which evaluators can ask of specific program informants, that are designed to: (1) identify key informants basic background and contextual information, (2)…
Implementing a Knowledge-Based Library Information System with Typed Horn Logic.
ERIC Educational Resources Information Center
Ait-Kaci, Hassan; And Others
1990-01-01
Describes a prototype library expert system called BABEL which uses a new programming language, LOGIN, that combines the idea of attribute inheritance with logic programming. Use of hierarchical classification of library objects to build a knowledge base for a library information system is explained, and further research is suggested. (11…
Evaluating bacterial gene-finding HMM structures as probabilistic logic programs.
Mørk, Søren; Holmes, Ian
2012-03-01
Probabilistic logic programming offers a powerful way to describe and evaluate structured statistical models. To investigate the practicality of probabilistic logic programming for structure learning in bioinformatics, we undertook a simplified bacterial gene-finding benchmark in PRISM, a probabilistic dialect of Prolog. We evaluate Hidden Markov Model structures for bacterial protein-coding gene potential, including a simple null model structure, three structures based on existing bacterial gene finders and two novel model structures. We test standard versions as well as ADPH length modeling and three-state versions of the five model structures. The models are all represented as probabilistic logic programs and evaluated using the PRISM machine learning system in terms of statistical information criteria and gene-finding prediction accuracy, in two bacterial genomes. Neither of our implementations of the two currently most used model structures is best performing in terms of statistical information criteria or prediction performance, suggesting that better-fitting models might be achievable. The source code of all PRISM models, data and additional scripts are freely available for download at: http://github.com/somork/codonhmm. Supplementary data are available at Bioinformatics online.
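The "statistical information criteria" used to rank model structures amount to penalised log-likelihood. A small sketch of the arithmetic (Python; the log-likelihoods, parameter counts, and genome length are invented for illustration, since PRISM itself is a Prolog dialect):

    import math

    def aic(log_likelihood, n_params):
        return 2 * n_params - 2 * log_likelihood

    def bic(log_likelihood, n_params, n_obs):
        return n_params * math.log(n_obs) - 2 * log_likelihood

    # Hypothetical fits of two HMM structures to one genome.
    models = {"null": (-1.52e6, 5), "gene-finder": (-1.48e6, 64)}
    n = 4_600_000  # assumed number of observed nucleotides

    for name, (ll, k) in models.items():
        print(f"{name}: AIC={aic(ll, k):.0f}, BIC={bic(ll, k, n):.0f}")

A richer structure is preferred only when its likelihood gain outweighs the penalty on its extra parameters.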
ERIC Educational Resources Information Center
Korkmaz, Özgen
2016-01-01
The aim of this study was to investigate the effect of the Scratch and Lego Mindstorms Ev3 programming activities on academic achievement with respect to computer programming, and on the problem-solving and logical-mathematical thinking skills of students. This study was a semi-experimental, pretest-posttest study with two experimental groups and…
Efficient dynamic optimization of logic programs
NASA Technical Reports Server (NTRS)
Laird, Phil
1992-01-01
A summary is given of the dynamic optimization approach to speed up learning for logic programs. The problem is to restructure a recursive program into an equivalent program whose expected performance is optimal for an unknown but fixed population of problem instances. We define the term 'optimal' relative to the source of input instances and sketch an algorithm that can come within a logarithmic factor of optimal with high probability. Finally, we show that finding high-utility unfolding operations (such as EBG) can be reduced to clause reordering.
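The clause-reordering problem mentioned at the end has a classical greedy solution when clause attempts are statistically independent: try clauses in increasing order of expected cost divided by success probability. A toy sketch (Python; the costs and probabilities are made-up inputs, and the independence assumption may be relaxed in the paper's setting):

    # Each clause: (name, expected cost of trying it, probability of success).
    clauses = [("c1", 4.0, 0.2), ("c2", 1.0, 0.5), ("c3", 3.0, 0.9)]

    def expected_cost(order):
        """Expected cost of trying clauses in sequence until one succeeds."""
        total, p_reach = 0.0, 1.0
        for _name, cost, p in order:
            total += p_reach * cost     # pay this cost if still searching
            p_reach *= (1.0 - p)        # chance we must continue past it
        return total

    greedy = sorted(clauses, key=lambda c: c[1] / c[2])
    print([c[0] for c in greedy], expected_cost(greedy))  # c2, c3, c1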
DESIGN METHODOLOGIES AND TOOLS FOR SINGLE-FLUX QUANTUM LOGIC CIRCUITS
2017-10-01
DESIGN METHODOLOGIES AND TOOLS FOR SINGLE-FLUX QUANTUM LOGIC CIRCUITS. University of Southern California, October 2017, final technical report, contract number FA8750-15-C-0203. …of this project was to investigate the state-of-the-art in design and optimization of single-flux quantum (SFQ) logic circuits, e.g., RSFQ and ERSFQ
Liebow, Edward; Phelps, Jerry; Van Houten, Bennett; Rose, Shyanika; Orians, Carlyn; Cohen, Jennifer; Monroe, Philip; Drew, Christina H.
2009-01-01
Background In the past 15 years, asthma prevalence has increased and is disproportionately distributed among children, minorities, and low-income persons. The National Institute of Environmental Health Sciences (NIEHS) Division of Extramural Research and Training developed a framework to measure the scientific and health impacts of its extramural asthma research to improve the scientific basis for reducing the health effects of asthma. Objectives Here we apply the framework to characterize the NIEHS asthma portfolio’s impact in terms of publications, clinical applications of findings, community interventions, and technology developments. Methods A logic model was tailored to inputs, outputs, and outcomes of the NIEHS asthma portfolio. Data from existing National Institutes of Health (NIH) databases are used, along with publicly available bibliometric data and structured elicitation of expert judgment. Results NIEHS is the third largest source of asthma-related research grant funding within the NIH between 1975 and 2005, after the National Heart, Lung, and Blood Institute and the National Institute of Allergy and Infectious Diseases. Much of NIEHS-funded asthma research focuses on basic research, but results are often published in journals focused on clinical investigation, increasing the likelihood that the work is moved into practice along the “bench to bedside” continuum. NIEHS support has led to key breakthroughs in scientific research concerning susceptibility to asthma, environmental conditions that heighten asthma symptoms, and cellular mechanisms that may be involved in treating asthma. Conclusions If gaps and limitations in publicly available data receive adequate attention, further linkages can be demonstrated between research activities and public health improvements. This logic model approach to research impact assessment demonstrates that it is possible to conceptualize program components, mine existing databases, and begin to show longer-term impacts of program results. The next challenges will be to modify current data structures, improve the linkages among relevant databases, incorporate as much electronically available data as possible, and determine how to improve the quality and health impact of the science that we support. PMID:19654926
2014-01-01
Background This paper describes the development of a model of Comprehensive Primary Health Care (CPHC) applicable to the Australian context. CPHC holds promise as an effective model of health system organization able to improve population health and increase health equity. However, there is little literature that describes and evaluates CPHC as a whole, with most evaluation focusing on specific programs. The lack of a consensus on what constitutes CPHC, and the complex and context-sensitive nature of CPHC are all barriers to evaluation. Methods The research was undertaken in partnership with six Australian primary health care services: four state government funded and managed services, one sexual health non-government organization, and one Aboriginal community controlled health service. A draft model was crafted combining program logic and theory-based approaches, drawing on relevant literature, 68 interviews with primary health care service staff, and researcher experience. The model was then refined through an iterative process involving two to three workshops at each of the six participating primary health care services, engaging health service staff, regional health executives and central health department staff. Results The resultant Southgate Model of CPHC in Australia articulates the theory of change of how and why CPHC service components and activities, based on the theory, evidence and values which underpin a CPHC approach, are likely to lead to individual and population health outcomes and increased health equity. The model captures the importance of context, the mechanisms of CPHC, and the space for action services have to work within. The process of development engendered and supported collaborative relationships between researchers and stakeholders and the product provided a description of CPHC as a whole and a framework for evaluation. The model was endorsed at a research symposium involving investigators, service staff, and key stakeholders. Conclusions The development of a theory-based program logic model provided a framework for evaluation that allows the tracking of progress towards desired outcomes and exploration of the particular aspects of context and mechanisms that produce outcomes. This is important because there are no existing models which enable the evaluation of CPHC services in their entirety. PMID:24885812
General purpose programmable accelerator board
Robertson, Perry J.; Witzke, Edward L.
2001-01-01
A general purpose accelerator board and acceleration method comprising: one or more programmable logic devices; a plurality of memory blocks; a bus interface for communicating data between the memory blocks and devices external to the board; and dynamic programming capabilities for providing logic to the programmable logic devices to be executed on data in the memory blocks.
California Geriatric Education Center Logic Model: An Evaluation and Communication Tool
ERIC Educational Resources Information Center
Price, Rachel M.; Alkema, Gretchen E.; Frank, Janet C.
2009-01-01
A logic model is a communications tool that graphically represents a program's resources, activities, priority target audiences for change, and the anticipated outcomes. This article describes the logic model development process undertaken by the California Geriatric Education Center in spring 2008. The CGEC is one of 48 Geriatric Education…
Augustinavicius, Jura L; Greene, M Claire; Lakin, Daniel P; Tol, Wietse A
2018-01-01
Monitoring and evaluation of mental health and psychosocial support (MHPSS) programs is critical to facilitating learning and providing accountability to stakeholders. As part of an inter-agency effort to develop recommendations on MHPSS monitoring and evaluation, this scoping review aimed to identify the terminology and focus of monitoring and evaluation frameworks in this field. We collected program documents (logical frameworks (logframes) and theories of change) from members of the Inter-Agency Standing Committee Reference Group on MHPSS, and systematically searched the peer-reviewed literature across five databases. We included program documents and academic articles that reported on monitoring and evaluation of MHPSS in low- and middle-income countries describing original data. Inclusion and data extraction were conducted in parallel by independent reviewers. Thematic analysis was used to identify common language in the description of practices and the focus of each monitoring and evaluation framework. Logframe outcomes were mapped to MHPSS activity categories. We identified 38 program documents and 89 peer-reviewed articles, describing monitoring and evaluation of a wide range of MHPSS activities. In both program documents and peer-reviewed literature there was a lack of specificity and overlap in language used for goals and outcomes. Well-validated, reliable instruments were reported in the academic literature, but rarely used in monitoring and evaluation practices. We identified six themes in the terminology used to describe goals and outcomes. Logframe outcomes were more commonly mapped to generic program implementation activities (e.g. "capacity building") and those related to family and community support, while outcomes from academic articles were most frequently mapped to specialized psychological treatments. Inconsistencies between the language used in research and practice and discrepancies in measurement have broader implications for monitoring and evaluation in MHPSS programs in humanitarian settings within low- and middle-income countries. This scoping review of the terminology commonly used to describe monitoring and evaluation practices and their focus within MHPSS programming highlights areas of importance for the development of a more standardized approach to monitoring and evaluation.
ERIC Educational Resources Information Center
Rudd, Tim
2017-01-01
This paper offers conceptual and theoretical insights relating to the Teaching Excellence Framework (TEF), highlighting a range of potential systemic and institutional outcomes and issues. The paper is organised around three key areas of discussion that are often under-explored in debates. Firstly, after considering the TEF in the wider context of…
Joining the Club: The Ideology of Quality and Business School Badging
ERIC Educational Resources Information Center
Bell, Emma; Taylor, Scott
2005-01-01
The ideology of quality and the frameworks used to measure it can profoundly affect academic identity. This article explores the role of quality frameworks in UK business schools, focusing on the way that individuals confront the logic of accreditation when they are subject to its discipline. By defining business schools as an institutional field,…
The BMW Model: A New Framework for Teaching Monetary Economics
ERIC Educational Resources Information Center
Bofinger, Peter; Mayer, Eric; Wollmershauser, Timo
2006-01-01
Although the IS/LM-AS/AD model is still the central tool of macroeconomic teaching in most macroeconomic textbooks, it has been criticized by several economists. Colander (1995) demonstrated that the framework is logically inconsistent, Romer (2000) showed that it is unable to deal with a monetary policy that uses the interest rate as its…
An Introduction to Logic Control Systems for the Behavioral Scientist, Part I, Text.
ERIC Educational Resources Information Center
Larsen, Lawrence A.
This programmed instruction course gives a basic introduction to solid state programming equipment. Course objectives include giving the student (1) a working knowledge of the various types of units used in building digital logic control systems and (2) an idea of how they interconnect to perform different functions. The course has no prerequisites…
The Application of LOGO! in Control System of a Transmission and Sorting Mechanism
NASA Astrophysics Data System (ADS)
Liu, Jian; Lv, Yuan-Jun
This paper describes the application of the general-purpose logic control module LOGO! to the control system of a transmission and sorting mechanism. First, the structure and operating principle of the mechanism are introduced. The pneumatic circuit of the mechanism is then modeled in the FluidSIM-P software. Finally, the pneumatic circuit and motors are controlled by LOGO!, which makes the control process simple and clear compared with the complicated control of conventional relays. LOGO! can implement the complex interlock control otherwise built from intermediate relays and time relays. In the control process, the logic control functions of LOGO! are fully exploited in the logic programming so that the system controls the air cylinders and motors. The resulting mechanism is reliable and easy to adjust.
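The relay-replacement logic described here is easy to picture in software. A toy sketch (Python; the start/stop latch with an on-delay timer is a generic PLC interlock pattern, not the paper's actual LOGO! program):

    class OnDelayTimer:
        """Output turns on only after the input has held for `delay` scans."""
        def __init__(self, delay):
            self.delay, self.count = delay, 0

        def update(self, enabled):
            self.count = self.count + 1 if enabled else 0
            return self.count >= self.delay

    timer = OnDelayTimer(delay=3)
    motor_latched = False

    def scan(start, stop, cylinder_retracted):
        """One PLC-style scan cycle: latch, interlock, timed output."""
        global motor_latched
        motor_latched = (start or motor_latched) and not stop
        # Interlock: the motor may run only once the cylinder is retracted
        # and the run condition has been stable for three scans.
        return timer.update(motor_latched and cylinder_retracted)

    for tick in range(5):
        print(tick, scan(start=(tick == 0), stop=False, cylinder_retracted=True))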
Fuzzy logic and neural network technologies
NASA Technical Reports Server (NTRS)
Villarreal, James A.; Lea, Robert N.; Savely, Robert T.
1992-01-01
Applications of fuzzy logic technologies in NASA projects are reviewed to examine their advantages in the development of neural networks for aerospace and commercial expert systems and control. Examples of fuzzy-logic applications include a 6-DOF spacecraft controller, collision-avoidance systems, and reinforcement-learning techniques. The commercial applications examined include a fuzzy autofocusing system, an air conditioning system, and an automobile transmission application. The practical use of fuzzy logic is set in the theoretical context of artificial neural systems (ANSs) to give the background for an overview of ANS research programs at NASA. The research and application programs include the Network Execution and Training Simulator and faster training algorithms such as the Difference Optimized Training Scheme. The networks are well suited for pattern-recognition applications such as predicting sunspots, controlling posture maintenance, and conducting adaptive diagnoses.
The shuttle main engine: A first look
NASA Technical Reports Server (NTRS)
Schreur, Barbara
1996-01-01
Anyone entering the Space Shuttle Main Engine (SSME) team attends a two-week course to become familiar with the design and workings of the engine. This course provides intensive coverage of the individual hardware items and their functions. Some individuals, particularly those involved with software maintenance and development, have felt overwhelmed by this volume of material and their lack of a logical framework in which to place it. To provide this logical framework, it was decided that a brief self-taught introduction to the overall operation of the SSME should be designed. To aid new team members with an interest in the software, this new course should also explain the structure and functioning of the controller and its software. This paper presents a description of this course.
NEVESIM: event-driven neural simulation framework with a Python interface.
Pecevski, Dejan; Kappel, David; Jonke, Zeno
2014-01-01
NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies.
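The decoupling the abstract describes, network-level event routing separated from per-neuron dynamics, is the core of any event-driven simulator. A highly simplified sketch (Python; NEVESIM itself is C++ with a Python front end, and the class and function names here are invented for illustration):

    import heapq

    class LIFNeuron:
        """Internal dynamics only: the network layer never looks inside."""
        def __init__(self, threshold=1.0):
            self.threshold, self.v = threshold, 0.0

        def receive(self, weight):
            self.v += weight
            if self.v >= self.threshold:
                self.v = 0.0
                return True    # the neuron fires
            return False

    def simulate(neurons, synapses, initial_spikes, t_end):
        """Events are (time, neuron_id); synapses: id -> [(target, weight, delay)]."""
        events = list(initial_spikes)
        heapq.heapify(events)
        while events:
            t, nid = heapq.heappop(events)
            if t > t_end:
                break
            print(f"t={t:.2f}: neuron {nid} spikes")
            for target, weight, delay in synapses.get(nid, []):
                if neurons[target].receive(weight):
                    heapq.heappush(events, (t + delay, target))

    neurons = {0: LIFNeuron(), 1: LIFNeuron(threshold=0.5)}
    synapses = {0: [(1, 0.6, 1.0)]}
    simulate(neurons, synapses, initial_spikes=[(0.0, 0)], t_end=10.0)

Swapping in a new neuron type only requires another class with a receive method; the event queue is untouched, which mirrors the extensibility argument made in the abstract.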
Simulated Laboratory in Digital Logic.
ERIC Educational Resources Information Center
Cleaver, Thomas G.
Design of computer circuits used to be a pencil and paper task followed by laboratory tests, but logic circuit design can now be done in half the time, as the engineer accesses a program which simulates the behavior of real digital circuits and does all the wiring and testing on his computer screen. A simulated laboratory in digital logic has been…
A Public Service-Dominant Logic for the Executive Education of Public Managers
ERIC Educational Resources Information Center
Hiedemann, Alexander M.; Nasi, Greta; Saporito, Raffaella
2017-01-01
Building on the concept of Public Service-Dominant Logic (PSDL), this article aims to apply the public service-dominant logic to executive education. We argue that fit-for-purpose and effective executive master programs for public managers (EMPA) need to be designed from a public service perspective. Framing executive education as a service…
Kneale, Dylan; Thomas, James; Harris, Katherine
2015-01-01
Logic models are becoming an increasingly common feature of systematic reviews, as is the use of programme theory more generally in systematic reviewing. Logic models offer a framework to help reviewers to 'think' conceptually at various points during the review, and can be a useful tool in defining study inclusion and exclusion criteria, guiding the search strategy, identifying relevant outcomes, identifying mediating and moderating factors, and communicating review findings. In this paper we critique the use of logic models in systematic reviews and protocols drawn from two databases representing reviews of health interventions and international development interventions. Programme theory featured only in a minority of the reviews and protocols included. Despite drawing from different disciplinary traditions, reviews and protocols from both sources shared several limitations in their use of logic models and theories of change, and these were used almost unanimously to solely depict pictorially the way in which the intervention worked. Logic models and theories of change were consequently rarely used to communicate the findings of the review. Logic models have the potential to be an aid integral throughout the systematic reviewing process. The absence of good practice around their use and development may be one reason for the apparent limited utility of logic models in many existing systematic reviews. These concerns are addressed in the second half of this paper, where we offer a set of principles in the use of logic models and an example of how we constructed a logic model for a review of school-based asthma interventions.
Wang, Edwin; Zaman, Naif; Mcgee, Shauna; Milanese, Jean-Sébastien; Masoudi-Nejad, Ali; O'Connor-McCourt, Maureen
2015-02-01
Tumor genome sequencing leads to documenting thousands of DNA mutations and other genomic alterations. At present, these data cannot be analyzed adequately to aid in the understanding of tumorigenesis and its evolution. Moreover, we have little insight into how to use these data to predict clinical phenotypes and tumor progression to better design patient treatment. To meet these challenges, we discuss a cancer hallmark network framework for modeling genome sequencing data to predict cancer clonal evolution and associated clinical phenotypes. The framework includes: (1) cancer hallmarks that can be represented by a few molecular/signaling networks. 'Network operational signatures', which represent gene regulatory logics/strengths, enable quantification of state transitions and measures of hallmark traits. Thus, sets of genomic alterations which are associated with network operational signatures could be linked to the state/measure of hallmark traits. The network operational signature transforms genotypic data (i.e., genomic alterations) to regulatory phenotypic profiles (i.e., regulatory logics/strengths), to cellular phenotypic profiles (i.e., hallmark traits), which lead to clinical phenotypic profiles (i.e., a collection of hallmark traits). Furthermore, the framework considers regulatory logics of the hallmark networks under tumor evolutionary dynamics and therefore also includes: (2) a self-promoting positive feedback loop that is dominated by a genomic instability network and a cell survival/proliferation network is the main driver of tumor clonal evolution. Surrounding tumor stroma and its host immune systems shape the evolutionary paths; (3) cell motility initiating metastasis is a byproduct of the above self-promoting loop activity during tumorigenesis; (4) an emerging hallmark network which triggers genome duplication dominates a feed-forward loop which in turn could act as a rate-limiting step for tumor formation; (5) mutations and other genomic alterations have specific patterns and tissue-specificity, which are driven by aging and other cancer-inducing agents. This framework represents the logics of complex cancer biology as a myriad of phenotypic complexities governed by a limited set of underlying organizing principles. It therefore adds to our understanding of tumor evolution and tumorigenesis and, moreover, suggests the potential usefulness of predicting tumors' evolutionary paths and clinical phenotypes. Strategies for using this framework in conjunction with genome sequencing data in an attempt to predict personalized drug targets, drug resistance, and metastasis for cancer patients, as well as cancer risks for healthy individuals, are discussed. Accurate prediction of cancer clonal evolution and clinical phenotypes will have substantial impact on timely diagnosis, personalized treatment and personalized prevention of cancer. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.
Syllogistic reasoning in fuzzy logic and its application to usuality and reasoning with dispositions
NASA Technical Reports Server (NTRS)
Zadeh, L. A.
1985-01-01
A fuzzy syllogism in fuzzy logic is defined to be an inference schema in which the major premise, the minor premise and the conclusion are propositions containing fuzzy quantifiers. A basic fuzzy syllogism in fuzzy logic is the intersection/product syllogism. Several other basic syllogisms are developed that may be employed as rules of combination of evidence in expert systems. Among these is the consequent conjunction syllogism. Furthermore, it is shown that syllogistic reasoning in fuzzy logic provides a basis for reasoning with dispositions; that is, with propositions that are preponderantly but not necessarily always true. It is also shown that the concept of dispositionality is closely related to the notion of usuality and serves as a basis for what might be called a theory of usuality - a theory which may eventually provide a computational framework for commonsense reasoning.
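The intersection/product syllogism can be stated compactly. As a sketch of the standard schema (a common rendering of Zadeh's formulation, with \otimes denoting the product of fuzzy numbers):

    \frac{Q_1\ A\text{'s are } B\text{'s} \qquad Q_2\ (A \text{ and } B)\text{'s are } C\text{'s}}
         {(Q_1 \otimes Q_2)\ A\text{'s are } (B \text{ and } C)\text{'s}}

For example, from "most students are young" and "most young students are single" one may conclude "most^2 students are young and single", where most^2 is the fuzzy quantifier obtained by multiplying most by itself.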
Huang, Jian-hua; Li, Wen-wei; Bian, Qin; Shen, Zi-yin
2011-09-01
The true meanings of the terms of traditional Chinese medicine (TCM) need to be analyzed on a logical basis. It is not suitable to use a new term to interpret an old term of TCM, or to arbitrarily designate a special term of TCM as corresponding to some substance of modern medicine. In the philosophy of language, language has a logical structure which reflects the structure of the world; that is to say, language is the picture of the world in a logical sense. Using this idea, the authors collected the ancient literature on "kidney essence" and extracted each necessary condition for "kidney essence". Together, all the necessary conditions formed a sufficient condition to define the term "kidney essence". It is expected that this example shows the effectiveness of the philosophy of language in the analysis of TCM terms.
NASA Astrophysics Data System (ADS)
Bosse, Stefan
2013-05-01
Sensorial materials consisting of high-density, miniaturized, and embedded sensor networks require new robust and reliable data processing and communication approaches. Structural health monitoring is one major field of application for sensorial materials. Each sensor node provides some kind of sensor, electronics, data processing, and communication, with a strong focus on microchip-level implementation to meet the goals of miniaturization and low-power energy environments, a prerequisite for autonomous behaviour and operation. Reliability requires robustness of the entire system in the presence of node, link, data processing, and communication failures. Interaction between nodes is required to manage and distribute information. One common interaction model is the mobile agent. An agent approach provides stronger autonomy than a traditional object or remote-procedure-call based approach. Agents can decide for themselves which actions are performed, and they are capable of flexible behaviour, reacting to the environment and other agents, providing some degree of robustness. Traditionally, multi-agent systems are abstract programming models which are implemented in software and executed on program-controlled computer architectures. This approach does not scale well to the microchip level and requires fully equipped computers and communication structures, and the hardware architecture neither considers nor reflects the requirements for agent processing and interaction. We propose and demonstrate a novel design paradigm for reliable distributed data processing systems and a synthesis methodology and framework for multi-agent systems implementable entirely at the microchip level with resource- and power-constrained digital logic, supporting Agent-On-Chip architectures (AoC). The agent behaviour and mobility are fully integrated on the microchip using pipelined communicating processes implemented with finite-state machines and register-transfer logic. The agent behaviour, interaction (communication), and mobility features are modelled and specified at a machine-independent abstract programming level using a state-based agent behaviour language (APL). With this APL, a high-level agent compiler is able to synthesize a hardware model (RTL, VHDL), a software model (C, ML), or a simulation model (XML) suitable for simulating a multi-agent system using the SeSAm simulator framework. Agent communication is provided by a simple tuple-space database implemented at node level, providing fault-tolerant access to global data. A novel synthesis development kit (SynDK) based on a graph-structured database approach is introduced to support the rapid development of compilers and synthesis tools, used, for example, for the design and implementation of the APL compiler.
ERIC Educational Resources Information Center
Le, Nguyen-Thinh; Menzel, Wolfgang
2009-01-01
In this paper, we introduce logic programming as a domain that exhibits some characteristics of being ill-defined. In order to diagnose student errors in such a domain, we need a means to hypothesise the student's intention, that is, the strategy underlying her solution. This is achieved by weighting constraints, so that hypotheses about solution…
Rigorous Science: a How-To Guide.
Casadevall, Arturo; Fang, Ferric C
2016-11-08
Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word "rigor" is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education. Copyright © 2016 Casadevall and Fang.
ERIC Educational Resources Information Center
Pearce, Kimber Charles; Fadely, Dean
1992-01-01
Analyzes the quasi-logical argumentative framework of George Bush's address in which he endeavored to gain compliance and justify his actions at the beginning of the Persian Gulf War. Identifies arguments of comparison and sacrifice within that framework and examines the role of justice in the speech. (TB)
ERIC Educational Resources Information Center
Black, Beth; Suto, Irenka; Bramley, Tom
2011-01-01
In this paper we develop an evidence-based framework for considering many of the factors affecting marker agreement in GCSEs and A levels. A logical analysis of the demands of the marking task suggests a core grouping comprising: (i) question features; (ii) mark scheme features; and (iii) examinee response features. The framework synthesises…
Analysis of individual risk belief structures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tonn, B.E.; Travis, C.B.; Arrowood, L.
An interactive computer program developed at Oak Ridge National Laboratory is presented as a methodology to model individualized belief structures. The logic and general strategy of the model is presented for two risk topics: AIDS and toxic waste. Subjects identified desirable and undesirable consequences for each topic and formulated an associative rule linking topic and consequence in either a causal or correlational framework. Likelihood estimates, generated by subjects in several formats (probability, odds statements, etc.), constituted one outcome measure. Additionally, source of belief (personal experience, news media, etc.) and perceived personal and societal impact are reviewed. Briefly, subjects believe that AIDS causes significant emotional problems and, to a lesser degree, physical health problems, whereas toxic waste causes significant environmental problems.
Kahn, Michael G; Callahan, Tiffany J; Barnard, Juliana; Bauck, Alan E; Brown, Jeff; Davidson, Bruce N; Estiri, Hossein; Goerg, Carsten; Holve, Erin; Johnson, Steven G; Liaw, Siaw-Teng; Hamilton-Lopez, Marianne; Meeker, Daniella; Ong, Toan C; Ryan, Patrick; Shang, Ning; Weiskopf, Nicole G; Weng, Chunhua; Zozus, Meredith N; Schilling, Lisa
2016-01-01
Harmonized data quality (DQ) assessment terms, methods, and reporting practices can establish a common understanding of the strengths and limitations of electronic health record (EHR) data for operational analytics, quality improvement, and research. Existing published DQ terms were harmonized to a comprehensive unified terminology with definitions and examples and organized into a conceptual framework to support a common approach to defining whether EHR data is 'fit' for specific uses. DQ publications, informatics and analytics experts, managers of established DQ programs, and operational manuals from several mature EHR-based research networks were reviewed to identify potential DQ terms and categories. Two face-to-face stakeholder meetings were used to vet an initial set of DQ terms and definitions that were grouped into an overall conceptual framework. Feedback received from data producers and users was used to construct a draft set of harmonized DQ terms and categories. Multiple rounds of iterative refinement resulted in a set of terms and organizing framework consisting of DQ categories, subcategories, terms, definitions, and examples. The harmonized terminology and logical framework's inclusiveness was evaluated against ten published DQ terminologies. Existing DQ terms were harmonized and organized into a framework by defining three DQ categories: (1) Conformance (2) Completeness and (3) Plausibility and two DQ assessment contexts: (1) Verification and (2) Validation. Conformance and Plausibility categories were further divided into subcategories. Each category and subcategory was defined with respect to whether the data may be verified with organizational data, or validated against an accepted gold standard, depending on proposed context and uses. The coverage of the harmonized DQ terminology was validated by successfully aligning to multiple published DQ terminologies. Existing DQ concepts, community input, and expert review informed the development of a distinct set of terms, organized into categories and subcategories. The resulting DQ terms successfully encompassed a wide range of disparate DQ terminologies. Operational definitions were developed to provide guidance for implementing DQ assessment procedures. The resulting structure is an inclusive DQ framework for standardizing DQ assessment and reporting. While our analysis focused on the DQ issues often found in EHR data, the new terminology may be applicable to a wide range of electronic health data such as administrative, research, and patient-reported data. A consistent, common DQ terminology, organized into a logical framework, is an initial step in enabling data owners and users, patients, and policy makers to evaluate and communicate data quality findings in a well-defined manner with a shared vocabulary. Future work will leverage the framework and terminology to develop reusable data quality assessment and reporting methods.
1990-07-01
replacing "logic diagrams" or "flow charts") to aid in coordinating the functions to be performed by a computer program and its associated Inputs...ADDRESS (City, State, and ZIP Code) 10. SOURCE OF FUNDING NUMBERS PROGRAM PROJECT ITASK IWORK UNIT ELEMENT NO. NO. NO. ACCESSION NO. 11. TITLE...the analysis. Both the logical model and detailed procedures are used to develop the application software programs which will be provided to Government
A system for programming experiments and for recording and analyzing data automatically
Herrick, Robert M.; Denelsbeck, John S.
1963-01-01
A system designed for use in complex operant conditioning experiments is described. Some of its key features are: (a) plugboards that permit the experimenter to change either from one program to another or from one analysis to another in less than a minute, (b) time-sharing of permanently-wired, electronic logic components, (c) recordings suitable for automatic analyses. Included are flow diagrams of the system and sample logic diagrams for programming experiments and for analyzing data. PMID:14055967
Optical programmable Boolean logic unit.
Chattopadhyay, Tanay
2011-11-10
Logic units are the building blocks of many important computational operations like arithmetic, multiplexer-demultiplexer, radix conversion, parity checker cum generator, etc. Multifunctional logic operation is very much essential in this respect. Here a programmable Boolean logic unit is proposed that can perform 16 Boolean logical operations from a single optical input according to the programming input, without changing the circuit design. This circuit has two outputs. One output is complementary to the other, hence no loss of data can occur. The circuit is basically designed around a 2×2 polarization-independent optical crossbar switch. Performance of the proposed circuit has been evaluated through numerical simulations. The binary logical states (0,1) are represented by the absence of light (null) and the presence of light, respectively.
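The idea of selecting one of the 16 two-input Boolean operations with a programming input can be shown abstractly. A minimal sketch (Python; this captures only the combinational behaviour, not the optical crossbar implementation): a 4-bit program word serves directly as the truth table, and the complementary second output comes for free.

    def boolean_unit(program, a, b):
        """program: 4-bit integer whose bits form the truth table over (a, b).
        Returns (output, complement), mirroring the circuit's dual outputs."""
        index = (a << 1) | b            # which truth-table row applies
        out = (program >> index) & 1
        return out, out ^ 1

    AND, OR, XOR, NAND = 0b1000, 0b1110, 0b0110, 0b0111
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, boolean_unit(XOR, a, b))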
Federal Highway Administration research and technology evaluation final report : Eco-Logical
DOT National Transportation Integrated Search
2018-03-01
This report documents an evaluation of the Federal Highway Administration's (FHWA) Research and Technology Program activities on the implementation of the Eco-Logical approach by State transportation departments and metropolitan planning organizati...
Fuzzy logic based sensor performance evaluation of vehicle mounted metal detector systems
NASA Astrophysics Data System (ADS)
Abeynayake, Canicious; Tran, Minh D.
2015-05-01
Vehicle Mounted Metal Detector (VMMD) systems are widely used for detection of threat objects in humanitarian demining and military route clearance scenarios. Due to the diverse nature of such operational conditions, operational use of VMMD without a proper understanding of its capability boundaries may lead to heavy casualties. Multi-criteria fitness evaluations are crucial for determining capability boundaries of any sensor-based demining equipment. Evaluation of sensor-based military equipment is a multi-disciplinary topic combining the efforts of researchers, operators, managers and commanders having different professional backgrounds and knowledge profiles. Information acquired through field tests usually involves uncertainty, vagueness and imprecision due to variations in test and evaluation conditions during a single test or series of tests. This report presents a fuzzy logic based methodology for experimental data analysis and performance evaluation of VMMD. This data evaluation methodology has been developed to evaluate sensor performance by consolidating expert knowledge with experimental data. A case study is presented by implementing the proposed data analysis framework in a VMMD evaluation scenario. The results of this analysis confirm the accuracy, practicability and reliability of the fuzzy logic based sensor performance evaluation framework.
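A fuzzy multi-criteria fitness evaluation of this kind can be sketched with triangular membership functions and min() as the fuzzy AND (Python; the criteria, breakpoints, and rule below are invented for illustration, not the report's calibrated values):

    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def fitness(detection_rate, false_alarms_per_km):
        # Fuzzify each criterion (hypothetical breakpoints).
        good_detection = tri(detection_rate, 0.6, 1.0, 1.4)
        low_false_alarm = tri(false_alarms_per_km, -2.0, 0.0, 2.0)
        # Rule: performance is acceptable if detection is good AND
        # false alarms are low; min() is the usual fuzzy conjunction.
        return min(good_detection, low_false_alarm)

    print(fitness(detection_rate=0.92, false_alarms_per_km=0.5))  # 0.75

Expert knowledge enters through the shapes of the membership functions and the rule base; field measurements supply the crisp inputs.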
Logic programming to infer complex RNA expression patterns from RNA-seq data.
Weirick, Tyler; Militello, Giuseppe; Ponomareva, Yuliya; John, David; Döring, Claudia; Dimmeler, Stefanie; Uchida, Shizuka
2018-03-01
To meet the increasing demand in the field, numerous long noncoding RNA (lncRNA) databases are available. Given that many lncRNAs are expressed specifically in certain cell types and/or in a time-dependent manner, most lncRNA databases fall short of providing such profiles. We developed a strategy using logic programming to handle the complex organization of organs, their tissues and cell types, as well as gender and developmental time points. To showcase this strategy, we introduce 'RenalDB' (http://renaldb.uni-frankfurt.de), a database providing expression profiles of RNAs in major organs with a focus on kidney tissues and cells. RenalDB uses logic programming to describe complex anatomy, sample metadata and the logical relationships defining expression, enrichment or specificity. We validated the content of RenalDB with biological experiments and functionally characterized two long intergenic noncoding RNAs: LOC440173 is important for cell growth or cell survival, whereas PAXIP1-AS1 is a regulator of cell death. We anticipate RenalDB will be used as a first step toward functional studies of lncRNAs in the kidney.
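The role of logic programming here, encoding anatomy and letting expression facts propagate through it, can be pictured with Datalog-style rules. A toy sketch (Python, a naive forward-chaining evaluator; the facts and the rule that expression propagates up the part_of hierarchy are illustrative assumptions, not RenalDB's actual rule base):

    # Facts: part_of(child, parent) and expressed_in(rna, tissue).
    part_of = {("proximal_tubule", "nephron"), ("nephron", "kidney")}
    expressed = {("PAXIP1-AS1", "proximal_tubule")}

    # Rule 1: part_of is transitive.
    # Rule 2: an RNA expressed in a part is expressed in the whole.
    changed = True
    while changed:
        changed = False
        for (a, b) in list(part_of):
            for (c, d) in list(part_of):
                if b == c and (a, d) not in part_of:
                    part_of.add((a, d)); changed = True
        for (rna, t) in list(expressed):
            for (a, b) in list(part_of):
                if t == a and (rna, b) not in expressed:
                    expressed.add((rna, b)); changed = True

    print(("PAXIP1-AS1", "kidney") in expressed)   # True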
Putting time into proof outlines
NASA Technical Reports Server (NTRS)
Schneider, Fred B.; Bloom, Bard; Marzullo, Keith
1993-01-01
A logic for reasoning about timing properties of concurrent programs is presented. The logic is based on Hoare-style proof outlines and can handle maximal parallelism as well as certain resource-constrained execution environments. The correctness proof for a mutual exclusion protocol that uses execution timings in a subtle way illustrates the logic in action. A soundness proof using structural operational semantics is outlined in the appendix.
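A timed proof outline can be pictured schematically (illustrative notation, not necessarily the paper's): assertions may mention a clock variable, and a statement's postcondition bounds when control can leave it.

    \{\, x = 0 \wedge now = T \,\} \;\; x := 1 \;\; \{\, x = 1 \wedge T \le now \le T + \delta \,\}

Here \delta is an assumed upper bound on the execution time of the assignment; under maximal parallelism every process keeps making progress, so such bounds can be composed across an entire proof outline.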
Application of logic models in a large scientific research program.
O'Keefe, Christine M; Head, Richard J
2011-08-01
It is the purpose of this article to discuss the development and application of a logic model in the context of a large scientific research program within the Commonwealth Scientific and Industrial Research Organisation (CSIRO). CSIRO is Australia's national science agency and is a publicly funded part of Australia's innovation system. It conducts mission-driven scientific research focussed on delivering results with relevance and impact for Australia, where impact is defined and measured in economic, environmental and social terms at the national level. The Australian Government has recently signalled an increasing emphasis on performance assessment and evaluation, which in the CSIRO context implies an increasing emphasis on ensuring and demonstrating the impact of its research programs. CSIRO continues to develop and improve its approaches to impact planning and evaluation, including conducting a trial of a program logic approach in the CSIRO Preventative Health National Research Flagship. During the trial, improvements were observed in clarity of the research goals and path to impact, as well as in alignment of science and support function activities with national challenge goals. Further benefits were observed in terms of communication of the goals and expected impact of CSIRO's research programs both within CSIRO and externally. The key lesson learned was that significant value was achieved through the process itself, as well as the outcome. Recommendations based on the CSIRO trial may be of interest to managers of scientific research considering developing similar logic models for their research projects. The CSIRO experience has shown that there are significant benefits to be gained, especially if the project participants have a major role in the process of developing the logic model. Copyright © 2011 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
1979-01-01
The detailed logic flow for the Flight Design System Executive is presented. The system is designed to provide the hardware/software capability required for operational support of shuttle flight planning.
Molecular implementation of simple logic programs.
Ran, Tom; Kaplan, Shai; Shapiro, Ehud
2009-10-01
Autonomous programmable computing devices made of biomolecules could interact with a biological environment and be used in future biological and medical applications. Biomolecular implementations of finite automata and logic gates have already been developed. Here, we report an autonomous programmable molecular system based on the manipulation of DNA strands that is capable of performing simple logical deductions. Using molecular representations of facts such as Man(Socrates) and rules such as Mortal(X) <-- Man(X) (Every Man is Mortal), the system can answer molecular queries such as Mortal(Socrates)? (Is Socrates Mortal?) and Mortal(X)? (Who is Mortal?). This biomolecular computing system compares favourably with previous approaches in terms of expressive power, performance and precision. A compiler translates facts, rules and queries into their molecular representations and subsequently operates a robotic system that assembles the logical deductions and delivers the result. This prototype is the first simple programming language with a molecular-scale implementation.
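The deduction performed by the molecular system is ordinary Horn-clause inference, and a minimal software analogue makes it concrete (Python; a toy forward chainer for single-antecedent rules, standing in for the DNA-strand implementation):

    facts = {("Man", "Socrates")}
    # Rules head(X) <- body(X), e.g. Mortal(X) <- Man(X).
    rules = [("Mortal", "Man")]

    # Forward chaining: apply every rule to every matching fact to fixpoint.
    changed = True
    while changed:
        changed = False
        for head, body in rules:
            for pred, arg in list(facts):
                if pred == body and (head, arg) not in facts:
                    facts.add((head, arg)); changed = True

    print(("Mortal", "Socrates") in facts)          # Mortal(Socrates)? -> True
    print([a for p, a in facts if p == "Mortal"])   # Mortal(X)? -> ['Socrates']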
Bracht, Marianne; Heffer, Michael; O'Brien, Karel
2005-02-01
To implement and deliver a respiratory syncytial virus prophylaxis (RSVP) program in response to the Canadian Pediatric Society recommendations. A novel program was designed to provide inpatient RSVP for at-risk infants cared for in 1 tertiary care newborn intensive care unit (NICU). This inpatient program was part of a coordinated approach to RSVP, designed and implemented by 3 hospitals. An RSVP program logic model was created and used by a multidisciplinary team to evaluate the in-house program and identify areas of program activity requiring improvement. Following the 2000 to 2001 RSV season, a compliance and outcomes audit was performed in the tertiary center; 193 infants were enrolled in the RSVP program and 162 infants had received RSVP in the NICU [Mean = 1.64 doses]. Telephone follow-up with the parents of discharged infants identified that 159 infants (98%) had successfully completed their full course of RSVP. Using the RSVP program logic model, 5 areas for program improvement were identified including infant recruitment, patient transfer/discharge processes, product procurement, preparation/distribution/administration of doses, and healthcare team communication. Interdisciplinary collaboration is an important factor in the success of the RSVP program and has supported a consistent model of care for the delivery of RSVP. The program logic model provided a useful structure to systematically review the RSVP program in this organization.
Evaluation of properties over phylogenetic trees using stochastic logics.
Requeno, José Ignacio; Colom, José Manuel
2016-06-14
Model checking has recently been introduced as an integrated framework for extracting information from phylogenetic trees, using temporal logic as a querying language; temporal logics are extensions of modal logic that impose restrictions on a Boolean formula along a path of events. The phylogenetic tree is considered a transition system modeling evolution as a sequence of genomic mutations (we understand mutation as the different ways that DNA can be changed), while this kind of logic is suitable for traversing it in a strict and exhaustive way. Given a biological property that we wish to inspect over the phylogeny, the verifier returns true if the specification is satisfied, or a counterexample that falsifies it. However, this approach has so far only considered qualitative aspects of the phylogeny. In this paper, we remove the limitations of the previous framework by including and handling quantitative information such as explicit time or probability. To this end, we apply current probabilistic continuous-time extensions of model checking to phylogenetics. We reinterpret a catalog of qualitative properties in a numerical way, and we also present new properties that could not be analyzed before. For instance, we obtain the likelihood of a tree topology according to a mutation model. As a case study, we analyze several phylogenies in order to obtain the maximum likelihood with the model checking tool PRISM. In addition, we have adapted the software to optimize the computation of maximum likelihoods. We have shown that probabilistic model checking is a competitive framework for describing and analyzing quantitative properties of phylogenetic trees. This formalism adds soundness and readability to the definition of models and specifications. Besides, the existence of model checking tools hides the underlying technology, sparing biologists the extension, upgrade, debugging and maintenance of a software tool. A set of benchmarks justifies the feasibility of our approach.
Chen, Carla Chia-Ming; Schwender, Holger; Keith, Jonathan; Nunkesser, Robin; Mengersen, Kerrie; Macrossan, Paula
2011-01-01
Due to advancements in computational ability, enhanced technology and a reduction in the price of genotyping, more data are being generated for understanding genetic associations with diseases and disorders. However, with the availability of large data sets comes the inherent challenges of new methods of statistical analysis and modeling. Because a complex phenotype may be the effect of a combination of multiple loci, various statistical methods have been developed for identifying genetic epistasis effects. Among these methods, logic regression (LR) is an intriguing approach incorporating tree-like structures. Various methods have built on the original LR to improve different aspects of the model. In this study, we review four variations of LR, namely Logic Feature Selection, Monte Carlo Logic Regression, Genetic Programming for Association Studies, and Modified Logic Regression-Gene Expression Programming, and investigate the performance of each method using simulated and real genotype data. We contrast these with another tree-like approach, namely Random Forests, and a Bayesian logistic regression with stochastic search variable selection.
Texas traffic thermostat software tool.
DOT National Transportation Integrated Search
2013-04-01
The traffic thermostat decision tool is built to help guide the user through a logical, step-wise process of examining potential changes to their Managed Lane/toll facility. **NOTE: Project Title: Application of the Traffic Thermostat Framework. Ap...
Texas traffic thermostat marketing package.
DOT National Transportation Integrated Search
2013-04-01
The traffic thermostat decision tool is built to help guide the user through a logical, step-wise process of examining potential changes to their Managed Lane/toll facility. **NOTE: Project Title: Application of the Traffic Thermostat Framework. Ap...
Kahn, Michael G.; Callahan, Tiffany J.; Barnard, Juliana; Bauck, Alan E.; Brown, Jeff; Davidson, Bruce N.; Estiri, Hossein; Goerg, Carsten; Holve, Erin; Johnson, Steven G.; Liaw, Siaw-Teng; Hamilton-Lopez, Marianne; Meeker, Daniella; Ong, Toan C.; Ryan, Patrick; Shang, Ning; Weiskopf, Nicole G.; Weng, Chunhua; Zozus, Meredith N.; Schilling, Lisa
2016-01-01
Objective: Harmonized data quality (DQ) assessment terms, methods, and reporting practices can establish a common understanding of the strengths and limitations of electronic health record (EHR) data for operational analytics, quality improvement, and research. Existing published DQ terms were harmonized to a comprehensive unified terminology with definitions and examples and organized into a conceptual framework to support a common approach to defining whether EHR data is ‘fit’ for specific uses. Materials and Methods: DQ publications, informatics and analytics experts, managers of established DQ programs, and operational manuals from several mature EHR-based research networks were reviewed to identify potential DQ terms and categories. Two face-to-face stakeholder meetings were used to vet an initial set of DQ terms and definitions that were grouped into an overall conceptual framework. Feedback received from data producers and users was used to construct a draft set of harmonized DQ terms and categories. Multiple rounds of iterative refinement resulted in a set of terms and an organizing framework consisting of DQ categories, subcategories, terms, definitions, and examples. The harmonized terminology and logical framework’s inclusiveness was evaluated against ten published DQ terminologies. Results: Existing DQ terms were harmonized and organized into a framework by defining three DQ categories: (1) Conformance, (2) Completeness, and (3) Plausibility, and two DQ assessment contexts: (1) Verification and (2) Validation. The Conformance and Plausibility categories were further divided into subcategories. Each category and subcategory was defined with respect to whether the data may be verified with organizational data or validated against an accepted gold standard, depending on the proposed context and uses. The coverage of the harmonized DQ terminology was validated by successfully aligning it to multiple published DQ terminologies. Discussion: Existing DQ concepts, community input, and expert review informed the development of a distinct set of terms, organized into categories and subcategories. The resulting DQ terms successfully encompassed a wide range of disparate DQ terminologies. Operational definitions were developed to provide guidance for implementing DQ assessment procedures. The resulting structure is an inclusive DQ framework for standardizing DQ assessment and reporting. While our analysis focused on the DQ issues often found in EHR data, the new terminology may be applicable to a wide range of electronic health data such as administrative, research, and patient-reported data. Conclusion: A consistent, common DQ terminology, organized into a logical framework, is an initial step in enabling data owners and users, patients, and policy makers to evaluate and communicate data quality findings in a well-defined manner with a shared vocabulary. Future work will leverage the framework and terminology to develop reusable data quality assessment and reporting methods.
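A toy illustration of the three harmonized DQ categories named above follows. The field names and thresholds are hypothetical placeholders, not part of the published terminology; the sketch only shows how conformance, completeness, and plausibility make distinct judgments about the same record.

```python
# Toy checks for the three DQ categories from the abstract: Conformance
# (does a value obey its expected format?), Completeness (is it present?),
# and Plausibility (is it believable?). Fields and cutoffs are hypothetical.
import re

def conformance(record):
    """Format check: birth_date must look like YYYY-MM-DD."""
    return bool(re.fullmatch(r"\d{4}-\d{2}-\d{2}", record.get("birth_date", "")))

def completeness(record, required=("birth_date", "heart_rate")):
    """Presence check: required fields must be populated."""
    return all(record.get(k) not in (None, "") for k in required)

def plausibility(record):
    """Believability check against a rough physiologic range."""
    hr = record.get("heart_rate")
    return hr is not None and 20 <= hr <= 250

record = {"birth_date": "1975-02-30", "heart_rate": 400}
print({"conformance": conformance(record),    # True: format matches
       "completeness": completeness(record),  # True: fields present
       "plausibility": plausibility(record)}) # False: 400 bpm is implausible
```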
Implementation and Evaluation of Microcomputer Systems for the Republic of Turkey’s Naval Ships.
1986-03-01
...important database design tool for both logical and physical database design, such as flowcharts or pseudocodes are used for program design. Logical... string manipulation in FORTRAN is difficult but not impossible. BASIC (Beginners All-Purpose Symbolic Instruction Code): BASIC is currently the most...
NASA Astrophysics Data System (ADS)
Zhang, X.; Wan, C. H.; Yuan, Z. H.; Fang, C.; Kong, W. J.; Wu, H.; Zhang, Q. T.; Tao, B. S.; Han, X. F.
2017-04-01
Confronted with the gigantic volume of data produced every day, raising integration density by shrinking device size is becoming harder and harder as a way to meet the ever-increasing demand for high-performance computers. One feasible path is to realize more logic functions in one cell. In this respect, we experimentally demonstrate a prototype spin-orbit-torque-based spin logic cell integrated with five frequently used logic functions (AND, OR, NOT, NAND and NOR). The cell can be easily programmed and reprogrammed to perform the desired function. Furthermore, the information stored in cells is symmetry-protected, making it possible to expand into a logic gate array where cells can be manipulated one by one without changing the information of other, undesired cells. This work provides a promising example of a multi-functional spin logic cell with reprogrammability and nonvolatility, which will advance the application of spin logic devices.
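A software analogue of such a reprogrammable cell may clarify the behavior being claimed: one element that can be switched at runtime among the five functions. This sketch models only the logical interface, not the spin-orbit-torque device physics.

```python
# Functional analogue of a cell programmable among the five logic
# functions named in the abstract. Purely illustrative; no device physics.
FUNCTIONS = {
    "AND":  lambda a, b: a & b,
    "OR":   lambda a, b: a | b,
    "NOT":  lambda a, b: 1 - a,        # unary: second input ignored
    "NAND": lambda a, b: 1 - (a & b),
    "NOR":  lambda a, b: 1 - (a | b),
}

class LogicCell:
    def __init__(self, function="AND"):
        self.program(function)
    def program(self, function):       # reprogram the same cell in place
        self.fn = FUNCTIONS[function]
    def __call__(self, a, b=0):
        return self.fn(a, b)

cell = LogicCell("NAND")
print([cell(a, b) for a in (0, 1) for b in (0, 1)])  # [1, 1, 1, 0]
cell.program("NOR")
print([cell(a, b) for a in (0, 1) for b in (0, 1)])  # [1, 0, 0, 0]
```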
NASA Technical Reports Server (NTRS)
Lee, Taesik; Jeziorek, Peter
2004-01-01
Large complex projects cost large sums of money throughout their life cycle for a variety of reasons and causes. For such large programs, credible estimation of the project cost, quick assessment of the cost of making changes, and management of the project budget with effective cost reduction determine the viability of the project. Cost engineering that deals with these issues requires a rigorous method and systematic processes. This paper introduces a logical framework to achieve effective cost engineering. The framework is built upon the Axiomatic Design process, whose structure provides a good foundation for tying engineering design and cost information closely together. The cost framework presented in this paper is a systematic link between the functional domain (FRs), the physical domain (DPs), the cost domain (CUs), and a task/process-based model. The FR-DP map relates a system's functional requirements to design solutions across all levels and branches of the decomposition hierarchy. DPs are mapped into CUs, which provides a means to estimate the cost of design solutions (DPs) from the cost of the physical entities in the system (CUs). The task/process model describes the iterative process of developing each of the CUs and is used to estimate the cost of CUs. By linking the four domains, this framework provides superior traceability from requirements to cost information.
Optical reversible programmable Boolean logic unit.
Chattopadhyay, Tanay
2012-07-20
Computing with reversibility is the only way to avoid the dissipation of energy associated with bit erasure, so a reversible microprocessor will be required for future computing. In this paper, the design of a simple all-optical reversible programmable processor is proposed using a polarizing beam splitter, liquid-crystal phase spatial light modulators, a half-wave plate, and plane mirrors. This circuit can perform 16 logical operations according to three programming inputs, and the inputs can easily be recovered from the outputs. It is named the "reversible programmable Boolean logic unit (RPBLU)." The logic unit is the basic building block of many complex computational operations, hence the significance of the design. Two orthogonally polarized light states are defined here as the two logical states.
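The defining property of reversible logic, that inputs can be recovered from outputs, is easy to demonstrate in software with a classic reversible gate. The sketch below uses a Fredkin (controlled-swap) gate as a generic example; it is not a model of the optical RPBLU design itself.

```python
# Reversibility demo: a Fredkin (controlled-swap) gate maps each input
# triple to a unique output triple, so no information (and, physically,
# no erasure energy) is lost. Fredkin is its own inverse.
def fredkin(c, a, b):
    """Controlled swap: if c == 1, swap a and b; c passes through."""
    return (c, b, a) if c else (c, a, b)

# Applying the gate twice restores the original inputs for all 8 triples.
for bits in [(c, a, b) for c in (0, 1) for a in (0, 1) for b in (0, 1)]:
    assert fredkin(*fredkin(*bits)) == bits
print("all 8 input triples recovered from their outputs")
```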
Assessment of groundwater vulnerability using supervised committee to combine fuzzy logic models.
Nadiri, Ata Allah; Gharekhani, Maryam; Khatibi, Rahman; Moghaddam, Asghar Asghari
2017-03-01
Vulnerability indices of an aquifer assessed by different fuzzy logic (FL) models often differ, with no theoretical or empirical basis for establishing a validated baseline or for comparing the modeling results against baselines, if any exist. This research therefore presents a supervised committee fuzzy logic (SCFL) method, which uses artificial neural networks to overarch and combine a selection of FL models. The indices are expressed through the widely used DRASTIC framework, which includes geological, hydrological, and hydrogeological parameters that are often subject to uncertainty. DRASTIC indices collectively represent intrinsic (or natural) vulnerability and give a sense of contaminants, such as nitrate-N, percolating to aquifers from the surface. The study area is an aquifer in the Ardabil plain, Ardabil province, northwest Iran. Vulnerability indices are improved by FL techniques comprising Sugeno fuzzy logic (SFL), Mamdani fuzzy logic (MFL), and Larsen fuzzy logic (LFL), which perform in similar ways but differ in detail. As the correlation between estimated DRASTIC vulnerability index values and nitrate-N values is as low as 0.4, it is improved significantly by the FL models (SFL, MFL, and LFL); their synergy is then exploited by SCFL, which uses the FL modeling results "conditioned" by nitrate-N values to raise the correlation to higher than 0.9.
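The committee idea can be sketched numerically: combine the outputs of several imperfect models into one estimate supervised by an observed target. The paper uses an artificial neural network as the combiner; the sketch below substitutes a linear least-squares combiner as a simplification, on synthetic stand-in data.

```python
# Sketch of a supervised committee: three imperfect "fuzzy model" outputs
# (stand-ins for SFL, MFL, LFL) are combined under supervision by the
# target (a proxy for nitrate-N). Linear combiner replaces the paper's ANN.
import numpy as np

rng = np.random.default_rng(1)
n = 50
truth = rng.random(n)                          # proxy for nitrate-N levels
models = np.column_stack(
    [truth + rng.normal(0, s, n) for s in (0.2, 0.3, 0.25)]
)

A = np.column_stack([models, np.ones(n)])      # model outputs + intercept
w, *_ = np.linalg.lstsq(A, truth, rcond=None)  # supervised combination
committee = A @ w

for name, est in [("single model", models[:, 0]), ("committee", committee)]:
    print(name, "correlation:", round(np.corrcoef(est, truth)[0, 1], 3))
```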
Bayesian Logic Programs for Plan Recognition and Machine Reading
2012-12-01
...models is that they can handle both uncertainty and structured/relational data. As a result, they are widely used in domains like social network analysis, biological data analysis, and natural language processing. Bayesian... the Story Understanding data set. (b) The logical representation of the observations. (c) The set of ground rules obtained from logical abduction
Synthesizing Dynamic Programming Algorithms from Linear Temporal Logic Formulae
NASA Technical Reports Server (NTRS)
Rosu, Grigore; Havelund, Klaus
2001-01-01
The problem of testing a linear temporal logic (LTL) formula on a finite execution trace of events, generated by an executing program, occurs naturally in runtime analysis of software. We present an algorithm which takes an LTL formula and generates an efficient dynamic programming algorithm. The generated algorithm tests whether the LTL formula is satisfied by a finite trace of events given as input. The generated algorithm runs in linear time, with a constant that depends on the size of the LTL formula. The memory needed is also constant, again depending on the size of the formula.
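The dynamic-programming idea can be shown concretely: evaluate an LTL formula over a finite trace with a single backward pass, carrying only the subformula truth values needed at the next-older event (hence constant memory, linear time). The sketch below hand-codes the check for one formula, G(p -> F q); the paper's contribution is to *generate* such code from an arbitrary formula.

```python
# Backward dynamic programming over a finite trace for G(p -> F q):
# "every p-event is eventually followed by a q-event". Each step updates
# two booleans, so memory is constant and time is linear in the trace.
def check_g_implies_f(trace, p, q):
    f_q = False            # "F q holds from this position onward"
    ok = True              # "G(p -> F q) holds from this position onward"
    for event in reversed(trace):
        f_q = (q in event) or f_q
        ok = ((p not in event) or f_q) and ok
    return ok

trace = [{"start"}, {"p"}, {"idle"}, {"q"}, {"p", "q"}]
print(check_g_implies_f(trace, "p", "q"))               # True
print(check_g_implies_f(trace + [{"p"}], "p", "q"))     # False: trailing p unanswered
```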
Gervais, Christine; de Montigny, Francine; Lacharité, Carl; Dubeau, Diane
2015-10-01
The transition to fatherhood, with its numerous challenges, has been well documented. Likewise, fathers' relationships with health and social services have also begun to be explored. Yet despite the problems fathers experience in interactions with healthcare services, few programs have been developed for them. To explain this, some authors point to the difficulty practitioners encounter in developing and structuring the theory of programs they are trying to create to promote and support father involvement (Savaya, R., & Waysman, M. (2005). Administration in Social Work, 29(2), 85), even when such theory is key to a program's effectiveness (Chen, H.-T. (2005). Practical program evaluation. Thousand Oaks, CA: Sage Publications). The objective of the present paper is to present a tool, the logic model, to bridge this gap and to equip practitioners for structuring program theory. This paper addresses two questions: (1) What would be a useful instrument for structuring the development of program theory in interventions for fathers? (2) How would the concepts of a father involvement program best be organized? The case of the Father Friendly Initiative within Families (FFIF) program is used to present and illustrate six simple steps for developing a logic model that are based on program theory and demonstrate its relevance. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
Chol understandings of suicide and human agency.
Imberton, Gracia
2012-06-01
According to ethnographic material collected since 2003, the Chol Mayan indigenous people in southern Mexico have different causal explanations for suicide. It can be attributed to witchcraft that forces victims to take their lives against their own will, to excessive drinking, or to fate determined by God. However, it can also be conceived of as a conscious decision made by a person overwhelmed by daily problems. Drawing from the theoretical framework developed by Laura M. Ahearn, inspired by practice theory, the paper contends that these different explanations operate within two different logics or understandings of human agency. The first logic attributes responsibility to supernatural causes such as witchcraft or divine destiny, and reflects Chol notions of personhood. The second logic accepts personal responsibility for suicide, and is related to processes of social change such as the introduction of wage labor, education and a market economy. The contemporary Chol resort to both logics to make sense of the human drama of suicide.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ondrej Linda; Todd Vollmer; Jim Alves-Foss
2011-08-01
Resiliency and cyber security of modern critical infrastructures is becoming increasingly important with the growing number of threats in the cyber-environment. This paper proposes an extension to a previously developed fuzzy logic based anomaly detection network security cyber sensor by incorporating Type-2 Fuzzy Logic (T2 FL). In general, fuzzy logic provides a framework for system modeling in linguistic form capable of coping with imprecise and vague meanings of words. T2 FL is an extension of Type-1 FL which has proved successful in modeling and minimizing the effects of various kinds of dynamic uncertainties. In this paper, T2 FL provides a basis for robust anomaly detection and cyber security state awareness. In addition, the proposed algorithm was specifically developed to comply with the constrained computational requirements of low-cost embedded network security cyber sensors. The performance of the system was evaluated on a set of network data recorded from an experimental cyber-security test-bed.
Nature and place of crime scene management within forensic sciences.
Crispino, Frank
2008-03-01
This short paper presents the preliminary results of a recent study aimed at identifying, through an epistemological analysis, the parameters required to qualify forensic science as a science. The reader is invited to reflect upon references within a historical and logical framework which assert that forensic science is based upon two fundamental principles (those of Locard and Kirk). The assertion that forensic science is indeed a science should be appreciated not only on a single epistemological criterion (as Popper's falsification, raised by the Daubert hearing, was), but also on the logical frameworks used by the individuals involved (investigator, expert witness and trier of fact) from the crime scene examination to the final interpretation of the evidence. Hence, it can be argued that the management of the crime scene should be integrated into the scientific way of thinking rather than remain a technical discipline, as recently suggested by Harrison.
An acceleration framework for synthetic aperture radar algorithms
NASA Astrophysics Data System (ADS)
Kim, Youngsoo; Gloster, Clay S.; Alexander, Winser E.
2017-04-01
Algorithms for radar signal processing, such as Synthetic Aperture Radar (SAR), are computationally intensive and require considerable execution time on a general-purpose processor. Reconfigurable logic can be used to off-load the primary computational kernel onto a custom computing machine, reducing execution time by an order of magnitude compared to kernel execution on a general-purpose processor. Specifically, Field Programmable Gate Arrays (FPGAs) can be used to accelerate these kernels with hardware-based custom logic implementations. In this paper, we demonstrate a framework for algorithm acceleration, using SAR as a case study to illustrate the potential that FPGAs offer. We profiled the SAR algorithm and implemented a homomorphic filter using a hardware implementation of the natural logarithm. Experimental results show a linear speedup from adding reasonably small processing elements in the FPGA as opposed to using a software implementation running on a typical general-purpose processor.
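The homomorphic filter mentioned above has a simple algorithmic shape: take the logarithm so that multiplicative components become additive, apply a linear filter, and exponentiate back. The NumPy sketch below shows only this structure; the paper's contribution is implementing the natural-logarithm stage in FPGA hardware.

```python
# Algorithmic shape of a homomorphic filter (log -> linear filter -> exp),
# here smoothing speckle-like multiplicative noise on a synthetic signal.
import numpy as np

def homomorphic_filter(x, kernel):
    """Filter multiplicative components by filtering in the log domain."""
    log_x = np.log(x)                              # multiplicative -> additive
    filtered = np.convolve(log_x, kernel, mode="same")
    return np.exp(filtered)                        # back to original domain

rng = np.random.default_rng(2)
signal = 1.0 + 0.5 * np.sin(np.linspace(0, 4 * np.pi, 256))
speckle = rng.lognormal(mean=0.0, sigma=0.3, size=256)
smoothed = homomorphic_filter(signal * speckle, np.ones(9) / 9)
print(smoothed[:5])
```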
Hierarchical semantic structures for medical NLP.
Taira, Ricky K; Arnold, Corey W
2013-01-01
We present a framework for building a medical natural language processing (NLP) system capable of deep understanding of clinical text reports. The framework helps developers understand how various NLP-related efforts and knowledge sources can be integrated. The aspects considered include: 1) computational issues dealing with defining layers of intermediate semantic structures to reduce the dimensionality of the NLP problem; 2) algorithmic issues, in which we survey the NLP literature and discuss state-of-the-art procedures used to map between various levels of the hierarchy; and 3) implementation issues for software developers with available resources. The objective of this poster is to educate readers about the various levels of semantic representation (e.g., word-level concepts, ontological concepts, logical relations, logical frames, discourse structures, etc.). The poster presents an architecture in which diverse efforts and resources in medical NLP can be integrated in a principled way.
Strategic Mobility 21: Modeling, Simulation, and Analysis
2010-04-14
...using AnyLogic, which is a Java-programmed, multi-method simulation modeling tool developed by XJ Technologies. The last section examines the academic... simulation model from an Arena platform to an AnyLogic-based Web Service. MATLAB is useful for small problems with few nodes, but GAMS/CPLEX is better... Transportation Modeling Studio™. The SCASN modeling and simulation program was designed to be generic in nature to allow for use by both commercial and
Logic Programming as an Inference Engine for Non-Monotonic Reasoning
1991-11-11
University of Texas at El Paso, El Paso, TX 79968-0514. November 11, 1991. ...Przymusinska, L. Pereira and D.S. Warren. Significant progress has been made towards both theoretical and algorithmic foundations of a non-monotonic reasoning system based on logic programming. An implementation of such a system, limited to circumscriptive theories, has also been completed.
Difference to Inference: teaching logical and statistical reasoning through on-line interactivity.
Malloy, T E
2001-05-01
Difference to Inference is an online Java program that simulates theory testing and falsification through research design and data collection in a game format. The program, based on cognitive and epistemological principles, is designed to support learning of the thinking skills underlying deductive and inductive logic and statistical reasoning. Difference to Inference has database connectivity so that game scores can be counted as part of course grades.
Logical Form as a Determinant of Cognitive Processes
NASA Astrophysics Data System (ADS)
van Lambalgen, Michiel
We discuss a research program on reasoning patterns in subjects with autism, showing that they fail to engage in certain forms of non-monotonic reasoning that come naturally to neurotypical subjects. The striking reasoning patterns of autists occur both in verbal and in non-verbal tasks. Upon formalising the relevant non-verbal tasks, one sees that their logical form is the same as that of the verbal tasks. This suggests that logical form can play a causal role in cognitive processes, and we suggest that this logical form is actually embodied in the cognitive capacity called 'executive function'.
Software Safety Assurance of Programmable Logic
NASA Technical Reports Server (NTRS)
Berens, Kalynnda
2002-01-01
Programmable Logic (PLC, FPGA, ASIC) devices are hybrids: hardware devices that are designed and programmed like software. As such, they fall into an assurance gray area. Programmable Logic is usually tested and verified as hardware, and the software aspects are ignored, potentially leading to safety or mission success concerns. The objective of this proposal is first to determine where and how Programmable Logic (PL) is used within NASA and to document the current methods of assurance. Once that is known, the aim is to raise awareness of the PL software aspects within the NASA engineering community and to provide guidance for the use and assurance of PL from a software perspective.
Orbach, Ron; Willner, Bilha; Willner, Itamar
2015-03-11
This feature article addresses the implementation of catalytic nucleic acids as functional units for the construction of logic gates and computing circuits, and discusses the future applications of these systems. The assembly of computational modules composed of DNAzymes has led to the operation of a universal set of logic gates, to field programmable logic gates and computing circuits, to the development of multiplexers/demultiplexers, and to full-adder systems. Also, DNAzyme cascades operating as logic gates and computing circuits were demonstrated. DNAzyme logic systems find important practical applications. These include the use of DNAzyme-based systems for sensing and multiplexed analyses, for the development of controlled release and drug delivery systems, for regulating intracellular biosynthetic pathways, and for the programmed synthesis and operation of cascades.
Towards programming languages for genetic engineering of living cells.
Pedersen, Michael; Phillips, Andrew
2009-08-06
Synthetic biology aims at producing novel biological systems to carry out some desired and well-defined functions. An ultimate dream is to design these systems at a high level of abstraction using engineering-based tools and programming languages, press a button, and have the design translated to DNA sequences that can be synthesized and put to work in living cells. We introduce such a programming language, which allows logical interactions between potentially undetermined proteins and genes to be expressed in a modular manner. Programs can be translated by a compiler into sequences of standard biological parts, a process that relies on logic programming and prototype databases that contain known biological parts and protein interactions. Programs can also be translated to reactions, allowing simulations to be carried out. While current limitations on available data prevent full use of the language in practical applications, the language can be used to develop formal models of synthetic systems, which are otherwise often presented by informal notations. The language can also serve as a concrete proposal on which future language designs can be discussed, and can help to guide the emerging standard of biological parts which so far has focused on biological, rather than logical, properties of parts.
Extending XNAT Platform with an Incremental Semantic Framework.
Timón, Santiago; Rincón, Mariano; Martínez-Tomás, Rafael
2017-01-01
Informatics increases the yield from neuroscience due to improved data. Data sharing and accessibility enable joint efforts between different research groups, as well as replication studies, pivotal for progress in the field. Research data archiving solutions are evolving rapidly to address these necessities; however, distributed data integration is still difficult because of the need for explicit agreements between disparate data models. To address these problems, ontologies are widely used in biomedical research to obtain common vocabularies and logical descriptions, but their application may suffer from scalability issues, domain bias, and loss of low-level data access. With the aim of improving the application of semantic models in biobanking systems, an incremental semantic framework that takes advantage of the latest advances in biomedical ontologies and the XNAT platform is designed and implemented. We follow a layered architecture that allows the alignment of multi-domain biomedical ontologies to manage data at different levels of abstraction. To illustrate this approach, the development is integrated in the JPND (EU Joint Program for Neurodegenerative Disease) APGeM project, focused on finding early biomarkers for Alzheimer's and other dementia-related diseases.
Redox processes and water quality of selected principal aquifer systems
McMahon, P.B.; Chapelle, F.H.
2008-01-01
Reduction/oxidation (redox) conditions in 15 principal aquifer (PA) systems of the United States, and their impact on several water quality issues, were assessed from a large data base collected by the National Water-Quality Assessment Program of the USGS. The logic of these assessments was based on the observed ecological succession of electron acceptors such as dissolved oxygen, nitrate, and sulfate and threshold concentrations of these substrates needed to support active microbial metabolism. Similarly, the utilization of solid-phase electron acceptors such as Mn(IV) and Fe(III) is indicated by the production of dissolved manganese and iron. An internally consistent set of threshold concentration criteria was developed and applied to a large data set of 1692 water samples from the PAs to assess ambient redox conditions. The indicated redox conditions then were related to the occurrence of selected natural (arsenic) and anthropogenic (nitrate and volatile organic compounds) contaminants in ground water. For the natural and anthropogenic contaminants assessed in this study, considering redox conditions as defined by this framework of redox indicator species and threshold concentrations explained many water quality trends observed at a regional scale. An important finding of this study was that samples indicating mixed redox processes provide information on redox heterogeneity that is useful for assessing common water quality issues. Given the interpretive power of the redox framework and given that it is relatively inexpensive and easy to measure the chemical parameters included in the framework, those parameters should be included in routine water quality monitoring programs whenever possible.
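The threshold logic described above can be made concrete with a small classifier that assigns a redox category from measured electron acceptors and products. The cutoff values below are illustrative placeholders, not the published USGS criteria.

```python
# Style of threshold logic the abstract describes: infer the dominant
# redox process from measured species. Cutoffs are illustrative only.
def redox_category(o2, no3, mn, fe, so4):
    """Concentrations in mg/L; returns the inferred dominant redox process."""
    if o2 >= 0.5:
        return "oxic (O2 reduction)"
    if no3 >= 0.5:
        return "anoxic (NO3 reduction)"
    if mn >= 0.05 and fe < 0.1:
        return "anoxic (Mn(IV) reduction)"
    if fe >= 0.1:
        # Elevated Fe with available sulfate suggests mixed Fe(III)/SO4 reduction.
        return "anoxic (Fe(III)/SO4 reduction)" if so4 >= 0.5 else "anoxic (Fe(III) reduction)"
    return "mixed/indeterminate"

print(redox_category(o2=0.2, no3=0.1, mn=0.02, fe=0.4, so4=2.0))
```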
Wu, Cuichen; Wan, Shuo; Hou, Weijia; Zhang, Liqin; Xu, Jiehua; Cui, Cheng; Wang, Yanyue; Hu, Jun; Tan, Weihong
2015-03-04
Nucleic acid-based logic devices were first introduced in 1994. Since then, science has seen the emergence of new logic systems for mimicking mathematical functions, diagnosing disease and even imitating biological systems. The unique features of nucleic acids, such as facile and high-throughput synthesis, Watson-Crick complementary base pairing, and predictable structures, together with the aid of programming design, have led to the widespread application of nucleic acids (NA) to logic gates and computing in biotechnology and biomedicine. In this feature article, the development of in vitro NA logic systems will be discussed, as well as the expansion of such systems using various input molecules for potential cellular, or even in vivo, applications.
On the formalization and reuse of scientific research.
King, Ross D; Liakata, Maria; Lu, Chuan; Oliver, Stephen G; Soldatova, Larisa N
2011-10-07
The reuse of scientific knowledge obtained from one investigation in another investigation is basic to the advance of science. Scientific investigations should therefore be recorded in ways that promote the reuse of the knowledge they generate. The use of logical formalisms to describe scientific knowledge has potential advantages in facilitating such reuse. Here, we propose a formal framework for using logical formalisms to promote reuse. We demonstrate the utility of this framework by using it in a worked example from biology: demonstrating cycles of investigation formalization [F] and reuse [R] to generate new knowledge. We first used logic to formally describe a Robot scientist investigation into yeast (Saccharomyces cerevisiae) functional genomics [f(1)]. With Robot scientists, unlike human scientists, the production of comprehensive metadata about their investigations is a natural by-product of the way they work. We then demonstrated how this formalism enabled the reuse of the research in investigating yeast phenotypes [r(1) = R(f(1))]. This investigation found that the removal of non-essential enzymes generally resulted in enhanced growth. The phenotype investigation was then formally described using the same logical formalism as the functional genomics investigation [f(2) = F(r(1))]. We then demonstrated how this formalism enabled the reuse of the phenotype investigation to investigate yeast systems-biology modelling [r(2) = R(f(2))]. This investigation found that yeast flux-balance analysis models fail to predict the observed changes in growth. Finally, the systems biology investigation was formalized for reuse in future investigations [f(3) = F(r(2))]. These cycles of reuse are a model for the general reuse of scientific knowledge.
Instantons in Self-Organizing Logic Gates
NASA Astrophysics Data System (ADS)
Bearden, Sean R. B.; Manukian, Haik; Traversa, Fabio L.; Di Ventra, Massimiliano
2018-03-01
Self-organizing logic is a recently suggested framework that allows the solution of Boolean truth tables "in reverse"; i.e., it is able to satisfy the logical proposition of gates regardless of which terminal(s) the truth value is assigned to ("terminal-agnostic logic"). It can be realized if time nonlocality (memory) is present. A practical realization of self-organizing logic gates (SOLGs) can be achieved by combining circuit elements with and without memory. By employing one such realization, we show, numerically, that SOLGs exploit elementary instantons to reach equilibrium points. Instantons are classical trajectories of the nonlinear equations of motion describing SOLGs and connect topologically distinct critical points in the phase space. By linear analysis at those points, we show that these instantons connect the initial critical point of the dynamics, with at least one unstable direction, directly to the final fixed point. We also show that the memory content of these gates affects only the relaxation time to reach the logically consistent solution. Finally, we demonstrate, by solving the corresponding stochastic differential equations, that, since instantons connect critical points, noise and perturbations may change the instanton trajectory in the phase space but not the initial and final critical points. Therefore, even for extremely large noise levels, the gates self-organize to the correct solution. Our work provides a physical understanding of, and can serve as an inspiration for, models of bidirectional logic gates that are emerging as important tools in physics-inspired, unconventional computing.
Verification and Planning Based on Coinductive Logic Programming
NASA Technical Reports Server (NTRS)
Bansal, Ajay; Min, Richard; Simon, Luke; Mallya, Ajay; Gupta, Gopal
2008-01-01
Coinduction is a powerful technique for reasoning about unfounded sets, unbounded structures, infinite automata, and interactive computations [6]. Where induction corresponds to least fixed point semantics, coinduction corresponds to greatest fixed point semantics. Recently coinduction has been incorporated into logic programming and an elegant operational semantics developed for it [11, 12]. This operational semantics is the greatest fixed point counterpart of SLD resolution (which imparts operational semantics to least fixed point based computations) and is termed co-SLD resolution. In co-SLD resolution, a predicate goal p(t) succeeds if it unifies with one of its ancestor calls. In addition, rational infinite terms are allowed as arguments of predicates. Infinite terms are represented as solutions to unification equations, and the occurs check is omitted during the unification process. Coinductive Logic Programming (Co-LP) and co-SLD resolution can be used to elegantly perform model checking and planning. A combined SLD and co-SLD resolution based LP system forms the common basis for planning, scheduling, verification, model checking, and constraint solving [9, 4]. This is achieved by amalgamating SLD resolution, co-SLD resolution, and constraint logic programming [13] in a single logic programming system. Given that parallelism in logic programs can be implicitly exploited [8], complex, compute-intensive applications (planning, scheduling, model checking, etc.) can be executed in parallel on multi-core machines. Parallel execution can result in speed-ups as well as in larger instances of the problems being solved. In the remainder we elaborate on (i) how planning can be elegantly and efficiently performed under real-time constraints, (ii) how real-time systems can be elegantly and efficiently model-checked, and (iii) how hybrid systems can be verified in a combined system with both co-SLD and SLD resolution. Implementations of co-SLD resolution as well as preliminary implementations of the planning and verification applications have been developed [4]. Co-LP and Model Checking: The vast majority of properties to be verified can be classified into safety properties and liveness properties. It is well known within model checking that safety properties can be verified by reachability analysis, i.e., if a counterexample to the property exists, it can be finitely determined by enumerating all the reachable states of the Kripke structure.
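The flavor of co-SLD's success rule, that a goal succeeds when it coincides with an ancestor call, can be sketched outside a logic-programming system as a greatest-fixed-point check on a cyclic transition system. The sketch below is an illustration of that rule applied to detecting infinite runs (a liveness-style check), not an implementation of co-SLD resolution itself.

```python
# Co-SLD flavor: a "goal" (state) succeeds coinductively when it recurs
# among its own ancestor calls. Here that detects a lasso, i.e., an
# infinite run, in a finite transition system.
def has_infinite_run(state, transitions, ancestors=frozenset()):
    if state in ancestors:          # goal matches an ancestor: coinductive success
        return True
    return any(has_infinite_run(nxt, transitions, ancestors | {state})
               for nxt in transitions.get(state, ()))

# a -> b -> c -> b forms a cycle, so an infinite run exists from 'a';
# 'd' is a dead end with no successors.
transitions = {"a": ["b"], "b": ["c"], "c": ["b", "d"], "d": []}
print(has_infinite_run("a", transitions))   # True
print(has_infinite_run("d", transitions))   # False
```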
Programming Cell Adhesion for On-Chip Sequential Boolean Logic Functions.
Qu, Xiangmeng; Wang, Shaopeng; Ge, Zhilei; Wang, Jianbang; Yao, Guangbao; Li, Jiang; Zuo, Xiaolei; Shi, Jiye; Song, Shiping; Wang, Lihua; Li, Li; Pei, Hao; Fan, Chunhai
2017-08-02
Programmable remodelling of cell surfaces enables high-precision regulation of cell behavior. In this work, we developed in vitro constructed DNA-based chemical reaction networks (CRNs) to program on-chip cell adhesion. We found that the RGD-functionalized DNA CRNs are entirely noninvasive when interfaced with the fluid mosaic membrane of living cells. DNA toeholds of different lengths could tunably alter the release kinetics of cells, showing release within minutes with the use of a 6-base toehold. We further demonstrated the realization of Boolean logic functions by using DNA strand displacement reactions, including multi-input and sequential cell logic gates (AND, OR, XOR, and AND-OR). This study provides a highly generic tool for the self-organization of biological systems.
Gomez, Fernando; Curcio, Carmen Lucia
2013-01-01
The underlying rationale for supporting interdisciplinary collaboration in geriatrics and gerontology is the complexity of elderly care. The most important characteristic of interdisciplinary health care teams for older people in Latin America is their subjective-basis framework. Whereas in other regions teams are organized according to a theoretical knowledge basis with well-justified priorities, functions, and long-term goals, in Latin America teams are arranged according to subjective interests in solving their problems. Three distinct approaches to interdisciplinary collaboration in gerontology are proposed. The first approach is grounded in the scientific rationalism of European origin. Denominated the "logical-rational approach," its core is to identify the significance of knowledge. The second approach is grounded in pragmatism and is more associated with a North American tradition; its core consists in enhancing the skills and competences of each participant, and it is denominated the "logical-instrumental approach." The third approach, denominated the "logical-subjective approach," has a Latin American origin; its core consists in taking into account the internal and emotional dimensions of the team. These conceptual frameworks, based in geographical contexts, make it possible to establish the differences and shared characteristics of interdisciplinary collaboration in geriatrics and gerontology in the search for operational answers to the "complex problems" of older adults.
Guidance for modeling causes and effects in environmental problem solving
Armour, Carl L.; Williamson, Samuel C.
1988-01-01
Environmental problems are difficult to solve because their causes and effects are not easily understood. When attempts are made to analyze causes and effects, the principal challenge is the organization of information into a framework that is logical, technically defensible, and easy to understand and communicate. When decisionmakers attempt to solve complex problems before an adequate cause and effect analysis is performed, there are serious risks. These risks include: greater reliance on subjective reasoning, lessened chance of scoping an effective problem-solving approach, impaired recognition of the need for supplemental information to attain understanding, increased chance of making unsound decisions, and lessened chance of gaining approval and financial support for a program. Cause and effect relationships can be modeled. This type of modeling has been applied to various environmental problems, including cumulative impact assessment (Dames and Moore 1981; Meehan and Weber 1985; Williamson et al. 1987; Raley et al. 1988) and evaluation of the effects of quarrying (Sheate 1986). This guidance for field users was written because of the current interest in documenting cause-effect logic as a part of ecological problem solving. Principal literature sources relating to the modeling approach are: Riggs and Inouye (1975a, b), Erickson (1981), and United States Office of Personnel Management (1986).
Execution of Educational Mechanical Production Programs for School Children
NASA Astrophysics Data System (ADS)
Itoh, Nobuhide; Itoh, Goroh; Shibata, Takayuki
The authors are conducting experience-based engineering education programs for elementary and junior high school students with the aim of giving them a chance to experience mechanical production. As part of this endeavor, we planned and conducted a program called “Fabrication of Original Magnet Plates by Casting” for elementary school students. This program included a course introducing natural laws and methods of logical thinking. Prior to the program, a preliminary version was run with school teachers to collect comments, and the program was modified accordingly. The children responded enthusiastically to the production process, which realized their own ideas, but it was found that the course on natural laws and logical methods needed improvement to draw their interest and attention. We will continue to plan more effective programs, deepening ties with the local community.
An efficient annealing in Boltzmann machine in Hopfield neural network
NASA Astrophysics Data System (ADS)
Kin, Teoh Yeong; Hasan, Suzanawati Abu; Bulot, Norhisam; Ismail, Mohammad Hafiz
2012-09-01
This paper proposes and implements a Boltzmann machine in a Hopfield neural network for doing logic programming based on an energy minimization system. Temperature scheduling in the Boltzmann machine enhances the performance of logic programming in the Hopfield network. The best temperature is determined by observing the ratio of global solutions and the final Hamming distance in computer simulations. The study shows that the Boltzmann machine model is more stable and competent in terms of representing and solving difficult combinatorial problems.
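The energy-minimization loop at the heart of this approach can be sketched directly: stochastic single-spin updates on a Hopfield-style energy under a falling temperature. The weights below are synthetic random values, not the clause encoding of a logic program used in the paper.

```python
# Sketch of Boltzmann-machine annealing on a Hopfield energy
# E(s) = -1/2 s^T W s, with a geometric cooling schedule.
import numpy as np

rng = np.random.default_rng(3)
n = 8
W = rng.normal(size=(n, n))
W = (W + W.T) / 2                        # symmetric weights
np.fill_diagonal(W, 0)                   # no self-connections

def energy(s):
    return -0.5 * s @ W @ s

s = rng.choice([-1, 1], size=n)          # random initial spin state
T = 2.0
for step in range(2000):
    i = rng.integers(n)
    dE = 2 * s[i] * (W[i] @ s)           # energy change from flipping s[i]
    if dE < 0 or rng.random() < np.exp(-dE / T):
        s[i] = -s[i]                     # accept flip (Boltzmann rule)
    T = max(0.01, T * 0.998)             # cooling schedule
print("final energy:", round(float(energy(s)), 3))
```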
Detecting Payload Attacks on Programmable Logic Controllers (PLCs)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Huan
Programmable logic controllers (PLCs) play critical roles in industrial control systems (ICS). Providing hardware peripherals and firmware support for control programs (i.e., a PLC’s “payload”) written in languages such as ladder logic, PLCs directly receive sensor readings and control ICS physical processes. An attacker with access to PLC development software (e.g., by compromising an engineering workstation) can modify the payload program and cause severe physical damage to the ICS. To protect critical ICS infrastructure, we propose to model the runtime behavior of legitimate PLC payload programs and use runtime behavior monitoring in PLC firmware to detect payload attacks. By monitoring the I/O access patterns, network access patterns, and payload program timing characteristics, our proposed firmware-level detection mechanism can detect abnormal runtime behaviors of malicious PLC payloads. Using our proof-of-concept implementation, we evaluate the memory and execution time overhead of implementing our proposed method and find that it is feasible to incorporate our method into existing PLC firmware. In addition, our evaluation results show that a wide variety of payload attacks can be effectively detected by our proposed approach. The proposed firmware-level payload attack detection scheme complements existing bump-in-the-wire solutions (e.g., external temporal-logic-based model checkers) in that it can detect payload attacks that violate real-time requirements of ICS operations and does not require any additional apparatus.
Use of LOGIC to support lidar operations
NASA Astrophysics Data System (ADS)
Davis-Lunde, Kimberley; Jugan, Laurie A.; Shoemaker, J. Todd
1999-10-01
The Naval Oceanographic Office (NAVOCEANO) and Planning Systems Incorporated are developing the Littoral Optics Geospatial Integrated Capability (LOGIC). LOGIC supports NAVOCEANO's directive to assess the impact of the environment on Fleet systems in areas of operational interest. LOGIC is based on the Geographic Information System (GIS) ARC/INFO and offers a method to view and manipulate optics and ancillary data to support emerging Fleet lidar systems. LOGIC serves as a processing (as required) and quality-checking mechanism for data entering NAVOCEANO's Data Warehouse and handles both remotely sensed and in-water data. LOGIC provides a link between these data and the GIS-based graphical user interface, allowing the user to select data manipulation routines and/or system support products. The results of individual modules are displayed via the GIS to provide such products as lidar system performance, laser penetration depth, and asset vulnerability to a lidar threat. LOGIC is being developed for integration into other NAVOCEANO programs, most notably the Comprehensive Environmental Assessment System, an established tool supporting sonar-based systems. The prototype for LOGIC was developed for the Yellow Sea, focusing on a diver visibility support product.
Research in mathematical theory of computation. [computer programming applications
NASA Technical Reports Server (NTRS)
Mccarthy, J.
1973-01-01
Research progress in the following areas is reviewed: (1) a new version of the computer program LCF (logic for computable functions), including a facility to search for proofs automatically; (2) the description of the language PASCAL in terms of both LCF and first-order logic; (3) discussion of LISP semantics in LCF and an attempt to prove the correctness of the London compilers in a formal way; (4) design of both special-purpose and domain-independent proving procedures, with program correctness specifically in mind; (5) design of languages for describing such proof procedures; and (6) the embedding of these ideas in the first-order checker.
NASA Technical Reports Server (NTRS)
Jones, W. V.
1973-01-01
Modifications to the basic computer program for performing the simulations are reported. The major changes include: (1) extension of the calculations to include the development of cascades initiated by heavy nuclei, (2) improved treatment of the nuclear disintegrations which occur during the interactions of hadrons in heavy absorbers, (3) incorporation of accurate multi-pion final-state cross sections for various interactions at accelerator energies, (4) restructuring of the program logic so that calculations can be made for sandwich-type detectors, and (5) logic modifications related to execution of the program.
Institutional logic in self-management support: coexistence and diversity.
Bossy, Dagmara; Knutsen, Ingrid Ruud; Rogers, Anne; Foss, Christina
2016-11-01
The prevalence of chronic conditions in Europe has been the subject of health-political reforms that have increasingly targeted collaboration between public, private and voluntary organisations for the purpose of supporting self-management of long-term diseases. The international literature describes collaboration across sectors as challenging, which implies that their respective logics are conflicting or incompatible. In line with the European context, recent Norwegian health policy advocates inter-sectorial partnerships. The aim of this policy is to create networks supporting better self-management for people with chronic conditions. The purpose of our qualitative study was to map different understandings of self-management support in private for-profit, volunteer and public organisations. These organisations are seen as potential self-management support networks for individuals with chronic conditions in Norway. From December 2012 to April 2013, we conducted 50 semi-structured interviews with representatives from relevant health and well-being organisations in different parts of Norway. According to the theoretical framework of institutional logic, representatives' statements are embedded with organisational understandings. In the analysis, we systematically assessed the representatives' different understandings of self-management support. The institutional logic we identified revealed traits of organisational historical backgrounds and transitions in understanding. We found that the merging of individualism and fellowship in contemporary health policy generates different types of logic in different organisational contexts. The private for-profit organisations were concerned with the logic of a healthy appearance and mindset, whereas the private non-profit organisations emphasised fellowship and moral responsibility. Finally, the public, illness-oriented organisations tended to highlight individual conditions for illness management. Different types of logic may attract different users, and at the same time a diversity of logic types may challenge collaboration at the user's expense. Institutional logic carries moral implications, here implying a shift towards individual responsibility for disease. Policy makers ought to consider these complexities of logic in order to tailor support to the different needs of users. © 2015 John Wiley & Sons Ltd.
A psychometric evaluation of the digital logic concept inventory
NASA Astrophysics Data System (ADS)
Herman, Geoffrey L.; Zilles, Craig; Loui, Michael C.
2014-10-01
Concept inventories hold tremendous promise for promoting the rigorous evaluation of teaching methods that might remedy common student misconceptions and promote deep learning. The measurements from concept inventories can be trusted only if the concept inventories are evaluated both by expert feedback and statistical scrutiny (psychometric evaluation). Classical Test Theory and Item Response Theory provide two psychometric frameworks for evaluating the quality of assessment tools. We discuss how these theories can be applied to assessment tools generally and then apply them to the Digital Logic Concept Inventory (DLCI). We demonstrate that the DLCI is sufficiently reliable for research purposes when used in its entirety and as a post-course assessment of students' conceptual understanding of digital logic. The DLCI can also discriminate between students across a wide range of ability levels, providing the most information about weaker students' ability levels.
Evaluation of a Postdischarge Call System Using the Logic Model.
Frye, Timothy C; Poe, Terri L; Wilson, Marisa L; Milligan, Gary
2018-02-01
This mixed-method study was conducted to evaluate a postdischarge call program for congestive heart failure patients at a major teaching hospital in the southeastern United States. The program was implemented based on the premise that it would improve patient outcomes and overall quality of life, but it had never been evaluated for effectiveness. The Logic Model was used to evaluate the input of key staff members to determine whether the outputs and results of the program matched the expectations of the organization. Interviews, online surveys, reviews of existing patient outcome data, and reviews of publicly available program marketing materials were used to ascertain current program output. After analyzing both qualitative and quantitative data from the evaluation, recommendations were made to the organization to improve the effectiveness of the program.
Naimoli, Joseph F; Frymus, Diana E; Wuliji, Tana; Franco, Lynne M; Newsome, Martha H
2014-10-02
There has been a resurgence of interest in national Community Health Worker (CHW) programs in low- and middle-income countries (LMICs). A lack of strong research evidence persists, however, about the most efficient and effective strategies to ensure optimal, sustained performance of CHWs at scale. To facilitate learning and research to address this knowledge gap, the authors developed a generic CHW logic model that proposes a theoretical causal pathway to improved performance. The logic model draws upon available research and expert knowledge on CHWs in LMICs. Construction of the model entailed a multi-stage, inductive, two-year process. It began with the planning and implementation of a structured review of the existing research on community and health system support for enhanced CHW performance. It continued with a facilitated discussion of review findings with experts during a two-day consultation. The process culminated with the authors' review of consultation-generated documentation, additional analysis, and production of multiple iterations of the model. The generic CHW logic model posits that optimal CHW performance is a function of high quality CHW programming, which is reinforced, sustained, and brought to scale by robust, high-performing health and community systems, both of which mobilize inputs and put in place processes needed to fully achieve performance objectives. Multiple contextual factors can influence CHW programming, system functioning, and CHW performance. The model is a novel contribution to current thinking about CHWs. It places CHW performance at the center of the discussion about CHW programming, recognizes the strengths and limitations of discrete, targeted programs, and is comprehensive, reflecting the current state of both scientific and tacit knowledge about support for improving CHW performance. The model is also a practical tool that offers guidance for continuous learning about what works. Despite the model's limitations and several challenges in translating the potential for learning into tangible learning, the CHW generic logic model provides a solid basis for exploring and testing a causal pathway to improved performance.
NASA Technical Reports Server (NTRS)
Ferguson, D. R.; Keith, J. S.
1975-01-01
The improvements which have been incorporated in the Streamtube Curvature Program to enhance both its computational and diagnostic capabilities are described. Detailed descriptions are given of the revisions incorporated to more reliably handle the jet stream-external flow interaction at trailing edges. Also presented are the augmented boundary layer procedures and a variety of other program changes relating to program diagnostics and extended solution capabilities. An updated User's Manual, that includes information on the computer program operation, usage, and logical structure, is presented. User documentation includes an outline of the general logical flow of the program and detailed instructions for program usage and operation. From the standpoint of the programmer, the overlay structure is described. The input data, output formats, and diagnostic printouts are covered in detail and illustrated with three typical test cases.
Peptide Logic Circuits Based on Chemoenzymatic Ligation for Programmable Cell Apoptosis.
Li, Yong; Sun, Sujuan; Fan, Lin; Hu, Shanfang; Huang, Yan; Zhang, Ke; Nie, Zhou; Yao, Shouzhou
2017-11-20
A novel and versatile peptide-based bio-logic system capable of regulating cell function is developed using sortase A (SrtA), a peptide ligation enzyme, as a generic processor. By modular peptide design, we demonstrate that mammalian cell apoptosis can be programmed by peptide-based logic operations, including binary and combination gates (AND, INHIBIT, OR, and AND-INHIBIT) and a complex sequential logic circuit (a multi-input keypad lock). Moreover, a proof-of-concept peptide regulatory circuit was developed to analyze the expression profile of cell-secreted protein biomarkers and trigger cancer-cell-specific apoptosis. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
The Effects of Practice-Based Training on Graduate Teaching Assistants’ Classroom Practices
Becker, Erin A.; Easlon, Erin J.; Potter, Sarah C.; Guzman-Alvarez, Alberto; Spear, Jensen M.; Facciotti, Marc T.; Igo, Michele M.; Singer, Mitchell; Pagliarulo, Christopher
2017-01-01
Evidence-based teaching is a highly complex skill, requiring repeated cycles of deliberate practice and feedback to master. Despite existing well-characterized frameworks for practice-based training in K–12 teacher education, the major principles of these frameworks have not yet been transferred to instructor development in higher educational contexts, including training of graduate teaching assistants (GTAs). We sought to determine whether a practice-based training program could help GTAs learn and use evidence-based teaching methods in their classrooms. We implemented a weekly training program for introductory biology GTAs that included structured drills of techniques selected to enhance student practice, logic development, and accountability and reduce apprehension. These elements were selected based on their previous characterization as dimensions of active learning. GTAs received regular performance feedback based on classroom observations. To quantify use of target techniques and levels of student participation, we collected and coded 160 h of video footage. We investigated the relationship between frequency of GTA implementation of target techniques and student exam scores; however, we observed no significant relationship. Although GTAs adopted and used many of the target techniques with high frequency, techniques that enforced student participation were not stably adopted, and their use was unresponsive to formal feedback. We also found that techniques discussed in training, but not practiced, were not used at quantifiable frequencies, further supporting the importance of practice-based training for influencing instructional practices.
The motor theory of speech perception revisited.
Massaro, Dominic W; Chen, Trevor H
2008-04-01
Galantucci, Fowler, and Turvey (2006) have claimed that perceiving speech is perceiving gestures and that the motor system is recruited for perceiving speech. We make the counterargument that perceiving speech is not perceiving gestures, that the motor system is not recruited for perceiving speech, and that speech perception can be adequately described by a prototypical pattern recognition model, the fuzzy logical model of perception (FLMP). Empirical evidence taken as support for gesture and motor theory is reconsidered in more detail and in the framework of the FLMP. Additional theoretical and logical arguments are made to challenge gesture and motor theory.
A framework for qualitative reasoning about solid objects
NASA Technical Reports Server (NTRS)
Davis, E.
1987-01-01
Predicting the behavior of a qualitatively described system of solid objects requires a combination of geometrical, temporal, and physical reasoning. Methods based upon formulating and solving differential equations are not adequate for robust prediction, since the behavior of a system over extended time may be much simpler than its behavior over local time. A first-order logic, in which one can state simple physical problems and derive their solution deductively, without recourse to solving the differential equations, is discussed. This logic is substantially more expressive and powerful than any previous AI representational system in this domain.
Design automation techniques for custom LSI arrays
NASA Technical Reports Server (NTRS)
Feller, A.
1975-01-01
The standard cell design automation technique is described as an approach for generating random logic PMOS, CMOS or CMOS/SOS custom large scale integration arrays with low initial nonrecurring costs and quick turnaround time or design cycle. The system is composed of predesigned circuit functions or cells and computer programs capable of automatic placement and interconnection of the cells in accordance with an input data net list. The program generates a set of instructions to drive an automatic precision artwork generator. A series of support design automation and simulation programs are described, including programs for verifying correctness of the logic on the arrays, performing dc and dynamic analysis of MOS devices, and generating test sequences.
A Deductive Approach to Computer Programming.
1986-01-01
[82] K. L. Clark and S.-A. Tärnlund (editors), Logic Programming, Academic Press (1982). A.P.I.C. Studies in Data Processing No. 16. … Goguen and … R. S. Boyer and J. S. Moore, A Computational Logic, Academic Press, New York, N.Y., 1979. Brand [75] D. Brand, Proving theorems with the modification method, …
1984-06-01
Each stock point is autonomous with respect to how it implements data processing support, as long as it accommodates the Navy Supply Systems Command… has its own data elements, files, programs, transactions, users, reports, and some have additional hardware. To augment them all and not force redesign… programs are written to request session establishments among them using only logical addressing names (mailboxes) which are independent from physical
Playing Tic-Tac-Toe with a Sugar-Based Molecular Computer.
Elstner, M; Schiller, A
2015-08-24
Today, molecules can perform Boolean operations and be assembled into circuits of increasing complexity. However, concatenating logic gates and handling inhomogeneous inputs and outputs remain challenging tasks. Novel approaches to logic gate integration become possible when chemical programming and software programming are combined. Here it is shown that a molecular finite automaton, based on the concatenated implication function (IMP) of a fluorescent two-component sugar probe and a wiring algorithm, is able to play tic-tac-toe.
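Material implication (IMP) is the less familiar gate named here. A minimal Python sketch of its truth table and of one assumed two-gate concatenation, purely to show how chaining IMP gates builds more complex functions:

```python
def IMP(a, b):
    # Material implication: false only when a is true and b is false.
    return (not a) or b

def chained(a, b, c):
    # Assumed concatenation for illustration: the first gate's output feeds the second.
    return IMP(IMP(a, b), c)

for a in (False, True):
    for b in (False, True):
        print(a, b, IMP(a, b))     # only (True, False) yields False
print(chained(True, False, True))  # IMP(False, True) -> True
```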
From complexity to reality: providing useful frameworks for defining systems of care.
Levison-Johnson, Jody; Wenz-Gross, Melodie
2010-02-01
Because systems of care are not uniform across communities, there is a need to better document the process of system development, define the complexity, and describe the development of the structures, processes, and relationships within communities engaged in system transformation. By doing so, we begin to identify the necessary and sufficient components that, at minimum, move us from usual care within a naturally occurring system to a true system of care. Further, by documenting and measuring the degree to which key components are operating, we may be able to identify the most successful strategies in creating system reform. The theory of change and logic model offer a useful framework for communities to begin the adaptive work necessary to effect true transformation. Using the experience of two system of care communities, this new definition and the utility of a theory of change and logic model framework for defining local system transformation efforts will be discussed. Implications for the field, including the need to further examine the natural progression of systems change and to create quantifiable measures of transformation, will be raised as new challenges for the evolving system of care movement.
Runtime verification of embedded real-time systems.
Reinbacher, Thomas; Függer, Matthias; Brauer, Jörg
We present a runtime verification framework that allows on-line monitoring of past-time Metric Temporal Logic (ptMTL) specifications in a discrete time setting. We design observer algorithms for the time-bounded modalities of ptMTL, which take advantage of the highly parallel nature of hardware designs. The algorithms can be translated into efficient hardware blocks, which are designed for reconfigurability and thus facilitate applications of the framework in both the prototyping and the post-deployment phases of embedded real-time systems. We provide formal correctness proofs for all presented observer algorithms and analyze their time and space complexity. For example, for the most general operator considered, the time-bounded Since operator, we obtain a time complexity that is doubly logarithmic both in the point in time the operator is executed and in the operator's time bounds. This result is promising with respect to a self-contained, non-interfering monitoring approach that evaluates real-time specifications in parallel to the system-under-test. We implement our framework on a Field Programmable Gate Array platform and use extensive simulation and logic synthesis runs to assess the benefits of the approach in terms of resource usage and operating frequency.
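To make the time-bounded Since modality concrete, here is a minimal Python reference evaluator of its standard discrete-time semantics. It recomputes from scratch at each time point and is deliberately naive, so it says nothing about the paper's doubly-logarithmic hardware observers.

```python
def since_bounded(phi, psi, n, a, b):
    """Evaluate (phi S[a,b] psi) at time n over finite Boolean traces.

    True iff psi held at some time m with n - b <= m <= n - a, and
    phi has held at every time strictly after m up to and including n.
    """
    for m in range(max(0, n - b), n - a + 1):
        if psi[m] and all(phi[k] for k in range(m + 1, n + 1)):
            return True
    return False

# Example trace: psi fires once at time 1, phi holds from then on.
phi = [False, True, True, True, True]
psi = [False, True, False, False, False]
print(since_bounded(phi, psi, n=4, a=1, b=4))  # True: psi at m=1, phi on (1, 4]
```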
Airport Landside. Volume IV. Appendix A. ALSIM AUXILIARY and MAIN Programs.
DOT National Transportation Integrated Search
1982-06-01
This Appendix describes the Program Logic of the Airport Landside Simulation Model (ALSIM) AUXILIARY and MAIN Programs. Both programs are written in GPSS-V. The AUXILIARY program is operated prior to the MAIN Program to create GPSS transactions repre...
Clapham, Kathleen; Manning, Claire; Williams, Kathryn; O'Brien, Ginger; Sutherland, Margaret
2017-04-01
Despite clear evidence that learning and social opportunities for children with disabilities and special needs are more effective in inclusive rather than segregated settings, there are few known effective inclusion programs available to children with disabilities, their families, or teachers in the early years within Australia. The Kids Together program was developed to support children with disabilities/additional needs aged 0-8 years attending mainstream early learning environments. Using a key worker transdisciplinary team model, the program aligns with the individualised package approach of the National Disability Insurance Scheme (NDIS). This paper reports on the use of a logic model to underpin the process, outcomes and impact evaluation of the Kids Together program. The research team worked across 15 Early Childhood Education and Care (ECEC) centres and in home and community settings. A realist evaluation using mixed methods was undertaken to understand what works, for whom and in what contexts. The development of a logic model provided a structured way to explore how the program was implemented and achieved short, medium and long term outcomes within a complex community setting. Kids Together was shown to be a highly effective and innovative model for supporting the inclusion of children with disabilities/additional needs in a range of environments central to early childhood learning and development. The use of a logic model provided a visual representation of the Kids Together model and its component parts and enabled a theory of change to be inferred, showing how a coordinated and collaborative approach can work across multiple environments. Copyright © 2016 Elsevier Ltd. All rights reserved.
Xu, Elvis G B; Leung, Kenneth M Y; Morton, Brian; Lee, Joseph H W
2015-02-01
Marine protected areas (MPAs), such as marine parks and reserves, contain natural resources of immense value to the environment and mankind. Since MPAs may be situated in close proximity to urbanized areas and influenced by anthropogenic activities (e.g. continuous discharges of contaminated waters), the marine organisms contained in such waters are probably at risk. This study aimed at developing an integrated environmental risk assessment and management (IERAM) framework for enhancing the sustainability of such MPAs. The IERAM framework integrates conventional environmental risk assessment methods with a multi-layer DPSIR (Driver-Pressure-State-Impact-Response) conceptual approach, which can simplify the complex issues embraced by environmental management strategies and provide logical and concise management information. The IERAM process can generate a useful database, offer timely updates on the status of MPAs, and assist in the prioritization of management options. We use the Cape d'Aguilar Marine Reserve in Hong Kong as an example to illustrate the IERAM framework. A comprehensive set of indicators was selected, aggregated and analyzed using this framework. Effects of management practices and programs were also assessed by comparing the temporal distributions of these indicators over a certain timeframe. Based on the obtained results, we have identified the most significant components for safeguarding the integrity of the marine reserve, and indicated the existing information gaps concerning the management of the reserve. Apart from assessing the MPA's present condition, a successful implementation of the IERAM framework as advocated here would also facilitate better-informed decision-making and, hence, indirectly enhance the protection and conservation of the MPA's marine biodiversity. Copyright © 2014 Elsevier B.V. All rights reserved.
1982-11-03
define the maximum count for the pattern defined by the first 3 bits. Since there are 11 bits involved, it is possible to define patterns up to 2048 … applied to the UUT directly through the driver for any count up to 2048. Any one of the 7 clocks may be selected under program control and applied to any … the logic one level for the driver (VD1), the logic zero level for the driver (VD0), the logic one level for the receiver (VR1), and the logic zero level for the
Design of a Ferroelectric Programmable Logic Gate Array
NASA Technical Reports Server (NTRS)
MacLeod, Todd C.; Ho, Fat Duen
2003-01-01
A programmable logic gate array has been designed utilizing ferroelectric field-effect transistors (FFETs). The design has only a small number of gates, but it could be scaled up to a more useful size. Using FFETs in a logic array gives several advantages. First, it allows real-time programmability of the array for high-speed reconfiguration. It also allows the array to be reconfigured a nearly unlimited number of times, unlike a FLASH FPGA. Finally, the Ferroelectric Programmable Logic Gate Array (FPLGA) can be implemented using a smaller number of transistors because of the inherent logic characteristics of an FFET. The device was only designed and modeled using Spice models of the circuit, including the FFET; the actual device was not produced. The design consists of a small array of NAND and NOR logic gates, and other gates could easily be produced. They are linked by FFETs that control the logic flow. Timing and logic tables have been produced showing that the array can produce a variety of logic combinations at usable real-time speed. This device could be a prototype for devices in embedded systems that need both the high speed of hardware-implemented logic and the flexibility to change the logic algorithm. Because of the non-volatile nature of the FFET, it would also be useful where a logic array must be programmed once and used repeatedly after power has been shut off.
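A behavioral sketch (not the FPLGA circuit) of what such an array computes: configuration bits choose each cell's function, mimicking how non-volatile FFET switches steer the logic flow. The two-cell topology is an assumption for illustration.

```python
# Configuration bits select each cell's gate, as FFET switches would in hardware.
GATES = {
    0: lambda a, b: not (a and b),   # NAND
    1: lambda a, b: not (a or b),    # NOR
}

def programmable_array(config, a, b, c):
    """Two-cell chain: cell 1 combines (a, b); cell 2 combines that result with c."""
    g1 = GATES[config[0]](a, b)
    g2 = GATES[config[1]](g1, c)
    return g2

print(programmable_array([0, 1], True, True, False))  # NAND then NOR -> True
```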
NASA Technical Reports Server (NTRS)
Keith, J. S.; Ferguson, D. R.; Heck, P. H.
1972-01-01
The computer program, Streamtube Curvature Analysis, is described for the engineering user and for the programmer. The user-oriented documentation includes a description of the mathematical governing equations, their use in the solution, and the method of solution. The general logical flow of the program is outlined and detailed instructions for program usage and operation are explained. General procedures for program use and the program capabilities and limitations are described. From the standpoint of the programmer, the overlay structure of the program is described. The various storage tables are defined and their uses explained. The input and output are discussed in detail. The program listing includes numerous comments so that the logical flow within the program is easily followed. A test case showing input data and output format is included, as well as an error printout description.
[Styles of programming 1952-1972].
van den Bogaard, Adrienne
2008-01-01
In the field of the history of computing, the construction of the early computers has received much scholarly attention. However, these machines have been important not only because of their logical design and their engineering, but also because of the programming practices that emerged around them. This article compares two styles of programming that developed around Dutch 'first computers'. The first style is represented by Edsger Wybe Dijkstra (1930-2002), who would receive the Turing Award for his work in 1972. Dijkstra developed a mathematical style of programming: a program was something you should be able to design mathematically and prove logically. The second style is represented by Willem Louis van der Poel (born 1926). For him, programming is 'trickology': a program is primarily a technical artefact that should work, something you play with, comparable to the way one solves a puzzle.
1986-03-21
…itative frameworks (e.g., Doyle, Toulmin, P. Cohen), and efforts to synthesize logic and probability (Nilsson… logic allows for provisional acceptance of uncertain premises, which may later be retracted when they lead to contradictory conclusions. Toulmin (1958… [which AI researchers] have accepted without hesitation as impeccable." The basic framework of an argument, according to Toulmin, is as follows (Toulmin
High speed CMOS/SOS standard cell notebook
NASA Technical Reports Server (NTRS)
1978-01-01
The NASA/MSFC high speed CMOS/SOS standard cell family, designed to be compatible with the PR2D (Place, Route in 2-Dimensions) automatic layout program, is described. Standard cell data sheets show the logic diagram, the schematic, the truth table, and propagation delays for each logic cell.
Online Collaboration for Programming: Assessing Students' Cognitive Abilities
ERIC Educational Resources Information Center
Othman, Mahfudzah; Muhd Zain, Nurzaid
2015-01-01
This study is primarily focused on assessing the students' logical thinking and cognitive levels in an online collaborative environment. The aim is to investigate whether the online collaboration has significant impact to the students' cognitive abilities. The assessment of the logical thinking involved the use of the online Group Assessment…
Personal Epistemology of Urban Elementary School Teachers
ERIC Educational Resources Information Center
Pearrow, Melissa; Sanchez, William
2008-01-01
Personal epistemology, originating from social construction theory, provides a framework for researchers to understand how individuals view their world. The Attitudes About Reality (AAR) scale is one survey method that qualitatively assesses personal epistemology along the logical positivist and social constructionist continuum; however, the…
Schünemann, Holger J; Wiercioch, Wojtek; Brozek, Jan; Etxeandia-Ikobaltzeta, Itziar; Mustafa, Reem A; Manja, Veena; Brignardello-Petersen, Romina; Neumann, Ignacio; Falavigna, Maicon; Alhazzani, Waleed; Santesso, Nancy; Zhang, Yuan; Meerpohl, Jörg J; Morgan, Rebecca L; Rochwerg, Bram; Darzi, Andrea; Rojas, Maria Ximenas; Carrasco-Labra, Alonso; Adi, Yaser; AlRayees, Zulfa; Riva, John; Bollig, Claudia; Moore, Ainsley; Yepes-Nuñez, Juan José; Cuello, Carlos; Waziry, Reem; Akl, Elie A
2017-01-01
Guideline developers can: (1) adopt existing recommendations from others; (2) adapt existing recommendations to their own context; or (3) create recommendations de novo. Monetary and nonmonetary resources, credibility, maximization of uptake, as well as logical arguments should guide the choice of approach and processes. To describe a potentially efficient model for guideline production based on adoption, adaptation, and/or de novo development of recommendations utilizing the Grading of Recommendations Assessment, Development and Evaluation (GRADE) Evidence to Decision (EtD) frameworks. We applied the model in a new national guideline program producing 22 practice guidelines. We searched for relevant evidence that informs the direction and strength of a recommendation. We then produced GRADE EtDs for guideline panels to develop recommendations. We produced a total of 80 EtD frameworks in approximately 4 months and 146 EtDs in approximately 6 months in two waves. Use of the EtD frameworks allowed panel members to understand the judgments of others about the criteria that bear on guideline recommendations and then to make their own judgments about those criteria in a systematic approach. The "GRADE-ADOLOPMENT" approach to guideline production combines adoption, adaptation, and, as needed, de novo development of recommendations. If developers of guidelines follow EtD criteria more widely and make their work publicly available, this approach should prove even more useful. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.
A Fuzzy Logic Optimal Control Law Solution to the CMMCA Tracking Problem
1993-03-01
or from a transfer function. Many times, however, the resulting algorithms are so complex as to be completely or essentially useless. Applications… implemented in a nearly real-time computer simulation. Located within the LQ framework are all the performance data for both the CMMCA and the CX… required nor desired. … A more general and less exacting framework was used. In order to concentrate on the theory and problem solution, it was
Understanding the dynamic effects of returning patients toward emergency department density
NASA Astrophysics Data System (ADS)
Ahmad, Norazura; Zulkepli, Jafri; Ramli, Razamin; Ghani, Noraida Abdul; Teo, Aik Howe
2017-11-01
This paper presents the development of a dynamic hypothesis for the effect of returning patients on the emergency department (ED). A logical tree from the Theory of Constraints, known as a Current Reality Tree, was used to identify the key variables. Then, a hypothetical framework portraying the interrelated variables and their influencing relationships was developed using causal loop diagrams (CLD). The conceptual framework was designed as the basis for the development of a system dynamics model.
Minimally inconsistent reasoning in Semantic Web.
Zhang, Xiaowang
2017-01-01
Reasoning with inconsistencies is an important issue for the Semantic Web, as imperfect information is unavoidable in real applications. For this, different paraconsistent approaches, due to their capacity to draw nontrivial conclusions while tolerating inconsistencies, have been proposed to reason with inconsistent description logic knowledge bases. However, existing paraconsistent approaches are often criticized for being too skeptical. To this end, this paper presents a non-monotonic paraconsistent version of description logic reasoning, called minimally inconsistent reasoning, where the inconsistencies tolerated in the reasoning are minimized so that more reasonable conclusions can be inferred. Some desirable properties are studied, which shows that the new semantics inherits advantages of both non-monotonic reasoning and paraconsistent reasoning. A complete and sound tableau-based algorithm, called multi-valued tableaux, is developed to capture the minimally inconsistent reasoning. In fact, the tableaux algorithm is designed as a framework for multi-valued DL, allowing for different underlying paraconsistent semantics that differ only in the clash conditions. Finally, the complexity of minimally inconsistent description logic reasoning is shown to be on the same level as (classical) description logic reasoning. PMID:28750030
Quantum Weak Values and Logic: An Uneasy Couple
NASA Astrophysics Data System (ADS)
Svensson, Bengt E. Y.
2017-03-01
Quantum mechanical weak values of projection operators have been used to answer which-way questions, e.g. to trace which arms in a multiple Mach-Zehnder setup a particle may have traversed from a given initial to a prescribed final state. I show that this procedure might lead to logical inconsistencies in the sense that different methods used to answer composite questions, like "Has the particle traversed the way X or the way Y?", may result in different answers depending on which methods are used to find the answer. I illustrate the problem by considering some examples: the "quantum pigeonhole" framework of Aharonov et al., the three-box problem, and Hardy's paradox. To prepare the ground for my main conclusion on the incompatibility in certain cases of weak values and logic, I study the corresponding situation for strong/projective measurements. In this case, no logical inconsistencies occur provided one is always careful in specifying exactly to which ensemble or sample space one refers. My results cast doubts on the utility of quantum weak values in treating cases like the examples mentioned.
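For reference, the standard definition of a weak value (textbook material, not specific to this paper): for a system preselected in state $|\psi\rangle$ and postselected in state $|\phi\rangle$, the weak value of an observable $\hat{A}$ is

```latex
A_w = \frac{\langle \phi | \hat{A} | \psi \rangle}{\langle \phi | \psi \rangle}
```

When $\hat{A}$ is a projector onto one arm of an interferometer, $A_w$ is read as the particle's degree of presence in that arm; because $A_w$ can be anomalous (negative or even complex), combining such answers with ordinary Boolean logic is exactly where the inconsistencies discussed above can arise.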
Sibthorpe, Beverly; Gardner, Karen; McAullay, Daniel
2016-01-01
A rapidly expanding interest in quality in the Aboriginal-community-controlled health sector has led to widespread uptake of accreditation using more than one set of standards, a proliferation of continuous quality improvement programs and the introduction of key performance indicators. As yet, there has been no overarching logic that shows how they relate to each other, with consequent confusion within and outside the sector. We map the three approaches to the Framework for Performance Assessment in Primary Health Care, demonstrating their key differences and complementarity. There needs to be greater attention in both policy and practice to the purposes and alignment of the three approaches if they are to embed a system-wide focus that supports quality improvement at the service level.
Heliocentric interplanetary low thrust trajectory optimization program, supplement 1, part 2
NASA Technical Reports Server (NTRS)
Mann, F. I.; Horsewood, J. L.
1978-01-01
The improvements made to the HILTOP electric propulsion trajectory computer program are described. A more realistic propulsion system model was implemented, in which the various thrust subsystem efficiencies and the specific impulse are modeled as variable functions of the power available to the propulsion system. The number of operating thrusters is staged, and the beam voltage is selected from a set of five (or fewer) constant voltages based upon the application of variational calculus. The constant beam voltages may be optimized individually or collectively. The propulsion system logic is activated by a single program input key in such a manner as to preserve the HILTOP logic. An analysis describing these features, a complete description of program input quantities, and sample cases of computer output illustrating the program capabilities are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Raedt, Hans; Katsnelson, Mikhail I.; Donker, Hylke C.
It is shown that the Pauli equation and the concept of spin naturally emerge from logical inference applied to experiments on a charged particle under the conditions that (i) space is homogeneous, (ii) the observed events are logically independent, and (iii) the observed frequency distributions are robust with respect to small changes in the conditions under which the experiment is carried out. The derivation does not take recourse to concepts of quantum theory and is based on the same principles which have already been shown to lead to e.g. the Schrödinger equation and the probability distributions of pairs of particles in the singlet or triplet state. Application to Stern-Gerlach experiments with chargeless, magnetic particles provides additional support for the thesis that quantum theory follows from logical inference applied to a well-defined class of experiments. - Highlights: • The Pauli equation is obtained through logical inference applied to robust experiments on a charged particle. • The concept of spin appears as an inference resulting from the treatment of two-valued data. • The same reasoning yields the quantum theoretical description of neutral magnetic particles. • Logical inference provides a framework to establish a bridge between objective knowledge gathered through experiments and their description in terms of concepts.
NASA Astrophysics Data System (ADS)
Moussa, Jonathan; Ryan-Anderson, Ciaran
The canonical modern plan for universal quantum computation is a Clifford+T gate set implemented in a topological error-correcting code. This plan has the basic disparity that logical Clifford gates are natural for codes in two spatial dimensions while logical T gates are natural in three. Recent progress has reduced this disparity by proposing logical T gates in two dimensions with doubled, stacked, or gauge color codes, but these proposals lack an error threshold. An alternative universal gate set is Clifford+F, where a fusion (F) gate converts two logical qubits into a logical qudit. We show that logical F gates can be constructed by identifying compatible pairs of qubit and qudit codes that stabilize the same logical subspace, much like the original Bravyi-Kitaev construction of magic state distillation. The simplest example of high-distance compatible codes results in a proposal that is very similar to the stacked color code with the key improvement of retaining an error threshold. Sandia National Labs is a multi-program laboratory managed and operated by Sandia Corp, a wholly owned subsidiary of Lockheed Martin Corp, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
Quantum probabilistic logic programming
NASA Astrophysics Data System (ADS)
Balu, Radhakrishnan
2015-05-01
We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite-valued ones, to express probability distributions and statistical correlations, a powerful feature to capture relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM, with classical probability measures defined on the Herbrand base, and extending it to the quantum context. In the classical case, H-interpretations form the sample space and probability measures defined on them lead to a consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation for their probabilities. We discuss several examples to combine statistical ensembles and predicates of first order logic to reason with situations involving uncertainty.
Comprehensive Fault Tolerance and Science-Optimal Attitude Planning for Spacecraft Applications
NASA Astrophysics Data System (ADS)
Nasir, Ali
Spacecraft operate in a harsh environment, are costly to launch, and experience unavoidable communication delay and bandwidth constraints. These factors motivate the need for effective onboard mission and fault management. This dissertation presents an integrated framework to optimize science goal achievement while identifying and managing encountered faults. Goal-related tasks are defined by pointing the spacecraft instrumentation toward distant targets of scientific interest. The relative value of science data collection is traded against the risk of failures to determine an optimal policy for mission execution. Our major innovation in fault detection and reconfiguration is to incorporate fault information obtained from two types of spacecraft models: one based on the dynamics of the spacecraft and the second based on the internal composition of the spacecraft. For fault reconfiguration, we consider possible changes in both the dynamics-based control law configuration and the composition-based switching configuration. We formulate our problem as a stochastic sequential decision problem or Markov Decision Process (MDP). To avoid the computational complexity involved in a fully-integrated MDP, we decompose our problem into multiple MDPs. These MDPs include planning MDPs for different fault scenarios; a fault detection MDP based on a logic-based model of spacecraft component and system functionality; an MDP for resolving conflicts between fault information from the logic-based model and the dynamics-based spacecraft models; and the reconfiguration MDP that generates a policy optimized over the relative importance of the mission objectives versus spacecraft safety. Approximate Dynamic Programming (ADP) methods for the decomposition of the planning and fault detection MDPs are applied. To show the performance of the MDP-based frameworks and ADP methods, a suite of spacecraft attitude planning case studies are described. These case studies are used to analyze the content and behavior of computed policies in response to changes in design parameters. A primary case study is built from the Far Ultraviolet Spectroscopic Explorer (FUSE) mission, for which component models and their probabilities of failure are based on realistic mission data. A comparison of our approach with an alternative framework for spacecraft task planning and fault management is presented in the context of the FUSE mission.
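For readers unfamiliar with MDP policy computation, here is a generic value-iteration sketch for a toy two-state, two-action MDP; the transition probabilities and rewards are invented for illustration, and the dissertation's spacecraft MDPs are far larger and solved with ADP approximations rather than exact iteration.

```python
import numpy as np

# P[a][s][t]: probability of moving from state s to state t under action a.
P = np.array([[[0.9, 0.1], [0.2, 0.8]],    # action 0
              [[0.5, 0.5], [0.0, 1.0]]])   # action 1
R = np.array([[1.0, 0.0],                  # R[s][a]: reward for taking a in s
              [2.0, -1.0]])
gamma, V = 0.95, np.zeros(2)

for _ in range(500):                       # Bellman backups until convergence
    Q = R + gamma * np.einsum('ast,t->sa', P, V)   # Q[s, a]
    V = Q.max(axis=1)

policy = Q.argmax(axis=1)                  # greedy policy per state
print(V, policy)
```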
Keller, Adrienne; Bauerle, Jennifer A
2009-01-01
Logic models are a ubiquitous tool for specifying the tactics--including implementation and evaluation--of interventions in the public health, health and social behaviors arenas. Similarly, social norms interventions are a common strategy, particularly in college settings, to address hazardous drinking and other dangerous or asocial behaviors. This paper illustrates an extension of logic models to include strategic as well as tactical components, using a specific example developed for social norms interventions. Placing the evaluation of projects within the context of this kind of logic model addresses issues related to the lack of a research design to evaluate effectiveness.
Logic Design Pathology and Space Flight Electronics
NASA Technical Reports Server (NTRS)
Katz, Richard; Barto, Rod L.; Erickson, K.
1997-01-01
Logic design errors have been observed in space flight missions and in the final stages of ground test. The technologies used by designers and their design/analysis methodologies will be analyzed, giving insight into the root causes of the failures. These technologies include discrete integrated-circuit-based systems, systems based on field- and mask-programmable logic, and the use of computer-aided engineering (CAE) systems. State-of-the-art (SOTA) design tools and methodologies will be analyzed with respect to high-reliability spacecraft design, and potential pitfalls are discussed. Case studies of faults from large expensive programs to "smaller, faster, cheaper" missions will be used to explore the fundamental reasons for logic design problems.
Topological Properties of Some Integrated Circuits for Very Large Scale Integration Chip Designs
NASA Astrophysics Data System (ADS)
Swanson, S.; Lanzerotti, M.; Vernizzi, G.; Kujawski, J.; Weatherwax, A.
2015-03-01
This talk presents topological properties of integrated circuits for Very Large Scale Integration chip designs. These circuits can be implemented in very large scale integrated circuits, such as those in high performance microprocessors. Prior work considered basic combinational logic functions and produced a mathematical framework based on algebraic topology for integrated circuits composed of logic gates. Prior work also produced an historically-equivalent interpretation of Mr. E. F. Rent's work for today's complex circuitry in modern high performance microprocessors, where a heuristic linear relationship was observed between the number of connections and number of logic gates. This talk will examine topological properties and connectivity of more complex functionally-equivalent integrated circuits. The views expressed in this article are those of the author and do not reflect the official policy or position of the United States Air Force, Department of Defense or the U.S. Government.
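For context (a standard statement of the rule, not taken from this talk): Rent's empirical observation relates a block's external connection count $T$ to its gate count $g$ by a power law, which appears linear on log-log axes,

```latex
T = t \, g^{p}
```

where $t$ is the average number of terminals per gate and $0 < p \le 1$ is the Rent exponent characterizing the design's interconnect complexity.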
Language, procedures, and the non-perceptual origin of number word meanings.
Barner, David
2017-05-01
Perceptual representations of objects and approximate magnitudes are often invoked as building blocks that children combine to acquire the positive integers. Systems of numerical perception are either assumed to contain the logical foundations of arithmetic innately, or to supply the basis for their induction. I propose an alternative to this framework, and argue that the integers are not learned from perceptual systems, but arise to explain perception. Using cross-linguistic and developmental data, I show that small (~1-4) and large (~5+) numbers arise both historically and in individual children via distinct mechanisms, constituting independent learning problems, neither of which begins with perceptual building blocks. Children first learn small numbers using the same logic that supports other linguistic number marking (e.g. singular/plural). Years later, they infer the logic of counting from the relations between large number words and their roles in blind counting procedures, only incidentally associating number words with approximate magnitudes.
Assessment of Seismic Damage on The Exist Buildings Using Fuzzy Logic
NASA Astrophysics Data System (ADS)
Pınar, USTA; Nihat, MOROVA; EVCİ, Ahmet; ERGÜN, Serap
2018-01-01
Earthquakes, as natural disasters, can harm the lives of many people and damage buildings all over the world. The seismic vulnerability of buildings therefore needs to be evaluated. Accurate evaluation of damage sustained by buildings during natural disaster events is critical to determine the buildings' safety and their suitability for future occupancy. The earthquake is one of the disasters that structures face the most; therefore, there is a need to evaluate the seismic damage and vulnerability of buildings in order to protect them. These days, fuzzy systems are widely used in different fields of science because of their simplicity and efficiency. Fuzzy logic provides a suitable framework for reasoning, deduction, and decision making under fuzzy conditions. In this paper, studies in the literature on earthquake hazard evaluation of buildings using fuzzy logic modeling concepts have been investigated and evaluated as a whole.
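As a generic illustration of the kind of fuzzy reasoning such studies use (the membership functions and the single rule below are assumptions for illustration, not those of any particular paper), a minimal Mamdani-style sketch in Python:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def damage_risk(drift_pct, age_yr):
    # Assumed fuzzy sets over inter-story drift (%) and building age (years).
    drift_high = tri(drift_pct, 0.5, 1.5, 2.5)
    age_old = tri(age_yr, 20, 60, 100)
    # Rule: IF drift is high AND building is old THEN risk is high (min models AND).
    return min(drift_high, age_old)

print(damage_risk(1.2, 50))  # degree of membership in "high risk": 0.7
```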
Active matter logic for autonomous microfluidics
NASA Astrophysics Data System (ADS)
Woodhouse, Francis G.; Dunkel, Jörn
2017-04-01
Chemically or optically powered active matter plays an increasingly important role in materials design, but its computational potential has yet to be explored systematically. The competition between energy consumption and dissipation imposes stringent physical constraints on the information transport in active flow networks, facilitating global optimization strategies that are not well understood. Here, we combine insights from recent microbial experiments with concepts from lattice-field theory and non-equilibrium statistical mechanics to introduce a generic theoretical framework for active matter logic. Highlighting conceptual differences with classical and quantum computation, we demonstrate how the inherent non-locality of incompressible active flow networks can be utilized to construct universal logical operations, Fredkin gates and memory storage in set-reset latches through the synchronized self-organization of many individual network components. Our work lays the conceptual foundation for developing autonomous microfluidic transport devices driven by bacterial fluids, active liquid crystals or chemically engineered motile colloids.
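The Fredkin gate mentioned here is a controlled swap, universal for reversible logic; a minimal Python model of the gate's truth function only (not of the flow-network implementation):

```python
def fredkin(c, a, b):
    """Controlled swap: if the control bit c is 1, swap a and b."""
    return (c, b, a) if c else (c, a, b)

# Conservative logic: the number of 1s is preserved, fitting an incompressible flow.
print(fredkin(1, 1, 0))  # (1, 0, 1)
print(fredkin(0, 1, 0))  # (0, 1, 0)
```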
Competing Logics and Healthcare
Saks, Mike
2018-01-01
This paper offers a short commentary on the editorial by Mannion and Exworthy. The paper highlights the positive insights offered by their analysis into the tensions between the competing institutional logics of standardization and customization in healthcare, in part manifested in the conflict between managers and professionals, and endorses the plea of the authors for further research in this field. However, the editorial is criticized for its lack of a strong societal reference point, the comparative absence of focus on hybridization, and its failure to highlight structural factors impinging on the opposing logics in a broader neo-institutional framework. With reference to the Procrustean metaphor, it is argued that greater stress should be placed on the healthcare user in future health policy. Finally, the case of complementary and alternative medicine is set out which – while not explicitly mentioned in the editorial – most effectively concretizes the tensions at the heart of this analysis of healthcare. PMID:29626406
Community science, philosophy of science, and the practice of research.
Tebes, Jacob Kraemer
2005-06-01
Embedded in community science are implicit theories on the nature of reality (ontology), the justification of knowledge claims (epistemology), and how knowledge is constructed (methodology). These implicit theories influence the conceptualization and practice of research, and open up or constrain its possibilities. The purpose of this paper is to make some of these theories explicit, trace their intellectual history, and propose a shift in the way research in the social and behavioral sciences, and community science in particular, is conceptualized and practiced. After describing the influence and decline of logical empiricism, the underlying philosophical framework for science for the past century, I summarize contemporary views in the philosophy of science that are alternatives to logical empiricism. These include contextualism, normative naturalism, and scientific realism, and propose that a modified version of contextualism, known as perspectivism, affords the philosophical framework for an emerging community science. I then discuss the implications of perspectivism for community science in the form of four propositions to guide the practice of research.
A Hardware-Accelerated Quantum Monte Carlo framework (HAQMC) for N-body systems
NASA Astrophysics Data System (ADS)
Gothandaraman, Akila; Peterson, Gregory D.; Warren, G. Lee; Hinde, Robert J.; Harrison, Robert J.
2009-12-01
Interest in the study of structural and energetic properties of highly quantum clusters, such as inert gas clusters, has motivated the development of a hardware-accelerated framework for Quantum Monte Carlo simulations. In the Quantum Monte Carlo method, the properties of a system of atoms, such as the ground-state energies, are averaged over a number of iterations. Our framework is aimed at accelerating the computations in each iteration of the QMC application by offloading the calculation of properties, namely energy and trial wave function, onto reconfigurable hardware. This gives a user the capability to run simulations for a large number of iterations, thereby reducing the statistical uncertainty in the properties, and for larger clusters. This framework is designed to run on the Cray XD1 high performance reconfigurable computing platform, which exploits the coarse-grained parallelism of the processor along with the fine-grained parallelism of the reconfigurable computing devices available in the form of field-programmable gate arrays. In this paper, we illustrate the functioning of the framework, which can be used to calculate the energies for a model cluster of helium atoms. In addition, we present the capabilities of the framework that allow the user to vary the chemical identities of the simulated atoms. Program summary: Program title: Hardware Accelerated Quantum Monte Carlo (HAQMC). Catalogue identifier: AEEP_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEP_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 691 537. No. of bytes in distributed program, including test data, etc.: 5 031 226. Distribution format: tar.gz. Programming language: C/C++ for the QMC application, VHDL and Xilinx 8.1 ISE/EDK tools for FPGA design and development. Computer: Cray XD1 consisting of a dual-core, dual-processor AMD Opteron 2.2 GHz with a Xilinx Virtex-4 (V4LX160) or Xilinx Virtex-II Pro (XC2VP50) FPGA per node; we use the compute node with the Xilinx Virtex-4 FPGA. Operating system: Red Hat Enterprise Linux OS. Has the code been vectorised or parallelized?: Yes. Classification: 6.1. Nature of problem: Quantum Monte Carlo is a practical method to solve the Schrödinger equation for large many-body systems and obtain the ground-state properties of such systems. This method involves the sampling of a number of configurations of atoms and averaging the properties of the configurations over a number of iterations. We are interested in applying the QMC method to obtain the energy and other properties of highly quantum clusters, such as inert gas clusters. Solution method: The proposed framework provides a combined hardware-software approach, in which the QMC simulation is performed on the host processor, with the computationally intensive functions such as energy and trial wave function computations mapped onto the field-programmable gate array (FPGA) logic device attached as a co-processor to the host processor. We perform the QMC simulation for a number of iterations, as in the case of our original software QMC approach, to reduce the statistical uncertainty of the results.
However, our proposed HAQMC framework accelerates each iteration of the simulation by significantly reducing the time taken to calculate the ground-state properties of the configurations of atoms, thereby accelerating the overall QMC simulation. We provide a generic interpolation framework that can be extended to study a variety of pure and doped atomic clusters, irrespective of the chemical identities of the atoms. For the FPGA implementation of the properties, we use a two-region approach for accurately computing the properties over the entire domain, and employ deep pipelines and fixed-point arithmetic for all our calculations, guaranteeing the accuracy required for our simulation.
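To make the sample-and-average loop concrete, here is a minimal variational Monte Carlo sketch for a single particle in a 1D harmonic well (units with ħ = m = ω = 1) and a Gaussian trial wave function. This is a generic textbook illustration under those assumptions, not the HAQMC code or its helium cluster model.

```python
import math, random

def local_energy(x, alpha):
    # E_L = (1/psi) H psi for psi(x) = exp(-alpha x^2 / 2), H = -d2/dx2/2 + x^2/2.
    return alpha / 2.0 + 0.5 * x * x * (1.0 - alpha * alpha)

def vmc_energy(alpha, steps=100_000, delta=1.0):
    x, acc = 0.0, 0.0
    for _ in range(steps):
        x_new = x + random.uniform(-delta, delta)
        # Metropolis acceptance test on |psi|^2 = exp(-alpha x^2).
        if random.random() < math.exp(-alpha * (x_new**2 - x**2)):
            x = x_new
        acc += local_energy(x, alpha)
    return acc / steps

print(vmc_energy(1.0))  # exact trial function: E_L is constant, so this prints ~0.5
```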
A DNAzyme-mediated logic gate for programming molecular capture and release on DNA origami.
Li, Feiran; Chen, Haorong; Pan, Jing; Cha, Tae-Gon; Medintz, Igor L; Choi, Jong Hyun
2016-06-28
Here we design a DNA origami-based site-specific molecular capture and release platform operated by a DNAzyme-mediated logic gate process. We show the programmability and versatility of this platform with small molecules, proteins, and nanoparticles, which may also be controlled by external light signals.
Stein, Karen
2016-01-01
This commentary discusses the need to evaluate the impact of World Elder Abuse Awareness Day activities, the elder abuse field's most sustained public awareness initiative. A logic model is proposed with measures for short-term, medium-term, and long-term outcomes for community-based programs.
Sign-And-Magnitude Up/Down Counter
NASA Technical Reports Server (NTRS)
Cole, Steven W.
1991-01-01
The magnitude-and-sign counter includes a conventional up/down counter for the magnitude part and special additional circuitry for the sign part, so negative numbers are indicated more directly. The counter can be implemented by programming an erasable programmable logic device (EPLD) or a programmable logic array (PLA). It is used in place of a conventional up/down counter to provide sign and magnitude values directly to other circuits.
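A behavioral sketch of sign-and-magnitude counting in Python, illustrating the encoding only; the actual design is PLA/EPLD logic, and the zero-crossing handling below is an assumed reading of the scheme:

```python
class SignMagnitudeCounter:
    """Up/down counter that tracks sign and magnitude as separate fields."""
    def __init__(self):
        self.sign = 0        # 0 = non-negative, 1 = negative
        self.mag = 0

    def up(self):
        if self.sign and self.mag:   # negative value: counting up moves toward zero
            self.mag -= 1
            if self.mag == 0:
                self.sign = 0
        else:
            self.mag += 1

    def down(self):
        if self.sign == 0 and self.mag == 0:
            self.sign, self.mag = 1, 1   # crossing zero: flip sign
        elif self.sign:
            self.mag += 1
        else:
            self.mag -= 1

c = SignMagnitudeCounter()
c.down(); c.down(); c.up()           # -2, then +1 -> -1
print(c.sign, c.mag)                 # 1 1  (i.e., -1)
```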
Teaching Machines to Think Fuzzy
ERIC Educational Resources Information Center
Technology Teacher, 2004
2004-01-01
Fuzzy logic programs make computers more human: they can think through messy situations and make smart decisions, and they can control things the way people do. Fuzzy logic has been used to control subway trains, elevators, washing machines, microwave ovens, and cars. Pretty much all the human has to do is push one…
Obesity services planning framework for interprofessional primary care organizations.
Brauer, Paula; Royall, Dawna; Dwyer, John; Edwards, A Michelle; Hussey, Tracy; Kates, Nick; Smith, Heidi; Kirkconnell, Ross
2017-03-01
Aim: We report on a formative project to develop an organization-level planning framework for obesity prevention and management services. It is common when developing new services to first develop a logic model outlining expected outcomes and key processes. This can be onerous for single primary care organizations, especially for complex conditions like obesity. The initial draft was developed by the research team, based on results from provider and patient focus groups in one large Family Health Team (FHT) in Ontario. This draft was reviewed and activities prioritized by 20 FHTs using a moderated electronic consensus process. A national panel then reviewed the draft. Findings: Providers identified five main target groups: pregnancy to 2 years, 3-12 years, 13-18 years, 18+ years at health risk, and 18+ years with complex care needs. Desired outcomes were identified and activities were prioritized under the categories: raising awareness (eg, providing information and resources on weight-health), identification and initial management (eg, wellness care), follow-up management (eg, group programs), expanded services (eg, availability of team services), and practice initiatives (eg, interprofessional education). Overall, there was strong support for raising awareness by providing information on the weight-health connection and on community services. There was also strong support for growth assessment in pediatric care. In adults, there was strong support for wellness care/health check visits and episodic care to identify people for interventions, for group programs, and for additional provider education. Joint development by different teams proved useful for consensus on outcomes and for ensuring relevancy across practices. While priorities will vary depending on local context, the basic descriptions of care processes were endorsed by reviewers. Key next steps are to trial the use of the framework and to conduct further implementation studies to find optimally effective approaches for obesity prevention and management across the lifespan.
Programmable bioelectronics in a stimuli-encoded 3D graphene interface
NASA Astrophysics Data System (ADS)
Parlak, Onur; Beyazit, Selim; Tse-Sum-Bui, Bernadette; Haupt, Karsten; Turner, Anthony P. F.; Tiwari, Ashutosh
2016-05-01
The ability to program and mimic the dynamic microenvironment of living organisms is a crucial step towards the engineering of advanced bioelectronics. Here, we report for the first time a design for programmable bioelectronics, with 'built-in' switchable and tunable bio-catalytic performance that responds simultaneously to appropriate stimuli. The designed bio-electrodes comprise light- and temperature-responsive compartments, which allow the building of Boolean logic gates (i.e. "OR" and "AND") based on enzymatic communications to deliver logic operations. Electronic supplementary information (ESI) available. See DOI: 10.1039/c6nr02355j
Dual Logic and Cerebral Coordinates for Reciprocal Interaction in Eye Contact
Lee, Ray F.
2015-01-01
In order to scientifically study the human brain's response to face-to-face social interaction, the scientific method itself needs to be reconsidered so that both quantitative observation and symbolic reasoning can be adapted to the situation where the observer is also observed. In light of the recent development of dyadic fMRI, which can directly observe two interacting brains in one MRI scanner, this paper aims to establish a new form of logic, dual logic, which provides a theoretical platform for deductive reasoning in a complementary dual system with an emergence mechanism. Applying dual logic in the dfMRI experimental design and data analysis, the exogenous and endogenous dual systems in the BOLD responses can be identified; the non-reciprocal responses in the dual system can be suppressed; and a cerebral coordinate for reciprocal interaction can be generated. Elucidated by dual logic deductions, the cerebral coordinate for reciprocal interaction suggests that the exogenous and endogenous systems consist of the empathy network and the mentalization network, respectively; that the default-mode network emerges from the resting state to activation in the endogenous system during reciprocal interaction; and that the cingulate plays an essential role in the emergence from the exogenous system to the endogenous system. Overall, the dual logic deductions are supported by the dfMRI experimental results and are consistent with the current literature. Both the theoretical framework and the experimental method set the stage to formally apply the scientific method in studying complex social interaction. PMID:25885446
Winnicott and Derrida: development of logic-of-play.
Bitan, Shachaf
2012-02-01
In this essay I develop the logic of play from the writings of the British psychoanalyst Donald W. Winnicott and the French philosopher Jacques Derrida. The logic of play serves as both a conceptual framework for theoretical clinical thinking and a space of experiencing in which the therapeutic situation is located and to which it aspires. I argue that both Winnicott and Derrida proposed a playful turn in Western thinking by their attitude towards oppositions, viewing them not as complementary or contradictory, but as 'peacefully-coexisting'. Derrida criticizes the dichotomous structure of Western thought, proposing playful movement as an alternative that does not constitute itself as a mastering construction. I will show that Winnicott, too, proposes playful logic through which he thinks and acts in the therapeutic situation. The therapeutic encounter is understood as a playful space in which analyst and analysand continuously coexist, instead of facing each other as exclusionary oppositions. I therefore propose the logic of play as the basis for the therapeutic encounter. The playful turn, then, is crucial for the thought and praxis expressed by the concept of two-person psychology. I suggest the term playful psychoanalysis to characterize the present perspective of psychoanalysis in the light of the playful turn. I will first present Derrida's playful thought, go on to Winnicott's playful revolutionism, and conclude with an analysis of Winnicott's clinical material in the light of the logic of play. Copyright © 2012 Institute of Psychoanalysis.
Grossi, Enzo
2005-09-27
The concept of risk has pervaded medical literature in recent decades and has become a familiar topic, and the concept of probability, linked to a binary logic approach, is commonly applied in epidemiology and clinical medicine. The application of probability theory to groups of individuals is quite straightforward but can pose communication challenges at the individual level. Few articles, however, have tried to focus on the concept of "risk" at the individual subject level rather than at the population level. The author reviews the conceptual framework which led to the use of probability theory in the medical field at a time when the principal causes of death were acute diseases, often of infective origin. In the present scenario, in which chronic degenerative diseases dominate and there are smooth transitions between health and disease, the use of fuzzy logic rather than binary logic would be more appropriate. The use of fuzzy logic, in which more than two possible truth-value assignments are allowed, overcomes the trap of probability theory when dealing with uncertain outcomes, thereby making the meaning of a given prognostic statement easier to understand by the patient. At the individual subject level, recourse to the term plausibility, related to fuzzy logic, would help the physician communicate with the patient more efficiently than the term probability, related to binary logic. This would represent an evident advantage for the transfer of medical evidence to individual subjects.
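A toy numeric example of the contrast drawn here (the membership bounds are assumptions for illustration): against a crisp 126 mg/dL fasting-glucose cutoff, binary logic classifies a reading of 125 as simply "not diabetic", while a fuzzy membership function grades the degree to which the value belongs to the "diabetic" set.

```python
def binary_diabetic(glucose):
    return glucose >= 126                 # crisp cutoff: 125 -> False

def fuzzy_diabetic(glucose, low=100.0, high=140.0):
    # Degree of membership rises linearly between the assumed bounds.
    return min(1.0, max(0.0, (glucose - low) / (high - low)))

print(binary_diabetic(125), fuzzy_diabetic(125))  # False 0.625
```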
Abstract quantum computing machines and quantum computational logics
NASA Astrophysics Data System (ADS)
Chiara, Maria Luisa Dalla; Giuntini, Roberto; Sergioli, Giuseppe; Leporini, Roberto
2016-06-01
Classical and quantum parallelism are deeply different, although it is sometimes claimed that quantum Turing machines are nothing but special examples of classical probabilistic machines. We introduce the concepts of deterministic state machine, classical probabilistic state machine and quantum state machine. On this basis, we discuss the question: To what extent can quantum state machines be simulated by classical probabilistic state machines? Each state machine is devoted to a single task determined by its program. Real computers, however, behave differently, being able to solve different kinds of problems. This capacity can be modeled, in the quantum case, by the mathematical notion of abstract quantum computing machine, whose different programs determine different quantum state machines. The computations of abstract quantum computing machines can be linguistically described by the formulas of a particular form of quantum logic, termed quantum computational logic.
DNA-programmed dynamic assembly of quantum dots for molecular computation.
He, Xuewen; Li, Zhi; Chen, Muzi; Ma, Nan
2014-12-22
Despite the widespread use of quantum dots (QDs) for biosensing and bioimaging, QD-based bio-interfaceable and reconfigurable molecular computing systems have not yet been realized. DNA-programmed dynamic assembly of multi-color QDs is presented for the construction of a new class of fluorescence resonance energy transfer (FRET)-based QD computing systems. A complete set of seven elementary logic gates (OR, AND, NOR, NAND, INH, XOR, XNOR) are realized using a series of binary and ternary QD complexes operated by strand displacement reactions. The integration of different logic gates into a half-adder circuit for molecular computation is also demonstrated. This strategy is quite versatile and straightforward for logical operations and would pave the way for QD-biocomputing-based intelligent molecular diagnostics. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
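Purely as a software illustration of the Boolean behavior the QD complexes realize chemically, a half-adder combines an XOR gate (sum bit) with an AND gate (carry bit):

```python
# Illustrative only: the paper's gates are FRET-based QD complexes; the
# same half-adder truth table is reproduced here in software.

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Sum bit from XOR, carry bit from AND."""
    return a ^ b, a & b

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"inputs {a},{b} -> sum {s}, carry {c}")
```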
Formal Compiler Implementation in a Logical Framework
2003-04-29
variable set [], we omit the brackets and use the simpler notation v. MetaPRL is a tactic-based prover that uses OCaml [20] as its meta-language. When a ... rewrite is defined in MetaPRL, the framework creates an OCaml expression that can be used to apply the rewrite. Code to guide the application of ... rewrites is written in OCaml, using a rich set of primitives provided by MetaPRL. MetaPRL automates the construction of most guidance code; we describe ...
Knowledge discovery from structured mammography reports using inductive logic programming.
Burnside, Elizabeth S; Davis, Jesse; Costa, Victor Santos; Dutra, Inês de Castro; Kahn, Charles E; Fine, Jason; Page, David
2005-01-01
The development of large mammography databases provides an opportunity for knowledge discovery and data mining techniques to recognize patterns not previously appreciated. Using a database from a breast imaging practice containing patient risk factors, imaging findings, and biopsy results, we tested whether inductive logic programming (ILP) could discover interesting hypotheses that could subsequently be tested and validated. The ILP algorithm discovered two hypotheses from the data that were 1) judged as interesting by a subspecialty trained mammographer and 2) validated by analysis of the data itself.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haney, Thomas Jay
This document describes the process used to develop data quality objectives for the Idaho National Laboratory (INL) Environmental Soil Monitoring Program in accordance with U.S. Environmental Protection Agency guidance. This document also develops and presents the logic that was used to determine the specific number of soil monitoring locations at the INL Site, at locations bordering the INL Site, and at locations in the surrounding regional area. The monitoring location logic follows the guidance from the U.S. Department of Energy for environmental surveillance of its facilities.
Development of multiple user AMTRAN on the Datacraft DC6024
NASA Technical Reports Server (NTRS)
Austin, S. L.
1973-01-01
The implementation of a multiple user version of AMTRAN on the Datacraft DC6024 computer is reported. The major portion of the multiple user logic is incorporated in the main program, which remains in core during all AMTRAN processes. A detailed flowchart of the main program is provided as documentation of the multiple user capability. Activities are directed toward perfecting its capability, providing new features in response to user needs and requests, providing a two-dimensional array AMTRAN containing multiple user logic, and providing documentation as the tasks progress.
NASA Lewis F100 engine testing
NASA Technical Reports Server (NTRS)
Werner, R. A.; Willoh, R. G., Jr.; Abdelwahab, M.
1984-01-01
Two builds of an F100 engine model derivative (EMD) engine were evaluated for improvements in engine components and digital electronic engine control (DEEC) logic. Two DEEC flight logics were verified throughout the flight envelope in support of flight clearance for the F100 engine model derivative program (EMDP). A nozzle instability and a faster augmentor transient capability were investigated in support of the F-15 DEEC flight program. Off-schedule coupled system mode fan flutter, DEEC nose-boom pressure correlation, DEEC station six pressure comparison, and a new fan inlet variable vane (CIVV) schedule are identified.
DOE Office of Scientific and Technical Information (OSTI.GOV)
A. Alfonsi; C. Rabiti; D. Mandelli
The Reactor Analysis and Virtual control ENvironment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed thermal-hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that supports several functionalities: (1) deriving and actuating the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring/controlling in the phase space; (2) performing both Monte Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis; and (3) facilitating input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.
Cross-Cultural Counseling and Cross-Cultural Meanings: An Exploration of Morita Psychotherapy.
ERIC Educational Resources Information Center
Aldous, Jane L.
1994-01-01
Describes theoretical framework and techniques of Morita psychotherapy. Western research indicates that Asian American clients prefer active-directive, logical, rational, and structured approaches. Suggests that ethnocentric counseling approaches may be imposed upon clients of Asian origin because meanings attached to terms describing counseling…
Second Language Acquisition and Universal Grammar.
ERIC Educational Resources Information Center
White, Lydia
1990-01-01
Discusses the motivation for Universal Grammar (UG), as assumed in the principles and parameters framework of generative grammar (Chomsky, 1981), focusing on the logical problem of first-language acquisition and the potential role of UG in second-language acquisition. Recent experimental research regarding the second-language status of the…
Evaluation and Strategic Planning for the GLOBE Program
NASA Astrophysics Data System (ADS)
Geary, E. E.; Williams, V. L.
2010-12-01
The Global Learning and Observations to Benefit the Environment (GLOBE) Program is an international environmental education program. It unites educators, students and scientists worldwide to collaborate on inquiry-based investigations of the environment and Earth system science. Evaluation of the GLOBE program has been challenging because of its broad reach, diffuse models of implementation, and multiple stakeholders. To guide current evaluation efforts, a logic model was developed that provides a visual display of how the GLOBE program operates. Using the standard elements of inputs, activities, outputs, customers and outcomes, this model describes how the program operates to achieve its goals. The template used to develop this particular logic model aligns the GLOBE program operations with its program strategy, thus ensuring that what the program is doing supports the achievement of long-term, intermediate and annual goals. It also provides a foundation for the development of key programmatic metrics that can be used to gauge progress toward the achievement of strategic goals.
Human Memory Organization for Computer Programs.
ERIC Educational Resources Information Center
Norcio, A. F.; Kerst, Stephen M.
1983-01-01
Results of study investigating human memory organization in processing of computer programming languages indicate that algorithmic logic segments form a cognitive organizational structure in memory for programs. Statement indentation and internal program documentation did not enhance organizational process of recall of statements in five Fortran…
Index to Computer Assisted Instruction.
ERIC Educational Resources Information Center
Lekan, Helen A., Ed.
The computer assisted instruction (CAI) programs and projects described in this index are listed by subject matter. The index gives the program name, author, source, description, prerequisites, level of instruction, type of student, average completion time, logic and program, purpose for which program was designed, supplementary…
QUARTERLY TECHNICAL PROGRESS REPORT, JULY, AUGUST, SEPTEMBER 1966.
Contents: Circuit research program; Hardware systems research; Software systems research program; Numerical methods, computer arithmetic and ... artificial languages; Library automation; Illiac II service, use, and program development; IBM service, use, and program development; Problem specifications; Switching theory and logical design; General laboratory information.
Munro, Alice; Shakeshaft, Anthony; Clifford, Anton
2017-12-04
Given the well-established evidence of disproportionately high rates of substance-related morbidity and mortality after release from incarceration for Indigenous Australians, access to comprehensive, effective and culturally safe residential rehabilitation treatment will likely assist in reducing recidivism to both prison and substance dependence for this population. In the absence of methodologically rigorous evidence, the delivery of Indigenous drug and alcohol residential rehabilitation services varies widely, and divergent views exist regarding the appropriateness and efficacy of different potential treatment components. One way to increase the methodological quality of evaluations of Indigenous residential rehabilitation services is to develop partnerships with researchers to better align models of care with clients' and the community's needs. An emerging research paradigm that guides the development of high quality evidence through a number of sequential steps equitably involving services, stakeholders and researchers is community-based participatory research (CBPR). The purpose of this study is to articulate an Indigenous drug and alcohol residential rehabilitation service model of care, developed in collaboration between clients, service providers and researchers using a CBPR approach. This research adopted a mixed methods CBPR approach to triangulate collected data to inform the development of a model of care for a remote Indigenous drug and alcohol residential rehabilitation service. Four iterative CBPR steps of research activity were recorded during the 3-year research partnership. As a direct outcome of the CBPR framework, the service and researchers co-designed a Healing Model of Care that comprises six core treatment components and three core organisational components and is articulated in two program logics. The program logics were designed to specifically align each component and outcome with the mechanism of change for the client or organisation, to improve data collection and program evaluation. The description of the CBPR process and the Healing Model of Care offers one possible solution for how to provide better care for the large and growing population of Indigenous people with substance dependence.
Gschwind, Michael K
2013-04-16
Mechanisms for generating and executing programs for a floating point (FP) only single instruction multiple data (SIMD) instruction set architecture (ISA) are provided. A computer program product comprising a computer recordable medium having a computer readable program recorded thereon is provided. The computer readable program, when executed on a computing device, causes the computing device to receive one or more instructions and execute the one or more instructions using logic in an execution unit of the computing device. The logic implements a floating point (FP) only single instruction multiple data (SIMD) instruction set architecture (ISA), based on data stored in a vector register file of the computing device. The vector register file is configured to store both scalar and floating point values as vectors having a plurality of vector elements.
DNAzyme-Based Logic Gate-Mediated DNA Self-Assembly.
Zhang, Cheng; Yang, Jing; Jiang, Shuoxing; Liu, Yan; Yan, Hao
2016-01-13
Controlling DNA self-assembly processes using rationally designed logic gates is a major goal of DNA-based nanotechnology and programming. Such controls could facilitate the hierarchical engineering of complex nanopatterns responding to various molecular triggers or inputs. Here, we demonstrate the use of a series of DNAzyme-based logic gates to control DNA tile self-assembly onto a prescribed DNA origami frame. Logic systems such as "YES," "OR," "AND," and "logic switch" are implemented based on DNAzyme-mediated tile recognition with the DNA origami frame. DNAzyme is designed to play two roles: (1) as an intermediate messenger to motivate downstream reactions and (2) as a final trigger to report fluorescent signals, enabling information relay between the DNA origami-framed tile assembly and fluorescent signaling. The results of this study demonstrate the plausibility of DNAzyme-mediated hierarchical self-assembly and provide new tools for generating dynamic and responsive self-assembly systems.
Benitez, Cecil M.; Qu, Kun; Sugiyama, Takuya; Pauerstein, Philip T.; Liu, Yinghua; Tsai, Jennifer; Gu, Xueying; Ghodasara, Amar; Arda, H. Efsun; Zhang, Jiajing; Dekker, Joseph D.; Tucker, Haley O.; Chang, Howard Y.; Kim, Seung K.
2014-01-01
The regulatory logic underlying global transcriptional programs controlling development of visceral organs like the pancreas remains undiscovered. Here, we profiled gene expression in 12 purified populations of fetal and adult pancreatic epithelial cells representing crucial progenitor cell subsets, and their endocrine or exocrine progeny. Using probabilistic models to decode the general programs organizing gene expression, we identified co-expressed gene sets in cell subsets that revealed patterns and processes governing progenitor cell development, lineage specification, and endocrine cell maturation. Purification of Neurog3 mutant cells and module network analysis linked established regulators such as Neurog3 to unrecognized gene targets and roles in pancreas development. Iterative module network analysis nominated and prioritized transcriptional regulators, including diabetes risk genes. Functional validation of a subset of candidate regulators with corresponding mutant mice revealed that the transcription factors Etv1, Prdm16, Runx1t1 and Bcl11a are essential for pancreas development. Our integrated approach provides a unique framework for identifying regulatory genes and functional gene sets underlying pancreas development and associated diseases such as diabetes mellitus. PMID:25330008
NASA Astrophysics Data System (ADS)
Clem, Douglas Wayne
Spatial ability refers to an individual's capacity to visualize and mentally manipulate three dimensional objects. Since sonographers manually manipulate 2D and 3D sonographic images to generate multi-viewed, logical, sequential renderings of an anatomical structure, it can be assumed that spatial ability is central to the perception and interpretation of these medical images. Using Ackerman's theory of ability determinants of skilled performance as a conceptual framework, this study explored the relationship of spatial ability and learning sonographic scanning. Beginning first year sonography students from four different educational institutions were administered a spatial abilities test prior to their initial scanning lab coursework. The students' spatial test scores were compared with their scanning competency performance scores. A significant relationship between the students' spatial ability scores and their scanning performance scores was found. This result suggests that the use of spatial ability tests for admission to sonography programs may improve candidate selection, as well as assist programs in adjusting instruction and curriculum for students who demonstrate low spatial ability.
Representation of research hypotheses
2011-01-01
Background Hypotheses are now being automatically produced on an industrial scale by computers in biology, e.g. the annotation of a genome is essentially a large set of hypotheses generated by sequence similarity programs; and robot scientists enable the full automation of a scientific investigation, including the generation and testing of research hypotheses. Results This paper proposes a logically defined way of recording automatically generated hypotheses in a machine-amenable way. The proposed formalism allows the description of complete hypothesis sets as specified input and output for scientific investigations. The formalism supports the decomposition of research hypotheses into more specialised hypotheses if that is required by an application. Hypotheses are represented in an operational way – it is possible to design an experiment to test them. The explicit formal description of research hypotheses promotes the explicit formal description of the results and conclusions of an investigation. The paper also proposes a framework for automated hypothesis generation. We demonstrate how the key components of the proposed framework are implemented in the Robot Scientist "Adam". Conclusions A formal representation of automatically generated research hypotheses can help to improve the way humans produce, record, and validate research hypotheses. Availability http://www.aber.ac.uk/en/cs/research/cb/projects/robotscientist/results/ PMID:21624164
A Scalable Data Integration and Analysis Architecture for Sensor Data of Pediatric Asthma.
Stripelis, Dimitris; Ambite, José Luis; Chiang, Yao-Yi; Eckel, Sandrah P; Habre, Rima
2017-04-01
According to the Centers for Disease Control, in the United States there are 6.8 million children living with asthma. Despite the importance of the disease, the available prognostic tools are not sufficient for biomedical researchers to thoroughly investigate the potential risks of the disease at scale. To overcome these challenges we present a big data integration and analysis infrastructure developed by our Data and Software Coordination and Integration Center (DSCIC) of the NIBIB-funded Pediatric Research using Integrated Sensor Monitoring Systems (PRISMS) program. Our goal is to help biomedical researchers to efficiently predict and prevent asthma attacks. The PRISMS-DSCIC is responsible for collecting, integrating, storing, and analyzing real-time environmental, physiological and behavioral data obtained from heterogeneous sensor and traditional data sources. Our architecture is based on the Apache Kafka, Spark and Hadoop frameworks and PostgreSQL DBMS. A main contribution of this work is extending the Spark framework with a mediation layer, based on logical schema mappings and query rewriting, to facilitate data analysis over a consistent harmonized schema. The system provides both batch and stream analytic capabilities over the massive data generated by wearable and fixed sensors.
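The mediation idea can be sketched in miniature (hypothetical source and column names; the real layer operates inside Spark): a logical schema mapping relates each source's columns to one harmonized schema, and a query written against the harmonized schema is rewritten into each source's local vocabulary.

```python
# Toy sketch of mediation via logical schema mappings and query rewriting.
# Source names, column names, and predicates are hypothetical.

MAPPINGS = {
    "sensor_a": {"pm25": "pm2_5_ugm3", "ts": "timestamp_utc"},
    "sensor_b": {"pm25": "particulate_fine", "ts": "time"},
}

def rewrite(source: str, harmonized_query: dict) -> dict:
    """Rewrite {harmonized_column: predicate} into source-local columns."""
    mapping = MAPPINGS[source]
    return {mapping[col]: pred for col, pred in harmonized_query.items()}

query = {"pm25": "> 35.0", "ts": ">= '2017-01-01'"}
for src in MAPPINGS:
    print(src, rewrite(src, query))
```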
Semantics-enabled service discovery framework in the SIMDAT pharma grid.
Qu, Cangtao; Zimmermann, Falk; Kumpf, Kai; Kamuzinzi, Richard; Ledent, Valérie; Herzog, Robert
2008-03-01
We present the design and implementation of a semantics-enabled service discovery framework in the data Grids for process and product development using numerical simulation and knowledge discovery (SIMDAT) Pharma Grid, an industry-oriented Grid environment for integrating thousands of Grid-enabled biological data services and analysis services. The framework consists of three major components: the Web ontology language (OWL)-description logic (DL)-based biological domain ontology, OWL Web service ontology (OWL-S)-based service annotation, and semantic matchmaker based on the ontology reasoning. Built upon the framework, workflow technologies are extensively exploited in the SIMDAT to assist biologists in (semi)automatically performing in silico experiments. We present a typical usage scenario through the case study of a biological workflow: IXodus.
Evaluating community and campus environmental public health programs.
Pettibone, Kristianna G; Parras, Juan; Croisant, Sharon Petronella; Drew, Christina H
2014-01-01
The National Institute of Environmental Health Sciences' (NIEHS) Partnerships for Environmental Public Health (PEPH) program created the Evaluation Metrics Manual as a tool to help grantees understand how to map out their programs using a logic model, and to identify measures for documenting their achievements in environmental public health research. This article provides an overview of the manual, describing how grantees and community partners contributed to the manual, and how the basic components of a logic model can be used to identify metrics. We illustrate how the approach can be implemented, using a real-world case study from the University of Texas Medical Branch, where researchers worked with community partners to develop a network to address environmental justice issues.
SEE Sensitivity Analysis of 180 nm NAND CMOS Logic Cell for Space Applications
NASA Astrophysics Data System (ADS)
Sajid, Muhammad
2016-07-01
This paper focuses on Single Event Effects caused by energetic particle strikes on sensitive locations in a CMOS NAND logic cell designed in the 180 nm technology node and intended to operate in the space radiation environment. The generation of SE transients as well as upsets as a function of the LET of the incident particle has been determined for logic devices onboard LEO and GEO satellites. The minimum pulse magnitude and pulse width at the threshold LET were determined to estimate the vulnerability/susceptibility of the device to a heavy ion strike. The impact of temperature, strike location and logic state of the NAND circuit on the total SEU/SET rate was estimated with physical mechanism simulations using the Visual TCAD, Genius, runSEU program and Crad computer codes.
Automated radiotherapy treatment plan integrity verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang Deshan; Moore, Kevin L.
2012-03-15
Purpose: In our clinic, physicists spend from 15 to 60 min to verify the physical and dosimetric integrity of radiotherapy plans before presentation to radiation oncology physicians for approval. The purpose of this study was to design and implement a framework to automate as many elements of this quality control (QC) step as possible. Methods: A comprehensive computer application was developed to carry out a majority of these verification tasks in the Philips PINNACLE treatment planning system (TPS). This QC tool functions based on both PINNACLE scripting elements and PERL sub-routines. The core of this technique is the method of dynamic scripting, which involves a PERL programming module that is flexible and powerful for treatment plan data handling. Run-time plan data are collected, saved into temporary files, and analyzed against standard values and predefined logical rules. The results were summarized in a hypertext markup language (HTML) report that is displayed to the user. Results: This tool has been in clinical use for over a year. The occurrence frequency of technical problems, which would cause delays and suboptimal plans, has been reduced since clinical implementation. Conclusions: In addition to drastically reducing the set of human-driven logical comparisons, this QC tool also accomplished some tasks that are otherwise either quite laborious or impractical for humans to verify, e.g., identifying conflicts amongst IMRT optimization objectives.
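A minimal sketch of the rule-checking pattern, not the authors' PINNACLE/PERL implementation: run-time plan data are tested against predefined logical rules and the results are summarized as HTML (the rule names and limits below are hypothetical).

```python
# Hypothetical plan attributes and QC rules; the pattern, not the clinic's
# actual checks: evaluate each rule against run-time plan data, emit HTML.

plan = {"dose_per_fraction_gy": 2.0, "fractions": 35, "machine": "LINAC-1"}

RULES = [
    ("dose per fraction <= 3 Gy", lambda p: p["dose_per_fraction_gy"] <= 3.0),
    ("total dose <= 80 Gy",
     lambda p: p["dose_per_fraction_gy"] * p["fractions"] <= 80.0),
    ("machine is commissioned", lambda p: p["machine"] in {"LINAC-1", "LINAC-2"}),
]

rows = "".join(
    f"<tr><td>{name}</td><td>{'PASS' if check(plan) else 'FAIL'}</td></tr>"
    for name, check in RULES
)
print(f"<table><tr><th>Rule</th><th>Result</th></tr>{rows}</table>")
```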
Constructing and Verifying Program Theory Using Source Documentation
ERIC Educational Resources Information Center
Renger, Ralph
2010-01-01
Making the program theory explicit is an essential first step in Theory Driven Evaluation (TDE). Once explicit, the program logic can be established making necessary links between the program theory, activities, and outcomes. Despite its importance evaluators often encounter situations where the program theory is not explicitly stated. Under such…
Programme Costing - A Logical Step Toward Improved Management.
ERIC Educational Resources Information Center
McDougall, Ronald N.
The analysis of costs of university activities from a functional or program point of view, rather than an organizational unit basis, is not only an imperative for the planning and management of universities, but also a logical method of examing the costs of university operations. A task force of the Committee of Finance Officers-Universities of…
A Project-Based Learning Approach to Programmable Logic Design and Computer Architecture
ERIC Educational Resources Information Center
Kellett, C. M.
2012-01-01
This paper describes a course in programmable logic design and computer architecture as it is taught at the University of Newcastle, Australia. The course is designed around a major design project and has two supplemental assessment tasks that are also described. The context of the Computer Engineering degree program within which the course is…
Teaching Semantic Tableaux Method for Propositional Classical Logic with a CAS
ERIC Educational Resources Information Center
Aguilera-Venegas, Gabriel; Galán-García, José Luis; Galán-García, María Ángeles; Rodríguez-Cielos, Pedro
2015-01-01
Automated theorem proving (ATP) for Propositional Classical Logic is an algorithm to check the validity of a formula. It is a very well-known problem which is decidable but co-NP-complete. There are many algorithms for this problem. In this paper, an educationally oriented implementation of Semantic Tableaux method is described. The program has…
Submicron Systems Architecture Project
1981-11-01
This project is concerned with the architecture, design, and testing of VLSI Systems. The principal activities in this report period include: The Tree Machine; COPE, The Homogeneous Machine; Computational Arrays; Switch-Level Model for MOS Logic Design; Testing; Local Network and Designer Workstations; Self-timed Systems; Characterization of Deadlock Free Resource Contention; Concurrency Algebra; Language Design and Logic for Program Verification.
"Modeling" Youth Work: Logic Models, Neoliberalism, and Community Praxis
ERIC Educational Resources Information Center
Carpenter, Sara
2016-01-01
This paper examines the use of logic models in the development of community initiatives within the AmeriCorps program. AmeriCorps is the civilian national service programme in the U.S., operating as a grants programme to local governments and not-for-profit organisations and providing low-cost labour to address pressing issues of social…
[New horizons in medicine. The application of "fuzzy logic" in clinical and experimental medicine].
Guarini, G
1994-06-01
In medicine, the study of physiological and physiopathological problems is generally programmed by elaborating models which respond to the principles of formal logic. This gives the advantage of favouring the transformation of the formal model into a mathematical model of reference which responds to the principles of set theory. All this is in the utopian wish to obtain, as a result of each research effort, a clear-cut answer, whether positive or negative, according to the Aristotelian principle of tertium non datur. Taking this into consideration, the author briefly traces the principles of modal logic and, in particular, those of fuzzy logic, proposing that the latter replace the current definition of "logic with more truth values" with the perhaps more pertinent "logic of conditioned possibilities". After a brief synthesis of the state of the art on the application of fuzzy logic, the author reports an example of the graphic expression of fuzzy logic by demonstrating how basic glycemic data (expressed by the vectors' magnitude) measured in a sample of healthy individuals constituted, on the whole, an unbroken continuous stream of partial sets. The author calls attention to fuzzy logic as a useful instrument for elaborating in a new way the analysis of scenarios suited to acquiring the information necessary to single out the critical points which characterize the potential development of any biological phenomenon.
Data Processing: Fifteen Suggestions for Computer Training in Your Business Education Classes.
ERIC Educational Resources Information Center
Barr, Lowell L.
1980-01-01
Presents 15 suggestions for training business education students in the use of computers. Suggestions involve computer language, method of presentation, laboratory time, programing assignments, instructions and handouts, problem solving, deadlines, reviews, programming concepts, programming logic, documentation, and defensive programming. (CT)
Program to Optimize Simulated Trajectories (POST). Volume 3: Programmer's manual
NASA Technical Reports Server (NTRS)
Brauer, G. L.; Cornick, D. E.; Habeger, A. R.; Petersen, F. M.; Stevenson, R.
1975-01-01
Information pertinent to the programmer and relating to the program to optimize simulated trajectories (POST) is presented. Topics discussed include: program structure and logic, subroutine listings and flow charts, and internal FORTRAN symbols. The POST core requirements are summarized along with program macrologic.
NASA Technical Reports Server (NTRS)
1973-01-01
A shuttle atmosphere revitalization subsystem (ARS)/active thermal control subsystem (ATCS) performance routine was developed. This computer program is adapted from the Shuttle EC/LSS Design Computer Program. The program was upgraded in three noteworthy areas: (1) The functional ARS/ATCS schematic has been revised to accurately synthesize the shuttle baseline system definition. (2) The program logic has been improved to provide a more accurate prediction of the integrated ARS/ATCS system performance. Additionally, the logic has been expanded to model all components and thermal loads in the ARS/ATCS system. (3) The program is designed to be used on the NASA JSC Crew Systems Division's programmable calculator system. As written, the new computer routine has an average running time of five minutes. The use of desk-top calculation equipment and the rapid response of the program provide NASA with an analytical tool for trade studies to refine the system definition, and for test support of the RSECS or integrated shuttle ARS/ATCS test programs.
S3DB core: a framework for RDF generation and management in bioinformatics infrastructures
2010-01-01
Background Biomedical research is set to greatly benefit from the use of semantic web technologies in the design of computational infrastructure. However, beyond well-defined research initiatives, substantial issues of data heterogeneity, source distribution, and privacy currently stand in the way of the personalization of medicine. Results A computational framework for bioinformatic infrastructure was designed to deal with the heterogeneous data sources and the sensitive mixture of public and private data that characterizes the biomedical domain. This framework consists of a logical model built with semantic web tools, coupled with a Markov process that propagates user operator states. An accompanying open source prototype was developed to meet a series of applications that range from collaborative multi-institution data acquisition efforts to data analysis applications that need to quickly traverse complex data structures. This report describes the two abstractions underlying the S3DB-based infrastructure, logical and numerical, and discusses its generality beyond the immediate confines of existing implementations. Conclusions The emergence of the "web as a computer" requires a formal model for the different functionalities involved in reading and writing to it. The S3DB core model proposed was found to address the design criteria of biomedical computational infrastructure, such as those supporting large scale multi-investigator research, clinical trials, and molecular epidemiology. PMID:20646315
Guziolowski, Carito; Videla, Santiago; Eduati, Federica; Thiele, Sven; Cokelaer, Thomas; Siegel, Anne; Saez-Rodriguez, Julio
2013-09-15
Logic modeling is a useful tool to study signal transduction across multiple pathways. Logic models can be generated by training a network containing the prior knowledge to phospho-proteomics data. The training can be performed using stochastic optimization procedures, but these are unable to guarantee a global optimum or to report the complete family of feasible models. This, however, is essential to provide precise insight into the mechanisms underlying signal transduction and to generate reliable predictions. We propose the use of Answer Set Programming to explore exhaustively the space of feasible logic models. Toward this end, we have developed caspo, an open-source Python package that provides a powerful platform to learn and characterize logic models by leveraging the rich modeling language and solving technologies of Answer Set Programming. We illustrate the usefulness of caspo by revisiting a model of pro-growth and inflammatory pathways in liver cells. We show that, if experimental error is taken into account, there are thousands (11 700) of models compatible with the data. Despite the large number, we can extract structural features from the models, such as links that are always (or never) present or modules that appear in a mutually exclusive fashion. To further characterize this family of models, we investigate the input-output behavior of the models. We find 91 behaviors across the 11 700 models and we suggest new experiments to discriminate among them. Our results underscore the importance of characterizing in a global and exhaustive manner the family of feasible models, with important implications for experimental design. caspo is freely available for download (license GPLv3) and as a web service at http://caspo.genouest.org/. Supplementary materials are available at Bioinformatics online. santiago.videla@irisa.fr.
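Exhaustive enumeration of answer sets, the core facility caspo leverages, can be sketched with the clingo Python bindings (an assumed dependency; the two-atom program below is illustrative, not a caspo network model):

```python
# Sketch: enumerate every answer set of a tiny ASP program, analogous to
# enumerating the complete family of feasible logic models.
import clingo

program = """
{ edge(a); edge(b) }.          % choice rule: each link may be present or not
:- not edge(a), not edge(b).   % constraint: at least one link must exist
"""

ctl = clingo.Control(["0"])    # "0" asks for all answer sets
ctl.add("base", [], program)
ctl.ground([("base", [])])
ctl.solve(on_model=lambda m: print("feasible model:", m))
```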
Theory Learning as Stochastic Search in the Language of Thought
ERIC Educational Resources Information Center
Ullman, Tomer D.; Goodman, Noah D.; Tenenbaum, Joshua B.
2012-01-01
We present an algorithmic model for the development of children's intuitive theories within a hierarchical Bayesian framework, where theories are described as sets of logical laws generated by a probabilistic context-free grammar. We contrast our approach with connectionist and other emergentist approaches to modeling cognitive development. While…
Ontology-Based Learner Categorization through Case Based Reasoning and Fuzzy Logic
ERIC Educational Resources Information Center
Sarwar, Sohail; García-Castro, Raul; Qayyum, Zia Ul; Safyan, Muhammad; Munir, Rana Faisal
2017-01-01
Learner categorization has a pivotal role in making e-learning systems a success. However, learner characteristics exploited at abstract level of granularity by contemporary techniques cannot categorize the learners effectively. In this paper, an architecture of e-learning framework has been presented that exploits the machine learning based…
Cyber Power Potential of the Army’s Reserve Component
2017-01-01
and could extend logically to include electric power, water, food, railway, gas pipelines, and so forth. One consideration to note is that in cases... [the remainder of this excerpt is table-of-contents residue: Chapter Four, "Army Reserve Component Cyber Inventory Analysis"; "Background and Analytical Framework"; "Army Reserve Component Cyber Inventory Analysis, 2015"]
Consensus Knowledge Acquisition
1989-12-01
explicit the logical structure of their positions. Structured frameworks for analyzing arguments (Toulmin, 1958; Fogelin, 1982)... [the remainder of this excerpt is reference-list residue: Stefik, M., et al., "Beyond the chalkboard," CACM 30:1, Jan. 1987, pp. 32-47; Toulmin, S., The Uses of Argument, Cambridge, England: Cambridge University Press, 1958]
Some Thoughts on John Dewey's Ethics and Education
ERIC Educational Resources Information Center
Karafillis, Gregorios
2012-01-01
The philosopher and educator, John Dewey, explores the emergence of the terms "ethics" and "education" from a pragmatist's perspective, i.e., within the linguistic and social components' framework, and society's existing cognitive and cultural level. In the current article, we examine the development, logical control and the relation between…
Rejoinder to Guterman, Martin, and Kopp
ERIC Educational Resources Information Center
Hansen, James T.
2012-01-01
In their reply to the author's keystone article (Hansen, 2012), Guterman, Martin, and Kopp (2012) charge that the author's integrative framework was not sufficiently integrative. They also argue that his proposal results in logical contradictions and the mind-body problem. The author responds by noting that his proposal fully integrates the…
Testing for Factorial Invariance in the Context of Construct Validation
ERIC Educational Resources Information Center
Dimitrov, Dimiter M.
2010-01-01
This article describes the logic and procedures behind testing for factorial invariance across groups in the context of construct validation. The procedures include testing for configural, measurement, and structural invariance in the framework of multiple-group confirmatory factor analysis (CFA). The "forward" (sequential constraint imposition)…
The Seven Silos of Accountability in Higher Education: Systematizing Multiple Logics and Fields
ERIC Educational Resources Information Center
Brown, Joshua Travis
2017-01-01
Higher education accountability is a field characterized by complexity. Prior frameworks grounded in psychometrics, economics, and history fall short in explaining the persistence and composition of its complexity. This article employs organizational theory to identify the multiple conflicting approaches of higher education accountability and…
Bernardes, Juliana S; Carbone, Alessandra; Zaverucha, Gerson
2011-03-23
Remote homology detection is a hard computational problem. Most approaches have trained computational models using either full protein sequences or multiple sequence alignments (MSA), including all positions. However, when we deal with proteins in the "twilight zone" we can observe that only some segments of sequences (motifs) are conserved. We introduce a novel logical representation that allows us to represent physico-chemical properties of sequences, conserved amino acid positions and conserved physico-chemical positions in the MSA. From this, Inductive Logic Programming (ILP) finds the most frequent patterns (motifs) and uses them to train propositional models, such as decision trees and support vector machines (SVM). We use the SCOP database to perform our experiments by evaluating protein recognition within the same superfamily. Our results show that our methodology, when using SVM, performs significantly better than some of the state-of-the-art methods, and comparably to others. However, our method provides a comprehensible set of logical rules that can help to understand what determines a protein function. The strategy of selecting only the most frequent patterns is effective for remote homology detection. This is possible through a suitable first-order logical representation of homologous properties, and through a set of frequent patterns, found by an ILP system, that summarizes essential features of protein functions.
DNA strand displacement system running logic programs.
Rodríguez-Patón, Alfonso; Sainz de Murieta, Iñaki; Sosík, Petr
2014-01-01
The paper presents a DNA-based computing model which is enzyme-free and autonomous, not requiring human intervention during the computation. The model is able to perform iterated resolution steps with logical formulae in conjunctive normal form. The implementation is based on the technique of DNA strand displacement, with each clause encoded in a separate DNA molecule. Propositions are encoded by assigning a strand to each proposition p, and its complementary strand to the proposition ¬p; clauses are encoded by combining different propositions in the same strand. The model makes it possible to run logic programs composed of Horn clauses by cascading resolution steps. The potential of the model is also demonstrated by its theoretical capability of solving SAT. The resulting SAT algorithm has a linear time complexity in the number of resolution steps, whereas its spatial complexity is exponential in the number of variables of the formula. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
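As a software analogue of the chemistry (illustrative only, not the DNA encoding), iterated resolution over definite Horn clauses reduces to forward chaining:

```python
# Forward chaining over propositional Horn clauses: the computation the
# strand-displacement cascades implement chemically, here in software.

def forward_chain(facts: set[str], rules: list[tuple[set[str], str]]) -> set[str]:
    """Each rule is (body, head): the body propositions jointly derive head."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if body <= derived and head not in derived:
                derived.add(head)
                changed = True
    return derived

rules = [({"p", "q"}, "r"), ({"r"}, "s")]
print(forward_chain({"p", "q"}, rules))  # {'p', 'q', 'r', 's'}
```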
Satisfiability of logic programming based on radial basis function neural networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamadneh, Nawaf; Sathasivam, Saratha; Tilahun, Surafel Luleseged
2014-07-10
In this paper, we propose a new technique to test the satisfiability of propositional logic programming and the quantified Boolean formula problem in radial basis function neural networks. For this purpose, we built radial basis function neural networks to represent propositional logic which has exactly three variables in each clause. We used the prey-predator algorithm to calculate the output weights of the neural networks, while the K-means clustering algorithm is used to determine the hidden parameters (the centers and the widths). The mean of the sum squared error function is used to measure the performance of the two algorithms. We applied the developed technique with recurrent radial basis function neural networks to represent the quantified Boolean formulas. The new technique can be applied to solve many applications such as electronic circuits and NP-complete problems.
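The general RBF construction the paper builds on can be sketched as follows, a hedged sketch assuming numpy and scikit-learn: K-means determines the centers, the width is taken from the center spread, and the output weights are solved here by ordinary least squares, where the paper instead uses the prey-predator algorithm.

```python
# Generic RBF-network sketch: K-means for hidden centers, least squares for
# output weights (the paper's prey-predator step is replaced here).
import numpy as np
from sklearn.cluster import KMeans

def rbf_design(X, centers, width):
    """Gaussian design matrix Phi[i, j] = exp(-|x_i - c_j|^2 / (2 width^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * width ** 2))

rng = np.random.default_rng(1)
X = rng.uniform(size=(200, 2))
y = np.sin(4 * X[:, 0]) + X[:, 1]                # toy regression target

centers = KMeans(n_clusters=10, n_init=10, random_state=1).fit(X).cluster_centers_
width = np.mean(np.linalg.norm(centers - centers.mean(axis=0), axis=1))

Phi = rbf_design(X, centers, width)
weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # output weights
mse = float(((Phi @ weights - y) ** 2).mean())
print(f"training MSE: {mse:.4f}")
```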
NASA Technical Reports Server (NTRS)
Roth, J. P.
1972-01-01
The following problems are considered: (1) methods for the development of logic design, together with algorithms, so that it is possible to compute a test for any failure in the logic design, if such a test exists, and developing algorithms and heuristics for the purpose of minimizing the computation for tests; and (2) a method of design of logic for ultra LSI (large scale integration). It was discovered that the so-called quantum calculus can be extended to make it possible: (1) to describe the functional behavior of a mechanism component by component, and (2) to compute tests for failures in the mechanism using the diagnosis algorithm. The development of an algorithm for the multioutput two-level minimization problem is presented, and the program MIN 360 was written for this algorithm. The program has options of mode (exact minimum or various approximations), cost function, cost bound, etc., providing flexibility.
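What "a test for a failure" means can be illustrated by brute force, which is not Roth's diagnosis algorithm but shows the goal: an input vector on which the good circuit and the faulted circuit produce different outputs.

```python
# Brute-force test generation for a stuck-at fault in a toy circuit
# y = (a AND b) OR c (the circuit and fault are hypothetical examples).
from itertools import product

def circuit(a: int, b: int, c: int, fault: str | None = None) -> int:
    ab = a & b
    if fault == "ab_stuck_at_0":   # internal node 'ab' stuck at logic 0
        ab = 0
    return ab | c

for a, b, c in product((0, 1), repeat=3):
    if circuit(a, b, c) != circuit(a, b, c, fault="ab_stuck_at_0"):
        print(f"test found: a={a} b={b} c={c}")  # only a=1 b=1 c=0 detects it
```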
Pennetti, Adelina
2018-07-01
The purpose of this case report is to present a multimodal approach to patient management, using the Maitland concept framework, for cervical and lumbar radiculitis with an underlying diagnosis of Ehlers-Danlos Syndrome-Hypermobility Type (EDS-HT). This case presents care guided by evidence and patient values, with the rationale for the selected course of physical therapy treatment provided by therapist experience. A 35-year-old female with a 2-year history of worsening lumbar and cervical pain was referred to physical therapy to address these musculoskeletal issues concurrent with diagnostic testing for EDS. A multimodal approach including manual therapy, therapeutic exercise, postural and body mechanics education, and a home exercise program was used. The Patient-Specific Functional Scale (PSFS) was used to gauge the patient's perceived improvements, which were demonstrated by increased scores at reevaluation and at discharge. Following the Maitland concept framework, the physical therapist was able to make sound clinical decisions by tracking the logical flow of constant patient assessment. A 10-month course of treatment designed to maximize recovery of function was successful despite a chronic history of pain and the EDS-HT diagnosis. The role of education and empowering the patient is shown to be of utmost importance. Optimizing long-term therapeutic outcomes for this patient population requires maintaining a home exercise program and adapting and modifying work and lifestyle activities.
In the soft-to-hard technical spectrum: Where is software engineering?
NASA Technical Reports Server (NTRS)
Leibfried, Theodore F.; Macdonald, Robert B.
1992-01-01
In the computer journals and tabloids, a plethora of articles have been written about the software engineering field. But while many advocate the need for an engineering approach to software development, it is striking how many authors have treated the subject of software engineering without adequately addressing the fundamentals of what engineering as a discipline consists of. A discussion is presented of the various related facets of this issue in a logical framework to advance the thesis that the software development process is necessarily an engineering process. The purpose is to examine in more detail whether or not the design and development of software for digital computer processing systems should be both viewed and treated as a legitimate field of professional engineering. Also, the type of academic and professional-level education programs that would be required to support a software engineering discipline is examined.
Treml, Benjamin; Gillman, Andrew; Buskohl, Philip; Vaia, Richard
2018-06-18
Robots autonomously interact with their environment through a continual sense-decide-respond control loop. Most commonly, the decide step occurs in a central processing unit; however, the stiffness mismatch between rigid electronics and the compliant bodies of soft robots can impede integration of these systems. We develop a framework for programmable mechanical computation embedded into the structure of soft robots that can augment conventional digital electronic control schemes. Using an origami waterbomb as an experimental platform, we demonstrate a 1-bit mechanical storage device that writes, erases, and rewrites itself in response to a time-varying environmental signal. Further, we show that mechanical coupling between connected origami units can be used to program the behavior of a mechanical bit, produce logic gates such as AND, OR, and three-input majority gates, and transmit signals between mechanologic gates. Embedded mechanologic provides a route to add autonomy and intelligence to soft robots and machines. Copyright © 2018 the Author(s). Published by PNAS.
LEGO-MM: LEarning structured model by probabilistic loGic Ontology tree for MultiMedia.
Tang, Jinhui; Chang, Shiyu; Qi, Guo-Jun; Tian, Qi; Rui, Yong; Huang, Thomas S
2016-09-22
Recent advances in multimedia ontology have resulted in a number of concept models, e.g., LSCOM and Mediamill 101, which are accessible and public to other researchers. However, most current research effort still focuses on building new concepts from scratch; very few works explore appropriate methods for constructing new concepts upon the existing models already in the warehouse. To address this issue, we propose a new framework in this paper, termed LEGO-MM, which can seamlessly integrate both the new target training examples and the existing primitive concept models to infer more complex concept models. LEGO-MM treats the primitive concept models as LEGO toys with which to construct a potentially unlimited vocabulary of new concepts. Specifically, we first formulate the logic operations to be the LEGO connectors that combine existing concept models hierarchically in probabilistic logic ontology trees. Then, we incorporate new target training information simultaneously to efficiently disambiguate the underlying logic tree and correct error propagation. Extensive experiments are conducted on a large vehicle domain data set from ImageNet. The results demonstrate that LEGO-MM has significantly superior performance over existing state-of-the-art methods, which build new concept models from scratch.
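One standard way to realize logic connectives over concept-detector probabilities is sketched below, assuming independent detectors; the paper's actual combination rules in its probabilistic logic ontology trees may differ.

```python
# Probabilistic logic connectives over concept scores (independence assumed).

def p_and(p: float, q: float) -> float:
    return p * q                  # both concepts present

def p_or(p: float, q: float) -> float:
    return p + q - p * q          # noisy-OR

def p_not(p: float) -> float:
    return 1.0 - p

p_car, p_red = 0.9, 0.7           # outputs of two primitive concept models
print(round(p_and(p_car, p_red), 3))   # composite "red car" -> 0.63
print(round(p_or(p_car, 0.4), 3))      # composite "car OR truck" -> 0.94
```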
NASA Technical Reports Server (NTRS)
Heinmiller, J. P.
1971-01-01
This document is the programmer's guide for the GNAT computer program developed under MSC/TRW Task 705-2 (Apollo cryogenic storage system analysis), subtask 2. Detailed logic flow charts and compiled program listings are provided for all program elements.
Functional and space programming.
Hayward, C
1988-01-01
In this article, the author expands the earlier stated case for functional and space programming based on objective evidence of user needs. It provides an in-depth examination of the logic and processes of programming as a continuum which precedes, then parallels, architectural design.
Wright, Demia Sundra; Anderson, Lynda A; Brownson, Ross C; Gwaltney, Margaret K; Scherer, Jennifer; Cross, Alan W; Goodman, Robert M; Schwartz, Randy; Sims, Tom; White, Carol R
2008-01-01
The Centers for Disease Control and Prevention's (CDC's) Prevention Research Centers (PRC) Program underwent a 2-year evaluation planning project using a participatory process that allowed perspectives from the national community of PRC partners to be expressed and reflected in a national logic model. The PRC Program recognized the challenge in developing a feasible, useable, and relevant evaluation process for a large, diverse program. To address the challenge, participatory and utilization-focused evaluation models were used. Four tactics guided the evaluation planning process: 1) assessing stakeholders' communication needs and existing communication mechanisms and infrastructure; 2) using existing mechanisms and establishing others as needed to inform, educate, and request feedback; 3) listening to and using feedback received; and 4) obtaining adequate resources and building flexibility into the project plan to support multifaceted mechanisms for data collection. Participatory methods resulted in buy-in from stakeholders and the development of a national logic model. Benefits included CDC's use of the logic model for program planning and development of a national evaluation protocol and increased expectations among PRC partners for involvement. Challenges included the time, effort, and investment of program resources required for the participatory approach and the identification of whom to engage and when to engage them for feedback on project decisions. By using a participatory and utilization-focused model, program partners positively influenced how CDC developed an evaluation plan. The tactics we used can guide the involvement of program stakeholders and help with decisions on appropriate methods and approaches for engaging partners.
Cell-to-Cell Communication Circuits: Quantitative Analysis of Synthetic Logic Gates
Hoffman-Sommer, Marta; Supady, Adriana; Klipp, Edda
2012-01-01
One of the goals in the field of synthetic biology is the construction of cellular computation devices that could function in a manner similar to electronic circuits. To this end, attempts are made to create biological systems that function as logic gates. In this work we present a theoretical quantitative analysis of a synthetic cellular logic-gates system, which has been implemented in cells of the yeast Saccharomyces cerevisiae (Regot et al., 2011). It exploits endogenous MAP kinase signaling pathways. The novelty of the system lies in the compartmentalization of the circuit where all basic logic gates are implemented in independent single cells that can then be cultured together to perform complex logic functions. We have constructed kinetic models of the multicellular IDENTITY, NOT, OR, and IMPLIES logic gates, using both deterministic and stochastic frameworks. All necessary model parameters are taken from literature or estimated based on published kinetic data, in such a way that the resulting models correctly capture important dynamic features of the included mitogen-activated protein kinase pathways. We analyze the models in terms of parameter sensitivity and we discuss possible ways of optimizing the system, e.g., by tuning the culture density. We apply a stochastic modeling approach, which simulates the behavior of whole populations of cells and allows us to investigate the noise generated in the system; we find that the gene expression units are the major sources of noise. Finally, the model is used for the design of system modifications: we show how the current system could be transformed to operate on three discrete values. PMID:22934039
Improvements to the adaptive maneuvering logic program
NASA Technical Reports Server (NTRS)
Burgin, George H.
1986-01-01
The Adaptive Maneuvering Logic (AML) computer program simulates close-in, one-on-one air-to-air combat between two fighter aircraft. Three important improvements are described. First, the previously available versions of AML were examined for their suitability as a baseline program. The selected program was then revised to eliminate some programming bugs which were uncovered over the years. A listing of this baseline program is included. Second, the equations governing the motion of the aircraft were completely revised. This resulted in a model with substantially higher fidelity than the original equations of motion provided. It also completely eliminated the over-the-top problem, which occurred in the older versions when the AML-driven aircraft attempted a vertical or near vertical loop. Third, the requirements for a versatile generic, yet realistic, aircraft model were studied and implemented in the program. The report contains detailed tables which make the generic aircraft to be either a modern, high performance aircraft, an older high performance aircraft, or a previous generation jet fighter.
Model checking for linear temporal logic: An efficient implementation
NASA Technical Reports Server (NTRS)
Sherman, Rivi; Pnueli, Amir
1990-01-01
This report provides evidence to support the claim that model checking for linear temporal logic (LTL) is practically efficient. Two implementations of a linear temporal logic model checker are described. One is based on transforming the model checking problem into a satisfiability problem; the other checks an LTL formula against a finite model by computing the cross-product of the finite state transition graph of the program with a structure containing all possible models of the property. Both implementations were exercised on a set of mutual exclusion algorithms, testing safety and liveness under fairness.
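As a toy illustration of the explicit-state side of this approach, the sketch below exhaustively searches a finite transition graph for a violation of a safety property (mutual exclusion). A full LTL checker would additionally build the cross-product with a structure for the formula, which this sketch omits; the protocol modeled here is deliberately unsafe so the search finds a counterexample.

```python
from collections import deque

def check_invariant(initial, successors, invariant):
    """Breadth-first search; return a violating state, or None if safe."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if not invariant(state):
            return state
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return None

# Two processes, each cycling idle -> trying -> critical, with no guard:
# a naive protocol that fails to enforce mutual exclusion.
def successors(state):
    for i in (0, 1):
        s = list(state)
        s[i] = {"idle": "trying", "trying": "critical",
                "critical": "idle"}[s[i]]
        yield tuple(s)

bad = check_invariant(("idle", "idle"), successors,
                      lambda s: s != ("critical", "critical"))
print("counterexample state:", bad)
```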
NASA Astrophysics Data System (ADS)
Hirst, Jonathan D.; King, Ross D.; Sternberg, Michael J. E.
1994-08-01
One of the largest available data sets for developing a quantitative structure-activity relationship (QSAR) — the inhibition of dihydrofolate reductase (DHFR) by 2,4-diamino-6,6-dimethyl-5-phenyl-dihydrotriazine derivatives — has been used for a sixfold cross-validation trial of neural networks, inductive logic programming (ILP) and linear regression. No statistically significant difference was found between the predictive capabilities of the methods. However, the representation of molecules by attributes, which is integral to the ILP approach, provides understandable rules about drug-receptor interactions.
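The protocol is straightforward to reproduce in outline. The sketch below runs a sixfold cross-validation comparison of linear regression and a small neural network on synthetic stand-in data (the DHFR set is not reproduced here) using scikit-learn; ILP, which learns symbolic rules from relational descriptions, has no drop-in analogue in this library and is omitted.

```python
import numpy as np
from sklearn.model_selection import KFold, cross_val_score
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))               # 60 compounds, 5 descriptors
y = X @ np.array([1.0, -0.5, 0.3, 0.0, 0.2]) + 0.1 * rng.normal(size=60)

cv = KFold(n_splits=6, shuffle=True, random_state=0)
models = [("linear regression", LinearRegression()),
          ("neural network", MLPRegressor(hidden_layer_sizes=(8,),
                                          max_iter=5000, random_state=0))]
for name, model in models:
    scores = cross_val_score(model, X, y, cv=cv, scoring="r2")
    print(f"{name}: mean cross-validated R^2 = {scores.mean():.2f}")
```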
d-Neighborhood system and generalized F-contraction in dislocated metric space.
Kumari, P Sumati; Zoto, Kastriot; Panthi, Dinesh
2015-01-01
This paper gives an answer to Question 1.1 posed by Hitzler (Generalized metrics and topology in logic programming semantics, 2001) by means of the topological aspects of d-metric spaces with d-neighborhood systems. We investigate the topological properties of the d-neighborhood system obtained from a dislocated metric space (d-metric space for short), which has useful applications in the semantic analysis of logic programming. Furthermore, we generalize the notion of F-contraction to d-metric spaces and investigate the uniqueness of fixed points and coincidence points of such mappings.
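For readers unfamiliar with the setting, a dislocated metric relaxes exactly one metric axiom: self-distance need not be zero. The following restates the standard axioms (after Hitzler and Seda), not anything specific to this paper.

```latex
% Axioms of a dislocated (d-)metric: identical to a metric except that the
% self-distance d(x,x) need not vanish.
\begin{align*}
  & d \colon X \times X \to [0,\infty),\\
  & d(x,y) = 0 \implies x = y \quad (\text{but } d(x,x) > 0 \text{ is permitted}),\\
  & d(x,y) = d(y,x),\\
  & d(x,y) \le d(x,z) + d(z,y).
\end{align*}
```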
System for corrosion monitoring in pipeline applying fuzzy logic mathematics
NASA Astrophysics Data System (ADS)
Kuzyakov, O. N.; Kolosova, A. L.; Andreeva, M. A.
2018-05-01
A list of factors influencing the corrosion rate on the external side of an underground pipeline is determined. Principles of constructing a corrosion monitoring system are described; the system performance algorithm and program are elaborated. A comparative analysis of methods for calculating corrosion rate is undertaken. Fuzzy logic mathematics is applied to reduce the amount of calculation while taking a wider range of corrosion factors into account.
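The abstract does not give the rule base, so the following is a hedged sketch of Mamdani-style fuzzy estimation using two invented factors (soil resistivity and moisture) with triangular membership functions and weighted-average defuzzification; the factor list, rules, and numbers are illustrative only.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def corrosion_rate(resistivity_ohm_m, moisture_pct):
    low_res  = tri(resistivity_ohm_m, 0, 10, 50)
    high_res = tri(resistivity_ohm_m, 30, 100, 200)
    wet      = tri(moisture_pct, 20, 60, 100)
    dry      = tri(moisture_pct, 0, 10, 40)
    # Rule strength (min as fuzzy AND), each mapped to a representative rate.
    rules = [(min(low_res, wet), 0.9),   # low resistivity AND wet  -> severe
             (min(high_res, dry), 0.1),  # high resistivity AND dry -> mild
             (min(low_res, dry), 0.5)]   # low resistivity AND dry  -> moderate
    num = sum(w * v for w, v in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0     # weighted-average defuzzification

print(f"estimated corrosion-rate index: {corrosion_rate(8.0, 70.0):.2f}")
```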
ERIC Educational Resources Information Center
Jones, Bruce William
The results of implementing computer-assisted instruction (CAI) in two religion courses and a logic course at California State College, Bakersfield, are examined along with student responses. The main purpose of the CAI project was to teach interpretive skills. The most positive results came in the logic course. The programs in the New Testament…
Coping with Logical Fallacies: A Developmental Training Program for Learning to Reason
ERIC Educational Resources Information Center
Christoforides, Michael; Spanoudis, George; Demetriou, Andreas
2016-01-01
This study trained children to master logical fallacies and examined how learning is related to processing efficiency and fluid intelligence (gf). A total of one hundred and eighty 8- and 11-year-old children living in Cyprus were allocated to a control, a limited (LI), and a full instruction (FI) group. The LI group learned the notion of logical…
Derivation of sorting programs
NASA Technical Reports Server (NTRS)
Varghese, Joseph; Loganantharaj, Rasiah
1990-01-01
Program synthesis for critical applications has become a viable alternative to program verification. Nested resolution and its extension are used to synthesize a set of sorting programs from their first-order logic specifications. Sorting programs such as naive sort, merge sort, and insertion sort were successfully synthesized from the same set of specifications.
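The nested-resolution synthesis procedure itself is beyond the scope of a short sketch. What follows only illustrates the relationship the method exploits: a first-order specification of sorting (the output is ordered and a permutation of the input) and one program, insertion sort, whose output satisfies it.

```python
from collections import Counter

def satisfies_spec(inp, out):
    """sort(inp, out) <-> permutation(inp, out) AND ordered(out)."""
    ordered = all(out[i] <= out[i + 1] for i in range(len(out) - 1))
    return Counter(inp) == Counter(out) and ordered

def insertion_sort(xs):
    out = []
    for x in xs:
        i = len(out)
        while i > 0 and out[i - 1] > x:   # find the insertion point
            i -= 1
        out.insert(i, x)
    return out

data = [3, 1, 4, 1, 5]
assert satisfies_spec(data, insertion_sort(data))
print("sorted:", insertion_sort(data))
```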
Rachlis, Beth; Sodhi, Sumeet; Burciul, Barry; Orbinski, James; Cheng, Amy H.Y.; Cole, Donald
2013-01-01
Community-based care (CBC) can increase access to key services for people affected by HIV/AIDS through the mobilization of community interests and resources and their integration with formal health structures. Yet, the lack of a systematic framework for analysis of CBC focused on HIV/AIDS impedes our ability to understand and study CBC programs. We sought to develop a taxonomy of CBC programs focused on HIV/AIDS in resource-limited settings in an effort to understand their key characteristics, uncover any gaps in programming, and highlight the potential roles they play. Our review aimed to systematically identify key CBC programs focused on HIV/AIDS in resource-limited settings. We used both bibliographic database searches (Medline, CINAHL, and EMBASE) for peer-reviewed literature and internet-based searches for gray literature. Our search terms were ‘HIV’ or ‘AIDS’ and ‘community-based care’ or ‘CBC’. Two co-authors developed a descriptive taxonomy through an iterative, inductive process using the retrieved program information. We identified 21 CBC programs useful for developing the taxonomy. Extensive variation was observed within each of the nine categories identified: region, vision, characteristics of target populations, program scope, program operations, funding models, human resources, sustainability, and monitoring and evaluation strategies. While additional research may still be needed to identify the conditions that lead to overall program success, our findings can help to inform our understanding of the various aspects of CBC programs and inform potential logic models for CBC programming in the context of HIV/AIDS in resource-limited settings. Importantly, the findings of the present study can be used to develop sustainable HIV/AIDS-service delivery programs in regions with health resource shortages. PMID:23594416
Haskell before Haskell: Curry's Contribution to Programming (1946-1950)
NASA Astrophysics Data System (ADS)
de Mol, Liesbeth; Bullynck, Maarten; Carlé, Martin
This paper discusses Curry's work on how to implement the problem of inverse interpolation on the ENIAC (1946) and his subsequent work on developing a theory of program composition (1948-1950). It is shown that Curry anticipated automatic programming and that his logical work influenced his composition of programs.
The Programmable Calculator in the Classroom.
ERIC Educational Resources Information Center
Stolarz, Theodore J.
The uses of programmable calculators in the mathematics classroom are presented. A discussion of the "microelectronics revolution" that has brought programmable calculators into our society is also included. It is pointed out that the logical or mental processes used to program the programmable calculator are identical to those used to program…
Diagnosable structured logic array
NASA Technical Reports Server (NTRS)
Whitaker, Sterling (Inventor); Miles, Lowell (Inventor); Gambles, Jody (Inventor); Maki, Gary K. (Inventor)
2009-01-01
A diagnosable structured logic array and associated process are provided. A base cell structure is provided comprising a logic unit comprising a plurality of input nodes, a plurality of selection nodes, and an output node, a plurality of switches coupled to the selection nodes, where each switch comprises a plurality of input lines, a selection line and an output line, a memory cell coupled to the output node, and a test address bus and a program control bus coupled to the plurality of input lines and the selection line of the plurality of switches. A state on each of the plurality of input nodes is verifiably loaded and read from the memory cell. A trusted memory block is provided. The associated process is provided for testing and verifying a plurality of truth table inputs of the logic unit.
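The following is a behavioral toy model of the diagnosability idea, not the patented circuit: a truth-table logic unit latches its output into a memory cell, so a test harness can sweep every input combination and read the latched result back.

```python
from itertools import product

class DiagnosableCell:
    def __init__(self, truth_table):
        self.truth_table = truth_table   # maps input tuple -> 0/1
        self.memory = None               # memory cell latching the output

    def evaluate(self, inputs):
        self.memory = self.truth_table[inputs]
        return self.memory

    def self_test(self, expected):
        """Sweep all input combinations and verify each latched output."""
        n = len(next(iter(expected)))
        return all(self.evaluate(bits) == expected[bits]
                   for bits in product((0, 1), repeat=n))

nand = {bits: int(not (bits[0] and bits[1]))
        for bits in product((0, 1), repeat=2)}
cell = DiagnosableCell(nand)
print("truth table verified:", cell.self_test(nand))
```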
NASA Technical Reports Server (NTRS)
Rodgers, T. E.; Johnson, J. F.
1977-01-01
The logic and methodology for a preliminary grouping of Spacelab and mixed-cargo payloads are proposed in a form that can be readily coded into a computer program by NASA. The logic developed for this preliminary cargo grouping analysis is summarized. Principal input data include the NASA Payload Model, payload descriptive data, Orbiter and Spacelab capabilities, and NASA guidelines and constraints. The first step in the process is a launch interval selection in which the time interval for payload grouping is identified. Logic flow steps are then taken to group payloads and define flight configurations based on criteria that include dedication, volume, area, orbital parameters, pointing, g-level, mass, center of gravity, energy, power, and crew time.
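As a hedged sketch of the grouping step only, the code below packs payloads into flight configurations first-fit under three capacity limits. The actual analysis also weighs dedication, orbital parameters, pointing, g-level, center of gravity, energy, and crew time, none of which are modeled here; the payload names and numbers are invented.

```python
def group_payloads(payloads, limits):
    """First-fit decreasing by mass, subject to per-flight capacity limits."""
    flights = []
    for p in sorted(payloads, key=lambda p: -p["mass"]):
        for f in flights:
            if all(f[k] + p[k] <= limits[k] for k in limits):
                for k in limits:
                    f[k] += p[k]
                f["names"].append(p["name"])
                break
        else:                                  # no existing flight fits
            flights.append({**{k: p[k] for k in limits},
                            "names": [p["name"]]})
    return flights

payloads = [{"name": "A", "mass": 4000, "volume": 20, "power": 3.0},
            {"name": "B", "mass": 9000, "volume": 45, "power": 5.5},
            {"name": "C", "mass": 3000, "volume": 15, "power": 2.0}]
limits = {"mass": 12000, "volume": 60, "power": 7.0}
for f in group_payloads(payloads, limits):
    print(f["names"], {k: f[k] for k in limits})
```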
Schulz, S; Romacker, M; Hahn, U
1998-01-01
The development of powerful and comprehensive medical ontologies that support formal reasoning on a large scale is one of the key requirements for clinical computing in the next millennium. Taxonomic medical knowledge, a major portion of these ontologies, is mainly characterized by generalization and part-whole relations between concepts. While reasoning in generalization hierarchies is quite well understood, no fully conclusive mechanism as yet exists for part-whole reasoning. The approach we take emulates part-whole reasoning via classification-based reasoning using SEP triplets, a special data structure for encoding part-whole relations that is fully embedded in the formal framework of standard description logics.
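A hedged sketch of the triplet encoding, using the classic hand/finger example: each concept X is represented by a structure node X_S that subsumes an entity node X_E and a part node X_P, so that part-whole inference reduces to ordinary subsumption (the role name partOf is illustrative).

```latex
% SEP-triplet sketch for "Finger part-of Hand": the structure node Hand_S
% subsumes both the entity node Hand_E and the part node Hand_P.
\begin{align*}
  \mathit{Hand}_E \sqsubseteq \mathit{Hand}_S,
    &\qquad \mathit{Hand}_P \sqsubseteq \mathit{Hand}_S,\\
  \mathit{Hand}_P \sqsubseteq \exists\,\mathit{partOf}.\mathit{Hand}_E,
    &\qquad \mathit{Finger}_S \sqsubseteq \mathit{Hand}_P.
\end{align*}
```

A standard description-logic classifier then derives that every finger, and transitively every part of a finger, stands in the partOf relation to a hand, without any dedicated part-whole reasoning mechanism.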
Property Specification Patterns for intelligence building software
NASA Astrophysics Data System (ADS)
Chun, Seungsu
2018-03-01
This paper presents a single framework for intelligent building software based on property specification patterns for the modal mu-calculus. Dwyer's property specification pattern classification is subdivided into state (S) and action (A) patterns, each of which is further divided into strong (A) and weak (E) variants. This hierarchical pattern classification is then applied to the mu-calculus analysis of the example properties used in an actual model checker. The result is a classification that is not only more accurate than existing classification systems but also makes the specified properties easier to write and understand.
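Two standard pattern instances make the strong/weak split concrete. The formulas below are textbook modal mu-calculus identities (pattern names follow Dwyer's catalogue), not examples taken from the paper itself: the strong reading quantifies over all paths with the box modality, the weak reading over some path with the diamond.

```latex
% Strong/universal (box) versus weak/existential (diamond) pattern forms:
\begin{align*}
  \text{Absence, strong:} \quad & \mathbf{AG}\,\neg p \;\equiv\; \nu X.\,(\neg p \wedge [-]X)\\
  \text{Existence, weak:} \quad & \mathbf{EF}\,p \;\equiv\; \mu X.\,(p \vee \langle-\rangle X)
\end{align*}
```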
Improving ontology matching with propagation strategy and user feedback
NASA Astrophysics Data System (ADS)
Li, Chunhua; Cui, Zhiming; Zhao, Pengpeng; Wu, Jian; Xin, Jie; He, Tianxu
2015-07-01
Markov logic networks, which unify probabilistic graphical models and first-order logic, provide an excellent framework for ontology matching. The existing approach requires a threshold to produce matching candidates and uses a small set of constraints, acting as a filter, to select the final alignments. We introduce a novel match propagation strategy to model the influences between potential entity mappings across ontologies, which helps to identify correct correspondences and to recover missed ones. Because estimating an appropriate threshold is a difficult task, we propose an interactive method for threshold selection through which we obtain an additional measurable improvement. Experiments on a public dataset demonstrate the effectiveness of the proposed approach in terms of the quality of the resulting alignment.
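A toy version of the propagation idea, in the spirit of similarity flooding, is sketched below: each candidate mapping's score is repeatedly blended with the scores of mappings between neighboring entities. The Markov-logic inference and the interactive threshold selection are not reproduced; all names and weights are invented.

```python
def propagate(scores, neighbors, alpha=0.5, iterations=10):
    """scores: {(a, b): s0}; neighbors: {(a, b): [(a2, b2), ...]}.
    Blend each base score with the mean score of related mappings."""
    current = dict(scores)
    for _ in range(iterations):
        updated = {}
        for pair, s0 in scores.items():
            related = neighbors.get(pair, [])
            boost = sum(current[q] for q in related) / max(len(related), 1)
            updated[pair] = (1 - alpha) * s0 + alpha * boost
        current = updated
    return current

scores = {("Car", "Automobile"): 0.6, ("Wheel", "Tire"): 0.4}
neighbors = {("Car", "Automobile"): [("Wheel", "Tire")],
             ("Wheel", "Tire"): [("Car", "Automobile")]}
print(propagate(scores, neighbors))
```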
Ecological resilience in lakes and the conjunction fallacy.
Spears, Bryan M; Futter, Martyn N; Jeppesen, Erik; Huser, Brian J; Ives, Stephen; Davidson, Thomas A; Adrian, Rita; Angeler, David G; Burthe, Sarah J; Carvalho, Laurence; Daunt, Francis; Gsell, Alena S; Hessen, Dag O; Janssen, Annette B G; Mackay, Eleanor B; May, Linda; Moorhouse, Heather; Olsen, Saara; Søndergaard, Martin; Woods, Helen; Thackeray, Stephen J
2017-11-01
There is a pressing need to apply stability and resilience theory to environmental management to restore degraded ecosystems effectively and to mitigate the effects of impending environmental change. Lakes represent excellent model case studies in this respect and have been used widely to demonstrate theories of ecological stability and resilience that are needed to underpin preventative management approaches. However, we argue that this approach is not yet fully developed because the pursuit of empirical evidence to underpin such theoretically grounded management continues in the absence of an objective probability framework. This has blurred the lines between intuitive logic (based on assumption and belief) and extensional logic (based on the elementary principles of probability) in this field.
Magnetic-field-controlled reconfigurable semiconductor logic.
Joo, Sungjung; Kim, Taeyueb; Shin, Sang Hoon; Lim, Ju Young; Hong, Jinki; Song, Jin Dong; Chang, Joonyeon; Lee, Hyun-Woo; Rhie, Kungwon; Han, Suk Hee; Shin, Kyung-Ho; Johnson, Mark
2013-02-07
Logic devices based on magnetism show promise for increasing computational efficiency while decreasing consumed power. They offer zero quiescent power and yet combine novel functions such as programmable logic operation and non-volatile built-in memory. However, practical efforts to adapt a magnetic device to logic suffer from a low signal-to-noise ratio and other performance attributes that are not adequate for logic gates. Rather than exploiting magnetoresistive effects that result from spin-dependent transport of carriers, we have approached the development of a magnetic logic device in a different way: we use the phenomenon of large magnetoresistance found in non-magnetic semiconductors in high electric fields. Here we report a device showing a strong diode characteristic that is highly sensitive to both the sign and the magnitude of an external magnetic field, offering a reversible change between two different characteristic states by the application of a magnetic field. This feature results from magnetic control of carrier generation and recombination in an InSb p-n bilayer channel. Simple circuits combining such elementary devices are fabricated and tested, and Boolean logic functions including AND, OR, NAND and NOR are performed. They are programmed dynamically by external electric or magnetic signals, demonstrating magnetic-field-controlled semiconductor reconfigurable logic at room temperature. This magnetic technology permits a new kind of spintronic device, characterized as a current switch rather than a voltage switch, and provides a simple and compact platform for non-volatile reconfigurable logic devices.
ERIC Educational Resources Information Center
Classroom Computer Learning, 1984
1984-01-01
Five computer-oriented classroom activities are suggested. They include: Logo programming to help students develop estimation, logic and spatial skills; creating flow charts; inputting data; making snowflakes using Logo; and developing and using a database management program. (JN)
ITS logical architecture : traceability matrix.
DOT National Transportation Integrated Search
2003-11-01
This document provides information to aid in understanding and using the Long-Term Pavement Performance (LTPP) program pavement performance database. This document provides an introduction to the structure of the LTPP program, the relational structur...
NASA Technical Reports Server (NTRS)
Bains, R. W.; Herwig, H. A.; Luedeman, J. K.; Torina, E. M.
1974-01-01
The Shuttle Electric Power System (SEPS) computer program is considered in terms of the program manual, programmer guide, and program utilization. The main objective is to provide the information necessary to interpret and use the routines comprising the SEPS program. Subroutine descriptions including the name, purpose, method, variable definitions, and logic flow are presented.
Towards a molecular logic machine
NASA Astrophysics Data System (ADS)
Remacle, F.; Levine, R. D.
2001-06-01
Finite state logic machines can be realized by pump-probe spectroscopic experiments on an isolated molecule. The most elaborate setup, a Turing machine, can be programmed to carry out a specific computation. We argue that a molecule can be similarly programmed, and provide examples using two photon spectroscopies. The states of the molecule serve as the possible states of the head of the Turing machine and the physics of the problem determines the possible instructions of the program. The tape is written in an alphabet that allows the listing of the different pump and probe signals that are applied in a given experiment. Different experiments using the same set of molecular levels correspond to different tapes that can be read and processed by the same head and program. The analogy to a Turing machine is not a mechanical one and is not completely molecular because the tape is not part of the molecular machine. We therefore also discuss molecular finite state machines, such as sequential devices, for which the tape is not part of the machine. Nonmolecular tapes allow for quite long input sequences with a rich alphabet (at the level of 7 bits) and laser pulse shaping experiments provide concrete examples. Single molecule spectroscopies show that a single molecule can be repeatedly cycled through a logical operation.
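The finite-state reading is easy to make concrete. In the sketch below, states stand for molecular levels and the input alphabet for pump and probe pulse types; the transition table is invented for illustration rather than taken from any spectroscopic system.

```python
# Toy molecular finite-state machine: the tape is a sequence of pulse
# symbols, and the machine's state tracks the populated molecular level.
TRANSITIONS = {
    ("ground", "pump"): "excited",
    ("excited", "probe"): "ionized",
    ("excited", "pump"): "ground",     # e.g. stimulated emission back down
}

def run(tape, state="ground"):
    for symbol in tape:
        state = TRANSITIONS.get((state, symbol), state)
    return state

print(run(["pump", "probe"]))          # -> "ionized": this tape is accepted
```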
Engineered modular biomaterial logic gates for environmentally triggered therapeutic delivery
NASA Astrophysics Data System (ADS)
Badeau, Barry A.; Comerford, Michael P.; Arakawa, Christopher K.; Shadish, Jared A.; Deforest, Cole A.
2018-03-01
The successful transport of drug- and cell-based therapeutics to diseased sites represents a major barrier in the development of clinical therapies. Targeted delivery can be mediated through degradable biomaterial vehicles that utilize disease biomarkers to trigger payload release. Here, we report a modular chemical framework for imparting hydrogels with precise degradative responsiveness by using multiple environmental cues to trigger reactions that operate user-programmable Boolean logic. By specifying the molecular architecture and connectivity of orthogonal stimuli-labile moieties within material cross-linkers, we show selective control over gel dissolution and therapeutic delivery. To illustrate the versatility of this methodology, we synthesized 17 distinct stimuli-responsive materials that collectively yielded all possible YES/OR/AND logic outputs from input combinations involving enzyme, reductant and light. Using these hydrogels we demonstrate the first sequential and environmentally stimulated release of multiple cell lines in well-defined combinations from a material. We expect these platforms will find utility in several diverse fields including drug delivery, diagnostics and regenerative medicine.
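The connectivity argument can be stated as a small recursion, sketched below under one reading of the design principle (not the paper's chemistry): labile groups placed in series along a cross-linker give OR-degradation, since cleaving any one severs the chain, while parallel strands give AND-degradation, since every strand must be cut.

```python
def severed(architecture, cleaved):
    """architecture: a stimulus name (leaf), a list (groups in series),
    or a tuple (strands in parallel); cleaved: set of applied stimuli."""
    if isinstance(architecture, str):
        return architecture in cleaved
    if isinstance(architecture, list):       # series: ANY cleavage severs
        return any(severed(a, cleaved) for a in architecture)
    return all(severed(a, cleaved) for a in architecture)  # parallel: ALL

or_linker = ["enzyme", "reductant"]          # enzyme OR reductant
and_linker = (["enzyme"], ["reductant"])     # enzyme AND reductant
print(severed(or_linker, {"enzyme"}))        # True: gel dissolves
print(severed(and_linker, {"enzyme"}))       # False: gel holds
```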
Software Process Assurance for Complex Electronics
NASA Technical Reports Server (NTRS)
Plastow, Richard A.
2007-01-01
Complex Electronics (CE) now perform tasks that were previously handled in software, such as communication protocols. Many methods used to develop software bear a close resemblance to CE development. Field Programmable Gate Arrays (FPGAs) can have over a million logic gates, while system-on-chip (SOC) devices can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of software-like bugs, such as incorrect design or logic and unexpected interactions within the logic, is great. With CE devices obscuring the hardware/software boundary, we propose that mature software methodologies may be utilized, with slight modifications, in the development of these devices. Software Process Assurance for Complex Electronics (SPACE) is a research project that used standardized software assurance and engineering practices to provide an assurance framework for development activities. Tools such as checklists, best practices, and techniques were used to detect missing requirements and bugs earlier in the development cycle, creating a development process for CE that is more easily maintained, consistent, and configurable based on the device used.
Reasoning about real-time systems with temporal interval logic constraints on multi-state automata
NASA Technical Reports Server (NTRS)
Gabrielian, Armen
1991-01-01
Models of real-time systems using a single paradigm often turn out to be inadequate, whether the paradigm is based on states, rules, event sequences, or logic. A model-based approach to reasoning about real-time systems is presented in which a temporal interval logic called TIL is employed to define constraints on a new type of high-level automata. The combination, called hierarchical multi-state (HMS) machines, can be used to formally model a real-time system, a dynamic set of requirements, the environment, heuristic knowledge about planning-related problem solving, and the computational states of the reasoning mechanism. In this framework, mathematical techniques were developed for: (1) proving the correctness of a representation; (2) planning of concurrent tasks to achieve goals; and (3) scheduling of plans to satisfy complex temporal constraints. HMS machines allow reasoning about a real-time system from a model of how truth arises instead of merely depending on what is true in a system.
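TIL's actual syntax is not given in this abstract, so the sketch below only checks one typical interval-style requirement, that every occurrence of event a is answered by event b within [lo, hi] time units, against a timed trace such as an automaton run might generate; the events and bounds are invented.

```python
def check_interval(trace, a, b, lo, hi):
    """trace: time-ordered list of (time, event) pairs."""
    pending = []                       # times at which `a` is still unanswered
    for t, event in trace:
        if event == a:
            pending.append(t)
        elif event == b:               # discharge every `a` this `b` answers
            pending = [ta for ta in pending if not (lo <= t - ta <= hi)]
        if any(t - ta > hi for ta in pending):
            return False               # some `a` can no longer be answered
    return not pending

trace = [(0, "request"), (3, "grant"), (10, "request"), (18, "grant")]
print(check_interval(trace, "request", "grant", lo=0, hi=5))   # False
```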
The logical foundations of forensic science: towards reliable knowledge
Evett, Ian
2015-01-01
The generation of observations is a technical process and the advances that have been made in forensic science techniques over the last 50 years have been staggering. But science is about reasoning—about making sense from observations. For the forensic scientist, this is the challenge of interpreting a pattern of observations within the context of a legal trial. Here too, there have been major advances over recent years and there is a broad consensus among serious thinkers, both scientific and legal, that the logical framework is furnished by Bayesian inference (Aitken et al. Fundamentals of Probability and Statistical Evidence in Criminal Proceedings). This paper shows how the paradigm has matured, centred on the notion of the balanced scientist. Progress through the courts has not been always smooth and difficulties arising from recent judgments are discussed. Nevertheless, the future holds exciting prospects, in particular the opportunities for managing and calibrating the knowledge of the forensic scientists who assign the probabilities that are at the foundation of logical inference in the courtroom. PMID:26101288
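The Bayesian paradigm referred to here rests on one standard identity, worth stating explicitly: the scientist assigns the likelihood ratio for the evidence under the prosecution and defence propositions, and the court combines it with the prior odds.

```latex
% Odds form of Bayes' theorem for evidence E under propositions H_p, H_d:
\begin{equation*}
  \underbrace{\frac{\Pr(H_p \mid E)}{\Pr(H_d \mid E)}}_{\text{posterior odds}}
  \;=\;
  \underbrace{\frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)}}_{\text{likelihood ratio}}
  \times
  \underbrace{\frac{\Pr(H_p)}{\Pr(H_d)}}_{\text{prior odds}}
\end{equation*}
```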
On Childhood and the Logic of Difference: Some Empirical Examples
ERIC Educational Resources Information Center
Dahlbeck, Johan
2012-01-01
This article argues that universal documents on children's rights can provide illustrative examples as to how childhood is identified as a unity using difference as an instrument. Using Gilles Deleuze's theorising on difference and sameness as a framework, the article seeks to relate the children's rights project with a critique of representation.…