Sample records for based design rules

  1. Integrated layout based Monte-Carlo simulation for design arc optimization

    NASA Astrophysics Data System (ADS)

    Shao, Dongbing; Clevenger, Larry; Zhuang, Lei; Liebmann, Lars; Wong, Robert; Culp, James

    2016-03-01

    Design rules are created by considering a wafer fail mechanism together with the relevant design levels under various design cases, and the values are set to cover the worst-case scenario. Because of this simplification and generalization, design rules hinder, rather than help, dense device scaling. As an example, SRAM designs always need extensive ground-rule waivers. Furthermore, dense design often involves a "design arc", a collection of design rules whose sum equals the critical pitch defined by the technology. In a design arc, a single rule change can lead to a chain reaction of other rule violations. In this talk we present a methodology using Layout Based Monte-Carlo Simulation (LBMCS) with integrated multiple ground-rule checks. We apply this methodology to an SRAM word line contact, and the result is a layout with balanced wafer fail risks based on Process Assumptions (PAs). This work was performed at the IBM Microelectronics Div., Semiconductor Research and Development Center, Hopewell Junction, NY 12533.
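The core of such a flow can be sketched in a few lines: sample each dimension in the arc under a Gaussian process assumption, and pick the split of the critical pitch that minimizes the combined fail risk. All numbers below (pitch, sigma, minimum values) are invented placeholders, not IBM's actual PAs or ground rules.

```python
import random

# Illustrative numbers only: PITCH, sigma and the minimums are invented
# stand-ins for real process assumptions (PAs).
PITCH = 90.0  # nm; the "design arc": the two spaces must sum to this

def fail_rate(nominal, sigma, min_allowed, n_trials=20000, seed=7):
    """Monte-Carlo estimate of the probability that one drawn dimension,
    perturbed by a Gaussian process assumption, violates its minimum."""
    rng = random.Random(seed)
    fails = sum(rng.gauss(nominal, sigma) < min_allowed
                for _ in range(n_trials))
    return fails / n_trials

def arc_risk(space_a, sigma=2.5, min_a=38.0, min_b=38.0):
    """Total fail risk of both rules in the arc: enlarging one space
    necessarily shrinks the other, so risks must be balanced jointly."""
    space_b = PITCH - space_a
    return fail_rate(space_a, sigma, min_a) + fail_rate(space_b, sigma, min_b)

# Pick the split of the critical pitch that balances the two risks.
best_risk, best_split = min((arc_risk(s), s) for s in range(40, 51))
```

With symmetric minimums the balanced split lands at the midpoint, which is the point of the exercise: a single-rule "worst case" value would push all risk onto the other rule in the arc.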

  2. Investigation of model-based physical design restrictions (Invited Paper)

    NASA Astrophysics Data System (ADS)

    Lucas, Kevin; Baron, Stanislas; Belledent, Jerome; Boone, Robert; Borjon, Amandine; Couderc, Christophe; Patterson, Kyle; Riviere-Cazaux, Lionel; Rody, Yves; Sundermann, Frank; Toublan, Olivier; Trouiller, Yorick; Urbani, Jean-Christophe; Wimmer, Karl

    2005-05-01

    As lithography and other patterning processes become more complex and more non-linear with each generation, the task of defining physical design rules necessarily grows in complexity as well. The goal of physical design rules is to define the boundary separating the physical layout structures that will yield well from those that will not. This is essentially a rule-based pre-silicon guarantee of layout correctness. However, the rapid increase in design rule complexity has created logistical problems for both the design and process functions. Therefore, similar to the semiconductor industry's transition from rule-based to model-based optical proximity correction (OPC) driven by increased patterning complexity, there are evident opportunities to improve physical design restrictions by implementing model-based physical design methods. In this paper we analyze the possible need for, and applications of, model-based physical design restrictions (MBPDR). We first analyze the traditional design rule evolution, development and usage methodologies of semiconductor manufacturers. Next we discuss examples of specific design rule challenges requiring new solution methods in the patterning regime of low-k1 lithography and highly complex RET. We then evaluate possible working strategies for MBPDR in the process development and product design flows, including examples of recent model-based pre-silicon verification techniques. Finally we summarize with a proposed flow and key considerations for MBPDR implementation.

  3. A Rule Based Approach to ISS Interior Volume Control and Layout

    NASA Technical Reports Server (NTRS)

    Peacock, Brian; Maida, Jim; Fitts, David; Dory, Jonathan

    2001-01-01

    Traditional human factors design involves the development of human factors requirements based on a desire to accommodate a certain percentage of the intended user population. As the product is developed, human factors evaluation involves comparison between the resulting design and the specifications. Sometimes performance metrics are involved that allow leniency in the design requirements, given that the human performance result is satisfactory. Clearly such approaches may work, but they give rise to uncertainty and negotiation. An alternative approach is to adopt human factors design rules that articulate, for each design continuum, a range over which there are varying outcome expectations and interactions with other variables, including time. These rules are based on a consensus of human factors specialists, designers, managers and customers. The International Space Station faces exactly this challenge in interior volume control, which is based on anthropometric, performance and subjective preference criteria. This paper describes the traditional approach and then proposes a rule-based alternative. The proposed rules involve spatial, temporal and importance dimensions. If successful, this rule-based concept could be applied to many traditional human factors design variables and could lead to a more effective and efficient contribution of human factors input to the design process.

  4. A novel methodology for building robust design rules by using design based metrology (DBM)

    NASA Astrophysics Data System (ADS)

    Lee, Myeongdong; Choi, Seiryung; Choi, Jinwoo; Kim, Jeahyun; Sung, Hyunju; Yeo, Hyunyoung; Shim, Myoungseob; Jin, Gyoyoung; Chung, Eunseung; Roh, Yonghan

    2013-03-01

    This paper addresses a methodology for building robust design rules by using design based metrology (DBM). The conventional method for building design rules has relied on a simulation tool and a simple-pattern spider mask. At the early stage of a device, the accuracy of the simulation tool is poor, and the evaluation of the simple-pattern spider mask is rather subjective because it depends on the experiential judgment of an engineer. In this work, we designed a large number of pattern situations covering various 1D and 2D design structures. To overcome the difficulty of inspecting many types of patterns, we introduced the Design Based Metrology (DBM) of Nano Geometry Research, Inc., which allowed these mass patterns to be inspected at high speed. We also carried out quantitative analysis on PWQ silicon data to estimate process variability. Our methodology demonstrates high speed and accuracy for building design rules: all test patterns were inspected within a few hours, and the mass silicon data were handled by statistical processing rather than personal judgment. From the results, robust design rules were successfully verified and extracted. We conclude that our methodology is appropriate for building robust design rules.

  5. Bayesian design of decision rules for failure detection

    NASA Technical Reports Server (NTRS)

    Chow, E. Y.; Willsky, A. S.

    1984-01-01

    The formulation of the decision making process of a failure detection algorithm as a Bayes sequential decision problem provides a simple conceptualization of the decision rule design problem. As the optimal Bayes rule is not computable, a methodology that is based on the Bayesian approach and aimed at a reduced computational requirement is developed for designing suboptimal rules. A numerical algorithm is constructed to facilitate the design and performance evaluation of these suboptimal rules. The result of applying this design methodology to an example shows that this approach is potentially a useful one.
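A minimal illustration of the idea, under assumed numbers: the Bayes-optimal stopping rule is not computable in general, but a fixed-threshold rule on the posterior failure probability is a computable suboptimal stand-in. The likelihoods and thresholds below are invented for the sketch, not taken from the paper.

```python
def sequential_failure_detector(observations, p_fail_prior=0.01,
                                lik_fail=0.7, lik_normal=0.1,
                                upper=0.95, lower=0.001):
    """Suboptimal sequential decision rule: after each binary observation
    (1 = failure-like symptom), update the posterior probability that the
    system has failed, and stop when it crosses a fixed threshold. Fixed
    thresholds approximate the Bayes-optimal stopping boundaries."""
    post = p_fail_prior
    for t, y in enumerate(observations, 1):
        l_fail = lik_fail if y else 1.0 - lik_fail
        l_norm = lik_normal if y else 1.0 - lik_normal
        post = post * l_fail / (post * l_fail + (1.0 - post) * l_norm)
        if post >= upper:
            return ("failure", t)   # declare failure at time t
        if post <= lower:
            return ("normal", t)    # declare normal operation
    return ("undecided", len(observations))
```

Performance evaluation then amounts to running this rule over simulated observation streams and tallying detection delay and error rates, which is the kind of numerical study the abstract describes.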

  6. Using pattern enumeration to accelerate process development and ramp yield

    NASA Astrophysics Data System (ADS)

    Zhuang, Linda; Pang, Jenny; Xu, Jessy; Tsai, Mengfeng; Wang, Amy; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh; Ding, Hua

    2016-03-01

    During the setup phase of a new technology node, foundries do not initially have enough product chip designs to conduct exhaustive process development, so different operational teams use manually designed simple test keys to set up their process flows and recipes. When the first version of the design rule manual (DRM) is ready, foundries enter the process development phase, where new experimental design data are manually created based on these design rules. However, these IP/test keys contain very uniform or simple design structures. Such designs normally do not contain critical design structures or process-unfriendly design patterns that pass design rule checks but turn out to be less manufacturable. A method is therefore desired that generates, at the development stage, the exhaustive set of test patterns allowed by the design rules, in order to verify the gap between design rules and process. This paper presents a novel method for generating test key patterns that contain known problematic patterns as well as any construct a designer could possibly draw under the current design rules. The enumerated test key patterns contain the most critical design structures allowed by any particular design rule. A layout profiling method is used to analyze design chips in order to find potential weak points in new incoming products, so the fab can take preemptive action to avoid yield loss. This is achieved by comparing different products and leveraging the knowledge learned from previously manufactured chips to find possible yield detractors.
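As a toy version of rule-driven pattern enumeration, one can exhaustively generate every 1D track pattern a designer could legally draw under minimum-width and minimum-space rules. The window size and rule values below are arbitrary illustrations, not any foundry's DRM values.

```python
from itertools import product

def run_lengths(bits):
    """Collapse a bit sequence into (value, run_length) pairs."""
    runs, cur, n = [], bits[0], 1
    for b in bits[1:]:
        if b == cur:
            n += 1
        else:
            runs.append((cur, n))
            cur, n = b, 1
    runs.append((cur, n))
    return runs

def legal_patterns(width, min_width=2, min_space=2):
    """Enumerate every 1D track pattern (1 = metal, 0 = space) whose
    interior runs satisfy the minimum width/space rules. Boundary runs
    are exempt, as if the pattern continued past the window."""
    out = []
    for bits in product((0, 1), repeat=width):
        runs = run_lengths(list(bits))
        interior = runs[1:-1] if len(runs) > 2 else []
        if all(n >= (min_width if v else min_space) for v, n in interior):
            out.append(bits)
    return out
```

Each surviving pattern is, by construction, allowed by the rules, so running the set through simulation or onto a test mask probes exactly the rule/process gap the abstract describes.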

  7. Rule-based navigation control design for autonomous flight

    NASA Astrophysics Data System (ADS)

    Contreras, Hugo; Bassi, Danilo

    2008-04-01

    This article describes a navigation control system design based on a set of rules for following a desired trajectory. The full aircraft control considered here comprises a low-level stability control loop, based on a classic PID controller, and a higher-level navigation layer whose main job is to exercise lateral (course) control and altitude control while trying to follow the desired trajectory. The rules and PID gains were adjusted systematically according to flight simulation results. In spite of its simplicity, the rule-based navigation control proved to be robust, even under large perturbations such as crosswinds.
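The two-level structure can be sketched as follows: a rule table maps course error to a commanded bank angle, with a generic PID available for the inner stability loop. The gains, error bands, and the simplified turn dynamics are illustrative assumptions, not the paper's tuned values.

```python
class PID:
    """Textbook PID, standing in for the low-level stability loop."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def step(self, error):
        self.integral += error * self.dt
        deriv = 0.0 if self.prev_error is None else \
            (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

def bank_command(course_error_deg):
    """Navigation rules: pick a target bank angle from the course error.
    Coarse bands keep the logic readable and easy to tune in simulation."""
    e = course_error_deg
    sign = 1.0 if e > 0 else -1.0
    if abs(e) < 2:
        return 0.0
    if abs(e) < 10:
        return 5.0 * sign
    if abs(e) < 30:
        return 15.0 * sign
    return 25.0 * sign

def simulate_course(initial_error, steps=200, dt=0.1, turn_gain=0.8):
    """Crude closed loop: assume the inner PID tracks the commanded bank
    perfectly, so the course error shrinks in proportion to the bank."""
    e = initial_error
    for _ in range(steps):
        e -= turn_gain * bank_command(e) * dt
    return e
```

The banded rules mimic how a pilot flies: aggressive bank far from course, gentle corrections near it, and a deadband to avoid hunting.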

  8. Genetic reinforcement learning through symbiotic evolution for fuzzy controller design.

    PubMed

    Juang, C F; Lin, J Y; Lin, C T

    2000-01-01

    An efficient genetic reinforcement learning algorithm for designing fuzzy controllers is proposed in this paper. The genetic algorithm (GA) adopted here is based upon symbiotic evolution which, when applied to fuzzy controller design, complements the local mapping property of a fuzzy rule. Using this Symbiotic-Evolution-based Fuzzy Controller (SEFC) design method, the number of control trials, as well as the CPU time consumed, is considerably reduced compared to traditional GA-based fuzzy controller design methods and other types of genetic reinforcement learning schemes. Moreover, unlike traditional fuzzy controllers, which partition the input space into a grid, SEFC partitions the input space flexibly, thus creating fewer fuzzy rules. SEFC allows different types of fuzzy rules whose consequent parts are singletons, fuzzy sets, or linear equations (TSK-type fuzzy rules). Further, the free parameters (e.g., centers and widths of membership functions) and the fuzzy rules are all tuned automatically. For TSK-type fuzzy rules in particular, the proposed learning algorithm selects only the significant input variables to participate in the consequent of a rule. The proposed SEFC design method has been applied to several simulated control problems, including the cart-pole balancing system, a magnetic levitation system, and a water bath temperature control system. On these problems, and in comparisons with traditional GA-based fuzzy systems, SEFC has been verified to be efficient and superior.
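For readers unfamiliar with TSK-type rules, a minimal inference sketch (Gaussian memberships, weighted-average defuzzification) looks like this. The rule format is generic textbook TSK, not SEFC's evolved representation.

```python
import math

def gaussian(x, center, width):
    """Gaussian membership degree of x in a fuzzy set."""
    return math.exp(-((x - center) / width) ** 2)

def tsk_output(rules, inputs):
    """TSK fuzzy inference. Each rule is (antecedent, coeffs), where the
    antecedent lists one (center, width) per input and coeffs gives the
    linear consequent y = c0 + c1*x1 + c2*x2 + ...  The controller output
    is the firing-strength-weighted average of the rule consequents --
    exactly the quantity a GA would tune the parameters of."""
    num = den = 0.0
    for antecedent, coeffs in rules:
        w = 1.0
        for x, (c, s) in zip(inputs, antecedent):
            w *= gaussian(x, c, s)
        y = coeffs[0] + sum(a * x for a, x in zip(coeffs[1:], inputs))
        num += w * y
        den += w
    return num / den if den else 0.0
```

A learning scheme like the one in the paper searches over the centers, widths, and consequent coefficients; the inference step itself stays this simple.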

  9. 77 FR 65037 - Self-Regulatory Organizations; C2 Options Exchange, Incorporated; Order Approving a Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... Organizations; C2 Options Exchange, Incorporated; Order Approving a Proposed Rule Change To Adopt a Designated... thereunder,\\2\\ a proposed rule change to adopt a Designated Primary Market-Maker (``DPM'') program. The... the Notice, C2 has proposed to adopt a DPM program. The associated proposed rules are based on the...

  10. Optimal Test Design with Rule-Based Item Generation

    ERIC Educational Resources Information Center

    Geerlings, Hanneke; van der Linden, Wim J.; Glas, Cees A. W.

    2013-01-01

    Optimal test-design methods are applied to rule-based item generation. Three different cases of automated test design are presented: (a) test assembly from a pool of pregenerated, calibrated items; (b) test generation on the fly from a pool of calibrated item families; and (c) test generation on the fly directly from calibrated features defining…

  11. 50 CFR 424.16 - Proposed rules.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A LISTING ENDANGERED AND THREATENED SPECIES AND DESIGNATING CRITICAL HABITAT Revision of the Lists § 424.16 Proposed rules. (a) General. Based... any proposed rule to list, delist, or reclassify a species, or to designate or revise critical habitat...

  12. Redundancy checking algorithms based on parallel novel extension rule

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Yang, Yang; Li, Guangli; Wang, Qi; Lü, Shuai

    2017-05-01

    Redundancy checking (RC) is a key knowledge reduction technology. The extension rule (ER) is a reasoning method first presented in 2003 that has been well received by researchers in the field. The novel extension rule (NER), presented in 2009, is an improved ER-based reasoning method. In this paper, we first analyse the characteristics of the extension rule and then present a simple algorithm for redundancy checking based on the extension rule (RCER). In addition, we introduce MIMF, a type of heuristic strategy, and use it with RCER to design and implement the RCHER algorithm. Next we design and implement RCNER (redundancy checking based on NER). Parallel computing greatly accelerates the NER algorithm, whose tasks have weak interdependence when executed. Considering this, we present PNER (parallel NER), apply it to redundancy checking and necessity checking, and design and implement the RCPNER (redundancy checking based on PNER) and NCPNER (necessary clause partition based on PNER) algorithms. The experimental results show that MIMF significantly accelerates RCER on large-scale, highly redundant formulae. Comparing PNER with NER and RCPNER with RCNER, the average speedup can reach the number of task decompositions. Comparing NCPNER with the RCNER-based algorithm for separating redundant formulae, the speedup increases steadily as the scale of the formulae grows. Finally, we describe the challenges the extension rule will face and suggest possible solutions.
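The notion of clause redundancy the paper builds on can be shown with a brute-force sketch: a clause is redundant when the rest of the formula entails it. This uses exhaustive truth-table checking for tiny CNF formulae (literals as signed integers), not the ER/NER machinery itself.

```python
from itertools import product

def satisfies(assignment, clause):
    """A clause (list of signed ints) is satisfied if any literal is true."""
    return any(assignment[abs(l)] == (l > 0) for l in clause)

def entails(cnf, clause, variables):
    """F |= C iff no assignment satisfies F while falsifying C.
    Brute force over all assignments -- fine for illustration only."""
    for bits in product((False, True), repeat=len(variables)):
        a = dict(zip(variables, bits))
        if all(satisfies(a, c) for c in cnf) and not satisfies(a, clause):
            return False
    return True

def redundant_clauses(cnf):
    """Return the clauses entailed by the rest of the formula; removing
    any one of them leaves the formula logically unchanged."""
    variables = sorted({abs(l) for c in cnf for l in c})
    return [c for c in cnf
            if entails([d for d in cnf if d != c], c, variables)]
```

The point of ER/NER-based algorithms (and their parallel variants) is to answer exactly this entailment question far more efficiently than the exponential sweep above.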

  13. Model-based approach for design verification and co-optimization of catastrophic and parametric-related defects due to systematic manufacturing variations

    NASA Astrophysics Data System (ADS)

    Perry, Dan; Nakamoto, Mark; Verghese, Nishath; Hurat, Philippe; Rouse, Rich

    2007-03-01

    Model-based hotspot detection and silicon-aware parametric analysis help designers optimize their chips for yield, area and performance without the high cost of applying foundries' recommended design rules. This set of DFM/recommended rules is primarily litho-driven, but it cannot guarantee a manufacturable design without imposing overly restrictive design requirements. This rule-based methodology of making design decisions based on idealized polygons that no longer represent what is on silicon needs to be replaced. Model-based simulation of the lithography, OPC, RET and etch effects, followed by electrical evaluation of the resulting shapes, leads to a more realistic and accurate analysis, which can be used to evaluate intelligent design trade-offs and identify potential failures due to systematic manufacturing defects during the design phase. The successful DFM design methodology consists of three parts: (1) Achieve a more aggressive layout through limited usage of litho-related recommended design rules: a 10% to 15% area reduction is achieved by using more aggressive design rules, and DFM/recommended design rules are used only if there is no impact on cell size. (2) Identify and fix hotspots using a model-based layout printability checker: model-based litho and etch simulation are done at the cell level to identify hotspots, and violations of recommended rules that cause additional hotspots are fixed. (3) Improve timing accuracy with a process-aware parametric analysis tool for transistors and interconnect, using contours of the diffusion, poly and metal layers. In this paper, we show the results of this physical and electrical DFM methodology at Qualcomm, and describe how Qualcomm was able to develop more aggressive cell designs that yielded a 10% to 15% area reduction using this methodology.
Model-based shape simulation was employed during library development to validate architecture choices and to optimize cell layout. At the physical verification stage, the shape simulator was run at full-chip level to identify and fix residual hotspots on interconnect layers, on poly or metal 1 due to interaction between adjacent cells, or on metal 1 due to interaction between routing (via and via cover) and cell geometry. To determine an appropriate electrical DFM solution, Qualcomm developed an experiment to examine various electrical effects. After reporting the silicon results of this experiment, which showed sizeable delay variations due to lithography-related systematic effects, we also explain how contours of diffusion, poly and metal can be used for silicon-aware parametric analysis of transistors and interconnect at the cell-, block- and chip-level.

  14. Attribute index and uniform design based multiobjective association rule mining with evolutionary algorithm.

    PubMed

    Zhang, Jie; Wang, Yuping; Feng, Junhong

    2013-01-01

    In association rule mining, evaluating an association rule requires repeatedly scanning the database to compare the whole database with the antecedent, the consequent, and the whole rule. To decrease the number of comparisons and the time consumed, we present an attribute index strategy. It needs to scan the database only once, to create an index for each attribute; all metric values used to evaluate an association rule are then obtained from the attribute indices, without any further database scans. The paper treats association rule mining as a multiobjective problem rather than a single-objective one. To make the acquired solutions scatter uniformly toward the Pareto frontier in the objective space, an elitism policy and uniform design are introduced. The paper presents the algorithm of attribute index and uniform design based multiobjective association rule mining with an evolutionary algorithm, abbreviated as IUARMMEA. It no longer requires user-specified minimum support and minimum confidence, but uses a simple attribute index, and it employs a well-designed real encoding to extend its application scope. Experiments performed on several databases demonstrate that the proposed algorithm has excellent performance and can significantly reduce the number of comparisons and the time consumed.
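The attribute-index idea is easy to sketch: one scan builds a transaction-id set per attribute, after which support and confidence come from set intersections alone, with no further database scans. Variable names are illustrative; this is not the IUARMMEA implementation.

```python
def build_attribute_index(transactions):
    """One scan: map each attribute to the set of transaction ids
    containing it. All later metric evaluations use these sets."""
    index = {}
    for tid, items in enumerate(transactions):
        for item in items:
            index.setdefault(item, set()).add(tid)
    return index

def support_confidence(index, antecedent, consequent, n_transactions):
    """Evaluate a rule antecedent -> consequent (non-empty item lists)
    purely from the index: intersect the id-sets instead of rescanning."""
    cover = set.intersection(*(index.get(a, set()) for a in antecedent))
    both = cover & set.intersection(*(index.get(c, set()) for c in consequent))
    support = len(both) / n_transactions
    confidence = len(both) / len(cover) if cover else 0.0
    return support, confidence
```

An evolutionary loop can now evaluate thousands of candidate rules per second, since each fitness call is a handful of set operations rather than a database pass.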

  15. Attribute Index and Uniform Design Based Multiobjective Association Rule Mining with Evolutionary Algorithm

    PubMed Central

    Wang, Yuping; Feng, Junhong

    2013-01-01

    In association rule mining, evaluating an association rule requires repeatedly scanning the database to compare the whole database with the antecedent, the consequent, and the whole rule. To decrease the number of comparisons and the time consumed, we present an attribute index strategy. It needs to scan the database only once, to create an index for each attribute; all metric values used to evaluate an association rule are then obtained from the attribute indices, without any further database scans. The paper treats association rule mining as a multiobjective problem rather than a single-objective one. To make the acquired solutions scatter uniformly toward the Pareto frontier in the objective space, an elitism policy and uniform design are introduced. The paper presents the algorithm of attribute index and uniform design based multiobjective association rule mining with an evolutionary algorithm, abbreviated as IUARMMEA. It no longer requires user-specified minimum support and minimum confidence, but uses a simple attribute index, and it employs a well-designed real encoding to extend its application scope. Experiments performed on several databases demonstrate that the proposed algorithm has excellent performance and can significantly reduce the number of comparisons and the time consumed. PMID:23766683

  16. A hybrid intelligence approach to artifact recognition in digital publishing

    NASA Astrophysics Data System (ADS)

    Vega-Riveros, J. Fernando; Santos Villalobos, Hector J.

    2006-02-01

    The system presented integrates rule-based and case-based reasoning for artifact recognition in digital publishing. In Variable Data Printing (VDP), human proofing can be prohibitive, since a job could contain millions of different instances exhibiting two types of artifacts: 1) evident defects, like text overflow or overlapping; and 2) style-dependent artifacts, subtle defects that show up as inconsistencies with the original job design. We designed a knowledge-based artifact recognition tool for document segmentation, layout understanding, artifact detection, and document design quality assessment. Document evaluation is constrained by reference to one instance of the VDP job proofed by a human expert against the remaining instances. Fundamental rules of document design are used in the rule-based component for document segmentation and layout understanding. Ambiguities in the design principles not covered by the rule-based system are analyzed by case-based reasoning using the nearest neighbor algorithm, where features from previous jobs are used to detect artifacts and inconsistencies within the document layout. We used a subset of XSL-FO and assembled a set of 44 document samples. The system detected all the job layout changes, obtaining an overall average accuracy of 84.56%, with the highest accuracy, 92.82%, for overlapping and the lowest, 66.7%, for lack of white space.
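A compressed sketch of the hybrid: rules catch the evident defects, and a 1-NN lookup over previously proofed cases handles the subtle, style-dependent ones. The feature keys and thresholds below are invented for illustration, not the paper's actual feature set.

```python
import math

def classify_region(region, cases):
    """Hybrid check for one layout region. Rule-based screening catches
    evident defects first; otherwise the nearest previously-proofed case
    (1-NN, Euclidean distance over a small feature vector) decides.
    `cases` is a list of (feature_vector, label) pairs."""
    # Rule-based component: evident defects.
    if region["text_height"] > region["box_height"]:
        return "text-overflow"
    if region["overlap_area"] > 0:
        return "overlapping"
    # Case-based component: style-dependent artifacts.
    vec = (region["whitespace_ratio"], region["font_scale"])
    return min(cases, key=lambda c: math.dist(c[0], vec))[1]
```

The division of labor mirrors the abstract: hard layout rules need no precedent, while subtle style judgments lean on features from jobs a human has already proofed.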

  17. Biology Teachers Designing Context-Based Lessons for Their Classroom Practice--The Importance of Rules-of-Thumb

    ERIC Educational Resources Information Center

    Wieringa, Nienke; Janssen, Fred J. J. M.; Van Driel, Jan H.

    2011-01-01

    In science education in the Netherlands new, context-based, curricula are being developed. As in any innovation, the outcome will largely depend on the teachers who design and implement lessons. Central to the study presented here is the idea that teachers, when designing lessons, use rules-of-thumb: notions of what a lesson should look like if…

  18. A Hybrid Genetic Programming Algorithm for Automated Design of Dispatching Rules.

    PubMed

    Nguyen, Su; Mei, Yi; Xue, Bing; Zhang, Mengjie

    2018-06-04

    Designing effective dispatching rules for production systems is a difficult and time-consuming task when done manually. In the last decade, the growth of computing power, advanced machine learning, and optimisation techniques has made the automated design of dispatching rules possible, and automatically discovered rules are competitive with, or outperform, existing rules developed by researchers. Genetic programming is one of the most popular approaches to discovering dispatching rules in the literature, especially for complex production systems. However, the large heuristic search space may prevent genetic programming from finding near-optimal dispatching rules. This paper develops a new hybrid genetic programming algorithm for dynamic job shop scheduling based on a new representation, a new local search heuristic, and efficient fitness evaluators. Experiments show that the new method is effective regarding the quality of evolved rules. Moreover, the evolved rules are also significantly smaller and contain more relevant attributes.
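Dispatching rules of the kind genetic programming evolves are just priority functions over waiting jobs. A minimal fitness evaluator dispatches jobs on a single machine and scores mean flowtime; this sketch is far simpler than the paper's dynamic job shop, but the evaluation pattern is the same.

```python
def simulate(jobs, rule):
    """Single-machine dispatch. jobs = [(release_time, processing_time)];
    `rule` is a priority function over waiting jobs (lowest value first),
    i.e. the kind of expression genetic programming would evolve.
    Returns mean flowtime (completion - release, averaged)."""
    pending = sorted(jobs)      # by release time
    waiting, flowtimes, t = [], [], 0
    while pending or waiting:
        while pending and pending[0][0] <= t:
            waiting.append(pending.pop(0))
        if not waiting:
            t = pending[0][0]   # idle until the next release
            continue
        job = min(waiting, key=rule)
        waiting.remove(job)
        t += job[1]
        flowtimes.append(t - job[0])
    return sum(flowtimes) / len(flowtimes)

spt = lambda job: job[1]    # shortest processing time first
fifo = lambda job: job[0]   # earliest release first
```

A GP system would generate `rule` candidates as expression trees over job attributes and use `simulate` (over many sampled job sets) as the fitness function.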

  19. META II Complex Systems Design and Analysis (CODA)

    DTIC Science & Technology

    2011-08-01

    (Only table-of-contents and list-of-figures fragments of this report survived text extraction, including: 3.8.7 "Variables, Parameters and Constraints"; 3.8.8 "Objective"; Figure 7, "Inputs, States, Outputs and Parameters of System Requirements Specifications"; "Design Rule Based on Device Parameter"; and Figure 35, "AEE Device Design Rules (excerpt)".)

  20. Monitoring Agents for Assisting NASA Engineers with Shuttle Ground Processing

    NASA Technical Reports Server (NTRS)

    Semmel, Glenn S.; Davis, Steven R.; Leucht, Kurt W.; Rowe, Danil A.; Smith, Kevin E.; Boeloeni, Ladislau

    2005-01-01

    The Spaceport Processing Systems Branch at NASA Kennedy Space Center has designed, developed, and deployed a rule-based agent to monitor the Space Shuttle's ground processing telemetry stream. The NASA Engineering Shuttle Telemetry Agent increases situational awareness for system and hardware engineers during ground processing of the Shuttle's subsystems. The agent provides autonomous monitoring of the telemetry stream and automatically alerts system engineers when user-defined conditions are satisfied. Efficiency and safety are improved through increased automation. Sandia National Labs' Java Expert System Shell is employed as the agent's rule engine. The shell's predicate logic lends itself well to capturing the heuristics and specifying the engineering rules within this domain. The declarative paradigm of the rule-based agent yields a highly modular and scalable design spanning multiple subsystems of the Shuttle. Several hundred monitoring rules have been written thus far, with corresponding notifications sent to Shuttle engineers. This chapter discusses the rule-based telemetry agent used for Space Shuttle ground processing. We present the problem domain along with design and development considerations such as information modeling, knowledge capture, and deployment of the product. We also present ongoing work with other condition-monitoring agents.
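The declarative pattern is easy to illustrate outside Jess: rules are (name, condition) pairs evaluated against each telemetry measurement, with alerts accumulated for notification. The parameter names and limits below are invented placeholders, not actual Shuttle monitoring rules.

```python
RULES = [
    # Illustrative rules only; the real agent's rules live in Jess.
    ("LOX tank pressure high",
     lambda m: m.get("lox_tank_psi", 0) > 250),
    ("Hydraulic temp out of range",
     lambda m: not (40 <= m.get("hyd_temp_f", 70) <= 140)),
]

def monitor(stream, rules=RULES):
    """Evaluate every rule against every (timestamp, measurement) pair in
    the telemetry stream; return (timestamp, rule_name) alerts. Adding a
    subsystem means appending rules, not touching this loop -- the
    modularity the declarative paradigm buys."""
    alerts = []
    for ts, measurement in stream:
        for name, condition in rules:
            if condition(measurement):
                alerts.append((ts, name))
    return alerts
```

A production rule engine adds pattern matching, working memory, and conflict resolution on top of this loop, but the authoring model (independent condition/action rules) is the same.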

  1. Design of Composite Structures Using Knowledge-Based and Case Based Reasoning

    NASA Technical Reports Server (NTRS)

    Lambright, Jonathan Paul

    1996-01-01

    A method of using knowledge-based and case-based reasoning to assist designers during conceptual design tasks of composite structures was proposed. The cooperative use of heuristics, procedural knowledge, and previous similar design cases suggests a potential reduction in design cycle time and, ultimately, product lead time. The hypothesis of this work is that the design process of composite structures can be improved by using Case-Based Reasoning (CBR) and Knowledge-Based (KB) reasoning in the early design stages. The technique of using knowledge-based and case-based reasoning facilitates the gathering of disparate information into one location that is easily and readily available. The method suggests that including downstream life-cycle issues in the conceptual design phase reduces the potential for defective and sub-optimal composite structures. Three industry experts were interviewed extensively; they provided design rules, previous design cases, and test problems. A knowledge-based reasoning system was developed using the CLIPS (C Language Integrated Production System) environment, and a case-based reasoning system was developed using the Design Memory Utility For Sharing Experiences (MUSE) environment. A Design Characteristic State (DCS) was used to document the design specifications, constraints, and problem areas using attribute-value pair relationships. The DCS provided consistent design information between the knowledge base and case base. Results indicated that the use of knowledge-based and case-based reasoning provided a robust design environment for composite structures. The knowledge base provided design guidance from well-defined rules and procedural knowledge. The case base provided suggestions on design and manufacturing techniques based on previous similar designs, along with warnings of potential problems and pitfalls.
The case base complemented the knowledge base and extended the problem solving capability beyond the existence of limited well defined rules. The findings indicated that the technique is most effective when used as a design aid and not as a tool to totally automate the composites design process. Other areas of application and implications for future research are discussed.

  2. Explanation-based learning in infancy.

    PubMed

    Baillargeon, Renée; DeJong, Gerald F

    2017-10-01

    In explanation-based learning (EBL), domain knowledge is leveraged in order to learn general rules from few examples. An explanation is constructed for initial exemplars and is then generalized into a candidate rule that uses only the relevant features specified in the explanation; if the rule proves accurate for a few additional exemplars, it is adopted. EBL is thus highly efficient because it combines both analytic and empirical evidence. EBL has been proposed as one of the mechanisms that help infants acquire and revise their physical rules. To evaluate this proposal, 11- and 12-month-olds (n = 260) were taught to replace their current support rule (that an object is stable when half or more of its bottom surface is supported) with a more sophisticated rule (that an object is stable when half or more of the entire object is supported). Infants saw teaching events in which asymmetrical objects were placed on a base, followed by static test displays involving a novel asymmetrical object and a novel base. When the teaching events were designed to facilitate EBL, infants learned the new rule with as few as two (12-month-olds) or three (11-month-olds) exemplars. When the teaching events were designed to impede EBL, however, infants failed to learn the rule. Together, these results demonstrate that even infants, with their limited knowledge about the world, benefit from the knowledge-based approach of EBL.
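The two support rules contrasted in the experiments can be written down directly. The slab-based geometry below is an illustrative simplification (an object as equal-height horizontal slabs, each an (x0, x1) span), not the actual stimuli shown to the infants.

```python
def overlap(a, b):
    """Length of horizontal overlap between spans a and b."""
    return max(0.0, min(a[1], b[1]) - max(a[0], b[0]))

def stable_old(slabs, base):
    """Infants' current rule: stable iff at least half of the object's
    bottom surface rests on the base."""
    bottom = slabs[0]
    return overlap(bottom, base) >= 0.5 * (bottom[1] - bottom[0])

def stable_new(slabs, base):
    """The more sophisticated rule: stable iff at least half of the
    entire object (total slab area; equal slab heights assumed) lies
    over the base."""
    total = sum(x1 - x0 for x0, x1 in slabs)
    over = sum(overlap(s, base) for s in slabs)
    return over >= 0.5 * total
```

An asymmetrical, top-heavy object is exactly the case where the two rules disagree: its bottom can be fully supported while most of its mass hangs past the base.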

  3. 77 FR 21161 - National Forest System Land Management Planning

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-09

    ... ecosystem services and multiple uses. The planning rule is designed to ensure that plans provide for the... adaptive and science-based, engages the public, and is designed to be efficient, effective, and within the..., the new rule is designed to make planning more efficient and effective. Purpose and Need for the New...

  4. 50 CFR 424.16 - Proposed rules.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A LISTING ENDANGERED AND THREATENED SPECIES AND DESIGNATING CRITICAL HABITAT Revision of the Lists § 424.16 Proposed rules. (a) General. Based..., delisting, or reclassification of a species or the designation or revision of critical habitat will also...

  5. A CLIPS-based tool for aircraft pilot-vehicle interface design

    NASA Technical Reports Server (NTRS)

    Fowler, Thomas D.; Rogers, Steven P.

    1991-01-01

    The Pilot-Vehicle Interface of modern aircraft is the cognitive, sensory, and psychomotor link between the pilot, the avionics modules, and all other systems on board the aircraft. To assist pilot-vehicle interface designers, a C Language Integrated Production System (CLIPS) based tool was developed that allows design information to be stored in a table that can be modified by rules representing design knowledge. Developed for the Apple Macintosh, the tool allows users without any CLIPS programming experience to form simple rules using a point and click interface.
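A rough analogue of rules modifying a design table, in ordinary code rather than CLIPS: fire (condition, action) pairs over table rows until nothing changes. The field names are invented for illustration, not the tool's actual schema.

```python
def apply_rules(table, rules):
    """Tiny forward-chaining pass in the spirit of CLIPS: each rule is a
    (condition, action) pair over one row of the design table; keep
    firing until a full pass changes nothing (quiescence). Rules whose
    actions re-enable their own conditions would loop, so actions should
    make their conditions false, as production rules normally do."""
    changed = True
    while changed:
        changed = False
        for row in table:
            for condition, action in rules:
                if condition(row):
                    before = dict(row)
                    action(row)
                    if row != before:
                        changed = True
    return table
```

A point-and-click front end like the one described would build these (condition, action) pairs from form fields, so users never write the rule syntax themselves.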

  6. 50 CFR 424.16 - Proposed rules.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A LISTING ENDANGERED AND THREATENED SPECIES AND DESIGNATING CRITICAL HABITAT Revision of the Lists § 424.16 Proposed rules. (a) General. Based... reclassification of a species or the designation or revision of critical habitat shall also include a summary of...

  7. Design rules for successful governmental payments for ecosystem services: Taking agri-environmental measures in Germany as an example.

    PubMed

    Meyer, Claas; Reutter, Michaela; Matzdorf, Bettina; Sattler, Claudia; Schomers, Sarah

    2015-07-01

    In recent years, increasing attention has been paid to financial environmental policy instruments that have played important roles in solving agri-environmental problems throughout the world, particularly in the European Union and the United States. The ample and increasing literature on Payments for Ecosystem Services (PES) and agri-environmental measures (AEMs), generally understood as governmental PES, shows that certain single design rules may have an impact on the success of a particular measure. Based on this research, we focused on the interplay of several design rules and conducted a comparative analysis of AEMs' institutional arrangements by examining 49 German cases. We analyzed the effects of the design rules and certain rule combinations on the success of AEMs. Compliance and noncompliance with the hypothesized design rules and the success of the AEMs were surveyed by questioning the responsible agricultural administration and the AEMs' mid-term evaluators. The different rules were evaluated in regard to their necessity and sufficiency for success using Qualitative Comparative Analysis (QCA). Our results show that combinations of certain design rules such as environmental goal targeting and area targeting conditioned the success of the AEMs. Hence, we generalize design principles for AEMs and discuss implications for the general advancement of ecosystem services and the PES approach in agri-environmental policies. Moreover, we highlight the relevance of the results for governmental PES program research and design worldwide. Copyright © 2015 Elsevier Ltd. All rights reserved.
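The necessity and sufficiency tests that QCA applies to each design rule reduce to simple set-membership ratios. A minimal crisp-set sketch with invented cases (the paper's 49 German cases are not reproduced here):

```python
# Crisp-set QCA consistency scores: how necessary and how sufficient a
# design rule (e.g. "environmental goal targeting") is for AEM success.
# Cases and values are hypothetical.

def sufficiency_consistency(cases, condition, outcome):
    """Share of cases with the condition that also show the outcome."""
    with_cond = [c for c in cases if c[condition]]
    return sum(c[outcome] for c in with_cond) / len(with_cond)

def necessity_consistency(cases, condition, outcome):
    """Share of cases with the outcome that also show the condition."""
    with_out = [c for c in cases if c[outcome]]
    return sum(c[condition] for c in with_out) / len(with_out)

cases = [  # invented AEM cases: rule compliance (1/0) and success (1/0)
    {"goal_targeting": 1, "area_targeting": 1, "success": 1},
    {"goal_targeting": 1, "area_targeting": 0, "success": 1},
    {"goal_targeting": 0, "area_targeting": 1, "success": 0},
    {"goal_targeting": 1, "area_targeting": 1, "success": 1},
    {"goal_targeting": 0, "area_targeting": 0, "success": 0},
]

print(sufficiency_consistency(cases, "goal_targeting", "success"))  # 1.0
print(necessity_consistency(cases, "goal_targeting", "success"))    # 1.0
```

A consistency of 1.0 in both directions would mark the rule as both necessary and sufficient in this toy data; real QCA applies thresholds below 1.0.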

  8. CD volume design and verification

    NASA Technical Reports Server (NTRS)

    Li, Y. P.; Hughes, J. S.

    1993-01-01

In this paper, we describe a prototype for CD-ROM volume design and verification. This prototype allows users to create their own model of CD volumes by modifying a prototypical model. Rule-based verification of the test volumes can then be performed against the volume definition. This working prototype has proven the concept of model-driven, rule-based design and verification for large quantities of data. The model defined for the CD-ROM volumes becomes a data model as well as an executable specification.

  9. A rule-based, dose-finding design for use in stroke rehabilitation research: methodological development.

    PubMed

    Colucci, E; Clark, A; Lang, C E; Pomeroy, V M

    2017-12-01

    Dose-optimisation studies as precursors to clinical trials are rare in stroke rehabilitation. To develop a rule-based, dose-finding design for stroke rehabilitation research. 3+3 rule-based, dose-finding study. Dose escalation/de-escalation was undertaken according to preset rules and a mathematical sequence (modified Fibonacci sequence). The target starting daily dose was 50 repetitions for the first cohort. Adherence was recorded by an electronic counter. At the end of the 2-week training period, the adherence record indicated dose tolerability (adherence to target dose) and the outcome measure indicated dose benefit (10% increase in motor function). The preset increment/decrease and checking rules were then applied to set the dose for the subsequent cohort. The process was repeated until preset stopping rules were met. Participants had a mean age of 68 (range 48 to 81) years, and were a mean of 70 (range 9 to 289) months post stroke with moderate upper limb paresis. A custom-built model of exercise-based training to enhance ability to open the paretic hand. Repetitions per minute of extension/flexion of paretic digits against resistance. Usability of the preset rules and whether the maximally tolerated dose was identifiable. Five cohorts of three participants were involved. Discernibly different doses were set for each subsequent cohort (i.e. 50, 100, 167, 251 and 209 repetitions/day). The maximally tolerated dose for the model training task was 209 repetitions/day. This dose-finding design is a feasible method for use in stroke rehabilitation research. Copyright © 2017 Chartered Society of Physiotherapy. All rights reserved.
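The cohort doses reported above (50, 100, 167, 251, 209 repetitions/day) are consistent with classic modified-Fibonacci escalation and midpoint de-escalation. The sketch below reproduces that arithmetic under those assumptions, which are ours; the study's exact preset rules are not given in the abstract.

```python
# Sketch of 3+3-style dose setting: escalate by modified-Fibonacci
# multipliers while the dose is tolerated; on intolerance, de-escalate
# to the midpoint of the last two doses. Multipliers and the midpoint
# rule are assumptions that happen to reproduce the reported doses.

MULTIPLIERS = [2.0, 1.67, 1.5]  # classic modified-Fibonacci start

def half_up(x):
    """Round half up (avoids Python's banker's rounding)."""
    return int(x + 0.5)

def next_dose(doses, tolerated):
    if tolerated:
        step = MULTIPLIERS[min(len(doses) - 1, len(MULTIPLIERS) - 1)]
        return half_up(doses[-1] * step)
    return half_up((doses[-1] + doses[-2]) / 2)

doses = [50]
for tolerated in (True, True, True, False):
    doses.append(next_dose(doses, tolerated))

print(doses)  # [50, 100, 167, 251, 209]
```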

  10. Automatic de-identification of French clinical records: comparison of rule-based and machine-learning approaches.

    PubMed

    Grouin, Cyril; Zweigenbaum, Pierre

    2013-01-01

In this paper, we present a comparison of two approaches to automatically de-identify medical records written in French: a rule-based system and a machine-learning-based system using a conditional random fields (CRF) formalism. Both systems have been designed to process nine identifiers in a corpus of medical records in cardiology. We performed two evaluations: first on 62 documents in cardiology, and second on 10 documents in foetopathology - produced by optical character recognition (OCR) - to evaluate the robustness of our systems. We achieved a 0.843 (rule-based) and 0.883 (machine-learning) exact-match overall F-measure in cardiology. While the rule-based system allowed us to achieve good results on nominative (first and last names) and numerical data (dates, phone numbers, and zip codes), the machine-learning approach performed best on more complex categories (postal addresses, hospital names, medical devices, and towns). On the foetopathology corpus, although our systems had not been designed for it and despite OCR errors, we obtained promising results: a 0.681 (rule-based) and 0.638 (machine-learning) exact-match overall F-measure. This demonstrates that existing tools can be applied to process new documents of lower quality.
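The exact-match F-measure used to score both systems counts a predicted entity only if its span and category both match the gold annotation. A minimal sketch with toy data, not the paper's corpus:

```python
# Exact-match F-measure over de-identification entities, represented as
# (start, end, category) triples. Toy gold and predicted sets.

def f_measure(gold, predicted):
    tp = len(gold & predicted)       # exact span + category matches
    precision = tp / len(predicted)
    recall = tp / len(gold)
    return 2 * precision * recall / (precision + recall)

gold = {(0, 6, "NAME"), (10, 20, "DATE"), (25, 30, "ZIP")}
pred = {(0, 6, "NAME"), (10, 20, "DATE"), (40, 45, "TOWN")}

print(round(f_measure(gold, pred), 3))  # 0.667
```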

  11. Research on key technology of the verification system of steel rule based on vision measurement

    NASA Astrophysics Data System (ADS)

    Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun

    2018-01-01

The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, suffers from low precision and low efficiency. A machine-vision-based verification system for steel rules was designed with reference to JJG 1-1999, Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method for the pixel equivalent and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measurement results strongly prove that these methods not only meet the precision required by the verification regulation, but also improve the reliability and efficiency of the verification system.

  12. Process Materialization Using Templates and Rules to Design Flexible Process Models

    NASA Astrophysics Data System (ADS)

    Kumar, Akhil; Yao, Wen

The main idea in this paper is to show how flexible processes can be designed by combining generic process templates and business rules. We instantiate a process by applying rules to specific case data and running a materialization algorithm. The customized process instance is then executed in an existing workflow engine. We present an architecture and also give an algorithm for process materialization. The rules are written in a logic-based language like Prolog. Our focus is on capturing deeper process knowledge and achieving a holistic approach to robust process design that encompasses control flow, resources and data, as well as making it easier to accommodate changes to business policy.
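The materialization step, rules applied to case data that prune a generic template into a concrete instance, can be sketched in a few lines. The template, rule, and case data below are invented; the paper's rules are in a Prolog-like language, not Python.

```python
# Process materialization in miniature: rules inspect case data and
# decide which optional tasks of a generic template to drop. The
# order-handling example is hypothetical.

TEMPLATE = ["receive_order", "credit_check", "ship", "invoice"]

RULES = [  # (condition on case data, task to drop when it holds)
    (lambda case: case["amount"] < 1000, "credit_check"),
]

def materialize(template, case):
    drops = {task for cond, task in RULES if cond(case)}
    return [t for t in template if t not in drops]

print(materialize(TEMPLATE, {"amount": 250}))
# ['receive_order', 'ship', 'invoice']
```

A large order keeps the full template; the customized instance is what would then be handed to the workflow engine.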

  13. Simulation-Based Rule Generation Considering Readability

    PubMed Central

    Yahagi, H.; Shimizu, S.; Ogata, T.; Hara, T.; Ota, J.

    2015-01-01

A rule generation method is proposed for an aircraft control problem at an airport. Designing appropriate rules for the motion coordination of taxiing aircraft, a task conducted by ground control, is important. However, previous studies did not consider the readability of the rules, which matters because the rules must be operated and maintained by humans. Therefore, in this study, using an indicator of readability, we propose a rule generation method based on parallel algorithm discovery and orchestration (PADO). Applied to the aircraft control problem, the proposed algorithm generates more readable and more robust rules and is found to be superior to previous methods. PMID:27347501

  14. Architecture For The Optimization Of A Machining Process In Real Time Through Rule-Based Expert System

    NASA Astrophysics Data System (ADS)

    Serrano, Rafael; González, Luis Carlos; Martín, Francisco Jesús

    2009-11-01

Under the project SENSOR-IA, which received financial funding from the Order of Incentives to the Regional Technology Centers of the Council of Innovation, Science and Enterprise of Andalusia, an architecture for the real-time optimization of a machining process through a rule-based expert system has been developed. The architecture consists of a sensor data acquisition and processing engine (SATD) and a rule-based expert system (SE) that communicates with the SATD. The SE has been designed as an inference engine with an algorithm for effective action, using a modus ponens rule model of goal-oriented rules. The pilot test demonstrated that it is possible to govern the machining process in real time based on rules contained in an SE. The tests have been done with approximate rules. Future work includes an exhaustive collection of data with different tool materials and geometries in a database to extract more precise rules.
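A goal-oriented modus ponens loop of the kind the abstract describes fires rules whose antecedents all hold, asserting new facts until the goal is derived. The rules below are illustrative, not the SENSOR-IA machining rules.

```python
# Minimal forward-chaining modus ponens: each rule is (antecedent facts,
# consequent fact); rules fire until the goal appears or nothing changes.
# Hypothetical machining facts for illustration.

RULES = [
    ({"vibration_high", "temp_high"}, "tool_wear_suspected"),
    ({"tool_wear_suspected"}, "reduce_feed_rate"),
]

def infer(facts, goal):
    facts = set(facts)
    changed = True
    while changed and goal not in facts:
        changed = False
        for antecedents, consequent in RULES:
            if antecedents <= facts and consequent not in facts:
                facts.add(consequent)  # modus ponens: assert consequent
                changed = True
    return goal in facts

print(infer({"vibration_high", "temp_high"}, "reduce_feed_rate"))  # True
```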

  15. 50 CFR 424.16 - Proposed rules.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A LISTING ENDANGERED AND THREATENED SPECIES AND DESIGNATING CRITICAL HABITAT Revision of the Lists § 424.16 Proposed rules. (a) General. Based...—(1) Notifications. In the case of any proposed rule to list, delist, or reclassify a species, or to...

  16. Automating the design of scientific computing software

    NASA Technical Reports Server (NTRS)

    Kant, Elaine

    1992-01-01

    SINAPSE is a domain-specific software design system that generates code from specifications of equations and algorithm methods. This paper describes the system's design techniques (planning in a space of knowledge-based refinement and optimization rules), user interaction style (user has option to control decision making), and representation of knowledge (rules and objects). It also summarizes how the system knowledge has evolved over time and suggests some issues in building software design systems to facilitate reuse.

  17. 76 FR 51442 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing of Proposed Rule Change To List...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-18

    ...-Adviser has designed the following quantitative stock selection rules to make allocation decisions and to..., the Sub-Adviser's investment process is quantitative. Based on extensive historical research, the Sub... open-end fund's portfolio composition must be subject to procedures designed to prevent the use and...

  18. RuleML-Based Learning Object Interoperability on the Semantic Web

    ERIC Educational Resources Information Center

    Biletskiy, Yevgen; Boley, Harold; Ranganathan, Girish R.

    2008-01-01

    Purpose: The present paper aims to describe an approach for building the Semantic Web rules for interoperation between heterogeneous learning objects, namely course outlines from different universities, and one of the rule uses: identifying (in)compatibilities between course descriptions. Design/methodology/approach: As proof of concept, a rule…

  19. ASICs Approach for the Implementation of a Symmetric Triangular Fuzzy Coprocessor and Its Application to Adaptive Filtering

    NASA Technical Reports Server (NTRS)

    Starks, Scott; Abdel-Hafeez, Saleh; Usevitch, Bryan

    1997-01-01

This paper discusses the implementation of a fuzzy logic system using an ASICs design approach. The approach is based upon combining the inherent advantages of symmetric triangular membership functions and fuzzy singleton sets to obtain a novel structure for fuzzy logic system application development. The resulting structure utilizes a fuzzy static RAM to store the rule-base and the end-points of the triangular membership functions. This provides advantages over other approaches in which all sampled values of membership functions for all universes must be stored. The fuzzy coprocessor structure implements the fuzzification and defuzzification processes through a two-stage parallel pipeline architecture which is capable of executing complex fuzzy computations in less than 0.55 µs with an accuracy of more than 95%, thus making it suitable for a wide range of applications. Using the approach presented in this paper, a fuzzy logic rule-base can be directly downloaded via a host processor to an on-chip rule-base memory with a size of 64 words. The fuzzy coprocessor's design supports up to 49 rules for seven fuzzy membership functions associated with each of the chip's two input variables. This feature allows designers to create fuzzy logic systems without the need for additional on-board memory. Finally, the paper reports on simulation studies that were conducted for several adaptive filter applications using the least mean squared adaptive algorithm for adjusting the knowledge rule-base.
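The storage saving comes from the fact that a symmetric triangular membership function is fully defined by its end-points (equivalently, a center and half-width), and singleton consequents reduce defuzzification to a weighted average. A sketch with invented values:

```python
# Symmetric triangular membership needs only a center and half-width,
# which is why end-points, not sampled curves, are stored. Singleton
# outputs make defuzzification a weighted average of rule consequents.
# All numeric values are illustrative.

def tri(x, center, half_width):
    """Symmetric triangular membership degree in [0, 1]."""
    return max(0.0, 1.0 - abs(x - center) / half_width)

def defuzzify(firing_strengths, singletons):
    num = sum(w * s for w, s in zip(firing_strengths, singletons))
    return num / sum(firing_strengths)

w1 = tri(4.0, 3.0, 2.0)   # 0.5
w2 = tri(4.0, 5.0, 2.0)   # 0.5
print(defuzzify([w1, w2], [10.0, 20.0]))  # 15.0
```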

  20. Improved specificity of TALE-based genome editing using an expanded RVD repertoire.

    PubMed

    Miller, Jeffrey C; Zhang, Lei; Xia, Danny F; Campo, John J; Ankoudinova, Irina V; Guschin, Dmitry Y; Babiarz, Joshua E; Meng, Xiangdong; Hinkley, Sarah J; Lam, Stephen C; Paschon, David E; Vincent, Anna I; Dulay, Gladys P; Barlow, Kyle A; Shivak, David A; Leung, Elo; Kim, Jinwon D; Amora, Rainier; Urnov, Fyodor D; Gregory, Philip D; Rebar, Edward J

    2015-05-01

    Transcription activator-like effector (TALE) proteins have gained broad appeal as a platform for targeted DNA recognition, largely owing to their simple rules for design. These rules relate the base specified by a single TALE repeat to the identity of two key residues (the repeat variable diresidue, or RVD) and enable design for new sequence targets via modular shuffling of these units. A key limitation of these rules is that their simplicity precludes options for improving designs that are insufficiently active or specific. Here we address this limitation by developing an expanded set of RVDs and applying them to improve the performance of previously described TALEs. As an extreme example, total conversion of a TALE nuclease to new RVDs substantially reduced off-target cleavage in cellular studies. By providing new RVDs and design strategies, these studies establish options for developing improved TALEs for broader application across medicine and biotechnology.
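The "simple rules for design" the abstract refers to are the canonical one-RVD-per-base code; design for a new target is modular shuffling of repeats. A sketch using only the four canonical RVDs (the paper's contribution is precisely an expanded set beyond these):

```python
# Canonical TALE code: one repeat-variable diresidue (RVD) specifies one
# DNA base, so a target sequence maps to a string of repeats.

CANONICAL_RVDS = {"NI": "A", "HD": "C", "NN": "G", "NG": "T"}

def target_of(rvd_array):
    """DNA sequence recognized by an array of canonical RVDs."""
    return "".join(CANONICAL_RVDS[rvd] for rvd in rvd_array)

print(target_of(["NI", "HD", "NN", "NG", "NG"]))  # ACGTT
```

With only one RVD per base there is no room to tune an underperforming design, which is the limitation the expanded repertoire addresses.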

  1. Complexity, Training Paradigm Design, and the Contribution of Memory Subsystems to Grammar Learning

    PubMed Central

    Ettlinger, Marc; Wong, Patrick C. M.

    2016-01-01

    Although there is variability in nonnative grammar learning outcomes, the contributions of training paradigm design and memory subsystems are not well understood. To examine this, we presented learners with an artificial grammar that formed words via simple and complex morphophonological rules. Across three experiments, we manipulated training paradigm design and measured subjects' declarative, procedural, and working memory subsystems. Experiment 1 demonstrated that passive, exposure-based training boosted learning of both simple and complex grammatical rules, relative to no training. Additionally, procedural memory correlated with simple rule learning, whereas declarative memory correlated with complex rule learning. Experiment 2 showed that presenting corrective feedback during the test phase did not improve learning. Experiment 3 revealed that structuring the order of training so that subjects are first exposed to the simple rule and then the complex improved learning. The cumulative findings shed light on the contributions of grammatical complexity, training paradigm design, and domain-general memory subsystems in determining grammar learning success. PMID:27391085

  2. A step-by-step introduction to rule-based design of synthetic genetic constructs using GenoCAD.

    PubMed

    Wilson, Mandy L; Hertzberg, Russell; Adam, Laura; Peccoud, Jean

    2011-01-01

    GenoCAD is an open source web-based system that provides a streamlined, rule-driven process for designing genetic sequences. GenoCAD provides a graphical interface that allows users to design sequences consistent with formalized design strategies specific to a domain, organization, or project. Design strategies include limited sets of user-defined parts and rules indicating how these parts are to be combined in genetic constructs. In addition to reducing design time to minutes, GenoCAD improves the quality and reliability of the finished sequence by ensuring that the designs follow established rules of sequence construction. GenoCAD.org is a publicly available instance of GenoCAD that can be found at www.genocad.org. The source code and latest build are available from SourceForge to allow advanced users to install and customize GenoCAD for their unique needs. This chapter focuses primarily on how the GenoCAD tools can be used to organize genetic parts into customized personal libraries, then how these libraries can be used to design sequences. In addition, GenoCAD's parts management system and search capabilities are described in detail. Instructions are provided for installing a local instance of GenoCAD on a server. Some of the future enhancements of this rapidly evolving suite of applications are briefly described. Copyright © 2011 Elsevier Inc. All rights reserved.
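The rule-driven validation GenoCAD performs can be pictured as checking a part sequence against a grammar over part categories. The toy grammar below (promoter, then RBS, then CDS, then terminator) is our invention, not GenoCAD's actual rule set:

```python
# Toy rule-based construct validation: each part category constrains
# which category may follow it. Grammar is hypothetical.

ALLOWED_NEXT = {
    "start": {"promoter"},
    "promoter": {"rbs"},
    "rbs": {"cds"},
    "cds": {"terminator"},
    "terminator": set(),
}

def is_valid(construct):
    state = "start"
    for part in construct:
        if part not in ALLOWED_NEXT[state]:
            return False
        state = part
    return state == "terminator"

print(is_valid(["promoter", "rbs", "cds", "terminator"]))  # True
print(is_valid(["promoter", "cds", "terminator"]))         # False
```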

  3. An expert system design to diagnose cancer by using a new method reduced rule base.

    PubMed

    Başçiftçi, Fatih; Avuçlu, Emre

    2018-04-01

A Medical Expert System (MES) was developed that uses a Reduced Rule Base to diagnose cancer risk according to the symptoms in an individual. A total of 13 symptoms were used. With the new MES, the reduced rules are checked instead of all possibilities (2^13 = 8192 different combinations). By checking reduced rules, results are found more quickly. The method of two-level simplification of Boolean functions was used to obtain the Reduced Rule Base. Thanks to the developed application, with dynamic numbers of inputs and outputs on different platforms, anyone can easily test their own cancer risk. More accurate results were obtained by considering all the possibilities related to cancer. Thirteen different risk factors were used to determine the type of cancer. The truth table produced in our study has 13 inputs and 4 outputs. The Boolean function minimization method is used to obtain fewer cases by simplifying the logical functions. Diagnosis is reached quickly thanks to checking only the simplified 4 output functions. Diagnosis made with the 4 output values obtained using the Reduced Rule Base was found to be quicker than diagnosis made by screening all 2^13 = 8192 possibilities. With the improved MES, more probabilities were added to the process and more accurate diagnostic results were obtained. As a result of the simplification process, a diagnosis speed gain of 100% was obtained for breast and renal cancer, and of 99% for cervical and lung cancer. With Boolean function minimization, a smaller number of rules is evaluated instead of a large number of rules. Reducing the number of rules allows the designed system to work more efficiently, saves time, and makes it easier to transfer the rules to the designed expert systems. Interfaces were developed on different software platforms to enable users to test the accuracy of the application. Anyone is able to assess their own cancer risk using the determinative risk factors, and is thereby more likely to beat the cancer through early diagnosis. Copyright © 2018 Elsevier B.V. All rights reserved.
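The arithmetic behind the speed gain: one simplified product term with k literals covers 2^(13 - k) rows of the 13-input truth table at once, so a handful of reduced rules replaces a scan of all 8192 rows. The 3-literal term below is illustrative, not one of the paper's rules.

```python
# Why a reduced rule base beats enumerating all 2**13 = 8192 symptom
# combinations: a product term with k fixed literals (others don't-care)
# covers 2**(13 - k) truth-table rows. Example term is hypothetical.

from itertools import product

N = 13
full_table_rows = 2 ** N

def rule_matches(symptoms, term):
    """term: dict of symptom index -> required value; rest don't-care."""
    return all(symptoms[i] == v for i, v in term.items())

term = {0: 1, 3: 1, 7: 0}  # hypothetical 3-literal reduced rule
covered = sum(rule_matches(row, term) for row in product([0, 1], repeat=N))

print(full_table_rows)  # 8192
print(covered)          # 1024  (= 2**(13 - 3))
```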

  4. Approach to design neural cryptography: a generalized architecture and a heuristic rule.

    PubMed

    Mu, Nankun; Liao, Xiaofeng; Huang, Tingwen

    2013-06-01

Neural cryptography, a type of public key exchange protocol, is widely considered an effective method for sharing a common secret key between two neural networks over public channels. How to design neural cryptography remains a great challenge. In this paper, in order to provide an approach to this challenge, a generalized network architecture and a significant heuristic rule are designed. The proposed generic framework is named the tree state classification machine (TSCM), which extends and unifies the existing structures, i.e., the tree parity machine (TPM) and the tree committee machine (TCM). Furthermore, we carefully study and find that the heuristic rule can improve the security of TSCM-based neural cryptography. Therefore, TSCM and the heuristic rule can guide the design of a great number of effective neural cryptography candidates, among which more secure instances can be found. Significantly, in the light of TSCM and the heuristic rule, we further show that our designed neural cryptography outperforms TPM (the most secure model at present) in security. Finally, a series of numerical simulation experiments is provided to verify the validity and applicability of our results.
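The TPM that TSCM generalizes is small enough to sketch: K hidden perceptrons each see N integer inputs, and the network output is the product of the hidden units' signs. Weight and input values below are invented; the zero-sum sign convention is one common choice.

```python
# Tree parity machine forward pass: K hidden units, N inputs each,
# output tau = product of hidden signs. Two parties synchronize by
# updating weights only when their taus agree. Values are illustrative.

def sign(x):
    return 1 if x >= 0 else -1  # convention: treat 0 as +1

def tpm_output(weights, inputs):
    """weights, inputs: K lists of N integers each."""
    sigma = [sign(sum(w * x for w, x in zip(wk, xk)))
             for wk, xk in zip(weights, inputs)]
    tau = 1
    for s in sigma:
        tau *= s
    return tau, sigma

W = [[1, -2, 3], [0, 1, -1]]  # K=2 hidden units, N=3, weights in [-L, L]
X = [[1, 1, -1], [-1, 1, 1]]
print(tpm_output(W, X))  # (-1, [-1, 1])
```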

  5. Ability-Grouping and Academic Inequality: Evidence from Rule-Based Student Assignments. NBER Working Paper No. 14911

    ERIC Educational Resources Information Center

    Jackson, C. Kirabo

    2009-01-01

    In Trinidad and Tobago students are assigned to secondary schools after fifth grade based on achievement tests, leading to large differences in the school environments to which students of differing initial levels of achievement are exposed. Using both a regression discontinuity design and rule-based instrumental variables to address…

  6. Evaluation of a multi-arm multi-stage Bayesian design for phase II drug selection trials - an example in hemato-oncology.

    PubMed

    Jacob, Louis; Uvarova, Maria; Boulet, Sandrine; Begaj, Inva; Chevret, Sylvie

    2016-06-02

    Multi-Arm Multi-Stage designs aim at comparing several new treatments to a common reference, in order to select or drop any treatment arm to move forward when such evidence already exists based on interim analyses. We redesigned a Bayesian adaptive design initially proposed for dose-finding, focusing our interest in the comparison of multiple experimental drugs to a control on a binary criterion measure. We redesigned a phase II clinical trial that randomly allocates patients across three (one control and two experimental) treatment arms to assess dropping decision rules. We were interested in dropping any arm due to futility, either based on historical control rate (first rule) or comparison across arms (second rule), and in stopping experimental arm due to its ability to reach a sufficient response rate (third rule), using the difference of response probabilities in Bayes binomial trials between the treated and control as a measure of treatment benefit. Simulations were then conducted to investigate the decision operating characteristics under a variety of plausible scenarios, as a function of the decision thresholds. Our findings suggest that one experimental treatment was less efficient than the control and could have been dropped from the trial based on a sample of approximately 20 instead of 40 patients. In the simulation study, stopping decisions were reached sooner for the first rule than for the second rule, with close mean estimates of response rates and small bias. According to the decision threshold, the mean sample size to detect the required 0.15 absolute benefit ranged from 63 to 70 (rule 3) with false negative rates of less than 2 % (rule 1) up to 6 % (rule 2). In contrast, detecting a 0.15 inferiority in response rates required a sample size ranging on average from 23 to 35 (rules 1 and 2, respectively) with a false positive rate ranging from 3.6 to 0.6 % (rule 3). Adaptive trial design is a good way to improve clinical trials. 
It allows ineffective drugs to be dropped and the trial sample size to be reduced, while maintaining unbiased estimates. Decision thresholds can be set according to predefined fixed error decision rates. ClinicalTrials.gov Identifier: NCT01342692.
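The Bayes binomial comparison underlying the dropping and stopping rules can be estimated by Monte Carlo: with Beta(1,1) priors, draw from each arm's posterior and count how often the experimental rate beats the control's. The response counts below are invented for illustration.

```python
# Posterior probability that the experimental response rate exceeds the
# control's, with Beta(1,1) priors on both arms. Counts are invented.

import random

def prob_superior(succ_t, n_t, succ_c, n_c, draws=10_000, seed=0):
    rng = random.Random(seed)
    wins = 0
    for _ in range(draws):
        p_t = rng.betavariate(1 + succ_t, 1 + n_t - succ_t)
        p_c = rng.betavariate(1 + succ_c, 1 + n_c - succ_c)
        wins += p_t > p_c
    return wins / draws

# e.g. drop the arm when this probability falls below a futility threshold
print(prob_superior(12, 20, 4, 20))  # well above 0.9 for these counts
```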

  7. Testing the Developmental Origins of Health and Disease Hypothesis for Psychopathology Using Family-Based Quasi-Experimental Designs

    PubMed Central

    D’Onofrio, Brian M.; Class, Quetzal A.; Lahey, Benjamin B.; Larsson, Henrik

    2014-01-01

    The Developmental Origin of Health and Disease (DOHaD) hypothesis is a broad theoretical framework that emphasizes how early risk factors have a causal influence on psychopathology. Researchers have raised concerns about the causal interpretation of statistical associations between early risk factors and later psychopathology because most existing studies have been unable to rule out the possibility of environmental and genetic confounding. In this paper we illustrate how family-based quasi-experimental designs can test the DOHaD hypothesis by ruling out alternative hypotheses. We review the logic underlying sibling-comparison, co-twin control, offspring of siblings/twins, adoption, and in vitro fertilization designs. We then present results from studies using these designs focused on broad indices of fetal development (low birth weight and gestational age) and a particular teratogen, smoking during pregnancy. The results provide mixed support for the DOHaD hypothesis for psychopathology, illustrating the critical need to use design features that rule out unmeasured confounding. PMID:25364377

  8. Designing boosting ensemble of relational fuzzy systems.

    PubMed

    Scherer, Rafał

    2010-10-01

A method frequently used in classification systems for improving classification accuracy is to combine the outputs of several classifiers. Among the various types of classifiers, fuzzy ones are tempting because they use intelligible fuzzy if-then rules. In the paper we build an AdaBoost ensemble of relational neuro-fuzzy classifiers. Relational fuzzy systems bind input and output fuzzy linguistic values by a binary relation; thus, compared with traditional fuzzy systems, fuzzy rules have additional weights - the elements of a fuzzy relation matrix. This makes the system more adjustable to the data during learning. In the paper an ensemble of relational fuzzy systems is proposed. The problem is that such an ensemble contains separate rule bases which cannot be directly merged. As the systems are separate, we cannot treat fuzzy rules coming from different systems as rules from the same (single) system. In the paper, the problem is addressed by a novel design of the fuzzy systems constituting the ensemble, resulting in normalization of the individual rule bases during learning. The method described in the paper is tested on several known benchmarks and compared with other machine learning solutions from the literature.
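The AdaBoost machinery that drives the ensemble is independent of the weak learner: after each round, misclassified samples gain weight so the next classifier focuses on them. A sketch of one standard reweighting step (this is textbook AdaBoost, not the paper's relational neuro-fuzzy code):

```python
# One AdaBoost round: compute the weak learner's weighted error, its
# vote weight alpha, and the reweighted (renormalized) sample weights.

import math

def adaboost_round(weights, correct):
    """weights: current sample weights; correct: per-sample bool flags."""
    eps = sum(w for w, c in zip(weights, correct) if not c)
    alpha = 0.5 * math.log((1 - eps) / eps)  # weak learner's vote weight
    new = [w * math.exp(-alpha if c else alpha)
           for w, c in zip(weights, correct)]
    z = sum(new)  # renormalize to a distribution
    return alpha, [w / z for w in new]

alpha, w = adaboost_round([0.25] * 4, [True, True, True, False])
print(round(w[3], 3))  # 0.5: the one misclassified sample gets half the mass
```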

  9. Automated visualization of rule-based models

    PubMed Central

    Tapia, Jose-Juan; Faeder, James R.

    2017-01-01

    Frameworks such as BioNetGen, Kappa and Simmune use “reaction rules” to specify biochemical interactions compactly, where each rule specifies a mechanism such as binding or phosphorylation and its structural requirements. Current rule-based models of signaling pathways have tens to hundreds of rules, and these numbers are expected to increase as more molecule types and pathways are added. Visual representations are critical for conveying rule-based models, but current approaches to show rules and interactions between rules scale poorly with model size. Also, inferring design motifs that emerge from biochemical interactions is an open problem, so current approaches to visualize model architecture rely on manual interpretation of the model. Here, we present three new visualization tools that constitute an automated visualization framework for rule-based models: (i) a compact rule visualization that efficiently displays each rule, (ii) the atom-rule graph that conveys regulatory interactions in the model as a bipartite network, and (iii) a tunable compression pipeline that incorporates expert knowledge and produces compact diagrams of model architecture when applied to the atom-rule graph. The compressed graphs convey network motifs and architectural features useful for understanding both small and large rule-based models, as we show by application to specific examples. Our tools also produce more readable diagrams than current approaches, as we show by comparing visualizations of 27 published models using standard graph metrics. We provide an implementation in the open source and freely available BioNetGen framework, but the underlying methods are general and can be applied to rule-based models from the Kappa and Simmune frameworks also. We expect that these tools will promote communication and analysis of rule-based models and their eventual integration into comprehensive whole-cell models. PMID:29131816

  10. A method and knowledge base for automated inference of patient problems from structured data in an electronic medical record

    PubMed Central

    Pang, Justine; Feblowitz, Joshua C; Maloney, Francine L; Wilcox, Allison R; Ramelson, Harley Z; Schneider, Louise I; Bates, David W

    2011-01-01

    Background Accurate knowledge of a patient's medical problems is critical for clinical decision making, quality measurement, research, billing and clinical decision support. Common structured sources of problem information include the patient problem list and billing data; however, these sources are often inaccurate or incomplete. Objective To develop and validate methods of automatically inferring patient problems from clinical and billing data, and to provide a knowledge base for inferring problems. Study design and methods We identified 17 target conditions and designed and validated a set of rules for identifying patient problems based on medications, laboratory results, billing codes, and vital signs. A panel of physicians provided input on a preliminary set of rules. Based on this input, we tested candidate rules on a sample of 100 000 patient records to assess their performance compared to gold standard manual chart review. The physician panel selected a final rule for each condition, which was validated on an independent sample of 100 000 records to assess its accuracy. Results Seventeen rules were developed for inferring patient problems. Analysis using a validation set of 100 000 randomly selected patients showed high sensitivity (range: 62.8–100.0%) and positive predictive value (range: 79.8–99.6%) for most rules. Overall, the inference rules performed better than using either the problem list or billing data alone. Conclusion We developed and validated a set of rules for inferring patient problems. These rules have a variety of applications, including clinical decision support, care improvement, augmentation of the problem list, and identification of patients for research cohorts. PMID:21613643
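The two validation metrics reported above are simple ratios over the confusion counts from chart review. A sketch with invented counts, not the study's data:

```python
# Sensitivity and positive predictive value for an inference rule
# validated against gold-standard chart review. Counts are hypothetical.

def sensitivity(tp, fn):
    """Share of true problem cases the rule detects."""
    return tp / (tp + fn)

def ppv(tp, fp):
    """Share of rule-positive cases that truly have the problem."""
    return tp / (tp + fp)

# hypothetical rule: flags 90 of 100 true cases, with 10 false positives
print(round(sensitivity(tp=90, fn=10), 3))  # 0.9
print(round(ppv(tp=90, fp=10), 3))          # 0.9
```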

  11. EPE fundamentals and impact of EUV: Will traditional design-rule calculations work in the era of EUV?

    NASA Astrophysics Data System (ADS)

    Gabor, Allen H.; Brendler, Andrew C.; Brunner, Timothy A.; Chen, Xuemei; Culp, James A.; Levinson, Harry J.

    2018-03-01

The relationship between edge placement error, semiconductor design-rule determination and predicted yield in the era of EUV lithography is examined. This paper starts with the basics of edge placement error and then builds up to design-rule calculations. We show that edge placement error (EPE) definitions can be used as the building blocks for design-rule equations, but that in the last several years the term "EPE" has been used in the literature to refer to many patterning errors that are not EPE. We then explore the concept of "Good Fields"1 and use it to predict the n-sigma value needed for design-rule determination. Specifically, fundamental yield calculations based on the failure opportunities per chip are used to determine at what n-sigma "value" design rules need to be tested to ensure high yield. The "value" can be a space between two features, an intersect area between two features, a minimum area of a feature, etc. It is shown that across-chip variation of design-rule-important values needs to be tested at sigma values between seven and eight, which is much higher than the four-sigma values traditionally used for design-rule determination. After recommending that new statistics be used for design-rule calculations, the paper examines the impact of EUV lithography on the sources of variation important for design-rule calculations. We show that stochastics can be treated as an effective dose variation that is fully sampled across every chip. Combining the increased within-chip variation from EUV with the requirement that across-chip variation of design-rule-important values must not cause yield loss at significantly higher sigma values than have traditionally been considered, the conclusion is reached that across-wafer, wafer-to-wafer and lot-to-lot variation will have to overscale for any technology introducing EUV lithography where stochastic noise is a significant fraction of the effective dose variation.
We will emphasize stochastic effects on edge placement error distributions and appropriate design-rule setting. While CD distributions with long tails coming from stochastic effects do bring increased risk of failure (especially on chips that may have over a billion failure opportunities per layer) there are other sources of variation that have sharp cutoffs, i.e. have no tails. We will review these sources and show how distributions with different skew and kurtosis values combine.
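The failure-opportunity argument sketched in this abstract can be reproduced with a back-of-the-envelope calculation: assuming independent opportunities, a chip yield target fixes a per-opportunity failure probability, which a normal tail converts into a required sigma. The numbers below (1e9 opportunities, 99% yield) are illustrative assumptions, not the paper's exact inputs:

```python
# Back-of-the-envelope sigma calculation: with ~1e9 failure opportunities
# per chip per layer, each opportunity must pass at far more than 4 sigma.
from statistics import NormalDist

def required_sigma(opportunities, chip_yield_target):
    """One-sided normal sigma at which each opportunity must still pass."""
    # Per-opportunity failure probability for an overall chip yield target,
    # assuming independent opportunities: yield = (1 - p) ** N.
    p = 1.0 - chip_yield_target ** (1.0 / opportunities)
    return -NormalDist().inv_cdf(p)

sigma = required_sigma(opportunities=1e9, chip_yield_target=0.99)
print(f"{sigma:.2f}")  # roughly 6.7 -- already well beyond four sigma
```

Additional layers, tighter yield targets, or more opportunities push the requirement toward the seven-to-eight-sigma range quoted in the abstract.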

  12. Exploring the combinatorial space of complete pathways to chemicals.

    PubMed

    Wang, Lin; Ng, Chiam Yu; Dash, Satyakam; Maranas, Costas D

    2018-04-06

Computational pathway design tools often face the challenges of balancing the stoichiometry of co-metabolites and cofactors and of handling reaction rule utilization in a single workflow. To this end, we provide an overview of two complementary stoichiometry-based pathway design tools, optStoic and novoStoic, developed in our group to tackle these challenges. optStoic is designed to first determine the stoichiometry of the overall conversion, which optimizes a performance criterion (e.g. high carbon/energy efficiency) and ensures a comprehensive search of co-metabolites and cofactors. The procedure then identifies the minimum number of intervening reactions to connect the source and sink metabolites. We further extend the pathway design procedure by expanding the search space to include both known and hypothetical reactions, represented by reaction rules, in a new tool termed novoStoic. Reaction rules are derived based on a mixed-integer linear programming (MILP)-compatible reaction operator, which allows us to explore naturally promiscuous enzymes, engineer candidate enzymes that are not already promiscuous, as well as design de novo enzymes. The identified biochemical reaction rules then guide novoStoic to design routes that expand the currently known biotransformation space using a single MILP modeling procedure. We demonstrate the use of the two computational tools in pathway elucidation by designing novel synthetic routes for isobutanol. © 2018 The Author(s). Published by Portland Press Limited on behalf of the Biochemical Society.
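A precondition of the "overall stoichiometry first" idea is that a proposed overall conversion must be elementally balanced before any search for intervening reactions. A minimal sketch (the formulas are standard; the glucose-to-isobutanol conversion mirrors the target in the abstract, though the real tools solve an MILP rather than this simple check):

```python
# Check that a proposed overall conversion is elementally balanced.
import re
from collections import Counter

def atoms(formula):
    """Count atoms in a simple formula like 'C6H12O6' (unique elements only)."""
    return Counter({el: int(n or 1)
                    for el, n in re.findall(r"([A-Z][a-z]?)(\d*)", formula)})

def balanced(reactants, products):
    """reactants/products: dicts of formula -> stoichiometric coefficient."""
    def total(side):
        tot = Counter()
        for formula, coeff in side.items():
            for el, n in atoms(formula).items():
                tot[el] += coeff * n
        return tot
    return total(reactants) == total(products)

# glucose -> isobutanol + 2 CO2 + H2O
print(balanced({"C6H12O6": 1}, {"C4H10O": 1, "CO2": 2, "H2O": 1}))  # True
```

The real tools additionally balance cofactors and optimize carbon/energy efficiency over the space of such balanced conversions.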

  13. Failure detection system design methodology. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chow, E. Y.

    1980-01-01

The design of a failure detection and identification system consists of designing a robust residual generation process and a high performance decision making process. The designs of these two processes are examined separately. Residual generation is based on analytical redundancy. Redundancy relations that are insensitive to modelling errors and noise effects are important for designing robust residual generation processes. The characterization of the concept of analytical redundancy in terms of a generalized parity space provides a framework in which a systematic approach to the determination of robust redundancy relations is developed. The Bayesian approach is adopted for the design of high performance decision processes. The FDI decision problem is formulated as a Bayes sequential decision problem. Since the optimal decision rule is incomputable, a methodology for designing suboptimal rules is proposed. A numerical algorithm is developed to facilitate the design and performance evaluation of suboptimal rules.
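One classical sequential decision rule of the kind this abstract discusses (suboptimal in the Bayes sense but computable) is Wald's sequential probability ratio test applied to a residual sequence. A minimal sketch, with hypothetical noise and fault magnitudes:

```python
# Wald's SPRT on a residual sequence: decide between a no-fault model
# N(0, sigma) and a fault model N(mu_fault, sigma). Parameters are
# hypothetical illustration values.
import math

def sprt(residuals, mu_fault=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Return a decision and the sample index at which it was reached."""
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    llr = 0.0
    for k, r in enumerate(residuals, start=1):
        # Log-likelihood ratio increment for N(mu, s) vs N(0, s).
        llr += (mu_fault / sigma**2) * (r - mu_fault / 2)
        if llr >= upper:
            return "fault", k
        if llr <= lower:
            return "no fault", k
    return "undecided", len(residuals)

# A residual stuck at the fault mean triggers a detection after 10 samples.
print(sprt([1.0] * 20))  # ('fault', 10)
```

The Bayes sequential formulation in the thesis generalizes this kind of test by attaching costs to delay and to each decision.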

  14. Proposal to designate the order Actinomycetales Buchanan 1917, 162 (Approved Lists 1980) as the nomenclatural type of the class Actinobacteria. Request for an Opinion

    PubMed Central

    2017-01-01

    The name of the class Actinobacteria is illegitimate according to Rules 15, 22 and 27(3) because it was proposed without the designation of a nomenclatural type. I therefore propose to designate the order Actinomycetales Buchanan 1917, 162 (Approved Lists 1980) as its nomenclatural type, based on Rule 22 of the International Code of Nomenclature of Prokaryotes. PMID:28840812

  15. Proposal to designate the order Actinomycetales Buchanan 1917, 162 (Approved Lists 1980) as the nomenclatural type of the class Actinobacteria. Request for an Opinion.

    PubMed

    Oren, Aharon

    2017-09-01

    The name of the class Actinobacteria is illegitimate according to Rules 15, 22 and 27(3) because it was proposed without the designation of a nomenclatural type. I therefore propose to designate the order Actinomycetales Buchanan 1917, 162 (Approved Lists 1980) as its nomenclatural type, based on Rule 22 of the International Code of Nomenclature of Prokaryotes.

  16. Decision support system for triage management: A hybrid approach using rule-based reasoning and fuzzy logic.

    PubMed

    Dehghani Soufi, Mahsa; Samad-Soltani, Taha; Shams Vahdati, Samad; Rezaei-Hachesu, Peyman

    2018-06-01

Fast and accurate patient triage is a critical first step of the response process in emergency situations. This process is often performed in a paper-based mode, which intensifies workload and difficulty, wastes time, and is prone to human error. This study aims to design and evaluate a decision support system (DSS) to determine the triage level. A combination of the Rule-Based Reasoning (RBR) and Fuzzy Logic Classifier (FLC) approaches was used to predict the triage level of patients according to the triage specialists' opinions and Emergency Severity Index (ESI) guidelines. RBR was applied for modeling the first to fourth decision points of the ESI algorithm. The data relating to vital signs were used as input variables and modeled using fuzzy logic. Narrative knowledge was converted to If-Then rules using XML. The extracted rules were then used to create the rule-based engine and predict the triage levels. Fourteen RBR and 27 fuzzy rules were extracted and used in the rule-based engine. The performance of the system was evaluated using three methods with real triage data. The accuracy of the clinical decision support system (CDSS; on the test data) was 99.44%. The evaluation of the error rate revealed that, when using the traditional method, 13.4% of the patients were mis-triaged, which is statistically significant. The completeness of the documentation also improved from 76.72% to 98.5%. The designed system was effective in determining the triage level of patients, and it proved helpful for nurses as they made decisions and generated nursing diagnoses based on triage guidelines. The hybrid approach can reduce triage misdiagnosis in a highly accurate manner and improve triage outcomes. Copyright © 2018 Elsevier B.V. All rights reserved.
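The hybrid structure described here can be sketched in a few lines: crisp ESI-style rules handle the first decision points, and a fuzzy score over vital signs handles the rest. All thresholds and memberships below are hypothetical, not the validated rules from the study:

```python
# Toy hybrid triage classifier: rule-based reasoning first, fuzzy logic second.

def tri(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def triage_level(patient):
    # Rule-based part: decision points similar in spirit to ESI levels 1-2.
    if patient["unresponsive"]:
        return 1
    if patient["high_risk"]:
        return 2
    # Fuzzy part: degree to which the vital signs look unstable.
    instability = max(tri(patient["heart_rate"], 100, 140, 180),
                      tri(patient["resp_rate"], 20, 30, 40))
    return 3 if instability > 0.5 else 4

p = {"unresponsive": False, "high_risk": False,
     "heart_rate": 150, "resp_rate": 18}
print(triage_level(p))  # 3
```

The real system encodes its 14 RBR and 27 fuzzy rules in XML; the control flow above only illustrates how the two reasoning styles compose.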

  17. 78 FR 48214 - Self-Regulatory Organizations; The Options Clearing Corporation; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-07

    ... and Rules to Certain U.S. Dollar-Settled Gold Futures Designed to Replicate Positions in the Spot... Futures That Were Based on the Value of Gold in the Spot Market With an Additional Daily Cost of Carry... Rules to certain U.S. dollar-settled gold futures designed to replicate positions in the spot market...

  18. Designing seasonal initial attack resource deployment and dispatch rules using a two-stage stochastic programming procedure

    Treesearch

    Yu Wei; Michael Bevers; Erin J. Belval

    2015-01-01

    Initial attack dispatch rules can help shorten fire suppression response times by providing easy-to-follow recommendations based on fire weather, discovery time, location, and other factors that may influence fire behavior and the appropriate response. A new procedure is combined with a stochastic programming model and tested in this study for designing initial attack...

  19. Designing Rules for Accounting Transaction Identification based on Indonesian NLP

    NASA Astrophysics Data System (ADS)

    Iswandi, I.; Suwardi, I. S.; Maulidevi, N. U.

    2017-03-01

Accounting transactions are recorded on the basis of transaction evidence, such as invoices, receipts, letters of intent, electricity bills, telephone bills, etc. In this paper, we propose a design of rules to identify the entities located on a sales invoice. Several entities are identified in a sales invoice, namely: invoice date, company name, invoice number, product id, product name, quantity and total price. These entities are identified using the named entity recognition method. The entities generated by the rules are used as a basis for automating data input into the accounting system.

  20. A Swarm Optimization approach for clinical knowledge mining.

    PubMed

    Christopher, J Jabez; Nehemiah, H Khanna; Kannan, A

    2015-10-01

Rule-based classification is a typical data mining task that is used in several medical diagnosis and decision support systems. The rules stored in the rule base have an impact on classification efficiency. Rule sets that are extracted with data mining tools and techniques are optimized using heuristic or meta-heuristic approaches in order to improve the quality of the rule base. In this work, a meta-heuristic approach called Wind-driven Swarm Optimization (WSO) is used. The uniqueness of this work lies in the biological inspiration that underlies the algorithm. WSO uses Jval, a new metric, to evaluate the efficiency of a rule-based classifier. Rules are extracted from decision trees. WSO is used to obtain different permutations and combinations of rules, whereby the optimal ruleset that satisfies the requirement of the developer is used for predicting the test data. The performance of various extensions of decision trees, namely RIPPER, PART, FURIA and Decision Tables, is analyzed. The efficiency of WSO is also compared with traditional Particle Swarm Optimization. Experiments were carried out with six benchmark medical datasets. The traditional C4.5 algorithm yields 62.89% accuracy with 43 rules for the liver disorders dataset, whereas WSO yields 64.60% with 19 rules. For the heart disease dataset, C4.5 is 68.64% accurate with 98 rules, whereas WSO is 77.8% accurate with 34 rules. The normalized standard deviations for the accuracy of PSO and WSO are 0.5921 and 0.5846, respectively. WSO provides accurate and concise rulesets. PSO yields results similar to those of WSO, but the novelty of WSO lies in its biological motivation and its customization for rule base optimization. The trade-off between prediction accuracy and the size of the rule base is optimized during the design and development of a rule-based clinical decision support system. The efficiency of a decision support system relies on the content of the rule base and classification accuracy.
Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
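The ruleset-selection problem WSO addresses can be illustrated on a toy scale: score each candidate subset of extracted rules and keep the best trade-off between accuracy and rule count. The swarm search is replaced here by brute force over an invented three-rule pool; neither the rules nor the data come from the paper:

```python
# Toy ruleset selection: first-match rule list, scored on labeled samples.
from itertools import combinations

# Each rule: (predicate on a sample, predicted class). First match wins.
rules = [
    (lambda s: s["bp"] > 140, "sick"),
    (lambda s: s["age"] > 60, "sick"),
    (lambda s: True, "healthy"),          # default rule
]
data = [({"bp": 150, "age": 40}, "sick"),
        ({"bp": 120, "age": 70}, "healthy"),
        ({"bp": 110, "age": 30}, "healthy")]

def classify(ruleset, sample):
    for predicate, label in ruleset:
        if predicate(sample):
            return label
    return "healthy"                      # fallback when no rule fires

def accuracy(ruleset):
    return sum(classify(ruleset, s) == y for s, y in data) / len(data)

# Prefer higher accuracy, then fewer rules (the conciseness trade-off).
best = max((rs for n in range(1, len(rules) + 1)
            for rs in combinations(rules, n)),
           key=lambda rs: (accuracy(rs), -len(rs)))
print(accuracy(best), len(best))  # 1.0 1
```

WSO explores this same search space heuristically (via its Jval metric) instead of enumerating it, which is what makes the approach scale to the 40+ rule sets reported above.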

  1. A Darwinian approach to control-structure design

    NASA Technical Reports Server (NTRS)

    Zimmerman, David C.

    1993-01-01

    Genetic algorithms (GA's), as introduced by Holland (1975), are one form of directed random search. The form of direction is based on Darwin's 'survival of the fittest' theories. GA's are radically different from the more traditional design optimization techniques. GA's work with a coding of the design variables, as opposed to working with the design variables directly. The search is conducted from a population of designs (i.e., from a large number of points in the design space), unlike the traditional algorithms which search from a single design point. The GA requires only objective function information, as opposed to gradient or other auxiliary information. Finally, the GA is based on probabilistic transition rules, as opposed to deterministic rules. These features allow the GA to attack problems with local-global minima, discontinuous design spaces and mixed variable problems, all in a single, consistent framework.
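The features listed above (a coded population, fitness-only evaluation, probabilistic selection, crossover, and mutation) can be sketched in a minimal GA. The objective here (count of 1-bits) is a placeholder, not the control-structure problem from the paper:

```python
# Minimal genetic algorithm: binary coding, tournament selection,
# one-point crossover, bit-flip mutation. Objective function info only.
import random

def ga(bits=20, pop_size=30, generations=60, p_mut=0.02, seed=1):
    rng = random.Random(seed)
    fitness = lambda ind: sum(ind)         # placeholder objective
    pop = [[rng.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():                        # probabilistic (tournament) selection
            return max(rng.sample(pop, 3), key=fitness)
        nxt = []
        while len(nxt) < pop_size:
            a, b = pick(), pick()
            cut = rng.randrange(1, bits)   # one-point crossover
            child = a[:cut] + b[cut:]
            child = [g ^ (rng.random() < p_mut) for g in child]  # mutation
            nxt.append(child)
        pop = nxt
    return max(map(fitness, pop))

print(ga())  # typically finds the all-ones optimum (fitness 20)
```

Note that the loop uses only fitness values, never gradients, and the transition from one population to the next is entirely probabilistic, exactly the contrasts with traditional optimizers that the abstract draws.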

  2. Use of an Explicit Rule Decreases Procrastination in University Students

    ERIC Educational Resources Information Center

    Johnson, Paul E.; Perrin, Christopher J.; Salo, Allen; Deschaine, Elyssa; Johnson, Beth

    2016-01-01

    The procrastination behavior of students from a small rural university was decreased by presenting them with a rule indicating that a sooner final due date for a writing assignment would be contingent on procrastination during earlier phases of the paper. A counterbalanced AB BA design was used to measure the effects of the rule-based treatment…

  3. Enhancements to the Design Manager's Aide for Intelligent Decomposition (DeMAID)

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; Barthelemy, Jean-Francois M.

    1992-01-01

    This paper discusses the addition of two new enhancements to the program Design Manager's Aide for Intelligent Decomposition (DeMAID). DeMAID is a knowledge-based tool used to aid a design manager in understanding the interactions among the tasks of a complex design problem. This is done by ordering the tasks to minimize feedback, determining the participating subsystems, and displaying them in an easily understood format. The two new enhancements include (1) rules for ordering a complex assembly process and (2) rules for determining which analysis tasks must be re-executed to compute the output of one task based on a change in input to that or another task.

  5. Opinion evolution based on cellular automata rules in small world networks

    NASA Astrophysics Data System (ADS)

    Shi, Xiao-Ming; Shi, Lun; Zhang, Jie-Fang

    2010-03-01

In this paper, we apply cellular automata rules, which can be given by a truth table, to human memory. We design each memory as a tracking survey mode that keeps the most recent three opinions. Each cellular automata rule, as a personal mechanism, gives the final ruling in one time period based on the data stored in one's memory. The key focus of the paper is to study the evolution of people's attitudes to the same question. Based on a large number of empirical observations from computer simulations, all the rules can be classified into 20 groups. We highlight the fact that the phenomenon shown by some rules belonging to the same group will be altered within several steps by other rules in different groups. Strikingly, when compared with the long record of presidential voting in America, the eras of important events in America's history coincide with the simulation results obtained by our model.
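The memory mechanism described above can be sketched directly: each agent stores its last three binary opinions, and a truth table over those 3 bits (one of 256 possible rules) yields the next opinion. The particular rule number below is an arbitrary illustration, not one of the paper's 20 groups:

```python
# Opinion dynamics where a CA truth table maps 3-bit memory to the next opinion.

def make_rule(rule_number):
    """Truth table: 3-bit memory (read as an index 0-7) -> next opinion."""
    table = [(rule_number >> i) & 1 for i in range(8)]
    return lambda memory: table[memory[0] * 4 + memory[1] * 2 + memory[2]]

def evolve(opinions, rule, steps):
    # memory[i] holds agent i's most recent three opinions, newest last.
    memory = {i: [op, op, op] for i, op in enumerate(opinions)}
    for _ in range(steps):
        for i in memory:
            nxt = rule(memory[i])
            memory[i] = memory[i][1:] + [nxt]
    return [m[-1] for m in memory.values()]

rule = make_rule(232)   # rule 232 = majority vote over the three stored bits
print(evolve([1, 0, 1, 1, 0], rule, steps=4))  # [1, 0, 1, 1, 0]
```

Under the majority rule every agent simply keeps its opinion; other rule numbers produce oscillating or switching behavior, which is what the paper's 20-group classification characterizes.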

  6. Finite element analysis of container ship's cargo hold using ANSYS and POSEIDON software

    NASA Astrophysics Data System (ADS)

    Tanny, Tania Tamiz; Akter, Naznin; Amin, Osman Md.

    2017-12-01

Nowadays, ship structural analysis has become an integral part of preliminary ship design, providing further support for the development and detail design of ship structures. Structural analyses of container ships' cargo holds are carried out to balance safety and capacity, as those ships are exposed to a high risk of structural damage during a voyage. Two different design methodologies have been considered for the structural analysis of a container ship's cargo hold. One is a rule-based methodology and the other is a more conventional software-based analysis. The rule-based analysis is done with DNV-GL's software POSEIDON and the conventional package-based analysis is done with the ANSYS structural module. Both methods have been applied to analyze some of the mechanical properties of the model, such as total deformation, stress-strain distribution, von Mises stress, fatigue, etc., following different design bases and approaches, to provide some guidance for further improvements in ship structural design.

  7. Knowledge base rule partitioning design for CLIPS

    NASA Technical Reports Server (NTRS)

    Mainardi, Joseph D.; Szatkowski, G. P.

    1990-01-01

This paper describes a knowledge base (KB) partitioning approach to solve the problem of real-time performance when using the CLIPS AI shell with large numbers of rules and facts. This work is funded under the joint USAF/NASA Advanced Launch System (ALS) Program as applied research in expert systems to perform vehicle checkout for real-time controller and diagnostic monitoring tasks. The main objective of the Expert System advanced development project (ADP-2302) is to provide robust systems responding to new data frames at 0.1 to 1.0 second intervals. The intelligent system control must be performed within the specified real-time window in order to meet the demands of the given application. Partitioning the KB reduces the complexity of the inferencing Rete net at any given time. This reduced complexity improves performance without undue impact during load and unload cycles. The second objective is to produce highly reliable intelligent systems. This requires simple and automated approaches to the KB verification & validation task. Partitioning the KB reduces overall rule interaction complexity. Reduced interaction simplifies the V&V testing necessary by focusing attention only on individual areas of interest. Many systems require a robustness that involves a large number of rules, most of which are mutually exclusive under different phases or conditions. The ideal solution is to control the knowledge base by loading rules that directly apply to a given condition, while stripping out all rules and facts that are not used during that cycle. The practical approach is to cluster rules and facts into associated 'blocks'. A simple approach has been designed to control the addition and deletion of 'blocks' of rules and facts, while allowing real-time operations to run freely. Timing tests of real-time performance for specific machines under R/T operating systems have not been completed but are planned as part of the analysis process to validate the design.
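The block load/unload idea can be sketched in a language-agnostic way (the paper does this with CLIPS constructs; plain Python stands in here, and the phase names and rules are hypothetical):

```python
# Sketch of phase-based knowledge base partitioning: only the rules for the
# current phase are loaded into the match network.

class PartitionedKB:
    def __init__(self, blocks):
        self.blocks = blocks          # block name -> list of (condition, action)
        self.active = []              # rules currently in the match network

    def enter_phase(self, *block_names):
        """Load only the blocks needed for this phase; strip the rest."""
        self.active = [rule for name in block_names
                       for rule in self.blocks[name]]

    def fire(self, facts):
        return [action for condition, action in self.active
                if condition(facts)]

kb = PartitionedKB({
    "checkout": [(lambda f: f["valve"] == "stuck", "abort-checkout")],
    "flight":   [(lambda f: f["temp"] > 900, "throttle-down")],
})
kb.enter_phase("checkout")            # flight rules never enter the match net
print(kb.fire({"valve": "stuck", "temp": 950}))  # ['abort-checkout']
```

Because inactive blocks contribute nothing to matching, the cost of each inference cycle scales with the active block only, which is the performance effect the partitioning approach relies on.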

  8. 77 FR 9532 - Air Quality Designations for the 2010 Primary Nitrogen Dioxide (NO2

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-17

    ...This rule establishes air quality designations for all areas in the United States for the 2010 Primary Nitrogen Dioxide (NO2) National Ambient Air Quality Standards (NAAQS). Based on air quality monitoring data, the EPA is issuing this rule to designate all areas of the country as ``unclassifiable/attainment'' for the 2010 NO2 NAAQS. The EPA is designating areas as ``unclassifiable/attainment'' to mean that available information does not indicate that the air quality in these areas exceeds the 2010 NO2 NAAQS.

  9. Rule-Based Design of Plant Expression Vectors Using GenoCAD.

    PubMed

    Coll, Anna; Wilson, Mandy L; Gruden, Kristina; Peccoud, Jean

    2015-01-01

Plant synthetic biology requires software tools to assist in the design of complex multi-genic expression plasmids. Here, a vector design strategy to express genes in plants is formalized and implemented as a grammar in GenoCAD, a Computer-Aided Design software for synthetic biology. It includes a library of plant biological parts organized in structural categories and a set of rules describing how to assemble these parts into large constructs. The rules developed here are organized and divided into three main subsections according to the aim of the final construct: protein localization studies, promoter analysis and protein-protein interaction experiments. The GenoCAD plant grammar guides the user through the design while allowing users to customize vectors according to their needs. The plant grammar implemented in GenoCAD will therefore help plant biologists take advantage of methods from synthetic biology to design expression vectors supporting their research projects.
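Grammar-checking a construct built from categorized parts, in the spirit of GenoCAD, can be sketched as a small state machine. The categories and transitions below form a simplified hypothetical grammar, not GenoCAD's actual plant grammar:

```python
# Validate a left-to-right sequence of part categories against a grammar.

# Allowed transitions between structural categories (a regular grammar).
NEXT = {
    "START":      {"promoter"},
    "promoter":   {"5utr", "cds"},
    "5utr":       {"cds"},
    "cds":        {"terminator"},
    "terminator": {"END"},
}

def valid_construct(categories):
    state = "START"
    for cat in categories:
        if cat not in NEXT.get(state, set()):
            return False              # illegal assembly step
        state = cat
    return "END" in NEXT.get(state, set())  # construct must end cleanly

print(valid_construct(["promoter", "5utr", "cds", "terminator"]))  # True
print(valid_construct(["cds", "promoter"]))                        # False
```

A design tool built on such a grammar can also drive the design forward, offering only the legal next categories at each step, which is how the grammar "guides the user" in the abstract's sense.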

  10. 76 FR 16650 - Self-Regulatory Organizations; NASDAQ OMX BX, Inc.; Order Approving Proposed Rule Change To Amend...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-24

    ... investors and the public interest; and are not designed to permit unfair discrimination among customers... proposed rule's impact on efficiency, competition, and capital formation. 15 U.S.C. 78c(f). \\9\\ 15 U.S.C... between customers based on the nature and profitability of their business. Currently under BOX's rules, an...

  11. Promoting Changes in Children's Predictive Rules about Natural Phenomena: The Role of Computer-Based Modelling Strategies. Technical Report.

    ERIC Educational Resources Information Center

    Frenette, Micheline

When trying to change the predictive rule for sinking and floating phenomena, students have great difficulty understanding density and are insensitive to empirical counter-examples designed to challenge their own rule. The purpose of this study is to examine the process whereby students from sixth and seventh grades relinquish their…

  12. 76 FR 76799 - Self-Regulatory Organizations; New York Stock Exchange LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-08

    ... average daily volume (``CADV''). The text of the proposed rule change is available at the Exchange, the... Proposed Rule Change To Amend NYSE Rule 104(a)(1)(A) To Reflect That Designated Market Maker Unit Quoting Requirements Are Based on Consolidated Average Daily Volume December 2, 2011. Pursuant to Section 19(b)(1) of...

  13. Adaptive Critic-based Neurofuzzy Controller for the Steam Generator Water Level

    NASA Astrophysics Data System (ADS)

    Fakhrazari, Amin; Boroushaki, Mehrdad

    2008-06-01

In this paper, an adaptive critic-based neurofuzzy controller is presented for water level regulation of nuclear steam generators. The problem has been of great concern for many years, as the steam generator is a highly nonlinear system showing inverse response dynamics, especially at low operating power levels. Fuzzy critic-based learning is a reinforcement learning method based on dynamic programming. The only information available to the critic agent is the system feedback, which is interpreted as the last action the controller has performed in the previous state. The signal produced by the critic agent is used alongside the backpropagation-of-error algorithm to tune the conclusion parts of the fuzzy inference rules online. The critic agent here has a proportional-derivative structure and the fuzzy rule base has nine rules. The proposed controller shows satisfactory transient response, disturbance rejection and robustness to model uncertainty. Its simple design procedure and structure nominate it as a suitable controller design for steam generator water level control in the nuclear power plant industry.

  14. An expert system to manage the operation of the Space Shuttle's fuel cell cryogenic reactant tanks

    NASA Technical Reports Server (NTRS)

    Murphey, Amy Y.

    1990-01-01

    This paper describes a rule-based expert system to manage the operation of the Space Shuttle's cryogenic fuel system. Rules are based on standard fuel tank operating procedures described in the EECOM Console Handbook. The problem of configuring the operation of the Space Shuttle's fuel tanks is well-bounded and well defined. Moreover, the solution of this problem can be encoded in a knowledge-based system. Therefore, a rule-based expert system is the appropriate paradigm. Furthermore, the expert system could be used in coordination with power system simulation software to design operating procedures for specific missions.

  15. Theoretical and subjective bit assignments in transform picture

    NASA Technical Reports Server (NTRS)

    Jones, H. W., Jr.

    1977-01-01

    It is shown that all combinations of symmetrical input distributions with difference distortion measures give a bit assignment rule identical to the well-known rule for a Gaussian input distribution with mean-square error. Published work is examined to show that the bit assignment rule is useful for transforms of full pictures, but subjective bit assignments for transform picture coding using small block sizes are significantly different from the theoretical bit assignment rule. An intuitive explanation is based on subjective design experience, and a subjectively obtained bit assignment rule is given.
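The well-known rule the abstract refers to assigns coefficient k the average rate plus half the base-2 log of the ratio of its variance to the geometric mean variance. A minimal sketch with illustrative variances:

```python
# Classical transform-coding bit assignment: b_k = R + 0.5*log2(var_k / GM).
import math

def bit_assignment(variances, avg_rate):
    n = len(variances)
    geo_mean = math.exp(sum(math.log(v) for v in variances) / n)
    return [avg_rate + 0.5 * math.log2(v / geo_mean) for v in variances]

variances = [16.0, 4.0, 1.0, 0.25]
bits = bit_assignment(variances, avg_rate=2.0)
print([round(b, 2) for b in bits])   # [3.5, 2.5, 1.5, 0.5]
print(round(sum(bits), 6))           # 8.0 -- the total rate is preserved
```

High-variance coefficients receive proportionally more bits while the average rate is held fixed; the subjective assignments discussed in the paper deviate from this theoretical allocation for small block sizes.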

  16. Rule-based support system for multiple UMLS semantic type assignments

    PubMed Central

    Geller, James; He, Zhe; Perl, Yehoshua; Morrey, C. Paul; Xu, Julia

    2012-01-01

    Background When new concepts are inserted into the UMLS, they are assigned one or several semantic types from the UMLS Semantic Network by the UMLS editors. However, not every combination of semantic types is permissible. It was observed that many concepts with rare combinations of semantic types have erroneous semantic type assignments or prohibited combinations of semantic types. The correction of such errors is resource-intensive. Objective We design a computational system to inform UMLS editors as to whether a specific combination of two, three, four, or five semantic types is permissible or prohibited or questionable. Methods We identify a set of inclusion and exclusion instructions in the UMLS Semantic Network documentation and derive corresponding rule-categories as well as rule-categories from the UMLS concept content. We then design an algorithm adviseEditor based on these rule-categories. The algorithm specifies rules for an editor how to proceed when considering a tuple (pair, triple, quadruple, quintuple) of semantic types to be assigned to a concept. Results Eight rule-categories were identified. A Web-based system was developed to implement the adviseEditor algorithm, which returns for an input combination of semantic types whether it is permitted, prohibited or (in a few cases) requires more research. The numbers of semantic type pairs assigned to each rule-category are reported. Interesting examples for each rule-category are illustrated. Cases of semantic type assignments that contradict rules are listed, including recently introduced ones. Conclusion The adviseEditor system implements explicit and implicit knowledge available in the UMLS in a system that informs UMLS editors about the permissibility of a desired combination of semantic types. Using adviseEditor might help accelerate the work of the UMLS editors and prevent erroneous semantic type assignments. PMID:23041716
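The adviseEditor lookup described above can be sketched as a classification of a semantic type combination into one of the three verdicts. The example rule data is invented for illustration; the real system derives its eight rule-categories from the UMLS Semantic Network documentation and concept content:

```python
# Toy adviseEditor-style check for a combination of semantic types.

PERMITTED = {frozenset({"Disease or Syndrome", "Finding"})}
PROHIBITED = {frozenset({"Disease or Syndrome", "Plant"})}

def advise(semantic_types):
    combo = frozenset(semantic_types)   # order of types never matters
    if combo in PERMITTED:
        return "permitted"
    if combo in PROHIBITED:
        return "prohibited"
    return "requires review"            # the rare "more research" verdict

print(advise({"Disease or Syndrome", "Finding"}))  # permitted
print(advise({"Disease or Syndrome", "Plant"}))    # prohibited
```

Frozensets make the lookup order-independent, matching the fact that a UMLS concept's semantic types form an unordered combination.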

  17. Mode Matching for Optical Antennas

    NASA Astrophysics Data System (ADS)

    Feichtner, Thorsten; Christiansen, Silke; Hecht, Bert

    2017-11-01

    The emission rate of a point dipole can be strongly increased in the presence of a well-designed optical antenna. Yet, optical antenna design is largely based on radio-frequency rules, ignoring, e.g., Ohmic losses and non-negligible field penetration in metals at optical frequencies. Here, we combine reciprocity and Poynting's theorem to derive a set of optical-frequency antenna design rules for benchmarking and optimizing the performance of optical antennas driven by single quantum emitters. Based on these findings a novel plasmonic cavity antenna design is presented exhibiting a considerably improved performance compared to a reference two-wire antenna. Our work will be useful for the design of high-performance optical antennas and nanoresonators for diverse applications ranging from quantum optics to antenna-enhanced single-emitter spectroscopy and sensing.

  18. Systematic methods for knowledge acquisition and expert system development

    NASA Technical Reports Server (NTRS)

    Belkin, Brenda L.; Stengel, Robert F.

    1991-01-01

    Nine cooperating rule-based systems, collectively called AUTOCREW, were designed to automate functions and decisions associated with a combat aircraft's subsystem. The organization of tasks within each system is described; performance metrics were developed to evaluate the workload of each rule base, and to assess the cooperation between the rule-bases. Each AUTOCREW subsystem is composed of several expert systems that perform specific tasks. AUTOCREW's NAVIGATOR was analyzed in detail to understand the difficulties involved in designing the system and to identify tools and methodologies that ease development. The NAVIGATOR determines optimal navigation strategies from a set of available sensors. A Navigation Sensor Management (NSM) expert system was systematically designed from Kalman filter covariance data; four ground-based, a satellite-based, and two on-board INS-aiding sensors were modeled and simulated to aid an INS. The NSM Expert was developed using the Analysis of Variance (ANOVA) and the ID3 algorithm. Navigation strategy selection is based on an RSS position error decision metric, which is computed from the covariance data. Results show that the NSM Expert predicts position error correctly between 45 and 100 percent of the time for a specified navaid configuration and aircraft trajectory. The NSM Expert adapts to new situations, and provides reasonable estimates of hybrid performance. The systematic nature of the ANOVA/ID3 method makes it broadly applicable to expert system design when experimental or simulation data is available.

  19. Model-based Systems Engineering: Creation and Implementation of Model Validation Rules for MOS 2.0

    NASA Technical Reports Server (NTRS)

    Schmidt, Conrad K.

    2013-01-01

    Model-based Systems Engineering (MBSE) is an emerging modeling application that is used to enhance the system development process. MBSE allows for the centralization of project and system information that would otherwise be stored in extraneous locations, yielding better communication, expedited document generation and increased knowledge capture. Based on MBSE concepts and the employment of the Systems Modeling Language (SysML), extremely large and complex systems can be modeled from conceptual design through all system lifecycles. The Operations Revitalization Initiative (OpsRev) seeks to leverage MBSE to modernize the aging Advanced Multi-Mission Operations Systems (AMMOS) into the Mission Operations System 2.0 (MOS 2.0). The MOS 2.0 will be delivered in a series of conceptual and design models and documents built using the modeling tool MagicDraw. To ensure model completeness and cohesiveness, it is imperative that the MOS 2.0 models adhere to the specifications, patterns and profiles of the Mission Service Architecture Framework, thus leading to the use of validation rules. This paper outlines the process by which validation rules are identified, designed, implemented and tested. Ultimately, these rules provide the ability to maintain model correctness and synchronization in a simple, quick and effective manner, thus allowing the continuation of project and system progress.

  20. Effective Design of Multifunctional Peptides by Combining Compatible Functions

    PubMed Central

    Diener, Christian; Garza Ramos Martínez, Georgina; Moreno Blas, Daniel; Castillo González, David A.; Corzo, Gerardo; Castro-Obregon, Susana; Del Rio, Gabriel

    2016-01-01

    Multifunctionality is a common trait of many natural proteins and peptides, yet the rules to generate such multifunctionality remain unclear. We propose that the rules defining some protein/peptide functions are compatible. To explore this hypothesis, we trained a computational method to predict cell-penetrating peptides (CPPs) at the sequence level and learned that antimicrobial peptides (AMPs) and DNA-binding proteins are compatible with the rules of our predictor. Based on this finding, we expected that designing peptides for CPP activity may render AMP and DNA-binding activities. To test this prediction, we designed peptides that embedded two independent functional domains (nuclear localization and yeast pheromone activity), linked by optimizing their composition to fit the rules characterizing cell-penetrating peptides. These peptides presented effective cell penetration, DNA-binding, pheromone and antimicrobial activities, thus confirming the effectiveness of our computational approach to design multifunctional peptides with potential therapeutic uses. Our computational implementation is available at http://bis.ifc.unam.mx/en/software/dcf. PMID:27096600

  1. Hyper-heuristic Evolution of Dispatching Rules: A Comparison of Rule Representations.

    PubMed

    Branke, Jürgen; Hildebrandt, Torsten; Scholz-Reiter, Bernd

    2015-01-01

    Dispatching rules are frequently used for real-time, online scheduling in complex manufacturing systems. Design of such rules is usually done by experts in a time-consuming trial-and-error process. Recently, evolutionary algorithms have been proposed to automate the design process. There are several possibilities to represent rules for this hyper-heuristic search. Because the representation determines the search neighborhood and the complexity of the rules that can be evolved, a suitable choice of representation is key for a successful evolutionary algorithm. In this paper we empirically compare three different representations, both numeric and symbolic, for automated rule design: a linear combination of attributes, a representation based on artificial neural networks, and a tree representation. Using appropriate evolutionary algorithms (CMA-ES for the neural network and linear representations, genetic programming for the tree representation), we empirically investigate the suitability of each representation in a dynamic stochastic job shop scenario. We also examine the robustness of the evolved dispatching rules against variations in the underlying job shop scenario, and visualize what the rules do, in order to get an intuitive understanding of their inner workings. Results indicate that the tree representation using an improved version of genetic programming gives the best results if many candidate rules can be evaluated, closely followed by the neural network representation that already leads to good results for small to moderate computational budgets. The linear representation is found to be competitive only for extremely small computational budgets.
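
    The linear representation compared above amounts to a priority index that is a weighted sum of job attributes, with the weights left to the evolutionary algorithm. A minimal sketch, with hypothetical attributes and hand-picked weights standing in for an evolved individual:

```python
def linear_priority(job, weights):
    """Priority index: weighted sum of job attributes (larger = schedule first)."""
    return sum(weights[k] * job[k] for k in weights)

def dispatch(queue, weights):
    """Pick the waiting job with the highest priority index."""
    return max(queue, key=lambda job: linear_priority(job, weights))

# Hypothetical attributes; in the paper's setting these weights would be
# evolved by CMA-ES against simulated job-shop performance.
weights = {"proc_time": -1.0, "slack": -0.5, "wait": 0.2}
queue = [
    {"id": "A", "proc_time": 5.0, "slack": 2.0, "wait": 1.0},
    {"id": "B", "proc_time": 2.0, "slack": 8.0, "wait": 0.5},
    {"id": "C", "proc_time": 3.0, "slack": 1.0, "wait": 4.0},
]
print(dispatch(queue, weights)["id"])  # -> C
```

    The tree representation replaces the fixed weighted sum with an evolved expression tree over the same attributes, which is why it can express more complex rules at higher search cost.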

  2. Reliability based design of the primary structure of oil tankers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casella, G.; Dogliani, M.; Guedes Soares, C.

    1996-12-31

    The present paper describes the reliability analysis carried out for two oil tankers of comparable dimensions but different design. The scope of the analysis was to derive indications on the value of the reliability index obtained for existing, typical and well-designed oil tankers, as well as to apply the tentative rule checking formulation developed within the CEC-funded SHIPREL Project. The checking formula was adopted to redesign the midships section of one of the considered ships, upgrading her in order to meet the target failure probability considered in the rule development process. The resulting structure, owing to an upgrade of the steel grade in the central part of the deck, led to a convenient reliability level. The results of the analysis clearly showed that a large scatter exists presently in the design safety levels of ships, even when the Classification Societies' unified requirements are satisfied. A reliability based approach for the calibration of the rules for the global strength of ships is therefore proposed, in order to assist designers and Classification Societies in the process of producing ships which are more optimized, with respect to ensured safety levels. Based on the work reported in the paper, the feasibility and usefulness of a reliability based approach in the development of ship longitudinal strength requirements has been demonstrated.

  3. Effect of commercial and military performance requirements for transport category aircraft on space shuttle booster design and operation

    NASA Technical Reports Server (NTRS)

    Bithell, R. A.; Pence, W. A., Jr.

    1972-01-01

    The effect of two sets of performance requirements, commercial and military, on the design and operation of the space shuttle booster is evaluated. Critical thrust levels are established according to both sets of operating rules for the takeoff, cruise, and go-around flight modes, and the effect on engine requirements determined. Both flyback and ferry operations are considered. The impact of landing rules on potential shuttle flyback and ferry bases is evaluated. Factors affecting reserves are discussed, including winds, temperature, and nonstandard flight operations. Finally, a recommended set of operating rules is proposed for both flyback and ferry operations that allows adequate performance capability and safety margins without compromising design requirements for either flight phase.

  4. Unit operations for gas-liquid mass transfer in reduced gravity environments

    NASA Technical Reports Server (NTRS)

    Pettit, Donald R.; Allen, David T.

    1992-01-01

    Basic scaling rules are derived for converting Earth-based designs of mass transfer equipment into designs for a reduced gravity environment. Three types of gas-liquid mass transfer operations are considered: bubble columns, spray towers, and packed columns. Application of the scaling rules reveals that the height of a bubble column in lunar- and Mars-based operations would be lower than terrestrial designs by factors of 0.64 and 0.79 respectively. The reduced gravity columns would have greater cross-sectional areas, however, by factors of 2.4 and 1.6 for lunar and Martian settings. Similar results were obtained for spray towers. In contrast, packed column height was found to be nearly independent of gravity.
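
    The quoted factors are consistent with bubble-column height scaling as g^(1/4) and cross-sectional area as g^(-1/2). A small sketch under that assumption; the exponents are inferred here from the numbers in the abstract, not taken from the paper's derivation.

```python
def column_scaling(g_ratio, height_exp=0.25, area_exp=-0.5):
    """Scale factors for column height and cross-section vs. an Earth design.

    g_ratio is local gravity / Earth gravity. The default exponents are an
    assumption: they reproduce the factors quoted in the abstract, but the
    paper's actual scaling analysis may differ.
    """
    return g_ratio ** height_exp, g_ratio ** area_exp

for body, g in [("Moon", 0.165), ("Mars", 0.38)]:
    h, a = column_scaling(g)
    print(f"{body}: height x{h:.2f}, cross-section x{a:.2f}")
```

    With these exponents the Moon gives roughly 0.64x height and 2.5x area, and Mars roughly 0.79x and 1.6x, matching the abstract's factors to rounding.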

  5. Electromigration failures under bidirectional current stress

    NASA Astrophysics Data System (ADS)

    Tao, Jiang; Cheung, Nathan W.; Hu, Chenming

    1998-01-01

    Electromigration failure under DC stress has been studied for more than 30 years, and the methodologies for accelerated DC testing and design rules have been well established in the IC industry. However, the electromigration behavior and design rules under time-varying current stress are still unclear. In CMOS circuits, as many interconnects carry pulsed-DC (local VCC and VSS lines) and bidirectional AC current (clock and signal lines), it is essential to assess the reliability of metallization systems under these conditions. Failure mechanisms of different metallization systems (Al-Si, Al-Cu, Cu, TiN/Al-alloy/TiN, etc.) and different metallization structures (via, plug and interconnect) under AC current stress in a wide frequency range (from mHz to 500 MHz) were studied in this paper. Based on these experimental results, a damage healing model is developed, and electromigration design rules are proposed. The results show that in the circuit operating frequency range, the "design-rule current" is the time-average current. The pure AC component of the current only contributes to self-heating, while the average (DC component) current contributes to electromigration. To ensure longer thermal-migration lifetime under high frequency AC stress, an additional design rule is proposed to limit the temperature rise due to self-Joule heating.
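
    The proposed rule separates two moments of the current waveform: the time average (which drives electromigration) and the RMS value (which sets self-Joule heating). A minimal sketch on a sampled waveform; the example waveform is hypothetical.

```python
import math

def em_design_currents(samples):
    """Return (average, RMS) of a sampled current waveform.

    Per the abstract's proposed rules, the average current is the
    electromigration "design-rule current", while the RMS current governs
    the self-Joule-heating temperature-rise limit.
    """
    n = len(samples)
    avg = sum(samples) / n
    rms = math.sqrt(sum(i * i for i in samples) / n)
    return avg, rms

# Hypothetical bidirectional waveform (mA): mostly +1 with brief reversals.
wave = [1.0] * 6 + [-1.0] * 2
avg, rms = em_design_currents(wave)
print(avg, rms)  # -> 0.5 1.0
```

    Here the reversals cut the electromigration-relevant average in half while the full waveform amplitude still contributes to heating, which is why the abstract adds a separate temperature-rise rule for high-frequency AC stress.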

  6. Rule-based topology system for spatial databases to validate complex geographic datasets

    NASA Astrophysics Data System (ADS)

    Martinez-Llario, J.; Coll, E.; Núñez-Andrés, M.; Femenia-Ribera, C.

    2017-06-01

    A rule-based topology software system providing a highly flexible and fast procedure to enforce integrity in spatial relationships among datasets is presented. This improved topology rule system is built over the spatial extension Jaspa. Both projects are open source, freely available software developed by the corresponding author of this paper. Currently, there is no spatial DBMS that implements a rule-based topology engine (considering that the topology rules are designed and performed in the spatial backend). If the topology rules are applied in the frontend (as in many GIS desktop programs), ArcGIS is the most advanced solution. The system presented in this paper has several major advantages over the ArcGIS approach: it can be extended with new topology rules, it has a much wider set of rules, and it can mix feature attributes with topology rules as filters. In addition, the topology rule system can work with various DBMSs, including PostgreSQL, H2 or Oracle, and the logic is performed in the spatial backend. The proposed topology system allows users to check the complex spatial relationships among features (from one or several spatial layers) that require some complex cartographic datasets, such as the data specifications proposed by INSPIRE in Europe and the Land Administration Domain Model (LADM) for Cadastral data.

  7. Using an improved association rules mining optimization algorithm in web-based mobile-learning system

    NASA Astrophysics Data System (ADS)

    Huang, Yin; Chen, Jianhua; Xiong, Shaojun

    2009-07-01

    Mobile-Learning (M-learning) gives many learners the advantages of both traditional learning and E-learning. Currently, Web-based Mobile-Learning Systems have created many new ways of learning and defined new relationships between educators and learners. Association rule mining is one of the most important fields in data mining and knowledge discovery in databases. Rule explosion is a serious problem that causes great concern, as conventional mining algorithms often produce too many rules for decision makers to digest. Since a Web-based Mobile-Learning System collects vast amounts of student profile data, data mining and knowledge discovery techniques can be applied to find interesting relationships between attributes of learners, assessments, the solution strategies adopted by learners, and so on. Therefore, this paper focuses on a new data-mining algorithm called ARGSA (Association Rules based on an improved Genetic Simulated Annealing Algorithm), which combines the advantages of the genetic algorithm and the simulated annealing algorithm to mine association rules. The paper first takes advantage of a parallel genetic algorithm and simulated annealing algorithm designed specifically for discovering association rules. Moreover, analysis and experiments show that the proposed method is superior to the Apriori algorithm in this Mobile-Learning system.
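
    Whatever search heuristic is used (Apriori, or the genetic/simulated-annealing hybrid above), candidate association rules are scored by support and confidence. A minimal sketch with hypothetical learner-profile transactions:

```python
def support(transactions, itemset):
    """Fraction of transactions that contain every item in `itemset`."""
    itemset = set(itemset)
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(transactions, antecedent, consequent):
    """Estimated P(consequent | antecedent) over the transactions."""
    both = set(antecedent) | set(consequent)
    return support(transactions, both) / support(transactions, antecedent)

# Hypothetical student-profile transactions from an M-learning system.
logs = [
    {"video", "quiz", "pass"},
    {"video", "pass"},
    {"quiz", "fail"},
    {"video", "quiz", "pass"},
]
print(support(logs, {"video", "pass"}))       # -> 0.75
print(confidence(logs, {"video"}, {"pass"}))  # -> 1.0
```

    The rule-explosion problem the abstract mentions arises because the number of itemsets passing a support threshold can grow combinatorially, which motivates heuristic search over exhaustive enumeration.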

  8. Ultimate strength performance of tankers associated with industry corrosion addition practices

    NASA Astrophysics Data System (ADS)

    Kim, Do Kyun; Kim, Han Byul; Zhang, Xiaoming; Li, Chen Guang; Paik, Jeom Kee

    2014-09-01

    In ship and offshore structure design, age-related problems such as corrosion damage, local denting, and fatigue damage are important factors to be considered in building a reliable structure, as they have a significant influence on the residual structural capacity. In shipping, corrosion addition methods are widely adopted in structural design to prevent structural capacity degradation. The present study focuses on the historical trend of corrosion addition rules for ship structural design and investigates their effects on the ultimate strength performance of the hull girders and stiffened panels of double hull oil tankers. Three rule-based corrosion addition models, namely the historic corrosion rules (pre-CSR), the Common Structural Rules (CSR), and the harmonised Common Structural Rules (CSRH), are considered and compared with two other corrosion models: the UGS model, suggested by the Union of Greek Shipowners (UGS), and the Time-Dependent Corrosion Wastage Model (TDCWM). To identify the general trend in the effects of corrosion damage on the ultimate longitudinal strength performance, the corrosion addition rules are applied to four representative sizes of double hull oil tankers, namely Panamax, Aframax, Suezmax, and VLCC. The results are helpful in understanding the trend of corrosion additions for tanker structures.

  9. Strength Analysis on Ship Ladder Using Finite Element Method

    NASA Astrophysics Data System (ADS)

    Budianto; Wahyudi, M. T.; Dinata, U.; Ruddianto; Eko P., M. M.

    2018-01-01

    In designing a ship's structure, the designer must follow the rules of the applicable classification standards. In this case, the design of a ladder (staircase) on a ferry ship must be reviewed against the loads arising during ship operations, both while sailing and during port operations. The classification rules for ship design prescribe the calculation of structural components, which can be analysed using the Finite Element Method. The classification regulations used in the design of the ferry ship were those of BKI (Bureau of Classification Indonesia), so the material composition and mechanical properties must likewise conform to the classification of the vessel. The structure was analysed using a structural analysis package based on the Finite Element Method. The analysis showed that the ladder structure can withstand a load of 140 kg under static, dynamic, and impact conditions. The resulting safety factors indicate that the structure is safe without being excessively strong.

  10. Traditional versus rule-based programming techniques: Application to the control of optional flight information

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell R.; Abbott, Kathy H.

    1987-01-01

    To the software design community, the concern over the costs associated with a program's execution time and implementation is great. It is always desirable, and sometimes imperative, that the proper programming technique is chosen which minimizes all costs for a given application or type of application. A study is described that compared cost-related factors associated with traditional programming techniques to rule-based programming techniques for a specific application. The results of this study favored the traditional approach regarding execution efficiency, but favored the rule-based approach regarding programmer productivity (implementation ease). Although this study examined a specific application, the results should be widely applicable.

  11. Quantum-splitting oxide-based phosphors, method of producing, and rules for designing the same

    DOEpatents

    Setlur, Anant Achyut; Comanzo, Holly Ann; Srivastava, Alok Mani

    2003-09-16

    Strontium and strontium calcium aluminates and lanthanum and lanthanum magnesium borates activated with Pr3+ and Mn2+ exhibit characteristics of quantum-splitting phosphors. Improved quantum efficiency may be obtained by further doping with Gd3+. Refined rules for designing quantum-splitting phosphors include the requirement of incorporation of Gd3+ and Mn2+ in the host lattice for facilitation of energy migration.

  12. Spectromicroscopic insights for rational design of redox-based memristive devices

    PubMed Central

    Baeumer, Christoph; Schmitz, Christoph; Ramadan, Amr H. H.; Du, Hongchu; Skaja, Katharina; Feyer, Vitaliy; Müller, Philipp; Arndt, Benedikt; Jia, Chun-Lin; Mayer, Joachim; De Souza, Roger A.; Michael Schneider, Claus; Waser, Rainer; Dittmann, Regina

    2015-01-01

    The demand for highly scalable, low-power devices for data storage and logic operations is strongly stimulating research into resistive switching as a novel concept for future non-volatile memory devices. To meet technological requirements, it is imperative to have a set of material design rules based on fundamental material physics, but deriving such rules is proving challenging. Here, we elucidate both switching mechanism and failure mechanism in the valence-change model material SrTiO3, and on this basis we derive a design rule for failure-resistant devices. Spectromicroscopy reveals that the resistance change during device operation and failure is indeed caused by nanoscale oxygen migration resulting in localized valence changes between Ti4+ and Ti3+. While fast reoxidation typically results in retention failure in SrTiO3, local phase separation within the switching filament stabilizes the retention. Mimicking this phase separation by intentionally introducing retention-stabilization layers with slow oxygen transport improves retention times considerably. PMID:26477940

  13. A RULE-BASED SYSTEM FOR EVALUATING FINAL COVERS FOR HAZARDOUS WASTE LANDFILLS

    EPA Science Inventory

    This chapter examines how rules are used as a knowledge representation formalism in the domain of hazardous waste management. A specific example from this domain involves performance evaluation of final covers used to close hazardous waste landfills. Final cover design and associ...

  14. Knowledge-based control of an adaptive interface

    NASA Technical Reports Server (NTRS)

    Lachman, Roy

    1989-01-01

    The analysis, development strategy, and preliminary design for an intelligent, adaptive interface are reported. The design philosophy couples knowledge-based system technology with standard human factors approaches to interface development for computer workstations. An expert system has been designed to drive the interface for application software. The intelligent interface will be linked to application packages, one at a time, that are planned for multiple-application workstations aboard Space Station Freedom. Current requirements call for most Space Station activities to be conducted at the workstation consoles. One set of activities will consist of standard data management services (DMS). DMS software includes text processing, spreadsheets, data base management, etc. Text processing was selected for the first intelligent interface prototype because text-processing software can be developed initially as fully functional but limited with a small set of commands. The program's complexity then can be increased incrementally. The rule base includes a model of the operator's behavior and three types of instructions to the underlying application software. A conventional expert-system inference engine searches the data base for antecedents to rules and sends the consequents of fired rules as commands to the underlying software. The expert system will be applied to a second application, a database management system, following behavioral research on the first application. The intelligent interface design is suitable for use with ground-based workstations now common in government, industrial, and educational organizations.
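
    The inference cycle the abstract describes, matching rule antecedents against the fact base and emitting consequents as commands, is plain forward chaining. A minimal sketch; the interface rules shown are hypothetical, not those of the Space Station prototype.

```python
def forward_chain(facts, rules):
    """Fire every rule whose antecedents all hold; repeat until quiescent.

    Each rule is (antecedents, consequent). Returns the accumulated fact
    set, including any command facts produced by fired rules.
    """
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if set(antecedents) <= facts and consequent not in facts:
                facts.add(consequent)
                changed = True
    return facts

# Hypothetical interface rules: operator state -> command to the text processor.
rules = [
    (["editing", "idle_long"], "offer_save"),
    (["offer_save", "accepted"], "cmd:save_document"),
]
print(forward_chain({"editing", "idle_long", "accepted"}, rules))
```

    Note the second rule fires only because the first one added "offer_save" on an earlier pass, which is the chaining behavior an inference engine provides over a flat rule lookup.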

  15. Building distributed rule-based systems using the AI Bus

    NASA Technical Reports Server (NTRS)

    Schultz, Roger D.; Stobie, Iain C.

    1990-01-01

    The AI Bus software architecture was designed to support the construction of large-scale, production-quality applications in areas of high technology flux, running in heterogeneous distributed environments, utilizing a mix of knowledge-based and conventional components. These goals led to its current development as a layered, object-oriented library for cooperative systems. This paper describes the concepts and design of the AI Bus and its implementation status as a library of reusable and customizable objects, structured by layers from operating system interfaces up to high-level knowledge-based agents. Each agent is a semi-autonomous process with specialized expertise, and consists of a number of knowledge sources (a knowledge base and inference engine). Inter-agent communication mechanisms are based on blackboards and Actors-style acquaintances. As a conservative first implementation, we used C++ on top of Unix, and wrapped an embedded CLIPS with methods for the knowledge source class. This involved designing standard protocols for communication and functions which use these protocols in rules. Embedding several CLIPS objects within a single process was an unexpected problem because of global variables; the solution involved constructing and recompiling a C++ version of CLIPS. We are currently working on a more radical approach to incorporating CLIPS, by separating out its pattern matcher, rule and fact representations, and other components as true object-oriented modules.

  16. Realization of planning design of mechanical manufacturing system by Petri net simulation model

    NASA Astrophysics Data System (ADS)

    Wu, Yanfang; Wan, Xin; Shi, Weixiang

    1991-09-01

    Planning design works out an overall, long-term plan. To guarantee that a mechanical manufacturing system (MMS) is designed to obtain maximum economic benefit, it is necessary to carry out a reasonable planning design for the system. First, some principles of planning design for MMS are introduced, and production scheduling problems and their decision rules for computer simulation are presented. Methods for realizing each production scheduling decision rule in the Petri net model are discussed. Second, rules for resolving conflicts that arise while running the Petri net are given. Third, based on the Petri net model of the MMS, which includes part flow and tool flow, and according to the principle of minimum event time advance, a computer dynamic simulation of the Petri net model, that is, of the MMS itself, is realized. Finally, the simulation program is applied to an example, so that a planning design scheme for the MMS can be evaluated effectively.

  17. VIP: A knowledge-based design aid for the engineering of space systems

    NASA Technical Reports Server (NTRS)

    Lewis, Steven M.; Bellman, Kirstie L.

    1990-01-01

    The Vehicles Implementation Project (VIP), a knowledge-based design aid for the engineering of space systems is described. VIP combines qualitative knowledge in the form of rules, quantitative knowledge in the form of equations, and other mathematical modeling tools. The system allows users rapidly to develop and experiment with models of spacecraft system designs. As information becomes available to the system, appropriate equations are solved symbolically and the results are displayed. Users may browse through the system, observing dependencies and the effects of altering specific parameters. The system can also suggest approaches to the derivation of specific parameter values. In addition to providing a tool for the development of specific designs, VIP aims at increasing the user's understanding of the design process. Users may rapidly examine the sensitivity of a given parameter to others in the system and perform tradeoffs or optimizations of specific parameters. A second major goal of VIP is to integrate the existing corporate knowledge base of models and rules into a central, symbolic form.

  18. Automated Title Page Cataloging: A Feasibility Study.

    ERIC Educational Resources Information Center

    Weibel, Stuart; And Others

    1989-01-01

    Describes the design of a prototype rule-based system for the automation of descriptive cataloging from title pages. The discussion covers the results of tests of the prototype, major impediments to automatic cataloging from title pages, and prospects for further progress. The rules implemented in the prototype are appended. (16 references)…

  19. Evaluation of a rule base for decision making in general practice.

    PubMed Central

    Essex, B; Healy, M

    1994-01-01

    BACKGROUND. Decision making in general practice relies heavily on judgmental expertise. It should be possible to codify this expertise into rules and principles. AIM. A study was undertaken to evaluate the effectiveness of rules from a rule base designed to improve students' and trainees' management decisions relating to patients seen in general practice. METHOD. The rule base was developed after studying decisions about and management of thousands of patients seen in one general practice over an eight year period. Vignettes were presented to 93 fourth year medical students and 179 general practitioner trainees. They recorded their perception and management of each case before and after being presented with a selection of relevant rules. Participants also commented on their level of agreement with each of the rules provided with the vignettes. A panel of five independent assessors then rated as good, acceptable or poor, the participants' perception and management of each case before and after seeing the rules. RESULTS. Exposure to a few selected rules of thumb improved the problem perception and management decisions of both undergraduates and trainees. The degree of improvement was not related to previous experience or to the stated level of agreement with the proposed rules. The assessors identified difficulties students and trainees experienced in changing their perceptions and management decisions when the rules suggested options they had not considered. CONCLUSION. The rules developed to improve decision making skills in general practice are effective when used with vignettes. The next phase is to transform the rule base into an expert system to train students and doctors to acquire decision making skills. It could also be used to provide decision support when confronted with difficult management decisions in general practice. PMID:8204334

  20. Comparison of conventional rule based flow control with control processes based on fuzzy logic in a combined sewer system.

    PubMed

    Klepiszewski, K; Schmitt, T G

    2002-01-01

    While conventional rule-based, real-time flow control of sewer systems is in common use, control systems based on fuzzy logic have been used only rarely, but successfully. The intention of this study is to compare a conventional rule-based control of a combined sewer system with a fuzzy logic control by using hydrodynamic simulation. The objective of both control strategies is to reduce the combined sewer overflow volume by optimizing the utilized storage capacities of four combined sewer overflow tanks. The control systems adjust the outflow of the four combined sewer overflow tanks depending on the water levels inside the structures, and both use an identical rule base. The developed control systems are tested and optimized for a single storm event with heterogeneous hydraulic load conditions and local discharges. Finally, the efficiencies of the two control systems are compared for two further storm events. The results indicate that the conventional rule-based control and the fuzzy control reach the objective of the control strategy to a similar degree. In spite of the higher expense of designing the fuzzy control system, its use provides no advantages in this case.
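
    The contrast between the two controllers can be sketched on a single tank: a crisp rule switches outflow at a level threshold, while a fuzzy controller blends outflow levels by degree of membership. The thresholds, flows, and membership function below are illustrative, not the study's parameters.

```python
def crisp_outflow(level, q_min=0.1, q_max=1.0, threshold=0.5):
    """Conventional rule: low outflow below the fill threshold, full above.

    `level` is the tank fill fraction (0..1); flows are arbitrary units.
    """
    return q_max if level >= threshold else q_min

def fuzzy_outflow(level, q_min=0.1, q_max=1.0):
    """Fuzzy-style control: outflow weighted by membership in 'tank full'.

    Here the membership function is simply the clipped fill fraction; a
    real fuzzy controller would use tuned membership functions and a
    rule base over several tanks.
    """
    mu_full = min(max(level, 0.0), 1.0)
    return (1 - mu_full) * q_min + mu_full * q_max

for level in (0.2, 0.5, 0.9):
    print(level, crisp_outflow(level), round(fuzzy_outflow(level), 2))
```

    The fuzzy variant removes the discontinuity at the threshold; whether that smoothness buys anything in practice is exactly what the study's simulations test.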

  1. Analytic and rule-based decision support tool for VDT workstation adjustment and computer accessories arrangement.

    PubMed

    Rurkhamet, Busagarin; Nanthavanij, Suebsak

    2004-12-01

    One important factor that leads to the development of musculoskeletal disorders (MSD) and cumulative trauma disorders (CTD) among visual display terminal (VDT) users is their work posture. While operating a VDT, a user's body posture is strongly influenced by the task, VDT workstation settings, and layout of computer accessories. This paper presents an analytic and rule-based decision support tool called EQ-DeX (an ergonomics and quantitative design expert system) that is developed to provide valid and practical recommendations regarding the adjustment of a VDT workstation and the arrangement of computer accessories. The paper explains the structure and components of EQ-DeX, input data, rules, and adjustment and arrangement algorithms. From input information such as gender, age, body height, task, etc., EQ-DeX uses analytic and rule-based algorithms to estimate quantitative settings of a computer table and a chair, as well as locations of computer accessories such as monitor, document holder, keyboard, and mouse. With the input and output screens that are designed using the concept of usability, the interactions between the user and EQ-DeX are convenient. Examples are also presented to demonstrate the recommendations generated by EQ-DeX.
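
    The analytic part of such a tool maps user measurements to quantitative workstation settings via anthropometric rules. A toy sketch of the idea; the ratios below are illustrative placeholders only and are not EQ-DeX's actual rules or validated ergonomic coefficients.

```python
def workstation_settings(stature_cm):
    """Rule-of-thumb VDT settings (cm) from standing body height.

    The coefficients are hypothetical stand-ins for anthropometric
    regression rules; a real system like EQ-DeX also conditions on
    gender, age, and task.
    """
    return {
        "seat": round(0.25 * stature_cm, 1),         # ~ popliteal height
        "table": round(0.42 * stature_cm, 1),        # ~ seated elbow height
        "monitor_top": round(0.73 * stature_cm, 1),  # ~ seated eye height
    }

print(workstation_settings(170))
```

    The rule-based part then layers discrete recommendations (e.g., document holder beside the monitor for copy-intensive tasks) on top of these continuous settings.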

  2. Intelligent Diagnostic Assistant for Complicated Skin Diseases through C5's Algorithm.

    PubMed

    Jeddi, Fatemeh Rangraz; Arabfard, Masoud; Kermany, Zahra Arab

    2017-09-01

    An intelligent diagnostic assistant can be used for the complicated diagnosis of skin diseases, which are among the most common causes of disability. The aim of this study was to design and implement a computerized intelligent diagnostic assistant for complicated skin diseases using the C5 algorithm. An applied-developmental study was done in 2015. The knowledge base was developed from interviews with dermatologists through questionnaires and checklists. Knowledge representation was obtained from the training data in the database using Microsoft Office Excel. Clementine software and the C5 algorithm were applied to draw the decision tree. Analysis of test accuracy was performed based on rules extracted using inference chains. The rules extracted from the decision tree were entered into the CLIPS programming environment, and the intelligent diagnostic assistant was then designed. The rules were defined using the forward-chaining inference technique and were entered into the CLIPS programming environment as RULEs. The accuracy and error rates obtained from the decision tree in the training phase were 99.56% and 0.44%, respectively. In the test phase, the accuracy of the decision tree was 98% and the error was 2%. The intelligent diagnostic assistant can be used as a reliable system with high accuracy, sensitivity, specificity, and agreement.

  3. Design rules for quantum imaging devices: experimental progress using CMOS single-photon detectors

    NASA Astrophysics Data System (ADS)

    Charbon, Edoardo; Gunther, Neil J.; Boiko, Dmitri L.; Beretta, Giordano B.

    2006-08-01

    We continue our previous program [1], where we introduced a set of quantum-based design rules directed at quantum engineers who design single-photon quantum communications and quantum imaging devices. Here, we report on experimental progress using SPAD (single photon avalanche diode) arrays of our design and fabricated in CMOS (complementary metal oxide semiconductor) technology. Emerging high-resolution imaging techniques based on SPAD arrays have proven useful in a variety of disciplines including bio-fluorescence microscopy and 3D vision systems. They have also been particularly successful for intra-chip optical communications implemented entirely in CMOS technology. More importantly for our purposes, a very low dark count allows SPADs to detect rare photon events with a high dynamic range and high signal-to-noise ratio. Our CMOS SPADs support multi-channel detection of photon arrivals with picosecond accuracy, several million times per second, due to a very short detection cycle. The tiny chip area means they are suitable for highly miniaturized quantum imaging devices, and that is how we employ them in this paper. Our quantum path integral analysis of the Young-Afshar-Wheeler interferometer showed that Bohr's complementarity principle was not violated due to the previously overlooked effect of photon bifurcation within the lens--a phenomenon consistent with our quantum design rules--which accounts for the loss of which-path information in the presence of interference. In this paper, we report on our progress toward the construction of quantitative design rules as well as some proposed tests for quantum imaging devices using entangled photon sources with our SPAD imager.

  4. Fuzzy based attitude controller for flexible spacecraft with on/off thrusters

    NASA Astrophysics Data System (ADS)

    Knapp, Roger G.; Adams, Neil J.

    A fuzzy-based attitude controller is designed for attitude control of a generic spacecraft with on/off thrusters. The controller is comprised of packages of rules dedicated to addressing different objectives (e.g., disturbance rejection, low fuel consumption, avoiding the excitation of flexible appendages, etc.). These rule packages can be inserted or removed depending on the requirements of the particular spacecraft and are parameterized based on vehicle parameters such as inertia or operational parameters such as the maneuvering rate. Individual rule packages can be 'weighted' relative to each other to emphasize the importance of one objective relative to another. Finally, the fuzzy controller and rule packages are demonstrated using the high-fidelity Space Shuttle Interactive On-Orbit Simulator (IOS) while performing typical on-orbit operations and are subsequently compared with the existing shuttle flight control system performance.
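
    The rule-package idea can be sketched minimally as follows, assuming triangular membership functions, a single rate-error input, and invented package weights (not the paper's actual controller):

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_command(rate_error, package_weights):
    """On/off thruster decision from two weighted rule packages."""
    # Disturbance-rejection package: fire when the rate error is large.
    disturbance = tri(rate_error, 0.5, 2.0, 4.0)
    # Fuel-saving package: hold fire while inside the deadband.
    fuel_saving = tri(rate_error, -0.5, 0.0, 0.5)
    vote = (package_weights["disturbance"] * disturbance
            - package_weights["fuel"] * fuel_saving)
    return 1 if vote > 0.5 else 0

print(fuzzy_command(2.0, {"disturbance": 1.0, "fuel": 0.5}))
```

    Raising or lowering a package's weight emphasises one objective over the other, mirroring the weighting scheme described in the abstract.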

  5. Fuzzy based attitude controller for flexible spacecraft with on/off thrusters

    NASA Astrophysics Data System (ADS)

    Knapp, Roger Glenn

    1993-05-01

    A fuzzy-based attitude controller is designed for attitude control of a generic spacecraft with on/off thrusters. The controller is comprised of packages of rules dedicated to addressing different objectives (e.g., disturbance rejection, low fuel consumption, avoiding the excitation of flexible appendages, etc.). These rule packages can be inserted or removed depending on the requirements of the particular spacecraft and are parameterized based on vehicle parameters such as inertia or operational parameters such as the maneuvering rate. Individual rule packages can be 'weighted' relative to each other to emphasize the importance of one objective relative to another. Finally, the fuzzy controller and rule packages are demonstrated using the high-fidelity Space Shuttle Interactive On-Orbit Simulator (IOS) while performing typical on-orbit operations and are subsequently compared with the existing shuttle flight control system performance.

  6. Life insurance risk assessment using a fuzzy logic expert system

    NASA Technical Reports Server (NTRS)

    Carreno, Luis A.; Steel, Roy A.

    1992-01-01

    In this paper, we present a knowledge-based system that combines fuzzy processing with rule-based processing to form an improved decision aid for evaluating risk for life insurance. This application illustrates the use of FuzzyCLIPS to build a knowledge-based decision support system possessing fuzzy components to improve user interactions and KBS performance. The results employing FuzzyCLIPS are compared with the results obtained from the solution of the problem using traditional numerical equations. The design of the fuzzy solution consists of a CLIPS rule-based system for some factors combined with fuzzy logic rules for others. This paper describes the problem, proposes a solution, presents the results, and provides a sample output of the software product.
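
    The hybrid design described above, a crisp CLIPS-style rule combined with fuzzy-membership scoring, can be sketched as follows; the factors, thresholds, and weights are invented for illustration:

```python
def fuzzy_high_bp(systolic):
    """Degree to which blood pressure is 'high' (linear ramp 120 -> 160)."""
    return min(1.0, max(0.0, (systolic - 120) / 40.0))

def risk_score(age, systolic, smoker):
    score = 0.0
    # Crisp, CLIPS-style rule: smokers receive a fixed penalty.
    if smoker:
        score += 0.3
    # Fuzzy rules contribute in proportion to membership degree.
    score += 0.4 * fuzzy_high_bp(systolic)
    score += 0.3 * min(1.0, max(0.0, (age - 40) / 40.0))
    return round(score, 2)

print(risk_score(60, 160, smoker=True))
```

    The fuzzy terms avoid the hard cliff a purely crisp rule base would impose at each threshold, which is the motivation for adding fuzzy components to the decision aid.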

  7. Optics Toolbox: An Intelligent Relational Database System For Optical Designers

    NASA Astrophysics Data System (ADS)

    Weller, Scott W.; Hopkins, Robert E.

    1986-12-01

    Optical designers were among the first to use the computer as an engineering tool. Powerful programs have been written to do ray-trace analysis, third-order layout, and optimization. However, newer computing techniques such as database management and expert systems have not been adopted by the optical design community. For the purpose of this discussion we will define a relational database system as a database which allows the user to specify his requirements using logical relations. For example, to search for all lenses in a lens database with an F/number less than two and a half field of view near 28 degrees, you might enter the following:

        FNO < 2.0 and FOV of 28 degrees ± 5%

    Again for the purpose of this discussion, we will define an expert system as a program which contains expert knowledge, can ask intelligent questions, and can form conclusions based on the answers given and the knowledge which it contains. Most expert systems store this knowledge in the form of rules-of-thumb, which are written in an English-like language and are easily modified by the user. An example rule is:

        IF require microscope objective in air and require NA > 0.9
        THEN suggest the use of an oil immersion objective

    The heart of the expert system is the rule interpreter, sometimes called an inference engine, which reads the rules and forms conclusions based on them. The use of a relational database system containing lens prototypes seems to be a viable prospect. However, it is not clear that expert systems have a place in optical design. In domains such as medical diagnosis and petrology, expert systems are flourishing. These domains are quite different from optical design, however, because optical design is a creative process, and the rules are difficult to write down. We do think that an expert system is feasible in the area of first-order layout, which is sufficiently diagnostic in nature to permit useful rules to be written. This first-order expert would emulate an expert designer as he interacts with a customer for the first time: asking the right questions, forming conclusions, and making suggestions. With these objectives in mind, we have developed the Optics Toolbox. Optics Toolbox is actually two programs in one: a powerful relational database system with twenty-one search parameters, four search modes, and multi-database support, and a first-order optical design expert system with a rule interpreter that has full access to the relational database. The system schematic is shown in Figure 1.
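
    The relational query discussed above (FNO < 2.0, FOV within 5% of 28 degrees) can be sketched against an invented in-memory lens table:

```python
# Hypothetical lens prototypes; the real Optics Toolbox database holds
# twenty-one search parameters per entry.
LENSES = [
    {"name": "proto-A", "fno": 1.8, "fov_deg": 28.5},
    {"name": "proto-B", "fno": 2.4, "fov_deg": 28.0},
    {"name": "proto-C", "fno": 1.4, "fov_deg": 45.0},
]

def search(lenses, fno_max, fov_target, fov_tol=0.05):
    """Return names of lenses with FNO < fno_max and FOV within fov_tol."""
    lo, hi = fov_target * (1 - fov_tol), fov_target * (1 + fov_tol)
    return [lens["name"] for lens in lenses
            if lens["fno"] < fno_max and lo <= lens["fov_deg"] <= hi]

print(search(LENSES, 2.0, 28.0))  # -> ['proto-A']
```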

  8. Systematic methods for knowledge acquisition and expert system development

    NASA Technical Reports Server (NTRS)

    Belkin, Brenda L.; Stengel, Robert F.

    1991-01-01

    Nine cooperating rule-based systems, collectively called AUTOCREW, which were designed to automate functions and decisions associated with a combat aircraft's subsystems, are discussed. The organization of tasks within each system is described; performance metrics were developed to evaluate the workload of each rule base and to assess the cooperation between the rule bases. Simulation and comparative workload results for two mission scenarios are given: an inbound surface-to-air-missile attack on the aircraft, and pilot incapacitation. The methodology used to develop the AUTOCREW knowledge bases is summarized. Issues involved in designing the navigation sensor selection expert in AUTOCREW's NAVIGATOR knowledge base are discussed in detail. The performance of seven navigation systems aiding a medium-accuracy INS was investigated using Kalman filter covariance analyses. A navigation sensor management (NSM) expert system was formulated from covariance simulation data using the analysis of variance (ANOVA) method and the ID3 algorithm. ANOVA results show that statistically different position accuracies are obtained when different navaids are used, the number of navaids aiding the INS is varied, the aircraft's trajectory is varied, and the performance history is varied. The ID3 algorithm determines the NSM expert's classification rules in the form of decision trees. The performance of these decision trees was assessed on two arbitrary trajectories, and the results demonstrate that the NSM expert adapts to new situations and provides reasonable estimates of the expected hybrid performance.
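
    The core ID3 step used to build the NSM expert's decision trees, choosing the attribute with the highest information gain, can be sketched as follows; the navaid data are invented for illustration:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def info_gain(rows, attr, labels):
    """Entropy reduction from splitting the rows on one attribute."""
    gain = entropy(labels)
    n = len(rows)
    for value in set(r[attr] for r in rows):
        subset = [lab for r, lab in zip(rows, labels) if r[attr] == value]
        gain -= (len(subset) / n) * entropy(subset)
    return gain

rows = [{"navaid": "GPS",   "phase": "cruise"},
        {"navaid": "GPS",   "phase": "approach"},
        {"navaid": "TACAN", "phase": "cruise"},
        {"navaid": "TACAN", "phase": "approach"}]
labels = ["good", "good", "poor", "poor"]

# 'navaid' separates good from poor perfectly; 'phase' carries no information.
print(info_gain(rows, "navaid", labels), info_gain(rows, "phase", labels))  # -> 1.0 0.0
```

    ID3 recursively splits on the highest-gain attribute until each leaf is pure, yielding decision trees like those formulated from the covariance simulation data.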

  9. An Integrated Children Disease Prediction Tool within a Special Social Network.

    PubMed

    Apostolova Trpkovska, Marika; Yildirim Yayilgan, Sule; Besimi, Adrian

    2016-01-01

    This paper proposes a social network with an integrated children disease prediction system developed using the specially designed Children General Disease Ontology (CGDO). This ontology consists of children diseases and their relationships with symptoms, together with Semantic Web Rule Language (SWRL) rules specially designed for predicting diseases. The prediction process starts with the user entering data about the presenting signs and symptoms, which are then mapped to the CGDO ontology. Once the data are mapped, the prediction phase executes the rules, extracts the predicted disease details specified by the matching SWRL rule, and presents the results. The motivation behind the development of this system is to spread knowledge about children diseases and their symptoms in a very simple way using the specialized social networking website www.emama.mk.
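
    The prediction step can be sketched as matching the entered symptoms against SWRL-style rule bodies; the diseases and symptom sets below are invented, not the actual CGDO content:

```python
# Each rule body is a symptom set that must be fully satisfied,
# mimicking a conjunctive SWRL rule antecedent.
RULES = {
    "measles": {"fever", "rash", "cough"},
    "chickenpox": {"fever", "itchy_blisters"},
}

def predict(symptoms: set) -> list:
    """Return diseases whose rule body is entirely contained in the symptoms."""
    return sorted(d for d, body in RULES.items() if body <= symptoms)

print(predict({"fever", "rash", "cough", "headache"}))  # -> ['measles']
```

    A real SWRL engine would also exploit the ontology's class hierarchy (e.g. a specific symptom satisfying a more general one), which this set-containment sketch omits.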

  10. Conformance Testing: Measurement Decision Rules

    NASA Technical Reports Server (NTRS)

    Mimbs, Scott M.

    2010-01-01

    The goal of a Quality Management System (QMS) as specified in ISO 9001 and AS9100 is to provide assurance to the customer that end products meet specifications. Measuring devices, often called measuring and test equipment (MTE), are used to provide the evidence of product conformity to specified requirements. Unfortunately, processes that employ MTE can become a weak link to the overall QMS if proper attention is not given to the measurement process design, capability, and implementation. Documented "decision rules" establish the requirements to ensure measurement processes provide the measurement data that supports the needs of the QMS. Measurement data are used to make the decisions that impact all areas of technology. Whether measurements support research, design, production, or maintenance, ensuring the data supports the decision is crucial. Measurement data quality can be critical to the resulting consequences of measurement-based decisions. Historically, most industries required simplistic, one-size-fits-all decision rules for measurements. One-size-fits-all rules in some cases are not rigorous enough to provide adequate measurement results, while in other cases are overly conservative and too costly to implement. Ideally, decision rules should be rigorous enough to match the criticality of the parameter being measured, while being flexible enough to be cost effective. The goal of a decision rule is to ensure that measurement processes provide data with a sufficient level of quality to support the decisions being made - no more, no less. This paper discusses the basic concepts of providing measurement-based evidence that end products meet specifications. Although relevant to all measurement-based conformance tests, the target audience is the MTE end-user, which is anyone using MTE other than calibration service providers. 
Topics include measurement fundamentals, the associated decision risks, verifying conformance to specifications, and basic measurement decision rules.
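
    One common form of decision rule in this context is guard-banding: accept only if the measured value lies inside the specification limits reduced by the measurement uncertainty. A minimal sketch with illustrative limits and uncertainty:

```python
def accept(measured, lower_spec, upper_spec, uncertainty):
    """Guard-banded conformance decision (stringent acceptance):
    shrink the acceptance zone by the expanded uncertainty."""
    return (lower_spec + uncertainty) <= measured <= (upper_spec - uncertainty)

# A reading near the limit fails once the guard band is applied...
print(accept(9.95, lower_spec=9.0, upper_spec=10.0, uncertainty=0.1))
# ...while a reading well inside the limits still passes.
print(accept(9.50, lower_spec=9.0, upper_spec=10.0, uncertainty=0.1))
```

    Widening or narrowing the guard band is exactly the lever the paper describes for matching the rule's rigor to the criticality of the parameter being measured.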

  11. General Rule of Negative Effective Ueff System & Materials Design of High-Tc Superconductors by ab initio Calculations

    NASA Astrophysics Data System (ADS)

    Katayama-Yoshida, Hiroshi; Nakanishi, Akitaka; Uede, Hiroki; Takawashi, Yuki; Fukushima, Tetsuya; Sato, Kazunori

    2014-03-01

    Based upon ab initio electronic structure calculations, I will discuss the general rule of negative effective U systems: (1) exchange-correlation-induced negative effective U, caused by the stability of the exchange-correlation energy under Hund's rule in high-spin d5 ground states, and (2) charge-excitation-induced negative effective U, caused by the stability of the chemical bond in closed-shell s2, p6, and d10 configurations. I will show calculated results for negative effective U systems such as hole-doped CuAlO2 and CuFeS2. Based on total energy calculations of antiferromagnetic and ferromagnetic states, I will discuss the magnetic phase diagram and superconductivity upon hole doping. I will also discuss a computational materials design method for high-Tc superconductors using ab initio calculations that go beyond the LDA, together with multi-scale simulations.

  12. Three CLIPS-based expert systems for solving engineering problems

    NASA Technical Reports Server (NTRS)

    Parkinson, W. J.; Luger, G. F.; Bretz, R. E.

    1990-01-01

    We have written three expert systems using the CLIPS PC-based expert system shell. These three expert systems are rule-based and relatively small, with the largest containing slightly less than 200 rules. The first expert system is an expert assistant that was written to help users of the ASPEN computer code choose the proper thermodynamic package to use with their particular vapor-liquid equilibrium problem. The second expert system was designed to help petroleum engineers choose the proper enhanced oil recovery method to be used with a given reservoir. The effectiveness of each technique is highly dependent upon the reservoir conditions. The third expert system is a combination consultant and control system. This system was designed specifically for silicon carbide whisker growth. Silicon carbide whiskers are an extremely strong product used to make ceramic and metal composites. The manufacture of whiskers is a very complicated process which, to date, has defied a good mathematical model. The process was run by experts who had gained their expertise by trial and error. A system of rules was devised by these experts both for procedure setup and for process control. In this paper we discuss the three problem areas and the design, development, and evaluation of the CLIPS-based programs.
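
    The second system's task, screening enhanced oil recovery (EOR) methods against reservoir conditions, can be sketched as a few IF-THEN rules; the thresholds are invented placeholders, not the expert system's actual rules:

```python
def screen_eor(api_gravity, viscosity_cp, depth_ft):
    """Return EOR methods whose (hypothetical) screening rules pass."""
    candidates = []
    # Rule: heavy, viscous oil favours thermal recovery.
    if api_gravity < 20 and viscosity_cp > 100:
        candidates.append("steam flooding")
    # Rule: light oil in deep reservoirs favours miscible gas injection.
    if api_gravity > 25 and depth_ft > 2000:
        candidates.append("CO2 miscible flooding")
    # Rule: moderate viscosity permits chemical flooding.
    if viscosity_cp < 150:
        candidates.append("polymer flooding")
    return candidates

print(screen_eor(api_gravity=15, viscosity_cp=500, depth_ft=1500))  # -> ['steam flooding']
```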

  13. 75 FR 67282 - Provisions Common to Registered Entities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-02

    ...The Commodity Futures Trading Commission (``Commission'' or ``CFTC'') is proposing rules to implement new statutory provisions enacted under Title VII of the Dodd-Frank Wall Street Reform and Consumer Protection Act (``Dodd-Frank Act'') and amend existing rules affected by the passage of the Dodd-Frank Act. These proposed rules apply to designated contract markets (``DCMs''), derivatives clearing organizations (``DCOs''), swap execution facilities (``SEFs'') and swap data repositories (``SDRs''). The proposed rules implement the new statutory framework for certification and approval for new products, new rules and rule amendments submitted to the Commission by registered entities. Furthermore, the proposed rules prohibit event contracts based on certain excluded commodities, establish special procedures for certain rule changes proposed by systemically important derivatives clearing organizations (``SIDCOs''), and provide for the tolling of review periods for certain novel derivative products pending the resolution of jurisdictional determinations.

  14. Central Heat and Power Plant Coal Dust and Silica Risk Management, Eielson Air Force Base, Alaska

    DTIC Science & Technology

    2014-12-11

    the dump truck driver, lowers the telescopic chute into a hole at the top of the ash box on the dump truck and then activates the screw conveyor ...addition to the main ammonia health risk assessment letter and designed to inform EAFB of the status of the pending silica rule, exposure assessment...main ammonia health risk assessment letter, AFRL- SA-WP-CL-2014-0014, and designed to inform EAFB of the status of the pending silica rule, exposure

  15. 75 FR 26098 - Safety Zone; Under Water Clean Up of Copper Canyon, Lake Havasu, AZ

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-11

    ... rulemaking (NPRM) with respect to this rule because the logistical details of the event were not finalized or... Port, or his designated representative. Regulatory Analyses We developed this rule after considering numerous statutes and executive orders related to rulemaking. Below we summarize our analyses based on 13...

  16. 76 FR 9227 - Safety Zone; Havasu Landing Regatta, Colorado River, Lake Havasu Landing, CA

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-17

    ... rulemaking (NPRM) with respect to this rule because doing so would be impracticable. The logistical details... designated representative. Regulatory Analyses We developed this rule after considering numerous statutes and executive orders related to rulemaking. Below we summarize our analyses based on 13 of these statutes or...

  17. Invented Rule with English Language Learners

    ERIC Educational Resources Information Center

    Boyer, Valerie E.; Martin, Kathryn Y.

    2012-01-01

    The purpose of this study was to utilize an invented rule with English language learners (ELLs) in a clinical setting to determine differences based on language and age of the children. The performance was correlated with teacher reports of strong and weak language learning. Using a within-participants design, ELLs of age three to five were taught…

  18. Accelerating clinical development of HIV vaccine strategies: methodological challenges and considerations in constructing an optimised multi-arm phase I/II trial design.

    PubMed

    Richert, Laura; Doussau, Adélaïde; Lelièvre, Jean-Daniel; Arnold, Vincent; Rieux, Véronique; Bouakane, Amel; Lévy, Yves; Chêne, Geneviève; Thiébaut, Rodolphe

    2014-02-26

    Many candidate vaccine strategies against human immunodeficiency virus (HIV) infection are under study, but their clinical development is lengthy and iterative. To accelerate HIV vaccine development, optimised trial designs are needed. We propose a randomised multi-arm phase I/II design for early-stage development of several vaccine strategies, aiming at rapidly discarding those that are unsafe or non-immunogenic. We explored early-stage designs to evaluate both the safety and the immunogenicity of four heterologous prime-boost HIV vaccine strategies in parallel. One of the vaccines used as a prime and boost in the different strategies (vaccine 1) has yet to be tested in humans, thus requiring a phase I safety evaluation. However, its toxicity risk is considered minimal based on data from similar vaccines. We newly adapted a randomised phase II trial by integrating an early safety decision rule, emulating that of a phase I study. We evaluated the operating characteristics of the proposed design in simulation studies with either a fixed-sample frequentist or a continuous Bayesian safety decision rule, and projected timelines for the trial. We propose a randomised four-arm phase I/II design with two independent binary endpoints for safety and immunogenicity. Immunogenicity evaluation at trial end is based on a single-stage Fleming design per arm, comparing the observed proportion of responders in an immunogenicity screening assay to an unacceptably low proportion, without direct comparisons between arms. Randomisation limits heterogeneity in volunteer characteristics between arms. To avoid exposure of additional participants to an unsafe vaccine during the vaccine boost phase, an early safety decision rule is imposed on the arm starting with vaccine 1 injections. In simulations of the design with either decision rule, the risks of erroneous conclusions were controlled below 15%. Flexibility in trial conduct is greater with the continuous Bayesian rule.
A 12-month gain in timelines is expected by this optimised design. Other existing designs such as bivariate or seamless phase I/II designs did not offer a clear-cut alternative. By combining phase I and phase II evaluations in a multi-arm trial, the proposed optimised design allows for accelerating early stage clinical development of HIV vaccine strategies.
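
    The behaviour of a fixed-sample safety decision rule of the kind integrated into this design can be illustrated by simulation. The pause threshold (more than 2 toxicities among the first 10 volunteers) and the toxicity rates below are assumptions for the sketch, not the trial's actual rule:

```python
import random

def flag_probability(true_tox_rate, n_first=10, max_tox=2, sims=20000, seed=1):
    """Estimate the probability that more than max_tox toxicities occur
    among the first n_first volunteers, triggering the safety pause."""
    rng = random.Random(seed)
    flags = 0
    for _ in range(sims):
        toxicities = sum(rng.random() < true_tox_rate for _ in range(n_first))
        flags += toxicities > max_tox
    return flags / sims

# A safe vaccine (5% toxicity) is rarely paused; a toxic one (40%) usually is.
print(flag_probability(0.05), flag_probability(0.40))
```

    Simulations like this are how the design's operating characteristics (the error risks controlled below 15%) are evaluated before choosing between the frequentist and Bayesian variants.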

  19. Fuzzy based attitude controller for flexible spacecraft with on/off thrusters. M.S. Thesis - M.I.T., 1993

    NASA Technical Reports Server (NTRS)

    Knapp, Roger Glenn

    1993-01-01

    A fuzzy-based attitude controller is designed for attitude control of a generic spacecraft with on/off thrusters. The controller is comprised of packages of rules dedicated to addressing different objectives (e.g., disturbance rejection, low fuel consumption, avoiding the excitation of flexible appendages, etc.). These rule packages can be inserted or removed depending on the requirements of the particular spacecraft and are parameterized based on vehicle parameters such as inertia or operational parameters such as the maneuvering rate. Individual rule packages can be 'weighted' relative to each other to emphasize the importance of one objective relative to another. Finally, the fuzzy controller and rule packages are demonstrated using the high-fidelity Space Shuttle Interactive On-Orbit Simulator (IOS) while performing typical on-orbit operations and are subsequently compared with the existing shuttle flight control system performance.

  20. Recommendations for Benchmarking Preclinical Studies of Nanomedicines.

    PubMed

    Dawidczyk, Charlene M; Russell, Luisa M; Searson, Peter C

    2015-10-01

    Nanoparticle-based delivery systems provide new opportunities to overcome the limitations associated with traditional small-molecule drug therapy for cancer and to achieve both therapeutic and diagnostic functions in the same platform. Preclinical trials are generally designed to assess therapeutic potential and not to optimize the design of the delivery platform. Consequently, progress in developing design rules for cancer nanomedicines has been slow, hindering progress in the field. Despite the large number of preclinical trials, several factors restrict comparison and benchmarking of different platforms, including variability in experimental design, reporting of results, and the lack of quantitative data. To solve this problem, we review the variables involved in the design of preclinical trials and propose a protocol for benchmarking that we recommend be included in in vivo preclinical studies of drug-delivery platforms for cancer therapy. This strategy will contribute to building the scientific knowledge base that enables development of design rules and accelerates the translation of new technologies. ©2015 American Association for Cancer Research.

  1. Perspective: Recommendations for benchmarking pre-clinical studies of nanomedicines

    PubMed Central

    Dawidczyk, Charlene M.; Russell, Luisa M.; Searson, Peter C.

    2015-01-01

    Nanoparticle-based delivery systems provide new opportunities to overcome the limitations associated with traditional small molecule drug therapy for cancer, and to achieve both therapeutic and diagnostic functions in the same platform. Pre-clinical trials are generally designed to assess therapeutic potential and not to optimize the design of the delivery platform. Consequently, progress in developing design rules for cancer nanomedicines has been slow, hindering progress in the field. Despite the large number of pre-clinical trials, several factors restrict comparison and benchmarking of different platforms, including variability in experimental design, reporting of results, and the lack of quantitative data. To solve this problem, we review the variables involved in the design of pre-clinical trials and propose a protocol for benchmarking that we recommend be included in in vivo pre-clinical studies of drug delivery platforms for cancer therapy. This strategy will contribute to building the scientific knowledge base that enables development of design rules and accelerates the translation of new technologies. PMID:26249177

  2. A guided search genetic algorithm using mined rules for optimal affective product design

    NASA Astrophysics Data System (ADS)

    Fung, Chris K. Y.; Kwong, C. K.; Chan, Kit Yan; Jiang, H.

    2014-08-01

    Affective design is an important aspect of new product development, especially for consumer products, to achieve a competitive edge in the marketplace. It can help companies to develop new products that better satisfy the emotional needs of customers. However, product designers usually encounter difficulties in determining the optimal settings of the design attributes for affective design. In this article, a novel guided search genetic algorithm (GA) approach is proposed to determine the optimal design attribute settings for affective design. The optimization model applied constraints and guided search operators, both derived from mined rules, to steer the GA search toward desirable solutions. A case study on the affective design of mobile phones was conducted to illustrate the proposed approach and validate its effectiveness. Validation tests were conducted, and the results show that the guided search GA approach outperforms the GA approach without the guided search strategy in terms of GA convergence and computational time. In addition, the guided search optimization model improves the GA's ability to generate good solutions for affective design.
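
    The guided search idea can be sketched as a standard GA whose offspring are repaired by a mined rule before evaluation; the design attributes, fitness function, and rule below are invented for illustration:

```python
import random

RULE = {0: 1}  # mined rule (assumed): design attribute 0 should be on

def fitness(chromosome):
    # Hypothetical affective score: attribute 0 dominates, extras penalised.
    return 10 * chromosome[0] - sum(chromosome[1:])

def guided_ga(n_attrs=5, pop=20, gens=30, seed=0):
    rng = random.Random(seed)
    population = [[rng.randint(0, 1) for _ in range(n_attrs)]
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop // 2]          # elitist selection
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_attrs)       # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.1:                # bit-flip mutation
                i = rng.randrange(n_attrs)
                child[i] ^= 1
            for attr, value in RULE.items():      # guided repair step
                child[attr] = value
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

print(guided_ga())
```

    The repair step confines offspring to the region the mined rules deem promising, which is what speeds convergence relative to an unguided GA.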

  3. Identified research directions for using manufacturing knowledge earlier in the product lifecycle

    PubMed Central

    Hedberg, Thomas D.; Hartman, Nathan W.; Rosche, Phil; Fischer, Kevin

    2016-01-01

    Design for Manufacturing (DFM), especially the use of manufacturing knowledge to support design decisions, has received attention in the academic domain. However, industry practice has not been studied enough to provide solutions that are mature for industry. The current state of the art for DFM is often rule-based functionality within Computer-Aided Design (CAD) systems that enforce specific design requirements. That rule-based functionality may or may not dynamically affect geometry definition. And, if rule-based functionality exists in the CAD system, it is typically a customization on a case-by-case basis. Manufacturing knowledge is a phrase with vast meanings, which may include knowledge on the effects of material properties decisions, machine and process capabilities, or understanding the unintended consequences of design decisions on manufacturing. One of the DFM questions to answer is how can manufacturing knowledge, depending on its definition, be used earlier in the product lifecycle to enable a more collaborative development environment? This paper will discuss the results of a workshop on manufacturing knowledge that highlights several research questions needing more study. This paper proposes recommendations for investigating the relationship of manufacturing knowledge with shape, behavior, and context characteristics of product to produce a better understanding of what knowledge is most important. In addition, the proposal includes recommendations for investigating the system-level barriers to reusing manufacturing knowledge and how model-based manufacturing may ease the burden of knowledge sharing. Lastly, the proposal addresses the direction of future research for holistic solutions of using manufacturing knowledge earlier in the product lifecycle. PMID:27990027

  4. Identified research directions for using manufacturing knowledge earlier in the product lifecycle.

    PubMed

    Hedberg, Thomas D; Hartman, Nathan W; Rosche, Phil; Fischer, Kevin

    2017-01-01

    Design for Manufacturing (DFM), especially the use of manufacturing knowledge to support design decisions, has received attention in the academic domain. However, industry practice has not been studied enough to provide solutions that are mature for industry. The current state of the art for DFM is often rule-based functionality within Computer-Aided Design (CAD) systems that enforce specific design requirements. That rule-based functionality may or may not dynamically affect geometry definition. And, if rule-based functionality exists in the CAD system, it is typically a customization on a case-by-case basis. Manufacturing knowledge is a phrase with vast meanings, which may include knowledge on the effects of material properties decisions, machine and process capabilities, or understanding the unintended consequences of design decisions on manufacturing. One of the DFM questions to answer is how can manufacturing knowledge, depending on its definition, be used earlier in the product lifecycle to enable a more collaborative development environment? This paper will discuss the results of a workshop on manufacturing knowledge that highlights several research questions needing more study. This paper proposes recommendations for investigating the relationship of manufacturing knowledge with shape, behavior, and context characteristics of product to produce a better understanding of what knowledge is most important. In addition, the proposal includes recommendations for investigating the system-level barriers to reusing manufacturing knowledge and how model-based manufacturing may ease the burden of knowledge sharing. Lastly, the proposal addresses the direction of future research for holistic solutions of using manufacturing knowledge earlier in the product lifecycle.

  5. Post-decomposition optimizations using pattern matching and rule-based clustering for multi-patterning technology

    NASA Astrophysics Data System (ADS)

    Wang, Lynn T.-N.; Madhavan, Sriram

    2018-03-01

    A pattern matching and rule-based polygon clustering methodology with DFM scoring is proposed to detect decomposition-induced manufacturability detractors and fix the layout designs prior to manufacturing. A pattern matcher scans the layout for pre-characterized patterns from a library. If a pattern is detected, rule-based clustering identifies the neighboring polygons that interact with those captured by the pattern. Then, DFM scores are computed for the possible layout fixes: the fix with the best score is applied. The proposed methodology was applied to two 20nm products with a chip area of 11 mm2 on the metal 2 layer. All the hotspots were resolved. The number of DFM spacing violations decreased by 7-15%.
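
    The detect-cluster-score-fix flow can be sketched as follows; the pattern library, geometry, and DFM scores are invented for illustration:

```python
# Pre-characterized patterns: (shape type, context) -> hotspot name.
PATTERN_LIBRARY = {("via", "narrow_metal"): "pinch_hotspot"}

def detect(layout):
    """Scan the layout for shapes matching a library pattern."""
    return [shape for shape in layout
            if (shape["type"], shape["context"]) in PATTERN_LIBRARY]

def cluster_neighbors(layout, hotspot, min_space=3.0):
    """Rule-based clustering: gather polygons within the spacing rule."""
    return [s for s in layout if s is not hotspot
            and abs(s["x"] - hotspot["x"]) < min_space]

def best_fix(candidate_fixes):
    # Higher DFM score = fewer expected spacing violations after the fix.
    return max(candidate_fixes, key=lambda fix: fix["dfm_score"])

layout = [{"type": "via", "context": "narrow_metal", "x": 0.0},
          {"type": "metal", "context": "wide", "x": 2.0}]
hotspots = detect(layout)
fixes = [{"name": "widen_metal", "dfm_score": 0.9},
         {"name": "shift_via", "dfm_score": 0.7}]
print(len(hotspots), best_fix(fixes)["name"])  # -> 1 widen_metal
```

    Real implementations match 2D geometric patterns rather than tags, but the scoring-driven choice among candidate fixes is the same.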

  6. Detailed statistical assessment of the characteristics of the ESMO Magnitude of Clinical Benefit Scale (ESMO-MCBS) threshold rules.

    PubMed

    Dafni, Urania; Karlis, Dimitris; Pedeli, Xanthi; Bogaerts, Jan; Pentheroudakis, George; Tabernero, Josep; Zielinski, Christoph C; Piccart, Martine J; de Vries, Elisabeth G E; Latino, Nicola Jane; Douillard, Jean-Yves; Cherny, Nathan I

    2017-01-01

    The European Society for Medical Oncology (ESMO) has developed the ESMO Magnitude of Clinical Benefit Scale (ESMO-MCBS), a tool to assess the magnitude of clinical benefit from new cancer therapies. Grading is guided by a dual rule comparing the relative benefit (RB) and the absolute benefit (AB) achieved by the therapy to prespecified threshold values. The ESMO-MCBS v1.0 dual rule evaluates the RB of an experimental treatment based on the lower limit of the 95%CI (LL95%CI) for the hazard ratio (HR) along with an AB threshold. This dual rule addresses two goals: inclusiveness, not unfairly penalising experimental treatments from trials designed with adequate power targeting clinically meaningful relative benefit; and discernment, penalising trials designed to detect a small inconsequential benefit. Based on 50 000 simulations of plausible trial scenarios, the sensitivity and specificity of the LL95%CI rule and the ESMO-MCBS dual rule are examined, together with the robustness of their characteristics over a reasonable range of power values and of targeted and true HRs. The per cent acceptance of maximal preliminary grade is compared with other dual rules based on point estimate (PE) thresholds for RB. For particularly small or particularly large studies, the observed benefit needs to be relatively big for the ESMO-MCBS dual rule to be satisfied and the maximal grade awarded. Compared with approaches that evaluate RB using PE thresholds, simulations demonstrate that the MCBS approach better exhibits the desired behaviour, achieving the goals of both inclusiveness and discernment. RB assessment using the LL95%CI for HR rather than a PE threshold has two advantages: it diminishes the probability of excluding big-benefit positive studies from achieving due credit and, when combined with the AB assessment, it increases the probability of downgrading a trial with a statistically significant but clinically insignificant observed benefit.
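
    As a minimal illustration of the dual rule's logic, the sketch below credits a treatment only when both the RB criterion (LL95%CI of the HR at or below a threshold) and the AB criterion are met. The threshold values are placeholders for illustration, not the published ESMO-MCBS v1.0 cutoffs.

```python
def esmo_dual_rule(hr_ll95, ab_gain_months,
                   hr_threshold=0.65, ab_threshold=3.0):
    """Dual-rule sketch: relative benefit judged via the lower limit of
    the 95% CI for the HR, combined with an absolute-benefit check.
    Thresholds here are illustrative placeholders, not published cutoffs."""
    relative_ok = hr_ll95 <= hr_threshold
    absolute_ok = ab_gain_months >= ab_threshold
    return relative_ok and absolute_ok

print(esmo_dual_rule(0.60, 4.0))  # True: both criteria met
print(esmo_dual_rule(0.70, 4.0))  # False: RB criterion fails
```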

  7. Detailed statistical assessment of the characteristics of the ESMO Magnitude of Clinical Benefit Scale (ESMO-MCBS) threshold rules

    PubMed Central

    Dafni, Urania; Karlis, Dimitris; Pedeli, Xanthi; Bogaerts, Jan; Pentheroudakis, George; Tabernero, Josep; Zielinski, Christoph C; Piccart, Martine J; de Vries, Elisabeth G E; Latino, Nicola Jane; Douillard, Jean-Yves; Cherny, Nathan I

    2017-01-01

    Background The European Society for Medical Oncology (ESMO) has developed the ESMO Magnitude of Clinical Benefit Scale (ESMO-MCBS), a tool to assess the magnitude of clinical benefit from new cancer therapies. Grading is guided by a dual rule comparing the relative benefit (RB) and the absolute benefit (AB) achieved by the therapy to prespecified threshold values. The ESMO-MCBS v1.0 dual rule evaluates the RB of an experimental treatment based on the lower limit of the 95%CI (LL95%CI) for the hazard ratio (HR) along with an AB threshold. This dual rule addresses two goals: inclusiveness, not unfairly penalising experimental treatments from trials designed with adequate power targeting clinically meaningful relative benefit; and discernment, penalising trials designed to detect a small inconsequential benefit. Methods Based on 50 000 simulations of plausible trial scenarios, the sensitivity and specificity of the LL95%CI rule and the ESMO-MCBS dual rule are examined, together with the robustness of their characteristics over a reasonable range of power values and of targeted and true HRs. The per cent acceptance of maximal preliminary grade is compared with other dual rules based on point estimate (PE) thresholds for RB. Results For particularly small or particularly large studies, the observed benefit needs to be relatively big for the ESMO-MCBS dual rule to be satisfied and the maximal grade awarded. Compared with approaches that evaluate RB using PE thresholds, simulations demonstrate that the MCBS approach better exhibits the desired behaviour, achieving the goals of both inclusiveness and discernment. Conclusions RB assessment using the LL95%CI for HR rather than a PE threshold has two advantages: it diminishes the probability of excluding big-benefit positive studies from achieving due credit and, when combined with the AB assessment, it increases the probability of downgrading a trial with a statistically significant but clinically insignificant observed benefit. PMID:29067214

  8. 76 FR 69230 - Fisheries of the Caribbean, Gulf of Mexico, and South Atlantic; Comprehensive Ecosystem-Based...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-08

    ... economic zone (EEZ), establish an annual catch limit (ACL) for octocorals, modify management in special... specifications in the South Atlantic region. Through CE-BA 2, NMFS also proposes to designate new Essential Fish... Flexibility Act, for this rule. The IRFA describes the economic impact that this rule, if adopted, would have...

  9. 76 FR 82183 - Fisheries of the Caribbean, Gulf of Mexico, and South Atlantic; Comprehensive Ecosystem-Based...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-30

    ... modifies the fishery management unit (FMU) for octocorals in the South Atlantic exclusive economic zone... specifications in the South Atlantic region. CE-BA 2 also designates new Essential Fish Habitat (EFH) for... that described the economic impact of the rule. As described in the IRFA, the only action in this rule...

  10. 75 FR 55975 - Safety Zone; San Diego Harbor Shark Fest Swim; San Diego Bay, San Diego, CA

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-15

    ... Guard did not receive notification of the logistical details of the San Diego Bay swim in sufficient... the Captain of the Port, or designated representative. Regulatory Analyses We developed this rule... analyses based on 13 of these statutes or executive orders. Regulatory Planning and Review This rule is not...

  11. The Practical Art of Redesigning Teacher Education: Teacher Education Reform at Washington University, 1970-1975.

    ERIC Educational Resources Information Center

    Tom, Alan R.

    1988-01-01

    This article proposes rules of thumb about the teacher education design process. The rules are grounded in the attempts at reforming teacher education at Washington University in the early 1970s, at a time during which a year-long, field-based alternative to the traditional elementary program was operated. (IAH)

  12. 77 FR 58889 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing of Proposed Rule Change To List...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-24

    ... rising or falling markets that are not directly correlated to broad equity or fixed income market returns... quantitative, rules-based strategy designed to provide returns that correspond to the performance of the S&P..., ``VIX Index Related Instruments''), money market instruments, cash, cash equivalents and futures...

  13. A rule based computer aided design system

    NASA Technical Reports Server (NTRS)

    Premack, T.

    1986-01-01

    A Computer Aided Design (CAD) system is presented which supports the iterative process of design, the dimensional continuity between mating parts, and the hierarchical structure of the parts in their assembled configuration. Prolog, an interactive logic programming language, is used to represent and interpret the database. The solid geometry representing the parts is defined in parameterized form using the swept volume method. The system is demonstrated with the design of a spring piston.
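
    The dimensional continuity between mating parts described above can be sketched as follows (an assumed Python analogue for illustration, not the Prolog system itself): one part's dimension is derived from its mate's, so a design iteration propagates automatically.

```python
# Hypothetical parts; the clearance value and class names are invented.
class Cylinder:
    def __init__(self, bore):
        self.bore = bore

class Piston:
    def __init__(self, cylinder, clearance=0.05):
        self.cylinder = cylinder
        self.clearance = clearance

    @property
    def diameter(self):  # derived dimension keeps mating parts consistent
        return self.cylinder.bore - self.clearance

cyl = Cylinder(bore=20.0)
piston = Piston(cyl)
print(piston.diameter)  # 19.95
cyl.bore = 22.0         # iterate the design: resize the cylinder
print(piston.diameter)  # 21.95 (the mating dimension follows)
```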

  14. EXADS - EXPERT SYSTEM FOR AUTOMATED DESIGN SYNTHESIS

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.

    1994-01-01

    The expert system called EXADS was developed to aid users of the Automated Design Synthesis (ADS) general purpose optimization program. Because of the general purpose nature of ADS, it is difficult for a nonexpert to select the best choice of strategy, optimizer, and one-dimensional search options from the one hundred or so combinations that are available. EXADS aids engineers in determining the best combination based on their knowledge of the problem and the expert knowledge previously stored by the experts who developed ADS. EXADS is a customized application of the AESOP artificial intelligence program (the general version of AESOP is available separately from COSMIC; the ADS program is also available from COSMIC). The expert system consists of two main components. The knowledge base contains about 200 rules and is divided into three categories: constrained, unconstrained, and constrained treated as unconstrained. The EXADS inference engine is rule-based and makes decisions about a particular situation using hypotheses (potential solutions), rules, and answers to questions drawn from the rule base. EXADS is backward-chaining; that is, it works from hypotheses to facts. The rule base was compiled from sources such as literature searches, ADS documentation, and engineer surveys. EXADS will accept answers such as yes, no, maybe, likely, and don't know, or a certainty factor ranging from 0 to 10. When any hypothesis reaches a confidence level of 90% or more, it is deemed the best choice and displayed to the user. If no hypothesis is confirmed, the user can examine explanations of why the hypotheses failed to reach the 90% level. The IBM PC version of EXADS is written in IQ-LISP for execution under DOS 2.0 or higher with a central memory requirement of approximately 512K of 8-bit bytes. This program was developed in 1986.
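
    A simplified hypothesis-scoring sketch in the spirit of the description above; the rule contents, weights, and the handling of the 90% cutoff are invented for illustration. EXADS itself is a backward-chaining IQ-LISP system, not this Python code.

```python
RULES = {
    # hypothesis: list of (required_fact, weight) — hypothetical contents
    "use_gradient_optimizer": [("gradients_available", 0.6),
                               ("problem_smooth", 0.4)],
    "use_direct_search":      [("problem_noisy", 0.9),
                               ("problem_smooth", 0.1)],
}

def confidence(hypothesis, facts):
    """Sum the weights of the satisfied conditions, clipped to [0, 1]."""
    score = sum(w for fact, w in RULES[hypothesis] if facts.get(fact))
    return max(0.0, min(1.0, score))

def best_hypothesis(facts, cutoff=0.9):
    """Return the highest-scoring hypothesis if it reaches the cutoff."""
    scored = {h: confidence(h, facts) for h in RULES}
    best = max(scored, key=scored.get)
    return best if scored[best] >= cutoff else None

facts = {"gradients_available": True, "problem_smooth": True}
print(best_hypothesis(facts))  # use_gradient_optimizer
```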

  15. A method and knowledge base for automated inference of patient problems from structured data in an electronic medical record.

    PubMed

    Wright, Adam; Pang, Justine; Feblowitz, Joshua C; Maloney, Francine L; Wilcox, Allison R; Ramelson, Harley Z; Schneider, Louise I; Bates, David W

    2011-01-01

    Accurate knowledge of a patient's medical problems is critical for clinical decision making, quality measurement, research, billing and clinical decision support. Common structured sources of problem information include the patient problem list and billing data; however, these sources are often inaccurate or incomplete. To develop and validate methods of automatically inferring patient problems from clinical and billing data, and to provide a knowledge base for inferring problems. We identified 17 target conditions and designed and validated a set of rules for identifying patient problems based on medications, laboratory results, billing codes, and vital signs. A panel of physicians provided input on a preliminary set of rules. Based on this input, we tested candidate rules on a sample of 100,000 patient records to assess their performance compared to gold standard manual chart review. The physician panel selected a final rule for each condition, which was validated on an independent sample of 100,000 records to assess its accuracy. Seventeen rules were developed for inferring patient problems. Analysis using a validation set of 100,000 randomly selected patients showed high sensitivity (range: 62.8-100.0%) and positive predictive value (range: 79.8-99.6%) for most rules. Overall, the inference rules performed better than using either the problem list or billing data alone. We developed and validated a set of rules for inferring patient problems. These rules have a variety of applications, including clinical decision support, care improvement, augmentation of the problem list, and identification of patients for research cohorts.
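
    The shape of such an inference rule can be sketched as below; the drug names, lab threshold, and billing-code prefix are assumptions chosen for the sketch, not entries from the validated knowledge base described above.

```python
def infer_diabetes(record):
    """Infer a probable diabetes problem from structured EMR data:
    medications, laboratory results, or billing codes (illustrative)."""
    on_hypoglycemic = any(m in record["medications"]
                          for m in ("metformin", "insulin"))
    high_a1c = record["labs"].get("hba1c", 0.0) >= 6.5
    coded = any(c.startswith("E11") for c in record["billing_codes"])
    return on_hypoglycemic or high_a1c or coded

record = {"medications": ["metformin"], "labs": {"hba1c": 7.1},
          "billing_codes": []}
print(infer_diabetes(record))  # True
```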

  16. Rule-based modeling: a computational approach for studying biomolecular site dynamics in cell signaling systems

    PubMed Central

    Chylek, Lily A.; Harris, Leonard A.; Tung, Chang-Shung; Faeder, James R.; Lopez, Carlos F.

    2013-01-01

    Rule-based modeling was developed to address the limitations of traditional approaches for modeling chemical kinetics in cell signaling systems. These systems consist of multiple interacting biomolecules (e.g., proteins), which themselves consist of multiple parts (e.g., domains, linear motifs, and sites of phosphorylation). Consequently, biomolecules that mediate information processing generally have the potential to interact in multiple ways, with the number of possible complexes and post-translational modification states tending to grow exponentially with the number of binary interactions considered. As a result, only large reaction networks capture all possible consequences of the molecular interactions that occur in a cell signaling system, which is problematic because traditional modeling approaches for chemical kinetics (e.g., ordinary differential equations) require explicit network specification. This problem is circumvented through representation of interactions in terms of local rules. With this approach, network specification is implicit and model specification is concise. Concise representation results in a coarse graining of chemical kinetics, which is introduced because all reactions implied by a rule inherit the rate law associated with that rule. Coarse graining can be appropriate if interactions are modular, and the coarseness of a model can be adjusted as needed. Rules can be specified using specialized model-specification languages, and recently developed tools designed for specification of rule-based models allow one to leverage powerful software engineering capabilities. A rule-based model comprises a set of rules, which can be processed by general-purpose simulation and analysis tools to achieve different objectives (e.g., to perform either a deterministic or stochastic simulation). PMID:24123887
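
    A toy illustration of the implicit network generation described above: one local rule ("kinase K binds any substrate regardless of its phosphorylation state") implies several concrete reactions, all inheriting the rule's single rate constant. The naming is ad hoc, not BioNetGen or Kappa syntax.

```python
substrates = ["S1", "S2", "S3"]
states = ["unphosphorylated", "phosphorylated"]
RATE = 0.1  # single rate law shared by every reaction the rule implies

# Expand the rule into its concrete reactions (reactant, product, rate).
reactions = [(f"K + {s}({st})", f"K.{s}({st})", RATE)
             for s in substrates for st in states]

print(len(reactions))  # 6 concrete reactions from a single rule
```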

  17. Knowledge-based reasoning in the Paladin tactical decision generation system

    NASA Technical Reports Server (NTRS)

    Chappell, Alan R.

    1993-01-01

    A real-time tactical decision generation system for air combat engagements, Paladin, has been developed. A pilot's job in air combat includes tasks that are largely symbolic. These symbolic tasks are generally performed through the application of experience and training (i.e. knowledge) gathered over years of flying a fighter aircraft. Two such tasks, situation assessment and throttle control, are identified and broken out in Paladin to be handled by specialized knowledge based systems. Knowledge pertaining to these tasks is encoded into rule-bases to provide the foundation for decisions. Paladin uses a custom built inference engine and a partitioned rule-base structure to give these symbolic results in real-time. This paper provides an overview of knowledge-based reasoning systems as a subset of rule-based systems. The knowledge used by Paladin in generating results as well as the system design for real-time execution is discussed.

  18. Rule based design of conceptual models for formative evaluation

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.; Chang, Kai; Hale, Joseph P.; Bester, Terri; Rix, Thomas; Wang, Yaowen

    1994-01-01

    A Human-Computer Interface (HCI) Prototyping Environment with embedded evaluation capability has been investigated. This environment will be valuable in developing and refining HCI standards and evaluating program/project interface development, especially Space Station Freedom on-board displays for payload operations. This environment, which allows for rapid prototyping and evaluation of graphical interfaces, includes the following four components: (1) a HCI development tool; (2) a low fidelity simulator development tool; (3) a dynamic, interactive interface between the HCI and the simulator; and (4) an embedded evaluator that evaluates the adequacy of a HCI based on a user's performance. The embedded evaluation tool collects data while the user is interacting with the system and evaluates the adequacy of an interface based on a user's performance. This paper describes the design of conceptual models for the embedded evaluation system using a rule-based approach.

  19. Rule based design of conceptual models for formative evaluation

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.; Chang, Kai; Hale, Joseph P.; Bester, Terri; Rix, Thomas; Wang, Yaowen

    1994-01-01

    A Human-Computer Interface (HCI) Prototyping Environment with embedded evaluation capability has been investigated. This environment will be valuable in developing and refining HCI standards and evaluating program/project interface development, especially Space Station Freedom on-board displays for payload operations. This environment, which allows for rapid prototyping and evaluation of graphical interfaces, includes the following four components: (1) a HCI development tool, (2) a low fidelity simulator development tool, (3) a dynamic, interactive interface between the HCI and the simulator, and (4) an embedded evaluator that evaluates the adequacy of a HCI based on a user's performance. The embedded evaluation tool collects data while the user is interacting with the system and evaluates the adequacy of an interface based on a user's performance. This paper describes the design of conceptual models for the embedded evaluation system using a rule-based approach.

  20. Efficient and Accurate Optimal Linear Phase FIR Filter Design Using Opposition-Based Harmony Search Algorithm

    PubMed Central

    Saha, S. K.; Dutta, R.; Choudhury, R.; Kar, R.; Mandal, D.; Ghoshal, S. P.

    2013-01-01

    In this paper, opposition-based harmony search (OHS) has been applied to the optimal design of linear phase FIR filters. RGA, PSO, and DE have also been adopted for the sake of comparison. The original harmony search algorithm is chosen as the parent, and the opposition-based approach is applied: during initialization, a randomly generated population of solutions is chosen, their opposite solutions are also considered, and the fitter of each pair is selected as an a priori guess. In harmony memory, each such solution passes through the memory consideration rule, the pitch adjustment rule, and then opposition-based reinitialization generation jumping, which yields the optimum result corresponding to the least error fitness in the multidimensional search space of FIR filter design. Incorporating different control parameters in the basic HS algorithm balances exploration and exploitation of the search space. Low pass, high pass, band pass, and band stop FIR filters are designed with the proposed OHS and the other aforementioned algorithms for comparative optimization performance. A comparison of simulation results reveals the optimization efficacy of OHS over the other optimization techniques for the solution of the multimodal, nondifferentiable, nonlinear, and constrained FIR filter design problems. PMID:23844390
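
    The opposition-based initialization step can be sketched as follows: for each random harmony, its "opposite" point is also evaluated and the fitter of the pair is kept. The objective function here is a toy stand-in, not an FIR filter error measure.

```python
import random

def opposite(x, lo, hi):
    """Opposite point of x within the box [lo, hi]^dim."""
    return [lo + hi - xi for xi in x]

def init_memory(size, dim, lo, hi, fitness):
    """Opposition-based initialization of the harmony memory."""
    memory = []
    for _ in range(size):
        x = [random.uniform(lo, hi) for _ in range(dim)]
        xo = opposite(x, lo, hi)
        memory.append(min(x, xo, key=fitness))  # keep the fitter guess
    return memory

fitness = lambda x: sum(xi * xi for xi in x)  # toy objective (minimize)
memory = init_memory(size=5, dim=3, lo=-1.0, hi=1.0, fitness=fitness)
print(len(memory))  # 5
```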

  1. Efficient and accurate optimal linear phase FIR filter design using opposition-based harmony search algorithm.

    PubMed

    Saha, S K; Dutta, R; Choudhury, R; Kar, R; Mandal, D; Ghoshal, S P

    2013-01-01

    In this paper, opposition-based harmony search (OHS) has been applied to the optimal design of linear phase FIR filters. RGA, PSO, and DE have also been adopted for the sake of comparison. The original harmony search algorithm is chosen as the parent, and the opposition-based approach is applied: during initialization, a randomly generated population of solutions is chosen, their opposite solutions are also considered, and the fitter of each pair is selected as an a priori guess. In harmony memory, each such solution passes through the memory consideration rule, the pitch adjustment rule, and then opposition-based reinitialization generation jumping, which yields the optimum result corresponding to the least error fitness in the multidimensional search space of FIR filter design. Incorporating different control parameters in the basic HS algorithm balances exploration and exploitation of the search space. Low pass, high pass, band pass, and band stop FIR filters are designed with the proposed OHS and the other aforementioned algorithms for comparative optimization performance. A comparison of simulation results reveals the optimization efficacy of OHS over the other optimization techniques for the solution of the multimodal, nondifferentiable, nonlinear, and constrained FIR filter design problems.

  2. Building v/s Exploring Models: Comparing Learning of Evolutionary Processes through Agent-based Modeling

    NASA Astrophysics Data System (ADS)

    Wagh, Aditi

    Two strands of work motivate the three studies in this dissertation. First, evolutionary change can be viewed as a computational complex system in which a small set of rules operating at the individual level results in different population-level outcomes under different conditions. Extensive research has documented students' difficulties with learning about evolutionary change (Rosengren et al., 2012), particularly in terms of levels slippage (Wilensky & Resnick, 1999). Second, though building and using computational models is becoming increasingly common in K-12 science education, we know little about how these two modalities compare. This dissertation adopts agent-based modeling as a representational system to compare these modalities in the conceptual context of micro-evolutionary processes. Drawing on interviews, Study 1 examines middle-school students' productive ways of reasoning about micro-evolutionary processes and finds that the specific framing of traits plays a key role in whether slippage explanations are cued. Study 2, conducted in two schools with about 150 students, forms the crux of the dissertation. It compares learning processes and outcomes when students build their own models or explore a pre-built model. Analysis of Camtasia videos of student pairs reveals that builders' and explorers' ways of accessing rules, and their sense-making of observed trends, are of a different character. Builders notice rules through the available blocks-based primitives, often bypassing their enactment, while explorers attend to rules primarily through the enactment. Moreover, builders' sense-making of observed trends is more rule-driven while explorers' is more enactment-driven. Pre- and posttests reveal that builders manifest a greater facility with accessing rules, providing explanations manifesting targeted assembly, whereas explorers use rules to construct explanations manifesting non-targeted assembly. Interviews reveal varying degrees of shifts away from slippage in both modalities, with students who built models not incorporating slippage explanations in their responses. Study 3 compares these modalities with a control using traditional activities. Pre- and posttests reveal that the two modalities manifested greater facility with accessing and assembling rules than the control. The dissertation offers implications for the design of learning environments for evolutionary change, for the design of the two modalities based on their strengths and weaknesses, and for teacher training.

  3. Design issues for a reinforcement-based self-learning fuzzy controller

    NASA Technical Reports Server (NTRS)

    Yen, John; Wang, Haojin; Dauherity, Walter

    1993-01-01

    Fuzzy logic controllers have some often-cited advantages over conventional techniques such as PID control: easy implementation, accommodation of natural language, the ability to cover a wider range of operating conditions, and others. One major obstacle that hinders broader application is the lack of a systematic way to develop and modify the rules; as a result, the creation and modification of fuzzy rules often depends on trial and error or pure experimentation. One proposed approach to address this issue is the self-learning fuzzy logic controller (SFLC), which uses reinforcement learning techniques to learn the desirability of states and to adjust the consequent part of fuzzy control rules accordingly. Because of the differing dynamics of the controlled processes, the performance of a self-learning fuzzy controller is highly contingent on its design, an issue that has not received sufficient attention. The issues related to the design of an SFLC for application to a chemical process are discussed, and its performance is compared with that of PID and self-tuning fuzzy logic controllers.

  4. 77 FR 52887 - Regulatory Capital Rules: Standardized Approach for Risk-Weighted Assets; Market Discipline and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-30

    ... securitization framework designed to address the credit risk of exposures that involve the tranching of the... creditworthiness standards and risk-based capital requirements have been designed to be consistent with safety and...

  5. Reinforcement interval type-2 fuzzy controller design by online rule generation and q-value-aided ant colony optimization.

    PubMed

    Juang, Chia-Feng; Hsu, Chia-Hung

    2009-12-01

    This paper proposes a new reinforcement-learning method using online rule generation and Q-value-aided ant colony optimization (ORGQACO) for fuzzy controller design. The fuzzy controller is based on an interval type-2 fuzzy system (IT2FS). The antecedent part in the designed IT2FS uses interval type-2 fuzzy sets to improve controller robustness to noise. There are initially no fuzzy rules in the IT2FS. The ORGQACO concurrently designs both the structure and parameters of an IT2FS. We propose an online interval type-2 rule generation method for the evolution of system structure and flexible partitioning of the input space. Consequent part parameters in an IT2FS are designed using Q-values and the reinforcement local-global ant colony optimization algorithm. This algorithm selects the consequent part from a set of candidate actions according to ant pheromone trails and Q-values, both of which are updated using reinforcement signals. The ORGQACO design method is applied to the following three control problems: 1) truck-backing control; 2) magnetic-levitation control; and 3) chaotic-system control. The ORGQACO is compared with other reinforcement-learning methods to verify its efficiency and effectiveness. Comparisons with type-1 fuzzy systems verify the noise robustness property of using an IT2FS.

  6. An application of object-oriented knowledge representation to engineering expert systems

    NASA Technical Reports Server (NTRS)

    Logie, D. S.; Kamil, H.; Umaretiya, J. R.

    1990-01-01

    The paper describes an object-oriented knowledge representation and its application to engineering expert systems. The object-oriented approach promotes efficient handling of the problem data by allowing knowledge to be encapsulated in objects and organized by defining relationships between the objects. An Object Representation Language (ORL) was implemented as a tool for building and manipulating the object base. Rule-based knowledge representation is then used to simulate engineering design reasoning. Using a common object base, very large expert systems can be developed, comprised of small, individually processed, rule sets. The integration of these two schemes makes it easier to develop practical engineering expert systems. The general approach to applying this technology to the domain of the finite element analysis, design, and optimization of aerospace structures is discussed.
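
    A toy sketch of pairing an object base with rule-based reasoning, in the spirit of the description above; the Beam class and the overstress rule are invented examples for illustration, not ORL constructs.

```python
class Beam:
    def __init__(self, name, stress, allowable):
        self.name = name            # member identifier
        self.stress = stress        # computed stress
        self.allowable = allowable  # allowable stress

def overstressed(objects):
    """Rule: flag any member whose stress exceeds its allowable."""
    return [o.name for o in objects if o.stress > o.allowable]

# A small object base; rules fire against whichever objects match.
base = [Beam("B1", 180.0, 250.0), Beam("B2", 320.0, 250.0)]
print(overstressed(base))  # ['B2']
```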

  7. Adaptive fractional order sliding mode control for Boost converter in the Battery/Supercapacitor HESS.

    PubMed

    Wang, Jianlin; Xu, Dan; Zhou, Huan; Zhou, Tao

    2018-01-01

    In this paper, an adaptive fractional order sliding mode control (AFSMC) scheme is designed for the current tracking control of the Boost-type converter in a Battery/Supercapacitor hybrid energy storage system (HESS). In order to stabilize the current, the adaptation rules based on state-observer and Lyapunov function are being designed. A fractional order sliding surface function is defined based on the tracking current error and adaptive rules. Furthermore, through fractional order analysis, the stability of the fractional order control system is proven, and the value of the fractional order (λ) is being investigated. In addition, the effectiveness of the proposed AFSMC strategy is being verified by numerical simulations. The advantages of good transient response and robustness to uncertainty are being indicated by this design, when compared with a conventional integer order sliding mode control system.

  8. Skills, rules and knowledge in aircraft maintenance: errors in context

    NASA Technical Reports Server (NTRS)

    Hobbs, Alan; Williamson, Ann

    2002-01-01

    Automatic or skill-based behaviour is generally considered to be less prone to error than behaviour directed by conscious control. However, researchers who have applied Rasmussen's skill-rule-knowledge human error framework to accidents and incidents have sometimes found that skill-based errors appear in significant numbers. It is proposed that this is largely a reflection of the opportunities for error which workplaces present and does not indicate that skill-based behaviour is intrinsically unreliable. In the current study, 99 errors reported by 72 aircraft mechanics were examined in the light of a task analysis based on observations of the work of 25 aircraft mechanics. The task analysis identified the opportunities for error presented at various stages of maintenance work packages and by the job as a whole. Once the frequency of each error type was normalized in terms of the opportunities for error, it became apparent that skill-based performance is more reliable than rule-based performance, which is in turn more reliable than knowledge-based performance. The results reinforce the belief that industrial safety interventions designed to reduce errors would best be directed at those aspects of jobs that involve rule- and knowledge-based performance.
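
    The normalization argued for above can be sketched as follows; the raw counts are invented for illustration, not the study's data.

```python
# Hypothetical error counts and opportunity counts per performance level.
errors        = {"skill": 40, "rule": 35, "knowledge": 24}
opportunities = {"skill": 2000, "rule": 700, "knowledge": 120}

# Normalize: errors per opportunity, rather than raw error counts.
rates = {level: errors[level] / opportunities[level] for level in errors}
for level, rate in sorted(rates.items(), key=lambda kv: kv[1]):
    print(f"{level}-based: {rate:.3f} errors per opportunity")
# Once normalized, skill-based behaviour shows the lowest error rate
# even though it produced the most raw errors.
```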

  9. Style-independent document labeling: design and performance evaluation

    NASA Astrophysics Data System (ADS)

    Mao, Song; Kim, Jong Woo; Thoma, George R.

    2003-12-01

    The Medical Article Records System or MARS has been developed at the U.S. National Library of Medicine (NLM) for automated data entry of bibliographical information from medical journals into MEDLINE, the premier bibliographic citation database at NLM. Currently, a rule-based algorithm (called ZoneCzar) is used for labeling important bibliographical fields (title, author, affiliation, and abstract) on medical journal article page images. While rules have been created for medical journals with regular layout types, new rules have to be manually created for any input journals with arbitrary or new layout types. Therefore, it is of interest to label any journal articles independent of their layout styles. In this paper, we first describe a system (called ZoneMatch) for automated generation of crucial geometric and non-geometric features of important bibliographical fields based on string-matching and clustering techniques. The rule-based algorithm is then modified to use these features to perform style-independent labeling. We then describe a performance evaluation method for quantitatively evaluating our algorithm and characterizing its error distributions. Experimental results show that the labeling performance of the rule-based algorithm is significantly improved when the generated features are used.

  10. Improving drivers' knowledge of road rules using digital games.

    PubMed

    Li, Qing; Tay, Richard

    2014-04-01

    Although a proficient knowledge of the road rules is important to safe driving, many drivers do not retain the knowledge acquired after they have obtained their licenses. Hence, more innovative and appealing methods are needed to improve drivers' knowledge of the road rules. This study examines the effect of game based learning on drivers' knowledge acquisition and retention. We find that playing an entertaining game that is designed to impart knowledge of the road rules not only improves players' knowledge but also helps them retain such knowledge. Hence, learning by gaming appears to be a promising learning approach for driver education.

  11. 78 FR 66642 - Updating OSHA Standards Based on National Consensus Standards; Signage

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-06

    ... single standard is best. The ANSI Z535 designs, the traditional safety sign and tag designs, as well as... [Docket No. OSHA-2013-0005] RIN 1218-AC77 Updating OSHA Standards Based on National Consensus Standards... rule; confirmation of effective date. SUMMARY: On June 13, 2013, OSHA published in the Federal Register...

  12. Evaluation of a rule-based method for epidemiological document classification towards the automation of systematic reviews.

    PubMed

    Karystianis, George; Thayer, Kristina; Wolfe, Mary; Tsafnat, Guy

    2017-06-01

    Most data extraction efforts in epidemiology are focused on obtaining targeted information from clinical trials. In contrast, limited research has been conducted on the identification of information from observational studies, a major source for human evidence in many fields, including environmental health. The recognition of key epidemiological information (e.g., exposures) through text mining techniques can assist in the automation of systematic reviews and other evidence summaries. We designed and applied a knowledge-driven, rule-based approach to identify targeted information (study design, participant population, exposure, outcome, confounding factors, and the country where the study was conducted) from abstracts of epidemiological studies included in several systematic reviews of environmental health exposures. The rules were based on common syntactical patterns observed in text and are thus not specific to any systematic review. To validate the general applicability of our approach, we compared the data extracted using our approach versus hand curation for 35 epidemiological study abstracts manually selected for inclusion in two systematic reviews. The returned F-score, precision, and recall ranged from 70% to 98%, 81% to 100%, and 54% to 97%, respectively. The highest precision was observed for exposure, outcome and population (100%) while recall was best for exposure and study design with 97% and 89%, respectively. The lowest recall was observed for the population (54%), which also had the lowest F-score (70%). The performance of our text-mining approach demonstrated encouraging results for the identification of targeted information from observational epidemiological study abstracts related to environmental exposures. We have demonstrated that rules based on generic syntactic patterns in one corpus can be applied to other observational study designs by simply interchanging the dictionaries used to identify certain characteristics (i.e., outcomes, exposures). At the document level, the recognised information can assist in the selection and categorization of studies included in a systematic review. Copyright © 2017 Elsevier Inc. All rights reserved.
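The dictionary-swapping idea above can be illustrated with a minimal sketch: generic matching code stays fixed while per-domain dictionaries (exposures, study designs, etc.) are interchanged. The terms and field names below are invented examples, not the paper's actual dictionaries or patterns.

```python
import re

# Minimal sketch of dictionary-driven, rule-based extraction: the generic
# matching logic is reusable; only the dictionaries change per domain.
# Dictionary contents are illustrative.

DICTIONARIES = {
    "exposure": ["arsenic", "lead", "pm2.5"],
    "study_design": ["cohort study", "case-control study", "cross-sectional study"],
}

def extract(abstract, dictionaries):
    """Return every dictionary term found in the abstract, keyed by field."""
    text = abstract.lower()
    found = {}
    for field, terms in dictionaries.items():
        hits = [t for t in terms if re.search(r"\b" + re.escape(t) + r"\b", text)]
        if hits:
            found[field] = hits
    return found

abstract = ("In this cohort study we examined arsenic in drinking water "
            "and bladder cancer incidence.")
print(extract(abstract, DICTIONARIES))
```

Swapping `DICTIONARIES` for a different review's terminology reuses the same extraction code unchanged, which is the portability claim made above.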

  13. 76 FR 76775 - Self-Regulatory Organizations; NYSE Amex LLC; Notice of Filing and Immediate Effectiveness of a...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-08

    ... Change To Amend NYSE Rule 104(a)(1)(A) To Reflect That Designated Market Maker Unit Quoting Requirements Are Based on Consolidated Average Daily Volume December 2, 2011. Pursuant to Section 19(b)(1) of the... Commission (``Commission'') the proposed rule change as described in Items I and II below, which Items have...

  14. Dynamic Magnification Factor in a Box-Shape Steel Girder

    NASA Astrophysics Data System (ADS)

    Rahbar-Ranji, A.

    2014-01-01

    The dynamic effect of moving loads on structures is treated as a dynamic magnification factor when resonance is not imminent. Studies have shown that the calculated magnification factors from field measurements could be higher than the values specified in design codes. The main aim of the present paper is to investigate the applicability and accuracy of a rule-based expression for calculation of the dynamic magnification factor for lifting appliances used in the marine industry. A steel box-shaped girder of a crane is considered and transient dynamic analysis using the computer code ANSYS is implemented. The dynamic magnification factor is calculated for different loading conditions and compared with the rule-based equation. The effects of lifting speed, acceleration, damping ratio and position of cargo are examined. It is found that the rule-based expression underestimates the dynamic magnification factor.

  15. Model-assisted template extraction SRAF application to contact holes patterns in high-end flash memory device fabrication

    NASA Astrophysics Data System (ADS)

    Seoud, Ahmed; Kim, Juhwan; Ma, Yuansheng; Jayaram, Srividya; Hong, Le; Chae, Gyu-Yeol; Lee, Jeong-Woo; Park, Dae-Jin; Yune, Hyoung-Soon; Oh, Se-Young; Park, Chan-Ha

    2018-03-01

    Sub-resolution assist feature (SRAF) insertion techniques have been effectively used for a long time now to increase process latitude in the lithography patterning process. Rule-based SRAF and model-based SRAF are complementary solutions, and each has its own benefits, depending on the objectives of applications and the criticality of the impact on manufacturing yield, efficiency, and productivity. Rule-based SRAF provides superior geometric output consistency and faster runtime performance, but the associated recipe development time can be of concern. Model-based SRAF provides better coverage for more complicated pattern structures in terms of shapes and sizes, with considerably less time required for recipe development, although consistency and performance may be impacted. In this paper, we introduce a new model-assisted template extraction (MATE) SRAF solution, which employs decision tree learning in a model-based solution to provide the benefits of both rule-based and model-based SRAF insertion approaches. The MATE solution is designed to automate the creation of rules/templates for SRAF insertion, and is based on the SRAF placement predicted by model-based solutions. The MATE SRAF recipe provides optimum lithographic quality in relation to various manufacturing aspects in a very short time, compared to traditional methods of rule optimization. Experiments were done using memory device pattern layouts to compare the MATE solution to existing model-based SRAF and pixelated SRAF approaches, based on lithographic process window quality, runtime performance, and geometric output consistency.
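The core MATE idea, using model-based SRAF placements as training labels from which simple insertion rules are learned, can be sketched with a one-threshold "decision stump" in place of a full decision tree. The training data and the stump learner below are invented simplifications, not the actual MATE recipe.

```python
# Conceptual sketch of the MATE approach: model-based SRAF decisions
# serve as labels, and a simple rule (a single threshold on the space
# between main features) is learned from them, then applied rule-based.
# Gaps (nm) and labels are invented for illustration.

def learn_threshold(samples):
    """Pick the gap threshold that best separates SRAF / no-SRAF labels."""
    best_t, best_acc = None, -1.0
    for gap, _ in samples:
        t = gap
        acc = sum((g >= t) == label for g, label in samples) / len(samples)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

# (gap in nm, model-based decision: was an SRAF inserted?)
training = [(80, False), (120, False), (180, True), (240, True), (300, True)]
print(learn_threshold(training))
```

A real implementation would learn many such splits jointly (a decision tree over several geometric features), but the rule-from-model training loop is the same in spirit.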

  16. An intelligent knowledge-based and customizable home care system framework with ubiquitous patient monitoring and alerting techniques.

    PubMed

    Chen, Yen-Lin; Chiang, Hsin-Han; Yu, Chao-Wei; Chiang, Chuan-Yen; Liu, Chuan-Ming; Wang, Jenq-Haur

    2012-01-01

    This study develops and integrates an efficient knowledge-based system and a component-based framework to design an intelligent and flexible home health care system. The proposed knowledge-based system integrates an efficient rule-based reasoning model and flexible knowledge rules for efficiently and rapidly determining the necessary physiological and medication treatment procedures based on software modules, video camera sensors, communication devices, and physiological sensor information. This knowledge-based system offers high flexibility for improving and extending the system further to meet the monitoring demands of new patient and caregiver health care by updating the knowledge rules in the inference mechanism. All of the proposed functional components in this study are reusable, configurable, and extensible for system developers. Based on the experimental results, the proposed intelligent homecare system demonstrates that it can accomplish the extensible, customizable, and configurable demands of the ubiquitous healthcare systems to meet the different demands of patients and caregivers under various rehabilitation and nursing conditions.
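The key design point above, that care procedures can be changed by updating knowledge rules rather than inference code, can be shown with a minimal rule engine. The sensor names, thresholds, and actions below are invented for illustration, not the paper's actual rules.

```python
# Hedged sketch of rule-based reasoning for home care alerting: the rule
# set maps sensor readings to care actions and can be updated without
# touching the inference code. All thresholds and actions are invented.

RULES = [
    {"if": lambda s: s["heart_rate"] > 120, "then": "alert_caregiver"},
    {"if": lambda s: s["spo2"] < 90, "then": "alert_caregiver"},
    {"if": lambda s: s["missed_medication"], "then": "remind_patient"},
]

def infer(sensors, rules):
    """Fire every rule whose condition holds; return the set of actions."""
    return {r["then"] for r in rules if r["if"](sensors)}

reading = {"heart_rate": 130, "spo2": 97, "missed_medication": True}
print(sorted(infer(reading, RULES)))
```

Extending monitoring to a new patient condition means appending to `RULES`; `infer` itself never changes, which is the flexibility claim made in the abstract.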

  17. An Intelligent Knowledge-Based and Customizable Home Care System Framework with Ubiquitous Patient Monitoring and Alerting Techniques

    PubMed Central

    Chen, Yen-Lin; Chiang, Hsin-Han; Yu, Chao-Wei; Chiang, Chuan-Yen; Liu, Chuan-Ming; Wang, Jenq-Haur

    2012-01-01

    This study develops and integrates an efficient knowledge-based system and a component-based framework to design an intelligent and flexible home health care system. The proposed knowledge-based system integrates an efficient rule-based reasoning model and flexible knowledge rules for efficiently and rapidly determining the necessary physiological and medication treatment procedures based on software modules, video camera sensors, communication devices, and physiological sensor information. This knowledge-based system offers high flexibility for improving and extending the system further to meet the monitoring demands of new patient and caregiver health care by updating the knowledge rules in the inference mechanism. All of the proposed functional components in this study are reusable, configurable, and extensible for system developers. Based on the experimental results, the proposed intelligent homecare system demonstrates that it can accomplish the extensible, customizable, and configurable demands of the ubiquitous healthcare systems to meet the different demands of patients and caregivers under various rehabilitation and nursing conditions. PMID:23112650

  18. Mechanical Response Analysis of Long-life Asphalt Pavement Structure of Yunluo High-speed on the Semi-rigid Base

    NASA Astrophysics Data System (ADS)

    Liu, Feng; Wu, Chuanhai; Xu, Xinquan; Li, Hao; Wang, Zhixiang

    2018-01-01

    In order to grasp the rule of strain change in a semi-rigid asphalt pavement structure under FWD loading, and to provide a reliable theoretical and practical basis for pavement structure design, the internal variation rules of each pavement structural layer were obtained for the test section of the Guangdong Yunluo expressway, taking the FWD as the loading tool and using the finite element analysis software ANSYS. Based on the results of the theoretical analysis, strain sensors were set up in the corresponding layers of the pavement structure, and the strain test plan was determined. Analysis of the strain data obtained from several structural layers and from field monitoring verified the rationality of this type of pavement structure and of the strain test scheme, providing useful help for the design and maintenance of the pavement structure.

  19. Use of an explicit rule decreases procrastination in university students.

    PubMed

    Johnson, Paul E; Perrin, Christopher J; Salo, Allen; Deschaine, Elyssa; Johnson, Beth

    2016-06-01

    The procrastination behavior of students from a small rural university was decreased by presenting them with a rule indicating that a sooner final due date for a writing assignment would be contingent on procrastination during earlier phases of the paper. A counterbalanced AB BA design was used to measure the effects of the rule-based treatment across 2 introductory psychology classes (N = 33). Overall, participants engaged in less procrastination, missed fewer deadlines, and produced higher quality writing in the treatment condition. © 2016 Society for the Experimental Analysis of Behavior.

  20. Fuzzy self-learning control for magnetic servo system

    NASA Technical Reports Server (NTRS)

    Tarn, J. H.; Kuo, L. T.; Juang, K. Y.; Lin, C. E.

    1994-01-01

    It is known that an effective control system is the key condition for successful implementation of high-performance magnetic servo systems. Major issues in designing such control systems are nonlinearity; unmodeled dynamics, such as secondary effects of copper resistance, stray fields, and saturation; and disturbance rejection, since the load effect acts directly on the servo system without transmission elements. One typical approach to designing control systems under these conditions is a special type of nonlinear feedback called gain scheduling. It accommodates linear regulators whose parameters are changed as a function of operating conditions in a preprogrammed way. In this paper, an on-line learning fuzzy control strategy is proposed. To inherit the wealth of linear control design, the relations between linear feedback and fuzzy logic controllers have been established. The exercise of engineering axioms of linear control design is thus transformed into tuning of appropriate fuzzy parameters. Furthermore, fuzzy logic control brings the domain of candidate control laws from linear into nonlinear, and brings new prospects into design of the local controllers. On the other hand, a self-learning scheme is utilized to automatically tune the fuzzy rule base. It is based on a network learning infrastructure; statistical approximation to assign credit; an animal learning method to update the reinforcement map with a fast learning rate; and a temporal difference predictive scheme to optimize the control laws. Different from supervised and statistical unsupervised learning schemes, the proposed method learns on-line from past experience and information from the process and forms a rule base of an FLC system from randomly assigned initial control rules.
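The fuzzy-rule-base mechanism described above can be illustrated with a minimal, non-learning fuzzy inference step: triangular membership functions fuzzify the tracking error, a small rule table maps error labels to control actions, and a weighted average defuzzifies. The membership centers and rule table are invented, and the paper's on-line learning of the rule base is omitted.

```python
# Illustrative fuzzy logic controller core: fuzzify, consult the rule
# base, defuzzify (Sugeno-style weighted average). Values are invented.

def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

ERROR_SETS = {"neg": (-2, -1, 0), "zero": (-1, 0, 1), "pos": (0, 1, 2)}
RULES = {"neg": -1.0, "zero": 0.0, "pos": 1.0}   # control-action centroids

def fuzzy_control(error):
    """Weighted average of rule outputs over the fired membership grades."""
    weights = {label: tri(error, *abc) for label, abc in ERROR_SETS.items()}
    total = sum(weights.values())
    if total == 0.0:
        return 0.0
    return sum(w * RULES[label] for label, w in weights.items()) / total

print(fuzzy_control(0.5))
```

The self-learning scheme in the paper would adjust `RULES` (and possibly the membership functions) on-line from reinforcement signals; here they stay fixed for clarity.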

  1. Solar Imaging UV/EUV Spectrometers Using TVLS Gratings

    NASA Astrophysics Data System (ADS)

    Thomas, R. J.

    2003-05-01

    It is a particular challenge to develop a stigmatic spectrograph for UV/EUV wavelengths since the very low normal-incidence reflectance of standard materials most often requires that the design be restricted to a single optical element which must simultaneously provide both re-imaging and spectral dispersion. This problem has been solved in the past by the use of toroidal gratings with uniform line-spaced rulings (TULS). A number of solar EUV spectrometers have been based on such designs, including SOHO/CDS, Solar-B/EIS, and the sounding rockets SERTS and EUNIS. More recently, Kita, Harada, and collaborators have developed the theory of spherical gratings with varied line-space rulings (SVLS) operated at unity magnification, which have been flown on several astronomical satellite missions. We now combine these ideas into a spectrometer concept that puts varied-line space rulings onto toroidal gratings. Such TVLS designs are found to provide excellent imaging even at very large spectrograph magnifications and beam-speeds, permitting extremely high-quality performance in remarkably compact instrument packages. Optical characteristics of three new solar spectrometers based on this concept are described: SUMI and RAISE, two sounding rocket payloads, and NEXUS, currently being proposed as a Small-Explorer (SMEX) mission.

  2. Toroidal varied-line space (TVLS) gratings

    NASA Astrophysics Data System (ADS)

    Thomas, Roger J.

    2003-02-01

    It is a particular challenge to develop a stigmatic spectrograph for EUV wavelengths since the very low normal-incidence reflectance of standard materials most often requires that the design be restricted to a single optical element which must simultaneously provide both re-imaging and spectral dispersion. This problem has been solved in the past by the use of toroidal gratings with uniform line-space rulings (TULS). A number of solar EUV spectrographs have been based on such designs, including SOHO/CDS, Solar-B/EIS, and the sounding rockets SERTS and EUNIS. More recently, Kita, Harada, and collaborators have developed the theory of spherical gratings with varied line-space rulings (SVLS) operated at unity magnification, which have been flown on several astronomical satellite missions. These ideas are now combined into a spectrograph concept that considers varied-line space grooves ruled onto toroidal gratings. Such TVLS designs are found to provide excellent imaging even at very large spectrograph magnifications and beam-speeds, permitting extremely high-quality performance in remarkably compact instrument packages. Optical characteristics of two solar spectrographs based on this concept are described: SUMI, proposed as a sounding rocket experiment, and NEXUS, proposed for the Solar Dynamics Observatory mission.

  3. Toroidal Varied-Line Space (TVLS) Gratings

    NASA Technical Reports Server (NTRS)

    Thomas, Roger J.; Oegerle, William (Technical Monitor)

    2002-01-01

    It is a particular challenge to develop a stigmatic spectrograph for XUV wavelengths since the very low normal-incidence reflectance of standard materials most often requires that the design be restricted to a single optical element which must simultaneously provide both re-imaging and spectral dispersion. This problem has been solved in the past by the use of toroidal gratings with uniform line-spaced rulings (TULS). A number of solar EUV (Extreme Ultraviolet) spectrometers have been based on such designs, including SOHO/CDS, Solar-B/EIS, and the sounding rockets SERTS and EUNIS. More recently, Kita, Harada, and collaborators have developed the theory of spherical gratings with varied line-space rulings (SVLS) operated at unity magnification, which have been flown on several astronomical satellite missions. We now combine these ideas into a spectrometer concept that puts varied-line space rulings onto toroidal gratings. Such TVLS designs are found to provide excellent imaging even at very large spectrograph magnifications and beam-speeds, permitting extremely high-quality performance in remarkably compact instrument packages. Optical characteristics of two solar spectrometers based on this concept are described: SUMI, proposed as a sounding rocket experiment, and NEXUS, proposed for the Solar Dynamics Observatory mission.

  4. An expert system for choosing the best combination of options in a general purpose program for automated design synthesis

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.; Barthelemy, J.-F. M.

    1986-01-01

    An expert system called EXADS has been developed to aid users of the Automated Design Synthesis (ADS) general purpose optimization program. ADS has approximately 100 combinations of strategy, optimizer, and one-dimensional search options from which to choose. It is difficult for a nonexpert to make this choice. This expert system aids the user in choosing the best combination of options based on the user's knowledge of the problem and the expert knowledge stored in the knowledge base. The knowledge base is divided into three categories: constrained problems, unconstrained problems, and constrained problems being treated as unconstrained problems. The inference engine and rules are written in LISP; the system contains about 200 rules and executes on DEC-VAX (with Franz-LISP) and IBM PC (with IQ-LISP) computers.
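The kind of rule-based selection EXADS performs can be sketched as matching the user's answers about the problem against condition-recommendation rules. The rules below are invented stand-ins (in Python rather than LISP), not the actual ~200 rules of the system.

```python
# Sketch of expert-system option selection: answers about the problem
# are matched against rules recommending a (strategy, optimizer,
# 1-D search) combination. Rule contents are illustrative only.

RULES = [
    {"when": {"constrained": False, "smooth": True},
     "recommend": ("none", "BFGS", "golden-section")},
    {"when": {"constrained": True, "smooth": True},
     "recommend": ("SUMT", "BFGS", "golden-section")},
    {"when": {"smooth": False},
     "recommend": ("none", "Nelder-Mead", "none")},
]

def recommend(answers, rules):
    """Return the first recommendation whose conditions all match."""
    for rule in rules:
        if all(answers.get(k) == v for k, v in rule["when"].items()):
            return rule["recommend"]
    return None

print(recommend({"constrained": True, "smooth": True}, RULES))
```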

  5. Design of integration-ready metasurface-based infrared absorbers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ogando, Karim, E-mail: karim@cab.cnea.gov.ar; Pastoriza, Hernán

    2015-07-28

    We introduce an integration-ready metamaterial infrared absorber design that is highly compatible with many kinds of fabrication processes. We present the results of an exhaustive experimental characterization, including an analysis of the effects of single meta-atom geometrical parameters and of the collective arrangement, and compare the results with the theoretical interpretations proposed in the literature. Based on these results, we develop a set of practical design rules for metamaterial absorbers in the infrared region.

  6. Computer-based creativity enhanced conceptual design model for non-routine design of mechanical systems

    NASA Astrophysics Data System (ADS)

    Li, Yutong; Wang, Yuxin; Duffy, Alex H. B.

    2014-11-01

    Computer-based conceptual design for routine design has made great strides, yet non-routine design has not been given due attention and is still poorly automated. Considering that the function-behavior-structure (FBS) model is widely used for modeling the conceptual design process, a computer-based creativity-enhanced conceptual design model (CECD) for non-routine design of mechanical systems is presented. In the model, the leaf functions in the FBS model are decomposed into and represented with fine-grain basic operation actions (BOA), and the corresponding BOA set in the function domain is then constructed. By choosing building blocks from the database and expressing their multiple functions with BOAs, the BOA set in the structure domain is formed. Through rule-based dynamic partition of the BOA set in the function domain, many variants of regenerated functional schemes are generated. To enhance the capability to introduce new design variables into the conceptual design process and to dig out more innovative physical structure schemes, an indirect function-structure matching strategy based on reconstructing combined structure schemes is adopted. By adjusting the tightness of the partition rules and the granularity of the divided BOA subsets, and by making full use of the main and secondary functions of each basic structure when reconstructing the physical structures, new design variables and variants are introduced into the reconstruction process, and a great number of simpler physical structure schemes that accomplish the overall function organically are found. The presented creativity-enhanced conceptual design model has a dominant capability for introducing new design variables in the function domain and for finding simpler physical structures that accomplish the overall function, and can therefore be used to solve non-routine conceptual design problems.

  7. Adaptive fractional order sliding mode control for Boost converter in the Battery/Supercapacitor HESS

    PubMed Central

    Xu, Dan; Zhou, Huan; Zhou, Tao

    2018-01-01

    In this paper, an adaptive fractional order sliding mode control (AFSMC) scheme is designed for the current tracking control of the Boost-type converter in a battery/supercapacitor hybrid energy storage system (HESS). In order to stabilize the current, adaptation rules based on a state observer and a Lyapunov function are designed. A fractional order sliding surface function is defined based on the current tracking error and the adaptive rules. Furthermore, through fractional order analysis, the stability of the fractional order control system is proven, and the value of the fractional order (λ) is investigated. In addition, the effectiveness of the proposed AFSMC strategy is verified by numerical simulations. Compared with a conventional integer order sliding mode control system, the design shows good transient response and robustness to uncertainty. PMID:29702696

  8. Modular design of synthetic gene circuits with biological parts and pools.

    PubMed

    Marchisio, Mario Andrea

    2015-01-01

    Synthetic gene circuits can be designed in an electronic fashion by displaying their basic components-Standard Biological Parts and Pools of molecules-on the computer screen and connecting them with hypothetical wires. This procedure, achieved by our add-on for the software ProMoT, was successfully applied to bacterial circuits. Recently, we have extended this design-methodology to eukaryotic cells. Here, highly complex components such as promoters and Pools of mRNA contain hundreds of species and reactions whose calculation demands a rule-based modeling approach. We showed how to build such complex modules via the joint employment of the software BioNetGen (rule-based modeling) and ProMoT (modularization). In this chapter, we illustrate how to utilize our computational tool for synthetic biology with the in silico implementation of a simple eukaryotic gene circuit that performs the logic AND operation.

  9. Design of Critical Components

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C.; Zaretsky, Erwin V.

    2001-01-01

    Critical component design is based on minimizing product failures that result in loss of life. Potential catastrophic failures are reduced to secondary failures when components are removed for cause or after a set operating time in the system. Issues of liability and the cost of component removal become of paramount importance. Deterministic design with factors of safety and probabilistic design each address, but lack, the essential characteristics for the design of critical components. In deterministic design and fabrication there are heuristic rules and safety factors developed over time for large sets of structural/material components. These factors did not come without cost: many designs failed, and many rules (codes) have standing committees to oversee their proper usage and enforcement. In probabilistic design, not only are failures a given, the failures are calculated; an element of risk is assumed based on empirical failure data for large classes of component operations. Failure of a class of components can be predicted, yet one cannot predict when a specific component will fail. The analogy is to the life insurance industry, where very careful statistics are kept on classes of individuals. For a specific class, life span can be predicted within statistical limits, yet the life span of a specific element of that class cannot be predicted.
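The class-versus-individual point above can be made concrete with a standard two-parameter Weibull life model: one can state what fraction of a component class survives a given time, and the time by which 10% of the class has failed, while saying nothing about any single unit. The shape and scale values below are invented for illustration.

```python
import math

# Worked sketch of probabilistic design with a two-parameter Weibull
# life model. Shape/scale are illustrative, not from any real dataset.

def weibull_survival(t, shape, scale):
    """Probability that a component of this class survives beyond time t."""
    return math.exp(-((t / scale) ** shape))

def l10_life(shape, scale):
    """Time by which 10% of the class has failed (the bearing-style L10 life)."""
    return scale * (-math.log(0.9)) ** (1.0 / shape)

shape, scale = 1.5, 10_000.0   # invented values, time in hours
print(weibull_survival(5_000.0, shape, scale))
print(l10_life(shape, scale))
```

The survival probability is a statement about the population; whether a particular unit is in the failing 10% remains unpredictable, which is exactly the limitation the abstract describes.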

  10. Design technology co-optimization for 14/10nm metal1 double patterning layer

    NASA Astrophysics Data System (ADS)

    Duan, Yingli; Su, Xiaojing; Chen, Ying; Su, Yajuan; Shao, Feng; Zhang, Recco; Lei, Junjiang; Wei, Yayi

    2016-03-01

    Design and technology co-optimization (DTCO) can satisfy the needs of the design, generate robust design rules, and avoid unfriendly patterns at an early stage of design, ensuring a high level of manufacturability of the product within the technical capability of the present process. The DTCO methodology in this paper mainly includes design rule translation, layout analysis, model validation, hotspot classification, and design rule optimization. Combining DTCO with double patterning (DPT) can optimize the related design rules and generate a friendlier layout that meets the requirements of the 14/10nm technology node. The experiment demonstrates the DPT-compliant DTCO methodology applied to a metal1 layer at the 14/10nm node. The DTCO workflow proposed in this work is an efficient solution for optimizing the design rules for the 14/10nm technology node metal1 layer. The paper also discusses and verifies how to tune the design rules of the U-shape and L-shape structures in a DPT-aware metal layer.

  11. Scalability of a Methodology for Generating Technical Trading Rules with GAPs Based on Risk-Return Adjustment and Incremental Training

    NASA Astrophysics Data System (ADS)

    de La Cal, E. A.; Fernández, E. M.; Quiroga, R.; Villar, J. R.; Sedano, J.

    In previous works a methodology was defined, based on the design of a genetic algorithm GAP and an incremental training technique adapted to learning series of stock market values. The GAP technique consists of a fusion of GP and GA. The GAP algorithm implements an automatic search for crisp trading rules, taking as objectives of the training both the optimization of the return obtained and the minimization of the assumed risk. Applying the proposed methodology, rules were obtained for a period of eight years of the S&P500 index. The achieved adjustment of the return-risk relation generated rules with returns, in the testing period, far superior to those obtained by applying the usual methodologies, and even clearly superior to Buy&Hold. This work proves that the proposed methodology is valid for different assets in a different market than in previous work.
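The return-risk adjustment used to train such rules can be sketched as a fitness function that rewards mean return and penalizes its volatility. The penalty weight and return series below are invented, and the GP/GA evolution of the rules themselves is omitted.

```python
# Hedged sketch of a risk-return-adjusted fitness for evolved trading
# rules: mean return minus a penalty proportional to return volatility
# (a Sharpe-like trade-off). Numbers are illustrative.

def risk_adjusted_fitness(returns, risk_weight=1.0):
    """Mean return minus risk_weight times the standard deviation of returns."""
    n = len(returns)
    mean = sum(returns) / n
    var = sum((r - mean) ** 2 for r in returns) / n
    return mean - risk_weight * var ** 0.5

steady = [0.01, 0.012, 0.009, 0.011]      # modest but consistent returns
volatile = [0.10, -0.08, 0.09, -0.07]     # similar mean, far higher risk
print(risk_adjusted_fitness(steady), risk_adjusted_fitness(volatile))
```

Under such a fitness, the GA prefers the steady rule even though the volatile one has a comparable raw mean return, which is the point of optimizing return and risk jointly.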

  12. ARC Collaborative Research Seminar Series

    Science.gov Websites

    …been used to formulate design rules for hydration-based TES systems. … structural acoustics, design of complex systems, and blast event simulations. … advanced fatigue and fracture assessment methodologies, computational methods for…

  13. A rule-based expert system applied to moisture durability of building envelopes

    DOE PAGES

    Boudreaux, Philip R.; Pallin, Simon B.; Accawi, Gina K.; ...

    2018-01-09

    The moisture durability of an envelope component such as a wall or roof is difficult to predict. Moisture durability depends on all the construction materials used, as well as the climate, orientation, air tightness, and indoor conditions. Modern building codes require more insulation and tighter construction but provide little guidance about how to ensure these energy-efficient assemblies remain moisture durable. Furthermore, as new products and materials are introduced, builders are increasingly uncertain about the long-term durability of their building envelope designs. Oak Ridge National Laboratory and the US Department of Energy’s Building America Program are applying a rule-based expert system methodology in a web tool to help designers determine whether a given wall design is likely to be moisture durable and provide expert guidance on moisture risk management specific to a wall design and climate. Finally, the expert system is populated with knowledge from both expert judgment and probabilistic hygrothermal simulation results.

  14. A rule-based expert system applied to moisture durability of building envelopes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boudreaux, Philip R.; Pallin, Simon B.; Accawi, Gina K.

    The moisture durability of an envelope component such as a wall or roof is difficult to predict. Moisture durability depends on all the construction materials used, as well as the climate, orientation, air tightness, and indoor conditions. Modern building codes require more insulation and tighter construction but provide little guidance about how to ensure these energy-efficient assemblies remain moisture durable. Furthermore, as new products and materials are introduced, builders are increasingly uncertain about the long-term durability of their building envelope designs. Oak Ridge National Laboratory and the US Department of Energy’s Building America Program are applying a rule-based expert system methodology in a web tool to help designers determine whether a given wall design is likely to be moisture durable and provide expert guidance on moisture risk management specific to a wall design and climate. Finally, the expert system is populated with knowledge from both expert judgment and probabilistic hygrothermal simulation results.

  15. Application of Adaptive Design Methodology in Development of a Long-Acting Glucagon-Like Peptide-1 Analog (Dulaglutide): Statistical Design and Simulations

    PubMed Central

    Skrivanek, Zachary; Berry, Scott; Berry, Don; Chien, Jenny; Geiger, Mary Jane; Anderson, James H.; Gaydos, Brenda

    2012-01-01

    Background Dulaglutide (dula, LY2189265), a long-acting glucagon-like peptide-1 analog, is being developed to treat type 2 diabetes mellitus. Methods To foster the development of dula, we designed a two-stage adaptive, dose-finding, inferentially seamless phase 2/3 study. The Bayesian theoretical framework is used to adaptively randomize patients in stage 1 to 7 dula doses and, at the decision point, to either stop for futility or to select up to 2 dula doses for stage 2. After dose selection, patients continue to be randomized to the selected dula doses or comparator arms. Data from patients assigned the selected doses will be pooled across both stages and analyzed with an analysis of covariance model, using baseline hemoglobin A1c and country as covariates. The operating characteristics of the trial were assessed by extensive simulation studies. Results Simulations demonstrated that the adaptive design would identify the correct doses 88% of the time, compared to as low as 6% for a fixed-dose design (the latter value based on frequentist decision rules analogous to the Bayesian decision rules for adaptive design). Conclusions This article discusses the decision rules used to select the dula dose(s); the mathematical details of the adaptive algorithm—including a description of the clinical utility index used to mathematically quantify the desirability of a dose based on safety and efficacy measurements; and a description of the simulation process and results that quantify the operating characteristics of the design. PMID:23294775
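The Bayesian adaptive randomization described above can be illustrated with a simplified Thompson-sampling sketch: each dose arm keeps a Beta posterior over its response rate, and each new patient is assigned to the arm a posterior draw favors. The arm count, true rates, and binary outcomes below are invented; the actual trial used a clinical utility index over safety and efficacy, not raw response rates.

```python
import random

# Simplified sketch of response-adaptive randomization in the Bayesian
# spirit of the trial design. All arms, rates, and outcomes are invented.

def assign_next(arms, rng):
    """Pick the arm with the highest Beta(successes+1, failures+1) draw."""
    draws = {a: rng.betavariate(s + 1, f + 1) for a, (s, f) in arms.items()}
    return max(draws, key=draws.get)

rng = random.Random(0)
arms = {"dose_low": [0, 0], "dose_high": [0, 0]}   # [successes, failures]
true_rate = {"dose_low": 0.2, "dose_high": 0.6}    # hidden from the design
for _ in range(200):
    arm = assign_next(arms, rng)
    responded = rng.random() < true_rate[arm]
    arms[arm][0 if responded else 1] += 1

# Adaptive allocation should concentrate patients on the better dose.
print(arms)
```

Simulation studies like the ones reported above rerun loops of this kind many times under assumed dose-response scenarios to estimate how often the design selects the correct dose.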

  16. Rule-violations sensitise towards negative and authority-related stimuli.

    PubMed

    Wirth, Robert; Foerster, Anna; Rendel, Hannah; Kunde, Wilfried; Pfister, Roland

    2018-05-01

    Rule violations have usually been studied from a third-person perspective, identifying situational factors that render violations more or less likely. A first-person perspective of the agent that actively violates the rules, on the other hand, is only just beginning to emerge. Here we show that committing a rule violation sensitises towards subsequent negative stimuli as well as subsequent authority-related stimuli. In a Prime-Probe design, we used an instructed rule-violation task as the Prime and a word categorisation task as the Probe. Also, we employed a control condition that used a rule inversion task as the Prime (instead of rule violations). Probe targets were categorised faster after a violation relative to after a rule-based response if they related to either negative valence or authority. Inversions, however, primed only negative stimuli and did not accelerate the categorisation of authority-related stimuli. A heightened sensitivity towards authority-related targets thus seems to be specific to rule violations. A control experiment showed that these effects cannot be explained in terms of semantic priming. Therefore, we propose that rule violations necessarily activate authority-related representations that make rule violations qualitatively different from simple rule inversions.

  17. CLIPS: A tool for the development and delivery of expert systems

    NASA Technical Reports Server (NTRS)

    Riley, Gary

    1991-01-01

    The C Language Integrated Production System (CLIPS) is a forward chaining rule-based language developed by the Software Technology Branch at the Johnson Space Center. CLIPS provides a complete environment for the construction of rule-based expert systems. CLIPS was designed specifically to provide high portability, low cost, and easy integration with external systems. Other key features of CLIPS include a powerful rule syntax, an interactive development environment, high performance, extensibility, a verification/validation tool, extensive documentation, and source code availability. The current release of CLIPS, version 4.3, is being used by over 2,500 users throughout the public and private community including: all NASA sites and branches of the military, numerous Federal bureaus, government contractors, 140 universities, and many companies.
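
CLIPS programs are written in CLIPS's own rule syntax; as a language-neutral sketch of the forward-chaining match-fire cycle it implements, consider the following (the facts and rules are invented for illustration):

```python
def forward_chain(facts, rules):
    """Naive forward chaining: repeatedly fire any rule whose
    conditions are all satisfied by the current facts, adding its
    conclusion, until no new facts can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conclusion not in facts and all(c in facts for c in conditions):
                facts.add(conclusion)
                changed = True
    return facts

# Each rule is (conditions, conclusion).
rules = [
    (("duck",), "quacks"),
    (("quacks", "feathers"), "bird"),
]
print(forward_chain({"duck", "feathers"}, rules))
```

A production system like CLIPS adds an efficient match algorithm (Rete) and conflict resolution on top of this basic cycle; the sketch shows only the inference direction.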

  18. Automatic programming via iterated local search for dynamic job shop scheduling.

    PubMed

    Nguyen, Su; Zhang, Mengjie; Johnston, Mark; Tan, Kay Chen

    2015-01-01

    Dispatching rules have been commonly used in practice for making sequencing and scheduling decisions. Due to specific characteristics of each manufacturing system, there is no universal dispatching rule that can dominate in all situations. Therefore, it is important to design specialized dispatching rules to enhance the scheduling performance for each manufacturing environment. Evolutionary computation approaches such as tree-based genetic programming (TGP) and gene expression programming (GEP) have been proposed to facilitate the design task through automatic design of dispatching rules. However, these methods are still limited by their high computational cost and low exploitation ability. To overcome this problem, we develop a new approach to automatic programming via iterated local search (APRILS) for dynamic job shop scheduling. The key idea of APRILS is to perform multiple local searches starting from programs modified from the best programs obtained so far. The experiments show that APRILS outperforms TGP and GEP in most simulation scenarios in terms of effectiveness and efficiency. The analysis also shows that programs generated by APRILS are more compact than those obtained by genetic programming. An investigation of the behavior of APRILS suggests that the good performance of APRILS comes from the balance between exploration and exploitation in its search mechanism.
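
The core iterated-local-search loop (hill-climb to a local optimum, perturb the best solution found, climb again) can be sketched on a toy continuous problem. The objective below is a stand-in for a dispatching-rule simulation score, not the paper's program representation:

```python
import random

def iterated_local_search(evaluate, start, n_iters=50, seed=0):
    """Skeleton of iterated local search (minimisation): local search
    to a local optimum, then repeatedly perturb the incumbent and
    re-run local search, keeping any improvement."""
    rng = random.Random(seed)

    def local_search(sol):
        improved = True
        while improved:
            improved = False
            for i in range(len(sol)):
                for step in (-0.1, 0.1):
                    cand = list(sol)
                    cand[i] += step
                    if evaluate(cand) < evaluate(sol):
                        sol, improved = cand, True
        return sol

    best = local_search(list(start))
    for _ in range(n_iters):
        perturbed = [x + rng.uniform(-0.5, 0.5) for x in best]
        cand = local_search(perturbed)
        if evaluate(cand) < evaluate(best):
            best = cand
    return best

# Toy objective standing in for evaluating a dispatching rule by simulation.
def score(w):
    return sum((x - 1.0) ** 2 for x in w)

best = iterated_local_search(score, [0.0, 0.0])
print(best, score(best))
```

In APRILS the "solutions" are programs and the moves are program modifications, but the explore/exploit structure of the loop is the same.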

  19. Design issues of a reinforcement-based self-learning fuzzy controller for petrochemical process control

    NASA Technical Reports Server (NTRS)

    Yen, John; Wang, Haojin; Daugherity, Walter C.

    1992-01-01

    Fuzzy logic controllers have some often-cited advantages over conventional techniques such as PID control, including easier implementation, accommodation to natural language, and the ability to cover a wider range of operating conditions. One major obstacle that hinders the broader application of fuzzy logic controllers is the lack of a systematic way to develop and modify their rules; as a result the creation and modification of fuzzy rules often depends on trial and error or pure experimentation. One of the proposed approaches to address this issue is a self-learning fuzzy logic controller (SFLC) that uses reinforcement learning techniques to learn the desirability of states and to adjust the consequent part of its fuzzy control rules accordingly. Due to the different dynamics of the controlled processes, the performance of a self-learning fuzzy controller is highly contingent on its design. The design issue has not received sufficient attention. The issues related to the design of a SFLC for application to a petrochemical process are discussed, and its performance is compared with that of a PID and a self-tuning fuzzy logic controller.

  20. Design of fuzzy systems using neurofuzzy networks.

    PubMed

    Figueiredo, M; Gomide, F

    1999-01-01

    This paper introduces a systematic approach for fuzzy system design based on a class of neural fuzzy networks built upon a general neuron model. The network structure is such that it encodes the knowledge learned in the form of if-then fuzzy rules and processes data following fuzzy reasoning principles. The technique provides a mechanism to obtain rules covering the whole input/output space as well as the membership functions (including their shapes) for each input variable. Such characteristics are of utmost importance in fuzzy systems design and application. In addition, after learning, it is very simple to extract fuzzy rules in the linguistic form. The network has universal approximation capability, a property very useful in, e.g., modeling and control applications. Here we focus on function approximation problems as a vehicle to illustrate its usefulness and to evaluate its performance. Comparisons with alternative approaches are also included. Both noise-free and noisy data have been considered in the computational experiments. The neural fuzzy network developed here, and consequently the underlying approach, has been shown to provide good results from the accuracy, complexity, and system design points of view.
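
The if-then fuzzy rule processing that such networks encode can be illustrated with a minimal hand-written example. The triangular membership functions and the three-rule base below are invented for illustration, not extracted from the paper's network:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a to a peak at b,
    falling back to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_output(error):
    """Evaluate a tiny rule base and defuzzify by weighted average:
    if error is negative -> -1; if zero -> 0; if positive -> +1."""
    rules = [(tri(error, -2, -1, 0), -1.0),
             (tri(error, -1, 0, 1), 0.0),
             (tri(error, 0, 1, 2), 1.0)]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

print(fuzzy_output(0.25))  # blend of the "zero" and "positive" rules
```

Learning in a neurofuzzy network amounts to tuning the membership-function parameters and rule consequents that are hard-coded here.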

  1. 10 CFR Appendix C to Part 52 - Design Certification Rule for the AP600 Design

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Design Certification Rule for the AP600 Design C Appendix C to Part 52 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSES, CERTIFICATIONS, AND APPROVALS FOR NUCLEAR POWER PLANTS Pt. 52, App. C Appendix C to Part 52—Design Certification Rule for the...

  2. Fuzzylot: a novel self-organising fuzzy-neural rule-based pilot system for automated vehicles.

    PubMed

    Pasquier, M; Quek, C; Toh, M

    2001-10-01

    This paper presents part of our research work concerned with the realisation of an Intelligent Vehicle and the technologies required for its routing, navigation, and control. An automated driver prototype has been developed using a self-organising fuzzy rule-based system (POPFNN-CRI(S)) to model and subsequently emulate human driving expertise. The ability of fuzzy logic to represent vague information using linguistic variables makes it a powerful tool to develop rule-based control systems when an exact working model is not available, as is the case of any vehicle-driving task. Designing a fuzzy system, however, is a complex endeavour, due to the need to define the variables and their associated fuzzy sets, and determine a suitable rule base. Many efforts have thus been devoted to automating this process, yielding the development of learning and optimisation techniques. One of them is the family of POP-FNNs, or Pseudo-Outer Product Fuzzy Neural Networks (TVR, AARS(S), AARS(NS), CRI, Yager). These generic self-organising neural networks developed at the Intelligent Systems Laboratory (ISL/NTU) are based on formal fuzzy mathematical theory and are able to objectively extract a fuzzy rule base from training data. In this application, a driving simulator has been developed that integrates a detailed model of the car dynamics, complete with engine characteristics and environmental parameters, and an OpenGL-based 3D-simulation interface coupled with a driving wheel and accelerator/brake pedals. The simulator has been used on various road scenarios to record, from a human pilot, driving data consisting of steering and speed control actions associated with road features. Specifically, the POPFNN-CRI(S) system is used to cluster the data and extract a fuzzy rule base modelling the human driving behaviour. Finally, the effectiveness of the generated rule base has been validated using the simulator in autopilot mode.

  3. Using new aggregation operators in rule-based intelligent control

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.; Chen, Yung-Yaw; Yager, Ronald R.

    1990-01-01

    A new aggregation operator is applied in the design of an approximate reasoning-based controller. The ordered weighted averaging (OWA) operator has the property of lying between the And function and the Or function used in previous fuzzy set reasoning systems. It is shown here that, by applying OWA operators, more generalized types of control rules, which may include linguistic quantifiers such as Many and Most, can be developed. The new aggregation operators, as tested in a cart-pole balancing control problem, illustrate improved performance when compared with existing fuzzy control aggregation schemes.
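
The OWA operator described above is simple to state concretely: sort the inputs in descending order and take a weighted sum with position-based weights. The weight vectors in this sketch are illustrative, not taken from the paper:

```python
def owa(values, weights):
    """Ordered weighted averaging: sort the inputs in descending
    order, then take the weighted sum with position-based weights."""
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("OWA weights must sum to 1")
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

# Extreme weight vectors reproduce Or (max) and And (min); weights in
# between model soft linguistic quantifiers such as "most".
vals = [0.2, 0.9, 0.5]
print(owa(vals, [1.0, 0.0, 0.0]))   # -> 0.9  (pure Or / max)
print(owa(vals, [0.0, 0.0, 1.0]))   # -> 0.2  (pure And / min)
print(owa(vals, [0.2, 0.6, 0.2]))   # -> 0.52 ("most"-like compromise)
```

Because the weights attach to rank positions rather than to particular inputs, a single weight vector expresses how many satisfied conditions are "enough", which is what allows quantified control rules.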

  4. Description of sampling designs using a comprehensive data structure

    Treesearch

    John C. Byrne; Albert R. Stage

    1988-01-01

    Maintaining permanent plot data with different sampling designs over long periods within an organization, as well as sharing such information between organizations, requires that common standards be used. A data structure for the description of the sampling design within a stand is proposed. It is based on the definition of subpopulations of trees sampled, the rules...

  5. A clocking discipline for two-phase digital integrated circuits

    NASA Astrophysics Data System (ADS)

    Noice, D. C.

    1983-09-01

    Sooner or later a designer of digital circuits must face the problem of timing verification so he can avoid errors caused by clock skew, critical races, and hazards. Unlike previous verification methods, such as timing simulation and timing analysis, the approach presented here guarantees correct operation despite uncertainty about delays in the circuit. The result is a clocking discipline that deals with timing abstractions only. It is not based on delay calculations; it is only concerned with the correct, synchronous operation at some clock rate. Accordingly, it may be used earlier in the design cycle, which is particularly important to integrated circuit designs. The clocking discipline consists of a notation of clocking types, and composition rules for using the types. Together, the notation and rules define a formal theory of two phase clocking. The notation defines the names and exact characteristics for different signals that are used in a two phase digital system. The notation makes it possible to develop rules for propagating the clocking types through particular circuits.
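
The idea of propagating clocking types through a circuit can be illustrated with a toy checker. The two phase labels and the composition rules below are a simplified invention for illustration, not the notation of the actual discipline:

```python
# Hypothetical, simplified composition rules in the spirit of a
# two-phase clocking discipline: combinational logic preserves a
# signal's phase label, and a latch clocked on one phase accepts
# only signals of that phase and emits signals of the other.

def combinational(*inputs):
    """All inputs must carry the same phase label, which propagates
    unchanged to the output."""
    phases = set(inputs)
    if len(phases) != 1:
        raise ValueError(f"mixed clocking types: {phases}")
    return phases.pop()

def latch(clock_phase, signal_phase):
    """A latch clocked on `clock_phase` requires a matching input
    and produces a signal of the opposite phase."""
    if signal_phase != clock_phase:
        raise ValueError("latch input violates clocking discipline")
    return "phi2" if clock_phase == "phi1" else "phi1"

s = combinational("phi1", "phi1")   # stays phi1
s = latch("phi1", s)                # becomes phi2
print(s)  # -> phi2
```

The point of such a checker is that it involves no delay numbers at all: correctness is decided purely from the type labels, which is why the discipline can be applied early in the design cycle.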

  6. Elasticity-dependent fast underwater adhesion demonstrated by macroscopic supramolecular assembly.

    PubMed

    Ju, Guannan; Cheng, Mengjiao; Guo, Fengli; Zhang, Qian; Shi, Feng

    2018-05-30

    Macroscopic supramolecular assembly (MSA) is a recent advance in supramolecular chemistry that associates visible building blocks through non-covalent interactions in a multivalent manner. Although various substrates (e.g., hydrogels, rigid materials) have been used, a general design rule for building blocks in MSA systems and an interpretation of the assembly mechanism are still lacking and urgently needed. Here we designed three model systems with varied moduli and correlated the MSA probability with elasticity. Based on the effects of substrate deformability on multivalency, we propose an elastic-modulus-dependent rule: building blocks below a critical modulus of 2.5 MPa can achieve MSA for the host/guest system used. Moreover, this MSA rule applies well to the design of materials for fast underwater adhesion: soft substrates (0.5 MPa) achieve underwater adhesion within 10 s with one order of magnitude higher strength than rigid substrates (2.5 MPa). © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Design of retinal-projection-based near-eye display with contact lens.

    PubMed

    Wu, Yuhang; Chen, Chao Ping; Mi, Lantian; Zhang, Wenbo; Zhao, Jingxin; Lu, Yifan; Guo, Weiqian; Yu, Bing; Li, Yang; Maitlo, Nizamuddin

    2018-04-30

    We propose a design of a retinal-projection-based near-eye display for achieving ultra-large field of view, vision correction, and occlusion. Our solution is highlighted by a contact lens combo, a transparent organic light-emitting diode panel, and a twisted nematic liquid crystal panel. Its design rules are set forth in detail, followed by the results and discussion regarding the field of view, angular resolution, modulation transfer function, contrast ratio, distortion, and simulated imaging.

  8. Checking Flight Rules with TraceContract: Application of a Scala DSL for Trace Analysis

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Havelund, Klaus; Morris, Robert A.

    2011-01-01

    Typically during the design and development of a NASA space mission, rules and constraints are identified to help reduce reasons for failure during operations. These flight rules are usually captured in a set of indexed tables, containing rule descriptions, rationales for the rules, and other information. Flight rules can be part of manual operations procedures carried out by humans. However, they can also be automated, and either implemented as on-board monitors, or as ground based monitors that are part of a ground data system. In the case of automated flight rules, one considerable expense to be addressed for any mission is the extensive process by which system engineers express flight rules in prose, software developers translate these requirements into code, and then both experts verify that the resulting application is correct. This paper explores the potential benefits of using an internal Scala DSL for general trace analysis, named TRACECONTRACT, to write executable specifications of flight rules. TRACECONTRACT can be applied generally to the analysis of, for example, log files, or to monitoring executing systems online.
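
TraceContract itself is a Scala DSL; as a language-neutral illustration of the underlying idea of checking a temporal rule over an event trace, here is a hand-rolled monitor for a hypothetical flight rule ("every COMMAND must be acknowledged before the next COMMAND"), with invented event names:

```python
def check_ack_rule(trace):
    """Minimal trace monitor for a flight-rule-like property: every
    COMMAND event must be followed by an ACK before the next COMMAND
    is issued. Returns the indices of violating COMMAND events."""
    violations, pending = [], None
    for i, event in enumerate(trace):
        if event == "COMMAND":
            if pending is not None:
                violations.append(pending)  # previous command unacked
            pending = i
        elif event == "ACK":
            pending = None
    if pending is not None:
        violations.append(pending)  # trace ended with an open command
    return violations

good = ["COMMAND", "ACK", "COMMAND", "ACK"]
bad = ["COMMAND", "COMMAND", "ACK", "COMMAND"]
print(check_ack_rule(good))  # -> []
print(check_ack_rule(bad))   # -> [0, 3]
```

A DSL like TraceContract lets such properties be stated declaratively and composed, instead of hand-coding the state tracking as done here.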

  9. Dose Transition Pathways: The Missing Link Between Complex Dose-Finding Designs and Simple Decision-Making.

    PubMed

    Yap, Christina; Billingham, Lucinda J; Cheung, Ying Kuen; Craddock, Charlie; O'Quigley, John

    2017-12-15

    The ever-increasing pace of development of novel therapies mandates efficient methodologies for assessment of their tolerability and activity. Evidence increasingly supports the merits of model-based dose-finding designs in identifying the recommended phase II dose compared with conventional rule-based designs such as the 3 + 3 but despite this, their use remains limited. Here, we propose a useful tool, dose transition pathways (DTP), which helps overcome several commonly faced practical and methodologic challenges in the implementation of model-based designs. DTP projects in advance the doses recommended by a model-based design for subsequent patients (stay, escalate, de-escalate, or stop early), using all the accumulated information. After specifying a model with favorable statistical properties, we utilize the DTP to fine-tune the model to tailor it to the trial's specific requirements that reflect important clinical judgments. In particular, it can help to determine how stringent the stopping rules should be if the investigated therapy is too toxic. Its use to design and implement a modified continual reassessment method is illustrated in an acute myeloid leukemia trial. DTP removes the fears of model-based designs as unknown, complex systems and can serve as a handbook, guiding decision-making for each dose update. In the illustrated trial, the seamless, clear transition for each dose recommendation aided the investigators' understanding of the design and facilitated decision-making to enable finer calibration of a tailored model. We advocate the use of the DTP as an integral procedure in the co-development and successful implementation of practical model-based designs by statisticians and investigators. Clin Cancer Res; 23(24); 7440-7. ©2017 AACR.
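
The idea of projecting dose recommendations in advance can be sketched by enumerating all pathways under a simple decision rule. The escalation rule below is a hypothetical stand-in for illustration, not the modified continual reassessment method used in the trial:

```python
def next_dose(dose, n_dlt, lowest=1, highest=4):
    """Hypothetical dose-decision rule for cohorts of three patients:
    escalate on 0 dose-limiting toxicities (DLTs), stay on 1,
    de-escalate on 2, stop the trial on 3."""
    if n_dlt == 0:
        return min(dose + 1, highest)
    if n_dlt == 1:
        return dose
    if n_dlt == 2:
        return max(dose - 1, lowest)
    return None  # stop early

def pathways(dose, cohorts):
    """Enumerate every dose transition pathway over the given number
    of cohorts: each step records (observed DLTs, next dose)."""
    if cohorts == 0 or dose is None:
        return [[]]
    paths = []
    for n_dlt in range(4):  # 0..3 DLTs in a cohort of three
        nxt = next_dose(dose, n_dlt)
        for tail in pathways(nxt, cohorts - 1):
            paths.append([(n_dlt, nxt)] + tail)
    return paths

table = pathways(dose=2, cohorts=2)
print(len(table))  # -> 13 two-cohort pathways from dose level 2
```

Laying out such a table before the trial starts is exactly what lets investigators audit and fine-tune a design's behaviour for every possible outcome.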

  10. UPM: unified policy-based network management

    NASA Astrophysics Data System (ADS)

    Law, Eddie; Saxena, Achint

    2001-07-01

    Besides providing network management for the Internet, it has become essential to offer different levels of Quality of Service (QoS) to users. Policy-based management provides control over network routers to achieve this goal. The Internet Engineering Task Force (IETF) has proposed a two-tier architecture whose implementation is based on the Common Open Policy Service (COPS) protocol and Lightweight Directory Access Protocol (LDAP). However, there are several limitations to this design, such as scalability and cross-vendor hardware compatibility. To address these issues, we present a functionally enhanced multi-tier policy management architecture design in this paper. Several extensions are introduced, thereby adding flexibility and scalability. In particular, an intermediate entity between the policy server and policy rule database called the Policy Enforcement Agent (PEA) is introduced. By keeping internal data in a common format, using a standard protocol, and by interpreting and translating request and decision messages from multi-vendor hardware, this agent allows a dynamic Unified Information Model throughout the architecture. We have tailor-made this unique information system to store policy rules in the directory server and allow execution of policy rules with dynamic addition of new equipment during run-time.

  11. Development of a New Departure Aversion Standard for Light Aircraft

    NASA Technical Reports Server (NTRS)

    Borer, Nicholas K.

    2017-01-01

    The Federal Aviation Administration (FAA) and European Aviation Safety Agency (EASA) have recently established new light aircraft certification rules that introduce significant changes to the current regulations. The changes include moving from prescriptive design requirements to performance-based standards, transferring many of the acceptable means of compliance out of the rules and into consensus standards. In addition, the FAA/EASA rules change the performance requirements associated with some of the more salient safety issues regarding light aircraft. One significant change is the elimination of spin recovery demonstration. The new rules now call for enhanced stall warning and aircraft handling characteristics that demonstrate resistance to inadvertent departure from controlled flight. The means of compliance with these changes in a safe, cost-effective manner is a challenging problem. This paper discusses existing approaches to reducing the likelihood of departure from controlled flight and introduces a new approach, dubbed Departure Aversion, which allows applicants to tailor the amount of departure resistance, stall warning, and enhanced safety equipment to meet the new proposed rules. The Departure Aversion approach gives applicants the freedom to select the most cost-effective portfolio for their design, while meeting the safety intent of the new rules, by ensuring that any combination of the selected approaches will be at a higher equivalent level of safety than today's status quo.

  12. Rocket Design for the Future

    NASA Technical Reports Server (NTRS)

    Follett, William W.; Rajagopal, Raj

    2001-01-01

    The focus of the AA MDO team is to reduce product development cost through the capture and automation of best design and analysis practices and through increasing the availability of low-cost, high-fidelity analysis. Implementation of robust designs reduces costs associated with the Test-Fail-Fix cycle. RD is currently focusing on several technologies to improve the design process, including optimization and robust design, expert and rule-based systems, and collaborative technologies.

  13. Design of impact-resistant boron/aluminum large fan blade

    NASA Technical Reports Server (NTRS)

    Salemme, C. T.; Yokel, S. A.

    1978-01-01

    The technical program was comprised of two technical tasks. Task 1 encompassed the preliminary boron/aluminum fan blade design effort. Two preliminary designs were evolved. An initial design consisted of 32 blades per stage and was based on material properties extracted from manufactured blades. A final design of 36 blades per stage was based on rule-of-mixture material properties. In Task 2, the selected preliminary blade design was refined via more sophisticated analytical tools. Detailed finite element stress analysis and aero performance analysis were carried out to determine blade material frequencies and directional stresses.

  14. Det Norske Veritas rule philosophy with regard to gas turbines for marine propulsion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, P.

    1999-04-01

    This paper is mainly based on Det Norske Veritas (DNV) Rules of January 1996, Part 4, Chapter 2, Section 4 -- Gas Turbines, and is intended to at least open the dialogue between the gas turbine industry and DNV. There is a need for systematic design approval, manufacturing inspection, and testing procedures that match the standards of the industry. The roles and expectations imposed by owners, the authorities, insurance agencies, etc. need to be understood. These expectations often have technical implications that may go against the normal procedures and practices of the gas turbine industry, and could have cost impacts. The question of DNV acceptance criteria has been asked many times, with respect to gas turbines. DNV relies a great deal on the manufacturer to provide the basis for the design criteria, manufacturing, and testing criteria of the gas turbine. However, DNV adds its knowledge and experience to this, and checks that the documentation presented by the manufacturer is technically acceptable. Generally, a high level of state-of-the-art theoretical documentation is required to support the design of modern gas turbines. A proper understanding of the rule philosophy of DNV could prove to be useful in developing better gas turbine systems, which fulfill the rule requirements, and at the same time save resources such as money and time. It is important for gas turbine manufacturers to understand the intent of the rules since it is the intent that needs to be fulfilled. Further, the rules do have the principle of equivalence, which means that there is full freedom in how one fulfills the intent of the rules, as long as DNV accepts the solution.

  15. Rebuilding the NAVSEA Early Stage Ship Design Environment

    DTIC Science & Technology

    2010-04-01

    rules-of-thumb to base these crucial decisions upon. With High Performance Computing (HPC) as an enabler, the vision is to explore all downstream...the results of the analysis back into LEAPS. Another software development worthy of discussion here is Intelligent Ship Arrangements (ISA), which...constraints and rules set by the users ahead of time. When used in a systematic and stochastic way, and when integrated using LEAPS, having this

  16. Health-Terrain: Visualizing Large Scale Health Data

    DTIC Science & Technology

    2014-12-01

    systems can only be realized if the quality of emerging large medical databases can be characterized and the meaning of the data understood. For this...Designed and tested an evaluation procedure for a health data visualization system. This visualization framework offers a real-time and web-based solution...rule is shown in the table, with the quality measures of each rule including the support, confidence, Laplace, Gain, p-s, lift and Conviction. We

  17. Lung Cancer Assistant: a hybrid clinical decision support application for lung cancer care.

    PubMed

    Sesen, M Berkan; Peake, Michael D; Banares-Alcantara, Rene; Tse, Donald; Kadir, Timor; Stanley, Roz; Gleeson, Fergus; Brady, Michael

    2014-09-06

    Multidisciplinary team (MDT) meetings are becoming the model of care for cancer patients worldwide. While MDTs have improved the quality of cancer care, the meetings impose substantial time pressure on the members, who generally attend several such MDTs. We describe Lung Cancer Assistant (LCA), a clinical decision support (CDS) prototype designed to assist the experts in the treatment selection decisions in the lung cancer MDTs. A novel feature of LCA is its ability to provide rule-based and probabilistic decision support within a single platform. The guideline-based CDS is based on clinical guideline rules, while the probabilistic CDS is based on a Bayesian network trained on the English Lung Cancer Audit Database (LUCADA). We assess rule-based and probabilistic recommendations based on their concordance with the treatments recorded in LUCADA. Our results reveal that the guideline rule-based recommendations perform well in simulating the recorded treatments, with exact and partial concordance rates of 0.57 and 0.79, respectively. On the other hand, the exact and partial concordance rates achieved with probabilistic results are relatively poorer, at 0.27 and 0.76. However, probabilistic decision support fulfils a complementary role in providing accurate survival estimations. Compared to recorded treatments, both CDS approaches promote higher resection rates and multimodality treatments.

  18. A knowledge-based patient assessment system: conceptual and technical design.

    PubMed Central

    Reilly, C. A.; Zielstorff, R. D.; Fox, R. L.; O'Connell, E. M.; Carroll, D. L.; Conley, K. A.; Fitzgerald, P.; Eng, T. K.; Martin, A.; Zidik, C. M.; Segal, M.

    2000-01-01

    This paper describes the design of an inpatient patient assessment application that captures nursing assessment data using a wireless laptop computer. The primary aim of this system is to capture structured information for facilitating decision support and quality monitoring. The system also aims to improve efficiency of recording patient assessments, reduce costs, and improve discharge planning and early identification of patient learning needs. Object-oriented methods were used to elicit functional requirements and to model the proposed system. A tools-based development approach is being used to facilitate rapid development and easy modification of assessment items and rules for decision support. Criteria for evaluation include perceived utility by clinician users, validity of decision support rules, time spent recording assessments, and perceived utility of aggregate reports for quality monitoring. PMID:11079970

  19. Implementing a Rule-Based Contract Compliance Checker

    NASA Astrophysics Data System (ADS)

    Strano, Massimo; Molina-Jimenez, Carlos; Shrivastava, Santosh

    The paper describes the design and implementation of an independent, third party contract monitoring service called Contract Compliance Checker (CCC). The CCC is provided with the specification of the contract in force, and is capable of observing and logging the relevant business-to-business (B2B) interaction events, in order to determine whether the actions of the business partners are consistent with the contract. A contract specification language called EROP (for Events, Rights, Obligations and Prohibitions) for the CCC has been developed based on business rules, providing constructs to specify what rights, obligations, and prohibitions become active and inactive after the occurrence of events related to the execution of business operations. The system has been designed to work with B2B industry standards such as ebXML and RosettaNet.
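
The EROP idea of obligations becoming active and inactive on business events can be sketched as a small state machine. The event and obligation names below are invented for illustration, not taken from the EROP language:

```python
# Sketch of one EROP-style rule under assumed event names: after a
# buyer submits a purchase order, the seller gains the obligation to
# respond to it; accepting or rejecting discharges that obligation.

class ContractState:
    def __init__(self):
        self.obligations = set()

    def on_event(self, event):
        """Update the set of active obligations on each logged event."""
        if event == "PurchaseOrderReceived":
            self.obligations.add("RespondToOrder")
        elif event in ("OrderAccepted", "OrderRejected"):
            self.obligations.discard("RespondToOrder")

    def compliant(self):
        """At the end of an interaction, no obligation may remain active."""
        return not self.obligations

state = ContractState()
for e in ["PurchaseOrderReceived", "OrderAccepted"]:
    state.on_event(e)
print(state.compliant())  # -> True
```

A real checker would also track rights and prohibitions, deadlines, and the identities of the parties; only the activate/discharge mechanism is shown.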

  20. Multi-objective design of fuzzy logic controller in supply chain

    NASA Astrophysics Data System (ADS)

    Ghane, Mahdi; Tarokh, Mohammad Jafar

    2012-08-01

    Unlike commonly used methods, in this paper, we have introduced a new approach for designing fuzzy controllers. In this approach, we have simultaneously optimized both objective functions of a supply chain over a two-dimensional space. Then, we have obtained a spectrum of optimized points, each of which represents a set of optimal parameters which can be chosen by the manager according to the importance of the objective functions. Our supply chain model is a member of the inventory and order-based production control system family, a generalization of the periodic review policy which is termed `Order-Up-To policy.' An automatic rule maker, based on the non-dominated sorting genetic algorithm II (NSGA-II), has been applied to the experimental initial fuzzy rules. According to performance measurement, our results indicate the efficiency of the proposed approach.
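
The "spectrum of optimized points" described above is a Pareto front. A minimal sketch of Pareto dominance and non-dominated filtering (the core of NSGA-II's sorting step) follows, with invented objective values:

```python
def dominates(a, b):
    """True if objective vector `a` Pareto-dominates `b` under
    minimisation: no worse in every objective, better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """The non-dominated set: the spectrum of trade-off solutions from
    which a manager can pick according to objective importance."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Two hypothetical supply-chain objectives, e.g. inventory cost vs.
# order-rate variance, both to be minimised.
pts = [(1, 5), (2, 2), (4, 1), (3, 3), (5, 5)]
print(pareto_front(pts))  # -> [(1, 5), (2, 2), (4, 1)]
```

NSGA-II additionally ranks dominated points into successive fronts and uses crowding distance to keep the front well spread; the filter above shows only the dominance relation it is built on.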

  1. A knowledge-based patient assessment system: conceptual and technical design.

    PubMed

    Reilly, C A; Zielstorff, R D; Fox, R L; O'Connell, E M; Carroll, D L; Conley, K A; Fitzgerald, P; Eng, T K; Martin, A; Zidik, C M; Segal, M

    2000-01-01

    This paper describes the design of an inpatient patient assessment application that captures nursing assessment data using a wireless laptop computer. The primary aim of this system is to capture structured information for facilitating decision support and quality monitoring. The system also aims to improve efficiency of recording patient assessments, reduce costs, and improve discharge planning and early identification of patient learning needs. Object-oriented methods were used to elicit functional requirements and to model the proposed system. A tools-based development approach is being used to facilitate rapid development and easy modification of assessment items and rules for decision support. Criteria for evaluation include perceived utility by clinician users, validity of decision support rules, time spent recording assessments, and perceived utility of aggregate reports for quality monitoring.

  2. Toward rules relating zinc finger protein sequences and DNA binding site preferences.

    PubMed

    Desjarlais, J R; Berg, J M

    1992-08-15

    Zinc finger proteins of the Cys2-His2 type consist of tandem arrays of domains, where each domain appears to contact three adjacent base pairs of DNA through three key residues. We have designed and prepared a series of variants of the central zinc finger within the DNA binding domain of Sp1 by using information from an analysis of a large database of zinc finger protein sequences. Through systematic variations at two of the three contact positions (underlined), relatively specific recognition of sequences of the form 5'-GGGGN(G or T)GGG-3' has been achieved. These results provide the basis for rules that may develop into a code that will allow the design of zinc finger proteins with preselected DNA site specificity.

  3. Retrosynthetic Analysis-Guided Breaking Tile Symmetry for the Assembly of Complex DNA Nanostructures.

    PubMed

    Wang, Pengfei; Wu, Siyu; Tian, Cheng; Yu, Guimei; Jiang, Wen; Wang, Guansong; Mao, Chengde

    2016-10-11

    Current tile-based DNA self-assembly produces simple repetitive or highly symmetric structures. In the case of 2D lattices, the unit cell often contains only one basic tile because the tiles often are symmetric (in terms of either the backbone or the sequence). In this work, we have applied retrosynthetic analysis to determine the minimal asymmetric units for complex DNA nanostructures. Such analysis guides us to break the intrinsic structural symmetries of the tiles to achieve high structural complexities. This strategy has led to the construction of several DNA nanostructures that are not accessible from conventional symmetric tile designs. Along with previous studies, herein we have established a set of four fundamental rules regarding tile-based assembly. Such rules could serve as guidelines for the design of DNA nanostructures.

  4. Design rules for RCA self-aligned silicon-gate CMOS/SOS process

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The CMOS/SOS design rules prepared by the RCA Solid State Technology Center (SSTC) are described. These rules specify the spacing and width requirements for each of the six design levels, the seventh level being used to define openings in the passivation level. An associated report, entitled Silicon-Gate CMOS/SOS Processing, provides further insight into the usage of these rules.

  5. DeviceEditor visual biological CAD canvas

    PubMed Central

    2012-01-01

    Background Biological Computer Aided Design (bioCAD) assists the de novo design and selection of existing genetic components to achieve a desired biological activity, as part of an integrated design-build-test cycle. To meet the emerging needs of Synthetic Biology, bioCAD tools must address the increasing prevalence of combinatorial library design, design rule specification, and scar-less multi-part DNA assembly. Results We report the development and deployment of web-based bioCAD software, DeviceEditor, which provides a graphical design environment that mimics the intuitive visual whiteboard design process practiced in biological laboratories. The key innovations of DeviceEditor include visual combinatorial library design, direct integration with scar-less multi-part DNA assembly design automation, and a graphical user interface for the creation and modification of design specification rules. We demonstrate how biological designs are rendered on the DeviceEditor canvas, and we present effective visualizations of genetic component ordering and combinatorial variations within complex designs. Conclusions DeviceEditor liberates researchers from DNA base-pair manipulation, and enables users to create successful prototypes using standardized, functional, and visual abstractions. Open and documented software interfaces support further integration of DeviceEditor with other bioCAD tools and software platforms. DeviceEditor saves researcher time and institutional resources through correct-by-construction design, the automation of tedious tasks, design reuse, and the minimization of DNA assembly costs. PMID:22373390

  6. Design Rules for Life Support Systems

    NASA Technical Reports Server (NTRS)

    Jones, Harry

    2002-01-01

    This paper considers some of the common assumptions and engineering rules of thumb used in life support system design. One general design rule is that the longer the mission, the more the life support system should use recycling and regenerable technologies. A more specific rule is that, if the system grows more than half the food, the food plants will supply all the oxygen needed for the crew life support. There are many such design rules that help in planning the analysis of life support systems and in checking results. These rules are typically if-then statements describing the results of steady-state, "back of the envelope," mass flow calculations. They are useful in identifying plausible candidate life support system designs and in rough allocations between resupply and resource recovery. Life support system designers should always review the design rules and make quick steady state calculations before doing detailed design and dynamic simulation. This paper develops the basis for the different assumptions and design rules and discusses how they should be used. We start top-down, with the highest level requirement to sustain human beings in a closed environment off Earth. We consider the crew needs for air, water, and food. We then discuss atmosphere leakage and recycling losses. The needs to support the crew and to make up losses define the fundamental life support system requirements. We consider the trade-offs between resupplying and recycling oxygen, water, and food. The specific choices between resupply and recycling are determined by mission duration, presence of in-situ resources, etc., and are defining parameters of life support system design.
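The kind of steady-state, "back of the envelope" mass-flow calculation described above can be sketched in a few lines. All per-crew-member daily rates, the recycler hardware mass, and the recovery fraction below are assumed illustrative values, not figures from the paper:

```python
# Illustrative steady-state mass-flow check: resupply vs. recycling.
# Daily rates and hardware mass are hypothetical placeholders.

def resupply_mass(crew, days, daily_need_kg):
    """Total launch mass if a consumable is fully resupplied."""
    return crew * days * daily_need_kg

def recycling_mass(crew, days, daily_need_kg, hardware_kg, recovery=0.90):
    """Fixed recycler hardware plus make-up mass for recovery losses."""
    makeup = crew * days * daily_need_kg * (1.0 - recovery)
    return hardware_kg + makeup

crew, water_kg_per_day = 4, 3.5    # assumed values
hardware = 500.0                   # assumed recycler mass, kg

# The general design rule: recycling wins once the mission is long enough
# that the saved consumable mass exceeds the fixed hardware mass.
for days in (30, 180, 600):
    rs = resupply_mass(crew, days, water_kg_per_day)
    rc = recycling_mass(crew, days, water_kg_per_day, hardware)
    print(days, "resupply" if rs < rc else "recycle")
```

Under these assumptions the crossover falls between a 30-day and a 180-day mission, consistent with the rule that longer missions favor recycling.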

  7. Fuzzy model-based fault detection and diagnosis for a pilot heat exchanger

    NASA Astrophysics Data System (ADS)

    Habbi, Hacene; Kidouche, Madjid; Kinnaert, Michel; Zelmat, Mimoun

    2011-04-01

    This article addresses the design and real-time implementation of a fuzzy model-based fault detection and diagnosis (FDD) system for a pilot co-current heat exchanger. The design method is based on a three-step procedure which involves the identification of data-driven fuzzy rule-based models, the design of a fuzzy residual generator and the evaluation of the residuals for fault diagnosis using statistical tests. The fuzzy FDD mechanism has been implemented and validated on the real co-current heat exchanger, and has been proven to be efficient in detecting and isolating process, sensor and actuator faults.
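The residual-evaluation step of such an FDD scheme can be sketched simply: a model predicts the output, the residual is the measurement minus that prediction, and a statistical test flags a fault when the residual mean drifts. The test, window size, threshold, and data below are invented for illustration, not the paper's method:

```python
# Sketch of residual evaluation by a simple mean-shift test: flag a fault
# when the recent residual mean departs from the earlier "healthy" mean
# by more than z_threshold standard errors. All numbers are illustrative.
from statistics import mean, stdev

def fault_detected(residuals, window=20, z_threshold=3.0):
    healthy, recent = residuals[:-window], residuals[-window:]
    se = stdev(healthy) / len(recent) ** 0.5
    return abs(mean(recent) - mean(healthy)) > z_threshold * se

# Synthetic small-amplitude "healthy" residual noise:
healthy = [0.01 * ((i * 7) % 5 - 2) for i in range(100)]
print(fault_detected(healthy + [0.5] * 20))    # offset fault appended -> True
print(fault_detected(healthy + healthy[:20]))  # no fault -> False
```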

  8. Targeted Business Incentives and Local Labor Markets

    ERIC Educational Resources Information Center

    Freedman, Matthew

    2013-01-01

    This paper uses a regression discontinuity design to examine the effects of geographically targeted business incentives on local labor markets. Unlike elsewhere in the United States, enterprise zone (EZ) designations in Texas are determined in part by a cutoff rule based on census block group poverty rates. Exploiting this discontinuity as a…

  9. Quantitative knowledge acquisition for expert systems

    NASA Technical Reports Server (NTRS)

    Belkin, Brenda L.; Stengel, Robert F.

    1991-01-01

A common problem in the design of expert systems is the definition of rules from data obtained in system operation or simulation. While it is relatively easy to collect data and to log the comments of human operators engaged in experiments, generalizing such information to a set of rules has not previously been a direct task. A statistical method is presented for generating rule bases from numerical data, motivated by an example based on aircraft navigation with multiple sensors. The specific objective is to design an expert system that selects a satisfactory suite of measurements from a dissimilar, redundant set, given an arbitrary navigation geometry and possible sensor failures. The systematic development of a Navigation Sensor Management (NSM) Expert System from Kalman Filter covariance data is described. The method invokes two statistical techniques: Analysis of Variance (ANOVA) and the ID3 algorithm. The ANOVA technique indicates whether variations of problem parameters give statistically different covariance results, and the ID3 algorithm identifies the relationships between the problem parameters using probabilistic knowledge extracted from a simulation example set. Both are detailed.
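The core ID3 step, choosing the problem parameter whose values best separate the outcomes by information gain, can be shown on a toy dataset. The attribute names and labels here are invented for illustration and are not taken from the NSM system:

```python
# Minimal ID3-style attribute selection by information gain.
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Entropy reduction from splitting the examples on one attribute."""
    total = entropy(labels)
    n = len(rows)
    for value in set(r[attr] for r in rows):
        subset = [l for r, l in zip(rows, labels) if r[attr] == value]
        total -= len(subset) / n * entropy(subset)
    return total

# Toy example: which parameter best predicts the label?
rows = [
    {"sensors": 3, "failure": "no"},
    {"sensors": 3, "failure": "yes"},
    {"sensors": 4, "failure": "no"},
    {"sensors": 4, "failure": "no"},
]
labels = ["bad", "bad", "good", "good"]
best = max(rows[0], key=lambda a: information_gain(rows, labels, a))
print(best)  # -> sensors (it separates the labels perfectly here)
```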

  10. The load shedding advisor: An example of a crisis-response expert system

    NASA Technical Reports Server (NTRS)

    Bollinger, Terry B.; Lightner, Eric; Laverty, John; Ambrose, Edward

    1987-01-01

A Prolog-based prototype expert system is described that was implemented by the Network Operations Branch of the NASA Goddard Space Flight Center. The purpose of the prototype was to test whether a small, inexpensive computer system could be used to host a load shedding advisor, a system which would monitor major physical environment parameters in a computer facility, then recommend appropriate operator responses whenever a serious condition was detected. The resulting prototype's performance was due significantly to efficiency gains achieved by replacing a purely rule-based design methodology with a hybrid approach that combined procedural, entity-relationship, and rule-based methods.

  11. Automated software development workstation

    NASA Technical Reports Server (NTRS)

    1986-01-01

Engineering software development was automated using an expert system (rule-based) approach. The use of this technology offers benefits not available from current software development and maintenance methodologies. A workstation was built with a library, or program database, with methods for browsing the stored designs; a system for graphical specification of designs, including a capability for hierarchical refinement and definition in a graphical design system; and an automated code generation capability in FORTRAN. The workstation was then used in a demonstration with examples from an attitude control subsystem design for the space station. Documentation and recommendations are presented.

  12. Empirical OPC rule inference for rapid RET application

    NASA Astrophysics Data System (ADS)

    Kulkarni, Anand P.

    2006-10-01

A given technological node (45 nm, 65 nm) can be expected to process thousands of individual designs. Iterative methods applied at the node consume valuable days in determining proper placement of OPC features, and in manufacturing and testing mask correspondence to wafer patterns in a trial-and-error fashion for each design. Repeating this fabrication process for each individual design is a time-consuming and expensive process. We present a novel technique which sidesteps the requirement to iterate through the model-based OPC analysis and pattern verification cycle on subsequent designs at the same node. Our approach relies on the inference of rules from a correct pattern at the wafer surface as it relates to the OPC and pre-OPC pattern layout files. We begin with an offline phase where we obtain a "gold standard" design file that has been fab-tested at the node with a prepared, post-OPC layout file that corresponds to the intended on-wafer pattern. We then run an offline analysis to infer rules to be used in this method. During the analysis, our method implicitly identifies contextual OPC strategies for optimal placement of RET features on any design at that node. Using these strategies, we can apply OPC to subsequent designs at the same node with accuracy comparable to that of the original design file but with significantly smaller expected runtimes. The technique promises to offer a rapid and accurate complement to existing RET application strategies.

  13. Criteria for evidence-based practice in Iranian traditional medicine.

    PubMed

    Soltani Arabshahi, SeyyedKamran; Mohammadi Kenari, Hoorieh; Kordafshari, Gholamreza; Shams-Ardakani, MohammadReza; Bigdeli, Shoaleh

    2015-07-01

The major difference between Iranian traditional medicine and allopathic medicine is in the application of evidence and documents. In this study, criteria for evidence-based practice in Iranian traditional medicine and its rules of practice were studied. The experts' views were investigated through in-depth, semi-structured interviews and the results were categorized into four main categories: designing clinical questions/clinical question-based search, critical appraisal, resource search criteria, and clinical prescription appraisal. Although the application of evidence in Iranian traditional medicine follows Evidence-Based Medicine (EBM) principles, it benefits from its own rules, regulations, and criteria that are compatible with EBM.

  14. Solar Imaging UV/EUV Spectrometers Using TVLS Gratings

    NASA Technical Reports Server (NTRS)

    Thomas, Roger J.

    2003-01-01

It is a particular challenge to develop a stigmatic spectrograph for UV/EUV wavelengths, since the very low normal-incidence reflectance of standard materials most often requires that the design be restricted to a single optical element which must simultaneously provide both reimaging and spectral dispersion. This problem has been solved in the past by the use of toroidal gratings with uniform line-spaced rulings (TULS). A number of solar extreme ultraviolet (EUV) spectrometers have been based on such designs, including SOHO/CDS, Solar-B/EIS, and the sounding rockets Solar Extreme ultraviolet Research Telescope and Spectrograph (SERTS) and Extreme Ultraviolet Normal Incidence Spectrograph (EUNIS). More recently, Kita, Harada, and collaborators have developed the theory of spherical gratings with varied line-space rulings (SVLS) operated at unity magnification, which have been flown on several astronomical satellite missions. We now combine these ideas into a spectrometer concept that puts varied line-space rulings onto toroidal gratings. Such TVLS designs are found to provide excellent imaging even at very large spectrograph magnifications and beam-speeds, permitting extremely high-quality performance in remarkably compact instrument packages. Optical characteristics of three new solar spectrometers based on this concept are described: SUMI and RAISE, two sounding rocket payloads, and NEXUS, currently being proposed as a Small-Explorer (SMEX) mission.

  15. An Architecture for Performance Optimization in a Collaborative Knowledge-Based Approach for Wireless Sensor Networks

    PubMed Central

    Gadeo-Martos, Manuel Angel; Fernandez-Prieto, Jose Angel; Canada-Bago, Joaquin; Velasco, Juan Ramon

    2011-01-01

Over the past few years, Intelligent Spaces (ISs) have received the attention of many Wireless Sensor Network researchers. Recently, several studies have been devoted to identifying their common capacities and to setting up ISs over these networks. However, little attention has been paid to integrating Fuzzy Rule-Based Systems into collaborative Wireless Sensor Networks for the purpose of implementing ISs. This work presents a distributed architecture proposal for collaborative Fuzzy Rule-Based Systems embedded in Wireless Sensor Networks, which has been designed to optimize the implementation of ISs. This architecture includes the following: (a) an optimized design for the inference engine; (b) a visual interface; (c) a module to reduce the redundancy and complexity of the knowledge bases; (d) a module to evaluate the accuracy of the new knowledge base; (e) a module to adapt the format of the rules to the structure used by the inference engine; and (f) a communications protocol. As a real-world application of this architecture and the proposed methodologies, we show an application to the problem of modeling two plagues of the olive tree: prays (olive moth, Prays oleae Bern.) and repilo (caused by the fungus Spilocaea oleagina). The results show that the architecture presented in this paper significantly decreases the consumption of resources (memory, CPU and battery) without a substantial decrease in the accuracy of the inferred values. PMID:22163687

  16. An architecture for performance optimization in a collaborative knowledge-based approach for wireless sensor networks.

    PubMed

    Gadeo-Martos, Manuel Angel; Fernandez-Prieto, Jose Angel; Canada-Bago, Joaquin; Velasco, Juan Ramon

    2011-01-01

Over the past few years, Intelligent Spaces (ISs) have received the attention of many Wireless Sensor Network researchers. Recently, several studies have been devoted to identifying their common capacities and to setting up ISs over these networks. However, little attention has been paid to integrating Fuzzy Rule-Based Systems into collaborative Wireless Sensor Networks for the purpose of implementing ISs. This work presents a distributed architecture proposal for collaborative Fuzzy Rule-Based Systems embedded in Wireless Sensor Networks, which has been designed to optimize the implementation of ISs. This architecture includes the following: (a) an optimized design for the inference engine; (b) a visual interface; (c) a module to reduce the redundancy and complexity of the knowledge bases; (d) a module to evaluate the accuracy of the new knowledge base; (e) a module to adapt the format of the rules to the structure used by the inference engine; and (f) a communications protocol. As a real-world application of this architecture and the proposed methodologies, we show an application to the problem of modeling two plagues of the olive tree: prays (olive moth, Prays oleae Bern.) and repilo (caused by the fungus Spilocaea oleagina). The results show that the architecture presented in this paper significantly decreases the consumption of resources (memory, CPU and battery) without a substantial decrease in the accuracy of the inferred values.

  17. Portable design rules for bulk CMOS

    NASA Technical Reports Server (NTRS)

    Griswold, T. W.

    1982-01-01

    It is pointed out that for the past several years, one school of IC designers has used a simplified set of nMOS geometric design rules (GDR) which is 'portable', in that it can be used by many different nMOS manufacturers. The present investigation is concerned with a preliminary set of design rules for bulk CMOS which has been verified for simple test structures. The GDR are defined in terms of Caltech Intermediate Form (CIF), which is a geometry-description language that defines simple geometrical objects in layers. The layers are abstractions of physical mask layers. The design rules do not presume the existence of any particular design methodology. Attention is given to p-well and n-well CMOS processes, bulk CMOS and CMOS-SOS, CMOS geometric rules, and a description of the advantages of CMOS technology.
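A geometric design rule of the width/spacing kind described here can be checked mechanically. The sketch below is a toy DRC for axis-aligned rectangles on a single layer, with coordinates in lambda units; the rule values are illustrative, not RCA's or the paper's actual CMOS numbers:

```python
# Toy lambda-based geometric design-rule check: minimum feature width
# and minimum spacing between rectangles on one layer.

def width_ok(rect, min_width):
    x0, y0, x1, y1 = rect
    return min(x1 - x0, y1 - y0) >= min_width

def spacing(r1, r2):
    """Separation between two rectangles (0 if they touch or overlap)."""
    ax0, ay0, ax1, ay1 = r1
    bx0, by0, bx1, by1 = r2
    dx = max(bx0 - ax1, ax0 - bx1, 0)
    dy = max(by0 - ay1, ay0 - by1, 0)
    return (dx * dx + dy * dy) ** 0.5

rects = [(0, 0, 4, 2), (7, 0, 10, 2)]        # coordinates in lambda
violations = [r for r in rects if not width_ok(r, 2)]
violations += [(a, b) for i, a in enumerate(rects)
               for b in rects[i + 1:] if 0 < spacing(a, b) < 3]
print(violations)  # -> [] (both rules satisfied here)
```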

  18. Application of a swarm-based approach for phase unwrapping

    NASA Astrophysics Data System (ADS)

    da S. Maciel, Lucas; Albertazzi G., Armando, Jr.

    2014-07-01

An algorithm for phase unwrapping based on swarm intelligence is proposed. The novel approach is based on the emergent behavior of swarms. This behavior is the result of the interactions between independent agents following a simple set of rules and is regarded as fast, flexible and robust. The rules here were designed with two purposes. Firstly, the collective behavior must result in a reliable map of the unwrapped phase. The unwrapping reliability was evaluated by each agent during run-time, based on the quality of the neighboring pixels. In addition, the rule set must result in a behavior that focuses on wrapped regions. Stigmergy and communication rules were implemented in order to enable each agent to seek less worked areas of the image. The agents were modeled as Finite-State Machines. Based on the availability of unwrappable pixels, each agent assumed a different state in order to better adapt itself to the surroundings. The implemented rule set was able to fulfill the requirements on reliability and focused unwrapping. The unwrapped phase maps were comparable to those from established methods, as the agents were able to reliably evaluate each pixel's quality. Also, the unwrapping behavior, observed in real time, was able to focus on workable areas as the agents communicated in order to find less traveled regions. The results were very positive for such a new approach to the phase unwrapping problem. Finally, the authors see great potential for future developments concerning the flexibility, robustness and processing times of the swarm-based algorithm.
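The swarm machinery itself (agents, stigmergy, finite-state behavior) is beyond a short sketch, but the elementary rule each unwrapping agent effectively applies between neighboring pixels can be shown in 1D: add the multiple of 2π that keeps each phase step inside (-π, π]:

```python
# Elementary 1D phase-unwrap rule (not the swarm algorithm itself):
# choose the 2*pi multiple that minimizes each step between neighbors.
import math

def unwrap_1d(wrapped):
    out = [wrapped[0]]
    for p in wrapped[1:]:
        step = p - out[-1]
        step -= 2 * math.pi * round(step / (2 * math.pi))  # nearest multiple
        out.append(out[-1] + step)
    return out

true_phase = [0.4 * i for i in range(20)]                       # rising ramp
wrapped = [math.remainder(p, 2 * math.pi) for p in true_phase]  # into [-pi, pi]
recovered = unwrap_1d(wrapped)
print(max(abs(a - b) for a, b in zip(recovered, true_phase)))   # ~0
```

The rule recovers the ramp exactly because the true steps (0.4 rad) are smaller than π; it is precisely where this local assumption fails that quality guidance and collective behavior become necessary.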

  19. Rule Systems for Runtime Verification: A Short Tutorial

    NASA Astrophysics Data System (ADS)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

In this tutorial, we introduce two rule-based systems for on- and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification while still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple, very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA's 2011 Mars mission, MSL (Mars Science Laboratory). The system has been applied by test engineers to analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
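The essence of a conditional rule system for trace analysis can be illustrated with a toy monitor: a rule fires on a triggering event and creates an obligation that a matching event must appear later in the trace. This is an invented illustration, not RuleR's or LogScope's actual rule language or algorithm:

```python
# Toy conditional-rule trace monitor: unmet obligations at end of trace
# are reported as violations. Event names are invented examples.

def monitor(trace, rules):
    """rules: list of (trigger, expected) event-name pairs."""
    pending = []  # obligations created by fired rules
    for event in trace:
        pending = [ob for ob in pending if ob != event]  # discharge matches
        for trigger, expected in rules:
            if event == trigger:
                pending.append(expected)
    return pending  # whatever remains was never satisfied

rules = [("command_sent", "command_ack")]
ok = monitor(["command_sent", "telemetry", "command_ack"], rules)
bad = monitor(["command_sent", "telemetry"], rules)
print(ok, bad)  # -> [] ['command_ack']
```

Real systems add data parameters, timing, and rule combinators on top of this skeleton, but the fire-and-obligate loop is the core idea.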

  20. Context-Awareness Based Personalized Recommendation of Anti-Hypertension Drugs.

    PubMed

    Chen, Dexin; Jin, Dawei; Goh, Tiong-Thye; Li, Na; Wei, Leiru

    2016-09-01

The World Health Organization estimates that almost one-third of the world's adult population is suffering from hypertension, which has gradually become a "silent killer". Due to the varieties of anti-hypertensive drugs, patients are interested in how these drugs can be selected to match their respective conditions. This study provides a personalized recommendation service system of anti-hypertensive drugs based on context-awareness and designs a context ontology framework of the service. In addition, this paper introduces a Semantic Web Rule Language (SWRL)-based rule to provide high-level context reasoning and information recommendation and to overcome the limitation of ontology reasoning. To make the information recommendation of the drugs more personalized, this study also devises three categories of information recommendation rules that match different priority levels and uses a ranking algorithm to optimize the recommendation. The experiment conducted shows that combining the anti-hypertensive drugs personalized recommendation service context ontology (HyRCO) with the optimized rule reasoning can achieve a higher-quality personalized drug recommendation service. Accordingly, this exploratory study of the personalized recommendation service for hypertensive drugs and its method can be easily adopted for other diseases.
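The idea of rule categories with priority levels feeding a ranking step can be sketched as follows. The rules, drug names, and weights are invented for illustration; they are not the paper's ontology, SWRL rules, or clinical guidance:

```python
# Invented sketch: each rule that matches a patient context scores a
# candidate drug, weighted by the rule's priority tier; candidates are
# then ranked by total score. Purely illustrative, not medical advice.

PRIORITY_WEIGHT = {1: 3.0, 2: 2.0, 3: 1.0}  # tier -> weight

rules = [
    # (priority tier, predicate on patient context, recommended drug)
    (1, lambda p: p["diabetic"], "drug A"),
    (2, lambda p: p["age"] >= 65, "drug B"),
    (3, lambda p: True, "drug C"),
]

def recommend(patient):
    scores = {}
    for tier, matches, drug in rules:
        if matches(patient):
            scores[drug] = scores.get(drug, 0.0) + PRIORITY_WEIGHT[tier]
    return sorted(scores, key=scores.get, reverse=True)

print(recommend({"diabetic": True, "age": 70}))   # -> ['drug A', 'drug B', 'drug C']
print(recommend({"diabetic": False, "age": 30}))  # -> ['drug C']
```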

  1. Simulation of data safety components for corporative systems

    NASA Astrophysics Data System (ADS)

    Yaremko, Svetlana A.; Kuzmina, Elena M.; Savchuk, Tamara O.; Krivonosov, Valeriy E.; Smolarz, Andrzej; Arman, Abenov; Smailova, Saule; Kalizhanova, Aliya

    2017-08-01

The article presents research on the design of data-safety components for corporate systems by means of mathematical simulation and modern information technologies. Threat ranks were simulated based on defined values of the data components. Security-policy rules for corporate information systems are presented, and ways of implementing these rules are proposed on the basis of the stated conditions and the appropriate class of valuable-data protection.

  2. Spatial Rule-Based Modeling: A Method and Its Application to the Human Mitotic Kinetochore

    PubMed Central

    Ibrahim, Bashar; Henze, Richard; Gruenert, Gerd; Egbert, Matthew; Huwald, Jan; Dittrich, Peter

    2013-01-01

    A common problem in the analysis of biological systems is the combinatorial explosion that emerges from the complexity of multi-protein assemblies. Conventional formalisms, like differential equations, Boolean networks and Bayesian networks, are unsuitable for dealing with the combinatorial explosion, because they are designed for a restricted state space with fixed dimensionality. To overcome this problem, the rule-based modeling language, BioNetGen, and the spatial extension, SRSim, have been developed. Here, we describe how to apply rule-based modeling to integrate experimental data from different sources into a single spatial simulation model and how to analyze the output of that model. The starting point for this approach can be a combination of molecular interaction data, reaction network data, proximities, binding and diffusion kinetics and molecular geometries at different levels of detail. We describe the technique and then use it to construct a model of the human mitotic inner and outer kinetochore, including the spindle assembly checkpoint signaling pathway. This allows us to demonstrate the utility of the procedure, show how a novel perspective for understanding such complex systems becomes accessible and elaborate on challenges that arise in the formulation, simulation and analysis of spatial rule-based models. PMID:24709796

  3. Restoring Consistency In Subjective Information For Groundwater Driven Health Risk Assessment

    NASA Astrophysics Data System (ADS)

    Ozbek, M. M.; Pinder, G. F.

    2004-12-01

In an earlier work (Ozbek and Pinder, 2003), we constructed a fuzzy rule-based knowledge base that uses subjective expert opinion to calculate risk-based design constraints (i.e., dose and pattern of exposure) to sustain the groundwater-driven individual health risk at a desired level. Ideally, our system must be capable of producing, for any individual, a meaningful risk result, or for any given risk a meaningful design constraint, in the sense that the result is neither the empty set nor the whole domain of the variable of interest. Otherwise, we consider the system inconsistent. We present a method based on fuzzy similarity relations to restore consistency in our implicative fuzzy rule-based system used for the risk-based groundwater remediation design problem. Both a global and a local approach are considered. Although straightforward and computationally less demanding, the global approach can affect pieces of knowledge negatively by inducing unwarranted imprecision into the knowledge base. The local approach, on the other hand, given a family of parameterized similarity relations, determines a parameter for each inference such that consistent results are computed, which may not be feasible in real-time applications of our knowledge base. Several scenarios are considered for comparing the two approaches; they suggest that, depending on the application, approaches ranging from the completely global to the completely local may each be the most suitable for calculating the design constraints.

  4. The Evolvement of Automobile Steering System Based on TRIZ

    NASA Astrophysics Data System (ADS)

    Zhao, Xinjun; Zhang, Shuang

Products and techniques pass through birth, growth, maturity, and death, leaving the stage much like organisms in a biological evolutionary process. The development of products and techniques conforms to certain evolution rules. If designers know and master these rules, they can design new kinds of products and forecast product development trends; enterprises can thereby grasp the future technical direction of their products and pursue product and technique innovation. Below, based on TRIZ theory, the mechanism evolution, function evolution, and appearance evolution of the automobile steering system are analyzed, and some new ideas about the future automobile steering system are put forward.

  5. The Making of SPINdle

    NASA Astrophysics Data System (ADS)

    Lam, Ho-Pun; Governatori, Guido

We present the design and implementation of SPINdle - an open source Java based defeasible logic reasoner capable of performing efficient and scalable reasoning on defeasible logic theories (including theories with over 1 million rules). The implementation covers both the standard and modal extensions to defeasible logics. It can be used as a standalone theory prover and can be embedded into any application as a defeasible logic rule engine. It allows users or agents to issue queries, on a given knowledge base or on a theory generated on the fly by other applications, and automatically produces its conclusions. The theory can also be represented using XML.
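The flavor of defeasible reasoning can be conveyed with a deliberately minimal one-step sketch (rule bodies contain only facts, no chaining): a literal is defeasibly provable if some applicable rule supports it and every applicable rule for its negation is beaten under the superiority relation. This illustrates the idea behind a defeasible logic reasoner, not SPINdle's actual algorithm or API:

```python
# One-step defeasible inference sketch with a superiority relation.

def neg(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def defeasibly_provable(goal, facts, rules, superior):
    """rules: {id: (body_set, head)}; superior: set of (winner, loser) ids."""
    applicable = lambda rid: rules[rid][0] <= facts
    pro = [r for r, (b, h) in rules.items() if h == goal and applicable(r)]
    con = [r for r, (b, h) in rules.items() if h == neg(goal) and applicable(r)]
    # Provable iff supported and every counter-rule is beaten by some supporter.
    return bool(pro) and all(any((p, c) in superior for p in pro) for c in con)

rules = {
    "r1": ({"bird"}, "flies"),
    "r2": ({"penguin"}, "~flies"),
}
superior = {("r2", "r1")}  # the penguin rule beats the bird rule

print(defeasibly_provable("flies", {"bird"}, rules, superior))             # -> True
print(defeasibly_provable("flies", {"bird", "penguin"}, rules, superior))  # -> False
```

A full reasoner must additionally chain rules, handle strict rules and defeaters, and do so efficiently at the scale the abstract mentions.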

  6. PID tuning rules for SOPDT systems: review and some new results.

    PubMed

    Panda, Rames C; Yu, Cheng-Ching; Huang, Hsiao-Ping

    2004-04-01

PID controllers are widely used in industry, and so many tuning rules have been proposed over the past 50 years that users are often lost in the jungle of tuning formulas. Moreover, unlike PI control, different control laws and structures of implementation further complicate the use of the PID controller. In this work, five different tuning rules are taken for study to control second-order plus dead time systems with wide ranges of damping coefficients and dead-time-to-time-constant ratios (D/tau). Four of them are based on IMC design with different types of approximations of the dead time, and the other on desired closed-loop specifications (i.e., a specified forward transfer function). The method of handling dead time in the IMC type of design is important, especially for systems with large D/tau ratios. A systematic approach was followed to evaluate the performance of the controllers. The regions of applicability of suitable tuning rules are highlighted and recommendations are also given. It turns out that the IMC-based PID designed with a Maclaurin series expansion of the dead time is a better choice for both set-point and load changes for systems with D/tau greater than 1. For systems with D/tau less than 1, the desired closed-loop specification approach is favored.
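Whatever rule set produces them, the tuned parameters Kc, tau_I, and tau_D enter the same control law. The sketch below shows a generic discrete positional PID closing the loop around a simple first-order plant integrated by Euler steps; the gains and plant are illustrative and are not taken from the tuning rules the paper compares:

```python
# Generic discrete (positional) PID law driving a first-order plant.
# Gains are illustrative placeholders, not any paper's tuning rule.

def pid_controller(kc, tau_i, tau_d, dt):
    integral, prev_err = 0.0, 0.0
    def control(err):
        nonlocal integral, prev_err
        integral += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        return kc * (err + integral / tau_i + tau_d * deriv)
    return control

dt, y, setpoint = 0.01, 0.0, 1.0
ctrl = pid_controller(kc=2.0, tau_i=1.0, tau_d=0.05, dt=dt)
for _ in range(3000):            # 30 s of simulated time
    u = ctrl(setpoint - y)
    y += dt * (-y + u)           # first-order plant: tau = 1, gain = 1
print(round(y, 3))               # -> 1.0 (integral action removes offset)
```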

  7. 77 FR 74351 - Fees for Reviews of the Rule Enforcement Programs of Designated Contract Markets and Registered...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-14

    ... Budget Program Activity Codes (BPAC) system, formerly the Management Accounting Structure Codes (MASC... charges fees to designated contract markets and registered futures associations to recover the costs... notice is based upon an average of actual program costs incurred during FY 2009, 2010, and 2011. DATES...

  8. A films based approach to intensity imbalance correction for 65nm node c:PSM

    NASA Astrophysics Data System (ADS)

    Cottle, Rand; Sixt, Pierre; Lassiter, Matt; Cangemi, Marc; Martin, Patrick; Progler, Chris

    2005-11-01

Intensity imbalance between the 0 and π phase features of c:PSM causes gate CD control and edge placement problems. Strategies such as undercut, selective biasing, and combinations of undercut and bias are currently used in production to mitigate these problems. However, there are drawbacks to these strategies such as space CD delta through pitch, gate CD control through defocus, design rule restrictions, and reticle manufacturability. This paper investigates the application of an innovative films-based approach to intensity balancing known as the Transparent Etch Stop Layer (TESL). TESL, in addition to providing a host of reticle quality and manufacturability benefits, can also be tuned to significantly reduce imbalance. Rigorous 3D vector simulations and experimental data compare through-pitch and defocus performance of TESL and conventional c:PSM for 65nm design rules.

  9. Intelligent control for modeling of real-time reservoir operation, part II: artificial neural network with operating rule curves

    NASA Astrophysics Data System (ADS)

    Chang, Ya-Ting; Chang, Li-Chiu; Chang, Fi-John

    2005-04-01

    To bridge the gap between academic research and actual operation, we propose an intelligent control system for reservoir operation. The methodology includes two major processes: knowledge acquisition and implementation, and the inference system. In this study, a genetic algorithm (GA) and a fuzzy rule base (FRB) are used to extract knowledge, from the historical inflow data with a design objective function and from the operating rule curves, respectively. The adaptive network-based fuzzy inference system (ANFIS) is then used to implement the knowledge, to create the fuzzy inference system, and to estimate the optimal reservoir operation. To investigate its applicability and practicability, the Shihmen reservoir, Taiwan, is used as a case study. For comparison, a simulation of the currently used M-5 operating rule curve is also performed. The results demonstrate that (1) the GA is an efficient way to search for optimal input-output patterns, (2) the FRB can extract the knowledge embedded in the operating rule curves, and (3) the ANFIS models built on different types of knowledge can produce much better performance than the traditional M-5 curves in real-time reservoir operation. Moreover, we show that the model can be made more intelligent for reservoir operation if more information (or knowledge) is involved.
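    The operating-rule-curve knowledge that the FRB extracts can be illustrated with a minimal hard-coded zoning policy. The function name, zone thresholds, and release factors below are hypothetical placeholders, not the Shihmen reservoir's actual M-5 curves:

    ```python
    def rule_curve_release(storage, upper_curve, lower_curve, demand):
        """Toy rule-curve operation: the release depends on which storage zone
        the reservoir currently sits in (factors are illustrative)."""
        if storage >= upper_curve:
            return demand * 1.2   # above upper curve: release extra to regain flood space
        if storage <= lower_curve:
            return demand * 0.5   # below lower curve: ration the supply
        return demand             # within the normal zone: meet demand
    ```

    A fuzzy rule base softens the hard thresholds above into graded membership functions, which is what allows ANFIS to learn and interpolate the policy.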

  10. 77 FR 68873 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Designation of Longer...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-16

    ...-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Designation of Longer Period for Commission Action on Proceedings to Determine Whether to Approve or Disapprove Proposed Rule Change To Establish... proposed rule change to establish various ``Benchmark Orders'' under NASDAQ Rule 4751(f). The proposed rule...

  11. 78 FR 6154 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Designation of Longer...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-29

    ...-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Designation of Longer Period for Commission Action on Proceedings To Determine Whether To Approve or Disapprove Proposed Rule Change To Amend Rule...,\\2\\ a proposed rule change to amend Exchange Rule 4626--Limitation of Liability (``accommodation...

  12. What Communication Theories Can Teach the Designer of Computer-Based Training.

    ERIC Educational Resources Information Center

    Larsen, Ronald E.

    1985-01-01

    Reviews characteristics of computer-based training (CBT) that make application of communication theories appropriate and presents principles from communication theory (e.g., general systems theory, symbolic interactionism, rule theories, and interpersonal communication theories) to illustrate how CBT developers can profitably apply them to…

  13. Automation for pattern library creation and in-design optimization

    NASA Astrophysics Data System (ADS)

    Deng, Rock; Zou, Elain; Hong, Sid; Wang, Jinyan; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh; Ding, Hua; Huang, Jason

    2015-03-01

    Semiconductor manufacturing technologies are becoming increasingly complex with every passing node. Newer technology nodes are pushing the limits of optical lithography and requiring multiple exposures with exotic material stacks for each critical layer. All of this added complexity usually amounts to further restrictions on what can be designed. Furthermore, the designs must be checked against all these restrictions in the verification and sign-off stages. Design rules are intended to capture all the manufacturing limitations such that yield can be maximized for any design adhering to all the rules. Most manufacturing steps employ some sort of model-based simulation which characterizes the behavior of each step. The lithography models play a major role in the overall yield and design restrictions in patterning. However, lithography models are not practical to run during design creation due to their prohibitively slow run times. Furthermore, the models are not usually given to foundry customers because of the confidential and sensitive nature of every foundry's processes. The design layout locations where a model flags unacceptable simulated results can be used to define pattern rules which can be shared with customers. With advanced technology nodes we see a large growth of pattern-based rules. This is because pattern matching is very fast and the rules themselves can be too complex to describe in a standard DRC language. Therefore, the patterns are left as either pattern layout clips or abstracted into a pattern-like syntax which a pattern matcher can use directly. The patterns themselves can be multi-layered with "fuzzy" designations such that groups of similar patterns can be found using one description. The pattern matcher is often integrated with a DRC tool such that verification and signoff can be done in one step. The patterns can be layout constructs that are "forbidden", "waived", or simply low-yielding in nature. 
The patterns can also contain remedies built in so that fixing happens either automatically or in a guided manner. Building a comprehensive library of patterns is a very difficult task especially when a new technology node is being developed or the process keeps changing. The main dilemma is not having enough representative layouts to use for model simulation where pattern locations can be marked and extracted. This paper will present an automatic pattern library creation flow by using a few known yield detractor patterns to systematically expand the pattern library and generate optimized patterns. We will also look at the specific fixing hints in terms of edge movements, additive, or subtractive changes needed during optimization. Optimization will be shown for both the digital physical implementation and custom design methods.
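    The pattern matching described above can be sketched minimally, assuming single-layer layouts stored as character grids and a hypothetical "?" wildcard standing in for the fuzzy positions:

    ```python
    def find_pattern(layout, pattern, wildcard="?"):
        """Scan a grid layout (list of equal-length strings) for every placement
        of a pattern clip; wildcard cells match any layout cell."""
        ph, pw = len(pattern), len(pattern[0])
        hits = []
        for r in range(len(layout) - ph + 1):
            for c in range(len(layout[0]) - pw + 1):
                if all(pattern[i][j] == wildcard or pattern[i][j] == layout[r + i][c + j]
                       for i in range(ph) for j in range(pw)):
                    hits.append((r, c))
        return hits
    ```

    A production matcher operates on polygon geometry across multiple layers, but the principle is the same: each hit location can then be flagged as forbidden, waived, or tied to a fixing hint.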

  14. An expert system shell for inferring vegetation characteristics: Interface for the addition of techniques (Task H)

    NASA Technical Reports Server (NTRS)

    Harrison, P. Ann

    1993-01-01

    All the NASA VEGetation Workbench (VEG) goals except the Learning System provide the scientist with several different techniques. When VEG is run, rules assist the scientist in selecting the best of the available techniques to apply to the sample of cover type data being studied. The techniques are stored in the VEG knowledge base. A new interface that enables the scientist to add techniques to VEG without assistance from the developer was designed and implemented. This interface does not require the scientist to have a thorough knowledge of Knowledge Engineering Environment (KEE) by Intellicorp or a detailed knowledge of the structure of VEG. The interface prompts the scientist to enter the required information about the new technique: the Common Lisp functions for executing the technique and the left-hand side of the rule that causes the technique to be selected. A template for each function and rule, and detailed instructions about the arguments of the functions, the values they should return, and the format of the rule, are displayed. Checks are made to ensure that the required data were entered, the functions compiled correctly, and the rule parsed correctly before the new technique is stored. The additional techniques are stored separately from the VEG knowledge base and are not normally loaded when the VEG knowledge base is loaded. The interface gives the scientist the option of adding all the previously defined new techniques before running VEG. When the techniques are added, the units required to store them are created automatically in the correct places in the VEG knowledge base, and the methods file containing the functions required by the additional techniques is loaded. New rule units are created to store the new rules. The interface that allows the scientist to select which techniques to use is updated automatically to include the new techniques. Task H was completed. The interface that allows the scientist to add techniques to VEG was implemented and comprehensively tested. The Common Lisp code for the Add Techniques system is listed in Appendix A.

  15. Hidden electronic rule in the “cluster-plus-glue-atom” model

    PubMed Central

    Du, Jinglian; Dong, Chuang; Melnik, Roderick; Kawazoe, Yoshiyuki; Wen, Bin

    2016-01-01

    Electrons and their interactions are intrinsic factors affecting the structure and properties of materials. Based on the “cluster-plus-glue-atom” model, an electron counting rule for complex metallic alloys (CMAs) has been revealed in this work (i.e., the CPGAMEC rule). Our results on the cluster structure and electron concentration of CMAs with apparent cluster features indicate that the number of valence electrons per unit cluster formula for these CMAs is a specific constant: a multiple of eight or of twelve. It is thus termed the specific electrons cluster formula. The CPGAMEC rule has been demonstrated as useful guidance for directing the design of CMAs with desired properties, and its practical applications and underlying mechanism have been illustrated on the basis of CMAs’ cluster structural features. Our investigation provides an aggregate picture of the intriguing electronic rule and atomic structural features of CMAs. PMID:27642002

  16. Design Characteristics Influence Performance of Clinical Prediction Rules in Validation: A Meta-Epidemiological Study.

    PubMed

    Ban, Jong-Wook; Emparanza, José Ignacio; Urreta, Iratxe; Burls, Amanda

    2016-01-01

    Many new clinical prediction rules are derived and validated, but the design and reporting quality of clinical prediction research has been less than optimal. We aimed to assess whether design characteristics of validation studies were associated with overestimation of clinical prediction rules' performance, and to evaluate whether validation studies clearly reported important methodological characteristics. Electronic databases were searched for systematic reviews of clinical prediction rule studies published between 2006 and 2010. Data were extracted from the eligible validation studies included in the systematic reviews. A meta-analytic meta-epidemiological approach was used to assess the influence of design characteristics on predictive performance. For each validation study, it was assessed whether 7 design and 7 reporting characteristics were properly described. A total of 287 validation studies of clinical prediction rules were collected from 15 systematic reviews (31 meta-analyses). Validation studies using a case-control design produced a summary diagnostic odds ratio (DOR) 2.2 times (95% CI: 1.2-4.3) larger than validation studies using cohort or unclear designs. When differential verification was used, the summary DOR was overestimated by twofold (95% CI: 1.2-3.1) compared to complete, partial and unclear verification. The summary relative DOR (RDOR) of validation studies with inadequate sample size was 1.9 (95% CI: 1.2-3.1) compared to studies with adequate sample size. Study site, reliability, and the clinical prediction rule were adequately described in 10.1%, 9.4%, and 7.0% of validation studies, respectively. Validation studies with design shortcomings may overestimate the performance of clinical prediction rules. The quality of reporting among studies validating clinical prediction rules needs to be improved.
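    The DOR comparisons reported above come from standard 2x2 diagnostic tables. A small sketch of the two quantities involved (function names are ours; the counts in the usage below are invented for illustration):

    ```python
    def diagnostic_odds_ratio(tp, fn, fp, tn):
        """DOR = (TP/FN) / (FP/TN): the odds of a positive rule result in the
        diseased group versus the non-diseased group."""
        return (tp / fn) / (fp / tn)

    def relative_dor(dor_biased, dor_reference):
        """An RDOR of about 2.2 means the biased design's summary DOR is
        inflated 2.2-fold relative to the reference design."""
        return dor_biased / dor_reference
    ```

    For example, a rule with 90 true positives, 10 false negatives, 20 false positives and 80 true negatives has a DOR of 36.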

  17. An Interval Type-2 Neural Fuzzy System for Online System Identification and Feature Elimination.

    PubMed

    Lin, Chin-Teng; Pal, Nikhil R; Wu, Shang-Lin; Liu, Yu-Ting; Lin, Yang-Yin

    2015-07-01

    We propose an integrated mechanism for discarding derogatory features and extracting fuzzy rules based on an interval type-2 neural fuzzy system (NFS); in fact, it is a more general scheme that can discard bad features, irrelevant antecedent clauses, and even irrelevant rules. High-dimensional input variables and a large number of rules not only increase the computational complexity of NFSs but also reduce their interpretability. Therefore, a mechanism for simultaneously extracting fuzzy rules and reducing the impact of (or eliminating) inferior features is necessary. The proposed approach, namely the interval type-2 Neural Fuzzy System for online System Identification and Feature Elimination (IT2NFS-SIFE), uses type-2 fuzzy sets to model uncertainties associated with information and data in designing the knowledge base. The consequent part of the IT2NFS-SIFE is of Takagi-Sugeno-Kang type with interval weights. The IT2NFS-SIFE possesses a self-evolving property that can automatically generate fuzzy rules. Poor features can be discarded through the concept of a membership modulator. The antecedent and modulator weights are learned using a gradient descent algorithm. The consequent part weights are tuned via the rule-ordered Kalman filter algorithm to enhance learning effectiveness. Simulation results show that IT2NFS-SIFE not only simplifies the system architecture by eliminating derogatory/irrelevant antecedent clauses, rules, and features but also maintains excellent performance.

  18. Developmental engineering: a new paradigm for the design and manufacturing of cell-based products. Part II: from genes to networks: tissue engineering from the viewpoint of systems biology and network science.

    PubMed

    Lenas, Petros; Moos, Malcolm; Luyten, Frank P

    2009-12-01

    The field of tissue engineering is moving toward a new concept of "in vitro biomimetics of in vivo tissue development." In Part I of this series, we proposed a theoretical framework integrating the concepts of developmental biology with those of process design to provide the rules for the design of biomimetic processes. We named this methodology "developmental engineering" to emphasize that it is not the tissue but the process of in vitro tissue development that has to be engineered. To formulate the process design rules in a rigorous way that will allow a computational design, we should refer to mathematical methods to model the biological process taking place in vitro. Tissue functions cannot be attributed to individual molecules but rather to complex interactions between the numerous components of a cell and interactions between cells in a tissue that form a network. For tissue engineering to advance to the level of a technologically driven discipline amenable to well-established principles of process engineering, a scientifically rigorous formulation is needed of the general design rules so that the behavior of networks of genes, proteins, or cells that govern the unfolding of developmental processes could be related to the design parameters. Now that sufficient experimental data exist to construct plausible mathematical models of many biological control circuits, explicit hypotheses can be evaluated using computational approaches to facilitate process design. Recent progress in systems biology has shown that the empirical concepts of developmental biology that we used in Part I to extract the rules of biomimetic process design can be expressed in rigorous mathematical terms. This allows the accurate characterization of manufacturing processes in tissue engineering as well as the properties of the artificial tissues themselves. 
In addition, network science has recently shown that the behavior of biological networks strongly depends on their topology and has developed the necessary concepts and methods to describe it, allowing therefore a deeper understanding of the behavior of networks during biomimetic processes. These advances thus open the door to a transition for tissue engineering from a substantially empirical endeavor to a technology-based discipline comparable to other branches of engineering.

  19. An Incremental High-Utility Mining Algorithm with Transaction Insertion

    PubMed Central

    Gan, Wensheng; Zhang, Binbin

    2015-01-01

    Association-rule mining is commonly used to discover useful and meaningful patterns from a very large database. It considers only the occurrence frequencies of items to reveal the relationships among itemsets. Traditional association-rule mining is, however, not suitable in real-world applications since the items purchased by a customer carry other factors, such as profit or quantity. High-utility mining was designed to overcome the limitations of association-rule mining by considering both the quantity and profit measures. Most high-utility mining algorithms are designed to handle a static database. Few studies handle dynamic high-utility mining with transaction insertion, which otherwise requires database rescans and suffers from the combinational explosion of the pattern-growth mechanism. In this paper, an efficient incremental algorithm with transaction insertion is designed to reduce computations without candidate generation, based on the utility-list structures. The enumeration tree and the relationships between 2-itemsets are also adopted in the proposed algorithm to speed up the computations. Several experiments are conducted to show the performance of the proposed algorithm in terms of runtime, memory consumption, and number of generated patterns. PMID:25811038
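    The utility measure at the heart of high-utility mining (quantity times unit profit, summed over supporting transactions) can be sketched as follows; the utility-list and enumeration-tree optimizations of the proposed algorithm are omitted, and the names are ours:

    ```python
    def itemset_utility(transactions, profits, itemset):
        """Total utility of an itemset across the database; each transaction
        maps item -> purchased quantity, profits maps item -> unit profit."""
        total = 0
        for t in transactions:
            if all(item in t for item in itemset):
                total += sum(t[item] * profits[item] for item in itemset)
        return total

    def high_utility_itemsets(transactions, profits, candidates, min_util):
        """Naive filter: keep candidate itemsets whose utility meets the threshold."""
        return [s for s in candidates
                if itemset_utility(transactions, profits, s) >= min_util]
    ```

    Unlike support counting, this measure is not anti-monotone, which is why high-utility miners need structures like utility lists rather than a plain Apriori-style prune.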

  20. Correcting groove error in gratings ruled on a 500-mm ruling engine using interferometric control.

    PubMed

    Mi, Xiaotao; Yu, Haili; Yu, Hongzhu; Zhang, Shanwen; Li, Xiaotian; Yao, Xuefeng; Qi, Xiangdong; Bayinhedhig; Wan, Qiuhua

    2017-07-20

    Groove error is one of the most important factors affecting grating quality and spectral performance. To reduce groove error, we propose a new ruling-tool carriage system based on aerostatic guideways. We design a new blank carriage system with double piezoelectric actuators. We also propose a completely closed-loop servo-control system with a new optical measurement system that can control the position of the diamond relative to the blank. To evaluate our proposed methods, we produced several gratings, including an echelle grating with 79 grooves/mm, a grating with 768 grooves/mm, and a high-density grating with 6000 grooves/mm. The results show that our methods effectively reduce groove error in ruled gratings.

  1. Coreference analysis in clinical notes: a multi-pass sieve with alternate anaphora resolution modules.

    PubMed

    Jonnalagadda, Siddhartha Reddy; Li, Dingcheng; Sohn, Sunghwan; Wu, Stephen Tze-Inn; Wagholikar, Kavishwar; Torii, Manabu; Liu, Hongfang

    2012-01-01

    This paper describes the coreference resolution system submitted by Mayo Clinic for the 2011 i2b2/VA/Cincinnati shared task Track 1C. The goal of the task was to construct a system that links the markables corresponding to the same entity. The task organizers provided progress notes and discharge summaries that were annotated with the markables of treatment, problem, test, person, and pronoun. We used a multi-pass sieve algorithm that applies deterministic rules in the order of preciseness and simultaneously gathers information about the entities in the documents. Our system, MedCoref, also uses a state-of-the-art machine learning framework as an alternative to the final, rule-based pronoun resolution sieve. The best system that uses a multi-pass sieve has an overall score of 0.836 (average of B(3), MUC, Blanc, and CEAF F score) for the training set and 0.843 for the test set. A supervised machine learning system that typically uses a single function to find coreferents cannot accommodate irregularities encountered in data especially given the insufficient number of examples. On the other hand, a completely deterministic system could lead to a decrease in recall (sensitivity) when the rules are not exhaustive. The sieve-based framework allows one to combine reliable machine learning components with rules designed by experts. Using relatively simple rules, part-of-speech information, and semantic type properties, an effective coreference resolution system could be designed. The source code of the system described is available at https://sourceforge.net/projects/ohnlp/files/MedCoref.
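    The multi-pass sieve idea (deterministic rules applied most-precise-first, each pass merging mentions into clusters) can be sketched as follows. The sieve functions in the usage example are toy stand-ins, not MedCoref's actual rules:

    ```python
    def multi_pass_sieve(mentions, sieves):
        """Apply deterministic sieves in decreasing order of precision; each
        positive decision merges two coreference clusters (union-find)."""
        parent = {m: m for m in mentions}

        def find(m):
            while parent[m] != m:
                m = parent[m]
            return m

        for sieve in sieves:                       # most precise sieve first
            for i, anaphor in enumerate(mentions):
                for antecedent in mentions[:i]:    # candidate antecedents precede
                    if find(anaphor) != find(antecedent) and sieve(anaphor, antecedent):
                        parent[find(anaphor)] = find(antecedent)
        clusters = {}
        for m in mentions:
            clusters.setdefault(find(m), []).append(m)
        return list(clusters.values())
    ```

    Because each pass sees the clusters built by the earlier, more precise passes, later low-precision sieves (such as pronoun resolution) can exploit information gathered about each entity.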

  2. Interpretable Decision Sets: A Joint Framework for Description and Prediction

    PubMed Central

    Lakkaraju, Himabindu; Bach, Stephen H.; Leskovec, Jure

    2016-01-01

    One of the most important obstacles to deploying predictive models is the fact that humans do not understand and trust them. Knowing which variables are important in a model’s prediction and how they are combined can be very powerful in helping people understand and trust automatic decision making systems. Here we propose interpretable decision sets, a framework for building predictive models that are highly accurate, yet also highly interpretable. Decision sets are sets of independent if-then rules. Because each rule can be applied independently, decision sets are simple, concise, and easily interpretable. We formalize decision set learning through an objective function that simultaneously optimizes accuracy and interpretability of the rules. In particular, our approach learns short, accurate, and non-overlapping rules that cover the whole feature space and pay attention to small but important classes. Moreover, we prove that our objective is a non-monotone submodular function, which we efficiently optimize to find a near-optimal set of rules. Experiments show that interpretable decision sets are as accurate at classification as state-of-the-art machine learning techniques. They are also three times smaller on average than rule-based models learned by other methods. Finally, results of a user study show that people are able to answer multiple-choice questions about the decision boundaries of interpretable decision sets and write descriptions of classes based on them faster and more accurately than with other rule-based models that were designed for interpretability. Overall, our framework provides a new approach to interpretable machine learning that balances accuracy, interpretability, and computational efficiency. PMID:27853627
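    Because decision-set rules are independent if-then statements, applying a learned set is simple. A hedged sketch follows; the majority vote among firing rules with a default class is our illustrative tie-handling, and the example rules are invented:

    ```python
    def predict(decision_set, default_label, x):
        """Apply a decision set: every rule whose condition fires votes for its
        label; the majority label wins, else the default class is returned."""
        votes = {}
        for condition, label in decision_set:
            if condition(x):
                votes[label] = votes.get(label, 0) + 1
        if not votes:
            return default_label
        return max(votes, key=votes.get)

    # Toy decision set over patient records (conditions and labels are invented).
    rules = [
        (lambda x: x["age"] > 50 and x["bp"] == "high", "at-risk"),
        (lambda x: x["bp"] == "normal", "healthy"),
    ]
    ```

    The learning problem the paper solves is choosing which short, non-overlapping rules to put in `rules` so that accuracy and interpretability are jointly optimized.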

  3. Poisson-Based Inference for Perturbation Models in Adaptive Spelling Training

    ERIC Educational Resources Information Center

    Baschera, Gian-Marco; Gross, Markus

    2010-01-01

    We present an inference algorithm for perturbation models based on Poisson regression. The algorithm is designed to handle unclassified input with multiple errors described by independent mal-rules. This knowledge representation provides an intelligent tutoring system with local and global information about a student, such as error classification…

  4. Covering Intensive Community-Based Child Mental Health Services under Medicaid. A Series of Issue Briefs.

    ERIC Educational Resources Information Center

    Koyanagi, Chris; Semansky, Rafael

    This set of seven issue briefs considers six important community-based services for children with serious mental or emotional disorders that some states provide as mandated rehabilitation services under the federal Medicaid law. The materials are designed to help state policymakers develop appropriate rules for covering community-based services…

  5. A Bayesian Hybrid Adaptive Randomisation Design for Clinical Trials with Survival Outcomes.

    PubMed

    Moatti, M; Chevret, S; Zohar, S; Rosenberger, W F

    2016-01-01

    Response-adaptive randomisation designs have been proposed to improve the efficiency of phase III randomised clinical trials and improve the outcomes of the clinical trial population. In the setting of failure-time outcomes, Zhang and Rosenberger (2007) developed a response-adaptive randomisation approach that targets an optimal allocation, based on a fixed sample size. The aim of this research is to propose a response-adaptive randomisation procedure for survival trials with an interim monitoring plan, based on the following optimal criterion: for fixed variance of the estimated log hazard ratio, what allocation minimizes the expected hazard of failure? We demonstrate the utility of the design by redesigning a clinical trial on multiple myeloma. To handle continuous monitoring of data, we propose a Bayesian response-adaptive randomisation procedure, where the log hazard ratio is the effect measure of interest. Combining the prior with the normal likelihood, the mean posterior estimate of the log hazard ratio allows derivation of the optimal target allocation. We perform a simulation study to assess and compare the performance of this proposed Bayesian hybrid adaptive design to those of fixed, sequential or adaptive - either frequentist or fully Bayesian - designs. Non-informative normal priors on the log hazard ratio were used, as well as mixtures of enthusiastic and skeptical priors. Stopping rules based on the posterior distribution of the log hazard ratio were computed. The method is then illustrated by redesigning a phase III randomised clinical trial of chemotherapy in patients with multiple myeloma, with mixtures of normal priors elicited from experts. As expected, there was a reduction in the proportion of observed deaths in the adaptive vs. non-adaptive designs; this reduction was maximized using a Bayes mixture prior, with no clear-cut improvement from using a fully Bayesian procedure. 
The use of stopping rules allows a slight decrease in the observed proportion of deaths under the alternate hypothesis compared with the adaptive designs with no stopping rules. Such Bayesian hybrid adaptive survival trials may be promising alternatives to traditional designs, reducing the duration of survival trials, as well as optimizing the ethical concerns for patients enrolled in the trial.
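    The conjugate normal update at the heart of such a procedure is straightforward; a sketch, where `skewed_allocation` is our illustrative skewing rule rather than the authors' derived optimal target:

    ```python
    import math

    def posterior_log_hr(prior_mean, prior_var, obs_log_hr, obs_var):
        """Normal-normal conjugate update for the log hazard ratio:
        precisions add, and the posterior mean is the precision-weighted mean."""
        post_prec = 1.0 / prior_var + 1.0 / obs_var
        post_mean = (prior_mean / prior_var + obs_log_hr / obs_var) / post_prec
        return post_mean, 1.0 / post_prec

    def skewed_allocation(post_mean_log_hr):
        """Allocate more patients to the arm with the lower estimated hazard
        (illustrative logistic skewing, bounded in (0, 1))."""
        return 1.0 / (1.0 + math.exp(post_mean_log_hr))
    ```

    A negative posterior mean log hazard ratio (treatment better) yields an allocation probability above one half, which is the qualitative behavior a response-adaptive design seeks.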

  6. 77 FR 2254 - Endangered and Threatened Wildlife and Plants; Designation of Critical Habitat for Mississippi...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-17

    ...-0024; 4500030114] RIN 1018-AW89 Endangered and Threatened Wildlife and Plants; Designation of Critical... breeding ponds and upland habitats. However, in that revised proposed rule, we stated we used the mean... provide specific data on breeding pond or upland habitat use. Based on the peer review comments we...

  7. Design of a Golf Swing Injury Detection and Evaluation open service platform with Ontology-oriented clustering case-based reasoning mechanism.

    PubMed

    Ku, Hao-Hsiang

    2015-01-01

    Nowadays, people can easily use a smartphone to get the information and services they want. Hence, this study designs and proposes a Golf Swing Injury Detection and Evaluation open service platform with an Ontology-oriented clustering case-based reasoning mechanism, called GoSIDE, based on Arduino and the Open Service Gateway initiative (OSGi). GoSIDE is a three-tier architecture composed of Mobile Users, Application Servers and a Cloud-based Digital Convergence Server. A mobile user has a smartphone and Kinect sensors to detect the user's golf swing actions and to interact with iDTV. An application server runs the Intelligent Golf Swing Posture Analysis Model (iGoSPAM) to check a user's golf swing actions and to alert the user on erroneous actions. The Cloud-based Digital Convergence Server uses Ontology-oriented Clustering Case-based Reasoning (CBR) for Quality of Experiences (OCC4QoE), which is designed to provide QoE services by QoE-based Ontology strategies, rules and events for this user. Furthermore, GoSIDE will automatically trigger OCC4QoE and deliver popular rules for a new user. Experimental results illustrate that GoSIDE can provide appropriate detections for golfers. Finally, GoSIDE can serve as a reference model for researchers and engineers.

  8. On nonstationarity-related errors in modal combination rules of the response spectrum method

    NASA Astrophysics Data System (ADS)

    Pathak, Shashank; Gupta, Vinay K.

    2017-10-01

    Characterization of seismic hazard via (elastic) design spectra and the estimation of the linear peak response of a given structure from this characterization continue to form the basis of earthquake-resistant design philosophy in codes of practice all over the world. Since the direct use of design spectrum ordinates is the preferred option for practicing engineers, modal combination rules play a central role in peak response estimation. Most of the available modal combination rules are, however, based on the assumption that nonstationarity affects the structural response alike at the modal and overall response levels. This study considers those situations where this assumption may cause significant errors in peak response estimation, and preliminary models are proposed for estimating the extents to which nonstationarity affects the modal and total system responses when the ground acceleration process is assumed to be stationary. It is shown through numerical examples in the context of the complete-quadratic-combination (CQC) method that the nonstationarity-related errors in the estimation of peak base shear may be significant when the strong-motion duration of the excitation is too small compared to the period of the system and/or the response is distributed comparably over several modes. It is also shown that these errors are reduced marginally with the use of the proposed nonstationarity factor models.
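    For reference, the baseline CQC rule combines modal peak responses using the standard equal-damping correlation coefficients attributed to Der Kiureghian; the nonstationarity factors proposed in the paper are not included in this sketch:

    ```python
    import math

    def cqc_peak(modal_peaks, frequencies, zeta=0.05):
        """CQC combination: R = sqrt(sum_i sum_j rho_ij * R_i * R_j), with the
        standard correlation coefficient for modes of equal damping ratio zeta."""
        n = len(modal_peaks)
        total = 0.0
        for i in range(n):
            for j in range(n):
                r = frequencies[j] / frequencies[i]
                num = 8.0 * zeta**2 * (1.0 + r) * r**1.5
                den = (1.0 - r**2)**2 + 4.0 * zeta**2 * r * (1.0 + r)**2
                total += (num / den) * modal_peaks[i] * modal_peaks[j]
        return math.sqrt(total)
    ```

    For well-separated frequencies the cross-correlation terms vanish and CQC reduces to the SRSS rule, while closely spaced modes are combined nearly in phase.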

  9. 75 FR 17816 - Self-Regulatory Organizations; Notice of Filing and Immediate Effectiveness of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-07

    ... Stock Exchange LLC Amending NYSE Rule 1 To Provide for the Designation of Qualified Employees and NYSE... qualified employees to act in place of any person named in a rule as having authority to act under such rule... 1 to provide that the Exchange may formally designate one or more qualified employees to act in...

  10. Examining the Rule of Thumb of Not Using Multilevel Modeling: The "Design Effect Smaller than Two" Rule

    ERIC Educational Resources Information Center

    Lai, Mark H. C.; Kwok, Oi-man

    2015-01-01

    Educational researchers commonly use the rule of thumb of "design effect smaller than 2" as the justification of not accounting for the multilevel or clustered structure in their data. The rule, however, has not yet been systematically studied in previous research. In the present study, we generated data from three different models…
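    The rule of thumb under study is based on Kish's design effect for cluster samples; a minimal sketch, where the threshold of 2 encodes exactly the convention the article questions:

    ```python
    def design_effect(avg_cluster_size, icc):
        """Kish design effect: DEFF = 1 + (m - 1) * ICC, where m is the average
        cluster size and ICC the intraclass correlation coefficient."""
        return 1.0 + (avg_cluster_size - 1.0) * icc

    def multilevel_recommended(avg_cluster_size, icc, threshold=2.0):
        """The 'design effect smaller than 2' rule of thumb: account for
        clustering only when DEFF reaches the threshold."""
        return design_effect(avg_cluster_size, icc) >= threshold
    ```

    For example, 26 students per classroom with an ICC of 0.04 gives DEFF = 2.0, right at the conventional cutoff.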

  11. Moral empiricism and the bias for act-based rules.

    PubMed

    Ayars, Alisabeth; Nichols, Shaun

    2017-10-01

    Previous studies on rule learning show a bias in favor of act-based rules, which prohibit intentionally producing an outcome but not merely allowing the outcome. Nichols, Kumar, Lopez, Ayars, and Chan (2016) found that exposure to a single sample violation in which an agent intentionally causes the outcome was sufficient for participants to infer that the rule was act-based. One explanation is that people have an innate bias to think rules are act-based. We suggest an alternative empiricist account: since most rules that people learn are act-based, people form an overhypothesis (Goodman, 1955) that rules are typically act-based. We report three studies indicating that people can use information about violations to form overhypotheses about rules. In study 1, participants learned either three "consequence-based" rules that prohibited allowing an outcome or three "act-based" rules that prohibited producing the outcome; in a subsequent learning task, we found that participants who had learned three consequence-based rules were more likely to think that the new rule prohibited allowing an outcome. In study 2, we presented participants with either 1 consequence-based rule or 3 consequence-based rules, and we found that those exposed to 3 such rules were more likely to think that a new rule was also consequence-based. Thus, in both studies, it seems that learning 3 consequence-based rules generates an overhypothesis to expect new rules to be consequence-based. In a final study, we used a more subtle manipulation. We exposed participants to examples of act-based or accident-based (strict liability) laws and then had them learn a novel rule. We found that participants who were exposed to the accident-based laws were more likely to think a new rule was accident-based. 
The fact that participants' bias for act-based rules can be shaped by evidence from other rules supports the idea that the bias for act-based rules might be acquired as an overhypothesis from the preponderance of act-based rules. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Automating Rule Strengths in Expert Systems.

    DTIC Science & Technology

    1987-05-01

    systems were designed in an incremental, iterative way. One of the most easily identifiable phases in this process, sometimes called tuning, consists...attenuators. The designer of the knowledge-based system must determine (synthesize) or adjust (refine, if estimates of the values are given) these...values. We consider two ways in which the designer can learn the values. We call the first model of learning the complete case and the second model the

  13. Data Mining for Financial Applications

    NASA Astrophysics Data System (ADS)

    Kovalerchuk, Boris; Vityaev, Evgenii

    This chapter describes Data Mining in finance by discussing financial tasks, specifics of methodologies and techniques in this Data Mining area. It includes time dependence, data selection, forecast horizon, measures of success, quality of patterns, hypothesis evaluation, problem ID, method profile, attribute-based and relational methodologies. The second part of the chapter discusses Data Mining models and practice in finance. It covers use of neural networks in portfolio management, design of interpretable trading rules and discovering money laundering schemes using decision rules and relational Data Mining methodology.

  14. A Program Manager’s Methodology for Developing Structured Design in Embedded Weapons Systems.

    DTIC Science & Technology

    1983-12-01

    the hardware selection. This premise has been reiterated and substantiated by numerous case studies performed in recent years, among them Barry ...measures, rules of thumb, and analysis techniques, this method with early development by De Marco is the basis for the Pressman design methodology...desired traits of a design based on the specifications generated, but does not include a procedure for realization of the design. Pressman, (Ref. 5

  15. Integration of object-oriented knowledge representation with the CLIPS rule based system

    NASA Technical Reports Server (NTRS)

    Logie, David S.; Kamil, Hasan

    1990-01-01

    The paper describes a portion of the work aimed at developing an integrated, knowledge-based environment for the development of engineering-oriented applications. An Object Representation Language (ORL) was implemented in C++ and is used to build and modify an object-oriented knowledge base. The ORL was designed to be easily integrated with other representation schemes that could effectively reason with the object base. Specifically, the integration of the ORL with the rule-based C Language Integrated Production System (CLIPS), developed at the NASA Johnson Space Center, is discussed. The object-oriented knowledge representation provides a natural means of representing problem data as a collection of related objects. Objects are composed of descriptive properties and interrelationships. The object-oriented model promotes efficient handling of the problem data by allowing knowledge to be encapsulated in objects, and data is inherited through an object network via the relationship links. Together, the two schemes complement each other: the object-oriented approach efficiently handles problem data, while the rule-based knowledge is used to simulate the reasoning process. Alone, the object-based knowledge is little more than an object-oriented data storage scheme; the CLIPS inference engine, however, adds the mechanism to directly and automatically reason with that knowledge. In this hybrid scheme, the expert system dynamically queries for data and can modify the object base with complete access to all the functionality of the ORL from rules.

  16. Cost of enlarged operating zone for an existing Francis runner

    NASA Astrophysics Data System (ADS)

    Monette, Christine; Marmont, Hugues; Chamberland-Lauzon, Joël; Skagerstrand, Anders; Coutu, André; Carlevi, Jens

    2016-11-01

    Traditionally, hydro power plants have been operated close to the best efficiency point, the most stable operating condition, for which they were designed. However, because of changes in the electricity market, many hydro power plant operators wish to operate their machines differently to fulfil new market needs. New operating conditions can include whole-range operation, many starts/stops, extensive low-load operation, synchronous condenser mode, and power/frequency regulation. Many of these new operating conditions may impose more severe fatigue damage than traditional base-load operation close to the best efficiency point. Under these conditions, the fatigue life of the runner may be significantly reduced, and repair or replacement costs may arise sooner than expected. In order to design reliable Francis runners for these new, challenging operating scenarios, Andritz Hydro has developed various proprietary tools and design rules, which are used within Andritz Hydro to design mechanically robust Francis runners for operating scenarios fulfilling customers' specifications. To estimate the residual life, under different operating scenarios, of an existing runner designed years ago for best-efficiency base-load operation, Andritz Hydro's design rules and tools would necessarily lead to conservative results. While the geometry of a new runner can be modified to fulfil all conservative mechanical design rules, the predicted fatigue life of an existing runner under off-design operating conditions may appear rather short because of the conservative safety factors included in the calculations. The most precise and reliable way to calculate the residual life of an existing runner under different operating scenarios is to perform a strain gauge measurement campaign on the runner. 
This paper presents the runner strain gauge measurement campaign of a mid-head Francis turbine over all the operating conditions available during the test, the analysis of the measurement signals, and the runner residual life assessment under different operating scenarios. With these results, the maintenance cost of the change in operating mode can be calculated and foreseen by the power plant owner.

  17. Designing and Implementation of a Heart Failure Telemonitoring System

    PubMed Central

    Safdari, Reza; Jafarpour, Maryam; Mokhtaran, Mehrshad; Naderi, Nasim

    2017-01-01

    Introduction: The aim of this study was to identify at-risk patients, enhance self-care management of heart failure (HF) patients at home, and reduce disease exacerbations and readmissions. Method: Based on standard heart failure guidelines and semi-structured interviews with 10 heart failure specialists, a draft heart failure rule set for alerts and patient instructions was developed; the clinical champion of the project then vetted the rule set. We also designed a transactional system to enhance monitoring and follow-up of CHF patients. With this system, CHF patients are required to measure their physiological parameters (vital signs and body weight) every day and to submit their symptoms using the app. Additionally, based on their data, they receive customized notifications and motivational messages that classify the risk of disease exacerbation. The system architecture comprises six major components: 1) a patient data collection suite including a mobile app and website; 2) a data receiver; 3) a database; 4) a specialist expert panel; 5) a rule-engine classifier; and 6) a notifier engine. Results: This system has been implemented in Iran for the first time, and we are currently in a testing phase with 10 patients to evaluate its technical performance. The developed expert system generates alerts and instructions based on the patient's data, and the notifier engine notifies responsible nurses and physicians and, in some cases, patients. Detailed analysis of those results will be reported in a future report. Conclusion: This study describes the design of a telemonitoring system for heart failure self-care that aims to bridge the gap that occurs when patients are discharged from the hospital and to reduce the need for readmission. A rule set for classifying patient risk and generating automated alerts and patient instructions for heart failure telemonitoring was developed. 
It also facilitates daily communication between patients and heart failure clinicians so that any deterioration in health can be identified immediately. PMID:29114106
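The rule-engine classifier component described above could, in outline, look like the following sketch; the vital-sign thresholds and risk tiers here are illustrative assumptions, not the clinically vetted rule set from the study.

```python
# Illustrative rule-engine classifier for HF telemonitoring.
# NOTE: all thresholds and rules below are hypothetical examples,
# not the specialist-vetted rule set described in the paper.

def classify_risk(weight_gain_kg: float, heart_rate: int, dyspnea: bool) -> str:
    """Classify exacerbation risk from daily self-reported measurements."""
    if weight_gain_kg >= 2.0 or (heart_rate > 120 and dyspnea):
        return "high"    # notifier engine alerts nurse/physician
    if weight_gain_kg >= 1.0 or heart_rate > 100 or dyspnea:
        return "medium"  # patient receives an instruction message
    return "low"         # routine motivational message only

print(classify_risk(2.5, 80, False))  # high
```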

  18. Designing and Implementation of a Heart Failure Telemonitoring System.

    PubMed

    Safdari, Reza; Jafarpour, Maryam; Mokhtaran, Mehrshad; Naderi, Nasim

    2017-09-01

    The aim of this study was to identify at-risk patients, enhance self-care management of heart failure (HF) patients at home, and reduce disease exacerbations and readmissions. Based on standard heart failure guidelines and semi-structured interviews with 10 heart failure specialists, a draft heart failure rule set for alerts and patient instructions was developed; the clinical champion of the project then vetted the rule set. We also designed a transactional system to enhance monitoring and follow-up of CHF patients. With this system, CHF patients are required to measure their physiological parameters (vital signs and body weight) every day and to submit their symptoms using the app. Additionally, based on their data, they receive customized notifications and motivational messages that classify the risk of disease exacerbation. The system architecture comprises six major components: 1) a patient data collection suite including a mobile app and website; 2) a data receiver; 3) a database; 4) a specialist expert panel; 5) a rule-engine classifier; and 6) a notifier engine. This system has been implemented in Iran for the first time, and we are currently in a testing phase with 10 patients to evaluate its technical performance. The developed expert system generates alerts and instructions based on the patient's data, and the notifier engine notifies responsible nurses and physicians and, in some cases, patients. Detailed analysis of those results will be reported in a future report. This study describes the design of a telemonitoring system for heart failure self-care that aims to bridge the gap that occurs when patients are discharged from the hospital and to reduce the need for readmission. A rule set for classifying patient risk and generating automated alerts and patient instructions for heart failure telemonitoring was developed. 
It also facilitates daily communication between patients and heart failure clinicians so that any deterioration in health can be identified immediately.

  19. A semi-automatic computer-aided method for surgical template design

    NASA Astrophysics Data System (ADS)

    Chen, Xiaojun; Xu, Lu; Yang, Yue; Egger, Jan

    2016-02-01

    This paper presents a generalized integrated framework for semi-automatic surgical template design. Several algorithms were implemented, including mesh segmentation, offset surface generation, collision detection, and ruled surface generation, and special software named TemDesigner was developed. With a simple user interface, a customized template can be semi-automatically designed according to the preoperative plan. Firstly, mesh segmentation with a signed scalar per vertex is utilized to partition the inner surface from the input surface mesh based on the indicated point loop. Then, the offset surface of the inner surface is obtained by contouring the distance field of the inner surface and is segmented to generate the outer surface. A ruled surface is employed to connect the inner and outer surfaces. Finally, drilling tubes are generated according to the preoperative plan through collision detection and merging. The method has been applied to template design for various kinds of surgeries, including oral implantology, cervical pedicle screw insertion, iliosacral screw insertion, and osteotomy, demonstrating its efficiency, functionality, and generality.

  20. Inclusive Competitive Game Play Through Balanced Sensory Feedback.

    PubMed

    Westin, Thomas; Söderström, David; Karlsson, Olov; Peiris, Ranil

    2017-01-01

    While game accessibility has improved significantly in the last few years, there are still barriers to equal participation, and multiplayer issues have been less researched. Game balance here means making the game fair in a player-versus-player competitive game. One difficult design task is to balance the game to be fair regardless of visual or hearing capabilities, which impose clearly different requirements. This paper explores a tentative design method for enabling inclusive competitive game-play without individual adaptations of game rules that could spoil the game. The method involved applying a unified design method to design an unbalanced game, then modifying visual feedback as a hypothetically balanced design, and testing the game with a total of 52 people with and without visual or hearing disabilities in three workshops. Game balance was evaluated based on score differences and less structured qualitative data, and a redesign of the game was made. We conclude with a tentative method for balancing a multiplayer competitive game without changing game rules, and show how the method can be applied.

  1. Design and static structural analysis of a race car chassis for Formula Society of Automotive Engineers (FSAE) event

    NASA Astrophysics Data System (ADS)

    Mohamad, M. L.; Rahman, M. T. A.; Khan, S. F.; Basha, M. H.; Adom, A. H.; Hashim, M. S. M.

    2017-10-01

    The main purpose of this study is to improve the UniMAP Automotive Racing Team car chassis, which has several problems that must be fixed and requires some changes in order to perform well. This study involves designing three chassis based on the rules stated in the FSAE rule book (2017/2018). The three chassis undergo an analysis consisting of five tests: main roll hoop, front roll hoop, static shear, side impact, and static torsional loading; one of them is then selected as the best design in terms of von Mises stress and torsional displacement. From the results obtained, the newly selected chassis design, which is also declared the new improved design, weighs 27.66 kg, a decrease of 16.7% from the existing chassis (32.77 kg). The torsional rigidity of the improved chassis increased by 37.74%.

  2. A semi-automatic computer-aided method for surgical template design

    PubMed Central

    Chen, Xiaojun; Xu, Lu; Yang, Yue; Egger, Jan

    2016-01-01

    This paper presents a generalized integrated framework for semi-automatic surgical template design. Several algorithms were implemented, including mesh segmentation, offset surface generation, collision detection, and ruled surface generation, and special software named TemDesigner was developed. With a simple user interface, a customized template can be semi-automatically designed according to the preoperative plan. Firstly, mesh segmentation with a signed scalar per vertex is utilized to partition the inner surface from the input surface mesh based on the indicated point loop. Then, the offset surface of the inner surface is obtained by contouring the distance field of the inner surface and is segmented to generate the outer surface. A ruled surface is employed to connect the inner and outer surfaces. Finally, drilling tubes are generated according to the preoperative plan through collision detection and merging. The method has been applied to template design for various kinds of surgeries, including oral implantology, cervical pedicle screw insertion, iliosacral screw insertion, and osteotomy, demonstrating its efficiency, functionality, and generality. PMID:26843434

  3. A semi-automatic computer-aided method for surgical template design.

    PubMed

    Chen, Xiaojun; Xu, Lu; Yang, Yue; Egger, Jan

    2016-02-04

    This paper presents a generalized integrated framework for semi-automatic surgical template design. Several algorithms were implemented, including mesh segmentation, offset surface generation, collision detection, and ruled surface generation, and special software named TemDesigner was developed. With a simple user interface, a customized template can be semi-automatically designed according to the preoperative plan. Firstly, mesh segmentation with a signed scalar per vertex is utilized to partition the inner surface from the input surface mesh based on the indicated point loop. Then, the offset surface of the inner surface is obtained by contouring the distance field of the inner surface and is segmented to generate the outer surface. A ruled surface is employed to connect the inner and outer surfaces. Finally, drilling tubes are generated according to the preoperative plan through collision detection and merging. The method has been applied to template design for various kinds of surgeries, including oral implantology, cervical pedicle screw insertion, iliosacral screw insertion, and osteotomy, demonstrating its efficiency, functionality, and generality.

  4. A rule-based system for real-time analysis of control systems

    NASA Astrophysics Data System (ADS)

    Larson, Richard R.; Millard, D. Edward

    1992-10-01

    An approach to automate the real-time analysis of flight-critical health monitoring and system status is being developed and evaluated at the NASA Dryden Flight Research Facility. A software package was developed in-house and installed as part of the extended aircraft interrogation and display system. This design features a knowledge-base structure in the form of rules used to formulate interpretation and decision logic for real-time data. This technique has been applied to ground verification and validation testing and to flight-test monitoring, where quick, real-time, safety-of-flight decisions can be critical. In many cases, post-processing and manual analysis of flight system data are not required. The processing of real-time data for analysis is described, along with the output format, which features a message stack display. The development, construction, and testing of the rule-driven knowledge base, along with an application in the X-31A flight test program, are presented.

  5. A rule-based system for real-time analysis of control systems

    NASA Technical Reports Server (NTRS)

    Larson, Richard R.; Millard, D. Edward

    1992-01-01

    An approach to automate the real-time analysis of flight-critical health monitoring and system status is being developed and evaluated at the NASA Dryden Flight Research Facility. A software package was developed in-house and installed as part of the extended aircraft interrogation and display system. This design features a knowledge-base structure in the form of rules used to formulate interpretation and decision logic for real-time data. This technique has been applied to ground verification and validation testing and to flight-test monitoring, where quick, real-time, safety-of-flight decisions can be critical. In many cases, post-processing and manual analysis of flight system data are not required. The processing of real-time data for analysis is described, along with the output format, which features a message stack display. The development, construction, and testing of the rule-driven knowledge base, along with an application in the X-31A flight test program, are presented.

  6. Reliability based design including future tests and multiagent approaches

    NASA Astrophysics Data System (ADS)

    Villanueva, Diane

    The initial stages of reliability-based design optimization involve formulating objective functions and constraints and building a model to estimate the reliability of the design with quantified uncertainties. However, even experienced hands often overlook important objective functions and constraints that affect the design. In addition, uncertainty reduction measures, such as tests and redesign, are often not considered in reliability calculations during the initial stages. This research considers two areas that concern the design of engineering systems: 1) the trade-off between the effect of a test and post-test redesign on reliability and cost, and 2) the search for multiple candidate designs as insurance against unforeseen faults in some designs. In this research, a methodology was developed to estimate the effect of a single future test and post-test redesign on reliability and cost. The methodology uses assumed distributions of computational and experimental errors with redesign rules to simulate alternative future test and redesign outcomes, forming a probabilistic estimate of the reliability and cost for a given design. Further, it was explored how modeling a future test and redesign gives a company an opportunity to balance development costs against performance by simultaneously choosing the design and the post-test redesign rules during the initial design stage. The second area of this research considers the use of dynamic local surrogates, or surrogate-based agents, to locate multiple candidate designs. Surrogate-based global optimization algorithms often require searching multiple candidate regions of the design space, expending most of the computation needed to define multiple alternative designs; thus, focusing solely on locating the best design may be wasteful. We extended adaptive sampling surrogate techniques to locate multiple optima by building local surrogates in sub-regions of the design space to identify optima. 
The efficiency of this method was studied, and the method was compared to other surrogate-based optimization methods that aim to locate the global optimum using two two-dimensional test functions, a six-dimensional test function, and a five-dimensional engineering example.
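The simulated future test and redesign described above can be sketched as a Monte Carlo loop; the error distributions, margin threshold, and redesign rule below are illustrative assumptions, not the dissertation's actual models.

```python
import random

# Sketch of estimating reliability with one future test + post-test
# redesign. All distributions and thresholds are assumed for
# illustration only.

def simulate_reliability(n_trials: int = 10000, seed: int = 0) -> float:
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        calc_error = rng.gauss(0.0, 0.05)   # assumed computational error
        test_error = rng.gauss(0.0, 0.02)   # assumed experimental error
        true_margin = 0.10 + calc_error     # nominal safety margin + error
        measured = true_margin + test_error # what the future test observes
        if measured < 0.05:                 # redesign rule: margin too low
            true_margin += 0.08             # assumed redesign improvement
        if true_margin < 0.0:
            failures += 1
    return 1.0 - failures / n_trials

print(simulate_reliability())
```

Repeating this loop for different redesign rules is what lets the designer trade off test/redesign cost against the reliability gained.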

  7. 78 FR 72968 - Self-Regulatory Organizations; New York Stock Exchange LLC; Notice of Filing of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-04

    ... Capital Commitment Schedule (``CCS'') interest; (3) NYSE Rule 70.25 to permit d-Quotes to be designated... that MPL Orders may interact with CCS interest; (3) NYSE Rule 70.25 to permit d- Quotes to be... the CCS pursuant to Rule 1000 would not be permitted to be designated as MPL Orders. The CCS is a...

  8. Habituation: a non-associative learning rule design for spiking neurons and an autonomous mobile robots implementation.

    PubMed

    Cyr, André; Boukadoum, Mounir

    2013-03-01

    This paper presents a novel bio-inspired habituation function for robots controlled by an artificial spiking neural network. This non-associative learning rule is modelled at the synaptic level and validated through robotic behaviours in reaction to different stimulus patterns in a dynamic virtual 3D world. Habituation is minimally represented as an attenuated response following exposure to and perception of persistent external stimuli. Based on current neuroscience research, the originality of this rule includes a response modulated by the frequency of the captured stimuli. Filtering out repetitive data through the natural habituation mechanism has been demonstrated to be a key factor in the attention phenomenon, and inserting such a rule operating at multiple temporal dimensions of stimuli increases a robot's adaptive behaviour by letting it ignore broader, contextually irrelevant information.
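A minimal habituation synapse along these lines might decrement its weight on each repeated stimulus and recover between stimuli; the time constant and decrement below are illustrative assumptions, not the paper's spiking-network parameters.

```python
import math

# Illustrative synaptic habituation rule: the response weight attenuates
# with repeated stimuli and spontaneously recovers toward baseline when
# stimulation pauses. Parameter values are made-up assumptions.

def habituate(stimulus_times, tau_recover=5.0, decrement=0.2):
    w, last_t, weights = 1.0, 0.0, []
    for t in stimulus_times:
        dt = t - last_t
        # spontaneous recovery toward the baseline weight of 1.0
        w = 1.0 - (1.0 - w) * math.exp(-dt / tau_recover)
        w = max(0.0, w - decrement)  # attenuated response to the stimulus
        weights.append(w)
        last_t = t
    return weights

# rapid repetition attenuates the response; a long pause lets it recover
print(habituate([1, 2, 3, 4, 20]))
```

Because recovery depends on the inter-stimulus interval, the attenuation is naturally modulated by stimulus frequency, mirroring the frequency-dependent behaviour the abstract highlights.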

  9. A flexible telerobotic system for space operations

    NASA Technical Reports Server (NTRS)

    Sliwa, N. O.; Will, R. W.

    1987-01-01

    The objective and design of a proposed goal-oriented, knowledge-based telerobotic system for space operations are described. This design effort encompasses the elements of the system executive and user interface and the distribution and general structure of the knowledge base, the displays, and the task sequencing. The objective of the design effort is to provide an expandable structure for a telerobotic system that provides cooperative interaction between the human operator and computer control. The initial phase of the implementation provides a rule-based, goal-oriented script generator that interfaces to the existing control modes of a telerobotic research system in the Intelligent Systems Research Lab at NASA Research Center.

  10. The study on the effect of pattern density distribution on the STI CMP process

    NASA Astrophysics Data System (ADS)

    Sub, Yoon Myung; Hian, Bernard Yap Tzen; Fong, Lee It; Anak, Philip Menit; Minhar, Ariffin Bin; Wui, Tan Kim; Kim, Melvin Phua Twang; Jin, Looi Hui; Min, Foo Thai

    2017-08-01

    The effects of pattern density on CMP characteristics were investigated using a wafer specially designed for characterizing pattern dependencies in STI CMP [1]. The purpose of this study is to investigate the planarization behavior of a direct STI CMP process using a cerium-oxide (CeO2)-based slurry system in terms of pattern density variation. The minimum design rule (DR) of the 180nm technology node was adopted for the mask layout, and the mask was successfully applied to the evaluation of a cerium-oxide (CeO2) abrasive-based direct STI CMP process. In this study, we describe the planarization behavior and loading effects of pattern density variation, characterized with layout pattern density and pitch variations using the masks mentioned above. Furthermore, the pattern dependence of the thickness remaining after CMP on feature dimensions and spacing was analyzed and evaluated. The goal was to establish a library-based method that can be used to generate design rules reducing the probability of CMP-related failures. The characterization was measured across various layouts with different pattern density ranges, and the effects of pattern density on STI CMP are discussed in this paper.
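The pattern-density loading effect characterized in this study is often approximated with a density-weighted removal-rate model; the sketch below uses that classic approximation with made-up numbers, not the paper's measured library data.

```python
# Classic density-weighted CMP approximation: raised features in a
# low-density region bear more pressure and are removed faster, so the
# local removal rate scales as blanket_rate / pattern_density.
# All numbers here are made up for illustration.

def remaining_thickness(initial_nm, blanket_rate_nm_s, density, t_s):
    return initial_nm - blanket_rate_nm_s * t_s / density

dense = remaining_thickness(500, 2.0, 0.8, 30)   # 80% pattern density
sparse = remaining_thickness(500, 2.0, 0.2, 30)  # 20% pattern density
print(dense, sparse)  # 425.0 200.0 -- sparse regions thin faster
```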

  11. Design Characteristics Influence Performance of Clinical Prediction Rules in Validation: A Meta-Epidemiological Study

    PubMed Central

    Ban, Jong-Wook; Emparanza, José Ignacio; Urreta, Iratxe; Burls, Amanda

    2016-01-01

    Background Many new clinical prediction rules are derived and validated, but the design and reporting quality of clinical prediction research has been less than optimal. We aimed to assess whether design characteristics of validation studies were associated with overestimation of clinical prediction rules' performance. We also aimed to evaluate whether validation studies clearly reported important methodological characteristics. Methods Electronic databases were searched for systematic reviews of clinical prediction rule studies published between 2006 and 2010. Data were extracted from the eligible validation studies included in the systematic reviews. A meta-analytic meta-epidemiological approach was used to assess the influence of design characteristics on predictive performance. For each validation study, it was assessed whether 7 design and 7 reporting characteristics were properly described. Results A total of 287 validation studies of clinical prediction rules were collected from 15 systematic reviews (31 meta-analyses). Validation studies using a case-control design produced a summary diagnostic odds ratio (DOR) 2.2 times (95% CI: 1.2–4.3) larger than validation studies using cohort or unclear designs. When differential verification was used, the summary DOR was overestimated twofold (95% CI: 1.2–3.1) compared to complete, partial, and unclear verification. The summary relative DOR (RDOR) of validation studies with inadequate sample size was 1.9 (95% CI: 1.2–3.1) compared to studies with adequate sample size. Study site, reliability, and the clinical prediction rule were adequately described in only 10.1%, 9.4%, and 7.0% of validation studies, respectively. Conclusion Validation studies with design shortcomings may overestimate the performance of clinical prediction rules. The quality of reporting among studies validating clinical prediction rules needs to be improved. PMID:26730980
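The diagnostic odds ratio (DOR) compared across validation studies is computed from the 2×2 table of rule predictions versus outcomes; a worked example with hypothetical counts:

```python
# Diagnostic odds ratio from a 2x2 validation table:
# DOR = (TP / FN) / (FP / TN) = (TP * TN) / (FP * FN)

def diagnostic_odds_ratio(tp: int, fp: int, fn: int, tn: int) -> float:
    return (tp * tn) / (fp * fn)

# hypothetical validation study of a clinical prediction rule
dor = diagnostic_odds_ratio(tp=90, fp=20, fn=10, tn=80)
print(dor)  # 36.0
```

A biased design (e.g. case-control sampling) that inflates TP and TN counts relative to the clinical population inflates this ratio, which is the overestimation the meta-analysis quantifies.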

  12. Transfer of Mixed Word Identification Training to a Reading Context.

    ERIC Educational Resources Information Center

    Koehler, John; And Others

    The research reported here was designed to examine a number of factors that findings from verbal learning studies indicate should affect the recall and transfer of word identification materials. Sight word and phonics-based or rule-based learning were investigated in 112 kindergarteners who were identified as nonreaders. Groups were trained on…

  13. NASIS data base management system: IBM 360 TSS implementation. Volume 1: Installation standards

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The installation standards for the NASIS data base management system are presented. The standard approach to preparing systems documentation and the program design and coding rules and conventions are outlined. Included are instructions for preparing all major specifications and suggestions for improving the quality and efficiency of the programming task.

  14. Simulation-based MDP verification for leading-edge masks

    NASA Astrophysics Data System (ADS)

    Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki

    2017-07-01

    For IC design starts below the 20nm technology node, the assist features on photomasks shrink well below 60nm, and the printed patterns of those features on masks written by VSB eBeam writers start to show a large deviation from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and been adopted, such as rule-based mask process correction (MPC), model-based MPC, and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from the target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for the assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification, which became a necessity for OPC a decade ago, we see the same trend in MDP today. A simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities, and a meaningful simulation-based mask check needs a good mask process model. The TrueModel® system is a field-tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask checks as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose-margin-related hotspots can also be detected by setting a correct detection threshold. In this paper, we will demonstrate GPU acceleration for geometry processing and give examples of mask check results and performance data. 
GPU-acceleration is necessary to make simulation-based mask MDP verification acceptable.

  15. C-Language Integrated Production System, Version 6.0

    NASA Technical Reports Server (NTRS)

    Riley, Gary; Donnell, Brian; Ly, Huyen-Anh Bebe; Ortiz, Chris

    1995-01-01

    C Language Integrated Production System (CLIPS) computer programs are specifically intended to model human expertise or other knowledge. CLIPS is designed to enable research on, and development and delivery of, artificial intelligence on conventional computers. CLIPS 6.0 provides cohesive software tool for handling wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming: representation of knowledge as heuristics - essentially, rules of thumb that specify set of actions performed in given situation. Object-oriented programming: modeling of complex systems comprised of modular components easily reused to model other systems or create new components. Procedural-programming: representation of knowledge in ways similar to those of such languages as C, Pascal, Ada, and LISP. Version of CLIPS 6.0 for IBM PC-compatible computers requires DOS v3.3 or later and/or Windows 3.1 or later.
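CLIPS rules are written in CLIPS's own language; purely as an illustration of the rule-based paradigm it supports (heuristics fired by forward chaining on facts), here is a toy inference loop in Python, not CLIPS syntax or its actual inference engine.

```python
# Toy forward-chaining inference: fire any rule whose conditions are all
# satisfied by the known facts, until no new facts can be derived.

def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# hypothetical rules of thumb, in (conditions, conclusion) form
rules = [
    ({"engine-hot", "coolant-low"}, "add-coolant"),
    ({"add-coolant"}, "recheck-temperature"),
]
print(sorted(forward_chain({"engine-hot", "coolant-low"}, rules)))
```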

  16. A study of applications scribe frame data verifications using design rule check

    NASA Astrophysics Data System (ADS)

    Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki

    2013-06-01

    In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables for scanner alignment, wafer inspection, and customer-specified marks, and we check at the end that the scribe frame design conforms to the alignment and inspection mark specifications. Recently, in COT (customer-owned tooling) business or new technology development, there has been no effective verification method for scribe frame data, and verification takes a lot of time. Therefore, we tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which is used in device verification. We present the scheme of scribe frame data verification using DRC that we applied. First, verification rules are created based on the specifications of the scanner, inspection, and others, and a mark library is also created for pattern matching. Next, DRC verification is performed on the scribe frame data; this DRC verification includes pattern matching using the mark library. Our experiments demonstrated that, by use of pattern matching and DRC verification, our new method can yield speed improvements of more than 12 percent compared with conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.
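    The described flow (a mark library for pattern matching, then DRC-style rules) can be sketched as follows; the mark names, sizes, and the 40 um spacing rule are invented for illustration:

```python
# Illustrative sketch of the described flow: match marks in scribe frame data
# against a mark library, then apply a DRC-style rule (minimum spacing).
# Mark names and the 40 um spacing rule are invented for the example.
mark_library = {"ALIGN_X": (50, 50), "INSPECT_A": (20, 20)}  # name -> (w, h) um

frame_marks = [
    {"name": "ALIGN_X", "size": (50, 50), "pos": (0, 0)},
    {"name": "INSPECT_A", "size": (20, 20), "pos": (55, 0)},
    {"name": "ALIGN_X", "size": (48, 50), "pos": (200, 0)},  # wrong size
]

def check_marks(marks, library, min_spacing=40):
    errors = []
    for m in marks:
        if library.get(m["name"]) != m["size"]:  # pattern-match against library
            errors.append(("SIZE_MISMATCH", m["name"], m["pos"]))
    for a, b in zip(marks, marks[1:]):           # DRC-style spacing rule
        if abs(b["pos"][0] - a["pos"][0]) < min_spacing:
            errors.append(("SPACING", a["name"], b["name"]))
    return errors

print(check_marks(frame_marks, mark_library))
# [('SIZE_MISMATCH', 'ALIGN_X', (200, 0))]
```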

  17. 78 FR 3486 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-16

    ..., bonds, and combinations thereof. The economics of the transaction are based on the relationship between... rules of an exchange be designed to promote just and equitable principles of trade, to prevent...

  18. Targeted training of the decision rule benefits rule-guided behavior in Parkinson's disease.

    PubMed

    Ell, Shawn W

    2013-12-01

    The impact of Parkinson's disease (PD) on rule-guided behavior has received considerable attention in cognitive neuroscience. The majority of research has used PD as a model of dysfunction in frontostriatal networks, but very few attempts have been made to investigate the possibility of adapting common experimental techniques in an effort to identify the conditions that are most likely to facilitate successful performance. The present study investigated a targeted training paradigm designed to facilitate rule learning and application using rule-based categorization as a model task. Participants received targeted training in which there was no selective-attention demand (i.e., stimuli varied along a single, relevant dimension) or nontargeted training in which there was selective-attention demand (i.e., stimuli varied along a relevant dimension as well as an irrelevant dimension). Following training, all participants were tested on a rule-based task with selective-attention demand. During the test phase, PD patients who received targeted training performed similarly to control participants and outperformed patients who did not receive targeted training. As a preliminary test of the generalizability of the benefit of targeted training, a subset of the PD patients were tested on the Wisconsin card sorting task (WCST). PD patients who received targeted training outperformed PD patients who did not receive targeted training on several WCST performance measures. These data further characterize the contribution of frontostriatal circuitry to rule-guided behavior. Importantly, these data also suggest that PD patient impairment, on selective-attention-demanding tasks of rule-guided behavior, is not inevitable and highlight the potential benefit of targeted training.

  19. Coreference analysis in clinical notes: a multi-pass sieve with alternate anaphora resolution modules

    PubMed Central

    Li, Dingcheng; Sohn, Sunghwan; Wu, Stephen Tze-Inn; Wagholikar, Kavishwar; Torii, Manabu; Liu, Hongfang

    2012-01-01

    Objective: This paper describes the coreference resolution system submitted by Mayo Clinic for the 2011 i2b2/VA/Cincinnati shared task Track 1C. The goal of the task was to construct a system that links the markables corresponding to the same entity. Materials and methods: The task organizers provided progress notes and discharge summaries that were annotated with the markables of treatment, problem, test, person, and pronoun. We used a multi-pass sieve algorithm that applies deterministic rules in decreasing order of precision and simultaneously gathers information about the entities in the documents. Our system, MedCoref, also uses a state-of-the-art machine learning framework as an alternative to the final, rule-based pronoun resolution sieve. Results: The best system that uses a multi-pass sieve has an overall score of 0.836 (average of B3, MUC, Blanc, and CEAF F score) for the training set and 0.843 for the test set. Discussion: A supervised machine learning system that typically uses a single function to find coreferents cannot accommodate irregularities encountered in data, especially given an insufficient number of examples. On the other hand, a completely deterministic system could lead to a decrease in recall (sensitivity) when the rules are not exhaustive. The sieve-based framework allows one to combine reliable machine learning components with rules designed by experts. Conclusion: Using relatively simple rules, part-of-speech information, and semantic type properties, an effective coreference resolution system can be designed. The source code of the system described is available at https://sourceforge.net/projects/ohnlp/files/MedCoref. PMID:22707745
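    The multi-pass sieve idea (deterministic passes applied from most to least precise, each building on links found by earlier passes) can be sketched as follows; the passes and mentions are toy examples, not MedCoref's actual rules:

```python
# Toy sketch of a multi-pass sieve for coreference: passes run from most to
# least precise, each refining a shared cluster assignment.
def exact_match_pass(mentions, clusters):
    # High-precision pass: link mentions with identical (case-folded) text.
    for i, m in enumerate(mentions):
        for j in range(i):
            if mentions[j]["text"].lower() == m["text"].lower():
                clusters[i] = clusters[j]

def pronoun_pass(mentions, clusters):
    # Low-precision fallback: naively link a pronoun to the nearest antecedent.
    for i, m in enumerate(mentions):
        if m["type"] == "pronoun" and i > 0:
            clusters[i] = clusters[i - 1]

mentions = [
    {"text": "chest pain", "type": "problem"},
    {"text": "the patient", "type": "person"},
    {"text": "Chest pain", "type": "problem"},
    {"text": "it", "type": "pronoun"},
]
clusters = list(range(len(mentions)))           # each mention starts alone
for sieve in (exact_match_pass, pronoun_pass):  # most precise first
    sieve(mentions, clusters)
print(clusters)  # [0, 1, 0, 0]: mentions 0, 2, 3 end up coreferent
```

    MedCoref's final pronoun sieve can instead be swapped for a machine-learned resolver, which is exactly the hybrid design the abstract describes.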

  20. Mathematical programming models for the economic design and assessment of wind energy conversion systems

    NASA Astrophysics Data System (ADS)

    Reinert, K. A.

    The use of linear decision rules (LDR) and chance constrained programming (CCP) to optimize the performance of wind energy conversion clusters coupled to storage systems is described. Storage is modelled by LDR and output by CCP. The linear allocation rule and linear release rule prescribe the size and optimize a storage facility with a bypass. Chance constraints are introduced to explicitly treat reliability in terms of an appropriate value from an inverse cumulative distribution function. Details of deterministic programming structure and a sample problem involving a 500 kW and a 1.5 MW WECS are provided, considering an installed cost of $1/kW. Four demand patterns and three levels of reliability are analyzed for optimizing the generator choice and the storage configuration for base load and peak operating conditions. Deficiencies in ability to predict reliability and to account for serial correlations are noted in the model, which is concluded useful for narrowing WECS design options.
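    The chance-constraint mechanism described (reliability expressed through an inverse cumulative distribution function) has a simple deterministic equivalent. A sketch, assuming a normal demand distribution and invented numbers:

```python
# Sketch of a chance constraint: require P(capacity >= demand) >= alpha by
# sizing against the inverse CDF, as the abstract describes. The normal
# distribution and the numbers are illustrative assumptions.
from statistics import NormalDist

demand_dist = NormalDist(mu=300.0, sigma=50.0)  # kW demand, assumed
alpha = 0.95                                    # required reliability level

# Deterministic equivalent: capacity must cover the alpha-quantile of demand.
required_capacity = demand_dist.inv_cdf(alpha)
print(round(required_capacity, 1))  # 382.2 kW
```

    In the CCP formulation, constraints of this deterministic-equivalent form enter a linear program alongside the LDR storage equations, so reliability is traded off explicitly against cost.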

  1. Medicare Program; Inpatient Psychiatric Facilities Prospective Payment System--Update for Fiscal Year Beginning October 1, 2015 (FY 2016). Final rule.

    PubMed

    2015-08-05

    This final rule updates the prospective payment rates for Medicare inpatient hospital services provided by inpatient psychiatric facilities (IPFs) (which are freestanding IPFs and psychiatric units of an acute care hospital or critical access hospital). These changes are applicable to IPF discharges occurring during fiscal year (FY) 2016 (October 1, 2015 through September 30, 2016). This final rule also implements: a new 2012-based IPF market basket; an updated IPF labor-related share; a transition to new Core Based Statistical Area (CBSA) designations in the FY 2016 IPF Prospective Payment System (PPS) wage index; a phase-out of the rural adjustment for IPF providers whose status changes from rural to urban as a result of the wage index CBSA changes; and new quality measures and reporting requirements under the IPF quality reporting program. This final rule also reminds IPFs of the October 1, 2015 implementation of the International Classification of Diseases, 10th Revision, Clinical Modification (ICD-10-CM), and updates providers on the status of IPF PPS refinements.

  2. Evaluation of the performance of statistical tests used in making cleanup decisions at Superfund sites. Part 1: Choosing an appropriate statistical test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berman, D.W.; Allen, B.C.; Van Landingham, C.B.

    1998-12-31

    The decision rules commonly employed to determine the need for cleanup are evaluated both to identify conditions under which they lead to erroneous conclusions and to quantify the rate at which such errors occur. Their performance is also compared with that of other applicable decision rules. The authors based the evaluation of decision rules on simulations. Results are presented as power curves. These curves demonstrate that the degree of statistical control achieved is independent of the form of the null hypothesis. The loss of statistical control that occurs when a decision rule is applied to a data set that does not satisfy the rule's validity criteria is also clearly demonstrated. Some of the rules evaluated do not offer the formal statistical control that is an inherent design feature of other rules. Nevertheless, results indicate that such informal decision rules may provide superior overall control of error rates when their application is restricted to data exhibiting particular characteristics. The results reported here are limited to decision rules applied to uncensored and lognormally distributed data. To optimize decision rules, it is necessary to evaluate their behavior when applied to data exhibiting a range of characteristics that bracket those common to field data. The performance of decision rules applied to data sets exhibiting a broader range of characteristics is reported in the second paper of this study.
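    The simulation approach described (apply a decision rule repeatedly to synthetic lognormal data and count errors, giving one point on a power curve) can be sketched as follows; the rule, threshold, and distribution parameters are invented:

```python
# Sketch of evaluating a decision rule by simulation: draw lognormal "site"
# data, apply a simple mean-vs-threshold rule, and estimate how often it
# wrongly calls a clean site contaminated. All parameters are invented.
import random
import statistics

random.seed(0)
THRESHOLD = 3.0  # cleanup standard, same units as the data
# Clean site: lognormal with arithmetic mean ~2.0 (mu=0.19, sigma=1.0,
# since mean = exp(mu + sigma^2 / 2)), i.e. truly below the standard.

def decision_rule(sample, threshold=THRESHOLD):
    """Flag cleanup when the sample mean exceeds the threshold."""
    return statistics.mean(sample) > threshold

trials = 2000
false_positives = 0
for _ in range(trials):
    sample = [random.lognormvariate(mu=0.19, sigma=1.0) for _ in range(10)]
    if decision_rule(sample):
        false_positives += 1
print(false_positives / trials)  # estimated false-positive rate
```

    Repeating this over a grid of true means traces out the rule's power curve; feeding it data that violate the rule's validity criteria (e.g. a different distribution) exposes the loss of statistical control the paper describes.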

  3. 77 FR 30988 - Endangered and Threatened Wildlife and Plants; Designation of Critical Habitat for the Cumberland...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-24

    ... revised totals. The data in this table replaces the data provided in table 3 of the proposed rule at 76 FR... that we designate or revise critical habitat based upon the best scientific data available, after... educational benefits of mapping areas containing essential features that aid in the recovery of the listed...

  4. Integration of a knowledge-based system and a clinical documentation system via a data dictionary.

    PubMed

    Eich, H P; Ohmann, C; Keim, E; Lang, K

    1997-01-01

    This paper describes the design and realisation of a knowledge-based system and a clinical documentation system linked via a data dictionary. The software was developed as a shell with object-oriented methods and C++ for IBM-compatible PCs and Windows 3.1/95. The data dictionary covers terminology and document objects with relations to external classifications. It controls the terminology in the documentation program, with form-based entry of clinical documents, and in the knowledge-based system, with scores and rules. The software was applied to the clinical field of acute abdominal pain by implementing a data dictionary with 580 terminology objects, 501 document objects, and 2136 links; a documentation module with 8 clinical documents; and a knowledge-based system with 10 scores and 7 sets of rules.

  5. Centralized PI control for high dimensional multivariable systems based on equivalent transfer function.

    PubMed

    Luan, Xiaoli; Chen, Qiang; Liu, Fei

    2014-09-01

    This article presents a new scheme to design a full matrix controller for high dimensional multivariable processes based on an equivalent transfer function (ETF). Differing from existing ETF methods, the proposed ETF is derived directly by exploiting the relationship between the equivalent closed-loop transfer function and the inverse of the open-loop transfer function. Based on the obtained ETF, the full matrix controller is designed using existing PI tuning rules. The new ETF model more accurately represents the original processes, and the full matrix centralized controller design method proposed in this paper is applicable to high dimensional multivariable systems with satisfactory performance. Comparison with other multivariable controllers shows that the designed ETF-based controller is superior with respect to design complexity and achieved performance. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
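    A static-gain sketch of the ETF idea, reading equivalent loop gains from the inverse of the open-loop gain matrix and feeding them to per-loop PI tuning, with invented 2x2 numbers; the actual method operates on full transfer functions, not just steady-state gains:

```python
# Static-gain sketch: equivalent gain of loop j taken as the reciprocal of
# the j-th diagonal entry of G(0)^-1. The 2x2 gain values are invented.
def inv2(m):
    """Inverse of a 2x2 matrix given as nested lists."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

G0 = [[2.0, 0.5],
      [0.4, 1.0]]  # assumed open-loop steady-state gain matrix
Ginv = inv2(G0)
equiv_gains = [1.0 / Ginv[j][j] for j in range(2)]
print([round(g, 3) for g in equiv_gains])  # [1.8, 0.9]
```

    Each equivalent gain accounts for the interaction from the other loop (e.g. 1.8 rather than the raw 2.0), which is why tuning PI controllers against the ETF rather than the diagonal of G gives better multivariable performance.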

  6. Peptide array-based interaction assay of solid-bound peptides and anchorage-dependant cells and its effectiveness in cell-adhesive peptide design.

    PubMed

    Kato, Ryuji; Kaga, Chiaki; Kunimatsu, Mitoshi; Kobayashi, Takeshi; Honda, Hiroyuki

    2006-06-01

    A peptide array, a designable peptide library covalently synthesized on a cellulose support, was applied to assay the peptide-cell interaction between solid-bound peptides and anchorage-dependent cells for objective peptide design. As a model case, cell-adhesive peptides that could enhance cell growth as tissue-engineering scaffold materials were studied. On the peptide array, the relative cell-adhesion ratio of NIH/3T3 cells was 2.5-fold higher on the RGDS (Arg-Gly-Asp-Ser) peptide spot than on the spot with no peptide, indicating an integrin-mediated peptide-cell interaction. Such strong cell adhesion mediated by the RGDS peptide was easily disrupted by a single residue substitution on the peptide array, indicating that the sequence-recognition accuracy of the cells was strictly conserved in our optimized scheme. The observed cellular morphological extension with active actin stress fibers on the RGD motif-containing peptide supported our strategy that peptide array-based interaction assay of solid-bound peptides and anchorage-dependent cells (PIASPAC) could provide quantitative data on biological peptide-cell interactions. The analysis of 180 peptides obtained from the fibronectin type III domain (no. 1447-1629) yielded 18 novel cell-adhesive peptides without the RGD motif. Together with the novel candidates, representative rules of ineffective amino acid usage were obtained from non-effective candidate sequences for the effective design of cell-adhesive peptides. On comparing the amino acid usage of the top 20 and last 20 of the 180 peptides, the following four brief design rules were indicated: (i) Arg or Lys among positively charged amino acids (except His) could enhance cell adhesion, (ii) small hydrophilic amino acids are favored in cell-adhesion peptides, (iii) negatively charged amino acids and small amino acids (except Gly) could reduce cell adhesion, and (iv) Cys and Met could be excluded from the sequence combinations since they have little influence on the peptide design. Such rules, indicative of the nature of functional peptide sequences, can be obtained only by the mass comparison analysis of PIASPAC using a peptide array. By following such indicative rules, numerous amino acid combinations can be effectively screened for further examination in novel peptide design.
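    The four indicated rules lend themselves to a simple screening score. A toy sketch, with invented weights and an assumed set of small hydrophilic residues:

```python
# Toy scoring function applying the four indicated design rules to rank
# candidate cell-adhesion peptides. Weights and residue sets are illustrative.
POSITIVE = set("RK")            # rule (i): Arg/Lys favored (His excluded)
SMALL_HYDROPHILIC = set("STN")  # rule (ii): assumed small hydrophilic set
NEGATIVE = set("DE")            # rule (iii): negatively charged disfavored
EXCLUDED = set("CM")            # rule (iv): Cys/Met excluded outright

def adhesion_score(peptide):
    if EXCLUDED & set(peptide):
        return None  # rule (iv): drop the candidate entirely
    score = 0.0
    for aa in peptide:
        if aa in POSITIVE:
            score += 1.0    # rule (i)
        elif aa in SMALL_HYDROPHILIC:
            score += 0.5    # rule (ii)
        elif aa in NEGATIVE:
            score -= 1.0    # rule (iii)
    return score

candidates = ["RGDS", "KKST", "DDEE", "CMRK"]
print({p: adhesion_score(p) for p in candidates})
# {'RGDS': 0.5, 'KKST': 3.0, 'DDEE': -4.0, 'CMRK': None}
```

    A screen like this is only a pre-filter; as in the paper, candidates would still need experimental confirmation on the array.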

  7. A General Design Rule to Manipulate Photocarrier Transport Path in Solar Cells and Its Realization by the Plasmonic-Electrical Effect

    NASA Astrophysics Data System (ADS)

    Sha, Wei E. I.; Zhu, Hugh L.; Chen, Luzhou; Chew, Weng Cho; Choy, Wallace C. H.

    2015-02-01

    It is well known that the transport paths of photocarriers (electrons and holes) before they are collected by the electrodes strongly affect bulk recombination and thus the electrical properties of solar cells, including open-circuit voltage and fill factor. To boost device performance, a general design rule, tailored to an arbitrary electron-to-hole mobility ratio, is proposed to decide the transport paths of photocarriers. Due to its unique ability to localize and concentrate light, plasmonics is explored to manipulate photocarrier transport by spatially redistributing light absorption in the active layer of devices. Without changing the active materials, we conceive a plasmonic-electrical concept, which tunes the electrical properties of solar cells via the plasmon-modified optical field distribution, to realize the design rule. Incorporating spectrally and spatially configurable metallic nanostructures, thin-film solar cells are theoretically modelled and experimentally fabricated to validate the design rule and verify the plasmonic-tunable electrical properties. The general design rule, together with the plasmonic-electrical effect, contributes to the evolution of emerging photovoltaics.

  8. Affect-Aware Adaptive Tutoring Based on Human-Automation Etiquette Strategies.

    PubMed

    Yang, Euijung; Dorneich, Michael C

    2018-06-01

    We investigated adapting the interaction style of intelligent tutoring system (ITS) feedback based on human-automation etiquette strategies. Most ITSs adapt the content difficulty level, adapt the feedback timing, or provide extra content when they detect cognitive or affective decrements. Our previous work demonstrated that changing the interaction style via different feedback etiquette strategies has differential effects on students' motivation, confidence, satisfaction, and performance. The best etiquette strategy was also determined by user frustration. Based on these findings, a rule set was developed that systemically selected the proper etiquette strategy to address one of four learning factors (motivation, confidence, satisfaction, and performance) under two different levels of user frustration. We explored whether etiquette strategy selection based on this rule set (systematic) or random changes in etiquette strategy for a given level of frustration affected the four learning factors. Participants solved mathematics problems under different frustration conditions with feedback that adapted dynamic changes in etiquette strategies either systematically or randomly. The results demonstrated that feedback with etiquette strategies chosen systematically via the rule set could selectively target and improve motivation, confidence, satisfaction, and performance more than changing etiquette strategies randomly. The systematic adaptation was effective no matter the level of frustration for the participant. If computer tutors can vary the interaction style to effectively mitigate negative emotions, then ITS designers would have one more mechanism in which to design affect-aware adaptations that provide the proper responses in situations where human emotions affect the ability to learn.
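    The rule set described (map a targeted learning factor and a frustration level to an etiquette strategy) can be sketched as a lookup table; the strategy names and table entries below are placeholders, not the study's actual rules:

```python
# Sketch of systematic etiquette-strategy selection: a rule table keyed by
# (targeted learning factor, frustration level). Entries are invented.
RULES = {
    ("motivation", "low"):  "positive-politeness",
    ("motivation", "high"): "bald-on-record",
    ("confidence", "low"):  "negative-politeness",
    ("confidence", "high"): "off-record",
}

def select_strategy(factor, frustration):
    """Systematic selection; fall back to a neutral default if unmapped."""
    return RULES.get((factor, frustration), "neutral")

print(select_strategy("motivation", "high"))   # bald-on-record
print(select_strategy("satisfaction", "low"))  # neutral (no rule defined)
```

    The study's comparison condition corresponds to replacing the table lookup with a random choice among strategies for the same frustration level, which is what the systematic rule set outperformed.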

  9. A business rules design framework for a pharmaceutical validation and alert system.

    PubMed

    Boussadi, A; Bousquet, C; Sabatier, B; Caruba, T; Durieux, P; Degoulet, P

    2011-01-01

    Several alert systems have been developed to improve the patient safety aspects of clinical information systems (CIS). Most studies have focused on the evaluation of these systems, with little information provided about the methodology leading to system implementation. We propose here an 'agile' business rule design framework (BRDF) supporting both the design of alerts for the validation of drug prescriptions and the incorporation of the end user into the design process. We analyzed the unified process (UP) design life cycle and defined the activities, subactivities, actors and UML artifacts that could be used to enhance the agility of the proposed framework. We then applied the proposed framework to two different sets of data in the context of the Georges Pompidou University Hospital (HEGP) CIS. We introduced two new subactivities into UP: business rule specification and business rule instantiation. The pharmacist made an effective contribution to five of the eight BRDF design activities. Validation of the two new subactivities was carried out in the context of drug dosage adaptation to the patient's clinical and biological context. A pilot experiment shows that business rules modeled with BRDF and implemented as an alert system triggered an alert for 5824 of the 71,413 prescriptions considered (8.16%). A business rule design framework approach meets one of the strategic objectives for decision support design by taking into account three important criteria posing a particular challenge to system designers: 1) business processes, 2) knowledge modeling of the context of application, and 3) the agility of the various design steps.
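    A business rule of the kind BRDF instantiates, a dosage alert adapted to patient context, might look like the following sketch; the drug names, limits, and renal adjustment are hypothetical, not taken from the HEGP system:

```python
# Hypothetical instantiation of a dosage-validation business rule: alert when
# a prescribed daily dose exceeds a context-dependent maximum.
MAX_DAILY_DOSE = {"drug_a": 4000, "drug_b": 150}  # mg/day, assumed limits

def dose_alert(prescription, renal_impairment=False):
    limit = MAX_DAILY_DOSE[prescription["drug"]]
    if renal_impairment:
        limit *= 0.5  # rule instantiated for the renal context
    if prescription["daily_dose_mg"] > limit:
        return f"ALERT: {prescription['drug']} exceeds {limit} mg/day"
    return None

rx = {"drug": "drug_a", "daily_dose_mg": 3000}
print(dose_alert(rx))                          # None: within the limit
print(dose_alert(rx, renal_impairment=True))   # ALERT: limit halved to 2000
```

    In BRDF terms, the specification subactivity would define the rule template with the pharmacist, and the instantiation subactivity would bind it to concrete drugs, thresholds, and patient-context conditions.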

  10. 40 CFR 62.1100 - Identification of plan.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) APPROVAL AND PROMULGATION OF STATE PLANS FOR DESIGNATED FACILITIES AND POLLUTANTS California Plan for the... of plan. (a) State of California Designated Facility Plan (Section 111(d) Plan). (b) The plan was... Pollution Control District Regulation 1; Rule 130—Definitions, Rule 240—Permit to Operate, Rule 450—Sulfide...

  11. 40 CFR 62.1100 - Identification of plan.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) APPROVAL AND PROMULGATION OF STATE PLANS FOR DESIGNATED FACILITIES AND POLLUTANTS California Plan for the... of plan. (a) State of California Designated Facility Plan (Section 111(d) Plan). (b) The plan was... Pollution Control District Regulation 1; Rule 130—Definitions, Rule 240—Permit to Operate, Rule 450—Sulfide...

  12. 40 CFR 62.1100 - Identification of plan.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) APPROVAL AND PROMULGATION OF STATE PLANS FOR DESIGNATED FACILITIES AND POLLUTANTS California Plan for the... of plan. (a) State of California Designated Facility Plan (Section 111(d) Plan). (b) The plan was... Pollution Control District Regulation 1; Rule 130—Definitions, Rule 240—Permit to Operate, Rule 450—Sulfide...

  13. 40 CFR 62.1100 - Identification of plan.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) APPROVAL AND PROMULGATION OF STATE PLANS FOR DESIGNATED FACILITIES AND POLLUTANTS California Plan for the... of plan. (a) State of California Designated Facility Plan (Section 111(d) Plan). (b) The plan was... Pollution Control District Regulation 1; Rule 130—Definitions, Rule 240—Permit to Operate, Rule 450—Sulfide...

  14. 40 CFR 62.1100 - Identification of plan.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) APPROVAL AND PROMULGATION OF STATE PLANS FOR DESIGNATED FACILITIES AND POLLUTANTS California Plan for the... of plan. (a) State of California Designated Facility Plan (Section 111(d) Plan). (b) The plan was... Pollution Control District Regulation 1; Rule 130—Definitions, Rule 240—Permit to Operate, Rule 450—Sulfide...

  15. 78 FR 24331 - Apricots Grown in Designated Counties in Washington; Temporary Suspension of Handling Regulations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-25

    ... DEPARTMENT OF AGRICULTURE Agricultural Marketing Service 7 CFR Part 922 [Docket No. AMS-FV-12-0028... Regulations AGENCY: Agricultural Marketing Service, USDA. ACTION: Affirmation of interim rule as final rule... the marketing order for apricots grown in designated Counties in Washington. The interim rule...

  16. DEVELOPMENT OF ASME SECTION X CODE RULES FOR HIGH PRESSURE COMPOSITE HYDROGEN PRESSURE VESSELS WITH NON-LOAD SHARING LINERS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rawls, G.; Newhouse, N.; Rana, M.

    2010-04-13

    The Boiler and Pressure Vessel Project Team on Hydrogen Tanks was formed in 2004 to develop Code rules to address the various needs that had been identified for the design and construction of up to 15,000 psi hydrogen storage vessels. One of these needs was the development of Code rules for high pressure composite vessels with non-load sharing liners for stationary applications. In 2009, ASME approved the new Appendix 8 for the Section X Code, which contains the rules for these vessels. These vessels are designated as Class III vessels, with design pressure ranging from 20.7 MPa (3,000 psi) to 103.4 MPa (15,000 psi) and a maximum allowable outside liner diameter of 2.54 m (100 inches). The maximum design life of these vessels is limited to 20 years. Design, fabrication, and examination requirements have been specified, including Acoustic Emission testing at time of manufacture. The Code rules include the design qualification testing of prototype vessels. Qualification includes proof, expansion, burst, cyclic fatigue, creep, flaw, permeability, torque, penetration, and environmental testing.

  17. Proposal to designate Methylothermus subterraneus Hirayama et al. 2011 as the type species of the genus Methylothermus. Request for an Opinion.

    PubMed

    Boden, Rich; Oren, Aharon

    2017-09-01

    Methylothermus thermalis, the designated type species of the genus Methylothermus, is not available from culture collections and its nomenclatural type is a patent strain. According to Rule 20a of the International Code of Nomenclature of Prokaryotes, only species whose names are legitimate may serve as types of genera. Therefore, the name Methylothermus and the names of the species Methylothermus thermalis and Methylothermus subterraneus are not validly published and are illegitimate. We therefore submit a Request for an Opinion to the Judicial Commission of the ICSP to consider the later-named Methylothermus subterraneus as the new type species of the genus Methylothermus based on Rule 20e(2).

  18. Machine-Learning Approach for Design of Nanomagnetic-Based Antennas

    NASA Astrophysics Data System (ADS)

    Gianfagna, Carmine; Yu, Huan; Swaminathan, Madhavan; Pulugurtha, Raj; Tummala, Rao; Antonini, Giulio

    2017-08-01

    We propose a machine-learning approach for design of planar inverted-F antennas with a magneto-dielectric nanocomposite substrate. It is shown that machine-learning techniques can be efficiently used to characterize nanomagnetic-based antennas by accurately mapping the particle radius and volume fraction of the nanomagnetic material to antenna parameters such as gain, bandwidth, radiation efficiency, and resonant frequency. A modified mixing rule model is also presented. In addition, the inverse problem is addressed through machine learning as well, where given the antenna parameters, the corresponding design space of possible material parameters is identified.
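    The inverse problem described (from target antenna parameters back to material parameters) can be sketched as a nearest-match search over a forward-modelled table; all values below are invented placeholders:

```python
# Sketch of the inverse design problem: given a target resonant frequency,
# find the material parameters whose forward-modelled response is closest.
# The lookup-table values are invented placeholders, not measured data.
forward_table = {
    # (particle radius nm, volume fraction) -> resonant frequency GHz
    (5, 0.10): 2.60,
    (5, 0.20): 2.45,
    (10, 0.10): 2.40,
    (10, 0.20): 2.20,
}

def inverse_design(target_ghz):
    """Return the (radius, volume fraction) pair closest to the target."""
    return min(forward_table, key=lambda k: abs(forward_table[k] - target_ghz))

print(inverse_design(2.42))  # (10, 0.1): closest forward-modelled frequency
```

    The paper's machine-learning approach effectively replaces this discrete table with a learned, continuous mapping in both directions, returning a design space of candidate material parameters rather than a single point.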

  19. Evaluation of wholesale electric power market rules and financial risk management by agent-based simulations

    NASA Astrophysics Data System (ADS)

    Yu, Nanpeng

    As U.S. regional electricity markets continue to refine their market structures, designs and rules of operation in various ways, two critical issues are emerging. First, although much experience has been gained and costly and valuable lessons have been learned, there is still a lack of a systematic platform for evaluation of the impact of a new market design from both engineering and economic points of view. Second, the transition from a monopoly paradigm characterized by a guaranteed rate of return to a competitive market created various unfamiliar financial risks for various market participants, especially for the Investor Owned Utilities (IOUs) and Independent Power Producers (IPPs). This dissertation uses agent-based simulation methods to tackle the market rules evaluation and financial risk management problems. The California energy crisis in 2000-01 showed what could happen to an electricity market if it did not go through a comprehensive and rigorous testing before its implementation. Due to the complexity of the market structure, strategic interaction between the participants, and the underlying physics, it is difficult to fully evaluate the implications of potential changes to market rules. This dissertation presents a flexible and integrative method to assess market designs through agent-based simulations. Realistic simulation scenarios on a 225-bus system are constructed for evaluation of the proposed PJM-like market power mitigation rules of the California electricity market. Simulation results show that in the absence of market power mitigation, generation company (GenCo) agents facilitated by Q-learning are able to exploit the market flaws and make significantly higher profits relative to the competitive benchmark. The incorporation of PJM-like local market power mitigation rules is shown to be effective in suppressing the exercise of market power. The importance of financial risk management is exemplified by the recent financial crisis. 
In this dissertation, basic financial risk management concepts relevant for wholesale electric power markets are carefully explained and illustrated. In addition, the financial risk management problem in wholesale electric power markets is generalized as a four-stage process. Within the proposed financial risk management framework, the critical problem of financial bilateral contract negotiation is addressed. This dissertation analyzes a financial bilateral contract negotiation process between a generating company and a load-serving entity in a wholesale electric power market with congestion managed by locational marginal pricing. Nash bargaining theory is used to model a Pareto-efficient settlement point. The model predicts negotiation results under varied conditions and identifies circumstances in which the two parties might fail to reach an agreement. Both analysis and agent-based simulation are used to gain insight regarding how relative risk aversion and biased price estimates influence negotiated outcomes. These results should provide useful guidance to market participants in their bilateral contract negotiation processes.
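    The Nash bargaining settlement used in the contract negotiation model maximizes the product of the two parties' gains over their disagreement payoffs. A grid-search sketch with invented utilities and numbers:

```python
# Sketch of the Nash bargaining settlement for a bilateral contract: search
# the contract price maximizing the product of the two parties' surpluses
# over their disagreement payoffs. Utilities and numbers are illustrative.
def nash_product(price, quantity=100.0,
                 gen_cost=30.0, lse_value=60.0,
                 gen_disagree=0.0, lse_disagree=0.0):
    u_gen = (price - gen_cost) * quantity - gen_disagree   # GenCo surplus
    u_lse = (lse_value - price) * quantity - lse_disagree  # LSE surplus
    if u_gen <= 0 or u_lse <= 0:
        return 0.0  # no gain over disagreement for someone -> no deal
    return u_gen * u_lse

prices = [30 + 0.1 * i for i in range(301)]  # $30 .. $60 per MWh grid
best = max(prices, key=nash_product)
print(round(best, 1))  # 45.0: risk-neutral parties split the surplus
```

    Introducing risk aversion (e.g. concave utilities) or biased price estimates shifts the maximizer away from the midpoint, which is the mechanism behind the dissertation's comparative results.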

  20. Developing a semantic web model for medical differential diagnosis recommendation.

    PubMed

    Mohammed, Osama; Benlamri, Rachid

    2014-10-01

    In this paper we describe a novel model for differential diagnosis designed to make recommendations by utilizing semantic web technologies. The model is a response to a number of requirements, ranging from incorporating essential clinical diagnostic semantics to the integration of data mining for the process of identifying candidate diseases that best explain a set of clinical features. We introduce two major components, which we find essential to the construction of an integral differential diagnosis recommendation model: the evidence-based recommender component and the proximity-based recommender component. Both approaches are driven by disease diagnosis ontologies designed specifically to enable the process of generating diagnostic recommendations. These ontologies are the disease symptom ontology and the patient ontology. The evidence-based diagnosis process develops dynamic rules based on standardized clinical pathways. The proximity-based component employs data mining to provide clinicians with diagnosis predictions, as well as generates new diagnosis rules from provided training datasets. This article describes the integration between these two components along with the developed diagnosis ontologies to form a novel medical differential diagnosis recommendation model. This article also provides test cases from the implementation of the overall model, which shows quite promising diagnostic recommendation results.
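    A proximity-style component of the kind described can be sketched as ranking candidate diseases by overlap between observed findings and per-disease profiles; the profiles below are invented for illustration:

```python
# Toy sketch of a proximity-style recommender: rank candidate diseases by
# the fraction of each disease profile matched by the observed findings.
# Disease profiles are invented for illustration.
profiles = {
    "flu":        {"fever", "cough", "myalgia"},
    "migraine":   {"headache", "photophobia"},
    "meningitis": {"fever", "headache", "stiff neck"},
}

def rank_diagnoses(findings):
    scores = {d: len(s & findings) / len(s) for d, s in profiles.items()}
    return sorted(scores.items(), key=lambda kv: -kv[1])

print(rank_diagnoses({"fever", "headache"}))
# meningitis ranks first (2/3 of its profile matched)
```

    The model in the paper goes further: the evidence-based component applies dynamic rules derived from clinical pathways, and the ontologies constrain which features and diseases can be related at all.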

  1. Clinical Trials Targeting Aging and Age-Related Multimorbidity

    PubMed Central

    Crimmins, Eileen M; Grossardt, Brandon R; Crandall, Jill P; Gelfond, Jonathan A L; Harris, Tamara B; Kritchevsky, Stephen B; Manson, JoAnn E; Robinson, Jennifer G; Rocca, Walter A; Temprosa, Marinella; Thomas, Fridtjof; Wallace, Robert; Barzilai, Nir

    2017-01-01

    Background: There is growing interest in identifying interventions that may increase health span by targeting biological processes underlying aging. The design of efficient and rigorous clinical trials to assess these interventions requires careful consideration of eligibility criteria, outcomes, sample size, and monitoring plans. Methods: Experienced geriatrics researchers and clinical trialists collaborated to provide advice on clinical trial design. Results: Outcomes based on the accumulation and incidence of age-related chronic diseases are attractive for clinical trials targeting aging. Accumulation and incidence rates of multimorbidity outcomes were developed by selecting at-risk subsets of individuals from three large cohort studies of older individuals. These provide representative benchmark data for decisions on eligibility, duration, and assessment protocols. Monitoring rules should be sensitive to targeting aging-related, rather than disease-specific, outcomes. Conclusions: Clinical trials targeting aging are feasible, but require careful design consideration and monitoring rules. PMID:28364543

  2. The Good, the Bad, and the Ugly: A Theoretical Framework for the Assessment of Continuous Colormaps.

    PubMed

    Bujack, Roxana; Turton, Terece L; Samsel, Francesca; Ware, Colin; Rogers, David H; Ahrens, James

    2018-01-01

    A myriad of design rules for what constitutes a "good" colormap can be found in the literature. Some common rules include order, uniformity, and high discriminative power. However, the meaning of many of these terms is often ambiguous or open to interpretation. At times, different authors may use the same term to describe different concepts or the same rule is described by varying nomenclature. These ambiguities stand in the way of collaborative work, the design of experiments to assess the characteristics of colormaps, and automated colormap generation. In this paper, we review current and historical guidelines for colormap design. We propose a specified taxonomy and provide unambiguous mathematical definitions for the most common design rules.

  3. Tool Mediation in Focus on Form Activities: Case Studies in a Grammar-Exploring Environment

    ERIC Educational Resources Information Center

    Karlstrom, Petter; Cerratto-Pargman, Teresa; Lindstrom, Henrik; Knutsson, Ola

    2007-01-01

    We present two case studies of two different pedagogical tasks in a Computer Assisted Language Learning environment called Grim. The main design principle in Grim is to support "Focus on Form" in second language pedagogy. Grim contains several language technology-based features for exploring linguistic forms (static, rule-based and statistical),…

  4. Phrase Frequency, Proficiency and Grammaticality Interact in Non-Native Processing: Implications for Theories of SLA

    ERIC Educational Resources Information Center

    Shantz, Kailen

    2017-01-01

    This study reports on a self-paced reading experiment in which native and non-native speakers of English read sentences designed to evaluate the predictions of usage-based and rule-based approaches to second language acquisition (SLA). Critical stimuli were four-word sequences embedded into sentences in which phrase frequency and grammaticality…

  5. Intelligent virtual teacher

    NASA Astrophysics Data System (ADS)

    Takács, Ondřej; Kostolányová, Kateřina

    2016-06-01

    This paper describes the Virtual Teacher, which uses a set of rules to automatically adapt the way of teaching. These rules consist of two parts: conditions on various student properties or learning situations, and conclusions that specify different adaptation parameters. The rules can be used for general adaptation across subjects, or they can be specific to a particular subject. The rule-based system of the Virtual Teacher is intended for pedagogical experiments in adaptive e-learning and is therefore designed for users without a computer science background. The Virtual Teacher was used in the dissertation theses of two students, who conducted two pedagogical experiments. This paper also describes the phase of simulating and modeling the theoretically prepared adaptive process in a modeling tool that has all the required parameters and was created specifically for this purpose. The experiments are conducted on groups of virtual students using virtual study materials.
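
    The condition/conclusion rule structure described above can be sketched as a tiny rule engine. The student properties and adaptation parameters below are invented names for illustration, not those of the Virtual Teacher system.

```python
# Illustrative sketch of condition/conclusion adaptation rules: each
# rule tests the student profile and, when it fires, contributes
# adaptation parameters. Property and parameter names are invented.

RULES = [
    # (condition on student profile, adaptation parameters to apply)
    (lambda s: s["learning_style"] == "visual",
     {"media": "video"}),
    (lambda s: s["prior_score"] < 50,
     {"pace": "slow", "extra_examples": True}),
    (lambda s: s["prior_score"] >= 85,
     {"pace": "fast"}),
]

def adapt(student):
    """Merge the conclusions of every rule whose condition holds."""
    params = {}
    for condition, conclusion in RULES:
        if condition(student):
            params.update(conclusion)
    return params

profile = {"learning_style": "visual", "prior_score": 42}
adapt(profile)  # both the visual rule and the low-score rule fire
```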

  6. 17 CFR 151.11 - Designated contract market and swap execution facility position limits and accountability rules.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Designated contract market and swap execution facility position limits and accountability rules. (a) Spot... rules and procedures for monitoring and enforcing spot-month position limits set at levels no greater... monitoring and enforcing spot-month position limits set at levels no greater than 25 percent of estimated...

  7. 17 CFR 151.11 - Designated contract market and swap execution facility position limits and accountability rules.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Designated contract market and swap execution facility position limits and accountability rules. (a) Spot... rules and procedures for monitoring and enforcing spot-month position limits set at levels no greater... monitoring and enforcing spot-month position limits set at levels no greater than 25 percent of estimated...

  8. Engineering monitoring expert system's developer

    NASA Technical Reports Server (NTRS)

    Lo, Ching F.

    1991-01-01

    This research project is designed to apply artificial intelligence technology, including expert systems, dynamic interfaces to neural networks, and hypertext, to construct an expert system developer. The developer environment is specifically suited to building expert systems that monitor the performance of ground support equipment for propulsion systems and testing facilities. The expert system developer, through the use of a graphics interface and a rule network, will be transparent to the user during rule construction and data scanning of the knowledge base. The project will result in a software system that allows its users to build monitoring-type expert systems for equipment used in propulsion systems or ground testing facilities, and that accrues system performance information in a dynamic knowledge base.

  9. 77 FR 52977 - Regulatory Capital Rules: Advanced Approaches Risk-Based Capital Rule; Market Risk Capital Rule

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-30

    ... Corporation 12 CFR Parts 324, 325 Regulatory Capital Rules: Advanced Approaches Risk-Based Capital Rule... 325 RIN 3064-AD97 Regulatory Capital Rules: Advanced Approaches Risk-Based Capital Rule; Market Risk... the agencies' current capital rules. In this NPR (Advanced Approaches and Market Risk NPR) the...

  10. Design, Development and Analysis of Centrifugal Blower

    NASA Astrophysics Data System (ADS)

    Baloni, Beena Devendra; Channiwala, Salim Abbasbhai; Harsha, Sugnanam Naga Ramannath

    2018-06-01

    Centrifugal blowers are turbomachines widely used in modern industrial and domestic applications. Blower manufacturing seldom follows an optimum design solution for an individual blower. Although centrifugal blowers have been developed into highly efficient machines, design is still based on various empirical and semi-empirical rules proposed by fan designers, and different methodologies are used to design the impeller and other components. The objective of the present study is to examine existing design methodologies and trace a unified design that achieves better design-point performance. This unified design methodology rests on fundamental concepts and minimal assumptions. A parametric study is also carried out on the effect of design parameters on pressure ratio and their interdependency in the design. A design code based on the unified methodology is developed in C. Numerical analysis is carried out to check the flow parameters inside the blower. Two blowers, one based on the present design and the other on an industrial design, are fabricated with a standard OEM blower manufacturing unit. The two designs are compared through experimental performance analysis per the IS standard. The results show higher efficiency and greater flow rate at the same pressure head for the present design compared with the industrial one.

  11. 10 CFR Appendix A to Part 52 - Design Certification Rule for the U.S. Advanced Boiling Water Reactor

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ....34—Separate Plant Safety Parameter Display Console; 2. Paragraph (f)(2)(viii) of 10 CFR 50.34—Post... design bases or in the safety analyses. c. A proposed departure from Tier 2 affecting resolution of an ex... if: (1) There is a substantial increase in the probability of an ex-vessel severe accident such that...

  12. 10 CFR Appendix A to Part 52 - Design Certification Rule for the U.S. Advanced Boiling Water Reactor

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ....34—Separate Plant Safety Parameter Display Console; 2. Paragraph (f)(2)(viii) of 10 CFR 50.34—Post... design bases or in the safety analyses. c. A proposed departure from Tier 2 affecting resolution of an ex... if: (1) There is a substantial increase in the probability of an ex-vessel severe accident such that...

  13. Elements of decisional dynamics: An agent-based approach applied to artificial financial market

    NASA Astrophysics Data System (ADS)

    Lucas, Iris; Cotsaftis, Michel; Bertelle, Cyrille

    2018-02-01

    This paper introduces an original mathematical formalism for describing agents' decision-making processes in problems shaped by both individual and collective behaviors, in systems characterized by nonlinear, path-dependent, and self-organizing interactions. An application to artificial financial markets is proposed by designing a multi-agent system based on the proposed formalization. In this application, the agents' decision-making process is based on fuzzy logic rules, and the price dynamics is purely deterministic, following the basic matching rules of a central order book. Finally, with most parameters under evolutionary control, the computational agent-based system is able to replicate several stylized facts of financial time series (distributions of stock returns showing a heavy tail with positive excess kurtosis, absence of autocorrelations in stock returns, and the volatility clustering phenomenon).

  14. Elements of decisional dynamics: An agent-based approach applied to artificial financial market.

    PubMed

    Lucas, Iris; Cotsaftis, Michel; Bertelle, Cyrille

    2018-02-01

    This paper introduces an original mathematical formalism for describing agents' decision-making processes in problems shaped by both individual and collective behaviors, in systems characterized by nonlinear, path-dependent, and self-organizing interactions. An application to artificial financial markets is proposed by designing a multi-agent system based on the proposed formalization. In this application, the agents' decision-making process is based on fuzzy logic rules, and the price dynamics is purely deterministic, following the basic matching rules of a central order book. Finally, with most parameters under evolutionary control, the computational agent-based system is able to replicate several stylized facts of financial time series (distributions of stock returns showing a heavy tail with positive excess kurtosis, absence of autocorrelations in stock returns, and the volatility clustering phenomenon).
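
    The deterministic price formation via central order book matching that the abstract refers to can be sketched as a minimal price-time-priority limit order book. The data structures below are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of deterministic price formation from a central limit
# order book under price-time priority: incoming orders match against
# the best-priced resting orders on the opposite side, and trades
# execute at the resting order's price.
import heapq

class OrderBook:
    def __init__(self):
        self.bids = []   # max-heap via negated price: (-price, seq, qty)
        self.asks = []   # min-heap: (price, seq, qty)
        self.seq = 0
        self.last_price = None

    def submit(self, side, price, qty):
        """Match against the opposite side, then rest any remainder."""
        self.seq += 1
        book = self.asks if side == "buy" else self.bids
        while qty > 0 and book:
            top_price = book[0][0] if side == "buy" else -book[0][0]
            crosses = price >= top_price if side == "buy" else price <= top_price
            if not crosses:
                break
            p, s, q = heapq.heappop(book)
            trade = min(qty, q)
            qty -= trade
            self.last_price = top_price   # trade at the resting price
            if q > trade:
                heapq.heappush(book, (p, s, q - trade))
        if qty > 0:
            rest = self.bids if side == "buy" else self.asks
            key = -price if side == "buy" else price
            heapq.heappush(rest, (key, self.seq, qty))

book = OrderBook()
book.submit("sell", 101.0, 5)
book.submit("sell", 100.0, 5)
book.submit("buy", 100.5, 7)   # fills 5 @ 100.0, rests 2 @ 100.5
```

    Because every match follows these fixed rules, the price path is fully determined by the order flow the agents generate, which is the sense in which the paper's price dynamics is "purely deterministic".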

  15. Exploration of SWRL Rule Bases through Visualization, Paraphrasing, and Categorization of Rules

    NASA Astrophysics Data System (ADS)

    Hassanpour, Saeed; O'Connor, Martin J.; Das, Amar K.

    Rule bases are increasingly being used as repositories of knowledge content on the Semantic Web. As the size and complexity of these rule bases increases, developers and end users need methods of rule abstraction to facilitate rule management. In this paper, we describe a rule abstraction method for Semantic Web Rule Language (SWRL) rules that is based on lexical analysis and a set of heuristics. Our method results in a tree data structure that we exploit in creating techniques to visualize, paraphrase, and categorize SWRL rules. We evaluate our approach by applying it to several biomedical ontologies that contain SWRL rules, and show how the results reveal rule patterns within the rule base. We have implemented our method as a plug-in tool for Protégé-OWL, the most widely used ontology modeling software for the Semantic Web. Our tool can allow users to rapidly explore content and patterns in SWRL rule bases, enabling their acquisition and management.

  16. Tunneling magnetoresistance sensor with pT level 1/f magnetic noise

    NASA Astrophysics Data System (ADS)

    Deak, James G.; Zhou, Zhimin; Shen, Weifeng

    2017-05-01

    Magnetoresistive devices are important components in a large number of commercial electronic products across a wide range of applications, including industrial position sensors, automotive sensors, hard disk read heads, cell phone compasses, and solid state memories. These devices are commonly based on anisotropic magnetoresistance (AMR) and giant magnetoresistance (GMR), but over the past few years tunneling magnetoresistance (TMR) has been emerging in more applications. Here we focus on recent work that has enabled the development of TMR magnetic field sensors with 1/f noise of less than 100 pT/rtHz at 1 Hz. Of the commercially available sensors, the lowest-noise devices have typically been AMR, but they generally have the largest die size. Based on this observation and on modeling of the experimentally measured size and geometry dependence, we find that there is an optimal design rule that minimizes 1/f noise. This design rule requires maximizing the areal coverage of an on-chip flux concentrator, giving it the minimum possible total gap width, and tightly packing the gaps with MTJ elements, which increases the effective volume and decreases the saturation field of the MTJ free layers. When properly optimized using this rule, these sensors have noise below 60 pT/rtHz and could possibly replace fluxgate magnetometers in some applications.

  17. An evaluation of inferential procedures for adaptive clinical trial designs with pre-specified rules for modifying the sample size.

    PubMed

    Levin, Gregory P; Emerson, Sarah C; Emerson, Scott S

    2014-09-01

    Many papers have introduced adaptive clinical trial methods that allow modifications to the sample size based on interim estimates of treatment effect. There has been extensive commentary on type I error control and efficiency considerations, but little research on estimation after an adaptive hypothesis test. We evaluate the reliability and precision of different inferential procedures in the presence of an adaptive design with pre-specified rules for modifying the sampling plan. We extend group sequential orderings of the outcome space based on the stage at stopping, likelihood ratio statistic, and sample mean to the adaptive setting in order to compute median-unbiased point estimates, exact confidence intervals, and P-values uniformly distributed under the null hypothesis. The likelihood ratio ordering is found to yield shorter confidence intervals on average and higher probabilities of P-values below important thresholds than alternative approaches. The bias-adjusted mean demonstrates the lowest mean squared error among candidate point estimates. A conditional error-based approach in the literature has the benefit of being the only method that accommodates unplanned adaptations. We compare the performance of this and other methods in order to quantify the cost of failing to plan ahead in settings where adaptations could realistically be pre-specified at the design stage. We find the cost to be meaningful for all designs and treatment effects considered, and to be substantial for designs frequently proposed in the literature. © 2014, The International Biometric Society.

  18. Big data mining analysis method based on cloud computing

    NASA Astrophysics Data System (ADS)

    Cai, Qing Qiu; Cui, Hong Gang; Tang, Hao

    2017-08-01

    In the era of information explosion, big data's enormous scale and its discrete, unstructured, and semi-structured character have gone far beyond what traditional data management methods can handle. With the arrival of the cloud computing era, cloud computing provides a new technical means of analyzing massive data, effectively addressing the inability of traditional data mining methods to scale to massive datasets. This paper introduces the meaning and characteristics of cloud computing, analyzes the advantages of using cloud computing technology for data mining, designs an association rule mining algorithm based on the MapReduce parallel processing architecture, and carries out experimental verification. The parallel association rule mining algorithm based on a cloud computing platform can greatly improve the execution speed of data mining.
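
    The MapReduce-style association rule counting the abstract outlines can be sketched as a toy example. Real deployments would run the map and reduce phases on a cluster framework such as Hadoop; here both phases run in-process, and the transactions and thresholds are invented for illustration.

```python
# Toy sketch of parallel association-rule mining in MapReduce style:
# mappers count itemsets per transaction chunk, the reducer sums the
# partial counts, and rules A -> B are derived from support/confidence.
from collections import Counter
from itertools import combinations

def map_phase(chunk):
    """Emit counts for every single item and item pair in a chunk."""
    counts = Counter()
    for transaction in chunk:
        items = sorted(set(transaction))
        for item in items:
            counts[(item,)] += 1
        for pair in combinations(items, 2):
            counts[pair] += 1
    return counts

def reduce_phase(partials):
    """Sum the partial counts from all mappers."""
    total = Counter()
    for c in partials:
        total.update(c)
    return total

def rules(counts, n_transactions, min_support=0.4, min_conf=0.6):
    """Derive A -> B rules from pair counts meeting both thresholds."""
    out = []
    for itemset, cnt in counts.items():
        if len(itemset) == 2 and cnt / n_transactions >= min_support:
            a, b = itemset
            for lhs, rhs in ((a, b), (b, a)):
                conf = cnt / counts[(lhs,)]
                if conf >= min_conf:
                    out.append((lhs, rhs, conf))
    return out

chunks = [
    [["bread", "milk"], ["bread", "butter"]],      # mapper 1's chunk
    [["bread", "milk", "butter"], ["milk"]],       # mapper 2's chunk
]
counts = reduce_phase(map_phase(c) for c in chunks)
found = rules(counts, n_transactions=4)
```

    The speedup claim in the abstract comes from the map phase: each chunk is counted independently, so mappers can run on separate nodes with only the small partial count tables shipped to the reducer.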

  19. Model based high NA anamorphic EUV RET

    NASA Astrophysics Data System (ADS)

    Jiang, Fan; Wiaux, Vincent; Fenger, Germain; Clifford, Chris; Liubich, Vlad; Hendrickx, Eric

    2018-03-01

    With the announcement of the extension of the Extreme Ultraviolet (EUV) roadmap to a high-NA lithography tool with an anamorphic optics design, we investigate design tradeoffs unique to imaging with an anamorphic lithography tool. An anamorphic optical proximity correction (OPC) solution has been developed that fully models the EUV near-field electromagnetic effects and the anamorphic imaging using the Domain Decomposition Method (DDM). Imec layout clips representative of the N3 logic node were used to demonstrate the OPC solutions on critical layers that will benefit from the increased contrast of anamorphic imaging at high NA. Unlike the isomorphic case, however, OPC must treat x and y differently from the wafer perspective. In this paper we show a design trade-off unique to anamorphic EUV: with a mask rule of 48nm (mask scale), approaching the current state of the art, limitations are observed in the correction that can be applied to the mask. The metal pattern has a pitch of 24nm and a CD of 12nm. During OPC, the correction of vertically oriented metal lines is limited by the mask rule of 12nm at 1X. Horizontally oriented lines do not suffer from this limitation, as their correction is allowed to go to 6nm at 1X. For this example, the mask rules would need to be more aggressive to allow complete correction, or design rules and wafer processes (wafer rotation) would need to be created that exploit the orientation that can image more aggressive features. When considering via or block level correction, aggressive polygon corner-to-corner designs can be handled with various solutions, including applying a 45-degree chop. Multiple solutions are discussed using the metrics of edge placement error (EPE) and process variation bands (PVBands), together with all the mask constraints. Note that in anamorphic OPC the 45-degree chop is maintained at the mask level to meet mask manufacturing constraints, but results in a skewed-angle edge in the wafer-level correction. In this paper we use both contact (via/block) patterns and metal patterns for the OPC experiments. By comparing the EPE of horizontal and vertical patterns under a fixed mask rule check (MRC), along with the PVBands, we focus on the challenges and solutions of OPC with an anamorphic high-NA lens.

  20. Online intelligent controllers for an enzyme recovery plant: design methodology and performance.

    PubMed

    Leite, M S; Fujiki, T L; Silva, F V; Fileti, A M F

    2010-12-27

    This paper focuses on the development of intelligent controllers for use in a process of enzyme recovery from pineapple rind. The proteolytic enzyme bromelain (EC 3.4.22.4) is precipitated with alcohol at low temperature in a fed-batch jacketed tank. Temperature control is crucial to avoid irreversible protein denaturation. Fuzzy or neural controllers offer a way of implementing solutions that cover dynamic and nonlinear processes. The design methodology and a comparative study on the performance of fuzzy-PI, neurofuzzy, and neural network intelligent controllers are presented. To tune the fuzzy PI Mamdani controller, various universes of discourse, rule bases, and membership function support sets were tested. A neurofuzzy inference system (ANFIS), based on Takagi-Sugeno rules, and a model predictive controller, based on neural modeling, were developed and tested as well. Using a Fieldbus network architecture, a coolant variable speed pump was driven by the controllers. The experimental results show the effectiveness of fuzzy controllers in comparison to the neural predictive control. The fuzzy PI controller exhibited a reduced error parameter (ITAE), lower power consumption, and better recovery of enzyme activity.

  1. Online Intelligent Controllers for an Enzyme Recovery Plant: Design Methodology and Performance

    PubMed Central

    Leite, M. S.; Fujiki, T. L.; Silva, F. V.; Fileti, A. M. F.

    2010-01-01

    This paper focuses on the development of intelligent controllers for use in a process of enzyme recovery from pineapple rind. The proteolytic enzyme bromelain (EC 3.4.22.4) is precipitated with alcohol at low temperature in a fed-batch jacketed tank. Temperature control is crucial to avoid irreversible protein denaturation. Fuzzy or neural controllers offer a way of implementing solutions that cover dynamic and nonlinear processes. The design methodology and a comparative study on the performance of fuzzy-PI, neurofuzzy, and neural network intelligent controllers are presented. To tune the fuzzy PI Mamdani controller, various universes of discourse, rule bases, and membership function support sets were tested. A neurofuzzy inference system (ANFIS), based on Takagi-Sugeno rules, and a model predictive controller, based on neural modeling, were developed and tested as well. Using a Fieldbus network architecture, a coolant variable speed pump was driven by the controllers. The experimental results show the effectiveness of fuzzy controllers in comparison to the neural predictive control. The fuzzy PI controller exhibited a reduced error parameter (ITAE), lower power consumption, and better recovery of enzyme activity. PMID:21234106
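
    The fuzzy rule-base idea behind the controllers compared above can be sketched compactly. For brevity this sketch uses weighted-average defuzzification over singleton consequents (Takagi-Sugeno style) rather than the paper's Mamdani controller with centroid defuzzification, and all membership shapes, labels, and output values are invented for the example.

```python
# Compact illustrative sketch of one fuzzy control step: temperature
# error e and its change de (both normalized to [-1, 1]) map through a
# 3x3 rule base to an increment in coolant pump speed.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Linguistic terms over the normalized universe.
MF = {
    "N": lambda x: tri(x, -2.0, -1.0, 0.0),   # negative
    "Z": lambda x: tri(x, -1.0,  0.0, 1.0),   # zero
    "P": lambda x: tri(x,  0.0,  1.0, 2.0),   # positive
}

# (error term, change term) -> singleton output for delta-u.
RULES = {
    ("N", "N"): -1.0, ("N", "Z"): -0.5, ("N", "P"): 0.0,
    ("Z", "N"): -0.5, ("Z", "Z"):  0.0, ("Z", "P"): 0.5,
    ("P", "N"):  0.0, ("P", "Z"):  0.5, ("P", "P"): 1.0,
}

def fuzzy_step(e, de):
    """Weighted average of rule outputs; firing strength = min (AND)."""
    num = den = 0.0
    for (te, tde), out in RULES.items():
        w = min(MF[te](e), MF[tde](de))
        num += w * out
        den += w
    return num / den if den else 0.0

fuzzy_step(0.0, 0.0)   # at setpoint and stable: hold the pump speed
fuzzy_step(1.0, 0.0)   # error positive: raise the speed
```

    Tuning such a controller, as the paper describes, amounts to adjusting the universes of discourse, the rule base, and the membership support sets until the closed-loop response is satisfactory.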

  2. Multi-hazard Assessment and Scenario Toolbox (MhAST): A Framework for Analyzing Compounding Effects of Multiple Hazards

    NASA Astrophysics Data System (ADS)

    Sadegh, M.; Moftakhari, H.; AghaKouchak, A.

    2017-12-01

    Many natural hazards are driven by multiple forcing variables, and concurrent or consecutive extreme events significantly increase the risk of infrastructure/system failure. It is common practice to use univariate analysis based upon a perceived ruling driver to estimate design quantiles and/or return periods of extreme events. A multivariate analysis, however, permits modeling the simultaneous occurrence of multiple forcing variables. In this presentation, we introduce the Multi-hazard Assessment and Scenario Toolbox (MhAST), which comprehensively analyzes marginal and joint probability distributions of natural hazards. MhAST also offers a wide range of return period and design level scenarios and their likelihoods. The contribution of this study is four-fold: 1. comprehensive analysis of the marginal and joint probability of multiple drivers through 17 continuous distributions and 26 copulas; 2. multiple scenario analysis of concurrent extremes based upon the most likely joint occurrence, one ruling variable, and weighted random sampling of joint occurrences with similar exceedance probabilities; 3. weighted-average scenario analysis based on an expected event; and 4. uncertainty analysis of the most likely joint occurrence scenario using a Bayesian framework.

  3. A Data Stream Model For Runoff Simulation In A Changing Environment

    NASA Astrophysics Data System (ADS)

    Yang, Q.; Shao, J.; Zhang, H.; Wang, G.

    2017-12-01

    Runoff simulation is of great significance for water engineering design, water disaster control, and water resources planning and management in a catchment or region. A large number of methods, including concept-based process-driven models and statistic-based data-driven models, have been proposed and widely used worldwide during past decades. Most existing models assume that the relationship between runoff and its impacting factors is stationary. However, in a changing environment (e.g., climate change, human disturbance), this relationship usually evolves over time. In this study, we propose a data stream model for runoff simulation in a changing environment. The proposed model works in three steps: learning a rule set, expanding rules, and simulation. The first step initializes a rule set. When a new observation arrives, the model checks which rule covers it and then uses that rule for simulation. Meanwhile, the Page-Hinckley (PH) change detection test monitors the online simulation error of each rule; if a change is detected, the corresponding rule is removed from the rule set. In the second step, any rule that covers more than a given number of instances is expanded. In the third step, a simulation model for each leaf node is learnt with a perceptron without an activation function and is updated as new observations arrive. Taking the Fuxi River catchment as a case study, we applied the model to simulate monthly runoff in the catchment. Results show that an abrupt change is detected in 1997 by the Page-Hinckley change detection test, which is consistent with the historical record of flooding. In addition, the model achieves good simulation results, with an RMSE of 13.326, and outperforms many established methods. The findings demonstrate that the proposed data stream model provides a promising way to simulate runoff in a changing environment.
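
    The Page-Hinckley test used above to monitor online simulation error can be sketched in a few lines: it accumulates deviations of the stream from its running mean and alarms when the cumulative sum rises more than a threshold above its historical minimum. The parameter values below are invented for illustration.

```python
# Illustrative Page-Hinckley (PH) change detection test for an upward
# drift in a data stream: delta is the tolerated magnitude of change,
# lam the detection threshold. Parameter values are invented.

class PageHinckley:
    def __init__(self, delta=0.005, lam=5.0):
        self.delta, self.lam = delta, lam
        self.n = 0
        self.mean = 0.0     # running mean of the stream
        self.cum = 0.0      # cumulative deviation m_t
        self.cum_min = 0.0  # minimum of m_t seen so far

    def update(self, x):
        """Feed one value; return True when a change is detected."""
        self.n += 1
        self.mean += (x - self.mean) / self.n
        self.cum += x - self.mean - self.delta
        self.cum_min = min(self.cum_min, self.cum)
        return self.cum - self.cum_min > self.lam

ph = PageHinckley(delta=0.1, lam=2.0)
stream = [0.0] * 30 + [1.0] * 30     # mean jumps at index 30
alarm_at = next(i for i, x in enumerate(stream) if ph.update(x))
# the alarm fires a few samples after the shift at index 30
```

    In the rule-set model described above, each rule would own one such detector over its simulation errors and be dropped from the rule set when its detector fires.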

  4. The research on construction and application of machining process knowledge base

    NASA Astrophysics Data System (ADS)

    Zhao, Tan; Qiao, Lihong; Qie, Yifan; Guo, Kai

    2018-03-01

    In order to apply knowledge to machining process design, viewed from the perspective of knowledge use in computer-aided process planning (CAPP), a hierarchical knowledge classification structure is established according to the characteristics of the mechanical engineering field. Machining process knowledge is expressed in structured form by means of production rules and object-oriented methods. Three kinds of knowledge base models are constructed according to this representation of machining process knowledge. This paper presents the definition and classification of machining process knowledge, the knowledge models, and the application flow of knowledge-based process design; as an application, the main steps of machine tool selection are carried out using the knowledge base.

  5. 75 FR 58453 - Self-Regulatory Organizations; NYSE Amex LLC; Notice of Filing of a Proposed Rule Change Amending...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-24

    ... Equities Rule 104 To Adopt Pricing Obligations for Designated Market Makers September 20, 2010. Pursuant to... pricing obligations for Designated Market Makers (``DMMs''). The text of the proposed rule change is... adopt pricing obligations for DMMs. Under the proposal, the Exchange will require DMMs to continuously...

  6. 77 FR 33794 - Self-Regulatory Organizations; International Securities Exchange, LLC; Notice of Designation of a...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-07

    ... Organizations; International Securities Exchange, LLC; Notice of Designation of a Longer Period for Commission Action on Proposed Rule Change To List and Trade Option Contracts Overlying 10 Shares of a Security June... Act of 1934 (``Act'') \\1\\ and Rule 19b-4 thereunder,\\2\\ a proposed rule change to list and trade...

  7. 76 FR 44388 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Designation of Longer...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-25

    ...-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Designation of Longer Period for Commission Action on Proceedings To Determine Whether To Approve or Disapprove Proposed Rule Change To Link Market... Rule 19b-4 thereunder,\\2\\ a proposed rule change to discount certain market data fees and increase...

  8. 77 FR 28653 - Self-Regulatory Organizations; Municipal Securities Rulemaking Board; Notice of a Designation of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-15

    ...; Proposed Amendments to Rule G-8, on Books and Records, Rule G- 9, on Record Retention, and Rule G-18, on... of proposed MSRB Rule G-43, on broker's brokers; amendments to MSRB Rule G-8, on books and records...

  9. Surrogate-Assisted Genetic Programming With Simplified Models for Automated Design of Dispatching Rules.

    PubMed

    Nguyen, Su; Zhang, Mengjie; Tan, Kay Chen

    2017-09-01

    Automated design of dispatching rules for production systems has been an interesting research topic over the last several years. Machine learning, especially genetic programming (GP), has been a powerful approach to this design problem. However, intensive computational requirements, accuracy, and interpretability remain its limitations. This paper aims at developing a new surrogate-assisted GP to help improve the quality of the evolved rules without significant computational cost. The experiments have verified the effectiveness and efficiency of the proposed algorithms compared to those in the literature. Furthermore, new simplification and visualisation approaches have also been developed to improve the interpretability of the evolved rules. These approaches show great potential and prove to be a critical part of the automated design system.

  10. 75 FR 47063 - Mutual Fund Distribution Fees; Confirmations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-04

    ... competition for distribution services. The proposed rule and rule amendments are designed to protect... designed to enhance investor understanding of those charges, limit the cumulative sales charges each...(b) was designed to protect funds from being charged excessive sales and promotional expenses.\\26...

  11. Parameterized CAD techniques implementation for the fatigue behaviour optimization of a service chamber

    NASA Astrophysics Data System (ADS)

    Sánchez, H. T.; Estrems, M.; Franco, P.; Faura, F.

    2009-11-01

    In recent years, the market for heat exchangers has increasingly demanded new products with short cycle times, which means that both the design and manufacturing stages must be drastically shortened. The design stage can be shortened by means of CAD-based parametric design techniques. The methodology presented here is based on optimized control of the geometric parameters of a heat exchanger's service chamber through the Application Programming Interface (API) provided by the SolidWorks CAD package. Using this implementation, a set of different design configurations of the service chamber, made of stainless steel AISI 316, is studied by means of the FE method. As a result of this study, a set of knowledge rules based on fatigue behaviour is constructed and integrated into the design optimization process.

  12. Fast vaccine design and development based on correlates of protection (COPs)

    PubMed Central

    van Els, Cécile; Mjaaland, Siri; Næss, Lisbeth; Sarkadi, Julia; Gonczol, Eva; Smith Korsholm, Karen; Hansen, Jon; de Jonge, Jørgen; Kersten, Gideon; Warner, Jennifer; Semper, Amanda; Kruiswijk, Corine; Oftung, Fredrik

    2014-01-01

    New and reemerging infectious diseases call for innovative and efficient control strategies, of which fast vaccine design and development represent an important element. In emergency situations, when time is limited, identification and use of correlates of protection (COPs) may play a key role as a strategic tool for accelerated vaccine design, testing, and licensure. We propose that general rules for COP-based vaccine design can be extracted from the existing knowledge of protective immune responses against a large spectrum of relevant viral and bacterial pathogens. Herein, we focus on the applicability of this approach by reviewing the established and upcoming COPs for influenza in the context of traditional and a wide array of new vaccine concepts. The lessons learnt from this field may be applied more generally to COP-based accelerated vaccine design for emerging infections. PMID:25424803

  13. Benefits of Australian Design Rule 69 (full frontal crash protection) and airbags in frontal crashes in Australia.

    PubMed

    Fitzharris, Michael; Fildes, Brian; Newstead, Stuart; Logan, David

    2004-01-01

    In-depth data at MUARC was used to evaluate the Australian Design Rule 69 (ADR69) - Full frontal dynamic crash requirement, as well as the effectiveness of frontal airbag deployment on injury risk and associated cost of injury. ADR69 was introduced in Australia in mid-1995 and was based largely on the US equivalent FMVSS-208. The results indicate reductions in excess of 90% in the likelihood of sustaining AIS 2+ injuries in body regions where frontal airbags would be expected to benefit. The average injury cost savings for drivers of post-ADR69 manufactured vehicles was found to be up to AUD$19,000 depending on body region considered. Limitations and implications of these findings are discussed.

  14. Benefits of Australian Design Rule 69 (full frontal crash protection) and airbags in frontal crashes in Australia

    PubMed Central

    Fitzharris, Michael; Fildes, Brian; Newstead, Stuart; Logan, David

    2004-01-01

    In-depth data at MUARC was used to evaluate the Australian Design Rule 69 (ADR69) - Full frontal dynamic crash requirement, as well as the effectiveness of frontal airbag deployment on injury risk and associated cost of injury. ADR69 was introduced in Australia in mid-1995 and was based largely on the US equivalent FMVSS-208. The results indicate reductions in excess of 90% in the likelihood of sustaining AIS 2+ injuries in body regions where frontal airbags would be expected to benefit. The average injury cost savings for drivers of post-ADR69 manufactured vehicles was found to be up to AUD$19,000 depending on body region considered. Limitations and implications of these findings are discussed.

  15. Layout pattern analysis using the Voronoi diagram of line segments

    NASA Astrophysics Data System (ADS)

    Dey, Sandeep Kumar; Cheilaris, Panagiotis; Gabrani, Maria; Papadopoulou, Evanthia

    2016-01-01

    Early identification of problematic patterns in very large scale integration (VLSI) designs is of great value as the lithographic simulation tools face significant timing challenges. To reduce the processing time, such a tool selects only a fraction of possible patterns which have a probable area of failure, with the risk of missing some problematic patterns. We introduce a fast method to automatically extract patterns based on their structure and context, using the Voronoi diagram of line-segments as derived from the edges of VLSI design shapes. Designers put line segments around the problematic locations in patterns called "gauges," along which the critical distance is measured. The gauge center is the midpoint of a gauge. We first use the Voronoi diagram of VLSI shapes to identify possible problematic locations, represented as gauge centers. Then we use the derived locations to extract windows containing the problematic patterns from the design layout. The problematic locations are prioritized by the shape and proximity information of the design polygons. We perform experiments for pattern selection in a portion of a 22-nm random logic design layout. The design layout had 38,584 design polygons (consisting of 199,946 line segments) on layer Mx, and 7079 markers generated by an optical rule checker (ORC) tool. The optical rules specify requirements for printing circuits with minimum dimension. Markers are the locations of some optical rule violations in the layout. We verify our approach by comparing the coverage of our extracted patterns to the ORC-generated markers. We further derive a similarity measure between patterns and between layouts. The similarity measure helps to identify a set of representative gauges that reduces the number of patterns for analysis.
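
    The Voronoi diagram of line segments partitions the plane by nearest segment, so a brute-force nearest-segment query gives the same answer pointwise. The sketch below uses that equivalence to compute the critical distance at a gauge center; it is an illustration of the underlying geometry, not the paper's algorithm.

```python
# Illustrative sketch: distance from a gauge center to the nearest layout
# edge, computed by brute force over the segments (the pointwise equivalent
# of a segment Voronoi query). Coordinates are hypothetical.
import math

def dist_point_segment(p, a, b):
    """Euclidean distance from point p to segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))            # clamp projection onto the segment
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def critical_distance(gauge_center, segments):
    """Distance from a gauge center to the nearest design-shape edge."""
    return min(dist_point_segment(gauge_center, a, b) for a, b in segments)

# Two horizontal wire edges; the gauge center sits between them.
segments = [((0, 0), (10, 0)), ((0, 6), (10, 6))]
print(critical_distance((5, 2), segments))  # → 2.0
```

    The actual method derives candidate gauge centers from the Voronoi diagram itself, which finds such proximity-critical locations without testing every point against every segment.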

  16. ICE System: Interruptible control expert system. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Vezina, James M.

    1990-01-01

    The Interruptible Control Expert (ICE) System is based on an architecture designed to provide a strong foundation for real-time production rule expert systems. Three principles guide the development of ICE. First, a practical delivery platform must be provided: no specialized hardware can be used to compensate for deficiencies in the software design. Second, knowledge of the environment and the rule-base is exploited to improve the performance of a delivered system. Third, the system responds to the most critical event at the expense of more trivial tasks: minimal time is spent classifying the potential importance of environmental events, with the majority of the time used for finding the responses. A feature derived from all three principles is the absence of working memory. By using a priori information, a fixed amount of memory can be specified for the hardware platform, and the lack of working memory removes the danger of garbage collection during continuous operation of the controller.

  17. Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods.

    PubMed

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J Sunil

    2014-08-01

    We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event type of response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method called "Patient Recursive Survival Peeling" is a rule-induction method that makes use of specific peeling criteria such as hazard ratio or log-rank statistics. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e. decision-box) and survival estimation. This alternative technique, called "combined" cross-validation is done by combining test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication.
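
    The "combined" cross-validation idea can be sketched as follows: instead of computing a statistic fold by fold and averaging, pool the held-out samples from every CV loop and compute the statistic once on the combined test set. The toy model and error metric below are illustrative, not the paper's survival setting.

```python
# Sketch of combined cross-validation: pool held-out predictions across
# folds, then compute one statistic on the pooled test samples.
def k_fold_indices(n, k):
    """Yield (train, test) index lists for k contiguous folds."""
    fold = n // k
    for i in range(k):
        test = list(range(i * fold, (i + 1) * fold if i < k - 1 else n))
        train = [j for j in range(n) if j not in test]
        yield train, test

def combined_cv(x, y, fit, predict, k=5):
    """Pool held-out predictions over the CV loops, then score once."""
    pooled_pred, pooled_true = [], []
    for train, test in k_fold_indices(len(x), k):
        model = fit([x[i] for i in train], [y[i] for i in train])
        pooled_pred += [predict(model, x[i]) for i in test]
        pooled_true += [y[i] for i in test]
    # one statistic on the combined test set (here: mean absolute error)
    return sum(abs(p - t) for p, t in zip(pooled_pred, pooled_true)) / len(pooled_true)

# Toy model: always predict the training mean.
fit = lambda xs, ys: sum(ys) / len(ys)
predict = lambda model, xi: model
y = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0]
err = combined_cv(list(range(10)), y, fit, predict, k=5)
print(round(err, 3))  # → 3.1
```

    Pooling matters for bump hunting because a single fold may contain too few events inside a "box" to estimate the survival statistic stably.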

  18. Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods

    PubMed Central

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J. Sunil

    2015-01-01

    We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event type of response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method called “Patient Recursive Survival Peeling” is a rule-induction method that makes use of specific peeling criteria such as hazard ratio or log-rank statistics. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e. decision-box) and survival estimation. This alternative technique, called “combined” cross-validation is done by combining test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication. PMID:26997922

  19. Towards the design of novel cuprate-based superconductors

    NASA Astrophysics Data System (ADS)

    Yee, Chuck-Hou

    The rapid maturation of materials databases combined with recent development of theories seeking to quantitatively link chemical properties to superconductivity in the cuprates provide the context to design novel superconductors. In this talk, we describe a framework designed to search for new superconductors, which combines chemical rules-of-thumb, insights of transition temperatures from dynamical mean-field theory, first-principles electronic structure tools, materials databases and structure prediction via evolutionary algorithms. We apply the framework to design a family of copper oxysulfides and evaluate the prospects of superconductivity.

  20. Tips for Teachers of Evidence-based Medicine: Clinical Prediction Rules (CPRs) and Estimating Pretest Probability

    PubMed Central

    McGinn, Thomas; Jervis, Ramiro; Wisnivesky, Juan; Keitz, Sheri

    2008-01-01

    Background Clinical prediction rules (CPR) are tools that clinicians can use to predict the most likely diagnosis, prognosis, or response to treatment in a patient based on individual characteristics. CPRs attempt to standardize, simplify, and increase the accuracy of clinicians’ diagnostic and prognostic assessments. The teaching tips series is designed to give teachers advice and materials they can use to attain specific educational objectives. Educational Objectives In this article, we present 3 teaching tips aimed at helping clinical learners use clinical prediction rules and more accurately assess pretest probability in everyday practice. The first tip is designed to demonstrate variability in physician estimation of pretest probability. The second tip demonstrates how the estimate of pretest probability influences the interpretation of diagnostic tests and patient management. The third tip exposes learners to various examples and different types of CPRs and how to apply them in practice. Pilot Testing We field tested all 3 tips with 16 learners, a mix of interns and senior residents. Teacher preparatory time was approximately 2 hours. The field test utilized a board and a data projector; 3 handouts were prepared. The tips were felt to be clear and the educational objectives reached. Potential teaching pitfalls were identified. Conclusion Teaching with these tips will help physicians appreciate the importance of applying evidence to their everyday decisions. In 2 or 3 short teaching sessions, clinicians can also become familiar with the use of CPRs in applying evidence consistently in everyday practice. PMID:18491194
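
    The second tip (how pretest probability changes test interpretation) can be made concrete with the odds form of Bayes' theorem. The likelihood ratio of 10 below is an illustrative value, not taken from the article.

```python
# Worked example: the same positive test result yields very different
# posttest probabilities depending on the clinician's pretest estimate.
def posttest_probability(pretest_p, likelihood_ratio):
    """Convert probability to odds, apply the likelihood ratio, convert back."""
    pretest_odds = pretest_p / (1 - pretest_p)
    posttest_odds = pretest_odds * likelihood_ratio
    return posttest_odds / (1 + posttest_odds)

# A positive test with LR+ = 10 (hypothetical), read against two estimates:
low = posttest_probability(0.05, 10)   # clinician A: disease unlikely pretest
high = posttest_probability(0.50, 10)  # clinician B: coin-flip pretest
print(round(low, 3), round(high, 3))   # → 0.345 0.909
```

    The spread between 34.5% and 90.9% for the identical test result is exactly the variability the first and second tips are designed to expose.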

  1. Environmental Compliance and Pollution Prevention Training Manual for Campus-Based Organizations--Operational and Facility Maintenance Personnel.

    ERIC Educational Resources Information Center

    New York State Dept. of Environmental Conservation, Albany.

    This manual was designed to be used as part of the Workshop on Environmental Compliance and Pollution Prevention for campus-based facilities. It contains basic information on New York state and federal laws, rules, and regulations for protecting the environment. The information presented is a summary with emphasis on those items believed to be…

  2. NASIS data base management system - IBM 360/370 OS MVT implementation. 1: Installation standards

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The installation standards for the NASA Aerospace Safety Information System (NASIS) data base management system are presented. The standard approach to preparing systems documentation and the program design and coding rules and conventions are outlined. Included are instructions for preparing all major specifications and suggestions for improving the quality and efficiency of the programming task.

  3. Reinforcement learning design-based adaptive tracking control with less learning parameters for nonlinear discrete-time MIMO systems.

    PubMed

    Liu, Yan-Jun; Tang, Li; Tong, Shaocheng; Chen, C L Philip; Li, Dong-Juan

    2015-01-01

    Based on the neural network (NN) approximator, an online reinforcement learning algorithm is proposed for a class of affine multiple input and multiple output (MIMO) nonlinear discrete-time systems with unknown functions and disturbances. In the design procedure, two networks are provided: one is an action network to generate an optimal control signal, and the other is a critic network to approximate the cost function. An optimal control signal and adaptation laws can be generated based on the two NNs. In previous approaches, the weights of the critic and action networks are updated based on the gradient descent rule, and the estimations of the optimal weight vectors are directly adjusted in the design. Consequently, compared with existing results, the main contributions of this paper are: 1) only two parameters need to be adjusted, so the number of adaptation laws is smaller than in previous results; and 2) the updating parameters do not depend on the number of subsystems for MIMO systems, and the tuning rules are replaced by adjusting the norms of the optimal weight vectors in both the action and critic networks. It is proven that the tracking errors, the adaptation laws, and the control inputs are uniformly bounded using the Lyapunov analysis method. Simulation examples are employed to illustrate the effectiveness of the proposed algorithm.
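
    The headline idea, adapting one scalar per network (an estimate of the optimal weight vector's norm) rather than every weight, can be sketched in toy form. The update law, gains, and error signal below are illustrative inventions, not the paper's adaptation laws.

```python
# Toy sketch: two adapted parameters total, one norm estimate for the
# critic and one for the actor, each driven by a gradient-like term with
# a small sigma-leakage term for boundedness. All values are hypothetical.
def adapt_norm(theta_hat, error, basis_norm, gain=0.1, leak=0.01):
    """One norm-estimate update: gradient-like term plus leakage."""
    return theta_hat + gain * error * basis_norm - gain * leak * theta_hat

critic_norm, actor_norm = 0.0, 0.0
for step in range(200):
    tracking_error = 1.0 / (1 + step)    # toy decaying error signal
    critic_norm = adapt_norm(critic_norm, tracking_error, basis_norm=0.5)
    actor_norm = adapt_norm(actor_norm, tracking_error, basis_norm=0.8)
print(critic_norm > 0, actor_norm > critic_norm)  # → True True
```

    The point of the construction is the parameter count: a full weight-vector update would adapt every NN weight, whereas here each network contributes exactly one adapted scalar regardless of network size.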

  4. Liquid Rocket Lines, Bellows, Flexible Hoses, and Filters

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Fluid-flow components in a liquid propellant rocket engine and the rocket vehicle which it propels are interconnected by lines, bellows, and flexible hoses. Elements involved in the successful design of these components are identified and current technologies pertaining to these elements are reviewed, assessed, and summarized to provide a technology base for a checklist of rules to be followed by project managers in guiding a design or assessing its adequacy. Recommended procedures for satisfying each of the design criteria are included.

  5. On Decision-Making Among Multiple Rule-Bases in Fuzzy Control Systems

    NASA Technical Reports Server (NTRS)

    Tunstel, Edward; Jamshidi, Mo

    1997-01-01

    Intelligent control of complex multi-variable systems can be a challenge for single fuzzy rule-based controllers. This class of problems can often be managed with less difficulty by distributing intelligent decision-making amongst a collection of rule-bases. Such an approach requires that a mechanism be chosen to ensure goal-oriented interaction between the multiple rule-bases. In this paper, a hierarchical rule-based approach is described. Decision-making mechanisms based on generalized concepts from single-rule-based fuzzy control are described. Finally, the effects of different aggregation operators on multi-rule-base decision-making are examined in a navigation control problem for mobile robots.
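
    The role of the aggregation operator can be sketched as follows: each rule-base emits a membership degree for the same candidate action, and the operator combines them before defuzzification. The rule-base names, degrees, and operator set are illustrative, not the paper's.

```python
# Sketch: combining recommendations from multiple fuzzy rule-bases with
# different aggregation operators. Values are hypothetical.
def aggregate(degrees, operator='max'):
    """Combine membership degrees emitted by several rule-bases."""
    if operator == 'max':   # optimistic: any rule-base may support the action
        return max(degrees)
    if operator == 'min':   # conservative: all rule-bases must support it
        return min(degrees)
    if operator == 'mean':  # compromise between the rule-bases
        return sum(degrees) / len(degrees)
    raise ValueError(operator)

# Three rule-bases (say goal-seek, obstacle-avoid, wall-follow) score "turn left":
degrees = [0.8, 0.3, 0.6]
for op in ('max', 'min', 'mean'):
    print(op, round(aggregate(degrees, op), 3))
```

    Because the operators rank the same evidence differently (here 0.8 vs 0.3 vs 0.567), the robot's chosen action can change with the operator alone, which is the effect the navigation experiments examine.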

  6. Designing efficient nitrous oxide sampling strategies in agroecosystems using simulation models

    NASA Astrophysics Data System (ADS)

    Saha, Debasish; Kemanian, Armen R.; Rau, Benjamin M.; Adler, Paul R.; Montes, Felipe

    2017-04-01

    Annual cumulative soil nitrous oxide (N2O) emissions calculated from discrete chamber-based flux measurements have unknown uncertainty. We used outputs from simulations obtained with an agroecosystem model to design sampling strategies that yield accurate cumulative N2O flux estimates with a known uncertainty level. Daily soil N2O fluxes were simulated for Ames, IA (corn-soybean rotation), College Station, TX (corn-vetch rotation), Fort Collins, CO (irrigated corn), and Pullman, WA (winter wheat), representing diverse agro-ecoregions of the United States. Fertilization source, rate, and timing were site-specific. These simulated fluxes served as surrogates for daily measurements in the analysis. We "sampled" the fluxes using a fixed-interval (1-32 days) or a rule-based (decision-tree-based) sampling method. Two types of decision trees were built: a high-input tree (HI) that included soil inorganic nitrogen (SIN) as a predictor variable, and a low-input tree (LI) that excluded SIN. Other predictor variables were identified with Random Forest. The decision trees were inverted to be used as rules for sampling a representative number of members from each terminal node. The uncertainty of the annual N2O flux estimate increased with the fixed interval length. A 4- and 8-day fixed sampling interval was required at College Station and Ames, respectively, to yield ±20% accuracy in the flux estimate; a 12-day interval rendered the same accuracy at Fort Collins and Pullman. Both the HI and the LI rule-based methods provided the same accuracy as the fixed-interval method with up to a 60% reduction in sampling events, particularly at locations with greater temporal flux variability. For instance, at Ames, the HI rule-based and the fixed-interval methods required 16 and 91 sampling events, respectively, to achieve the same absolute bias of 0.2 kg N ha-1 yr-1 in estimating cumulative N2O flux. These results suggest that using simulation models along with decision trees can reduce the cost and improve the accuracy of the estimations of cumulative N2O fluxes using the discrete chamber-based method.
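
    The fixed-interval estimate evaluated in the study can be sketched as follows: sample a daily flux series every k days, take the sample mean, and scale to the year. The synthetic series below (a low baseline with short post-fertilization pulses) is illustrative only; it shows why longer intervals that miss pulses bias the annual estimate.

```python
# Sketch of the fixed-interval cumulative-flux estimate on a synthetic
# daily N2O series (baseline plus two 5-day emission pulses; values are
# hypothetical, in arbitrary flux units per day).
def annual_estimate(daily_flux, interval):
    """Cumulative-flux estimate from sampling every `interval`-th day."""
    samples = daily_flux[::interval]
    return sum(samples) / len(samples) * len(daily_flux)

daily = [0.002] * 365
for day in list(range(120, 125)) + list(range(180, 185)):
    daily[day] = 0.05          # fertilization-driven emission pulses

true_total = sum(daily)
for k in (1, 4, 16):
    est = annual_estimate(daily, k)
    bias_pct = 100 * (est - true_total) / true_total
    print(k, round(est, 3), round(bias_pct, 1))
```

    With k=1 the estimate is exact; k=4 happens to oversample the pulses (positive bias), while k=16 misses them entirely (large negative bias), mirroring the interval-dependent uncertainty the study quantifies.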

  7. The use of twin-screen-based WIMPS in spacecraft control

    NASA Astrophysics Data System (ADS)

    Klim, R. D.

    1990-10-01

    The ergonomic problems of designing a sophisticated Windows Icons Mouse Pop-up (WIMP) based twin screen workstation are outlined. These same problems will be encountered by future spacecraft controllers. The design of a modern, advanced workstation for use on a distributed multicontrol center in a multisatellite control system is outlined. The system uses access control mechanisms to ensure that only authorized personnel can undertake certain operations on the workstation. Rules governing the use of windowing features, screen attributes, icons, keyboard and mouse in spacecraft control are discussed.

  8. Acquisition, representation and rule generation for procedural knowledge

    NASA Technical Reports Server (NTRS)

    Ortiz, Chris; Saito, Tim; Mithal, Sachin; Loftin, R. Bowen

    1991-01-01

    Current research into the design and continuing development of a system for the acquisition of procedural knowledge, its representation in useful forms, and proposed methods for automated C Language Integrated Production System (CLIPS) rule generation is discussed. The Task Analysis and Rule Generation Tool (TARGET) is intended to permit experts, individually or collectively, to visually describe and refine procedural tasks. The system is designed to represent the acquired knowledge in the form of graphical objects with the capacity for generating production rules in CLIPS. The generated rules can then be integrated into applications such as NASA's Intelligent Computer Aided Training (ICAT) architecture. Also described are proposed methods for use in translating the graphical and intermediate knowledge representations into CLIPS rules.

  9. Verifying Architectural Design Rules of the Flight Software Product Line

    NASA Technical Reports Server (NTRS)

    Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen

    2009-01-01

    This paper presents experiences of verifying architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps to a) identify architecturally significant deviations that eluded code reviews, b) clarify the design rules to the team, and c) assess the overall implementation quality. Furthermore, it helps connect business goals to architectural principles, and to the implementation. This paper is the first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.
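
    The core of such consistency checking can be sketched simply: each architectural rule allows one module to depend on another, and any observed dependency not covered by a rule is flagged. The layer names and dependencies below are illustrative, not the actual CFS rules.

```python
# Sketch of rule-based architecture consistency checking: compare the
# dependencies observed in the implementation against the allowed set.
def check(rules, dependencies):
    """Return observed dependencies that violate the allowed-dependency rules."""
    allowed = set(rules)
    return [d for d in dependencies if d not in allowed]

# Hypothetical rules: apps may call the services API; the API may call the OS layer.
rules = [('app', 'api'), ('api', 'osal')]
observed = [('app', 'api'), ('app', 'osal'), ('api', 'osal')]
print(check(rules, observed))  # → [('app', 'osal')]
```

    In practice the observed dependencies are extracted from source code or binaries, which is how violations that eluded manual code review become mechanically detectable.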

  10. 77 FR 28919 - Self-Regulatory Organizations; NYSE Arca, Inc.; Order Approving a Proposed Rule Change To Amend...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-16

    ... be made in a nondiscriminatory fashion.\\14\\ \\14\\ See NYSE Arca Equities Rule 7.45(d)(3). NYSE Arca... Securities will be required to establish and enforce policies and procedures that are reasonably designed to... other things, that the rules of a national securities exchange be designed to prevent fraudulent and...

  11. 77 FR 38116 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Designation of a Longer Period for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-26

    ...-Regulatory Organizations; NYSE Arca, Inc.; Notice of Designation of a Longer Period for Commission Action on Proposed Rule Change Proposing a Pilot Program To Create a Lead Market Maker Issuer Incentive Program for...'') \\1\\ and Rule 19b-4 thereunder,\\2\\ a proposed rule change to create and implement, on a pilot basis, a...

  12. Learning CAD at University through Summaries of the Rules of Design Intent

    ERIC Educational Resources Information Center

    Barbero, Basilio Ramos; Pedrosa, Carlos Melgosa; Samperio, Raúl Zamora

    2017-01-01

    The ease with which 3D CAD models may be modified and reused are two key aspects that improve the design-intent variable and that can significantly shorten the development timelines of a product. A set of rules are gathered from various authors that take different 3D modelling strategies into account. These rules are then applied to CAD…

  13. Developmental changes in automatic rule-learning mechanisms across early childhood.

    PubMed

    Mueller, Jutta L; Friederici, Angela D; Männel, Claudia

    2018-06-27

    Infants' ability to learn complex linguistic regularities from early on has been revealed by electrophysiological studies indicating that 3-month-olds, but not adults, can automatically detect non-adjacent dependencies between syllables. While different ERP responses in adults and infants suggest that both linguistic rule learning and its link to basic auditory processing undergo developmental changes, systematic investigations of the developmental trajectories are scarce. In the present study, we assessed 2- and 4-year-olds' ERP indicators of pitch discrimination and linguistic rule learning in a syllable-based oddball design. To test for the relation between auditory discrimination and rule learning, ERP responses to pitch changes were used as a predictor for potential linguistic rule-learning effects. Results revealed that 2-year-olds, but not 4-year-olds, showed ERP markers of rule learning. Although 2-year-olds' rule learning was not dependent on differences in pitch perception, 4-year-old children demonstrated a dependency, such that those children who showed more pronounced responses to pitch changes still showed an effect of rule learning. These results narrow down the developmental decline of the ability for automatic linguistic rule learning to the age between 2 and 4 years and, moreover, point towards a strong modification of this change by auditory processes. At an age when the ability for automatic linguistic rule learning phases out, rule learning can still be observed in children with enhanced auditory responses. The observed interrelations are plausible causes for age-of-acquisition effects and inter-individual differences in language learning. © 2018 John Wiley & Sons Ltd.

  14. Basis of the tubesheet heat exchanger design rules used in the French pressure vessel code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osweiller, F.

    1992-02-01

    For about 40 years most tubesheet heat exchangers have been designed according to the standards of TEMA. Partly due to their simplicity, these rules do not assure a safe heat-exchanger design in all cases. This is the main reason why new tubesheet design rules were developed in 1981 in France for the French pressure vessel code CODAP. For fixed tubesheet heat exchangers, the new rules account for the elastic rotational restraint of the shell and channel at the outer edge of the tubesheet, as proposed in 1959 by Galletly. For floating-head and U-tube heat exchangers, the approach developed by Gardner in 1969 was selected, with some modifications. In both cases, the tubesheet is replaced by an equivalent solid plate with adequate effective elastic constants, and the tube bundle is simulated by an elastic foundation. The elastic restraint at the edge of the tubesheet due to the shell and channel is accounted for in different ways in the two types of heat exchangers. The purpose of the paper is to present the main basis of these rules and to compare them to the TEMA rules.

  15. FAIL-SAFE: Fault Aware IntelLigent Software for Exascale

    DTIC Science & Technology

    2016-06-13

    and that these programs can continue to correct solutions. To broaden the impact of this research, we also needed to be able to ameliorate errors...designing an interface between the application and an introspection framework for resilience (IFR) based on the inference engine SHINE; (4) using...the ROSE compiler to translate annotations into reasoning rules for the IFR; and (5) designing a Knowledge/Experience Database, which will store

  16. Development and prospects of standardization in the German municipal wastewater sector.

    PubMed

    Freimuth, Claudia; Oelmann, Mark; Amann, Erwin

    2018-04-17

    Given the significance of wastewater treatment and disposal for society and the economy, together with the omnipresence of standards in the sector, we studied the development and prospects of the rules governing standardization in the German municipal wastewater sector. We thereby provide a detailed description of sector-specific committee-based standardization and significantly contribute to the understanding of this complex arena. We find that the German Association for Water, Wastewater and Waste (DWA) has significantly improved its rules on standardization over time by aligning them more closely with the generally accepted superordinate standardization principles. However, by focusing on theoretical findings on committee decision-making and committee composition, we argue that there is still scope for improvement with respect to rule reading and rule compliance. We show that the incentives at work in standardization committees are manifold, whereas the representation of the different stakeholder groups' needs remains unbalanced. Due to vested interests and the potential strategic behavior of the various agents involved in standardization, rule compliance does not necessarily happen naturally. To this end, we claim that the implementation of monitoring mechanisms can contribute significantly to the institutional design of standardization, and we briefly discuss the advantages and disadvantages of different schemes. Finally, we show that there is ample need for future research on the optimal design of such a scheme. Even though the analysis relates specifically to the DWA, our claims apply to a wide range of standards development organizations. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. CMedTEX: A Rule-based Temporal Expression Extraction and Normalization System for Chinese Clinical Notes.

    PubMed

    Liu, Zengjian; Tang, Buzhou; Wang, Xiaolong; Chen, Qingcai; Li, Haodi; Bu, Junzhao; Jiang, Jingzhi; Deng, Qiwen; Zhu, Suisong

    2016-01-01

    Time is an important aspect of information and is very useful for information utilization. The goal of this study was to analyze the challenges of temporal expression (TE) extraction and normalization in Chinese clinical notes by assessing the performance of a rule-based system we developed on a manually annotated corpus (including 1,778 clinical notes of 281 hospitalized patients). To simplify system development, we divided TEs into three categories: direct, indirect, and uncertain TEs, and designed different rules for each category. Evaluation on the independent test set shows that our system achieves an F-score of 93.40% on TE extraction and an accuracy of 92.58% on TE normalization under the "exact-match" criterion. Compared with HeidelTime for Chinese newswire text, our system performs much better, indicating that it is necessary to develop a TE extraction and normalization system specific to Chinese clinical notes because of the domain difference.
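
    For the "direct" category, rule-based extraction and normalization typically pairs a regular expression for the surface form with a rule mapping it to a standard value. The sketch below handles only one toy pattern (an explicit Chinese year-month-day date mapped to ISO 8601); CMedTEX's actual rule set is far broader.

```python
# Sketch of rule-based direct-TE extraction and normalization for Chinese
# text. The single pattern and the example sentence are illustrative.
import re

# Direct TE: explicit year-month-day written with Chinese date characters.
DIRECT_TE = re.compile(r'(\d{4})年(\d{1,2})月(\d{1,2})日')

def extract_and_normalize(text):
    """Return (surface form, ISO 8601 value) pairs found in the text."""
    results = []
    for m in DIRECT_TE.finditer(text):
        year, month, day = m.group(1), int(m.group(2)), int(m.group(3))
        results.append((m.group(0), f'{year}-{month:02d}-{day:02d}'))
    return results

print(extract_and_normalize('患者于2015年3月7日入院'))
# → [('2015年3月7日', '2015-03-07')]
```

    Indirect TEs ("3 days after admission") and uncertain TEs additionally require an anchor date and arithmetic rules, which is why the three categories warrant separate rule sets.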

  18. 77 FR 71658 - Self-Regulatory Organizations; New York Stock Exchange LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-03

    ... Proposed Rule Change Moving the Rule Text That Provides for Pegging on the Exchange From Supplementary Material .26 of NYSE Rule 70 to NYSE Rule 13 and Amending Such Text to (i) Permit Designated Market Maker... of the Terms of Substance of the Proposed Rule Change The Exchange proposes to move the rule text...

  19. Statistically based sustainable re-design of stormwater overflow control systems in urban catchments

    NASA Astrophysics Data System (ADS)

    Ganora, Daniele; Isacco, Silvia; Claps, Pierluigi

    2017-04-01

    Control and reduction of pollution from stormwater overflow is a major concern for municipalities, which must manage the quality of receiving water bodies according to the Water Framework Directive 2000/60/EC. In this regard, assessment studies of the potential pollution load from sewer networks recognize the need to adapt and upgrade existing drainage systems, which can be achieved with traditional water works (detention tanks, increased wastewater treatment plant capacity, etc.), with nature-based solutions (constructed wetlands, restored floodplains, etc.), or with a combination of the two. Nature-based solutions have recently received considerable attention, as they enhance urban and degraded environments while being more resilient and adaptable to climatic and anthropic changes than most traditional engineering works. On the other hand, restoring the urban environment with natural absorbing surfaces requires diffuse interventions, high costs, and a considerable amount of time. In this work we investigate how simple, economically sustainable, and quick solutions to the problem at hand can be achieved by changing the management rules of pumping stations in sewer systems. In particular, we provide a statistically based framework for calibrating the management rules with the aim of improving the quality of overflows from sewer systems. Typical pumping rules favor a massive delivery of stormwater volumes to the wastewater treatment plant, requiring large storage tanks in the sewer network and heavy pumping power, and reducing the efficiency of the treatment plant through pollutant dilution. In this study we show that it is possible to optimize the pumping rule so as to reduce the volumes pumped to the plant (thus saving energy) while keeping the pollutant concentration high. 
On the other hand, larger low-concentration overflow volumes are released outside the sewer network than under the standard pumping rules. Such released volumes could be efficiently processed by nature-based solutions, such as constructed wetlands, to reduce the final pollutant impact on the environment. The proposed procedure relies on prior knowledge of the precipitation forcing and of a quantity/quality model of the sewer network. The method provides marginal and joint probability distributions of the water volumes and pollutant concentrations (or masses) delivered to the wastewater treatment plant and to the nature-based system, with the aim of supporting a more efficient design of the whole sewer system. A practical example of application is provided for illustrative purposes.
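The trade-off described above (lower pumped volumes versus larger low-concentration overflows) can be sketched with a toy Monte-Carlo experiment. The event distributions, the pumping threshold values, and the first-flush dilution factor below are all invented placeholders, not values from the study:

```python
import random

def simulate_event(pump_threshold, inflow, concentration):
    """Split one storm event between treatment plant and overflow.

    Assumes a simple first-flush behavior: the volume pumped to the
    plant carries the concentrated early runoff, while the overflow
    carries a diluted remainder (the 0.3 dilution factor is a placeholder).
    """
    pumped = min(inflow, pump_threshold)
    overflow = inflow - pumped
    pumped_mass = concentration * pumped
    overflow_mass = 0.3 * concentration * overflow
    return pumped, overflow, pumped_mass, overflow_mass

def monte_carlo(pump_threshold, n_events=10_000, seed=1):
    """Estimate mean pumped/overflow volumes and the pumped mass fraction."""
    rng = random.Random(seed)
    pumped_v = overflow_v = pumped_m = overflow_m = 0.0
    for _ in range(n_events):
        inflow = rng.expovariate(1 / 500.0)    # event volume, m^3 (toy)
        conc = rng.lognormvariate(3.0, 0.5)    # pollutant conc., mg/L (toy)
        pv, ov, pm, om = simulate_event(pump_threshold, inflow, conc)
        pumped_v += pv
        overflow_v += ov
        pumped_m += pm
        overflow_m += om
    return (pumped_v / n_events,
            overflow_v / n_events,
            pumped_m / (pumped_m + overflow_m))

# A lower pumping threshold saves pumped volume (energy) at the cost of
# releasing larger low-concentration overflow volumes.
low = monte_carlo(pump_threshold=300.0)
high = monte_carlo(pump_threshold=800.0)
```

Sweeping the threshold over a grid of candidate rules and comparing the resulting volume and mass distributions is the kind of statistically based calibration the framework supports.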

  20. 75 FR 63238 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-14

    ... Proposed Rule Change To Modify the Requirements To Qualify for Credits as a Designated Liquidity Provider... requirements to qualify for credits as a designated liquidity provider under Rule 7018(i) and to make a minor... Designated Liquidity Providers: Charge to Designated Liquidity Provider $0.003 per share executed entering...

  1. 75 FR 52860 - Final Airworthiness Design Standards for Acceptance Under the Primary Category Rule; Orlando...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-30

    ...., wish to apply these airworthiness design standards to other airplane models, OHA, Inc. must submit a... affects only certain airworthiness design standards on Cessna model C172I, C172K, C172L, C172M airplanes... Design Standards for Acceptance Under the Primary Category Rule; Orlando Helicopter Airways (OHA), Inc...

  2. Improving the anesthetic process by a fuzzy rule based medical decision system.

    PubMed

    Mendez, Juan Albino; Leon, Ana; Marrero, Ayoze; Gonzalez-Cava, Jose M; Reboso, Jose Antonio; Estevez, Jose Ignacio; Gomez-Gonzalez, José F

    2018-01-01

    The main objective of this research is the design and implementation of a new fuzzy logic tool for automatic drug delivery in patients undergoing general anesthesia. The aim is to adjust the drug dose to the real needs of the patient using heuristic knowledge provided by clinicians. A two-level computer decision system is proposed. The idea is to release the clinician from routine tasks so that he or she can focus on other variables of the patient. The controller uses the Bispectral Index (BIS) to assess the hypnotic state of the patient. The fuzzy controller was included in a closed-loop system to reach the BIS target and reject disturbances. BIS was measured using a BIS VISTA monitor, a device that calculates the hypnosis level of the patient from EEG information. An infusion pump with propofol 1% supplies the drug to the patient. The inputs to the fuzzy inference system are the BIS error and the BIS rate; the output is the infusion rate increment. The mapping from the input information to the appropriate output is given by a rule base built on the knowledge of clinicians. To evaluate the performance of the proposed fuzzy closed-loop system, an observational study was carried out. Eighty-one patients scheduled for ambulatory surgery were randomly distributed into two groups: one group used a fuzzy logic based closed-loop system (FCL) to automate the administration of propofol (42 cases); the second group used manual delivery of the drug (39 cases). In both groups, the BIS target was 50. The FCL, designed with intuitive logic rules based on clinician experience, performed satisfactorily and outperformed manual administration in terms of accuracy throughout the maintenance stage. Copyright © 2018 Elsevier B.V. All rights reserved.
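A minimal sketch of such a rule base, mapping BIS error and BIS rate to an infusion-rate increment. The membership ranges, rule set, and output increments are invented placeholders; the study's clinical rule base is not reproduced here:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b on the interval [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_infusion_increment(bis_error, bis_rate):
    """Sugeno-style rule base: BIS error/rate -> infusion change (mL/h).

    bis_error = BIS_measured - BIS_target; a positive error means the
    patient is lighter than the target, so infusion should increase.
    """
    # Membership degrees over hypothetical ranges.
    neg = tri(bis_error, -40, -20, 0)
    zero = tri(bis_error, -10, 0, 10)
    pos = tri(bis_error, 0, 20, 40)
    falling = tri(bis_rate, -10, -5, 0)
    steady = tri(bis_rate, -3, 0, 3)
    rising = tri(bis_rate, 0, 5, 10)
    # Each rule: (firing strength, crisp output increment).
    rules = [
        (min(pos, rising), +2.0),   # too light and getting lighter
        (min(pos, steady), +1.0),
        (min(zero, steady), 0.0),
        (min(neg, steady), -1.0),
        (min(neg, falling), -2.0),  # too deep and getting deeper
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

The weighted-average defuzzification gives a smooth control surface from a handful of heuristic rules, which is what lets clinician knowledge be encoded directly.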

  3. Bendability optimization of flexible optical nanoelectronics via neutral axis engineering

    PubMed Central

    2012-01-01

    The enhancement of bendability of flexible nanoelectronics is critically important to realize future portable and wearable nanoelectronics for personal and military purposes. Because there is an enormous variety of materials and structures that are used for flexible nanoelectronic devices, a governing design rule for optimizing the bendability of these nanodevices is required. In this article, we suggest a design rule to optimize the bendability of flexible nanoelectronics through neutral axis (NA) engineering. In flexible optical nanoelectronics, transparent electrodes such as indium tin oxide (ITO) are usually the most fragile under an external load because of their brittleness. Therefore, we focus on the bendability of ITO, which is widely used as a transparent electrode, and the NA is controlled by employing a buffer layer on the ITO layer. First, we independently investigate the effect of the thickness and elastic modulus of a buffer layer on the bendability of an ITO film. Then, we develop a design rule for the bendability optimization of flexible optical nanoelectronics. Because the NA is determined by considering both the thickness and elastic modulus of a buffer layer, the design rule is applicable regardless of the material and thickness used for the buffer layer. Finally, our design rule is applied to optimize the bendability of an organic solar cell, which allows the bending radius to reach about 1 mm. Our design rule is thus expected to provide an effective strategy for enhancing the bending performance of a variety of flexible nanoelectronics. PMID:22587757

  4. Bendability optimization of flexible optical nanoelectronics via neutral axis engineering.

    PubMed

    Lee, Sangmin; Kwon, Jang-Yeon; Yoon, Daesung; Cho, Handong; You, Jinho; Kang, Yong Tae; Choi, Dukhyun; Hwang, Woonbong

    2012-05-15

    The enhancement of bendability of flexible nanoelectronics is critically important to realize future portable and wearable nanoelectronics for personal and military purposes. Because there is an enormous variety of materials and structures that are used for flexible nanoelectronic devices, a governing design rule for optimizing the bendability of these nanodevices is required. In this article, we suggest a design rule to optimize the bendability of flexible nanoelectronics through neutral axis (NA) engineering. In flexible optical nanoelectronics, transparent electrodes such as indium tin oxide (ITO) are usually the most fragile under an external load because of their brittleness. Therefore, we focus on the bendability of ITO, which is widely used as a transparent electrode, and the NA is controlled by employing a buffer layer on the ITO layer. First, we independently investigate the effect of the thickness and elastic modulus of a buffer layer on the bendability of an ITO film. Then, we develop a design rule for the bendability optimization of flexible optical nanoelectronics. Because the NA is determined by considering both the thickness and elastic modulus of a buffer layer, the design rule is applicable regardless of the material and thickness used for the buffer layer. Finally, our design rule is applied to optimize the bendability of an organic solar cell, which allows the bending radius to reach about 1 mm. Our design rule is thus expected to provide an effective strategy for enhancing the bending performance of a variety of flexible nanoelectronics.
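The NA of a stacked film is the modulus-weighted centroid of the layers, y_NA = sum(E_i t_i y_i) / sum(E_i t_i), which is why the design rule is independent of the particular buffer material: only the modulus-thickness combination matters. A sketch with hypothetical layer values (the PET/ITO numbers below are illustrative, not taken from the paper):

```python
def neutral_axis(layers):
    """Neutral-axis height of a stacked film (modulus-weighted centroid).

    layers: list of (E, t) tuples from bottom to top, with elastic
    modulus E and thickness t.  Returns y_NA from the bottom surface,
    using the midplane height of each layer as its centroid.
    """
    y = 0.0
    num = den = 0.0
    for E, t in layers:
        mid = y + t / 2
        num += E * t * mid
        den += E * t
        y += t
    return num / den

# Hypothetical stack (bottom to top): PET substrate, ITO, buffer layer.
E_pet, t_pet = 4e9, 50e-6     # Pa, m (illustrative)
E_ito, t_ito = 118e9, 150e-9  # Pa, m (illustrative)
ito_mid = t_pet + t_ito / 2

def buffer_that_centers_na(E_buf, t_candidates):
    """Pick the buffer thickness placing the NA closest to the ITO midplane,
    where the brittle electrode sees (nearly) zero bending strain."""
    return min(t_candidates,
               key=lambda t: abs(neutral_axis(
                   [(E_pet, t_pet), (E_ito, t_ito), (E_buf, t)]) - ito_mid))
```

With a buffer whose stiffness matches the substrate, a thickness comparable to the substrate's moves the NA onto the ITO layer, which is the optimization the design rule formalizes.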

  5. 75 FR 40805 - Combined Notice of Filings #2

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-14

    ... 3- Section III- Market Rule 1- Standard Market Design etc. Filed Date: 06/30/2010. Accession Number.... Applicants: MATEP, Inc. Description: MATEP Limited Partnership submits Notice of Succession to New MATEP, Inc's market- based rate tariff. Filed Date: 06/30/2010. Accession Number: 20100630-0220. Comment Date...

  6. Center for Neural Engineering at Tennessee State University, ASSERT Annual Progress Report.

    DTIC Science & Technology

    1995-07-01

    neural networks. Their research topics are: (1) developing frequency-dependent oscillatory neural networks; (2) long-term potentiation learning rules...as applied to spatial navigation; (3) design and construction of a servo-joint robotic arm; and (4) neural network based prosthesis control. One graduate student

  7. Risk Control Through the Use of Procedures - A Method for Evaluating the Change in Risk

    NASA Technical Reports Server (NTRS)

    Praino, Gregory; Sharit, Joseph

    2010-01-01

    Organizations use procedures to influence or control the behavior of their workers, but often have no basis for determining whether an additional rule or procedural control will be beneficial. This paper outlines a proposed method for determining whether the addition or removal of procedural controls will affect the occurrence of critical consequences. The proposed method focuses on two aspects: how valuable the procedural control is, based on the inevitability of the consequence and the opportunity to intervene; and how likely the control is to fail, based on five procedural design elements that address how well the rule or control has been Defined, Assigned, Trained, Organized, and Monitored (referred to as the DATOM elements).

  8. Raft cultivation area extraction from high resolution remote sensing imagery by fusing multi-scale region-line primitive association features

    NASA Astrophysics Data System (ADS)

    Wang, Min; Cui, Qi; Wang, Jie; Ming, Dongping; Lv, Guonian

    2017-01-01

    In this paper, we first propose several novel concepts for object-based image analysis, which include line-based shape regularity, line density, and scale-based best feature value (SBV), based on the region-line primitive association framework (RLPAF). We then propose a raft cultivation area (RCA) extraction method for high spatial resolution (HSR) remote sensing imagery based on multi-scale feature fusion and spatial rule induction. The proposed method includes the following steps: (1) Multi-scale region primitives (segments) are obtained by the image segmentation method HBC-SEG, and line primitives (straight lines) are obtained by a phase-based line detection method. (2) Association relationships between regions and lines are built based on RLPAF, and then multi-scale RLPAF features are extracted and SBVs are selected. (3) Several spatial rules are designed to extract RCAs within sea waters after land and water separation. Experiments show that the proposed method can successfully extract different-shaped RCAs from HSR images with good performance.
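Step (3) amounts to filtering candidate segments with threshold rules over the fused features. A schematic sketch with invented feature names and thresholds (the published rules and values differ):

```python
def is_raft_area(region):
    """Spatial rules (hypothetical thresholds) for raft-cultivation areas.

    region: dict of per-segment feature values, e.g. computed from
    region-line primitive association: a shape regularity in [0, 1],
    a line density, and a water flag from land/water separation.
    """
    return (region["in_water"]
            and region["shape_regularity"] >= 0.7   # rafts are regular rectangles
            and region["line_density"] >= 0.5       # rich in straight-line primitives
            and region["area_m2"] >= 200)           # drop tiny noise segments

candidates = [
    {"in_water": True, "shape_regularity": 0.85, "line_density": 0.9, "area_m2": 1200},
    {"in_water": True, "shape_regularity": 0.30, "line_density": 0.1, "area_m2": 900},
    {"in_water": False, "shape_regularity": 0.95, "line_density": 0.8, "area_m2": 5000},
]
rcas = [r for r in candidates if is_raft_area(r)]
```

Only the first candidate (regular, line-rich, and in water) survives; the land segment is rejected even though its shape is regular, mirroring the role of land/water separation.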

  9. 78 FR 62784 - Self-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-22

    ...-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Designation of a Longer Period for Commission Action on a Proposed Rule Change Relating to Wash Sale Transactions and FINRA Rule...-4 thereunder,\\2\\ a proposed rule change to amend FINRA Rule 5210. The proposed rule change was...

  10. Sensor-based activity recognition using extended belief rule-based inference methodology.

    PubMed

    Calzada, A; Liu, J; Nugent, C D; Wang, H; Martinez, L

    2014-01-01

    The recently developed extended belief rule-based inference methodology (RIMER+) recognizes the need to model the different types of information and uncertainty that usually coexist in real environments. A home setting with sensors located in different rooms and on different appliances is a particularly relevant example of such an environment, and it brings a range of challenges for sensor-based activity recognition. Although RIMER+ has been designed as a generic decision model that could be applied in a wide range of situations, this paper discusses how the methodology can be adapted to recognize human activities using binary sensors within smart environments. Evaluation of RIMER+ against other state-of-the-art classifiers in terms of accuracy, efficiency, and applicability showed significant advantages, especially in situations of incomplete input data; this demonstrates the potential of the methodology and lays the basis for further research on the topic.
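The core of belief-rule inference is that each rule carries a belief distribution over conclusions and is activated in proportion to how well the inputs match its antecedents. The sketch below replaces the full RIMER+ evidential-reasoning combination with a simple normalized weighted sum, and the sensor/activity rule base is invented:

```python
def match(value, ref, tol=1.0):
    """Individual matching degree of one input to a rule's reference value."""
    return max(0.0, 1.0 - abs(value - ref) / tol)

def infer(rules, inputs):
    """Weighted-sum approximation to belief-rule inference.

    Each rule is (antecedent tuple, {activity: belief}).  Activation
    weight = product of matching degrees; activated beliefs are combined
    by a normalized weighted sum (a simplification of RIMER+'s
    evidential-reasoning combination).
    """
    combined, total = {}, 0.0
    for antecedent, beliefs in rules:
        w = 1.0
        for x, ref in zip(inputs, antecedent):
            w *= match(x, ref)
        total += w
        for activity, b in beliefs.items():
            combined[activity] = combined.get(activity, 0.0) + w * b
    if total == 0:
        return {}
    return {a: b / total for a, b in combined.items()}

# Binary sensors (kettle, fridge) -> beliefs over activities (hypothetical).
rules = [
    ((1, 0), {"make_tea": 0.9, "make_meal": 0.1}),
    ((0, 1), {"make_meal": 0.8, "make_tea": 0.2}),
    ((1, 1), {"make_meal": 0.6, "make_tea": 0.4}),
]
out = infer(rules, (1, 0))
```

Because activation is graded rather than all-or-nothing, partially matching rules still contribute, which is what gives belief-rule systems their robustness to incomplete input data.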

  11. 14 CFR 93.121 - Applicability.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... AND GENERAL OPERATING RULES SPECIAL AIR TRAFFIC RULES High Density Traffic Airports § 93.121 Applicability. This subpart designates high density traffic airports and prescribes air traffic rules for...

  12. 14 CFR 93.121 - Applicability.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... AND GENERAL OPERATING RULES SPECIAL AIR TRAFFIC RULES High Density Traffic Airports § 93.121 Applicability. This subpart designates high density traffic airports and prescribes air traffic rules for...

  13. 14 CFR 93.121 - Applicability.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AND GENERAL OPERATING RULES SPECIAL AIR TRAFFIC RULES High Density Traffic Airports § 93.121 Applicability. This subpart designates high density traffic airports and prescribes air traffic rules for...

  14. 14 CFR 93.121 - Applicability.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... AND GENERAL OPERATING RULES SPECIAL AIR TRAFFIC RULES High Density Traffic Airports § 93.121 Applicability. This subpart designates high density traffic airports and prescribes air traffic rules for...

  15. 14 CFR 93.121 - Applicability.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AND GENERAL OPERATING RULES SPECIAL AIR TRAFFIC RULES High Density Traffic Airports § 93.121 Applicability. This subpart designates high density traffic airports and prescribes air traffic rules for...

  16. Elderly’s Family Life Supplies - Innovative Chinese Checkers Game Board

    NASA Astrophysics Data System (ADS)

    CHAO, Fanglin

    2017-09-01

    A nine-week product design course for industrial design students was implemented at our university. Through creativity rules and field study, students achieved a high standard of problem identification and concept generation. Prototype testing with elderly users in the design projects helped students understand user demands more deeply, which in turn further refined the concepts. Traditional Chinese checkers is redesigned using special checkers of different heights or shapes, with specific rules, to increase user interest and game diversity. The game becomes more challenging because location weighting in the score calculation requires players to plan their strategies. The redesigned Chinese checkers game comprises a reconfigurable board and checkers of several shapes. Each checker has three parts: a standing ring body, a base body, and a side holding structure supported by both. The body is slightly concave so that fingers can hold it easily. The upper portion of the body carries extension sections of different shapes that can engage with the base body. Players move their checkers to the opposite target area. When one player has moved all checkers into the opposite target area, play shifts to the scoring stage. A participant may develop a specific strategy to gain a higher score by maximizing the number of weighted checkers placed in target block regions.
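The scoring stage can be illustrated with a toy weighting scheme (the cell and checker weights below are invented examples, not the game's actual rules):

```python
def final_score(placements, cell_weight, checker_weight):
    """Score = sum over placed checkers of cell weight x checker weight.

    placements: {cell: checker_shape} in the player's target block.
    """
    return sum(cell_weight[cell] * checker_weight[shape]
               for cell, shape in placements.items())

# Hypothetical 3-cell target block: the deepest cell is worth the most.
cell_weight = {"tip": 3, "middle": 2, "edge": 1}
checker_weight = {"tall": 2, "round": 1}

# Placing the high-value checker on the high-value cell beats the reverse,
# which is exactly the strategic planning the location weighting rewards.
good = final_score({"tip": "tall", "edge": "round"}, cell_weight, checker_weight)
poor = final_score({"tip": "round", "edge": "tall"}, cell_weight, checker_weight)
```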

  17. A plausible neural circuit for decision making and its formation based on reinforcement learning.

    PubMed

    Wei, Hui; Dai, Dawei; Bu, Yijie

    2017-06-01

    A human's, or even a lower insect's, behavior is dominated by its nervous system. Each stable behavior has its own inner steps and control rules and is regulated by a neural circuit. Understanding how the brain influences perception, thought, and behavior is a central mandate of neuroscience. The phototactic flight of insects is a widely observed deterministic behavior. Since the movement is not stochastic, the behavior should be dominated by a neural circuit. Based on the basic firing characteristics of biological neurons and the constitution of neural circuits, we designed a plausible neural circuit for this phototactic behavior from a logic perspective. The circuit's output layer, which generates a stable spike firing rate to encode flight commands, controls the insect's angular velocity in flight. The firing patterns and connection types of excitatory and inhibitory neurons are considered in this computational model. We simulated the circuit's information processing on a distributed PC array and used the real-time average firing rate of the output neuron clusters to drive a flying-behavior simulation. In this paper, we also explore, from a network-flow view, how a correct neural decision circuit is generated, through a bee behavior experiment based on a reward and punishment feedback mechanism. The significance of this study is as follows. First, we designed a neural circuit that achieves the behavioral logic rules while strictly following the electrophysiological characteristics of biological neurons and anatomical facts. Second, the circuit's generality permits the design and implementation of behavioral logic rules based on the most general information-processing and activity modes of biological neurons. Third, through computer simulation, we achieved new understanding of the cooperative conditions under which multiple neurons achieve behavioral control. 
Fourth, this study aims at understanding the information encoding mechanism and how neural circuits achieve behavior control. Finally, this study helps establish a transitional bridge between the microscopic activity of the nervous system and macroscopic animal behavior.
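The idea of an output cluster whose average firing rate encodes a motor command can be sketched with a leaky integrate-and-fire neuron; the parameters, input currents, and rate-to-velocity gain below are placeholders, not values from the paper's model:

```python
def lif_spike_count(input_current, n_steps=1000, dt=1e-3,
                    tau=0.02, threshold=1.0):
    """Leaky integrate-and-fire neuron: spikes in n_steps for constant input.

    Euler integration of dv/dt = -v/tau + I, with reset to 0 at threshold.
    """
    v, spikes = 0.0, 0
    for _ in range(n_steps):
        v += dt * (-v / tau + input_current)
        if v >= threshold:
            spikes += 1
            v = 0.0  # reset after spike
    return spikes

def angular_velocity(left_rate, right_rate, gain=0.05):
    """Map the firing-rate difference of two output clusters to turning speed."""
    return gain * (left_rate - right_rate)

# Stronger drive to the left cluster produces a higher firing rate there,
# so the simulated insect turns left (positive angular velocity).
left = lif_spike_count(input_current=80.0)
right = lif_spike_count(input_current=60.0)
omega = angular_velocity(left, right)
```

Rate coding of this kind is what lets a noisy spiking population deliver a stable, graded control signal to the behavior simulation.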

  18. Feasibility of automatic evaluation of clinical rules in general practice.

    PubMed

    Opondo, Dedan; Visscher, Stefan; Eslami, Saied; Medlock, Stephanie; Verheij, Robert; Korevaar, Joke C; Abu-Hanna, Ameen

    2017-04-01

    To assess the extent to which clinical rules (CRs) can be implemented for automatic evaluation of quality of care in general practice, we assessed 81 CRs, adapted from a subset of the Assessing Care of Vulnerable Elders (ACOVE) clinical rules, against the Dutch College of General Practitioners (NHG) data model. Each CR was analyzed using the Logical Elements Rule Method (LERM), a stepwise method of assessing and formalizing clinical rules for decision support. Clinical rules that satisfied the criteria outlined in LERM were judged implementable for automatic evaluation in general practice. Thirty-three of the 81 (40.7%) Dutch-translated ACOVE clinical rules can be automatically evaluated in electronic medical record systems: 7/7 CRs (100%) in the domain of diabetes, 9/17 (52.9%) in medication use, 5/10 (50%) in depression care, 3/6 (50%) in nutrition care, 6/13 (46.1%) in dementia care, 1/6 (16.6%) in end-of-life care, 2/13 (15.3%) in continuity of care, and 0/9 (0%) in fall-related care. Lack of documentation of care activities between primary and secondary health facilities and ambiguous formulation of clinical rules were the main reasons for the inability to automate the rules. Approximately two-fifths of the primary care Dutch ACOVE-based clinical rules can thus be automatically evaluated. Clear definition of clinical rules, improved GP database design, and electronic linkage of primary and secondary healthcare facilities can improve the prospects for automatic assessment of quality of care. These findings are especially relevant because the Netherlands has very high automation of primary care. Copyright © 2017 Elsevier B.V. All rights reserved.
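The automatability criterion can be illustrated by formalizing each rule as a predicate plus a declared set of required data elements; a rule is automatable only when every element it references exists in the EMR data model. The field names, the toy model, and both rules below are hypothetical:

```python
# Core (simplified) LERM-style criterion: a clinical rule is automatically
# evaluable only if all data elements it needs exist in the EMR data model.
EMR_MODEL = {"diagnosis", "hba1c", "hba1c_date", "medication", "age"}

CLINICAL_RULES = {
    "diabetes_hba1c_monitored": {
        "needs": {"diagnosis", "hba1c", "hba1c_date"},
        "check": lambda rec: rec["diagnosis"] != "diabetes"
                             or rec["hba1c_date"] is not None,
    },
    "fall_risk_assessed": {
        # References an element absent from the model -> not automatable,
        # mirroring the 0/9 result in fall-related care.
        "needs": {"age", "fall_risk_assessment"},
        "check": lambda rec: True,
    },
}

def automatable(rule):
    """A rule is automatable iff its required elements are all modeled."""
    return rule["needs"] <= EMR_MODEL

def evaluate(record):
    """Run every automatable rule against one patient record."""
    return {name: rule["check"](record)
            for name, rule in CLINICAL_RULES.items() if automatable(rule)}

result = evaluate({"diagnosis": "diabetes", "hba1c": 7.2,
                   "hba1c_date": "2016-11-02", "medication": [], "age": 81})
```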

  19. 78 FR 67467 - Registration of Municipal Advisors

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-12

    ... the Exchange Act. These rules and forms are designed to give effect to provisions of Title IX of the... ``investment strategies'' in the final rule is designed to address the main concerns raised by these commenters... state, and provide tax advantages designed to encourage saving for future college costs.\\54\\ 529 Savings...

  20. Ultra-Structure database design methodology for managing systems biology data and analyses

    PubMed Central

    Maier, Christopher W; Long, Jeffrey G; Hemminger, Bradley M; Giddings, Morgan C

    2009-01-01

    Background Modern, high-throughput biological experiments generate copious, heterogeneous, interconnected data sets. Research is dynamic, with frequently changing protocols, techniques, instruments, and file formats. Because of these factors, systems designed to manage and integrate modern biological data sets often end up as large, unwieldy databases that become difficult to maintain or evolve. The novel rule-based approach of the Ultra-Structure design methodology presents a potential solution to this problem. By representing both data and processes as formal rules within a database, an Ultra-Structure system constitutes a flexible framework that enables users to explicitly store domain knowledge in both a machine- and human-readable form. End users themselves can change the system's capabilities without programmer intervention, simply by altering database contents; no computer code or schemas need be modified. This provides flexibility in adapting to change, and allows integration of disparate, heterogeneous data sets within a small core set of database tables, facilitating joint analysis and visualization without becoming unwieldy. Here, we examine the application of Ultra-Structure to our ongoing research program for the integration of large proteomic and genomic data sets (proteogenomic mapping). Results We transitioned our proteogenomic mapping information system from a traditional entity-relationship design to one based on Ultra-Structure. Our system integrates tandem mass spectrum data, genomic annotation sets, and spectrum/peptide mappings, all within a small, general framework implemented within a standard relational database system. General software procedures driven by user-modifiable rules can perform tasks such as logical deduction and location-based computations. The system is not tied specifically to proteogenomic research, but is rather designed to accommodate virtually any kind of biological research. 
Conclusion We find Ultra-Structure offers substantial benefits for biological information systems, the largest being the integration of diverse information sources into a common framework. This facilitates systems biology research by integrating data from disparate high-throughput techniques. It also enables us to readily incorporate new data types, sources, and domain knowledge with no change to the database structure or associated computer code. Ultra-Structure may be a significant step towards solving the hard problem of data management and integration in the systems biology era. PMID:19691849
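The rules-as-data idea can be sketched in a few lines: behavior lives in table rows, and a small generic engine interprets them, so end users change capabilities by inserting rows rather than editing code. The table layout and rule semantics below are invented illustrations, not the published Ultra-Structure schema:

```python
# Rules stored as data rows, as in a relational "ruleform" table.
rule_table = [
    # (ruleform,       condition field,  condition value, action)
    ("classification", "molecule_type", "peptide",  "map_to_genome"),
    ("classification", "molecule_type", "spectrum", "match_to_peptide"),
]

def apply_rules(record, rules):
    """Generic engine: fire every rule whose condition matches the record.

    The engine never changes; altering system behavior means altering
    the rule table's contents.
    """
    return [action for _, field, value, action in rules
            if record.get(field) == value]

actions = apply_rules({"molecule_type": "spectrum"}, rule_table)

# End users extend the system by inserting a row, not by editing code:
rule_table.append(("classification", "molecule_type", "gene", "annotate"))
more = apply_rules({"molecule_type": "gene"}, rule_table)
```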

  1. A robotically constructed production and supply base on Phobos

    NASA Astrophysics Data System (ADS)

    1989-05-01

    PHOBIA Corporation is involved with the design of a man-tenable robotically constructed, bootstrap base on Mars' moon, Phobos. This base will be a pit-stop for future manned missions to Mars and beyond and will be a control facility during the robotic construction of a Martian base. An introduction is given to the concepts and the ground rules followed during the design process. Details of a base design and its location are given along with information about some of the subsystems. Since a major purpose of the base is to supply fuel to spacecraft so they can limit their fuel mass, mining and production systems are discussed. Surface support activities such as docks, anchors, and surface transportation systems are detailed. Several power supplies for the base are investigated and include fuel cells and a nuclear reactor. Tasks for the robots are defined along with descriptions of the robots capable of completing the tasks. Finally, failure modes for the entire PHOBIA Corporation design are presented along with an effects analysis and preventative recommendations.

  2. A robotically constructed production and supply base on Phobos

    NASA Technical Reports Server (NTRS)

    1989-01-01

    PHOBIA Corporation is involved with the design of a man-tenable robotically constructed, bootstrap base on Mars' moon, Phobos. This base will be a pit-stop for future manned missions to Mars and beyond and will be a control facility during the robotic construction of a Martian base. An introduction is given to the concepts and the ground rules followed during the design process. Details of a base design and its location are given along with information about some of the subsystems. Since a major purpose of the base is to supply fuel to spacecraft so they can limit their fuel mass, mining and production systems are discussed. Surface support activities such as docks, anchors, and surface transportation systems are detailed. Several power supplies for the base are investigated and include fuel cells and a nuclear reactor. Tasks for the robots are defined along with descriptions of the robots capable of completing the tasks. Finally, failure modes for the entire PHOBIA Corporation design are presented along with an effects analysis and preventative recommendations.

  3. 78 FR 18377 - Self-Regulatory Organizations; National Stock Exchange, Inc.; Notice of Designation of a Longer...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-26

    ... Organizations; National Stock Exchange, Inc.; Notice of Designation of a Longer Period for Commission Action on Proposed Rule Change To Adopt a New Order Type Called the ``Auto-Ex Only'' Order March 19, 2013. On January... (``Act'') \\1\\ and Rule 19b-4 thereunder,\\2\\ a proposed rule change to adopt a new order type called the...

  4. RuleMonkey: software for stochastic simulation of rule-based models

    PubMed Central

    2010-01-01

    Background The system-level dynamics of many molecular interactions, particularly protein-protein interactions, can be conveniently represented using reaction rules, which can be specified using model-specification languages, such as the BioNetGen language (BNGL). A set of rules implicitly defines a (bio)chemical reaction network. The reaction network implied by a set of rules is often very large, and as a result, generation of the network implied by rules tends to be computationally expensive. Moreover, the cost of many commonly used methods for simulating network dynamics is a function of network size. Together these factors have limited application of the rule-based modeling approach. Recently, several methods for simulating rule-based models have been developed that avoid the expensive step of network generation. The cost of these "network-free" simulation methods is independent of the number of reactions implied by rules. Software implementing such methods is now needed for the simulation and analysis of rule-based models of biochemical systems. Results Here, we present a software tool called RuleMonkey, which implements a network-free method for simulation of rule-based models that is similar to Gillespie's method. The method is suitable for rule-based models that can be encoded in BNGL, including models with rules that have global application conditions, such as rules for intramolecular association reactions. In addition, the method is rejection free, unlike other network-free methods that introduce null events, i.e., steps in the simulation procedure that do not change the state of the reaction system being simulated. We verify that RuleMonkey produces correct simulation results, and we compare its performance against DYNSTOC, another BNGL-compliant tool for network-free simulation of rule-based models. We also compare RuleMonkey against problem-specific codes implementing network-free simulation methods. 
Conclusions RuleMonkey enables the simulation of rule-based models for which the underlying reaction networks are large. It is typically faster than DYNSTOC for benchmark problems that we have examined. RuleMonkey is freely available as a stand-alone application http://public.tgen.org/rulemonkey. It is also available as a simulation engine within GetBonNie, a web-based environment for building, analyzing and sharing rule-based models. PMID:20673321
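A network-free simulator evaluates rule propensities directly against the current state at each step, so the reaction network implied by the rules is never enumerated. A Gillespie-style sketch on a toy dimerization rule set (a generic illustration of the approach, not RuleMonkey's actual algorithm or data structures):

```python
import random

def network_free_ssa(rules, state, t_end, seed=0):
    """Gillespie-style stochastic simulation driven directly by rules.

    rules: list of (rate_fn, update_fn) pairs evaluated against the
    current state each step; no reaction network is ever generated.
    """
    rng = random.Random(seed)
    t = 0.0
    while t < t_end:
        propensities = [rate(state) for rate, _ in rules]
        total = sum(propensities)
        if total == 0:
            break                       # no rule can fire
        t += rng.expovariate(total)     # time to next event
        r = rng.uniform(0, total)       # pick a rule by propensity
        for p, (_, update) in zip(propensities, rules):
            if r < p:
                update(state)
                break
            r -= p
    return state

# Toy rule set: A + A -> AA (binding) and AA -> A + A (unbinding).
rules = [
    (lambda s: 0.01 * s["A"] * (s["A"] - 1) / 2,
     lambda s: (s.__setitem__("A", s["A"] - 2),
                s.__setitem__("AA", s["AA"] + 1))),
    (lambda s: 0.1 * s["AA"],
     lambda s: (s.__setitem__("A", s["A"] + 2),
                s.__setitem__("AA", s["AA"] - 1))),
]
final = network_free_ssa(rules, {"A": 100, "AA": 0}, t_end=10.0)
```

The cost per step depends only on the number of rules, not on the (possibly enormous) number of reactions the rules imply, which is the key advantage the abstract describes.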

  5. Computing exponentially faster: implementing a non-deterministic universal Turing machine using DNA

    PubMed Central

    Currin, Andrew; Korovin, Konstantin; Ababi, Maria; Roper, Katherine; Kell, Douglas B.; Day, Philip J.

    2017-01-01

    The theory of computer science is based around universal Turing machines (UTMs): abstract machines able to execute all possible algorithms. Modern digital computers are physical embodiments of classical UTMs. For the most important class of problem in computer science, non-deterministic polynomial complete problems, non-deterministic UTMs (NUTMs) are theoretically exponentially faster than both classical UTMs and quantum mechanical UTMs (QUTMs). However, no attempt has previously been made to build an NUTM, and their construction has been regarded as impossible. Here, we demonstrate the first physical design of an NUTM. This design is based on Thue string rewriting systems, and thereby avoids the limitations of most previous DNA computing schemes: all the computation is local (simple edits to strings) so there is no need for communication, and there is no need to order operations. The design exploits DNA's ability to replicate to execute an exponential number of computational paths in P time. Each Thue rewriting step is embodied in a DNA edit implemented using a novel combination of polymerase chain reactions and site-directed mutagenesis. We demonstrate that the design works using both computational modelling and in vitro molecular biology experimentation: the design is thermodynamically favourable, microprogramming can be used to encode arbitrary Thue rules, all classes of Thue rule can be implemented, and rules can be implemented non-deterministically. In an NUTM, the resource limitation is space, which contrasts with classical UTMs and QUTMs where it is time. This fundamental difference enables an NUTM to trade space for time, which is significant for both theoretical computer science and physics. It is also of practical importance, for to quote Richard Feynman ‘there's plenty of room at the bottom’. 
This means that a desktop DNA NUTM could potentially utilize more processors than all the electronic computers in the world combined, and thereby outperform the world's current fastest supercomputer, while consuming a tiny fraction of its energy. PMID:28250099
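The nondeterministic exploration of all rewriting paths can be mimicked in silico with a breadth-first search over Thue rule applications; where the DNA machine explores paths in parallel by replication (trading space for time), a sequential program pays exponential time instead. The toy rule set below is an invented example:

```python
from collections import deque

def thue_reachable(start, target, rules, max_depth=6):
    """Breadth-first search over all Thue string-rewriting paths.

    rules: list of (lhs, rhs) pairs; each rule may be applied at every
    occurrence of lhs, so every computational path is explored.
    """
    frontier, seen = deque([(start, 0)]), {start}
    while frontier:
        s, depth = frontier.popleft()
        if s == target:
            return True
        if depth == max_depth:
            continue
        for lhs, rhs in rules:
            i = s.find(lhs)
            while i != -1:  # apply the rule at every match position
                t = s[:i] + rhs + s[i + len(lhs):]
                if t not in seen:
                    seen.add(t)
                    frontier.append((t, depth + 1))
                i = s.find(lhs, i + 1)
    return False

# Toy symmetric rule set: adjacent "ab" <-> "ba" swaps, so exactly the
# permutations of the start string are reachable.
rules = [("ab", "ba"), ("ba", "ab")]
```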

  6. Amino Acid Distribution Rules Predict Protein Fold: Protein Grammar for Beta-Strand Sandwich-Like Structures

    PubMed Central

    Kister, Alexander

    2015-01-01

    We present an alternative approach to protein 3D fold prediction based on determining rules that specify the distribution of “favorable” residues, which are mainly responsible for formation of a given fold, and “unfavorable” residues, which are incompatible with that fold, in polypeptide sequences. The process of determining favorable and unfavorable residues is iterative. The starting assumptions are based on the general principles of protein structure formation as well as on structural features peculiar to the protein fold under investigation. The initial assumptions are tested one by one against the set of all known proteins with a given structure. An assumption is accepted as a “rule of amino acid distribution” for the protein fold if it holds true for all, or nearly all, structures. If the assumption is not accepted as a rule, it can be modified to better fit the data and then tested again in the next step of the iterative search algorithm, or rejected. We determined the set of amino acid distribution rules for a large group of beta sandwich-like proteins characterized by a specific arrangement of strands in two beta sheets. It was shown that this set of rules is highly sensitive (~90%) and very specific (~99%) for identifying sequences of proteins with the specified beta sandwich fold. The advantage of the proposed approach is that it does not require that query proteins have a high degree of homology to proteins with known structure. So long as a query protein satisfies the residue distribution rules, it can be confidently assigned to its respective protein fold. Another advantage of our approach is that it allows for a better understanding of which residues play an essential role in protein fold formation. It may, therefore, facilitate rational protein engineering design. PMID:25625198
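The accept/reject logic of such a rule set can be sketched as a conjunction of sequence predicates; the two rules below are invented stand-ins for the published distribution rules, not the actual constraints:

```python
def satisfies_rules(sequence, rules):
    """Assign a fold only when every residue-distribution rule holds."""
    return all(rule(sequence) for rule in rules)

HYDROPHOBIC = set("AVILMFWC")

sandwich_like_rules = [
    # Hypothetical rule: an alternating hydrophobic pattern (i, i+2, i+4)
    # occurs somewhere in the chain, as in a strand facing the core.
    lambda s: any(all(s[i + 2 * k] in HYDROPHOBIC for k in range(3))
                  for i in range(len(s) - 4)),
    # Hypothetical rule: no proline inside the putative strand core.
    lambda s: "P" not in s[5:-5],
]

ok = satisfies_rules("GGAVAIAVAGGGG", sandwich_like_rules)
bad = satisfies_rules("GGGGGPGGGGGG", sandwich_like_rules)
```

Iterating this scheme, testing each candidate rule against all known structures of the fold and keeping only those that hold for nearly all of them, is the search process the abstract describes.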

  7. 75 FR 58455 - Self-Regulatory Organizations; New York Stock Exchange LLC; Notice of Filing of a Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-24

    ... Rule 104 To Adopt Pricing Obligations for Designated Market Makers September 20, 2010. Pursuant to... of the Proposed Rule Change The Exchange proposes to amend Rule 104 to adopt pricing obligations for.... Purpose The Exchange proposes to amend Rule 104 to adopt pricing obligations for DMMs. Under the proposal...

  8. Guidelines for Processing and Cataloging Computer Software for Schools and Area Education Agencies. Suggestions to Aid Schools and AEAs.

    ERIC Educational Resources Information Center

    Martin, Elizabeth; And Others

    Based on definitions of a machine-readable data file (MRDF) taken from the Anglo-American Cataloging Rules, second edition (AACR2) and Standards for Cataloging Nonprint Materials, the following recommendations for processing items of computer software are provided: (1) base main and added entry determination on AACR2; (2) place designation of form…

  9. Accident/Mishap Investigation System

    NASA Technical Reports Server (NTRS)

    Keller, Richard; Wolfe, Shawn; Gawdiak, Yuri; Carvalho, Robert; Panontin, Tina; Williams, James; Sturken, Ian

    2007-01-01

    InvestigationOrganizer (IO) is a Web-based collaborative information system that integrates the generic functionality of a database, a document repository, a semantic hypermedia browser, and a rule-based inference system with specialized modeling and visualization functionality to support accident/mishap investigation teams. This accessible, online structure is designed to support investigators by allowing them to make explicit, shared, and meaningful links among evidence, causal models, findings, and recommendations.

  10. Image-based change estimation (ICE): monitoring land use, land cover and agent of change information for all lands

    Treesearch

    Kevin Megown; Andy Lister; Paul Patterson; Tracey Frescino; Dennis Jacobs; Jeremy Webb; Nicholas Daniels; Mark Finco

    2015-01-01

    The Image-based Change Estimation (ICE) protocols have been designed to respond to several Agency and Department information requirements. These include provisions set forth by the 2014 Farm Bill, the Forest Service Action Plan and Strategic Plan, the 2012 Planning Rule, and the 2015 Planning Directives. ICE outputs support the information needs by providing estimates...

  11. Take-off and Landing

    DTIC Science & Technology

    1975-01-01

    Studies Program. The results of AGARD work are reported to the member nations and the NATO Authorities through the AGARD series of publications of...calculated based on a low altitude mission profile. 2. GROUND RULES AND BASIC ASSUMPTIONS Base Design All aircraft synthesized for this study are...In this study manoeuverability is defined in terms of specific excess power (as shown in Fig. 5) at specified Mach number, altitude, and load

  12. Personalised Information Services Using a Hybrid Recommendation Method Based on Usage Frequency

    ERIC Educational Resources Information Center

    Kim, Yong; Chung, Min Gyo

    2008-01-01

    Purpose: This paper seeks to describe a personal recommendation service (PRS) involving an innovative hybrid recommendation method suitable for deployment in a large-scale multimedia user environment. Design/methodology/approach: The proposed hybrid method partitions content and user into segments and executes association rule mining,…

  13. Chemical Safety Information, Site Security and Fuels Regulatory Relief Act: Public Distribution of Off-Site Consequence Analysis Information Fact Sheet

    EPA Pesticide Factsheets

    Based on assessments of the increased risk of terrorist/criminal activity, EPA and DOJ have issued a rule that allows public access to OCA information in ways designed to minimize the likelihood of chemical accidents and public harm.

  14. A MOOC on Approaches to Machine Translation

    ERIC Educational Resources Information Center

    Costa-jussà, Mart R.; Formiga, Lluís; Torrillas, Oriol; Petit, Jordi; Fonollosa, José A. R.

    2015-01-01

    This paper describes the design, development, and analysis of a MOOC entitled "Approaches to Machine Translation: Rule-based, statistical and hybrid", and provides lessons learned and conclusions to be taken into account in the future. The course was developed within the Canvas platform, used by recognized European universities. It…

  15. 76 FR 3859 - Trade Acknowledgment and Verification of Security-Based Swap Transactions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-21

    ... establish, maintain, and enforce policies and procedures that are reasonably designed to obtain prompt...; have the capacity to enforce their rules and discipline their participants; and have chief compliance...). Moreover, as discussed in Part II.E below, an SBS Entity must establish, maintain, and enforce policies and...

  16. Reviewing the College Disciplinary Procedure. Mendip Papers.

    ERIC Educational Resources Information Center

    Kedney, R. J.; Saunders, R.

    This paper provides practical advice on reviewing and designing disciplinary procedures and is set in the context of incorporation of further education and sixth form colleges in England. Reasons are provided for having disciplinary rules, based on the Advisory Conciliation and Arbitration Service's (ACAS) Code of Practice. Relevant English…

  17. The Box-and-Dot Method: A Simple Strategy for Counting Significant Figures

    NASA Astrophysics Data System (ADS)

    Stephenson, W. Kirk

    2009-08-01

    A visual method for counting significant digits is presented. This easy-to-learn (and easy-to-teach) method, designated the box-and-dot method, uses the device of "boxing" significant figures based on two simple rules, then counting the number of digits in the boxes.

  18. An optimal sample data usage strategy to minimize overfitting and underfitting effects in regression tree models based on remotely-sensed data

    USGS Publications Warehouse

    Gu, Yingxin; Wylie, Bruce K.; Boyte, Stephen; Picotte, Joshua J.; Howard, Danny; Smith, Kelcy; Nelson, Kurtis

    2016-01-01

    Regression tree models have been widely used for remote sensing-based ecosystem mapping. Improper use of the sample data (model training and testing data) may cause overfitting and underfitting effects in the model. The goal of this study is to develop an optimal sampling data usage strategy for any dataset and identify an appropriate number of rules in the regression tree model that will improve its accuracy and robustness. Landsat 8 data and Moderate-Resolution Imaging Spectroradiometer-scaled Normalized Difference Vegetation Index (NDVI) were used to develop regression tree models. A Python procedure was designed to generate random replications of model parameter options across a range of model development data sizes and rule number constraints. The mean absolute difference (MAD) between the predicted and actual NDVI (scaled NDVI, value from 0–200) and its variability across the different randomized replications were calculated to assess the accuracy and stability of the models. In our case study, a six-rule regression tree model developed from 80% of the sample data had the lowest MAD (MADtraining = 2.5 and MADtesting = 2.4), which was suggested as the optimal model. This study demonstrates how the training data and rule number selections impact model accuracy and provides important guidance for future remote-sensing-based ecosystem modeling.
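    The replication procedure the study describes (random re-splits of the sample data, then tracking the mean absolute difference and its spread across replications) can be sketched as follows. To keep the example self-contained, a training-mean predictor stands in for the actual regression-tree model, and all names are hypothetical.

```python
import random

def mad(pred, actual):
    """Mean absolute difference between predicted and actual values."""
    return sum(abs(p - a) for p, a in zip(pred, actual)) / len(actual)

def replicate_splits(data, train_frac=0.8, n_reps=100, seed=0):
    """Randomly re-split (x, y) sample data many times and report the mean
    and spread of the test MAD. A training-mean predictor stands in for
    the regression-tree model fitted in the study."""
    rng = random.Random(seed)
    mads = []
    for _ in range(n_reps):
        shuffled = data[:]
        rng.shuffle(shuffled)
        cut = int(len(shuffled) * train_frac)
        train, test = shuffled[:cut], shuffled[cut:]
        prediction = sum(y for _, y in train) / len(train)
        mads.append(mad([prediction] * len(test), [y for _, y in test]))
    mean_mad = sum(mads) / len(mads)
    spread = max(mads) - min(mads)
    return mean_mad, spread
```

    Repeating this grid over different training fractions and rule-number constraints is what lets the study pick the split and model size with the lowest, most stable MAD.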

  19. An object oriented generic controller using CLIPS

    NASA Technical Reports Server (NTRS)

    Nivens, Cody R.

    1990-01-01

    In today's applications, the need to separate code and data has driven the growth of object oriented programming. This philosophy gives software engineers greater control over the environment of an application. Yet the use of object oriented design does not remove the need for the application to have a greater understanding of what the controller is doing; such understanding is only possible with expert systems. Providing a controller that is capable of controlling an object by using rule-based expertise would expedite the use of both object oriented design and expert knowledge of the dynamics of an environment in modern controllers. This project presents a model of a controller that uses the CLIPS expert system and objects in C++ to create a generic controller. The polymorphic abilities of C++ allow for the design of a generic component stored in individual data files. Accompanying the component is a set of rules written in CLIPS which provide the following: the control of individual components, the input of sensory data from components, and the ability to find the status of a given component. Along with the data describing the application, a set of inference rules written in CLIPS allows the application to make use of sensory facts and of status and control abilities. As a demonstration of this ability, control of the environment of a house is provided. This demonstration includes the data files describing the rooms and their contents, such as devices, windows, and doors. The rules used for the home cover the flow of people in the house and the control of devices by the home owner.
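    A minimal sketch of the controller idea, with Python standing in for the CLIPS/C++ pairing, might look like this. The `Device` class, the rule representation (condition/action pairs over sensory facts), and the house example are invented for illustration and are not the project's actual code.

```python
class Device:
    """Generic component: the controller only needs status and control methods."""
    def __init__(self, name):
        self.name, self.on = name, False
    def status(self):
        return self.on
    def set(self, on):
        self.on = on

def run_rules(rules, facts, devices):
    """Forward-chain in miniature: fire every rule whose condition matches
    the sensory facts, letting its action control a device."""
    for condition, action in rules:
        if condition(facts):
            action(devices)

# Hypothetical house demo: turn the lamp on when a room is occupied and dark.
devices = {"lamp": Device("lamp")}
rules = [
    (lambda f: f.get("occupied") and f.get("dark"),
     lambda d: d["lamp"].set(True)),
]
run_rules(rules, {"occupied": True, "dark": True}, devices)
```

    In the actual project the conditions and actions would live in CLIPS rule files, with the generic components described in separate data files.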

  20. Run-length encoding graphic rules, biochemically editable designs and steganographical numeric data embedment for DNA-based cryptographical coding system.

    PubMed

    Kawano, Tomonori

    2013-03-01

    There has been a wide variety of approaches for handling pieces of DNA as "unplugged" tools for digital information storage and processing, including a series of studies in security-related areas such as DNA-based digital barcodes, watermarks, and cryptography. In the present article, novel designs of artificial genes are proposed as media for storing digitally compressed image data for bio-computing purposes, whereas natural genes principally encode proteins. Furthermore, the proposed system allows cryptographical application of DNA through biochemically editable designs with capacity for steganographical numeric data embedment. As a model application of the image-coding DNA technique, numerically and biochemically combined protocols are employed for ciphering given "passwords" and/or secret numbers using DNA sequences. The "passwords" of interest were decomposed into single letters and translated into font images coded on separate DNA chains, with coding regions in which the images are encoded based on the novel run-length encoding rule, and non-coding regions designed for the biochemical editing and remodeling processes that reveal the hidden orientation of the letters composing the original "passwords." The latter processes require molecular biological tools for digestion and ligation of the fragmented DNA molecules, targeting the polymerase chain reaction-engineered termini of the chains. Lastly, additional protocols for steganographical overwriting of numeric data of interest over the image-coding DNA are also discussed.
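    The run-length encoding principle underlying the image-coding regions can be illustrated generically. This sketch encodes a binary font-image row as (value, run-length) pairs and back; it is a textbook RLE, not the paper's specific DNA graphic rule.

```python
def rle_encode(bits):
    """Run-length encode a binary image row as (value, count) pairs."""
    runs = []
    for b in bits:
        if runs and runs[-1][0] == b:
            runs[-1][1] += 1  # extend the current run
        else:
            runs.append([b, 1])  # start a new run
    return [(v, n) for v, n in runs]

def rle_decode(runs):
    """Expand (value, count) pairs back into the original row."""
    return [v for v, n in runs for _ in range(n)]
```

    In the paper's scheme the resulting runs would then be mapped onto DNA base sequences within the coding regions of the artificial gene.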

  1. Model for the design of distributed data bases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ram, S.

    This research focuses on developing a model to solve the File Allocation Problem (FAP). The model integrates two major design issues, namely Concurrency Control and Data Distribution. The central node locking mechanism is incorporated in developing a nonlinear integer programming model. Two solution algorithms are proposed, one of which was implemented in FORTRAN V. The allocation of data bases and programs is examined using this heuristic. Several decision rules were also formulated based on the results of the heuristic. A second, more comprehensive heuristic, based on the knapsack problem, was proposed. The development and implementation of this algorithm has been left as a topic for future research.
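    A knapsack-style allocation heuristic of the kind the abstract alludes to might, under loose assumptions, look like the following greedy sketch: files compete for a node's storage, and those with the best benefit-to-size ratio are placed locally. The benefit/size model and function name are hypothetical, not the paper's formulation.

```python
def allocate(files, capacity):
    """Greedy 0/1 knapsack heuristic: place the files with the best
    benefit-to-size ratio at a node until its storage capacity is filled;
    the remainder are assumed to be accessed remotely."""
    order = sorted(files, key=lambda f: f["benefit"] / f["size"], reverse=True)
    local, used = [], 0
    for f in order:
        if used + f["size"] <= capacity:
            local.append(f["name"])
            used += f["size"]
    return local
```

    A full FAP model would additionally price remote access and locking traffic, which is what pushes the exact formulation into nonlinear integer programming.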

  2. A Methodology for Multiple Rule System Integration and Resolution Within a Singular Knowledge Base

    NASA Technical Reports Server (NTRS)

    Kautzmann, Frank N., III

    1988-01-01

    Expert systems which support knowledge representation by qualitative modeling techniques experience problems when called upon to support integrated views embodying description and explanation, especially when other factors such as multiple causality, competing rule model resolution, and multiple uses of knowledge representation are included. A series of prototypes is being developed to demonstrate the feasibility of automating the processes of systems engineering, design and configuration, and diagnosis and fault management. Such a study involves not only a generic knowledge representation; it must also support multiple views at varying levels of description and interaction between physical elements, systems, and subsystems. Moreover, it will involve models of description and explanation for each level. This multiple-model feature requires the development of control methods between rule systems and heuristics on a meta-level for each expert system involved in an integrated and larger class of expert systems. The broadest possible category of interacting expert systems is described, along with a general methodology for the knowledge representation and control of mutually exclusive rule systems.

  3. Complete denture tooth arrangement technology driven by a reconfigurable rule.

    PubMed

    Dai, Ning; Yu, Xiaoling; Fan, Qilei; Yuan, Fulai; Liu, Lele; Sun, Yuchun

    2018-01-01

    The conventional technique for the fabrication of complete dentures is complex, with a long fabrication process and difficult-to-control restoration quality. In recent years, digital complete denture design has become a research focus. Digital complete denture tooth arrangement is a challenging problem that is difficult to implement efficiently under the constraints of complex tooth arrangement rules and the patient's individualized functional aesthetics. The present study proposes a complete denture automatic tooth arrangement method driven by a reconfigurable rule; it uses four typical operators (a position operator, a scaling operator, a posture operator, and a contact operator) to establish the constraint mapping association between the teeth and the constraint set of the individual patient. Through process reorganization of the different constraint operators, this method can flexibly implement different clinical tooth arrangement rules. When combined with a virtual occlusion algorithm based on progressive iterative Laplacian deformation, the proposed method can achieve automatic and individualized tooth arrangement. Finally, experimental results verify that the proposed method is flexible and efficient.

  4. An automated approach to the design of decision tree classifiers

    NASA Technical Reports Server (NTRS)

    Argentiero, P.; Chin, P.; Beaudet, P.

    1980-01-01

    The classification of large dimensional data sets arising from the merging of remote sensing data with more traditional forms of ancillary data is considered. Decision tree classification, a popular approach to the problem, is characterized by the property that samples are subjected to a sequence of decision rules before they are assigned to a unique class. An automated technique for effective decision tree design which relies only on a priori statistics is presented. This procedure utilizes a set of two-dimensional canonical transforms and Bayes table look-up decision rules. An optimal design at each node is derived based on the associated decision table. A procedure for computing the global probability of correct classification is also provided. An example is given in which class statistics obtained from an actual LANDSAT scene are used as input to the program. The resulting decision tree design has an associated probability of correct classification of .76, compared to the theoretically optimum .79 probability of correct classification associated with a full dimensional Bayes classifier. Recommendations for future research are included.
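    The node-by-node decision rule idea can be sketched as a tiny tree walker plus an empirical estimate of the probability of correct classification. The tree encoding (a nested tuple of rule, left subtree, right subtree) and the toy rules are assumptions for illustration, not the paper's canonical-transform design.

```python
def classify(sample, node):
    """Walk the tree: each internal node holds a decision rule that routes
    the sample left or right; leaves hold a class label."""
    while isinstance(node, tuple):
        rule, left, right = node
        node = left if rule(sample) else right
    return node

def prob_correct(tree, labeled_samples):
    """Empirical probability of correct classification over labeled data."""
    hits = sum(classify(x, tree) == y for x, y in labeled_samples)
    return hits / len(labeled_samples)

# Hypothetical two-feature tree with three classes.
tree = (lambda s: s[0] < 0.5, "A",
        (lambda s: s[1] < 0.5, "B", "C"))
```

    The paper's automated design replaces these hand-written rules with Bayes table look-up rules derived from class statistics at each node.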

  5. 78 FR 79044 - Self-Regulatory Organizations; New York Stock Exchange LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-27

    ... Proposed Rule Change to Offer Risk Management Tools Designed to Allow Member Organizations to Monitor and... of the Proposed Rule Change The Exchange proposes to offer risk management tools designed to allow... risk management tools designed to allow member organizations to monitor and address exposure to risk...

  6. 78 FR 51705 - Endangered and Threatened Wildlife and Plants; Designation of Critical Habitat for Ivesia webberi

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-21

    ... Habitat for Ivesia webberi (Webber's ivesia) AGENCY: Fish and Wildlife Service, Interior. ACTION: Proposed... dates published in the August 2, 2013, proposed rule to designate critical habitat for Ivesia webberi... rule to designate critical habitat for Ivesia webberi, we included the wrong date for the public...

  7. Cognitive Tutoring based on Intelligent Decision Support in the PENTHA Instructional Design Model

    NASA Astrophysics Data System (ADS)

    dall'Acqua, Luisa

    2010-06-01

    The aim of this research is to show how to support Authors in developing rule-driven, subject-oriented, adaptable course content and meta-rules (representing the disciplinary epistemology, model of teaching, Learning Path structure, and assessment parameters) for intelligent Tutoring actions in a personalized, adaptive e-Learning environment. The focus is on teaching the student to be a decision manager in his own right, able to recognize the elements of a problem and select the information needed to make informed choices. In particular, our research intends to provide some fundamental guidelines for the definition of the didactical rules and logical relations that Authors should provide to a cognitive Tutoring system, through the use of an Instructional Design method (the PENTHA Model) which proposes an educational environment able to increase productivity and operability; create conditions for a cooperative dialogue, developing participatory research activities of knowledge, observations, and discoveries; and customize the learning design in a complex and holistic vision of the learning/teaching processes.

  8. Design space exploration for early identification of yield limiting patterns

    NASA Astrophysics Data System (ADS)

    Li, Helen; Zou, Elain; Lee, Robben; Hong, Sid; Liu, Square; Wang, JinYan; Du, Chunshan; Zhang, Recco; Madkour, Kareem; Ali, Hussein; Hsu, Danny; Kabeel, Aliaa; ElManhawy, Wael; Kwan, Joe

    2016-03-01

    In order to resolve the causality dilemma of which comes first, accurate design rules or real designs, this paper presents a flow for exploring the layout design space to identify, early on, problematic patterns that will negatively affect yield. A new random layout generation method called the Layout Schema Generator (LSG) is reported in this paper; it generates realistic design-like layouts without any design rule violations. Lithography simulation is then used on the generated layouts to discover potentially problematic patterns (hotspots). These hotspot patterns are further explored by randomly inducing feature and context variations in the identified hotspots through a flow called the Hotspot Variation Flow (HSV). Simulation is then performed on this expanded set of layout clips to identify further problematic patterns. These patterns are then classified into forbidden patterns, which should be included in the design rule checker, and legal patterns, which need better handling in the RET recipes and processes.

  9. A rule-based software test data generator

    NASA Technical Reports Server (NTRS)

    Deason, William H.; Brown, David B.; Chang, Kai-Hsiung; Cross, James H., II

    1991-01-01

    Rule-based software test data generation is proposed as an alternative to either path/predicate analysis or random data generation. A prototype rule-based test data generator for Ada programs is constructed and compared to a random test data generator. Four Ada procedures are used in the comparison. Approximately 2000 rule-based test cases and 100,000 randomly generated test cases are automatically generated and executed. The success of the two methods is compared using standard coverage metrics. Simple statistical tests show that even the primitive rule-based test data generation prototype is significantly better than random data generation. This result demonstrates that rule-based test data generation is feasible and shows great promise in assisting test engineers, especially as the rule base is developed further.
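    The standard coverage metric used for the comparison can be sketched as follows, assuming a toy model in which each test input reports the set of branch ids it exercises. All names are illustrative; the actual study instrumented Ada procedures.

```python
def coverage(test_cases, branches_hit, total_branches):
    """Fraction of program branches exercised by a test set.
    `branches_hit` maps a test input to the set of branch ids it covers."""
    covered = set()
    for t in test_cases:
        covered |= branches_hit(t)
    return len(covered) / total_branches

# Toy program under test: two branches, taken on positive vs non-positive input.
def branches(x):
    return {"pos"} if x > 0 else {"nonpos"}
```

    Under this metric a rule-based generator wins if its (far smaller) test set covers at least as many branches as the large random set.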

  10. Rule groupings: An approach towards verification of expert systems

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala

    1991-01-01

    Knowledge-based expert systems are playing an increasingly important role in NASA space and aircraft systems. However, many of NASA's software applications are life- or mission-critical and knowledge-based systems do not lend themselves to the traditional verification and validation techniques for highly reliable software. Rule-based systems lack the control abstractions found in procedural languages. Hence, it is difficult to verify or maintain such systems. Our goal is to automatically structure a rule-based system into a set of rule-groups having a well-defined interface to other rule-groups. Once a rule base is decomposed into such 'firewalled' units, studying the interactions between rules becomes more tractable. Verification-aid tools can then be developed to test the behavior of each such rule-group. Furthermore, the interactions between rule-groups can be studied in a manner similar to integration testing. Such efforts will go a long way towards increasing our confidence in the expert-system software. Our research efforts address the feasibility of automating the identification of rule groups, in order to decompose the rule base into a number of meaningful units.
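    One plausible reading of the grouping step (rules interact when they mention a common working-memory fact) can be sketched as a connected-components computation over the rule/fact graph. The rule representation below is an assumption for illustration, not the paper's algorithm.

```python
from collections import defaultdict

def group_rules(rules):
    """Partition a rule base into 'firewalled' groups: rules that mention a
    common fact (in their conditions or actions) end up in the same group.
    `rules` maps a rule name to the set of fact symbols it mentions."""
    fact_to_rules = defaultdict(set)
    for name, facts in rules.items():
        for f in facts:
            fact_to_rules[f].add(name)
    seen, groups = set(), []
    for name in rules:
        if name in seen:
            continue
        # Flood-fill over the bipartite rule/fact graph.
        group, stack = set(), [name]
        while stack:
            r = stack.pop()
            if r in group:
                continue
            group.add(r)
            for f in rules[r]:
                stack.extend(fact_to_rules[f] - group)
        seen |= group
        groups.append(group)
    return groups
```

    Each resulting group can then be verified in isolation, with cross-group interactions checked separately, in the spirit of integration testing.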

  11. Towards computerizing intensive care sedation guidelines: design of a rule-based architecture for automated execution of clinical guidelines

    PubMed Central

    2010-01-01

    Background Computerized ICUs rely on software services to convey the medical condition of their patients as well as assisting the staff in taking treatment decisions. Such services are useful for following clinical guidelines quickly and accurately. However, the development of services is often time-consuming and error-prone. Consequently, many care-related activities are still conducted based on manually constructed guidelines. These are often ambiguous, which leads to unnecessary variations in treatments and costs. The goal of this paper is to present a semi-automatic verification and translation framework capable of turning manually constructed diagrams into ready-to-use programs. This framework combines the strengths of the manual and service-oriented approaches while decreasing their disadvantages. The aim is to close the gap in communication between the IT and the medical domain. This leads to a less time-consuming and error-prone development phase and a shorter clinical evaluation phase. Methods A framework is proposed that semi-automatically translates a clinical guideline, expressed as an XML-based flow chart, into a Drools Rule Flow by employing semantic technologies such as ontologies and SWRL. An overview of the architecture is given and all the technology choices are thoroughly motivated. Finally, it is shown how this framework can be integrated into a service-oriented architecture (SOA). Results The applicability of the Drools Rule language to express clinical guidelines is evaluated by translating an example guideline, namely the sedation protocol used for the anaesthetization of patients, to a Drools Rule Flow and executing and deploying this Rule-based application as a part of a SOA. The results show that the performance of Drools is comparable to other technologies such as Web Services and increases with the number of decision nodes present in the Rule Flow. Most delays are introduced by loading the Rule Flows. 
Conclusions The framework is an effective solution for computerizing clinical guidelines as it allows for quick development, evaluation and human-readable visualization of the Rules and has a good performance. By monitoring the parameters of the patient to automatically detect exceptional situations and problems and by notifying the medical staff of tasks that need to be performed, the computerized sedation guideline improves the execution of the guideline. PMID:20082700

  12. Towards computerizing intensive care sedation guidelines: design of a rule-based architecture for automated execution of clinical guidelines.

    PubMed

    Ongenae, Femke; De Backere, Femke; Steurbaut, Kristof; Colpaert, Kirsten; Kerckhove, Wannes; Decruyenaere, Johan; De Turck, Filip

    2010-01-18

    Computerized ICUs rely on software services to convey the medical condition of their patients as well as assisting the staff in taking treatment decisions. Such services are useful for following clinical guidelines quickly and accurately. However, the development of services is often time-consuming and error-prone. Consequently, many care-related activities are still conducted based on manually constructed guidelines. These are often ambiguous, which leads to unnecessary variations in treatments and costs. The goal of this paper is to present a semi-automatic verification and translation framework capable of turning manually constructed diagrams into ready-to-use programs. This framework combines the strengths of the manual and service-oriented approaches while decreasing their disadvantages. The aim is to close the gap in communication between the IT and the medical domain. This leads to a less time-consuming and error-prone development phase and a shorter clinical evaluation phase. A framework is proposed that semi-automatically translates a clinical guideline, expressed as an XML-based flow chart, into a Drools Rule Flow by employing semantic technologies such as ontologies and SWRL. An overview of the architecture is given and all the technology choices are thoroughly motivated. Finally, it is shown how this framework can be integrated into a service-oriented architecture (SOA). The applicability of the Drools Rule language to express clinical guidelines is evaluated by translating an example guideline, namely the sedation protocol used for the anaesthetization of patients, to a Drools Rule Flow and executing and deploying this Rule-based application as a part of a SOA. The results show that the performance of Drools is comparable to other technologies such as Web Services and increases with the number of decision nodes present in the Rule Flow. Most delays are introduced by loading the Rule Flows.
The framework is an effective solution for computerizing clinical guidelines as it allows for quick development, evaluation and human-readable visualization of the Rules and has a good performance. By monitoring the parameters of the patient to automatically detect exceptional situations and problems and by notifying the medical staff of tasks that need to be performed, the computerized sedation guideline improves the execution of the guideline.

  13. 77 FR 71853 - Self-Regulatory Organizations; NYSE MKT LLC; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-04

    ... Moving the Rule Text That Provides for Pegging on the Exchange From Supplementary Material .26 of Rule 70--Equities to Rule 13--Equities and Amending Such Text to (i) Permit Designated Market Maker Interest To Be... Proposed Rule Change The Exchange proposes to move the rule text that provides for pegging on the Exchange...

  14. 77 FR 39547 - Self-Regulatory Organizations; NYSE Amex LLC; Notice of Designation of a Longer Period for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-03

    ... Proposed Rule Change Amending Commentary .07 to NYSE Amex Options Rule 904 To Eliminate Position Limits for... Act of 1934 (the ``Act'') \\2\\ and Rule 19b-4 thereunder,\\3\\ a proposed rule change to eliminate... side of the market. The proposal would amend Commentary .07 to NYSE Amex Options Rule 904 to eliminate...

  15. 18 CFR 385.1403 - Petitions seeking institution of rulemaking proceedings (Rule 1404).

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... PROCEDURE Oil Pipeline Proceedings § 385.1403 Petitions seeking institution of rulemaking proceedings (Rule... purpose of issuing statements, rules, or regulations of general applicability and significance designed to...

  16. Direct Final Rule for Technical Amendments for Marine Spark-Ignition Engines and Vessels

    EPA Pesticide Factsheets

    Rule published September 16, 2010 to make technical amendments to the design standard for portable marine fuel tanks. This rule incorporates safe recommended practices, developed through industry consensus.

  17. Litho hotspots fixing using model based algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, Meili; Yu, Shirui; Mao, Zhibiao; Shafee, Marwa; Madkour, Kareem; ElManhawy, Wael; Kwan, Joe; Hu, Xinyi; Wan, Qijian; Du, Chunshan

    2017-04-01

    As technology advances, IC designs are getting more sophisticated, so it becomes more critical and challenging to fix printability issues in the design flow. Running lithography checks before tapeout is now mandatory for designers, which creates a need for more advanced and easy-to-use techniques for fixing hotspots found after lithographic simulation without creating a new design rule checking (DRC) violation or generating a new hotspot. This paper presents a new methodology for fixing hotspots on layouts using the same engine currently used to detect them. The fix is achieved by applying the minimum movement of the edges causing the hotspot, with consideration of DRC constraints. The fix is internally simulated by the lithographic simulation engine to verify that the hotspot is eliminated and that no new hotspot is generated by the new edge locations. Hotspot fix checking is enhanced by adding DRC checks to the litho-friendly design (LFD) rule file to guarantee that any fix options that violate DRC checks are removed from the output hint file. This extra checking eliminates the need to re-run both DRC and LFD checks to ensure the change successfully fixed the hotspot, which saves time and simplifies the designer's workflow. This methodology is demonstrated on industrial designs, where the fixing rate of single- and dual-layer hotspots is reported.

  18. Michigan's air emission trading program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Russette, T.M.; VanKolken, A.M.

    1997-12-31

    Michigan's Emission Trading Program took effect on March 16, 1996 after two years of rule development by the Michigan Department of Environmental Quality, Air Quality Division and affected stakeholders. This program is based on the open market trading model and has been designed to (1) be consistent with existing federal and state rules and regulations, (2) integrate with existing air programs such as the permit program, and (3) address the needs of Michigan's regulated community. Michigan's Air Quality Division, along with other interested parties, initiated this program as part of market-based approaches to improve air quality through the reduction of criteria pollutants (except ozone) and volatile organic compounds. The Emission Trading rules offer potential benefits for Michigan companies that include increased operational flexibility, lower compliance costs, and/or money generated from the sale of emission reduction credits. The environment also benefits from this program because the rules require that 10 percent of all registered emission reductions must be permanently retired as an air quality benefit. The emission trading program provides new opportunities for consulting firms to assist companies by identifying acceptable ways to generate and use emission reduction credits. Air pollution control companies may also see new opportunities by designing and installing control equipment in order to reduce air emissions. The role of consultants and equipment companies may expand to that of a broker selling and/or buying emission reduction credits on the Emission Trading Registry. Much has been learned since the conception of the air emission trading program. This paper will discuss how the program works in practice compared to what was envisioned in theory, and the potential benefits from Michigan's Emission Trading Program.

  19. Railway Online Booking System Design and Implementation

    NASA Astrophysics Data System (ADS)

    Zongjiang, Wang

    In this paper, we define rule usefulness and introduce an approach to evaluating rule usefulness in rough sets, and we propose a method for obtaining the most useful rules. This method is easy and effective in applications to prisoners' reform. Compared with methods that extract the most interesting rules, ours is direct and objective: rule interestingness must consider predefined knowledge about what kind of information is interesting. Our method greatly reduces the number of rules generated and provides a measure of rule usefulness at the same time.

  20. Study of a Secondary Power System Based on an Intermediate Bus Converter and POLs

    NASA Astrophysics Data System (ADS)

    Santoja, Almudena; Fernandez, Arturo; Tonicello, Ferdinando

    2014-08-01

    Secondary power systems in satellites are anything but standard nowadays. All sorts of options can be found and, in the end, a new custom design is used in most cases. Even though this might be interesting in some specific cases, for most of them it would be more convenient to have a straightforward system based on standard components. One option to achieve this is to design the secondary power system with an Intermediate Bus Converter (IBC) and Point of Load converters (POLs). This paper presents a study of this architecture and some experimental verification to establish basic rules aimed at achieving an optimum design of this system.
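    A back-of-envelope sketch of the two-stage efficiency budget behind such an architecture; the function and the sample efficiencies below are illustrative placeholders, not figures from the paper's experimental verification.

```python
# Hypothetical two-stage IBC + POL efficiency budget: each POL draws its load
# power divided by its own efficiency from the intermediate bus, and the IBC
# must deliver that total from the primary bus. All numbers are illustrative.

def end_to_end_efficiency(eta_ibc, pol_loads):
    """pol_loads: list of (load_watts, eta_pol). Returns overall efficiency."""
    p_out = sum(p for p, _ in pol_loads)
    p_ibc_out = sum(p / eta for p, eta in pol_loads)  # power the IBC must supply
    p_in = p_ibc_out / eta_ibc                        # power drawn from the bus
    return p_out / p_in

# Single 10 W load behind a 90%-efficient POL and a 95%-efficient IBC:
eta = end_to_end_efficiency(0.95, [(10.0, 0.9)])
```

For a single load the result collapses to the product of the stage efficiencies, which is why each converter stage must be individually efficient for the architecture to compete with a custom single-stage design.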

  1. Integrated model-based retargeting and optical proximity correction

    NASA Astrophysics Data System (ADS)

    Agarwal, Kanak B.; Banerjee, Shayak

    2011-04-01

    Conventional resolution enhancement techniques (RET) are becoming increasingly inadequate at addressing the challenges of subwavelength lithography. In particular, features show high sensitivity to process variation in low-k1 lithography. Process variation aware RETs such as process-window OPC (PWOPC) are becoming increasingly important to guarantee high lithographic yield, but such techniques suffer from high runtime impact. An alternative to PWOPC is to perform retargeting, which is a rule-assisted modification of target layout shapes to improve their process window. However, rule-based retargeting is not a scalable technique since rules cannot cover the entire search space of two-dimensional shape configurations, especially with technology scaling. In this paper, we propose to integrate the processes of retargeting and optical proximity correction (OPC). We utilize the normalized image log slope (NILS) metric, which is available at no extra computational cost during OPC. We use NILS to guide dynamic target modification between iterations of OPC. We utilize the NILS tagging capabilities of Calibre TCL scripting to identify fragments with low NILS. We then perform NILS binning to assign different magnitudes of retargeting to different NILS bins. NILS is determined both for width, to identify regions of pinching, and for space, to locate regions of potential bridging. We develop an integrated flow for 1x metal lines (M1) that exhibits fewer lithographic hotspots compared to a flow with just OPC and no retargeting. We also observe cases where hotspots that existed in the rule-based retargeting flow are fixed using our methodology. Finally, we demonstrate that such a retargeting methodology does not significantly alter design properties by electrically simulating a latch layout before and after retargeting. We observe less than 1% impact on latch Clk-Q and D-Q delays post-retargeting, which makes this methodology an attractive one for use in improving shape process windows without perturbing designed values.
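    The NILS-binning step described above can be sketched as a simple threshold lookup; the bin boundaries and bias magnitudes here are invented for illustration and are not the paper's calibrated values.

```python
# Hypothetical NILS-driven retarget binning: fragments with lower NILS (worse
# image contrast) receive a larger target bias. Thresholds and biases are
# illustrative placeholders, not values from the paper.

def retarget_bias(nils, bins=((1.0, 4.0), (1.5, 2.0), (2.0, 1.0))):
    """Return a retarget magnitude (nm) for a fragment's NILS value.

    `bins` is a sequence of (nils_threshold, bias_nm) pairs in ascending
    threshold order; the first threshold the NILS falls below wins.
    """
    for threshold, bias in bins:
        if nils < threshold:
            return bias
    return 0.0  # NILS high enough: no retargeting needed

fragments = {"f1": 0.8, "f2": 1.7, "f3": 2.5}
biases = {name: retarget_bias(n) for name, n in fragments.items()}
```

In an actual flow the bias would be applied to the fragment's target edge between OPC iterations, separately for width (pinching) and space (bridging) fragments.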

  2. FPGA chip performance improvement with gate shrink through alternating PSM 90nm process

    NASA Astrophysics Data System (ADS)

    Yu, Chun-Chi; Shieh, Ming-Feng; Liu, Erick; Lin, Benjamin; Ho, Jonathan; Wu, Xin; Panaite, Petrisor; Chacko, Manoj; Zhang, Yunqiang; Lei, Wen-Kang

    2005-11-01

    In the post-physical verification space called 'Mask Synthesis' a key component of design-for-manufacturing (DFM), double-exposure based, dark-field, alternating PSM (Alt-PSM) is being increasingly applied at the 90nm node in addition with other mature resolution enhancement techniques (RETs) such as optical proximity correction (OPC) and sub-resolution assist features (SRAF). Several high-performance IC manufacturers already use alt-PSM technology in 65nm production. At 90nm having strong control over the lithography process is a critical component in meeting targeted yield goals. However, implementing alt-PSM in production has been challenging due to several factors such as phase conflict errors, mask manufacturing, and the increased production cost due to the need for two masks in the process. Implementation of Alt-PSM generally requires phase compliance rules and proper phase topology in the layout and this has been successful for the technology node with these rules implemented. However, this may not be true for a mature, production process technology, in this case 90 nm. Especially, in the foundry-fabless business model where the foundry provides a standard set of design rules to its customers for a given process technology, and where not all the foundry customers require Alt-PSM in their tapeout flow. With minimum design changes, design houses usually are motivated by higher product performance for the existing designs. What follows is an in-depth review of the motivation to apply alt-PSM on a production FPGA, the DFM challenges to each partner faced, its effect on the tapeout flow, and how design, manufacturing, and EDA teams worked together to resolve phase conflicts, tapeout the chip, and finally verify the silicon results in production.

  3. Retrieving and Indexing Spatial Data in the Cloud Computing Environment

    NASA Astrophysics Data System (ADS)

    Wang, Yonggang; Wang, Sheng; Zhou, Daliang

    In order to overcome the drawbacks of spatial data storage on common Cloud Computing platforms, we design and present a framework for retrieving, indexing, accessing, and managing spatial data in the Cloud environment. An interoperable spatial data object model is provided based on the Simple Feature coding rules from the OGC, such as Well Known Binary (WKB) and Well Known Text (WKT), and the classic spatial indexing algorithms such as the Quad-Tree and R-Tree are re-designed for the Cloud Computing environment. Finally, we develop prototype software based on Google App Engine to implement the proposed model.
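    As a rough illustration of the kind of spatial index being re-designed, here is a minimal pure-Python point quadtree; the class and its interface are invented for this sketch, not taken from the paper's prototype.

```python
# Minimal point quadtree: each node holds up to `capacity` points and splits
# into four quadrants on overflow. Range queries prune subtrees whose bounds
# do not intersect the query rectangle.

class QuadTree:
    def __init__(self, x0, y0, x1, y1, capacity=4):
        self.bounds = (x0, y0, x1, y1)
        self.capacity = capacity
        self.points = []
        self.children = None  # four sub-quadrants once split

    def _contains(self, x, y):
        x0, y0, x1, y1 = self.bounds
        return x0 <= x < x1 and y0 <= y < y1

    def insert(self, x, y):
        if not self._contains(x, y):
            return False
        if self.children is None:
            if len(self.points) < self.capacity:
                self.points.append((x, y))
                return True
            self._split()
        return any(c.insert(x, y) for c in self.children)

    def _split(self):
        x0, y0, x1, y1 = self.bounds
        mx, my = (x0 + x1) / 2, (y0 + y1) / 2
        self.children = [QuadTree(x0, y0, mx, my), QuadTree(mx, y0, x1, my),
                         QuadTree(x0, my, mx, y1), QuadTree(mx, my, x1, y1)]
        for p in self.points:  # redistribute stored points to the quadrants
            any(c.insert(*p) for c in self.children)
        self.points = []

    def query(self, qx0, qy0, qx1, qy1):
        x0, y0, x1, y1 = self.bounds
        if qx1 < x0 or qx0 >= x1 or qy1 < y0 or qy0 >= y1:
            return []  # no overlap with this node: prune
        hits = [p for p in self.points
                if qx0 <= p[0] <= qx1 and qy0 <= p[1] <= qy1]
        if self.children:
            for c in self.children:
                hits.extend(c.query(qx0, qy0, qx1, qy1))
        return hits
```

A Cloud re-design of this structure would typically map each quadrant to a key range in the underlying key-value store rather than to in-memory child objects.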

  4. PVDaCS - A prototype knowledge-based expert system for certification of spacecraft data

    NASA Technical Reports Server (NTRS)

    Wharton, Cathleen; Shiroma, Patricia J.; Simmons, Karen E.

    1989-01-01

    On-line data management techniques to certify spacecraft information are mandated by increasing telemetry rates. Knowledge-based expert systems offer the ability to certify data electronically without the need for time-consuming human interaction. Issues of automatic certification are explored by designing a knowledge-based expert system to certify data from a scientific instrument, the Orbiter Ultraviolet Spectrometer, on an operating NASA planetary spacecraft, Pioneer Venus. The resulting rule-based system, called PVDaCS (Pioneer Venus Data Certification System), is a functional prototype demonstrating the concepts of a larger system design. A key element of the system design is the representation of an expert's knowledge through the usage of well ordered sequences. PVDaCS produces a certification value derived from expert knowledge and an analysis of the instrument's operation. Results of system performance are presented.

  5. Design a Fuzzy Rule-based Expert System to Aid Earlier Diagnosis of Gastric Cancer.

    PubMed

    Safdari, Reza; Arpanahi, Hadi Kazemi; Langarizadeh, Mostafa; Ghazisaiedi, Marjan; Dargahi, Hossein; Zendehdel, Kazem

    2018-01-01

    Screening and health check-up programs are among the most important public health priorities and should be undertaken to control dangerous diseases such as gastric cancer, which is affected by many different factors. More than 50% of gastric cancer diagnoses are made during the advanced stage, and there is currently no systematic approach for early diagnosis of gastric cancer. The objective was to develop a fuzzy expert system that can identify gastric cancer risk levels in individuals. The system was implemented in MATLAB software; the Mamdani inference technique was applied to simulate the reasoning of experts in the field, and a total of 67 fuzzy rules were extracted as a rule base from medical experts' opinions. 50 case scenarios were used to evaluate the system: the information from each case report was given to the system to find its risk level, and the obtained results were compared with the experts' diagnoses. Results revealed a sensitivity of 92.1% and a specificity of 83.1%. The results show that it is possible to develop a system that can identify individuals at high risk for gastric cancer. The system can lead to earlier diagnosis, which may facilitate early treatment and reduce the gastric cancer mortality rate.
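    A toy Mamdani-style evaluation may help make the pipeline concrete; the membership functions, the two rules, and the output centroids below are entirely invented, since the system's 67-rule base is not reproduced in the abstract.

```python
# Toy Mamdani-style inference: triangular memberships, min for rule AND,
# weighted-centroid defuzzification. Both rules and all shapes are invented
# for illustration only.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def risk_score(age, symptom_severity):
    # Rule 1: IF age is old AND severity is high THEN risk is high (centroid 0.9)
    # Rule 2: IF age is young AND severity is low THEN risk is low (centroid 0.1)
    r1 = min(tri(age, 40, 70, 100), tri(symptom_severity, 5, 10, 15))
    r2 = min(tri(age, 0, 25, 50), tri(symptom_severity, 0, 2, 6))
    weights = [(r1, 0.9), (r2, 0.1)]
    total = sum(w for w, _ in weights)
    if total == 0:
        return 0.5  # no rule fires: indeterminate
    # Weighted-centroid defuzzification over the fired rules
    return sum(w * c for w, c in weights) / total
```

The real system's extra value comes from the breadth of its rule base and expert-derived memberships; the inference mechanics, however, are exactly this firing-and-defuzzification loop.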

  6. Automated Database Mediation Using Ontological Metadata Mappings

    PubMed Central

    Marenco, Luis; Wang, Rixin; Nadkarni, Prakash

    2009-01-01

    Objective To devise an automated approach for integrating federated database information using database ontologies constructed from their extended metadata. Background One challenge of database federation is that the granularity of representation of equivalent data varies across systems. Dealing effectively with this problem is analogous to dealing with precoordinated vs. postcoordinated concepts in biomedical ontologies. Model Description The authors describe an approach based on ontological metadata mapping rules defined with elements of a global vocabulary, which allows a query specified at one granularity level to fetch data, where possible, from databases within the federation that use different granularities. This is implemented in OntoMediator, a newly developed production component of our previously described Query Integrator System. OntoMediator's operation is illustrated with a query that accesses three geographically separate, interoperating databases. An example based on SNOMED also illustrates the applicability of high-level rules to support the enforcement of constraints that can prevent inappropriate curator or power-user actions. Summary A rule-based framework simplifies the design and maintenance of systems where categories of data must be mapped to each other, for the purpose of either cross-database query or for curation of the contents of compositional controlled vocabularies. PMID:19567801

  7. 78 FR 36434 - Revisions to Rules of Practice

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-18

    ... federal holidays, make grammatical corrections, and remove the reference to part-day holidays. Rule 3001... section, the following categories of persons are designated ``decision-making personnel'': (i) The.... The following categories of person are designated ``non-decision-making personnel'': (i) All...

  8. 17 CFR Appendix A to Part 38 - Guidance on Compliance With Designation Criteria

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... the criteria for designation. To the extent that compliance with, or satisfaction of, a criterion for designation is not self-explanatory from the face of the contract market's rules (as defined in § 40.1 of this... FACILITY—The board of trade shall—(A) establish and enforce rules defining, or specifications detailing...

  9. 10 CFR Appendix A to Part 52 - Design Certification Rule for the U.S. Advanced Boiling Water Reactor

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Water Reactor A Appendix A to Part 52 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSES... Rule for the U.S. Advanced Boiling Water Reactor I. Introduction Appendix A constitutes the standard design certification for the U.S. Advanced Boiling Water Reactor (ABWR) design, in accordance with 10 CFR...

  10. 10 CFR Appendix A to Part 52 - Design Certification Rule for the U.S. Advanced Boiling Water Reactor

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Water Reactor A Appendix A to Part 52 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSES... Rule for the U.S. Advanced Boiling Water Reactor I. Introduction Appendix A constitutes the standard design certification for the U.S. Advanced Boiling Water Reactor (ABWR) design, in accordance with 10 CFR...

  11. 76 FR 49303 - Approval and Promulgation of Air Quality Implementation Plans; Minnesota; Rules Update

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-10

    ... design requirements for the monitoring systems. The revised CMS rules also delineate the recordkeeping..., [Insert page number where the document begins]. 7017.1140 CEMS design 03/01/99 08/10/11, [Insert page...]. 7017.1190 COMS design 03/01/99 08/10/11, [Insert page requirements. number where the document begins...

  12. Double Linear Damage Rule for Fatigue Analysis

    NASA Technical Reports Server (NTRS)

    Halford, G.; Manson, S.

    1985-01-01

    Double Linear Damage Rule (DLDR) method for use by structural designers to determine fatigue-crack-initiation life when structure subjected to unsteady, variable-amplitude cyclic loadings. Method calculates in advance of service how many loading cycles imposed on structural component before macroscopic crack initiates. Approach eventually used in design of high performance systems and incorporated into design handbooks and codes.
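    A sketch of the bilinear remaining-life curve commonly quoted for the Manson-Halford DLDR follows; the 0.35/0.65 coefficients and the 0.25 exponent are the widely cited form and should be checked against the NASA report before any design use.

```python
# Sketch of the two-segment (bilinear) remaining-life curve often quoted for
# the Manson-Halford Double Linear Damage Rule. The knee coordinates
# (0.35*alpha, 0.65*alpha) with alpha = (N1/N2)**0.25 follow the commonly
# cited form; verify against the original report before relying on them.

def remaining_life_fraction(n1_frac, N1, N2):
    """Remaining cycle fraction n2/N2 at life level 2 after spending
    n1_frac = n1/N1 of life at level 1 (N1 < N2: high load applied first)."""
    alpha = (N1 / N2) ** 0.25
    knee_x, knee_y = 0.35 * alpha, 0.65 * alpha
    if n1_frac <= knee_x:
        # Phase I (crack initiation) segment: from (0, 1) to the knee
        return 1.0 + (knee_y - 1.0) * n1_frac / knee_x
    # Phase II (crack propagation) segment: from the knee to (1, 0)
    return knee_y * (1.0 - n1_frac) / (1.0 - knee_x)
```

For N1 == N2 the knee lies on the straight line from (0, 1) to (1, 0), so the rule collapses to the classic linear (Miner) damage rule, which is a useful sanity check.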

  13. An expert system for prediction of aquatic toxicity of contaminants

    USGS Publications Warehouse

    Hickey, James P.; Aldridge, Andrew J.; Passino, Dora R. May; Frank, Anthony M.; Hushon, Judith M.

    1990-01-01

    The National Fisheries Research Center-Great Lakes has developed an interactive computer program in muLISP that runs on an IBM-compatible microcomputer and uses a linear solvation energy relationship (LSER) to predict acute toxicity to four representative aquatic species from the detailed structure of an organic molecule. Using the SMILES formalism for a chemical structure, the expert system identifies all structural components and uses a knowledge base of rules based on an LSER to generate four structure-related parameter values. A separate module then relates these values to toxicity. The system is designed for rapid screening of potential chemical hazards before laboratory or field investigations are conducted and can be operated by users with little toxicological background. This is the first expert system based on LSER, relying on the first comprehensive compilation of rules and values for the estimation of LSER parameters.
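    The LSER evaluation step can be sketched as a linear combination of the four structure-derived parameters; the coefficients below are invented placeholders, not the system's species-specific regressions.

```python
# Illustrative LSER (linear solvation energy relationship) evaluation.
# The regression coefficients c, m, s, a, b are invented placeholders;
# the expert system fits these per species from toxicity data.

def lser_log_toxicity(V, pi_star, alpha, beta,
                      c=0.3, m=3.2, s=-0.6, a=0.1, b=-2.1):
    """log(1/LC50) from the four solvatochromic parameters:
    V = molar volume term, pi_star = polarity/polarizability,
    alpha = H-bond donor acidity, beta = H-bond acceptor basicity."""
    return c + m * V + s * pi_star + a * alpha + b * beta
```

In the described system, the rule base's job is the hard part: deriving V, pi_star, alpha, and beta from the SMILES structure before this linear step is applied.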

  14. Multi Groups Cooperation based Symbiotic Evolution for TSK-type Neuro-Fuzzy Systems Design

    PubMed Central

    Cheng, Yi-Chang; Hsu, Yung-Chi

    2010-01-01

    In this paper, a TSK-type neuro-fuzzy system with a multi-group cooperation based symbiotic evolution method (TNFS-MGCSE) is proposed. The TNFS-MGCSE is developed from symbiotic evolution, which differs from traditional genetic algorithms (GAs) in that each chromosome represents a single rule of the fuzzy model. The MGCSE differs from traditional symbiotic evolution in that the population is divided into several groups. Each group, formed by a set of chromosomes, represents a fuzzy rule and cooperates with the other groups to generate better chromosomes using the proposed cooperation based crossover strategy (CCS). The proposed TNFS-MGCSE is evaluated on numerical examples (the Mackey-Glass chaotic time series and sunspot number forecasting). In the simulations, the performance of the TNFS-MGCSE compares favorably with existing models. PMID:21709856

  15. A knowledge-based, concept-oriented view generation system for clinical data.

    PubMed

    Zeng, Q; Cimino, J J

    2001-04-01

    Information overload is a well-known problem for clinicians who must review large amounts of data in patient records. Concept-oriented views, which organize patient data around clinical concepts such as diagnostic strategies and therapeutic goals, may offer a solution to the problem of information overload. However, although concept-oriented views are desirable, they are difficult to create and maintain. We have developed a general-purpose, knowledge-based approach to the generation of concept-oriented views and have developed a system to test our approach. The system creates concept-oriented views through automated identification of relevant patient data. The knowledge in the system is represented by both a semantic network and rules. The key relevant data identification function is accomplished by a rule-based traversal of the semantic network. This paper focuses on the design and implementation of the system; an evaluation of the system is reported separately.
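    A minimal sketch of rule-based traversal over a semantic network, with a toy network and a single link-inclusion rule invented for illustration (the paper's knowledge base is far richer):

```python
# Toy semantic network as an adjacency map of (link-type, target) pairs, and a
# traversal that collects data relevant to a concept. The network contents and
# the allowed-link rule are invented for this sketch.

semantic_net = {
    "diabetes": [("monitored-by", "HbA1c"), ("treated-by", "insulin")],
    "HbA1c": [("is-a", "lab-test")],
    "insulin": [("is-a", "medication")],
}

def relevant_data(concept, allowed_links=("monitored-by", "treated-by")):
    """Collect concepts reachable via links the inclusion rule permits."""
    hits, stack, seen = [], [concept], {concept}
    while stack:
        node = stack.pop()
        for link, target in semantic_net.get(node, []):
            if link in allowed_links and target not in seen:
                hits.append(target)
                seen.add(target)
                stack.append(target)
    return sorted(hits)
```

A concept-oriented view generator would then render only the patient data items whose concepts appear in the traversal result, which is how it filters out the overload.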

  16. Reconciled Rat and Human Metabolic Networks for Comparative Toxicogenomics and Biomarker Predictions

    DTIC Science & Technology

    2017-02-08

    ...compared with the original human GPR rules (Supplementary Fig. 3). The consensus-based approach for filtering orthology annotations was designed to... Received 29 Jan 2016; accepted 13 Dec 2016; published 8 Feb 2017. ...predictions in response to 76 drugs. We validate comparative predictions for xanthine derivatives with new experimental data and literature-based evidence.

  17. Building a Relationship between Robot Characteristics and Teleoperation User Interfaces.

    PubMed

    Mortimer, Michael; Horan, Ben; Seyedmahmoudian, Mehdi

    2017-03-14

    The Robot Operating System (ROS) provides roboticists with a standardized and distributed framework for real-time communication between robotic systems using a microkernel environment. This paper looks at how ROS metadata, Unified Robot Description Format (URDF), Semantic Robot Description Format (SRDF), and its message description language, can be used to identify key robot characteristics to inform User Interface (UI) design for the teleoperation of heterogeneous robot teams. Logical relationships between UI components and robot characteristics are defined by a set of relationship rules created using relevant and available information including developer expertise and ROS metadata. This provides a significant opportunity to move towards a rule-driven approach for generating the designs of teleoperation UIs; in particular the reduction of the number of different UI configurations required to teleoperate each individual robot within a heterogeneous robot team. This approach is based on using an underlying rule set identifying robots that can be teleoperated using the same UI configuration due to having the same or similar robot characteristics. Aside from reducing the number of different UI configurations an operator needs to be familiar with, this approach also supports consistency in UI configurations when a teleoperator is periodically switching between different robots. To achieve this aim, a Matlab toolbox is developed providing users with the ability to define rules specifying the relationship between robot characteristics and UI components. Once rules are defined, selections that best describe the characteristics of the robot type within a particular heterogeneous robot team can be made. A main advantage of this approach is that rather than specifying discrete robots comprising the team, the user can specify characteristics of the team more generally allowing the system to deal with slight variations that may occur in the future. 
In fact, by using the defined relationship rules and characteristic selections, the toolbox can automatically identify a reduced set of UI configurations required to control possible robot team configurations, as opposed to the traditional ad-hoc approach to teleoperation UI design. In the results section, three test cases are presented to demonstrate how the selection of different robot characteristics builds a number of robot characteristic combinations, and how the relationship rules are used to determine a reduced set of required UI configurations needed to control each individual robot in the robot team.
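    The core config-reduction idea can be sketched by grouping robots on a characteristic signature; the characteristic names and the example team below are hypothetical, not drawn from the toolbox itself.

```python
# Hypothetical sketch of the UI config-reduction idea: robots that share the
# same characteristic signature can share one teleoperation UI configuration.
# Characteristic names and the team definition are invented for illustration.

def ui_configurations(robots):
    """Map each distinct characteristic signature to the robots it covers."""
    configs = {}
    for name, traits in robots.items():
        signature = tuple(sorted(traits.items()))  # order-independent key
        configs.setdefault(signature, []).append(name)
    return configs

team = {
    "ugv_a": {"drive": "skid", "camera": True, "arm": False},
    "ugv_b": {"drive": "skid", "camera": True, "arm": False},
    "uav_1": {"drive": "rotor", "camera": True, "arm": False},
}
configs = ui_configurations(team)  # 3 robots, but only 2 UI configurations
```

The relationship rules in the toolbox effectively refine this grouping further, letting "similar enough" characteristics (not just identical ones) map to the same UI configuration.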

  18. Building a Relationship between Robot Characteristics and Teleoperation User Interfaces

    PubMed Central

    Mortimer, Michael; Horan, Ben; Seyedmahmoudian, Mehdi

    2017-01-01

    The Robot Operating System (ROS) provides roboticists with a standardized and distributed framework for real-time communication between robotic systems using a microkernel environment. This paper looks at how ROS metadata, Unified Robot Description Format (URDF), Semantic Robot Description Format (SRDF), and its message description language, can be used to identify key robot characteristics to inform User Interface (UI) design for the teleoperation of heterogeneous robot teams. Logical relationships between UI components and robot characteristics are defined by a set of relationship rules created using relevant and available information including developer expertise and ROS metadata. This provides a significant opportunity to move towards a rule-driven approach for generating the designs of teleoperation UIs; in particular the reduction of the number of different UI configurations required to teleoperate each individual robot within a heterogeneous robot team. This approach is based on using an underlying rule set identifying robots that can be teleoperated using the same UI configuration due to having the same or similar robot characteristics. Aside from reducing the number of different UI configurations an operator needs to be familiar with, this approach also supports consistency in UI configurations when a teleoperator is periodically switching between different robots. To achieve this aim, a Matlab toolbox is developed providing users with the ability to define rules specifying the relationship between robot characteristics and UI components. Once rules are defined, selections that best describe the characteristics of the robot type within a particular heterogeneous robot team can be made. A main advantage of this approach is that rather than specifying discrete robots comprising the team, the user can specify characteristics of the team more generally allowing the system to deal with slight variations that may occur in the future. 
In fact, by using the defined relationship rules and characteristic selections, the toolbox can automatically identify a reduced set of UI configurations required to control possible robot team configurations, as opposed to the traditional ad-hoc approach to teleoperation UI design. In the results section, three test cases are presented to demonstrate how the selection of different robot characteristics builds a number of robot characteristic combinations, and how the relationship rules are used to determine a reduced set of required UI configurations needed to control each individual robot in the robot team. PMID:28335431

  19. The Development of an Expert System for the Creative Design of Mechanisms

    DTIC Science & Technology

    1989-06-26

    adjacent Link-1 (Figure 4.1B: semantic network-based knowledge representation scheme) ... 4.3 Planning Control in Mechanism Design: In a "plain", rule-based expert system... the contracted level, ensures the non-crossing feature. 2. Geometrical: handled at the monochrome level, manages the approximate size of links. 3. Ornamental: handled at the colored level, manages proper orientations between binary links and other miscellaneous appearance aspects of the sketch. Each stage...

  20. HERB: A production system for programming with hierarchical expert rule bases: User's manual, HERB Version 1. 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hummel, K.E.

    1987-12-01

    Expert systems are artificial intelligence programs that solve problems requiring large amounts of heuristic knowledge, based on years of experience and tradition. Production systems are domain-independent tools that support the development of rule-based expert systems. This document describes a general purpose production system known as HERB. This system was developed to support the programming of expert systems using hierarchically structured rule bases. HERB encourages the partitioning of rules into multiple rule bases and supports the use of multiple conflict resolution strategies. Multiple rule bases can also be placed on a system stack and simultaneously searched during each interpreter cycle. Both backward and forward chaining rules are supported by HERB. The condition portion of each rule can contain both patterns, which are matched with facts in a data base, and LISP expressions, which are explicitly evaluated in the LISP environment. Properties of objects can also be stored in the HERB data base and referenced within the scope of each rule. This document serves both as an introduction to the principles of LISP-based production systems and as a user's manual for the HERB system. 6 refs., 17 figs.
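    A minimal forward-chaining interpreter in the spirit of HERB's rule cycle (HERB itself is LISP-based with pattern matching and conflict resolution; this Python toy only illustrates the basic match-fire loop):

```python
# Minimal forward-chaining production-system loop: repeatedly fire any rule
# whose conditions are all satisfied by the fact base, until a fixed point.
# The example rules and facts are invented for illustration.

def forward_chain(facts, rules):
    """Fire (conditions, conclusion) rules until no new facts are derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conclusion not in facts and all(c in facts for c in conditions):
                facts.add(conclusion)  # rule fires: assert its conclusion
                changed = True
    return facts

rules = [
    (("has-wings", "lays-eggs"), "is-bird"),
    (("is-bird",), "can-fly"),
]
derived = forward_chain({"has-wings", "lays-eggs"}, rules)
```

HERB's hierarchical rule bases would correspond here to running this loop against a stack of rule lists searched together on each cycle, with a conflict-resolution strategy choosing among simultaneously matching rules.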

  1. Nanoparticle Superlattice Engineering with DNA

    NASA Astrophysics Data System (ADS)

    Macfarlane, Robert John

    In this thesis, we describe a set of design rules for using programmable oligonucleotide interactions, elements of both thermodynamic and kinetic control, and an understanding of the dominant forces that are responsible for particle assembly to design and deliberately make a wide variety of nanoparticle-based superlattices. Like the rules for ionic solids developed by Linus Pauling, these rules are guidelines for determining relative nanoparticle superlattice stability, rather than rigorous mathematical descriptions. However, unlike Pauling's rules, the set of rules developed herein allow one to not just predict crystal stability, but also to deliberately and independently control the nanoparticle sizes, interparticle spacings, and crystallographic symmetries of a superlattice. In the first chapter of this thesis, a general background is given for using DNA as a tool in programmable materials synthesis. Chapter 2 demonstrates how altering oligonucleotide length and nanoparticle size can be used to control nanoparticle superlattice lattice parameters with nanometer-scale precision. In the third chapter, the kinetics of crystallization are examined, and a method to selectively stabilize kinetic products is presented. The data in chapter 4 prove that it is the overall hydrodynamic radius of a DNA-functionalized particle, rather than the sizes of the inorganic nanoparticles being assembled, that dictates particle packing behavior. Chapter 5 demonstrates how particles that exhibit non-equivalent packing behavior can be used to control superlattice symmetry, and chapter 6 utilizes these data to develop a phase diagram that predicts lattice stability a priori to synthesis. In chapter 7, the ability to functionalize a particle with multiple types of oligonucleotides is used to synthesize complex lattices, including ternary superlattices that are capable of dynamic symmetry conversion between a binary and a ternary state. 
The final chapter provides an outlook on other developments in DNA-programmed nanoparticle assembly not covered in this thesis, as well as future challenges for this field. Supplementary information to support the conclusions of the thesis, as well as provide technical details on how these materials are synthesized, are provided in appendices at the end of the thesis. As a whole, this methodology presents a major advance towards nanoparticle superlattice engineering, as it effectively separates the identity of a particle core (and thereby its physical properties) from the variables that control its assembly, enabling the synthesis of designer nanoparticle-based materials.
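    The hydrodynamic-radius argument suggests a simple linear estimate of nearest-neighbor spacing; the ~0.26 nm per-base rise and the example numbers below are approximate placeholders, not values from the thesis.

```python
# Rough sketch of the design-rule idea that interparticle spacing tracks the
# DNA-programmed hydrodynamic radius rather than the inorganic core size.
# The per-base rise (~0.26 nm) and example inputs are approximate placeholders.

def interparticle_spacing_nm(core_diameter_nm, n_bases, rise_per_base_nm=0.26):
    """Estimate center-to-center nearest-neighbor spacing: one core diameter
    plus two DNA shells (one per particle surface)."""
    return core_diameter_nm + 2 * n_bases * rise_per_base_nm
```

The useful consequence described in the thesis is that spacing and symmetry can be tuned through oligonucleotide length alone, leaving the particle core (and its physical properties) untouched.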

  2. An investigation of care-based vs. rule-based morality in frontotemporal dementia, Alzheimer's disease, and healthy controls.

    PubMed

    Carr, Andrew R; Paholpak, Pongsatorn; Daianu, Madelaine; Fong, Sylvia S; Mather, Michelle; Jimenez, Elvira E; Thompson, Paul; Mendez, Mario F

    2015-11-01

    Behavioral changes in dementia, especially behavioral variant frontotemporal dementia (bvFTD), may result in alterations in moral reasoning. Investigators have not clarified whether these alterations reflect differential impairment of care-based vs. rule-based moral behavior. This study investigated 18 bvFTD patients, 22 early onset Alzheimer's disease (eAD) patients, and 20 healthy age-matched controls on care-based and rule-based items from the Moral Behavioral Inventory and the Social Norms Questionnaire, neuropsychological measures, and magnetic resonance imaging (MRI) regions of interest. There were significant group differences with the bvFTD patients rating care-based morality transgressions less severely than the eAD group and rule-based moral behavioral transgressions more severely than controls. Across groups, higher care-based morality ratings correlated with phonemic fluency on neuropsychological tests, whereas higher rule-based morality ratings correlated with increased difficulty set-shifting and learning new rules to tasks. On neuroimaging, severe care-based reasoning correlated with cortical volume in right anterior temporal lobe, and rule-based reasoning correlated with decreased cortical volume in the right orbitofrontal cortex. Together, these findings suggest that frontotemporal disease decreases care-based morality and facilitates rule-based morality possibly from disturbed contextual abstraction and set-shifting. Future research can examine whether frontal lobe disorders and bvFTD result in a shift from empathic morality to the strong adherence to conventional rules. Published by Elsevier Ltd.

  3. An Investigation of Care-Based vs. Rule-Based Morality in Frontotemporal Dementia, Alzheimer’s Disease, and Healthy Controls

    PubMed Central

    Carr, Andrew R.; Paholpak, Pongsatorn; Daianu, Madelaine; Fong, Sylvia S.; Mather, Michelle; Jimenez, Elvira E.; Thompson, Paul; Mendez, Mario F.

    2015-01-01

    Behavioral changes in dementia, especially behavioral variant frontotemporal dementia (bvFTD), may result in alterations in moral reasoning. Investigators have not clarified whether these alterations reflect differential impairment of care-based vs. rule-based moral behavior. This study investigated 18 bvFTD patients, 22 early onset Alzheimer’s disease (eAD) patients, and 20 healthy age-matched controls on care-based and rule-based items from the Moral Behavioral Inventory and the Social Norms Questionnaire, neuropsychological measures, and magnetic resonance imaging (MRI) regions of interest. There were significant group differences with the bvFTD patients rating care-based morality transgressions less severely than the eAD group and rule-based moral behavioral transgressions more severely than controls. Across groups, higher care-based morality ratings correlated with phonemic fluency on neuropsychological tests, whereas higher rule-based morality ratings correlated with increased difficulty set-shifting and learning new rules to tasks. On neuroimaging, severe care-based reasoning correlated with cortical volume in right anterior temporal lobe, and rule-based reasoning correlated with decreased cortical volume in the right orbitofrontal cortex. Together, these findings suggest that frontotemporal disease decreases care-based morality and facilitates rule-based morality possibly from disturbed contextual abstraction and set-shifting. Future research can examine whether frontal lobe disorders and bvFTD result in a shift from empathic morality to the strong adherence to conventional rules. PMID:26432341

  4. 78 FR 11279 - Loan Originator Compensation Requirements Under the Truth in Lending Act (Regulation Z)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-15

    ...' understanding of and choices with respect to points and fees. This final rule is designed primarily to protect... non-deferred profits-based compensation if the individual loan originator originated ten or fewer mortgage transactions during the preceding 12 months; and (3) bonuses and other types of non-deferred...

  5. Designing P-Optimal Item Pools in Computerized Adaptive Tests with Polytomous Items

    ERIC Educational Resources Information Center

    Zhou, Xuechun

    2012-01-01

    Current CAT applications consist of predominantly dichotomous items, and CATs with polytomously scored items are limited. To ascertain the best approach to polytomous CAT, a significant amount of research has been conducted on item selection, ability estimation, and impact of termination rules based on polytomous IRT models. Few studies…

  6. Design as a Fusion Problem

    DTIC Science & Technology

    2008-07-01

    consider a proof as a composition relative to some system of music or as a painting. From the Bayesian perspective, any sufficiently complex problem has...these types of algorithms based on maximum entropy analysis. An example is the Bar-Shalom-Campo Fusion Rule: Xf(k|k) = X2(k|k) + (P22 - P21)U1[X1(k|k

  7. Students' Individual Schematization Pathways--Empirical Reconstructions for the Case of Part-of-Part Determination for Fractions

    ERIC Educational Resources Information Center

    Glade, Matthias; Prediger, Susanne

    2017-01-01

    According to the design principle of progressive schematization, learning trajectories towards procedural rules can be organized as independent discoveries when the learning arrangement invites the students first to develop models for mathematical concepts and model-based informal strategies; then to explore the strategies and to discover pattern…

  8. Sauerbraten, Rotkappchen und Goethe: The Quiz Show as an Introduction to German Studies.

    ERIC Educational Resources Information Center

    White, Diane

    1980-01-01

    Proposes an adaptation of the quiz-show format for classroom use, discussing a set of rules and sample questions designed for beginning and intermediate German students. Presents questions based on German life and culture which are especially selected to encourage participation from students majoring in subjects other than German. (MES)

  9. Using ITS to Create an Insurance Industry Application: A Joint Case Study.

    ERIC Educational Resources Information Center

    Boies, Stephen J.; And Others

    1993-01-01

    Presents an empirical case study of the use of ITS, a software development environment designed by IBM, by Continental Insurance for underwriting applications. Use of a rule-based user interface style that made electronic forms look like standard insurance industry paper forms and worked according to Continental's guidelines is described.…

  10. A Comparison of Methods for Transforming Sentences into Test Questions for Instructional Materials. Technical Report #1.

    ERIC Educational Resources Information Center

    Roid, Gale; And Others

    Several measurement theorists have convincingly argued that methods of writing test questions, particularly for criterion-referenced tests, should be based on operationally defined rules. This study was designed to examine and further refine a method for objectively generating multiple-choice questions for prose instructional materials. Important…

  11. The Box-and-Dot Method: A Simple Strategy for Counting Significant Figures

    ERIC Educational Resources Information Center

    Stephenson, W. Kirk

    2009-01-01

    A visual method for counting significant digits is presented. This easy-to-learn (and easy-to-teach) method, designated the box-and-dot method, uses the device of "boxing" significant figures based on two simple rules, then counting the number of digits in the boxes. (Contains 4 notes.)
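
    The counting itself reduces to a few lines of logic. A minimal sketch using standard textbook conventions (not the article's exact box-and-dot rules; the function name is our own):

```python
def sig_figs(numeral: str) -> int:
    """Count significant figures in a decimal numeral written as a string."""
    digits = numeral.lstrip("+-").replace(".", "")
    digits = digits.lstrip("0")        # leading zeros are never significant
    if "." in numeral:
        return len(digits)             # trailing zeros after a point count
    return len(digits.rstrip("0"))     # bare trailing zeros are ambiguous
```

    The ambiguity of trailing zeros in numbers like 4500 is exactly what methods such as box-and-dot are designed to make visible to students.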

  12. Metareasoning and Social Evaluations in Cognitive Agents

    NASA Astrophysics Data System (ADS)

    Pinyol, Isaac; Sabater-Mir, Jordi

    Reputation mechanisms have been recognized as one of the key technologies for designing multi-agent systems. They are especially relevant in complex open environments, serving as a non-centralized mechanism to control interactions among agents. Cognitive agents tackling such complex societies must use reputation information not only for selecting partners to interact with, but also in metareasoning processes that change reasoning rules. This is the focus of this paper. We argue for the necessity of allowing, as cognitive system designers, a certain degree of freedom in the reasoning rules of the agents, and we describe cognitive approaches to agency that support this idea. Furthermore, taking as a base the computational reputation model Repage and its integration in a BDI architecture, we use these ideas to specify metarules and processes that modify the reasoning paths of the agent at run-time. Concretely, we propose a metarule to update the link between Repage and the belief base, and a metarule and a process to update an axiom incorporated in the belief logic of the agent. For this last issue we also provide empirical results showing the evolution of agents that use it.

  13. Rule of five in 2015 and beyond: Target and ligand structural limitations, ligand chemistry structure and drug discovery project decisions.

    PubMed

    Lipinski, Christopher A

    2016-06-01

    The rule of five (Ro5), based on physicochemical profiles of phase II drugs, is consistent with structural limitations in protein targets and the drug target ligands. Three of four parameters in Ro5 are fundamental to the structure of both target and drug binding sites. The chemical structure of the drug ligand depends on the ligand chemistry and design philosophy. Two extremes of chemical structure and design philosophy exist: ligands constructed in the medicinal chemistry synthesis laboratory without input from natural selection, and natural product (NP) metabolites biosynthesized under evolutionary selection. Exceptions to Ro5 are found mostly among NPs. Chameleon-like chemical behavior of some NPs, due to intra-molecular hydrogen bonding as exemplified by cyclosporine A, is a strong contributor to NP Ro5 outliers. The fragment-derived drug Navitoclax is an example of the extensive expertise, resources, time and key decisions required for the rare discovery of a non-NP Ro5 outlier. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Fuzzy Logic Based Autonomous Parallel Parking System with Kalman Filtering

    NASA Astrophysics Data System (ADS)

    Panomruttanarug, Benjamas; Higuchi, Kohji

    This paper presents an emulation of fuzzy logic control schemes for an autonomous parallel parking system performing a backward maneuver. Four infrared sensors send distance data to a microcontroller for generating an obstacle-free parking path. Two of them, mounted on the front and rear wheels on the parking side, are used as the inputs to the fuzzy rules to calculate a proper steering angle while backing. The other two, attached to the front and rear ends, serve to avoid collision with other cars along the parking space. At the end of the parking process, the vehicle is in line with the other parked cars and positioned in the middle of the free space. The fuzzy rules are designed based upon a wall-following process. Performance of the infrared sensors is improved using Kalman filtering, for which the design method needs extra information from ultrasonic sensors. Starting from a 1-D state-space model of the ultrasonic sensor, one makes use of the infrared sensor as a measurement to update the predicted values. Experimental results demonstrate the effectiveness of the sensor improvement.
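
    In one dimension, the predict/update cycle described above is the standard Kalman filter recursion. A minimal sketch under a random-walk state model (variable names are ours, not the authors'):

```python
def kalman_step(x, p, z, q, r):
    """One 1-D Kalman cycle.
    x, p = state estimate and its variance; z = new range measurement;
    q = process-noise variance; r = measurement-noise variance."""
    x_pred, p_pred = x, p + q              # predict: state held, variance grows
    k = p_pred / (p_pred + r)              # Kalman gain
    x_new = x_pred + k * (z - x_pred)      # correct toward the measurement
    p_new = (1.0 - k) * p_pred             # variance shrinks after the update
    return x_new, p_new
```

    With equal prior and measurement variances the gain is 0.5, so the updated estimate lands halfway between prediction and measurement.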

  15. Overview of diffraction gratings technologies for spaceflight satellites and ground-based telescopes

    NASA Astrophysics Data System (ADS)

    Cotel, A.; Liard, A.; Desserouer, F.; Pichon, P.

    2017-11-01

    Diffraction gratings are widely used in spaceflight satellites for spectrograph instruments and in ground-based telescopes in astronomy. They are one of the key optical components of such systems and must exhibit very high optical performance. HORIBA Jobin Yvon S.A.S. (part of HORIBA Group) has been at the forefront of such grating development for more than 40 years. During the past decades, HORIBA Jobin Yvon (HJY) has developed a unique expertise in diffraction grating design and manufacturing processes for holographic, ruled or etched gratings. We present in this paper an overview of diffraction grating technologies especially designed for space and astronomy applications. We first review the heritage of the company in this field with the space qualification of different grating types. Then, we describe several key grating technologies developed for specific space or astronomy projects: ruled blazed low-groove-density plane reflection gratings, high-groove-density holographic toroidal and spherical gratings, and finally transmission Fused Silica Etched (FSE) grism-assembled gratings. We do not cover the Volume Phase Holographic (VPHG) grating type, which is used in astronomy.

  16. Model-Based Design of Tree WSNs for Decentralized Detection.

    PubMed

    Tantawy, Ashraf; Koutsoukos, Xenofon; Biswas, Gautam

    2015-08-20

    The classical decentralized detection problem of finding the optimal decision rules at the sensor and fusion center, as well as variants that introduce physical channel impairments have been studied extensively in the literature. The deployment of WSNs in decentralized detection applications brings new challenges to the field. Protocols for different communication layers have to be co-designed to optimize the detection performance. In this paper, we consider the communication network design problem for a tree WSN. We pursue a system-level approach where a complete model for the system is developed that captures the interactions between different layers, as well as different sensor quality measures. For network optimization, we propose a hierarchical optimization algorithm that lends itself to the tree structure, requiring only local network information. The proposed design approach shows superior performance over several contentionless and contention-based network design approaches.
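
    For identical local detectors, a common family of fusion-center decision rules is the k-out-of-n counting rule, whose system-level detection probability follows directly from the binomial distribution. A generic illustration (not the paper's tree-specific design):

```python
from math import comb

def k_out_of_n_detection(pd: float, n: int, k: int) -> float:
    """P(fusion center declares a detection) when each of n i.i.d. local
    detectors fires with probability pd and the center needs k or more."""
    return sum(comb(n, i) * pd**i * (1 - pd)**(n - i) for i in range(k, n + 1))
```

    Setting k = 1 gives the OR rule and k = n the AND rule; the same expression with the false-alarm probability in place of pd gives the system-level false-alarm rate.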

  17. 75 FR 8759 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-25

    ... rule proposal methods. The FOCUS Report was designed to eliminate the overlapping regulatory reports... SECURITIES AND EXCHANGE COMMISSION [Rule 17a-5; SEC File No. 270-155; OMB Control No. 3235-0123... currently valid control number. Rule 17a-5 (17 CFR 240.17a-5) is the basic financial reporting rule for...

  18. 78 FR 17969 - Self-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-25

    ...-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Designation of a Longer Period for Commission Action on Proposed Rule Change To Require Members To Report OTC Equity Transactions... (``Act'') \\1\\ and Rule 19b-4 thereunder,\\2\\ a proposed rule change to amend FINRA trade reporting rules...

  19. Compartmental and Spatial Rule-Based Modeling with Virtual Cell.

    PubMed

    Blinov, Michael L; Schaff, James C; Vasilescu, Dan; Moraru, Ion I; Bloom, Judy E; Loew, Leslie M

    2017-10-03

    In rule-based modeling, molecular interactions are systematically specified in the form of reaction rules that serve as generators of reactions. This provides a way to account for all the potential molecular complexes and interactions among multivalent or multistate molecules. Recently, we introduced rule-based modeling into the Virtual Cell (VCell) modeling framework, permitting graphical specification of rules and merger of networks generated automatically (using the BioNetGen modeling engine) with hand-specified reaction networks. VCell provides a number of ordinary differential equation and stochastic numerical solvers for single-compartment simulations of the kinetic systems derived from these networks, and agent-based network-free simulation of the rules. In this work, compartmental and spatial modeling of rule-based models has been implemented within VCell. To enable rule-based deterministic and stochastic spatial simulations and network-free agent-based compartmental simulations, the BioNetGen and NFSim engines were each modified to support compartments. In the new rule-based formalism, every reactant and product pattern and every reaction rule are assigned locations. We also introduce the rule-based concept of molecular anchors. This assures that any species that has a molecule anchored to a predefined compartment will remain in this compartment. Importantly, in addition to formulation of compartmental models, this now permits VCell users to seamlessly connect reaction networks derived from rules to explicit geometries to automatically generate a system of reaction-diffusion equations. These may then be simulated using either the VCell partial differential equations deterministic solvers or the Smoldyn stochastic simulator. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
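
    The idea of a rule acting as a generator of reactions can be illustrated outside any particular tool. A toy sketch (ours, not VCell/BioNetGen syntax): one binding rule, applied over all species matching its reactant patterns, expands into a concrete reaction list.

```python
species = ["R", "L1", "L2"]                # one receptor, two ligands

def binding_rule(a, b):
    """Pattern: receptor 'R' binds anything named 'L*'; returns the reaction."""
    if a == "R" and b.startswith("L"):
        return (a, b, f"{a}.{b}")          # reactants and the bound complex
    return None

# Rule expansion: try the rule on every ordered pair of species.
reactions = [r for s1 in species for s2 in species
             if (r := binding_rule(s1, s2)) is not None]
```

    Real rule engines do the same matching over structured molecular patterns with sites and states, which is why one rule can stand in for a combinatorially large reaction network.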

  20. Revised Interim Final Consolidated Enforcement Response and Penalty Policy for the Pre-Renovation Education Rule; Renovation, Repair and Painting Rule; and Lead-Based Paint Activities Rule

    EPA Pesticide Factsheets

    This is the revised version of the Interim Final Consolidated Enforcement Response and Penalty Policy for the Pre-Renovation Education Rule; Renovation, Repair and Painting Rule; and Lead-Based Paint Activities Rule.

  1. 17 CFR 240.3a40-1 - Designation of financial responsibility rules.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... relating to hypothecation or lending of customer securities; (c) Any rule adopted by any self-regulatory... other rule adopted by the Commission or any self-regulatory organization relating to the protection of...

  2. Implementation of clinical decision rules in the emergency department.

    PubMed

    Stiell, Ian G; Bennett, Carol

    2007-11-01

    Clinical decision rules (CDRs) are tools designed to help clinicians make bedside diagnostic and therapeutic decisions. The development of a CDR involves three stages: derivation, validation, and implementation. Several criteria need to be considered when designing and evaluating the results of an implementation trial. In this article, the authors review the results of implementation studies evaluating the effect of four CDRs: the Ottawa Ankle Rules, the Ottawa Knee Rule, the Canadian C-Spine Rule, and the Canadian CT Head Rule. Four implementation studies demonstrated that the implementation of CDRs in the emergency department (ED) safely reduced the use of radiography for ankle, knee, and cervical spine injuries. However, a recent trial failed to demonstrate an impact on computed tomography imaging rates. Well-developed and validated CDRs can be successfully implemented into practice, efficiently standardizing ED care. However, further research is needed to identify barriers to implementation in order to achieve improved uptake in the ED.
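
    CDRs of this kind are simple enough to state as executable logic. A sketch of the ankle-series branch of the Ottawa Ankle Rules, paraphrased from their widely published form (simplified for illustration, not a clinical implementation):

```python
def ankle_xray_indicated(malleolar_pain: bool,
                         lat_malleolus_tender: bool,
                         med_malleolus_tender: bool,
                         cannot_bear_weight: bool) -> bool:
    """Ankle radiography is indicated only with malleolar-zone pain plus
    at least one of: bony tenderness at either malleolus, or inability
    to bear weight both immediately and in the ED."""
    return malleolar_pain and (lat_malleolus_tender or
                               med_malleolus_tender or
                               cannot_bear_weight)
```

    That transparency is part of why such rules can be implemented and audited in ED practice at all.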

  3. The expert explorer: a tool for hospital data visualization and adverse drug event rules validation.

    PubMed

    Băceanu, Adrian; Atasiei, Ionuţ; Chazard, Emmanuel; Leroy, Nicolas

    2009-01-01

    An important part of adverse drug event (ADE) detection is the validation of the clinical cases and the assessment of the decision rules used to detect ADEs. For that purpose, a software tool called "Expert Explorer" has been designed by Ideea Advertising. Anonymized datasets have been extracted from hospitals into a common repository. The tool has three main features. (1) It displays hospital stays in a visual and comprehensive way (diagnoses, drugs, lab results, etc.) using tables and charts. (2) It allows designing and executing dashboards in order to generate knowledge about ADEs. (3) It allows uploading decision rules obtained from data mining. Experts can then review the rules and the hospital stays that match them, and record their opinions on specialized forms. The rules can then be validated, invalidated, or improved (knowledge elicitation phase).

  4. Blowout Monitor

    NASA Technical Reports Server (NTRS)

    1994-01-01

    C Language Integrated Production System (CLIPS), a NASA-developed software shell for developing expert systems, has been embedded in a PC-based expert system for training oil rig personnel in monitoring oil drilling. If not properly monitored for possible blowouts, oil drilling rigs pose hazards to human life, property, and the environment. CLIPS is designed to permit the delivery of artificial intelligence on personal computers. A collection of rules is set up and, as facts become known, these rules are applied. In the Well Site Advisor, CLIPS provides the capability to accurately process, predict and interpret well data in real time. CLIPS was provided to INTEQ by COSMIC.
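
    The fact-driven firing of rules that CLIPS provides can be sketched as a tiny forward-chaining loop (illustrative Python, not CLIPS syntax; the drilling facts are hypothetical examples of classic kick indicators):

```python
# Each rule: (condition over the fact set, fact to assert when it fires).
RULES = [
    (lambda f: "pit_gain" in f and "pump_pressure_drop" in f, "possible_kick"),
    (lambda f: "possible_kick" in f, "alert_crew"),
]

def forward_chain(facts):
    """Fire rules repeatedly, asserting new facts, until a fixed point."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for condition, conclusion in RULES:
            if condition(facts) and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts
```

    As new sensor facts arrive in real time, re-running the chaining step is what lets conclusions such as an alert cascade from low-level observations.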

  5. Designing an autoverification system in Zagazig University Hospitals Laboratories: preliminary evaluation on thyroid function profile.

    PubMed

    Sediq, Amany Mohy-Eldin; Abdel-Azeez, Ahmad GabAllahm Hala

    2014-01-01

    The current practice in Zagazig University Hospitals Laboratories (ZUHL) is manual verification of all results before release of reports. This process is time consuming and tedious, with large inter-individual variation that slows the turnaround time (TAT). Autoverification is the process of comparing patient results, generated from interfaced instruments, against laboratory-defined acceptance parameters. This study describes an autoverification engine designed and implemented at ZUHL, Egypt. It was a descriptive study conducted at ZUHL from January 2012 to December 2013. A rule-based system was used in designing the autoverification engine, which was preliminarily evaluated on a thyroid function panel. A total of 563 rules were written and tested on 563 simulated cases and 1673 archived cases. The engine's decisions were compared to those of 4 independent expert reviewers, and the impact of implementation on TAT was evaluated. Agreement was achieved among the 4 reviewers in 55.5% of cases, and with the engine in 51.5% of cases. The autoverification rate for archived cases was 63.8%. Reported lab TAT was reduced by 34.9%, and the TAT segment from completion of analysis to verification was reduced by 61.8%. The developed rule-based autoverification system has a verification rate comparable to that of commercially available software, while its in-house development saved the hospital the cost of commercial packages. Implementation of the system shortened the TAT and minimized the number of samples needing staff revision, enabling laboratory staff to devote more time and effort to problematic test results and to improving patient care quality.
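
    The shape of such rules is straightforward: each result passes a chain of range, critical-value, and delta checks before auto-release. A hedged sketch (thresholds and rule ordering are illustrative, not the 563 rules of the ZUHL engine):

```python
def autoverify(value, reportable, critical, prev=None, max_delta=None):
    """Return 'auto-release' or 'manual review' for one numeric result.
    reportable = (low, high) analytical range; critical = (low, high)
    bounds beyond which a human must phone the result to the ward."""
    lo, hi = reportable
    clo, chi = critical
    if not (lo <= value <= hi):
        return "manual review"         # outside the reportable range
    if value <= clo or value >= chi:
        return "manual review"         # critical value
    if prev is not None and max_delta is not None \
            and abs(value - prev) > max_delta:
        return "manual review"         # failed delta check vs. last result
    return "auto-release"
```

    In practice each analyte gets its own parameter set, and the fraction of results that reach "auto-release" is exactly the autoverification rate reported in the study.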

  6. Design rules for phase-change materials in data storage applications.

    PubMed

    Lencer, Dominic; Salinga, Martin; Wuttig, Matthias

    2011-05-10

    Phase-change materials can rapidly and reversibly be switched between an amorphous and a crystalline phase. Since both phases are characterized by very different optical and electrical properties, these materials can be employed for rewritable optical and electrical data storage. Hence, there are considerable efforts to identify suitable materials, and to optimize them with respect to specific applications. Design rules that can explain why the materials identified so far enable phase-change-based devices would hence be very beneficial. This article describes materials that have been successfully employed and discusses common features regarding both typical structures and bonding mechanisms. It is shown that typical structural motifs and electronic properties can be found in the crystalline state that are indicative of resonant bonding, from which the employed contrast originates. The occurrence of resonance is linked to the composition, thus providing a design rule for phase-change materials. This understanding helps to unravel characteristic properties such as electrical and thermal conductivity, which are discussed in the subsequent section. Then, turning to the transition kinetics between the phases, the current understanding and modeling of the processes of amorphization and crystallization are discussed. Finally, present approaches for improved high-capacity optical discs and fast non-volatile electrical memories, which hold the potential to succeed present-day Flash memory, are presented. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Stabilization and analytical tuning rule of double-loop control scheme for unstable dead-time process

    NASA Astrophysics Data System (ADS)

    Ugon, B.; Nandong, J.; Zang, Z.

    2017-06-01

    The presence of unstable dead-time systems in process plants often poses a daunting challenge for the design of standard PID controllers, which are intended not only to provide closed-loop stability but also to give good overall performance and robustness. In this paper, we conduct stability analysis on a double-loop control scheme based on the Routh-Hurwitz stability criteria. We propose using this double-loop control scheme, which employs two P/PID controllers, to control first-order or second-order unstable dead-time processes typically found in the process industries. Based on the Routh-Hurwitz necessary and sufficient criteria, we establish several stability regions which enclose the P/PID parameter values that guarantee closed-loop stability of the double-loop control scheme. A systematic tuning rule is developed for obtaining the optimal P/PID parameter values within the established regions. The effectiveness of the proposed tuning rule is demonstrated using several numerical examples, and the results are compared with some well-established tuning methods reported in the literature.
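
    The Routh-Hurwitz criterion used here is easy to mechanize: build the Routh array and check the first column for sign changes. A minimal sketch (assumes no zero pivots arise; not the authors' code):

```python
def routh_stable(c):
    """True iff all roots of the real polynomial with coefficients c
    (highest power first) lie strictly in the left half-plane.
    Assumes the first column of the Routh array never hits zero."""
    r1, r2 = list(c[0::2]), list(c[1::2])  # first two rows of the array
    first_col = [r1[0]]
    while r2:
        first_col.append(r2[0])
        b = r2 + [0.0]                      # pad so indexing is safe
        r1, r2 = r2, [(b[0] * r1[i + 1] - r1[0] * b[i + 1]) / b[0]
                      for i in range(len(r1) - 1)]
    # Stable iff the first column has no sign change.
    return all(x > 0 for x in first_col) or all(x < 0 for x in first_col)
```

    Sweeping a controller gain through such a test is one way to trace out the kind of stability regions the paper constructs analytically.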

  8. Electron lithography STAR design guidelines. Part 2: The design of a STAR for space applications

    NASA Technical Reports Server (NTRS)

    Trotter, J. D.; Newman, W.

    1982-01-01

    The STAR design system developed by NASA enables any user with a logic diagram to design a semicustom digital MOS integrated circuit. The system comprises a library of standard logic cells and computer programs to place, route, and display designs implemented with cells from the library. Also described is the development of a radiation-hard array designed for the STAR system. The design is based on the CMOS silicon gate technology developed by SANDIA National Laboratories. The design rules used are given, as well as the model parameters developed for the basic array element. Library cells of the CMOS metal gate and CMOS silicon gate technologies were simulated using SPICE, and the results are shown and compared.

  9. Design and evaluation of a service oriented architecture for paperless ICU tarification.

    PubMed

    Steurbaut, Kristof; Colpaert, Kirsten; Van Hoecke, Sofie; Steurbaut, Sabrina; Danneels, Chris; Decruyenaere, Johan; De Turck, Filip

    2012-06-01

    The computerization of Intensive Care Units provides an overwhelming amount of electronic data for both medical and financial analysis. However, the current tarification, the process of ticking off and counting patients' procedures, is still a repetitive, time-consuming process on paper: nurses and secretaries keep track of the patients' medical procedures manually. This paper describes the design methodology and implementation of automated tarification services. In this study we investigate whether tarification can be modeled in a service-oriented architecture as a composition of interacting services. Services are responsible for data collection, automatic assignment of records to physicians, and application of rules. Performance is evaluated in terms of execution time, cost evaluation and return on investment based on tracking of real procedures. The services provide high flexibility in terms of maintenance, integration and rules support. It is shown that the services offer a more accurate, less time-consuming and cost-effective tarification.

  10. How to translate therapeutic recommendations in clinical practice guidelines into rules for critiquing physician prescriptions? Methods and application to five guidelines

    PubMed Central

    2010-01-01

    Background Clinical practice guidelines give recommendations about what to do in various medical situations, including therapeutical recommendations for drug prescription. An effective way to computerize these recommendations is to design critiquing decision support systems, i.e. systems that criticize the physician's prescription when it does not conform to the guidelines. These systems are commonly based on a list of "if conditions then criticize" rules. However, writing these rules from the guidelines is not a trivial task. The objective of this article is to propose methods that (1) simplify the implementation of guidelines' therapeutical recommendations in critiquing systems by automatically translating structured therapeutical recommendations into a list of "if conditions then criticize" rules, and (2) generate an appropriate textual label to explain to the physician why his/her prescription is not recommended. Methods We worked on the therapeutic recommendations in five clinical practice guidelines concerning chronic diseases related to the management of cardiovascular risk. We evaluated the system using a test base of more than 2000 cases. Results Algorithms for automatically translating therapeutical recommendations into "if conditions then criticize" rules are presented. Eight generic recommendations are also proposed; they are guideline-independent, and can be used as default behaviour for handling various situations that are usually implicit in the guidelines, such as decreasing the dose of a poorly tolerated drug. Finally, we provide models and methods for generating a human-readable textual critique. The system was successfully evaluated on the test base. Conclusion We show that it is possible to criticize physicians' prescriptions starting from a structured clinical guideline, and to provide clear explanations. We are now planning a randomized clinical trial to evaluate the impact of the system on practices. PMID:20509903
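
    The target form of the translation can be shown concretely. A sketch of one compiled "if conditions then criticize" rule (the clinical situation and drug names are invented for illustration, not taken from the five guidelines):

```python
def compile_critique_rule(situation, recommended):
    """Compile one structured recommendation into a critiquing rule:
    criticize any prescription whose conditions match the situation
    but whose drug is not among the recommended ones."""
    def rule(patient, drug):
        if situation(patient) and drug not in recommended:
            return (f"{drug} is not recommended here; "
                    f"consider: {', '.join(sorted(recommended))}")
        return None                    # no criticism raised
    return rule

# Hypothetical recommendation: thiazide first-line for uncomplicated hypertension.
htn_rule = compile_critique_rule(
    lambda p: p["diagnosis"] == "hypertension" and not p["diabetic"],
    {"thiazide"})
```

    Generating the textual label alongside the condition, as here, is what lets the critiquing system explain itself rather than merely flag the prescription.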

  11. Rules based process window OPC

    NASA Astrophysics Data System (ADS)

    O'Brien, Sean; Soper, Robert; Best, Shane; Mason, Mark

    2008-03-01

    As a preliminary step towards Model-Based Process Window OPC we have analyzed the impact of correcting post-OPC layouts using rules-based methods. Image processing on the Brion Tachyon was used to identify sites where the OPC model/recipe failed to generate an acceptable solution. A set of rules for 65nm active and poly was generated by classifying these failure sites. The rules were based upon segment runlengths, figure spaces, and adjacent figure widths. 2.1 million sites for active were corrected in a small chip (comparing the pre and post rules-based operations), and 59 million were found at poly. Tachyon analysis of the final reticle layout found weak-margin sites distinct from those sites repaired by rules-based corrections. For the active layer more than 75% of the sites corrected by rules would have printed without a defect, indicating that most rules-based cleanups degrade the lithographic pattern. Some sites were missed by the rules-based cleanups due to either bugs in the DRC software or gaps in the rules table. In the end, dramatic changes to the reticle prevented catastrophic lithography errors, but this method is far too blunt. A more subtle model-based procedure is needed, changing only those sites which have unsatisfactory lithographic margin.

  12. Genetic learning in rule-based and neural systems

    NASA Technical Reports Server (NTRS)

    Smith, Robert E.

    1993-01-01

    The design of neural networks and fuzzy systems can involve complex, nonlinear, and ill-conditioned optimization problems. Often, traditional optimization schemes are inadequate or inapplicable for such tasks. Genetic Algorithms (GA's) are a class of optimization procedures whose mechanics are based on those of natural genetics. Mathematical arguments show how GA's bring substantial computational leverage to search problems, without requiring the mathematical characteristics often necessary for traditional optimization schemes (e.g., modality, continuity, availability of derivative information, etc.). GA's have proven effective in a variety of search tasks that arise in neural networks and fuzzy systems. This presentation begins by introducing the mechanism and theoretical underpinnings of GA's. GA's are then related to a class of rule-based machine learning systems called learning classifier systems (LCS's). An LCS implements a low-level production system that uses a GA as its primary rule discovery mechanism. This presentation illustrates how, despite its rule-based framework, an LCS can be thought of as a competitive neural network. Neural network simulator code for an LCS is presented. In this context, the GA is doing more than optimizing an objective function. It is searching for an ecology of hidden nodes with limited connectivity. The GA attempts to evolve this ecology such that effective neural network performance results. The GA is particularly well adapted to this task, given its naturally-inspired basis. The LCS/neural network analogy extends itself to other, more traditional neural networks. Conclusions to the presentation discuss the implications of using GA's in ecological search problems that arise in neural and fuzzy systems.
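
    The GA mechanics described (selection, crossover, mutation) fit in a short self-contained sketch on the classic OneMax toy problem (ours, unrelated to the LCS simulator code in the presentation):

```python
import random

def onemax_ga(n_bits=20, pop_size=30, generations=60, p_mut=0.02, seed=1):
    """Maximize the number of 1-bits with binary tournament selection,
    single-point crossover, and per-bit flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():                        # binary tournament selection
            a, b = rng.sample(pop, 2)
            return a if sum(a) >= sum(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n_bits)  # single-point crossover
            child = p1[:cut] + p2[cut:]
            nxt.append([bit ^ (rng.random() < p_mut) for bit in child])
        pop = nxt
    return max(sum(ind) for ind in pop)     # best fitness in final population
```

    Note that the fitness function (here a bit count) is the only problem-specific piece; no gradient or continuity is assumed, which is the leverage the presentation emphasizes.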

  13. A simple signaling rule for variable life-adjusted display derived from an equivalent risk-adjusted CUSUM chart.

    PubMed

    Wittenberg, Philipp; Gan, Fah Fatt; Knoth, Sven

    2018-04-17

    The variable life-adjusted display (VLAD) is the first risk-adjusted graphical procedure proposed in the literature for monitoring the performance of a surgeon. It displays the cumulative sum of expected minus observed deaths. It has since become highly popular because the statistic plotted is easy to understand. But it is also easy to misinterpret a surgeon's performance by utilizing the VLAD, potentially leading to grave consequences. The problem of misinterpretation is essentially caused by the variance of the VLAD's statistic that increases with sample size. In order for the VLAD to be truly useful, a simple signaling rule is desperately needed. Various forms of signaling rules have been developed, but they are usually quite complicated. Without signaling rules, making inferences using the VLAD alone is difficult if not misleading. In this paper, we establish an equivalence between a VLAD with V-mask and a risk-adjusted cumulative sum (RA-CUSUM) chart based on the difference between the estimated probability of death and surgical outcome. Average run length analysis based on simulation shows that this particular RA-CUSUM chart has similar performance as compared to the established RA-CUSUM chart based on the log-likelihood ratio statistic obtained by testing the odds ratio of death. We provide a simple design procedure for determining the V-mask parameters based on a resampling approach. Resampling from a real data set ensures that these parameters can be estimated appropriately. Finally, we illustrate the monitoring of a real surgeon's performance using VLAD with V-mask. Copyright © 2018 John Wiley & Sons, Ltd.
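
    The plotted statistic itself is elementary, which explains the VLAD's popularity. A sketch of the cumulative expected-minus-observed path (variable names ours):

```python
def vlad_path(risks, deaths):
    """Cumulative expected-minus-observed deaths: risks[i] is case i's
    estimated probability of death, deaths[i] is 1 if the patient died.
    Each survivor raises the curve by p; each death lowers it by 1 - p."""
    total, path = 0.0, []
    for p, died in zip(risks, deaths):
        total += p - died
        path.append(total)
    return path
```

    The paper's point is that this curve alone, without a V-mask or an equivalent RA-CUSUM boundary, gives no principled threshold for deciding when a downward drift is signal rather than noise.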

  14. Self-Interest and the Design of Rules.

    PubMed

    Singh, Manvir; Wrangham, Richard; Glowacki, Luke

    2017-12-01

    Rules regulating social behavior raise challenging questions about cultural evolution in part because they frequently confer group-level benefits. Current multilevel selection theories contend that between-group processes interact with within-group processes to produce norms and institutions, but within-group processes have remained underspecified, leading to a recent emphasis on cultural group selection as the primary driver of cultural design. Here we present the self-interested enforcement (SIE) hypothesis, which proposes that the design of rules importantly reflects the relative enforcement capacities of competing parties. We show that, in addition to explaining patterns in cultural change and stability, SIE can account for the emergence of much group-functional culture. We outline how this process can stifle or accelerate cultural group selection, depending on various social conditions. Self-interested enforcement has important bearings on the emergence, stability, and change of rules.

  15. 78 FR 36290 - Self-Regulatory Organizations; NYSE MKT LLC; Notice of Designation of a Longer Period for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-17

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-69733; File No. SR-NYSEMKT-2013-25] Self-Regulatory Organizations; NYSE MKT LLC; Notice of Designation of a Longer Period for Commission Action on Proposed Rule Change Amending NYSE MKT Rule 104--Equities To Codify Certain Traditional Trading Floor Functions That May Be Performed by Designated...

  16. Multi-arm group sequential designs with a simultaneous stopping rule.

    PubMed

    Urach, S; Posch, M

    2016-12-30

    Multi-arm group sequential clinical trials are efficient designs for comparing multiple treatments to a control. They allow treatment effects to be tested at interim analyses and can have a lower average sample number than fixed-sample designs. Their operating characteristics depend on the stopping rule: We consider simultaneous stopping, where the whole trial is stopped as soon as the null hypothesis of no treatment effect can be rejected for any of the arms, and separate stopping, where only recruitment to arms for which a significant treatment effect could be demonstrated is stopped, while the other arms are continued. For both stopping rules, the family-wise error rate can be controlled by the closed testing procedure applied to group sequential tests of intersection and elementary hypotheses. The group sequential boundaries for the separate stopping rule also control the family-wise error rate if the simultaneous stopping rule is applied. However, we show that for the simultaneous stopping rule, one can apply improved, less conservative stopping boundaries for local tests of elementary hypotheses. We derive corresponding improved Pocock and O'Brien type boundaries as well as optimized boundaries to maximize the power or average sample number, and we investigate the operating characteristics and small sample properties of the resulting designs. To control the power to reject at least one null hypothesis, the simultaneous stopping rule requires a lower average sample number than the separate stopping rule. This comes at the cost of a lower power to reject all null hypotheses. Some of this loss in power can be regained by applying the improved stopping boundaries for the simultaneous stopping rule. The procedures are illustrated with clinical trials in systemic sclerosis and narcolepsy. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
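The effect of a simultaneous stopping rule on the family-wise error rate can be explored with a small Monte Carlo sketch. The boundary value and all design parameters below are illustrative placeholders, not the calibrated Pocock or O'Brien type boundaries derived in the paper:

```python
import math
import random

def simulate_fwer(n_arms=2, n_stages=2, n_per_stage=50, crit=2.41,
                  n_sim=2000, seed=1):
    """Monte Carlo family-wise error rate under the global null for a
    multi-arm group sequential design with a *simultaneous* stopping rule:
    the whole trial stops at the first analysis where any arm's
    z-statistic crosses the (Pocock-type) boundary `crit`."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(n_sim):
        control = []
        arms = [[] for _ in range(n_arms)]
        stopped = False
        for _ in range(n_stages):
            control += [rng.gauss(0, 1) for _ in range(n_per_stage)]
            for a in arms:
                a += [rng.gauss(0, 1) for _ in range(n_per_stage)]
            n = len(control)
            for a in arms:
                z = (sum(a) / n - sum(control) / n) / math.sqrt(2.0 / n)
                if z > crit:            # any rejection stops the whole trial
                    rejections += 1
                    stopped = True
                    break
            if stopped:
                break
    return rejections / n_sim

fwer = simulate_fwer()
```

Switching to the separate stopping rule would only require changing the inner stopping logic so that a rejected arm is dropped while recruitment to the remaining arms continues.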

  17. 50 CFR 424.18 - Final rules-general.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., DEPARTMENT OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A LISTING ENDANGERED AND THREATENED SPECIES AND DESIGNATING CRITICAL HABITAT Revision of the Lists § 424.18 Final rules—general. (a... rule to list, delist, or reclassify a species or designate or revise critical habitat will also provide...

  18. 50 CFR 424.18 - Final rules-general.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., DEPARTMENT OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A LISTING ENDANGERED AND THREATENED SPECIES AND DESIGNATING CRITICAL HABITAT Revision of the Lists § 424.18 Final rules—general. (a... rule to list, delist, or reclassify a species or designate or revise critical habitat will also provide...

  19. 78 FR 79720 - Self-Regulatory Organizations; NYSE MKT LLC; Notice of Designation of a Longer Period for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-31

    ... Proposed Rule Change, as Modified by Amendment No. 1 Thereto, To Adopt Commentary .03 to Rule 980NY To Limit the Volume of Complex Orders by a Single ATP Holder During the Trading Day December 24, 2013. On...\\ and Rule 19b-4 thereunder,\\2\\ a proposed rule change to adopt Commentary .03 to NYSE MKT Rule 980NY to...

  20. 77 FR 55517 - Self-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Order Approving a...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-10

    ...\\ FINRA Rule 6140(a) defines a ``designated security'' as any NMS stock as defined in Rule 600(b)(47) of Regulation NMS, 17 CFR 242.600(b)(47). \\8\\ See FINRA Rule 6140(h)(1)(A)-(B). \\9\\ See FINRA Rule 6140(h)(2... requirements of Section 15A(b) of the Act \\47\\ and the rules and regulations thereunder applicable to a...

  1. Transition Flight Control Room Automation

    NASA Technical Reports Server (NTRS)

    Welborn, Curtis Ray

    1990-01-01

    The Workstation Prototype Laboratory is currently working on a number of projects which we feel can have a direct impact on ground operations automation. These projects include: The Fuel Cell Monitoring System (FCMS), which will monitor and detect problems with the fuel cells on the Shuttle. FCMS will use a combination of rules (forward/backward) and multi-threaded procedures, which run concurrently with the rules, to implement the malfunction algorithms of the EGIL flight controllers. The combination of rule-based reasoning and procedural reasoning allows us to more easily map the malfunction algorithms into a real-time system implementation. A graphical computation language (AGCOMPL). AGCOMPL is an experimental prototype to determine the benefits and drawbacks of using a graphical language to design computations (algorithms) to work on Shuttle or Space Station telemetry and trajectory data. The design of a system which will allow a model of an electrical system, including telemetry sensors, to be configured on the screen graphically using previously defined electrical icons. This electrical model would then be used to generate rules and procedures for detecting malfunctions in the electrical components of the model. A generic message management (GMM) system. GMM is being designed as a message management system for real-time applications which send advisory messages to a user. The primary purpose of GMM is to reduce the risk of overloading a user with information when multiple failures occur and to assist the developer in devising an explanation facility. The emphasis of our work is to develop practical tools and techniques, while determining the feasibility of a given approach, including identification of appropriate software tools to support research, application, and tool-building activities.

  2. Transition flight control room automation

    NASA Technical Reports Server (NTRS)

    Welborn, Curtis Ray

    1990-01-01

    The Workstation Prototype Laboratory is currently working on a number of projects which can have a direct impact on ground operations automation. These projects include: (1) The fuel cell monitoring system (FCMS), which will monitor and detect problems with the fuel cells on the shuttle. FCMS will use a combination of rules (forward/backward) and multithreaded procedures, which run concurrently with the rules, to implement the malfunction algorithms of the EGIL flight controllers. The combination of rule-based reasoning and procedural reasoning allows us to more easily map the malfunction algorithms into a real-time system implementation. (2) A graphical computation language (AGCOMPL) is an experimental prototype to determine the benefits and drawbacks of using a graphical language to design computations (algorithms) to work on shuttle or space station telemetry and trajectory data. (3) The design of a system will allow a model of an electrical system, including telemetry sensors, to be configured on the screen graphically using previously defined electrical icons. This electrical model would then be used to generate rules and procedures for detecting malfunctions in the electrical components of the model. (4) A generic message management (GMM) system is being designed for real-time applications as a message management system which sends advisory messages to a user. The primary purpose of GMM is to reduce the risk of overloading a user with information when multiple failures occur and to assist the developer in devising an explanation facility. The emphasis of our work is to develop practical tools and techniques, including identification of appropriate software tools to support research, application, and tool building activities, while determining the feasibility of a given approach.

  3. Impact of polymer film thickness and cavity size on polymer flow during embossing : towards process design rules for nanoimprint lithography.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schunk, Peter Randall; King, William P.; Sun, Amy Cha-Tien

    2006-08-01

    This paper presents continuum simulations of polymer flow during nanoimprint lithography (NIL). The simulations capture the underlying physics of polymer flow from the nanometer to millimeter length scale and examine geometry and thermophysical process quantities affecting cavity filling. Variations in embossing tool geometry and polymer film thickness during viscous flow distinguish different flow driving mechanisms. Three parameters can predict polymer deformation mode: cavity width to polymer thickness ratio, polymer supply ratio, and capillary number. The ratio of cavity width to initial polymer film thickness determines vertically or laterally dominant deformation. The ratio of indenter width to residual film thickness measures polymer supply beneath the indenter, which determines Stokes or squeeze flow. The local geometry ratios can predict a fill time based on laminar flow between plates, Stokes flow, or squeeze flow. A characteristic NIL capillary number based on the geometry-dependent fill time distinguishes between capillary-driven and viscous-driven flows. The three parameters predict filling modes observed in published studies of NIL deformation over nanometer to millimeter length scales. The work seeks to establish process design rules for NIL and to provide tools for the rational design of NIL master templates, resist polymers, and process parameters.
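The three dimensionless groups named above can be wrapped in a small classifier sketch. The function name and the unit threshold values used here are illustrative placeholders, not the calibrated values from the simulations:

```python
def nil_flow_regime(cavity_width, film_thickness, indenter_width,
                    residual_thickness, viscosity, velocity, surface_tension):
    """Classify the dominant polymer-flow regime in nanoimprint lithography
    from the three dimensionless groups described in the abstract."""
    width_ratio = cavity_width / film_thickness          # vertical vs lateral fill
    supply_ratio = indenter_width / residual_thickness   # Stokes vs squeeze flow
    capillary = viscosity * velocity / surface_tension   # capillary vs viscous drive
    return {
        "deformation": "lateral" if width_ratio > 1.0 else "vertical",
        "flow": "squeeze" if supply_ratio > 1.0 else "stokes",
        "driving": "viscous" if capillary > 1.0 else "capillary",
    }

# wide cavity, thin residual layer, slow viscous polymer (SI-style toy numbers)
regime = nil_flow_regime(cavity_width=200, film_thickness=100,
                         indenter_width=500, residual_thickness=50,
                         viscosity=1e3, velocity=1e-6, surface_tension=0.03)
```

Since all three groups are ratios, the classification is independent of the unit system as long as it is used consistently.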

  4. Characterising bias in regulatory risk and decision analysis: An analysis of heuristics applied in health technology appraisal, chemicals regulation, and climate change governance.

    PubMed

    MacGillivray, Brian H

    2017-08-01

    In many environmental and public health domains, heuristic methods of risk and decision analysis must be relied upon, either because problem structures are ambiguous, reliable data are lacking, or decisions are urgent. This introduces a source of uncertainty beyond model and measurement error: uncertainty stemming from reliance on inexact inference rules. Here we identify and analyse heuristics used to prioritise risk objects, to discriminate between signal and noise, to weight evidence, to construct models, to extrapolate beyond datasets, and to make policy. Some of these heuristics are based on causal generalisations, yet can misfire when those relationships are presumed rather than tested (e.g. surrogates in clinical trials). Others are conventions designed to confer stability on decision analysis, yet may introduce serious error when applied ritualistically (e.g. significance testing). Some heuristics can be traced back to formal justifications, but only under strong assumptions that are often violated in practical applications. Heuristic decision rules (e.g. feasibility rules) in principle act as surrogates for utility maximisation or distributional concerns, yet in practice may neglect costs and benefits, rest on arbitrary thresholds, and be prone to gaming. We highlight the problem of rule entrenchment, where analytical choices that are in principle contestable are arbitrarily fixed in practice, masking uncertainty and potentially introducing bias. Strategies for making risk and decision analysis more rigorous include: formalising the assumptions and scope conditions under which heuristics should be applied; testing rather than presuming their underlying empirical or theoretical justifications; using sensitivity analysis, simulations, multiple bias analysis, and deductive systems of inference (e.g. directed acyclic graphs) to characterise rule uncertainty and refine heuristics; adopting "recovery schemes" to correct for known biases; and basing decision rules on clearly articulated values and evidence, rather than convention. Copyright © 2017. Published by Elsevier Ltd.

  5. Clinical Decision Support for a Multicenter Trial of Pediatric Head Trauma

    PubMed Central

    Swietlik, Marguerite; Deakyne, Sara; Hoffman, Jeffrey M.; Grundmeier, Robert W.; Paterno, Marilyn D.; Rocha, Beatriz H.; Schaeffer, Molly H; Pabbathi, Deepika; Alessandrini, Evaline; Ballard, Dustin; Goldberg, Howard S.; Kuppermann, Nathan; Dayan, Peter S.

    2016-01-01

    Introduction: For children who present to emergency departments (EDs) due to blunt head trauma, ED clinicians must decide who requires computed tomography (CT) scanning to evaluate for traumatic brain injury (TBI). The Pediatric Emergency Care Applied Research Network (PECARN) derived and validated two age-based prediction rules to identify children at very low risk of clinically-important traumatic brain injuries (ciTBIs) who do not typically require CT scans. In this case report, we describe the strategy used to implement the PECARN TBI prediction rules via electronic health record (EHR) clinical decision support (CDS) as the intervention in a multicenter clinical trial. Methods: Thirteen EDs participated in this trial. The 10 sites receiving the CDS intervention used the Epic® EHR. All sites implementing EHR-based CDS built the rules using the vendor's CDS engine. Based on a sociotechnical analysis, we designed the CDS so that recommendations could be displayed immediately after any provider entered prediction rule data. One central site developed and tested the intervention package to be exported to the other sites. The package included a clinical trial alert, an electronic data collection form, the CDS rules, and the format for recommendations. Results: The original PECARN head trauma prediction rules were derived from physician documentation, whereas this pragmatic trial led each site to customize its workflows and allow multiple different providers to complete the head trauma assessments. These differences in workflows led to varying completion rates across sites as well as differences in the types of providers completing the electronic data form. Site variation in internal change management processes made it challenging to maintain the same rigor across all sites, with downstream effects when data reports were developed. Conclusions: The process of a centralized build and export of a CDS system in one commercial EHR system successfully supported a multicenter clinical trial. PMID:27437059

  6. Classification Based on Pruning and Double Covered Rule Sets for the Internet of Things Applications

    PubMed Central

    Zhou, Zhongmei; Wang, Weiping

    2014-01-01

    The Internet of Things (IOT) has been a hot topic in recent years. IOT users accumulate large amounts of data, which makes mining useful knowledge from IOT a great challenge. Classification is an effective strategy that can predict the needs of users in IOT. However, many traditional rule-based classifiers cannot guarantee that every instance is covered by at least two classification rules, so these algorithms cannot achieve high accuracy on some datasets. In this paper, we propose a new rule-based classifier, CDCR-P (Classification based on the Pruning and Double Covered Rule sets). CDCR-P induces two different rule sets, A and B, such that every instance in the training set is covered by at least one rule in rule set A and by at least one rule in rule set B. To improve the quality of rule set B, we take measures to prune the length of its rules. Our experimental results indicate that CDCR-P is not only feasible but also achieves high accuracy. PMID:24511304
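The defining double-coverage property of CDCR-P, that every training instance is matched by at least one rule in each of the two sets, can be checked with a short sketch. Rules are modeled here as simple predicates, which is an assumption for illustration rather than the paper's rule-induction procedure:

```python
def double_covered(instances, rule_set_a, rule_set_b):
    """Return True only if every instance is matched by at least one rule
    in rule set A *and* at least one rule in rule set B."""
    return all(
        any(rule(x) for rule in rule_set_a) and
        any(rule(x) for rule in rule_set_b)
        for x in instances
    )

# toy instances: (feature1, feature2) tuples
data = [(1, 5), (2, 7), (3, 9)]
set_a = [lambda x: x[0] >= 1]                     # covers every instance
set_b = [lambda x: x[1] > 4, lambda x: x[0] > 10]  # first rule covers all
ok = double_covered(data, set_a, set_b)
```

A rule learner satisfying this check guarantees each prediction is supported by at least two rules, which is the property the classifier's accuracy argument rests on.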

  7. Classification based on pruning and double covered rule sets for the internet of things applications.

    PubMed

    Li, Shasha; Zhou, Zhongmei; Wang, Weiping

    2014-01-01

    The Internet of Things (IOT) has been a hot topic in recent years. IOT users accumulate large amounts of data, which makes mining useful knowledge from IOT a great challenge. Classification is an effective strategy that can predict the needs of users in IOT. However, many traditional rule-based classifiers cannot guarantee that every instance is covered by at least two classification rules, so these algorithms cannot achieve high accuracy on some datasets. In this paper, we propose a new rule-based classifier, CDCR-P (Classification based on the Pruning and Double Covered Rule sets). CDCR-P induces two different rule sets, A and B, such that every instance in the training set is covered by at least one rule in rule set A and by at least one rule in rule set B. To improve the quality of rule set B, we take measures to prune the length of its rules. Our experimental results indicate that CDCR-P is not only feasible but also achieves high accuracy.

  8. 77 FR 30087 - Air Quality Designations for the 2008 Ozone National Ambient Air Quality Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-21

    ...This rule establishes initial air quality designations for most areas in the United States, including areas of Indian country, for the 2008 primary and secondary national ambient air quality standards (NAAQS) for ozone. The designations for several counties in Illinois, Indiana, and Wisconsin that the EPA is considering for inclusion in the Chicago nonattainment area will be designated in a subsequent action, no later than May 31, 2012. Areas designated as nonattainment are also being classified by operation of law according to the severity of their air quality problems. The classification categories are Marginal, Moderate, Serious, Severe, and Extreme. The EPA is establishing the air quality thresholds that define the classifications in a separate rule that the EPA is signing and publishing in the Federal Register on the same schedule as these designations. In accordance with that separate rule, six nonattainment areas in California are being reclassified to a higher classification.

  9. Improving the Critic Learning for Event-Based Nonlinear H∞ Control Design.

    PubMed

    Wang, Ding; He, Haibo; Liu, Derong

    2017-10-01

    In this paper, we aim at improving the critic learning criterion to cope with event-based nonlinear H∞ state feedback control design. First, the H∞ control problem is regarded as a two-player zero-sum game, and the adaptive critic mechanism is used to achieve the minimax optimization in an event-based environment. Then, based on an improved updating rule, the event-based optimal control law and the time-based worst-case disturbance law are obtained approximately by training a single critic neural network. An initial stabilizing control is no longer required during the implementation of the new algorithm. Next, the closed-loop system is formulated as an impulsive model, and its stability is handled by incorporating the improved learning criterion. The infamous Zeno behavior of the present event-based design is avoided through theoretical analysis of the lower bound on the minimal intersample time. Finally, applications to aircraft dynamics and a robot arm plant are carried out to verify the performance of the novel design method.

  10. Run-length encoding graphic rules, biochemically editable designs and steganographical numeric data embedment for DNA-based cryptographical coding system

    PubMed Central

    Kawano, Tomonori

    2013-01-01

    There have been a wide variety of approaches for handling pieces of DNA as “unplugged” tools for digital information storage and processing, including a series of studies in security-related areas such as DNA-based digital barcodes, watermarks, and cryptography. In the present article, novel designs of artificial genes are proposed as media for storing digitally compressed image data for bio-computing purposes, whereas natural genes principally encode proteins. Furthermore, the proposed system allows cryptographical application of DNA through biochemically editable designs with capacity for steganographical embedment of numeric data. As a model application of the image-coding DNA technique, numerically and biochemically combined protocols are employed for ciphering given “passwords” and/or secret numbers using DNA sequences. The “passwords” of interest were decomposed into single letters and translated into font images coded on separate DNA chains, with coding regions in which the images are encoded based on the novel run-length encoding rule, and non-coding regions designed for the biochemical editing and remodeling processes that reveal the hidden orientation of the letters composing the original “passwords.” The latter processes require molecular biological tools for digestion and ligation of the fragmented DNA molecules, targeting the polymerase chain reaction-engineered termini of the chains. Lastly, additional protocols for steganographical overwriting of the numeric data of interest over the image-coding DNA are also discussed. PMID:23750303
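Run-length encoding itself is simple to sketch. The toy version below compresses a row of a 1-bit font image into (symbol, count) pairs; it illustrates the general idea only and is not the article's exact DNA graphic rule:

```python
def rle_encode(seq):
    """Run-length encode a string, e.g. one row of a 1-bit font image,
    into a list of (symbol, run_length) pairs."""
    out = []
    for ch in seq:
        if out and out[-1][0] == ch:
            out[-1] = (ch, out[-1][1] + 1)   # extend the current run
        else:
            out.append((ch, 1))              # start a new run
    return out

def rle_decode(pairs):
    """Invert rle_encode: expand each (symbol, count) pair."""
    return "".join(ch * n for ch, n in pairs)

row = "000111100110"          # one row of a toy bitmap glyph
encoded = rle_encode(row)
```

In a DNA realization, each (symbol, count) pair would be mapped to a short nucleotide word; the round-trip property checked below is what makes the stored image recoverable.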

  11. Ontology-based classification of remote sensing images using spectral rules

    NASA Astrophysics Data System (ADS)

    Andrés, Samuel; Arvor, Damien; Mougenot, Isabelle; Libourel, Thérèse; Durieux, Laurent

    2017-05-01

    Earth Observation data is of great interest for a wide spectrum of scientific domain applications. Enhanced access to remote sensing images for domain experts thus represents a great advance, since it allows users to interpret remote sensing images based on their domain expert knowledge. However, such an advantage can also turn into a major limitation if this knowledge is not formalized and thus cannot easily be shared with or understood by other users. In this context, knowledge representation techniques such as ontologies should play a major role in the future of remote sensing applications. We implemented an ontology-based prototype to automatically classify Landsat images based on explicit spectral rules. The ontology is designed in a very modular way in order to achieve a generic and versatile representation of concepts we consider of utmost importance in remote sensing. The prototype was tested on four subsets of Landsat images, and the results confirmed the potential of ontologies to formalize expert knowledge and classify remote sensing images.
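An explicit spectral rule of the kind the prototype formalizes can be sketched as a plain function; the NDVI thresholds and class names below are illustrative, not those used by the ontology:

```python
def classify_pixel(bands):
    """Classify one pixel from explicit spectral rules. `bands` maps band
    names to reflectance values; NDVI drives the toy rule set."""
    red, nir = bands["red"], bands["nir"]
    # normalized difference vegetation index, guarding against zero division
    ndvi = (nir - red) / (nir + red) if (nir + red) else 0.0
    if ndvi > 0.4:
        return "vegetation"
    if ndvi < 0.0:
        return "water"
    return "bare_soil"

label = classify_pixel({"red": 0.05, "nir": 0.45})  # strongly vegetated pixel
```

Encoding such rules in an ontology rather than in code is what lets different experts share, inspect, and reuse them.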

  12. Smart Aerospace eCommerce: Using Intelligent Agents in a NASA Mission Services Ordering Application

    NASA Technical Reports Server (NTRS)

    Moleski, Walt; Luczak, Ed; Morris, Kim; Clayton, Bill; Scherf, Patricia; Obenschain, Arthur F. (Technical Monitor)

    2002-01-01

    This paper describes how intelligent agent technology was successfully prototyped and then deployed in a smart eCommerce application for NASA. An intelligent software agent called the Intelligent Service Validation Agent (ISVA) was added to an existing web-based ordering application to validate complex orders for spacecraft mission services. This integration of intelligent agent technology with conventional web technology satisfies an immediate NASA need to reduce manual order-processing costs. The ISVA agent checks orders for completeness, consistency, and correctness, and notifies users of detected problems. ISVA uses NASA business rules and a knowledge base of NASA services, and is implemented using the Java Expert System Shell (Jess), a fast rule-based inference engine. The paper discusses the design of the agent and knowledge base, the prototyping and deployment approach, future directions, and other applications, along with lessons learned that may help other projects make their aerospace eCommerce applications smarter.
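The completeness and consistency checks ISVA performs can be sketched as a tiny rule loop. Jess itself is a Java rule engine, so this Python sketch, with invented rule descriptions and field names, only mirrors the idea:

```python
def validate_order(order, rules):
    """Toy rule-based order validation: each rule is a
    (description, predicate) pair; a predicate returning False
    adds the description to the list of detected problems."""
    problems = []
    for description, predicate in rules:
        if not predicate(order):
            problems.append(description)
    return problems

# hypothetical business rules for a mission-services order
rules = [
    ("mission id missing", lambda o: bool(o.get("mission_id"))),
    ("start must precede end", lambda o: o.get("start", 0) < o.get("end", 0)),
]
issues = validate_order({"mission_id": "", "start": 5, "end": 3}, rules)
```

A production rule engine adds forward chaining and pattern matching on a working memory, but the contract is the same: rules fire on order facts and emit problem notifications.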

  13. Video Self-Modeling to Teach Classroom Rules to Two Students with Asperger's

    ERIC Educational Resources Information Center

    Lang, Russell; Shogren, Karrie A.; Machalicek, Wendy; Rispoli, Mandy; O'Reilly, Mark; Baker, Sonia; Regester, April

    2009-01-01

    Classroom rules are an integral part of classroom management. Children with Asperger's may require systematic instruction to learn classroom rules, but may be placed in classrooms in which the rules are not explicitly taught. A multiple baseline design across students with probes for maintenance after the intervention ceased was used to evaluate…

  14. 78 FR 36625 - Self-Regulatory Organizations; Chicago Mercantile Exchange Inc.; Notice of Designation of a...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-18

    ...\\ and Rule 19b-4 thereunder,\\2\\ a proposed rule change to make adjustments to the liquidity risk factor... liquidity risk factor component. The proposed rule change was published for comment in the Federal Register... on Proposed Rule Change Related to the Liquidity Factor of CME's CDS Margin Methodology June 12, 2013...

  15. 'Ten Golden Rules' for Designing Software in Medical Education: Results from a Formative Evaluation of DIALOG.

    ERIC Educational Resources Information Center

    Jha, Vikram; Duffy, Sean

    2002-01-01

    Reports the results of an evaluation of Distance Interactive Learning in Obstetrics and Gynecology (DIALOG) which is an electronic program for continuing education. Presents 10 golden rules for designing software for medical practitioners. (Contains 26 references.) (Author/YDS)

  16. 77 FR 39277 - Self-Regulatory Organizations; NASDAQ OMX BX, Inc.; Order Granting Approval of a Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-02

    ... with a time-in- force designation of Good Til canceled (``GTC'') are treated as having a time-in-force... designation of Good Til Cancelled or Immediate or Cancel. See proposed BX Options Rules, Chapter VI, Section 1...

  17. 46 CFR 116.300 - Structural design.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Structure § 116.300 Structural design. Except as otherwise allowed by this subpart, a vessel must comply... the vessel. (a) Steel hull vessels: (1) Rules and Regulations for the Classification of Yachts and Small Craft, Lloyd's Register of Shipping (Lloyd's); or (2) Rules for Building and Classing Steel...

  18. 46 CFR 116.300 - Structural design.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Structure § 116.300 Structural design. Except as otherwise allowed by this subpart, a vessel must comply... the vessel. (a) Steel hull vessels: (1) Rules and Regulations for the Classification of Yachts and Small Craft, Lloyd's Register of Shipping (Lloyd's); or (2) Rules for Building and Classing Steel...

  19. 46 CFR 116.300 - Structural design.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Structure § 116.300 Structural design. Except as otherwise allowed by this subpart, a vessel must comply... the vessel. (a) Steel hull vessels: (1) Rules and Regulations for the Classification of Yachts and Small Craft, Lloyd's Register of Shipping (Lloyd's); or (2) Rules for Building and Classing Steel...

  20. 46 CFR 116.300 - Structural design.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Structure § 116.300 Structural design. Except as otherwise allowed by this subpart, a vessel must comply... the vessel. (a) Steel hull vessels: (1) Rules and Regulations for the Classification of Yachts and Small Craft, Lloyd's Register of Shipping (Lloyd's); or (2) Rules for Building and Classing Steel...
