Using pattern enumeration to accelerate process development and ramp yield
NASA Astrophysics Data System (ADS)
Zhuang, Linda; Pang, Jenny; Xu, Jessy; Tsai, Mengfeng; Wang, Amy; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh; Ding, Hua
2016-03-01
During the setup phase of a new technology node, foundries do not initially have enough product chip designs to conduct exhaustive process development. Different operational teams use manually designed, simple test keys to set up their process flows and recipes. When the first version of the design rule manual (DRM) is ready, foundries enter the process development phase, where new experimental design data is created manually from these design rules. However, these IP/test keys contain very uniform or simple design structures. Such designs normally lack the critical design structures and process-unfriendly design patterns that pass design rule checks but prove less manufacturable. A method is therefore needed to generate, at the development stage, exhaustive test patterns allowed by the design rules in order to verify the gap between the design rules and the process. This paper presents a novel method for generating test key patterns that contain known problematic patterns as well as any constructs designers could possibly draw under the current design rules. The enumerated test key patterns contain the most critical design structures allowed by any particular design rule. A layout profiling method is used for design chip analysis to find potential weak points on new incoming products, so the fab can take preemptive action to avoid yield loss. This is achieved by comparing different products and leveraging the knowledge learned from previously manufactured chips to find possible yield detractors.
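For illustration only, a minimal Python sketch of rule-constrained enumeration, assuming hypothetical one-dimensional width/space rules and a layout grid; the paper's enumeration operates on full two-dimensional layout constructs derived from the DRM.

```python
from itertools import product

# Hypothetical minimum design rules (nm); real values come from the DRM.
MIN_WIDTH, MIN_SPACE, GRID = 40, 50, 10

def enumerate_line_space_patterns(max_width=80, max_space=100):
    """Enumerate every width/space pair the rules allow on the layout grid.

    Each pair is a candidate test-key pattern; patterns at the rule
    minimum are the most critical structures to verify on silicon.
    """
    widths = range(MIN_WIDTH, max_width + 1, GRID)
    spaces = range(MIN_SPACE, max_space + 1, GRID)
    return [(w, s) for w, s in product(widths, spaces)]

patterns = enumerate_line_space_patterns()
print(f"{len(patterns)} rule-legal patterns, worst case: {patterns[0]}")
```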
Process Materialization Using Templates and Rules to Design Flexible Process Models
NASA Astrophysics Data System (ADS)
Kumar, Akhil; Yao, Wen
The main idea in this paper is to show how flexible processes can be designed by combining generic process templates and business rules. We instantiate a process by applying rules to specific case data and running a materialization algorithm. The customized process instance is then executed in an existing workflow engine. We present an architecture and give an algorithm for process materialization. The rules are written in a logic-based language like Prolog. Our focus is on capturing deeper process knowledge and achieving a holistic approach to robust process design, one that encompasses control flow, resources, and data, and makes it easier to accommodate changes to business policy.
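A minimal Python sketch of the materialization idea, with hypothetical rules and case data; the paper itself expresses rules in a logic language like Prolog, not in Python.

```python
# Hypothetical generic template: each step may carry a guard rule that is
# evaluated against the case data at materialization time.
TEMPLATE = [
    ("receive_order", None),
    ("credit_check", lambda case: case["amount"] > 1000),
    ("manager_approval", lambda case: case["customer"] == "new"),
    ("ship_goods", None),
]

def materialize(template, case):
    """Instantiate a process: keep a step if it has no rule or its rule fires."""
    return [step for step, rule in template if rule is None or rule(case)]

print(materialize(TEMPLATE, {"amount": 2500, "customer": "existing"}))
# ['receive_order', 'credit_check', 'ship_goods']
```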
Design rules for RCA self-aligned silicon-gate CMOS/SOS process
NASA Technical Reports Server (NTRS)
1977-01-01
The CMOS/SOS design rules prepared by the RCA Solid State Technology Center (SSTC) are described. These rules specify the spacing and width requirements for each of the six design levels; a seventh level is used to define openings in the passivation layer. An associated report, entitled Silicon-Gate CMOS/SOS Processing, provides further insight into the usage of these rules.
Investigation of model-based physical design restrictions (Invited Paper)
NASA Astrophysics Data System (ADS)
Lucas, Kevin; Baron, Stanislas; Belledent, Jerome; Boone, Robert; Borjon, Amandine; Couderc, Christophe; Patterson, Kyle; Riviere-Cazaux, Lionel; Rody, Yves; Sundermann, Frank; Toublan, Olivier; Trouiller, Yorick; Urbani, Jean-Christophe; Wimmer, Karl
2005-05-01
As lithography and other patterning processes become more complex and more non-linear with each generation, the task of defining physical design rules necessarily increases in complexity as well. The goal of physical design rules is to define the boundary between the physical layout structures which will yield well and those which will not. This is essentially a rule-based pre-silicon guarantee of layout correctness. However, the rapid increase in design rule complexity has created logistical problems for both the design and process functions. Therefore, similar to the semiconductor industry's transition from rule-based to model-based optical proximity correction (OPC) driven by increased patterning complexity, there are evident opportunities for improving physical design restrictions by implementing model-based physical design methods. In this paper we analyze the possible need and applications for model-based physical design restrictions (MBPDR). We first analyze the traditional design rule evolution, development, and usage methodologies of semiconductor manufacturers. Next we discuss examples of specific design rule challenges requiring new solution methods in the patterning regime of low-k1 lithography and highly complex RET. We then evaluate possible working strategies for MBPDR in the process development and product design flows, including examples of recent model-based pre-silicon verification techniques. Finally, we summarize with a proposed flow and key considerations for MBPDR implementation.
Failure detection system design methodology. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Chow, E. Y.
1980-01-01
The design of a failure detection and identification (FDI) system consists of designing a robust residual generation process and a high-performance decision making process. The design of these two processes is examined separately. Residual generation is based on analytical redundancy. Redundancy relations that are insensitive to modelling errors and noise effects are important for designing robust residual generation processes. The characterization of the concept of analytical redundancy in terms of a generalized parity space provides a framework in which a systematic approach to the determination of robust redundancy relations is developed. The Bayesian approach is adopted for the design of high-performance decision processes. The FDI decision problem is formulated as a Bayes sequential decision problem. Since the optimal decision rule is incomputable, a methodology for designing suboptimal rules is proposed. A numerical algorithm is developed to facilitate the design and performance evaluation of suboptimal rules.
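A hedged LaTeX sketch of the sequential decision structure described above; the residual and threshold notation here is illustrative, not the thesis's own.

```latex
% Residuals from redundancy relations; v is chosen so the relation is
% insensitive to modelling error:
%   r(k) = v^T z(k),  with  v^T z(k) = 0  under no-failure conditions.
% Bayes sequential rule: keep sampling while the posterior failure
% probability lies between two thresholds, decide otherwise:
\[
  \delta(k) =
  \begin{cases}
    \text{declare failure}, & \Pr(\text{fail} \mid r_1,\dots,r_k) \ge T_U \\
    \text{declare no failure}, & \Pr(\text{fail} \mid r_1,\dots,r_k) \le T_L \\
    \text{take another sample}, & \text{otherwise.}
  \end{cases}
\]
```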
Bayesian design of decision rules for failure detection
NASA Technical Reports Server (NTRS)
Chow, E. Y.; Willsky, A. S.
1984-01-01
The formulation of the decision making process of a failure detection algorithm as a Bayes sequential decision problem provides a simple conceptualization of the decision rule design problem. As the optimal Bayes rule is not computable, a methodology that is based on the Bayesian approach and aimed at a reduced computational requirement is developed for designing suboptimal rules. A numerical algorithm is constructed to facilitate the design and performance evaluation of these suboptimal rules. The result of applying this design methodology to an example shows that this approach is potentially a useful one.
A novel methodology for building robust design rules by using design based metrology (DBM)
NASA Astrophysics Data System (ADS)
Lee, Myeongdong; Choi, Seiryung; Choi, Jinwoo; Kim, Jeahyun; Sung, Hyunju; Yeo, Hyunyoung; Shim, Myoungseob; Jin, Gyoyoung; Chung, Eunseung; Roh, Yonghan
2013-03-01
This paper addresses a methodology for building robust design rules by using design based metrology (DBM). The conventional method for building design rules relies on a simulation tool and a simple-pattern spider mask. At the early stage of a device, the accuracy of the simulation tool is poor, and evaluation of the simple-pattern spider mask is rather subjective because it depends on the experiential judgment of an engineer. In this work, we designed a large number of pattern situations including various 1D and 2D design structures. To overcome the difficulty of inspecting many types of patterns, we introduced Design Based Metrology (DBM) from Nano Geometry Research, Inc., with which these mass patterns could be inspected at high speed. We also carried out quantitative analysis on PWQ silicon data to estimate process variability. Our methodology demonstrates high speed and accuracy for building design rules: all test patterns were inspected within a few hours, and the mass silicon data were handled by statistical processing rather than personal judgment. From the results, robust design rules are successfully verified and extracted. We conclude that our methodology is appropriate for building robust design rules.
Integrated layout based Monte-Carlo simulation for design arc optimization
NASA Astrophysics Data System (ADS)
Shao, Dongbing; Clevenger, Larry; Zhuang, Lei; Liebmann, Lars; Wong, Robert; Culp, James
2016-03-01
Design rules are created by considering a wafer fail mechanism with the relevant design levels under various design cases, and the values are set to cover the worst-case scenario. Because of this simplification and generalization, design rules hinder, rather than help, dense device scaling; as an example, SRAM designs always need extensive ground rule waivers. Furthermore, dense designs often involve a "design arc": a collection of design rules whose sum equals the critical pitch defined by the technology. Within a design arc, a single rule change can lead to a chain reaction of other rule violations. In this talk we present a methodology using Layout Based Monte-Carlo Simulation (LBMCS) with integrated multiple ground rule checks. We apply this methodology to an SRAM word line contact, and the result is a layout that has balanced wafer fail risks based on Process Assumptions (PAs). This work was performed at the IBM Microelectronics Div, Semiconductor Research and Development Center, Hopewell Junction, NY 12533.
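A minimal sketch of layout-based Monte-Carlo rule checking under assumed Gaussian process assumptions; the PA names, values, and the single folded margin check are hypothetical stand-ins for the paper's integrated multi-rule checks.

```python
import random

# Hypothetical process assumptions: (mean, sigma) in nm for each variation.
PA_CONTACT_SHIFT = (0.0, 2.0)
PA_LINE_END_PULLBACK = (3.0, 1.5)

def trial(drawn_overlap=10.0):
    """One Monte-Carlo draw: does the word-line contact still land on the gate?"""
    shift = random.gauss(*PA_CONTACT_SHIFT)
    pullback = random.gauss(*PA_LINE_END_PULLBACK)
    overlap = drawn_overlap - abs(shift) - pullback
    return overlap > 0  # the ground-rule checks folded into one margin

def fail_rate(n=100_000):
    """Estimate the wafer-fail risk for the drawn layout."""
    return sum(not trial() for _ in range(n)) / n

print(f"estimated wafer-fail risk: {fail_rate():.4%}")
```

Sweeping `drawn_overlap` in such a loop is what lets the layout trade one rule against another until the fail risks balance.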
Self-Interest and the Design of Rules.
Singh, Manvir; Wrangham, Richard; Glowacki, Luke
2017-12-01
Rules regulating social behavior raise challenging questions about cultural evolution in part because they frequently confer group-level benefits. Current multilevel selection theories contend that between-group processes interact with within-group processes to produce norms and institutions, but within-group processes have remained underspecified, leading to a recent emphasis on cultural group selection as the primary driver of cultural design. Here we present the self-interested enforcement (SIE) hypothesis, which proposes that the design of rules importantly reflects the relative enforcement capacities of competing parties. We show that, in addition to explaining patterns in cultural change and stability, SIE can account for the emergence of much group-functional culture. We outline how this process can stifle or accelerate cultural group selection, depending on various social conditions. Self-interested enforcement has important bearings on the emergence, stability, and change of rules.
Monitoring Agents for Assisting NASA Engineers with Shuttle Ground Processing
NASA Technical Reports Server (NTRS)
Semmel, Glenn S.; Davis, Steven R.; Leucht, Kurt W.; Rowe, Danil A.; Smith, Kevin E.; Boeloeni, Ladislau
2005-01-01
The Spaceport Processing Systems Branch at NASA Kennedy Space Center has designed, developed, and deployed a rule-based agent to monitor the Space Shuttle's ground processing telemetry stream. The NASA Engineering Shuttle Telemetry Agent increases situational awareness for system and hardware engineers during ground processing of the Shuttle's subsystems. The agent provides autonomous monitoring of the telemetry stream and automatically alerts system engineers when user-defined conditions are satisfied. Efficiency and safety are improved through increased automation. Sandia National Labs' Java Expert System Shell is employed as the agent's rule engine. The shell's predicate logic lends itself well to capturing the heuristics and specifying the engineering rules within this domain. The declarative paradigm of the rule-based agent yields a highly modular and scalable design spanning multiple subsystems of the Shuttle. Several hundred monitoring rules have been written thus far, with corresponding notifications sent to Shuttle engineers. This chapter discusses the rule-based telemetry agent used for Space Shuttle ground processing. We present the problem domain along with design and development considerations such as information modeling, knowledge capture, and the deployment of the product. We also present ongoing work with other condition monitoring agents.
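The agent's rules are written in the Java Expert System Shell; the following Python stand-in is only a sketch of the pattern (a user-defined condition over a telemetry sample triggering a notification), with hypothetical measurement names and limits.

```python
# Each rule: (name, predicate over a telemetry sample, alert text).
RULES = [
    ("hyd_pressure_low", lambda t: t["hyd_pressure"] < 2800,
     "Hydraulic pressure below limit"),
    ("fuel_cell_temp_high", lambda t: t["fc_temp"] > 93.0,
     "Fuel cell temperature above limit"),
]

def monitor(sample, notify=print):
    """Fire every rule whose condition is satisfied by the sample."""
    for name, condition, message in RULES:
        if condition(sample):
            notify(f"[{name}] {message}: {sample}")

monitor({"hyd_pressure": 2650, "fc_temp": 88.5})
```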
A business rules design framework for a pharmaceutical validation and alert system.
Boussadi, A; Bousquet, C; Sabatier, B; Caruba, T; Durieux, P; Degoulet, P
2011-01-01
Several alert systems have been developed to improve the patient safety aspects of clinical information systems (CIS). Most studies have focused on the evaluation of these systems, with little information provided about the methodology leading to system implementation. We propose here an 'agile' business rule design framework (BRDF) supporting both the design of alerts for the validation of drug prescriptions and the incorporation of the end user into the design process. We analyzed the unified process (UP) design life cycle and defined the activities, subactivities, actors, and UML artifacts that could be used to enhance the agility of the proposed framework. We then applied the proposed framework to two different sets of data in the context of the Georges Pompidou University Hospital (HEGP) CIS. We introduced two new subactivities into UP: business rule specification and business rule instantiation. The pharmacist made an effective contribution to five of the eight BRDF design activities. Validation of the two new subactivities was carried out in the context of adapting drug dosage to the patient's clinical and biological context. A pilot experiment showed that business rules modeled with BRDF and implemented as an alert system triggered an alert for 5824 of the 71,413 prescriptions considered (8.16%). A business rule design framework approach meets one of the strategic objectives for decision support design by taking into account three important criteria that pose a particular challenge to system designers: 1) business processes, 2) knowledge modeling of the context of application, and 3) the agility of the various design steps.
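A minimal sketch of one dosage-adaptation alert rule of the kind BRDF instantiates; the drug name, dose threshold, and creatinine-clearance cutoff are hypothetical, not taken from the paper.

```python
def dose_alert(prescription, patient):
    """Fire an alert when renal function requires a lower daily dose."""
    # Hypothetical instantiated rule: cap drug_X when CrCl < 30 mL/min.
    if (prescription["drug"] == "drug_X"
            and patient["crcl_ml_min"] < 30
            and prescription["daily_dose_mg"] > 500):
        return "ALERT: reduce drug_X dose for impaired renal function"
    return None

print(dose_alert({"drug": "drug_X", "daily_dose_mg": 1000},
                 {"crcl_ml_min": 25}))
```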
Portable design rules for bulk CMOS
NASA Technical Reports Server (NTRS)
Griswold, T. W.
1982-01-01
It is pointed out that for the past several years, one school of IC designers has used a simplified set of nMOS geometric design rules (GDR) which is 'portable', in that it can be used by many different nMOS manufacturers. The present investigation is concerned with a preliminary set of design rules for bulk CMOS which has been verified for simple test structures. The GDR are defined in terms of Caltech Intermediate Form (CIF), which is a geometry-description language that defines simple geometrical objects in layers. The layers are abstractions of physical mask layers. The design rules do not presume the existence of any particular design methodology. Attention is given to p-well and n-well CMOS processes, bulk CMOS and CMOS-SOS, CMOS geometric rules, and a description of the advantages of CMOS technology.
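A minimal Python sketch of how width and spacing rules of this lambda-based kind can be checked, with hypothetical rule names and values; CIF itself is a mask-geometry description language and is not reproduced here.

```python
# Hypothetical lambda-based bulk CMOS rules (all dimensions in lambda).
RULES = {"poly_width": 2, "poly_space": 2, "pwell_to_nplus": 5}

def check(rule_name, measured, rules=RULES):
    """Return a violation message, or None if the measured value is legal."""
    minimum = rules[rule_name]
    if measured < minimum:
        return f"{rule_name}: {measured} lambda < minimum {minimum} lambda"
    return None

print(check("poly_space", 1))   # violation
print(check("poly_width", 3))   # None: rule satisfied
```

Expressing the rules in lambda rather than microns is what makes them portable: rescaling lambda retargets the same layout to a different manufacturer's process.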
Design technology co-optimization for 14/10nm metal1 double patterning layer
NASA Astrophysics Data System (ADS)
Duan, Yingli; Su, Xiaojing; Chen, Ying; Su, Yajuan; Shao, Feng; Zhang, Recco; Lei, Junjiang; Wei, Yayi
2016-03-01
Design and technology co-optimization (DTCO) satisfies the needs of the design, generates robust design rules, and avoids unfriendly patterns at the early stage of design, ensuring a high level of manufacturability of the product within the technical capability of the present process. The DTCO methodology in this paper mainly includes design rule translation, layout analysis, model validation, hotspot classification, and design rule optimization. Combining DTCO with double patterning (DPT) optimizes the related design rules and generates friendlier layouts that meet the requirements of the 14/10nm technology node. The experiment demonstrates the DPT-compliant DTCO methodology applied to a metal1 layer of the 14/10nm node. The proposed DTCO workflow is an efficient solution for optimizing the design rules for the 14/10nm metal1 layer. The paper also discusses and verifies how to tune the design rules for U-shape and L-shape structures in a DPT-aware metal layer.
NASA Astrophysics Data System (ADS)
Serrano, Rafael; González, Luis Carlos; Martín, Francisco Jesús
2009-11-01
Under the SENSOR-IA project, which received financial funding from the Order of Incentives to the Regional Technology Centers of the Council of Innovation, Science and Enterprise of Andalusia, an architecture for the real-time optimization of a machining process through a rule-based expert system has been developed. The architecture consists of a sensor data acquisition and processing engine (SATD) and a rule-based expert system (SE) which communicates with the SATD. The SE has been designed as an inference engine with an algorithm for effective action, using a modus ponens model of goal-oriented rules. The pilot test demonstrated that it is possible to govern the machining process in real time based on rules contained in an SE. The tests were done with approximate rules. Future work includes exhaustive collection of data with different tool materials and geometries in a database, in order to extract more precise rules.
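A minimal Python sketch of the goal-oriented modus ponens inference such an SE performs; the machining facts and rules are hypothetical.

```python
# Rules as (antecedents, consequent); facts as a set of strings.
RULES = [
    ({"vibration_high", "tool_steel"}, "reduce_feed_rate"),
    ({"temperature_high"}, "increase_coolant"),
]

def infer(facts):
    """Forward-chain with modus ponens until no new conclusion fires."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in RULES:
            if antecedents <= derived and consequent not in derived:
                derived.add(consequent)
                changed = True
    return derived - set(facts)

print(infer({"vibration_high", "tool_steel"}))  # {'reduce_feed_rate'}
```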
19 CFR 191.7 - General manufacturing drawback ruling.
Code of Federal Regulations, 2010 CFR
2010-04-01
... followed without variation; and (iv) The described manufacturing or production process is a manufacture or... ruling. (a) Purpose; eligibility. General manufacturing drawback rulings are designed to simplify... parent corporation is engaged in manufacture or production for drawback, the subsidiary is the proper...
Hyper-heuristic Evolution of Dispatching Rules: A Comparison of Rule Representations.
Branke, Jürgen; Hildebrandt, Torsten; Scholz-Reiter, Bernd
2015-01-01
Dispatching rules are frequently used for real-time, online scheduling in complex manufacturing systems. The design of such rules is usually done by experts in a time-consuming trial-and-error process. Recently, evolutionary algorithms have been proposed to automate the design process. There are several possibilities for representing rules in this hyper-heuristic search. Because the representation determines the search neighborhood and the complexity of the rules that can be evolved, a suitable choice of representation is key to a successful evolutionary algorithm. In this paper we empirically compare three different representations, both numeric and symbolic, for automated rule design: a linear combination of attributes, a representation based on artificial neural networks, and a tree representation. Using appropriate evolutionary algorithms (CMA-ES for the neural network and linear representations, genetic programming for the tree representation), we empirically investigate the suitability of each representation in a dynamic stochastic job shop scenario. We also examine the robustness of the evolved dispatching rules against variations in the underlying job shop scenario, and visualize what the rules do, in order to get an intuitive understanding of their inner workings. Results indicate that the tree representation using an improved version of genetic programming gives the best results if many candidate rules can be evaluated, closely followed by the neural network representation, which already leads to good results for small to moderate computational budgets. The linear representation is found to be competitive only for extremely small computational budgets.
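A minimal sketch of the simplest of the three representations, the linear combination of job attributes; the attribute set and weights here are placeholders for what CMA-ES would evolve.

```python
# Job attributes a dispatching rule typically sees at a machine queue.
jobs = [
    {"id": 1, "proc_time": 5.0, "due_in": 20.0, "work_remaining": 12.0},
    {"id": 2, "proc_time": 3.0, "due_in": 8.0,  "work_remaining": 3.0},
]

# Hypothetical evolved weights; CMA-ES would search this vector.
W = {"proc_time": -1.0, "due_in": -0.5, "work_remaining": -0.2}

def priority(job):
    """Linear dispatching rule: weighted sum of the job's attributes."""
    return sum(W[k] * job[k] for k in W)

next_job = max(jobs, key=priority)
print(f"dispatch job {next_job['id']}")  # job 2: shortest and most urgent
```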
A Rule Based Approach to ISS Interior Volume Control and Layout
NASA Technical Reports Server (NTRS)
Peacock, Brian; Maida, Jim; Fitts, David; Dory, Jonathan
2001-01-01
Traditional human factors design involves the development of human factors requirements based on a desire to accommodate a certain percentage of the intended user population. As the product is developed human factors evaluation involves comparison between the resulting design and the specifications. Sometimes performance metrics are involved that allow leniency in the design requirements given that the human performance result is satisfactory. Clearly such approaches may work but they give rise to uncertainty and negotiation. An alternative approach is to adopt human factors design rules that articulate a range of each design continuum over which there are varying outcome expectations and interactions with other variables, including time. These rules are based on a consensus of human factors specialists, designers, managers and customers. The International Space Station faces exactly this challenge in interior volume control, which is based on anthropometric, performance and subjective preference criteria. This paper describes the traditional approach and then proposes a rule-based alternative. The proposed rules involve spatial, temporal and importance dimensions. If successful this rule-based concept could be applied to many traditional human factors design variables and could lead to a more effective and efficient contribution of human factors input to the design process.
19 CFR Appendix A to Part 191 - General Manufacturing Drawback Rulings
Code of Federal Regulations, 2010 CFR
2010-04-01
... manufacture or production. B. These general manufacturing drawback rulings supersede general “contracts... manufacturing drawback rulings which have been designed to simplify drawback procedures. Any person who can... drawback; and 9. Description of the manufacturing or production process, unless specifically described in...
Lenas, Petros; Moos, Malcolm; Luyten, Frank P
2009-12-01
The field of tissue engineering is moving toward a new concept of "in vitro biomimetics of in vivo tissue development." In Part I of this series, we proposed a theoretical framework integrating the concepts of developmental biology with those of process design to provide the rules for the design of biomimetic processes. We named this methodology "developmental engineering" to emphasize that it is not the tissue but the process of in vitro tissue development that has to be engineered. To formulate the process design rules in a rigorous way that will allow a computational design, we should refer to mathematical methods to model the biological process taking place in vitro. Tissue functions cannot be attributed to individual molecules but rather to complex interactions between the numerous components of a cell and interactions between cells in a tissue that form a network. For tissue engineering to advance to the level of a technologically driven discipline amenable to well-established principles of process engineering, a scientifically rigorous formulation is needed of the general design rules so that the behavior of networks of genes, proteins, or cells that govern the unfolding of developmental processes could be related to the design parameters. Now that sufficient experimental data exist to construct plausible mathematical models of many biological control circuits, explicit hypotheses can be evaluated using computational approaches to facilitate process design. Recent progress in systems biology has shown that the empirical concepts of developmental biology that we used in Part I to extract the rules of biomimetic process design can be expressed in rigorous mathematical terms. This allows the accurate characterization of manufacturing processes in tissue engineering as well as the properties of the artificial tissues themselves. In addition, network science has recently shown that the behavior of biological networks strongly depends on their topology and has developed the necessary concepts and methods to describe it, allowing therefore a deeper understanding of the behavior of networks during biomimetic processes. These advances thus open the door to a transition for tissue engineering from a substantially empirical endeavor to a technology-based discipline comparable to other branches of engineering.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-18
...-Adviser has designed the following quantitative stock selection rules to make allocation decisions and to..., the Sub-Adviser's investment process is quantitative. Based on extensive historical research, the Sub... open-end fund's portfolio composition must be subject to procedures designed to prevent the use and...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-01
...). ACTION: Final rule. SUMMARY: EPA is finalizing a significant new use rule (SNUR) under the Toxic Substances Control Act (TSCA) for the chemical substance identified generically as ethoxylated, propoxylated... manufacture, import, or process this chemical substance for an activity that is designated as a significant...
Overview of the production of sintered SiC optics and optical sub-assemblies
NASA Astrophysics Data System (ADS)
Williams, S.; Deny, P.
2005-08-01
The following is an overview of sintered silicon carbide (SSiC) material properties and the processing requirements for manufacturing components for advanced-technology optical systems. The overview compares SSiC material properties to those of typical materials used for optics and optical structures. In addition, it reviews in detail, step by step, the manufacturing processes required to produce optical components. The process overview illustrates the current manufacturing process and concepts for expanding process size capability, and includes information on the substantial capital equipment employed in the manufacturing of SSiC. This paper also reviews common in-process inspection methodology and design rules. The design rules are used to improve production yield, minimize cost, and maximize the inherent benefits of SSiC for optical systems. Optimizing optical system designs for an SSiC manufacturing process will allow systems designers to utilize SSiC as a low-risk, cost-competitive, fast-cycle-time technology for next-generation optical systems.
77 FR 19861 - Certain Polybrominated Diphenylethers; Significant New Use Rule and Test Rule
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-02
...The Agency is proposing to amend the Toxic Substances Control Act (TSCA) section 5(a) Significant New Use Rule (SNUR), for certain polybrominated diphenylethers (PBDEs) by: Designating processing of six PBDEs, or any combination of these chemical substances resulting from a chemical reaction, as a significant new use; designating manufacturing, importing, and processing of a seventh PBDE, decabromodiphenyl ether (decaBDE) for any use which is not ongoing after December 31, 2013, as a significant new use; and making inapplicable the article exemption for SNURs for this action. A person who intends to import or process any of the seven PBDEs included in the proposed SNUR, as part of an article for a significant new use would be required to notify EPA at least 90 days in advance to ensure that the Agency has an opportunity to review and, if necessary, restrict or prohibit a new use before it begins. EPA is also proposing a test rule under TSCA that would require any person who manufactures or processes commercial pentabromodiphenyl ether (c-pentaBDE), commercial octabromodiphenyl ether (c-octaBDE), or commercial decaBDE (c-decaBDE), including in articles, for any use after December 31, 2013, to conduct testing on their effects on health and the environment. EPA is proposing to designate all discontinued uses of PBDEs as significant new uses. The test rule would be promulgated if EPA determines that there are persons who intend to manufacture, import, or process c-pentaBDE, c-octaBDE, or c-decaBDE, for any use, including in articles, after December 31, 2013.
NASA Technical Reports Server (NTRS)
Yen, John; Wang, Haojin; Daugherity, Walter C.
1992-01-01
Fuzzy logic controllers have some often-cited advantages over conventional techniques such as PID control, including easier implementation, accommodation to natural language, and the ability to cover a wider range of operating conditions. One major obstacle that hinders the broader application of fuzzy logic controllers is the lack of a systematic way to develop and modify their rules; as a result, the creation and modification of fuzzy rules often depends on trial and error or pure experimentation. One of the proposed approaches to address this issue is a self-learning fuzzy logic controller (SFLC) that uses reinforcement learning techniques to learn the desirability of states and to adjust the consequent part of its fuzzy control rules accordingly. Because of the different dynamics of the controlled processes, the performance of a self-learning fuzzy controller is highly contingent on its design, an issue that has not received sufficient attention. The issues related to the design of an SFLC for application to a petrochemical process are discussed, and its performance is compared with that of a PID controller and a self-tuning fuzzy logic controller.
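A minimal sketch of the reinforcement idea, assuming a crisp singleton consequent and a simple reward-proportional update; the SFLC's actual learning scheme, membership functions, and rule base are not shown.

```python
# One fuzzy rule's consequent (a crisp singleton for simplicity) and its
# degree of activation on the last control step.
consequent = {"IF error is POS THEN valve": 0.4}
activation = 0.7
learning_rate = 0.05

def reinforce(consequent, rule, reward):
    """Nudge the consequent toward actions that earned positive reward,
    in proportion to how strongly the rule fired."""
    consequent[rule] += learning_rate * activation * reward
    return consequent

# Desirable state reached: strengthen the action this rule recommended.
print(reinforce(consequent, "IF error is POS THEN valve", reward=+1.0))
```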
The Path to Advanced Practice Licensure for Clinical Nurse Specialists in Washington State.
Schoonover, Heather
The aim of this study was to provide a review of the history and process to obtaining advanced practice licensure for clinical nurse specialists in Washington State. Before 2016, Washington State licensed certified nurse practitioners, certified nurse midwives, and certified nurse anesthetists under the designation of an advanced registered nurse practitioner; however, the state did not recognize clinical nurse specialists as advanced practice nurses. The work to drive the rule change began in 2007. The Washington Affiliate of the National Association of Clinical Nurse Specialists used the Power Elite Theory to guide advocacy activities, building coalitions and support for the desired rule changes. On January 8, 2016, the Washington State Nursing Care Quality Assurance Commission voted to amend the state's advanced practice rules, including clinical nurse specialists in the designation of an advanced practice nurse. Since the rule revision, clinical nurse specialists in Washington State have been granted advanced registered nurse practitioner licenses. Driving changes in state regulatory rules requires diligent advocacy, partnership, and a deep understanding of the state's rule-making processes. To be successful in changing rules, clinical nurse specialists must build strong partnerships with key influencers and understand the steps in practice required to make the desired changes.
Empirical OPC rule inference for rapid RET application
NASA Astrophysics Data System (ADS)
Kulkarni, Anand P.
2006-10-01
A given technology node (45 nm, 65 nm) can be expected to process thousands of individual designs. Iterative methods applied at the node consume valuable days in determining the proper placement of OPC features, and in manufacturing and testing mask correspondence to wafer patterns in a trial-and-error fashion for each design. Repeating this fabrication process for each individual design is time-consuming and expensive. We present a novel technique which sidesteps the requirement to iterate through the model-based OPC analysis and pattern verification cycle on subsequent designs at the same node. Our approach relies on inferring rules from a correct pattern at the wafer surface as it relates to the OPC and pre-OPC pattern layout files. We begin with an offline phase where we obtain a "gold standard" design file that has been fab-tested at the node, with a prepared post-OPC layout file that corresponds to the intended on-wafer pattern. We then run an offline analysis to infer the rules to be used in this method. During the analysis, our method implicitly identifies contextual OPC strategies for optimal placement of RET features on any design at that node. Using these strategies, we can apply OPC to subsequent designs at the same node with accuracy comparable to the original design file but with significantly smaller expected runtimes. The technique promises to offer a rapid and accurate complement to existing RET application strategies.
77 FR 50907 - Airspace Designations; Incorporation by Reference
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-23
... FAA processed all proposed changes of the airspace listings in FAA Order 7400.9V in full text as... in full text as final rules in the Federal Register. This rule reflects the periodic integration of... changes of the airspace listings in FAA Order 7400.9W in full text as proposed rule documents in the...
78 FR 52847 - Airspace Designations; Incorporation by Reference
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-27
... FAA processed all proposed changes of the airspace listings in FAA Order 7400.9W in full text as... in full text as final rules in the Federal Register. This rule reflects the periodic integration of... changes of the airspace listings in FAA Order 7400.9X in full text as proposed rule documents in the...
Service without a Smile: Comparing the Consequences of Neutral and Positive Display Rules
ERIC Educational Resources Information Center
Trougakos, John P.; Jackson, Christine L.; Beal, Daniel J.
2011-01-01
We used an experimental design to examine the intrapersonal and interpersonal processes through which neutral display rules, compared to positive display rules, influence objective task performance of poll workers and ratings provided by survey respondents of the poll workers. Student participants (N = 140) were trained to adhere to 1 of the 2…
Rigorous ILT optimization for advanced patterning and design-process co-optimization
NASA Astrophysics Data System (ADS)
Selinidis, Kosta; Kuechler, Bernd; Cai, Howard; Braam, Kyle; Hoppe, Wolfgang; Domnenko, Vitaly; Poonawala, Amyn; Xiao, Guangming
2018-03-01
Despite the large difficulties involved in extending 193i multiple patterning and the slow ramp of EUV lithography to full manufacturing readiness, the pace of development for new technology node variations has been accelerating. Multiple new variations of new and existing technology nodes have been introduced for a range of device applications, each variation with at least a few new process integration methods, layout constructs, and/or design rules. This has led to a strong increase in the demand for predictive technology tools which can be used to quickly guide important patterning and design co-optimization decisions. In this paper, we introduce a novel hybrid predictive patterning method combining two patterning technologies which have each individually been widely used for process tuning, mask correction, and process-design co-optimization: rigorous lithography simulation and inverse lithography technology (ILT). Rigorous lithography simulation has been extensively used for process development/tuning, lithography tool user setup, photoresist hot-spot detection, photoresist-etch interaction analysis, lithography-TCAD interactions/sensitivities, source optimization, and basic lithography design rule exploration. ILT has been extensively used in a range of lithographic areas including logic hot-spot fixing, memory layout correction, dense memory cell optimization, assist feature (AF) optimization, source optimization, complex patterning design rules, and design-technology co-optimization (DTCO). The combined optimization capability of these two technologies will therefore have a wide range of useful applications. We investigate the benefits of the new functionality for a few of these advanced applications, including correction for photoresist top loss and resist scumming hotspots.
Design issues for a reinforcement-based self-learning fuzzy controller
NASA Technical Reports Server (NTRS)
Yen, John; Wang, Haojin; Dauherity, Walter
1993-01-01
Fuzzy logic controllers have some often-cited advantages over conventional techniques such as PID control: easy implementation, accommodation to natural language, the ability to cover a wider range of operating conditions, and others. One major obstacle that hinders their broader application is the lack of a systematic way to develop and modify their rules; as a result, the creation and modification of fuzzy rules often depends on trial and error or pure experimentation. One of the proposed approaches to address this issue is the self-learning fuzzy logic controller (SFLC), which uses reinforcement learning techniques to learn the desirability of states and to adjust the consequent part of fuzzy control rules accordingly. Due to the different dynamics of the controlled processes, the performance of a self-learning fuzzy controller is highly contingent on its design, an issue that has not received sufficient attention. The issues related to the design of an SFLC for application to a chemical process are discussed, and its performance is compared with that of a PID controller and a self-tuning fuzzy logic controller.
Designing Rules for Accounting Transaction Identification based on Indonesian NLP
NASA Astrophysics Data System (ADS)
Iswandi, I.; Suwardi, I. S.; Maulidevi, N. U.
2017-03-01
Accounting transactions are recorded on the basis of transaction evidence: invoices, receipts, letters of intent, electricity bills, telephone bills, etc. In this paper, we propose a design of rules to identify the entities found on a sales invoice. Several entities are identified in a sales invoice, namely: invoice date, company name, invoice number, product ID, product name, quantity, and total price. These entities are identified using the named entity recognition method. The entities generated by the rules are used as a basis for automating data input into the accounting system.
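A minimal sketch of rule-based entity identification on an invoice, using regular expressions with hypothetical patterns; the paper's rules target Indonesian-language invoices and cover more entity types.

```python
import re

# Hypothetical rules: entity name -> regular expression.
RULES = {
    "invoice_number": re.compile(r"\bINV[-/]?\d{4,}\b"),
    "invoice_date": re.compile(r"\b\d{2}/\d{2}/\d{4}\b"),
    "total_price": re.compile(r"\bTOTAL\s*:?\s*[\d.,]+\b", re.I),
}

def extract(text):
    """Apply every rule and collect the first match per entity."""
    found = {}
    for entity, pattern in RULES.items():
        match = pattern.search(text)
        if match:
            found[entity] = match.group(0)
    return found

print(extract("INV-20170301 dated 03/03/2017 Total: 1.250.000"))
```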
Design space exploration for early identification of yield limiting patterns
NASA Astrophysics Data System (ADS)
Li, Helen; Zou, Elain; Lee, Robben; Hong, Sid; Liu, Square; Wang, JinYan; Du, Chunshan; Zhang, Recco; Madkour, Kareem; Ali, Hussein; Hsu, Danny; Kabeel, Aliaa; ElManhawy, Wael; Kwan, Joe
2016-03-01
In order to resolve the causality dilemma of which comes first, accurate design rules or real designs, this paper presents a flow for exploring the layout design space to identify, early, problematic patterns that will negatively affect yield. A new random layout generation method called the Layout Schema Generator (LSG) is reported in this paper; this method generates realistic, design-like layouts without any design rule violations. Lithography simulation is then run on the generated layout to discover potentially problematic patterns (hotspots). These hotspot patterns are further explored by randomly inducing feature and context variations on the identified hotspots through a flow called the Hotspot Variation Flow (HSV). Simulation is then performed on this expanded set of layout clips to identify further problematic patterns. These patterns are then classified into forbidden patterns, which should be included in the design rule checker, and legal patterns, which need better handling in the RET recipes and processes.
ERIC Educational Resources Information Center
Frenette, Micheline
When trying to change their predictive rule for sinking and floating phenomena, students have great difficulty understanding density and are insensitive to empirical counter-examples designed to challenge their own rule. The purpose of this study is to examine the process whereby students from sixth and seventh grades relinquish their…
Conformance Testing: Measurement Decision Rules
NASA Technical Reports Server (NTRS)
Mimbs, Scott M.
2010-01-01
The goal of a Quality Management System (QMS) as specified in ISO 9001 and AS9100 is to provide assurance to the customer that end products meet specifications. Measuring devices, often called measuring and test equipment (MTE), are used to provide the evidence of product conformity to specified requirements. Unfortunately, processes that employ MTE can become a weak link in the overall QMS if proper attention is not given to the measurement process design, capability, and implementation. Documented "decision rules" establish the requirements to ensure measurement processes provide the measurement data that supports the needs of the QMS. Measurement data are used to make the decisions that impact all areas of technology. Whether measurements support research, design, production, or maintenance, ensuring the data supports the decision is crucial. Measurement data quality can be critical to the resulting consequences of measurement-based decisions. Historically, most industries required simplistic, one-size-fits-all decision rules for measurements. One-size-fits-all rules in some cases are not rigorous enough to provide adequate measurement results, while in other cases are overly conservative and too costly to implement. Ideally, decision rules should be rigorous enough to match the criticality of the parameter being measured, while being flexible enough to be cost effective. The goal of a decision rule is to ensure that measurement processes provide data with a sufficient level of quality to support the decisions being made - no more, no less. This paper discusses the basic concepts of providing measurement-based evidence that end products meet specifications. Although relevant to all measurement-based conformance tests, the target audience is the MTE end-user, which is anyone using MTE other than calibration service providers. Topics include measurement fundamentals, the associated decision risks, verifying conformance to specifications, and basic measurement decision rules.
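As one concrete illustration of a decision rule that is more rigorous than a bare accept/reject test, here is a hedged LaTeX sketch of guard-banded acceptance; the symbols and the choice of guard band are assumptions, not taken from the paper.

```latex
% Accept the unit under test only if the measured value x lies inside the
% specification limits reduced by a guard band g derived from the
% measurement uncertainty:
\[
  \text{accept} \iff L_{\text{spec}} + g \;\le\; x \;\le\; U_{\text{spec}} - g,
  \qquad g = k\,u_m ,
\]
% where u_m is the standard measurement uncertainty and the coverage
% factor k scales the rule's rigor to the criticality of the parameter.
```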
FPGA chip performance improvement with gate shrink through alternating PSM 90nm process
NASA Astrophysics Data System (ADS)
Yu, Chun-Chi; Shieh, Ming-Feng; Liu, Erick; Lin, Benjamin; Ho, Jonathan; Wu, Xin; Panaite, Petrisor; Chacko, Manoj; Zhang, Yunqiang; Lei, Wen-Kang
2005-11-01
In the post-physical-verification space called 'mask synthesis', a key component of design-for-manufacturing (DFM), double-exposure-based, dark-field, alternating PSM (Alt-PSM) is being increasingly applied at the 90nm node, in addition to other mature resolution enhancement techniques (RETs) such as optical proximity correction (OPC) and sub-resolution assist features (SRAF). Several high-performance IC manufacturers already use Alt-PSM technology in 65nm production. At 90nm, strong control over the lithography process is a critical component in meeting targeted yield goals. However, implementing Alt-PSM in production has been challenging due to several factors, such as phase conflict errors, mask manufacturing, and the increased production cost of needing two masks in the process. Implementation of Alt-PSM generally requires phase compliance rules and proper phase topology in the layout, and this has been successful for technology nodes with these rules implemented. However, this may not be true for a mature, production process technology, in this case 90 nm, especially in the foundry-fabless business model, where the foundry provides a standard set of design rules to its customers for a given process technology and not all foundry customers require Alt-PSM in their tapeout flow. With minimal design changes, design houses are usually motivated by higher product performance for existing designs. What follows is an in-depth review of the motivation to apply Alt-PSM on a production FPGA, the DFM challenges each partner faced, the effect on the tapeout flow, and how the design, manufacturing, and EDA teams worked together to resolve phase conflicts, tape out the chip, and finally verify the silicon results in production.
77 FR 37634 - Proposed Significant New Use Rule on Certain Chemical Substances
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-22
... Toxic Substances Control Act (TSCA) for chemical substances identified generically as complex strontium... Proposed Significant New Use Rule on Certain Chemical Substances AGENCY: Environmental Protection Agency... process any of the chemical substances for an activity that is designated as a significant new use by this...
NASA Astrophysics Data System (ADS)
Boning, Duane S.; Chung, James E.
1998-11-01
Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments, including the design of test structures and test masks to gather electrical or optical data; techniques for statistical decomposition and analysis of the data; and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of "dummy fill" or "metal fill" to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.
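A hedged LaTeX sketch of one plausible form of the statistical decomposition described above; the specific model terms are assumptions, not the paper's own equations.

```latex
% ILD thickness at die location (x, y) on wafer site w decomposed into
% wafer-level, die-level (pattern-dependent), and residual random terms:
\[
  z_{w}(x,y) \;=\; \mu \;+\; W_{w} \;+\; D(x,y) \;+\; \varepsilon_{w}(x,y),
\]
% so that, with independent components, the total variance splits as
\[
  \sigma^2_{\text{total}} \;=\; \sigma^2_{\text{wafer}}
    \;+\; \sigma^2_{\text{die}} \;+\; \sigma^2_{\text{random}} .
\]
```

Isolating the pattern-dependent term D(x, y) is what connects the metrology to design rules such as dummy-fill guidelines.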
37 CFR 1.152 - Design drawings.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Design drawings. 1.152... COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES National Processing Provisions Design Patents § 1.152 Design drawings. The design must be represented by a drawing that complies with the requirements of § 1...
37 CFR 1.152 - Design drawings.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 37 Patents, Trademarks, and Copyrights 1 2011-07-01 2011-07-01 false Design drawings. 1.152... COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES National Processing Provisions Design Patents § 1.152 Design drawings. The design must be represented by a drawing that complies with the requirements of § 1...
Grouin, Cyril; Zweigenbaum, Pierre
2013-01-01
In this paper, we present a comparison of two approaches to automatically de-identify medical records written in French: a rule-based system and a machine-learning based system using a conditional random fields (CRF) formalism. Both systems have been designed to process nine identifiers in a corpus of medical records in cardiology. We performed two evaluations: first, on 62 documents in cardiology, and on 10 documents in foetopathology - produced by optical character recognition (OCR) - to evaluate the robustness of our systems. We achieved a 0.843 (rule-based) and 0.883 (machine-learning) exact match overall F-measure in cardiology. While the rule-based system allowed us to achieve good results on nominative (first and last names) and numerical data (dates, phone numbers, and zip codes), the machine-learning approach performed best on more complex categories (postal addresses, hospital names, medical devices, and towns). On the foetopathology corpus, although our systems have not been designed for this corpus and despite OCR character recognition errors, we obtained promising results: a 0.681 (rule-based) and 0.638 (machine-learning) exact-match overall F-measure. This demonstrates that existing tools can be applied to process new documents of lower quality.
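For the rule-based side, a minimal Python sketch covering two of the simpler identifier categories (dates and phone numbers), with regular expressions loosely adapted to French formats; the patterns and replacement scheme are illustrative assumptions, since the actual systems handle nine identifier types.

```python
import re

# Hypothetical surrogate-replacement rules for French medical records.
RULES = [
    ("DATE", re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b")),
    ("PHONE", re.compile(r"\b0\d(?:[ .]?\d{2}){4}\b")),  # e.g. 01 23 45 67 89
]

def deidentify(text):
    """Replace every match of every rule with its category tag."""
    for tag, pattern in RULES:
        text = pattern.sub(f"<{tag}>", text)
    return text

print(deidentify("Hospitalise le 12/03/2011, rappeler le 01 23 45 67 89."))
```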
ERIC Educational Resources Information Center
Yildirim, Nilay
2013-01-01
This cross-case study examines the relationships between game design attributes and collaborative problem solving process in the context of multi-player video games. The following game design attributes: sensory stimuli elements, level of challenge, and presentation of game goals and rules were examined to determine their influence on game…
Pre-PDK block-level PPAC assessment of technology options for sub-7nm high-performance logic
NASA Astrophysics Data System (ADS)
Liebmann, L.; Northrop, G.; Facchini, M.; Riviere Cazaux, L.; Baum, Z.; Nakamoto, N.; Sun, K.; Chanemougame, D.; Han, G.; Gerousis, V.
2018-03-01
This paper describes a rigorous yet flexible standard cell place-and-route flow that is used to quantify block-level power, performance, and area trade-offs driven by two unique cell architectures and their associated design rule differences. The two architectures examined in this paper differ primarily in their use of different power-distribution-networks to achieve the desired circuit performance for high-performance logic designs. The paper shows the importance of incorporating block-level routability experiments in the early phases of design-technology co-optimization by reviewing a series of routing trials that explore different aspects of the technology definition. Since the electrical and physical parameters leading to critical process assumptions and design rules are unique to specific integration schemes and design objectives, it is understood that the goal of this work is not to promote one cell-architecture over another, but rather to convey the importance of exploring critical trade-offs long before the process details of the technology node are finalized to a point where a process design kit can be published.
37 CFR 1.152 - Design drawings.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 37 Patents, Trademarks, and Copyrights 1 2013-07-01 2013-07-01 false Design drawings. 1.152 Section 1.152 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES National Processing Provisions Design Patents § 1.152 Design drawings. The design must be...
Model-based Systems Engineering: Creation and Implementation of Model Validation Rules for MOS 2.0
NASA Technical Reports Server (NTRS)
Schmidt, Conrad K.
2013-01-01
Model-based Systems Engineering (MBSE) is an emerging modeling application that is used to enhance the system development process. MBSE allows for the centralization of project and system information that would otherwise be stored in extraneous locations, yielding better communication, expedited document generation and increased knowledge capture. Based on MBSE concepts and the employment of the Systems Modeling Language (SysML), extremely large and complex systems can be modeled from conceptual design through all system lifecycles. The Operations Revitalization Initiative (OpsRev) seeks to leverage MBSE to modernize the aging Advanced Multi-Mission Operations Systems (AMMOS) into the Mission Operations System 2.0 (MOS 2.0). The MOS 2.0 will be delivered in a series of conceptual and design models and documents built using the modeling tool MagicDraw. To ensure model completeness and cohesiveness, it is imperative that the MOS 2.0 models adhere to the specifications, patterns and profiles of the Mission Service Architecture Framework, thus leading to the use of validation rules. This paper outlines the process by which validation rules are identified, designed, implemented and tested. Ultimately, these rules provide the ability to maintain model correctness and synchronization in a simple, quick and effective manner, thus allowing the continuation of project and system progress.
An expert system design to diagnose cancer by using a new method reduced rule base.
Başçiftçi, Fatih; Avuçlu, Emre
2018-04-01
A Medical Expert System (MES) was developed which uses a Reduced Rule Base to diagnose cancer risk according to the symptoms present in an individual. A total of 13 symptoms were used. With the new MES, the reduced rules are checked instead of all possibilities (2^13 = 8192 different possibilities occur); by checking reduced rules, results are found more quickly. The method of two-level simplification of Boolean functions was used to obtain the Reduced Rule Base. Thanks to the developed application, with its dynamic number of inputs and outputs on different platforms, anyone can easily test for cancer. More accurate results were obtained by considering all the possibilities related to cancer. Thirteen different risk factors were determined for identifying the type of cancer. The truth table produced in our study has 13 inputs and 4 outputs. The Boolean function minimization method is used to obtain fewer cases by simplifying logical functions. Cancer is diagnosed quickly thanks to the evaluation of the 4 simplified output functions. Diagnosis made with the 4 output values obtained using the Reduced Rule Base was found to be quicker than diagnosis made by screening all 2^13 = 8192 possibilities. With the improved MES, more probabilities were added to the process and more accurate diagnostic results were obtained. As a result of the simplification process, a 100% diagnosis speed gain was obtained in breast and renal cancer diagnosis, and a 99% gain in cervical and lung cancer diagnosis. With Boolean function minimization, a smaller number of rules is evaluated instead of a large number of rules. Reducing the number of rules allows the designed system to work more efficiently, saves time, and facilitates transferring the rules to the designed expert systems. Interfaces were developed on different software platforms to enable users to test the accuracy of the application. Anyone is able to check for cancer using the determinative risk factors, and is thereby more likely to beat the cancer through early diagnosis. Copyright © 2018 Elsevier B.V. All rights reserved.
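A minimal sketch of the kind of two-level simplification involved, using SymPy's SOPform on a toy truth table with 3 symptoms (2^3 = 8 possibilities) rather than the paper's 13; the minterms are hypothetical.

```python
from sympy import symbols
from sympy.logic import SOPform

# Three hypothetical symptoms; the paper's system uses 13 (2**13 = 8192 rows).
s1, s2, s3 = symbols("s1 s2 s3")

# Hypothetical truth-table rows (minterms) where one cancer type is indicated.
minterms = [[1, 1, 0], [1, 1, 1], [1, 0, 1]]

# Two-level simplification collapses the three rows into a reduced rule base.
reduced = SOPform([s1, s2, s3], minterms)
print(reduced)  # (s1 & s2) | (s1 & s3): two product terms instead of three
```

Only the reduced expression needs to be evaluated per patient, which is the source of the reported speed gain.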
An Integrated Children Disease Prediction Tool within a Special Social Network.
Apostolova Trpkovska, Marika; Yildirim Yayilgan, Sule; Besimi, Adrian
2016-01-01
This paper proposes a social network with an integrated children's disease prediction system developed using the specially designed Children General Disease Ontology (CGDO). This ontology consists of children's diseases and their relationships with symptoms, together with Semantic Web Rule Language (SWRL) rules specially designed for predicting diseases. The prediction process starts with the user entering data about the observed signs and symptoms, which are then mapped to the CGDO ontology. Once the data are mapped, the prediction results are presented. The prediction phase executes the rules, which extract the predicted disease details based on the SWRL rule specified. The motivation behind the development of this system is to spread knowledge about children's diseases and their symptoms in a very simple way using the specialized social networking website www.emama.mk.
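A minimal sketch of the rule shape involved, with an SWRL-style rule shown as a comment and a hypothetical Python equivalent; the disease and symptom names are illustrative, not drawn from the CGDO.

```python
# SWRL-style rule (illustrative):
#   hasSymptom(?p, Fever) ^ hasSymptom(?p, Rash) ^ hasSymptom(?p, Cough)
#     -> suspectedDisease(?p, Measles)
RULES = [({"fever", "rash", "cough"}, "measles"),
         ({"fever", "stiff_neck"}, "meningitis")]

def predict(symptoms):
    """Return every disease whose full required symptom set is present."""
    return [disease for required, disease in RULES if required <= symptoms]

print(predict({"fever", "rash", "cough", "runny_nose"}))  # ['measles']
```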
NASA Technical Reports Server (NTRS)
Worm, Jeffrey A.; Culas, Donald E.
1991-01-01
Computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. This paper examines the concepts of statistical analysis, the Dempster-Shafer theory, rough set theory, and fuzzy set theory to solve this problem. The fundamentals of these theories are combined to provide a possible optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules, a corresponding measure of how much we believe each rule is constructed. From this, the idea of how much a fuzzy diagnosis is definable in terms of its fuzzy attributes is studied.
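A hedged LaTeX sketch of the rough-set construction behind certain versus possible rules; the notation is standard rough-set theory rather than the paper's own.

```latex
% For a target concept X and indiscernibility classes [x]_R, the lower and
% upper approximations are
\[
  \underline{R}X = \{\, x : [x]_R \subseteq X \,\}, \qquad
  \overline{R}X = \{\, x : [x]_R \cap X \neq \emptyset \,\}.
\]
% Certain rules are extracted from the lower approximation, where
% membership is guaranteed; possible rules come from the boundary region
% \(\overline{R}X \setminus \underline{R}X\), where membership is only
% plausible, and each rule carries a corresponding belief measure.
```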
Considerations for the Systematic Analysis and Use of Single-Case Research
ERIC Educational Resources Information Center
Horner, Robert H.; Swaminathan, Hariharan; Sugai, George; Smolkowski, Keith
2012-01-01
Single-case research designs provide a rigorous research methodology for documenting experimental control. If single-case methods are to gain wider application, however, a need exists to define more clearly (a) the logic of single-case designs, (b) the process and decision rules for visual analysis, and (c) an accepted process for integrating…
NASA Astrophysics Data System (ADS)
Wagh, Aditi
Two strands of work motivate the three studies in this dissertation. Evolutionary change can be viewed as a computational complex system in which a small set of rules operating at the individual level result in different population level outcomes under different conditions. Extensive research has documented students' difficulties with learning about evolutionary change (Rosengren et al., 2012), particularly in terms of levels slippage (Wilensky & Resnick, 1999). Second, though building and using computational models is becoming increasingly common in K-12 science education, we know little about how these two modalities compare. This dissertation adopts agent-based modeling as a representational system to compare these modalities in the conceptual context of micro-evolutionary processes. Drawing on interviews, Study 1 examines middle-school students' productive ways of reasoning about micro-evolutionary processes to find that the specific framing of traits plays a key role in whether slippage explanations are cued. Study 2, which was conducted in 2 schools with about 150 students, forms the crux of the dissertation. It compares learning processes and outcomes when students build their own models or explore a pre-built model. Analysis of Camtasia videos of student pairs reveals that builders' and explorers' ways of accessing rules, and sense-making of observed trends are of a different character. Builders notice rules through available blocks-based primitives, often bypassing their enactment while explorers attend to rules primarily through the enactment. Moreover, builders' sense-making of observed trends is more rule-driven while explorers' is more enactment-driven. Pre and posttests reveal that builders manifest a greater facility with accessing rules, providing explanations manifesting targeted assembly. Explorers use rules to construct explanations manifesting non-targeted assembly. Interviews reveal varying degrees of shifts away from slippage in both modalities, with students who built models not incorporating slippage explanations in responses. Study 3 compares these modalities with a control using traditional activities. Pre and posttests reveal that the two modalities manifested greater facility with accessing and assembling rules than the control. The dissertation offers implications for the design of learning environments for evolutionary change, design of the two modalities based on their strengths and weaknesses, and teacher training for the same.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-14
... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-64276; File No. SR-Phlx-2011-13] Self... Agreement, By-Laws, Rules, Advices and Regulations April 8, 2011. I. Introduction On February 16, 2011..., By-Laws, Rules, Advices and Regulations to alter its governance process and to make other non...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-20
... Filter September 16, 2013. I. Introduction On July 22, 2013, BOX Options Exchange LLC (the ``Exchange... included in the HSVF. A. Complex Order Filter BOX's Complex Order Filter provides a process designed to....\\4\\ BOX proposes to revise its rules to specifically provide that the Complex Order Filter operates...
ERIC Educational Resources Information Center
Tom, Alan R.
1988-01-01
This article proposes rules of thumb about the teacher education design process. The rules are grounded in the attempts at reforming teacher education at Washington University in the early 1970s, at a time during which a year-long, field-based alternative to the traditional elementary program was operated. (IAH)
A knowledge authoring tool for clinical decision support.
Dunsmuir, Dustin; Daniels, Jeremy; Brouse, Christopher; Ford, Simon; Ansermino, J Mark
2008-06-01
Anesthesiologists in the operating room are unable to constantly monitor all data generated by physiological monitors. They are further distracted by clinical and educational tasks. An expert system would ideally provide assistance to the anesthesiologist in this data-rich environment. Clinical monitoring expert systems have not been widely adopted, as traditional methods of knowledge encoding require both expert medical and programming skills, making knowledge acquisition difficult. A software application was developed for use as a knowledge authoring tool for physiological monitoring. This application enables clinicians to create knowledge rules without the need for a knowledge engineer or programmer. These rules are designed to provide clinical diagnosis, explanations and treatment advice for optimal patient care to the clinician in real time. By intelligently combining data from physiological monitors and demographic data sources, the expert system can use these rules to assist in monitoring the patient. The knowledge authoring process is simplified by limiting connective relationships between rules. The application is designed to allow open collaboration between communities of clinicians to build a library of rules for clinical use. This design provides clinicians with a system for parameter surveillance and expert advice with a transparent pathway of reasoning. A usability evaluation demonstrated that anesthesiologists can rapidly develop useful rules for use in a predefined clinical scenario.
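As an illustration of the rule structure such a tool produces, the sketch below shows clinician-style If-Then rules evaluated over a snapshot of monitor data. It is a minimal stand-in, not the paper's system: the parameter names, thresholds, and advice strings are hypothetical, and real rules would also carry the explanation and treatment-advice fields described above.

```python
# Minimal sketch of clinician-authored If-Then monitoring rules.
# Thresholds and parameter names are hypothetical illustrations.

def make_rule(name, condition, advice):
    return {"name": name, "condition": condition, "advice": advice}

rules = [
    make_rule(
        "possible_hypovolemia",
        lambda d: d["heart_rate"] > 100 and d["systolic_bp"] < 90,
        "Tachycardia with hypotension: consider hypovolemia.",
    ),
    make_rule(
        "desaturation",
        lambda d: d["spo2"] < 92,
        "SpO2 below 92%: check airway and oxygen delivery.",
    ),
]

def evaluate(rules, data):
    """Return advice for every rule whose condition holds on the data."""
    return [r["advice"] for r in rules if r["condition"](data)]

sample = {"heart_rate": 118, "systolic_bp": 84, "spo2": 96}
print(evaluate(sample and rules or rules, sample) if False else evaluate(rules, sample))
```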
A step-by-step introduction to rule-based design of synthetic genetic constructs using GenoCAD.
Wilson, Mandy L; Hertzberg, Russell; Adam, Laura; Peccoud, Jean
2011-01-01
GenoCAD is an open source web-based system that provides a streamlined, rule-driven process for designing genetic sequences. GenoCAD provides a graphical interface that allows users to design sequences consistent with formalized design strategies specific to a domain, organization, or project. Design strategies include limited sets of user-defined parts and rules indicating how these parts are to be combined in genetic constructs. In addition to reducing design time to minutes, GenoCAD improves the quality and reliability of the finished sequence by ensuring that the designs follow established rules of sequence construction. GenoCAD.org is a publicly available instance of GenoCAD that can be found at www.genocad.org. The source code and latest build are available from SourceForge to allow advanced users to install and customize GenoCAD for their unique needs. This chapter focuses primarily on how the GenoCAD tools can be used to organize genetic parts into customized personal libraries, then how these libraries can be used to design sequences. In addition, GenoCAD's parts management system and search capabilities are described in detail. Instructions are provided for installing a local instance of GenoCAD on a server. Some of the future enhancements of this rapidly evolving suite of applications are briefly described. Copyright © 2011 Elsevier Inc. All rights reserved.
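The rule-driven design idea can be pictured as a grammar over part categories: a construct is valid only if each part may follow its predecessor. The sketch below is a toy adjacency check in that spirit; the categories and transitions are illustrative assumptions, not GenoCAD's actual grammar formalism (which is richer and based on context-free grammars).

```python
# Toy sketch of rule-driven construct validation in the spirit of
# GenoCAD: parts belong to categories, and rules constrain which
# category may follow which. Categories and rules are illustrative.

allowed_next = {
    "promoter": {"rbs"},
    "rbs": {"cds"},
    "cds": {"terminator"},
    "terminator": set(),
}

def valid_construct(categories):
    """Check that each adjacent pair of part categories is allowed."""
    for cur, nxt in zip(categories, categories[1:]):
        if nxt not in allowed_next.get(cur, set()):
            return False
    return categories[0] == "promoter" and categories[-1] == "terminator"

print(valid_construct(["promoter", "rbs", "cds", "terminator"]))  # True
print(valid_construct(["promoter", "cds", "terminator"]))         # False
```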
Three CLIPS-based expert systems for solving engineering problems
NASA Technical Reports Server (NTRS)
Parkinson, W. J.; Luger, G. F.; Bretz, R. E.
1990-01-01
We have written three expert systems, using the CLIPS PC-based expert system shell. These three expert systems are rule based and are relatively small, with the largest containing slightly less than 200 rules. The first expert system is an expert assistant that was written to help users of the ASPEN computer code choose the proper thermodynamic package to use with their particular vapor-liquid equilibrium problem. The second expert system was designed to help petroleum engineers choose the proper enhanced oil recovery method to be used with a given reservoir. The effectiveness of each technique is highly dependent upon the reservoir conditions. The third expert system is a combination consultant and control system. This system was designed specifically for silicon carbide whisker growth. Silicon carbide whiskers are an extremely strong product used to make ceramic and metal composites. The manufacture of whiskers is a very complicated process which, to date, has defied a good mathematical model. The process was run by experts who had gained their expertise by trial and error. A system of rules was devised by these experts both for procedure setup and for process control. In this paper we discuss the three problem areas of the design, development and evaluation of the CLIPS-based programs.
A Priori Knowledge and Heuristic Reasoning in Architectural Design.
ERIC Educational Resources Information Center
Rowe, Peter G.
1982-01-01
It is proposed that the various classes of a priori knowledge incorporated in heuristic reasoning processes exert a strong influence over architectural design activity. Some design problems require exercise of some provisional set of rules, inference, or plausible strategy which requires heuristic reasoning. A case study illustrates this concept.…
Enhancements to the Design Manager's Aide for Intelligent Decomposition (DeMAID)
NASA Technical Reports Server (NTRS)
Rogers, James L.; Barthelemy, Jean-Francois M.
1992-01-01
This paper discusses the addition of two new enhancements to the program Design Manager's Aide for Intelligent Decomposition (DeMAID). DeMAID is a knowledge-based tool used to aid a design manager in understanding the interactions among the tasks of a complex design problem. This is done by ordering the tasks to minimize feedback, determining the participating subsystems, and displaying them in an easily understood format. The two new enhancements include (1) rules for ordering a complex assembly process and (2) rules for determining which analysis tasks must be re-executed to compute the output of one task based on a change in input to that or another task.
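The ordering objective behind the first enhancement, arranging tasks so that few tasks depend on the output of later tasks, can be illustrated with a tiny search. DeMAID itself is knowledge-based; the brute-force minimization below, over a hypothetical dependency table, is only a sketch of the objective it pursues.

```python
# Hedged sketch of the task-ordering idea behind DeMAID: count
# "feedbacks" (a task consuming output of a later task) and search for
# an order that minimizes them. Dependency data is hypothetical.
from itertools import permutations

# deps[t] = set of tasks whose output task t consumes
deps = {"A": set(), "B": {"A"}, "C": {"B", "D"}, "D": {"A"}}

def feedback_count(order):
    pos = {t: i for i, t in enumerate(order)}
    return sum(1 for t in order for s in deps[t] if pos[s] > pos[t])

best = min(permutations(deps), key=feedback_count)
print(best, feedback_count(best))  # e.g. ('A', 'B', 'D', 'C') 0
```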
Rules of performance in the nursing home: A grounded theory of nurse-CNA communication.
Madden, Connie; Clayton, Margaret; Canary, Heather E; Towsley, Gail; Cloyes, Kristin; Lund, Dale
This study offers an initial theoretical understanding of nurse-CNA communication processes from the perspectives of nurses and CNAs who are providing direct care to residents in nursing homes. A grounded theory approach provided an understanding of the nurse-CNA communication process within the complexities of the nursing home setting. Four themes (maintaining information flow, following procedure, fostering collegiality, and showing respect) describe the "rules of performance" that intertwine in nuanced relationships to guide nurse-CNA communication processes. Understanding how these rules of performance guide nurse-CNA communication processes, and how they are positively and negatively influenced, suggests that nurse-CNA communication during direct care of nursing home residents could be improved through policy and education that is specifically designed to be relevant and applicable to direct care providers in the nursing home environment. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Silva, Paulo
2018-05-01
In many societies, informality has been a relevant part of the construction of the urban fabric. This holds along a city's history and in recent urbanization processes. In the past, informality was at the origin of much of urban planning, and urban planning soon adopted the correction of malfunctions in cities as one of its main missions. The need for formalization, that is, the control of informal processes, thus became one of the main reasons for the discipline's emergence. As an answer to informal individual solutions, urban planning responded with standardized rules and the urge to create spaces fitting into pre-established rules instead of rules fitting into spaces. Urban planning as a discipline has gradually changed its path. The contrast between urbanization promoted under formal urban planning and informal urbanization is only one sign of the mismatch between urban planning actions and informal urbanization dynamics. Considering this tension between formal and informal dynamics, in some cases planning rules and planning processes continue ignoring informal dynamics; in other cases, planning rules are designed to integrate informality "without losing its face" through "planning games" [1]; and there is a third and less explored way, in which planning systems interact with informality and from that interaction learn how to improve planning rules (we consider it a process of enrichment) while promoting an upgrade of informal interventions [2]. This latter win-win situation, in which both informal and formal systems benefit from their interaction, is still rare: most of the time either only one side benefits or neither benefits from the interaction. Nevertheless, there are signs that from this interaction co-dependent adaptation might occur with positive outcomes for the urban system, in which co-evolutionary dynamics can be traced. We propose to look at the way building rules have been designed in Europe in a context considered successful in dealing with informality: that of Portugal. The country experienced a wave of informality associated with illegal urbanization since the 1960s in its main urban areas. The process of interaction between informal and formal urban systems proved to be a success in statistical terms. Slum clearance reduced the existence of informal occupations to almost zero. Informal settlements involving land tenure have been dealt with in the last two decades with considerable positive impact on the urban fabric. Based on this, in this paper we evaluate how informal and formal systems are impacting each other and changing, over time, the shape of building and planning rules. For this, we look at the planning tools created to formalize informal settlements in the Lisbon Metropolitan Area over the last forty years to see how urban and building rules were adapted to respond to the specific needs of informal settlements; how this adaptation moved from temporary and exceptional rules to permanent ones; and, finally, how these new rules were able to "contaminate" the general planning and building codes. We hope these findings will contribute to a "healthier" relation between formal and informal urban systems, not ignoring each other, not controlling each other, but instead learning from each other. By achieving this, planning systems become more responsive; on the other hand, informal occupations can be upgraded, rather than destroyed, with the contribution of the planning systems.
49 CFR 551.45 - What is the purpose of this subpart?
Code of Federal Regulations, 2010 CFR
2010-10-01
... service of administrative or judicial notices or processes may be made. ... TRAFFIC SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION PROCEDURAL RULES Service of Process on Foreign Manufacturers and Importers Designation of An Agent for Service of Process § 551.45 What is the purpose of this...
NASA Astrophysics Data System (ADS)
Pries-Heje, Jan; Baskerville, Richard L.
This paper elaborates a design science approach for management planning anchored to the concept of a management design theory. Unlike the notions of design theories arising from information systems, management design theories can appear as a system of technological rules, much as a system of hypotheses or propositions can embody scientific theories. The paper illustrates this form of management design theory with three grounded cases. These grounded cases include a software process improvement study, a user involvement study, and an organizational change study. Collectively these studies demonstrate how design theories founded on technological rules not only can improve the design of information systems, but also have great practical value for improving the framing of strategic organizational design decisions about such systems. Each case is either grounded in an empirical sense, that is to say, in actual practice, or grounded in practices described extensively in the practical literature. Such design theories will help managers more easily approach complex, strategic decisions.
Design of freeze-drying processes for pharmaceuticals: practical advice.
Tang, Xiaolin; Pikal, Michael J
2004-02-01
Design of freeze-drying processes is often approached with a "trial and error" experimental plan or, worse yet, the protocol used in the first laboratory run is adopted without further attempts at optimization. Consequently, commercial freeze-drying processes are often neither robust nor efficient. It is our thesis that design of an "optimized" freeze-drying process is not particularly difficult for most products, as long as some simple rules based on well-accepted scientific principles are followed. It is the purpose of this review to discuss the scientific foundations of the freeze-drying process design and then to consolidate these principles into a set of guidelines for rational process design and optimization. General advice is given concerning common stability issues with proteins, but unusual and difficult stability issues are beyond the scope of this review. Control of ice nucleation and crystallization during the freezing step is discussed, and the impact of freezing on the rest of the process and final product quality is reviewed. Representative freezing protocols are presented. The significance of the collapse temperature and the thermal transition, denoted Tg', are discussed, and procedures for the selection of the "target product temperature" for primary drying are presented. Furthermore, guidelines are given for selection of the optimal shelf temperature and chamber pressure settings required to achieve the target product temperature without thermal and/or mass transfer overload of the freeze dryer. Finally, guidelines and "rules" for optimization of secondary drying and representative secondary drying protocols are presented.
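A minimal sketch of the target-product-temperature rule of thumb the review discusses: keep the product a safety margin below the lower of the collapse temperature Tc and the glass transition Tg' during primary drying. The 2 degree default margin below is a commonly quoted assumption, not a universal prescription.

```python
# Hedged sketch of the "target product temperature" selection rule:
# stay a safety margin below the critical temperature (min of Tc, Tg').
# The 2 C margin is an assumed, commonly quoted default.

def target_product_temperature(tc_celsius, tg_prime_celsius, margin=2.0):
    """Maximum allowable product temperature during primary drying."""
    critical = min(tc_celsius, tg_prime_celsius)
    return critical - margin

print(target_product_temperature(tc_celsius=-32.0, tg_prime_celsius=-35.0))
# -> -37.0 C: drive primary drying below the critical temperature
```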
Reliability based design of the primary structure of oil tankers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Casella, G.; Dogliani, M.; Guedes Soares, C.
1996-12-31
The present paper describes the reliability analysis carried out for two oil tanker ships having comparable dimensions but different designs. The scope of the analysis was to derive indications on the value of the reliability index obtained for existing, typical and well designed oil tankers, as well as to apply the tentative rule checking formulation developed within the CEC-funded SHIPREL Project. The checking formula was adopted to redesign the midships section of one of the considered ships, upgrading her in order to meet the target failure probability considered in the rule development process. The resulting structure, in view of an upgrading of the steel grade in the central part of the deck, led to a convenient reliability level. The results of the analysis clearly showed that a large scatter presently exists in the design safety levels of ships, even when the Classification Societies' unified requirements are satisfied. A reliability based approach for the calibration of the rules for the global strength of ships is therefore proposed, in order to assist designers and Classification Societies in the process of producing ships which are more optimized with respect to ensured safety levels. Based on the work reported in the paper, the feasibility and usefulness of a reliability based approach in the development of ship longitudinal strength requirements has been demonstrated.
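For readers unfamiliar with the reliability index, a hedged sketch of the first-order computation underlying such analyses: for a linear limit state g = R - S with independent normal resistance R and load S, beta = (muR - muS) / sqrt(sigR^2 + sigS^2), and the failure probability is Phi(-beta). The numbers below are illustrative, not the SHIPREL values.

```python
# First-order reliability sketch for a linear limit state g = R - S
# with independent normal R (resistance) and S (load). Illustrative data.
from math import erf, sqrt

def reliability_index(mu_r, sig_r, mu_s, sig_s):
    return (mu_r - mu_s) / sqrt(sig_r**2 + sig_s**2)

def failure_probability(beta):
    # standard normal CDF evaluated at -beta
    return 0.5 * (1.0 - erf(beta / sqrt(2.0)))

beta = reliability_index(mu_r=5200.0, sig_r=400.0, mu_s=3800.0, sig_s=500.0)
print(round(beta, 2), failure_probability(beta))  # beta ~ 2.19
```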
Knowledge-based control of an adaptive interface
NASA Technical Reports Server (NTRS)
Lachman, Roy
1989-01-01
The analysis, development strategy, and preliminary design for an intelligent, adaptive interface is reported. The design philosophy couples knowledge-based system technology with standard human factors approaches to interface development for computer workstations. An expert system has been designed to drive the interface for application software. The intelligent interface will be linked to application packages, one at a time, that are planned for multiple-application workstations aboard Space Station Freedom. Current requirements call for most Space Station activities to be conducted at the workstation consoles. One set of activities will consist of standard data management services (DMS). DMS software includes text processing, spreadsheets, data base management, etc. Text processing was selected for the first intelligent interface prototype because text-processing software can be developed initially as fully functional but limited with a small set of commands. The program's complexity then can be increased incrementally. A model of the operator's behavior and three types of instructions to the underlying application software are included in the rule base. A conventional expert-system inference engine searches the data base for antecedents to rules and sends the consequents of fired rules as commands to the underlying software. Plans for putting the expert system on top of a second application, a database management system, will be carried out following behavioral research on the first application. The intelligent interface design is suitable for use with ground-based workstations now common in government, industrial, and educational organizations.
Life insurance risk assessment using a fuzzy logic expert system
NASA Technical Reports Server (NTRS)
Carreno, Luis A.; Steel, Roy A.
1992-01-01
In this paper, we present a knowledge based system that combines fuzzy processing with rule-based processing to form an improved decision aid for evaluating risk for life insurance. This application illustrates the use of FuzzyCLIPS to build a knowledge based decision support system possessing fuzzy components to improve user interactions and KBS performance. The results employing FuzzyCLIPS are compared with the results obtained from the solution of the problem using traditional numerical equations. The design of the fuzzy solution consists of a CLIPS rule-based system for some factors combined with fuzzy logic rules for others. This paper describes the problem, proposes a solution, presents the results, and provides a sample output of the software product.
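A minimal sketch of the crisp-plus-fuzzy combination the paper describes, in plain Python rather than CLIPS/FuzzyCLIPS: a crisp rule contributes a fixed increment while fuzzy factors contribute by degree of membership. The factor names, membership ramps, and weights below are hypothetical.

```python
# Hedged sketch of mixing crisp rules with fuzzy factors for a risk
# score. Memberships, weights and thresholds are illustrative only.

def ramp(x, lo, hi):
    """Membership rising linearly from 0 at lo to 1 at hi."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def risk_score(age, bmi, smoker):
    crisp = 0.4 if smoker else 0.0   # crisp rule-based component
    older = ramp(age, 40, 70)        # fuzzy "older applicant"
    heavy = ramp(bmi, 25, 35)        # fuzzy "high BMI"
    return crisp + 0.3 * older + 0.3 * heavy

print(round(risk_score(age=48, bmi=28, smoker=True), 3))  # -> 0.57
```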
Design of Composite Structures Using Knowledge-Based and Case Based Reasoning
NASA Technical Reports Server (NTRS)
Lambright, Jonathan Paul
1996-01-01
A method of using knowledge based and case based reasoning to assist designers during conceptual design tasks of composite structures was proposed. The cooperative use of heuristics, procedural knowledge, and previous similar design cases suggests a potential reduction in design cycle time and ultimately product lead time. The hypothesis of this work is that the design process of composite structures can be improved by using Case-Based Reasoning (CBR) and Knowledge-Based (KB) reasoning in the early design stages. The technique of using knowledge-based and case-based reasoning facilitates the gathering of disparate information into one location that is easily and readily available. The method suggests that the inclusion of downstream life-cycle issues into the conceptual design phase reduces the potential for defective and sub-optimal composite structures. Three industry experts were interviewed extensively. The experts provided design rules, previous design cases, and test problems. A Knowledge Based Reasoning system was developed using the CLIPS (C Language Integrated Production System) environment and a Case Based Reasoning System was developed using the Design Memory Utility For Sharing Experiences (MUSE) environment. A Design Characteristic State (DCS) was used to document the design specifications, constraints, and problem areas using attribute-value pair relationships. The DCS provided consistent design information between the knowledge base and case base. Results indicated that the use of knowledge based and case based reasoning provided a robust design environment for composite structures. The knowledge base provided design guidance from well defined rules and procedural knowledge. The case base provided suggestions on design and manufacturing techniques based on previous similar designs and warnings of potential problems and pitfalls. The case base complemented the knowledge base and extended the problem solving capability beyond the existence of limited well defined rules. The findings indicated that the technique is most effective when used as a design aid and not as a tool to totally automate the composites design process. Other areas of application and implications for future research are discussed.
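The case-retrieval half of such a system can be sketched simply: cases are attribute-value pairs (as in the DCS) and the closest prior design is retrieved by attribute overlap. The attributes, cases, and advice strings below are hypothetical illustrations, not the thesis data.

```python
# Hedged sketch of attribute-value case retrieval in the CBR spirit.
# Cases, attributes and advice are hypothetical.

cases = [
    {"loading": "axial", "layup": "quasi-isotropic", "cure": "autoclave",
     "advice": "watch for warpage on thin unsymmetric laminates"},
    {"loading": "bending", "layup": "unidirectional", "cure": "oven",
     "advice": "add +/-45 plies near free edges to limit delamination"},
]

def similarity(query, case):
    """Fraction of query attribute-value pairs matched by the case."""
    hits = sum(1 for k, v in query.items() if case.get(k) == v)
    return hits / len(query)

query = {"loading": "bending", "layup": "unidirectional", "cure": "autoclave"}
best = max(cases, key=lambda c: similarity(query, c))
print(best["advice"])  # nearest prior case's manufacturing warning
```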
Dehghani Soufi, Mahsa; Samad-Soltani, Taha; Shams Vahdati, Samad; Rezaei-Hachesu, Peyman
2018-06-01
Fast and accurate patient triage for the response process is a critical first step in emergency situations. This process is often performed using a paper-based mode, which intensifies workload and difficulty, wastes time, and is at risk of human errors. This study aims to design and evaluate a decision support system (DSS) to determine the triage level. A combination of the Rule-Based Reasoning (RBR) and Fuzzy Logic Classifier (FLC) approaches was used to predict the triage level of patients according to the triage specialist's opinions and Emergency Severity Index (ESI) guidelines. RBR was applied for modeling the first to fourth decision points of the ESI algorithm. The data relating to vital signs were used as input variables and modeled using fuzzy logic. Narrative knowledge was converted to If-Then rules using XML. The extracted rules were then used to create the rule-based engine and predict the triage levels. Fourteen RBR and 27 fuzzy rules were extracted and used in the rule-based engine. The performance of the system was evaluated using three methods with real triage data. The accuracy of the clinical decision support system (CDSS; on the test data) was 99.44%. The evaluation of the error rate revealed that, when using the traditional method, 13.4% of the patients were mis-triaged, which is statistically significant. The completeness of the documentation also improved from 76.72% to 98.5%. The designed system was effective in determining the triage level of patients, and it proved helpful for nurses as they made decisions and generated nursing diagnoses based on triage guidelines. The hybrid approach can reduce triage misdiagnosis in a highly accurate manner and improve the triage outcomes. Copyright © 2018 Elsevier B.V. All rights reserved.
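The ESI decision points the rule base encodes can be sketched as a cascade. The version below is a simplified illustration with assumed danger-zone thresholds, not the validated ESI criteria nor the system's actual 14 RBR and 27 fuzzy rules.

```python
# Hedged sketch of an ESI-style decision-point cascade. Conditions and
# thresholds are simplified illustrations of the published algorithm.

def triage_level(p):
    if p["life_threatening"]:            # decision point A
        return 1
    if p["high_risk_or_severe_pain"]:    # decision point B
        return 2
    resources = p["expected_resources"]  # decision point C
    if resources >= 2:
        # decision point D: danger-zone vitals can upgrade to level 2
        if p["heart_rate"] > 100 or p["spo2"] < 92:
            return 2
        return 3
    return 4 if resources == 1 else 5

patient = {"life_threatening": False, "high_risk_or_severe_pain": False,
           "expected_resources": 2, "heart_rate": 112, "spo2": 97}
print(triage_level(patient))  # -> 2 via danger-zone vitals
```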
Measuring uncertainty by extracting fuzzy rules using rough sets
NASA Technical Reports Server (NTRS)
Worm, Jeffrey A.
1991-01-01
Despite the advancements in the computer industry in the past 30 years, there is still one major deficiency: computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. The methods of statistical analysis, the Dempster-Shafer theory, rough set theory, and fuzzy set theory are examined as approaches to this problem. The fundamentals of these theories are combined to approach an optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules, a corresponding measure of belief in each rule is constructed. From this, the degree to which a fuzzy diagnosis is definable in terms of a set of fuzzy attributes is studied.
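The certain/possible split maps directly onto rough-set lower and upper approximations: blocks of attribute-indiscernible cases lying entirely inside a diagnosis class yield certain rules, while blocks merely overlapping it yield possible rules. The toy table below illustrates this; the data is hypothetical.

```python
# Rough-set sketch: group cases by attribute tuple, then read off
# certain rules (lower approximation) and possible rules (upper
# approximation) for one decision class. Toy data only.
from collections import defaultdict

# (attribute tuple, diagnosis) observations -- hypothetical
table = [(("high", "yes"), "flu"), (("high", "yes"), "flu"),
         (("high", "no"), "flu"), (("high", "no"), "cold"),
         (("low", "no"), "cold")]

blocks = defaultdict(set)
for attrs, decision in table:
    blocks[attrs].add(decision)

certain = [a for a, ds in blocks.items() if ds == {"flu"}]
possible = [a for a, ds in blocks.items() if "flu" in ds]
print("certain rules for flu:", certain)    # [('high', 'yes')]
print("possible rules for flu:", possible)  # adds ('high', 'no')
```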
37 CFR 1.155 - Expedited examination of design applications.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 37 Patents, Trademarks, and Copyrights 1 2013-07-01 2013-07-01 false Expedited examination of design applications. 1.155 Section 1.155 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES National Processing...
37 CFR 1.155 - Expedited examination of design applications.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 37 Patents, Trademarks, and Copyrights 1 2011-07-01 2011-07-01 false Expedited examination of design applications. 1.155 Section 1.155 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES National Processing...
37 CFR 1.155 - Expedited examination of design applications.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 37 Patents, Trademarks, and Copyrights 1 2014-07-01 2014-07-01 false Expedited examination of design applications. 1.155 Section 1.155 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES National Processing...
37 CFR 1.155 - Expedited examination of design applications.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 37 Patents, Trademarks, and Copyrights 1 2012-07-01 2012-07-01 false Expedited examination of design applications. 1.155 Section 1.155 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES National Processing...
37 CFR 1.155 - Expedited examination of design applications.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Expedited examination of design applications. 1.155 Section 1.155 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES National Processing...
NASA Astrophysics Data System (ADS)
Li, Yutong; Wang, Yuxin; Duffy, Alex H. B.
2014-11-01
Computer-based conceptual design for routine design has made great strides, yet non-routine design has not been given due attention and is still poorly automated. Considering that the function-behavior-structure (FBS) model is widely used for modeling the conceptual design process, a computer-based creativity-enhanced conceptual design model (CECD) for non-routine design of mechanical systems is presented. In the model, the leaf functions in the FBS model are decomposed into and represented with fine-grain basic operation actions (BOA), and the corresponding BOA set in the function domain is then constructed. Choosing building blocks from the database and expressing their multiple functions with BOAs, the BOA set in the structure domain is formed. Through rule-based dynamic partition of the BOA set in the function domain, many variants of regenerated functional schemes are generated. To enhance the capability to introduce new design variables into the conceptual design process and to dig out more innovative physical structure schemes, an indirect function-structure matching strategy based on reconstructing combined structure schemes is adopted. By adjusting the tightness of the partition rules and the granularity of the divided BOA subsets, and by making full use of the main function and secondary functions of each basic structure in the process of reconstructing the physical structures, new design variables and variants are introduced into the physical structure scheme reconstruction process, and a great number of simpler physical structure schemes that accomplish the overall function organically are derived. The creativity-enhanced conceptual design model presented has a dominant capability in introducing new design variables in the function domain and digging out simpler physical structures to accomplish the overall function; therefore, it can be utilized to solve non-routine conceptual design problems.
37 CFR 1.154 - Arrangement of application elements in a design application.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 37 Patents, Trademarks, and Copyrights 1 2013-07-01 2013-07-01 false Arrangement of application elements in a design application. 1.154 Section 1.154 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES National Processing Provisions Design Patents § 1.154...
Checking-up of optical graduated rules by laser interferometry
NASA Astrophysics Data System (ADS)
Miron, Nicolae P.; Sporea, Dan G.
1996-05-01
The main aspects related to the operating principle, design, and implementation of high-productivity equipment for checking-up the graduation accuracy of optical graduated rules used as a length reference in optical measuring instruments for precision machine tools are presented. The graduation error checking-up is done with a Michelson interferometer as a length transducer. The instrument operation is managed by a computer, which controls the equipment, data acquisition, and processing. The evaluation is performed for rule lengths from 100 to 3000 mm, with a checking-up error less than 2 micrometers/m. The checking-up time is about 15 min for a 1000-mm rule, with averaging over four measurements.
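The measurement principle reduces to fringe counting: each counted fringe corresponds to a half-wavelength of optical path change, so d = N * lambda / 2. The sketch below assumes a He-Ne source and ignores the air-refractive-index compensation a real instrument applies.

```python
# Hedged sketch of Michelson fringe-counting length measurement:
# displacement d = N * lambda / 2. He-Ne wavelength is assumed, and the
# air-index correction applied by real equipment is omitted.

WAVELENGTH_NM = 632.8  # typical He-Ne laser line (assumption)

def displacement_mm(fringe_count):
    return fringe_count * (WAVELENGTH_NM / 2) * 1e-6  # nm -> mm

def graduation_error_um(nominal_mm, fringe_count):
    return (displacement_mm(fringe_count) - nominal_mm) * 1000.0

# e.g. a 100 mm graduation interval measured as 316,061 fringes
print(round(graduation_error_um(100.0, 316_061), 2))  # ~1.7 um
```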
NASA Astrophysics Data System (ADS)
Bonnema, E. C.; Cunningham, E. K.; Rumel, J. D.
2014-01-01
The Department of Energy requires its subcontractors to meet 10 CFR 851 Appendix A Part 4 for all new pressure vessels and pressure piping. The stainless steel pressure vessel boundaries surrounding SCRF cavities fall under this requirement. Methods for meeting this requirement include design and fabrication of the pressure vessels to meet the requirements of the ASME Boiler & Pressure Vessel Code Section VIII Division 1 or Division 2. Design considerations include determining whether the configuration of the SCRF cavity can be accommodated under the rules of Division 1 or must be analyzed under Division 2 Part 4 Design by Rule Requirements or Part 5 Design by Analysis Requirements. Regardless of the Division or Part choice, designers will find the rules of the ASME Code require thicker pressure boundary members, larger welds, and additional non-destructive testing and quality assurance requirements. These challenges must be met and overcome by the fabricator through the development of robust, detailed, and repeatable manufacturing processes. In this paper we discuss the considerations for stainless steel pressure vessels that must meet the ASME Code and illustrate the discussion with examples from direct experience fabricating such vessels.
NASA Technical Reports Server (NTRS)
Follett, William W.; Rajagopal, Raj
2001-01-01
The focus of the AA MDO team is to reduce product development cost through the capture and automation of best design and analysis practices and through increasing the availability of low-cost, high-fidelity analysis. Implementation of robust designs reduces costs associated with the Test-Fail-Fix cycle. RD is currently focusing on several technologies to improve the design process, including optimization and robust design, expert and rule-based systems, and collaborative technologies.
Service without a smile: comparing the consequences of neutral and positive display rules.
Trougakos, John P; Jackson, Christine L; Beal, Daniel J
2011-03-01
We used an experimental design to examine the intrapersonal and interpersonal processes through which neutral display rules, compared to positive display rules, influence objective task performance of poll workers and ratings provided by survey respondents of the poll workers. Student participants (N = 140) were trained to adhere to 1 of the 2 display rule conditions while delivering opinion surveys to potential patrons of an organization during a 40-min period. Results showed that, compared to positive display rules, neutral display rules resulted in less task persistence and greater avoidance behavior. These effects were mediated through a greater use of expression suppression. In addition, neutral display rules resulted in less positive respondent mood, which accounted for lower ratings of service quality and of overall favorability attitudes toward the sponsoring organization. The importance and ubiquity of neutral display rules are discussed, given the potential for positive and negative consequences at work. PsycINFO Database Record (c) 2011 APA, all rights reserved.
A study of applications scribe frame data verifications using design rule check
NASA Astrophysics Data System (ADS)
Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki
2013-06-01
In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables for scanner alignment, wafer inspection and customer-specified marks. We check at the end that the scribe frame design conforms to the specifications of the alignment and inspection marks. Recently, in COT (customer owned tooling) business or new technology development, there has been no effective verification method for the scribe frame data, and verification takes a lot of time. Therefore, we tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which is used in device verification. We show the scheme of scribe frame data verification using DRC that we applied. First, verification rules are created based on the specifications of the scanner, inspection and other tools, and a mark library is also created for pattern matching. Next, DRC verification is performed on the scribe frame data, where the DRC verification includes pattern matching using the mark library. As a result, our experiments demonstrated that, by use of pattern matching and DRC verification, our new method can yield speed improvements of more than 12 percent compared to conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.
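The kind of geometric check such a DRC run performs can be sketched with axis-aligned rectangles: minimum width and edge-to-edge spacing. Production DRC decks are far richer; the shapes and limits below are illustrative.

```python
# Hedged sketch of elementary DRC checks on axis-aligned rectangles
# (x1, y1, x2, y2). Dimensions and limits are illustrative.
from math import hypot

def min_width_ok(rect, min_w):
    x1, y1, x2, y2 = rect
    return min(x2 - x1, y2 - y1) >= min_w

def spacing_ok(r1, r2, min_s):
    """Edge-to-edge distance between two non-overlapping rectangles."""
    dx = max(r1[0] - r2[2], r2[0] - r1[2], 0.0)
    dy = max(r1[1] - r2[3], r2[1] - r1[3], 0.0)
    return hypot(dx, dy) >= min_s

mark_a = (0.0, 0.0, 4.0, 4.0)   # hypothetical mark shapes (um)
mark_b = (6.0, 0.0, 10.0, 4.0)
print(min_width_ok(mark_a, 2.0), spacing_ok(mark_a, mark_b, 1.5))
```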
CHAM: weak signals detection through a new multivariate algorithm for process control
NASA Astrophysics Data System (ADS)
Bergeret, François; Soual, Carole; Le Gratiet, B.
2016-10-01
Derivative technologies based on core CMOS processes are significantly aggressive in terms of design rules and process control requirements. The process control plan is derived from Process Assumption (PA) calculations, which result in design rules based on known process variability capabilities, taking into account enough margin to be safe not only for yield but especially for reliability. Even though process assumptions are calculated with a 4 sigma margin on known process capability, efficient and competitive designs are challenging the process, especially for derivative technologies in the 40 and 28nm nodes. For wafer fab process control, PAs are translated into monovariate (layer1 CD, layer2 CD, layer2-to-layer1 overlay, layer3 CD, etc.) control charts with appropriate specifications and control limits, which all together secure the silicon. This has so far worked fine, but such a system is not really sensitive to weak signals coming from interactions of multiple key parameters (high layer2 CD combined with high layer3 CD, as an example). CHAM is a software package using an advanced statistical algorithm specifically designed to detect small signals, especially when there are many parameters to control and when the parameters can interact to create yield issues. In this presentation we will first present the CHAM algorithm, then a case study on critical dimensions with its results, and we will conclude on future work. This partnership between Ippon and STM is part of E450LMDAP, a European project dedicated to metrology and lithography development for future technology nodes, especially 10nm.
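Since the CHAM algorithm itself is not spelled out in the abstract, the sketch below shows a generic multivariate statistic with the same intent: a Hotelling T^2 distance flags a point whose individual parameters look unremarkable but whose combination is improbable given the learned correlation. The CD data is simulated and hypothetical.

```python
# Hedged sketch of multivariate weak-signal detection (a stand-in for
# CHAM, whose algorithm is proprietary): Hotelling's T^2 against an
# in-control history of two correlated CDs. Data is simulated.
import numpy as np

rng = np.random.default_rng(0)
history = rng.multivariate_normal([40.0, 32.0],
                                  [[1.0, -0.6], [-0.6, 1.0]], size=500)
mean = history.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(history, rowvar=False))

def t_squared(x):
    d = x - mean
    return float(d @ cov_inv @ d)

# each CD is only ~1.3 sigma high, yet jointly unusual given the
# negative correlation learned from history
print(round(t_squared(np.array([41.3, 33.3])), 1))
```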
Sheehan, Barbara; Nigrovic, Lise E; Dayan, Peter S; Kuppermann, Nathan; Ballard, Dustin W; Alessandrini, Evaline; Bajaj, Lalit; Goldberg, Howard; Hoffman, Jeffrey; Offerman, Steven R; Mark, Dustin G; Swietlik, Marguerite; Tham, Eric; Tzimenatos, Leah; Vinson, David R; Jones, Grant S; Bakken, Suzanne
2013-10-01
Integration of clinical decision support services (CDSS) into electronic health records (EHRs) may be integral to widespread dissemination and use of clinical prediction rules in the emergency department (ED). However, the best way to design such services to maximize their usefulness in such a complex setting is poorly understood. We conducted a multi-site cross-sectional qualitative study whose aim was to describe the sociotechnical environment in the ED to inform the design of a CDSS intervention to implement the Pediatric Emergency Care Applied Research Network (PECARN) clinical prediction rules for children with minor blunt head trauma. Informed by a sociotechnical model consisting of eight dimensions, we conducted focus groups, individual interviews and workflow observations in 11 EDs, of which 5 were located in academic medical centers and 6 were in community hospitals. A total of 126 ED clinicians, information technology specialists, and administrators participated. We clustered data into 19 categories of sociotechnical factors through a process of thematic analysis and subsequently organized the categories into a sociotechnical matrix consisting of three high-level sociotechnical dimensions (workflow and communication, organizational factors, human factors) and three themes (interdisciplinary assessment processes, clinical practices related to prediction rules, EHR as a decision support tool). Design challenges that emerged from the analysis included the need to use structured data fields to support data capture and re-use while maintaining efficient care processes, supporting interdisciplinary communication, and facilitating family-clinician interaction for decision-making. Copyright © 2013 Elsevier Inc. All rights reserved.
Colucci, E; Clark, A; Lang, C E; Pomeroy, V M
2017-12-01
Dose-optimisation studies as precursors to clinical trials are rare in stroke rehabilitation. To develop a rule-based, dose-finding design for stroke rehabilitation research. 3+3 rule-based, dose-finding study. Dose escalation/de-escalation was undertaken according to preset rules and a mathematical sequence (modified Fibonacci sequence). The target starting daily dose was 50 repetitions for the first cohort. Adherence was recorded by an electronic counter. At the end of the 2-week training period, the adherence record indicated dose tolerability (adherence to target dose) and the outcome measure indicated dose benefit (10% increase in motor function). The preset increment/decrease and checking rules were then applied to set the dose for the subsequent cohort. The process was repeated until preset stopping rules were met. Participants had a mean age of 68 (range 48 to 81) years, and were a mean of 70 (range 9 to 289) months post stroke with moderate upper limb paresis. A custom-built model of exercise-based training to enhance ability to open the paretic hand. Repetitions per minute of extension/flexion of paretic digits against resistance. Usability of the preset rules and whether the maximally tolerated dose was identifiable. Five cohorts of three participants were involved. Discernibly different doses were set for each subsequent cohort (i.e. 50, 100, 167, 251 and 209 repetitions/day). The maximally tolerated dose for the model training task was 209 repetitions/day. This dose-finding design is a feasible method for use in stroke rehabilitation research. Copyright © 2017 Chartered Society of Physiotherapy. All rights reserved.
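The escalation arithmetic can be sketched with modified Fibonacci-style multipliers. The factors and the de-escalation fraction below are assumptions chosen so the toy run lands near the reported cohort doses (50, 100, 167, 251, 209 repetitions/day), and the tolerability/benefit decision is reduced to a boolean.

```python
# Hedged sketch of rule-based dose setting with modified Fibonacci-style
# increments; multipliers and de-escalation fraction are assumptions.

multipliers = [2.0, 1.67, 1.5]   # assumed modified-Fibonacci increments
DE_ESCALATE = 5.0 / 6.0          # assumed de-escalation fraction

def run(start_dose, cohort_outcomes):
    """Walk preset escalation/de-escalation rules over cohort outcomes."""
    doses, dose, step = [start_dose], float(start_dose), 0
    for tolerated_and_beneficial in cohort_outcomes:
        if tolerated_and_beneficial:
            dose *= multipliers[min(step, len(multipliers) - 1)]
            step += 1
        else:
            dose *= DE_ESCALATE
        doses.append(int(dose + 0.5))
    return doses

print(run(50, [True, True, True, False]))  # -> [50, 100, 167, 251, 209]
```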
NASA Technical Reports Server (NTRS)
Wolf, M.
1981-01-01
The effect of solar cell metallization pattern design on solar cell performance and the costs and performance effects of different metallization processes are discussed. Definitive design rules for the front metallization pattern for large area solar cells are presented. Chemical and physical deposition processes for metallization are described and compared. An economic evaluation of the 6 principal metallization options is presented. Instructions for preparing Format A cost data for solar cell manufacturing processes from UPPC forms for input into the SAMIC computer program are presented.
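The trade-off that front-grid design rules encode can be sketched numerically: adding fingers reduces the resistive loss in the emitter between fingers but increases shading. The sketch below uses the standard fractional-loss expression rho_sheet * J_mp * s^2 / (12 * V_mp); all device values are illustrative assumptions, not the report's data.

```python
# Hedged sketch of the shading vs. resistive-loss trade-off behind
# front-grid design rules. All device numbers are assumptions.

def fractional_power_loss(n_fingers, cell_w=10.0, finger_w=0.02,
                          sheet_rho=50.0, j_mp=0.030, v_mp=0.5):
    """Shading plus emitter resistive loss fraction for a square cell.
    Widths in cm, sheet_rho in ohm/square, j_mp in A/cm^2, v_mp in V."""
    shading = n_fingers * finger_w / cell_w          # gridline coverage
    spacing = cell_w / n_fingers                     # finger pitch
    resistive = sheet_rho * j_mp * spacing**2 / (12.0 * v_mp)
    return shading + resistive

best = min(range(5, 60), key=fractional_power_loss)
print(best, round(fractional_power_loss(best), 4))  # loss-minimizing count
```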
Algorithmic Mechanism Design of Evolutionary Computation.
Pei, Yan
2015-01-01
We consider algorithmic design, enhancement, and improvement of evolutionary computation as a mechanism design problem. All individuals or several groups of individuals can be considered as self-interested agents. The individuals in evolutionary computation can manipulate parameter settings and operations by satisfying their own preferences, which are defined by an evolutionary computation algorithm designer, rather than by following a fixed algorithm rule. Evolutionary computation algorithm designers or self-adaptive methods should construct proper rules and mechanisms for all agents (individuals) to conduct their evolution behaviour correctly in order to definitely achieve the desired and preset objective(s). As a case study, we propose a formal framework on parameter setting, strategy selection, and algorithmic design of evolutionary computation by considering the Nash strategy equilibrium of a mechanism design in the search process. The evaluation results present the efficiency of the framework. This primary principle can be implemented in any evolutionary computation algorithm that needs to consider strategy selection issues in its optimization process. The final objective of our work is to solve evolutionary computation design as an algorithmic mechanism design problem and establish its fundamental aspect by taking this perspective. This paper is the first step towards achieving this objective by implementing a strategy equilibrium solution (such as Nash equilibrium) in evolutionary computation algorithm.
Model based high NA anamorphic EUV RET
NASA Astrophysics Data System (ADS)
Jiang, Fan; Wiaux, Vincent; Fenger, Germain; Clifford, Chris; Liubich, Vlad; Hendrickx, Eric
2018-03-01
With the announcement of the extension of the Extreme Ultraviolet (EUV) roadmap to a high-NA lithography tool that utilizes an anamorphic optics design, an investigation of design tradeoffs unique to the imaging of an anamorphic lithography tool is presented. An anamorphic optical proximity correction (OPC) solution has been developed that fully models the EUV near-field electromagnetic effects and the anamorphic imaging using the Domain Decomposition Method (DDM). Imec clips representative of the N3 logic node were used to demonstrate the OPC solutions on critical layers that will benefit from the increased contrast at high NA using anamorphic imaging. However, unlike the isomorphic case, from the wafer perspective OPC needs to treat x and y differently. In the paper, we show a design trade-off unique to anamorphic EUV: using a mask rule of 48nm (mask scale), approaching the current state of the art, limitations are observed in the correction that can be applied to the mask. The metal pattern has a pitch of 24nm and a CD of 12nm. During OPC, the correction of the vertically oriented metal lines is limited by the mask rule at 12nm 1X. The horizontally oriented lines do not suffer from this mask rule limitation, as the correction is allowed to go to 6nm 1X. For this example, the mask rules will need to be more aggressive to allow complete correction, or design rules and wafer processes (wafer rotation) would need to be created that utilize the orientation that can image more aggressive features. When considering via or block level correction, aggressive polygon corner-to-corner designs can be handled with various solutions, including applying a 45 degree chop. Multiple solutions are discussed with the metrics of edge placement error (EPE) and Process Variation Bands (PVBands), together with all the mask constraints. Note that in anamorphic OPC, the 45 degree chop is maintained at the mask level to meet mask manufacturing constraints, but results in a skewed-angle edge in the wafer-level correction. In this paper, we used both contact (via/block) patterns and metal patterns for OPC practice. By comparing the EPE of horizontal and vertical patterns with a fixed mask rule check (MRC), and the PVBand, we focus on the challenges and the solutions of OPC with an anamorphic high-NA lens.
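The orientation asymmetry follows from simple arithmetic, sketched below: a single mask-scale rule divided by the per-axis demagnification (4x in one direction, 8x in the other for high-NA anamorphic optics) gives different wafer-scale correction limits, matching the 12nm versus 6nm 1X limits noted above. The axis assignment is an assumption for illustration.

```python
# Hedged sketch of the anamorphic correction-limit arithmetic: one mask
# rule, two per-axis demagnifications. Axis labels are assumptions.

MASK_RULE_NM = 48.0           # minimum mask dimension, mask scale
MAG = {"x": 4.0, "y": 8.0}    # anamorphic demagnification per axis

def min_wafer_feature(axis):
    """Smallest wafer-scale dimension the mask rule still permits."""
    return MASK_RULE_NM / MAG[axis]

print(min_wafer_feature("x"), min_wafer_feature("y"))  # 12.0 vs 6.0 nm
```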
The Evolvement of Automobile Steering System Based on TRIZ
NASA Astrophysics Data System (ADS)
Zhao, Xinjun; Zhang, Shuang
Products and techniques pass through a process of birth, growth, maturity, and death, and exit the stage, much like a biological evolution process. The development of products and techniques conforms to certain evolvement rules. If people know and hold these rules, they can design new kinds of products and forecast the development trends of existing ones. Thereby, enterprises can grasp the future technical directions of their products and pursue product and technique innovation. Below, based on TRIZ theory, the mechanism evolvement, function evolvement and appearance evolvement of the automobile steering system are analyzed, and some new ideas about the future automobile steering system are put forward.
Developmental changes in automatic rule-learning mechanisms across early childhood.
Mueller, Jutta L; Friederici, Angela D; Männel, Claudia
2018-06-27
Infants' ability to learn complex linguistic regularities from early on has been revealed by electrophysiological studies indicating that 3-month-olds, but not adults, can automatically detect non-adjacent dependencies between syllables. While different ERP responses in adults and infants suggest that both linguistic rule learning and its link to basic auditory processing undergo developmental changes, systematic investigations of the developmental trajectories are scarce. In the present study, we assessed 2- and 4-year-olds' ERP indicators of pitch discrimination and linguistic rule learning in a syllable-based oddball design. To test for the relation between auditory discrimination and rule learning, ERP responses to pitch changes were used as a predictor for potential linguistic rule-learning effects. Results revealed that 2-year-olds, but not 4-year-olds, showed ERP markers of rule learning. Although 2-year-olds' rule learning was not dependent on differences in pitch perception, 4-year-old children demonstrated a dependency, such that those children who showed more pronounced responses to pitch changes still showed an effect of rule learning. These results narrow down the developmental decline of the ability for automatic linguistic rule learning to the age between 2 and 4 years and, moreover, point towards a strong modification of this change by auditory processes. At an age when the ability for automatic linguistic rule learning phases out, rule learning can still be observed in children with enhanced auditory responses. The observed interrelations are plausible causes for age-of-acquisition effects and inter-individual differences in language learning. © 2018 John Wiley & Sons Ltd.
Design of fuzzy systems using neurofuzzy networks.
Figueiredo, M; Gomide, F
1999-01-01
This paper introduces a systematic approach for fuzzy system design based on a class of neural fuzzy networks built upon a general neuron model. The network structure is such that it encodes the knowledge learned in the form of if-then fuzzy rules and processes data following fuzzy reasoning principles. The technique provides a mechanism to obtain rules covering the whole input/output space as well as the membership functions (including their shapes) for each input variable. Such characteristics are of utmost importance in fuzzy systems design and application. In addition, after learning, it is very simple to extract fuzzy rules in the linguistic form. The network has universal approximation capability, a property very useful in, e.g., modeling and control applications. Here we focus on function approximation problems as a vehicle to illustrate its usefulness and to evaluate its performance. Comparisons with alternative approaches are also included. Both, nonnoisy and noisy data have been studied and considered in the computational experiments. The neural fuzzy network developed here and, consequently, the underlying approach, has shown to provide good results from the accuracy, complexity, and system design points of view.
Building distributed rule-based systems using the AI Bus
NASA Technical Reports Server (NTRS)
Schultz, Roger D.; Stobie, Iain C.
1990-01-01
The AI Bus software architecture was designed to support the construction of large-scale, production-quality applications in areas of high technology flux, running heterogeneous distributed environments, utilizing a mix of knowledge-based and conventional components. These goals led to its current development as a layered, object-oriented library for cooperative systems. This paper describes the concepts and design of the AI Bus and its implementation status as a library of reusable and customizable objects, structured by layers from operating system interfaces up to high-level knowledge-based agents. Each agent is a semi-autonomous process with specialized expertise, and consists of a number of knowledge sources (a knowledge base and inference engine). Inter-agent communication mechanisms are based on blackboards and Actors-style acquaintances. As a conservative first implementation, we used C++ on top of Unix, and wrapped an embedded Clips with methods for the knowledge source class. This involved designing standard protocols for communication and functions which use these protocols in rules. Embedding several CLIPS objects within a single process was an unexpected problem because of global variables, whose solution involved constructing and recompiling a C++ version of CLIPS. We are currently working on a more radical approach to incorporating CLIPS, by separating out its pattern matcher, rule and fact representations and other components as true object oriented modules.
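A minimal sketch of the blackboard coordination style described, with toy agents standing in for the CLIPS-backed knowledge sources; the fact keys and agent logic are hypothetical.

```python
# Hedged sketch of blackboard-style coordination between agents, as a
# toy stand-in for the AI Bus knowledge sources. Keys are hypothetical.

class Blackboard:
    def __init__(self):
        self.facts, self.agents = {}, []

    def post(self, key, value):
        self.facts[key] = value
        for agent in self.agents:      # notify interested agents
            agent.notice(self, key)

class Agent:
    def __init__(self, name, trigger, action):
        self.name, self.trigger, self.action = name, trigger, action

    def notice(self, board, key):
        if key == self.trigger:
            self.action(board)

board = Blackboard()
board.agents.append(Agent("planner", "sensor_reading",
                          lambda b: b.post("plan", "recalibrate")))
board.post("sensor_reading", 42)
print(board.facts)  # planner reacted by posting a 'plan' fact
```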
31 CFR 203.9 - Scope of the subpart.
Code of Federal Regulations, 2013 CFR
2013-07-01
... rules that financial institutions must follow when they process electronic Federal tax payment transactions. A financial institution is not required to be designated as a TT&L depositary in order to process..., DEPARTMENT OF THE TREASURY FINANCIAL MANAGEMENT SERVICE PAYMENT OF FEDERAL TAXES AND THE TREASURY TAX AND...
31 CFR 203.9 - Scope of the subpart.
Code of Federal Regulations, 2012 CFR
2012-07-01
... rules that financial institutions must follow when they process electronic Federal tax payment transactions. A financial institution is not required to be designated as a TT&L depositary in order to process..., DEPARTMENT OF THE TREASURY FINANCIAL MANAGEMENT SERVICE PAYMENT OF FEDERAL TAXES AND THE TREASURY TAX AND...
31 CFR 203.9 - Scope of the subpart.
Code of Federal Regulations, 2011 CFR
2011-07-01
... rules that financial institutions must follow when they process electronic Federal tax payment transactions. A financial institution is not required to be designated as a TT&L depositary in order to process..., DEPARTMENT OF THE TREASURY FINANCIAL MANAGEMENT SERVICE PAYMENT OF FEDERAL TAXES AND THE TREASURY TAX AND...
31 CFR 203.9 - Scope of the subpart.
Code of Federal Regulations, 2014 CFR
2014-07-01
... rules that financial institutions must follow when they process electronic Federal tax payment transactions. A financial institution is not required to be designated as a TT&L depositary in order to process electronic Federal tax payments. In addition, a financial institution does not become a TT&L depositary by...
Checking Flight Rules with TraceContract: Application of a Scala DSL for Trace Analysis
NASA Technical Reports Server (NTRS)
Barringer, Howard; Havelund, Klaus; Morris, Robert A.
2011-01-01
Typically during the design and development of a NASA space mission, rules and constraints are identified to help reduce reasons for failure during operations. These flight rules are usually captured in a set of indexed tables, containing rule descriptions, rationales for the rules, and other information. Flight rules can be part of manual operations procedures carried out by humans. However, they can also be automated, and either implemented as on-board monitors, or as ground based monitors that are part of a ground data system. In the case of automated flight rules, one considerable expense to be addressed for any mission is the extensive process by which system engineers express flight rules in prose, software developers translate these requirements into code, and then both experts verify that the resulting application is correct. This paper explores the potential benefits of using an internal Scala DSL for general trace analysis, named TraceContract, to write executable specifications of flight rules. TraceContract can generally be applied to the analysis of, for example, log files, or to monitoring executing systems online.
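In Python rather than the Scala DSL, a sketch of what one automated flight-rule monitor over an event trace might look like; the rule (every command must be acknowledged before it is reissued) and the event encoding are hypothetical, not an actual mission rule or the TraceContract API.

```python
# Hedged sketch of a trace monitor for one hypothetical flight rule:
# a command must receive an ack before the same command id is reissued.

def check_trace(events):
    pending = set()
    violations = []
    for kind, cmd_id in events:
        if kind == "command":
            if cmd_id in pending:
                violations.append(f"command {cmd_id} reissued before ack")
            pending.add(cmd_id)
        elif kind == "ack":
            pending.discard(cmd_id)
    violations.extend(f"command {c} never acknowledged"
                      for c in sorted(pending))
    return violations

trace = [("command", 1), ("ack", 1), ("command", 2), ("command", 2)]
print(check_trace(trace))  # two violations for command 2
```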
Code of Federal Regulations, 2010 CFR
2010-07-01
... COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES National Processing Provisions Plant Patents § 1.166... quantity and at a time in its stage of growth as may be designated, for study and inspection. Such...
Compensatory Mitigation for Losses of Aquatic Resources; Final Rule
These regulations are designed to improve the effectiveness of compensatory mitigation to replace lost aquatic resource functions and area, and increase the efficiency and predictability of the mitigation project review process.
Complete denture tooth arrangement technology driven by a reconfigurable rule.
Dai, Ning; Yu, Xiaoling; Fan, Qilei; Yuan, Fulai; Liu, Lele; Sun, Yuchun
2018-01-01
The conventional technique for the fabrication of complete dentures is complex, with a long fabrication process and difficult-to-control restoration quality. In recent years, digital complete denture design has become a research focus. Digital complete denture tooth arrangement is a challenging issue that is difficult to implement efficiently under the constraints of complex tooth arrangement rules and the patient's individualized functional aesthetics. The present study proposes a complete denture automatic tooth arrangement method driven by a reconfigurable rule; it uses four typical operators, namely a position operator, a scaling operator, a posture operator, and a contact operator, to establish the constraint mapping association between the teeth and the constraint set of the individual patient. By reorganizing the sequence of different constraint operators, this method can flexibly implement different clinical tooth arrangement rules. When combined with a virtual occlusion algorithm based on progressive iterative Laplacian deformation, the proposed method can achieve automatic and individualized tooth arrangement. Finally, the experimental results verify that the proposed method is flexible and efficient.
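The "reconfigurable rule" idea, a tooth-arrangement rule expressed as a recomposable sequence of constraint operators, can be sketched as follows; the operator bodies are placeholders and do not reproduce the paper's geometric computations.

```python
# Sketch of "process reorganization": a clinical rule is a configurable
# sequence of constraint operators. Operator bodies are placeholders only.
def position_op(state): state["placed"] = True; return state
def scaling_op(state):  state["scaled"] = True; return state
def posture_op(state):  state["oriented"] = True; return state
def contact_op(state):  state["in_contact"] = True; return state

def arrange(tooth_state, rule):
    """Apply the operators named by a clinical rule, in the rule's order."""
    ops = {"position": position_op, "scaling": scaling_op,
           "posture": posture_op, "contact": contact_op}
    for name in rule:
        tooth_state = ops[name](tooth_state)
    return tooth_state

# Two different clinical rules reuse the same operators in different orders.
print(arrange({}, ["position", "scaling", "posture", "contact"]))
print(arrange({}, ["position", "posture", "contact"]))
```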
NASA Astrophysics Data System (ADS)
Perry, Dan; Nakamoto, Mark; Verghese, Nishath; Hurat, Philippe; Rouse, Rich
2007-03-01
Model-based hotspot detection and silicon-aware parametric analysis help designers optimize their chips for yield, area and performance without the high cost of applying foundries' recommended design rules. This set of DFM/recommended rules is primarily litho-driven, but cannot guarantee a manufacturable design without imposing overly restrictive design requirements. This rule-based methodology of making design decisions based on idealized polygons that no longer represent what is on silicon needs to be replaced. Using model-based simulation of the lithography, OPC, RET and etch effects, followed by electrical evaluation of the resulting shapes, leads to a more realistic and accurate analysis. This analysis can be used to evaluate intelligent design trade-offs and identify potential failures due to systematic manufacturing defects during the design phase. The successful DFM design methodology consists of three parts: 1. Achieve a more aggressive layout through limited usage of litho-related recommended design rules. A 10% to 15% area reduction is achieved by using more aggressive design rules. DFM/recommended design rules are used only if there is no impact on cell size. 2. Identify and fix hotspots using a model-based layout printability checker. Model-based litho and etch simulation are done at the cell level to identify hotspots. Violations of recommended rules may cause additional hotspots, which are then fixed. The resulting design is ready for step 3. 3. Improve timing accuracy with a process-aware parametric analysis tool for transistors and interconnect. Contours of diffusion, poly and metal layers are used for parametric analysis. In this paper, we show the results of this physical and electrical DFM methodology at Qualcomm. We describe how Qualcomm was able to develop more aggressive cell designs that yielded a 10% to 15% area reduction using this methodology. Model-based shape simulation was employed during library development to validate architecture choices and to optimize cell layout. At the physical verification stage, the shape simulator was run at full-chip level to identify and fix residual hotspots on interconnect layers, on poly or metal 1 due to interaction between adjacent cells, or on metal 1 due to interaction between routing (via and via cover) and cell geometry. To determine an appropriate electrical DFM solution, Qualcomm developed an experiment to examine various electrical effects. After reporting the silicon results of this experiment, which showed sizeable delay variations due to lithography-related systematic effects, we also explain how contours of diffusion, poly and metal can be used for silicon-aware parametric analysis of transistors and interconnect at the cell-, block- and chip-level.
NASA Technical Reports Server (NTRS)
1977-01-01
Low-frequency gratings obtainable with present technology can meet the grating-efficiency design goals for potential space telescope spectrographs. Gratings made with changes in three specific parameters (the ruling tool profile, the coating material, and the lubricants used during the ruling process) were compared. A series of coatings and test gratings were fabricated and examined for surface smoothness with a Nomarski differential interference microscope and an electron microscope. Photomicrographs were obtained to show the difference in smoothness of the various coatings and rulings. Efficiency measurements were made for those test rulings that showed good groove characteristics: smoothness, proper ruling depth, and absence of defects (e.g., streaks, feathered edges, and rough sides). Higher grating efficiency should be correlated with the degree of smoothness of both the coating and the grating groove.
In-camera automation of photographic composition rules.
Banerjee, Serene; Evans, Brian L
2007-07-01
At the time of image acquisition, professional photographers apply many rules of thumb to improve the composition of their photographs. This paper develops a joint optical-digital processing framework for automating composition rules during image acquisition for photographs with one main subject. Within the framework, we automate three photographic composition rules: repositioning the main subject, making the main subject more prominent, and making objects that merge with the main subject less prominent. The idea is to provide to the user alternate pictures obtained by applying photographic composition rules in addition to the original picture taken by the user. The proposed algorithms do not depend on prior knowledge of the indoor/outdoor setting or scene content. The proposed algorithms are also designed to be amenable to software implementation on fixed-point programmable digital signal processors available in digital still cameras.
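A minimal sketch of one such composition rule, repositioning the main subject onto a rule-of-thirds power point by choosing a crop window, is given below; the crop size and the assumption that subject coordinates are already detected are illustrative simplifications, not the paper's algorithm.

```python
# Illustrative rule-of-thirds repositioning: shift a crop window so a detected
# main subject lands on a one-third power point. Detection is out of scope here.
def thirds_crop(img_w, img_h, subj_x, subj_y):
    """Return (left, top) of a crop window placing the subject near (1/3, 1/3)."""
    crop_w, crop_h = int(img_w * 0.8), int(img_h * 0.8)   # assumed crop size
    left = min(max(subj_x - crop_w // 3, 0), img_w - crop_w)
    top = min(max(subj_y - crop_h // 3, 0), img_h - crop_h)
    return left, top

print(thirds_crop(3000, 2000, 1500, 1000))  # centered subject -> shifted crop
```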
10 CFR Appendix C to Part 52 - Design Certification Rule for the AP600 Design
Code of Federal Regulations, 2011 CFR
2011-01-01
..., Manager, Passive Plant Engineering, Westinghouse Electric Company, P.O. Box 355, Pittsburgh, Pennsylvania... Web site, http://www.nrc.gov, and/or at the NRC Public Document Room, is insufficient; c. The...) Fuel criteria evaluation process. (4) Fire areas. (5) Human factors engineering. c. A licensee who...
10 CFR Appendix C to Part 52 - Design Certification Rule for the AP600 Design
Code of Federal Regulations, 2010 CFR
2010-01-01
..., Manager, Passive Plant Engineering, Westinghouse Electric Company, P.O. Box 355, Pittsburgh, Pennsylvania... Web site, http://www.nrc.gov, and/or at the NRC Public Document Room, is insufficient; c. The...) Fuel criteria evaluation process. (4) Fire areas. (5) Human factors engineering. c. A licensee who...
ERIC Educational Resources Information Center
Rabin, Colette; Smith, Grinell
2016-01-01
As teacher educators, the authors developed an assignment focused on care ethics to prepare teacher candidates to design classroom-management procedures aimed at cultivating caring community. The teacher candidates revised traditional classroom-management processes, such as class rules, into cocreated norms. They also designed original management…
Integrated model-based retargeting and optical proximity correction
NASA Astrophysics Data System (ADS)
Agarwal, Kanak B.; Banerjee, Shayak
2011-04-01
Conventional resolution enhancement techniques (RET) are becoming increasingly inadequate at addressing the challenges of subwavelength lithography. In particular, features show high sensitivity to process variation in low-k1 lithography. Process variation aware RETs such as process-window OPC are becoming increasingly important to guarantee high lithographic yield, but such techniques suffer from high runtime impact. An alternative to PWOPC is to perform retargeting, which is a rule-assisted modification of target layout shapes to improve their process window. However, rule-based retargeting is not a scalable technique since rules cannot cover the entire search space of two-dimensional shape configurations, especially with technology scaling. In this paper, we propose to integrate the processes of retargeting and optical proximity correction (OPC). We utilize the normalized image log slope (NILS) metric, which is available at no extra computational cost during OPC. We use NILS to guide dynamic target modification between iterations of OPC. We utilize the NILS tagging capabilities of Calibre TCL scripting to identify fragments with low NILS. We then perform NILS binning to assign different magnitudes of retargeting to different NILS bins. NILS is determined both for width, to identify regions of pinching, and space, to locate regions of potential bridging. We develop an integrated flow for 1x metal lines (M1) which exhibits fewer lithographic hotspots compared to a flow with just OPC and no retargeting. We also observe cases where hotspots that existed in the rule-based retargeting flow are fixed using our methodology. Finally, we also demonstrate that such a retargeting methodology does not significantly alter design properties by electrically simulating a latch layout before and after retargeting. We observe less than 1% impact on latch Clk-Q and D-Q delays post-retargeting, which makes this methodology an attractive one for use in improving shape process windows without perturbing designed values.
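The NILS-binning step can be pictured as a simple lookup from a fragment's NILS value to a retarget magnitude; in this sketch the bin edges and bias values are invented, not the paper's calibrated numbers.

```python
# Sketch of NILS-binned retargeting: fragments with weaker normalized image
# log slope receive a larger target bias. Bin edges and biases are invented.
def retarget_bias(nils):
    """Map a fragment's NILS to a retarget magnitude in nm (0 = leave as drawn)."""
    bins = [(1.0, 4.0), (1.5, 2.0), (2.0, 1.0)]  # (upper NILS bound, bias in nm)
    for bound, bias in bins:
        if nils < bound:
            return bias
    return 0.0

# Width fragments use the bias to widen lines (against pinching); space
# fragments use it to widen spaces (against bridging).
for nils in (0.8, 1.7, 2.5):
    print(f"NILS {nils} -> bias {retarget_bias(nils)} nm")
```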
Det Norske Veritas rule philosophy with regard to gas turbines for marine propulsion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, P.
1999-04-01
This paper is mainly based on Det Norske Veritas (DNV) Rules of January 1996, Part 4, Chapter 2, Section 4 -- Gas Turbines, and is intended to at least open the dialogue between the gas turbine industry and DNV. There is a need for systematic design approval, manufacturing inspection, and testing procedures that match the standards of the industry. The roles and expectations imposed by owners, the authorities, insurance agencies, etc. need to be understood. These expectations often have technical implications that may go against the normal procedures and practices of the gas turbine industry, and could have cost impacts. The question of DNV acceptance criteria has been asked many times with respect to gas turbines. DNV relies a great deal on the manufacturer to provide the basis for the design, manufacturing, and testing criteria of the gas turbine. However, DNV adds its knowledge and experience to this, and checks that the documentation presented by the manufacturer is technically acceptable. Generally, a high level of state-of-the-art theoretical documentation is required to support the design of modern gas turbines. A proper understanding of the rule philosophy of DNV could prove useful in developing better gas turbine systems which fulfill the rule requirements and at the same time save resources such as money and time. It is important for gas turbine manufacturers to understand the intent of the rules, since it is the intent that needs to be fulfilled. Further, the rules do have the principle of equivalence, which means that there is full freedom in how one fulfills the intent of the rules, as long as DNV accepts the solution.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-10
... and approval process for advertisements, correspondence, and institutional sales material. The..., in particular, in that it is designed to prevent fraudulent and manipulative acts and practices, to...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-10
... and approval process for advertisements, correspondence, and institutional sales material. The..., in that it is designed to prevent fraudulent and manipulative acts and practices, to promote just and...
The research on construction and application of machining process knowledge base
NASA Astrophysics Data System (ADS)
Zhao, Tan; Qiao, Lihong; Qie, Yifan; Guo, Kai
2018-03-01
In order to realize the application of knowledge in machining process design, from the perspective of knowledge use in computer-aided process planning (CAPP), a hierarchical knowledge classification structure is established according to the characteristics of the mechanical engineering field. The expression of machining process knowledge is structured by means of production rules and object-oriented methods. Three kinds of knowledge base models are constructed according to the representation of machining process knowledge. In this paper, the definition and classification of machining process knowledge, the knowledge model, and the application flow of process design based on the knowledge base are given, and the main steps of machine tool design decision-making are carried out as an application using the knowledge base.
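A production rule in such a knowledge base can be sketched as a condition-action pair matched against part facts; the rules and attribute names below are invented examples, not the paper's knowledge base content.

```python
# Minimal production-rule sketch for machining process knowledge.
# Rules and attributes are invented for illustration.
RULES = [
    {"if": {"feature": "hole", "tolerance": "fine"}, "then": "drill + ream"},
    {"if": {"feature": "hole"},                      "then": "drill"},
    {"if": {"feature": "flat", "finish": "smooth"},  "then": "mill + grind"},
]

def decide(facts):
    """Return the action of the first rule whose conditions all match the facts."""
    for rule in RULES:
        if all(facts.get(k) == v for k, v in rule["if"].items()):
            return rule["then"]
    return None

print(decide({"feature": "hole", "tolerance": "fine"}))  # -> drill + ream
print(decide({"feature": "hole", "tolerance": "coarse"}))  # -> drill
```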
NASA Astrophysics Data System (ADS)
Li, Dongni; Guo, Rongtao; Zhan, Rongxin; Yin, Yong
2018-06-01
In this article, an innovative artificial bee colony (IABC) algorithm is proposed, which incorporates two mechanisms. On the one hand, to provide the evolutionary process with a higher starting level, genetic programming (GP) is used to generate heuristic rules by exploiting the elements that constitute the problem. On the other hand, to achieve a better balance between exploration and exploitation, a leading mechanism is proposed to attract individuals towards a promising region. To evaluate the performance of IABC in solving practical and complex problems, it is applied to the intercell scheduling problem with limited transportation capacity. It is observed that the GP-generated rules incorporate the elements of the most competing human-designed rules, and they are more effective than the human-designed ones. Regarding the leading mechanism, the strategies of the ageing leader and multiple challengers make the algorithm less likely to be trapped in local optima.
A rule based computer aided design system
NASA Technical Reports Server (NTRS)
Premack, T.
1986-01-01
A Computer Aided Design (CAD) system is presented which supports the iterative process of design, the dimensional continuity between mating parts, and the hierarchical structure of the parts in their assembled configuration. Prolog, an interactive logic programming language, is used to represent and interpret the data base. The solid geometry representing the parts is defined in parameterized form using the swept volume method. The system is demonstrated with a design of a spring piston.
NASA Astrophysics Data System (ADS)
Ugon, B.; Nandong, J.; Zang, Z.
2017-06-01
The presence of unstable dead-time systems in process plants often leads to a daunting challenge in the design of standard PID controllers, which are intended not only to provide closed-loop stability but also to give good overall performance and robustness. In this paper, we conduct stability analysis on a double-loop control scheme based on the Routh-Hurwitz stability criteria. We propose to use this double-loop control scheme, which employs two P/PID controllers, to control first-order or second-order unstable dead-time processes typically found in the process industries. Based on the Routh-Hurwitz necessary and sufficient stability criteria, we establish several stability regions which enclose the P/PID parameter values that guarantee closed-loop stability of the double-loop control scheme. A systematic tuning rule is developed for the purpose of obtaining the optimal P/PID parameter values within the established regions. The effectiveness of the proposed tuning rule is demonstrated using several numerical examples, and the results are compared with some well-established tuning methods reported in the literature.
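For readers who want to experiment, here is a plain Routh-Hurwitz first-column test of the kind the analysis relies on; it is a minimal sketch that skips the zero-row special cases of the full criterion, and the example polynomials are invented.

```python
# Routh-Hurwitz first-column test. Minimal sketch: zero-row special
# cases are not handled, and a zero pivot is treated as "not proven stable".
def routh_stable(coeffs):
    """coeffs: characteristic polynomial, highest power first.
    Returns True if all roots lie in the open left half-plane."""
    n = len(coeffs)
    row0 = [float(c) for c in coeffs[0::2]]
    row1 = [float(c) for c in coeffs[1::2]] + [0.0] * (len(row0) - len(coeffs[1::2]))
    rows = [row0, row1]
    while len(rows) < n:
        a, b = rows[-2], rows[-1]
        if b[0] == 0.0:
            return False  # zero pivot: marginal/special case, skipped here
        rows.append([(b[0] * a[i + 1] - a[0] * b[i + 1]) / b[0]
                     for i in range(len(a) - 1)] + [0.0])
    return all(r[0] > 0 for r in rows)

print(routh_stable([1, 2, 3, 4]))   # True: s^3 + 2s^2 + 3s + 4 is Hurwitz
print(routh_stable([1, 1, -1, 1]))  # False: first-column sign change
```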
Design and evaluation of a service oriented architecture for paperless ICU tarification.
Steurbaut, Kristof; Colpaert, Kirsten; Van Hoecke, Sofie; Steurbaut, Sabrina; Danneels, Chris; Decruyenaere, Johan; De Turck, Filip
2012-06-01
The computerization of Intensive Care Units provides an overwhelming amount of electronic data for both medical and financial analysis. However, the current tarification, which is the process of ticking and counting patients' procedures, is still a repetitive, time-consuming process on paper. Nurses and secretaries keep track of the patients' medical procedures manually. This paper describes the design methodology and implementation of automated tarification services. In this study we investigate whether tarification can be modeled in a service oriented architecture as a composition of interacting services. Services are responsible for data collection, automatic assignment of records to physicians, and application of rules. Performance is evaluated in terms of execution time, cost, and return on investment based on tracking of real procedures. The services provide high flexibility in terms of maintenance, integration, and rules support. It is shown that services offer a more accurate, less time-consuming, and cost-effective tarification.
Design and Characterization of a Secure Automatic Dependent Surveillance-Broadcast Prototype
2015-03-26
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schunk, Peter Randall; King, William P.; Sun, Amy Cha-Tien
2006-08-01
This paper presents continuum simulations of polymer flow during nanoimprint lithography (NIL). The simulations capture the underlying physics of polymer flow from the nanometer to millimeter length scale and examine geometry and thermophysical process quantities affecting cavity filling. Variations in embossing tool geometry and polymer film thickness during viscous flow distinguish different flow driving mechanisms. Three parameters can predict polymer deformation mode: cavity width to polymer thickness ratio, polymer supply ratio, and Capillary number. The ratio of cavity width to initial polymer film thickness determines vertically or laterally dominant deformation. The ratio of indenter width to residual film thickness measures polymer supply beneath the indenter, which determines Stokes or squeeze flow. The local geometry ratios can predict a fill time based on laminar flow between plates, Stokes flow, or squeeze flow. Characteristic NIL capillary number based on geometry-dependent fill time distinguishes between capillary or viscous driven flows. The three parameters predict filling modes observed in published studies of NIL deformation over nanometer to millimeter length scales. The work seeks to establish process design rules for NIL and to provide tools for the rational design of NIL master templates, resist polymers, and process parameters.
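The three regime predictors can be computed directly from geometry and material data; in this sketch the input values are invented, and the capillary number is written in its generic form (viscosity times speed over surface tension) rather than the paper's fill-time-based characteristic version.

```python
# The three deformation-mode predictors named above, for one imprint geometry.
# All numerical values are invented for illustration (SI units).
def nil_regime_parameters(cavity_w, indenter_w, film_t, residual_t,
                          viscosity, fill_speed, surface_tension):
    width_ratio = cavity_w / film_t           # vertical vs. lateral deformation
    supply_ratio = indenter_w / residual_t    # Stokes vs. squeeze flow under the indenter
    capillary = viscosity * fill_speed / surface_tension  # viscous vs. capillary driving
    return width_ratio, supply_ratio, capillary

print(nil_regime_parameters(cavity_w=2e-6, indenter_w=10e-6, film_t=200e-9,
                            residual_t=50e-9, viscosity=1e3,
                            fill_speed=1e-8, surface_tension=0.03))
```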
Chylek, Lily A.; Harris, Leonard A.; Tung, Chang-Shung; Faeder, James R.; Lopez, Carlos F.
2013-01-01
Rule-based modeling was developed to address the limitations of traditional approaches for modeling chemical kinetics in cell signaling systems. These systems consist of multiple interacting biomolecules (e.g., proteins), which themselves consist of multiple parts (e.g., domains, linear motifs, and sites of phosphorylation). Consequently, biomolecules that mediate information processing generally have the potential to interact in multiple ways, with the number of possible complexes and post-translational modification states tending to grow exponentially with the number of binary interactions considered. As a result, only large reaction networks capture all possible consequences of the molecular interactions that occur in a cell signaling system, which is problematic because traditional modeling approaches for chemical kinetics (e.g., ordinary differential equations) require explicit network specification. This problem is circumvented through representation of interactions in terms of local rules. With this approach, network specification is implicit and model specification is concise. Concise representation results in a coarse graining of chemical kinetics, which is introduced because all reactions implied by a rule inherit the rate law associated with that rule. Coarse graining can be appropriate if interactions are modular, and the coarseness of a model can be adjusted as needed. Rules can be specified using specialized model-specification languages, and recently developed tools designed for specification of rule-based models allow one to leverage powerful software engineering capabilities. A rule-based model comprises a set of rules, which can be processed by general-purpose simulation and analysis tools to achieve different objectives (e.g., to perform either a deterministic or stochastic simulation). PMID:24123887
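The core idea, one local rule implying many concrete reactions that all inherit the rule's rate law, can be illustrated in a few lines; the molecules and the rule below are invented, and real rule-based tools such as BioNetGen use dedicated rule languages rather than this toy expansion.

```python
# Toy illustration of rule-based modeling: a single local binding rule
# implies one concrete reaction per compatible pair of reactant states.
from itertools import product

ligands = ["L1", "L2"]             # species with a free binding site 'b'
receptors = ["R_phos", "R_unphos"] # receptor states sharing the same site 'r'

# Rule (written once): L(b) + R(r) -> L(b!1).R(r!1) at rate k.
# Expansion: every (ligand, receptor-state) pair yields a reaction, and all
# of them inherit the one rule's rate constant -- the "coarse graining".
k = 0.1
reactions = [(f"{lig} + {rec} -> {lig}.{rec}", k)
             for lig, rec in product(ligands, receptors)]
for rxn in reactions:
    print(rxn)  # 4 reactions generated from 1 rule
```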
Automation for pattern library creation and in-design optimization
NASA Astrophysics Data System (ADS)
Deng, Rock; Zou, Elain; Hong, Sid; Wang, Jinyan; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh; Ding, Hua; Huang, Jason
2015-03-01
Semiconductor manufacturing technologies are becoming increasingly complex with every passing node. Newer technology nodes are pushing the limits of optical lithography and requiring multiple exposures with exotic material stacks for each critical layer. All of this added complexity usually amounts to further restrictions in what can be designed. Furthermore, the designs must be checked against all these restrictions in verification and sign-off stages. Design rules are intended to capture all the manufacturing limitations such that yield can be maximized for any given design adhering to all the rules. Most manufacturing steps employ some sort of model based simulation which characterizes the behavior of each step. The lithography models play a very big part of the overall yield and design restrictions in patterning. However, lithography models are not practical to run during design creation due to their slow and prohibitive run times. Furthermore, the models are not usually given to foundry customers because of the confidential and sensitive nature of every foundry's processes. The design layout locations where a model flags unacceptable simulated results can be used to define pattern rules which can be shared with customers. With advanced technology nodes we see a large growth of pattern based rules. This is due to the fact that pattern matching is very fast and the rules themselves can be very complex to describe in a standard DRC language. Therefore, the patterns are left as either pattern layout clips or abstracted into pattern-like syntax which a pattern matcher can use directly. The patterns themselves can be multi-layered with "fuzzy" designations such that groups of similar patterns can be found using one description. The pattern matcher is often integrated with a DRC tool such that verification and signoff can be done in one step. The patterns can be layout constructs that are "forbidden", "waived", or simply low-yielding in nature. The patterns can also contain remedies built in so that fixing happens either automatically or in a guided manner. Building a comprehensive library of patterns is a very difficult task especially when a new technology node is being developed or the process keeps changing. The main dilemma is not having enough representative layouts to use for model simulation where pattern locations can be marked and extracted. This paper will present an automatic pattern library creation flow by using a few known yield detractor patterns to systematically expand the pattern library and generate optimized patterns. We will also look at the specific fixing hints in terms of edge movements, additive, or subtractive changes needed during optimization. Optimization will be shown for both the digital physical implementation and custom design methods.
Automating Rule Strengths in Expert Systems.
1987-05-01
systems were designed in an incremental, iterative way. One of the most easily identifiable phases in this process, sometimes called tuning, consists...attenuators. The designer of the knowledge-based system must determine (synthesize) or adjust (xfine, if estimates of the values are given) these...values. We consider two ways in which the designer can learn the values. We call the first model of learning the complete case and the second model the
On nonstationarity-related errors in modal combination rules of the response spectrum method
NASA Astrophysics Data System (ADS)
Pathak, Shashank; Gupta, Vinay K.
2017-10-01
Characterization of seismic hazard via (elastic) design spectra and the estimation of linear peak response of a given structure from this characterization continue to form the basis of earthquake-resistant design philosophy in various codes of practice all over the world. Since the direct use of design spectrum ordinates is a preferred option for the practicing engineers, modal combination rules play central role in the peak response estimation. Most of the available modal combination rules are however based on the assumption that nonstationarity affects the structural response alike at the modal and overall response levels. This study considers those situations where this assumption may cause significant errors in the peak response estimation, and preliminary models are proposed for the estimation of the extents to which nonstationarity affects the modal and total system responses, when the ground acceleration process is assumed to be a stationary process. It is shown through numerical examples in the context of complete-quadratic-combination (CQC) method that the nonstationarity-related errors in the estimation of peak base shear may be significant, when strong-motion duration of the excitation is too small compared to the period of the system and/or the response is distributed comparably in several modes. It is also shown that these errors are reduced marginally with the use of the proposed nonstationarity factor models.
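For reference, the CQC estimate whose nonstationarity-related errors the study models combines peak modal responses through cross-modal correlation coefficients; in standard textbook form (not reproduced from the paper):

```latex
% Peak response r by the CQC rule: r_i is the peak response of mode i and
% \rho_{ij} the cross-modal correlation coefficient (with \rho_{ii} = 1).
r \approx \sqrt{\sum_{i}\sum_{j} \rho_{ij}\, r_i\, r_j}
```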
Advanced metrology by offline SEM data processing
NASA Astrophysics Data System (ADS)
Lakcher, Amine; Schneider, Loïc.; Le-Gratiet, Bertrand; Ducoté, Julien; Farys, Vincent; Besacier, Maxime
2017-06-01
Today's technology nodes contain more and more complex designs, bringing increasing challenges to chip manufacturing process steps. It is necessary to have an efficient metrology to assess process variability of these complex patterns and thus extract relevant data to generate process-aware design rules and to improve OPC models. Today process variability is mostly addressed through the analysis of in-line monitoring features, which are often designed to support robust measurements and as a consequence are not always very representative of critical design rules. CD-SEM is the main CD metrology technique used in the chip manufacturing process, but it is challenged when it comes to measuring metrics like tip to tip, tip to line, areas, or necking in high quantity and with robustness. CD-SEM images contain a lot of information that is not always used in metrology. Suppliers have provided tools that allow engineers to extract the SEM contours of their features and to convert them into a GDS. Contours can be seen as the signature of the shape, as they contain all the dimensional data. Thus the methodology is to use the CD-SEM to take high quality images, then generate SEM contours and create a database out of them. Contours are used to feed an offline metrology tool that will process them to extract different metrics. It was shown in two previous papers that it is possible to perform complex measurements on hotspots at different process steps (lithography, etch, copper CMP) by using SEM contours with an in-house offline metrology tool. In the current paper, the methodology presented previously will be expanded to improve its robustness and combined with the use of phylogeny to classify the SEM images according to their geometrical proximities.
Industry Application Emergency Core Cooling System Cladding Acceptance Criteria Early Demonstration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Szilard, Ronaldo H.; Youngblood, Robert W.; Zhang, Hongbin
2015-09-01
The U. S. NRC is currently proposing rulemaking designated as “10 CFR 50.46c” to revise the loss-of-coolant-accident (LOCA)/emergency core cooling system (ECCS) acceptance criteria to include the effects of higher burnup on cladding performance as well as to address other technical issues. The NRC is also currently resolving the public comments, with the final rule expected to be issued in April 2016. The impact of the final 50.46c rule on the industry may involve updating of fuel vendor LOCA evaluation models, NRC review and approval, and licensee submittal of new LOCA evaluations or re-analyses and associated technical specification revisions for NRC review and approval. The rule implementation process, both industry and NRC activities, is expected to take 4-6 years following the rule effective date. The new rule may motivate the need to use advanced cladding designs. A loss of operational margin may result from the more restrictive cladding embrittlement criteria. Initial and future compliance with the rule may significantly increase vendor workload and licensee cost, as a spectrum of fuel rod initial burnup states may need to be analyzed to demonstrate compliance. Consequently, there will be an increased focus on licensee decision making related to LOCA analysis to minimize cost and impact, and to manage margin. The proposed rule would apply to light water reactors and to all cladding types.
Knowledge base rule partitioning design for CLIPS
NASA Technical Reports Server (NTRS)
Mainardi, Joseph D.; Szatkowski, G. P.
1990-01-01
This paper describes a knowledge base (KB) partitioning approach to solve the problem of real-time performance when using the CLIPS AI shell with large numbers of rules and facts. This work is funded under the joint USAF/NASA Advanced Launch System (ALS) Program as applied research in expert systems to perform vehicle checkout for real-time controller and diagnostic monitoring tasks. The main objective of the Expert System advanced development project (ADP-2302) is to provide robust systems responding to new data frames at 0.1 to 1.0 second intervals. The intelligent system control must be performed within the specified real-time window in order to meet the demands of the given application. Partitioning the KB reduces the complexity of the inferencing Rete net at any given time. This reduced complexity improves performance without undue impacts during load and unload cycles. The second objective is to produce highly reliable intelligent systems. This requires simple and automated approaches to the KB verification & validation task. Partitioning the KB reduces rule interaction complexity overall. Reduced interaction simplifies the V&V testing necessary by focusing attention only on individual areas of interest. Many systems require a robustness that involves a large number of rules, most of which are mutually exclusive under different phases or conditions. The ideal solution is to control the knowledge base by loading rules that directly apply for that condition, while stripping out all rules and facts that are not used during that cycle. The practical approach is to cluster rules and facts into associated 'blocks'. A simple approach has been designed to control the addition and deletion of 'blocks' of rules and facts, while allowing real-time operations to run freely. Timing tests for real-time performance for specific machines under R/T operating systems have not been completed but are planned as part of the analysis process to validate the design.
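The block load/unload idea can be sketched as a registry of phase-keyed rule blocks with only the current phase's block resident; the phase and rule names are invented, and a real implementation would drive CLIPS load/unload calls rather than a Python list.

```python
# Sketch of KB partitioning into 'blocks': only the rules for the current
# phase are resident, shrinking the match network. Names are invented.
RULE_BLOCKS = {
    "prelaunch": ["check_fuel_pressure", "check_valve_status"],
    "ascent":    ["monitor_engine_temp", "detect_thrust_anomaly"],
}

class PartitionedKB:
    def __init__(self):
        self.active = []

    def switch_phase(self, phase):
        # Unload everything, then load only the block for this phase.
        self.active = list(RULE_BLOCKS[phase])

kb = PartitionedKB()
kb.switch_phase("prelaunch")
print(kb.active)
kb.switch_phase("ascent")   # prelaunch rules are stripped out
print(kb.active)
```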
Optics Toolbox: An Intelligent Relational Database System For Optical Designers
NASA Astrophysics Data System (ADS)
Weller, Scott W.; Hopkins, Robert E.
1986-12-01
Optical designers were among the first to use the computer as an engineering tool. Powerful programs have been written to do ray-trace analysis, third-order layout, and optimization. However, newer computing techniques such as database management and expert systems have not been adopted by the optical design community. For the purpose of this discussion we will define a relational database system as a database which allows the user to specify his requirements using logical relations. For example, to search for all lenses in a lens database with a F/number less than two, and a half field of view near 28 degrees, you might enter the following: FNO < 2.0 and FOV of 28 degrees ± 5% Again for the purpose of this discussion, we will define an expert system as a program which contains expert knowledge, can ask intelligent questions, and can form conclusions based on the answers given and the knowledge which it contains. Most expert systems store this knowledge in the form of rules-of-thumb, which are written in an English-like language, and which are easily modified by the user. An example rule is: IF require microscope objective in air and require NA > 0.9 THEN suggest the use of an oil immersion objective The heart of the expert system is the rule interpreter, sometimes called an inference engine, which reads the rules and forms conclusions based on them. The use of a relational database system containing lens prototypes seems to be a viable prospect. However, it is not clear that expert systems have a place in optical design. In domains such as medical diagnosis and petrology, expert systems are flourishing. These domains are quite different from optical design, however, because optical design is a creative process, and the rules are difficult to write down. We do think that an expert system is feasible in the area of first order layout, which is sufficiently diagnostic in nature to permit useful rules to be written. This first-order expert would emulate an expert designer as he interacted with a customer for the first time: asking the right questions, forming conclusions, and making suggestions. With these objectives in mind, we have developed the Optics Toolbox. Optics Toolbox is actually two programs in one: it is a powerful relational database system with twenty-one search parameters, four search modes, and multi-database support, as well as a first-order optical design expert system with a rule interpreter which has full access to the relational database. The system schematic is shown in Figure 1.
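The example query quoted above maps directly onto a relational filter; the lens records below are invented, but the predicate is exactly the FNO and FOV condition shown in the abstract.

```python
# The abstract's query "FNO < 2.0 and FOV of 28 degrees +/- 5%" expressed
# as a filter over a toy lens table. Lens records are invented.
lenses = [
    {"name": "proto-A", "fno": 1.8, "fov": 28.5},
    {"name": "proto-B", "fno": 2.4, "fov": 28.0},
    {"name": "proto-C", "fno": 1.4, "fov": 45.0},
]

hits = [lens for lens in lenses
        if lens["fno"] < 2.0 and abs(lens["fov"] - 28.0) <= 0.05 * 28.0]
print([lens["name"] for lens in hits])  # -> ['proto-A']
```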
NASA Technical Reports Server (NTRS)
Starks, Scott; Abdel-Hafeez, Saleh; Usevitch, Bryan
1997-01-01
This paper discusses the implementation of a fuzzy logic system using an ASIC design approach. The approach is based upon combining the inherent advantages of symmetric triangular membership functions and fuzzy singleton sets to obtain a novel structure for fuzzy logic system application development. The resulting structure utilizes a fuzzy static RAM to store the rule-base and the end-points of the triangular membership functions. This provides advantages over other approaches in which all sampled values of membership functions for all universes must be stored. The fuzzy coprocessor structure implements the fuzzification and defuzzification processes through a two-stage parallel pipeline architecture which is capable of executing complex fuzzy computations in less than 0.55 us with an accuracy of more than 95%, thus making it suitable for a wide range of applications. Using the approach presented in this paper, a fuzzy logic rule-base can be directly downloaded via a host processor to an on-chip rule-base memory with a size of 64 words. The fuzzy coprocessor's design supports up to 49 rules for seven fuzzy membership functions associated with each of the chip's two input variables. This feature allows designers to create fuzzy logic systems without the need for additional on-board memory. Finally, the paper reports on simulation studies that were conducted for several adaptive filter applications using the least mean squared adaptive algorithm for adjusting the knowledge rule-base.
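Storing only the end-points of a symmetric triangular membership function, rather than sampling the whole curve, is the representation the architecture exploits; a minimal sketch of the evaluation, with invented set boundaries:

```python
# Symmetric triangular membership evaluated from stored end-points only,
# instead of storing every sampled value of the curve. Values are invented.
def tri_membership(x, left, right):
    """Degree of membership for a symmetric triangle spanning [left, right]."""
    center, half = (left + right) / 2.0, (right - left) / 2.0
    return max(0.0, 1.0 - abs(x - center) / half)

print(tri_membership(0.2, 0.0, 1.0))  # 0.4
print(tri_membership(0.5, 0.0, 1.0))  # 1.0 (peak at the center)
```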
Application of Kansei engineering and data mining in the Thai ceramic manufacturing
NASA Astrophysics Data System (ADS)
Kittidecha, Chaiwat; Yamada, Koichi
2018-01-01
Ceramic is one of the most highly competitive products in Thailand. Many Thai ceramic companies are attempting to learn customer needs and perceptions in order to make well-liked products. Knowing customer needs is the target of designers, who must develop products that satisfy customers. This research applies Kansei Engineering (KE) and Data Mining (DM) to the customer-driven product design process. KE can translate customer emotions into product attributes; the method determines the relationships between customer feelings, or Kansei words, and the design attributes. The J48 decision tree and class association rules, implemented through the Waikato Environment for Knowledge Analysis (WEKA) software, are used to generate a predictive model and to find the appropriate rules. In this experiment, the emotion scores were rated by 37 participants for training data and 16 participants for test data. Six Kansei words were selected, namely attractive, ease of drinking, ease of handling, quality, modern, and durable. Ten mugs were selected as product samples. The results of this study indicate that the proposed models and rules can interpret the product design elements affecting customer emotions. Finally, this study provides useful understanding of the application of DM in KE and can be applied to a variety of design cases.
A Computer Program You Can Use: Edging and Trimmer Trainer
Philip A. Araman; D. Earl Kline; Matthew F. Winn
1996-01-01
We present a computerized training tool designed to help hardwood sawmill edger and trim saw operators improve their processing performance. It can also be used by managers to understand the effects of processing decisions such as limiting wane beyond standard grading rule restrictions. The program helps users understand the relationships between lumber grade, surface...
Resource Planning for Massive Number of Process Instances
NASA Astrophysics Data System (ADS)
Xu, Jiajie; Liu, Chengfei; Zhao, Xiaohui
Resource allocation has been recognised as an important topic for business process execution. In this paper, we focus on planning resources for a massive number of process instances to meet the process requirements and cater for rational utilisation of resources before execution. After a motivating example, we present a model for planning resources for process instances. Then we design a set of heuristic rules that take into account both optimised planning at build time and instance dependencies at run time. Based on these rules we propose two strategies, one called holistic and the other called batched, for resource planning. Both strategies target a lower cost; however, the holistic strategy can achieve an earlier deadline while the batched strategy aims at rational use of resources. We discuss how to find a balance between them, with a comprehensive experimental study of the two approaches.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Szilard, Ronaldo Henriques; Youngblood, Robert; Frepoli, Cesare
2015-04-01
The U. S. NRC is currently proposing rulemaking designated as “10 CFR 50.46c” to revise the LOCA/ECCS acceptance criteria to include the effects of higher burnup on cladding performance as well as to address some other issues. The NRC is also currently resolving the public comments with the final rule expected to be issued in the summer of 2016. The impact of the final 50.46c rule on the industry will involve updating of fuel vendor LOCA evaluation models, NRC review and approval, and licensee submittal of new LOCA evaluations or reanalyses and associated technical specification revisions for NRC review and approval. The rule implementation process, both industry and NRC activities, is expected to take 5-10 years following the rule effective date. The need to use advanced cladding designs is expected. A loss of operational margin will result due to the more restrictive cladding embrittlement criteria. Initial and future compliance with the rule may significantly increase vendor workload and licensee cost as a spectrum of fuel rod initial burnup states may need to be analyzed to demonstrate compliance. Consequently there will be an increased focus on licensee decision making related to LOCA analysis to minimize cost and impact, and to manage margin.
Design and Training of Limited-Interconnect Architectures
1991-07-16
and signal processing. Neuromorphic (brain-like) models allow an alternative for achieving real-time operation for such tasks, while having a...compact and robust architecture. Neuromorphic models consist of interconnections of simple computational nodes. In this approach, each node computes a...operational performance. II. Research Objectives The research objectives were: 1. Development of on-chip local training rules specifically designed for
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-11
... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-68834; File No. SR-DTC-2012-10] Self-Regulatory Organizations; The Depository Trust Company; Notice of Designation of a Longer Period for Commission Action on Proposed Rule Change To Reduce Liquidity Risk Relating to Its Processing of Maturity and Income Presentments and Issuances of Money Marke...
Robot companions and ethics a pragmatic approach of ethical design.
Cornet, Gérard
2013-12-01
From his experience as an ethical expert for two Robot Companion prototype projects aiming at empowering older MCI persons to remain at home and at supporting their family carers, Gerard Cornet, gerontologist, reviews the ethical rules, principles, and pragmatic approaches in different cultures. The ethical processes of these two funded projects, one European, CompanionAble (FP7 e-inclusion call 1), the other French, Quo Vadis (ANR Tecsan), are described, from the inclusion of the targeted end users in the process, to the assessment and ranking of their main needs and wishes to design the specifications and test the expected performance. Obstacles to work around and limits for risk evaluation (direct or implicit), acceptability, utility, respect of intimacy and dignity, the balance between freedom and security, and frontiers to artificial intelligence are discussed. As noted in the discussion with the French and Japanese experts attending the Toulouse robotics and medicine symposium (March 26th, 2011), a new ethical approach, going further than the present ethical rules, is needed for the design and social status of ethical robots, seen as a factor of progress and of global quality of innovation design in an ageing society.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Do these rules affect the service of process requirements of the Federal Rules of Civil Procedure?... Rules of Civil Procedure regarding service of process. ...
77 FR 61117 - Significant New Use Rules on Certain Chemical Substances
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-05
...EPA is promulgating significant new use rules (SNURs) under the Toxic Substances Control Act (TSCA) for 78 chemical substances which were the subject of premanufacture notices (PMNs). Seven of these chemical substances are subject to TSCA section 5(e) consent orders issued by EPA. This action requires persons who intend to manufacture, import, or process any of these 78 chemical substances for an activity that is designated as a significant new use by this rule to notify EPA at least 90 days before commencing that activity. The required notification will provide EPA with the opportunity to evaluate the intended use and, if necessary, to prohibit or limit that activity before it occurs.
Sandia Advanced MEMS Design Tools v. 3.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yarberry, Victor R.; Allen, James J.; Lantz, Jeffrey W.
This is a major revision to the Sandia Advanced MEMS Design Tools. It replaces all previous versions. New features in this version: revised to support AutoCAD 2014 and 2015. This CD contains an integrated set of electronic files that: a) describe the SUMMiT V fabrication process; b) provide enabling educational information (including pictures, videos, technical information); c) facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library); d) facilitate the process of having MEMS fabricated at Sandia National Laboratories; and e) facilitate the process of having post-fabrication services performed. While there exist some files on the CD that are used in conjunction with the software package AutoCAD, these files are not intended for use independent of the CD. Note that the customer must purchase his/her own copy of AutoCAD to use with these files.
Analyzing CMOS/SOS fabrication for LSI arrays
NASA Technical Reports Server (NTRS)
Ipri, A. C.
1978-01-01
Report discusses set of design rules that have been developed as result of work with test arrays. Set of optimum dimensions is given that would maximize process output and would correspondingly minimize costs in fabrication of large-scale integration (LSI) arrays.
Design of integration-ready metasurface-based infrared absorbers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ogando, Karim, E-mail: karim@cab.cnea.gov.ar; Pastoriza, Hernán
2015-07-28
We introduce an integration-ready design of a metamaterial infrared absorber, highly compatible with many kinds of fabrication processes. We present the results of an exhaustive experimental characterization, including an analysis of the effects of single meta-atom geometrical parameters and collective arrangement. We compare the results with the theoretical interpretations proposed in the literature. Based on the results, we develop a set of practical design rules for metamaterial absorbers in the infrared region.
Optimization of RET flow using test layout
NASA Astrophysics Data System (ADS)
Zhang, Yunqiang; Sethi, Satyendra; Lucas, Kevin
2008-11-01
At advanced technology nodes with extremely low-k1 lithography, it is very hard to achieve image fidelity requirements and process window for some layout configurations. Quite often these layouts are within simple design rule constraints for a given technology node. It is important to have these layouts included during early RET flow development. Most RET development is based on layouts shrunk from the previous technology node, which is possibly not good enough. A better methodology for creating test layouts is required for optical proximity correction (OPC) recipe and assist feature development. In this paper we demonstrate the application of programmable test layouts in RET development. Layout pattern libraries are developed and embedded in a layout tool (ICWB). Assessment gauges are generated together with patterns for quick correction accuracy assessment. Several groups of test pattern libraries have been developed based on learning from product patterns and a layout DOE approach. The interaction between layout patterns and the OPC recipe has been studied. Correction of a contact layer is quite challenging because of poor convergence and low process window. We developed a test pattern library with many different contact configurations. Different OPC schemes are studied on these test layouts. The worst process-window patterns are pinpointed for a given illumination condition. Assist features (AF) are frequently placed according to pre-determined rules to improve the lithography process window. These rules are usually derived from lithographic models and experiments. Direct validation of AF rules is required at the development phase. We use the test layout approach to determine rules in order to eliminate AF printability problems.
Salzmann-Erikson, Martin
2017-11-01
Ward rules in psychiatric care aim to promote safety for both patients and staff. Simultaneously, ward rules are associated with increased patient violence, leading to neither a safe work environment nor a safe caring environment. Although ward rules are routinely used, few studies have explicitly accounted for their impact. To describe the process of a team development project considering ward rule issues, and to develop a working model to empower staff in their daily in-patient psychiatric nursing practices. The design of this study is explorative and descriptive. Participatory action research methodology was applied to understand ward rules. Data consists of audio-recorded group discussions, observations and field notes, together creating a data set of 556 text pages. More than 100 specific ward rules were identified. In this process, the word rules was relinquished in favor of adopting the term principles, since rules are inconsistent with a caring ideology. A linguistic transition led to the development of a framework embracing the (1) Principle of Safety, (2) Principle of Structure and (3) Principle of Interplay. The principles were linked to normative guidelines and applied ethical theories: deontology, consequentialism and ethics of care. The work model reminded staff about the principles, empowered their professional decision-making, decreased collegial conflicts because of increased acceptance for individual decisions, and, in general, improved well-being at work. Furthermore, the work model also empowered staff to find support for their decisions based on principles that are grounded in the ethics of totality.
VIP: A knowledge-based design aid for the engineering of space systems
NASA Technical Reports Server (NTRS)
Lewis, Steven M.; Bellman, Kirstie L.
1990-01-01
The Vehicles Implementation Project (VIP), a knowledge-based design aid for the engineering of space systems is described. VIP combines qualitative knowledge in the form of rules, quantitative knowledge in the form of equations, and other mathematical modeling tools. The system allows users rapidly to develop and experiment with models of spacecraft system designs. As information becomes available to the system, appropriate equations are solved symbolically and the results are displayed. Users may browse through the system, observing dependencies and the effects of altering specific parameters. The system can also suggest approaches to the derivation of specific parameter values. In addition to providing a tool for the development of specific designs, VIP aims at increasing the user's understanding of the design process. Users may rapidly examine the sensitivity of a given parameter to others in the system and perform tradeoffs or optimizations of specific parameters. A second major goal of VIP is to integrate the existing corporate knowledge base of models and rules into a central, symbolic form.
NASA Astrophysics Data System (ADS)
Mohamad, M. L.; Rahman, M. T. A.; Khan, S. F.; Basha, M. H.; Adom, A. H.; Hashim, M. S. M.
2017-10-01
The main purpose of this study is to improve the UniMAP Automotive Racing Team car chassis, which has several problems that must be fixed and requires some changes in order to perform well. This study involves the process of designing three chassis created based on the rules stated in the FSAE rule book (2017/2018). The three chassis undergo an analysis consisting of five tests, namely the main roll hoop test, front roll hoop test, static shear, side impact, and static torsional loading, and finally one of them is selected as the best design in terms of von Mises stress and torsional displacement. From the results obtained, the new selected chassis design, which is also declared the new improved design, has a weight of 27.66 kg, decreased by 16.7% from the existing chassis (32.77 kg). The torsional rigidity of the improved chassis increased by 37.74%.
10 CFR Appendix C to Part 52 - Design Certification Rule for the AP600 Design
Code of Federal Regulations, 2012 CFR
2012-01-01
Pattern centric design based sensitive patterns and process monitor in manufacturing
NASA Astrophysics Data System (ADS)
Hsiang, Chingyun; Cheng, Guojie; Wu, Kechih
2017-03-01
As design rules migrate to smaller dimensions, process variation requirements are tighter than ever and challenge the limits of device yield. Masks, lithography, etching, and other processes have to meet very tight specifications in order to keep defects and CD within the margins of the process window. Conventionally, inspection and metrology equipment is utilized to monitor and control wafer quality in-line. In high-throughput optical inspection, nuisance filtering and review classification become a tedious, labor-intensive job in manufacturing. Certain high-resolution SEM images are taken to validate defects after optical inspection. These high-resolution SEM images catch not only the points highlighted by optical inspection, but also their surrounding patterns. However, this pattern information is not well utilized in conventional quality control methods. Complementary design-based pattern monitoring not only monitors and analyzes the variation of pattern sensitivity but also reduces nuisance and highlights defective patterns and killer defects. After grouping, in either single or multiple layers, systematic defects can be identified quickly in this flow. In this paper, we applied design-based pattern monitoring in different layers to monitor the impacts of process variation on all kinds of patterns. First, the contour of a high-resolution SEM image is extracted and aligned to design with offset adjustment and fine alignment [1]. Second, specified pattern rules can be applied on the design clip area, the same size as the SEM image, to form POI (pattern of interest) areas. Third, the discrepancy between contour and design is measured for different pattern types in measurement blocks. Fourth, defective patterns are reported by discrepancy detection criteria and pattern grouping [4]. Meanwhile, reported pattern defects are ranked by count and by discrepancy severity. In this step, process-sensitive, highly repeatable systematic defects can be identified quickly. Through this design-based pattern monitoring method, most optical inspection nuisances can be filtered out at the contour-to-design discrepancy measurement. Daily analysis results are stored in a database as a reference to compare with incoming data. The defective pattern library contains existing and known systematic defect patterns, which helps to catch and identify new pattern defects or process impacts. On the other hand, this defect pattern library provides extra valuable information for mask, pattern, and defect verification, inspection care-area generation, further OPC fixes, and process enhancement and investigation.
Metareasoning and Social Evaluations in Cognitive Agents
NASA Astrophysics Data System (ADS)
Pinyol, Isaac; Sabater-Mir, Jordi
Reputation mechanisms have been recognized as one of the key technologies when designing multi-agent systems. They are especially relevant in complex open environments, where they become a non-centralized mechanism to control interactions among agents. Cognitive agents tackling such complex societies must use reputation information not only for selecting partners to interact with, but also in metareasoning processes to change reasoning rules. This is the focus of this paper. We argue for the necessity of allowing, as cognitive system designers, a certain degree of freedom in the reasoning rules of the agents. We also describe cognitive approaches to agency that support this idea. Furthermore, taking as a base the computational reputation model Repage and its integration in a BDI architecture, we use the previous ideas to specify metarules and processes to modify the reasoning paths of the agent at run time. In concrete terms, we propose a metarule to update the link between Repage and the belief base, and a metarule and a process to update an axiom incorporated in the belief logic of the agent. Regarding this last issue we also provide empirical results that show the evolution of agents that use it.
Cai, Weidong; Leung, Hoi-Chung
2011-01-01
Background The human inferior frontal cortex (IFC) is a large heterogeneous structure with distinct cytoarchitectonic subdivisions and fiber connections. It has been found to be involved in a wide range of executive control processes, from target detection and rule retrieval to response control. Since these processes are often studied separately, the functional organization of executive control processes within the IFC remains unclear. Methodology/Principal Findings We conducted an fMRI study to examine the activities of the subdivisions of IFC during the presentation of a task cue (rule retrieval) and during the performance of a stop-signal task (requiring response generation and inhibition) in comparison to a not-stop task (requiring response generation but not inhibition). We utilized a mixed event-related and block design to separate brain activity corresponding to transient control processes from rule-related and sustained control processes. We found differentiation in control processes within the IFC. Our findings reveal that the bilateral ventral-posterior IFC/anterior insula are more active on both successful and unsuccessful stop trials relative to not-stop trials, suggesting a potential role in the early stage of stopping, such as triggering the stop process. Direct countermanding seems to be outside of the IFC. In contrast, the dorsal-posterior IFC/inferior frontal junction (IFJ) showed transient activity corresponding to the infrequent presentation of the stop signal in both tasks, and the left anterior IFC showed differential activity in response to the task cues. The IFC subdivisions also exhibited similar but distinct patterns of functional connectivity during response control. Conclusions/Significance Our findings suggest that executive control processes are distributed across the IFC and that the different subdivisions of IFC may support different control operations through parallel cortico-cortical and cortico-striatal circuits. PMID:21673969
First-order fire effects models for land management: Overview and issues
Elizabeth D. Reinhardt; Matthew B. Dickinson
2010-01-01
We give an overview of the science application process at work in supporting fire management. First-order fire effects models, such as those discussed in accompanying papers, are the building blocks of software systems designed for application to landscapes over time scales from days to centuries. Fire effects may be modeled using empirical, rule-based, or process...
NASA Astrophysics Data System (ADS)
Seoud, Ahmed; Kim, Juhwan; Ma, Yuansheng; Jayaram, Srividya; Hong, Le; Chae, Gyu-Yeol; Lee, Jeong-Woo; Park, Dae-Jin; Yune, Hyoung-Soon; Oh, Se-Young; Park, Chan-Ha
2018-03-01
Sub-resolution assist feature (SRAF) insertion techniques have been effectively used for a long time now to increase process latitude in the lithography patterning process. Rule-based SRAF and model-based SRAF are complementary solutions, and each has its own benefits, depending on the objectives of applications and the criticality of the impact on manufacturing yield, efficiency, and productivity. Rule-based SRAF provides superior geometric output consistency and faster runtime performance, but the associated recipe development time can be of concern. Model-based SRAF provides better coverage for more complicated pattern structures in terms of shapes and sizes, with considerably less time required for recipe development, although consistency and performance may be impacted. In this paper, we introduce a new model-assisted template extraction (MATE) SRAF solution, which employs decision tree learning in a model-based solution to provide the benefits of both rule-based and model-based SRAF insertion approaches. The MATE solution is designed to automate the creation of rules/templates for SRAF insertion, and is based on the SRAF placement predicted by model-based solutions. The MATE SRAF recipe provides optimum lithographic quality in relation to various manufacturing aspects in a very short time, compared to traditional methods of rule optimization. Experiments were done using memory device pattern layouts to compare the MATE solution to existing model-based SRAF and pixelated SRAF approaches, based on lithographic process window quality, runtime performance, and geometric output consistency.
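As a rough illustration of the MATE idea, the sketch below trains a decision tree on geometric context features of candidate SRAF sites, using model-based placements as training labels, so the learned tree can then be replayed as a fast, consistent, rule-like recipe. The feature names and shallow-tree setting are assumptions for illustration, not the authors' implementation.

# Sketch: learn rule-like SRAF insertion from model-based placements.
# Features per candidate site (e.g., distance to nearest main feature,
# local pattern density) are assumed to be precomputed.
from sklearn.tree import DecisionTreeClassifier

def train_mate(features, model_based_labels, max_depth=6):
    """features: (n_sites, n_features); labels: 1 = model placed an SRAF."""
    tree = DecisionTreeClassifier(max_depth=max_depth)  # shallow => readable rules
    tree.fit(features, model_based_labels)
    return tree

def insert_srafs(tree, candidate_features):
    """Replay the learned template rules on a new layout's candidate sites."""
    return tree.predict(candidate_features)  # 1 = insert an SRAF here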
NASA Astrophysics Data System (ADS)
Maneri, E.; Gawronski, W.
1999-10-01
The linear quadratic Gaussian (LQG) design algorithms described in [2] and [5] have been used in the controller design of JPL's beam-waveguide [5] and 70-m [6] antennas. This algorithm significantly improves tracking precision in a windy environment. This article describes the graphical user interface (GUI) software for the design of LQG controllers. It consists of two parts: the basic LQG design and the fine-tuning of the basic design using a constrained optimization algorithm. The GUI was developed to simplify the design process, to make it user-friendly, and to enable the design of an LQG controller by someone with a limited control engineering background. The user manipulates the GUI sliders and radio buttons while watching the antenna performance. Simple rules are given on the GUI display.
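For orientation, the state-feedback half of an LQG design reduces to solving an algebraic Riccati equation. Below is a minimal sketch for a generic linear plant; the double-integrator model and the weights (which play the role of the GUI sliders) are illustrative assumptions, not the antenna model from the article.

# Sketch: the LQR (state-feedback) half of an LQG design for a linear
# plant x' = Ax + Bu, minimizing the integral of x'Qx + u'Ru.
import numpy as np
from scipy.linalg import solve_continuous_are

def lqr_gain(A, B, Q, R):
    P = solve_continuous_are(A, B, Q, R)   # algebraic Riccati solution
    return np.linalg.solve(R, B.T @ P)     # K = R^-1 B' P

# Toy double-integrator plant; Q and R act like the tuning sliders.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
K = lqr_gain(A, B, Q=np.diag([10.0, 1.0]), R=np.array([[0.1]]))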
Code of Federal Regulations, 2010 CFR
2010-07-01
... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false The United States Patent and Trademark Office as a Designated Office or Elected Office. 1.414 Section 1.414 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES International Processing...
Survey research: it's just a few questions, right?
Tait, Alan R; Voepel-Lewis, Terri
2015-07-01
While most anesthesiologists and other physician- or nurse-scientists are familiar with traditional descriptive, observational, and interventional study design, survey research has typically remained the preserve of the social scientists. To that end, this article provides a basic overview of the elements of good survey design and offers some rules of thumb to help guide investigators through the survey process. © 2015 John Wiley & Sons Ltd.
Rule-Based Event Processing and Reaction Rules
NASA Astrophysics Data System (ADS)
Paschke, Adrian; Kozlenkov, Alexander
Reaction rules and event processing technologies play a key role in making business and IT/Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. Over the last decades, various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.
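To make the condition-action distinction concrete, here is a toy sketch of an event-condition-action (ECA) reaction rule engine; the event fields, rule format, and threshold are invented for illustration and do not come from any particular system surveyed in the paper.

# Sketch: a toy event-condition-action (ECA) reaction rule engine.
rules = []

def when(event_type, condition, action):
    rules.append((event_type, condition, action))

def post(event):
    """Detect an event and invoke the actions of matching reaction rules."""
    for event_type, condition, action in rules:
        if event["type"] == event_type and condition(event):
            action(event)

# Example rule: react to a temperature event only in an actionable situation.
when("temperature",
     lambda e: e["value"] > 100.0,
     lambda e: print(f"shutdown triggered at {e['value']} deg"))
post({"type": "temperature", "value": 105.2})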
Design Rules for Life Support Systems
NASA Technical Reports Server (NTRS)
Jones, Harry
2002-01-01
This paper considers some of the common assumptions and engineering rules of thumb used in life support system design. One general design rule is that the longer the mission, the more the life support system should use recycling and regenerable technologies. A more specific rule is that, if the system grows more than half the food, the food plants will supply all the oxygen needed for crew life support. There are many such design rules that help in planning the analysis of life support systems and in checking results. These rules are typically if-then statements describing the results of steady-state, "back of the envelope," mass flow calculations. They are useful in identifying plausible candidate life support system designs and in making rough allocations between resupply and resource recovery. Life support system designers should always review the design rules and make quick steady-state calculations before doing detailed design and dynamic simulation. This paper develops the basis for the different assumptions and design rules and discusses how they should be used. We start top-down, with the highest level requirement to sustain human beings in a closed environment off Earth. We consider the crew needs for air, water, and food. We then discuss atmosphere leakage and recycling losses. The needs to support the crew and to make up losses define the fundamental life support system requirements. We consider the trade-offs between resupplying and recycling oxygen, water, and food. The specific choices between resupply and recycling are determined by mission duration, presence of in-situ resources, etc., and are defining parameters of life support system design.
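Rules of this kind reduce to steady-state mass balances. The sketch below illustrates the style of "back of the envelope" check the paper describes; the per-person daily rates and recycling fractions are round-number assumptions for illustration, not the paper's figures.

# Sketch: steady-state "back of the envelope" life support mass balance.
# Per-person daily rates (kg/day) are illustrative assumptions.
O2_NEED, H2O_NEED, FOOD_NEED = 0.84, 3.5, 0.62

def resupply_mass(crew, days, o2_recycle=0.0, h2o_recycle=0.0, food_grown=0.0):
    """Total resupply mass (kg) after credit for recycling/growth fractions."""
    o2 = crew * days * O2_NEED * (1.0 - o2_recycle)
    h2o = crew * days * H2O_NEED * (1.0 - h2o_recycle)
    food = crew * days * FOOD_NEED * (1.0 - food_grown)
    return o2 + h2o + food

# Design-rule flavor: short missions favor resupply, long ones recycling.
print(resupply_mass(crew=4, days=30))                                  # open loop
print(resupply_mass(4, 500, o2_recycle=0.9, h2o_recycle=0.9, food_grown=0.5))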
Research directed toward improved echelles for the ultraviolet
NASA Technical Reports Server (NTRS)
1977-01-01
Research was undertaken to demonstrate that improved efficiencies for low frequency gratings are obtainable with the careful application of present technology. The motivation for the study was the desire to be assured that the grating-efficiency design goals for potential Space Telescope spectrographs can be achieved. The work was organized to compare gratings made with changes in the three specific parameters: the ruling tool profile, the coating material, and the lubricants used during the ruling process. A series of coatings and test gratings were fabricated and were examined for surface smoothness with a Nomarski Differential Interference Microscope and an electron microscope. Photomicrographs were obtained to show the difference in smoothness of the various coatings and rulings. Efficiency measurements were made for those test rulings that showed good groove characteristics: smoothness, proper ruling depth, and absence of defects. The intuitive feeling that higher grating efficiency should be correlated with the degree of smoothness of both the coating and the grating is supported by the results.
NASA Astrophysics Data System (ADS)
Gleason, J. L.; Hillyer, T. N.
2011-12-01
Clouds and the Earth's Radiant Energy System (CERES) is one of NASA's highest priority Earth Observing System (EOS) scientific instruments. The CERES science team will integrate data from the CERES Flight Model 5 (FM5) on the NPOESS Preparatory Project (NPP) in addition to the four CERES scanning instruments on Terra and Aqua. The CERES production system consists of over 75 Product Generation Executives (PGEs) maintained by twelve subsystem groups. The processing chain fuses CERES instrument observations with data from 19 other unique sources. The addition of FM5 to the over 22 instrument-years of data to be reprocessed from flight models 1-4 creates a need for an optimized production processing approach. This poster discusses a new approach, using JBoss and Perl to manage job scheduling and interdependencies between PGEs and external data sources. The new optimized approach uses JBoss to serve handler servlets which regulate PGE-level job interdependencies and job completion notifications. Additional servlets are used to regulate all job submissions from the handlers and to interact with the operator. Perl submission scripts are used to build Process Control Files and to interact directly with the operating system and cluster scheduler. The result is a reduced burden on the operator, achieved by algorithmically enforcing a set of rules that determine the optimal time to produce data products with the highest integrity. These rules are designed on a per-PGE basis and change periodically. This design provides the means to dynamically update PGE rules at run time and increases processing throughput by using an event-driven controller. The immediate notification of a PGE's completion (an event) allows successor PGEs to launch at the proper time with minimal start-up latency, thereby increasing computer system utilization.
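The event-driven idea, that a PGE's completion notification immediately releases any successors whose inputs are now satisfied, can be sketched in a few lines. The PGE names and dependency table below are invented for illustration; they are not the CERES system's actual servlets or rules.

# Sketch: event-driven release of successor jobs on PGE completion.
deps = {"PGE_B": {"PGE_A"}, "PGE_C": {"PGE_A"}, "PGE_D": {"PGE_B", "PGE_C"}}
done, submitted = set(), set()

def submit(pge):
    submitted.add(pge)
    print(f"submitting {pge}")  # stand-in for the cluster-scheduler call

def on_complete(pge):
    """Handler invoked by a job-completion notification (the 'event')."""
    done.add(pge)
    for succ, preds in deps.items():
        if succ not in submitted and preds <= done:
            submit(succ)  # launch immediately: minimal start-up latency

submit("PGE_A")
for finished in ("PGE_A", "PGE_B", "PGE_C"):
    on_complete(finished)  # PGE_D launches as soon as B and C are done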
Elements of Designing for Cost
NASA Technical Reports Server (NTRS)
Dean, Edwin B.; Unal, Resit
1992-01-01
During recent history in the United States, government systems development has been performance driven. As a result, systems within a class have experienced exponentially increasing cost over time in fixed year dollars. Moreover, little emphasis has been placed on reducing cost. This paper defines designing for cost and presents several tools which, if used in the engineering process, offer the promise of reducing cost. Although other potential tools exist for designing for cost, this paper focuses on rules of thumb, quality function deployment, Taguchi methods, concurrent engineering, and activity based costing. Each of these tools has been demonstrated to reduce cost if used within the engineering process.
Code of Federal Regulations, 2011 CFR
2011-07-01
... requirements of the Federal Rules of Civil Procedure (28 U.S.C. Appendix)? 230.26 Section 230.26 Postal Service....26 Do these rules affect the service of process requirements of the Federal Rules of Civil Procedure... Rules of Civil Procedure regarding service of process. ...
Indexing. ERIC Processing Manual, Section VII.
ERIC Educational Resources Information Center
Houston, Jim, Ed.
Rules and guidelines are provided for subject indexing in the ERIC system. The principle of "subject access" is discussed with particular reference to "coordinate indexing," which involves designating subject content by unit terms (or tags) that may be put together or "coordinated" for subsequent retrieval. The nature…
Pro Se Court: A Simulation Game
ERIC Educational Resources Information Center
Gallagher, Arlene F.; Hartstein, Elliott
1973-01-01
The complexities of courtroom procedure and rule of evidence often dissuade the classroom teacher from using the mock trial strategy. This model has been designed for role playing and for focusing on the judicial decision-making process: deliberation on the issues of a case. (Author/JB)
A Hybrid Genetic Programming Algorithm for Automated Design of Dispatching Rules.
Nguyen, Su; Mei, Yi; Xue, Bing; Zhang, Mengjie
2018-06-04
Designing effective dispatching rules for production systems is a difficult and time-consuming task if done manually. In the last decade, the growth of computing power, advanced machine learning, and optimisation techniques has made the automated design of dispatching rules possible, and automatically discovered rules are competitive with or outperform existing rules developed by researchers. Genetic programming is one of the most popular approaches to discovering dispatching rules in the literature, especially for complex production systems. However, the large heuristic search space may prevent genetic programming from finding near-optimal dispatching rules. This paper develops a new hybrid genetic programming algorithm for dynamic job shop scheduling based on a new representation, a new local search heuristic, and efficient fitness evaluators. Experiments show that the new method is effective regarding the quality of evolved rules. Moreover, evolved rules are also significantly smaller and contain more relevant attributes.
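For flavor, a dispatching rule is just a priority function over job and shop attributes; a GP system evolves such expressions automatically. The sketch below encodes two classic hand-designed rules and applies them to a queue; the job attributes and example rules are illustrative assumptions, not the evolved rules from the paper.

# Sketch: dispatching rules as priority functions over job attributes.
jobs = [
    {"id": 1, "proc_time": 5.0, "due": 20.0},
    {"id": 2, "proc_time": 2.0, "due": 12.0},
    {"id": 3, "proc_time": 8.0, "due": 15.0},
]

spt = lambda j, now: j["proc_time"]                      # shortest processing time
slack = lambda j, now: j["due"] - now - j["proc_time"]   # minimum slack

def next_job(queue, rule, now):
    """Pick the job the rule ranks as most urgent (lowest value)."""
    return min(queue, key=lambda j: rule(j, now))

print(next_job(jobs, spt, now=3.0)["id"])    # -> 2
print(next_job(jobs, slack, now=3.0)["id"])  # -> 3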
Mining Rare Associations between Biological Ontologies
Benites, Fernando; Simon, Svenja; Sapozhnikova, Elena
2014-01-01
The constantly increasing volume and complexity of available biological data requires new methods for their management and analysis. An important challenge is the integration of information from different sources in order to discover possible hidden relations between already known data. In this paper we introduce a data mining approach which relates biological ontologies by mining cross and intra-ontology pairwise generalized association rules. Its advantage is sensitivity to rare associations, for these are important for biologists. We propose a new class of interestingness measures designed for hierarchically organized rules. These measures allow one to select the most important rules and to take into account rare cases. They favor rules with an actual interestingness value that exceeds the expected value. The latter is calculated taking into account the parent rule. We demonstrate this approach by applying it to the analysis of data from Gene Ontology and GPCR databases. Our objective is to discover interesting relations between two different ontologies or parts of a single ontology. The association rules that are thus discovered can provide the user with new knowledge about underlying biological processes or help improve annotation consistency. The obtained results show that produced rules represent meaningful and quite reliable associations. PMID:24404165
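The idea of favoring rules whose actual interestingness exceeds the value expected from the parent rule can be illustrated with confidence: a specialized (child) rule is kept only if it beats the expectation set by its more general parent. The sketch below is a generic illustration of this scheme, not the authors' exact measure.

# Sketch: keep a specialized (child) rule only if its confidence
# exceeds what its more general parent rule already implies.
def confidence(n_ab, n_a):
    return n_ab / n_a

def interesting_child(child_conf, parent_conf, margin=1.2):
    """Child must beat the parent-based expectation by a margin."""
    return child_conf > margin * parent_conf

parent = confidence(n_ab=300, n_a=1000)   # general rule A -> B: 0.30
child = confidence(n_ab=45, n_a=100)      # rare specialization A' -> B: 0.45
print(interesting_child(child, parent))   # True: exceeds the expectation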
Clinical Trials Targeting Aging and Age-Related Multimorbidity
Crimmins, Eileen M; Grossardt, Brandon R; Crandall, Jill P; Gelfond, Jonathan A L; Harris, Tamara B; Kritchevsky, Stephen B; Manson, JoAnn E; Robinson, Jennifer G; Rocca, Walter A; Temprosa, Marinella; Thomas, Fridtjof; Wallace, Robert; Barzilai, Nir
2017-01-01
Background There is growing interest in identifying interventions that may increase health span by targeting biological processes underlying aging. The design of efficient and rigorous clinical trials to assess these interventions requires careful consideration of eligibility criteria, outcomes, sample size, and monitoring plans. Methods Experienced geriatrics researchers and clinical trialists collaborated to provide advice on clinical trial design. Results Outcomes based on the accumulation and incidence of age-related chronic diseases are attractive for clinical trials targeting aging. Accumulation and incidence rates of multimorbidity outcomes were developed by selecting at-risk subsets of individuals from three large cohort studies of older individuals. These provide representative benchmark data for decisions on eligibility, duration, and assessment protocols. Monitoring rules should be sensitive to targeting aging-related, rather than disease-specific, outcomes. Conclusions Clinical trials targeting aging are feasible, but require careful design consideration and monitoring rules. PMID:28364543
Meyer, Claas; Reutter, Michaela; Matzdorf, Bettina; Sattler, Claudia; Schomers, Sarah
2015-07-01
In recent years, increasing attention has been paid to financial environmental policy instruments that have played important roles in solving agri-environmental problems throughout the world, particularly in the European Union and the United States. The ample and increasing literature on Payments for Ecosystem Services (PES) and agri-environmental measures (AEMs), generally understood as governmental PES, shows that certain single design rules may have an impact on the success of a particular measure. Based on this research, we focused on the interplay of several design rules and conducted a comparative analysis of AEMs' institutional arrangements by examining 49 German cases. We analyzed the effects of the design rules and certain rule combinations on the success of AEMs. Compliance and noncompliance with the hypothesized design rules and the success of the AEMs were surveyed by questioning the responsible agricultural administration and the AEMs' mid-term evaluators. The different rules were evaluated in regard to their necessity and sufficiency for success using Qualitative Comparative Analysis (QCA). Our results show that combinations of certain design rules such as environmental goal targeting and area targeting conditioned the success of the AEMs. Hence, we generalize design principles for AEMs and discuss implications for the general advancement of ecosystem services and the PES approach in agri-environmental policies. Moreover, we highlight the relevance of the results for governmental PES program research and design worldwide. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Takács, Ondřej; Kostolányová, Kateřina
2016-06-01
This paper describes the Virtual Teacher, which uses a set of rules to automatically adapt the way of teaching. These rules consist of two parts: conditions on various student properties or the learning situation, and conclusions that specify different adaptation parameters. The rules can be used for general adaptation of each subject, or they can be specific to some subject. The rule-based system of the Virtual Teacher is intended for use in pedagogical experiments in adaptive e-learning and is therefore designed for users without an education in computer science. The Virtual Teacher was used in the dissertation theses of two students, who conducted two pedagogical experiments. This paper also describes the phase of simulating and modeling the theoretically prepared adaptive process in a modeling tool, which has all the required parameters and was created especially for the occasion. The experiments are being conducted on groups of virtual students and using virtual study material.
NASA Astrophysics Data System (ADS)
Gabor, Allen H.; Brendler, Andrew C.; Brunner, Timothy A.; Chen, Xuemei; Culp, James A.; Levinson, Harry J.
2018-03-01
The relationship between edge placement error, semiconductor design-rule determination and predicted yield in the era of EUV lithography is examined. This paper starts with the basics of edge placement error and then builds up to design-rule calculations. We show that edge placement error (EPE) definitions can be used as the building blocks for design-rule equations, but that in the last several years the term "EPE" has been used in the literature to refer to many patterning errors that are not EPE. We then explore the concept of "Good Fields" [1] and use it to predict the n-sigma value needed for design-rule determination. Specifically, fundamental yield calculations based on the failure opportunities per chip are used to determine at what n-sigma "value" design rules need to be tested to ensure high yield. The "value" can be a space between two features, an intersect area between two features, a minimum area of a feature, etc. It is shown that across-chip variation of design-rule-relevant values needs to be tested at sigma values between seven and eight, which is much higher than the four-sigma values traditionally used for design-rule determination. After recommending new statistics for design-rule calculations, the paper examines the impact of EUV lithography on the sources of variation important for design-rule calculations. We show that stochastics can be treated as an effective dose variation that is fully sampled across every chip. Combining the increased within-chip variation from EUV with the requirement that across-chip variation of design-rule-relevant values not cause yield loss at significantly higher sigma values than have traditionally been considered, we reach the conclusion that across-wafer, wafer-to-wafer and lot-to-lot variation will have to overscale for any technology introducing EUV lithography where stochastic noise is a significant fraction of the effective dose variation. We emphasize stochastic effects on edge placement error distributions and appropriate design-rule setting. While CD distributions with long tails coming from stochastic effects do bring increased risk of failure (especially on chips that may have over a billion failure opportunities per layer), there are other sources of variation that have sharp cutoffs, i.e., no tails. We review these sources and show how distributions with different skew and kurtosis values combine.
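The n-sigma logic can be made concrete with a one-line yield model: if a chip has N independent failure opportunities and each fails when a Gaussian excursion exceeds n sigma, yield is roughly (1 - p)^N with p the one-sided tail probability. The sketch below is a generic illustration of this reasoning, not the paper's model; it shows why a billion opportunities per layer pushes the required test sigma toward seven or eight.

# Sketch: yield vs. design-rule test sigma for N failure opportunities.
from math import erf, sqrt

def tail(n_sigma):
    """One-sided Gaussian tail probability beyond n sigma."""
    return 0.5 * (1.0 - erf(n_sigma / sqrt(2.0)))

def yield_frac(n_sigma, opportunities):
    return (1.0 - tail(n_sigma)) ** opportunities

for n in (4, 6, 7, 8):
    print(n, yield_frac(n, 1e9))
# 4 sigma gives essentially zero yield at 1e9 opportunities per chip;
# only around 7-8 sigma does the predicted yield approach 100%.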
Learning the Rules of the Game
NASA Astrophysics Data System (ADS)
Smith, Donald A.
2018-03-01
Games have often been used in the classroom to teach physics ideas and concepts, but there has been less published on games that can be used to teach scientific thinking. D. Maloney and M. Masters describe an activity in which students attempt to infer rules to a game from a history of moves, but the students don't actually play the game. Giving the list of moves allows the instructor to emphasize the important fact that nature usually gives us incomplete data sets, but it does make the activity less immersive. E. Kimmel suggested letting students attempt to figure out the rules to Reversi by playing it, but this game only has two players, which makes it difficult to apply in a classroom setting. Kimmel himself admits the choice of Reversi is somewhat arbitrary. There are games, however, that are designed to make the process of figuring out the rules an integral aspect of play. These games involve more people and require only a deck or two of cards. I present here an activity constructed around the card game Mao, which can be used to help students recognize aspects of scientific thinking. The game is particularly good at illustrating the importance of falsification tests (questions designed to elicit a negative answer) over verification tests (examples that confirm what is already suspected) for illuminating the underlying rules.
A rule-based system for real-time analysis of control systems
NASA Astrophysics Data System (ADS)
Larson, Richard R.; Millard, D. Edward
1992-10-01
An approach to automating the real-time analysis of flight-critical health monitoring and system status is being developed and evaluated at the NASA Dryden Flight Research Facility. A software package was developed in-house and installed as part of the extended aircraft interrogation and display system. This design features a knowledge-base structure in the form of rules to formulate interpretation and decision logic for real-time data. This technique has been applied to ground verification and validation testing and to flight test monitoring, where quick, real-time, safety-of-flight decisions can be very critical. In many cases post-processing and manual analysis of flight system data are not required. The processing of real-time data for analysis is described, along with the output format, which features a message stack display. The development, construction, and testing of the rule-driven knowledge base, along with an application using the X-31A flight test program, are presented.
ERIC Educational Resources Information Center
Martin, Elizabeth; And Others
Based on definitions of a machine-readable data file (MRDF) taken from the Anglo-American Cataloging Rules, second edition (AACR2) and Standards for Cataloging Nonprint Materials, the following recommendations for processing items of computer software are provided: (1) base main and added entry determination on AACR2; (2) place designation of form…
Multi-tasking arbitration and behaviour design for human-interactive robots
NASA Astrophysics Data System (ADS)
Kobayashi, Yuichi; Onishi, Masaki; Hosoe, Shigeyuki; Luo, Zhiwei
2013-05-01
Robots that interact with humans in household environments are required to handle multiple real-time tasks simultaneously, such as carrying objects, collision avoidance and conversation with humans. This article presents a design framework for the control and recognition processes that meets these requirements while taking into account stochastic human behaviour. The proposed design method first introduces a Petri net for the synchronisation of multiple tasks. The Petri net formulation is converted to Markov decision processes and processed in an optimal control framework. Three tasks (safety confirmation, object conveyance and conversation) interact and are expressed by the Petri net. Using the proposed framework, tasks that normally tend to be designed by integrating many if-then rules can be designed in a systematic manner, in a state estimation and optimisation framework, from the viewpoint of shortest-time optimal control. The proposed arbitration method was verified by simulations and experiments using RI-MAN, which was developed for interactive tasks with humans.
NASA Astrophysics Data System (ADS)
Toussaint, Bert
In this paper, the author explores the development of knowledge in two crucial fields, river management and coastal management, in the 19th century and the first decades of the 20th century. Were there similar characteristics in this development? Which types of knowledge can be distinguished? Who were the principal actors in these processes? Did the knowledge evolution have a Dutch stamp or a rather international flavour? To structure the analysis, the author uses the concept of a technology regime, a set of technical rules which shapes the know-how of engineers, their design rules and research processes. The analysis shows that the knowledge development of river management and coastal management followed different evolution paths between 1800 and 1940. In the field of river management, a substantial amount of mathematical and physical theory had been gradually developed since the end of the 17th century. After 1850, the regularization approach gradually gained widespread support. Empirical data, design rules, theoretical knowledge and engineering pivoted around the regularization approach, and a technology regime around this approach emerged. The regularization regime developed further in the 20th century, and handbooks were increasingly shaped by mathematical and physical reasoning and formulas. Coastal management, on the other hand, was until the 1880s a rather marginal activity. Coastal engineering was an extremely complex and multidimensional field of knowledge which no single engineer was able to grasp. The foundation of a Dutch weather institute was a first important step towards a more theoretical approach. The Zuiderzee works (starting in 1925) probably gave the most important stimulus to scientific coastal research. They were also a main factor in Rijkswaterstaat's establishment of scientific institutes. So from the 1920s, Rijkswaterstaat became a major producer of scientific knowledge, not only in tidal modelling but also in coastal research. Thanks to a multidisciplinary knowledge network, coastal research transformed from a marginal into a first-rank scientific field, and this transformation enabled Rijkswaterstaat to set a much higher level of ambition in coastal management. The 1953 flood and the Deltaworks marked a new era. New design rules for sea dykes and river levees, based on a revolutionary statistical risk approach, were determined, and design rules for the Deltaworks estuary closures were developed, enabled by the development of hydraulic research.
DOT National Transportation Integrated Search
2006-05-01
To help transportation agencies understand and implement the provisions of the Rule, FHWA has developed four guidance documents. This Guide is designed to help transportation agencies develop and/or update their own policies, processes, and proce...
ERIC Educational Resources Information Center
Salpeter, Judy
2008-01-01
Complying with, and teaching young people about, copyright in an educational setting often feels burdensome. That's because copyright laws were not designed to facilitate the sort of sharing and collaborating that has become widespread in the digital age. The innovative nonprofit organization Creative Commons turns the process around, making the…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-21
... of Persons Undermining the Sovereignty of Lebanon or Its Democratic Processes and Institutions... persons to undermine Lebanon's legitimate and democratically elected government or democratic institutions, to contribute to the deliberate breakdown in the rule of law in Lebanon, including through...
Computer Aided Process Planning for Non-Axisymmetric Deep Drawing Products
NASA Astrophysics Data System (ADS)
Park, Dong Hwan; Yarlagadda, Prasad K. D. V.
2004-06-01
In general, deep drawing products have various cross-section shapes, such as cylindrical, rectangular and non-axisymmetric shapes. The application of surface area calculation to the non-axisymmetric deep drawing process has not been published yet. In this research, a surface area calculation for non-axisymmetric deep drawing products with elliptical shape was constructed for the design of the blank shape of deep drawing products, using an AutoLISP function of AutoCAD software. A computer-aided process planning (CAPP) system for rotationally symmetric deep drawing products has been developed. However, the application of the system to non-axisymmetric components has not been reported yet. Thus, a CAPP system for non-axisymmetric deep drawing products with elliptical shape was constructed using process sequence design. The system developed in this work consists of four modules. The first is a shape recognition module to recognize non-axisymmetric products. The second is a three-dimensional (3-D) modeling module to calculate the surface area for non-axisymmetric products. The third is a blank design module to create an oval-shaped blank with an identical surface area. The fourth is a process planning module based on the production rules that play the most important role in an expert system for manufacturing. The production rules are generated and upgraded by interviewing field engineers. In particular, the drawing coefficient and the punch and die radii for elliptical-shape products are considered as the main design parameters. The suitability of this system was verified by applying it to a real deep drawing product. The constructed CAPP system should be very useful for reducing manufacturing lead time and improving product accuracy.
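The blank design step rests on surface-area equivalence: the oval blank must carry the same surface area as the drawn elliptical shell. A generic sketch of that calculation follows; the decomposition into bottom plus straight wall, and the use of Ramanujan's perimeter approximation, are illustrative assumptions rather than the paper's AutoLISP routine.

# Sketch: size an oval blank by equating surface areas (elliptical cup).
from math import pi, sqrt

def ellipse_perimeter(a, b):
    """Ramanujan's approximation for an ellipse with semi-axes a, b."""
    h = ((a - b) / (a + b)) ** 2
    return pi * (a + b) * (1.0 + 3.0 * h / (10.0 + sqrt(4.0 - 3.0 * h)))

def cup_surface_area(a, b, height):
    """Bottom + straight wall of an elliptical cup (flange ignored)."""
    return pi * a * b + ellipse_perimeter(a, b) * height

def blank_axes(a, b, height):
    """Scale the ellipse so the flat blank area matches the cup area."""
    k = sqrt(cup_surface_area(a, b, height) / (pi * a * b))
    return k * a, k * b   # semi-axes of the oval blank

print(blank_axes(a=30.0, b=20.0, height=25.0))  # (mm) oval blank semi-axes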
An application of object-oriented knowledge representation to engineering expert systems
NASA Technical Reports Server (NTRS)
Logie, D. S.; Kamil, H.; Umaretiya, J. R.
1990-01-01
The paper describes an object-oriented knowledge representation and its application to engineering expert systems. The object-oriented approach promotes efficient handling of the problem data by allowing knowledge to be encapsulated in objects and organized by defining relationships between the objects. An Object Representation Language (ORL) was implemented as a tool for building and manipulating the object base. Rule-based knowledge representation is then used to simulate engineering design reasoning. Using a common object base, very large expert systems can be developed, comprised of small, individually processed, rule sets. The integration of these two schemes makes it easier to develop practical engineering expert systems. The general approach to applying this technology to the domain of the finite element analysis, design, and optimization of aerospace structures is discussed.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-27
...The Department of Agriculture (USDA) is adopting, as a final rule, without change, an interim rule that suspended the reporting and assessment requirements prescribed under the Washington-Oregon fresh prune marketing order. The marketing order regulates the handling of fresh prunes grown in designated counties in Washington and in Umatilla County, Oregon, and is administered locally by the Washington-Oregon Fresh Prune Marketing Committee (Committee). On June 1, 2010, the Committee unanimously voted to terminate Marketing Order No. 924. Since the only regulatory actions then in effect were the reporting and assessment requirements, the Committee included a recommendation to immediately suspend those activities while USDA processes the termination request. The reporting and assessment requirements will remain suspended until reinstated or permanently terminated.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-16
...-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Designation of Longer Period for Commission Action on Proceedings to Determine Whether to Approve or Disapprove Proposed Rule Change To Establish... proposed rule change to establish various ``Benchmark Orders'' under NASDAQ Rule 4751(f). The proposed rule...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-29
...-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Designation of Longer Period for Commission Action on Proceedings To Determine Whether To Approve or Disapprove Proposed Rule Change To Amend Rule...,\\2\\ a proposed rule change to amend Exchange Rule 4626--Limitation of Liability (``accommodation...
The study on the effect of pattern density distribution on the STI CMP process
NASA Astrophysics Data System (ADS)
Sub, Yoon Myung; Hian, Bernard Yap Tzen; Fong, Lee It; Anak, Philip Menit; Minhar, Ariffin Bin; Wui, Tan Kim; Kim, Melvin Phua Twang; Jin, Looi Hui; Min, Foo Thai
2017-08-01
The effects of pattern density on CMP characteristics were investigated using a wafer specially designed for the characterization of pattern dependencies in STI CMP [1]. The purpose of this study is to investigate the planarization behavior of a direct STI CMP process using a cerium oxide (CeO2)-based slurry system in terms of pattern density variation. The minimum design rule (DR) of the 180 nm technology node was adopted for the mask layout. The mask was successfully applied to the evaluation of a cerium oxide (CeO2) abrasive based direct STI CMP process. In this study, we describe the planarization behavior and loading effects of pattern density variation, characterized with layout pattern density and pitch variations using the masks mentioned above. Furthermore, the pattern-dependent variations in post-CMP remaining thickness, as a function of feature dimensions and spacing, were analyzed and evaluated. The goal was to establish a library method that can be used to generate design rules reducing the probability of CMP-related failures. The characterization was measured across various layouts with different pattern density ranges, and the effects of pattern density on STI CMP are discussed in this paper.
Kawakami, Tomoya; Fujita, Naotaka; Yoshihisa, Tomoki; Tsukamoto, Masahiko
2014-01-01
In recent years, sensors have become popular, and Home Energy Management Systems (HEMS) take an important role in saving energy without a decrease in QoL (Quality of Life). Many rule-based HEMSs have been proposed, and almost all of them assume "IF-THEN" rules. The Rete algorithm is a typical pattern matching algorithm for IF-THEN rules. We have previously proposed a rule-based HEMS using the Rete algorithm. In the proposed system, rules for managing energy are processed by smart taps in the network, so the loads of processing rules and collecting data are distributed across the smart taps. In addition, the number of processes and the amount of collected data are reduced by processing rules with the Rete algorithm. In this paper, we evaluate the proposed system by simulation. In the simulation environment, rules are processed by the smart tap related to the action part of each rule. We also implemented the proposed system as a HEMS using smart taps.
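For flavor, a much-simplified IF-THEN matcher is sketched below. A real Rete network additionally memoizes partial matches across cycles so unchanged facts are not re-tested, which is the source of its efficiency; the device facts and rules here are invented examples.

# Sketch: naive IF-THEN matching over smart-tap facts (a real Rete
# network would cache partial matches instead of rescanning).
facts = {"tv_watts": 140, "room_occupied": False, "hour": 14}

rules = [
    (lambda f: f["tv_watts"] > 0 and not f["room_occupied"], "turn_off_tv"),
    (lambda f: f["hour"] >= 23, "dim_lights"),
]

def match(facts, rules):
    """Return the actions of every rule whose IF part holds."""
    return [action for cond, action in rules if cond(facts)]

print(match(facts, rules))  # -> ['turn_off_tv']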
Patients' feelings about ward nursing regimes and involvement in rule construction.
Alexander, J
2006-10-01
This study compared two acute psychiatric ward nursing regimes, focusing on ward rules as a means of investigating the relationship between the flexibility/inflexibility of the regimes and patient outcomes. Previous studies identified an association between ward rules and patient aggression. A link between absconding and nurses' attitudes towards rule enforcement has also been explored. However, an in-depth exploration of ward rules from the perspective of nurses and patients had not been undertaken previously. The study aimed to discover the content of rules within acute psychiatric wards; to explore patients' responses to the rules; to evaluate the impact of rules and rule enforcement on nurse-patient relationships and on ward events; and to investigate the relationship between ward rules, ward atmosphere and ward design. The relevance of sociological theory emerged from the data analysis. During this process, the results were moved up to another conceptual level to represent the meaning of lived experience at the level of theory. For example, nurses' descriptions of their feelings in relation to rule enforcement were merged as role ambivalence. This concept was supported by examples from the transcripts. Other possible explanations for the data and the connections between them were checked by returning to each text unit in the cluster and ensuring that it fitted with the emergent theory. The design centred on a comparative interview study of 30 patients and 30 nurses within two acute psychiatric wards in different hospitals. Non-participant observations provided a context for the interview data. Measures of the Ward Atmosphere Scale, the Hospital-Hostel Practices Profile, ward incidents and levels of "as required" (PRN) medication were obtained. The analysis of the quantitative data was assisted by SPSS, and the qualitative analysis by QSR NUD*IST. Thematic and interpretative phenomenological methods were used in the analysis of the qualitative data. A series of 11 interrelated concepts emerged from an analysis of the data and a synthesis of the main themes. This paper focuses on the results and recommendations that emerged from the quantitative and qualitative patient data. A further paper will focus on nurses' perceptions of the same topics.
Ban, Jong-Wook; Emparanza, José Ignacio; Urreta, Iratxe; Burls, Amanda
2016-01-01
Many new clinical prediction rules are derived and validated, but the design and reporting quality of clinical prediction research has been less than optimal. We aimed to assess whether design characteristics of validation studies were associated with overestimation of clinical prediction rules' performance, and to evaluate whether validation studies clearly reported important methodological characteristics. Electronic databases were searched for systematic reviews of clinical prediction rule studies published between 2006 and 2010. Data were extracted from the eligible validation studies included in the systematic reviews. A meta-analytic meta-epidemiological approach was used to assess the influence of design characteristics on predictive performance. For each validation study, it was assessed whether 7 design and 7 reporting characteristics were properly described. A total of 287 validation studies of clinical prediction rules were collected from 15 systematic reviews (31 meta-analyses). Validation studies using a case-control design produced a summary diagnostic odds ratio (DOR) 2.2 times (95% CI: 1.2-4.3) larger than validation studies using cohort or unclear designs. When differential verification was used, the summary DOR was overestimated by twofold (95% CI: 1.2-3.1) compared to complete, partial and unclear verification. The summary relative DOR (RDOR) of validation studies with inadequate sample size was 1.9 (95% CI: 1.2-3.1) compared to studies with adequate sample size. Study site, reliability, and the clinical prediction rule itself were adequately described in 10.1%, 9.4%, and 7.0% of validation studies, respectively. Validation studies with design shortcomings may overestimate the performance of clinical prediction rules, and the quality of reporting among studies validating clinical prediction rules needs to be improved.
Kawano, Tomonori
2013-03-01
There have been a wide variety of approaches for handling pieces of DNA as "unplugged" tools for digital information storage and processing, including a series of studies in the security-related area, such as DNA-based digital barcodes, watermarks and cryptography. In the present article, novel designs of artificial genes as media for storing digitally compressed image data are proposed for bio-computing purposes, whereas natural genes principally encode proteins. Furthermore, the proposed system allows cryptographic application of DNA through biochemically editable designs with capacity for steganographic numeric data embedding. As a model application of the image-coding DNA technique, numerically and biochemically combined protocols are employed for ciphering given "passwords" and/or secret numbers using DNA sequences. The "passwords" of interest were decomposed into single letters and translated into font images coded on separate DNA chains, with coding regions in which the images are encoded based on the novel run-length encoding rule, and non-coding regions designed for biochemical editing and remodeling processes that reveal the hidden orientation of the letters composing the original "passwords." The latter processes require molecular biological tools for digestion and ligation of the fragmented DNA molecules, targeting the polymerase chain reaction-engineered termini of the chains. Lastly, additional protocols for steganographic overwriting of the numeric data of interest over the image-coding DNA are also discussed.
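The run-length idea can be sketched generically: compress each row of a 1-bit font image into (value, run) pairs and map the result onto the four-letter DNA alphabet. The mapping below, one orientation base plus a four-digit base-4 run length, is an invented illustration, not the paper's actual encoding rule.

# Sketch: run-length encode a 1-bit font row and map it to DNA bases.
BASES = "ACGT"

def rle(bits):
    """Compress a bit row into (value, run-length) pairs."""
    runs, count = [], 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append((prev, count))
            count = 1
    runs.append((bits[-1], count))
    return runs

def to_dna(runs):
    """Map each run to 5 bases: A/T for the bit value, 4 base-4 digits."""
    out = []
    for value, count in runs:
        out.append("A" if value == 0 else "T")
        out.extend(BASES[(count >> s) & 3] for s in (6, 4, 2, 0))
    return "".join(out)

row = [0, 0, 0, 1, 1, 0, 1, 1, 1, 1]
print(to_dna(rle(row)))  # one 5-base codon per run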
NASA Technical Reports Server (NTRS)
1981-01-01
The results of a preliminary study on the design of a radiation hardened fusible link programmable read-only memory (PROM) are presented. Various fuse technologies and the effects of radiation on MOS integrated circuits are surveyed. A set of design rules allowing the fabrication of a radiation hardened PROM using a Si-gate CMOS process is defined. A preliminary cell layout was completed and the programming concept defined. A block diagram is used to describe the circuit components required for a 4 K design. A design goal data sheet giving target values for the AC, DC, and radiation parameters of the circuit is presented.
Skrivanek, Zachary; Berry, Scott; Berry, Don; Chien, Jenny; Geiger, Mary Jane; Anderson, James H.; Gaydos, Brenda
2012-01-01
Background Dulaglutide (dula, LY2189265), a long-acting glucagon-like peptide-1 analog, is being developed to treat type 2 diabetes mellitus. Methods To foster the development of dula, we designed a two-stage adaptive, dose-finding, inferentially seamless phase 2/3 study. The Bayesian theoretical framework is used to adaptively randomize patients in stage 1 to 7 dula doses and, at the decision point, to either stop for futility or to select up to 2 dula doses for stage 2. After dose selection, patients continue to be randomized to the selected dula doses or comparator arms. Data from patients assigned the selected doses will be pooled across both stages and analyzed with an analysis of covariance model, using baseline hemoglobin A1c and country as covariates. The operating characteristics of the trial were assessed by extensive simulation studies. Results Simulations demonstrated that the adaptive design would identify the correct doses 88% of the time, compared to as low as 6% for a fixed-dose design (the latter value based on frequentist decision rules analogous to the Bayesian decision rules for adaptive design). Conclusions This article discusses the decision rules used to select the dula dose(s); the mathematical details of the adaptive algorithm—including a description of the clinical utility index used to mathematically quantify the desirability of a dose based on safety and efficacy measurements; and a description of the simulation process and results that quantify the operating characteristics of the design. PMID:23294775
Code of Federal Regulations, 2014 CFR
2014-07-01
... COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES National Processing Provisions Plant Patents § 1.166 Specimens. The applicant may be required to furnish specimens of the plant, or its flower or fruit, in a quantity and at a time in its stage of growth as may be designated, for study and inspection. Such...
Code of Federal Regulations, 2012 CFR
2012-07-01
... COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES National Processing Provisions Plant Patents § 1.166 Specimens. The applicant may be required to furnish specimens of the plant, or its flower or fruit, in a quantity and at a time in its stage of growth as may be designated, for study and inspection. Such...
Code of Federal Regulations, 2013 CFR
2013-07-01
... COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES National Processing Provisions Plant Patents § 1.166 Specimens. The applicant may be required to furnish specimens of the plant, or its flower or fruit, in a quantity and at a time in its stage of growth as may be designated, for study and inspection. Such...
Code of Federal Regulations, 2011 CFR
2011-07-01
... COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES National Processing Provisions Plant Patents § 1.166 Specimens. The applicant may be required to furnish specimens of the plant, or its flower or fruit, in a quantity and at a time in its stage of growth as may be designated, for study and inspection. Such...
9 CFR 205.101 - Certification-request and processing.
Code of Federal Regulations, 2014 CFR
2014-01-01
... an introductory explanation of how the system will operate; (2) Identify the information which will... the system is created and operated, and the system operator is designated; (ii) All regulations, rules... certification of a system, a written request for certification must be filed together with such documents as...
9 CFR 205.101 - Certification-request and processing.
Code of Federal Regulations, 2013 CFR
2013-01-01
... an introductory explanation of how the system will operate; (2) Identify the information which will... the system is created and operated, and the system operator is designated; (ii) All regulations, rules... certification of a system, a written request for certification must be filed together with such documents as...
40 CFR 65.163 - Other records.
Code of Federal Regulations, 2011 CFR
2011-07-01
...) Detailed schematics, design specifications of the control device, and piping and instrumentation diagrams... FEDERAL AIR RULE Closed Vent Systems, Control Devices, and Routing to a Fuel Gas System or a Process § 65... bypass lines that could divert a vent stream away from the control device and to the atmosphere, the...
Cycle time reduction by Html report in mask checking flow
NASA Astrophysics Data System (ADS)
Chen, Jian-Cheng; Lu, Min-Ying; Fang, Xiang; Shen, Ming-Feng; Ma, Shou-Yuan; Yang, Chuen-Huei; Tsai, Joe; Lee, Rachel; Deng, Erwin; Lin, Ling-Chieh; Liao, Hung-Yueh; Tsai, Jenny; Bowhill, Amanda; Vu, Hien; Russell, Gordon
2017-07-01
The Mask Data Correctness Check (MDCC) is a reticle-level, multi-layer, DRC-like check evolved from the mask rule check (MRC). The MDCC uses an extended job deck (EJB) to achieve mask composition and to perform a detailed check of the positioning and integrity of each component of the reticle. Different design patterns on the mask are mapped to different layers, so users can review the whole reticle and check the interactions between different designs before the final mask pattern file is available. However, many types of MDCC check results, such as errors from overlapping patterns, have very large and complex-shaped highlighted areas covering the boundary of the design. Users have to load the result OASIS file, overlay it on the original database assembled in the MDCC process in a layout viewer, and then search for the details of the check results. We introduce a quick result-reviewing method based on an html-format report generated by Calibre® RVE. In the report generation process, we analyze and extract the essential part of the result OASIS file into a result database (RDB) file using standard verification rule format (SVRF) commands. Calibre® RVE automatically loads the assembled reticle pattern and generates screenshots of these check results. All of these processes are triggered automatically as soon as the MDCC process finishes. Users simply open the html report to get the information they need: for example, the check summary, captured images of results, and their coordinates.
Ambient-aware continuous care through semantic context dissemination.
Ongenae, Femke; Famaey, Jeroen; Verstichel, Stijn; De Zutter, Saar; Latré, Steven; Ackaert, Ann; Verhoeve, Piet; De Turck, Filip
2014-12-04
The ultimate ambient-intelligent care room contains numerous sensors and devices to monitor the patient, sense and adjust the environment and support the staff. This sensor-based approach results in a large amount of data, which can be processed by current and future applications, e.g., task management and alerting systems. Today, nurses are responsible for coordinating all these applications and the supplied information, which reduces the added value and slows down the adoption rate. The aim of the presented research is the design of a pervasive and scalable framework that is able to optimize continuous care processes by intelligently reasoning on the large amount of heterogeneous care data. The developed Ontology-based Care Platform (OCarePlatform) consists of modular components that perform a specific reasoning task. Consequently, they can easily be replicated and distributed. Complex reasoning is achieved by combining the results of different components. To ensure that the components only receive information that is of interest to them at that time, they are able to dynamically generate and register filter rules with a Semantic Communication Bus (SCB). This SCB semantically filters all the heterogeneous care data according to the registered rules by using a continuous care ontology. The SCB can be distributed, and a cache can be employed to ensure scalability. A prototype implementation is presented, consisting of a new-generation nurse call system supported by a localization and a home automation component. The amount of data that is filtered and the performance of the SCB were evaluated by testing the prototype in a living lab. The delay introduced by processing the filter rules is negligible when 10 or fewer rules are registered. The OCarePlatform allows disseminating relevant care data to the different applications and additionally supports composing complex applications from a set of smaller independent components. In this way, the platform significantly reduces the amount of information that needs to be processed by the nurses. The delay resulting from processing the filter rules is linear in the number of rules. Distributed deployment of the SCB and the use of a cache allow further improvement of these performance results.
Developing a semantic web model for medical differential diagnosis recommendation.
Mohammed, Osama; Benlamri, Rachid
2014-10-01
In this paper we describe a novel model for differential diagnosis designed to make recommendations by utilizing semantic web technologies. The model is a response to a number of requirements, ranging from incorporating essential clinical diagnostic semantics to the integration of data mining for the process of identifying candidate diseases that best explain a set of clinical features. We introduce two major components, which we find essential to the construction of an integrated differential diagnosis recommendation model: the evidence-based recommender component and the proximity-based recommender component. Both approaches are driven by disease diagnosis ontologies designed specifically to enable the process of generating diagnostic recommendations: the disease-symptom ontology and the patient ontology. The evidence-based diagnosis process develops dynamic rules based on standardized clinical pathways. The proximity-based component employs data mining to provide clinicians with diagnosis predictions, and generates new diagnosis rules from the provided training datasets. This article describes the integration of these two components with the developed diagnosis ontologies to form a novel medical differential diagnosis recommendation model. It also provides test cases from the implementation of the overall model, which show quite promising diagnostic recommendation results.
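The interplay of the two recommender components can be sketched as follows; the diseases, rules, and cases are invented stand-ins, and Jaccard similarity over symptom sets stands in for the paper's data-mining component:

```python
# Toy hybrid recommender: an evidence-based rule filter keeps only diseases
# whose mandatory findings are present, then a nearest-neighbour ranking
# over past cases orders the survivors ("proximity" component).
from collections import Counter

CASES = [  # (symptom set, diagnosis) -- stand-in for a training dataset
    ({"fever", "cough", "dyspnea"}, "pneumonia"),
    ({"fever", "cough"}, "bronchitis"),
    ({"headache", "photophobia"}, "migraine"),
]

RULES = {  # evidence-based rules: required findings per candidate disease
    "pneumonia": {"fever"},
    "bronchitis": {"cough"},
    "migraine": {"headache"},
}

def candidates(findings: set) -> set:
    """Keep only diseases whose mandatory findings are all present."""
    return {d for d, req in RULES.items() if req <= findings}

def proximity_rank(findings: set, k: int = 2) -> Counter:
    """Rank by Jaccard similarity to past cases."""
    scored = sorted(CASES,
                    key=lambda c: -len(findings & c[0]) / len(findings | c[0]))
    return Counter(dx for _, dx in scored[:k])

patient = {"fever", "cough"}
allowed = candidates(patient)
ranked = [dx for dx, _ in proximity_rank(patient).most_common() if dx in allowed]
print(ranked)  # e.g. ['bronchitis', 'pneumonia']
```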
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Joshua M.
Manufacturing tasks that are deemed too hazardous for workers require the use of automation, robotics, and/or other remote handling tools. The associated hazards may be radiological or non-radiological, and based on the characteristics of the environment and processing, a design may necessitate robotic labor, human labor, or both. Other factors such as cost, ergonomics, maintenance, and efficiency also affect task allocation and other design choices. Handling the trade-offs among these factors can be complex, and lack of experience can be an issue when trying to determine whether and which feasible automation/robotics options exist. To address this problem, we utilize common engineering design approaches adapted for manufacturing system design in hazardous environments. We limit our scope to the conceptual and embodiment design stages, specifically a computational algorithm for concept generation and early design evaluation. For concept generation, we first develop the functional model or function structure for the process, using the common 'verb-noun' format for describing function. A common language or functional basis for manufacturing was developed and utilized to formalize function descriptions and guide rules for function decomposition. Potential components for embodiment are also grouped in terms of this functional language and stored in a database. The properties of each component are given as quantitative and qualitative criteria. Operators are also rated on task-relevant criteria, which are used to address task compatibility. Through the gathering of process requirements and constraints, construction of the component database, and development of the manufacturing basis and rule set, design knowledge is stored and available for computer use. Thus, once the higher-level process functions are defined, the computer can automate the synthesis of new design concepts through alternating steps of embodiment and function-structure updates/decomposition. In the process, criteria guide the allocation of functions to components/operators and help ensure compatibility and feasibility. Through multiple function assignment options and varied function structures, multiple design concepts are created. All of the generated designs are then evaluated on a number of relevant criteria: cost, dose, ergonomics, hazards, efficiency, etc. These criteria are computed using physical properties/parameters of each system, based on the qualities an engineer would use to make evaluations. Nuclear processes such as oxide conversion and electrorefining are used to aid algorithm development and provide test cases for the completed program. Through our approach, we capture design knowledge related to manufacturing and other operations in hazardous environments to enable a computational program to automatically generate and evaluate system design concepts.
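The concept-generation loop described above can be illustrated with a small enumeration-and-scoring sketch; the functions, components, and criteria weights are invented for illustration:

```python
# Functions in a verb-noun basis are matched against a component database,
# design concepts are enumerated as one embodiment per function, and a
# weighted sum over criteria (here cost and radiation dose) picks a winner.
FUNCTION_STRUCTURE = ["transfer powder", "convert oxide", "inspect product"]

COMPONENTS = {  # candidate embodiments per function, with criteria scores
    "transfer powder": [("conveyor", {"cost": 3, "dose": 1}),
                        ("operator", {"cost": 1, "dose": 5})],
    "convert oxide":   [("furnace-robot", {"cost": 5, "dose": 1})],
    "inspect product": [("vision-system", {"cost": 2, "dose": 0}),
                        ("operator", {"cost": 1, "dose": 3})],
}

def generate_concepts(functions, db):
    """Enumerate design concepts as one embodiment choice per function."""
    if not functions:
        yield []
        return
    head, *rest = functions
    for choice in db[head]:
        for tail in generate_concepts(rest, db):
            yield [(head, choice)] + tail

def evaluate(concept, weights={"cost": 1.0, "dose": 2.0}):
    """Simple weighted-sum evaluation; lower is better."""
    return sum(w * sum(c[1][k] for _, c in concept) for k, w in weights.items())

best = min(generate_concepts(FUNCTION_STRUCTURE, COMPONENTS), key=evaluate)
for fn, (name, _) in best:
    print(fn, "->", name)
```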
Entrepreneurship in health education and health promotion: five cardinal rules.
Eddy, James M; Stellefson, Michael L
2009-07-01
The nature of health education and health promotion (HE/HP) offers fertile ground for entrepreneurial activity. As primary prevention of chronic diseases becomes a more central component of the health and/or medical care continuum, entrepreneurial opportunities for health educators will continue to expand. The process used to design, implement, and evaluate health promotion and disease prevention clearly articulates with entrepreneurial, marketing management, and other business processes. Thus, entrepreneurs in HE/HP must be able to utilize business processes to facilitate creative, new HE/HP business ideas. The purpose of this article is to weave theory and practical application into a primer on entrepreneurial applications in HE/HP. More specifically, the authors meld their respective experiences and expertise to provide background thoughts on entrepreneurship in HE/HP and to develop a framework for establishing an entrepreneurial venture in HE/HP. Five Cardinal Rules for Entrepreneurs in HE/HP are proposed.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-24
... Organizations; C2 Options Exchange, Incorporated; Order Approving a Proposed Rule Change To Adopt a Designated... thereunder,\\2\\ a proposed rule change to adopt a Designated Primary Market-Maker (``DPM'') program. The... the Notice, C2 has proposed to adopt a DPM program. The associated proposed rules are based on the...
Layout finishing of a 28nm, 3 billion transistor, multi-core processor
NASA Astrophysics Data System (ADS)
Morey-Chaisemartin, Philippe; Beisser, Eric
2013-06-01
Designing a fully new 256-core processor is a great challenge for a fabless startup. In addition to all the architecture, functionality and timing issues, the layout by itself is a bottleneck due to all the process constraints of a 28nm technology. As developers of advanced layout finishing solutions, we were involved in the design flow of this huge chip with its 3 billion transistors. We had to face the issue of dummy pattern instantiation with respect to design constraints. All the design rules to generate the "dummies" are clearly defined in the Design Rule Manual, and some automatic procedures are provided by the foundry itself, but these routines do not take the designer's requests into account. Such a chip embeds both digital parts and analog modules for clock and power management. These two types of design each have their own set of constraints. In both cases, the insertion of dummies should not introduce unexpected variations leading to malfunctions. For example, on digital parts where signal race conditions are critical on long wires or buses, the introduction of uncontrolled parasitics along these nets is highly critical. For analog devices such as high-frequency, high-sensitivity comparators, the exact symmetry of the two parts of a current mirror generator must be guaranteed. Thanks to the easily customizable features of our dummy insertion tool, we were able to configure it to meet all the designer requirements as well as the process constraints. This paper will present all these advanced key features as well as the layout tricks used to fulfill all requirements.
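The keep-out-aware behavior such a configurable dummy-insertion tool needs can be sketched in a few lines; the rectangles and dimensions below are illustrative only:

```python
# Simplified constraint-aware dummy insertion: tile a region with dummy
# squares but skip any tile intersecting a keep-out zone (e.g. around a
# sensitive analog net or a matched current-mirror pair). Geometry is
# axis-aligned rectangles (x0, y0, x1, y1); real tools work on full layouts.
def overlaps(a, b):
    return not (a[2] <= b[0] or b[2] <= a[0] or a[3] <= b[1] or b[3] <= a[1])

def dummy_fill(region, keep_out, tile=1.0, gap=0.5):
    x0, y0, x1, y1 = region
    dummies = []
    y = y0
    while y + tile <= y1:
        x = x0
        while x + tile <= x1:
            cand = (x, y, x + tile, y + tile)
            if not any(overlaps(cand, ko) for ko in keep_out):
                dummies.append(cand)
            x += tile + gap
        y += tile + gap
    return dummies

# Exclude a horizontal band around a critical clock net crossing the region.
tiles = dummy_fill(region=(0, 0, 10, 10), keep_out=[(0, 4.0, 10, 6.0)])
print(len(tiles), "dummy tiles placed")
```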
150-nm DR contact holes die-to-database inspection
NASA Astrophysics Data System (ADS)
Kuo, Shen C.; Wu, Clare; Eran, Yair; Staud, Wolfgang; Hemar, Shirley; Lindman, Ofer
2000-07-01
A failure-analysis-driven yield enhancement concept, based on optimization of the mask manufacturing process and UV reticle inspection, is studied and shown to improve contact layer quality. This is achieved by relating the various manufacturing processes to very finely tuned contact defect detection. In this way, selecting an optimized manufacturing process with a fine-tuned inspection setup is achieved in a controlled manner. This paper presents a study, performed on a specially designed test reticle, which simulates production contact layers of 250nm, 180nm and 150nm design rules. The paper focuses on the use of advanced UV reticle inspection techniques as part of the process optimization cycle. Current inspection equipment uses traditional and insufficient methods of small contact-hole inspection and review.
Audio visual summary: Implementing PURPA in Mid-America
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The audio-visual presentation Implementing PURPA in Mid-America is a slide show designed to complement deliverable W-101-2, a booklet entitled Implementing PURPA in Mid-America: A Guide to the Public Utility Regulatory Policies Act. The presentation lasts 10 to 12 minutes and explains the major sections of PURPA, the rules promulgated by the Federal Energy Regulatory Commission to implement PURPA, and the implications of PURPA and its rules. It delineates the rights and responsibilities of citizens who want to sell electricity to utilities, explains the certification process, and discusses the rights and responsibilities of the utilities.
NASA Astrophysics Data System (ADS)
Yu, Nanpeng
As U.S. regional electricity markets continue to refine their market structures, designs and rules of operation in various ways, two critical issues are emerging. First, although much experience has been gained and costly and valuable lessons have been learned, there is still a lack of a systematic platform for evaluating the impact of a new market design from both engineering and economic points of view. Second, the transition from a monopoly paradigm characterized by a guaranteed rate of return to a competitive market created various unfamiliar financial risks for market participants, especially for the Investor-Owned Utilities (IOUs) and Independent Power Producers (IPPs). This dissertation uses agent-based simulation methods to tackle the market rule evaluation and financial risk management problems. The California energy crisis in 2000-01 showed what could happen to an electricity market if it did not go through comprehensive and rigorous testing before its implementation. Due to the complexity of the market structure, the strategic interaction between participants, and the underlying physics, it is difficult to fully evaluate the implications of potential changes to market rules. This dissertation presents a flexible and integrative method to assess market designs through agent-based simulations. Realistic simulation scenarios on a 225-bus system are constructed for evaluation of the proposed PJM-like market power mitigation rules of the California electricity market. Simulation results show that in the absence of market power mitigation, generation company (GenCo) agents facilitated by Q-learning are able to exploit market flaws and make significantly higher profits relative to the competitive benchmark. The incorporation of PJM-like local market power mitigation rules is shown to be effective in suppressing the exercise of market power. The importance of financial risk management is exemplified by the recent financial crisis. In this dissertation, basic financial risk management concepts relevant to wholesale electric power markets are carefully explained and illustrated. In addition, the financial risk management problem in wholesale electric power markets is generalized as a four-stage process. Within the proposed financial risk management framework, the critical problem of financial bilateral contract negotiation is addressed. This dissertation analyzes a financial bilateral contract negotiation process between a generating company and a load-serving entity in a wholesale electric power market with congestion managed by locational marginal pricing. Nash bargaining theory is used to model a Pareto-efficient settlement point. The model predicts negotiation results under varied conditions and identifies circumstances in which the two parties might fail to reach an agreement. Both analysis and agent-based simulation are used to gain insight regarding how relative risk aversion and biased price estimates influence negotiated outcomes. These results should provide useful guidance to market participants in their bilateral contract negotiation processes.
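The market-power result can be illustrated with a stateless (bandit-form) Q-learning sketch of a single GenCo agent; the action set, cost, and stylized price response are invented for illustration, not the dissertation's 225-bus setup:

```python
# A GenCo agent chooses a bid markup over marginal cost; without mitigation
# it learns that a moderate markup beats competitive (cost-based) bidding.
import random

ACTIONS = [1.0, 1.2, 1.5]            # bid markups over marginal cost
Q = {a: 0.0 for a in ACTIONS}        # single-state Q-table for brevity
alpha, epsilon, mc = 0.1, 0.1, 20.0  # learning rate, exploration rate, cost

def clearing_price(bid):
    """Stylized market response: higher bids clear less often."""
    return bid if random.random() < max(0.0, 1.5 - bid / mc) else None

random.seed(0)
for _ in range(5000):
    a = random.choice(ACTIONS) if random.random() < epsilon else max(Q, key=Q.get)
    price = clearing_price(mc * a)
    profit = (price - mc) if price is not None else 0.0
    Q[a] += alpha * (profit - Q[a])  # stateless Q-update (bandit form)

print(max(Q, key=Q.get))  # the markup the agent learns to prefer (here 1.2)
```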
Smith-Spark, James H; Henry, Lucy A; Messer, David J; Zięcik, Adam P
2017-08-01
The executive function of fluency describes the ability to generate items according to specific rules. Production of words beginning with a certain letter (phonemic fluency) is impaired in dyslexia, while generation of words belonging to a certain semantic category (semantic fluency) is typically unimpaired. However, in dyslexia, verbal fluency has generally been studied only in terms of overall words produced. Furthermore, the performance of adults with dyslexia on non-verbal design fluency tasks has not been explored but would indicate whether deficits could be explained by executive control, rather than phonological processing, difficulties. Phonemic, semantic and design fluency tasks were presented to adults with and without dyslexia, using fine-grained performance measures and controlling for IQ. Hierarchical regressions indicated that dyslexia predicted lower phonemic fluency, but not semantic or design fluency. At the fine-grained level, dyslexia predicted a smaller number of switches between subcategories on phonemic fluency, while dyslexia did not predict the size of phonemically related clusters of items. Overall, the results suggested that phonological processing problems were at the root of dyslexia-related fluency deficits; however, executive control difficulties could not be completely ruled out as an alternative explanation. Developments in research methodology, equating executive demands across fluency tasks, may resolve this issue. Copyright © 2017 John Wiley & Sons, Ltd.
Simplify to survive: prescriptive layouts ensure profitable scaling to 32nm and beyond
NASA Astrophysics Data System (ADS)
Liebmann, Lars; Pileggi, Larry; Hibbeler, Jason; Rovner, Vyacheslav; Jhaveri, Tejas; Northrop, Greg
2009-03-01
The time-to-market-driven need to maintain concurrent process-design co-development, even in the face of discontinuous patterning, process, and device innovation, is reiterated. The escalating design rule complexity resulting from increasing layout sensitivities in physical and electrical yield, and the resulting risk to profitable technology scaling, are reviewed. Shortcomings in traditional Design for Manufacturability (DfM) solutions are identified and contrasted with the highly successful integrated design-technology co-optimization used for SRAM and other memory arrays. The feasibility of extending memory-style design-technology co-optimization, based on a highly simplified layout environment, to logic chips is demonstrated. Layout density benefits, modeled patterning and electrical yield improvements, as well as substantially improved layout simplicity, are quantified in a conventional versus template-based design comparison on a 65nm IBM PowerPC 405 microprocessor core. The adaptability of this highly regularized template-based design solution to different yield concerns and design styles is shown in the extension of this work to 32nm with an increased focus on interconnect redundancy. In closing, the work not covered in this paper, focused on the process side of the integrated process-design co-optimization, is introduced.
New cellular automaton designed to simulate geometration in gel electrophoresis
NASA Astrophysics Data System (ADS)
Krawczyk, M. J.; Kułakowski, K.; Maksymowicz, A. Z.
2002-08-01
We propose a new kind of cellular automaton to simulate the transport of DNA molecules through agarose gel. Two processes are taken into account: reptation at strong electric field E, described in the particle model, and geometration, i.e. subsequent hookings and releases of long molecules at and from gel fibres. The automaton rules are deterministic and they are designed to describe both processes within one unified approach. Thermal fluctuations are not taken into account. The number of simultaneous hookings is limited by the molecule length. The features of the automaton are: (i) the size of the cell neighbourhood for the automaton rule varies dynamically, from nearest neighbours to the entire molecule; (ii) the length of the time step is determined at each step according to dynamic rules. Calculations are made up to N=244 reptons in a molecule. Two subsequent stages of the motion are found. First, an initial set of random configurations of molecules is transformed into a more ordered phase, where most molecules are elongated along the applied field direction. After some transient time, the mobility μ reaches a constant value. It then varies with N as 1/N for long molecules. The band dispersion varies with time t approximately as Nt^(1/2). Our results indicate that the well-known plateau of the mobility μ vs. N does not hold at large electric fields.
NASA Astrophysics Data System (ADS)
Reuter, Markus; van Schaik, Antoinette
In this paper the link between process metallurgy, classical minerals processing, product-centric recycling and urban/landfill mining is discussed. The depth that has to be achieved in urban mining and recycling must draw on the wealth of theoretical knowledge and insight developed in the past in minerals and metallurgical processing. This background shows that recycling demands a product-centric approach, which considers simultaneously the multi-material interactions in man-made complex 'minerals'. Fast innovation in recycling and urban mining can be achieved by building on this well-developed basis and evolving the techniques and tools that have been developed over the years. This basis has already been used for many years to design, operate and control industrial plants for metal production, and it has been the basis for Design for Recycling rules for End-of-Life products. Using, among others, the UNEP Metal Recycling report as a basis (the authors are, respectively, Lead and Main Authors of the report), it is demonstrated that a common theoretical basis as developed in metallurgy and minerals processing can do much to level the playing field between primary processing, secondary processing, recycling, urban/landfill mining and product design, hence enhancing resource efficiency. Various scales of detail thus link product design with metallurgical process design and its fundamentals.
Das, Saptarshi; Pan, Indranil; Das, Shantanu
2015-09-01
An optimal trade-off design for a fractional order (FO)-PID controller is proposed with a Linear Quadratic Regulator (LQR) based technique using two conflicting time-domain objectives. A class of delayed FO systems with a single non-integer-order element, exhibiting both sluggish and oscillatory open-loop responses, is controlled here. The FO time-delay processes are handled within a multi-objective optimization (MOO) formalism of LQR-based FOPID design. A comparison is made between two contemporary approaches to stabilizing time-delay systems within LQR. The MOO control design methodology yields the Pareto optimal trade-off solutions between the tracking performance and the total variation (TV) of the control signal. Tuning rules are formed for the optimal LQR-FOPID controller parameters, using the median of the non-dominated Pareto solutions, to handle delayed FO processes. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
Layout pattern analysis using the Voronoi diagram of line segments
NASA Astrophysics Data System (ADS)
Dey, Sandeep Kumar; Cheilaris, Panagiotis; Gabrani, Maria; Papadopoulou, Evanthia
2016-01-01
Early identification of problematic patterns in very large scale integration (VLSI) designs is of great value as the lithographic simulation tools face significant timing challenges. To reduce the processing time, such a tool selects only a fraction of possible patterns which have a probable area of failure, with the risk of missing some problematic patterns. We introduce a fast method to automatically extract patterns based on their structure and context, using the Voronoi diagram of line-segments as derived from the edges of VLSI design shapes. Designers put line segments around the problematic locations in patterns called "gauges," along which the critical distance is measured. The gauge center is the midpoint of a gauge. We first use the Voronoi diagram of VLSI shapes to identify possible problematic locations, represented as gauge centers. Then we use the derived locations to extract windows containing the problematic patterns from the design layout. The problematic locations are prioritized by the shape and proximity information of the design polygons. We perform experiments for pattern selection in a portion of a 22-nm random logic design layout. The design layout had 38,584 design polygons (consisting of 199,946 line segments) on layer Mx, and 7079 markers generated by an optical rule checker (ORC) tool. The optical rules specify requirements for printing circuits with minimum dimension. Markers are the locations of some optical rule violations in the layout. We verify our approach by comparing the coverage of our extracted patterns to the ORC-generated markers. We further derive a similarity measure between patterns and between layouts. The similarity measure helps to identify a set of representative gauges that reduces the number of patterns for analysis.
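Since widely available libraries provide only point Voronoi diagrams, a rough approximation of this approach samples points along polygon edges and takes Voronoi vertices between shapes as candidate gauge centers; the geometry below is illustrative only, and a true line-segment Voronoi diagram (as in the paper) gives the medial axis exactly:

```python
# Sample points along polygon boundaries, build SciPy's point Voronoi
# diagram, keep vertices in the gap between shapes as candidate gauge
# centers, then cut fixed-size windows around them for pattern extraction.
import numpy as np
from scipy.spatial import Voronoi

def sample_edges(polygon, step=0.2):
    """Sample points along the closed boundary of a polygon (list of xy)."""
    pts = []
    for p, q in zip(polygon, polygon[1:] + polygon[:1]):
        p, q = np.asarray(p, float), np.asarray(q, float)
        n = max(int(np.linalg.norm(q - p) / step), 1)
        pts.extend(p + (q - p) * t for t in np.linspace(0, 1, n, endpoint=False))
    return np.array(pts)

# Two facing rectangles; the narrow gap between them is where gauge
# centers (problematic locations) would sit.
poly_a = [(0, 0), (4, 0), (4, 1), (0, 1)]
poly_b = [(0, 1.4), (4, 1.4), (4, 2.4), (0, 2.4)]
pts = np.vstack([sample_edges(poly_a), sample_edges(poly_b)])
vor = Voronoi(pts)

centers = [v for v in vor.vertices if 1.0 < v[1] < 1.4 and 0 < v[0] < 4]
windows = [(c[0] - 0.5, c[1] - 0.5, c[0] + 0.5, c[1] + 0.5) for c in centers]
print(len(windows), "candidate windows")
```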
DeviceEditor visual biological CAD canvas
2012-01-01
Background Biological Computer Aided Design (bioCAD) assists the de novo design and selection of existing genetic components to achieve a desired biological activity, as part of an integrated design-build-test cycle. To meet the emerging needs of Synthetic Biology, bioCAD tools must address the increasing prevalence of combinatorial library design, design rule specification, and scar-less multi-part DNA assembly. Results We report the development and deployment of web-based bioCAD software, DeviceEditor, which provides a graphical design environment that mimics the intuitive visual whiteboard design process practiced in biological laboratories. The key innovations of DeviceEditor include visual combinatorial library design, direct integration with scar-less multi-part DNA assembly design automation, and a graphical user interface for the creation and modification of design specification rules. We demonstrate how biological designs are rendered on the DeviceEditor canvas, and we present effective visualizations of genetic component ordering and combinatorial variations within complex designs. Conclusions DeviceEditor liberates researchers from DNA base-pair manipulation, and enables users to create successful prototypes using standardized, functional, and visual abstractions. Open and documented software interfaces support further integration of DeviceEditor with other bioCAD tools and software platforms. DeviceEditor saves researcher time and institutional resources through correct-by-construction design, the automation of tedious tasks, design reuse, and the minimization of DNA assembly costs. PMID:22373390
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-07
... Stock Exchange LLC Amending NYSE Rule 1 To Provide for the Designation of Qualified Employees and NYSE... qualified employees to act in place of any person named in a rule as having authority to act under such rule... 1 to provide that the Exchange may formally designate one or more qualified employees to act in...
ERIC Educational Resources Information Center
Lai, Mark H. C.; Kwok, Oi-man
2015-01-01
Educational researchers commonly use the rule of thumb of "design effect smaller than 2" as the justification of not accounting for the multilevel or clustered structure in their data. The rule, however, has not yet been systematically studied in previous research. In the present study, we generated data from three different models…
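The rule of thumb refers to the design effect of cluster sampling, commonly computed as DEFF = 1 + (m - 1) x ICC, where m is the average cluster size and ICC the intraclass correlation; a quick calculation shows how easily it exceeds 2:

```python
# Even a small ICC pushes the design effect past 2 for classroom-sized
# clusters, undermining the "design effect smaller than 2" justification.
def design_effect(avg_cluster_size: float, icc: float) -> float:
    return 1.0 + (avg_cluster_size - 1.0) * icc

for m, icc in [(25, 0.01), (25, 0.05), (25, 0.10)]:
    print(f"m={m}, ICC={icc:.2f} -> DEFF={design_effect(m, icc):.2f}")
# m=25, ICC=0.05 already gives DEFF = 2.20.
```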
Cerrada, Christian Jules; Dzubur, Eldin; Blackman, Kacie C. A.; Mays, Vickie; Shoptaw, Steven; Huh, Jimi
2017-01-01
Purpose Cigarette smoking is a preventable risk factor that contributes to unnecessary lung cancer burden among Korean Americans and there is limited research on effective smoking cessation strategies for this population. Smartphone-based smoking cessation apps that leverage just-in-time adaptive interventions (JITAIs) hold promise for smokers attempting to quit. However, little is known about how to develop and tailor a smoking cessation JITAI for Korean American emerging adult (KAEA) smokers. Method This paper documents the development process of MyQuit USC according to design guidelines for JITAI. Our development process builds on findings from a prior ecological momentary assessment study by using qualitative research methods. Semi-structured interviews and a focus group were conducted to inform which intervention options to offer and the decision rules that dictate their delivery. Results Qualitative findings highlighted that (1) smoking episodes are highly context-driven and that (2) KAEA smokers believe they need personalized cessation strategies tailored to different contexts. Thus, MyQuit USC operates via decision rules that guide the delivery of personalized implementation intentions, which are contingent on dynamic factors, to be delivered “just in time” at user-scheduled, high-risk smoking situations. Conclusion Through an iterative design process, informed by quantitative and qualitative formative research, we developed a smoking cessation JITAI tailored specifically for KAEA smokers. Further testing is under way to optimize future versions of the app with the most effective intervention strategies and decision rules. MyQuit USC has the potential to provide cessation support in real-world settings, when KAEAs need them the most. PMID:28070868
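A heavily simplified sketch of such a decision rule (the contexts, messages, and scheduling are invented examples, not the actual MyQuit USC rule set):

```python
# At a user-scheduled high-risk time, deliver the implementation intention
# tailored to the reported context; stay silent outside those windows.
from datetime import datetime

STRATEGIES = {
    "drinking with friends": "Text a friend instead of stepping out to smoke.",
    "after meals": "Chew gum and take a short walk right after eating.",
    "stress at work": "Do two minutes of paced breathing at your desk.",
}

def decide(now: datetime, scheduled: list) -> str:
    """scheduled: (hour-of-day, context) pairs set by the user in advance."""
    for hour, context in scheduled:
        if now.hour == hour:
            return STRATEGIES.get(context, "")
    return ""  # outside high-risk windows: no message

msg = decide(datetime(2017, 3, 4, 21), [(21, "drinking with friends")])
print(msg)
```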
Designing Class Methods from Dataflow Diagrams
NASA Astrophysics Data System (ADS)
Shoval, Peretz; Kabeli-Shani, Judith
A method for designing the class methods of an information system is described. The method is part of FOOM - Functional and Object-Oriented Methodology. In the analysis phase of FOOM, two models defining the users' requirements are created: a conceptual data model - an initial class diagram; and a functional model - hierarchical OO-DFDs (object-oriented dataflow diagrams). Based on these models, a well-defined process of method design is applied. First, the OO-DFDs are converted into transactions, i.e., system processes that support user tasks. The components and the process logic of each transaction are described in detail, using pseudocode. Then, each transaction is decomposed, according to well-defined rules, into class methods of various types: basic methods, application-specific methods and main transaction (control) methods. Each method is attached to a proper class; messages between methods express the process logic of each transaction. The methods are defined using pseudocode or message charts.
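The decomposition step can be pictured with a small data-structure sketch; the class, step, and method names are invented:

```python
# A transaction's steps are sorted into basic methods (pure data access),
# application-specific methods, and main transaction (control) methods,
# each attached to its owning class.
from dataclasses import dataclass, field

@dataclass
class ClassDesign:
    name: str
    basic: list = field(default_factory=list)     # create/read/update/delete
    specific: list = field(default_factory=list)  # application logic
    control: list = field(default_factory=list)   # main transaction methods

def assign(transaction_steps, classes):
    for step, kind, owner in transaction_steps:
        bucket = {"basic": classes[owner].basic,
                  "specific": classes[owner].specific,
                  "control": classes[owner].control}[kind]
        bucket.append(step)

classes = {n: ClassDesign(n) for n in ("Order", "Customer")}
assign([("read_customer", "basic", "Customer"),
        ("check_credit", "specific", "Customer"),
        ("place_order_txn", "control", "Order")], classes)
print(classes["Customer"])
```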
Virtual modelling of components of a production system as the tool of lean engineering
NASA Astrophysics Data System (ADS)
Monica, Z.
2015-11-01
Lean Engineering is considered one of the most effective manufacturing management techniques. The term "lean engineering" was coined by Japanese manufacturers, and the high efficiency of the method resulted in a meaningful growth of interest in the Lean philosophy among European companies and, consequently, in its adoption on European markets. The Lean philosophy is an approach to manufacturing that minimizes the use of all resources, including time, employed in the company for a variety of activities. This implies first identifying and then eliminating activities that do not generate added value in the fields of design, manufacturing, supply chain management, and customer relations. Producers following these principles not only employ teams of multi-professional employees at all levels of the organization, but also use more automated machines to produce large quantities of products with a high degree of diversity. Lean Engineering applies a number of principles and practical guidelines that allow costs to be reduced by eliminating waste and by simplifying all manufacturing and maintenance processes. Nowadays, powerful engineering programs, commonly described by the term CAD/CAM/CAE, can be applied to realize the concept of Lean Engineering. They consist of different packages both for the design of elements and for process design; their common feature is their area of application, namely computer programs assisting the design, development and manufacturing phases of a manufacturing process. The idea of the presented work is to use the Siemens NX software to aid the process of creating a Lean Engineering system. The investigated system is a robotized workcell. In the NX system, the components of the designed workcell, such as machine tools, industrial robots, conveyors and buffers, are created. The system makes it possible to functionally link these components in order to simulate the work process and to introduce the rules of Lean Engineering. The purpose is also to determine the rules of Lean design in such advanced design and simulation environments.
The MSFC Collaborative Engineering Process for Preliminary Design and Concept Definition Studies
NASA Technical Reports Server (NTRS)
Mulqueen, Jack; Jones, David; Hopkins, Randy
2011-01-01
This paper describes a collaborative engineering process developed by the Marshall Space Flight Center's Advanced Concepts Office for performing rapid preliminary design and mission concept definition studies for potential future NASA missions. The process has been developed and demonstrated for a broad range of mission studies, including human space exploration missions, space transportation system studies and in-space science missions. The paper will describe the design team structure and the specialized analytical tools that have been developed to enable a unique rapid design process. The collaborative engineering process consists of an integrated analysis approach for mission definition, vehicle definition and system engineering. The relevance of the collaborative process elements to the standard NASA NPR 7120.1 system engineering process will be demonstrated. The study definition process flow for each study discipline will be outlined, beginning with the study planning process, followed by definition of ground rules and assumptions, definition of study trades, mission analysis and subsystem analyses, leading to a standardized set of mission concept study products. The flexibility of the collaborative engineering design process to accommodate a wide range of study objectives, from technology definition and requirements definition to preliminary design studies, will be addressed. The paper will also describe the applicability of the collaborative engineering process to an integrated systems analysis approach for evaluating the functional requirements of evolving system technologies and capabilities needed to meet the needs of future NASA programs.
NASA Technical Reports Server (NTRS)
Nieten, Joseph L.; Burke, Roger
1992-01-01
The System Diagnostic Builder (SDB) is an automated software verification and validation tool using state-of-the-art Artificial Intelligence (AI) technologies. The SDB is used extensively by project BURKE at NASA-JSC as one component of a software re-engineering toolkit. The SDB is applicable to any government or commercial organization which performs verification and validation tasks. The SDB has an X-window interface, which allows the user to 'train' a set of rules for use in a rule-based evaluator. The interface has a window that allows the user to plot up to five data parameters (attributes) at a time. Using these plots and a mouse, the user can identify and classify a particular behavior of the subject software. Once the user has identified the general behavior patterns of the software, he can train a set of rules to represent his knowledge of that behavior. The training process builds rules and fuzzy sets to use in the evaluator. The fuzzy sets classify those data points not clearly identified as a particular classification. Once an initial set of rules is trained, each additional data set given to the SDB will be used by a machine learning mechanism to refine the rules and fuzzy sets. This is a passive process and, therefore, it does not require any additional operator time. The evaluation component of the SDB can be used to validate a single software system using some number of different data sets, such as a simulator. Moreover, it can be used to validate software systems which have been re-engineered from one language and design methodology to a totally new implementation.
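The train-then-evaluate idea, with fuzzy sets grading the points that crisp rules leave ambiguous, can be sketched as follows (the thresholds and set shapes are invented for illustration):

```python
# Crisp rules classify clear-cut behavior; triangular fuzzy sets grade the
# data points that fall between classes, as the SDB's fuzzy sets do for
# points not clearly identified as a particular classification.
def triangular(x, a, b, c):
    """Membership of x in a triangular fuzzy set peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def classify(error):
    """Rule: |error| < 0.1 is 'nominal', > 0.5 is 'faulty'; else fuzzy."""
    if abs(error) < 0.1:
        return "nominal", 1.0
    if abs(error) > 0.5:
        return "faulty", 1.0
    m_nom = triangular(abs(error), 0.0, 0.1, 0.5)
    m_fault = triangular(abs(error), 0.1, 0.5, 1.0)
    return ("nominal", m_nom) if m_nom >= m_fault else ("faulty", m_fault)

for e in (0.05, 0.2, 0.45, 0.7):
    print(e, classify(e))
```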
Luan, Xiaoli; Chen, Qiang; Liu, Fei
2014-09-01
This article presents a new scheme to design a full matrix controller for high-dimensional multivariable processes based on equivalent transfer functions (ETF). Differing from existing ETF methods, the proposed ETF is derived directly by exploiting the relationship between the equivalent closed-loop transfer function and the inverse of the open-loop transfer function. Based on the obtained ETF, the full matrix controller is designed utilizing existing PI tuning rules. The newly proposed ETF model can more accurately represent the original process. Furthermore, the full matrix centralized controller design method proposed in this paper is applicable to high-dimensional multivariable systems with satisfactory performance. Comparison with other multivariable controllers shows that the designed ETF-based controller is superior with respect to design complexity and obtained performance. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
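One simplified, steady-state reading of the ETF relationship (stated here as an assumption for illustration, not the paper's exact derivation) takes the equivalent gain seen by loop i-j as the reciprocal of element (j, i) of the inverse of the open-loop gain matrix:

```python
# Steady-state sketch: equivalent gains from the inverse of the gain
# matrix; the gains below are the well-known Wood-Berry column values.
import numpy as np

G = np.array([[12.8, -18.9],
              [ 6.6, -19.4]])   # steady-state gain matrix of a 2x2 process

G_inv = np.linalg.inv(G)
ETF = 1.0 / G_inv.T             # equivalent gains, element-wise
print(np.round(ETF, 2))         # inputs to ordinary PI tuning rules
```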
Process-based organization design and hospital efficiency.
Vera, Antonio; Kuntz, Ludwig
2007-01-01
The central idea of process-based organization design is that organizing a firm around core business processes leads to cost reductions and quality improvements. We investigated theoretically and empirically whether the implementation of a process-based organization design is advisable in hospitals. The data came from a database compiled by the Statistical Office of the German federal state of Rheinland-Pfalz and from a written questionnaire, which was sent to the chief executive officers (CEOs) of all 92 hospitals in this federal state. We used data envelopment analysis (DEA) to measure hospital efficiency, and factor analysis and regression analysis to test our hypothesis. Our principal finding is that a high degree of process-based organization has a moderate but significant positive effect on the efficiency of hospitals. The main implication is that hospitals should implement a process-based organization to improve their efficiency. However, to actually achieve positive effects on efficiency, it is of paramount importance to observe some implementation rules, in particular to mobilize physician participation and to create an adequate organizational culture.
Modeling Collective Animal Behavior with a Cognitive Perspective: A Methodological Framework
Weitz, Sebastian; Blanco, Stéphane; Fournier, Richard; Gautrais, Jacques; Jost, Christian; Theraulaz, Guy
2012-01-01
The last decades have seen an increasing interest in modeling collective animal behavior. Some studies try to reproduce as accurately as possible the collective dynamics and patterns observed in several animal groups with biologically plausible, individual behavioral rules. The objective is then essentially to demonstrate that the observed collective features may be the result of self-organizing processes involving quite simple individual behaviors. Other studies concentrate on the objective of establishing or enriching links between collective behavior research and cognitive or physiological research, which then requires that each individual rule be carefully validated. Here we discuss the methodological consequences of this additional requirement. Using the example of corpse clustering in ants, we first illustrate that it may be impossible to discriminate among alternative individual rules by considering only observational data collected at the group level. Six individual behavioral models are described: They are clearly distinct in terms of individual behaviors, they all reproduce satisfactorily the collective dynamics and distribution patterns observed in experiments, and we show theoretically that it is strictly impossible to discriminate two of these models even in the limit of an infinite amount of data, whatever the accuracy level. A set of methodological steps are then listed and discussed as practical ways to partially overcome this problem. They involve complementary experimental protocols specifically designed to address the behavioral rules successively, conserving group-level data for the overall model validation. In this context, we highlight the importance of maintaining a sharp distinction between model enunciation, with explicit references to validated biological concepts, and the formal translation of these concepts in terms of quantitative state variables and fittable functional dependences. Illustrative examples are provided of the benefits expected during the often long and difficult process of refining a behavioral model, designing adapted experimental protocols and inverting model parameters. PMID:22761685
ERIC Educational Resources Information Center
Orfield, Gary, Ed.; Marin, Patricia, Ed.; Flores, Stella M., Ed.; Garces, Liliana M., Ed.
2007-01-01
The United States Supreme Court's landmark 2003 decisions in "Grutter v. Bollinger" and "Gratz v. Bollinger" firmly established that university admissions policies which are designed to promote student body diversity and which employ race in a carefully crafted selection process can withstand constitutional challenge. The Supreme Court ruled that…
Fuzzy Logic Based Autonomous Parallel Parking System with Kalman Filtering
NASA Astrophysics Data System (ADS)
Panomruttanarug, Benjamas; Higuchi, Kohji
This paper presents an emulation of fuzzy logic control schemes for an autonomous parallel parking system in a backward maneuver. Four infrared sensors send distance data to a microcontroller for generating an obstacle-free parking path. Two of them, mounted on the front and rear wheels on the parking side, are used as inputs to the fuzzy rules to calculate a proper steering angle while backing. The other two, attached to the front and rear ends, serve to avoid collision with other cars along the parking space. At the end of the parking process, the vehicle will be in line with the other parked cars and positioned in the middle of the free space. The fuzzy rules are designed based upon a wall-following process. The performance of the infrared sensors is improved using Kalman filtering, which requires extra information from ultrasonic sensors. Starting from a 1-D state-space model of the ultrasonic sensor, the infrared reading is used as a measurement to update the predicted values. Experimental results demonstrate the effectiveness of the sensor improvement.
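A minimal 1-D Kalman filter of the kind described, with illustrative noise variances (the values are not the paper's):

```python
# Constant-distance state model predicts; each sensor reading updates the
# estimate. q is process noise variance, r is measurement noise variance.
def kalman_1d(measurements, q=0.01, r=0.25, x0=0.0, p0=1.0):
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                # predict: state assumed constant
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)      # update with the sensor measurement
        p = (1 - k) * p
        estimates.append(x)
    return estimates

readings = [2.6, 2.4, 2.7, 2.5, 2.5, 2.6]   # noisy distance samples (m)
print([round(v, 2) for v in kalman_1d(readings, x0=2.5)])
```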
Online intelligent controllers for an enzyme recovery plant: design methodology and performance.
Leite, M S; Fujiki, T L; Silva, F V; Fileti, A M F
2010-12-27
This paper focuses on the development of intelligent controllers for use in a process of enzyme recovery from pineapple rind. The proteolytic enzyme bromelain (EC 3.4.22.4) is precipitated with alcohol at low temperature in a fed-batch jacketed tank. Temperature control is crucial to avoid irreversible protein denaturation. Fuzzy or neural controllers offer a way of implementing solutions that cover dynamic and nonlinear processes. The design methodology and a comparative study on the performance of fuzzy-PI, neurofuzzy, and neural network intelligent controllers are presented. To tune the fuzzy PI Mamdani controller, various universes of discourse, rule bases, and membership function support sets were tested. A neurofuzzy inference system (ANFIS), based on Takagi-Sugeno rules, and a model predictive controller, based on neural modeling, were developed and tested as well. Using a Fieldbus network architecture, a coolant variable speed pump was driven by the controllers. The experimental results show the effectiveness of fuzzy controllers in comparison to the neural predictive control. The fuzzy PI controller exhibited a reduced error parameter (ITAE), lower power consumption, and better recovery of enzyme activity.
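The ITAE criterion used to rank the controllers is the time-weighted integral of absolute error; a discrete approximation over logged samples:

```python
# ITAE = integral of t * |e(t)| dt, trapezoidal approximation over samples.
def itae(times, errors):
    total = 0.0
    for (t0, e0), (t1, e1) in zip(zip(times, errors),
                                  zip(times[1:], errors[1:])):
        total += 0.5 * (t0 * abs(e0) + t1 * abs(e1)) * (t1 - t0)
    return total

# e.g. temperature error samples during a precipitation batch (illustrative)
print(itae([0, 10, 20, 30], [1.5, 0.6, 0.2, 0.05]))
```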
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-04
... Capital Commitment Schedule (``CCS'') interest; (3) NYSE Rule 70.25 to permit d-Quotes to be designated... that MPL Orders may interact with CCS interest; (3) NYSE Rule 70.25 to permit d- Quotes to be... the CCS pursuant to Rule 1000 would not be permitted to be designated as MPL Orders. The CCS is a...
2016-12-15
In 2011 and 2012, the Secretary, Department of Health and Human Services (HHS), promulgated regulations designed to govern the World Trade Center (WTC) Health Program (Program), including the processes by which eligible responders and survivors may apply for enrollment in the Program, obtain health monitoring and treatment for WTC-related health conditions, and appeal enrollment and treatment decisions, as well as a process to add new conditions to the List of WTC-Related Health Conditions (List). After using the regulations for a number of years, the Administrator of the WTC Health Program identified potential improvements to certain existing provisions, including, but not limited to, appeals of enrollment, certification, and treatment decisions, as well as the procedures for the addition of health conditions for WTC Health Program coverage. He also identified the need to add new regulatory provisions, including, but not limited to, standards for the disenrollment of a WTC Health Program member and decertification of a certified WTC-related health condition. A notice of proposed rulemaking was published on August 17, 2016; this action addresses public comments received on that proposed rulemaking, as well as three interim final rules promulgated since 2011, and finalizes the proposed rule and three interim final rules.
A Methodology for Multiple Rule System Integration and Resolution Within a Singular Knowledge Base
NASA Technical Reports Server (NTRS)
Kautzmann, Frank N., III
1988-01-01
Expert systems which support knowledge representation by qualitative modeling techniques experience problems when called upon to support integrated views embodying description and explanation, especially when other factors such as multiple causality, competing rule model resolution, and multiple uses of knowledge representation are included. A series of prototypes is being developed to demonstrate the feasibility of automating the processes of systems engineering, design and configuration, and diagnosis and fault management. The study involves not only a generic knowledge representation; it must also support multiple views at varying levels of description and interaction between physical elements, systems, and subsystems. Moreover, it involves models of description and explanation for each level. This multiple-model feature requires the development of control methods between rule systems and heuristics on a meta-level for each expert system involved in an integrated and larger class of expert systems. The broadest possible category of interacting expert systems is described, along with a general methodology for the knowledge representation and control of mutually exclusive rule systems.
50 CFR 424.16 - Proposed rules.
Code of Federal Regulations, 2011 CFR
2011-10-01
... OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A LISTING ENDANGERED AND THREATENED SPECIES AND DESIGNATING CRITICAL HABITAT Revision of the Lists § 424.16 Proposed rules. (a) General. Based... any proposed rule to list, delist, or reclassify a species, or to designate or revise critical habitat...
Ban, Jong-Wook; Emparanza, José Ignacio; Urreta, Iratxe; Burls, Amanda
2016-01-01
Background Many new clinical prediction rules are derived and validated, but the design and reporting quality of clinical prediction research has been less than optimal. We aimed to assess whether design characteristics of validation studies were associated with overestimation of clinical prediction rules' performance. We also aimed to evaluate whether validation studies clearly reported important methodological characteristics. Methods Electronic databases were searched for systematic reviews of clinical prediction rule studies published between 2006 and 2010. Data were extracted from the eligible validation studies included in the systematic reviews. A meta-analytic meta-epidemiological approach was used to assess the influence of design characteristics on predictive performance. For each validation study, it was assessed whether 7 design and 7 reporting characteristics were properly described. Results A total of 287 validation studies of clinical prediction rules were collected from 15 systematic reviews (31 meta-analyses). Validation studies using a case-control design produced a summary diagnostic odds ratio (DOR) 2.2 times (95% CI: 1.2-4.3) larger than validation studies using cohort or unclear designs. When differential verification was used, the summary DOR was overestimated by twofold (95% CI: 1.2-3.1) compared to complete, partial and unclear verification. The summary relative DOR (RDOR) of validation studies with inadequate sample size was 1.9 (95% CI: 1.2-3.1) compared to studies with adequate sample size. Study site, reliability, and the clinical prediction rule were adequately described in 10.1%, 9.4%, and 7.0% of validation studies, respectively. Conclusion Validation studies with design shortcomings may overestimate the performance of clinical prediction rules. The quality of reporting among studies validating clinical prediction rules needs to be improved. PMID:26730980
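For reference, the DOR summarized in such reviews comes from a rule's 2x2 classification table; the counts below are illustrative:

```python
# DOR = (TP/FN) / (FP/TN) = (TP*TN) / (FP*FN); higher means better
# discrimination, which is why design-driven inflation matters.
def diagnostic_odds_ratio(tp: int, fp: int, fn: int, tn: int) -> float:
    return (tp * tn) / (fp * fn)

print(diagnostic_odds_ratio(tp=90, fp=20, fn=10, tn=80))  # 36.0
```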
Cognitive Tutoring based on Intelligent Decision Support in the PENTHA Instructional Design Model
NASA Astrophysics Data System (ADS)
dall'Acqua, Luisa
2010-06-01
The aim of the research presented in this paper is to show how to support authors in developing rule-driven, subject-oriented, adaptable course content and meta-rules (representing the disciplinary epistemology, the model of teaching, the Learning Path structure, and the assessment parameters) for intelligent tutoring actions in a personalized, adaptive e-Learning environment. The focus is on teaching the student to be a decision manager in his own right, able to recognize the elements of a problem and to select the necessary information with a view to factual choices. In particular, our research intends to provide some fundamental guidelines for the definition of the didactical rules and logical relations that authors should provide to a cognitive tutoring system through the use of an Instructional Design method (the PENTHA Model), which proposes an educational environment able to increase productivity and operability, create conditions for a cooperative dialogue, develop participatory research activities of knowledge, observation and discovery, and customize the learning design in a complex and holistic vision of the learning/teaching processes.
A Rule-Based Modeling for the Description of Flexible and Self-healing Business Processes
NASA Astrophysics Data System (ADS)
Boukhebouze, Mohamed; Amghar, Youssef; Benharkat, Aïcha-Nabila; Maamar, Zakaria
In this paper we discuss the importance of ensuring that business processes are at the same time robust and agile. To this end, we consider reviewing the way business processes are managed. For instance, we consider offering a flexible way to model processes so that changes in regulations are handled through self-healing mechanisms. These changes may raise exceptions at run-time if not properly reflected in the processes. We therefore propose a new rule-based model that adopts ECA rules and is built upon formal tools. The business logic of a process can be summarized as a set of rules that implement an organization's policies. Each business rule is formalized using our ECAPE formalism (Event-Condition-Action-Post-condition-Post-event). This formalism allows translating a process into a graph of rules that is analyzed in terms of reliability and flexibility.
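A hedged sketch of an ECAPE-style rule record and of chaining rules into a graph by matching emitted post-events to triggering events (the rule content is invented, not the paper's formal semantics):

```python
# Each rule carries Event, Condition, Action, Post-condition, post-Event;
# firing a rule checks its post-condition before propagating its post-event
# to other rules, which implicitly forms the graph of rules.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ECAPERule:
    event: str
    condition: Callable[[dict], bool]
    action: Callable[[dict], None]
    post_condition: Callable[[dict], bool]
    post_event: str

def run(rules: List[ECAPERule], event: str, ctx: dict) -> None:
    for r in (r for r in rules if r.event == event):
        if r.condition(ctx):
            r.action(ctx)
            if r.post_condition(ctx):       # self-check before propagating
                run(rules, r.post_event, ctx)

rules = [
    ECAPERule("order.received", lambda c: c["amount"] > 0,
              lambda c: c.update(validated=True),
              lambda c: c.get("validated", False), "order.validated"),
    ECAPERule("order.validated", lambda c: True,
              lambda c: print("ship order of", c["amount"]),
              lambda c: True, "order.shipped"),
]
run(rules, "order.received", {"amount": 3})
```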
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aguirre, Jordan C.; Hawks, Steven A.; Ferreira, Amy S.
2015-03-18
Design rules are presented for significantly expanding sequential processing (SqP) into previously inaccessible polymer:fullerene systems by tailoring binary solvent blends for fullerene deposition. Starting with a base solvent that has high fullerene solubility, 2-chlorophenol (2-CP), ellipsometry-based swelling experiments are used to investigate different co-solvents for the fullerene-casting solution. By tuning the Flory-Huggins χ parameter of the 2-CP/co-solvent blend, it is possible to optimally swell the polymer of interest for fullerene interdiffusion without dissolution of the polymer underlayer. In this way, solar cell power conversion efficiencies are obtained for the PTB7 (poly[(4,8-bis[(2-ethylhexyl)oxy]benzo[1,2-b:4,5-b']dithiophene-2,6-diyl)(3-fluoro-2-[(2-ethylhexyl)carbonyl]thieno[3,4-b]thiophenediyl)]) and PC61BM (phenyl-C61-butyric acid methyl ester) materials combination that match those of blend-cast films. Both semicrystalline (e.g., P3HT (poly(3-hexylthiophene-2,5-diyl))) and entirely amorphous (e.g., PSDTTT (poly[(4,8-di(2-butyloxy)benzo[1,2-b:4,5-b']dithiophene-2,6-diyl)-alt-(2,5-bis(4,4'-bis(2-octyl)dithieno[3,2-b:2'3'-d]silole-2,6-diyl)thiazolo[5,4-d]thiazole)])) conjugated polymers can be processed into highly efficient photovoltaic devices using the solvent-blend SqP design rules. Grazing-incidence wide-angle x-ray diffraction experiments confirm that proper choice of the fullerene casting co-solvent yields well-ordered interdispersed bulk heterojunction (BHJ) morphologies without the need for subsequent thermal annealing or the use of trace solvent additives (e.g., diiodooctane). The results open SqP to polymer:fullerene systems that are currently incompatible with traditional methods of device fabrication, and make BHJ morphology control a more tractable problem.
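A back-of-envelope sketch of the solvent-blend tuning idea (an assumption for illustration, not the paper's procedure): estimate χ from Hildebrand solubility parameters and screen co-solvent fractions by the blend's volume-weighted parameter:

```python
# chi ~ 0.34 + V*(d_s - d_p)^2 / (R*T), with V the solvent molar volume
# (cm^3/mol) and deltas in MPa^0.5 (so delta^2 is in J/cm^3). chi well
# below ~0.5 signals a good solvent (risk of dissolving the underlayer);
# somewhat larger chi swells the polymer without dissolving it.
R = 8.314  # J/(mol K)

def chi(v_solvent, delta_s, delta_p, T=298.0, entropic=0.34):
    return entropic + v_solvent * (delta_s - delta_p) ** 2 / (R * T)

def blend_delta(deltas, fractions):
    """Volume-fraction-weighted solubility parameter of a solvent blend."""
    return sum(d * f for d, f in zip(deltas, fractions))

# Hypothetical numbers: sweep the co-solvent fraction and watch chi drop
# from swelling-only territory toward the dissolution regime.
for frac_cosolvent in (0.0, 0.3, 0.6):
    d_blend = blend_delta([24.0, 18.0], [1 - frac_cosolvent, frac_cosolvent])
    print(frac_cosolvent, round(chi(100.0, d_blend, 19.0), 2))
```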
Designing high-performance cost-efficient embedded SRAM in deep-submicron era
NASA Astrophysics Data System (ADS)
Kobozeva, Olga; Venkatraman, Ramnath; Castagnetti, Ruggero; Duan, Franklin; Kamath, Arvind; Ramesh, Shiva
2004-05-01
We have previously presented the smallest and fastest 6 Transistor (6T) Static Random Access Memory (SRAM) bitcells for System-on-Chip (SoC) high-density (HD) memories in 0.18 μm and 0.13 μm technologies. Our 1.87 μm² 6T SRAM bitcell, with a cell current of 47 μA and industry-lowest soft error rate (0.35 FIT/Kbit), is used to assemble memory blocks embedded into SoC designs in 0.13 μm process technology. Excellent performance is achieved at a low overall cost, as our bitcells are based on a standard CMOS process and demonstrate high yields in manufacturing. This paper discusses our methodology of embedded SRAM bitcell design. The key aspects of our approach are: 1) judicious selection of the tightest achievable yet manufacturable design rules to build the cell; 2) compatibility with a standard Optical Proximity Correction (OPC) flow; 3) use of parametric testing and yield analysis to achieve excellent design robustness and manufacturability. A thorough understanding of process limitations, particularly those related to photolithography, was critical to the successful design and manufacturing of our aggressive, yet robust SRAM bitcells. The patterning of critical layers, such as diffusion, poly gate, contact, and metal 1, has profound implications on the functionality, electrical performance, and manufacturability of memories. We have conducted the development of SRAM bitcells using two approaches for OPC: a) "manual" OPC, wherein the bitcell layout of each of the critical layers is achieved using iterative improvement of layout and aerial-image simulation, and b) automated OPC-compatible design, wherein the drawn bitcell layout becomes the subject of full-chip OPC. While manual OPC remains a popular option, automated OPC-compatible bitcell design is very attractive, as it does not require additional development costs to achieve fab-to-fab portability. In both cases we have obtained good results with respect to patterning of the critical layers, electrical performance of the bitcell, and memory yields. A critical part of our memory technology development effort is the design of memory-specific test structures that are used for: a) verifying electrical characteristics of SRAM transistors and b) confirming the robustness of the design rules used within the SRAM cell. In addition to electrical test structures, we have a fully functional SRAM test chip called RAMPCM that is composed of sub-blocks, each designated to evaluate the robustness of a specific critical design rule used within the bitcells. The results from the electrical testing and RAMPCM yield analysis are used to identify opportunities for improvements in the layout design. The paper will also suggest some techniques that can result in more design-friendly OPC solutions. Our work indicates that future IC designs can benefit from an automated OPC tool that can intelligently handle layout modifications according to design priorities.
Kawano, Tomonori
2013-01-01
There have been a wide variety of approaches for handling pieces of DNA as "unplugged" tools for digital information storage and processing, including a series of studies applied to security-related areas such as DNA-based digital barcodes, watermarks, and cryptography. In the present article, novel designs of artificial genes are proposed as media for storing digitally compressed image data for bio-computing purposes, whereas natural genes principally encode proteins. Furthermore, the proposed system allows cryptographic application of DNA through biochemically editable designs with capacity for steganographic embedment of numeric data. As a model case of applying the image-coding DNA technique, numerically and biochemically combined protocols are employed for ciphering given "passwords" and/or secret numbers using DNA sequences. The "passwords" of interest were decomposed into single letters and translated into font images coded on separate DNA chains, with coding regions in which the images are encoded based on a novel run-length encoding rule, and non-coding regions designed for the biochemical editing and remodeling processes that reveal the hidden orientation of the letters composing the original "passwords." The latter processes require molecular biological tools for digestion and ligation of the fragmented DNA molecules, targeting the polymerase chain reaction-engineered termini of the chains. Lastly, additional protocols for steganographic overwriting of numeric data of interest over the image-coding DNA are also discussed. PMID:23750303
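As an illustration of the run-length idea, the following toy sketch encodes one scanline of a 1-bit font image as run lengths and maps them onto DNA bases; the fixed-width base-4 mapping is an assumption, since the article's actual encoding rule is not reproduced here:

```python
# Toy sketch: run-length encode a 1-bit font image row and map the run
# lengths onto DNA bases (2 bits per base). The mapping is illustrative.
BASES = "ACGT"

def runs(bits):
    """Run-length encode a sequence of 0/1 pixels into run lengths."""
    out, count = [], 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            out.append(count)
            count = 1
    out.append(count)
    return out

def lengths_to_dna(lengths, width=4):
    """Encode each run length as a fixed-width base-4 string of bases."""
    seq = []
    for n in lengths:
        digits = []
        for _ in range(width):        # fixed-width base-4 representation
            digits.append(BASES[n % 4])
            n //= 4
        seq.extend(reversed(digits))
    return "".join(seq)

row = [0, 0, 1, 1, 1, 0, 1, 1]        # one scanline of a font bitmap
print(runs(row))                      # [2, 3, 1, 2]
print(lengths_to_dna(runs(row)))      # 'AAAGAAATAAACAAAG'
```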
Learning Problem-Solving Rules as Search Through a Hypothesis Space.
Lee, Hee Seung; Betts, Shawn; Anderson, John R
2016-07-01
Learning to solve a class of problems can be characterized as a search through a space of hypotheses about the rules for solving these problems. A series of four experiments studied how different learning conditions affected the search among hypotheses about the solution rule for a simple computational problem. Experiment 1 showed that a problem property such as computational difficulty of the rules biased the search process and so affected learning. Experiment 2 examined the impact of examples as instructional tools and found that their effectiveness was determined by whether they uniquely pointed to the correct rule. Experiment 3 compared verbal directions with examples and found that both could guide search. The final experiment tried to improve learning by using more explicit verbal directions or by adding scaffolding to the example. While both manipulations improved learning, learning still took the form of a search through a hypothesis space of possible rules. We describe a model that embodies two assumptions: (1) the instruction can bias the rules participants hypothesize rather than directly be encoded into a rule; (2) participants do not have memory for past wrong hypotheses and are likely to retry them. These assumptions are realized in a Markov model that fits all the data by estimating two sets of probabilities. First, the learning condition induced one set of Start probabilities of trying various rules. Second, should this first hypothesis prove wrong, the learning condition induced a second set of Choice probabilities of considering various rules. These findings broaden our understanding of effective instruction and provide implications for instructional design. Copyright © 2015 Cognitive Science Society, Inc.
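A minimal sketch of the two-probability Markov model described in the abstract might look like the following; the rule set and probability values are invented for illustration:

```python
# Minimal sketch of the two-parameter Markov search model: Start
# probabilities pick the first hypothesized rule and, after each error,
# Choice probabilities pick the next one with no memory of past
# failures (wrong rules may be retried). All values are made up.
import random

RULES = ["correct", "wrong_a", "wrong_b"]
START = [0.2, 0.5, 0.3]    # induced by the learning condition
CHOICE = [0.4, 0.3, 0.3]   # re-selection after an error (memoryless)

def trials_to_learn(rng: random.Random) -> int:
    """Number of hypotheses tried until the correct rule is adopted."""
    hyp = rng.choices(RULES, weights=START)[0]
    n = 1
    while hyp != "correct":            # no memory: retries are possible
        hyp = rng.choices(RULES, weights=CHOICE)[0]
        n += 1
    return n

rng = random.Random(0)
sims = [trials_to_learn(rng) for _ in range(10_000)]
print(sum(sims) / len(sims))  # mean search length under these parameters
```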
Munoz, Maria Isabel; Bouldi, Nadia; Barcellini, Flore; Nascimento, Adelaide
2012-01-01
This communication deals with the involvement of ergonomists in a research-action design process for a software platform in radiotherapy. The goal of the design project is to enhance patient safety by designing workflow software that supports cooperation between the professionals producing treatment in radiotherapy. The general framework of our approach is the ergonomic management of a design process, which is based on activity analysis and grounded in participatory design. Two fields are concerned by the present action: a design environment, which is a participatory design process involving software designers, caregivers as future users, and ergonomists; and a reference real work setting in radiotherapy. Observations, semi-structured interviews, and participatory workshops allow the characterization of activity in radiotherapy, dealing with uses of cooperative tools, sources of variability, and non-ruled strategies to manage the variability of situations. This production of knowledge about work seeks to enhance the articulation between technocentric and anthropocentric approaches and helps clarify design requirements. One aim of this research-action is to develop a framework to define the parameters of the workflow tool and the conditions of its deployment.
A novel approach of ensuring layout regularity correct by construction in advanced technologies
NASA Astrophysics Data System (ADS)
Ahmed, Shafquat Jahan; Vaderiya, Yagnesh; Gupta, Radhika; Parthasarathy, Chittoor; Marin, Jean-Claude; Robert, Frederic
2017-03-01
In advanced technology nodes, layout regularity has become a mandatory prerequisite for creating robust designs that are less sensitive to manufacturing process variations, in order to improve yield and minimize electrical variability. In this paper we describe a method for designing regular full-custom layouts based on design and process co-optimization. The method includes various design rule checks that can be used on-the-fly during leaf-cell layout development. We extract a Layout Regularity Index (LRI) from the layouts based on the jogs, alignments, and pitches used in the design for any given metal layer. The regularity index of a layout is a direct indicator of manufacturing yield and is used to compare the relative health of different layout blocks in terms of process friendliness. The method has been deployed for the 28nm and 40nm technology nodes for Memory IP and is being extended to other IPs (IO, standard-cell). We have quantified the gain of layout regularity with the deployed method on printability and electrical characteristics by process-variation (PV) band simulation analysis, and have achieved up to 5nm reduction in PV band.
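The LRI formula itself is not given in the abstract, but the idea of scoring a metal layer by the variety of pitches and the number of jogs can be sketched as a toy proxy; every weight below is an assumption:

```python
# The paper's Layout Regularity Index is not publicly specified; this
# toy proxy merely illustrates scoring a metal layer by how few
# distinct pitches and jogs it uses. All weights are made up.
from collections import Counter

def toy_regularity_index(track_positions, jog_count, n_shapes):
    """Higher is more regular: few distinct pitches, few jogs."""
    pitches = [b - a for a, b in zip(track_positions, track_positions[1:])]
    distinct_pitches = len(Counter(pitches))
    # Penalize pitch variety and jogs per shape; 1.0 = perfectly regular.
    return 1.0 / (distinct_pitches + jog_count / max(n_shapes, 1))

regular = toy_regularity_index([0, 64, 128, 192, 256], jog_count=2, n_shapes=40)
irregular = toy_regularity_index([0, 64, 150, 190, 260], jog_count=30, n_shapes=40)
print(f"{regular:.3f} vs {irregular:.3f}")  # regular layout scores higher
```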
Zhang, Jie; Wang, Yuping; Feng, Junhong
2013-01-01
In association rule mining, evaluating an association rule requires repeatedly scanning the database to compare the whole database with the antecedent, the consequent, and the whole rule. To decrease the number of comparisons and the time consumed, we present an attribute index strategy. It needs to scan the database only once, to create an index for each attribute. All metric values used to evaluate an association rule can then be acquired from the attribute indices, with no further database scans. The paper treats association rule mining as a multiobjective problem rather than a single-objective one. To make the acquired solutions scatter uniformly toward the Pareto frontier in the objective space, an elitism policy and uniform design are introduced. The paper presents the algorithm of attribute-index and uniform-design based multiobjective association rule mining with an evolutionary algorithm, abbreviated as IUARMMEA. It no longer requires a user-specified minimum support and minimum confidence, but uses a simple attribute index. It uses a well-designed real encoding so as to extend its application scope. Experiments performed on several databases demonstrate that the proposed algorithm has excellent performance and can significantly reduce the number of comparisons and the time consumed.
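The attribute-index strategy can be sketched in a few lines: one database scan builds, per attribute, the set of transaction ids containing it, after which support and confidence reduce to set intersections; the tiny transaction table is invented:

```python
# Sketch of the attribute-index idea: scan the database once to record,
# for each attribute, the set of transaction ids containing it; support
# and confidence then reduce to set intersections, with no rescanning.
transactions = [
    {"milk", "bread"},
    {"milk", "butter"},
    {"bread", "butter"},
    {"milk", "bread", "butter"},
]

# One pass builds the index: attribute -> transaction ids.
index: dict[str, set[int]] = {}
for tid, items in enumerate(transactions):
    for item in items:
        index.setdefault(item, set()).add(tid)

def support(itemset: set[str]) -> float:
    tids = set.intersection(*(index[i] for i in itemset))
    return len(tids) / len(transactions)

def confidence(antecedent: set[str], consequent: set[str]) -> float:
    return support(antecedent | consequent) / support(antecedent)

print(support({"milk", "bread"}))       # 0.5
print(confidence({"milk"}, {"bread"}))  # 0.666...
```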
NASA Astrophysics Data System (ADS)
Sha, Wei E. I.; Zhu, Hugh L.; Chen, Luzhou; Chew, Weng Cho; Choy, Wallace C. H.
2015-02-01
It is well known that the transport paths of photocarriers (electrons and holes) before they are collected by the electrodes strongly affect bulk recombination and thus the electrical properties of solar cells, including open-circuit voltage and fill factor. To boost device performance, a general design rule, tailored to an arbitrary electron-to-hole mobility ratio, is proposed to decide the transport paths of photocarriers. Due to its unique ability to localize and concentrate light, plasmonics is explored to manipulate photocarrier transport by spatially redistributing light absorption in the active layer of devices. Without changing the active materials, we conceive a plasmonic-electrical concept, which tunes the electrical properties of solar cells via the plasmon-modified optical field distribution, to realize the design rule. Incorporating spectrally and spatially configurable metallic nanostructures, thin-film solar cells are theoretically modelled and experimentally fabricated to validate the design rule and verify the plasmonically tunable electrical properties. The general design rule, together with the plasmonic-electrical effect, contributes to the evolution of emerging photovoltaics.
NASA Technical Reports Server (NTRS)
Kriegler, F. J.; Gordon, M. F.; Mclaughlin, R. H.; Marshall, R. E.
1975-01-01
The MIDAS (Multivariate Interactive Digital Analysis System) processor is a high-speed processor designed to process multispectral scanner data (from Landsat, EOS, aircraft, etc.) quickly and cost-effectively to meet the requirements of users of remote sensor data, especially from very large areas. MIDAS consists of a fast multipipeline preprocessor and classifier, an interactive color display and color printer, and a medium scale computer system for analysis and control. The system is designed to process data having as many as 16 spectral bands per picture element at rates of 200,000 picture elements per second into as many as 17 classes using a maximum likelihood decision rule.
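A hedged sketch of the maximum likelihood decision rule mentioned above, applied to synthetic multispectral pixels; the class statistics are invented, and the real MIDAS pipeline is specialized hardware rather than Python:

```python
# Sketch of a maximum-likelihood decision rule: assign each
# multispectral pixel to the class whose Gaussian model gives the
# highest log-likelihood. Class means and covariances are invented.
import numpy as np

rng = np.random.default_rng(0)
n_bands = 4
means = {"water": np.full(n_bands, 20.0), "crop": np.full(n_bands, 90.0)}
covs = {c: np.eye(n_bands) * 25.0 for c in means}  # per-class covariance

def log_likelihood(x, mu, cov):
    d = x - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (logdet + d @ np.linalg.solve(cov, d))

def classify(pixel):
    return max(means, key=lambda c: log_likelihood(pixel, means[c], covs[c]))

pixel = rng.normal(means["crop"], 5.0)   # a noisy 4-band measurement
print(classify(pixel))                   # -> 'crop'
```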
A methodology proposal for collaborative business process elaboration using a model-driven approach
NASA Astrophysics Data System (ADS)
Mu, Wenxin; Bénaben, Frédérick; Pingaud, Hervé
2015-05-01
Business process management (BPM) principles are commonly used to improve processes within an organisation. But they can equally be applied to support the design of an Information System (IS). In a collaborative situation involving several partners, this type of BPM approach may be useful to support the design of a Mediation Information System (MIS), which would ensure interoperability between the partners' ISs (which are assumed to be service-oriented). To achieve this objective, the first main task is to build a collaborative business process cartography. The aim of this article is to present a method for bringing together collaborative information and elaborating collaborative business processes from the information gathered (by using a collaborative situation framework, an organisational model, an informational model, a functional model, and a metamodel, and by using model transformation rules).
40 CFR 62.1100 - Identification of plan.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) APPROVAL AND PROMULGATION OF STATE PLANS FOR DESIGNATED FACILITIES AND POLLUTANTS California Plan for the... of plan. (a) State of California Designated Facility Plan (Section 111(d) Plan). (b) The plan was... Pollution Control District Regulation 1; Rule 130—Definitions, Rule 240—Permit to Operate, Rule 450—Sulfide...
40 CFR 62.1100 - Identification of plan.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) APPROVAL AND PROMULGATION OF STATE PLANS FOR DESIGNATED FACILITIES AND POLLUTANTS California Plan for the... of plan. (a) State of California Designated Facility Plan (Section 111(d) Plan). (b) The plan was... Pollution Control District Regulation 1; Rule 130—Definitions, Rule 240—Permit to Operate, Rule 450—Sulfide...
40 CFR 62.1100 - Identification of plan.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) APPROVAL AND PROMULGATION OF STATE PLANS FOR DESIGNATED FACILITIES AND POLLUTANTS California Plan for the... of plan. (a) State of California Designated Facility Plan (Section 111(d) Plan). (b) The plan was... Pollution Control District Regulation 1; Rule 130—Definitions, Rule 240—Permit to Operate, Rule 450—Sulfide...
40 CFR 62.1100 - Identification of plan.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) APPROVAL AND PROMULGATION OF STATE PLANS FOR DESIGNATED FACILITIES AND POLLUTANTS California Plan for the... of plan. (a) State of California Designated Facility Plan (Section 111(d) Plan). (b) The plan was... Pollution Control District Regulation 1; Rule 130—Definitions, Rule 240—Permit to Operate, Rule 450—Sulfide...
40 CFR 62.1100 - Identification of plan.
Code of Federal Regulations, 2011 CFR
2011-07-01
...) APPROVAL AND PROMULGATION OF STATE PLANS FOR DESIGNATED FACILITIES AND POLLUTANTS California Plan for the... of plan. (a) State of California Designated Facility Plan (Section 111(d) Plan). (b) The plan was... Pollution Control District Regulation 1; Rule 130—Definitions, Rule 240—Permit to Operate, Rule 450—Sulfide...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-25
... DEPARTMENT OF AGRICULTURE Agricultural Marketing Service 7 CFR Part 922 [Docket No. AMS-FV-12-0028... Regulations AGENCY: Agricultural Marketing Service, USDA. ACTION: Affirmation of interim rule as final rule... the marketing order for apricots grown in designated Counties in Washington. The interim rule...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rawls, G.; Newhouse, N.; Rana, M.
2010-04-13
The Boiler and Pressure Vessel Project Team on Hydrogen Tanks was formed in 2004 to develop Code rules to address the various needs that had been identified for the design and construction of up to 15,000 psi hydrogen storage vessels. One of these needs was the development of Code rules for high-pressure composite vessels with non-load-sharing liners for stationary applications. In 2009, ASME approved the new Appendix 8 for the Section X Code, which contains the rules for these vessels. These vessels are designated as Class III vessels, with design pressure ranging from 20.7 MPa (3,000 psi) to 103.4 MPa (15,000 psi) and a maximum allowable outside liner diameter of 2.54 m (100 inches). The maximum design life of these vessels is limited to 20 years. Design, fabrication, and examination requirements have been specified, including Acoustic Emission testing at time of manufacture. The Code rules include the design qualification testing of prototype vessels. Qualification includes proof, expansion, burst, cyclic fatigue, creep, flaw, permeability, torque, penetration, and environmental testing.
Comparison of optimization algorithms for the slow shot phase in HPDC
NASA Astrophysics Data System (ADS)
Frings, Markus; Berkels, Benjamin; Behr, Marek; Elgeti, Stefanie
2018-05-01
High-pressure die casting (HPDC) is a popular manufacturing process for aluminum. The slow shot phase is the first phase of this process: the molten metal is pushed towards the cavity under moderate plunger movement, and the so-called shot curve describes this plunger movement. A good design of the shot curve is important to produce high-quality cast parts. Three partially competing process goals characterize the slow shot phase: (1) reducing air entrapment, (2) avoiding temperature loss, and (3) minimizing oxide formation caused by air-aluminum contact. Due to the rough process conditions, with high pressure and temperature, it is hard to design the shot curve experimentally. A few design rules exist that are based on theoretical considerations; nevertheless, the quality of the shot curve design still depends on the experience of the machine operator. To improve the shot curve, it seems natural to use numerical optimization. This work compares different optimization strategies for the slow shot phase. The aim is to find the best optimization approach on a simple test problem.
Elements of decisional dynamics: An agent-based approach applied to artificial financial market
NASA Astrophysics Data System (ADS)
Lucas, Iris; Cotsaftis, Michel; Bertelle, Cyrille
2018-02-01
This paper introduces an original mathematical formalization of agents' decision-making processes for problems affected by both individual and collective behaviors in systems characterized by nonlinear, path-dependent, and self-organizing interactions. An application to artificial financial markets is proposed by designing a multi-agent system based on the proposed formalization. In this application, agents' decision-making is based on fuzzy logic rules, and the price dynamics is purely deterministic, following the basic matching rules of a central order book. Finally, while putting most parameters under evolutionary control, the computational agent-based system is able to replicate several stylized facts of financial time series (distributions of stock returns showing a heavy tail with positive excess kurtosis, absence of autocorrelations in stock returns, and the volatility clustering phenomenon).
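The deterministic price-formation step can be illustrated with a minimal central limit order book using generic price-time priority matching; this is a standard matching rule sketched under our own assumptions, not the authors' code:

```python
# Toy central limit order book with price-time priority matching;
# trades set the last (deterministic) price.
import heapq

class OrderBook:
    def __init__(self):
        self.bids: list = []  # (-price, seq, qty): best bid pops first
        self.asks: list = []  # (price, seq, qty): best ask pops first
        self.seq = 0
        self.last_price = None

    def submit(self, side: str, price: float, qty: int):
        self.seq += 1
        if side == "buy":
            # Match against asks priced at or below the bid.
            while qty and self.asks and self.asks[0][0] <= price:
                ask_price, s, ask_qty = heapq.heappop(self.asks)
                traded = min(qty, ask_qty)
                qty -= traded
                self.last_price = ask_price
                if ask_qty > traded:   # partial fill rests in the book
                    heapq.heappush(self.asks, (ask_price, s, ask_qty - traded))
            if qty:
                heapq.heappush(self.bids, (-price, self.seq, qty))
        else:
            # Match against bids priced at or above the ask.
            while qty and self.bids and -self.bids[0][0] >= price:
                neg_bid, s, bid_qty = heapq.heappop(self.bids)
                traded = min(qty, bid_qty)
                qty -= traded
                self.last_price = -neg_bid
                if bid_qty > traded:
                    heapq.heappush(self.bids, (neg_bid, s, bid_qty - traded))
            if qty:
                heapq.heappush(self.asks, (price, self.seq, qty))

book = OrderBook()
book.submit("sell", 101.0, 5)
book.submit("sell", 100.0, 5)
book.submit("buy", 100.5, 7)   # fills 5 @ 100.0, rests 2 @ 100.5
print(book.last_price)          # 100.0
```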
Machine Learning Techniques in Optimal Design
NASA Technical Reports Server (NTRS)
Cerbone, Giuseppe
1992-01-01
Many important applications can be formalized as constrained optimization tasks. For example, we are studying the engineering domain of two-dimensional (2-D) structural design. In this task, the goal is to design a structure of minimum weight that bears a set of loads. A solution to a design problem with a single load (L) and two stationary support points (S1 and S2), consisting of four members (E1, E2, E3, and E4) that connect the load to the support points, is discussed. In principle, optimal solutions to problems of this kind can be found by numerical optimization techniques. However, in practice [Vanderplaats, 1984] these methods are slow, and they can produce different local solutions whose quality (ratio to the global optimum) varies with the choice of starting points. Hence, their applicability to real-world problems is severely restricted. To overcome these limitations, we propose to augment numerical optimization by first performing a symbolic compilation stage to produce: (a) objective functions that are faster to evaluate and that depend less on the choice of the starting point, and (b) selection rules that associate problem instances with a set of recommended solutions. These goals are accomplished by successive specializations of the problem class and of the associated objective functions. In the end, this process reduces the problem to a collection of independent functions that are fast to evaluate, that can be differentiated symbolically, and that represent smaller regions of the overall search space. However, the specialization process can produce a large number of sub-problems. This is overcome by inductively deriving selection rules that associate problems with small sets of specialized independent sub-problems. Each set of candidate solutions is chosen to minimize a cost function expressing the tradeoff between the quality of the solution that can be obtained from the sub-problem and the time it takes to produce it. The overall solution to the problem is then obtained by solving each of the sub-problems in the set in parallel and selecting the one with the minimum cost. In addition to speeding up the optimization process, our use of learning methods also relieves the expert of the burden of identifying rules that exactly pinpoint optimal candidate sub-problems. In real engineering tasks it is usually too costly for the engineers to derive such rules. Therefore, this paper also contributes a further step towards the solution of the knowledge acquisition bottleneck [Feigenbaum, 1977], which has somewhat impaired the construction of rule-based expert systems.
Gong, Xing-Chu; Chen, Teng; Qu, Hai-Bin
2017-03-01
The quality by design (QbD) concept is an advanced pharmaceutical quality control concept. The application of the QbD concept in the research and development of pharmaceutical processes for traditional Chinese medicines (TCM) mainly comprises five parts: the definition of critical processes and their evaluation criteria, the determination of critical process parameters and critical material attributes, the establishment of quantitative models, the development of a design space, and the application and continuous improvement of the control strategy. In this work, recent research advances in QbD implementation methods in the secondary development of Chinese patent medicines are reviewed, and five promising fields for the implementation of the QbD concept are pointed out: the research and development of new TCM drugs and Chinese medicine granules for formulation, the modeling of pharmaceutical processes, the development of control strategies based on industrial big data, strengthening the research on process scale-up (amplification) rules, and the development of new pharmaceutical equipment. Copyright© by the Chinese Pharmaceutical Association.
Design Rules and Analysis of a Capture Mechanism for Rendezvous between a Space Tether and Payload
NASA Technical Reports Server (NTRS)
Sorensen, Kirk F.; Canfield, Stephen L.; Norris, Marshall A.
2006-01-01
Momentum-exchange/electrodynamic reboost (MXER) tether systems have been proposed to serve as an "upper stage in space." A MXER tether station would boost spacecraft from low Earth orbit to a high-energy orbit quickly, like a high-thrust rocket, and would then slowly rebuild its orbital momentum through electrodynamic thrust, minimizing the use of propellant. One of the primary challenges in developing a MXER tether system, as identified by the 2003 MXER Technology Assessment Group, is the development of a mechanism that enables the processes of capture, carry, and release of a payload by the rotating tether, as required by the MXER approach. This paper presents a concept that achieves the desired goals of the capture system. The solution is a multi-DOF (degree-of-freedom) capture mechanism with nearly passive operation that features matching of the capture space to the expected window of capture error, efficient use of mass, and nearly passive actuation during the capture process. This paper describes the proposed capture mechanism concept and provides an evaluation of the concept through a dynamic model and experimental tests performed on a prototype article of the mechanism in a dynamically similar environment. It also develops a set of rules to guide the design of such a capture mechanism, based on analytical and experimental analyses. The primary contributions of this paper are a description of the proposed capture mechanism concept, a collection of rules to guide its design, and empirical and model information that can be used to evaluate the capability of the concept.
Sediq, Amany Mohy-Eldin; Abdel-Azeez, Ahmad GabAllahm Hala
2014-01-01
The current practice in Zagazig University Hospitals Laboratories (ZUHL) is manual verification of all results before the release of reports. These processes are time-consuming and tedious, with large inter-individual variation that slows the turnaround time (TAT). Autoverification is the process of comparing patient results, generated from interfaced instruments, against laboratory-defined acceptance parameters. This study describes an autoverification engine designed and implemented in ZUHL, Egypt, in a descriptive study conducted from January 2012 to December 2013. A rule-based system was used in designing the autoverification engine, which was preliminarily evaluated on a thyroid function panel. A total of 563 rules were written and tested on 563 simulated cases and 1673 archived cases. The engine's decisions were compared to those of 4 independent expert reviewers, and the impact of engine implementation on TAT was evaluated. Agreement was achieved among the 4 reviewers in 55.5% of cases, and with the engine in 51.5% of cases. The autoverification rate for archived cases was 63.8%. Reported lab TAT was reduced by 34.9%, and the TAT segment from completion of analysis to verification was reduced by 61.8%. The developed rule-based autoverification system has a verification rate comparable to that of commercially available software, while its in-house development saved the hospital the cost of a commercial product. Implementation of the system shortened the TAT and minimized the number of samples needing staff review, enabling laboratory staff to devote more time and effort to problematic test results and to improving the quality of patient care.
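A toy sketch of the rule-based autoverification idea: release a result automatically only if every acceptance rule passes. The analytes, limits, and delta-check threshold below are illustrative assumptions, not the 563 production rules:

```python
# Hedged sketch of rule-based autoverification: a result is released
# automatically only if every acceptance rule passes; all limits are
# invented for illustration.
RULES = [
    ("analytical range", lambda r: 0.01 <= r["TSH"] <= 100.0),
    ("critical value",   lambda r: r["FT4"] < 100.0),
    # Delta check: flag a large change from the patient's previous result.
    ("delta check",      lambda r: r["prev_TSH"] is None
                                   or abs(r["TSH"] - r["prev_TSH"]) < 10.0),
]

def autoverify(result: dict) -> tuple[bool, list[str]]:
    """Return (release automatically?, names of failed rules)."""
    failed = [name for name, ok in RULES if not ok(result)]
    return (not failed, failed)

released, failed = autoverify({"TSH": 2.1, "FT4": 14.0, "prev_TSH": 1.8})
print(released, failed)   # True [] -> report released without manual review
```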
NASA Technical Reports Server (NTRS)
1994-01-01
C Language Integrated Production System (CLIPS), a NASA-developed software shell for developing expert systems, has been embedded in a PC-based expert system for training oil rig personnel in monitoring oil drilling. Oil drilling rigs, if not properly monitored for possible blowouts, pose hazards to human life, property, and the environment. CLIPS is designed to permit the delivery of artificial intelligence on computers: a collection of rules is set up and, as facts become known, these rules are applied. In the Well Site Advisor, CLIPS provides the capability to accurately process, predict, and interpret well data in real time. CLIPS was provided to INTEQ by COSMIC.
Decision net, directed graph, and neural net processing of imaging spectrometer data
NASA Technical Reports Server (NTRS)
Casasent, David; Liu, Shiaw-Dong; Yoneyama, Hideyuki; Barnard, Etienne
1989-01-01
A decision-net solution involving a novel hierarchical classifier and a set of multiple directed graphs, as well as a neural-net solution, are respectively presented for large-class problem and mixture problem treatments of imaging spectrometer data. The clustering method for hierarchical classifier design, when used with multiple directed graphs, yields an efficient decision net. New directed-graph rules for reducing local maxima as well as the number of perturbations required, and the new starting-node rules for extending the reachability and reducing the search time of the graphs, are noted to yield superior results, as indicated by an illustrative 500-class imaging spectrometer problem.
Genetic reinforcement learning through symbiotic evolution for fuzzy controller design.
Juang, C F; Lin, J Y; Lin, C T
2000-01-01
An efficient genetic reinforcement learning algorithm for designing fuzzy controllers is proposed in this paper. The genetic algorithm (GA) adopted in this paper is based upon symbiotic evolution which, when applied to fuzzy controller design, complements the local mapping property of a fuzzy rule. Using this Symbiotic-Evolution-based Fuzzy Controller (SEFC) design method, the number of control trials, as well as consumed CPU time, are considerably reduced when compared to traditional GA-based fuzzy controller design methods and other types of genetic reinforcement learning schemes. Moreover, unlike traditional fuzzy controllers, which partition the input space into a grid, SEFC partitions the input space in a flexible way, thus creating fewer fuzzy rules. In SEFC, different types of fuzzy rules whose consequent parts are singletons, fuzzy sets, or linear equations (TSK-type fuzzy rules) are allowed. Further, the free parameters (e.g., centers and widths of membership functions) and fuzzy rules are all tuned automatically. In particular, for the TSK-type fuzzy rules to which the proposed learning algorithm is applied, only the significant input variables are selected to participate in the consequent of a rule. The proposed SEFC design method has been applied to different simulated control problems, including the cart-pole balancing system, a magnetic levitation system, and a water bath temperature control system. SEFC has been verified to be efficient and superior on these control problems and in comparisons with some traditional GA-based fuzzy systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erlenwein, P.; Frisch, W.; Kafka, P.
Nuclear reactors of 200- to 400-MW(thermal) power for district heating are the subject of increasing interest, and several specific designs are under discussion today. In the Federal Republic of Germany (FRG), Kraftwerk Union AG has presented a 200-MW(thermal) heating reactor concept. The main safety issues of this design are assessed. In this design, the primary system is fully integrated into the reactor pressure vessel (RPV), which is tightly enclosed by the containment. The low process parameters, such as pressure, temperature, and power density, and the high ratio of coolant volume to thermal power allow the design of simple safety features. This is supported by a preference for passive over active components. A special feature is a newly designed hydraulic control and rod drive mechanism, which is also integrated into the RPV. Within the safety assessment, an overview of the relevant FRG safety rules and guidelines, developed mainly for large electricity-generating power plants, is given, including a discussion of the extent to which these licensing rules can be applied to the heating reactor concept.
VASSAR: Value assessment of system architectures using rules
NASA Astrophysics Data System (ADS)
Selva, D.; Crawley, E. F.
A key step of the mission development process is the selection of a system architecture, i.e., the layout of the major high-level system design decisions. This step typically involves the identification of a set of candidate architectures and a cost-benefit analysis to compare them. Computational tools have been used in the past to bring rigor and consistency into this process. These tools can automatically generate architectures by enumerating different combinations of decisions and options. They can also evaluate these architectures by applying cost models and simplified performance models. Current performance models are purely quantitative tools that are best fit for evaluating the technical performance of a mission design. However, assessing the relative merit of a system architecture is a much more holistic task than evaluating the performance of a mission design. Indeed, the merit of a system architecture comes from satisfying a variety of stakeholder needs, some of which are easy to quantify and some of which are harder to quantify (e.g., elegance, scientific value, political robustness, flexibility). Moreover, assessing the merit of a system architecture at these very early stages of design often requires dealing with a mix of (a) quantitative and semi-qualitative data and (b) objective and subjective information. Current computational tools are poorly suited for these purposes. In this paper, we propose a general methodology that can be used to assess the relative merit of several candidate system architectures in the presence of objective, subjective, quantitative, and qualitative stakeholder needs. The methodology is called VASSAR (Value ASsessment of System Architectures using Rules). The major underlying assumption of the VASSAR methodology is that the merit of a system architecture can be assessed by comparing the capabilities of the architecture with the stakeholder requirements. Hence, for example, a candidate architecture that fully satisfies all critical stakeholder requirements is a good architecture. The assessment process is thus fundamentally seen as a pattern-matching process in which capabilities match requirements, which motivates the use of rule-based expert systems (RBES). This paper describes the VASSAR methodology and shows how it can be applied to a large complex space system, namely an Earth observation satellite system. Companion papers show its applicability to the NASA space communications and navigation program and the joint NOAA-DoD NPOESS program.
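The requirements-match-capabilities idea can be sketched as a weighted scoring of candidate architectures; the requirements, weights, and capability numbers below are invented for illustration and are far simpler than a full RBES:

```python
# Minimal sketch of the pattern-matching idea: score an architecture by
# how well its capabilities satisfy weighted stakeholder requirements.
requirements = {
    # name: (weight, predicate over the architecture's capabilities)
    "global coverage":  (0.5, lambda caps: caps["coverage_pct"] >= 95),
    "daily revisit":    (0.3, lambda caps: caps["revisit_hr"] <= 24),
    "high resolution":  (0.2, lambda caps: caps["resolution_m"] <= 30),
}

def merit(caps: dict) -> float:
    """Weighted fraction of satisfied requirements, in [0, 1]."""
    return sum(w for w, ok in requirements.values() if ok(caps))

arch_a = {"coverage_pct": 98, "revisit_hr": 12, "resolution_m": 50}
arch_b = {"coverage_pct": 90, "revisit_hr": 6, "resolution_m": 10}
print(merit(arch_a), merit(arch_b))   # 0.8 vs 0.5
```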
Identified research directions for using manufacturing knowledge earlier in the product lifecycle
Hedberg, Thomas D.; Hartman, Nathan W.; Rosche, Phil; Fischer, Kevin
2016-01-01
Design for Manufacturing (DFM), especially the use of manufacturing knowledge to support design decisions, has received attention in the academic domain. However, industry practice has not been studied enough to provide solutions that are mature for industry. The current state of the art for DFM is often rule-based functionality within Computer-Aided Design (CAD) systems that enforce specific design requirements. That rule-based functionality may or may not dynamically affect geometry definition. And, if rule-based functionality exists in the CAD system, it is typically a customization on a case-by-case basis. Manufacturing knowledge is a phrase with vast meanings, which may include knowledge on the effects of material properties decisions, machine and process capabilities, or understanding the unintended consequences of design decisions on manufacturing. One of the DFM questions to answer is how can manufacturing knowledge, depending on its definition, be used earlier in the product lifecycle to enable a more collaborative development environment? This paper will discuss the results of a workshop on manufacturing knowledge that highlights several research questions needing more study. This paper proposes recommendations for investigating the relationship of manufacturing knowledge with shape, behavior, and context characteristics of product to produce a better understanding of what knowledge is most important. In addition, the proposal includes recommendations for investigating the system-level barriers to reusing manufacturing knowledge and how model-based manufacturing may ease the burden of knowledge sharing. Lastly, the proposal addresses the direction of future research for holistic solutions of using manufacturing knowledge earlier in the product lifecycle. PMID:27990027
Fuzzy self-learning control for magnetic servo system
NASA Technical Reports Server (NTRS)
Tarn, J. H.; Kuo, L. T.; Juang, K. Y.; Lin, C. E.
1994-01-01
It is known that an effective control system is the key condition for successful implementation of high-performance magnetic servo systems. Major issues in designing such control systems are nonlinearity; unmodeled dynamics, such as secondary effects of copper resistance, stray fields, and saturation; and disturbance rejection, since the load effect acts directly on the servo system without transmission elements. One typical approach to designing control systems under these conditions is a special type of nonlinear feedback called gain scheduling, which accommodates linear regulators whose parameters are changed as a function of operating conditions in a preprogrammed way. In this paper, an on-line learning fuzzy control strategy is proposed. To inherit the wealth of linear control design, the relations between linear feedback and fuzzy logic controllers have been established, so that the exercise of the engineering axioms of linear control design is transformed into the tuning of appropriate fuzzy parameters. Furthermore, fuzzy logic control brings the domain of candidate control laws from linear into nonlinear, and brings new prospects to the design of the local controllers. A self-learning scheme is utilized to automatically tune the fuzzy rule base. It is based on a network learning infrastructure; statistical approximation to assign credit; an animal-learning method to update the reinforcement map with a fast learning rate; and a temporal-difference predictive scheme to optimize the control laws. Different from supervised and statistical unsupervised learning schemes, the proposed method learns on-line from past experience and information from the process, and forms the rule base of an FLC system from randomly assigned initial control rules.
On the Design of a Fuzzy Logic-Based Control System for Freeze-Drying Processes.
Fissore, Davide
2016-12-01
This article is focused on the design of a fuzzy logic-based control system to optimize a drug freeze-drying process. The goal of the system is to keep product temperature as close as possible to the threshold value of the formulation being processed, without trespassing it, in such a way that product quality is not jeopardized and the sublimation flux is maximized. The method involves the measurement of product temperature and a set of rules obtained through process simulation, with the goal of obtaining a single set of rules valid for products with very different characteristics. Input variables are the difference between the temperature of the product and the threshold value, the difference between the temperature of the heating fluid and that of the product, and the rate of change of product temperature. The output variables are the variation of the temperature of the heating fluid and the pressure in the drying chamber. The effects of the starting values of the input variables and of the control interval have been investigated, resulting in the optimal configuration of the control system. An experimental investigation carried out in a pilot-scale freeze-dryer validates the proposed system. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
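A minimal zero-order Sugeno-style sketch using the input and output variables named in the abstract; the membership functions and rule consequents are invented, not the article's rule set:

```python
# Toy fuzzy controller sketch for the freeze-drying setting: inputs are
# the margin to the threshold temperature and the product-temperature
# rate of change; output is the heating-fluid temperature step. All
# membership shapes and consequents are assumptions.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fluid_temp_step(margin_c: float, rate_c_per_min: float) -> float:
    """margin = threshold minus product temperature (degC, >0 is safe);
    rate = dT/dt of the product. Returns the heating-fluid step (degC)."""
    rules = [
        # (firing strength, consequent fluid-temperature change)
        (tri(margin_c, 2.0, 6.0, 12.0) * tri(rate_c_per_min, -1.0, 0.0, 1.0),
         +2.0),                                 # far and stable: heat more
        (tri(margin_c, 0.5, 2.0, 4.0), 0.0),    # close to threshold: hold
        (tri(margin_c, -2.0, 0.0, 1.0), -3.0),  # at/over threshold: cool
    ]
    total = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / total if total else 0.0

print(fluid_temp_step(margin_c=5.0, rate_c_per_min=0.2))  # positive step
print(fluid_temp_step(margin_c=0.2, rate_c_per_min=0.5))  # negative step
```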
An expert system for integrated structural analysis and design optimization for aerospace structures
NASA Technical Reports Server (NTRS)
1992-01-01
The results of a research study on the development of an expert system for integrated structural analysis and design optimization are presented. An Object Representation Language (ORL) was developed first, in conjunction with a rule-based system. This ORL/AI shell was then used to develop expert systems that provide assistance with a variety of structural analysis and design optimization tasks, in conjunction with procedural modules for finite element structural analysis and design optimization. The main goal of the research study was to provide expertise, judgment, and reasoning capabilities in the aerospace structural design process. This allows engineers performing structural analysis and design, even without extensive experience in the field, to develop error-free, efficient, and reliable structural designs rapidly and cost-effectively. This would not only improve the productivity of design engineers and analysts, but also significantly reduce the time to completion of structural design. An extensive literature survey in the fields of structural analysis, design optimization, artificial intelligence, and database management systems, and their application to the structural design process, was first performed. A feasibility study was then performed, and the architecture and conceptual design for the integrated 'intelligent' structural analysis and design optimization software were developed. An Object Representation Language, in conjunction with a rule-based system, was then developed using C++. Such an approach improves the expressiveness of knowledge representation (especially for structural analysis and design applications), provides the ability to build very large and practical expert systems, and provides an efficient way of storing knowledge. Functional specifications for the expert systems were then developed. The ORL/AI shell was then used to develop a variety of expert system modules for modeling, finite element analysis, and design optimization tasks in the integrated aerospace structural design process. These expert systems were developed to work in conjunction with procedural finite element structural analysis and design optimization modules (developed in-house at SAT, Inc.). The complete software so developed, AutoDesign, can be used for integrated 'intelligent' structural analysis and design optimization. The software was beta-tested at a variety of companies and used by a range of engineers with different levels of background and expertise. Based on the feedback obtained from such users, conclusions were developed and are provided.
Simulation-based MDP verification for leading-edge masks
NASA Astrophysics Data System (ADS)
Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki
2017-07-01
For IC design starts below the 20nm technology node, the assist features on photomasks shrink well below 60nm, and the printed patterns of those features on masks written by VSB eBeam writers start to show large deviations from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and been adopted, such as rule-based Mask Process Correction (MPC), model-based MPC, and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from the target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for the assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification, which became a necessity for OPC a decade ago, we see the same trend in MDP today. A simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field-tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask checks, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose-margin-related hotspots can also be detected by setting a correct detection threshold. In this paper, we will demonstrate GPU acceleration for geometry processing and give examples of mask check results and performance data. GPU acceleration is necessary to make simulation-based MDP verification acceptable.
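The core of a simulation-based mask check can be sketched as comparing simulated edge placements against the target and flagging out-of-tolerance sites; the edge data and tolerance below are invented, and a production flow would use a calibrated physical mask model on a GPU-accelerated engine:

```python
# Toy sketch of a simulation-based mask check: compare simulated edge
# placements against the target and flag hotspots beyond tolerance.
# Thresholds and data are invented for illustration.
import numpy as np

target_edges = np.array([100.0, 160.0, 220.0, 280.0])    # nm, design
simulated_edges = np.array([101.2, 158.1, 226.5, 280.4])  # nm, mask model

EPE_TOL_NM = 3.0   # edge-placement-error tolerance (assumed)

epe = simulated_edges - target_edges
hotspots = np.flatnonzero(np.abs(epe) > EPE_TOL_NM)
for i in hotspots:
    print(f"edge {i}: EPE {epe[i]:+.1f} nm exceeds {EPE_TOL_NM} nm")
# -> edge 2: EPE +6.5 nm exceeds 3.0 nm
```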
A Methodology for Multihazards Load Combinations of Earthquake and Heavy Trucks for Bridges
Wang, Xu; Sun, Baitao
2014-01-01
Load combinations of earthquakes and heavy trucks are an important topic in multihazard bridge design. Current load and resistance factor design (LRFD) specifications usually treat extreme hazards alone and have no probabilistic basis for extreme load combinations. Earthquake load and heavy truck load are random processes with distinct characteristics, and the maximum combined load is not the simple superposition of their individual maxima. The traditional Ferry Borges-Castaneda model, which considers load duration and occurrence probability, describes well the conversion of random processes to random variables and their combination, but it imposes strict constraints on time-interval selection to obtain precise results. Turkstra's rule combines one load at its maximum value over the bridge's service life with another load at its instantaneous (or mean) value, which looks more rational, but the results are generally unconservative. Therefore, a modified model is presented here that combines the advantages of the Ferry Borges-Castaneda model and Turkstra's rule. The modified model is based on conditional probability; it converts random processes to random variables relatively easily and accounts for the non-maximum load in the combination. Earthquake load and heavy truck load combinations are employed to illustrate the model. Finally, the results of a numerical simulation are used to verify the feasibility and rationality of the model. PMID:24883347
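Turkstra's rule itself is simple enough to state numerically: pair each load's lifetime maximum with the other load's point-in-time (mean) value and take the governing case. The load values below are arbitrary illustration:

```python
# Numeric sketch of Turkstra's rule for the earthquake + heavy-truck
# combination; values are invented.
def turkstra(max_eq, mean_eq, max_truck, mean_truck):
    """Return the governing combined load under Turkstra's rule."""
    case_1 = max_eq + mean_truck    # earthquake at its lifetime maximum
    case_2 = mean_eq + max_truck    # truck load at its lifetime maximum
    return max(case_1, case_2)

# The combined demand is below the naive superposition of the two
# maxima, which is the rule's point.
print(turkstra(max_eq=100.0, mean_eq=5.0, max_truck=60.0, mean_truck=20.0))
# -> 120.0, versus 160.0 for simple max + max
```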
ELIPS: Toward a Sensor Fusion Processor on a Chip
NASA Technical Reports Server (NTRS)
Daud, Taher; Stoica, Adrian; Tyson, Thomas; Li, Wei-te; Fabunmi, James
1998-01-01
The paper presents the concept and initial tests from the hardware implementation of a low-power, high-speed reconfigurable sensor fusion processor. The Extended Logic Intelligent Processing System (ELIPS) processor is developed to seamlessly combine rule-based systems, fuzzy logic, and neural networks to achieve parallel fusion of sensor data in compact low-power VLSI. The first demonstration of the ELIPS concept targets interceptor functionality; other applications, mainly in robotics and autonomous systems, are considered for the future. The main assumption behind ELIPS is that fuzzy, rule-based, and neural forms of computation can serve as the main primitives of an "intelligent" processor. Thus, in the same way classic processors are designed to optimize the hardware implementation of a set of fundamental operations, ELIPS is developed as an efficient implementation of computational intelligence primitives, and relies on a set of fuzzy set, fuzzy inference, and neural modules built in programmable analog hardware. The hardware programmability allows the processor to reconfigure into different machines, taking the most efficient hardware implementation during each phase of information processing. Following software demonstrations on several interceptor data sets, three important ELIPS building blocks (a fuzzy set preprocessor, a rule-based fuzzy system, and a neural network) have been fabricated in analog VLSI hardware and demonstrated microsecond processing times.
NASA Technical Reports Server (NTRS)
Frassinelli, G. J.
1972-01-01
Cost estimates and funding schedules are presented for a given configuration and costing ground rules. Cost methodology is described and the cost evolution from a baseline configuration to a selected configuration is given, emphasizing cases in which cost was a design driver. Programmatic cost avoidance techniques are discussed.
ERIC Educational Resources Information Center
Shantz, Kailen
2017-01-01
This study reports on a self-paced reading experiment in which native and non-native speakers of English read sentences designed to evaluate the predictions of usage-based and rule-based approaches to second language acquisition (SLA). Critical stimuli were four-word sequences embedded into sentences in which phrase frequency and grammaticality…
Knowledge Assisted Integrated Design of a Component and Its Manufacturing Process
NASA Astrophysics Data System (ADS)
Gautham, B. P.; Kulkarni, Nagesh; Khan, Danish; Zagade, Pramod; Reddy, Sreedhar; Uppaluri, Rohith
Integrated design of a product and its manufacturing processes can significantly reduce the total cost of the product as well as the cost of its development. However, this is only possible on a platform that links the simulation tools used for product design, performance evaluation and manufacturing process design in a closed loop. In addition, a comprehensive knowledge base that provides systematic, knowledge-guided assistance to product or process designers, who may not possess in-depth design knowledge or in-depth knowledge of the simulation tools, would significantly speed up the end-to-end design process. In this paper, we propose a process, and illustrate it with a case, for achieving integrated product and manufacturing process design assisted by knowledge support that helps the user make decisions at various stages. We take transmission component design as an example. The example illustrates the design of a gear: its geometry, material selection and its manufacturing processes, particularly carburizing-quenching and tempering, with the material properties predicted during heat treatment fed into performance estimation in a closed loop. It also identifies and illustrates various decision stages in the integrated life cycle and discusses the use of knowledge engineering tools, such as rule-based guidance, to help the designer make informed decisions. Simulation tools developed on various commercial and open-source platforms, as well as in-house tools, are linked together with knowledge engineering tools to build a framework with appropriate navigation through user-friendly interfaces. This is illustrated through examples in this paper.
The Good, the Bad, and the Ugly: A Theoretical Framework for the Assessment of Continuous Colormaps.
Bujack, Roxana; Turton, Terece L; Samsel, Francesca; Ware, Colin; Rogers, David H; Ahrens, James
2018-01-01
A myriad of design rules for what constitutes a "good" colormap can be found in the literature. Some common rules include order, uniformity, and high discriminative power. However, the meaning of many of these terms is often ambiguous or open to interpretation. At times, different authors may use the same term to describe different concepts or the same rule is described by varying nomenclature. These ambiguities stand in the way of collaborative work, the design of experiments to assess the characteristics of colormaps, and automated colormap generation. In this paper, we review current and historical guidelines for colormap design. We propose a specified taxonomy and provide unambiguous mathematical definitions for the most common design rules.
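To make two of the named rules concrete, the sketch below scores a colormap for order (monotonically increasing lightness) and uniformity (equal step sizes). It uses a Rec. 709 luma and RGB distances as crude stand-ins for the perceptual quantities; a faithful implementation of the paper's definitions would work in a perceptual space such as CIELAB.

```python
import numpy as np

def assess(colormap_rgb):
    """Crude checks for two common colormap design rules (a sketch; real
    assessments use perceptual color spaces, not raw RGB)."""
    c = np.asarray(colormap_rgb, dtype=float)        # shape (n, 3), values in [0, 1]
    luma = c @ np.array([0.2126, 0.7152, 0.0722])    # Rec. 709 luminance proxy
    order = bool(np.all(np.diff(luma) > 0))          # "order": lightness strictly increases
    steps = np.linalg.norm(np.diff(c, axis=0), axis=1)
    uniformity = steps.std() / steps.mean()          # "uniformity": equal step sizes (0 is ideal)
    return order, uniformity

gray = np.linspace(0, 1, 9)[:, None].repeat(3, axis=1)
print(assess(gray))   # (True, 0.0): ordered and uniform by this proxy
```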
Code of Federal Regulations, 2012 CFR
2012-04-01
... Designated contract market and swap execution facility position limits and accountability rules. (a) Spot... rules and procedures for monitoring and enforcing spot-month position limits set at levels no greater... monitoring and enforcing spot-month position limits set at levels no greater than 25 percent of estimated...
Code of Federal Regulations, 2013 CFR
2013-04-01
... Designated contract market and swap execution facility position limits and accountability rules. (a) Spot... rules and procedures for monitoring and enforcing spot-month position limits set at levels no greater... monitoring and enforcing spot-month position limits set at levels no greater than 25 percent of estimated...
Methodology for balancing design and process tradeoffs for deep-subwavelength technologies
NASA Astrophysics Data System (ADS)
Graur, Ioana; Wagner, Tina; Ryan, Deborah; Chidambarrao, Dureseti; Kumaraswamy, Anand; Bickford, Jeanne; Styduhar, Mark; Wang, Lee
2011-04-01
For process development of deep-subwavelength technologies, it has become accepted practice to use model-based simulation to predict systematic and parametric failures. Increasingly, these techniques are being used by designers to ensure layout manufacturability, as an alternative to, or complement to, restrictive design rules. The benefit of model-based simulation tools in the design environment is that manufacturability problems are addressed in a design-aware way by making appropriate trade-offs, e.g., between overall chip density and manufacturing cost and yield. The paper shows how library elements and the full ASIC design flow benefit from eliminating hot spots and improving design robustness early in the design cycle. It demonstrates a path to yield optimization and first time right designs implemented in leading edge technologies. The approach described herein identifies those areas in the design that could benefit from being fixed early, leading to design updates and avoiding later design churn by careful selection of design sensitivities. This paper shows how to achieve this goal by using simulation tools incorporating various models from sparse to rigorously physical, pattern detection and pattern matching, checking and validating failure thresholds.
Automating expert role to determine design concept in Kansei Engineering
NASA Astrophysics Data System (ADS)
Lokman, Anitawati Mohd; Haron, Mohammad Bakri Che; Abidin, Siti Zaleha Zainal; Khalid, Noor Elaiza Abd
2016-02-01
Affect has become imperative to product quality. In the affective design field, Kansei Engineering (KE) has been recognized as a technology that enables the discovery of consumers' emotions and the formulation of guidelines for designing products that win consumers in a competitive market. Albeit a powerful technology, it offers no rule of thumb for its analysis and interpretation process. KE expertise is required to determine the sets of related Kansei and the significant concepts of emotion. Many research endeavors are hampered by the limited number of available and accessible KE experts. This work simulates the role of the experts using the Natphoric algorithm, thus providing a sound solution to the complexity and flexibility of KE. The algorithm is designed to learn the process from training datasets taken from previous KE research. A framework for automated KE is then designed to realize the development of an automated KE system. A comparative analysis is performed to determine the feasibility of the developed prototype for automating the process. The results show that the significant Kansei determined by manual KE implementation and by the automated process are highly similar. KE researchers will benefit from this system for automatically determining significant design concepts.
Design and application of process control charting methodologies to gamma irradiation practices
NASA Astrophysics Data System (ADS)
Saylor, M. C.; Connaghan, J. P.; Yeadon, S. C.; Herring, C. M.; Jordan, T. M.
2002-12-01
The relationship between the contract irradiation facility and the customer has historically been based upon a "PASS/FAIL" approach with little or no quality metrics used to gauge the control of the irradiation process. Application of process control charts, designed in coordination with mathematical simulation of routine radiation processing, can provide a basis for understanding irradiation events. By using tools that simulate the physical rules associated with the irradiation process, end-users can explore process-related boundaries and the effects of process changes. Consequently, the relationship between contractor and customer can evolve based on the derived knowledge. The resulting level of mutual understanding of the irradiation process and its resultant control benefits both the customer and the contract operation, and provides necessary assurances to regulators. In this article we examine the complementary nature of theoretical (point kernel) and experimental (dosimetric) process evaluation, and the resulting by-product of improved understanding, communication and control generated through the implementation of effective process control charting strategies.
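As a concrete example of the charting methodology, the sketch below derives individuals-chart (X-mR) control limits from routine dose measurements. The dose values are invented; a real deployment would pair such limits with the dose-mapping simulation the article describes.

```python
import numpy as np

doses = np.array([25.1, 25.4, 24.8, 25.0, 25.6, 24.9, 25.2, 25.3])  # hypothetical routine dosimetry, kGy

# Individuals/moving-range (X-mR) chart: sigma is estimated from the mean
# moving range using the standard constant d2 = 1.128 for subgroups of 2.
mr = np.abs(np.diff(doses))
sigma_hat = mr.mean() / 1.128
ucl = doses.mean() + 3 * sigma_hat
lcl = doses.mean() - 3 * sigma_hat
print(f"center = {doses.mean():.2f} kGy, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
print("out-of-control points:", doses[(doses > ucl) | (doses < lcl)])
```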
Klepiszewski, K; Schmitt, T G
2002-01-01
While conventional rule-based, real-time flow control of sewer systems is in common use, control systems based on fuzzy logic have been used only rarely, though successfully. The intention of this study is to compare a conventional rule-based control of a combined sewer system with a fuzzy logic control by using hydrodynamic simulation. The objective of both control strategies is to reduce the combined sewer overflow volume by optimizing the utilized storage capacities of four combined sewer overflow tanks. The control systems adjust the outflow of the four combined sewer overflow tanks depending on the water levels inside the structures. Both systems use an identical rule base. The developed control systems are tested and optimized for a single storm event that produces heterogeneous hydraulic load conditions and local discharges. Finally, the efficiencies of the two control systems are compared for two further storm events. The results indicate that the conventional rule-based control and the fuzzy control reach the objective of the control strategy similarly well. In spite of the higher expense of designing the fuzzy control system, its use provides no advantage in this case.
Design of a naturalized flow regime—An example from the Lower Missouri River, USA
Jacobson, Robert B.; Galat, David L.
2008-01-01
A group of river managers, stakeholders, and scientists met during summer 2005 to design a more naturalized flow regime for the Lower Missouri River (LMOR). The objective was to comply with requirements under the U.S. Endangered Species Act to support reproduction and survival of threatened and endangered species, with emphasis on the endangered pallid sturgeon (Scaphirhynchus albus), while minimizing negative effects on existing social and economic benefits of prevailing river management. Specific hydrograph requirements for pallid sturgeon reproduction are unknown; hence much of the design process was based on features of the natural flow regime. Environmental flow components (EFCs) extracted from the reference natural flow regime were used to design and assess performance of alternative flow regimes. The design process incorporated a primary stage in which conceptual hydrographs were developed and assessed for their general ecological and social-economic performance. The second stage accounted for hydroclimatic variation by coding the conceptual hydrographs into reservoir release rules, adding constraints for downstream flooding and low-storage precludes, and running the rules through 100 years of hydroclimatic simulation. The output flow regimes were then evaluated for presumed ecological benefits based on how closely they resembled EFCs in the reference natural flow regime. Flow regimes also were assessed for social-economic cost indicators, including days of flooding of low-lying agricultural land, days over flood stage, and storage levels in system reservoirs. Our experience with flow-regime design on the LMOR underscored the lack of confidence the stakeholders place in the value of the natural flow regime as a measure of ecosystem benefit in the absence of fundamental scientific documentation. Stakeholders desired proof of ecological benefits commensurate with the certainty of economic losses. We also gained insight into the processes of integrating science into a collaborative management exercise. Although the 2005 collaborative effort failed to reach a consensus among stakeholders on a naturalized flow regime, the process was successful in pilot-testing a design approach; it helped focus scientific efforts on key knowledge gaps; and it demonstrated the potential for collaborations among scientists, stakeholders, and managers in river management decision making.
Control of Meiotic Crossovers: From Double-Strand Break Formation to Designation
Gray, Stephen
2017-01-01
Meiosis, the mechanism of creating haploid gametes, is a complex cellular process observed across sexually reproducing organisms. Fundamental to meiosis is the process of homologous recombination, whereby DNA double-strand breaks are introduced into the genome and are subsequently repaired to generate either noncrossovers or crossovers. Although homologous recombination is essential for chromosome pairing during prophase I, the resulting crossovers are critical for maintaining homolog interactions and enabling accurate segregation at the first meiotic division. Thus, the placement, timing, and frequency of crossover formation must be exquisitely controlled. In this review, we discuss the proteins involved in crossover formation, the process of their formation and designation, and the rules governing crossovers, all within the context of the important landmarks of prophase I. We draw together crossover designation data across organisms, analyze their evolutionary divergence, and propose a universal model for crossover regulation. PMID:27648641
A Process Algebraic Approach to Software Architecture Design
NASA Astrophysics Data System (ADS)
Aldini, Alessandro; Bernardo, Marco; Corradini, Flavio
Process algebra is a formal tool for the specification and the verification of concurrent and distributed systems. It supports compositional modeling through a set of operators able to express concepts like sequential composition, alternative composition, and parallel composition of action-based descriptions. It also supports mathematical reasoning via a two-level semantics, which formalizes the behavior of a description by means of an abstract machine obtained from the application of structural operational rules and then introduces behavioral equivalences able to relate descriptions that are syntactically different. In this chapter, we present the typical behavioral operators and operational semantic rules for a process calculus in which no notion of time, probability, or priority is associated with actions. Then, we discuss the three most studied approaches to the definition of behavioral equivalences - bisimulation, testing, and trace - and we illustrate their congruence properties, sound and complete axiomatizations, modal logic characterizations, and verification algorithms. Finally, we show how these behavioral equivalences and some of their variants are related to each other on the basis of their discriminating power.
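To make the operational-semantics and equivalence machinery tangible, here is a toy encoding of two labelled transition systems, the classic pair a.(b+c) versus a.b + a.c. They have the same traces but are not bisimilar, which is exactly the kind of difference in discriminating power the chapter discusses. The encoding and the naive coinductive check are illustrative only, not a full process calculus.

```python
# States map to lists of (action, successor) transitions.
lts1 = {0: [("a", 1)], 1: [("b", 2), ("c", 3)], 2: [], 3: []}          # a.(b+c)
lts2 = {0: [("a", 1), ("a", 2)], 1: [("b", 3)], 2: [("c", 4)], 3: [], 4: []}  # a.b + a.c

def traces(lts, s, depth):
    """All action sequences of length <= depth starting from state s."""
    if depth == 0:
        return {()}
    out = {()}
    for act, t in lts[s]:
        out |= {(act,) + tr for tr in traces(lts, t, depth - 1)}
    return out

def bisimilar(l1, s1, l2, s2, seen=frozenset()):
    """Naive coinductive check: every move of one side must be matched by the other."""
    if (s1, s2) in seen:
        return True
    seen = seen | {(s1, s2)}
    forward = all(any(b == a and bisimilar(l1, t1, l2, t2, seen)
                      for b, t2 in l2[s2]) for a, t1 in l1[s1])
    backward = all(any(b == a and bisimilar(l1, t1, l2, t2, seen)
                       for b, t1 in l1[s1]) for a, t2 in l2[s2])
    return forward and backward

print(traces(lts1, 0, 2) == traces(lts2, 0, 2))  # True: trace equivalent
print(bisimilar(lts1, 0, lts2, 0))               # False: not bisimilar
```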
Process wastewater treatability study for Westinghouse fluidized-bed coal gasification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winton, S.L.; Buvinger, B.J.; Evans, J.M.
1983-11-01
In the development of a synthetic fuels facility, water usage and wastewater treatment are major areas of concern. Coal gasification processes generally produce relatively large volumes of gas condensates. These wastewaters are typically composed of a variety of suspended and dissolved organic and inorganic solids and dissolved gaseous contaminants. Fluidized-bed coal gasification (FBG) processes are no exception to this rule. The Department of Energy's Morgantown Energy Technology Center (METC), the Gas Research Institute (GRI), and the Environmental Protection Agency (EPA/IERLRTP) recognized the need for a FBG treatment program to provide process design data for FBG wastewaters during the environmental, health, and safety characterization of the Westinghouse Process Development Unit (PDU). In response to this need, METC developed conceptual designs and a program plan to obtain process design and performance data for treating wastewater from commercial-scale Westinghouse-based synfuels plants. As a result of this plan, METC, GRI, and EPA entered into a joint program to develop performance data, design parameters, conceptual designs, and cost estimates for treating wastewaters from a FBG plant. Wastewater from the Westinghouse PDU consists of process quench and gas cooling condensates which are similar to those produced by other FBG processes such as U-Gas, and entrained-bed gasification processes such as Texaco. Therefore, wastewater from this facility was selected as the basis for this study. This paper outlines the current program for developing process design and cost data for the treatment of these wastewaters.
A Formalized Design Process for Bacterial Consortia That Perform Logic Computing
Sun, Rui; Xi, Jingyi; Wen, Dingqiao; Feng, Jingchen; Chen, Yiwei; Qin, Xiao; Ma, Yanrong; Luo, Wenhan; Deng, Linna; Lin, Hanchi; Yu, Ruofan; Ouyang, Qi
2013-01-01
The concept of microbial consortia is of great attractiveness in synthetic biology. Despite all its benefits, however, problems remain for large-scale multicellular gene circuits, for example, how to reliably design and distribute circuits in microbial consortia given a limited number of well-behaved genetic modules and wiring quorum-sensing molecules. To address this problem, here we propose a formalized design process: (i) determine the basic logic units (AND, OR and NOT gates) based on mathematical and biological considerations; (ii) establish rules to search for and distribute the simplest logic design; (iii) assemble the assigned basic logic units in each logic operating cell; and (iv) fine-tune the circuiting interface between logic operators. We analyzed in silico gene circuits with inputs ranging from two to four, comparing our method with pre-existing ones. Results showed that this formalized design process is more feasible in terms of the number of cells required. Furthermore, as a proof of principle, an Escherichia coli consortium that performs the XOR function, a typical complex computing operation, was designed. The construction and characterization of logic operators is independent of “wiring” and provides predictive information for fine-tuning. This formalized design process provides guidance for the design of microbial consortia that perform distributed biological computation. PMID:23468999
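The division of a computation into single-gate cells wired by signalling molecules can be illustrated with a toy distribution of XOR, here written as (A OR B) AND NOT(A AND B), across three hypothetical cells. The gate assignment and wiring are for illustration only and are not the paper's actual E. coli circuit.

```python
# Each "cell" hosts one basic logic unit; the return values stand in for
# the quorum-sensing signal molecules that wire the cells together.
def cell_or(a, b):    return a or b            # cell 1 emits signal S1
def cell_nand(a, b):  return not (a and b)     # cell 2 emits signal S2
def cell_and(s1, s2): return s1 and s2         # cell 3 reads S1, S2 -> output

for a in (False, True):
    for b in (False, True):
        out = cell_and(cell_or(a, b), cell_nand(a, b))
        assert out == (a != b)                 # matches the XOR truth table
print("XOR consortium verified over all four input combinations")
```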
Fuzzy model-based fault detection and diagnosis for a pilot heat exchanger
NASA Astrophysics Data System (ADS)
Habbi, Hacene; Kidouche, Madjid; Kinnaert, Michel; Zelmat, Mimoun
2011-04-01
This article addresses the design and real-time implementation of a fuzzy model-based fault detection and diagnosis (FDD) system for a pilot co-current heat exchanger. The design method is based on a three-step procedure which involves the identification of data-driven fuzzy rule-based models, the design of a fuzzy residual generator and the evaluation of the residuals for fault diagnosis using statistical tests. The fuzzy FDD mechanism has been implemented and validated on the real co-current heat exchanger, and has been proven to be efficient in detecting and isolating process, sensor and actuator faults.
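The three-step procedure can be paraphrased in code: a model prediction yields a residual, and a statistical test on the residual flags a fault. The numbers below are synthetic, and a constant stands in for the identified fuzzy model's prediction.

```python
import numpy as np

rng = np.random.default_rng(1)
y_meas = 60 + rng.normal(0, 0.5, 200)     # measured outlet temperature (synthetic)
y_meas[120:] += 3.0                       # injected sensor fault (assumed)
y_model = np.full(200, 60.0)              # stand-in for the fuzzy model's prediction

residual = y_meas - y_model
# Residual evaluation, sketched: flag a fault when the windowed residual mean
# exceeds a 3-sigma threshold estimated on fault-free data.
sigma0 = residual[:100].std()
window = 10
means = np.convolve(residual, np.ones(window) / window, mode="valid")
alarm = int(np.argmax(np.abs(means) > 3 * sigma0 / np.sqrt(window)))
print(f"fault flagged near sample {alarm + window - 1}")
```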
Magid, Steven K; Pancoast, Paul E; Fields, Theodore; Bradley, Diane G; Williams, Robert B
2007-01-01
Clinical decision support can be employed to increase patient safety and improve workflow efficiencies for physicians and other healthcare providers. Physician input into the design and deployment of clinical decision support systems can increase the utility of the alerts and reduce the likelihood of "alert fatigue." The Hospital for Special Surgery is a 146-bed orthopedic facility that performs approximately 18,000 surgeries a year, so efficient work processes are a necessity. The facility began implementing a new electronic health record system in June 2005 and plans to go live in summer 2007. This article reports on some of the clinical decision support rules and alerts being incorporated into the facility's system in the following categories: high-risk, high-frequency scenarios; rules that provide efficiencies and value from the prescriber perspective; and rules that relate to patient safety.
Advanced manufacturing rules check (MRC) for fully automated assessment of complex reticle designs
NASA Astrophysics Data System (ADS)
Gladhill, R.; Aguilar, D.; Buck, P. D.; Dawkins, D.; Nolke, S.; Riddick, J.; Straub, J. A.
2005-11-01
Advanced electronic design automation (EDA) tools, with their simulation, modeling, design rule checking, and optical proximity correction capabilities, have facilitated the improvement of first pass wafer yields. While the data produced by these tools may have been processed for optimal wafer manufacturing, it is possible for the same data to be far from ideal for photomask manufacturing, particularly at lithography and inspection stages, resulting in production delays and increased costs. The same EDA tools used to produce the data can be used to detect potential problems for photomask manufacturing in the data. A production implementation of automated photomask manufacturing rule checking (MRC) is presented and discussed for various photomask lithography and inspection lines. This paper will focus on identifying data which may cause production delays at the mask inspection stage. It will be shown how photomask MRC can be used to discover data related problems prior to inspection, separating jobs which are likely to have problems at inspection from those which are not. Photomask MRC can also be used to identify geometries requiring adjustment of inspection parameters for optimal inspection, and to assist with any special handling or change of routing requirements. With this foreknowledge, steps can be taken to avoid production delays that increase manufacturing costs. Finally, the data flow implemented for MRC can be used as a platform for other photomask data preparation tasks.
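In the spirit of the checks described, here is a minimal sketch of one mask rule check: flagging drawn features whose width, or edge-to-edge spacing, falls below what an inspection tool can resolve. The geometry representation and the limit values are invented for illustration and are not any vendor's actual inspection specification.

```python
# Rectangles are (x0, y0, x1, y1) in nm; limits are illustrative only.
MIN_WIDTH, MIN_SPACE = 180.0, 200.0

def mrc(rects):
    flags = []
    # Width check: smallest dimension of each feature
    for i, (x0, y0, x1, y1) in enumerate(rects):
        if min(x1 - x0, y1 - y0) < MIN_WIDTH:
            flags.append((i, "width"))
    # Spacing check: edge-to-edge gap between facing rectangles
    for i in range(len(rects)):
        for j in range(i + 1, len(rects)):
            dx = max(rects[j][0] - rects[i][2], rects[i][0] - rects[j][2], 0)
            dy = max(rects[j][1] - rects[i][3], rects[i][1] - rects[j][3], 0)
            if 0 < max(dx, dy) < MIN_SPACE and (dx == 0 or dy == 0):
                flags.append((i, j, "space"))
    return flags

print(mrc([(0, 0, 150, 1000), (330, 0, 600, 1000)]))  # narrow line + 180 nm gap
```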
Design rules for phase-change materials in data storage applications.
Lencer, Dominic; Salinga, Martin; Wuttig, Matthias
2011-05-10
Phase-change materials can rapidly and reversibly be switched between an amorphous and a crystalline phase. Since both phases are characterized by very different optical and electrical properties, these materials can be employed for rewritable optical and electrical data storage. Hence, there are considerable efforts to identify suitable materials, and to optimize them with respect to specific applications. Design rules that can explain why the materials identified so far enable phase-change-based devices would hence be very beneficial. This article describes materials that have been successfully employed and discusses common features regarding both typical structures and bonding mechanisms. It is shown that typical structural motifs and electronic properties can be found in the crystalline state that are indicative of resonant bonding, from which the employed contrast originates. The occurrence of resonance is linked to the composition, thus providing a design rule for phase-change materials. This understanding helps to unravel characteristic properties such as electrical and thermal conductivity, which are discussed in the subsequent section. Then, turning to the transition kinetics between the phases, the current understanding and modeling of the processes of amorphization and crystallization are discussed. Finally, present approaches for improved high-capacity optical discs and fast non-volatile electrical memories, which hold the potential to succeed present-day Flash memory, are presented. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Approach to design neural cryptography: a generalized architecture and a heuristic rule.
Mu, Nankun; Liao, Xiaofeng; Huang, Tingwen
2013-06-01
Neural cryptography, a type of public key exchange protocol, is widely considered an effective method for sharing a common secret key between two neural networks over public channels. How to design neural cryptography remains a great challenge. In this paper, in order to provide an approach to this challenge, a generalized network architecture and a significant heuristic rule are designed. The proposed generic framework is named the tree state classification machine (TSCM), which extends and unifies the existing structures, i.e., the tree parity machine (TPM) and the tree committee machine (TCM). Furthermore, we carefully study and find that the heuristic rule can improve the security of TSCM-based neural cryptography. Therefore, TSCM and the heuristic rule can guide us in designing a great number of effective neural cryptography candidates, among which more secure instances can be achieved. Significantly, in light of TSCM and the heuristic rule, we further show that our designed neural cryptography outperforms TPM (the most secure model at present) in security. Finally, a series of numerical simulation experiments are provided to verify the validity and applicability of our results.
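For context, the tree parity machine (TPM) that TSCM generalizes can be synchronized in a few lines: two parties exchange outputs on public random inputs and apply a Hebbian update only when their outputs agree, and the synchronized weights become the shared key. The parameter values below are kept small for speed and are illustrative.

```python
import numpy as np

K, N, L = 3, 10, 3        # hidden units, inputs per unit, weight bound
rng = np.random.default_rng(42)

def tpm_output(w, x):
    """Hidden-unit signs and the machine output tau = product of the signs."""
    sigma = np.sign(np.sum(w * x, axis=1))
    sigma[sigma == 0] = -1
    return sigma, int(np.prod(sigma))

def hebbian_update(w, x, sigma, tau):
    """Update only hidden units that agreed with tau; clip weights to [-L, L]."""
    for k in range(K):
        if sigma[k] == tau:
            w[k] = np.clip(w[k] + tau * x[k], -L, L)

wA = rng.integers(-L, L + 1, size=(K, N))
wB = rng.integers(-L, L + 1, size=(K, N))
steps = 0
while not np.array_equal(wA, wB) and steps < 100_000:
    x = rng.choice([-1, 1], size=(K, N))      # public random input
    sA, tA = tpm_output(wA, x)
    sB, tB = tpm_output(wB, x)
    if tA == tB:                              # mutual learning only on agreement
        hebbian_update(wA, x, sA, tA)
        hebbian_update(wB, x, sB, tB)
    steps += 1
print(f"synchronized={np.array_equal(wA, wB)} after {steps} exchanged inputs")
```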
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-24
... Equities Rule 104 To Adopt Pricing Obligations for Designated Market Makers September 20, 2010. Pursuant to... pricing obligations for Designated Market Makers (``DMMs''). The text of the proposed rule change is... adopt pricing obligations for DMMs. Under the proposal, the Exchange will require DMMs to continuously...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-07
... Organizations; International Securities Exchange, LLC; Notice of Designation of a Longer Period for Commission Action on Proposed Rule Change To List and Trade Option Contracts Overlying 10 Shares of a Security June... Act of 1934 (``Act'') \\1\\ and Rule 19b-4 thereunder,\\2\\ a proposed rule change to list and trade...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-25
...-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Designation of Longer Period for Commission Action on Proceedings To Determine Whether To Approve or Disapprove Proposed Rule Change To Link Market... Rule 19b-4 thereunder,\\2\\ a proposed rule change to discount certain market data fees and increase...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-15
...; Proposed Amendments to Rule G-8, on Books and Records, Rule G-9, on Record Retention, and Rule G-18, on... of proposed MSRB Rule G-43, on broker's brokers; amendments to MSRB Rule G-8, on books and records...
Music models aberrant rule decoding and reward valuation in dementia
Clark, Camilla N; Golden, Hannah L; McCallion, Oliver; Nicholas, Jennifer M; Cohen, Miriam H; Slattery, Catherine F; Paterson, Ross W; Fletcher, Phillip D; Mummery, Catherine J; Rohrer, Jonathan D; Crutch, Sebastian J; Warren, Jason D
2018-01-01
Aberrant rule- and reward-based processes underpin abnormalities of socio-emotional behaviour in major dementias. However, these processes remain poorly characterized. Here we used music to probe rule decoding and reward valuation in patients with frontotemporal dementia (FTD) syndromes and Alzheimer’s disease (AD) relative to healthy age-matched individuals. We created short melodies that were either harmonically resolved (‘finished’) or unresolved (‘unfinished’); the task was to classify each melody as finished or unfinished (rule processing) and rate its subjective pleasantness (reward valuation). Results were adjusted for elementary pitch and executive processing; neuroanatomical correlates were assessed using voxel-based morphometry. Relative to healthy older controls, patients with behavioural variant FTD showed impairments of both musical rule decoding and reward valuation, while patients with semantic dementia showed impaired reward valuation but intact rule decoding, patients with AD showed impaired rule decoding but intact reward valuation and patients with progressive non-fluent aphasia performed comparably to healthy controls. Grey matter associations with task performance were identified in anterior temporal, medial and lateral orbitofrontal cortices, previously implicated in computing diverse biological and non-biological rules and rewards. The processing of musical rules and reward distils cognitive and neuroanatomical mechanisms relevant to complex socio-emotional dysfunction in major dementias. PMID:29186630
Multi-Criteria Approach in Multifunctional Building Design Process
NASA Astrophysics Data System (ADS)
Gerigk, Mateusz
2017-10-01
The paper presents a new approach to the multifunctional building design process and defines problems related to the design of complex multifunctional buildings. Contemporary urban areas are characterized by very intensive use of space; buildings are being built bigger and contain more diverse functions to meet the needs of a large number of users in a single facility. These trends show the need to treat designed objects as an organized structure that must meet current design criteria. The design process, viewed in terms of a complex system, is a theoretical model which is the basis for optimizing solutions over the entire life cycle of the building. From the concept phase through the exploitation phase to the disposal phase, multipurpose spaces should guarantee aesthetics, functionality, system efficiency, system safety and environmental protection in the best possible way. The result of the analysis of the design process is presented as a theoretical model of the multifunctional structure. Expressing the multi-criteria model as a Cartesian product makes it possible to create a holistic representation of the designed building in the form of a graph model. The proposed network is the theoretical basis that can be used in the design process of complex engineering systems. The systematic multi-criteria approach makes it possible to maintain control over the entire design process and to achieve the best possible performance. With respect to current design requirements, there are no established design rules for multifunctional buildings in relation to their operating phase. Enriching the basic criteria with a functional flexibility criterion makes it possible to extend the exploitation phase, which brings advantages on many levels.
Nguyen, Su; Zhang, Mengjie; Tan, Kay Chen
2017-09-01
Automated design of dispatching rules for production systems has been an interesting research topic over the last several years. Machine learning, especially genetic programming (GP), has been a powerful approach to dealing with this design problem. However, intensive computational requirements, accuracy and interpretability are still its limitations. This paper aims at developing a new surrogate-assisted GP to help improve the quality of the evolved rules without significant computational costs. The experiments have verified the effectiveness and efficiency of the proposed algorithms as compared to those in the literature. Furthermore, new simplification and visualisation approaches have also been developed to improve the interpretability of the evolved rules. These approaches have shown great potential and proved to be a critical part of the automated design system.
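For readers unfamiliar with the setting, a dispatching rule is a priority function over the jobs waiting at a machine; GP represents it as an expression tree and must evaluate it inside a costly shop simulation, which is what motivates the surrogate. The toy single-machine simulation below evaluates two hand-written stand-ins for evolved rules; the job data and the second rule's form are invented.

```python
import random

# Candidate "evolved" priority functions over (job, current time)
rules = {
    "SPT":     lambda job, t: -job["pt"],                            # shortest processing time first
    "blend":   lambda job, t: -job["pt"] + 0.5 * (t - job["due"]),   # illustrative GP-like blend
}

def simulate(rule, jobs):
    """Total tardiness on one machine under the given dispatching rule."""
    t, tardiness = 0.0, 0.0
    queue = sorted(jobs, key=lambda j: j["rel"])   # jobs not yet released
    waiting = []
    while queue or waiting:
        while queue and queue[0]["rel"] <= t:
            waiting.append(queue.pop(0))
        if not waiting:
            t = queue[0]["rel"]
            continue
        job = max(waiting, key=lambda j: rule(j, t))
        waiting.remove(job)
        t += job["pt"]
        tardiness += max(0.0, t - job["due"])
    return tardiness

random.seed(0)
jobs = [{"rel": random.uniform(0, 20), "pt": random.uniform(1, 5),
         "due": random.uniform(10, 40)} for _ in range(30)]
for name, rule in rules.items():
    print(name, round(simulate(rule, [dict(j) for j in jobs]), 1))
```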
75 FR 47063 - Mutual Fund Distribution Fees; Confirmations
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-04
... competition for distribution services. The proposed rule and rule amendments are designed to protect... designed to enhance investor understanding of those charges, limit the cumulative sales charges each...(b) was designed to protect funds from being charged excessive sales and promotional expenses.\\26...
Mass Transit: Implementation of FTA’s New Starts Evaluation Process and FY 2001 Funding Proposals
2000-04-01
formalize the process. FTA issued a proposed rule on April 7, 1999, and plans to issue final regulations by the summer of 2000. In selecting projects for...commit funds to any more New Starts projects during the last 2 years of TEA-21—through fiscal year 2003. Because there are plans for many more...regional review of alternatives, develop preliminary engineering plans, and meet FTA's approval for the final design. TEA-21 requires that FTA evaluate
On the problem of zinc extraction from the slags of lead heat
NASA Astrophysics Data System (ADS)
Kozyrev, V. V.; Besser, A. D.; Paretskii, V. M.
2013-12-01
The possibilities of zinc extraction from the slags of lead heat are studied as applied to the ZAO Karat-TsM lead plant to be built for processing ore lead concentrates. The process of zinc extraction into commercial fumes using the technology of slag fuming by natural gas developed in Gintsvetmet is recommended for this purpose. Technological rules are developed for designing a commercial fuming plant, as applied to the conditions of the ZAO Karat-TsM plant.
Technology-design-manufacturing co-optimization for advanced mobile SoCs
NASA Astrophysics Data System (ADS)
Yang, Da; Gan, Chock; Chidambaram, P. R.; Nallapadi, Giri; Zhu, John; Song, S. C.; Xu, Jeff; Yeap, Geoffrey
2014-03-01
How to maintain Moore's Law scaling beyond the 193nm immersion resolution limit is the key question the semiconductor industry needs to answer in the near future. Process complexity will undoubtedly increase for the 14nm node and beyond, which brings both challenges and opportunities for technology development. A vertically integrated design-technology-manufacturing co-optimization flow is desired to better address the complicated issues that new process changes bring. In recent years smart mobile wireless devices have been the fastest growing consumer electronics market. Advanced mobile devices such as smartphones are complex systems with the overriding objective of providing the best user-experience value by harnessing all the technology innovations. The most critical system drivers are better system performance/power efficiency, cost effectiveness, and smaller form factors, which, in turn, drive the need for system design and solutions with More-than-Moore innovations. Mobile systems-on-chip (SoCs) have become the leading driver for semiconductor technology definition and manufacturing. Here we highlight how the co-optimization strategy influenced architecture, device/circuit, process technology and package, in the face of growing process cost/complexity and variability as well as design rule restrictions.
Complexity, Training Paradigm Design, and the Contribution of Memory Subsystems to Grammar Learning
Ettlinger, Marc; Wong, Patrick C. M.
2016-01-01
Although there is variability in nonnative grammar learning outcomes, the contributions of training paradigm design and memory subsystems are not well understood. To examine this, we presented learners with an artificial grammar that formed words via simple and complex morphophonological rules. Across three experiments, we manipulated training paradigm design and measured subjects' declarative, procedural, and working memory subsystems. Experiment 1 demonstrated that passive, exposure-based training boosted learning of both simple and complex grammatical rules, relative to no training. Additionally, procedural memory correlated with simple rule learning, whereas declarative memory correlated with complex rule learning. Experiment 2 showed that presenting corrective feedback during the test phase did not improve learning. Experiment 3 revealed that structuring the order of training so that subjects are first exposed to the simple rule and then the complex improved learning. The cumulative findings shed light on the contributions of grammatical complexity, training paradigm design, and domain-general memory subsystems in determining grammar learning success. PMID:27391085
An architecture for designing fuzzy logic controllers using neural networks
NASA Technical Reports Server (NTRS)
Berenji, Hamid R.
1991-01-01
Described here is an architecture for designing fuzzy controllers through a hierarchical process of control rule acquisition and by using special classes of neural network learning techniques. A new method for learning to refine a fuzzy logic controller is introduced. A reinforcement learning technique is used in conjunction with a multi-layer neural network model of a fuzzy controller. The model learns by updating its prediction of the plant's behavior and is related to Sutton's Temporal Difference (TD) method. The method proposed here has the advantage of using the control knowledge of an experienced operator and fine-tuning it through the process of learning. The approach is applied to a cart-pole balancing system.
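The prediction update the abstract relates to is the standard TD(0) rule, V(s) <- V(s) + alpha * (r + gamma * V(s') - V(s)), where the TD error also serves as the learning signal for tuning. The states, rewards, and trajectory below are invented for a cart-pole-like task.

```python
# Minimal TD(0) prediction sketch: the critic's value estimate for the
# current state moves toward the one-step target r + gamma * V(s').
gamma, alpha = 0.95, 0.1
V = {"upright": 0.0, "tilting": 0.0, "fallen": 0.0}

trajectory = [("upright", 0.0, "tilting"),
              ("tilting", 0.0, "fallen"),
              ("fallen", -1.0, "fallen")]          # (s, r, s') from one episode
for s, r, s_next in trajectory:
    td_error = r + gamma * V[s_next] - V[s]        # prediction error
    V[s] += alpha * td_error                       # critic update
print(V)
```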
Double jeopardy in inferring cognitive processes
Fific, Mario
2014-01-01
Inferences we make about underlying cognitive processes can be jeopardized in two ways due to problematic forms of aggregation. First, averaging across individuals is typically considered a very useful tool for removing random variability. The threat is that averaging across subjects leads to averaging across different cognitive strategies, thus harming our inferences. The second threat comes from the construction of inadequate research designs possessing a low diagnostic accuracy of cognitive processes. For that reason we introduced the systems factorial technology (SFT), which has primarily been designed to make inferences about underlying processing order (serial, parallel, coactive), stopping rule (terminating, exhaustive), and process dependency. SFT proposes that the minimal research design complexity needed to learn about n cognitive processes is 2^n conditions. In addition, SFT proposes that (a) each cognitive process should be controlled by a separate experimental factor, and (b) the saliency levels of all factors should be combined in a full factorial design. In the current study, the author cross-combined the levels of the jeopardies in a 2 × 2 analysis, leading to four different analysis conditions. The results indicate a decline in the diagnostic accuracy of inferences made about cognitive processes due to the presence of each jeopardy in isolation and when combined. The results warrant the development of more individual-subject analyses and the utilization of full-factorial (SFT) experimental designs. PMID:25374545
Arruego Rodríguez, G; Chueca Rodríguez, R
2000-01-01
The recent ruling by the Constitutional Court (116/1999, 17 June) ended the process initiated by the challenge filed by 63 conservative MPs against Law 35/1988, of 22 November, on Human Assisted Reproduction Techniques. In our opinion, this ruling helps define the scope of the constitutional provisions used by the appellants in their challenge, provisions which had already been interpreted in earlier rulings. The considerations given in this article are designed to establish the terms and framework which lawmakers and legal experts should bear in mind when they prepare future legislation, as will necessarily be the case. In view of some of the arguments used in the ruling, we believe it is appropriate to draw attention to some of the most salient constitutional aspects, such as the scope of the Constitutional Court's role as the ultimate judge of constitutionality, and the exact nature of the constitutional notion of a fundamental right which, although complicated at times, is nonetheless a precise and accurate legal concept.
Park, Glen D; Mitchel, Jules T
2016-06-01
While the development of medical products and approval by the U.S. Food and Drug Administration (FDA) is well known, the development of countermeasures against exposure to toxic levels of radiation, chemicals, and infectious agents requires special consideration, and there has been, to date, little experience in working with the FDA to obtain approval of these products. The FDA has published a regulation entitled "Approval of Biological Products when Human Efficacy Studies are not Ethical or Feasible." This regulation, known simply as the "Animal Rule," was designed to permit approval or licensing of drugs and biologics when efficacy studies in humans are not ethical or feasible. To date, 12 products have been approved under the Animal Rule. It is highly recommended that sponsors of products that are to be developed under the Animal Rule meet with the FDA and other government entities early in the development process to ensure that the efficacy and safety studies that are planned will meet the FDA's requirements for approval of the product. © 2016 New York Academy of Sciences.
ERIC Educational Resources Information Center
Labatut, Julie; Aggeri, Franck; Astruc, Jean-Michel; Bibe, Bernard; Girard, Nathalie
2009-01-01
Purpose: The purpose of this paper is to investigate the role of instruments defined as artefacts, rules, models or norms, in the articulation between knowing-in-practice and knowledge, in learning processes. Design/methodology/approach: The paper focuses on a distributed, knowledge-intensive and instrumented activity at the core of any collective…
Designing Inquiry for Upper Elementary Students: Lessons Learned from Driver's Ed
ERIC Educational Resources Information Center
Rabbat, Suzy
2014-01-01
One of the most memorable achievements of adolescence is the independence gained from obtaining a driver's license. Students are highly motivated to study the rules of the road, hone their skills behind the wheel, and meet all the state requirements to reach their goal. They are invested in the process because they value the outcome. The Driver's…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-19
...)(iii).\\4\\ \\4\\ 17 CFR 240.19b-4(f)(6)(iii). The text of the proposed rule change is below. Proposed... term ``Order Type'' shall mean the unique processing prescribed for designated orders that are eligible... responsible single plan processor in order to comply with the quotation requirements for Market Makers set...
The Rule of Mimetic Desire in Higher Education: Governing through Naming, Shaming and Faming
ERIC Educational Resources Information Center
Brøgger, Katja
2016-01-01
The initiation of the Bologna Process was accompanied by a radical transition of governance in higher education throughout Europe from government to governance. This article argues that this shift in the design of governing was connected to the need to subtly bypass the European Union (EU) subsidiarity principle that kept education out of the EU's…
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
The final rule of the Energy Policy Act of 2005 and its associated regulations enable covered state and alternative fuel provider fleets to obtain waivers from the alternative fuel vehicle (AFV)-acquisition requirements of Standard Compliance. Under Alternative Compliance, covered fleets instead meet a petroleum-use reduction requirement. This guidance document is designed to help fleets better understand the Alternative Compliance option and successfully complete the waiver application process.
Stern Frame and Hawsepipe Construction Technology
1978-01-01
to the classification societies regarding possible changes in the rules governing stern frame and hawsepipe designs were also considered. In the first...which were most representative of the ships being constructed or contemplated for construction in U.S. shipyards, and comparing them from the standpoint of...equipment needed in the manufacturing process. Time: Length of time needed to complete units on a comparative 1.3 Summary of Results The data obtained
Acquisition, representation and rule generation for procedural knowledge
NASA Technical Reports Server (NTRS)
Ortiz, Chris; Saito, Tim; Mithal, Sachin; Loftin, R. Bowen
1991-01-01
Current research into the design and continuing development of a system for the acquisition of procedural knowledge, its representation in useful forms, and proposed methods for automated C Language Integrated Production System (CLIPS) rule generation is discussed. The Task Analysis and Rule Generation Tool (TARGET) is intended to permit experts, individually or collectively, to visually describe and refine procedural tasks. The system is designed to represent the acquired knowledge in the form of graphical objects with the capacity for generating production rules in CLIPS. The generated rules can then be integrated into applications such as NASA's Intelligent Computer Aided Training (ICAT) architecture. Also described are proposed methods for use in translating the graphical and intermediate knowledge representations into CLIPS rules.
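Since the abstract centers on generating CLIPS production rules from graphical task descriptions, a minimal sketch of that serialization step may help. The task-step schema, slot names, and fact format below are hypothetical and are not TARGET's actual representation.

```python
# Serialize one procedural task step (as a graphical tool might capture it)
# into a CLIPS defrule; preconditions become patterns, the step becomes the action.
def to_clips(step):
    conds = " ".join(f"(completed {p})" for p in step["requires"])
    return (f"(defrule {step['name']}\n"
            f"   {conds}\n"
            f"   =>\n"
            f"   (assert (completed {step['name']}))\n"
            f"   (printout t \"perform: {step['action']}\" crlf))")

print(to_clips({"name": "open-valve-3",
                "requires": ["check-pressure", "don-gloves"],
                "action": "rotate valve 3 counter-clockwise"}))
```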
Verifying Architectural Design Rules of the Flight Software Product Line
NASA Technical Reports Server (NTRS)
Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen
2009-01-01
This paper presents experiences of verifying architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps (a) identify architecturally significant deviations that eluded code reviews, (b) clarify the design rules to the team, and (c) assess the overall implementation quality. Furthermore, it helps connect business goals to architectural principles, and to the implementation. This paper is the first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.
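The kind of consistency check described can be sketched as comparing observed module dependencies against an allowed layer map. The layer names, module names, and dependencies below are invented and are not the actual CFS architectural rules.

```python
# Which layers each layer may depend on (invented layer map)
allowed = {"app": {"service", "osal"}, "service": {"osal"}, "osal": set()}
layer_of = {"sch_app": "app", "msg_bus": "service", "os_shim": "osal"}

# Dependencies extracted from the implementation (e.g., by static analysis)
observed = [("sch_app", "msg_bus"), ("msg_bus", "os_shim"), ("os_shim", "msg_bus")]

violations = [(src, dst) for src, dst in observed
              if layer_of[dst] != layer_of[src]
              and layer_of[dst] not in allowed[layer_of[src]]]
print("deviations:", violations)   # [('os_shim', 'msg_bus')]: lowest layer calls upward
```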
Rule Based Category Learning in Patients with Parkinson’s Disease
Price, Amanda; Filoteo, J. Vincent; Maddox, W. Todd
2009-01-01
Measures of explicit rule-based category learning are commonly used in neuropsychological evaluation of individuals with Parkinson’s disease (PD) and the pattern of PD performance on these measures tends to be highly varied. We review the neuropsychological literature to clarify the manner in which PD affects the component processes of rule-based category learning and work to identify and resolve discrepancies within this literature. In particular, we address the manner in which PD and its common treatments affect the processes of rule generation, maintenance, shifting and selection. We then integrate the neuropsychological research with relevant neuroimaging and computational modeling evidence to clarify the neurobiological impact of PD on each process. Current evidence indicates that neurochemical changes associated with PD primarily disrupt rule shifting, and may disturb feedback-mediated learning processes that guide rule selection. Although surgical and pharmacological therapies remediate this deficit, it appears that the same treatments may contribute to impaired rule generation, maintenance and selection processes. These data emphasize the importance of distinguishing between the impact of PD and its common treatments when considering the neuropsychological profile of the disease. PMID:19428385
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-16
... be made in a nondiscriminatory fashion.\\14\\ \\14\\ See NYSE Arca Equities Rule 7.45(d)(3). NYSE Arca... Securities will be required to establish and enforce policies and procedures that are reasonably designed to... other things, that the rules of a national securities exchange be designed to prevent fraudulent and...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-26
...-Regulatory Organizations; NYSE Arca, Inc.; Notice of Designation of a Longer Period for Commission Action on Proposed Rule Change Proposing a Pilot Program To Create a Lead Market Maker Issuer Incentive Program for...'') \\1\\ and Rule 19b-4 thereunder,\\2\\ a proposed rule change to create and implement, on a pilot basis, a...
Learning CAD at University through Summaries of the Rules of Design Intent
ERIC Educational Resources Information Center
Barbero, Basilio Ramos; Pedrosa, Carlos Melgosa; Samperio, Raúl Zamora
2017-01-01
The ease with which 3D CAD models may be modified and reused are two key aspects that improve the design-intent variable and that can significantly shorten the development timelines of a product. A set of rules are gathered from various authors that take different 3D modelling strategies into account. These rules are then applied to CAD…
Basis of the tubesheet heat exchanger design rules used in the French pressure vessel code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Osweiller, F.
1992-02-01
For about 40 years most tubesheet exchangers have been designed according to the standards of TEMA. Partly due to their simplicity, these rules do not assure a safe heat-exchanger design in all cases. This is the main reason why new tubesheet design rules were developed in 1981 in France for the French pressure vessel code CODAP. For fixed tubesheet heat exchangers, the new rules account for the elastic rotational restraint of the shell and channel at the outer edge of the tubesheet, as proposed in 1959 by Galletly. For floating-head and U-tube heat exchangers, the approach developed by Gardner in 1969 was selected with some modifications. In both cases, the tubesheet is replaced by an equivalent solid plate with adequate effective elastic constants, and the tube bundle is simulated by an elastic foundation. The elastic restraint at the edge of the tubesheet due to the shell and channel is accounted for in different ways in the two types of heat exchangers. The purpose of the paper is to present the main basis of these rules and to compare them to the TEMA rules.
47 CFR 22.959 - Rules governing processing of applications for initial systems.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 2 (2010-10-01) COMMON CARRIER SERVICES PUBLIC MOBILE SERVICES Cellular Radiotelephone Service § 22.959 Rules governing processing of applications for initial systems. Pending applications for authority to operate the first...
Models of Quantitative Estimations: Rule-Based and Exemplar-Based Processes Compared
ERIC Educational Resources Information Center
von Helversen, Bettina; Rieskamp, Jorg
2009-01-01
The cognitive processes underlying quantitative estimations vary. Past research has identified task-contingent changes between rule-based and exemplar-based processes (P. Juslin, L. Karlsson, & H. Olsson, 2008). B. von Helversen and J. Rieskamp (2008), however, proposed a simple rule-based model--the mapping model--that outperformed the…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-03
... Proposed Rule Change Moving the Rule Text That Provides for Pegging on the Exchange From Supplementary Material .26 of NYSE Rule 70 to NYSE Rule 13 and Amending Such Text to (i) Permit Designated Market Maker... of the Terms of Substance of the Proposed Rule Change The Exchange proposes to move the rule text...
Reflow process stabilization by chemical characteristics and process conditions
NASA Astrophysics Data System (ADS)
Kim, Myoung-Soo; Park, Jeong-Hyun; Kim, Hak-Joon; Kim, Il-Hyung; Jeon, Jae-Ha; Gil, Myung-Goon; Kim, Bong-Ho
2002-07-01
With device design rules shrinking below 130nm, the patterning of smaller contact holes with enough process margin is required for mass production. Therefore, shrinking technology using a thermal reflow process has been applied to smaller contact hole formation. In this paper, we have investigated the effects of chemical characteristics such as molecular weight, blocking ratio of the resin, cross-linker amount and solvent type and composition on the reflow behavior of the resist, and found the optimized chemical composition for reflow-applicable conditions. Several process conditions, such as resist coating thickness and a multi-step thermal reflow method, have also been evaluated to stabilize the pattern profile and improve CD uniformity after the reflow process. From the experimental results, it was confirmed that the effect of the cross-linker in the resist on reflow properties such as reflow temperature and reflow rate is very critical, and that it controls the pattern profile during reflow processing. The optimized resist also showed stable CD uniformity and improved properties for top loss, film shrinkage and etch selectivity. Applying a lower resist coating thickness induced a symmetric pattern profile, even at the wafer edge, with a wider process margin. The introduction of a two-step baking method for the reflow process also yielded uniform CD values. It is believed that the application of a resist containing cross-linker and optimized process conditions for smaller contact hole patterning is necessary for mass production at design rules below 130nm.
NASA Astrophysics Data System (ADS)
Yang, Xudong; Sun, Lingyu; Zhang, Cheng; Li, Lijun; Dai, Zongmiao; Xiong, Zhenkai
2018-03-01
The application of polymer composites as a substitute for metal is an effective approach to reducing vehicle weight. However, the final performance of composite structures is determined not only by the material types, structural designs and manufacturing process, but also by their mutual constraints. Hence, an integrated "material-structure-process-performance" method is proposed for the conceptual and detailed design of composite components. The material selection is based on the principles of composite mechanics, such as the rule of mixtures for laminates. The design of component geometry, dimensions and stacking sequence is determined by parametric modeling and size optimization. The selection of process parameters is based on multi-physical-field simulation. The stiffness and modal constraint conditions were obtained from numerical analysis of the metal benchmark under typical load conditions. The optimal design was found by multidisciplinary optimization. Finally, the proposed method was validated by an application case of an automotive hatchback using carbon fiber reinforced polymer. Compared with the metal benchmark, the weight of the composite part is reduced by 38.8%, while its torsional and bending stiffness increase by 3.75% and 33.23%, respectively, and the first natural frequency also increases by 44.78%.
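The rule of mixtures mentioned for material selection has a compact closed form, E1 = Ef*Vf + Em*(1 - Vf) for the longitudinal ply modulus and its inverse form for the transverse modulus. The sketch below evaluates both for a carbon-fiber/epoxy-like combination; the property values are generic textbook magnitudes, not the paper's data.

```python
def ply_modulus(E_fiber, E_matrix, v_fiber):
    """Unidirectional ply moduli from the rule of mixtures (Pa)."""
    E1 = E_fiber * v_fiber + E_matrix * (1 - v_fiber)           # longitudinal
    E2 = 1.0 / (v_fiber / E_fiber + (1 - v_fiber) / E_matrix)   # transverse (inverse rule)
    return E1, E2

E1, E2 = ply_modulus(E_fiber=230e9, E_matrix=3.5e9, v_fiber=0.55)  # CFRP-like values
print(f"E1 = {E1 / 1e9:.0f} GPa, E2 = {E2 / 1e9:.1f} GPa")
```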
Fatigue curve needs for higher strength 2-1/4Cr-1Mo steel for petroleum process vessels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jaske, C.E.
This paper reviews the data needed to develop fatigue design rules for pressure vessels fabricated from heat-treated 2-1/4Cr-1Mo steel (SA-387, Grade 22, Class 2 plates and SA-336, Grade F22 forgings) that are operated or designed to operate at temperatures greater than 371 C (700 F). The available data were reviewed, and the results of that review were used to develop recommendations for needed analytical and experimental work. Extension of the fatigue-curve approach currently used for temperatures up to 371 C (700 F) and development of a fracture-mechanics-based crack-growth approach were addressed. Both of these approaches must include means for assessing the time-dependent effects of oxidation and/or creep when fatigue cycling occurs at low strain rates or includes hold times. The recommendations of this study provide a plan for the development of fatigue design rules for the use of heat-treated 2-1/4Cr-1Mo steel at temperatures in the range of 371 to 482 C (700 to 900 F).
META II Complex Systems Design and Analysis (CODA)
2011-08-01
[Only table-of-contents and figure-list fragments survive for this record, including section 3.8.7 "Variables, Parameters and Constraints", section 3.8.8 "Objective...", "Figure 7: Inputs, States, Outputs and Parameters of System Requirements Specifications", a design rule based on device parameter, and "Figure 35: AEE Device Design Rules (excerpt)".]
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-14
... Proposed Rule Change To Modify the Requirements To Qualify for Credits as a Designated Liquidity Provider... requirements to qualify for credits as a designated liquidity provider under Rule 7018(i) and to make a minor... Designated Liquidity Providers: Charge to Designated Liquidity Provider $0.003 per share executed entering...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-30
...., wish to apply these airworthiness design standards to other airplane models, OHA, Inc. must submit a... affects only certain airworthiness design standards on Cessna model C172I, C172K, C172L, C172M airplanes... Design Standards for Acceptance Under the Primary Category Rule; Orlando Helicopter Airways (OHA), Inc...
Bendability optimization of flexible optical nanoelectronics via neutral axis engineering.
Lee, Sangmin; Kwon, Jang-Yeon; Yoon, Daesung; Cho, Handong; You, Jinho; Kang, Yong Tae; Choi, Dukhyun; Hwang, Woonbong
2012-05-15
The enhancement of bendability of flexible nanoelectronics is critically important to realize future portable and wearable nanoelectronics for personal and military purposes. Because there is an enormous variety of materials and structures that are used for flexible nanoelectronic devices, a governing design rule for optimizing the bendability of these nanodevices is required. In this article, we suggest a design rule to optimize the bendability of flexible nanoelectronics through neutral axis (NA) engineering. In flexible optical nanoelectronics, transparent electrodes such as indium tin oxide (ITO) are usually the most fragile under an external load because of their brittleness. Therefore, we representatively focus on the bendability of ITO which has been widely used as transparent electrodes, and the NA is controlled by employing a buffer layer on the ITO layer. First, we independently investigate the effect of the thickness and elastic modulus of a buffer layer on the bendability of an ITO film. Then, we develop a design rule for the bendability optimization of flexible optical nanoelectronics. Because NA is determined by considering both the thickness and elastic modulus of a buffer layer, the design rule is conceived to be applicable regardless of the material and thickness that are used for the buffer layer. Finally, our design rule is applied to optimize the bendability of an organic solar cell, which allows the bending radius to reach about 1 mm. Our design rule is thus expected to provide a great strategy to enhance the bending performance of a variety of flexible nanoelectronics.
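The design rule rests on the classical composite-beam result that the neutral axis (NA) sits at the modulus-weighted centroid of the layer stack; placing the NA at the fragile ITO layer minimizes its bending strain. A minimal sketch of that calculation, assuming illustrative layer thicknesses and moduli rather than the paper's measured stack:

```python
# Sketch: locate the neutral axis of a layered film stack as the
# modulus-weighted centroid (per unit width). Layer values below are
# illustrative assumptions, not the paper's measured parameters.

def neutral_axis(layers):
    """layers: list of (thickness_m, modulus_Pa), bottom to top.
    Returns NA height from the bottom surface."""
    z = 0.0
    num = den = 0.0
    for t, E in layers:
        mid = z + t / 2          # centroid of this layer
        num += E * t * mid
        den += E * t
        z += t
    return num / den

# substrate, ITO, buffer (bottom to top); units: m, Pa
stack = [(125e-6, 5e9), (150e-9, 116e9), (20e-6, 3e9)]
na = neutral_axis(stack)
ito_center = 125e-6 + 75e-9
print(f"NA at {na*1e6:.1f} um; ITO center at {ito_center*1e6:.1f} um")
```

Because both the buffer's thickness and its modulus enter the weighted sum, either can be tuned to pull the NA toward the ITO layer, which is why the rule is material-independent as the abstract claims.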
Boot, Nathalie; Baas, Matthijs; Mühlfeld, Elisabeth; de Dreu, Carsten K W; van Gaal, Simon
2017-09-01
Critical to creative cognition and performance is both the generation of multiple alternative solutions in response to open-ended problems (divergent thinking) and a series of cognitive operations that converges on the correct or best possible answer (convergent thinking). Although the neural underpinnings of divergent and convergent thinking are still poorly understood, several electroencephalography (EEG) studies point to differences in alpha-band oscillations between these thinking modes. We reason that, because most previous studies employed typical block designs, these pioneering findings may mainly reflect the more sustained aspects of creative processes that extend over longer time periods, and that still much is unknown about the faster-acting neural mechanisms that dissociate divergent from convergent thinking during idea generation. To this end, we developed a new event-related paradigm, in which we measured participants' tendency to implicitly follow a rule set by examples, versus breaking that rule, during the generation of novel names for specific categories (e.g., pasta, planets). This approach allowed us to compare the oscillatory dynamics of rule convergent and rule divergent idea generation and at the same time enabled us to measure spontaneous switching between these thinking modes on a trial-to-trial basis. We found that, relative to more systematic, rule convergent thinking, rule divergent thinking was associated with widespread decreases in delta band activity. Therefore, this study contributes to advancing our understanding of the neural underpinnings of creativity by addressing some methodological challenges that neuroscientific creativity research faces. Copyright © 2017 Elsevier Ltd. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-22
...-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Designation of a Longer Period for Commission Action on a Proposed Rule Change Relating to Wash Sale Transactions and FINRA Rule...-4 thereunder,\\2\\ a proposed rule change to amend FINRA Rule 5210. The proposed rule change was...
14 CFR 93.121 - Applicability.
Code of Federal Regulations, 2010 CFR
2010-01-01
... AND GENERAL OPERATING RULES SPECIAL AIR TRAFFIC RULES High Density Traffic Airports § 93.121 Applicability. This subpart designates high density traffic airports and prescribes air traffic rules for...
14 CFR 93.121 - Applicability.
Code of Federal Regulations, 2012 CFR
2012-01-01
... AND GENERAL OPERATING RULES SPECIAL AIR TRAFFIC RULES High Density Traffic Airports § 93.121 Applicability. This subpart designates high density traffic airports and prescribes air traffic rules for...
14 CFR 93.121 - Applicability.
Code of Federal Regulations, 2014 CFR
2014-01-01
... AND GENERAL OPERATING RULES SPECIAL AIR TRAFFIC RULES High Density Traffic Airports § 93.121 Applicability. This subpart designates high density traffic airports and prescribes air traffic rules for...
14 CFR 93.121 - Applicability.
Code of Federal Regulations, 2011 CFR
2011-01-01
... AND GENERAL OPERATING RULES SPECIAL AIR TRAFFIC RULES High Density Traffic Airports § 93.121 Applicability. This subpart designates high density traffic airports and prescribes air traffic rules for...
14 CFR 93.121 - Applicability.
Code of Federal Regulations, 2013 CFR
2013-01-01
... AND GENERAL OPERATING RULES SPECIAL AIR TRAFFIC RULES High Density Traffic Airports § 93.121 Applicability. This subpart designates high density traffic airports and prescribes air traffic rules for...
Intuitive and deliberate judgments are based on common principles.
Kruglanski, Arie W; Gigerenzer, Gerd
2011-01-01
A popular distinction in cognitive and social psychology has been between intuitive and deliberate judgments. This juxtaposition has aligned in dual-process theories of reasoning associative, unconscious, effortless, heuristic, and suboptimal processes (assumed to foster intuitive judgments) versus rule-based, conscious, effortful, analytic, and rational processes (assumed to characterize deliberate judgments). In contrast, we provide convergent arguments and evidence for a unified theoretical approach to both intuitive and deliberative judgments. Both are rule-based, and in fact, the very same rules can underlie both intuitive and deliberate judgments. The important open question is that of rule selection, and we propose a 2-step process in which the task itself and the individual's memory constrain the set of applicable rules, whereas the individual's processing potential and the (perceived) ecological rationality of the rule for the task guide the final selection from that set. Deliberate judgments are not generally more accurate than intuitive judgments; in both cases, accuracy depends on the match between rule and environment: the rules' ecological rationality. Heuristics that are less effortful and in which parts of the information are ignored can be more accurate than cognitive strategies that have more information and computation. The proposed framework adumbrates a unified approach that specifies the critical dimensions on which judgmental situations may vary and the environmental conditions under which rules can be expected to be successful.
Optimization of High-Dimensional Functions through Hypercube Evaluation
Abiyev, Rahib H.; Tunay, Mustafa
2015-01-01
A novel learning algorithm for solving global numerical optimization problems is proposed. The proposed algorithm is an intensive stochastic search method based on the evaluation and optimization of a hypercube and is called the hypercube optimization (HO) algorithm. The HO algorithm comprises an initialization-and-evaluation process, a displacement-shrink process, and a searching-space process. The initialization-and-evaluation process generates an initial solution and evaluates the solutions in the given hypercube. The displacement-shrink process determines the displacement and evaluates the objective function at the new points, and the searching-space process determines the next hypercube using certain rules and evaluates the new solutions. The algorithms for these processes are designed and presented in the paper. The designed HO algorithm is tested on standard benchmark functions. Simulations of the HO algorithm have been performed for the optimization of functions of 1000, 5000, or even 10000 dimensions. Comparative simulation results with other approaches demonstrate that the proposed algorithm is a potential candidate for the optimization of both low- and high-dimensional functions. PMID:26339237
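A minimal sketch of the sample-recenter-shrink loop the abstract outlines; the shrink factor, sample budget, and test function are assumptions, not the paper's parameterization:

```python
import numpy as np

# Sketch of a hypercube-optimization-style loop: sample points in the
# current hypercube, recenter on the best point found, shrink the box.
def hypercube_opt(f, center, radius, n_points=64, n_iter=200, shrink=0.9):
    best_x, best_f = center, f(center)
    for _ in range(n_iter):
        # Initialization-and-evaluation: sample the current hypercube.
        pts = center + np.random.uniform(-radius, radius, (n_points, len(center)))
        vals = np.apply_along_axis(f, 1, pts)
        i = vals.argmin()
        if vals[i] < best_f:          # displacement step: move the center
            best_x, best_f = pts[i], vals[i]
            center = pts[i]
        radius *= shrink              # shrink the searching space
    return best_x, best_f

sphere = lambda x: float(np.sum(x**2))
x, fx = hypercube_opt(sphere, np.full(1000, 3.0), radius=5.0)
print(fx)
```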
78 FR 67467 - Registration of Municipal Advisors
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-12
... the Exchange Act. These rules and forms are designed to give effect to provisions of Title IX of the... ``investment strategies'' in the final rule is designed to address the main concerns raised by these commenters... state, and provide tax advantages designed to encourage saving for future college costs.\\54\\ 529 Savings...
Multi-Mission Automated Task Invocation Subsystem
NASA Technical Reports Server (NTRS)
Cheng, Cecilia S.; Patel, Rajesh R.; Sayfi, Elias M.; Lee, Hyun H.
2009-01-01
Multi-Mission Automated Task Invocation Subsystem (MATIS) is software that establishes a distributed data-processing framework for automated generation of instrument data products from a spacecraft mission. Each mission may set up a set of MATIS servers for processing its data products. MATIS embodies lessons learned in experience with prior instrument- data-product-generation software. MATIS is an event-driven workflow manager that interprets project-specific, user-defined rules for managing processes. It executes programs in response to specific events under specific conditions according to the rules. Because requirements of different missions are too diverse to be satisfied by one program, MATIS accommodates plug-in programs. MATIS is flexible in that users can control such processing parameters as how many pipelines to run and on which computing machines to run them. MATIS has a fail-safe capability. At each step, MATIS captures and retains pertinent information needed to complete the step and start the next step. In the event of a restart, this information is retrieved so that processing can be resumed appropriately. At this writing, it is planned to develop a graphical user interface (GUI) for monitoring and controlling a product generation engine in MATIS. The GUI would enable users to schedule multiple processes and manage the data products produced in the processes. Although MATIS was initially designed for instrument data product generation,
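A minimal sketch of the event-driven, rule-interpreting dispatch pattern MATIS describes, where user-defined rules bind events and conditions to plug-in actions; the event names, conditions, and actions below are hypothetical placeholders:

```python
# Sketch of an event-driven workflow dispatcher: each rule pairs an
# event name with a condition and an action (plug-in program stand-in).
from typing import Callable

Rule = tuple[str, Callable[[dict], bool], Callable[[dict], None]]

RULES: list[Rule] = [
    ("file_arrived",
     lambda ev: ev["instrument"] == "camera-A",
     lambda ev: print("launch level-1 pipeline for", ev["path"])),
    ("pipeline_done",
     lambda ev: ev["status"] == "ok",
     lambda ev: print("checkpoint state, schedule level-2 step")),
]

def dispatch(event_name: str, payload: dict):
    """Run every action whose event name and condition match."""
    for name, cond, action in RULES:
        if name == event_name and cond(payload):
            action(payload)

dispatch("file_arrived", {"instrument": "camera-A", "path": "/data/x.raw"})
```

Capturing the payload at each step, as the fail-safe capability requires, would amount to persisting it before `action` runs so a restart can resume from the last completed event.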
49 CFR 106.10 - Process for issuing rules.
Code of Federal Regulations, 2010 CFR
2010-10-01
... filing written comments or making oral presentations). (4) Whom to call if you have questions about the... 49 Transportation 2 2010-10-01 2010-10-01 false Process for issuing rules. 106.10 Section 106.10... PHMSA Rulemaking Documents § 106.10 Process for issuing rules. (a) PHMSA (“we”) uses informal rulemaking...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-26
... Organizations; National Stock Exchange, Inc.; Notice of Designation of a Longer Period for Commission Action on Proposed Rule Change To Adopt a New Order Type Called the ``Auto-Ex Only'' Order March 19, 2013. On January... (``Act'') \\1\\ and Rule 19b-4 thereunder,\\2\\ a proposed rule change to adopt a new order type called the...
Creation of a diagnostic wait times measurement framework based on evidence and consensus.
Gilbert, Julie E; Dobrow, Mark J; Kaan, Melissa; Dobranowski, Julian; Srigley, John R; Jusko Friedman, Audrey; Irish, Jonathan C
2014-09-01
Public reporting of wait times worldwide has to date focused largely on treatment wait times and is limited in its ability to capture earlier parts of the patient journey. The interval between suspicion and diagnosis or ruling out of cancer is a complex phase of the cancer journey. Diagnostic delays and inefficient use of diagnostic imaging procedures can result in poor patient outcomes, both physical and psychosocial. This study was designed to develop a framework that could be adopted for multiple disease sites across different jurisdictions to enable the measurement of diagnostic wait times and diagnostic delay. Diagnostic benchmarks and targets in cancer systems were explored through a targeted literature review and jurisdictional scan. Cancer system leaders and clinicians were interviewed to validate the information found in the jurisdictional scan. An expert panel was assembled to review and, through a modified Delphi consensus process, provide feedback on a diagnostic wait times framework. The consensus process resulted in agreement on a measurement framework that identified suspicion, referral, diagnosis, and treatment as the main time points for measuring this critical phase of the patient journey. This work will help guide initiatives designed to improve patient access to health services by developing an evidence-based approach to standardization of the various waypoints during the diagnostic pathway. The diagnostic wait times measurement framework provides a yardstick to measure the performance of programs that are designed to manage and expedite care processes between referral and diagnosis or ruling out of cancer. Copyright © 2014 by American Society of Clinical Oncology.
Conditions and Rules for Rational Discussion in a Legal Process: A Pragma-Dialectical Perspective.
ERIC Educational Resources Information Center
Feteris, Eveline T.
1990-01-01
Uses a pragma-dialectical analysis to argue that the legal process is rational. Suggests that the legal system's own rules guarantee that the conditions of rational and efficient discussion are present. Describes the Netherlands' civil procedure rules and shows how such rules help ensure that legal discussions are rational. (SG)
Conceptualising and managing trade-offs in sustainability assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morrison-Saunders, Angus, E-mail: A.Morrison-Saunders@murdoch.edu.au; School of Environmental Science, Murdoch University; Pope, Jenny
One of the defining characteristics of sustainability assessment as a form of impact assessment is that it provides a forum for the explicit consideration of the trade-offs that are inherent in complex decision-making processes. Few sustainability assessments have achieved this goal though, and none has considered trade-offs in a holistic fashion throughout the process. Recent contributions such as the Gibson trade-off rules have significantly progressed thinking in this area by suggesting appropriate acceptability criteria for evaluating substantive trade-offs arising from proposed development, as well as process rules for how evaluations of acceptability should occur. However, there has been negligible uptake of these rules in practice. Overall, we argue that there is inadequate consideration of trade-offs, both process and substantive, throughout the sustainability assessment process, and insufficient consideration of how process decisions and compromises influence substantive outcomes. This paper presents a framework for understanding and managing both process and substantive trade-offs within each step of a typical sustainability assessment process. The framework draws together previously published literature and offers case studies that illustrate aspects of the practical application of the framework. The framing and design of sustainability assessment are vitally important, as process compromises or trade-offs can have substantive consequences in terms of sustainability outcomes delivered, with the choice of alternatives considered being a particularly significant determinant of substantive outcomes. The demarcation of acceptable from unacceptable impacts is a key aspect of managing trade-offs. Offsets can be considered as a form of trade-off within a category of sustainability that are utilised to enhance preferred alternatives once conditions of impact acceptability have been met. In this way they may enable net gains to be delivered; another imperative for progress to sustainability. Understanding the nature and implications of trade-offs within sustainability assessment is essential to improving practice. Highlights:
- A framework for understanding trade-offs in sustainability assessment is presented.
- Trade-offs should be considered as early as possible in any sustainability assessment process.
- Demarcation of acceptable from unacceptable impacts is needed for effective trade-off management.
- Offsets in place, time or kind can ensure and attain a net benefit outcome overall.
- Gibson's trade-off rules provide useful acceptability criteria and process guidance.
NASA Technical Reports Server (NTRS)
1994-01-01
A NASA contract led to the development of faster and more energy-efficient semiconductor materials for digital integrated circuits. Gallium arsenide (GaAs) conducts electrons 4-6 times faster than silicon and uses less power at frequencies above 100-150 megahertz. However, the material is expensive, brittle, and fragile, and it long lacked the computer-aided engineering tools needed to address these problems. Systems & Processes Engineering Corporation (SPEC) developed a series of GaAs cell libraries for cell layout, design rule checking, logic synthesis, placement and routing, simulation, and chip assembly. The system is marketed by Compare Design Automation.
CMLLite: a design philosophy for CML
2011-01-01
CMLLite is a collection of definitions and processes which provide strong and flexible validation for a document in Chemical Markup Language (CML). It consists of an updated CML schema (schema3), conventions specifying rules in both human and machine-understandable forms and a validator available both online and offline to check conformance. This article explores the rationale behind the changes which have been made to the schema, explains how conventions interact and how they are designed, formulated, implemented and tested, and gives an overview of the validation service. PMID:21999395
Scalable printed electronics: an organic decoder addressing ferroelectric non-volatile memory.
Ng, Tse Nga; Schwartz, David E; Lavery, Leah L; Whiting, Gregory L; Russo, Beverly; Krusor, Brent; Veres, Janos; Bröms, Per; Herlogsson, Lars; Alam, Naveed; Hagel, Olle; Nilsson, Jakob; Karlsson, Christer
2012-01-01
Scalable circuits of organic logic and memory are realized using all-additive printing processes. A 3-bit organic complementary decoder is fabricated and used to read and write non-volatile, rewritable ferroelectric memory. The decoder-memory array is patterned by inkjet and gravure printing on flexible plastics. Simulation models for the organic transistors are developed, enabling circuit designs tolerant of the variations in printed devices. We explain the key design rules in fabrication of complex printed circuits and elucidate the performance requirements of materials and devices for reliable organic digital logic.
Levels of Integration in Cognitive Control and Sequence Processing in the Prefrontal Cortex
Bahlmann, Jörg; Korb, Franziska M.; Gratton, Caterina; Friederici, Angela D.
2012-01-01
Cognitive control is necessary to flexibly act in changing environments. Sequence processing is needed in language comprehension to build the syntactic structure in sentences. Functional imaging studies suggest that sequence processing engages the left ventrolateral prefrontal cortex (PFC). In contrast, cognitive control processes additionally recruit bilateral rostral lateral PFC regions. The present study aimed to investigate these two types of processes in one experimental paradigm. Sequence processing was manipulated using two different sequencing rules varying in complexity. Cognitive control was varied with different cue-sets that determined the choice of a sequencing rule. Univariate analyses revealed distinct PFC regions for the two types of processing (i.e. sequence processing: left ventrolateral PFC and cognitive control processing: bilateral dorsolateral and rostral PFC). Moreover, in a common brain network (including left lateral PFC and intraparietal sulcus) no interaction between sequence and cognitive control processing was observed. In contrast, a multivariate pattern analysis revealed an interaction of sequence and cognitive control processing, such that voxels in left lateral PFC and parietal cortex showed different tuning functions for tasks involving different sequencing and cognitive control demands. These results suggest that the difference between the process of rule selection (i.e. cognitive control) and the process of rule-based sequencing (i.e. sequence processing) find their neuronal underpinnings in distinct activation patterns in lateral PFC. Moreover, the combination of rule selection and rule sequencing can shape the response of neurons in lateral PFC and parietal cortex. PMID:22952762
Landmarks selection in street map design
NASA Astrophysics Data System (ADS)
Kao, C. J.
2014-02-01
In Taiwan, many electronic maps present their landmarks according to the category of the feature; a designer who lacks knowledge of the mental representation of space can cause the map to lose its communicative effect. To address this map design problem, 111 participants in this research were asked to select the proper landmarks from the study area through long-term memory recall, navigation and observation, and short-term memory processing. The results reveal that in Taiwan convenience stores are the most popular local landmarks in both rural and urban areas. Their commercial signs have a unique design and bright colors; contrasting with their background, this makes the convenience store a salient feature. This study also developed a rule to assess the priority of landmarks so that they can be designed into maps of different scales.
Development of a knowledge management system for complex domains.
Perott, André; Schader, Nils; Bruder, Ralph; Leonhardt, Jörg
2012-01-01
Deutsche Flugsicherung GmbH, the German Air Navigation Service Provider, follows a systematic approach, called HERA, for investigating incidents. The HERA analysis shows a distinctive occurrence of incidents in German air traffic control in which the visual perception of information plays a key role. The reasons can be partially traced back to workstation design, where basic ergonomic rules and principles are in some cases not sufficiently followed by the designers. In cooperation with the Institute of Ergonomics in Darmstadt, the DFS investigated possible approaches that may support designers in implementing ergonomic systems. None of the currently available tools was found to meet the identified user requirements holistically. It was therefore suggested to develop an enhanced software tool called the Design Process Guide. The name Design Process Guide indicates that this tool exceeds the classic functions of currently available knowledge management systems. It offers "design element"-based access, presents processual and content-related topics, and shows the implications of certain design decisions. Furthermore, it serves as documentation, detailing why a designer came to a decision under a particular set of conditions.
Ordering Design Tasks Based on Coupling Strengths
NASA Technical Reports Server (NTRS)
Rogers, J. L.; Bloebaum, C. L.
1994-01-01
The design process associated with large engineering systems requires an initial decomposition of the complex system into modules of design tasks which are coupled through the transference of output data. In analyzing or optimizing such a coupled system, it is essential to be able to determine which interactions figure prominently enough to significantly affect the accuracy of the system solution. Many decomposition approaches assume the capability is available to determine what design tasks and interactions exist and what order of execution will be imposed during the analysis process. Unfortunately, this is often a complex problem and beyond the capabilities of a human design manager. A new feature for DeMAID (Design Manager's Aid for Intelligent Decomposition) will allow the design manager to use coupling strength information to find a proper sequence for ordering the design tasks. In addition, these coupling strengths aid in deciding if certain tasks or couplings could be removed (or temporarily suspended) from consideration to achieve computational savings without a significant loss of system accuracy. New rules are presented and two small test cases are used to show the effects of using coupling strengths in this manner.
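One way to make the coupling-strength idea concrete (a sketch under stated assumptions, not DeMAID's actual heuristics): score each candidate task order by the total strength of couplings that point backward in the sequence, then pick the order minimizing that feedback cost.

```python
import itertools

# Sketch: sequence coupled design tasks so strong couplings feed
# forward. The strength matrix is an illustrative assumption.
# strength[i][j]: strength of data flowing from task i to task j.
strength = [
    [0, 3, 0],
    [1, 0, 2],
    [4, 0, 0],
]

def feedback_cost(order):
    """Sum of coupling strengths that point backward in the sequence."""
    pos = {t: k for k, t in enumerate(order)}
    return sum(strength[i][j]
               for i in range(len(order)) for j in range(len(order))
               if pos[i] > pos[j])

best = min(itertools.permutations(range(3)), key=feedback_cost)
print(best, feedback_cost(best))
```

Couplings with near-zero strength could then be suspended outright, trading a small loss of system accuracy for computational savings, as the abstract suggests.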
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-24
... Rule 104 To Adopt Pricing Obligations for Designated Market Makers September 20, 2010. Pursuant to... of the Proposed Rule Change The Exchange proposes to amend Rule 104 to adopt pricing obligations for.... Purpose The Exchange proposes to amend Rule 104 to adopt pricing obligations for DMMs. Under the proposal...
NASA Astrophysics Data System (ADS)
Chang, Ya-Ting; Chang, Li-Chiu; Chang, Fi-John
2005-04-01
To bridge the gap between academic research and actual operation, we propose an intelligent control system for reservoir operation. The methodology includes two major processes: knowledge acquisition and implementation, and the inference system. In this study, a genetic algorithm (GA) and a fuzzy rule base (FRB) are used to extract knowledge from the historical inflow data with a design objective function and from the operating rule curves, respectively. The adaptive network-based fuzzy inference system (ANFIS) is then used to implement the knowledge, to create the fuzzy inference system, and to estimate the optimal reservoir operation. To investigate its applicability and practicability, the Shihmen reservoir, Taiwan, is used as a case study. For the purpose of comparison, a simulation of the currently used M-5 operating rule curve is also performed. The results demonstrate that (1) the GA is an efficient way to search for optimal input-output patterns, (2) the FRB can extract the knowledge embedded in the operating rule curves, and (3) ANFIS models built on different types of knowledge can produce much better performance than the traditional M-5 curves in real-time reservoir operation. Moreover, we show that the model can be made more intelligent for reservoir operation if more information (or knowledge) is involved.
Silicon Photonics: Challenges and Future
2007-01-01
process or phonon assisted. It directly impacts the internal quantum efficiency through the relationship η_i = (1 + τ_rad/τ_non-rad)^(-1). There are... linear cavity approach, the reported differential quantum efficiency is currently low. The measured characteristic temperature (T0) is lower than... design rule changes; package design. 4.1.2 Inter-chip interconnects: There is a requirement on the circuit card to transfer data more efficiently between...
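A minimal LaTeX reconstruction of the garbled relation above, with symbol meanings assumed from context (η_i internal quantum efficiency, τ_rad radiative lifetime, τ_non-rad non-radiative lifetime):

```latex
\eta_i = \left(1 + \frac{\tau_{\mathrm{rad}}}{\tau_{\mathrm{non\text{-}rad}}}\right)^{-1}
```

With illustrative values of τ_rad = 10 µs and τ_non-rad = 100 ns (orders of magnitude often quoted for bulk silicon; assumed here only for illustration), η_i ≈ (1 + 100)^(-1) ≈ 1%, which matches the excerpt's point that this lifetime ratio directly limits silicon's emission efficiency.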
EPA has initiated a process to revise certain requirements in the WPS. By the end of FY2018, EPA expects to publish a Notice of Proposed Rulemaking to solicit public input on proposed revisions to the WPS requirements for minimum ages, designated represen
ERIC Educational Resources Information Center
Jelavich, Barbara
Designed as an introductory history, this book covers developments in the Balkan Peninsula from the 17th through the 19th centuries. Emphasis is placed on the process by which separate nationalities broke away from imperial rule, established independent states, and embarked on economic and social modernization. To establish perspective on the role…
Energy-efficient neuron, synapse and STDP integrated circuits.
Cruz-Albrecht, Jose M; Yung, Michael W; Srinivasa, Narayan
2012-06-01
Ultra-low energy biologically-inspired neuron and synapse integrated circuits are presented. The synapse includes a spike timing dependent plasticity (STDP) learning rule circuit. These circuits have been designed, fabricated and tested using a 90 nm CMOS process. Experimental measurements demonstrate proper operation. The neuron and the synapse with STDP circuits have an energy consumption of around 0.4 pJ per spike and synaptic operation respectively.
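Such circuits implement a spike-timing-dependent plasticity rule; a common exponential form is sketched below, with amplitudes and time constant as illustrative assumptions rather than the measured parameters of the fabricated 90 nm chip:

```python
import math

# Sketch of the exponential STDP learning rule: weight change depends
# on the relative timing of pre- and post-synaptic spikes.
A_PLUS, A_MINUS = 0.01, 0.012   # potentiation / depression amplitudes
TAU = 20e-3                      # time constant, seconds

def stdp_dw(t_pre, t_post):
    """Weight change for one pre/post spike pair."""
    dt = t_post - t_pre
    if dt > 0:   # pre fires before post: potentiate
        return A_PLUS * math.exp(-dt / TAU)
    else:        # post fires before pre: depress
        return -A_MINUS * math.exp(dt / TAU)

print(stdp_dw(0.000, 0.005), stdp_dw(0.005, 0.000))
```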
A fuzzy classifier system for process control
NASA Technical Reports Server (NTRS)
Karr, C. L.; Phillips, J. C.
1994-01-01
A fuzzy classifier system that discovers rules for controlling a mathematical model of a pH titration system was developed by researchers at the U.S. Bureau of Mines (USBM). Fuzzy classifier systems successfully combine the strengths of learning classifier systems and fuzzy logic controllers. Learning classifier systems resemble familiar production-rule-based systems, but they represent their IF-THEN rules by strings of characters rather than in traditional linguistic terms. Fuzzy logic is a tool that allows for the incorporation of abstract concepts into rule-based systems, thereby allowing the rules to resemble the familiar 'rules of thumb' commonly used by humans when solving difficult process control and reasoning problems. Like learning classifier systems, fuzzy classifier systems employ a genetic algorithm to explore and sample new rules for manipulating the problem environment. Like fuzzy logic controllers, fuzzy classifier systems encapsulate knowledge in the form of production rules. The results presented in this paper demonstrate the ability of fuzzy classifier systems to generate a fuzzy logic-based process control system.
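To illustrate the string-encoded rule representation the abstract contrasts with linguistic rules (a sketch; the alphabet, the two-condition layout, and the pH pairing are assumptions, not the USBM encoding):

```python
import random

# Sketch: fuzzy classifier rules encoded as character strings so a
# genetic algorithm can mutate and recombine them.
ALPHABET = "LMH#"  # Low / Medium / High / wildcard

def random_rule(n_conditions=2):
    cond = "".join(random.choice(ALPHABET) for _ in range(n_conditions))
    action = random.choice("LMH")
    return cond + ":" + action

def matches(rule, reading):
    cond = rule.split(":")[0]
    return all(c in ("#", r) for c, r in zip(cond, reading))

rules = [random_rule() for _ in range(20)]
reading = "HM"  # e.g. pH high, flow medium (fuzzified upstream)
print([r for r in rules if matches(r, reading)])
```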
Rules Mothers and Sons Use to Integrate Intent and Damage Information in Their Moral Judgments.
ERIC Educational Resources Information Center
Leon, Manuel
1984-01-01
The similarity between rules used by mothers and those used by sons was extensive. Results suggest that research should emphasize the process by which children come to employ multidimensional rules and the role of parental models in this process. Current research in moral judgments largely ignores the rule-governed nature of children's judgments.…
Cognitive changes in conjunctive rule-based category learning: An ERP approach.
Rabi, Rahel; Joanisse, Marc F; Zhu, Tianshu; Minda, John Paul
2018-06-25
When learning rule-based categories, sufficient cognitive resources are needed to test hypotheses, maintain the currently active rule in working memory, update rules after feedback, and select a new rule if necessary. Prior research has demonstrated that conjunctive rules are more complex than unidimensional rules and place greater demands on executive functions like working memory. In our study, event-related potentials (ERPs) were recorded while participants performed a conjunctive rule-based category learning task with trial-by-trial feedback. In line with prior research, correct categorization responses resulted in a larger stimulus-locked late positive complex compared to incorrect responses, possibly indexing the updating of rule information in memory. Incorrect trials elicited a pronounced feedback-locked P300, suggesting a disconnect between perception and the rule-based strategy. We also examined the differential processing of stimuli that could be correctly classified by the suboptimal single-dimensional rule ("easy" stimuli) versus those that could only be correctly classified by the optimal, conjunctive rule ("difficult" stimuli). Among strong learners, a larger late positive slow wave emerged for difficult compared with easy stimuli, suggesting differential processing of category items even though strong learners performed well on the conjunctive category set. Overall, the findings suggest that ERP combined with computational modelling can be used to better understand the cognitive processes involved in rule-based category learning.
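For readers unfamiliar with the terminology, the two rule types differ as sketched below; the stimulus dimensions and criteria are illustrative assumptions:

```python
# Sketch contrasting the two category-rule types compared in the study.
def unidimensional_rule(x, y):
    """Suboptimal rule: attend to one dimension only."""
    return x > 0.5

def conjunctive_rule(x, y):
    """Optimal rule: both criteria must hold (heavier working-memory load)."""
    return x > 0.5 and y > 0.5

# "Easy" stimuli are classified identically by both rules; "difficult"
# stimuli are classified correctly only by the conjunctive rule.
stimulus = (0.7, 0.2)
print(unidimensional_rule(*stimulus), conjunctive_rule(*stimulus))
```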
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-27
... Proposed Rule Change to Offer Risk Management Tools Designed to Allow Member Organizations to Monitor and... of the Proposed Rule Change The Exchange proposes to offer risk management tools designed to allow... risk management tools designed to allow member organizations to monitor and address exposure to risk...
77 FR 21161 - National Forest System Land Management Planning
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-09
... ecosystem services and multiple uses. The planning rule is designed to ensure that plans provide for the... adaptive and science-based, engages the public, and is designed to be efficient, effective, and within the..., the new rule is designed to make planning more efficient and effective. Purpose and Need for the New...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-21
... Habitat for Ivesia webberi (Webber's ivesia) AGENCY: Fish and Wildlife Service, Interior. ACTION: Proposed... dates published in the August 2, 2013, proposed rule to designate critical habitat for Ivesia webberi... rule to designate critical habitat for Ivesia webberi, we included the wrong date for the public...
Optimal Test Design with Rule-Based Item Generation
ERIC Educational Resources Information Center
Geerlings, Hanneke; van der Linden, Wim J.; Glas, Cees A. W.
2013-01-01
Optimal test-design methods are applied to rule-based item generation. Three different cases of automated test design are presented: (a) test assembly from a pool of pregenerated, calibrated items; (b) test generation on the fly from a pool of calibrated item families; and (c) test generation on the fly directly from calibrated features defining…
Nuclear driven water decomposition plant for hydrogen production
NASA Technical Reports Server (NTRS)
Parker, G. H.; Brecher, L. E.; Farbman, G. H.
1976-01-01
The conceptual design of a hydrogen production plant using a very-high-temperature nuclear reactor (VHTR) to energize a hybrid electrolytic-thermochemical system for water decomposition has been prepared. A graphite-moderated helium-cooled VHTR is used to produce 1850 F gas for electric power generation and 1600 F process heat for the water-decomposition process which uses sulfur compounds and promises performance superior to normal water electrolysis or other published thermochemical processes. The combined cycle operates at an overall thermal efficiency in excess of 45%, and the overall economics of hydrogen production by this plant have been evaluated predicated on a consistent set of economic ground rules. The conceptual design and evaluation efforts have indicated that development of this type of nuclear-driven water-decomposition plant will permit large-scale economic generation of hydrogen in the 1990s.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-04
... Moving the Rule Text That Provides for Pegging on the Exchange From Supplementary Material .26 of Rule 70--Equities to Rule 13--Equities and Amending Such Text to (i) Permit Designated Market Maker Interest To Be... Proposed Rule Change The Exchange proposes to move the rule text that provides for pegging on the Exchange...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-03
... Proposed Rule Change Amending Commentary .07 to NYSE Amex Options Rule 904 To Eliminate Position Limits for... Act of 1934 (the ``Act'') \\2\\ and Rule 19b-4 thereunder,\\3\\ a proposed rule change to eliminate... side of the market. The proposal would amend Commentary .07 to NYSE Amex Options Rule 904 to eliminate...
18 CFR 385.1403 - Petitions seeking institution of rulemaking proceedings (Rule 1404).
Code of Federal Regulations, 2010 CFR
2010-04-01
... PROCEDURE Oil Pipeline Proceedings § 385.1403 Petitions seeking institution of rulemaking proceedings (Rule... purpose of issuing statements, rules, or regulations of general applicability and significance designed to...
Direct Final Rule for Technical Amendments for Marine Spark-Ignition Engines and Vessels
Rule published September 16, 2010 to make technical amendments to the design standard for portable marine fuel tanks. This rule incorporates safe recommended practices, developed through industry consensus.
Knowledge-based low-level image analysis for computer vision systems
NASA Technical Reports Server (NTRS)
Dhawan, Atam P.; Baxi, Himanshu; Ranganath, M. V.
1988-01-01
Two algorithms for entry-level image analysis and preliminary segmentation are proposed which are flexible enough to incorporate local properties of the image. The first algorithm involves pyramid-based multiresolution processing and a strategy to define and use interlevel and intralevel link strengths. The second algorithm, which is designed for selected window processing, extracts regions adaptively using local histograms. The preliminary segmentation and a set of features are employed as the input to an efficient rule-based low-level analysis system, resulting in suboptimal meaningful segmentation.
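A minimal sketch of the windowed, local-histogram region-extraction idea behind the second algorithm; the window size and the valley-between-peaks threshold choice are assumptions, not the paper's method:

```python
import numpy as np

# Sketch: adaptive thresholding of a selected window using the valley
# between the two largest peaks of its local histogram.
def local_threshold(window: np.ndarray, bins: int = 32) -> float:
    hist, edges = np.histogram(window, bins=bins)
    p1 = int(hist.argmax())
    hist2 = hist.copy(); hist2[p1] = 0
    p2 = int(hist2.argmax())
    lo, hi = sorted((p1, p2))
    valley = lo + int(hist[lo:hi + 1].argmin()) if hi > lo else p1
    return float(edges[valley])

img = np.random.rand(64, 64)
w = img[0:16, 0:16]                 # one selected window
mask = w > local_threshold(w)       # adaptive region extraction
print(mask.mean())
```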
Railway Online Booking System Design and Implementation
NASA Astrophysics Data System (ADS)
Zongjiang, Wang
In this paper, we define rule usefulness and introduce an approach to evaluating it within rough set theory, along with a method for extracting the most useful rules. The method is simple and effective in applications to prisoners' reform. Compared with methods that extract the most interesting rules, ours is direct and objective: rule interestingness must rely on predefined knowledge about what kind of information is interesting, whereas our method greatly reduces the number of rules generated while providing a measure of rule usefulness at the same time.
Rule-based navigation control design for autonomous flight
NASA Astrophysics Data System (ADS)
Contreras, Hugo; Bassi, Danilo
2008-04-01
This article describes a navigation control system design based on a set of rules for following a desired trajectory. The full control of the aircraft considered here comprises a low-level stability control loop, based on a classic PID controller, and a higher-level navigation layer whose main job is to exercise lateral (course) control and altitude control while trying to follow the desired trajectory. The rules and PID gains were adjusted systematically according to the results of flight simulation. In spite of its simplicity, the rule-based navigation control proved to be robust, even under large perturbations such as crosswinds.
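A minimal sketch of the two-level structure the abstract describes, where an outer rule layer picks course corrections and an inner PID loop tracks them; the gains, thresholds, and error values are illustrative assumptions:

```python
# Sketch: rule-based navigation layer on top of a classic PID loop.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.i = 0.0
        self.prev = 0.0

    def update(self, error):
        self.i += error * self.dt
        d = (error - self.prev) / self.dt
        self.prev = error
        return self.kp * error + self.ki * self.i + self.kd * d

def course_rule(cross_track_m):
    """Rule layer: map cross-track error to a heading-change command."""
    if abs(cross_track_m) < 5:
        return 0.0                                   # on track: hold course
    return -15.0 if cross_track_m > 0 else 15.0     # cut back toward path

roll_loop = PID(kp=0.8, ki=0.05, kd=0.2, dt=0.02)
heading_cmd = course_rule(cross_track_m=40.0)
print(heading_cmd, roll_loop.update(heading_cmd - 3.0))
```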
Incorporation of negative rules and evolution of a fuzzy controller for yeast fermentation process.
Birle, Stephan; Hussein, Mohamed Ahmed; Becker, Thomas
2016-08-01
The control of bioprocesses can be very challenging because these processes are highly affected by various sources of uncertainty, such as the intrinsic behavior of the microorganisms used. Because these process uncertainties are in most cases not directly measurable, overall control is either performed manually, relying on the experience of the operator, or by intelligent expert systems, e.g., on the basis of fuzzy logic theory. In the latter case, however, the control concept is mainly represented using merely positive rules, e.g., "If A then do B". As this is not faithful to the semantics of the human decision-making process, which also includes negative experience in the form of constraints or prohibitions, the incorporation of negative rules into fuzzy-logic-based process control is emphasized. In this work, an approach to fuzzy logic control of the yeast propagation process based on a combination of positive and negative rules is presented. The process is guided along a reference trajectory for yeast cell concentration by adjusting the process temperature. The incorporation of negative rules leads to much more stable and accurate control of the process: the root mean squared error between the reference trajectory and the system response could be reduced by an average of 62.8 % compared to the controller using only positive rules.
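A minimal sketch of one way positive and negative rule activations can be combined, assuming a Mamdani-style controller; the membership shapes, rule premises, and veto-style aggregation are illustrative assumptions, not the paper's actual rule base:

```python
# Sketch: a negative rule attenuates (vetoes) a positive recommendation.
def tri(x, a, b, c):
    """Triangular membership function."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def control(cell_error):
    # Positive rule: "IF error is positive-large THEN raise temperature."
    pos = tri(cell_error, 0.2, 1.0, 1.8)
    # Negative rule: "IF error is near zero THEN do NOT change temperature."
    neg = tri(cell_error, -0.3, 0.0, 0.3)
    # Negative activation attenuates the positive recommendation.
    return pos * (1.0 - neg)

for e in (0.0, 0.25, 1.0):
    print(e, round(control(e), 3))
```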
Golf in the United States: an evolution of accessibility.
Parziale, John R
2014-09-01
Golf affords physical and psychological benefits to persons who are physically challenged. Advances in adaptive technology, changes in golf course design, and rules modifications have enabled persons with neurological, musculoskeletal, and other impairments to play golf at a recreational, elite amateur, or professional level. The Americans with Disabilities Act has been cited in both federal and US Supreme Court rulings that have improved access for physically challenged golfers. Medical specialties, including physiatry, have played an important role in this process. This article reviews the history of golf's improvements in accessibility, and provides clinicians and physically challenged golfers with information that will facilitate participation in the sport. Copyright © 2014 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.
Experiments towards establishing of design rules for R2R-UV-NIL with polymer working shims
NASA Astrophysics Data System (ADS)
Nees, Dieter; Ruttloff, Stephan; Palfinger, Ursula; Stadlober, Barbara
2016-03-01
Roll-to-Roll-UV-nanoimprint lithography (R2R-UV-NIL) enables high resolution large area patterning of flexible substrates and is therefore of increasing industrial interest. We have set up a custom-made R2R-UV-NIL pilot machine which is able to convert 10 inch wide web with velocities of up to 30 m/min. In addition, we have developed self-replicable UV-curable resins with tunable surface energy and Young's modulus for UV-imprint material as well as for polymer working stamp/shim manufacturing. Now we have designed test patterns for the evaluation of the impact of structure shape, critical dimension, pitch, depth, side wall angle and orientation relative to the web movement onto the imprint fidelity and working shim life time. We have used female (recessed structures) silicon masters of that design with critical dimensions between CD = 200 nm and 1600 nm, and structure depths of d = 500 nm and 1000 nm - all with vertical as well as inclined side walls. These entire master patterns have been transferred onto single male (protruding structures) R2R polymer working shims. The polymer working shims have been used for R2R-UV-NIL runs of several hundred meters and the imprint fidelity and process stability of the various test patterns have been compared. This study is intended as a first step towards establishing of design rules and developing of nanoimprint proximity correction strategies for industrial R2R-UV-NIL processes using polymer working shims.
Nonequivalence of updating rules in evolutionary games under high mutation rates.
Kaiping, G A; Jacobs, G S; Cox, S J; Sluckin, T J
2014-10-01
Moran processes are often used to model selection in evolutionary simulations. The updating rule in Moran processes is a birth-death process, i. e., selection according to fitness of an individual to give birth, followed by the death of a random individual. For well-mixed populations with only two strategies this updating rule is known to be equivalent to selecting unfit individuals for death and then selecting randomly for procreation (biased death-birth process). It is, however, known that this equivalence does not hold when considering structured populations. Here we study whether changing the updating rule can also have an effect in well-mixed populations in the presence of more than two strategies and high mutation rates. We find, using three models from different areas of evolutionary simulation, that the choice of updating rule can change model results. We show, e. g., that going from the birth-death process to the death-birth process can change a public goods game with punishment from containing mostly defectors to having a majority of cooperative strategies. From the examples given we derive guidelines indicating when the choice of the updating rule can be expected to have an impact on the results of the model.
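A minimal sketch of the two updating rules compared here, for a well-mixed population with mutation; the strategy fitnesses, mutation rate, and population size are illustrative assumptions:

```python
import random

# Sketch: birth-death vs. death-birth updating in a well-mixed Moran
# process with mutation over three strategies.
STRATS = [0, 1, 2]
FITNESS = {0: 1.0, 1: 1.2, 2: 0.9}
MU = 0.1  # mutation rate

def step(pop, rule):
    if rule == "birth-death":
        # Select parent by fitness, then a random individual dies.
        parent = random.choices(pop, weights=[FITNESS[s] for s in pop])[0]
        dying = random.randrange(len(pop))
    else:  # death-birth: unfit individuals more likely to die
        inv = [1.0 / FITNESS[s] for s in pop]
        dying = random.choices(range(len(pop)), weights=inv)[0]
        parent = random.choice(pop)
    child = random.choice(STRATS) if random.random() < MU else parent
    pop[dying] = child

pop = [random.choice(STRATS) for _ in range(100)]
for _ in range(10000):
    step(pop, "birth-death")   # swap in "death-birth" to compare
print({s: pop.count(s) for s in STRATS})
```

Running both rules and comparing the resulting strategy counts reproduces the paper's point that, with more than two strategies and high mutation, the choice of updating rule can change the outcome.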
A plausible neural circuit for decision making and its formation based on reinforcement learning.
Wei, Hui; Dai, Dawei; Bu, Yijie
2017-06-01
The behavior of humans, and of lower insects, is dominated by the nervous system. Each stable behavior has its own inner steps and control rules and is regulated by a neural circuit. Understanding how the brain influences perception, thought, and behavior is a central mandate of neuroscience. The phototactic flight of insects is a widely observed deterministic behavior; since its movement is not stochastic, the behavior should be dominated by a neural circuit. Based on the basic firing characteristics of biological neurons and the constitution of neural circuits, we designed a plausible neural circuit for this phototactic behavior from a logic perspective. The circuit's output layer, which generates a stable spike firing rate to encode flight commands, controls the insect's angular velocity when flying. The firing pattern and connection type of excitatory and inhibitory neurons are considered in this computational model. We simulated the circuit's information processing using a distributed PC array and used the real-time average firing rate of output neuron clusters to drive a flying-behavior simulation. In this paper, we also explored, from a network-flow view, how a correct neural decision circuit is generated, through a bee behavior experiment based on a reward-and-punishment feedback mechanism. The significance of this study is as follows. First, we designed a neural circuit that achieves the behavioral logic rules while strictly following the electrophysiological characteristics of biological neurons and anatomical facts. Second, the circuit's generality permits the design and implementation of behavioral logic rules based on the most general information-processing and activity modes of biological neurons. Third, through computer simulation, we achieved new understanding of the cooperative conditions under which multiple neurons achieve behavioral control. Fourth, this study aims at understanding the information-encoding mechanism and how neural circuits achieve behavior control. Finally, it also helps establish a transitional bridge between the microscopic activity of the nervous system and macroscopic animal behavior.
78 FR 36434 - Revisions to Rules of Practice
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-18
... federal holidays, make grammatical corrections, and remove the reference to part-day holidays. Rule 3001... section, the following categories of persons are designated ``decision-making personnel'': (i) The.... The following categories of person are designated ``non-decision-making personnel'': (i) All...
Medas, Daniela; De Giudici, Giovanni; Casu, Maria Antonietta; Musu, Elodia; Gianoncelli, Alessandra; Iadecola, Antonella; Meneghini, Carlo; Tamburini, Elena; Sprocati, Anna Rosa; Turnau, Katarzyna; Lattanzi, Pierfranco
2015-02-03
Euphorbia pithyusa L. was used in a plant growth-promoting assisted field trial experiment. To unravel the microscopic processes at the interface, thin slices of E. pithyusa roots were investigated by micro-X-ray fluorescence mapping. Roots and rhizosphere materials were examined by X-ray absorption spectroscopy at the Zn K-edge, X-ray diffraction, and scanning electron microscopy. Results indicate some features common to all the investigated samples. (i) In the rhizosphere of E. pithyusa, Zn was found to exist in different phases. (ii) Si and Al are mainly concentrated in a rim at the epidermis of the roots. (iii) Zn is mostly stored in root epidermis and does not appear to be coordinated to organic molecules but mainly occurs in mineral phases such as Zn silicates. We interpreted that roots of E. pithyusa significantly promote mineral evolution in the rhizosphere. Concomitantly, the plant uses Si and Al extracted by soil minerals to build a biomineralization rim, which can capture Zn. This Zn silicate biomineralization has relevant implications for phytoremediation techniques and for further biotechnology development, which can be better designed and developed after specific knowledge of molecular processes ruling mineral evolution and biomineralization processes has been gained.
Clinical Decision Support for a Multicenter Trial of Pediatric Head Trauma
Swietlik, Marguerite; Deakyne, Sara; Hoffman, Jeffrey M.; Grundmeier, Robert W.; Paterno, Marilyn D.; Rocha, Beatriz H.; Schaeffer, Molly H; Pabbathi, Deepika; Alessandrini, Evaline; Ballard, Dustin; Goldberg, Howard S.; Kuppermann, Nathan; Dayan, Peter S.
2016-01-01
Introduction: For children who present to emergency departments (EDs) due to blunt head trauma, ED clinicians must decide who requires computed tomography (CT) scanning to evaluate for traumatic brain injury (TBI). The Pediatric Emergency Care Applied Research Network (PECARN) derived and validated two age-based prediction rules to identify children at very low risk of clinically-important traumatic brain injuries (ciTBIs) who do not typically require CT scans. In this case report, we describe the strategy used to implement the PECARN TBI prediction rules via electronic health record (EHR) clinical decision support (CDS) as the intervention in a multicenter clinical trial. Methods: Thirteen EDs participated in this trial. The 10 sites receiving the CDS intervention used the Epic® EHR. All sites implementing EHR-based CDS built the rules using the vendor's CDS engine. Based on a sociotechnical analysis, we designed the CDS so that recommendations could be displayed immediately after any provider entered prediction rule data. One central site developed and tested the intervention package to be exported to other sites. The intervention package included a clinical trial alert, an electronic data collection form, the CDS rules, and the format for recommendations. Results: The original PECARN head trauma prediction rules were derived from physician documentation, while this pragmatic trial led each site to customize its workflows and allow multiple different providers to complete the head trauma assessments. These differences in workflows led to varying completion rates across sites as well as differences in the types of providers completing the electronic data form. Site variation in internal change management processes made it challenging to maintain the same rigor across all sites, which led to downstream effects when data reports were developed. Conclusions: The process of a centralized build and export of a CDS system in one commercial EHR system successfully supported a multicenter clinical trial. PMID:27437059
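As an illustration of how such EHR-embedded prediction-rule logic can be structured (a minimal sketch; the predictor fields and thresholds below are hypothetical placeholders, not the actual PECARN criteria):

```python
from dataclasses import dataclass

@dataclass
class HeadTraumaAssessment:
    # Hypothetical predictor fields entered by a provider; the real
    # PECARN rules use age-specific criteria not reproduced here.
    age_months: int
    altered_mental_status: bool
    signs_of_skull_fracture: bool

def cds_recommendation(a: HeadTraumaAssessment) -> str:
    """Return a display recommendation once all predictors are entered."""
    if a.altered_mental_status or a.signs_of_skull_fracture:
        return "Prediction rule not met: CT may be indicated (clinical judgment)."
    return "Very low risk of ciTBI: CT typically not recommended."

print(cds_recommendation(HeadTraumaAssessment(30, False, False)))
```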
Electromigration failures under bidirectional current stress
NASA Astrophysics Data System (ADS)
Tao, Jiang; Cheung, Nathan W.; Hu, Chenming
1998-01-01
Electromigration failure under DC stress has been studied for more than 30 years, and methodologies for accelerated DC testing and design rules are well established in the IC industry. However, electromigration behavior and design rules under time-varying current stress are still unclear. In CMOS circuits, where many interconnects carry pulsed-DC (local VCC and VSS lines) and bidirectional AC currents (clock and signal lines), it is essential to assess the reliability of metallization systems under these conditions. The failure mechanisms of different metallization systems (Al-Si, Al-Cu, Cu, TiN/Al-alloy/TiN, etc.) and different metallization structures (via, plug, and interconnect) under AC current stress over a wide frequency range (from mHz to 500 MHz) are studied in this paper. Based on these experimental results, a damage-healing model is developed and electromigration design rules are proposed. The results show that in the circuit operating frequency range, the "design-rule current" is the time-average current: the pure AC component of the current contributes only to self-heating, while the average (DC component) current drives electromigration. To ensure a sufficient thermal-migration lifetime under high-frequency AC stress, an additional design rule is proposed to limit the temperature rise due to Joule self-heating.
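A minimal sketch of the design-rule bookkeeping this implies (the waveform samples are illustrative; the damage-healing model itself is not reproduced): the electromigration-relevant current is the time average, while Joule self-heating scales with the RMS value.

```python
import numpy as np

# Illustrative bidirectional current waveform samples over one period (mA).
i = np.array([1.0, 1.0, -0.8, -0.8, 1.0, -0.8])

i_avg = i.mean()                 # drives electromigration (DC component)
i_rms = np.sqrt((i**2).mean())   # drives Joule self-heating

print(f"EM design-rule current (average): {i_avg:.3f} mA")
print(f"Self-heating current (RMS):      {i_rms:.3f} mA")
```

A purely symmetric AC waveform would give i_avg = 0 (no net electromigration under this rule) while still producing a nonzero i_rms, which is exactly why the additional self-heating limit is needed.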
WaferOptics® mass volume production and reliability
NASA Astrophysics Data System (ADS)
Wolterink, E.; Demeyer, K.
2010-05-01
The Anteryon WaferOptics® technology platform combines imaging optics designs, materials, and metrologies with wafer-level Semicon & MEMS production methods. WaferOptics® first required completely new system engineering. This system closes the loop between application requirement specifications, Anteryon product specifications, Monte Carlo analysis, process windows, process controls, and supply reject criteria. For the Anteryon product Integrated Lens Stack (ILS), new design rules, test methods, and control systems were assessed, implemented, validated, and released to customers for mass production. This includes novel reflowable materials, the mastering process, replication, bonding, dicing, assembly, metrology, reliability programs, and quality assurance systems. Many Design of Experiments studies were performed to assess correlations between optical performance parameters and machine settings of all process steps. Lens metrologies such as FFL, BFL, and MTF were adapted for wafer-level production, and wafer mapping was introduced for yield management. Test methods for screening and validating suitable optical materials were designed. Critical failure modes such as delamination and popcorning were assessed and modeled with FEM. Anteryon successfully managed to integrate the different technologies, scaling from single prototypes to high-yield mass volume production. These parallel efforts resulted in a steep yield increase from 30% to over 90% within an 8-month period.
17 CFR Appendix A to Part 38 - Guidance on Compliance With Designation Criteria
Code of Federal Regulations, 2010 CFR
2010-04-01
... the criteria for designation. To the extent that compliance with, or satisfaction of, a criterion for designation is not self-explanatory from the face of the contract market's rules (as defined in § 40.1 of this... FACILITY—The board of trade shall—(A) establish and enforce rules defining, or specifications detailing...
10 CFR Appendix A to Part 52 - Design Certification Rule for the U.S. Advanced Boiling Water Reactor
Code of Federal Regulations, 2010 CFR
2010-01-01
... Water Reactor A Appendix A to Part 52 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSES... Rule for the U.S. Advanced Boiling Water Reactor I. Introduction Appendix A constitutes the standard design certification for the U.S. Advanced Boiling Water Reactor (ABWR) design, in accordance with 10 CFR...
10 CFR Appendix A to Part 52 - Design Certification Rule for the U.S. Advanced Boiling Water Reactor
Code of Federal Regulations, 2011 CFR
2011-01-01
... Water Reactor A Appendix A to Part 52 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSES... Rule for the U.S. Advanced Boiling Water Reactor I. Introduction Appendix A constitutes the standard design certification for the U.S. Advanced Boiling Water Reactor (ABWR) design, in accordance with 10 CFR...
76 FR 49303 - Approval and Promulgation of Air Quality Implementation Plans; Minnesota; Rules Update
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-10
... design requirements for the monitoring systems. The revised CMS rules also delineate the recordkeeping..., [Insert page number where the document begins]. 7017.1140 CEMS design 03/01/99 08/10/11, [Insert page...]. 7017.1190 COMS design 03/01/99 08/10/11, [Insert page requirements. number where the document begins...
Double Linear Damage Rule for Fatigue Analysis
NASA Technical Reports Server (NTRS)
Halford, G.; Manson, S.
1985-01-01
The Double Linear Damage Rule (DLDR) is a method for structural designers to determine fatigue-crack-initiation life when a structure is subjected to unsteady, variable-amplitude cyclic loadings. The method calculates, in advance of service, how many loading cycles can be imposed on a structural component before a macroscopic crack initiates. The approach is intended for use in the design of high-performance systems and for incorporation into design handbooks and codes.
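As a rough illustration of how a double linear damage rule differs from a single Miner summation, the Python sketch below accumulates damage in two sequential phases, each with its own linear rule: crack nucleation (Phase I), then early crack growth to a macroscopic crack (Phase II). The per-level phase lives N1 and N2 are treated as given inputs; the DLDR's formulas for partitioning total life into the two phases are not reproduced here.

    def dldr_life(blocks):
        """Cycles to macroscopic crack initiation under a two-phase
        (double) linear damage rule -- an illustrative sketch only.

        blocks: repeating sequence of (n, N1, N2) tuples per load level:
          n  = cycles applied at that level per pass,
          N1 = Phase I (nucleation) life at that level,
          N2 = Phase II (early growth) life at that level.
        Damage sums linearly within each phase; Phase II starts once
        Phase I damage reaches 1.
        """
        d1 = d2 = 0.0
        total = 0.0
        while True:
            for n, N1, N2 in blocks:
                if d1 < 1.0:                    # still nucleating
                    need = (1.0 - d1) * N1      # cycles left in Phase I
                    if n <= need:
                        d1 += n / N1
                        total += n
                        continue
                    d1 = 1.0
                    total += need
                    n -= need                   # remainder enters Phase II
                d2 += n / N2
                total += n
                if d2 >= 1.0:                   # macroscopic crack initiated
                    return total - (d2 - 1.0) * N2

    # Two-level repeating loading: 10 severe cycles, then 100 mild ones.
    print(dldr_life([(10, 1e3, 4e3), (100, 1e4, 4e4)]))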
14 CFR 91.139 - Emergency air traffic rules.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 14 Aeronautics and Space 2 2014-01-01 2014-01-01 false Emergency air traffic rules. 91.139 Section...) AIR TRAFFIC AND GENERAL OPERATING RULES GENERAL OPERATING AND FLIGHT RULES Flight Rules General § 91.139 Emergency air traffic rules. (a) This section prescribes a process for utilizing Notices to Airmen...
14 CFR 91.139 - Emergency air traffic rules.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 2 2012-01-01 2012-01-01 false Emergency air traffic rules. 91.139 Section...) AIR TRAFFIC AND GENERAL OPERATING RULES GENERAL OPERATING AND FLIGHT RULES Flight Rules General § 91.139 Emergency air traffic rules. (a) This section prescribes a process for utilizing Notices to Airmen...
14 CFR 91.139 - Emergency air traffic rules.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 2 2010-01-01 2010-01-01 false Emergency air traffic rules. 91.139 Section...) AIR TRAFFIC AND GENERAL OPERATING RULES GENERAL OPERATING AND FLIGHT RULES Flight Rules General § 91.139 Emergency air traffic rules. (a) This section prescribes a process for utilizing Notices to Airmen...
Why Rules Matter in Complex Event Processing...and Vice Versa
NASA Astrophysics Data System (ADS)
Vincent, Paul
Many commercial and research CEP solutions are moving beyond simple stream query languages to more complete definitions of "process" and thence to "decisions" and "actions". As event processing capabilities increase, there is a growing realization that the humble "rule" is as relevant to the event cloud as it is to specific services. Less obvious is how much event processing has to offer the process and rule execution and management technologies. Does event processing change the way we should manage businesses, processes, and services, together with their embedded (and hopefully managed) rulesets?
A Data Stream Model For Runoff Simulation In A Changing Environment
NASA Astrophysics Data System (ADS)
Yang, Q.; Shao, J.; Zhang, H.; Wang, G.
2017-12-01
Runoff simulation is of great significance for water engineering design, water disaster control, and water resources planning and management in a catchment or region. A large number of methods, including concept-based process-driven models and statistic-based data-driven models, have been proposed and widely used worldwide during past decades. Most existing models assume that the relationship between runoff and its impacting factors is stationary. However, in a changing environment (e.g., climate change, human disturbance), this relationship usually evolves over time. In this study, we propose a data stream model for runoff simulation in a changing environment. Specifically, the proposed model works in three steps: learning a rule set, expanding rules, and simulating. The first step initializes a rule set. When a new observation arrives, the model checks which rule covers it and then uses that rule for simulation. Meanwhile, the Page-Hinckley (PH) change detection test monitors the online simulation error of each rule; if a change is detected, the corresponding rule is removed from the rule set. In the second step, any rule that covers more than a given number of instances is expanded. In the third step, a simulation model at each leaf node is learned with a perceptron without an activation function and is updated as new observations arrive. Taking the Fuxi River catchment as a case study, we applied the model to simulate monthly runoff in the catchment. Results show that an abrupt change is detected in 1997 by the Page-Hinckley change detection test, which is consistent with the historic record of flooding. In addition, the model achieves good simulation results with an RMSE of 13.326 and outperforms many established methods. These findings demonstrate that the proposed data stream model provides a promising way to simulate runoff in a changing environment.
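The Page-Hinckley test the authors rely on is a standard sequential change detector. A minimal Python sketch of the one-sided version (detecting an increase in the mean of a stream, e.g., of simulation errors) might look like the following, where delta (drift tolerance) and lam (alarm threshold) are tuning parameters not specified in the abstract:

    class PageHinkley:
        """One-sided Page-Hinckley test for an increase in the stream mean."""

        def __init__(self, delta=0.005, lam=50.0):
            self.delta = delta      # tolerated drift magnitude
            self.lam = lam          # alarm threshold
            self.n = 0              # observations seen
            self.mean = 0.0         # running mean
            self.cum = 0.0          # cumulative deviation m_t
            self.cum_min = 0.0      # running minimum M_t

        def update(self, x):
            """Feed one observation; return True if a change is signalled."""
            self.n += 1
            self.mean += (x - self.mean) / self.n
            self.cum += x - self.mean - self.delta
            self.cum_min = min(self.cum_min, self.cum)
            return (self.cum - self.cum_min) > self.lam

In the model described above, one such detector would presumably be attached to each rule's error stream, with the rule dropped when update returns True; a symmetric two-sided variant exists for detecting decreases as well.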
Anisotropy of Photopolymer Parts Made by Digital Light Processing
Monzón, Mario; Ortega, Zaida; Hernández, Alba; Paz, Rubén; Ortega, Fernando
2017-01-01
Digital light processing (DLP) is an accurate additive manufacturing (AM) technology suitable for producing micro-parts by photopolymerization. As with most AM technologies, anisotropy of parts made by DLP is a key issue to deal with, given that several operational factors modify this characteristic. Design for this technology and its photopolymers becomes a challenge because the manufacturing process and post-processing strongly influence the mechanical properties of the part. This paper shows experimental work to demonstrate the particular behavior of parts made using DLP. Because this behavior differs from that of any other AM technology, rules for design need to be adapted. The influence of build direction and the post-curing process on final mechanical properties and anisotropy is reported and justified based on experimental data and theoretical simulation of bi-material parts formed by fully-cured resin and partially-cured resin. Three photopolymers were tested under different working conditions, concluding that post-curing can, in some cases, correct the anisotropy, depending mainly on the nature of the photopolymer. PMID:28772426
Ensemble learning with trees and rules: supervised, semi-supervised, unsupervised
USDA-ARS?s Scientific Manuscript database
In this article, we propose several new approaches for post processing a large ensemble of conjunctive rules for supervised and semi-supervised learning problems. We show with various examples that for high dimensional regression problems the models constructed by the post processing the rules with ...
76 FR 4066 - Rules of Practice
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-24
... new Rules of Practice to formalize the process it will use when conducting proceedings to determine... address the process the Commission will follow to institute proceedings and provide notice of the grounds... Accounting Oversight Board (``PCAOB'').\\3\\ The Commission is amending Regulation P to add a rule providing...
50 CFR 424.16 - Proposed rules.
Code of Federal Regulations, 2014 CFR
2014-10-01
... OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A LISTING ENDANGERED AND THREATENED SPECIES AND DESIGNATING CRITICAL HABITAT Revision of the Lists § 424.16 Proposed rules. (a) General. Based..., delisting, or reclassification of a species or the designation or revision of critical habitat will also...
Knowledge-based critiquing of graphical user interfaces with CHIMES
NASA Technical Reports Server (NTRS)
Jiang, Jianping; Murphy, Elizabeth D.; Carter, Leslie E.; Truszkowski, Walter F.
1994-01-01
CHIMES is a critiquing tool that automates the process of checking graphical user interface (GUI) designs for compliance with human factors design guidelines and toolkit style guides. The current prototype identifies instances of non-compliance and presents problem statements, advice, and tips to the GUI designer. Changes requested by the designer are made automatically, and the revised GUI is re-evaluated. A case study conducted at NASA-Goddard showed that CHIMES has the potential for dramatically reducing the time formerly spent in hands-on consistency checking. Capabilities recently added to CHIMES include exception handling and rule building. CHIMES is intended for use prior to usability testing as a means, for example, of catching and correcting syntactic inconsistencies in a larger user interface.
Human factors certification in the development of future air traffic control systems
NASA Technical Reports Server (NTRS)
Evans, Alyson E.
1994-01-01
If human factors certification of aviation technologies aims to encompass the wide range of issues which need to be addressed for any new system, then human factors involvement must be present throughout the whole design process in a manner which relates to final certification. A certification process cannot simply be applied to the final product of design. Standards and guidelines will be required by designers at the outset of design for reference in preparing for certification. The most effective use of human factors principles, methods, and measures is made as part of an iterative design process, leading to a system which reflects these as far as possible. This particularly applies where the technology is complex and may be represented by a number of components or sub-systems. Some aspects of the system are best certified during early prototyping, when there is still scope to make changes to software or hardware. At this stage in design, financial and/or time pressures will not rule out the possibility of necessary changes, as may be the case later. Other aspects of the system will be best certified during the final phases of design when the system is in a more complete form and in a realistic environment.
75 FR 8759 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-25
... rule proposal methods. The FOCUS Report was designed to eliminate the overlapping regulatory reports... SECURITIES AND EXCHANGE COMMISSION [Rule 17a-5; SEC File No. 270-155; OMB Control No. 3235-0123... currently valid control number. Rule 17a-5 (17 CFR 240.17a-5) is the basic financial reporting rule for...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-25
...-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Designation of a Longer Period for Commission Action on Proposed Rule Change To Require Members To Report OTC Equity Transactions... (``Act'') \\1\\ and Rule 19b-4 thereunder,\\2\\ a proposed rule change to amend FINRA trade reporting rules...
ERIC Educational Resources Information Center
Larsson, Ken
2014-01-01
This paper looks at the process of managing large numbers of exams efficiently and securely with the use of dedicated IT support. The system integrates regulations at different levels, from national to local (even down to departments), and ensures that the rules are applied at all stages of handling the exams. The system has a proven record of…
FEDERAL RULEMAKING: Procedural and Analytical Requirements at OSHA and Other Agencies
2001-06-14
informal rulemaking. Formal rulemaking is used in ratemaking proceedings and in certain other cases when rules are required by statute to be made “on...crosscutting statutory requirements that I have just listed are by no means the only statutory requirements that guide agency rulemaking. Regulations generally...directives designed to guide the federal rulemaking process, often with the goal of reducing regulatory burden. Although independent regulatory agencies are
Rules based process window OPC
NASA Astrophysics Data System (ADS)
O'Brien, Sean; Soper, Robert; Best, Shane; Mason, Mark
2008-03-01
As a preliminary step towards model-based process window OPC we have analyzed the impact of correcting post-OPC layouts using rules-based methods. Image processing on the Brion Tachyon was used to identify sites where the OPC model/recipe failed to generate an acceptable solution. A set of rules for 65nm active and poly were generated by classifying these failure sites. The rules were based upon segment runlengths, figure spaces, and adjacent figure widths. Comparing the pre- and post-rules-based operations, 2.1 million sites were corrected for active in a small chip, and 59 million were found at poly. Tachyon analysis of the final reticle layout found weak-margin sites distinct from those sites repaired by rules-based corrections. For the active layer, more than 75% of the sites corrected by rules would have printed without a defect, indicating that most rules-based cleanups degrade the lithographic pattern. Some sites were missed by the rules-based cleanups due to either bugs in the DRC software or gaps in the rules table. In the end, dramatic changes to the reticle prevented catastrophic lithography errors, but this method is far too blunt. A more subtle model-based procedure is needed, changing only those sites which have unsatisfactory lithographic margin.
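A rules table of the kind described, keyed on segment runlength, adjacent space, and neighboring figure width, can be pictured as a simple first-match range lookup. The thresholds and correction values in this Python sketch are invented placeholders, not the 65nm rules used in the paper:

    # Hypothetical rules table: geometric thresholds (nm) -> edge bias (nm).
    # Values are illustrative placeholders, not actual 65nm design rules.
    RULES = [
        {"max_run": 80,  "min_space": 90, "min_width": 70, "bias": -2.0},
        {"max_run": 120, "min_space": 70, "min_width": 60, "bias": -1.0},
    ]

    def rules_based_correction(runlength, space, width):
        """Return the edge bias of the first matching rule, or None,
        meaning the site is left to the model-based OPC flow."""
        for rule in RULES:
            if (runlength <= rule["max_run"]
                    and space >= rule["min_space"]
                    and width >= rule["min_width"]):
                return rule["bias"]
        return None

    print(rules_based_correction(75, 95, 80))   # matches the first rule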
17 CFR 240.3a40-1 - Designation of financial responsibility rules.
Code of Federal Regulations, 2010 CFR
2010-04-01
... relating to hypothecation or lending of customer securities; (c) Any rule adopted by any self-regulatory... other rule adopted by the Commission or any self-regulatory organization relating to the protection of...
Alabugin, Igor V; Timokhin, Vitaliy I; Abrams, Jason N; Manoharan, Mariappan; Abrams, Rachel; Ghiviriga, Ion
2008-08-20
Despite being predicted to be stereoelectronically favorable by the Baldwin rules, efficient formation of a C-C bond through a 5-endo-dig radical cyclization remained unknown for more than 40 years. This work reports a remarkable increase in the efficiency of this process upon beta-Ts substitution, which led to the development of an expedient approach to densely functionalized cyclic 1,3-dienes. Good qualitative agreement between the increased efficiency and stereoselectivity for the 5-endo-dig cyclization of Ts-substituted vinyl radicals and the results of density functional theory analysis further confirms the utility of computational methods in the design of new radical processes. Although reactions of Br atoms generated through photochemical Ts-Br bond homolysis lead to the formation of cyclic dibromide side products, the yields of target bromosulfones in the photochemically induced reactions can be increased by recycling the dibromide byproduct into the target bromosulfones through a sequence of addition/elimination reactions at the exocyclic double bond. Discovery of a relatively efficient radical 5-endo-dig closure, accompanied by a C-C bond formation, provides further support to stereoelectronic considerations at the heart of the Baldwin rules and fills one of the last remaining gaps in the arsenal of radical cyclizations.
Implementation of clinical decision rules in the emergency department.
Stiell, Ian G; Bennett, Carol
2007-11-01
Clinical decision rules (CDRs) are tools designed to help clinicians make bedside diagnostic and therapeutic decisions. The development of a CDR involves three stages: derivation, validation, and implementation. Several criteria need to be considered when designing and evaluating the results of an implementation trial. In this article, the authors review the results of implementation studies evaluating the effect of four CDRs: the Ottawa Ankle Rules, the Ottawa Knee Rule, the Canadian C-Spine Rule, and the Canadian CT Head Rule. Four implementation studies demonstrated that the implementation of CDRs in the emergency department (ED) safely reduced the use of radiography for ankle, knee, and cervical spine injuries. However, a recent trial failed to demonstrate an impact on computed tomography imaging rates. Well-developed and validated CDRs can be successfully implemented into practice, efficiently standardizing ED care. However, further research is needed to identify barriers to implementation in order to achieve improved uptake in the ED.
The expert explorer: a tool for hospital data visualization and adverse drug event rules validation.
Băceanu, Adrian; Atasiei, Ionuţ; Chazard, Emmanuel; Leroy, Nicolas
2009-01-01
An important part of adverse drug event (ADE) detection is the validation of the clinical cases and the assessment of the decision rules used to detect ADEs. For that purpose, a software tool called "Expert Explorer" has been designed by Ideea Advertising. Anonymized datasets have been extracted from hospitals into a common repository. The tool has three main features. (1) It can display hospital stays in a visual and comprehensive way (diagnoses, drugs, lab results, etc.) using tables and charts. (2) It allows designing and executing dashboards in order to generate knowledge about ADEs. (3) It allows uploading decision rules obtained from data mining. Experts can then review the rules and the hospital stays that match them, and record their assessments using specialized forms. The rules can then be validated, invalidated, or improved (knowledge elicitation phase).
TARGET: Rapid Capture of Process Knowledge
NASA Technical Reports Server (NTRS)
Ortiz, C. J.; Ly, H. V.; Saito, T.; Loftin, R. B.
1993-01-01
TARGET (Task Analysis/Rule Generation Tool) represents a new breed of tool that blends graphical process flow modeling capabilities with the function of a top-down reporting facility. Since NASA personnel frequently perform tasks that are primarily procedural in nature, TARGET models mission or task procedures and generates hierarchical reports as part of the process capture and analysis effort. Historically, capturing knowledge has proven to be one of the greatest barriers to the development of intelligent systems. Current practice generally requires lengthy interactions between the expert whose knowledge is to be captured and the knowledge engineer whose responsibility is to acquire and represent the expert's knowledge in a useful form. Although much research has been devoted to the development of methodologies and computer software to aid in the capture and representation of some types of knowledge, procedural knowledge has received relatively little attention. In essence, TARGET is one of the first tools of its kind, commercial or institutional, designed to support this type of knowledge capture undertaking. This paper describes the design and development of TARGET for the acquisition and representation of procedural knowledge. The strategies employed by TARGET to support use by knowledge engineers, subject matter experts, programmers, and managers are discussed, including the method by which the tool employs its graphical user interface to generate a task hierarchy report. Next, the approach used to generate production rules for incorporation in, and development of, a CLIPS-based expert system is elaborated. TARGET also permits experts to visually describe procedural tasks, providing a common medium for knowledge refinement by the expert community and the knowledge engineer, making knowledge consensus possible. The paper briefly touches on the verification and validation issues facing the CLIPS rule generation aspects of TARGET. A description of efforts to support TARGET's interoperability on PCs, Macintoshes, and UNIX workstations concludes the paper.
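The translation from a captured procedure step to a CLIPS production rule can be pictured with a small Python generator. The fact and rule naming scheme below is a hypothetical illustration, not the format TARGET actually emits:

    def step_to_clips_rule(step_id, next_id, instruction):
        """Render one captured procedure step as a CLIPS defrule string.

        The (current-step N) fact convention here is invented for
        illustration; TARGET's real rule format may differ.
        """
        return (
            f"(defrule do-step-{step_id}\n"
            f"   ?s <- (current-step {step_id})\n"
            f"   =>\n"
            f"   (retract ?s)\n"
            f"   (printout t \"{instruction}\" crlf)\n"
            f"   (assert (current-step {next_id})))"
        )

    print(step_to_clips_rule(1, 2, "Open valve A"))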
Huang, Po-Hsin; Chiu, Ming-Chuan
2016-01-01
The Digital Accessible Information SYstem (DAISY) player is an assistive reading tool developed for use by persons with visual impairments. Certain problems have persisted in the operating procedure and interface of DAISY players, especially for their Chinese users. Therefore, the aim of this study was to redesign the DAISY player with increased usability for native Chinese speakers. First, a User Centered Design (UCD) process was employed to analyze the development of the prototype. Next, operation procedures were reorganized according to GOMS (Goals, Operators, Methods, and Selection rules) methodology. Then the user interface was redesigned according to specific Universal Design (UD) principles. Following these revisions, an experiment involving four scenarios was conducted in which twelve visually impaired participants tested the new prototype against other players. Results indicate the prototype had the quickest operating times, the fewest operating errors, and the lowest mental workloads of all the compared players, significantly enhancing its usability. These findings have allowed us to generate suggestions for developing the next generation of DAISY players, especially for Chinese audiences. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
RuleGO: a logical rules-based tool for description of gene groups by means of Gene Ontology
Gruca, Aleksandra; Sikora, Marek; Polanski, Andrzej
2011-01-01
Genome-wide expression profiles obtained with the use of DNA microarray technology provide an abundance of experimental data on biological and molecular processes. Such amounts of data need to be further analyzed and interpreted in order to draw biological conclusions from experimental results. The analysis requires a lot of experience and is usually a time-consuming process; thus, various annotation databases are frequently used to improve the whole process of analysis. Here, we present RuleGO—a web-based application that allows the user to describe gene groups on the basis of logical rules that include Gene Ontology (GO) terms in their premises. The presented application allows obtaining rules that reflect the co-appearance of GO terms describing the genes supported by the rules. The ontology level and the number of co-appearing GO terms are adjusted automatically; the user limits the space of possible solutions only. The RuleGO application is freely available at http://rulego.polsl.pl/. PMID:21715384
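The flavor of such logical rules can be shown with a toy matcher: a rule's premise is a set of GO terms, and a gene supports the rule when its annotations contain all of them. The term identifiers and annotations below are arbitrary examples, and real RuleGO output involves ontology-level handling and statistical filtering not sketched here:

    # Toy illustration of conjunctive GO-term rules; not RuleGO's actual engine.
    annotations = {
        "geneA": {"GO:0006915", "GO:0042981"},
        "geneB": {"GO:0006915"},
        "geneC": {"GO:0006915", "GO:0042981", "GO:0008283"},
    }

    def rule_support(premise, annotations):
        """Genes whose GO annotations contain every term of the premise."""
        return {g for g, terms in annotations.items() if premise <= terms}

    print(rule_support({"GO:0006915", "GO:0042981"}, annotations))
    # -> {'geneA', 'geneC'}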
NASA Technical Reports Server (NTRS)
Glick, B. J.
1985-01-01
Techniques for classifying objects into groups or classes go under many different names including, most commonly, cluster analysis. Mathematically, the general problem is to find a best mapping of objects into an index set consisting of class identifiers. When an a priori grouping of objects exists, the process of deriving classification rules from samples of classified objects is known as discrimination. When such rules are applied to objects of unknown class, the process is denoted classification. The specific problem addressed involves the group classification of a set of objects that are each associated with a series of measurements (ratio, interval, ordinal, or nominal levels of measurement). Each measurement produces one variable in a multidimensional variable space. Cluster analysis techniques are reviewed, and methods for including geographic location, distance measures, and spatial pattern (distribution) as parameters in clustering are examined. For the case of patterning, measures of spatial autocorrelation are discussed in terms of the kind of data (nominal, ordinal, or interval scaled) to which they may be applied.
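For interval-scaled data, the most common spatial autocorrelation measure is Moran's I; a compact Python sketch (the spatial weight matrix w encoding adjacency is assumed to be supplied by the caller) is:

    import numpy as np

    def morans_i(x, w):
        """Moran's I spatial autocorrelation for interval-scaled values x,
        given a spatial weight matrix w (n x n, zero diagonal)."""
        x = np.asarray(x, dtype=float)
        z = x - x.mean()                       # deviations from the mean
        n = len(x)
        return (n / w.sum()) * (z @ w @ z) / (z @ z)

    w = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # chain adjacency
    print(morans_i([1.0, 2.0, 4.0], w))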
Design, Simulation and Fabrication of Triaxial MEMS High Shock Accelerometer.
Zhang, Zhenhai; Shi, Zhiguo; Yang, Zhan; Xie, Zhihong; Zhang, Donghong; Cai, De; Li, Kejie; Shen, Yajing
2015-04-01
On the basis of an analysis of the disadvantages of other accelerometer structures, a three-axis high-g MEMS piezoresistive accelerometer is put forward for application in the high-shock test field. The accelerometer's structure and working principle are discussed in detail. The simulation results show that the three-axis high-shock MEMS accelerometer can bear high shock. After bearing high-shock impact in a high-shock shooting test, the accelerometer can capture complete measurement data of the penetration process while maintaining measurement accuracy over the high-shock load range. This makes it possible not only to analyze stress-wave propagation and the penetration behavior of the missile body during penetration, but also to support burst-point control testing. The accelerometer has wide application in recording the characteristic data of a projectile penetrating a hard target, providing technical support for both penetration studies and defense engineering.
Automated geospatial Web Services composition based on geodata quality requirements
NASA Astrophysics Data System (ADS)
Cruz, Sérgio A. B.; Monteiro, Antonio M. V.; Santos, Rafael
2012-10-01
Service-Oriented Architecture and Web Services technologies improve the performance of activities involved in geospatial analysis with a distributed computing architecture. However, the design of the geospatial analysis process on this platform, by combining component Web Services, presents some open issues, and the automated construction of these compositions represents an important research topic. Some approaches to solving this problem are based on AI planning methods coupled with semantic service descriptions. This work presents a new approach using AI planning methods to improve the robustness of the produced geospatial Web Services composition. For this purpose, we use semantic descriptions of geospatial data quality requirements in rule-based form. These rules allow the semantic annotation of geospatial data and, coupled with a conditional planning method, represent more precisely the situations of nonconformity with geodata quality that may occur during execution of the Web Service composition. The service compositions produced by this method are more robust, improving process reliability when working with chains of geospatial Web Services.
The Development of the World Anti-Doping Code.
Young, Richard
2017-01-01
This chapter addresses both the development and substance of the World Anti-Doping Code, which came into effect in 2003, as well as the subsequent Code amendments, which came into effect in 2009 and 2015. Through an extensive process of stakeholder input and collaboration, the World Anti-Doping Code has transformed the hodgepodge of inconsistent and competing pre-2003 anti-doping rules into a harmonized and effective approach to anti-doping. The Code, as amended, is now widely recognized worldwide as the gold standard in anti-doping. The World Anti-Doping Code originally went into effect on January 1, 2004. The first amendments to the Code went into effect on January 1, 2009, and the second amendments on January 1, 2015. The Code and the related international standards are the product of a long and collaborative process designed to make the fight against doping more effective through the adoption and implementation of worldwide harmonized rules and best practices. © 2017 S. Karger AG, Basel.
Importance of the DNA “bond” in programmable nanoparticle crystallization
Macfarlane, Robert J.; Thaner, Ryan V.; Brown, Keith A.; Zhang, Jian; Lee, Byeongdu; Nguyen, SonBinh T.; Mirkin, Chad A.
2014-01-01
If a solution of DNA-coated nanoparticles is allowed to crystallize, the thermodynamic structure can be predicted by a set of structural design rules analogous to Pauling’s rules for ionic crystallization. The details of the crystallization process, however, have proved more difficult to characterize as they depend on a complex interplay of many factors. Here, we report that this crystallization process is dictated by the individual DNA bonds and that the effect of changing structural or environmental conditions can be understood by considering the effect of these parameters on free oligonucleotides. Specifically, we observed the reorganization of nanoparticle superlattices using time-resolved synchrotron small-angle X-ray scattering in systems with different DNA sequences, salt concentrations, and densities of DNA linkers on the surface of the nanoparticles. The agreement between bulk crystallization and the behavior of free oligonucleotides may bear important consequences for constructing novel classes of crystals and incorporating new interparticle bonds in a rational manner. PMID:25298535
Dynamic Approaches to Language Processing
ERIC Educational Resources Information Center
Srinivasan, Narayanan
2007-01-01
Symbolic rule-based approaches have been a preferred way to study language and cognition. Dissatisfaction with rule-based approaches in the 1980s led to alternative approaches to study language, the most notable being the dynamic approaches to language processing. Dynamic approaches provide a significant alternative by not being rule-based and…
20 CFR 418.1340 - What are the rules for our administrative review process?
Code of Federal Regulations, 2010 CFR
2010-04-01
... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false What are the rules for our administrative review process? 418.1340 Section 418.1340 Employees' Benefits SOCIAL SECURITY ADMINISTRATION MEDICARE... they are not inconsistent with the rules in this subpart for making initial determinations and...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-24
... Disciplinary Rule 476A to add certain rules to Part 1A: List of Exchange Rule Violations and Fines Applicable... 1. Purpose The Exchange proposes to amend NYSE Amex Disciplinary Rule 476A to add certain rules to..., in connection with the Exchange's process to harmonize certain Exchange rules with rules of the...
The role of the striatum in rule application: the model of Huntington's disease at early stage.
Teichmann, Marc; Dupoux, Emmanuel; Kouider, Sid; Brugières, Pierre; Boissé, Marie-Françoise; Baudic, Sophie; Cesaro, Pierre; Peschanski, Marc; Bachoud-Lévi, Anne-Catherine
2005-05-01
The role of the basal ganglia, and more specifically of the striatum, in language is still debated. Recent studies have proposed that linguistic abilities involve two distinct types of processes: the retrieving of stored information, implicating temporal lobe areas, and the application of combinatorial rules, implicating fronto-striatal circuits. Studies of patients with focal lesions and neurodegenerative diseases have suggested a role for the striatum in morphological rule application, but functional imaging studies found that the left caudate was involved in syntactic processing and not morphological processing. In the present study, we tested the view that the basal ganglia are involved in rule application and not in lexical retrieving in a model of striatal dysfunction, namely Huntington's disease at early stages. We assessed the rule-lexicon dichotomy in the linguistic domain with morphology (conjugation of non-verbs and verbs) and syntax (sentence comprehension) and in a non-linguistic domain with arithmetic operations (subtraction and multiplication). Thirty Huntington's disease patients (15 at stage I and 15 at stage II) and 20 controls matched for their age and cultural level were included in this study. Huntington's disease patients were also assessed using the Unified Huntington's Disease Rating Scale (UHDRS) and MRI. We found that early Huntington's disease patients were impaired in rule application in the linguistic and non-linguistic domains (morphology, syntax and subtraction), whereas they were broadly spared with lexical processing. The pattern of performance was similar in patients at stage I and stage II, except that stage II patients were more impaired in all tasks assessing rules and had in addition a very slight impairment in the lexical condition of conjugation. Finally, syntactic rule abilities correlated with all markers of the disease evolution including bicaudate ratio and performance in executive function, whereas there was no correlation with arithmetic and morphological abilities. Together, this suggests that the striatum is involved in rule processing more than in lexical processing and that it extends to linguistic and non-linguistic domains. These results are discussed in terms of domain-specific versus domain-general processes of rule application.
Ultimate strength performance of tankers associated with industry corrosion addition practices
NASA Astrophysics Data System (ADS)
Kim, Do Kyun; Kim, Han Byul; Zhang, Xiaoming; Li, Chen Guang; Paik, Jeom Kee
2014-09-01
In ship and offshore structure design, age-related problems such as corrosion damage, local denting, and fatigue damage are important factors to be considered in building a reliable structure, as they have a significant influence on the residual structural capacity. In shipping, corrosion addition methods are widely adopted in structural design to prevent structural capacity degradation. The present study focuses on the historical trend of corrosion addition rules for ship structural design and investigates their effects on the ultimate strength performance of the hull girder and stiffened panels of double-hull oil tankers. Three rules-based corrosion addition models, namely historic corrosion rules (pre-CSR), Common Structural Rules (CSR), and harmonised Common Structural Rules (CSRH), are considered and compared with two other corrosion models: the UGS model, suggested by the Union of Greek Shipowners (UGS), and the Time-Dependent Corrosion Wastage Model (TDCWM). To identify the general trend in the effects of corrosion damage on ultimate longitudinal strength performance, the corrosion addition rules are applied to four representative sizes of double-hull oil tankers, namely Panamax, Aframax, Suezmax, and VLCC. The results are helpful in understanding the trend of corrosion additions for tanker structures.
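Time-dependent wastage models of the kind compared here are typically power laws in the time after coating breakdown; a commonly used form (shown in LaTeX for orientation, not necessarily the exact TDCWM formulation of the paper) is

    d(t) =
    \begin{cases}
    0, & t \le T_c \\
    C_1\,(t - T_c)^{C_2}, & t > T_c
    \end{cases}

where $T_c$ is the coating life and $C_1$, $C_2$ are fitted coefficients. Rule-based corrosion additions, by contrast, prescribe a fixed thickness margin per structural member regardless of age.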
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-17
... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-69733; File No. SR-NYSEMKT-2013-25] Self-Regulatory Organizations; NYSE MKT LLC; Notice of Designation of a Longer Period for Commission Action on Proposed Rule Change Amending NYSE MKT Rule 104--Equities To Codify Certain Traditional Trading Floor Functions That May Be Performed by Designated...
Multi-arm group sequential designs with a simultaneous stopping rule.
Urach, S; Posch, M
2016-12-30
Multi-arm group sequential clinical trials are efficient designs to compare multiple treatments to a control. They allow one to test for treatment effects already in interim analyses and can have a lower average sample number than fixed sample designs. Their operating characteristics depend on the stopping rule: We consider simultaneous stopping, where the whole trial is stopped as soon as for any of the arms the null hypothesis of no treatment effect can be rejected, and separate stopping, where only recruitment to arms for which a significant treatment effect could be demonstrated is stopped, but the other arms are continued. For both stopping rules, the family-wise error rate can be controlled by the closed testing procedure applied to group sequential tests of intersection and elementary hypotheses. The group sequential boundaries for the separate stopping rule also control the family-wise error rate if the simultaneous stopping rule is applied. However, we show that for the simultaneous stopping rule, one can apply improved, less conservative stopping boundaries for local tests of elementary hypotheses. We derive corresponding improved Pocock and O'Brien type boundaries as well as optimized boundaries to maximize the power or average sample number and investigate the operating characteristics and small sample properties of the resulting designs. To control the power to reject at least one null hypothesis, the simultaneous stopping rule requires a lower average sample number than the separate stopping rule. This comes at the cost of a lower power to reject all null hypotheses. Some of this loss in power can be regained by applying the improved stopping boundaries for the simultaneous stopping rule. The procedures are illustrated with clinical trials in systemic sclerosis and narcolepsy. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
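The operating characteristics discussed here are easy to probe by simulation. The Python sketch below estimates the family-wise error rate of a two-look, two-arm design under the simultaneous stopping rule with a constant critical value (a Pocock-type boundary); it tests elementary hypotheses directly rather than implementing the paper's closed testing procedure, and the boundary value is a placeholder to be calibrated by the same simulation:

    import numpy as np

    rng = np.random.default_rng(0)

    def fwer_simultaneous(c, n_arms=2, k_looks=2, n_per_stage=50, reps=10000):
        """Monte Carlo family-wise error rate under the global null for a
        multi-arm trial stopped as a whole at the first rejection
        (simultaneous stopping), with constant critical value c for the
        elementary z-tests. Illustrative sketch only."""
        errors = 0
        for _ in range(reps):
            # cumulative sums of control and arm observations at each look
            ctrl = np.cumsum(rng.normal(size=(k_looks, n_per_stage)).sum(axis=1))
            arms = np.cumsum(
                rng.normal(size=(n_arms, k_looks, n_per_stage)).sum(axis=2), axis=1)
            for k in range(k_looks):
                n = (k + 1) * n_per_stage
                z = (arms[:, k] - ctrl[k]) / n / np.sqrt(2.0 / n)
                if np.any(z > c):        # first rejection stops the whole trial
                    errors += 1
                    break
        return errors / reps

    # Search over c so that the estimate matches the target alpha.
    print(fwer_simultaneous(c=2.2))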
50 CFR 424.16 - Proposed rules.
Code of Federal Regulations, 2010 CFR
2010-10-01
... OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A LISTING ENDANGERED AND THREATENED SPECIES AND DESIGNATING CRITICAL HABITAT Revision of the Lists § 424.16 Proposed rules. (a) General. Based... reclassification of a species or the designation or revision of critical habitat shall also include a summary of...
50 CFR 424.18 - Final rules-general.
Code of Federal Regulations, 2014 CFR
2014-10-01
..., DEPARTMENT OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A LISTING ENDANGERED AND THREATENED SPECIES AND DESIGNATING CRITICAL HABITAT Revision of the Lists § 424.18 Final rules—general. (a... rule to list, delist, or reclassify a species or designate or revise critical habitat will also provide...
50 CFR 424.18 - Final rules-general.
Code of Federal Regulations, 2012 CFR
2012-10-01
..., DEPARTMENT OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A LISTING ENDANGERED AND THREATENED SPECIES AND DESIGNATING CRITICAL HABITAT Revision of the Lists § 424.18 Final rules—general. (a... rule to list, delist, or reclassify a species or designate or revise critical habitat will also provide...
Use of evidence in a categorization task: analytic and holistic processing modes.
Greco, Alberto; Moretti, Stefania
2017-11-01
Category learning performance can be influenced by many contextual factors, but the effects of these factors are not the same for all learners. The present study suggests that these differences can be due to the different ways evidence is used, according to two main basic modalities of processing information, analytically or holistically. In order to test the impact of the information provided, an inductive rule-based task was designed, in which feature salience and comparison informativeness between examples of two categories were manipulated during the learning phases, by introducing and progressively reducing some perceptual biases. To gather data on processing modalities, we devised the Active Feature Composition task, a production task that does not require classifying new items but reproducing them by combining features. At the end, an explicit rating task was performed, which entailed assessing the accuracy of a set of possible categorization rules. A combined analysis of the data collected with these two different tests enabled profiling participants in regard to the kind of processing modality, the structure of representations and the quality of categorial judgments. Results showed that despite the fact that the information provided was the same for all participants, those who adopted analytic processing better exploited evidence and performed more accurately, whereas with holistic processing categorization is perfectly possible but inaccurate. Finally, the cognitive implications of the proposed procedure, with regard to involved processes and representations, are discussed.
Scalable printed electronics: an organic decoder addressing ferroelectric non-volatile memory
Ng, Tse Nga; Schwartz, David E.; Lavery, Leah L.; Whiting, Gregory L.; Russo, Beverly; Krusor, Brent; Veres, Janos; Bröms, Per; Herlogsson, Lars; Alam, Naveed; Hagel, Olle; Nilsson, Jakob; Karlsson, Christer
2012-01-01
Scalable circuits of organic logic and memory are realized using all-additive printing processes. A 3-bit organic complementary decoder is fabricated and used to read and write non-volatile, rewritable ferroelectric memory. The decoder-memory array is patterned by inkjet and gravure printing on flexible plastics. Simulation models for the organic transistors are developed, enabling circuit designs tolerant of the variations in printed devices. We explain the key design rules in fabrication of complex printed circuits and elucidate the performance requirements of materials and devices for reliable organic digital logic. PMID:22900143
NASA Technical Reports Server (NTRS)
1972-01-01
A long life assurance program for the development of design, process, test, and application guidelines for achieving reliable spacecraft hardware was conducted. The study approach consisted of a review of technical data performed concurrently with a survey of the aerospace industry. The data reviewed included design and operating characteristics, failure histories and solutions, and similar documents. The topics covered by the guidelines are reported. It is concluded that long life hardware is achieved through meticulous attention to many details and no simple set of rules can suffice.
Managing Large Scale Project Analysis Teams through a Web Accessible Database
NASA Technical Reports Server (NTRS)
O'Neil, Daniel A.
2008-01-01
Large scale space programs analyze thousands of requirements while mitigating safety, performance, schedule, and cost risks. These efforts involve a variety of roles with interdependent use cases and goals. For example, study managers and facilitators identify ground rules and assumptions for a collection of studies required for a program or project milestone. Task leaders derive product requirements from the ground rules and assumptions and describe activities to produce needed analytical products. Discipline specialists produce the specified products and load results into a file management system. Organizational and project managers provide the personnel and funds to conduct the tasks. Each role has responsibilities to establish information linkages and provide status reports to management. Projects conduct design and analysis cycles to refine designs to meet the requirements and implement risk mitigation plans. At the program level, integrated design and analysis cycle studies are conducted to eliminate every 'to-be-determined' and develop plans to mitigate every risk. At the agency level, strategic studies analyze different approaches to exploration architectures and campaigns. This paper describes a web-accessible database developed by NASA to coordinate and manage tasks at three organizational levels. Other topics in this paper cover integration technologies and techniques for process modeling and enterprise architectures.
Effective Design of Multifunctional Peptides by Combining Compatible Functions
Diener, Christian; Garza Ramos Martínez, Georgina; Moreno Blas, Daniel; Castillo González, David A.; Corzo, Gerardo; Castro-Obregon, Susana; Del Rio, Gabriel
2016-01-01
Multifunctionality is a common trait of many natural proteins and peptides, yet the rules to generate such multifunctionality remain unclear. We propose that the rules defining some protein/peptide functions are compatible. To explore this hypothesis, we trained a computational method to predict cell-penetrating peptides (CPPs) at the sequence level and learned that antimicrobial peptides (AMPs) and DNA-binding proteins are compatible with the rules of our predictor. Based on this finding, we expected that designing peptides for CPP activity might also confer AMP and DNA-binding activities. To test this prediction, we designed peptides that embedded two independent functional domains (nuclear localization and yeast pheromone activity), linked by optimizing their composition to fit the rules characterizing cell-penetrating peptides. These peptides presented effective cell penetration, DNA-binding, pheromone, and antimicrobial activities, thus confirming the effectiveness of our computational approach to design multifunctional peptides with potential therapeutic uses. Our computational implementation is available at http://bis.ifc.unam.mx/en/software/dcf. PMID:27096600
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-31
... Proposed Rule Change, as Modified by Amendment No. 1 Thereto, To Adopt Commentary .03 to Rule 980NY To Limit the Volume of Complex Orders by a Single ATP Holder During the Trading Day December 24, 2013. On...\\ and Rule 19b-4 thereunder,\\2\\ a proposed rule change to adopt Commentary .03 to NYSE MKT Rule 980NY to...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-10
...\\ FINRA Rule 6140(a) defines a ``designated security'' as any NMS stock as defined in Rule 600(b)(47) of Regulation NMS, 17 CFR 242.600(b)(47). \\8\\ See FINRA Rule 6140(h)(1)(A)-(B). \\9\\ See FINRA Rule 6140(h)(2... requirements of Section 15A(b) of the Act \\47\\ and the rules and regulations thereunder applicable to a...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Panas, Robert M.
2016-06-23
This paper presents a new analytical method for predicting the large displacement behavior of flexural double parallelogram (DP) bearings with underconstraint eliminator (UE) linkages. This closed-form perturbative Euler analysis method is able to – for the first time – directly incorporate the elastomechanics of a discrete UE linkage, which is a hybrid flexure element that is linked to ground as well as both stages of the bearing. The models are used to understand a nested linkage UE design; however, the method is extensible to other UE linkages. Design rules and figures-of-merit are extracted from the analysis models, which provide powerful tools for accelerating the design process. The models, rules, and figures-of-merit enable the rapid design of a UE for a desired large displacement behavior, as well as providing a means for determining the limits of UE and DP structures. This will aid in the adoption of UE linkages into DP bearings for precision mechanisms. Models are generated for a nested linkage UE design, and the performance of this DP with UE structure is compared to a DP-only bearing. As a result, the perturbative Euler analysis is shown to match existing theories for DP-only bearings with distributed compliance within ≈2%, and Finite Element Analysis for the DP with UE bearings within an average of 10%.
An expert system for natural language processing
NASA Technical Reports Server (NTRS)
Hennessy, John F.
1988-01-01
A solution to the natural language processing problem that uses a rule-based system, written in OPS5, to replace the traditional parsing method is proposed. The advantages of using a rule-based system are explored. Specifically, the extensibility of a rule-based solution is discussed, as well as the value of maintaining rules that function independently. Finally, the power of using semantics to supplement the syntactic analysis of a sentence is considered.
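OPS5-style processing is forward chaining over a working memory of facts. A toy Python rendering of the idea, with made-up facts and a single rule, far simpler than the system described, is:

    # Toy forward-chaining engine in the spirit of OPS5 (illustrative only).
    facts = {("word", "the", "det"), ("word", "book", "noun")}

    def det_noun_rule(facts):
        """If a determiner and a noun are present, assert a noun phrase."""
        if ("word", "the", "det") in facts and ("word", "book", "noun") in facts:
            return {("phrase", "the book", "NP")}
        return set()

    rules = [det_noun_rule]

    changed = True
    while changed:                       # fire rules until quiescence
        changed = False
        for rule in rules:
            new = rule(facts) - facts
            if new:
                facts |= new
                changed = True

    print(facts)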
Automating the design of scientific computing software
NASA Technical Reports Server (NTRS)
Kant, Elaine
1992-01-01
SINAPSE is a domain-specific software design system that generates code from specifications of equations and algorithm methods. This paper describes the system's design techniques (planning in a space of knowledge-based refinement and optimization rules), user interaction style (user has option to control decision making), and representation of knowledge (rules and objects). It also summarizes how the system knowledge has evolved over time and suggests some issues in building software design systems to facilitate reuse.
Rule-Based Design of Plant Expression Vectors Using GenoCAD.
Coll, Anna; Wilson, Mandy L; Gruden, Kristina; Peccoud, Jean
2015-01-01
Plant synthetic biology requires software tools to assist in the design of complex multi-genic expression plasmids. Here, a vector design strategy to express genes in plants is formalized and implemented as a grammar in GenoCAD, Computer-Aided Design software for synthetic biology. It includes a library of plant biological parts organized in structural categories and a set of rules describing how to assemble these parts into large constructs. The rules developed here are organized into three main subsections according to the aim of the final construct: protein localization studies, promoter analysis, and protein-protein interaction experiments. The GenoCAD plant grammar guides the user through the design while allowing users to customize vectors according to their needs. The plant grammar implemented in GenoCAD will therefore help plant biologists take advantage of methods from synthetic biology to design expression vectors supporting their research projects.
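The grammar idea can be sketched in Python as a handful of productions over part categories; the categories, parts, and rules below are a simplified invention, not the published GenoCAD plant grammar:

    # Toy construct grammar (simplified; not the actual GenoCAD plant grammar).
    grammar = {
        "CASSETTE": [["PROMOTER", "CDS", "TERMINATOR"],
                     ["PROMOTER", "CDS", "TAG", "TERMINATOR"]],
    }
    categories = {"PROMOTER": {"p35S"}, "CDS": {"gfp", "myGene"},
                  "TAG": {"his6"}, "TERMINATOR": {"tNOS"}}

    def valid_cassette(parts, part_categories):
        """Check whether a sequence of named parts matches some production."""
        cats = [next(c for c, members in part_categories.items() if p in members)
                for p in parts]
        return cats in grammar["CASSETTE"]

    print(valid_cassette(["p35S", "gfp", "tNOS"], categories))   # True
    print(valid_cassette(["gfp", "p35S", "tNOS"], categories))   # False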
77 FR 30087 - Air Quality Designations for the 2008 Ozone National Ambient Air Quality Standards
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-21
...This rule establishes initial air quality designations for most areas in the United States, including areas of Indian country, for the 2008 primary and secondary national ambient air quality standards (NAAQS) for ozone. The designations for several counties in Illinois, Indiana, and Wisconsin that the EPA is considering for inclusion in the Chicago nonattainment area will be designated in a subsequent action, no later than May 31, 2012. Areas designated as nonattainment are also being classified by operation of law according to the severity of their air quality problems. The classification categories are Marginal, Moderate, Serious, Severe, and Extreme. The EPA is establishing the air quality thresholds that define the classifications in a separate rule that the EPA is signing and publishing in the Federal Register on the same schedule as these designations. In accordance with that separate rule, six nonattainment areas in California are being reclassified to a higher classification.
AMICAL: An aid for architectural synthesis and exploration of control circuits
NASA Astrophysics Data System (ADS)
Park, Inhag
AMICAL is an architectural synthesis system for control-flow dominated circuits. The input is a behavioral finite state machine specification on which scheduling and register allocation have been performed; the output is an abstract architecture specification that may feed existing silicon compilers acting at the logic and register transfer levels. AMICAL consists of five main functions allowing automatic, interactive, and manual synthesis, as well as combinations of these methods. These functions are a synthesizer, a graphics editor, a verifier, an evaluator, and a documentor. Automatic synthesis is achieved by algorithms that allocate both functional units, stored in an expandable user-defined library, and connections. AMICAL also allows the designer to interrupt the synthesis process at any stage and make interactive modifications via a specially designed graphics editor. The user's modifications are verified and evaluated to ensure that no design rules are broken and that any imposed constraints are still met. A documentor provides the designer with status and feedback reports from the synthesis process.
An Integrated Product Environment
NASA Technical Reports Server (NTRS)
Higgins, Chuck
1997-01-01
Mechanical Advantage is a mechanical design decision support system. Unlike our CAD/CAM cousins, Mechanical Advantage addresses true engineering processes, not just the form and fit of geometry. If we look at a traditional engineering environment, we see that an engineer starts with two things: performance goals and design rules. The intent is to have a product perform specific functions and accomplish that within a designated environment. Geometry should be a simple byproduct of that engineering process, not the controller of it. Mechanical Advantage is a performance modeler allowing engineers to consider all these criteria in making their decisions by providing such capabilities as critical parameter analysis, tolerance and sensitivity analysis, math-driven geometry, and automated design optimization. If you should desire an industry standard solid model, we would produce an ACIS-based solid model. If you should desire an ANSI/ISO standard drawing, we would produce this as well with a virtual push of the button. For more information on this and other Advantage Series products, please contact the author.
Video Self-Modeling to Teach Classroom Rules to Two Students with Asperger's
ERIC Educational Resources Information Center
Lang, Russell; Shogren, Karrie A.; Machalicek, Wendy; Rispoli, Mandy; O'Reilly, Mark; Baker, Sonia; Regester, April
2009-01-01
Classroom rules are an integral part of classroom management. Children with Asperger's may require systematic instruction to learn classroom rules, but may be placed in classrooms in which the rules are not explicitly taught. A multiple baseline design across students with probes for maintenance after the intervention ceased was used to evaluate…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-18
... and Rule 19b-4 thereunder, a proposed rule change to make adjustments to the liquidity risk factor... liquidity risk factor component. The proposed rule change was published for comment in the Federal Register... on Proposed Rule Change Related to the Liquidity Factor of CME's CDS Margin Methodology June 12, 2013...
Cost efficiency of the non-associative flow rule simulation of an industrial component
NASA Astrophysics Data System (ADS)
Galdos, Lander; de Argandoña, Eneko Saenz; Mendiguren, Joseba
2017-10-01
In the last decade, the metal forming industry has become increasingly competitive. In this context, FEM modeling has become a primary source of information for component and process design. Numerous researchers have focused on improving the accuracy of the material models implemented in FEM codes in order to improve the predictive quality of the simulations. Aimed at increasing the efficiency of anisotropic behavior modelling, in recent years the use of non-associative flow rule (NAFR) models has been presented as an alternative to the classic associative flow rule (AFR) models. In this work, the cost efficiency of the chosen flow rule has been numerically analyzed by simulating an industrial drawing operation with two models of the same degree of flexibility: one AFR model and one NAFR model. The study concludes that the flow rule has a negligible influence on the final drawing prediction, which is mainly driven by the model parameter identification procedure. Even though the NAFR formulation is more complex than the AFR one, the present study shows that the total simulation time with explicit FE solvers is reduced without loss of accuracy. Furthermore, NAFR formulations have an advantage over AFR formulations in parameter identification because the formulation decouples the yield stress and the Lankford coefficients.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meshkati, N.; Buller, B.J.; Azadeh, M.A.
1995-04-01
The goal of this research is threefold: (1) to use the Skill-, Rule-, and Knowledge-based levels of cognitive control -- the SRK framework -- to develop an integrated information processing conceptual framework (for integration of workstation, job, and team design); (2) to evaluate the user interface component of this framework -- the Ecological display; and (3) to analyze the effect of operators' individual information processing behavior and decision styles on handling plant disturbances, plus their performance on, and preference for, Traditional and Ecological user interfaces. A series of studies were conducted. In Part I, a computer simulation model and a mathematical model were developed. In Part II, an experiment was designed and conducted at the EBR-II plant of the Argonne National Laboratory-West in Idaho Falls, Idaho. It is concluded that: the integrated SRK-based information processing model for control room operations is superior to the conventional rule-based model; operators' individual decision styles and the combination of their styles play a significant role in effective handling of nuclear power plant disturbances; use of the Ecological interface results in significantly more accurate event diagnosis and recall of various plant parameters, faster response to plant transients, and higher ratings of subject preference; and operators' decision styles affect both their performance and their preference for the Ecological interface.
ERIC Educational Resources Information Center
Jha, Vikram; Duffy, Sean
2002-01-01
Reports the results of an evaluation of Distance Interactive Learning in Obstetrics and Gynecology (DIALOG), which is an electronic program for continuing education. Presents 10 golden rules for designing software for medical practitioners. (Contains 26 references.) (Author/YDS)
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-02
... with a time-in-force designation of Good Til canceled (``GTC'') are treated as having a time-in-force... designation of Good Til Cancelled or Immediate or Cancel. See proposed BX Options Rules, Chapter VI, Section 1...
46 CFR 116.300 - Structural design.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Structure § 116.300 Structural design. Except as otherwise allowed by this subpart, a vessel must comply... the vessel. (a) Steel hull vessels: (1) Rules and Regulations for the Classification of Yachts and Small Craft, Lloyd's Register of Shipping (Lloyd's); or (2) Rules for Building and Classing Steel...
46 CFR 116.300 - Structural design.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Structure § 116.300 Structural design. Except as otherwise allowed by this subpart, a vessel must comply... the vessel. (a) Steel hull vessels: (1) Rules and Regulations for the Classification of Yachts and Small Craft, Lloyd's Register of Shipping (Lloyd's); or (2) Rules for Building and Classing Steel...
46 CFR 116.300 - Structural design.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Structure § 116.300 Structural design. Except as otherwise allowed by this subpart, a vessel must comply... the vessel. (a) Steel hull vessels: (1) Rules and Regulations for the Classification of Yachts and Small Craft, Lloyd's Register of Shipping (Lloyd's); or (2) Rules for Building and Classing Steel...
46 CFR 116.300 - Structural design.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Structure § 116.300 Structural design. Except as otherwise allowed by this subpart, a vessel must comply... the vessel. (a) Steel hull vessels: (1) Rules and Regulations for the Classification of Yachts and Small Craft, Lloyd's Register of Shipping (Lloyd's); or (2) Rules for Building and Classing Steel...
46 CFR 116.300 - Structural design.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Structure § 116.300 Structural design. Except as otherwise allowed by this subpart, a vessel must comply... the vessel. (a) Steel hull vessels: (1) Rules and Regulations for the Classification of Yachts and Small Craft, Lloyd's Register of Shipping (Lloyd's); or (2) Rules for Building and Classing Steel...
Intelligent design is not science, U.S. judge rules
NASA Astrophysics Data System (ADS)
Zielinski, Sarah
A Pennsylvania school district may not mandate the teaching of ‘intelligent design’ because the concept is not science and cannot be uncoupled from its religious roots in creationism, a U.S. federal judge ruled on 20 December. U.S. District Judge John E. Jones III ruled that the Dover, Pa., school board's policy on intelligent design violates the Establishment Clause of the First Amendment of the U.S. Constitution, which forbids the government from establishing a state religion, and also violates the Pennsylvania state constitution.
75 FR 37740 - Apricots Grown in Designated Counties in Washington; Increased Assessment Rate
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-30
... Washington; Increased Assessment Rate AGENCY: Agricultural Marketing Service, USDA. ACTION: Proposed rule. SUMMARY: This rule would increase the assessment rate established for the Washington Apricot Marketing..., June 30, 2010 / Proposed Rules...
EUV mask manufacturing readiness in the merchant mask industry
NASA Astrophysics Data System (ADS)
Green, Michael; Choi, Yohan; Ham, Young; Kamberian, Henry; Progler, Chris; Tseng, Shih-En; Chiou, Tsann-Bim; Miyazaki, Junji; Lammers, Ad; Chen, Alek
2017-10-01
As nodes progress into the 7nm and below regime, extreme ultraviolet lithography (EUVL) becomes critical for all industry participants interested in remaining at the leading edge. One key cost driver for EUV in the supply chain is the reflective EUV mask. As of today, the relatively few end users of EUV consist primarily of integrated device manufacturers (IDMs) and foundries that have internal (captive) mask manufacturing capability. At the same time, strong and early participation in EUV by the merchant mask industry should bring value to these chip makers, aiding the wide-scale adoption of EUV in the future. For this, merchants need access to high quality, representative test vehicles to develop and validate their own processes. This business circumstance provides the motivation for merchants to form Joint Development Partnerships (JDPs) with IDMs, foundries, Original Equipment Manufacturers (OEMs) and other members of the EUV supplier ecosystem that leverage complementary strengths. In this paper, we will show how, through a collaborative supplier JDP model between a merchant and OEM, a novel, test-chip-driven strategy is applied to guide and validate mask level process development. We demonstrate how an EUV test vehicle (TV) is generated for mask process characterization in advance of receiving chip maker-specific designs. We utilize the TV to carry out mask process "stress testing" to define process boundary conditions which can be used to create Mask Rule Check (MRC) rules as well as serve as baseline conditions for future process improvement. We utilize Advanced Mask Characterization (AMC) techniques to understand process capability on designs of varying complexity that include EUV OPC models with and without sub-resolution assist features (SRAFs). Through these collaborations, we demonstrate ways to develop EUV processes and reduce implementation risks for eventual mass production. By reducing these risks, we hope to expand access to EUV mask capability for the broadest community possible as the technology is implemented first within and then beyond the initial early adopters.
Process Approach to Determining Quality Inspection Deployment
2015-06-08
B.1 The Deming Rule ... k1/k2? [5] At this stage it is assumed that the manufacturing process is capable and that inspection is effective. The Deming rule is explained in... justify reducing inspectors. (See Appendix B for Deming rule discussion.) Three quantities must be determined: p, the probability of a nonconformity
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-06
... proposes to establish new minimum performance standards for specialist units. Specifically, new Rule... Streamline the Process for Specialist Evaluations and Clarify the Time Within Which SQTs and RSQTs Must Begin...-4 thereunder, a proposed rule change to update and streamline the process for specialist...
Streamlining Change Management with Business Rules
NASA Technical Reports Server (NTRS)
Savela, Christopher
2015-01-01
Discusses how their organization is trying to streamline workflows and the change management process with business rules. In looking for ways to make things more efficient and save money, one approach is to reduce the work that workflow task approvers have to do when reviewing affected items. Shares the technical details of the business rules, how to implement them, and how to speed up the development process by using the API to demonstrate the rules in action.
ERIC Educational Resources Information Center
Wieringa, Nienke; Janssen, Fred J. J. M.; Van Driel, Jan H.
2011-01-01
In science education in the Netherlands new, context-based, curricula are being developed. As in any innovation, the outcome will largely depend on the teachers who design and implement lessons. Central to the study presented here is the idea that teachers, when designing lessons, use rules-of-thumb: notions of what a lesson should look like if…
Barrett, Jeffrey S; Jayaraman, Bhuvana; Patel, Dimple; Skolnik, Jeffrey M
2008-06-01
Previous exploration of oncology study design efficiency has focused on Markov processes alone (probability-based events) without consideration of time dependencies. Barriers to study completion include time delays associated with patient accrual, inevaluability (IE), time to dose-limiting toxicities (DLT), and administrative and review time. Discrete event simulation (DES) can incorporate probability-based assignment of DLT and IE frequency, correlated with cohort in the case of DLT, with time-based events defined by stochastic relationships. A SAS-based solution is presented that examines study efficiency metrics and evaluates design modifications that would improve efficiency. Virtual patients are simulated with attributes defined from prior distributions of relevant patient characteristics. Study population datasets are read into SAS macros, which select patients and enroll them into a study based on the specific design criteria if the study is open to enrollment. Waiting times, arrival times, and times to study events are also sampled from prior distributions; post-processing of study simulations is provided within the decision macros and compared across designs in a separate post-processing algorithm. This solution is examined via comparison of the standard 3+3 decision rule with the "rolling 6" design, a newly proposed enrollment strategy for the phase I pediatric oncology setting.
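The authors' implementation is SAS macro-based; as a rough Python illustration of the same discrete-event idea, the sketch below simulates a single 3+3 escalation with exponential inter-arrival times and a fixed DLT evaluation window. The accrual rate, evaluation window, and per-dose DLT probabilities are hypothetical placeholders, and the "rolling 6" variant is omitted.

```python
import random

def simulate_3plus3(dlt_probs, accrual_rate=2.0, eval_time=1.0, seed=1):
    """Toy discrete-event simulation of one 3+3 dose-escalation trial.
    dlt_probs    -- assumed DLT probability at each dose level (hypothetical)
    accrual_rate -- mean patients accrued per month (exponential gaps)
    eval_time    -- months each cohort is observed before a decision
    Returns (patients_enrolled, study_time, mtd_level); -1 means no safe dose."""
    rng = random.Random(seed)
    t, n, level = 0.0, 0, 0
    while level < len(dlt_probs):
        for _ in range(3):                       # accrue a cohort of 3
            t += rng.expovariate(accrual_rate)
            n += 1
        t += eval_time                           # DLT observation window
        dlts = sum(rng.random() < dlt_probs[level] for _ in range(3))
        if dlts == 0:
            level += 1                           # escalate
        elif dlts == 1:                          # expand by 3 at same dose
            for _ in range(3):
                t += rng.expovariate(accrual_rate)
                n += 1
            t += eval_time
            dlts += sum(rng.random() < dlt_probs[level] for _ in range(3))
            if dlts <= 1:
                level += 1
            else:
                return n, t, level - 1           # MTD one level below
        else:
            return n, t, level - 1
    return n, t, len(dlt_probs) - 1

print(simulate_3plus3([0.05, 0.10, 0.25, 0.45]))
```

Averaging such runs over many seeds gives the enrollment and duration metrics the paper compares across designs.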
Ethics in Publishing: Complexity Science and Human Factors Offer Insights to Develop a Just Culture.
Saurin, Tarcisio Abreu
2016-12-01
While ethics in publishing has been increasingly debated, there seems to be a lack of a theoretical framework for making sense of existing rules of behavior as well as for designing, managing and enforcing such rules. This letter argues that systems-oriented disciplines, such as complexity science and human factors, offer insights into new ways of dealing with ethics in publishing. Some examples of insights are presented. Also, a call is made for empirical studies that unveil the context and details of both retracted papers and the process of writing and publishing academic papers. This is expected to shed light on the complexity of the publication system as well as to support the development of a just culture, in which all participants are accountable.
Big data mining analysis method based on cloud computing
NASA Astrophysics Data System (ADS)
Cai, Qing Qiu; Cui, Hong Gang; Tang, Hao
2017-08-01
In the era of information explosion, big data's massive scale and discrete, non- or semi-structured features have gone far beyond what traditional data management methods can handle. With the arrival of the cloud computing era, cloud computing provides a new technical way to analyze massive data, effectively solving the problem that traditional data mining methods cannot adapt to massive datasets. This paper introduces the meaning and characteristics of cloud computing, analyzes the advantages of using cloud computing technology to realize data mining, designs an association rule mining algorithm based on the MapReduce parallel processing architecture, and carries out experimental verification. The parallel association rule mining algorithm based on a cloud computing platform can greatly improve the execution speed of data mining.
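The paper targets a real MapReduce cluster; the self-contained Python sketch below only mimics the map and reduce stages for one pass of association-rule mining (frequent pair counting). The function names and the minimum-support threshold are illustrative, not the paper's.

```python
from itertools import combinations
from collections import defaultdict

def map_pairs(transaction):
    """Map step: emit (item-pair, 1) for every pair in one transaction."""
    for pair in combinations(sorted(set(transaction)), 2):
        yield pair, 1

def reduce_counts(mapped):
    """Reduce step: sum counts per key, as a MapReduce reducer would."""
    counts = defaultdict(int)
    for key, v in mapped:
        counts[key] += v
    return counts

def frequent_pairs(transactions, min_support):
    mapped = (kv for t in transactions for kv in map_pairs(t))
    counts = reduce_counts(mapped)
    n = len(transactions)
    return {k: c / n for k, c in counts.items() if c / n >= min_support}

baskets = [["milk", "bread", "eggs"], ["milk", "bread"], ["bread", "eggs"]]
print(frequent_pairs(baskets, min_support=0.5))
```

On a cluster, the map and reduce functions are distributed across nodes, which is where the speedup the paper reports comes from.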
Mechanisms of rule acquisition and rule following in inductive reasoning.
Crescentini, Cristiano; Seyed-Allaei, Shima; De Pisapia, Nicola; Jovicich, Jorge; Amati, Daniele; Shallice, Tim
2011-05-25
Despite the recent interest in the neuroanatomy of inductive reasoning processes, the regional specificity within prefrontal cortex (PFC) for the different mechanisms involved in induction tasks remains to be determined. In this study, we used fMRI to investigate the contribution of PFC regions to rule acquisition (rule search and rule discovery) and rule following. Twenty-six healthy young adult participants were presented with a series of images of cards, each consisting of a set of circles numbered in sequence with one colored blue. Participants had to predict the position of the blue circle on the next card. The rules that had to be acquired pertained to the relationship among succeeding stimuli. Responses given by subjects were categorized in a series of phases either tapping rule acquisition (responses given up to and including rule discovery) or rule following (correct responses after rule acquisition). Mid-dorsolateral PFC (mid-DLPFC) was active during rule search and remained active until successful rule acquisition. By contrast, rule following was associated with activation in temporal, motor, and medial/anterior prefrontal cortex. Moreover, frontopolar cortex (FPC) was active throughout the rule acquisition and rule following phases before a rule became familiar. We attributed activation in mid-DLPFC to hypothesis generation and in FPC to integration of multiple separate inferences. The present study provides evidence that brain activation during inductive reasoning involves a complex network of frontal processes and that different subregions respond during rule acquisition and rule following phases.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-13
...-Regulatory Organizations; Fixed Income Clearing Corporation; Notice of Designation of Longer Period for Commission Action on Proposed Rule Change To Allow the Mortgage-Backed Securities Division To Provide...''), and on November 21, 2011, amended a proposed rule change to allow the Mortgage-Backed Securities...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-26
...-4 thereunder,\\2\\ proposed rule changes to allow Retail Member Organizations (``RMOs'') to attest...-regulatory organization consents, the Commission shall either approve the proposed rule change, disapprove...-07] Self-Regulatory Organizations; New York Stock Exchange LLC; NYSE MKT LLC; Notice of Designation...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-20
... DEPARTMENT OF HEALTH AND HUMAN SERVICES 42 CFR Part 5 Negotiated Rulemaking Committee on Designation of Medically Underserved Populations and Health Professional Shortage Areas; Notice of Meeting Correction Proposed Rule document 2011-9081 was inadvertently published in the Rules section of the issue of...
Biggs, Jason D.; Voll, Judith A.; Mukamel, Shaul
2012-01-01
Two types of diagrammatic approaches for the design and simulation of nonlinear optical experiments (closed-time path loops based on the wave function and double-sided Feynman diagrams for the density matrix) are presented and compared. We give guidelines for the assignment of relevant pathways and provide rules for the interpretation of existing nonlinear experiments in carotenoids. PMID:22753822
Visualization of usability and functionality of a professional website through web-mining.
Jones, Josette F; Mahoui, Malika; Gopa, Venkata Devi Pragna
2007-10-11
Functional interface design requires understanding of the information system structure and the user. Web logs record user interactions with the interface, and thus provide some insight into user search behavior and the efficiency of the search process. The present study uses a data-mining approach with techniques such as association rules, clustering, and classification to visualize the usability and functionality of a digital library through in-depth analyses of web logs.
Kister, Alexander
2015-01-01
We present an alternative approach to protein 3D fold prediction based on determining rules that specify the distribution in polypeptide sequences of “favorable” residues, which are mainly responsible for formation of a given fold, and “unfavorable” residues, which are incompatible with that fold. The process of determining favorable and unfavorable residues is iterative. The starting assumptions are based on the general principles of protein structure formation as well as structural features peculiar to the protein fold under investigation. The initial assumptions are tested one by one against the set of all known proteins with a given structure. An assumption is accepted as a “rule of amino acid distribution” for the protein fold if it holds true for all, or nearly all, structures. If the assumption is not accepted as a rule, it can be modified to better fit the data and then tested again in the next step of the iterative search algorithm, or rejected. We determined the set of amino acid distribution rules for a large group of beta-sandwich-like proteins characterized by a specific arrangement of strands in two beta sheets. It was shown that this set of rules is highly sensitive (~90%) and very specific (~99%) for identifying sequences of proteins with the specified beta-sandwich fold. The advantage of the proposed approach is that it does not require that query proteins have a high degree of homology to proteins with known structure. So long as a query protein satisfies the residue distribution rules, it can be confidently assigned to its respective protein fold. Another advantage of our approach is that it allows for a better understanding of which residues play an essential role in protein fold formation. It may, therefore, facilitate rational protein engineering design. PMID:25625198
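A minimal Python sketch of the accept/reject loop the abstract describes; the rule representation (callables over sequences) and the tolerance value are assumptions, and the modify-and-retest step is only indicated in a comment.

```python
def learn_distribution_rules(candidate_rules, fold_sequences, tolerance=0.98):
    """Iterative rule search sketched from the paper's description.
    candidate_rules -- callables seq -> bool encoding an assumed residue
                       distribution (hypothetical placeholders here)
    fold_sequences  -- sequences of proteins known to adopt the target fold
    A candidate is accepted as a rule if it holds for (nearly) all of them."""
    accepted = []
    for rule in candidate_rules:
        hits = sum(rule(seq) for seq in fold_sequences)
        if hits / len(fold_sequences) >= tolerance:
            accepted.append(rule)
        # a rejected assumption would be modified and re-tested in the next
        # iteration of the search; that refinement step is omitted here
    return accepted

def classify(seq, rules):
    """Assign the fold only if every accepted rule is satisfied."""
    return all(rule(seq) for rule in rules)
```

Sensitivity and specificity of the learned rule set can then be measured by running `classify` over positive and negative sequence sets, as the authors report.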
A genetic algorithms approach for altering the membership functions in fuzzy logic controllers
NASA Technical Reports Server (NTRS)
Shehadeh, Hana; Lea, Robert N.
1992-01-01
Through previous work, a fuzzy control system was developed to perform translational and rotational control of a space vehicle. This problem was then re-examined to determine the effectiveness of genetic algorithms in fine-tuning the controller. This paper explains the problems associated with the design of this fuzzy controller and offers a technique for tuning fuzzy logic controllers. A fuzzy logic controller is a rule-based system that uses fuzzy linguistic variables to model human rule-of-thumb approaches to control actions within a given system. This 'fuzzy expert system' features rules that direct the decision process and membership functions that convert the linguistic variables into the precise numeric values used for system control. Defining the fuzzy membership functions is the most time-consuming aspect of controller design. A single change in the membership functions can significantly alter the performance of the controller. Membership function definition can be accomplished by using a trial-and-error technique to alter the membership functions, creating a highly tuned controller, but this approach is time consuming and requires a great deal of knowledge from human experts. In order to shorten development time, an iterative procedure was developed for altering the membership functions to create a tuned set that used a minimal amount of fuel for velocity-vector approach and station-keeping maneuvers. Genetic algorithms, search techniques used for optimization, were utilized to solve this problem.
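As a rough sketch of the tuning idea, not the authors' implementation, the following Python code evolves a vector of membership-function parameters with truncation selection, one-point crossover, and Gaussian mutation; the fitness function is a toy surrogate standing in for the fuel-use simulation.

```python
import random

def fitness(params):
    """Stand-in for the fuel-use simulation the authors ran: lower is better.
    This quadratic surrogate is purely hypothetical."""
    return sum((p - 0.5) ** 2 for p in params)

def genetic_tune(n_params=6, pop_size=20, generations=50, seed=0):
    """Evolve membership-function breakpoints encoded as values in [0, 1]."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(n_params)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                    # best (lowest fuel) first
        survivors = pop[: pop_size // 2]         # truncation selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_params)     # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(n_params)          # Gaussian point mutation
            child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0, 0.05)))
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

print(genetic_tune())
```

In the real setting, `fitness` would run the fuzzy controller through the maneuver simulation and return fuel consumed, which is what makes the GA search worthwhile.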
NASA Astrophysics Data System (ADS)
Magee, Daniel J.; Niemeyer, Kyle E.
2018-03-01
The expedient design of precision components in aerospace and other high-tech industries requires simulations of physical phenomena often described by partial differential equations (PDEs) without exact solutions. Modern design problems require simulations with a level of resolution difficult to achieve in reasonable amounts of time, even in effectively parallelized solvers. Though the scale of the problem relative to available computing power is the greatest impediment to accelerating these applications, significant performance gains can be achieved through careful attention to the details of memory communication and access. The swept time-space decomposition rule reduces communication between sub-domains by exhausting the domain of influence before communicating boundary values. Here we present a GPU implementation of the swept rule, which modifies the algorithm for improved performance on this processing architecture by prioritizing use of private (shared) memory, avoiding interblock communication, and overwriting unnecessary values. It shows significant improvement in the execution time of finite-difference solvers for one-dimensional unsteady PDEs, producing speedups of 2-9× over simple GPU versions and 7-300× over parallel CPU versions across a range of problem sizes. However, for a more sophisticated one-dimensional system of equations discretized with a second-order finite-volume scheme, the swept rule performs 1.2-1.9× worse than a standard implementation for all problem sizes.
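The paper's implementation is a GPU kernel; the numpy sketch below only illustrates the core domain-of-influence idea on one 1D heat-equation block: a sub-domain can advance k sub-steps with no halo exchange, at the cost of a triangularly shrinking valid region. The stencil, variable names, and block layout are assumptions for illustration.

```python
import numpy as np

def swept_triangle(u, r, k):
    """Advance one block of a 1D explicit heat stencil k sub-steps using
    only data the block already owns (no halo exchange).  After sub-step s,
    only cells s..n-1-s hold values at time level s, so the valid region
    shrinks like a triangle -- the swept rule's 'domain of influence'.
    Requires len(u) > 2*k so the triangle does not collapse."""
    u = np.asarray(u, dtype=float).copy()
    n = len(u)
    for s in range(1, k + 1):
        nxt = u.copy()
        nxt[s:n-s] = u[s:n-s] + r * (u[s-1:n-s-1] - 2*u[s:n-s] + u[s+1:n-s+1])
        u = nxt
    return u  # cells k..n-1-k are valid at level k; edges await neighbours

block = np.linspace(0.0, 1.0, 32)
print(swept_triangle(block, r=0.4, k=5)[5:-5])
```

A naive decomposition would exchange halos every sub-step; the swept rule defers that to once per k sub-steps, which is where its communication savings come from.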
NASA Astrophysics Data System (ADS)
Oesterle, Jonathan; Lionel, Amodeo
2018-06-01
The current competitive situation increases the importance of realistically estimating product costs during the early phases of product and assembly line planning projects. In this article, several multi-objective algorithms using different dominance rules are proposed to solve the problem of selecting the most effective combination of product and assembly lines. The list of developed algorithms includes variants of ant colony algorithms, evolutionary algorithms, and imperialist competitive algorithms. The performance of each algorithm and dominance rule is analysed using five multi-objective quality indicators and fifty problem instances. The algorithms and dominance rules are ranked using a non-parametric statistical test.
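The dominance rules studied in the article vary; the baseline they all build on is classic Pareto dominance, which a short Python sketch can make concrete (the objective values here are invented for illustration).

```python
def dominates(a, b):
    """Pareto dominance for minimization: a is no worse than b on every
    objective and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(solutions):
    """Filter a set of objective vectors down to its Pareto front."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# e.g. (cost, cycle time) pairs for candidate product/line combinations
front = non_dominated([(3, 9), (4, 7), (5, 7), (6, 4)])
print(front)  # (5, 7) is dominated by (4, 7) and drops out
```

Alternative dominance rules change only the `dominates` predicate, which is why they can be swapped across the ant colony, evolutionary, and imperialist competitive variants.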
A New Trend-Following Indicator: Using SSA to Design Trading Rules
NASA Astrophysics Data System (ADS)
Leles, Michel Carlo Rodrigues; Mozelli, Leonardo Amaral; Guimarães, Homero Nogueira
Singular Spectrum Analysis (SSA) is a non-parametric approach that can be used to decompose a time series into trends, oscillations, and noise. Trend-following strategies rely on the principle that financial markets move in trends for extended periods of time. Moving Averages (MAs) are the standard indicator used to design such strategies. In this study, SSA is used as an alternative method to enhance trend resolution in comparison with the traditional MA. New trading rules using SSA as the indicator are proposed. This paper shows that for the Dow Jones Industrial Average (DJIA) and Shanghai Securities Composite Index (SSCI) time series, the SSA trading rules provided, in general, better results than MA trading rules.
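A minimal numpy sketch of basic SSA trend extraction (embedding into a trajectory matrix, truncated SVD, diagonal averaging) followed by a naive price/trend crossover signal; the window, rank, and signal definition are assumptions, not the paper's exact trading rules.

```python
import numpy as np

def ssa_trend(x, window, rank=1):
    """Extract a trend with basic SSA: embed the series in a Hankel
    trajectory matrix, keep the leading `rank` singular triplets, and
    reconstruct by averaging over anti-diagonals."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    k = n - window + 1
    X = np.column_stack([x[i:i + window] for i in range(k)])  # trajectory
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :rank] * s[:rank]) @ Vt[:rank]                 # low-rank part
    trend, counts = np.zeros(n), np.zeros(n)
    for j in range(k):                     # Hankel (anti-diagonal) averaging
        trend[j:j + window] += Xr[:, j]
        counts[j:j + window] += 1
    return trend / counts

def crossover_signals(price, trend):
    """Long (1) when price is above the SSA trend, flat (0) otherwise --
    a stand-in for the paper's rules, not their exact definition."""
    return (np.asarray(price) > trend).astype(int)

price = np.cumsum(np.random.default_rng(0).normal(0.05, 1.0, 500)) + 100
print(crossover_signals(price, ssa_trend(price, window=40))[-5:])
```

Raising `rank` keeps more oscillatory components in the reconstruction; `rank=1` gives the smoothest trend and is the closest analogue to a long moving average.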
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-04
... that MPL Orders may interact with Capital Commitment Schedule (``CCS'') interest; (3) NYSE MKT Rule 70... may interact with CCS interest; (3) NYSE MKT Rule 70.25--Equities to permit d-Quotes to be designated... specifically noted otherwise. DMM interest entered via the CCS pursuant to Rule 1000 would not be permitted to...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-20
... securities. The exemption in FINRA Rule 6730(e)(4) is conditioned, among other things, upon a data... 15A(b)(6) of the Act, which requires, among other things, that FINRA rules must be designed to... Commission's Internet comment form ( http://www.sec.gov/rules/sro.shtml ); or Send an e-mail to rule-comments...
Visual exploration and analysis of human-robot interaction rules
NASA Astrophysics Data System (ADS)
Zhang, Hui; Boyles, Michael J.
2013-01-01
We present a novel interaction paradigm for the visual exploration, manipulation and analysis of human-robot interaction (HRI) rules; our development is implemented using a visual programming interface and exploits key techniques drawn from both information visualization and visual data mining to facilitate the interaction design and knowledge discovery process. HRI is often concerned with manipulations of multi-modal signals, events, and commands that form various kinds of interaction rules. Depicting, manipulating and sharing such design-level information is a compelling challenge. Furthermore, the closed loop between HRI programming and knowledge discovery from empirical data is a relatively long cycle. This, in turn, makes design-level verification nearly impossible to perform in an earlier phase. In our work, we exploit a drag-and-drop user interface and visual languages to support depicting responsive behaviors from social participants when they interact with their partners. For our principal test case of gaze-contingent HRI interfaces, this permits us to program and debug the robots' responsive behaviors through a graphical data-flow chart editor. We exploit additional program manipulation interfaces to provide still further improvement to our programming experience: by simulating the interaction dynamics between a human and a robot behavior model, we allow the researchers to generate, trace and study the perception-action dynamics with a social interaction simulation to verify and refine their designs. Finally, we extend our visual manipulation environment with a visual data-mining tool that allows the user to investigate interesting phenomena such as joint attention and sequential behavioral patterns from multiple multi-modal data streams. We have created instances of HRI interfaces to evaluate and refine our development paradigm. As far as we are aware, this paper reports the first program manipulation paradigm that integrates visual programming interfaces, information visualization, and visual data mining methods to facilitate designing, comprehending, and evaluating HRI interfaces.
40 CFR 52.2723 - EPA-approved Puerto Rico regulations.
Code of Federal Regulations, 2012 CFR
2012-07-01
...—Agricultural Burning Authorized 9/28/95; Rule 209—Modification of the Allowed Sulfur-in-Fuel Percentage...; ...—Generic Prohibitions 9/28/95; Rule 402—Open Burning 9/28/95; Rule 403—Visible Emissions 9...; Rule 406—Fuel Burning Equipment 9/28/95; Rule 407—Process Sources 9/28/95; Rule 408...
Methodological issues with adaptation of clinical trial design.
Hung, H M James; Wang, Sue-Jane; O'Neill, Robert T
2006-01-01
Adaptation of clinical trial design generates many issues that have not been resolved for practical applications, though statistical methodology has advanced greatly. This paper focuses on some methodological issues. In one type of adaptation, such as sample size re-estimation, only the postulated value of a parameter used to plan the trial size may be altered. In another type, the originally intended hypothesis for testing may be modified using the internal data accumulated at an interim time of the trial, such as changing the primary endpoint or dropping a treatment arm. For sample size re-estimation, we contrast an adaptive test that weights the two-stage test statistics with the statistical information given by the original design against the original sample mean test with a properly corrected critical value. We point out the difficulty of planning a confirmatory trial based on the crude information generated by exploratory trials. Regarding selection of a primary endpoint, we argue that a selection process that allows switching from one endpoint to the other with the internal data of the trial is not very likely to gain a power advantage over the simple process of selecting one of the two endpoints by testing them with an equal split of alpha (Bonferroni adjustment). For dropping a treatment arm, distributing the remaining sample size of the discontinued arm to the other treatment arms can substantially improve the statistical power of identifying a superior treatment arm in the design. A common difficult methodological issue is how to select an adaptation rule at the trial planning stage. Pre-specification of the adaptation rule is important for practicality.
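To make the endpoint-selection comparison concrete, here is a toy Monte Carlo estimate of the power of the simple Bonferroni process described above (each endpoint tested one-sided at alpha/2 = 0.025, i.e. z > 1.96); the unit-variance normal model and effect sizes are illustrative assumptions only.

```python
import random
from math import sqrt

def power_equal_split(mu1, mu2, n, sims=5000, seed=3):
    """Monte Carlo power when each of two endpoints gets a one-sided
    z-test at alpha/2 = 0.025: reject if either z-statistic exceeds 1.96.
    Standardized effect sizes mu1, mu2 and sample size n are hypothetical."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(sims):
        z1 = mu1 * sqrt(n) + rng.gauss(0, 1)   # z-statistic, endpoint 1
        z2 = mu2 * sqrt(n) + rng.gauss(0, 1)   # z-statistic, endpoint 2
        wins += (z1 > 1.96) or (z2 > 1.96)
    return wins / sims

print(power_equal_split(mu1=0.3, mu2=0.2, n=100))
```

The paper's point is that a data-driven switch between endpoints is unlikely to beat this simple split by much, once the switching rule's own penalty is accounted for.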
Designing 4H-SiC P-shielding trench gate MOSFET to optimize on-off electrical characteristics
NASA Astrophysics Data System (ADS)
Kyoung, Sinsu; Hong, Young-sung; Lee, Myung-hwan; Nam, Tae-jin
2018-02-01
In order to improve specific on-resistance (Ron,sp), the trench gate structure was introduced into 4H-SiC MOSFETs, as it was in Si MOSFETs. However, the 4H-SiC trench gate has worse off-state characteristics than the Si trench gate due to the incomplete gate oxidation process (Šimonka et al., 2017). To overcome this problem, the P-shielding trench gate MOSFET (TMOS) was proposed and studied in previous work. The P-shielding must be designed at the minimum design rule in order to protect the gate oxide effectively, and P-shielding TMOS then suffers degraded on-state characteristics in exchange for the off-state improvement. An optimized design is therefore needed to satisfy both on- and off-state characteristics. In this paper, the design parameters were analyzed and optimized so that the 4H-SiC P-shielding TMOS satisfies both. Design limitations were proposed such that the P-shielding is able to protect the gate oxide: the P-shielding layer should have the proper junction depth and concentration to shield the gate oxide from the electric field during the off state. However, excessive P-shielding junction depth disturbs the on-state current flow, a problem which can be solved by increasing the trench depth. As trench depth increases, however, the breakdown voltage decreases; trench depth must therefore be designed with due consideration of the on-off trade-off. To this end, design conditions and modeling were proposed which allow the P-shielding to operate without degrading the on-state characteristics. Based on this proposed model, a 1200 V 4H-SiC P-shielding trench gate MOSFET was designed and optimized.
Jacob, Louis; Uvarova, Maria; Boulet, Sandrine; Begaj, Inva; Chevret, Sylvie
2016-06-02
Multi-arm multi-stage designs compare several new treatments to a common reference, selecting or dropping treatment arms at interim analyses as evidence accumulates. We redesigned a Bayesian adaptive design initially proposed for dose-finding, focusing on the comparison of multiple experimental drugs to a control on a binary criterion measure. We redesigned a phase II clinical trial that randomly allocates patients across three (one control and two experimental) treatment arms to assess dropping decision rules. We were interested in dropping any arm due to futility, either based on the historical control rate (first rule) or on comparison across arms (second rule), and in stopping an experimental arm for its ability to reach a sufficient response rate (third rule), using the difference of response probabilities between the treated and control arms in Bayes binomial trials as the measure of treatment benefit. Simulations were then conducted to investigate the decision operating characteristics under a variety of plausible scenarios, as a function of the decision thresholds. Our findings suggest that one experimental treatment was less efficient than the control and could have been dropped from the trial based on a sample of approximately 20 instead of 40 patients. In the simulation study, stopping decisions were reached sooner for the first rule than for the second rule, with close mean estimates of response rates and small bias. According to the decision threshold, the mean sample size to detect the required 0.15 absolute benefit ranged from 63 to 70 (rule 3), with false negative rates of less than 2% (rule 1) up to 6% (rule 2). In contrast, detecting a 0.15 inferiority in response rates required a sample size ranging on average from 23 to 35 (rules 1 and 2, respectively), with a false positive rate ranging from 3.6 to 0.6% (rule 3). Adaptive trial design is a good way to improve clinical trials: it allows removing ineffective drugs and reducing the trial sample size while maintaining unbiased estimates. Decision thresholds can be set according to predefined fixed error decision rates. ClinicalTrials.gov Identifier: NCT01342692.
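A sketch of the kind of Bayes binomial comparison underlying such dropping rules: the posterior probability that the experimental response rate exceeds the control rate under independent Beta(1, 1) priors. The priors, counts, and the 0.10 futility threshold below are hypothetical, not the trial's actual settings.

```python
import random

def prob_superior(x_t, n_t, x_c, n_c, draws=10000, seed=7):
    """Posterior P(p_treatment > p_control) by Monte Carlo, with
    independent Beta(1, 1) priors on each arm's response probability.
    x_*/n_* are responders/patients observed on each arm at the interim."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(draws):
        pt = rng.betavariate(1 + x_t, 1 + n_t - x_t)   # treatment posterior
        pc = rng.betavariate(1 + x_c, 1 + n_c - x_c)   # control posterior
        wins += pt > pc
    return wins / draws

# drop an arm for futility if the posterior probability of any benefit
# falls below a preset threshold (0.10 here is an illustrative choice)
if prob_superior(3, 20, 8, 20) < 0.10:
    print("drop experimental arm for futility")
```

The trial's three rules correspond to different comparators and thresholds applied to posterior quantities of this form at each interim look.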
SOI-CMOS Process for Monolithic, Radiation-Tolerant, Science-Grade Imagers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, George; Lee, Adam
In Phase I, Voxtel worked with Jazz and Sandia to document and simulate the processes necessary to implement a DH-BSI SOI CMOS imaging process. The development is based upon mature SOI CMOS processes at both fabs, with the addition of only a few custom processing steps for integration and electrical interconnection of the fully-depleted photodetectors. In Phase I, Voxtel also characterized the Sandia process, including the CMOS7 design rules, and we developed the outline of a process option that included a “BOX etch”, which will permit a “detector in handle” SOI CMOS process to be developed. The process flows were developed in cooperation with both Jazz and Sandia process engineers, along with detailed TCAD modeling and testing of the photodiode array architectures. In addition, Voxtel tested the radiation performance of Jazz's CA18HJ process, using standard and circular-enclosed transistors.
Perez, Raymond; Archdeacon, Patrick; Roach, Nancy; Goodwin, Robert; Jarow, Jonathan; Stuccio, Nina; Forrest, Annemarie
2017-06-01
The Food and Drug Administration's final rule on investigational new drug application safety reporting, effective from 28 March 2011, clarified the reporting requirements for serious and unexpected suspected adverse reactions occurring in clinical trials. The Clinical Trials Transformation Initiative released recommendations in 2013 to assist implementation of the final rule; however, anecdotal reports and data from a Food and Drug Administration audit indicated that a majority of reports being submitted were still uninformative and did not result in actionable changes. Clinical Trials Transformation Initiative investigated remaining barriers and potential solutions to full implementation of the final rule by polling and interviewing investigators, clinical research staff, and sponsors. In an opinion-gathering effort, two discrete online surveys designed to assess challenges and motivations related to management of expedited (7- to 15-day) investigational new drug safety reporting processes in oncology trials were developed and distributed to two populations: investigators/clinical research staff and sponsors. Data were collected for approximately 1 year. Twenty hour-long interviews were also conducted with Clinical Trials Transformation Initiative-nominated interview participants who were considered as having extensive knowledge of and experience with the topic. Interviewees included 13 principal investigators/study managers/research team members and 7 directors/vice presidents of pharmacovigilance operations from 5 large global pharmaceutical companies. The investigative site's responses indicate that too many individual reports are still being submitted, which are time-consuming to process and provide little value for patient safety assessments or for informing actionable changes. Fewer but higher quality reports would be more useful, and the investigator and staff would benefit from sponsors' "filtering" of reports and increased sponsor communication. Sponsors replied that their greatest challenges include (1) lack of global harmonization in reporting rules, (2) determining causality, and (3) fear of regulatory repercussions. Interaction with the Food and Drug Administration has helped improve sponsors' adherence to the final rule, and sponsors would benefit from increased communication with the Food and Drug Administration and educational materials. The goal of the final rule is to minimize uninformative safety reports so that important safety signals can be captured and communicated early enough in a clinical program to make changes that help ensure patient safety. Investigative staff and sponsors acknowledge that the rule has not been fully implemented although they agree with the intention. Clinical Trials Transformation Initiative will use the results from the surveys and interviews to develop new recommendations and educational materials that will be available to sponsors to increase compliance with the final rule and facilitate discussion between sponsors, investigators, and Food and Drug Administration representatives.
Iveson, Matthew H; Della Sala, Sergio; Anderson, Mike; MacPherson, Sarah E
2017-05-01
Goal maintenance is the process by which task rules and instructions are kept active so that they can exert control over behavior. When this process fails, an individual may ignore a rule while performing a task despite being able to describe it after task completion. Previous research has suggested that the goal maintenance system is limited by the number of concurrent rules that can be maintained during a task, and that this limit depends on an individual's level of fluid intelligence. However, the speed at which an individual can process information may also limit their ability to use task rules when the task demands them. In the present study, four experiments manipulated the number of instructions to be maintained by younger and older adults and examined whether performance on a rapid letter-monitoring task was predicted by individual differences in fluid intelligence or processing speed. Fluid intelligence played little role in determining how frequently rules were ignored during the task, regardless of the number of rules to be maintained. In contrast, processing speed predicted the rate of goal neglect in older adults, where increasing the presentation rate of the letter-monitoring task increased goal neglect. These findings suggest that goal maintenance may be limited by the speed at which it can operate. Copyright © 2017. Published by Elsevier B.V.
Design principles for therapeutic angiogenic materials
NASA Astrophysics Data System (ADS)
Briquez, Priscilla S.; Clegg, Lindsay E.; Martino, Mikaël M.; Gabhann, Feilim Mac; Hubbell, Jeffrey A.
2016-01-01
Despite extensive research, pro-angiogenic drugs have failed to translate clinically, and therapeutic angiogenesis, which has potential in the treatment of various cardiovascular diseases, remains a major challenge. Physiologically, angiogenesis — the process of blood-vessel growth from existing vasculature — is regulated by a complex interplay of biophysical and biochemical cues from the extracellular matrix (ECM), angiogenic factors and multiple cell types. The ECM can be regarded as the natural 3D material that regulates angiogenesis. Here, we leverage knowledge of ECM properties to derive design rules for engineering pro-angiogenic materials. We propose that pro-angiogenic materials should be biomimetic, incorporate angiogenic factors and mimic cooperative interactions between growth factors and the ECM. We highlight examples of material designs that demonstrate these principles and considerations for designing better angiogenic materials.
NASA Astrophysics Data System (ADS)
Sánchez, H. T.; Estrems, M.; Franco, P.; Faura, F.
2009-11-01
In recent years, the market for heat exchangers has increasingly demanded new products with short cycle times, which means that both the design and manufacturing stages must be drastically shortened. The design stage can be shortened by means of CAD-based parametric design techniques. The methodology presented in this paper is based on optimized control of the geometric parameters of a heat exchanger service chamber by means of the Application Programming Interface (API) provided by the Solidworks CAD package. Using this implementation, a set of different design configurations of the service chamber, made of stainless steel AISI 316, is studied by means of the FE method. As a result of this study, a set of knowledge rules based on fatigue behaviour is constructed and integrated into the design optimization process.
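The authors drive Solidworks through its API; to avoid inventing that API here, the Python sketch below shows only the generic parameter-sweep skeleton, with a hypothetical `evaluate_fatigue` callable standing in for CAD regeneration plus the FE fatigue run. All parameter names and ranges are illustrative assumptions.

```python
from itertools import product

def sweep_chamber_designs(evaluate_fatigue, lengths, radii, thicknesses):
    """Generic parametric sweep over chamber geometry.
    evaluate_fatigue -- hypothetical callable (L, r, t) -> fatigue life,
    wrapping the CAD regeneration and FE analysis the authors automate
    through the Solidworks API (not shown here)."""
    results = []
    for L, r, t in product(lengths, radii, thicknesses):
        life = evaluate_fatigue(L, r, t)          # one CAD + FE iteration
        results.append(((L, r, t), life))
    results.sort(key=lambda item: item[1], reverse=True)
    return results                                 # best designs first

# toy stand-in for the FE fatigue evaluation, purely illustrative
fake_fe = lambda L, r, t: t * 1e4 - abs(L - 0.5) * 1e3 - r * 50
best = sweep_chamber_designs(fake_fe, [0.4, 0.5, 0.6], [0.05, 0.08], [4, 6, 8])
print(best[0])
```

Knowledge rules of the kind the paper describes can then be mined from the ranked results, e.g. thresholds on wall thickness below which fatigue life collapses.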
50 CFR 424.20 - Emergency rules.
Code of Federal Regulations, 2010 CFR
2010-10-01
... OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A LISTING ENDANGERED AND THREATENED SPECIES AND DESIGNATING CRITICAL HABITAT Revision of the Lists § 424.20 Emergency rules. (a) Sections 424...-being of a species of fish, wildlife, or plant. Such rules shall, at the discretion of the Secretary...
50 CFR 424.20 - Emergency rules.
Code of Federal Regulations, 2014 CFR
2014-10-01
... OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A LISTING ENDANGERED AND THREATENED SPECIES AND DESIGNATING CRITICAL HABITAT Revision of the Lists § 424.20 Emergency rules. (a) Sections 424...-being of a species of fish, wildlife, or plant. Such rules shall, at the discretion of the Secretary...
50 CFR 424.18 - Final rules-general.
Code of Federal Regulations, 2011 CFR
2011-10-01
... OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A LISTING ENDANGERED AND THREATENED SPECIES AND DESIGNATING CRITICAL HABITAT Revision of the Lists § 424.18 Final rules—general. (a) Contents... any conservation measures available under the rule. Publication of a final rule to list, delist, or...
50 CFR 424.20 - Emergency rules.
Code of Federal Regulations, 2011 CFR
2011-10-01
... OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A LISTING ENDANGERED AND THREATENED SPECIES AND DESIGNATING CRITICAL HABITAT Revision of the Lists § 424.20 Emergency rules. (a) Sections 424...-being of a species of fish, wildlife, or plant. Such rules shall, at the discretion of the Secretary...
50 CFR 424.20 - Emergency rules.
Code of Federal Regulations, 2012 CFR
2012-10-01
... OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A LISTING ENDANGERED AND THREATENED SPECIES AND DESIGNATING CRITICAL HABITAT Revision of the Lists § 424.20 Emergency rules. (a) Sections 424...-being of a species of fish, wildlife, or plant. Such rules shall, at the discretion of the Secretary...