Redundancy checking algorithms based on parallel novel extension rule
NASA Astrophysics Data System (ADS)
Liu, Lei; Yang, Yang; Li, Guangli; Wang, Qi; Lü, Shuai
2017-05-01
Redundancy checking (RC) is a key knowledge-reduction technology. The extension rule (ER) is a reasoning method first presented in 2003 that has been well received internationally; the novel extension rule (NER), presented in 2009, is an improved ER-based reasoning method. In this paper, we first analyse the characteristics of the extension rule and then present a simple algorithm for redundancy checking based on the extension rule (RCER). In addition, we introduce MIMF, a heuristic strategy. Using this rule and strategy, we design and implement the RCHER algorithm, which relies on MIMF. Next, we design and implement RCNER (redundancy checking based on NER). Because NER exhibits weak dependence among tasks during execution, parallel computing can greatly accelerate it; accordingly, we present PNER (parallel NER) and apply it to redundancy checking and necessity checking, designing and implementing the RCPNER (redundancy checking based on PNER) and NCPNER (necessary clause partition based on PNER) algorithms. The experimental results show that MIMF significantly accelerates RCER on large-scale, highly redundant formulae. Comparing PNER with NER and RCPNER with RCNER, the average speedup can approach the number of tasks into which the work is decomposed. Comparing NCPNER with an RCNER-based algorithm for separating redundant formulae, the speedup increases steadily with formula size. Finally, we describe the challenges the extension rule will face and suggest possible solutions.
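The core test behind redundancy checking can be made concrete: a clause C is redundant in a formula F exactly when F \ {C} entails C, i.e. when (F \ {C}) AND NOT C is unsatisfiable. The sketch below illustrates this definition with a brute-force satisfiability check in Python; it is a minimal stand-in for the abstract's ER/NER-based algorithms, not the authors' implementation.

```python
from itertools import product

def satisfiable(clauses, variables):
    """Brute-force SAT check: clauses are sets of signed literals, e.g. {1, -2}."""
    for assign in product([False, True], repeat=len(variables)):
        val = dict(zip(variables, assign))
        if all(any(val[abs(l)] == (l > 0) for l in c) for c in clauses):
            return True
    return False

def is_redundant(clauses, clause):
    """clause is redundant in clauses iff (clauses - {clause}) AND NOT clause is UNSAT."""
    rest = [c for c in clauses if c != clause]
    # the negation of a clause is the set of unit clauses of its negated literals
    negated = [{-l} for l in clause]
    variables = sorted({abs(l) for c in clauses for l in c})
    return not satisfiable(rest + negated, variables)

# Example: in {x1}, {x1 or x2}, the second clause is implied by the first.
F = [{1}, {1, 2}]
print(is_redundant(F, {1, 2}))  # True
print(is_redundant(F, {1}))     # False
```

A practical checker would replace the truth-table loop with a SAT solver or with extension-rule reasoning; the entailment test itself is unchanged.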
Checking-up of optical graduated rules by laser interferometry
NASA Astrophysics Data System (ADS)
Miron, Nicolae P.; Sporea, Dan G.
1996-05-01
The main aspects of the operating principle, design, and implementation of high-productivity equipment for checking the graduation accuracy of optical graduated rules, used as length references in optical measuring instruments for precision machine tools, are presented. The graduation error is checked with a Michelson interferometer serving as the length transducer. The instrument is managed by a computer, which controls the equipment, data acquisition, and processing. The evaluation is performed for rule lengths from 100 to 3000 mm, with a checking error of less than 2 micrometers/m. The checking time is about 15 min for a 1000-mm rule, with averaging over four measurements.
A rigorous approach to self-checking programming
NASA Technical Reports Server (NTRS)
Hua, Kien A.; Abraham, Jacob A.
1986-01-01
Self-checking programming is shown to be an effective concurrent error detection technique. The reliability of a self-checking program, however, relies on the quality of its assertion statements, and a self-checking program written without formal guidelines may provide poor error coverage. A constructive technique for self-checking programming is presented. A Structured Program Design Language (SPDL) suitable for self-checking software development is defined, along with a set of formal rules that allows SPDL designs to be transformed into self-checking designs in a systematic manner.
Verifying Architectural Design Rules of the Flight Software Product Line
NASA Technical Reports Server (NTRS)
Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen
2009-01-01
This paper presents experiences of verifying the architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps a) identify architecturally significant deviations that eluded code reviews, b) clarify the design rules to the team, and c) assess the overall implementation quality. Furthermore, it helps connect business goals to architectural principles and, through them, to the implementation. This paper is a first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.
Litho hotspots fixing using model based algorithm
NASA Astrophysics Data System (ADS)
Zhang, Meili; Yu, Shirui; Mao, Zhibiao; Shafee, Marwa; Madkour, Kareem; ElManhawy, Wael; Kwan, Joe; Hu, Xinyi; Wan, Qijian; Du, Chunshan
2017-04-01
As technology advances, IC designs become more sophisticated, making it more critical and challenging to fix printability issues in the design flow. Running lithography checks before tapeout is now mandatory for designers, which creates a need for more advanced and easy-to-use techniques for fixing hotspots found after lithographic simulation without creating a new design rule checking (DRC) violation or generating a new hotspot. This paper presents a new methodology for fixing hotspots in layouts using the same engine currently used to detect them. The fix is achieved by applying the minimum movement of the edges causing the hotspot, subject to DRC constraints. Each fix is internally simulated by the lithographic simulation engine to verify that the hotspot is eliminated and that no new hotspot is generated by the new edge locations. Hotspot fix checking is enhanced by adding DRC checks to the litho-friendly design (LFD) rule file to guarantee that any fix options violating DRC checks are removed from the output hint file. This extra checking eliminates the need to re-run both DRC and LFD checks to confirm that the change fixed the hotspot, which saves time and simplifies the designer's workflow. The methodology is demonstrated on industrial designs, and the fixing rate for single- and dual-layer hotspots is reported.
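The fixing step described above, moving hotspot edges minimally subject to DRC and re-simulation, can be sketched abstractly. In this illustrative Python fragment, `drc_ok` and `litho_ok` are hypothetical callbacks standing in for the DRC engine and the lithographic simulation engine; the real flow operates on layout geometry, not scalar edge positions.

```python
def minimal_fix(edge_pos, candidate_moves, drc_ok, litho_ok):
    """Pick the smallest edge movement that clears the hotspot without
    introducing a DRC violation. drc_ok/litho_ok are assumed callbacks
    standing in for the DRC engine and the lithographic simulation."""
    for move in sorted(candidate_moves, key=abs):  # try the smallest moves first
        new_pos = edge_pos + move
        if drc_ok(new_pos) and litho_ok(new_pos):
            return new_pos
    return None  # no legal fix among the candidates

# Toy example: the hotspot clears for positions >= 12, DRC allows <= 14.
print(minimal_fix(10, [-2, -1, 1, 2, 3, 4],
                  drc_ok=lambda x: x <= 14,
                  litho_ok=lambda x: x >= 12))  # 12
```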
NASA Technical Reports Server (NTRS)
1973-01-01
Design criteria and recommended practices for liquid rocket pressure regulators, relief valves, check valves, burst disks, and explosive valves are presented for effective use in design. A review of the total design problem identifies the elements involved in successful design, and current technology pertaining to these elements is described. The design criteria state what rule or standard must be imposed on each essential design element to assure successful design; they serve as a checklist of rules for a project manager to use in guiding a design or in assessing its adequacy. The recommended practices state how to satisfy each of the criteria.
Cycle time reduction by Html report in mask checking flow
NASA Astrophysics Data System (ADS)
Chen, Jian-Cheng; Lu, Min-Ying; Fang, Xiang; Shen, Ming-Feng; Ma, Shou-Yuan; Yang, Chuen-Huei; Tsai, Joe; Lee, Rachel; Deng, Erwin; Lin, Ling-Chieh; Liao, Hung-Yueh; Tsai, Jenny; Bowhill, Amanda; Vu, Hien; Russell, Gordon
2017-07-01
The Mask Data Correctness Check (MDCC) is a reticle-level, multi-layer, DRC-like check that evolved from the mask rule check (MRC). The MDCC uses the extended job deck (EJB) to achieve mask composition and to perform a detailed check of the positioning and integrity of each component of the reticle. Different design patterns on the mask are mapped to different layers, so users can review the whole reticle and check the interactions between different designs before the final mask pattern file is available. However, many types of MDCC check results, such as errors from overlapping patterns, usually have very large, complex-shaped highlighted areas covering the boundary of the design. Users have to load the result OASIS file, overlay it on the original database assembled in the MDCC process in a layout viewer, and then search for the details of the check results. We introduce a quick result-reviewing method based on an HTML report generated by Calibre® RVE. In the report generation process, we analyze and extract the essential part of the result OASIS file into a result database (RDB) file using standard verification rule format (SVRF) commands. Calibre® RVE automatically loads the assembled reticle pattern and generates screenshots of the check results. All these processes are triggered automatically as soon as the MDCC process finishes. Users just have to open the HTML report to get the information they need: for example, the check summary, captured images of results, and their coordinates.
A study of applications scribe frame data verifications using design rule check
NASA Astrophysics Data System (ADS)
Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki
2013-06-01
In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables for scanner alignment, wafer inspection, and customer-specified marks, and we check at the end that the scribe frame design conforms to the specifications of the alignment and inspection marks. Recently, in COT (customer-owned tooling) business and new technology development, there has been no effective verification method for scribe frame data, and verification takes a lot of time. We therefore tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which is used in device verification. We show the scheme of scribe frame data verification using DRC that we applied. First, verification rules are created based on the specifications of the scanner, inspection, and other requirements, and a mark library is created for pattern matching. Next, DRC verification, which includes pattern matching against the mark library, is performed on the scribe frame data. Our experiments demonstrated that, by using pattern matching and DRC verification, the new method yields speed improvements of more than 12 percent compared with conventional visual inspection of marks, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks, is easy to maintain, and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.
Integrated layout based Monte-Carlo simulation for design arc optimization
NASA Astrophysics Data System (ADS)
Shao, Dongbing; Clevenger, Larry; Zhuang, Lei; Liebmann, Lars; Wong, Robert; Culp, James
2016-03-01
Design rules are created by considering a wafer fail mechanism with the relevant design levels under various design cases, and the values are set to cover the worst-case scenario. Because of this simplification and generalization, design rules hinder, rather than help, dense device scaling; as an example, SRAM designs always need extensive ground rule waivers. Furthermore, dense design often involves a "design arc", a collection of design rules whose sum equals the critical pitch defined by the technology. In a design arc, a single rule change can lead to a chain reaction of other rule violations. In this paper we present a methodology using Layout Based Monte-Carlo Simulation (LBMCS) with integrated multiple ground rule checks. We apply this methodology to the SRAM word line contact, and the result is a layout with balanced wafer fail risks based on Process Assumptions (PAs). This work was performed at the IBM Microelectronics Division, Semiconductor Research and Development Center, Hopewell Junction, NY 12533.
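A layout-based Monte-Carlo check of a single spacing rule can be sketched as follows; all numbers (nominal spacing, variation sigma, rule value) are illustrative assumptions, not values from the paper, and the real LBMCS methodology evaluates multiple ground rules against calibrated process assumptions.

```python
import random

def fail_probability(nominal_space, sigma, min_space, trials=100_000, seed=0):
    """Monte-Carlo estimate of the probability that two independently
    varying facing edges violate a minimum-spacing rule (units: nm).
    Illustrative only; the nominal value and sigma are assumptions."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(trials):
        # each facing edge shifts by an independent Gaussian process variation
        space = nominal_space + rng.gauss(0, sigma) - rng.gauss(0, sigma)
        if space < min_space:
            fails += 1
    return fails / trials

# Within a fixed pitch, relaxing one rule of a design arc tightens another;
# the estimated fail probability quantifies the risk of each split.
print(fail_probability(nominal_space=30, sigma=2.5, min_space=24))
```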
Advanced manufacturing rules check (MRC) for fully automated assessment of complex reticle designs
NASA Astrophysics Data System (ADS)
Gladhill, R.; Aguilar, D.; Buck, P. D.; Dawkins, D.; Nolke, S.; Riddick, J.; Straub, J. A.
2005-11-01
Advanced electronic design automation (EDA) tools, with their simulation, modeling, design rule checking, and optical proximity correction capabilities, have facilitated the improvement of first-pass wafer yields. While the data produced by these tools may have been processed for optimal wafer manufacturing, the same data can be far from ideal for photomask manufacturing, particularly at the lithography and inspection stages, resulting in production delays and increased costs. The same EDA tools used to produce the data can be used to detect potential photomask manufacturing problems in the data. A production implementation of automated photomask manufacturing rule checking (MRC) is presented and discussed for various photomask lithography and inspection lines. This paper focuses on identifying data that may cause production delays at the mask inspection stage. It is shown how photomask MRC can be used to discover data-related problems prior to inspection, separating jobs that are likely to have problems at inspection from those that are not. Photomask MRC can also be used to identify geometries requiring adjustment of inspection parameters for optimal inspection, and to assist with any special handling or change-of-routing requirements. With this foreknowledge, steps can be taken to avoid production delays that increase manufacturing costs. Finally, the data flow implemented for MRC can be used as a platform for other photomask data preparation tasks.
LTSA Conformance Testing to Architectural Design of LMS Using Ontology
ERIC Educational Resources Information Center
Sengupta, Souvik; Dasgupta, Ranjan
2017-01-01
This paper proposes a new methodology for checking conformance of the software architectural design of Learning Management System (LMS) to Learning Technology System Architecture (LTSA). In our approach, the architectural designing of LMS follows the formal modeling style of Acme. An ontology is built to represent the LTSA rules and the software…
14 CFR 125.296 - Training, testing, and checking conducted by training centers: Special rules.
Code of Federal Regulations, 2010 CFR
2010-01-01
... AIRCRAFT Flight Crewmember Requirements § 125.296 Training, testing, and checking conducted by training centers: Special rules. A crewmember who has successfully completed training, testing, or checking in... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Training, testing, and checking conducted...
14 CFR 125.296 - Training, testing, and checking conducted by training centers: Special rules.
Code of Federal Regulations, 2012 CFR
2012-01-01
... AIRCRAFT Flight Crewmember Requirements § 125.296 Training, testing, and checking conducted by training centers: Special rules. A crewmember who has successfully completed training, testing, or checking in... 14 Aeronautics and Space 3 2012-01-01 2012-01-01 false Training, testing, and checking conducted...
14 CFR 125.296 - Training, testing, and checking conducted by training centers: Special rules.
Code of Federal Regulations, 2011 CFR
2011-01-01
... AIRCRAFT Flight Crewmember Requirements § 125.296 Training, testing, and checking conducted by training centers: Special rules. A crewmember who has successfully completed training, testing, or checking in... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Training, testing, and checking conducted...
14 CFR 125.296 - Training, testing, and checking conducted by training centers: Special rules.
Code of Federal Regulations, 2014 CFR
2014-01-01
... AIRCRAFT Flight Crewmember Requirements § 125.296 Training, testing, and checking conducted by training centers: Special rules. A crewmember who has successfully completed training, testing, or checking in... 14 Aeronautics and Space 3 2014-01-01 2014-01-01 false Training, testing, and checking conducted...
14 CFR 125.296 - Training, testing, and checking conducted by training centers: Special rules.
Code of Federal Regulations, 2013 CFR
2013-01-01
... AIRCRAFT Flight Crewmember Requirements § 125.296 Training, testing, and checking conducted by training centers: Special rules. A crewmember who has successfully completed training, testing, or checking in... 14 Aeronautics and Space 3 2013-01-01 2013-01-01 false Training, testing, and checking conducted...
Reliability based design of the primary structure of oil tankers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Casella, G.; Dogliani, M.; Guedes Soares, C.
1996-12-31
The present paper describes the reliability analysis carried out for two oil tankers of comparable dimensions but different design. The scope of the analysis was to derive indications of the reliability index obtained for existing, typical, well-designed oil tankers, and to apply the tentative rule-checking formulation developed within the CEC-funded SHIPREL project. The checking formula was used to redesign the midship section of one of the ships, upgrading her to meet the target failure probability considered in the rule development process. The resulting structure, through an upgrading of the steel grade in the central part of the deck, led to a suitable reliability level. The results of the analysis clearly showed that a large scatter currently exists in the design safety levels of ships, even when the Classification Societies' unified requirements are satisfied. A reliability-based approach to calibrating the rules for the global strength of ships is therefore proposed, to assist designers and Classification Societies in producing ships that are better optimized with respect to ensured safety levels. Based on the work reported in the paper, the feasibility and usefulness of a reliability-based approach in the development of ship longitudinal strength requirements has been demonstrated.
Using pattern enumeration to accelerate process development and ramp yield
NASA Astrophysics Data System (ADS)
Zhuang, Linda; Pang, Jenny; Xu, Jessy; Tsai, Mengfeng; Wang, Amy; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh; Ding, Hua
2016-03-01
During the process setup phase of a new technology node, foundries do not initially have enough product chip designs to conduct exhaustive process development, so different operational teams use manually designed, simple test keys to set up their process flows and recipes. When the first version of the design rule manual (DRM) is ready, foundries enter the process development phase, where new experimental design data is created manually based on these design rules. However, these IP/test keys contain very uniform or simple design structures; such designs normally do not contain critical design structures or process-unfriendly design patterns that pass design rule checks but prove less manufacturable. A method is therefore needed to generate, at the development stage, exhaustive test patterns allowed by the design rules, in order to verify the gap between the design rules and the process. This paper presents a novel method of generating test key patterns which contain known problematic patterns as well as any constructs which designers could possibly draw under the current design rules. The enumerated test key patterns contain the most critical design structures allowed by any particular design rule. A layout profiling method is used to analyse incoming product chips in order to find potential weak points, so that the fab can take preemptive action to avoid yield loss. This is achieved by comparing different products and leveraging the knowledge learned from previously manufactured chips to find possible yield detractors.
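As a toy illustration of rule-driven pattern enumeration, the sketch below exhaustively generates one-dimensional width/space sequences that fill a fixed window under minimum-width and minimum-spacing rules; real enumeration works on two-dimensional layout constructs, and the rule values here are assumptions.

```python
def enumerate_patterns(window, min_width, min_space, grid=1):
    """Enumerate alternating width/space sequences (starting and ending on a
    shape width) that exactly fill `window`, under simple min-width and
    min-space rules. A 1-D stand-in for layout pattern enumeration."""
    results = []

    def extend(seq, remaining, need_width):
        # a complete pattern ends on a width (need_width has just flipped off)
        if remaining == 0 and not need_width and seq:
            results.append(tuple(seq))
            return
        lo = min_width if need_width else min_space
        for size in range(lo, remaining + 1, grid):
            extend(seq + [size], remaining - size, not need_width)

    extend([], window, True)
    return results

# Every legal alternating width/space fill of a 10 nm window:
for p in enumerate_patterns(window=10, min_width=3, min_space=2):
    print(p)
```

Enumerating every legal construct in this way is what guarantees the test keys cover patterns "allowed by design rules" that hand-drawn test structures would miss.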
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-25
... Proposed Rule on Enhanced Weapons, Firearms Background Checks, and Security Event Notifications AGENCY... the proposed enhanced weapons rule, the two draft regulatory guides, and the draft weapons safety.... No formal comments on the proposed enhanced weapons rule or the draft guidance documents will be...
Knowledge-based critiquing of graphical user interfaces with CHIMES
NASA Technical Reports Server (NTRS)
Jiang, Jianping; Murphy, Elizabeth D.; Carter, Leslie E.; Truszkowski, Walter F.
1994-01-01
CHIMES is a critiquing tool that automates the process of checking graphical user interface (GUI) designs for compliance with human factors design guidelines and toolkit style guides. The current prototype identifies instances of non-compliance and presents problem statements, advice, and tips to the GUI designer. Changes requested by the designer are made automatically, and the revised GUI is re-evaluated. A case study conducted at NASA-Goddard showed that CHIMES has the potential for dramatically reducing the time formerly spent in hands-on consistency checking. Capabilities recently added to CHIMES include exception handling and rule building. CHIMES is intended for use prior to usability testing as a means, for example, of catching and correcting syntactic inconsistencies in a larger user interface.
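A guideline-critiquing loop of the kind CHIMES automates can be sketched generically; the widget attributes, guideline predicates, and thresholds below are hypothetical, not CHIMES's actual rule base.

```python
def critique(widgets, guidelines):
    """Apply each (name, predicate, advice) guideline to every widget
    description and collect a problem statement for each failure.
    Attribute names and thresholds are illustrative assumptions."""
    findings = []
    for w in widgets:
        for name, pred, advice in guidelines:
            if not pred(w):  # the widget violates this guideline
                findings.append((w["id"], name, advice))
    return findings

guidelines = [
    ("min-font", lambda w: w.get("font_pt", 12) >= 10,
     "increase font size for readability"),
    ("label-present", lambda w: bool(w.get("label")),
     "every control needs a label"),
]
widgets = [{"id": "ok_btn", "font_pt": 8, "label": "OK"},
           {"id": "entry1", "font_pt": 12, "label": ""}]
for finding in critique(widgets, guidelines):
    print(finding)
```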
14 CFR 91.1075 - Training program: Special rules.
Code of Federal Regulations, 2011 CFR
2011-01-01
... (CONTINUED) AIR TRAFFIC AND GENERAL OPERATING RULES GENERAL OPERATING AND FLIGHT RULES Fractional Ownership... the following are eligible under this subpart to conduct training, testing, and checking under... chapter to conduct training, testing, and checking required by this subpart if the training center— (1...
14 CFR 91.1075 - Training program: Special rules.
Code of Federal Regulations, 2012 CFR
2012-01-01
... (CONTINUED) AIR TRAFFIC AND GENERAL OPERATING RULES GENERAL OPERATING AND FLIGHT RULES Fractional Ownership... the following are eligible under this subpart to conduct training, testing, and checking under... chapter to conduct training, testing, and checking required by this subpart if the training center— (1...
14 CFR 91.1075 - Training program: Special rules.
Code of Federal Regulations, 2013 CFR
2013-01-01
... (CONTINUED) AIR TRAFFIC AND GENERAL OPERATING RULES GENERAL OPERATING AND FLIGHT RULES Fractional Ownership... the following are eligible under this subpart to conduct training, testing, and checking under... chapter to conduct training, testing, and checking required by this subpart if the training center— (1...
14 CFR 91.1075 - Training program: Special rules.
Code of Federal Regulations, 2010 CFR
2010-01-01
... (CONTINUED) AIR TRAFFIC AND GENERAL OPERATING RULES GENERAL OPERATING AND FLIGHT RULES Fractional Ownership... the following are eligible under this subpart to conduct training, testing, and checking under... chapter to conduct training, testing, and checking required by this subpart if the training center— (1...
14 CFR 91.1075 - Training program: Special rules.
Code of Federal Regulations, 2014 CFR
2014-01-01
... (CONTINUED) AIR TRAFFIC AND GENERAL OPERATING RULES GENERAL OPERATING AND FLIGHT RULES Fractional Ownership... the following are eligible under this subpart to conduct training, testing, and checking under... chapter to conduct training, testing, and checking required by this subpart if the training center— (1...
Design Rules for Life Support Systems
NASA Technical Reports Server (NTRS)
Jones, Harry
2002-01-01
This paper considers some of the common assumptions and engineering rules of thumb used in life support system design. One general design rule is that the longer the mission, the more the life support system should use recycling and regenerable technologies. A more specific rule is that, if the system grows more than half the food, the food plants will supply all the oxygen needed for the crew life support. There are many such design rules that help in planning the analysis of life support systems and in checking results. These rules are typically if-then statements describing the results of steady-state, "back of the envelope," mass flow calculations. They are useful in identifying plausible candidate life support system designs and in rough allocations between resupply and resource recovery. Life support system designers should always review the design rules and make quick steady state calculations before doing detailed design and dynamic simulation. This paper develops the basis for the different assumptions and design rules and discusses how they should be used. We start top-down, with the highest level requirement to sustain human beings in a closed environment off Earth. We consider the crew needs for air, water, and food. We then discuss atmosphere leakage and recycling losses. The needs to support the crew and to make up losses define the fundamental life support system requirements. We consider the trade-offs between resupplying and recycling oxygen, water, and food. The specific choices between resupply and recycling are determined by mission duration, presence of in-situ resources, etc., and are defining parameters of life support system design.
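The "half the food" rule can be checked with exactly the kind of steady-state, back-of-the-envelope mass flow calculation the paper describes. In this sketch the crew oxygen and food needs are typical published values, while the oxygen yield per kilogram of food grown is an assumed constant, chosen here so that the rule holds at a food fraction of one half.

```python
def oxygen_surplus(food_fraction_grown, crew_size=4,
                   o2_need=0.84,        # kg O2 per person-day (typical crew value)
                   food_need=0.62,      # kg dry food per person-day (typical)
                   o2_per_kg_food=2.7): # kg O2 released per kg food grown; an
                                        # assumption calibrated to the half-food rule
    """Steady-state back-of-the-envelope check of the design rule that
    growing more than half the food supplies all crew oxygen.
    Returns daily O2 production minus daily crew O2 need (kg/day)."""
    produced = crew_size * food_fraction_grown * food_need * o2_per_kg_food
    needed = crew_size * o2_need
    return produced - needed

print(oxygen_surplus(0.25) < 0)  # True: a quarter of the food leaves an O2 deficit
print(oxygen_surplus(0.75) > 0)  # True: three quarters yields an O2 surplus
```

This is the style of quick steady-state check the paper recommends running before any detailed design or dynamic simulation.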
Simulation-based MDP verification for leading-edge masks
NASA Astrophysics Data System (ADS)
Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki
2017-07-01
For IC design starts below the 20nm technology node, the assist features on photomasks shrink well below 60nm, and the printed patterns of those features on masks written by VSB eBeam writers start to show large deviations from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and adopted, such as rule-based mask process correction (MPC), model-based MPC and, eventually, model-based MDP. The new MDP methods may place shot edges slightly differently from the target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for the assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification, which became a necessity for OPC a decade ago, we see the same trend in MDP today. A simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities, and a meaningful simulation-based mask check requires a good mask process model. The TrueModel® system is a field-tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run the simulation-based mask check, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits, and dose-margin-related hotspots can be detected by setting a suitable detection threshold. In this paper, we demonstrate GPU acceleration for geometry processing and give examples of mask check results and performance data, showing that GPU acceleration is necessary to make simulation-based mask MDP verification acceptable.
Forward-Chaining Versus A Graph Approach As The Inference Engine In Expert Systems
NASA Astrophysics Data System (ADS)
Neapolitan, Richard E.
1986-03-01
Rule-based expert systems are those in which a certain number of IF-THEN rules are assumed to be true. Based on the verity of some assertions, the rules deduce as many new conclusions as possible. A standard technique used to make these deductions is forward-chaining. In forward-chaining, the program or 'inference engine' cycles through the rules. At each rule, the premises for the rule are checked against the current true assertions. If all the premises are found, the conclusion is added to the list of true assertions. At that point it is necessary to start over at the first rule, since the new conclusion may be a premise in a rule already checked. Therefore, each time a new conclusion is deduced it is necessary to start the rule checking procedure over. This process continues until no new conclusions are added and the end of the list of rules is reached. The above process, although quite costly in terms of CPU cycles due to the necessity of repeatedly starting the process over, is necessary if the rules contain 'pattern variables'. An example of such a rule is, 'IF X IS A BACTERIA, THEN X CAN BE TREATED WITH ANTIBIOTICS'. Since the rule can lead to conclusions for many values of X, it is necessary to check each premise in the rule against every true assertion producing an association list to be used in the checking of the next premise. However, if the rule does not contain variable data, as is the case in many current expert systems, then a rule can lead to only one conclusion. In this case, the rules can be stored in a graph, and the true assertions in an assertion list. The assertion list is traversed only once; at each assertion a premise is triggered in all the rules which have that assertion as a premise. When all premises for a rule trigger, the rule's conclusion is added to the END of the list of assertions. It must be added at the end so that it will eventually be used to make further deductions. 
In the present paper, the two methods are described in detail, the relative advantages of each are discussed, and a benchmark comparing the CPU cycles consumed by each is included. It is also shown that, in the case of reasoning under uncertainty, the certainties derived from rules arguing for the same conclusion can be properly combined when the graph approach is used.
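The graph approach for variable-free rules can be sketched directly from the description above: each assertion triggers the premises it satisfies, and a rule's conclusion is appended to the end of the assertion list so it can drive further deductions in a single pass. The rule set below is a hypothetical example in the spirit of the bacteria rule quoted in the abstract.

```python
def graph_inference(rules, facts):
    """Single-pass inference for variable-free rules: an index maps each
    premise to the rules that use it, and conclusions are appended to the
    END of the assertion list so they are eventually processed too."""
    # rules: list of (premises, conclusion) pairs
    waiting = [set(premises) for premises, _ in rules]  # premises not yet seen
    index = {}  # premise -> indices of rules that use it
    for i, (premises, _) in enumerate(rules):
        for p in premises:
            index.setdefault(p, []).append(i)
    assertions = list(facts)
    known = set(facts)
    j = 0
    while j < len(assertions):          # traverse the assertion list only once
        a = assertions[j]
        for i in index.get(a, []):
            waiting[i].discard(a)
            if not waiting[i]:          # all premises satisfied: rule fires
                c = rules[i][1]
                if c not in known:
                    known.add(c)
                    assertions.append(c)
        j += 1
    return assertions

rules = [({"bacteria"}, "treat-with-antibiotics"),
         ({"fever", "treat-with-antibiotics"}, "monitor-temperature")]
print(graph_inference(rules, ["bacteria", "fever"]))
```

Because no rule is ever revisited, the repeated restarts that forward-chaining needs for pattern variables disappear, which is the efficiency argument made above.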
14 CFR 91.1089 - Qualifications: Check pilots (aircraft) and check pilots (simulator).
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 2 2011-01-01 2011-01-01 false Qualifications: Check pilots (aircraft) and check pilots (simulator). 91.1089 Section 91.1089 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... RULES Fractional Ownership Operations Program Management § 91.1089 Qualifications: Check pilots...
14 CFR 91.1089 - Qualifications: Check pilots (aircraft) and check pilots (simulator).
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 2 2010-01-01 2010-01-01 false Qualifications: Check pilots (aircraft) and check pilots (simulator). 91.1089 Section 91.1089 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... RULES Fractional Ownership Operations Program Management § 91.1089 Qualifications: Check pilots...
Solutions to time variant problems of real-time expert systems
NASA Technical Reports Server (NTRS)
Yeh, Show-Way; Wu, Chuan-Lin; Hung, Chaw-Kwei
1988-01-01
Real-time expert systems for monitoring and control are driven by input data that changes with time. One of the subtle problems in this field is the propagation of time-variant problems from rule to rule. The propagation problem is further complicated in a multiprogramming environment, where the expert system may issue test commands to the system to get data and may access time-consuming devices to retrieve data for concurrent reasoning. Two approaches are used to handle the flood of input data. In the first, snapshots are taken to freeze the system from time to time; the expert system treats the system as stationary and traces changes by comparing consecutive snapshots. In the other approach, when an input becomes available, the rules associated with it are evaluated. In both approaches, if the premise condition of a fired rule changes to false, the downstream rules should be deactivated. If the status change is due to the disappearance of a transient problem, actions taken by fired downstream rules that are no longer valid may need to be undone, and a downstream rule that is still being evaluated should not be fired. Three mechanisms for solving this problem are discussed: tracing, backward checking, and censor setting. In the forward tracing mechanism, when the premise conditions of a fired rule become false, the premise conditions of downstream rules that have been fired, or are being evaluated, because of that rule are reevaluated; a tree with its root at the deactivated rule is traversed. In the backward checking mechanism, when a rule is about to be fired, the expert system checks back on the premise conditions of the upstream rules that led to its evaluation to see whether it should be fired; the root of the traversed tree is the rule being fired.
In the censor setting mechanism, when a rule is to be evaluated, a censor is constructed based on the premise conditions of the upstream rules and the censor is evaluated just before the rule is fired. Unlike the backward checking mechanism, this one does not search the upstream rules. This paper explores the details of implementation of the three mechanisms.
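The forward tracing mechanism described above can be sketched in a few lines. The following is a hypothetical Python illustration, not the paper's implementation: rule names, premises, and the fact store are invented for the example.

```python
# Hypothetical sketch of the forward tracing mechanism: each rule records the
# downstream rules it triggered; when a fired rule's premise turns false, the
# tree rooted at that rule is traversed and dependent firings are retracted.

class Rule:
    def __init__(self, name, premise):
        self.name = name
        self.premise = premise        # callable: facts -> bool
        self.fired = False
        self.downstream = []          # rules fired because this rule fired

def fire(rule, facts):
    if rule.premise(facts):
        rule.fired = True
    return rule.fired

def forward_trace(rule, facts, retracted):
    """Re-evaluate the downstream tree of a deactivated rule (depth-first)."""
    for child in rule.downstream:
        if child.fired and not child.premise(facts):
            child.fired = False
            retracted.append(child.name)
            forward_trace(child, facts, retracted)  # recurse into subtree
    return retracted

# Toy usage: r1 triggers r2, which triggers r3; then the input change
# (a transient problem disappearing) falsifies all three premises.
facts = {"temp": 120}
r3 = Rule("r3", lambda f: f["temp"] > 110)
r2 = Rule("r2", lambda f: f["temp"] > 105)
r1 = Rule("r1", lambda f: f["temp"] > 100)
r1.downstream = [r2]
r2.downstream = [r3]
for r in (r1, r2, r3):
    fire(r, facts)

facts["temp"] = 90                    # transient problem disappears
retracted = []
if not r1.premise(facts):
    r1.fired = False
    forward_trace(r1, facts, retracted)
print(retracted)                      # ['r2', 'r3']
```

The backward checking mechanism would invert the traversal, walking the `downstream` links in reverse before firing a rule rather than after deactivating one.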
36 CFR 504.12 - Items to be checked.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Section 504.12 Parks, Forests, and Public Property SMITHSONIAN INSTITUTION RULES AND REGULATIONS GOVERNING SMITHSONIAN INSTITUTION BUILDINGS AND GROUNDS § 504.12 Items to be checked. Umbrellas, canes (not needed to... to be checked in buildings where checking facilities are provided. ...
The Paperwork Pile-Up: Measuring the Burden of Charter School Applications
ERIC Educational Resources Information Center
McShane, Michael Q.; Hatfield, Jenn; English, Elizabeth
2015-01-01
In 1988, Albert Shanker, head of the United Federation of Teachers, suggested that small groups of teachers could design charter (performance-based) schools as alternatives to local public schools. In theory, charter school teachers would be held in check by a performance contract but would be otherwise free from rules, norms, and regulations that…
2014-01-01
Background Providing scalable clinical decision support (CDS) across institutions that use different electronic health record (EHR) systems has been a challenge for medical informatics researchers. The lack of commonly shared EHR models and terminology bindings has been recognised as a major barrier to sharing CDS content among different organisations. The openEHR Guideline Definition Language (GDL) expresses CDS content based on openEHR archetypes and can support any clinical terminologies or natural languages. Our aim was to explore in an experimental setting the practicability of GDL and its underlying archetype formalism. A further aim was to report on the artefacts produced by this new technological approach in this particular experiment. We modelled and automatically executed compliance checking rules from clinical practice guidelines for acute stroke care. Methods We extracted rules from the European clinical practice guidelines as well as from treatment contraindications for acute stroke care and represented them using GDL. Then we executed the rules retrospectively on 49 mock patient cases to check the cases’ compliance with the guidelines, and manually validated the execution results. We used openEHR archetypes, GDL rules, the openEHR reference information model, reference terminologies and the Data Archetype Definition Language. We utilised the open-sourced GDL Editor for authoring GDL rules, the international archetype repository for reusing archetypes, the open-sourced Ocean Archetype Editor for authoring or modifying archetypes and the CDS Workbench for executing GDL rules on patient data. Results We successfully represented clinical rules about 14 out of 19 contraindications for thrombolysis and other aspects of acute stroke care with 80 GDL rules. These rules are based on 14 reused international archetypes (one of which was modified), 2 newly created archetypes and 51 terminology bindings (to three terminologies). 
Our manual compliance checks for the 49 mock patients matched the automated compliance results completely. Conclusions Shareable guideline knowledge for use in automated retrospective checking of guideline compliance may be achievable using GDL. Whether the same GDL rules can be used for at-the-point-of-care CDS remains unknown. PMID:24886468
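The retrospective checking the study describes can be illustrated schematically. The sketch below is plain Python rather than actual GDL, and the contraindication names and thresholds are hypothetical, not the European guideline values:

```python
# Illustrative sketch (not GDL syntax): contraindication rules for thrombolysis
# expressed as predicates over a patient record, applied retrospectively to a
# case. All rule names and threshold values are invented for the example.

CONTRAINDICATIONS = {
    "age_over_limit": lambda p: p["age"] > 80,
    "high_glucose": lambda p: p["glucose_mmol_l"] > 22.2,
    "recent_surgery": lambda p: p["days_since_surgery"] < 14,
}

def check_compliance(patient, treated):
    """A treated case is non-compliant if any contraindication held."""
    violated = [name for name, rule in CONTRAINDICATIONS.items() if rule(patient)]
    compliant = not (treated and violated)
    return compliant, violated

case = {"age": 84, "glucose_mmol_l": 6.1, "days_since_surgery": 90}
print(check_compliance(case, treated=True))   # (False, ['age_over_limit'])
```

GDL adds what this sketch lacks: archetype-based data bindings and terminology bindings, so the same rule can run against different EHR systems.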
42 CFR 433.40 - Treatment of uncashed or cancelled (voided) Medicaid checks.
Code of Federal Regulations, 2010 CFR
2010-10-01
...) Medicaid checks. 433.40 Section 433.40 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT...) Medicaid checks. (a) Purpose. This section provides the rules to ensure that States refund the Federal...— Cancelled (voided) check means a Medicaid check issued by a State or fiscal agent which prior to its being...
78 FR 23872 - HIPAA Privacy Rule and the National Instant Criminal Background Check System (NICS)
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-23
... any unintended adverse consequences for individuals seeking needed mental health services that may be... Check System (NICS) to help enforce these prohibitions.\\4\\ The NICS Index, a database administered by... to the NICS. Such an amendment might produce clarity regarding the Privacy Rule and help make it as...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-31
... Command Proficiency Check and Other Changes to the Pilot and Pilot School Certification Rules AGENCY... regulations concerning pilot, flight instructor, and pilot school certification. This rule will require pilot... and permits pilot schools and provisional pilot schools to apply for a combined private pilot...
High Level Rule Modeling Language for Airline Crew Pairing
NASA Astrophysics Data System (ADS)
Mutlu, Erdal; Birbil, Ş. Ilker; Bülbül, Kerem; Yenigün, Hüsnü
2011-09-01
The crew pairing problem is an airline optimization problem where a set of least costly pairings (consecutive flights to be flown by a single crew) that covers every flight in a given flight network is sought. A pairing is defined by a very complex set of feasibility rules imposed by international and national regulatory agencies, and also by the airline itself. The cost of a pairing is likewise defined by complicated rules. When an optimization engine generates a sequence of flights from a given flight network, it has to check all these feasibility rules to determine whether the sequence forms a valid pairing. Likewise, the engine needs to calculate the cost of the pairing by using certain rules. However, the rules used for checking feasibility and calculating costs are usually not static. Furthermore, airline companies carry out what-if-type analyses by testing several alternative scenarios in each planning period. Therefore, embedding the implementation of the feasibility checking and cost calculation rules into the source code of the optimization engine is not a practical approach. In this work, a high level language called ARUS is introduced for describing the feasibility and cost calculation rules. A compiler for ARUS is also implemented in this work to generate a dynamic link library to be used by crew pairing optimization engines.
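The core idea, keeping feasibility rules out of the engine's source code, can be sketched without ARUS itself. The following Python sketch registers rules as plain callables that an engine consults; the rule names and limits are hypothetical, not actual regulatory constraints:

```python
# Sketch of decoupling feasibility rules from the optimization engine: rules
# are registered by decorator, so they can be swapped per what-if scenario
# without touching engine code. Rule contents are invented for illustration.

FEASIBILITY_RULES = []

def rule(fn):
    FEASIBILITY_RULES.append(fn)
    return fn

@rule
def max_duty_hours(pairing):
    # hypothetical duty-time limit
    return sum(f["block_hours"] for f in pairing) <= 8.0

@rule
def connections_ok(pairing):
    # consecutive flights must connect at the same airport
    return all(a["dest"] == b["origin"] for a, b in zip(pairing, pairing[1:]))

def is_valid_pairing(pairing):
    return all(r(pairing) for r in FEASIBILITY_RULES)

pairing = [
    {"origin": "IST", "dest": "ESB", "block_hours": 1.0},
    {"origin": "ESB", "dest": "IST", "block_hours": 1.0},
]
print(is_valid_pairing(pairing))   # True
```

ARUS goes further by compiling declarative rule descriptions into a dynamic link library, but the engine-facing contract is the same: a generated flight sequence either passes all registered rules or is rejected.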
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-23
... 1973, as amended. In total, approximately 36,498 kilometers (km) (22,679 miles (mi)) of streams (which... box that reads ``Enter Keyword or ID,'' enter the docket number for the proposed rule, which is FWS-R1-ES-2009-0085. Check the box that reads ``Open for Comment/Submission,'' and then click the Search...
Checking Flight Rules with TraceContract: Application of a Scala DSL for Trace Analysis
NASA Technical Reports Server (NTRS)
Barringer, Howard; Havelund, Klaus; Morris, Robert A.
2011-01-01
Typically during the design and development of a NASA space mission, rules and constraints are identified to help reduce reasons for failure during operations. These flight rules are usually captured in a set of indexed tables, containing rule descriptions, rationales for the rules, and other information. Flight rules can be part of manual operations procedures carried out by humans. However, they can also be automated, and either implemented as on-board monitors, or as ground based monitors that are part of a ground data system. In the case of automated flight rules, one considerable expense to be addressed for any mission is the extensive process by which system engineers express flight rules in prose, software developers translate these requirements into code, and then both experts verify that the resulting application is correct. This paper explores the potential benefits of using an internal Scala DSL for general trace analysis, named TRACECONTRACT, to write executable specifications of flight rules. TRACECONTRACT can generally be applied to the analysis of, for example, log files, or to the online monitoring of executing systems.
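The flavor of such an executable flight rule can be shown with a small trace monitor. This is a Python sketch of the idea, not the Scala TRACECONTRACT DSL, and the rule and event names are hypothetical:

```python
# Hedged sketch of a flight-rule monitor over an event trace: the invented
# rule says "a 'dispatch' must be acknowledged by a 'complete' before the
# next 'dispatch'". Returns the indices of violating events.

def monitor(trace):
    pending = None      # index of the unacknowledged dispatch, if any
    violations = []
    for i, event in enumerate(trace):
        if event == "dispatch":
            if pending is not None:
                violations.append(i)   # dispatched again before completion
            pending = i
        elif event == "complete":
            pending = None
    return violations

log = ["dispatch", "complete", "dispatch", "dispatch", "complete"]
print(monitor(log))   # [3]
```

A DSL like TRACECONTRACT lets engineers state such temporal properties declaratively instead of hand-coding the state machine, which narrows the gap between the prose rule and its executable form.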
Performance optimization of internet firewalls
NASA Astrophysics Data System (ADS)
Chiueh, Tzi-cker; Ballman, Allen
1997-01-01
Internet firewalls control the data traffic in and out of an enterprise network by checking network packets against a set of rules that embodies an organization's security policy. Because rule checking is computationally more expensive than routing-table look-up, it could become a potential bottleneck for scaling up the performance of IP routers, which typically implement firewall functions in software. In this paper, we analyze the performance problems associated with firewalls, particularly packet filters, propose a connection cache to amortize the costly security check over the packets in a connection, and report the preliminary performance results of a trace-driven simulation, which show that the average packet check time can be reduced by a factor of at least 2.5.
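The connection-cache idea can be sketched directly: only the first packet of a connection pays for the linear rule scan, and later packets of the same flow hit the cache. The rule list below is invented for illustration, not a real policy:

```python
# Sketch of a connection cache for packet filtering: the full rule list is
# consulted only for the first packet of a flow; subsequent packets reuse
# the cached verdict. Rules and addresses are hypothetical.

RULES = [  # (predicate, verdict) pairs, checked in order
    (lambda p: p["dport"] == 23, "deny"),           # e.g. block telnet
    (lambda p: p["dport"] in (80, 443), "allow"),   # web traffic
    (lambda p: True, "deny"),                       # default deny
]

cache = {}
rule_checks = 0        # counts individual rule evaluations

def filter_packet(pkt):
    global rule_checks
    key = (pkt["src"], pkt["dst"], pkt["sport"], pkt["dport"])
    if key in cache:
        return cache[key]                 # fast path: amortized O(1)
    for pred, verdict in RULES:           # slow path: linear rule scan
        rule_checks += 1
        if pred(pkt):
            cache[key] = verdict
            return verdict

flow = {"src": "10.0.0.1", "dst": "10.0.0.2", "sport": 5000, "dport": 80}
verdicts = [filter_packet(flow) for _ in range(5)]
print(verdicts[0], rule_checks)   # allow 2
```

Five packets of the same connection trigger only two rule evaluations in total, which is the amortization effect the paper measures on real traces.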
TU-D-201-06: HDR Plan Prechecks Using Eclipse Scripting API
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palaniswaamy, G; Morrow, A; Kim, S
Purpose: Automate brachytherapy treatment plan quality check using Eclipse v13.6 scripting API based on pre-configured rules to minimize human error and maximize efficiency. Methods: The HDR Precheck system is developed based on a rules-driven approach using the Eclipse scripting API. This system checks for critical plan parameters like channel length, first source position, source step size and channel mapping. The planned treatment time is verified independently based on analytical methods. For interstitial or SAVI APBI treatment plans, a Patterson-Parker system calculation is performed to verify the planned treatment time. For endobronchial treatments, an analytical formula from TG-59 is used. Acceptable tolerances were defined based on clinical experience in our department. The system was designed to show PASS/FAIL status levels. Additional information, if necessary, is indicated appropriately in a separate comments field in the user interface. Results: The HDR Precheck system has been developed and tested to verify the treatment plan parameters that are routinely checked by the clinical physicist. The report also serves as a reminder or checklist for the planner to perform any additional critical checks such as applicator digitization or scenarios where the channel mapping was intentionally changed. It is expected to reduce the current manual plan check time from 15 minutes to <1 minute. Conclusion: Automating brachytherapy plan prechecks significantly reduces treatment plan precheck time and reduces human errors. When fully developed, this system will be able to perform TG-43 based second checks of the treatment planning system's dose calculation using random points in the target and critical structures. A histogram will be generated along with tabulated mean and standard deviation values for each structure.
A knowledge database will also be developed for Brachyvision plans, which will then be used for knowledge-based plan quality checks to further reduce treatment planning errors and increase confidence in the planned treatment.
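The rules-driven PASS/FAIL idea is simple to sketch. The parameter names and tolerance values below are hypothetical placeholders, not clinical tolerances, and this is not the Eclipse scripting API:

```python
# Illustrative precheck in the spirit of the rules-driven approach described:
# compare plan parameters against preset tolerance intervals and report
# PASS/FAIL per parameter. Names and numbers are invented for the example.

TOLERANCES = {
    "channel_length_mm": (1290.0, 1300.0),
    "first_source_position_mm": (1290.0, 1300.0),
    "source_step_size_mm": (5.0, 5.0),
}

def precheck(plan):
    report = {}
    for name, (lo, hi) in TOLERANCES.items():
        report[name] = "PASS" if lo <= plan[name] <= hi else "FAIL"
    return report

plan = {"channel_length_mm": 1295.0,
        "first_source_position_mm": 1400.0,
        "source_step_size_mm": 5.0}
print(precheck(plan))
# {'channel_length_mm': 'PASS', 'first_source_position_mm': 'FAIL',
#  'source_step_size_mm': 'PASS'}
```

Keeping the tolerances in a table rather than in code is what makes the check "pre-configured": physicists can adjust limits without modifying the checking logic.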
NASA Astrophysics Data System (ADS)
Raghavan, Ajay; Saha, Bhaskar
2013-03-01
Photo enforcement devices for traffic rules such as red lights, toll, stops, and speed limits are increasingly being deployed in cities and counties around the world to ensure smooth traffic flow and public safety. These are typically unattended fielded systems, and so it is important to periodically check them for potential image/video quality problems that might interfere with their intended functionality. There is interest in automating such checks to reduce the operational overhead and human error involved in manually checking large camera device fleets. Examples of problems affecting such camera devices include exposure issues, focus drifts, obstructions, misalignment, download errors, and motion blur. Furthermore, in some cases, in addition to the sub-algorithms for individual problems, one also has to carefully design the overall algorithm and logic to check for and accurately classify these individual problems. Some of these issues can occur in tandem or have the potential to be confused for each other by automated algorithms. Examples include camera misalignment that can cause some scene elements to go out of focus for wide-area scenes, or download errors that can be misinterpreted as an obstruction. Therefore, the sequence in which the sub-algorithms are utilized is also important. This paper presents an overview of these problems along with no-reference and reduced reference image and video quality solutions to detect and classify such faults.
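The point about ordering, that a download error can masquerade as an obstruction, suggests a fixed-order classification pipeline. The sketch below uses toy brightness heuristics and invented thresholds purely to illustrate the control flow:

```python
# Minimal sketch of ordered fault classification: sub-checks run in a fixed
# sequence and the first fault that explains the frame wins. The checks and
# thresholds are toy stand-ins (frames are lists of pixel intensities).

def check_download_error(frame):
    return frame is None or len(frame) == 0

def check_obstruction(frame):
    return sum(frame) / len(frame) < 10      # near-black frame

def check_overexposure(frame):
    return sum(frame) / len(frame) > 245     # near-white frame

CHECK_ORDER = [
    ("download_error", check_download_error),  # must run before obstruction
    ("obstruction", check_obstruction),
    ("overexposure", check_overexposure),
]

def classify(frame):
    for label, check in CHECK_ORDER:
        if check(frame):
            return label
    return "ok"

print(classify([]), classify([5, 4, 6]), classify([250] * 3), classify([128] * 3))
# download_error obstruction overexposure ok
```

An empty frame is reported as a download error rather than an obstruction only because the download check runs first; reversing the order would misclassify it, which is exactly the sequencing concern the paper raises.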
Li, Chen; Nagasaki, Masao; Ueno, Kazuko; Miyano, Satoru
2009-04-27
Model checking approaches were applied to biological pathway validations around 2003. Recently, Fisher et al. have proved the importance of the model checking approach by inferring new regulation of signaling crosstalk in C. elegans and confirming the regulation with biological experiments. They took a discrete and state-based approach to explore all possible states of the system underlying vulval precursor cell (VPC) fate specification for desired properties. However, since both discrete and continuous features appear to be an indispensable part of biological processes, it is more appropriate to use quantitative models to capture the dynamics of biological systems. Our key motivation in this paper is to establish a quantitative methodology to model and analyze in silico models incorporating the use of the model checking approach. A novel method of modeling and simulating biological systems with the use of the model checking approach is proposed based on hybrid functional Petri net with extension (HFPNe) as the framework dealing with both discrete and continuous events. Firstly, we construct a quantitative VPC fate model with 1761 components by using HFPNe. Secondly, we apply two major biological fate determination rules - Rule I and Rule II - to the VPC fate model. We then conduct 10,000 simulations for each of 48 sets of different genotypes, investigate variations of cell fate patterns under each genotype, and validate the two rules by comparing three simulation targets consisting of fate patterns obtained from in silico and in vivo experiments. In particular, an evaluation was successfully carried out by using our VPC fate model to investigate one target derived from biological experiments involving hybrid lineage observations. However, hybrid lineages are hard to capture with a discrete model, because a hybrid lineage occurs when the system comes close to certain thresholds, as discussed by Sternberg and Horvitz in 1986.
Our simulation results suggest that Rule I, which cannot be applied with qualitative model checking, is more reasonable than Rule II, owing to the high coverage of predicted fate patterns (except for the genotype of lin-15ko; lin-12ko double mutants). More insights are also suggested. The quantitative simulation-based model checking approach is a useful means of providing valuable biological insights and a better understanding of biological systems and observation data that may be hard to capture with the qualitative approach.
42 CFR 457.216 - Treatment of uncashed or canceled (voided) CHIP checks.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 42 Public Health 4 2011-10-01 2011-10-01 false Treatment of uncashed or canceled (voided) CHIP... canceled (voided) CHIP checks. (a) Purpose. This section provides rules to ensure that States refund the... section— Canceled (voided) check means an CHIP check issued by a State or fiscal agent that prior to its...
42 CFR 457.216 - Treatment of uncashed or canceled (voided) CHIP checks.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 42 Public Health 4 2014-10-01 2014-10-01 false Treatment of uncashed or canceled (voided) CHIP... canceled (voided) CHIP checks. (a) Purpose. This section provides rules to ensure that States refund the... section— Canceled (voided) check means an CHIP check issued by a State or fiscal agent that prior to its...
42 CFR 457.216 - Treatment of uncashed or canceled (voided) CHIP checks.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 4 2010-10-01 2010-10-01 false Treatment of uncashed or canceled (voided) CHIP... canceled (voided) CHIP checks. (a) Purpose. This section provides rules to ensure that States refund the... section— Canceled (voided) check means an CHIP check issued by a State or fiscal agent that prior to its...
42 CFR 457.216 - Treatment of uncashed or canceled (voided) CHIP checks.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 42 Public Health 4 2012-10-01 2012-10-01 false Treatment of uncashed or canceled (voided) CHIP... canceled (voided) CHIP checks. (a) Purpose. This section provides rules to ensure that States refund the... section— Canceled (voided) check means an CHIP check issued by a State or fiscal agent that prior to its...
42 CFR 457.216 - Treatment of uncashed or canceled (voided) CHIP checks.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 42 Public Health 4 2013-10-01 2013-10-01 false Treatment of uncashed or canceled (voided) CHIP... canceled (voided) CHIP checks. (a) Purpose. This section provides rules to ensure that States refund the... section— Canceled (voided) check means an CHIP check issued by a State or fiscal agent that prior to its...
ERIC Educational Resources Information Center
La Porta, Rafael; Lopez-de-Silanes, Florencio; Pop-Eleches, Cristian; Shleifer, Andrei
2004-01-01
In the Anglo-American constitutional tradition, judicial checks and balances are often seen as crucial guarantees of freedom. Hayek distinguishes two ways in which the judiciary provides such checks and balances: judicial independence and constitutional review. We create a new database of constitutional rules in 71 countries that reflect these…
14 CFR 135.291 - Applicability.
Code of Federal Regulations, 2014 CFR
2014-01-01
... AND ON DEMAND OPERATIONS AND RULES GOVERNING PERSONS ON BOARD SUCH AIRCRAFT Crewmember Testing... checks required for pilot and flight attendant crewmembers and for the approval of check pilots in... chapter who meet the requirements of §§ 135.337 and 135.339 to conduct training, testing, and checking...
14 CFR 135.291 - Applicability.
Code of Federal Regulations, 2013 CFR
2013-01-01
... AND ON DEMAND OPERATIONS AND RULES GOVERNING PERSONS ON BOARD SUCH AIRCRAFT Crewmember Testing... checks required for pilot and flight attendant crewmembers and for the approval of check pilots in... chapter who meet the requirements of §§ 135.337 and 135.339 to conduct training, testing, and checking...
14 CFR 135.291 - Applicability.
Code of Federal Regulations, 2012 CFR
2012-01-01
... AND ON DEMAND OPERATIONS AND RULES GOVERNING PERSONS ON BOARD SUCH AIRCRAFT Crewmember Testing... checks required for pilot and flight attendant crewmembers and for the approval of check pilots in... chapter who meet the requirements of §§ 135.337 and 135.339 to conduct training, testing, and checking...
14 CFR 135.291 - Applicability.
Code of Federal Regulations, 2011 CFR
2011-01-01
... AND ON DEMAND OPERATIONS AND RULES GOVERNING PERSONS ON BOARD SUCH AIRCRAFT Crewmember Testing... checks required for pilot and flight attendant crewmembers and for the approval of check pilots in... chapter who meet the requirements of §§ 135.337 and 135.339 to conduct training, testing, and checking...
14 CFR 135.291 - Applicability.
Code of Federal Regulations, 2010 CFR
2010-01-01
... AND ON DEMAND OPERATIONS AND RULES GOVERNING PERSONS ON BOARD SUCH AIRCRAFT Crewmember Testing... checks required for pilot and flight attendant crewmembers and for the approval of check pilots in... chapter who meet the requirements of §§ 135.337 and 135.339 to conduct training, testing, and checking...
Encoded physics knowledge in checking codes for nuclear cross section libraries at Los Alamos
NASA Astrophysics Data System (ADS)
Parsons, D. Kent
2017-09-01
Checking procedures for processed nuclear data at Los Alamos are described. Both continuous energy and multi-group nuclear data are verified by locally developed checking codes which use basic physics knowledge and common-sense rules. A list of nuclear data problems which have been identified with the help of these checking codes is also given.
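Typical "common-sense" physics rules of the kind such checking codes encode can be sketched generically. The checks below (non-negativity, monotone energy grid, partials summing to the total) are standard sanity conditions; the tolerance and data are invented for the example:

```python
# Sketch of common-sense checks on processed cross-section data: cross
# sections must be non-negative, the energy grid strictly increasing, and
# partial cross sections should sum to the total within a tolerance.
# This is an illustration, not the Los Alamos checking codes.

def check_cross_sections(energies, total, partials, tol=1e-6):
    problems = []
    if any(e2 <= e1 for e1, e2 in zip(energies, energies[1:])):
        problems.append("energy grid not strictly increasing")
    if any(x < 0 for xs in [total] + partials for x in xs):
        problems.append("negative cross section")
    for i, t in enumerate(total):
        if abs(t - sum(p[i] for p in partials)) > tol * max(t, 1.0):
            problems.append(f"partials do not sum to total at point {i}")
    return problems

energies = [1.0, 2.0, 3.0]
total    = [4.0, 5.0, 6.0]                       # barns (toy values)
partials = [[1.0, 2.0, 2.5], [3.0, 3.0, 3.0]]    # two toy partial reactions
print(check_cross_sections(energies, total, partials))
# ['partials do not sum to total at point 2']
```

Real checking codes apply many more rules (threshold behavior, unitarity of scattering matrices, interpolation-law consistency), but each follows this same pattern: a physics constraint turned into a mechanical test over the processed library.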
Spacecraft command verification: The AI solution
NASA Technical Reports Server (NTRS)
Fesq, Lorraine M.; Stephan, Amy; Smith, Brian K.
1990-01-01
Recently, a knowledge-based approach was used to develop a system called the Command Constraint Checker (CCC) for TRW. CCC was created to automate the process of verifying spacecraft command sequences. To check command files by hand for timing and sequencing errors is a time-consuming and error-prone task. Conventional software solutions were rejected when it was estimated that it would require 36 man-months to build an automated tool to check constraints by conventional methods. Using rule-based representation to model the various timing and sequencing constraints of the spacecraft, CCC was developed and tested in only three months. By applying artificial intelligence techniques, CCC designers were able to demonstrate the viability of AI as a tool to transform difficult problems into easily managed tasks. The design considerations used in developing CCC are discussed and the potential impact of this system on future satellite programs is examined.
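The kind of timing and sequencing constraints CCC encodes can be sketched as rules over a timestamped command sequence. The command names and limits below are hypothetical, not TRW's actual flight constraints:

```python
# Sketch of rule-based command constraint checking: each constraint is a
# closure over a timestamped command sequence [(time_s, name), ...].
# Commands and constraint values are invented for illustration.

def min_separation(cmd_a, cmd_b, seconds):
    """Commands must be at least `seconds` apart if both occur."""
    def rule(seq):
        times = {name: t for t, name in seq}
        if cmd_a in times and cmd_b in times:
            return abs(times[cmd_b] - times[cmd_a]) >= seconds
        return True
    return rule

def must_precede(cmd_a, cmd_b):
    """cmd_a must appear before cmd_b if both occur."""
    def rule(seq):
        names = [name for _, name in seq]
        if cmd_a in names and cmd_b in names:
            return names.index(cmd_a) < names.index(cmd_b)
        return True
    return rule

RULES = [min_separation("HEATER_ON", "CAMERA_ON", 30),
         must_precede("HEATER_ON", "CAMERA_ON")]

sequence = [(0, "HEATER_ON"), (10, "CAMERA_ON")]   # camera too soon
print([i for i, r in enumerate(RULES) if not r(sequence)])   # [0]
```

Representing each constraint as data rather than hand-written checking code is what let CCC model new constraints quickly, which is the source of the 36-month-to-3-month contrast in the abstract.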
Colucci, E; Clark, A; Lang, C E; Pomeroy, V M
2017-12-01
Background: Dose-optimisation studies as precursors to clinical trials are rare in stroke rehabilitation. Objective: To develop a rule-based, dose-finding design for stroke rehabilitation research. Design: 3+3 rule-based, dose-finding study. Methods: Dose escalation/de-escalation was undertaken according to preset rules and a mathematical sequence (modified Fibonacci sequence). The target starting daily dose was 50 repetitions for the first cohort. Adherence was recorded by an electronic counter. At the end of the 2-week training period, the adherence record indicated dose tolerability (adherence to target dose) and the outcome measure indicated dose benefit (10% increase in motor function). The preset increment/decrease and checking rules were then applied to set the dose for the subsequent cohort. The process was repeated until preset stopping rules were met. Participants: Participants had a mean age of 68 (range 48 to 81) years, and were a mean of 70 (range 9 to 289) months post stroke with moderate upper limb paresis. Intervention: A custom-built model of exercise-based training to enhance ability to open the paretic hand. Outcome measures: Repetitions per minute of extension/flexion of paretic digits against resistance; usability of the preset rules; and whether the maximally tolerated dose was identifiable. Results: Five cohorts of three participants were involved. Discernibly different doses were set for each subsequent cohort (i.e. 50, 100, 167, 251 and 209 repetitions/day). The maximally tolerated dose for the model training task was 209 repetitions/day. Conclusions: This dose-finding design is a feasible method for use in stroke rehabilitation research. Copyright © 2017 Chartered Society of Physiotherapy. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-10
... Fingerprinting and Criminal History Records Check Requirements for Access To Safeguards Information (Effective... Bureau of Investigation (FBI) identification and criminal history records check are required of any... ``fingerprinting relief'' rule. Individuals relieved from fingerprinting and criminal history records checks under...
NASA Technical Reports Server (NTRS)
1994-01-01
A NASA contract led to the development of faster and more energy-efficient semiconductor materials for digital integrated circuits. Gallium arsenide (GaAs) conducts electrons 4-6 times faster than silicon and uses less power at frequencies above 100-150 megahertz. However, the material is expensive, brittle, and fragile, and had lacked computer-aided engineering tools. To address this problem, Systems & Processes Engineering Corporation (SPEC) developed a series of GaAs cell libraries for cell layout, design rule checking, logic synthesis, placement and routing, simulation and chip assembly. The system is marketed by Compare Design Automation.
CMLLite: a design philosophy for CML
2011-01-01
CMLLite is a collection of definitions and processes which provide strong and flexible validation for a document in Chemical Markup Language (CML). It consists of an updated CML schema (schema3), conventions specifying rules in both human and machine-understandable forms and a validator available both online and offline to check conformance. This article explores the rationale behind the changes which have been made to the schema, explains how conventions interact and how they are designed, formulated, implemented and tested, and gives an overview of the validation service. PMID:21999395
Rule-based topology system for spatial databases to validate complex geographic datasets
NASA Astrophysics Data System (ADS)
Martinez-Llario, J.; Coll, E.; Núñez-Andrés, M.; Femenia-Ribera, C.
2017-06-01
A rule-based topology software system providing a highly flexible and fast procedure to enforce integrity in spatial relationships among datasets is presented. This improved topology rule system is built over the spatial extension Jaspa. Both projects are open source, freely available software developed by the corresponding author of this paper. Currently, there is no spatial DBMS that implements a rule-based topology engine (considering that the topology rules are designed and performed in the spatial backend). If the topology rules are applied in the frontend (as in many GIS desktop programs), ArcGIS is the most advanced solution. The system presented in this paper has several major advantages over the ArcGIS approach: it can be extended with new topology rules, it has a much wider set of rules, and it can mix feature attributes with topology rules as filters. In addition, the topology rule system can work with various DBMSs, including PostgreSQL, H2 or Oracle, and the logic is performed in the spatial backend. The proposed topology system allows users to check the complex spatial relationships among features (from one or several spatial layers) that require some complex cartographic datasets, such as the data specifications proposed by INSPIRE in Europe and the Land Administration Domain Model (LADM) for Cadastral data.
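The "rule plus attribute filter" combination the paper highlights can be illustrated with toy geometries. The sketch below uses axis-aligned boxes in place of real geometries and an invented "parcels must not overlap" rule; it is not Jaspa's API:

```python
# Minimal sketch of a rule-based topology check: the rule "features in this
# layer must not overlap" is evaluated pairwise, optionally restricted by a
# feature-attribute filter. Boxes are (xmin, ymin, xmax, ymax); data is toy.

from itertools import combinations

def overlaps(a, b):
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    return ax1 < bx2 and bx1 < ax2 and ay1 < by2 and by1 < ay2

def check_no_overlap(features, attr_filter=None):
    """Return id pairs that violate the no-overlap rule."""
    items = [(fid, f) for fid, f in features.items()
             if attr_filter is None or attr_filter(f)]
    return [(i, j) for (i, a), (j, b) in combinations(items, 2)
            if overlaps(a["geom"], b["geom"])]

parcels = {
    1: {"geom": (0, 0, 10, 10), "status": "final"},
    2: {"geom": (5, 5, 15, 15), "status": "final"},    # overlaps parcel 1
    3: {"geom": (20, 20, 30, 30), "status": "draft"},
}
print(check_no_overlap(parcels, attr_filter=lambda f: f["status"] == "final"))
# [(1, 2)]
```

The attribute filter is the distinguishing feature mentioned in the abstract: the same geometric rule can be scoped to subsets of features (here, only "final" parcels), which frontend-only topology engines typically cannot express.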
Building validation tools for knowledge-based systems
NASA Technical Reports Server (NTRS)
Stachowitz, R. A.; Chang, C. L.; Stock, T. S.; Combs, J. B.
1987-01-01
The Expert Systems Validation Associate (EVA), a validation system under development at the Lockheed Artificial Intelligence Center for more than a year, provides a wide range of validation tools to check the correctness, consistency and completeness of a knowledge-based system. A declarative meta-language (higher-order language), is used to create a generic version of EVA to validate applications written in arbitrary expert system shells. The architecture and functionality of EVA are presented. The functionality includes Structure Check, Logic Check, Extended Structure Check (using semantic information), Extended Logic Check, Semantic Check, Omission Check, Rule Refinement, Control Check, Test Case Generation, Error Localization, and Behavior Verification.
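Two of the simplest checks in that list, a structure check and a redundancy check, can be sketched over rules represented as (premises, conclusion) pairs. The rule base below is invented for the example and is not EVA's meta-language:

```python
# Sketch of two knowledge-base validation checks in the spirit of EVA's
# Structure/Logic checks: (1) dead rules whose premises can never be met
# from known facts or any rule's conclusion, and (2) duplicate rules.

def validate(rules, known_facts):
    derivable = set(known_facts)
    for premises, conclusion in rules:
        derivable.add(conclusion)   # optimistic: any conclusion counts
    dead = [i for i, (premises, _) in enumerate(rules)
            if not set(premises) <= derivable]
    seen, duplicates = {}, []
    for i, (premises, conclusion) in enumerate(rules):
        key = (tuple(sorted(premises)), conclusion)
        if key in seen:
            duplicates.append((seen[key], i))
        else:
            seen[key] = i
    return {"dead_rules": dead, "duplicates": duplicates}

rules = [
    (["a", "b"], "c"),
    (["c"], "d"),
    (["x"], "e"),          # 'x' is neither a fact nor concluded: dead rule
    (["b", "a"], "c"),     # duplicate of rule 0 (premise order ignored)
]
print(validate(rules, known_facts=["a", "b"]))
# {'dead_rules': [2], 'duplicates': [(0, 3)]}
```

EVA's Extended checks add semantic information on top of this purely structural analysis, so that, for example, mutually exclusive premises are also flagged.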
Det Norske Veritas rule philosophy with regard to gas turbines for marine propulsion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, P.
1999-04-01
This paper is mainly based on Det Norske Veritas (DNV) Rules of January 1996, Part 4, Chapter 2, Section 4 -- Gas Turbines, and is intended to at least open the dialogue between the gas turbine industry and DNV. There is a need for systematic design approval, manufacturing inspection, and testing procedures that match the standards of the industry. The roles and expectations imposed by owners, the authorities, insurance agencies, etc. need to be understood. These expectations often have technical implications that may go against the normal procedures and practices of the gas turbine industry, and could have cost impacts. The question of DNV acceptance criteria has been asked many times with respect to gas turbines. DNV relies a great deal on the manufacturer to provide the basis for the design, manufacturing, and testing criteria of the gas turbine. However, DNV adds its knowledge and experience to this, and checks that the documentation presented by the manufacturer is technically acceptable. Generally, a high level of state-of-the-art theoretical documentation is required to support the design of modern gas turbines. A proper understanding of the rule philosophy of DNV could prove useful in developing better gas turbine systems, which fulfill the rule requirements and at the same time save resources such as money and time. It is important for gas turbine manufacturers to understand the intent of the rules, since it is the intent that needs to be fulfilled. Further, the rules follow the principle of equivalence, which means that there is full freedom in how one fulfills the intent of the rules, as long as DNV accepts the solution.
1991-05-01
or may not bypass the editing function. At present, editing rules beyond those required for translation have not been stipulated. When explicit... editing rules become defined, the editor at a site LGN may perform two levels of edit checking: warning, which would insert blanks or pass as submitted...position image transactions into a transaction set. This low-level edit checking is performed at the site LGN to reduce transmission costs and to
A Hierarchy of Proof Rules for Checking Differential Invariance of Algebraic Sets
2014-11-01
Ghorbal, Khalil; Sogokon, Andrew; Platzer, André
Profitability of simple technical trading rules of Chinese stock exchange indexes
NASA Astrophysics Data System (ADS)
Zhu, Hong; Jiang, Zhi-Qiang; Li, Sai-Ping; Zhou, Wei-Xing
2015-12-01
Although technical trading rules have been widely used by practitioners in financial markets, their profitability still remains controversial. We here investigate the profitability of moving average (MA) and trading range break (TRB) rules by using the Shanghai Stock Exchange Composite Index (SHCI) from May 21, 1992 through December 31, 2013 and Shenzhen Stock Exchange Component Index (SZCI) from April 3, 1991 through December 31, 2013. The t-test is adopted to check whether the mean returns which are conditioned on the trading signals are significantly different from unconditioned returns and whether the mean returns conditioned on the buy signals are significantly different from the mean returns conditioned on the sell signals. We find that TRB rules outperform MA rules and short-term variable moving average (VMA) rules outperform long-term VMA rules. By applying White's Reality Check test and accounting for the data snooping effects, we find that the best trading rule outperforms the buy-and-hold strategy when transaction costs are not taken into consideration. Once transaction costs are included, trading profits will be eliminated completely. Our analysis suggests that simple trading rules like MA and TRB cannot beat the standard buy-and-hold strategy for the Chinese stock exchange indexes.
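The moving-average rule tested in the paper is mechanical enough to sketch. The following Python illustration uses toy prices, not the SHCI/SZCI data, and omits the statistical tests:

```python
# Sketch of the moving-average (MA) trading rule: signal "buy" when the price
# is above its n-day moving average, "sell" when below, then collect returns
# conditioned on each signal (the quantities compared by the paper's t-tests).

def moving_average_signals(prices, n):
    signals = []
    for t in range(n, len(prices)):
        ma = sum(prices[t - n:t]) / n
        signals.append("buy" if prices[t] > ma else "sell")
    return signals

def conditional_returns(prices, signals, n):
    buy, sell = [], []
    for k, s in enumerate(signals[:-1]):
        t = n + k
        r = prices[t + 1] / prices[t] - 1.0     # next-day return
        (buy if s == "buy" else sell).append(r)
    return buy, sell

prices = [10, 10.2, 10.1, 10.4, 10.6, 10.3, 10.5, 10.9]   # toy series
signals = moving_average_signals(prices, n=3)
buy, sell = conditional_returns(prices, signals, n=3)
print(len(buy), len(sell))   # 3 1
```

The paper's analysis then tests whether the mean of `buy` returns differs significantly from unconditional returns and from `sell` returns, and applies White's Reality Check to correct for data snooping across many candidate rules.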
Class Model Development Using Business Rules
NASA Astrophysics Data System (ADS)
Skersys, Tomas; Gudas, Saulius
New developments in the area of computer-aided system engineering (CASE) greatly improve processes of the information systems development life cycle (ISDLC). Much effort is put into quality improvement issues, but IS development projects still suffer from poor model quality during the system analysis and design cycles. To some degree, the quality of models developed using CASE tools can be assured using various automated model comparison and syntax checking procedures. It is also reasonable to check these models against business domain knowledge, but the domain knowledge stored in the repository of a CASE tool (enterprise model) is insufficient (Gudas et al. 2004). Involving business domain experts in these processes is complicated because non-IT people often find it difficult to understand models developed by IT professionals using a specific modeling language.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, S; Wu, Y; Chang, X
Purpose: A novel computer software system, namely APDV (Automatic Pre-Delivery Verification), has been developed for verifying patient treatment plan parameters immediately prior to treatment delivery in order to automatically detect and prevent catastrophic errors. Methods: APDV is designed to continuously monitor new DICOM plan files on the TMS computer at the treatment console. When new plans to be delivered are detected, APDV checks the consistency of plan parameters and high-level plan statistics using underlying rules and statistical properties based on the given treatment site, technique and modality. These rules were quantitatively derived by retrospectively analyzing all the EBRT treatment plans of the past 8 years at the authors' institution. Therapists and physicists will be notified with a warning message displayed on the TMS computer if any critical errors are detected, and check results, confirmation, and dismissal actions will be saved into a database for further review. Results: APDV was implemented as a stand-alone program using C# to ensure the required real-time performance. Mean values and standard deviations were quantitatively derived for various plan parameters including MLC usage, MU/cGy ratio, beam SSD, beam weighting, and beam gantry angles (only for lateral targets) per treatment site, technique and modality. 2D rules combining MU/cGy ratio and averaged SSD values were also derived using joint probabilities of confidence error ellipses. The statistics of these major treatment plan parameters quantitatively evaluate the consistency of any treatment plan, which facilitates automatic APDV checking procedures. Conclusion: APDV could be useful in detecting and preventing catastrophic errors immediately before treatment delivery. Future plans include automatic patient identity and patient setup checks after patient daily images are acquired by the machine and become available on the TMS computer.
This project is supported by the Agency for Healthcare Research and Quality (AHRQ) under award 1R01HS0222888. The senior author received research grants from ViewRay Inc. and Varian Medical System.
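The flavor of check described above can be sketched as follows. This is a hypothetical illustration: the reference statistics, thresholds, and field names are invented, not the clinically derived values.

```python
# Hypothetical sketch of the flavor of check described: a 1D rule flags a
# plan parameter outside mean +/- 3 SD, and a 2D rule uses a k-sigma error
# ellipse for (MU/cGy ratio, mean SSD). All reference statistics and
# thresholds below are invented; they are not the derived clinical values.
def within_limits(value, mu, sd, k=3.0):
    """1D rule: value must lie within k standard deviations of the mean."""
    return abs(value - mu) <= k * sd

def inside_ellipse(x, y, mx, my, sx, sy, k=3.0):
    """2D rule (independence assumed): the point must fall inside the
    axis-aligned k-sigma error ellipse centred at (mx, my)."""
    return (x - mx) ** 2 / (k * sx) ** 2 + (y - my) ** 2 / (k * sy) ** 2 <= 1.0

# Invented per-site reference statistics and an example plan:
mu_per_cgy = {"mean": 1.1, "sd": 0.1}
ssd_cm = {"mean": 92.0, "sd": 3.0}
plan = {"mu_per_cgy": 1.15, "mean_ssd": 93.5}

ok_1d = within_limits(plan["mu_per_cgy"], mu_per_cgy["mean"], mu_per_cgy["sd"])
ok_2d = inside_ellipse(plan["mu_per_cgy"], plan["mean_ssd"],
                       mu_per_cgy["mean"], ssd_cm["mean"],
                       mu_per_cgy["sd"], ssd_cm["sd"])
```

A plan failing either predicate would trigger the warning path; a correlated 2D rule would replace the independence assumption with the joint covariance.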
Black, Amanda M; Hagel, Brent E; Palacios-Derflingher, Luz; Schneider, Kathryn J; Emery, Carolyn A
2017-12-01
In 2013, Hockey Canada introduced an evidence-informed policy change delaying the earliest age of introduction to body checking in ice hockey until Bantam (ages 13-14) nationwide. To determine if the risk of injury, including concussions, changes for Pee Wee (11-12 years) ice hockey players in the season following a national policy change disallowing body checking. In a historical cohort study, Pee Wee players were recruited from teams in all divisions of play in 2011-2012 prior to the rule change and in 2013-2014 following the change. Baseline information, injury and exposure data for both cohorts were collected using validated injury surveillance. Pee Wee players were recruited from 59 teams in Calgary, Alberta (n=883) in 2011-2012 and from 73 teams in 2013-2014 (n=618). There were 163 game-related injuries (incidence rate (IR)=4.37/1000 game-hours) and 104 concussions (IR=2.79/1000 game-hours) in Alberta prior to the rule change, and 48 injuries (IR=2.16/1000 game-hours) and 25 concussions (IR=1.12/1000 game-hours) after the rule change. Based on multivariable Poisson regression with exposure hours as an offset, the adjusted incidence rate ratio associated with the national policy change disallowing body checking was 0.50 for all game-related injuries (95% CI 0.33 to 0.75) and 0.36 for concussion specifically (95% CI 0.22 to 0.58). Introduction of the 2013 national body checking policy change disallowing body checking in Pee Wee resulted in a 50% relative reduction in injury rate and a 64% reduction in concussion rate in 11-year-old and 12-year-old hockey players in Alberta. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
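The headline numbers can be sanity-checked by hand. The sketch below is illustrative only: the study itself fit a multivariable Poisson regression with exposure hours as an offset and adjusted for covariates, whereas this recovers the implied exposure hours from the quoted counts and rates and forms the crude rate ratio.

```python
# Crude sanity check of the reported rates (illustrative only; the study
# itself fit a multivariable Poisson regression with exposure hours as an
# offset, adjusting for covariates). Exposure hours are recovered from
# the counts and rates quoted in the abstract.
def incidence_rate(events, exposure_hours, per=1000.0):
    """Injuries per `per` game-hours."""
    return events / exposure_hours * per

pre_hours = 163 / 4.37 * 1000.0    # implied game-hours before the change
post_hours = 48 / 2.16 * 1000.0    # implied game-hours after the change

# Crude (unadjusted) incidence rate ratio, after vs. before:
crude_irr = incidence_rate(48, post_hours) / incidence_rate(163, pre_hours)
```

The crude ratio is close to the adjusted 0.50 reported, as expected when adjustment changes the estimate little.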
NASA Technical Reports Server (NTRS)
Izygon, Michel
1992-01-01
This report summarizes the findings and lessons learned from the development of an intelligent user interface for a space flight planning simulation program, in the specific area of constraint checking. The different functionalities of the graphical user interface part and of the rule-based part of the system have been identified, and their respective domains of applicability for error prevention and error checking have been specified.
78 FR 2214 - Enhanced Weapons, Firearms Background Checks, and Security Event Notifications
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-10
.... Voluntary Consensus Standards XI. Finding of No Significant Environmental Impact XII. Paperwork Reduction... ML110480470 of Nuclear Energy Institute, on the proposed ``Enhanced Weapons, Firearms Background Checks and... consensus standards. XI. Finding of No Significant Environmental Impact In the proposed rule published on...
14 CFR 91.1051 - Pilot safety background check.
Code of Federal Regulations, 2013 CFR
2013-01-01
... (CONTINUED) AIR TRAFFIC AND GENERAL OPERATING RULES GENERAL OPERATING AND FLIGHT RULES Fractional Ownership... previous employers must include, as applicable— (1) Crew member records. (2) Drug testing—collection, testing, and rehabilitation records pertaining to the individual. (3) Alcohol misuse prevention program...
14 CFR 91.1051 - Pilot safety background check.
Code of Federal Regulations, 2012 CFR
2012-01-01
... (CONTINUED) AIR TRAFFIC AND GENERAL OPERATING RULES GENERAL OPERATING AND FLIGHT RULES Fractional Ownership... previous employers must include, as applicable— (1) Crew member records. (2) Drug testing—collection, testing, and rehabilitation records pertaining to the individual. (3) Alcohol misuse prevention program...
36 CFR 504.12 - Items to be checked.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 36 Parks, Forests, and Public Property 3 2014-07-01 2014-07-01 false Items to be checked. 504.12 Section 504.12 Parks, Forests, and Public Property SMITHSONIAN INSTITUTION RULES AND REGULATIONS GOVERNING... assist in walking), or other objects capable of inflicting damage to property or exhibits may be required...
36 CFR 520.13 - Items to be checked.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 36 Parks, Forests, and Public Property 3 2014-07-01 2014-07-01 false Items to be checked. 520.13 Section 520.13 Parks, Forests, and Public Property SMITHSONIAN INSTITUTION RULES AND REGULATIONS GOVERNING THE BUILDINGS AND GROUNDS OF THE NATIONAL ZOOLOGICAL PARK OF THE SMITHSONIAN INSTITUTION § 520.13...
36 CFR 504.12 - Items to be checked.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 36 Parks, Forests, and Public Property 3 2011-07-01 2011-07-01 false Items to be checked. 504.12 Section 504.12 Parks, Forests, and Public Property SMITHSONIAN INSTITUTION RULES AND REGULATIONS GOVERNING... assist in walking), or other objects capable of inflicting damage to property or exhibits may be required...
36 CFR 520.13 - Items to be checked.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 36 Parks, Forests, and Public Property 3 2012-07-01 2012-07-01 false Items to be checked. 520.13 Section 520.13 Parks, Forests, and Public Property SMITHSONIAN INSTITUTION RULES AND REGULATIONS GOVERNING THE BUILDINGS AND GROUNDS OF THE NATIONAL ZOOLOGICAL PARK OF THE SMITHSONIAN INSTITUTION § 520.13...
36 CFR 504.12 - Items to be checked.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 36 Parks, Forests, and Public Property 3 2012-07-01 2012-07-01 false Items to be checked. 504.12 Section 504.12 Parks, Forests, and Public Property SMITHSONIAN INSTITUTION RULES AND REGULATIONS GOVERNING... assist in walking), or other objects capable of inflicting damage to property or exhibits may be required...
36 CFR 520.13 - Items to be checked.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 36 Parks, Forests, and Public Property 3 2011-07-01 2011-07-01 false Items to be checked. 520.13 Section 520.13 Parks, Forests, and Public Property SMITHSONIAN INSTITUTION RULES AND REGULATIONS GOVERNING THE BUILDINGS AND GROUNDS OF THE NATIONAL ZOOLOGICAL PARK OF THE SMITHSONIAN INSTITUTION § 520.13...
36 CFR 520.13 - Items to be checked.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false Items to be checked. 520.13 Section 520.13 Parks, Forests, and Public Property SMITHSONIAN INSTITUTION RULES AND REGULATIONS GOVERNING THE BUILDINGS AND GROUNDS OF THE NATIONAL ZOOLOGICAL PARK OF THE SMITHSONIAN INSTITUTION § 520.13...
14 CFR 121.402 - Training program: Special rules.
Code of Federal Regulations, 2013 CFR
2013-01-01
... part or a flight training center certificated under part 142 of this chapter is eligible under this subpart to provide flight training, testing, and checking under contract or other arrangement to those... provide training, testing, and checking required by this part only if the training center— (1) Holds...
14 CFR 121.402 - Training program: Special rules.
Code of Federal Regulations, 2011 CFR
2011-01-01
... part or a flight training center certificated under part 142 of this chapter is eligible under this subpart to provide flight training, testing, and checking under contract or other arrangement to those... provide training, testing, and checking required by this part only if the training center— (1) Holds...
14 CFR 121.402 - Training program: Special rules.
Code of Federal Regulations, 2012 CFR
2012-01-01
... part or a flight training center certificated under part 142 of this chapter is eligible under this subpart to provide flight training, testing, and checking under contract or other arrangement to those... provide training, testing, and checking required by this part only if the training center— (1) Holds...
14 CFR 121.402 - Training program: Special rules.
Code of Federal Regulations, 2010 CFR
2010-01-01
... part or a flight training center certificated under part 142 of this chapter is eligible under this subpart to provide flight training, testing, and checking under contract or other arrangement to those... provide training, testing, and checking required by this part only if the training center— (1) Holds...
14 CFR 121.402 - Training program: Special rules.
Code of Federal Regulations, 2014 CFR
2014-01-01
... part or a flight training center certificated under part 142 of this chapter is eligible under this subpart to provide flight training, testing, and checking under contract or other arrangement to those... provide training, testing, and checking required by this part only if the training center— (1) Holds...
Strategy optimization for mask rule check in wafer fab
NASA Astrophysics Data System (ADS)
Yang, Chuen Huei; Lin, Shaina; Lin, Roger; Wang, Alice; Lee, Rachel; Deng, Erwin
2015-07-01
The photolithography process is becoming more and more sophisticated for wafer production following Moore's law. For a wafer fab, close and consolidated cooperation with the mask house is therefore key to silicon wafer success. Generally speaking, however, it is not easy to preserve such a partnership, because many engineering efforts and frequent communication are indispensable. This loose connection is obvious in mask rule check (MRC). Mask houses will do their own MRC at the job deck stage, but that checking only identifies mask process limitations, including writing, etching, inspection, metrology, etc. No further checking for wafer-process-related mask data errors is performed after the data files of the whole mask are composed in the mask house. Many potential data errors remain even after post-OPC verification has been done for main circuits: these are the kinds of errors that only occur when main circuits are combined with frame and dummy patterns to form the whole reticle. Therefore, strategy optimization is ongoing at UMC to evaluate MRC, especially for wafer-fab-related errors. The prerequisite is no impact on mask delivery cycle time even with this extra checking. A full-mask check based on the job deck in gds or oasis format is necessary in order to secure acceptable run time. The form of the summarized error report generated by this checking is also crucial, because a user-friendly interface will shorten engineers' judgment time to release the mask for writing. This paper surveys the key factors of MRC in a wafer fab.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Delgoshaei, Parastoo; Austin, Mark A.; Pertzborn, Amanda J.
State-of-the-art building simulation control methods incorporate physical constraints into their mathematical models, but omit implicit constraints associated with policies of operation and dependency relationships among the rules representing those constraints. To overcome these shortcomings, there is a recent trend toward enabling control strategies with inference-based rule checking capabilities. One solution is to exploit semantic web technologies in building simulation control. Such approaches provide the tools for semantic modeling of domains, and the ability to deduce new information from the models through the use of Description Logic (DL). In a step toward enabling this capability, this paper presents a cross-disciplinary data-driven control strategy for building energy management simulation that integrates semantic modeling and formal rule checking mechanisms into a Model Predictive Control (MPC) formulation. The results show that MPC provides superior levels of performance when initial conditions and inputs are derived from inference-based rules.
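A minimal sketch of inference-based rule checking of the kind described: forward chaining over a fact base to a fixpoint, with the inferred facts then gating an (entirely stubbed) MPC input. The rules, fact names, and setpoints are invented for illustration; real systems would use DL reasoners over semantic models.

```python
# Minimal sketch: forward chaining over a fact base to a fixpoint, with
# the inferred facts gating a stubbed MPC input. Rules, fact names and
# setpoints are invented; a real system would use a DL reasoner.
def forward_chain(facts, rules):
    """rules: list of (premise_set, conclusion). Apply until fixpoint."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and premises <= facts:
                facts.add(conclusion)
                changed = True
    return facts

rules = [
    ({"occupied", "workday"}, "comfort_mode"),
    ({"comfort_mode"}, "setpoint_21C"),
]
inferred = forward_chain({"occupied", "workday"}, rules)
mpc_setpoint = 21.0 if "setpoint_21C" in inferred else 16.0  # stub MPC input
```

The chaining step is what captures the "dependency relationships among rules" that purely physical constraint models omit.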
Code of Federal Regulations, 2013 CFR
2013-01-01
... simulator or training device; and (2) A flight check in the aircraft or a check in the simulator or training..., requalification, and differences flight training. 91.1103 Section 91.1103 Aeronautics and Space FEDERAL AVIATION... OPERATING AND FLIGHT RULES Fractional Ownership Operations Program Management § 91.1103 Pilots: Initial...
Code of Federal Regulations, 2014 CFR
2014-01-01
... simulator or training device; and (2) A flight check in the aircraft or a check in the simulator or training..., requalification, and differences flight training. 91.1103 Section 91.1103 Aeronautics and Space FEDERAL AVIATION... OPERATING AND FLIGHT RULES Fractional Ownership Operations Program Management § 91.1103 Pilots: Initial...
Code of Federal Regulations, 2011 CFR
2011-01-01
... simulator or training device; and (2) A flight check in the aircraft or a check in the simulator or training..., requalification, and differences flight training. 91.1103 Section 91.1103 Aeronautics and Space FEDERAL AVIATION... OPERATING AND FLIGHT RULES Fractional Ownership Operations Program Management § 91.1103 Pilots: Initial...
Code of Federal Regulations, 2012 CFR
2012-01-01
... simulator or training device; and (2) A flight check in the aircraft or a check in the simulator or training..., requalification, and differences flight training. 91.1103 Section 91.1103 Aeronautics and Space FEDERAL AVIATION... OPERATING AND FLIGHT RULES Fractional Ownership Operations Program Management § 91.1103 Pilots: Initial...
Code of Federal Regulations, 2010 CFR
2010-01-01
... simulator or training device; and (2) A flight check in the aircraft or a check in the simulator or training..., requalification, and differences flight training. 91.1103 Section 91.1103 Aeronautics and Space FEDERAL AVIATION... OPERATING AND FLIGHT RULES Fractional Ownership Operations Program Management § 91.1103 Pilots: Initial...
36 CFR § 504.12 - Items to be checked.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 36 Parks, Forests, and Public Property 3 2013-07-01 2012-07-01 true Items to be checked. § 504.12 Section § 504.12 Parks, Forests, and Public Property SMITHSONIAN INSTITUTION RULES AND REGULATIONS... needed to assist in walking), or other objects capable of inflicting damage to property or exhibits may...
A rule-based approach to model checking of UML state machines
NASA Astrophysics Data System (ADS)
Grobelna, Iwona; Grobelny, Michał; Stefanowicz, Łukasz
2016-12-01
In this paper a new approach to the formal verification of control process specifications expressed by means of UML state machines in version 2.x is proposed. In contrast to other approaches from the literature, we use an abstract and universal rule-based logical model suitable both for model checking (using the nuXmv model checker) and for logical synthesis in the form of rapid prototyping. Hence, a prototype implementation in the hardware description language VHDL can be obtained that fully reflects the primary, already formally verified specification in the form of UML state machines. The presented approach increases assurance that the implemented system meets the user-defined requirements.
Automated constraint checking of spacecraft command sequences
NASA Astrophysics Data System (ADS)
Horvath, Joan C.; Alkalaj, Leon J.; Schneider, Karl M.; Spitale, Joseph M.; Le, Dang
1995-01-01
Robotic spacecraft are controlled by onboard sets of commands called "sequences." Determining that sequences will have the desired effect on the spacecraft can be expensive in terms of both labor and computer coding time, with different particular costs for different types of spacecraft. Specification languages and an appropriate user interface to those languages can be used to make the most effective use of engineering validation time. This paper describes one specification and verification environment ("SAVE") designed for validating that command sequences do not violate any flight rules. The SAVE system was subsequently adapted for flight use on the TOPEX/Poseidon spacecraft. The relationship of this work to rule-based artificial intelligence and to other specification techniques is discussed, as are the issues that arise in the transfer of technology from a research prototype to a full flight system.
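The core idea — validating a command sequence against flight rules — can be sketched as predicates over a timeline. The command names and rules below are invented for illustration; they are not the actual SAVE flight rules.

```python
# Illustrative sketch (command names and rules invented; not the actual
# SAVE flight rules): each rule is a predicate over the command timeline,
# and the checker collects every violation instead of stopping early.
from dataclasses import dataclass

@dataclass
class Command:
    t: float     # execution time in seconds
    name: str

def rule_min_spacing(seq, min_dt=1.0):
    """Flight rule: consecutive commands must be at least min_dt apart."""
    return [f"spacing violation at t={b.t}"
            for a, b in zip(seq, seq[1:]) if b.t - a.t < min_dt]

def rule_heater_before_camera(seq):
    """Flight rule: CAMERA_ON must be preceded by HEATER_ON."""
    heater_on, violations = False, []
    for c in seq:
        if c.name == "HEATER_ON":
            heater_on = True
        elif c.name == "CAMERA_ON" and not heater_on:
            violations.append(f"CAMERA_ON at t={c.t} without heater")
    return violations

seq = [Command(0.0, "CAMERA_ON"), Command(0.5, "HEATER_ON")]
report = rule_min_spacing(seq) + rule_heater_before_camera(seq)  # 2 violations
```

Collecting all violations per run, rather than failing fast, is what makes such a report useful during engineering validation.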
77 FR 58529 - Submission for OMB Emergency Review
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-21
... Management and Budget (OMB) for review and clearance in accordance with the Paperwork Reduction Act of 1995... issue the final rule implementing the Serve America Act's National Service Criminal History Check rule. In an effort to be compliant while maintaining functions essential to the operations of each CNCS...
76 FR 70813 - Privacy Act of 1974, as Amended
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-15
... ensuring uniform and high ethical standards of conduct for paid tax return preparers. The major components... return preparers, extending the ethical rules found in Treasury Department Circular 230 to all paid tax..., fingerprint, and tax compliance checks, and to ethics and other regulatory rules; may be required to take...
14 CFR 91.1063 - Testing and training: Applicability and terms used.
Code of Federal Regulations, 2013 CFR
2013-01-01
... proficiency check requirements of § 91.1069. (iii) Testing requirements of § 91.1065. (iv) Recurrent flight... 14 Aeronautics and Space 2 2013-01-01 2013-01-01 false Testing and training: Applicability and... TRANSPORTATION (CONTINUED) AIR TRAFFIC AND GENERAL OPERATING RULES GENERAL OPERATING AND FLIGHT RULES Fractional...
14 CFR 91.1063 - Testing and training: Applicability and terms used.
Code of Federal Regulations, 2010 CFR
2010-01-01
... proficiency check requirements of § 91.1069. (iii) Testing requirements of § 91.1065. (iv) Recurrent flight... 14 Aeronautics and Space 2 2010-01-01 2010-01-01 false Testing and training: Applicability and... TRANSPORTATION (CONTINUED) AIR TRAFFIC AND GENERAL OPERATING RULES GENERAL OPERATING AND FLIGHT RULES Fractional...
14 CFR 91.1063 - Testing and training: Applicability and terms used.
Code of Federal Regulations, 2012 CFR
2012-01-01
... proficiency check requirements of § 91.1069. (iii) Testing requirements of § 91.1065. (iv) Recurrent flight... 14 Aeronautics and Space 2 2012-01-01 2012-01-01 false Testing and training: Applicability and... TRANSPORTATION (CONTINUED) AIR TRAFFIC AND GENERAL OPERATING RULES GENERAL OPERATING AND FLIGHT RULES Fractional...
14 CFR 91.1063 - Testing and training: Applicability and terms used.
Code of Federal Regulations, 2011 CFR
2011-01-01
... proficiency check requirements of § 91.1069. (iii) Testing requirements of § 91.1065. (iv) Recurrent flight... 14 Aeronautics and Space 2 2011-01-01 2011-01-01 false Testing and training: Applicability and... TRANSPORTATION (CONTINUED) AIR TRAFFIC AND GENERAL OPERATING RULES GENERAL OPERATING AND FLIGHT RULES Fractional...
14 CFR 91.1063 - Testing and training: Applicability and terms used.
Code of Federal Regulations, 2014 CFR
2014-01-01
... proficiency check requirements of § 91.1069. (iii) Testing requirements of § 91.1065. (iv) Recurrent flight... 14 Aeronautics and Space 2 2014-01-01 2014-01-01 false Testing and training: Applicability and... TRANSPORTATION (CONTINUED) AIR TRAFFIC AND GENERAL OPERATING RULES GENERAL OPERATING AND FLIGHT RULES Fractional...
12 CFR 220.117 - Exception to 90-day rule in special cash account.
Code of Federal Regulations, 2010 CFR
2010-01-01
... profit. On Day 8 customer delivered his check for the cost of the purchase to the creditor (member firm... any check or draft drawn on a bank which in the ordinary course of business is payable on presentation...(f), if controlling, would permit the exception to undermine, to some extent, the effectiveness of...
36 CFR § 520.13 - Items to be checked.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 36 Parks, Forests, and Public Property 3 2013-07-01 2012-07-01 true Items to be checked. § 520.13 Section § 520.13 Parks, Forests, and Public Property SMITHSONIAN INSTITUTION RULES AND REGULATIONS GOVERNING THE BUILDINGS AND GROUNDS OF THE NATIONAL ZOOLOGICAL PARK OF THE SMITHSONIAN INSTITUTION § 520.13...
Expert systems built by the Expert: An evaluation of OPS5
NASA Technical Reports Server (NTRS)
Jackson, Robert
1987-01-01
Two expert systems were written in OPS5 by the expert, a Ph.D. astronomer with no prior experience in artificial intelligence or expert systems, without the use of a knowledge engineer. The first system was built from scratch and uses 146 rules to check for duplication of scientific information within a pool of prospective observations. The second system was grafted onto another expert system and uses 149 additional rules to estimate the spacecraft and ground resources consumed by a set of prospective observations. The small vocabulary, the IF this occurs THEN do that logical structure of OPS5, and the ability to follow program execution allowed the expert to design and implement these systems with only the data structures and rules of another OPS5 system as an example. The modularity of the rules in OPS5 allowed the second system to modify the rulebase of the system onto which it was grafted without changing the code or the operation of that system. These experiences show that experts are able to develop their own expert systems due to the ease of programming and code reusability in OPS5.
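The recognize-act cycle that the expert exploited can be sketched as a toy production system, written here in Python rather than OPS5. Working memory is a set of fact tuples; the example rule, in the flavor of the duplication checker described, flags a target observed twice. All facts and rules are invented for illustration.

```python
# A toy production system in the OPS5 spirit (match -> select -> act),
# sketched in Python rather than OPS5. Working memory is a set of fact
# tuples; the invented rule flags a target observed twice, in the flavor
# of the duplication-checking system described above.
def run(wm, productions):
    """Fire any rule whose condition matches and whose action adds a new
    fact; repeat the recognize-act cycle until quiescence."""
    wm = set(wm)
    fired = True
    while fired:
        fired = False
        for name, cond, act in productions:
            if cond(wm):
                new = act(wm) - wm      # only genuinely new facts count
                if new:
                    wm |= new
                    fired = True
    return wm

productions = [
    ("flag-duplicate",
     lambda wm: ("obs", "m31", "a") in wm and ("obs", "m31", "b") in wm,
     lambda wm: {("duplicate", "m31")}),
]
wm = run({("obs", "m31", "a"), ("obs", "m31", "b")}, productions)
```

Because each rule is an independent condition-action pair, a second rulebase can be grafted on simply by appending to the production list, mirroring the modularity the abstract credits to OPS5.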
1985-05-31
These proposed regulations require a State agency to refund to the Federal government the Federal share of Medicaid checks issued by the State or its fiscal agent that remain uncashed 180 days after the date of issuance. In addition, we would require that the Federal share of cancelled (voided) Medicaid checks be refunded quarterly, since there has been no expenditure by the State. This proposal is intended to implement in part a 1981 General Accounting Office recommendation that procedures be established for States to credit the Federal government for its portion of uncashed Medicaid checks issued by the State or its fiscal agent.
1986-10-09
These final regulations require that a State agency refund to the Federal Government the Federal share of Medicaid checks issued by the State or its fiscal agent that remain uncashed 180 days after the date of issuance. In addition, we are requiring that the Federal share of cancelled (voided) Medicaid checks be refunded quarterly since there has been no expenditure by the State. These regulations implement, in part, a 1981 General Accounting Office recommendation that procedures be established for States to credit the Federal Government for the Federal portion of uncashed Medicaid checks issued by the State or its fiscal agent.
Patients' feelings about ward nursing regimes and involvement in rule construction.
Alexander, J
2006-10-01
This study compared two acute psychiatric ward nursing regimes, focusing on ward rules as a means of investigating the relationship between the flexibility/inflexibility of the regimes and patient outcomes. Previous studies identified an association between ward rules and patient aggression. A link between absconding and nurses' attitudes towards rule enforcement has also been explored. However, an in-depth exploration of ward rules from the perspective of nurses and patients had not been undertaken previously. The study aimed to discover the content of rules within acute psychiatric wards; to explore patients' responses to the rules; to evaluate the impact of rules and rule enforcement on nurse-patient relationships and on ward events; and to investigate the relationship between ward rules, ward atmosphere and ward design. The relevance of sociological theory emerged from the data analysis. During this process, the results were moved up to another conceptual level to represent the meaning of lived experience at the level of theory. For example, nurses' descriptions of their feelings in relation to rule enforcement were merged as role ambivalence. This concept was supported by examples from the transcripts. Other possible explanations for the data and the connections between them were checked by returning to each text unit in the cluster and ensuring that it fitted with the emergent theory. The design centred on a comparative interview study of 30 patients and 30 nurses within two acute psychiatric wards in different hospitals. Non-participant observations provided a context for the interview data. Measures of the Ward Atmosphere Scale, the Hospital-Hostel Practices Profile, ward incidents and levels of as-required (PRN) medication were obtained. The analysis of the quantitative data was assisted by SPSS, and the qualitative analysis by QSR NUD*IST. Thematic and interpretative phenomenological methods were used in the analysis of the qualitative data.
A series of 11 interrelated concepts emerged from an analysis of the data, and a synthesis of the main themes. This paper focuses on the results and recommendations that emerged from the quantitative and qualitative patient data. A further paper will focus on nurses' perceptions of the same topics.
Sigle, Joerg P; Holbro, Andreas; Lehmann, Thomas; Infanti, Laura; Gerull, Sabine; Stern, Martin; Tichelli, Andre; Passweg, Jakob; Buser, Andreas
2015-07-01
The 30-minute rule for RBC concentrates out of controlled temperature storage does not take into account multiple parameters that influence warming of RBC concentrates. This study evaluated two temperature-sensitive indicators (TIs) for monitoring RBC concentrates during transport. TI labels (Check-Spot [Harald H. Temmel KG, Gleisdorf, Austria] and Thermoindikator V4 [BASF, Basel, Switzerland]) were attached to RBC concentrates prior to delivery. Duration of transport, ambient temperatures, and label results (valid vs expired) were recorded. We evaluated the proportion of labels discrepant to the 30-minute rule overall and among deliveries 30 minutes or less and more than 30 minutes and compared the rates of valid and expired readings between both TIs. In total, 201 RBC concentrate deliveries (86.6%) lasted 30 minutes or less, and 31 (13.4%) were more than 30 minutes. Forty-six (19.8%) Check-Spot and 37 (15.9%) Thermoindikator V4 results were discrepant to the 30-minute rule. Sixteen (51.6%) and 27 (87.1%) RBC concentrate deliveries more than 30 minutes displayed valid label readings with Check-Spot and Thermoindikator V4, respectively. Rates of expired labels among deliveries 30 minutes or less and valid labels among deliveries more than 30 minutes differed significantly between TIs (P < .01). TIs identified a considerable number of RBC concentrates whose temperatures may not be adequately reflected by the 30-minute rule. Variability of readings between TIs stresses the necessity of validation prior to implementation. Copyright© by the American Society for Clinical Pathology.
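The comparison between the two indicators among the 31 deliveries over 30 minutes can be roughly re-checked. This sketch is not the paper's statistics; it is a pooled two-proportion z-test on the valid-label counts quoted in the abstract.

```python
# Rough re-check (a sketch, not the paper's statistics) of the
# Check-Spot vs. Thermoindikator V4 comparison among the 31 deliveries
# over 30 minutes: a pooled two-proportion z-test on valid-label counts.
from math import sqrt

def two_prop_z(x1, n1, x2, n2):
    """Two-proportion z-statistic with a pooled variance estimate."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                    # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))   # pooled standard error
    return (p1 - p2) / se

# 16/31 valid for Check-Spot vs. 27/31 valid for Thermoindikator V4:
z = two_prop_z(16, 31, 27, 31)
```

The magnitude of z (about 3.0) exceeds the two-sided 1% critical value of 2.576, consistent with the reported P < .01.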
More Than the Rules of Precedence
ERIC Educational Resources Information Center
Liang, Yawei
2005-01-01
In a fundamental computer-programming course, such as CSE101, questions about how to evaluate an arithmetic expression are frequently used to check if our students know the rules of precedence. The author uses two of our final examination questions to show that more knowledge of computer science is needed to answer them correctly. Furthermore,…
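A small illustration of why such questions are non-trivial (the example expression here is chosen for illustration, not taken from the article): the same tokens evaluated naively left-to-right versus with the standard rules of precedence give different answers.

```python
# The same token string evaluated naively left-to-right versus with the
# standard rules of precedence; the example expression is invented.
def left_to_right(tokens):
    """Evaluate ignoring precedence: 2 + 3 * 4 becomes (2 + 3) * 4."""
    ops = {"+": lambda a, b: a + b, "-": lambda a, b: a - b,
           "*": lambda a, b: a * b, "/": lambda a, b: a / b}
    acc = tokens[0]
    for op, val in zip(tokens[1::2], tokens[2::2]):
        acc = ops[op](acc, val)
    return acc

tokens = [2, "+", 3, "*", 4]
naive = left_to_right(tokens)    # (2 + 3) * 4 = 20
correct = 2 + 3 * 4              # precedence gives 2 + (3 * 4) = 14
```

A student who can state the precedence rules but evaluates left-to-right anyway produces 20 rather than 14, which is exactly the gap such examination questions probe.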
The Evolution of the Automated Continuous Evaluation System (ACES) for Personnel Security
2013-11-12
information. It applies business rules to the data, produces a report that flags issues of potential security concern, and electronically transmits...Form 86 (SF- 86) to check these data sources, verify what has been submitted, and collect more information. It applies business rules to the data...subject information. It applies business rules to analyze the data returned, produces a report that flags issues of potential security concern, and
Ensuring production-worthy OPC recipes using large test structure arrays
NASA Astrophysics Data System (ADS)
Cork, Christopher; Zimmermann, Rainer; Mei, Xin; Shahin, Alexander
2007-03-01
The continual shrinking of design rules as the industry follows Moore's Law and the associated need for low k1 processes, have resulted in more layout configurations becoming difficult to print within the required tolerances. OPC recipes have needed to become more complex as tolerances decreased and acceptable corrections harder to find with simple algorithms. With this complexity comes the possibility of coding errors and ensuring the solutions are truly general. OPC Verification tools can check the quality of a correction based on pre-determined specifications for CD variation, line-end pullback and Edge Placement Error and then highlight layout configuration where violations are found. The problem facing a Mask Tape-Out group is that they usually have little control over the Design Styles coming in. Different approaches to eliminating problematic layouts have included highly restrictive Design Rules [1], whereby certain pitches or orientations are disallowed. Now these design rules are either becoming too complex or they overly restrict the designer from benefiting from the reduced pitch of the new node. The tight link between Design and Mask Tape-Out found in Integrated Device Manufacturers [2] (IDMs) i.e. companies that control both design and manufacturing can do much to dictate manufacturing friendly layout styles, and push ownership of problem resolution back to design groups. In fact this has been perceived as such an issue that a new class of products for designers that perform Lithographic Compliance Check on design layout is an emerging technology [3]. In contrast to IDMs, Semiconductor Foundries are presented with a much larger variety of design styles and a set of Fabless customers who generally are less knowledgeable in terms of understanding the impact of their layout on manufacturability and how to correct issues. The robustness requirements of a foundry's OPC correction recipe, therefore needs to be greater than that for an IDM's tape-out group. 
An OPC correction recipe that gives acceptable verification results based solely on one customer's GDS is clearly not sufficient to guarantee that all future tape-outs from multiple customers will be similarly clean. Ad hoc changes made in reaction to problems seen at verification are risky: while they may solve one particular layout issue on one product, there is no guarantee that the problem will not simply shift to another configuration on a yet-to-be-manufactured part. The need to re-qualify a recipe over multiple products at each recipe change can easily result in excessive computational requirements; a single layer at an advanced node typically needs overnight runs on a large processor farm. Much of this layout, however, is extremely repetitive, made from a few standard cells placed tens of thousands of times. An alternative and more efficient approach, suggested by this paper as a screening methodology, is to encapsulate the problematic structures into a programmable test structure array. The dimensions of these test structures are parameterized in software so that they can be generated with the dimensions varied over the space of the design rules and conceivable design styles. By verifying a new recipe over these test structures, one can more quickly gain confidence that the recipe will be robust over multiple tape-outs. This paper gives some examples of the implementation of this methodology.
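The sweep idea behind the screening methodology above can be sketched as follows. This is a hypothetical illustration, not the paper's code: the problematic structure is reduced to a line/space array whose width and space are stepped over an assumed design-rule space, emitting one test cell per combination for the OPC recipe to be verified against.

```python
# Hypothetical sketch of the screening methodology: parameterize a problem
# structure (here a simple line/space array) and sweep its dimensions over
# the design-rule space. The minimum-rule values and step are illustrative.
from itertools import product

MIN_WIDTH, MIN_SPACE = 45, 45   # assumed minimum design rules (nm)

def generate_test_cells(max_width=120, max_space=120, step=5):
    """Yield (width, space, pitch) tuples covering the rule space."""
    widths = range(MIN_WIDTH, max_width + 1, step)
    spaces = range(MIN_SPACE, max_space + 1, step)
    for w, s in product(widths, spaces):
        yield (w, s, w + s)

cells = list(generate_test_cells())
print(len(cells), cells[0])  # 256 cells, starting at the minimum rule
```

Each emitted tuple would drive a layout generator that instantiates the corresponding test structure in the programmable array.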
2016-01-06
The Department of Health and Human Services (HHS or "the Department'') is issuing this final rule to modify the Health Insurance Portability and Accountability Act of 1996 (HIPAA) Privacy Rule to expressly permit certain HIPAA covered entities to disclose to the National Instant Criminal Background Check System (NICS) the identities of individuals who are subject to a Federal "mental health prohibitor'' that disqualifies them from shipping, transporting, possessing, or receiving a firearm. The NICS is a national system maintained by the Federal Bureau of Investigation (FBI) to conduct background checks on persons who may be disqualified from receiving firearms based on Federally prohibited categories or State law. Among the persons subject to the Federal mental health prohibitor established under the Gun Control Act of 1968 and implementing regulations issued by the Department of Justice (DOJ) are individuals who have been involuntarily committed to a mental institution; found incompetent to stand trial or not guilty by reason of insanity; or otherwise have been determined by a court, board, commission, or other lawful authority to be a danger to themselves or others or to lack the mental capacity to contract or manage their own affairs, as a result of marked subnormal intelligence or mental illness, incompetency, condition, or disease. Under this final rule, only covered entities with lawful authority to make the adjudications or commitment decisions that make individuals subject to the Federal mental health prohibitor, or that serve as repositories of information for NICS reporting purposes, are permitted to disclose the information needed for these purposes. The disclosure is restricted to limited demographic and certain other information needed for NICS purposes. 
The rule specifically prohibits the disclosure of diagnostic or clinical information, from medical records or other sources, and any mental health information beyond the indication that the individual is subject to the Federal mental health prohibitor.
Mining Hesitation Information by Vague Association Rules
NASA Astrophysics Data System (ADS)
Lu, An; Ng, Wilfred
In many online shopping applications, such as Amazon and eBay, traditional Association Rule (AR) mining has limitations as it only deals with the items that are sold but ignores the items that are almost sold (for example, those items that are put into the basket but not checked out). We say that those almost sold items carry hesitation information, since customers are hesitating to buy them. The hesitation information of items is valuable knowledge for the design of good selling strategies. However, there is no conceptual model that is able to capture different statuses of hesitation information. Herein, we apply and extend vague set theory in the context of AR mining. We define the concepts of attractiveness and hesitation of an item, which represent the overall information of a customer's intent on an item. Based on the two concepts, we propose the notion of Vague Association Rules (VARs). We devise an efficient algorithm to mine the VARs. Our experiments show that our algorithm is efficient and the VARs capture more specific and richer information than do the traditional ARs.
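The attractiveness/hesitation idea can be illustrated with a small sketch. This is not the authors' exact formulation: it assumes each item's customer intent is summarized as a vague-set interval [alpha, 1 - beta], where alpha is the fraction of customers who bought the item and beta the fraction who showed no interest, with attractiveness taken as the interval midpoint and hesitation as its width.

```python
# Illustrative vague-set view of an item (definitions assumed, not the
# paper's exact formulas): alpha = evidence for buying, beta = evidence
# against; the gap between them is the hesitation region.
def attractiveness(alpha, beta):
    """Midpoint of the vague interval [alpha, 1 - beta]."""
    return (alpha + (1 - beta)) / 2.0

def hesitation(alpha, beta):
    """Width of the interval: customers who put the item in the basket
    but did not check out."""
    return (1 - beta) - alpha

# 30% bought, 50% showed no interest -> about 20% hesitated
print(attractiveness(0.3, 0.5))  # ~0.4
print(hesitation(0.3, 0.5))      # ~0.2
```

Mining VARs would then apply support/confidence-style thresholds to these quantities instead of plain item frequencies.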
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-15
... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-62476; File No. SR-FINRA-2010-012] Self-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Order Approving a Proposed Rule Change To Amend FINRA Rule 8312 (FINRA BrokerCheck Disclosure) July 8, 2010. I. Introduction On March 30, 2010, the Financial Industry Regulatory...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-20
... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-70880; File No. SR-FINRA-2013-047] Self... Change To Amend FINRA Rule 8312 (FINRA BrokerCheck Disclosure) To Include Information About Members and... (``Act'') \\1\\ and Rule 19b-4 thereunder,\\2\\ notice is hereby given that on November 1, 2013, the...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-14
... within the meaning of Regulation NMS (i.e. ``dark venues'' or ``dark pools''). XCST orders, pursuant to Rule 3315(a)(1)(A)(ix), check the System for available shares and simultaneously route to select dark... Web site ( http://www.sec.gov/rules/sro.shtml ). Copies of the submission, all subsequent amendments...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-13
.... ``dark venues'' or ``dark pools''). QCST orders, pursuant to Rule 4758(a)(1)(A)(xiii), check the System for available shares and simultaneously route to select dark venues and to certain low cost exchanges... Web site ( http://www.sec.gov/rules/sro.shtml ). Copies of the submission, all subsequent amendments...
Model based high NA anamorphic EUV RET
NASA Astrophysics Data System (ADS)
Jiang, Fan; Wiaux, Vincent; Fenger, Germain; Clifford, Chris; Liubich, Vlad; Hendrickx, Eric
2018-03-01
With the announcement of the extension of the Extreme Ultraviolet (EUV) roadmap to a high-NA lithography tool that utilizes an anamorphic optics design, we investigate design tradeoffs unique to imaging with an anamorphic lithography tool. An anamorphic optical proximity correction (OPC) solution has been developed that fully models the EUV near-field electromagnetic effects and the anamorphic imaging using the Domain Decomposition Method (DDM). Imec clips representative of the N3 logic node were used to demonstrate the OPC solutions on critical layers that will benefit from the increased contrast of anamorphic imaging at high NA. Unlike the isomorphic case, however, OPC must treat x and y differently from the wafer perspective. In this paper we show a design trade-off unique to anamorphic EUV: with a mask rule of 48 nm (mask scale), approaching the current state of the art, limitations are observed in the correction that can be applied to the mask. The metal pattern has a pitch of 24 nm and a CD of 12 nm. During OPC, the correction of the vertically oriented metal lines is limited by the mask rule of 12 nm at 1X. The horizontally oriented lines do not suffer from this limitation, as their correction is allowed to go to 6 nm at 1X. For this example, the mask rules would need to be more aggressive to allow complete correction, or design rules and wafer processes (wafer rotation) would need to be created that exploit the orientation that can image more aggressive features. When considering via- or block-level correction, aggressive polygon corner-to-corner designs can be handled with various solutions, including applying a 45-degree chop. Multiple solutions are discussed using the metrics of edge placement error (EPE) and process variation bands (PVBands), together with all the mask constraints.
Note that in anamorphic OPC the 45-degree chop is maintained at the mask level to meet mask manufacturing constraints, but results in a skewed-angle edge in the wafer-level correction. In this paper we used both contact (via/block) patterns and metal patterns for the OPC exercise. By comparing the EPE of horizontal and vertical patterns under a fixed mask rule check (MRC), along with the PVBand, we focus on the challenges and solutions of OPC with an anamorphic high-NA lens.
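The x/y asymmetry described above follows directly from the anamorphic demagnification, as this minimal sketch shows (the 4x/8x factors and 48 nm mask rule are taken from the abstract; the function name is ours):

```python
# Sketch of the anamorphic mask-rule asymmetry: a 48 nm mask-scale minimum
# feature maps to different wafer-scale (1X) correction limits in x and y,
# because the optics demagnify 4x in one direction and 8x in the other.
MASK_RULE_NM = 48
DEMAG = {"x": 4, "y": 8}

def min_correction_1x(direction):
    """Smallest wafer-scale correction increment the mask rule allows."""
    return MASK_RULE_NM / DEMAG[direction]

print(min_correction_1x("x"), min_correction_1x("y"))  # 12.0 6.0
```

This is why the vertical lines (corrected in x at 4x demag) hit the 12 nm 1X limit while horizontal lines can be corrected down to 6 nm 1X.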
NASA Astrophysics Data System (ADS)
Kim, Youngmi; Choi, Jae-Young; Choi, Kwangseon; Choi, Jung-Hoe; Lee, Sooryong
2011-04-01
As IC design complexity keeps increasing, it is more and more difficult to ensure pattern transfer after optical proximity correction (OPC), owing to the continuous reduction of layout dimensions and the lithographic limits set by the k1 factor. To guarantee imaging fidelity, resolution enhancement technologies (RET) such as off-axis illumination (OAI), various types of phase-shift masks and OPC techniques have been developed. For model-based OPC, post-OPC verification solutions that cross-check the contour image against the target layout continue to be developed: methods for generating contours and matching them to the target structure, and methods for filtering and sorting patterns to eliminate false errors and duplicate patterns. Detecting only real errors, by excluding false ones, is the most important requirement for an accurate and fast verification process, saving not only review time and engineering resources but also overall wafer process time. In the typical case of post-OPC verification for metal-contact/via coverage (CC) checks, the verification tool outputs a huge number of errors because of borderless design, making it impractical to review and correct all of them. This can cause OPC engineers to miss real defects and, at the least, delay time to market. In this paper, we study methods for increasing the efficiency of post-OPC verification, especially for CC checks. For metal layers, the final CD after the etch process shows various biases depending on the distance to neighboring patterns, so it is more reasonable to consider the final metal shape when confirming contact/via coverage. By optimizing the biasing rule for different pitches and shapes of metal lines, we obtained more accurate and efficient verification results and reduced the time needed to review and find real errors.
In short, we present a way to increase the efficiency of the OPC verification process by applying a simple biasing rule to the metal layout instead of an etch model.
Code of Federal Regulations, 2013 CFR
2013-01-01
... history records checks and other elements of background checks for designated categories of individuals..., identification and criminal history records checks and other elements of background checks for designated categories of individuals. Fingerprinting, and the identification and criminal history records checks...
Code of Federal Regulations, 2010 CFR
2010-01-01
... history records checks and other elements of background checks for designated categories of individuals..., identification and criminal history records checks and other elements of background checks for designated categories of individuals. Fingerprinting, and the identification and criminal history records checks...
Specification and Verification of Web Applications in Rewriting Logic
NASA Astrophysics Data System (ADS)
Alpuente, María; Ballis, Demis; Romero, Daniel
This paper presents a Rewriting Logic framework that formalizes the interactions between Web servers and Web browsers through a communicating protocol abstracting HTTP. The proposed framework includes a scripting language that is powerful enough to model the dynamics of complex Web applications by encompassing the main features of the most popular Web scripting languages (e.g. PHP, ASP, Java Servlets). We also provide a detailed characterization of browser actions (e.g. forward/backward navigation, page refresh, and new window/tab openings) via rewrite rules, and show how our models can be naturally model-checked by using the Linear Temporal Logic of Rewriting (LTLR), which is a Linear Temporal Logic specifically designed for model-checking rewrite theories. Our formalization is particularly suitable for verification purposes, since it allows one to perform in-depth analyses of many subtle aspects related to Web interaction. Finally, the framework has been completely implemented in Maude, and we report on some successful experiments that we conducted by using the Maude LTLR model-checker.
[Safety provisions for recreational flying or sport with a hang-glider].
Gennari, M; Lombardo, C
1987-01-01
Act No. 106 of 25 March 1985 defined the specifications of the particular aircraft designed for hobby or sport flying, namely the hang-glider. It also provided for the issue, within six months, of special regulations aimed at "checking the psycho-physical fitness required in handling" such aircraft, in addition to the technical knowledge and the information about traffic, safety and insurance regulations relevant to the matter. However, the evident default of the legislator has left the protection of the hobby and sport practice of hang-gliding either wholly inadequate or governed by ambiguous regulations. If, instead, the law presently in force is considered, it is deemed that, while awaiting the regulations provided for by Act No. 106, the Aero Club of Italy may define the practice of "hobby or sport flight" as "agonistic", so that the checking of the "specific" fitness required by such a sport comes into operation in compliance with the State Decree of 18 February 1982.
Methodology for balancing design and process tradeoffs for deep-subwavelength technologies
NASA Astrophysics Data System (ADS)
Graur, Ioana; Wagner, Tina; Ryan, Deborah; Chidambarrao, Dureseti; Kumaraswamy, Anand; Bickford, Jeanne; Styduhar, Mark; Wang, Lee
2011-04-01
For process development of deep-subwavelength technologies, it has become accepted practice to use model-based simulation to predict systematic and parametric failures. Increasingly, these techniques are used by designers to ensure layout manufacturability, as an alternative or complement to restrictive design rules. The benefit of model-based simulation tools in the design environment is that manufacturability problems are addressed in a design-aware way, making appropriate trade-offs, e.g., between overall chip density and manufacturing cost and yield. The paper shows how library elements and the full ASIC design flow benefit from eliminating hot spots and improving design robustness early in the design cycle. It demonstrates a path to yield optimization and first-time-right designs implemented in leading-edge technologies. The approach described herein identifies those areas in the design that would benefit from being fixed early, leading to design updates and avoiding later design churn through careful selection of design sensitivities. This paper shows how to achieve this goal using simulation tools that incorporate various models, from sparse to rigorously physical, together with pattern detection and matching, checking, and validation of failure thresholds.
Resources for HVACR contractors, technicians, equipment owners and other regulated industry to check rules and requirements for managing refrigerant emissions, information on how to become a certified technician, and compliance assistance documents.
Reducing injury risk from body checking in boys' youth ice hockey.
Brooks, Alison; Loud, Keith J; Brenner, Joel S; Demorest, Rebecca A; Halstead, Mark E; Kelly, Amanda K Weiss; Koutures, Chris G; LaBella, Cynthia R; LaBotz, Michele; Martin, Stephanie S; Moffatt, Kody
2014-06-01
Ice hockey is an increasingly popular sport that allows intentional collision in the form of body checking for males but not for females. There is a two- to threefold increased risk of all injury, severe injury, and concussion related to body checking at all levels of boys' youth ice hockey. The American Academy of Pediatrics reinforces the importance of stringent enforcement of rules to protect player safety as well as educational interventions to decrease unsafe tactics. To promote ice hockey as a lifelong recreational pursuit for boys, the American Academy of Pediatrics recommends the expansion of nonchecking programs and the restriction of body checking to elite levels of boys' youth ice hockey, starting no earlier than 15 years of age.
Internet MEMS design tools based on component technology
NASA Astrophysics Data System (ADS)
Brueck, Rainer; Schumer, Christian
1999-03-01
The micro electromechanical systems (MEMS) industry in Europe is characterized by small and medium-sized enterprises specialized in products that solve problems in specific domains like medicine, automotive sensor technology, etc. In this field of business, the technology-driven design approach known from microelectronics is not appropriate. Instead, each design problem aims at its own specific technology to be used for the solution. The variety of technologies at hand, such as Si-surface, Si-bulk, LIGA, laser and precision engineering, requires a huge set of different design tools to be available. No single SME can afford to hold licenses for all these tools. This calls for a new and flexible way of designing, implementing and distributing design software. The Internet provides a flexible means of offering software access along with flexible licensing methodologies, e.g. on a pay-per-use basis. New communication technologies like ADSL and TV cable or satellites as carriers promise to offer sufficient bandwidth even for interactive tools with graphical interfaces in the near future. INTERLIDO is an experimental tool suite for process specification and layout verification for lithography-based MEMS technologies, accessed via the Internet. The first version provides a Java implementation, including a graphical editor for process specification. Currently, a new version based on JavaBeans component technology is being brought into operation. JavaBeans makes it possible to realize independent interactive design assistants, such as a design rule checking assistant, a process consistency checking assistant, a technology definition assistant and a graphical editor assistant, which may reside distributed over the Internet, communicating via Internet protocols. Each potential user is thus able to configure his own dedicated design tool set tailored to the requirements of the current problem to be solved.
IT Security Standards and Legal Metrology - Transfer and Validation
NASA Astrophysics Data System (ADS)
Thiel, F.; Hartmann, V.; Grottker, U.; Richter, D.
2014-08-01
Legal Metrology's requirements can be transferred into the IT security domain by applying a generic set of standardized rules provided by the Common Criteria (ISO/IEC 15408). We outline the transfer and cross-validation of such an approach. As an example, we take the integration of Legal Metrology's requirements into a recently developed Common Criteria-based Protection Profile for a Smart Meter Gateway, designed under the leadership of Germany's Federal Office for Information Security. The requirements on utility meters laid down in the Measuring Instruments Directive (MID) are incorporated. A verification approach to check that Legal Metrology's requirements are met through their interpretation as Common Criteria generic requirements is also presented.
Auditing health insurance reimbursement by constructing association rules
NASA Astrophysics Data System (ADS)
Chiang, I.-Jen
2000-04-01
Two months of reimbursement claims for admitted patients at National Taiwan University Hospital were used as the training set (roughly 200 MB), and a fast method was applied to find association rules among illnesses, examinations and treatments, drugs, and equipment. The rules, filtered by minimum support and minimum confidence, were then used to screen one month of claim data from another hospital, allowing some improper orders to patients to be detected. In this paper, we discuss the algorithm for generating association rules and the experiments in using these rules to screen out improper orders in health reimbursement claims.
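The screening step described above can be sketched in a few lines. This is a minimal illustration, not the paper's algorithm: claims are reduced to (diagnosis, order) pairs, rules meeting assumed minimum support and confidence are retained, and new claims not covered by any retained rule are flagged.

```python
# Minimal sketch of support/confidence rule mining for claim screening.
# Field names, thresholds, and the toy data are illustrative assumptions.
from collections import Counter

def mine_rules(claims, min_support=0.1, min_confidence=0.6):
    """claims: list of (diagnosis, order) pairs from the training set."""
    n = len(claims)
    pair_counts = Counter(claims)
    diag_counts = Counter(d for d, _ in claims)
    rules = {}
    for (diag, order), c in pair_counts.items():
        support, confidence = c / n, c / diag_counts[diag]
        if support >= min_support and confidence >= min_confidence:
            rules.setdefault(diag, set()).add(order)   # keep diag -> order
    return rules

def flag_improper(rules, claims):
    """Return claims whose order no retained rule justifies."""
    return [(d, o) for d, o in claims if o not in rules.get(d, set())]

train = [("flu", "antiviral")] * 8 + [("flu", "mri")] + [("fracture", "xray")] * 6
rules = mine_rules(train)
print(flag_improper(rules, [("flu", "mri"), ("flu", "antiviral")]))
```

Here the rare ("flu", "mri") pairing falls below the support threshold, so that order is flagged for review.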
ERIC Educational Resources Information Center
Ross, Scott W.; Sabey, Christian V.
2015-01-01
Check-In Check-Out is a Tier 2 intervention designed to reduce problem behavior and increase prosocial behavior. Although the intervention has demonstrated effects in several studies, few research efforts have considered how the intervention can be modified to support students with social skill deficits. Through a multiple baseline design across…
Aggression, Violence and Injury in Minor League Ice Hockey: Avenues for Prevention of Injury.
Cusimano, Michael D; Ilie, Gabriela; Mullen, Sarah J; Pauley, Christopher R; Stulberg, Jennifer R; Topolovec-Vranic, Jane; Zhang, Stanley
2016-01-01
In North America, more than 800,000 youth are registered in organized ice hockey leagues. Despite the many benefits of involvement, young players are at significant risk for injury. Body-checking and aggressive play are associated with high frequency of game-related injury including concussion. We conducted a qualitative study to understand why youth ice hockey players engage in aggressive, injury-prone behaviours on the ice. Semi-structured interviews were conducted with 61 minor ice hockey participants, including male and female players, parents, coaches, trainers, managers and a game official. Players were aged 13-15 playing on competitive body checking teams or on non-body checking teams. Interviews were manually transcribed, coded and analyzed for themes relating to aggressive play in minor ice hockey. Parents, coaches, teammates and the media exert a large influence on player behavior. Aggressive behavior is often reinforced by the player's social environment and justified by players to demonstrate loyalty to teammates and especially injured teammates by seeking revenge particularly in competitive, body-checking leagues. Among female and male players in non-body checking organizations, aggressive play is not reinforced by the social environment. These findings are discussed within the framework of social identity theory and social learning theory, in order to understand players' need to seek revenge and how the social environment reinforces aggressive behaviors. This study provides a better understanding of the players' motivations and environmental influences around aggressive and violent play which may be conducive to injury. The findings can be used to help design interventions aimed at reducing aggression and related injuries sustained during ice hockey and sports with similar cultures and rules.
Cusimano, Michael D; Nastis, Sofia; Zuccaro, Laura
2013-01-08
The increasing incidence of injuries related to playing ice hockey is an important public health issue. We conducted a systematic review to evaluate the effectiveness of interventions designed to reduce injuries related to aggressive acts in ice hockey. We identified relevant articles by searching electronic databases from their inception through July 2012, by using Internet search engines, and by manually searching sports medicine journals, the book series Safety in Ice Hockey and reference lists of included articles. We included studies that evaluated interventions to reduce aggression-related injuries and reported ratings of aggressive behaviour or rates of penalties or injuries. We identified 18 eligible studies. Most involved players in minor hockey leagues. Of 13 studies that evaluated changes in mandatory rules intended to lessen aggression (most commonly the restriction of body-checking), 11 observed a reduction in penalty or injury rates associated with rule changes, and 9 of these showed a statistically significant decrease. The mean number of penalties decreased by 1.2-5.9 per game, and injury rates decreased 3- to 12-fold. All 3 studies of educational interventions showed a reduction in penalty rates, but they were not powered or designed to show a change in injury rates. In 2 studies of cognitive behavioural interventions, reductions in aggressive behaviours were observed. Changes to mandatory rules were associated with reductions in penalties for aggressive acts and in injuries related to aggression among ice hockey players. Effects of educational and cognitive behavioural interventions on injury rates are less clear. Well-designed studies of multifaceted strategies that combine such approaches are required.
Design a Fuzzy Rule-based Expert System to Aid Earlier Diagnosis of Gastric Cancer.
Safdari, Reza; Arpanahi, Hadi Kazemi; Langarizadeh, Mostafa; Ghazisaiedi, Marjan; Dargahi, Hossein; Zendehdel, Kazem
2018-01-01
Screening and health check-up programs are among the most important public health priorities and should be undertaken to control dangerous diseases such as gastric cancer, which is influenced by many factors. More than 50% of gastric cancer diagnoses are made at an advanced stage, and there is currently no systematic approach for early diagnosis. Our objective was to develop a fuzzy expert system that can identify gastric cancer risk levels in individuals. The system was implemented in MATLAB; the Mamdani inference technique was applied to simulate the reasoning of experts in the field, and a total of 67 fuzzy rules were extracted as a rule base from medical experts' opinions. Fifty case scenarios were used to evaluate the system: the information from each case report was given to the system to find its risk level, and the results were compared with the experts' diagnoses. The sensitivity was 92.1% and the specificity 83.1%. These results show that it is possible to develop a system that identifies individuals at high risk for gastric cancer. Such a system can lead to earlier diagnosis, which may facilitate early treatment and reduce the gastric cancer mortality rate.
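The Mamdani style of inference used above can be shown with a toy sketch. The inputs, membership shapes and the two rules here are invented for illustration; the paper's 67-rule MATLAB system is not reproduced.

```python
# Toy Mamdani-style inference (all memberships and rules are assumptions,
# not the paper's rule base).
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def gastric_risk(age, symptom_score):
    # Fuzzify the two inputs.
    age_old = tri(age, 40, 70, 100)
    sym_high = tri(symptom_score, 5, 10, 15)
    # Rule firing strengths: min acts as fuzzy AND.
    r_high = min(age_old, sym_high)   # IF age old AND symptoms high THEN risk high
    r_low = 1.0 - r_high              # complement rule, for brevity
    # Weighted-average defuzzification over risk centers 0.2 (low) / 0.8 (high).
    return r_low * 0.2 + r_high * 0.8

print(round(gastric_risk(70, 10), 2))  # both memberships peak -> 0.8 (high risk)
```

A real Mamdani system would aggregate clipped output sets and take a centroid; the weighted average here is the common simplification.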
Design, Development and Analysis of Centrifugal Blower
NASA Astrophysics Data System (ADS)
Baloni, Beena Devendra; Channiwala, Salim Abbasbhai; Harsha, Sugnanam Naga Ramannath
2018-06-01
Centrifugal blowers are turbomachines widely used in modern industrial and domestic life. The manufacture of blowers seldom follows an optimum design solution for the individual blower. Although centrifugal blowers have been developed into highly efficient machines, their design is still based on various empirical and semi-empirical rules proposed by fan designers. Different methodologies are used to design the impeller and the other components of a blower. The objective of the present study is to examine explicit design methodologies and trace a unified design that achieves better design-point performance. This unified design methodology rests on fundamental concepts and minimal assumptions. A parametric study is also carried out on the effect of the design parameters on the pressure ratio and their interdependency in the design. A code implementing the unified design was developed in C. Numerical analysis is carried out to check the flow parameters inside the blower. Two blowers, one based on the present design and the other on an industrial design, were manufactured with a standard OEM blower manufacturing unit. The two designs are compared through experimental performance analysis per the relevant IS standard. The results suggest better efficiency and a higher flow rate for the same pressure head for the present design compared with the industrial one.
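One fundamental step such a unified design code typically starts from is the Euler turbomachinery relation. The sketch below (in Python rather than the paper's C, with illustrative numbers and an assumed slip-free, zero-inlet-swirl impeller) shows the theoretical head rise from tip speed and exit swirl; none of the values are the paper's design data.

```python
# Hedged sketch of the Euler head relation H = U2 * Cu2 / g for an ideal
# impeller with no inlet swirl. rpm, diameter, and the exit-swirl ratio
# are illustrative assumptions.
import math

G = 9.81  # gravitational acceleration (m/s^2)

def ideal_head_rise(rpm, tip_diameter_m, cu2_ratio=0.9):
    """Theoretical head (m) developed by the impeller."""
    u2 = math.pi * tip_diameter_m * rpm / 60.0   # impeller tip speed (m/s)
    cu2 = cu2_ratio * u2                         # exit tangential velocity (m/s)
    return u2 * cu2 / G

print(round(ideal_head_rise(2900, 0.4), 1))  # head for a 0.4 m impeller at 2900 rpm
```

Actual blower head would then be obtained by applying slip and loss corrections, which is where the empirical rules mentioned above enter.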
76 FR 2152 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-12
... Management and Budget for extension and approval. Rule 17a-4 requires exchange members, brokers and dealers.... These include, but are not limited to, bank statements, cancelled checks, bills receivable and payable...
Smart Aerospace eCommerce: Using Intelligent Agents in a NASA Mission Services Ordering Application
NASA Technical Reports Server (NTRS)
Moleski, Walt; Luczak, Ed; Morris, Kim; Clayton, Bill; Scherf, Patricia; Obenschain, Arthur F. (Technical Monitor)
2002-01-01
This paper describes how intelligent agent technology was successfully prototyped and then deployed in a smart eCommerce application for NASA. An intelligent software agent called the Intelligent Service Validation Agent (ISVA) was added to an existing web-based ordering application to validate complex orders for spacecraft mission services. This integration of intelligent agent technology with conventional web technology satisfies an immediate NASA need to reduce manual order-processing costs. The ISVA agent checks orders for completeness, consistency, and correctness, and notifies users of detected problems. ISVA uses NASA business rules and a knowledge base of NASA services, and is implemented using the Java Expert System Shell (Jess), a fast rule-based inference engine. The paper discusses the design of the agent and knowledge base and the prototyping and deployment approach, along with future directions, other applications, and lessons learned that may help other projects make their aerospace eCommerce applications smarter.
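The completeness/consistency style of check ISVA performs can be illustrated with a minimal stand-in. The paper's system uses Jess; this Python sketch, with invented field names and rules, only shows the declarative rule-checking pattern, not NASA's actual business rules.

```python
# Minimal rule-checker sketch in the spirit of ISVA's order validation.
# The order fields and the rules are hypothetical examples.
RULES = [
    ("missing start time",  lambda o: "start" not in o),            # completeness
    ("missing service type", lambda o: "service" not in o),         # completeness
    ("end precedes start",  lambda o: {"start", "end"} <= o.keys()
                                      and o["end"] < o["start"]),   # consistency
]

def validate(order):
    """Return the list of problems the rules detect in an order dict."""
    return [msg for msg, broken in RULES if broken(order)]

print(validate({"start": 10, "end": 5}))
```

As in a production rule engine, adding a new check means appending a rule, not editing the validation loop.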
A Data Stream Model For Runoff Simulation In A Changing Environment
NASA Astrophysics Data System (ADS)
Yang, Q.; Shao, J.; Zhang, H.; Wang, G.
2017-12-01
Runoff simulation is of great significance for water engineering design, water disaster control, and water resources planning and management in a catchment or region. A large number of methods, including concept-based process-driven models and statistics-based data-driven models, have been proposed and widely used worldwide during the past decades. Most existing models assume that the relationship between runoff and its impacting factors is stationary. In a changing environment (e.g., climate change, human disturbance), however, this relationship usually evolves over time. In this study, we propose a data stream model for runoff simulation in a changing environment. The proposed model works in three steps: learning a rule set, expanding rules, and simulation. The first step initializes a rule set. When a new observation arrives, the model checks which rule covers it and uses that rule for simulation; meanwhile, the Page-Hinckley (PH) change detection test monitors the online simulation error of each rule, and if a change is detected, the corresponding rule is removed from the rule set. In the second step, any rule that covers more than a given number of instances is expanded. In the third step, a simulation model at each leaf node is learnt with a perceptron without an activation function and is updated as new observations arrive. Taking the Fuxi River catchment as a case study, we applied the model to simulate monthly runoff. The results show that an abrupt change is detected in 1997 by the Page-Hinckley test, consistent with the historical record of flooding. In addition, the model achieves good simulation results, with an RMSE of 13.326, and outperforms many established methods. These findings demonstrate that the proposed data stream model provides a promising way to simulate runoff in a changing environment.
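The Page-Hinckley monitoring step can be sketched compactly. This is a standard textbook form of the detector, not the paper's implementation; the delta and threshold values are illustrative.

```python
# Compact Page-Hinckley change detector for a stream of simulation errors.
# delta (tolerated drift) and threshold (lambda) are illustrative settings.
class PageHinckley:
    def __init__(self, delta=0.005, threshold=1.0):
        self.delta, self.threshold = delta, threshold
        self.mean, self.cum, self.min_cum, self.n = 0.0, 0.0, 0.0, 0

    def update(self, error):
        """Feed one error value; return True once a change is detected."""
        self.n += 1
        self.mean += (error - self.mean) / self.n        # running mean
        self.cum += error - self.mean - self.delta       # cumulative deviation
        self.min_cum = min(self.min_cum, self.cum)
        return self.cum - self.min_cum > self.threshold  # PH statistic test

ph = PageHinckley()
stream = [0.1] * 50 + [0.6] * 20   # simulation error jumps after index 50
changed_at = next(i for i, e in enumerate(stream) if ph.update(e))
print(changed_at)  # change flagged a few samples after the jump at index 50
```

In the proposed model, a detection like this would trigger removal of the rule whose error stream is being monitored.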
Science Opportunity Analyzer (SOA) Version 8
NASA Technical Reports Server (NTRS)
Witoff, Robert J.; Polanskey, Carol A.; Aguinaldo, Anna Marie A.; Liu, Ning; Hofstadter, Mark D.
2013-01-01
SOA allows scientists to plan spacecraft observations. It facilitates the identification of geometrically interesting times in a spacecraft's orbit that a user can use to plan observations or instrument-driven spacecraft maneuvers. These observations can then be visualized in multiple ways in both two- and three-dimensional views. When observations have been optimized within a spacecraft's flight rules, the resulting plans can be output for use by other JPL uplink tools. Now in its eighth major version, SOA improves on these capabilities in a modern and integrated fashion. SOA consists of five major functions: Opportunity Search, Visualization, Observation Design, Constraint Checking, and Data Output. Opportunity Search is a GUI-driven interface to existing search engines that can be used to identify times when a spacecraft is in a specific geometrical relationship with other bodies in the solar system. This function can be used for advanced mission planning as well as for making last-minute adjustments to mission sequences in response to trajectory modifications. Visualization is a key aspect of SOA. The user can view observation opportunities in either a 3D representation or as a 2D map projection. Observation Design allows the user to orient the spacecraft and visualize the projection of the instrument field of view for that orientation using the same views as Opportunity Search. Constraint Checking is provided to validate various geometrical and physical aspects of an observation design. The user has the ability to easily create custom rules or to use official project-generated flight rules. This capability may also allow scientists to easily assess the cost to science if flight rule changes occur. Data Output allows the user to compute ancillary data related to an observation or to a given position of the spacecraft along its trajectory. The data can be saved as a tab-delimited text file or viewed as a graph.
SOA combines science planning functionality unique to both JPL and the sponsoring spacecraft. SOA is able to ingest JPL SPICE Kernels that are used to drive the tool and its computations. A Percy search engine is included that identifies interesting time periods for the user to build observations. When observations are built, flight-like orientation algorithms closely simulate the flight spacecraft's dynamics. SOA v8 represents a large step forward from SOA v7 in terms of quality, reliability, maintainability, efficiency, and user experience. A tailored agile development environment has been built around SOA that provides automated unit testing, continuous build and integration, a consolidated Web-based code and documentation storage environment, modern Java enhancements, and a focus on usability.
40 CFR 60.2780 - What must I include in the deviation report?
Code of Federal Regulations, 2012 CFR
2012-07-01
... Compliance Times for Commercial and Industrial Solid Waste Incineration Units Model Rule-Recordkeeping and... downtime associated with zero, span, and other routine calibration checks). (f) Whether each deviation...
Russian Defense Legislation and Russian Democracy,
1995-08-17
system denoting a President who is virtually unencumbered by the division and separation of powers and by a system of checks and balances... separation of powers and is himself able to rule by decree. This trend to concentrate power in the President and in unresponsive executive branch...enhanced activity of the President is legally sanctioned along with the concept of rule by decree, a renunciation of separation of powers, exemption from
Verification of the H2O Linelists with Theoretically Developed Tools
NASA Technical Reports Server (NTRS)
Ma, Qiancheng; Tipping, R.; Lavrentieva, N. N.; Dudaryonok, A. S.
2013-01-01
Two basic rules (i.e., the pair identity and the smooth variation rules) resulting from the properties of the energy levels and wave functions of H2O states govern how the spectroscopic parameters vary among the H2O lines within individually defined groups of lines. With these rules, for those lines involving high-j states in the same groups, variations of all their spectroscopic parameters (i.e., the transition frequency, intensity, pressure-broadened half-width, pressure-induced shift, and temperature exponent) can be well monitored. Thus, the rules can serve as simple and effective tools to screen the H2O spectroscopic data listed in the HITRAN database and verify their accuracy. By checking violations of the rules occurring among the data within the individual groups, possible errors can be picked up, and missing lines whose intensities are above the threshold can be identified. We have used these rules to check the accuracies of the spectroscopic parameters and the completeness of the linelists for several important H2O vibrational bands. Based on our results, the line frequencies in HITRAN 2008 are consistently accurate. For the line intensity, we have found that there are a substantial number of lines whose intensity values are questionable. With respect to other parameters, many mistakes have been found. These findings are consistent with the well-known fact that values of these parameters in HITRAN contain larger uncertainties. Furthermore, supplements of the missing lines, consisting of line assignments and positions, can be developed from the screening results.
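As an illustration of how the smooth-variation rule can be turned into a screening tool: within a group, a parameter listed against increasing j should change gradually, so an abnormally large second difference marks a suspect entry. The criterion below is a hypothetical sketch, not the authors' actual screening procedure:

```python
def flag_rough_points(values, tol):
    """Flag indices where the second difference of a parameter sequence
    (ordered by increasing j within one group) exceeds tol, indicating a
    possible violation of the smooth-variation rule."""
    flagged = []
    for i in range(1, len(values) - 1):
        # discrete curvature: large magnitude means an abrupt jump
        curvature = values[i - 1] - 2 * values[i] + values[i + 1]
        if abs(curvature) > tol:
            flagged.append(i)
    return flagged
```

A single bad entry inflates the second difference at itself and its two neighbours, so in practice the flagged indices would be inspected together rather than treated as three independent errors.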
Automation for pattern library creation and in-design optimization
NASA Astrophysics Data System (ADS)
Deng, Rock; Zou, Elain; Hong, Sid; Wang, Jinyan; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh; Ding, Hua; Huang, Jason
2015-03-01
Semiconductor manufacturing technologies are becoming increasingly complex with every passing node. Newer technology nodes are pushing the limits of optical lithography and requiring multiple exposures with exotic material stacks for each critical layer. All of this added complexity usually amounts to further restrictions on what can be designed, and the designs must be checked against all these restrictions in the verification and sign-off stages. Design rules are intended to capture all the manufacturing limitations such that yield can be maximized for any design adhering to all the rules. Most manufacturing steps employ some sort of model-based simulation that characterizes the behavior of each step. The lithography models play a very large part in the overall yield and design restrictions in patterning. However, lithography models are not practical to run during design creation due to their slow and prohibitive run times. Furthermore, the models are not usually given to foundry customers because of the confidential and sensitive nature of every foundry's processes. The design layout locations where a model flags unacceptable simulated results can instead be used to define pattern rules which can be shared with customers. With advanced technology nodes we see a large growth in pattern-based rules. This is because pattern matching is very fast, and the rules themselves can be very complex to describe in a standard DRC language. Therefore, the patterns are left either as pattern layout clips or abstracted into pattern-like syntax which a pattern matcher can use directly. The patterns themselves can be multi-layered, with "fuzzy" designations such that groups of similar patterns can be found using one description. The pattern matcher is often integrated with a DRC tool so that verification and signoff can be done in one step. The patterns can be layout constructs that are "forbidden", "waived", or simply low-yielding in nature.
The patterns can also contain remedies built in so that fixing happens either automatically or in a guided manner. Building a comprehensive library of patterns is a very difficult task especially when a new technology node is being developed or the process keeps changing. The main dilemma is not having enough representative layouts to use for model simulation where pattern locations can be marked and extracted. This paper will present an automatic pattern library creation flow by using a few known yield detractor patterns to systematically expand the pattern library and generate optimized patterns. We will also look at the specific fixing hints in terms of edge movements, additive, or subtractive changes needed during optimization. Optimization will be shown for both the digital physical implementation and custom design methods.
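Pattern matching is fast because each candidate window of the layout can be compared (or hashed) directly against the stored clips. A toy sketch of exact single-layer matching on a rasterized layout; real matchers operate on geometric layout data with multi-layer and "fuzzy" variants, so this is only an illustration of the principle:

```python
def find_pattern(layout, pattern):
    """Return the top-left coordinates of every exact occurrence of
    `pattern` (a small 2D clip) inside `layout` (a 2D grid of 0/1 pixels)."""
    H, W = len(layout), len(layout[0])
    h, w = len(pattern), len(pattern[0])
    hits = []
    for y in range(H - h + 1):
        for x in range(W - w + 1):
            # compare the clip against the window anchored at (y, x)
            if all(layout[y + dy][x + dx] == pattern[dy][dx]
                   for dy in range(h) for dx in range(w)):
                hits.append((y, x))
    return hits
```

Each hit location is exactly the kind of marker a DRC-integrated matcher would report for a "forbidden" clip, or hand to a fixing engine together with the clip's built-in remedy.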
Science opportunity analyzer - a multi-mission tool for planning
NASA Technical Reports Server (NTRS)
Streiffert, B. A.; Polanskey, C. A.; O'Reilly, T.; Colwell, J.
2002-01-01
For many years the diverse scientific community that supports JPL's wide variety of interplanetary space missions has needed a tool in order to plan and develop their experiments. The tool needs to be easily adapted to various mission types and portable to the user community. The Science Opportunity Analyzer, SOA, now in its third year of development, is intended to meet this need. SOA is a Java-based application that is designed to enable scientists to identify and analyze opportunities for science observations from spacecraft. It differs from other planning tools in that it does not require an in-depth knowledge of the spacecraft command system or operation modes to begin high-level planning. Users can, however, develop increasingly detailed levels of design. SOA consists of six major functions: Opportunity Search, Visualization, Observation Design, Constraint Checking, Data Output and Communications. Opportunity Search is a GUI-driven interface to existing search engines that can be used to identify times when a spacecraft is in a specific geometrical relationship with other bodies in the solar system. This function can be used for advanced mission planning as well as for making last-minute adjustments to mission sequences in response to trajectory modifications. Visualization is a key aspect of SOA. The user can view observation opportunities in either a 3D representation or as a 2D map projection. The user is given extensive flexibility to customize what is displayed in the view. Observation Design allows the user to orient the spacecraft and visualize the projection of the instrument field of view for that orientation using the same views as Opportunity Search. Constraint Checking is provided to validate various geometrical and physical aspects of an observation design. The user has the ability to easily create custom rules or to use official project-generated flight rules.
This capability may also allow scientists to easily assess the cost to science if flight rule changes occur. Data Output generates information based on the spacecraft's trajectory, opportunity search results or a created observation. The data can be viewed either in tabular format or as a graph. Finally, SOA is unique in that it is designed to be able to communicate with a variety of existing planning and sequencing tools. From the very beginning SOA was designed with the user in mind. Extensive surveys of the potential user community were conducted in order to develop the software requirements. Throughout the development period, close ties have been maintained with the science community to ensure that the tool maintains its user focus. Although development is still in its early stages, SOA is already developing a user community on the Cassini project, which is depending on this tool for its science planning. There are other tools at JPL that do various pieces of what SOA can do; however, there is no other tool which combines all these functions and presents them to the user in such a convenient, cohesive, and easy-to-use fashion.
Abstraction and Assume-Guarantee Reasoning for Automated Software Verification
NASA Technical Reports Server (NTRS)
Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.
2004-01-01
Compositional verification and abstraction are key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software, and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework, and we show how COMFORT outperforms several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.
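A representative assume-guarantee rule of the kind such frameworks can be instantiated with is the classic asymmetric rule, written in the standard ⟨assumption⟩ component ⟨property⟩ notation:

```latex
\[
\frac{\langle A \rangle\, M_1\, \langle P \rangle
      \qquad
      \langle \mathit{true} \rangle\, M_2\, \langle A \rangle}
     {\langle \mathit{true} \rangle\, M_1 \parallel M_2\, \langle P \rangle}
\]
```

The first premise says that component M1 guarantees property P whenever its environment behaves according to assumption A; the second says that M2 actually behaves according to A. The conclusion discharges the assumption for the composed system, and the role of the learning algorithm is to construct a suitable A automatically.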
National Contaminant Occurrence Database (NCOD)
This site describes water sample analytical data that EPA is currently using and has used in the past for analysis, rulemaking, and rule evaluation. The data have been checked for data quality and analyzed for national representativeness.
Enhancing the Dependability of Complex Missions Through Automated Analysis
2013-09-01
triangular or self-referential relationships. The Semantic Web Rule Language (SWRL)—a W3C-approved OWL extension—addresses some of these limitations by...SWRL extends OWL with Horn-like rules that can model complex relational structures and self-referential relationships; Prolog extends OWL+SWRL with the...8]. Additionally, multi-agent model checking has been used to verify OWL-S process models [9]. OWL is a powerful knowledge representation formalism
Typing for Conflict Detection in Access Control Policies
NASA Astrophysics Data System (ADS)
Adi, Kamel; Bouzida, Yacine; Hattak, Ikhlass; Logrippo, Luigi; Mankovskii, Serge
In this paper we present an access control model that considers both abstract and concrete access control policies specifications. Permissions and prohibitions are expressed within this model with contextual conditions. This situation may lead to conflicts. We propose a type system that is applied to the different rules in order to check for inconsistencies. If a resource is well typed, it is guaranteed that access rules to the resource contain no conflicts.
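The conflict targeted here can be pictured concretely: a permission and a prohibition that apply to the same subject, resource, and contextual condition. A simplified sketch of such a check (the rule representation is hypothetical and purely illustrative; the paper's type system is more general, handling abstract specifications and overlapping contexts):

```python
def find_conflicts(rules):
    """Return pairs of rules that permit and prohibit the same action
    for the same subject, resource, and contextual condition."""
    conflicts = []
    for i, r1 in enumerate(rules):
        for r2 in rules[i + 1:]:
            same_target = (r1["subject"] == r2["subject"]
                           and r1["resource"] == r2["resource"]
                           and r1["context"] == r2["context"])
            # one rule permits, the other denies
            opposite = {r1["effect"], r2["effect"]} == {"permit", "deny"}
            if same_target and opposite:
                conflicts.append((r1, r2))
    return conflicts
```

A type system improves on this pairwise scan by assigning each resource a type summarizing the rules that govern it, so well-typedness guarantees conflict freedom without enumerating rule pairs.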
Proof Rules for Automated Compositional Verification through Learning
NASA Technical Reports Server (NTRS)
Barringer, Howard; Giannakopoulou, Dimitra; Pasareanu, Corina S.
2003-01-01
Compositional proof systems not only enable the stepwise development of concurrent processes but also provide a basis to alleviate the state explosion problem associated with model checking. An assume-guarantee style of specification and reasoning has long been advocated to achieve compositionality. However, this style of reasoning is often non-trivial, typically requiring human input to determine appropriate assumptions. In this paper, we present novel assume-guarantee rules in the setting of finite labelled transition systems with blocking communication. We show how these rules can be applied in an iterative and fully automated fashion within a framework based on learning.
Ku, Hao-Hsiang
2015-01-01
Nowadays, people can easily use a smartphone to get the information and services they want. Hence, this study designs and proposes a Golf Swing Injury Detection and Evaluation open service platform with an ontology-oriented clustering case-based reasoning mechanism, called GoSIDE, based on Arduino and the Open Service Gateway initiative (OSGi). GoSIDE is a three-tier architecture composed of Mobile Users, Application Servers, and a Cloud-based Digital Convergence Server. A mobile user has a smartphone and Kinect sensors to detect the user's golf swing actions and to interact with iDTV. An application server runs the Intelligent Golf Swing Posture Analysis Model (iGoSPAM) to check a user's golf swing actions and to alert the user when his actions are erroneous. The Cloud-based Digital Convergence Server provides Ontology-oriented Clustering Case-based Reasoning (CBR) for Quality of Experience (OCC4QoE), which is designed to deliver QoE services by QoE-based ontology strategies, rules, and events for the user. Furthermore, GoSIDE automatically triggers OCC4QoE and delivers popular rules for a new user. Experiment results illustrate that GoSIDE can provide appropriate detections for golfers. Finally, GoSIDE can serve as a reference model for researchers and engineers.
Cusimano, Michael D.; Nastis, Sofia; Zuccaro, Laura
2013-01-01
Background: The increasing incidence of injuries related to playing ice hockey is an important public health issue. We conducted a systematic review to evaluate the effectiveness of interventions designed to reduce injuries related to aggressive acts in ice hockey. Methods: We identified relevant articles by searching electronic databases from their inception through July 2012, by using Internet search engines, and by manually searching sports medicine journals, the book series Safety in Ice Hockey and reference lists of included articles. We included studies that evaluated interventions to reduce aggression-related injuries and reported ratings of aggressive behaviour or rates of penalties or injuries. Results: We identified 18 eligible studies. Most involved players in minor hockey leagues. Of 13 studies that evaluated changes in mandatory rules intended to lessen aggression (most commonly the restriction of body-checking), 11 observed a reduction in penalty or injury rates associated with rule changes, and 9 of these showed a statistically significant decrease. The mean number of penalties decreased by 1.2–5.9 per game, and injury rates decreased 3- to 12-fold. All 3 studies of educational interventions showed a reduction in penalty rates, but they were not powered or designed to show a change in injury rates. In 2 studies of cognitive behavioural interventions, reductions in aggressive behaviours were observed. Interpretation: Changes to mandatory rules were associated with reductions in penalties for aggressive acts and in injuries related to aggression among ice hockey players. Effects of educational and cognitive behavioural interventions on injury rates are less clear. Well-designed studies of multifaceted strategies that combine such approaches are required. PMID:23209118
U.S. Coast Guard, Office of Boating Safety
... COAST GUARD ISSUES FINAL RULE – UPDATE OF OUTBOARD ENGINE WEIGHT TEST REQUIREMENTS FY18 National Nonprofit Organization Funding ... operator, passenger, or concerned individual, can make a difference. Manufacturers Is your boat safe? You can check ...
Simple method to verify OPC data based on exposure condition
NASA Astrophysics Data System (ADS)
Moon, James; Ahn, Young-Bae; Oh, Sey-Young; Nam, Byung-Ho; Yim, Dong Gyu
2006-03-01
In a world where sub-100nm lithography tools are everyday household items for device makers, devices are shrinking at a rate no one ever imagined. With the device shrinking at such a high rate, the demand placed on Optical Proximity Correction (OPC) is greater than ever. Meeting this demand at the shrinkage rate of the device requires more aggressive OPC tactics. Aggressive OPC tactics are a must for sub-100nm lithography technology, but they eventually result in greater room for OPC error and greater complexity of the OPC data. Until now, Optical Rule Check (ORC) or Design Rule Check (DRC) was used to verify this complex OPC data, but each of these methods has its pros and cons. ORC verification of OPC data is rather accurate process-wise, but inspection of a full-chip device requires a lot of money (computers, software, ...) and patience (run time). DRC has no such disadvantage, but its process-wise accuracy is poor. In this study, we created a new method for OPC data verification that combines the best of both the ORC and DRC verification methods: it inspects the biasing of the OPC data with respect to the illumination condition of the process involved. This new method of verification was applied to the 80nm-technology ISOLATION and GATE layers of a 512M DRAM device and showed accuracy equivalent to ORC inspection with the run time of DRC verification.
A novel approach of ensuring layout regularity correct by construction in advanced technologies
NASA Astrophysics Data System (ADS)
Ahmed, Shafquat Jahan; Vaderiya, Yagnesh; Gupta, Radhika; Parthasarathy, Chittoor; Marin, Jean-Claude; Robert, Frederic
2017-03-01
In advanced technology nodes, layout regularity has become a mandatory prerequisite for creating robust designs that are less sensitive to variations in the manufacturing process, in order to improve yield and minimize electrical variability. In this paper we describe a method for designing regular full-custom layouts based on design and process co-optimization. The method includes various design rule checks that can be used on the fly during leaf-cell layout development. We extract a Layout Regularity Index (LRI) from the layouts based on the jogs, alignments and pitches used in the design for any given metal layer. The Regularity Index of a layout is a direct indicator of manufacturing yield and is used to compare the relative health of different layout blocks in terms of process friendliness. The method has been deployed for the 28nm and 40nm technology nodes for Memory IP and is being extended to other IPs (IO, standard-cell). We have quantified the gain of layout regularity with the deployed method on printability and electrical characteristics by process-variation (PV) band simulation analysis, and have achieved up to 5nm reduction in the PV band.
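The abstract names the ingredients of the LRI (jogs, alignments, pitches) but not its formula. Purely as an illustration of how such an index could combine those ingredients, a hedged sketch with an entirely hypothetical formula:

```python
import statistics

def layout_regularity_index(pitches, jog_count, misaligned_count):
    """Illustrative regularity index in [0, 1]; higher is more regular.
    NOTE: hypothetical formula, not the LRI defined in the paper."""
    pitch_penalty = 0.0
    if len(pitches) > 1 and statistics.mean(pitches) > 0:
        # coefficient of variation of the pitches: 0 for a uniform grid
        pitch_penalty = statistics.pstdev(pitches) / statistics.mean(pitches)
    features = len(pitches) + jog_count + misaligned_count
    # fraction of drawn features that break regularity
    irregular = (jog_count + misaligned_count) / max(features, 1)
    return max(0.0, 1.0 - 0.5 * (pitch_penalty + irregular))
```

Whatever the exact formula, the use described in the paper is comparative: the index ranks layout blocks by process friendliness rather than predicting yield in absolute terms.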
... Savings Account Plans These plans are offered by insurance companies and other private companies approved by Medicare. Medicare Advantage Plans may also offer prescription drug coverage that follows the same rules ... card . Check all other insurance cards that you use. Call the phone number ...
Helping You Choose Quality Ambulatory Care
Helping you choose: Quality ambulatory care When you need ambulatory care, you should find out some information to help you choose the best ... the center follows rules for patient safety and quality. Go to Quality Check® at www.qualitycheck.org ...
Helping You Choose Quality Hospice Care
Helping you choose: Quality hospice care When you need hospice care, you should find out some information to help you choose the best ... the service follows rules for patient safety and quality. Go to Quality Check® at www.qualitycheck.org ...
NASA Astrophysics Data System (ADS)
Lin, Huan-Chun; Chen, Su-Chin; Tsai, Chen-Chen
2014-05-01
Engineering design should encompass both science and art. However, the art aspect is discussed too little, so that structures, check dams included, often clash with their natural surroundings. This study seeks more opportunities for check dams to harmonize with their surroundings. Based on a review of the literature in philosophy and cognitive science, we suggest a three-phase thinking process for check dam design. The first phase, conceptualization, is to list critical problems, such as the characteristics of erosion or deposition, and translate them into goal situations. The second phase, transformation, is to use cognitive methods such as analogy, association and metaphor to shape an image and prototypes. The third phase, formation, is to decide the details of the construction, such as stability and safety analysis of shapes or materials. Measured against this framework, Taiwan's technical codes and papers on check dam design mostly emphasize the first and third phases and largely neglect the second. We emphasize that designers should not ignore any phase of the framework, especially the second, or they may miss chances to find more suitable solutions. Moreover, this conceptual framework is simple to apply, and we believe it is a useful tool for designing check dams that are more harmonious with the nearby natural landscape. Key words: check dams, design thinking process, conceptualization, transformation, formation.
Automated Verification of Specifications with Typestates and Access Permissions
NASA Technical Reports Server (NTRS)
Siminiceanu, Radu I.; Catano, Nestor
2011-01-01
We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state-machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, currently underway, provides model checking support for the Plural tool, which currently supports only program verification via data flow analysis (DFA).
C code generation from Petri-net-based logic controller specification
NASA Astrophysics Data System (ADS)
Grobelny, Michał; Grobelna, Iwona; Karatkevich, Andrei
2017-08-01
The article focuses on the programming of logic controllers. It is important that the program code of a logic controller execute flawlessly according to the primary specification. In the presented approach we generate C code for an AVR microcontroller from a rule-based logical model of a control process derived from a control-interpreted Petri net. The same logical model is also used for formal verification of the specification by means of the model checking technique. The proposed rule-based logical model and formal transformation rules ensure that the obtained implementation is consistent with the already verified specification. The approach is validated by practical experiments.
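The rule-based logical model derived from an interpreted Petri net reduces to a simple firing rule: a transition fires when all its input places are marked and its guard condition on the input signals holds. A Python sketch of that rule for illustration (the paper itself generates C for an AVR microcontroller; the place and signal names here are invented):

```python
def step(marking, transitions, inputs):
    """One execution step of a rule-based interpreted-Petri-net model.

    marking:     set of currently marked places.
    transitions: list of (pre_places, post_places, guard) tuples, where
                 guard is a predicate over the input signals.
    inputs:      dict of input-signal values read this cycle.

    Simplification: all transitions enabled in the *old* marking fire,
    which is adequate for conflict-free control nets."""
    new_marking = set(marking)
    for pre, post, guard in transitions:
        if pre <= marking and guard(inputs):
            new_marking -= pre     # consume tokens from input places
            new_marking |= post    # produce tokens in output places
    return new_marking
```

Because the same rules serve as the model-checking input and as the source for C code generation, an implementation like the one sketched stays consistent with the verified specification by construction.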
NASA Astrophysics Data System (ADS)
Chen, Jing-Chao; Zhou, Yu; Wang, Xi
2018-02-01
Technical trading rules have been widely used by practitioners in financial markets for a long time. Their profitability remains controversial, and few studies consider the stationarity of the technical indicators used in trading rules. We convert MA, KDJ and Bollinger bands into stationary processes and investigate the profitability of these trading rules using three high-frequency datasets (15s, 30s and 60s) of CSI300 Stock Index Futures from January 4th, 2012 to December 31st, 2016. Several performance and risk measures are adopted to assess the practical value of all trading rules directly, while the ADF test is used to verify stationarity and the SPA test to check whether trading rules perform well due to intrinsic superiority or pure luck. The results show that there are several significant combinations of parameters for each indicator when transaction costs are not taken into consideration; once transaction costs are included, trading profits are eliminated completely. We also propose a method to reduce the risk of technical trading rules.
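One way an MA rule can be made stationary is to trade on the ratio of price to its moving average, which fluctuates around 1, instead of on the non-stationary price level itself. A minimal sketch under that assumption (the band threshold is an illustrative parameter, not a value from the paper):

```python
def ma_ratio_signals(prices, window, band=0.01):
    """Generate +1 / -1 / 0 signals from the price-to-moving-average ratio.
    Buy when price exceeds its MA by more than `band`, sell when it falls
    below by more than `band`, stay flat otherwise."""
    signals = []
    for t in range(len(prices)):
        if t + 1 < window:
            signals.append(0)          # not enough history yet
            continue
        ma = sum(prices[t + 1 - window:t + 1]) / window
        ratio = prices[t] / ma         # stationary transform of the level
        if ratio > 1 + band:
            signals.append(1)
        elif ratio < 1 - band:
            signals.append(-1)
        else:
            signals.append(0)
    return signals
```

In a study like the one above, the resulting signal series would then be assessed with performance and risk measures, and its apparent edge stress-tested against data-snooping with an SPA-style test.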
2012-04-01
offered transparency and force limitations through clear rules of the game, enabling former enemies to keep suspicions in check. It guarantees...will have to change. Of course, it will change should oil prices drop to the point of getting Russia on its knees. Beyond such a scenario, there will...and addressing security challenges in and around Europe. Today’s declaratory policy hardly matches the facts on the ground, and the rules of the
NASA Technical Reports Server (NTRS)
Stahl, H. Philip
2014-01-01
Based on 30 years of optical testing experience, a lot of mistakes, a lot of learning and a lot of experience, I have defined seven guiding principles for optical testing, regardless of how small or how large the optical testing or metrology task: Fully Understand the Task; Develop an Error Budget; Continuous Metrology Coverage; Know Where You Are; Test Like You Fly; Independent Cross-Checks; Understand All Anomalies. These rules have been applied with great success to the in-process optical testing and final specification compliance testing of the JWST mirrors.
A compiler and validator for flight operations on NASA space missions
NASA Astrophysics Data System (ADS)
Fonte, Sergio; Politi, Romolo; Capria, Maria Teresa; Giardino, Marco; De Sanctis, Maria Cristina
2016-07-01
In NASA missions, the management and programming of the flight systems is performed in a specific scripting language, SASF (Spacecraft Activity Sequence File). In order to check syntax and grammar, a compiler is needed that reports any errors found in the sequence file produced for an instrument on board the flight system. In our experience on the Dawn mission, we developed VIRV (VIR Validator), a tool that checks the syntax and grammar of SASF, runs simulations of VIR acquisitions, and reports any violations of the flight rules in the sequences produced. The project of a SASF compiler (SSC - Spacecraft Sequence Compiler) is now ready for a new implementation: generalization to different NASA missions. In fact, VIRV is a compiler for a dialect of SASF that includes VIR commands as part of the SASF language. Our goal is to produce a general compiler for SASF, in which every instrument has a library to be introduced into the compiler. The SSC can analyze a SASF file, produce a log of events, simulate the acquisitions of the selected instrument, and check its flight rules. The output of the program can be produced in GRASS GIS format and may help the operator analyze the geometry of the acquisition.
Roorda, Leo D; Green, John R; Houwink, Annemieke; Bagley, Pam J; Smith, Jane; Molenaar, Ivo W; Geurts, Alexander C
2012-06-01
To enable improved interpretation of the total score and faster scoring of the Rivermead Mobility Index (RMI) by studying item ordering or hierarchy and formulating start-and-stop rules in patients after stroke. Cohort study. Rehabilitation center in the Netherlands; stroke rehabilitation units and the community in the United Kingdom. Item hierarchy of the RMI was studied in an initial group of patients (n=620; mean age ± SD, 69.2±12.5y; 297 [48%] men; 304 [49%] left hemisphere lesion, and 269 [43%] right hemisphere lesion), and the adequacy of the item hierarchy-based start-and-stop rules was checked in a second group of patients (n=237; mean age ± SD, 60.0±11.3y; 139 [59%] men; 103 [44%] left hemisphere lesion, and 93 [39%] right hemisphere lesion) undergoing rehabilitation after stroke. Not applicable. Mokken scale analysis was used to investigate the fit of the double monotonicity model, indicating hierarchical item ordering. The percentages of patients with a difference between the RMI total score and the scores based on the start-and-stop rules were calculated to check the adequacy of these rules. The RMI had good fit of the double monotonicity model (coefficient H(T)=.87). The interpretation of the total score improved. Item hierarchy-based start-and-stop rules were formulated. The percentages of patients with a difference between the RMI total score and the score based on the recommended start-and-stop rules were 3% and 5%, respectively. Ten of the original 15 items had to be scored after applying the start-and-stop rules. Item hierarchy was established, enabling improved interpretation and faster scoring of the RMI. Copyright © 2012 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
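With a hierarchical (Guttman-like) item ordering, start-and-stop rules let a rater skip items: items easier than the start item are credited as passed, and scoring stops once the stop criterion is met, with the remaining harder items scored as failed. A hypothetical sketch of such scoring (the RMI's actual start-and-stop rules are formulated in the paper and differ in detail):

```python
def score_with_rules(responses, start, stop_after_fails=2):
    """Score hierarchically ordered dichotomous items with start-and-stop
    rules.  responses: 0/1 answers ordered from easiest to hardest item.
    start: index of the first item actually administered; earlier (easier)
    items are credited as passed.  Scoring stops after `stop_after_fails`
    consecutive failures; remaining harder items count as failed."""
    total = start                      # credit for the easier, skipped items
    consecutive_fails = 0
    for r in responses[start:]:
        if consecutive_fails >= stop_after_fails:
            break                      # stop rule: harder items scored 0
        total += r
        consecutive_fails = 0 if r else consecutive_fails + 1
    return total
```

The adequacy check reported above amounts to comparing such rule-based totals with full-administration totals and counting the patients for whom they differ.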
NASA Technical Reports Server (NTRS)
Riley, Gary
1991-01-01
The C Language Integrated Production System (CLIPS) is a forward chaining rule based language developed by NASA. CLIPS was designed specifically to provide high portability, low cost, and easy integration with external systems. The current release of CLIPS, version 4.3, is being used by over 2500 users throughout the public and private community. The primary addition to the next release of CLIPS, version 5.0, will be the CLIPS Object Oriented Language (COOL). The major capabilities of COOL are: class definition with multiple inheritance and no restrictions on the number, types, or cardinality of slots; message passing which allows procedural code bundled with an object to be executed; and query functions which allow groups of instances to be examined and manipulated. In addition to COOL, numerous other enhancements were added to CLIPS including: generic functions (which allow different pieces of procedural code to be executed depending upon the types or classes of the arguments); integer and double precision data type support; multiple conflict resolution strategies; global variables; logical dependencies; type checking on facts; full ANSI compiler support; and incremental reset for rules.
31 CFR 235.1 - Scope of regulations.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., DEPARTMENT OF THE TREASURY FINANCIAL MANAGEMENT SERVICE ISSUANCE OF SETTLEMENT CHECKS FOR FORGED CHECKS DRAWN... checks for checks drawn on designated depositaries of the United States by accountable officers of the...
Code of Federal Regulations, 2014 CFR
2014-07-01
... Small Municipal Waste Combustion Units Constructed on or Before August 30, 1999 Model Rule-Continuous... alternate methods for collecting data when systems malfunction or when repairs, calibration checks, or zero...
Code of Federal Regulations, 2012 CFR
2012-07-01
... Small Municipal Waste Combustion Units Constructed on or Before August 30, 1999 Model Rule-Continuous... alternate methods for collecting data when systems malfunction or when repairs, calibration checks, or zero...
Code of Federal Regulations, 2013 CFR
2013-07-01
... Small Municipal Waste Combustion Units Constructed on or Before August 30, 1999 Model Rule-Continuous... alternate methods for collecting data when systems malfunction or when repairs, calibration checks, or zero...
NASA Technical Reports Server (NTRS)
Harrison, P. Ann
1993-01-01
All the NASA VEGetation Workbench (VEG) goals except the Learning System provide the scientist with several different techniques. When VEG is run, rules assist the scientist in selecting the best of the available techniques to apply to the sample of cover type data being studied. The techniques are stored in the VEG knowledge base. A new interface that enables the scientist to add techniques to VEG without assistance from the developer was designed and implemented. This interface does not require the scientist to have a thorough knowledge of Knowledge Engineering Environment (KEE) by Intellicorp or a detailed knowledge of the structure of VEG. The interface prompts the scientist to enter the required information about the new technique, including the Common Lisp functions for executing the technique and the left-hand side of the rule that causes the technique to be selected. A template for each function and rule is displayed, along with detailed instructions about the arguments of the functions, the values they should return, and the format of the rule. Checks are made to ensure that the required data were entered, the functions compiled correctly, and the rule parsed correctly before the new technique is stored. The additional techniques are stored separately from the VEG knowledge base and are not normally loaded when the knowledge base is loaded; the interface gives the scientist the option of adding all previously defined new techniques before running VEG. When the techniques are added, the units needed to store them are created automatically in the correct places in the VEG knowledge base, and the methods file containing the functions they require is loaded.
New rule units are created to store the new rules, and the interface that allows the scientist to select which techniques to use is updated automatically to include the new techniques. Task H was completed: the interface that allows the scientist to add techniques to VEG was implemented and comprehensively tested. The Common Lisp code for the Add Techniques system is listed in Appendix A.
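The validate-before-store workflow described above can be sketched generically: check that required fields are present, that the supplied function compiles, and that the selection rule parses, and register the technique only if every check passes. All names and the toy rule syntax below are invented for illustration; the actual VEG interface works with Common Lisp functions and KEE rules.

```python
# Hypothetical sketch of validating a user-supplied technique before
# storing it: required data entered, function compiles, rule parses.
def validate_technique(name, source_code, rule_text, registry):
    errors = []
    if not name:
        errors.append("technique name is required")
    try:
        compile(source_code, "<technique>", "exec")  # does the function compile?
    except SyntaxError as e:
        errors.append(f"function does not compile: {e}")
    if "=>" not in rule_text:                        # toy 'rule parses' check
        errors.append("selection rule is malformed")
    if not errors:
        registry[name] = (source_code, rule_text)    # store only if clean
    return errors

reg = {}
errs = validate_technique(
    "ndvi-fit",
    "def run(sample):\n    return sample",
    "cover-type == grass => ndvi-fit",
    reg,
)
```

Rejecting a technique with a syntax error or a malformed rule at entry time keeps the knowledge base loadable, which is the point of performing the checks before storage.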
Double patterning from design enablement to verification
NASA Astrophysics Data System (ADS)
Abercrombie, David; Lacour, Pat; El-Sewefy, Omar; Volkov, Alex; Levine, Evgueni; Arb, Kellen; Reid, Chris; Li, Qiao; Ghosh, Pradiptya
2011-11-01
Litho-etch-litho-etch (LELE) is the double patterning (DP) technology of choice for 20 nm contact, via, and lower metal layers. We discuss the unique design and process characteristics of LELE DP, the challenges they present, and various solutions.
∘ We examine DP design methodologies, current DP conflict feedback mechanisms, and how they can help designers identify and resolve conflicts.
∘ In place and route (P&R), the placement engine must now be aware of the assumptions made during IP cell design and use placement directives provided by the library designer. We examine the new effects DP introduces in detail routing, discuss how multiple choices of LELE cuts and allowances can lead to different solutions, and describe new capabilities required by detail routers and P&R engines.
∘ We discuss why LELE DP cuts and overlaps are critical to optical process correction (OPC), and how a hybrid mechanism of rule- and model-based overlap generation can provide a fast and effective solution.
∘ With two litho-etch steps, mask misalignment and image rounding are now verification considerations. We present enhancements to the OPCVerify engine that check for pinching and bridging in the presence of DP overlay errors and acute angles.
50 CFR 17.84 - Special rules-vertebrates.
Code of Federal Regulations, 2011 CFR
2011-10-01
... too steep to support populations of woundfin. (7) The reintroduced populations will be checked... population grows from the point of being established toward the maximum number that its habitat can support... decision to terminate the translocated population. A joint State-Service consultation will determine when...
ERIC Educational Resources Information Center
Dervarics, Charles
1993-01-01
School transportation experts consider the following essential rules in purchasing school buses: know your geography; know federal and state regulations; write specifications carefully; plan for replacement and growth; check out prospective bidders; keep up with recent trends; and remain flexible. New Jersey enacted a law last year requiring seat…
An easy-to-use diagnostic system development shell
NASA Technical Reports Server (NTRS)
Tsai, L. C.; Ross, J. B.; Han, C. Y.; Wee, W. G.
1987-01-01
The Diagnostic System Development Shell (DSDS), an expert system development shell for diagnostic systems, is described. The major objective of building the DSDS is to create a very easy-to-use and friendly environment for knowledge engineers and end users. The DSDS is written in OPS5 and Common Lisp. It runs on a VAX/VMS system. A set of domain-independent, generalized rules is built into the DSDS, so users need not be concerned with building the rules. The facts are explicitly represented in a unified format. A powerful check facility, which helps the user find errors in the created knowledge bases, is provided. A judgement facility and other useful facilities are also available. A diagnostic system based on the DSDS is question driven and can call or be called by other knowledge-based systems written in OPS5 and Common Lisp. A prototype diagnostic system for diagnosing a Philips constant-potential X-ray system has been built using the DSDS.
Propel: Tools and Methods for Practical Source Code Model Checking
NASA Technical Reports Server (NTRS)
Mansouri-Samani, Massoud; Mehlitz, Peter; Markosian, Lawrence; OMalley, Owen; Martin, Dale; Moore, Lantz; Penix, John; Visser, Willem
2003-01-01
The work reported here is an overview and snapshot of a project to develop practical model checking tools for in-the-loop verification of NASA's mission-critical, multithreaded programs in Java and C++. Our strategy is to develop and evaluate both a design concept that enables the application of model checking technology to C++ and Java, and a model checking toolset for C++ and Java. The design concept and the associated model checking toolset are together called Propel. It builds upon the Java PathFinder (JPF) tool, an explicit-state model checker for Java applications developed by the Automated Software Engineering group at NASA Ames Research Center. The design concept that we are developing is Design for Verification (D4V), an adaptation of existing best design practices that has the desired side effect of enhancing verifiability by improving modularity and decreasing accidental complexity. D4V, we believe, enhances the applicability of a variety of V&V approaches; we are developing the concept in the context of model checking. The model checking toolset, Propel, is based on extending JPF to handle C++. Our principal tasks in developing the toolset are to build a translator from C++ to Java, productize JPF, and evaluate the toolset in the context of D4V. Throughout these tasks we are testing Propel's capabilities on customer applications.
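The core idea of an explicit-state model checker like JPF can be shown with a toy sketch: enumerate reachable states breadth-first and report a counterexample path to any state violating a safety property. Real JPF explores JVM bytecode states with sophisticated state matching; the abstract two-counter system below is an invented stand-in.

```python
# Toy explicit-state safety checking: BFS over a transition system,
# returning a shortest counterexample path to an unsafe state, or None.
from collections import deque

def check_safety(initial, successors, is_safe):
    seen = {initial}
    queue = deque([[initial]])
    while queue:
        path = queue.popleft()
        state = path[-1]
        if not is_safe(state):
            return path                 # counterexample found
        for nxt in successors(state):
            if nxt not in seen:         # state matching: visit each state once
                seen.add(nxt)
                queue.append(path + [nxt])
    return None                         # property holds on all reachable states

# Hypothetical system: two counters, each may step up to 2;
# 'unsafe' models both processes reaching a critical section.
def succ(s):
    a, b = s
    return [(min(a + 1, 2), b), (a, min(b + 1, 2))]

cex = check_safety((0, 0), succ, lambda s: s != (2, 2))
```

Because the search is breadth-first and each state is enqueued only once, the returned counterexample is a shortest path, which is what makes such traces useful for debugging.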
Kiesel, Andrea; Kunde, Wilfried; Pohl, Carsten; Berner, Michael P; Hoffmann, Joachim
2009-01-01
Expertise in a certain stimulus domain enhances perceptual capabilities. In the present article, the authors investigate whether expertise improves perceptual processing to an extent that allows complex visual stimuli to bias behavior unconsciously. Expert chess players judged whether a target chess configuration entailed a checking configuration. These displays were preceded by masked prime configurations that either represented a checking or a nonchecking configuration. Chess experts, but not novice chess players, revealed a subliminal response priming effect, that is, faster responding when prime and target displays were congruent (both checking or both nonchecking) rather than incongruent. Priming generalized to displays that were not used as targets, ruling out simple repetition priming effects. Thus, chess experts were able to judge unconsciously presented chess configurations as checking or nonchecking. A 2nd experiment demonstrated that experts' priming does not occur for simpler but uncommon chess configurations. The authors conclude that long-term practice prompts the acquisition of visual memories of chess configurations with integrated form-location conjunctions. These perceptual chunks enable complex visual processing outside of conscious awareness.
MOM: A meteorological data checking expert system in CLIPS
NASA Technical Reports Server (NTRS)
Odonnell, Richard
1990-01-01
Meteorologists have long faced the problem of verifying the data they use. Experience shows that there is a sizable number of errors in the data reported by meteorological observers. This is unacceptable for computer forecast models, which depend on accurate data for accurate results. Most errors that occur in meteorological data are obvious to the meteorologist, but time constraints prevent hand-checking. For this reason, it is necessary to have a 'front end' to the computer model to ensure the accuracy of input. Various approaches to automatic data quality control have been developed by several groups. MOM is a rule-based system implemented in CLIPS and utilizing 'consistency checks' and 'range checks'. The system is generic in the sense that it knows some meteorological principles, regardless of specific station characteristics. Specific constraints kept as CLIPS facts in a separate file provide for system flexibility. Preliminary results show that the expert system has detected some inconsistencies not noticed by a local expert.
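The two check types MOM relies on can be sketched as follows; the fields and numeric limits here are illustrative assumptions, not MOM's actual station constraints. A range check bounds a single reported value, while a consistency check relates values to each other (a dew point cannot exceed the air temperature).

```python
# Sketch of range checks plus a consistency check on one observation.
# Limits are invented for illustration.
RANGE_LIMITS = {"temp_c": (-90.0, 60.0), "dewpoint_c": (-90.0, 60.0)}

def check_observation(obs):
    problems = []
    for field, (lo, hi) in RANGE_LIMITS.items():   # range checks
        if not lo <= obs[field] <= hi:
            problems.append(f"{field} out of range")
    if obs["dewpoint_c"] > obs["temp_c"]:          # consistency check
        problems.append("dew point exceeds temperature")
    return problems

flagged = check_observation({"temp_c": 12.0, "dewpoint_c": 15.0})
```

Keeping the limits in a data table rather than in the rules themselves mirrors the abstract's point about station-specific constraints living in a separate facts file.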
NASA Astrophysics Data System (ADS)
Lee, Kyu J.; Kunii, T. L.; Noma, T.
1993-01-01
In this paper, we propose a syntactic pattern recognition method for non-schematic drawings, based on a new attributed graph grammar with flexible embedding. In our graph grammar, the embedding rule permits the nodes of a guest graph to be arbitrarily connected with the nodes of a host graph. The ambiguity caused by this flexible embedding is controlled by evaluating synthesized attributes and checking context sensitivity. To integrate parsing with the synthesized-attribute evaluation and the context-sensitivity check, we also develop a bottom-up parsing algorithm.
CMM Interim Check Design of Experiments (U)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Montano, Joshua Daniel
2015-07-29
Coordinate Measuring Machines (CMM) are widely used in industry, throughout the Nuclear Weapons Complex and at Los Alamos National Laboratory (LANL) to verify part conformance to design definition. Calibration cycles for CMMs at LANL are predominantly one year in length and include a weekly interim check to reduce risk. The CMM interim check makes use of Renishaw's Machine Checking Gauge, an off-the-shelf product that simulates a large sphere within a CMM's measurement volume and allows for error estimation. As verification of the interim check process, a design of experiments investigation was proposed to test two key factors (location and inspector). The results from the two-factor factorial experiment showed that location influenced results more than the inspector or the interaction.
75 FR 31665 - Electronic Fund Transfers
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-04
...-In Requirement 17(b)(1), 17(b)(4)--General Rule and Scope of Opt-In; Notice and Opt-In Requirements... consumers should retain the responsibility to balance their checking accounts. For example, an institution....17(b)(1) contains a general prohibition on charging overdraft fees unless certain requirements are...
26 CFR 148.1-5 - Constructive sale price.
Code of Federal Regulations, 2012 CFR
2012-04-01
... of articles listed in Chapter 32 of the Internal Revenue Code (other than combinations) that embraces... section. For the rule applicable to combinations of two or more articles, see subdivision (iv) of this..., perforating, cutting, and dating machines, and other check protector machine devices; (o) Taxable cash...
26 CFR 148.1-5 - Constructive sale price.
Code of Federal Regulations, 2013 CFR
2013-04-01
... of articles listed in Chapter 32 of the Internal Revenue Code (other than combinations) that embraces... section. For the rule applicable to combinations of two or more articles, see subdivision (iv) of this..., perforating, cutting, and dating machines, and other check protector machine devices; (o) Taxable cash...
26 CFR 148.1-5 - Constructive sale price.
Code of Federal Regulations, 2014 CFR
2014-04-01
... of articles listed in Chapter 32 of the Internal Revenue Code (other than combinations) that embraces... section. For the rule applicable to combinations of two or more articles, see subdivision (iv) of this..., perforating, cutting, and dating machines, and other check protector machine devices; (o) Taxable cash...
Intelligent Data Visualization for Cross-Checking Spacecraft System Diagnosis
NASA Technical Reports Server (NTRS)
Ong, James C.; Remolina, Emilio; Breeden, David; Stroozas, Brett A.; Mohammed, John L.
2012-01-01
Any reasoning system is fallible, so crew members and flight controllers must be able to cross-check automated diagnoses of spacecraft or habitat problems by considering alternate diagnoses and analyzing related evidence. Cross-checking improves diagnostic accuracy because people can apply information processing heuristics, pattern recognition techniques, and reasoning methods that the automated diagnostic system may not possess. Over time, cross-checking also enables crew members to become comfortable with how the diagnostic reasoning system performs, so the system can earn the crew's trust. We developed intelligent data visualization software that helps users cross-check automated diagnoses of system faults more effectively. The user interface displays scrollable arrays of timelines and time-series graphs, which are tightly integrated with an interactive, color-coded system schematic to show important spatial-temporal data patterns. Signal processing and rule-based diagnostic reasoning automatically identify alternate hypotheses and data patterns that support or rebut the original and alternate diagnoses. A color-coded matrix display summarizes the supporting or rebutting evidence for each diagnosis, and a drill-down capability enables crew members to quickly view graphs and timelines of the underlying data. This system demonstrates that modest amounts of diagnostic reasoning, combined with interactive, information-dense data visualizations, can accelerate system diagnosis and cross-checking.
The design and application of a Transportable Inference Engine (TIE1)
NASA Technical Reports Server (NTRS)
Mclean, David R.
1986-01-01
A Transportable Inference Engine (TIE1) system has been developed by the author as part of the Interactive Experimenter Planning System (IEPS) task which is involved with developing expert systems in support of the Spacecraft Control Programs Branch at Goddard Space Flight Center in Greenbelt, Maryland. Unlike traditional inference engines, TIE1 is written in the C programming language. In the TIE1 system, knowledge is represented by a hierarchical network of objects which have rule frames. The TIE1 search algorithm uses a set of strategies, including backward chaining, to obtain the values of goals. The application of TIE1 to a spacecraft scheduling problem is described. This application involves the development of a strategies interpreter which uses TIE1 to do constraint checking.
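Backward chaining, one of the strategies in TIE1's search algorithm, can be sketched compactly: to establish a goal, find a rule concluding it and recursively establish its premises, bottoming out in known facts. The rule and fact names below are invented; TIE1's actual knowledge is a hierarchical network of objects with rule frames.

```python
# Minimal backward-chaining sketch with a cycle guard.
def prove(goal, facts, rules, seen=frozenset()):
    if goal in facts:
        return True                      # goal is a known fact
    if goal in seen:
        return False                     # avoid looping on cyclic rules
    for premises, conclusion in rules:
        if conclusion == goal and all(
            prove(p, facts, rules, seen | {goal}) for p in premises
        ):
            return True                  # all premises established
    return False

# Hypothetical scheduling-flavored rules: premises -> conclusion.
rules = [
    (["pass-window-open", "target-visible"], "schedule-contact"),
    (["antenna-free"], "pass-window-open"),
]
ok = prove("schedule-contact", {"antenna-free", "target-visible"}, rules)
```

The recursion works goal-to-subgoal, which is the opposite direction from the forward-chaining loop of production systems such as CLIPS, and is well suited to constraint checking where only one goal's truth matters.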
SCUT: clinical data organization for physicians using pen computers.
Wormuth, D. W.
1992-01-01
The role of computers in assisting physicians with patient care is rapidly advancing. One of the significant obstacles to efficient use of computers in patient care has been the unavailability of reasonably configured portable computers. Lightweight portable computers are becoming more attractive as physician data-management devices, but still pose a significant problem with bedside use. The advent of computers designed to accept input from a pen and having no keyboard present a usable computer platform to enable physicians to perform clinical computing at the bedside. This paper describes a prototype system to maintain an electronic "scut" sheet. SCUT makes use of pen-input and background rule checking to enhance patient care. GO Corporation's PenPoint Operating System is used to implement the SCUT project. PMID:1483012
Miller, P L; Frawley, S J; Sayward, F G; Yasnoff, W A; Duncan, L; Fleming, D W
1997-06-01
IMM/Serve is a computer program which implements the clinical guidelines for childhood immunization. IMM/Serve accepts as input a child's immunization history. It then indicates which vaccinations are due and which vaccinations should be scheduled next. The clinical guidelines for immunization are quite complex and are modified quite frequently. As a result, it is important that IMM/Serve's knowledge be represented in a format that facilitates the maintenance of that knowledge as the field evolves over time. To achieve this goal, IMM/Serve uses four representations for different parts of its knowledge base: (1) Immunization forecasting parameters that specify the minimum ages and wait-intervals for each dose are stored in tabular form. (2) The clinical logic that determines which set of forecasting parameters applies for a particular patient in each vaccine series is represented using if-then rules. (3) The temporal logic that combines dates, ages, and intervals to calculate recommended dates, is expressed procedurally. (4) The screening logic that checks each previous dose for validity is performed using a decision table that combines minimum ages and wait intervals with a small amount of clinical logic. A knowledge maintenance tool, IMM/Def, has been developed to help maintain the rule-based logic. The paper describes the design of IMM/Serve and the rationale and role of the different forms of knowledge used.
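The combination of tabular forecasting parameters and procedural temporal logic described above can be sketched as follows. The parameter values are illustrative placeholders, not the actual clinical guidelines, and the series name is used only as a table key.

```python
# Hedged sketch: the next dose is due at the later of (birth date +
# minimum age) and (previous dose + minimum wait interval).
# Parameter values below are invented, NOT clinical guidance.
from datetime import date, timedelta

FORECAST = {"DTaP": {"min_age_days": 42, "min_wait_days": 28}}

def next_due(series, birth, last_dose=None):
    p = FORECAST[series]
    due = birth + timedelta(days=p["min_age_days"])      # minimum-age constraint
    if last_dose is not None:
        wait = last_dose + timedelta(days=p["min_wait_days"])
        due = max(due, wait)                             # wait-interval constraint
    return due

d = next_due("DTaP", date(2024, 1, 1), last_dose=date(2024, 3, 1))
```

Keeping the numbers in a table and the date arithmetic in a procedure mirrors the maintenance argument in the abstract: when guidelines change, only the table entries need editing.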
78 FR 75528 - Federal Government Participation in the Automated Clearing House
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-12
... requirements for authorization of ACH entries, adopting the language of Regulation E that an authorization must... revised specific language within the NACHA Operating Rules regarding the application and expiration of a... that prevents automated check processing or creating of an image that may be used to produce a...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-27
... Activities: Requests for Comments; Clearance of Renewed Approval of Information Collection: Training and..., 2011, vol. 76, no. 207, page 66349. The rule allows some experienced pilots who would otherwise qualify...: 2120-0600 . Title: Training and Qualification Requirements for Check Airmen and Flight Instructors...
2011-03-11
ORLANDO, Fla. – NASA Kennedy Space Center Director Bob Cabana checks out the robot designed by the Bionic Tigers team at the regional FIRST robotics competition at the University of Central Florida in Orlando. The team is made up of students from Cocoa High School and Holy Trinity Episcopal Academy along the Space Coast in Florida. NASA's Launch Services Program based at Kennedy is a sponsor of the team. The Bionic Tigers finished seventh in the competition called "For Inspiration and Recognition of Science and Technology," or FIRST, among about 60 high school teams hoping to advance to the national robotics championship. FIRST, founded in 1989, is a non-profit organization that designs accessible, innovative programs to build self-confidence, knowledge and life skills while motivating young people to pursue academic opportunities. The robotics competition challenges teams of high school students and their mentors to solve a common problem in a six-week timeframe using a standard kit of parts and a common set of rules. Photo credit: NASA/Glenn Benson
Sci-Fin: Visual Mining Spatial and Temporal Behavior Features from Social Media
Pu, Jiansu; Teng, Zhiyao; Gong, Rui; Wen, Changjiang; Xu, Yang
2016-01-01
Check-in records are usually available in social services, which offer us the opportunity to capture and analyze users’ spatial and temporal behaviors. Mining such behavior features is essential to social analysis and business intelligence. However, the complexity and incompleteness of check-in records bring challenges to achieve such a task. Different from the previous work on social behavior analysis, in this paper, we present a visual analytics system, Social Check-in Fingerprinting (Sci-Fin), to facilitate the analysis and visualization of social check-in data. We focus on three major components of user check-in data: location, activity, and profile. Visual fingerprints for location, activity, and profile are designed to intuitively represent the high-dimensional attributes. To visually mine and demonstrate the behavior features, we integrate WorldMapper and Voronoi Treemap into our glyph-like designs. Such visual fingerprint designs offer us the opportunity to summarize the interesting features and patterns from different check-in locations, activities and users (groups). We demonstrate the effectiveness and usability of our system by conducting extensive case studies on real check-in data collected from a popular microblogging service. Interesting findings are reported and discussed at last. PMID:27999398
NASA Technical Reports Server (NTRS)
Berk, A.; Temkin, A.
1985-01-01
A sum rule is derived for the auxiliary eigenvalues of an equation whose eigenspectrum pertains to projection operators which describe electron scattering from multielectron atoms and ions. The sum rule's right-hand side depends on an integral involving the target-system eigenfunctions. The sum rule is checked for several approximations of the two-electron target. It is shown that target functions which have a unit eigenvalue in their auxiliary eigenspectrum do not give rise to well-defined projection operators except through a limiting process. For Hylleraas target approximations, the auxiliary equations are shown to contain an infinite spectrum. However, using a Rayleigh-Ritz variational principle, it is shown that a comparatively simple approximation can exhaust the sum rule to better than five significant figures. The auxiliary Hylleraas equation is greatly simplified by conversion to a square-root equation containing the same eigenfunction spectrum and from which the required eigenvalues are trivially recovered by squaring.
General Purpose Data-Driven Online System Health Monitoring with Applications to Space Operations
NASA Technical Reports Server (NTRS)
Iverson, David L.; Spirkovska, Lilly; Schwabacher, Mark
2010-01-01
Modern space transportation and ground support system designs are becoming increasingly sophisticated and complex. Determining the health state of these systems using traditional parameter limit checking, or model-based or rule-based methods is becoming more difficult as the number of sensors and component interactions grows. Data-driven monitoring techniques have been developed to address these issues by analyzing system operations data to automatically characterize normal system behavior. System health can be monitored by comparing real-time operating data with these nominal characterizations, providing detection of anomalous data signatures indicative of system faults, failures, or precursors of significant failures. The Inductive Monitoring System (IMS) is a general purpose, data-driven system health monitoring software tool that has been successfully applied to several aerospace applications and is under evaluation for anomaly detection in vehicle and ground equipment for next generation launch systems. After an introduction to IMS application development, we discuss these NASA online monitoring applications, including the integration of IMS with complementary model-based and rule-based methods. Although the examples presented in this paper are from space operations applications, IMS is a general-purpose health-monitoring tool that is also applicable to power generation and transmission system monitoring.
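The data-driven idea behind IMS can be illustrated with a deliberately simplified sketch: characterize nominal behavior from training data, then score live sensor vectors by their distance from that characterization, with zero meaning "inside the nominal family." IMS actually clusters training data into many groups; the single min/max envelope below is an invented simplification.

```python
# Conceptual sketch of data-driven monitoring: learn a nominal envelope
# from training vectors, then score live vectors by distance outside it.
def fit_envelope(train):
    """One nominal box: per-sensor min and max over training data."""
    lows = [min(col) for col in zip(*train)]
    highs = [max(col) for col in zip(*train)]
    return lows, highs

def deviation(box, vec):
    """0.0 inside the envelope; otherwise summed per-sensor excursion."""
    lows, highs = box
    return sum(max(lo - v, 0.0, v - hi) for lo, hi, v in zip(lows, highs, vec))

# Invented two-sensor training data (e.g. temperature, pressure).
box = fit_envelope([[20.0, 3.1], [22.0, 3.0], [21.0, 3.3]])
score_nominal = deviation(box, [21.5, 3.2])  # inside the envelope
score_anomaly = deviation(box, [30.0, 3.2])  # first sensor out of family
```

Unlike per-sensor limit checking, a learned multi-sensor characterization captures how parameters vary together, which is the advantage the abstract attributes to data-driven monitoring.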
Design rules and reality check for carbon-based ultracapacitors
NASA Astrophysics Data System (ADS)
Eisenmann, Erhard T.
1995-04-01
Design criteria for carbon-based ultracapacitors have been determined for specified energy and power requirements, using the geometry of the components and such material properties as density, porosity, and conductivity as parameters, while also considering chemical compatibility. This analysis shows that the weights of active and inactive components of the capacitor structure must be carefully balanced for maximum energy and power density. When applied to nonaqueous electrolytes, the design rules for a 5 Wh/kg device call for porous carbon with a specific capacitance of about 30 F/cu cm. This performance is not achievable with pure, electrostatic double-layer capacitance: in nonaqueous electrolyte, double-layer capacitance is only 5 to 30% of that observed in aqueous electrolyte. Tests also showed that nonaqueous electrolytes have a diminished capability to access micropores in activated carbon, in one case yielding a capacitance of less than 1 F/cu cm for carbon that had 100 F/cu cm in aqueous electrolyte. With negative results on nonaqueous electrolytes dominating the present study, the obvious conclusion is to concentrate on aqueous systems. Only aqueous double-layer capacitors offer adequate electrostatic charging characteristics, which is the basis for high power performance. There are many opportunities for further advancing aqueous double-layer capacitors, one being the use of highly activated carbon films, as opposed to powders, fibers, and foams. While the manufacture of carbon films is still costly, and while the energy and power density of the resulting devices may not meet the optimistic goals that have been proposed, this technology could produce true double-layer capacitors with significantly improved performance and large commercial potential.
A novel HTS magnetic levitation dining table
NASA Astrophysics Data System (ADS)
Lu, Yiyun; Huang, Huiying
2018-05-01
High temperature superconducting (HTS) bulks can levitate stably above, or be suspended below, a permanent magnet. Many potential applications of HTS bulks have been proposed by researchers, but few real applications have been reported to date. A complete small-scale HTS magnetic levitation table is proposed in this paper. The HTS magnetic levitation table includes an annular HTS magnetic levitation system composed of an annular HTS bulk array and an annular permanent magnet guideway (PMG). The annular PMG and the annular cryogenic vessel, which maintains the low-temperature environment of the HTS bulk array, are designed; 62 YBCO bulks are located at the bottom of the annular vessel. A 3D finite element model is used to design the HTS bulk magnetic levitation system. Equivalent levitation- and guidance-force calculation rules are proposed for the stability of the annular HTS magnetic levitation system. With the proposed method, levitation- and guidance-force curves of a single YBCO bulk above the PMG can be obtained. The method can also assist PMG design by checking whether a candidate PMG meets the basic requirements of the HTS magnetic levitation table.
31 CFR 235.3 - Settlement of claims.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., DEPARTMENT OF THE TREASURY FINANCIAL MANAGEMENT SERVICE ISSUANCE OF SETTLEMENT CHECKS FOR FORGED CHECKS DRAWN... respect to a check drawn on designated depositaries of the United States, in dollars or in foreign...
The study on dynamic cadastral coding rules based on kinship relationship
NASA Astrophysics Data System (ADS)
Xu, Huan; Liu, Nan; Liu, Renyi; Lu, Jingfeng
2007-06-01
Cadastral coding rules are an important supplement to the existing national and local standard specifications for building a cadastral database. After analyzing the course of cadastral change, especially parcel change, with the method of object-oriented analysis, a set of dynamic cadastral coding rules based on kinship relationships corresponding to cadastral change is put forward, and a coding format composed of street code, block code, father parcel code, child parcel code, and grandchild parcel code is worked out within the county administrative area. The coding rules have been applied in the development of an urban cadastral information system called "ReGIS", which is not only able to generate the cadastral code automatically according to both the type of parcel change and the coding rules, but is also capable of checking whether the code is spatiotemporally unique before the parcel is stored in the database. The system has been used in several cities of Zhejiang Province and has received a favorable response, which verifies the feasibility and effectiveness of the coding rules to some extent.
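The hierarchical code format described (street, block, father parcel, child, grandchild) and the uniqueness check applied before storage can be sketched as follows. The segment widths are invented for illustration; the paper does not specify them.

```python
# Sketch of a kinship-based cadastral code: fixed-width, zero-padded
# segments keep codes sortable and make the parcel lineage readable.
# Segment widths are assumed, not taken from the paper.
def make_code(street, block, father, child=0, grandchild=0):
    return f"{street:03d}{block:03d}{father:04d}{child:02d}{grandchild:02d}"

def register(code, db):
    """Uniqueness check before the parcel is stored in the database."""
    if code in db:
        raise ValueError(f"duplicate cadastral code {code}")
    db.add(code)

db = set()
code = make_code(street=12, block=7, father=345, child=1)
register(code, db)
```

Because a child parcel's code embeds its father's code as a prefix, the kinship chain of any parcel split can be recovered from the code string alone, which is what makes the scheme "dynamic" across cadastral changes.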
NASA Astrophysics Data System (ADS)
Aoki, Sinya; Doi, Takumi; Iritani, Takumi
2018-03-01
The sanity check is to rule out certain classes of obviously false results, not to catch every possible error. After reviewing such a sanity check for NN bound states with Lüscher's finite volume formula [1-3], we give further evidence for the operator dependence of plateaux, a symptom of the fake plateau problem, against the claim [4]. We then present our critical comments on [5] by NPLQCD: (i) operator dependences of plateaux in NPL2013 [6, 7] exist with a P value of 4-5%; (ii) the volume independence of plateaux in NPL2013 does not prove their correctness; (iii) effective range expansions (EREs) in NPL2013 violate the physical pole condition; (iv) their comment is partly based on new data and analysis different from the original ones; (v) their new ERE does not satisfy Lüscher's finite volume formula.
Checking the Grammar Checker: Integrating Grammar Instruction with Writing.
ERIC Educational Resources Information Center
McAlexander, Patricia J.
2000-01-01
Notes Rei Noguchi's recommendation of integrating grammar instruction with writing instruction and teaching only the most vital terms and the most frequently made errors. Presents a project that provides a review of the grammar lessons, applies many grammar rules specifically to the students' writing, and teaches students the effective use of the…
Developing Ideas of Refraction, Lenses and Rainbow through the Use of Historical Resources
ERIC Educational Resources Information Center
Mihas, Pavlos
2008-01-01
The paper examines different ways of using historical resources in teaching refraction related subjects. Experimental procedures can be taught by using Ptolemy's and Al Haytham's methods. The student can check the validity of the approximations or rules which were presented by different people. The interpretation of the relations is another…
40 CFR 60.2740 - What records must I keep?
Code of Federal Regulations, 2012 CFR
2012-07-01
... Commercial and Industrial Solid Waste Incineration Units Model Rule-Recordkeeping and Reporting § 60.2740... downtime associated with zero and span and other routine calibration checks). Identify the operating... listed in § 60.2660(a). (m) On a daily basis, keep a log of the quantity of waste burned and the types of...
40 CFR 60.2770 - What information must I include in my annual report?
Code of Federal Regulations, 2012 CFR
2012-07-01
... and Compliance Times for Commercial and Industrial Solid Waste Incineration Units Model Rule... inoperative, except for zero (low-level) and high-level checks. (3) The date, time, and duration that each... of control if any of the following occur. (1) The zero (low-level), mid-level (if applicable), or...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-05
..., including RSVP, LSA, Non-profit Capacity Building, and the Social Innovation Fund (SIF) grant programs... programs including Campuses of Service, Serve America Fellows, Encore Fellows, Silver Scholars, the Social Innovation Fund, and activities funded under programs such as the Volunteer Generation Fund. The final rule...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-03
... holders, and other stakeholders on a draft guidance document entitled ``Weapons Safety Assessment'' (WSA... weapons under the NRC's proposed rule titled ``Enhanced Weapons, Firearms Background Checks, and Security.... You should not include any site-specific security information in your comments. Federal rulemaking Web...
Virtue and the Public Realm: The Role of Education.
ERIC Educational Resources Information Center
Hartoonian, Michael
1990-01-01
Discusses the eighteenth-century view of virtue as an end in itself. Observes that property, prosperity, and individualistic attitudes were essential aspects of economic virtue, whereas majority rule became a central tenet of civic virtue. Explains that public education and governmental checks and balances were seen as means of promoting economic…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-16
... both temporary and permanent. Locations such as post offices can take secure fingerprints for purposes... large number of applicants might require fingerprints and identification checks, and that the Coast... interim rule and work with industry to come up with a better system for fingerprint and identification...
The Hippo Generation and the Vampire State: The Impact of Corruption on Failing Nations
2011-04-13
state institutions and processes, and the growth of crime syndicates linked to ruling elites. When corruption is prolific in a given country, the...kind of checks and balances required to eliminate widespread corruption and protect their people. As New York Times columnist Nicholas Kristof
NASA Astrophysics Data System (ADS)
Torghabeh, A. A.; Tousi, A. M.
2007-08-01
This paper presents a Fuzzy Logic and Neural Network approach to gas turbine fuel schedules. Modeling of the non-linear system using feed-forward artificial Neural Networks trained on data generated by a simulated gas turbine program is introduced. Two artificial Neural Networks are used, depicting the non-linear relationships between gas generator speed and fuel flow, and between turbine inlet temperature and fuel flow, respectively. Off-line fast simulations are used for engine controller design for a turbojet engine based on repeated simulation. The Mamdani and Sugeno models are used to express the Fuzzy system. The linguistic Fuzzy rules and membership functions are presented, and a Fuzzy controller is proposed to provide open-loop control of the gas turbine engine during acceleration and deceleration. MATLAB Simulink was used to apply the Fuzzy Logic and Neural Network analysis. Both systems were able to approximate the functions characterizing the acceleration and deceleration schedules. Surge and flame-out avoidance during the acceleration and deceleration phases are then checked. Turbine inlet temperature is also checked and controlled by the Neural Network controller. The outputs of these Fuzzy Logic and Neural Network controllers are validated and evaluated with GSP software. The validation results are used to evaluate the generalization ability of these artificial Neural Network and Fuzzy Logic controllers.
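The Mamdani-style rule evaluation mentioned above can be sketched in a few lines. The membership breakpoints, rule base, and consequent values below are illustrative assumptions, not the ones used in the paper.

```python
# Minimal Mamdani-style fuzzy rule evaluation for an acceleration fuel
# schedule. Breakpoints and rules are illustrative assumptions.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuel_command(speed_error):
    """speed_error: desired minus actual gas generator speed (percent)."""
    # Fuzzify the input into three linguistic terms.
    neg = tri(speed_error, -20, -10, 0)
    zero = tri(speed_error, -10, 0, 10)
    pos = tri(speed_error, 0, 10, 20)
    # Rule base: NEG -> cut fuel; ZERO -> hold; POS -> add fuel.
    # Defuzzify as a weighted average of singleton consequents.
    weights = [neg, zero, pos]
    consequents = [-0.5, 0.0, 0.5]   # relative fuel-flow change
    total = sum(weights)
    return sum(w * c for w, c in zip(weights, consequents)) / total if total else 0.0
```

A real schedule would also clip the command against surge and flame-out limits, as the paper's controller does.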
Hardware independence checkout software
NASA Technical Reports Server (NTRS)
Cameron, Barry W.; Helbig, H. R.
1990-01-01
ACSI has developed a program utilizing CLIPS to assess compliance with various programming standards. Essentially, the program parses C code to extract the names of all function calls. These are asserted as CLIPS facts, which also include information about line numbers, source file names, and called functions. Rules have been devised to identify called functions that have not been defined in any of the parsed source. These are compared against lists of standards (represented as facts) using rules that check intersections and/or unions of these lists. By piping the output into other processes, the source is appropriately commented by generating and executing parsed scripts.
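A rough Python analogue of that fact-and-rule pipeline is sketched below. The regex-based extraction is a deliberate simplification (the sample source and standards list are invented for illustration); the original tool's parser handles macros, strings, and function pointers.

```python
import re

# Rough analogue of the CLIPS-based checker: extract called-function names
# from C source, then flag calls that are neither defined locally nor on an
# approved standards list.

CALL_RE = re.compile(r"\b([A-Za-z_]\w*)\s*\(")
C_KEYWORDS = {"if", "while", "for", "switch", "return", "sizeof"}

def called_functions(source):
    return set(CALL_RE.findall(source)) - C_KEYWORDS

def defined_functions(source):
    # Crude heuristic: a type-prefixed name followed by '(' at line start.
    return set(re.findall(r"^\s*(?:\w+\s+)+(\w+)\s*\(", source, re.M))

def undefined_nonstandard_calls(source, standard_lib):
    return called_functions(source) - defined_functions(source) - standard_lib

src = """
int helper(int x) { return x + 1; }
int main(void) {
    int y = helper(2);
    printf("%d\\n", custom_io(y));
    return 0;
}
"""
violations = undefined_nonstandard_calls(src, standard_lib={"printf"})
```

Here `custom_io` is flagged: it is called, not defined in the parsed source, and not on the standards list.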
NASA Technical Reports Server (NTRS)
Haley, Paul
1991-01-01
The C Language Integrated Production System (CLIPS) cannot effectively perform sound and complete logical inference in most real-world contexts. The problem facing CLIPS is its lack of goal generation. Without automatic goal generation and maintenance, forward chaining can only deduce all instances of a relationship. Backward chaining, which requires goal generation, allows deduction of only that subset of what is logically true which is also relevant to ongoing problem solving. Goal generation can be mimicked in simple cases using forward chaining. However, such mimicry requires manually coding additional rules that assert an (inadequate) goal representation for every condition, in every rule, that can have corresponding facts derived by backward chaining. In general, for N rules with an average of M conditions per rule, the number of goal generation rules required is on the order of N*M. This is clearly intractable from a program maintenance perspective. We describe Eclipse's support for backward chaining over goals that it automatically asserts as it checks rule conditions. Important characteristics of this extension are that it does not assert goals which cannot match any rule conditions, that two equivalent goals are never asserted, and that goals persist as long as, but no longer than, they remain relevant.
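The automatic goal generation the abstract describes can be illustrated with a toy recursion: querying a goal asserts sub-goals for the conditions of every rule that could conclude it, and equivalent goals are asserted only once. This is a conceptual sketch (with an invented rule base), not Eclipse's implementation.

```python
# Toy automatic sub-goal generation for backward chaining. Querying a goal
# asserts goals for the conditions of every rule concluding it, with
# deduplication so equivalent goals are never asserted twice.

RULES = {
    # head: list of condition predicates
    "grandparent": ["parent", "parent_of_parent"],
    "parent_of_parent": ["parent"],
}

def generate_goals(query, rules, goals=None):
    goals = set() if goals is None else goals
    if query in goals:                      # equivalent goal already asserted
        return goals
    goals.add(query)
    for cond in rules.get(query, []):       # only conditions reachable from
        generate_goals(cond, rules, goals)  # the query become goals
    return goals
```

Contrast this with the manual mimicry the abstract criticizes, which would require one hand-written goal rule per condition of every rule.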
Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems
NASA Technical Reports Server (NTRS)
Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris
2010-01-01
Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.
Text-to-phonemic transcription and parsing into mono-syllables of English text
NASA Astrophysics Data System (ADS)
Jusgir Mullick, Yugal; Agrawal, S. S.; Tayal, Smita; Goswami, Manisha
2004-05-01
The present paper describes a program that converts English text (entered through the normal computer keyboard) into its phonemic representation and then parses it into mono-syllables. For every letter a set of context-based rules is defined in lexical order, along with a separate default rule for each letter. Beginning from the first letter of the word, the rules are checked and the most appropriate rule is applied to the letter to find its actual orthographic representation; if no matching rule is found, the default rule is applied. The current rule sets the next position to be analyzed. Proceeding in the same manner, the orthographic representation of each word can be found. For example, "reading" is represented as "rEdiNX" by applying the following rules: r-->r (move 1 position ahead); ead-->Ed (move 3 positions ahead); i-->i (move 1 position ahead); ng-->NX (move 2 positions ahead, i.e., end of word). The phonemic representations obtained from the above procedure are parsed into mono-syllabic representations for various combinations such as CVC, CVCC, CV, CVCVC, etc. For example, the above phonemic representation is parsed as rEdiNX --> /rE/ /diNX/. This study is part of developing a TTS for Indian English.
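The rule-application pass can be sketched directly from the worked example. The table below holds only the rules needed for "reading"; the real system defines a full set of context-based rules plus a default rule for every letter.

```python
# Sketch of the letter-to-phoneme rule pass described above. Each letter's
# rule list is tried in order (most specific first); the last entry acts as
# the letter's default rule. The matched rule sets the next position.

RULES = {
    "r": [("r", "r")],
    "e": [("ead", "Ed"), ("e", "e")],   # most specific rule first
    "i": [("i", "i")],
    "n": [("ng", "NX"), ("n", "n")],
}

def to_phonemes(word):
    out, i = [], 0
    while i < len(word):
        letter = word[i]
        # Letters without rules fall back to emitting themselves.
        for pattern, phoneme in RULES.get(letter, [(letter, letter)]):
            if word.startswith(pattern, i):
                out.append(phoneme)
                i += len(pattern)       # rule sets the next position
                break
    return "".join(out)
```

Running `to_phonemes("reading")` applies exactly the four rules listed in the abstract and yields "rEdiNX".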
ERIC Educational Resources Information Center
McDaniel, Sara C.; Bruhn, Allison L.
2016-01-01
Check-in/check-out (CICO) is a Tier 2 behavioral intervention that has demonstrated effectiveness for students with challenging behavior in a variety of educational settings. Existing research has focused primarily on testing the intervention's effectiveness and the role of behavioral function in moderating response to intervention. Only a handful…
Correlations and sum rules in a half-space for a quantum two-dimensional one-component plasma
NASA Astrophysics Data System (ADS)
Jancovici, B.; Šamaj, L.
2007-05-01
This paper is the continuation of a previous one (Šamaj and Jancovici, 2007 J. Stat. Mech. P02002); for a nearly classical quantum fluid in a half-space bounded by a plane hard wall (no image forces), we had generalized the Wigner-Kirkwood expansion of the equilibrium statistical quantities in powers of Planck's constant ℏ. As a model system for a more detailed study, we consider the quantum two-dimensional one-component plasma: a system of charged particles of one species, interacting through the logarithmic Coulomb potential in two dimensions, in a uniformly charged background of opposite sign, such that the total charge vanishes. The corresponding classical system is exactly solvable in a variety of geometries, including the present one of a half-plane, when βe² = 2, where β is the inverse temperature and e is the charge of a particle: all the classical n-body densities are known. In the present paper, we have calculated the expansions of the quantum density profile and truncated two-body density up to order ℏ² (instead of only to order ℏ as in the previous paper). These expansions involve the classical n-body densities up to n = 4; thus we obtain exact expressions for these quantum expansions in this special case. For the quantum one-component plasma, two sum rules involving the truncated two-body density (and, for one of them, the density profile) were derived a long time ago by using heuristic macroscopic arguments: one sum rule concerns the asymptotic form along the wall of the truncated two-body density; the other concerns the dipole moment of the structure factor. In the two-dimensional case at βe² = 2, we now have explicit expressions up to order ℏ² for these two quantum densities; thus we can microscopically check the sum rules at this order. The checks are positive, reinforcing the idea that the sum rules are correct.
Use of check lists in assessing the statistical content of medical studies.
Gardner, M J; Machin, D; Campbell, M J
1986-01-01
Two check lists are used routinely in the statistical assessment of manuscripts submitted to the "BMJ." One is for papers of a general nature and the other specifically for reports on clinical trials. Each check list includes questions on the design, conduct, analysis, and presentation of studies, and answers to these contribute to the overall statistical evaluation. Only a small proportion of submitted papers are assessed statistically, and these are selected at the refereeing or editorial stage. Examination of the use of the check lists showed that most papers contained statistical failings, many of which could easily be remedied. It is recommended that the check lists should be used by statistical referees, editorial staff, and authors and also during the design stage of studies. PMID:3082452
17 CFR 230.430B - Prospectus in a registration statement after effective date.
Code of Federal Regulations, 2010 CFR
2010-04-01
... during the past three years neither the issuer nor any of its predecessors was: (A) A blank check company... purpose of section 5(b)(1) thereof. This provision shall not limit the information required to be... EXCHANGE COMMISSION GENERAL RULES AND REGULATIONS, SECURITIES ACT OF 1933 Form and Content of Prospectuses...
12 CFR Appendix F to Part 229 - Official Board Interpretations; Preemption Determinations
Code of Federal Regulations, 2011 CFR
2011-01-01
... banking day after deposit. Although the language “deposited in a bank” is unclear, arguably it is broader than the language “made in person to an employee of the depositary bank”, which conditions the next-day... state or the same check processing region. Thus, generally, the Regulation CC rule for availability of...
12 CFR Appendix F to Part 229 - Official Board Interpretations; Preemption Determinations
Code of Federal Regulations, 2012 CFR
2012-01-01
... banking day after deposit. Although the language “deposited in a bank” is unclear, arguably it is broader than the language “made in person to an employee of the depositary bank”, which conditions the next-day... state or the same check processing region. Thus, generally, the Regulation CC rule for availability of...
12 CFR Appendix F to Part 229 - Official Board Interpretations; Preemption Determinations
Code of Federal Regulations, 2010 CFR
2010-01-01
... banking day after deposit. Although the language “deposited in a bank” is unclear, arguably it is broader than the language “made in person to an employee of the depositary bank”, which conditions the next-day... state or the same check processing region. Thus, generally, the Regulation CC rule for availability of...
12 CFR Appendix F to Part 229 - Official Board Interpretations; Preemption Determinations
Code of Federal Regulations, 2013 CFR
2013-01-01
... of the banking day after deposit. Although the language “deposited in a bank” is unclear, arguably it is broader than the language “made in person to an employee of the depositary bank”, which conditions... state or the same check processing region. Thus, generally, the Regulation CC rule for availability of...
12 CFR Appendix F to Part 229 - Official Board Interpretations; Preemption Determinations
Code of Federal Regulations, 2014 CFR
2014-01-01
... of the banking day after deposit. Although the language “deposited in a bank” is unclear, arguably it is broader than the language “made in person to an employee of the depositary bank”, which conditions... state or the same check processing region. Thus, generally, the Regulation CC rule for availability of...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-15
... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-65310; File No. SR-CBOE-2011-082] Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and Immediate Effectiveness of Proposed Rule Change Related to Opening and Complex Order Price Check Parameter Features September 9, 2011. Pursuant to Section 19(b)(1)...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-26
... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-66207; File No. SR-CBOE-2012-004] Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and Immediate Effectiveness of Proposed Rule Change Related to Automatic Execution and Complex Order Price Check Parameter Features January 20, 2012. Pursuant to Sectio...
A Note on Item-Restscore Association in Rasch Models
ERIC Educational Resources Information Center
Kreiner, Svend
2011-01-01
To rule out the need for a two-parameter item response theory (IRT) model during item analysis by Rasch models, it is important to check the Rasch model's assumption that all items have the same item discrimination. Biserial and polyserial correlation coefficients measuring the association between items and restscores are often used in an informal…
Fifty Plus: Encore Careers Provide Success, Fulfillment in Second Half of Life
ERIC Educational Resources Information Center
Freedman, Marc; Goggin, Judy
2008-01-01
Teaching is a popular pursuit among many second-generation career-seekers. The road from the private sector to the classroom often is littered with time-consuming and difficult requirements--for subject matter mastery, credentialing, and licensure. Too often qualified people check out the rules, consider the challenges, and give up. That wasn't…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-02
... study employed the same Activity Based Cost (ABC) accounting method detailed in the Final Rule establishing the process for setting fees (75 FR 24796 (May 6, 2010)). The ABC methodology is consistent with widely accepted accounting principles and complies with the provisions of 31 U.S.C. 9701 and other...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-12
... (NMFS), National Oceanic and Atmospheric Administration (NOAA), Commerce. ACTION: Proposed rule; request... ``NOAA-NMFS-2010-0091'' in the keyword search, then check the box labeled ``Select to find documents... Light, NC) and 28[deg]35.1' N. lat. (due east of the NASA Vehicle Assembly Building, Cape Canaveral, FL...
47 CFR 95.413 - (CB Rule 13) What communications are prohibited?
Code of Federal Regulations, 2010 CFR
2010-10-01
..., traveler assistance, brief tests (radio checks), or voice paging; (5) To advertise or solicit the sale of... entertain; (7) To transmit any sound effect solely to attract attention; (8) To transmit the word “MAYDAY... (155.3 miles) away; (10) To advertise a political candidate or political campaign; (you may use your CB...
Automated Assume-Guarantee Reasoning by Abstraction Refinement
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Giannakopoulou, Dimitra
2008-01-01
Current automated approaches for compositional model checking in the assume-guarantee style are based on learning of assumptions as deterministic automata. We propose an alternative approach based on abstraction refinement. Our new method computes the assumptions for the assume-guarantee rules as conservative and not necessarily deterministic abstractions of some of the components, and refines those abstractions using counter-examples obtained from model checking them together with the other components. Our approach also exploits the alphabets of the interfaces between components and performs iterative refinement of those alphabets as well as of the abstractions. We show experimentally that our preliminary implementation of the proposed alternative achieves similar or better performance than a previous learning-based implementation.
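The assume-guarantee rule underlying this compositional style can be written, in its standard non-circular form (the paper builds on rules of this shape; its exact premises may differ):

```latex
\frac{\langle A \rangle\, M_1\, \langle P \rangle \qquad
      \langle \mathit{true} \rangle\, M_2\, \langle A \rangle}
     {\langle \mathit{true} \rangle\, M_1 \parallel M_2\, \langle P \rangle}
```

Here a triple ⟨A⟩ M ⟨P⟩ asserts that whenever component M runs in an environment satisfying assumption A, it guarantees property P; the approach in the abstract computes A as an abstraction of some components and refines it from model-checking counterexamples.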
Reopen parameter regions in two-Higgs doublet models
NASA Astrophysics Data System (ADS)
Staub, Florian
2018-01-01
The stability of the electroweak potential is a very important constraint for models of new physics. At the moment, it is standard for two-Higgs doublet models (THDM) and singlet or triplet extensions of the standard model to perform these checks at tree level. However, these models are often studied in the presence of very large couplings, so it can be expected that radiative corrections to the potential are important. We study these effects using the example of the type-II THDM and find that loop corrections can revive more than 50% of the phenomenologically viable points that are ruled out by the tree-level vacuum stability checks. Similar effects are expected for other extensions of the standard model.
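The tree-level check in question is the bounded-from-below condition on the scalar potential. For the CP-conserving THDM, the commonly quoted tree-level conditions can be coded directly (the one-loop analysis the paper performs requires the full effective potential and is not reproduced here):

```python
import math

# Tree-level bounded-from-below (vacuum stability) conditions for the
# CP-conserving THDM scalar potential, as commonly quoted in the
# literature. These are the tree-level checks that, per the paper, can
# wrongly exclude points that loop corrections revive.

def thdm_bounded_from_below(l1, l2, l3, l4, l5):
    if l1 <= 0 or l2 <= 0:
        return False
    root = math.sqrt(l1 * l2)
    return l3 > -root and (l3 + l4 - abs(l5)) > -root
```

A point failing this test at tree level is not necessarily unstable once radiative corrections are included, which is the paper's central observation.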
ERIC Educational Resources Information Center
Hawken, Leanne S.; Bundock, Kaitlin; Kladis, Kristin; O'Keeffe, Breda; Barret, Courtenay A.
2014-01-01
The purpose of this systematic literature review was to summarize outcomes of the Check-in Check-out (CICO) intervention across elementary and secondary settings. Twenty-eight studies utilizing both single subject and group (experimental and quasi-experimental) designs were included in this review. Median effect sizes across the eight group…
ERIC Educational Resources Information Center
Miller, Leila M.; Dufrene, Brad A.; Sterling, Heather E.; Olmi, D. Joe; Bachmayer, Erica
2015-01-01
This study evaluated the effectiveness of Check-in/Check-out (CICO) for improving behavioral performance for three students referred for Tier 2 behavioral supports. An ABAB withdrawal design was used to evaluate CICO and results indicate that intervention was effective for reducing problem behavior as well as increasing academic engagement for all…
31 CFR 515.405 - Exportation of securities, currency, checks, drafts and promissory notes.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., checks, drafts and promissory notes. 515.405 Section 515.405 Money and Finance: Treasury Regulations..., drafts and promissory notes. Section 515.201 prohibits the exportation of securities, currency, checks, drafts and promissory notes to a designated foreign country. ...
31 CFR 515.405 - Exportation of securities, currency, checks, drafts and promissory notes.
Code of Federal Regulations, 2014 CFR
2014-07-01
..., checks, drafts and promissory notes. 515.405 Section 515.405 Money and Finance: Treasury Regulations..., drafts and promissory notes. Section 515.201 prohibits the exportation of securities, currency, checks, drafts and promissory notes to a designated foreign country. ...
31 CFR 515.405 - Exportation of securities, currency, checks, drafts and promissory notes.
Code of Federal Regulations, 2012 CFR
2012-07-01
..., checks, drafts and promissory notes. 515.405 Section 515.405 Money and Finance: Treasury Regulations..., drafts and promissory notes. Section 515.201 prohibits the exportation of securities, currency, checks, drafts and promissory notes to a designated foreign country. ...
31 CFR 515.405 - Exportation of securities, currency, checks, drafts and promissory notes.
Code of Federal Regulations, 2013 CFR
2013-07-01
..., checks, drafts and promissory notes. 515.405 Section 515.405 Money and Finance: Treasury Regulations..., drafts and promissory notes. Section 515.201 prohibits the exportation of securities, currency, checks, drafts and promissory notes to a designated foreign country. ...
31 CFR 515.405 - Exportation of securities, currency, checks, drafts and promissory notes.
Code of Federal Regulations, 2011 CFR
2011-07-01
..., checks, drafts and promissory notes. 515.405 Section 515.405 Money and Finance: Treasury Regulations..., drafts and promissory notes. Section 515.201 prohibits the exportation of securities, currency, checks, drafts and promissory notes to a designated foreign country. ...
Science Opportunity Analyzer (SOA): Science Planning Made Simple
NASA Technical Reports Server (NTRS)
Streiffert, Barbara A.; Polanskey, Carol A.
2004-01-01
For the first time at JPL, the Cassini mission to Saturn is using distributed science operations for developing its experiments. Remote scientists needed the ability to: a) identify observation opportunities; b) create accurate, detailed designs for their observations; c) verify that their designs meet their objectives; d) check their observations against project flight rules and constraints; e) communicate their observations to other scientists. Many existing tools provide one or more of these functions, but Science Opportunity Analyzer (SOA) has been built to unify these tasks into a single application. Accurate: SOA utilizes the JPL Navigation and Ancillary Information Facility (NAIF) SPICE software toolkit, which provides high-fidelity modeling and facilitates rapid adaptation to other flight projects. Portable: available on Unix, Windows and Linux. Adaptable: designed as a multi-mission tool so it can be readily adapted to other flight projects; implemented in Java, Java 3D and other innovative technologies. Conclusion: SOA is easy to use, requiring only six simple steps, and its ability to show the same accurate information in multiple ways (multiple visualization formats, data plots, listings and file output) is essential to meeting the needs of a diverse, distributed science operations environment.
ERIC Educational Resources Information Center
Simonsen, Brandi; Myers, Diane; Briere, Donald E., III
2011-01-01
Students who continue to demonstrate at-risk behaviors after a school implements schoolwide primary (Tier 1) interventions require targeted-group secondary (Tier 2) interventions. This study was conducted to compare the effectiveness of a targeted-group behavioral check-in/check-out (CICO) intervention with the school's standard practice (SP) with…
Is Advanced Real-Time Energy Metering Sufficient to Persuade People to Save Energy?
NASA Astrophysics Data System (ADS)
Ting, L.; Leite, H.; Ponce de Leão, T.
2012-10-01
In order to promote a low-carbon economy, EU citizens may soon be able to check their electricity consumption on a smart meter. It is hoped that smart meters can, by providing real-time consumption and pricing information to residential users, help reduce demand for electricity. It is argued in this paper that, according to the Elaboration Likelihood Model (ELM), these methods are most likely to be effective when consumers perceive the issue of energy conservation as relevant to their lives. Nevertheless, some fundamental characteristics of these methods result in a limited amount of perceived personal relevance; for instance, energy expenditure may be relatively small compared with other household expenditures such as a mortgage, and consumption information does not enhance interpersonal trust. It is suggested that smart meters can apply "nudge" approaches that respond to the ELM observation that people use simple rules to make decisions; these include changes to feedback delivery and device design.
Optical proximity correction for anamorphic extreme ultraviolet lithography
NASA Astrophysics Data System (ADS)
Clifford, Chris; Lam, Michael; Raghunathan, Ananthan; Jiang, Fan; Fenger, Germain; Adam, Kostas
2017-10-01
The change from isomorphic to anamorphic optics in high numerical aperture extreme ultraviolet scanners necessitates changes to the mask data preparation flow. The required changes for each step in the mask tape out process are discussed, with a focus on optical proximity correction (OPC). When necessary, solutions to new problems are demonstrated and verified by rigorous simulation. Additions to the OPC model include accounting for anamorphic effects in the optics, mask electromagnetics, and mask manufacturing. The correction algorithm is updated to include awareness of anamorphic mask geometry for mask rule checking. OPC verification through process window conditions is enhanced to test different wafer scale mask error ranges in the horizontal and vertical directions. This work will show that existing models and methods can be updated to support anamorphic optics without major changes. Also, the larger mask size in the Y direction can result in better model accuracy, easier OPC convergence, and designs that are more tolerant to mask errors.
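Anamorphic optics magnify X and Y differently, so a wafer-scale rule maps to different mask-scale dimensions per axis; this is why mask rule checking must become geometry-aware. The 4x/8x demagnification values below are the commonly cited high-NA EUV numbers, used here as assumed parameters, and the mask-rule limit is purely illustrative:

```python
# Sketch of wafer-to-mask scaling under anamorphic demagnification, where
# the X and Y magnifications differ. 4x/8x and the mask-rule limit are
# assumed, illustrative values.

DEMAG_X, DEMAG_Y = 4.0, 8.0          # anamorphic demagnification per axis
MIN_MASK_FEATURE_NM = 48.0           # illustrative mask-rule-check limit

def wafer_to_mask(width_nm, height_nm):
    """Map a wafer-scale feature to mask scale (dimensions grow by demag)."""
    return width_nm * DEMAG_X, height_nm * DEMAG_Y

def passes_mask_rule_check(width_nm, height_nm):
    mw, mh = wafer_to_mask(width_nm, height_nm)
    return mw >= MIN_MASK_FEATURE_NM and mh >= MIN_MASK_FEATURE_NM
```

A square 10 nm x 10 nm wafer feature becomes a 40 nm x 80 nm mask feature, which can violate a mask rule in X while passing comfortably in Y; an isomorphic-only MRC would miss this asymmetry.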
75 FR 65975 - Exchange Visitor Program-Secondary School Students
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-27
...The Department is revising existing Secondary School Student regulations regarding the screening, selection, school enrollment, orientation, and quality assurance monitoring of exchange students as well as the screening, selection, orientation, and quality assurance monitoring of host families and field staff. Further, the Department is adopting a new requirement regarding training for all organizational representatives who place and/or monitor students with host families. The proposed requirement to conduct FBI fingerprint-based criminal background checks will not be implemented at this time. Rather, it will continue to be examined and a subsequent Final Rule regarding this provision will be forthcoming. These regulations, as revised, govern the Department designated exchange visitor programs under which foreign secondary school students (ages 15-18\\1/2\\) are afforded the opportunity to study in the United States at accredited public or private secondary schools for an academic semester or year while living with American host families or residing at accredited U.S. boarding schools.
HiVy automated translation of stateflow designs for model checking verification
NASA Technical Reports Server (NTRS)
Pingree, Paula
2003-01-01
The HiVy tool set enables model checking of finite state machine designs. This is achieved by translating state-chart specifications into the input language of the Spin model checker. An abstract syntax of hierarchical sequential automata (HSA) is provided as an intermediate format for the tool set.
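The translation step can be illustrated with a toy generator that emits a Promela process from a flat state machine. This sketches the idea only: HiVy's actual pipeline goes through the HSA intermediate format and handles full Stateflow semantics, and the emitted guard syntax here is schematic.

```python
# Toy translation of a flat state machine into Promela-style text for the
# Spin model checker. Illustrative only; not HiVy's actual translation.

def fsm_to_promela(name, transitions, initial):
    """transitions: {state: [(event_guard, next_state), ...]}"""
    lines = [f"active proctype {name}() {{", f"    goto {initial};"]
    for state, outgoing in transitions.items():
        lines.append(f"{state}:")
        lines.append("    if")
        for event, target in outgoing:
            lines.append(f"    :: {event} -> goto {target}")
        lines.append("    fi;")
    lines.append("}")
    return "\n".join(lines)

promela = fsm_to_promela(
    "traffic",
    {"red": [("timer", "green")],
     "green": [("timer", "yellow")],
     "yellow": [("timer", "red")]},
    initial="red",
)
```

Each state becomes a label with a guarded `if ... fi` choice over its outgoing transitions, the standard encoding of an automaton in Promela.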
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-15
... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-65311; File No. SR-C2-2011-018] Self-Regulatory Organizations; C2 Options Exchange, Incorporated; Notice of Filing and Immediate Effectiveness of Proposed Rule Change Related to Opening and Complex Order Price Check Parameter Features September 9, 2011. Pursuant to Section 19(b)(1) of the...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-17
... to other venues on the SOLV System routing table, or (ii) check the NASDAQ book first and then route to destinations on the SOLV System routing table.\\3\\ Under the second option, the applicable routing.... \\3\\ As provided in Rule 4758(a)(1)(A), the term ``System routing table'' refers to the proprietary...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-13
... NMS (i.e. ``dark venues'' or ``dark pools''). BCST orders, pursuant to Rule 4758(a)(1)(A)(ix), check the System for available shares and simultaneously route to select dark venues and to certain low cost... charged for such executions, including its own costs. As a general matter, BX believes that the proposed...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-04
...Prices of new books are listed in the first FEDERAL REGISTER issue of each week... Transportation (DOT). ACTION: Final rule. SUMMARY: We are adopting a new airworthiness directive (AD) for certain... you to do an operational check of the left and right pitot heat annunciators for proper operation and...
Ertefaie, Ashkan; Shortreed, Susan; Chakraborty, Bibhas
2016-06-15
Q-learning is a regression-based approach that uses longitudinal data to construct dynamic treatment regimes, which are sequences of decision rules that use patient information to inform future treatment decisions. An optimal dynamic treatment regime is composed of a sequence of decision rules that indicate how to optimally individualize treatment using the patients' baseline and time-varying characteristics to optimize the final outcome. Constructing optimal dynamic regimes using Q-learning depends heavily on the assumption that regression models at each decision point are correctly specified; yet model checking in the context of Q-learning has been largely overlooked in the current literature. In this article, we show that residual plots obtained from standard Q-learning models may fail to adequately check the quality of the model fit. We present a modified Q-learning procedure that accommodates residual analyses using standard tools. We present simulation studies showing the advantage of the proposed modification over standard Q-learning. We illustrate this new Q-learning approach using data collected from a sequential multiple assignment randomized trial of patients with schizophrenia. Copyright © 2016 John Wiley & Sons, Ltd.
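The staged fitting that Q-learning builds on can be sketched as follows: a stage-2 model is fit to the final outcome, its optimal value becomes a pseudo-outcome for the stage-1 fit, and the stage-1 residuals are the diagnostic quantity of interest. This toy uses saturated cell means on discrete states and actions rather than the article's regression models, so it only illustrates the recursion, not the proposed modification.

```python
def fit_means(pairs):
    """Fit a saturated model: mean outcome for each (state, action) cell."""
    sums, counts = {}, {}
    for key, y in pairs:
        sums[key] = sums.get(key, 0.0) + y
        counts[key] = counts.get(key, 0) + 1
    return {k: sums[k] / counts[k] for k in sums}

def q_learning(data, actions):
    """Two-stage Q-learning on discrete states/actions.
    data: list of (s1, a1, s2, a2, y) trajectories."""
    # Stage 2: model the final outcome given second-stage state and action.
    q2 = fit_means([((s2, a2), y) for s1, a1, s2, a2, y in data])
    # Pseudo-outcome: the value of acting optimally at stage 2.
    pseudo = [max(q2.get((s2, a), float("-inf")) for a in actions)
              for s1, a1, s2, a2, y in data]
    # Stage 1: model the pseudo-outcome given first-stage state and action.
    q1 = fit_means([((s1, a1), p) for (s1, a1, *_), p in zip(data, pseudo)])
    # Stage-1 residuals: the quantity whose diagnostic plots the modified
    # procedure is designed to make interpretable.
    resid = [p - q1[(s1, a1)] for (s1, a1, *_), p in zip(data, pseudo)]
    return q1, q2, resid
```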
NASA Astrophysics Data System (ADS)
Kataoka, Haruno; Utsumi, Akira; Hirose, Yuki; Yoshiura, Hiroshi
Disclosure control of natural language information (DCNL), which we are trying to realize, is described. DCNL will be used for securing human communications over the internet, such as through blogs and social network services. Before sentences in the communications are disclosed, they are checked by DCNL, and any phrases that could reveal sensitive information are transformed or omitted so that they are no longer revealing. DCNL checks not only phrases that directly represent sensitive information but also those that indirectly suggest it. Combinations of phrases are also checked. DCNL automatically learns the knowledge of sensitive phrases and the suggestive relations between phrases by using co-occurrence analysis and Web retrieval. The users' burden is therefore minimized, i.e., they do not need to define many disclosure control rules. DCNL complements traditional access control in fields where reliability needs to be balanced with enjoyment and where object classes for access control cannot be predefined.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, J; Yan, Y; Hager, F
Purpose: Radiation therapy has evolved to become not only more precise and potent, but also more complicated to monitor and deliver. More rigorous and comprehensive quality assurance is needed to safeguard ever-advancing radiation therapy. ICRU standards dictate that an ever-growing set of treatment parameters be manually checked weekly by medical physicists. This “weekly chart check” procedure is laborious and subject to human errors or other factors. A computer-assisted chart checking process will enable more complete and accurate human review of critical parameters, reduce the risk of medical errors, and improve efficiency. Methods: We developed a web-based software system that enables thorough weekly quality assurance checks. In the backend, the software retrieves all machine parameters from a Treatment Management System (TMS) and compares them against the corresponding ones from the treatment planning system. They are also checked for validity against preset rules. The results are displayed as a web page in the front-end for physicists to review. A summary report is then generated and uploaded automatically to the TMS as a record of the weekly chart check. Results: The software system has been deployed on a web server in our department’s intranet and has been tested thoroughly by our clinical physicists. A plan parameter is highlighted when it is outside the preset limit. The developed system has changed the way charts are checked, with significantly improved accuracy, efficiency, and completeness. It has been shown to be robust, fast, and easy to use. Conclusion: A computer-assisted system has been developed for efficient, accurate, and comprehensive weekly chart checking. The system has been extensively validated and is being implemented for routine clinical use.
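The core comparison such a system performs, checking delivered machine parameters against planned values under preset rules, can be sketched as below; the parameter names and tolerances are invented for illustration, not taken from the system described above.

```python
# Hypothetical parameter names and tolerances; the actual checked set is
# defined by the clinic's protocol, not by this sketch.
RULES = {
    "mu":         {"tol": 0.5},   # monitor units
    "gantry_deg": {"tol": 0.2},   # gantry angle, degrees
    "energy_mv":  {"tol": 0.0},   # beam energy must match exactly
}

def weekly_check(planned, delivered, rules=RULES):
    """Flag every parameter whose delivered value differs from the planned
    value by more than the preset tolerance."""
    flags = []
    for name, rule in rules.items():
        diff = abs(planned[name] - delivered[name])
        if diff > rule["tol"]:
            flags.append((name, planned[name], delivered[name]))
    return flags
```

In the described system the `planned` values would come from the treatment planning system and the `delivered` values from the TMS; here both are plain dicts.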
The ALL-OUT Library; A Design for Computer-Powered, Multidimensional Services.
ERIC Educational Resources Information Center
Sleeth, Jim; LaRue, James
1983-01-01
Preliminary description of design of electronic library and home information delivery system highlights potentials of personal computer interface program (applying for service, assuring that users are valid, checking for measures, searching, locating titles) and incorporation of concepts used in other information systems (security checks,…
Toward improved design of check dam systems: A case study in the Loess Plateau, China
NASA Astrophysics Data System (ADS)
Pal, Debasish; Galelli, Stefano; Tang, Honglei; Ran, Qihua
2018-04-01
Check dams are one of the most common strategies for controlling sediment transport in erosion prone areas, along with soil and water conservation measures. However, existing mathematical models that simulate sediment production and delivery are often unable to simulate how the storage capacity of check dams varies with time. To explicitly account for this process-and to support the design of check dam systems-we developed a modelling framework consisting of two components, namely (1) the spatially distributed Soil Erosion and Sediment Delivery Model (WaTEM/SEDEM), and (2) a network-based model of check dam storage dynamics. The two models are run sequentially, with the second model receiving the initial sediment input to check dams from WaTEM/SEDEM. The framework is first applied to Shejiagou catchment, a 4.26 km2 area located in the Loess Plateau, China, where we study the effect of the existing check dam system on sediment dynamics. Results show that the deployment of check dams altered significantly the sediment delivery ratio of the catchment. Furthermore, the network-based model reveals a large variability in the life expectancy of check dams and abrupt changes in their filling rates. The application of the framework to six alternative check dam deployment scenarios is then used to illustrate its usefulness for planning purposes, and to derive some insights on the effect of key decision variables, such as the number, size, and site location of check dams. Simulation results suggest that better performance-in terms of life expectancy and sediment delivery ratio-could have been achieved with an alternative deployment strategy.
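The network-based storage dynamics can be caricatured as a chain of reservoirs that trap inflowing sediment until their capacity is exhausted; a dam's "life expectancy" is then the year its remaining capacity reaches zero. A minimal sketch, with invented volumes and a single upstream sediment input (the actual model couples WaTEM/SEDEM output to a full dam network):

```python
def route_sediment(capacity, inflow, years):
    """Toy storage dynamics for a chain of check dams.
    capacity: remaining storage volume per dam (upstream -> downstream).
    inflow: annual sediment volume entering the most upstream dam.
    Returns (year each dam fills, or None; sediment exported per year)."""
    filled = [None] * len(capacity)
    exported = []
    cap = list(capacity)
    for year in range(1, years + 1):
        load = inflow
        for i in range(len(cap)):
            trapped = min(cap[i], load)   # dam traps what it still can hold
            cap[i] -= trapped
            load -= trapped
            if cap[i] == 0 and filled[i] is None:
                filled[i] = year          # life expectancy of dam i
        exported.append(load)             # sediment leaving the catchment
    return filled, exported
```

Even this caricature reproduces the abrupt changes in filling rate the abstract mentions: a downstream dam fills slowly until its upstream neighbor is full, then much faster.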
Revisiting the Procedures for the Vector Data Quality Assurance in Practice
NASA Astrophysics Data System (ADS)
Erdoğan, M.; Torun, A.; Boyacı, D.
2012-07-01
Immense use of topographical data in spatial data visualization, business GIS (Geographic Information Systems) solutions and applications, and mobile and location-based services has forced topo-data providers to create standard, up-to-date and complete data sets in a sustainable frame. Data quality has been studied and researched for more than two decades. There have been countless references on its semantics, its conceptual and logical representations, and many applications on spatial databases and GIS. However, there is a gap between research and practice in the sense of spatial data quality, which increases the costs and decreases the efficiency of data production. Spatial data quality is well known to academia and industry, but usually in different contexts. The research on spatial data quality has stated several issues having practical use, such as descriptive information, metadata, fulfillment of spatial relationships among data, integrity measures, geometric constraints, etc. The industry and data producers realize them in three stages: pre-, co- and post-data capturing. The pre-data capturing stage covers semantic modelling, data definition, cataloguing, modelling, data dictionary and schema creation processes. The co-data capturing stage covers general rules of spatial relationships, data- and model-specific rules such as topologic and model-building relationships, geometric thresholds, data extraction guidelines, and object-object, object-belonging class, object-non-belonging class, and class-class relationships to be taken into account during data capturing. The post-data capturing stage covers specified QC (quality check) benchmarks and checking compliance with general and specific rules. The vector data quality criteria differ between the views of producers and users, but these criteria are generally driven by the needs, expectations and feedback of the users. This paper presents a practical method which closes the gap between theory and practice.
Developing spatial data quality concepts into applications requires the existence of a conceptual, logical and, most importantly, physical data model, together with rules and knowledge of their realization in the form of geo-spatial data. The applicable metrics and thresholds are determined on this concrete base. This study discusses the application of geo-spatial data quality issues and QA (quality assurance) and QC procedures in topographic data production. First we introduce the MGCP (Multinational Geospatial Co-production Program) data profile of the NATO (North Atlantic Treaty Organization) DFDD (DGIWG Feature Data Dictionary), the requirements of the data owner, the view of data producers for both data capturing and QC, and finally QA to fulfil user needs. Then our new, practical approach, which divides quality into three phases, is introduced. Finally, the implementation of our approach to accomplish metrics, measures and thresholds of quality definitions is discussed. In this paper, especially geometry and semantics quality and the quality control procedures that can be performed by producers are discussed. Some applicable best practices that we experienced in quality control techniques, and regulations that define the objectives and data production procedures, are given in the final remarks. These quality control procedures should include visual checks over the source data, captured vector data and printouts, some automatic checks that can be performed by software, and some semi-automatic checks involving interaction with quality control personnel. Finally, these quality control procedures should ensure the geometric, semantic, attribution and metadata quality of vector data.
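A post-capture automatic QC pass of the kind described, attribute completeness plus a simple geometric rule, might look like the following; the required-attribute catalogue and feature layout here are invented for the sketch, not the MGCP/DFDD profile.

```python
# Purely illustrative catalogue: which attributes each feature class requires.
REQUIRED_ATTRS = {"road": ["name", "surface"], "building": ["height"]}

def check_feature(feature):
    """Return a list of QC findings for one captured vector feature."""
    findings = []
    # Semantic check: every attribute the catalogue requires must be present.
    for attr in REQUIRED_ATTRS.get(feature["class"], []):
        if not feature["attrs"].get(attr):
            findings.append(f"missing attribute: {attr}")
    # Geometric check: a polygon ring must close on itself.
    geom = feature["geometry"]
    if feature["type"] == "polygon" and geom[0] != geom[-1]:
        findings.append("polygon ring not closed")
    return findings
```

Topologic rules (overlaps, gaps, dangles) and metadata checks would be further passes in the same style, each reporting findings for the semi-automatic review the paper describes.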
NASA Astrophysics Data System (ADS)
Gabadadze, Gregory; Tukhashvili, Giorgi
2018-07-01
The Crewther-Broadhurst-Kataev (CBK) relation connects the Bjorken function for deep-inelastic sum rules (or the Gross-Llewellyn Smith function) with the Adler function for electron-positron annihilation in QCD; it has been checked to hold up to four loops in perturbation theory. Here we study non-perturbative terms in the CBK relation using a holographic dual theory that is believed to capture properties of QCD. We show that for the large invariant momenta the perturbative CBK relation is exactly satisfied. For the small momenta non-perturbative corrections enter the relation and we calculate their significant effects. We also give an exact holographic expression for the Bjorken function, as well as for the entire three-point axial-vector-vector correlation function, and check their consistency in the conformal limit.
Osamor, Victor C; Azeta, Ambrose A; Ajulo, Oluseyi O
2014-12-01
Over 1.5-2 million tuberculosis deaths occur annually. Medical professionals are faced with many challenges in delivering good health-care with unassisted automation in hospitals where there are several patients who need the doctor's attention. To automate the pre-laboratory screening process against tuberculosis infection to aid diagnosis and make it fast and accessible to the public via the Internet. The expert system we have built is designed to also take care of people who do not have access to medical experts, but would want to check their medical status. A rule-based approach has been used, and unified modeling language and the client-server architecture technique were applied to model the system and to develop it as a web-based expert system for tuberculosis diagnosis. Algorithmic rules in the Tuberculosis-Diagnosis Expert System necessitate decision coverage where tuberculosis is either suspected or not suspected. The architecture consists of a rule base, knowledge base, and patient database. These units interact with the inference engine, which receives patients' data through the Internet via a user interface. We present the architecture of the Tuberculosis-Diagnosis Expert System and its implementation. We evaluated it for usability to determine the level of effectiveness, efficiency and user satisfaction. The result of the usability evaluation reveals that the system has a usability of 4.08 on a scale of 5, an indication of above-average system performance. Several existing expert systems have been developed for the purpose of supporting different medical diagnoses, but none is designed to translate tuberculosis patients' symptomatic data for online pre-laboratory screening. Our Tuberculosis-Diagnosis Expert System is an effective solution for the implementation of the needed web-based expert system diagnosis. © The Author(s) 2013.
NASA Astrophysics Data System (ADS)
Wang, Shan; Jiang, Zhi-Qiang; Li, Sai-Ping; Zhou, Wei-Xing
2015-12-01
Technical trading rules have a long history of being used by practitioners in financial markets. The profitability and efficiency of technical trading rules remain controversial. In this paper, we test the performance of more than seven thousand traditional technical trading rules on the Shanghai Securities Composite Index (SSCI) from May 21, 1992 through June 30, 2013 and the China Securities Index 300 (CSI 300) from April 8, 2005 through June 30, 2013 to check whether an effective trading strategy could be found using performance measurements based on the return and the Sharpe ratio. To correct for the influence of the data-snooping effect, we adopt the Superior Predictive Ability test to evaluate whether there exists a trading rule that can significantly outperform the benchmark. The result shows that for SSCI, technical trading rules offer significant profitability, while for CSI 300, this ability is lost. We further partition the SSCI into two sub-series and find that the efficiency of technical trading in the sub-series, which have exactly the same spanning period as that of CSI 300, is severely weakened. By testing the trading rules on both indexes with a five-year moving window, we find that during the financial bubble from 2005 to 2007, the effectiveness of technical trading rules is greatly improved. This is consistent with the predictive ability of technical trading rules, which appears when the market is less efficient.
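As a concrete example of one traditional technical trading rule and the return/Sharpe-ratio measurements used to score it, a simple moving-average rule can be backtested as below. This is a generic sketch, not the paper's seven-thousand-rule universe or its Superior Predictive Ability test.

```python
def ma_rule_returns(prices, window):
    """Long when the price is above its moving average, flat otherwise;
    returns the list of per-period strategy returns."""
    rets = []
    for t in range(window, len(prices) - 1):
        ma = sum(prices[t - window:t]) / window
        position = 1 if prices[t] > ma else 0
        rets.append(position * (prices[t + 1] / prices[t] - 1.0))
    return rets

def sharpe(rets):
    """Mean return over volatility (risk-free rate taken as 0)."""
    if not rets:
        return 0.0
    mean = sum(rets) / len(rets)
    var = sum((r - mean) ** 2 for r in rets) / len(rets)
    return mean / var ** 0.5 if var > 0 else 0.0
```

Evaluating thousands of such rules on the same index is exactly what creates the data-snooping problem the SPA test is designed to correct.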
Texas International Airlines LOFT program
NASA Technical Reports Server (NTRS)
Sommerville, J.
1981-01-01
A line-oriented flight training program allows the crew to work as a team to solve all problems, abnormal or emergency, within the crew concept. A line-oriented check ride takes place every six months for the pilot as a proficiency check. There are advantages and disadvantages to this program. One disadvantage is that since it is designed as a check-ride, the scenarios must be structured so that the average pilot will complete the check-ride without complication. This system is different from a proficiency check, which can be stopped at a problem area so that training to proficiency can take place before proceeding with the check.
Beyond Member-Checking: A Dialogic Approach to the Research Interview
ERIC Educational Resources Information Center
Harvey, Lou
2015-01-01
This article presents a dialogic qualitative interview design for a narrative study of six international UK university students' motivation for learning English. Based on the work of Mikhail Bakhtin, this design was developed in order to address the limitations of member-checking [Lincoln, Y. S., and E. G. Guba. 1985. "Naturalistic…
ERIC Educational Resources Information Center
Bureau of Naval Personnel, Washington, DC.
The Progress Check Booklet is designed to be used by the student working in the programed course to determine if he has mastered the concepts in the course booklets on: electrical current; voltage; resistance; measuring current and voltage in series circuits; relationships of current, voltage, and resistance; parallel circuits; combination…
Structural design, analysis, and code evaluation of an odd-shaped pressure vessel
NASA Astrophysics Data System (ADS)
Rezvani, M. A.; Ziada, H. H.
1992-12-01
An effort to design, analyze, and evaluate a rectangular pressure vessel is described. Normally pressure vessels are designed in circular or spherical shapes to prevent stress concentrations. In this case, because of operational limitations, the choice of vessels was limited to a rectangular pressure box with a removable cover plate. The American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code is used as a guideline for pressure containments whose width or depth exceeds 15.24 cm (6.0 in.) and where pressures will exceed 103.4 kPa (15.0 lbf/in²). This evaluation used Section 8 of this Code, hereafter referred to as the Code. The dimensions and working pressure of the subject vessel fall within the pressure vessel category of the Code. The Code design guidelines and rules do not directly apply to this vessel. Therefore, finite-element methodology was used to analyze the pressure vessel, and the Code then was used in qualifying the vessel to be stamped to the Code. Section 8, Division 1 of the Code was used for evaluation. This action was justified by selecting a material for which fatigue damage would not be a concern. The stress analysis results were then checked against the Code, and the thicknesses adjusted to satisfy Code requirements. Although not directly applicable, the Code design formulas for rectangular vessels were also considered and presented.
What are the most effective risk-reduction strategies in sport concussion?
Benson, Brian W; McIntosh, Andrew S; Maddocks, David; Herring, Stanley A; Raftery, Martin; Dvorák, Jirí
2013-04-01
To critically review the evidence to determine the efficacy and effectiveness of protective equipment, rule changes, neck strength and legislation in reducing sport concussion risk. Electronic databases, grey literature and bibliographies were used to search the evidence using Medical Subject Headings and text words. Inclusion/exclusion criteria were used to select articles for the clinical equipment studies. The quality of evidence was assessed using epidemiological criteria regarding internal/external validity (eg, strength of design, sample size/power, bias and confounding). No new valid, conclusive evidence was provided to suggest the use of headgear in rugby, or mouth guards in American football, significantly reduced players' risk of concussion. No evidence was provided to suggest an association between neck strength increases and concussion risk reduction. There was evidence in ice hockey to suggest fair-play rules and eliminating body checking among 11-year-olds to 12-year-olds were effective injury prevention strategies. Evidence is lacking on the effects of legislation on concussion prevention. Equipment self-selection bias was a common limitation, as was the lack of measurement and control for potential confounding variables. Lastly, helmets need to be able to protect from impacts resulting in changes in head velocity of up to 10 and 7 m/s in professional American and Australian football, respectively, as well as reduce head resultant linear and angular acceleration to below 50 g and 1500 rad/s², respectively, to optimise their effectiveness. A multifactorial approach is needed for concussion prevention. Future well-designed and sport-specific prospective analytical studies of sufficient power are warranted.
NASA Technical Reports Server (NTRS)
Call, Jared A.; Kwok, John H.; Fisher, Forest W.
2013-01-01
This innovation is a tool used to verify and validate spacecraft sequences at the predicted events file (PEF) level for the GRAIL (Gravity Recovery and Interior Laboratory, see http://www.nasa.gov/mission_pages/grail/main/index.html) mission as part of the Multi-Mission Planning and Sequencing Team (MPST) operations process to reduce the possibility for errors. This tool is used to catch any sequence-related errors or issues immediately after the seqgen modeling to streamline downstream processes. This script verifies and validates the seqgen modeling for the GRAIL MPST process. A PEF is provided as input, and dozens of checks are performed on it to verify and validate the command products, including command content, command ordering, flight-rule violations, modeling boundary consistency, resource limits, and ground commanding consistency. By performing as many checks as early in the process as possible, grl_pef_check streamlines the MPST task of generating GRAIL command and modeled products on an aggressive schedule. By enumerating each check being performed, and clearly stating the criteria and assumptions made at each step, grl_pef_check can be used as a manual checklist as well as an automated tool. This helper script was written with a focus on enabling the user with the information they need in order to evaluate a sequence quickly and efficiently, while still keeping them informed and active in the overall sequencing process. grl_pef_check verifies and validates the modeling and sequence content prior to investing any more effort into the build. There are dozens of various items in the modeling run that need to be checked, which is a time-consuming and error-prone task. Currently, no software exists that provides this functionality. Compared to a manual process, this script reduces human error and saves considerable man-hours by automating and streamlining the mission planning and sequencing task for the GRAIL mission.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-02
... one revision to the final rule. The language in Sec. 91.1091(f)(2) incorrectly uses the term ``check... Sec. 91.1091(f)(2). Because the section title applies to flight instructors it is obvious that the use... Civil Aviation (61 Stat. 1180). 0 2. Amend Sec. 91.1091 by revising paragraph (f)(2) to read as follows...
Antipersistent Markov behavior in foreign exchange markets
NASA Astrophysics Data System (ADS)
Baviera, Roberto; Pasquini, Michele; Serva, Maurizio; Vergni, Davide; Vulpiani, Angelo
2002-09-01
A quantitative check of efficiency in US dollar/Deutsche mark exchange rates is developed using high-frequency (tick by tick) data. The antipersistent Markov behavior of log-price fluctuations of given size implies, in principle, the possibility of a statistical forecast. We introduce and measure the available information of the quote sequence, and we show how it can be profitable following a particular trading rule.
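Antipersistence of the kind measured here can be estimated from the sign sequence of price moves: if the probability that the next move reverses the current one exceeds 0.5, the process is antipersistent, and a statistical forecast is conceivable in principle. A minimal estimator (a sketch of the concept, not the paper's available-information measure):

```python
def reversal_probability(moves):
    """Estimate P(next move has the opposite sign | current move) from a
    sequence of +1/-1 price moves. Values above 0.5 indicate antipersistent
    behavior; 0.5 corresponds to an efficient (memoryless) walk."""
    reversals = total = 0
    for prev, cur in zip(moves, moves[1:]):
        total += 1
        if cur == -prev:
            reversals += 1
    return reversals / total if total else 0.5
```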
Leveraging the cost of HIPAA
Hoppszallern, Suzanna; Arges, George
2002-01-01
HIPAA's administrative simplification rule was introduced to reduce the costs of handling provider-payer transactions by standardizing them. The transaction rules are predicted to save billions of dollars. To get the biggest payoff, providers will need to overhaul their business processes as part of electronic transaction processing. By eliminating redundant and inefficient administrative processes, staff time can be focused on processes that improve financial performance and customer satisfaction. If your organization needs to have an extension beyond the original compliance date, you must file a plan before Oct. 16 to show how you'll come into compliance by Oct. 16, 2003. Check the roadmap and timetable for HIPAA compliance and see where your organization is and where it needs to be.
Bayesian design of decision rules for failure detection
NASA Technical Reports Server (NTRS)
Chow, E. Y.; Willsky, A. S.
1984-01-01
The formulation of the decision making process of a failure detection algorithm as a Bayes sequential decision problem provides a simple conceptualization of the decision rule design problem. As the optimal Bayes rule is not computable, a methodology that is based on the Bayesian approach and aimed at a reduced computational requirement is developed for designing suboptimal rules. A numerical algorithm is constructed to facilitate the design and performance evaluation of these suboptimal rules. The result of applying this design methodology to an example shows that this approach is potentially a useful one.
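A classic computable sequential decision rule in this spirit is Wald's sequential probability ratio test, which trades detection delay against error probabilities much as the suboptimal Bayes rules discussed here do. Below is a sketch on Bernoulli error flags, with thresholds from the standard Wald approximations; it illustrates the sequential-decision framing, not the paper's own design methodology.

```python
import math

def sprt(observations, p_fail, p_nominal, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test on 0/1 error flags.
    Accumulates the log-likelihood ratio of 'failed' vs 'nominal' and stops
    at the first threshold crossing. Returns ('fail'|'ok'|'continue', step)."""
    upper = math.log((1 - beta) / alpha)   # declare failure above this
    lower = math.log(beta / (1 - alpha))   # declare nominal below this
    llr = 0.0
    for step, x in enumerate(observations, 1):
        p1 = p_fail if x else 1 - p_fail       # likelihood under failure
        p0 = p_nominal if x else 1 - p_nominal  # likelihood under nominal
        llr += math.log(p1 / p0)
        if llr >= upper:
            return "fail", step
        if llr <= lower:
            return "ok", step
    return "continue", len(observations)
```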
NASA Technical Reports Server (NTRS)
Penrose, C. J.
1987-01-01
The difficulties of modeling the complex recirculating flow fields produced by multiple-jet STOVL aircraft close to the ground have led to extensive use of experimental model tests to predict intake Hot Gas Reingestion (HGR). The reliability of model test results depends on a satisfactory set of scaling rules, which must be validated by fully comparable full-scale tests. Scaling rules devised in the U.K. in the mid-1960s gave good model/full-scale agreement for the BAe P1127 aircraft. Until recently no opportunity has occurred to check the applicability of the rules to the high-energy exhaust of current ASTOVL aircraft projects. Such an opportunity has arisen following tests on a Tethered Harrier. Comparison of this full-scale data with results from tests on a model configuration approximating the full-scale aircraft geometry has shown discrepancies between HGR levels. These discrepancies, although probably due to geometry and other model/full-scale differences, indicate that some re-examination of the scaling rules is needed. Therefore the scaling rules are reviewed, further planned scaling studies are described, and potential areas for further work are suggested.
Visplause: Visual Data Quality Assessment of Many Time Series Using Plausibility Checks.
Arbesser, Clemens; Spechtenhauser, Florian; Mühlbacher, Thomas; Piringer, Harald
2017-01-01
Trends like decentralized energy production lead to an exploding number of time series from sensors and other sources that need to be assessed regarding their data quality (DQ). While the identification of DQ problems for such routinely collected data is typically based on existing automated plausibility checks, an efficient inspection and validation of check results for hundreds or thousands of time series is challenging. The main contribution of this paper is the validated design of Visplause, a system to support an efficient inspection of DQ problems for many time series. The key idea of Visplause is to utilize meta-information concerning the semantics of both the time series and the plausibility checks for structuring and summarizing results of DQ checks in a flexible way. Linked views enable users to inspect anomalies in detail and to generate hypotheses about possible causes. The design of Visplause was guided by goals derived from a comprehensive task analysis with domain experts in the energy sector. We reflect on the design process by discussing design decisions at four stages and we identify lessons learned. We also report feedback from domain experts after using Visplause for a period of one month. This feedback suggests significant efficiency gains for DQ assessment, increased confidence in the DQ, and the applicability of Visplause to summarize indicators also outside the context of DQ.
The method of a joint intraday security check system based on cloud computing
NASA Astrophysics Data System (ADS)
Dong, Wei; Feng, Changyou; Zhou, Caiqi; Cai, Zhi; Dan, Xu; Dai, Sai; Zhang, Chuancheng
2017-01-01
The intraday security check is the core application in the dispatching control system. The existing security check calculation uses only the dispatch center’s local model and data as the functional margin. This paper introduces the design of an all-grid intraday joint security check system based on cloud computing and its implementation. To reduce the effect of subarea bad data on the all-grid security check, a new power flow algorithm based on comparison and adjustment with the inter-provincial tie-line plan is presented. A numerical example illustrates the effectiveness and feasibility of the proposed method.
Low-Density Parity-Check Code Design Techniques to Simplify Encoding
NASA Astrophysics Data System (ADS)
Perez, J. M.; Andrews, K.
2007-11-01
This work describes a method for encoding low-density parity-check (LDPC) codes based on the accumulate-repeat-4-jagged-accumulate (AR4JA) scheme, using the low-density parity-check matrix H instead of the dense generator matrix G. The use of the H matrix to encode allows a significant reduction in memory consumption and gives the encoder design great flexibility. Also described are new hardware-efficient codes, based on the same kind of protographs, which require less memory storage and area while also reducing the encoding delay.
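The idea of encoding directly from H can be seen in the simplest case, H = [A | I] over GF(2), where the parity bits follow from the message by one multiplication and no dense G is ever formed. A dense toy version of that idea (a real AR4JA H is large, sparse and structured, which is what makes H-based encoding pay off):

```python
def encode_with_h(a, message):
    """Encode using a parity-check matrix H = [A | I] over GF(2):
    the parity bits are p = A·m (mod 2), so H·[m p]^T = 0 by construction."""
    parity = [sum(row[j] * message[j] for j in range(len(message))) % 2
              for row in a]
    return message + parity

def syndrome(a, codeword):
    """Compute H·c mod 2 with H = [A | I]; all zeros means a valid codeword."""
    k = len(a[0])
    return [(sum(row[j] * codeword[j] for j in range(k)) + codeword[k + i]) % 2
            for i, row in enumerate(a)]
```

The same syndrome computation is what the decoder uses to detect errors: any flipped bit yields a nonzero syndrome.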
Design and scheduling for periodic concurrent error detection and recovery in processor arrays
NASA Technical Reports Server (NTRS)
Wang, Yi-Min; Chung, Pi-Yu; Fuchs, W. Kent
1992-01-01
Periodic application of time-redundant error checking provides a trade-off between error detection latency and performance degradation. The goal is to achieve high error coverage while satisfying performance requirements. We derive the optimal scheduling of checking patterns in order to uniformly distribute the available checking capability and maximize the error coverage. Synchronous buffering designs using data forwarding and dynamic reconfiguration are described. Efficient single-cycle diagnosis is implemented by error pattern analysis and a direct-mapped recovery cache. A rollback recovery scheme using start-up control for local recovery is also presented.
Data quality assessment for comparative effectiveness research in distributed data networks
Brown, Jeffrey; Kahn, Michael; Toh, Sengwee
2015-01-01
Background Electronic health information routinely collected during healthcare delivery and reimbursement can help address the need for evidence about the real-world effectiveness, safety, and quality of medical care. Often, distributed networks that combine information from multiple sources are needed to generate this real-world evidence. Objective We provide a set of field-tested best practices and a set of recommendations for data quality checking for comparative effectiveness research (CER) in distributed data networks. Methods We explore the requirements for data quality checking and describe data quality approaches undertaken by several existing multi-site networks. Results There are no established standards regarding how to evaluate the quality of electronic health data for CER within distributed networks. Data checks of increasing complexity are often employed, ranging from consistency with syntactic rules to evaluation of semantics and consistency within and across sites. Temporal trends within and across sites are widely used, as are checks of each data refresh or update. Rates of specific events and exposures by age group, sex, and month are also common. Discussion Secondary use of electronic health data for CER holds promise but is complex, especially in distributed data networks that incorporate periodic data refreshes. The viability of a learning health system is dependent on a robust understanding of the quality, validity, and optimal secondary uses of routinely collected electronic health data within distributed health data networks. Robust data quality checking can strengthen confidence in findings based on distributed data networks. PMID:23793049
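One of the common checks mentioned, rates of events by month with a temporal-trend screen run on each data refresh, can be sketched as below; the jump threshold and month keys are invented for illustration.

```python
def monthly_rates(events, totals):
    """Event rate per month; events and totals are dicts keyed by month."""
    return {m: events[m] / totals[m] for m in events}

def flag_trend_breaks(rates, max_jump=0.5):
    """Flag month-to-month relative changes larger than max_jump: the kind
    of temporal-trend check networks run on each data refresh to catch
    coding changes or extraction errors at a site."""
    months = sorted(rates)
    flags = []
    for prev, cur in zip(months, months[1:]):
        base = rates[prev]
        if base and abs(rates[cur] - base) / base > max_jump:
            flags.append(cur)
    return flags
```

In practice the same screen would be stratified by site, age group, and sex, and flagged months would be routed to a human reviewer rather than rejected automatically.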
Rosário, Pedro; Núñez, José C.; Vallejo, Guillermo; Cunha, Jennifer; Nunes, Tânia; Suárez, Natalia; Fuentes, Sonia; Moreira, Tânia
2015-01-01
This study analyzed the effects of five types of homework follow-up practices (i.e., checking homework completion; answering questions about homework; checking homework orally; checking homework on the board; and collecting and grading homework) used in class by 26 teachers of English as a Foreign Language (EFL) using a randomized-group design. Once a week, for 6 weeks, the EFL teachers used a particular type of homework follow-up practice they had previously been assigned to. At the end of the 6 weeks students completed an EFL exam as an outcome measure. The results showed that three types of homework follow-up practices (i.e., checking homework orally; checking homework on the board; and collecting and grading homework) had a positive impact on students' performance, thus highlighting the role of EFL teachers in the homework process. The effect of EFL teachers' homework follow-up practices on students' performance was affected by students' prior knowledge, but not by the number of homework follow-up sessions. PMID:26528204
Effects of event knowledge in processing verbal arguments
Bicknell, Klinton; Elman, Jeffrey L.; Hare, Mary; McRae, Ken; Kutas, Marta
2010-01-01
This research tests whether comprehenders use their knowledge of typical events in real time to process verbal arguments. In self-paced reading and event-related brain potential (ERP) experiments, we used materials in which the likelihood of a specific patient noun (brakes or spelling) depended on the combination of an agent and verb (mechanic checked vs. journalist checked). Reading times were shorter at the word directly following the patient for the congruent than the incongruent items. Differential N400s were found earlier, immediately at the patient. Norming studies ruled out any account of these results based on direct relations between the agent and patient. Thus, comprehenders dynamically combine information about real-world events based on intrasentential agents and verbs, and this combination then rapidly influences online sentence interpretation. PMID:21076629
Protecting quantum memories using coherent parity check codes
NASA Astrophysics Data System (ADS)
Roffe, Joschka; Headley, David; Chancellor, Nicholas; Horsman, Dominic; Kendon, Viv
2018-07-01
Coherent parity check (CPC) codes are a new framework for the construction of quantum error correction codes that encode multiple qubits per logical block. CPC codes have a canonical structure involving successive rounds of bit and phase parity checks, supplemented by cross-checks to fix the code distance. In this paper, we provide a detailed introduction to CPC codes using conventional quantum circuit notation. We demonstrate the implementation of a CPC code on real hardware, by designing a [[4, 2, 2]] code.
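As a concrete illustration of bit and phase parity checks, the well-known [[4,2,2]] error-detecting code (stabilizers XXXX and ZZZZ; a textbook construction used here as an assumption, not the paper's CPC circuits) can be checked in the binary symplectic picture:

```python
import numpy as np

# [[4,2,2]] error-detecting code: stabilizers XXXX and ZZZZ,
# written in binary symplectic form (x-part | z-part).
stabilizers = np.array([
    [1, 1, 1, 1, 0, 0, 0, 0],  # XXXX
    [0, 0, 0, 0, 1, 1, 1, 1],  # ZZZZ
], dtype=int)

def syndrome(error):
    """Parity-check syndrome of a Pauli error (x-part | z-part).
    A stabilizer bit is 1 exactly when it anticommutes with the error."""
    n = stabilizers.shape[1] // 2
    sx, sz = stabilizers[:, :n], stabilizers[:, n:]
    ex, ez = np.array(error[:n]), np.array(error[n:])
    return (sx @ ez + sz @ ex) % 2
```

Any single-qubit X or Z error triggers one of the two checks, while some weight-2 errors pass undetected, consistent with distance 2.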
SU-D-BRD-01: An Automated Physics Weekly Chart Checking System Supporting ARIA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, X; Yang, D
Purpose: A software tool was developed in this study to perform automatic weekly physics chart checks on the patient data in ARIA. The tool accesses the electronic patient data directly from the ARIA server, checks the accuracy of treatment deliveries, and generates reports which summarize the delivery history and highlight the errors. Methods: The tool has four modules. 1) The database interface is designed to directly access treatment delivery data from the ARIA database before reorganizing the data into the patient chart tree (PCT). 2) PCT is a core data structure designed to store and organize the data in logical hierarchies, and to be passed among functions. 3) The treatment data check module analyzes the organized data in PCT and stores the checking results into PCT. 4) The report generation module generates reports containing the treatment delivery summary, chart checking results and plots of daily treatment setup parameters (couch table positions, shifts of image guidance). The errors that are found by the tool are highlighted with colors. Results: The weekly check tool has been implemented in MATLAB and clinically tested at two major cancer centers. Javascript, cascading style sheets (CSS) and dynamic HTML were employed to create the user-interactive reports. It takes 0.06 seconds to search the delivery records of one beam with PCT and compare the delivery records with the beam plan. The reports, saved as HTML files on a shared network folder, can be accessed by web browser on computers and mobile devices. Conclusion: The presented weekly check tool is useful for checking electronic patient treatment data in the Varian ARIA system. It could be more efficient and reliable than manual checks by physicists. The work was partially supported by a research grant from Varian Medical Systems.
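The delivery-versus-plan comparison in module 3 can be sketched as below. The field names and tolerances are invented for illustration; they are not ARIA's schema or the tool's actual limits:

```python
def check_delivery(plan, delivered, tol_mu=0.5, tol_couch=0.2):
    """Compare one delivered fraction against the planned beam and
    return a list of error strings (empty list = fraction passes)."""
    errors = []
    if abs(plan["mu"] - delivered["mu"]) > tol_mu:
        errors.append(f"MU mismatch: planned {plan['mu']}, delivered {delivered['mu']}")
    for axis in ("couch_vrt", "couch_lng", "couch_lat"):
        if abs(plan[axis] - delivered[axis]) > tol_couch:
            errors.append(f"{axis} off by {abs(plan[axis] - delivered[axis]):.2f} cm")
    return errors
```

A report generator would then color-highlight any fraction whose error list is non-empty, mirroring the tool's highlighting behaviour.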
Coping with Novelty and Human Intelligence: The Role of Counterfactual Reasoning
1988-01-01
terminating relevance tests. Relevance is determined by checking whether the conceptual relation in the precue matches that in the item of the problem...found that experts tend to conceptualize domain-related problems in abstract terms, whereas nonexperts apparently rely more on surface-level features...finally, the effects of the two surface-structural rule manipulations might for all subjects be partly due to perceptual, rather than conceptual
ERIC Educational Resources Information Center
Kleinman, Kimberly E.; Saigh, Philip A.
2011-01-01
The efficacy of the Good Behavior Game was examined in a multiethnic New York City public high school. Classroom rules were posted and students were divided into two teams. A reinforcement preference questionnaire was used to select daily and weekly prizes. The classroom teacher indicated that he was going to place a check on the board after every…
Investigation of model-based physical design restrictions (Invited Paper)
NASA Astrophysics Data System (ADS)
Lucas, Kevin; Baron, Stanislas; Belledent, Jerome; Boone, Robert; Borjon, Amandine; Couderc, Christophe; Patterson, Kyle; Riviere-Cazaux, Lionel; Rody, Yves; Sundermann, Frank; Toublan, Olivier; Trouiller, Yorick; Urbani, Jean-Christophe; Wimmer, Karl
2005-05-01
As lithography and other patterning processes become more complex and more non-linear with each generation, the task of physical design rules necessarily increases in complexity also. The goal of the physical design rules is to define the boundary between the physical layout structures which will yield well from those which will not. This is essentially a rule-based pre-silicon guarantee of layout correctness. However the rapid increase in design rule requirement complexity has created logistical problems for both the design and process functions. Therefore, similar to the semiconductor industry's transition from rule-based to model-based optical proximity correction (OPC) due to increased patterning complexity, opportunities for improving physical design restrictions by implementing model-based physical design methods are evident. In this paper we analyze the possible need and applications for model-based physical design restrictions (MBPDR). We first analyze the traditional design rule evolution, development and usage methodologies for semiconductor manufacturers. Next we discuss examples of specific design rule challenges requiring new solution methods in the patterning regime of low K1 lithography and highly complex RET. We then evaluate possible working strategies for MBPDR in the process development and product design flows, including examples of recent model-based pre-silicon verification techniques. Finally we summarize with a proposed flow and key considerations for MBPDR implementation.
Extension of specification language for soundness and completeness of service workflow
NASA Astrophysics Data System (ADS)
Viriyasitavat, Wattana; Xu, Li Da; Bi, Zhuming; Sapsomboon, Assadaporn
2018-05-01
A Service Workflow is an aggregation of distributed services to fulfill specific functionalities. With ever increasing available services, the methodologies for selecting services against given requirements have become main research subjects in multiple disciplines. A few researchers have contributed to formal specification languages and methods for model checking; however, existing methods have difficulty tackling the complexity of workflow compositions. In this paper, we propose to formalize the specification language to reduce the complexity of workflow composition. To this end, we extend a specification language with consideration of formal logic, so that effective theorems can be derived for the verification of syntax, semantics, and inference rules in the workflow composition. The logic-based approach automates compliance checking effectively. The Service Workflow Specification (SWSpec) has been extended and formulated, and the soundness, completeness, and consistency of SWSpec applications have been verified; note that a logic-based SWSpec is mandatory for the development of model checking. The application of the proposed SWSpec has been demonstrated by examples addressing soundness, completeness, and consistency.
ERIC Educational Resources Information Center
Hass-Cohen, Noah; Clyde Findlay, Joanna; Carr, Richard; Vanderlan, Jessica
2014-01-01
The Check ("Check, Change What You Need To Change and/or Keep What You Want") art therapy protocol is a sequence of directives for treating trauma that is grounded in neurobiological theory and designed to facilitate trauma narrative processing, autobiographical coherency, and the rebalancing of dysregulated responses to psychosocial…
A Low-Complexity Euclidean Orthogonal LDPC Architecture for Low Power Applications.
Revathy, M; Saravanan, R
2015-01-01
Low-density parity-check (LDPC) codes have been implemented in the latest digital video broadcasting, broadband wireless access (WiMax), and fourth-generation wireless standards. In this paper, we propose a highly efficient LDPC decoder architecture for low-power applications. This study also considers the design and analysis of the check node and variable node units and the Euclidean orthogonal generator in the LDPC decoder architecture. The Euclidean orthogonal generator is used to reduce the error rate of the proposed LDPC architecture, and can be incorporated between the check and variable node architecture. The proposed decoder design is synthesized on the Xilinx 9.2i platform and simulated using Modelsim, targeted to 45 nm devices. The synthesis report shows that the proposed architecture greatly reduces power consumption and hardware utilization compared with different conventional architectures.
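The core condition a check-node architecture enforces is that every parity check is satisfied, i.e. H·c = 0 over GF(2). A sketch with a toy (7,4) parity-check matrix (illustrative; not the DVB/WiMax matrices the decoder targets):

```python
import numpy as np

# Toy (7,4) parity-check matrix H: each row is one check node,
# each column one variable node (illustrative example).
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def is_codeword(c):
    """A received vector is a valid codeword iff every check node is
    satisfied: H @ c = 0 over GF(2)."""
    return not np.any((H @ np.asarray(c)) % 2)
```

An iterative decoder passes messages between variable and check nodes until this condition holds (or an iteration limit is reached); the syndrome `(H @ c) % 2` tells which checks a corrupted word violates.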
Ergün, A Sanlı
2011-10-01
Focused ultrasound therapy relies on acoustic power absorption by tissue. The stronger the absorption, the higher the temperature increase is. However, strong acoustic absorption also means faster attenuation and limited penetration depth. Hence, there is a trade-off between heat generation efficacy and penetration depth. In this paper, we formulated the acoustic power absorption as a function of frequency and attenuation coefficient, and defined two figures of merit to measure the power absorption: spatial peak of the acoustic power absorption density, and the acoustic power absorbed within the focal area. Then, we derived "rule of thumb" expressions for the optimum frequencies that maximized these figures of merit given the target depth and homogeneous tissue type. We also formulated a method to calculate the optimum frequency for inhomogeneous tissue given the tissue composition, for situations where the tissue structure can be assumed to be made of parallel layers of homogeneous tissue. We checked the validity of the rules using linear acoustic field simulations. For a one-dimensional array of 4 cm acoustic aperture, and for a two-dimensional array of 4×4 cm² acoustic aperture, we found that the power absorbed within the focal area is maximized at 0.86 MHz and 0.79 MHz, respectively, when the target depth is 4 cm in muscle tissue. The rules, on the other hand, predicted the optimum frequencies for acoustic power absorption as 0.9 MHz and 0.86 MHz, respectively, for the 1D and 2D array cases, which are within 6% and 9% of the field simulation results. Because radiation force generated by an acoustic wave in a lossy propagation medium is approximately proportional to the acoustic power absorption, these rules can be used to maximize acoustic radiation force generated in tissue as well. Copyright © 2011 Elsevier B.V. All rights reserved.
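For a rough feel of such a rule of thumb, here is a simplified plane-wave sketch (my assumption, not the paper's derivation): with attenuation linear in frequency, a = a0·f, the deposited power density a·exp(-2·a·d) at depth d is maximized at f* = 1/(2·a0·d). The muscle attenuation value a0 = 0.57 dB/cm/MHz is an assumed textbook figure. Because this ignores focusing gain and focal-area scaling, it lands near 1.9 MHz at 4 cm rather than the paper's ~0.86 MHz focal-area optimum:

```python
import math

def absorption_density(f_mhz, depth_cm, a0_db=0.57):
    """Plane-wave power absorption density at depth (arbitrary units),
    assuming attenuation a = a0 * f with a0 in dB/cm/MHz."""
    a_np = a0_db * f_mhz / 8.686          # dB -> Np conversion
    return a_np * math.exp(-2.0 * a_np * depth_cm)

def optimal_frequency(depth_cm, a0_db=0.57):
    """Closed-form maximizer of the density above: f* = 1/(2*a0*d),
    with the dB->Np factor folded in so the result is in MHz."""
    return 8.686 / (2.0 * a0_db * depth_cm)
```

Setting the derivative of a·exp(-2·a·d) with respect to f to zero gives (1 - 2·a·d) = 0, i.e. one neper of one-way attenuation at the target depth.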
Design technology co-optimization for 14/10nm metal1 double patterning layer
NASA Astrophysics Data System (ADS)
Duan, Yingli; Su, Xiaojing; Chen, Ying; Su, Yajuan; Shao, Feng; Zhang, Recco; Lei, Junjiang; Wei, Yayi
2016-03-01
Design and technology co-optimization (DTCO) can satisfy the needs of the design, generate robust design rules, and avoid unfriendly patterns at the early stage of design, ensuring a high level of manufacturability of the product within the technical capability of the present process. The DTCO methodology in this paper mainly includes design rule translation, layout analysis, model validation, hotspot classification and design rule optimization. Combining DTCO with double patterning (DPT) can optimize the related design rules and generate a friendlier layout which meets the requirements of the 14/10nm technology node. The experiment demonstrates the methodology of DPT-compliant DTCO applied to a metal1 layer from the 14/10nm node. The DTCO workflow proposed in our work is an efficient solution for optimizing the design rules for the 14/10nm tech node metal1 layer. The paper also discusses and verifies how to tune the design rules of the U-shape and L-shape structures in a DPT-aware metal layer.
Kaiser, W; Faber, T S; Findeis, M
1996-01-01
The authors developed a computer program that detects myocardial infarction (MI) and left ventricular hypertrophy (LVH) in two steps: (1) by extracting parameter values from a 10-second, 12-lead electrocardiogram, and (2) by classifying the extracted parameter values with rule sets. Every disease has its dedicated set of rules. Hence, there are separate rule sets for anterior MI, inferior MI, and LVH. If at least one rule is satisfied, the disease is said to be detected. The computer program automatically develops these rule sets. A database (learning set) of healthy subjects and patients with MI, LVH, and mixed MI+LVH was used. After defining the rule type, initial limits, and expected quality of the rules (positive predictive value, minimum number of patients), the program creates a set of rules by varying the limits. The general rule type is defined as: disease = lim1,l < p1 ≤ lim1,u and lim2,l < p2 ≤ lim2,u and … and limn,l < pn ≤ limn,u. When defining the rule types, only the parameters (p1 … pn) that are known as clinical electrocardiographic criteria (amplitudes [mV] of Q, R, and T waves and ST-segment; duration [ms] of Q wave; frontal angle [degrees]) were used. This allowed for submitting the learned rule sets to an independent investigator for medical verification. It also allowed the creation of explanatory texts with the rules. These advantages are not offered by the neurons of a neural network. The learned rules were checked against a test set and the following results were obtained: MI: sensitivity 76.2%, positive predictive value 98.6%; LVH: sensitivity 72.3%, positive predictive value 90.9%. The specificity ratings for MI are better than 98%; for LVH, better than 90%.
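The stated rule type evaluates directly as a conjunction of half-open interval tests, with a disease flagged when any rule in its set fires. A minimal sketch, with illustrative limits rather than the program's learned ones:

```python
def rule_satisfied(rule, params):
    """A rule maps parameter name -> (lower, upper), meaning
    lower < value <= upper must hold for every listed parameter."""
    return all(lo < params[name] <= up for name, (lo, up) in rule.items())

def detect(rule_set, params):
    """Disease flagged if at least one rule in its set is satisfied."""
    return any(rule_satisfied(rule, params) for rule in rule_set)

# Illustrative (not the paper's learned limits): an anterior-MI-style
# rule on Q-wave duration [ms] and Q-wave amplitude [mV].
anterior_mi_rules = [
    {"q_duration_ms": (30.0, 200.0), "q_amplitude_mV": (0.1, 5.0)},
]
```

Because each rule is just a list of named interval bounds, the learned rule sets stay human-readable, which is the interpretability advantage the abstract contrasts with neural networks.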
NASA Technical Reports Server (NTRS)
Lane, J. H.
1976-01-01
Performance tests completed on the Space Shuttle Carrier Aircraft (SCA) transceiver console, verifying its design objectives, were described. These tests included: (1) check of power supply voltages for correct output voltage and energization at the proper point in the turn on sequence, (2) check of cooling system (LRU blower, overload sensors and circuitry, and thermocouple probe), (3) check of control circuits logic, including the provisions for remote control and display, (4) check of the LRU connector for presence of correct voltages and absence of incorrect voltages under both energized and deenergized conditions, and (5) check of the AGC and power output monitor circuits.
Validity criteria for Fermi's golden rule scattering rates applied to metallic nanowires.
Moors, Kristof; Sorée, Bart; Magnus, Wim
2016-09-14
Fermi's golden rule underpins the investigation of mobile carriers propagating through various solids, being a standard tool to calculate their scattering rates. As such, it provides a perturbative estimate under the implicit assumption that the effect of the interaction Hamiltonian which causes the scattering events is sufficiently small. To check the validity of this assumption, we present a general framework to derive simple validity criteria in order to assess whether the scattering rates can be trusted for the system under consideration, given its statistical properties such as average size, electron density, impurity density et cetera. We derive concrete validity criteria for metallic nanowires with conduction electrons populating a single parabolic band subjected to different elastic scattering mechanisms: impurities, grain boundaries and surface roughness.
Ishimaru, Tomohiro; Hattori, Michihiro; Nagata, Masako; Kuwahara, Keisuke; Watanabe, Seiji; Mori, Koji
2018-01-01
The stress check program has been part of annual employees' health screening since 2015. Employees are recommended, but not obliged, to undergo the stress check offered. This study was designed to examine the factors associated with stress check attendance. A total of 31,156 Japanese employees who underwent an annual health examination and a stress check service at an Occupational Health Service Center in 2016 participated in this study. Data from the annual health examination and stress check service included stress check attendance, date of attendance (if implemented), gender, age, workplace industry, number of employees at the workplace, and tobacco and alcohol consumption. Data were analyzed using multiple logistic regression. The mean rate of stress check attendance was 90.8%. A higher rate of stress check attendance was associated with a shorter interval between the stress check and the annual health examination, age ≥30 years, the construction and transport industries, and 50-999 employees at the workplace. A lower rate of stress check attendance was associated with the medical and welfare industry and ≥1,000 employees at the workplace. These findings provide insights into developing strategies for improving the rate of stress check attendance. In particular, stress check attendance may improve if the stress check service and annual health examination are conducted simultaneously.
10 CFR Appendix C to Part 52 - Design Certification Rule for the AP600 Design
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 2 2012-01-01 Design Certification Rule for the AP600 Design. Appendix C to Part 52, Energy, NUCLEAR REGULATORY COMMISSION (CONTINUED), LICENSES, CERTIFICATIONS, AND APPROVALS FOR NUCLEAR POWER PLANTS, Pt. 52, App. C. Appendix C to Part 52—Design Certification Rule for the...
NASA Astrophysics Data System (ADS)
Servilla, M. S.; O'Brien, M.; Costa, D.
2013-12-01
Considerable ecological research performed today occurs through the analysis of data downloaded from various repositories and archives, often resulting in derived or synthetic products generated by automated workflows. These data are only meaningful for research if they are well documented by metadata, lest semantic or data type errors may occur in interpretation or processing. The Long Term Ecological Research (LTER) Network now screens all data packages entering its long-term archive to ensure that each package contains metadata that is complete, of high quality, and accurately describes the structure of its associated data entity and the data are structurally congruent to the metadata. Screening occurs prior to the upload of a data package into the Provenance Aware Synthesis Tracking Architecture (PASTA) data management system through a series of quality checks, thus preventing ambiguously or incorrectly documented data packages from entering the system. The quality checks within PASTA are designed to work specifically with the Ecological Metadata Language (EML), the metadata standard adopted by the LTER Network to describe data generated by their 26 research sites. Each quality check is codified in Java as part of the ecological community-supported Data Manager Library, which is a resource of the EML specification and used as a component of the PASTA software stack. Quality checks test for metadata quality, data integrity, or metadata-data congruence. Quality checks are further classified as either conditional or informational. Conditional checks issue a 'valid', 'warning' or 'error' response. Only an 'error' response blocks the data package from upload into PASTA. Informational checks only provide descriptive content pertaining to a particular facet of the data package. Quality checks are designed by a group of LTER information managers and reviewed by the LTER community before deploying into PASTA. A total of 32 quality checks have been deployed to date. 
Quality checks can be customized through a configurable template, which includes turning checks 'on' or 'off' and setting the severity of conditional checks. This feature is important to other potential users of the Data Manager Library who wish to configure its quality checks in accordance with the standards of their community. Executing the complete set of quality checks produces a report that describes the result of each check. The report is an XML document that is stored by PASTA for future reference.
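The conditional-check behaviour described above (each check returns 'valid', 'warning' or 'error'; only an 'error' blocks upload; every result lands in a report) can be sketched as follows. The two check functions are simplified stand-ins, not the Data Manager Library's actual checks:

```python
def run_quality_checks(checks, package):
    """Run conditional checks over a data package. Each check returns
    'valid', 'warning' or 'error'; only an 'error' blocks the upload."""
    report = [(check.__name__, check(package)) for check in checks]
    blocked = any(status == "error" for _, status in report)
    return report, blocked

def check_title_present(pkg):
    # Conditional check: a missing title is a hard failure here.
    return "valid" if pkg.get("title") else "error"

def check_keywords_present(pkg):
    # Conditional check configured with 'warning' severity.
    return "valid" if pkg.get("keywords") else "warning"
```

The configurable-template idea maps onto choosing which check functions to pass in and what severity each returns; the report list plays the role of PASTA's stored XML quality report.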
A novel methodology for building robust design rules by using design based metrology (DBM)
NASA Astrophysics Data System (ADS)
Lee, Myeongdong; Choi, Seiryung; Choi, Jinwoo; Kim, Jeahyun; Sung, Hyunju; Yeo, Hyunyoung; Shim, Myoungseob; Jin, Gyoyoung; Chung, Eunseung; Roh, Yonghan
2013-03-01
This paper addresses a methodology for building robust design rules by using design based metrology (DBM). The conventional method for building design rules has been to use a simulation tool and a simple-pattern spider mask. At the early stage of a device, the estimation accuracy of the simulation tool is poor, and the evaluation of the simple-pattern spider mask is rather subjective because it depends on the experiential judgment of an engineer. In this work, we designed a huge number of pattern situations including various 1D and 2D design structures. To overcome the difficulties of inspecting many types of patterns, we introduced the Design Based Metrology (DBM) of Nano Geometry Research, Inc., with which these mass patterns could be inspected at high speed. We also carried out quantitative analysis on PWQ silicon data to estimate process variability. Our methodology demonstrates high speed and accuracy for building design rules. All of the test patterns were inspected within a few hours. Mass silicon data were handled not by personal decision but by statistical processing. From the results, robust design rules are successfully verified and extracted. Finally, we found that our methodology is appropriate for building robust design rules.
[Design and implementation of data checking system for Chinese materia medica resources survey].
Wang, Hui; Zhang, Xiao-Bo; Ge, Xiao-Guang; Jin, Yan; Jing, Zhi-Xian; Qi, Yuan-Hua; Wang, Ling; Zhao, Yu-Ping; Wang, Wei; Guo, Lan-Ping; Huang, Lu-Qi
2017-11-01
The Chinese materia medica resources (CMMR) national survey information management system has collected a large amount of data. To help with data rechecking, reduce the internal workload, and improve the rechecking of survey data at the provincial and county levels, the National Resource Center for Chinese Materia Medica designed a data checking system for the CMMR survey based on J2EE technology, the Java language, and an Oracle database, in accordance with the SOA framework. It includes single data checks, check scoring, content management, and checking of survey data against census data, with both manual and automatic checking, covering nine aspects (census implementation plans, key research information, general survey information, cultivation of medicinal materials, germplasm resources, medicine information, market research, traditional knowledge, and specimen information), 20 classes, and 175 indicators, in terms of both quantity and quality. The established system assists in verifying data consistency and accuracy, and pushes county survey teams to complete data entry and organization in a timely manner, so as to improve the integrity, consistency and accuracy of the survey data and ensure that the data are valid and available. This lays a foundation for providing accurate data support for the national CMMR survey results summary, display, and sharing. Copyright© by the Chinese Pharmaceutical Association.
Finite-Time Performance of Local Search Algorithms: Theory and Application
2010-06-10
security devices deployed at airport security checkpoints are used to detect prohibited items (e.g., guns, knives, explosives). Each security device...security devices are deployed, the practical issue of determining how to optimally use them can be difficult. For an airport security system design...checked baggage), explosive detection systems (designed to detect explosives in checked baggage), and detailed hand search by an airport security official
DataPlus - a revolutionary applications generator for DOS hand-held computers
David Dean; Linda Dean
2000-01-01
DataPlus allows the user to easily design data collection templates for DOS-based hand-held computers that mimic clipboard data sheets. The user designs and tests the application on the desktop PC and then transfers it to a DOS field computer. Other features include: error checking, missing data checks, and sensor input from RS-232 devices such as bar code wands,...
Code of Federal Regulations, 2011 CFR
2011-07-01
... is not evidence of a violation of law or has design or other characteristics that particularly suit it for use in illegal activities. This bond must be in the form of a traveler's check, a money order... Service. A bond in the form of a cashier's check will be considered as paid once the check has been...
Code of Federal Regulations, 2010 CFR
2010-07-01
... is not evidence of a violation of law or has design or other characteristics that particularly suit it for use in illegal activities. This bond must be in the form of a traveler's check, a money order... Service. A bond in the form of a cashier's check will be considered as paid once the check has been...
A model-based approach for the scattering-bar printing avoidance
NASA Astrophysics Data System (ADS)
Du, Yaojun; Li, Liang; Zhang, Jingjing; Shao, Feng; Zuniga, Christian; Deng, Yunfei
2018-03-01
As the technology node for semiconductor manufacturing approaches advanced nodes, scattering bars (SBs) are more crucial than ever to ensure good on-wafer printability of line-space patterns and hole patterns. Main patterns with small pitches require a very narrow PV (process variation) band. A delicate SB addition scheme is thus needed to maintain a sufficient PW (process window) for semi-iso and iso patterns. In general, SBs that are wider, longer, and closer to the main feature are more effective in enhancing printability; on the other hand, they are also more likely to be printed on the wafer, resulting in undesired defects transferable to subsequent processes. In this work, we have developed a model-based approach for scattering-bar printing avoidance (SPA). A specially designed optical model was tuned based on a broad range of test patterns which contain a variation of CDs and SB placements showing printing and non-printing scattering bars. A printing threshold is then obtained to check for extra printing of SBs. The accuracy of this threshold is verified by pre-designed test patterns. The printing threshold associated with our novel SPA model allows us to set up a proper SB rule.
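The threshold test at the heart of SPA reduces to comparing each assist feature's simulated peak aerial-image intensity against the resist printing threshold. A sketch with made-up names and intensity values (the real check would take the peaks from the tuned optical model):

```python
def printing_sbs(sb_peak_intensities, threshold):
    """Model-based SPA check: flag assist features whose simulated
    peak aerial-image intensity reaches the printing threshold,
    i.e. the scattering bars that would print on the wafer."""
    return [name for name, peak in sb_peak_intensities.items()
            if peak >= threshold]
```

Flagged SBs would then be shortened, narrowed, or moved away from the main feature and the simulation rerun, trading some printability enhancement for printing avoidance.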
Dynamic Response of Functionally Graded Carbon Nanotube Reinforced Sandwich Plate
NASA Astrophysics Data System (ADS)
Mehar, Kulmani; Panda, Subrata Kumar
2018-03-01
In this article, the dynamic response of the carbon nanotube-reinforced functionally graded sandwich composite plate has been studied numerically with the help of the finite element method. The face sheets of the sandwich composite plate are made of carbon nanotube-reinforced composite for two different grading patterns, whereas the core phase is taken as an isotropic material. The final properties of the structure are calculated using the rule of mixture. The geometrical model of the sandwich plate is developed and discretized suitably with the help of an available shell element in the ANSYS library. Subsequently, the corresponding numerical dynamic responses are computed via the batch input technique (parametric design language code in ANSYS), including Newmark's integration scheme. The stability of the sandwich structural numerical model is established through a proper convergence study. Further, the reliability of the sandwich model is checked by a comparison study between the present results and available results from references. Finally, some numerical problems have been solved to examine the effect of different design constraints (carbon nanotube distribution pattern, core-to-face thickness ratio, volume fraction of the nanotube, length-to-thickness ratio, aspect ratio and constraints at edges) on the time responses of the sandwich plate.
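The rule of mixture mentioned above, plus one common through-thickness grading pattern (FG-X), can be sketched as follows. The efficiency parameter and material values are illustrative assumptions, not the article's:

```python
def rule_of_mixture(v_cnt, e_cnt, e_matrix, eta=1.0):
    """Longitudinal modulus by the (extended) rule of mixture:
    E = eta * V_cnt * E_cnt + (1 - V_cnt) * E_m,
    where eta is a CNT efficiency parameter (1.0 = ideal mixture)."""
    return eta * v_cnt * e_cnt + (1.0 - v_cnt) * e_matrix

def fg_x_volume_fraction(z, h, v_star):
    """FG-X grading: CNT volume fraction grows linearly from zero at the
    mid-plane (z = 0) to 2*v_star at the surfaces (z = +/- h/2), so the
    through-thickness average equals v_star."""
    return 2.0 * (2.0 * abs(z) / h) * v_star
```

Feeding the z-dependent volume fraction into the rule of mixture yields the graded face-sheet modulus used pointwise in a layered finite element model.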
The use of self checks and voting in software error detection - An empirical study
NASA Technical Reports Server (NTRS)
Leveson, Nancy G.; Cha, Stephen S.; Knight, John C.; Shimeall, Timothy J.
1990-01-01
The results of an empirical study of software error detection using self checks and N-version voting are presented. Working independently, each of 24 programmers first prepared a set of self checks using just the requirements specification of an aerospace application, and then each added self checks to an existing implementation of that specification. The modified programs were executed to measure the error-detection performance of the checks and to compare this with error detection using simple voting among multiple versions. The analysis of the checks revealed that there are great differences in the ability of individual programmers to design effective checks. It was found that some checks that might have been effective failed to detect an error because they were badly placed, and there were numerous instances of checks signaling nonexistent errors. In general, specification-based checks alone were not as effective as specification-based checks combined with code-based checks. Self checks made it possible to identify faults that had not been detected previously by voting 28 versions of the program over a million randomly generated inputs. This appeared to result from the fact that the self checks could examine the internal state of the executing program, whereas voting examines only final results of computations. If internal states had to be identical in N-version voting systems, then there would be no reason to write multiple versions.
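The contrast between N-version voting and self checks can be sketched as follows. `checked_sqrt` is a toy self check asserting an internal invariant, not one of the study's aerospace checks:

```python
import math
from collections import Counter

def majority_vote(outputs):
    """N-version voting: accept the result produced by a strict majority
    of the versions; return None when no majority exists. Voting sees
    only final results, so identical wrong outputs are accepted."""
    value, count = Counter(outputs).most_common(1)[0]
    return value if count > len(outputs) / 2 else None

def checked_sqrt(x):
    """A self check can examine the executing program's internal state:
    here, the result is validated against the defining invariant
    result**2 == x before being returned."""
    result = math.sqrt(x)
    assert abs(result * result - x) <= 1e-9 * max(1.0, x), "self check failed"
    return result
```

This mirrors the study's observation: voting compares only final outputs across versions, while a well-placed self check validates an internal invariant and can catch faults common to several versions.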
A novel approach for connecting temporal-ontologies with blood flow simulations.
Weichert, F; Mertens, C; Walczak, L; Kern-Isberner, G; Wagner, M
2013-06-01
In this paper an approach for developing a temporal domain ontology for biomedical simulations is introduced. The ideas are presented in the context of simulations of blood flow in aneurysms using the Lattice Boltzmann Method. The advantages of using ontologies are manifold: On the one hand, ontologies have been proven able to provide specialized medical knowledge, e.g., key parameters for simulations. On the other hand, based on a set of rules and the usage of a reasoner, a system for checking the plausibility as well as tracking the outcome of medical simulations can be constructed. Likewise, results of simulations, including data derived from them, can be stored and communicated in a way that can be understood by computers. Later on, this set of results can be analyzed. At the same time, the ontologies provide a way to exchange knowledge between researchers. Lastly, this approach can be seen as a black-box abstraction of the internals of the simulation for the biomedical researcher as well. This approach is able to provide the complete parameter sets for simulations, part of the corresponding results and part of their analysis, as well as, e.g., geometry and boundary conditions. These inputs can be transferred to different simulation methods for comparison. Variations on the provided parameters can be automatically used to drive these simulations. Using a rule base, unphysical inputs or outputs of the simulation can be detected and communicated to the physician in a suitable and familiar way. An example of an instantiation of the blood flow simulation ontology and exemplary rules for plausibility checking are given. Copyright © 2013 Elsevier Inc. All rights reserved.
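The paper's rule-based detection of unphysical inputs can be illustrated with a minimal sketch. The parameter names and bounds below are hypothetical placeholders, not the paper's actual rule base (which is ontology-driven and reasoner-backed):

```python
# Hypothetical plausibility rules for a blood-flow (Lattice Boltzmann) run.
RULES = [
    ("viscosity_Pa_s", lambda v: 0.001 <= v <= 0.01,
     "blood viscosity outside the physiological range"),
    ("inlet_velocity_m_s", lambda v: 0.0 < v < 2.0,
     "inlet velocity is unphysical"),
]

def check_plausibility(params):
    """Return a human-readable message for every violated rule."""
    messages = []
    for key, ok, message in RULES:
        if key in params and not ok(params[key]):
            messages.append(f"{key}={params[key]}: {message}")
    return messages

print(check_plausibility({"viscosity_Pa_s": 0.05, "inlet_velocity_m_s": 0.4}))
```

In the real system such messages would be communicated to the physician in domain-familiar terms rather than as raw parameter names.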
Assume-Guarantee Verification of Source Code with Design-Level Assumptions
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.
2004-01-01
Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source-code level. In particular, we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.
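The decomposition described above rests on the standard non-circular assume-guarantee rule, where the triple $\langle A\rangle\, M\, \langle P\rangle$ means "if the environment of component $M$ satisfies assumption $A$, then $M$ satisfies property $P$":

```latex
\frac{\langle A\rangle\; M_1\; \langle P\rangle
      \qquad
      \langle \mathit{true}\rangle\; M_2\; \langle A\rangle}
     {\langle \mathit{true}\rangle\; M_1 \parallel M_2\; \langle P\rangle}
```

That is, if $M_1$ guarantees $P$ under assumption $A$, and $M_2$ unconditionally discharges $A$, then the composed system $M_1 \parallel M_2$ satisfies $P$, without ever model checking the full composition.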
Downing, Christopher; Howard, E Henry; Goodwin, Christina; Geller, E Scott
2016-01-01
Two studies examined factors influencing cashiers' identification (ID)-checking behavior in order to inform the development of interventions to prevent credit-card fraud. In both studies, research assistants made credit purchases in various stores and noted the cashiers' ID-checking behavior. In the first study, the store type, whether the cashier swiped the credit/debit card, the amount of the purchase, and whether the credit/debit card was signed significantly influenced ID-checking behavior. In the second study, an A-B-A design was used to evaluate the impact of a "Check my ID" prompt placed on the credit/debit card. The prompt increased cashiers' ID-checking behavior from 5.9% at Baseline to 10.3% during the Intervention. When the prompt was removed, the cashiers' ID-checking behavior decreased to 7.2%. Implications for further intervention research to prevent credit-card fraud are discussed.
Choices and Challenges: A Guide for the Battalion Commander’s Wife
1991-05-28
make sure you allow yourself plenty of time between them (about 20 minutes) to replenish food and beverage. Whom do you invite? There are no rules. Some...Etiquette" .............. 91 P. Information Letter for a Unit Dining Out V (Sample) ......... ..................... 97 Q. Conducting a Workshop...Fundraising Events: bake sales (donation only), food booth sales, auctions and raffles (days off or passes for soldiers are illegal). Check your local
A Low-Complexity Euclidean Orthogonal LDPC Architecture for Low Power Applications
Revathy, M.; Saravanan, R.
2015-01-01
Low-density parity-check (LDPC) codes have been implemented in the latest digital video broadcasting, broadband wireless access (WiMax), and fourth-generation wireless standards. In this paper, we propose a highly efficient LDPC decoder architecture for low power applications. This study also considers the design and analysis of the check node and variable node units and the Euclidean orthogonal generator in the LDPC decoder architecture. The Euclidean orthogonal generator is used to reduce the error rate of the proposed LDPC architecture and can be incorporated between the check and variable node architecture. The proposed decoder design is synthesized on the Xilinx 9.2i platform and simulated using Modelsim, targeted to 45 nm devices. The synthesis report proves that the proposed architecture greatly reduces power consumption and hardware utilization compared with different conventional architectures. PMID:26065017
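The check-node/variable-node split the abstract refers to comes directly from the parity-check matrix H: each row is a check node computing a mod-2 sum over the variable nodes (codeword bits) it connects to. A minimal syndrome check, with an illustrative H rather than the one from the paper's architecture:

```python
# Illustrative 3x6 parity-check matrix (each row = one check node).
H = [
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 1, 0, 0, 1],
]

def syndrome(H, codeword):
    """Each check node returns the mod-2 sum of its variable nodes."""
    return [sum(h * c for h, c in zip(row, codeword)) % 2 for row in H]

def is_valid(H, codeword):
    """A codeword is valid iff every parity check is satisfied (syndrome 0)."""
    return all(s == 0 for s in syndrome(H, codeword))

print(is_valid(H, [1, 0, 1, 1, 1, 0]))   # satisfies all three checks
print(is_valid(H, [0, 0, 1, 1, 1, 0]))   # one flipped bit breaks a check
```

An iterative decoder repeats this exchange between check and variable nodes until the syndrome is zero or an iteration limit is reached.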
Inkjet 3D printed check microvalve
NASA Astrophysics Data System (ADS)
Walczak, Rafał; Adamski, Krzysztof; Lizanets, Danylo
2017-04-01
3D printing enables fast and relatively easy fabrication of various microfluidic structures including microvalves. A check microvalve is the simplest valve enabling control of the fluid flow in microchannels. Proper operation of the check valve is ensured by a movable element that tightens the valve seat during backward flow and enables free flow for forward pressure. Thus, knowledge of the mechanical properties of the movable element is crucial for optimal design and operation of the valve. In this paper, we present for the first time the results of investigations on basic mechanical properties of the building material used in multijet 3D printing. Specified mechanical properties were used in the design and fabrication of two types of check microvalve—with deflecting or hinge-fixed microflap—with 200 µm and 300 µm thickness. Results of numerical simulation and experimental data of the microflap deflection were obtained and compared. The valves were successfully 3D printed and characterised. Opening/closing characteristics of the microvalve for forward and backward pressures were determined. Thus, proper operation of the check microvalve so developed was confirmed.
Optical proximity correction for anamorphic extreme ultraviolet lithography
NASA Astrophysics Data System (ADS)
Clifford, Chris; Lam, Michael; Raghunathan, Ananthan; Jiang, Fan; Fenger, Germain; Adam, Kostas
2017-10-01
The change from isomorphic to anamorphic optics in high numerical aperture (NA) extreme ultraviolet (EUV) scanners necessitates changes to the mask data preparation flow. The required changes for each step in the mask tape out process are discussed, with a focus on optical proximity correction (OPC). When necessary, solutions to new problems are demonstrated, and verified by rigorous simulation. Additions to the OPC model include accounting for anamorphic effects in the optics, mask electromagnetics, and mask manufacturing. The correction algorithm is updated to include awareness of anamorphic mask geometry for mask rule checking (MRC). OPC verification through process window conditions is enhanced to test different wafer scale mask error ranges in the horizontal and vertical directions. This work will show that existing models and methods can be updated to support anamorphic optics without major changes. Also, the larger mask size in the Y direction can result in better model accuracy, easier OPC convergence, and designs which are more tolerant to mask errors.
NASA Technical Reports Server (NTRS)
Borgen, Richard L.
2013-01-01
The configuration of ION (Interplanetary Overlay Network) network nodes is a manual task that is complex, time-consuming, and error-prone. This program seeks to accelerate this job and produce reliable configurations. The ION Configuration Editor is a model-based smart editor based on Eclipse Modeling Framework technology. An ION network designer uses this Eclipse-based GUI to construct a data model of the complete target network and then generate configurations. The data model is captured in an XML file. Intrinsic editor features aid in achieving model correctness, such as field fill-in, type-checking, lists of valid values, and suitable default values. Additionally, an explicit "validation" feature executes custom rules to catch more subtle model errors. A "survey" feature provides a set of reports giving an overview of the entire network, enabling a quick assessment of the model's completeness and correctness. The "configuration" feature produces the main final result, a complete set of ION configuration files (eight distinct file types) for each ION node in the network.
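The editor features listed above (type checking, valid-value lists, defaults, custom rules) can be sketched as a single field validator. The node fields, roles, and defaults below are hypothetical, not ION's actual schema:

```python
# Hypothetical node model; ION's real schema differs.
DEFAULTS = {"mtu": 1500}
VALID_ROLES = {"source", "relay", "sink"}

def validate_node(node):
    """Return a list of error messages for one node description."""
    errors = []
    node = {**DEFAULTS, **node}                        # suitable default values
    if not isinstance(node.get("node_nbr"), int):      # type checking
        errors.append("node_nbr must be an integer")
    if node.get("role") not in VALID_ROLES:            # list of valid values
        errors.append(f"role must be one of {sorted(VALID_ROLES)}")
    if node["mtu"] <= 0:                               # custom validation rule
        errors.append("mtu must be positive")
    return errors

print(validate_node({"node_nbr": "7", "role": "relay"}))
```

A model-based editor applies checks like these continuously while editing, rather than after configuration files have been generated.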
NASA Astrophysics Data System (ADS)
Jiang, Jingtao; Sui, Rendong; Shi, Yan; Li, Furong; Hu, Caiqi
In this paper 3-D models of combined fixture elements are designed, classified by their functions, and saved in the computer as a supporting elements library, jointing elements library, basic elements library, localization elements library, clamping elements library, adjusting elements library, etc. Automatic assembly of a 3-D combined checking fixture for an auto-body part is then presented based on modularization theory. In the virtual auto-body assembly space, a locating-constraint mapping technique and assembly rule-based reasoning are used to calculate the positions of the modular elements according to the localization points and clamp points of the auto-body part. The auto-body part model is transformed from its own coordinate system into the virtual assembly space by a homogeneous transformation matrix. Automatic assembly of the different functional fixture elements and the auto-body part is implemented with API functions based on the secondary development of UG. It is proven in practice that the method presented in this paper is feasible and highly efficient.
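The homogeneous-transformation step can be shown concretely: a 4x4 matrix combines rotation and translation so that part-frame points map into the assembly space with one multiplication. The rotation axis and numeric values below are illustrative only:

```python
import math

def homogeneous(rotation_z_deg, tx, ty, tz):
    """4x4 homogeneous matrix: rotation about z followed by translation."""
    a = math.radians(rotation_z_deg)
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0, tx],
            [s,  c, 0, ty],
            [0,  0, 1, tz],
            [0,  0, 0, 1]]

def transform(T, point):
    """Map a 3D point through T using homogeneous coordinates."""
    x, y, z = point
    p = (x, y, z, 1.0)
    return tuple(sum(T[i][j] * p[j] for j in range(4)) for i in range(3))

# Map a part-frame point into the virtual assembly space (values illustrative).
T = homogeneous(90, 100.0, 0.0, 50.0)
print(transform(T, (10.0, 0.0, 0.0)))
```

Chaining such matrices (part frame to fixture frame to assembly frame) is what lets each fixture element be positioned from the localization and clamp points.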
The Changes in Corneal Astigmatism after Botulinum Toxin-A Injection in Patients with Blepharospasm
Moon, Nam Ju; Lee, Hyeon Il
2006-01-01
To determine whether the involuntary contractions of the eyelids may have any effect on the development of corneal astigmatism, we performed this prospective study, which included 19 patients with either essential blepharospasm or hemifacial spasm. In hemifacial spasm, the degree of corneal astigmatism was compared between the two eyes. The topographic changes were then assessed using a vector analysis technique before and after passively opening the eyelids. They were also measured before and at 1 and 6 months after the injection of Botulinum toxin. As a result, 20 eyes had with-the-rule (group 1) and 9 eyes against-the-rule (group 2) astigmatism. In hemifacial spasm, significantly more astigmatism was found in the spastic eyes. The corneal topographic changes after passively opening the eyelids showed 10 eyes with an astigmatic shift toward with-the-rule, while the remaining 19 shifted toward against-the-rule. At 1 month after injection of Botulinum toxin, group 1 showed reduced average corneal astigmatism, whereas group 2 showed increased astigmatism. The astigmatic change vector was significantly more against-the-rule. On the contrary, 6 months after treatment, corneal astigmatism again increased in group 1 and decreased in group 2, eventually returning to the pretreatment astigmatic status. In conclusion, the eyelids may play an important role in corneal curvature. PMID:16479079
Ergonomics and regulatory politics: the Washington State case.
Silverstein, Michael
2007-05-01
Every year in the State of Washington more than 50,000 workers experience a work related musculoskeletal disorder (WMSD), making up more than 30% of all worker compensation cases. In 2000, the Washington State Department of Labor and Industries (L&I) adopted a workplace ergonomics rule requiring employers to reduce worker exposure to hazards that cause or contribute to WMSDs. In 2003, the ergonomics rule was repealed by a margin of 53.5-46.5 in a statewide voter initiative. The official rulemaking record of approximately 100,000 pages, along with supplementary published and unpublished material, was reviewed. The relationship between scientific deliberation and the public policy process in adopting and repealing the ergonomics rule was assessed and described. The deliberative features of the regulatory, judicial, legislative, and ballot processes were compared. The ergonomics rule was successful in the regulatory and legal arenas where the process was most transparent and open to public involvement, differing views could be presented fully, and decision makers were expected to explain their decisions in light of the record. The rule fared most poorly in the legislature and at the ballot box when these features were lost and where considered deliberation was replaced by unconstrained political conflict. Additional checks and balances are needed.
Counter-ions at single charged wall: Sum rules.
Samaj, Ladislav
2013-09-01
For inhomogeneous classical Coulomb fluids in thermal equilibrium, like the jellium or the two-component Coulomb gas, there exists a variety of exact sum rules which relate the particle one-body and two-body densities. The necessary condition for these sum rules is that the Coulomb fluid possesses good screening properties, i.e. the particle correlation functions or the averaged charge inhomogeneity, say close to a wall, exhibit a short-range (usually exponential) decay. In this work, we study equilibrium statistical mechanics of an electric double layer with counter-ions only, i.e. a globally neutral system of equally charged point-like particles in the vicinity of a plain hard wall carrying a fixed uniform surface charge density of opposite sign. At large distances from the wall, the one-body and two-body counter-ion densities go to zero slowly according to the inverse-power law. In spite of the absence of screening, all known sum rules are shown to hold for two exactly solvable cases of the present system: in the weak-coupling Poisson-Boltzmann limit (in any spatial dimension larger than one) and at a special free-fermion coupling constant in two dimensions. This fact indicates an extended validity of the sum rules and provides a consistency check for reasonable theoretical approaches.
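For orientation, the weak-coupling Poisson-Boltzmann limit referred to above is the classical Gouy-Chapman solution, whose inverse-power-law decay and contact-value sum rule can be written explicitly (standard three-dimensional results for monovalent point counter-ions, stated here as an illustration rather than taken from the paper):

```latex
n(x) = \frac{1}{2\pi \ell_{\mathrm{B}}\,(x+b)^{2}},
\qquad
b = \frac{1}{2\pi \ell_{\mathrm{B}}\,\sigma},
\qquad
n(0) = 2\pi \ell_{\mathrm{B}}\,\sigma^{2},
```

where $\ell_{\mathrm{B}}$ is the Bjerrum length, $\sigma$ the surface charge density in elementary charges per unit area, and $b$ the Gouy-Chapman length. Integrating the profile recovers global neutrality, $\int_{0}^{\infty} n(x)\,dx = \sigma$, and the contact value $n(0)$ is exactly the form the sum rules constrain despite the slow $1/x^{2}$ decay.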
65nm OPC and design optimization by using simple electrical transistor simulation
NASA Astrophysics Data System (ADS)
Trouiller, Yorick; Devoivre, Thierry; Belledent, Jerome; Foussadier, Franck; Borjon, Amandine; Patterson, Kyle; Lucas, Kevin; Couderc, Christophe; Sundermann, Frank; Urbani, Jean-Christophe; Baron, Stanislas; Rody, Yves; Chapon, Jean-Damien; Arnaud, Franck; Entradas, Jorge
2005-05-01
In the context of 65nm logic technology where gate CD control budget requirements are below 5nm, it is mandatory to properly quantify the impact of the 2D effects on the electrical behavior of the transistor [1,2]. This study uses the following sequence to estimate the impact on transistor performance: 1) A lithographic simulation is performed after OPC (Optical Proximity Correction) of active and poly using a calibrated model at best conditions. Some extrapolation of this model can also be used to assess marginalities due to process window (focus, dose, mask errors, and overlay). In our case study, we mainly checked the poly-to-active misalignment effects. 2) Electrical behavior of the transistor (Ion, Ioff, Vt) is calculated based on a derivative SPICE model using the simulated image of the gate as an input. In most cases Ion analysis, rather than Vt or leakage, gives sufficient information for patterning optimization. We have demonstrated the benefit of this approach with two different examples: - Design rule trade-off: we estimated the impact, with and without misalignment, of critical rules like poly corner to active distance, active corner to poly distance, or minimum space between a small transistor and a big transistor. - Library standard cell debugging: we applied this methodology to the one hundred most critical transistors of our standard cell libraries and calculated Ion behavior with and without misalignment between active and poly. We compared two scanner illumination modes and two OPC versions based on the behavior of the one hundred transistors. We were able to see the benefits of one illumination, and also the improvement in OPC maturity.
Design rules for RCA self-aligned silicon-gate CMOS/SOS process
NASA Technical Reports Server (NTRS)
1977-01-01
The CMOS/SOS design rules prepared by the RCA Solid State Technology Center (SSTC) are described. These rules specify the spacing and width requirements for each of the six design levels, the seventh level being used to define openings in the passivation level. An associated report, entitled Silicon-Gate CMOS/SOS Processing, provides further insight into the usage of these rules.
Conceptual design of ACB-CP for ITER cryogenic system
NASA Astrophysics Data System (ADS)
Jiang, Yongcheng; Xiong, Lianyou; Peng, Nan; Tang, Jiancheng; Liu, Liqiang; Zhang, Liang
2012-06-01
ACB-CP (Auxiliary Cold Box for Cryopumps) is used to supply the cryopump system with the necessary cryogen in the ITER (International Thermonuclear Experimental Reactor) cryogenic distribution system. The conceptual design of the ACB-CP comprises thermo-hydraulic analysis, 3D structure design, and strength checking. Through the thermo-hydraulic analysis, the main specifications of the process valves, pressure safety valves, pipes, and heat exchangers can be determined. During the 3D structure design process, the vacuum requirement, adiabatic requirement, assembly constraints, and maintenance requirement have been considered in arranging the pipes, valves, and other components. The strength checking has been performed to verify that the 3D design meets the strength requirements for the ACB-CP.
Use of standard vocabulary services in validation of water resources data
NASA Astrophysics Data System (ADS)
Yu, Jonathan; Cox, Simon; Ratcliffe, David
2010-05-01
Ontology repositories are increasingly being exposed through vocabulary and concept services. Primarily this is in support of resource discovery. Thesaurus functionality and even more sophisticated reasoning offer the possibility of overcoming the limitations of the simple text-matching and tagging which is the basis of most search. However, controlled vocabularies have other important roles in distributed systems: in particular in constraining content validity. A national water information system established by the Australian Bureau of Meteorology ('the Bureau') has deployed a system for ingestion of data from multiple providers. This uses an HTTP interface onto separately maintained vocabulary services as part of the quality assurance chain. With over 200 data providers potentially transferring data to the Bureau, a standard XML-based Water Data Transfer Format (WDTF) was developed for receipt of data into an integrated national water information system. The WDTF schema was built upon standards from the Open Geospatial Consortium (OGC). The structure and syntax specified by a W3C XML Schema are complemented by additional constraints described using Schematron. These implement important content requirements and business rules, including:
• Restricted cardinality: optional elements and attributes inherited from the base standards become mandatory in the application, or repeatable elements or attributes are limited to one or omitted. For example, the sampledFeature element from O&M is optional but is mandatory for a samplingPoint element in WDTF.
• Vocabulary checking: WDTF data use seventeen vocabularies or code lists derived from Regulations under the Commonwealth Water Act 2007. Examples of code lists are the Australian Water Regulations list, the observed property vocabulary, and units of measure.
• Contextual constraints: in many places, the permissible value is dependent on the value of another field.
For example, within observations the unit of measure must be commensurate with the observed property type. Validation of data submitted in WDTF uses a two-pass approach. First, syntax and structural validation is performed by standard XML Schema validation tools. Second, validation of contextual constraints and code list checking is performed using a hybrid method combining context-sensitive rule-based validation (allowing the rules to be expressed within a given context) and semantic vocabulary services. Schematron allows rules to incorporate assertions of XPath expressions to access and constrain element content, thereby enabling contextual constraints. Schematron is also used to perform element cardinality checking. The vocabularies or code lists are formalized in SKOS (Simple Knowledge Organization System), an RDF-based language. SKOS provides mechanisms to define concepts, associate them with (multi-lingual) labels or terms, and record thesaurus-like relationships between them. The vocabularies are managed in an RDF database or semantic triple store. Querying is implemented as a semantic vocabulary service, with an HTTP-based API that allows queries to be issued from rules written in Schematron. WDTF has required development and deployment of some ontologies whose scope is much more general than this application, in particular covering 'observed properties' and 'units of measure', which also have to be related to each other and consistent with dimensional analysis. Separation of the two validation passes reflects the separate governance and stability of the structural and content rules, and allows an organisation's business rules to be moved out of the XML schema definition and the XML schema to be reused by other businesses with their own specific rules. With the general approach proven, harmonization opportunities with more generic services are being explored, such as the GEMET API for SKOS, developed by the European Environment Agency.
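A minimal two-pass check in the spirit described above can be sketched with the standard library. The XML snippet and the code list are invented stand-ins; the real system uses XML Schema tools, Schematron rules, and a SKOS triple store behind a vocabulary service:

```python
import xml.etree.ElementTree as ET

# Hypothetical code list standing in for a SKOS vocabulary service response.
OBSERVED_PROPERTIES = {"WaterCourseDischarge", "WaterCourseLevel", "Rainfall"}

DOC = """<observation>
  <observedProperty>WaterCourseDepth</observedProperty>
</observation>"""

def validate(xml_text):
    """Pass 1: syntax/structure. Pass 2: vocabulary (code list) checking."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as e:
        return [f"not well-formed: {e}"]
    errors = []
    for el in root.iter("observedProperty"):
        if el.text not in OBSERVED_PROPERTIES:
            errors.append(f"unknown observed property: {el.text}")
    return errors

print(validate(DOC))
```

Keeping the two passes separate mirrors the governance argument in the text: the structural pass can be reused unchanged while the content rules and code lists evolve independently.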
Acknowledgements: The authors would like to thank the AUSCOPE team for their development and support of the vocabulary services.
Water system microbial check valve development
NASA Technical Reports Server (NTRS)
Colombo, G. V.; Greenley, D. R.; Putnam, D. F.
1978-01-01
A residual iodine microbial check valve (RIMCV) assembly was developed and tested. The assembly is designed to be used in the space shuttle potable water system. The RIMCV is based on an anion exchange resin that is supersaturated with an iodine solution. This system causes a residual to be present in the effluent water which provides continuing bactericidal action. A flight prototype design was finalized and five units were manufactured and delivered.
Design of Installing Check Dam Using RAMMS Model in Seorak National Park of South Korea
NASA Astrophysics Data System (ADS)
Jun, K.; Tak, W.; JUN, B. H.; Lee, H. J.; KIM, S. D.
2016-12-01
Kye-Won Jun, Won-Jun Tak and Soung-Doug Kim (Graduate School of Disaster Prevention, Kangwon National University), Byong-Hee Jun (School of Fire and Disaster Protection, Kangwon National University), Ho-Jin Lee (School of Civil Engineering, Chungbuk National University). As more than 64% of the land in South Korea is mountainous, many regions are exposed to the danger of landslides and debris flow. It is therefore important to understand the behavior of debris flow in mountainous terrain, and various methods and models are being presented and developed based on mathematical concepts. The purpose of this study is to investigate regions that experienced debris flow due to the typhoon Ewiniar and to perform numerical modeling for the design and layout of check dams to reduce the damage caused by debris flow. For the numerical modeling, on-site measurement of the research area was conducted, including topographic investigation, research on bridges in the downstream, and precision LiDAR 3D scanning to compose the basic data for the numerical model. The numerical simulation of this study was performed using the RAMMS (Rapid Mass Movements Simulation) model for the analysis of the debris flow. This model was applied to check dam configurations installed in the upstream, midstream, and downstream sections. Considering the reduction effect on debris flow, the expansion of debris flow, and the influence on the bridges downstream, the proper location of the check dam was designated.
The results of the present numerical model showed that when the check dam was installed in the downstream section, 50 m above the bridge, the reduction effect on the debris flow was higher than when the check dam was installed in other sections. Key words: Debris flow, LiDAR, Check dam, RAMMS. Acknowledgements: This research was supported by a grant [MPSS-NH-2014-74] through the Disaster and Safety Management Institute funded by the Ministry of Public Safety and Security of the Korean government.
A Hybrid Genetic Programming Algorithm for Automated Design of Dispatching Rules.
Nguyen, Su; Mei, Yi; Xue, Bing; Zhang, Mengjie
2018-06-04
Designing effective dispatching rules for production systems is a difficult and time-consuming task if it is done manually. In the last decade, the growth of computing power, advanced machine learning, and optimisation techniques has made the automated design of dispatching rules possible, and automatically discovered rules are competitive with or outperform existing rules developed by researchers. Genetic programming is one of the most popular approaches to discovering dispatching rules in the literature, especially for complex production systems. However, the large heuristic search space may restrict genetic programming from finding near-optimal dispatching rules. This paper develops a new hybrid genetic programming algorithm for dynamic job shop scheduling based on a new representation, a new local search heuristic, and efficient fitness evaluators. Experiments show that the new method is effective regarding the quality of evolved rules. Moreover, evolved rules are also significantly smaller and contain more relevant attributes.
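In this line of work a dispatching rule is an expression tree over job attributes that genetic programming evolves; dispatching means evaluating the tree for each queued job and picking the best score. The hand-written tree and attribute names below are illustrative, not an evolved rule from the paper:

```python
import operator

# priority = processing_time + 0.5 * due_date  (an illustrative rule tree)
RULE = ("add", ("attr", "processing_time"),
               ("mul", ("const", 0.5), ("attr", "due_date")))

OPS = {"add": operator.add, "mul": operator.mul}

def evaluate(node, job):
    """Recursively evaluate a rule tree against one job's attributes."""
    kind = node[0]
    if kind == "attr":
        return job[node[1]]
    if kind == "const":
        return node[1]
    return OPS[kind](evaluate(node[1], job), evaluate(node[2], job))

def dispatch(queue):
    """Pick the job with the lowest rule-priority value."""
    return min(queue, key=lambda job: evaluate(RULE, job))

queue = [{"name": "A", "processing_time": 4, "due_date": 10},
         {"name": "B", "processing_time": 2, "due_date": 16}]
print(dispatch(queue)["name"])
```

GP searches over trees like `RULE` (crossover and mutation on subtrees); the paper's contribution is taming that search space with a new representation and local search.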
Proceedings of the First NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)
2009-01-01
Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.
Meyer, Claas; Reutter, Michaela; Matzdorf, Bettina; Sattler, Claudia; Schomers, Sarah
2015-07-01
In recent years, increasing attention has been paid to financial environmental policy instruments that have played important roles in solving agri-environmental problems throughout the world, particularly in the European Union and the United States. The ample and increasing literature on Payments for Ecosystem Services (PES) and agri-environmental measures (AEMs), generally understood as governmental PES, shows that certain single design rules may have an impact on the success of a particular measure. Based on this research, we focused on the interplay of several design rules and conducted a comparative analysis of AEMs' institutional arrangements by examining 49 German cases. We analyzed the effects of the design rules and certain rule combinations on the success of AEMs. Compliance and noncompliance with the hypothesized design rules and the success of the AEMs were surveyed by questioning the responsible agricultural administration and the AEMs' mid-term evaluators. The different rules were evaluated in regard to their necessity and sufficiency for success using Qualitative Comparative Analysis (QCA). Our results show that combinations of certain design rules such as environmental goal targeting and area targeting conditioned the success of the AEMs. Hence, we generalize design principles for AEMs and discuss implications for the general advancement of ecosystem services and the PES approach in agri-environmental policies. Moreover, we highlight the relevance of the results for governmental PES program research and design worldwide. Copyright © 2015 Elsevier Ltd. All rights reserved.
Portable design rules for bulk CMOS
NASA Technical Reports Server (NTRS)
Griswold, T. W.
1982-01-01
It is pointed out that for the past several years, one school of IC designers has used a simplified set of nMOS geometric design rules (GDR) which is 'portable', in that it can be used by many different nMOS manufacturers. The present investigation is concerned with a preliminary set of design rules for bulk CMOS which has been verified for simple test structures. The GDR are defined in terms of Caltech Intermediate Form (CIF), which is a geometry-description language that defines simple geometrical objects in layers. The layers are abstractions of physical mask layers. The design rules do not presume the existence of any particular design methodology. Attention is given to p-well and n-well CMOS processes, bulk CMOS and CMOS-SOS, CMOS geometric rules, and a description of the advantages of CMOS technology.
Simple Check Valves for Microfluidic Devices
NASA Technical Reports Server (NTRS)
Willis, Peter A.; Greer, Harold F.; Smith, J. Anthony
2010-01-01
A simple design concept for check valves has been adopted for microfluidic devices that consist mostly of (1) deformable fluorocarbon polymer membranes sandwiched between (2) borosilicate float glass wafers into which channels, valve seats, and holes have been etched. The first microfluidic devices in which these check valves are intended to be used are micro-capillary electrophoresis (microCE) devices undergoing development for use on Mars in detecting compounds indicative of life. In this application, it will be necessary to store some liquid samples in reservoirs in the devices for subsequent laboratory analysis, and check valves are needed to prevent cross-contamination of the samples. The simple check-valve design concept is also applicable to other microfluidic devices and to fluidic devices in general. These check valves are simplified microscopic versions of conventional rubber-flap check valves that are parts of numerous industrial and consumer products. These check valves are fabricated, not as separate components, but as integral parts of microfluidic devices. A check valve according to this concept consists of suitably shaped portions of a deformable membrane and the two glass wafers between which the membrane is sandwiched (see figure). The valve flap is formed by making an approximately semicircular cut in the membrane. The flap is centered over a hole in the lower glass wafer, through which hole the liquid in question is intended to flow upward into a wider hole, channel, or reservoir in the upper glass wafer. The radius of the cut exceeds the radius of the hole by an amount large enough to prevent settling of the flap into the hole. As in a conventional rubber-flap check valve, back pressure in the liquid pushes the flap against the valve seat (in this case, the valve seat is the adjacent surface of the lower glass wafer), thereby forming a seal that prevents backflow.
NASA Astrophysics Data System (ADS)
Gabor, Allen H.; Brendler, Andrew C.; Brunner, Timothy A.; Chen, Xuemei; Culp, James A.; Levinson, Harry J.
2018-03-01
The relationship between edge placement error, semiconductor design-rule determination and predicted yield in the era of EUV lithography is examined. This paper starts with the basics of edge placement error and then builds up to design-rule calculations. We show that edge placement error (EPE) definitions can be used as the building blocks for design-rule equations, but that in the last several years the term "EPE" has been used in the literature to refer to many patterning errors that are not EPE. We then explore the concept of "Good Fields"1 and use it to predict the n-sigma value needed for design-rule determination. Specifically, fundamental yield calculations based on the failure opportunities per chip are used to determine at what n-sigma "value" design rules need to be tested to ensure high yield. The "value" can be a space between two features, an intersect area between two features, a minimum area of a feature, etc. It is shown that across-chip variation of design-rule-critical values needs to be tested at sigma values between seven and eight, much higher than the four-sigma values traditionally used for design-rule determination. After recommending new statistics for design-rule calculations, the paper examines the impact of EUV lithography on the sources of variation important for design-rule calculations. We show that stochastics can be treated as an effective dose variation that is fully sampled across every chip. Combining the increased within-chip variation from EUV with the requirement that across-chip variation of design-rule-critical values not cause yield loss at these significantly higher sigma values, we conclude that across-wafer, wafer-to-wafer and lot-to-lot variation will have to overscale for any technology introducing EUV lithography where stochastic noise is a significant fraction of the effective dose variation.
We will emphasize stochastic effects on edge placement error distributions and appropriate design-rule setting. While CD distributions with long tails arising from stochastic effects bring increased risk of failure (especially on chips that may have over a billion failure opportunities per layer), there are other sources of variation that have sharp cutoffs, i.e. no tails. We will review these sources and show how distributions with different skew and kurtosis values combine.
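The failure-opportunity yield arithmetic described above can be sketched numerically. This is our own illustration, not the authors' code; it assumes each opportunity fails independently whenever a normally distributed quantity exceeds its one-sided n-sigma limit:

```python
import math

def chip_yield(n_sigma: float, opportunities: float) -> float:
    """Yield of a chip whose many independent failure opportunities each
    fail when a normally distributed value exceeds its one-sided n-sigma
    limit."""
    p_fail = 0.5 * math.erfc(n_sigma / math.sqrt(2.0))  # one-sided normal tail
    # Probability that every opportunity survives; log1p keeps this
    # numerically stable for tiny p_fail.
    return math.exp(opportunities * math.log1p(-p_fail))

N = 1e9  # failure opportunities per layer, as in the abstract
yield_4_sigma = chip_yield(4.0, N)  # essentially zero
yield_7_sigma = chip_yield(7.0, N)  # above 99%
```

With a billion opportunities per layer, limits tested at four sigma give essentially zero yield, while seven-sigma limits keep yield above 99%, consistent with the seven-to-eight-sigma recommendation above.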
Chemical Safety Alert: Shaft Blow-Out Hazard of Check and Butterfly Valves
Certain types of check and butterfly valves can undergo shaft-disk separation and fail catastrophically, even when operated within their design limits of pressure and temperature, causing toxic/flammable gas releases, fires, and vapor cloud explosions.
Validity criteria for Fermi’s golden rule scattering rates applied to metallic nanowires
NASA Astrophysics Data System (ADS)
Moors, Kristof; Sorée, Bart; Magnus, Wim
2016-09-01
Fermi’s golden rule underpins the investigation of mobile carriers propagating through various solids, being a standard tool to calculate their scattering rates. As such, it provides a perturbative estimate under the implicit assumption that the effect of the interaction Hamiltonian which causes the scattering events is sufficiently small. To check the validity of this assumption, we present a general framework to derive simple validity criteria in order to assess whether the scattering rates can be trusted for the system under consideration, given its statistical properties such as average size, electron density, impurity density et cetera. We derive concrete validity criteria for metallic nanowires with conduction electrons populating a single parabolic band subjected to different elastic scattering mechanisms: impurities, grain boundaries and surface roughness.
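For reference, the standard form of Fermi's golden rule that the abstract builds on gives the transition rate from an initial state to final states under a perturbation $H'$:

```latex
\Gamma_{i \to f} \;=\; \frac{2\pi}{\hbar}\,\bigl|\langle f \,|\, H' \,|\, i \rangle\bigr|^{2}\,\rho(E_f)
```

Here $\rho(E_f)$ is the density of final states. The perturbative derivation is only trustworthy when the matrix element is sufficiently small, which is precisely the assumption whose validity criteria the paper derives; the concrete criteria themselves are not reproduced here.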
A Rule Based Approach to ISS Interior Volume Control and Layout
NASA Technical Reports Server (NTRS)
Peacock, Brian; Maida, Jim; Fitts, David; Dory, Jonathan
2001-01-01
Traditional human factors design involves the development of human factors requirements based on a desire to accommodate a certain percentage of the intended user population. As the product is developed, human factors evaluation involves comparison between the resulting design and the specifications. Sometimes performance metrics are involved that allow leniency in the design requirements, provided the human performance result is satisfactory. Clearly such approaches may work, but they give rise to uncertainty and negotiation. An alternative approach is to adopt human factors design rules that articulate a range of each design continuum over which there are varying outcome expectations and interactions with other variables, including time. These rules are based on a consensus of human factors specialists, designers, managers and customers. The International Space Station faces exactly this challenge in interior volume control, which is based on anthropometric, performance and subjective preference criteria. This paper describes the traditional approach and then proposes a rule-based alternative. The proposed rules involve spatial, temporal and importance dimensions. If successful, this rule-based concept could be applied to many traditional human factors design variables and could lead to a more effective and efficient contribution of human factors input to the design process.
Regionalization of the C-17A Home Station Check to Minimize Costs
2014-06-13
flexibility, and impact to combat operations. The goal of this research is to provide an analysis to determine if there are benefits to C-17 HSC...of the flight and 310 knots or 0.74 Mach for the cruise portion, and used standard Instrument Flight Rules ( IFR ) routes of flight to each base...maintenance flexibility, and possible impact to combat operations. Thus, despite the savings potential, there are a few limitations worth mentioning in
ERIC Educational Resources Information Center
Dymock, Susan; Nicholson, Tom
2017-01-01
The ubiquitous weekly spelling test assumes that words are best learned by memorisation and testing, but is this the best way? This study compared two well-known approaches to spelling instruction: the rule-based and visual memory approaches. A group of 55 seven-year-olds in two Year 3 classrooms was taught spelling in small groups for three…
Developing an approach for teaching and learning about Lewis structures
NASA Astrophysics Data System (ADS)
Kaufmann, Ilana; Hamza, Karim M.; Rundgren, Carl-Johan; Eriksson, Lars
2017-08-01
This study explores first-year university students' reasoning as they learn to draw Lewis structures. We also present a theoretical account of the formal procedure commonly taught for drawing these structures. Students' discussions during problem-solving activities were video recorded and detailed analyses of the discussions were made through the use of practical epistemology analysis (PEA). Our results show that the formal procedure was central for drawing Lewis structures, but its use varied depending on situational aspects. Commonly, the use of individual steps of the formal procedure was contingent on experiences of chemical structures, and other information such as the characteristics of the problem given. The analysis revealed a number of patterns in how students constructed, checked and modified the structure in relation to the formal procedure and the situational aspects. We suggest that explicitly teaching the formal procedure as a process of constructing, checking and modifying might be helpful for students learning to draw Lewis structures. By doing so, the students may learn to check the accuracy of the generated structure not only in relation to the octet rule and formal charge, but also to other experiences that are not explicitly included in the formal procedure.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-16
...-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Designation of Longer Period for Commission Action on Proceedings to Determine Whether to Approve or Disapprove Proposed Rule Change To Establish... proposed rule change to establish various ``Benchmark Orders'' under NASDAQ Rule 4751(f). The proposed rule...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-29
...-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Designation of Longer Period for Commission Action on Proceedings To Determine Whether To Approve or Disapprove Proposed Rule Change To Amend Rule...,\\2\\ a proposed rule change to amend Exchange Rule 4626--Limitation of Liability (``accommodation...
Prototype data terminal: Multiplexer/demultiplexer
NASA Technical Reports Server (NTRS)
Leck, D. E.; Goodwin, J. E.
1972-01-01
The design and operation of a quad-redundant data terminal and a multiplexer/demultiplexer (MDU) are described. The most distinctive feature is the design of the quad-redundant data terminal: it is one of the few designs in which the unit is fail/op, fail/op, fail/safe. Laboratory tests confirm that the unit will operate satisfactorily with the failure of three of its four channels. Although the design utilizes state-of-the-art technology, the waveform error checks, the voting techniques, and the parity-bit checks are believed to be used in unique configurations. The correct-word selection routines are also novel, if not unique. The MDU design, while not redundant, utilizes the latest state-of-the-art light couplers and integrated-circuit amplifiers.
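The voting and parity techniques mentioned above can be illustrated with a minimal sketch (our own simplification, not the terminal's actual logic): words failing an even-parity check are discarded, and the most common value among the surviving channels is selected, so a correct word can still be produced with up to three of four channels failed.

```python
from collections import Counter

def parity_ok(word: int, parity_bit: int) -> bool:
    """Even parity: data bits plus the parity bit must have even weight."""
    return (bin(word).count("1") + parity_bit) % 2 == 0

def select_word(channels):
    """Pick the output word from redundant channels.

    channels: list of (word, parity_bit) pairs, one per channel.
    Words failing the parity check are discarded; the most common
    surviving word wins.  Returns None (fail-safe) if nothing survives.
    """
    survivors = [w for w, p in channels if parity_ok(w, p)]
    if not survivors:
        return None
    return Counter(survivors).most_common(1)[0][0]

# Three channels corrupted (bad parity), one good channel remains:
select_word([(0b1011, 1), (0b0111, 0), (0b0111, 0), (0b0111, 0)])  # 0b1011
```

A real terminal would combine this with waveform error checks before voting; the point of the sketch is only that parity screening plus majority voting yields graceful degradation.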
Pill testing or drug checking in Australia: Acceptability of service design features.
Barratt, Monica J; Bruno, Raimondo; Ezard, Nadine; Ritter, Alison
2018-02-01
This study aimed to determine design features of a drug-checking service that would be feasible, attractive and likely to be used by Australian festival and nightlife attendees. Web survey of 851 Australians reporting use of psychostimulants and/or hallucinogens and attendance at licensed venues past midnight and/or festivals in the past year (70% male; median age 23 years). A drug-checking service located at festivals or clubs would be used by 94%; a fixed-site service external to such events by 85%. Most (80%) were willing to wait an hour for their result. Almost all (94%) would not use a service if there was a possibility of arrest, and a majority (64%) would not use a service that did not provide individual feedback of results. Drug-checking results were only slightly more attractive if they provided comprehensive quantitative results compared with qualitative results of key ingredients. Most (93%) were willing to pay up to $5, and 68% up to $10, per test. One-third (33%) reported willingness to donate a whole dose for testing: they were more likely to be male, younger, less experienced, use drugs more frequently and attend venues/festivals less frequently. In this sample, festival- or club-based drug-checking services with low wait times and low cost appear broadly attractive under conditions of legal amnesty and individualised feedback. Quantitative analysis of ecstasy pills requiring surrender of a whole pill may appeal to a minority in Australia where pills are more expensive than elsewhere. [Barratt MJ, Bruno R, Ezard N, Ritter A. Pill testing or drug checking in Australia: Acceptability of service design features. Drug Alcohol Rev 2017;00:000-000]. © 2017 Australasian Professional Society on Alcohol and other Drugs.
Ban, Jong-Wook; Emparanza, José Ignacio; Urreta, Iratxe; Burls, Amanda
2016-01-01
Many new clinical prediction rules are derived and validated, but the design and reporting quality of clinical prediction research has been less than optimal. We aimed to assess whether design characteristics of validation studies were associated with overestimation of clinical prediction rules' performance, and to evaluate whether validation studies clearly reported important methodological characteristics. Electronic databases were searched for systematic reviews of clinical prediction rule studies published between 2006 and 2010. Data were extracted from the eligible validation studies included in the systematic reviews. A meta-analytic meta-epidemiological approach was used to assess the influence of design characteristics on predictive performance. For each validation study, we assessed whether 7 design and 7 reporting characteristics were properly described. A total of 287 validation studies of clinical prediction rules were collected from 15 systematic reviews (31 meta-analyses). Validation studies using a case-control design produced a summary diagnostic odds ratio (DOR) 2.2 times (95% CI: 1.2-4.3) larger than validation studies using cohort or unclear designs. When differential verification was used, the summary DOR was overestimated by twofold (95% CI: 1.2-3.1) compared to complete, partial and unclear verification. The summary relative DOR (RDOR) of validation studies with inadequate sample size was 1.9 (95% CI: 1.2-3.1) compared to studies with adequate sample size. Study site, reliability, and the clinical prediction rule were adequately described in only 10.1%, 9.4%, and 7.0% of validation studies, respectively. Validation studies with design shortcomings may overestimate the performance of clinical prediction rules, and the quality of reporting among studies validating clinical prediction rules needs to be improved.
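The diagnostic odds ratio at the heart of these comparisons is a standard 2x2-table statistic. As a quick illustration (ours, not the study's code):

```python
def diagnostic_odds_ratio(tp: int, fp: int, fn: int, tn: int) -> float:
    """DOR = (TP/FN) / (FP/TN) = (TP*TN) / (FP*FN): the odds of a positive
    rule result among those with the outcome, divided by the same odds
    among those without it."""
    return (tp * tn) / (fp * fn)

# A rule with 90% sensitivity and 90% specificity on 100 + 100 subjects:
dor = diagnostic_odds_ratio(tp=90, fp=10, fn=10, tn=90)  # 81.0
```

A case-control validation that overestimates the summary DOR 2.2-fold would, for example, inflate a true DOR of 81 to roughly 178, which is the kind of bias the meta-epidemiological analysis quantifies.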
Check valve installation in pilot operated relief valve prevents reverse pressurization
NASA Technical Reports Server (NTRS)
Oswalt, L.
1966-01-01
Two check valves prevent reverse flow through pilot-operated relief valves of differential-area piston design. The valves control pressure flow to ensure that the piston dome pressure is always at least as great as the main relief valve discharge pressure.
Mobile contract services: what you need to know.
Inman, M
2000-01-01
With sufficient planning and ongoing attention to detail, the performance of a mobile imaging service provider can exceed expectations and requirements. The relationship can prove to be mutually agreeable and profitable for many years. But, when contracting mobile services, you cannot spend too much time on initial research and detail. Several scenarios present outsourcing or mobile services as an acceptable alternative to purchase or lease: outdated equipment, novel or under-utilized technologies, the need for incrementally added or temporary service. To find suitable providers, check with peer sources in your area for recommendations; look specifically for facilities that are comparable in size and volume to your facility. Expect that larger volume facilities will rate more favorable schedules or pricing. Obtain and check references. Require mobile service providers to adhere to the same state and federal laws, rules and regulations that govern your facility; receive the assurance of compliance in writing if it is not specifically addressed in the contract. JCAHO requires that any contract service provider be governed by the same requirements as the accredited facility. Several other rules or licensing requirements may also pertain to mobile services. A prevailing reason for outsourcing imaging services is high equipment costs that cannot be justified with current volume projections. However, equipment quality should not be compromised; it must meet your needs and be in good repair. The mobile service provider you choose should be an extension of your department; quality standards must exist unilaterally. The set rule for assessing mobile service fees is that there is no set rule. There are many ways to negotiate the fee schedule so that it meets the needs of both parties. An effective marketing campaign lets physicians and patients know what you have available. Work with the mobile service provider to plan an initial announcement or open house. 
The mobile provider should also have patient education materials for referring physicians and your hospital. Having the mobile technologist meet with the radiologists to discuss written protocols will eliminate misunderstandings concerning expectations of both parties; ongoing communication is vital.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-24
... Organizations; C2 Options Exchange, Incorporated; Order Approving a Proposed Rule Change To Adopt a Designated... thereunder,\\2\\ a proposed rule change to adopt a Designated Primary Market-Maker (``DPM'') program. The... the Notice, C2 has proposed to adopt a DPM program. The associated proposed rules are based on the...
Program Model Checking as a New Trend
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Visser, Willem; Clancy, Daniel (Technical Monitor)
2002-01-01
This paper introduces a special section of STTT (International Journal on Software Tools for Technology Transfer) containing a selection of papers that were presented at the 7th International SPIN workshop, Stanford, August 30 - September 1, 2000. The workshop was named SPIN Model Checking and Software Verification, with an emphasis on model checking of programs. The paper outlines the motivation for stressing software verification, rather than only design and model verification, by presenting the work done in the Automated Software Engineering group at NASA Ames Research Center within the last 5 years. This includes work in software model checking, testing-like technologies and static analysis.
NASA Astrophysics Data System (ADS)
Pan, Xiaolong; Liu, Bo; Zheng, Jianglong; Tian, Qinghua
2016-08-01
We propose and demonstrate a low-complexity Reed-Solomon-based low-density parity-check (RS-LDPC) code with an adaptive puncturing decoding algorithm for elastic optical transmission systems. Part of the received code, along with the relevant columns of the parity-check matrix, can be punctured to reduce calculation complexity by adapting the parity-check matrix during the decoding process. The results show that the complexity of the proposed decoding algorithm is reduced by 30% compared with the regular RS-LDPC system. The optimized code rate of the RS-LDPC code can be obtained after five iterations.
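The column-puncturing idea can be sketched abstractly. The following is a generic illustration of dropping received symbols together with the corresponding parity-check-matrix columns, not the paper's adaptive RS-LDPC algorithm; the (7,4) Hamming matrix stands in for a much larger LDPC matrix:

```python
def puncture(H, received, erased):
    """Drop punctured symbol positions and the matching columns of H."""
    keep = [j for j in range(len(received)) if j not in erased]
    H_p = [[row[j] for j in keep] for row in H]
    r_p = [received[j] for j in keep]
    return H_p, r_p

def syndrome(H, word):
    """Binary syndrome H*w^T mod 2; all zeros means the checks pass."""
    return [sum(h * w for h, w in zip(row, word)) % 2 for row in H]

# Parity-check matrix of the (7,4) Hamming code:
H = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
codeword = [1, 1, 1, 0, 0, 0, 0]        # satisfies all three checks
H_p, r_p = puncture(H, codeword, {6})   # ignore the last symbol
```

Each punctured column removes work from every check equation it touches, which is the source of the complexity reduction; the paper's contribution is deciding adaptively during decoding which parts to puncture.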
Sediment depositions upstream of open check dams: new elements from small scale models
NASA Astrophysics Data System (ADS)
Piton, Guillaume; Le Guern, Jules; Carbonari, Costanza; Recking, Alain
2015-04-01
Torrent hazard mitigation remains a major issue in mountainous regions. In steep streams, and especially on their fans, torrential floods mainly result from abrupt and massive sediment deposits. To curtail this phenomenon, soil conservation measures as well as torrent control works have been undertaken for decades. Since the 1950s, open check dams have complemented other structural and non-structural measures in watershed-scale mitigation plans1. They are often built to trap sediments near the fan apexes. The development of earthmoving machinery after World War II facilitated the dredging of open check dams, and hundreds of these structures have been built over the past 60 years. Their design evolved with the improving understanding of torrential hydraulics and sediment transport; however, this kind of structure has a general tendency to trap most of the sediment supplied by the headwaters. Secondary effects such as channel incision downstream of the traps have often followed the creation of an open check dam. This sediment starvation tends to propagate to the main valley rivers and to disrupt past geomorphic equilibria. To take this into account and to reduce unnecessary dredging, better selectivity of sediment trapping must be sought in open check dams: optimal open check dams would trap sediments during dangerous floods and flush them during normal small floods. An accurate description of the hydraulic and deposition processes that occur in sediment traps is needed to optimize existing structures and to design better-adjusted new ones. A literature review2 showed that while design criteria exist for the structure itself, little information is available on the dynamics of the sediment deposits upstream of open check dams, i.e.
what are the geomorphic patterns that occur during deposition? What are the relevant friction laws and sediment transport formulae that best describe massive deposition in sediment traps? What ranges of Froude and Shields numbers do the flows tend to adopt? New small-scale model experiments have been undertaken focusing on deposition processes and their related hydraulics. Accurate photogrammetric measurements allowed us to better describe the deposition processes3. Large Scale Particle Image Velocimetry (LS-PIV) was performed to determine surface velocity fields in highly active channels with low grain submersion4. We will present preliminary results of our experiments showing the new elements we observed in massive deposit dynamics. REFERENCES 1. Armanini, A., Dellagiacoma, F. & Ferrari, L. From the check dam to the development of functional check dams. Fluvial Hydraulics of Mountain Regions 37, 331-344 (1991). 2. Piton, G. & Recking, A. Design of sediment traps with open check dams: a review, part I: hydraulic and deposition processes. (Accepted by the) Journal of Hydraulic Engineering, 1-23 (2015). 3. Le Guern, J. MS thesis: Modélisation physique des plages de dépôt : analyse de la dynamique de remplissage (2014). 4. Carbonari, C. MS thesis: Small-scale experiments of deposition processes occurring in sediment traps, LS-PIV measurements and geomorphological descriptions (in preparation).
Sampling theory and automated simulations for vertical sections, applied to human brain.
Cruz-Orive, L M; Gelšvartas, J; Roberts, N
2014-02-01
In recent years, there have been substantial developments in both magnetic resonance imaging techniques and automatic image analysis software. The purpose of this paper is to develop stereological image sampling theory (i.e. unbiased sampling rules) that can be used by image analysts for estimating geometric quantities such as surface area and volume, and to illustrate its implementation. The methods will ideally be applied automatically on segmented, properly sampled 2D images - although convenient manual application is always an option - and they are of wide applicability in many disciplines. In particular, the vertical sections design to estimate surface area is described in detail and applied to estimate the area of the pial surface and of the boundary between cortex and underlying white matter (i.e. subcortical surface area). For completeness, cortical volume and mean cortical thickness are also estimated. The aforementioned surfaces were triangulated in 3D with the aid of FreeSurfer software, which provided accurate surface area measures that served as gold standards. Furthermore, software was developed to produce digitized trace curves of the triangulated target surfaces automatically from virtual sections. From such traces, a new method (called the 'lambda method') is presented to estimate surface area automatically. In addition, with the new software, intersections could be counted automatically between the relevant surface traces and a cycloid test grid for the classical design. This capability, together with the aforementioned gold standard, enabled us to thoroughly check the performance and the variability of the different estimators by Monte Carlo simulations for studying the human brain. In particular, new methods are offered to split the total error variance into the orientations, sectioning and cycloid components.
The latter prediction was hitherto unavailable; one is proposed here and checked by way of simulations on a given set of digitized vertical sections with automatically superimposed cycloid grids of three different sizes. Concrete and detailed recommendations are given for implementing the methods. © 2013 The Authors. Journal of Microscopy © 2013 Royal Microscopical Society.
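The cycloid counts described above rest on the classical vertical-sections relation, stated here for orientation (a textbook stereology result, not reproduced from the paper):

```latex
\hat{S}_V \;=\; 2\,\frac{\sum_i I_i}{\sum_i L_i}
```

Here $I_i$ is the number of intersections between the surface trace and the cycloid arcs on section $i$, and $L_i$ is the total cycloid test-line length applied to that section. The estimator is unbiased provided the sections contain a common vertical axis and the cycloids are oriented with their minor axes parallel to it.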
Runway Rubber Removal Specification Development: Field Evaluation Procedures Development.
1984-07-01
removal was sufficient to restore full pavement skid resistance (based on tests with a DBV). With regard to high-pressure water rubber ... over a test surface, the rubber slider resists motion. The force, parallel to the test surface, which acts on the slider registers an output on a dial ... PROCEDURE 1. Check rubber shoe for wear. Replace when the edge is worn by more than 3/16 in as measured with a rule laid flat across the slider width. 2
Rules and Regulations for Small Passenger Vessels (Under 100 Gross Tons).
1977-07-01
the space having seats and the number permitted by the area criteria for the ... the conditions under which it is issued and whether or not the vessel is ... boilers and unfired pressure vessels shall be checked. § 176.25-32 Pressure vessels. (a) At each initial and subsequent in-... (3) Pressure vessels which ... classes of vessels which, in the course of their voyage, do not proceed ... construed as limiting the marine inspector from making such tests or inspections ...
HAL/S - The programming language for Shuttle
NASA Technical Reports Server (NTRS)
Martin, F. H.
1974-01-01
HAL/S is a higher order language and system, now operational, adopted by NASA for programming Space Shuttle on-board software. Program reliability is enhanced through language clarity and readability, modularity through program structure, and protection of code and data. Salient features of HAL/S include output orientation, automatic checking (with strictly enforced compiler rules), the availability of linear algebra, real-time control, a statement-level simulator, and compiler transferability (for applying HAL/S to additional object and host computers). The compiler is described briefly.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-07
... Stock Exchange LLC Amending NYSE Rule 1 To Provide for the Designation of Qualified Employees and NYSE... qualified employees to act in place of any person named in a rule as having authority to act under such rule... 1 to provide that the Exchange may formally designate one or more qualified employees to act in...
ERIC Educational Resources Information Center
Lai, Mark H. C.; Kwok, Oi-man
2015-01-01
Educational researchers commonly use the rule of thumb of "design effect smaller than 2" as the justification of not accounting for the multilevel or clustered structure in their data. The rule, however, has not yet been systematically studied in previous research. In the present study, we generated data from three different models…
Informatics and data quality at collaborative multicenter Breast and Colon Cancer Family Registries.
McGarvey, Peter B; Ladwa, Sweta; Oberti, Mauricio; Dragomir, Anca Dana; Hedlund, Erin K; Tanenbaum, David Michael; Suzek, Baris E; Madhavan, Subha
2012-06-01
Quality control and harmonization of data is a vital and challenging undertaking for any successful data coordination center and a responsibility shared between the multiple sites that produce, integrate, and utilize the data. Here we describe a coordinated effort between scientists and data managers in the Cancer Family Registries to implement a data governance infrastructure consisting of both organizational and technical solutions. The technical solution uses a rule-based validation system that facilitates error detection and correction for data centers submitting data to a central informatics database. Validation rules comprise both standard checks on allowable values and a crosscheck of related database elements for logical and scientific consistency. Evaluation over a 2-year timeframe showed a significant decrease in the number of errors in the database and a concurrent increase in data consistency and accuracy.
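A rule-based validation system of the kind described combines standard allowed-value checks with cross-field consistency rules. A minimal sketch follows; the field names and rules are hypothetical, not the registries' actual data dictionary:

```python
# Hypothetical allowed-value dictionary; a real registry defines its own.
ALLOWED = {"sex": {"M", "F", "U"}, "vital_status": {"alive", "deceased"}}

def validate(record):
    """Return a list of error messages for one submitted record:
    allowed-value checks plus a cross-field consistency rule."""
    errors = []
    for field, allowed in ALLOWED.items():
        if record.get(field) not in allowed:
            errors.append(f"{field}: value {record.get(field)!r} not allowed")
    # Cross-check: vital status and death year must be logically consistent.
    if record.get("vital_status") == "alive" and record.get("death_year"):
        errors.append("death_year set for a living subject")
    if record.get("vital_status") == "deceased" and not record.get("death_year"):
        errors.append("death_year missing for a deceased subject")
    return errors
```

Running every submission through such rules before loading, and reporting the error list back to the submitting center, is the feedback loop that drives the error reduction described above.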
EAGLE Monitors by Collecting Facts and Generating Obligations
NASA Technical Reports Server (NTRS)
Barringer, Howard; Goldberg, Allen; Havelund, Klaus; Sen, Koushik
2003-01-01
We present a rule-based framework, called EAGLE, that has been shown to be capable of defining and implementing a range of finite trace monitoring logics, including future and past time temporal logic, extended regular expressions, real-time and metric temporal logics, interval logics, forms of quantified temporal logics, and so on. A monitor for an EAGLE formula checks if a finite trace of states satisfies the given formula. We present, in details, an algorithm for the synthesis of monitors for EAGLE. The algorithm is implemented as a Java application and involves novel techniques for rule definition, manipulation and execution. Monitoring is achieved on a state-by-state basis avoiding any need to store the input trace of states. Our initial experiments have been successful as EAGLE detected a previously unknown bug while testing a planetary rover controller.
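The state-by-state monitoring style described above, which consumes events one at a time and keeps only outstanding obligations rather than the whole trace, can be illustrated with a toy monitor for the property "every opened resource is eventually closed" (our sketch, not EAGLE's actual rule language):

```python
def monitor(trace):
    """Check a finite trace event by event, storing only the set of
    outstanding obligations rather than the trace itself.

    Events are ('open', id) or ('close', id); at the end of the trace,
    every opened id must have been closed.
    """
    pending = set()
    for event, rid in trace:
        if event == "open":
            pending.add(rid)       # new obligation generated
        elif event == "close":
            pending.discard(rid)   # obligation discharged
    return not pending  # formula satisfied iff no obligation remains

monitor([("open", 1), ("close", 1)])               # True
monitor([("open", 1), ("open", 2), ("close", 1)])  # False: 2 never closed
```

EAGLE generalizes this pattern to arbitrary temporal formulas by rewriting the formula itself as the obligation carried from state to state.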
Pan, Tingrui; Baldi, Antonio; Ziaie, Babak
2007-06-01
In this paper, we present two remotely adjustable check-valves with an electrochemical release mechanism for implantable biomedical microsystems. These valves allow one to vary the opening pressure set-point and flow resistance over a period of time. The first design consists of a micromachined check-valve array using a SU-8 polymer structural layer deposited on the top of a gold sacrificial layer. The second design is based on a variable length cantilever beam structure with a gold sacrificial layer. The adjustable cantilever-beam structure is fabricated by gold thermo-compression bond of a thin silicon wafer over a glass substrate. In both designs, the evaporated gold can be electrochemically dissolved using a constant DC current via a telemetry link. In the first design the dissolution simply opens up individual outlets, while in the second design, gold anchors are sequentially dissolved hence increasing the effective length of the cantilever beam (reducing the opening pressure). A current density of 35 mA/cm(2) is used to dissolve the gold sacrificial layers. Both gravity and syringe-pump driven flow are used to characterize the valve performance. A multi-stage fluidic performance (e.g. flow resistance and opening pressure) is clearly demonstrated.
14 CFR 121.315 - Cockpit check procedure.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Cockpit check procedure. 121.315 Section 121.315 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED... emergencies. The procedures must be designed so that a flight crewmember will not need to rely upon his memory...
REACH. Teacher's Guide Volume II. Check Points.
ERIC Educational Resources Information Center
Georgia Univ., Athens. Div. of Vocational Education.
Designed for use with individualized instructional units (CE 026 345-347, CE 026 349-351) in the REACH (Refrigeration, Electro-Mechanical, Air-Conditioning, Heating) electromechanical cluster, this second volume of the postsecondary teacher guide contains the check points which the instructor may want to refer to when the unit sheet directs the…
Using computer models to design gully erosion control structures for humid northern Ethiopia
USDA-ARS?s Scientific Manuscript database
Classic gully erosion control measures such as check dams have been unsuccessful in halting gully formation and growth in the humid northern Ethiopian highlands. Gullies are typically formed in vertisols and flow often bypasses the check dams as elevated groundwater tables make gully banks unstable....
AUTOMOTIVE DIESEL MAINTENANCE 2. UNIT XXV, MICHIGAN/CLARK TRANSMISSION--TROUBLESHOOTING.
ERIC Educational Resources Information Center
Minnesota State Dept. of Education, St. Paul. Div. of Vocational and Technical Education.
This module of a 25-module course is designed to develop an understanding of troubleshooting procedures for a specific transmission used on diesel-powered equipment. Topics are (1) preliminary checks, (2) pressure and oil flow checks, (3) troubleshooting tables, (4) troubleshooting vehicles under field conditions, and (5) analyzing unacceptable…
Slit-check dams for the control of debris flow
NASA Astrophysics Data System (ADS)
Armanini, Aronne; Larcher, Michele
2017-04-01
Debris flows are paroxysmal events that mobilize, alongside water, huge quantities of sediment in a very short time, producing combined solid and liquid discharges that may exceed the capacity of existing torrent-control works. In this respect, climate change forcing cannot be ignored. In the majority of urbanized areas, which are generally the most vulnerable, there is often not enough space to create channelling works able to convey these volumes without overflowing. The simplest, least expensive and most sustainable solution consists in reducing the peak solid discharge by creating storage areas upstream of the settlements, typically upstream of the alluvial fans, allowing for reduced canalization works that are compatible with the constraints imposed by urbanization. The general idea is to store part of the flowing solids during the peak of the hydrograph and release it in a later phase or during minor floods. For this purpose, and in order to optimize the reduction of the solid peak discharge, the deposition basins must be controlled by properly designed open check dams, capable of inducing significant sedimentation only when the solid discharge exceeds a design threshold value. A correct design of the check dam is crucial to induce sedimentation in the right amount and at the right moment: sedimentation that occurs too early might fill the volume before the peak, as with closed check dams, while sedimentation that is too weak might not exploit the whole available volume. In both cases, the channelling works might not be sufficient to convey the whole flow, compromising the safety of the settlement. To avoid this, we propose the use of slit-check dams, whose efficiency has already been proved for bed load. Check dams are often designed only on the basis of the designer's experience.
Moreover, even today it is often believed that the filtering effect of open check dams is exerted through mechanical sieving, whereas it has been shown that the retention of solid material is instead due to a hydrodynamic effect induced by the narrowing of the section. In the case of debris flow as well, proper balances of liquid and solid mass and energy yield a rational criterion for designing the width of the slit so as to obtain a sediment deposit of the desired elevation for a given design discharge. In this way the use of the retention basin can be optimized to maximize the reduction of the debris-flow peak discharge. Flume experiments carried out in steady conditions at the University of Trento confirmed the predictions of the theory with good agreement. As in the case of ordinary sediment transport, clogging induced by vegetal material represents the major problem for the operational reliability of these systems and therefore needs to be investigated further.
Krolikowski, Maciej P; Black, Amanda M; Palacios-Derflingher, Luz; Blake, Tracy A; Schneider, Kathryn J; Emery, Carolyn A
2017-02-01
Ice hockey is a popular winter sport in Canada. Concussions account for the greatest proportion of all injuries in youth ice hockey. In 2011, a policy change enforcing "zero tolerance for head contact" was implemented in all leagues in Canada. To determine if the risk of game-related concussions and more severe concussions (ie, resulting in >10 days of time loss) and the mechanisms of a concussion differed for Pee Wee class (ages 11-12 years) and Bantam class (ages 13-14 years) players after the 2011 "zero tolerance for head contact" policy change compared with players in similar divisions before the policy change. Cohort study; Level of evidence, 3. The retrospective cohort included Pee Wee (most elite 70%, 2007-2008; n = 891) and Bantam (most elite 30%, 2008-2009; n = 378) players before the rule change and Pee Wee (2011-2012; n = 588) and Bantam (2011-2012; n = 242) players in the same levels of play after the policy change. Suspected concussions were identified by a team designate and referred to a sport medicine physician for diagnosis. Incidence rate ratios (IRRs) were estimated based on multiple Poisson regression analysis, controlling for clustering by team and other important covariates and offset by game-exposure hours. Incidence rates based on the mechanisms of a concussion were estimated based on univariate Poisson regression analysis. The risk of game-related concussions increased after the head contact rule in Pee Wee (IRR, 1.85; 95% CI, 1.20-2.86) and Bantam (IRR, 2.48; 95% CI, 1.17-5.24) players. The risk of more severe concussions increased after the head contact rule in Pee Wee (IRR, 4.12; 95% CI, 2.00-8.50) and Bantam (IRR, 7.91; 95% CI, 3.13-19.94) players. The rates of concussions due to body checking and direct head contact increased after the rule change. The "zero tolerance for head contact" policy change did not reduce the risk of game-related concussions in Pee Wee or Bantam class ice hockey players. 
Increased concussion awareness and education after the policy change may have contributed to the increased risk of concussions found after the policy change.
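The incidence rate ratios reported above come from multiple Poisson regression controlling for team clustering and offset by game-exposure hours. A crude, unadjusted IRR with a Wald confidence interval can be sketched as follows; the counts and exposure hours are hypothetical, not the study's data:

```python
import math

def irr_wald(cases_after, hours_after, cases_before, hours_before, z=1.96):
    """Unadjusted incidence rate ratio with a Wald CI on the log scale.
    (The study used multiple Poisson regression with clustering and
    covariates; this sketch omits all adjustment.)"""
    irr = (cases_after / hours_after) / (cases_before / hours_before)
    se = math.sqrt(1.0 / cases_after + 1.0 / cases_before)
    lo = math.exp(math.log(irr) - z * se)
    hi = math.exp(math.log(irr) + z * se)
    return irr, lo, hi

# Hypothetical concussion counts and game-exposure hours:
irr, lo, hi = irr_wald(cases_after=60, hours_after=20000,
                       cases_before=35, hours_before=20000)
```

An IRR whose CI excludes 1 (as here) would indicate a rate change beyond sampling noise, which is how the study's increased post-policy risk is read.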
PrimerStation: a highly specific multiplex genomic PCR primer design server for the human genome
Yamada, Tomoyuki; Soma, Haruhiko; Morishita, Shinichi
2006-01-01
PrimerStation is a web service that calculates primer sets guaranteeing high specificity against the entire human genome. To achieve high accuracy, we used the hybridization ratio of primers in liquid solution. Calculating the status of sequence hybridization in terms of the stringent hybridization ratio is computationally costly, and no other web service checks the entire human genome and returns a highly specific primer set calculated using a precise physicochemical model. To shorten the response time, we precomputed candidates for specific primers using a massively parallel computer with 100 CPUs (SunFire 15K) about 3 months in advance. This enables PrimerStation to search and output qualified primers interactively. PrimerStation can select highly specific primers suitable for multiplex PCR by seeking a wider temperature range that minimizes the possibility of cross-reaction. It also allows users to add heuristic rules to the primer design, e.g. the exclusion of single nucleotide polymorphisms (SNPs) in primers, the avoidance of poly(A) and CA-repeats in the PCR products, and the elimination of defective primers using secondary structure prediction. We performed several tests to verify the PCR amplification of randomly selected primers for ChrX, and we confirmed that the primers amplify specific PCR products perfectly. PMID:16845094
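Heuristic primer screening rules of the kind mentioned above (SNP exclusion, repeat avoidance) can be sketched as simple filters. The Wallace rule used here is a textbook melting-temperature estimate for short oligos, not PrimerStation's physicochemical hybridization model, and the thresholds are illustrative assumptions:

```python
def wallace_tm(primer):
    """Wallace rule: Tm ~= 2*(A+T) + 4*(G+C); valid only for short oligos."""
    p = primer.upper()
    return 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))

def passes_heuristics(primer, snp_positions=(), max_mononucleotide_run=4):
    """Reject primers that overlap known SNP positions (relative to the
    primer) or that contain long single-base runs such as poly(A)."""
    if any(0 <= pos < len(primer) for pos in snp_positions):
        return False  # a SNP under the primer can break hybridization
    for base in "ACGT":
        if base * (max_mononucleotide_run + 1) in primer.upper():
            return False  # mononucleotide run longer than the allowed limit
    return True

tm = wallace_tm("ATGCATGCATGCATGCATGC")  # 10 A/T + 10 G/C -> Tm estimate 60
```

A real pipeline would apply such cheap filters first and reserve the expensive genome-wide specificity computation for the survivors.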
John, Shalini; Thangapandian, Sundarapandian; Lee, Keun Woo
2012-01-01
Human pancreatic cholesterol esterase (hCEase) is one of the lipases found to be involved in the digestion of a large and broad spectrum of substrates, including triglycerides, phospholipids, cholesteryl esters, etc. The presence of bile salts is found to be very important for the activation of hCEase. Molecular dynamics simulations were performed for the apo form and the bile-salt-complexed form of hCEase using the coordinates of two bile salts from bovine CEase. The stability of the systems throughout the simulation time was checked, and two representative structures from the highly populated regions were selected using cluster analysis. These two representative structures were used in pharmacophore model generation. The generated pharmacophore models were validated and used in database screening. The screened hits were refined for their drug-like properties based on Lipinski's rule of five and ADMET properties. The drug-like compounds were further refined by molecular docking using the GOLD program, based on the GOLD fitness score, mode of binding, and molecular interactions with the active-site amino acids. Finally, three hits of novel scaffolds were selected as potential leads to be used in the design of novel and potent hCEase inhibitors. The stability of the binding modes and molecular interactions of these final hits was reconfirmed by molecular dynamics simulations.
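The Lipinski rule-of-five refinement step mentioned above is a fixed set of property thresholds. A minimal sketch follows; in practice the property values would come from a cheminformatics toolkit, and the example values here are hypothetical:

```python
def lipinski_pass(mol_weight, logp, h_donors, h_acceptors, max_violations=1):
    """Lipinski's rule of five: a compound is conventionally kept as
    drug-like if it violates at most one of the four rules."""
    violations = sum([
        mol_weight > 500,   # molecular weight <= 500 Da
        logp > 5,           # octanol-water logP <= 5
        h_donors > 5,       # hydrogen-bond donors <= 5
        h_acceptors > 10,   # hydrogen-bond acceptors <= 10
    ])
    return violations <= max_violations

# Hypothetical screened hit:
ok = lipinski_pass(mol_weight=349.4, logp=2.1, h_donors=2, h_acceptors=5)
```

Hits passing this gate would then proceed to ADMET filtering and docking, as in the workflow described above.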
First experiences with the LHC BLM sanity checks
NASA Astrophysics Data System (ADS)
Emery, J.; Dehning, B.; Effinger, E.; Nordt, A.; Sapinski, M. G.; Zamantzas, C.
2010-12-01
Reliability concerns have driven the design of the Large Hadron Collider (LHC) Beam Loss Monitoring (BLM) system from the early stage of the studies up to the present commissioning and the latest development of diagnostic tools. To protect the system against non-conformities, new ways of automatic checking have been developed and implemented. These checks are regularly and systematically executed by the LHC operation team to ensure that the system status is "as good as new" after each test. The sanity checks are part of this strategy. They test the electrical part of the detectors (ionisation chamber or secondary emission detector), their cable connections to the front-end electronics, the further connections to the back-end electronics and their ability to request a beam abort. During the installation and in the early commissioning phase, these checks have also shown their ability to find non-conformities caused by unexpected failure scenarios. In everyday operation, a non-conformity discovered by this check inhibits any further injections into the LHC until the check confirms the absence of non-conformities.
Microspheres as resistive elements in a check valve for low pressure and low flow rate conditions.
Ou, Kevin; Jackson, John; Burt, Helen; Chiao, Mu
2012-11-07
In this paper we describe a microsphere-based check valve integrated with a micropump. The check valve uses Ø20 μm polystyrene microspheres to rectify flow in low pressure and low flow rate applications (Re < 1). The microspheres form a porous medium in the check valve increasing fluidic resistance based on the direction of flow. Three check valve designs were fabricated and characterized to study the microspheres' effectiveness as resistive elements. A maximum diodicity (ratio of flow in the forward and reverse direction) of 18 was achieved. The pumping system can deliver a minimum flow volume of 0.25 μL and a maximum flow volume of 1.26 μL under an applied pressure of 0.2 kPa and 1 kPa, respectively. A proof-of-concept study was conducted using a pharmaceutical agent, docetaxel (DTX), as a sample drug showing the microsphere check valve's ability to limit diffusion from the micropump. The proposed check valve and pumping concept shows strong potential for implantable drug delivery applications with low flow rate requirements.
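The diodicity figure quoted above is the ratio of flow in the forward and reverse directions at equal driving pressure, equivalently the ratio of reverse to forward fluidic resistance. A minimal sketch, with hypothetical flow rates rather than the paper's measurements:

```python
def resistance(delta_p_pa, q_m3_per_s):
    """Fluidic resistance R = deltaP / Q (Pa.s/m^3)."""
    return delta_p_pa / q_m3_per_s

def diodicity(q_forward, q_reverse):
    """Di = Q_forward / Q_reverse at the same applied pressure,
    which equals R_reverse / R_forward."""
    return q_forward / q_reverse

# Hypothetical flow rates (uL/min) at equal driving pressure:
di = diodicity(q_forward=1.8, q_reverse=0.1)
```

A diodicity well above 1, as reported for the microsphere valve, means the porous bed resists reverse flow far more than forward flow.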
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-04
... Capital Commitment Schedule (``CCS'') interest; (3) NYSE Rule 70.25 to permit d-Quotes to be designated... that MPL Orders may interact with CCS interest; (3) NYSE Rule 70.25 to permit d- Quotes to be... the CCS pursuant to Rule 1000 would not be permitted to be designated as MPL Orders. The CCS is a...
50 CFR 424.16 - Proposed rules.
Code of Federal Regulations, 2011 CFR
2011-10-01
... OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A LISTING ENDANGERED AND THREATENED SPECIES AND DESIGNATING CRITICAL HABITAT Revision of the Lists § 424.16 Proposed rules. (a) General. Based... any proposed rule to list, delist, or reclassify a species, or to designate or revise critical habitat...
Space shuttle prototype check valve development
NASA Technical Reports Server (NTRS)
Tellier, G. F.
1976-01-01
Contaminant-resistant seal designs and a dynamically stable prototype check valve for the orbital maneuvering and reaction control helium pressurization systems of the space shuttle were developed. Polymer and carbide seal models were designed and tested. Perfluoroelastomers compatible with N2O4 and N2H4 types were evaluated and compared with Teflon in flat and captive seal models. Low load sealing and contamination resistance tests demonstrated cutter seal superiority over polymer seals. Ceramic and carbide materials were evaluated for N2O4 service using exposure to RFNA as a worst case screen; chemically vapor deposited tungsten carbide was shown to be impervious to the acid after 6 months immersion. A unique carbide shell poppet/cutter seat check valve was designed and tested to demonstrate low cracking pressure ( 2.0 psid), dynamic stability under all test bench flow conditions, contamination resistance (0.001 inch CRES wires cut with 1.5 pound seat load) and long life of 100,000 cycles (leakage 1.0 scc/hr helium from 0.1 to 400 psig).
UTP and Temporal Logic Model Checking
NASA Astrophysics Data System (ADS)
Anderson, Hugh; Ciobanu, Gabriel; Freitas, Leo
In this paper we give an additional perspective on the formal verification of programs through temporal logic model checking, which uses Hoare and He's Unifying Theories of Programming (UTP). Our perspective emphasizes the use of UTP designs, an alphabetised relational calculus expressed as a pre/post-condition pair of relations, to verify state or temporal assertions about programs. The temporal model-checking relation is derived from a satisfaction relation between the model and its properties. The contribution of this paper is that it shows a UTP perspective on temporal logic model checking. The approach includes the notion of efficiency found in traditional model checkers, which reduce the state explosion problem through the use of efficient data structures.
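The satisfaction relation at the heart of model checking can be illustrated with a minimal explicit-state example: checking the temporal property EF p ("a state satisfying p is reachable") over a small Kripke structure by a backward fixpoint. This is plain explicit-state checking, not the UTP formulation the paper develops:

```python
def ef(states, transitions, labels, prop):
    """Return the set of states satisfying EF prop: start from states
    labelled with prop and add predecessors until a fixpoint is reached."""
    sat = {s for s in states if prop in labels.get(s, set())}
    changed = True
    while changed:
        changed = False
        for s in states:
            if s not in sat and any(t in sat for t in transitions.get(s, ())):
                sat.add(s)
                changed = True
    return sat

# Toy Kripke structure: s0 -> s1 -> s2, with s2 looping on itself.
states = {"s0", "s1", "s2"}
transitions = {"s0": {"s1"}, "s1": {"s2"}, "s2": {"s2"}}
labels = {"s2": {"p"}}
reachable_p = ef(states, transitions, labels, "p")  # all three states
```

Efficient model checkers replace this naive set iteration with symbolic data structures (e.g. BDDs), which is the efficiency notion the paper refers to.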
Ban, Jong-Wook; Emparanza, José Ignacio; Urreta, Iratxe; Burls, Amanda
2016-01-01
Background Many new clinical prediction rules are derived and validated, but the design and reporting quality of clinical prediction research has been less than optimal. We aimed to assess whether design characteristics of validation studies were associated with overestimation of clinical prediction rules’ performance. We also aimed to evaluate whether validation studies clearly reported important methodological characteristics. Methods Electronic databases were searched for systematic reviews of clinical prediction rule studies published between 2006 and 2010. Data were extracted from the eligible validation studies included in the systematic reviews. A meta-analytic meta-epidemiological approach was used to assess the influence of design characteristics on predictive performance. For each validation study, it was assessed whether 7 design and 7 reporting characteristics were properly described. Results A total of 287 validation studies of clinical prediction rules were collected from 15 systematic reviews (31 meta-analyses). Validation studies using a case-control design produced a summary diagnostic odds ratio (DOR) 2.2 times (95% CI: 1.2-4.3) larger than validation studies using cohort or unclear designs. When differential verification was used, the summary DOR was overestimated by twofold (95% CI: 1.2-3.1) compared to complete, partial and unclear verification. The summary relative diagnostic odds ratio (RDOR) of validation studies with inadequate sample size was 1.9 (95% CI: 1.2-3.1) compared to studies with adequate sample size. Study site, reliability, and the clinical prediction rule were adequately described in 10.1%, 9.4%, and 7.0% of validation studies, respectively. Conclusion Validation studies with design shortcomings may overestimate the performance of clinical prediction rules. The quality of reporting among studies validating clinical prediction rules needs to be improved. PMID:26730980
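The DOR and relative DOR used above come from the standard 2x2 table of a prediction rule against the outcome. A sketch with hypothetical tables (not the review's data) shows how a design difference surfaces as an RDOR:

```python
def diagnostic_odds_ratio(tp, fp, fn, tn):
    """DOR = (TP/FN) / (FP/TN). If any cell is zero, add 0.5 to every
    cell (Haldane-Anscombe correction) to keep the ratio finite."""
    if 0 in (tp, fp, fn, tn):
        tp, fp, fn, tn = (x + 0.5 for x in (tp, fp, fn, tn))
    return (tp / fn) / (fp / tn)

# Hypothetical validation-study tables (rule-positive vs outcome):
dor_cohort = diagnostic_odds_ratio(tp=80, fp=40, fn=20, tn=160)
dor_case_control = diagnostic_odds_ratio(tp=90, fp=30, fn=10, tn=170)

# Relative DOR between the two designs; values above 1 indicate the
# second design yields a more optimistic apparent performance.
rdor = dor_case_control / dor_cohort
```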
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pin, F.G.
Sensor-based operation of autonomous robots in unstructured and/or outdoor environments has proved to be an extremely challenging problem, mainly because of the difficulties encountered when attempting to represent the many uncertainties which are always present in the real world. These uncertainties are primarily due to sensor imprecision and the unpredictability of the environment, i.e., lack of full knowledge of the environment's characteristics and dynamics. An approach, which we have named the "Fuzzy Behaviorist Approach" (FBA), is proposed in an attempt to remedy some of these difficulties. This approach is based on the representation of the system's uncertainties using Fuzzy Set Theory-based approximations and on the representation of the reasoning and control schemes as sets of elemental behaviors. Using the FBA, a formalism for rule base development and an automated generator of fuzzy rules have been developed. This automated system can automatically construct the set of membership functions corresponding to fuzzy behaviors once these have been expressed in qualitative terms by the user. The system also checks for completeness of the rule base and for non-redundancy of the rules (which has traditionally been a major hurdle in rule base development). Two major conceptual features, the suppression and inhibition mechanisms which allow a dominance between behaviors to be expressed, are discussed in detail. Some experimental results obtained with the automated fuzzy rule generator applied to the domain of sensor-based navigation in a priori unknown environments, using one of our autonomous test-bed robots as well as a real car in outdoor environments, are then reviewed and discussed to illustrate the feasibility of large-scale automatic fuzzy rule generation using the "Fuzzy Behaviorist" concepts.
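The completeness and non-redundancy checks described above can be sketched at the symbolic level: completeness means every combination of input fuzzy labels is covered by some rule. The label sets and rules below are hypothetical, and storing rules in a dict makes duplicate antecedents impossible by construction (a duplicate would overwrite), so the sketch focuses on the completeness check:

```python
from itertools import product

def check_rule_base(rules, input_labels):
    """rules: dict mapping antecedent tuples to an output action.
    input_labels: one sequence of fuzzy labels per input variable.
    Returns (missing antecedents, antecedents over unknown labels)."""
    required = set(product(*input_labels))
    covered = set(rules)
    missing = required - covered      # completeness violations
    extraneous = covered - required   # rules over undeclared labels
    return missing, extraneous

# Hypothetical navigation rule base: obstacle distance x bearing -> action.
labels = [("near", "far"), ("left", "center", "right")]
rules = {
    ("near", "left"): "turn_right", ("near", "center"): "reverse",
    ("near", "right"): "turn_left", ("far", "left"): "forward",
    ("far", "center"): "forward",   ("far", "right"): "forward",
}
missing, extraneous = check_rule_base(rules, labels)  # both empty -> OK
```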
Sweeney, N; Owen, H; Fronsko, R; Hurlow, E
2012-11-01
Anaesthetists may subject patients to unnecessary risk by not checking anaesthetic equipment thoroughly before use. Numerous adverse events have been associated with failure to check equipment. The Australian and New Zealand College of Anaesthetists and anaesthetic delivery system manufacturers have made recommendations on how anaesthetic equipment should be maintained and checked before use, and on the training required for staff who use such equipment. These recommendations are made to minimise the risk to patients undergoing anaesthesia. This prospective audit investigated the adherence of anaesthetic practitioners to a selection of those recommendations. Covert observations of anaesthetic practitioners were made while they were checking their designated anaesthetic machine, either at the beginning of a day's list or between cases. Structured interviews with staff who check the anaesthetic machine were carried out to determine the training they had received. The results indicated poor compliance with the recommendations: significantly, the backup oxygen cylinders' pressure/contents were not checked in 45% of observations; the emergency ventilation device was not checked in 67% of observations; the breathing circuit was not tested between patients in 79% of observations; the checks performed were not documented in any case; and no assessment or accreditation of the staff who performed these checks was undertaken. It was concluded that the poor compliance was a system failing and that patient safety might be increased by training and accrediting staff responsible for checking equipment, documenting the checks performed, and the formulation and use of a checklist.
Methods for Geometric Data Validation of 3d City Models
NASA Astrophysics Data System (ADS)
Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.
2015-12-01
Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is, however, a different concept than it is for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data has a far wider range of aspects which influence its quality, and the idea of quality itself is application-dependent. Thus, concepts for the definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point would be to have correct geometry in accordance with ISO 19107. A valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges. No gaps between them are allowed, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition mentioned above (e.g. water tightness of the solid or planarity of its polygons), as they were developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon-level checks to validate the correctness of each polygon, i.e. closure of the bounding linear ring and planarity.
On the solid level, which is only validated if the polygons have passed validation, correct polygon orientation is checked after self-intersections outside of defined corner points and edges are detected, among additional criteria. Self-intersection might lead to different results, e.g. intersection points, lines or areas. Depending on the geometric constellation, these might represent gaps between bounding polygons of the solids, overlaps, or violations of 2-manifoldness. Not least because of floating-point representation issues, tolerances must be considered in some algorithms, e.g. planarity and solid self-intersection. The effects of different tolerance values and their handling are discussed; recommendations for suitable values are given. The goal of the paper is to give a clear understanding of geometric validation in the context of 3D city models. This should also enable the data holder to gain a better understanding of the validation results and their consequences for the deployment fields of the validated data set.
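Two of the polygon-level checks described above, ring closure and planarity under a tolerance, can be sketched directly. This is a generic implementation in the spirit of the paper, not CityDoctor's code, and the tolerance value is only a placeholder of the kind the paper discusses:

```python
def _sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def ring_closed(ring):
    """The bounding linear ring must end where it starts."""
    return ring[0] == ring[-1]

def is_planar(ring, tol=1e-2):
    """All vertices must lie within tol of the plane spanned by the
    first three vertices (distance measured along the plane normal)."""
    n = _cross(_sub(ring[1], ring[0]), _sub(ring[2], ring[0]))
    norm = _dot(n, n) ** 0.5
    if norm == 0:
        return False  # degenerate: first three vertices are collinear
    return all(abs(_dot(n, _sub(p, ring[0]))) / norm <= tol for p in ring)

square = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0), (0, 0, 0)]
warped = [(0, 0, 0), (1, 0, 0), (1, 1, 0.5), (0, 1, 0), (0, 0, 0)]
```

The tolerance trades off rejecting slightly noisy but usable polygons against accepting genuinely warped ones, which is exactly the discussion the paper devotes to suitable values.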
Lam, Simon C; Lui, Andrew K F; Lee, Linda Y K; Lee, Joseph K L; Wong, K F; Lee, Cathy N Y
2016-05-01
The use of N95 respirators prevents the spread of respiratory infectious agents, but leakage hampers their protection. Manufacturers recommend a user seal check to identify on-site gross leakage. However, no empirical evidence is provided. Therefore, this study aims to examine the validity of the user seal check for gross leakage detection in commonly used types of N95 respirators. A convenience sample of 638 nursing students was recruited. While participants wore 3 different designs of N95 respirator, namely 3M-1860s, 3M-1862, and Kimberly-Clark 46827, the standardized user seal check procedure was carried out to identify gross leakage. Leakage was then retested with a quantitative fit testing (QNFT) device while participants performed normal breathing and deep breathing exercises. Sensitivity, specificity, predictive values, and likelihood ratios were calculated accordingly. As indicated by QNFT, the prevalence of actual gross leakage was 31.0%-39.2% with the 3M respirators and 65.4%-65.8% with the Kimberly-Clark respirator. Sensitivity and specificity of the user seal check for identifying actual gross leakage were approximately 27.7% and 75.5% for 3M-1860s, 22.1% and 80.5% for 3M-1862, and 26.9% and 80.2% for Kimberly-Clark 46827, respectively. Likelihood ratios were close to 1 (range, 0.89-1.51) for all types of respirators. The results did not support the use of the user seal check for detecting actual gross leakage in the donning of N95 respirators. However, such a check might alert health care workers that donning a tight-fitting respirator should be performed carefully. Copyright © 2016 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
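The validity metrics reported above all derive from a 2x2 table of the seal check against the QNFT reference. The counts below are hypothetical, chosen only to land near the reported ~27.7% sensitivity and ~75.5% specificity for 3M-1860s:

```python
def validity_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and likelihood ratios of an index test
    (here the user seal check) against a reference standard (QNFT)."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec)   # LR+ : how much a failed check raises odds
    lr_neg = (1 - sens) / spec   # LR- : how much a passed check lowers odds
    return sens, spec, lr_pos, lr_neg

# Hypothetical 2x2 table (seal-check result vs QNFT-confirmed leakage):
sens, spec, lr_pos, lr_neg = validity_metrics(tp=55, fp=107, fn=143, tn=333)
```

Likelihood ratios near 1, as both the paper and this sketch produce, mean the check barely shifts the probability of leakage either way, which is the basis of the paper's negative conclusion.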
Anderer, Peter; Gruber, Georg; Parapatics, Silvia; Woertz, Michael; Miazhynskaia, Tatiana; Klosch, Gerhard; Saletu, Bernd; Zeitlhofer, Josef; Barbanoj, Manuel J; Danker-Hopfe, Heidi; Himanen, Sari-Leena; Kemp, Bob; Penzel, Thomas; Grozinger, Michael; Kunz, Dieter; Rappelsberger, Peter; Schlogl, Alois; Dorffner, Georg
2005-01-01
To date, the only standard for the classification of sleep-EEG recordings that has found worldwide acceptance are the rules published in 1968 by Rechtschaffen and Kales. Even though several attempts have been made to automate the classification process, so far no method has been published that has proven its validity in a study including a sufficiently large number of controls and patients of all adult age ranges. The present paper describes the development and optimization of an automatic classification system that is based on one central EEG channel, two EOG channels and one chin EMG channel. It adheres to the decision rules for visual scoring as closely as possible and includes a structured quality control procedure by a human expert. The final system (Somnolyzer 24 x 7) consists of a raw data quality check, a feature extraction algorithm (density and intensity of sleep/wake-related patterns such as sleep spindles, delta waves, SEMs and REMs), a feature matrix plausibility check, a classifier designed as an expert system, a rule-based smoothing procedure for the start and the end of stages REM, and finally a statistical comparison to age- and sex-matched normal healthy controls (Siesta Spot Report). The expert system considers different prior probabilities of stage changes depending on the preceding sleep stage, the occurrence of a movement arousal and the position of the epoch within the NREM/REM sleep cycles. Moreover, results obtained with and without using the chin EMG signal are combined. The Siesta polysomnographic database (590 recordings in both normal healthy subjects aged 20-95 years and patients suffering from organic or nonorganic sleep disorders) was split into two halves, which were randomly assigned to a training and a validation set, respectively. 
The final validation revealed an overall epoch-by-epoch agreement of 80% (Cohen's kappa: 0.72) between the Somnolyzer 24 x 7 and the human expert scoring, as compared with an inter-rater reliability of 77% (Cohen's kappa: 0.68) between two human experts scoring the same dataset. Two Somnolyzer 24 x 7 analyses (including a structured quality control by two human experts) revealed an inter-rater reliability close to 1 (Cohen's kappa: 0.991), which confirmed that the variability induced by the quality control procedure, whereby approximately 1% of the epochs (in 9.5% of the recordings) are changed, can definitely be neglected. Thus, the validation study proved the high reliability and validity of the Somnolyzer 24 x 7 and demonstrated its applicability in clinical routine and sleep studies.
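The epoch-by-epoch agreement and Cohen's kappa reported in the validation are computed from paired stage labels. A minimal sketch on toy Rechtschaffen-Kales-style labels (not the Siesta data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for the agreement
    expected by chance from each rater's marginal label frequencies."""
    n = len(rater_a)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_exp = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / (n * n)
    return (p_obs - p_exp) / (1 - p_exp)

# Toy epoch scorings by two raters (R&K stages):
a = ["W", "S2", "S2", "REM", "S3", "S2", "REM", "W"]
b = ["W", "S2", "S3", "REM", "S3", "S2", "REM", "S1"]
kappa = cohens_kappa(a, b)
```

Raw agreement (here 6/8 = 75%) always exceeds kappa, because kappa discounts the agreement two raters would reach by chance alone.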
NASA Technical Reports Server (NTRS)
Donner, Kimberly A.; Holden, Kritina L.; Manahan, Meera K.
1991-01-01
Investigated are five designs of software-based ON/OFF indicators in a hypothetical Space Station Power System monitoring task. The hardware equivalent of the indicators used in the present study is the traditional indicator light that illuminates an ON label or an OFF label. Coding methods used to represent the active state were reverse video, color, frame, check, or reverse video with check. Display background color was also varied. Subjects made judgments concerning the state of indicators that resulted in very low error rates and high percentages of agreement across indicator designs. Response time measures for each of the five indicator designs did not differ significantly, although subjects reported that color was the best communicator. The impact of these results on indicator design is discussed.
Zhang, Jie; Wang, Yuping; Feng, Junhong
2013-01-01
In association rule mining, evaluating an association rule requires repeatedly scanning the database to compare the whole database with the antecedent, the consequent and the whole rule. In order to decrease the number of comparisons and the time consumed, we present an attribute index strategy. It needs to scan the database only once, to create the attribute index of each attribute. All metric values needed to evaluate an association rule then no longer require scanning the database, but are obtained solely by means of the attribute indices. The paper formulates association rule mining as a multiobjective problem rather than a single-objective one. In order to make the acquired solutions scatter uniformly toward the Pareto frontier in the objective space, an elitism policy and uniform design are introduced. The paper presents the algorithm of attribute index and uniform design based multiobjective association rule mining with evolutionary algorithm, abbreviated as IUARMMEA. It does not require user-specified minimum support and minimum confidence anymore, but uses a simple attribute index. It uses a well-designed real encoding so as to extend its application scope. Experiments performed on several databases demonstrate that the proposed algorithm has excellent performance, and it can significantly reduce the number of comparisons and time consumption.
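The attribute-index idea can be sketched concretely: one scan builds, per attribute value, the set of transaction ids containing it, after which the support and confidence of any rule are set intersections rather than database rescans. This is a generic sketch of the strategy, not the IUARMMEA implementation, and the toy database is hypothetical:

```python
def build_index(transactions):
    """Single scan: map each item to the set of transaction ids
    containing it (an inverted index over attributes)."""
    index = {}
    for tid, items in enumerate(transactions):
        for item in items:
            index.setdefault(item, set()).add(tid)
    return index

def rule_metrics(index, antecedent, consequent, n_transactions):
    """Support and confidence of antecedent -> consequent computed
    purely from the index, with no further database scan."""
    cover_a = set.intersection(*(index.get(i, set()) for i in antecedent))
    cover_rule = cover_a & set.intersection(
        *(index.get(i, set()) for i in consequent))
    support = len(cover_rule) / n_transactions
    confidence = len(cover_rule) / len(cover_a) if cover_a else 0.0
    return support, confidence

db = [{"bread", "milk"}, {"bread", "butter"}, {"milk", "butter"},
      {"bread", "milk", "butter"}]
idx = build_index(db)
sup, conf = rule_metrics(idx, {"bread"}, {"milk"}, len(db))
```

An evolutionary search over rules, as in the paper, can then evaluate each candidate's objectives from the index alone.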
30 CFR 77.804 - High-voltage trailing cables; minimum design requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... equipped with metallic shields around each power conductor with one or more ground conductors having a total cross-sectional area of not less than one-half the power conductor, and with an insulated conductor for the ground continuity check circuit. External ground check conductors may be used if they are...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-26
... Maintenance Manual (AMM) includes chapters 05-10 ``Time Limits'', 05-15 ``Critical Design Configuration... 05, ``Time Limits/Maintenance Checks,'' of BAe 146 Series/AVRO 146-RJ Series Aircraft Maintenance... Chapter 05, ``Time Limits/ Maintenance Checks,'' of the BAE SYSTEMS (Operations) Limited BAe 146 Series...
12 CFR Appendix E to Part 229 - Commentary
Code of Federal Regulations, 2013 CFR
2013-01-01
... by another entity. The Board believes that the statutory proximity test was designed to apply to.... The EFA Act defines a certified check as one to which a bank has certified that the drawer's signature... by regulations.” The Board has defined check processing region as the territory served by one of the...
12 CFR Appendix E to Part 229 - Commentary
Code of Federal Regulations, 2011 CFR
2011-01-01
... by another entity. The Board believes that the statutory proximity test was designed to apply to.... The EFA Act defines a certified check as one to which a bank has certified that the drawer's signature... by regulations.” The Board has defined check processing region as the territory served by one of the...
12 CFR Appendix E to Part 229 - Commentary
Code of Federal Regulations, 2014 CFR
2014-01-01
... by another entity. The Board believes that the statutory proximity test was designed to apply to.... The EFA Act defines a certified check as one to which a bank has certified that the drawer's signature... by regulations.” The Board has defined check processing region as the territory served by one of the...
12 CFR Appendix E to Part 229 - Commentary
Code of Federal Regulations, 2012 CFR
2012-01-01
... by another entity. The Board believes that the statutory proximity test was designed to apply to.... The EFA Act defines a certified check as one to which a bank has certified that the drawer's signature... by regulations.” The Board has defined check processing region as the territory served by one of the...
"Check Your Smile", Prototype of a Collaborative LSP Website for Technical Vocabulary
ERIC Educational Resources Information Center
Yassine-Diab, Nadia; Alazard-Guiu, Charlotte; Loiseau, Mathieu; Sorin, Laurent; Orliac, Charlotte
2016-01-01
In a design-based research approach (Barab & Squire, 2004), we are currently developing the first prototype of a collaborative Language for Specific Purposes (LSP) website. It focuses on technical vocabulary to help students master any field of LSP better. "Check Your Smile" is a platform aggregating various types of gameplays for…
ERIC Educational Resources Information Center
Davies, Daniel K.; Stock, Steven E.; Wehmeyer, Michael L.
2003-01-01
This report describes results of an initial investigation of the utility of a specially designed money management software program for improving management of personal checking accounts for individuals with mental retardation. Use with 19 adults with mental retardation indicated the software resulted in significant reduction in check writing and…
49 CFR 1562.23 - Aircraft operator and passenger requirements.
Code of Federal Regulations, 2013 CFR
2013-10-01
... designated by an aircraft operator under paragraph (a) of this section: (1) Must undergo a fingerprint-based... compliance with the fingerprint-based criminal history records check requirements of §§ 1542.209, 1544.229... a fingerprint-based criminal history records check that does not disclose that he or she has a...
49 CFR 1562.23 - Aircraft operator and passenger requirements.
Code of Federal Regulations, 2011 CFR
2011-10-01
... designated by an aircraft operator under paragraph (a) of this section: (1) Must undergo a fingerprint-based... compliance with the fingerprint-based criminal history records check requirements of §§ 1542.209, 1544.229... a fingerprint-based criminal history records check that does not disclose that he or she has a...
Ice hockey shoulder pad design and the effect on head response during shoulder-to-head impacts.
Richards, Darrin; Ivarsson, B Johan; Scher, Irving; Hoover, Ryan; Rodowicz, Kathleen; Cripton, Peter
2016-11-01
Ice hockey body checks involving direct shoulder-to-head contact frequently result in head injury. In the current study, we examined the effect of shoulder pad style on the likelihood of head injury from a shoulder-to-head check. Shoulder-to-head body checks were simulated by swinging a modified Hybrid-III anthropomorphic test device (ATD), with and without shoulder pads, into a stationary Hybrid-III ATD at 21 km/h. Tests were conducted with three different styles of shoulder pads (traditional, integrated and tethered) and, as a control, without shoulder pads. Head response kinematics for the stationary ATD were measured. Compared to the case of no shoulder pads, the three different pad styles significantly (p < 0.05) reduced peak resultant linear head accelerations of the stationary ATD by 35-56%. The integrated shoulder pads reduced linear head accelerations by an additional 18-21% beyond the other two styles of shoulder pads. The data presented here suggest that shoulder pads can be designed to help protect the head of the struck player in a shoulder-to-head check.
Design of experiments enhanced statistical process control for wind tunnel check standard testing
NASA Astrophysics Data System (ADS)
Phillips, Ben D.
The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that by utilizing design of experiments methodology in conjunction with the current SPC practices, one can efficiently and more robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research can be applicable to any wind tunnel check standard testing program.
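The coefficient-tracking step can be illustrated with a minimal Shewhart-style control chart. This is a sketch under our own assumptions; the actual check-standard program uses more elaborate charting and uncertainty machinery, and the numbers below are invented.

```python
# Track a regression coefficient in a Shewhart control chart: the center line is the
# historical mean and the control limits sit k standard deviations on either side.
import statistics

def control_limits(history, k=3.0):
    """Center line and +/- k-sigma limits from historical coefficient estimates."""
    mean = statistics.fmean(history)
    sigma = statistics.stdev(history)
    return mean - k * sigma, mean, mean + k * sigma

def in_control(value, history, k=3.0):
    lcl, _, ucl = control_limits(history, k)
    return lcl <= value <= ucl

# e.g. a calibration-coefficient estimate from each check-standard test entry
history = [1.002, 0.998, 1.001, 0.999, 1.000, 1.003, 0.997]
print(in_control(1.001, history))   # within limits
print(in_control(1.100, history))   # out-of-control signal
```

A point outside the limits flags a shift in the measurement process worth investigating, which is exactly the facility-health signal the check-standard program looks for.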
NASA Astrophysics Data System (ADS)
Sha, Wei E. I.; Zhu, Hugh L.; Chen, Luzhou; Chew, Weng Cho; Choy, Wallace C. H.
2015-02-01
It is well known that transport paths of photocarriers (electrons and holes) before collected by electrodes strongly affect bulk recombination and thus electrical properties of solar cells, including open-circuit voltage and fill factor. For boosting device performance, a general design rule, tailored to arbitrary electron to hole mobility ratio, is proposed to decide the transport paths of photocarriers. Due to a unique ability to localize and concentrate light, plasmonics is explored to manipulate photocarrier transport through spatially redistributing light absorption at the active layer of devices. Without changing the active materials, we conceive a plasmonic-electrical concept, which tunes electrical properties of solar cells via the plasmon-modified optical field distribution, to realize the design rule. Incorporating spectrally and spatially configurable metallic nanostructures, thin-film solar cells are theoretically modelled and experimentally fabricated to validate the design rule and verify the plasmonic-tunable electrical properties. The general design rule, together with the plasmonic-electrical effect, contributes to the evolution of emerging photovoltaics.
A business rules design framework for a pharmaceutical validation and alert system.
Boussadi, A; Bousquet, C; Sabatier, B; Caruba, T; Durieux, P; Degoulet, P
2011-01-01
Several alert systems have been developed to improve the patient safety aspects of clinical information systems (CIS). Most studies have focused on the evaluation of these systems, with little information provided about the methodology leading to system implementation. We propose here an 'agile' business rule design framework (BRDF) supporting both the design of alerts for the validation of drug prescriptions and the incorporation of the end user into the design process. We analyzed the unified process (UP) design life cycle and defined the activities, subactivities, actors and UML artifacts that could be used to enhance the agility of the proposed framework. We then applied the proposed framework to two different sets of data in the context of the Georges Pompidou University Hospital (HEGP) CIS. We introduced two new subactivities into UP: a business rule specification activity and a business rule instantiation activity. The pharmacist made an effective contribution to five of the eight BRDF design activities. Validation of the two new subactivities was carried out in the context of adapting drug dosage to the patient's clinical and biological context. A pilot experiment shows that business rules modeled with BRDF and implemented as an alert system triggered an alert for 5824 of the 71,413 prescriptions considered (8.16%). A business rule design framework approach meets one of the strategic objectives for decision support design by taking into account three important criteria posing a particular challenge to system designers: 1) business processes, 2) knowledge modeling of the context of application, and 3) the agility of the various design steps.
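As a loose illustration of what an instantiated validation rule might look like, the toy sketch below binds a generic dosage-check template to concrete thresholds and patient context. BRDF itself is a UML/UP design framework, not Python code, and the drug, thresholds and field names here are entirely invented.

```python
# A rule template ("instantiation" binds it to drug-specific thresholds); the
# instantiated rule checks a prescription against the patient's renal context.

def renal_dosage_rule(max_daily_dose_normal, max_daily_dose_impaired, gfr_cutoff):
    """Instantiate a rule: alert when the dose exceeds the limit for the patient's GFR."""
    def check(prescription, patient):
        limit = (max_daily_dose_normal if patient["gfr"] >= gfr_cutoff
                 else max_daily_dose_impaired)
        if prescription["daily_dose"] > limit:
            return f"ALERT: {prescription['drug']} exceeds {limit} mg/day"
        return None   # prescription validated, no alert
    return check

# hypothetical thresholds for a hypothetical drug
rule = renal_dosage_rule(max_daily_dose_normal=3000,
                         max_daily_dose_impaired=1500, gfr_cutoff=60)
print(rule({"drug": "drugX", "daily_dose": 2000}, {"gfr": 45}))
# ALERT: drugX exceeds 1500 mg/day
```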
40 CFR 62.1100 - Identification of plan.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) APPROVAL AND PROMULGATION OF STATE PLANS FOR DESIGNATED FACILITIES AND POLLUTANTS California Plan for the... of plan. (a) State of California Designated Facility Plan (Section 111(d) Plan). (b) The plan was... Pollution Control District Regulation 1; Rule 130—Definitions, Rule 240—Permit to Operate, Rule 450—Sulfide...
40 CFR 62.1100 - Identification of plan.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) APPROVAL AND PROMULGATION OF STATE PLANS FOR DESIGNATED FACILITIES AND POLLUTANTS California Plan for the... of plan. (a) State of California Designated Facility Plan (Section 111(d) Plan). (b) The plan was... Pollution Control District Regulation 1; Rule 130—Definitions, Rule 240—Permit to Operate, Rule 450—Sulfide...
40 CFR 62.1100 - Identification of plan.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) APPROVAL AND PROMULGATION OF STATE PLANS FOR DESIGNATED FACILITIES AND POLLUTANTS California Plan for the... of plan. (a) State of California Designated Facility Plan (Section 111(d) Plan). (b) The plan was... Pollution Control District Regulation 1; Rule 130—Definitions, Rule 240—Permit to Operate, Rule 450—Sulfide...
40 CFR 62.1100 - Identification of plan.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) APPROVAL AND PROMULGATION OF STATE PLANS FOR DESIGNATED FACILITIES AND POLLUTANTS California Plan for the... of plan. (a) State of California Designated Facility Plan (Section 111(d) Plan). (b) The plan was... Pollution Control District Regulation 1; Rule 130—Definitions, Rule 240—Permit to Operate, Rule 450—Sulfide...
40 CFR 62.1100 - Identification of plan.
Code of Federal Regulations, 2011 CFR
2011-07-01
...) APPROVAL AND PROMULGATION OF STATE PLANS FOR DESIGNATED FACILITIES AND POLLUTANTS California Plan for the... of plan. (a) State of California Designated Facility Plan (Section 111(d) Plan). (b) The plan was... Pollution Control District Regulation 1; Rule 130—Definitions, Rule 240—Permit to Operate, Rule 450—Sulfide...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-25
... DEPARTMENT OF AGRICULTURE Agricultural Marketing Service 7 CFR Part 922 [Docket No. AMS-FV-12-0028... Regulations AGENCY: Agricultural Marketing Service, USDA. ACTION: Affirmation of interim rule as final rule... the marketing order for apricots grown in designated Counties in Washington. The interim rule...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rawls, G.; Newhouse, N.; Rana, M.
2010-04-13
The Boiler and Pressure Vessel Project Team on Hydrogen Tanks was formed in 2004 to develop Code rules to address the various needs that had been identified for the design and construction of hydrogen storage vessels up to 15,000 psi. One of these needs was the development of Code rules for high-pressure composite vessels with non-load-sharing liners for stationary applications. In 2009, ASME approved the new Appendix 8 for the Section X Code, which contains the rules for these vessels. These vessels are designated as Class III vessels, with design pressures ranging from 20.7 MPa (3,000 psi) to 103.4 MPa (15,000 psi) and a maximum allowable outside liner diameter of 2.54 m (100 inches). The maximum design life of these vessels is limited to 20 years. Design, fabrication, and examination requirements have been specified, including Acoustic Emission testing at the time of manufacture. The Code rules include the design qualification testing of prototype vessels. Qualification includes proof, expansion, burst, cyclic fatigue, creep, flaw, permeability, torque, penetration, and environmental testing.
Sleboda, Patrycja; Sokolowska, Joanna
2017-01-01
The first goal of this study was to validate the Rational-Experiential Inventory (REI) and the Cognitive Reflection Test (CRT) through checking their relation to the transitivity axiom. The second goal was to test the relation between decision strategies and cognitive style as well as the relation between decision strategies and the transitivity of preferences. The following characteristics of strategies were investigated: requirements for trade-offs, maximization vs. satisficing and option-wise vs. attribute-wise information processing. Respondents were given choices between two multi-attribute options. The options were designed so that the choice indicated which strategy was applied. Both the REI-R and the CRT were found to be good predictors of the transitivity of preferences. Respondents who applied compensatory strategies and the maximization criterion scored highly on the REI-R and in the CRT, whereas those who applied the satisficing rule scored highly on the REI-R but not in the CRT. Attribute-wise information processing was related to low scores in both measurements. Option-wise information processing led to a high transitivity of preferences. PMID:29093695
Genetic reinforcement learning through symbiotic evolution for fuzzy controller design.
Juang, C F; Lin, J Y; Lin, C T
2000-01-01
An efficient genetic reinforcement learning algorithm for designing fuzzy controllers is proposed in this paper. The genetic algorithm (GA) adopted in this paper is based upon symbiotic evolution which, when applied to fuzzy controller design, complements the local mapping property of a fuzzy rule. Using this Symbiotic-Evolution-based Fuzzy Controller (SEFC) design method, the number of control trials, as well as the consumed CPU time, are considerably reduced when compared to traditional GA-based fuzzy controller design methods and other types of genetic reinforcement learning schemes. Moreover, unlike traditional fuzzy controllers, which partition the input space into a grid, SEFC partitions the input space in a flexible way, thus creating fewer fuzzy rules. In SEFC, different types of fuzzy rules, whose consequent parts are singletons, fuzzy sets, or linear equations (TSK-type fuzzy rules), are allowed. Further, the free parameters (e.g., centers and widths of membership functions) and fuzzy rules are all tuned automatically. In particular, for TSK-type fuzzy rules, the proposed learning algorithm selects only the significant input variables to participate in the consequent of a rule. The proposed SEFC design method has been applied to different simulated control problems, including the cart-pole balancing system, a magnetic levitation system, and a water bath temperature control system. On these control problems, and in comparisons with some traditional GA-based fuzzy systems, the proposed SEFC has been verified to be efficient and superior.
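For readers unfamiliar with TSK-type rules, the sketch below shows plain TSK inference with Gaussian antecedent memberships and linear consequents. This is only the inference step, not the SEFC symbiotic-evolution learning itself, and the rule parameters are invented for illustration.

```python
# TSK inference: each rule's firing strength is the product of its Gaussian
# membership values; the output is the firing-strength-weighted average of the
# rules' linear consequents.
import math

def gaussian(x, center, width):
    return math.exp(-((x - center) / width) ** 2)

def tsk_infer(rules, x):
    """rules: list of (antecedents, consequent); antecedents is a list of
    (center, width) per input, consequent is (bias, weights)."""
    num = den = 0.0
    for antecedents, (bias, weights) in rules:
        strength = 1.0
        for xi, (c, w) in zip(x, antecedents):
            strength *= gaussian(xi, c, w)
        out = bias + sum(wi * xi for wi, xi in zip(weights, x))
        num += strength * out
        den += strength
    return num / den if den else 0.0

rules = [
    ([(0.0, 1.0), (0.0, 1.0)], (0.0, [1.0, 0.0])),   # near origin: y ≈ x1
    ([(2.0, 1.0), (2.0, 1.0)], (1.0, [0.0, 1.0])),   # near (2, 2): y ≈ 1 + x2
]
print(tsk_infer(rules, [2.0, 2.0]))
```

In SEFC, the centers, widths and consequent weights of such rules would be what the symbiotic GA evolves.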
Harvesting river water through small dams promote positive environmental impact.
Agoramoorthy, Govindasamy; Chaudhary, Sunita; Chinnasamy, Pennan; Hsu, Minna J
2016-11-01
While deliberations on the negative consequences of large dams for the environment continue to dominate world attention, the positive benefits provided by small dams, also known as check dams, go largely unnoticed. Moreover, little is known about the potential of check dams to mitigate global-warming impacts, owing to limited data availability. Small dams are usually commissioned to private contractors who have no clear mandate from their employers to post their work online for public scrutiny. As a result, statistics on the design, cost, and materials used to build check dams are not available in the public domain. However, this review paper presents data for the first time on the often-ignored potential of check dams to mitigate climate-induced hydrological threats. We hope that the scientific analysis presented in this paper will promote further research on check dams worldwide to better comprehend their eco-friendly significance in serving society.
Design, decoding and optimized implementation of SECDED codes over GF(q)
Ward, H Lee; Ganti, Anand; Resnick, David R
2014-06-17
A plurality of columns for a check matrix that implements a distance d linear error correcting code are populated by providing a set of vectors from which to populate the columns, and applying to the set of vectors a filter operation that reduces the set by eliminating therefrom all vectors that would, if used to populate the columns, prevent the check matrix from satisfying a column-wise linear independence requirement associated with check matrices of distance d linear codes. One of the vectors from the reduced set may then be selected to populate one of the columns. The filtering and selecting repeats iteratively until either all of the columns are populated or the number of currently unpopulated columns exceeds the number of vectors in the reduced set. Columns for the check matrix may be processed to reduce the amount of logic needed to implement the check matrix in circuit logic.
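For the simplest case, distance d = 3 over GF(2), the column-wise independence requirement reduces to: every column is nonzero and no two columns are equal. The sketch below (our own illustrative code, not the patented implementation) applies exactly that filter-then-select loop while populating the columns.

```python
# Populate check-matrix columns for a distance-3 binary code: filter the candidate
# set so that choosing any remaining vector keeps every pair of columns linearly
# independent (over GF(2): nonzero and pairwise distinct), then select one.
from itertools import product

def populate_check_matrix(r, n):
    """Greedily pick n distinct nonzero length-r binary columns."""
    candidates = [v for v in product((0, 1), repeat=r) if any(v)]
    columns = []
    for _ in range(n):
        # filter: drop vectors that would duplicate an already-chosen column
        candidates = [v for v in candidates if v not in columns]
        if not candidates:
            raise ValueError("not enough independent columns remain")
        columns.append(candidates[0])
    return columns

cols = populate_check_matrix(3, 7)   # the 7 columns of a Hamming(7,4) check matrix
print(len(cols), len(set(cols)))     # 7 distinct nonzero columns
```

For general distance d the filter must instead reject any vector lying in the span of some d-2 already-chosen columns, which is the stronger requirement the text describes.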
An Integrated Environment for Efficient Formal Design and Verification
NASA Technical Reports Server (NTRS)
1998-01-01
The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.
German experiences in local fatigue monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abib, E.; Bergholz, S.; Rudolph, J.
The ageing management of nuclear power plants (NPP) has gained increasing importance in recent years. The reasons lie mainly in the international context of extending plant operating periods. Moreover, new scientific findings, such as the corrosive influence of the medium on the fatigue process (environmentally assisted fatigue - EAF), play an important role and influence code development (ASME, EAF code cases). The fatigue damage process takes a central position among the ageing mechanisms of components. It must be ensured, through appropriate evidence, that facilities are being operated under allowable boundary conditions. In the design phase of an NPP, fatigue analyses are still based on theoretical considerations and empirical values, which are summarized in the design transient catalogue necessary for licensing. These analyses aim at proving the admissibility of the loads in terms of stress and fatigue usage. They also identify the fatigue-relevant positions in the NPP and give a basis for future design improvements and optimization of operating modes. In practice, the design transients are conservatively correlated with the real transients occurring during operation; uncertainties lead to very conservative assumptions regarding forecast temperatures, temperature gradients and frequencies of events. During operation of the plant, it must be recurrently proved that the plant is being operated under the designed boundary conditions. Moreover, operating signals are constantly acquired to enable a fatigue evaluation. In Germany, for example, fatigue evaluation is based on decades of experience and regulatory requirements. The rule KTA 3201.4 [1] establishes the rules for qualified fatigue monitoring. The rule DIN 25475-3 [2] on fatigue monitoring systems is available in draft version. Experience shows that significant differences occur between the design transients and the transients that actually occur during plant operation.
The reasons for this are the various manual control options and the different operating modes. Demonstrating that real loads are covered by design loads clearly requires a relatively complex and well-qualified detection process. The difficulty of this task is increased by the lack of data or incomplete information and the exclusive reliance on existing plant operating data. The strategy of employing local fatigue monitoring is a straightforward solution enabling the direct measurement of loads at the fatigue-sensitive zones. Nowadays, a direct derivation of the complete stress tensor at the fatigue-relevant locations is enabled by the recorded local loads in combination with finite element (FE) analyses. Thus, in addition to the recorded temperature curves, a representation of the time evolution of the six stress components for each monitored component is possible. This allows the application of the simplified elasto-plastic fatigue check according to design codes. The fatigue level can be realistically analyzed with a suitable cycle-counting method. Furthermore, knowledge of the time evolution of the stresses and strains makes it possible to take into account an environmental factor to include the corrosive fluid influence in the calculations. Without local recording, it is impossible to calculate realistic fatigue usage. AREVA offers the AREVA fatigue concept (AFC) and the new integrated fatigue monitoring system (FAMOSi), the tools necessary to monitor local fatigue and to provide a realistic assessment. (authors)
Ontology Based Quality Evaluation for Spatial Data
NASA Astrophysics Data System (ADS)
Yılmaz, C.; Cömert, Ç.
2015-08-01
Many institutions will be providing data to the National Spatial Data Infrastructure (NSDI). The current technical background of the NSDI is based on syntactic web services; it is expected that these will be replaced by semantic web services. The quality of the data provided is important in terms of the decision-making process and the accuracy of transactions. Therefore, the data quality needs to be tested. This topic has been neglected in Turkey. Data quality control for the NSDI may be done by private or public "data accreditation" institutions. A methodology is required for data quality evaluation. There are studies on data quality, including ISO standards, academic studies and software to evaluate spatial data quality. The ISO 19157 standard defines the data quality elements. Proprietary software, such as 1Spatial's 1Validate and ESRI's Data Reviewer, offers quality evaluation based on its own classification of rules. Commonly, rule-based approaches are used for geospatial data quality checks. In this study, we look for the technical components to devise and implement a rule-based approach with ontologies, using free and open source software, in a semantic web context. The semantic web uses ontologies to deliver well-defined web resources and make them accessible to end-users and processes. We have created an ontology conforming to the geospatial data and defined some sample rules to show how to test data with respect to data quality elements, including attribute, topo-semantic and geometrical consistency, using free and open source software. To test data against the rules, sample GeoSPARQL queries are created and associated with specifications.
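As a minimal illustration of a rule-based quality check of the kind described above (in plain Python rather than the ontology/GeoSPARQL machinery the paper uses; the rules and feature layout are invented):

```python
# Each rule pairs a human-readable description with a predicate over a feature;
# the checker reports every (feature index, rule) pair that fails.

RULES = [
    ("attribute consistency: road must have a name",
     lambda f: f["type"] != "road" or bool(f.get("name"))),
    ("geometric consistency: parcel ring needs >= 4 vertices",
     lambda f: f["type"] != "parcel" or len(f.get("vertices", [])) >= 4),
]

def check_features(features):
    violations = []
    for i, feat in enumerate(features):
        for desc, rule in RULES:
            if not rule(feat):
                violations.append((i, desc))
    return violations

features = [
    {"type": "road", "name": "Main St"},
    {"type": "road"},                                          # missing name
    {"type": "parcel", "vertices": [(0, 0), (1, 0), (1, 1), (0, 0)]},
]
print(check_features(features))   # only feature 1 violates a rule
```

In the paper's setting, each such rule would instead be expressed as a GeoSPARQL query over the ontology, so the same checks run on any data published as semantic web resources.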
Eagle, Dawn M.; Noschang, Cristie; d’Angelo, Laure-Sophie Camilla; Noble, Christie A.; Day, Jacob O.; Dongelmans, Marie Louise; Theobald, David E.; Mar, Adam C.; Urcelay, Gonzalo P.; Morein-Zamir, Sharon; Robbins, Trevor W.
2014-01-01
Excessive checking is a common, debilitating symptom of obsessive-compulsive disorder (OCD). In an established rodent model of OCD checking behaviour, quinpirole (dopamine D2/3-receptor agonist) increased checking in open-field tests, indicating dopaminergic modulation of checking-like behaviours. We designed a novel operant paradigm for rats (observing response task (ORT)) to further examine cognitive processes underpinning checking behaviour and clarify how and why checking develops. We investigated i) how quinpirole increases checking, ii) dependence of these effects on D2/3 receptor function (following treatment with D2/3 receptor antagonist sulpiride) and iii) effects of reward uncertainty. In the ORT, rats pressed an ‘observing’ lever for information about the location of an ‘active’ lever that provided food reinforcement. High- and low-checkers (defined from baseline observing) received quinpirole (0.5 mg/kg, 10 treatments) or vehicle. Parametric task manipulations assessed observing/checking under increasing task demands relating to reinforcement uncertainty (variable response requirement and active-lever location switching). Treatment with sulpiride further probed the pharmacological basis of long-term behavioural changes. Quinpirole selectively increased checking, both functional observing lever presses (OLPs) and non-functional extra OLPs (EOLPs). The increase in OLPs and EOLPs was long-lasting, without further quinpirole administration. Quinpirole did not affect the immediate ability to use information from checking. Vehicle and quinpirole-treated rats (VEH and QNP respectively) were selectively sensitive to different forms of uncertainty. Sulpiride reduced non-functional EOLPs in QNP rats but had no effect on functional OLPs. These data have implications for treatment of compulsive checking in OCD, particularly for serotonin-reuptake-inhibitor treatment-refractory cases, where supplementation with dopamine receptor antagonists may be beneficial. 
PMID:24406720
A Framework of Simple Event Detection in Surveillance Video
NASA Astrophysics Data System (ADS)
Xu, Weiguang; Zhang, Yafei; Lu, Jianjiang; Tian, Yulong; Wang, Jiabao
Video surveillance is playing a more and more important role in people's social lives. Real-time alerting of threatening events and searching for interesting content in large-scale stored video footage require a human operator to pay full attention to a monitor for long periods. This labor-intensive mode has limited the effectiveness and efficiency of such systems. A framework for simple event detection is presented to advance the automation of video surveillance. An improved inner key-point matching approach is used to compensate for background motion in real time; frame differencing is used to detect the foreground; HOG-based classifiers are used to classify foreground objects into people and cars; and mean-shift is used to track the recognized objects. Events are detected based on predefined rules. The maturity of the algorithms guarantees the robustness of the framework, and the improved approach and the easily checked rules enable the framework to work in real time. Future work is also discussed.
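The predefined-rule layer at the end of that pipeline can be sketched as follows (an illustrative toy; the object labels, the zone rule and the speed rule are our own assumptions, not the paper's rule set):

```python
# Tracked objects arrive from the classifier/tracker as (label, position, speed)
# tuples; each predefined, easily checked rule maps a track to an event.

def detect_events(tracks, restricted_zone, speed_limit):
    """Apply predefined rules to tracked objects and return detected events."""
    (x0, y0), (x1, y1) = restricted_zone
    events = []
    for label, (x, y), speed in tracks:
        if label == "person" and x0 <= x <= x1 and y0 <= y <= y1:
            events.append(("intrusion", label))
        if label == "car" and speed > speed_limit:
            events.append(("speeding", label))
    return events

tracks = [("person", (5, 5), 1.2), ("car", (30, 2), 18.0), ("person", (50, 50), 1.0)]
print(detect_events(tracks, restricted_zone=((0, 0), (10, 10)), speed_limit=15.0))
# [('intrusion', 'person'), ('speeding', 'car')]
```

Because each rule is a cheap per-track predicate, the event layer adds negligible cost on top of detection and tracking, which is what keeps the framework real-time.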
A Policy Language for Modelling Recommendations
NASA Astrophysics Data System (ADS)
Abou El Kalam, Anas; Balbiani, Philippe
While current and emergent applications become more and more complex, most existing security policies and models consider only a yes/no response to access requests. Consequently, modelling, formalizing and implementing permissions, obligations and prohibitions does not cover the richness of all the possible scenarios. In fact, several applications have access rules with the recommendation access modality. In this paper we focus on the problem of formalizing security policies with recommendation needs. The aim is to provide a generic, domain-independent formal system for modelling not only permissions, prohibitions and obligations, but also recommendations. In this respect, we present our logic-based language, its semantics, the truth conditions, and our axioms as well as inference rules. We also give a representative use case with our specification of recommendation requirements. Finally, we explain how our logical framework could be used to query the security policy and to check its consistency.
The Good, the Bad, and the Ugly: A Theoretical Framework for the Assessment of Continuous Colormaps.
Bujack, Roxana; Turton, Terece L; Samsel, Francesca; Ware, Colin; Rogers, David H; Ahrens, James
2018-01-01
A myriad of design rules for what constitutes a "good" colormap can be found in the literature. Some common rules include order, uniformity, and high discriminative power. However, the meaning of many of these terms is often ambiguous or open to interpretation. At times, different authors may use the same term to describe different concepts, or describe the same rule with varying nomenclature. These ambiguities stand in the way of collaborative work, the design of experiments to assess the characteristics of colormaps, and automated colormap generation. In this paper, we review current and historical guidelines for colormap design. We propose a specified taxonomy and provide unambiguous mathematical definitions for the most common design rules.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Designated contract market and swap execution facility position limits and accountability rules. (a) Spot... rules and procedures for monitoring and enforcing spot-month position limits set at levels no greater... monitoring and enforcing spot-month position limits set at levels no greater than 25 percent of estimated...
Code of Federal Regulations, 2013 CFR
2013-04-01
... Designated contract market and swap execution facility position limits and accountability rules. (a) Spot... rules and procedures for monitoring and enforcing spot-month position limits set at levels no greater... monitoring and enforcing spot-month position limits set at levels no greater than 25 percent of estimated...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-19
... institutions to bring those individuals and families who have rarely, if ever, held a checking account, a savings account or other type of transaction or check cashing account at an insured depository institution... size and worth of the ``unbanked'' market in the United States.'' The Household Survey is designed to...
10 CFR 36.81 - Records and retention periods.
Code of Federal Regulations, 2012 CFR
2012-01-01
... required by § 36.55 until the Commission terminates the license. (f) Records of radiation surveys required by § 36.57 for 3 years from the date of the survey. (g) Records of radiation survey meter...) Records on the design checks required by § 36.39 and the construction control checks as required by § 36...
10 CFR 36.81 - Records and retention periods.
Code of Federal Regulations, 2013 CFR
2013-01-01
... required by § 36.55 until the Commission terminates the license. (f) Records of radiation surveys required by § 36.57 for 3 years from the date of the survey. (g) Records of radiation survey meter...) Records on the design checks required by § 36.39 and the construction control checks as required by § 36...
10 CFR 36.81 - Records and retention periods.
Code of Federal Regulations, 2010 CFR
2010-01-01
... required by § 36.55 until the Commission terminates the license. (f) Records of radiation surveys required by § 36.57 for 3 years from the date of the survey. (g) Records of radiation survey meter...) Records on the design checks required by § 36.39 and the construction control checks as required by § 36...
10 CFR 36.81 - Records and retention periods.
Code of Federal Regulations, 2014 CFR
2014-01-01
... required by § 36.55 until the Commission terminates the license. (f) Records of radiation surveys required by § 36.57 for 3 years from the date of the survey. (g) Records of radiation survey meter...) Records on the design checks required by § 36.39 and the construction control checks as required by § 36...
Code of Federal Regulations, 2013 CFR
2013-01-01
...-powered; (5) For a pilot authorized by the Administrator to operate an experimental turbojet-powered aircraft that possesses, by original design or through modification, more than a single seat, the required proficiency check for all of the experimental turbojet-powered aircraft for which the pilot holds an...
Code of Federal Regulations, 2014 CFR
2014-01-01
...-powered; (5) For a pilot authorized by the Administrator to operate an experimental turbojet-powered aircraft that possesses, by original design or through modification, more than a single seat, the required proficiency check for all of the experimental turbojet-powered aircraft for which the pilot holds an...
Plan-Do-Check-Act and the Management of Institutional Research. AIR 1992 Annual Forum Paper.
ERIC Educational Resources Information Center
McLaughlin, Gerald W.; Snyder, Julie K.
This paper describes the application of a Total Quality Management strategy called Plan-Do-Check-Act (PDCA) to the projects and activities of an institutional research office at Virginia Polytechnic Institute and State University. PDCA is a cycle designed to facilitate continual, incremental improvement through change. The specific steps are…
Minimum Check List for Mechanical and Electrical Plans & Specifications.
ERIC Educational Resources Information Center
North Carolina State Dept. of Public Instruction, Raleigh. Div. of School Facility Services.
This is the fifth revision of the Minimum Check List since its origin in 1960 by North Carolina's School Planning. The checklist was developed to serve as a means of communication between school agencies and design professionals and has been widely used in the development and review of mechanical and electrical plans and specifications by…
ERIC Educational Resources Information Center
Maynard, Brandy R.; Kjellstrand, Elizabeth K.; Thompson, Aaron M.
2014-01-01
Objectives: This study examined the effects of Check & Connect (C&C) on the attendance, behavior, and academic outcomes of at-risk youth in a field-based effectiveness trial. Method: A multisite randomized block design was used, wherein 260 primarily Hispanic (89%) and economically disadvantaged (74%) students were randomized to treatment…
28 CFR 8.14 - Disposition of property before forfeiture.
Code of Federal Regulations, 2013 CFR
2013-07-01
... violation of law, is not contraband, and has no design or other characteristics that particularly suit it for use in illegal activities. This payment must be in the form of a money order, an official bank check, or a cashier's check made payable to the United States Marshals Service. A bond in the form of a...
28 CFR 8.14 - Disposition of property before forfeiture.
Code of Federal Regulations, 2014 CFR
2014-07-01
... violation of law, is not contraband, and has no design or other characteristics that particularly suit it for use in illegal activities. This payment must be in the form of a money order, an official bank check, or a cashier's check made payable to the United States Marshals Service. A bond in the form of a...
Delnevo, Cristine D; Gundersen, Daniel A; Manderski, Michelle T B; Giovenco, Daniel P; Giovino, Gary A
2017-08-15
Accurate surveillance is critical for monitoring the epidemiology of emerging tobacco products in the United States, and survey science suggests that survey response format can impact prevalence estimates. We utilized data from the 2014 New Jersey Youth Tobacco Survey (n = 3,909) to compare estimates of the prevalence of 4 behaviors (ever hookah use, current hookah use, ever e-cigarette use, and current e-cigarette use) among New Jersey high school students, as assessed using "check-all-that-apply" questions, with estimates measured by means of "forced-choice" questions. Measurement discrepancies were apparent for all 4 outcomes, with the forced-choice questions yielding prevalence estimates approximately twice those of the check-all-that-apply questions, and agreement was fair to moderate. The sensitivity of the check-all-that-apply questions, treating the forced-choice format as the "gold standard," ranged from 38.1% (current hookah use) to 58.3% (ever e-cigarette use), indicating substantial false-negative rates. These findings highlight the impact of question response format on prevalence estimates of emerging tobacco products among youth and suggest that estimates generated by means of check-all-that-apply questions may be biased downward. Alternative survey designs should be considered to avoid check-all-that-apply response formats, and researchers should use caution when interpreting tobacco use data obtained from check-all-that-apply formats. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
10 CFR 63.142 - Quality assurance criteria.
Code of Federal Regulations, 2014 CFR
2014-01-01
...) Design control. (1) DOE shall establish measures to assure that applicable regulatory requirements and... control design interfaces and for coordination among participating design organizations. These measures... control measures must provide for verifying or checking the adequacy of design, such as by the performance...
10 CFR 63.142 - Quality assurance criteria.
Code of Federal Regulations, 2013 CFR
2013-01-01
...) Design control. (1) DOE shall establish measures to assure that applicable regulatory requirements and... control design interfaces and for coordination among participating design organizations. These measures... control measures must provide for verifying or checking the adequacy of design, such as by the performance...
10 CFR 63.142 - Quality assurance criteria.
Code of Federal Regulations, 2012 CFR
2012-01-01
...) Design control. (1) DOE shall establish measures to assure that applicable regulatory requirements and... control design interfaces and for coordination among participating design organizations. These measures... control measures must provide for verifying or checking the adequacy of design, such as by the performance...
10 CFR 63.142 - Quality assurance criteria.
Code of Federal Regulations, 2011 CFR
2011-01-01
...) Design control. (1) DOE shall establish measures to assure that applicable regulatory requirements and... control design interfaces and for coordination among participating design organizations. These measures... control measures must provide for verifying or checking the adequacy of design, such as by the performance...
NASA Astrophysics Data System (ADS)
Maschio, Lorenzo; Kirtman, Bernard; Rérat, Michel; Orlando, Roberto; Dovesi, Roberto
2013-10-01
In this work, we validate a new, fully analytical method for calculating Raman intensities of periodic systems, developed and presented in Paper I [L. Maschio, B. Kirtman, M. Rérat, R. Orlando, and R. Dovesi, J. Chem. Phys. 139, 164101 (2013)]. Our validation of this method and its implementation in the CRYSTAL code is done through several internal checks as well as comparison with experiment. The internal checks include consistency of results when increasing the number of periodic directions (from 0D to 1D, 2D, 3D), comparison with numerical differentiation, and a test of the sum rule for derivatives of the polarizability tensor. The choice of basis set as well as the Hamiltonian is also studied. Simulated Raman spectra of α-quartz and of the UiO-66 Metal-Organic Framework are compared with the experimental data.
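The comparison-with-numerical-differentiation check mentioned above can be illustrated generically: an analytic derivative is validated against a central finite difference. The toy polarizability function below is an invented stand-in, not anything computed by the CRYSTAL code:

```python
def alpha(q):
    # Toy polarizability as a function of a normal-mode coordinate q
    # (an invented quadratic, purely for demonstration).
    return 2.0 + 0.3 * q + 0.05 * q ** 2

def dalpha_analytic(q):
    # Hand-derived derivative of the toy function above.
    return 0.3 + 0.1 * q

def dalpha_numeric(q, h=1e-5):
    # Central-difference approximation; should agree with the analytic
    # derivative to within truncation and roundoff error.
    return (alpha(q + h) - alpha(q - h)) / (2 * h)

q = 0.7
assert abs(dalpha_analytic(q) - dalpha_numeric(q)) < 1e-6
```

The same pattern scales to the paper's setting, where the "analytic" side is the fully analytical Raman intensity machinery and the "numeric" side is finite differences of the polarizability with respect to atomic displacements.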
Automated software system for checking the structure and format of ACM SIG documents
NASA Astrophysics Data System (ADS)
Mirza, Arsalan Rahman; Sah, Melike
2017-04-01
Microsoft (MS) Office Word is one of the most commonly used software tools for creating documents. MS Word 2007 and above uses XML to represent the structure of MS Word documents. Metadata about the documents is automatically created using Office Open XML (OOXML) syntax. We develop a new framework, called ADFCS (Automated Document Format Checking System), that takes advantage of the OOXML metadata in order to extract semantic information from MS Office Word documents. In particular, we develop a new ontology for Association for Computing Machinery (ACM) Special Interest Group (SIG) documents, representing the structure and format of these documents by using OWL (Web Ontology Language). Then, the metadata is extracted automatically in RDF (Resource Description Framework) according to this ontology using the developed software. Finally, we generate extensive rules in order to infer whether the documents are formatted according to ACM SIG standards. This paper introduces the ACM SIG ontology, the metadata extraction process, the inference engine, the ADFCS online user interface, the system evaluation and user study evaluations.
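The OOXML extraction step can be sketched in a few lines: a .docx file is a ZIP package whose word/document.xml holds paragraphs and their styles. The style-based rule below is a hypothetical stand-in for the ACM SIG checks, and the minimal in-memory .docx exists only for demonstration:

```python
import io
import zipfile
import xml.etree.ElementTree as ET

# WordprocessingML namespace, used for every tag and attribute lookup.
W = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"

def paragraph_styles(docx_bytes):
    """Extract (style, text) pairs from word/document.xml of an OOXML package."""
    with zipfile.ZipFile(io.BytesIO(docx_bytes)) as z:
        root = ET.fromstring(z.read("word/document.xml"))
    out = []
    for p in root.iter(W + "p"):
        style_el = p.find(f"{W}pPr/{W}pStyle")
        style = style_el.get(W + "val") if style_el is not None else "Normal"
        text = "".join(t.text or "" for t in p.iter(W + "t"))
        out.append((style, text))
    return out

def check_rules(styles):
    """A toy format rule: the first paragraph must use the 'Title' style."""
    violations = []
    if not styles or styles[0][0] != "Title":
        violations.append("first paragraph is not styled 'Title'")
    return violations

# Build a minimal in-memory .docx purely for demonstration.
doc = (
    '<w:document xmlns:w="http://schemas.openxmlformats.org/wordprocessingml/2006/main">'
    "<w:body>"
    '<w:p><w:pPr><w:pStyle w:val="Title"/></w:pPr><w:r><w:t>My Paper</w:t></w:r></w:p>'
    "<w:p><w:r><w:t>Body text.</w:t></w:r></w:p>"
    "</w:body></w:document>"
)
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("word/document.xml", doc)

styles = paragraph_styles(buf.getvalue())
print(styles)               # [('Title', 'My Paper'), ('Normal', 'Body text.')]
print(check_rules(styles))  # []
```

ADFCS itself maps such extracted facts into RDF and runs an inference engine over OWL-defined rules; this sketch only shows the raw OOXML-to-facts step.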
Rapid detection of irradiated frozen hamburgers
NASA Astrophysics Data System (ADS)
Delincée, Henry
2002-03-01
The DNA comet assay can be employed as a rapid and inexpensive screening test to check whether frozen ground beef patties (hamburgers) have been irradiated as a means to increase their safety by eliminating pathogenic bacteria, e.g. E. coli O157:H7. Such a detection procedure provides an additional check on compliance with existing regulations, e.g. enforcement of labelling and rules in international trade. Frozen ready-prepared hamburgers from the marketplace were electron-irradiated with doses of 0, 1.3, 2.7, 4.5 and 7.2 kGy, covering the range of potential commercial irradiation. DNA fragmentation in the hamburgers was made visible within a few hours using the comet assay, and non-irradiated hamburgers could easily be discerned from irradiated ones. Even after 9 months of frozen storage, irradiated hamburgers could be identified. Since DNA fragmentation may also occur with other food processes (e.g. temperature abuse), positive screening tests should be confirmed using a validated method that specifically proves an irradiation treatment, e.g. EN 1784 or EN 1785.
EUV mask manufacturing readiness in the merchant mask industry
NASA Astrophysics Data System (ADS)
Green, Michael; Choi, Yohan; Ham, Young; Kamberian, Henry; Progler, Chris; Tseng, Shih-En; Chiou, Tsann-Bim; Miyazaki, Junji; Lammers, Ad; Chen, Alek
2017-10-01
As nodes progress into the 7nm and below regime, extreme ultraviolet lithography (EUVL) becomes critical for all industry participants interested in remaining at the leading edge. One key cost driver for EUV in the supply chain is the reflective EUV mask. As of today, the relatively few end users of EUV consist primarily of integrated device manufacturers (IDMs) and foundries that have internal (captive) mask manufacturing capability. At the same time, strong and early participation in EUV by the merchant mask industry should bring value to these chip makers, aiding the wide-scale adoption of EUV in the future. For this, merchants need access to high quality, representative test vehicles to develop and validate their own processes. This business circumstance provides the motivation for merchants to form Joint Development Partnerships (JDPs) with IDMs, foundries, Original Equipment Manufacturers (OEMs) and other members of the EUV supplier ecosystem that leverage complementary strengths. In this paper, we will show how, through a collaborative supplier JDP model between a merchant and OEM, a novel, test chip driven strategy is applied to guide and validate mask level process development. We demonstrate how an EUV test vehicle (TV) is generated for mask process characterization in advance of receiving chip maker-specific designs. We utilize the TV to carry out mask process "stress testing" to define process boundary conditions which can be used to create Mask Rule Check (MRC) rules as well as serve as baseline conditions for future process improvement. We utilize Advanced Mask Characterization (AMC) techniques to understand process capability on designs of varying complexity that include EUV OPC models with and without sub-resolution assist features (SRAFs). Through these collaborations, we demonstrate ways to develop EUV processes and reduce implementation risks for eventual mass production. By reducing these risks, we hope to expand access to EUV mask capability for the broadest community possible as the technology is implemented first within and then beyond the initial early adopters.
Code development for ships -- A demonstration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ayyub, B.; Mansour, A.E.; White, G.
1996-12-31
A demonstration summary of a reliability-based structural design code for ships is presented for two ship types, a cruiser and a tanker. For both ship types, code requirements cover four failure modes: hull girder buckling, unstiffened plate yielding and buckling, stiffened plate buckling, and fatigue of critical details. Both serviceability and ultimate limit states are considered. Because of length limitations, only hull girder modes are presented in this paper; code requirements for the other modes will be presented in a future publication. A specific provision of the code is a safety check expression. The design variables are to be taken at their nominal values, typically values on the safe side of the respective distributions. Other safety check expressions for hull girder failure that include load combination factors, as well as consequence-of-failure factors, are considered. This paper provides a summary of safety check expressions for the hull girder modes.
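A safety check expression of the kind described typically takes the partial-safety-factor form "factored resistance must not be less than the sum of factored loads". The sketch below is generic; the factor values and load names are illustrative, not values mandated by the code in the paper:

```python
def safety_check(resistance, loads, phi=0.85, gammas=None):
    """Generic partial-safety-factor check:
        phi * R_n >= sum(gamma_i * L_i)
    where R_n and L_i are nominal resistance and loads, phi is a resistance
    factor, and gamma_i are load factors (all values here are illustrative)."""
    gammas = gammas or [1.0] * len(loads)
    demand = sum(g * load for g, load in zip(gammas, loads))
    return phi * resistance >= demand

# Hull-girder style check with a stillwater and a wave bending moment
# (toy numbers in consistent units).
print(safety_check(resistance=1000.0, loads=[300.0, 400.0],
                   phi=0.85, gammas=[1.0, 1.3]))  # True
```

Load combination factors and consequence-of-failure factors, mentioned in the abstract, would enter as additional multipliers on the demand or resistance side of the same inequality.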
Approach to design neural cryptography: a generalized architecture and a heuristic rule.
Mu, Nankun; Liao, Xiaofeng; Huang, Tingwen
2013-06-01
Neural cryptography, a type of public key exchange protocol, is widely considered an effective method for sharing a common secret key between two neural networks over public channels. How to design neural cryptography remains a great challenge. In this paper, in order to provide an approach to this challenge, a generalized network architecture and a significant heuristic rule are designed. The proposed generic framework, named the tree state classification machine (TSCM), extends and unifies the existing structures, i.e., the tree parity machine (TPM) and the tree committee machine (TCM). Furthermore, we carefully study the framework and find that the heuristic rule can improve the security of TSCM-based neural cryptography. Therefore, TSCM and the heuristic rule can guide the design of many effective neural cryptography candidates, among which more secure instances can be achieved. Significantly, in light of TSCM and the heuristic rule, we further show that our designed neural cryptography outperforms TPM (the most secure existing model) in security. Finally, a series of numerical simulation experiments are provided to verify the validity and applicability of our results.
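A minimal tree parity machine, the TPM structure that TSCM generalizes, can be sketched as follows. The parameters (K hidden units, N inputs each, weight bound L) and the Hebbian learning rule are textbook choices for illustration, not the authors' TSCM configuration:

```python
import random

class TreeParityMachine:
    """Minimal TPM sketch: K hidden units with N inputs each and integer
    weights bounded by L. Illustrative only, not a hardened implementation."""
    def __init__(self, K=3, N=4, L=3, rng=None):
        self.K, self.N, self.L = K, N, L
        self.rng = rng or random.Random()
        self.w = [[self.rng.randint(-L, L) for _ in range(N)] for _ in range(K)]

    def output(self, x):
        # Hidden unit outputs are signs of local fields; tau is their product.
        self.sigma = [1 if sum(wi * xi for wi, xi in zip(row, xrow)) > 0 else -1
                      for row, xrow in zip(self.w, x)]
        tau = 1
        for s in self.sigma:
            tau *= s
        return tau

    def hebbian_update(self, x, tau):
        # Update only the hidden units that agree with the common output,
        # clipping weights to the interval [-L, L].
        for k in range(self.K):
            if self.sigma[k] == tau:
                for i in range(self.N):
                    w = self.w[k][i] + x[k][i] * tau
                    self.w[k][i] = max(-self.L, min(self.L, w))

def synchronize(a, b, rounds=10000, seed=1):
    """Exchange outputs on shared random inputs; learn only on agreement."""
    rng = random.Random(seed)
    for _ in range(rounds):
        x = [[rng.choice([-1, 1]) for _ in range(a.N)] for _ in range(a.K)]
        ta, tb = a.output(x), b.output(x)
        if ta == tb:
            a.hebbian_update(x, ta)
            b.hebbian_update(x, tb)
        if a.w == b.w:
            return True
    return a.w == b.w

a = TreeParityMachine(rng=random.Random(2))
b = TreeParityMachine(rng=random.Random(3))
print(synchronize(a, b))
```

Once the two weight matrices coincide they serve as the shared secret; TSCM replaces the fixed parity output with a more general state classification, which is where the paper's security analysis lives.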
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-24
... Equities Rule 104 To Adopt Pricing Obligations for Designated Market Makers September 20, 2010. Pursuant to... pricing obligations for Designated Market Makers (``DMMs''). The text of the proposed rule change is... adopt pricing obligations for DMMs. Under the proposal, the Exchange will require DMMs to continuously...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-07
... Organizations; International Securities Exchange, LLC; Notice of Designation of a Longer Period for Commission Action on Proposed Rule Change To List and Trade Option Contracts Overlying 10 Shares of a Security June... Act of 1934 (``Act'') \\1\\ and Rule 19b-4 thereunder,\\2\\ a proposed rule change to list and trade...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-25
...-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Designation of Longer Period for Commission Action on Proceedings To Determine Whether To Approve or Disapprove Proposed Rule Change To Link Market... Rule 19b-4 thereunder,\\2\\ a proposed rule change to discount certain market data fees and increase...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-15
...; Proposed Amendments to Rule G-8, on Books and Records, Rule G- 9, on Record Retention, and Rule G-18, on... of proposed MSRB Rule G-43, on broker's brokers; amendments to MSRB Rule G-8, on books and records...
Low-cost and high-speed optical mark reader based on an intelligent line camera
NASA Astrophysics Data System (ADS)
Hussmann, Stephan; Chan, Leona; Fung, Celine; Albrecht, Martin
2003-08-01
Optical Mark Recognition (OMR) is thoroughly reliable and highly efficient provided that high standards are maintained at both the planning and implementation stages. It is necessary to ensure that OMR forms are designed with due attention to data integrity checks, that the best use is made of features built into the OMR, that data integrity is checked before the data is processed, and that data is validated before processing. This paper describes the design and implementation of an OMR prototype system for marking multiple-choice tests automatically. Parameter testing was carried out before the platform and the multiple-choice answer sheet were designed. Position recognition and position verification methods have been developed and implemented in an intelligent line scan camera. The position recognition process is implemented in a Field Programmable Gate Array (FPGA), whereas the verification process is implemented in a micro-controller. The verified results are then sent to the Graphical User Interface (GUI) for answer checking and statistical analysis. At the end of the paper the proposed OMR system is compared with commercially available systems on the market.
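The mark-detection step can be approximated in software by thresholding the fill ratio of each answer cell on a binarized scan; this is a simplified stand-in for the FPGA position-recognition logic, with invented grid dimensions and threshold:

```python
def read_marks(image, rows, cols, threshold=0.5):
    """Detect filled bubbles on a binary answer-sheet image (1 = dark pixel).
    The sheet is divided into a rows x cols grid; a cell counts as marked
    when its fraction of dark pixels reaches `threshold`."""
    h, w = len(image), len(image[0])
    ch, cw = h // rows, w // cols
    marks = []
    for r in range(rows):
        row_marks = []
        for c in range(cols):
            cell = [image[y][x]
                    for y in range(r * ch, (r + 1) * ch)
                    for x in range(c * cw, (c + 1) * cw)]
            row_marks.append(sum(cell) / len(cell) >= threshold)
        marks.append(row_marks)
    return marks

# 2 questions x 2 choices; question 1 marked A, question 2 marked B.
sheet = [
    [1, 1, 0, 0],
    [1, 1, 0, 0],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
print(read_marks(sheet, rows=2, cols=2))  # [[True, False], [False, True]]
```

A production system like the one described adds position verification (registration marks, skew correction) before this step, which is why the prototype splits recognition and verification between the FPGA and the micro-controller.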
Nguyen, Su; Zhang, Mengjie; Tan, Kay Chen
2017-09-01
Automated design of dispatching rules for production systems has been an active research topic over the last several years. Machine learning, especially genetic programming (GP), has been a powerful approach to this design problem. However, intensive computational requirements, accuracy and interpretability are still its limitations. This paper aims at developing a new surrogate-assisted GP to help improve the quality of the evolved rules without significant computational cost. The experiments have verified the effectiveness and efficiency of the proposed algorithms compared to those in the literature. Furthermore, new simplification and visualisation approaches have also been developed to improve the interpretability of the evolved rules. These approaches have shown great potential and proved to be a critical part of the automated design system.
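A dispatching rule is essentially a priority function evaluated at each dispatching decision, and GP evolves such functions automatically. The sketch below scores two classic hand-written rules, standing in for evolved ones, on a toy single-machine instance; job data and the tardiness objective are invented for illustration:

```python
def simulate(jobs, priority):
    """Single-machine dispatch simulation: at each step, start the waiting
    job with the highest priority; return total tardiness."""
    t, tardiness = 0.0, 0.0
    pending = list(jobs)
    while pending:
        job = max(pending, key=lambda j: priority(j, t))
        pending.remove(job)
        t += job["pt"]
        tardiness += max(0.0, t - job["due"])
    return tardiness

jobs = [{"pt": 3, "due": 4}, {"pt": 1, "due": 2}, {"pt": 2, "due": 9}]

spt = lambda j, t: -j["pt"]   # shortest processing time first
edd = lambda j, t: -j["due"]  # earliest due date first
print(simulate(jobs, spt), simulate(jobs, edd))  # 2.0 0.0
```

In a GP system, `priority` would be an evolved expression tree over job and shop attributes, and a surrogate (e.g. a cheap simplified simulation) would pre-screen candidates before full evaluation.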
75 FR 47063 - Mutual Fund Distribution Fees; Confirmations
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-04
... competition for distribution services. The proposed rule and rule amendments are designed to protect... designed to enhance investor understanding of those charges, limit the cumulative sales charges each...(b) was designed to protect funds from being charged excessive sales and promotional expenses.\\26...
Saving Material with Systematic Process Designs
NASA Astrophysics Data System (ADS)
Kerausch, M.
2011-08-01
Global competition is forcing the stamping industry to further increase quality, to shorten time-to-market and to reduce total cost. Continuous balancing between these classical time-cost-quality targets throughout the product development cycle is required to ensure future economic success. In today's industrial practice, die layout standards are typically assumed to implicitly ensure the balancing of company-specific time-cost-quality targets. Although die layout standards are a very successful approach, they have two methodical disadvantages. First, the capabilities for tool design have to be continuously adapted to technological innovations, e.g. to take advantage of the full forming capability of new materials. Secondly, the great variety of die design aspects has to be reduced to generic rules or guidelines, e.g. binder shape, draw-in conditions or the use of drawbeads. Therefore, it is important not to overlook cost or quality opportunities when applying die design standards. This paper describes a systematic workflow with a focus on minimizing material consumption. The starting point of the investigation is a full process plan for a typical structural part, in which all requirements defined by a predefined set of die design standards with industrial relevance are fulfilled. In a first step, binder and addendum geometry is systematically checked for material-saving potential. In a second step, blank shape and draw-in are adjusted to meet thinning, wrinkling and springback targets for a minimum-blank solution. Finally, the identified die layout is validated with respect to production robustness against splits, wrinkles and springback. For all three steps the applied methodology is based on finite element simulation combined with stochastic variation of input variables. With the proposed workflow, a well-balanced (time-cost-quality) production process assuring minimal material consumption can be achieved.
Labeit, Alexander; Peinemann, Frank; Baker, Richard
2013-01-01
Objectives: To analyse and compare the determinants of screening uptake for different National Health Service (NHS) health check-ups in the UK. Design: Individual-level analysis of repeated cross-sectional surveys with balanced panel data. Setting: The UK. Participants: Individuals taking part in the British Household Panel Survey (BHPS), 1992–2008. Outcome measure: Uptake of NHS health check-ups for cervical cancer screening, breast cancer screening, blood pressure checks, cholesterol tests, dental screening and eyesight tests. Methods: Dynamic panel data models (random effects panel probit with initial conditions). Results: Having had a health check-up 1 year before, and previously in accordance with the recommended schedule, was associated with higher uptake of health check-ups. Individuals who visited a general practitioner (GP) had a significantly higher uptake in 5 of the 6 health check-ups. Uptake was highest in the recommended age group for breast and cervical cancer screening. For all health check-ups, age had a non-linear relationship. Lower self-rated health status was associated with increased uptake of blood pressure checks and cholesterol tests; smoking was associated with decreased uptake of 4 health check-ups. The effects of socioeconomic variables differed for the different health check-ups. Ethnicity did not have a significant influence on any health check-up. Permanent household income had an influence only on eyesight tests and dental screening. Conclusions: Common determinants for having health check-ups are age, screening history and a GP visit. Policy interventions to increase uptake should consider the central role of the GP in promoting screening examinations and in preserving a high level of uptake. Possible economic barriers to access for prevention exist for dental screening and eyesight tests, and could be a target for policy intervention. Trial registration: This observational study was not registered. PMID:24366576
Complexity, Training Paradigm Design, and the Contribution of Memory Subsystems to Grammar Learning
Ettlinger, Marc; Wong, Patrick C. M.
2016-01-01
Although there is variability in nonnative grammar learning outcomes, the contributions of training paradigm design and memory subsystems are not well understood. To examine this, we presented learners with an artificial grammar that formed words via simple and complex morphophonological rules. Across three experiments, we manipulated training paradigm design and measured subjects' declarative, procedural, and working memory subsystems. Experiment 1 demonstrated that passive, exposure-based training boosted learning of both simple and complex grammatical rules, relative to no training. Additionally, procedural memory correlated with simple rule learning, whereas declarative memory correlated with complex rule learning. Experiment 2 showed that presenting corrective feedback during the test phase did not improve learning. Experiment 3 revealed that structuring the order of training so that subjects are first exposed to the simple rule and then the complex improved learning. The cumulative findings shed light on the contributions of grammatical complexity, training paradigm design, and domain-general memory subsystems in determining grammar learning success. PMID:27391085
Software tool for physics chart checks.
Li, H Harold; Wu, Yu; Yang, Deshan; Mutic, Sasa
2014-01-01
Physics chart check has long been a central quality assurance (QA) measure in radiation oncology. The purpose of this work is to describe a software tool that aims to accomplish simplification, standardization, automation, and forced functions in the process. Nationally recognized guidelines, including American College of Radiology and American Society for Radiation Oncology guidelines and technical standards, and the American Association of Physicists in Medicine Task Group reports, were identified, studied, and summarized. Meanwhile, reported events related to the physics chart check service were analyzed using an event reporting and learning system. A number of shortfalls in the chart check process were identified. To address these problems, a software tool was designed and developed under Microsoft .NET in C# to hardwire as many components as possible at each stage of the process. The software consists of the following 4 independent modules: (1) chart check management; (2) pretreatment and during-treatment chart check assistant; (3) posttreatment chart check assistant; and (4) quarterly peer-review management. The users were a large group of physicists in the authors' radiation oncology clinic. During more than 1 year of use, the tool has proven very helpful in chart check management, communication, documentation, and maintaining consistency. The software tool presented in this work aims to assist physicists at each stage of the physics chart check process, and it is potentially useful for any radiation oncology clinic that is either pursuing or maintaining American College of Radiology accreditation.
Symbolic LTL Compilation for Model Checking: Extended Abstract
NASA Technical Reports Server (NTRS)
Rozier, Kristin Y.; Vardi, Moshe Y.
2007-01-01
In Linear Temporal Logic (LTL) model checking, we check LTL formulas representing desired behaviors against a formal model of the system designed to exhibit these behaviors. To accomplish this task, the LTL formulas must be translated into automata [21]. We focus on LTL compilation by investigating LTL satisfiability checking via a reduction to model checking. Having shown that symbolic LTL compilation algorithms are superior to explicit automata construction algorithms for this task [16], we concentrate here on seeking a better symbolic algorithm. We present experimental data comparing algorithmic variations such as normal forms, encoding methods, and variable ordering, and examine their effects on performance metrics including processing time and scalability. Safety-critical systems, such as air traffic control, life support systems, hazardous environment controls, and automotive control systems, pervade our daily lives, yet testing and simulation alone cannot adequately verify their reliability [3]. Model checking is a promising approach to formal verification for safety-critical systems, which involves creating a formal mathematical model of the system and translating desired safety properties into a formal specification for this model. The complement of the specification is then checked against the system model. When the model does not satisfy the specification, model-checking tools accompany this negative answer with a counterexample, which points to an inconsistency between the system and the desired behaviors and aids debugging efforts.
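The satisfiability-to-model-checking duality can be demonstrated concretely on the next-only fragment of LTL, where only a finite prefix of a word matters: a formula is satisfiable iff it is not the case that every word models its negation. This brute-force toy illustrates the reduction only; it bears no resemblance to the symbolic BDD-based algorithms compared in the paper:

```python
from itertools import product

# Formulas as nested tuples over the next-only LTL fragment:
# ("ap", a), ("not", f), ("and", f, g), ("or", f, g), ("X", f).

def depth(f):
    """Nesting depth of X: only the first depth(f)+1 letters of a word matter."""
    if f[0] == "ap":
        return 0
    if f[0] == "X":
        return 1 + depth(f[1])
    return max(depth(g) for g in f[1:])

def holds(f, word, i=0):
    """Evaluate f at position i of a word (a sequence of sets of atoms)."""
    op = f[0]
    if op == "ap":
        return f[1] in word[i]
    if op == "not":
        return not holds(f[1], word, i)
    if op == "and":
        return holds(f[1], word, i) and holds(f[2], word, i)
    if op == "or":
        return holds(f[1], word, i) or holds(f[2], word, i)
    if op == "X":
        return holds(f[1], word, i + 1)
    raise ValueError(f"unknown operator {op!r}")

def satisfiable(f, aps=("p", "q")):
    """f is satisfiable iff NOT every word satisfies ("not", f) -- the same
    duality that reduces satisfiability checking to model checking."""
    letters = [frozenset(a for a, on in zip(aps, bits) if on)
               for bits in product([False, True], repeat=len(aps))]
    words = product(letters, repeat=depth(f) + 1)
    return not all(holds(("not", f), w) for w in words)

phi = ("and", ("ap", "p"), ("X", ("not", ("ap", "p"))))  # p and X(not p)
bot = ("and", ("ap", "p"), ("not", ("ap", "p")))         # p and not p
print(satisfiable(phi), satisfiable(bot))  # True False
```

Full LTL adds U and temporal operators over infinite words, so real tools check the negated formula against a universal model symbolically instead of enumerating words.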
Acquisition, representation and rule generation for procedural knowledge
NASA Technical Reports Server (NTRS)
Ortiz, Chris; Saito, Tim; Mithal, Sachin; Loftin, R. Bowen
1991-01-01
Current research into the design and continuing development of a system for the acquisition of procedural knowledge, its representation in useful forms, and proposed methods for automated C Language Integrated Production System (CLIPS) rule generation is discussed. The Task Analysis and Rule Generation Tool (TARGET) is intended to permit experts, individually or collectively, to visually describe and refine procedural tasks. The system is designed to represent the acquired knowledge in the form of graphical objects with the capacity for generating production rules in CLIPS. The generated rules can then be integrated into applications such as NASA's Intelligent Computer Aided Training (ICAT) architecture. Also described are proposed methods for use in translating the graphical and intermediate knowledge representations into CLIPS rules.
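Rule generation of the kind TARGET performs can be sketched as rendering a procedural step and its preconditions into a CLIPS defrule string. The fact and slot names below are invented for illustration, not the actual TARGET or ICAT schema:

```python
def step_to_clips(task, step, preconditions):
    """Render one procedural step as a CLIPS defrule string: the rule fires
    when every precondition step is marked completed, and asserts the step
    as the next one to perform. Names are illustrative only."""
    conds = "\n   ".join(f"(completed {p})" for p in preconditions)
    return (f"(defrule {task}-{step}\n"
            f"   {conds}\n"
            f"   =>\n"
            f"   (assert (next-step {step})))")

print(step_to_clips("power-up", "open-valve", ["check-pressure", "don-gloves"]))
```

In the system described, the preconditions would come from the graphical task representation rather than a hand-written list, and the emitted rules would be loaded into a CLIPS inference engine inside the ICAT architecture.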
Process Materialization Using Templates and Rules to Design Flexible Process Models
NASA Astrophysics Data System (ADS)
Kumar, Akhil; Yao, Wen
The main idea in this paper is to show how flexible processes can be designed by combining generic process templates and business rules. We instantiate a process by applying rules to specific case data, and running a materialization algorithm. The customized process instance is then executed in an existing workflow engine. We present an architecture and also give an algorithm for process materialization. The rules are written in a logic-based language like Prolog. Our focus is on capturing deeper process knowledge and achieving a holistic approach to robust process design that encompasses control flow, resources and data, as well as makes it easier to accommodate changes to business policy.
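The materialization step can be sketched as folding a list of rules over a generic template, each rule inspecting the case data and dropping or inserting steps. The step names and rules below are invented examples, not those in the paper, and real rules would be written in a logic language like Prolog rather than Python:

```python
def materialize(template, rules, case):
    """Instantiate a generic process template for one case: each rule takes
    (steps, case) and returns a possibly modified step list."""
    steps = list(template)
    for rule in rules:
        steps = rule(steps, case)
    return steps

generic = ["receive_order", "credit_check", "pack", "ship"]

def skip_credit_check_for_small_orders(steps, case):
    # Business rule: orders under a threshold skip the credit check.
    if case["amount"] < 500:
        return [s for s in steps if s != "credit_check"]
    return steps

def add_customs_for_international(steps, case):
    # Business rule: international shipments need a customs declaration.
    if case["international"]:
        i = steps.index("ship")
        return steps[:i] + ["customs_declaration"] + steps[i:]
    return steps

rules = [skip_credit_check_for_small_orders, add_customs_for_international]
print(materialize(generic, rules, {"amount": 200, "international": True}))
# ['receive_order', 'pack', 'customs_declaration', 'ship']
```

The customized step list corresponds to the process instance that would then be handed to a workflow engine for execution.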
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-16
... be made in a nondiscriminatory fashion.\\14\\ \\14\\ See NYSE Arca Equities Rule 7.45(d)(3). NYSE Arca... Securities will be required to establish and enforce policies and procedures that are reasonably designed to... other things, that the rules of a national securities exchange be designed to prevent fraudulent and...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-26
...-Regulatory Organizations; NYSE Arca, Inc.; Notice of Designation of a Longer Period for Commission Action on Proposed Rule Change Proposing a Pilot Program To Create a Lead Market Maker Issuer Incentive Program for...'') \\1\\ and Rule 19b-4 thereunder,\\2\\ a proposed rule change to create and implement, on a pilot basis, a...
Learning CAD at University through Summaries of the Rules of Design Intent
ERIC Educational Resources Information Center
Barbero, Basilio Ramos; Pedrosa, Carlos Melgosa; Samperio, Raúl Zamora
2017-01-01
Ease of modification and reuse of 3D CAD models are two key aspects that improve the design-intent variable and that can significantly shorten the development timeline of a product. A set of rules is gathered from various authors who take different 3D modelling strategies into account. These rules are then applied to CAD…
Basis of the tubesheet heat exchanger design rules used in the French pressure vessel code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Osweiller, F.
1992-02-01
For about 40 years most tubesheet exchangers have been designed according to the standards of TEMA. Partly due to their simplicity, these rules do not assure a safe heat-exchanger design in all cases. This is the main reason why new tubesheet design rules were developed in 1981 in France for the French pressure vessel code CODAP. For fixed tubesheet heat exchangers, the new rules account for the elastic rotational restraint of the shell and channel at the outer edge of the tubesheet, as proposed in 1959 by Galletly. For floating-head and U-tube heat exchangers, the approach developed by Gardner in 1969 was selected with some modifications. In both cases, the tubesheet is replaced by an equivalent solid plate with adequate effective elastic constants, and the tube bundle is simulated by an elastic foundation. The elastic restraint at the edge of the tubesheet due to the shell and channel is accounted for in different ways in the two types of heat exchangers. The purpose of the paper is to present the main basis of these rules and to compare them to TEMA rules.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-03
... Proposed Rule Change Moving the Rule Text That Provides for Pegging on the Exchange From Supplementary Material .26 of NYSE Rule 70 to NYSE Rule 13 and Amending Such Text to (i) Permit Designated Market Maker... of the Terms of Substance of the Proposed Rule Change The Exchange proposes to move the rule text...
2003-09-10
KENNEDY SPACE CENTER, FLA. - Employees check out the new chamber facilities of the Space Life Sciences Lab (SLSL), formerly known as the Space Experiment Research and Processing Laboratory (SERPL). From left are Ray Wheeler, with NASA; Debbie Wells and Larry Burns, with Dynamac; A.O. Rule, president of Environmental Growth Chambers, Inc. (ECG); Neil Yorio, with Dynamac; and John Wiezchowski, with ECG. The SLSL is a state-of-the-art facility being built for ISS biotechnology research. Developed as a partnership between NASA-KSC and the State of Florida, NASA’s life sciences contractor will be the primary tenant of the facility, leasing space to conduct flight experiment processing and NASA-sponsored research. About 20 percent of the facility will be available for use by Florida’s university researchers through the Florida Space Research Institute.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zaks, D; Fletcher, R; Salamon, S
Purpose: To develop an online framework that tracks a patient’s plan from initial simulation to treatment and that helps automate elements of the physics plan checks usually performed in the record and verify (RV) system and treatment planning system. Methods: We have developed PlanTracker, an online plan tracking system that automatically imports new patient tasks and follows them through treatment planning, physics checks, therapy check, and chart rounds. A survey was designed to collect information about the amount of time spent by medical physicists on non-physics-related tasks. We then assessed these non-physics tasks for automation. Using these surveys, we directed our PlanTracker software development towards the automation of intra-plan physics review. We then conducted a systematic evaluation of PlanTracker’s accuracy by generating test plans in the RV system software designed to mimic real plans, in order to test its efficacy in catching errors both real and theoretical. Results: PlanTracker has proven to be an effective improvement to the clinical workflow in a radiotherapy clinic. We present data indicating that roughly 1/3 of the physics plan check can be automated, and the workflow optimized, and show the functionality of PlanTracker. When the full system is in clinical use we will present data on improvement of time use in comparison to survey data prior to PlanTracker implementation. Conclusion: We have developed a framework for plan tracking and automatic checks in radiation therapy. We anticipate using PlanTracker as a basis for further development in clinical/research software. We hope that by eliminating the simplest and most time-consuming checks, medical physicists may be able to spend their time on plan quality and other physics tasks rather than on arithmetic and logic checks. We see this development as part of a broader initiative to advance the clinical/research informatics infrastructure surrounding the radiotherapy clinic.
This research project has been financially supported by Varian Medical Systems, Palo Alto, CA, through a Varian MRA.
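The plan-tracking workflow described above can be pictured as a linear state machine with an automated gate at the physics-check stage. This sketch is loosely modeled on the stages named in the abstract; the stage names and the `auto_checks_passed` flag are invented for illustration, not PlanTracker's actual API.

```python
# Hypothetical plan tracker: a plan advances through fixed stages, and the
# physics-check stage is gated on the automated checks having passed.
STAGES = ["simulation", "planning", "physics_check", "therapy_check",
          "chart_rounds", "treatment"]

def advance(plan):
    """Move a plan to its next stage, enforcing the physics-check gate."""
    i = STAGES.index(plan["stage"])
    if STAGES[i] == "physics_check" and not plan["auto_checks_passed"]:
        raise ValueError("automated physics checks failed")
    plan["stage"] = STAGES[min(i + 1, len(STAGES) - 1)]
    return plan
```

Gating transitions this way is one means of "eliminating the simplest checks": a plan cannot reach treatment without the automated review having run.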
META II Complex Systems Design and Analysis (CODA)
2011-08-01
[Front-matter fragments (table of contents): 3.8.7 Variables, Parameters and Constraints; 3.8.8 Objective…; Figure 7: Inputs, States, Outputs and Parameters of System Requirements Specifications; Figure 35: AEE Device Design Rules (excerpt)]
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-14
... Proposed Rule Change To Modify the Requirements To Qualify for Credits as a Designated Liquidity Provider... requirements to qualify for credits as a designated liquidity provider under Rule 7018(i) and to make a minor... Designated Liquidity Providers: Charge to Designated Liquidity Provider $0.003 per share executed entering...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-30
...., wish to apply these airworthiness design standards to other airplane models, OHA, Inc. must submit a... affects only certain airworthiness design standards on Cessna model C172I, C172K, C172L, C172M airplanes... Design Standards for Acceptance Under the Primary Category Rule; Orlando Helicopter Airways (OHA), Inc...
Compositional schedulability analysis of real-time actor-based systems.
Jaghoori, Mohammad Mahdi; de Boer, Frank; Longuet, Delphine; Chothia, Tom; Sirjani, Marjan
2017-01-01
We present an extension of the actor model with real-time, including deadlines associated with messages, and explicit application-level scheduling policies, e.g., "earliest deadline first", which can be associated with individual actors. Schedulability analysis in this setting amounts to checking whether, given a scheduling policy for each actor, every task is processed within its designated deadline. To check schedulability, we introduce a compositional automata-theoretic approach, based on maximal use of model checking combined with testing. Behavioral interfaces define what an actor expects from the environment, and the deadlines for messages given these assumptions. We use model checking to verify that actors match their behavioral interfaces. We extend timed automata refinement with the notion of deadlines and use it to define compatibility of actor environments with the behavioral interfaces. Model checking of compatibility is computationally hard, so we propose a special testing process. We show that the analyses are decidable and automate the process using the Uppaal model checker.
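The per-actor schedulability question (does every task meet its deadline under a given policy?) can be illustrated with a minimal earliest-deadline-first simulation. This is a sketch, not the paper's timed-automata analysis: tasks run to completion (which matches actor semantics, where a message handler is not preempted), deadlines are absolute times, and the task tuples are invented.

```python
import heapq

# Each task is (release_time, duration, deadline).  Return True if
# run-to-completion EDF on a single actor meets every deadline.
def edf_schedulable(tasks):
    tasks = sorted(tasks)              # by release time
    ready, t, i = [], 0, 0
    while i < len(tasks) or ready:
        if not ready and tasks[i][0] > t:
            t = tasks[i][0]            # idle until the next release
        while i < len(tasks) and tasks[i][0] <= t:
            rel, dur, dl = tasks[i]
            heapq.heappush(ready, (dl, dur))
            i += 1
        dl, dur = heapq.heappop(ready)
        t += dur                       # run the earliest-deadline task fully
        if t > dl:
            return False
    return True
```

A compositional analysis replaces this whole-system simulation with checks of each actor against its behavioral interface, which is the point of the approach above.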
Bendability optimization of flexible optical nanoelectronics via neutral axis engineering
2012-01-01
The enhancement of bendability of flexible nanoelectronics is critically important to realize future portable and wearable nanoelectronics for personal and military purposes. Because there is an enormous variety of materials and structures that are used for flexible nanoelectronic devices, a governing design rule for optimizing the bendability of these nanodevices is required. In this article, we suggest a design rule to optimize the bendability of flexible nanoelectronics through neutral axis (NA) engineering. In flexible optical nanoelectronics, transparent electrodes such as indium tin oxide (ITO) are usually the most fragile under an external load because of their brittleness. Therefore, we representatively focus on the bendability of ITO which has been widely used as transparent electrodes, and the NA is controlled by employing a buffer layer on the ITO layer. First, we independently investigate the effect of the thickness and elastic modulus of a buffer layer on the bendability of an ITO film. Then, we develop a design rule for the bendability optimization of flexible optical nanoelectronics. Because NA is determined by considering both the thickness and elastic modulus of a buffer layer, the design rule is conceived to be applicable regardless of the material and thickness that are used for the buffer layer. Finally, our design rule is applied to optimize the bendability of an organic solar cell, which allows the bending radius to reach about 1 mm. Our design rule is thus expected to provide a great strategy to enhance the bending performance of a variety of flexible nanoelectronics. PMID:22587757
Bendability optimization of flexible optical nanoelectronics via neutral axis engineering.
Lee, Sangmin; Kwon, Jang-Yeon; Yoon, Daesung; Cho, Handong; You, Jinho; Kang, Yong Tae; Choi, Dukhyun; Hwang, Woonbong
2012-05-15
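The neutral-axis design rule lends itself to a back-of-envelope calculation: for a layered stack, the neutral axis sits at the E·t-weighted average of the layer midplanes, and the bending strain in a layer scales with its distance from that axis. The layer moduli and thicknesses below are hypothetical, chosen only to show how a buffer layer matched to the substrate moves the NA into the ITO layer.

```python
# Neutral-axis position for a layer stack, bottom to top:
# z_NA = sum(E_i * t_i * zbar_i) / sum(E_i * t_i), zbar_i = layer midplane.
def neutral_axis(layers):
    """layers: list of (elastic_modulus, thickness), bottom to top.
    Returns the NA height measured from the bottom of the stack."""
    z, num, den = 0.0, 0.0, 0.0
    for E, t in layers:
        mid = z + t / 2
        num += E * t * mid
        den += E * t
        z += t
    return num / den

# substrate, ITO, buffer -- E in GPa, t in um (hypothetical values)
stack = [(5.0, 100.0), (116.0, 0.2), (5.0, 100.0)]
na = neutral_axis(stack)   # lands at 100.1, the ITO midplane
```

With a buffer matching the substrate's E·t, the NA coincides with the ITO midplane, so the bending strain ε = (z − z_NA)/R in the brittle ITO layer is nearly zero regardless of the bending radius R, which is the essence of the rule.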
McGinty, Emma E; Wolfson, Julia A; Sell, Tara Kirk; Webster, Daniel W
2016-02-01
Gun violence is a critical public health problem in the United States, but it is rarely at the top of the public policy agenda. The 2012 mass shooting in Newtown, Connecticut, opened a rare window of opportunity to strengthen firearm policies in the United States. In this study, we examine the American public's exposure to competing arguments for and against federal- and state-level universal background check laws, which would require a background check prior to every firearm sale, in a large sample of national and regional news stories (n = 486) published in the year following the Newtown shooting. Competing messages about background check laws could influence the outcome of policy debates by shifting support and political engagement among key constituencies such as gun owners and conservatives. We found that news media messages in support of universal background checks were fact-based and used rational arguments, and opposing messages often used rights-based frames designed to activate the core values of politically engaged gun owners. Reframing supportive messages about background check policies to align with gun owners' and conservatives' core values could be a promising strategy to increase these groups' willingness to vocalize their support for expanding background checks for firearm sales. Copyright © 2016 by Duke University Press.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-22
...-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Designation of a Longer Period for Commission Action on a Proposed Rule Change Relating to Wash Sale Transactions and FINRA Rule...-4 thereunder,\\2\\ a proposed rule change to amend FINRA Rule 5210. The proposed rule change was...
14 CFR 93.121 - Applicability.
Code of Federal Regulations, 2010 CFR
2010-01-01
... AND GENERAL OPERATING RULES SPECIAL AIR TRAFFIC RULES High Density Traffic Airports § 93.121 Applicability. This subpart designates high density traffic airports and prescribes air traffic rules for...
14 CFR 93.121 - Applicability.
Code of Federal Regulations, 2012 CFR
2012-01-01
... AND GENERAL OPERATING RULES SPECIAL AIR TRAFFIC RULES High Density Traffic Airports § 93.121 Applicability. This subpart designates high density traffic airports and prescribes air traffic rules for...
14 CFR 93.121 - Applicability.
Code of Federal Regulations, 2014 CFR
2014-01-01
... AND GENERAL OPERATING RULES SPECIAL AIR TRAFFIC RULES High Density Traffic Airports § 93.121 Applicability. This subpart designates high density traffic airports and prescribes air traffic rules for...
14 CFR 93.121 - Applicability.
Code of Federal Regulations, 2011 CFR
2011-01-01
... AND GENERAL OPERATING RULES SPECIAL AIR TRAFFIC RULES High Density Traffic Airports § 93.121 Applicability. This subpart designates high density traffic airports and prescribes air traffic rules for...
14 CFR 93.121 - Applicability.
Code of Federal Regulations, 2013 CFR
2013-01-01
... AND GENERAL OPERATING RULES SPECIAL AIR TRAFFIC RULES High Density Traffic Airports § 93.121 Applicability. This subpart designates high density traffic airports and prescribes air traffic rules for...
Good pharmacovigilance practices: technology enabled.
Nelson, Robert C; Palsulich, Bruce; Gogolak, Victor
2002-01-01
The assessment of spontaneous reports is most effective when it is conducted within a defined and rigorous process. The framework for good pharmacovigilance process (GPVP) is proposed as a subset of good postmarketing surveillance process (GPMSP), a functional structure for both a public health and corporate risk management strategy. GPVP has good practices that implement each step within a defined process. These practices are designed to efficiently and effectively detect and alert the drug safety professional to new and potentially important information on drug-associated adverse reactions. They are enabled by applied technology designed specifically for the review and assessment of spontaneous reports. Specific practices include rules-based triage, active query prompts for severe organ insults, contextual single case evaluation, statistical proportionality and correlational checks, case-series analyses, and templates for signal work-up and interpretation. These practices and the overall GPVP are supported by state-of-the-art web-based systems with powerful analytical engines, workflow, and audit trails to allow validated systems support for valid drug safety signalling efforts. It is also important to understand that a process has a defined set of steps, no one of which can stand independently. Specifically, advanced use of technical alerting methods in isolation can mislead and allow one to misunderstand priorities and relative value. In the end, pharmacovigilance is a clinical art and a component process of the science of pharmacoepidemiology and risk management.
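The "statistical proportionality checks" mentioned above are commonly implemented as disproportionality measures over a 2×2 contingency table of reports, such as the proportional reporting ratio (PRR). This sketch shows the standard PRR formula; the example counts are invented, and real systems add thresholds (e.g., minimum case counts) before signalling.

```python
# Proportional reporting ratio over a 2x2 report table:
#   a: drug of interest + event of interest
#   b: drug of interest + all other events
#   c: all other drugs + event of interest
#   d: all other drugs + all other events
def prr(a, b, c, d):
    return (a / (a + b)) / (c / (c + d))

# Invented counts: the event is reported 10x more often with this drug
# than with the rest of the database.
signal = prr(10, 90, 100, 9900)
```

A PRR well above 1 flags the drug-event pair for the clinical work-up steps (case-series review, signal interpretation) that the process describes.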
NASA Technical Reports Server (NTRS)
1974-01-01
Monograph reviews and assesses current design practices, and from them establishes firm guidance for achieving greater consistency in design, increased reliability in end product, and greater efficiency in design effort. Five devices are treated separately. Guides to aid in configuration selection are outlined.
Preventing youth access to alcohol: outcomes from a multi-community time-series trial*.
Wagenaar, Alexander C; Toomey, Traci L; Erickson, Darin J
2005-03-01
AIMS/INTERVENTION: The Complying with the Minimum Drinking Age project (CMDA) is a community trial designed to test effects of two interventions designed to reduce alcohol sales to minors: (1) training for management of retail alcohol establishments and (2) enforcement checks of alcohol establishments. CMDA is a multi-community time-series quasi-experimental trial with a nested cohort design. CMDA was implemented in 20 cities in four geographic areas in the US Midwest. The core outcome, propensity for alcohol sales to minors, was directly tested with research staff who attempted to purchase alcohol without showing age identification using a standardized protocol in 602 on-premise and 340 off-premise alcohol establishments. Data were collected every other week in all communities over 4 years. Mixed-model regression and Box-Jenkins time-series analyses were used to assess short- and long-term establishment-specific and general community-level effects of the two interventions. Effects of the training intervention were mixed. Specific deterrent effects were observed for enforcement checks, with an immediate 17% reduction in likelihood of sales to minors. These effects decayed entirely within 3 months in off-premise establishments and to an 8.2% reduction in on-premise establishments. Enforcement checks prevent alcohol sales to minors. At the intensity levels tested, enforcement primarily affected specific establishments checked, with limited diffusion to the whole community. Finally, most of the enforcement effect decayed within 3 months, suggesting that a regular schedule of enforcement is necessary to maintain deterrence.
Design and Performance Checks of the NPL Axial Heat Flow Apparatus
NASA Astrophysics Data System (ADS)
Wu, J.; Clark, J.; Stacey, C.; Salmon, D.
2015-03-01
This paper describes the design and performance checks of the NPL axial heat flow apparatus developed at the National Physical Laboratory for measurement of thermal conductivity. This apparatus is based on an absolute steady-state technique and is suitable for measuring specimens with thermal conductivities in the range from to and at temperatures between and . A uniform heat flow is induced in a cylindrical bar-shaped specimen that is firmly clamped between a guarded heater unit at the top and a water-cooled base. Heat is supplied at a known rate at the top end of the specimen by the heater unit and constrained to flow axially through the specimen by a surrounding edge-guard system, which is closely matched to the temperature gradient within the test specimen. The performance of this apparatus has been checked against existing NPL thermal-conductivity reference materials NPL 2S89 (based on Stainless Steel 310) and BSC Pure Iron (pure iron supplied by the British Steel Corporation with 99.96 % purity). The measured data produced by the newly designed NPL axial heat flow apparatus agree with the reference data for NPL 2S89 within 2 % and with that of BSC Pure Iron to within 3 % at temperatures from to . This apparatus is being used to provide accurate measurements to industrial and academic organizations and has also been used to develop a new range of NPL reference materials for checking other experimental techniques and procedures for thermal-conductivity measurements.
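An absolute steady-state axial measurement of the kind described reduces to Fourier's law: with power Q flowing axially through a bar of cross-sectional area A over gauge length L and temperature drop ΔT, the conductivity is λ = Q·L/(A·ΔT). The specimen dimensions and readings below are hypothetical, for illustration only.

```python
import math

# Thermal conductivity of a cylindrical bar specimen from steady-state
# axial heat flow: lambda = Q * L / (A * dT).
def conductivity(Q_watts, length_m, diameter_m, dT_kelvin):
    area = math.pi * (diameter_m / 2) ** 2
    return Q_watts * length_m / (area * dT_kelvin)

# e.g. 10 W through a 25 mm diameter bar, 50 mm gauge length, 5 K drop
lam = conductivity(10.0, 0.05, 0.025, 5.0)
```

The edge-guard system in the apparatus exists precisely so that Q in this formula really is the axial flow through the specimen, with radial losses matched out.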
Sediment trapping efficiency of adjustable check dam in laboratory and field experiment
NASA Astrophysics Data System (ADS)
Wang, Chiang; Chen, Su-Chin; Lu, Sheng-Jui
2014-05-01
Check dams are constructed in mountain areas to block debris flows, but they fill up after several events and lose their trapping function. For this reason, the main facility in our research is an adjustable steel slit check dam, which has the advantages of fast construction and being easy to remove or adjust; its transverse beams can be removed to drain sediment off and keep the channel continuous. We constructed an adjustable steel slit check dam on the Landow torrent at the Huisun Experimental Forest Station as the prototype for comparison with a laboratory model. In the laboratory experiments, Froude number similarity was used to design the dam model. The comparisons focused on the types of sediment trapping and removal, the sediment discharge, and the trapping rate of the slit check dam. Different ways of removing the transverse beams produced different kinds of sediment removal and differences in removal rate and particle size distribution. The sediment discharge of the check dam with beams is about 40%-80% of that of the check dam without beams; furthermore, the spacing of the beams is a considerable factor in the sediment discharge. In the field experiment, time-lapse photography was used to record the adjustable steel slit check dam on the Landow torrent. Typhoon Soulik brought 600 mm of rainfall in eight hours and induced a debris flow in the Landow torrent. The time-lapse images showed that after several sediment transport events the adjustable steel slit check dam was buried by debris flow. The results of the laboratory and field experiments are: (1) the adjustable check dam could trap boulders, stop woody debris flow, and flush fine sediment out to supply the downstream river; (2) the sediment-trapping efficiency of the adjustable check dam with transverse beams was significantly improved; (3) the check dam without transverse beams can release the sediment and maintain ecosystem continuity.
Eagle, Dawn M; Noschang, Cristie; d'Angelo, Laure-Sophie Camilla; Noble, Christie A; Day, Jacob O; Dongelmans, Marie Louise; Theobald, David E; Mar, Adam C; Urcelay, Gonzalo P; Morein-Zamir, Sharon; Robbins, Trevor W
2014-05-01
Excessive checking is a common, debilitating symptom of obsessive-compulsive disorder (OCD). In an established rodent model of OCD checking behaviour, quinpirole (dopamine D2/3-receptor agonist) increased checking in open-field tests, indicating dopaminergic modulation of checking-like behaviours. We designed a novel operant paradigm for rats (observing response task (ORT)) to further examine cognitive processes underpinning checking behaviour and clarify how and why checking develops. We investigated i) how quinpirole increases checking, ii) dependence of these effects on D2/3 receptor function (following treatment with D2/3 receptor antagonist sulpiride) and iii) effects of reward uncertainty. In the ORT, rats pressed an 'observing' lever for information about the location of an 'active' lever that provided food reinforcement. High- and low-checkers (defined from baseline observing) received quinpirole (0.5mg/kg, 10 treatments) or vehicle. Parametric task manipulations assessed observing/checking under increasing task demands relating to reinforcement uncertainty (variable response requirement and active-lever location switching). Treatment with sulpiride further probed the pharmacological basis of long-term behavioural changes. Quinpirole selectively increased checking, both functional observing lever presses (OLPs) and non-functional extra OLPs (EOLPs). The increase in OLPs and EOLPs was long-lasting, without further quinpirole administration. Quinpirole did not affect the immediate ability to use information from checking. Vehicle and quinpirole-treated rats (VEH and QNP respectively) were selectively sensitive to different forms of uncertainty. Sulpiride reduced non-functional EOLPs in QNP rats but had no effect on functional OLPs. These data have implications for treatment of compulsive checking in OCD, particularly for serotonin-reuptake-inhibitor treatment-refractory cases, where supplementation with dopamine receptor antagonists may be beneficial. 
Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
Hyper-heuristic Evolution of Dispatching Rules: A Comparison of Rule Representations.
Branke, Jürgen; Hildebrandt, Torsten; Scholz-Reiter, Bernd
2015-01-01
Dispatching rules are frequently used for real-time, online scheduling in complex manufacturing systems. Design of such rules is usually done by experts in a time-consuming trial-and-error process. Recently, evolutionary algorithms have been proposed to automate the design process. There are several possibilities to represent rules for this hyper-heuristic search. Because the representation determines the search neighborhood and the complexity of the rules that can be evolved, a suitable choice of representation is key for a successful evolutionary algorithm. In this paper, we empirically compare three different representations, both numeric and symbolic, for automated rule design: a linear combination of attributes, a representation based on artificial neural networks, and a tree representation. Using appropriate evolutionary algorithms (CMA-ES for the neural network and linear representations, genetic programming for the tree representation), we empirically investigate the suitability of each representation in a dynamic stochastic job shop scenario. We also examine the robustness of the evolved dispatching rules against variations in the underlying job shop scenario, and visualize what the rules do, in order to get an intuitive understanding of their inner workings. Results indicate that the tree representation using an improved version of genetic programming gives the best results if many candidate rules can be evaluated, closely followed by the neural network representation, which already leads to good results for small to moderate computational budgets. The linear representation is found to be competitive only for extremely small computational budgets.
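The "linear combination of attributes" representation can be sketched directly: each waiting job is scored by a weighted sum of its attributes, and the job with the best (here, lowest) score is dispatched next. The attribute names and weights below are invented; in the paper the weight vector is what CMA-ES evolves.

```python
# A dispatching rule as a linear combination of job attributes.
def make_rule(weights):
    def priority(job):  # job: dict of attribute name -> value
        return sum(w * job[attr] for attr, w in weights.items())
    return priority

def dispatch(queue, rule):
    """Pick the next job to process: lowest score = highest priority."""
    return min(queue, key=rule)

rule = make_rule({"proc_time": 1.0, "due_date": 0.5, "arrival": 0.1})
queue = [
    {"id": "A", "proc_time": 5, "due_date": 20, "arrival": 0},
    {"id": "B", "proc_time": 2, "due_date": 30, "arrival": 1},
]
```

A tree representation would instead evolve the scoring expression itself (operators and operands), trading a larger search space for more expressive rules, which matches the trade-off reported above.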
78 FR 67467 - Registration of Municipal Advisors
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-12
... the Exchange Act. These rules and forms are designed to give effect to provisions of Title IX of the... ``investment strategies'' in the final rule is designed to address the main concerns raised by these commenters... state, and provide tax advantages designed to encourage saving for future college costs.\\54\\ 529 Savings...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-26
... Organizations; National Stock Exchange, Inc.; Notice of Designation of a Longer Period for Commission Action on Proposed Rule Change To Adopt a New Order Type Called the ``Auto-Ex Only'' Order March 19, 2013. On January... (``Act'') \\1\\ and Rule 19b-4 thereunder,\\2\\ a proposed rule change to adopt a new order type called the...
STS-93 MS Tognini checks the BRIC experiment petri dishes on the middeck
2013-11-18
STS093-350-008 (22-27 July 1999) --- Astronaut Michel Tognini, mission specialist representing France's Centre National d'Etudes Spatiales (CNES), checks the Biological Research in Canisters (BRIC) payload petri dishes on the middeck of the Space Shuttle Columbia. BRIC was designed to investigate the effects of space flight on small arthropod animals and plant specimens.
2011-03-28
Space suit designer Oleg Gerasimenko shares some tips on the Sokol suit with NASA astronaut Rex Walheim during a fit check at the Zvezda facility on Monday, March 28, 2011, in Moscow. The crew of the final shuttle mission traveled to Moscow for a suit fit check of their Russian Soyuz suits that will be required in the event of an emergency. ( NASA Photo / Houston Chronicle, Smiley N. Pool )
Determination of MLC model parameters for Monaco using commercial diode arrays.
Kinsella, Paul; Shields, Laura; McCavana, Patrick; McClean, Brendan; Langan, Brian
2016-07-08
Multileaf collimators (MLCs) need to be characterized accurately in treatment planning systems to facilitate accurate intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT). The aim of this study was to examine the use of MapCHECK 2 and ArcCHECK diode arrays for optimizing MLC parameters in Monaco X-ray voxel Monte Carlo (XVMC) dose calculation algorithm. A series of radiation test beams designed to evaluate MLC model parameters were delivered to MapCHECK 2, ArcCHECK, and EBT3 Gafchromic film for comparison. Initial comparison of the calculated and ArcCHECK-measured dose distributions revealed it was unclear how to change the MLC parameters to gain agreement. This ambiguity arose due to an insufficient sampling of the test field dose distributions and unexpected discrepancies in the open parts of some test fields. Consequently, the XVMC MLC parameters were optimized based on MapCHECK 2 measurements. Gafchromic EBT3 film was used to verify the accuracy of MapCHECK 2 measured dose distributions. It was found that adjustment of the MLC parameters from their default values resulted in improved global gamma analysis pass rates for MapCHECK 2 measurements versus calculated dose. The lowest pass rate of any MLC-modulated test beam improved from 68.5% to 93.5% with 3% and 2 mm gamma criteria. Given the close agreement of the optimized model to both MapCHECK 2 and film, the optimized model was used as a benchmark to highlight the relatively large discrepancies in some of the test field dose distributions found with ArcCHECK. Comparison between the optimized model-calculated dose and ArcCHECK-measured dose resulted in global gamma pass rates which ranged from 70.0%-97.9% for gamma criteria of 3% and 2 mm. The simple square fields yielded high pass rates. The lower gamma pass rates were attributed to the ArcCHECK overestimating the dose in-field for the rectangular test fields whose long axis was parallel to the long axis of the ArcCHECK. 
Considering ArcCHECK measurement issues and the lower gamma pass rates for the MLC-modulated test beams, it was concluded that MapCHECK 2 was a more suitable detector than ArcCHECK for the optimization process. © 2016 The Authors
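The gamma pass rates quoted above come from the standard gamma-index comparison of measured and calculated dose. As a sketch of the idea under a 3%/2 mm criterion, here is a simplified 1-D, global-normalization version; real tools interpolate and work in 2-D or 3-D, so this is illustrative only.

```python
# 1-D global gamma analysis: a reference point passes if some evaluated
# point is close in the combined dose-difference / distance metric.
def gamma_pass_rate(ref, evl, spacing_mm, dose_crit=0.03, dist_mm=2.0):
    d_max = max(ref)                       # global normalization dose
    passed = 0
    for i, dr in enumerate(ref):
        best = float("inf")
        for j, de in enumerate(evl):
            dd = (de - dr) / (dose_crit * d_max)
            dx = (j - i) * spacing_mm / dist_mm
            best = min(best, (dd * dd + dx * dx) ** 0.5)
        passed += best <= 1.0
    return 100.0 * passed / len(ref)
```

Identical profiles give 100% by construction; a systematic over-response in one detector (as reported for ArcCHECK in-field) drives `dd` up at every point and the pass rate down.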
Self-monitoring as a viable fading option in check-in/check-out.
Miller, Leila M; Dufrene, Brad A; Joe Olmi, D; Tingstrom, Daniel; Filce, Hollie
2015-04-01
This study systematically replaced the teacher completed Daily Behavior Report Card (DBRC) and feedback component of check-in/check-out (CICO) with self-monitoring for four elementary students referred for Tier 2 behavioral supports within School-Wide Positive Behavior Interventions and Supports (SWPBIS). An ABAB withdrawal design was used to test the effectiveness of CICO. Then, following the second B phase, teacher completion of the DBRC and corresponding feedback to students was replaced with self-monitoring. For all four participants, CICO was associated with increases in academic engagement and reductions in disruptive behavior. Moreover, students' behavioral gains were maintained when teacher completion of the DBRC was replaced with self-monitoring. Results are discussed in terms of CICO research and practice. Copyright © 2014 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
Huang, Li; Yuan, Jiamin; Yang, Zhimin; Xu, Fuping; Huang, Chunhua
2015-01-01
Background. In this study, we use association rules to explore the latent rules and patterns of prescribing and adjusting the ingredients of herbal decoctions based on an empirical herbal formula of Chinese Medicine (CM). Materials and Methods. The consideration and development of CM prescriptions based on the knowledge of CM doctors are analyzed. The study comprised three stages. The first stage was to identify the chief symptoms for a specific empirical herbal formula, which can serve as the key indications for herb addition and cancellation. The second stage was to conduct a case study on the empirical CM herbal formula for insomnia, in which doctors added extra ingredients or cancelled some of them according to the CM syndrome diagnosis. The last stage was to divide the observed cases into an effective group and an ineffective group based on the clinical effect assessed by doctors. The patterns arising during diagnosis and treatment were selected by the applied algorithm, and the relations between clinical symptoms or indications and herb-choosing principles were extracted with the association rules algorithm. Results. In total, 40 patients were observed in this study: 28 were considered effective after treatment and the remaining 12 ineffective. 206 patterns related to clinical indications of Chinese Medicine were checked and screened against each observed case. In the analysis of the effective group, we used the association rules algorithm to select combinations between 28 herbal adjustment strategies of the empirical herbal formula and the 190 patterns of individual clinical manifestations. During this stage, 11 common patterns were eliminated and 5 major symptoms for insomnia remained. 12 association rules that included 5 herbal adjustment strategies were identified. Conclusion. The association rules method is an effective algorithm for exploring the latent relations between clinical indications and herbal adjustment strategies in studies of empirical herbal formulas.
PMID:26495415
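The support/confidence mining described in the abstract above can be illustrated with a brute-force sketch. The function name, thresholds, and item labels below are hypothetical; real studies of this size would typically use Apriori or FP-growth implementations, but the rule-selection criterion is the same.

```python
from itertools import combinations

def mine_rules(transactions, min_support=0.3, min_confidence=0.7):
    """Naive association-rule mining: count itemset support by brute
    force, then keep rules X -> Y whose support and confidence clear
    the thresholds. Exponential in the item count, so suitable only
    for small symptom/herb datasets."""
    n = len(transactions)
    def support(itemset):
        return sum(itemset <= t for t in transactions) / n
    items = set().union(*transactions)
    rules = []
    for size in range(2, len(items) + 1):
        for itemset in combinations(sorted(items), size):
            s = support(set(itemset))
            if s < min_support:
                continue                      # prune infrequent itemsets
            for k in range(1, size):
                for lhs in combinations(itemset, k):
                    lhs, rhs = set(lhs), set(itemset) - set(lhs)
                    conf = s / support(lhs)   # P(rhs | lhs)
                    if conf >= min_confidence:
                        rules.append((lhs, rhs, s, conf))
    return rules
```

For example, if an herb co-occurs with a symptom in every case where the herb appears, the rule {herb} -> {symptom} has confidence 1.0 and is retained.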
A Semiautomated Journal Check-In and Binding System; or Variations on a Common Theme
Livingston, Frances G.
1967-01-01
The journal check-in project described here, though based on a computerized system, uses only unit-record equipment and is designed for the medium-sized library. The frequency codes used are based on the date printed on the journal rather than on the expected date of receipt, which allows for more stability in the coding scheme. The journal's volume number and issue number, which in other systems are usually predetermined by a computer, are inserted at the time of check-in. Routine claiming of overdue issues and a systematic binding schedule have also been developed as by-products. PMID:6041836
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-24
... Rule 104 To Adopt Pricing Obligations for Designated Market Makers September 20, 2010. Pursuant to... of the Proposed Rule Change The Exchange proposes to amend Rule 104 to adopt pricing obligations for.... Purpose The Exchange proposes to amend Rule 104 to adopt pricing obligations for DMMs. Under the proposal...
Foundations of the Bandera Abstraction Tools
NASA Technical Reports Server (NTRS)
Hatcliff, John; Dwyer, Matthew B.; Pasareanu, Corina S.; Robby
2003-01-01
Current research is demonstrating that model-checking and other forms of automated finite-state verification can be effective for checking properties of software systems. Due to the exponential costs associated with model-checking, multiple forms of abstraction are often necessary to obtain system models that are tractable for automated checking. The Bandera Tool Set provides multiple forms of automated support for compiling concurrent Java software systems to models that can be supplied to several different model-checking tools. In this paper, we describe the foundations of Bandera's data abstraction mechanism which is used to reduce the cardinality (and the program's state-space) of data domains in software to be model-checked. From a technical standpoint, the form of data abstraction used in Bandera is simple, and it is based on classical presentations of abstract interpretation. We describe the mechanisms that Bandera provides for declaring abstractions, for attaching abstractions to programs, and for generating abstracted programs and properties. The contributions of this work are the design and implementation of various forms of tool support required for effective application of data abstraction to software components written in a programming language like Java which has a rich set of linguistic features.
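The classical data abstraction the Bandera abstract refers to can be illustrated with the textbook sign abstraction: concrete integers are mapped to a small abstract domain so that the model checker explores finitely many abstract values instead of the full integer range. The names below are ours, not Bandera's declaration syntax.

```python
# Classical sign abstraction from abstract interpretation: concrete
# integers map to {NEG, ZERO, POS, TOP}, and arithmetic is lifted so
# that abstract execution over-approximates every concrete run.
NEG, ZERO, POS, TOP = "NEG", "ZERO", "POS", "TOP"

def alpha(n):
    """Abstraction function: concrete int -> abstract sign."""
    return NEG if n < 0 else ZERO if n == 0 else POS

def abs_add(a, b):
    """Abstract addition; TOP means 'any sign' (precision is lost)."""
    if ZERO in (a, b):
        return b if a == ZERO else a   # adding zero preserves the sign
    if a == b and a in (NEG, POS):
        return a                       # neg+neg=neg, pos+pos=pos
    return TOP                         # mixed signs: result unknown
```

The soundness requirement is that `abs_add(alpha(m), alpha(n))` always describes `m + n`; when the signs conflict the abstraction must answer TOP rather than guess, which is exactly the source of the over-approximation that keeps model checking tractable.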
DOT National Transportation Integrated Search
2008-11-01
The Texas Department of Transportation (TxDOT) uses the modified triaxial design procedure to check : pavement designs from the flexible pavement system program. Since its original development more than : 50 years ago, little modification has been ma...
Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding
NASA Astrophysics Data System (ADS)
Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.
2016-03-01
In this paper, we investigate the joint design of quasi-cyclic low-density parity-check (QC-LDPC) codes for a coded cooperation system with joint iterative decoding at the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in the QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles of both type I and type II are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation effectively combines the cooperation gain and channel coding gain, and outperforms coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than that of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.
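The girth-4 condition the abstract above cancels can be checked directly on a parity-check matrix: a length-4 cycle in the Tanner graph exists exactly when two rows of H share ones in two or more columns. The function below is a generic sketch of that standard test, not the paper's type-I/type-II construction.

```python
import numpy as np

def has_girth4_cycle(H):
    """A binary parity-check matrix H contains a length-4 cycle in its
    Tanner graph iff some pair of distinct rows has ones in two (or
    more) common columns, i.e. an off-diagonal entry of H @ H.T >= 2."""
    overlap = H @ H.T              # (i, j) entry counts shared columns
    np.fill_diagonal(overlap, 0)   # ignore each row's self-overlap
    return bool((overlap >= 2).any())
```

A jointly designed equivalent parity-check matrix, as described in the abstract, would make this test return False.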
Assessment of Petrological Microscopes.
ERIC Educational Resources Information Center
Mathison, Charter Innes
1990-01-01
Presented is a set of procedures designed to check the design, ergonomics, illumination, function, optics, accessory equipment, and image quality of a microscope being considered for purchase. Functions for use in a petrology or mineralogy laboratory are stressed. (CW)
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-27
... Proposed Rule Change to Offer Risk Management Tools Designed to Allow Member Organizations to Monitor and... of the Proposed Rule Change The Exchange proposes to offer risk management tools designed to allow... risk management tools designed to allow member organizations to monitor and address exposure to risk...
77 FR 21161 - National Forest System Land Management Planning
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-09
... ecosystem services and multiple uses. The planning rule is designed to ensure that plans provide for the... adaptive and science-based, engages the public, and is designed to be efficient, effective, and within the..., the new rule is designed to make planning more efficient and effective. Purpose and Need for the New...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-21
... Habitat for Ivesia webberi (Webber's ivesia) AGENCY: Fish and Wildlife Service, Interior. ACTION: Proposed... dates published in the August 2, 2013, proposed rule to designate critical habitat for Ivesia webberi... rule to designate critical habitat for Ivesia webberi, we included the wrong date for the public...
Optimal Test Design with Rule-Based Item Generation
ERIC Educational Resources Information Center
Geerlings, Hanneke; van der Linden, Wim J.; Glas, Cees A. W.
2013-01-01
Optimal test-design methods are applied to rule-based item generation. Three different cases of automated test design are presented: (a) test assembly from a pool of pregenerated, calibrated items; (b) test generation on the fly from a pool of calibrated item families; and (c) test generation on the fly directly from calibrated features defining…
Design space exploration for early identification of yield limiting patterns
NASA Astrophysics Data System (ADS)
Li, Helen; Zou, Elain; Lee, Robben; Hong, Sid; Liu, Square; Wang, JinYan; Du, Chunshan; Zhang, Recco; Madkour, Kareem; Ali, Hussein; Hsu, Danny; Kabeel, Aliaa; ElManhawy, Wael; Kwan, Joe
2016-03-01
In order to resolve the causality dilemma of which comes first, accurate design rules or real designs, this paper presents a flow for exploring the layout design space to identify, early on, problematic patterns that will negatively affect yield. A new random layout generation method called Layout Schema Generator (LSG) is reported; it generates realistic, design-like layouts without any design rule violations. Lithography simulation is then used on the generated layouts to discover potentially problematic patterns (hotspots). These hotspot patterns are further explored by randomly inducing feature and context variations through a flow called the Hotspot Variation Flow (HSV). Simulation is then performed on this expanded set of layout clips to identify further problematic patterns. These patterns are finally classified into forbidden patterns, which should be included in the design rule checker, and legal patterns, which need better handling in the RET recipes and processes.
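Once forbidden patterns are known, a design rule checker of the kind mentioned above must locate them in candidate layouts. The sketch below shows pattern matching on a binary layout grid; the representation (a 0/1 occupancy array) and function name are our simplifying assumptions, as production DRC tools work on polygon databases.

```python
import numpy as np

def find_forbidden(layout, pattern):
    """Scan a binary layout grid for occurrences of a forbidden
    pattern (a small binary array), as a pattern-based design rule
    checker would. Returns the top-left coordinate of every match."""
    ph, pw = pattern.shape
    lh, lw = layout.shape
    hits = []
    for i in range(lh - ph + 1):
        for j in range(lw - pw + 1):
            if np.array_equal(layout[i:i + ph, j:j + pw], pattern):
                hits.append((i, j))
    return hits
```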
Failure detection system design methodology. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Chow, E. Y.
1980-01-01
The design of a failure detection and identification (FDI) system consists of designing a robust residual generation process and a high-performance decision making process. The designs of these two processes are examined separately. Residual generation is based on analytical redundancy. Redundancy relations that are insensitive to modelling errors and noise effects are important for designing robust residual generation processes. The characterization of the concept of analytical redundancy in terms of a generalized parity space provides a framework in which a systematic approach to the determination of robust redundancy relations is developed. The Bayesian approach is adopted for the design of high-performance decision processes. The FDI decision problem is formulated as a Bayes sequential decision problem. Since the optimal decision rule is incomputable, a methodology for designing suboptimal rules is proposed. A numerical algorithm is developed to facilitate the design and performance evaluation of suboptimal rules.
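The simplest instance of the residual-generation-plus-decision structure described above is direct hardware redundancy: two sensors measuring the same quantity yield a residual that is near zero when both are healthy, and a threshold rule makes the detection decision. This toy example is ours and stands in for, rather than reproduces, the thesis's generalized parity space and Bayes sequential rules.

```python
def residual(y1, y2):
    """Residual for two redundant sensors measuring the same quantity:
    zero (up to noise) when both sensors are healthy."""
    return y1 - y2

def detect_failure(y1, y2, threshold=0.5):
    """Threshold decision rule: declare a failure when the residual
    magnitude exceeds a threshold calibrated to the sensor noise."""
    return abs(residual(y1, y2)) > threshold
```

The robustness concern in the abstract maps onto this sketch directly: modelling errors and noise inflate the residual even without a failure, so the threshold (or a more refined sequential rule) must trade missed detections against false alarms.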
Fault-tolerant computer study. [logic designs for building block circuits
NASA Technical Reports Server (NTRS)
Rennels, D. A.; Avizienis, A. A.; Ercegovac, M. D.
1981-01-01
A set of building block circuits is described which can be used with commercially available microprocessors and memories to implement fault-tolerant distributed computer systems. Each building block circuit is intended for VLSI implementation as a single chip. Several building blocks and associated processor and memory chips form a self-checking computer module with self-contained input/output and interfaces to redundant communication buses. Fault tolerance is achieved by connecting self-checking computer modules into a redundant network in which backup buses and computer modules are provided to circumvent failures. The requirements and design methodology which led to the definition of the building block circuits are discussed.
Design and evaluation of a simple signaling device for live traps
Benevides, F.L.; Hansen, H.; Hess, S.C.
2008-01-01
Frequent checks of live traps require enormous amounts of labor and add human scents associated with repeated monitoring, which may reduce capture efficiency. To reduce effort and increase efficiency, we developed a trap-signaling device with long-distance reception, durability in adverse weather, and ease of transport, deployment, and use. Modifications from previous designs include a normally open magnetic switch and a mounting configuration that maximizes reception. The system weighed <225 g, was effective at distances up to 17.1 km, and failed in <1% of trap-nights. Employing this system, researchers and wildlife managers may reduce the effort required to check traps while improving the welfare of trapped animals.
Wils, Julien; Fonfrède, Michèle; Augereau, Christine; Watine, Joseph
2014-01-01
Several tools are available to help evaluate the quality of clinical practice guidelines (CPG). The AGREE instrument (Appraisal of Guidelines for Research & Evaluation) is the most consensual tool, but it was designed to assess CPG methodology only. The European Federation of Laboratory Medicine (EFLM) recently designed a check-list dedicated to laboratory medicine, which is intended to be comprehensive and therefore makes it possible to evaluate more thoroughly the quality of CPG in laboratory medicine. In the present work we test the comprehensiveness of this check-list on a sample of CPG written in French and published in Annales de biologie clinique (ABC). We show that some work remains to be done before a truly comprehensive check-list is designed. We also show that there is room for improvement in the CPG published in ABC, for example regarding the fact that some of these CPG do not provide any information about allowed durations of transport and storage of biological samples before analysis, about standards of minimal analytical performance, or about the sensitivities and specificities of the recommended tests.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-22
... visitors with access to restricted areas or critical assets, including, (i) Measures designed to verify and validate identity; (ii) Measures designed to check criminal history; (iii) Measures designed to verify and validate legal authorization to work; and (iv) Measures designed to identify people with terrorist ties...