Complexity, Training Paradigm Design, and the Contribution of Memory Subsystems to Grammar Learning
Ettlinger, Marc; Wong, Patrick C. M.
2016-01-01
Although there is variability in nonnative grammar learning outcomes, the contributions of training paradigm design and memory subsystems are not well understood. To examine this, we presented learners with an artificial grammar that formed words via simple and complex morphophonological rules. Across three experiments, we manipulated training paradigm design and measured subjects' declarative, procedural, and working memory subsystems. Experiment 1 demonstrated that passive, exposure-based training boosted learning of both simple and complex grammatical rules, relative to no training. Additionally, procedural memory correlated with simple rule learning, whereas declarative memory correlated with complex rule learning. Experiment 2 showed that presenting corrective feedback during the test phase did not improve learning. Experiment 3 revealed that structuring the order of training so that subjects are first exposed to the simple rule and then the complex rule improved learning. The cumulative findings shed light on the contributions of grammatical complexity, training paradigm design, and domain-general memory subsystems in determining grammar learning success. PMID:27391085
Portable design rules for bulk CMOS
NASA Technical Reports Server (NTRS)
Griswold, T. W.
1982-01-01
It is pointed out that for the past several years, one school of IC designers has used a simplified set of nMOS geometric design rules (GDR) which is 'portable', in that it can be used by many different nMOS manufacturers. The present investigation is concerned with a preliminary set of design rules for bulk CMOS which has been verified for simple test structures. The GDR are defined in terms of Caltech Intermediate Form (CIF), which is a geometry-description language that defines simple geometrical objects in layers. The layers are abstractions of physical mask layers. The design rules do not presume the existence of any particular design methodology. Attention is given to p-well and n-well CMOS processes, bulk CMOS and CMOS-SOS, CMOS geometric rules, and a description of the advantages of CMOS technology.
A novel methodology for building robust design rules by using design based metrology (DBM)
NASA Astrophysics Data System (ADS)
Lee, Myeongdong; Choi, Seiryung; Choi, Jinwoo; Kim, Jeahyun; Sung, Hyunju; Yeo, Hyunyoung; Shim, Myoungseob; Jin, Gyoyoung; Chung, Eunseung; Roh, Yonghan
2013-03-01
This paper addresses a methodology for building robust design rules by using design based metrology (DBM). The conventional method for building design rules has relied on a simulation tool and a simple-pattern spider mask. At the early stage of a device, the accuracy of the simulation tool is poor, and the evaluation of the simple-pattern spider mask is rather subjective because it depends on the experiential judgment of an engineer. In this work, we designed a huge number of pattern situations including various 1D and 2D design structures. To overcome the difficulty of inspecting many types of patterns, we introduced the Design Based Metrology (DBM) of Nano Geometry Research, Inc., with which these mass patterns could be inspected at high speed. We also carried out quantitative analysis on PWQ silicon data to estimate process variability. Our methodology demonstrates high speed and accuracy for building design rules. All of the test patterns were inspected within a few hours, and the mass silicon data were handled by statistical processing rather than personal judgment. From the results, robust design rules were successfully verified and extracted. We conclude that our methodology is appropriate for building robust design rules.
Bayesian design of decision rules for failure detection
NASA Technical Reports Server (NTRS)
Chow, E. Y.; Willsky, A. S.
1984-01-01
The formulation of the decision making process of a failure detection algorithm as a Bayes sequential decision problem provides a simple conceptualization of the decision rule design problem. As the optimal Bayes rule is not computable, a methodology that is based on the Bayesian approach and aimed at a reduced computational requirement is developed for designing suboptimal rules. A numerical algorithm is constructed to facilitate the design and performance evaluation of these suboptimal rules. The result of applying this design methodology to an example shows that this approach is potentially a useful one.
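The abstract does not detail its suboptimal rules; as a hedged sketch of one classical sequential rule in this spirit, a monitor can recursively update the posterior probability of the failed state from observation likelihoods and raise an alarm once a threshold is crossed. The likelihood values and threshold below are illustrative assumptions, not the paper's design.

```python
def sequential_failure_monitor(observations, p_fail_prior, lik_fail, lik_ok, threshold):
    """Declare a failure once the posterior probability of the failed
    state exceeds `threshold`; otherwise keep sampling."""
    p = p_fail_prior
    for t, y in enumerate(observations):
        num = lik_fail(y) * p
        den = num + lik_ok(y) * (1.0 - p)
        p = num / den                      # Bayes update of P(failure | data so far)
        if p >= threshold:
            return t, p                    # stop: alarm at sample t
    return None, p                         # no alarm on this record

# Toy example: a binary sensor that reads 1 far more often after a failure.
lik_fail = lambda y: 0.9 if y == 1 else 0.1
lik_ok   = lambda y: 0.2 if y == 1 else 0.8
alarm_at, post = sequential_failure_monitor([1, 1, 1, 1], 0.05, lik_fail, lik_ok, 0.95)
```

The threshold trades detection delay against false-alarm rate, which is exactly the cost balance a Bayes sequential formulation makes explicit.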
Using pattern enumeration to accelerate process development and ramp yield
NASA Astrophysics Data System (ADS)
Zhuang, Linda; Pang, Jenny; Xu, Jessy; Tsai, Mengfeng; Wang, Amy; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh; Ding, Hua
2016-03-01
During the setup phase of a new technology node, foundries do not initially have enough product chip designs to conduct exhaustive process development. Different operational teams use manually designed simple test keys to set up their process flows and recipes. When the first version of the design rule manual (DRM) is ready, foundries enter the process development phase, where new experimental design data are manually created based on these design rules. However, these IP/test keys contain very uniform or simple design structures. Such designs normally do not contain critical design structures or process-unfriendly design patterns that pass design rule checks but prove less manufacturable. A method is therefore desired that generates, at the development stage, exhaustive test patterns allowed by the design rules, in order to verify the gap between design rules and process. This paper presents a novel method for generating test key patterns that contain known problematic patterns as well as any constructs that designers could possibly draw under the current design rules. The enumerated test key patterns contain the most critical design structures allowed by any particular design rule. A layout profiling method is used to analyze new incoming product chips and find potential weak points, so the fab can take preemptive action to avoid yield loss. This is achieved by comparing different products and leveraging knowledge learned from previously manufactured chips to find possible yield detractors.
An automated approach to the design of decision tree classifiers
NASA Technical Reports Server (NTRS)
Argentiero, P.; Chin, R.; Beaudet, P.
1982-01-01
An automated technique is presented for designing effective decision tree classifiers predicated only on a priori class statistics. The procedure relies on linear feature extractions and Bayes table look-up decision rules. Associated error matrices are computed and utilized to provide an optimal design of the decision tree at each so-called 'node'. A by-product of this procedure is a simple algorithm for computing the global probability of correct classification assuming the statistical independence of the decision rules. Attention is given to a more precise definition of decision tree classification, the mathematical details on the technique for automated decision tree design, and an example of a simple application of the procedure using class statistics acquired from an actual Landsat scene.
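The by-product mentioned above, the global probability of correct classification under statistically independent node decisions, can be sketched as a prior-weighted product of per-node correct-decision probabilities along each class's path through the tree. The tree structure and probabilities below are hypothetical, not taken from the paper.

```python
def global_p_correct(priors, node_paths, p_node_correct):
    """P(correct) = sum over classes c of prior(c) times the product of
    per-node correct-routing probabilities on c's path to its leaf,
    assuming the node decision rules err independently."""
    total = 0.0
    for cls, prior in priors.items():
        p_path = 1.0
        for node in node_paths[cls]:
            p_path *= p_node_correct[node][cls]   # P(node routes class c correctly)
        total += prior * p_path
    return total

# Hypothetical 3-class tree: the root separates {A} from {B, C}; node n1 separates B from C.
priors = {"A": 0.5, "B": 0.3, "C": 0.2}
node_paths = {"A": ["root"], "B": ["root", "n1"], "C": ["root", "n1"]}
p_node_correct = {"root": {"A": 0.95, "B": 0.90, "C": 0.90},
                  "n1":   {"B": 0.85, "C": 0.80}}
p = global_p_correct(priors, node_paths, p_node_correct)
```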
Nested subcritical flows within supercritical systems
NASA Technical Reports Server (NTRS)
Hendricks, R. C.; Braun, M. J.; Wheeler, R. L., III; Mullen, R. L.
1985-01-01
In supercritical systems the design inlet and outlet pressures are maintained above the thermodynamic critical pressure Pc. Designers rely on this simple rule of thumb to circumvent problems associated with a subcritical pressure regime nested within the supercritical pressure system, along with the uncertainties in heat transfer, fluid mechanics, and thermophysical property variations. The simple rule of thumb is adequate in many low-power designs but is inadequate for high-performance turbomachines and linear systems, where nested two-phase regions can exist. Examples for a free-jet expansion with backpressure greater than Pc and a rotor (bearing) with ambient pressure greater than Pc illustrate the existence of subcritical pressure regimes nested within supercritical systems.
Using a Simple Contest to Illustrate Mechanism Design
ERIC Educational Resources Information Center
Blackwell, Calvin
2011-01-01
This article describes a simple classroom activity that illustrates how economic theory can be used for mechanism design. The rules for a set of contests are presented; the results typically obtained from these contests illustrate how the prize structure can be manipulated in order to produce a particular outcome. Specifically, this activity is…
Evaluation of IOTA Simple Ultrasound Rules to Distinguish Benign and Malignant Ovarian Tumours.
Garg, Sugandha; Kaur, Amarjit; Mohi, Jaswinder Kaur; Sibia, Preet Kanwal; Kaur, Navkiran
2017-08-01
IOTA stands for the International Ovarian Tumour Analysis group. Ovarian cancer is one of the common cancers in women and is diagnosed at a late stage in the majority of cases. The limiting factor for early diagnosis is the lack of standardised terms and procedures in gynaecological sonography. The introduction of the IOTA rules has provided some consistency in defining the morphological features of ovarian masses through a standardised examination technique. The aim was to evaluate the efficacy of the IOTA simple ultrasound rules in distinguishing benign and malignant ovarian tumours and to establish their use as a tool in the early diagnosis of ovarian malignancy. A hospital-based, case-control, prospective study was conducted. Patients with suspected ovarian pathology were evaluated using the IOTA ultrasound rules and designated as benign or malignant. Findings were correlated with histopathological findings. The collected data were statistically analysed using the chi-square test and the kappa statistic. Of the initial 55 patients, 50 who underwent surgery were included in the final analysis. The IOTA simple rules were applicable in 45 of these 50 patients (90%). In the cases where the IOTA simple rules were applicable, the sensitivity for the detection of malignancy was 91.66%, the specificity 84.84%, and the accuracy 86.66%. Classifying inconclusive cases as malignant, the sensitivity and specificity were 93% and 80% respectively. Agreement was found between ultrasound and histopathological diagnosis, with a kappa value of 0.323. The IOTA simple ultrasound rules were highly sensitive and specific in predicting ovarian malignancy preoperatively, while being reproducible and easy to train in and use.
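The reported figures are mutually consistent: with the rules applicable in 45 masses, a sensitivity of 91.66%, specificity of 84.84%, and accuracy of 86.66% match (to rounding) a 2x2 table of 11 true positives, 1 false negative, 28 true negatives, and 5 false positives. Those counts are an inference from the rounded percentages, not values reported in the abstract. A quick check:

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Standard confusion-matrix summaries, returned as percentages."""
    sensitivity = 100.0 * tp / (tp + fn)            # true positive rate
    specificity = 100.0 * tn / (tn + fp)            # true negative rate
    accuracy = 100.0 * (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# Inferred counts consistent with the abstract's rounded percentages.
sens, spec, acc = diagnostic_metrics(tp=11, fn=1, tn=28, fp=5)
```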
A Simple Model of Circuit Design.
1980-05-01
mathematicians who discover mathematical ideas <Lenat>, programmers who write code <Manna> <Barstow>, physicists who solve mechanics problems <de Kleer> ... rules and shows how they result in the design of circuits. The design rules must not only capture the purely mathematical constraints given by KVL and KCL, but also how those constraints can implement mechanism. Mathematical constraints tell us an amplifier's input and output voltages
Design and Analysis of Complex D-Regions in Reinforced Concrete Structures
ERIC Educational Resources Information Center
Yindeesuk, Sukit
2009-01-01
STM design provisions, such as those in Appendix A of ACI318-08, consist of rules for evaluating the capacity of the load-resisting truss that is idealized to carry the forces through the D-Region. These code rules were primarily derived from test data on simple D-Regions such as deep beams and corbels. However, these STM provisions are taken as…
Development of Design Rules for Reliable Antisense RNA Behavior in E. coli.
Hoynes-O'Connor, Allison; Moon, Tae Seok
2016-12-16
A key driver of synthetic biology is the development of designable genetic parts with predictable behaviors that can be quickly implemented in complex genetic systems. However, the intrinsic complexity of gene regulation can make the rational design of genetic parts challenging. This challenge is apparent in the design of antisense RNA (asRNA) regulators. Though asRNAs are well-known regulators, the literature governing their design is conflicting and leaves the synthetic biology community without clear asRNA design rules. The goal of this study is to perform a comprehensive experimental characterization and statistical analysis of 121 unique asRNA regulators in order to resolve the conflicts that currently exist in the literature. asRNAs usually consist of two regions, the Hfq binding site and the target binding region (TBR). First, the behaviors of several high-performing Hfq binding sites were compared, in terms of their ability to improve repression efficiencies and their orthogonality. Next, a large-scale analysis of TBR design parameters identified asRNA length, the thermodynamics of asRNA-mRNA complex formation, and the percent of target mismatch as key parameters for TBR design. These parameters were used to develop simple asRNA design rules. Finally, these design rules were applied to construct both a simple and a complex genetic circuit containing different asRNAs, and predictable behavior was observed in both circuits. The results presented in this study will drive synthetic biology forward by providing useful design guidelines for the construction of asRNA regulators with predictable behaviors.
A CLIPS-based tool for aircraft pilot-vehicle interface design
NASA Technical Reports Server (NTRS)
Fowler, Thomas D.; Rogers, Steven P.
1991-01-01
The Pilot-Vehicle Interface of modern aircraft is the cognitive, sensory, and psychomotor link between the pilot, the avionics modules, and all other systems on board the aircraft. To assist pilot-vehicle interface designers, a C Language Integrated Production System (CLIPS) based tool was developed that allows design information to be stored in a table that can be modified by rules representing design knowledge. Developed for the Apple Macintosh, the tool allows users without any CLIPS programming experience to form simple rules using a point and click interface.
Rule Systems for Runtime Verification: A Short Tutorial
NASA Astrophysics Data System (ADS)
Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex
In this tutorial, we introduce two rule-based systems for on- and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification while still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple, very user-friendly temporal logic. It was developed in Python, specifically to support testing of spacecraft flight software for NASA's 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to the analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence this approach adds no instrumentation overhead. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
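As a toy illustration of the conditional rule-based idea (not RuleR's actual syntax or algorithm), a monitor can maintain a set of outstanding obligations that events add or discharge as a trace is scanned; anything still outstanding at the end of the trace is a violation. The rule table and event names below are hypothetical.

```python
def monitor(trace, rules):
    """Tiny conditional rule-based trace checker. `rules` maps an event name
    to (obligations_added, obligations_discharged); a violation is any
    obligation still outstanding when the trace ends."""
    outstanding = set()
    for event in trace:
        added, discharged = rules.get(event, ((), ()))
        outstanding.difference_update(discharged)   # discharge satisfied obligations
        outstanding.update(added)                   # activate new obligations
    return outstanding                              # empty set: trace satisfies the rules

# Hypothetical property: every "open" must eventually be followed by "close".
rules = {"open": (("must_close",), ()), "close": ((), ("must_close",))}
bad = monitor(["open", "read", "open"], rules)      # obligation left outstanding
ok  = monitor(["open", "read", "close"], rules)     # all obligations discharged
```

Real systems like RuleR generalize this with parameterized rule instances and compiled temporal-logic front ends; the sketch only shows the fire-and-retract core.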
The Box-and-Dot Method: A Simple Strategy for Counting Significant Figures
NASA Astrophysics Data System (ADS)
Stephenson, W. Kirk
2009-08-01
A visual method for counting significant digits is presented. This easy-to-learn (and easy-to-teach) method, designated the box-and-dot method, uses the device of "boxing" significant figures based on two simple rules, then counting the number of digits in the boxes.
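The abstract does not reproduce the two rules themselves; as a hedged sketch, the conventional significant-figure conventions such a method teaches can be expressed programmatically (treating trailing zeros without a decimal point as not significant, the usual textbook convention):

```python
def count_sig_figs(num: str) -> int:
    """Count significant figures in a decimal numeral given as a string."""
    s = num.strip().lstrip("+-").lower().split("e")[0]  # drop sign and exponent part
    digits = s.replace(".", "")
    digits = digits.lstrip("0")          # leading zeros are never significant
    if "." not in s:
        digits = digits.rstrip("0")      # trailing zeros without a decimal point: not counted
    return len(digits) if digits else 1  # treat a bare "0" as one significant figure

count_sig_figs("0.00340")   # 3: the 3, 4, and trailing 0 after the decimal point
count_sig_figs("1200")      # 2: trailing zeros without a decimal point are ambiguous
```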
Self-assembly of Archimedean tilings with enthalpically and entropically patchy polygons.
Millan, Jaime A; Ortiz, Daniel; van Anders, Greg; Glotzer, Sharon C
2014-03-25
Considerable progress in the synthesis of anisotropic patchy nanoplates (nanoplatelets) promises a rich variety of highly ordered two-dimensional superlattices. Recent experiments of superlattices assembled from nanoplates confirm the accessibility of exotic phases and motivate the need for a better understanding of the underlying self-assembly mechanisms. Here, we present experimentally accessible, rational design rules for the self-assembly of the Archimedean tilings from polygonal nanoplates. The Archimedean tilings represent a model set of target patterns that (i) contain both simple and complex patterns, (ii) are comprised of simple regular shapes, and (iii) contain patterns with potentially interesting materials properties. Via Monte Carlo simulations, we propose a set of design rules with general applicability to one- and two-component systems of polygons. These design rules, specified by increasing levels of patchiness, correspond to a reduced set of anisotropy dimensions for robust self-assembly of the Archimedean tilings. We show for which tilings entropic patches alone are sufficient for assembly and when short-range enthalpic interactions are required. For the latter, we show how patchy these interactions should be for optimal yield. This study provides a minimal set of guidelines for the design of anisotropic patchy particles that can self-assemble all 11 Archimedean tilings.
Ameye, Lieveke; Fischerova, Daniela; Epstein, Elisabeth; Melis, Gian Benedetto; Guerriero, Stefano; Van Holsbeke, Caroline; Savelli, Luca; Fruscio, Robert; Lissoni, Andrea Alberto; Testa, Antonia Carla; Veldman, Joan; Vergote, Ignace; Van Huffel, Sabine; Bourne, Tom; Valentin, Lil
2010-01-01
Objectives To prospectively assess the diagnostic performance of simple ultrasound rules to predict benignity/malignancy in an adnexal mass and to test the performance of the risk of malignancy index, two logistic regression models, and subjective assessment of ultrasonic findings by an experienced ultrasound examiner in adnexal masses for which the simple rules yield an inconclusive result. Design Prospective temporal and external validation of simple ultrasound rules to distinguish benign from malignant adnexal masses. The rules comprised five ultrasonic features (including shape, size, solidity, and results of colour Doppler examination) to predict a malignant tumour (M features) and five to predict a benign tumour (B features). If one or more M features were present in the absence of a B feature, the mass was classified as malignant. If one or more B features were present in the absence of an M feature, it was classified as benign. If both M features and B features were present, or if none of the features was present, the simple rules were inconclusive. Setting 19 ultrasound centres in eight countries. Participants 1938 women with an adnexal mass examined with ultrasound by the principal investigator at each centre with a standardised research protocol. Reference standard Histological classification of the excised adnexal mass as benign or malignant. Main outcome measures Diagnostic sensitivity and specificity. Results Of the 1938 patients with an adnexal mass, 1396 (72%) had benign tumours, 373 (19.2%) had primary invasive tumours, 111 (5.7%) had borderline malignant tumours, and 58 (3%) had metastatic tumours in the ovary. The simple rules yielded a conclusive result in 1501 (77%) masses, for which they resulted in a sensitivity of 92% (95% confidence interval 89% to 94%) and a specificity of 96% (94% to 97%). The corresponding sensitivity and specificity of subjective assessment were 91% (88% to 94%) and 96% (94% to 97%). 
In the 357 masses for which the simple rules yielded an inconclusive result and with available results of CA-125 measurements, the sensitivities were 89% (83% to 93%) for subjective assessment, 50% (42% to 58%) for the risk of malignancy index, 89% (83% to 93%) for logistic regression model 1, and 82% (75% to 87%) for logistic regression model 2; the corresponding specificities were 78% (72% to 83%), 84% (78% to 88%), 44% (38% to 51%), and 48% (42% to 55%). Use of the simple rules as a triage test and subjective assessment for those masses for which the simple rules yielded an inconclusive result gave a sensitivity of 91% (88% to 93%) and a specificity of 93% (91% to 94%), compared with a sensitivity of 90% (88% to 93%) and a specificity of 93% (91% to 94%) when subjective assessment was used in all masses. Conclusions The use of the simple rules has the potential to improve the management of women with adnexal masses. In adnexal masses for which the rules yielded an inconclusive result, subjective assessment of ultrasonic findings by an experienced ultrasound examiner was the most accurate diagnostic test; the risk of malignancy index and the two regression models were not useful. PMID:21156740
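The triage logic described above (M features only, B features only, or inconclusive) is directly mechanizable. A minimal sketch, where the feature counts are inputs rather than the actual IOTA B/M feature definitions:

```python
def iota_simple_rules(m_features_present: int, b_features_present: int) -> str:
    """Classify an adnexal mass by the simple-rules triage logic:
    one or more M features with no B feature -> malignant;
    one or more B features with no M feature -> benign;
    both present, or neither present -> inconclusive
    (refer for subjective expert assessment)."""
    if m_features_present > 0 and b_features_present == 0:
        return "malignant"
    if b_features_present > 0 and m_features_present == 0:
        return "benign"
    return "inconclusive"

iota_simple_rules(2, 0)   # "malignant"
iota_simple_rules(1, 1)   # "inconclusive": conflicting features
```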
Seismic Safety Of Simple Masonry Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guadagnuolo, Mariateresa; Faella, Giuseppe
2008-07-08
Several masonry buildings comply with the rules for simple buildings provided by seismic codes. For these buildings, explicit safety verifications are not compulsory if specific code rules are fulfilled: their fulfilment is assumed to ensure suitable seismic behaviour and thus adequate safety under earthquakes. Italian and European seismic codes differ in their requirements for simple masonry buildings, mostly concerning the building typology, the building geometry, and the acceleration at the site. Obviously, a wide percentage of buildings deemed simple by the codes should satisfy the numerical safety verification, so that designers who must use the codes are not left with confusion and uncertainty. This paper aims at evaluating the seismic response of some simple unreinforced masonry buildings that comply with the provisions of the new Italian seismic code. Two-story buildings with different geometries are analysed, and results from nonlinear static analyses performed by varying the acceleration at the site are presented and discussed. Indications on the congruence between the code rules and the results of numerical analyses performed according to the code itself are supplied; in this context, the results can contribute to improving the seismic code requirements.
Two-phase flows within systems with ambient pressure
NASA Technical Reports Server (NTRS)
Hendricks, R. C.; Braun, M. J.; Wheeler, R. L., III; Mullen, R. L.
1985-01-01
In systems where the design inlet and outlet pressures are maintained above the thermodynamic critical pressure, it is often assumed that two phase flows within the system cannot occur. Designers rely on this simple rule of thumb to circumvent problems associated with a highly compressible two phase flow occurring within the supercritical pressure system along with the uncertainties in rotordynamics, load capacity, heat transfer, fluid mechanics, and thermophysical property variations. The simple rule of thumb is adequate in many low power designs but is inadequate for high performance turbomachines and linear systems, where two phase regions can exist even though outlet pressure is greater than critical pressure. Rotordynamic-fluid-mechanic restoring forces depend on momentum differences, and those for a two phase zone can differ significantly from those for a single-phase zone. Using the Reynolds equation the angular velocity, eccentricity, geometry, and ambient conditions are varied to determine the point of two phase flow incipience.
Meys, Evelyne; Rutten, Iris; Kruitwagen, Roy; Slangen, Brigitte; Lambrechts, Sandrina; Mertens, Helen; Nolting, Ernst; Boskamp, Dieuwke; Van Gorp, Toon
2017-12-01
To analyze how well untrained examiners - without experience in the use of International Ovarian Tumor Analysis (IOTA) terminology or simple ultrasound-based rules (simple rules) - are able to apply IOTA terminology and simple rules and to assess the level of agreement between non-experts and an expert. This prospective multicenter cohort study enrolled women with ovarian masses. Ultrasound was performed by non-expert examiners and an expert. Ultrasound features were recorded using IOTA nomenclature, and used for classifying the mass by simple rules. Interobserver agreement was evaluated with Fleiss' kappa and percentage agreement between observers. 50 consecutive women were included. We observed 46 discrepancies in the description of ovarian masses when non-experts utilized IOTA terminology. Tumor type was misclassified often (n = 22), resulting in poor interobserver agreement between the non-experts and the expert (kappa = 0.39, 95 %-CI 0.244 - 0.529, percentage of agreement = 52.0 %). Misinterpretation of simple rules by non-experts was observed 57 times, resulting in an erroneous diagnosis in 15 patients (30 %). The agreement for classifying the mass as benign, malignant or inconclusive by simple rules was only moderate between the non-experts and the expert (kappa = 0.50, 95 %-CI 0.300 - 0.704, percentage of agreement = 70.0 %). The level of agreement for all 10 simple rules features varied greatly (kappa index range: -0.08 - 0.74, percentage of agreement 66 - 94 %). Although simple rules are useful to distinguish benign from malignant adnexal masses, they are not that simple for untrained examiners. Training with both IOTA terminology and simple rules is necessary before simple rules can be introduced into guidelines and daily clinical practice.
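Agreement statistics of the kind reported here compare two raters' categorical labels against chance agreement. The study used Fleiss' kappa; the two-rater special case, Cohen's kappa, can be sketched as follows (the labels below are illustrative, not study data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labelling the same items:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(rater_a)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_chance = sum(ca[k] * cb[k] for k in ca) / (n * n)  # chance agreement from marginals
    return (p_obs - p_chance) / (1 - p_chance)

a = ["benign", "benign", "malignant", "inconclusive", "benign"]
b = ["benign", "malignant", "malignant", "inconclusive", "benign"]
k = cohens_kappa(a, b)
```

Because kappa discounts chance agreement, it can be modest (as in the 0.39 and 0.50 values above) even when raw percentage agreement looks respectable.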
Zhang, Jie; Wang, Yuping; Feng, Junhong
2013-01-01
In association rule mining, evaluating an association rule requires repeatedly scanning the database to compare the whole database with the antecedent, the consequent, and the rule as a whole. To decrease the number of comparisons and the time consumed, we present an attribute index strategy. It needs to scan the database only once, to create the attribute index of each attribute; thereafter, all the metric values used to evaluate an association rule are obtained from the attribute indices alone, with no further database scans. The paper treats association rule mining as a multiobjective problem rather than a single-objective one. In order to make the acquired solutions scatter uniformly toward the Pareto frontier in the objective space, an elitism policy and uniform design are introduced. The paper presents the algorithm of attribute-index and uniform-design based multiobjective association rule mining with an evolutionary algorithm, abbreviated IUARMMEA. It no longer requires a user-specified minimum support and minimum confidence, but instead uses a simple attribute index, and it uses a well-designed real encoding to extend its application scope. Experiments performed on several databases demonstrate that the proposed algorithm has excellent performance and can significantly reduce the number of comparisons and the time consumption.
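Under the strategy described, one pass builds, for each attribute, the set of transaction IDs containing it; the support and confidence of any candidate rule then reduce to set intersections, with no further database scans. A minimal sketch on a toy transaction database (the data and API are illustrative, not the paper's implementation):

```python
def build_attribute_index(transactions):
    """Single scan: map each attribute to the set of transaction IDs containing it."""
    index = {}
    for tid, items in enumerate(transactions):
        for item in items:
            index.setdefault(item, set()).add(tid)
    return index

def rule_metrics(index, n, antecedent, consequent):
    """Support and confidence of antecedent -> consequent from the index alone."""
    cover = set.intersection(*(index[a] for a in antecedent))
    both = cover & set.intersection(*(index[c] for c in consequent))
    support = len(both) / n
    confidence = len(both) / len(cover) if cover else 0.0
    return support, confidence

db = [{"a", "b"}, {"a", "c"}, {"a", "b", "c"}, {"b", "c"}]
idx = build_attribute_index(db)
sup, conf = rule_metrics(idx, len(db), {"a"}, {"b"})   # evaluate rule a -> b
```

Evaluating a new candidate rule costs only a few set intersections over the prebuilt index, which is the source of the reported reduction in comparisons.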
ITER structural design criteria and their extension to advanced reactor blankets
NASA Astrophysics Data System (ADS)
Majumdar, S.; Kalinin, G.
2000-12-01
Applications of the recent ITER structural design criteria (ISDC) are illustrated by two components. First, the low-temperature design rules are applied to copper alloys, which are particularly prone to irradiation embrittlement at relatively low fluences at certain temperatures. Allowable stresses are derived, and the impact of the embrittlement on the allowable surface heat flux of a simple first-wall/limiter design is demonstrated. Next, the high-temperature design rules of the ISDC are applied to evaporation of lithium and vapor extraction (EVOLVE), a blanket design concept currently being investigated under the US Advanced Power Extraction (APEX) program. A single tungsten first-wall tube is considered for thermal and stress analyses by the finite-element method.
The Rules of the Game in an Introductory Literature Class
ERIC Educational Resources Information Center
Jones, Ed
2008-01-01
While focusing on Andrew Marvell's "To His Coy Mistress," the author came up with the Interpretation Game, a game that had a simple set of rules designed to promote engaged academic discussion and, at the same time, to overcome problems that students have in class discussion about literature. In this article, the author narrates a few instances of…
Anton TenWolde; Mark T. Bomberg
2009-01-01
Overall, despite the lack of exact input data, the use of design tools, including models, is far superior to simply following rules of thumb, and a moisture analysis should be standard procedure for any building envelope design. Exceptions can be made only for buildings in the same climate, with similar occupancy, and with similar envelope construction. This chapter...
IOTA simple rules in differentiating between benign and malignant ovarian tumors.
Tantipalakorn, Charuwan; Wanapirak, Chanane; Khunamornpong, Surapan; Sukpan, Kornkanok; Tongsong, Theera
2014-01-01
To evaluate the diagnostic performance of the IOTA simple rules in differentiating between benign and malignant ovarian tumors. A study of diagnostic performance was conducted on women scheduled for elective surgery due to ovarian masses between March 2007 and March 2012. All patients underwent ultrasound examination for the IOTA simple rules within 24 hours of surgery. All examinations were performed by the authors, who had no clinical information about the patients, to differentiate between benign and malignant adnexal masses using the IOTA simple rules. The gold standard diagnosis was based on pathological or operative findings. A total of 398 adnexal masses, in 376 women, were available for analysis. Of them, the IOTA simple rules could be applied in 319 (80.1%), including 212 (66.5%) benign tumors and 107 (33.5%) malignant tumors. The simple rules yielded inconclusive results in 79 (19.9%) masses. In the 319 masses to which the IOTA simple rules could be applied, sensitivity was 82.9% and specificity 95.3%. The IOTA simple rules have high diagnostic performance in differentiating between benign and malignant adnexal masses. Nevertheless, inconclusive results are relatively common.
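The reported performance figures reduce to standard confusion-matrix ratios. A minimal sketch, with cell counts chosen only for illustration (the abstract reports percentages, not the underlying counts):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts roughly on the scale of the study (107 malignant,
# 212 benign); the exact cells are not given in the abstract.
sens, spec = sensitivity_specificity(tp=89, fn=18, tn=202, fp=10)
```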
A simple threshold rule is sufficient to explain sophisticated collective decision-making.
Robinson, Elva J H; Franks, Nigel R; Ellis, Samuel; Okuda, Saki; Marshall, James A R
2011-01-01
Decision-making animals can use slow-but-accurate strategies, such as making multiple comparisons, or opt for simpler, faster strategies to find a 'good enough' option. Social animals make collective decisions about many group behaviours including foraging and migration. The key to the collective choice lies with individual behaviour. We present a case study of a collective decision-making process (house-hunting ants, Temnothorax albipennis), in which a previously proposed decision strategy involved both quality-dependent hesitancy and direct comparisons of nests by scouts. An alternative possible decision strategy is that scouting ants use a very simple quality-dependent threshold rule to decide whether to recruit nest-mates to a new site or search for alternatives. We use analytical and simulation modelling to demonstrate that this simple rule is sufficient to explain empirical patterns from three studies of collective decision-making in ants, and can account parsimoniously for apparent comparison by individuals and apparent hesitancy (recruitment latency) effects, when available nests differ strongly in quality. This highlights the need to carefully design experiments to detect individual comparison. We present empirical data strongly suggesting that best-of-n comparison is not used by individual ants, although individual sequential comparisons are not ruled out. However, by using a simple threshold rule, decision-making groups are able to effectively compare options, without relying on any form of direct comparison of alternatives by individuals. This parsimonious mechanism could promote collective rationality in group decision-making.
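The proposed quality-dependent threshold rule can be illustrated with a toy simulation: each scout inspects one nest and recruits only if its quality exceeds a fixed threshold, yet recruitment still concentrates on the better nest without any individual ever comparing nests. This is a schematic sketch, not the authors' analytical or simulation model:

```python
import random

def threshold_rule_sim(nest_qualities, threshold, n_scouts=1000, seed=1):
    """Each scout visits one random nest and recruits iff its quality
    exceeds a fixed threshold -- no direct comparison between nests."""
    random.seed(seed)
    recruits = {q: 0 for q in nest_qualities}
    for _ in range(n_scouts):
        nest = random.choice(nest_qualities)
        if nest > threshold:
            recruits[nest] += 1
    return recruits

# Two nests differing strongly in quality; threshold sits between them.
counts = threshold_rule_sim([0.3, 0.9], threshold=0.5)
```

When the options straddle the threshold, the colony-level outcome looks like a comparison even though no individual made one, which is the paper's central point.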
Knowledge base rule partitioning design for CLIPS
NASA Technical Reports Server (NTRS)
Mainardi, Joseph D.; Szatkowski, G. P.
1990-01-01
This paper describes a knowledge base (KB) partitioning approach to the problem of achieving real-time performance with the CLIPS AI shell when it contains large numbers of rules and facts. This work is funded under the joint USAF/NASA Advanced Launch System (ALS) Program as applied research in expert systems to perform vehicle checkout for real-time controller and diagnostic monitoring tasks. The main objective of the Expert System advanced development project (ADP-2302) is to provide robust systems responding to new data frames at 0.1 to 1.0 second intervals. The intelligent system control must be performed within the specified real-time window in order to meet the demands of the given application. Partitioning the KB reduces the complexity of the inferencing Rete net at any given time. This reduced complexity improves performance without undue impact during load and unload cycles. The second objective is to produce highly reliable intelligent systems. This requires simple and automated approaches to the KB verification and validation (V&V) task. Partitioning the KB reduces overall rule interaction complexity. Reduced interaction simplifies the necessary V&V testing by focusing attention only on individual areas of interest. Many systems require a robustness that involves a large number of rules, most of which are mutually exclusive under different phases or conditions. The ideal solution is to control the knowledge base by loading the rules that directly apply to a given condition, while stripping out all rules and facts that are not used during that cycle. The practical approach is to cluster rules and facts into associated 'blocks'. A simple approach has been designed to control the addition and deletion of 'blocks' of rules and facts, while allowing real-time operations to run freely. Timing tests of real-time performance for specific machines under real-time operating systems have not been completed but are planned as part of the analysis process to validate the design.
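The block load/unload scheme can be sketched abstractly as follows; the class and block names are hypothetical, and a real CLIPS partitioning would manipulate actual rule and fact constructs rather than strings:

```python
class PartitionedKB:
    """Hypothetical sketch: rules/facts are clustered into named blocks,
    and only the blocks for the active phase are loaded, keeping the
    match network small."""

    def __init__(self, blocks):
        self.blocks = blocks   # block name -> list of rules/facts
        self.active = {}       # currently loaded blocks

    def load(self, name):
        self.active[name] = self.blocks[name]

    def unload(self, name):
        self.active.pop(name, None)

    def working_set(self):
        """Everything the inference engine currently has to match against."""
        return [r for rules in self.active.values() for r in rules]

kb = PartitionedKB({"checkout": ["rule-c1", "rule-c2"],
                    "diagnostic": ["rule-d1"]})
kb.load("checkout")    # checkout phase begins
kb.load("diagnostic")
kb.unload("checkout")  # phase change: strip rules no longer needed
```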
NASA Technical Reports Server (NTRS)
DellaCorte, Christopher; Valco, Mark J.
2002-01-01
The Oil-Free Turbomachinery team at the NASA Glenn Research Center has unlocked one of the mysteries surrounding foil air bearing performance. Foil air bearings are self-acting hydrodynamic bearings that use ambient air, or any fluid, as their lubricant. In operation, the motion of the shaft's surface drags fluid into the bearing by viscous action, creating a pressurized lubricant film. This lubricating film separates the stationary foil bearing surface from the moving shaft and supports load. Foil bearings have been around for decades and are widely employed in the air cycle machines used for cabin pressurization and cooling aboard commercial jetliners. The Oil-Free Turbomachinery team is fostering the maturation of this technology for integration into advanced Oil-Free aircraft engines. Elimination of the engine oil system can significantly reduce weight and cost and could enable revolutionary new engine designs. Foil bearings, however, have complex elastic support structures (spring packs) that make the prediction of bearing performance, such as load capacity, difficult if not impossible. Researchers at Glenn recently found a link between foil bearing design and load capacity performance. The results have led to a simple rule-of-thumb that relates a bearing's size, speed, and design to its load capacity. Early simple designs (Generation I) had simple elastic (spring) support elements, and performance was limited. More advanced bearings (Generation III) with elastic supports, in which the stiffness is varied locally to optimize gas film pressures, exhibit load capacities that are more than double those of the best previous designs. This is shown graphically in the figure. These more advanced bearings have enabled industry to introduce commercial Oil-Free gas-turbine-based electrical generators and are allowing the aeropropulsion industry to incorporate the technology into aircraft engines. 
The rule-of-thumb enables engine and bearing designers to easily size and select bearing technology for a new application and determine the level of complexity required in the bearings. This new understanding enables industry to assess the feasibility of new engine designs and provides critical guidance toward the future development of Oil-Free turbomachinery propulsion systems.
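The rule of thumb relates load capacity to the bearing's projected area and surface speed through a single load-capacity coefficient that grows with bearing generation. A sketch of that relationship follows; the coefficient and the example numbers are purely illustrative, not values from the article:

```python
def foil_bearing_load_capacity(coeff, diameter_in, length_in, speed_krpm):
    """Rule-of-thumb form: W = coeff * (L * D) * (D * N).

    coeff      load-capacity coefficient (higher for more advanced
               bearing generations); units lbf / (in^3 * krpm)
    diameter_in, length_in  bearing dimensions in inches
    speed_krpm shaft speed in thousands of rpm
    Returns estimated load capacity in lbf."""
    return coeff * (length_in * diameter_in) * (diameter_in * speed_krpm)

# Illustrative sizing: a 2 in x 2 in bearing at 30 krpm with coeff = 1.0
w = foil_bearing_load_capacity(1.0, diameter_in=2.0, length_in=2.0,
                               speed_krpm=30.0)
```

Doubling the coefficient (as between early and advanced generations) doubles the predicted capacity at the same size and speed, which is how the figure's comparison reads.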
Eisenhardt, K M; Sull, D N
2001-01-01
The success of Yahoo!, eBay, Enron, and other companies that have become adept at morphing to meet the demands of changing markets can't be explained using traditional thinking about competitive strategy. These companies have succeeded by pursuing constantly evolving strategies in market spaces that were considered unattractive according to traditional measures. In this article--the third in an HBR series by Kathleen Eisenhardt and Donald Sull on strategy in the new economy--the authors ask, what are the sources of competitive advantage in high-velocity markets? The secret, they say, is strategy as simple rules. The companies know that the greatest opportunities for competitive advantage lie in market confusion, but they recognize the need for a few crucial strategic processes and a few simple rules. In traditional strategy, advantage comes from exploiting resources or stable market positions. In strategy as simple rules, advantage comes from successfully seizing fleeting opportunities. Key strategic processes, such as product innovation, partnering, or spinout creation, place the company where the flow of opportunities is greatest. Simple rules then provide the guidelines within which managers can pursue such opportunities. Simple rules, which grow out of experience, fall into five broad categories: how-to rules, boundary conditions, priority rules, timing rules, and exit rules. Companies with simple-rules strategies must follow the rules religiously and avoid the temptation to change them too frequently. A consistent strategy helps managers sort through opportunities and gain short-term advantage by exploiting the attractive ones. In stable markets, managers rely on complicated strategies built on detailed predictions of the future. But when business is complicated, strategy should be simple.
A Simple Method to Determine the "R" or "S" Configuration of Molecules with an Axis of Chirality
ERIC Educational Resources Information Center
Wang, Cunde; Wu, Weiming
2011-01-01
A simple method for the "R" or "S" designation of molecules with an axis of chirality is described. The method involves projection of the substituents along the chiral axis, utilizes the Cahn-Ingold-Prelog sequence rules in assigning priority to the substituents, is easy to use, and has broad applicability. (Contains 5 figures.)
Improved specificity of TALE-based genome editing using an expanded RVD repertoire.
Miller, Jeffrey C; Zhang, Lei; Xia, Danny F; Campo, John J; Ankoudinova, Irina V; Guschin, Dmitry Y; Babiarz, Joshua E; Meng, Xiangdong; Hinkley, Sarah J; Lam, Stephen C; Paschon, David E; Vincent, Anna I; Dulay, Gladys P; Barlow, Kyle A; Shivak, David A; Leung, Elo; Kim, Jinwon D; Amora, Rainier; Urnov, Fyodor D; Gregory, Philip D; Rebar, Edward J
2015-05-01
Transcription activator-like effector (TALE) proteins have gained broad appeal as a platform for targeted DNA recognition, largely owing to their simple rules for design. These rules relate the base specified by a single TALE repeat to the identity of two key residues (the repeat variable diresidue, or RVD) and enable design for new sequence targets via modular shuffling of these units. A key limitation of these rules is that their simplicity precludes options for improving designs that are insufficiently active or specific. Here we address this limitation by developing an expanded set of RVDs and applying them to improve the performance of previously described TALEs. As an extreme example, total conversion of a TALE nuclease to new RVDs substantially reduced off-target cleavage in cellular studies. By providing new RVDs and design strategies, these studies establish options for developing improved TALEs for broader application across medicine and biotechnology.
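The canonical one-RVD-per-base code that underlies modular TALE design can be expressed as a lookup table; the expanded repertoire described in this paper adds non-canonical alternatives beyond these four. A minimal sketch:

```python
# Canonical repeat-variable-diresidue (RVD) pairings from the original
# TALE code: NI -> A, HD -> C, NN -> G, NG -> T.
CANONICAL_RVDS = {"A": "NI", "C": "HD", "G": "NN", "T": "NG"}

def design_tale_rvds(target_dna):
    """Modular design: pick one RVD per base of the target sequence."""
    return [CANONICAL_RVDS[base] for base in target_dna.upper()]

rvds = design_tale_rvds("GATC")
```

The paper's point is precisely that this table offers no knobs to turn when a design underperforms, which is what the expanded RVD set addresses.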
Tinnangwattana, Dangcheewan; Vichak-Ururote, Linlada; Tontivuthikul, Paponrad; Charoenratana, Cholaros; Lerthiranwong, Thitikarn; Tongsong, Theera
2015-01-01
To evaluate the diagnostic performance of the IOTA simple rules in predicting malignant adnexal tumors when used by non-expert examiners. Five obstetric/gynecologic residents, who had never performed gynecologic ultrasound examinations by themselves before, were trained in the IOTA simple rules by an experienced examiner. One trained resident performed ultrasound examinations, including the IOTA simple rules, on 100 women scheduled for surgery due to ovarian masses, within 24 hours of surgery. The gold standard diagnosis was based on pathological or operative findings. The five trained residents performed the IOTA simple rules on 30 patients for evaluation of inter-observer variability. A total of 100 patients underwent ultrasound examination for the IOTA simple rules. Of them, the IOTA simple rules could be applied in 94 (94%) masses, including 71 (71.0%) benign masses and 29 (29.0%) malignant masses. The IOTA simple rules showed a sensitivity of 89.3% (95% CI, 77.8-100.7%) and a specificity of 83.3% (95% CI, 74.3-92.3%). Inter-observer variability was analyzed using Cohen's kappa coefficient; the kappa indices of the four pairs of raters were 0.713-0.884 (0.722, 0.827, 0.713, and 0.884). The IOTA simple rules have high diagnostic performance in discriminating adnexal masses even when applied by non-expert sonographers, though a training course may be required. Nevertheless, they should be further tested by a greater number of general practitioners before widespread use.
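Cohen's kappa, used here for inter-observer variability, corrects raw agreement for the agreement expected by chance. A self-contained sketch with made-up benign/malignant ratings (not the study's data):

```python
def cohens_kappa(pairs):
    """pairs: list of (rater1_label, rater2_label).
    kappa = (p_observed - p_expected) / (1 - p_expected)."""
    n = len(pairs)
    po = sum(a == b for a, b in pairs) / n
    labels = {lab for pair in pairs for lab in pair}
    pe = sum((sum(a == lab for a, _ in pairs) / n) *
             (sum(b == lab for _, b in pairs) / n) for lab in labels)
    return (po - pe) / (1 - pe)

# Hypothetical ratings: "B" benign, "M" malignant.
ratings = ([("B", "B")] * 40 + [("M", "M")] * 10 +
           [("B", "M")] * 5 + [("M", "B")] * 5)
kappa = cohens_kappa(ratings)
```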
An Integrated Children Disease Prediction Tool within a Special Social Network.
Apostolova Trpkovska, Marika; Yildirim Yayilgan, Sule; Besimi, Adrian
2016-01-01
This paper proposes a social network with an integrated children's disease prediction system developed using the specially designed Children General Disease Ontology (CGDO). This ontology consists of children's diseases and their relationships with symptoms, together with Semantic Web Rule Language (SWRL) rules specially designed for predicting diseases. The prediction process starts when the user enters data about the observed signs and symptoms, which are then mapped onto the CGDO ontology. Once the data are mapped, the prediction results are presented. The prediction phase executes the rules, which extract the predicted disease details based on the specified SWRL rules. The motivation behind the development of this system is to spread knowledge about children's diseases and their symptoms in a very simple way using the specialized social networking website www.emama.mk.
Versloot, Judith; Grudniewicz, Agnes; Chatterjee, Ananda; Hayden, Leigh; Kastner, Monika; Bhattacharyya, Onil
2015-06-01
We present simple formatting rules derived from an extensive literature review that can improve the format of clinical practice guidelines (CPGs), and potentially increase the likelihood of being used. We recently conducted a review of the literature from medicine, psychology, design, and human factors engineering on characteristics of guidelines that are associated with their use in practice, covering both the creation and communication of content. The formatting rules described in this article are derived from that review. The formatting rules are grouped into three categories that can be easily applied to CPGs: first, Vivid: make it stand out; second, Intuitive: match it to the audience's expectations, and third, Visual: use alternatives to text. We highlight rules supported by our broad literature review and provide specific 'how to' recommendations for individuals and groups developing evidence-based materials for clinicians. The way text documents are formatted influences their accessibility and usability. Optimizing the formatting of CPGs is a relatively inexpensive intervention and can be used to facilitate the dissemination of evidence in healthcare. Applying simple formatting principles to make documents more vivid, intuitive, and visual is a practical approach that has the potential to influence the usability of guidelines and to influence the extent to which guidelines are read, remembered, and used in practice.
Tracking Hazard Analysis Data in a Jungle of Changing Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sullivan, Robin S.; Young, Jonathan
2006-05-16
Tracking hazard analysis data during the 'life cycle' of a project can be an extremely complicated task. However, a few simple rules, used consistently, can give you the edge that will save countless headaches and provide the information that will help integrate the hazard analysis and design activities even if performed in parallel.
Guidelines for glycol dehydrator design; Part 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manning, W.P.; Wood, H.S.
1993-01-01
Better designs and instrumentation improve glycol dehydrator performance. This paper reports on these guidelines which emphasize efficient water removal from natural gas. Water, a common contaminant in natural gas, causes operational problems when it forms hydrates and deposits on solid surfaces. Result: plugged valves, meters, instruments and even pipelines. Simple rules resolve these problems and reduce downtime and maintenance costs.
ERIC Educational Resources Information Center
Rhoads, Christopher
2014-01-01
Recent publications have drawn attention to the idea of utilizing prior information about the correlation structure to improve statistical power in cluster randomized experiments. Because power in cluster randomized designs is a function of many different parameters, it has been difficult for applied researchers to discern a simple rule explaining…
Oosterman, Joukje M; Heringa, Sophie M; Kessels, Roy P C; Biessels, Geert Jan; Koek, Huiberdina L; Maes, Joseph H R; van den Berg, Esther
2017-04-01
Rule induction tests such as the Wisconsin Card Sorting Test require executive control processes, but also the learning and memorization of simple stimulus-response rules. In this study, we examined the contribution of diminished learning and memorization of simple rules to complex rule induction test performance in patients with amnestic mild cognitive impairment (aMCI) or Alzheimer's dementia (AD). Twenty-six aMCI patients, 39 AD patients, and 32 control participants were included. A task was used in which the memory load and the complexity of the rules were independently manipulated. This task consisted of three conditions: a simple two-rule learning condition (Condition 1), a simple four-rule learning condition (inducing an increase in memory load, Condition 2), and a complex biconditional four-rule learning condition, inducing an increase in complexity and, hence, executive control load (Condition 3). Performance of AD patients declined disproportionately when the number of simple rules that had to be memorized increased (from Condition 1 to 2). An additional increment in complexity (from Condition 2 to 3) did not, however, disproportionately affect performance of the patients. Performance of the aMCI patients did not differ from that of the control participants. In the patient group, correlation analysis showed that memory performance correlated with Condition 1 performance, whereas executive task performance correlated with Condition 2 performance. These results indicate that the reduced learning and memorization of underlying task rules explains a significant part of the diminished complex rule induction performance commonly reported in AD, although results from the correlation analysis suggest involvement of executive control functions as well. Taken together, these findings suggest that care is needed when interpreting rule induction task performance in terms of executive function deficits in these patients.
Design and Training of Limited-Interconnect Architectures
1991-07-16
and signal processing. Neuromorphic (brain-like) models allow an alternative for achieving real-time operation for such tasks, while having a...compact and robust architecture. Neuromorphic models consist of interconnections of simple computational nodes. In this approach, each node computes a...operational performance. II. Research Objectives The research objectives were: 1. Development of on-chip local training rules specifically designed for
Sonographic Diagnosis of Tubal Cancer with IOTA Simple Rules Plus Pattern Recognition
Tongsong, Theera; Wanapirak, Chanane; Tantipalakorn, Charuwan; Tinnangwattana, Dangcheewan
2017-01-01
Objective: To evaluate diagnostic performance of IOTA simple rules plus pattern recognition in predicting tubal cancer. Methods: Secondary analysis was performed on prospective database of our IOTA project. The patients recruited in the project were those who were scheduled for pelvic surgery due to adnexal masses. The patients underwent ultrasound examinations within 24 hours before surgery. On ultrasound examination, the masses were evaluated using the well-established IOTA simple rules plus pattern recognition (sausage-shaped appearance, incomplete septum, visible ipsilateral ovaries) to predict tubal cancer. The gold standard diagnosis was based on histological findings or operative findings. Results: A total of 482 patients, including 15 cases of tubal cancer, were evaluated by ultrasound preoperatively. The IOTA simple rules plus pattern recognition gave a sensitivity of 86.7% (13 in 15) and specificity of 97.4%. Sausage-shaped appearance was identified in nearly all cases (14 in 15). Incomplete septa and normal ovaries could be identified in 33.3% and 40%, respectively. Conclusion: IOTA simple rules plus pattern recognition is relatively effective in predicting tubal cancer. Thus, we propose the simple scheme in diagnosis of tubal cancer as follows. First of all, the adnexal masses are evaluated with IOTA simple rules. If the B-rules could be applied, tubal cancer is reliably excluded. If the M-rules could be applied or the result is inconclusive, careful delineation of the mass with pattern recognition should be performed. PMID:29172273
A detailed comparison of optimality and simplicity in perceptual decision-making
Shen, Shan; Ma, Wei Ji
2017-01-01
Two prominent ideas in the study of decision-making have been that organisms behave near-optimally, and that they use simple heuristic rules. These principles might be operating in different types of tasks, but this possibility cannot be fully investigated without a direct, rigorous comparison within a single task. Such a comparison was lacking in most previous studies, because a) the optimal decision rule was simple; b) no simple suboptimal rules were considered; c) it was unclear what was optimal, or d) a simple rule could closely approximate the optimal rule. Here, we used a perceptual decision-making task in which the optimal decision rule is well-defined and complex, and makes qualitatively distinct predictions from many simple suboptimal rules. We find that all simple rules tested fail to describe human behavior, that the optimal rule accounts well for the data, and that several complex suboptimal rules are indistinguishable from the optimal one. Moreover, we found evidence that the optimal model is close to the true model: first, the better the trial-to-trial predictions of a suboptimal model agree with those of the optimal model, the better that suboptimal model fits; second, our estimate of the Kullback-Leibler divergence between the optimal model and the true model is not significantly different from zero. When observers receive no feedback, the optimal model still describes behavior best, suggesting that sensory uncertainty is implicitly represented and taken into account. Beyond the task and models studied here, our results have implications for best practices of model comparison. PMID:27177259
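The Kullback-Leibler divergence used to compare the optimal model with the estimated true model is, for discrete predictive distributions, a one-line sum. A generic sketch (not the authors' estimator, which operates on trial-to-trial model predictions):

```python
import math

def kl_divergence(p, q):
    """D_KL(p || q) for discrete distributions given as probability lists.
    Uses the natural log; terms with p_i = 0 contribute zero."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Identical predictive distributions give zero divergence; mismatched
# distributions give a positive value.
d_same = kl_divergence([0.5, 0.5], [0.5, 0.5])
d_diff = kl_divergence([0.5, 0.5], [0.9, 0.1])
```

A divergence estimate "not significantly different from zero", as the authors report, is evidence that the optimal model's trial-to-trial predictions match those of the process generating the data.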
Applications of rule-induction in the derivation of quantitative structure-activity relationships.
A-Razzak, M; Glen, R C
1992-08-01
Recently, methods have been developed in the field of Artificial Intelligence (AI), specifically in the expert systems area using rule-induction, designed to extract rules from data. We have applied these methods to the analysis of molecular series with the objective of generating rules which are predictive and reliable. The input to rule-induction consists of a number of examples with known outcomes (a training set) and the output is a tree-structured series of rules. Unlike most other analysis methods, the results of the analysis are in the form of simple statements which can be easily interpreted. These are readily applied to new data giving both a classification and a probability of correctness. Rule-induction has been applied to in-house generated and published QSAR datasets and the methodology, application and results of these analyses are discussed. The results imply that in some cases it would be advantageous to use rule-induction as a complementary technique in addition to conventional statistical and pattern-recognition methods.
Coplanar Waveguide Radial Line Double Stub and Application to Filter Circuits
NASA Technical Reports Server (NTRS)
Simons, R. N.; Taub, S. R.
1993-01-01
Coplanar waveguide (CPW) and grounded coplanar waveguide (GCPW) radial line double stub resonators are experimentally characterized with respect to stub radius and sector angle. A simple closed-form design equation, which predicts the resonance radius of the stub, is presented. Use of a double stub resonator as a lowpass filter or as a harmonic suppression filter is demonstrated, and design rules are given.
ER2OWL: Generating OWL Ontology from ER Diagram
NASA Astrophysics Data System (ADS)
Fahad, Muhammad
Ontology is a fundamental part of the Semantic Web. The goal of the W3C is to bring the web to its full potential as a semantic web while reusing previous systems and artifacts. Most legacy systems have been documented using structured analysis and structured design (SASD), especially with simple or Extended ER Diagrams (ERD). Such systems need upgrading to become part of the semantic web. In this paper, we present concrete-level rules for transforming an ERD into an OWL-DL ontology. These rules facilitate an easy and understandable transformation from ERD to OWL. The set of transformation rules is tested on a structured analysis and design example. The framework provides OWL ontologies as a Semantic Web foundation and helps software engineers upgrade the structured analysis and design artifact, the ERD, into components of the semantic web. Moreover, our transformation tool, ER2OWL, reduces the cost and time of building OWL ontologies by reusing existing entity relationship models.
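The flavor of such transformation rules can be sketched as follows; the mapping shown (entity to owl:Class, attribute to owl:DatatypeProperty, relationship to owl:ObjectProperty) is a common convention and only an approximation of ER2OWL's actual rule set:

```python
def er_to_owl(entities, relationships):
    """entities: {entity_name: [attribute names]}
    relationships: {rel_name: (source_entity, target_entity)}
    Returns Turtle-style OWL statements as a string."""
    lines = []
    for entity, attrs in entities.items():
        lines.append(f":{entity} a owl:Class .")          # entity -> class
        for attr in attrs:                                # attribute -> datatype property
            lines.append(
                f":{attr} a owl:DatatypeProperty ; rdfs:domain :{entity} .")
    for rel, (src, dst) in relationships.items():         # relationship -> object property
        lines.append(
            f":{rel} a owl:ObjectProperty ; "
            f"rdfs:domain :{src} ; rdfs:range :{dst} .")
    return "\n".join(lines)

owl = er_to_owl({"Student": ["name"]},
                {"enrolledIn": ("Student", "Course")})
```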
Weiss, Volker C
2010-07-22
One of Guggenheim's many corresponding-states rules for simple fluids implies that the molar enthalpy of vaporization (determined at the temperature at which the pressure reaches 1/50th of its critical value, which approximately coincides with the normal boiling point) divided by the critical temperature has a value of roughly 5.2R, where R is the universal gas constant. For more complex fluids, such as strongly polar and ionic fluids, one must expect deviations from Guggenheim's rule. Such a deviation has far-reaching consequences for other empirical rules related to the vaporization of fluids, namely Guldberg's rule and Trouton's rule. We evaluate these characteristic quantities for simple fluids, polar fluids, hydrogen-bonding fluids, simple inorganic molten salts, and room temperature ionic liquids (RTILs). For the ionic fluids, the critical parameters are not accessible to direct experimental observation; therefore, suitable extrapolation schemes have to be applied. For the RTILs [1-n-alkyl-3-methylimidazolium bis(trifluoromethylsulfonyl)imides, where the alkyl chain is ethyl, butyl, hexyl, or octyl], the critical temperature is estimated by extrapolating the surface tension to zero using Guggenheim's and Eotvos' rules; the critical density is obtained using the linear-diameter rule. It is shown that the RTILs adhere to Guggenheim's master curve for the reduced surface tension of simple and moderately polar fluids, but that they deviate significantly from his rule for the reduced enthalpy of vaporization of simple fluids. Consequences for evaluating the Trouton constant of RTILs, the value of which has been discussed controversially in the literature, are indicated.
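Guggenheim's rule turns a critical temperature directly into an enthalpy-of-vaporization estimate. A worked example for a simple fluid (argon, Tc ≈ 150.7 K), where the estimate lands near the commonly cited experimental value of roughly 6.5 kJ/mol:

```python
R = 8.314  # universal gas constant, J / (mol K)

def guggenheim_enthalpy(critical_temperature_K):
    """Corresponding-states estimate: dH_vap ~ 5.2 * R * Tc (in J/mol).
    Expected to hold for simple fluids only; polar and ionic fluids
    deviate, which is the paper's subject."""
    return 5.2 * R * critical_temperature_K

dH_argon = guggenheim_enthalpy(150.7)  # about 6.5 kJ/mol
```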
Redundancy checking algorithms based on parallel novel extension rule
NASA Astrophysics Data System (ADS)
Liu, Lei; Yang, Yang; Li, Guangli; Wang, Qi; Lü, Shuai
2017-05-01
Redundancy checking (RC) is a key knowledge-reduction technology. The extension rule (ER) is a reasoning method first presented in 2003 and well received by experts. The novel extension rule (NER), presented in 2009, is an improved ER-based reasoning method. In this paper, we first analyse the characteristics of the extension rule and then present a simple algorithm for redundancy checking based on the extension rule (RCER). In addition, we introduce MIMF, a type of heuristic strategy. Using this rule and strategy, we design and implement the RCHER algorithm, which relies on MIMF. Next, we design and implement an RCNER (redundancy checking based on NER) algorithm. Parallel computing greatly accelerates the NER algorithm, whose tasks have only weak interdependence when executed. Considering this, we present PNER (parallel NER) and apply it to redundancy checking and necessity checking. Furthermore, we design and implement the RCPNER (redundancy checking based on PNER) and NCPNER (necessary clause partition based on PNER) algorithms as well. The experimental results show that MIMF significantly accelerates the RCER algorithm on large-scale, highly redundant formulae. Comparing PNER with NER and RCPNER with RCNER, the average speedup can reach the number of task decompositions. Comparing NCPNER with the RCNER-based algorithm for separating redundant formulae, the speedup increases steadily as the scale of the formulae grows. Finally, we describe the challenges the extension rule will face and suggest possible solutions.
Automation tools for demonstration of goal directed and self-repairing flight control systems
NASA Technical Reports Server (NTRS)
Agarwal, A. K.
1988-01-01
The coupling of expert systems with control design and analysis techniques is documented to provide a realizable self-repairing flight control system. Key features of such a flight control system are identified, and a limited set of rules for a simple aircraft model is presented.
Design of fuzzy systems using neurofuzzy networks.
Figueiredo, M; Gomide, F
1999-01-01
This paper introduces a systematic approach for fuzzy system design based on a class of neural fuzzy networks built upon a general neuron model. The network structure is such that it encodes the knowledge learned in the form of if-then fuzzy rules and processes data following fuzzy reasoning principles. The technique provides a mechanism to obtain rules covering the whole input/output space as well as the membership functions (including their shapes) for each input variable. Such characteristics are of utmost importance in fuzzy system design and application. In addition, after learning, it is very simple to extract fuzzy rules in linguistic form. The network has universal approximation capability, a property very useful in, e.g., modeling and control applications. Here we focus on function approximation problems as a vehicle to illustrate its usefulness and to evaluate its performance. Comparisons with alternative approaches are also included. Both nonnoisy and noisy data have been studied and considered in the computational experiments. The neural fuzzy network developed here, and consequently the underlying approach, has been shown to provide good results from the accuracy, complexity, and system design points of view.
A simple randomisation procedure for validating discriminant analysis: a methodological note.
Wastell, D G
1987-04-01
Because the goal of discriminant analysis (DA) is to optimise classification, it designedly exaggerates between-group differences. This bias complicates validation of DA. Jack-knifing has been used for validation but is inappropriate when stepwise selection (SWDA) is employed. A simple randomisation test is presented which is shown to give correct decisions for SWDA. The general superiority of randomisation tests over orthodox significance tests is discussed. Current work on non-parametric methods of estimating the error rates of prediction rules is briefly reviewed.
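The core idea of a randomisation test can be sketched in a few lines: compare the observed group separation against a null distribution built by shuffling group labels. This is a generic permutation test on a difference in means, illustrative of the principle only; it is not Wastell's exact procedure for stepwise discriminant analysis, and the data are invented.

```python
import random

# Minimal permutation (randomisation) test: the observed between-group
# separation is compared with the separation obtained under repeated random
# reassignment of group labels. A small p-value means the observed separation
# is unlikely under the null of no group difference.

def perm_test(group_a, group_b, n_perm=2000, seed=0):
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = group_a + group_b
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        a, b = pooled[:len(group_a)], pooled[len(group_a):]
        if abs(sum(a) / len(a) - sum(b) / len(b)) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)  # p-value with add-one correction

p = perm_test([5.1, 4.9, 5.3, 5.0], [6.2, 6.0, 6.4, 6.1])
print(p)  # small: the shuffled labelings rarely match the observed separation
```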
A 32-bit NMOS microprocessor with a large register file
NASA Astrophysics Data System (ADS)
Sherburne, R. W., Jr.; Katevenis, M. G. H.; Patterson, D. A.; Sequin, C. H.
1984-10-01
Two scaled versions of a 32-bit NMOS reduced instruction set computer CPU, called RISC II, have been implemented on two different processing lines using the simple Mead and Conway layout rules with lambda values of 2 and 1.5 microns (corresponding to drawn gate lengths of 4 and 3 microns), respectively. The design utilizes a small set of simple instructions in conjunction with a large register file in order to provide high performance. This approach has resulted in two surprisingly powerful single-chip processors.
A Simple Computer-Aided Three-Dimensional Molecular Modeling for the Octant Rule
ERIC Educational Resources Information Center
Kang, Yinan; Kang, Fu-An
2011-01-01
The Moffitt-Woodward-Moscowitz-Klyne-Djerassi octant rule is one of the most successful empirical rules in organic chemistry. However, the lack of a simple effective modeling method for the octant rule in the past 50 years has posed constant difficulties for researchers, teachers, and students, particularly the young generations, to learn and…
Rule-violations sensitise towards negative and authority-related stimuli.
Wirth, Robert; Foerster, Anna; Rendel, Hannah; Kunde, Wilfried; Pfister, Roland
2018-05-01
Rule violations have usually been studied from a third-person perspective, identifying situational factors that render violations more or less likely. A first-person perspective on the agent that actively violates the rules, on the other hand, is only just beginning to emerge. Here we show that committing a rule violation sensitises towards subsequent negative stimuli as well as subsequent authority-related stimuli. In a Prime-Probe design, we used an instructed rule-violation task as the Prime and a word categorisation task as the Probe. We also employed a control condition that used a rule-inversion task as the Prime (instead of rule violations). Probe targets were categorised faster after a violation, relative to after a rule-based response, if they related to either negative valence or authority. Inversions, however, primed only negative stimuli and did not accelerate the categorisation of authority-related stimuli. A heightened sensitivity towards authority-related targets thus seems to be specific to rule violations. A control experiment showed that these effects cannot be explained in terms of semantic priming. Therefore, we propose that rule violations necessarily activate authority-related representations that make rule violations qualitatively different from simple rule inversions.
SIRE: A Simple Interactive Rule Editor for NICBES
NASA Technical Reports Server (NTRS)
Bykat, Alex
1988-01-01
To support the evolution of domain expertise and its representation in an expert system knowledge base, a user-friendly rule base editor is mandatory. The Nickel Cadmium Battery Expert System (NICBES), a prototype expert system for the Hubble Space Telescope power storage management system, does not provide such an editor. In the following, a Simple Interactive Rule Base Editor (SIRE) for NICBES is described. The SIRE provides a consistent internal representation of the NICBES knowledge base. It supports knowledge presentation and provides a user-friendly, code-language-independent medium for rule addition and modification. The SIRE is integrated with NICBES via an interface module. This module provides translation of the internal representation into Prolog-type rules (Horn clauses), subsequent rule assertion, and a simple mechanism for rule selection by the Prolog inference engine.
Tongsong, Theera; Tinnangwattana, Dangcheewan; Vichak-Ururote, Linlada; Tontivuthikul, Paponrad; Charoenratana, Cholaros; Lerthiranwong, Thitikarn
2016-01-01
To compare diagnostic performance in differentiating benign from malignant ovarian masses between the IOTA (International Ovarian Tumor Analysis) simple rules and subjective sonographic assessment. Women scheduled for elective surgery because of ovarian masses were recruited into the study and underwent ultrasound examination within 24 hours of surgery, both to apply the IOTA simple rules by general gynecologists and to record video clips for subjective assessment by an experienced sonographer. The diagnostic performance of the IOTA rules and subjective assessment for differentiating benign from malignant masses was compared. The gold-standard diagnosis was the pathological or operative findings. A total of 150 ovarian masses were included, comprising 105 (70%) benign and 45 (30%) malignant. The IOTA simple rules could be applied in 119 cases (79.3%) and were inconclusive in 31 (20.7%), whereas subjective assessment could be applied in all cases (100%). The sensitivity and specificity of the IOTA simple rules and subjective assessment were not significantly different: 82.9% vs 86.7% and 94.0% vs 94.3%, respectively. Agreement between the two methods was high, with a kappa index of 0.835. Both techniques had high diagnostic performance in differentiating benign from malignant ovarian masses, but the IOTA rules had a relatively high rate of inconclusive results. The IOTA rules can be used as an effective screening technique by general gynecologists, but when the results are inconclusive they should consult experienced sonographers.
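The decision logic of the IOTA simple rules, including why a fifth of cases come out inconclusive, can be sketched as follows. The rules classify by counting benign (B) and malignant (M) ultrasound features; the feature counting itself is assumed to be done upstream by the examiner.

```python
# Sketch of the IOTA simple-rules decision logic: a mass with one or more
# M-features and no B-features is classified malignant; one or more
# B-features and no M-features, benign; any other combination (both kinds
# present, or neither) is inconclusive, which is why expert referral is needed.

def iota_simple_rules(n_benign_features, n_malignant_features):
    if n_malignant_features > 0 and n_benign_features == 0:
        return "malignant"
    if n_benign_features > 0 and n_malignant_features == 0:
        return "benign"
    return "inconclusive"

print(iota_simple_rules(2, 0))  # benign
print(iota_simple_rules(0, 1))  # malignant
print(iota_simple_rules(1, 1))  # inconclusive
```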
Sainz de Murieta, Iñaki; Rodríguez-Patón, Alfonso
2012-08-01
Despite the many designs of devices operating via DNA strand displacement, surprisingly none is explicitly devoted to the implementation of logical deductions. The present article introduces a new model of biosensor device that uses nucleic acid strands to encode simple rules such as "IF DNA_strand(1) is present THEN disease(A)" or "IF DNA_strand(1) AND DNA_strand(2) are present THEN disease(B)". Taking advantage of the strand displacement operation, our model makes these simple rules interact with input signals (either DNA or any type of RNA) to generate an output signal (in the form of nucleotide strands). This output signal represents a diagnosis, which can either be measured using FRET techniques, be cascaded as the input of another logical deduction with different rules, or even be a drug administered in response to a set of symptoms. The encoding introduces an implicit error-cancellation mechanism, which increases the system's scalability, enabling longer inference cascades with a bounded and controllable signal-to-noise relation. It also allows the same rule to be used in forward or backward inference, providing the option of validly outputting negated propositions (e.g. "diagnosis A excluded"). The models presented in this paper can be used to implement smart logical DNA devices that perform genetic diagnosis in vitro. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
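The rule semantics the device realizes in molecules can be mimicked in software as simple forward chaining over present signals. The rule and signal names below are illustrative placeholders taken from the abstract's examples, not actual sequences from the paper.

```python
# Toy forward-chaining sketch of the device's rule semantics:
# "IF DNA_strand(1) is present THEN disease(A)" and
# "IF DNA_strand(1) AND DNA_strand(2) are present THEN disease(B)".
# A rule fires when all strands in its antecedent are present.

RULES = [
    ({"DNA_strand_1"}, "disease_A"),
    ({"DNA_strand_1", "DNA_strand_2"}, "disease_B"),
]

def diagnose(present_signals):
    """Return the set of diagnoses whose antecedents are fully satisfied."""
    return {output for antecedent, output in RULES
            if antecedent <= present_signals}

print(diagnose({"DNA_strand_1"}))                          # only disease_A
print(sorted(diagnose({"DNA_strand_1", "DNA_strand_2"})))  # both rules fire
```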
Wittenberg, Philipp; Gan, Fah Fatt; Knoth, Sven
2018-04-17
The variable life-adjusted display (VLAD) is the first risk-adjusted graphical procedure proposed in the literature for monitoring the performance of a surgeon. It displays the cumulative sum of expected minus observed deaths. It has since become highly popular because the statistic plotted is easy to understand. But it is also easy to misinterpret a surgeon's performance by utilizing the VLAD, potentially leading to grave consequences. The problem of misinterpretation is essentially caused by the variance of the VLAD's statistic that increases with sample size. In order for the VLAD to be truly useful, a simple signaling rule is desperately needed. Various forms of signaling rules have been developed, but they are usually quite complicated. Without signaling rules, making inferences using the VLAD alone is difficult if not misleading. In this paper, we establish an equivalence between a VLAD with V-mask and a risk-adjusted cumulative sum (RA-CUSUM) chart based on the difference between the estimated probability of death and surgical outcome. Average run length analysis based on simulation shows that this particular RA-CUSUM chart has similar performance as compared to the established RA-CUSUM chart based on the log-likelihood ratio statistic obtained by testing the odds ratio of death. We provide a simple design procedure for determining the V-mask parameters based on a resampling approach. Resampling from a real data set ensures that these parameters can be estimated appropriately. Finally, we illustrate the monitoring of a real surgeon's performance using VLAD with V-mask. Copyright © 2018 John Wiley & Sons, Ltd.
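The plotted statistic itself, the cumulative sum of expected minus observed deaths, is simple to compute. The sketch below shows only that statistic (none of the paper's V-mask or RA-CUSUM machinery), with invented case data: each case contributes its estimated death probability minus its binary outcome.

```python
# Sketch of the VLAD statistic: the running cumulative sum of expected minus
# observed deaths, where each case contributes p_i (estimated probability of
# death) minus y_i (1 = death, 0 = survival).

def vlad(cases):
    """cases: iterable of (p_i, y_i) pairs; returns the running VLAD path."""
    total, path = 0.0, []
    for p, y in cases:
        total += p - y
        path.append(total)
    return path

# Four low-risk survivals then one low-risk death: the path drifts up by 0.1
# per survival, then drops by 0.9 at the death.
print(vlad([(0.1, 0), (0.1, 0), (0.1, 0), (0.1, 0), (0.1, 1)]))
```

The growing variance that motivates the paper's signaling rule is visible here: each case adds a Bernoulli-sized fluctuation, so raw excursions of the path mean less as the case count grows.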
A semi-automatic computer-aided method for surgical template design
NASA Astrophysics Data System (ADS)
Chen, Xiaojun; Xu, Lu; Yang, Yue; Egger, Jan
2016-02-01
This paper presents a generalized integrated framework of semi-automatic surgical template design. Several algorithms were implemented, including mesh segmentation, offset surface generation, collision detection, ruled surface generation, etc., and a special software named TemDesigner was developed. With a simple user interface, a customized template can be semi-automatically designed according to the preoperative plan. Firstly, mesh segmentation with signed scalar of vertex is utilized to partition the inner surface from the input surface mesh based on the indicated point loop. Then, the offset surface of the inner surface is obtained through contouring the distance field of the inner surface, and segmented to generate the outer surface. Ruled surface is employed to connect inner and outer surfaces. Finally, drilling tubes are generated according to the preoperative plan through collision detection and merging. It has been applied to the template design for various kinds of surgeries, including oral implantology, cervical pedicle screw insertion, iliosacral screw insertion and osteotomy, demonstrating the efficiency, functionality and generality of our method.
New Solar Cell Is More Efficient, Less Costly | News | NREL
How to Design and Present Texts to Cultivate Balanced Regional Images in Geography Education
ERIC Educational Resources Information Center
Lee, Dong-Min; Ryu, Jaemyong
2013-01-01
This article examines possibilities associated with the cultivation of balanced regional images via the use of simple methods. Two experiments based on the primacy effect and the painting picture rule, or visual depiction of regions, were conducted. The results show significant differences in the formation of regional images. More specifically,…
Model-based Systems Engineering: Creation and Implementation of Model Validation Rules for MOS 2.0
NASA Technical Reports Server (NTRS)
Schmidt, Conrad K.
2013-01-01
Model-based Systems Engineering (MBSE) is an emerging modeling application that is used to enhance the system development process. MBSE allows for the centralization of project and system information that would otherwise be stored in extraneous locations, yielding better communication, expedited document generation and increased knowledge capture. Based on MBSE concepts and the employment of the Systems Modeling Language (SysML), extremely large and complex systems can be modeled from conceptual design through all system lifecycles. The Operations Revitalization Initiative (OpsRev) seeks to leverage MBSE to modernize the aging Advanced Multi-Mission Operations Systems (AMMOS) into the Mission Operations System 2.0 (MOS 2.0). The MOS 2.0 will be delivered in a series of conceptual and design models and documents built using the modeling tool MagicDraw. To ensure model completeness and cohesiveness, it is imperative that the MOS 2.0 models adhere to the specifications, patterns and profiles of the Mission Service Architecture Framework, thus leading to the use of validation rules. This paper outlines the process by which validation rules are identified, designed, implemented and tested. Ultimately, these rules provide the ability to maintain model correctness and synchronization in a simple, quick and effective manner, thus allowing the continuation of project and system progress.
Adaptive Critic-based Neurofuzzy Controller for the Steam Generator Water Level
NASA Astrophysics Data System (ADS)
Fakhrazari, Amin; Boroushaki, Mehrdad
2008-06-01
In this paper, an adaptive critic-based neurofuzzy controller is presented for water level regulation in nuclear steam generators. The problem has been of great concern for many years, as the steam generator is a highly nonlinear system exhibiting inverse-response dynamics, especially at low operating power levels. Fuzzy critic-based learning is a reinforcement learning method based on dynamic programming. The only information available to the critic agent is the system feedback, which is interpreted as the last action the controller performed in the previous state. The signal produced by the critic agent is used alongside the backpropagation-of-error algorithm to tune online the conclusion parts of the fuzzy inference rules. The critic agent here has a proportional-derivative structure, and the fuzzy rule base has nine rules. The proposed controller shows satisfactory transient response, disturbance rejection, and robustness to model uncertainty. Its simple design procedure and structure nominate it as a suitable controller design for steam generator water level control in the nuclear power plant industry.
Modular design of synthetic gene circuits with biological parts and pools.
Marchisio, Mario Andrea
2015-01-01
Synthetic gene circuits can be designed in an electronic fashion by displaying their basic components (Standard Biological Parts and Pools of molecules) on the computer screen and connecting them with hypothetical wires. This procedure, achieved by our add-on for the software ProMoT, was successfully applied to bacterial circuits. Recently, we have extended this design methodology to eukaryotic cells. Here, highly complex components such as promoters and Pools of mRNA contain hundreds of species and reactions whose calculation demands a rule-based modeling approach. We showed how to build such complex modules via the joint employment of the software BioNetGen (rule-based modeling) and ProMoT (modularization). In this chapter, we illustrate how to utilize our computational tool for synthetic biology with the in silico implementation of a simple eukaryotic gene circuit that performs the logic AND operation.
Wang, Pengfei; Wu, Siyu; Tian, Cheng; Yu, Guimei; Jiang, Wen; Wang, Guansong; Mao, Chengde
2016-10-11
Current tile-based DNA self-assembly produces simple repetitive or highly symmetric structures. In the case of 2D lattices, the unit cell often contains only one basic tile because the tiles often are symmetric (in terms of either the backbone or the sequence). In this work, we have applied retrosynthetic analysis to determine the minimal asymmetric units for complex DNA nanostructures. Such analysis guides us to break the intrinsic structural symmetries of the tiles to achieve high structural complexities. This strategy has led to the construction of several DNA nanostructures that are not accessible from conventional symmetric tile designs. Along with previous studies, herein we have established a set of four fundamental rules regarding tile-based assembly. Such rules could serve as guidelines for the design of DNA nanostructures.
ERIC Educational Resources Information Center
Nahavandi, Naemeh; Mukundan, Jayakaran
2013-01-01
The present study investigated the impact of textual input enhancement and explicit rule presentation on 93 Iranian EFL learners' intake of simple past tense. Three intact general English classes in Tabriz Azad University were randomly assigned to: 1) a control group; 2) a TIE group; and 3) a TIE plus explicit rule presentation group. All…
Minimizing Significant Figure Fuzziness.
ERIC Educational Resources Information Center
Fields, Lawrence D.; Hawkes, Stephen J.
1986-01-01
Addresses the principles and problems associated with the use of significant figures. Explains uncertainty, the meaning of significant figures, the Simple Rule, the Three Rule, and the 1-5 Rule. Also provides examples of the Rules. (ML)
Simple modification of Oja rule limits L1-norm of weight vector and leads to sparse connectivity.
Aparin, Vladimir
2012-03-01
This letter describes a simple modification of the Oja learning rule, which asymptotically constrains the L1-norm of an input weight vector instead of the L2-norm as in the original rule. This constraining is local as opposed to commonly used instant normalizations, which require the knowledge of all input weights of a neuron to update each one of them individually. The proposed rule converges to a weight vector that is sparser (has more zero weights) than the vector learned by the original Oja rule with or without the zero bound, which could explain the developmental synaptic pruning.
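For context, the original Oja rule being modified can be sketched in a few lines; the abstract does not spell out the L1 modification itself, so the code below implements only the classic rule, whose local update asymptotically drives the L2-norm of the weight vector to 1. The input statistics are invented for the demonstration.

```python
import random

# Original Oja learning rule (the baseline the letter modifies):
#   y = w . x
#   w <- w + eta * y * (x - y * w)
# The -eta * y^2 * w term acts as a local normalization that asymptotically
# constrains ||w||_2 to 1 while w aligns with the principal component.

def oja_step(w, x, eta=0.01):
    y = sum(wi * xi for wi, xi in zip(w, x))
    return [wi + eta * y * (xi - y * wi) for wi, xi in zip(w, x)]

rng = random.Random(42)
w = [0.3, 0.3]
for _ in range(5000):
    # inputs drawn mostly along the (1.0, 0.2) direction, plus small noise
    s = rng.gauss(0.0, 1.0)
    x = [s * 1.0 + rng.gauss(0.0, 0.05), s * 0.2 + rng.gauss(0.0, 0.05)]
    w = oja_step(w, x)

norm = sum(wi * wi for wi in w) ** 0.5
print(f"||w||_2 ~ {norm:.2f}")  # settles near 1 after learning
```

The letter's point is that constraining the L1-norm instead, while keeping the update local, drives some weights exactly to zero, yielding the sparse connectivity described above.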
Ten simple rules for Lightning and PechaKucha presentations.
NASA Astrophysics Data System (ADS)
Lortie, C. J.
2016-12-01
An interesting opportunity has emerged that bridges the gap between lengthy, detailed presentations of scientific findings and `sound bites' appropriate for media reporting - very short presentations often presented in sets. Lightning or Ignite (20 slides @15 seconds each) and PechaKucha (20 slides @20 seconds each) presentations are common formats for short, rapid communications at scientific conferences and public events. The simple rules for making good presentations also apply, but these presentation formats provide both unique communication opportunities and novel challenges. In the spirit of light, quick, and exact (but without the fox), here are ten simple rules for presentation formats that do not wait for the speaker.
Scheeline, Alexander
2017-10-01
Designing a spectrometer requires knowledge of the problem to be solved and of the molecules whose properties will contribute to a solution of that problem, as well as skill in many subfields of science and engineering. A seemingly simple problem, the design of an ultraviolet, visible, and near-infrared spectrometer, is used to show the reasoning behind the trade-offs in instrument design. Rather than reporting a fully optimized instrument, the Yin and Yang of design choices, leading to decisions about financial cost, materials choice, resolution, throughput, aperture, and layout, are described. To limit scope, aspects such as grating blaze, electronics design, and light sources are not presented. The review illustrates the mixture of mathematical rigor, rule of thumb, esthetics, and availability of components that contributes to the art of spectrometer design.
A Left-Hand Rule for Faraday's Law
ERIC Educational Resources Information Center
Salu, Yehuda
2014-01-01
A left-hand rule for Faraday's law is presented here. This rule provides a simple and quick way of finding directional relationships between variables of Faraday's law without using Lenz's rule.
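As a numeric companion to the directional rule, the magnitude side of Faraday's law is a one-liner. The hand rule supplies the direction of the induced current; the function below just evaluates the signed emf, with illustrative coil numbers.

```python
# Faraday's law in magnitude form: emf = -N * dPhi/dt. The left-/right-hand
# rule determines the direction of the induced current; this function only
# evaluates the signed emf for a coil of N turns.

def faraday_emf(n_turns, d_flux_Wb, dt_s):
    return -n_turns * d_flux_Wb / dt_s

# A 100-turn coil whose flux rises by 0.01 Wb over 0.1 s:
print(faraday_emf(100, 0.01, 0.1))  # -10.0 (volts)
```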
Simple and accurate sum rules for highly relativistic systems
NASA Astrophysics Data System (ADS)
Cohen, Scott M.
2005-03-01
In this paper, I consider the Bethe and Thomas-Reiche-Kuhn sum rules, which together form the foundation of Bethe's theory of energy loss from fast charged particles to matter. For nonrelativistic target systems, the use of closure leads directly to simple expressions for these quantities. In the case of relativistic systems, on the other hand, the calculation of sum rules is fraught with difficulties. Various perturbative approaches have been used over the years to obtain relativistic corrections, but these methods fail badly when the system in question is very strongly bound. Here, I present an approach that leads to relatively simple expressions yielding accurate sums, even for highly relativistic many-electron systems. I also offer an explanation for the difference between relativistic and nonrelativistic sum rules in terms of the Zitterbewegung of the electrons.
Strategies for Pre-Emptive Mid-Air Collision Avoidance in Budgerigars
Schiffner, Ingo; Srinivasan, Mandyam V.
2016-01-01
We have investigated how birds avoid mid-air collisions during head-on encounters. Trajectories of birds flying towards each other in a tunnel were recorded using high speed video cameras. Analysis and modelling of the data suggest two simple strategies for collision avoidance: (a) each bird veers to its right and (b) each bird changes its altitude relative to the other bird according to a preset preference. Both strategies suggest simple rules by which collisions can be avoided in head-on encounters by two agents, be they animals or machines. The findings are potentially applicable to the design of guidance algorithms for automated collision avoidance on aircraft. PMID:27680488
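The two inferred strategies are simple enough to state as an agent rule. The sketch below encodes them with illustrative magnitudes (the step sizes and the +-1 preference encoding are invented, not parameters from the study).

```python
# Sketch of the two avoidance rules inferred from the budgerigar data:
# (a) each bird veers to its own right, and (b) each bird shifts altitude
# according to a preset individual preference relative to the other bird.
# Step magnitudes are illustrative placeholders.

def avoidance_maneuver(altitude_preference, veer_step=1.0, climb_step=0.5):
    """Return (lateral, vertical) adjustments in the bird's own frame.

    altitude_preference: +1 to pass above the other bird, -1 to pass below.
    """
    lateral = veer_step                           # rule (a): always veer right
    vertical = climb_step * altitude_preference   # rule (b): preset preference
    return lateral, vertical

print(avoidance_maneuver(+1))  # veer right and climb
print(avoidance_maneuver(-1))  # veer right and descend
```

Because both agents veer right in their own frames, the rule is symmetric and needs no communication, which is what makes it attractive for automated collision avoidance.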
Occupational exposure decisions: can limited data interpretation training help improve accuracy?
Logan, Perry; Ramachandran, Gurumurthy; Mulhausen, John; Hewett, Paul
2009-06-01
Accurate exposure assessments are critical for ensuring that potentially hazardous exposures are properly identified and controlled. The availability and accuracy of exposure assessments can determine whether resources are appropriately allocated to engineering and administrative controls, medical surveillance, personal protective equipment and other programs designed to protect workers. A desktop study was performed using videos, task information and sampling data to evaluate the accuracy and potential bias of participants' exposure judgments. Desktop exposure judgments were obtained from occupational hygienists for material handling jobs with small air sampling data sets (0-8 samples) and without the aid of computers. In addition, data interpretation tests (DITs) were administered to participants, in which they were asked to estimate the 95th percentile of an underlying log-normal exposure distribution from small data sets. Participants received exposure data interpretation, or "rule of thumb", training, which included a simple set of rules for estimating 95th percentiles of small data sets from a log-normal population. The DIT was given to each participant before and after the rule of thumb training. Results of each DIT and the qualitative and quantitative exposure judgments were compared with a reference judgment obtained through a Bayesian probabilistic analysis of the sampling data to investigate overall judgment accuracy and bias. There were a total of 4386 participant-task-chemical judgments across all data collections: 552 qualitative judgments made without sampling data and 3834 quantitative judgments with sampling data. The DITs and quantitative judgments were significantly better than random chance and much improved by the rule of thumb training. In addition, the rule of thumb training reduced the amount of bias in the DITs and quantitative judgments. The mean DIT percent-correct scores increased from 47 to 64% after the rule of thumb training (P < 0.001). The accuracy of quantitative desktop judgments increased from 43 to 63% correct after the rule of thumb training (P < 0.001). The rule of thumb training did not significantly affect accuracy for qualitative desktop judgments. The finding that even simple statistical rules of thumb significantly improve judgment accuracy suggests that hygienists should routinely use statistical tools when making exposure judgments from monitoring data.
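A rule of thumb of the kind described can be sketched as follows. The abstract does not give the exact rules taught, so the code below shows one standard estimator for the 95th percentile of a log-normal distribution from a small sample, exp(mean(ln x) + 1.645 sd(ln x)), equivalently GM * GSD**1.645; treat it as an assumed stand-in.

```python
import math
import statistics

# One standard small-sample rule for a log-normal 95th percentile (assumed
# stand-in for the study's taught rules): take logs, compute mean and sample
# standard deviation, then X95 = exp(mean + 1.645 * sd), since 1.645 is the
# standard-normal 95th-percentile z-score.

def lognormal_p95(samples):
    logs = [math.log(x) for x in samples]
    return math.exp(statistics.mean(logs) + 1.645 * statistics.stdev(logs))

# Exposure values whose logs are exactly 0, 1, 2 (mean 1, sample sd 1),
# so the estimate is exp(1 + 1.645) = exp(2.645).
exposures = [math.e ** 0, math.e ** 1, math.e ** 2]
print(lognormal_p95(exposures))
```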
Optimal two-phase sampling design for comparing accuracies of two binary classification rules.
Xu, Huiping; Hui, Siu L; Grannis, Shaun
2014-02-10
In this paper, we consider the design for comparing the performance of two binary classification rules, for example, two record linkage algorithms or two screening tests. Statistical methods are well developed for comparing these accuracy measures when the gold standard is available for every unit in the sample, or in a two-phase study when the gold standard is ascertained only in the second phase in a subsample using a fixed sampling scheme. However, these methods do not attempt to optimize the sampling scheme to minimize the variance of the estimators of interest. In comparing the performance of two classification rules, the parameters of primary interest are the difference in sensitivities, specificities, and positive predictive values. We derived the analytic variance formulas for these parameter estimates and used them to obtain the optimal sampling design. The efficiency of the optimal sampling design is evaluated through an empirical investigation that compares the optimal sampling with simple random sampling and with proportional allocation. Results of the empirical study show that the optimal sampling design is similar for estimating the difference in sensitivities and in specificities, and both achieve a substantial amount of variance reduction with an over-sample of subjects with discordant results and under-sample of subjects with concordant results. A heuristic rule is recommended when there is no prior knowledge of individual sensitivities and specificities, or the prevalence of the true positive findings in the study population. The optimal sampling is applied to a real-world example in record linkage to evaluate the difference in classification accuracy of two matching algorithms. Copyright © 2013 John Wiley & Sons, Ltd.
Tanaka, Takuma; Aoyagi, Toshio; Kaneko, Takeshi
2012-10-01
We propose a new principle for replicating receptive field properties of neurons in the primary visual cortex. We derive a learning rule for a feedforward network, which maintains a low firing rate for the output neurons (resulting in temporal sparseness) and allows only a small subset of the neurons in the network to fire at any given time (resulting in population sparseness). Our learning rule also sets the firing rates of the output neurons at each time step to near-maximum or near-minimum levels, resulting in neuronal reliability. The learning rule is simple enough to be written in spatially and temporally local forms. After the learning stage is performed using input image patches of natural scenes, output neurons in the model network are found to exhibit simple-cell-like receptive field properties. When the outputs of these simple-cell-like neurons are input to another model layer using the same learning rule, the second-layer output neurons after learning become less sensitive to the phase of gratings than the simple-cell-like input neurons. In particular, some of the second-layer output neurons become completely phase invariant, owing to the convergence of connections from first-layer neurons with similar orientation selectivity onto second-layer neurons in the model network. We examine the parameter dependencies of the receptive field properties of the model neurons after learning and discuss their biological implications. We also show that the localized learning rule is consistent with experimental results concerning neuronal plasticity and can replicate the receptive fields of simple and complex cells.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erlenwein, P.; Frisch, W.; Kafka, P.
Nuclear reactors of 200- to 400-MW(thermal) power for district heating are the subject of increasing interest, and several specific designs are under discussion today. In the Federal Republic of Germany (FRG), Kraftwerk Union AG has presented a 200-MW(thermal) heating reactor concept. The main safety issues of this design are assessed. In this design, the primary system is fully integrated into the reactor pressure vessel (RPV), which is tightly enclosed by the containment. The low process parameters, such as pressure, temperature, and power density, and the high ratio of coolant volume to thermal power allow the design of simple safety features. This is supported by a preference for passive over active components. A special feature is a newly designed hydraulic control and rod drive mechanism, which is also integrated into the RPV. Within the safety assessment, an overview of the relevant FRG safety rules and guidelines, developed mainly for large, electricity-generating power plants, is given. Included is a discussion of the extent to which these licensing rules can be applied to the concept of heating reactors.
Charge-density-shear-moduli relationships in aluminum-lithium alloys.
Eberhart, M
2001-11-12
Using the first principles full-potential linear-augmented-Slater-type orbital technique, the energies and charge densities of aluminum and aluminum-lithium supercells have been computed. The experimentally observed increase in aluminum's shear moduli upon alloying with lithium is argued to be the result of predictable changes to aluminum's total charge density, suggesting that simple rules may allow the alloy designer to predict the effects of dilute substitutional elements on alloy elastic response.
A Simple Demonstration of a General Rule for the Variation of Magnetic Field with Distance
ERIC Educational Resources Information Center
Kodama, K.
2009-01-01
We describe a simple experiment demonstrating the variation in magnitude of a magnetic field with distance. The method described requires only an ordinary magnetic compass and a permanent magnet. The proposed graphical analysis illustrates a unique method for deducing a general rule of magnetostatics. (Contains 1 table and 6 figures.)
NASA Astrophysics Data System (ADS)
Maneri, E.; Gawronski, W.
1999-10-01
The linear quadratic Gaussian (LQG) design algorithms described in [2] and [5] have been used in the controller design of JPL's beam-waveguide [5] and 70-m [6] antennas. This algorithm significantly improves tracking precision in a windy environment. This article describes the graphical user interface (GUI) software for the design of LQG controllers. It consists of two parts: the basic LQG design and the fine-tuning of the basic design using a constrained optimization algorithm. The GUI was developed to simplify the design process, to make it user-friendly, and to enable users with a limited control engineering background to design an LQG controller. The user manipulates the GUI sliders and radio buttons and watches the antenna performance; simple rules are given on the GUI display.
NASA Technical Reports Server (NTRS)
1972-01-01
A long life assurance program for the development of design, process, test, and application guidelines for achieving reliable spacecraft hardware was conducted. The study approach consisted of a review of technical data performed concurrently with a survey of the aerospace industry. The data reviewed included design and operating characteristics, failure histories and solutions, and similar documents. The topics covered by the guidelines are reported. It is concluded that long life hardware is achieved through meticulous attention to many details and no simple set of rules can suffice.
Retrieving and Indexing Spatial Data in the Cloud Computing Environment
NASA Astrophysics Data System (ADS)
Wang, Yonggang; Wang, Sheng; Zhou, Daliang
To address the drawbacks of spatial data storage on common Cloud Computing platforms, we design and present a framework for retrieving, indexing, accessing, and managing spatial data in the Cloud environment. An interoperable spatial data object model is provided, based on the OGC Simple Feature coding rules such as Well Known Binary (WKB) and Well Known Text (WKT). Classic spatial indexing algorithms such as the Quad-Tree and R-Tree are redesigned for the Cloud Computing environment. Finally, we develop a prototype based on Google App Engine to implement the proposed model.
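As a rough illustration of the OGC Simple Feature encodings this abstract mentions, the following sketch encodes and decodes a 2-D point in WKT and WKB form. It is a minimal illustration for points only, not the framework described in the paper:

```python
import struct

def point_wkt(x, y):
    """Encode a 2-D point in OGC Well Known Text."""
    return f"POINT ({x:g} {y:g})"

def point_wkb(x, y):
    """Encode a 2-D point in OGC Well Known Binary (little-endian).

    Layout: one byte-order flag (1 = little-endian), a uint32
    geometry type (1 = Point), then two IEEE-754 doubles for x and y.
    """
    return struct.pack("<BIdd", 1, 1, x, y)

def parse_point_wkb(buf):
    """Decode a point produced by point_wkb."""
    order, gtype, x, y = struct.unpack("<BIdd", buf)
    assert order == 1 and gtype == 1
    return x, y
```

A full implementation would dispatch on the geometry-type field to handle linestrings, polygons, and big-endian payloads as well.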
A random walk rule for phase I clinical trials.
Durham, S D; Flournoy, N; Rosenberger, W F
1997-06-01
We describe a family of random walk rules for the sequential allocation of dose levels to patients in a dose-response study, or phase I clinical trial. Patients are sequentially assigned the next higher, same, or next lower dose level according to some probability distribution, which may be determined by ethical considerations as well as the patient's response. It is shown that one can choose these probabilities in order to center dose level assignments unimodally around any target quantile of interest. Estimation of the quantile is discussed; the maximum likelihood estimator and its variance are derived under a two-parameter logistic distribution, and the maximum likelihood estimator is compared with other nonparametric estimators. Random walk rules have clear advantages: they are simple to implement, and finite and asymptotic distribution theory is completely worked out. For a specific random walk rule, we compute finite and asymptotic properties and give examples of its use in planning studies. Having the finite distribution theory available and tractable obviates the need for elaborate simulation studies to analyze the properties of the design. The small sample properties of our rule, as determined by exact theory, compare favorably to those of the continual reassessment method, determined by simulation.
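One well-known member of this family is the biased-coin up-and-down design. The sketch below is an illustration under assumed dose levels and toxicity probabilities, not the authors' exact rule: after a toxicity the next patient moves one dose level down; after no toxicity the walk moves up with a probability chosen so that assignments center near the dose whose toxicity rate equals the target quantile:

```python
import random

def biased_coin_walk(tox_prob, target=1/3, n_patients=30, start=0, seed=1):
    """Allocate patients to dose levels with a biased-coin random walk.

    tox_prob: assumed toxicity probability at each dose level.
    After a toxicity, step one level down; after no toxicity, step up
    with probability target/(1 - target) (for target < 0.5), else stay.
    Returns the sequence of dose levels assigned.
    """
    rng = random.Random(seed)
    b = target / (1 - target)
    dose, path = start, []
    for _ in range(n_patients):
        path.append(dose)
        toxic = rng.random() < tox_prob[dose]
        if toxic:
            dose = max(dose - 1, 0)          # step down after toxicity
        elif rng.random() < b:
            dose = min(dose + 1, len(tox_prob) - 1)  # step up sometimes
    return path

# Hypothetical five-level dose grid with rising toxicity:
path = biased_coin_walk([0.05, 0.15, 0.33, 0.55, 0.80])
```

Because each transition is at most one level, the finite-state walk's stationary distribution is tractable, which is the tractability advantage the abstract highlights.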
Replication initiatives will not salvage the trustworthiness of psychology.
Coyne, James C
2016-05-31
Replication initiatives in psychology continue to gather considerable attention from far outside the field, as well as controversy from within. Some accomplishments of these initiatives are noted, but this article focuses on why they do not provide a general solution for what ails psychology. There are inherent limitations to mass replications ever being conducted in many areas of psychology, both in terms of their practicality and their prospects for improving the science. Unnecessary compromises were built into the ground rules for design and publication of the Open Science Collaboration: Psychology that undermine its effectiveness. Some ground rules could actually be flipped into guidance for how not to conduct replications. Greater adherence to best publication practices, transparency in the design and publishing of research, strengthening of independent post-publication peer review and firmer enforcement of rules about data sharing and declarations of conflict of interest would make many replications unnecessary. Yet, it has been difficult to move beyond simple endorsement of these measures to consistent implementation. Given the strong institutional support for questionable publication practices, progress will depend on effective individual and collective use of social media to expose lapses and demand reform. Some recent incidents highlight the necessity of this.
The design and implementation of EPL: An event pattern language for active databases
NASA Technical Reports Server (NTRS)
Giuffrida, G.; Zaniolo, C.
1994-01-01
The growing demand for intelligent information systems requires closer coupling of rule-based reasoning engines, such as CLIPS, with advanced data base management systems (DBMS). For instance, several commercial DBMS now support the notion of triggers that monitor events and transactions occurring in the database and fire induced actions, which perform a variety of critical functions, including safeguarding the integrity of data, monitoring access, and recording volatile information needed by administrators, analysts, and expert systems to perform assorted tasks; examples of these tasks include security enforcement, market studies, knowledge discovery, and link analysis. At UCLA, we designed and implemented the event pattern language (EPL) which is capable of detecting and acting upon complex patterns of events which are temporally related to each other. For instance, a plant manager should be notified when a certain pattern of overheating repeats itself over time in a chemical process; likewise, proper notification is required when a suspicious sequence of bank transactions is executed within a certain time limit. The EPL prototype is built in CLIPS to operate on top of Sybase, a commercial relational DBMS, where actions can be triggered by events such as simple database updates, insertions, and deletions. The rule-based syntax of EPL allows the sequences of goals in rules to be interpreted as sequences of temporal events; each goal can correspond to either (1) a simple event, or (2) a (possibly negated) event/condition predicate, or (3) a complex event defined as the disjunction and repetition of other events. Various extensions have been added to CLIPS in order to tailor the interface with Sybase and its open client/server architecture.
Aktipis, C. Athena
2011-01-01
The evolution of cooperation through partner choice mechanisms is often thought to involve relatively complex cognitive abilities. Using agent-based simulations I model a simple partner choice rule, the ‘Walk Away’ rule, where individuals stay in groups that provide higher returns (by virtue of having more cooperators), and ‘Walk Away’ from groups providing low returns. Implementing this conditional movement rule in a public goods game leads to a number of interesting findings: 1) cooperators have a selective advantage when thresholds are high, corresponding to low tolerance for defectors, 2) high thresholds lead to high initial rates of movement and low final rates of movement (after selection), and 3) as cooperation is selected, the population undergoes a spatial transition from high migration (and many small, ephemeral groups) to low migration (and large, stable groups). These results suggest that the very simple ‘Walk Away’ rule of leaving uncooperative groups can favor the evolution of cooperation, and that cooperation can evolve in populations in which individuals are able to move in response to local social conditions. A diverse array of organisms is able to leave degraded physical or social environments. The ubiquitous nature of conditional movement suggests that ‘Walk Away’ dynamics may play an important role in the evolution of social behavior in both cognitively complex and cognitively simple organisms. PMID:21666771
Rule-governed behavior: teaching a preliminary repertoire of rule-following to children with autism.
Tarbox, Jonathan; Zuckerman, Carrie K; Bishop, Michele R; Olive, Melissa L; O'Hora, Denis P
2011-01-01
Rule-governed behavior is generally considered an integral component of complex verbal repertoires but has rarely been the subject of empirical research. In particular, little or no previous research has attempted to establish rule-governed behavior in individuals who do not already display the repertoire. This study consists of two experiments that evaluated multiple exemplar training procedures for teaching a simple component skill, which may be necessary for developing a repertoire of rule-governed behavior. In both experiments, children with autism were taught to respond to simple rules that specified antecedents and the behaviors that should occur in their presence. In the first experiment, participants were taught to respond to rules containing "if/then" statements, where the antecedent was specified before the behavior. The second experiment was a replication and extension of the first, varying the manner in which rules were presented. Both experiments eventually demonstrated generalization to novel rules for all participants; however, variations to the standard procedure were required for several participants. Results suggest that rule-following can be analyzed and taught as generalized operant behavior; implications for future research are discussed.
Ten simple rules for making research software more robust
2017-01-01
Software produced for research, published and otherwise, suffers from a number of common problems that make it difficult or impossible to run outside the original institution or even off the primary developer’s computer. We present ten simple rules to make such software robust enough to be run by anyone, anywhere, and thereby delight your users and collaborators. PMID:28407023
Learning Problem-Solving Rules as Search Through a Hypothesis Space.
Lee, Hee Seung; Betts, Shawn; Anderson, John R
2016-07-01
Learning to solve a class of problems can be characterized as a search through a space of hypotheses about the rules for solving these problems. A series of four experiments studied how different learning conditions affected the search among hypotheses about the solution rule for a simple computational problem. Experiment 1 showed that a problem property such as computational difficulty of the rules biased the search process and so affected learning. Experiment 2 examined the impact of examples as instructional tools and found that their effectiveness was determined by whether they uniquely pointed to the correct rule. Experiment 3 compared verbal directions with examples and found that both could guide search. The final experiment tried to improve learning by using more explicit verbal directions or by adding scaffolding to the example. While both manipulations improved learning, learning still took the form of a search through a hypothesis space of possible rules. We describe a model that embodies two assumptions: (1) the instruction can bias the rules participants hypothesize rather than directly be encoded into a rule; (2) participants do not have memory for past wrong hypotheses and are likely to retry them. These assumptions are realized in a Markov model that fits all the data by estimating two sets of probabilities. First, the learning condition induced one set of Start probabilities of trying various rules. Second, should this first hypothesis prove wrong, the learning condition induced a second set of Choice probabilities of considering various rules. These findings broaden our understanding of effective instruction and provide implications for instructional design. Copyright © 2015 Cognitive Science Society, Inc.
Robson, Barry; Mushlin, Richard
2004-01-01
The physician and researcher must ultimately be able to combine qualitative and quantitative features from a variety of combinations of observations on data with many component items (i.e., many dimensions), and hence reach simple conclusions about interpretation, rational courses of action, and design. The first paper of this series noted that such needs are challenging the classical means of using statistics, and therefore proposed a Generalized Theory of Expected Information, or "Zeta Theory". The conjoint event [a,b,c,...] is seen as a rule of association for a,b,c,... with a rule strength I(a;b;c;...) = xi(s,o[a,b,c,...]) - xi(s,e[a,b,c,...]), where xi is the incomplete zeta function, o[a,b,c,...] is the observed, and e[a,b,c,...] the expected, frequency of occurrence of the conjoint event [a,b,c,...]. The present paper explores how output from this approach might be assembled in a form better suited for decision support. A related difficulty is that covariance and multivariance were previously rendered as "fuzzy associations" so that the output would take a form similar to the true associations, but this was a somewhat ad hoc approach in which only the final I( ) had any meaning. Users at clinical research sites subsequently requested an alternative approach in which "effective frequencies" o[ ] and e[ ], calculated from the above variances and used to evaluate I( ), give some intuitive feeling analogous to the association treatment, and this is explored here. Though the present paper is theoretical, real examples are used to illustrate application. One clinical-genomic example illustrates experimental design by identifying data that are, or are not, statistically germane to the study.
We also report on some impressions based on applying these techniques in studies of real, extensive patient record data which are now emerging, as well as on molecular design data originally studied in part to test the ability to deduce the effects of simple natural patient sequence variations ("SNPs") on patient protein activity. On the basis of these study experiences, methods of rationalizing and condensing the rules implied by associations and variances between data, as well as discussion of the difficulty of what is meant by "condensed", are presented in the Appendix.
ERIC Educational Resources Information Center
Minkiewicz, Piotr; Darewicz, Malgorzata; Iwaniak, Anna
2018-01-01
A simple equation to calculate the oxidation states (oxidation numbers) of individual atoms in molecules and ions may be introduced instead of rules associated with words alone. The equation includes two of three categories of bonds, classified as proposed by Goodstein: number of bonds with more electronegative atoms and number of bonds with less…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-18
... solicit comments on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1). \\2\\ 17 CFR..., all tied hedge transactions (regardless of whether the option order is a simple or complex order) are... simple order the execution of the option leg of a tied hedge transaction does not qualify it for any NBBO...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-27
... solicit comments on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1). \\2\\ 17 CFR.... Purpose The purpose of the proposed rule change is to increase certain Simple Order Fees for Removing... market. Section I Amendments The Exchange proposes to amend the Simple Order fees in Section I, Part A of...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-16
... comments on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1). \\2\\ 17 CFR 240.19b-4... applicable to simple orders in the options class under Exchange Rule 6.42--Minimum Increments of Bids and..., with the increment of trading being the standard trading increment applicable to simple orders in the...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-01
... to solicit comments on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1). \\2... recently filed a rule change to amend its transaction fees and rebates for simple,\\6\\ non-complex orders.... \\6\\ C2 defines simple orders to exclude ETFs and indexes. \\7\\ See Securities Exchange Act Release No...
NASA Astrophysics Data System (ADS)
Gehrmann, Andreas; Nagai, Yoshimitsu; Yoshida, Osamu; Ishizu, Syohei
Because management decision-making has become complex and the preferences of decision-makers frequently become inconsistent, multi-attribute decision-making problems have been studied. To represent inconsistent preference relations, the concept of an evaluation structure was introduced; evaluation structures allow simple rules to be generated that represent inconsistent preference relations. Rough set theory for preference relations has also been studied, and the concept of approximation was introduced. One of the main aims of this paper is to introduce the concept of a rough evaluation structure for representing inconsistent preference relations. We apply rough set theory to the evaluation structure and develop a method for generating simple rules for inconsistent preference relations. We introduce the concepts of a totally ordered information system, the similarity class of a preference relation, and upper and lower approximations of preference relations. We also show the properties of the rough evaluation structure and provide a simple example. As an application, we analyze a questionnaire survey of customer preferences for audio players.
Profitability of simple technical trading rules of Chinese stock exchange indexes
NASA Astrophysics Data System (ADS)
Zhu, Hong; Jiang, Zhi-Qiang; Li, Sai-Ping; Zhou, Wei-Xing
2015-12-01
Although technical trading rules have been widely used by practitioners in financial markets, their profitability still remains controversial. We here investigate the profitability of moving average (MA) and trading range break (TRB) rules by using the Shanghai Stock Exchange Composite Index (SHCI) from May 21, 1992 through December 31, 2013 and Shenzhen Stock Exchange Component Index (SZCI) from April 3, 1991 through December 31, 2013. The t-test is adopted to check whether the mean returns which are conditioned on the trading signals are significantly different from unconditioned returns and whether the mean returns conditioned on the buy signals are significantly different from the mean returns conditioned on the sell signals. We find that TRB rules outperform MA rules and short-term variable moving average (VMA) rules outperform long-term VMA rules. By applying White's Reality Check test and accounting for the data snooping effects, we find that the best trading rule outperforms the buy-and-hold strategy when transaction costs are not taken into consideration. Once transaction costs are included, trading profits will be eliminated completely. Our analysis suggests that simple trading rules like MA and TRB cannot beat the standard buy-and-hold strategy for the Chinese stock exchange indexes.
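A minimal sketch of the kind of MA rule studied is shown below. This is an illustration only, not the authors' exact specification, which covers multiple MA and TRB variants, significance tests, and transaction costs:

```python
def moving_average_signals(prices, window=3):
    """Generate +1 (long) / -1 (out or short) signals from a simple
    moving average rule: go long when the price closes above its
    trailing moving average, exit or go short when it closes below.
    Returns one signal per day from day `window` onward."""
    signals = []
    for t in range(window, len(prices)):
        ma = sum(prices[t - window:t]) / window  # trailing MA, excludes today
        signals.append(1 if prices[t] > ma else -1)
    return signals

# Toy series: rally then decline triggers one buy, then sells.
print(moving_average_signals([1, 2, 3, 4, 3, 2, 1], window=3))  # [1, -1, -1, -1]
```

A backtest would then compare the mean return on buy-signal days against sell-signal days, which is essentially the conditioning the abstract's t-tests perform.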
Simple heuristics and rules of thumb: where psychologists and behavioural biologists might meet.
Hutchinson, John M C; Gigerenzer, Gerd
2005-05-31
The Centre for Adaptive Behaviour and Cognition (ABC) has hypothesised that much human decision-making can be described by simple algorithmic process models (heuristics). This paper explains this approach and relates it to research in biology on rules of thumb, which we also review. As an example of a simple heuristic, consider the lexicographic strategy of Take The Best for choosing between two alternatives: cues are searched in turn until one discriminates, then search stops and all other cues are ignored. Heuristics consist of building blocks, and building blocks exploit evolved or learned abilities such as recognition memory; it is the complexity of these abilities that allows the heuristics to be simple. Simple heuristics have an advantage in making decisions fast and with little information, and in avoiding overfitting. Furthermore, humans are observed to use simple heuristics. Simulations show that the statistical structures of different environments affect which heuristics perform better, a relationship referred to as ecological rationality. We contrast ecological rationality with the stronger claim of adaptation. Rules of thumb from biology provide clearer examples of adaptation because animals can be studied in the environments in which they evolved. The range of examples is also much more diverse. To investigate them, biologists have sometimes used similar simulation techniques to ABC, but many examples depend on empirically driven approaches. ABC's theoretical framework can be useful in connecting some of these examples, particularly the scattered literature on how information from different cues is integrated. Optimality modelling is usually used to explain less detailed aspects of behaviour but might more often be redirected to investigate rules of thumb.
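The lexicographic Take The Best strategy described in this abstract can be sketched in a few lines. The cue names and binary values below are hypothetical, in the style of the classic city-size comparison task:

```python
def take_the_best(cues_a, cues_b, cue_order):
    """Lexicographic Take The Best: check cues in (validity) order;
    the first cue that discriminates decides the choice, and all
    later cues are ignored.  Cue values are 1 (present) or 0
    (absent).  Returns 'A', 'B', or 'guess' if no cue discriminates."""
    for cue in cue_order:
        a, b = cues_a[cue], cues_b[cue]
        if a != b:                      # first discriminating cue: stop search
            return "A" if a > b else "B"
    return "guess"                      # no cue discriminates

# Hypothetical cities compared on hypothetical cues:
city_a = {"capital": 1, "intl_airport": 1}
city_b = {"capital": 0, "intl_airport": 1}
choice = take_the_best(city_a, city_b, ["intl_airport", "capital"])
```

The stopping rule is the point of the heuristic: once "capital" discriminates, any remaining cues are never consulted, which is what makes the decision fast and frugal.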
Jonnalagadda, Siddhartha Reddy; Li, Dingcheng; Sohn, Sunghwan; Wu, Stephen Tze-Inn; Wagholikar, Kavishwar; Torii, Manabu; Liu, Hongfang
2012-01-01
This paper describes the coreference resolution system submitted by Mayo Clinic for the 2011 i2b2/VA/Cincinnati shared task Track 1C. The goal of the task was to construct a system that links the markables corresponding to the same entity. The task organizers provided progress notes and discharge summaries that were annotated with the markables of treatment, problem, test, person, and pronoun. We used a multi-pass sieve algorithm that applies deterministic rules in the order of preciseness and simultaneously gathers information about the entities in the documents. Our system, MedCoref, also uses a state-of-the-art machine learning framework as an alternative to the final, rule-based pronoun resolution sieve. The best system that uses a multi-pass sieve has an overall score of 0.836 (average of B(3), MUC, Blanc, and CEAF F score) for the training set and 0.843 for the test set. A supervised machine learning system that typically uses a single function to find coreferents cannot accommodate irregularities encountered in data especially given the insufficient number of examples. On the other hand, a completely deterministic system could lead to a decrease in recall (sensitivity) when the rules are not exhaustive. The sieve-based framework allows one to combine reliable machine learning components with rules designed by experts. Using relatively simple rules, part-of-speech information, and semantic type properties, an effective coreference resolution system could be designed. The source code of the system described is available at https://sourceforge.net/projects/ohnlp/files/MedCoref.
Transport Properties of Complex Oxides: New Ideas and Insights from Theory and Simulation
NASA Astrophysics Data System (ADS)
Benedek, Nicole
Complex oxides are one of the largest and most technologically important materials families. The ABO3 perovskite oxides in particular display an unparalleled variety of physical properties. The microscopic origin of these properties (how they arise from the structure of the material) is often complicated, but in many systems previous research has identified simple guidelines or `rules of thumb' that link structure and chemistry to the physics of interest. For example, the tolerance factor is a simple empirical measure that relates the composition of a perovskite to its tendency to adopt a distorted structure. First-principles calculations have shown that the tendency towards ferroelectricity increases systematically as the tolerance factor of the perovskite decreases. Can we uncover a similar set of simple guidelines to yield new insights into the ionic and thermal transport properties of perovskites? I will discuss recent research from my group on the link between crystal structure and chemistry, soft phonons and ionic transport in a family of layered perovskite oxides, the Ln2NiO4+δ Ruddlesden-Popper phases. In particular, we show how the lattice dynamical properties of these materials (their tendency to undergo certain structural distortions) can be correlated with oxide ion transport properties. Ultimately, we seek new ways to understand the microscopic origins of complex transport processes and to develop first-principles-based design rules for new materials based on our understanding.
NASA Technical Reports Server (NTRS)
Kettig, R. L.
1975-01-01
A method of classification of digitized multispectral images is developed and experimentally evaluated on actual earth resources data collected by aircraft and satellite. The method is designed to exploit the characteristic dependence between adjacent states of nature that is neglected by the more conventional simple-symmetric decision rule, so that contextual information is incorporated into the classification scheme. The principal reason for doing this is to improve the accuracy of the classification. For general types of dependence this would require more computation per resolution element than the simple-symmetric classifier. But when the dependence occurs in the form of redundancy, the elements can be classified collectively, in groups, thereby reducing the number of classifications required.
Ontogeny of collective behavior reveals a simple attraction rule.
Hinz, Robert C; de Polavieja, Gonzalo G
2017-02-28
The striking patterns of collective animal behavior, including ant trails, bird flocks, and fish schools, can result from local interactions among animals without centralized control. Several such rules of interaction have been proposed, but it has proven difficult to discriminate which ones are implemented in nature. As a method to better discriminate among interaction rules, we propose to follow the slow birth of a rule of interaction during animal development. Specifically, we followed the development of zebrafish, Danio rerio, and found that larvae turn toward each other from 7 days postfertilization and increase the intensity of interactions until 3 weeks. This developmental dataset allows testing the parameter-free predictions of a simple rule in which animals attract each other part of the time, with attraction defined as turning toward another animal chosen at random. This rule makes each individual likely to move toward high densities of conspecifics, and moving groups naturally emerge. The development of attraction strength corresponds to an increase in the time spent in attraction behavior. Adults were found to follow the same attraction rule, suggesting a potential significance for adults of other species.
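The attraction rule can be sketched as a toy simulation. All parameters below are illustrative, not fitted to the zebrafish data: part of the time each agent moves toward one conspecific chosen at random, otherwise it jitters randomly, and the group contracts:

```python
import random

def simulate_attraction(n=20, steps=200, p_attract=0.5, step_frac=0.1, seed=0):
    """Toy 2-D simulation of the attraction rule: with probability
    p_attract an agent moves a fraction step_frac of the way toward
    a conspecific chosen uniformly at random (occasionally itself,
    a simplification); otherwise it takes a small random step.
    Returns the group's mean squared spread before and after."""
    rng = random.Random(seed)
    pos = [(rng.uniform(0, 100), rng.uniform(0, 100)) for _ in range(n)]

    def spread(ps):
        cx = sum(x for x, _ in ps) / len(ps)
        cy = sum(y for _, y in ps) / len(ps)
        return sum((x - cx) ** 2 + (y - cy) ** 2 for x, y in ps) / len(ps)

    start = spread(pos)
    for _ in range(steps):
        new = []
        for x, y in pos:
            if rng.random() < p_attract:
                tx, ty = pos[rng.randrange(n)]          # random conspecific
                new.append((x + step_frac * (tx - x),
                            y + step_frac * (ty - y)))  # move toward it
            else:
                new.append((x + rng.uniform(-1, 1),
                            y + rng.uniform(-1, 1)))    # random jitter
        pos = new
    return start, spread(pos)
```

Even this crude rule produces the qualitative outcome the abstract reports: agents accumulate at high local densities, so the spread after simulation is far below its initial value.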
Code of Federal Regulations, 2010 CFR
2010-04-01
...) Application of 10-percent shareholder test to interest paid to a simple trust or grantor trust. Whether interest paid to a simple trust or grantor trust and distributed to or included in the gross income of a... 26 Internal Revenue 9 2010-04-01 2010-04-01 false Rules relating to repeal of tax on interest of...
Calendar methods of fertility regulation: a rule of thumb.
Colombo, B; Scarpa, B
1996-01-01
"[Many] illiterate women, particularly in the third world, find [it] difficult to apply usual calendar methods for the regulation of fertility. Some of them are even unable to make simple subtractions. In this paper we are therefore trying to evaluate the applicability and the efficiency of an extremely simple rule which entails only [the ability to count] a number of days, and always the same way." (SUMMARY IN ITA) excerpt
Silvestre, Liliane; Martins, Wellington P; Candido-Dos-Reis, Francisco J
2015-07-29
This study describes the accuracy of three-dimensional power Doppler (3D-PD) angiography as a secondary method for differential diagnosis of ovarian tumors. Seventy-five women scheduled for surgical removal of adnexal masses were assessed by transvaginal ultrasound. Ovarian tumors were classified by IOTA simple rules and two three-dimensional blocks were recorded. In a second-step analysis, a 4-cm³ spherical sample was obtained from the most highly vascularized solid area of each stored block. Vascularization index (VI), flow index (FI) and vascularization-flow index (VFI) were calculated. Repeatability was assessed by concordance correlation coefficient (CCC) and limits of agreement (LoA), and diagnostic accuracy by area under the ROC curve. IOTA simple rules classified 26 cases as benign, nine as inconclusive and 40 as malignant. There were eight false positives and no false negatives. Among the masses classified as inconclusive or malignant by IOTA simple rules, the CCCs were 0.91 for VI, 0.70 for FI, and 0.86 for VFI. The areas under the ROC curve were 0.82 for VI, 0.67 for FI and 0.81 for VFI. 3D-PD angiography presented considerable intraobserver variability and low accuracy for identifying false positive results of IOTA simple rules.
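For readers unfamiliar with the repeatability statistic used here, Lin's concordance correlation coefficient can be computed from paired measurements as below. This is the generic textbook formula, not code from this study:

```python
def concordance_cc(x, y):
    """Lin's concordance correlation coefficient for paired measurements:
    2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * cov / (vx + vy + (mx - my) ** 2)
```

Unlike Pearson's correlation, the CCC penalizes any systematic offset between the two measurement series, which is what makes it suitable for intraobserver repeatability.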
ERIC Educational Resources Information Center
Kupperman, Joel J.
1978-01-01
Explores the use of the concept of inhibition in moral philosophy. Argues that there are strong practical reasons for basing moral teaching on simple moral rules and for inculcating inhibitions about breaking these rules. (Author)
New QCD sum rules based on canonical commutation relations
NASA Astrophysics Data System (ADS)
Hayata, Tomoya
2012-04-01
A new derivation of QCD sum rules by canonical commutators is developed. It is a simple and straightforward generalization of the Thomas-Reiche-Kuhn sum rule, based on the Kugo-Ojima operator formalism of a non-abelian gauge theory and a suitable subtraction of UV divergences. By applying the method to the vector and axial-vector currents in QCD, the exact Weinberg sum rules are examined. Vector current sum rules and new fractional power sum rules are also discussed.
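For context, the Thomas-Reiche-Kuhn sum rule that this work generalizes can be stated in its textbook nonrelativistic form (this is background, not the paper's QCD expression):

```latex
\sum_n f_{n0} \;=\; \sum_n \frac{2m}{\hbar^2}\,(E_n - E_0)\,\bigl|\langle n|\hat{x}|0\rangle\bigr|^2 \;=\; N
```

for a system of N electrons of mass m, where the oscillator strengths f_{n0} follow from the canonical commutator [x, p] = iħ. The paper's approach replaces this quantum-mechanical commutator algebra with the canonical commutators of the gauge-fixed QCD fields.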
Analyzing compound and project progress through multi-objective-based compound quality assessment.
Nissink, J Willem M; Degorce, Sébastien
2013-05-01
Compound-quality scoring methods designed to evaluate multiple drug properties concurrently are useful to analyze and prioritize output from drug-design efforts. However, formalized multiparameter optimization approaches are not widely used in drug design. We rank molecules synthesized in drug-discovery projects using simple and aggregated desirability functions reflecting medicinal chemistry 'rules'. Our quality score deals transparently with missing data, a key requirement in drug-hunting projects where data availability is often limited. We further estimate confidence in the interpretation of such a compound-quality measure. Scores and associated confidences provide systematic insight into the quality of emerging chemical equity. Tracking the quality of synthetic output over time yields valuable insight into the progress of drug-design teams, with potential applications in risk and resource management of a drug portfolio.
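A minimal sketch of such a score, assuming Derringer-style desirability functions and a geometric-mean aggregate; the property names, thresholds, and missing-data treatment below are hypothetical, and the paper's actual functions may differ:

```python
import math

def desirability_high(value, low, high):
    """Linear 'higher is better' desirability: 0 below low, 1 above high."""
    if value <= low:
        return 0.0
    if value >= high:
        return 1.0
    return (value - low) / (high - low)

def quality_score(props, rules):
    """Geometric mean of per-property desirabilities; properties with
    missing values are skipped, and the fraction of rules actually
    scored is returned as a simple confidence measure."""
    scores = [fn(props[name]) for name, fn in rules.items()
              if props.get(name) is not None]
    if not scores:
        return None, 0.0
    gm = math.exp(sum(math.log(max(s, 1e-9)) for s in scores) / len(scores))
    return gm, len(scores) / len(rules)

# Hypothetical property rules, purely for illustration.
rules = {"solubility_uM": lambda v: desirability_high(v, 1.0, 100.0),
         "potency_pIC50": lambda v: desirability_high(v, 5.0, 8.0)}
score, confidence = quality_score(
    {"solubility_uM": 100.0, "potency_pIC50": None}, rules)
```

Returning the coverage fraction alongside the score is one transparent way to handle missing data: a high score backed by few measured properties can then be flagged as low confidence.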
A study of some nine-element decision rules. [for multispectral recognition of remote sensing
NASA Technical Reports Server (NTRS)
Richardson, W.
1974-01-01
A nine-element rule is one that makes a classification decision for each pixel based on data from that pixel and its eight immediate neighbors. Three such rules, all fast and simple to use, are defined and tested. All performed substantially better on field interiors than the best one-point rule. Qualitative results indicate that fine detail and contradictory testimony tend to be overlooked by the rules.
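As an illustration of the nine-element idea (though not one of the three rules actually tested, which operate on the multispectral data itself rather than on labels), a post-classification majority vote over each pixel's 3x3 neighborhood can be sketched as:

```python
from collections import Counter

def smooth_labels(labels):
    """Reclassify each interior pixel by majority vote over its 3x3
    neighborhood; border pixels are left unchanged."""
    rows, cols = len(labels), len(labels[0])
    out = [row[:] for row in labels]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            votes = Counter(labels[rr][cc]
                            for rr in (r - 1, r, r + 1)
                            for cc in (c - 1, c, c + 1))
            out[r][c] = votes.most_common(1)[0][0]
    return out

# A lone misclassified pixel in a homogeneous field is voted away.
field = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
cleaned = smooth_labels(field)
```

This also makes the paper's caveat concrete: the same voting that cleans up field interiors will overrule fine detail and "contradictory testimony" from isolated but correct pixels.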
Fuzzy logic-based flight control system design
NASA Astrophysics Data System (ADS)
Nho, Kyungmoon
The application of fuzzy logic to aircraft motion control is studied in this dissertation. The self-tuning fuzzy techniques are developed by changing input scaling factors to obtain a robust fuzzy controller over a wide range of operating conditions and nonlinearities for a nonlinear aircraft model. It is demonstrated that the properly adjusted input scaling factors can meet the required performance and robustness in a fuzzy controller. For a simple demonstration of the easy design and control capability of a fuzzy controller, a proportional-derivative (PD) fuzzy control system is compared to the conventional controller for a simple dynamical system. This thesis also describes the design principles and stability analysis of fuzzy control systems by considering the key features of a fuzzy control system including the fuzzification, rule-base and defuzzification. The wing-rock motion of slender delta wings, a linear aircraft model and the six degree of freedom nonlinear aircraft dynamics are considered to illustrate several self-tuning methods employing change in input scaling factors. Finally, this dissertation is concluded with numerical simulation of glide-slope capture in windshear demonstrating the robustness of the fuzzy logic based flight control system.
Models of Quantitative Estimations: Rule-Based and Exemplar-Based Processes Compared
ERIC Educational Resources Information Center
von Helversen, Bettina; Rieskamp, Jorg
2009-01-01
The cognitive processes underlying quantitative estimations vary. Past research has identified task-contingent changes between rule-based and exemplar-based processes (P. Juslin, L. Karlsson, & H. Olsson, 2008). B. von Helversen and J. Rieskamp (2008), however, proposed a simple rule-based model--the mapping model--that outperformed the…
A hybrid nonlinear programming method for design optimization
NASA Technical Reports Server (NTRS)
Rajan, S. D.
1986-01-01
Solutions to engineering design problems formulated as nonlinear programming (NLP) problems usually require the use of more than one optimization technique. Moreover, the interaction between the user (analysis/synthesis) program and the NLP system can lead to interface, scaling, or convergence problems. An NLP solution system is presented that seeks to solve these problems by providing a programming system to ease the user-system interface. A simple set of rules is used to select an optimization technique or to switch from one technique to another in an attempt to detect, diagnose, and solve some potential problems. Numerical examples involving finite element based optimal design of space trusses and rotor bearing systems are used to illustrate the applicability of the proposed methodology.
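A rule of the kind described, switch techniques when progress stalls, can be sketched as follows; the stall threshold, the gradient-descent "techniques", and the one-dimensional setting are all illustrative assumptions, not the paper's actual NLP system:

```python
def gd_step(lr):
    """One gradient-descent step with a numerical derivative."""
    def step(f, x, h=1e-6):
        g = (f(x + h) - f(x - h)) / (2 * h)
        return x - lr * g
    return step

def hybrid_minimize(f, x0, techniques, tol=1e-8, stall=1e-3):
    """Run each technique in turn; switch to the next when a step fails
    to improve f by at least a relative stall threshold."""
    x, fx = x0, f(x0)
    name = None
    for name, one_step in techniques:
        for _ in range(100):
            x_new = one_step(f, x)
            f_new = f(x_new)
            if fx - f_new < stall * max(abs(fx), 1.0):
                break                    # stalled: hand off to next technique
            x, fx = x_new, f_new
            if abs(fx) < tol:
                return x, fx, name       # converged
    return x, fx, name

techniques = [("coarse-gd", gd_step(0.1)), ("fine-gd", gd_step(0.01))]
x_best, f_best, used = hybrid_minimize(lambda x: x * x, 3.0, techniques)
```

The point of the sketch is the control logic, not the optimizers: a small, explicit rule set decides when to keep iterating and when to hand the problem to a different method.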
FASTBUS Slaves: a designer's view
DOE Office of Scientific and Technical Information (OSTI.GOV)
Downing, R.W.
1981-10-01
Although FASTBUS has features built into it which allow complex interconnections and multiple Masters, the rules for implementing Slaves are very simple. The first-time designer of Slave Modules should not be intimidated by the 200 pages of the FASTBUS document. About 90% of the specification is associated with system implications that do not impact Slave design. This paper will review the basic logic and timing requirements for FASTBUS Slave design. Also, some examples of implementation will be shown. The discussion which follows assumes that mastership of the bus has been gained. Bus arbitration, system interconnection, message routing, etc. are separate topics and will not be discussed here. These topics affect only the design of devices which operate at the system level, since FASTBUS Slave modules have been specified to be completely transparent to these system considerations.
Less can be more: How to make operations more flexible and robust with fewer resources
NASA Astrophysics Data System (ADS)
Haksöz, Çağrı; Katsikopoulos, Konstantinos; Gigerenzer, Gerd
2018-06-01
We review empirical evidence from practice and general theoretical conditions, under which simple rules of thumb can help to make operations flexible and robust. An operation is flexible when it responds adaptively to adverse events such as natural disasters; an operation is robust when it is less affected by adverse events in the first place. We illustrate the relationship between flexibility and robustness in the context of supply chain risk. In addition to increasing flexibility and robustness, simple rules simultaneously reduce the need for resources such as time, money, information, and computation. We illustrate the simple-rules approach with an easy-to-use graphical aid for diagnosing and managing supply chain risk. More generally, we recommend a four-step process for determining the amount of resources that decision makers should invest in so as to increase flexibility and robustness.
Adding Only One Priority Rule Allows Extending CIP Rules to Supramolecular Systems.
Alkorta, Ibon; Elguero, José; Cintas, Pedro
2015-05-01
There are frequent situations, both in supramolecular chemistry and in crystallography, that result in stereogenic centers whose absolute configuration needs to be specified. With this aim, we propose the inclusion of one simple additional rule in the Cahn-Ingold-Prelog (CIP) system of priority rules, stating that noncovalent interactions are assigned a fictitious atomic number between 0 and 1. © 2015 Wiley Periodicals, Inc.
Li, Dingcheng; Sohn, Sunghwan; Wu, Stephen Tze-Inn; Wagholikar, Kavishwar; Torii, Manabu; Liu, Hongfang
2012-01-01
Objective: This paper describes the coreference resolution system submitted by Mayo Clinic for the 2011 i2b2/VA/Cincinnati shared task Track 1C. The goal of the task was to construct a system that links the markables corresponding to the same entity. Materials and methods: The task organizers provided progress notes and discharge summaries that were annotated with the markables of treatment, problem, test, person, and pronoun. We used a multi-pass sieve algorithm that applies deterministic rules in the order of preciseness and simultaneously gathers information about the entities in the documents. Our system, MedCoref, also uses a state-of-the-art machine learning framework as an alternative to the final, rule-based pronoun resolution sieve. Results: The best system that uses a multi-pass sieve has an overall score of 0.836 (average of B3, MUC, Blanc, and CEAF F score) for the training set and 0.843 for the test set. Discussion: A supervised machine learning system that typically uses a single function to find coreferents cannot accommodate irregularities encountered in data, especially given an insufficient number of examples. On the other hand, a completely deterministic system could lead to a decrease in recall (sensitivity) when the rules are not exhaustive. The sieve-based framework allows one to combine reliable machine learning components with rules designed by experts. Conclusion: Using relatively simple rules, part-of-speech information, and semantic type properties, an effective coreference resolution system can be designed. The source code of the system described is available at https://sourceforge.net/projects/ohnlp/files/MedCoref. PMID:22707745
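The multi-pass sieve idea, deterministic rules applied from most to least precise, with each pass building on the clusters formed by earlier passes, can be sketched as follows. The two sieves shown are simplified stand-ins, not MedCoref's actual rules:

```python
def exact_match(m1, m2):
    return m1["text"].lower() == m2["text"].lower()

def head_match(m1, m2):
    return m1["text"].lower().split()[-1] == m2["text"].lower().split()[-1]

def multi_pass(mentions, sieves):
    """Apply sieves from most to least precise; each pass may link a
    mention to an earlier one, merging their clusters, so later (less
    precise) sieves see the decisions of earlier ones."""
    cluster = list(range(len(mentions)))      # one cluster id per mention
    for sieve in sieves:
        for i, m in enumerate(mentions):
            for j in range(i):
                if cluster[i] != cluster[j] and sieve(mentions[j], m):
                    old = cluster[i]
                    cluster = [cluster[j] if c == old else c for c in cluster]
                    break
    return cluster

mentions = [{"text": "the patient"},
            {"text": "The patient"},
            {"text": "this patient"}]
clusters = multi_pass(mentions, [exact_match, head_match])
```

Ordering sieves by precision is the key design choice: high-precision links are made first, so the noisier sieves can only extend clusters that are already reliable.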
Congestion relaxation due to density-dependent junction rules in TASEP network
NASA Astrophysics Data System (ADS)
Tannai, Takahiro; Nishinari, Katsuhiro
2017-09-01
We consider a small network module of the totally asymmetric simple exclusion process (TASEP) with branching and aggregation points, where the junction rules depend on the densities of the segments of the network module. We focus on the interaction between the branching and aggregation junctions. The interaction among junctions with density-dependent rules is more complex than with the density-independent rules studied in previous papers. We confirm that density-dependent rules enable vehicles to move more efficiently than density-independent rules.
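A toy version of such a module can be sketched as follows; the update order, the injection and removal probabilities, and the specific density-dependent priority rule are illustrative assumptions rather than the paper's model:

```python
import random

def step(a, b, c, alpha, beta, rng, density_dependent=True):
    """One sweep of a toy TASEP module in which input segments a and b
    merge into output segment c (a sketch, not a faithful update scheme)."""
    # Bulk hops, scanned right to left so each particle moves at most once.
    for seg in (a, b, c):
        for i in range(len(seg) - 2, -1, -1):
            if seg[i] == 1 and seg[i + 1] == 0:
                seg[i], seg[i + 1] = 0, 1
    # Junction: the last cells of a and b compete for the first cell of c.
    if c[0] == 0:
        candidates = [s for s in (a, b) if s[-1] == 1]
        if candidates:
            if density_dependent:
                # Density-dependent rule: the denser input gets priority.
                src = max(candidates, key=lambda s: sum(s) / len(s))
            else:
                src = rng.choice(candidates)
            src[-1], c[0] = 0, 1
    # Open boundaries: inject into a and b, extract from the end of c.
    if a[0] == 0 and rng.random() < alpha:
        a[0] = 1
    if b[0] == 0 and rng.random() < alpha:
        b[0] = 1
    if c[-1] == 1 and rng.random() < beta:
        c[-1] = 0

rng = random.Random(1)
a, b, c = [0] * 10, [0] * 10, [0] * 10
for _ in range(200):
    step(a, b, c, alpha=0.5, beta=0.5, rng=rng)
```

Comparing long-run currents with `density_dependent` switched on and off is the kind of experiment the paper's conclusion refers to.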
A rule of unity for human intestinal absorption 3: Application to pharmaceuticals.
Patel, Raj B; Yalkowsky, Samuel H
2018-02-01
The rule of unity is based on a simple absorption parameter, Π, that can accurately predict whether an orally administered drug will be well absorbed or poorly absorbed. The intrinsic aqueous solubility and octanol-water partition coefficient, along with the drug dose, are used to calculate Π. We show that a single delineator value for Π exists that can distinguish whether a drug is likely to be well absorbed (FA ≥ 0.5) or poorly absorbed (FA < 0.5) at any specified dose. The model is shown to give 82.5% correct predictions for over 938 pharmaceuticals. The maximum well-absorbed dose (i.e. the maximum dose that will be more than 50% absorbed) calculated using this model can be utilized as a guideline for drug design and synthesis. Copyright © 2017 John Wiley & Sons, Ltd.
Nonconservative dynamics in long atomic wires
NASA Astrophysics Data System (ADS)
Cunningham, Brian; Todorov, Tchavdar N.; Dundas, Daniel
2014-09-01
The effect of nonconservative current-induced forces on the ions in a defect-free metallic nanowire is investigated using both steady-state calculations and dynamical simulations. Nonconservative forces were found to have a major influence on the ion dynamics in these systems, but their role in increasing the kinetic energy of the ions decreases with increasing system length. The results illustrate the importance of nonconservative effects in short nanowires and the scaling of these effects with system size. The dependence on bias and ion mass can be understood with the help of a simple pen-and-paper model. This material highlights the benefit of simple preliminary steady-state calculations in anticipating aspects of brute-force dynamical simulations, and provides rule-of-thumb criteria for the design of stable quantum wires.
A Neuromorphic Architecture for Object Recognition and Motion Anticipation Using Burst-STDP
Balduzzi, David; Tononi, Giulio
2012-01-01
In this work we investigate the possibilities offered by a minimal framework of artificial spiking neurons to be deployed in silico. Here we introduce a hierarchical network architecture of spiking neurons which learns to recognize moving objects in a visual environment and determine the correct motor output for each object. These tasks are learned through both supervised and unsupervised spike timing dependent plasticity (STDP). STDP is responsible for the strengthening (or weakening) of synapses in relation to pre- and post-synaptic spike times and has been described as a Hebbian paradigm taking place both in vitro and in vivo. We utilize a variation of STDP learning, called burst-STDP, which is based on the notion that, since spikes are expensive in terms of energy consumption, strong bursting activity carries more information than single (sparse) spikes. Furthermore, this learning algorithm takes advantage of homeostatic renormalization, which has been hypothesized to promote memory consolidation during NREM sleep. Using this learning rule, we design a spiking neural network architecture capable of object recognition, motion detection, attention towards important objects, and motor control outputs. We demonstrate the abilities of our design in a simple environment with distractor objects, multiple objects moving concurrently, and in the presence of noise. Most importantly, we show how this neural network is capable of performing these tasks using a simple leaky-integrate-and-fire (LIF) neuron model with binary synapses, making it fully compatible with state-of-the-art digital neuromorphic hardware designs. As such, the building blocks and learning rules presented in this paper appear promising for scalable fully neuromorphic systems to be implemented in hardware chips. PMID:22615855
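The LIF-with-binary-synapses building block mentioned above can be sketched in a few lines; the time constant and threshold are arbitrary illustrative values, and the burst-STDP learning rule itself is omitted:

```python
def lif_run(input_spikes, weights, v_thresh=1.0, v_rest=0.0, tau=10.0, dt=1.0):
    """Leaky integrate-and-fire neuron with binary synapses: each step the
    membrane potential leaks toward rest, active synapses add their
    weights, and crossing threshold emits a spike and resets."""
    v = v_rest
    out_spikes = []
    for t, active in enumerate(input_spikes):   # active presynaptic indices
        v += (v_rest - v) * (dt / tau)          # leak
        v += sum(weights[i] for i in active)    # synaptic input
        if v >= v_thresh:
            out_spikes.append(t)
            v = v_rest                          # reset after spiking
    return out_spikes
```

With binary synapses the weights take only two values (here 0.0 or a fixed efficacy), which is what makes the model map directly onto digital neuromorphic hardware.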
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-04
... solicit comments on the proposed rule change from interested persons. [1] 15 U.S.C. 78s(b)(1). [2] 17 CFR... Executive Officer in Rule 2(c) represents a simple oversight in the 2006 amendments and seeks to correct it... investors and the public interest by allowing CHX to amend its rules to permit any Officer of the Exchange...
Karystianis, George; Thayer, Kristina; Wolfe, Mary; Tsafnat, Guy
2017-06-01
Most data extraction efforts in epidemiology are focused on obtaining targeted information from clinical trials. In contrast, limited research has been conducted on the identification of information from observational studies, a major source for human evidence in many fields, including environmental health. The recognition of key epidemiological information (e.g., exposures) through text mining techniques can assist in the automation of systematic reviews and other evidence summaries. We designed and applied a knowledge-driven, rule-based approach to identify targeted information (study design, participant population, exposure, outcome, confounding factors, and the country where the study was conducted) from abstracts of epidemiological studies included in several systematic reviews of environmental health exposures. The rules were based on common syntactical patterns observed in text and are thus not specific to any systematic review. To validate the general applicability of our approach, we compared the data extracted using our approach with hand curation for 35 epidemiological study abstracts manually selected for inclusion in two systematic reviews. The returned F-score, precision, and recall ranged from 70% to 98%, 81% to 100%, and 54% to 97%, respectively. The highest precision was observed for exposure, outcome and population (100%) while recall was best for exposure and study design with 97% and 89%, respectively. The lowest recall was observed for the population (54%), which also had the lowest F-score (70%). Our text-mining approach demonstrated encouraging performance for the identification of targeted information from observational epidemiological study abstracts related to environmental exposures.
We have demonstrated that rules based on generic syntactic patterns in one corpus can be applied to other observational study designs by simply interchanging the dictionaries used to identify certain characteristics (i.e., outcomes, exposures). At the document level, the recognised information can assist in the selection and categorization of studies included in a systematic review. Copyright © 2017 Elsevier Inc. All rights reserved.
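The interchangeable-dictionary idea can be sketched as follows; the dictionaries and the "conducted in <Country>" pattern are invented examples, not the authors' actual rules:

```python
import re

# Invented dictionaries; in the approach described above these are the
# swappable parts, while the syntactic patterns stay fixed.
EXPOSURES = {"arsenic", "lead", "benzene"}
DESIGNS = {"cohort", "case-control", "cross-sectional"}

def extract(abstract):
    """Return the dictionary terms found plus a country captured by a
    simple syntactic pattern."""
    text = abstract.lower()
    result = {
        "exposure": sorted(term for term in EXPOSURES if term in text),
        "design": sorted(term for term in DESIGNS if term in text),
        "country": None,
    }
    m = re.search(r"conducted in ([A-Z][a-z]+)", abstract)
    if m:
        result["country"] = m.group(1)
    return result
```

Retargeting the extractor to a new review then means editing only the term sets, not the pattern logic.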
Forty years of Clar's aromatic π-sextet rule
Solà, Miquel
2013-01-01
In 1972 Erich Clar formulated his aromatic π-sextet rule, which allows one to discuss qualitatively the aromatic character of benzenoid species. Now, 40 years later, Clar's aromatic π-sextet rule is still a source of inspiration for many chemists. This simple rule has been validated both experimentally and theoretically. In this review, we select particular examples to highlight the achievements of Clar's aromatic π-sextet rule in many situations, and we discuss two recent successful cases of its application. PMID:24790950
Design for an 8 Meter Monolithic UV/OIR Space Telescope
NASA Technical Reports Server (NTRS)
Stahl, H. Philip; Postman, Marc; Hornsby, Linda; Hopkins, Randall; Mosier, Gary E.; Pasquale, Bert A.; Arnold, William R.
2009-01-01
ATLAST-8 is an 8-meter monolithic UV/optical/NIR space observatory to be placed in orbit at Sun-Earth L2 by NASA's planned Ares V cargo launch vehicle. The ATLAST-8 will yield fundamental astronomical breakthroughs. The mission concept utilizes two enabling technologies: planned Ares-V launch vehicle (scheduled for 2019) and autonomous rendezvous and docking (AR&D). The unprecedented Ares-V payload and mass capacity enables the use of a massive, monolithic, thin-meniscus primary mirror - similar to a VLT or Subaru. Furthermore, it enables simple robust design rules to mitigate cost, schedule and performance risk. AR&D enables on-orbit servicing, extending mission life and enhancing science return.
The design of hypersonic waveriders for aero-assisted interplanetary trajectories
NASA Technical Reports Server (NTRS)
Lewis, Mark J.; Mcronald, Angus D.
1991-01-01
The aerodynamic performance of a vehicle designed to execute an aerogravity assisted maneuver, which combines a gravitational turn with a low-drag atmosphere pass, is examined. The advantage of the aerogravity assisted maneuver, as opposed to a more traditional gravity-assist trajectory, is that, through the use of a controlled atmospheric flight, nearly any deflection angle around a gravitating body can be realized. This holds the promise of providing extremely large values of Delta V. The success of such a maneuver depends on being able to design a vehicle which can execute sustained atmospheric flight at Mach numbers in the range of 50 - 100 with minimal drag losses. Some simple modeling is used to establish design rules for such vehicles, and to estimate the deterioration of their performance during the flight. Two sample aerogravity-assisted maneuvers are detailed, including a close solar approach requiring modest Delta V, and a sprint mission to Pluto.
Learning Problem-Solving Rules as Search through a Hypothesis Space
ERIC Educational Resources Information Center
Lee, Hee Seung; Betts, Shawn; Anderson, John R.
2016-01-01
Learning to solve a class of problems can be characterized as a search through a space of hypotheses about the rules for solving these problems. A series of four experiments studied how different learning conditions affected the search among hypotheses about the solution rule for a simple computational problem. Experiment 1 showed that a problem…
Cerebellar Deep Nuclei Involvement in Cognitive Adaptation and Automaticity
ERIC Educational Resources Information Center
Callu, Delphine; Lopez, Joelle; El Massioui, Nicole
2013-01-01
To determine the role of the interpositus nuclei of cerebellum in rule-based learning and optimization processes, we studied (1) successive transfers of an initially acquired response rule in a cross maze and (2) behavioral strategies in learning a simple response rule in a T maze in interpositus lesioned rats (neurotoxic or electrolytic lesions).…
Analysis of intrapulse chirp in CO2 oscillators
NASA Technical Reports Server (NTRS)
Moody, Stephen E.; Berger, Russell G.; Thayer, William J., III
1987-01-01
Pulsed single-frequency CO2 laser oscillators are often used as transmitters for coherent lidar applications. These oscillators suffer from intrapulse chirp, or dynamic frequency shifting. If excessive, such chirp can limit the signal-to-noise ratio of the lidar (by generating excess bandwidth), or limit the velocity resolution if the lidar is of the Doppler type. This paper describes a detailed numerical model that considers all known sources of intrapulse chirp. Some typical predictions of the model are shown, and simple design rules to minimize chirp are proposed.
2006-05-01
harassment, it was not until the Allied ground forces overran the launch areas that the threat truly came to an end.21 By becoming mobile, the Germans had...SS-6 "Sapwood."28 Korolev's RD-105/RD-106 propulsion concept for this missile involved a total of five engines—a simple design based on German...to-air missiles mobile because we had a big area to defend. Our stationary surface-to-air missile sites were primarily around Moscow and others
Interpretable Decision Sets: A Joint Framework for Description and Prediction
Lakkaraju, Himabindu; Bach, Stephen H.; Leskovec, Jure
2016-01-01
One of the most important obstacles to deploying predictive models is the fact that humans do not understand and trust them. Knowing which variables are important in a model’s prediction and how they are combined can be very powerful in helping people understand and trust automatic decision making systems. Here we propose interpretable decision sets, a framework for building predictive models that are highly accurate, yet also highly interpretable. Decision sets are sets of independent if-then rules. Because each rule can be applied independently, decision sets are simple, concise, and easily interpretable. We formalize decision set learning through an objective function that simultaneously optimizes accuracy and interpretability of the rules. In particular, our approach learns short, accurate, and non-overlapping rules that cover the whole feature space and pay attention to small but important classes. Moreover, we prove that our objective is a non-monotone submodular function, which we efficiently optimize to find a near-optimal set of rules. Experiments show that interpretable decision sets are as accurate at classification as state-of-the-art machine learning techniques. They are also three times smaller on average than rule-based models learned by other methods. Finally, results of a user study show that people are able to answer multiple-choice questions about the decision boundaries of interpretable decision sets and write descriptions of classes based on them faster and more accurately than with other rule-based models that were designed for interpretability. Overall, our framework provides a new approach to interpretable machine learning that balances accuracy, interpretability, and computational efficiency. PMID:27853627
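A decision set, a list of independent if-then rules, is easy to apply. The sketch below uses hypothetical rules and resolves overlaps by majority vote; note that the paper's objective learns non-overlapping rules precisely so that no such tie-breaking is needed:

```python
def apply_decision_set(rules, default, x):
    """Fire every independent if-then rule that matches x; return the
    majority label, or the default when no rule fires."""
    votes = [label for cond, label in rules if cond(x)]
    if not votes:
        return default
    return max(set(votes), key=votes.count)

# Hypothetical rules for illustration only.
rules = [
    (lambda x: x["age"] > 50 and x["bp"] == "high", "risk"),
    (lambda x: x["bmi"] < 18.5, "risk"),
    (lambda x: x["age"] <= 30, "healthy"),
]
```

Because each rule can be read and checked on its own, a user can audit the model one line at a time, which is the interpretability property the user study measures.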
Hidden patterns of reciprocity.
Syi
2014-03-21
Reciprocity can help the evolution of cooperation. To model both types of reciprocity, we need the concept of strategy. In the case of direct reciprocity there are four second-order action rules (Simple Tit-for-tat, Contrite Tit-for-tat, Pavlov, and Grim Trigger) which are able to promote cooperation. In the case of indirect reciprocity the key component of cooperation is the assessment rule. There are, again, four elementary second-order assessment rules (Image Scoring, Simple Standing, Stern Judging, and Shunning). The eight concepts can be formalized in an ontologically thin way: we need only an action predicate and a value function, two agent concepts, and the constant of goodness. The formalism helps us to discover that the action and assessment rules can be paired, and that they show the same patterns. The logic of these patterns can be interpreted with the concept of punishment, which has an inherently paradoxical nature. Copyright © 2013 Elsevier Ltd. All rights reserved.
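Three of the four action rules named above are simple to state as code; Contrite Tit-for-tat is omitted because it additionally requires a "standing" variable tracking whether one's own defection was justified:

```python
C, D = "C", "D"

def tit_for_tat(my_last, opp_last):
    """Simple Tit-for-tat: mirror the opponent's previous move."""
    return opp_last

def pavlov(my_last, opp_last):
    """Pavlov (win-stay, lose-shift): cooperate iff last moves matched."""
    return C if my_last == opp_last else D

def grim_trigger(opp_history):
    """Grim Trigger: cooperate until the first defection, then always defect."""
    return D if D in opp_history else C
```

The first two are genuinely second-order (they depend only on the previous moves of both players), while Grim Trigger needs the opponent's full history, or equivalently a single sticky flag.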
Foraging Ecology Predicts Learning Performance in Insectivorous Bats
Clarin, Theresa M. A.; Ruczyński, Ireneusz; Page, Rachel A.
2013-01-01
Bats are unusual among mammals in showing great ecological diversity even among closely related species and are thus well suited for studies of adaptation to the ecological background. Here we investigate whether behavioral flexibility and simple- and complex-rule learning performance can be predicted by foraging ecology. We predict faster learning and higher flexibility in animals hunting in more complex, variable environments than in animals hunting in more simple, stable environments. To test this hypothesis, we studied three closely related insectivorous European bat species of the genus Myotis that belong to three different functional groups based on foraging habitats: M. capaccinii, an open water forager, M. myotis, a passive listening gleaner, and M. emarginatus, a clutter specialist. We predicted that M. capaccinii would show the least flexibility and slowest learning reflecting its relatively unstructured foraging habitat and the stereotypy of its natural foraging behavior, while the other two species would show greater flexibility and more rapid learning reflecting the complexity of their natural foraging tasks. We used a purposefully unnatural and thus species-fair crawling maze to test simple- and complex-rule learning, flexibility and re-learning performance. We found that M. capaccinii learned a simple rule as fast as the other species, but was slower in complex rule learning and was less flexible in response to changes in reward location. We found no differences in re-learning ability among species. Our results corroborate the hypothesis that animals’ cognitive skills reflect the demands of their ecological niche. PMID:23755146
NASA Astrophysics Data System (ADS)
Magee, Daniel J.; Niemeyer, Kyle E.
2018-03-01
The expedient design of precision components in aerospace and other high-tech industries requires simulations of physical phenomena often described by partial differential equations (PDEs) without exact solutions. Modern design problems require simulations with a level of resolution difficult to achieve in reasonable amounts of time, even in effectively parallelized solvers. Though the scale of the problem relative to available computing power is the greatest impediment to accelerating these applications, significant performance gains can be achieved through careful attention to the details of memory communication and access. The swept time-space decomposition rule reduces communication between sub-domains by exhausting the domain of influence before communicating boundary values. Here we present a GPU implementation of the swept rule, which modifies the algorithm for improved performance on this processing architecture by prioritizing use of private (shared) memory, avoiding interblock communication, and overwriting unnecessary values. It shows significant improvement in the execution time of finite-difference solvers for one-dimensional unsteady PDEs, producing speedups of 2-9× for a range of problem sizes compared with simple GPU versions and 7-300× compared with parallel CPU versions. However, for a more sophisticated one-dimensional system of equations discretized with a second-order finite-volume scheme, the swept rule performs 1.2-1.9× worse than a standard implementation for all problem sizes.
Integrated layout based Monte-Carlo simulation for design arc optimization
NASA Astrophysics Data System (ADS)
Shao, Dongbing; Clevenger, Larry; Zhuang, Lei; Liebmann, Lars; Wong, Robert; Culp, James
2016-03-01
Design rules are created by considering a wafer fail mechanism with the relevant design levels under various design cases, with values set to cover the worst-case scenario. Because of this simplification and generalization, design rules hinder, rather than help, dense device scaling. As an example, SRAM designs always need extensive ground rule waivers. Furthermore, dense design also often involves a "design arc", a collection of design rules the sum of which equals the critical pitch defined by the technology. In a design arc, a single rule change can lead to a chain reaction of other rule violations. In this talk we present a methodology using Layout Based Monte-Carlo Simulation (LBMCS) with integrated multiple ground rule checks. We apply this methodology to SRAM word line contact, and the result is a layout that has balanced wafer fail risks based on Process Assumptions (PAs). This work was performed at the IBM Microelectronics Div., Semiconductor Research and Development Center, Hopewell Junction, NY 12533.
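The core of a layout-based Monte-Carlo check can be sketched as follows; the Gaussian variation model and every number below are illustrative, not any real process assumptions:

```python
import random

def violation_rate(nominal_space, min_space_rule, sigma, n=10000, seed=0):
    """Perturb an edge-to-edge space with Gaussian process variation and
    report the fraction of samples that violate the minimum-space rule."""
    rng = random.Random(seed)
    violations = sum(
        1 for _ in range(n)
        if nominal_space + rng.gauss(0.0, sigma) < min_space_rule)
    return violations / n
```

Running such a sampler for each rule in a design arc, instead of checking fixed worst-case values, is what allows the fail risk to be balanced across rules rather than concentrated in whichever rule was waived.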
ERIC Educational Resources Information Center
Endress, Ansgar D.; Hauser, Marc D.
2011-01-01
Rules, and exceptions to such rules, are ubiquitous in many domains, including language. Here we used simple artificial grammars to investigate the influence of 2 factors on the acquisition of rules and their exceptions, namely type frequency (the relative numbers of different exceptions to different regular items) and token frequency (the number…
An evaluation of rise time characterization and prediction methods
NASA Technical Reports Server (NTRS)
Robinson, Leick D.
1994-01-01
One common method of extrapolating sonic boom waveforms from aircraft to ground is to calculate the nonlinear distortion and then add a rise time to each shock by a simple empirical rule. One common rule is the '3 over P' rule, which calculates the rise time in milliseconds as three divided by the shock amplitude in psf. This rule was compared with the results of ZEPHYRUS, a comprehensive algorithm which calculates sonic boom propagation and extrapolation with the combined effects of nonlinearity, attenuation, dispersion, geometric spreading, and refraction in a stratified atmosphere. It is shown that the simple empirical rule considerably overestimates the rise time. In addition, the empirical rule does not account for variations in the rise time due to humidity variation or propagation history. It is also demonstrated that the rise time is only an approximate indicator of perceived loudness. Three waveforms with identical characteristics (shock placement, amplitude, and rise time), but with different shock shapes, are shown to give different calculated loudness. This paper is based in part on work performed at the Applied Research Laboratories, the University of Texas at Austin, and supported by NASA Langley.
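The '3 over P' rule quoted above is a one-liner:

```python
def rise_time_ms(shock_amplitude_psf):
    """The '3 over P' empirical rule: rise time in milliseconds equals
    three dividedded into... three divided by the shock amplitude in psf."""
    return 3.0 / shock_amplitude_psf
```

So a 1.5-psf shock is assigned a 2-ms rise time regardless of humidity or propagation history, which is exactly the limitation the comparison with ZEPHYRUS exposes.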
Science Opportunity Analyzer (SOA): Science Planning Made Simple
NASA Technical Reports Server (NTRS)
Streiffert, Barbara A.; Polanskey, Carol A.
2004-01-01
For the first time at JPL, the Cassini mission to Saturn is using distributed science operations for developing its experiments. Remote scientists needed the ability to: a) identify observation opportunities; b) create accurate, detailed designs for their observations; c) verify that their designs meet their objectives; d) check their observations against project flight rules and constraints; and e) communicate their observations to other scientists. Many existing tools provide one or more of these functions, but Science Opportunity Analyzer (SOA) has been built to unify these tasks in a single application. Accurate: utilizes the JPL Navigation and Ancillary Information Facility (NAIF) SPICE software toolkit, which provides high-fidelity modeling and facilitates rapid adaptation to other flight projects. Portable: available on Unix, Windows, and Linux. Adaptable: designed as a multi-mission tool so it can be readily adapted to other flight projects; implemented in Java, Java 3D, and other innovative technologies. In conclusion, SOA is easy to use, requiring only six simple steps, and its ability to show the same accurate information in multiple ways (multiple visualization formats, data plots, listings, and file output) is essential to meeting the needs of a diverse, distributed science operations environment.
Hughes, I
1998-09-24
The direct analysis of selected components from combinatorial libraries by sensitive methods such as mass spectrometry is potentially more efficient than deconvolution and tagging strategies since additional steps of resynthesis or introduction of molecular tags are avoided. A substituent selection procedure is described that eliminates the mass degeneracy commonly observed in libraries prepared by "split-and-mix" methods, without recourse to high-resolution mass measurements. A set of simple rules guides the choice of substituents such that all components of the library have unique nominal masses. Additional rules extend the scope by ensuring that characteristic isotopic mass patterns distinguish isobaric components. The method is applicable to libraries having from two to four varying substituent groups and can encode from a few hundred to several thousand components. No restrictions are imposed on the manner in which the "self-coded" library is synthesized or screened.
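A brute-force check of the "self-coding" property described above can be sketched in a few lines (the masses below are hypothetical; the paper's actual substituent-selection rules are not reproduced): a library is mass-decodable when every combination of substituents across positions sums to a unique nominal mass.

```python
from itertools import product

def all_masses_unique(substituent_masses_per_position):
    """Return True if every combination of one substituent mass per
    position yields a distinct total nominal mass (no mass degeneracy)."""
    totals = [sum(combo) for combo in product(*substituent_masses_per_position)]
    return len(totals) == len(set(totals))
```

For example, `[[15, 29], [57, 71]]` fails because 15 + 71 = 29 + 57, which is exactly the nominal-mass degeneracy the selection rules are designed to eliminate.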
Rands, Sean A.
2011-01-01
Functional explanations of behaviour often propose optimal strategies for organisms to follow. These ‘best’ strategies could be difficult to perform given biological constraints such as neural architecture and physiological constraints. Instead, simple heuristics or ‘rules-of-thumb’ that approximate these optimal strategies may instead be performed. From a modelling perspective, rules-of-thumb are also useful tools for considering how group behaviour is shaped by the behaviours of individuals. Using simple rules-of-thumb reduces the complexity of these models, but care needs to be taken to use rules that are biologically relevant. Here, we investigate the similarity between the outputs of a two-player dynamic foraging game (which generated optimal but complex solutions) and a computational simulation of the behaviours of the two members of a foraging pair, who instead followed a rule-of-thumb approximation of the game's output. The original game generated complex results, and we demonstrate here that the simulations following the much-simplified rules-of-thumb also generate complex results, suggesting that the rule-of-thumb was sufficient to make some of the model outcomes unpredictable. There was some agreement between both modelling techniques, but some differences arose – particularly when pair members were not identical in how they gained and lost energy. We argue that exploring how rules-of-thumb perform in comparison to their optimal counterparts is an important exercise for biologically validating the output of agent-based models of group behaviour. PMID:21765938
NASA Technical Reports Server (NTRS)
Johnson, David W.
1992-01-01
Virtual realities are a type of human-computer interface (HCI) and as such may be understood from a historical perspective. In the earliest era, the computer was a very simple, straightforward machine. Interaction was human manipulation of an inanimate object, little more than the provision of an explicit instruction set to be carried out without deviation. In short, control resided with the user. In the second era of HCI, some level of intelligence and control was imparted to the system to enable a dialogue with the user. Simple context sensitive help systems are early examples, while more sophisticated expert system designs typify this era. Control was shared more equally. In this, the third era of the HCI, the constructed system emulates a particular environment, constructed with rules and knowledge about 'reality'. Control is, in part, outside the realm of the human-computer dialogue. Virtual reality systems are discussed.
29 CFR 1206.8 - Amendment or rescission of rules in this part.
Code of Federal Regulations, 2010 CFR
2010-07-01
... interested person may petition the Board, in writing, for the issuance, amendment, or repeal of a rule or... denied in whole or in part, prompt notice shall be given of the denial, accompanied by a simple statement...
Multisensor fusion with non-optimal decision rules: the challenges of open world sensing
NASA Astrophysics Data System (ADS)
Minor, Christian; Johnson, Kevin
2014-05-01
In this work, simple, generic models of chemical sensing are used to simulate sensor array data and to illustrate the impact on overall system performance that specific design choices impart. The ability of multisensor systems to perform multianalyte detection (i.e., distinguish multiple targets) is explored by examining the distinction between fundamental design-related limitations stemming from mismatching of mixture composition to fused sensor measurement spaces, and limitations that arise from measurement uncertainty. Insight on the limits and potential of sensor fusion to robustly address detection tasks in realistic field conditions can be gained through an examination of a) the underlying geometry of both the composition space of sources one hopes to elucidate and the measurement space a fused sensor system is capable of generating, and b) the informational impact of uncertainty on both of these spaces. For instance, what is the potential impact on sensor fusion in an open world scenario where unknown interferants may contaminate target signals? Under complex and dynamic backgrounds, decision rules may implicitly become non-optimal and adding sensors may increase the amount of conflicting information observed. This suggests that the manner in which a decision rule handles sensor conflict can be critical in leveraging sensor fusion for effective open world sensing, and becomes exponentially more important as more sensors are added. Results and design considerations for handling conflicting evidence in Bayes and Dempster-Shafer fusion frameworks are presented. Bayesian decision theory is used to provide an upper limit on detector performance of simulated sensor systems.
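As a concrete illustration of the conflict handling discussed above, here is a generic sketch of Dempster's rule of combination (standard Dempster-Shafer theory, not code from the paper). The conflict mass K, which is renormalized away, is precisely the quantity that grows as fused sensors disagree.

```python
def dempster_combine(m1, m2):
    """Combine two basic mass assignments (dicts mapping frozenset focal
    elements to masses) with Dempster's rule; conflicting mass K is
    discarded and the remainder renormalized by 1 - K."""
    combined, conflict = {}, 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2  # mass assigned to incompatible hypotheses
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are irreconcilable")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}
```

With two sensors that each put 0.9 on mutually exclusive hypotheses, K = 0.81, and the small residual agreement dominates the fused result: the kind of behavior that makes the choice of decision rule critical in open-world sensing.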
26 CFR 1.401(k)-4 - SIMPLE 401(k) plan requirements.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 26 Internal Revenue 5 2012-04-01 2011-04-01 true SIMPLE 401(k) plan requirements. 1.401(k)-4 Section 1.401(k)-4 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED...(k)-4 SIMPLE 401(k) plan requirements. (a) General rule. A cash or deferred arrangement satisfies the...
26 CFR 1.401(k)-4 - SIMPLE 401(k) plan requirements.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 26 Internal Revenue 5 2014-04-01 2014-04-01 false SIMPLE 401(k) plan requirements. 1.401(k)-4 Section 1.401(k)-4 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED...(k)-4 SIMPLE 401(k) plan requirements. (a) General rule. A cash or deferred arrangement satisfies the...
26 CFR 1.401(k)-4 - SIMPLE 401(k) plan requirements.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 26 Internal Revenue 5 2011-04-01 2011-04-01 false SIMPLE 401(k) plan requirements. 1.401(k)-4 Section 1.401(k)-4 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED...(k)-4 SIMPLE 401(k) plan requirements. (a) General rule. A cash or deferred arrangement satisfies the...
26 CFR 1.401(k)-4 - SIMPLE 401(k) plan requirements.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 26 Internal Revenue 5 2013-04-01 2013-04-01 false SIMPLE 401(k) plan requirements. 1.401(k)-4 Section 1.401(k)-4 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED...(k)-4 SIMPLE 401(k) plan requirements. (a) General rule. A cash or deferred arrangement satisfies the...
Library-based illumination synthesis for critical CMOS patterning.
Yu, Jue-Chin; Yu, Peichen; Chao, Hsueh-Yung
2013-07-01
In optical microlithography, the illumination source for critical complementary metal-oxide-semiconductor layers needs to be determined in the early stage of a technology node with very limited design information, leading to simple binary shapes. Recently, the availability of freeform sources permits us to increase pattern fidelity and relax mask complexities with minimal insertion risks to the current manufacturing flow. However, source optimization across many patterns is often treated as a design-of-experiments problem, which may not fully exploit the benefits of a freeform source. In this paper, a rigorous source-optimization algorithm is presented via linear superposition of optimal sources for pre-selected patterns. We show that analytical solutions are made possible by using Hopkins formulation and quadratic programming. The algorithm allows synthesized illumination to be linked with assorted pattern libraries, which has a direct impact on design rule studies for early planning and design automation for full wafer optimization.
Molnets: An Artificial Chemistry Based on Neural Networks
NASA Technical Reports Server (NTRS)
Colombano, Silvano; Luk, Johnny; Segovia-Juarez, Jose L.; Lohn, Jason; Clancy, Daniel (Technical Monitor)
2002-01-01
The fundamental problem in the evolution of matter is to understand how structure-function relationships are formed and increase in complexity from the molecular level all the way to a genetic system. We have created a system where structure-function relationships arise naturally and without the need of ad hoc function assignments to given structures. The idea was inspired by neural networks, where the structure of the net embodies specific computational properties. In this system networks interact with other networks to create connections between the inputs of one net and the outputs of another. The newly created net then recomputes its own synaptic weights, based on anti-Hebbian rules. As a result some connections may be cut, and multiple nets can emerge as products of a 'reaction'. The idea is to study emergent reaction behaviors, based on simple rules that constitute a pseudophysics of the system. These simple rules are parameterized to produce behaviors that emulate chemical reactions. We find that these simple rules show a gradual increase in the size and complexity of molecules. We have been building a virtual artificial chemistry laboratory for discovering interesting reactions and for testing further ideas on the evolution of primitive molecules. Some of these ideas include the potential effect of membranes and selective diffusion according to molecular size.
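The anti-Hebbian recomputation step mentioned above can be caricatured in a few lines (the learning rate, threshold, and cut-connection convention are our assumptions, not the paper's): correlated pre- and post-synaptic activity weakens a connection, and a sufficiently weak connection is cut.

```python
def anti_hebbian_step(w, x, y, lr=0.1, cut_below=0.05):
    """One anti-Hebbian update of a single weight: correlated activity
    (x * y > 0) weakens the link; return None if the link is cut."""
    w = w - lr * x * y
    return None if abs(w) < cut_below else w
```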
ERIC Educational Resources Information Center
Mitchell, Paul; Kemp, Nenagh; Bryant, Peter
2011-01-01
The purpose of this research was to examine whether adults rely on morphemic spelling rules or word-specific knowledge when spelling simple words. We examined adults' knowledge of two of the simplest and most reliable rules in English spelling concerning the morphological word ending -s. This spelling is required for regular plural nouns (e.g.,…
Controlling the scattering properties of thin, particle-doped coatings
NASA Astrophysics Data System (ADS)
Rogers, William; Corbett, Madeleine; Manoharan, Vinothan
2013-03-01
Coatings and thin films of small particles suspended in a matrix possess optical properties that are important in several industries from cosmetics and paints to polymer composites. Many of the most interesting applications require coatings that produce several bulk effects simultaneously, but it is often difficult to rationally formulate materials with these desired optical properties. Here, we focus on the specific challenge of designing a thin colloidal film that maximizes both diffuse and total hemispherical transmission. We demonstrate that these bulk optical properties follow a simple scaling with two microscopic length scales: the scattering and transport mean free paths. Using these length scales and Mie scattering calculations, we generate basic design rules that relate scattering at the single particle level to the film's bulk optical properties. These ideas will be useful in the rational design of future optically active coatings.
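The two microscopic length scales the abstract's scaling rests on have standard definitions, sketched here in a hypothetical helper (the function and its arguments are ours): the scattering mean free path set by particle number density and scattering cross section, and the transport mean free path lengthened by the scattering anisotropy g (the mean cosine of the scattering angle).

```python
def mean_free_paths(number_density, cross_section, g):
    """Scattering mean free path l_s = 1 / (rho * sigma_s) and transport
    mean free path l_t = l_s / (1 - g); forward-peaked scattering
    (g -> 1) makes l_t much longer than l_s."""
    l_s = 1.0 / (number_density * cross_section)
    l_t = l_s / (1.0 - g)
    return l_s, l_t
```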
Mars Aeronomy Explorer (MAX): Study Employing Distributed Micro-Spacecraft
NASA Technical Reports Server (NTRS)
Shotwell, Robert F.; Gray, Andrew A.; Illsley, Peter M.; Johnson, M.; Sherwood, Robert L.; Vozoff, M.; Ziemer, John K.
2005-01-01
An overview of a Mars Aeronomy Explorer (MAX) mission design study performed at NASA's Jet Propulsion Laboratory is presented herein. The mission design consists of ten micro-spacecraft orbiters launched on a Delta IV to Mars polar orbit to determine the spatial, diurnal and seasonal variation of the constituents of the Martian upper atmosphere and ionosphere over the course of one Martian year. The spacecraft are designed to allow penetration of the upper atmosphere to at least 90 km. This property coupled with orbit precession will yield knowledge of the nature of the solar wind interaction with Mars, the influence of the Mars crustal magnetic field on ionospheric processes, and the measurement of present thermal and nonthermal escape rates of atmospheric constituents. The mission design incorporates alternative design paradigms that are more appropriate for, and in some cases motivate, distributed micro-spacecraft. These design paradigms are not defined by a simple set of rules, but rather a way of thinking about the function of instruments, mission reliability/risk, and cost in a systemic framework.
New tools for evaluating LQAS survey designs
Hund, Lauren
2014-01-01
Lot Quality Assurance Sampling (LQAS) surveys have become increasingly popular in global health care applications. Incorporating Bayesian ideas into LQAS survey design, such as using reasonable prior beliefs about the distribution of an indicator, can improve the selection of design parameters and decision rules. In this paper, a joint frequentist and Bayesian framework is proposed for evaluating LQAS classification accuracy and informing survey design parameters. Simple software tools are provided for calculating the positive and negative predictive value of a design with respect to an underlying coverage distribution and the selected design parameters. These tools are illustrated using a data example from two consecutive LQAS surveys measuring Oral Rehydration Solution (ORS) preparation. Using the survey tools, the dependence of classification accuracy on benchmark selection and the width of the ‘grey region’ are clarified in the context of ORS preparation across seven supervision areas. Following the completion of an LQAS survey, estimation of the distribution of coverage across areas facilitates quantifying classification accuracy and can help guide intervention decisions. PMID:24528928
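A minimal version of the predictive-value calculation described above can be written down for a hypothetical LQAS decision rule, "classify an area as acceptable if at least d of n sampled subjects are covered", with a discrete prior over coverage levels (the 80% benchmark and the two-point prior below are illustrative, not the paper's):

```python
from math import comb

def binom_tail(n, d, p):
    """P(X >= d) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d, n + 1))

def positive_predictive_value(n, d, prior, acceptable=0.8):
    """PPV of the rule 'classify acceptable if >= d of n successes'
    under a discrete prior {coverage level: probability mass}."""
    p_pass = sum(w * binom_tail(n, d, p) for p, w in prior.items())
    p_pass_and_good = sum(w * binom_tail(n, d, p)
                          for p, w in prior.items() if p >= acceptable)
    return p_pass_and_good / p_pass
```

Under a 50/50 prior on 90% vs. 50% coverage, an n = 19, d = 13 design passes the high-coverage areas almost surely and the low-coverage areas rarely, giving a PPV above 0.9: the kind of quantity the proposed tools report for candidate design parameters.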
Design of freeze-drying processes for pharmaceuticals: practical advice.
Tang, Xiaolin; Pikal, Michael J
2004-02-01
Design of freeze-drying processes is often approached with a "trial and error" experimental plan or, worse yet, the protocol used in the first laboratory run is adopted without further attempts at optimization. Consequently, commercial freeze-drying processes are often neither robust nor efficient. It is our thesis that design of an "optimized" freeze-drying process is not particularly difficult for most products, as long as some simple rules based on well-accepted scientific principles are followed. It is the purpose of this review to discuss the scientific foundations of the freeze-drying process design and then to consolidate these principles into a set of guidelines for rational process design and optimization. General advice is given concerning common stability issues with proteins, but unusual and difficult stability issues are beyond the scope of this review. Control of ice nucleation and crystallization during the freezing step is discussed, and the impact of freezing on the rest of the process and final product quality is reviewed. Representative freezing protocols are presented. The significance of the collapse temperature and the thermal transition, denoted Tg', are discussed, and procedures for the selection of the "target product temperature" for primary drying are presented. Furthermore, guidelines are given for selection of the optimal shelf temperature and chamber pressure settings required to achieve the target product temperature without thermal and/or mass transfer overload of the freeze dryer. Finally, guidelines and "rules" for optimization of secondary drying and representative secondary drying protocols are presented.
Application of a swarm-based approach for phase unwrapping
NASA Astrophysics Data System (ADS)
da S. Maciel, Lucas; Albertazzi G., Armando, Jr.
2014-07-01
An algorithm for phase unwrapping based on swarm intelligence is proposed. The novel approach is based on the emergent behavior of swarms. This behavior is the result of the interactions between independent agents following a simple set of rules and is regarded as fast, flexible and robust. The rules here were designed with two purposes. Firstly, the collective behavior must result in a reliable map of the unwrapped phase. The unwrapping reliability was evaluated by each agent during run-time, based on the quality of the neighboring pixels. In addition, the rule set must result in a behavior that focuses on wrapped regions. Stigmergy and communication rules were implemented in order to enable each agent to seek less worked areas of the image. The agents were modeled as Finite-State Machines. Based on the availability of unwrappable pixels, each agent assumed a different state in order to better adapt itself to the surroundings. The implemented rule set was able to fulfill the requirements on reliability and focused unwrapping. The unwrapped phase map was comparable to those from established methods as the agents were able to reliably evaluate each pixel quality. Also, the unwrapping behavior, being observed in real time, was able to focus on workable areas as the agents communicated in order to find less traveled regions. The results were very positive for such a new approach to the phase unwrapping problem. Finally, the authors see great potential for future developments concerning the flexibility, robustness and processing times of the swarm-based algorithm.
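The elementary operation behind any such unwrapping agent is standard: choose the multiple of 2*pi that brings a wrapped value closest to an already-unwrapped neighbour. Here is a hypothetical 1D sketch of that step (the paper's agents work on 2D maps with quality-guided, stigmergic rules, which are not reproduced):

```python
import math

def unwrap_1d(wrapped):
    """Unwrap a 1D phase signal: for each sample, add the integer
    multiple of 2*pi that minimizes the jump from the previous
    unwrapped sample."""
    out = [wrapped[0]]
    for w in wrapped[1:]:
        k = round((out[-1] - w) / (2 * math.pi))
        out.append(w + 2 * math.pi * k)
    return out
```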
Modeling Collective Animal Behavior with a Cognitive Perspective: A Methodological Framework
Weitz, Sebastian; Blanco, Stéphane; Fournier, Richard; Gautrais, Jacques; Jost, Christian; Theraulaz, Guy
2012-01-01
The last decades have seen an increasing interest in modeling collective animal behavior. Some studies try to reproduce as accurately as possible the collective dynamics and patterns observed in several animal groups with biologically plausible, individual behavioral rules. The objective is then essentially to demonstrate that the observed collective features may be the result of self-organizing processes involving quite simple individual behaviors. Other studies concentrate on the objective of establishing or enriching links between collective behavior researches and cognitive or physiological ones, which then requires that each individual rule be carefully validated. Here we discuss the methodological consequences of this additional requirement. Using the example of corpse clustering in ants, we first illustrate that it may be impossible to discriminate among alternative individual rules by considering only observational data collected at the group level. Six individual behavioral models are described: They are clearly distinct in terms of individual behaviors, they all reproduce satisfactorily the collective dynamics and distribution patterns observed in experiments, and we show theoretically that it is strictly impossible to discriminate two of these models even in the limit of an infinite amount of data whatever the accuracy level. A set of methodological steps are then listed and discussed as practical ways to partially overcome this problem. They involve complementary experimental protocols specifically designed to address the behavioral rules successively, conserving group-level data for the overall model validation. In this context, we highlight the importance of maintaining a sharp distinction between model enunciation, with explicit references to validated biological concepts, and formal translation of these concepts in terms of quantitative state variables and fittable functional dependences. Illustrative examples are provided of the benefits expected during the often long and difficult process of refining a behavioral model, designing adapted experimental protocols and inversing model parameters. PMID:22761685
A rule-based software test data generator
NASA Technical Reports Server (NTRS)
Deason, William H.; Brown, David B.; Chang, Kai-Hsiung; Cross, James H., II
1991-01-01
Rule-based software test data generation is proposed as an alternative to either path/predicate analysis or random data generation. A prototype rule-based test data generator for Ada programs is constructed and compared to a random test data generator. Four Ada procedures are used in the comparison. Approximately 2000 rule-based test cases and 100,000 randomly generated test cases are automatically generated and executed. The success of the two methods is compared using standard coverage metrics. Simple statistical tests are performed, showing that even the primitive rule-based test data generation prototype is significantly better than random data generation. This result demonstrates that rule-based test data generation is feasible and shows great promise in assisting test engineers, especially when the rule base is developed further.
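A miniature of the rule-based idea looks like the following (the boundary-value rule below is a generic illustration of the approach, not the prototype's actual rule base): instead of sampling inputs at random, a rule maps each parameter's declared range to a small set of high-value test inputs.

```python
from itertools import product

def rule_based_cases(param_ranges):
    """Generate test tuples from a boundary-value rule: for each integer
    parameter range (lo, hi), probe both bounds, their out-of-range
    neighbours, and a nominal midpoint, then take the cross product."""
    per_param = []
    for lo, hi in param_ranges:
        per_param.append(sorted({lo - 1, lo, (lo + hi) // 2, hi, hi + 1}))
    return list(product(*per_param))
```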
Yap, Christina; Billingham, Lucinda J; Cheung, Ying Kuen; Craddock, Charlie; O'Quigley, John
2017-12-15
The ever-increasing pace of development of novel therapies mandates efficient methodologies for assessment of their tolerability and activity. Evidence increasingly supports the merits of model-based dose-finding designs in identifying the recommended phase II dose compared with conventional rule-based designs such as the 3 + 3, but despite this, their use remains limited. Here, we propose a useful tool, dose transition pathways (DTP), which helps overcome several commonly faced practical and methodologic challenges in the implementation of model-based designs. DTP projects in advance the doses recommended by a model-based design for subsequent patients (stay, escalate, de-escalate, or stop early), using all the accumulated information. After specifying a model with favorable statistical properties, we utilize the DTP to fine-tune the model to tailor it to the trial's specific requirements that reflect important clinical judgments. In particular, it can help to determine how stringent the stopping rules should be if the investigated therapy is too toxic. Its use to design and implement a modified continual reassessment method is illustrated in an acute myeloid leukemia trial. DTP removes the fears of model-based designs as unknown, complex systems and can serve as a handbook, guiding decision-making for each dose update. In the illustrated trial, the seamless, clear transition for each dose recommendation aided the investigators' understanding of the design and facilitated decision-making to enable finer calibration of a tailored model. We advocate the use of the DTP as an integral procedure in the co-development and successful implementation of practical model-based designs by statisticians and investigators. Clin Cancer Res; 23(24); 7440-7. ©2017 American Association for Cancer Research.
Computing exponentially faster: implementing a non-deterministic universal Turing machine using DNA
Currin, Andrew; Korovin, Konstantin; Ababi, Maria; Roper, Katherine; Kell, Douglas B.; Day, Philip J.
2017-01-01
The theory of computer science is based around universal Turing machines (UTMs): abstract machines able to execute all possible algorithms. Modern digital computers are physical embodiments of classical UTMs. For the most important class of problem in computer science, non-deterministic polynomial complete problems, non-deterministic UTMs (NUTMs) are theoretically exponentially faster than both classical UTMs and quantum mechanical UTMs (QUTMs). However, no attempt has previously been made to build an NUTM, and their construction has been regarded as impossible. Here, we demonstrate the first physical design of an NUTM. This design is based on Thue string rewriting systems, and thereby avoids the limitations of most previous DNA computing schemes: all the computation is local (simple edits to strings) so there is no need for communication, and there is no need to order operations. The design exploits DNA's ability to replicate to execute an exponential number of computational paths in P time. Each Thue rewriting step is embodied in a DNA edit implemented using a novel combination of polymerase chain reactions and site-directed mutagenesis. We demonstrate that the design works using both computational modelling and in vitro molecular biology experimentation: the design is thermodynamically favourable, microprogramming can be used to encode arbitrary Thue rules, all classes of Thue rule can be implemented, and rule implementation is non-deterministic. In an NUTM, the resource limitation is space, which contrasts with classical UTMs and QUTMs where it is time. This fundamental difference enables an NUTM to trade space for time, which is significant for both theoretical computer science and physics. It is also of practical importance, for to quote Richard Feynman 'there's plenty of room at the bottom'. This means that a desktop DNA NUTM could potentially utilize more processors than all the electronic computers in the world combined, and thereby outperform the world's current fastest supercomputer, while consuming a tiny fraction of its energy. PMID:28250099
A SIMPLE CELLULAR AUTOMATON MODEL FOR HIGH-LEVEL VEGETATION DYNAMICS
We have produced a simple two-dimensional (ground-plan) cellular automaton model of vegetation dynamics specifically to investigate high-level community processes. The model is probabilistic, with individual plant behavior determined by physiologically-based rules derived from a w...
Beyond Molecular Codes: Simple Rules to Wire Complex Brains
Hassan, Bassem A.; Hiesinger, P. Robin
2015-01-01
Summary Molecular codes, like postal zip codes, are generally considered a robust way to ensure the specificity of neuronal target selection. However, a code capable of unambiguously generating complex neural circuits is difficult to conceive. Here, we re-examine the notion of molecular codes in the light of developmental algorithms. We explore how molecules and mechanisms that have been considered part of a code may alternatively implement simple pattern formation rules sufficient to ensure wiring specificity in neural circuits. This analysis delineates a pattern-based framework for circuit construction that may contribute to our understanding of brain wiring. PMID:26451480
Design Rules and Scaling for Solar Sails
NASA Technical Reports Server (NTRS)
Zeiders, Glenn W.
2005-01-01
Useful design rules and simple scaling models have been developed for solar sails. Chief among the conclusions are: 1. Sail distortions contribute to the thrust and moments primarily through the mean squared value of their derivatives (slopes), and the sail behaves like a flat sheet if the value is small. The RMS slope is therefore an important figure of merit, and sail distortion effects on the spacecraft can generally be disregarded if the RMS slope is less than about 10% or so. 2. The characteristic slope of the sail distortion varies inversely with the tension in the sail, and it is the tension that produces the principal loading on the support booms. The tension is not arbitrary, but rather is the value needed to maintain the allowable RMS slope. That corresponds to a halyard force about equal to three times the normal force on the supported sail area. 3. Both the AEC/SRS and L'Garde concepts appear to be structurally capable of supporting sail sizes up to a kilometer or more with 1AU solar flux, but select transverse dimensions must be changed to do so. Operational issues such as fabrication, handling, storage and deployment will be the limiting factors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Breuker, M.S.; Braun, J.E.
This paper presents a detailed evaluation of the performance of a statistical, rule-based fault detection and diagnostic (FDD) technique presented by Rossi and Braun (1997). Steady-state and transient tests were performed on a simple rooftop air conditioner over a range of conditions and fault levels. The steady-state data without faults were used to train models that predict outputs for normal operation. The transient data with faults were used to evaluate FDD performance. The effect of a number of design variables on FDD sensitivity for different faults was evaluated and two prototype systems were specified for more complete evaluation. Good performance was achieved in detecting and diagnosing five faults using only six temperatures (two input and four output) and linear models. The performance improved by about a factor of two when ten measurements (three input and seven output) and higher order models were used. This approach for evaluating and optimizing the performance of the statistical, rule-based FDD technique could be used as a design and evaluation tool when applying this FDD method to other packaged air-conditioning systems. Furthermore, the approach could also be modified to evaluate the performance of other FDD methods.
Making the Cut: Lattice Kirigami Rules
NASA Astrophysics Data System (ADS)
Castle, Toen; Cho, Yigil; Gong, Xingting; Jung, Euiyeon; Sussman, Daniel M.; Yang, Shu; Kamien, Randall D.
2014-12-01
In this Letter we explore and develop a simple set of rules that apply to cutting, pasting, and folding honeycomb lattices. We consider origami-like structures that are extrinsically flat away from zero-dimensional sources of Gaussian curvature and one-dimensional sources of mean curvature, and our cutting and pasting rules maintain the intrinsic bond lengths on both the lattice and its dual lattice. We find that a small set of rules is allowed providing a framework for exploring and building kirigami—folding, cutting, and pasting the edges of paper.
Bureaucracy, Safety and Software: a Potentially Lethal Cocktail
NASA Astrophysics Data System (ADS)
Hatton, Les
This position paper identifies a potential problem with the evolution of software controlled safety critical systems. It observes that the rapid growth of bureaucracy in society quickly spills over into rules for behaviour. Whether the need for the rules comes first or there is simple anticipation of the need for a rule by a bureaucrat is unclear in many cases. Many such rules lead to draconian restrictions and often make the existing situation worse due to the presence of unintended consequences as will be shown with a number of examples.
Simple methods of exploiting the underlying structure of rule-based systems
NASA Technical Reports Server (NTRS)
Hendler, James
1986-01-01
Much recent work in the field of expert systems research has aimed at exploiting the underlying structures of the rule base for reasons of analysis. Such techniques as Petri-nets and DAGs have been proposed as representational structures that will allow complete analysis. Much has been made of proving isomorphisms between the rule bases and the mechanisms, and in examining the theoretical power of this analysis. In this paper we describe some early work in a new system which has much simpler (and thus, one hopes, more easily achieved) aims and less formality. The technique being examined is a very simple one: OPS5 programs are analyzed in a purely syntactic way and an FSA description is generated. In this paper we describe the technique and some user interface tools which exploit this structure.
Hierarchy of Certain Types of DNA Splicing Systems
NASA Astrophysics Data System (ADS)
Yusof, Yuhani; Sarmin, Nor Haniza; Goode, T. Elizabeth; Mahmud, Mazri; Heng, Fong Wan
A Head splicing system (H-system) consists of a finite set of strings (words) written over a finite alphabet, along with a finite set of rules that acts on the strings by iterated cutting and pasting to create a splicing language. Any interpretation that is aligned with Tom Head's original idea is one in which the strings represent double-stranded deoxyribonucleic acid (dsDNA) and the rules represent the cutting and pasting action of restriction enzymes and ligase, respectively. A new way of writing the rule sets is adopted so as to make the biological interpretation transparent. This approach is used in a formal language-theoretic analysis of the hierarchy of certain classes of splicing systems, namely simple, semi-simple and semi-null splicing systems. The relations between such systems and their associated languages are given as theorems, corollaries and counterexamples.
Evaluating changes to reservoir rule curves using historical water-level data
Mower, Ethan; Miranda, Leandro E.
2013-01-01
Flood control reservoirs are typically managed through rule curves (i.e. target water levels) which control the storage and release timing of flood waters. Changes to rule curves are often contemplated and requested by various user groups and management agencies with no information available about the actual flood risk of such requests. Methods of estimating flood risk in reservoirs are not easily available to those unfamiliar with hydrological models that track water movement through a river basin. We developed a quantile regression model that uses readily available daily water-level data to estimate risk of spilling. Our model provided a relatively simple process for estimating the maximum applicable water level under a specific flood risk for any day of the year. This water level represents an upper-limit umbrella under which water levels can be operated in a variety of ways. Our model allows the visualization of water-level management under a user-specified flood risk and provides a framework for incorporating the effect of a changing environment on water-level management in reservoirs, but is not designed to replace existing hydrological models. The model can improve communication and collaboration among agencies responsible for managing natural resources dependent on reservoir water levels.
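The per-day risk limit the abstract describes can be sketched with a simple empirical quantile standing in for the paper's quantile-regression model. The function name, input format, and 5% risk level below are illustrative assumptions, not the authors' code:

```python
import math

def upper_limit(levels_by_day, day, risk=0.05):
    """Return the water level that historical records for this calendar
    day exceeded only `risk` of the time -- a simplified stand-in for the
    paper's quantile-regression estimate of the maximum applicable level."""
    levels = sorted(levels_by_day[day])
    # index of the (1 - risk) empirical quantile
    idx = min(len(levels) - 1, math.ceil((1 - risk) * len(levels)) - 1)
    return levels[idx]

# Hypothetical example: 20 years of water levels observed on day 100
history = {100: list(range(1, 21))}
limit = upper_limit(history, 100, risk=0.05)  # 19: exceeded in 1 of 20 years
```

Repeating this for each day of the year traces out an upper-limit "umbrella" under which water levels can be managed at the chosen flood risk.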
26 CFR 1.652(a)-1 - Simple trusts; inclusion of amounts in income of beneficiaries.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 26 Internal Revenue 8 2012-04-01 2012-04-01 false Simple trusts; inclusion of amounts in income of beneficiaries. 1.652(a)-1 Section 1.652(a)-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE... Only § 1.652(a)-1 Simple trusts; inclusion of amounts in income of beneficiaries. Subject to the rules...
26 CFR 1.652(a)-1 - Simple trusts; inclusion of amounts in income of beneficiaries.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 26 Internal Revenue 8 2014-04-01 2014-04-01 false Simple trusts; inclusion of amounts in income of beneficiaries. 1.652(a)-1 Section 1.652(a)-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE... Only § 1.652(a)-1 Simple trusts; inclusion of amounts in income of beneficiaries. Subject to the rules...
26 CFR 1.652(a)-1 - Simple trusts; inclusion of amounts in income of beneficiaries.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 26 Internal Revenue 8 2013-04-01 2013-04-01 false Simple trusts; inclusion of amounts in income of beneficiaries. 1.652(a)-1 Section 1.652(a)-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE... Only § 1.652(a)-1 Simple trusts; inclusion of amounts in income of beneficiaries. Subject to the rules...
26 CFR 1.652(a)-1 - Simple trusts; inclusion of amounts in income of beneficiaries.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 26 Internal Revenue 8 2011-04-01 2011-04-01 false Simple trusts; inclusion of amounts in income of beneficiaries. 1.652(a)-1 Section 1.652(a)-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE... Only § 1.652(a)-1 Simple trusts; inclusion of amounts in income of beneficiaries. Subject to the rules...
The Aromaticity of Pericyclic Reaction Transition States
ERIC Educational Resources Information Center
Rzepa, Henry S.
2007-01-01
An approach is presented that starts from two fundamental concepts in organic chemistry, chirality and aromaticity, and combines them into a simple rule for stating selection rules for pericyclic reactions in terms of achiral Huckel-aromatic and chiral Mobius-aromatic transition states. This is illustrated using an example that leads to apparent…
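The combined rule can be stated in its standard Dewar-Zimmerman form, which is general organic chemistry knowledge rather than code from the article: a thermal pericyclic reaction is allowed through an achiral Hückel-aromatic transition state when the cyclic array holds 4n+2 electrons, and through a chiral Möbius-aromatic one when it holds 4n electrons.

```python
def transition_state(electron_count):
    """Classify the allowed thermal pericyclic transition state by
    electron count (Dewar-Zimmerman restatement of the selection rule)."""
    if electron_count % 4 == 2:
        return 'Huckel-aromatic (achiral)'
    if electron_count % 4 == 0:
        return 'Mobius-aromatic (chiral)'
    raise ValueError('pericyclic arrays have an even electron count')

transition_state(6)  # Diels-Alder, 6 electrons -> Huckel-aromatic (achiral)
transition_state(4)  # conrotatory ring closure, 4 electrons -> Mobius-aromatic (chiral)
```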
Learning Non-Adjacent Regularities at Age 0 ; 7
ERIC Educational Resources Information Center
Gervain, Judit; Werker, Janet F.
2013-01-01
One important mechanism suggested to underlie the acquisition of grammar is rule learning. Indeed, infants aged 0 ; 7 are able to learn rules based on simple identity relations (adjacent repetitions, ABB: "wo fe fe" and non-adjacent repetitions, ABA: "wo fe wo", respectively; Marcus et al., 1999). One unexplored issue is…
Modeling, Modal Properties, and Mesh Stiffness Variation Instabilities of Planetary Gears
NASA Technical Reports Server (NTRS)
Parker, Robert G.; Lin, Jian; Krantz, Timothy L. (Technical Monitor)
2001-01-01
Planetary gear noise and vibration are primary concerns in their applications in helicopters, automobiles, aircraft engines, heavy machinery and marine vehicles. Dynamic analysis is essential to the noise and vibration reduction. This work analytically investigates some critical issues and advances the understanding of planetary gear dynamics. A lumped-parameter model is built for the dynamic analysis of general planetary gears. The unique properties of the natural frequency spectra and vibration modes are rigorously characterized. These special structures apply to general planetary gears with cyclic symmetry and, in the practically important case, systems with diametrically opposed planets. The special vibration properties are useful for subsequent research. Taking advantage of the derived modal properties, the natural frequency and vibration mode sensitivities to design parameters are investigated. The key parameters include mesh stiffnesses, support/bearing stiffnesses, component masses, moments of inertia, and operating speed. The eigen-sensitivities are expressed in simple, closed-form formulae associated with modal strain and kinetic energies. As disorders (e.g., mesh stiffness variation, manufacturing and assembly errors) disturb the cyclic symmetry of planetary gears, their effects on the free vibration properties are quantitatively examined. Well-defined veering rules are derived to identify dramatic changes of natural frequencies and vibration modes under parameter variations. The knowledge of free vibration properties, eigen-sensitivities, and veering rules provides important information to effectively tune the natural frequencies and optimize structural design to minimize noise and vibration. Parametric instabilities excited by mesh stiffness variations are analytically studied for multi-mesh gear systems. The discrepancies of previous studies on parametric instability of two-stage gear chains are clarified using perturbation and numerical methods.
The operating conditions causing parametric instabilities are expressed in closed-form suitable for design guidance. Using the well-defined modal properties of planetary gears, the effects of mesh parameters on parametric instability are analytically identified. Simple formulae are obtained to suppress particular instabilities by adjusting contact ratios and mesh phasing.
Factors which Limit the Value of Additional Redundancy in Human Rated Launch Vehicle Systems
NASA Technical Reports Server (NTRS)
Anderson, Joel M.; Stott, James E.; Ring, Robert W.; Hatfield, Spencer; Kaltz, Gregory M.
2008-01-01
The National Aeronautics and Space Administration (NASA) has embarked on an ambitious program to return humans to the moon and beyond. As NASA moves forward in the development and design of new launch vehicles for future space exploration, it must fully consider the implications that rule-based requirements of redundancy or fault tolerance have on system reliability/risk. These considerations include common cause failure, increased system complexity, combined serial and parallel configurations, and the impact of design features implemented to control premature activation. These factors and others must be considered in trade studies to support design decisions that balance safety, reliability, performance and system complexity to achieve a relatively simple, operable system that provides the safest and most reliable system within the specified performance requirements. This paper describes conditions under which additional functional redundancy can impede improved system reliability. Examples from current NASA programs including the Ares I Upper Stage will be shown.
Universal fragment descriptors for predicting properties of inorganic crystals
NASA Astrophysics Data System (ADS)
Isayev, Olexandr; Oses, Corey; Toher, Cormac; Gossett, Eric; Curtarolo, Stefano; Tropsha, Alexander
2017-06-01
Although historically materials discovery has been driven by a laborious trial-and-error process, knowledge-driven materials design can now be enabled by the rational combination of Machine Learning methods and materials databases. Here, data from the AFLOW repository for ab initio calculations is combined with Quantitative Materials Structure-Property Relationship models to predict important properties: metal/insulator classification, band gap energy, bulk/shear moduli, Debye temperature and heat capacities. The prediction's accuracy compares well with the quality of the training data for virtually any stoichiometric inorganic crystalline material, reciprocating the available thermomechanical experimental data. The universality of the approach is attributed to the construction of the descriptors: Property-Labelled Materials Fragments. The representations require only minimal structural input allowing straightforward implementations of simple heuristic design rules.
A Flexible Mechanism of Rule Selection Enables Rapid Feature-Based Reinforcement Learning
Balcarras, Matthew; Womelsdorf, Thilo
2016-01-01
Learning in a new environment is influenced by prior learning and experience. Correctly applying a rule that maps a context to stimuli, actions, and outcomes enables faster learning and better outcomes compared to relying on strategies for learning that are ignorant of task structure. However, it is often difficult to know when and how to apply learned rules in new contexts. In our study we explored how subjects employ different strategies for learning the relationship between stimulus features and positive outcomes in a probabilistic task context. We test the hypothesis that task-naive subjects will show enhanced learning of feature specific reward associations by switching to the use of an abstract rule that associates stimuli by feature type and restricts selections to that dimension. To test this hypothesis we designed a decision making task where subjects receive probabilistic feedback following choices between pairs of stimuli. In the task, trials are grouped in two contexts by blocks, where in one type of block there is no unique relationship between a specific feature dimension (stimulus shape or color) and positive outcomes, and following an un-cued transition, alternating blocks have outcomes that are linked to either stimulus shape or color. Two-thirds of subjects (n = 22/32) exhibited behavior that was best fit by a hierarchical feature-rule model. Supporting the prediction of the model mechanism, these subjects showed significantly enhanced performance in feature-reward blocks, and rapidly switched their choice strategy to using abstract feature rules when reward contingencies changed. Choice behavior of other subjects (n = 10/32) was fit by a range of alternative reinforcement learning models representing strategies that do not benefit from applying previously learned rules.
In summary, these results show that untrained subjects are capable of flexibly shifting between behavioral rules by leveraging simple model-free reinforcement learning and context-specific selections to drive responses. PMID:27064794
How children perceive fractals: Hierarchical self-similarity and cognitive development
Martins, Maurício Dias; Laaha, Sabine; Freiberger, Eva Maria; Choi, Soonja; Fitch, W. Tecumseh
2014-01-01
The ability to understand and generate hierarchical structures is a crucial component of human cognition, available in language, music, mathematics and problem solving. Recursion is a particularly useful mechanism for generating complex hierarchies by means of self-embedding rules. In the visual domain, fractals are recursive structures in which simple transformation rules generate hierarchies of infinite depth. Research on how children acquire these rules can provide valuable insight into the cognitive requirements and learning constraints of recursion. Here, we used fractals to investigate the acquisition of recursion in the visual domain, and probed for correlations with grammar comprehension and general intelligence. We compared second (n = 26) and fourth graders (n = 26) in their ability to represent two types of rules for generating hierarchical structures: Recursive rules, on the one hand, which generate new hierarchical levels; and iterative rules, on the other hand, which merely insert items within hierarchies without generating new levels. We found that the majority of fourth graders, but not second graders, were able to represent both recursive and iterative rules. This difference was partially accounted for by second graders’ impairment in detecting hierarchical mistakes, and correlated with between-grade differences in grammar comprehension tasks. Empirically, recursion and iteration also differed in at least one crucial aspect: While the ability to learn recursive rules seemed to depend on the previous acquisition of simple iterative representations, the opposite was not true, i.e., children were able to acquire iterative rules before they acquired recursive representations. These results suggest that the acquisition of recursion in vision follows learning constraints similar to the acquisition of recursion in language, and that both domains share cognitive resources involved in hierarchical processing. PMID:24955884
Mining Distance Based Outliers in Near Linear Time with Randomization and a Simple Pruning Rule
NASA Technical Reports Server (NTRS)
Bay, Stephen D.; Schwabacher, Mark
2003-01-01
Defining outliers by their distance to neighboring examples is a popular approach to finding unusual examples in a data set. Recently, much work has been conducted with the goal of finding fast algorithms for this task. We show that a simple nested loop algorithm that in the worst case is quadratic can give near linear time performance when the data is in random order and a simple pruning rule is used. We test our algorithm on real high-dimensional data sets with millions of examples and show that the near linear scaling holds over several orders of magnitude. Our average case analysis suggests that much of the efficiency is because the time to process non-outliers, which are the majority of examples, does not depend on the size of the data set.
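The two ingredients of the algorithm, randomized data order and a cutoff-based pruning rule, can be sketched as below. This is a simplified, quadratic-worst-case version; the published implementation additionally processes the data in blocks for disk efficiency:

```python
import random

def dist(a, b):
    """Euclidean distance between two points."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def top_outliers(points, k=2, n_outliers=1):
    """Nested-loop distance-based outlier mining: an example's outlier
    score is its distance to its k-th nearest neighbour; report the
    n_outliers highest-scoring examples."""
    random.shuffle(points)            # randomized order -> near-linear behaviour
    top = []                          # (score, point), strongest outliers first
    cutoff = 0.0                      # score of the weakest current top outlier
    for p in points:
        knn = []                      # k smallest distances from p seen so far
        pruned = False
        for q in points:
            if q is p:
                continue
            d = dist(p, q)
            if len(knn) < k:
                knn.append(d)
                knn.sort()
            elif d < knn[-1]:
                knn[-1] = d
                knn.sort()
            # simple pruning rule: once p's running k-NN distance drops
            # below the cutoff, p can never enter the top outliers
            if len(knn) == k and knn[-1] < cutoff:
                pruned = True
                break
        if not pruned:
            top.append((knn[-1], p))  # score = distance to k-th neighbour
            top.sort(reverse=True)
            top = top[:n_outliers]
            if len(top) == n_outliers:
                cutoff = top[-1][0]
    return top
```

Because most examples are non-outliers, their scans usually hit the pruning break after a handful of comparisons, which is the source of the near-linear average-case scaling described in the abstract.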
Investigation of model-based physical design restrictions (Invited Paper)
NASA Astrophysics Data System (ADS)
Lucas, Kevin; Baron, Stanislas; Belledent, Jerome; Boone, Robert; Borjon, Amandine; Couderc, Christophe; Patterson, Kyle; Riviere-Cazaux, Lionel; Rody, Yves; Sundermann, Frank; Toublan, Olivier; Trouiller, Yorick; Urbani, Jean-Christophe; Wimmer, Karl
2005-05-01
As lithography and other patterning processes become more complex and more non-linear with each generation, the task of physical design rules necessarily increases in complexity also. The goal of the physical design rules is to define the boundary between the physical layout structures which will yield well from those which will not. This is essentially a rule-based pre-silicon guarantee of layout correctness. However the rapid increase in design rule requirement complexity has created logistical problems for both the design and process functions. Therefore, similar to the semiconductor industry's transition from rule-based to model-based optical proximity correction (OPC) due to increased patterning complexity, opportunities for improving physical design restrictions by implementing model-based physical design methods are evident. In this paper we analyze the possible need and applications for model-based physical design restrictions (MBPDR). We first analyze the traditional design rule evolution, development and usage methodologies for semiconductor manufacturers. Next we discuss examples of specific design rule challenges requiring new solution methods in the patterning regime of low K1 lithography and highly complex RET. We then evaluate possible working strategies for MBPDR in the process development and product design flows, including examples of recent model-based pre-silicon verification techniques. Finally we summarize with a proposed flow and key considerations for MBPDR implementation.
26 CFR 1.652(a)-1 - Simple trusts; inclusion of amounts in income of beneficiaries.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 26 Internal Revenue 8 2010-04-01 2010-04-01 false Simple trusts; inclusion of amounts in income of beneficiaries. 1.652(a)-1 Section 1.652(a)-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE....652(a)-1 Simple trusts; inclusion of amounts in income of beneficiaries. Subject to the rules in §§ 1...
Zero-knowledge cooperation in dilemma games.
Huck, Steffen; Normann, Hans Theo; Oechssler, Jorg
2003-01-07
We consider a very simple adaptive rule that induces cooperative behavior in a large class of dilemma games. The rule has a Pavlovian flavor and can be described as win-continue, lose-reverse. It assumes no knowledge about the underlying structure of the environment (the "rules of the game") and requires very little cognitive effort. Both features make it an appealing candidate for explaining the emergence of cooperative behavior in non-human species.
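The win-continue, lose-reverse rule takes only a few lines to state in code. The prisoner's-dilemma payoff values and the aspiration threshold below are illustrative assumptions, not values from the paper:

```python
# Standard prisoner's-dilemma payoffs (assumed): (my_move, their_move) -> my payoff
PAYOFF = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 5, ('D', 'D'): 1}

def next_move(last_move, last_payoff, aspiration=2):
    """Win-continue, lose-reverse: repeat the previous action if the last
    payoff met the aspiration level, otherwise switch. The aspiration
    level of 2 is an illustrative choice."""
    if last_payoff >= aspiration:
        return last_move
    return 'D' if last_move == 'C' else 'C'

def play(rounds, start):
    """Two rule-followers play repeatedly; returns the move history."""
    a, b = start
    history = [start]
    for _ in range(rounds):
        pay_a, pay_b = PAYOFF[(a, b)], PAYOFF[(b, a)]
        a, b = next_move(a, pay_a), next_move(b, pay_b)
        history.append((a, b))
    return history

play(5, ('D', 'D'))  # mutual defection flips to ('C', 'C') and stays there
```

Note that the rule consults only the player's own last move and payoff, never the opponent's move or the payoff matrix itself, which is what makes it "zero-knowledge."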
Diagonalizing Tensor Covariants, Light-Cone Commutators, and Sum Rules
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lo, C. Y.
We derive fixed-mass sum rules for virtual Compton scattering in the forward direction. We use the methods of both Dicus, Jackiw, and Teplitz (for the absorptive parts) and Heimann, Hey, and Mandula (for the real parts). We find a set of tensor covariants such that the corresponding scalar amplitudes are proportional to simple t-channel parity-conserving helicity amplitudes. We give a relatively complete discussion of the convergence of the sum rules in a Regge model.
Optimal Government Subsidies to Universities in the Face of Tuition and Enrollment Constraints
ERIC Educational Resources Information Center
Easton, Stephen T.; Rockerbie, Duane W.
2008-01-01
This paper develops a simple static model of an imperfectly competitive university operating under government-imposed constraints on the ability to raise tuition fees and increase enrollments. The model has particular applicability to Canadian universities. Assuming an average cost pricing rule, rules for adequate government subsidies (operating…
A Simple Derivation of Chemically Important Classical Observables and Superselection Rules.
ERIC Educational Resources Information Center
Muller-Herold, U.
1985-01-01
Explores the question "Why are so many stationary states allowed by traditional quantum mechanics not realized in nature?" through discussion of classical observables and superselection rules. Three examples are given that can be used in introductory courses (including the fermion/boson property and the mass of a "nonrelativistic" particle). (JN)
Children's Task-Switching Efficiency: Missing Our Cue?
ERIC Educational Resources Information Center
Holt, Anna E.; Deák, Gedeon
2015-01-01
In simple rule-switching tests, 3- and 4-year-olds can follow each of two sorting rules but sometimes make perseverative errors when switching. Older children make few errors but respond slowly when switching. These age-related changes might reflect the maturation of executive functions (e.g., inhibition). However, they might also reflect…
Eliciting Systematic Rule Use in Covariation Judgment [the Early Years].
ERIC Educational Resources Information Center
Shaklee, Harriet; Paszek, Donald
Related research suggests that children may show some simple understanding of event covariations by the early elementary school years. The present experiments use a rule analysis methodology to investigate covariation judgments of children in this age range. In Experiment 1, children in second, third and fourth grade judged covariations on 12…
When Simple Things Are Meaningful: Working Memory Strength Predicts Children's Cognitive Flexibility
ERIC Educational Resources Information Center
Blackwell, Katharine A.; Cepeda, Nicholas J.; Munakata, Yuko
2009-01-01
People often perseverate, repeating outdated behaviors despite correctly answering questions about rules they should be following. Children who perseverate are slower to respond to such questions than children who successfully switch to new rules, even after controlling for age and processing speed. Thus, switchers may have stronger working memory…
Context-Sensitive Rules and Word Naming in Italian Children
ERIC Educational Resources Information Center
Barca, Laura; Ellis, Andrew W.; Burani, Cristina
2007-01-01
The present study examines the role of orthographic complexity on Italian children's word reading. Two experiments are reported in which elementary school children (3rd and 5th graders) read aloud words containing simple or contextual letter-sound conversion rules. In Experiment 1, both groups of participants read words containing contextual rules…
An Evaluation of the Good Behavior Game in Kindergarten Classrooms
ERIC Educational Resources Information Center
Donaldson, Jeanne M.; Vollmer, Timothy R.; Krous, Tangala; Downs, Susan; Berard, Kerri P.
2011-01-01
The good behavior game (GBG) is a classwide group contingency that involves dividing the class into two teams, creating simple rules, and arranging contingencies for breaking or following those rules. Five kindergarten teachers and classrooms participated in this evaluation of the GBG. Disruptive behavior markedly decreased in all five classrooms…
Atomic clusters and atomic surfaces in icosahedral quasicrystals.
Quiquandon, Marianne; Portier, Richard; Gratias, Denis
2014-05-01
This paper presents the basic tools commonly used to describe the atomic structures of quasicrystals with a specific focus on the icosahedral phases. After a brief recall of the main properties of quasiperiodic objects, two simple physical rules are discussed that lead one to eventually obtain a surprisingly small number of atomic structures as ideal quasiperiodic models for real quasicrystals. This is due to the fact that the atomic surfaces (ASs) used to describe all known icosahedral phases are located on high-symmetry special points in six-dimensional space. The first rule is maximizing the density using simple polyhedral ASs that leads to two possible sets of ASs according to the value of the six-dimensional lattice parameter A between 0.63 and 0.79 nm. The second rule is maximizing the number of complete orbits of high symmetry to construct as large as possible atomic clusters similar to those observed in complex intermetallic structures and approximant phases. The practical use of these two rules together is demonstrated on two typical examples of icosahedral phases, i-AlMnSi and i-CdRE (RE = Gd, Ho, Tm).
Design technology co-optimization for 14/10nm metal1 double patterning layer
NASA Astrophysics Data System (ADS)
Duan, Yingli; Su, Xiaojing; Chen, Ying; Su, Yajuan; Shao, Feng; Zhang, Recco; Lei, Junjiang; Wei, Yayi
2016-03-01
Design and technology co-optimization (DTCO) can satisfy the needs of the design, generate robust design rules, and avoid unfriendly patterns at an early design stage, ensuring a high level of manufacturability of the product within the technical capability of the present process. The DTCO methodology in this paper mainly includes design rule translation, layout analysis, model validation, hotspot classification, and design rule optimization. Combining DTCO with double patterning (DPT) can optimize the related design rules and generate friendlier layouts that meet the requirements of the 14/10nm technology node. The experiment demonstrates the DPT-compliant DTCO methodology applied to a metal1 layer at the 14/10nm node. The proposed DTCO workflow is an efficient solution for optimizing design rules for the 14/10nm-node metal1 layer, and the paper also verifies how to tune the design rules for U-shape and L-shape structures in a DPT-aware metal layer.
A simple rule reduces costs of extragroup parasitism in a communally breeding bird.
Riehl, Christina
2010-10-26
How do cooperatively breeding groups resist invasion by parasitic "cheaters," which dump their eggs in the communal nest but provide no parental care [1,2]? Here I show that Greater Anis (Crotophaga major), Neotropical cuckoos that nest in social groups containing several breeding females [3], use a simple rule based on the timing of laying to recognize and reject eggs laid by extragroup parasites. I experimentally confirmed that Greater Anis cannot recognize parasitic eggs based on the appearance of host egg phenotypes or on the number of eggs in the clutch. However, they can discriminate between freshly laid eggs and those that have already been incubated, and they accordingly eject asynchronous eggs. This mechanism is reliable in naturally parasitized nests, because group members typically lay their eggs in tight synchrony, whereas the majority of parasitic eggs are laid several days later. Rejection of asynchronous eggs therefore provides a rare empirical example of a complex, group-level behavior that arises through relatively simple "rules of thumb" without requiring advanced cognitive mechanisms such as learning, counting, or individual recognition. Copyright © 2010 Elsevier Ltd. All rights reserved.
A graph grammar approach to artificial life.
Kniemeyer, Ole; Buck-Sorlin, Gerhard H; Kurth, Winfried
2004-01-01
We present the high-level language of relational growth grammars (RGGs) as a formalism designed for the specification of ALife models. RGGs can be seen as an extension of the well-known parametric Lindenmayer systems and contain rule-based, procedural, and object-oriented features. They are defined as rewriting systems operating on graphs with the edges coming from a set of user-defined relations, whereas the nodes can be associated with objects. We demonstrate their ability to represent genes, regulatory networks of metabolites, and morphologically structured organisms, as well as developmental aspects of these entities, in a common formal framework. Mutation, crossing over, selection, and the dynamics of a network of gene regulation can all be represented with simple graph rewriting rules. This is demonstrated in some detail on the classical example of Dawkins' biomorphs and the ABC model of flower morphogenesis: other applications are briefly sketched. An interactive program was implemented, enabling the execution of the formalism and the visualization of the results.
Optimization of RET flow using test layout
NASA Astrophysics Data System (ADS)
Zhang, Yunqiang; Sethi, Satyendra; Lucas, Kevin
2008-11-01
At advanced technology nodes with extremely low-k1 lithography, it is very hard to meet image fidelity requirements and process window targets for some layout configurations. Quite often these layouts satisfy the simple design rule constraints for a given technology node, so it is important to include them in early RET flow development. Most RET development is based on layouts shrunk from the previous technology node, which is often not good enough. A better methodology for creating test layouts is required for optical proximity correction (OPC) recipe and assist feature development. In this paper we demonstrate the application of programmable test layouts in RET development. Layout pattern libraries are developed and embedded in a layout tool (ICWB). Assessment gauges are generated together with the patterns for quick correction accuracy assessment. Several groups of test pattern libraries have been developed based on learning from product patterns and a layout DOE approach, and the interaction between layout patterns and OPC recipes has been studied. Correction of a contact layer is quite challenging because of poor convergence and a low process window. We developed a test pattern library with many different contact configurations, studied different OPC schemes on these test layouts, and pinpointed the worst process window patterns for a given illumination condition. Assist features (AF) are frequently placed according to pre-determined rules to improve the lithography process window. These rules are usually derived from lithographic models and experiments, and direct validation of AF rules is required at the development phase. We use the test layout approach to determine rules that eliminate AF printability problems.
Hybrid overlay metrology with CDSEM in a BEOL patterning scheme
NASA Astrophysics Data System (ADS)
Leray, Philippe; Jehoul, Christiane; Inoue, Osamu; Okagawa, Yutaka
2015-03-01
Overlay metrology accuracy is a major concern for our industry. Advanced logic processes require tighter overlay control for multipatterning schemes. TIS (Tool Induced Shift) and WIS (Wafer Induced Shift) are the main issues for IBO (Image Based Overlay) and DBO (Diffraction Based Overlay). Methods of compensation have been introduced, some of which are very efficient at reducing these measured offsets. Another related question concerns overlay target design. These targets are never fully representative of the design rules; despite strong efforts, the device cannot be completely duplicated. Ideally, we would like to measure in the device itself to verify the real overlay value. Top-down CDSEM can measure critical dimensions of any structure and does not depend on a specific target design. It can also measure overlay errors, but only in specific cases such as LELE (Litho Etch Litho Etch) after final patterning. In this paper, we revisit the capability of the CDSEM at final patterning by measuring overlay in dedicated targets as well as inside a logic and an SRAM design. In the dedicated overlay targets, we study the measurement differences between design-rule gratings and relaxed-pitch gratings, the latter being the pitches usually used in IBO or DBO targets. Beyond this "simple" LELE case, we explore the capability of the CDSEM to measure overlay at litho level, even when not at final patterning. We assess the hybridization of DBO and CDSEM, with CDSEM serving as a reference for the optical tools after final patterning, and show that these reference data can be used to validate the DBO overlay results (correctables and residual fingerprints).
Implementing a Commercial Rule Base as a Medication Order Safety Net
Reichley, Richard M.; Seaton, Terry L.; Resetar, Ervina; Micek, Scott T.; Scott, Karen L.; Fraser, Victoria J.; Dunagan, W. Claiborne; Bailey, Thomas C.
2005-01-01
A commercial rule base (Cerner Multum) was used to identify medication orders exceeding recommended dosage limits at five hospitals within BJC HealthCare, an integrated health care system. During initial testing, clinical pharmacists determined that there was an excessive number of nuisance and clinically insignificant alerts, with an overall alert rate of 9.2%. A method for customizing the commercial rule base was implemented to increase rule specificity for problematic rules. The system was subsequently deployed at two facilities and achieved alert rates of less than 1%. Pharmacists screened these alerts and contacted ordering physicians in 21% of cases. Physicians made therapeutic changes in response to 38% of alerts presented to them. By applying simple techniques to customize rules, commercial rule bases can be used to rapidly deploy a safety net to screen drug orders for excessive dosages, while preserving the rule architecture for later implementations of more finely tuned clinical decision support. PMID:15802481
Jia, Xiuqin; Liang, Peipeng; Shi, Lin; Wang, Defeng; Li, Kuncheng
2015-01-01
In neuroimaging studies, increased task complexity can lead to increased activation in task-specific regions or to activation of additional regions. How the brain adapts to increased rule complexity during inductive reasoning remains unclear. In the current study, three types of problems were created: simple rule induction (i.e., SI, with rule complexity of 1), complex rule induction (i.e., CI, with rule complexity of 2), and perceptual control. Our findings revealed that increased activations accompany increased rule complexity in the right dorsal lateral prefrontal cortex (DLPFC) and medial posterior parietal cortex (precuneus). A cognitive model predicted both the behavioral and brain imaging results. The current findings suggest that neural activity in frontal and parietal regions is modulated by rule complexity, which may shed light on the neural mechanisms of inductive reasoning. Copyright © 2014. Published by Elsevier Ltd.
Sea-level rise and shoreline retreat: time to abandon the Bruun Rule
NASA Astrophysics Data System (ADS)
Cooper, J. Andrew G.; Pilkey, Orrin H.
2004-11-01
In the face of a global rise in sea level, understanding the response of the shoreline to changes in sea level is a critical scientific goal to inform policy makers and managers. A body of scientific information exists that illustrates both the complexity of the linkages between sea-level rise and shoreline response, and the comparative lack of understanding of these linkages. In spite of this lack of understanding, many appraisals have been undertaken that employ a concept known as the "Bruun Rule", a simple two-dimensional model of shoreline response to rising sea level. The model has seen near-global application since its original formulation in 1954. The concept provided an advance in understanding of the coastal system at the time of its first publication. It has, however, been superseded by numerous subsequent findings and is now invalid. Several assumptions behind the Bruun Rule are known to be false and nowhere has the Bruun Rule been adequately proven; on the contrary, several studies disprove it in the field. No universally applicable model of shoreline retreat under sea-level rise has yet been developed. Despite this, the Bruun Rule is in widespread contemporary use at a global scale, both as a management tool and as a scientific concept. The persistence of this concept beyond its original assumption base is attributed to the following factors: the appeal of a simple, easy-to-use analytical model; the difficulty of determining the relative validity of 'proofs' and 'disproofs'; its ease of application; positive advocacy by some scientists; uncritical application by other scientists; the simple numerical expression of the model; and the lack of easy alternatives. The Bruun Rule has no power for predicting shoreline behaviour under rising sea level and should be abandoned. It is a concept whose time has passed.
The belief by policy makers that it offers a prediction of future shoreline position may well have stifled much-needed research into the coastal response to sea-level rise.
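For readers unfamiliar with the model under critique, the Bruun Rule is commonly written in the following form (a standard textbook statement, not quoted from the abstract above; the symbols are the conventional ones):

```latex
% Bruun Rule: predicted shoreline retreat R for a sea-level rise S
% L_* : cross-shore width of the active profile
% B   : berm (or dune) height,  h_* : closure depth
R = \frac{L_*}{B + h_*}\, S
```

Because the ratio L_*/(B + h_*) is typically large, the rule predicts a horizontal retreat one to two orders of magnitude greater than the vertical rise, which is part of its appeal as a quick estimate.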
Economics and computer science of a radio spectrum reallocation.
Leyton-Brown, Kevin; Milgrom, Paul; Segal, Ilya
2017-07-11
The recent "incentive auction" of the US Federal Communications Commission was the first auction to reallocate radio frequencies between two different kinds of uses: from broadcast television to wireless Internet access. The design challenge was not just to choose market rules to govern a fixed set of potential trades but also, to determine the broadcasters' property rights, the goods to be exchanged, the quantities to be traded, the computational procedures, and even some of the performance objectives. An essential and unusual challenge was to make the auction simple enough for human participants while still ensuring that the computations would be tractable and capable of delivering nearly efficient outcomes.
Discrete shaped strain sensors for intelligent structures
NASA Technical Reports Server (NTRS)
Andersson, Mark S.; Crawley, Edward F.
1992-01-01
Design of discrete, highly distributed sensor systems for intelligent structures has been studied. Data obtained indicate that discrete strain-averaging sensors satisfy the functional requirements for distributed sensing of intelligent structures. Bartlett and Gauss-Hanning sensors, in particular, provide good wavenumber characteristics while meeting the functional requirements. They are characterized by good rolloff rates and positive Fourier transforms for all wavenumbers. For the numerical integration schemes, Simpson's rule is considered to be very simple to implement and consistently provides accurate results for five sensors or more. It is shown that a sensor system that satisfies the functional requirements can be applied to a structure that supports mode shapes with purely sinusoidal curvature.
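The integration scheme mentioned above is easy to make concrete. The following sketch applies composite Simpson's rule to five equally spaced readings; the integrand, spacing, and "sensor" values are illustrative, not taken from the paper:

```python
def simpson(samples, dx):
    """Composite Simpson's rule over equally spaced samples.

    Requires an odd number of samples (an even number of intervals),
    e.g. the five-sensor case mentioned in the abstract.
    """
    n = len(samples) - 1
    if n < 2 or n % 2 != 0:
        raise ValueError("need an odd number of samples (>= 3)")
    total = samples[0] + samples[-1]
    total += 4 * sum(samples[1:-1:2])   # odd-indexed interior points
    total += 2 * sum(samples[2:-1:2])   # even-indexed interior points
    return total * dx / 3

# Five equally spaced 'sensor' readings of f(x) = x**2 on [0, 1]:
xs = [i / 4 for i in range(5)]
print(simpson([x**2 for x in xs], dx=0.25))  # -> 0.3333333333333333 (1/3; Simpson is exact for cubics)
```

The quadratic convergence for smooth data is consistent with the abstract's observation that five or more sensors already give accurate results.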
An Integrated Product Environment
NASA Technical Reports Server (NTRS)
Higgins, Chuck
1997-01-01
Mechanical Advantage is a mechanical design decision support system. Unlike our CAD/CAM cousins, Mechanical Advantage addresses true engineering processes, not just the form and fit of geometry. If we look at a traditional engineering environment, we see that an engineer starts with two things: performance goals and design rules. The intent is to have a product perform specific functions and accomplish that within a designated environment. Geometry should be a simple byproduct of that engineering process, not the controller of it. Mechanical Advantage is a performance modeler that allows engineers to consider all these criteria in making their decisions by providing such capabilities as critical parameter analysis, tolerance and sensitivity analysis, math-driven geometry, and automated design optimization. If you desire an industry-standard solid model, we can produce an ACIS-based solid model. If you desire an ANSI/ISO-standard drawing, we can produce this as well with a virtual push of a button. For more information on this and other Advantage Series products, please contact the author.
Generating Concise Rules for Human Motion Retrieval
NASA Astrophysics Data System (ADS)
Mukai, Tomohiko; Wakisaka, Ken-Ichi; Kuriyama, Shigeru
This paper proposes a method for retrieving human motion data with concise retrieval rules based on the spatio-temporal features of motion appearance. Our method first converts each motion clip into a clausal language that represents the geometrical relations between body parts and their temporal relationships. A retrieval rule is then learned from a set of manually classified examples using inductive logic programming (ILP). ILP automatically discovers the essential rule in the same clausal form with a user-defined hypothesis-testing procedure. All motions are indexed using this clausal language, and the desired clips are retrieved by subsequence matching using the rule. Such rule-based retrieval offers reasonable performance, and the rule can be intuitively edited in the same language form. Consequently, our method enables efficient and flexible search of a large dataset with a simple query language.
NASA Astrophysics Data System (ADS)
Nozaki, Daijiro; Avdoshenko, Stanislav M.; Sevinçli, Hâldun; Gutierrez, Rafael; Cuniberti, Gianaurelio
2013-03-01
Recently the interest in quantum interference (QI) phenomena in molecular devices (molecular junctions) has been growing due to the unique features observed in their transmission spectra. In order to design single-molecule devices that exploit QI effects as desired, it is necessary to provide simple rules for predicting the appearance of QI effects such as anti-resonances or Fano line shapes and for controlling them. In this study, we derive the transmission function of a generic molecular junction with a side group (a T-shaped molecular junction) using a minimal toy model. We develop a simple method to predict the appearance of quantum interference, Fano resonances or anti-resonances, and their positions in the conductance spectrum by introducing a simple graphical representation (the parabolic model). With it we can easily visualize the relation between the key electronic parameters and the positions of the normal resonant peaks and the anti-resonant peaks induced by quantum interference in the conductance spectrum. We also demonstrate Fano resonances and anti-resonances in T-shaped molecular junctions using a simple tight-binding model. This parabolic model enables one to infer the on-site energies of T-shaped molecules and the coupling between the side group and the main conduction channel from transmission spectra.
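The anti-resonance mechanism described here can be sketched in a few lines. In a wide-band-limit toy model (all parameter values below are illustrative assumptions, not numbers from the paper), the dangling side group renormalizes the conducting site's energy, which diverges at the side-group level and forces the transmission to zero there:

```python
def transmission(E, eps0=0.0, eps1=0.5, t=0.3, gamma=0.2):
    """Toy wide-band-limit transmission through a single site (eps0)
    with a dangling side group (eps1) coupled by hopping t.

    The side group renormalizes the site energy:
        eps_eff(E) = eps0 + t**2 / (E - eps1)
    which diverges at E = eps1, forcing T -> 0 (the anti-resonance).
    """
    eps_eff = eps0 + t**2 / (E - eps1)
    return gamma**2 / ((E - eps_eff)**2 + gamma**2)

# Transmission is strongly suppressed near the side-group level E = eps1 = 0.5,
# but remains finite away from the dip:
print(transmission(0.499))   # near-zero (anti-resonance)
print(transmission(-0.5))    # finite background transmission
```

Scanning `E` across `eps1` reproduces the characteristic dip; an asymmetric Fano line shape appears when the dip sits close to the resonance at `eps0`.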
10 CFR Appendix C to Part 52 - Design Certification Rule for the AP600 Design
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 2 2012-01-01 2012-01-01 false Design Certification Rule for the AP600 Design C Appendix C to Part 52 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSES, CERTIFICATIONS, AND APPROVALS FOR NUCLEAR POWER PLANTS Pt. 52, App. C Appendix C to Part 52—Design Certification Rule for the...
26 CFR 1.1441-0 - Outline of regulation provisions for section 1441.
Code of Federal Regulations, 2010 CFR
2010-04-01
...) Proof that tax liability has been satisfied. (iii) Liability for interest and penalties. (iv) Special...) General rule. (B) Foreign partnerships. (C) Foreign simple trusts and foreign grantor trusts. (D) Other... amounts. (23) Flow-through entity. (24) Foreign simple trust. (25) Foreign complex trust. (26) Foreign...
NASA Astrophysics Data System (ADS)
Prasitmeeboon, Pitcha
Repetitive control (RC) is a control method that specifically aims to converge to zero tracking error in control systems that execute a periodic command or are subject to periodic disturbances of known period. It uses the error from one period back to adjust the command in the present period. In theory, RC can completely eliminate periodic disturbance effects. RC has applications in many fields such as high-precision manufacturing in robotics, computer disk drives, and active vibration isolation in spacecraft. The first topic treated in this dissertation develops several simple RC design methods that are somewhat analogous to PID controller design in classical control. From the early days of digital control, emulation methods were developed based on a Forward Rule, a Backward Rule, Tustin's Formula, a modification using prewarping, and a pole-zero mapping method. These allowed one to convert a candidate controller design to discrete time in a simple way. We investigate to what extent they can be used to simplify RC design. A particular design is developed from a modification of the pole-zero mapping rules, which is simple and sheds light on the robustness of repetitive control designs. RC convergence requires less than 90 degrees of model phase error at all frequencies up to Nyquist. A zero-phase cutoff filter is normally used to robustify against high-frequency model error when this limit is exceeded. The result is stabilization at the expense of failure to cancel errors above the cutoff. The second topic investigates a series of methods that use data to make real-time updates of the frequency response model, allowing one to raise or eliminate the frequency cutoff. These include the use of a moving window employing a recursive discrete Fourier transform (DFT), and the use of a real-time projection algorithm from adaptive control for each frequency.
The results can be used directly to make repetitive control corrections that cancel each error frequency, or they can be used to update a repetitive control FIR compensator. The aim is to reduce the final error level by using real-time frequency response model updates to successively increase the cutoff frequency, each time creating the improved model needed to produce convergence to zero error up to the higher cutoff. Non-minimum phase systems present a difficult design challenge to the sister field of Iterative Learning Control. The third topic investigates to what extent the same challenges appear in RC. One challenge is that the intrinsic non-minimum phase zero mapped from continuous time is close to the repetitive controller pole at +1, creating behavior similar to pole-zero cancellation. The near pole-zero cancellation causes slow learning at DC and low frequencies. A Min-Max cost function over the learning rate is presented, which can be reformulated as a quadratically constrained linear programming problem. This approach is shown to be an RC design method that addresses the main challenge of non-minimum phase systems: obtaining a reasonable learning rate at DC. Although the Min-Max objective was shown to improve learning at DC and low frequencies compared to other designs, the method requires model accuracy at high frequencies, where real-world models usually have error. The fourth topic addresses how one can merge a quadratic penalty into the Min-Max cost function to increase robustness at high frequencies. It also considers limiting the Min-Max optimization to some frequency interval and applying an FIR zero-phase low-pass filter to cut off the learning above that interval.
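The core RC idea (adjust this period's command using last period's error) can be sketched on a toy static plant. The plant, learning gain, and disturbance below are illustrative assumptions, not the dissertation's systems:

```python
# Minimal repetitive-control sketch: the command for the next period is
# the current command plus a learning gain times last period's error.
N = 8                                          # samples per period
d = [0.5 * ((-1) ** i) for i in range(N)]      # periodic disturbance
r = [1.0] * N                                  # periodic reference
u = [0.0] * N                                  # command, updated once per period
phi = 0.8                                      # learning gain (converges for 0 < phi < 2 here)

for period in range(50):
    e = [r[i] - (u[i] + d[i]) for i in range(N)]   # trivial plant: y = u + d
    u = [u[i] + phi * e[i] for i in range(N)]      # RC update

print(max(abs(r[i] - (u[i] + d[i])) for i in range(N)))  # residual error is negligibly small
```

Each sample's error shrinks by the factor (1 - phi) per period, which is the scalar analogue of the frequency-domain convergence condition quoted in the abstract; real plants with dynamics and model error are what make the phase and cutoff issues above nontrivial.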
Reward-Modulated Hebbian Plasticity as Leverage for Partially Embodied Control in Compliant Robotics
Burms, Jeroen; Caluwaerts, Ken; Dambre, Joni
2015-01-01
In embodied computation (or morphological computation), part of the complexity of motor control is offloaded to the body dynamics. We demonstrate that a simple Hebbian-like learning rule can be used to train systems with (partial) embodiment, and can be extended outside of the scope of traditional neural networks. To this end, we apply the learning rule to optimize the connection weights of recurrent neural networks with different topologies and for various tasks. We then apply this learning rule to a simulated compliant tensegrity robot by optimizing static feedback controllers that directly exploit the dynamics of the robot body. This leads to partially embodied controllers, i.e., hybrid controllers that naturally integrate the computations that are performed by the robot body into a neural network architecture. Our results demonstrate the universal applicability of reward-modulated Hebbian learning. Furthermore, they demonstrate the robustness of systems trained with the learning rule. This study strengthens our belief that compliant robots should or can be seen as computational units, instead of dumb hardware that needs a complex controller. This link between compliant robotics and neural networks is also the main reason for our search for simple universal learning rules for both neural networks and robotics. PMID:26347645
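A minimal version of such a reward-modulated ("three-factor") Hebbian rule can be sketched as follows. The task, constants, and baseline scheme are illustrative assumptions, not the paper's setup:

```python
import random

random.seed(0)

# Weight change = learning rate x (reward - baseline) x presynaptic input
# x exploration noise. A single linear unit learns the target mapping
# y* = 2x from reward alone, with no explicit gradient.
w, eta, sigma = 0.0, 0.2, 0.5
baseline = 0.0
for step in range(2000):
    x = random.choice((-1.0, 1.0))         # presynaptic input
    noise = random.gauss(0.0, sigma)       # output exploration noise
    y = w * x + noise
    reward = -(y - 2.0 * x) ** 2           # higher reward = smaller error
    w += eta * (reward - baseline) * noise * x
    baseline += 0.1 * (reward - baseline)  # running reward baseline

print(abs(w - 2.0) < 0.5)  # -> True: w has drifted toward the target weight 2.0
```

Averaged over the noise, the update follows the reward gradient, which is why the same rule transfers from recurrent networks to feedback controllers for a tensegrity body as the abstract describes.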
Butz, Markus; van Ooyen, Arjen
2013-01-01
Lasting alterations in sensory input trigger massive structural and functional adaptations in cortical networks. The principles governing these experience-dependent changes are, however, poorly understood. Here, we examine whether a simple rule based on the neurons' need for homeostasis in electrical activity may serve as driving force for cortical reorganization. According to this rule, a neuron creates new spines and boutons when its level of electrical activity is below a homeostatic set-point and decreases the number of spines and boutons when its activity exceeds this set-point. In addition, neurons need a minimum level of activity to form spines and boutons. Spine and bouton formation depends solely on the neuron's own activity level, and synapses are formed by merging spines and boutons independently of activity. Using a novel computational model, we show that this simple growth rule produces neuron and network changes as observed in the visual cortex after focal retinal lesions. In the model, as in the cortex, the turnover of dendritic spines was increased strongest in the center of the lesion projection zone, while axonal boutons displayed a marked overshoot followed by pruning. Moreover, the decrease in external input was compensated for by the formation of new horizontal connections, which caused a retinotopic remapping. Homeostatic regulation may provide a unifying framework for understanding cortical reorganization, including network repair in degenerative diseases or following focal stroke. PMID:24130472
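The growth rule can be caricatured for a single neuron. The activity model and rate constants below are illustrative assumptions; the paper's actual model operates at network scale with separate spines and boutons:

```python
# Homeostatic growth-rule sketch: add synaptic elements when activity is
# below the set-point, prune when above, and form nothing when too silent.
setpoint = 1.0
min_activity = 0.1        # below this, no new elements are formed
elements = 0.0            # synaptic element count (continuous here)
external_input = 0.4

for step in range(500):
    activity = external_input + 0.05 * elements    # toy activity model
    if activity < min_activity:
        pass                                       # too silent to grow
    elif activity < setpoint:
        elements += 0.5 * (setpoint - activity)    # grow toward set-point
    else:
        elements -= 0.5 * (activity - setpoint)    # prune when too active

print(round(external_input + 0.05 * elements, 3))  # -> 1.0 (activity settles at the set-point)
```

Lowering `external_input` (the analogue of a retinal lesion) makes the neuron grow extra elements until the set-point is restored, the single-cell version of the compensatory rewiring described in the abstract.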
Assessing predation risk: optimal behaviour and rules of thumb.
Welton, Nicky J; McNamara, John M; Houston, Alasdair I
2003-12-01
We look at a simple model in which an animal makes behavioural decisions over time in an environment in which all parameters are known to the animal except predation risk. In the model there is a trade-off between gaining information about predation risk and anti-predator behaviour. All predator attacks lead to death for the prey, so that the prey learns about predation risk by virtue of the fact that it is still alive. We show that it is not usually optimal to behave as if the current unbiased estimate of the predation risk is its true value. We consider two different ways to model reproduction; in the first scenario the animal reproduces throughout its life until it dies, and in the second scenario expected reproductive success depends on the level of energy reserves the animal has gained by some point in time. For both of these scenarios we find results on the form of the optimal strategy and give numerical examples which compare optimal behaviour with behaviour under simple rules of thumb. The numerical examples suggest that the value of the optimal strategy over the rules of thumb is greatest when there is little current information about predation risk, learning is not too costly in terms of predation, and it is energetically advantageous to learn about predation. We find that for the model and parameters investigated, a very simple rule of thumb such as 'use the best constant control' performs well.
Engineering modular ‘ON’ RNA switches using biological components
Ceres, Pablo; Trausch, Jeremiah J.; Batey, Robert T.
2013-01-01
Riboswitches are cis-acting regulatory elements broadly distributed in bacterial mRNAs that control a wide range of critical metabolic activities. Expression is governed by two distinct domains within the mRNA leader: a sensory ‘aptamer domain’ and a regulatory ‘expression platform’. Riboswitches have also received considerable attention as important tools in synthetic biology because of their conceptually simple structure and the ability to obtain aptamers that bind almost any conceivable small molecule using in vitro selection (referred to as SELEX). In the design of artificial riboswitches, a significant hurdle has been to couple the two domains enabling their efficient communication. We previously demonstrated that biological transcriptional ‘OFF’ expression platforms are easily coupled to diverse aptamers, both biological and SELEX-derived, using simple design rules. Here, we present two modular transcriptional ‘ON’ riboswitch expression platforms that are also capable of hosting foreign aptamers. We demonstrate that these biological parts can be used to facilely generate artificial chimeric riboswitches capable of robustly regulating transcription both in vitro and in vivo. We expect that these modular expression platforms will be of great utility for various synthetic biological applications that use RNA-based biosensors. PMID:23999097
Human anatomy nomenclature rules for the computer age.
Neumann, Paul E; Baud, Robert; Sprumont, Pierre
2017-04-01
Information systems are increasing in importance in biomedical sciences and medical practice. The nomenclature rules of human anatomy were reviewed for adequacy with respect to modern needs. New rules are proposed here to ensure that each Latin term is uniquely associated with an anatomical entity, as short and simple as possible, and machine-interpretable. Observance of these recommendations will also benefit students and translators of the Latin terms into other languages. Clin. Anat. 30:300-302, 2017. © 2016 Wiley Periodicals, Inc.
A Taxonomy of Network Centric Warfare Architectures
2008-01-01
mound structure emerges as a result of the termites following very simple rules, and exchanging very simple pheromone signals (Solé & Goodwin 2000...only fairly simple decisions.” For example, in far northern Australia, “magnetic termites” build large termite mounds which are oriented north-south...and contain a complex ventilation system which controls temperature, humidity, and oxygen levels. But termite brains are too small to store a plan
Mapping Network Centric Operational Architectures to C2 and Software Architectures
2007-06-01
Instead, the termite mound structure emerges as a result of the termites following very simple rules, and exchanging very simple pheromone signals...Each worker need make only fairly simple decisions.” For example, in far northern Australia, “magnetic termites” build large termite mounds which are...oriented north-south and contain a complex ventilation system which controls temperature, humidity, and oxygen levels. But termite brains are too
Real-time PCR (qPCR) primer design using free online software.
Thornton, Brenda; Basu, Chhandak
2011-01-01
Real-time PCR (quantitative PCR or qPCR) has become the preferred method for validating results obtained from assays which measure gene expression profiles. The process uses reverse transcription polymerase chain reaction (RT-PCR), coupled with fluorescent chemistry, to measure variations in transcriptome levels between samples. The four most commonly used fluorescent chemistries are SYBR® Green dyes and TaqMan®, Molecular Beacon or Scorpion probes. SYBR® Green is very simple to use and cost efficient. As SYBR® Green dye binds to any double-stranded DNA product, its success depends greatly on proper primer design. Many types of online primer design software are available, which can be used free of charge to design desirable SYBR® Green-based qPCR primers. This laboratory exercise is intended for those who have a fundamental background in PCR. It addresses the basic fluorescent chemistries of real-time PCR, the basic rules and pitfalls of primer design, and provides a step-by-step protocol for designing SYBR® Green-based primers with free, online software. Copyright © 2010 Wiley Periodicals, Inc.
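The basic rules of thumb behind such primer design software can be sketched in a few lines. The Wallace-rule melting temperature and the Tm/GC thresholds below follow common textbook guidance and are not tied to any particular online tool:

```python
def primer_stats(seq):
    """Rule-of-thumb qPCR primer checks: Wallace-rule melting temperature
    Tm = 2*(A+T) + 4*(G+C) and GC content. Thresholds follow common
    textbook guidance (Tm ~50-65 C, GC 40-60%), not any specific tool."""
    seq = seq.upper()
    gc = seq.count("G") + seq.count("C")
    at = seq.count("A") + seq.count("T")
    tm = 2 * at + 4 * gc
    gc_pct = 100.0 * gc / len(seq)
    ok = 50 <= tm <= 65 and 40 <= gc_pct <= 60
    return tm, gc_pct, ok

print(primer_stats("ATGCATGCATGCATGCATGC"))  # -> (60, 50.0, True)
```

Production tools add further checks (hairpins, primer-dimers, 3' stability, nearest-neighbor Tm models), but simple screens like this catch the most common design pitfalls the exercise warns about.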
A Consistent Set of Oxidation Number Rules for Intelligent Computer Tutoring
NASA Astrophysics Data System (ADS)
Holder, Dale A.; Johnson, Benny G.; Karol, Paul J.
2002-04-01
We have developed a method for assigning oxidation numbers that eliminates the inconsistencies and ambiguities found in most conventional textbook rules, yet remains simple enough for beginning students to use. It involves imposition of a two-level hierarchy on a set of rules similar to those already being taught. We recommend emphasizing that the oxidation number method is an approximate model and cannot always be successfully applied. This proper perspective will lead students to apply the rules more carefully in all problems. Whenever failure does occur, it will indicate the limitations of the oxidation number concept itself, rather than merely the failure of a poorly constructed set of rules. We have used these improved rules as the basis for an intelligent tutoring program on oxidation numbers.
Funk, Christopher S; Cohen, K Bretonnel; Hunter, Lawrence E; Verspoor, Karin M
2016-09-09
Gene Ontology (GO) terms represent the standard for annotation and representation of molecular functions, biological processes and cellular compartments, but a large gap exists between the way concepts are represented in the ontology and how they are expressed in natural language text. The construction of highly specific GO terms is formulaic, consisting of parts and pieces from more simple terms. We present two different types of manually generated rules to help capture the variation of how GO terms can appear in natural language text. The first set of rules takes into account the compositional nature of GO and recursively decomposes the terms into their smallest constituent parts. The second set of rules generates derivational variations of these smaller terms and compositionally combines all generated variants to form the original term. By applying both types of rules, new synonyms are generated for two-thirds of all GO terms and an increase in F-measure performance for recognition of GO on the CRAFT corpus from 0.498 to 0.636 is observed. Additionally, we evaluated the combination of both types of rules over one million full text documents from Elsevier; manual validation and error analysis show we are able to recognize GO concepts with reasonable accuracy (88 %) based on random sampling of annotations. In this work we present a set of simple synonym generation rules that utilize the highly compositional and formulaic nature of the Gene Ontology concepts. We illustrate how the generated synonyms aid in improving recognition of GO concepts on two different biomedical corpora. We discuss other applications of our rules for GO ontology quality assurance, explore the issue of overgeneration, and provide examples of how similar methodologies could be applied to other biomedical terminologies. Additionally, we provide all generated synonyms for use by the text-mining community.
Molecular implementation of simple logic programs.
Ran, Tom; Kaplan, Shai; Shapiro, Ehud
2009-10-01
Autonomous programmable computing devices made of biomolecules could interact with a biological environment and be used in future biological and medical applications. Biomolecular implementations of finite automata and logic gates have already been developed. Here, we report an autonomous programmable molecular system based on the manipulation of DNA strands that is capable of performing simple logical deductions. Using molecular representations of facts such as Man(Socrates) and rules such as Mortal(X) <-- Man(X) (Every Man is Mortal), the system can answer molecular queries such as Mortal(Socrates)? (Is Socrates Mortal?) and Mortal(X)? (Who is Mortal?). This biomolecular computing system compares favourably with previous approaches in terms of expressive power, performance and precision. A compiler translates facts, rules and queries into their molecular representations and subsequently operates a robotic system that assembles the logical deductions and delivers the result. This prototype is the first simple programming language with a molecular-scale implementation.
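The Socrates deduction in the abstract can be mirrored in software by a naive forward-chaining sketch. This illustrates the logic-programming semantics only, not the molecular implementation:

```python
def derive(facts, rules):
    """Naive forward chaining for unary Datalog-style rules of the form
    Head(X) <- Body(X), matching the Mortal(X) <- Man(X) example.
    `facts` is a set of (predicate, argument) pairs; `rules` is a list
    of (head_predicate, body_predicate) pairs."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for head, body in rules:
            for pred, arg in list(facts):
                if pred == body and (head, arg) not in facts:
                    facts.add((head, arg))  # fire the rule
                    changed = True
    return facts
```

A query such as Mortal(Socrates)? then reduces to a membership test on the derived fact set.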
When push comes to shove: Exclusion processes with nonlocal consequences
NASA Astrophysics Data System (ADS)
Almet, Axel A.; Pan, Michael; Hughes, Barry D.; Landman, Kerry A.
2015-11-01
Stochastic agent-based models are useful for modelling collective movement of biological cells. Lattice-based random walk models of interacting agents where each site can be occupied by at most one agent are called simple exclusion processes. An alternative motility mechanism to simple exclusion is formulated, in which agents are granted more freedom to move under the compromise that interactions are no longer necessarily local. This mechanism is termed shoving. A nonlinear diffusion equation is derived for a single population of shoving agents using mean-field continuum approximations. A continuum model is also derived for a multispecies problem with interacting subpopulations, which either obey the shoving rules or the simple exclusion rules. Numerical solutions of the derived partial differential equations compare well with averaged simulation results for both the single species and multispecies processes in two dimensions, while some issues arise in one dimension for the multispecies case.
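As a minimal illustration of the simple exclusion baseline that the shoving mechanism relaxes, the sketch below performs one sweep of a 1-D exclusion process on a ring. It is a toy under stated assumptions, not the authors' model; in particular it omits the nonlocal shoving moves:

```python
import random

def exclusion_step(occ, rng=random):
    """One sweep of a 1-D simple exclusion process on a ring: each agent
    attempts a unit step in a random direction, and the move is aborted
    if the target site is already occupied (at most one agent per site).
    Note: in this simple sweep a site vacated and refilled within one
    pass can let an agent hop twice; acceptable for a sketch."""
    n = len(occ)
    for i in [j for j in range(n) if occ[j]]:
        tgt = (i + rng.choice([-1, 1])) % n
        if not occ[tgt]:
            occ[i], occ[tgt] = 0, 1
    return occ
```

Agent number is conserved by construction, which is the property the mean-field continuum limits in the paper rely on.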
Self-Organized Dynamic Flocking Behavior from a Simple Deterministic Map
NASA Astrophysics Data System (ADS)
Krueger, Wesley
2007-10-01
Coherent motion exhibiting large-scale order, such as flocking, swarming, and schooling behavior in animals, can arise from simple rules applied to an initial random array of self-driven particles. We present a completely deterministic dynamic map that exhibits emergent, collective, complex motion for a group of particles. Each individual particle is driven with a constant speed in two dimensions adopting the average direction of a fixed set of non-spatially related partners. In addition, the particle changes direction by π as it reaches a circular boundary. The dynamical patterns arising from these rules range from simple circular-type convective motion to highly sophisticated, complex, collective behavior which can be easily interpreted as flocking, schooling, or swarming depending on the chosen parameters. We present the results as a series of short movies and we also explore possible order parameters and correlation functions capable of quantifying the resulting coherence.
Technology: Digital Photography in an Inner-City Fifth Grade, Part 2
ERIC Educational Resources Information Center
Riner, Phil
2005-01-01
Last month, Phil Riner began discussing his project of teaching digital photography and prosocial behavior skills to inner-city fifth-graders. This work led him to generate some very specific procedures for camera care and use. Phil also taught the students some simple rules for taking better photos. These rules fell into four broad categories:…
An Uncommon Approach to a Common Algebraic Error
ERIC Educational Resources Information Center
Rossi, Paul S.
2008-01-01
The basic rules of elementary algebra can often appear beyond the grasp of many students. Even though most subjects, including calculus, prove to be more difficult, it is the simple rules of algebra that continue to be the "thorn in the side" of many mathematics students. In this paper we present a result intended to help students achieve a…
ERIC Educational Resources Information Center
Kundey, Shannon M. A.; Strandell, Brittany; Mathis, Heather; Rowan, James D.
2010-01-01
Hulse and Dorsky (1977, 1979) found that rats, like humans, learn sequences following a simple rule-based structure more quickly than those lacking a rule-based structure. Through two experiments, we explored whether two additional species--domesticated horses ("Equus caballus") and chickens ("Gallus domesticus")--would…
Design rules for RCA self-aligned silicon-gate CMOS/SOS process
NASA Technical Reports Server (NTRS)
1977-01-01
The CMOS/SOS design rules prepared by the RCA Solid State Technology Center (SSTC) are described. These rules specify the spacing and width requirements for each of the six design levels, the seventh level being used to define openings in the passivation level. An associated report, entitled Silicon-Gate CMOS/SOS Processing, provides further insight into the usage of these rules.
A retrospective study of two populations to test a simple rule for spirometry.
Ohar, Jill A; Yawn, Barbara P; Ruppel, Gregg L; Donohue, James F
2016-06-04
Chronic lung disease is common and often under-diagnosed. To test a simple rule for conducting spirometry, we reviewed spirograms from two populations: occupational medicine evaluations (OME) conducted by Saint Louis and Wake Forest Universities at 3 sites (n = 3260, mean age 64.14 years, 95 % CI 58.94-69.34, 97 % men) and evaluations conducted by the Wake Forest University preoperative clinic (POC) at one site (n = 845, mean age 62.10 years, 95 % CI 50.46-73.74, 57 % men). This retrospective review of a database, collected prospectively by the first author, identified the rates, types, sensitivity, specificity, and positive and negative predictive values of lung function abnormalities, and the associated mortality rate, when spirometry was conducted based on the 20/40 rule (≥20 years of smoking in those aged ≥40 years) in the OME population. To determine the reproducibility of the 20/40 rule for conducting spirometry, the rule was applied to the POC population. A lung function abnormality was found in 74 % of the OME population and 67 % of the POC population. Sensitivity of the rule was 85 % for an obstructive pattern and 77 % for any abnormality on spirometry. Positive and negative predictive values of the rule for a spirometric abnormality were 74 and 55 %, respectively. Patients with an obstructive pattern were at greater risk of coronary heart disease (odds ratio (OR) 1.39 [confidence interval (CI) 1.00-1.93] vs. normal) and death (hazard ratio (HR) 1.53, 95 % CI 1.20-1.84) than subjects with normal spirometry. Restrictive spirometry patterns were also associated with greater risk of coronary disease (OR 1.7 [CI 1.23-2.35]) and death (HR 1.40, 95 % CI 1.08-1.72). Smokers (≥20 pack-years) aged ≥40 years are at increased risk for lung function abnormalities, and those abnormalities are associated with a greater presence of coronary heart disease and increased all-cause mortality.
Use of the 20/40 rule could provide a simple method to enhance selection of candidates for spirometry evaluation in the primary care setting.
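The 20/40 rule itself reduces to a one-line predicate, sketched here as stated in the abstract (note the abstract phrases the smoking criterion both as ≥20 years of smoking and as ≥20 pack-years):

```python
def meets_20_40_rule(smoking_years, age_years):
    """The 20/40 rule from the abstract: conduct spirometry for patients
    with >= 20 years of smoking (elsewhere phrased as >= 20 pack-years)
    who are aged >= 40 years."""
    return smoking_years >= 20 and age_years >= 40
```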
Design Rules for Life Support Systems
NASA Technical Reports Server (NTRS)
Jones, Harry
2002-01-01
This paper considers some of the common assumptions and engineering rules of thumb used in life support system design. One general design rule is that the longer the mission, the more the life support system should use recycling and regenerable technologies. A more specific rule is that, if the system grows more than half the food, the food plants will supply all the oxygen needed for the crew life support. There are many such design rules that help in planning the analysis of life support systems and in checking results. These rules are typically if-then statements describing the results of steady-state, "back of the envelope," mass flow calculations. They are useful in identifying plausible candidate life support system designs and in rough allocations between resupply and resource recovery. Life support system designers should always review the design rules and make quick steady state calculations before doing detailed design and dynamic simulation. This paper develops the basis for the different assumptions and design rules and discusses how they should be used. We start top-down, with the highest level requirement to sustain human beings in a closed environment off Earth. We consider the crew needs for air, water, and food. We then discuss atmosphere leakage and recycling losses. The needs to support the crew and to make up losses define the fundamental life support system requirements. We consider the trade-offs between resupplying and recycling oxygen, water, and food. The specific choices between resupply and recycling are determined by mission duration, presence of in-situ resources, etc., and are defining parameters of life support system design.
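A back-of-the-envelope calculation of the kind the paper describes can be sketched as follows. The per-person daily rates below are illustrative assumptions for the sketch, not values taken from the paper:

```python
# Illustrative steady-state resupply calculation. The per-person daily
# rates are assumptions for the sketch, not figures from the paper.
O2_KG_PER_DAY = 0.84    # oxygen consumed per crew member per day
H2O_KG_PER_DAY = 3.5    # potable water per crew member per day
FOOD_KG_PER_DAY = 0.62  # dry food per crew member per day

def resupply_mass(crew, days, water_recycling_efficiency=0.0):
    """Total consumable mass to launch, assuming water is the only
    recycled resource and losses are captured by the efficiency factor."""
    water = crew * days * H2O_KG_PER_DAY * (1.0 - water_recycling_efficiency)
    return crew * days * (O2_KG_PER_DAY + FOOD_KG_PER_DAY) + water
```

For a long mission the water term dominates, which is one quick way to see the design rule that longer missions favour recycling and regenerable technologies.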
Modular assembly of optical nanocircuits.
Shi, Jinwei; Monticone, Francesco; Elias, Sarah; Wu, Yanwen; Ratchford, Daniel; Li, Xiaoqin; Alù, Andrea
2014-05-29
A key element enabling the microelectronic technology advances of the past decades has been the conceptualization of complex circuits with versatile functionalities as being composed of the proper combination of basic 'lumped' circuit elements (for example, inductors and capacitors). In contrast, modern nanophotonic systems are still far from a similar level of sophistication, partially because of the lack of modularization of their response in terms of basic building blocks. Here we demonstrate the design, assembly and characterization of relatively complex photonic nanocircuits by accurately positioning a number of metallic and dielectric nanoparticles acting as modular lumped elements. The nanoparticle clusters produce the desired spectral response described by simple circuit rules and are shown to be dynamically reconfigurable by modifying the direction or polarization of impinging signals. Our work represents an important step towards extending the powerful modular design tools of electronic circuits into nanophotonic systems.
Designing Light Beam Transmittance Measuring Tool Using a Laser Pointer
NASA Astrophysics Data System (ADS)
Nuroso, H.; Kurniawan, W.; Marwoto, P.
2016-08-01
A simple instrument for measuring the percentage light transmittance of window film has been developed. The instrument uses laser pointers of 405 nm and 650 nm (±10%) as light sources, and its accuracy approaches 80%. Transmittance is determined by comparing the light beam before and after it passes through the window film. A separate light-intensity measuring unit was eliminated by splitting the source into two beams with a beam splitter. The light beam is converted to resistance by a NORP12 LDR sensor in a voltage-divider circuit governed by Kirchhoff's laws, so the light intensity received by the sensor produces a proportional voltage. This voltage is then presented on the computer screen as a real-time graph via a USB 2.0 data transfer.
Generating self-organizing collective behavior using separation dynamics from experimental data
NASA Astrophysics Data System (ADS)
Dieck Kattas, Graciano; Xu, Xiao-Ke; Small, Michael
2012-09-01
Mathematical models for systems of interacting agents using simple local rules have been proposed and shown to exhibit emergent swarming behavior. Most of these models are constructed by intuition or manual observations of real phenomena, and later tuned or verified to simulate desired dynamics. In contrast to this approach, we propose using a model that attempts to follow an averaged rule of the essential distance-dependent collective behavior of real pigeon flocks, which was abstracted from experimental data. By using a simple model to follow the behavioral tendencies of real data, we show that our model can exhibit a wide range of emergent self-organizing dynamics such as flocking, pattern formation, and counter-rotating vortices.
A Methodology for Multihazards Load Combinations of Earthquake and Heavy Trucks for Bridges
Wang, Xu; Sun, Baitao
2014-01-01
Issues of load combinations of earthquakes and heavy trucks are important in multihazard bridge design. Current load and resistance factor design (LRFD) specifications usually treat extreme hazards alone and have no probabilistic basis for extreme load combinations. Earthquake load and heavy truck load are random processes with distinct characteristics, and the maximum combined load is not the simple superposition of their individual maxima. The traditional Ferry Borges-Castanheta model, which considers load duration and occurrence probability, describes well how random processes are converted to random variables and combined, but it imposes strict constraints on time-interval selection to obtain precise results. Turkstra's rule considers one load reaching its maximum value in the bridge's service life combined with another load at its instantaneous value (or mean value), which appears more rational, but the results are generally unconservative. Therefore, a modified model is presented here that combines the advantages of the Ferry Borges-Castanheta model and Turkstra's rule. The modified model is based on conditional probability, which allows random processes to be converted to random variables relatively easily and accounts for the nonmaximum factor in load combinations. Earthquake load and heavy truck load combinations are employed to illustrate the model. Finally, the results of a numerical simulation are used to verify the feasibility and rationality of the model. PMID:24883347
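Turkstra's rule as summarized above (combine each load's lifetime maximum with the companion, point-in-time, values of the other loads, and keep the largest total) can be sketched as:

```python
def turkstra_combination(max_values, companion_values):
    """Turkstra's rule: for each load in turn, add its lifetime maximum
    to the companion (point-in-time or mean) values of all other loads,
    and keep the largest of these totals as the design combination."""
    n = len(max_values)
    totals = [max_values[k] + sum(companion_values[j]
                                  for j in range(n) if j != k)
              for k in range(n)]
    return max(totals)
```

Because the true combined maximum can exceed every such total, the result can be unconservative, which is the weakness the modified model addresses.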
A Hybrid Genetic Programming Algorithm for Automated Design of Dispatching Rules.
Nguyen, Su; Mei, Yi; Xue, Bing; Zhang, Mengjie
2018-06-04
Designing effective dispatching rules for production systems is a difficult and time-consuming task if done manually. In the last decade, the growth of computing power, advanced machine learning, and optimisation techniques has made the automated design of dispatching rules possible, and automatically discovered rules are competitive with or outperform existing rules developed by researchers. Genetic programming is one of the most popular approaches to discovering dispatching rules in the literature, especially for complex production systems. However, the large heuristic search space may restrict genetic programming from finding near-optimal dispatching rules. This paper develops a new hybrid genetic programming algorithm for dynamic job shop scheduling based on a new representation, a new local search heuristic, and efficient fitness evaluators. Experiments show that the new method is effective regarding the quality of evolved rules. Moreover, evolved rules are also significantly smaller and contain more relevant attributes.
Simple Identification of Complex ADHD Subtypes Using Current Symptom Counts
ERIC Educational Resources Information Center
Volk, Heather E.; Todorov, Alexandre A.; Hay, David A.; Todd, Richard D.
2009-01-01
The results of an assessment of the accuracy of simple symptom-count rules for assigning youths to attention deficit hyperactivity disorder subtypes show that having six or more total symptoms and fewer than three hyperactive-impulsive symptoms is an accurate predictor of the latent-class severe inattentive subtype.
Optimal sampling strategies for detecting zoonotic disease epidemics.
Ferguson, Jake M; Langebrake, Jessica B; Cannataro, Vincent L; Garcia, Andres J; Hamman, Elizabeth A; Martcheva, Maia; Osenberg, Craig W
2014-06-01
The early detection of disease epidemics reduces the chance of successful introductions into new locales, minimizes the number of infections, and reduces the financial impact. We develop a framework to determine the optimal sampling strategy for disease detection in zoonotic host-vector epidemiological systems when a disease goes from below detectable levels to an epidemic. We find that if the time of disease introduction is known then the optimal sampling strategy can switch abruptly between sampling only from the vector population to sampling only from the host population. We also construct time-independent optimal sampling strategies when conducting periodic sampling that can involve sampling both the host and the vector populations simultaneously. Both time-dependent and -independent solutions can be useful for sampling design, depending on whether the time of introduction of the disease is known or not. We illustrate the approach with West Nile virus, a globally-spreading zoonotic arbovirus. Though our analytical results are based on a linearization of the dynamical systems, the sampling rules appear robust over a wide range of parameter space when compared to nonlinear simulation models. Our results suggest some simple rules that can be used by practitioners when developing surveillance programs. These rules require knowledge of transition rates between epidemiological compartments, which population was initially infected, and of the cost per sample for serological tests.
Hervás, César; Silva, Manuel; Serrano, Juan Manuel; Orejuela, Eva
2004-01-01
The suitability of an approach for extracting heuristic rules from trained artificial neural networks (ANNs) pruned by a regularization method and with architectures designed by evolutionary computation for quantifying highly overlapping chromatographic peaks is demonstrated. The ANN input data are estimated by the Levenberg-Marquardt method in the form of a four-parameter Weibull curve associated with the profile of the chromatographic band. To test this approach, two N-methylcarbamate pesticides, carbofuran and propoxur, were quantified using a classic peroxyoxalate chemiluminescence reaction as a detection system for chromatographic analysis. Straightforward network topologies (one and two outputs models) allow the analytes to be quantified in concentration ratios ranging from 1:7 to 5:1 with an average standard error of prediction for the generalization test of 2.7 and 2.3% for carbofuran and propoxur, respectively. The reduced dimensions of the selected ANN architectures, especially those obtained after using heuristic rules, allowed simple quantification equations to be developed that transform the input variables into output variables. These equations can be easily interpreted from a chemical point of view to attain quantitative analytical information regarding the effect of both analytes on the characteristics of chromatographic bands, namely profile, dispersion, peak height, and residence time. Copyright 2004 American Chemical Society
Meyer, Claas; Reutter, Michaela; Matzdorf, Bettina; Sattler, Claudia; Schomers, Sarah
2015-07-01
In recent years, increasing attention has been paid to financial environmental policy instruments that have played important roles in solving agri-environmental problems throughout the world, particularly in the European Union and the United States. The ample and increasing literature on Payments for Ecosystem Services (PES) and agri-environmental measures (AEMs), generally understood as governmental PES, shows that certain single design rules may have an impact on the success of a particular measure. Based on this research, we focused on the interplay of several design rules and conducted a comparative analysis of AEMs' institutional arrangements by examining 49 German cases. We analyzed the effects of the design rules and certain rule combinations on the success of AEMs. Compliance and noncompliance with the hypothesized design rules and the success of the AEMs were surveyed by questioning the responsible agricultural administration and the AEMs' mid-term evaluators. The different rules were evaluated in regard to their necessity and sufficiency for success using Qualitative Comparative Analysis (QCA). Our results show that combinations of certain design rules such as environmental goal targeting and area targeting conditioned the success of the AEMs. Hence, we generalize design principles for AEMs and discuss implications for the general advancement of ecosystem services and the PES approach in agri-environmental policies. Moreover, we highlight the relevance of the results for governmental PES program research and design worldwide. Copyright © 2015 Elsevier Ltd. All rights reserved.
Wide-field-of-view millimeter-wave telescope design with ultra-low cross-polarization
NASA Astrophysics Data System (ADS)
Bernacki, Bruce E.; Kelly, James F.; Sheen, David; Hatchell, Brian; Valdez, Patrick; Tedeschi, Jonathan; Hall, Thomas; McMakin, Douglas
2012-06-01
As millimeter-wave arrays become available, off-axis imaging performance of the fore optics increases in importance due to the relatively large physical extent of the arrays. Typically, simple optical telescope designs are adapted to millimeter-wave imaging but single-mirror spherical or classic conic designs cannot deliver adequate image quality except near the optical axis. Since millimeter-wave designs are quasi-optical, optical ray tracing and commercial design software can be used to optimize designs to improve off-axis imaging as well as minimize cross-polarization. Methods that obey the Dragone-Mizuguchi condition for the design of reflective millimeter-wave telescopes with low cross-polarization also provide additional degrees of freedom that offer larger fields of view than possible with single-reflector designs. Dragone's graphical design method does not lend itself readily to computer-based optical design approaches, but subsequent authors expanded on Dragone's geometric design approach with analytic expressions that describe the location, shape, off-axis height and tilt of the telescope elements that satisfy Dragone's design rules and can be used as a first-order design for subsequent computer-based design and optimization. We investigate two design variants that obey the Dragone-Mizuguchi conditions that exhibit ultra-low cross-polarization and a large diffraction-limited field of view well suited to millimeter-wave imaging arrays.
NASA Astrophysics Data System (ADS)
Gabor, Allen H.; Brendler, Andrew C.; Brunner, Timothy A.; Chen, Xuemei; Culp, James A.; Levinson, Harry J.
2018-03-01
The relationship between edge placement error, semiconductor design-rule determination, and predicted yield in the era of EUV lithography is examined. This paper starts with the basics of edge placement error and then builds up to design-rule calculations. We show that edge placement error (EPE) definitions can be used as the building blocks for design-rule equations, but that in the last several years the term "EPE" has been used in the literature to refer to many patterning errors that are not EPE. We then explore the concept of "Good Fields"1 and use it to predict the n-sigma value needed for design-rule determination. Specifically, fundamental yield calculations based on the failure opportunities per chip are used to determine at what n-sigma "value" design rules need to be tested to ensure high yield. The "value" can be a space between two features, an intersect area between two features, a minimum area of a feature, etc. It is shown that across-chip variation of design-rule-critical values needs to be tested at sigma values between seven and eight, which is much higher than the four-sigma values traditionally used for design-rule determination. After recommending new statistics for design-rule calculations, the paper examines the impact of EUV lithography on the sources of variation important for design-rule calculations. We show that stochastics can be treated as an effective dose variation that is fully sampled across every chip. Combining the increased within-chip variation from EUV with the understanding that across-chip variation of design-rule-critical values must not cause yield loss at significantly higher sigma values than have traditionally been examined, we conclude that across-wafer, wafer-to-wafer and lot-to-lot variation will have to overscale for any technology introducing EUV lithography where stochastic noise is a significant fraction of the effective dose variation.
We will emphasize stochastic effects on edge placement error distributions and appropriate design-rule setting. While CD distributions with long tails coming from stochastic effects do bring increased risk of failure (especially on chips that may have over a billion failure opportunities per layer) there are other sources of variation that have sharp cutoffs, i.e. have no tails. We will review these sources and show how distributions with different skew and kurtosis values combine.
Timmerman, Dirk; Van Calster, Ben; Testa, Antonia; Savelli, Luca; Fischerova, Daniela; Froyman, Wouter; Wynants, Laure; Van Holsbeke, Caroline; Epstein, Elisabeth; Franchi, Dorella; Kaijser, Jeroen; Czekierdowski, Artur; Guerriero, Stefano; Fruscio, Robert; Leone, Francesco P G; Rossi, Alberto; Landolfo, Chiara; Vergote, Ignace; Bourne, Tom; Valentin, Lil
2016-04-01
Accurate methods to preoperatively characterize adnexal tumors are pivotal for optimal patient management. A recent meta-analysis concluded that the International Ovarian Tumor Analysis algorithms such as the Simple Rules are the best approaches to preoperatively classify adnexal masses as benign or malignant. We sought to develop and validate a model to predict the risk of malignancy in adnexal masses using the ultrasound features in the Simple Rules. This was an international cross-sectional cohort study involving 22 oncology centers, referral centers for ultrasonography, and general hospitals. We included consecutive patients with an adnexal tumor who underwent a standardized transvaginal ultrasound examination and were selected for surgery. Data on 5020 patients were recorded in 3 phases from 2002 through 2012. The 5 Simple Rules features indicative of a benign tumor (B-features) and the 5 features indicative of malignancy (M-features) are based on the presence of ascites, tumor morphology, and degree of vascularity at ultrasonography. The gold standard was the histopathologic diagnosis of the adnexal mass (pathologist blinded to ultrasound findings). Logistic regression analysis was used to estimate the risk of malignancy based on the 10 ultrasound features and type of center. The diagnostic performance was evaluated by area under the receiver operating characteristic curve, sensitivity, specificity, positive likelihood ratio (LR+), negative likelihood ratio (LR-), positive predictive value (PPV), negative predictive value (NPV), and calibration curves. Data on 4848 patients were analyzed. The malignancy rate was 43% (1402/3263) in oncology centers and 17% (263/1585) in other centers. The area under the receiver operating characteristic curve on validation data was very similar in oncology centers (0.917; 95% confidence interval, 0.901-0.931) and other centers (0.916; 95% confidence interval, 0.873-0.945). Risk estimates showed good calibration.
In all, 23% of patients in the validation data set had a very low estimated risk (<1%) and 48% had a high estimated risk (≥30%). For the 1% risk cutoff, sensitivity was 99.7%, specificity 33.7%, LR+ 1.5, LR- 0.010, PPV 44.8%, and NPV 98.9%. For the 30% risk cutoff, sensitivity was 89.0%, specificity 84.7%, LR+ 5.8, LR- 0.13, PPV 75.4%, and NPV 93.9%. Quantification of the risk of malignancy based on the Simple Rules has good diagnostic performance both in oncology centers and other centers. A simple classification based on these risk estimates may form the basis of a clinical management system. Patients with a high risk may benefit from surgery by a gynecological oncologist, while patients with a lower risk may be managed locally. Copyright © 2016 Elsevier Inc. All rights reserved.
The Game of Life Rules on Penrose Tilings: Still Life and Oscillators
NASA Astrophysics Data System (ADS)
Owens, Nick; Stepney, Susan
John Horton Conway's Game of Life is a simple two-dimensional, two state cellular automaton (CA), remarkable for its complex behaviour. That behaviour is known to be very sensitive to a change in the CA rules. Here we continue our investigations into its sensitivity to changes in the lattice, by the use of an aperiodic Penrose tiling lattice.
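For reference, the standard square-lattice rules that the paper transplants onto a Penrose tiling can be sketched as a set-based toy implementation, assuming the usual birth-on-3, survive-on-2-or-3 totalistic rule:

```python
from collections import Counter

def life_step(live):
    """One synchronous update of Conway's Game of Life on the standard
    square lattice (the baseline the paper carries over to a Penrose
    tiling). `live` is a set of (x, y) coordinates of live cells."""
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # birth on exactly 3 live neighbours, survival on 2 or 3
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}
```

On a Penrose tiling the neighbourhood sizes vary from cell to cell, which is exactly the sensitivity to the lattice that the paper investigates; here the neighbourhood is the fixed Moore neighbourhood.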
Working dogs cooperate among one another by generalised reciprocity.
Gfrerer, Nastassja; Taborsky, Michael
2017-03-06
Cooperation by generalised reciprocity implies that individuals apply the decision rule "help anyone if helped by someone". This mechanism has been shown to generate evolutionarily stable levels of cooperation, but as yet it is unclear how widely this cooperation mechanism is applied among animals. Dogs (Canis familiaris) are highly social animals with considerable cognitive potential and the ability to differentiate between individual social partners. But although dogs can solve complex problems, they may use simple rules for behavioural decisions. Here we show that dogs trained in an instrumental cooperative task to provide food to a social partner help conspecifics more often after receiving help from a dog before. Remarkably, in so doing they show no distinction between partners that had helped them before and completely unfamiliar conspecifics. Apparently, dogs use the simple decision rule characterizing generalised reciprocity, although they are probably capable of using the more complex decision rule of direct reciprocity: "help someone who has helped you". However, generalized reciprocity involves lower information processing costs and is therefore a cheaper cooperation strategy. Our results imply that generalised reciprocity might be applied more commonly than direct reciprocity also in other mutually cooperating animals.
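The two decision rules contrasted in the abstract can be captured in a few lines. This is a schematic sketch of the logic only, not of the experimental procedure:

```python
def generalised_reciprocity(was_helped_by_anyone, partner=None):
    """'Help anyone if helped by someone': the partner's identity is
    deliberately ignored, so no individual bookkeeping is needed."""
    return was_helped_by_anyone

def direct_reciprocity(past_helpers, partner):
    """'Help someone who has helped you': requires remembering exactly
    who helped, the costlier information processing the abstract notes."""
    return partner in past_helpers
```

The contrast in information cost is visible directly: the first rule needs a single bit of state, the second a set of individual identities.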
Working dogs cooperate among one another by generalised reciprocity
Gfrerer, Nastassja; Taborsky, Michael
2017-01-01
Cooperation by generalised reciprocity implies that individuals apply the decision rule “help anyone if helped by someone”. This mechanism has been shown to generate evolutionarily stable levels of cooperation, but as yet it is unclear how widely this cooperation mechanism is applied among animals. Dogs (Canis familiaris) are highly social animals with considerable cognitive potential and the ability to differentiate between individual social partners. But although dogs can solve complex problems, they may use simple rules for behavioural decisions. Here we show that dogs trained in an instrumental cooperative task to provide food to a social partner help conspecifics more often after receiving help from a dog before. Remarkably, in so doing they show no distinction between partners that had helped them before and completely unfamiliar conspecifics. Apparently, dogs use the simple decision rule characterizing generalised reciprocity, although they are probably capable of using the more complex decision rule of direct reciprocity: “help someone who has helped you”. However, generalized reciprocity involves lower information processing costs and is therefore a cheaper cooperation strategy. Our results imply that generalised reciprocity might be applied more commonly than direct reciprocity also in other mutually cooperating animals. PMID:28262722
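The two decision rules contrasted in the abstract differ only in the bookkeeping they require; a hypothetical sketch (function names are ours, not the authors'):

```python
def generalised_reciprocity(received_help_last_round, partner=None):
    """'Help anyone if helped by someone': partner identity is ignored,
    so the only state needed is one bit (was I helped recently?)."""
    return received_help_last_round

def direct_reciprocity(past_helpers, partner):
    """'Help someone who has helped you': requires remembering the
    identity of every individual that has helped before."""
    return partner in past_helpers

# A dog helped by dog "A", then paired with unfamiliar dog "B":
helps_stranger_generalised = generalised_reciprocity(True, partner="B")  # helps
helps_stranger_direct = direct_reciprocity({"A"}, partner="B")           # refuses
```

The one-bit rule is the "cheaper cooperation strategy" of the abstract: its information-processing cost does not grow with the number of social partners.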
He, ZeFang; Zhao, Long
2014-01-01
An attitude control strategy based on Ziegler-Nichols rules for tuning the PD (proportional-derivative) parameters of quadrotor helicopters is presented to address the quadrotor's tendency to become unstable, a problem caused by the narrow definition domain of the quadrotor's attitude angles. The proposed controller is nonlinear, consisting of a linear part and a nonlinear part. The linear part is a PD controller, with parameters tuned by Ziegler-Nichols rules, acting on the quadrotor's decoupled linear system after feedback linearization; the nonlinear part is the feedback-linearization term that converts the nonlinear system into a linear one. Simulation results show that the proposed attitude controller is highly robust and outperforms two other nonlinear controllers whose nonlinear parts are identical to that of the proposed controller, but whose linear parts are, respectively, a PID (proportional-integral-derivative) controller tuned by Ziegler-Nichols rules and a PD controller tuned by GA (genetic algorithms). Moreover, the proposed attitude controller is simple and easy to implement.
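The classic closed-loop Ziegler-Nichols table maps the ultimate gain Ku and ultimate oscillation period Tu to controller gains; a sketch of the textbook rules (the paper's exact tuning procedure may differ):

```python
def ziegler_nichols(ku, tu, kind="pd"):
    """Classic closed-loop Ziegler-Nichols tuning.

    ku: ultimate gain (proportional gain at sustained oscillation)
    tu: ultimate period of that oscillation
    Returns (Kp, Ki, Kd) for the chosen controller type, using the
    textbook table (P: 0.5Ku; PI: 0.45Ku, Ti=Tu/1.2; PD: 0.8Ku, Td=Tu/8;
    PID: 0.6Ku, Ti=Tu/2, Td=Tu/8), with Ki = Kp/Ti and Kd = Kp*Td.
    """
    rules = {
        "p":   (0.5 * ku,  0.0,             0.0),
        "pi":  (0.45 * ku, 0.54 * ku / tu,  0.0),
        "pd":  (0.8 * ku,  0.0,             0.1 * ku * tu),
        "pid": (0.6 * ku,  1.2 * ku / tu,   0.075 * ku * tu),
    }
    return rules[kind]
```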
Clairvoyant fusion: a new methodology for designing robust detection algorithms
NASA Astrophysics Data System (ADS)
Schaum, Alan
2016-10-01
Many realistic detection problems cannot be solved with simple statistical tests for known alternative probability models. Uncontrollable environmental conditions, imperfect sensors, and other uncertainties transform simple detection problems with likelihood ratio solutions into composite hypothesis (CH) testing problems. Recently many multi- and hyperspectral sensing CH problems have been addressed with a new approach. Clairvoyant fusion (CF) integrates the optimal detectors ("clairvoyants") associated with every unspecified value of the parameters appearing in a detection model. For problems with discrete parameter values, logical rules emerge for combining the decisions of the associated clairvoyants. For many problems with continuous parameters, analytic methods of CF have been found that produce closed-form solutions, or approximations for intractable problems. Here the principles of CF are reviewed and mathematical insights are described that have proven useful in the derivation of solutions. It is also shown how a second-stage fusion procedure can be used to create theoretically superior detection algorithms for ALL discrete parameter problems.
A Rule Based Approach to ISS Interior Volume Control and Layout
NASA Technical Reports Server (NTRS)
Peacock, Brian; Maida, Jim; Fitts, David; Dory, Jonathan
2001-01-01
Traditional human factors design involves the development of human factors requirements based on a desire to accommodate a certain percentage of the intended user population. As the product is developed, human factors evaluation involves comparison between the resulting design and the specifications. Sometimes performance metrics are involved that allow leniency in the design requirements, given that the human performance result is satisfactory. Clearly such approaches may work, but they give rise to uncertainty and negotiation. An alternative approach is to adopt human factors design rules that articulate a range of each design continuum over which there are varying outcome expectations and interactions with other variables, including time. These rules are based on a consensus of human factors specialists, designers, managers and customers. The International Space Station faces exactly this challenge in interior volume control, which is based on anthropometric, performance and subjective preference criteria. This paper describes the traditional approach and then proposes a rule-based alternative. The proposed rules involve spatial, temporal and importance dimensions. If successful, this rule-based concept could be applied to many traditional human factors design variables and could lead to a more effective and efficient contribution of human factors input to the design process.
Brighton, Caroline H.; Thomas, Adrian L. R.; Taylor, Graham K.
2017-12-19
The ability to intercept uncooperative targets is key to many diverse flight behaviors, from courtship to predation. Previous research has looked for simple geometric rules describing the attack trajectories of animals, but the underlying feedback laws have remained obscure. Here, we use GPS loggers and onboard video cameras to study peregrine falcons, Falco peregrinus, attacking stationary targets, maneuvering targets, and live prey. We show that the terminal attack trajectories of peregrines are not described by any simple geometric rule as previously claimed, and instead use system identification techniques to fit a phenomenological model of the dynamical system generating the observed trajectories. We find that these trajectories are best—and exceedingly well—modeled by the proportional navigation (PN) guidance law used by most guided missiles. Under this guidance law, turning is commanded at a rate proportional to the angular rate of the line-of-sight between the attacker and its target, with a constant of proportionality (i.e., feedback gain) called the navigation constant (N). Whereas most guided missiles use navigation constants falling on the interval 3 ≤ N ≤ 5, peregrine attack trajectories are best fitted by lower navigation constants (median N < 3). This lower feedback gain is appropriate at the lower flight speed of a biological system, given its presumably higher error and longer delay. This same guidance law could find use in small visually guided drones designed to remove other drones from protected airspace. Copyright © 2017 the Author(s). Published by PNAS. PMID:29203660
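The proportional navigation law itself is compact: the commanded turn rate is N times the line-of-sight rate. A minimal planar sketch (our own illustration, not the authors' system-identification code):

```python
def pn_turn_rate(attacker, target, attacker_v, target_v, N=3.0):
    """Proportional navigation: commanded turn rate = N * line-of-sight rate.

    Positions and velocities are 2-D tuples. For planar motion the rate of
    the line-of-sight angle is cross(r, v_rel) / |r|^2, where r is the
    relative position and v_rel the relative velocity. Returns rad/s.
    """
    rx, ry = target[0] - attacker[0], target[1] - attacker[1]
    vx, vy = target_v[0] - attacker_v[0], target_v[1] - attacker_v[1]
    los_rate = (rx * vy - ry * vx) / (rx * rx + ry * ry)
    return N * los_rate
```

With a stationary target dead ahead the line-of-sight does not rotate, so no turn is commanded; a target crossing the line-of-sight produces a turn proportional to N, which is how a lower feedback gain (median N < 3 for the falcons) translates into gentler commanded turns.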
Perry, Jeffrey J; Stiell, Ian G
2006-12-01
Traumatic injuries to the ankle/foot, knee, cervical spine, and head are very commonly seen in emergency and accident departments around the world. There has been much interest in the development of clinical decision rules to help guide the investigation of these patients in a standardised and cost-effective manner. In this article we reviewed the impact of the Ottawa ankle rules, the Ottawa knee rules, the Canadian C-spine rule and the Canadian CT head rule. The studies conducted have confirmed that the use of well-developed clinical decision rules results in less radiography and less time spent in the emergency department, without decreasing patient satisfaction or resulting in misdiagnosis. Emergency physicians around the world should adopt clinical decision rules for ankle/foot, knee, cervical spine and minor head injuries. With relatively simple implementation strategies, care can be standardised and costs reduced while providing excellent clinical care.
NASA Astrophysics Data System (ADS)
Hoebelheinrich, N. J.; Lynnes, C.; West, P.; Ferritto, M.
2014-12-01
Two problems common to many geoscience domains are the difficulties in finding tools to work with a given dataset collection, and conversely, the difficulties in finding data for a known tool. A collaborative team from the Earth Science Information Partnership (ESIP) has come together to design and create a web service, called ToolMatch, to address these problems. The team began their efforts by defining an initial, relatively simple conceptual model that addressed the two use cases briefly described above. The conceptual model is expressed as an ontology using OWL (Web Ontology Language) and DCterms (Dublin Core Terms), and utilizing standard ontologies such as DOAP (Description of a Project), FOAF (Friend of a Friend), SKOS (Simple Knowledge Organization System) and DCAT (Data Catalog Vocabulary). The ToolMatch service will be taking advantage of various Semantic Web and Web standards, such as OpenSearch, RESTful web services, SWRL (Semantic Web Rule Language) and SPARQL (Simple Protocol and RDF Query Language). The first version of the ToolMatch service was deployed in early fall 2014. While more complete testing is required, a number of communities besides ESIP member organizations have expressed interest in collaborating to create, test and use the service and incorporate it into their own web pages, tools, and/or services, including the USGS Data Catalog service, DataONE, the Deep Carbon Observatory, Virtual Solar Terrestrial Observatory (VSTO), and the U.S. Global Change Research Program. In this session, presenters will discuss the inception and development of the ToolMatch service, the collaborative process used to design, refine, and test the service, and future plans for the service.
Symmetry rules for the indirect nuclear spin-spin coupling tensor revisited
NASA Astrophysics Data System (ADS)
Buckingham, A. D.; Pyykkö, P.; Robert, J. B.; Wiesenfeld, L.
The symmetry rules of Buckingham and Love (1970), relating the number of independent components of the indirect spin-spin coupling tensor J to the symmetry of the nuclear sites, are shown to require modification if the two nuclei are exchanged by a symmetry operation. In that case, the anti-symmetric part of J does not transform as a second-rank polar tensor under symmetry operations that interchange the coupled nuclei and may be called an anti-tensor. New rules are derived and illustrated by simple molecular models.
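For context, the three irreducible parts of a general second-rank coupling tensor, whose independent components the symmetry rules count (this is the standard decomposition, not the modified rules derived in the paper):

```latex
J_{\alpha\beta}
 = \underbrace{J\,\delta_{\alpha\beta}}_{\text{isotropic}}
 + \underbrace{\tfrac{1}{2}\!\left(J_{\alpha\beta}+J_{\beta\alpha}\right)-J\,\delta_{\alpha\beta}}_{\text{symmetric, traceless}}
 + \underbrace{\tfrac{1}{2}\!\left(J_{\alpha\beta}-J_{\beta\alpha}\right)}_{\text{antisymmetric}},
\qquad J=\tfrac{1}{3}J_{\alpha\alpha}.
```

It is the last, antisymmetric part that fails to transform as a second-rank polar tensor under operations exchanging the coupled nuclei, hence the "anti-tensor" label.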
A new simple ∞OH neuron model as a biologically plausible principal component analyzer.
Jankovic, M V
2003-01-01
A new approach to unsupervised learning in a single-layer neural network is discussed. An algorithm for unsupervised learning based upon the Hebbian learning rule is presented. A simple neuron model is analyzed. A dynamic neural model, which contains both feed-forward and feedback connections between the input and the output, has been adopted. The proposed learning algorithm could be more correctly named self-supervised rather than unsupervised. The solution proposed here is a modified Hebbian rule, in which the modification of the synaptic strength is proportional not to pre- and postsynaptic activity, but instead to the presynaptic and averaged value of postsynaptic activity. It is shown that the model neuron tends to extract the principal component from a stationary input vector sequence. Usually accepted additional decaying terms for the stabilization of the original Hebbian rule are avoided. Implementation of the basic Hebbian scheme would not lead to unrealistic growth of the synaptic strengths, thanks to the adopted network structure.
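The stabilization idea described here is closely related to Oja's rule, whose averaged (covariance) form makes the convergence to a unit principal eigenvector easy to see. A sketch of that standard rule, as an illustration rather than the paper's exact ∞OH model:

```python
def oja_expected(C, steps=2000, eta=0.05):
    """Averaged (expectation) form of Oja's rule on a 2x2 covariance matrix C:

        w <- w + eta * (C w - (w' C w) w)

    The quadratic term bounds |w| without any extra decay term, and the
    fixed points are unit eigenvectors of C; iteration converges to the
    principal one.
    """
    w = [1.0, 0.3]  # arbitrary non-degenerate start
    for _ in range(steps):
        cw = [C[0][0] * w[0] + C[0][1] * w[1],
              C[1][0] * w[0] + C[1][1] * w[1]]
        wcw = w[0] * cw[0] + w[1] * cw[1]
        w = [w[0] + eta * (cw[0] - wcw * w[0]),
             w[1] + eta * (cw[1] - wcw * w[1])]
    return w

# Principal eigenvector of [[3,1],[1,2]] is (1, 0.618...)/norm.
w = oja_expected([[3.0, 1.0], [1.0, 2.0]])
```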
Reinforcement Learning in a Nonstationary Environment: The El Farol Problem
NASA Technical Reports Server (NTRS)
Bell, Ann Maria
1999-01-01
This paper examines the performance of simple learning rules in a complex adaptive system based on a coordination problem modeled on the El Farol problem. The key features of the El Farol problem are that it typically involves a medium number of agents and that agents' pay-off functions have a discontinuous response to increased congestion. First we consider a single adaptive agent facing a stationary environment. We demonstrate that the simple learning rules proposed by Roth and Erev can be extremely sensitive to small changes in the initial conditions and that events early in a simulation can affect the performance of the rule over a relatively long time horizon. In contrast, a reinforcement learning rule based on standard practice in the computer science literature converges rapidly and robustly. The situation is reversed when multiple adaptive agents interact: the RE algorithms often converge rapidly to a stable average aggregate attendance despite the slow and erratic behavior of individual learners, while the CS-based learners frequently over-attend in the early and intermediate terms. The symmetric mixed-strategy equilibrium is unstable: all three learning rules ultimately tend towards pure strategies or stabilize in the medium term at non-equilibrium probabilities of attendance. The brittleness of the algorithms in different contexts emphasizes the importance of thorough and thoughtful examination of simulation-based results.
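In its basic form, the Roth-Erev rule keeps one propensity per action, adds each received payoff to the chosen action's propensity, and selects actions with probability proportional to propensity. A minimal sketch (the paper evaluates variants and parameterizations beyond this):

```python
import random

class RothErev:
    """Basic Roth-Erev reinforcement learner: propensities grow by received
    payoffs, and actions are chosen with probability proportional to them."""

    def __init__(self, n_actions, initial=1.0, rng=None):
        self.q = [initial] * n_actions   # initial propensities set sensitivity
        self.rng = rng or random.Random(0)

    def choose(self):
        # Roulette-wheel selection proportional to propensities.
        r = self.rng.uniform(0.0, sum(self.q))
        acc = 0.0
        for action, qa in enumerate(self.q):
            acc += qa
            if r <= acc:
                return action
        return len(self.q) - 1

    def update(self, action, payoff):
        self.q[action] += payoff
```

The sensitivity to early events noted in the abstract is visible here: payoffs received in the first few rounds permanently tilt the propensities that all later choice probabilities are built on.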
Phonological Concept Learning.
Moreton, Elliott; Pater, Joe; Pertsova, Katya
2017-01-01
Linguistic and non-linguistic pattern learning have been studied separately, but we argue for a comparative approach. Analogous inductive problems arise in phonological and visual pattern learning. Evidence from three experiments shows that human learners can solve them in analogous ways, and that human performance in both cases can be captured by the same models. We test GMECCS (Gradual Maximum Entropy with a Conjunctive Constraint Schema), an implementation of the Configural Cue Model (Gluck & Bower) in a Maximum Entropy phonotactic-learning framework (Goldwater & Johnson; Hayes & Wilson) with a single free parameter, against the alternative hypothesis that learners seek featurally simple algebraic rules ("rule-seeking"). We study the full typology of patterns introduced by Shepard, Hovland, and Jenkins ("SHJ"), instantiated as both phonotactic patterns and visual analogs, using unsupervised training. Unlike SHJ, Experiments 1 and 2 found that both phonotactic and visual patterns that depended on fewer features could be more difficult than those that depended on more features, as predicted by GMECCS but not by rule-seeking. GMECCS also correctly predicted performance differences between stimulus subclasses within each pattern. A third experiment tried supervised training (which can facilitate rule-seeking in visual learning) to elicit simple rule-seeking phonotactic learning, but cue-based behavior persisted. We conclude that similar cue-based cognitive processes are available for phonological and visual concept learning, and hence that studying either kind of learning can lead to significant insights about the other. Copyright © 2015 Cognitive Science Society, Inc.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-16
...-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Designation of Longer Period for Commission Action on Proceedings to Determine Whether to Approve or Disapprove Proposed Rule Change To Establish... proposed rule change to establish various ``Benchmark Orders'' under NASDAQ Rule 4751(f). The proposed rule...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-29
...-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Designation of Longer Period for Commission Action on Proceedings To Determine Whether To Approve or Disapprove Proposed Rule Change To Amend Rule...,\\2\\ a proposed rule change to amend Exchange Rule 4626--Limitation of Liability (``accommodation...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-01
... the proposal. While recognizing the interest of stockholders in simple majority voting to amend these... publishing this notice to solicit comments on the proposed rule change from interested persons. \\1\\ 15 U.S.C... would not oppose a change to a simple majority provision for certain of the provisions currently subject...
Using High Speed Smartphone Cameras and Video Analysis Techniques to Teach Mechanical Wave Physics
ERIC Educational Resources Information Center
Bonato, Jacopo; Gratton, Luigi M.; Onorato, Pasquale; Oss, Stefano
2017-01-01
We propose the use of smartphone-based slow-motion video analysis techniques as a valuable tool for investigating physics concepts ruling mechanical wave propagation. The simple experimental activities presented here, suitable for both high school and undergraduate students, allow one to measure, in a simple yet rigorous way, the speed of pulses…
When Practice Doesn't Lead to Retrieval: An Analysis of Children's Errors with Simple Addition
ERIC Educational Resources Information Center
de Villiers, Celéste; Hopkins, Sarah
2013-01-01
Counting strategies initially used by young children to perform simple addition are often replaced by more efficient counting strategies, decomposition strategies and rule-based strategies until most answers are encoded in memory and can be directly retrieved. Practice is thought to be the key to developing fluent retrieval of addition facts. This…
The ABLe change framework: a conceptual and methodological tool for promoting systems change.
Foster-Fishman, Pennie G; Watson, Erin R
2012-06-01
This paper presents a new approach to the design and implementation of community change efforts like a System of Care. Called the ABLe Change Framework, the model provides simultaneous attention to the content and process of the work, ensuring effective implementation and the pursuit of systems change. Three key strategies are employed in this model to ensure the integration of content and process efforts and effective mobilization of broad scale systems change: Systemic Action Learning Teams, Simple Rules, and Small Wins. In this paper we describe the ABLe Change Framework and present a case study in which we successfully applied this approach to one system of care effort in Michigan.
Cognitive ergonomics of operational tools
NASA Astrophysics Data System (ADS)
Lüdeke, A.
2012-10-01
Control systems have become increasingly powerful over the past decades. The availability of high data throughput and sophisticated graphical interactions has opened a variety of new possibilities. But has this helped to provide intuitive, easy-to-use applications to simplify the operation of modern large-scale accelerator facilities? We will discuss what makes an application useful to operation and what is necessary to make a tool easy to use. We will show that even the implementation of a small number of simple application design rules can help to create ergonomic operational tools. The author is convinced that such tools do indeed help to achieve higher beam availability and better beam performance at accelerator facilities.
Economics and computer science of a radio spectrum reallocation
Leyton-Brown, Kevin; Segal, Ilya
2017-01-01
The recent “incentive auction” of the US Federal Communications Commission was the first auction to reallocate radio frequencies between two different kinds of uses: from broadcast television to wireless Internet access. The design challenge was not just to choose market rules to govern a fixed set of potential trades but also to determine the broadcasters’ property rights, the goods to be exchanged, the quantities to be traded, the computational procedures, and even some of the performance objectives. An essential and unusual challenge was to make the auction simple enough for human participants while still ensuring that the computations would be tractable and capable of delivering nearly efficient outcomes. PMID:28652335
Ban, Jong-Wook; Emparanza, José Ignacio; Urreta, Iratxe; Burls, Amanda
2016-01-01
Many new clinical prediction rules are derived and validated. But the design and reporting quality of clinical prediction research has been less than optimal. We aimed to assess whether design characteristics of validation studies were associated with overestimation of clinical prediction rules' performance. We also aimed to evaluate whether validation studies clearly reported important methodological characteristics. Electronic databases were searched for systematic reviews of clinical prediction rule studies published between 2006 and 2010. Data were extracted from the eligible validation studies included in the systematic reviews. A meta-analytic meta-epidemiological approach was used to assess the influence of design characteristics on predictive performance. From each validation study, it was assessed whether 7 design and 7 reporting characteristics were properly described. A total of 287 validation studies of clinical prediction rules were collected from 15 systematic reviews (31 meta-analyses). Validation studies using a case-control design produced a summary diagnostic odds ratio (DOR) 2.2 times (95% CI: 1.2-4.3) larger than validation studies using a cohort or unclear design. When differential verification was used, the summary DOR was overestimated by twofold (95% CI: 1.2-3.1) compared to complete, partial and unclear verification. The summary relative DOR (RDOR) of validation studies with inadequate sample size was 1.9 (95% CI: 1.2-3.1) compared to studies with adequate sample size. Study site, reliability, and the clinical prediction rule were adequately described in 10.1%, 9.4%, and 7.0% of validation studies, respectively. Validation studies with design shortcomings may overestimate the performance of clinical prediction rules. The quality of reporting among studies validating clinical prediction rules needs to be improved.
NASA Astrophysics Data System (ADS)
Oses, Corey; Isayev, Olexandr; Toher, Cormac; Curtarolo, Stefano; Tropsha, Alexander
Historically, materials discovery has been driven by a laborious trial-and-error process. The growth of materials databases and emerging informatics approaches finally offer the opportunity to transform this practice into data- and knowledge-driven rational design, accelerating the discovery of novel materials exhibiting desired properties. Using data from the AFLOW repository for high-throughput ab-initio calculations, we have generated Quantitative Materials Structure-Property Relationship (QMSPR) models to predict critical materials properties, including metal/insulator classification, band gap energy, and bulk modulus. The prediction accuracy obtained with these QMSPR models approaches that of the training data for virtually any stoichiometric inorganic crystalline material. We attribute the success and universality of these models to the construction of new materials descriptors, referred to as universal Property-Labeled Material Fragments (PLMF). This representation affords straightforward model interpretation in terms of simple heuristic design rules that could guide rational materials design. This proof-of-concept study demonstrates the power of materials informatics to dramatically accelerate the search for new materials.
Biomarkers of Fatigue: Ranking Mental Fatigue Susceptibility
2010-12-10
expected declines in performance during the 36-hour, 15-minute period of sleep deprivation without caffeine. The simple change from baseline results...rankings for fatigue resistance were then determined via a percent-change rule similar to that used in Chaiken, Harville, Harrison, Fischer, Fisher...and Whitmore (2008). This rule ranks subjects on percent change of cognitive performance from a baseline performance (before fatigue) to a fatigue
Finding the Density of a Liquid Using a Metre Rule
ERIC Educational Resources Information Center
Chattopadhyay, K. N.
2008-01-01
A simple method, which is based on the principle of moment of forces only, is described for the determination of the density of liquids without measuring the mass and volume. At first, an empty test tube and a solid substance, which are hung on each side of a metre rule, are balanced and the moment arm of the test tube is measured. Keeping the…
Numerical calculation of the Fresnel transform.
Kelly, Damien P
2014-04-01
In this paper, we address the problem of calculating Fresnel diffraction integrals using a finite number of uniformly spaced samples. General and simple sampling rules of thumb are derived that allow the user to calculate the distribution for any propagation distance. It is shown how these rules can be extended to fast-Fourier-transform-based algorithms to increase calculation efficiency. A comparison with other theoretical approaches is made.
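For reference, a standard 1-D form of the Fresnel diffraction integral whose uniform-sample discretization such rules govern (our notation, with wavelength λ, propagation distance z, and k = 2π/λ; the paper's conventions may differ):

```latex
U_z(x) = \frac{e^{ikz}}{\sqrt{i\lambda z}}
\int_{-\infty}^{\infty} U_0(\xi)\,
\exp\!\left[\frac{i\pi}{\lambda z}\,(x-\xi)^2\right]\mathrm{d}\xi .
```

The quadratic chirp in the kernel is what couples the admissible sample spacing to λz and the aperture extent: the spacing must resolve the chirp's local fringe period over the sampled window.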
New scheduling rules for a dynamic flexible flow line problem with sequence-dependent setup times
NASA Astrophysics Data System (ADS)
Kia, Hamidreza; Ghodsypour, Seyed Hassan; Davoudpour, Hamid
2017-09-01
In the literature, multi-objective dynamic scheduling problems and simple priority rules are widely studied. Because simple priority rules are often not efficient enough, owing to their simplicity and lack of general insight, composite dispatching rules, which are derived from experimentation, tend to perform considerably better. In this paper, a dynamic flexible flow line problem with sequence-dependent setup times is studied. The objective is to minimize mean flow time and mean tardiness. A 0-1 mixed-integer model of the problem is formulated. Since the problem is NP-hard, four new composite dispatching rules are proposed, derived within a genetic programming framework with suitably chosen operators. Furthermore, a discrete-event simulation model is built to compare the four new heuristic rules against six heuristic rules adapted from the literature. The experimental results show that the composite dispatching rules generated by genetic programming outperform the others in minimizing mean flow time and mean tardiness.
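Simple priority rules amount to sorting jobs by a one-line key; a sketch of two classic single-machine rules and the two objectives used here (illustrative only, not the paper's composite genetic-programming rules):

```python
def schedule_metrics(jobs, key):
    """Sequence jobs on one machine by a priority rule and return
    (mean flow time, mean tardiness). Each job is (processing_time, due_date);
    all jobs are assumed available at time zero."""
    t = 0.0
    flows, tards = [], []
    for p, d in sorted(jobs, key=key):
        t += p
        flows.append(t)               # completion time = flow time here
        tards.append(max(0.0, t - d))
    n = len(jobs)
    return sum(flows) / n, sum(tards) / n

jobs = [(4, 5), (1, 9), (3, 4)]
spt = schedule_metrics(jobs, key=lambda j: j[0])  # shortest processing time
edd = schedule_metrics(jobs, key=lambda j: j[1])  # earliest due date
```

On this toy instance SPT gives the lower mean flow time and EDD the lower mean tardiness, which is exactly the trade-off that motivates composite rules blending several priorities.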
A neural network architecture for implementation of expert systems for real time monitoring
NASA Technical Reports Server (NTRS)
Ramamoorthy, P. A.
1991-01-01
Since neural networks have the advantages of massive parallelism and simple architecture, they are good tools for implementing real time expert systems. In a rule based expert system, the antecedents of rules are in the conjunctive or disjunctive form. We constructed a multilayer feedforward type network in which neurons represent AND or OR operations of rules. Further, we developed a translator which can automatically map a given rule base into the network. Also, we proposed a new and powerful yet flexible architecture that combines the advantages of both fuzzy expert systems and neural networks. This architecture uses the fuzzy logic concepts to separate input data domains into several smaller and overlapped regions. Rule-based expert systems for time critical applications using neural networks, the automated implementation of rule-based expert systems with neural nets, and fuzzy expert systems vs. neural nets are covered.
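Representing conjunctive and disjunctive rule antecedents as neurons takes only a threshold unit; a minimal sketch (helper names are ours, hypothetical):

```python
def neuron(inputs, weights, threshold):
    """McCulloch-Pitts unit: fires iff the weighted sum reaches the threshold."""
    return sum(w * x for w, x in zip(weights, inputs)) >= threshold

def antecedent_and(inputs):
    """Conjunctive antecedent: fires only if every input condition fires."""
    return neuron(inputs, [1] * len(inputs), len(inputs))

def antecedent_or(inputs):
    """Disjunctive antecedent: fires if any input condition fires."""
    return neuron(inputs, [1] * len(inputs), 1)
```

A feedforward layer of such units, one per rule, is the core of the mapping a rule-base translator would automate.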
Continuous punishment and the potential of gentle rule enforcement.
Erev, Ido; Ingram, Paul; Raz, Ornit; Shany, Dror
2010-05-01
The paper explores the conditions that determine the effect of rule enforcement policies that imply an attempt to punish all the visible violations of the rule. We start with a simple game-theoretic analysis that highlights the value of gentle COntinuous Punishment (gentle COP) policies. If the subjects of the rule are rational, gentle COP can eliminate violations even when the rule enforcer has limited resources. The second part of the paper uses simulations to examine the robustness of gentle COP policies to likely deviations from rationality. The results suggest that when the probability of detecting violations is sufficiently high, gentle COP policies can be effective even when the subjects of the rule are boundedly rational adaptive learners. The paper concludes with experimental studies that clarify the value of gentle COP policies in the lab and in an attempt to eliminate cheating in exams. Copyright (c) 2009 Elsevier B.V. All rights reserved.
INDEXABILITY AND OPTIMAL INDEX POLICIES FOR A CLASS OF REINITIALISING RESTLESS BANDITS.
Villar, Sofía S
2016-01-01
Motivated by a class of Partially Observable Markov Decision Processes with application in surveillance systems in which a set of imperfectly observed state processes is to be inferred from a subset of available observations through a Bayesian approach, we formulate and analyze a special family of multi-armed restless bandit problems. We consider the problem of finding an optimal policy for observing the processes that maximizes the total expected net rewards over an infinite time horizon subject to the resource availability. From the Lagrangian relaxation of the original problem, an index policy can be derived, as long as the existence of the Whittle index is ensured. We demonstrate that such a class of reinitializing bandits in which the projects' state deteriorates while active and resets to its initial state when passive until its completion possesses the structural property of indexability and we further show how to compute the index in closed form. In general, the Whittle index rule for restless bandit problems does not achieve optimality. However, we show that the proposed Whittle index rule is optimal for the problem under study in the case of stochastically heterogenous arms under the expected total criterion, and it is further recovered by a simple tractable rule referred to as the 1-limited Round Robin rule. Moreover, we illustrate the significant suboptimality of other widely used heuristic: the Myopic index rule, by computing in closed form its suboptimality gap. We present numerical studies which illustrate for the more general instances the performance advantages of the Whittle index rule over other simple heuristics.
INDEXABILITY AND OPTIMAL INDEX POLICIES FOR A CLASS OF REINITIALISING RESTLESS BANDITS
Villar, Sofía S.
2016-01-01
Motivated by a class of Partially Observable Markov Decision Processes with application in surveillance systems in which a set of imperfectly observed state processes is to be inferred from a subset of available observations through a Bayesian approach, we formulate and analyze a special family of multi-armed restless bandit problems. We consider the problem of finding an optimal policy for observing the processes that maximizes the total expected net rewards over an infinite time horizon subject to the resource availability. From the Lagrangian relaxation of the original problem, an index policy can be derived, as long as the existence of the Whittle index is ensured. We demonstrate that such a class of reinitializing bandits, in which a project's state deteriorates while active and resets to its initial state when passive until its completion, possesses the structural property of indexability, and we further show how to compute the index in closed form. In general, the Whittle index rule for restless bandit problems does not achieve optimality. However, we show that the proposed Whittle index rule is optimal for the problem under study in the case of stochastically heterogeneous arms under the expected total criterion, and it is further recovered by a simple tractable rule referred to as the 1-limited Round Robin rule. Moreover, we illustrate the significant suboptimality of another widely used heuristic, the Myopic index rule, by computing its suboptimality gap in closed form. We present numerical studies which illustrate, for more general instances, the performance advantages of the Whittle index rule over other simple heuristics. PMID:27212781
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-24
... Organizations; C2 Options Exchange, Incorporated; Order Approving a Proposed Rule Change To Adopt a Designated... thereunder,\\2\\ a proposed rule change to adopt a Designated Primary Market-Maker (``DPM'') program. The... the Notice, C2 has proposed to adopt a DPM program. The associated proposed rules are based on the...
Computational design of a self-assembling symmetrical β-propeller protein.
Voet, Arnout R D; Noguchi, Hiroki; Addy, Christine; Simoncini, David; Terada, Daiki; Unzai, Satoru; Park, Sam-Yong; Zhang, Kam Y J; Tame, Jeremy R H
2014-10-21
The modular structure of many protein families, such as β-propeller proteins, strongly implies that duplication played an important role in their evolution, leading to highly symmetrical intermediate forms. Previous attempts to create perfectly symmetrical propeller proteins have failed, however. We have therefore developed a new and rapid computational approach to design such proteins. As a test case, we have created a sixfold symmetrical β-propeller protein and experimentally validated the structure using X-ray crystallography. Each blade consists of 42 residues. Proteins carrying 2-10 identical blades were also expressed and purified. Two or three tandem blades assemble to recreate the highly stable sixfold symmetrical architecture, consistent with the duplication and fusion theory. The other proteins produce different monodisperse complexes, up to 42 blades (180 kDa) in size, which self-assemble according to simple symmetry rules. Our procedure is suitable for creating nano-building blocks from different protein templates of desired symmetry.
NASA Astrophysics Data System (ADS)
Nakanishi, Hideya; Imazu, Setsuo; Ohsuna, Masaki; Kojima, Mamoru; Nonomura, Miki; Shoji, Mamoru; Emoto, Masahiko; Yoshida, Masanobu; Iwata, Chie; Miyake, Hitoshi; Nagayama, Yoshio; Kawahata, Kazuo
To deal with endless data streams acquired in LHD steady-state experiments, the LHD data acquisition system was designed with a simple concept that divides a long pulse into a consecutive series of 10-s “subshots”. The latest digitizers, which apply high-speed PCI-Express technology, however, output nonstop gigabyte-per-second data streams for which 10-s subshots would be excessively large. These digitizers need shorter subshot intervals, less than 10 s long. In contrast, steady-state fusion plants need uninterrupted monitoring of the environment and device soundness. They adopt longer subshot lengths of either 10 min or 1 day. To cope with both uninterrupted monitoring and ultra-fast diagnostics, the ability to vary the subshot length according to the type of operation is required. In this study, a design modification that enables variable subshot lengths was implemented and its practical effectiveness in LHD was verified.
On the analysis of photo-electron spectra
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao, C.-Z., E-mail: gao@irsamc.ups-tlse.fr; CNRS, LPT; Dinh, P.M.
2015-09-15
We analyze Photo-Electron Spectra (PES) for a variety of excitation mechanisms, from a simple mono-frequency laser pulse to involved combinations of pulses as used, e.g., in attosecond experiments. In the case of simple pulses, the peaks in PES reflect the occupied single-particle levels in combination with the given laser frequency. This usual, simple rule may badly fail in the case of excitation pulses with mixed frequencies and if resonant modes of the system are significantly excited. We thus develop an extension of the usual rule to cover all possible excitation scenarios, including mixed frequencies in the attosecond regime. We find that the spectral distributions of dipole, monopole and quadrupole power for the given excitation, taken together and properly shifted by the single-particle energies, provide a pertinent picture of the PES in all situations. This leads to the derivation of a generalized relation allowing one to understand photo-electron yields even in complex experimental setups.
A simple rule of thumb for elegant prehension.
Mon-Williams, M; Tresilian, J R
2001-07-10
Reaching out to grasp an object (prehension) is a deceptively elegant and skilled behavior. The movement prior to object contact can be described as having two components, the movement of the hand to an appropriate location for gripping the object, the "transport" component, and the opening and closing of the aperture between the fingers as they prepare to grip the target, the "grasp" component. The grasp component is sensitive to the size of the object, so that a larger grasp aperture is formed for wider objects; the maximum grasp aperture (MGA) is a little wider than the width of the target object and occurs later in the movement for larger objects. We present a simple model that can account for the temporal relationship between the transport and grasp components. We report the results of an experiment providing empirical support for our "rule of thumb." The model provides a simple, but plausible, account of a neural control strategy that has been the center of debate over the last two decades.
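The size sensitivity of the grasp component described in this abstract can be caricatured in a few lines; the margin and timing coefficients below are purely illustrative assumptions, not values taken from the paper.

```python
def max_grasp_aperture(width_mm, margin_mm=15.0):
    """Grasp component: the MGA is a little wider than the target object.
    The 15 mm margin is a hypothetical constant, not the paper's value."""
    return width_mm + margin_mm

def mga_time_fraction(width_mm, base=0.6, gain=0.001):
    """The MGA occurs later in the movement for larger objects.
    base/gain are illustrative; the fraction is capped below 1."""
    return min(base + gain * width_mm, 0.95)
```

For a 50 mm object this sketch predicts a 65 mm peak aperture, with the peak occurring later for wider targets, which is the qualitative pattern the abstract reports.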
Distributed Fair Auto Rate Medium Access Control for IEEE 802.11 Based WLANs
NASA Astrophysics Data System (ADS)
Zhu, Yanfeng; Niu, Zhisheng
Much research has shown that a carefully designed auto rate medium access control can utilize the underlying physical multi-rate capability to exploit the time-variation of the channel. In this paper, we develop a simple analytical model to elucidate the rule that maximizes the throughput of RTS/CTS based multi-rate wireless local area networks. Based on the discovered rule, we propose two distributed fair auto rate medium access control schemes, called FARM and FARM+, from the viewpoints of throughput fairness and time-share fairness, respectively. With the proposed schemes, after receiving a RTS frame, the receiver selectively returns the CTS frame to inform the transmitter of the maximum feasible rate probed from the signal-to-noise ratio of the received RTS frame. The key feature of the proposed schemes is that they are capable of maintaining throughput/time-share fairness in asymmetric situations where the distribution of SNR varies with stations. Extensive simulation results show that the proposed schemes outperform the existing throughput/time-share fair auto rate schemes in time-varying channel conditions.
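The receiver-side rate probing step can be sketched as a threshold lookup; the SNR thresholds and rate values below are illustrative assumptions, not the paper's measured operating points.

```python
# (min SNR in dB, rate in Mbit/s): thresholds are hypothetical, for illustration only
RATE_TABLE = [(6.0, 6), (12.0, 12), (18.0, 24), (25.0, 54)]

def select_rate(rts_snr_db):
    """Sketch of receiver-side probing: the CTS advertises the highest
    rate whose SNR threshold the received RTS meets, or None if no rate
    is feasible at the measured SNR."""
    feasible = None
    for threshold, rate in RATE_TABLE:
        if rts_snr_db >= threshold:
            feasible = rate
    return feasible
```

With these example thresholds, an RTS received at 20 dB SNR would be answered with a CTS advertising 24 Mbit/s.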
Markov source model for printed music decoding
NASA Astrophysics Data System (ADS)
Kopec, Gary E.; Chou, Philip A.; Maltz, David A.
1995-03-01
This paper describes a Markov source model for a simple subset of printed music notation. The model is based on the Adobe Sonata music symbol set and a message language of our own design. Chord imaging is the most complex part of the model. Much of the complexity follows from a rule of music typography that requires the noteheads for adjacent pitches to be placed on opposite sides of the chord stem. This rule leads to a proliferation of cases for other typographic details such as dot placement. We describe the language of message strings accepted by the model and discuss some of the imaging issues associated with various aspects of the message language. We also point out some aspects of music notation that appear problematic for a finite-state representation. Development of the model was greatly facilitated by the duality between image synthesis and image decoding. Although our ultimate objective was a music image model for use in decoding, most of the development proceeded by using the evolving model for image synthesis, since it is computationally far less costly to image a message than to decode an image.
A Markovian engine for a biological energy transducer: the catalytic wheel.
Tsong, Tian Yow; Chang, Cheng-Hung
2007-04-01
The molecular machines in biological cells are made of proteins, DNAs and other classes of molecules. The structures of these molecules are characteristically "soft", highly flexible, and yet their interactions with other molecules or ions are specific and selective. This chapter discusses a prevalent form, the catalytic wheel, or the energy transducer of cells, examines its mechanism of action, and extracts from it a set of simple but general rules for understanding the energetics of biomolecular devices. These rules should also benefit the design of manmade nanometer-scale machines such as rotary motors or track-guided linear transporters. We will focus on electric work that, by matching system dynamics and then enhancing the conformational fluctuation of one or several driver proteins, converts stochastic input of energy into rotation or locomotion of a receptor protein. The spatial (or barrier) and temporal symmetry breakings required for selected driver/receptor combinations are examined. This electric ratchet consists of a core engine that follows Markovian dynamics, alleviates difficulties encountered in rigid mechanical models, and is tailored to the soft-matter characteristics of biomolecules.
Pushing the rules: effects and aftereffects of deliberate rule violations.
Wirth, Robert; Pfister, Roland; Foerster, Anna; Huestegge, Lynn; Kunde, Wilfried
2016-09-01
Most of our daily life is organized around rules and social norms. But what makes rules so special? And what if one were to break a rule intentionally? Can we simply free ourselves from the present set of rules, or do we automatically adhere to them? How do rule violations influence subsequent behavior? To investigate the effects and aftereffects of violating a simple S-R rule, we conducted three experiments that investigated continuous finger-tracking responses on an iPad. Our experiments show that rule violations are distinct from rule-based actions in both response times and movement trajectories: they take longer to initiate and execute, and their movement trajectories are heavily contorted. The data not only show differences between the two types of response (rule-based vs. violation), but also yielded a characteristic pattern of aftereffects in the case of rule violations: rule violations do not trigger adaptation effects that render further rule violations less difficult; rather, every rule violation poses repeated effort on the agent. The study represents a first step towards understanding the signature and underlying mechanisms of deliberate rule violations: they cannot be acted out by themselves, but require the activation of the original rule first. Consequently, they are best understood as reformulations of existing rules that are not accessible on their own, but need to be constantly derived from the original rule, with an add-on that might entail an active tendency to steer away from mental representations that reflect (socially) unwanted behavior.
NASA Astrophysics Data System (ADS)
Chen, Jing-Chao; Zhou, Yu; Wang, Xi
2018-02-01
Technical trading rules have been widely used by practitioners in financial markets for a long time. Their profitability remains controversial, and few studies consider the stationarity of the technical indicators used in trading rules. We convert MA, KDJ and Bollinger bands into stationary processes and investigate the profitability of these trading rules using three high-frequency datasets (15 s, 30 s and 60 s) of CSI300 Stock Index Futures from January 4th 2012 to December 31st 2016. Several performance and risk measures are adopted to assess the practical value of all trading rules directly, while the ADF test is used to verify stationarity and the SPA test to check whether trading rules perform well due to intrinsic superiority or pure luck. The results show that there are several significant combinations of parameters for each indicator when transaction costs are not taken into consideration. Once transaction costs are included, trading profits are eliminated completely. We also propose a method to reduce the risk of technical trading rules.
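A moving-average trading rule of the kind studied in this abstract can be sketched as follows; the window lengths are hypothetical defaults, and this plain MA crossover is a generic textbook rule, not the stationarized indicators the paper constructs.

```python
def ma_crossover_signal(prices, short=5, long=20):
    """Generic MA crossover rule (window lengths are illustrative):
    return 1 (hold a long position) when the short-window moving average
    exceeds the long-window one, else 0 (stay flat)."""
    if len(prices) < long:
        return 0  # not enough history to form the long MA
    ma = lambda n: sum(prices[-n:]) / n
    return 1 if ma(short) > ma(long) else 0
```

On a steadily rising price series the short MA sits above the long MA and the rule signals long; on a flat series the two averages coincide and the rule stays out of the market.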
Bösner, Stefan; Haasenritter, Jörg; Becker, Annette; Karatolios, Konstantinos; Vaucher, Paul; Gencer, Baris; Herzig, Lilli; Heinzel-Gutenbrunner, Monika; Schaefer, Juergen R; Abu Hani, Maren; Keller, Heidi; Sönnichsen, Andreas C; Baum, Erika; Donner-Banzhoff, Norbert
2010-09-07
Chest pain can be caused by various conditions, with life-threatening cardiac disease being of greatest concern. Prediction scores to rule out coronary artery disease have been developed for use in emergency settings. We developed and validated a simple prediction rule for use in primary care. We conducted a cross-sectional diagnostic study in 74 primary care practices in Germany. Primary care physicians recruited all consecutive patients who presented with chest pain (n = 1249) and recorded symptoms and findings for each patient (derivation cohort). An independent expert panel reviewed follow-up data obtained at six weeks and six months on symptoms, investigations, hospital admissions and medications to determine the presence or absence of coronary artery disease. Adjusted odds ratios of relevant variables were used to develop a prediction rule. We calculated measures of diagnostic accuracy for different cut-off values for the prediction scores using data derived from another prospective primary care study (validation cohort). The prediction rule contained five determinants (age/sex, known vascular disease, patient assumes pain is of cardiac origin, pain is worse during exercise, and pain is not reproducible by palpation), with the score ranging from 0 to 5 points. The area under the curve (receiver operating characteristic curve) was 0.87 (95% confidence interval [CI] 0.83-0.91) for the derivation cohort and 0.90 (95% CI 0.87-0.93) for the validation cohort. The best overall discrimination was with a cut-off value of 3 (positive result 3-5 points; negative result
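The five-determinant score described above can be sketched as a simple checklist; the one point per determinant and the cut-off of 3 come from the abstract, but the age/sex thresholds used here (male >= 55, female >= 65) are an assumption for illustration, not quoted from the paper.

```python
def chest_pain_score(age, sex, known_vascular_disease, assumes_cardiac,
                     worse_on_exercise, not_reproducible_by_palpation):
    """One point per determinant named in the abstract (0-5 total).
    The age/sex thresholds below are hypothetical, not the paper's."""
    score = 0
    if (sex == "m" and age >= 55) or (sex == "f" and age >= 65):
        score += 1
    score += sum(map(int, (known_vascular_disease, assumes_cardiac,
                           worse_on_exercise, not_reproducible_by_palpation)))
    return score

def positive_result(score, cutoff=3):
    """Abstract: best overall discrimination at a cut-off of 3
    (3-5 points positive, below 3 negative)."""
    return score >= cutoff
```

A 60-year-old man with all four clinical determinants present scores 5 and is classified positive; a 40-year-old woman with only non-reproducible pain scores 1 and is classified negative.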
Comparison of optimization algorithms for the slow shot phase in HPDC
NASA Astrophysics Data System (ADS)
Frings, Markus; Berkels, Benjamin; Behr, Marek; Elgeti, Stefanie
2018-05-01
High-pressure die casting (HPDC) is a popular manufacturing process for aluminum processing. The slow shot phase in HPDC is the first phase of this process. During this phase, the molten metal is pushed towards the cavity under moderate plunger movement. The so-called shot curve describes this plunger movement. A good design of the shot curve is important to produce high-quality cast parts. Three partially competing process goals characterize the slow shot phase: (1) reducing air entrapment, (2) avoiding temperature loss, and (3) minimizing oxide caused by the air-aluminum contact. Due to the rough process conditions with high pressure and temperature, it is hard to design the shot curve experimentally. There exist a few design rules that are based on theoretical considerations. Nevertheless, the quality of the shot curve design still depends on the experience of the machine operator. To improve the shot curve it seems to be natural to use numerical optimization. This work compares different optimization strategies for the slow shot phase optimization. The aim is to find the best optimization approach on a simple test problem.
Knowledge-based graphical interfaces for presenting technical information
NASA Technical Reports Server (NTRS)
Feiner, Steven
1988-01-01
Designing effective presentations of technical information is extremely difficult and time-consuming. Moreover, the combination of increasing task complexity and declining job skills makes the need for high-quality technical presentations especially urgent. We believe that this need can ultimately be met through the development of knowledge-based graphical interfaces that can design and present technical information. Since much material is most naturally communicated through pictures, our work has stressed the importance of well-designed graphics, concentrating on generating pictures and laying out displays containing them. We describe APEX, a testbed picture generation system that creates sequences of pictures that depict the performance of simple actions in a world of 3D objects. Our system supports rules for determining automatically the objects to be shown in a picture, the style and level of detail with which they should be rendered, the method by which the action itself should be indicated, and the picture's camera specification. We then describe work on GRIDS, an experimental display layout system that addresses some of the problems in designing displays containing these pictures, determining the position and size of the material to be presented.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-07
... Stock Exchange LLC Amending NYSE Rule 1 To Provide for the Designation of Qualified Employees and NYSE... qualified employees to act in place of any person named in a rule as having authority to act under such rule... 1 to provide that the Exchange may formally designate one or more qualified employees to act in...
ERIC Educational Resources Information Center
Lai, Mark H. C.; Kwok, Oi-man
2015-01-01
Educational researchers commonly use the rule of thumb of "design effect smaller than 2" as the justification of not accounting for the multilevel or clustered structure in their data. The rule, however, has not yet been systematically studied in previous research. In the present study, we generated data from three different models…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-17
... solicit comments on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1). \\2\\ 17 CFR... PHLX. Also, in order to continue to maintain a relatively simple routing table and fee schedule, the... routed to PHLX while also maintaining a simple pricing structure. As it has done before, despite...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-01
... Order in a Simple or Complex Order that executes against non-Initiating Order interest and will also pay... to solicit comments on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1). \\2... paid to members executing electronically- delivered Customer Simple Orders in Penny Pilot Options and...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-08
... notice to solicit comments on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1... exchanges in the listed options marketplace. The Exchange proposes to adopt a set of fees for simple, non... Public Customer simple, non-complex Maker orders in all multiply-listed index and ETF options classes...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-18
... notice to solicit comments on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1...) The Customer \\3\\ Rebate Program in Section B; (ii) Simple Order pricing in Section I entitled Rebates... Exchange proposes to amend the Simple Order Fees for Removing Liquidity in Section I which are applicable...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-20
... solicit comments on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1). \\2\\ 17 CFR... Public Customer complex orders, including those that trade against simple (non-complex) orders (excluding... rebate for all Maker simple orders (excluding trades on the open, for which no fees are assessed or...
Tony Blair and the Politics of Race in Education: Whiteness, "Doublethink" and New Labour
ERIC Educational Resources Information Center
Gillborn, David
2008-01-01
It is tempting to view the Blairite legacy as a simple story of political hypocrisy: a government, swept to power after almost two decades of Conservative rule, promising much but reneging on those commitments and falling back on Thatcherite authoritarian populism when the going got tough. But that would be too simple a story. The Blairite…
Organizational Knowledge Transfer Using Ontologies and a Rule-Based System
NASA Astrophysics Data System (ADS)
Okabe, Masao; Yoshioka, Akiko; Kobayashi, Keido; Yamaguchi, Takahira
In recent automated and integrated manufacturing, so-called intelligence skill is becoming more and more important, and its efficient transfer to next-generation engineers is one of the urgent issues. In this paper, we propose a new approach without costly OJT (on-the-job training), that is, the combined use of a domain ontology, a rule ontology and a rule-based system. Intelligence skill can be decomposed into pieces of simple engineering rules. A rule ontology consists of these engineering rules as primitives and the semantic relations among them. A domain ontology consists of technical terms in the engineering rules and the semantic relations among them. A rule ontology helps novices get the total picture of the intelligence skill, and a domain ontology helps them understand the exact meanings of the engineering rules. A rule-based system helps domain experts externalize their tacit intelligence skill to ontologies and also helps novices internalize them. As a case study, we applied our proposal to an actual job at a remote control and maintenance office of hydroelectric power stations in Tokyo Electric Power Co., Inc. We also conducted an evaluation experiment for this case study, and the result supports our proposal.
Evidence flow graph methods for validation and verification of expert systems
NASA Technical Reports Server (NTRS)
Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant
1989-01-01
The results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems are given. A translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis were developed. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example, that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.
Khadilkar, Mihir R; Escobedo, Fernando A
2014-10-17
Sought-after ordered structures of mixtures of hard anisotropic nanoparticles can often be thermodynamically unfavorable due to the components' geometric incompatibility to densely pack into regular lattices. A simple compatibilization rule is identified wherein the particle sizes are chosen such that the order-disorder transition pressures of the pure components match (and the entropies of the ordered phases are similar). Using this rule with representative polyhedra from the truncated-cube family that form pure-component plastic crystals, Monte Carlo simulations show the formation of plastic-solid solutions for all compositions and for a wide range of volume fractions.
A programmable rules engine to provide clinical decision support using HTML forms.
Heusinkveld, J; Geissbuhler, A; Sheshelidze, D; Miller, R
1999-01-01
The authors have developed a simple method for specifying rules to be applied to information on HTML forms. This approach allows clinical experts, who lack the programming expertise needed to write CGI scripts, to construct and maintain domain-specific knowledge and ordering capabilities within WizOrder, the order-entry and decision support system used at Vanderbilt Hospital. The clinical knowledge base maintainers use HTML editors to create forms and spreadsheet programs for rule entry. A test environment has been developed which uses Netscape to display forms; the production environment displays forms using an embedded browser.
Why Rules Matter in Complex Event Processing...and Vice Versa
NASA Astrophysics Data System (ADS)
Vincent, Paul
Many commercial and research CEP solutions are moving beyond simple stream query languages to more complete definitions of "process" and thence to "decisions" and "actions". As event processing capabilities increase, there is a growing realization that the humble "rule" is as relevant to the event cloud as it is to specific services. Less obvious is how much event processing has to offer the process and rule execution and management technologies. Does event processing change the way we should manage businesses, processes and services, together with their embedded (and hopefully managed) rulesets?
Can the oscillator strength of the quantum dot bandgap transition exceed unity?
NASA Astrophysics Data System (ADS)
Hens, Z.
2008-10-01
We discuss the apparent contradiction between the Thomas-Reiche-Kuhn sum rule for oscillator strengths and recent experimental data on the oscillator strength of the band gap transition of quantum dots. Starting from two simple single electron model systems, we show that the sum rule does not limit this oscillator strength to values below unity, or below the number of electrons in the highest occupied single electron state. The only upper limit the sum rule imposes on the oscillator strength of the quantum dot band gap transition is the total number of electrons in the quantum dot.
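For reference, the Thomas-Reiche-Kuhn sum rule under discussion can be written, in a standard single-particle form (notation is the usual textbook one, not taken from the paper):

```latex
% TRK sum rule: the total oscillator strength equals the electron number N
\sum_{k} f_{0k} = N ,
\qquad
f_{0k} = \frac{2\, m_e\, \omega_{0k}}{\hbar}\,
         \bigl|\langle k \,|\, \hat{x} \,|\, 0 \rangle\bigr|^{2}
```

Because the constraint applies to the sum over all transitions rather than to any single term, an individual transition such as the quantum dot band-gap transition is bounded only by the total electron number N, consistent with the abstract's conclusion.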
Graphical Tools for Linear Structural Equation Modeling
2014-06-01
Kenny and Milan (2011) write, “Identification is perhaps the most difficult concept for SEM researchers to understand.” The report compares graphical criteria with the use of typical SEM software to determine model identifiability, for which Kenny and Milan (2011) list several drawbacks (e.g., sensitivity to poor starting values), and relates them to the well-known recursive and null rules (Bollen, 1989) and the regression rule (Kenny and Milan, 2011), leading to a simple criterion for identifying individual parameters.
He, ZeFang
2014-01-01
An attitude control strategy based on Ziegler-Nichols rules for tuning the PD (proportional-derivative) parameters of quadrotor helicopters is presented to address the tendency of quadrotors to become unstable. This problem is caused by the narrow definition domain of the attitude angles of quadrotor helicopters. The proposed controller is nonlinear and consists of a linear part and a nonlinear part. The linear part is a PD controller with PD parameters tuned by Ziegler-Nichols rules, acting on the quadrotor's decoupled linear system after feedback linearization; the nonlinear part is a feedback linearization term which converts the nonlinear system into a linear one. The simulation results show that the attitude controller proposed in this paper is highly robust, and its control effect is better than that of the two other nonlinear controllers. The nonlinear parts of the other two nonlinear controllers are the same as in the attitude controller proposed in this paper; their linear parts involve a PID (proportional-integral-derivative) controller with parameters tuned by Ziegler-Nichols rules and a PD controller with parameters tuned by GA (genetic algorithms), respectively. Moreover, this attitude controller is simple and easy to implement. PMID:25614879
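The Ziegler-Nichols step the abstract relies on can be sketched from the standard closed-loop tuning table; the constants below (Kp = 0.8 Ku, Td = Tu/8 for a PD controller) are the common textbook values, which may differ from the exact variant used in the paper.

```python
def ziegler_nichols_pd(Ku, Tu):
    """Classic closed-loop Ziegler-Nichols tuning for a PD controller.

    Ku: ultimate gain (gain at which the loop sustains oscillation)
    Tu: ultimate period (period of that oscillation)
    Returns (Kp, Kd) with Kp = 0.8*Ku and Kd = Kp * Tu/8 (standard
    textbook constants, assumed here rather than quoted from the paper)."""
    Kp = 0.8 * Ku
    Td = Tu / 8.0
    return Kp, Kp * Td
```

For example, a relay or gain-sweep experiment yielding Ku = 10 and Tu = 2 s gives Kp = 8 and Kd = 2, which would then act on the feedback-linearized quadrotor dynamics.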
Navigating a Mobile Robot Across Terrain Using Fuzzy Logic
NASA Technical Reports Server (NTRS)
Seraji, Homayoun; Howard, Ayanna; Bon, Bruce
2003-01-01
A strategy for autonomous navigation of a robotic vehicle across hazardous terrain involves the use of a measure of terrain traversability within a fuzzy-logic conceptual framework. This navigation strategy requires no a priori information about the environment. Fuzzy logic was selected as a basic element of this strategy because it provides a formal methodology for representing and implementing a human driver's heuristic knowledge and operational experience. Within a fuzzy-logic framework, the attributes of human reasoning and decision-making can be formulated by simple IF (antecedent), THEN (consequent) rules coupled with easily understandable and natural linguistic representations. The linguistic values in the rule antecedents convey the imprecision associated with measurements taken by sensors onboard a mobile robot, while the linguistic values in the rule consequents represent the vagueness inherent in the reasoning processes that generate the control actions. The operational strategies of the expert human driver can be transferred, via fuzzy logic, to a robot-navigation strategy in the form of a set of simple conditional statements composed of linguistic variables. These linguistic variables are defined by fuzzy sets in accordance with user-defined membership functions. The main advantages of a fuzzy navigation strategy lie in the ability to extract heuristic rules from human experience and to obviate the need for an analytical model of the robot navigation process.
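The IF-THEN rule machinery described above can be illustrated with a two-rule fragment; the membership-function bounds and output speeds are hypothetical, chosen only to show the mechanics, not values from the NASA system.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def speed_command(roughness):
    """Two illustrative fuzzy rules over a normalized roughness measure:
        IF terrain is SMOOTH THEN speed is FAST
        IF terrain is ROUGH  THEN speed is SLOW
    Defuzzified as a weighted average of singleton outputs.
    All bounds and speeds are hypothetical."""
    smooth = tri(roughness, -0.5, 0.0, 0.6)   # membership in SMOOTH
    rough = tri(roughness, 0.4, 1.0, 1.5)     # membership in ROUGH
    fast, slow = 1.0, 0.2                     # singleton output speeds
    w = smooth + rough
    return (smooth * fast + rough * slow) / w if w else 0.0
```

Perfectly smooth terrain (roughness 0) yields full speed, fully rough terrain yields the slow speed, and intermediate roughness blends the two rules smoothly, which is the graceful degradation fuzzy control is chosen for.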
Information, intelligence, and interface: the pillars of a successful medical information system.
Hadzikadic, M; Harrington, A L; Bohren, B F
1995-01-01
This paper addresses three key issues facing developers of clinical and/or research medical information systems. 1. INFORMATION. The basic function of every database is to store information about the phenomenon under investigation. There are many ways to organize information in a computer; however, only a few will prove optimal for any real-life situation. Computer Science theory has developed several approaches to database structure, with relational theory leading in popularity among end users [8]. Strict conformance to the rules of relational database design rewards the user with consistent data and flexible access to that data. A properly defined database structure minimizes redundancy, i.e., multiple storage of the same information. Redundancy introduces problems when updating a database, since the repeated value has to be updated in all locations; missing even a single value corrupts the whole database, and incorrect reports are produced [8]. To avoid such problems, relational theory offers a formal mechanism for determining the number and content of data files. These files not only preserve the conceptual schema of the application domain, but allow a virtually unlimited number of reports to be efficiently generated. 2. INTELLIGENCE. Flexible access enables the user to harvest additional value from collected data. This value is usually gained via reports defined at the time of database design. Although these reports are indispensable, with proper tools more information can be extracted from the database. For example, machine learning, a sub-discipline of artificial intelligence, has been successfully used to extract knowledge from databases of varying size by uncovering correlations among fields and records [1-6, 9]. This knowledge, represented in the form of decision trees, production rules, and probabilistic networks, clearly adds a flavor of intelligence to the data collection and manipulation system. 3. INTERFACE. 
Despite the obvious importance of collecting data and extracting knowledge, current systems often impede these processes. Problems stem from the lack of user friendliness and functionality. To overcome these problems, several features of a successful human-computer interface have been identified [7], including the following "golden" rules of dialog design [7]: consistency, use of shortcuts for frequent users, informative feedback, organized sequence of actions, simple error handling, easy reversal of actions, user-oriented focus of control, and reduced short-term memory load. To this list of rules, we added visual representation of both data and query results, since our experience has demonstrated that users react much more positively to visual rather than textual information. In our design of the Orthopaedic Trauma Registry--under development at the Carolinas Medical Center--we have made every effort to follow the above rules. The results were rewarding--the end users actually not only want to use the product, but also to participate in its development.
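The update-anomaly argument above can be made concrete. Below is a minimal sketch using Python's sqlite3, with hypothetical table and column names (`surgeon` and `trauma_case` are illustrative, not the registry's actual schema): because the surgeon's phone number is stored exactly once, a correction touches a single row, and every joined report sees the new value.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
# Normalized design: each fact is stored once and joined on demand.
# In an unnormalized design the phone would be repeated on every case
# row, and a missed update would corrupt subsequent reports.
cur.executescript("""
    CREATE TABLE surgeon (id INTEGER PRIMARY KEY, name TEXT, phone TEXT);
    CREATE TABLE trauma_case (id INTEGER PRIMARY KEY, injury TEXT,
                              surgeon_id INTEGER REFERENCES surgeon(id));
    INSERT INTO surgeon VALUES (1, 'Dr. Smith', '555-0100');
    INSERT INTO trauma_case VALUES (10, 'femur fracture', 1),
                                   (11, 'pelvic fracture', 1);
    UPDATE surgeon SET phone = '555-0199' WHERE id = 1;  -- one row to change
""")
rows = cur.execute("""
    SELECT trauma_case.injury, surgeon.phone
    FROM trauma_case JOIN surgeon ON surgeon.id = trauma_case.surgeon_id
    ORDER BY trauma_case.id
""").fetchall()
```

Both report rows reflect the corrected phone number, because it lives in exactly one place.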
Wide-Field-of-View Millimeter-Wave Telescope Design with Ultra-Low Cross-Polarization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernacki, Bruce E.; Kelly, James F.; Sheen, David M.
2012-05-01
As millimeter-wave arrays become available, off-axis imaging performance of the fore optics increases in importance due to the relatively large physical extent of the arrays. Typically, simple optical telescope designs are adapted to millimeter-wave imaging but single-mirror spherical or classic conic designs cannot deliver adequate image quality except near the optical axis. Since most millimeter-wave designs are quasi-optical, optical ray tracing and commercial design software can be used to optimize designs to improve off-axis imaging as well as minimize cross-polarization. Methods that obey the Dragone-Mizuguchi condition for the design of reflective millimeter-wave telescopes with low cross-polarization also provide additional degrees of freedom that offer larger fields of view than possible with single-reflector designs. Dragone’s graphical design method does not lend itself readily to computer-based optical design approaches, but subsequent authors expanded on Dragone’s geometric design approach with analytic expressions that describe the location, shape, off-axis height and tilt of the telescope elements that satisfy Dragone’s design rules and can be used as a first-order design for subsequent computer-based design and optimization. We investigate two design variants that obey the Dragone-Mizuguchi conditions that exhibit ultra-low polarization crosstalk and a large diffraction-limited field of view well suited to millimeter-wave imaging arrays.
The effects of cumulative practice on mathematics problem solving.
Mayfield, Kristin H; Chase, Philip N
2002-01-01
This study compared three different methods of teaching five basic algebra rules to college students. All methods used the same procedures to teach the rules and included four 50-question review sessions interspersed among the training of the individual rules. The differences among methods involved the kinds of practice provided during the four review sessions. Participants who received cumulative practice answered 50 questions covering a mix of the rules learned prior to each review session. Participants who received a simple review answered 50 questions on one previously trained rule. Participants who received extra practice answered 50 extra questions on the rule they had just learned. Tests administered after each review included new questions for applying each rule (application items) and problems that required novel combinations of the rules (problem-solving items). On the final test, the cumulative group outscored the other groups on application and problem-solving items. In addition, the cumulative group solved the problem-solving items significantly faster than the other groups. These results suggest that cumulative practice of component skills is an effective method of training problem solving. PMID:12102132
Making sense of information in noisy networks: human communication, gossip, and distortion.
Laidre, Mark E; Lamb, Alex; Shultz, Susanne; Olsen, Megan
2013-01-21
Information from others can be unreliable. Humans nevertheless act on such information, including gossip, to make various social calculations, thus raising the question of whether individuals can sort through social information to identify what is, in fact, true. Inspired by empirical literature on people's decision-making when considering gossip, we built an agent-based simulation model to examine how well simple decision rules could make sense of information as it propagated through a network. Our simulations revealed that a minimalistic decision rule, 'Bit-wise mode', which compared information from multiple sources and then sought a consensus majority for each component bit within the message, was consistently the most successful at converging upon the truth. This decision rule attained high relative fitness even in maximally noisy networks, composed entirely of nodes that distorted the message. The rule was also superior to other decision rules regardless of its frequency in the population. Simulations carried out with variable agent memory constraints, different numbers of observers who initiated information propagation, and a variety of network types suggested that the single most important factor in making sense of information was the number of independent sources that agents could consult. Broadly, our model suggests that despite the distortion information is subject to in the real world, it is nevertheless possible to make sense of it based on simple Darwinian computations that integrate multiple sources.
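A minimal sketch of such a per-bit majority rule, consistent with the description above but not the authors' exact implementation (tie-breaking toward 0 is an assumption):

```python
def bitwise_mode(messages):
    """Combine noisy copies of a binary message by taking, for each bit
    position, the majority value across all received copies.
    `messages` is a list of equal-length lists of 0/1 bits; ties fall to 0."""
    n = len(messages)
    return [1 if sum(bits) * 2 > n else 0 for bits in zip(*messages)]
```

With three sources, `bitwise_mode([[1, 0, 1], [1, 1, 1], [0, 0, 1]])` recovers `[1, 0, 1]`: each distorted bit is outvoted as long as a majority of independent sources transmit it correctly.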
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-04
... Capital Commitment Schedule (``CCS'') interest; (3) NYSE Rule 70.25 to permit d-Quotes to be designated... that MPL Orders may interact with CCS interest; (3) NYSE Rule 70.25 to permit d- Quotes to be... the CCS pursuant to Rule 1000 would not be permitted to be designated as MPL Orders. The CCS is a...
NASA Astrophysics Data System (ADS)
Christodoulou, Dimitris M.; Kazanas, Demosthenes
2017-12-01
We consider the geometric Titius-Bode rule for the semimajor axes of planetary orbits. We derive an equivalent rule for the midpoints of the segments between consecutive orbits along the radial direction and we interpret it physically in terms of the work done in the gravitational field of the Sun by particles whose orbits are perturbed around each planetary orbit. On such energetic grounds, it is not surprising that some exoplanets in multiple-planet extrasolar systems obey the same relation. However, it is surprising that this simple interpretation of the Titius-Bode rule also reveals new properties of the bound closed orbits predicted by Bertrand’s theorem, which has been known since 1873.
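The geometric rule and its midpoint form are easy to illustrate: if the semimajor axes follow a_k = a_1 β^k (k = 0, 1, …), then the radial midpoints of consecutive segments form a geometric sequence with the same ratio β. A short sketch (the ratio 1.7 is an arbitrary illustrative value):

```python
def geometric_axes(a1, beta, n):
    """Semimajor axes under a geometric Titius-Bode rule: a_k = a1 * beta**k."""
    return [a1 * beta**k for k in range(n)]

def midpoints(axes):
    """Radial midpoints of the segments between consecutive orbits."""
    return [(x + y) / 2 for x, y in zip(axes, axes[1:])]

axes = geometric_axes(1.0, 1.7, 5)
mids = midpoints(axes)
# Consecutive midpoints share the ratio beta, since
# (a*b + a*b**2) / (a + a*b) = b for any geometric pair.
```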
50 CFR 424.16 - Proposed rules.
Code of Federal Regulations, 2011 CFR
2011-10-01
... OF COMMERCE); ENDANGERED SPECIES COMMITTEE REGULATIONS SUBCHAPTER A LISTING ENDANGERED AND THREATENED SPECIES AND DESIGNATING CRITICAL HABITAT Revision of the Lists § 424.16 Proposed rules. (a) General. Based... any proposed rule to list, delist, or reclassify a species, or to designate or revise critical habitat...
Ban, Jong-Wook; Emparanza, José Ignacio; Urreta, Iratxe; Burls, Amanda
2016-01-01
Background Many new clinical prediction rules are derived and validated. But the design and reporting quality of clinical prediction research has been less than optimal. We aimed to assess whether design characteristics of validation studies were associated with the overestimation of clinical prediction rules’ performance. We also aimed to evaluate whether validation studies clearly reported important methodological characteristics. Methods Electronic databases were searched for systematic reviews of clinical prediction rule studies published between 2006 and 2010. Data were extracted from the eligible validation studies included in the systematic reviews. A meta-analytic meta-epidemiological approach was used to assess the influence of design characteristics on predictive performance. From each validation study, it was assessed whether 7 design and 7 reporting characteristics were properly described. Results A total of 287 validation studies of clinical prediction rules were collected from 15 systematic reviews (31 meta-analyses). Validation studies using a case-control design produced a summary diagnostic odds ratio (DOR) 2.2 times (95% CI: 1.2–4.3) larger than validation studies using a cohort or unclear design. When differential verification was used, the summary DOR was overestimated by twofold (95% CI: 1.2–3.1) compared to complete, partial and unclear verification. The summary relative DOR (RDOR) of validation studies with inadequate sample size was 1.9 (95% CI: 1.2–3.1) compared to studies with adequate sample size. Study site, reliability, and the clinical prediction rule were adequately described in 10.1%, 9.4%, and 7.0% of validation studies, respectively. Conclusion Validation studies with design shortcomings may overestimate the performance of clinical prediction rules. The quality of reporting among studies validating clinical prediction rules needs to be improved. PMID:26730980
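The diagnostic odds ratio underlying these comparisons is computed from a 2x2 validation table, and the meta-epidemiological comparison reduces to a ratio of summary DORs. A minimal sketch with illustrative counts (not data from the review):

```python
def diagnostic_odds_ratio(tp, fp, fn, tn):
    """Diagnostic odds ratio from a 2x2 table:
    DOR = (TP/FN) / (FP/TN) = (TP*TN) / (FP*FN)."""
    return (tp * tn) / (fp * fn)

def relative_dor(dor_a, dor_b):
    """How many times larger one summary DOR is than another
    (e.g. case-control design vs. cohort design)."""
    return dor_a / dor_b
```

For example, a study with 90 true positives, 10 false positives, 10 false negatives and 90 true negatives has a DOR of 81; a design whose summary DOR is twice another's corresponds to a relative DOR of 2.0, the scale on which the overestimates above are reported.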
Designing novel Sn-Bi, Si-C and Ge-C nanostructures, using simple theoretical chemical similarities
NASA Astrophysics Data System (ADS)
Zdetsis, Aristides D.
2011-04-01
A framework of simple, transparent and powerful concepts is presented which is based on isoelectronic (or isovalent) principles, analogies, regularities and similarities. These analogies can be considered conceptual extensions of the periodic table of the elements, assuming that two atoms or molecules having the same number of valence electrons would be expected to have similar or homologous properties. In addition, such similar moieties should be able, in principle, to replace each other in more complex structures and nanocomposites. This is only partly true and only occurs under certain conditions, which are investigated and reviewed here. When successful, these concepts are very powerful and transparent, leading to a large variety of nanomaterials based on Si and other group 14 elements, similar to well-known and well-studied analogous materials based on boron and carbon. Such nanomaterials designed in silico include, among many others, Si-C, Sn-Bi and Ge-C clusters, rings, nanowheels, nanorods, nanocages and multidecker sandwiches, as well as silicon planar rings and fullerenes similar to the analogous sp2-bonded carbon structures. It is shown that this pedagogically simple and transparent framework can lead to an endless variety of novel and functional nanomaterials with important potential applications in nanotechnology, nanomedicine and nanobiology. Some of the so-called predicted structures have already been synthesized, although not necessarily with the same rationale and motivation. Finally, it is anticipated that such powerful and transparent rules and analogies, in addition to their predictive power, could also lead to far-reaching interpretations and a deeper understanding of already known results and information.
Sum rules for quasifree scattering of hadrons
NASA Astrophysics Data System (ADS)
Peterson, R. J.
2018-02-01
The areas dσ/dΩ of fitted quasifree scattering peaks from bound nucleons for continuum hadron-nucleus spectra measuring d²σ/dΩdω are converted to sum rules akin to the Coulomb sums familiar from continuum electron scattering spectra from nuclear charge. Hadronic spectra with or without charge exchange of the beam are considered. These sums are compared to the simple expectations of a nonrelativistic Fermi gas, including a Pauli blocking factor. For scattering without charge exchange, the hadronic sums are below this expectation, as also observed with Coulomb sums. For charge exchange spectra, the sums are near or above the simple expectation, with larger uncertainties. The strong role of hadron-nucleon in-medium total cross sections is noted from use of the Glauber model.
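The nonrelativistic Fermi-gas expectation mentioned above, with its Pauli blocking factor, takes the standard textbook form (per nucleon, with Fermi momentum k_F and momentum transfer q):

```latex
S(q) =
\begin{cases}
\dfrac{3q}{4k_F}\left(1 - \dfrac{q^2}{12k_F^2}\right), & q \le 2k_F,\\[1.5ex]
1, & q > 2k_F,
\end{cases}
```

so that the sum is suppressed below q = 2k_F, where the Pauli principle blocks scattering into occupied states, and saturates at unity above it.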
Methodological issues with adaptation of clinical trial design.
Hung, H M James; Wang, Sue-Jane; O'Neill, Robert T
2006-01-01
Adaptation of clinical trial design generates many issues that have not been resolved for practical applications, though statistical methodology has advanced greatly. This paper focuses on some methodological issues. In one type of adaptation, such as sample size re-estimation, only the postulated value of a parameter for planning the trial size may be altered. In another type, the originally intended hypothesis for testing may be modified using the internal data accumulated at an interim time of the trial, such as changing the primary endpoint and dropping a treatment arm. For sample size re-estimation, we make a contrast between an adaptive test weighting the two-stage test statistics with the statistical information given by the original design and the original sample mean test with a properly corrected critical value. We point out the difficulty in planning a confirmatory trial based on the crude information generated by exploratory trials. With regard to selecting a primary endpoint, we argue that a selection process that allows switching from one endpoint to the other with the internal data of the trial is not very likely to gain a power advantage over the simple process of selecting one from the two endpoints by testing them with an equal split of alpha (Bonferroni adjustment). For dropping a treatment arm, distributing the remaining sample size of the discontinued arm to other treatment arms can substantially improve the statistical power of identifying a superior treatment arm in the design. A common difficult methodological issue is how to select an adaptation rule in the trial planning stage. Pre-specification of the adaptation rule is important for practicality. Changing the originally intended hypothesis for testing with the internal data raises great concerns among clinical trial researchers.
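The Bonferroni alternative discussed above is simple to state in code. A sketch of the decision rule only (not the authors' power comparisons): each of the two candidate endpoints is tested at half the overall significance level, and the trial succeeds if either clears its threshold.

```python
def bonferroni_two_endpoints(p1, p2, alpha=0.05):
    """Equal alpha split across two candidate primary endpoints:
    declare success if either endpoint is significant at alpha/2.
    The familywise error rate stays at or below alpha."""
    return p1 < alpha / 2 or p2 < alpha / 2
```

With alpha = 0.05, a p-value of 0.02 on either endpoint suffices (0.02 < 0.025), while p-values of 0.03 on both endpoints do not, since neither clears the halved threshold.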
Read Code quality assurance: from simple syntax to semantic stability.
Schulz, E B; Barrett, J W; Price, C
1998-01-01
As controlled clinical vocabularies assume an increasing role in modern clinical information systems, so the issue of their quality demands greater attention. In order to meet the resulting stringent criteria for completeness and correctness, a quality assurance system comprising a database of more than 500 rules is being developed and applied to the Read Thesaurus. The authors discuss the requirement to apply quality assurance processes to their dynamic editing database in order to ensure the quality of exported products. Sources of errors include human, hardware, and software factors as well as new rules and transactions. The overall quality strategy includes prevention, detection, and correction of errors. The quality assurance process encompasses simple data specification, internal consistency, inspection procedures and, eventually, field testing. The quality assurance system is driven by a small number of tables and UNIX scripts, with "business rules" declared explicitly as Structured Query Language (SQL) statements. Concurrent authorship, client-server technology, and an initial failure to implement robust transaction control have all provided valuable lessons. The feedback loop for error management needs to be short.
Digit Reversal in Children's Writing: A Simple Theory and Its Empirical Validation
ERIC Educational Resources Information Center
Fischer, Jean-Paul
2013-01-01
This article presents a simple theory according to which the left-right reversal of single digits by 5- and 6-year-old children is mainly due to the application of an implicit right-writing or -orienting rule. A number of nontrivial predictions can be drawn from this theory. First, left-oriented digits (1, 2, 3, 7, and 9) will be reversed more…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-23
... on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1). \\2\\ 17 CFR 240.19b-4. I....'' Specifically, the Exchange proposes to amend the Customer Rebate Program, Select Symbols,\\5\\ Simple and Complex... Category D to the Customer Rebate Program relating to Customer Simple Orders in Select Symbols. The...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-28
... Account. These funds are held ``in trust'' for the obligor and currently earn simple interest at the rate..., the Government has paid simple interest at the rate of 3 percent per year on cash deposited by bond... #0;notices is to give interested persons an opportunity to participate in #0;the rule making prior to...
On the fusion of tuning parameters of fuzzy rules and neural network
NASA Astrophysics Data System (ADS)
Mamuda, Mamman; Sathasivam, Saratha
2017-08-01
Learning a fuzzy rule-based system with a neural network can lead to a precise and valuable understanding of several problems. Fuzzy logic offers a simple way to reach a definite conclusion based upon vague, ambiguous, imprecise, noisy or missing input information. A conventional learning algorithm for tuning the parameters of fuzzy rules using training input-output data usually ends in a weak firing state; this weakens the fuzzy rule and makes it unreliable for a multiple-input fuzzy system. In this paper, we introduce a new learning algorithm for tuning the parameters of the fuzzy rules alongside a radial basis function neural network (RBFNN), training on input-output data by the gradient descent method. With the new learning algorithm, the problem of weak firing under the conventional method is addressed. We illustrate the efficiency of the new learning algorithm by means of numerical examples. MATLAB R2014(a) software was used to simulate the results. The results show that the new learning method has the advantage of training the fuzzy rules without tampering with the fuzzy rule table, which allows a membership function of a rule to be used more than once in the fuzzy rule base.
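As an illustration of gradient-descent tuning of a membership-function parameter (a toy single-rule example, not the paper's RBFNN algorithm): the center of a Gaussian membership function is moved down the gradient of the squared error between its firing strength and a target strength.

```python
import math

def gaussian_mf(x, c, s):
    """Gaussian membership function with center c and width s."""
    return math.exp(-((x - c) ** 2) / (2 * s ** 2))

def tune_center(samples, c=0.0, s=1.0, lr=0.5, epochs=500):
    """Tune the center of one Gaussian membership function by gradient
    descent on the squared error between its firing strength and a
    target membership, for (input, target) pairs in `samples`."""
    for _ in range(epochs):
        for x, target in samples:
            mu = gaussian_mf(x, c, s)
            err = mu - target
            # d(mu)/dc = mu * (x - c) / s**2, so the loss gradient
            # w.r.t. c is err * mu * (x - c) / s**2.
            c -= lr * err * mu * (x - c) / s ** 2
    return c
```

Asking for full membership at x = 2 pulls the center from 0 toward 2; the step size shrinks as the rule's firing strength approaches the target, which is the weak-firing regime the abstract refers to.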
NASA Astrophysics Data System (ADS)
Sha, Wei E. I.; Zhu, Hugh L.; Chen, Luzhou; Chew, Weng Cho; Choy, Wallace C. H.
2015-02-01
It is well known that transport paths of photocarriers (electrons and holes) before collected by electrodes strongly affect bulk recombination and thus electrical properties of solar cells, including open-circuit voltage and fill factor. For boosting device performance, a general design rule, tailored to arbitrary electron to hole mobility ratio, is proposed to decide the transport paths of photocarriers. Due to a unique ability to localize and concentrate light, plasmonics is explored to manipulate photocarrier transport through spatially redistributing light absorption at the active layer of devices. Without changing the active materials, we conceive a plasmonic-electrical concept, which tunes electrical properties of solar cells via the plasmon-modified optical field distribution, to realize the design rule. Incorporating spectrally and spatially configurable metallic nanostructures, thin-film solar cells are theoretically modelled and experimentally fabricated to validate the design rule and verify the plasmonic-tunable electrical properties. The general design rule, together with the plasmonic-electrical effect, contributes to the evolution of emerging photovoltaics.
Offerman, Theo; Palley, Asa B
2016-01-01
Strictly proper scoring rules are designed to truthfully elicit subjective probabilistic beliefs from risk neutral agents. Previous experimental studies have identified two problems with this method: (i) risk aversion causes agents to bias their reports toward the probability of [Formula: see text], and (ii) for moderate beliefs agents simply report [Formula: see text]. Applying a prospect theory model of risk preferences, we show that loss aversion can explain both of these behavioral phenomena. Using the insights of this model, we develop a simple off-the-shelf probability assessment mechanism that encourages loss-averse agents to report true beliefs. In an experiment, we demonstrate the effectiveness of this modification in both eliminating uninformative reports and eliciting true probabilistic beliefs.
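Strict propriety is easy to check numerically for the quadratic (Brier) scoring rule, a standard member of the class discussed above: a risk-neutral agent's expected score is uniquely maximized by reporting the true belief.

```python
def quadratic_score(report, outcome):
    """Quadratic (Brier) scoring rule for a binary event; higher is better."""
    return 1.0 - (outcome - report) ** 2

def expected_score(report, belief):
    """Expected score when the event truly occurs with probability `belief`."""
    return (belief * quadratic_score(report, 1)
            + (1 - belief) * quadratic_score(report, 0))

# Scan a grid of possible reports for an agent whose true belief is 0.7:
# the truthful report wins.
best = max((r / 100 for r in range(101)),
           key=lambda r: expected_score(r, 0.7))
```

The loss-aversion result in the abstract concerns exactly this incentive breaking down for non-risk-neutral agents, motivating their modified mechanism.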
Quasiperiodic one-dimensional photonic crystals with adjustable multiple photonic bandgaps.
Vyunishev, Andrey M; Pankin, Pavel S; Svyakhovskiy, Sergey E; Timofeev, Ivan V; Vetrov, Stepan Ya
2017-09-15
We propose an elegant approach to produce photonic bandgap (PBG) structures with multiple photonic bandgaps by constructing quasiperiodic photonic crystals (QPPCs) composed of a superposition of photonic lattices with different periods. Generally, QPPC structures exhibit both aperiodicity and multiple PBGs due to their long-range order. They are described by a simple analytical expression, instead of quasiperiodic tiling approaches based on substitution rules. Here we describe the optical properties of QPPCs exhibiting two PBGs that can be tuned independently. PBG interband spacing and its depth can be varied by choosing appropriate reciprocal lattice vectors and their amplitudes. These effects are confirmed by the proof-of-concept measurements made for the porous silicon-based QPPC of the appropriate design.
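The "simple analytical expression" idea can be sketched as a superposition of cosine index modulations, one per constituent lattice; each term contributes one PBG whose spectral position tracks its reciprocal lattice vector G_i = 2π/Λ_i and whose depth tracks its amplitude. The numerical values below are illustrative, not the fabricated porous-silicon design.

```python
import math

def quasiperiodic_index(z, n0=1.8, amps=(0.05, 0.03), periods=(0.20, 0.31)):
    """Refractive-index profile of a 1D quasiperiodic photonic crystal
    built as a superposition of two lattices with different periods
    (arbitrary length units): n(z) = n0 + sum_i a_i*cos(2*pi*z/period_i)."""
    return n0 + sum(a * math.cos(2 * math.pi / p * z)
                    for a, p in zip(amps, periods))
```

Tuning one period shifts the corresponding bandgap independently of the other, which is the independent-PBG control described above.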
NASA Astrophysics Data System (ADS)
Song, Junyeob; Zhou, Wei
2017-02-01
Plasmonic nanocavities can control light flows and enhance light-matter interactions at the subwavelength scale, and thus can potentially be used as nanoscale components in integrated optics systems, either for passive optical coupling or for active optical modulation and emission. In this work, we investigated a new type of multilayered metal-insulator optical nanocavity that can support multiple localized plasmon resonances with ultra-small mode volumes. The total number of resonance peaks and their resonance wavelengths can be freely and accurately controlled by simple geometric design rules. Multi-resonance plasmonic nanocavities can serve as nanoscale wavelength-multiplexed optical components in integrated optics systems, such as optical couplers, light emitters, nanolasers, optical sensors, and optical modulators.
Robots In War: Issues Of Risk And Ethics
2009-01-01
unexpected, untested ways. (And even straightforward, simple rules such as Asimov's Laws of Robotics (Asimov, 1950) can create unexpected dilemmas...stories (e.g., Asimov, 1950). Likewise, we may understand each rule of engagement and believe them to be sensible, but are they truly consistent...Netherlands: IOS Press. Asimov, I. (1950). I, Robot (2004 edition), New York, NY: Bantam Dell. BBC (2005). SLA Confirm Spy Plane Crash. BBC.com. Retrieved
Extension of the firefly algorithm and preference rules for solving MINLP problems
NASA Astrophysics Data System (ADS)
Costa, M. Fernanda P.; Francisco, Rogério B.; Rocha, Ana Maria A. C.; Fernandes, Edite M. G. P.
2017-07-01
An extension of the firefly algorithm (FA) for solving mixed-integer nonlinear programming (MINLP) problems is presented. Although penalty functions are nowadays frequently used to handle integrality conditions and inequality and equality constraints, this paper proposes implementing within the FA a simple rounding-based heuristic and four preference rules to find and converge to MINLP feasible solutions. Preliminary numerical experiments are carried out to validate the proposed methodology.
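A rounding-based heuristic of the kind described maps a firefly's continuous position to a mixed-integer candidate. A minimal sketch of that repair step only (the four preference rules themselves are not reproduced here):

```python
def round_feasible(x, integer_idx, lower, upper):
    """Repair a continuous firefly position for MINLP: round the
    integer-constrained components to the nearest integer, then clip
    every component back into its box bounds."""
    y = list(x)
    for i, v in enumerate(y):
        if i in integer_idx:
            v = round(v)
        y[i] = min(max(v, lower[i]), upper[i])
    return y
```

For example, with component 0 integer-constrained, the position [2.6, 0.4] is repaired to [3, 0.4] before the objective and constraints are evaluated. (Note that Python's round uses banker's rounding at exact .5 ties.)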
A business rules design framework for a pharmaceutical validation and alert system.
Boussadi, A; Bousquet, C; Sabatier, B; Caruba, T; Durieux, P; Degoulet, P
2011-01-01
Several alert systems have been developed to improve the patient safety aspects of clinical information systems (CIS). Most studies have focused on the evaluation of these systems, with little information provided about the methodology leading to system implementation. We propose here an 'agile' business rule design framework (BRDF) supporting both the design of alerts for the validation of drug prescriptions and the incorporation of the end user into the design process. We analyzed the unified process (UP) design life cycle and defined the activities, subactivities, actors and UML artifacts that could be used to enhance the agility of the proposed framework. We then applied the proposed framework to two different sets of data in the context of the Georges Pompidou University Hospital (HEGP) CIS. We introduced two new subactivities into UP: business rule specification and business rule instantiation. The pharmacist made an effective contribution to five of the eight BRDF design activities. Validation of the two new subactivities was carried out in the context of drug dosage adaptation to the patients' clinical and biological contexts. A pilot experiment showed that business rules modeled with BRDF and implemented as an alert system triggered an alert for 5824 of the 71,413 prescriptions considered (8.16%). A business rule design framework approach meets one of the strategic objectives for decision support design by taking into account three important criteria posing a particular challenge to system designers: 1) business processes, 2) knowledge modeling of the context of application, and 3) the agility of the various design steps.
40 CFR 62.1100 - Identification of plan.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) APPROVAL AND PROMULGATION OF STATE PLANS FOR DESIGNATED FACILITIES AND POLLUTANTS California Plan for the... of plan. (a) State of California Designated Facility Plan (Section 111(d) Plan). (b) The plan was... Pollution Control District Regulation 1; Rule 130—Definitions, Rule 240—Permit to Operate, Rule 450—Sulfide...
40 CFR 62.1100 - Identification of plan.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) APPROVAL AND PROMULGATION OF STATE PLANS FOR DESIGNATED FACILITIES AND POLLUTANTS California Plan for the... of plan. (a) State of California Designated Facility Plan (Section 111(d) Plan). (b) The plan was... Pollution Control District Regulation 1; Rule 130—Definitions, Rule 240—Permit to Operate, Rule 450—Sulfide...
40 CFR 62.1100 - Identification of plan.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) APPROVAL AND PROMULGATION OF STATE PLANS FOR DESIGNATED FACILITIES AND POLLUTANTS California Plan for the... of plan. (a) State of California Designated Facility Plan (Section 111(d) Plan). (b) The plan was... Pollution Control District Regulation 1; Rule 130—Definitions, Rule 240—Permit to Operate, Rule 450—Sulfide...
40 CFR 62.1100 - Identification of plan.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) APPROVAL AND PROMULGATION OF STATE PLANS FOR DESIGNATED FACILITIES AND POLLUTANTS California Plan for the... of plan. (a) State of California Designated Facility Plan (Section 111(d) Plan). (b) The plan was... Pollution Control District Regulation 1; Rule 130—Definitions, Rule 240—Permit to Operate, Rule 450—Sulfide...
40 CFR 62.1100 - Identification of plan.
Code of Federal Regulations, 2011 CFR
2011-07-01
...) APPROVAL AND PROMULGATION OF STATE PLANS FOR DESIGNATED FACILITIES AND POLLUTANTS California Plan for the... of plan. (a) State of California Designated Facility Plan (Section 111(d) Plan). (b) The plan was... Pollution Control District Regulation 1; Rule 130—Definitions, Rule 240—Permit to Operate, Rule 450—Sulfide...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-25
... DEPARTMENT OF AGRICULTURE Agricultural Marketing Service 7 CFR Part 922 [Docket No. AMS-FV-12-0028... Regulations AGENCY: Agricultural Marketing Service, USDA. ACTION: Affirmation of interim rule as final rule... the marketing order for apricots grown in designated Counties in Washington. The interim rule...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rawls, G.; Newhouse, N.; Rana, M.
2010-04-13
The Boiler and Pressure Vessel Project Team on Hydrogen Tanks was formed in 2004 to develop Code rules to address the various needs that had been identified for the design and construction of hydrogen storage vessels rated up to 15,000 psi. One of these needs was the development of Code rules for high-pressure composite vessels with non-load-sharing liners for stationary applications. In 2009, ASME approved the new Appendix 8 for the Section X Code, which contains the rules for these vessels. These vessels are designated as Class III vessels, with design pressures ranging from 20.7 MPa (3,000 psi) to 103.4 MPa (15,000 psi) and a maximum allowable outside liner diameter of 2.54 m (100 inches). The maximum design life of these vessels is limited to 20 years. Design, fabrication, and examination requirements have been specified, including Acoustic Emission testing at the time of manufacture. The Code rules include design qualification testing of prototype vessels. Qualification includes proof, expansion, burst, cyclic fatigue, creep, flaw, permeability, torque, penetration, and environmental testing.
Information from multiple modalities helps 5-month-olds learn abstract rules.
Frank, Michael C; Slemmer, Jonathan A; Marcus, Gary F; Johnson, Scott P
2009-07-01
By 7 months of age, infants are able to learn rules based on the abstract relationships between stimuli (Marcus et al., 1999), but they are better able to do so when exposed to speech than to some other classes of stimuli. In the current experiments we ask whether multimodal stimulus information will aid younger infants in identifying abstract rules. We habituated 5-month-olds to simple abstract patterns (ABA or ABB) instantiated in coordinated looming visual shapes and speech sounds (Experiment 1), shapes alone (Experiment 2), and speech sounds accompanied by uninformative but coordinated shapes (Experiment 3). Infants showed evidence of rule learning only in the presence of the informative multimodal cues. We hypothesize that the additional evidence present in these multimodal displays was responsible for the success of younger infants in learning rules, congruent with both a Bayesian account and with the Intersensory Redundancy Hypothesis.
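The abstract ABA/ABB identity rules used in these habituation studies can be made concrete with a small sketch. The syllable tokens and the slot-binding check below are illustrative assumptions, not the authors' actual stimuli:

```python
def matches_rule(sequence, rule):
    """Check whether a token sequence instantiates an abstract identity
    rule such as "ABA" or "ABB": same slot letter means same token,
    different slot letters mean different tokens."""
    if len(sequence) != len(rule):
        return False
    binding = {}       # slot letter -> token
    bound_tokens = {}  # token -> slot letter
    for token, slot in zip(sequence, rule):
        if slot in binding:
            if binding[slot] != token:
                return False
        else:
            if token in bound_tokens:  # distinct slots must bind distinct tokens
                return False
            binding[slot] = token
            bound_tokens[token] = slot
    return True

# "wo fe wo" (hypothetical syllables) instantiates ABA but not ABB
print(matches_rule(["wo", "fe", "wo"], "ABA"))  # True
print(matches_rule(["wo", "fe", "wo"], "ABB"))  # False
```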
Evaluating the Effectiveness of Auditing Rules for Electronic Health Record Systems
Hedda, Monica; Malin, Bradley A.; Yan, Chao; Fabbri, Daniel
2017-01-01
Healthcare organizations (HCOs) often deploy rule-based auditing systems to detect insider threats to sensitive patient health information in electronic health record (EHR) systems. These rule-based systems define behavior deemed to be high-risk a priori (e.g., family member, co-worker access). While such rules seem logical, there has been little scientific investigation into the effectiveness of these auditing rules in identifying inappropriate behavior. Thus, in this paper, we introduce an approach to evaluate the effectiveness of individual high-risk rules and rank them according to their potential risk. We investigate the rate of high-risk access patterns and minimum rate of high-risk accesses that can be explained with appropriate clinical reasons in a large EHR system. An analysis of 8M accesses from one-week of data shows that specific high-risk flags occur more frequently than theoretically expected and the rate at which accesses can be explained away with five simple reasons is 16 - 43%. PMID:29854153
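As a hedged illustration of the kind of a-priori high-risk rule and "explained-away" rate analyzed above (the field names, the two rules, and the appointment-based clinical reason are all hypothetical):

```python
def flag_high_risk(access, rules):
    """Return the names of the a-priori high-risk rules an access triggers."""
    return [name for name, rule in rules.items() if rule(access)]

# Hypothetical rules in the spirit of "family member" and "co-worker" flags
rules = {
    "same_last_name": lambda a: a["employee_last"] == a["patient_last"],
    "same_department": lambda a: a["employee_dept"] == a["patient_dept"],
}

accesses = [
    {"employee_last": "Lee", "patient_last": "Lee",
     "employee_dept": "ICU", "patient_dept": "ER", "has_appointment": False},
    {"employee_last": "Cho", "patient_last": "Park",
     "employee_dept": "ER", "patient_dept": "ER", "has_appointment": True},
]

flagged = [a for a in accesses if flag_high_risk(a, rules)]
# Fraction of flagged accesses explained away by a simple clinical reason
explained = sum(a["has_appointment"] for a in flagged) / len(flagged)
print(explained)  # 0.5
```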
Genetic reinforcement learning through symbiotic evolution for fuzzy controller design.
Juang, C F; Lin, J Y; Lin, C T
2000-01-01
An efficient genetic reinforcement learning algorithm for designing fuzzy controllers is proposed in this paper. The genetic algorithm (GA) adopted in this paper is based upon symbiotic evolution which, when applied to fuzzy controller design, complements the local mapping property of a fuzzy rule. Using this Symbiotic-Evolution-based Fuzzy Controller (SEFC) design method, the number of control trials, as well as consumed CPU time, are considerably reduced when compared to traditional GA-based fuzzy controller design methods and other types of genetic reinforcement learning schemes. Moreover, unlike traditional fuzzy controllers, which partition the input space into a grid, SEFC partitions the input space in a flexible way, thus creating fewer fuzzy rules. In SEFC, different types of fuzzy rules whose consequent parts are singletons, fuzzy sets, or linear equations (TSK-type fuzzy rules) are allowed. Further, the free parameters (e.g., centers and widths of membership functions) and fuzzy rules are all tuned automatically. Especially for TSK-type fuzzy rules, which put the proposed learning algorithm to full use, only the significant input variables are selected to participate in the consequent of a rule. The proposed SEFC design method has been applied to different simulated control problems, including the cart-pole balancing system, a magnetic levitation system, and a water bath temperature control system. From these control problems, and from comparisons with some traditional GA-based fuzzy systems, the proposed SEFC has been verified to be efficient and superior.
Evidence flow graph methods for validation and verification of expert systems
NASA Technical Reports Server (NTRS)
Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant
1988-01-01
This final report describes the results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems. This was approached by developing a translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems; for example, the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.
Development of Watch Schedule Using Rules Approach
NASA Astrophysics Data System (ADS)
Jurkevicius, Darius; Vasilecas, Olegas
The software for schedule creation and optimization solves a difficult, important, and practical problem. The proposed solution is an online employee portal where administrator users can create and manage watch schedules and employee requests. Each employee can log in with his/her own account and see his/her assignments, manage requests, etc. Employees set as administrators can perform the employee scheduling online, manage requests, etc. This scheduling software allows users not only to see the initial and optimized watch schedule in a simple and understandable form, but also to create special rules and criteria and input their business rules. Using these rules, the system automatically generates the watch schedule.
A programmable rules engine to provide clinical decision support using HTML forms.
Heusinkveld, J.; Geissbuhler, A.; Sheshelidze, D.; Miller, R.
1999-01-01
The authors have developed a simple method for specifying rules to be applied to information on HTML forms. This approach allows clinical experts, who lack the programming expertise needed to write CGI scripts, to construct and maintain domain-specific knowledge and ordering capabilities within WizOrder, the order-entry and decision support system used at Vanderbilt Hospital. The clinical knowledge base maintainers use HTML editors to create forms and spreadsheet programs for rule entry. A test environment has been developed which uses Netscape to display forms; the production environment displays forms using an embedded browser. PMID:10566470
Bilinearity, Rules, and Prefrontal Cortex
Dayan, Peter
2007-01-01
Humans can be instructed verbally to perform computationally complex cognitive tasks; their performance then improves relatively slowly over the course of practice. Many skills underlie these abilities; in this paper, we focus on the particular question of a uniform architecture for the instantiation of habitual performance and the storage, recall, and execution of simple rules. Our account builds on models of gated working memory, and involves a bilinear architecture for representing conditional input-output maps and for matching rules to the state of the input and working memory. We demonstrate the performance of our model on two paradigmatic tasks used to investigate prefrontal and basal ganglia function. PMID:18946523
Building a common pipeline for rule-based document classification.
Patterson, Olga V; Ginter, Thomas; DuVall, Scott L
2013-01-01
Instance-based classification of clinical text is a widely used natural language processing task employed as a step for patient classification, document retrieval, or information extraction. Rule-based approaches rely on concept identification and context analysis in order to determine the appropriate class. We propose a five-step process that enables even small research teams to develop simple but powerful rule-based NLP systems by taking advantage of a common UIMA AS based pipeline for classification. Our proposed methodology coupled with the general-purpose solution provides researchers with access to the data locked in clinical text in cases of limited human resources and compact timelines.
Capillarics: pre-programmed, self-powered microfluidic circuits built from capillary elements.
Safavieh, Roozbeh; Juncker, David
2013-11-07
Microfluidic capillary systems employ surface tension effects to manipulate liquids, and are thus self-powered and self-regulated as liquid handling is structurally and chemically encoded in microscale conduits. However, capillary systems have been limited to perform simple fluidic operations. Here, we introduce complex capillary flow circuits that encode sequential flow of multiple liquids with distinct flow rates and flow reversal. We first introduce two novel microfluidic capillary elements including (i) retention burst valves and (ii) robust low aspect ratio trigger valves. These elements are combined with flow resistors, capillary retention valves, capillary pumps, and open and closed reservoirs to build a capillary circuit that, following sample addition, autonomously delivers a defined sequence of multiple chemicals according to a preprogrammed and predetermined flow rate and time. Such a circuit was used to measure the concentration of C-reactive protein. This work illustrates that as in electronics, complex capillary circuits may be built by combining simple capillary elements. We define such circuits as "capillarics", and introduce symbolic representations. We believe that more complex circuits will become possible by expanding the library of building elements and formulating abstract design rules.
Pool Safety: A Few Simple Rules.
ERIC Educational Resources Information Center
PTA Today, 1993
1993-01-01
Presents suggestions by the National Swimming Pool Safety Committee on how to keep children safe while swimming. Ideas include maintaining strict adult supervision, pool and spa barriers, and knowledge of cardiopulmonary resuscitation. (SM)
The Good, the Bad, and the Ugly: A Theoretical Framework for the Assessment of Continuous Colormaps.
Bujack, Roxana; Turton, Terece L; Samsel, Francesca; Ware, Colin; Rogers, David H; Ahrens, James
2018-01-01
A myriad of design rules for what constitutes a "good" colormap can be found in the literature. Some common rules include order, uniformity, and high discriminative power. However, the meaning of many of these terms is often ambiguous or open to interpretation. At times, different authors may use the same term to describe different concepts or the same rule is described by varying nomenclature. These ambiguities stand in the way of collaborative work, the design of experiments to assess the characteristics of colormaps, and automated colormap generation. In this paper, we review current and historical guidelines for colormap design. We propose a specified taxonomy and provide unambiguous mathematical definitions for the most common design rules.
Ensuring production-worthy OPC recipes using large test structure arrays
NASA Astrophysics Data System (ADS)
Cork, Christopher; Zimmermann, Rainer; Mei, Xin; Shahin, Alexander
2007-03-01
The continual shrinking of design rules as the industry follows Moore's Law and the associated need for low k1 processes, have resulted in more layout configurations becoming difficult to print within the required tolerances. OPC recipes have needed to become more complex as tolerances decreased and acceptable corrections harder to find with simple algorithms. With this complexity comes the possibility of coding errors and ensuring the solutions are truly general. OPC Verification tools can check the quality of a correction based on pre-determined specifications for CD variation, line-end pullback and Edge Placement Error and then highlight layout configuration where violations are found. The problem facing a Mask Tape-Out group is that they usually have little control over the Design Styles coming in. Different approaches to eliminating problematic layouts have included highly restrictive Design Rules [1], whereby certain pitches or orientations are disallowed. Now these design rules are either becoming too complex or they overly restrict the designer from benefiting from the reduced pitch of the new node. The tight link between Design and Mask Tape-Out found in Integrated Device Manufacturers [2] (IDMs) i.e. companies that control both design and manufacturing can do much to dictate manufacturing friendly layout styles, and push ownership of problem resolution back to design groups. In fact this has been perceived as such an issue that a new class of products for designers that perform Lithographic Compliance Check on design layout is an emerging technology [3]. In contrast to IDMs, Semiconductor Foundries are presented with a much larger variety of design styles and a set of Fabless customers who generally are less knowledgeable in terms of understanding the impact of their layout on manufacturability and how to correct issues. The robustness requirements of a foundry's OPC correction recipe, therefore needs to be greater than that for an IDM's tape-out group. 
An OPC correction recipe that gives acceptable verification results based solely on one customer GDS is clearly not sufficient to guarantee that all future tape-outs from multiple customers will be similarly clean. Ad hoc changes made in reaction to problems seen at verification are risky: while they may solve one particular layout issue on one product, there is no guarantee that the problem will not simply shift to another configuration on a yet-to-be-manufactured part. The need to re-qualify a recipe over multiple products at each recipe change can easily result in excessive computational requirements. A single layer at an advanced node typically needs overnight runs on a large processor farm. Much of this layout, however, is extremely repetitive, made from a few standard cells placed tens of thousands of times. An alternative and more efficient approach, suggested by this paper as a screening methodology, is to encapsulate the problematic structures into a programmable test structure array. The dimensions of these test structures are parameterized in software such that they can be generated with these dimensions varied over the space of the design rules and conceivable design styles. By verifying the new recipe over these test structures, one could more quickly gain confidence that the recipe would be robust over multiple tape-outs. This paper gives some examples of the implementation of this methodology.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Designated contract market and swap execution facility position limits and accountability rules. (a) Spot... rules and procedures for monitoring and enforcing spot-month position limits set at levels no greater... monitoring and enforcing spot-month position limits set at levels no greater than 25 percent of estimated...
Code of Federal Regulations, 2013 CFR
2013-04-01
... Designated contract market and swap execution facility position limits and accountability rules. (a) Spot... rules and procedures for monitoring and enforcing spot-month position limits set at levels no greater... monitoring and enforcing spot-month position limits set at levels no greater than 25 percent of estimated...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manturov, Vassily O
2010-06-29
In this work we study knot theories with a parity property for crossings: every crossing is declared to be even or odd according to a certain preassigned rule. If this rule satisfies a set of simple axioms related to the Reidemeister moves, then certain simple invariants solving the minimality problem can be defined, and invariant maps on the set of knots can be constructed. The most important example of a knot theory with parity is the theory of virtual knots. Using the parity property arising from Gauss diagrams we show that even a gross simplification of the theory of virtual knots, namely the theory of free knots, admits simple and highly nontrivial invariants. This gives a solution to a problem of Turaev, who conjectured that all free knots are trivial. In this work we show that free knots are generally not invertible, and provide invariants which detect the invertibility of free knots. The passage to ordinary virtual knots allows us to strengthen known invariants (such as the Kauffman bracket) using parity considerations. We also discuss other examples of knot theories with parity. Bibliography: 27 items.
ERIC Educational Resources Information Center
Beem, Kate
2004-01-01
It is such a simple mandate: Prepare healthy, nutritious meals for the schoolchildren so they can go about the business of learning. But operating a school district food service department is anything but simple. Even in the smallest districts, food service operations are businesses that must comply with many more rules than those in the private…
ERIC Educational Resources Information Center
Beal, Christine
1992-01-01
Describes typical differences in conversational routines in French and Australian English and kinds of tensions arising when speakers with two different sets of rules come into contact. Even simple questions contain a variety of assumptions ranging from whom it is suitable to ask to the kind of answer or the amount of detail that is expected. (13…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-13
... Commission is publishing this notice to solicit comments on the proposed rule change from interested persons... that the credit amounts in the Exchange's VIP for simple orders will not change as a result of the new... [flattened VIP tier table omitted: tier, monthly volume thresholds, and per-contract credit amounts for simple and complex orders]
Yago, Martín
2017-05-01
QC planning based on risk management concepts can reduce the probability of harming patients due to an undetected out-of-control error condition. It does this by selecting appropriate QC procedures to decrease the number of erroneous results reported. The selection can be easily made by using published nomograms for simple QC rules when the out-of-control condition results in increased systematic error. However, increases in random error also occur frequently and are difficult to detect, which can result in erroneously reported patient results. A statistical model was used to construct charts for the 1ks and X̄/χ² rules. The charts relate the increase in the number of unacceptable patient results reported due to an increase in random error to the capability of the measurement procedure. They thus allow for QC planning based on the risk of patient harm due to the reporting of erroneous results. 1ks rules are simple, all-around rules. Their ability to deal with increases in within-run imprecision is minimally affected by the possible presence of significant, stable, between-run imprecision. X̄/χ² rules perform better when the number of controls analyzed during each QC event is increased to improve QC performance. Using nomograms simplifies the selection of statistical QC procedures to limit the number of erroneous patient results reported due to an increase in analytical random error. The selection largely depends on the presence or absence of stable between-run imprecision. © 2017 American Association for Clinical Chemistry.
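A 1ks QC rule of the kind charted in this study rejects an analytical run when any single control result falls outside mean ± k·SD. A minimal sketch, with a hypothetical control target, SD, and k:

```python
def one_ks_reject(control_results, mean, sd, k):
    """1ks QC rule: reject the run if any single control result falls
    outside the interval mean +/- k*sd."""
    return any(abs(x - mean) > k * sd for x in control_results)

# Hypothetical control material: target 100, SD 2, with k = 3 (a "1-3s" rule)
print(one_ks_reject([99.1, 101.4], mean=100, sd=2, k=3))  # False: in control
print(one_ks_reject([99.1, 107.2], mean=100, sd=2, k=3))  # True: 107.2 > 106
```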
Competitive STDP Learning of Overlapping Spatial Patterns.
Krunglevicius, Dalius
2015-08-01
Spike-timing-dependent plasticity (STDP) is a set of Hebbian learning rules firmly based on biological evidence. It has been demonstrated that one of the STDP learning rules is suited for learning spatiotemporal patterns. When multiple neurons are organized in a simple competitive spiking neural network, this network is capable of learning multiple distinct patterns. If patterns overlap significantly (i.e., patterns are mutually inclusive), however, competition does not preclude a trained neuron from responding to a new pattern and adjusting its synaptic weights accordingly. This letter presents a simple neural network that combines vertical inhibition and a Euclidean distance-dependent synaptic strength factor. This approach helps to solve the problem of pattern-size-dependent parameter optimality and significantly reduces the probability of a neuron's forgetting an already learned pattern. For demonstration purposes, the network was trained on the first ten letters of the Braille alphabet.
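The pair-based STDP weight update underlying such networks is commonly written as an exponential function of the pre/post spike-time difference; a minimal sketch, with hypothetical amplitudes and time constants (not the letter's tuned parameters):

```python
import math

def stdp_dw(delta_t, a_plus=0.1, a_minus=0.12, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP window. delta_t = t_post - t_pre in ms.
    Pre-before-post (delta_t > 0) potentiates the synapse;
    post-before-pre (delta_t < 0) depresses it."""
    if delta_t > 0:
        return a_plus * math.exp(-delta_t / tau_plus)
    return -a_minus * math.exp(delta_t / tau_minus)

print(stdp_dw(10.0) > 0)   # True: potentiation
print(stdp_dw(-10.0) < 0)  # True: depression
```

The magnitude of the change decays as the spikes move further apart in time, which is what makes the rule sensitive to spatiotemporal structure.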
Big Bang Day : The Great Big Particle Adventure - 3. Origins
None
2017-12-09
In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time: the fabric of the Universe.
SYSTEMATIZATION OF MASS LEVELS OF PARTICLES AND RESONANCES ON HEURISTIC BASIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takabayasi, T.
1963-12-16
Once more, a scheme of simple mass rules and formulas for particles and resonant levels is investigated and organized, based on some general hypotheses. The essential ingredients in the scheme are, on one hand, the equal-interval rule governing the isosinglet meson series, associated with a particularly simple mass ratio between the 2++ level f and the 0++ level ABC, and on the other, a new basic mass formula that unifies some of the meson and baryon levels. The whole set of baryon levels is arranged in a table analogous to the periodic table, and then correspondences between different series and the equivalence between spin and hypercharge, when properly applied, fix the whole baryon mass spectrum in good agreement with observations. Connections with the scheme of mass formulas formerly given are also shown. (auth)
Can simple rules control development of a pioneer vertebrate neuronal network generating behavior?
Roberts, Alan; Conte, Deborah; Hull, Mike; Merrison-Hort, Robert; al Azad, Abul Kalam; Buhl, Edgar; Borisyuk, Roman; Soffe, Stephen R
2014-01-08
How do the pioneer networks in the axial core of the vertebrate nervous system first develop? Fundamental to understanding any full-scale neuronal network is knowledge of the constituent neurons, their properties, synaptic interconnections, and normal activity. Our novel strategy uses basic developmental rules to generate model networks that retain individual neuron and synapse resolution and are capable of reproducing correct, whole animal responses. We apply our developmental strategy to young Xenopus tadpoles, whose brainstem and spinal cord share a core vertebrate plan, but at a tractable complexity. Following detailed anatomical and physiological measurements to complete a descriptive library of each type of spinal neuron, we build models of their axon growth controlled by simple chemical gradients and physical barriers. By adding dendrites and allowing probabilistic formation of synaptic connections, we reconstruct network connectivity among up to 2000 neurons. When the resulting "network" is populated by model neurons and synapses, with properties based on physiology, it can respond to sensory stimulation by mimicking tadpole swimming behavior. This functioning model represents the most complete reconstruction of a vertebrate neuronal network that can reproduce the complex, rhythmic behavior of a whole animal. The findings validate our novel developmental strategy for generating realistic networks with individual neuron- and synapse-level resolution. We use it to demonstrate how early functional neuronal connectivity and behavior may in life result from simple developmental "rules," which lay out a scaffold for the vertebrate CNS without specific neuron-to-neuron recognition.
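The probabilistic synapse-formation step described above can be illustrated with a minimal sketch. The grid-overlap measure, the contact probability, and the neuron names are hypothetical stand-ins for the paper's anatomically grounded growth rules:

```python
import random

def form_synapses(axons, dendrites, p_contact, rng):
    """Probabilistic connectivity: form a synapse wherever an axon path
    crosses a dendrite zone, with probability p_contact per crossing.
    axons/dendrites map neuron id -> set of occupied grid positions."""
    synapses = []
    for pre, axon_path in axons.items():
        for post, dendrite_zone in dendrites.items():
            if pre == post:
                continue  # no self-connections in this sketch
            crossings = axon_path & dendrite_zone
            for site in sorted(crossings):
                if rng.random() < p_contact:
                    synapses.append((pre, post, site))
    return synapses

rng = random.Random(0)  # seeded for reproducibility
axons = {"dIN1": {(1, 2), (2, 2), (3, 2)}, "cIN1": {(2, 2), (2, 3)}}
dendrites = {"dIN1": {(2, 3)}, "mn1": {(2, 2), (3, 2)}}
print(form_synapses(axons, dendrites, p_contact=0.5, rng=rng))
```

As in the paper's strategy, stochastic contact decisions at the individual-synapse level produce a network whose population-level connectivity statistics are stereotyped.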
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lincoln, Don
2015-01-20
Albert Einstein said that what he wanted to know was “God’s thoughts,” which is a metaphor for the ultimate and most basic rules of the universe. Once known, all other phenomena would then be a consequence of these simple rules. While modern science is far from that goal, we have some thoughts on how this inquiry might unfold. In this video, Fermilab’s Dr. Don Lincoln tells what we know about GUTs (grand unified theories) and TOEs (theories of everything).
Objective estimates based on experimental data and initial and final knowledge
NASA Technical Reports Server (NTRS)
Rosenbaum, B. M.
1972-01-01
An extension of the method of Jaynes, whereby least biased probability estimates are obtained, permits such estimates to be made which account for experimental data on hand as well as prior and posterior knowledge. These estimates can be made for both discrete and continuous sample spaces. The method allows a simple interpretation of Laplace's two rules: the principle of insufficient reason and the rule of succession. Several examples are analyzed by way of illustration.
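Laplace's rule of succession, mentioned above, estimates the probability of a further success after s successes in n trials as (s + 1)/(n + 2); with no data it reduces to 1/2, the principle of insufficient reason. A quick check:

```python
from fractions import Fraction

def rule_of_succession(successes, trials):
    """Laplace's rule of succession: P(next success) = (s + 1) / (n + 2)."""
    return Fraction(successes + 1, trials + 2)

# With no data (n = 0) the least biased estimate is 1/2
print(rule_of_succession(0, 0))   # 1/2
# After 9 successes in 10 trials
print(rule_of_succession(9, 10))  # 5/6
```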
NASA Astrophysics Data System (ADS)
Ahmadianfar, Iman; Adib, Arash; Taghian, Mehrdad
2017-10-01
The reservoir hedging rule curves are used to avoid severe water shortage during drought periods. In this method, reservoir storage is divided into several zones, wherein the rationing factors change immediately when the water storage level moves from one zone to another. In the present study, a hedging rule with fuzzy rationing factors was applied to create a transition zone above and below each rule curve, within which the rationing factor changes gradually. For this purpose, a monthly simulation model was developed and linked to the non-dominated sorting genetic algorithm to calculate the modified shortage index for two objective functions, involving water supply for minimum flow and agricultural demands, over a long-term simulation period. The Zohre multi-reservoir system in southern Iran was considered as a case study. The results of the proposed hedging rule improved long-term system performance by 10 to 27 percent in comparison with the simple hedging rule, demonstrating that the fuzzification of hedging factors increases the applicability and efficiency of the new hedging rule relative to the conventional rule curve for mitigating the water shortage problem.
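The gradual change of the rationing factor inside such a fuzzy transition zone can be sketched as a simple interpolation between the factors of adjacent zones; the zone boundary, half-width, and factor values below are hypothetical, not the study's calibrated parameters:

```python
def rationing_factor(storage, zone_boundary, half_width, f_below, f_above):
    """Smoothed hedging rule: instead of switching the rationing factor
    abruptly at zone_boundary, interpolate linearly inside a transition
    zone of +/- half_width around it."""
    lo, hi = zone_boundary - half_width, zone_boundary + half_width
    if storage <= lo:
        return f_below
    if storage >= hi:
        return f_above
    t = (storage - lo) / (hi - lo)  # 0 at lo, 1 at hi
    return f_below + t * (f_above - f_below)

# Hypothetical zone: boundary at 60 Mm3 storage, +/- 10 Mm3 transition
print(rationing_factor(50, 60, 10, 0.5, 1.0))  # 0.5 (full rationing)
print(rationing_factor(60, 60, 10, 0.5, 1.0))  # 0.75 (mid-transition)
print(rationing_factor(75, 60, 10, 0.5, 1.0))  # 1.0 (full supply)
```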
Using Container Structures in Architecture and Urban Design
NASA Astrophysics Data System (ADS)
Grębowski, Karol; Kałdunek, Daniel
2017-10-01
The paper presents the use of shipping containers in architecture and urban design. Even today, houses and apartments are still too expensive. Since 1923, architects have been improving the living conditions of citizens by building very simple, repeatable forms. With prefabrication technology it became possible to build more quickly, causing house prices to decrease. Apartments in blocks of flats became affordable to more and more people. Modernism had a great impact on the quality of living spaces, despite the detrimental effect of large-panel technology on social life. It gave people their own bathrooms and gifted them with simple solutions we now consider indispensable. The ambition to build cheaply but effectively is still here. The future of housing lies in prefabricated apartment modules. A well-optimized creation process is the key, but taking into consideration the mistakes made by past generations should be the second most important factor. Studies show that large-panel buildings were too monumental and solid for a housing structure, and offered no public spaces between them. Lack of urban design transformed a great idea into blocks that are considered to be ugly and unfriendly. Diversity is something that large-panel structures were missing. While most blocks of flats were being constructed out of the same module (Model 770), differentiated architecture was difficult to achieve. Nowadays, increasing numbers of shipping containers are being used for housing purposes. These constructions show that it is possible to create astonishing housing with modules. Shipping containers were not designed to be a building material, but in contrast to large-panel modules, there are many more possibilities for their transformation. In this paper the authors propose a set of rules that, if followed, would result in cheaper apartments, while keeping in consideration both tremendous architecture and friendly urban design. What is more, the proposed solution is designed to adapt to personalized requirements. In this paper the authors include information about design guidelines for structures made from shipping containers.
Newgreen, Donald F; Dufour, Sylvie; Howard, Marthe J; Landman, Kerry A
2013-10-01
We review morphogenesis of the enteric nervous system from migratory neural crest cells, and defects of this process such as Hirschsprung disease, centering on cell motility and assembly, and cell adhesion and extracellular matrix molecules, along with cell proliferation and growth factors. We then review continuum and agent-based (cellular automata) models with rules of cell movement and logistical proliferation. Both movement and proliferation at the individual cell level are modeled with stochastic components from which stereotyped outcomes emerge at the population level. These models reproduced the wave-like colonization of the intestine by enteric neural crest cells, and several new properties emerged, such as colonization by frontal expansion, which were later confirmed biologically. These models predict a surprising level of clonal heterogeneity both in terms of number and distribution of daughter cells. Biologically, migrating cells form stable chains made up of unstable cells, but this is not seen in the initial model. We outline additional rules for cell differentiation into neurons, axon extension, cell-axon and cell-cell adhesions, chemotaxis and repulsion which can reproduce chain migration. After the migration stage, the cells re-arrange as a network of ganglia. Changes in cell adhesion molecules parallel this, and we describe additional rules based on Steinberg's Differential Adhesion Hypothesis, reflecting changing levels of adhesion in neural crest cells and neurons. This was able to reproduce enteric ganglionation in a model. Mouse mutants with disturbances of enteric nervous system morphogenesis are discussed, and these suggest future refinement of the models. The modeling suggests a relatively simple set of cell behavioral rules could account for complex patterns of morphogenesis. The model has allowed the proposal that Hirschsprung disease is mostly an enteric neural crest cell proliferation defect, not a defect of cell migration. 
In addition, the model suggests an explanation for the zonal and skip-segment variants of Hirschsprung disease, and also gives a novel stochastic explanation for the observed discordancy of Hirschsprung disease in identical twins. © 2013 Elsevier Inc. All rights reserved.
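The wave-like colonization the review describes can be caricatured in a few lines of code. The sketch below is a hypothetical one-dimensional cellular-automaton reading of the reviewed models, not the published implementations: lattice size, carrying capacity, and probabilities are invented illustration values. Cells hop randomly along the gut and proliferate in a logistic-like way wherever local density is below capacity, and a colonization front emerges at the population level.

```python
import random

def colonize_step(counts, capacity, p_move, p_divide, rng):
    """One step of a toy agent-based model of enteric neural crest cell
    colonization: counts[i] is the number of cells at gut site i. Cells
    move randomly and proliferate only where density is below the
    carrying capacity (logistic-style growth). Illustrative only."""
    new = list(counts)
    for i, n in enumerate(counts):
        for _ in range(n):
            if rng.random() < p_move:                 # random walk along the gut
                j = max(0, min(len(counts) - 1, i + rng.choice([-1, 1])))
                if j != i and new[j] < capacity and new[i] > 0:
                    new[i] -= 1
                    new[j] += 1
            if rng.random() < p_divide and new[i] < capacity:
                new[i] += 1                           # proliferate if there is room
    return new

rng = random.Random(1)
counts = [8] + [0] * 19          # cells seeded at the oral end of a 20-site gut
for _ in range(200):
    counts = colonize_step(counts, capacity=8, p_move=0.5, p_divide=0.2, rng=rng)
assert sum(counts) > 8           # proliferation occurred
assert max(counts) <= 8          # density is capped by the carrying capacity
```

With proliferation switched off (`p_divide=0`), the total cell count is fixed and the front stalls, which is the intuition behind the proposal that Hirschsprung disease is primarily a proliferation defect.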
Load Capacity Estimation of Foil Air Journal Bearings for Oil-Free Turbomachinery Applications
NASA Technical Reports Server (NTRS)
DellaCorte, Christopher; Valco, Mark J.
2000-01-01
This paper introduces a simple "Rule of Thumb" (ROT) method to estimate the load capacity of foil air journal bearings, which are self-acting compliant-surface hydrodynamic bearings being considered for Oil-Free turbo-machinery applications such as gas turbine engines. The ROT is based on first principles and data available in the literature and it relates bearing load capacity to the bearing size and speed through an empirically based load capacity coefficient, D. It is shown that load capacity is a linear function of bearing surface velocity and bearing projected area. Furthermore, it was found that the load capacity coefficient, D, is related to the design features of the bearing compliant members and operating conditions (speed and ambient temperature). Early bearing designs with basic or "first generation" compliant support elements have relatively low load capacity. More advanced bearings, in which the compliance of the support structure is tailored, have load capacities up to five times those of simpler designs. The ROT enables simplified load capacity estimation for foil air journal bearings and can guide development of new Oil-Free turbomachinery systems.
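The structure of the ROT estimate (load capacity proportional to projected area times surface-velocity term, scaled by an empirical coefficient) can be written directly. The coefficient values below are illustrative placeholders chosen only to echo the "up to five times" comparison, not the paper's measured values.

```python
def foil_bearing_load_capacity(length_in, diameter_in, speed_krpm, load_coeff):
    """Rule-of-thumb load capacity: W = D * (L * Dia) * (Dia * N).

    length_in, diameter_in : bearing length and shaft diameter
    speed_krpm             : shaft speed in thousands of rpm
    load_coeff             : empirically based coefficient D; larger for
                             advanced ("later generation") compliant structures
    Returns the estimated maximum steady load.
    """
    projected_area = length_in * diameter_in        # L * Dia
    speed_term = diameter_in * speed_krpm           # proportional to surface velocity
    return load_coeff * projected_area * speed_term

# Illustrative comparison with assumed coefficients:
simple = foil_bearing_load_capacity(1.0, 1.0, 30.0, load_coeff=0.3)
advanced = foil_bearing_load_capacity(1.0, 1.0, 30.0, load_coeff=1.5)
assert abs(advanced - 5 * simple) < 1e-9   # "up to five times" the simpler design
```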
Essock, Susan M; Drake, Robert E; Frank, Richard G; McGuire, Thomas G
2003-01-01
The purpose of clinical research is to answer this question: Would a new treatment, when added to the existing range of treatment options available in practice, help patients? Randomized controlled trials (RCTs)--in particular, double-blind RCTs--have important methodological advantages over observational studies for addressing this question. These advantages, however, come at a price. RCTs compare treatments using a particular allocation rule for assigning patients to treatments (random assignment) that does not mimic real-world practice. "Favorable" results from an RCT indicating that a new treatment is superior to existing treatments are neither necessary nor sufficient for establishing a "yes" answer to the question posed above. Modeled on an experimental design, RCTs are expensive in time and money and must compare simple differences in treatments. Findings have a high internal validity but may not address the needs of the field, particularly where treatment is complex and rapidly evolving. Design of clinical research needs to take account of the way treatments are allocated in actual practice and include flexible designs to answer important questions most effectively.
A novel resource sharing algorithm based on distributed construction for radiant enclosure problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finzell, Peter; Bryden, Kenneth M.
2017-03-06
This study demonstrates a novel approach to solving inverse radiant enclosure problems based on distributed construction. Specifically, the problem of determining the temperature distribution needed on the heater surfaces to achieve a desired design surface temperature profile is recast as a distributed construction problem in which a shared resource, temperature, is distributed by computational agents moving blocks. The sharing of blocks between agents enables them to achieve their desired local state, which in turn achieves the desired global state. Each agent uses the current state of its local environment and a simple set of rules to determine when to exchange blocks, each block representing a discrete unit of temperature change. This algorithm is demonstrated using the established two-dimensional inverse radiation enclosure problem. The temperature profile on the heater surfaces is adjusted to achieve a desired temperature profile on the design surfaces. The resource sharing algorithm was able to determine the needed temperatures on the heater surfaces to obtain the desired temperature distribution on the design surfaces in the nine cases examined.
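A toy version of the local exchange idea can be sketched as follows. This is an illustrative rule on a one-dimensional chain of agents, not the authors' actual agent logic: each agent compares its own error (blocks held minus blocks wanted) with a neighbor's and passes one block toward the larger deficit.

```python
def share_blocks(levels, targets, max_steps=10_000):
    """Toy sketch of distributed resource sharing: each agent holds an
    integer number of 'blocks' (discrete units of temperature change)
    and uses only local comparisons with its neighbors to decide when
    to pass a block along. Illustrative rule, not the paper's."""
    levels = list(levels)
    for _ in range(max_steps):
        if levels == list(targets):
            break
        for i in range(len(levels) - 1):
            err_here = levels[i] - targets[i]
            err_next = levels[i + 1] - targets[i + 1]
            if err_here > err_next:        # pass a block toward the larger deficit
                levels[i] -= 1
                levels[i + 1] += 1
            elif err_next > err_here:
                levels[i] += 1
                levels[i + 1] -= 1
    return levels

# Blocks are conserved, so the targets must sum to the initial total.
assert share_blocks([6, 0, 0], [2, 2, 2]) == [2, 2, 2]
```

Because only block exchanges occur, the shared resource is conserved globally while each agent acts on purely local information, which is the essence of the distributed-construction recasting.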
FPGA-based firmware model for extended measurement systems with data quality monitoring
NASA Astrophysics Data System (ADS)
Wojenski, A.; Pozniak, K. T.; Mazon, D.; Chernyshova, M.
2017-08-01
Modern physics experiments require the construction of advanced, modular measurement systems for data processing and registration. Components are often designed in one of the common mechanical and electrical standards, e.g., VME or uTCA. The paper focuses on measurement systems that use FPGAs as data processing blocks, especially for plasma diagnostics using GEM detectors with data quality monitoring. The article proposes a standardized model of HDL FPGA firmware for use in a wide range of measurement systems, with emphasis on flexible implementation of data quality monitoring along with dynamic selection of source data. A standard measurement system model is discussed, followed by a detailed FPGA firmware model for modular measurement systems, covering both functional blocks and data buses. The summary describes the necessary blocks and signal lines. Firmware implemented according to the presented rules should yield a modular design in which individual parts can easily be changed. The key benefit is a universal, modular HDL design that can be applied to different measurement systems with simple adjustments.
Robust Strategy for Rocket Engine Health Monitoring
NASA Technical Reports Server (NTRS)
Santi, L. Michael
2001-01-01
Monitoring the health of rocket engine systems is essentially a two-phase process. The acquisition phase involves sensing physical conditions at selected locations, converting physical inputs to electrical signals, conditioning the signals as appropriate to establish scale or filter interference, and recording results in a form that is easy to interpret. The inference phase involves analysis of results from the acquisition phase, comparison of analysis results to established health measures, and assessment of health indications. A variety of analytical tools may be employed in the inference phase of health monitoring. These tools can be separated into three broad categories: statistical, rule based, and model based. Statistical methods can provide excellent comparative measures of engine operating health. They require well-characterized data from an ensemble of "typical" engines, or "golden" data from a specific test assumed to define the operating norm in order to establish reliable comparative measures. Statistical methods are generally suitable for real-time health monitoring because they do not deal with the physical complexities of engine operation. The utility of statistical methods in rocket engine health monitoring is hindered by practical limits on the quantity and quality of available data. This is due to the difficulty and high cost of data acquisition, the limited number of available test engines, and the problem of simulating flight conditions in ground test facilities. In addition, statistical methods incur a penalty for disregarding flow complexity and are therefore limited in their ability to define performance shift causality. Rule based methods infer the health state of the engine system based on comparison of individual measurements or combinations of measurements with defined health norms or rules. This does not mean that rule based methods are necessarily simple. 
Although binary yes-no health assessment can sometimes be established by relatively simple rules, the causality assignment needed for refined health monitoring often requires an exceptionally complex rule base involving complicated logical maps. Structuring the rule system to be clear and unambiguous can be difficult, and the expert input required to maintain a large logic network and associated rule base can be prohibitive.
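The binary yes-no layer of such a rule base can be as simple as named range checks, with causality assignment layered on top. The sensor names and limits below are invented for illustration; they are not drawn from any actual engine specification.

```python
def check_rules(measurements, rules):
    """Toy rule-based health check: each rule names a sensor and an
    allowed range; violations are reported by rule name, which is the
    first small step toward causality assignment. Illustrative only."""
    violations = []
    for name, (sensor, low, high) in rules.items():
        if not (low <= measurements[sensor] <= high):
            violations.append(name)
    return violations

# Hypothetical rule base (sensor names and limits are invented):
rules = {
    "chamber_pressure_band": ("pc", 95.0, 105.0),
    "turbine_temp_limit": ("tt", 0.0, 900.0),
}
assert check_rules({"pc": 100.0, "tt": 850.0}, rules) == []
assert check_rules({"pc": 120.0, "tt": 950.0}, rules) == [
    "chamber_pressure_band", "turbine_temp_limit"]
```

Real rule bases, as the text notes, go far beyond this: combinations of measurements, logical maps among rules, and expert-maintained causality chains.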
Approach to design neural cryptography: a generalized architecture and a heuristic rule.
Mu, Nankun; Liao, Xiaofeng; Huang, Tingwen
2013-06-01
Neural cryptography, a type of public key exchange protocol, is widely considered an effective method for sharing a common secret key between two neural networks over public channels. How to design neural cryptography remains a great challenge. In this paper, to provide an approach to this challenge, a generalized network architecture and a significant heuristic rule are designed. The proposed generic framework, named the tree state classification machine (TSCM), extends and unifies the existing structures, i.e., the tree parity machine (TPM) and the tree committee machine (TCM). Furthermore, we show that the heuristic rule can improve the security of TSCM-based neural cryptography. TSCM and the heuristic rule can therefore guide the design of a wide range of effective neural cryptography candidates, among which more secure instances can be found. Significantly, in light of TSCM and the heuristic rule, we further show that our designed neural cryptography outperforms TPM (the most secure model at present) in security. Finally, a series of numerical simulation experiments verifies the validity and applicability of our results.
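For context, the TPM structure that TSCM generalizes can be sketched in a few lines. Parameters K, N, L below are illustrative, and the demo only shows the mechanics that synchronization relies on: two machines with identical weights produce identical outputs and therefore stay in lockstep under the Hebbian rule. It is not a full key-exchange run and not the authors' TSCM.

```python
import numpy as np

class TreeParityMachine:
    """Minimal tree parity machine sketch: K hidden units, N inputs per
    unit, integer weights bounded by L. Partners exchange only their
    +/-1 outputs and update Hebbian-style when the outputs agree."""

    def __init__(self, K=3, N=4, L=3, rng=None):
        self.K, self.N, self.L = K, N, L
        rng = rng or np.random.default_rng()
        self.w = rng.integers(-L, L + 1, size=(K, N))

    def output(self, x):
        # sigma_k = sign of each hidden unit's local field; sign(0) -> -1
        self.sigma = np.where((self.w * x).sum(axis=1) > 0, 1, -1)
        return int(self.sigma.prod())

    def hebbian(self, x, tau):
        # update only units that agreed with the total output; clip to [-L, L]
        for k in range(self.K):
            if self.sigma[k] == tau:
                self.w[k] = np.clip(self.w[k] + tau * x[k], -self.L, self.L)

rng = np.random.default_rng(0)
a, b = TreeParityMachine(rng=rng), TreeParityMachine(rng=rng)
b.w = a.w.copy()                      # an already-synchronized pair ...
for _ in range(50):
    x = rng.choice([-1, 1], size=(a.K, a.N))
    ta, tb = a.output(x), b.output(x)
    assert ta == tb                   # ... always agrees ...
    a.hebbian(x, ta)
    b.hebbian(x, tb)
assert (a.w == b.w).all()             # ... and stays identical under updates
```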
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-24
... Equities Rule 104 To Adopt Pricing Obligations for Designated Market Makers September 20, 2010. Pursuant to... pricing obligations for Designated Market Makers (``DMMs''). The text of the proposed rule change is... adopt pricing obligations for DMMs. Under the proposal, the Exchange will require DMMs to continuously...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-07
... Organizations; International Securities Exchange, LLC; Notice of Designation of a Longer Period for Commission Action on Proposed Rule Change To List and Trade Option Contracts Overlying 10 Shares of a Security June... Act of 1934 (``Act'') \\1\\ and Rule 19b-4 thereunder,\\2\\ a proposed rule change to list and trade...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-25
...-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Designation of Longer Period for Commission Action on Proceedings To Determine Whether To Approve or Disapprove Proposed Rule Change To Link Market... Rule 19b-4 thereunder,\\2\\ a proposed rule change to discount certain market data fees and increase...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-15
...; Proposed Amendments to Rule G-8, on Books and Records, Rule G- 9, on Record Retention, and Rule G-18, on... of proposed MSRB Rule G-43, on broker's brokers; amendments to MSRB Rule G-8, on books and records...
A product of independent beta probabilities dose escalation design for dual-agent phase I trials.
Mander, Adrian P; Sweeting, Michael J
2015-04-15
Dual-agent trials are now increasingly common in oncology research, and many proposed dose-escalation designs are available in the statistical literature. Despite this, the translation from statistical design to practical application is slow, as has been highlighted in single-agent phase I trials, where a 3 + 3 rule-based design is often still used. To expedite this process, new dose-escalation designs need to be not only scientifically beneficial but also easy to understand and implement by clinicians. In this paper, we propose a curve-free (nonparametric) design for a dual-agent trial in which the model parameters are the probabilities of toxicity at each of the dose combinations. We show that it is relatively trivial for a clinician's prior beliefs or historical information to be incorporated in the model, and that updating is fast and computationally simple through the use of conjugate Bayesian inference. Monotonicity is ensured by considering only a set of monotonic contours for the distribution of the maximum tolerated contour, which defines the dose-escalation decision process. Varied experimentation around the contour is achievable, and multiple dose combinations can be recommended to take forward to phase II. Code for R, Stata, and Excel is available for implementation. © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
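For a single dose combination, the conjugate updating the authors exploit reduces to simple Beta arithmetic. The prior and counts below are invented for illustration; the paper's full design additionally enforces monotonicity across the dose grid via monotonic contours.

```python
def beta_update(a, b, n_toxicities, n_patients):
    """Conjugate Bayesian update for one dose combination's toxicity
    probability: Beta(a, b) prior plus binomial data gives a Beta
    posterior with the counts simply added in."""
    return a + n_toxicities, b + (n_patients - n_toxicities)

def posterior_mean(a, b):
    return a / (a + b)

# Weakly informative Beta(1, 1) prior, then 1 toxicity in 6 patients:
a, b = beta_update(1, 1, n_toxicities=1, n_patients=6)
assert (a, b) == (2, 6)
assert posterior_mean(a, b) == 0.25
```

This is why updating is "fast and computationally simple": no sampling or optimization is needed, only addition.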
Nguyen, Su; Zhang, Mengjie; Tan, Kay Chen
2017-09-01
Automated design of dispatching rules for production systems has been an active research topic over the last several years. Machine learning, especially genetic programming (GP), has been a powerful approach to this design problem. However, intensive computational requirements, accuracy, and interpretability are still its limitations. This paper aims at developing a new surrogate-assisted GP to help improve the quality of the evolved rules without significant computational cost. The experiments verify the effectiveness and efficiency of the proposed algorithms compared to those in the literature. Furthermore, new simplification and visualisation approaches have also been developed to improve the interpretability of the evolved rules. These approaches show great potential and prove to be a critical part of the automated design system.
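A GP system of this kind evaluates each candidate rule by plugging it into a shop simulation as a priority function; the evolved program plays the role of `priority` below. This single-machine sketch is ours and far simpler than the production systems studied in the paper.

```python
def simulate_one_machine(jobs, priority):
    """Evaluate a dispatching rule on a one-machine queue: whenever the
    machine frees up, start the released job with the lowest priority
    value. Jobs are (release, processing) pairs; returns total flowtime,
    the kind of fitness a GP-evolved rule would receive."""
    pending, t, flowtime = sorted(jobs), 0, 0
    while pending:
        ready = [j for j in pending if j[0] <= t] or [pending[0]]
        t = max(t, ready[0][0])            # idle until the next release if needed
        job = min(ready, key=lambda j: priority(t, j))
        pending.remove(job)
        t += job[1]
        flowtime += t - job[0]
    return flowtime

jobs = [(0, 5), (0, 2), (0, 8)]
spt = simulate_one_machine(jobs, priority=lambda t, j: j[1])    # shortest first
lpt = simulate_one_machine(jobs, priority=lambda t, j: -j[1])   # longest first
assert spt == 24 and spt < lpt   # SPT minimizes flowtime on this instance
```

A surrogate, in this setting, replaces the expensive full simulation with a cheaper approximation when ranking candidate rules.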
75 FR 47063 - Mutual Fund Distribution Fees; Confirmations
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-04
... competition for distribution services. The proposed rule and rule amendments are designed to protect... designed to enhance investor understanding of those charges, limit the cumulative sales charges each...(b) was designed to protect funds from being charged excessive sales and promotional expenses.\\26...
Highly scalable and robust rule learner: performance evaluation and comparison.
Kurgan, Lukasz A; Cios, Krzysztof J; Dick, Scott
2006-02-01
Business intelligence and bioinformatics applications increasingly require the mining of datasets consisting of millions of data points, or crafting real-time enterprise-level decision support systems for large corporations and drug companies. In all cases, there needs to be an underlying data mining system, and this mining system must be highly scalable. To this end, we describe a new rule learner called DataSqueezer. The learner belongs to the family of inductive supervised rule extraction algorithms. DataSqueezer is a simple, greedy, rule builder that generates a set of production rules from labeled input data. In spite of its relative simplicity, DataSqueezer is a very effective learner. The rules generated by the algorithm are compact, comprehensible, and have accuracy comparable to rules generated by other state-of-the-art rule extraction algorithms. The main advantages of DataSqueezer are very high efficiency, and missing data resistance. DataSqueezer exhibits log-linear asymptotic complexity with the number of training examples, and it is faster than other state-of-the-art rule learners. The learner is also robust to large quantities of missing data, as verified by extensive experimental comparison with the other learners. DataSqueezer is thus well suited to modern data mining and business intelligence tasks, which commonly involve huge datasets with a large fraction of missing data.
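To make "greedy rule builder" concrete, here is a generic greedy sequential-covering sketch in the same family. It is ours, not DataSqueezer itself, and it omits DataSqueezer's distinguishing strengths (missing-data resistance, log-linear scalability); it also assumes the classes are separable by attribute-value tests.

```python
def learn_rules(examples, label):
    """Toy greedy covering learner: repeatedly build one conjunctive
    rule (attribute=value tests) that excludes all negatives, then
    remove the positives it covers. Illustrative only."""
    pos = [e for e in examples if e["class"] == label]
    neg = [e for e in examples if e["class"] != label]
    rules = []
    while pos:
        rule, covered, remaining_neg = {}, list(pos), list(neg)
        while remaining_neg:
            # greedily pick the test keeping most positives, then fewest negatives
            attr, val = max(
                ((a, v) for e in covered for a, v in e.items()
                 if a != "class" and a not in rule),
                key=lambda t: (sum(e.get(t[0]) == t[1] for e in covered),
                               -sum(e.get(t[0]) == t[1] for e in remaining_neg)),
            )
            rule[attr] = val
            covered = [e for e in covered if e.get(attr) == val]
            remaining_neg = [e for e in remaining_neg if e.get(attr) == val]
        rules.append(rule)
        pos = [e for e in pos if not all(e.get(a) == v for a, v in rule.items())]
    return rules

data = [
    {"outlook": "sunny", "windy": "no",  "class": "play"},
    {"outlook": "sunny", "windy": "yes", "class": "play"},
    {"outlook": "rain",  "windy": "yes", "class": "stay"},
    {"outlook": "rain",  "windy": "no",  "class": "stay"},
]
assert learn_rules(data, "play") == [{"outlook": "sunny"}]
```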
Clinical decision rules for termination of resuscitation in out-of-hospital cardiac arrest.
Sherbino, Jonathan; Keim, Samuel M; Davis, Daniel P
2010-01-01
Out-of-hospital cardiac arrest (OHCA) has a low probability of survival to hospital discharge. Four clinical decision rules (CDRs) have been validated to identify patients with no probability of survival. Three of these rules focus on exclusive prehospital basic life support care for OHCA, and two of these rules focus on prehospital advanced life support care for OHCA. Can a CDR for the termination of resuscitation identify a patient with no probability of survival in the setting of OHCA? Six validation studies were selected from a PubMed search. A structured review of each of the studies is presented. In OHCA receiving basic life support care, the BLS-TOR (basic life support termination of resuscitation) rule has a positive predictive value for death of 99.5% (95% confidence interval 98.9-99.8%), and decreases the transportation of all patients by 62.6%. This rule has been appropriately validated for widespread use. In OHCA receiving advanced life support care, no current rule has been appropriately validated for widespread use. The BLS-TOR rule is a simple rule that identifies patients who will not survive OHCA. Further research is required to identify similarly robust CDRs for patients receiving advanced life support care in the setting of OHCA. Copyright 2010 Elsevier Inc. All rights reserved.
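As commonly stated, the BLS-TOR rule supports termination only when three bedside criteria all hold: the arrest was not witnessed by EMS personnel, no defibrillation shock was delivered, and there was no return of spontaneous circulation. A direct encoding (field names are ours):

```python
def bls_tor_recommends_termination(ems_witnessed, shock_delivered, rosc_achieved):
    """BLS termination-of-resuscitation (BLS-TOR) rule as commonly
    stated: termination is supported only when the arrest was not
    EMS-witnessed, no shock was delivered, and no ROSC occurred."""
    return (not ems_witnessed) and (not shock_delivered) and (not rosc_achieved)

assert bls_tor_recommends_termination(False, False, False) is True
# Any single favorable sign means continued resuscitation and transport:
assert bls_tor_recommends_termination(False, True, False) is False
assert bls_tor_recommends_termination(True, False, False) is False
```

The simplicity of the conjunction is the point: it can be applied in the field without computation, while still achieving the high positive predictive value reported above.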
Revealing the spectral response of a plasmonic lens using low-energy electrons
NASA Astrophysics Data System (ADS)
Cao, Shuiyan; Le Moal, Eric; Bigourdan, Florian; Hugonin, Jean-Paul; Greffet, Jean-Jacques; Drezet, Aurélien; Huant, Serge; Dujardin, Gérald; Boer-Duchemin, Elizabeth
2017-09-01
Plasmonic lenses, even of simple design, may have intricate spectral behavior. The spectral response of a plasmonic lens to a local, broadband excitation has rarely been studied despite its central importance in future applications. Here we use the unique combination of scanning tunneling microscopy (STM) and angle-resolved optical spectroscopy to probe the spectral response of a plasmonic lens. Such a lens consists of a series of concentric circular slits etched in a thick gold film. Spectrally broad, circular surface plasmon polariton (SPP) waves are electrically launched from the STM tip at the plasmonic lens center, and these waves scatter at the slits into a narrow, out-of-plane light beam. We show that the angular distribution of the emitted light results from the interplay of the size of the plasmonic lens and the spectral width of the SPP nanosource. We then propose simple design rules for optimized light beaming with the smallest possible footprint. The spectral distribution of the emitted light depends not only on the SPP nanosource but also on the local density of electromagnetic states (EM-LDOS) at the nanosource position, which in turn depends on the cavity modes of the plasmonic microstructure. The key parameters for tailoring the spectral response of the plasmonic lens are the period of the slits forming the lens, the number of slits, and the lens inner diameter.
78 FR 16268 - Submission for OMB Review; Service Contracts Reporting Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-14
... the final rule. DATES: Interested parties should submit written comments to the Regulatory Secretariat... between the hours that a simple disclosure by a very small business might require and the much higher...
Understanding Singular Vectors
ERIC Educational Resources Information Center
James, David; Botteron, Cynthia
2013-01-01
matrix yields a surprisingly simple, heuristic approximation to its singular vectors. There are correspondingly good approximations to the singular values. Such rules of thumb provide an intuitive interpretation of the singular vectors that helps explain why the SVD is so…
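A concrete instance of the kind of rule of thumb at issue (example ours): for a rank-one matrix, the leading singular vectors are exactly the normalized row and column patterns, so inspecting row/column structure directly reveals them.

```python
import numpy as np

# Rank-one matrix: every row is a multiple of (2, 1), and so is every column.
A = np.array([[4.0, 2.0],
              [2.0, 1.0]])     # A = outer((2, 1), (2, 1))
U, s, Vt = np.linalg.svd(A)

assert np.isclose(s[0], 5.0) and np.isclose(s[1], 0.0)   # rank one
# Leading singular vectors (up to sign) are the normalized pattern (2, 1):
pattern = np.array([2.0, 1.0]) / np.sqrt(5.0)
assert np.allclose(np.abs(Vt[0]), pattern)
assert np.allclose(np.abs(U[:, 0]), pattern)
```

For nearly rank-one matrices the same vectors remain good approximations, which is the intuition such heuristics build on.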
Shimp, Charles P
2004-06-30
Research on categorization has changed over time, and some of these changes resemble how Wittgenstein's views changed from his Tractatus Logico-Philosophicus to his Philosophical Investigations. Wittgenstein initially focused on unambiguous, abstract, parsimonious, logical propositions and rules, and on independent, static, "atomic facts." This approach subsequently influenced the development of logical positivism and thereby may have indirectly influenced method and theory in research on categorization: much animal research on categorization has focused on learning simple, static, logical rules unambiguously interrelating small numbers of independent features. He later rejected logical simplicity and rigor and focused instead on Gestalt ideas about figure-ground reversals and context, the ambiguity of family resemblance, and the function of details of everyday language. Contemporary contextualism has been influenced by this latter position, some features of which appear in contemporary empirical research on categorization. These developmental changes are illustrated by research on avian local and global levels of visual perceptual analysis, categorization of rectangles and moving objects, and artificial grammar learning. Implications are described for peer review of quantitative theory in which ambiguity, logical rigor, simplicity, or dynamics are designed to play important roles.
Intrinsically stretchable and healable semiconducting polymer for organic transistors
Oh, Jin Young; Rondeau-Gagné, Simon; Chiu, Yu-Cheng; ...
2016-11-16
Developing a molecular design paradigm for conjugated polymers applicable to intrinsically stretchable semiconductors is crucial toward the next generation of wearable electronics. Current molecular design rules for high charge carrier mobility semiconducting polymers are unable to render the fabricated devices simultaneously stretchable and mechanically robust. Here in this paper, we present a new design concept to address the above challenge, while maintaining excellent electronic performance. This concept involves introducing chemical moieties to promote dynamic non-covalent crosslinking of the conjugated polymers. These non-covalent crosslinking moieties are able to undergo an energy dissipation mechanism through breakage of bonds when strain is applied, while retaining high charge transport ability. As a result, our polymer is able to recover its high mobility performance (>1 cm²/Vs) even after 100 cycles at 100% applied strain. Furthermore, we observed that the polymer can be efficiently repaired and/or healed with a simple heat and solvent treatment. These improved mechanical properties of our fabricated stretchable semiconductor enabled us to fabricate highly stretchable and high performance wearable organic transistors. This material design concept should illuminate and advance the pathways for future development of fully stretchable and healable skin-inspired wearable electronics.
2013-01-01
Background Traditional Lot Quality Assurance Sampling (LQAS) designs assume observations are collected using simple random sampling. Alternatively, randomly sampling clusters of observations and then individuals within clusters reduces costs but decreases the precision of the classifications. In this paper, we develop a general framework for designing the cluster(C)-LQAS system and illustrate the method with the design of data quality assessments for the community health worker program in Rwanda. Results To determine sample size and decision rules for C-LQAS, we use the beta-binomial distribution to account for inflated risk of errors introduced by sampling clusters at the first stage. We present general theory and code for sample size calculations. The C-LQAS sample sizes provided in this paper constrain misclassification risks below user-specified limits. Multiple C-LQAS systems meet the specified risk requirements, but numerous considerations, including per-cluster versus per-individual sampling costs, help identify optimal systems for distinct applications. Conclusions We show the utility of C-LQAS for data quality assessments, but the method generalizes to numerous applications. This paper provides the necessary technical detail and supplemental code to support the design of C-LQAS for specific programs. PMID:24160725
Hedt-Gauthier, Bethany L; Mitsunaga, Tisha; Hund, Lauren; Olives, Casey; Pagano, Marcello
2013-10-26
Traditional Lot Quality Assurance Sampling (LQAS) designs assume observations are collected using simple random sampling. Alternatively, randomly sampling clusters of observations and then individuals within clusters reduces costs but decreases the precision of the classifications. In this paper, we develop a general framework for designing the cluster(C)-LQAS system and illustrate the method with the design of data quality assessments for the community health worker program in Rwanda. To determine sample size and decision rules for C-LQAS, we use the beta-binomial distribution to account for inflated risk of errors introduced by sampling clusters at the first stage. We present general theory and code for sample size calculations. The C-LQAS sample sizes provided in this paper constrain misclassification risks below user-specified limits. Multiple C-LQAS systems meet the specified risk requirements, but numerous considerations, including per-cluster versus per-individual sampling costs, help identify optimal systems for distinct applications. We show the utility of C-LQAS for data quality assessments, but the method generalizes to numerous applications. This paper provides the necessary technical detail and supplemental code to support the design of C-LQAS for specific programs.
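The role of the beta-binomial here is to inflate tail risk relative to a plain binomial with the same mean, which is exactly why cluster sampling forces larger samples or looser decision rules. A self-contained illustration (all parameters invented, not the Rwanda design values):

```python
from math import comb, exp, lgamma

def log_beta(a, b):
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def betabinom_pmf(k, n, a, b):
    """Beta-binomial pmf: binomial sampling whose success probability is
    itself Beta(a, b)-distributed, modeling between-cluster variation."""
    return comb(n, k) * exp(log_beta(k + a, n - k + b) - log_beta(a, b))

def tail_prob(pmf, n, threshold):
    """P(X >= threshold) for a distribution on 0..n."""
    return sum(pmf(k) for k in range(threshold, n + 1))

n = 20
# Beta(5, 5) has mean 0.5 but adds overdispersion versus Binomial(20, 0.5):
bb_tail = tail_prob(lambda k: betabinom_pmf(k, n, 5, 5), n, 15)
bin_tail = tail_prob(lambda k: comb(n, k) * 0.5**n, n, 15)
assert bb_tail > bin_tail   # clustering inflates the risk of extreme counts
```

In a C-LQAS design, decision thresholds are chosen so that tails like `bb_tail`, not the optimistic binomial ones, stay below the user-specified misclassification limits.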
A hybrid learning method for constructing compact rule-based fuzzy models.
Zhao, Wanqing; Niu, Qun; Li, Kang; Irwin, George W
2013-12-01
The Takagi–Sugeno–Kang-type rule-based fuzzy model has found many applications in different fields; a major challenge is, however, to build a compact model with optimized model parameters which leads to satisfactory model performance. To produce a compact model, most existing approaches mainly focus on selecting an appropriate number of fuzzy rules. In contrast, this paper considers not only the selection of fuzzy rules but also the structure of each rule premise and consequent, leading to the development of a novel compact rule-based fuzzy model. Here, each fuzzy rule is associated with two sets of input attributes, in which the first is used for constructing the rule premise and the other is employed in the rule consequent. A new hybrid learning method combining the modified harmony search method with a fast recursive algorithm is hereby proposed to determine the structure and the parameters for the rule premises and consequents. This is a hard mixed-integer nonlinear optimization problem, and the proposed hybrid method solves the problem by employing an embedded framework, leading to a significantly reduced number of model parameters and a small number of fuzzy rules with each being as simple as possible. Results from three examples are presented to demonstrate the compactness (in terms of the number of model parameters and the number of rules) and the performance of the fuzzy models obtained by the proposed hybrid learning method, in comparison with other techniques from the literature.
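The paper's key structural idea, separate attribute sets for each rule's premise and consequent, can be made concrete in a minimal TSK inference sketch (ours, with invented rule parameters; the paper's learning machinery for choosing these structures is not shown):

```python
import math

def tsk_predict(x, rules):
    """Minimal TSK fuzzy inference: each rule has a Gaussian premise over
    its own attribute subset and a linear consequent over a (possibly
    different) subset; output is the firing-strength-weighted average."""
    num = den = 0.0
    for rule in rules:
        firing = 1.0
        for attr, (center, width) in rule["premise"].items():
            firing *= math.exp(-((x[attr] - center) ** 2) / (2 * width ** 2))
        consequent = rule["bias"] + sum(w * x[a] for a, w in rule["consequent"].items())
        num += firing * consequent
        den += firing
    return num / den

# Two toy rules whose premises use attribute "u" but whose consequents
# use attribute "v", mirroring the two attribute sets per rule:
rules = [
    {"premise": {"u": (0.0, 1.0)}, "bias": 0.0,  "consequent": {"v": 1.0}},
    {"premise": {"u": (4.0, 1.0)}, "bias": 10.0, "consequent": {"v": 0.0}},
]
assert abs(tsk_predict({"u": 0.0, "v": 3.0}, rules) - 3.0) < 0.01
```

Compactness in the paper's sense means few rules and few attributes inside each `premise` and `consequent` dictionary.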
Phase transitions in the q-voter model with noise on a duplex clique
NASA Astrophysics Data System (ADS)
Chmiel, Anna; Sznajd-Weron, Katarzyna
2015-11-01
We study a nonlinear q-voter model with stochastic noise, interpreted in the social context as independence, on a duplex network. To study the role of the multilevelness in this model we propose three methods of transferring the model from a mono- to a multiplex network. They take into account two criteria: one related to the status of independence (LOCAL vs GLOBAL) and one related to peer pressure (AND vs OR). In order to examine the influence of the presence of more than one level in the social network, we perform simulations on a particularly simple multiplex: a duplex clique, which consists of two fully overlapped complete graphs (cliques). Solving numerically the rate equation and simultaneously conducting Monte Carlo simulations, we provide evidence that even a simple rearrangement into a duplex topology may lead to significant changes in the observed behavior. However, qualitative changes in the phase transitions can be observed for only one of the considered rules: LOCAL&AND. For this rule the phase transition becomes discontinuous for q=5, whereas for a monoplex such behavior is observed for q=6. Interestingly, only this rule admits construction of realistic variants of the model, in line with recent social experiments.
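The elementary update underlying all three duplex variants is the single-layer q-voter step with independence, sketched below (a monoplex caricature with invented parameters; the duplex rules combine two such layers via the LOCAL/GLOBAL and AND/OR criteria):

```python
import random

def qvoter_step(state, q, p, rng):
    """One update of the q-voter model with independence on a complete
    graph: a random agent acts independently with probability p (random
    opinion), otherwise it conforms only when a panel of q randomly
    chosen other agents is unanimous."""
    i = rng.randrange(len(state))
    if rng.random() < p:                       # independence (noise)
        state[i] = rng.choice([-1, 1])
    else:                                      # conformity to a unanimous q-panel
        others = [j for j in range(len(state)) if j != i]
        panel = [state[j] for j in rng.sample(others, q)]
        if all(s == panel[0] for s in panel):
            state[i] = panel[0]

rng = random.Random(42)
state = [1] * 100
for _ in range(5_000):
    qvoter_step(state, q=5, p=0.0, rng=rng)
assert state == [1] * 100   # with no independence, consensus is absorbing
```

Sweeping `p` and recording the stationary magnetization is how the continuous-versus-discontinuous character of the transition is probed numerically.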
Knowledge acquisition for case-based reasoning systems
NASA Technical Reports Server (NTRS)
Riesbeck, Christopher K.
1988-01-01
Case-based reasoning (CBR) is a simple idea: solve new problems by adapting old solutions to similar problems. The CBR approach offers several potential advantages over rule-based reasoning: rules are not combined blindly in a search for solutions, solutions can be explained in terms of concrete examples, and performance can improve automatically as new problems are solved and added to the case library. Moving CBR from the university research environment to the real world requires smooth interfaces for getting knowledge from experts. Described are the basic elements of an interface for acquiring three basic bodies of knowledge that any case-based reasoner requires: the case library of problems and their solutions, the analysis rules that flesh out input problem specifications so that relevant cases can be retrieved, and the adaptation rules that adjust old solutions to fit new problems.
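The retrieve-adapt-store cycle can be sketched in a few lines; the numeric domain, distance function, and adaptation rule below are invented placeholders standing in for the expert-supplied analysis and adaptation knowledge the abstract describes:

```python
def solve_with_cbr(problem, case_library, distance, adapt):
    """Minimal case-based reasoning loop: retrieve the most similar past
    case, adapt its solution to the new problem, and store the result so
    performance can improve as the library grows."""
    nearest = min(case_library, key=lambda case: distance(problem, case["problem"]))
    solution = adapt(nearest["solution"], nearest["problem"], problem)
    case_library.append({"problem": problem, "solution": solution})
    return solution

# Hypothetical domain: price estimation by nearest size, linearly rescaled.
library = [{"problem": {"size": 100}, "solution": 200_000},
           {"problem": {"size": 50},  "solution": 110_000}]
price = solve_with_cbr(
    {"size": 60}, library,
    distance=lambda p, q: abs(p["size"] - q["size"]),
    adapt=lambda sol, old, new: sol * new["size"] / old["size"],
)
assert price == 110_000 * 60 / 50   # adapted from the closest case (size 50)
assert len(library) == 3            # the solved case joins the library
```

The knowledge-acquisition interfaces discussed above would populate exactly these three ingredients: the `case_library`, the retrieval (`distance`) rules, and the `adapt` rules.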
Marine reserves as linked social-ecological systems.
Pollnac, Richard; Christie, Patrick; Cinner, Joshua E; Dalton, Tracey; Daw, Tim M; Forrester, Graham E; Graham, Nicholas A J; McClanahan, Timothy R
2010-10-26
Marine reserves are increasingly recognized as having linked social and ecological dynamics. This study investigates how the ecological performance of 56 marine reserves throughout the Philippines, Caribbean, and Western Indian Ocean (WIO) is related to both reserve design features and the socioeconomic characteristics in associated coastal communities. Ecological performance was measured as fish biomass in the reserve relative to nearby areas. Of the socioeconomic variables considered, human population density and compliance with reserve rules had the strongest effects on fish biomass, but the effects of these variables were region specific. Relationships between population density and the reserve effect on fish biomass were negative in the Caribbean, positive in the WIO, and not detectable in the Philippines. Differing associations between population density and reserve effectiveness defy simple explanation but may depend on human migration to effective reserves, depletion of fish stocks outside reserves, or other social factors that change with population density. Higher levels of compliance reported by resource users were related to higher fish biomass in reserves compared with outside, but this relationship was only statistically significant in the Caribbean. A heuristic model based on correlations between social, cultural, political, economic, and other contextual conditions in 127 marine reserves showed that high levels of compliance with reserve rules were related to complex social interactions rather than simply to enforcement of reserve rules. Comparative research of this type is important for uncovering the complexities surrounding human dimensions of marine reserves and improving reserve management.
Acquisition, representation and rule generation for procedural knowledge
NASA Technical Reports Server (NTRS)
Ortiz, Chris; Saito, Tim; Mithal, Sachin; Loftin, R. Bowen
1991-01-01
Current research into the design and continuing development of a system for the acquisition of procedural knowledge, its representation in useful forms, and proposed methods for automated C Language Integrated Production System (CLIPS) rule generation is discussed. The Task Analysis and Rule Generation Tool (TARGET) is intended to permit experts, individually or collectively, to visually describe and refine procedural tasks. The system is designed to represent the acquired knowledge in the form of graphical objects with the capacity for generating production rules in CLIPS. The generated rules can then be integrated into applications such as NASA's Intelligent Computer Aided Training (ICAT) architecture. Also described are proposed methods for use in translating the graphical and intermediate knowledge representations into CLIPS rules.
Verifying Architectural Design Rules of the Flight Software Product Line
NASA Technical Reports Server (NTRS)
Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen
2009-01-01
This paper presents experiences of verifying architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps a) identifying architecturally significant deviations that were eluded during code reviews, b) clarifying the design rules to the team, and c) assessing the overall implementation quality. Furthermore, it helps connecting business goals to architectural principles, and to the implementation. This paper is the first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.
Process Materialization Using Templates and Rules to Design Flexible Process Models
NASA Astrophysics Data System (ADS)
Kumar, Akhil; Yao, Wen
The main idea in this paper is to show how flexible processes can be designed by combining generic process templates and business rules. We instantiate a process by applying rules to specific case data, and running a materialization algorithm. The customized process instance is then executed in an existing workflow engine. We present an architecture and also give an algorithm for process materialization. The rules are written in a logic-based language like Prolog. Our focus is on capturing deeper process knowledge and achieving a holistic approach to robust process design that encompasses control flow, resources, and data, and makes it easier to accommodate changes to business policy.
NASA Technical Reports Server (NTRS)
Zahm, A F; Bear, R M
1929-01-01
Part I describes vibration tests, in a wind tunnel, of simple airfoils and of the tail plane of an M0-1 airplane model; it also describes the air flow about this model. From these tests are drawn inferences as to the cause and cure of aerodynamic wing vibrations. Part II derives stability criteria for wing vibrations in pitch and roll, and gives design rules to obviate instability. Part III shows how to design spars to flex equally under a given wing loading and thereby economically minimize the twisting in pitch that permits cumulative flutter. Resonant flutter is not likely to ensue from turbulence of air flow along past wings and tail planes in usual flying conditions. To be flutterproof a wing must be void of reversible autorotation and not have its centroid far aft of its pitching axis, i.e., axis of pitching motion. Danger of flutter is minimized by so proportioning the wing's torsional resisting moment to the air pitching moment at high-speed angles that the torsional flexure is always small. (author)
Konetzka, R Tamara; Skira, Meghan M; Werner, Rachel M
2018-01-01
Pay-for-performance (P4P) programs have become a popular policy tool aimed at improving health care quality. We analyze how incentive design affects quality improvements in the nursing home setting, where several state Medicaid agencies have implemented P4P programs that vary in incentive structure. Using the Minimum Data Set and the Online Survey, Certification, and Reporting data from 2001 to 2009, we examine how the weights put on various performance measures that are tied to P4P bonuses, such as clinical outcomes, inspection deficiencies, and staffing levels, affect improvements in those measures. We find larger weights on clinical outcomes often lead to larger improvements, but small weights can lead to no improvement or worsening of some clinical outcomes. We find a qualifier for P4P eligibility based on having few or no severe inspection deficiencies is more effective at decreasing inspection deficiencies than using weights, suggesting simple rules for participation may incent larger improvement.
Emergent 1d Ising Behavior in AN Elementary Cellular Automaton Model
NASA Astrophysics Data System (ADS)
Kassebaum, Paul G.; Iannacchione, Germano S.
The fundamental nature of an evolving one-dimensional (1D) Ising model is investigated with an elementary cellular automaton (CA) simulation. The emergent CA simulation employs an ensemble of cells in one spatial dimension, each cell capable of two microstates interacting with simple nearest-neighbor rules and incorporating an external field. The behavior of the CA model provides insight into the dynamics of coupled two-state systems not expressible by exact analytical solutions. For instance, state progression graphs show the causal dynamics of a system through time in relation to the system's entropy. Unique graphical analysis techniques are introduced through difference patterns, diffusion patterns, and state progression graphs of the 1D ensemble visualizing the evolution. All analyses are consistent with the known behavior of the 1D Ising system. The CA simulation and new pattern recognition techniques are scalable (in both dimension, complexity, and size) and have many potential applications such as complex design of materials, control of agent systems, and evolutionary mechanism design.
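A minimal sketch of the kind of two-state, nearest-neighbor automaton with an external field described above (the specific update rule, a field-biased local majority with ties left unchanged, is an assumption for illustration, not necessarily the paper's rule):

```python
def step(cells, field=0):
    """One synchronous update of a 1D two-state (+1/-1) automaton with
    periodic boundaries: each cell aligns with the sign of its two
    nearest neighbors plus an external field, keeping its state on ties."""
    n = len(cells)
    out = []
    for i in range(n):
        local = cells[(i - 1) % n] + cells[(i + 1) % n] + field
        if local > 0:
            out.append(1)
        elif local < 0:
            out.append(-1)
        else:
            out.append(cells[i])  # tie: state is unchanged
    return out

row = [1, -1, -1, -1, 1, 1, -1, 1]
history = [row]
for _ in range(4):
    row = step(row)
    history.append(row)  # successive rows form a state-progression diagram
```

Stacking the successive rows is exactly the kind of state-progression graph the abstract refers to; difference and diffusion patterns are derived by comparing rows pairwise.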
Optimization of a matched-filter receiver for frequency hopping code acquisition in jamming
NASA Astrophysics Data System (ADS)
Pawlowski, P. R.; Polydoros, A.
A matched-filter receiver for frequency hopping (FH) code acquisition is optimized when either partial-band tone jamming or partial-band Gaussian noise jamming is present. The receiver is matched to a segment of the FH code sequence, sums hard per-channel decisions to form a test, and uses multiple tests to verify acquisition. The length of the matched filter and the number of verification tests are fixed. Optimization is then choosing thresholds to maximize performance based upon the receiver's degree of knowledge about the jammer ('side-information'). Four levels of side-information are considered, ranging from none to complete. The latter level results in a constant-false-alarm-rate (CFAR) design. At each level, performance sensitivity to threshold choice is analyzed. Robust thresholds are chosen to maximize performance as the jammer varies its power distribution, resulting in simple design rules which aid threshold selection. Performance results, which show that optimum distributions for the jammer power over the total FH bandwidth exist, are presented.
Zhang, Ziyu; Yuan, Lang; Lee, Peter D; Jones, Eric; Jones, Julian R
2014-01-01
Bone augmentation implants are porous to allow cellular growth, bone formation and fixation. However, the design of the pores is currently based on simple empirical rules, such as minimum pore and interconnects sizes. We present a three-dimensional (3D) transient model of cellular growth based on the Navier–Stokes equations that simulates the body fluid flow and stimulation of bone precursor cellular growth, attachment, and proliferation as a function of local flow shear stress. The model's effectiveness is demonstrated for two additive manufactured (AM) titanium scaffold architectures. The results demonstrate that there is a complex interaction of flow rate and strut architecture, resulting in partially randomized structures having a preferential impact on stimulating cell migration in 3D porous structures for higher flow rates. This novel result demonstrates the potential new insights that can be gained via the modeling tool developed, and how the model can be used to perform what-if simulations to design AM structures to specific functional requirements. PMID:24664988
NASA Technical Reports Server (NTRS)
Hendricks, R. C.; Braun, M. J.; Mullen, R. L.
1986-01-01
In systems where the design inlet and outlet pressures P sub amb are maintained above the thermodynamic critical pressure P sub c, it is often assumed that heat and mass transfer are governed by single-phase relations and that two-phase flows cannot occur. This simple rule of thumb is adequate in many low-power designs but is inadequate for high-performance turbomachines, boilers, and other systems where two-phase regions can exist even though P sub amb > P sub c. Heat and mass transfer and rotordynamic-fluid-mechanic restoring forces depend on momentum differences, and those for a two-phase zone can differ significantly from those for a single-phase zone. By using a laminar, variable-property bearing code and a rotating boiler code, pressure and temperature surfaces were determined that illustrate nesting of a two-phase region within a supercritical pressure region. The method of corresponding states is applied to bearings with reasonable rapport.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-16
... be made in a nondiscriminatory fashion.\\14\\ \\14\\ See NYSE Arca Equities Rule 7.45(d)(3). NYSE Arca... Securities will be required to establish and enforce policies and procedures that are reasonably designed to... other things, that the rules of a national securities exchange be designed to prevent fraudulent and...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-26
...-Regulatory Organizations; NYSE Arca, Inc.; Notice of Designation of a Longer Period for Commission Action on Proposed Rule Change Proposing a Pilot Program To Create a Lead Market Maker Issuer Incentive Program for...'') \\1\\ and Rule 19b-4 thereunder,\\2\\ a proposed rule change to create and implement, on a pilot basis, a...
Learning CAD at University through Summaries of the Rules of Design Intent
ERIC Educational Resources Information Center
Barbero, Basilio Ramos; Pedrosa, Carlos Melgosa; Samperio, Raúl Zamora
2017-01-01
The ease with which 3D CAD models may be modified and reused are two key aspects that improve the design-intent variable and that can significantly shorten the development timelines of a product. A set of rules are gathered from various authors that take different 3D modelling strategies into account. These rules are then applied to CAD…
Basis of the tubesheet heat exchanger design rules used in the French pressure vessel code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Osweiller, F.
1992-02-01
For about 40 years most tubesheet exchangers have been designed according to the standards of TEMA. Partly due to their simplicity, these rules do not assure a safe heat-exchanger design in all cases. This is the main reason why new tubesheet design rules were developed in 1981 in France for the French pressure vessel code CODAP. For fixed tubesheet heat exchangers, the new rules account for the elastic rotational restraint of the shell and channel at the outer edge of the tubesheet, as proposed in 1959 by Galletly. For floating-head and U-tube heat exchangers, the approach developed by Gardner in 1969 was selected with some modifications. In both cases, the tubesheet is replaced by an equivalent solid plate with adequate effective elastic constants, and the tube bundle is simulated by an elastic foundation. The elastic restraint at the edge of the tubesheet due to the shell and channel is accounted for in different ways in the two types of heat exchangers. The purpose of the paper is to present the main basis of these rules and to compare them to the TEMA rules.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-03
... Proposed Rule Change Moving the Rule Text That Provides for Pegging on the Exchange From Supplementary Material .26 of NYSE Rule 70 to NYSE Rule 13 and Amending Such Text to (i) Permit Designated Market Maker... of the Terms of Substance of the Proposed Rule Change The Exchange proposes to move the rule text...
Venice Park landfill: Working with the community
DOE Office of Scientific and Technical Information (OSTI.GOV)
McAdams, C.L.
1993-09-01
Venice Park landfill was one of the first sites to be permitted under Michigan's proposed Public Act 641. PA 641 essentially changed the rules and regulations for landfills from the simple design of digging a hole and filling it. It also upgraded standards to those that are more sophisticated, including liners, leachate collection systems, and gas extraction systems. In 1992, methane gas from the landfill was collected in wells drilled into the trash, varying in depth from 30 to 50 feet. A vacuum pulls the gas from the trash into the wells, then through a piping system. The landfill uses about 80-100 kilowatts in-house. The remainder of the gas is sold to Consumers Power Co., which uses landfill gas to supply power to homes.
DNA sequence-directed shape change of photopatterned hydrogels via high-degree swelling
NASA Astrophysics Data System (ADS)
Cangialosi, Angelo; Yoon, ChangKyu; Liu, Jiayu; Huang, Qi; Guo, Jingkai; Nguyen, Thao D.; Gracias, David H.; Schulman, Rebecca
2017-09-01
Shape-changing hydrogels that can bend, twist, or actuate in response to external stimuli are critical to soft robots, programmable matter, and smart medicine. Shape change in hydrogels has been induced by global cues, including temperature, light, or pH. Here we demonstrate that specific DNA molecules can induce 100-fold volumetric hydrogel expansion by successive extension of cross-links. We photopattern up to centimeter-sized gels containing multiple domains that undergo different shape changes in response to different DNA sequences. Experiments and simulations suggest a simple design rule for controlled shape change. Because DNA molecules can be coupled to molecular sensors, amplifiers, and logic circuits, this strategy introduces the possibility of building soft devices that respond to diverse biochemical inputs and autonomously implement chemical control programs.
META II Complex Systems Design and Analysis (CODA)
2011-08-01
[Table-of-contents excerpt: 3.8.7 Variables, Parameters and Constraints; 3.8.8 Objective; Figure 7: Inputs, States, Outputs and Parameters of System Requirements Specifications; Design Rule Based on Device Parameter; Figure 35: AEE Device Design Rules (excerpt)]
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-14
... Proposed Rule Change To Modify the Requirements To Qualify for Credits as a Designated Liquidity Provider... requirements to qualify for credits as a designated liquidity provider under Rule 7018(i) and to make a minor... Designated Liquidity Providers: Charge to Designated Liquidity Provider $0.003 per share executed entering...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-30
...., wish to apply these airworthiness design standards to other airplane models, OHA, Inc. must submit a... affects only certain airworthiness design standards on Cessna model C172I, C172K, C172L, C172M airplanes... Design Standards for Acceptance Under the Primary Category Rule; Orlando Helicopter Airways (OHA), Inc...
ERIC Educational Resources Information Center
Sewell, Julia H.
1983-01-01
Students with undetected color blindness can have problems with specific teaching methods and materials. The problem should be ruled out in children with suspected learning disabilities and taken into account in career counseling. Nine examples of simple classroom modifications are described. (CL)
Airport noise summary, 2000-2002
DOT National Transportation Integrated Search
2002-01-01
This 2000-2002 edition of the NBAA Airport Noise Summary shows those airports with noise advisories or rules. These restrictions range from a simple "avoid overflight of school 2 NM south of 09 approach" to a specific decibel level requir...
Heads Up, Shoulders Straight, Stick and Twirl Together
ERIC Educational Resources Information Center
Warrick, James
1977-01-01
With so many roles to juggle and so many complex music problems to resolve, some marching band directors overlook simple rules of thumb to increase their bands' visual and musical impact. Here are some guidelines. (Author/RK)
All-loop Mondrian diagrammatics and 4-particle amplituhedron
NASA Astrophysics Data System (ADS)
An, Yang; Li, Yi; Li, Zhinan; Rao, Junjie
2018-06-01
Based on 1712.09990, which handles the 4-particle amplituhedron at 3-loop, we have found an extremely simple pattern, yet far more non-trivial than one might naturally expect: the all-loop Mondrian diagrammatics. By further simplifying and rephrasing the key relation of positivity in the amplituhedron setting, remarkably, we find a completeness relation unifying all diagrams of the Mondrian types for the 4-particle integrand of planar N = 4 SYM to all loop orders, each of which can be mapped to a simple product following a few plain rules designed for this relation. The explicit examples we investigate span from 3-loop to 7-loop order, and based on them, we classify the basic patterns of Mondrian diagrams into four types: the ladder, cross, brick-wall, and spiral patterns. Interestingly, for some special combinations of ordered subspaces (a concept defined in the previous work), we find failed exceptions of the completeness relation, which we call "anomalies"; nevertheless, they substantially give hints on the all-loop recursive proof of this relation. These investigations are closely related to the combinatoric knowledge of separable permutations and Schröder numbers, and go even further from a diagrammatic perspective. For physical relevance, we need to further consider dual conformal invariance for two basic diagrammatic patterns to correct the numerator for a local integrand involving one or both of such patterns, while the denominator, which encodes the pole structure, and the sign factor are already fixed by rules of the completeness relation. With this extra treatment to ensure the integrals are dual conformally invariant, each Mondrian diagram can be exactly translated to its corresponding physical loop integrand after being summed over all ordered subspaces that admit it.
Scheduling Software for Complex Scenarios
NASA Technical Reports Server (NTRS)
2006-01-01
Preparing a vehicle and its payload for a single launch is a complex process that involves thousands of operations. Because the equipment and facilities required to carry out these operations are extremely expensive and limited in number, optimal assignment and efficient use are critically important. Overlapping missions that compete for the same resources, ground rules, safety requirements, and the unique needs of processing vehicles and payloads destined for space impose numerous constraints that, when combined, require advanced scheduling. Traditional scheduling systems use simple algorithms and criteria when selecting activities and assigning resources and times to each activity. Schedules generated by these simple decision rules are, however, frequently far from optimal. To resolve mission-critical scheduling issues and predict possible problem areas, NASA historically relied upon expert human schedulers who used their judgment and experience to determine where things should happen, whether they will happen on time, and whether the requested resources are truly necessary.
A Framework of Simple Event Detection in Surveillance Video
NASA Astrophysics Data System (ADS)
Xu, Weiguang; Zhang, Yafei; Lu, Jianjiang; Tian, Yulong; Wang, Jiabao
Video surveillance is playing a more and more important role in people's social life. Real-time alerting of threatening events and searching for interesting content in large stored video footage require a human operator to pay full attention to a monitor for long periods. This labor-intensive mode has limited the effectiveness and efficiency of such systems. A framework for simple event detection is presented to advance the automation of video surveillance. An improved inner key point matching approach is used to compensate for background motion in real time; frame differencing is used to detect the foreground; HOG-based classifiers are used to classify foreground objects into people and cars; mean-shift is used to track the recognized objects. Events are detected based on predefined rules. The maturity of the algorithms guarantees the robustness of the framework, and the improved approach and the easily checked rules enable the framework to work in real time. Future work to be done is also discussed.
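The frame-differencing stage of such a pipeline can be sketched in a few lines; the threshold and the tiny synthetic frames below are illustrative assumptions (the full framework additionally uses key-point matching for motion compensation, HOG classifiers, and mean-shift tracking):

```python
def foreground_mask(prev, curr, thresh=25):
    """Per-pixel frame difference: mark a pixel as foreground when its
    grayscale intensity changed by more than `thresh` between frames."""
    return [[abs(c - p) > thresh for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

# Two synthetic 4x4 grayscale frames: a bright 'object' appears at (1,1)-(2,2).
frame0 = [[10] * 4 for _ in range(4)]
frame1 = [list(r) for r in frame0]
for y in (1, 2):
    for x in (1, 2):
        frame1[y][x] = 200

mask = foreground_mask(frame0, frame1)
moved = sum(v for row in mask for v in row)
print(moved)  # → 4 changed pixels
```

Downstream rules (e.g. "alert if a foreground blob crosses a line") then operate on the connected regions of this mask.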
Big Bang Day : The Great Big Particle Adventure - 3. Origins
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe.
Zweig-rule-satisfying inelastic rescattering in B decays to pseudoscalar mesons
NASA Astrophysics Data System (ADS)
Łach, P.; Żenczykowski, P.
2002-09-01
We discuss all contributions from Zweig-rule-satisfying SU(3)-symmetric inelastic final state interaction (FSI)-induced corrections in B decays to ππ, πK, KK̄, πη(η'), and Kη(η'). We show how all of these FSI corrections lead to a simple redefinition of the amplitudes, permitting the use of a simple diagram-based description, in which, however, weak phases may enter in a modified way. The inclusion of FSI corrections admitted by the present data allows an arbitrary relative phase between the penguin and tree short-distance amplitudes. The FSI-induced error of the method, in which the value of the weak phase γ is to be determined by combining future results from B+, B0d, B0s decays to Kπ, is estimated to be of the order of 5° for γ ~ 50°-60°.
Simple spatial scaling rules behind complex cities.
Li, Ruiqi; Dong, Lei; Zhang, Jiang; Wang, Xinran; Wang, Wen-Xu; Di, Zengru; Stanley, H Eugene
2017-11-28
Although most wealth and innovation have been the result of human interaction and cooperation, we are not yet able to quantitatively predict the spatial distributions of three main elements of cities: population, roads, and socioeconomic interactions. By a simple model mainly based on spatial attraction and matching growth mechanisms, we reveal that the spatial scaling rules of these three elements are in a consistent framework, which allows us to use any single observation to infer the others. All numerical and theoretical results are consistent with empirical data from ten representative cities. In addition, our model can also provide a general explanation of the origins of the universal super- and sub-linear aggregate scaling laws and accurately predict kilometre-level socioeconomic activity. Our work opens a new avenue for uncovering the evolution of cities in terms of the interplay among urban elements, and it has a broad range of applications.
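The super- and sub-linear aggregate scaling laws mentioned take the power-law form Y = Y0 · N^β, fitted by least squares in log-log space. A minimal sketch on synthetic data (the exponent 1.15 is a commonly cited illustrative value for superlinear socioeconomic outputs, not a result from this paper):

```python
import math

def fit_power_law(N, Y):
    """Least-squares fit of Y = Y0 * N**beta in log-log space."""
    lx = [math.log(n) for n in N]
    ly = [math.log(y) for y in Y]
    mx, my = sum(lx) / len(lx), sum(ly) / len(ly)
    beta = (sum((x - mx) * (y - my) for x, y in zip(lx, ly))
            / sum((x - mx) ** 2 for x in lx))
    y0 = math.exp(my - beta * mx)
    return y0, beta

# Synthetic city data lying exactly on a superlinear law with beta = 1.15.
pops = [1e5, 5e5, 1e6, 5e6, 1e7]
gdp = [2.0 * p ** 1.15 for p in pops]
y0, beta = fit_power_law(pops, gdp)
print(round(beta, 2))  # → 1.15
```

β > 1 indicates superlinear scaling (socioeconomic interactions), β < 1 sublinear scaling (infrastructure such as road length).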
Exact semiclassical expansions for one-dimensional quantum oscillators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Delabaere, E.; Dillinger, H.; Pham, F.
1997-12-01
A set of rules is given for dealing with WKB expansions in the one-dimensional analytic case, whereby such expansions are not considered as approximations but as exact encodings of wave functions, thus allowing for analytic continuation with respect to whichever parameters the potential function depends on, with an exact control of small exponential effects. These rules, which also include the case when there are double turning points, are illustrated on various examples, and applied to the study of bound state or resonance spectra. In the case of simple oscillators, it is thus shown that the Rayleigh–Schrödinger series is Borel resummable, yielding the exact energy levels. In the case of the symmetrical anharmonic oscillator, one gets a simple and rigorous justification of the Zinn-Justin quantization condition, and of its solution in terms of "multi-instanton expansions." © 1997 American Institute of Physics.
Cabrera, Derek; Colosi, Laura; Lobdell, Claire
2008-08-01
Evaluation is one of many fields where "systems thinking" is popular and is said to hold great promise. However, there is disagreement about what constitutes systems thinking. Its meaning is ambiguous, and systems scholars have made diverse and divergent attempts to describe it. Alternative origins include von Bertalanffy, Aristotle, Lao Tsu, or multiple aperiodic "waves." Some scholars describe it as synonymous with the systems sciences (i.e., nonlinear dynamics, complexity, chaos). Others view it as a taxonomy: a laundry list of systems approaches. Within so much noise, it is often difficult for evaluators to find the systems thinking signal. Recent work in systems thinking describes it as an emergent property of four simple conceptual patterns (rules). For an evaluator to become a "systems thinker," he or she need not spend years learning many methods or nonlinear sciences. Instead, with some practice, one can learn to apply these four simple rules to existing evaluation knowledge with transformative results.
Bendability optimization of flexible optical nanoelectronics via neutral axis engineering
2012-01-01
The enhancement of bendability of flexible nanoelectronics is critically important to realize future portable and wearable nanoelectronics for personal and military purposes. Because there is an enormous variety of materials and structures that are used for flexible nanoelectronic devices, a governing design rule for optimizing the bendability of these nanodevices is required. In this article, we suggest a design rule to optimize the bendability of flexible nanoelectronics through neutral axis (NA) engineering. In flexible optical nanoelectronics, transparent electrodes such as indium tin oxide (ITO) are usually the most fragile under an external load because of their brittleness. Therefore, we representatively focus on the bendability of ITO which has been widely used as transparent electrodes, and the NA is controlled by employing a buffer layer on the ITO layer. First, we independently investigate the effect of the thickness and elastic modulus of a buffer layer on the bendability of an ITO film. Then, we develop a design rule for the bendability optimization of flexible optical nanoelectronics. Because NA is determined by considering both the thickness and elastic modulus of a buffer layer, the design rule is conceived to be applicable regardless of the material and thickness that are used for the buffer layer. Finally, our design rule is applied to optimize the bendability of an organic solar cell, which allows the bending radius to reach about 1 mm. Our design rule is thus expected to provide a great strategy to enhance the bending performance of a variety of flexible nanoelectronics. PMID:22587757
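The neutral-axis position that this design rule manipulates follows from classical composite-beam theory: the modulus-weighted centroid of the layer stack. A sketch with assumed layer moduli and thicknesses (illustrative values, not the paper's):

```python
def neutral_axis(layers):
    """Position of the neutral axis (measured from the bottom surface) of a
    layered film under pure bending, per unit width.
    `layers` is a bottom-to-top list of (elastic_modulus, thickness)."""
    num = den = z = 0.0
    for E, t in layers:
        num += E * t * (z + t / 2)  # modulus-weighted first moment
        den += E * t                # modulus-weighted area
        z += t
    return num / den

# Hypothetical stack (assumed values): polymer substrate, ITO film, and a
# buffer layer chosen to pull the neutral axis up into the brittle ITO.
substrate = (5e9, 100e-6)    # (Pa, m)
ito = (110e9, 150e-9)
buffer_layer = (5e9, 100e-6)

na_bare = neutral_axis([substrate, ito])
na_buffered = neutral_axis([substrate, ito, buffer_layer])
ito_mid = 100e-6 + 75e-9     # mid-plane of the ITO layer

# Bending strain at depth z scales as (z - NA)/R, so moving the NA toward
# the ITO mid-plane minimizes the strain in the brittle electrode.
print(abs(na_buffered - ito_mid) < abs(na_bare - ito_mid))  # → True
```

This is why the rule depends on both the thickness and the elastic modulus of the buffer layer: either one shifts the modulus-weighted centroid.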
Bendability optimization of flexible optical nanoelectronics via neutral axis engineering.
Lee, Sangmin; Kwon, Jang-Yeon; Yoon, Daesung; Cho, Handong; You, Jinho; Kang, Yong Tae; Choi, Dukhyun; Hwang, Woonbong
2012-05-15
The enhancement of bendability of flexible nanoelectronics is critically important to realize future portable and wearable nanoelectronics for personal and military purposes. Because there is an enormous variety of materials and structures that are used for flexible nanoelectronic devices, a governing design rule for optimizing the bendability of these nanodevices is required. In this article, we suggest a design rule to optimize the bendability of flexible nanoelectronics through neutral axis (NA) engineering. In flexible optical nanoelectronics, transparent electrodes such as indium tin oxide (ITO) are usually the most fragile under an external load because of their brittleness. Therefore, we representatively focus on the bendability of ITO which has been widely used as transparent electrodes, and the NA is controlled by employing a buffer layer on the ITO layer. First, we independently investigate the effect of the thickness and elastic modulus of a buffer layer on the bendability of an ITO film. Then, we develop a design rule for the bendability optimization of flexible optical nanoelectronics. Because NA is determined by considering both the thickness and elastic modulus of a buffer layer, the design rule is conceived to be applicable regardless of the material and thickness that are used for the buffer layer. Finally, our design rule is applied to optimize the bendability of an organic solar cell, which allows the bending radius to reach about 1 mm. Our design rule is thus expected to provide a great strategy to enhance the bending performance of a variety of flexible nanoelectronics.
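The abstract does not give the article's formula, but the neutral-axis position of a layered stack follows from the standard modulus-weighted centroid of composite-beam theory. A minimal sketch of that generic calculation, with purely illustrative layer values:

```python
def neutral_axis(layers):
    """layers: list of (thickness, elastic_modulus), bottom to top.
    Returns the neutral-axis height measured from the bottom surface,
    i.e., the modulus-weighted centroid of the stack."""
    y = 0.0
    num = den = 0.0
    for t, E in layers:
        mid = y + t / 2.0          # mid-height of this layer
        num += E * t * mid         # modulus-weighted first moment
        den += E * t               # modulus-weighted area (unit width)
        y += t
    return num / den

# Two-layer stack: ITO (bottom) plus a buffer layer on top.
# Thickness and modulus values are illustrative, not from the article.
ito = (150e-9, 116e9)      # 150 nm, ~116 GPa
buffer = (150e-9, 116e9)   # identical layer -> NA sits at the interface
print(neutral_axis([ito, buffer]))  # ~1.5e-07 m: the ITO/buffer interface
```

Moving the NA to the brittle ITO layer (by tuning the buffer's thickness or modulus) drives the bending strain there toward zero, which is the optimization target the article describes.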
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-22
...-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Designation of a Longer Period for Commission Action on a Proposed Rule Change Relating to Wash Sale Transactions and FINRA Rule...-4 thereunder,\\2\\ a proposed rule change to amend FINRA Rule 5210. The proposed rule change was...
14 CFR 93.121 - Applicability.
Code of Federal Regulations, 2010 CFR
2010-01-01
... AND GENERAL OPERATING RULES SPECIAL AIR TRAFFIC RULES High Density Traffic Airports § 93.121 Applicability. This subpart designates high density traffic airports and prescribes air traffic rules for...
14 CFR 93.121 - Applicability.
Code of Federal Regulations, 2012 CFR
2012-01-01
... AND GENERAL OPERATING RULES SPECIAL AIR TRAFFIC RULES High Density Traffic Airports § 93.121 Applicability. This subpart designates high density traffic airports and prescribes air traffic rules for...
14 CFR 93.121 - Applicability.
Code of Federal Regulations, 2014 CFR
2014-01-01
... AND GENERAL OPERATING RULES SPECIAL AIR TRAFFIC RULES High Density Traffic Airports § 93.121 Applicability. This subpart designates high density traffic airports and prescribes air traffic rules for...
14 CFR 93.121 - Applicability.
Code of Federal Regulations, 2011 CFR
2011-01-01
... AND GENERAL OPERATING RULES SPECIAL AIR TRAFFIC RULES High Density Traffic Airports § 93.121 Applicability. This subpart designates high density traffic airports and prescribes air traffic rules for...
14 CFR 93.121 - Applicability.
Code of Federal Regulations, 2013 CFR
2013-01-01
... AND GENERAL OPERATING RULES SPECIAL AIR TRAFFIC RULES High Density Traffic Airports § 93.121 Applicability. This subpart designates high density traffic airports and prescribes air traffic rules for...
NASA Technical Reports Server (NTRS)
Haley, Paul
1991-01-01
The C Language Integrated Production System (CLIPS) cannot effectively perform sound and complete logical inference in most real-world contexts. The problem facing CLIPS is its lack of goal generation. Without automatic goal generation and maintenance, forward chaining can only deduce all instances of a relationship. Backward chaining, which requires goal generation, allows deduction of only that subset of what is logically true which is also relevant to ongoing problem solving. Goal generation can be mimicked in simple cases using forward chaining. However, such mimicry requires manually coding additional rules that assert an adequate goal representation for every condition, in every rule, that can have corresponding facts derived by backward chaining. In general, for N rules with an average of M conditions per rule, the number of goal-generation rules required is on the order of N*M. This is clearly intractable from a program maintenance perspective. We describe Eclipse's support for backward chaining, in which goals are automatically asserted as rule conditions are checked. Important characteristics of this extension are that it does not assert goals which cannot match any rule conditions, that two equivalent goals are never asserted, and that goals persist as long as, but no longer than, they remain relevant.
Transition sum rules in the shell model
NASA Astrophysics Data System (ADS)
Lu, Yi; Johnson, Calvin W.
2018-03-01
Sum rules are an important characterization of electromagnetic and weak transitions in atomic nuclei. We focus on the non-energy-weighted sum rule (NEWSR), or total strength, and the energy-weighted sum rule (EWSR); the ratio of the EWSR to the NEWSR is the centroid or average energy of transition strengths from a nuclear initial state to all allowed final states. These sum rules can be expressed as expectation values of operators, which in the case of the EWSR is a double commutator. While most prior applications of the double commutator have been to special cases, we derive general formulas for matrix elements of both operators in a shell model framework (occupation space), given the input matrix elements for the nuclear Hamiltonian and for the transition operator. With these new formulas, we easily evaluate centroids of transition strength functions, with no need to calculate daughter states. We apply this simple tool to a number of nuclides and demonstrate that the sum rules follow smooth secular behavior as a function of initial energy, as well as compare the electric dipole (E1) sum rule against the famous Thomas-Reiche-Kuhn version. We also find surprising systematic behaviors for ground-state electric quadrupole (E2) centroids in the sd shell.
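Given a discrete set of transition strengths, the NEWSR, EWSR, and their centroid ratio follow directly from their definitions. A minimal numerical sketch with invented strengths (not data from the paper):

```python
import numpy as np

# Transition strengths B_i to final states at excitation energies E_i
# (illustrative numbers, not from the paper).
E = np.array([1.0, 2.0, 4.0])   # transition energies (MeV)
B = np.array([0.5, 0.3, 0.2])   # transition strengths

newsr = B.sum()           # non-energy-weighted sum rule: total strength
ewsr = (E * B).sum()      # energy-weighted sum rule
centroid = ewsr / newsr   # average energy of the transition strengths
print(newsr, ewsr, centroid)
```

The point of the paper's operator formulas is that these two sums can be obtained as expectation values in the initial state alone, without ever enumerating the final states summed over here.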
Route choice in mountain navigation, Naismith's rule, and the equivalence of distance and climb.
Scarf, Philip
2007-04-01
In this paper, I consider decision making about routes in mountain navigation. In particular, I discuss Naismith's rule, a method of calculating journey times in mountainous terrain, and its use for route choice. The rule is essentially concerned with the equivalence, in terms of time duration, between climb or ascent and distance travelled. Naismith himself described a rule that is purported to be based on trigonometry and simple assumptions about rate of ascent; his rule with regard to hill-walking implies that 1 m of ascent is equivalent to 7.92 m of horizontal travel (1:7.92). The analysis of data on fell running records presented here supports Naismith's rule and it is recommended that male runners and walkers use a 1:8 equivalence ratio and females a 1:10 ratio. The present findings are contrasted with those based on the analysis of data relating to treadmill running experiments (1:3.3), and with those based on the analysis of times for a mountain road-relay (1:4.4). Analysis of cycling data suggests a similar rule (1:8.2) for cycling on mountainous roads and tracks.
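Naismith's rule reduces to a single effective-distance formula: horizontal distance plus an equivalence ratio times the climb. A sketch using the recommended 1:8 ratio and an assumed 5 km/h walking pace (the classic Naismith speed; both parameters are adjustable):

```python
def naismith_time(distance_m, ascent_m, speed_kmh=5.0, equiv_ratio=8.0):
    """Journey time in hours: horizontal distance plus an equivalent
    distance for the climb (1 m of ascent ~ equiv_ratio m of travel)."""
    effective_m = distance_m + equiv_ratio * ascent_m
    return effective_m / (speed_kmh * 1000.0)

# A 10 km walk with 600 m of ascent at 5 km/h, 1:8 equivalence:
print(round(naismith_time(10000, 600), 2))  # 2.96 hours
```

For route choice, the same formula lets a navigator compare a longer flat route against a shorter steep one by converting both to effective distance.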
The relevance of a rules-based maize marketing policy: an experimental case study of Zambia.
Abbink, Klaus; Jayne, Thomas S; Moller, Lars C
2011-01-01
Strategic interaction between public and private actors is increasingly recognised as an important determinant of agricultural market performance in Africa and elsewhere. Trust and consultation tend to positively affect private activity, while uncertainty of government behaviour impedes it. This paper reports on a laboratory experiment based on a stylised model of the Zambian maize market. The experiment facilitates a comparison between discretionary interventionism and a rules-based policy in which the government pre-commits itself to a future course of action. A simple precommitment rule can, in theory, overcome the prevailing strategic dilemma by encouraging private sector participation. Although this result is also borne out in the economic experiment, the improvement in private sector activity is surprisingly small and not statistically significant due to irrationally cautious choices by experimental governments. Encouragingly, a rules-based policy promotes a much more stable market outcome, thereby substantially reducing the risk of severe food shortages. These results underscore the importance of predictable and transparent rules for the state's involvement in agricultural markets.
NASA Technical Reports Server (NTRS)
Berk, A.; Temkin, A.
1985-01-01
A sum rule is derived for the auxiliary eigenvalues of an equation whose eigenspectrum pertains to projection operators which describe electron scattering from multielectron atoms and ions. The sum rule's right-hand side depends on an integral involving the target system eigenfunctions. The sum rule is checked for several approximations of the two-electron target. It is shown that target functions which have a unit eigenvalue in their auxiliary eigenspectrum do not give rise to well-defined projection operators except through a limiting process. For Hylleraas target approximations, the auxiliary equations are shown to contain an infinite spectrum. However, using a Rayleigh-Ritz variational principle, it is shown that a comparatively simple approximation can exhaust the sum rule to better than five significant figures. The auxiliary Hylleraas equation is greatly simplified by conversion to a square root equation containing the same eigenfunction spectrum and from which the required eigenvalues are trivially recovered by squaring.
Mock jury trials in Taiwan--paving the ground for introducing lay participation.
Huang, Kuo-Chang; Lin, Chang-Ching
2014-08-01
The first mock jury study in Taiwan, in which 279 community members watched a videotaped trial, investigated how jurors' estimates of the relative undesirability of wrongful conviction versus wrongful acquittal predicted individual decisions and how decision rules affected outcomes. The percentage of jurors who viewed wrongful conviction as more undesirable increased from 50.9% to 60.9% after deliberation, and jurors' postdeliberation acquittal rate (71.7%) was higher than the predeliberation acquittal rate (58.8%). Jurors' estimates of the undesirability of wrongful conviction were not correlated with their predeliberation votes but became positively correlated with their postdeliberation decisions. The unanimous rule facilitated jurors' change of vote, predominantly from conviction to acquittal, more than the simple majority rule did. Jurors reaching a verdict under the unanimous rule viewed deliberation and the verdict more positively. This study indicates that deliberation can ameliorate the problem of most Taiwanese citizens not viewing wrongful conviction as more undesirable than wrongful acquittal. It also suggests that Taiwan should adopt a unanimous rule for its proposed lay participation system.
Zhang, Wenyu; Zhang, Zhenjiang
2015-01-01
Decision fusion in sensor networks enables sensors to improve classification accuracy while reducing the energy consumption and bandwidth demand for data transmission. In this paper, we focus on the decentralized multi-class classification fusion problem in wireless sensor networks (WSNs) and a new simple but effective decision fusion rule based on belief function theory is proposed. Unlike existing belief function based decision fusion schemes, the proposed approach is compatible with any type of classifier because the basic belief assignments (BBAs) of each sensor are constructed on the basis of the classifier’s training output confusion matrix and real-time observations. We also derive explicit global BBA in the fusion center under Dempster’s combinational rule, making the decision making operation in the fusion center greatly simplified. Also, sending the whole BBA structure to the fusion center is avoided. Experimental results demonstrate that the proposed fusion rule has better performance in fusion accuracy compared with the naïve Bayes rule and weighted majority voting rule. PMID:26295399
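The core combination step the abstract builds on is Dempster's rule. A minimal sketch over a two-class frame; the masses below are invented for illustration and are not derived from any classifier's confusion matrix as in the paper:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic belief assignments (dicts mapping frozenset -> mass)
    with Dempster's rule: conjunctive combination with conflict normalized out."""
    combined = {}
    conflict = 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        C = A & B
        if C:
            combined[C] = combined.get(C, 0.0) + a * b
        else:
            conflict += a * b           # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {C: v / (1.0 - conflict) for C, v in combined.items()}

# Two sensors reporting over classes {x, y}; 'xy' is the ignorance mass.
X, Y, XY = frozenset("x"), frozenset("y"), frozenset("xy")
m1 = {X: 0.6, Y: 0.1, XY: 0.3}
m2 = {X: 0.5, Y: 0.2, XY: 0.3}
fused = dempster_combine(m1, m2)
print(max(fused, key=fused.get))  # frozenset({'x'}): both sensors lean toward x
```

The paper's contribution is computing a closed-form global BBA at the fusion center rather than transmitting and combining full BBA structures sensor by sensor; the sketch shows only the pairwise rule itself.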
Caccavale, Justin; Fiumara, David; Stapf, Michael; Sweitzer, Liedeke; Anderson, Hannah J; Gorky, Jonathan; Dhurjati, Prasad; Galileo, Deni S
2017-12-11
Glioblastoma multiforme (GBM) is a devastating brain cancer for which there is no known cure. Its malignancy is due to rapid cell division along with high motility and invasiveness of cells into the brain tissue. Simple 2-dimensional laboratory assays (e.g., a scratch assay) are commonly used to measure the effects of various experimental perturbations, such as treatment with chemical inhibitors. Several mathematical models have been developed to aid the understanding of the motile behavior and proliferation of GBM cells. However, many are mathematically complicated, look at multiple interdependent phenomena, and/or use modeling software not freely available to the research community. These attributes make the adoption of models and simulations of even simple 2-dimensional cell behavior an uncommon practice by cancer cell biologists. Herein, we developed an accurate yet simple rule-based modeling framework to describe the in vitro behavior of GBM cells that are stimulated by the L1CAM protein, using freely available NetLogo software. In our model, L1CAM is released by cells to act through two cell surface receptors and a point of signaling convergence to increase cell motility and proliferation. A simple graphical interface is provided so that changes can be made easily to several parameters controlling cell behavior, and behavior of the cells is viewed both pictorially and with dedicated graphs. We fully describe the hierarchical rule-based modeling framework, show simulation results under several settings, describe the accuracy compared to experimental data, and discuss the potential usefulness for predicting future experimental outcomes and for use as a teaching tool for cell biology students. It is concluded that this simple modeling framework and its simulations accurately reflect much of the GBM cell motility behavior observed experimentally in vitro. Our framework can be modified easily to suit the needs of investigators interested in other similar intrinsic or extrinsic stimuli that influence cancer or other cell behavior. This modeling framework of a commonly used experimental motility assay (scratch assay) should be useful to both researchers of cell motility and students in a cell biology teaching laboratory.
Hyper-heuristic Evolution of Dispatching Rules: A Comparison of Rule Representations.
Branke, Jürgen; Hildebrandt, Torsten; Scholz-Reiter, Bernd
2015-01-01
Dispatching rules are frequently used for real-time, online scheduling in complex manufacturing systems. Design of such rules is usually done by experts in a time consuming trial-and-error process. Recently, evolutionary algorithms have been proposed to automate the design process. There are several possibilities to represent rules for this hyper-heuristic search. Because the representation determines the search neighborhood and the complexity of the rules that can be evolved, a suitable choice of representation is key for a successful evolutionary algorithm. In this paper we empirically compare three different representations, both numeric and symbolic, for automated rule design: A linear combination of attributes, a representation based on artificial neural networks, and a tree representation. Using appropriate evolutionary algorithms (CMA-ES for the neural network and linear representations, genetic programming for the tree representation), we empirically investigate the suitability of each representation in a dynamic stochastic job shop scenario. We also examine the robustness of the evolved dispatching rules against variations in the underlying job shop scenario, and visualize what the rules do, in order to get an intuitive understanding of their inner workings. Results indicate that the tree representation using an improved version of genetic programming gives the best results if many candidate rules can be evaluated, closely followed by the neural network representation that already leads to good results for small to moderate computational budgets. The linear representation is found to be competitive only for extremely small computational budgets.
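The linear-combination representation compared above can be sketched in a few lines: a priority index computed as a weighted sum of job attributes, with the highest-priority job dispatched next. The weights and attributes here are illustrative stand-ins, not evolved values from the paper:

```python
# A dispatching rule represented as a linear combination of job attributes,
# the numeric representation the paper optimizes with CMA-ES.
# Negative weights favor short jobs and early due dates; a positive weight
# on waiting time prevents starvation. All values are illustrative.
def priority(job, weights=(-1.0, -0.5, 0.2)):
    w_pt, w_dd, w_wait = weights
    return (w_pt * job["proc_time"]
            + w_dd * job["due_date"]
            + w_wait * job["waiting"])

queue = [
    {"id": "a", "proc_time": 5.0, "due_date": 20.0, "waiting": 3.0},
    {"id": "b", "proc_time": 2.0, "due_date": 30.0, "waiting": 1.0},
    {"id": "c", "proc_time": 9.0, "due_date": 10.0, "waiting": 8.0},
]
# Dispatch the job with the highest (least negative) priority next.
next_job = max(queue, key=priority)
print(next_job["id"])  # c: its tight due date and long wait dominate
```

A hyper-heuristic search over this representation adjusts only the weight vector; the tree representation the paper finds best instead evolves the arithmetic structure of the priority expression itself.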
78 FR 67467 - Registration of Municipal Advisors
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-12
... the Exchange Act. These rules and forms are designed to give effect to provisions of Title IX of the... ``investment strategies'' in the final rule is designed to address the main concerns raised by these commenters... state, and provide tax advantages designed to encourage saving for future college costs.\\54\\ 529 Savings...
An information theory account of late frontoparietal ERP positivities in cognitive control.
Barceló, Francisco; Cooper, Patrick S
2018-03-01
ERP research on task switching has revealed distinct transient and sustained positive waveforms (latency circa 300-900 ms) while shifting task rules or stimulus-response (S-R) mappings. However, it remains unclear whether such switch-related positivities show similar scalp topography and index context-updating mechanisms akin to those posed for domain-general (i.e., classic P300) positivities in many task domains. To examine this question, ERPs were recorded from 31 young adults (18-30 years) while they were intermittently cued to switch or repeat their perceptual categorization of Gabor gratings varying in color and thickness (switch task), or else they performed two visually identical control tasks (go/no-go and oddball). Our task cueing paradigm examined two temporarily distinct stages of proactive rule updating and reactive rule execution. A simple information theory model helped us gauge cognitive demands under distinct temporal and task contexts in terms of low-level S-R pathways and higher-order rule updating operations. Task demands modulated domain-general (indexed by classic oddball P3) and switch positivities-indexed by both a cue-locked late positive complex and a sustained positivity ensuing task transitions. Topographic scalp analyses confirmed subtle yet significant split-second changes in the configuration of neural sources for both domain-general P3s and switch positivities as a function of both the temporal and task context. These findings partly meet predictions from information estimates, and are compatible with a family of P3-like potentials indexing functionally distinct neural operations within a common frontoparietal "multiple demand" system during the preparation and execution of simple task rules. © 2016 Society for Psychophysiological Research.
Development of clinical decision rules to predict recurrent shock in dengue
2013-01-01
Introduction Mortality from dengue infection is mostly due to shock. Among dengue patients with shock, approximately 30% have recurrent shock that requires a treatment change. Here, we report development of a clinical rule for use during a patient’s first shock episode to predict a recurrent shock episode. Methods The study was conducted in Center for Preventive Medicine in Vinh Long province and the Children’s Hospital No. 2 in Ho Chi Minh City, Vietnam. We included 444 dengue patients with shock, 126 of whom had recurrent shock (28%). Univariate and multivariate analyses and a preprocessing method were used to evaluate and select 14 clinical and laboratory signs recorded at shock onset. Five variables (admission day, purpura/ecchymosis, ascites/pleural effusion, blood platelet count and pulse pressure) were finally trained and validated by a 10-fold validation strategy with 10 times of repetition, using a logistic regression model. Results The results showed that shorter admission day (fewer days prior to admission), purpura/ecchymosis, ascites/pleural effusion, low platelet count and narrow pulse pressure were independently associated with recurrent shock. Our logistic prediction model was capable of predicting recurrent shock when compared to the null method (P < 0.05) and was not outperformed by other prediction models. Our final scoring rule provided relatively good accuracy (AUC, 0.73; sensitivity and specificity, 68%). Score points derived from the logistic prediction model revealed identical accuracy with AUCs at 0.73. Using a cutoff value greater than −154.5, our simple scoring rule showed a sensitivity of 68.3% and a specificity of 68.2%. Conclusions Our simple clinical rule is not to replace clinical judgment, but to help clinicians predict recurrent shock during a patient’s first dengue shock episode. PMID:24295509
Reasoning with alternative explanations in physics: The cognitive accessibility rule
NASA Astrophysics Data System (ADS)
Heckler, Andrew F.; Bogdan, Abigail M.
2018-06-01
A critical component of scientific reasoning is the consideration of alternative explanations. Recognizing that decades of cognitive psychology research have demonstrated that relative cognitive accessibility, or "what comes to mind," strongly affects how people reason in a given context, we articulate a simple "cognitive accessibility rule", namely that alternative explanations are considered less frequently when an explanation with relatively high accessibility is offered first. In a series of four experiments, we test the cognitive accessibility rule in the context of consideration of alternative explanations for six physical scenarios commonly found in introductory physics curricula. First, we administer free recall and recognition tasks to operationally establish and distinguish between the relative accessibility and availability of common explanations for the physical scenarios. Then, we offer either high or low accessibility explanations for the physical scenarios and determine the extent to which students consider alternatives to the given explanations. We find two main results consistent across algebra- and calculus-based university level introductory physics students for multiple answer formats. First, we find evidence that, at least for some contexts, most explanatory factors are cognitively available to students but not cognitively accessible. Second, we empirically verify the cognitive accessibility rule and demonstrate that the rule is strongly predictive, accounting for up to 70% of the variance of the average student consideration of alternative explanations across scenarios. 
Overall, we find that cognitive accessibility can help to explain biases in the consideration of alternatives in reasoning about simple physical scenarios, and these findings lend support to the growing number of science education studies demonstrating that tasks relevant to science education curricula often involve rapid, automatic, and potentially predictable processes and outcomes.
Deficits in Category Learning in Older Adults: Rule-Based Versus Clustering Accounts
2017-01-01
Memory research has long been one of the key areas of investigation for cognitive aging researchers but only in the last decade or so has categorization been used to understand age differences in cognition. Categorization tasks focus more heavily on the grouping and organization of items in memory, and often on the process of learning relationships through trial and error. Categorization studies allow researchers to more accurately characterize age differences in cognition: whether older adults show declines in the way in which they represent categories with simple rules or declines in representing categories by similarity to past examples. In the current study, young and older adults participated in a set of classic category learning problems, which allowed us to distinguish between three hypotheses: (a) rule-complexity: categories were represented exclusively with rules and older adults had differential difficulty when more complex rules were required, (b) rule-specific: categories could be represented either by rules or by similarity, and there were age deficits in using rules, and (c) clustering: similarity was mainly used and older adults constructed a less-detailed representation by lumping more items into fewer clusters. The ordinal levels of performance across different conditions argued against rule-complexity, as older adults showed greater deficits on less complex categories. The data also provided evidence against rule-specificity, as single-dimensional rules could not explain age declines. Instead, computational modeling of the data indicated that older adults utilized fewer conceptual clusters of items in memory than did young adults. PMID:28816474
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-26
... Organizations; National Stock Exchange, Inc.; Notice of Designation of a Longer Period for Commission Action on Proposed Rule Change To Adopt a New Order Type Called the ``Auto-Ex Only'' Order March 19, 2013. On January... (``Act'') \\1\\ and Rule 19b-4 thereunder,\\2\\ a proposed rule change to adopt a new order type called the...
Cuskley, Christine F.; Pugliese, Martina; Castellano, Claudio; Colaiori, Francesca; Loreto, Vittorio; Tria, Francesca
2014-01-01
Human languages are rule governed, but almost invariably these rules have exceptions in the form of irregularities. Since rules in language are efficient and productive, the persistence of irregularity is an anomaly. How does irregularity linger in the face of internal (endogenous) and external (exogenous) pressures to conform to a rule? Here we address this problem by taking a detailed look at simple past tense verbs in the Corpus of Historical American English. The data show that the language is open, with many new verbs entering. At the same time, existing verbs might tend to regularize or irregularize as a consequence of internal dynamics, but overall, the amount of irregularity sustained by the language stays roughly constant over time. Despite continuous vocabulary growth, and presumably, an attendant increase in expressive power, there is no corresponding growth in irregularity. We analyze the set of irregulars, showing they may adhere to a set of minority rules, allowing for increased stability of irregularity over time. These findings contribute to the debate on how language systems become rule governed, and how and why they sustain exceptions to rules, providing insight into the interplay between the emergence and maintenance of rules and exceptions in language. PMID:25084006
Webber, C J
2001-05-01
This article shows analytically that single-cell learning rules that give rise to oriented and localized receptive fields, when their synaptic weights are randomly and independently initialized according to a plausible assumption of zero prior information, will generate visual codes that are invariant under two-dimensional translations, rotations, and scale magnifications, provided that the statistics of their training images are sufficiently invariant under these transformations. Such codes span different image locations, orientations, and size scales with equal economy. Thus, single-cell rules could account for the spatial scaling property of the cortical simple-cell code. This prediction is tested computationally by training with natural scenes; it is demonstrated that a single-cell learning rule can give rise to simple-cell receptive fields spanning the full range of orientations, image locations, and spatial frequencies (except at the extreme high and low frequencies at which the scale invariance of the statistics of digitally sampled images must ultimately break down, because of the image boundary and the finite pixel resolution). Thus, no constraint on completeness, or any other coupling between cells, is necessary to induce the visual code to span wide ranges of locations, orientations, and size scales. This prediction is made using the theory of spontaneous symmetry breaking, which we have previously shown can also explain the data-driven self-organization of a wide variety of transformation invariances in neurons' responses, such as the translation invariance of complex cell response.
Liu, Zhao; Zhu, Yunhong; Wu, Chenxue
2016-01-01
Spatial-temporal k-anonymity has become a mainstream approach among techniques for protection of users’ privacy in location-based services (LBS) applications, and has been applied to several variants such as LBS snapshot queries and continuous queries. Analyzing large-scale spatial-temporal anonymity sets may benefit several LBS applications. In this paper, we propose two location prediction methods based on transition probability matrices constructed from sequential rules for spatial-temporal k-anonymity datasets. First, we define single-step sequential rules mined from sequential spatial-temporal k-anonymity datasets generated from continuous LBS queries for multiple users. We then construct transition probability matrices from the mined single-step sequential rules, and normalize the transition probabilities in the transition matrices. Next, we regard a mobility model for an LBS requester as a stationary stochastic process and compute the n-step transition probability matrices by raising the normalized transition probability matrices to the power n. Furthermore, we propose two location prediction methods: rough prediction and accurate prediction. The former achieves the probabilities of arriving at target locations along simple paths that include only current locations, target locations, and transition steps. By iteratively combining the probabilities for simple paths with n steps and the probabilities for detailed paths with n-1 steps, the latter method calculates transition probabilities for detailed paths with n steps from current locations to target locations. Finally, we conduct extensive experiments, and the correctness and flexibility of our proposed algorithm have been verified. PMID:27508502
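The n-step construction the abstract describes (row-normalize a single-step matrix, then raise it to the power n) can be sketched as follows; the transition counts are invented for illustration:

```python
import numpy as np

# Single-step transition counts between three locations, e.g. mined from
# sequential rules over an anonymity dataset (illustrative numbers).
counts = np.array([[8., 2., 0.],
                   [1., 6., 3.],
                   [0., 4., 6.]])

# Row-normalize to get the one-step transition probability matrix.
P = counts / counts.sum(axis=1, keepdims=True)

# For a stationary stochastic process, the n-step transition matrix is
# simply the n-th matrix power of the one-step matrix.
n = 3
Pn = np.linalg.matrix_power(P, n)
print(Pn[0])  # probabilities of each location 3 steps after location 0
```

Row `Pn[i]` gives the "rough prediction" quantities: the chance of arriving at each target location n steps after leaving location i, without tracking the intermediate path.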
Multiagent optimization system for solving the traveling salesman problem (TSP).
Xie, Xiao-Feng; Liu, Jiming
2009-04-01
The multiagent optimization system (MAOS) is a nature-inspired method that supports cooperative search through the self-organization of a group of compact agents situated in an environment with certain shared public knowledge. Moreover, each agent in MAOS is an autonomous entity with personal declarative memory and behavioral components. In this paper, MAOS is refined for solving the traveling salesman problem (TSP), a classic hard computational problem. Based on a simplified MAOS version, in which each agent manipulates extremely limited declarative knowledge, some simple and efficient components for solving TSP, including two improving heuristics based on a generalized edge assembly recombination, are implemented. Compared with metaheuristics in adaptive memory programming, MAOS is particularly suitable for supporting cooperative search. The experimental results on two TSP benchmark data sets show that MAOS is competitive with some state-of-the-art algorithms, including Lin-Kernighan-Helsgaun, IBGLK, and PHGA, although MAOS does not use any explicit local search during the runtime. The contributions of MAOS components are investigated; certain clues can support suitable selections before time-consuming computation. More importantly, the cooperative search of agents can achieve good overall performance with a macro rule in switch mode, which deploys alternate search rules whose offline performances are negatively correlated. Using simple alternate rules may avoid the difficulty of seeking an omnipotent rule that is efficient for a large data set.
Analysis of pre-service physics teacher skills designing simple physics experiments based technology
NASA Astrophysics Data System (ADS)
Susilawati; Huda, C.; Kurniawan, W.; Masturi; Khoiri, N.
2018-03-01
Pre-service physics teachers' skill in designing simple experiment sets is very important for deepening students' conceptual understanding and for practicing scientific skills in the laboratory. This study describes the skills of physics students in designing simple technology-based experiments. The experimental design stages include simple tool design and sensor modification. The research method is descriptive, with a sample of 25 students and five variations of simple physics experimental design. Based on the results of interviews and observations, the pre-service physics teachers' skill in designing simple technology-based physics experiments is good. Observations show that their skill in designing simple experiments is good, while sensor modification and application are still weak. This suggests that pre-service physics teachers still need considerable practice in designing physics experiments using sensor modifications. The interviews indicate that students are highly motivated to participate actively in laboratory activities and are very curious to become skilled at making simple practicum tools for physics experiments.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-24
... Rule 104 To Adopt Pricing Obligations for Designated Market Makers September 20, 2010. Pursuant to... of the Proposed Rule Change The Exchange proposes to amend Rule 104 to adopt pricing obligations for.... Purpose The Exchange proposes to amend Rule 104 to adopt pricing obligations for DMMs. Under the proposal...
Foxes and Rabbits - and a Spreadsheet.
ERIC Educational Resources Information Center
Carson, S. R.
1996-01-01
Presents a numerical simulation of a simple food chain together with a set of mathematical rules generalizing the model to a food web of any complexity. Discusses some of the model's interesting features and its use by students. (Author/JRH)
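The kind of spreadsheet rules such a food-chain model uses can be written as a two-line difference-equation system; the parameter values below are illustrative, not taken from the article. Rabbits grow and are eaten in proportion to encounters with foxes; foxes grow from those encounters and otherwise die off:

```python
def step(rabbits, foxes, r=0.1, a=0.002, b=0.0001, d=0.05):
    """One time step of a simple predator-prey food chain:
    r = rabbit growth rate, a = predation rate,
    b = fox gain per encounter, d = fox death rate."""
    new_rabbits = rabbits + r * rabbits - a * rabbits * foxes
    new_foxes = foxes + b * rabbits * foxes - d * foxes
    return new_rabbits, new_foxes

# Iterate the rule, exactly as successive spreadsheet rows would.
rabbits, foxes = 600.0, 50.0
history = [(rabbits, foxes)]
for _ in range(100):
    rabbits, foxes = step(rabbits, foxes)
    history.append((rabbits, foxes))
```

Extending the model to a food web just means adding one such update rule per species, with coupling terms for each predator-prey link.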
Di Legge, A; Testa, A C; Ameye, L; Van Calster, B; Lissoni, A A; Leone, F P G; Savelli, L; Franchi, D; Czekierdowski, A; Trio, D; Van Holsbeke, C; Ferrazzi, E; Scambia, G; Timmerman, D; Valentin, L
2012-09-01
To estimate the ability to discriminate between benign and malignant adnexal masses of different size using: subjective assessment, two International Ovarian Tumor Analysis (IOTA) logistic regression models (LR1 and LR2), the IOTA simple rules and the risk of malignancy index (RMI). We used a multicenter IOTA database of 2445 patients with at least one adnexal mass, i.e. the database previously used to prospectively validate the diagnostic performance of LR1 and LR2. The masses were categorized into three subgroups according to their largest diameter: small tumors (diameter < 4 cm; n = 396), medium-sized tumors (diameter, 4-9.9 cm; n = 1457) and large tumors (diameter ≥ 10 cm; n = 592). Subjective assessment, LR1 and LR2, IOTA simple rules and the RMI were applied to each of the three groups. Sensitivity, specificity, positive and negative likelihood ratio (LR+, LR-), diagnostic odds ratio (DOR) and area under the receiver-operating characteristics curve (AUC) were used to describe diagnostic performance. A moving window technique was applied to estimate the effect of tumor size as a continuous variable on the AUC. The reference standard was the histological diagnosis of the surgically removed adnexal mass. The frequency of invasive malignancy was 10% in small tumors, 19% in medium-sized tumors and 40% in large tumors; 11% of the large tumors were borderline tumors vs 3% and 4%, respectively, of the small and medium-sized tumors. The type of benign histology also differed among the three subgroups. For all methods, sensitivity with regard to malignancy was lowest in small tumors (56-84% vs 67-93% in medium-sized tumors and 74-95% in large tumors) while specificity was lowest in large tumors (60-87% vs 83-95% in medium-sized tumors and 83-96% in small tumors). The DOR and the AUC value were highest in medium-sized tumors and the AUC was largest in tumors with a largest diameter of 7-11 cm.
Tumor size affects the performance of subjective assessment, LR1 and LR2, the IOTA simple rules and the RMI in discriminating correctly between benign and malignant adnexal masses. The likely explanation, at least in part, is the difference in histology among tumors of different size. Copyright © 2012 ISUOG. Published by John Wiley & Sons, Ltd.
Explicit bounds for the positive root of classes of polynomials with applications
NASA Astrophysics Data System (ADS)
Herzberger, Jürgen
2003-03-01
We consider a certain type of polynomial equation for which, according to Descartes' rule of signs, there exists only one simple positive root. Such equations occur in numerical analysis when calculating or estimating the R-order or Q-order of convergence of certain iterative processes with an error recursion of special form. They are also very common as defining equations for the effective rate of return of certain cashflows, such as bonds or annuities, in finance. The effective rate of interest i* for those cashflows is i* = q* - 1, where q* is the unique positive root of such a polynomial. We construct bounds for i* for a special problem concerning an ordinary simple annuity, obtained by changing the conditions of such an annuity with given data applying the German rule (Preisangabeverordnung, or PAngV for short). Moreover, we consider a number of results for such polynomial roots in numerical analysis, showing that a simple variable transformation allows several formulas to be derived from earlier results. The same transformation can be used in finance to generalize results to more complicated cashflows.
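When Descartes' rule of signs guarantees a single sign change in the coefficients, the unique positive root can be bracketed and found by bisection. The polynomial q^3 - q - 1 below is a standalone illustration (one sign change, root q* ≈ 1.3247), not one of the paper's annuity equations:

```python
def bisect_positive_root(f, lo=0.0, hi=1.0, tol=1e-12):
    """Unique positive root of f, assuming f(0) < 0 and exactly one sign
    change on (0, inf) -- the situation Descartes' rule guarantees here."""
    while f(hi) < 0:          # expand the bracket until the sign changes
        lo, hi = hi, 2 * hi
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# q^3 - q - 1 has one coefficient sign change, hence one positive root.
# For a rate-of-return polynomial, the effective interest would be i* = q* - 1.
q_star = bisect_positive_root(lambda q: q**3 - q - 1)
```

Explicit a-priori bounds of the kind the paper derives would let one start with a much tighter bracket, saving iterations.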
A simple rule for the costs of vigilance: empirical evidence from a social forager.
Cowlishaw, Guy; Lawes, Michael J.; Lightbody, Margaret; Martin, Alison; Pettifor, Richard; Rowcliffe, J. Marcus
2004-01-01
It is commonly assumed that anti-predator vigilance by foraging animals is costly because it interrupts food searching and handling time, leading to a reduction in feeding rate. When food handling does not require visual attention, however, a forager may handle food while simultaneously searching for the next food item or scanning for predators. We present a simple model of this process, showing that when the length of such compatible handling time Hc is long relative to search time S, specifically Hc/S > 1, it is possible to perform vigilance without a reduction in feeding rate. We test three predictions of this model regarding the relationships between feeding rate, vigilance and the Hc/S ratio, with data collected from a wild population of social foragers (samango monkeys, Cercopithecus mitis erythrarchus). These analyses consistently support our model, including our key prediction: as Hc/S increases, the negative relationship between feeding rate and the proportion of time spent scanning becomes progressively shallower. This pattern is more strongly driven by changes in median scan duration than scan frequency. Our study thus provides a simple rule that describes the extent to which vigilance can be expected to incur a feeding rate cost. PMID:15002768
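The qualitative rule can be sketched with a hypothetical formulation (not the paper's exact model): if handling the previous item can overlap with searching for, or scanning instead of searching for, the next item, the per-item cycle time is the longer of the two activities, so scanning is free until it exhausts the slack in compatible handling time:

```python
def feeding_rate(S, Hc, scan):
    """Feeding rate (items per unit time) with compatible handling:
    S    = search time per item,
    Hc   = compatible handling time per item,
    scan = vigilance time per item, performed during the search phase.
    Cycle time is the longer of (search + scanning) and handling."""
    return 1.0 / max(S + scan, Hc)

# When Hc/S > 1 there is slack handling time, so moderate scanning is free:
r_no_scan = feeding_rate(S=1.0, Hc=2.0, scan=0.0)   # cycle = 2.0
r_scan    = feeding_rate(S=1.0, Hc=2.0, scan=0.5)   # still cycle = 2.0
r_costly  = feeding_rate(S=1.0, Hc=2.0, scan=1.5)   # now cycle = 2.5
```

This reproduces the key prediction: the larger Hc/S, the more scanning can be absorbed before feeding rate starts to fall.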
Kato, Ryuji; Nakano, Hideo; Konishi, Hiroyuki; Kato, Katsuya; Koga, Yuchi; Yamane, Tsuneo; Kobayashi, Takeshi; Honda, Hiroyuki
2005-08-19
To engineer proteins with desirable characteristics from a naturally occurring protein, high-throughput screening (HTS) combined with a directed evolution approach is the essential technology. However, most HTS techniques are simple positive screenings. The information obtained from the positive candidates is used only as results and rarely as clues for understanding the structural rules that may explain the protein activity. Here, we have attempted to establish a novel strategy for exploring functional proteins with the aid of computational analysis. As a model case, we explored lipases with inverted enantioselectivity for the substrate p-nitrophenyl 3-phenylbutyrate, starting from the wild-type lipase of Burkholderia cepacia KWI-56, which is originally selective for the (S)-configuration of the substrate. Data from our previous work on (R)-enantioselective lipase screening were analyzed with a fuzzy neural network (FNN), a bioinformatic algorithm, to extract guidelines for the subsequent screening and engineering processes. FNN has the advantageous feature of extracting hidden rules that relate the sequences of variants to their enzyme activity, yielding high prediction accuracy. Without any prior knowledge, FNN predicted a rule indicating that "size at position L167," among four positions (L17, F119, L167, and L266) in the substrate-binding core region, is the most influential factor for obtaining a lipase with inverted (R)-enantioselectivity. Based on the guidelines obtained, newly engineered variants not found in the actual screening were experimentally proven to gain high (R)-enantioselectivity by engineering the size at position L167. We also designed and assayed two novel variants, FIGV (L17F, F119I, L167G, and L266V) and FFGI (L17F, L167G, and L266I), which were compatible with the guideline obtained from FNN analysis, and confirmed that these designed lipases acquired high inverted enantioselectivity.
The results have shown that with the aid of bioinformatic analysis, high-throughput screening can expand its potential for exploring vast combinatorial sequence spaces of proteins.
Communication: The H2@C60 inelastic neutron scattering selection rule: Expanded and explained
NASA Astrophysics Data System (ADS)
Poirier, Bill
2015-09-01
Recently [M. Xu et al., J. Chem. Phys. 139, 064309 (2013)], an unexpected selection rule was discovered for the title system, contradicting the previously held belief that inelastic neutron scattering (INS) is not subject to any selection rules. Moreover, the newly predicted forbidden transitions, which emerge only in the context of coupled H2 translation-rotation (TR) dynamics, have been confirmed experimentally. However, a simple physical understanding, e.g., based on group theory, has been heretofore lacking. This is provided in the present paper, in which we (1) derive the correct symmetry group for the H2@C60 TR Hamiltonian and eigenstates; (2) complete the INS selection rule, and show that the set of forbidden transitions is actually much larger than previously believed; and (3) evaluate previous theoretical and experimental results, in light of the new findings.
Connecting clinical and actuarial prediction with rule-based methods.
Fokkema, Marjolein; Smits, Niels; Kelderman, Henk; Penninx, Brenda W J H
2015-06-01
Meta-analyses comparing the accuracy of clinical versus actuarial prediction have shown actuarial methods to outperform clinical methods, on average. However, actuarial methods are still not widely used in clinical practice, and there has been a call for the development of actuarial prediction methods for clinical practice. We argue that rule-based methods may be more useful than the linear main effect models usually employed in prediction studies, from a data and decision analytic as well as a practical perspective. In addition, decision rules derived with rule-based methods can be represented as fast and frugal trees, which, unlike main effects models, can be used in a sequential fashion, reducing the number of cues that have to be evaluated before making a prediction. We illustrate the usability of rule-based methods by applying RuleFit, an algorithm for deriving decision rules for classification and regression problems, to a dataset on prediction of the course of depressive and anxiety disorders from Penninx et al. (2011). The RuleFit algorithm provided a model consisting of 2 simple decision rules, requiring evaluation of only 2 to 4 cues. Predictive accuracy of the 2-rule model was very similar to that of a logistic regression model incorporating 20 predictor variables, originally applied to the dataset. In addition, the 2-rule model required, on average, evaluation of only 3 cues. Therefore, the RuleFit algorithm appears to be a promising method for creating decision tools that are less time consuming and easier to apply in psychological practice, and with accuracy comparable to traditional actuarial methods. (c) 2015 APA, all rights reserved).
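A two-rule model in fast-and-frugal-tree form is simply a sequence of one-cue checks, each of which can trigger an immediate prediction. The cue names and thresholds below are hypothetical illustrations, not the rules RuleFit derived in the study:

```python
def predict_chronic_course(severity, duration_months):
    """Hypothetical two-rule model as a fast-and-frugal tree:
    cues are evaluated one at a time, and each rule is an exit."""
    if severity >= 30:            # rule 1: high baseline severity
        return "chronic"
    if duration_months >= 24:     # rule 2: long prior duration
        return "chronic"
    return "favourable"           # neither rule fired
```

The sequential structure is what makes such models cheap in practice: a prediction often requires evaluating only the first cue, never more than two here.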
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-27
... Proposed Rule Change to Offer Risk Management Tools Designed to Allow Member Organizations to Monitor and... of the Proposed Rule Change The Exchange proposes to offer risk management tools designed to allow... risk management tools designed to allow member organizations to monitor and address exposure to risk...
77 FR 21161 - National Forest System Land Management Planning
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-09
... ecosystem services and multiple uses. The planning rule is designed to ensure that plans provide for the... adaptive and science-based, engages the public, and is designed to be efficient, effective, and within the..., the new rule is designed to make planning more efficient and effective. Purpose and Need for the New...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-21
... Habitat for Ivesia webberi (Webber's ivesia) AGENCY: Fish and Wildlife Service, Interior. ACTION: Proposed... dates published in the August 2, 2013, proposed rule to designate critical habitat for Ivesia webberi... rule to designate critical habitat for Ivesia webberi, we included the wrong date for the public...
Optimal Test Design with Rule-Based Item Generation
ERIC Educational Resources Information Center
Geerlings, Hanneke; van der Linden, Wim J.; Glas, Cees A. W.
2013-01-01
Optimal test-design methods are applied to rule-based item generation. Three different cases of automated test design are presented: (a) test assembly from a pool of pregenerated, calibrated items; (b) test generation on the fly from a pool of calibrated item families; and (c) test generation on the fly directly from calibrated features defining…
Design space exploration for early identification of yield limiting patterns
NASA Astrophysics Data System (ADS)
Li, Helen; Zou, Elain; Lee, Robben; Hong, Sid; Liu, Square; Wang, JinYan; Du, Chunshan; Zhang, Recco; Madkour, Kareem; Ali, Hussein; Hsu, Danny; Kabeel, Aliaa; ElManhawy, Wael; Kwan, Joe
2016-03-01
To resolve the causality dilemma of which comes first, accurate design rules or real designs, this paper presents a flow for exploring the layout design space to identify, early, problematic patterns that will negatively affect yield. A new random layout generation method called Layout Schema Generator (LSG) is reported; it generates realistic design-like layouts without any design rule violations. Lithography simulation is then used on the generated layouts to discover potentially problematic patterns (hotspots). These hotspot patterns are further explored by randomly inducing feature and context variations in the identified hotspots through a Hotspot Variation Flow (HSV). Simulation is then performed on this expanded set of layout clips to identify further problematic patterns. The patterns are finally classified into forbidden patterns, which should be included in the design rule checker, and legal patterns, which need better handling in the RET recipes and processes.
Failure detection system design methodology. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Chow, E. Y.
1980-01-01
The design of a failure detection and identification (FDI) system consists of designing a robust residual generation process and a high-performance decision making process. These two design problems are examined separately. Residual generation is based on analytical redundancy. Redundancy relations that are insensitive to modelling errors and noise effects are important for designing robust residual generation processes. The characterization of the concept of analytical redundancy in terms of a generalized parity space provides a framework in which a systematic approach to the determination of robust redundancy relations is developed. The Bayesian approach is adopted for the design of high-performance decision processes. The FDI decision problem is formulated as a Bayes sequential decision problem. Since the optimal decision rule is incomputable, a methodology for designing suboptimal rules is proposed. A numerical algorithm is developed to facilitate the design and performance evaluation of suboptimal rules.
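The parity-space idea behind residual generation can be sketched for the simplest case: three redundant sensors measuring one quantity (measurement matrix H = [1, 1, 1]^T). Any vector orthogonal to H defines a residual that is insensitive to the true measured value and responds only to sensor faults. This is a minimal illustration, not the thesis's generalized construction:

```python
def parity_residuals(y):
    """Residuals for three redundant sensors of one quantity.
    Each row of V is orthogonal to H = [1, 1, 1]^T, so the residuals
    cancel the true signal and expose only faults and noise."""
    V = [[1, -1, 0],
         [1, 0, -1]]
    return [sum(v * yi for v, yi in zip(row, y)) for row in V]

healthy = parity_residuals([5.0, 5.0, 5.0])   # all residuals zero
faulty  = parity_residuals([5.0, 5.0, 9.0])   # bias fault on sensor 3
```

A decision process would then threshold (or sequentially test) these residuals; the pattern of which residuals fire identifies the faulty sensor.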
ARROWSMITH-P: A prototype expert system for software engineering management
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Ramsey, Connie Loggia
1985-01-01
Although the field of software engineering is relatively new, it can benefit from the use of expert systems. Two prototype expert systems were developed to aid in software engineering management. Given the values for certain metrics, these systems will provide interpretations which explain any abnormal patterns of these values during the development of a software project. The two systems, which solve the same problem, were built using different methods, rule-based deduction and frame-based abduction. A comparison was done to see which method was better suited to the needs of this field. It was found that both systems performed moderately well, but the rule-based deduction system using simple rules provided more complete solutions than did the frame-based abduction system.
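Rule-based deduction of the kind described reduces to firing every rule whose condition holds for the measured metric values and collecting its interpretation. The metric names, thresholds, and interpretations below are hypothetical, not the actual ARROWSMITH-P rules:

```python
def interpret(metrics, rules):
    """Minimal rule-based deduction: fire each rule whose condition
    holds for the project metrics; collect the interpretations."""
    return [interpretation
            for condition, interpretation in rules
            if condition(metrics)]

# Illustrative rules over hypothetical project metrics.
rules = [
    (lambda m: m["defect_rate"] > 5.0 and m["test_effort"] < 0.2,
     "high defect rate with low testing effort: increase testing"),
    (lambda m: m["code_churn"] > 0.5,
     "unusually high code churn: requirements may be unstable"),
]
findings = interpret(
    {"defect_rate": 7.1, "test_effort": 0.1, "code_churn": 0.3}, rules)
```

Simple independent rules like these are easy to audit and extend, which is consistent with the paper's finding that the rule-based system gave more complete solutions.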
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-04
... Moving the Rule Text That Provides for Pegging on the Exchange From Supplementary Material .26 of Rule 70--Equities to Rule 13--Equities and Amending Such Text to (i) Permit Designated Market Maker Interest To Be... Proposed Rule Change The Exchange proposes to move the rule text that provides for pegging on the Exchange...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-03
... Proposed Rule Change Amending Commentary .07 to NYSE Amex Options Rule 904 To Eliminate Position Limits for... Act of 1934 (the ``Act'') \\2\\ and Rule 19b-4 thereunder,\\3\\ a proposed rule change to eliminate... side of the market. The proposal would amend Commentary .07 to NYSE Amex Options Rule 904 to eliminate...
18 CFR 385.1403 - Petitions seeking institution of rulemaking proceedings (Rule 1404).
Code of Federal Regulations, 2010 CFR
2010-04-01
... PROCEDURE Oil Pipeline Proceedings § 385.1403 Petitions seeking institution of rulemaking proceedings (Rule... purpose of issuing statements, rules, or regulations of general applicability and significance designed to...
Direct Final Rule for Technical Amendments for Marine Spark-Ignition Engines and Vessels
Rule published September 16, 2010 to make technical amendments to the design standard for portable marine fuel tanks. This rule incorporates safe recommended practices, developed through industry consensus.
Railway Online Booking System Design and Implementation
NASA Astrophysics Data System (ADS)
Zongjiang, Wang
In this paper, we define rule usefulness and introduce an approach to evaluating rule usefulness in rough sets. We also propose a method to obtain the most useful rules; the method is easy and effective in applications to prisoners' reform. Compared with methods that seek the most interesting rules, ours is direct and objective: rule interestingness must rely on predefined knowledge about what kind of information is interesting, whereas our method greatly reduces the number of rules generated and provides a measure of rule usefulness at the same time.
Chemical Waste Management and Disposal.
ERIC Educational Resources Information Center
Armour, Margaret-Ann
1988-01-01
Describes simple, efficient techniques for treating hazardous chemicals so that nontoxic and nonhazardous residues are formed. Discusses general rules for management of waste chemicals from school laboratories and general techniques for the disposal of waste or surplus chemicals. Lists specific disposal reactions. (CW)
76 FR 63575 - Transportation Conformity Rule: MOVES Regional Grace Period Extension
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-13
... written in FORTRAN and used simple text files for data input and output, MOVES2010a is written in JAVA and uses a relational database structure in MYSQL to handle input and output as data tables. These changes...
76 FR 63554 - Transportation Conformity Rule: MOVES Regional Grace Period Extension
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-13
... written in FORTRAN and used simple text files for data input and output, MOVES2010a is written in JAVA and uses a relational database structure in MYSQL to handle input and output as data tables. These changes...
77 FR 11394 - Transportation Conformity Rule: MOVES Regional Grace Period Extension
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-27
... written in FORTRAN and used simple text files for data input and output, MOVES is written in JAVA and uses a relational database structure in MYSQL to handle input and output as data tables.\\13\\ \\13\\ Some...
The RC Circuit--A Multipurpose Laboratory Experiment.
ERIC Educational Resources Information Center
Wood, Herbert T.
1993-01-01
Describes an experiment that demonstrates the use of Kirchhoff's rules in the analysis of electrical circuits. The experiment also involves the solution of a linear nonhomogeneous differential equation that is slightly different from the standard one for the simple RC circuit. (ZWH)
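For the standard charging RC circuit, Kirchhoff's voltage rule gives R dq/dt + q/C = V, whose solution is q(t) = CV + (q0 - CV) e^(-t/RC). A short check of that closed form (component values are arbitrary examples):

```python
import math

def charge(t, V, R, C, q0=0.0):
    """Capacitor charge from Kirchhoff's voltage rule R dq/dt + q/C = V:
    q(t) = C*V + (q0 - C*V) * exp(-t / (R*C))."""
    return C * V + (q0 - C * V) * math.exp(-t / (R * C))

tau = 1e3 * 1e-6                                  # R = 1 kOhm, C = 1 uF: tau = 1 ms
q_inf = charge(10 * tau, V=5.0, R=1e3, C=1e-6)    # ~fully charged after 10 tau
```

The laboratory variant the article mentions only changes the forcing term, so the same integrating-factor solution method applies with a different particular solution.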
Rule-based navigation control design for autonomous flight
NASA Astrophysics Data System (ADS)
Contreras, Hugo; Bassi, Danilo
2008-04-01
This article describes a navigation control system design based on a set of rules for following a desired trajectory. The full control of the aircraft considered here comprises a low-level stability control loop, based on a classic PID controller, and a higher-level navigation loop whose main job is to exercise lateral (course) and altitude control while tracking the desired trajectory. The rules and PID gains were adjusted systematically according to the results of flight simulation. In spite of its simplicity, the rule-based navigation control proved robust, even under large perturbations such as crosswinds.
Models of life: epigenetics, diversity and cycles.
Sneppen, Kim
2017-04-01
This review emphasizes aspects of biology that can be understood through repeated applications of simple causal rules. The selected topics include perspectives on gene regulation, phage lambda development, epigenetics, microbial ecology, as well as model approaches to diversity and to punctuated equilibrium in evolution. Two outstanding features are repeatedly described. One is the minimal number of rules to sustain specific states of complex systems for a long time. The other is the collapse of such states and the subsequent dynamical cycle of situations that restitute the system to a potentially new metastable state.
Five rules for the evolution of cooperation.
Nowak, Martin A
2006-12-08
Cooperation is needed for evolution to construct new levels of organization. Genomes, cells, multicellular organisms, social insects, and human society are all based on cooperation. Cooperation means that selfish replicators forgo some of their reproductive potential to help one another. But natural selection implies competition and therefore opposes cooperation unless a specific mechanism is at work. Here I discuss five mechanisms for the evolution of cooperation: kin selection, direct reciprocity, indirect reciprocity, network reciprocity, and group selection. For each mechanism, a simple rule is derived that specifies whether natural selection can lead to cooperation.
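The five rules can be collected in one small function. Each mechanism favors cooperation when the benefit-to-cost ratio b/c exceeds a mechanism-specific threshold; the forms below follow the commonly cited conditions (b/c > 1/r, > 1/w, > 1/q, > k, and > 1 + n/m), with the parameter values in the example chosen purely for illustration:

```python
def cooperation_favoured(benefit, cost, *, r=None, w=None, q=None,
                         k=None, n=None, m=None):
    """Evaluate the five simple rules for the evolution of cooperation.
    r: genetic relatedness, w: probability of a repeat encounter,
    q: probability of knowing a reputation, k: neighbours per individual,
    n: group size, m: number of groups."""
    bc = benefit / cost
    rules = {}
    if r is not None:
        rules["kin selection"] = bc > 1 / r
    if w is not None:
        rules["direct reciprocity"] = bc > 1 / w
    if q is not None:
        rules["indirect reciprocity"] = bc > 1 / q
    if k is not None:
        rules["network reciprocity"] = bc > k
    if n is not None and m is not None:
        rules["group selection"] = bc > 1 + n / m
    return rules

verdicts = cooperation_favoured(benefit=5, cost=1,
                                r=0.5, w=0.9, q=0.1, k=4, n=10, m=5)
```

With b/c = 5, every mechanism except indirect reciprocity (which here would require b/c > 10) favors cooperation.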
Report of Freshwater Mussels Workshop Held at St. Louis, Missouri on 26-27 October 1982.
1983-10-01
I was accosted by a toddler armed .. with a large red apple, a double handful for him. The apple was brought down on my knee with all the force the...my best diction. "Apple!" the youngster cried, "Apple, apple, apple!" banging my knee in perfect time. Then back to his mother down the aisle he...neither simple nor perfect, but it is available and should be used. One of the basic rules, perhaps the basic rule, of the International Code of
Chartier, Sylvain; Proulx, Robert
2005-11-01
This paper presents a new unsupervised attractor neural network, which, contrary to optimal linear associative memory models, is able to develop nonbipolar attractors as well as bipolar attractors. Moreover, the model is able to develop less spurious attractors and has a better recall performance under random noise than any other Hopfield type neural network. Those performances are obtained by a simple Hebbian/anti-Hebbian online learning rule that directly incorporates feedback from a specific nonlinear transmission rule. Several computer simulations show the model's distinguishing properties.
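A minimal sketch of the mechanism (the cubic transmission function, learning rate, and network size here are illustrative assumptions, not the paper's exact model): the Hebbian term strengthens correlations in the input pattern, while the anti-Hebbian term on the network's own output cancels the update once the pattern is perfectly recalled, so learning self-terminates at the attractor:

```python
def transmit(W, x, delta=0.4):
    """Nonlinear transmission: cubic squashing, clipped to [-1, 1]."""
    out = []
    for row in W:
        a = sum(w * xi for w, xi in zip(row, x))
        a = (1 + delta) * a - delta * a ** 3
        out.append(max(-1.0, min(1.0, a)))
    return out

def hebb_anti_hebb_step(W, x, eta=0.1):
    """One online update: Hebbian on the input x, anti-Hebbian on the
    network's output y, so dW -> 0 as the recall y approaches x."""
    y = transmit(W, x)
    for i in range(len(W)):
        for j in range(len(W)):
            W[i][j] += eta * (x[i] * x[j] - y[i] * y[j])
    return W

# Train a 2-unit network on one bipolar pattern and test the recall.
W = [[0.0, 0.0], [0.0, 0.0]]
x = [1.0, -1.0]
for _ in range(50):
    W = hebb_anti_hebb_step(W, x)
recall = transmit(W, x)
```

Because the feedback from the transmission rule enters the update directly, the weights stop growing once the output saturates at the stored pattern, which is what limits spurious attractors in the full model.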
Lincoln, Don
2018-01-16
Albert Einstein said that what he wanted to know was "God's thoughts," which is a metaphor for the ultimate and most basic rules of the universe. Once known, all other phenomena would then be a consequence of these simple rules. While modern science is far from that goal, we have some thoughts on how this inquiry might unfold. In this video, Fermilab's Dr. Don Lincoln tells what we know about GUTs (grand unified theories) and TOEs (theories of everything).
How effective is advertising in duopoly markets?
NASA Astrophysics Data System (ADS)
Sznajd-Weron, K.; Weron, R.
2003-06-01
A simple Ising spin model which can describe the mechanism of advertising in a duopoly market is proposed. In contrast to other agent-based models, the influence does not flow inward from the surrounding neighbors to the center site, but spreads outward from the center to the neighbors. The model thus describes the spread of opinions among customers. It is shown via standard Monte Carlo simulations that very simple rules and inclusion of an external field-an advertising campaign-lead to phase transitions.
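The outward-influence mechanism can be sketched as a Monte Carlo loop on a ring of agents: a randomly chosen agent pushes its opinion onto both neighbors, and an external "advertising" field independently converts agents to brand +1. This is a minimal sketch of the mechanism, not the paper's exact update rule, and the field strength and system size are arbitrary:

```python
import random

def simulate(n=100, steps=20000, ad_strength=0.1, seed=1):
    """Outward-influence opinion dynamics on a ring with advertising.
    Returns the final average opinion (magnetization) in [-1, 1]."""
    random.seed(seed)
    spins = [random.choice([-1, 1]) for _ in range(n)]
    for _ in range(steps):
        i = random.randrange(n)
        spins[(i - 1) % n] = spins[i]      # influence flows outward...
        spins[(i + 1) % n] = spins[i]      # ...from the center to neighbours
        if random.random() < ad_strength:  # advertising favours brand +1
            spins[random.randrange(n)] = 1
    return sum(spins) / n

m = simulate()
```

With no field the dynamics coarsen toward one of the two consensus states at random; even a weak persistent field biases the outcome toward the advertised brand, which is the phase-transition behavior the abstract describes.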
The visual display of regulatory information and networks.
Pirson, I; Fortemaison, N; Jacobs, C; Dremier, S; Dumont, J E; Maenhaut, C
2000-10-01
Cell regulation and signal transduction are becoming increasingly complex, with reports of new cross-signalling, feedback, and feedforward regulations between pathways and between the multiple isozymes discovered at each step of these pathways. However, this information, which requires pages of text for its description, can be summarized in very simple schemes, although there is no consensus on the drawing of such schemes. This article presents a simple set of rules that allows a lot of information to be inserted in easily understandable displays.
Simplified adaptive control of an orbiting flexible spacecraft
NASA Astrophysics Data System (ADS)
Maganti, Ganesh B.; Singh, Sahjendra N.
2007-10-01
The paper presents the design of a new simple adaptive system for the rotational maneuver and vibration suppression of an orbiting spacecraft with flexible appendages. A moment generating device located on the central rigid body of the spacecraft is used for the attitude control. It is assumed that the system parameters are unknown and the truncated model of the spacecraft has finite but arbitrary dimension. In addition, only the pitch angle and its derivative are measured and elastic modes are not available for feedback. The control output variable is chosen as the linear combination of the pitch angle and the pitch rate. Exploiting the hyper minimum phase nature of the spacecraft, a simple adaptive control law is derived for the pitch angle control and elastic mode stabilization. The adaptation rule requires only four adjustable parameters and the structure of the control system does not depend on the order of the truncated spacecraft model. For the synthesis of control system, the measured output error and the states of a third-order command generator are used. Simulation results are presented which show that in the closed-loop system adaptive output regulation is accomplished in spite of large parameter uncertainties and disturbance input.
78 FR 36434 - Revisions to Rules of Practice
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-18
... federal holidays, make grammatical corrections, and remove the reference to part-day holidays. Rule 3001... section, the following categories of persons are designated ``decision-making personnel'': (i) The.... The following categories of person are designated ``non-decision-making personnel'': (i) All...