Sample records for the query "reasonable computational effort"

  1. Competent Reasoning with Rational Numbers.

    ERIC Educational Resources Information Center

    Smith, John P. III

    1995-01-01

    Analyzed students' reasoning with fractions. Found that skilled students applied strategies specifically tailored to restricted classes of fractions and produced reliable solutions with a minimum of computation effort. Results suggest that competent reasoning depends on a knowledge base that includes numerically specific and invented strategies,…

  2. Computers and Instruction: Implications of the Rising Tide of Criticism for Reading Education.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    1988-01-01

    Examines two major reasons that schools have adopted computers without careful prior examination and planning. Surveys a variety of criticisms targeted toward some aspects of computer-based instruction in reading in an effort to direct attention to the beneficial implications of computers in the classroom. (MS)

  3. Computers for the Faculty: How on a Limited Budget.

    ERIC Educational Resources Information Center

    Arman, Hal; Kostoff, John

    An informal investigation of the use of computers at Delta College (DC) in Michigan revealed reasonable use of computers by faculty in disciplines such as mathematics, business, and technology, but very limited use in the humanities and social sciences. In an effort to increase faculty computer usage, DC decided to make computers available to any…

  4. Teachable Agents and the Protege Effect: Increasing the Effort towards Learning

    ERIC Educational Resources Information Center

    Chase, Catherine C.; Chin, Doris B.; Oppezzo, Marily A.; Schwartz, Daniel L.

    2009-01-01

    Betty's Brain is a computer-based learning environment that capitalizes on the social aspects of learning. In Betty's Brain, students instruct a character called a Teachable Agent (TA) which can reason based on how it is taught. Two studies demonstrate the "protege effect": students make greater effort to learn for their TAs than they do…

  5. Online Secondary Research in the Advertising Research Class: A Friendly Introduction to Computing.

    ERIC Educational Resources Information Center

    Adler, Keith

    In an effort to promote computer literacy among advertising students, an assignment was devised that required the use of online database search techniques to find secondary research materials. The search program, chosen for economical reasons, was "Classroom Instruction Program" offered by Dialog Information Services. Available for a…

  6. Determination of aerodynamic sensitivity coefficients for wings in transonic flow

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.; El-Banna, Hesham M.

    1992-01-01

    The quasianalytical approach is applied to the 3-D full potential equation to compute wing aerodynamic sensitivity coefficients in the transonic regime. Symbolic manipulation is used to reduce the effort associated with obtaining the sensitivity equations, and the large sensitivity system is solved using 'state of the art' routines. The quasianalytical approach is believed to be reasonably accurate and computationally efficient for 3-D problems.
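
A minimal sketch of the symbolic-manipulation idea in this record: derive a sensitivity equation by differentiating a model expression symbolically, then compile it for cheap repeated evaluation. The pressure model below is a made-up stand-in, not the 3-D full potential equation.

```python
import sympy as sp

# Toy sensitivity analysis via symbolic manipulation (SymPy stands in for
# the symbolic tools used in the paper). Cp is a hypothetical pressure
# coefficient depending on chord position x and a thickness parameter t.
x, t = sp.symbols('x t')
Cp = -4 * t * x * (1 - x)        # assumed toy model, not from the record
dCp_dt = sp.diff(Cp, t)          # the sensitivity equation, derived symbolically
f = sp.lambdify((x, t), dCp_dt)  # compile once, evaluate cheaply many times
print(f(0.5, 0.1))               # -> -1.0
```

Deriving the sensitivity equation once and compiling it is what keeps the per-evaluation effort low compared with re-differencing a full solver.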

  7. Deer Browse Production: Rapid Sampling and Computer-aided Analysis

    Treesearch

    Forest W. Stearns; Dennis L. Schweitzer; William A. Creed

    1968-01-01

    Describes field techniques by which winter deer-browse production can be sampled with reasonable accuracy and moderate effort, and which expedite the tabulation of the browse data. The method will be useful to both land managers and scientists doing research on the habitat of the white-tailed deer.

  8. Reasoning with Atomic-Scale Molecular Dynamic Models

    ERIC Educational Resources Information Center

    Pallant, Amy; Tinker, Robert F.

    2004-01-01

    The studies reported in this paper are an initial effort to explore the applicability of computational models in introductory science learning. Two instructional interventions are described that use a molecular dynamics model embedded in a set of online learning activities with middle and high school students in 10 classrooms. The studies indicate…

  9. A New Application of the Channel Packet Method for Low Energy 1-D Elastic Scattering

    DTIC Science & Technology

    2006-09-01

    …matter. On a cosmic scale, we wonder if a collision between an asteroid and Earth led to the extinction of the dinosaurs. Collisions are important…in Figure 12. In an effort to keep the computation time reasonable, … was chosen to be … for this simulation. In order to represent the intermediate…linear regions joined by the two labeled points. However, based on Figure 13 the two potential functions are reasonably close and so one would not…

  10. Technology advances and market forces: Their impact on high performance architectures

    NASA Technical Reports Server (NTRS)

    Best, D. R.

    1978-01-01

    Reasonable projections into future supercomputer architectures and technology require an analysis of the computer industry market environment, the current capabilities and trends within the component industry, and the research activities on computer architecture in the industrial and academic communities. Management, programmer, architect, and user must cooperate to increase the efficiency of supercomputer development efforts. Care must be taken to match the funding, compiler, architecture and application with greater attention to testability, maintainability, reliability, and usability than supercomputer development programs of the past.

  11. Treatment of uncertainty in artificial intelligence

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.

    1988-01-01

    The present assessment of the development status of research efforts concerned with AI reasoning under conditions of uncertainty emphasizes the importance of appropriateness in the approach selected for both the epistemic and the computational levels. At the former level, attention is given to the form of uncertainty-representation and the fidelity of its reflection of actual problems' uncertainties; at the latter level, such issues as the availability of the requisite information and the complexity of the reasoning process must be considered. The tradeoff between these levels must always be the focus of AI system-developers' attention.

  12. A State of the Art Survey of Fraud Detection Technology

    NASA Astrophysics Data System (ADS)

    Flegel, Ulrich; Vayssière, Julien; Bitz, Gunter

    With the introduction of IT to conduct business we accepted the loss of a human control step. For this reason, the introduction of new IT systems was accompanied by the development of the authorization concept. But since, in reality, there is no such thing as 100 per cent security, auditors are commissioned to examine all transactions for misconduct. Since the data already exists in digital form, it makes sense to use computer-based processes to analyse it. Such processes allow the auditor to carry out extensive checks within an acceptable timeframe and with reasonable effort. Once the algorithm has been defined, it only takes sufficient computing power to evaluate larger quantities of data. This contribution presents the state of the art for IT-based data analysis processes that can be used to identify fraudulent activities.
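
The kind of exhaustive, rule-based check described above can be sketched in a few lines; the role names, limits, and field names here are illustrative assumptions, not from the paper.

```python
# Hypothetical audit rule: flag every transaction whose amount exceeds the
# authorization limit of the person who approved it. Because the check is
# cheap, the auditor can scan all records instead of sampling.
LIMITS = {"clerk": 1_000, "manager": 10_000}  # assumed authorization concept

def flag_violations(transactions):
    return [t for t in transactions
            if t["amount"] > LIMITS.get(t["approved_by"], 0)]

txns = [
    {"id": 1, "amount": 250,   "approved_by": "clerk"},
    {"id": 2, "amount": 5_000, "approved_by": "clerk"},    # exceeds limit
    {"id": 3, "amount": 8_000, "approved_by": "manager"},
]
print([t["id"] for t in flag_violations(txns)])  # -> [2]
```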

  13. High level cognitive information processing in neural networks

    NASA Technical Reports Server (NTRS)

    Barnden, John A.; Fields, Christopher A.

    1992-01-01

    Two related research efforts were addressed: (1) high-level connectionist cognitive modeling; and (2) local neural circuit modeling. The goals of the first effort were to develop connectionist models of high-level cognitive processes such as problem solving or natural language understanding, and to understand the computational requirements of such models. The goals of the second effort were to develop biologically realistic models of local neural circuits, and to understand the computational behavior of such models. In keeping with the nature of NASA's Innovative Research Program, all the work conducted under the grant was highly innovative. For instance, the following ideas, all summarized, are contributions to the study of connectionist/neural networks: (1) the temporal-winner-take-all, relative-position encoding, and pattern-similarity association techniques; (2) the importation of logical combinators into connectionism; (3) the use of analogy-based reasoning as a bridge across the gap between the traditional symbolic paradigm and the connectionist paradigm; and (4) the application of connectionism to the domain of belief representation/reasoning. The work on local neural circuit modeling also departs significantly from the work of related researchers. In particular, its concentration on low-level neural phenomena that could support high-level cognitive processing is unusual within the area of biological local circuit modeling, and also serves to expand the horizons of the artificial neural net field.

  14. Software Engineering for Scientific Computer Simulations

    NASA Astrophysics Data System (ADS)

    Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.

    2004-11-01

    Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.

  15. Computer-assisted learning and simulation systems in dentistry--a challenge to society.

    PubMed

    Welk, A; Splieth, Ch; Wierinck, E; Gilpatrick, R O; Meyer, G

    2006-07-01

    Computer technology is increasingly used in practical training at universities. However, in spite of their potential, computer-assisted learning (CAL) and computer-assisted simulation (CAS) systems still appear to be underutilized in dental education. Advantages, challenges, problems, and solutions of computer-assisted learning and simulation in dentistry are discussed by means of MEDLINE, open Internet platform searches, and key results of a study among German dental schools. The advantages of computer-assisted learning are seen for example in self-paced and self-directed learning and increased motivation. It is useful for both objective theoretical and practical tests and for training students to handle complex cases. CAL can lead to more structured learning and can support training in evidence-based decision-making. The reasons for the still relatively rare implementation of CAL/CAS systems in dental education include an inability to finance, lack of studies of CAL/CAS, and too much effort required to integrate CAL/CAS systems into the curriculum. To overcome the reasons for the relatively low degree of computer technology use, we should strive for multicenter research and development projects monitored by the appropriate national and international scientific societies, so that the potential of computer technology can be fully realized in graduate, postgraduate, and continuing dental education.

  16. The process group approach to reliable distributed computing

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.

    1991-01-01

    The difficulty of developing reliable distributed software is an impediment to applying distributed computing technology in many settings. Experience with the ISIS system suggests that a structured approach based on virtually synchronous process groups yields systems which are substantially easier to develop, more fault-tolerant, and self-managing. Six years of research on ISIS are reviewed, describing the model, the types of applications to which ISIS was applied, and some of the reasoning that underlies a recent effort to redesign and reimplement ISIS as a much smaller, lightweight system.

  17. Using artificial intelligence to control fluid flow computations

    NASA Technical Reports Server (NTRS)

    Gelsey, Andrew

    1992-01-01

    Computational simulation is an essential tool for the prediction of fluid flow. Many powerful simulation programs exist today. However, using these programs to reliably analyze fluid flow and other physical situations requires considerable human effort and expertise to set up a simulation, determine whether the output makes sense, and repeatedly run the simulation with different inputs until a satisfactory result is achieved. Automating this process is not only of considerable practical importance but will also significantly advance basic artificial intelligence (AI) research in reasoning about the physical world.
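
The run-check-adjust loop that the record proposes to automate might look like this in outline; the solver and the plausibility test are stand-ins invented for illustration, not part of any actual NASA system.

```python
# Sketch of automating the expert's loop: run a simulation, check whether
# the output makes sense, and retry with adjusted inputs until it does.
def simulate(step):
    # Stand-in "flow solver" that diverges when the step size is too large.
    return float('inf') if step > 0.1 else 1.0 / step

def plausible(residual):
    # Stand-in sanity check an AI controller would apply to solver output.
    return residual != float('inf') and residual < 100.0

step = 0.4
while not plausible(simulate(step)):
    step /= 2  # the kind of adjustment a human expert makes by hand
print(step)    # converged step size
```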

  18. Numerical simulation of lava flows: Applications to the terrestrial planets

    NASA Technical Reports Server (NTRS)

    Zimbelman, James R.; Campbell, Bruce A.; Kousoum, Juliana; Lampkin, Derrick J.

    1993-01-01

    Lava flows are the visible expression of the extrusion of volcanic materials on a variety of planetary surfaces. A computer program described by Ishihara et al. appears to be well suited for application to different environments, and we have undertaken tests to evaluate their approach. Our results are somewhat mixed; the program does reproduce reasonable lava flow behavior in many situations, but we have encountered some conditions common to planetary environments for which the current program is inadequate. Here we present our initial efforts to identify the 'parameter space' for reasonable numerical simulations of lava flows.

  19. Development of an adaptive hp-version finite element method for computational optimal control

    NASA Technical Reports Server (NTRS)

    Hodges, Dewey H.; Warner, Michael S.

    1994-01-01

    In this research effort, the usefulness of hp-version finite elements and adaptive solution-refinement techniques in generating numerical solutions to optimal control problems has been investigated. Under NAG-939, a general FORTRAN code was developed which approximated solutions to optimal control problems with control constraints and state constraints. Within that methodology, to get high-order accuracy in solutions, the finite element mesh would have to be refined repeatedly through bisection of the entire mesh in a given phase. In the current research effort, the order of the shape functions in each element has been made a variable, giving more flexibility in error reduction and smoothing. Similarly, individual elements can each be subdivided into many pieces, depending on the local error indicator, while other parts of the mesh remain coarsely discretized. The problem remains to reduce and smooth the error while still keeping computational effort reasonable enough to calculate time histories in a short enough time for on-board applications.
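
The refinement strategy described above (raise the shape-function order locally, or bisect individual elements, guided by a local error indicator) can be sketched as follows; the data layout and thresholds are assumptions for illustration, not the NAG-939 FORTRAN code.

```python
# Toy hp-refinement step: each element is (left, right, order). Elements
# whose local error indicator is small stay coarse; otherwise the order p
# is raised (p-refinement) up to a cap, after which the element is bisected
# (h-refinement) while the rest of the mesh is left untouched.
def refine(elements, errors, tol=1e-3, p_max=4):
    out = []
    for (a, b, p), err in zip(elements, errors):
        if err <= tol:
            out.append((a, b, p))        # error already small: keep coarse
        elif p < p_max:
            out.append((a, b, p + 1))    # p-refinement: higher-order shapes
        else:
            m = 0.5 * (a + b)            # h-refinement: split this element only
            out.append((a, m, 2))
            out.append((m, b, 2))
    return out

mesh = [(0.0, 0.5, 2), (0.5, 1.0, 4)]
print(refine(mesh, [1e-4, 5e-2]))  # only the second element is subdivided
```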

  20. Personal privacy, information assurance, and the threat posed by malware technology

    NASA Astrophysics Data System (ADS)

    Stytz, Martin R.; Banks, Sheila B.

    2006-04-01

    In spite of our best efforts to secure the cyber world, the threats posed to personal privacy by attacks upon networks and software continue unabated. While there are many reasons for this state of affairs, clearly one of the reasons for continued vulnerabilities in software is the inability to assess their security properties and test their security systems while they are in development. A second reason for this growing threat to personal privacy is the growing sophistication and maliciousness of malware coupled with the increasing difficulty of detecting malware. The pervasive threat posed by malware coupled with the difficulties faced when trying to detect its presence or an attempted intrusion make addressing the malware threat one of the most pressing issues that must be solved in order to ensure personal privacy to users of the internet. In this paper, we will discuss the threat posed by malware, the types of malware found in the wild (outside of computer laboratories), and current techniques that are available for protection from a successful malware penetration. The paper includes a discussion of anti-malware tools and suggestions for future anti-malware efforts.

  1. Novel metaheuristic for parameter estimation in nonlinear dynamic biological systems

    PubMed Central

    Rodriguez-Fernandez, Maria; Egea, Jose A; Banga, Julio R

    2006-01-01

    Background We consider the problem of parameter estimation (model calibration) in nonlinear dynamic models of biological systems. Due to the frequent ill-conditioning and multi-modality of many of these problems, traditional local methods usually fail (unless initialized with very good guesses of the parameter vector). In order to surmount these difficulties, global optimization (GO) methods have been suggested as robust alternatives. Currently, deterministic GO methods cannot solve problems of realistic size within this class in reasonable computation times. In contrast, certain types of stochastic GO methods have shown promising results, although the computational cost remains large. Rodriguez-Fernandez and coworkers have presented hybrid stochastic-deterministic GO methods which could reduce computation time by one order of magnitude while guaranteeing robustness. Our goal here was to further reduce the computational effort without losing robustness. Results We have developed a new procedure based on the scatter search methodology for nonlinear optimization of dynamic models of arbitrary (or even unknown) structure (i.e. black-box models). In this contribution, we describe and apply this novel metaheuristic, inspired by recent developments in the field of operations research, to a set of complex identification problems and we make a critical comparison with respect to the previous (above mentioned) successful methods. Conclusion Robust and efficient methods for parameter estimation are of key importance in systems biology and related areas. The new metaheuristic presented in this paper aims to ensure the proper solution of these problems by adopting a global optimization approach, while keeping the computational effort under reasonable values.
This new metaheuristic was applied to a set of three challenging parameter estimation problems of nonlinear dynamic biological systems, outperforming very significantly all the methods previously used for these benchmark problems. PMID:17081289
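
The calibration problem this record addresses can be illustrated with a stochastic global optimizer; SciPy's differential evolution stands in here for the authors' scatter-search metaheuristic, and the logistic model and its parameter values are invented for the sketch.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import differential_evolution

# Toy dynamic model: logistic growth with unknown rate r and capacity K.
def logistic(t, x, r, K):
    return r * x * (1 - x / K)

t_obs = np.linspace(0, 10, 20)
true_params = (0.8, 5.0)  # pretend these are unknown
x_obs = solve_ivp(logistic, (0, 10), [0.1], t_eval=t_obs, args=true_params).y[0]

def sse(p):
    # Sum-of-squares mismatch between simulated and "observed" trajectories.
    sim = solve_ivp(logistic, (0, 10), [0.1], t_eval=t_obs, args=tuple(p)).y[0]
    return float(np.sum((sim - x_obs) ** 2))

# Stochastic global search over a bounded parameter space: local methods
# started from a bad initial guess may stall, as the abstract notes.
res = differential_evolution(sse, bounds=[(0.1, 2.0), (1.0, 10.0)], seed=1)
r_hat, K_hat = res.x
```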

  2. Novel metaheuristic for parameter estimation in nonlinear dynamic biological systems.

    PubMed

    Rodriguez-Fernandez, Maria; Egea, Jose A; Banga, Julio R

    2006-11-02

    We consider the problem of parameter estimation (model calibration) in nonlinear dynamic models of biological systems. Due to the frequent ill-conditioning and multi-modality of many of these problems, traditional local methods usually fail (unless initialized with very good guesses of the parameter vector). In order to surmount these difficulties, global optimization (GO) methods have been suggested as robust alternatives. Currently, deterministic GO methods cannot solve problems of realistic size within this class in reasonable computation times. In contrast, certain types of stochastic GO methods have shown promising results, although the computational cost remains large. Rodriguez-Fernandez and coworkers have presented hybrid stochastic-deterministic GO methods which could reduce computation time by one order of magnitude while guaranteeing robustness. Our goal here was to further reduce the computational effort without losing robustness. We have developed a new procedure based on the scatter search methodology for nonlinear optimization of dynamic models of arbitrary (or even unknown) structure (i.e. black-box models). In this contribution, we describe and apply this novel metaheuristic, inspired by recent developments in the field of operations research, to a set of complex identification problems and we make a critical comparison with respect to the previous (above mentioned) successful methods. Robust and efficient methods for parameter estimation are of key importance in systems biology and related areas. The new metaheuristic presented in this paper aims to ensure the proper solution of these problems by adopting a global optimization approach, while keeping the computational effort under reasonable values. This new metaheuristic was applied to a set of three challenging parameter estimation problems of nonlinear dynamic biological systems, outperforming very significantly all the methods previously used for these benchmark problems.

  3. Fostering Multilinguality in the UMLS: A Computational Approach to Terminology Expansion for Multiple Languages

    PubMed Central

    Hellrich, Johannes; Hahn, Udo

    2014-01-01

    We here report on efforts to computationally support the maintenance and extension of multilingual biomedical terminology resources. Our main idea is to treat term acquisition as a classification problem guided by term alignment in parallel multilingual corpora, using termhood information coming from a named entity recognition system as a novel feature. We report on experiments for Spanish, French, German and Dutch parts of a multilingual UMLS-derived biomedical terminology. These efforts yielded 19k, 18k, 23k and 12k new terms and synonyms, respectively, from which about half relate to concepts without a previously available term label for these non-English languages. Based on expert assessment of a novel German terminology sample, 80% of the newly acquired terms were judged as reasonable additions to the terminology. PMID:25954371

  4. Children's Effort/Ability Reasoning: Individual Differences and Motivational Consequences.

    ERIC Educational Resources Information Center

    Leggett, Ellen L.; Dweck, Carol S.

    Individual differences in same-aged children's reasoning about effort and ability, as well as the consequences of different forms of reasoning in actual achievement situations, were investigated. It was hypothesized that different forms of children's reasoning would be related to different (helpless versus mastery-oriented) motivational patterns.…

  5. Faultfinder: A diagnostic expert system with graceful degradation for onboard aircraft applications

    NASA Technical Reports Server (NTRS)

    Abbott, Kathy H.; Schutte, Paul C.; Palmer, Michael T.; Ricks, Wendell R.

    1988-01-01

    A research effort was conducted to explore the application of artificial intelligence technology to automation of fault monitoring and diagnosis as an aid to the flight crew. Human diagnostic reasoning was analyzed and actual accident and incident cases were reconstructed. Based on this analysis and reconstruction, diagnostic concepts were conceived and implemented for an aircraft's engine and hydraulic subsystems. These concepts are embedded within a multistage approach to diagnosis that reasons about time-based, causal, and qualitative information, and enables a certain amount of graceful degradation. The diagnostic concepts are implemented in a computer program called Faultfinder that serves as a research prototype.

  6. Bridging the Gap between Human Judgment and Automated Reasoning in Predictive Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanfilippo, Antonio P.; Riensche, Roderick M.; Unwin, Stephen D.

    2010-06-07

    Events occur daily that impact the health, security and sustainable growth of our society. If we are to address the challenges that emerge from these events, anticipatory reasoning has to become an everyday activity. Strong advances have been made in using integrated modeling for analysis and decision making. However, a wider impact of predictive analytics is currently hindered by the lack of systematic methods for integrating predictive inferences from computer models with human judgment. In this paper, we present a predictive analytics approach that supports anticipatory analysis and decision-making through a concerted reasoning effort that interleaves human judgment and automated inferences. We describe a systematic methodology for integrating modeling algorithms within a serious gaming environment in which role-playing by human agents provides updates to model nodes and the ensuing model outcomes in turn influence the behavior of the human players. The approach ensures a strong functional partnership between human players and computer models while maintaining a high degree of independence and greatly facilitating the connection between model and game structures.

  7. 3D Multi-Level Non-LTE Radiative Transfer for the CO Molecule

    NASA Astrophysics Data System (ADS)

    Berkner, A.; Schweitzer, A.; Hauschildt, P. H.

    2015-01-01

    The photospheres of cool stars are both rich in molecules and an environment where the assumption of LTE cannot be upheld under all circumstances. Unfortunately, detailed 3D non-LTE calculations involving molecules are hardly feasible with current computers. For this reason, we present our implementation of the super level technique, in which molecular levels are combined into super levels, to reduce the number of unknowns in the rate equations and, thus, the computational effort and memory requirements involved, and show the results of our first tests against the 1D implementation of the same method.

  8. German dental faculty attitudes towards computer-assisted learning and their correlation with personal and professional profiles.

    PubMed

    Welk, A; Rosin, M; Seyer, D; Splieth, C; Siemer, M; Meyer, G

    2005-08-01

    Compared with its potential, computer technology use is still lacking in medical/dental education. The aim was to investigate the primary advantages of computer-assisted learning (CAL) systems in German dental education, as well as the reasons for their relatively low degree of use, correlated with personal and professional profiles of respondents. A questionnaire was mailed to the heads of the departments of conservative dentistry and prosthetic dentistry in all dental schools in Germany. Besides investigating the advantages and barriers to the use of computer technology, the questionnaire also contained questions regarding each respondent's gender, age, academic rank, experience in academia and computer skills. The response rate to the questionnaire was 90% (112 of 125). The results indicated a distinct discrepancy between the desire for and actual occurrence of lectures, seminars, etc. to instruct students in ways to search for and acquire knowledge, especially using computer technology. The highest-ranked advantages of CAL systems in order, as seen by respondents, were the possibilities for individual learning, increased motivation, and both objective theoretical tests and practical tests. The highest-ranked reasons for the low degree of usage of CAL systems in order were the inability to finance, followed equally by a lack of studies of CAL and poor cost-advantage ratio, and too much effort required to integrate CAL into the curriculum. Moreover, the higher the computer skills of the respondents, the more they noted insufficient quality of CAL systems (r = 0.200, P = 0.035) and content differences from their own dental faculty's expert opinions (r = 0.228, P = 0.016) as reasons for low use. The correlations of the attitudes towards CAL with the personal and professional profiles showed not only statistically significant reinforcements of, but also interesting deviations from, the average responses.

  9. Portable computing - A fielded interactive scientific application in a small off-the-shelf package

    NASA Technical Reports Server (NTRS)

    Groleau, Nicolas; Hazelton, Lyman; Frainier, Rich; Compton, Michael; Colombano, Silvano; Szolovits, Peter

    1993-01-01

    Experience with the design and implementation of a portable computing system for STS crew-conducted science is discussed. Principal-Investigator-in-a-Box (PI) will help the SLS-2 astronauts perform vestibular (human orientation system) experiments in flight. PI is an interactive system that provides data acquisition and analysis, experiment step rescheduling, and various other forms of reasoning to astronaut users. The hardware architecture of PI consists of a computer and an analog interface box. 'Off-the-shelf' equipment is employed in the system wherever possible in an effort to use widely available tools and then to add custom functionality and application codes to them. Other projects which can help prospective teams to learn more about portable computing in space are also discussed.

  10. Overview of Risk Mitigation for Safety-Critical Computer-Based Systems

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2015-01-01

    This report presents a high-level overview of a general strategy to mitigate the risks from threats to safety-critical computer-based systems. In this context, a safety threat is a process or phenomenon that can cause operational safety hazards in the form of computational system failures. This report is intended to provide insight into the safety-risk mitigation problem and the characteristics of potential solutions. The limitations of the general risk mitigation strategy are discussed and some options to overcome these limitations are provided. This work is part of an ongoing effort to enable well-founded assurance of safety-related properties of complex safety-critical computer-based aircraft systems by developing an effective capability to model and reason about the safety implications of system requirements and design.

  11. 29 CFR 1691.9 - EEOC reasonable cause determinations and conciliation efforts.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 4 2010-07-01 2010-07-01 false EEOC reasonable cause determinations and conciliation efforts. 1691.9 Section 1691.9 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT... FEDERAL FINANCIAL ASSISTANCE § 1691.9 EEOC reasonable cause determinations and conciliation efforts. (a...

  12. 29 CFR 1691.9 - EEOC reasonable cause determinations and conciliation efforts.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 4 2011-07-01 2011-07-01 false EEOC reasonable cause determinations and conciliation efforts. 1691.9 Section 1691.9 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT... FEDERAL FINANCIAL ASSISTANCE § 1691.9 EEOC reasonable cause determinations and conciliation efforts. (a...

  13. 29 CFR 1691.9 - EEOC reasonable cause determinations and conciliation efforts.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 4 2014-07-01 2014-07-01 false EEOC reasonable cause determinations and conciliation efforts. 1691.9 Section 1691.9 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT... FEDERAL FINANCIAL ASSISTANCE § 1691.9 EEOC reasonable cause determinations and conciliation efforts. (a...

  14. 29 CFR 1691.9 - EEOC reasonable cause determinations and conciliation efforts.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 4 2013-07-01 2013-07-01 false EEOC reasonable cause determinations and conciliation efforts. 1691.9 Section 1691.9 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT... FEDERAL FINANCIAL ASSISTANCE § 1691.9 EEOC reasonable cause determinations and conciliation efforts. (a...

  15. 29 CFR 1691.9 - EEOC reasonable cause determinations and conciliation efforts.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 4 2012-07-01 2012-07-01 false EEOC reasonable cause determinations and conciliation efforts. 1691.9 Section 1691.9 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT... FEDERAL FINANCIAL ASSISTANCE § 1691.9 EEOC reasonable cause determinations and conciliation efforts. (a...

  16. Life Prediction for a CMC Component Using the NASALIFE Computer Code

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, John Z.; Murthy, Pappu L. N.; Mital, Subodh K.

    2005-01-01

The computer code NASALIFE was used to provide life estimates for an SiC/SiC stator vane under varying thermomechanical loading conditions. The primary intention of this effort was to show how NASALIFE can provide reasonable life estimates for practical propulsion system components made of advanced ceramic matrix composites (CMC). Simple loading conditions produced readily observable and acceptable life predictions. Varying the loading conditions so that low cycle fatigue and creep were affected independently produced the expected trends in life due to varying loads and life due to creep. The analysis was based on idealized empirical data for the 9/99 melt-infiltrated SiC fiber-reinforced SiC.
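As a hedged illustration of the cumulative-damage bookkeeping a life-prediction code of this kind performs, the textbook Palmgren-Miner rule can be sketched in a few lines. This is a generic rule offered only for orientation; it is not NASALIFE's actual damage model.

```python
def miner_damage(blocks):
    """Palmgren-Miner linear damage sum for a load spectrum.

    blocks: iterable of (n_applied, N_to_failure) pairs, one per load level.
    Failure is predicted when the accumulated damage reaches 1.0.
    Generic textbook rule, not NASALIFE's model; all inputs illustrative.
    """
    return sum(n / N for n, N in blocks)
```

For example, 10,000 cycles at a level with a 100,000-cycle life plus 5,000 cycles at a 10,000-cycle life accumulate a damage fraction of 0.6, i.e. 60% of the predicted life consumed.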

  17. Hints for an extension of the early exercise premium formula for American options

    NASA Astrophysics Data System (ADS)

    Bermin, Hans-Peter; Kohatsu-Higa, Arturo; Perelló, Josep

    2005-09-01

No closed-form formula exists for the American put option price, and non-trivial computation is required to evaluate it. Strong efforts have been made to propose efficient numerical techniques, but few are backed by strong mathematical reasoning that explains why they work well. We present an extension of the American put price aimed at capturing weaknesses of the numerical methods through their non-fulfillment of the smooth-pasting condition.
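For context, one of the standard numerical techniques for the American put is the Cox-Ross-Rubinstein binomial tree, where the early-exercise behavior the smooth-pasting condition governs enters through a max() at every node. A minimal sketch with generic parameters, unrelated to the paper's extension:

```python
import math

def american_put_binomial(S0, K, r, sigma, T, n=500):
    """Price an American put on a Cox-Ross-Rubinstein binomial tree.

    Illustrative only: a standard lattice method, not the paper's formula;
    all parameter names and defaults are generic.
    """
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))    # up factor
    d = 1.0 / u                            # down factor
    p = (math.exp(r * dt) - d) / (u - d)   # risk-neutral up probability
    disc = math.exp(-r * dt)

    # option payoffs at maturity
    values = [max(K - S0 * u**j * d**(n - j), 0.0) for j in range(n + 1)]

    # backward induction with an early-exercise check at every node
    for i in range(n - 1, -1, -1):
        for j in range(i + 1):
            cont = disc * (p * values[j + 1] + (1 - p) * values[j])
            exercise = K - S0 * u**j * d**(i - j)
            values[j] = max(cont, exercise)
    return values[0]
```

The max(cont, exercise) step is exactly where lattice methods approximate the free boundary whose smoothness the abstract's extension probes.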

  18. Analysis of Special Forces Medic (18D) Attrition

    DTIC Science & Technology

    1994-08-01

    130 students per year, 18D should reach 100% strength in the second quarter of FY95. Training Issues The biggest single reason for the high attrition...including a library and 24-hour study rooms, ix SOMED and SOMTC should consider incorporating computer-based training into the 18D training. The PA...particularly in the sciences. Given this, the high SOMED attrition rate is to be expected. Some have suggested that a greater effort should be made to recruit

  19. Top ten reasons the World Wide Web may fail to change medical education.

    PubMed

    Friedman, R B

    1996-09-01

    The Internet's World Wide Web (WWW) offers educators a unique opportunity to introduce computer-assisted instructional (CAI) programs into the medical school curriculum. With the WWW, CAI programs developed at one medical school could be successfully used at other institutions without concern about hardware or software compatibility; further, programs could be maintained and regularly updated at a single central location, could be distributed rapidly, would be technology-independent, and would be presented in the same format on all computers. However, while the WWW holds promise for CAI, the author discusses ten reasons that educators' efforts to fulfill the Web's promise may fail, including the following: CAI is generally not fully integrated into the medical school curriculum; students are not tested on material taught using CAI; and CAI programs tend to be poorly designed. The author argues that medical educators must overcome these obstacles if they are to make truly effective use of the WWW in the classroom.

  20. We favor formal models of heuristics rather than lists of loose dichotomies: a reply to Evans and Over

    PubMed Central

    Gigerenzer, Gerd

    2009-01-01

In their comment on Marewski et al. (good judgments do not require complex cognition, 2009), Evans and Over (heuristic thinking and human intelligence: a commentary on Marewski, Gaissmaier and Gigerenzer, 2009) conjectured that heuristics can often lead to biases and are not error free. This is a most surprising critique. The computational models of heuristics we have tested allow for quantitative predictions of how many errors a given heuristic will make, and we and others have measured the amount of error by analysis, computer simulation, and experiment. This is clear progress over simply giving heuristics labels, such as availability, that do not allow for quantitative comparisons of errors. Evans and Over argue that the reason people rely on heuristics is the accuracy-effort trade-off. However, the comparison between heuristics and more effortful strategies, such as multiple regression, has shown that there are many situations in which a heuristic is more accurate with less effort. Finally, we do not see how the fast and frugal heuristics program could benefit from a dual-process framework unless the dual-process framework is made more precise. Instead, the dual-process framework could benefit if its two “black boxes” (Type 1 and Type 2 processes) were replaced with computational models of both heuristics and other processes. PMID:19784854
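A concrete computational model of a fast-and-frugal heuristic, in the sense this reply describes, is take-the-best: cues are checked in order of validity and search stops at the first cue that discriminates. A minimal sketch, with illustrative binary cue encodings and made-up validities:

```python
def take_the_best(cue_validities, cues_a, cues_b):
    """Choose between objects A and B with the take-the-best heuristic.

    Cues (binary 1/0) are examined in descending order of validity;
    the first discriminating cue decides, and all remaining cues are
    ignored. A toy sketch of one heuristic from the fast-and-frugal
    program; the encodings and validities are illustrative.
    """
    order = sorted(range(len(cue_validities)),
                   key=lambda i: cue_validities[i], reverse=True)
    for i in order:
        if cues_a[i] != cues_b[i]:
            return "A" if cues_a[i] > cues_b[i] else "B"
    return "tie"  # no cue discriminates: guess
```

Because it is fully specified, a model like this yields exact error counts on any dataset, which is the quantitative comparability the reply contrasts with verbal labels such as "availability".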

  1. Refusal to participate in heart failure studies: do age and gender matter?

    PubMed Central

    Harrison, Jordan M; Jung, Miyeon; Lennie, Terry A; Moser, Debra K; Smith, Dean G; Dunbar, Sandra B; Ronis, David L; Koelling, Todd M; Giordani, Bruno; Riley, Penny L; Pressler, Susan J

    2018-01-01

Aims and objectives The objective of this retrospective study was to evaluate reasons heart failure patients decline study participation, to inform interventions to improve enrollment. Background Failure to enrol older heart failure patients (age > 65) and women in studies may lead to sampling bias, threatening study validity. Design This study was a retrospective analysis of refusal data from four heart failure studies that enrolled 788 patients in four states. Methods Chi-Square and a pooled t-test were computed to analyse refusal data (n = 300) obtained from heart failure patients who were invited to participate in one of the four studies but declined. Results Refusal reasons from 300 patients (66% men, mean age 65.33) included: not interested (n = 163), too busy (n = 64), travel burden (n = 50), too sick (n = 38), family problems (n = 14), too much commitment (n = 13) and privacy concerns (n = 4). Chi-Square analyses showed no differences in frequency of reasons (p > 0.05) between men and women. Patients who refused were older, on average, than study participants. Conclusions Some reasons were patient-dependent; others were study-dependent. With ‘not interested’ as the most common reason, cited by over 50% of patients who declined, recruitment measures should be targeted at stimulating patients’ interest. Additional efforts may be needed to recruit older participants. However, reasons for refusal were consistent regardless of gender. Relevance to clinical practice Heart failure researchers should proactively approach a greater proportion of women and patients over age 65. With no gender differences in type of reasons for refusal, similar recruitment strategies can be used for men and women. However, enrolment of a representative proportion of women in heart failure studies has proven elusive and may require significant effort from researchers.
Employing strategies to stimulate interest in studies is essential for recruiting heart failure patients, who overwhelmingly cited lack of interest as the top reason for refusal. PMID:26914834
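The chi-square comparison reported in the Methods amounts to computing the Pearson statistic for a groups-by-reasons contingency table. A minimal stdlib-only sketch; the counts used in any example are invented, not the study's data:

```python
def chi_square_stat(table):
    """Pearson chi-square statistic for an r x c contingency table
    (rows: groups such as men/women; columns: refusal reasons).

    Illustrative of the test the study reports; compare the statistic
    against a chi-square critical value with (r-1)*(c-1) degrees of
    freedom to obtain a p-value.
    """
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    total = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / total  # expected count
            stat += (obs - exp) ** 2 / exp
    return stat
```

When the two groups cite reasons in identical proportions the statistic is zero, matching the study's finding of no gender difference (p > 0.05).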

  2. Automated Performance Prediction of Message-Passing Parallel Programs

    NASA Technical Reports Server (NTRS)

    Block, Robert J.; Sarukkai, Sekhar; Mehra, Pankaj; Woodrow, Thomas S. (Technical Monitor)

    1995-01-01

The increasing use of massively parallel supercomputers to solve large-scale scientific problems has generated a need for tools that can predict scalability trends of applications written for these machines. Much work has been done to create simple models that represent important characteristics of parallel programs, such as latency, network contention, and communication volume. But many of these methods still require substantial manual effort to represent an application in the model's format. The MK toolkit described in this paper is the result of an ongoing effort to automate the formation of analytic expressions of program execution time, with a minimum of programmer assistance. In this paper we demonstrate the feasibility of our approach by extending previous work to detect and model communication patterns automatically, with and without overlapped computations. The predictions derived from these models agree, within reasonable limits, with execution times of programs measured on the Intel iPSC/860 and Paragon. Further, we demonstrate the use of MK in selecting optimal computational grain size and studying various scalability metrics.
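The kind of analytic runtime expression such a toolkit derives can be illustrated with the classic latency-bandwidth (alpha-beta) communication model plus a compute term. The stencil-sweep formula below is a hand-built assumption for illustration only, not the toolkit's actual output or interface:

```python
def message_time(nbytes, latency_s, bandwidth_bytes_per_s):
    """Classic alpha-beta cost of sending one message."""
    return latency_s + nbytes / bandwidth_bytes_per_s

def predicted_runtime(n, p, t_flop, latency_s, bandwidth):
    """Hypothetical analytic per-iteration runtime for a 1-D
    domain-decomposed sweep: local compute plus two halo exchanges.

    All parameters are illustrative assumptions; a real tool would
    derive such an expression automatically from the program.
    """
    compute = (n / p) * t_flop                             # work split over p ranks
    halo = 2 * message_time(8, latency_s, bandwidth)       # two 8-byte halo values
    return compute + halo
```

Evaluating such a closed-form model over a range of process counts p reveals the scalability trend (compute shrinks as 1/p while the latency term stays fixed) without running the program.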

  3. Reason, emotion and decision-making: risk and reward computation with feeling.

    PubMed

    Quartz, Steven R

    2009-05-01

Many models of judgment and decision-making posit distinct cognitive and emotional contributions to decision-making under uncertainty. Cognitive processes typically involve exact computations according to a cost-benefit calculus, whereas emotional processes typically involve approximate, heuristic processes that deliver rapid evaluations without mental effort. However, it remains largely unknown which specific parameters of uncertain decisions the brain encodes, the extent to which these parameters correspond to various decision-making frameworks, and their correspondence to emotional and rational processes. Here, I review research suggesting that emotional processes encode in a precise quantitative manner the basic parameters of financial decision theory, indicating a reorientation of emotional and cognitive contributions to risky choice.

  4. Optimization of a Monte Carlo Model of the Transient Reactor Test Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Kristin; DeHart, Mark; Goluoglu, Sedat

    2017-03-01

The ultimate goal of modeling and simulation is to obtain reasonable answers to problems that do not have easily evaluated representations, while minimizing the amount of computational resources. With the advances of large-scale computing centers during the last twenty years, researchers have been able to create a multitude of tools that minimize the number of approximations necessary when modeling a system. The tremendous power of these centers requires the user to possess an immense amount of knowledge to optimize models for accuracy and efficiency. This paper seeks to evaluate the KENO model of TREAT to optimize calculational efforts.

  5. Computation of turbulent boundary layers employing the defect wall-function method. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Brown, Douglas L.

    1994-01-01

In order to decrease the overall computational time requirements of a spatially marching parabolized Navier-Stokes finite-difference computer code applied to turbulent fluid flow, a wall-function methodology originally proposed by R. Barnwell was implemented. This numerical effort increases computational speed and calculates reasonably accurate wall shear stress spatial distributions and boundary-layer profiles. Since the wall shear stress is analytically determined from the wall-function model, the computational grid near the wall is not required to spatially resolve the laminar-viscous sublayer. Consequently, a substantially increased computational integration step size is achieved, resulting in a considerable decrease in net computational time. This wall-function technique is demonstrated for adiabatic flat plate test cases from Mach 2 to Mach 8. These test cases are analytically verified employing: (1) Eckert reference method solutions, (2) experimental turbulent boundary-layer data of Mabey, and (3) finite-difference computational code solutions with fully resolved laminar-viscous sublayers. Additionally, results have been obtained for two pressure-gradient cases: (1) an adiabatic expansion corner and (2) an adiabatic compression corner.
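The essence of a wall-function approach can be sketched with the standard incompressible log law rather than Barnwell's compressible defect formulation: one velocity sample away from the wall determines the friction velocity, and hence the wall shear stress, analytically, so the sublayer need not be gridded. A simplified stand-in, with standard constants:

```python
import math

KAPPA, B = 0.41, 5.0  # standard log-law constants

def friction_velocity(U, y, nu, u_tau0=0.05, tol=1e-10, itmax=100):
    """Solve the incompressible log law
        U/u_tau = (1/kappa) * ln(y * u_tau / nu) + B
    for the friction velocity u_tau by Newton iteration.

    A simplified illustration of the wall-function idea, not the thesis's
    defect formulation: wall shear stress follows as tau_w = rho * u_tau**2
    from a single velocity sample U at wall distance y.
    """
    u_tau = u_tau0
    for _ in range(itmax):
        f = u_tau * (math.log(y * u_tau / nu) / KAPPA + B) - U
        df = math.log(y * u_tau / nu) / KAPPA + B + 1.0 / KAPPA
        step = f / df
        u_tau -= step
        if abs(step) < tol:
            break
    return u_tau
```

Because u_tau comes from this algebraic solve rather than from a resolved near-wall velocity gradient, the first grid point can sit in the log layer, which is the source of the step-size and run-time savings the abstract reports.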

  6. Higher Order Time Integration Schemes for the Unsteady Navier-Stokes Equations on Unstructured Meshes

    NASA Technical Reports Server (NTRS)

    Jothiprasad, Giridhar; Mavriplis, Dimitri J.; Caughey, David A.

    2002-01-01

    The rapid increase in available computational power over the last decade has enabled higher resolution flow simulations and more widespread use of unstructured grid methods for complex geometries. While much of this effort has been focused on steady-state calculations in the aerodynamics community, the need to accurately predict off-design conditions, which may involve substantial amounts of flow separation, points to the need to efficiently simulate unsteady flow fields. Accurate unsteady flow simulations can easily require several orders of magnitude more computational effort than a corresponding steady-state simulation. For this reason, techniques for improving the efficiency of unsteady flow simulations are required in order to make such calculations feasible in the foreseeable future. The purpose of this work is to investigate possible reductions in computer time due to the choice of an efficient time-integration scheme from a series of schemes differing in the order of time-accuracy, and by the use of more efficient techniques to solve the nonlinear equations which arise while using implicit time-integration schemes. This investigation is carried out in the context of a two-dimensional unstructured mesh laminar Navier-Stokes solver.
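The accuracy-per-step advantage of higher-order implicit time integration can be seen on a scalar model problem. The sketch below compares backward Euler (first order) with BDF2 (second order) on dy/dt = -lam*y; it illustrates the paper's theme only and is not its unstructured-mesh Navier-Stokes solver:

```python
def integrate(lam, y0, T, n, order=2):
    """Integrate dy/dt = -lam*y over [0, T] in n implicit steps.

    order=1: backward Euler. order=2: BDF2 with a backward-Euler
    startup step. The problem is linear, so each implicit step has a
    closed-form solve; an illustrative model problem only.
    """
    dt = T / n
    y_prev, y = y0, y0 / (1 + lam * dt)   # first step: backward Euler
    if order == 1:
        for _ in range(n - 1):
            y = y / (1 + lam * dt)
        return y
    # BDF2: (3*y_new - 4*y + y_prev) / (2*dt) = -lam * y_new
    for _ in range(n - 1):
        y_next = (4 * y - y_prev) / (3 + 2 * lam * dt)
        y_prev, y = y, y_next
    return y
```

For the same step count, the BDF2 error falls roughly with dt squared while backward Euler's falls only with dt, which is why scheme order is one of the levers the paper investigates for reducing unsteady-simulation cost.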

  7. Cognitive Mechanisms of Change in Delusions: An Experimental Investigation Targeting Reasoning to Effect Change in Paranoia

    PubMed Central

    Garety, Philippa; Waller, Helen; Emsley, Richard; Jolley, Suzanne; Kuipers, Elizabeth; Bebbington, Paul; Dunn, Graham; Fowler, David; Hardy, Amy; Freeman, Daniel

    2015-01-01

Background: Given the evidence that reasoning biases contribute to delusional persistence and change, several research groups have made systematic efforts to modify them. The current experiment tested the hypothesis that targeting reasoning biases would result in change in delusions. Methods: One hundred and one participants with current delusions and schizophrenia spectrum psychosis were randomly allocated to a brief computerized reasoning training intervention or to a control condition involving computer-based activities of similar duration. The primary hypotheses tested were that the reasoning training intervention would improve (1) data gathering and belief flexibility and (2) delusional thinking, specifically paranoia. We then tested whether the changes in paranoia were mediated by changes in data gathering and flexibility, and whether working memory and negative symptoms moderated any intervention effects. Results: On an intention-to-treat analysis, there were significant improvements in state paranoia and reasoning in the experimental compared with the control condition. There was evidence that changes in reasoning mediated changes in paranoia, although this effect fell just outside the conventional level of significance after adjustment for baseline confounders. Working memory and negative symptoms significantly moderated the effects of the intervention on reasoning. Conclusion: The study demonstrated the effectiveness of a brief reasoning intervention in improving both reasoning processes and paranoia. It thereby provides proof-of-concept evidence that reasoning is a promising intermediary target in interventions to ameliorate delusions, and thus supports the value of developing this approach as a longer therapeutic intervention. PMID:25053650

  8. The Berlin Brain-Computer Interface: Progress Beyond Communication and Control

    PubMed Central

    Blankertz, Benjamin; Acqualagna, Laura; Dähne, Sven; Haufe, Stefan; Schultze-Kraft, Matthias; Sturm, Irene; Ušćumlic, Marija; Wenzel, Markus A.; Curio, Gabriel; Müller, Klaus-Robert

    2016-01-01

The combined effect of fundamental results about neurocognitive processes and advancements in decoding mental states from ongoing brain signals has brought forth a whole range of potential neurotechnological applications. In this article, we review our developments in this area and put them into perspective. These examples cover a wide range of maturity levels with respect to their applicability. While we assume we are still a long way from integrating Brain-Computer Interface (BCI) technology into general interaction with computers, or from implementing neurotechnological measures in safety-critical workplaces, results have already been obtained using a BCI as a research tool. In this article, we discuss the reasons why, in some of the prospective application domains, considerable effort is still required to make the systems ready to deal with the full complexity of the real world. PMID:27917107

  9. The Berlin Brain-Computer Interface: Progress Beyond Communication and Control.

    PubMed

    Blankertz, Benjamin; Acqualagna, Laura; Dähne, Sven; Haufe, Stefan; Schultze-Kraft, Matthias; Sturm, Irene; Ušćumlic, Marija; Wenzel, Markus A; Curio, Gabriel; Müller, Klaus-Robert

    2016-01-01

The combined effect of fundamental results about neurocognitive processes and advancements in decoding mental states from ongoing brain signals has brought forth a whole range of potential neurotechnological applications. In this article, we review our developments in this area and put them into perspective. These examples cover a wide range of maturity levels with respect to their applicability. While we assume we are still a long way from integrating Brain-Computer Interface (BCI) technology into general interaction with computers, or from implementing neurotechnological measures in safety-critical workplaces, results have already been obtained using a BCI as a research tool. In this article, we discuss the reasons why, in some of the prospective application domains, considerable effort is still required to make the systems ready to deal with the full complexity of the real world.

  10. MITT writer and MITT writer advanced development: Developing authoring and training systems for complex technical domains

    NASA Technical Reports Server (NTRS)

    Wiederholt, Bradley J.; Browning, Elica J.; Norton, Jeffrey E.; Johnson, William B.

    1991-01-01

MITT Writer is a software system for developing computer-based training for complex technical domains. A training system produced by MITT Writer allows a student to learn and practice troubleshooting and diagnostic skills. The MITT (Microcomputer Intelligence for Technical Training) architecture is a reasonable approach to simulation-based diagnostic training. MITT delivers training on available computing equipment, provides challenging training and simulation scenarios, and has economical development and maintenance costs. A 15-month effort was undertaken in which the MITT Writer system was developed. A workshop was also conducted to train instructors in how to use MITT Writer. Earlier versions were used to develop an Intelligent Tutoring System for troubleshooting the Minuteman Missile Message Processing System.

  11. Composite theory applied to elastomers

    NASA Technical Reports Server (NTRS)

    Clark, S. K.

    1986-01-01

Reinforced elastomers form the basis for most structural or load-carrying applications of rubber products. Computer-based structural analysis in the form of finite element codes has been highly successful in refining structural design in both isotropic materials and rigid composites. This has led the rubber industry to attempt to make use of such techniques in the design of structural cord-rubber composites. While such efforts appear promising, they have not been easy to achieve, for several reasons. Among these is a distinct lack of a clearly defined set of material property descriptors suitable for computer analysis. There are substantial differences between conventional steel, aluminum, or even rigid composites such as graphite-epoxy, and textile-cord reinforced rubber. These differences, which are both conceptual and practical, are discussed.

  12. Prediction of electronic structure of organic radicaloid anions using efficient, economical multireference gradient approach.

    PubMed

    Chattopadhyay, Sudip; Chaudhuri, Rajat K; Freed, Karl F

    2011-04-28

    The improved virtual orbital-complete active space configuration interaction (IVO-CASCI) method enables an economical and reasonably accurate treatment of static correlation in systems with significant multireference character, even when using a moderate basis set. This IVO-CASCI method supplants the computationally more demanding complete active space self-consistent field (CASSCF) method by producing comparable accuracy with diminished computational effort because the IVO-CASCI approach does not require additional iterations beyond an initial SCF calculation, nor does it encounter convergence difficulties or multiple solutions that may be found in CASSCF calculations. Our IVO-CASCI analytical gradient approach is applied to compute the equilibrium geometry for the ground and lowest excited state(s) of the theoretically very challenging 2,6-pyridyne, 1,2,3-tridehydrobenzene and 1,3,5-tridehydrobenzene anionic systems for which experiments are lacking, accurate quantum calculations are almost completely absent, and commonly used calculations based on single reference configurations fail to provide reasonable results. Hence, the computational complexity provides an excellent test for the efficacy of multireference methods. The present work clearly illustrates that the IVO-CASCI analytical gradient method provides a good description of the complicated electronic quasi-degeneracies during the geometry optimization process for the radicaloid anions. The IVO-CASCI treatment produces almost identical geometries as the CASSCF calculations (performed for this study) at a fraction of the computational labor. Adiabatic energy gaps to low lying excited states likewise emerge from the IVO-CASCI and CASSCF methods as very similar. We also provide harmonic vibrational frequencies to demonstrate the stability of the computed geometries.

  13. A CFD Database for Airfoils and Wings at Post-Stall Angles of Attack

    NASA Technical Reports Server (NTRS)

    Petrilli, Justin; Paul, Ryan; Gopalarathnam, Ashok; Frink, Neal T.

    2013-01-01

This paper presents selected results from an ongoing effort to develop an aerodynamic database from Reynolds-Averaged Navier-Stokes (RANS) computational analysis of airfoils and wings at stall and post-stall angles of attack. The data obtained from this effort will be used for validation and refinement of a low-order post-stall prediction method developed at NCSU, and to fill existing gaps in high angle of attack data in the literature. Such data could have potential applications in post-stall flight dynamics, helicopter aerodynamics and wind turbine aerodynamics. An overview of the NASA TetrUSS CFD package used for the RANS computational approach is presented. Detailed results for three airfoils are presented to compare their stall and post-stall behavior. The results for finite wings at stall and post-stall conditions focus on the effects of taper ratio and sweep angle, with particular attention to whether the sectional flows can be approximated using two-dimensional flow over a stalled airfoil. While this approximation seems reasonable for unswept wings even at post-stall conditions, significant spanwise flow on stalled swept wings precludes the use of two-dimensional data to model sectional flows on swept wings. Thus, further effort is needed in low-order aerodynamic modeling of swept wings at stalled conditions.

  14. HTSFinder: Powerful Pipeline of DNA Signature Discovery by Parallel and Distributed Computing

    PubMed Central

    Karimi, Ramin; Hajdu, Andras

    2016-01-01

Comprehensive effort for low-cost sequencing in the past few years has led to the growth of complete genome databases. In parallel with this effort, fast and cost-effective methods and applications have been developed to meet the strong need to accelerate sequence analysis. Identification is the very first step of this task. Due to the difficulties, high costs, and computational challenges of alignment-based approaches, an alternative universal identification method is highly desirable. As an alignment-free approach, DNA signatures have provided new opportunities for the rapid identification of species. In this paper, we present an effective pipeline, HTSFinder (high-throughput signature finder), with a corresponding k-mer generator, GkmerG (genome k-mers generator). Using this pipeline, we determine the frequency of k-mers from the available complete genome databases for the detection of extensive DNA signatures in a reasonably short time. Our application can detect both unique and common signatures in arbitrarily selected target and nontarget databases. Hadoop and MapReduce, as parallel and distributed computing tools running on commodity hardware, are used in this pipeline. This approach brings the power of high-performance computing to ordinary desktop personal computers for discovering DNA signatures in large databases such as bacterial genomes. The considerable number of detected unique and common DNA signatures of the target database brings opportunities to improve the identification process not only for polymerase chain reaction and microarray assays but also for more complex scenarios such as metagenomics and next-generation sequencing analysis. PMID:26884678
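The set-algebra core of unique-signature discovery (k-mers shared by all target genomes and absent from all non-target genomes) can be sketched on a single machine. The pipeline distributes this work with Hadoop/MapReduce; the names and default k below are illustrative, not the tool's actual interface:

```python
def kmers(seq, k):
    """The set of all k-length substrings of a DNA sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def unique_signatures(target_seqs, nontarget_seqs, k=8):
    """k-mers present in every target sequence and absent from all
    non-target sequences.

    A toy, single-machine sketch of the signature-discovery logic; a
    realistic run would distribute k-mer counting across a cluster and
    use much longer k.
    """
    common = set.intersection(*(kmers(s, k) for s in target_seqs))
    background = set().union(*(kmers(s, k) for s in nontarget_seqs))
    return common - background
```

Swapping the final set difference for an intersection with the background yields common (shared) signatures instead of unique ones, mirroring the two detection modes the abstract mentions.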

  15. HTSFinder: Powerful Pipeline of DNA Signature Discovery by Parallel and Distributed Computing.

    PubMed

    Karimi, Ramin; Hajdu, Andras

    2016-01-01

Comprehensive effort for low-cost sequencing in the past few years has led to the growth of complete genome databases. In parallel with this effort, fast and cost-effective methods and applications have been developed to meet the strong need to accelerate sequence analysis. Identification is the very first step of this task. Due to the difficulties, high costs, and computational challenges of alignment-based approaches, an alternative universal identification method is highly desirable. As an alignment-free approach, DNA signatures have provided new opportunities for the rapid identification of species. In this paper, we present an effective pipeline, HTSFinder (high-throughput signature finder), with a corresponding k-mer generator, GkmerG (genome k-mers generator). Using this pipeline, we determine the frequency of k-mers from the available complete genome databases for the detection of extensive DNA signatures in a reasonably short time. Our application can detect both unique and common signatures in arbitrarily selected target and nontarget databases. Hadoop and MapReduce, as parallel and distributed computing tools running on commodity hardware, are used in this pipeline. This approach brings the power of high-performance computing to ordinary desktop personal computers for discovering DNA signatures in large databases such as bacterial genomes. The considerable number of detected unique and common DNA signatures of the target database brings opportunities to improve the identification process not only for polymerase chain reaction and microarray assays but also for more complex scenarios such as metagenomics and next-generation sequencing analysis.

  16. Teaching Computational Thinking: Deciding to Take Small Steps in a Curriculum

    NASA Astrophysics Data System (ADS)

    Madoff, R. D.; Putkonen, J.

    2016-12-01

While computational thinking and reasoning are not necessarily the same as computer programming, programs such as MATLAB can provide the medium through which the logical and computational thinking at the foundation of science can be taught, learned, and experienced. And while math and computer anxiety are often discussed as critical obstacles to students' progress in their geoscience curriculum, it is suggested here that unfamiliarity with computational and logical reasoning poses the first stumbling block, in addition to the hurdle of expending the effort to learn how to translate a computational problem into the appropriate computer syntax in order to achieve the intended results. Because computational thinking is so vital for all fields, there is a need to introduce many students to it and to build support for it in the curriculum. This presentation focuses on elements to bring into the teaching of computational thinking that are intended as additions to learning MATLAB programming as a basic tool. Such elements include: highlighting a key concept, discussing a basic geoscience problem where the concept would show up, having the student draw or outline a sketch of what they think an operation needs to do in order to produce a desired result, and then finding the relevant syntax to work with. This iterative pedagogy simulates what someone with more experience in programming does, so it discloses the thinking process inside the black box of a result. Intended only as a very early-stage introduction, this approach would need advanced applications developed as students go through an academic program. The objective would be to expose and introduce computational thinking to majors and non-majors and to alleviate some of the math and computer anxiety, so that students would choose to advance on with programming or modeling, whether or not it is built into a 4-year curriculum.

  17. Cognitive mechanisms of change in delusions: an experimental investigation targeting reasoning to effect change in paranoia.

    PubMed

    Garety, Philippa; Waller, Helen; Emsley, Richard; Jolley, Suzanne; Kuipers, Elizabeth; Bebbington, Paul; Dunn, Graham; Fowler, David; Hardy, Amy; Freeman, Daniel

    2015-03-01

Given the evidence that reasoning biases contribute to delusional persistence and change, several research groups have made systematic efforts to modify them. The current experiment tested the hypothesis that targeting reasoning biases would result in change in delusions. One hundred and one participants with current delusions and schizophrenia spectrum psychosis were randomly allocated to a brief computerized reasoning training intervention or to a control condition involving computer-based activities of similar duration. The primary hypotheses tested were that the reasoning training intervention would improve (1) data gathering and belief flexibility and (2) delusional thinking, specifically paranoia. We then tested whether the changes in paranoia were mediated by changes in data gathering and flexibility, and whether working memory and negative symptoms moderated any intervention effects. On an intention-to-treat analysis, there were significant improvements in state paranoia and reasoning in the experimental compared with the control condition. There was evidence that changes in reasoning mediated changes in paranoia, although this effect fell just outside the conventional level of significance after adjustment for baseline confounders. Working memory and negative symptoms significantly moderated the effects of the intervention on reasoning. The study demonstrated the effectiveness of a brief reasoning intervention in improving both reasoning processes and paranoia. It thereby provides proof-of-concept evidence that reasoning is a promising intermediary target in interventions to ameliorate delusions, and thus supports the value of developing this approach as a longer therapeutic intervention. © The Author 2014. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center.

  18. A Large-Scale, High-Resolution Hydrological Model Parameter Data Set for Climate Change Impact Assessment for the Conterminous US

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oubeidillah, Abdoul A; Kao, Shih-Chieh; Ashfaq, Moetasim

    2014-01-01

    To extend geographical coverage, refine spatial resolution, and improve modeling efficiency, a computation- and data-intensive effort was conducted to organize a comprehensive hydrologic dataset with post-calibrated model parameters for hydro-climate impact assessment. Several key inputs for hydrologic simulation, including meteorological forcings, soil, land class, vegetation, and elevation, were collected from multiple best-available data sources and organized for 2107 hydrologic subbasins (8-digit hydrologic units, HUC8s) in the conterminous United States at a refined 1/24° (~4 km) spatial resolution. Using high-performance computing for intensive model calibration, a high-resolution parameter dataset was prepared for the macro-scale Variable Infiltration Capacity (VIC) hydrologic model. The VIC simulation was driven by DAYMET daily meteorological forcing and was calibrated against USGS WaterWatch monthly runoff observations for each HUC8. The results showed that this new parameter dataset may help reasonably simulate runoff at most US HUC8 subbasins. Based on this exhaustive calibration effort, it is now possible to accurately estimate the resources required for further model improvement across the entire conterminous United States. We anticipate that through this hydrologic parameter dataset, the repeated effort of fundamental data processing can be lessened, so that research efforts can emphasize the more challenging task of assessing climate change impacts. The pre-organized model parameter dataset will be provided to interested parties to support further hydro-climate impact assessment.
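
    Calibration skill in efforts like this is commonly scored with a goodness-of-fit metric such as the Nash-Sutcliffe efficiency (NSE) between simulated and observed monthly runoff. The abstract does not name the metric actually used, so the following is only an illustrative sketch on synthetic data, not the VIC calibration itself:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1.0 is a perfect fit; 0.0 means the
    simulation predicts no better than the mean of the observations."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2
    )

# Synthetic monthly runoff (mm) for one hypothetical HUC8 subbasin
obs = np.array([30.0, 45.0, 80.0, 120.0, 90.0, 60.0,
                40.0, 25.0, 20.0, 22.0, 28.0, 35.0])
sim = obs + np.array([5.0, -4.0, 8.0, -10.0, 6.0, -5.0,
                      3.0, -2.0, 2.0, -1.0, 3.0, -4.0])

print(round(nash_sutcliffe(obs, sim), 3))
```

    A calibration loop would repeatedly adjust model parameters for each subbasin and keep the parameter set that maximizes a score of this kind.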

  19. Creative and algorithmic mathematical reasoning: effects of transfer-appropriate processing and effortful struggle

    NASA Astrophysics Data System (ADS)

    Jonsson, Bert; Kulaksiz, Yagmur C.; Lithner, Johan

    2016-11-01

    Two separate studies, Jonsson et al. (J. Math Behav. 2014;36:20-32) and Karlsson Wirebring et al. (Trends Neurosci Educ. 2015;4(1-2):6-14), showed that students who learn mathematics through creative mathematical reasoning, constructing their own solution methods, can learn more efficiently than students who use algorithmic reasoning and are given the solution procedures. It was argued that effortful struggle was the key factor explaining this difference. It was also argued that the results could not be explained by the effects of transfer-appropriate processing, although this was not empirically investigated. This study evaluated the hypotheses of transfer-appropriate processing and effortful struggle in relation to the specific characteristics associated with algorithmic reasoning tasks and creative mathematical reasoning tasks. In a between-subjects design, upper-secondary students were matched according to their working memory capacity.

  20. Computational and Physical Analysis of Catalytic Compounds

    NASA Astrophysics Data System (ADS)

    Wu, Richard; Sohn, Jung Jae; Kyung, Richard

    2015-03-01

    Nanoparticles exhibit unique physical and chemical properties depending on their geometrical properties. For this reason, synthesis of nanoparticles with controlled shape and size is important for exploiting those unique properties. Catalyst supports are usually made of high-surface-area porous oxides or carbon nanomaterials. These support materials stabilize metal catalysts against sintering at high reaction temperatures. Many studies have demonstrated large enhancements of catalytic behavior due to the role of the oxide-metal interface. In this paper, the catalyzing ability of supported nanoscale metal oxides, such as silicon oxide and titanium oxide compounds, has been analyzed using computational chemistry methods. Computational programs such as GAMESS and Chemcraft have been used in an effort to compute the efficiencies of the catalytic compounds and the bonding-energy changes during optimization convergence. The results illustrate how the metal oxides stabilize and the steps this process takes. A plot of computation step (N) versus energy (kcal/mol) shows that the energy of the titania converges faster, at the 7th iteration, whereas the silica converges at the 9th iteration.
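
    The convergence comparison above (titania at the 7th iteration, silica at the 9th) amounts to scanning an optimization energy trace for the first step whose energy change falls below a tolerance. The traces below are synthetic placeholders chosen to mimic that behavior, not values from the study:

```python
def convergence_step(energies, tol=1e-2):
    """Return the 1-based iteration at which the energy change between
    successive optimization steps first drops below tol, or None if the
    trace never converges."""
    for i in range(1, len(energies)):
        if abs(energies[i] - energies[i - 1]) < tol:
            return i + 1
    return None

# Synthetic energy traces (kcal/mol) decaying geometrically toward a minimum;
# the faster decay converges in fewer iterations
titania = [-100.0 + 5.0 * 0.25 ** n for n in range(10)]
silica = [-150.0 + 5.0 * 0.40 ** n for n in range(10)]

print(convergence_step(titania), convergence_step(silica))  # 7 9
```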

  1. Theoretical and Experimental Particle Velocity in Cold Spray

    NASA Astrophysics Data System (ADS)

    Champagne, Victor K.; Helfritch, Dennis J.; Dinavahi, Surya P. G.; Leyman, Phillip F.

    2011-03-01

    In an effort to corroborate the theoretical and experimental techniques used for cold spray particle velocity analysis, two theoretical methods and one experimental method were used to analyze the operation of a nozzle accelerating aluminum particles in nitrogen gas. Two-dimensional (2D) axisymmetric computations of the flow through the nozzle were performed using a Reynolds-averaged Navier-Stokes code on a computational fluid dynamics platform. One-dimensional (1D) isentropic gas-dynamic equations were solved for the same nozzle geometry and initial conditions. Finally, the velocities of particles exiting a nozzle of the same geometry, operated at the same initial conditions, were measured by a dual-slit velocimeter. Exit plume particle velocities as determined by the three methods compared reasonably well, and differences could be attributed to frictional and particle distribution effects.
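
    The 1D isentropic estimate mentioned above follows from standard compressible-flow relations: given stagnation conditions and an exit pressure, the gas exit velocity is v = sqrt(2 c_p T0 [1 - (p_e/p_0)^((gamma-1)/gamma)]). The operating conditions below are illustrative assumptions for a nitrogen-driven nozzle, not values reported in the paper:

```python
import math

def isentropic_exit_velocity(T0, p0, pe, gamma=1.4, R=296.8):
    """Gas exit velocity (m/s) for 1D isentropic nozzle flow.
    T0: stagnation temperature (K); p0: stagnation pressure (Pa);
    pe: exit static pressure (Pa). Defaults (gamma, R) are for nitrogen."""
    cp = gamma * R / (gamma - 1.0)  # specific heat at constant pressure
    return math.sqrt(2.0 * cp * T0 * (1.0 - (pe / p0) ** ((gamma - 1.0) / gamma)))

# Illustrative cold-spray-like conditions: nitrogen at 773 K and 2 MPa,
# expanding to atmospheric pressure
v_exit = isentropic_exit_velocity(T0=773.0, p0=2.0e6, pe=101325.0)
print(round(v_exit))  # on the order of 1 km/s
```

    Particle exit velocities are lower than this gas velocity because finite drag limits momentum transfer to the particles, which is why the 2D computations and velocimeter measurements are needed alongside the 1D estimate.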

  2. Terminology development towards harmonizing multiple clinical neuroimaging research repositories.

    PubMed

    Turner, Jessica A; Pasquerello, Danielle; Turner, Matthew D; Keator, David B; Alpert, Kathryn; King, Margaret; Landis, Drew; Calhoun, Vince D; Potkin, Steven G; Tallis, Marcelo; Ambite, Jose Luis; Wang, Lei

    2015-07-01

    Data sharing and mediation across disparate neuroimaging repositories requires extensive effort to ensure that the different domains of data types are referred to by commonly agreed upon terms. Within the SchizConnect project, which enables querying across decentralized databases of neuroimaging, clinical, and cognitive data from various studies of schizophrenia, we developed a model for each data domain, identified common usable terms that could be agreed upon across the repositories, and linked them to standard ontological terms where possible. We had the goal of facilitating both the current user experience in querying and future automated computations and reasoning regarding the data. We found that existing terminologies are incomplete for these purposes, even with the history of neuroimaging data sharing in the field; and we provide a model for efforts focused on querying multiple clinical neuroimaging repositories.

  3. Terminology development towards harmonizing multiple clinical neuroimaging research repositories

    PubMed Central

    Turner, Jessica A.; Pasquerello, Danielle; Turner, Matthew D.; Keator, David B.; Alpert, Kathryn; King, Margaret; Landis, Drew; Calhoun, Vince D.; Potkin, Steven G.; Tallis, Marcelo; Ambite, Jose Luis; Wang, Lei

    2015-01-01

    Data sharing and mediation across disparate neuroimaging repositories requires extensive effort to ensure that the different domains of data types are referred to by commonly agreed upon terms. Within the SchizConnect project, which enables querying across decentralized databases of neuroimaging, clinical, and cognitive data from various studies of schizophrenia, we developed a model for each data domain, identified common usable terms that could be agreed upon across the repositories, and linked them to standard ontological terms where possible. We had the goal of facilitating both the current user experience in querying and future automated computations and reasoning regarding the data. We found that existing terminologies are incomplete for these purposes, even with the history of neuroimaging data sharing in the field; and we provide a model for efforts focused on querying multiple clinical neuroimaging repositories. PMID:26688838

  4. Leveraging Crowdsourcing and Linked Open Data for Geoscience Data Sharing and Discovery

    NASA Astrophysics Data System (ADS)

    Narock, T. W.; Rozell, E. A.; Hitzler, P.; Arko, R. A.; Chandler, C. L.; Wilson, B. D.

    2013-12-01

    Data citation standards can form the basis for increased incentives, recognition, and rewards for scientists. Additionally, knowing which data were utilized in a particular publication can enhance discovery and reuse. Yet, a lack of data citation information in existing publications as well as ambiguities across datasets can limit the accuracy of automated linking approaches. We describe a crowdsourcing approach, based on Linked Open Data, in which AGU abstracts are linked to the data used in those presentations. We discuss our efforts to incentivize participants through promotion of their research, the role that the Semantic Web can play in this effort, and how this work differs from existing platforms such as Mendeley and ResearchGate. Further, we discuss the benefits and challenges of Linked Open Data as a technical solution including the role of provenance, trust, and computational reasoning.

  5. The Influence of Effortful Thought and Cognitive Proficiencies on the Conjunction Fallacy: Implications for Dual-Process Theories of Reasoning and Judgment.

    PubMed

    Scherer, Laura D; Yates, J Frank; Baker, S Glenn; Valentine, Kathrene D

    2017-06-01

    Human judgment often violates normative standards, and virtually no judgment error has received as much attention as the conjunction fallacy. Judgment errors have historically served as evidence for dual-process theories of reasoning, insofar as these errors are assumed to arise from reliance on a fast and intuitive mental process, and are corrected via effortful deliberative reasoning. In the present research, three experiments tested the notion that conjunction errors are reduced by effortful thought. Predictions based on three different dual-process theory perspectives were tested: lax monitoring, override failure, and the Tripartite Model. Results indicated that participants higher in numeracy were less likely to make conjunction errors, but this association only emerged when participants engaged in two-sided reasoning, as opposed to one-sided or no reasoning. Confidence was higher for incorrect as opposed to correct judgments, suggesting that participants were unaware of their errors.

  6. The Fox and the Grapes-How Physical Constraints Affect Value Based Decision Making.

    PubMed

    Gross, Jörg; Woelbert, Eva; Strobel, Martin

    2015-01-01

    One fundamental question in decision making research is how humans compute the values that guide their decisions. Recent studies showed that people assign higher value to goods that are closer to them, even when physical proximity should be irrelevant for the decision from a normative perspective. This phenomenon, however, seems reasonable from an evolutionary perspective. Most foraging decisions of animals involve the trade-off between the value that can be obtained and the associated effort of obtaining. Anticipated effort for physically obtaining a good could therefore affect the subjective value of this good. In this experiment, we test this hypothesis by letting participants state their subjective value for snack food while the effort that would be incurred when reaching for it was manipulated. Even though reaching was not required in the experiment, we find that willingness to pay was significantly lower when subjects wore heavy wristbands on their arms. Thus, when reaching was more difficult, items were perceived as less valuable. Importantly, this was only the case when items were physically in front of the participants but not when items were presented as text on a computer screen. Our results suggest automatic interactions of motor and valuation processes which are unexplored to this date and may account for irrational decisions that occur when reward is particularly easy to reach.

  7. The Fox and the Grapes—How Physical Constraints Affect Value Based Decision Making

    PubMed Central

    Strobel, Martin

    2015-01-01

    One fundamental question in decision making research is how humans compute the values that guide their decisions. Recent studies showed that people assign higher value to goods that are closer to them, even when physical proximity should be irrelevant for the decision from a normative perspective. This phenomenon, however, seems reasonable from an evolutionary perspective. Most foraging decisions of animals involve the trade-off between the value that can be obtained and the associated effort of obtaining. Anticipated effort for physically obtaining a good could therefore affect the subjective value of this good. In this experiment, we test this hypothesis by letting participants state their subjective value for snack food while the effort that would be incurred when reaching for it was manipulated. Even though reaching was not required in the experiment, we find that willingness to pay was significantly lower when subjects wore heavy wristbands on their arms. Thus, when reaching was more difficult, items were perceived as less valuable. Importantly, this was only the case when items were physically in front of the participants but not when items were presented as text on a computer screen. Our results suggest automatic interactions of motor and valuation processes which are unexplored to this date and may account for irrational decisions that occur when reward is particularly easy to reach. PMID:26061087

  8. System-Level Virtualization Research at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott, Stephen L; Vallee, Geoffroy R; Naughton, III, Thomas J

    2010-01-01

    System-level virtualization, originally a technique for effectively sharing what were then considered large computing resources, faded from the spotlight as individual workstations gained in popularity with a one machine - one user approach; today it is enjoying a rebirth. One reason for this resurgence is that the simple workstation has grown in capability to rival anything available in the past. Thus, computing centers are again looking at the price/performance benefit of sharing that single computing box via server consolidation. However, industry is concentrating only on the benefits of using virtualization for server consolidation (enterprise computing), whereas our interest is in leveraging virtualization to advance high-performance computing (HPC). While these two interests may appear to be orthogonal, one consolidating multiple applications and users on a single machine while the other requires all the power from many machines to be dedicated solely to its purpose, we propose that virtualization does provide attractive capabilities that may be exploited to the benefit of HPC interests. This raises two fundamental questions: is the concept of virtualization (a machine-sharing technology) really suitable for HPC, and if so, how does one go about leveraging these virtualization capabilities for the benefit of HPC? To address these questions, this document presents ongoing studies on the usage of system-level virtualization in an HPC context. These studies include an analysis of the benefits of system-level virtualization for HPC, a presentation of research efforts based on virtualization for system availability, and a presentation of research efforts for the management of virtual systems. The basis for this document was material presented by Stephen L. Scott at the Collaborative and Grid Computing Technologies meeting held in Cancun, Mexico on April 12-14, 2007.

  9. Bridging the Vector Calculus Gap

    NASA Astrophysics Data System (ADS)

    Dray, Tevian; Manogue, Corinne

    2003-05-01

    As with Britain and America, mathematicians and physicists are separated from each other by a common language. In a nutshell, mathematics is about functions, but physics is about things. For the last several years, we have led an NSF-supported effort to "bridge the vector calculus gap" between mathematics and physics. The unifying theme we have discovered is to emphasize geometric reasoning, not (just) algebraic computation. In this talk, we will illustrate the language differences between mathematicians and physicists, and how we are trying to reconcile them in the classroom. For further information about the project go to: http://www.physics.orst.edu/bridge

  10. Machine learning in the rational design of antimicrobial peptides.

    PubMed

    Rondón-Villarreal, Paola; Sierra, Daniel A; Torres, Rodrigo

    2014-01-01

    One of the most important public health issues is microbial and bacterial resistance to conventional antibiotics in pathogenic microorganisms. In recent years, much research has focused on the development of new antibiotics. Among these efforts, antimicrobial peptides (AMPs) have emerged as a promising alternative for combating antibiotic-resistant microorganisms. For this reason, considerable theoretical effort has gone into the development of new computational tools for the rational design of better and more effective AMPs. In this review, we present an overview of the rational design of AMPs using machine learning techniques and new research fields.

  11. Advanced technology and truth in advertising

    NASA Astrophysics Data System (ADS)

    Landauer, Rolf

    1990-09-01

    Most proposals for new technological approaches fail, and that is reasonable. Despite this, most technological proposals arising from basic science are promoted without hesitation, with little attention to critical appraisal and even little opportunity for the presentation of criticism. We discuss several case histories related to devices intended to displace the transistor in computer logic. Our list includes devices using control of quantum-mechanically coherent electron transmission, devices operating at a molecular level, and devices using nonlinear electromagnetic interaction. Neural networks are placed in a different category; something seems to be coming out of this field after several decades of effort.

  12. Resolute efforts to cure hepatitis C: Understanding patients' reasons for completing antiviral treatment.

    PubMed

    Clark, Jack A; Gifford, Allen L

    2015-09-01

    Antiviral treatment for hepatitis C is usually difficult, demanding, and debilitating and has long offered modest prospects of successful cure. Most people who may need treatment have faced stigma of an illness associated with drug and alcohol misuse and thus may be deemed poor candidates for treatment, while completing a course of treatment typically calls for resolve and responsibility. Patients' efforts and their reasons for completing treatment have received scant attention in hepatitis C clinical policy discourse that instead focuses on problems of adherence and patients' expected failures. Thus, we conducted qualitative interviews with patients who had recently undertaken treatment to explore their reasons for completing antiviral treatment. Analysis of their narrative accounts identified four principal reasons: cure the infection, avoid a bad end, demonstrate the virtue of perseverance through a personal trial, and achieve personal rehabilitation. Their reasons reflect moral rationales that mark the social discredit ascribed to the infection and may represent efforts to restore creditable social membership. Their reasons may also reflect the selection processes that render some of the infected as good candidates for treatment, while excluding others. Explication of the moral context of treatment may identify opportunities to support patients' efforts in completing treatment, as well as illuminate the choices people with hepatitis C make about engaging in care. © US Government 2014.

  13. Using Computer Simulations for Promoting Model-based Reasoning. Epistemological and Educational Dimensions

    NASA Astrophysics Data System (ADS)

    Develaki, Maria

    2017-11-01

    Scientific reasoning is particularly pertinent to science education since it is closely related to the content and methodologies of science and contributes to scientific literacy. Much of the research in science education investigates the appropriate framework and teaching methods and tools needed to promote students' ability to reason and evaluate in a scientific way. This paper aims (a) to contribute to an extended understanding of the nature and pedagogical importance of model-based reasoning and (b) to exemplify how using computer simulations can support students' model-based reasoning. We provide first a background for both scientific reasoning and computer simulations, based on the relevant philosophical views and the related educational discussion. This background suggests that the model-based framework provides an epistemologically valid and pedagogically appropriate basis for teaching scientific reasoning and for helping students develop sounder reasoning and decision-taking abilities and explains how using computer simulations can foster these abilities. We then provide some examples illustrating the use of computer simulations to support model-based reasoning and evaluation activities in the classroom. The examples reflect the procedure and criteria for evaluating models in science and demonstrate the educational advantages of their application in classroom reasoning activities.

  14. Predictive Models for Semiconductor Device Design and Processing

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1998-01-01

    The device feature size continues to be on a downward trend with a simultaneous upward trend in wafer size to 300 mm. Predictive models are needed more than ever before for this reason. At NASA Ames, a Device and Process Modeling effort has been initiated recently with a view to addressing these issues. Our activities cover sub-micron device physics, process and equipment modeling, computational chemistry, and materials science. This talk will outline these efforts and emphasize the interaction among various components. The device physics component is largely based on integrating quantum effects into device simulators. We have two parallel efforts, one based on a quantum mechanics approach and the second, a semiclassical hydrodynamics approach with quantum correction terms. Under the first approach, three different quantum simulators are being developed and compared: a nonequilibrium Green's function (NEGF) approach, a Wigner function approach, and a density matrix approach. In this talk, results using various codes will be presented. Our process modeling work focuses primarily on epitaxy and etching using first-principles models coupling reactor-level and wafer-level features. For the latter, we are using a novel approach based on Level Set theory. Sample results from this effort will also be presented.

  15. Simulations of Bluff Body Flow Interaction for Noise Source Modeling

    NASA Technical Reports Server (NTRS)

    Khorrami, Mehdi R.; Lockard, David P.; Choudhari, Meelan M.; Jenkins, Luther N.; Neuhart, Dan H.; McGinley, Catherine B.

    2006-01-01

    The current study is a continuation of our effort to characterize the details of flow interaction between two cylinders in a tandem configuration. This configuration is viewed to possess many of the pertinent flow features of the highly interactive unsteady flow field associated with the main landing gear of large civil transports. The present effort extends our previous two-dimensional, unsteady, Reynolds Averaged Navier-Stokes computations to three dimensions using a quasilaminar, zonal approach, in conjunction with a two-equation turbulence model. Two distinct separation length-to-diameter ratios of L/D = 3.7 and 1.435, representing intermediate and short separation distances between the two cylinders, are simulated. The Mach 0.166 simulations are performed at a Reynolds number of Re = 1.66 × 10^5 to match the companion experiments at NASA Langley Research Center. Extensive comparisons with the measured steady and unsteady surface pressure and off-surface particle image velocimetry data show encouraging agreement. Both prominent and some of the more subtle trends in the mean and fluctuating flow fields are correctly predicted. Both computations and the measured data reveal a more robust and energetic shedding process at L/D = 3.7 in comparison with the weaker shedding in the shorter separation case of L/D = 1.435. The vortex shedding frequency based on the computed surface pressure spectra is in reasonable agreement with the measured Strouhal frequency.

  16. [APPLICATION OF COMPUTER-ASSISTED TECHNOLOGY IN ANALYSIS OF REVISION REASON OF UNICOMPARTMENTAL KNEE ARTHROPLASTY].

    PubMed

    Jia, Di; Li, Yanlin; Wang, Guoliang; Gao, Huanyu; Yu, Yang

    2016-01-01

    To summarize the reasons for revision of unicompartmental knee arthroplasty (UKA) identified using computer-assisted technology, so as to provide a reference for reducing the incidence of revision and improving surgical technique and rehabilitation. The relevant literature of recent years on using computer-assisted technology to analyze reasons for UKA revision was extensively reviewed. The revision reasons identified by computer-assisted technology are fracture of the medial tibial plateau, progressive osteoarthritis of the preserved compartment, dislocation of the mobile bearing, prosthesis loosening, polyethylene wear, and unexplained persistent pain. Computer-assisted technology can be used to analyze the reasons for UKA revision and to guide the choice of operative method and rehabilitation scheme by simulating the operative process and knee joint motion.

  17. Predictors and Impact of Self-Reported Suboptimal Effort on Estimates of Prevalence of HIV-Associated Neurocognitive Disorders.

    PubMed

    Levine, Andrew J; Martin, Eileen; Sacktor, Ned; Munro, Cynthia; Becker, James

    2017-06-01

    Prevalence estimates of HIV-associated neurocognitive disorders (HAND) may be inflated. Estimates are determined via cohort studies in which participants may apply suboptimal effort on neurocognitive testing, thereby inflating estimates. Additionally, fluctuating HAND severity over time may be related to inconsistent effort. To address these hypotheses, we characterized effort in the Multicenter AIDS Cohort Study. After neurocognitive testing, 935 participants (525 HIV- and 410 HIV+) completed the visual analog effort scale (VAES), rating their effort from 0% to 100%. Those with <100% then indicated the reason(s) for suboptimal effort. K-means cluster analysis established 3 groups: high (mean = 97%), moderate (79%), and low effort (51%). Rates of HAND and other characteristics were compared between the groups. Linear regression examined the predictors of VAES score. Data from 57 participants who completed the VAES at 2 visits were analyzed to characterize the longitudinal relationship between effort and HAND severity. Fifty-two percent of participants reported suboptimal effort (<100%), with no difference between serostatus groups. Common reasons included "tired" (43%) and "distracted" (36%). The lowest effort group had greater asymptomatic neurocognitive impairment and minor neurocognitive disorder diagnosis (25% and 33%) as compared with the moderate (23% and 15%) and the high (12% and 9%) effort groups. Predictors of suboptimal effort were self-reported memory impairment, African American race, and cocaine use. Change in effort between baseline and follow-up correlated with change in HAND severity. Suboptimal effort seems to inflate estimated HAND prevalence and explain fluctuation of severity over time. A simple modification of study protocols to optimize effort is indicated by the results.
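
    The three-group K-means solution reported here (means of 97%, 79%, and 51%) can be illustrated with a minimal one-dimensional k-means sketch. The scores below are synthetic draws around those modes, not the MACS VAES data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic self-reported effort scores (0-100%) around three modes
scores = np.concatenate([
    rng.normal(97, 2, 400).clip(0, 100),   # "high effort"
    rng.normal(79, 5, 300).clip(0, 100),   # "moderate effort"
    rng.normal(51, 8, 200).clip(0, 100),   # "low effort"
])

def kmeans_1d(x, k=3, iters=50):
    """Plain Lloyd's algorithm for 1D data; returns the sorted cluster means."""
    centers = np.quantile(x, np.linspace(0.1, 0.9, k))  # spread initial centers
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        centers = np.array([x[labels == j].mean() for j in range(k)])
    return np.sort(centers)

print(np.round(kmeans_1d(scores), 1))  # three centers near 51, 79, and 97
```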

  18. [Is an effort needed in order to replace the punitive culture for the sake of patient safety?].

    PubMed

    Gutiérrez Ubeda, S R

    2016-01-01

    Efforts to introduce a safety culture have flourished in a growing number of health care organisations. However, many of these organisational efforts have been incomplete with respect to how to address the resistance to change offered by the prevailing punitive culture of healthcare organisations. The present article is intended to raise awareness of three reasons why an effort is needed to change the punitive culture before introducing the patient safety culture. The first reason is that the culture needs to be investigated and understood. The second reason is that culture is a complex construct, deeply embedded in organisations and their contexts, and thus difficult to change. The third reason is that punitive culture is not compatible with some components of safety culture; without removing it, there is a strong possibility that it will remain active and dominant over safety culture. These reasons suggest that, unless effective interventions towards replacing punitive culture with safety culture are planned and executed, there is the risk that punitive culture will still prevail. Copyright © 2015 SECA. Published by Elsevier España. All rights reserved.

  19. Investigating College and Graduate Students' Multivariable Reasoning in Computational Modeling

    ERIC Educational Resources Information Center

    Wu, Hsin-Kai; Wu, Pai-Hsing; Zhang, Wen-Xin; Hsu, Ying-Shao

    2013-01-01

    Drawing upon the literature in computational modeling, multivariable reasoning, and causal attribution, this study aims at characterizing multivariable reasoning practices in computational modeling and revealing the nature of understanding about multivariable causality. We recruited two freshmen, two sophomores, two juniors, two seniors, four…

  20. Lightweight Adaptation of Classifiers to Users and Contexts: Trends of the Emerging Domain

    PubMed Central

    Vildjiounaite, Elena; Gimel'farb, Georgy; Kyllönen, Vesa; Peltola, Johannes

    2015-01-01

    Intelligent computer applications need to adapt their behaviour to contexts and users, but conventional classifier adaptation methods require long data collection and/or training times. Therefore classifier adaptation is often performed as follows: at design time application developers define typical usage contexts and provide reasoning models for each of these contexts, and then at runtime an appropriate model is selected from available ones. Typically, definition of usage contexts and reasoning models heavily relies on domain knowledge. However, in practice many applications are used in so diverse situations that no developer can predict them all and collect for each situation adequate training and test databases. Such applications have to adapt to a new user or unknown context at runtime just from interaction with the user, preferably in fairly lightweight ways, that is, requiring limited user effort to collect training data and limited time of performing the adaptation. This paper analyses adaptation trends in several emerging domains and outlines promising ideas, proposed for making multimodal classifiers user-specific and context-specific without significant user efforts, detailed domain knowledge, and/or complete retraining of the classifiers. Based on this analysis, this paper identifies important application characteristics and presents guidelines to consider these characteristics in adaptation design. PMID:26473165

  1. Scale-Resolving simulations (SRS): How much resolution do we really need?

    NASA Astrophysics Data System (ADS)

    Pereira, Filipe M. S.; Girimaji, Sharath

    2017-11-01

    Scale-resolving simulations (SRS) are emerging as the computational approach of choice for many engineering flows with coherent structures. The SRS methods seek to resolve only the most important features of the coherent structures and model the remainder of the flow field with canonical closures. With reference to a typical Large-Eddy Simulation (LES), practical SRS methods aim to resolve a considerably narrower range of scales (reduced physical resolution) to achieve an adequate degree of accuracy at reasonable computational effort. While the objective of SRS is well-founded, the criteria for establishing the optimal degree of resolution required to achieve an acceptable level of accuracy are not clear. This study considers the canonical case of the flow around a circular cylinder to address the issue of "optimal" resolution. Two important criteria are developed. The first condition addresses the issue of adequate resolution of the flow field. The second guideline provides an assessment of whether the modeled field is canonical (stochastic) turbulence amenable to closure-based computations.
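
    One widely used resolution check in scale-resolving practice, though not necessarily one of the criteria developed in this study, is the fraction of turbulent kinetic energy carried by the resolved field; Pope's guideline for LES is to resolve at least about 80%. A minimal sketch with assumed probe values:

```python
def resolved_tke_fraction(k_resolved, k_modeled):
    """Fraction of turbulent kinetic energy captured by the resolved scales."""
    return k_resolved / (k_resolved + k_modeled)

# Illustrative values (m^2/s^2) at one probe point of a hypothetical simulation
k_res, k_mod = 0.42, 0.08
f = resolved_tke_fraction(k_res, k_mod)
print(f"resolved fraction = {f:.2f}")  # 0.84 meets the ~80% LES guideline
```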

  2. 24 CFR 983.254 - Vacancies.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... vacancy (and notwithstanding the reasonable good faith efforts of the PHA to fill such vacancies), the PHA... on the PHA waiting list referred by the PHA. (3) The PHA and the owner must make reasonable good faith efforts to minimize the likelihood and length of any vacancy. (b) Reducing number of contract...

  3. 20 CFR 404.1617 - Reasonable efforts to obtain review by a qualified psychiatrist or psychologist.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... qualified psychiatrist or psychologist. 404.1617 Section 404.1617 Employees' Benefits SOCIAL SECURITY... Responsibilities for Performing the Disability Determination Function § 404.1617 Reasonable efforts to obtain... perform these reviews, which are a basic State agency responsibility, and where appropriate, the State...

  4. 20 CFR 416.1017 - Reasonable efforts to obtain review by a qualified psychiatrist or psychologist.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... qualified psychiatrist or psychologist. 416.1017 Section 416.1017 Employees' Benefits SOCIAL SECURITY... Responsibilities for Performing the Disability Determination Function § 416.1017 Reasonable efforts to obtain... perform these reviews, which are a basic State agency responsibility, and where appropriate, the State...

  5. Aeroelasticity Benchmark Assessment: Subsonic Fixed Wing Program

    NASA Technical Reports Server (NTRS)

    Florance, Jennifer P.; Chwalowski, Pawel; Wieseman, Carol D.

    2010-01-01

    The fundamental technical challenge in computational aeroelasticity is the accurate prediction of unsteady aerodynamic phenomena and their effect on the aeroelastic response of a vehicle. Currently, a benchmarking standard for use in validating the accuracy of computational aeroelasticity codes does not exist. Many aeroelastic data sets have been obtained in wind-tunnel and flight testing throughout the world; however, none have been globally presented or accepted as an ideal data set. There are numerous reasons for this. One reason is that such aeroelastic data sets often focus on the aeroelastic phenomena alone (flutter, for example) and do not contain associated information such as unsteady pressures and time-correlated structural dynamic deflections. Other available data sets focus solely on the unsteady pressures and do not address the aeroelastic phenomena. Other discrepancies can include the omission of relevant data, such as flutter frequency, and/or the acquisition of only qualitative deflection data. In addition to these content deficiencies, all of the available data sets present both experimental and computational technical challenges. Experimental issues include facility influences, nonlinearities beyond those being modeled, and data processing. From the computational perspective, technical challenges include modeling geometric complexities, coupling between the flow and the structure, grid issues, and boundary conditions. The Aeroelasticity Benchmark Assessment task seeks to examine the existing candidate experimental data sets and ultimately choose the one viewed as most suitable for computational benchmarking. An initial computational evaluation of that configuration will then be performed using the Langley-developed computational fluid dynamics (CFD) software FUN3D as part of its code validation process. In addition to the benchmarking activity, this task also includes an examination of future research directions. Researchers within the Aeroelasticity Branch will examine other experimental efforts within the Subsonic Fixed Wing (SFW) program (such as testing of the NASA Common Research Model (CRM)) and other NASA programs and assess aeroelasticity issues and research topics.

  6. Active Control of Fan Noise: Feasibility Study. Volume 5; Numerical Computation of Acoustic Mode Reflection Coefficients for an Unflanged Cylindrical Duct

    NASA Technical Reports Server (NTRS)

    Kraft, R. E.

    1996-01-01

    A computational method to predict modal reflection coefficients in cylindrical ducts has been developed based on the work of Homicz, Lordi, and Rehm, which uses the Wiener-Hopf method to account for the boundary conditions at the termination of a thin cylindrical pipe. The purpose of this study is to develop a computational routine to predict the reflection coefficients of higher-order acoustic modes impinging on the unflanged termination of a cylindrical duct. This effort was conducted under Task Order 5 of the NASA Lewis LET Program, Active Noise Control of Aircraft Engines: Feasibility Study, and will be used as part of the development of an integrated source noise, acoustic propagation, ANC actuator coupling, and control system algorithm simulation. The reflection coefficient prediction will be incorporated into an existing cylindrical duct modal analysis to account for the reflection of modes from the duct termination. This will provide a more accurate, rapid-computation design tool for evaluating the effect of reflected waves on active noise control systems mounted in the duct, as well as a tool for the design of acoustic treatment in inlet ducts. As an active noise control system design tool, the method can be used as a preliminary to more accurate but more numerically intensive acoustic propagation models such as finite element methods. The resulting computer program has been shown to give reasonable results, some examples of which are presented. Reliable data to use for comparison are scarce, so complete checkout is difficult, and further checkout is needed over a wider range of system parameters. In future efforts the method will be adapted as a subroutine to the GEAE segmented cylindrical duct modal analysis program.

  7. Development of a Pamphlet Targeting Computer Workstation Ergonomics

    NASA Technical Reports Server (NTRS)

    Faraci, Jennifer S.

    1997-01-01

    With the increased use of computers throughout Goddard Space Flight Center (GSFC), the Industrial Hygiene Office (IHO) has observed a growing trend in the number of health complaints attributed to poor computer workstation setup. The majority of the complaints have centered on musculoskeletal symptoms, including numbness, pain, and tingling in the upper extremities, shoulders, and neck. Eye strain and headaches have also been reported. In some cases, these symptoms can lead to chronic conditions such as repetitive strain injuries (RSIs). In an effort to prevent or minimize the frequency of these symptoms among the GSFC population, the IHO conducts individual ergonomic workstation evaluations and ergonomics training classes upon request. Because of the extensive number of computer workstations at GSFC, and the limited amount of manpower which the Industrial Hygiene staff could reasonably allocate to conducting workstation evaluations and employee training, a pamphlet was developed with a two-fold purpose: (1) to educate the GSFC population about the importance of ergonomically correct computer workstation setup and the potential effects of a poorly configured workstation; and (2) to enable employees to perform a general assessment of their own workstations and make any necessary modifications for proper setup.

  8. A computational model of in vitro angiogenesis based on extracellular matrix fibre orientation.

    PubMed

    Edgar, Lowell T; Sibole, Scott C; Underwood, Clayton J; Guilkey, James E; Weiss, Jeffrey A

    2013-01-01

    Recent interest in the process of vascularisation within the biomedical community has motivated numerous new research efforts focusing on the process of angiogenesis. Although the role of chemical factors during angiogenesis has been well documented, the role of mechanical factors, such as the interaction between angiogenic vessels and the extracellular matrix, remains poorly understood. In vitro methods for studying angiogenesis exist; however, measurements available using such techniques often suffer from limited spatial and temporal resolutions. For this reason, computational models have been extensively employed to investigate various aspects of angiogenesis. This paper outlines the formulation and validation of a simple and robust computational model developed to accurately simulate angiogenesis based on length, branching and orientation morphometrics collected from vascularised tissue constructs. Microvessels were represented as a series of connected line segments. The morphology of the vessels was determined by a linear combination of the collagen fibre orientation, the vessel density gradient and a random walk component. Excellent agreement was observed between computational and experimental morphometric data over time. Computational predictions of microvessel orientation within an anisotropic matrix correlated well with experimental data. The accuracy of this modelling approach makes it a valuable platform for investigating the role of mechanical interactions during angiogenesis.
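    The growth rule described above lends itself to a compact sketch. The following illustrative Python fragment (all weight values and function names are hypothetical, not taken from the paper) shows how a tip direction can be formed as a normalised linear combination of fibre orientation, density gradient, and a random-walk term:

```python
import math
import random

def growth_direction(fiber, density_grad,
                     w_fiber=0.5, w_grad=0.3, w_rand=0.2, rng=random):
    """Candidate growth direction for a microvessel tip: a weighted
    combination (weights hypothetical) of
      - the local collagen fibre orientation (2-D unit vector),
      - the negative vessel-density gradient (growth toward sparsely
        vascularised tissue),
      - a random-walk term.
    Returns a 2-D unit vector.
    """
    theta = rng.uniform(0.0, 2.0 * math.pi)    # random-walk component
    rand = (math.cos(theta), math.sin(theta))
    dx = w_fiber * fiber[0] - w_grad * density_grad[0] + w_rand * rand[0]
    dy = w_fiber * fiber[1] - w_grad * density_grad[1] + w_rand * rand[1]
    norm = math.hypot(dx, dy) or 1.0           # guard against a zero vector
    return (dx / norm, dy / norm)
```

    In the paper the vessels are line segments, so a rule of this shape would be applied once per segment per growth step; the weighting between the three terms is what the morphometric data constrain.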

  9. Meta-Reasoning: Monitoring and Control of Thinking and Reasoning.

    PubMed

    Ackerman, Rakefet; Thompson, Valerie A

    2017-08-01

    Meta-Reasoning refers to the processes that monitor the progress of our reasoning and problem-solving activities and regulate the time and effort devoted to them. Monitoring processes are usually experienced as feelings of certainty or uncertainty about how well a process has, or will, unfold. These feelings are based on heuristic cues, which are not necessarily reliable. Nevertheless, we rely on these feelings of (un)certainty to regulate our mental effort. Most metacognitive research has focused on memorization and knowledge retrieval, with little attention paid to more complex processes, such as reasoning and problem solving. In that context, we recently developed a Meta-Reasoning framework, used here to review existing findings, consider their consequences, and frame questions for future research. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. High pressure jet flame numerical analysis of CO emissions by means of the flamelet generated manifolds technique

    NASA Astrophysics Data System (ADS)

    Donini, A.; Martin, S. M.; Bastiaans, R. J. M.; van Oijen, J. A.; de Goey, L. P. H.

    2013-10-01

    In the present paper a computational analysis of high-pressure confined premixed turbulent methane/air jet flames is presented. In this scope, chemistry is reduced by use of the Flamelet Generated Manifold (FGM) method [1] and the fluid flow is modeled in LES and RANS contexts. The reaction evolution is described by the reaction progress variable, heat loss is described by the enthalpy, and the turbulence effect on the reaction is represented by the progress-variable variance. The interaction between chemistry and turbulence is considered through a presumed probability density function (PDF) approach. The use of FGM as a combustion model shows that combustion features at gas turbine conditions can be satisfactorily reproduced with a reasonable computational effort. Furthermore, the present analysis indicates that the physical and chemical processes controlling carbon monoxide (CO) emissions can be captured only by means of unsteady simulations.
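    The presumed-PDF closure mentioned above amounts to averaging flamelet quantities over an assumed beta distribution of the progress variable, parameterised by its mean and variance. A minimal numerical sketch of that averaging step (illustrative only; the function names and the simple midpoint quadrature are assumptions, not taken from the paper):

```python
def beta_params(mean, var):
    """Shape parameters (a, b) of a beta distribution with the given
    mean and variance; requires 0 < var < mean * (1 - mean)."""
    s = mean * (1.0 - mean) / var - 1.0
    return mean * s, (1.0 - mean) * s

def presumed_pdf_average(f, mean, var, n=20000):
    """Average of f(c) over a presumed beta PDF of the progress
    variable c, evaluated by midpoint quadrature on (0, 1).  The
    normalisation constant drops out because numerator and denominator
    use the same unnormalised weights."""
    a, b = beta_params(mean, var)
    num = den = 0.0
    for i in range(n):
        c = (i + 0.5) / n
        w = c ** (a - 1.0) * (1.0 - c) ** (b - 1.0)  # unnormalised beta PDF
        num += w * f(c)
        den += w
    return num / den
```

    In an FGM-type solver, f(c) would be a tabulated flamelet quantity (e.g. a source term) and the mean and variance would come from their transport equations; here f is just any callable.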

  11. Iterative CT reconstruction using coordinate descent with ordered subsets of data

    NASA Astrophysics Data System (ADS)

    Noo, F.; Hahn, K.; Schöndube, H.; Stierstorfer, K.

    2016-04-01

    Image reconstruction based on iterative minimization of a penalized weighted least-squares criterion has become an important topic of research in X-ray computed tomography. This topic is motivated by increasing evidence that such a formalism may enable a significant reduction in dose imparted to the patient while maintaining or improving image quality. One important issue associated with this iterative image reconstruction concept is slow convergence and the associated computational effort. For this reason, there is interest in finding methods that produce approximate versions of the targeted image with a small number of iterations and an acceptable level of discrepancy. We introduce here a novel method to produce such approximations: ordered subsets in combination with iterative coordinate descent. Preliminary results demonstrate that this method can produce, within 10 iterations and using only a constant image as initial condition, satisfactory reconstructions that retain the noise properties of the targeted image.
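    The combination the authors propose can be illustrated on a toy weighted least-squares problem. The sketch below is a simplified stand-in, not the authors' algorithm (a tiny dense system instead of CT projection data, and no penalty term): each sweep updates every coordinate exactly, but using only one ordered subset of the data rows.

```python
def os_icd(A, b, w, n_subsets=2, n_iter=10):
    """Ordered-subsets iterative coordinate descent for the weighted
    least-squares objective  sum_i w[i] * (A[i]·x - b[i])**2.

    Each outer iteration sweeps all coordinates but computes the exact
    1-D update from only one (contiguous, for simplicity) subset of the
    data rows -- the source of the per-iteration savings in CT.
    """
    m, n = len(A), len(A[0])
    x = [0.0] * n                                  # constant (zero) start
    r = [-bi for bi in b]                          # residual A·x - b
    block = (m + n_subsets - 1) // n_subsets
    subsets = [list(range(s * block, min((s + 1) * block, m)))
               for s in range(n_subsets)]
    for it in range(n_iter):
        S = subsets[it % n_subsets]                # subset used this sweep
        for j in range(n):
            num = sum(w[i] * A[i][j] * r[i] for i in S)
            den = sum(w[i] * A[i][j] ** 2 for i in S)
            if den == 0.0:
                continue
            step = num / den                       # exact 1-D minimiser on S
            x[j] -= step
            for i in range(m):                     # keep full residual current
                r[i] -= step * A[i][j]
    return x
```

    On a consistent, well-conditioned toy system this converges in a handful of sweeps; the abstract's claim is the analogous behaviour (usable images within ~10 iterations) on real penalized CT reconstructions.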

  12. Simulation Enabled Safeguards Assessment Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robert Bean; Trond Bjornard; Thomas Larson

    2007-09-01

    It is expected that nuclear energy will be a significant component of future energy supplies. New facilities, operating under a strengthened international nonproliferation regime, will be needed. There is good reason to believe virtual engineering applied to the facility design, as well as to the safeguards system design, will reduce total project cost and improve efficiency in the design cycle. The Simulation Enabled Safeguards Assessment MEthodology (SESAME) has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag-and-drop wireframe construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed.

  13. Parallelization of ARC3D with Computer-Aided Tools

    NASA Technical Reports Server (NTRS)

    Jin, Haoqiang; Hribar, Michelle; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    A series of efforts has been devoted to investigating methods of porting and parallelizing applications quickly and efficiently for new architectures, such as the SGI Origin 2000 and Cray T3E. This report presents the parallelization of a CFD application, ARC3D, using the computer-aided tools CAPTools. Steps in parallelizing this code and requirements for achieving better performance are discussed. The generated parallel version achieved reasonably good performance, for example, a speedup of 30 on 36 Cray T3E processors. However, this performance could not be obtained without modification of the original serial code. It is suggested that in many cases improving the serial code and performing necessary code transformations are important parts of the automated parallelization process, although user intervention in many of these parts is still necessary. Nevertheless, development and improvement of useful software tools, such as CAPTools, can help trim down many tedious parallelization details and improve processing efficiency.
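    As an illustrative aside (not from the report itself), the quoted speedup of 30 on 36 processors, i.e. roughly 83% parallel efficiency, can be related to the code's residual serial fraction through Amdahl's law:

```python
def amdahl_speedup(p, n):
    """Amdahl's law: speedup on n processors when a fraction p of the
    serial runtime is perfectly parallelisable."""
    return 1.0 / ((1.0 - p) + p / n)

def serial_fraction(speedup, n):
    """Invert Amdahl's law: serial fraction 1 - p implied by an
    observed speedup on n processors."""
    return (1.0 / speedup - 1.0 / n) / (1.0 - 1.0 / n)

efficiency = 30 / 36           # ≈ 0.83 for the figures quoted above
s = serial_fraction(30, 36)    # ≈ 0.0057: under 1% serial code already
                               # caps the 36-processor speedup at 30
```

    This kind of estimate is why the report stresses that improving the serial code is part of the parallelization process: the unparallelized remainder, however small, bounds the achievable speedup.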

  14. Effects of heat exchanger tubes on hydrodynamics and CO2 capture of a sorbent-based fluidized bed reactor

    DOE PAGES

    Lai, Canhai; Xu, Zhijie; Li, Tingwen; ...

    2017-08-05

    In virtual design and scale-up of pilot-scale carbon capture systems, the coupled reactive multiphase flow problem must be solved to predict the adsorber's performance and capture efficiency under various operating conditions. This paper focuses on detailed computational fluid dynamics (CFD) modeling of a pilot-scale fluidized bed adsorber equipped with vertical cooling tubes. Multiphase Flow with Interphase eXchanges (MFiX), an open-source multiphase flow CFD solver, is used for the simulations, with custom code to simulate the chemical reactions and filtered sub-grid models to capture the effect of the unresolved details on the coarser mesh, enabling simulations with reasonable accuracy and manageable computational effort. Previously developed filtered models for horizontal-cylinder drag, heat transfer, and reaction kinetics have been modified to derive the 2D filtered models representing vertical cylinders in the coarse-grid CFD simulations. The effects of the heat exchanger configurations (i.e., horizontal or vertical tubes) on the adsorber's hydrodynamics and CO2 capture performance are then examined. A one-dimensional three-region process model is briefly introduced for comparison purposes. The CFD model agrees reasonably well with the process model while providing additional information about the flow field that is not available from the process model.

  15. Applying Model-Based Reasoning to the FDIR of the Command and Data Handling Subsystem of the International Space Station

    NASA Technical Reports Server (NTRS)

    Robinson, Peter; Shirley, Mark; Fletcher, Daryl; Alena, Rick; Duncavage, Dan; Lee, Charles

    2003-01-01

    All of the International Space Station (ISS) systems that require computer control depend upon the hardware and software of the Command and Data Handling (C&DH) system, currently a network of over 30 386-class computers called Multiplexer/Demultiplexers (MDMs) [18]. The Caution and Warning (C&W) system [7], a set of software tasks that runs on the MDMs, is responsible for detecting, classifying, and reporting errors in all ISS subsystems, including the C&DH. Fault Detection, Isolation and Recovery (FDIR) of these errors is typically handled with a combination of automatic and human effort. We are developing an Advanced Diagnostic System (ADS) to augment the C&W system with decision-support tools that aid in root-cause analysis and resolve differing human and machine C&DH state estimates. These tools, which draw from sources in model-based reasoning [16,29], will improve the speed and accuracy of flight controllers by reducing the uncertainty in C&DH state estimation, allowing for a more complete assessment of risk. We have run tests with ISS telemetry, focusing on those C&W events which relate to the C&DH system itself. This paper describes our initial results and subsequent plans.

  16. 5 CFR 630.310 - Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... determined necessary for Year 2000 computer conversion efforts. 630.310 Section 630.310 Administrative... Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion efforts. (a) Year 2000 computer conversion efforts are deemed to be an exigency of the public business for the...

  17. 5 CFR 630.310 - Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... determined necessary for Year 2000 computer conversion efforts. 630.310 Section 630.310 Administrative... Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion efforts. (a) Year 2000 computer conversion efforts are deemed to be an exigency of the public business for the...

  18. 5 CFR 630.310 - Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... determined necessary for Year 2000 computer conversion efforts. 630.310 Section 630.310 Administrative... Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion efforts. (a) Year 2000 computer conversion efforts are deemed to be an exigency of the public business for the...

  19. 5 CFR 630.310 - Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... determined necessary for Year 2000 computer conversion efforts. 630.310 Section 630.310 Administrative... Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion efforts. (a) Year 2000 computer conversion efforts are deemed to be an exigency of the public business for the...

  20. 5 CFR 630.310 - Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... determined necessary for Year 2000 computer conversion efforts. 630.310 Section 630.310 Administrative... Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion efforts. (a) Year 2000 computer conversion efforts are deemed to be an exigency of the public business for the...

  1. Relations between Inductive Reasoning and Deductive Reasoning

    ERIC Educational Resources Information Center

    Heit, Evan; Rotello, Caren M.

    2010-01-01

    One of the most important open questions in reasoning research is how inductive reasoning and deductive reasoning are related. In an effort to address this question, we applied methods and concepts from memory research. We used 2 experiments to examine the effects of logical validity and premise-conclusion similarity on evaluation of arguments.…

  2. Estimating rates of local species extinction, colonization and turnover in animal communities

    USGS Publications Warehouse

    Nichols, James D.; Boulinier, T.; Hines, J.E.; Pollock, K.H.; Sauer, J.R.

    1998-01-01

    Species richness has been identified as a useful state variable for conservation and management purposes. Changes in richness over time provide a basis for predicting and evaluating community responses to management, to natural disturbance, and to changes in factors such as community composition (e.g., the removal of a keystone species). Probabilistic capture-recapture models have been used recently to estimate species richness from species count and presence-absence data. These models do not require the common assumption that all species are detected in sampling efforts. We extend this approach to the development of estimators useful for studying the vital rates responsible for changes in animal communities over time: rates of local species extinction, turnover, and colonization. Our approach to estimation is based on capture-recapture models for closed animal populations that permit heterogeneity in detection probabilities among the different species in the sampled community. We have developed a computer program, COMDYN, to compute many of these estimators and associated bootstrap variances. Analyses using data from the North American Breeding Bird Survey (BBS) suggested that the estimators performed reasonably well. We recommend estimators based on probabilistic modeling for future work on community responses to management efforts as well as on basic questions about community dynamics.
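    COMDYN's estimators are more elaborate, but the flavour of detection-robust richness estimation can be shown with the classic first-order jackknife estimator (Burnham-Overton), which corrects the observed species count upward using the number of species seen on exactly one sampling occasion. This is a generic textbook estimator offered as an illustration, not the specific model COMDYN implements:

```python
def jackknife1_richness(detection_matrix):
    """First-order jackknife estimate of species richness.

    detection_matrix[s][k] is 1 if species s was detected on sampling
    occasion k, else 0.  Estimate: S_obs + f1 * (k - 1) / k, where f1
    is the number of species detected on exactly one occasion -- many
    singletons imply many species missed entirely.
    """
    k = len(detection_matrix[0])        # number of sampling occasions
    detected = [row for row in detection_matrix if sum(row) > 0]
    s_obs = len(detected)               # species seen at least once
    f1 = sum(1 for row in detected if sum(row) == 1)  # seen exactly once
    return s_obs + f1 * (k - 1) / k
```

    The correction term is what removes the "all species are detected" assumption mentioned in the abstract.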

  3. Research and Technology 1997

    NASA Technical Reports Server (NTRS)

    1998-01-01

    This report highlights the challenging work accomplished during fiscal year 1997 by Ames research scientists and engineers. The work is divided into accomplishments that support the goals of NASA's four Strategic Enterprises: Aeronautics and Space Transportation Technology, Space Science, Human Exploration and Development of Space (HEDS), and Earth Science. NASA Ames Research Center's research effort in the Space, Earth, and HEDS Enterprises is focused in large part on supporting Ames' lead role in Astrobiology, which, broadly defined, is the scientific study of the origin, distribution, and future of life in the universe. This NASA initiative in Astrobiology is a broad science effort embracing basic research, technology development, and flight missions. Ames' contributions to the Space Science Enterprise are focused in the areas of exobiology, planetary systems, astrophysics, and space technology. Ames supports the Earth Science Enterprise by conducting research and by developing technology with the objective of expanding our knowledge of the Earth's atmosphere and ecosystems. Finally, Ames supports the HEDS Enterprise by conducting research, managing spaceflight projects, and developing technologies. A key objective is to understand the phenomena surrounding the effects of gravity on living things. Ames has also been designated the Agency's Center of Excellence for Information Technology. The three cornerstones of Information Technology research at Ames are automated reasoning, human-centered computing, and high-performance computing and networking.

  4. Gender and theory of mind in preschoolers' group effort: evidence for timing differences behind children's earliest social loafing.

    PubMed

    Thompson, R Bruce; Thornton, Bill

    2014-01-01

    This study explored mental state reasoning within the context of group effort and possible developmental differences between boys and girls. Preschool children (59 girls, 47 boys) were assessed for theory of mind (ToM) ability using classic false-belief tests. Children participated in group effort conditions that alternated between one condition, where individual effort was transparent and obvious, and one where individual effort remained anonymous. The aim was to investigate whether emergent mental state reasoning, after controlling for age, was associated with the well-known phenomenon of reduced effort in group tasks ("social loafing"). Girls had slightly higher ToM scores and showed more social loafing than boys. Hierarchical regression, controlling for age, indicated that understanding of others' false beliefs uniquely predicted social loafing and interacted weakly with gender status.

  5. Perceptions of Project Representatives Concerning Project Success and Pre-Project Planning Effort

    DTIC Science & Technology

    1993-12-01

    Table 2. What are your main reasons for the project's level of success?
    Table 3. What, if anything, ...
    Table 3a. Other Categories for Table 3
    Table 4. What are your main reasons for your assessment of the level of effort expended on pre-project planning?
    Table 5. What concerning ...

  6. Optimizing R with SparkR on a commodity cluster for biomedical research.

    PubMed

    Sedlmayr, Martin; Würfl, Tobias; Maier, Christian; Häberle, Lothar; Fasching, Peter; Prokosch, Hans-Ulrich; Christoph, Jan

    2016-12-01

    Medical researchers are challenged today by the enormous amount of data collected in healthcare. Analysis methods such as genome-wide association studies (GWAS) are often computationally intensive and thus require enormous resources to be performed in a reasonable amount of time. While dedicated clusters and public clouds may deliver the desired performance, their use requires upfront financial effort or anonymized data, which is often not possible for preliminary or occasional tasks. We explored the possibilities of building a private, flexible cluster for processing scripts in R based on commodity, non-dedicated hardware in our department. For this, a GWAS calculation in R on a single desktop computer, a Message Passing Interface (MPI) cluster, and a SparkR cluster were compared with regard to performance, scalability, quality, and simplicity. The original script had a projected runtime of three years on a single desktop computer. Optimizing the script in R already yielded a significant reduction in computing time (2 weeks). By using R-MPI and SparkR, we were able to parallelize the computation and reduce the time to less than three hours (2.6 h) on already available, standard office computers. While MPI is a proven approach in high-performance clusters, it requires rather static, dedicated nodes. SparkR and its Hadoop siblings allow for a dynamic, elastic environment with automated failure handling. SparkR also scales better with the number of nodes in the cluster than MPI due to optimized data communication. R is a popular environment for clinical data analysis. The new SparkR solution offers elastic resources and allows supporting big-data analysis using R even on non-dedicated resources with minimal change to the original code. To unleash the full potential, additional effort should be invested to customize and improve the algorithms, especially with regard to data distribution. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
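    The scatter/compute/gather pattern that MPI and SparkR provide can be sketched in a few lines. The fragment below is an illustrative Python analogue (not the authors' R code), with a trivial squaring function standing in for the per-variant statistical test:

```python
from multiprocessing import Pool

def chunk_stat(chunk):
    """Per-chunk work; a stand-in for running one association test per
    variant (the real study ran per-SNP computations in R)."""
    return [x * x for x in chunk]

def parallel_map(data, n_workers=4, chunk_size=1000):
    """Split the data into chunks and process them in a worker pool --
    the same scatter/compute/gather pattern MPI and SparkR provide,
    minus their scheduling and fault tolerance."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with Pool(n_workers) as pool:
        results = pool.map(chunk_stat, chunks)    # scatter + compute
    return [r for chunk in results for r in chunk]  # gather, order preserved
```

    The abstract's point survives the simplification: the speed-up comes almost entirely from how the data are partitioned and communicated, which is why the authors single out data distribution as the place to invest further effort.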

  7. Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study

    ERIC Educational Resources Information Center

    Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa

    2012-01-01

    This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. The computer models are Flash and NetLogo environments that make three domains of chemistry simultaneously available: macroscopic, submicroscopic, and symbolic. Students interact with the computer models to answer assessment…

  8. Solving probability reasoning based on DNA strand displacement and probability modules.

    PubMed

    Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun

    2017-12-01

    In computational biology, DNA strand displacement technology is used to simulate the computation process and has shown strong computing ability. Most researchers use it to solve logic problems; it is only rarely used in probabilistic reasoning. To perform probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind." They have been shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.
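    The two probability modules named above implement standard identities. In plain software they reduce to a few lines; the following is an illustrative restatement of the mathematics, not of the DNA strand-displacement encoding:

```python
def total_probability(priors, conditionals):
    """Law of total probability: P(B) = sum_i P(A_i) * P(B|A_i) for a
    partition A_1..A_n.  (The paper computes this with DNA
    strand-displacement cascades; here it is plain arithmetic.)"""
    assert abs(sum(priors) - 1.0) < 1e-9, "priors must form a partition"
    return sum(p * c for p, c in zip(priors, conditionals))

def posterior(priors, conditionals, i):
    """Conditional probability derivation (Bayes' rule):
    P(A_i|B) = P(A_i) * P(B|A_i) / P(B)."""
    return priors[i] * conditionals[i] / total_probability(priors, conditionals)
```

    The interest of the paper is precisely that these arithmetic steps can be carried out by molecular reactions rather than by a processor.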

  9. Linear and passive silicon diodes, isolators, and logic gates

    NASA Astrophysics Data System (ADS)

    Li, Zhi-Yuan

    2013-12-01

    Silicon photonic integrated devices and circuits offer a promising means to revolutionize information processing and computing technologies. One important reason is that these devices are compatible with the conventional complementary metal-oxide-semiconductor (CMOS) processing technology that dominates the current microelectronics industry. Yet the dream of building optical computers cannot come true without breakthroughs in several key elements, including optical diodes, isolators, and logic gates with low power, high signal contrast, and large bandwidth. Photonic crystals have great power to mold the flow of light at the micrometer/nanometer scale and are a promising platform for optical integration. In this paper we present our recent efforts in the design, fabrication, and characterization of ultracompact, linear, passive on-chip optical diodes, isolators, and logic gates based on silicon two-dimensional photonic crystal slabs. Both simulation and experimental results show the high performance of these newly designed devices. These linear and passive silicon devices have the unique properties of small footprint, low power requirements, large bandwidth, fast response, ease of fabrication, and compatibility with CMOS technology. Further improving their performance would open a road toward photonic logic and optical computing and help construct nanophotonic on-chip processor architectures for future optical computers.

  10. Promoting Physical Activity through Hand-Held Computer Technology

    PubMed Central

    King, Abby C.; Ahn, David K.; Oliveira, Brian M.; Atienza, Audie A.; Castro, Cynthia M.; Gardner, Christopher D.

    2009-01-01

    Background: Efforts to achieve population-wide increases in walking and similar moderate-intensity physical activities can potentially be enhanced through relevant applications of state-of-the-art interactive communication technologies. Yet few systematic efforts to evaluate the efficacy of hand-held computers and similar devices for enhancing physical activity levels have occurred. The purpose of this first-generation study was to evaluate the efficacy of a hand-held computer (i.e., personal digital assistant [PDA]) for increasing moderate-intensity or more vigorous (MOD+) physical activity levels over 8 weeks in mid-life and older adults relative to a standard information control arm. Design: Randomized, controlled 8-week experiment. Data were collected in 2005 and analyzed in 2006-2007. Setting/Participants: Community-based study of 37 healthy, initially underactive adults aged 50 years and older who were randomized and completed the 8-week study (intervention=19, control=18). Intervention: Participants received an instructional session and a PDA programmed to monitor their physical activity levels twice per day and to provide daily and weekly individualized feedback, goal setting, and support. Controls received standard, age-appropriate written physical activity educational materials. Main Outcome Measure: Physical activity was assessed via the Community Healthy Activities Model Program for Seniors (CHAMPS) questionnaire at baseline and 8 weeks. Results: Relative to controls, intervention participants reported significantly greater 8-week mean estimated caloric expenditure levels and minutes per week in MOD+ activity (p<0.04). Satisfaction with the PDA was reasonably high in this largely PDA-naive sample. Conclusions: Results from this first-generation study indicate that hand-held computers may be effective tools for increasing initial physical activity levels among underactive adults. PMID:18201644

  11. Thermal Conductivities in Solids from First Principles: Accurate Computations and Rapid Estimates

    NASA Astrophysics Data System (ADS)

    Carbogno, Christian; Scheffler, Matthias

    In spite of significant research efforts, a first-principles determination of the thermal conductivity κ at high temperatures has remained elusive. Boltzmann transport techniques that account for anharmonicity perturbatively become inaccurate under such conditions. Ab initio molecular dynamics (MD) techniques using the Green-Kubo (GK) formalism capture the full anharmonicity, but can become prohibitively costly to converge in time and size. We developed a formalism that accelerates such GK simulations by several orders of magnitude and thus enables their application within the limited time and length scales accessible in ab initio MD. For this purpose, we determine the effective harmonic potential occurring during the MD, along with the associated temperature-dependent phonon properties and lifetimes. Interpolation in reciprocal and frequency space then allows extrapolation to the macroscopic scale. For both force-field and ab initio MD, we validate this approach by computing κ for Si and ZrO2, two materials known for their particularly harmonic and anharmonic character, respectively. Finally, we demonstrate how these techniques facilitate reasonable estimates of κ from existing MD calculations at virtually no additional computational cost.
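
The Green-Kubo relation the abstract builds on can be sketched in a few lines: κ is proportional to the time integral of the heat-flux autocorrelation function. The snippet below is a minimal illustration, not the authors' code; the synthetic AR(1) "flux" stands in for MD output, and all names and values are assumptions.

```python
# Minimal Green-Kubo (GK) sketch: estimate kappa from a heat-flux series.
# kappa = V / (kB T^2) * integral_0^inf <J(0) J(t)> dt   (SI units)
import numpy as np

def green_kubo_kappa(J, dt, volume, temperature, kB=1.380649e-23):
    n = len(J)
    # Heat-flux autocorrelation <J(0)J(t)>, averaged over time origins.
    acf = np.array([np.mean(J[: n - k] * J[k:]) for k in range(n // 2)])
    # Rectangle-rule time integral of the autocorrelation function.
    return volume / (kB * temperature**2) * acf.sum() * dt

# Synthetic, exponentially correlated flux as a stand-in for MD output.
rng = np.random.default_rng(0)
J = rng.standard_normal(4000)
for i in range(1, len(J)):                 # AR(1): correlation time ~10 steps
    J[i] = 0.9 * J[i - 1] + 0.1 * J[i]

kappa = green_kubo_kappa(J, dt=1e-15, volume=1e-26, temperature=300.0)
```

Converging this integral in time and system size is exactly the cost that makes ab initio GK simulations expensive, which is what the paper's accelerated formalism addresses.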

  12. Thermophysics Characterization of Multiply Ionized Air Plasma Absorption of Laser Radiation

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Rhodes, Robert; Turner, Jim (Technical Monitor)

    2002-01-01

    The impact of multiple ionization of air plasma on the inverse Bremsstrahlung absorption of laser radiation is investigated for air-breathing laser propulsion. Thermochemical properties of multiply ionized air plasma species are computed for temperatures up to 200,000 K using a hydrogenic approximation of the electronic partition function; properties of neutral air molecules are also updated for temperatures up to 50,000 K using available literature data. Three absorption formulas are evaluated, and a general formula is recommended for multiple-ionization absorption calculations. The plasma composition required for the absorption calculation is obtained by increasing the degree of ionization sequentially, up to quadruple ionization, through a series of thermal equilibrium computations. The calculated second-ionization absorption coefficient agrees reasonably well with available data. The importance of multiple-ionization modeling is demonstrated by the finding that the area under the quadruple-ionization absorption curve is twice that of single ionization. This work benefits computational plasma aerodynamics modeling of laser lightcraft performance.
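
A hydrogenic approximation of the electronic partition function, as mentioned above, can be sketched with textbook hydrogen-like levels. The level energies E_n = I(1 - 1/n²), degeneracies 2n², the cutoff n_max, and the example ionization energy are all illustrative assumptions here, not the paper's actual model.

```python
# Sketch: hydrogenic electronic partition function with a level cutoff.
import math

K_B_EV = 8.617333262e-5  # Boltzmann constant in eV/K

def electronic_partition_function(ionization_energy_ev, T, n_max=10):
    """Z_el = sum_n 2 n^2 exp(-E_n / kT), with hydrogen-like levels
    E_n = I (1 - 1/n^2) measured from the ground state. The finite
    n_max cutoff keeps the otherwise divergent sum bounded."""
    kT = K_B_EV * T
    return sum(2 * n * n * math.exp(-ionization_energy_ev * (1 - 1 / n**2) / kT)
               for n in range(1, n_max + 1))

# Hypothetical ion with I = 29.6 eV, at the two temperature extremes
# quoted in the abstract: excited levels matter far more at 200,000 K.
z_low = electronic_partition_function(29.6, 50_000)
z_high = electronic_partition_function(29.6, 200_000)
```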

  13. Stabilized FE simulation of prototype thermal-hydraulics problems with integrated adjoint-based capabilities

    NASA Astrophysics Data System (ADS)

    Shadid, J. N.; Smith, T. M.; Cyr, E. C.; Wildey, T. M.; Pawlowski, R. P.

    2016-09-01

    A critical aspect of applying modern computational solution methods to complex multiphysics systems relevant to nuclear reactor modeling is the assessment of the predictive capability of specific proposed mathematical models. In this respect, understanding numerical error and the sensitivity of the solution to parameters associated with input data, boundary condition uncertainty, and the mathematical models themselves is critical. Additionally, the ability to evaluate and/or approximate the model efficiently, so as to allow development of a reasonable level of statistical diagnostics of the mathematical model and the physical system, is of central importance. In this study we report on initial efforts to apply integrated adjoint-based computational analysis and automatic differentiation tools to begin to address these issues. The study is carried out in the context of a Reynolds-averaged Navier-Stokes approximation to turbulent fluid flow and heat transfer, using a particular spatial discretization based on implicit, fully-coupled, stabilized FE methods. Initial results are presented that show the promise of these computational techniques for nuclear-reactor-relevant prototype thermal-hydraulics problems.
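
The adjoint idea behind this kind of sensitivity analysis can be illustrated on a toy steady linear model A(p)u = b with a scalar quantity of interest J. This is a generic discrete-adjoint sketch, not the paper's stabilized FE code; the 2×2 system and the parameter dependence are invented for illustration.

```python
# Toy discrete adjoint: dJ/dp = -lambda^T (dA/dp) u, with A^T lambda = dJ/du.
import numpy as np

def solve_state(p):
    """Hypothetical parameter-dependent steady model A(p) u = b."""
    A = np.array([[2.0 + p, -1.0], [-1.0, 2.0]])
    b = np.array([1.0, 0.0])
    return A, np.linalg.solve(A, b)

p = 0.5
A, u = solve_state(p)
J = u.sum()                                  # quantity of interest J = 1^T u

# One adjoint solve gives the sensitivity to any number of parameters.
lam = np.linalg.solve(A.T, np.ones(2))       # A^T lambda = dJ/du
dA_dp = np.array([[1.0, 0.0], [0.0, 0.0]])   # only A[0,0] depends on p
dJ_dp_adjoint = -lam @ (dA_dp @ u)

# Finite-difference check of the adjoint gradient.
eps = 1e-6
_, u_eps = solve_state(p + eps)
dJ_dp_fd = (u_eps.sum() - J) / eps
```

The attraction in the multiphysics setting is the same as in this toy: the cost of the gradient is one extra (adjoint) solve, independent of how many input parameters are being diagnosed.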

  14. Stabilized FE simulation of prototype thermal-hydraulics problems with integrated adjoint-based capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shadid, J.N., E-mail: jnshadi@sandia.gov; Department of Mathematics and Statistics, University of New Mexico; Smith, T.M.

    A critical aspect of applying modern computational solution methods to complex multiphysics systems relevant to nuclear reactor modeling is the assessment of the predictive capability of specific proposed mathematical models. In this respect, understanding numerical error and the sensitivity of the solution to parameters associated with input data, boundary condition uncertainty, and the mathematical models themselves is critical. Additionally, the ability to evaluate and/or approximate the model efficiently, so as to allow development of a reasonable level of statistical diagnostics of the mathematical model and the physical system, is of central importance. In this study we report on initial efforts to apply integrated adjoint-based computational analysis and automatic differentiation tools to begin to address these issues. The study is carried out in the context of a Reynolds-averaged Navier–Stokes approximation to turbulent fluid flow and heat transfer, using a particular spatial discretization based on implicit, fully-coupled, stabilized FE methods. Initial results are presented that show the promise of these computational techniques for nuclear-reactor-relevant prototype thermal-hydraulics problems.

  15. Stabilized FE simulation of prototype thermal-hydraulics problems with integrated adjoint-based capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shadid, J. N.; Smith, T. M.; Cyr, E. C.

    A critical aspect of applying modern computational solution methods to complex multiphysics systems relevant to nuclear reactor modeling is the assessment of the predictive capability of specific proposed mathematical models. Understanding numerical error and the sensitivity of the solution to parameters associated with input data, boundary condition uncertainty, and the mathematical models themselves is critical. Additionally, the ability to evaluate and/or approximate the model efficiently, so as to allow development of a reasonable level of statistical diagnostics of the mathematical model and the physical system, is of central importance. In our study we report on initial efforts to apply integrated adjoint-based computational analysis and automatic differentiation tools to begin to address these issues. The study is carried out in the context of a Reynolds-averaged Navier–Stokes approximation to turbulent fluid flow and heat transfer, using a particular spatial discretization based on implicit, fully-coupled, stabilized FE methods. We present initial results that show the promise of these computational techniques for nuclear-reactor-relevant prototype thermal-hydraulics problems.

  16. Stabilized FE simulation of prototype thermal-hydraulics problems with integrated adjoint-based capabilities

    DOE PAGES

    Shadid, J. N.; Smith, T. M.; Cyr, E. C.; ...

    2016-05-20

    A critical aspect of applying modern computational solution methods to complex multiphysics systems relevant to nuclear reactor modeling is the assessment of the predictive capability of specific proposed mathematical models. Understanding numerical error and the sensitivity of the solution to parameters associated with input data, boundary condition uncertainty, and the mathematical models themselves is critical. Additionally, the ability to evaluate and/or approximate the model efficiently, so as to allow development of a reasonable level of statistical diagnostics of the mathematical model and the physical system, is of central importance. In our study we report on initial efforts to apply integrated adjoint-based computational analysis and automatic differentiation tools to begin to address these issues. The study is carried out in the context of a Reynolds-averaged Navier–Stokes approximation to turbulent fluid flow and heat transfer, using a particular spatial discretization based on implicit, fully-coupled, stabilized FE methods. We present initial results that show the promise of these computational techniques for nuclear-reactor-relevant prototype thermal-hydraulics problems.

  17. The development of nickel-metal hydride technology for use in aerospace applications

    NASA Technical Reports Server (NTRS)

    Rampel, Guy; Johnson, Herschel; Dell, Dan; Wu, Tony; Puglisi, Vince

    1992-01-01

    The nickel metal hydride technology for battery application is relatively immature even though this technology was made widely known by Philips' scientists as long ago as 1970. Recently, because of the international environmental regulatory pressures being placed on cadmium in the workplace and in disposal practices, battery companies have initiated extensive development programs to make this technology a viable commercial operation. These hydrides do not pose a toxilogical threat as does cadmium. Also, they provide a higher energy density and specific energy when compared to the other nickel based battery technologies. For these reasons, the nickel metal hydride electrochemisty is being evaluated as the next power source for varied applications such as laptop computers, cellular telephones, electric vehicles, and satellites. A parallel development effort is under way to look at aerospace applications for nickel metal hydride cells. This effort is focused on life testing of small wound cells of the commercial type to validate design options and development of prismatic design cells for aerospace applications.

  18. Real-time multisensor data fusion for target detection, classification, tracking, counting, and range estimates

    NASA Astrophysics Data System (ADS)

    Tsui, Eddy K.; Thomas, Russell L.

    2004-09-01

    As part of the Commanding General of Army Materiel Command's Research, Development & Engineering Command (RDECOM), the U.S. Army Research, Development and Engineering Center (ARDEC), Picatinny, funded a joint development effort with McQ Associates, Inc. to develop an Advanced Minefield Sensor (AMS) as a technology evaluation prototype for the Anti-Personnel Landmine Alternatives (APLA) Track III program. This effort laid the fundamental groundwork for smart sensors that detect and classify targets, identify combatants versus noncombatants, locate and track targets at and between sensors, fuse information across targets and sensors, and provide automatic situation awareness to the first responder. The effort culminated in a performance-oriented architecture meeting the requirements of size, weight, and power (SWAP). The integrated digital signal processor (DSP) paradigm is capable of processing signals from multiple sensor modalities to extract the needed information within either a 360° or a fixed field of view at an acceptable false alarm rate. This paper discusses the challenges in developing such a sensor, focusing on achieving reasonable operating ranges, low power, small size, and low cost, as well as applications and extensions of this technology.

  19. Focus of attention in systems for visual monitoring of experiments

    NASA Technical Reports Server (NTRS)

    Blank, G. E.; Martin, W. N.

    1987-01-01

    The problem of designing a computerized experiment monitoring system for use in a space station or elsewhere is examined. It is shown that the essential challenge of such a system, attaining a reasonable expected running time, can be attacked using the concept of focus of attention and by exploiting parallelism. The use of the Contract Net Protocol for the latter purpose is discussed. The use of ideas from information science to help focus a program's efforts on those computations likely to bring results is addressed, and the incorporation of those ideas into a design to aid the system in deciding upon the best course of action is considered.

  20. Approaching Gender Parity: Women in Computer Science at Afghanistan's Kabul University

    ERIC Educational Resources Information Center

    Plane, Jandelyn

    2010-01-01

    This study explores the representation of women in computer science at the tertiary level through data collected about undergraduate computer science education at Kabul University in Afghanistan. Previous studies have theorized reasons for underrepresentation of women in computer science, and while many of these reasons are indeed present in…

  1. A conceptual model to empower software requirements conflict detection and resolution with rule-based reasoning

    NASA Astrophysics Data System (ADS)

    Ahmad, Sabrina; Jalil, Intan Ermahani A.; Ahmad, Sharifah Sakinah Syed

    2016-08-01

    It is seldom technical issues that impede the process of eliciting software requirements. The involvement of multiple stakeholders usually leads to conflicts, so a conflict detection and resolution effort is crucial. This paper presents an improved conceptual model to assist conflict detection and resolution, extending the ability of current approaches and improving overall performance. The significance of the new model is that it enables automated detection of conflicts and of their severity levels through rule-based reasoning.

  2. Observed differences in upper extremity forces, muscle efforts, postures, velocities and accelerations across computer activities in a field study of office workers.

    PubMed

    Bruno Garza, J L; Eijckelhof, B H W; Johnson, P W; Raina, S M; Rynell, P W; Huysmans, M A; van Dieën, J H; van der Beek, A J; Blatter, B M; Dennerlein, J T

    2012-01-01

    This study, a part of the PRedicting Occupational biomechanics in OFfice workers (PROOF) study, investigated whether there are differences in field-measured forces, muscle efforts, postures, velocities and accelerations across computer activities. These parameters were measured continuously for 120 office workers performing their own work for two hours each. There were differences in nearly all forces, muscle efforts, postures, velocities and accelerations across keyboard, mouse and idle activities. Keyboard activities showed a 50% increase in the median right trapezius muscle effort when compared to mouse activities. Median shoulder rotation changed from 25 degrees internal rotation during keyboard use to 15 degrees external rotation during mouse use. Only keyboard use was associated with median ulnar deviations greater than 5 degrees. Idle activities led to the greatest variability observed in all muscle efforts and postures measured. In future studies, measurements of computer activities could be used to provide information on the physical exposures experienced during computer use. Practitioner Summary: Computer users may develop musculoskeletal disorders due to their force, muscle effort, posture and wrist velocity and acceleration exposures during computer use. We report that many physical exposures are different across computer activities. This information may be used to estimate physical exposures based on patterns of computer activities over time.

  3. Rule-Based Reasoning Is Fast and Belief-Based Reasoning Can Be Slow: Challenging Current Explanations of Belief-Bias and Base-Rate Neglect

    ERIC Educational Resources Information Center

    Newman, Ian R.; Gibb, Maia; Thompson, Valerie A.

    2017-01-01

    It is commonly assumed that belief-based reasoning is fast and automatic, whereas rule-based reasoning is slower and more effortful. Dual-Process theories of reasoning rely on this speed-asymmetry explanation to account for a number of reasoning phenomena, such as base-rate neglect and belief-bias. The goal of the current study was to test this…

  4. Supporting Students' Learning and Socioscientific Reasoning About Climate Change—the Effect of Computer-Based Concept Mapping Scaffolds

    NASA Astrophysics Data System (ADS)

    Eggert, Sabina; Nitsch, Anne; Boone, William J.; Nückles, Matthias; Bögeholz, Susanne

    2017-02-01

    Climate change is one of the most challenging problems facing today's global society (e.g., IPCC 2013). While climate change is a widely covered topic in the media, and abundant information is made available through the internet, the causes and consequences of climate change in its full complexity are difficult for individuals, especially non-scientists, to grasp. Science education is a field which can play a crucial role in fostering meaningful education of students to become climate literate citizens (e.g., NOAA 2009; Schreiner et al., 41, 3-50, 2005). If students are, at some point, to participate in societal discussions about the sustainable development of our planet, their learning with respect to such issues needs to be supported. This includes the ability to think critically, to cope with complex scientific evidence, which is often subject to ongoing inquiry, and to reach informed decisions on the basis of factual information as well as values-based considerations. The study presented in this paper focused on efforts to advance students in (1) their conceptual understanding about climate change and (2) their socioscientific reasoning and decision making regarding socioscientific issues in general. Although there is evidence that "knowledge" does not guarantee pro-environmental behavior (e.g. Schreiner et al., 41, 3-50, 2005; Skamp et al., 97(2), 191-217, 2013), conceptual, interdisciplinary understanding of climate change is an important prerequisite to change individuals' attitudes towards climate change and thus to eventually foster climate literate citizens (e.g., Clark et al. 2013). In order to foster conceptual understanding and socioscientific reasoning, a computer-based learning environment with an embedded concept mapping tool was utilized to support senior high school students' learning about climate change and possible solution strategies. The evaluation of the effect of different concept mapping scaffolds focused on the quality of student-generated concept maps, as well as on students' test performance with respect to conceptual knowledge, socioscientific reasoning, and socioscientific decision making.

  5. Assessing Motivations and Use of Online Citizen Science Astronomy Projects

    NASA Astrophysics Data System (ADS)

    Nona Bakerman, Maya; Buxner, Sanlyn; Bracey, Georgia; Gugliucci, Nicole

    2018-01-01

    The exponential proliferation of astronomy data has created a need for new ways to analyze data. Recent efforts to engage the public in discussions of the importance of science have led to projects aimed at giving people hands-on experiences. Citizen science in astronomy, which has followed the model of citizen science in other scientific fields, has grown in the number and type of projects in the last few years and offers compelling ways to engage the public in science. The primary focus of this study was citizen science users' motivations and activities related to engaging in astronomy citizen science projects. We report on participants' interview responses related to their motivations, length and frequency of engagement, and reasons for leaving the project. From May to October 2014, 32 adults were interviewed to assess their motivations and experiences with citizen science. In particular, we examined if and how motivations changed for those who have engaged in the projects, in order to develop support for, and understand participants of, citizen science. The predominant reasons participants took part in citizen science were interest, helping, learning or teaching, and being part of science. Everyone interviewed demonstrated an intrinsic motivation to do citizen science projects. Participants' reasons for ending their engagement on any given day were having other things to do, physical effects of computer use, the end of a scheduled event, limited attention span or tiredness, and computer or program issues. A small fraction of the participants also reported experiencing negative feedback. Of the participants who no longer took part in citizen science projects, some indicated that receiving negative feedback was their primary reason, and others reported the program to be frustrating. Our work is helping us to understand participants who engage in online citizen science projects so that researchers can better design projects to meet their needs and develop support materials and incentives to encourage more participation.

  6. Old adults perform activities of daily living near their maximal capabilities.

    PubMed

    Hortobágyi, Tibor; Mizelle, Chris; Beam, Stacey; DeVita, Paul

    2003-05-01

    Old adults' ability to execute activities of daily living (ADLs) declines with age. One possible reason for this decline is that the execution of customary motor tasks requires a substantially greater effort in old compared with young adults relative to their available maximal capacity. We tested the hypothesis that the relative effort (i.e., the percentage of joint moment relative to maximal joint moment) to execute ADLs is higher in old adults compared with young adults. Healthy young adults (n = 13; mean age, 22 years) and old adults (n = 14; mean age, 74 years) ascended and descended stairs and rose from a chair and performed maximal-effort isometric supine leg press. Using inverse dynamics analysis, we determined knee joint moments in ADLs and computed relative effort. Compared with young adults, old adults had 60% lower maximal leg press moments, 53% slower knee angular velocity at peak torque, and 27% lower knee joint moments in the ADLs (all p <.05). Relative effort in ascent was 54% (SD +/- 16%) and 78% (+/-20%) in young and old adults, respectively; in descent, it was 42% (+/-20%) and 88% (+/-43%); and in chair rise, it was 42% (+/-19%) and 80% (+/-34%) (all p <.05). The relative electromyographic activity of the vastus lateralis and the coactivity of the biceps femoris associated with this relative effort were, respectively, 2- and 1.6-fold greater in old compared with young adults in the 3 ADLs (p <.05). For healthy old adults, the difficulty that arises while performing ADLs may be due more to working at a higher level of effort relative to their maximum capability than to the absolute functional demands imposed by the task.
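
The relative-effort measure used in this study is simply the joint moment demanded by a task expressed as a percentage of the maximal joint moment. The sketch below uses hypothetical moment values chosen only to echo the roughly 42% vs. 80% chair-rise pattern reported in the abstract; they are not the study's raw data.

```python
# Relative effort: task joint moment as a percentage of maximal joint moment.
def relative_effort(adl_moment_nm, max_moment_nm):
    """Percent of the available maximal moment used during a task."""
    return 100.0 * adl_moment_nm / max_moment_nm

# Hypothetical example: the same 100 N*m task demand against different
# maximal capacities. The smaller reserve drives the effort toward maximum.
young = relative_effort(100.0, 240.0)
old = relative_effort(100.0, 125.0)
```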

  7. Cognitive foundations for model-based sensor fusion

    NASA Astrophysics Data System (ADS)

    Perlovsky, Leonid I.; Weijers, Bertus; Mutz, Chris W.

    2003-08-01

    Target detection, tracking, and sensor fusion are complicated problems which usually are performed sequentially. First detecting targets, then tracking, then fusing multiple sensors reduces computation. This procedure, however, is inapplicable to difficult targets which cannot be reliably detected using individual sensors on individual scans or frames. In such more complicated cases, one has to perform the fusing, tracking, and detection functions concurrently. This often leads to prohibitive combinatorial complexity and, as a consequence, to sub-optimal performance compared to the information-theoretic content of all the available data. It is well appreciated that in this task the human mind is qualitatively far superior to existing mathematical methods of sensor fusion; however, the human mind is limited in the amount of information and speed of computation it can cope with. Therefore, research efforts have been devoted to incorporating "biological lessons" into smart algorithms, yet success has been limited. Why is this so, and how can existing limitations be overcome? The fundamental reasons for current limitations are analyzed and a potentially breakthrough research and development effort is outlined. We utilize the way our mind combines emotions and concepts in the thinking process and present a mathematical approach to accomplishing this on current computers. The presentation summarizes the difficulties encountered by intelligent systems over the last 50 years related to combinatorial complexity, analyzes the fundamental limitations of existing algorithms and neural networks, and relates them to the type of logic underlying the computational structure: formal, multivalued, and fuzzy logic. A new concept of dynamic logic is introduced, along with algorithms capable of pulling together all the available information from multiple sources. This new mathematical technique, like our brain, combines conceptual understanding with emotional evaluation and overcomes the combinatorial complexity of concurrent fusion, tracking, and detection. The presentation discusses examples of performance where computational speedups of many orders of magnitude were attained, leading to performance improvements of up to 10 dB and better.

  8. A Computational Account of Children's Analogical Reasoning: Balancing Inhibitory Control in Working Memory and Relational Representation

    ERIC Educational Resources Information Center

    Morrison, Robert G.; Doumas, Leonidas A. A.; Richland, Lindsey E.

    2011-01-01

    Theories accounting for the development of analogical reasoning tend to emphasize either the centrality of relational knowledge accretion or changes in information processing capability. Simulations in LISA (Hummel & Holyoak, 1997, 2003), a neurally inspired computer model of analogical reasoning, allow us to explore how these factors may…

  9. 28 CFR 42.609 - EEOC reasonable cause determination and conciliation efforts.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Procedures for Complaints of Employment Discrimination Filed Against Recipients of Federal Financial Assistance § 42.609 EEOC reasonable cause...

  10. Mass Storage and Retrieval at Rome Laboratory

    NASA Technical Reports Server (NTRS)

    Kann, Joshua L.; Canfield, Brady W.; Jamberdino, Albert A.; Clarke, Bernard J.; Daniszewski, Ed; Sunada, Gary

    1996-01-01

    As the speed and power of modern digital computers continue to advance, the demands on secondary mass storage systems grow. In many cases, the limitations of existing mass storage reduce the overall effectiveness of the computing system. Image storage and retrieval is one important area where improved storage technologies are required. Three-dimensional optical memories offer the advantages of large data density, on the order of 1 Tb/cm³, and faster transfer rates because of the parallel nature of optical recording. Such a system allows for the storage of multiple-Gbit-sized images, which can be recorded and accessed at reasonable rates. Rome Laboratory is currently investigating several techniques for three-dimensional optical storage, including holographic recording, two-photon recording, persistent spectral-hole burning, multi-wavelength DNA recording, and the use of bacteriorhodopsin as a recording material. In this paper, the current status of each of these ongoing efforts is discussed. In particular, the potential payoffs as well as possible limitations are addressed.
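
The density figure quoted above can be put in perspective with back-of-envelope arithmetic. The 10-Gbit image size below is a hypothetical value chosen to match the "multiple-Gbit" description; only the 1 Tb/cm³ density comes from the abstract.

```python
# Back-of-envelope: how many multi-Gbit images fit in one cm^3 of a
# three-dimensional optical memory at the quoted density?
density_bits_per_cm3 = 1e12     # 1 Tb/cm^3, as claimed in the abstract
image_bits = 10e9               # hypothetical 10-Gbit image
images_per_cm3 = density_bits_per_cm3 / image_bits
```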

  11. Computational studies of adsorption in metal organic frameworks and interaction of nanoparticles in condensed phases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Annapureddy, HVR; Motkuri, RK; Nguyen, PTM

    In this review, we describe recent efforts to systematically study nano-structured metal organic frameworks (MOFs), also known as metal organic heat carriers, with particular emphasis on their application in heating and cooling processes. We used both molecular dynamics and grand canonical Monte Carlo simulation techniques to gain a molecular-level understanding of the adsorption mechanism of gases in these porous materials. We investigated the uptake of various gases such as refrigerants R12 and R143a. We also evaluated the effects of temperature and pressure on the uptake mechanism. Our computed results compared reasonably well with available experimental measurements, thus validating our potential models and approaches. In addition, we investigated the structural, diffusive and adsorption properties of different hydrocarbons in Ni-2(dhtp). Finally, to elucidate the mechanism of nanoparticle dispersion in condensed phases, we studied the interactions among nanoparticles in various liquids, such as n-hexane, water and methanol.

  12. Evaluation of Computer Aided Vortex Forecast System

    NASA Technical Reports Server (NTRS)

    Rossow, Vernon J.; Olson, Lawrence E. (Technical Monitor)

    1995-01-01

    Several countries, including the United States, Canada, Germany, England and Russia, are in the process of trying to develop a computer-aided system that will guide controllers at airports on the hazard posed by lift-generated vortices that trail behind subsonic transport aircraft. The emphasis on this particular subject has come about because the hazard posed by wake vortices is currently the only reason why aircraft are spaced 3 to 6 miles apart during landing and takeoff rather than something like 2 miles. It is well known that under certain weather conditions, aircraft spacings can be safely reduced to as little as the desired 2 miles. In an effort to capitalize on this possibility, a combined FAA and NASA program is currently under way in the United States to develop such a system. Needless to say, anticipating the required separation distances under varying weather conditions is very difficult. Canada has a corresponding program to develop a vortex forecast system of its own.

  13. NASA integrated vehicle health management technology experiment for X-37

    NASA Astrophysics Data System (ADS)

    Schwabacher, Mark; Samuels, Jeff; Brownston, Lee

    2002-07-01

    The NASA Integrated Vehicle Health Management (IVHM) Technology Experiment for X-37 was intended to run IVHM software on board the X-37 spacecraft. The X-37 is an unpiloted vehicle designed to orbit the Earth for up to 21 days before landing on a runway. The objectives of the experiment were to demonstrate the benefits of in-flight IVHM to the operation of a Reusable Launch Vehicle, to advance the Technology Readiness Level of this IVHM technology within a flight environment, and to demonstrate that the IVHM software could operate on the Vehicle Management Computer. The scope of the experiment was to perform real-time fault detection and isolation for X-37's electrical power system and electro-mechanical actuators. The experiment used Livingstone, a software system that performs diagnosis using a qualitative, model-based reasoning approach that searches system-wide interactions to detect and isolate failures. Two of the challenges we faced were to make this research software more efficient so that it would fit within the limited computational resources that were available to us on the X-37 spacecraft, and to modify it so that it satisfied the X-37's software safety requirements. Although the experiment is currently unfunded, the development effort resulted in major improvements in Livingstone's efficiency and safety. This paper reviews some of the details of the modeling and integration efforts, and some of the lessons that were learned.

  14. Bayesian inversion using a geologically realistic and discrete model space

    NASA Astrophysics Data System (ADS)

    Jaeggli, C.; Julien, S.; Renard, P.

    2017-12-01

    Since the early days of groundwater modeling, inverse methods have played a crucial role. Many research and engineering groups aim to infer extensive knowledge of aquifer parameters from a sparse set of observations. Despite decades of dedicated research on this topic, there are still several major issues to be solved. In the hydrogeological framework, one is often confronted with underground structures that present very sharp contrasts of geophysical properties. In particular, subsoil structures such as karst conduits, channels, faults, or lenses strongly influence groundwater flow and transport behavior of the underground. For this reason it can be essential to identify their location and shape very precisely. Unfortunately, when inverse methods are specially trained to consider such complex features, their computational effort often becomes unaffordably high. The following work is an attempt to solve this dilemma. We present a new method that is, in some sense, a compromise between the ergodicity of Markov chain Monte Carlo (McMC) methods and the efficient handling of data by ensemble-based Kalman filters. The realistic and complex random fields are generated by a Multiple-Point Statistics (MPS) tool, but the method is applicable with any conditional geostatistical simulation tool. Furthermore, the algorithm is independent of any parametrization, which becomes most important when two parametric systems are equivalent (permeability and resistivity, speed and slowness, etc.). When compared to two existing McMC schemes, the computational effort was divided by a factor of 12.
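    The compromise the abstract describes can be sketched, under strong simplifying assumptions, as an independence Metropolis sampler over a discrete model space: each proposal is a fresh realization from a geostatistical simulator, so the prior ratio cancels and acceptance depends only on data misfit. The stand-in simulator, forward model, and misfit below are invented for illustration.

```python
import math, random

random.seed(0)

# Hypothetical stand-in for an MPS simulator: draws a binary facies field
# (True = conduit, False = matrix). Any conditional geostatistical
# simulator could be substituted here, as the abstract notes.
def draw_realization(n=25):
    return [random.random() < 0.3 for _ in range(n)]

def misfit(field, data):
    # Toy forward model: observed "connectivity" = fraction of conduit cells.
    predicted = sum(field) / len(field)
    return (predicted - data) ** 2

def metropolis(data, n_iter=2000, sigma=0.05):
    """Metropolis sampling over the discrete model space: propose an
    independent new realization, accept with the likelihood ratio."""
    current = draw_realization()
    current_phi = misfit(current, data)
    accepted = 0
    for _ in range(n_iter):
        proposal = draw_realization()
        phi = misfit(proposal, data)
        # Gaussian likelihood ratio; the prior ratio is 1 because
        # proposals are drawn from the prior itself.
        if random.random() < math.exp((current_phi - phi) / (2 * sigma**2)):
            current, current_phi = proposal, phi
            accepted += 1
    return current, accepted / n_iter

field, rate = metropolis(data=0.3)
print(f"acceptance rate: {rate:.2f}")
```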

  15. Skin Effect Simulation for Area 11 Dense Plasma Focus Hot Plate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meehan, B. Timothy

    Two arc flashover events occurred at the DPF Area 11 facility. These flashover events happened in the same location on the bank current delivery plates. The damage from one of these events can be seen on the left-hand side of Figure 1. Since the flashovers occurred in the same area of the bank, and the reliability of the bank is important for future DPF experiments, a failure analysis effort was initiated. Part of this effort was to understand the physical reasons why the flashovers happened, and why they happened in the same place twice. This paper summarizes an effort to simulate the current flow in the bank in order to understand the reasons for the flashover.

  16. Predicting Reasoning from Memory

    ERIC Educational Resources Information Center

    Heit, Evan; Hayes, Brett K.

    2011-01-01

    In an effort to assess the relations between reasoning and memory, in 8 experiments, the authors examined how well responses on an inductive reasoning task are predicted from responses on a recognition memory task for the same picture stimuli. Across several experimental manipulations, such as varying study time, presentation frequency, and the…

  17. Intuition, Reason, and Metacognition

    ERIC Educational Resources Information Center

    Thompson, Valerie A.; Prowse Turner, Jamie A.; Pennycook, Gordon

    2011-01-01

    Dual Process Theories (DPT) of reasoning posit that judgments are mediated by both fast, automatic processes and more deliberate, analytic ones. A critical, but unanswered question concerns the issue of monitoring and control: When do reasoners rely on the first, intuitive output and when do they engage more effortful thinking? We hypothesised…

  18. Initial CGE Model Results Summary Exogenous and Endogenous Variables Tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, Brian Keith; Boero, Riccardo; Rivera, Michael Kelly

    The following discussion presents initial results of tests of the most recent version of the National Infrastructure Simulation and Analysis Center Dynamic Computable General Equilibrium (CGE) model developed by Los Alamos National Laboratory (LANL). The intent is to test and assess the model’s behavioral properties. The tests evaluated whether the predicted impacts are reasonable from a qualitative perspective, that is, whether a predicted change, be it an increase or a decrease in other model variables, is consistent with prior economic intuition and expectations. One purpose of this effort is to determine whether model changes are needed to improve the model's behavior qualitatively and quantitatively.

  19. Psychological Trauma as a Reason for Computer Game Addiction among Adolescents

    ERIC Educational Resources Information Center

    Oskenbay, Fariza; Tolegenova, Aliya; Kalymbetova, Elmira; Chung, Man Cheung; Faizullina, Aida; Jakupov, Maksat

    2016-01-01

    This study explores psychological trauma as a reason for computer game addiction among adolescents. The findings of this study show that there is a connection between psychological trauma and computer game addiction. Some psychologists note that the main cause of any type of addiction derives from psychological trauma, and that finding such…

  20. Does Computer Use Matter? The Influence of Computer Usage on Eighth-Grade Students' Mathematics Reasoning

    ERIC Educational Resources Information Center

    Ayieko, Rachel A.; Gokbel, Elif N.; Nelson, Bryan

    2017-01-01

    This study uses the 2011 Trends in International Mathematics and Science Study to investigate the relationships among students' and teachers' computer use, and eighth-grade students' mathematical reasoning in three high-achieving nations: Finland, Chinese Taipei, and Singapore. The study found a significant negative relationship in all three…

  1. Computational aerodynamics and artificial intelligence

    NASA Technical Reports Server (NTRS)

    Mehta, U. B.; Kutler, P.

    1984-01-01

    The general principles of artificial intelligence are reviewed and speculations are made concerning how knowledge based systems can accelerate the process of acquiring new knowledge in aerodynamics, how computational fluid dynamics may use expert systems, and how expert systems may speed the design and development process. In addition, the anatomy of an idealized expert system called AERODYNAMICIST is discussed. Resource requirements for using artificial intelligence in computational fluid dynamics and aerodynamics are examined. Three main conclusions are presented. First, there are two related aspects of computational aerodynamics: reasoning and calculating. Second, a substantial portion of reasoning can be achieved with artificial intelligence. It offers the opportunity of using computers as reasoning machines to set the stage for efficient calculating. Third, expert systems are likely to be new assets of institutions involved in aeronautics for various tasks of computational aerodynamics.

  2. A computational model of self-efficacy's various effects on performance: Moving the debate forward.

    PubMed

    Vancouver, Jeffrey B; Purl, Justin D

    2017-04-01

    Self-efficacy, which is one's belief in one's capacity, has been found to both positively and negatively influence effort and performance. The reasons for these different effects have been a major topic of debate among social-cognitive and perceptual control theorists. In particular, the finding of various self-efficacy effects has been motivated by a perceptual control theory view of self-regulation that social-cognitive theorists question. To provide more clarity to the theoretical arguments, a computational model of the multiple processes presumed to create the positive, negative, and null effects of self-efficacy is presented. Building on an existing computational model of goal choice that produces a positive effect for self-efficacy, the current article adds a symbolic processing structure used during goal striving that explains the negative self-efficacy effect observed in recent studies. Moreover, the multiple processes, operating together, allow the model to recreate the various effects found in a published study of feedback ambiguity's moderating role on the self-efficacy-to-performance relationship (Schmidt & DeShon, 2010). Discussion focuses on the implications of the model for the self-efficacy debate, alternative computational models, the overlap between control theory and social-cognitive theory explanations, the value of using computational models for resolving theoretical disputes, and future research directions the model inspires.
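    A hedged sketch of the kind of discrepancy-driven goal-striving loop debated here (all parameters and the specific mechanism are invented for illustration, not taken from the article's model): if self-efficacy scales the *perceived* rate of progress, overconfidence closes the perceived gap early and total effort falls, reproducing a negative self-efficacy effect.

```python
# Proportional-control goal striving: effort is driven by the perceived
# discrepancy between the goal and (self-efficacy-scaled) progress.
def strive(goal, self_efficacy, steps=20):
    progress, total_effort = 0.0, 0.0
    for _ in range(steps):
        perceived_remaining = max(goal - self_efficacy * progress, 0.0)
        effort = 0.5 * perceived_remaining   # proportional control
        progress += effort                   # effort produces progress
        total_effort += effort
    return progress, total_effort

for se in (0.8, 1.0, 1.2):
    p, e = strive(goal=10.0, self_efficacy=se)
    print(f"self-efficacy {se}: progress {p:.1f}, total effort {e:.1f}")
```

    With overconfidence (self-efficacy above 1) the loop stops short of the true goal and expends less total effort; underconfidence produces the opposite.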

  3. Application of a personal computer for the uncoupled vibration analysis of wind turbine blade and counterweight assemblies

    NASA Technical Reports Server (NTRS)

    White, P. R.; Little, R. R.

    1985-01-01

    A research effort was undertaken to develop personal computer based software for vibrational analysis. The software was developed to analytically determine the natural frequencies and mode shapes for the uncoupled lateral vibrations of the blade and counterweight assemblies used in a single bladed wind turbine. The uncoupled vibration analysis was performed in both the flapwise and chordwise directions for static rotor conditions. The effects of rotation on the uncoupled flapwise vibration of the blade and counterweight assemblies were evaluated for various rotor speeds up to 90 rpm. The theory, used in the vibration analysis codes, is based on a lumped mass formulation for the blade and counterweight assemblies. The codes are general so that other designs can be readily analyzed. The input for the codes is generally interactive to facilitate usage. The output of the codes is both tabular and graphical. Listings of the codes are provided. Predicted natural frequencies of the first several modes show reasonable agreement with experimental results. The analysis codes were originally developed on a DEC PDP 11/34 minicomputer and then downloaded and modified to run on an ITT XTRA personal computer. Studies conducted to evaluate the efficiency of running the programs on a personal computer as compared with the minicomputer indicated that, with the proper combination of hardware and software options, the efficiency of using a personal computer exceeds that of a minicomputer.

  4. Great Computational Intelligence in the Formal Sciences via Analogical Reasoning

    DTIC Science & Technology

    2017-05-08

    AFRL-AFOSR-VA-TR-2017-0099. Great Computational Intelligence in the Formal Sciences via Analogical Reasoning. Selmer Bringsjord, Rensselaer Polytechnic. Final performance report, 15 Oct 2011 to 31 Dec 2016. The computational harnessing of traditional mathematical statistics (as e.g. covered in Hogg, Craig & McKean 2005) is used to power statistical learning techniques...

  5. 20 CFR 1002.198 - What efforts must the employer make to help the employee become qualified for the reemployment...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 3 2011-04-01 2011-04-01 false What efforts must the employer make to help... Reemployment Rights and Benefits Reemployment Position § 1002.198 What efforts must the employer make to help... reemployment position. The employer must make reasonable efforts to help the employee become qualified to...

  6. Design for a Crane Metallic Structure Based on Imperialist Competitive Algorithm and Inverse Reliability Strategy

    NASA Astrophysics Data System (ADS)

    Fan, Xiao-Ning; Zhi, Bo

    2017-07-01

    Uncertainties in parameters such as materials, loading, and geometry are inevitable in designing metallic structures for cranes. When these uncertainty factors are considered, reliability-based design optimization (RBDO) offers a more reasonable design approach. However, existing RBDO methods for crane metallic structures are prone to low convergence speed and high computational cost. A unilevel RBDO method, combining a discrete imperialist competitive algorithm with an inverse reliability strategy based on the performance measure approach, is developed. Application of the imperialist competitive algorithm at the optimization level significantly improves the convergence speed of this RBDO method. At the reliability analysis level, the inverse reliability strategy is used to determine the feasibility of each probabilistic constraint at each design point by calculating its α-percentile performance, thereby avoiding the convergence failure, calculation error, and disproportionate computational effort encountered using conventional moment and simulation methods. Application of the RBDO method to an actual crane structure shows that the developed RBDO realizes a design with the best tradeoff between economy and safety, at about one-third of the convergence time and computational cost of the existing method. This paper provides a scientific and effective approach for the design of metallic structures of cranes.
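    The α-percentile performance used by the inverse reliability strategy can be sketched for a toy case: in standard normal space it is the minimum of the limit-state function g over the sphere of radius β_target. The limit state and the crude angular grid search below are illustrative assumptions, not the paper's iteration scheme.

```python
import math

def g(u1, u2):
    """Hypothetical linear limit state in standard normal space."""
    return 3.0 - 0.8 * u1 - 0.6 * u2

def percentile_performance(beta_target, n=3600):
    """Minimize g over the circle ||u|| = beta_target by angular search."""
    best = float("inf")
    for k in range(n):
        t = 2 * math.pi * k / n
        u1, u2 = beta_target * math.cos(t), beta_target * math.sin(t)
        best = min(best, g(u1, u2))
    return best

# The probabilistic constraint is feasible at this design point if the
# percentile performance is non-negative:
print(round(percentile_performance(beta_target=2.0), 3))  # -> 1.0 here
```

    For this linear g the analytic answer is 3 − β·√(0.8² + 0.6²) = 1.0, which the grid search recovers; the performance measure approach replaces the grid with an efficient iterative search.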

  7. Dynamic Interaction of Long Suspension Bridges with Running Trains

    NASA Astrophysics Data System (ADS)

    XIA, H.; XU, Y. L.; CHAN, T. H. T.

    2000-10-01

    This paper presents an investigation of the dynamic interaction of long suspension bridges with running trains. A three-dimensional finite element model is used to represent a long suspension bridge. Each 4-axle vehicle in a train is modelled by a 27-degrees-of-freedom dynamic system. The dynamic interaction between the bridge and train is realized through the contact forces between the wheels and track. By applying a mode superposition technique to the bridge only and taking the measured track irregularities as known quantities, the number of degrees of freedom (d.o.f.) of the bridge-train system is significantly reduced and the coupled equations of motion are efficiently solved. The proposed formulation and the associated computer program are then applied to a real long suspension bridge carrying a railway within the bridge deck. The dynamic response of the bridge-train system and the derail and offload factors related to the running safety of the train are computed. The results show that the formulation presented in this paper can predict the dynamic behavior of both bridge and train well, with reasonable computational effort. Dynamic interaction between the long suspension bridge and train is not significant.
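    The d.o.f. reduction behind mode superposition can be shown on a toy 2-DOF spring-mass chain (values hypothetical): once the natural modes are known, each retained modal coordinate obeys an independent scalar equation, which is what makes the coupled bridge-train system tractable.

```python
import math

# 2-DOF chain with unit masses and stiffness k between supports and masses.
# With M = I, the mode shapes are simply the eigenvectors of K.
k = 100.0
K = [[2 * k, -k],
     [-k, 2 * k]]

# Eigenvalues of the symmetric 2x2 stiffness matrix via the characteristic
# equation: lambda^2 - tr*lambda + det = 0, with lambda = omega^2.
tr = K[0][0] + K[1][1]
det = K[0][0] * K[1][1] - K[0][1] * K[1][0]
disc = math.sqrt(tr * tr - 4 * det)
eigvals = [(tr - disc) / 2, (tr + disc) / 2]

omegas = [math.sqrt(lam) for lam in eigvals]
print("natural frequencies (rad/s):", omegas)
# With mass-normalized modes the modal equations decouple:
#   q_i'' + omega_i^2 * q_i = phi_i . f(t)
# so each retained mode is integrated independently of the others.
```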

  8. Knowledge-acquisition tools for medical knowledge-based systems.

    PubMed

    Lanzola, G; Quaglini, S; Stefanelli, M

    1995-03-01

    Knowledge-based systems (KBS) have been proposed to solve a large variety of medical problems. A strategic issue for KBS development and maintenance is the effort required from both knowledge engineers and domain experts. The proposed solution is building efficient knowledge acquisition (KA) tools. This paper presents a set of KA tools we are developing within a European project called GAMES II. They have been designed after the formulation of an epistemological model of medical reasoning. The main goal is that of developing a computational framework which allows knowledge engineers and domain experts to interact cooperatively in developing a medical KBS. To this aim, a set of reusable software components is highly recommended. Their design was facilitated by the development of a methodology for KBS construction. It views this process as comprising two activities: the tailoring of the epistemological model to the specific medical task to be executed, and the subsequent translation of this model into a computational architecture, so that the connections between computational structures and their knowledge-level counterparts are maintained. The KA tools we developed are illustrated with examples from the behavior of a KBS we are building for the management of children with acute myeloid leukemia.

  9. An implicit scheme with memory reduction technique for steady state solutions of DVBE in all flow regimes

    NASA Astrophysics Data System (ADS)

    Yang, L. M.; Shu, C.; Yang, W. M.; Wu, J.

    2018-04-01

    High consumption of memory and computational effort is the major barrier preventing the widespread use of the discrete velocity method (DVM) in the simulation of flows in all flow regimes. To overcome this drawback, an implicit DVM with a memory reduction technique for solving a steady discrete velocity Boltzmann equation (DVBE) is presented in this work. In the method, the distribution functions in the whole discrete velocity space do not need to be stored; they are calculated from the macroscopic flow variables. As a result, its memory requirement is of the same order as that of a conventional Euler/Navier-Stokes solver. At the same time, it is more efficient than the explicit DVM for the simulation of various flows. To make the method efficient for solving flow problems in all flow regimes, a prediction step is introduced to estimate the local equilibrium state of the DVBE. In the prediction step, the distribution function at the cell interface is calculated by the local solution of DVBE. For the flow simulation, when the cell size is less than the mean free path, the prediction step has almost no effect on the solution. However, when the cell size is much larger than the mean free path, the prediction step dominates the solution so as to provide reasonable results in such a flow regime. In addition, to further improve the computational efficiency of the developed scheme in the continuum flow regime, the implicit technique is also introduced into the prediction step. Numerical results showed that the proposed implicit scheme can provide reasonable results in all flow regimes and increase significantly the computational efficiency in the continuum flow regime as compared with the existing DVM solvers.
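    The memory-reduction idea, reconstructing distribution functions from macroscopic variables instead of storing them, can be sketched for a 1-D equilibrium (Maxwellian) on a small discrete velocity set. The nondimensionalization and grid below are assumptions for illustration only.

```python
import math

# Local equilibrium distribution reconstructed from the macroscopic state
# (density rho, bulk velocity u, temperature T) on a discrete velocity set.
def maxwellian(rho, u, T, velocities):
    R = 1.0  # gas constant in nondimensional units (assumption)
    return [rho / math.sqrt(2 * math.pi * R * T)
            * math.exp(-(v - u) ** 2 / (2 * R * T)) for v in velocities]

dv = 0.25
vels = [-3 + dv * i for i in range(25)]  # discrete velocity space, -3..3
f_eq = maxwellian(rho=1.0, u=0.2, T=1.0, velocities=vels)

# The moments recovered by quadrature match the macroscopic inputs, which
# is what lets the solver avoid storing f over the whole velocity space:
print(round(sum(f * dv for f in f_eq), 3))  # density, close to 1.0
```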

  10. Reasoning on the Autism Spectrum: A Dual Process Theory Account

    ERIC Educational Resources Information Center

    Brosnan, Mark; Lewton, Marcus; Ashwin, Chris

    2016-01-01

    Dual process theory proposes two distinct reasoning processes in humans, an intuitive style that is rapid and automatic and a deliberative style that is more effortful. However, no study to date has specifically examined these reasoning styles in relation to the autism spectrum. The present studies investigated deliberative and intuitive reasoning…

  11. Logic Brightens My Day: Evidence for Implicit Sensitivity to Logical Validity

    ERIC Educational Resources Information Center

    Trippas, Dries; Handley, Simon J.; Verde, Michael F.; Morsanyi, Kinga

    2016-01-01

    A key assumption of dual process theory is that reasoning is an explicit, effortful, deliberative process. The present study offers evidence for an implicit, possibly intuitive component of reasoning. Participants were shown sentences embedded in logically valid or invalid arguments. Participants were not asked to reason but instead rated the…

  12. Parallelisation study of a three-dimensional environmental flow model

    NASA Astrophysics Data System (ADS)

    O'Donncha, Fearghal; Ragnoli, Emanuele; Suits, Frank

    2014-03-01

    There are many simulation codes in the geosciences that are serial and cannot take advantage of the parallel computational resources commonly available today. One model important for our work in coastal ocean current modelling is EFDC, a Fortran 77 code configured for optimal deployment on vector computers. In order to take advantage of our cache-based, blade computing system we restructured EFDC from serial to parallel, thereby allowing us to run existing models more quickly, and to simulate larger and more detailed models that were previously impractical. Since the source code for EFDC is extensive and involves detailed computation, it is important to do such a port in a manner that limits changes to the files, while achieving the desired speedup. We describe a parallelisation strategy involving surgical changes to the source files to minimise error-prone alteration of the underlying computations, while allowing load-balanced domain decomposition for efficient execution on a commodity cluster. The use of the conjugate gradient method posed particular challenges, because its implicit non-local communication hinders standard domain partitioning schemes; a number of techniques are discussed to address this in a feasible, computationally efficient manner. The parallel implementation demonstrates good scalability in combination with a novel domain partitioning scheme that specifically handles mixed water/land regions commonly found in coastal simulations. The approach presented here represents a practical methodology to rejuvenate legacy code on a commodity blade cluster with reasonable effort; our solution has direct application to other similar codes in the geosciences.
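    The mixed water/land partitioning idea can be sketched as a greedy split of grid columns that balances *active water cells* rather than raw column counts, since land cells cost no computation. The grid values and the simple greedy rule below are invented for illustration, not the paper's actual scheme.

```python
def partition_columns(water_cells_per_column, n_ranks):
    """Greedily assign contiguous column slabs so each rank gets roughly
    an equal share of water cells; slab widths therefore vary."""
    total = sum(water_cells_per_column)
    target = total / n_ranks
    parts, current, acc = [], [], 0.0
    for i, w in enumerate(water_cells_per_column):
        current.append(i)
        acc += w
        remaining = len(water_cells_per_column) - i - 1
        # Close this slab once it reaches its share, keeping at least one
        # column available for each remaining rank.
        if (acc >= target and len(parts) < n_ranks - 1
                and remaining >= n_ranks - 1 - len(parts)):
            parts.append(current)
            current, acc = [], 0.0
    parts.append(current)
    return parts

# 12 columns; the middle of the domain is mostly land (few water cells):
water = [40, 38, 35, 5, 2, 1, 1, 3, 30, 36, 39, 41]
for rank, cols in enumerate(partition_columns(water, 3)):
    print(rank, cols, sum(water[c] for c in cols))
```

    A naive equal-column split would give the middle rank almost no work; balancing on water cells makes the mostly-land slab much wider.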

  13. 28 CFR 42.609 - EEOC reasonable cause determination and conciliation efforts.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... informal methods of conference, conciliation and persuasion. If EEOC would like the referring agency to... efforts to resolve the complaint by informal methods of conference, conciliation and persuasion fail, EEOC...

  14. 28 CFR 42.609 - EEOC reasonable cause determination and conciliation efforts.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... informal methods of conference, conciliation and persuasion. If EEOC would like the referring agency to... efforts to resolve the complaint by informal methods of conference, conciliation and persuasion fail, EEOC...

  15. Using Computational Toxicology to Enable Risk-Based ...

    EPA Pesticide Factsheets

    Slide presentation at Drug Safety Gordon Research Conference 2016 on research efforts in NCCT to enable Computational Toxicology to support risk assessment.

  16. U.S. EPA computational toxicology programs: Central role of chemical-annotation efforts and molecular databases

    EPA Science Inventory

    EPA’s National Center for Computational Toxicology is engaged in high-profile research efforts to improve the ability to more efficiently and effectively prioritize and screen thousands of environmental chemicals for potential toxicity. A central component of these efforts invol...

  17. Creative and Algorithmic Mathematical Reasoning: Effects of Transfer-Appropriate Processing and Effortful Struggle

    ERIC Educational Resources Information Center

    Jonsson, Bert; Kulaksiz, Yagmur C.; Lithner, Johan

    2016-01-01

    Two separate studies, Jonsson et al. ("J. Math Behav." 2014;36: 20-32) and Karlsson Wirebring et al. ("Trends Neurosci Educ." 2015;4(1-2):6-14), showed that learning mathematics using creative mathematical reasoning and constructing their own solution methods can be more efficient than if students use algorithmic reasoning and…

  18. Assessing Probabilistic Reasoning in Verbal-Numerical and Graphical-Pictorial Formats: An Evaluation of the Psychometric Properties of an Instrument

    ERIC Educational Resources Information Center

    Agus, Mirian; Penna, Maria Pietronilla; Peró-Cebollero, Maribel; Guàrdia-Olmos, Joan

    2016-01-01

    Research on the graphical facilitation of probabilistic reasoning has been characterised by the effort expended to identify valid assessment tools. The authors developed an assessment instrument to compare reasoning performances when problems were presented in verbal-numerical and graphical-pictorial formats. A sample of undergraduate psychology…

  19. Item Response Theory in the context of Improving Student Reasoning

    NASA Astrophysics Data System (ADS)

    Goddard, Chase; Davis, Jeremy; Pyper, Brian

    2011-10-01

    We are interested to see if Item Response Theory can help to better inform the development of reasoning ability in introductory physics. A first pass through our latest batch of data from the Heat and Temperature Conceptual Evaluation, the Lawson Classroom Test of Scientific Reasoning, and the Epistemological Beliefs About Physics Survey may help in this effort.
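    Item Response Theory's core object, the item characteristic curve, can be sketched with a two-parameter logistic (2PL) model, which links a student's latent reasoning ability to the probability of answering an item correctly. The parameter values below are hypothetical, not fitted to the instruments named above.

```python
import math

def p_correct(theta, a, b):
    """2PL item characteristic curve.
    theta = latent ability, a = discrimination, b = difficulty."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# An item of middling difficulty (b = 0) and good discrimination (a = 1.5):
for theta in (-2.0, 0.0, 2.0):
    print(theta, round(p_correct(theta, a=1.5, b=0.0), 3))
```

    Fitting a and b per item shows which questions actually discriminate among reasoning levels, which is the diagnostic value IRT adds over raw scores.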

  20. Developing a Construct-Based Assessment to Examine Students' Analogical Reasoning around Physical Models in Earth Science

    ERIC Educational Resources Information Center

    Rivet, Ann E.; Kastens, Kim A.

    2012-01-01

    In recent years, science education has placed increasing importance on learners' mastery of scientific reasoning. This growing emphasis presents a challenge for both developers and users of assessments. We report on our effort around the conceptualization, development, and testing the validity of an assessment of students' ability to reason around…

  1. Understanding Gaps in Research Networks: Using "Spatial Reasoning" as a Window into the Importance of Networked Educational Research

    ERIC Educational Resources Information Center

    Bruce, Catherine D.; Davis, Brent; Sinclair, Nathalie; McGarvey, Lynn; Hallowell, David; Drefs, Michelle; Francis, Krista; Hawes, Zachary; Moss, Joan; Mulligan, Joanne; Okamoto, Yukari; Whiteley, Walter; Woolcott, Geoff

    2017-01-01

    This paper finds its origins in a multidisciplinary research group's efforts to assemble a review of research in order to better appreciate how "spatial reasoning" is understood and investigated across academic disciplines. We first collaborated to create a historical map of the development of spatial reasoning across key disciplines…

  2. The Comparison of Inductive Reasoning under Risk Conditions between Chinese and Japanese Based on Computational Models: Toward the Application to CAE for Foreign Language

    ERIC Educational Resources Information Center

    Zhang, Yujie; Terai, Asuka; Nakagawa, Masanori

    2013-01-01

    Inductive reasoning under risk conditions is an important thinking process not only for sciences but also in our daily life. From this viewpoint, it is very useful for language learning to construct computational models of inductive reasoning which realize the CAE for foreign languages. This study proposes the comparison of inductive reasoning…

  3. Knowledge Representation and Ontologies

    NASA Astrophysics Data System (ADS)

    Grimm, Stephan

    Knowledge representation and reasoning aims at designing computer systems that reason about a machine-interpretable representation of the world. Knowledge-based systems have a computational model of some domain of interest in which symbols serve as surrogates for real world domain artefacts, such as physical objects, events, relationships, etc. [1]. The domain of interest can cover any part of the real world or any hypothetical system about which one desires to represent knowledge for computational purposes. A knowledge-based system maintains a knowledge base, which stores the symbols of the computational model in the form of statements about the domain, and it performs reasoning by manipulating these symbols. Applications can base their decisions on answers to domain-relevant questions posed to a knowledge base.
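    As a minimal concrete illustration of a knowledge base of statements plus reasoning by symbol manipulation (the facts and the single transitivity rule below are invented): facts are stored as tuples, and forward chaining derives new facts until a fixed point.

```python
# Knowledge base: statements about the domain, stored as symbol tuples.
facts = {("isa", "fuel_cell", "power_source"),
         ("isa", "power_source", "subsystem"),
         ("part_of", "fuel_cell", "shuttle")}

def forward_chain(facts):
    """Apply the rule 'isa is transitive' until no new facts appear."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        new = {("isa", a, c)
               for (p1, a, b) in facts if p1 == "isa"
               for (p2, b2, c) in facts if p2 == "isa" and b2 == b}
        if not new <= facts:
            facts |= new
            changed = True
    return facts

kb = forward_chain(facts)
# The system can now answer a question that was never stated explicitly:
print(("isa", "fuel_cell", "subsystem") in kb)  # True
```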

  4. An intelligent tutoring system for space shuttle diagnosis

    NASA Technical Reports Server (NTRS)

    Johnson, William B.; Norton, Jeffrey E.; Duncan, Phillip C.

    1988-01-01

    An Intelligent Tutoring System (ITS) transcends conventional computer-based instruction. An ITS is capable of monitoring and understanding student performance, thereby providing feedback, explanation, and remediation. This is accomplished by including models of the student, the instructor, and the expert technician or operator in the domain of interest. The space shuttle fuel cell is the technical domain for the project described below. One system, Microcomputer Intelligence for Technical Training (MITT), demonstrates that ITSs can be developed and delivered, with a reasonable amount of effort and in a short period of time, on a microcomputer. The MITT system capitalizes on the diagnostic training approach called Framework for Aiding the Understanding of Logical Troubleshooting (FAULT) (Johnson, 1987). The system's embedded procedural expert was developed with NASA's C Language Integrated Production System (CLIPS) expert system shell (Culbert, 1987).

  5. Class Model Development Using Business Rules

    NASA Astrophysics Data System (ADS)

    Skersys, Tomas; Gudas, Saulius

    New developments in the area of computer-aided system engineering (CASE) greatly improve processes of the information systems development life cycle (ISDLC). Much effort is put into quality improvement issues, but IS development projects still suffer from the poor quality of models during the system analysis and design cycles. To some degree, the quality of models that are developed using CASE tools can be assured using various automated model comparison and syntax checking procedures. It is also reasonable to check these models against business domain knowledge, but the domain knowledge stored in the repository of a CASE tool (enterprise model) is insufficient (Gudas et al. 2004). Involvement of business domain experts in these processes is complicated because non-IT people often find it difficult to understand models that were developed by IT professionals using some specific modeling language.

  6. Future fundamental combustion research for aeropropulsion systems

    NASA Technical Reports Server (NTRS)

    Mularz, E. J.

    1985-01-01

    Physical fluid mechanics, heat transfer, and chemical kinetic processes which occur in the combustion chamber of aeropropulsion systems were investigated. With component requirements becoming more severe for future engines, current design methodology needs new tools to obtain the optimum configuration within a reasonable design and development cycle. Research efforts in the last few years have been encouraging, but to achieve these benefits further research into the fundamental aerothermodynamic processes of combustion is required. It is recommended that research continue in the areas of flame stabilization, combustor aerodynamics, heat transfer, multiphase flow and atomization, turbulent reacting flows, and chemical kinetics. Associated with each of these engineering sciences is the need for research into computational methods to accurately describe and predict these complex physical processes. Research needs in each of these areas are highlighted.

  7. Modeling of Melt-Infiltrated SiC/SiC Composite Properties

    NASA Technical Reports Server (NTRS)

    Mital, Subodh K.; Bednarcyk, Brett A.; Arnold, Steven M.; Lang, Jerry

    2009-01-01

    The elastic properties of a two-dimensional five-harness melt-infiltrated silicon carbide fiber reinforced silicon carbide matrix (MI SiC/SiC) ceramic matrix composite (CMC) were predicted using several methods. Methods used in this analysis are multiscale laminate analysis, micromechanics-based woven composite analysis, a hybrid woven composite analysis, and two- and three-dimensional finite element analyses. The elastic properties predicted are in good agreement with each other as well as with the available measured data. However, the various methods differ from each other in three key areas: (1) the fidelity provided, (2) the efforts required for input data preparation, and (3) the computational resources required. Results also indicate that efficient methods are also able to provide a reasonable estimate of local stress fields.
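    At the simplest end of the fidelity spectrum the abstract describes, a rule-of-mixtures (Voigt) estimate gives the longitudinal modulus of a unidirectional ply from constituent properties. The property values below are hypothetical round numbers, not the measured MI SiC/SiC data.

```python
def rule_of_mixtures(E_fiber, E_matrix, vf):
    """Voigt (iso-strain) upper bound for the longitudinal modulus."""
    return vf * E_fiber + (1.0 - vf) * E_matrix

E_f, E_m, vf = 380.0, 300.0, 0.35   # GPa, GPa, fiber volume fraction
print(f"E11 ~ {rule_of_mixtures(E_f, E_m, vf):.1f} GPa")  # -> 328.0 GPa
```

    Such closed-form estimates illustrate the tradeoff the abstract notes: negligible input preparation and computational cost, but no local stress fields, which is what the finite element analyses provide.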

  8. Computational fluid dynamics modeling of laminar, transitional, and turbulent flows with sensitivity to streamline curvature and rotational effects

    NASA Astrophysics Data System (ADS)

    Chitta, Varun

    Modeling of complex flows involving the combined effects of flow transition and streamline curvature is considered in this research effort, using two advanced turbulence models: one in the Reynolds-averaged Navier-Stokes (RANS) category and the other in the hybrid RANS-large eddy simulation (LES) category. In the first part of the research, a new scalar eddy-viscosity model (EVM) is proposed, designed to exhibit physically correct responses to flow transition, streamline curvature, and system rotation effects. The four-equation model developed herein is a curvature-sensitized version of a commercially available three-equation transition-sensitive model. The physical effects of rotation and curvature (RC) enter the model through the added transport equation, analogous to a transverse turbulent velocity scale. The eddy viscosity has been redefined such that the proposed model is constrained to reduce to the original transition-sensitive model definition in nonrotating flows or in regions with negligible RC effects. In the second part of the research, the developed four-equation model is combined with an LES technique using a new hybrid modeling framework, dynamic hybrid RANS-LES (DHRL). The new framework is highly generalized, allowing coupling of any desired LES model with any given RANS model, and addresses several deficiencies inherent in most current hybrid models. In the present research effort, the DHRL model comprises the proposed four-equation model for the RANS component and the MILES scheme for the LES component. Both models were implemented into a commercial computational fluid dynamics (CFD) solver and tested on a number of engineering and generic flow problems. Results from both the RANS and hybrid models show successful resolution of the combined effects of transition and curvature with reasonable engineering accuracy, for only a small increase in computational cost. In addition, results from the hybrid model indicate significant levels of turbulent fluctuations in the flowfield and improved accuracy compared to RANS model predictions, obtained at a significant reduction of computational cost compared to full LES models. The results suggest that the advanced turbulence modeling techniques presented in this research effort have potential as practical tools for solving low/high-Re flows over blunt/curved bodies for the prediction of transition and RC effects.
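
    The core idea of such a hybrid framework, blending RANS and LES stress contributions according to how much turbulence the grid actually resolves, can be sketched in a few lines. This is an illustrative weighting only, not the published DHRL closure; all names and numbers are hypothetical.

```python
def hybrid_stress(tau_rans, tau_les, k_resolved, k_total):
    """Toy hybrid RANS-LES blend: weight the two stress contributions
    by the fraction of turbulent kinetic energy that is resolved."""
    alpha = k_resolved / k_total if k_total > 0 else 0.0  # 0 = pure RANS, 1 = pure LES
    return alpha * tau_les + (1 - alpha) * tau_rans

# A well-resolved region (75% of k resolved) leans toward the LES stress.
print(hybrid_stress(2.0, 0.5, 0.75, 1.0))  # -> 0.875
```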

  9. The NASA Integrated Vehicle Health Management Technology Experiment for X-37

    NASA Technical Reports Server (NTRS)

    Schwabacher, Mark; Samuels, Jeff; Brownston, Lee; Clancy, Daniel (Technical Monitor)

    2002-01-01

    The NASA Integrated Vehicle Health Management (IVHM) Technology Experiment for X-37 was intended to run IVHM software on-board the X-37 spacecraft. The X-37 is intended to be an unpiloted vehicle that would orbit the Earth for up to 21 days before landing on a runway. The objectives of the experiment were to demonstrate the benefits of in-flight IVHM to the operation of a Reusable Launch Vehicle, to advance the Technology Readiness Level of this IVHM technology within a flight environment, and to demonstrate that the IVHM software could operate on the Vehicle Management Computer. The scope of the experiment was to perform real-time fault detection and isolation for X-37's electrical power system and electro-mechanical actuators. The experiment used Livingstone, a software system that performs diagnosis using a qualitative, model-based reasoning approach that searches system-wide interactions to detect and isolate failures. Two of the challenges we faced were to make this research software more efficient so that it would fit within the limited computational resources that were available to us on the X-37 spacecraft, and to modify it so that it satisfied the X-37's software safety requirements. Although the experiment is currently unfunded, the development effort had value in that it resulted in major improvements in Livingstone's efficiency and safety. This paper reviews some of the details of the modeling and integration efforts, and some of the lessons that were learned.
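
    The style of qualitative, model-based diagnosis that Livingstone automates, searching for fault assignments whose predicted behavior matches the observations, can be illustrated with a toy two-bulb circuit. This sketch is not the X-37 model; the components and their qualitative behavior are invented for illustration.

```python
from itertools import combinations

# Hypothetical circuit: one battery feeding two bulbs in parallel.
COMPONENTS = ["battery", "bulb1", "bulb2"]

def predict(faults):
    """Qualitative model: a bulb lights iff neither it nor the battery is faulty."""
    return {
        "bulb1_lit": "battery" not in faults and "bulb1" not in faults,
        "bulb2_lit": "battery" not in faults and "bulb2" not in faults,
    }

def diagnose(observed):
    """Return the smallest fault sets whose predictions match the observation."""
    for size in range(len(COMPONENTS) + 1):
        hits = [set(c) for c in combinations(COMPONENTS, size)
                if predict(set(c)) == observed]
        if hits:
            return hits
    return []

# bulb1 dark, bulb2 lit: the minimal single-fault diagnosis blames bulb1.
print(diagnose({"bulb1_lit": False, "bulb2_lit": True}))  # -> [{'bulb1'}]
```

    The real system reasons over system-wide interactions with far larger models, but the principle is the same: prefer the smallest fault set consistent with what is observed.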

  10. Simulation in pediatric anesthesiology.

    PubMed

    Fehr, James J; Honkanen, Anita; Murray, David J

    2012-10-01

    Simulation-based training, research and quality initiatives are expanding in pediatric anesthesiology just as in other medical specialties. Various modalities are available, from task trainers to standardized patients, and from computer-based simulations to mannequins. Computer-controlled mannequins can simulate pediatric vital signs with reasonable reliability; however, the fidelity of skin temperature and color change, airway reflexes, and breath and heart sounds remains rudimentary. Current pediatric mannequins are utilized in simulation centers, throughout hospitals in situ, at national meetings for continuing medical education, and in research into individual and team performance. Ongoing efforts by pediatric anesthesiologists dedicated to using simulation to improve patient care and educational delivery will result in further dissemination of this technology. Health care professionals who provide complex, subspecialty care to children require a curriculum supported by an active learning environment where skills directly relevant to pediatric care can be developed. This approach is not only the most effective method of educating adult learners, but also meets calls for education reform and offers the potential to guide efforts toward evaluating competence. Simulation addresses patient safety imperatives by providing a method for trainees to develop skills and experience with various management strategies, without risk to the health and life of a child. A curriculum that provides pediatric anesthesiologists with the range of skills required in clinical practice settings must include a relatively broad range of task-training devices and electromechanical mannequins. Challenges remain in defining the best integration of this modality into training and clinical practice to meet the needs of pediatric patients. © 2012 Blackwell Publishing Ltd.

  11. 45 CFR 263.9 - May a State avoid a penalty for failing to meet the basic MOE requirement through reasonable...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... the basic MOE requirement through reasonable cause or corrective compliance? 263.9 Section 263.9... FEDERAL TANF FUNDS What Rules Apply to a State's Maintenance of Effort? § 263.9 May a State avoid a penalty for failing to meet the basic MOE requirement through reasonable cause or corrective compliance...

  12. Modeling the Effects of Argument Length and Validity on Inductive and Deductive Reasoning

    ERIC Educational Resources Information Center

    Rotello, Caren M.; Heit, Evan

    2009-01-01

    In an effort to assess models of inductive reasoning and deductive reasoning, the authors, in 3 experiments, examined the effects of argument length and logical validity on evaluation of arguments. In Experiments 1a and 1b, participants were given either induction or deduction instructions for a common set of stimuli. Two distinct effects were…

  13. Shor's factoring algorithm and modern cryptography. An illustration of the capabilities inherent in quantum computers

    NASA Astrophysics Data System (ADS)

    Gerjuoy, Edward

    2005-06-01

    The security of messages encoded via the widely used RSA public key encryption system rests on the enormous computational effort required to find the prime factors of a large number N using classical (conventional) computers. In 1994 Peter Shor showed that for sufficiently large N, a quantum computer could perform the factoring with much less computational effort. This paper endeavors to explain, in a fashion comprehensible to the nonexpert, the RSA encryption protocol; the various quantum computer manipulations constituting the Shor algorithm; how the Shor algorithm performs the factoring; and the precise sense in which a quantum computer employing Shor's algorithm can be said to accomplish the factoring of very large numbers with less computational effort than a classical computer. It is made apparent that factoring N generally requires many successive runs of the algorithm. Our analysis reveals that the probability of achieving a successful factorization on a single run is about twice as large as commonly quoted in the literature.
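
    The classical post-processing surrounding Shor's quantum order-finding step can be sketched directly. Below, a brute-force `order` function stands in for the quantum subroutine; given the order r of a modulo N, factors follow from greatest common divisors, and a run fails (forcing the retry the abstract mentions) exactly when r is odd or yields a trivial square root of 1.

```python
from math import gcd

def factor_from_order(N, a, r):
    """Given the order r of a modulo N (the quantity Shor's quantum
    subroutine finds), try to recover a nontrivial factor of N."""
    if r % 2 != 0:
        return None                  # odd order: this run fails, retry with new a
    x = pow(a, r // 2, N)
    if x == N - 1:
        return None                  # trivial square root of 1: retry
    for y in (x - 1, x + 1):
        d = gcd(y, N)
        if 1 < d < N:
            return d
    return None

def order(a, N):
    """Classical brute-force stand-in for the quantum order-finding step
    (a must be coprime to N)."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 15, 7                         # the canonical small example: ord(7 mod 15) = 4
print(factor_from_order(N, a, order(a, N)))  # -> 3
```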

  14. Causal Reasoning in Medicine: Analysis of a Protocol.

    ERIC Educational Resources Information Center

    Kuipers, Benjamin; Kassirer, Jerome P.

    1984-01-01

    Describes the construction of a knowledge representation from the identification of the problem (nephrotic syndrome) to a running computer simulation of causal reasoning to provide a vertical slice of the construction of a cognitive model. Interactions between textbook knowledge, observations of human experts, and computational requirements are…

  15. Intrusive and Non-Intrusive Instruction in Dynamic Skill Training.

    DTIC Science & Technology

    1981-10-01

    less sensitive to the processing load imposed by the dynamic task together with instructional feedback processing than were the decision-making and...between computer-based instruction of knowledge systems and computer-based instruction of dynamic skills. There is reason to expect that the findings of research on knowledge systems...

  16. CHAMPION: Intelligent Hierarchical Reasoning Agents for Enhanced Decision Support

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hohimer, Ryan E.; Greitzer, Frank L.; Noonan, Christine F.

    2011-11-15

    We describe the design and development of an advanced reasoning framework employing semantic technologies, organized within a hierarchy of computational reasoning agents that interpret domain-specific information. Designed around a metaphor of the pattern recognition functions performed by the human neocortex, the CHAMPION reasoning framework represents a new computational modeling approach that derives invariant knowledge representations through memory-prediction belief propagation processes driven by formal ontological language specification and semantic technologies. The CHAMPION framework shows promise for enhancing complex decision making in diverse problem domains including cyber security, nonproliferation, and energy consumption analysis.

  17. Designing computer learning environments for engineering and computer science: The scaffolded knowledge integration framework

    NASA Astrophysics Data System (ADS)

    Linn, Marcia C.

    1995-06-01

    Designing effective curricula for complex topics and incorporating technological tools is an evolving process. One important way to foster effective design is to synthesize successful practices. This paper describes a framework called scaffolded knowledge integration and illustrates how it guided the design of two successful course enhancements in the field of computer science and engineering. One course enhancement, the LISP Knowledge Integration Environment, improved learning and resulted in more gender-equitable outcomes. The second course enhancement, the spatial reasoning environment, addressed spatial reasoning in an introductory engineering course. This enhancement minimized the importance of prior knowledge of spatial reasoning and helped students develop a more comprehensive repertoire of spatial reasoning strategies. Taken together, the instructional research programs reinforce the value of the scaffolded knowledge integration framework and suggest directions for future curriculum reformers.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Unat, Didem; Dubey, Anshu; Hoefler, Torsten

    The cost of data movement has always been an important concern in high performance computing (HPC) systems. It has now become the dominant factor in terms of both energy consumption and performance. Support for expression of data locality has been explored in the past, but those efforts have had only modest success in being adopted in HPC applications for various reasons. However, with the increasing complexity of the memory hierarchy and higher parallelism in emerging HPC systems, locality management has acquired a new urgency. Developers can no longer limit themselves to low-level solutions and ignore the potential for productivity and performance portability obtained by using locality abstractions. Fortunately, the trend emerging in recent literature on the topic alleviates many of the concerns that got in the way of their adoption by application developers. Data locality abstractions are available in the forms of libraries, data structures, languages and runtime systems; a common theme is increasing productivity without sacrificing performance. This paper examines these trends and identifies commonalities that can combine various locality concepts to develop a comprehensive approach to expressing and managing data locality on future large-scale high-performance computing systems.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lai, Canhai; Xu, Zhijie; Li, Tingwen

    In virtual design and scale-up of pilot-scale carbon capture systems, the coupled reactive multiphase flow problem must be solved to predict the adsorber's performance and capture efficiency under various operating conditions. This paper focuses on detailed computational fluid dynamics (CFD) modeling of a pilot-scale fluidized bed adsorber equipped with vertical cooling tubes. Multiphase Flow with Interphase eXchanges (MFiX), an open-source multiphase flow CFD solver, is used for the simulations, with custom code to simulate the chemical reactions and filtered models to capture the effect of the unresolved details in the coarser mesh, allowing simulations with reasonable accuracy and manageable computational effort. Previously developed filtered models for horizontal-cylinder drag, heat transfer, and reaction kinetics have been modified to derive the 2D filtered models representing vertical cylinders in the coarse-grid CFD simulations. The effects of the heat exchanger configuration (i.e., horizontal or vertical) on the adsorber's hydrodynamics and CO2 capture performance are then examined. The simulation results are subsequently compared and contrasted with those predicted by a one-dimensional three-region process model.

  20. Nonlinear Unsteady Aerodynamic Modeling Using Wind Tunnel and Computational Data

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Klein, Vladislav; Frink, Neal T.

    2016-01-01

    Extensions to conventional aircraft aerodynamic models are required to adequately predict responses when nonlinear unsteady flight regimes are encountered, especially at high incidence angles and under maneuvering conditions. For a number of reasons, such as loss of control, both military and civilian aircraft may extend beyond normal and benign aerodynamic flight conditions. In addition, military applications may require controlled flight beyond the normal envelope, and civilian flight may require adequate recovery or prevention methods from these adverse conditions. These requirements have led to the development of more general aerodynamic modeling methods and provided impetus for researchers to improve both techniques and the degree of collaboration between analytical and experimental research efforts. In addition to more general mathematical model structures, dynamic test methods have been designed to provide sufficient information to allow model identification. This paper summarizes research to develop a modeling methodology appropriate for modeling aircraft aerodynamics that include nonlinear unsteady behaviors using both experimental and computational test methods. This work was done at Langley Research Center, primarily under the NASA Aviation Safety Program, to address aircraft loss of control, prevention, and recovery aerodynamics.

  1. Life's attractors : understanding developmental systems through reverse engineering and in silico evolution.

    PubMed

    Jaeger, Johannes; Crombach, Anton

    2012-01-01

    We propose an approach to evolutionary systems biology which is based on reverse engineering of gene regulatory networks and in silico evolutionary simulations. We infer regulatory parameters for gene networks by fitting computational models to quantitative expression data. This allows us to characterize the regulatory structure and dynamical repertoire of evolving gene regulatory networks with a reasonable amount of experimental and computational effort. We use the resulting network models to identify those regulatory interactions that are conserved, and those that have diverged between different species. Moreover, we use the models obtained by data fitting as starting points for simulations of evolutionary transitions between species. These simulations enable us to investigate whether such transitions are random, or whether they show stereotypical series of regulatory changes which depend on the structure and dynamical repertoire of an evolving network. Finally, we present a case study, the gap gene network in dipterans (flies, midges, and mosquitoes), to illustrate the practical application of the proposed methodology, and to highlight the kind of biological insights that can be gained by this approach.
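
    The model-fitting step can be illustrated with a minimal sketch: recovering the parameters of a single, hypothetical Hill-type regulatory response from noisy synthetic expression data by least-squares grid search. The actual work fits full dynamical gene-circuit models with global optimization, so this shows only the shape of the idea.

```python
import numpy as np

def hill(x, k, n):
    """Hypothetical activating regulatory response (Hill function)."""
    return x**n / (k**n + x**n)

# Synthetic "quantitative expression data" from a known curve, plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0.1, 4.0, 40)
data = hill(x, k=1.5, n=4) + rng.normal(0.0, 0.02, x.size)

# Reverse-engineering step: choose the regulatory parameters (k, n)
# that minimize squared error against the data.
ks = np.linspace(0.5, 3.0, 51)
ns = range(1, 9)
best = min(((k, n) for k in ks for n in ns),
           key=lambda p: float(np.sum((hill(x, *p) - data) ** 2)))
print(best)
```

    With low noise, the search recovers parameters close to the generating values, which is the sense in which fitted models "characterize the regulatory structure" of a network.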

  2. Computer simulation and laboratory work in the teaching of mechanics

    NASA Astrophysics Data System (ADS)

    Borghi, L.; DeAmbrosis, A.; Mascheretti, P.; Massara, C. I.

    1987-03-01

    Analysis of measures of student success in learning the fundamentals of physics, in conjunction with the research reported in the literature, leads to the conclusion that it is difficult for undergraduates as well as high-school students to gain a reasonable understanding of elementary mechanics. Considerable effort has been devoted to identifying those factors which might prevent mechanics being successfully learnt and also to developing instructional methods which could improve its teaching (Champagne et al. 1984, Hewson 1985, McDermott 1983, Saltiel and Malgrange 1980, Whitaker 1983, White 1983). Starting from these research results and drawing on their own experience (Borghi et al. 1984, 1985), the authors arrived at the following conclusion: a strategy based on experimental activity, performed by the students themselves, together with a proper use of computer simulations, could well improve the learning of mechanics and enhance the interest in, and understanding of, topics which are difficult to treat in a traditional way. The authors describe the strategy they have designed to help high-school students learn mechanics and report how they have applied it to the particular topic of projectile motion.

  3. Eddy Current Influences on the Dynamic Behaviour of Magnetic Suspension Systems

    NASA Technical Reports Server (NTRS)

    Britcher, Colin P.; Bloodgood, Dale V.

    1998-01-01

    This report will summarize some results from a multi-year research effort at NASA Langley Research Center aimed at the development of an improved capability for practical modelling of eddy current effects in magnetic suspension systems. Particular attention is paid to large-gap systems, although generic results applicable to both large-gap and small-gap systems are presented. It is shown that eddy currents can significantly affect the dynamic behavior of magnetic suspension systems, but that these effects can be amenable to modelling and measurement. Theoretical frameworks are presented, together with comparisons of computed and experimental data particularly related to the Large Angle Magnetic Suspension Test Fixture at NASA Langley Research Center, and the Annular Suspension and Pointing System at Old Dominion University. In both cases, practical computations are capable of providing reasonable estimates of important performance-related parameters. The most difficult case is seen to be that of eddy currents in highly permeable material, due to the low skin depths. Problems associated with specification of material properties and areas for future research are discussed.
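
    The difficulty with highly permeable material follows directly from the classical skin-depth formula, delta = sqrt(2*rho / (omega*mu)): raising the permeability shrinks the depth to which eddy currents penetrate. A quick check with illustrative material values (not figures from the report):

```python
import math

def skin_depth(resistivity, mu_r, freq):
    """Classical skin depth, delta = sqrt(2*rho / (omega * mu)), in meters."""
    mu = mu_r * 4e-7 * math.pi           # absolute permeability (H/m)
    omega = 2 * math.pi * freq           # angular frequency (rad/s)
    return math.sqrt(2 * resistivity / (mu * omega))

# Illustrative values: an iron-like material (mu_r ~ 4000) at 60 Hz
# penetrates only a fraction of a millimeter.
print(skin_depth(1e-7, 4000, 60))
```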

  4. Force Field Accelerated Density Functional Theory Molecular Dynamics for Simulation of Reactive Systems at Extreme Conditions

    NASA Astrophysics Data System (ADS)

    Lindsey, Rebecca; Goldman, Nir; Fried, Laurence

    2017-06-01

    Atomistic modeling of chemistry at extreme conditions remains a challenge, despite continuing advances in computing resources and simulation tools. While first principles methods provide a powerful predictive tool, the time and length scales associated with chemistry at extreme conditions (ns and μm, respectively) largely preclude extension of such models to molecular dynamics. In this work, we develop a simulation approach that retains the accuracy of density functional theory (DFT) while decreasing computational effort by several orders of magnitude. We generate n-body descriptions for atomic interactions by mapping forces arising from short density functional theory (DFT) trajectories on to simple Chebyshev polynomial series. We examine the importance of including greater than 2-body interactions, model transferability to different state points, and discuss approaches to ensure smooth and reasonable model shape outside of the distance domain sampled by the DFT training set. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
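
    The force-mapping idea can be sketched with NumPy's Chebyshev utilities. Here a Lennard-Jones-like pair force stands in for the DFT training data, and the n-body machinery and transferability analysis of the actual method are omitted.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Stand-in for DFT training data: pair forces sampled on a distance grid.
r = np.linspace(1.0, 4.0, 60)
f_dft = 24 * (2 * r**-13 - r**-7)        # Lennard-Jones-like pair force (proxy)

# Map distances onto the Chebyshev domain [-1, 1] and fit a finite series.
s = 2 * (r - r.min()) / (r.max() - r.min()) - 1
coef = C.chebfit(s, f_dft, deg=20)

# Evaluating the fitted series costs a handful of multiply-adds per pair,
# versus a full DFT force evaluation.
f_fit = C.chebval(s, coef)
print(np.max(np.abs(f_fit - f_dft)))
```

    Outside the sampled distance domain the polynomial is unconstrained, which is why the abstract emphasizes enforcing smooth, physically reasonable model shape beyond the training set.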

  5. Integrating Cache Performance Modeling and Tuning Support in Parallelization Tools

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    With the resurgence of distributed shared memory (DSM) systems based on cache-coherent Non Uniform Memory Access (ccNUMA) architectures and increasing disparity between memory and processors speeds, data locality overheads are becoming the greatest bottlenecks in the way of realizing potential high performance of these systems. While parallelization tools and compilers facilitate the users in porting their sequential applications to a DSM system, a lot of time and effort is needed to tune the memory performance of these applications to achieve reasonable speedup. In this paper, we show that integrating cache performance modeling and tuning support within a parallelization environment can alleviate this problem. The Cache Performance Modeling and Prediction Tool (CPMP), employs trace-driven simulation techniques without the overhead of generating and managing detailed address traces. CPMP predicts the cache performance impact of source code level "what-if" modifications in a program to assist a user in the tuning process. CPMP is built on top of a customized version of the Computer Aided Parallelization Tools (CAPTools) environment. Finally, we demonstrate how CPMP can be applied to tune a real Computational Fluid Dynamics (CFD) application.
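
    The kind of lightweight, trace-driven estimate a tool like CPMP relies on can be illustrated with a direct-mapped cache model. This is a generic sketch, not CPMP's machinery (which specifically avoids generating and managing full address traces):

```python
def hit_rate(trace, lines=64, line_bytes=64):
    """Trace-driven simulation of a direct-mapped cache: estimate the
    hit rate of an address trace without modeling the whole machine."""
    tags = [None] * lines
    hits = 0
    for addr in trace:
        block = addr // line_bytes           # which cache line the address maps to
        idx, tag = block % lines, block // lines
        if tags[idx] == tag:
            hits += 1
        else:
            tags[idx] = tag                  # miss: fill the line
    return hits / len(trace)

# Unit-stride sweep reuses each 64-byte line for 8 consecutive doubles...
stride_1 = [8 * i for i in range(4096)]
# ...while a large-stride sweep touches a new line on every access.
stride_64 = [512 * i for i in range(4096)]
print(hit_rate(stride_1), hit_rate(stride_64))  # -> 0.875 0.0
```

    Comparing such predictions for "what-if" loop or layout changes is how a tool can guide tuning before any code is actually rewritten.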

  6. Proposed Directions for Research in Computer-Based Education.

    ERIC Educational Resources Information Center

    Waugh, Michael L.

    Several directions for potential research efforts in the field of computer-based education (CBE) are discussed. (For the purposes of this paper, CBE is defined as any use of computers to promote learning with no intended inference as to the specific nature or organization of the educational application under discussion.) Efforts should be directed…

  7. Reprocessing Multiyear GPS Data from Continuously Operating Reference Stations on Cloud Computing Platform

    NASA Astrophysics Data System (ADS)

    Yoon, S.

    2016-12-01

    To define a geodetic reference frame using GPS data collected by the Continuously Operating Reference Stations (CORS) network, historical GPS data needs to be reprocessed regularly. Reprocessing GPS data collected by up to 2,000 CORS sites over the last two decades requires substantial computational resources. At the National Geodetic Survey (NGS), one reprocessing was completed in 2011, and the second reprocessing is currently under way. For the first reprocessing effort, in-house computing resources were utilized; in the current second effort, an outsourced cloud computing platform is being used. In this presentation, the data processing strategy at NGS is outlined, as well as the effort to parallelize the data processing procedure in order to maximize the benefit of cloud computing. The time and cost savings realized by the cloud computing approach will also be discussed.
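
    The parallelization pattern, independent per-station, per-day jobs farmed out to rented cores, is embarrassingly parallel and maps directly onto a worker pool. A minimal sketch, in which the station names and the job body are placeholders for the real GPS processing engine:

```python
from multiprocessing import Pool

def process(station_day):
    """Hypothetical per-station-day job; the real pipeline would invoke
    the GPS processing engine on that day's observation file."""
    station, day = station_day
    return station, day, "ok"

# One year of daily files for a toy three-station network (real
# reprocessing spans ~2,000 stations and two decades).
jobs = [(s, d) for s in ("ALGO", "GODE", "NLIB") for d in range(1, 366)]

if __name__ == "__main__":
    with Pool(4) as pool:                # one worker per rented core
        results = pool.map(process, jobs)
    print(len(results))                  # -> 1095
```

    Because the jobs share no state, throughput scales with the number of cloud workers rented, which is what makes the cost/time trade-off discussed above tunable.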

  8. 48 CFR 952.227-14 - Rights in data-general. (DOE coverage-alternates VI and VII)

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... data regarded as limited rights data or restricted computer software to the Government and third parties at reasonable royalties upon request by the Department of Energy. (k) Contractor licensing. Except... rights data or restricted computer software on terms and conditions reasonable under the circumstances...

  9. 48 CFR 952.227-14 - Rights in data-general. (DOE coverage-alternates VI and VII)

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... data regarded as limited rights data or restricted computer software to the Government and third parties at reasonable royalties upon request by the Department of Energy. (k) Contractor licensing. Except... rights data or restricted computer software on terms and conditions reasonable under the circumstances...

  10. Effects of Computer Algebra System (CAS) with Metacognitive Training on Mathematical Reasoning.

    ERIC Educational Resources Information Center

    Kramarski, Bracha; Hirsch, Chaya

    2003-01-01

    Describes a study that investigated the differential effects of Computer Algebra Systems (CAS) and metacognitive training (META) on mathematical reasoning. Participants were 83 Israeli eighth-grade students. Results showed that CAS embedded within META significantly outperformed the META and CAS alone conditions, which in turn significantly…

  11. Components of Understanding in Proportional Reasoning: A Fuzzy Set Representation of Developmental Progressions.

    ERIC Educational Resources Information Center

    Moore, Colleen F.; And Others

    1991-01-01

    Examined the development of proportional reasoning by means of a temperature mixture task. Results show the importance of distinguishing between intuitive knowledge and formal computational knowledge of proportional concepts. Provides a new perspective on the relation of intuitive and computational knowledge during development. (GLR)

  12. 20 CFR 1002.226 - If the employee has a disability that was incurred in, or aggravated during, the period of...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... incurred in, or aggravated during, the period of service, what efforts must the employer make to help him... aggravated during, the period of service, what efforts must the employer make to help him or her become... reemployment position regardless of any disability. The employer must make reasonable efforts to help the...

  13. 20 CFR 1002.226 - If the employee has a disability that was incurred in, or aggravated during, the period of...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... incurred in, or aggravated during, the period of service, what efforts must the employer make to help him... aggravated during, the period of service, what efforts must the employer make to help him or her become... reemployment position regardless of any disability. The employer must make reasonable efforts to help the...

  14. 47 CFR 73.2080 - Equal employment opportunities (EEO).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... or positions, but will be expected to make reasonable, good faith efforts to recruit applicants who... vacancy sufficient in its reasonable, good faith judgment to widely disseminate information concerning the... designed to assist students interested in pursuing a career in broadcasting; (viii) Establishment of...

  15. 14 CFR 16.21 - Pre-complaint resolution.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... complaint certifies that substantial and reasonable good faith efforts to resolve the disputed matter... person directly and substantially affected by the alleged noncompliance shall initiate and engage in good faith efforts to resolve the disputed matter informally with those individuals or entities believed...

  16. How localized is ``local?'' Efficiency vs. accuracy of O(N) domain decomposition in local orbital based all-electron electronic structure theory

    NASA Astrophysics Data System (ADS)

    Havu, Ville; Blum, Volker; Scheffler, Matthias

    2007-03-01

    Numeric atom-centered local orbitals (NAOs) are efficient basis sets for all-electron electronic structure theory. The locality of NAOs can be exploited to render (in principle) all operations of the self-consistency cycle O(N). This is straightforward for 3D integrals using domain decomposition into spatially close subsets of integration points, enabling critical computational savings that are effective from ~tens of atoms (no significant overhead for smaller systems) and make large systems (100s of atoms) computationally feasible. Using a new all-electron NAO-based code,^1 we investigate the quantitative impact of exploiting this locality on two distinct classes of systems: large light-element molecules [alanine-based polypeptide chains (Ala)n], and compact transition metal clusters. Strict NAO locality is achieved by imposing a cutoff potential with an onset radius rc, and exploited by appropriately shaped integration domains (subsets of integration points). Conventional tight cutoffs rc <= 3 Å have no measurable accuracy impact in (Ala)n, but introduce inaccuracies of 20-30 meV/atom in Cun. The domain shape impacts the computational effort by only 10-20% for reasonable rc. ^1 V. Blum, R. Gehrke, P. Havu, V. Havu, M. Scheffler, The FHI Ab Initio Molecular Simulations (aims) Project, Fritz-Haber-Institut, Berlin (2006).
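
    The O(N) locality trick, comparing each integration point (or atom) only against those in nearby spatial cells of width rc, can be sketched with a generic 2D cell list. This illustrates the domain-decomposition idea only; it is not the FHI-aims implementation.

```python
import numpy as np

def neighbor_counts(points, rc):
    """O(N) neighbor search via cell lists: with cutoff radius rc, each
    point is compared only against points in its own and adjacent cells."""
    cells = {}
    keys = np.floor(points / rc).astype(int)
    for i, k in enumerate(map(tuple, keys)):
        cells.setdefault(k, []).append(i)
    counts = np.zeros(len(points), dtype=int)
    for i, (p, k) in enumerate(zip(points, keys)):
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for j in cells.get((k[0] + dx, k[1] + dy), ()):
                    if j != i and np.linalg.norm(points[j] - p) < rc:
                        counts[i] += 1
    return counts

pts = np.array([[0.0, 0.0], [0.5, 0.0], [5.0, 5.0]])
print(neighbor_counts(pts, rc=1.0))  # -> [1 1 0]
```

    Because each point touches a bounded number of cells, total work grows linearly with the number of points instead of quadratically, which is the source of the "no significant overhead for smaller systems" behavior.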

  17. A Comparison of PETSC Library and HPF Implementations of an Archetypal PDE Computation

    NASA Technical Reports Server (NTRS)

    Hayder, M. Ehtesham; Keyes, David E.; Mehrotra, Piyush

    1997-01-01

    Two paradigms for distributed-memory parallel computation that free the application programmer from the details of message passing are compared for an archetypal structured scientific computation, a nonlinear, structured-grid partial differential equation boundary value problem, using the same algorithm on the same hardware. Both paradigms, parallel libraries represented by Argonne's PETSc, and parallel languages represented by the Portland Group's HPF, are found to be easy to use for this problem class, and both are reasonably effective in exploiting concurrency after a short learning curve. The level of involvement required of the application programmer under either paradigm includes specification of the data partitioning (corresponding to a geometrically simple decomposition of the domain of the PDE). Programming in SPMD style for the PETSc library requires writing the routines that discretize the PDE and its Jacobian, managing subdomain-to-processor mappings (affine global-to-local index mappings), and interfacing to library solver routines. Programming for HPF requires a complete sequential implementation of the same algorithm, introducing concurrency through subdomain blocking (an effort similar to the index mapping), and modest experimentation with rewriting loops to elucidate to the compiler the latent concurrency. Correctness and scalability are cross-validated on up to 32 nodes of an IBM SP2.
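
    The subdomain bookkeeping that the library-based (SPMD) approach manages by hand reduces, in 1D, to an affine global-to-local index map per process. A sketch with illustrative names (not PETSc's actual API):

```python
def block_partition(n_global, n_procs, rank):
    """Affine global-to-local index mapping for a 1D block decomposition:
    each process owns a contiguous slice [lo, lo + n_local)."""
    base, extra = divmod(n_global, n_procs)
    lo = rank * base + min(rank, extra)          # first global index owned
    n_local = base + (1 if rank < extra else 0)  # leftover points go to low ranks
    to_local = lambda g: g - lo                  # valid for lo <= g < lo + n_local
    return lo, n_local, to_local

# 10 grid points over 3 processes: local sizes 4, 3, 3.
for rank in range(3):
    lo, n, to_local = block_partition(10, 3, rank)
    print(rank, lo, n, to_local(lo))             # local index of first owned point is 0
```

    The HPF route expresses the same decomposition declaratively through block distribution directives and leaves this index arithmetic to the compiler.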

  18. Efforts to reduce mortality to hydroelectric turbine-passed fish: locating and quantifying damaging shear stresses.

    PubMed

    Cada, Glenn; Loar, James; Garrison, Laura; Fisher, Richard; Neitzel, Duane

    2006-06-01

    Severe fluid forces are believed to be a source of injury and mortality to fish that pass through hydroelectric turbines. A process is described by which laboratory bioassays, computational fluid dynamics models, and field studies can be integrated to evaluate the significance of fluid shear stresses that occur in a turbine. Areas containing potentially lethal shear stresses were identified near the stay vanes and wicket gates, runner, and in the draft tube of a large Kaplan turbine. However, under typical operating conditions, computational models estimated that these dangerous areas comprise less than 2% of the flow path through the modeled turbine. The predicted volumes of the damaging shear stress zones did not correlate well with observed fish mortality at a field installation of this turbine, which ranged from less than 1% to nearly 12%. Possible reasons for the poor correlation are discussed. Computational modeling is necessary to develop an understanding of the role of particular fish injury mechanisms, to compare their effects with those of other sources of injury, and to minimize the trial and error previously needed to mitigate those effects. The process we describe is being used to modify the design of hydroelectric turbines to improve fish passage survival.
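
    The "less than 2% of the flow path" figure is the kind of quantity obtained by thresholding the CFD solution cell by cell: sum the volume of cells whose shear stress exceeds the bioassay-derived injury threshold and divide by the total flow-path volume. A sketch with made-up numbers (the threshold and cell values are illustrative, not the study's data):

```python
import numpy as np

# Hypothetical CFD output: shear stress (Pa) and volume per cell.
shear = np.array([120.0, 850.0, 40.0, 1200.0, 300.0])
volume = np.array([1.0, 0.5, 2.0, 0.25, 1.0])
lethal = 700.0      # assumed injury threshold from laboratory bioassays

frac = volume[shear > lethal].sum() / volume.sum()
print(f"{100 * frac:.1f}% of flow-path volume exceeds the threshold")
```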

  19. Efficacy and Mediation of a Theory-Based Physical Activity Intervention for African American Men Who Have Sex with Men: A Randomized Controlled Trial.

    PubMed

    Zhang, Jingwen; Jemmott, John B; O'Leary, Ann; Stevens, Robin; Jemmott, Loretta Sweet; Icard, Larry D; Hsu, Janet; Rutledge, Scott E

    2017-02-01

    Few trials have tested physical-activity interventions among sexual minorities, including African American men who have sex with men (MSM). We examined the efficacy and mediation of the Being Responsible for Ourselves (BRO) physical-activity intervention among African American MSM. African American MSM were randomized to the physical-activity intervention consisting of three 90-min one-on-one sessions or an attention-matched control intervention and completed pre-intervention, immediately post-intervention, and 6- and 12-month post-intervention audio computer-based surveys. Of the 595 participants, 503 completed the 12-month follow-up. Generalized estimating equation models revealed that the intervention increased self-reported physical activity compared with the control intervention, adjusted for pre-intervention physical activity. Mediation analyses suggested that the intervention increased reasoned action approach variables, subjective norm and self-efficacy, increasing intention immediately post-intervention, which increased physical activity during the follow-up period. Interventions targeting reasoned action approach variables may contribute to efforts to increase African American MSM's physical activity. The trial was registered with the ClinicalTrials.gov Identifier NCT02561286.

  20. Technology and Sexuality--What's the Connection? Addressing Youth Sexualities in Efforts to Increase Girls' Participation in Computing

    ERIC Educational Resources Information Center

    Ashcraft, Catherine

    2015-01-01

    To date, girls and women are significantly underrepresented in computer science and technology. Concerns about this underrepresentation have sparked a wealth of educational efforts to promote girls' participation in computing, but these programs have demonstrated limited impact on reversing current trends. This paper argues that this is, in part,…

  1. The achievement of spacecraft autonomy through the thematic application of multiple cooperating intelligent agents

    NASA Technical Reports Server (NTRS)

    Rossomando, Philip J.

    1992-01-01

    A description is given of UNICORN, a prototype system developed for the purpose of investigating artificial intelligence (AI) concepts supporting spacecraft autonomy. UNICORN employs thematic reasoning, of the type first described by Roger Schank of Northwestern University, to allow the context-sensitive control of multiple intelligent agents within a blackboard-based environment. In its domain of application, UNICORN demonstrates the ability to reason teleologically with focused knowledge. Also presented are some of the lessons learned as a result of this effort. These lessons apply to any effort wherein system-level autonomy is the objective.

  2. Computer simulation and performance assessment of the packet-data service of the Aeronautical Mobile Satellite Service (AMSS)

    NASA Technical Reports Server (NTRS)

    Ferzali, Wassim; Zacharakis, Vassilis; Upadhyay, Triveni; Weed, Dennis; Burke, Gregory

    1995-01-01

    The ICAO Aeronautical Mobile Communications Panel (AMCP) completed the drafting of the Aeronautical Mobile Satellite Service (AMSS) Standards and Recommended Practices (SARPs) and the associated Guidance Material and submitted these documents to the ICAO Air Navigation Commission (ANC) for ratification in May 1994. This effort encompassed an extensive, multi-national SARPs validation. As part of this activity, the US Federal Aviation Administration (FAA) sponsored an effort to validate the SARPs via computer simulation. This paper provides a description of this effort. Specifically, it describes: (1) the approach selected for the creation of a high-fidelity AMSS computer model; (2) the test traffic generation scenarios; and (3) the resultant AMSS performance assessment. More recently, the AMSS computer model was also used to provide AMSS performance statistics in support of the RTCA standardization activities. This paper describes this effort as well.

  3. A general concept for consistent documentation of computational analyses

    PubMed Central

    Müller, Fabian; Nordström, Karl; Lengauer, Thomas; Schulz, Marcel H.

    2015-01-01

    The ever-growing amount of data in the field of life sciences demands standardized ways of high-throughput computational analysis. This standardization requires a thorough documentation of each step in the computational analysis to enable researchers to understand and reproduce the results. However, due to the heterogeneity in software setups and the high rate of change during tool development, reproducibility is hard to achieve. One reason is that there is no common agreement in the research community on how to document computational studies. In many cases, simple flat files or other unstructured text documents are provided by researchers as documentation, which are often missing software dependencies, versions and sufficient documentation to understand the workflow and parameter settings. As a solution we suggest a simple and modest approach for documenting and verifying computational analysis pipelines. We propose a two-part scheme that defines a computational analysis using a Process and an Analysis metadata document, which jointly describe all necessary details to reproduce the results. In this design we separate the metadata specifying the process from the metadata describing an actual analysis run, thereby reducing the effort of manual documentation to an absolute minimum. Our approach is independent of a specific software environment, results in human readable XML documents that can easily be shared with other researchers and allows an automated validation to ensure consistency of the metadata. Because our approach has been designed with little to no assumptions concerning the workflow of an analysis, we expect it to be applicable in a wide range of computational research fields. Database URL: http://deep.mpi-inf.mpg.de/DAC/cmds/pub/pyvalid.zip PMID:26055099
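    The two-part scheme, a Process document describing the pipeline and an Analysis document describing one concrete run, can be sketched as follows. The element and attribute names here are hypothetical stand-ins, not the authors' actual schema; the point is only the separation of process metadata from run metadata and the automated consistency check the abstract mentions.

```python
# Minimal sketch: a Process XML document declares the pipeline and its
# parameters; an Analysis XML document records one run and must reference
# the process and use only declared parameters.

import xml.etree.ElementTree as ET

process_xml = """<process id="align-v1">
  <step tool="aligner" version="2.1"/>
  <parameter name="threads"/>
</process>"""

analysis_xml = """<analysis process="align-v1">
  <input>sample_42.fastq</input>
  <setting name="threads" value="8"/>
</analysis>"""

def validate(process_doc, analysis_doc):
    """Automated consistency check: the run references an existing process
    and sets only parameters that the process declares."""
    proc = ET.fromstring(process_doc)
    ana = ET.fromstring(analysis_doc)
    if ana.get("process") != proc.get("id"):
        return False
    declared = {p.get("name") for p in proc.findall("parameter")}
    used = {s.get("name") for s in ana.findall("setting")}
    return used <= declared

ok = validate(process_xml, analysis_xml)
```

    Because the run document only records inputs and parameter values, the manual documentation burden per analysis stays small, which is the design goal the abstract states.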

  4. Elaborated Corrective Feedback and the Acquisition of Reasoning Skills: A Study of Computer-Assisted Instruction.

    ERIC Educational Resources Information Center

    Collins, Maria; And Others

    1987-01-01

    Thirteen learning disabled and 15 remedial high school students were taught reasoning skills using computer-assisted instruction and were given basic or elaborated corrections. Criterion-referenced test scores were significantly higher for the elaborated-corrections treatment on the post- and maintenance tests and on a transfer test assessing…

  5. Proportional Reasoning in the Laboratory: An Intervention Study in Vocational Education

    ERIC Educational Resources Information Center

    Bakker, Arthur; Groenveld, Djonie; Wijers, Monica; Akkerman, Sanne F.; Gravemeijer, Koeno P. E.

    2014-01-01

    Based on insights into the nature of vocational mathematical knowledge, we designed a computer tool with which students in laboratory schools at senior secondary vocational school level could develop a better proficiency in the proportional reasoning involved in dilution. We did so because we had identified computations of concentrations of…
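    The proportional reasoning involved in dilution rests on a single invariant: the amount of solute is conserved, so C1·V1 = C2·V2. A minimal helper, with made-up example values:

```python
# Dilution arithmetic: concentration c1 in volume v1, diluted to volume v2.
# Solute conservation gives c2 = c1 * v1 / v2.

def diluted_concentration(c1, v1, v2):
    """Concentration after diluting volume v1 at concentration c1 up to v2."""
    return c1 * v1 / v2

# Hypothetical example: 25 mL at 0.8 mol/L diluted to 100 mL.
c2 = diluted_concentration(c1=0.8, v1=25.0, v2=100.0)   # -> 0.2 mol/L
```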

  6. Barriers to Decentralized Teacher Education.

    ERIC Educational Resources Information Center

    Stuhr, Christian

    In an effort to meet the demand for off-campus postsecondary education at the degree, diploma, or certificate levels, this report examines the barriers against and reasons for offering decentralized teacher education programs from universities to colleges in rural Canadian provinces. Several reasons exist for the demand for off-campus…

  7. Reasons Parents Exempt Children from Receiving Immunizations

    ERIC Educational Resources Information Center

    Luthy, Karlen E.; Beckstrand, Renea L.; Callister, Lynn C.; Cahoon, Spencer

    2012-01-01

    School nurses are on the front lines of educational efforts to promote childhood vaccinations. However, some parents still choose to exempt their children from receiving vaccinations for personal reasons. Studying the beliefs of parents who exempt vaccinations allows health care workers, including school nurses, to better understand parental…

  8. Research in Hypersonic Airbreathing Propulsion at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Kumar, Ajay; Drummond, J. Philip; McClinton, Charles R.; Hunt, James L.

    2001-01-01

    The NASA Langley Research Center has been conducting research for over four decades to develop technology for an airbreathing-propelled vehicle. Several other organizations within the United States have also been involved in this endeavor. Even though significant progress has been made over this period, a hypersonic airbreathing vehicle has not yet been realized due to low technology maturity. One of the major reasons for the slow progress in technology development has been the low level and cyclic nature of funding. The paper provides a brief historical overview of research in hypersonic airbreathing technology and then discusses current efforts at NASA Langley to develop various analytical, computational, and experimental design tools and their application in the development of future hypersonic airbreathing vehicles. The main focus of this paper is on the hypersonic airbreathing propulsion technology.

  9. The interaction of representation and reasoning.

    PubMed

    Bundy, Alan

    2013-09-08

    Automated reasoning is an enabling technology for many applications of informatics. These applications include verifying that a computer program meets its specification; enabling a robot to form a plan to achieve a task; and answering questions by combining information from diverse sources, e.g. on the Internet. How is automated reasoning possible? Firstly, knowledge of a domain must be stored in a computer, usually in the form of logical formulae. This knowledge might, for instance, have been entered manually, retrieved from the Internet or perceived in the environment via sensors, such as cameras. Secondly, rules of inference are applied to old knowledge to derive new knowledge. Automated reasoning techniques have been adapted from logic, a branch of mathematics that was originally designed to formalize the reasoning of humans, especially mathematicians. My special interest is in the way that representation and reasoning interact. Successful reasoning depends both on appropriate representation of knowledge and on successful methods of reasoning. Failures of reasoning can suggest changes of representation. This process of representational change can also be automated. We will illustrate the automation of representational change by drawing on recent work in my research group.
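    The two steps the abstract names, store knowledge as logical formulae, then apply inference rules to derive new knowledge, can be sketched as forward chaining over propositional Horn clauses. This is a textbook toy, not a reconstruction of the author's systems; the facts and rules are invented for illustration.

```python
# Forward chaining: repeatedly fire any rule whose premises are all known,
# adding its conclusion, until no rule adds anything new (a fixpoint).

def forward_chain(facts, rules):
    """rules: list of (premise_set, conclusion). Returns all derivable facts."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in known and premises <= known:
                known.add(conclusion)
                changed = True
    return known

# Hypothetical robot-planning flavoured knowledge base:
rules = [({"robot_at_door", "door_open"}, "can_enter"),
         ({"can_enter", "task_inside"}, "plan_exists")]
derived = forward_chain({"robot_at_door", "door_open", "task_inside"}, rules)
# "can_enter" and then "plan_exists" are derived from the initial facts.
```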

  10. Reasons for limiting drinking in an HIV primary care sample.

    PubMed

    Elliott, Jennifer C; Aharonovich, Efrat; Hasin, Deborah S

    2014-06-01

    Heavy drinking among individuals with HIV is associated with major health concerns (liver disease, medication nonadherence, immune functioning), but little is known about cognitive-motivational factors involved in alcohol consumption in this population, particularly reasons for limiting drinking. Urban HIV primary care patients (N = 254; 78.0% male; 94.5% African American or Hispanic) in a randomized trial of brief drinking-reduction interventions reported on reasons for limiting drinking, alcohol consumption, and alcohol dependence symptoms prior to intervention. Exploratory factor analysis indicated 3 main domains of reasons for limiting drinking: social reasons (e.g., responsibility to family), lifestyle reasons (e.g., religious/moral reasons), and impairment concerns (e.g., hangovers). These factors evidenced good internal consistency (αs = 0.76 to 0.86). Higher scores on social reasons for limiting drinking were associated with lower typical quantity, maximum quantity, and binge frequency (ps < 0.01), and higher scores on lifestyle reasons were associated with lower maximum quantity, binge frequency, and intoxication frequency (ps < 0.01). In contrast, higher scores on impairment concerns were associated with more frequent drinking and intoxication, and higher risk of alcohol dependence (ps < 0.05), likely because dependent drinkers are more familiar with alcohol-induced impairment. The current study is the first to explore reasons for limiting drinking among individuals with HIV and how these reasons relate to alcohol involvement. This study yields a scale that can be used to assess reasons for limiting drinking among HIV-positive drinkers and provides information that can be used to enhance interventions with this population. 
Discussing social and lifestyle reasons for limiting drinking among less extreme drinkers may support and validate these patients' efforts to limit engagement in heavy drinking; discussion of impairment reasons for limiting drinking may be a way to engage dependent drinkers in efforts to decrease their alcohol consumption. Copyright © 2014 by the Research Society on Alcoholism.
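    The internal-consistency figures the abstract reports (αs = 0.76 to 0.86) are Cronbach's alpha values. A standard alpha computation on toy item scores, where the data below are invented, not the study's:

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / var(total)),
# where k is the number of items and "total" is each respondent's item sum.
from statistics import pvariance

def cronbach_alpha(items):
    """items: one list of scores per item, aligned across the same respondents."""
    k = len(items)
    totals = [sum(vals) for vals in zip(*items)]       # per-respondent sums
    item_var = sum(pvariance(col) for col in items)    # sum of item variances
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Three items rated by five respondents (hypothetical data):
items = [[4, 5, 3, 4, 2],
         [4, 4, 3, 5, 2],
         [5, 5, 2, 4, 3]]
alpha = cronbach_alpha(items)   # items move together, so alpha is high
```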

  11. Reasons for limiting drinking in an HIV primary care sample

    PubMed Central

    Elliott, Jennifer C.; Aharonovich, Efrat; Hasin, Deborah

    2015-01-01

    BACKGROUND Heavy drinking among individuals with HIV is associated with major health concerns (liver disease, medication nonadherence, immune functioning), but little is known about cognitive-motivational factors involved in alcohol consumption in this population, particularly reasons for limiting drinking. METHODS Urban HIV primary care patients (N=254; 78.0% male; 94.5% African American or Hispanic) in a randomized trial of brief drinking-reduction interventions reported on reasons for limiting drinking, alcohol consumption, and alcohol dependence symptoms prior to intervention. RESULTS Exploratory factor analysis indicated three main domains of reasons for limiting drinking: social reasons (e.g., responsibility to family), lifestyle reasons (e.g., religious/moral reasons), and impairment concerns (e.g., hangovers). These factors evidenced good internal consistency (αs=0.76–0.86). Higher scores on social reasons for limiting drinking were associated with lower typical quantity, maximum quantity, and binge frequency (ps<0.01), and higher scores on lifestyle reasons were associated with lower maximum quantity, binge frequency, and intoxication frequency (ps<0.01). In contrast, higher scores on impairment concerns were associated with more frequent drinking and intoxication, and higher risk of alcohol dependence (ps<0.05), likely because dependent drinkers are more familiar with alcohol-induced impairment. CONCLUSIONS The current study is the first to explore reasons for limiting drinking among individuals with HIV, and how these reasons relate to alcohol involvement. This study yields a scale that can be used to assess reasons for limiting drinking among HIV-positive drinkers, and provides information that can be used to enhance interventions with this population. 
Discussing social and lifestyle reasons for limiting drinking among less extreme drinkers may support and validate these patients’ efforts to limit engagement in heavy drinking; discussion of impairment reasons for limiting drinking may be a way to engage dependent drinkers in efforts to decrease their alcohol consumption. PMID:24796381

  12. Save medical personnel's time by improved user interfaces.

    PubMed

    Kindler, H

    1997-01-01

    Common objectives in the industrial countries are the improvement of quality of care, clinical effectiveness, and cost control. Cost control, in particular, has been addressed through the introduction of case-mix systems for reimbursement by social-security institutions. More data are required to enable quality improvement and increased clinical effectiveness, and for juridical reasons. At first glance, this documentation effort seems at odds with cost reduction. However, integrated services for resource management based on better documentation should help to reduce costs. The clerical effort of documentation should be decreased by providing a co-operative working environment for healthcare professionals that applies sophisticated human-computer interface technology. Additional services, e.g., automatic report generation, increase the efficiency of healthcare personnel. Modelling the medical work flow is an essential prerequisite for integrated resource management services and for co-operative user interfaces. A user interface aware of the work flow provides intelligent assistance by offering the appropriate tools at the right moment. Nowadays there is a trend towards client/server systems with relational or object-oriented databases as the repository. The work flows used for controlling purposes and to steer the user interfaces must be represented in the repository.

  13. Implementation theory of distortion-invariant pattern recognition for optical and digital signal processing systems

    NASA Astrophysics Data System (ADS)

    Lhamon, Michael Earl

    A pattern recognition system which uses complex correlation filter banks requires proportionally more computational effort than single real-valued filters. This introduces an increased computational burden, but it also exposes a higher level of parallelism that common computing platforms fail to exploit. As a result, we consider algorithm mapping to both optical and digital processors. For digital implementation, we develop computationally efficient pattern recognition algorithms, referred to as vector inner product operators, that require less computational effort than traditional fast Fourier methods. These algorithms do not need correlation, and they map readily onto parallel digital architectures, which in turn suggests new architectures for optical processors. These filters exploit circulant-symmetric matrix structures of the training set data representing a variety of distortions. By using the same mathematical basis as the vector inner product operations, we are able to extend the capabilities of more traditional correlation filtering to what we refer to as "Super Images". These "Super Images" are used to morphologically transform a complicated input scene into a predetermined dot pattern. The orientation of the dot pattern is related to the rotational distortion of the object of interest. The optical implementation of "Super Images" yields the feature reduction necessary for using other techniques, such as artificial neural networks. We propose a parallel digital signal processor architecture based on specific pattern recognition algorithms but general enough to be applicable to other similar problems. Such an architecture is classified as a data flow architecture. Instead of mapping an algorithm to an architecture, we propose mapping the DSP architecture to a class of pattern recognition algorithms. Today's optical processing systems have difficulties implementing full complex filter structures.
Typically, optical systems (like 4f correlators) are limited to phase-only implementation, with lower detection performance than full complex electronic systems. Our study includes pseudo-random pixel encoding techniques for approximating full complex filtering. Optical filter bank implementation is possible, and it has the advantage of time-averaging the entire filter bank at real-time rates. Time-averaged optical filtering is computationally comparable to billions of digital operations per second. For this reason, we believe future trends in high-speed pattern recognition will involve hybrid architectures of both optical and DSP elements.
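    One simple reading of the vector-inner-product idea is scoring an input against a bank of distortion templates by normalized inner products, with no explicit FFT-based correlation. This is a hedged illustration only: the templates, the input, and the scoring rule are toy constructions, not the author's actual filters or training-set structure.

```python
# Toy filter bank: pick the distortion template (here, a "rotation" label)
# whose normalized inner product with the input vector is largest.
from math import sqrt

def normalized_inner(u, v):
    """Cosine-style similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

def best_match(x, bank):
    """Label of the template scoring highest against input x."""
    return max(bank, key=lambda label: normalized_inner(x, bank[label]))

# A tiny bank of hypothetical rotation templates:
bank = {"0deg":  [1.0, 0.0, 1.0, 0.0],
        "90deg": [0.0, 1.0, 0.0, 1.0]}
label = best_match([0.9, 0.1, 1.1, 0.0], bank)   # closest to the 0deg template
```

    Each score is a single inner product per template, which is what makes the approach map so directly onto parallel digital (or optical) hardware.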

  14. Computer-Based Assessment of School Readiness and Early Reasoning

    ERIC Educational Resources Information Center

    Csapó, Beno; Molnár, Gyöngyvér; Nagy, József

    2014-01-01

    This study explores the potential of using online tests for the assessment of school readiness and for monitoring early reasoning. Four tests of a face-to-face-administered school readiness test battery (speech sound discrimination, relational reasoning, counting and basic numeracy, and deductive reasoning) and a paper-and-pencil inductive…

  15. Assessing Clinical Reasoning (ASCLIRE): Instrument Development and Validation

    ERIC Educational Resources Information Center

    Kunina-Habenicht, Olga; Hautz, Wolf E.; Knigge, Michel; Spies, Claudia; Ahlers, Olaf

    2015-01-01

    Clinical reasoning is an essential competency in medical education. This study aimed at developing and validating a test to assess diagnostic accuracy, collected information, and diagnostic decision time in clinical reasoning. A norm-referenced computer-based test for the assessment of clinical reasoning (ASCLIRE) was developed, integrating the…

  16. TEMPEST: A three-dimensional time-dependent computer program for hydrothermal analysis: Volume 2, Assessment and verification results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eyler, L L; Trent, D S; Budden, M J

    During the course of the TEMPEST computer code development a concurrent effort was conducted to assess the code's performance and the validity of computed results. The results of this work are presented in this document. The principal objective of this effort was to assure the code's computational correctness for a wide range of hydrothermal phenomena typical of fast breeder reactor application. 47 refs., 94 figs., 6 tabs.

  17. Democratic Education: An (Im)Possibility that yet Remains to Come

    ERIC Educational Resources Information Center

    Friedrich, Daniel; Jaastad, Bryn; Popkewitz, Thomas S.

    2010-01-01

    Efforts to develop democratic schools have moved along particular rules and standards of "reasoning" even when expressed through different ideological and paradigmatic lines. From attempts to make a democratic education to critical pedagogy, different approaches overlap in their historical construction of the reason of schooling: designing society…

  18. Using computer aided case based reasoning to support clinical reasoning in community occupational therapy.

    PubMed

    Taylor, Bruce; Robertson, David; Wiratunga, Nirmalie; Craw, Susan; Mitchell, Dawn; Stewart, Elaine

    2007-08-01

    Community occupational therapists have long been involved in the provision of environmental control systems. Diverse electronic technologies with the potential to improve the health and quality of life of selected clients have developed rapidly in recent years. Occupational therapists employ clinical reasoning in order to determine the most appropriate technology to meet the needs of individual clients. This paper describes a number of the drivers that may increase the adoption of information and communication technologies in the occupational therapy profession. It outlines case based reasoning as understood in the domains of expert systems and knowledge management and presents the preliminary results of an ongoing investigation into the potential of a prototype computer aided case based reasoning tool to support the clinical reasoning of community occupational therapists in the process of assisting clients to choose home electronic assistive or smart house technology.
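    Case-based reasoning, as understood in the expert-systems literature the abstract cites, retrieves the stored case most similar to a new problem and reuses its solution. A minimal retrieval sketch with hypothetical client-assessment features, not the authors' prototype tool:

```python
# Minimal CBR retrieval: similarity = fraction of matching attribute values,
# retrieval = stored case with the highest similarity to the new problem.

def similarity(a, b):
    """Fraction of shared attribute values between two feature dicts."""
    keys = set(a) | set(b)
    return sum(a.get(k) == b.get(k) for k in keys) / len(keys)

def retrieve(new_case, case_base):
    """Return the (features, solution) pair most similar to new_case."""
    return max(case_base, key=lambda cs: similarity(new_case, cs[0]))

# Hypothetical past cases pairing client features with a chosen technology:
case_base = [
    ({"mobility": "low", "speech": "clear"}, "voice-activated controls"),
    ({"mobility": "high", "speech": "impaired"}, "switch-based controls"),
]
query = {"mobility": "low", "speech": "clear"}
_, suggestion = retrieve(query, case_base)   # reuses the closest past case
```

    A full CBR cycle would also adapt and retain the solution; retrieval is the step most directly supporting the clinical-reasoning use described above.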

  19. 42 CFR 441.182 - Maintenance of effort: Computation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... SERVICES Inpatient Psychiatric Services for Individuals Under Age 21 in Psychiatric Facilities or Programs § 441.182 Maintenance of effort: Computation. (a) For expenditures for inpatient psychiatric services... total State Medicaid expenditures in the current quarter for inpatient psychiatric services and...

  20. An accurate binding interaction model in de novo computational protein design of interactions: if you build it, they will bind.

    PubMed

    London, Nir; Ambroggio, Xavier

    2014-02-01

    Computational protein design efforts aim to create novel proteins and functions in an automated manner and, in the process, these efforts shed light on the factors shaping natural proteins. The focus of these efforts has progressed from the interior of proteins to their surface and the design of functions, such as binding or catalysis. Here we examine progress in the development of robust methods for the computational design of non-natural interactions between proteins and molecular targets such as other proteins or small molecules. This problem is referred to as the de novo computational design of interactions. Recent successful efforts in de novo enzyme design and the de novo design of protein-protein interactions open a path towards solving this problem. We examine the common themes in these efforts, and review recent studies aimed at understanding the nature of successes and failures in the de novo computational design of interactions. While several approaches culminated in success, the use of a well-defined structural model for a specific binding interaction in particular has emerged as a key strategy for a successful design, and is therefore reviewed with special consideration. Copyright © 2013 Elsevier Inc. All rights reserved.

  1. High-School Students' Reasoning while Constructing Plant Growth Models in a Computer-Supported Educational Environment. Research Report

    ERIC Educational Resources Information Center

    Ergazaki, Marida; Komis, Vassilis; Zogza, Vassiliki

    2005-01-01

    This paper highlights specific aspects of high-school students' reasoning while coping with a modeling task of plant growth in a computer-supported educational environment. It is particularly concerned with the modeling levels ('macro-phenomenological' and 'micro-conceptual' level) activated by peers while exploring plant growth and with their…

  2. Visual Reasoning in Computational Environment: A Case of Graph Sketching

    ERIC Educational Resources Information Center

    Leung, Allen; Chan, King Wah

    2004-01-01

    This paper reports the case of a form six (grade 12) Hong Kong student's exploration of graph sketching in a computational environment. In particular, the student summarized his discovery in the form of two empirical laws. The student was interviewed and the interviewed data were used to map out a possible path of his visual reasoning. Critical…

  3. Cultural Commonalities and Differences in Spatial Problem-Solving: A Computational Analysis

    ERIC Educational Resources Information Center

    Lovett, Andrew; Forbus, Kenneth

    2011-01-01

    A fundamental question in human cognition is how people reason about space. We use a computational model to explore cross-cultural commonalities and differences in spatial cognition. Our model is based upon two hypotheses: (1) the structure-mapping model of analogy can explain the visual comparisons used in spatial reasoning; and (2) qualitative,…

  4. The Effects of Learning a Computer Programming Language on the Logical Reasoning of School Children.

    ERIC Educational Resources Information Center

    Seidman, Robert H.

    The research reported in this paper explores the syntactical and semantic link between computer programming statements and logical principles, and addresses the effects of learning a programming language on logical reasoning ability. Fifth grade students in a public school in Syracuse, New York, were randomly selected as subjects, and then…

  5. The Difficult Process of Scientific Modelling: An Analysis Of Novices' Reasoning During Computer-Based Modelling

    ERIC Educational Resources Information Center

    Sins, Patrick H. M.; Savelsbergh, Elwin R.; van Joolingen, Wouter R.

    2005-01-01

    Although computer modelling is widely advocated as a way to offer students a deeper understanding of complex phenomena, the process of modelling is rather complex itself and needs scaffolding. In order to offer adequate support, a thorough understanding of the reasoning processes students employ and of difficulties they encounter during a…

  6. Developing Strategic and Reasoning Abilities with Computer Games at Primary School Level

    ERIC Educational Resources Information Center

    Bottino, R. M.; Ferlino, L.; Ott, M.; Tavella, M.

    2007-01-01

    The paper reports a small-scale, long-term pilot project designed to foster strategic and reasoning abilities in young primary school pupils by engaging them in a number of computer games, mainly those usually called mind games (brainteasers, puzzlers, etc.). In this paper, the objectives, work methodology, experimental setting, and tools used in…

  7. Tandem Cylinder Noise Predictions

    NASA Technical Reports Server (NTRS)

    Lockard, David P.; Khorrami, Mehdi R.; Choudhari, Meelan M.; Hutcheson, Florence V.; Brooks, Thomas F.; Stead, Daniel J.

    2007-01-01

    In an effort to better understand landing-gear noise sources, we have been examining a simplified configuration that still maintains some of the salient features of landing-gear flow fields. In particular, tandem cylinders have been studied because they model a variety of component-level interactions. The present effort is directed at the case of two identical cylinders spatially separated in the streamwise direction by 3.7 diameters. Experimental measurements from the Basic Aerodynamic Research Tunnel (BART) and Quiet Flow Facility (QFF) at NASA Langley Research Center (LaRC) have provided steady surface pressures, detailed off-surface measurements of the flow field using Particle Image Velocimetry (PIV), hot-wire measurements in the wake of the rear cylinder, unsteady surface pressure data, and the radiated noise. The experiments were conducted at a Reynolds number of 1.66 × 10^5 based on the cylinder diameter. A trip was used on the upstream cylinder to ensure a fully turbulent shedding process and simulate the effects of a high Reynolds number flow. The parallel computational effort uses the three-dimensional Navier-Stokes solver CFL3D with a hybrid, zonal turbulence model that turns off the turbulence production term everywhere except in a narrow ring surrounding solid surfaces. The current calculations further explore the influence of the grid resolution and spanwise extent on the flow and associated radiated noise. Extensive comparisons with the experimental data are used to assess the ability of the computations to simulate the details of the flow. The results show that the pressure fluctuations on the upstream cylinder, caused by vortex shedding, are smaller than those generated on the downstream cylinder by wake interaction. Consequently, the downstream cylinder dominates the noise radiation, producing an overall directivity pattern that is similar to that of an isolated cylinder.
Only calculations based on the full length of the model span were able to capture the complete decay in the spanwise correlation, thereby producing reasonable noise radiation levels.

  8. Turbulence modeling of free shear layers for high performance aircraft

    NASA Technical Reports Server (NTRS)

    Sondak, Douglas

    1993-01-01

    In many flowfield computations, accuracy of the turbulence model employed is frequently a limiting factor in the overall accuracy of the computation. This is particularly true for complex flowfields such as those around full aircraft configurations. Free shear layers such as wakes, impinging jets (in V/STOL applications), and mixing layers over cavities are often part of these flowfields. Although flowfields have been computed for full aircraft, the memory and CPU requirements for these computations are often excessive. Additional computer power is required for multidisciplinary computations such as coupled fluid dynamics and conduction heat transfer analysis. Massively parallel computers show promise in alleviating this situation, and the purpose of this effort was to adapt and optimize CFD codes to these new machines. The objective of this research effort was to compute the flowfield and heat transfer for a two-dimensional jet impinging normally on a cool plate. The results of this research effort were summarized in an AIAA paper titled 'Parallel Implementation of the k-epsilon Turbulence Model'. Appendix A contains the full paper.

  9. A Computational Model of Reasoning from the Clinical Literature

    PubMed Central

    Rennels, Glenn D.

    1986-01-01

    This paper explores the premise that a formalized representation of empirical studies can play a central role in computer-based decision support. The specific motivations underlying this research include the following propositions: 1. Reasoning from experimental evidence contained in the clinical literature is central to the decisions physicians make in patient care. 2. A computational model, based upon a declarative representation for published reports of clinical studies, can drive a computer program that selectively tailors knowledge of the clinical literature as it is applied to a particular case. 3. The development of such a computational model is an important first step toward filling a void in computer-based decision support systems. Furthermore, the model may help us better understand the general principles of reasoning from experimental evidence both in medicine and other domains. Roundsman is a developmental computer system which draws upon structured representations of the clinical literature in order to critique plans for the management of primary breast cancer. Roundsman is able to produce patient-specific analyses of breast cancer management options based on the 24 clinical studies currently encoded in its knowledge base. The Roundsman system is a first step in exploring how the computer can help to bring a critical analysis of the relevant literature to the physician, structured around a particular patient and treatment decision.

  10. Incorporating time and spatial-temporal reasoning into situation management

    NASA Astrophysics Data System (ADS)

    Jakobson, Gabriel

    2010-04-01

    Spatio-temporal reasoning plays a significant role in situation management that is performed by intelligent agents (human or machine) by affecting how the situations are recognized, interpreted, acted upon or predicted. Many definitions and formalisms for the notion of spatio-temporal reasoning have emerged in various research fields including psychology, economics and computer science (computational linguistics, data management, control theory, artificial intelligence and others). In this paper we examine the role of spatio-temporal reasoning in situation management, particularly how to resolve situations that are described by using spatio-temporal relations among events and situations. We discuss a model for describing context-sensitive temporal relations and show how the model can be extended to spatial relations.
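
    As a concrete illustration of the kind of qualitative temporal relations discussed here, the sketch below classifies pairs of event intervals in the spirit of Allen's interval algebra. The intervals and the subset of relations shown are hypothetical simplifications for illustration, not the model proposed in the paper.

```python
# Minimal sketch of qualitative temporal relations between events, in the
# spirit of Allen's interval algebra. Intervals and names are hypothetical.

def interval_relation(a, b):
    """Return the qualitative relation of interval a = (start, end) to b."""
    a0, a1 = a
    b0, b1 = b
    if a1 < b0:
        return "before"
    if a1 == b0:
        return "meets"
    if a0 == b0 and a1 == b1:
        return "equals"
    if a0 > b0 and a1 < b1:
        return "during"
    if a0 < b0 < a1 < b1:
        return "overlaps"
    return "other"  # remaining Allen relations, omitted for brevity

# A report that ends exactly when an alarm starts "meets" it; an action
# wholly contained inside a mission window is "during" it.
print(interval_relation((0, 5), (5, 9)))   # meets
print(interval_relation((3, 4), (1, 8)))   # during
```

    Situation recognition can then be phrased as constraints over such relations, e.g. "event A meets or overlaps event B".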

  11. Common world model for unmanned systems: Phase 2

    NASA Astrophysics Data System (ADS)

    Dean, Robert M. S.; Oh, Jean; Vinokurov, Jerry

    2014-06-01

    The Robotics Collaborative Technology Alliance (RCTA) seeks to provide adaptive robot capabilities which move beyond traditional metric algorithms to include cognitive capabilities. Key to this effort is the Common World Model, which moves beyond the state-of-the-art by representing the world using semantic and symbolic as well as metric information. It joins these layers of information to define objects in the world. These objects may be reasoned upon jointly using traditional geometric algorithms, symbolic cognitive algorithms, and new computational nodes formed by the combination of these disciplines to address Symbol Grounding and Uncertainty. The Common World Model must understand how these objects relate to each other. It includes the concept of Self-Information about the robot. By encoding current capability, component status, task execution state, and their histories, we track information which enables the robot to reason and adapt its performance using Meta-Cognition and Machine Learning principles. The world model also includes models of how entities in the environment behave, which enable prediction of future world states. To manage complexity, we have adopted a phased implementation approach. Phase 1, published in these proceedings in 2013 [1], presented the approach for linking metric with symbolic information and interfaces for traditional planners and cognitive reasoning. Here we discuss the design of "Phase 2" of this world model, which extends the Phase 1 design, API, and data structures, and reviews the use of the Common World Model as part of a semantic navigation use case.

  12. Qualitative Constraint Reasoning For Image Understanding

    NASA Astrophysics Data System (ADS)

    Perry, John L.

    1987-05-01

    Military planners and analysts are exceedingly concerned with increasing the effectiveness of command and control (C2) processes for battlefield management (BM). A variety of technical approaches have been taken in this effort. These approaches are intended to support and assist commanders in situation assessment, course of action generation and evaluation, and other C2 decision-making tasks. A specific task within this technology support is the ability to effectively gather information concerning opposing forces and to plan/replan tactical maneuvers. Much of the information that is gathered is image-derived, along with collateral data supporting this visual imagery. In this paper, we describe a process called qualitative constraint reasoning (QCR), which is being developed as a mechanism for reasoning in the mid- to high-level vision domain. The essential element of QCR is the abstraction process. One of the factors unique to QCR is the level at which the abstraction process occurs relative to the problem domain. The computational mechanisms used in QCR belong to a general class of problems called consistent labeling problems. The success of QCR lies in its ability to abstract out from a visual domain a structure appropriate for applying the labeling procedure. An example is given that exemplifies the abstraction process for a battlefield management application. Exploratory activities are underway to investigate the suitability of the QCR approach for the battlefield scenario. Further research is required to investigate the utility of QCR in a more complex battlefield environment.

  13. Monitoring Affect States during Effortful Problem Solving Activities

    ERIC Educational Resources Information Center

    D'Mello, Sidney K.; Lehman, Blair; Person, Natalie

    2010-01-01

    We explored the affective states that students experienced during effortful problem solving activities. We conducted a study where 41 students solved difficult analytical reasoning problems from the Law School Admission Test. Students viewed videos of their faces and screen captures and judged their emotions from a set of 14 states (basic…

  14. Motivated to do well: an examination of the relationships between motivation, effort, and cognitive performance in schizophrenia.

    PubMed

    Foussias, G; Siddiqui, I; Fervaha, G; Mann, S; McDonald, K; Agid, O; Zakzanis, K K; Remington, G

    2015-08-01

    The uncertain relationship between negative symptoms, and specifically motivational deficits, with cognitive dysfunction in schizophrenia is in need of further elucidation as it pertains to the interpretation of cognitive test results. Findings to date have suggested a possible mediating role of motivational deficits on cognitive test measures, although findings from formal examinations of effort using performance validity measures have been inconsistent. The aim of this study was to examine the relationships between motivation, effort exerted during cognitive testing, and cognitive performance in schizophrenia. Sixty-nine outpatients with schizophrenia or schizoaffective disorder were evaluated for psychopathology, severity of motivational deficits, effort exerted during cognitive testing, and cognitive performance. Motivation and degree of effort exerted during cognitive testing were significantly related to cognitive performance, specifically verbal fluency, verbal and working memory, attention and processing speed, and reasoning and problem solving. Further, effort accounted for 15% of the variance in cognitive performance, and partially mediated the relationship between motivation and cognitive performance. Examining cognitive performance profiles for individuals exerting normal or reduced effort revealed significant differences in global cognition, as well as attention/processing speed and reasoning and problem solving. These findings suggest that cognitive domains may be differentially affected by impairments in motivation and effort, and highlight the importance of understanding the interplay between motivation and cognitive performance deficits, which may guide the appropriate selection of symptom targets for promoting recovery in patients. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Improving the Well-Being of Abused and Neglected Children. Hearing on Exploring How the Well-Being of Abused and Neglected Children Can Be Improved through Clarifying the Reasonable Efforts Requirement of the Adoption Assistance and Child Welfare Act To Make the Child's Health and Safety the Primary Concern, before the Committee on Labor and Human Resources. United States Senate, One Hundred Fourth Congress, Second Session.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. Senate Committee on Labor and Human Resources.

    This hearing transcript presents testimony exploring how the well-being of abused and neglected children can be improved through an amendment clarifying the "reasonable efforts" requirement of the Adoption Assistance and Child Welfare Act (1980) to allow the child's health and safety to take precedence over parents' rights. Testimony…

  16. Concepts and Relations in Neurally Inspired In Situ Concept-Based Computing

    PubMed Central

    van der Velde, Frank

    2016-01-01

    In situ concept-based computing is based on the notion that conceptual representations in the human brain are “in situ.” In this way, they are grounded in perception and action. Examples are neuronal assemblies, whose connection structures develop over time and are distributed over different brain areas. In situ concept representations cannot be copied or duplicated because that will disrupt their connection structure, and thus the meaning of these concepts. Higher-level cognitive processes, as found in language and reasoning, can be performed with in situ concepts by embedding them in specialized neurally inspired “blackboards.” The interactions between the in situ concepts and the blackboards form the basis for in situ concept computing architectures. In these architectures, memory (concepts) and processing are interwoven, in contrast with the separation between memory and processing found in Von Neumann architectures. Because the further development of Von Neumann computing (more, faster, yet power limited) is questionable, in situ concept computing might be an alternative for concept-based computing. In situ concept computing will be illustrated with a recently developed BABI reasoning task. Neurorobotics can play an important role in the development of in situ concept computing because of the development of in situ concept representations derived in scenarios as needed for reasoning tasks. Neurorobotics would also benefit from power limited and in situ concept computing. PMID:27242504

  17. Concepts and Relations in Neurally Inspired In Situ Concept-Based Computing.

    PubMed

    van der Velde, Frank

    2016-01-01

    In situ concept-based computing is based on the notion that conceptual representations in the human brain are "in situ." In this way, they are grounded in perception and action. Examples are neuronal assemblies, whose connection structures develop over time and are distributed over different brain areas. In situ concept representations cannot be copied or duplicated because that will disrupt their connection structure, and thus the meaning of these concepts. Higher-level cognitive processes, as found in language and reasoning, can be performed with in situ concepts by embedding them in specialized neurally inspired "blackboards." The interactions between the in situ concepts and the blackboards form the basis for in situ concept computing architectures. In these architectures, memory (concepts) and processing are interwoven, in contrast with the separation between memory and processing found in Von Neumann architectures. Because the further development of Von Neumann computing (more, faster, yet power limited) is questionable, in situ concept computing might be an alternative for concept-based computing. In situ concept computing will be illustrated with a recently developed BABI reasoning task. Neurorobotics can play an important role in the development of in situ concept computing because of the development of in situ concept representations derived in scenarios as needed for reasoning tasks. Neurorobotics would also benefit from power limited and in situ concept computing.

  18. Computational psychiatry

    PubMed Central

    Montague, P. Read; Dolan, Raymond J.; Friston, Karl J.; Dayan, Peter

    2013-01-01

    Computational ideas pervade many areas of science and have an integrative explanatory role in neuroscience and cognitive science. However, computational depictions of cognitive function have had surprisingly little impact on the way we assess mental illness because diseases of the mind have not been systematically conceptualized in computational terms. Here, we outline goals and nascent efforts in the new field of computational psychiatry, which seeks to characterize mental dysfunction in terms of aberrant computations over multiple scales. We highlight early efforts in this area that employ reinforcement learning and game theoretic frameworks to elucidate decision-making in health and disease. Looking forwards, we emphasize a need for theory development and large-scale computational phenotyping in human subjects. PMID:22177032

  19. ``But it doesn't come naturally'': how effort expenditure shapes the benefit of growth mindset on women's sense of intellectual belonging in computing

    NASA Astrophysics Data System (ADS)

    Stout, Jane G.; Blaney, Jennifer M.

    2017-10-01

    Research suggests growth mindset, or the belief that knowledge is acquired through effort, may enhance women's sense of belonging in male-dominated disciplines, like computing. However, other research indicates women who spend a great deal of time and energy in technical fields experience a low sense of belonging. The current study assessed the benefits of a growth mindset on women's (and men's) sense of intellectual belonging in computing, accounting for the amount of time and effort dedicated to academics. We define "intellectual belonging" as the sense that one is believed to be a competent member of the community. Whereas a stronger growth mindset was associated with stronger intellectual belonging for men, a growth mindset only boosted women's intellectual belonging when they did not work hard on academics. Our findings suggest, paradoxically, women may not benefit from a growth mindset in computing when they exert a lot of effort.

  20. Predicting Motivation: Computational Models of PFC Can Explain Neural Coding of Motivation and Effort-based Decision-making in Health and Disease.

    PubMed

    Vassena, Eliana; Deraeve, James; Alexander, William H

    2017-10-01

    Human behavior is strongly driven by the pursuit of rewards. In daily life, however, benefits mostly come at a cost, often requiring that effort be exerted to obtain potential benefits. Medial PFC (MPFC) and dorsolateral PFC (DLPFC) are frequently implicated in the expectation of effortful control, showing increased activity as a function of predicted task difficulty. Such activity partially overlaps with expectation of reward and has been observed both during decision-making and during task preparation. Recently, novel computational frameworks have been developed to explain activity in these regions during cognitive control, based on the principle of prediction and prediction error (predicted response-outcome [PRO] model [Alexander, W. H., & Brown, J. W. Medial prefrontal cortex as an action-outcome predictor. Nature Neuroscience, 14, 1338-1344, 2011], hierarchical error representation [HER] model [Alexander, W. H., & Brown, J. W. Hierarchical error representation: A computational model of anterior cingulate and dorsolateral prefrontal cortex. Neural Computation, 27, 2354-2410, 2015]). Despite the broad explanatory power of these models, it is not clear whether they can also accommodate effects related to the expectation of effort observed in MPFC and DLPFC. Here, we propose a translation of these computational frameworks to the domain of effort-based behavior. First, we discuss how the PRO model, based on prediction error, can explain effort-related activity in MPFC, by reframing effort-based behavior in a predictive context. We propose that MPFC activity reflects monitoring of motivationally relevant variables (such as effort and reward), by coding expectations and discrepancies from such expectations. Moreover, we derive behavioral and neural model-based predictions for healthy controls and clinical populations with impairments of motivation. 
Second, we illustrate the possible translation to effort-based behavior of the HER model, an extended version of the PRO model based on hierarchical error prediction, developed to explain MPFC-DLPFC interactions. We derive behavioral predictions that describe how effort and reward information is coded in PFC and how changing the configuration of such environmental information might affect decision-making and task performance involving motivation.
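
    The prediction-error principle underlying both the PRO and HER frameworks can be reduced to a one-line delta-rule update. The sketch below is a hedged illustration of that principle applied to learning an expected effort cost, with hypothetical values; it is not the published model.

```python
# Hedged sketch (not the published PRO model): the core prediction-error
# principle. A prediction V is nudged toward each observed outcome by a
# fraction (the learning rate) of the prediction error. Values are
# hypothetical.

def update(prediction, outcome, lr=0.2):
    error = outcome - prediction      # prediction error
    return prediction + lr * error    # delta-rule update

V = 0.0                               # initial expected effort cost
for observed_cost in [1.0, 1.0, 1.0, 1.0, 1.0]:
    V = update(V, observed_cost)

print(round(V, 4))  # 0.6723, converging toward the true cost of 1.0
```

    In this framing, MPFC-like monitoring corresponds to tracking the error term; a persistent mismatch between expected and experienced effort is what drives behavioral adjustment.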

  1. Status of Computational Aerodynamic Modeling Tools for Aircraft Loss-of-Control

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.; Murphy, Patrick C.; Atkins, Harold L.; Viken, Sally A.; Petrilli, Justin L.; Gopalarathnam, Ashok; Paul, Ryan C.

    2016-01-01

    A concerted effort has been underway over the past several years to evolve computational capabilities for modeling aircraft loss-of-control under the NASA Aviation Safety Program. A principal goal has been to develop reliable computational tools for predicting and analyzing the non-linear stability & control characteristics of aircraft near stall boundaries affecting safe flight, and for utilizing those predictions for creating augmented flight simulation models that improve pilot training. Pursuing such an ambitious task with limited resources required the forging of close collaborative relationships with a diverse body of computational aerodynamicists and flight simulation experts to leverage their respective research efforts into the creation of NASA tools to meet this goal. Considerable progress has been made and work remains to be done. This paper summarizes the status of the NASA effort to establish computational capabilities for modeling aircraft loss-of-control and offers recommendations for future work.

  2. The 4th R: Reasoning.

    ERIC Educational Resources Information Center

    Miles, Curtis

    1983-01-01

    Reviews sources of information on materials for teaching reasoning with a microcomputer. Suggests microcomputer magazines, catalogs of commercial materials, CONDUIT (a nonprofit organization devoted to educational computer use), and local microcomputer users groups. Lists Apple II software for strategy games with reasoning applications. (DMM)

  3. Why Don't All Professors Use Computers?

    ERIC Educational Resources Information Center

    Drew, David Eli

    1989-01-01

    Discusses the adoption of computer technology at universities and examines reasons why some professors don't use computers. Topics discussed include computer applications, including artificial intelligence, social science research, statistical analysis, and cooperative research; appropriateness of the technology for the task; the Computer Aptitude…

  4. Children Can Solve Bayesian Problems: The Role of Representation in Mental Computation

    ERIC Educational Resources Information Center

    Zhu, Liqi; Gigerenzer, Gerd

    2006-01-01

    Can children reason the Bayesian way? We argue that the answer to this question depends on how numbers are represented, because a representation can do part of the computation. We test, for the first time, whether Bayesian reasoning can be elicited in children by means of natural frequencies. We show that when information was presented to fourth,…
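
    A natural-frequency representation "does part of the computation" in a very concrete sense: the base rate is folded into the counts, so Bayes' rule collapses to a simple proportion. The sketch below illustrates this with hypothetical numbers (a 10% base rate, an 80% hit rate, a 10% false-alarm rate), not data from the study.

```python
# Hedged sketch: the same Bayesian problem solved in two representations.
# All rates and counts below are hypothetical illustrations.

def posterior_from_probabilities(prior, hit_rate, false_alarm_rate):
    """Standard Bayes' rule stated with probabilities."""
    joint_true = prior * hit_rate
    joint_false = (1 - prior) * false_alarm_rate
    return joint_true / (joint_true + joint_false)

def posterior_from_frequencies(n_true_positive, n_false_positive):
    """Natural frequencies: the base rate is already folded into the
    counts, so the answer is a simple proportion."""
    return n_true_positive / (n_true_positive + n_false_positive)

# Probability format: prior 0.10, hit rate 0.80, false-alarm rate 0.10.
p1 = posterior_from_probabilities(0.10, 0.80, 0.10)

# The same situation as natural frequencies out of 100 cases: 10 have the
# condition and 8 of them test positive; 9 of the other 90 also test positive.
p2 = posterior_from_frequencies(8, 9)

assert abs(p1 - p2) < 1e-12
print(round(p1, 3))  # 0.471
```

    Both routes give the same posterior; the frequency route needs only one division, which is the representational advantage the natural-frequency format offers.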

  5. The Application of Multiobjective Evolutionary Algorithms to an Educational Computational Model of Science Information Processing: A Computational Experiment in Science Education

    ERIC Educational Resources Information Center

    Lamb, Richard L.; Firestone, Jonah B.

    2017-01-01

    Conflicting explanations and unrelated information in science classrooms increase cognitive load and decrease efficiency in learning. This reduced efficiency ultimately limits one's ability to solve reasoning problems in the science. In reasoning, it is the ability of students to sift through and identify critical pieces of information that is of…

  6. The Effects of Computer Programming on High School Students' Reasoning Skills and Mathematical Self-Efficacy and Problem Solving

    ERIC Educational Resources Information Center

    Psycharis, Sarantos; Kallia, Maria

    2017-01-01

    In this paper we investigate whether computer programming has an impact on high school student's reasoning skills, problem solving and self-efficacy in Mathematics. The quasi-experimental design was adopted to implement the study. The sample of the research comprised 66 high school students separated into two groups, the experimental and the…

  7. Reasoning Abilities in Primary School: A Pilot Study on Poor Achievers vs. Normal Achievers in Computer Game Tasks

    ERIC Educational Resources Information Center

    Dagnino, Francesca Maria; Ballauri, Margherita; Benigno, Vincenza; Caponetto, Ilaria; Pesenti, Elia

    2013-01-01

    This paper presents the results of preliminary research on the assessment of reasoning abilities in primary school poor achievers vs. normal achievers using computer game tasks. Subjects were evaluated by means of cognitive assessment on logical abilities and academic skills. The aim of this study is to better understand the relationship between…

  8. Transforming Medical Assessment: Integrating Uncertainty Into the Evaluation of Clinical Reasoning in Medical Education.

    PubMed

    Cooke, Suzette; Lemay, Jean-Francois

    2017-06-01

    In an age where practicing physicians have access to an overwhelming volume of clinical information and are faced with increasingly complex medical decisions, the ability to execute sound clinical reasoning is essential to optimal patient care. The authors propose two concepts that are philosophically paramount to the future assessment of clinical reasoning in medicine: assessment in the context of "uncertainty" (when, despite all of the information that is available, there is still significant doubt as to the best diagnosis, investigation, or treatment), and acknowledging that it is entirely possible (and reasonable) to have more than "one correct answer." The purpose of this article is to highlight key elements related to these two core concepts and discuss genuine barriers that currently exist on the pathway to creating such assessments. These include acknowledging situations of uncertainty, creating clear frameworks that define progressive levels of clinical reasoning skills, providing validity evidence to increase the defensibility of such assessments, considering the comparative feasibility with other forms of assessment, and developing strategies to evaluate the impact of these assessment methods on future learning and practice. The authors recommend that concerted efforts be directed toward these key areas to help advance the field of clinical reasoning assessment, improve the clinical care decisions made by current and future physicians, and have positive outcomes for patients. It is anticipated that these and subsequent efforts will aid in reaching the goal of making future assessment in medical education more representative of current-day clinical reasoning and decision making.

  9. Indicators of Informal and Formal Decision-Making about a Socioscientific Issue

    ERIC Educational Resources Information Center

    Dauer, Jenny M.; Lute, Michelle L.; Straka, Olivia

    2017-01-01

    We propose two contrasting types of student decision-making based on social and cognitive psychology models of separate mental processes for problem solving. Informal decision-making uses intuitive reasoning and is subject to cognitive biases, whereas formal decision-making uses effortful, logical reasoning. We explored indicators of students'…

  10. Presidential Address: How to Improve Poverty Measurement in the United States

    ERIC Educational Resources Information Center

    Blank, Rebecca M.

    2008-01-01

    This paper discusses the reasons why the current official U.S. poverty measure is outdated and nonresponsive to many anti-poverty initiatives. A variety of efforts to update and improve the statistic have failed, for political, technical, and institutional reasons. Meanwhile, the European Union is taking a very different approach to poverty…

  11. Developing Teachers' Reasoning about Comparing Distributions: A Cross-Institutional Effort

    ERIC Educational Resources Information Center

    Tran, Dung; Lee, Hollylynne; Doerr, Helen

    2016-01-01

    The research reported here uses a pre/post-test model and stimulated recall interviews to assess teachers' statistical reasoning about comparing distributions, when enrolled in a graduate-level statistics education course. We discuss key aspects of the course design aimed at improving teachers' learning and teaching of statistics, and the…

  12. The Quantitative Reasoning for College Science (QuaRCS) Assessment: Emerging Themes from 5 Years of Data

    NASA Astrophysics Data System (ADS)

    Follette, Katherine; Dokter, Erin; Buxner, Sanlyn

    2018-01-01

    The Quantitative Reasoning for College Science (QuaRCS) Assessment is a validated assessment instrument that was designed to measure changes in students' quantitative reasoning skills, attitudes toward mathematics, and ability to accurately assess their own quantitative abilities. It has been administered to more than 5,000 students at a variety of institutions at the start and end of a semester of general education college science instruction. I will begin by briefly summarizing our published work surrounding validation of the instrument and identification of underlying attitudinal factors (composite variables identified via factor analysis) that predict 50% of the variation in students' scores on the assessment. I will then discuss more recent unpublished work, including: (1) Development and validation of an abbreviated version of the assessment (The QuaRCS Light), which results in marked improvements in students' ability to maintain a high effort level throughout the assessment and has broad implications for quantitative reasoning assessments in general, and (2) Our efforts to revise the attitudinal portion of the assessment to better assess math anxiety level, another key factor in student performance on numerical assessments.

  13. Secure and Efficient Signature Scheme Based on NTRU for Mobile Payment

    NASA Astrophysics Data System (ADS)

    Xia, Yunhao; You, Lirong; Sun, Zhe; Sun, Zhixin

    2017-10-01

    Mobile payment is becoming increasingly popular; however, traditional public-key encryption algorithms place high demands on hardware and are therefore poorly suited to mobile terminals with limited computing resources. In addition, these algorithms are not resistant to quantum computing. This paper studies the quantum-resistant public-key algorithm NTRU, analyzing the influence of the parameters q and k on the probability of generating a reasonable signature value. Two methods are proposed to improve this probability: first, increasing the value of the parameter q; second, adding an authentication condition during the signing phase that enforces the reasonable-signature requirements. Experimental results show that the proposed signature scheme leaks no private-key information through the signature value and increases the probability of generating a reasonable signature value. It also improves the signing rate and avoids the propagation of invalid signatures in the network, although the scheme imposes certain restrictions on parameter selection.
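
    The dependence of the acceptance probability on q can be illustrated with a deliberately simplified toy model (not the actual NTRU signature scheme): treat a candidate signature as a random coefficient vector and accept it only when its largest coefficient stays under a bound proportional to q. All distributions, constants, and names below are hypothetical.

```python
# Toy illustration only (NOT real NTRU): why raising the modulus q can raise
# the probability that a candidate signature counts as "reasonable". We model
# a candidate as a random coefficient vector and accept it when its max-norm
# falls under a bound proportional to q. Every number here is hypothetical.
import random

def acceptance_rate(q, n=16, trials=2000, bound_factor=0.25, seed=1):
    rng = random.Random(seed)            # fixed seed for reproducibility
    bound = bound_factor * q             # acceptance bound grows with q
    accepted = 0
    for _ in range(trials):
        coeffs = [rng.gauss(0, 40) for _ in range(n)]
        if max(abs(c) for c in coeffs) < bound:
            accepted += 1
    return accepted / trials

for q in (128, 256, 512):
    print(q, acceptance_rate(q))         # rate rises monotonically with q
```

    Because the same random draws are reused for each q, the acceptance sets are nested, so the rate is guaranteed to be non-decreasing in q in this toy model, mirroring the paper's first proposed method.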

  14. The interaction of representation and reasoning

    PubMed Central

    Bundy, Alan

    2013-01-01

    Automated reasoning is an enabling technology for many applications of informatics. These applications include verifying that a computer program meets its specification; enabling a robot to form a plan to achieve a task and answering questions by combining information from diverse sources, e.g. on the Internet, etc. How is automated reasoning possible? Firstly, knowledge of a domain must be stored in a computer, usually in the form of logical formulae. This knowledge might, for instance, have been entered manually, retrieved from the Internet or perceived in the environment via sensors, such as cameras. Secondly, rules of inference are applied to old knowledge to derive new knowledge. Automated reasoning techniques have been adapted from logic, a branch of mathematics that was originally designed to formalize the reasoning of humans, especially mathematicians. My special interest is in the way that representation and reasoning interact. Successful reasoning is dependent on appropriate representation of both knowledge and successful methods of reasoning. Failures of reasoning can suggest changes of representation. This process of representational change can also be automated. We will illustrate the automation of representational change by drawing on recent work in my research group. PMID:24062623
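
    The cycle described here, applying rules of inference to old knowledge until new knowledge emerges, can be sketched as a minimal forward-chaining loop. The facts and rules below are hypothetical stand-ins for a real knowledge base of logical formulae.

```python
# Minimal forward-chaining sketch: rules of inference are applied repeatedly
# to stored knowledge until no new facts emerge. Facts and rules are
# hypothetical stand-ins for a real knowledge base.

facts = {"robot_at(door)", "door_open"}
rules = [
    # (premises, conclusion): if every premise is known, add the conclusion.
    ({"robot_at(door)", "door_open"}, "can_enter(room)"),
    ({"can_enter(room)"}, "plan(move_to(room))"),
]

changed = True
while changed:
    changed = False
    for premises, conclusion in rules:
        if premises <= facts and conclusion not in facts:
            facts.add(conclusion)        # modus ponens: derive new knowledge
            changed = True

print(sorted(facts))
# ['can_enter(room)', 'door_open', 'plan(move_to(room))', 'robot_at(door)']
```

    The second rule only fires after the first has added its conclusion, which is the sense in which new knowledge is derived from old.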

  15. Lawrence Livermore National Laboratories Perspective on Code Development and High Performance Computing Resources in Support of the National HED/ICF Effort

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clouse, C. J.; Edwards, M. J.; McCoy, M. G.

    2015-07-07

    Through its Advanced Scientific Computing (ASC) and Inertial Confinement Fusion (ICF) code development efforts, Lawrence Livermore National Laboratory (LLNL) provides a world-leading numerical simulation capability for the National HED/ICF program in support of the Stockpile Stewardship Program (SSP). In addition, the ASC effort provides high performance computing platform capabilities upon which these codes are run. LLNL remains committed to, and will work with, the national HED/ICF program community to help ensure numerical simulation needs are met and to make those capabilities available, consistent with programmatic priorities and available resources.

  16. Aeroacoustic Simulations of Tandem Cylinders with Subcritical Spacing

    NASA Technical Reports Server (NTRS)

    Lockard, David P.; Choudhari, Meelan M.; Khorrami, Mehdi R.; Neuhart, Dan H.; Hutcheson, Florence V.; Brooks, Thomas F.; Stead, Daniel J.

    2008-01-01

    Tandem cylinders are being studied because they model a variety of component level interactions of landing gear. The present effort is directed at the case of two identical cylinders with their centroids separated in the streamwise direction by 1.435 diameters. Experiments in the Basic Aerodynamic Research Tunnel and Quiet Flow Facility at NASA Langley Research Center have provided an extensive experimental database of the nearfield flow and radiated noise. The measurements were conducted at a Mach number of 0.1285 and Reynolds number of 1.66x10(exp 5) based on the cylinder diameter. A trip was used on the upstream cylinder to insure a fully turbulent flow separation and, hence, to simulate a major aspect of high Reynolds number flow. The parallel computational effort uses the three-dimensional Navier-Stokes solver CFL3D with a hybrid, zonal turbulence model that turns off the turbulence production term everywhere except in a narrow ring surrounding solid surfaces. The experiments exhibited an asymmetry in the surface pressure that was persistent despite attempts to eliminate it through small changes in the configuration. To model the asymmetry, the simulations were run with the cylinder configuration at a nonzero but small angle of attack. The computed results and experiments are in general agreement that vortex shedding for the spacing studied herein is weak relative to that observed at supercritical spacings. Although the shedding was subdued in the simulations, it was still more prominent than in the experiments. Overall, the simulation comparisons with measured near-field data and the radiated acoustics are reasonable, especially if one is concerned with capturing the trends relative to larger cylinder spacings. However, the flow details of the 1.435 diameter spacing have not been captured in full even though very fine grid computations have been performed. 
Some of the discrepancy may be associated with the simulation's inexact representation of the experimental configuration, but numerical and flow modeling errors are also likely contributors to the observed differences.

  17. Visualizing complex processes using a cognitive-mapping tool to support the learning of clinical reasoning.

    PubMed

    Wu, Bian; Wang, Minhong; Grotzer, Tina A; Liu, Jun; Johnson, Janice M

    2016-08-22

    Practical experience with clinical cases has played an important role in supporting the learning of clinical reasoning. However, learning through practical experience involves complex processes that are difficult for students to capture. This study aimed to examine the effects of a computer-based cognitive-mapping approach that helps students to externalize the reasoning process and the knowledge underlying the reasoning process when they work with clinical cases. A comparison between the cognitive-mapping approach and the verbal-text approach was made by analyzing their effects on learning outcomes. Fifty-two third-year or higher students from two medical schools participated in the study. Students in the experimental group used the computer-based cognitive-mapping approach, while the control group used the verbal-text approach, to make sense of their thinking and actions when they worked with four simulated cases over 4 weeks. For each case, students in both groups reported their reasoning process (involving data capture, hypotheses formulation, and reasoning with justifications) and the underlying knowledge (involving identified concepts and the relationships between the concepts) using the given approach. The learning products (cognitive maps or verbal text) revealed that students in the cognitive-mapping group outperformed those in the verbal-text group in the reasoning process, but not in making sense of the knowledge underlying the reasoning process. No significant differences were found in a knowledge posttest between the two groups. The computer-based cognitive-mapping approach has shown a promising advantage over the verbal-text approach in improving students' reasoning performance. Further studies are needed to examine the effects of the cognitive-mapping approach in improving the construction of subject-matter knowledge on the basis of practical experience.

  18. [AERA. Dream machines and computing practices at the Mathematical Center].

    PubMed

    Alberts, Gerard; De Beer, Huub T

    2008-01-01

    Dream machines may be just as effective as the ones materialised. Their symbolic thrust can be quite powerful. The Amsterdam 'Mathematisch Centrum' (Mathematical Center), founded February 11, 1946, created a Computing Department in an effort to realise its goal of serving society. When Aad van Wijngaarden was appointed as head of the Computing Department, however, he claimed space for scientific research and computer construction, next to computing as a service. Still, the computing service following the five-stage style of Hartree's numerical analysis remained a dominant characteristic of the work of the Computing Department. The high level of ambition held by Aad van Wijngaarden led to ever-renewed projections of big automatic computers, symbolised by the never-built AERA. Even a machine that was actually constructed, the ARRA, which followed A.D. Booth's design of the ARC, never made it into real operation. It did serve Van Wijngaarden to bluff his way into the computer age by midsummer 1952. Not until January 1954 did the Computing Department have a working stored-program computer, which for reasons of policy went under the same name: ARRA. After just one other machine, the ARMAC, had been produced, a separate company, Electrologica, was set up for the manufacture of computers, which produced the rather successful X1 computer. The combination of ambition and absence of a working machine led to a high level of work on programming, way beyond the usual ideas of libraries of subroutines. Edsger W. Dijkstra in particular led the way to an emphasis on the duties of the programmer within the pattern of numerical analysis. Programs generating programs, known elsewhere as autocoding systems, were at the 'Mathematisch Centrum' called 'superprograms'. Practical examples were usually called a 'complex', in Dutch, where in English one might say 'system'. Historically, this is where software begins.
Dekker's matrix complex, Dijkstra's interrupt system, Dijkstra and Zonneveld's ALGOL compiler--which for housekeeping contained 'the complex'--were actual examples of such super programs. In 1960 this compiler gave the Mathematical Center a leading edge in the early development of software.

  19. 40 CFR 33.302 - Are there any additional contract administration requirements?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... ENVIRONMENTAL PROTECTION AGENCY PROGRAMS Good Faith Efforts § 33.302 Are there any additional contract... require its prime contractor to employ the six good faith efforts described in § 33.301 even if the prime... the subcontract for any reason, the recipient must require the prime contractor to employ the six good...

  20. Entitlement Attitudes Predict Students' Poor Performance in Challenging Academic Conditions

    ERIC Educational Resources Information Center

    Anderson, Donna; Halberstadt, Jamin; Aitken, Robert

    2013-01-01

    Excessive entitlement--an exaggerated or unrealistic belief about what one deserves--has been associated with a variety of maladaptive behaviors, including a decline in motivation and effort. In the context of tertiary education, we reasoned that if students expend less effort to obtain positive outcomes to which they feel entitled, this should…

  1. New Mexico district work-effort analysis computer program

    USGS Publications Warehouse

    Hiss, W.L.; Trantolo, A.P.; Sparks, J.L.

    1972-01-01

    The computer program (CAN 2) described in this report is one of several related programs used in the New Mexico District cost-analysis system. The work-effort information used in these programs is accumulated and entered to the nearest hour on forms completed by each employee. Tabulating cards are punched directly from these forms after visual examinations for errors are made. Reports containing detailed work-effort data itemized by employee within each project and account and by account and project for each employee are prepared for both current-month and year-to-date periods by the CAN 2 computer program. An option allowing preparation of reports for a specified 3-month period is provided. The total number of hours worked on each account and project and a grand total of hours worked in the New Mexico District is computed and presented in a summary report for each period. Work effort not chargeable directly to individual projects or accounts is considered as overhead and can be apportioned to the individual accounts and projects on the basis of the ratio of the total hours of work effort for the individual accounts or projects to the total New Mexico District work effort at the option of the user. The hours of work performed by a particular section, such as General Investigations or Surface Water, are prorated and charged to the projects or accounts within the particular section. A number of surveillance or buffer accounts are employed to account for the hours worked on special events or on those parts of large projects or accounts that require a more detailed analysis. Any part of the New Mexico District operation can be separated and analyzed in detail by establishing an appropriate buffer account. With the exception of statements associated with word size, the computer program is written in FORTRAN IV in a relatively low and standard language level to facilitate its use on different digital computers. 
The program has been run only on a Control Data Corporation 6600 computer system. Central processing computer time has seldom exceeded 5 minutes on the longest year-to-date runs.
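The overhead apportionment described above, charging non-project hours to each account in proportion to its share of total direct hours, can be sketched as follows. The function and data are illustrative, not taken from the CAN 2 program:

```python
def prorate_overhead(project_hours, overhead_hours):
    """Apportion overhead hours to each project in proportion to that
    project's share of the total direct hours worked."""
    total_direct = sum(project_hours.values())
    return {project: direct + overhead_hours * direct / total_direct
            for project, direct in project_hours.items()}

# Hypothetical month: 60 h on P1, 40 h on P2, 10 h of overhead.
charged = prorate_overhead({"P1": 60.0, "P2": 40.0}, overhead_hours=10.0)
# P1 is charged 66.0 h, P2 is charged 44.0 h.
```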

  2. Nonlinear model predictive control applied to the separation of praziquantel in simulated moving bed chromatography.

    PubMed

    Andrade Neto, A S; Secchi, A R; Souza, M B; Barreto, A G

    2016-10-28

    An adaptive nonlinear model predictive control scheme for a simulated moving bed unit for the enantioseparation of praziquantel is presented. A first-principles model was used in the proposed purity control scheme. The main concern about this kind of model in a control framework is the computational effort required to solve it; however, a sufficiently fast solution was achieved. In order to evaluate the controller's performance, several cases were simulated, including external pump and switching valve malfunctions. The problem of plant-model mismatch was also investigated, and for that reason a parameter estimation step was introduced in the control strategy. In every studied scenario, the controller was able to maintain the purity levels at their set points, which were set to 99% and 98.6% for extract and raffinate, respectively. Additionally, fast responses and smooth actuation were achieved. Copyright © 2016 Elsevier B.V. All rights reserved.
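The control strategy summarized above (a parameter re-estimation step followed by the predictive-control move) can be outlined as a generic adaptive-NMPC skeleton. The estimator and optimal-control solver below are toy stand-ins for illustration, not the authors' SMB model or solver:

```python
def adaptive_nmpc_step(measurement, params, setpoint, estimate, solve_ocp):
    """One cycle of an adaptive NMPC loop: update model parameters from
    the latest measurement, then solve the optimal control problem over
    the horizon and apply only the first control move."""
    new_params = estimate(params, measurement)
    control_sequence = solve_ocp(new_params, measurement, setpoint)
    return new_params, control_sequence[0]

# Toy stand-ins: a no-op estimator and a proportional "solver".
estimate = lambda p, y: p
solve_ocp = lambda p, y, sp: [p * (sp - y)]

params, move = adaptive_nmpc_step(measurement=0.95, params=2.0,
                                  setpoint=0.99, estimate=estimate,
                                  solve_ocp=solve_ocp)
```

In a real setting, `estimate` would refit the first-principles model to plant data (addressing plant-model mismatch) and `solve_ocp` would be a dynamic optimizer over the prediction horizon.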

  3. Use patterns of a state health care price transparency web site: what do patients shop for?

    PubMed

    Mehrotra, Ateev; Brannen, Tyler; Sinaiko, Anna D

    2014-01-01

    To help people shop for lower cost providers, several states have created their own price transparency Web sites or passed legislation mandating health plans provide such information. New Hampshire's HealthCost Web site is on the forefront of such initiatives. Despite the growing interest in price transparency, little is known about such efforts, including how often these tools are used and for what reason. We examined the use of New Hampshire HealthCost over a 3-year period. Approximately 1% of the state's residents used the Web site, and the most common searches were for outpatient visits, magnetic resonance imaging (MRI) or computed tomography (CT) scans, and emergency department visits. The results provide a cautionary note on the level of potential interest among consumers in this information but may guide others on practically what are the most "shop-able" services for patients. © The Author(s) 2014.

  4. Large public display boards: a case study of an OR board and design implications.

    PubMed

    Lasome, C E; Xiao, Y

    2001-01-01

    A compelling reason for studying artifacts in collaborative work is to inform design. We present a case study of a public display board (12 ft by 4 ft) in a Level-I trauma center operating room (OR) unit. The board has evolved into a sophisticated coordination tool for clinicians and supporting personnel. This paper draws on study findings about how the OR board is used and organizes the findings into three areas: (1) visual and physical properties of the board that are exploited for collaboration, (2) purposes the board was configured to serve, and (3) types of physical and perceptual interaction with the board. Findings and implications related to layout, size, flexibility, task management, problem-solving, resourcing, shared awareness, and communication are discussed in an effort to propose guidelines to facilitate the design of electronic, computer driven display boards in the OR environment.

  5. All-in-one model for designing optimal water distribution pipe networks

    NASA Astrophysics Data System (ADS)

    Aklog, Dagnachew; Hosoi, Yoshihiko

    2017-05-01

    This paper discusses the development of an easy-to-use, all-in-one model for designing optimal water distribution networks. The model combines different optimization techniques into a single package in which a user can easily choose which optimizer to use and compare the results of different optimizers to gain confidence in the performances of the models. At present, three optimization techniques are included in the model: linear programming (LP), genetic algorithm (GA) and a heuristic one-by-one reduction method (OBORM) that was previously developed by the authors. The optimizers were tested on a number of benchmark problems and performed very well in terms of finding optimal or near-optimal solutions with a reasonable computational effort. The results indicate that the model effectively addresses the issues of complexity and limited performance trust associated with previous models and can thus be used for practical purposes.
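A minimal sketch of the "all-in-one" idea: a single front end that dispatches to a user-selected optimizer and can run all registered optimizers for cross-checking. The registered functions are placeholders, not the LP/GA/OBORM solvers from the paper:

```python
# Placeholder optimizers: each picks the cheapest candidate from a cost
# table. Real implementations would be LP, GA, and OBORM solvers.
def optimize_lp(candidates):
    return min(candidates, key=candidates.get)

def optimize_ga(candidates):
    return min(candidates, key=candidates.get)

OPTIMIZERS = {"lp": optimize_lp, "ga": optimize_ga}

def design_network(candidates, method="lp"):
    """Dispatch to the chosen optimizer through one uniform interface."""
    if method not in OPTIMIZERS:
        raise ValueError(f"unknown optimizer: {method!r}")
    return OPTIMIZERS[method](candidates)

def compare_all(candidates):
    """Run every registered optimizer so results can be cross-checked."""
    return {name: opt(candidates) for name, opt in OPTIMIZERS.items()}

# Hypothetical pipe-diameter candidates with unit costs.
costs = {"100mm": 5.0, "150mm": 8.0, "200mm": 12.0}
```

Agreement between optimizers, as in `compare_all`, is the kind of cross-validation the paper suggests builds user confidence.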

  6. Designing a Software Tool for Fuzzy Logic Programming

    NASA Astrophysics Data System (ADS)

    Abietar, José M.; Morcillo, Pedro J.; Moreno, Ginés

    2007-12-01

    Fuzzy Logic Programming is an interesting and still growing research area that agglutinates the efforts for introducing fuzzy logic into logic programming (LP), in order to incorporate more expressive resources into such languages for dealing with uncertainty and approximate reasoning. The multi-adjoint logic programming approach is a recent and extremely flexible fuzzy logic paradigm for which, unfortunately, we have not found practical tools implemented so far. In this work, we describe a prototype system which is able to directly translate fuzzy logic programs into Prolog code in order to safely execute these residual programs inside any standard Prolog interpreter in a completely transparent way for the final user. We think that the development of such fuzzy languages and programming tools might play an important role in the design of advanced software applications for computational physics, chemistry, mathematics, medicine, industrial control and so on.

  7. A Plan for Advanced Guidance and Control Technology for 2nd Generation Reusable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Hanson, John M.; Fogle, Frank (Technical Monitor)

    2002-01-01

    Advanced guidance and control (AG&C) technologies are critical for meeting safety/reliability and cost requirements for the next generation of reusable launch vehicle (RLV). This becomes clear upon examining the number of expendable launch vehicle failures in the recent past where AG&C technologies would have saved an RLV with the same failure mode, the additional vehicle problems where this technology applies, and the costs associated with mission design with or without all these failure issues. The state-of-the-art in guidance and control technology, as well as in computing technology, is at the point where we can look to the possibility of being able to safely return an RLV in any situation where it can physically be recovered. This paper outlines reasons for AG&C, current technology efforts, and the additional work needed for making this goal a reality.

  8. A Unified Data-Driven Approach for Programming In Situ Analysis and Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aiken, Alex

    The placement and movement of data is becoming the key limiting factor on both performance and energy efficiency of high performance computations. As systems generate more data, it is becoming increasingly difficult to actually move that data elsewhere for post-processing, as the rate of improvements in supporting I/O infrastructure is not keeping pace. Together, these trends are creating a shift in how we think about exascale computations, from a viewpoint that focuses on FLOPS to one that focuses on data and data-centric operations as fundamental to the reasoning about, and optimization of, scientific workflows on extreme-scale architectures. The overarching goal of our effort was the study of a unified data-driven approach for programming applications and in situ analysis and visualization. Our work was to understand the interplay between data-centric programming model requirements at extreme-scale and the overall impact of those requirements on the design, capabilities, flexibility, and implementation details for both applications and the supporting in situ infrastructure. In this context, we made many improvements to the Legion programming system (one of the leading data-centric models today) and demonstrated in situ analyses on real application codes using these improvements.

  9. Probabilistic Finite Elements (PFEM) structural dynamics and fracture mechanics

    NASA Technical Reports Server (NTRS)

    Liu, Wing-Kam; Belytschko, Ted; Mani, A.; Besterfield, G.

    1989-01-01

    The purpose of this work is to develop computationally efficient methodologies for assessing the effects of randomness in loads, material properties, and other aspects of a problem by a finite element analysis. The resulting group of methods is called probabilistic finite elements (PFEM). The overall objective of this work is to develop methodologies whereby the lifetime of a component can be predicted, accounting for the variability in the material and geometry of the component, the loads, and other aspects of the environment; and the range of response expected in a particular scenario can be presented to the analyst in addition to the response itself. Emphasis has been placed on methods which are not statistical in character; that is, they do not involve Monte Carlo simulations. The reason for this choice of direction is that Monte Carlo simulations of complex nonlinear response require a tremendous amount of computation. The focus of efforts so far has been on nonlinear structural dynamics. However, in the continuation of this project, emphasis will be shifted to probabilistic fracture mechanics so that the effect of randomness in crack geometry and material properties can be studied interactively with the effect of random load and environment.
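The non-statistical character of PFEM can be illustrated with a first-order second-moment propagation of input randomness through a response gradient; this is a generic illustration of the idea of avoiding Monte Carlo sampling, not the paper's formulation:

```python
def first_order_variance(grad, cov):
    """First-order estimate of the response variance, Var[f] ≈ gᵀ C g,
    where g is the response gradient evaluated at the mean inputs and
    C is the input covariance matrix. No random sampling is involved."""
    n = len(grad)
    return sum(grad[i] * cov[i][j] * grad[j]
               for i in range(n) for j in range(n))

# Two uncorrelated unit-variance inputs with gradient (1, 2):
# Var[f] ≈ 1² + 2² = 5, obtained from one deterministic evaluation.
var_f = first_order_variance([1.0, 2.0], [[1.0, 0.0], [0.0, 1.0]])
```

A Monte Carlo estimate of the same quantity would require many full nonlinear analyses; the deterministic propagation above needs only one gradient evaluation, which is the computational argument made in the abstract.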

  10. Semantic Similarity between Web Documents Using Ontology

    NASA Astrophysics Data System (ADS)

    Chahal, Poonam; Singh Tomer, Manjeet; Kumar, Suresh

    2018-06-01

    The World Wide Web is a source of information available in the form of interlinked web pages. However, extracting significant information with the assistance of a search engine is extremely difficult. This is because web information is written mainly in natural language and is addressed to individual human readers. Several efforts have been made toward semantic similarity computation between documents using words, concepts, and concept relationships, but the available results still do not meet user requirements. This paper proposes a novel technique for computing semantic similarity between documents that takes into account not only the concepts present in the documents but also the relationships between those concepts. In our approach, documents are processed by building an ontology for each document using a base ontology and a dictionary of concept records. Each such record is made up of the probable words that represent a given concept. Finally, the document ontologies are compared to find their semantic similarity by taking into account the relationships among concepts. Relevant concepts and relations between the concepts are explored by capturing author and user intention. The proposed semantic analysis technique provides improved results as compared to the existing techniques.
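One simple way to combine concept overlap with relation overlap, in the spirit of the approach described; the Jaccard measure and the equal weighting are assumptions for illustration, not the authors' exact formulation:

```python
def jaccard(a, b):
    """Set overlap: |A ∩ B| / |A ∪ B| (defined as 0 for two empty sets)."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def semantic_similarity(doc_a, doc_b, w_concepts=0.5, w_relations=0.5):
    """Weighted combination of concept-set overlap and relation-triple
    overlap between two document ontologies."""
    return (w_concepts * jaccard(doc_a["concepts"], doc_b["concepts"])
            + w_relations * jaccard(doc_a["relations"], doc_b["relations"]))

a = {"concepts": {"web", "ontology"},
     "relations": {("web", "described-by", "ontology")}}
b = {"concepts": {"web", "ontology"}, "relations": set()}
# Same concepts but no shared relations: similarity is 0.5, not 1.0,
# which is the paper's point about relations adding discriminative power.
```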

  11. Trends in data locality abstractions for HPC systems

    DOE PAGES

    Unat, Didem; Dubey, Anshu; Hoefler, Torsten; ...

    2017-05-10

    The cost of data movement has always been an important concern in high performance computing (HPC) systems. It has now become the dominant factor in terms of both energy consumption and performance. Support for expression of data locality has been explored in the past, but those efforts have had only modest success in being adopted in HPC applications for various reasons. However, with the increasing complexity of the memory hierarchy and higher parallelism in emerging HPC systems, locality management has acquired a new urgency. Developers can no longer limit themselves to low-level solutions and ignore the potential for productivity and performance portability obtained by using locality abstractions. Fortunately, the trend emerging in recent literature on the topic alleviates many of the concerns that got in the way of their adoption by application developers. Data locality abstractions are available in the forms of libraries, data structures, languages and runtime systems; a common theme is increasing productivity without sacrificing performance. Furthermore, this paper examines these trends and identifies commonalities that can combine various locality concepts to develop a comprehensive approach to expressing and managing data locality on future large-scale high-performance computing systems.

  12. Benchmark solution for vibrations from a moving point source in a tunnel embedded in a half-space

    NASA Astrophysics Data System (ADS)

    Yuan, Zonghao; Boström, Anders; Cai, Yuanqiang

    2017-01-01

    A closed-form semi-analytical solution for the vibrations due to a moving point load in a tunnel embedded in a half-space is given in this paper. The tunnel is modelled as an elastic hollow cylinder and the ground surrounding the tunnel as a linear viscoelastic material. The total wave field in the half-space with a cylindrical hole is represented by outgoing cylindrical waves and down-going plane waves. To apply the boundary conditions on the ground surface and at the tunnel-soil interface, the transformation properties between the plane and cylindrical wave functions are employed. The proposed solution can predict the ground vibration from an underground railway tunnel of circular cross-section with a reasonable computational effort and can serve as a benchmark solution for other computational methods. Numerical results for the ground vibrations on the free surface due to a moving constant load and a moving harmonic load applied at the tunnel invert are presented for different load velocities and excitation frequencies. It is found that Rayleigh waves play an important role in the ground vibrations from a shallow tunnel.

  13. Ab initio characterization of coupling strength for all types of dangling-bond pairs on the hydrogen-terminated Si(100)-2 × 1 surface

    NASA Astrophysics Data System (ADS)

    Shaterzadeh-Yazdi, Zahra; Sanders, Barry C.; DiLabio, Gino A.

    2018-04-01

    Recent work has suggested that coupled silicon dangling bonds sharing an excess electron may serve as building blocks for quantum-cellular-automata cells and quantum computing schemes when constructed on hydrogen-terminated silicon surfaces. In this work, we employ ab initio density-functional theory to examine the details associated with the coupling between two dangling bonds sharing one excess electron and arranged in various configurations on models of phosphorous-doped hydrogen-terminated silicon (100) surfaces. Our results show that the coupling strength depends strongly on the relative orientation of the dangling bonds on the surface and on the separation between them. The orientation of dangling bonds is determined by the anisotropy of the silicon (100) surface, so this feature of the surface is a significant contributing factor to variations in the strength of coupling between dangling bonds. The results demonstrate that simple models for approximating tunneling, such as the Wentzel-Kramers-Brillouin method, which do not incorporate the details of surface structure, are incapable of providing reasonable estimates of tunneling rates between dangling bonds. The results provide guidance to efforts related to the development of dangling-bond based computing elements.
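For reference, the Wentzel-Kramers-Brillouin estimate the abstract refers to is the standard one-dimensional barrier-penetration formula (textbook form, not reproduced from the paper):

$$T \approx \exp\!\left(-2\int_{x_1}^{x_2}\sqrt{\frac{2m\left[V(x)-E\right]}{\hbar^{2}}}\,dx\right),$$

where $x_1$ and $x_2$ are the classical turning points. Nothing in this expression carries information about the surface anisotropy or dangling-bond orientation, which is why the authors find it inadequate here.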

  14. Transient Approximation of SAFE-100 Heat Pipe Operation

    NASA Technical Reports Server (NTRS)

    Bragg-Sitton, Shannon M.; Reid, Robert S.

    2005-01-01

    Engineers at Los Alamos National Laboratory (LANL) have designed several heat pipe cooled reactor concepts, ranging in power from 15 kWt to 800 kWt, for both surface power systems and nuclear electric propulsion systems. The Safe, Affordable Fission Engine (SAFE) is now being developed in a collaborative effort between LANL and NASA Marshall Space Flight Center (NASA/MSFC). NASA is responsible for fabrication and testing of non-nuclear, electrically heated modules in the Early Flight Fission Test Facility (EFF-TF) at MSFC. In-core heat pipes must be properly thawed as the reactor power starts. Computational models have been developed to assess the expected operation of a specific heat pipe design during start-up, steady state operation, and shutdown. While computationally intensive codes provide complete, detailed analyses of heat pipe thaw, a relatively simple, concise routine can also be applied to approximate the response of a heat pipe to changes in the evaporator heat transfer rate during start-up and power transients (e.g., modification of reactor power level) with reasonably accurate results. This paper describes a simplified model of heat pipe start-up that extends previous work and compares the results to experimental measurements for a SAFE-100 type heat pipe design.
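A concise transient routine of the kind described can be illustrated with an explicit-Euler, lumped-capacitance energy balance; this is a generic sketch with assumed parameters, not the SAFE-100 heat pipe model itself:

```python
def lumped_transient(q_in, m_cp, h_loss, t_amb, t0, dt, n_steps):
    """Integrate m·cp · dT/dt = q_in − h_loss·(T − T_amb) with explicit
    Euler steps, approximating the thermal response to a step change in
    evaporator power (all units SI; parameters are illustrative)."""
    temps = [t0]
    for _ in range(n_steps):
        t = temps[-1]
        temps.append(t + dt * (q_in - h_loss * (t - t_amb)) / m_cp)
    return temps

# Step to 100 W with h_loss = 2 W/K and t_amb = 300 K: the temperature
# approaches the steady state t_amb + q_in/h_loss = 350 K.
history = lumped_transient(q_in=100.0, m_cp=1000.0, h_loss=2.0,
                           t_amb=300.0, t0=300.0, dt=1.0, n_steps=10000)
```

The detailed thaw codes mentioned in the abstract resolve axial conduction and phase change; a lumped balance like this trades that detail for a routine that runs in a fraction of a second.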

  15. Synthetic Earthquake Statistics From Physical Fault Models for the Lower Rhine Embayment

    NASA Astrophysics Data System (ADS)

    Brietzke, G. B.; Hainzl, S.; Zöller, G.

    2012-04-01

    As of today, seismic risk and hazard estimates mostly use purely empirical, stochastic models of earthquake fault systems tuned specifically to the vulnerable areas of interest. Although such models allow for reasonable risk estimates, they fail to provide a link between the observed seismicity and the underlying physical processes. Solving a state-of-the-art, fully dynamic set of equations for all relevant physical processes related to earthquake fault systems is likely not useful, since it comes with a large number of degrees of freedom, poor constraints on its model parameters, and a huge computational effort. Here, quasi-static and quasi-dynamic physical fault simulators provide a compromise between physical completeness and computational affordability and aim at providing a link between basic physical concepts and statistics of seismicity. Within the framework of quasi-static and quasi-dynamic earthquake simulators we investigate a model of the Lower Rhine Embayment (LRE) that is based upon seismological and geological data. We present and discuss statistics of the spatio-temporal behavior of generated synthetic earthquake catalogs with respect to simplification (e.g. simple two-fault cases) as well as to complication (e.g. hidden faults, geometric complexity, heterogeneities of constitutive parameters).

  16. Ensemble Sampling vs. Time Sampling in Molecular Dynamics Simulations of Thermal Conductivity

    DOE PAGES

    Gordiz, Kiarash; Singh, David J.; Henry, Asegun

    2015-01-29

    In this report we compare time sampling and ensemble averaging as two different methods available for phase space sampling. For the comparison, we calculate thermal conductivities of solid argon and silicon structures, using equilibrium molecular dynamics. We introduce two different schemes for the ensemble averaging approach, and show that both can reduce the total simulation time as compared to time averaging. It is also found that velocity rescaling is an efficient mechanism for phase space exploration. Although our methodology is tested using classical molecular dynamics, the ensemble generation approaches may find their greatest utility in computationally expensive simulations such as first-principles molecular dynamics. For such simulations, where each time step is costly, time sampling can require long simulation times because each time step must be evaluated sequentially and therefore phase space averaging is achieved through sequential operations. On the other hand, with ensemble averaging, phase space sampling can be achieved through parallel operations, since each ensemble is independent. For this reason, particularly when using massively parallel architectures, ensemble sampling can result in much shorter simulation times and exhibits similar overall computational effort.
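The time-versus-ensemble distinction can be mimicked with a toy average: time sampling is one long sequential trajectory, while ensemble sampling averages many short, independently seeded trajectories that could run in parallel. The "trajectory" here is just Gaussian noise standing in for an expensive MD time series:

```python
import random

def trajectory_average(seed, n_steps):
    """Sequential (time) average over one trajectory; each 'step' is a
    Gaussian sample standing in for an expensive, serial MD time step."""
    rng = random.Random(seed)
    return sum(rng.gauss(0.0, 1.0) for _ in range(n_steps)) / n_steps

def ensemble_average(seeds, n_steps):
    """Average over independent trajectories; each trajectory_average
    call is independent of the others and could run on its own node."""
    return sum(trajectory_average(s, n_steps) for s in seeds) / len(seeds)

# Four short trajectories draw as many total samples as one long one,
# but the four can be evaluated concurrently.
est = ensemble_average(seeds=[1, 2, 3, 4], n_steps=250)
```

Both estimators converge to the same mean; the ensemble form simply exposes the averaging as embarrassingly parallel work, which is the report's argument for massively parallel architectures.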

  17. Semantic Similarity between Web Documents Using Ontology

    NASA Astrophysics Data System (ADS)

    Chahal, Poonam; Singh Tomer, Manjeet; Kumar, Suresh

    2018-03-01

    The World Wide Web is the source of information available in the structure of interlinked web pages. However, the procedure of extracting significant information with the assistance of search engine is incredibly critical. This is for the reason that web information is written mainly by using natural language, and further available to individual human. Several efforts have been made in semantic similarity computation between documents using words, concepts and concepts relationship but still the outcome available are not as per the user requirements. This paper proposes a novel technique for computation of semantic similarity between documents that not only takes concepts available in documents but also relationships that are available between the concepts. In our approach documents are being processed by making ontology of the documents using base ontology and a dictionary containing concepts records. Each such record is made up of the probable words which represents a given concept. Finally, document ontology's are compared to find their semantic similarity by taking the relationships among concepts. Relevant concepts and relations between the concepts have been explored by capturing author and user intention. The proposed semantic analysis technique provides improved results as compared to the existing techniques.

  18. Predicting low-temperature free energy landscapes with flat-histogram Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Mahynski, Nathan A.; Blanco, Marco A.; Errington, Jeffrey R.; Shen, Vincent K.

    2017-02-01

    We present a method for predicting the free energy landscape of fluids at low temperatures from flat-histogram grand canonical Monte Carlo simulations performed at higher ones. We illustrate our approach for both pure and multicomponent systems using two different sampling methods as a demonstration. This allows us to predict the thermodynamic behavior of systems which undergo both first order and continuous phase transitions upon cooling using simulations performed only at higher temperatures. After surveying a variety of different systems, we identify a range of temperature differences over which the extrapolation of high temperature simulations tends to quantitatively predict the thermodynamic properties of fluids at lower ones. Beyond this range, extrapolation still provides a reasonably well-informed estimate of the free energy landscape; this prediction then requires less computational effort to refine with an additional simulation at the desired temperature than reconstruction of the surface without any initial estimate. In either case, this method significantly increases the computational efficiency of these flat-histogram methods when investigating thermodynamic properties of fluids over a wide range of temperatures. For example, we demonstrate how a binary fluid phase diagram may be quantitatively predicted for many temperatures using only information obtained from a single supercritical state.
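The extrapolation idea can be sketched as a first-order Taylor expansion of the log-probability (free energy) surface in inverse temperature; the first-order truncation and sign convention are generic flat-histogram assumptions for illustration, not the authors' full working:

```python
def extrapolate_lnpi(lnpi_high, mean_energy, beta_high, beta_low):
    """First-order extrapolation of a flat-histogram log-probability from
    beta_high to beta_low, bin by bin:
        ln Π(N; β_low) ≈ ln Π(N; β_high) − (β_low − β_high)·⟨U⟩_N,
    using the per-bin mean energies measured at the higher temperature."""
    dbeta = beta_low - beta_high
    return [lp - dbeta * u for lp, u in zip(lnpi_high, mean_energy)]

# Surface and per-bin mean energies from a high-T run, extrapolated to a
# lower temperature (larger β); values are purely illustrative.
lnpi_low = extrapolate_lnpi([0.0, -1.0], [2.0, 3.0],
                            beta_high=1.0, beta_low=1.1)
```

As the abstract notes, such an extrapolated surface may only be approximate at large temperature differences, but it is a far cheaper starting point for refinement than reconstructing the surface from scratch.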

  19. Some People Should Be Afraid of Computers.

    ERIC Educational Resources Information Center

    Rubin, Charles

    1983-01-01

    Discusses the "computerphobia" phenomenon, separating the valid reasons for some individuals' anxiety about computers from their irrational fears. Among the factors examined are fear of breaking the computer, use of unclear documentation, lack of time for learning how to use the computer, and lack of computer knowledge. (JN)

  20. Hypersonic Experimental and Computational Capability, Improvement and Validation. Volume 2

    NASA Technical Reports Server (NTRS)

    Muylaert, Jean (Editor); Kumar, Ajay (Editor); Dujarric, Christian (Editor)

    1998-01-01

    The results of the phase 2 effort conducted under AGARD Working Group 18 on Hypersonic Experimental and Computational Capability, Improvement and Validation are presented in this report. The first volume, published in May 1996, mainly focused on the design methodology, plans and some initial results of experiments that had been conducted to serve as validation benchmarks. The current volume presents the detailed experimental and computational data base developed during this effort.

  1. Earth Science Technology Office's Computational Technologies Project

    NASA Technical Reports Server (NTRS)

    Fischer, James (Technical Monitor); Merkey, Phillip

    2005-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies (CT) Project and to engage the Beowulf cluster computing community as well as the high-performance computing research community, so that we could predict the applicability of these technologies to the scientific community represented by the CT Project and formulate long-term strategies to provide the computational resources necessary to attain its anticipated scientific objectives. Specifically, the goal of the evaluation effort was to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements, and the capability of high-performance computers to satisfy this anticipated need.

  2. Interesting viewpoints to those who will put Ada into practice

    NASA Technical Reports Server (NTRS)

    Carlsson, Arne

    1986-01-01

    Ada will most probably be used as the programming language for computers in the NASA Space Station. It is reasonable to suppose that Ada will be used for at least the embedded computers, because the high software costs for these embedded computers were the reason why Ada activities were initiated about ten years ago. The on-board computers are designed for use in space applications, where maintenance by man is impossible. All manipulation of such computers has to be performed autonomously or remotely with commands from the ground. In a manned Space Station some maintenance work can be performed by service people on board, but there are still many applications that require autonomous computers, for example, vital Space Station functions and unmanned orbital transfer vehicles. Those aspects that have emerged from the analysis of Ada characteristics, together with experience of the requirements for embedded on-board computers in space applications, are examined.

  3. Characterizing and modeling organic binder burnout from green ceramic compacts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ewsuk, K.G.; Cesarano, J. III; Cochran, R.J.

    New characterization and computational techniques have been developed to evaluate and simulate binder burnout from pressed powder compacts. Using engineering data and a control volume finite element method (CVFEM) thermal model, a nominally one-dimensional (1-D) furnace has been designed to test, refine, and validate computer models that simulate binder burnout assuming a 1-D thermal gradient across the ceramic body during heating. Experimentally, 1-D radial heat flow was achieved using a rod-shaped heater that directly heats the inside surface of a stack of ceramic annuli surrounded by thermal insulation. The computational modeling effort focused on producing a macroscopic model for binder burnout based on continuum approaches to heat and mass conservation for porous media. Two increasingly complex models have been developed that predict the temperature and mass of a porous powder compact as a function of time during binder burnout. The more complex model also predicts the pressure within a powder compact during binder burnout. Model predictions are in reasonably good agreement with experimental data on binder burnout from a 57-65% relative density pressed powder compact of a 94 wt% alumina body containing approximately 3 wt% binder. In conjunction with the detailed experimental data from the prototype binder burnout furnace, the models have also proven useful for conducting parametric studies to elucidate critical material property data required to support model development.

  4. Oak Ridge Institutional Cluster Autotune Test Drive Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jibonananda, Sanyal; New, Joshua Ryan

    2014-02-01

    The Oak Ridge Institutional Cluster (OIC) provides general purpose computational resources for ORNL staff to run computation-heavy jobs that are larger than desktop applications but do not quite require the scale and power of the Oak Ridge Leadership Computing Facility (OLCF). This report details the efforts made and conclusions derived in performing a short test drive of the cluster resources on Phase 5 of the OIC. EnergyPlus was used in the analysis as a candidate user program, and the overall software environment was evaluated against anticipated challenges experienced with resources such as the shared-memory Nautilus (JICS) and Titan (OLCF). The OIC performed within reason and was found to be acceptable in the context of running EnergyPlus simulations. The number of cores per node and the availability of scratch space per node allow non-traditional, desktop-focused applications to leverage parallel ensemble execution. Although only individual runs of EnergyPlus were executed, the software environment on the OIC appeared suitable to run ensemble simulations with some modifications to the Autotune workflow. From a standpoint of general usability, the system supports common Linux libraries, compilers, standard job scheduling software (Torque/Moab), and the OpenMPI library (the only MPI library) for MPI communications. The file system is a Panasas file system, which the literature indicates is efficient.

  5. Advanced Variance Reduction Strategies for Optimizing Mesh Tallies in MAVRIC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peplow, Douglas E.; Blakeman, Edward D; Wagner, John C

    2007-01-01

    More often than in the past, Monte Carlo methods are being used to compute fluxes or doses over large areas using mesh tallies (a set of region tallies defined on a mesh that overlays the geometry). For problems that demand that the uncertainty in each mesh cell be less than some set maximum, computation time is controlled by the cell with the largest uncertainty. This issue becomes quite troublesome in deep-penetration problems, and advanced variance reduction techniques are required to obtain reasonable uncertainties over large areas. The CADIS (Consistent Adjoint Driven Importance Sampling) methodology has been shown to very efficiently optimize the calculation of a response (flux or dose) for a single point or a small region using weight windows and a biased source based on the adjoint of that response. This has been incorporated into codes such as ADVANTG (based on MCNP) and the new sequence MAVRIC, which will be available in the next release of SCALE. In an effort to compute lower uncertainties everywhere in the problem, Larsen's group has also developed several methods to help distribute particles more evenly, based on forward estimates of flux. This paper focuses on the use of a forward estimate to weight the placement of the source in the adjoint calculation used by CADIS, which we refer to as forward-weighted CADIS (FW-CADIS).
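
The two ideas in this abstract reduce to simple proportionalities, sketched below in a hypothetical one-dimensional form (illustrative only; the production implementations in ADVANTG and MAVRIC operate on full space-energy meshes): CADIS sets the weight-window target in each cell inversely proportional to the adjoint flux, and FW-CADIS weights the adjoint source inversely to a cheap forward flux estimate so that deep-penetration regions are not starved of particles.

```python
def cadis_weight_targets(adjoint_flux, response):
    """CADIS (schematic): target particle weight per cell, w = R / phi_dagger.
    Cells important to the response (large adjoint flux) get low target
    weights, so particles entering them are split; unimportant cells get
    high targets and particles there are rouletted."""
    return [response / phi for phi in adjoint_flux]

def fw_cadis_adjoint_source(forward_flux_estimate):
    """FW-CADIS (schematic): adjoint source strength per cell proportional
    to 1 / (forward flux), so low-flux deep-penetration cells become
    important and end up with comparable tally uncertainty."""
    return [1.0 / phi for phi in forward_flux_estimate]
```

Used together, a fast deterministic forward solve feeds fw_cadis_adjoint_source, the resulting adjoint solve feeds cadis_weight_targets, and the Monte Carlo run then distributes its effort roughly evenly over the mesh tally.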

  6. Computer-Game Construction: A Gender-Neutral Attractor to Computing Science

    ERIC Educational Resources Information Center

    Carbonaro, Mike; Szafron, Duane; Cutumisu, Maria; Schaeffer, Jonathan

    2010-01-01

    Enrollment in Computing Science university programs is at a dangerously low level. A major reason for this is the general lack of interest in Computing Science by females. In this paper, we discuss our experience with using a computer game construction environment as a vehicle to encourage female participation in Computing Science. Experiments…

  7. Logic Feels so Good--I Like It! Evidence for Intuitive Detection of Logicality in Syllogistic Reasoning

    ERIC Educational Resources Information Center

    Morsanyi, Kinga; Handley, Simon J.

    2012-01-01

    When people evaluate syllogisms, their judgments of validity are often biased by the believability of the conclusions of the problems. Thus, it has been suggested that syllogistic reasoning performance is based on an interplay between a conscious and effortful evaluation of logicality and an intuitive appreciation of the believability of the…

  8. Using Performance Tasks to Improve Quantitative Reasoning in an Introductory Mathematics Course

    ERIC Educational Resources Information Center

    Kruse, Gerald; Drews, David

    2013-01-01

    A full-cycle assessment of our efforts to improve quantitative reasoning in an introductory math course is described. Our initial iteration substituted more open-ended performance tasks for the active learning projects than had been used. Using a quasi-experimental design, we compared multiple sections of the same course and found non-significant…

  9. Do Motives Matter?: Nonmedical Use of Prescription Medications among Adolescents

    ERIC Educational Resources Information Center

    McCabe, Sean Esteban; Boyd, Carol J.

    2012-01-01

    Adolescents' motives for engaging in nonmedical prescription drug use are somewhat different from their reasons for using other drugs, such as marijuana. For some youth, nonmedical prescription drug use is an attempt to self-treat a medical condition; for others it is an effort to get high; and some youth misuse prescription drugs for both reasons.…

  10. A Framework for Understanding and Cultivating the Transition from Arithmetic to Algebraic Reasoning

    ERIC Educational Resources Information Center

    Nathan, Mitchell J.; Koellner, Karen

    2007-01-01

    Algebraic reasoning stands as a formidable gatekeeper for students in their efforts to progress in mathematics and science, and to obtain economic opportunities (Ladson-Billings, 1998; RAND, 2003). Currently, mathematics education research has focused on algebra in order to provide access and opportunities for more students. There is now a growing…

  11. Computational Methods for Stability and Control (COMSAC): The Time Has Come

    NASA Technical Reports Server (NTRS)

    Hall, Robert M.; Biedron, Robert T.; Ball, Douglas N.; Bogue, David R.; Chung, James; Green, Bradford E.; Grismer, Matthew J.; Brooks, Gregory P.; Chambers, Joseph R.

    2005-01-01

    Powerful computational fluid dynamics (CFD) tools have emerged that appear to offer significant benefits as an adjunct to the experimental methods used by the stability and control community to predict aerodynamic parameters. The decreasing cost and increasing availability of computing hours are making these applications increasingly viable. This paper summarizes the efforts of four organizations to utilize high-end CFD tools to address the challenges of the stability and control arena. General motivation and the backdrop for these efforts will be summarized, as well as examples of current applications.

  12. Common world model for unmanned systems

    NASA Astrophysics Data System (ADS)

    Dean, Robert Michael S.

    2013-05-01

    The Robotic Collaborative Technology Alliance (RCTA) seeks to provide adaptive robot capabilities which move beyond traditional metric algorithms to include cognitive capabilities. Key to this effort is the Common World Model, which moves beyond the state of the art by representing the world using metric, semantic, and symbolic information. It joins these layers of information to define objects in the world. These objects may be reasoned upon jointly using traditional geometric algorithms, symbolic cognitive algorithms, and new computational nodes formed by combining these disciplines. The Common World Model must understand how these objects relate to each other. Our world model includes the concept of Self-Information about the robot. By encoding current capability, component status, task execution state, and histories, we track information which enables the robot to reason and adapt its performance using Meta-Cognition and Machine Learning principles. The world model includes models of how aspects of the environment behave, which enable prediction of future world states. To manage complexity, we adopted a phased implementation approach to the world model. We discuss the design of "Phase 1" of this world model and its interfaces by tracing perception data through the system from the source to the meta-cognitive layers provided by ACT-R and SS-RICS. We close with lessons learned from implementation and how the design relates to Open Architecture.

  13. Associations of students' self-reports of their teachers' verbal aggression, intrinsic motivation, and perceptions of reasons for discipline in Greek physical education classes.

    PubMed

    Bekiari, Alexandra; Kokaridas, Dimitrios; Sakellariou, Kimon

    2006-04-01

    This study examined associations among physical education teachers' verbal aggressiveness, as perceived by students, and students' intrinsic motivation and reasons for discipline. The sample consisted of 265 Greek adolescent students who completed four questionnaires during physical education classes: the Verbal Aggressiveness Scale, the Lesson Satisfaction Scale, the Reasons for Discipline Scale, and the Intrinsic Motivation Inventory. Analysis indicated significant positive correlations of students' perceptions of teachers' verbal aggressiveness with pressure/tension, external reasons, introjected reasons, no reasons, and self-responsibility. Significant negative correlations were noted for students' perceptions of teachers' verbal aggression with lesson satisfaction, enjoyment/interest, competence, effort/importance, intrinsic reasons, and caring. Differences between the two sexes were observed in their perceptions of teachers' verbal aggressiveness, intrinsic motivation, and reasons for discipline. Findings and implications for teachers' type of communication are discussed and suggestions for research made.

  14. A compendium of computational fluid dynamics at the Langley Research Center

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Through numerous summary examples, the scope and general nature of the computational fluid dynamics (CFD) effort at Langley is identified. These summaries will help inform researchers in CFD and line management at Langley of the overall effort. In addition to the in-house efforts, out-of-house CFD work supported by Langley through industrial contracts and university grants is included. Researchers were encouraged to include summaries of work in preliminary and tentative stages of development as well as current research approaching definitive results.

  15. Motivational Beliefs, Student Effort, and Feedback Behaviour in Computer-Based Formative Assessment

    ERIC Educational Resources Information Center

    Timmers, Caroline F.; Braber-van den Broek, Jannie; van den Berg, Stephanie M.

    2013-01-01

    Feedback can only be effective when students seek feedback and process it. This study examines the relations between students' motivational beliefs, effort invested in a computer-based formative assessment, and feedback behaviour. Feedback behaviour is represented by whether a student seeks feedback and the time a student spends studying the…

  16. Establishing a K-12 Circuit Design Program

    ERIC Educational Resources Information Center

    Inceoglu, Mustafa M.

    2010-01-01

    Outreach, as defined by Wikipedia, is an effort by an organization or group to connect its ideas or practices to the efforts of other organizations, groups, specific audiences, or the general public. This paper describes a computer engineering outreach project of the Department of Computer Engineering at Ege University, Izmir, Turkey, to a local…

  17. Identifying Predictors of Achievement in the Newly Defined Information Literacy: A Neural Network Analysis

    ERIC Educational Resources Information Center

    Sexton, Randall; Hignite, Michael; Margavio, Thomas M.; Margavio, Geanie W.

    2009-01-01

    Information Literacy is a concept that evolved as a result of efforts to move technology-based instructional and research efforts beyond the concepts previously associated with "computer literacy." While computer literacy was largely a topic devoted to knowledge of hardware and software, information literacy is concerned with students' abilities…

  18. Determining the Alignment between What Teachers Are Expected to Teach, What They Know, and How They Assess Scientific Literacy

    ERIC Educational Resources Information Center

    Pitot, Lisa Noel

    2014-01-01

    Science education reform efforts have highlighted the need for a scientifically literate citizen, capable of using their scientific knowledge and skills for reasoning, argumentation, and decision-making. Yet little is known about secondary science teachers' understandings of these reform efforts, specifically their knowledge, skills, and abilities…

  19. Predicting reasoning from memory.

    PubMed

    Heit, Evan; Hayes, Brett K

    2011-02-01

    In an effort to assess the relations between reasoning and memory, in 8 experiments, the authors examined how well responses on an inductive reasoning task are predicted from responses on a recognition memory task for the same picture stimuli. Across several experimental manipulations, such as varying study time, presentation frequency, and the presence of stimuli from other categories, there was a high correlation between reasoning and memory responses (average r = .87), and these manipulations showed similar effects on the 2 tasks. The results point to common mechanisms underlying inductive reasoning and recognition memory abilities. A mathematical model, GEN-EX (generalization from examples), derived from exemplar models of categorization, is presented, which predicts both reasoning and memory responses from pairwise similarities among the stimuli, allowing for additional influences of subtyping and deterministic responding. (c) 2010 APA, all rights reserved.
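
The summed-similarity rule from exemplar models, which the abstract says GEN-EX builds on, is easy to state concretely. The sketch below is a generic exemplar computation (an illustration, not the authors' fitted model): both a recognition judgment and an induction judgment for a probe are driven by its summed, exponentially decaying similarity to the studied items, which is one way a single underlying quantity can produce the high reasoning-memory correlation reported.

```python
import math

def summed_similarity(probe, study_items, c=1.0):
    """Generic exemplar-model quantity: total similarity of a probe to all
    studied items, with similarity decaying exponentially in distance.
    Items are represented here as points on a single stimulus dimension."""
    return sum(math.exp(-c * abs(probe - item)) for item in study_items)

def recognition(probe, study_items, threshold=1.0, c=1.0):
    """Call a probe 'old' when its summed similarity clears a criterion."""
    return summed_similarity(probe, study_items, c) >= threshold

def induction_strength(probe, study_items, c=1.0):
    """Let property generalization track the same summed similarity,
    so reasoning and memory responses covary by construction."""
    return summed_similarity(probe, study_items, c)
```

The abstract's additional influences (subtyping, deterministic responding) would sit on top of this core quantity rather than replace it.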

  20. Reasoning on the Autism Spectrum: A Dual Process Theory Account.

    PubMed

    Brosnan, Mark; Lewton, Marcus; Ashwin, Chris

    2016-06-01

    Dual process theory proposes two distinct reasoning processes in humans, an intuitive style that is rapid and automatic and a deliberative style that is more effortful. However, no study to date has specifically examined these reasoning styles in relation to the autism spectrum. The present studies investigated deliberative and intuitive reasoning profiles in: (1) a non-clinical sample from the general population with varying degrees of autism traits (n = 95), and (2) males diagnosed with ASD (n = 17) versus comparisons (n = 18). Taken together, the results suggest reasoning on the autism spectrum is compatible with the processes proposed by Dual Process Theory and that higher autism traits and ASD are characterised by a consistent bias towards deliberative reasoning (and potentially away from intuition).

  1. A Case against Computer Symbolic Manipulation in School Mathematics Today.

    ERIC Educational Resources Information Center

    Waits, Bert K.; Demana, Franklin

    1992-01-01

    Two reasons discouraging the use of computer symbolic manipulation systems in school mathematics at present are presented: the cost of computer laboratories or expensive pocket computers, and the impracticality of exact solution representations. Although development with this technology in mathematics education advances, graphing calculators are recommended to…

  2. 77 FR 58576 - Certain Wireless Communication Devices, Portable Music and Data Processing Devices, Computers...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-21

    ... Devices, Portable Music and Data Processing Devices, Computers, and Components Thereof; Institution of... communication devices, portable music and data processing devices, computers, and components thereof by reason... certain wireless communication devices, portable music and data processing devices, computers, and...

  3. Are human beings humean robots?

    NASA Astrophysics Data System (ADS)

    Génova, Gonzalo; Quintanilla Navarro, Ignacio

    2018-01-01

    David Hume, the Scottish philosopher, conceives reason as the slave of the passions, which implies that human reason has predetermined objectives it cannot question. An essential element of an algorithm running on a computational machine (or Logical Computing Machine, as Alan Turing calls it) is its having a predetermined purpose: an algorithm cannot question its purpose, because it would cease to be an algorithm. Therefore, if self-determination is essential to human intelligence, then human beings are neither Humean beings nor computational machines. We also examine some objections to the Turing Test as a model for understanding human intelligence.

  4. Learning mechanisms in pain chronification--teachings from placebo research.

    PubMed

    Ingvar, Martin

    2015-04-01

    This review presents a general model for the understanding of pain, placebo, and chronification of pain in the framework of cognitive neuroscience. The concept of a computational cost-function underlying the functional imaging responses to placebo manipulations is put forward and demonstrated to be compatible with the placebo literature, including data that demonstrate that placebo responses as seen on the behavioural level may be elicited on all levels of the neuroaxis. In the same vein, chronification of pain is discussed as a consequence of brain mechanisms for learning and expectation. Further studies are necessary on the reversal of chronic pain given the weak effects of treatment, but also due to alarming findings that suggest morphological changes in the brain's pain regulatory systems concurrent with the chronification process. The burden of chronic pain is devastating at both the individual and societal levels and affects more than one-quarter of the world's population. Women are greatly overrepresented among patients with chronic pain. Hence, both from a general standpoint and for reasons of health equity, it is essential to advance research and care efforts. Success in these efforts will only be granted with better theoretical concepts of chronic pain mechanisms that map into the framework of cognitive neuroscience.

  5. Learning mechanisms in pain chronification—teachings from placebo research

    PubMed Central

    2015-01-01

    Abstract This review presents a general model for the understanding of pain, placebo, and chronification of pain in the framework of cognitive neuroscience. The concept of a computational cost-function underlying the functional imaging responses to placebo manipulations is put forward and demonstrated to be compatible with the placebo literature, including data that demonstrate that placebo responses as seen on the behavioural level may be elicited on all levels of the neuroaxis. In the same vein, chronification of pain is discussed as a consequence of brain mechanisms for learning and expectation. Further studies are necessary on the reversal of chronic pain given the weak effects of treatment, but also due to alarming findings that suggest morphological changes in the brain's pain regulatory systems concurrent with the chronification process. The burden of chronic pain is devastating at both the individual and societal levels and affects more than one-quarter of the world's population. Women are greatly overrepresented among patients with chronic pain. Hence, both from a general standpoint and for reasons of health equity, it is essential to advance research and care efforts. Success in these efforts will only be granted with better theoretical concepts of chronic pain mechanisms that map into the framework of cognitive neuroscience. PMID:25789431

  6. Exploratory procedures with carbon nanotube-based sensors for propellant degradation determinations

    NASA Astrophysics Data System (ADS)

    Ruffin, Paul B.; Edwards, Eugene; Brantley, Christina; McDonald, Brian

    2010-04-01

    Exploratory research is conducted at the US Army Aviation & Missile Research, Development, and Engineering Center (AMRDEC) to assess the degradation of solid propellant used in rocket motors. Efforts are made to discontinue and/or minimize destructive methods and to utilize nondestructive techniques to assure the quality and reliability of the weaponry's propulsion system. Collaborative efforts were successfully made between AMRDEC and NASA-Ames on potential add-on configurations to a previously designed sensor that AMRDEC plans to use for preliminary detection of off-gassing. Evaluations were made in order to use the design as the introductory component for determining the shelf-life degradation rate of rocket motors. Previous and subsequent sensor designs utilize functionalized single-walled carbon nanotubes (SWCNTs) as the key sensing element. On-going research considers key changes that can be implemented in the existing sensor design such that a complete wireless sensor system can be realized. The result should be a cost-saving and timely approach to enhance the Army's ability to develop methodologies for measuring weaponry off-gassing and simultaneously detecting explosives. The resulting sensors are expected to enhance the warfighters' ability to simultaneously detect a greater variety of analytes. Outlined in this paper are the preliminary results that have been accomplished for this research. The behavior of the SWCNT sensor at storage temperatures is outlined, along with the initial sensor response to propellant-related analytes. Preparatory computer-based programming routines and computer-controlled instrumentation scenarios have been developed to minimize subjective interpretation of test results and to provide a means for obtaining data that is reasonable and repeatably quantitative. Typical laboratory evaluation methods are likewise presented, and program limitations/barriers are outlined.

  7. Computer analysis of digital sky surveys using citizen science and manual classification

    NASA Astrophysics Data System (ADS)

    Kuminski, Evan; Shamir, Lior

    2015-01-01

    As current and future digital sky surveys such as SDSS, LSST, DES, Pan-STARRS and Gaia create increasingly massive databases containing millions of galaxies, there is a growing need to be able to efficiently analyze these data. An effective way to do this is through manual analysis; however, this may be insufficient considering the extremely vast pipelines of astronomical images generated by present and future surveys. Some efforts have been made to use citizen science to classify galaxies by their morphology on a larger scale than individual or small groups of scientists can. While citizen science efforts such as Zooniverse have helped obtain reasonably accurate morphological information about large numbers of galaxies, they cannot scale to provide complete analysis of the billions of galaxy images that will be collected by future ventures such as LSST. Since current forms of manual classification cannot scale to the masses of data collected by digital sky surveys, it is clear that some form of automation of the data analysis will be required to keep up with the growing databases, working either independently or in combination with human analysis such as citizen science. Here we describe a computer vision method that can automatically analyze galaxy images and deduce galaxy morphology. Experiments using Galaxy Zoo 2 data show that the performance of the method increases as the degree of agreement between the citizen scientists gets higher, providing a cleaner dataset. For several morphological features, such as the spirality of the galaxy, the algorithm agreed with the citizen scientists on around 95% of the samples. However, the method failed to analyze some morphological features, such as the number of spiral arms, providing an accuracy of just ~36%.

  8. Impact of remote sensing upon the planning, management, and development of water resources

    NASA Technical Reports Server (NTRS)

    Loats, H. L.; Fowler, T. R.; Frech, S. L.

    1974-01-01

    A survey of the principal water resource users was conducted to determine the impact of new remote data streams on hydrologic computer models. The analysis of the responses and direct contact demonstrated that: (1) the majority of water resource effort of the type suitable to remote sensing inputs is conducted by major federal water resources agencies or through federally stimulated research, (2) the federal government develops most of the hydrologic models used in this effort; and (3) federal computer power is extensive. The computers, computer power, and hydrologic models in current use were determined.

  9. The Rayleigh curve as a model for effort distribution over the life of medium scale software systems. M.S. Thesis - Maryland Univ.

    NASA Technical Reports Server (NTRS)

    Picasso, G. O.; Basili, V. R.

    1982-01-01

    It is noted that previous investigations into the applicability of the Rayleigh curve model to medium scale software development efforts have met with mixed results. The results of these investigations are confirmed by analyses of runs and smoothing. The reasons for the model's failure are found in the subcycle effort data. There are four contributing factors: the uniqueness of the environment studied, the influence of holidays, varying management techniques, and differences in the data studied.

  10. An opportunity cost model of subjective effort and task performance

    PubMed Central

    Kurzban, Robert; Duckworth, Angela; Kable, Joseph W.; Myers, Justus

    2013-01-01

    Why does performing certain tasks cause the aversive experience of mental effort and concomitant deterioration in task performance? One explanation posits a physical resource that is depleted over time. We propose an alternate explanation that centers on mental representations of the costs and benefits associated with task performance. Specifically, certain computational mechanisms, especially those associated with executive function, can be deployed for only a limited number of simultaneous tasks at any given moment. Consequently, the deployment of these computational mechanisms carries an opportunity cost – that is, the next-best use to which these systems might be put. We argue that the phenomenology of effort can be understood as the felt output of these cost/benefit computations. In turn, the subjective experience of effort motivates reduced deployment of these computational mechanisms in the service of the present task. These opportunity cost representations, then, together with other cost/benefit calculations, determine effort expended and, everything else equal, result in performance reductions. In making our case for this position, we review alternate explanations both for the phenomenology of effort associated with these tasks and for performance reductions over time. Likewise, we review the broad range of relevant empirical results from across subdisciplines, especially psychology and neuroscience. We hope that our proposal will help to build links among the diverse fields that have been addressing similar questions from different perspectives, and we emphasize ways in which alternate models might be empirically distinguished. PMID:24304775
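
The abstract's central claim, that felt effort is the computed opportunity cost of occupying limited executive machinery, can be made concrete with a toy calculation. This sketch is an illustration only (the authors present a verbal model, not code): the effort signal for the current task is taken to be the value of the next-best task the same mechanisms could be serving, and persistence depends on whether the current task's benefit still exceeds that cost.

```python
def felt_effort(task_values, current_task):
    """Toy opportunity-cost signal: the value of the best alternative use
    of the shared (executive) machinery while current_task occupies it."""
    alternatives = [v for t, v in task_values.items() if t != current_task]
    return max(alternatives, default=0.0)

def keep_working(task_values, current_task):
    """Continue only while the current task's benefit meets or exceeds its
    opportunity cost; otherwise the effort signal motivates withdrawing
    the machinery, and measured performance drops."""
    return task_values[current_task] >= felt_effort(task_values, current_task)
```

Note how this captures the paper's contrast with resource-depletion accounts: nothing is consumed over time; only the cost/benefit comparison changes as alternative task values shift.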

  11. An efficient method for hybrid density functional calculation with spin-orbit coupling

    NASA Astrophysics Data System (ADS)

    Wang, Maoyuan; Liu, Gui-Bin; Guo, Hong; Yao, Yugui

    2018-03-01

    In first-principles calculations, hybrid functionals are often used to improve accuracy over local exchange-correlation functionals. A drawback is that evaluating the hybrid functional requires significantly more computing effort. When spin-orbit coupling (SOC) is taken into account, the non-collinear spin structure increases the computing effort by at least a factor of eight. As a result, hybrid functional calculations with SOC are intractable in most cases. In this paper, we present an approximate solution to this problem by developing an efficient method based on a mixed linear combination of atomic orbitals (LCAO) scheme. We demonstrate the power of this method using several examples and show that the results compare very well with those of direct hybrid functional calculations with SOC, yet the method requires only a computing effort similar to that without SOC. The presented technique provides a good balance between computing efficiency and accuracy, and it can be extended to magnetic materials.

  12. Computational Models of Anterior Cingulate Cortex: At the Crossroads between Prediction and Effort.

    PubMed

    Vassena, Eliana; Holroyd, Clay B; Alexander, William H

    2017-01-01

    In the last two decades the anterior cingulate cortex (ACC) has become one of the most investigated areas of the brain. Extensive neuroimaging evidence suggests countless functions for this region, ranging from conflict and error coding, to social cognition, pain and effortful control. In response to this burgeoning amount of data, a proliferation of computational models has tried to characterize the neurocognitive architecture of ACC. Early seminal models provided a computational explanation for a relatively circumscribed set of empirical findings, mainly accounting for EEG and fMRI evidence. More recent models have focused on ACC's contribution to effortful control. In parallel to these developments, several proposals attempted to explain within a single computational framework a wider variety of empirical findings that span different cognitive processes and experimental modalities. Here we critically evaluate these modeling attempts, highlighting the continued need to reconcile the array of disparate ACC observations within a coherent, unifying framework.

  13. Building Citywide Systems for Quality: A Guide and Case Studies for Afterschool Leaders

    ERIC Educational Resources Information Center

    Yohalem, Nicole; Devaney, Elizabeth; Smith, Charles; Wilson-Ahlstrom, Alicia

    2012-01-01

    A quality improvement system (QIS) is an intentional effort to raise the quality of afterschool programming in an ongoing, organized fashion. There are a number of reasons the QIS is gaining popularity. The main reason community leaders are drawn to improving quality is that they know that 1) higher-quality programs will mean better experiences…

  14. Prediction and causal reasoning in planning

    NASA Technical Reports Server (NTRS)

    Dean, T.; Boddy, M.

    1987-01-01

    Nonlinear planners are often touted as having an efficiency advantage over linear planners. The reason usually given is that nonlinear planners, unlike their linear counterparts, are not forced to make arbitrary commitments to the order in which actions are to be performed. This ability to delay commitment enables nonlinear planners to solve certain problems with far less effort than would be required of linear planners. Here, it is argued that this advantage is bought with a significant reduction in the ability of a nonlinear planner to accurately predict the consequences of actions. Unfortunately, the general problem of predicting the consequences of a partially ordered set of actions is intractable. In gaining the predictive power of linear planners, nonlinear planners sacrifice their efficiency advantage. There are, however, other advantages to nonlinear planning (e.g., the ability to reason about partial orders and incomplete information) that make it well worth the effort needed to extend nonlinear methods. A framework is supplied for causal inference that supports reasoning about partially ordered events and actions whose effects depend upon the context in which they are executed. As an alternative to a complete but potentially exponential-time algorithm, researchers provide a provably sound polynomial-time algorithm for predicting the consequences of partially ordered events.
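    The sound polynomial-time prediction algorithm referenced above is not reproduced in this record, but the flavor of sound-but-incomplete prediction over a partially ordered event set can be sketched as follows. This is a generic illustration, not the authors' algorithm: a fact is judged to necessarily hold at a query point only if some event establishing it must precede the query and no event deleting it can possibly fall in between.

```python
from itertools import product

def transitive_closure(nodes, edges):
    """Compute the 'necessarily before' relation from ordering constraints."""
    nec_before = set(edges)
    changed = True
    while changed:
        changed = False
        for u, v, w in product(nodes, repeat=3):
            if (u, v) in nec_before and (v, w) in nec_before \
                    and (u, w) not in nec_before:
                nec_before.add((u, w))
                changed = True
    return nec_before

def necessarily_holds(fact, query, adds, dels, nec_before):
    """Sound but incomplete: True only if `fact` must hold when `query` occurs."""
    for e in adds:
        if fact in adds[e] and (e, query) in nec_before:
            # An establisher e necessarily precedes the query; check that
            # no deleter can possibly fall between e and the query.
            clobbered = any(
                fact in dels.get(c, set()) and c != e
                and (c, e) not in nec_before       # c may come after e
                and (query, c) not in nec_before   # c may come before query
                for c in dels
            )
            if not clobbered:
                return True
    return False
```

    Soundness comes at the price of completeness: an unordered deleter forces a "don't know" (False) answer even when some linearizations would preserve the fact, mirroring the trade-off between predictive power and efficiency discussed in the abstract.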

  15. The logic-bias effect: The role of effortful processing in the resolution of belief-logic conflict.

    PubMed

    Howarth, Stephanie; Handley, Simon J; Walsh, Clare

    2016-02-01

    According to the default interventionist dual-process account of reasoning, belief-based responses to reasoning tasks are based on Type 1 processes generated by default, which must be inhibited in order to produce an effortful, Type 2 output based on the validity of an argument. However, recent research has indicated that reasoning on the basis of beliefs may not be as fast and automatic as this account claims. In three experiments, we presented participants with a reasoning task that was to be completed while they were generating random numbers (RNG). We used the novel methodology introduced by Handley, Newstead & Trippas (Journal of Experimental Psychology: Learning, Memory, and Cognition, 37, 28-43, 2011), which required participants to make judgments based upon either the validity of a conditional argument or the believability of its conclusion. The results showed that belief-based judgments produced lower rates of accuracy overall and were influenced to a greater extent than validity judgments by the presence of a conflict between belief and logic for both simple and complex arguments. These findings were replicated in Experiment 3, in which we controlled for switching demands in a blocked design. Across all three experiments, we found a main effect of RNG, implying that both instructional sets require some effortful processing. However, in the blocked design RNG had its greatest impact on logic judgments, suggesting that distinct executive resources may be required for each type of judgment. We discuss the implications of our findings for the default interventionist account and offer a parallel competitive model as an alternative interpretation for our findings.

  16. Autobiographical reasoning: arguing and narrating from a biographical perspective.

    PubMed

    Habermas, Tilmann

    2011-01-01

    Autobiographical reasoning is the activity of creating relations between different parts of one's past, present, and future life and one's personality and development. It embeds personal memories in a culturally, temporally, causally, and thematically coherent life story. Prototypical autobiographical arguments are presented. Culture and socializing interactions shape the development of autobiographical reasoning especially in late childhood and adolescence. Situated at the intersection of cognitive and narrative development and autobiographical memory, autobiographical reasoning contributes to the development of personality and identity, is instrumental in efforts to cope with life events, and helps to create a shared history. Copyright © 2011 Wiley Periodicals, Inc., A Wiley Company.

  17. An Investigation of the Flow Physics of Acoustic Liners by Direct Numerical Simulation

    NASA Technical Reports Server (NTRS)

    Watson, Willie R. (Technical Monitor); Tam, Christopher

    2004-01-01

    This report describes the effort and status of work on the three-dimensional (3-D) simulation of a multi-hole resonator in an impedance tube. This work is coordinated with a parallel experimental effort to be carried out at the NASA Langley Research Center. The outline of this report is as follows: 1. Preliminary consideration. 2. Computation model. 3. Mesh design and parallel computing. 4. Visualization. 5. Status of computer code development.

  18. Professional Computer Education Organizations--A Resource for Administrators.

    ERIC Educational Resources Information Center

    Ricketts, Dick

    Professional computer education organizations serve a valuable function by generating, collecting, and disseminating information concerning the role of the computer in education. This report touches briefly on the reasons for the rapid and successful development of professional computer education organizations. A number of attributes of effective…

  19. Earth and Space Sciences Project Services for NASA HPCC

    NASA Technical Reports Server (NTRS)

    Merkey, Phillip

    2002-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies (CT) Project and to engage the Beowulf cluster computing community as well as the high-performance computing research community, so that we can predict the applicability of these technologies to the scientific community represented by the CT Project and formulate long-term strategies to provide the computational resources necessary to attain the Project's anticipated scientific objectives. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements, and the capabilities of high-performance computers to satisfy this anticipated need.

  20. Automated Estimation Of Software-Development Costs

    NASA Technical Reports Server (NTRS)

    Roush, George B.; Reini, William

    1993-01-01

    COSTMODL is automated software development-estimation tool. Yields significant reduction in risk of cost overruns and failed projects. Accepts description of software product developed and computes estimates of effort required to produce it, calendar schedule required, and distribution of effort and staffing as function of defined set of development life-cycle phases. Written for IBM PC(R)-compatible computers.
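    The internals of COSTMODL are not given in this record, but the general shape of such effort/schedule estimators can be illustrated with the basic COCOMO equations. The coefficients below are the published "organic mode" constants, used here as a hedged stand-in, not necessarily the model COSTMODL implements:

```python
def cocomo_basic(kloc, a=2.4, b=1.05, c=2.5, d=0.38):
    """Basic COCOMO, organic mode: effort and schedule from size in KLOC.

    Illustrative only -- not necessarily the model COSTMODL implements.
    """
    effort_pm = a * kloc ** b            # person-months
    schedule_mo = c * effort_pm ** d     # calendar months
    avg_staff = effort_pm / schedule_mo  # average full-time staff
    return effort_pm, schedule_mo, avg_staff
```

    For a 10 KLOC product these constants yield roughly 27 person-months spread over about 8.7 calendar months, which is the kind of effort/schedule/staffing breakdown the abstract describes.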

  1. Logic as Marr's Computational Level: Four Case Studies.

    PubMed

    Baggio, Giosuè; van Lambalgen, Michiel; Hagoort, Peter

    2015-04-01

    We sketch four applications of Marr's levels-of-analysis methodology to the relations between logic and experimental data in the cognitive neuroscience of language and reasoning. The first part of the paper illustrates the explanatory power of computational level theories based on logic. We show that a Bayesian treatment of the suppression task in reasoning with conditionals is ruled out by EEG data, supporting instead an analysis based on defeasible logic. Further, we describe how results from an EEG study on temporal prepositions can be reanalyzed using formal semantics, addressing a potential confound. The second part of the article demonstrates the predictive power of logical theories drawing on EEG data on processing progressive constructions and on behavioral data on conditional reasoning in people with autism. Logical theories can constrain processing hypotheses all the way down to neurophysiology, and conversely neuroscience data can guide the selection of alternative computational level models of cognition. Copyright © 2014 Cognitive Science Society, Inc.

  2. The limits of intelligence in design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Papamichael, K.; Protzen, J.P.

    1993-05-01

    A new, comprehensive design theory is presented, applicable to all design domains such as engineering and industrial design, architecture, city and regional planning, and, in general, any goal-oriented activity that involves decision making. The design process is analyzed into fundamental activities that are characterized with respect to the nature of their knowledge requirements and the degree to which they can be specified and delegated to others in general, and to computers in particular. Despite the characterization of design problems as "wicked" or "ill-defined," design has been understood as a rational activity, that is, "thinking before acting." The new theory presented in this paper suggests that design is "thinking and feeling while acting," supporting the position that design is only partially rational. Intelligence, "natural" or "artificial," is only one of two requirements for design, the other being emotions. Design decisions are only partially inferred; that is, they are not entirely the product of reasoning. Rather, design decisions are based on judgment that requires the notions of "good" and "bad," which are attributed to feelings rather than thoughts. The presentation of the design theory extends to the implications associated with the limits of intelligence in design, which, in turn, become constraints on the potential role of computers in design. Many of the current development efforts in computer-aided design violate these constraints, especially in the implementation of expert systems and multi-criterion evaluation models. These violations are identified and discussed in detail. Finally, specific areas for further research and development in computer-aided design are presented and discussed.

  3. Automated Stitching of Microtubule Centerlines across Serial Electron Tomograms

    PubMed Central

    Weber, Britta; Tranfield, Erin M.; Höög, Johanna L.; Baum, Daniel; Antony, Claude; Hyman, Tony; Verbavatz, Jean-Marc; Prohaska, Steffen

    2014-01-01

    Tracing microtubule centerlines in serial section electron tomography requires microtubules to be stitched across sections: lines from different sections need to be aligned, endpoints need to be matched at section boundaries to establish a correspondence between neighboring sections, and corresponding lines need to be connected across multiple sections. We present computational methods for these tasks: 1) An initial alignment is computed using a distance compatibility graph. 2) A fine alignment is then computed with a probabilistic variant of the iterative closest point algorithm, which we extended to handle the orientation of lines by introducing a periodic random variable into the probabilistic formulation. 3) Endpoint correspondence is established by formulating a matching problem in terms of a Markov random field and computing the best matching with belief propagation. Belief propagation is not generally guaranteed to converge to a minimum. We show how convergence can nonetheless be achieved with minimal manual input. In addition to stitching microtubule centerlines, the correspondence is also applied to transform and merge the electron tomograms. We applied the proposed methods to samples from the mitotic spindle in C. elegans, the meiotic spindle in X. laevis, and sub-pellicular microtubule arrays in T. brucei. The methods were able to stitch microtubules across section boundaries in good agreement with experts' opinions for the spindle samples. Results, however, were not satisfactory for the microtubule arrays. For certain experiments, such as an analysis of the spindle, the proposed methods can replace manual expert tracing and thus enable the analysis of microtubules over long distances with reasonable manual effort. PMID:25438148
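    The paper solves endpoint correspondence with a Markov random field optimized by belief propagation. As a much simpler illustration of the same matching task (not the authors' method), a greedy nearest-neighbour assignment with a distance cutoff can be sketched as follows:

```python
import math

def match_endpoints(top, bottom, max_dist):
    """Greedily pair endpoints across a section boundary by distance.

    `top` and `bottom` are lists of (x, y) endpoint coordinates on either
    side of the boundary. Pairs farther apart than `max_dist` are rejected.
    A stand-in for the MRF/belief-propagation matching used in the paper.
    """
    candidates = sorted(
        (math.hypot(x1 - x2, y1 - y2), i, j)
        for i, (x1, y1) in enumerate(top)
        for j, (x2, y2) in enumerate(bottom)
        if math.hypot(x1 - x2, y1 - y2) <= max_dist
    )
    used_top, used_bottom, matches = set(), set(), []
    for _, i, j in candidates:
        if i not in used_top and j not in used_bottom:
            used_top.add(i)
            used_bottom.add(j)
            matches.append((i, j))
    return matches
```

    Unlike the MRF formulation, this greedy sketch cannot trade off one pairing against another globally, which is precisely why the paper resorts to belief propagation for dense endpoint configurations.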

  4. Automated stitching of microtubule centerlines across serial electron tomograms.

    PubMed

    Weber, Britta; Tranfield, Erin M; Höög, Johanna L; Baum, Daniel; Antony, Claude; Hyman, Tony; Verbavatz, Jean-Marc; Prohaska, Steffen

    2014-01-01

    Tracing microtubule centerlines in serial section electron tomography requires microtubules to be stitched across sections: lines from different sections need to be aligned, endpoints need to be matched at section boundaries to establish a correspondence between neighboring sections, and corresponding lines need to be connected across multiple sections. We present computational methods for these tasks: 1) An initial alignment is computed using a distance compatibility graph. 2) A fine alignment is then computed with a probabilistic variant of the iterative closest point algorithm, which we extended to handle the orientation of lines by introducing a periodic random variable into the probabilistic formulation. 3) Endpoint correspondence is established by formulating a matching problem in terms of a Markov random field and computing the best matching with belief propagation. Belief propagation is not generally guaranteed to converge to a minimum. We show how convergence can nonetheless be achieved with minimal manual input. In addition to stitching microtubule centerlines, the correspondence is also applied to transform and merge the electron tomograms. We applied the proposed methods to samples from the mitotic spindle in C. elegans, the meiotic spindle in X. laevis, and sub-pellicular microtubule arrays in T. brucei. The methods were able to stitch microtubules across section boundaries in good agreement with experts' opinions for the spindle samples. Results, however, were not satisfactory for the microtubule arrays. For certain experiments, such as an analysis of the spindle, the proposed methods can replace manual expert tracing and thus enable the analysis of microtubules over long distances with reasonable manual effort.

  5. Expertise and reasoning with possibility: An explanation of modal logic and expert systems

    NASA Technical Reports Server (NTRS)

    Rochowiak, Daniel

    1988-01-01

    Recently, systems of modal reasoning have been brought to the foreground of artificial intelligence studies. The intuitive idea of research efforts in this area is that in addition to the actual world in which sentences have certain truth values, there are other worlds in which those sentences have different truth values. Such alternative worlds can be considered as possible worlds, and an agent may or may not have access to some or all of them. This approach to reasoning can be valuable in extending the expert system paradigm. Using the scheme of reasoning proposed by Toulmin, Rieke, and Janik and the modal system T, a scheme is proposed for expert reasoning that mitigates some of the criticisms raised by Schank and Nickerson.

  6. Relations between inductive reasoning and deductive reasoning.

    PubMed

    Heit, Evan; Rotello, Caren M

    2010-05-01

    One of the most important open questions in reasoning research is how inductive reasoning and deductive reasoning are related. In an effort to address this question, we applied methods and concepts from memory research. We used 2 experiments to examine the effects of logical validity and premise-conclusion similarity on evaluation of arguments. Experiment 1 showed 2 dissociations: For a common set of arguments, deduction judgments were more affected by validity, and induction judgments were more affected by similarity. Moreover, Experiment 2 showed that fast deduction judgments were like induction judgments, in terms of being more influenced by similarity and less influenced by validity, compared with slow deduction judgments. These novel results pose challenges for a 1-process account of reasoning and are interpreted in terms of a 2-process account of reasoning, which was implemented as a multidimensional signal detection model and applied to receiver operating characteristic data. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  7. Operation ARA: A Computerized Learning Game that Teaches Critical Thinking and Scientific Reasoning

    ERIC Educational Resources Information Center

    Halpern, Diane F.; Millis, Keith; Graesser, Arthur C.; Butler, Heather; Forsyth, Carol; Cai, Zhiqiang

    2012-01-01

    Operation ARA (Acquiring Research Acumen) is a computerized learning game that teaches critical thinking and scientific reasoning. It is a valuable learning tool that utilizes principles from the science of learning and serious computer games. Students learn the skills of scientific reasoning by engaging in interactive dialogs with avatars. They…

  8. Using Computer Simulations for Promoting Model-Based Reasoning: Epistemological and Educational Dimensions

    ERIC Educational Resources Information Center

    Develaki, Maria

    2017-01-01

    Scientific reasoning is particularly pertinent to science education since it is closely related to the content and methodologies of science and contributes to scientific literacy. Much of the research in science education investigates the appropriate framework and teaching methods and tools needed to promote students' ability to reason and…

  9. Reference Computational Meshing Strategy for Computational Fluid Dynamics Simulation of Departure from Nucleate Boiling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pointer, William David

    The objective of this effort is to establish a strategy and process for generating suitable computational meshes for computational fluid dynamics simulations of departure from nucleate boiling (DNB) in a 5 by 5 fuel rod assembly held in place by PWR mixing vane spacer grids. This mesh generation process will support ongoing efforts to develop, demonstrate, and validate advanced multi-phase computational fluid dynamics methods that enable more robust identification of dryout conditions and DNB occurrence. Building upon prior efforts and experience, multiple computational meshes were developed using the native mesh generation capabilities of the commercial CFD code STAR-CCM+. These meshes were used to simulate two test cases from the Westinghouse 5 by 5 rod bundle facility. The sensitivity of predicted quantities of interest to the mesh resolution was then established using two evaluation methods, the Grid Convergence Index method and the Least Squares method. This evaluation suggests that the Least Squares method can reliably establish the uncertainty associated with local parameters, such as vector velocity components at a point in the domain, or surface-averaged quantities, such as outlet velocity magnitude. However, neither method is suitable for characterization of uncertainty in global extrema such as peak fuel surface temperature, primarily because such parameters are not necessarily associated with a fixed point in space. This shortcoming is significant because the current generation algorithm for identification of DNB event conditions relies on identification of such global extrema. Ongoing efforts to identify DNB based on local surface conditions will address this challenge.
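    The Grid Convergence Index evaluation mentioned above follows a standard procedure (Roache's method). A minimal sketch for three systematically refined meshes with a constant refinement ratio, written as an illustration rather than the report's actual implementation, might read:

```python
import math

def observed_order(f1, f2, f3, r):
    """Observed order of convergence from solutions on fine (f1),
    medium (f2), and coarse (f3) meshes with constant refinement ratio r."""
    return math.log(abs(f3 - f2) / abs(f2 - f1)) / math.log(r)

def gci_fine(f1, f2, r, p, fs=1.25):
    """Grid Convergence Index on the fine mesh, with safety factor fs."""
    eps = abs((f2 - f1) / f1)   # relative change between fine and medium
    return fs * eps / (r ** p - 1)
```

    For example, solutions of 1.01, 1.04, and 1.16 on meshes refined by a factor of 2 give an observed order of 2 and a fine-grid GCI of about 1.2%. As the abstract notes, this per-point uncertainty measure breaks down for global extrema, whose spatial location can move between meshes.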

  10. The Cognitive Predictors of Computational Skill with Whole versus Rational Numbers: An Exploratory Study.

    PubMed

    Seethaler, Pamela M; Fuchs, Lynn S; Star, Jon R; Bryant, Joan

    2011-10-01

    The purpose of the present study was to explore the 3rd-grade cognitive predictors of 5th-grade computational skill with rational numbers and how those are similar to and different from the cognitive predictors of whole-number computational skill. Students (n = 688) were assessed on incoming whole-number calculation skill, language, nonverbal reasoning, concept formation, processing speed, and working memory in the fall of 3rd grade. Students were followed longitudinally and assessed on calculation skill with whole numbers and with rational numbers in the spring of 5th grade. The unique predictors of skill with whole-number computation were incoming whole-number calculation skill, nonverbal reasoning, concept formation, and working memory (numerical executive control). In addition to these cognitive abilities, language emerged as a unique predictor of rational-number computational skill.

  11. The Cognitive Predictors of Computational Skill with Whole versus Rational Numbers: An Exploratory Study

    PubMed Central

    Seethaler, Pamela M.; Fuchs, Lynn S.; Star, Jon R.; Bryant, Joan

    2011-01-01

    The purpose of the present study was to explore the 3rd-grade cognitive predictors of 5th-grade computational skill with rational numbers and how those are similar to and different from the cognitive predictors of whole-number computational skill. Students (n = 688) were assessed on incoming whole-number calculation skill, language, nonverbal reasoning, concept formation, processing speed, and working memory in the fall of 3rd grade. Students were followed longitudinally and assessed on calculation skill with whole numbers and with rational numbers in the spring of 5th grade. The unique predictors of skill with whole-number computation were incoming whole-number calculation skill, nonverbal reasoning, concept formation, and working memory (numerical executive control). In addition to these cognitive abilities, language emerged as a unique predictor of rational-number computational skill. PMID:21966180

  12. Colovesical fistula causing an uncommon reason for failure of computed tomography colonography: a case report.

    PubMed

    Neroladaki, Angeliki; Breguet, Romain; Botsikas, Diomidis; Terraz, Sylvain; Becker, Christoph D; Montet, Xavier

    2012-07-23

    Computed tomography colonography, or virtual colonoscopy, is a good alternative to optical colonoscopy. However, suboptimal patient preparation or colon distension may reduce the diagnostic accuracy of this imaging technique. We report the case of an 83-year-old Caucasian woman who presented with a five-month history of pneumaturia and fecaluria and an acute episode of macrohematuria, leading to a high clinical suspicion of a colovesical fistula. The fistula was confirmed by standard contrast-enhanced computed tomography. Optical colonoscopy was performed to exclude the presence of an underlying colonic neoplasm. Since optical colonoscopy was incomplete, computed tomography colonography was performed, but also failed due to inadequate colon distension. The insufflated air directly accumulated within the bladder via the large fistula. Clinicians should consider colovesical fistula as a potential reason for computed tomography colonography failure.

  13. Microprogramming Handbook. Second Edition.

    ERIC Educational Resources Information Center

    Microdata Corp., Santa Ana, CA.

    Instead of instructions residing in the main memory as in a fixed-instruction computer, a microprogrammable computer has a separate read-only memory which is alterable so that the system can be efficiently adapted to the application at hand. Microprogrammable computers are faster than fixed-instruction computers for several reasons: instruction…

  14. An Educational Approach to Computationally Modeling Dynamical Systems

    ERIC Educational Resources Information Center

    Chodroff, Leah; O'Neal, Tim M.; Long, David A.; Hemkin, Sheryl

    2009-01-01

    Chemists have used computational science methodologies for a number of decades and their utility continues to be unabated. For this reason we developed an advanced lab in computational chemistry in which students gain understanding of general strengths and weaknesses of computation-based chemistry by working through a specific research problem.…

  15. Establishing a Computer Literacy Requirement for All Students.

    ERIC Educational Resources Information Center

    Kieffer, Linda M.

    Several factors have indicated the necessity of formally requiring computer literacy at the university level. This paper discusses the reasoning for, the development of, and content of two computer literacy courses required of all freshmen. The first course contains computer awareness and knowledge that students should have upon entering the…

  16. Taking Stock: The Movement To Create Mini-Schools, Schools-within-Schools, and Separate Small Schools. Urban Diversity Series No. 108.

    ERIC Educational Resources Information Center

    Raywid, Mary Anne

    Many educators see school downsizing efforts as the linchpin of school restructuring. Several forms that school downsizing efforts are taking are explored, along with a discussion of the reasons for which small schools are being established. The types of subschools that are being launched (houses, mini-schools, schools-within-schools) are…

  17. Coordination of Ocean Management: A Perspective on the Gulf of Maine,

    DTIC Science & Technology

    1982-11-01

    …transportation modes. The U.S. Coast Guard operates two OMEGA stations; the remaining six are operated by host countries under international agreement. [Table-of-contents fragments: Reasons for a Comprehensive Approach; Regulatory Reform.] …physical ocean, but that framework falls short in its efforts to guide and coordinate the activities of ocean users. Much of the remaining effort in this

  18. VIRTUAL REALITY HYPNOSIS

    PubMed Central

    Askay, Shelley Wiechman; Patterson, David R.; Sharar, Sam R.

    2010-01-01

    Scientific evidence for the viability of hypnosis as a treatment for pain has flourished over the past two decades (Rainville, Duncan, Price, Carrier and Bushnell, 1997; Montgomery, DuHamel and Redd, 2000; Lang and Rosen, 2002; Patterson and Jensen, 2003). However, its widespread use has been limited by factors such as the advanced expertise, time and effort required by clinicians to provide hypnosis, and the cognitive effort required by patients to engage in hypnosis. The theory in developing virtual reality hypnosis was to apply three-dimensional, immersive, virtual reality technology to guide the patient through the same steps used when hypnosis is induced through an interpersonal process. Virtual reality replaces many of the stimuli that the patients have to struggle to imagine via verbal cueing from the therapist. The purpose of this paper is to explore how virtual reality may be useful in delivering hypnosis, and to summarize the scientific literature to date. We will also explore various theoretical and methodological issues that can guide future research. In spite of the encouraging scientific and clinical findings, hypnosis for analgesia is not universally used in medical centres. One reason for the slow acceptance is the extensive provider training required in order for hypnosis to be an effective pain management modality. Training in hypnosis is not commonly offered in medical schools or even psychology graduate curricula. Another reason is that hypnosis requires far more time and effort to administer than an analgesic pill or injection. Hypnosis requires training, skill and patience to deliver in medical centres that are often fast-paced and highly demanding of clinician time. Finally, the attention and cognitive effort required for hypnosis may be more than patients in an acute care setting, who may be under the influence of opiates and benzodiazepines, are able to impart.
It is a challenge to make hypnosis a standard part of care in this environment. Over the past 25 years, researchers have been investigating ways to make hypnosis more standardized and accessible. There have been a handful of studies that have looked at the efficacy of using audiotapes to provide the hypnotic intervention (Johnson and Wiese, 1979; Hart, 1980; Block, Ghoneim, Sum Ping and Ali, 1991; Enqvist, Bjorklund, Engman and Jakobsson, 1997; Eberhart, Doring, Holzrichter, Roscher and Seeling, 1998; Perugini, Kirsch, Allen, et al., 1998; Forbes, MacAuley, Chiotakakou-Faliakou, 2000; Ghoneim, Block, Sarasin, Davis and Marchman, 2000). These studies have yielded mixed results. Generally, we can conclude that audio-taped hypnosis is more effective than no treatment at all, but less effective than the presence of a live hypnotherapist. Grant and Nash (1995) were the first to use computer-assisted hypnosis as a behavioural measure to assess hypnotizability. They used a digitized voice that guided subjects through a procedure and tailored software according to the subject’s unique responses and reactions. However, it utilized conventional two-dimensional screen technology that required patients to focus their attention on a computer screen, making them vulnerable to any type of distraction that might enter the environment. Further, the two-dimensional technology did not present compelling visual stimuli for capturing the user’s attention. PMID:20737029

  19. VIRTUAL REALITY HYPNOSIS.

    PubMed

    Askay, Shelley Wiechman; Patterson, David R; Sharar, Sam R

    2009-03-01

    Scientific evidence for the viability of hypnosis as a treatment for pain has flourished over the past two decades (Rainville, Duncan, Price, Carrier and Bushnell, 1997; Montgomery, DuHamel and Redd, 2000; Lang and Rosen, 2002; Patterson and Jensen, 2003). However, its widespread use has been limited by factors such as the advanced expertise, time and effort required by clinicians to provide hypnosis, and the cognitive effort required by patients to engage in hypnosis. The theory in developing virtual reality hypnosis was to apply three-dimensional, immersive, virtual reality technology to guide the patient through the same steps used when hypnosis is induced through an interpersonal process. Virtual reality replaces many of the stimuli that the patients have to struggle to imagine via verbal cueing from the therapist. The purpose of this paper is to explore how virtual reality may be useful in delivering hypnosis, and to summarize the scientific literature to date. We will also explore various theoretical and methodological issues that can guide future research. In spite of the encouraging scientific and clinical findings, hypnosis for analgesia is not universally used in medical centres. One reason for the slow acceptance is the extensive provider training required in order for hypnosis to be an effective pain management modality. Training in hypnosis is not commonly offered in medical schools or even psychology graduate curricula. Another reason is that hypnosis requires far more time and effort to administer than an analgesic pill or injection. Hypnosis requires training, skill and patience to deliver in medical centres that are often fast-paced and highly demanding of clinician time. Finally, the attention and cognitive effort required for hypnosis may be more than patients in an acute care setting, who may be under the influence of opiates and benzodiazepines, are able to impart.
It is a challenge to make hypnosis a standard part of care in this environment. Over the past 25 years, researchers have been investigating ways to make hypnosis more standardized and accessible. There have been a handful of studies that have looked at the efficacy of using audiotapes to provide the hypnotic intervention (Johnson and Wiese, 1979; Hart, 1980; Block, Ghoneim, Sum Ping and Ali, 1991; Enqvist, Bjorklund, Engman and Jakobsson, 1997; Eberhart, Doring, Holzrichter, Roscher and Seeling, 1998; Perugini, Kirsch, Allen, et al., 1998; Forbes, MacAuley and Chiotakakou-Faliakou, 2000; Ghoneim, Block, Sarasin, Davis and Marchman, 2000). These studies have yielded mixed results. Generally, we can conclude that audio-taped hypnosis is more effective than no treatment at all, but less effective than the presence of a live hypnotherapist. Grant and Nash (1995) were the first to use computer-assisted hypnosis as a behavioural measure to assess hypnotizability. They used a digitized voice that guided subjects through a procedure and tailored software according to the subject's unique responses and reactions. However, it utilized conventional two-dimensional screen technology that required patients to focus their attention on a computer screen, making them vulnerable to any type of distraction that might enter the environment. Further, the two-dimensional technology did not present compelling visual stimuli for capturing the user's attention. PMID:20737029

  20. Computer Anxiety: Relationship to Math Anxiety and Holland Types.

    ERIC Educational Resources Information Center

    Bellando, Jayne; Winer, Jane L.

    Although the number of computers in the school system is increasing, many schools are not using computers to their capacity. One reason for this may be computer anxiety on the part of the teacher. A review of the computer anxiety literature reveals little information on the subject, and findings from previous studies suggest that basic controlled…

  1. Applications of Out-of-Domain Knowledge in Students' Reasoning about Computer Program State

    ERIC Educational Resources Information Center

    Lewis, Colleen Marie

    2012-01-01

    To meet a growing demand and a projected deficit in the supply of computer professionals (NCWIT, 2009), it is of vital importance to expand students' access to computer science. However, many researchers in the computer science education community unproductively assume that some students lack an innate ability for computer science and…

  2. The Reasoning behind the Scene: Why Do Early Childhood Educators Use Computers in Their Classrooms?

    ERIC Educational Resources Information Center

    Edwards, Suzy

    2005-01-01

    In recent times discussion surrounding the use of computers in early childhood education has emphasised the role computers play in children's everyday lives. This realisation has replaced early debate regarding the appropriateness or otherwise of computer use for young children in early childhood education. An important component of computer use…

  3. Clinical Computer Applications in Mental Health

    PubMed Central

    Greist, John H.; Klein, Marjorie H.; Erdman, Harold P.; Jefferson, James W.

    1982-01-01

    Direct patient-computer interviews were among the earliest applications of computing in medicine. Yet patient interviewing and other clinical applications have lagged behind fiscal/administrative uses. Several reasons for delays in the development and implementation of clinical computing programs and their resolution are discussed. Patient interviewing, clinician consultation and other applications of clinical computing in mental health are reviewed.

  4. A Survey of Techniques for Approximate Computing

    DOE PAGES

    Mittal, Sparsh

    2016-03-18

    Approximate computing trades off computation quality against the effort expended; as rising performance demands confront plateauing resource budgets, approximate computing has become not merely attractive but imperative. Here, we present a survey of techniques for approximate computing (AC). We discuss strategies for finding approximable program portions and monitoring output quality, techniques for using AC in different processing units (e.g., CPU, GPU and FPGA), processor components, memory technologies, etc., and programming frameworks for AC. Moreover, we classify these techniques based on several key characteristics to emphasize their similarities and differences. Finally, the aim of this paper is to provide researchers with insights into the working of AC techniques and to inspire more efforts in this area to make AC a mainstream computing approach in future systems.
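    The quality-versus-effort trade-off the survey describes can be illustrated with a toy sketch (not drawn from the survey itself): an approximate mean that, in the spirit of loop perforation, touches only a sampled fraction of its input. All names and parameters here are illustrative.

    ```python
    import random

    def exact_mean(values):
        """Baseline: full-effort exact computation over every element."""
        return sum(values) / len(values)

    def approximate_mean(values, sample_fraction=0.1, seed=0):
        """Approximate-computing sketch: trade accuracy for effort by
        averaging a random sample of the input (akin to loop perforation)."""
        rng = random.Random(seed)
        k = max(1, int(len(values) * sample_fraction))
        sample = rng.sample(values, k)
        return sum(sample) / k

    data = list(range(100_000))
    exact = exact_mean(data)          # visits all 100,000 elements
    approx = approximate_mean(data)   # visits ~10% of the elements
    error = abs(approx - exact) / exact
    ```

    The `sample_fraction` knob is the kind of quality/effort dial such techniques expose: lowering it cuts work roughly linearly while the relative error of the estimate grows only as the sample shrinks.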

  5. 34 CFR 461.45 - How does the Secretary compute maintenance of effort in the event of a waiver?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION... awarded for the year after the year of the waiver by comparing the amount spent for adult education from... 34 Education 3 2012-07-01 2012-07-01 false How does the Secretary compute maintenance of effort in...

  6. 34 CFR 461.45 - How does the Secretary compute maintenance of effort in the event of a waiver?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION... awarded for the year after the year of the waiver by comparing the amount spent for adult education from... 34 Education 3 2013-07-01 2013-07-01 false How does the Secretary compute maintenance of effort in...

  7. 34 CFR 461.45 - How does the Secretary compute maintenance of effort in the event of a waiver?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION... awarded for the year after the year of the waiver by comparing the amount spent for adult education from... 34 Education 3 2011-07-01 2011-07-01 false How does the Secretary compute maintenance of effort in...

  8. 34 CFR 461.45 - How does the Secretary compute maintenance of effort in the event of a waiver?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION... awarded for the year after the year of the waiver by comparing the amount spent for adult education from... 34 Education 3 2014-07-01 2014-07-01 false How does the Secretary compute maintenance of effort in...

  9. 34 CFR 461.45 - How does the Secretary compute maintenance of effort in the event of a waiver?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION... awarded for the year after the year of the waiver by comparing the amount spent for adult education from... 34 Education 3 2010-07-01 2010-07-01 false How does the Secretary compute maintenance of effort in...

  10. Fuzzy inductive reasoning: a consolidated approach to data-driven construction of complex dynamical systems

    NASA Astrophysics Data System (ADS)

    Nebot, Àngela; Mugica, Francisco

    2012-10-01

    Fuzzy inductive reasoning (FIR) is a modelling and simulation methodology derived from the General Systems Problem Solver. It compares favourably with other soft computing methodologies, such as neural networks, genetic or neuro-fuzzy systems, and with hard computing methodologies, such as AR, ARIMA, or NARMAX, when it is used to predict future behaviour of different kinds of systems. This paper contains an overview of the FIR methodology, its historical background, and its evolution.

  11. Adapting Japanese Lesson Study to Enhance the Teaching and Learning of Geometry and Spatial Reasoning in Early Years Classrooms: A Case Study

    ERIC Educational Resources Information Center

    Moss, Joan; Hawes, Zachary; Naqvi, Sarah; Caswell, Beverly

    2015-01-01

    Increased efforts are needed to meet the demand for high quality mathematics in early years classrooms. Despite the foundational role of geometry and spatial reasoning for later mathematics success, the strand receives inadequate instructional time and is limited to concepts of static geometry. Moreover, early years teachers typically lack both…

  12. Integrating computation into the undergraduate curriculum: A vision and guidelines for future developments

    NASA Astrophysics Data System (ADS)

    Chonacky, Norman; Winch, David

    2008-04-01

    There is substantial evidence of a need to make computation an integral part of the undergraduate physics curriculum. This need is consistent with data from surveys in both the academy and the workplace, and has been reinforced by two years of exploratory efforts by a group of physics faculty for whom computation is a special interest. We have examined past and current efforts at reform and a variety of strategic, organizational, and institutional issues involved in any attempt to broadly transform existing practice. We propose a set of guidelines for development based on this past work and discuss our vision of computationally integrated physics.

  13. Micro-video display with ocular tracking and interactive voice control

    NASA Technical Reports Server (NTRS)

    Miller, James E.

    1993-01-01

    In certain space-restricted environments, many of the benefits resulting from computer technology have been foregone because of the size, weight, inconvenience, and lack of mobility associated with existing computer interface devices. Accordingly, an effort to develop a highly miniaturized and 'wearable' computer display and control interface device, referred to as the Sensory Integrated Data Interface (SIDI), is underway. The system incorporates a micro-video display that provides data display and ocular tracking on a lightweight headset. Software commands are implemented by conjunctive eye movement and voice commands of the operator. In this initial prototyping effort, various 'off-the-shelf' components have been integrated with a desktop computer and a customized menu-tree software application to demonstrate feasibility and conceptual capabilities. When fully developed as a customized system, the interface device will allow mobile, 'hands-free' operation of portable computer equipment. It will thus allow integration of information technology applications into those restrictive environments, both military and industrial, that have not yet taken advantage of the computer revolution. This effort is Phase 1 of Small Business Innovative Research (SBIR) Topic number N90-331 sponsored by the Naval Undersea Warfare Center Division, Newport. The prime contractor is Foster-Miller, Inc. of Waltham, MA.

  14. Retention in a Computer-based Outreach Intervention For Chronically Ill Rural Women

    PubMed Central

    Weinert, Clarann; Cudney, Shirley; Hill, Wade G.

    2009-01-01

    The study's purpose was to examine retention factors in a computer intervention with 158 chronically ill rural women. After a 22-week intervention, 18.9 percent of the women had dropped out. A Cox regression survival analysis was performed to assess the effects of selected covariates on retention. Reasons for dropping were tallied and categorized. Major reasons for dropping were lack of time, decline in health status, and non-participation in study activities. Four covariates predicted survival time: level of computer skills, marital status, work outside of home, and impact of social events on participants' lives. Retention-enhancing strategies are suggested for implementation. PMID:18226760

  15. Measuring and Modeling Change in Examinee Effort on Low-Stakes Tests across Testing Occasions

    ERIC Educational Resources Information Center

    Sessoms, John; Finney, Sara J.

    2015-01-01

    Because schools worldwide use low-stakes tests to make important decisions, value-added indices computed from test scores must accurately reflect student learning, which requires equal test-taking effort across testing occasions. Evaluating change in effort assumes effort is measured equivalently across occasions. We evaluated the longitudinal…

  16. 47 CFR 7.7 - Product design, development, and evaluation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...: (1) Where market research is undertaken, including individuals with disabilities in target... appropriate disability-related organizations; and (4) Making reasonable efforts to validate any unproven...

  17. 47 CFR 6.7 - Product design, development, and evaluation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... factors, as the manufacturer deems appropriate: (1) Where market research is undertaken, including...) Working cooperatively with appropriate disability-related organizations; and (4) Making reasonable efforts...

  18. Materials Genome Initiative

    NASA Technical Reports Server (NTRS)

    Vickers, John

    2015-01-01

    The Materials Genome Initiative (MGI) project element is a cross-Center effort that is focused on the integration of computational tools to simulate manufacturing processes and materials behavior. These computational simulations will be utilized to gain understanding of processes and materials behavior to accelerate process development and certification, to more efficiently integrate new materials in existing NASA projects, and to lead to the design of new materials for improved performance. This NASA effort looks to collaborate with efforts at other government agencies and universities working under the national MGI. MGI plans to develop integrated computational/experimental/processing methodologies for accelerating discovery and insertion of materials to satisfy NASA's unique mission demands. The challenges include validated design tools that incorporate materials properties, processes, and design requirements; and materials process control to rapidly mature emerging manufacturing methods and develop certified manufacturing processes.

  19. Teacher Pedagogical Content Knowledge (PCK) and Students’ Reasoning and Wellbeing

    NASA Astrophysics Data System (ADS)

    Widodo, A.

    2017-02-01

    This paper summarizes findings of a study on efforts to improve teachers' Pedagogical Content Knowledge (PCK) and how it affects students' reasoning and wellbeing. It was found that improvement of teachers' PCK was not very strong, but we managed to develop strategies to facilitate their development. In the second year, the research was focused on identifying students' reasoning skills, both informal and formal reasoning. Data showed that students' reasoning is relatively low (level 2 of five levels) and that they could not construct highly coherent arguments. In addition, alternative strategies to promote students' reasoning were explored. Attempts to support teachers to conduct lessons that facilitate students' reasoning found that teachers need intensive and continuous support. The study also identifies students' wellbeing as the impact of improvement of lessons and other activities designed to improve students' wellbeing. Research on students' wellbeing is not yet given attention in Indonesian schools, although it plays a very important role in students' academic and nonacademic achievements.

  20. Exploring High School Students Beginning Reasoning about Significance Tests with Technology

    ERIC Educational Resources Information Center

    García, Víctor N.; Sánchez, Ernesto

    2017-01-01

    In the present study we analyze how students reason about or make inferences given a particular hypothesis testing problem (without having studied formal methods of statistical inference) when using Fathom. They use Fathom to create an empirical sampling distribution through computer simulation. It is found that most students' reasoning relies on…

  1. 76 FR 41523 - In the Matter of Certain Mobile Communications and Computer Devices and Components Thereof...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-14

    ... Communications and Computer Devices and Components Thereof; Notice of Commission Determination Not To Review an... in its entirety Inv. No. 337-TA-704, Certain Mobile Communications and Computer Devices and... importation of certain mobile communications and computer devices and components thereof by reason of...

  2. 26 CFR 1.167(b)-0 - Methods of computing depreciation.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 26 Internal Revenue 2 2014-04-01 2014-04-01 false Methods of computing depreciation. 1.167(b)-0....167(b)-0 Methods of computing depreciation. (a) In general. Any reasonable and consistently applied method of computing depreciation may be used or continued in use under section 167. Regardless of the...

  3. Talking about Code: Integrating Pedagogical Code Reviews into Early Computing Courses

    ERIC Educational Resources Information Center

    Hundhausen, Christopher D.; Agrawal, Anukrati; Agarwal, Pawan

    2013-01-01

    Given the increasing importance of soft skills in the computing profession, there is good reason to provide students with more opportunities to learn and practice those skills in undergraduate computing courses. Toward that end, we have developed an active learning approach for computing education called the "Pedagogical Code Review"…

  4. Rhetorical Consequences of the Computer Society: Expert Systems and Human Communication.

    ERIC Educational Resources Information Center

    Skopec, Eric Wm.

    Expert systems are computer programs that solve selected problems by modelling domain-specific behaviors of human experts. These computer programs typically consist of an input/output system that feeds data into the computer and retrieves advice, an inference system using the reasoning and heuristic processes of human experts, and a knowledge…

  5. Computing Whether She Belongs: Stereotypes Undermine Girls' Interest and Sense of Belonging in Computer Science

    ERIC Educational Resources Information Center

    Master, Allison; Cheryan, Sapna; Meltzoff, Andrew N.

    2016-01-01

    Computer science has one of the largest gender disparities in science, technology, engineering, and mathematics. An important reason for this disparity is that girls are less likely than boys to enroll in necessary "pipeline courses," such as introductory computer science. Two experiments investigated whether high-school girls' lower…

  6. Changing a Generation's Way of Thinking: Teaching Computational Thinking through Programming

    ERIC Educational Resources Information Center

    Buitrago Flórez, Francisco; Casallas, Rubby; Hernández, Marcela; Reyes, Alejandro; Restrepo, Silvia; Danies, Giovanna

    2017-01-01

    Computational thinking (CT) uses concepts that are essential to computing and information science to solve problems, design and evaluate complex systems, and understand human reasoning and behavior. This way of thinking has important implications in computer sciences as well as in almost every other field. Therefore, we contend that CT should be…

  7. 26 CFR 1.167(b)-0 - Methods of computing depreciation.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 26 Internal Revenue 2 2010-04-01 2010-04-01 false Methods of computing depreciation. 1.167(b)-0....167(b)-0 Methods of computing depreciation. (a) In general. Any reasonable and consistently applied method of computing depreciation may be used or continued in use under section 167. Regardless of the...

  8. Accessing Computers in Education, One Byte at a Time.

    ERIC Educational Resources Information Center

    Manzo, Anthony V.

    This paper discusses computers and their potential role in education. The term "byte" is first explained, to emphasize the idea that the use of computers should be implemented one "byte" or step at a time. The reasons for this approach are then outlined. Potential applications in computer usage in educational administration are suggested, computer…

  9. 20 CFR 225.3 - PIA computation formulas.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... computed under one of two normal formulas determined by the employee's eligibility year. In addition, there.... The Average Monthly Earnings PIA formula is used to compute a PIA for one of two reasons: either the... 20 Employees' Benefits 1 2014-04-01 2012-04-01 true PIA computation formulas. 225.3 Section 225.3...

  10. 20 CFR 225.3 - PIA computation formulas.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... computed under one of two normal formulas determined by the employee's eligibility year. In addition, there.... The Average Monthly Earnings PIA formula is used to compute a PIA for one of two reasons: either the... 20 Employees' Benefits 1 2013-04-01 2012-04-01 true PIA computation formulas. 225.3 Section 225.3...

  11. 20 CFR 225.3 - PIA computation formulas.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... computed under one of two normal formulas determined by the employee's eligibility year. In addition, there.... The Average Monthly Earnings PIA formula is used to compute a PIA for one of two reasons: either the... 20 Employees' Benefits 1 2012-04-01 2012-04-01 false PIA computation formulas. 225.3 Section 225.3...

  12. 20 CFR 225.3 - PIA computation formulas.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... computed under one of two normal formulas determined by the employee's eligibility year. In addition, there.... The Average Monthly Earnings PIA formula is used to compute a PIA for one of two reasons: either the... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false PIA computation formulas. 225.3 Section 225.3...

  13. 20 CFR 225.3 - PIA computation formulas.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... computed under one of two normal formulas determined by the employee's eligibility year. In addition, there.... The Average Monthly Earnings PIA formula is used to compute a PIA for one of two reasons: either the... 20 Employees' Benefits 1 2011-04-01 2011-04-01 false PIA computation formulas. 225.3 Section 225.3...

  14. RASCAL: A Rudimentary Adaptive System for Computer-Aided Learning.

    ERIC Educational Resources Information Center

    Stewart, John Christopher

    Both the background of computer-assisted instruction (CAI) systems in general and the requirements of a computer-aided learning system which would be a reasonable assistant to a teacher are discussed. RASCAL (Rudimentary Adaptive System for Computer-Aided Learning) is a first attempt at defining a CAI system which would individualize the learning…

  15. 75 FR 8400 - In the Matter of Certain Notebook Computer Products and Components Thereof; Notice of Investigation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-24

    ... INTERNATIONAL TRADE COMMISSION [Inv. No. 337-TA-705] In the Matter of Certain Notebook Computer... United States after importation of certain notebook computer products and components thereof by reason of... after importation of certain notebook computer products or components thereof that infringe one or more...

  16. 77 FR 32996 - Certain Handheld Electronic Computing Devices, Related Software, and Components Thereof...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-04

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-769] Certain Handheld Electronic Computing Devices, Related Software, and Components Thereof; Termination of the Investigation Based on... electronic computing devices, related software, and components thereof by reason of infringement of certain...

  17. Multiphysics Thrust Chamber Modeling for Nuclear Thermal Propulsion

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Cheng, Gary; Chen, Yen-Sen

    2006-01-01

    The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for a solid-core, nuclear thermal engine thrust chamber. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics formulation. A two-pronged approach is employed in this effort: A detailed thermo-fluid analysis on a multi-channel flow element for mid-section corrosion investigation; and a global modeling of the thrust chamber to understand the effect of heat transfer on thrust performance. Preliminary results on both aspects are presented.

  18. 36 CFR 1193.23 - Product design, development, and evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... following factors, as the manufacturer deems appropriate: (1) Where market research is undertaken, including...) Working cooperatively with appropriate disability-related organizations; and (4) Making reasonable efforts...

  19. 36 CFR 1193.23 - Product design, development, and evaluation.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... following factors, as the manufacturer deems appropriate: (1) Where market research is undertaken, including...) Working cooperatively with appropriate disability-related organizations; and (4) Making reasonable efforts...

  20. 32 CFR 1900.02 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... institution engaged in research concerning the social, biological, or physical sciences or an instructor or... descriptive terms which permit an Agency employee to locate documents with reasonable effort given existing...

  1. 32 CFR 1900.02 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... institution engaged in research concerning the social, biological, or physical sciences or an instructor or... descriptive terms which permit an Agency employee to locate documents with reasonable effort given existing...

  2. 32 CFR 1900.02 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... institution engaged in research concerning the social, biological, or physical sciences or an instructor or... descriptive terms which permit an Agency employee to locate documents with reasonable effort given existing...

  3. 32 CFR 1900.02 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... institution engaged in research concerning the social, biological, or physical sciences or an instructor or... descriptive terms which permit an Agency employee to locate documents with reasonable effort given existing...

  4. 32 CFR 1900.02 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... institution engaged in research concerning the social, biological, or physical sciences or an instructor or... descriptive terms which permit an Agency employee to locate documents with reasonable effort given existing...

  5. Embattled Behemoths.

    ERIC Educational Resources Information Center

    Carey, John

    1995-01-01

    Examines reasons some whale populations are depressed despite bans on whaling. Discusses recovery efforts and the impacts of historical whaling, illegal killing of whales, injury from ships, commercial overfishing, and atmospheric pollution. (LZ)

  6. 32 CFR 290.7 - Procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... be notified of the referral. However, if for investigative or intelligence purposes, the outside... knowledge of its files and reasonable search efforts that it neither controls nor otherwise possesses the...

  7. 32 CFR 290.7 - Procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... be notified of the referral. However, if for investigative or intelligence purposes, the outside... knowledge of its files and reasonable search efforts that it neither controls nor otherwise possesses the...

  8. 32 CFR 290.7 - Procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... be notified of the referral. However, if for investigative or intelligence purposes, the outside... knowledge of its files and reasonable search efforts that it neither controls nor otherwise possesses the...

  9. 32 CFR 290.7 - Procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... be notified of the referral. However, if for investigative or intelligence purposes, the outside... knowledge of its files and reasonable search efforts that it neither controls nor otherwise possesses the...

  10. Assessing School-Based Gang Prevention Efforts in Urban Centers: Are These Programs Reaching Those Students Who May Benefit the Most?

    ERIC Educational Resources Information Center

    Rodriguez, Hector

    2009-01-01

    In recent years, schools have become a focal point for general delinquency and gang prevention programs for a variety of reasons. One premise behind this approach is that schools can serve as ideal settings for providing delinquency and intervention services because youths spend so much time there. School-based gang prevention efforts are supposed…

  11. Automated eddy current analysis of materials

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.

    1990-01-01

    This research effort focused on the use of eddy current techniques for characterizing flaws in graphite-based filament-wound cylindrical structures. A major emphasis was on incorporating artificial intelligence techniques into the signal analysis portion of the inspection process. Developing an eddy current scanning system using a commercial robot for inspecting graphite structures (and others) has been a goal in the overall concept and is essential for the final implementation of expert system interpretation. Manual scans, as performed in the preliminary work here, do not provide sufficiently reproducible eddy current signatures to be easily built into a real time expert system. The expert systems approach to eddy current signal analysis requires that a suitable knowledge base exist in which correct decisions as to the nature of the flaw can be made. In eddy current or any other expert systems used to analyze signals in real time in a production environment, it is important to simplify computational procedures as much as possible. For that reason, we have chosen to use the measured resistance and reactance values for the preliminary aspects of this work. A simple computation, such as phase angle of the signal, is certainly within the real time processing capability of the computer system. In the work described here, there is a balance between physical measurements and finite element calculations of those measurements. The goal is to evolve into the most cost effective procedures for maintaining the correctness of the knowledge base.

  12. Software and the Scientist: Coding and Citation Practices in Geodynamics

    NASA Astrophysics Data System (ADS)

    Hwang, Lorraine; Fish, Allison; Soito, Laura; Smith, MacKenzie; Kellogg, Louise H.

    2017-11-01

    In geodynamics as in other scientific areas, computation has become a core component of research, complementing field observation, laboratory analysis, experiment, and theory. Computational tools for data analysis, mapping, visualization, modeling, and simulation are essential for all aspects of the scientific workflow. Specialized scientific software is often developed by geodynamicists for their own use, and this effort represents a distinctive intellectual contribution. Drawing on a geodynamics community that focuses on developing and disseminating scientific software, we assess the current practices of software development and attribution, as well as attitudes about the need and best practices for software citation. We analyzed publications by participants in the Computational Infrastructure for Geodynamics and conducted mixed-method surveys of the solid earth geophysics community. From this we learned that coding skills are typically learned informally. Participants considered good code to be trusted, reusable, readable, and not overly complex, and a good coder to be one who participates in the community in an open and reasonable manner, contributing to both long- and short-term community projects. Participants strongly supported citing software, as reflected by the high rate at which software packages were named in the literature and cited in the references. However, clear instructions from developers on how to cite, and education of users on what to cite, are lacking. In addition, citations did not always lead to discoverability of the resource. A unique identifier for the software package itself, community education, and citation tools would contribute to better attribution practices.

  13. Computational studies of adsorption in metal organic frameworks and interaction of nanoparticles in condensed phases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Annapureddy, Harsha V.; Motkuri, Radha K.; Nguyen, Phuong T.

    In this review, we describe recent efforts in which computer simulations were used to systematically study nano-structured metal organic frameworks, with particular emphasis on their application in heating and cooling processes. These materials also are known as metal organic heat carriers. We used both molecular dynamics and Grand Canonical Monte Carlo simulation techniques to gain a molecular-level understanding of the adsorption mechanism of gases in these porous materials. We investigated the uptake of various gases such as refrigerants R12 and R143a and also the elemental gases Xe and Rn by the metal organic framework (i.e., Ni2(dhtp)). We also evaluated the effects of temperature and pressure on the uptake mechanism. Our computed results compared reasonably well with available experimental measurements, thus validating our potential models and approaches. In addition, we also investigated the structural, diffusive, and adsorption properties of different hydrocarbons in Ni2(dhtp). To elucidate the mechanism of nanoparticle dispersion in condensed phases, we also studied the interactions among nanoparticles in various liquids, such as n-hexane, water and methanol. This work was performed at Pacific Northwest National Laboratory (PNNL) and was supported by the Division of Chemical Sciences, Geosciences and Biosciences, Office of Basic Energy Sciences, U.S. Department of Energy (DOE). PNNL is operated by Battelle for the DOE. The authors also gratefully acknowledge support received from the National Energy Technology Laboratory of DOE's Office of Fossil Energy.
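
    As a cartoon of the Grand Canonical Monte Carlo approach described above, a lattice gas of independent adsorption sites can be sampled in a few lines. All parameters are illustrative, not from the study:

```python
import math, random

# Toy GCMC sketch: adsorption of a lattice gas on N independent sites
# (a cartoon of gas uptake in a porous framework, in kT units).
random.seed(1)
N = 200          # adsorption sites
eps = -2.0       # binding energy of an adsorbed particle
mu = -1.0        # chemical potential set by the gas pressure
occ = [0] * N

def step():
    """One Metropolis flip attempt: insert or delete a particle at a site."""
    i = random.randrange(N)
    dE = (eps - mu) * (1 - 2 * occ[i])  # grand-potential change of flipping
    if dE <= 0 or random.random() < math.exp(-dE):
        occ[i] ^= 1

for _ in range(20000):
    step()
loading = sum(occ) / N
# Exact result for independent sites: Langmuir form 1 / (1 + exp(eps - mu)).
print(f"simulated loading {loading:.2f} vs Langmuir {1/(1+math.exp(eps-mu)):.2f}")
```

The simulated fractional loading converges to the Langmuir isotherm because the sites are independent; real frameworks need particle-particle interactions and continuous positions, which is where the actual computational cost lies.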

  14. 34 CFR 403.185 - How does the Secretary compute maintenance of effort in the event of a waiver?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... VOCATIONAL AND APPLIED TECHNOLOGY EDUCATION PROGRAM What Financial Conditions Must Be Met by a State? § 403... 34 Education 3 2010-07-01 2010-07-01 false How does the Secretary compute maintenance of effort in the event of a waiver? 403.185 Section 403.185 Education Regulations of the Offices of the Department...

  15. Computational Fluid Dynamics Technology for Hypersonic Applications

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.

    2003-01-01

    Several current challenges in computational fluid dynamics and aerothermodynamics for hypersonic vehicle applications are discussed. Example simulations are presented from code validation and code benchmarking efforts to illustrate capabilities and limitations. Opportunities to advance the state of the art in algorithms, grid generation and adaptation, and code validation are identified. Highlights of diverse efforts to address these challenges are then discussed. One such effort to re-engineer and synthesize the existing analysis capability in LAURA, VULCAN, and FUN3D will provide context for these discussions. The critical (and evolving) role of agile software engineering practice in the capability enhancement process is also noted.

  16. 32 CFR 154.40 - General.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... effort to assess the probability of future behavior which could have an effect adverse to the national... prior experience with similar cases, reasonably suggest a degree of probability of prejudicial behavior...

  17. 32 CFR 154.40 - General.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... effort to assess the probability of future behavior which could have an effect adverse to the national... prior experience with similar cases, reasonably suggest a degree of probability of prejudicial behavior...

  18. Kids first?

    PubMed

    Sharfstein, J

    2000-01-01

    Nearly 20 years after the licensure of a vaccine against the hepatitis B virus, an estimated 300,000 U.S. residents still become infected with the potentially fatal liver virus every year. One major reason for the persistence of hepatitis B is that few adolescents and adults whose sexual and drug-using behavior places them in danger of infection are able to obtain the vaccine. Public health authorities and legislators have spent hundreds of millions of dollars to vaccinate low-risk but politically popular babies, while largely ignoring high-risk older siblings, parents, aunts, and uncles. Now this strategy, chosen in part for political reasons, is unwittingly fueling anti-vaccine efforts. The United States' poor use of the hepatitis B vaccine will surely cast a shadow over efforts to prevent HIV, a disease with remarkably similar transmission patterns.

  19. Equal Time for Women.

    ERIC Educational Resources Information Center

    Kolata, Gina

    1984-01-01

    Examines social influences which discourage women from pursuing studies in computer science, including monopoly of computer time by boys at the high school level, sexual harassment in college, movies, and computer games. Describes some initial efforts to encourage females of all ages to study computer science. (JM)

  20. A plug flow reactor model of a vanadium redox flow battery considering the conductive current collectors

    NASA Astrophysics Data System (ADS)

    König, S.; Suriyah, M. R.; Leibfried, T.

    2017-08-01

    A lumped-parameter model for vanadium redox flow batteries, which use metallic current collectors, is extended into a one-dimensional model using the plug flow reactor principle. Thus, the commonly used simplification of a perfectly mixed cell is no longer required. The resistances of the cell components are derived in the in-plane and through-plane directions. The copper current collector is the only component with a significant in-plane conductance, which allows for a simplified electrical network. The division of a full-scale flow cell into 10 layers in the direction of fluid flow represents a reasonable compromise between computational effort and accuracy. Due to the variations in the state of charge and thus the open circuit voltage of the electrolyte, the currents in the individual layers vary considerably. Hence, there are situations in which the first layer, directly at the electrolyte input, carries a multiple of the last layer's current. The conventional model overestimates the cell performance. In the worst-case scenario, the more accurate 20-layer model yields a discharge capacity 9.4% smaller than that computed with the conventional model. The conductive current collector effectively eliminates the high over-potentials in the last layers of the plug flow reactor models that have been reported previously.
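
    A minimal sketch of the layered idea, with hypothetical parameters rather than the paper's model: the conductive collector forces all layers to share one terminal voltage, but each layer has a different open circuit voltage set by its local state of charge, so the inlet layer carries the most current:

```python
import math

# Hypothetical parameters for illustration (not from the paper).
E0, k = 1.4, 0.0514        # standard potential [V], ~2RT/F at 25 C [V]
n_layers = 10
R_layer = 0.02 * n_layers  # each layer has 1/n of the area -> n-fold resistance [ohm]
I_total = 10.0             # total discharge current [A]

def ocv(soc):
    """Nernst-type open circuit voltage as a function of state of charge."""
    return E0 + k * math.log(soc / (1.0 - soc))

# State of charge drops along the flow path as the electrolyte reacts (plug flow).
socs = [0.6 - 0.04 * i for i in range(n_layers)]

# Layers are electrically in parallel through the collectors, so they all see
# the same terminal voltage V:  I_total = sum((ocv_i - V) / R_layer).
V = (sum(ocv(s) for s in socs) - I_total * R_layer) / n_layers
currents = [(ocv(s) - V) / R_layer for s in socs]

print(f"terminal voltage: {V:.3f} V")
for i, (s, c) in enumerate(zip(socs, currents), 1):
    print(f"layer {i:2d}: SOC={s:.2f}  I={c:5.2f} A")
```

Even in this crude sketch, the inlet layer (highest SOC, highest OCV) carries noticeably more current than the outlet layer, which is the qualitative effect the abstract describes.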

  1. Investigation of the bindings of a class of inhibitors with GSK3β kinase using thermodynamic integration MD simulation and kinase assay.

    PubMed

    Hsu, Chia-Jen; Hsu, Wen-Chi; Lee, Der-Jay; Liu, An-Lun; Chang, Chia-Ming; Shih, Huei-Jhen; Huang, Wun-Han; Lee-Chen, Guey-Jen; Hsieh-Li, Hsiu Mei; Lee, Guan-Chiun; Sun, Ying-Chieh

    2017-08-01

    GSK3β kinase is a noteworthy target for the discovery of drugs to treat several diseases. In the effort to identify a new inhibitor lead compound, we utilized thermodynamic integration (TI)-molecular dynamics (MD) simulation and kinase assay to investigate the bindings between GSK3β kinase and five compounds that were analogous to a known inhibitor with an available crystal structure. TI-MD simulations of the first two compounds (analogs 1 and 2) were used for calibration. The computed binding affinities of analogs 1 and 2 agreed well with the experimental results. The remaining three compounds (analogs 3-5) were newly obtained from a database search, and their affinity data were newly measured in our labs. TI-MD simulations predicted the binding modes, and the computed ΔΔG values correlated reasonably well with the experimental affinity data. These newly identified inhibitors appear to be new leads according to our survey of GSK3β inhibitors listed in recent review articles. The predicted binding modes of these compounds should aid in designing new derivatives of these compounds in the future. © 2017 John Wiley & Sons A/S.

  2. Deep Part Load Flow Analysis in a Francis Model turbine by means of two-phase unsteady flow simulations

    NASA Astrophysics Data System (ADS)

    Conrad, Philipp; Weber, Wilhelm; Jung, Alexander

    2017-04-01

    Hydropower plants are indispensable to stabilize the grid by reacting quickly to changes of the energy demand. However, an extension of the operating range towards high and deep part load conditions without fatigue of the hydraulic components is desirable to increase their flexibility. In this paper a model sized Francis turbine at low discharge operating conditions (Q/QBEP = 0.27) is analyzed by means of computational fluid dynamics (CFD). Unsteady two-phase simulations for two Thoma-number conditions are conducted. Stochastic pressure oscillations, observed on the test rig at low discharge, require sophisticated numerical models together with small time steps, large grid sizes and long simulation times to cope with these fluctuations. In this paper the BSL-EARSM model (Explicit Algebraic Reynolds Stress) was applied as a compromise between scale resolving and two-equation turbulence models with respect to computational effort and accuracy. Simulation results are compared to pressure measurements showing reasonable agreement in resolving the frequency spectra and amplitude. Inner blade vortices were predicted successfully in shape and size. Surface streamlines in blade-to-blade view are presented, giving insights to the formation of the inner blade vortices. The acquired time dependent pressure fields can be used for quasi-static structural analysis (FEA) for fatigue calculations in the future.

  3. Improved dynamic analysis method using load-dependent Ritz vectors

    NASA Technical Reports Server (NTRS)

    Escobedo-Torres, J.; Ricles, J. M.

    1993-01-01

    The dynamic analysis of large space structures is important in order to predict their behavior under operating conditions. Computer models of large space structures are characterized by having a large number of degrees of freedom, and the computational effort required to carry out the analysis is very large. Conventional methods of solution utilize a subset of the eigenvectors of the system, but for systems with many degrees of freedom, the solution of the eigenproblem is in many cases the most costly phase of the analysis. For this reason, alternate solution methods need to be considered. It is important that the method chosen for the analysis be efficient and that accurate results be obtainable. The load dependent Ritz vector method is presented as an alternative to the classical normal mode methods for obtaining dynamic responses of large space structures. A simplified model of a space station is used to compare results. Results show that the load dependent Ritz vector method predicts the dynamic response better than the classical normal mode method. Even though this alternate method is very promising, further studies are necessary to fully understand its attributes and limitations.
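
    The load-dependent Ritz vector recurrence itself is compact: a static solve against the load pattern, then repeated static solves against the inertia of the previous vector, with mass-orthonormalization. A sketch for a small system (illustrative numbers, not the space station model):

```python
import numpy as np

def load_dependent_ritz(K, M, f, n_vec):
    """Generate load-dependent Ritz vectors (Wilson-type recurrence).

    K, M : symmetric stiffness and mass matrices (n x n)
    f    : spatial distribution of the dynamic load (n,)
    Returns an n x n_vec matrix of M-orthonormal vectors.
    """
    n = K.shape[0]
    X = np.zeros((n, n_vec))
    # First vector: static response to the load pattern.
    x = np.linalg.solve(K, f)
    X[:, 0] = x / np.sqrt(x @ M @ x)               # M-normalize
    for i in range(1, n_vec):
        # Next vector: static response to the inertia of the previous one.
        x = np.linalg.solve(K, M @ X[:, i - 1])
        # M-orthogonalize against all previous vectors (Gram-Schmidt).
        for j in range(i):
            x -= (X[:, j] @ M @ x) * X[:, j]
        X[:, i] = x / np.sqrt(x @ M @ x)
    return X

# Tiny 3-DOF spring-mass chain (hypothetical numbers).
K = np.array([[2.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -1.0, 1.0]])
M = np.eye(3)
f = np.array([0.0, 0.0, 1.0])  # tip load
X = load_dependent_ritz(K, M, f, 2)
print(X.T @ M @ X)  # ~ identity: vectors are M-orthonormal
```

Each new vector costs one linear solve with the already-factored stiffness matrix, which is why the method avoids the expensive eigenproblem the abstract mentions.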

  4. Analysis and design of planar and non-planar wings for induced drag minimization

    NASA Technical Reports Server (NTRS)

    Mortara, K.; Straussfogel, Dennis M.; Maughmer, Mark D.

    1991-01-01

    The goal of the work was to develop and validate computational tools to be used for the design of planar and non-planar wing geometries for minimum induced drag. Because of the iterative nature of the design problem, it is important that, in addition to being sufficiently accurate for the problem at hand, these tools are reasonably fast and computationally efficient. Toward this end, a method of predicting induced drag in the presence of a non-rigid wake is coupled with a panel method. The induced drag prediction technique is based on the Kutta-Joukowski law applied at the trailing edge. Until recently, the use of this method had not been fully explored, with pressure integration and Trefftz-plane calculations favored instead. As is shown in this report, however, the Kutta-Joukowski method is able to give better results for a given amount of effort than the more common techniques, particularly when relaxed wakes and non-planar wing geometries are considered. Using these tools, a workable design method is in place which takes into account relaxed wakes and non-planar wing geometries. It is recommended that this method be used to design a wind-tunnel experiment to verify the predicted aerodynamic benefits of non-planar wing geometries.

  5. Year 2 Report: Protein Function Prediction Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, C E

    2012-04-27

    Upon completion of our second year of development in a 3-year development cycle, we have completed a prototype protein structure-function annotation and function prediction system: Protein Function Prediction (PFP) platform (v.0.5). We have met our milestones for Years 1 and 2 and are positioned to continue development in completion of our original statement of work, or a reasonable modification thereof, in service to DTRA Programs involved in diagnostics and medical countermeasures research and development. The PFP platform is a multi-scale computational modeling system for protein structure-function annotation and function prediction. As of this writing, PFP is the only existing fully automated, high-throughput, multi-scale modeling, whole-proteome annotation platform, and represents a significant advance in the field of genome annotation (Fig. 1). PFP modules perform protein functional annotations at the sequence, systems biology, protein structure, and atomistic levels of biological complexity (Fig. 2). Because these approaches provide orthogonal means of characterizing proteins and suggesting protein function, PFP processing maximizes the protein functional information that can currently be gained by computational means. Comprehensive annotation of pathogen genomes is essential for bio-defense applications in pathogen characterization, threat assessment, and medical countermeasure design and development in that it can short-cut the time and effort required to select and characterize protein biomarkers.

  6. Assessment of a hybrid finite element-transfer matrix model for flat structures with homogeneous acoustic treatments.

    PubMed

    Alimonti, Luca; Atalla, Noureddine; Berry, Alain; Sgard, Franck

    2014-05-01

    Modeling complex vibroacoustic systems including poroelastic materials using finite element based methods can be unfeasible for practical applications. For this reason, analytical approaches such as the transfer matrix method are often preferred to obtain a quick estimation of the vibroacoustic parameters. However, the strong assumptions inherent within the transfer matrix method lead to a lack of accuracy in the description of the geometry of the system. As a result, the transfer matrix method is inherently limited to the high frequency range. Nowadays, hybrid substructuring procedures have become quite popular. Indeed, different modeling techniques are typically sought to describe complex vibroacoustic systems over the widest possible frequency range. As a result, the flexibility and accuracy of the finite element method and the efficiency of the transfer matrix method could be coupled in a hybrid technique to obtain a reduction of the computational burden. In this work, a hybrid methodology is proposed. The performances of the method in predicting the vibroacoustic indicators of flat structures with attached homogeneous acoustic treatments are assessed. The results prove that, under certain conditions, the hybrid model allows for a reduction of the computational effort while preserving enough accuracy with respect to the full finite element solution.
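
    The efficiency of the transfer matrix method comes from representing each layer by a single 2x2 matrix per frequency. A normal-incidence sketch for one homogeneous fluid layer, with illustrative values not taken from the paper:

```python
import numpy as np

# Minimal transfer matrix method (TMM) sketch: a 2x2 matrix per layer per
# frequency relates (pressure, velocity) across the layer. Chaining layers
# is just matrix multiplication, hence the method's low cost.
rho_air, c_air = 1.21, 343.0           # surrounding air half-spaces
rho_l, c_l, d = 1000.0, 1500.0, 0.05   # a dense fluid layer, 5 cm thick

def layer_matrix(freq, rho, c, d):
    k = 2 * np.pi * freq / c           # wavenumber in the layer
    Z = rho * c                        # characteristic impedance
    return np.array([[np.cos(k * d), 1j * Z * np.sin(k * d)],
                     [1j * np.sin(k * d) / Z, np.cos(k * d)]])

def transmission_loss(freq):
    T = layer_matrix(freq, rho_l, c_l, d)  # chain layers with T1 @ T2 @ ...
    Z0 = rho_air * c_air
    # Normal-incidence transmission coefficient between two air half-spaces.
    tau = 2 / (T[0, 0] + T[0, 1] / Z0 + T[1, 0] * Z0 + T[1, 1])
    return -20 * np.log10(abs(tau))

print(f"TL of the layer at 1 kHz: {transmission_loss(1000.0):.1f} dB")
```

Poroelastic layers replace this matrix with a larger one (Biot theory), but the per-frequency cost stays tiny compared to a full finite element model, which is the trade-off the hybrid method exploits.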

  7. Potential-scour assessments and estimates of scour depth using different techniques at selected bridge sites in Missouri

    USGS Publications Warehouse

    Huizinga, Richard J.; Rydlund, Jr., Paul H.

    2004-01-01

    The evaluation of scour at bridges throughout the state of Missouri has been ongoing since 1991 in a cooperative effort by the U.S. Geological Survey and Missouri Department of Transportation. A variety of assessment methods have been used to identify bridges susceptible to scour and to estimate scour depths. A potential-scour assessment (Level 1) was used at 3,082 bridges to identify bridges that might be susceptible to scour. A rapid estimation method (Level 1+) was used to estimate contraction, pier, and abutment scour depths at 1,396 bridge sites to identify bridges that might be scour critical. A detailed hydraulic assessment (Level 2) was used to compute contraction, pier, and abutment scour depths at 398 bridges to determine which bridges are scour critical and would require further monitoring or application of scour countermeasures. The rapid estimation method (Level 1+) was designed to be a conservative estimator of scour depths compared to depths computed by a detailed hydraulic assessment (Level 2). Detailed hydraulic assessments were performed at 316 bridges that also had received a rapid estimation assessment, providing a broad data base to compare the two scour assessment methods. The scour depths computed by each of the two methods were compared for bridges that had similar discharges. For Missouri, the rapid estimation method (Level 1+) did not provide a reasonable conservative estimate of the detailed hydraulic assessment (Level 2) scour depths for contraction scour, but the discrepancy was the result of using different values for variables that were common to both of the assessment methods. The rapid estimation method (Level 1+) was a reasonable conservative estimator of the detailed hydraulic assessment (Level 2) scour depths for pier scour if the pier width is used for piers without footing exposure and the footing width is used for piers with footing exposure. 
Detailed hydraulic assessment (Level 2) scour depths were conservatively estimated by the rapid estimation method (Level 1+) for abutment scour, but there was substantial variability in the estimates and several substantial underestimations.

  8. From neural oscillations to reasoning ability: Simulating the effect of the theta-to-gamma cycle length ratio on individual scores in a figural analogy test.

    PubMed

    Chuderski, Adam; Andrelczyk, Krzysztof

    2015-02-01

    Several existing computational models of working memory (WM) have predicted a positive relationship (later confirmed empirically) between WM capacity and the individual ratio of theta to gamma oscillatory band lengths. These models assume that each gamma cycle represents one WM object (e.g., a binding of its features), whereas the theta cycle integrates such objects into the maintained list. As WM capacity strongly predicts reasoning, it might be expected that this ratio also predicts performance in reasoning tasks. However, no computational model has yet explained how the differences in the theta-to-gamma ratio found among adult individuals might contribute to their scores on a reasoning test. Here, we propose a novel model of how WM capacity constrains figural analogical reasoning, aimed at explaining inter-individual differences in reasoning scores in terms of the characteristics of oscillatory patterns in the brain. In the model, the gamma cycle encodes the bindings between objects/features and the roles they play in the relations processed. Asynchrony between consecutive gamma cycles results from lateral inhibition between oscillating bindings. Computer simulations showed that achieving the highest WM capacity required reaching the optimal level of inhibition. When too strong, this inhibition eliminated some bindings from WM, whereas, when inhibition was too weak, the bindings became unstable and fell apart or became improperly grouped. The model aptly replicated several empirical effects and the distribution of individual scores, as well as the patterns of correlations found in the 100-people sample attempting the same reasoning task. Most importantly, the model's reasoning performance strongly depended on its theta-to-gamma ratio in the same way as the performance of human participants depended on their WM capacity. 
The data suggest that proper regulation of oscillations in the theta and gamma bands may be crucial for both high WM capacity and effective complex cognition. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
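
    The core assumption shared by these models, one item per gamma cycle nested within a theta cycle, reduces to a one-line capacity estimate (the frequencies below are illustrative, not from the study):

```python
# Capacity implied by theta-gamma nesting: the number of gamma cycles
# that fit into one theta cycle. Frequencies are hypothetical examples.
def wm_capacity(theta_hz, gamma_hz):
    return int(gamma_hz // theta_hz)

print(wm_capacity(6.0, 40.0))  # slower theta: 6 items
print(wm_capacity(8.0, 40.0))  # faster theta: 5 items
```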

  9. Knowing, Applying, and Reasoning about Arithmetic: Roles of Domain-General and Numerical Skills in Multiple Domains of Arithmetic Learning

    ERIC Educational Resources Information Center

    Zhang, Xiao; Räsänen, Pekka; Koponen, Tuire; Aunola, Kaisa; Lerkkanen, Marja-Kristiina; Nurmi, Jari-Erik

    2017-01-01

    The longitudinal relations of domain-general and numerical skills at ages 6-7 years to 3 cognitive domains of arithmetic learning, namely knowing (written computation), applying (arithmetic word problems), and reasoning (arithmetic reasoning) at age 11, were examined for a representative sample of 378 Finnish children. The results showed that…

  10. The Contribution of Reasoning to the Utilization of Feedback from Software When Solving Mathematical Problems

    ERIC Educational Resources Information Center

    Olsson, Jan

    2018-01-01

    This study investigates how students' reasoning contributes to their utilization of computer-generated feedback. Sixteen 16-year-old students solved a linear function task designed to present a challenge to them using dynamic software, GeoGebra, for assistance. The data were analysed with respect both to character of reasoning and to the use of…

  11. Age-Related Changes in Reasons for Using Alcohol and Marijuana From Ages 18 to 30 in a National Sample

    PubMed Central

    Patrick, Megan E.; Schulenberg, John E.; O’Malley, Patrick M.; Maggs, Jennifer L.; Kloska, Deborah D.; Johnston, Lloyd D.; Bachman, Jerald G.

    2011-01-01

    This study used up to seven waves of data from 32 consecutive cohorts of participants in the national longitudinal Monitoring the Future study to model changes in self-reported reasons for using alcohol and marijuana by age (18 to 30), gender, and recent substance use. The majority of stated reasons for use decreased in prevalence across young adulthood (e.g., social/recreational and coping with negative affect reasons); exceptions included age-related increases in using to relax (alcohol and marijuana), to sleep (alcohol), because it tastes good (alcohol), and to get high (marijuana). Women were more likely than men to report drinking for reasons involving distress (i.e., to get away from problems), while men were more likely than women to endorse all other reasons. Greater substance use at age 18 was associated with greater likelihood of all reasons except to experiment and to fit in. A better understanding of developmental changes in reasons for use is important for understanding normative changes in substance use behaviors and for informing intervention efforts involving underlying reasons for use. PMID:21417516

  12. Combining Computational and Social Effort for Collaborative Problem Solving

    PubMed Central

    Wagy, Mark D.; Bongard, Josh C.

    2015-01-01

    Rather than replacing human labor, there is growing evidence that networked computers create opportunities for collaborations of people and algorithms to solve problems beyond either of them. In this study, we demonstrate the conditions under which such synergy can arise. We show that, for a design task, three elements are sufficient: humans apply intuitions to the problem, algorithms automatically determine and report back on the quality of designs, and humans observe and innovate on others’ designs to focus creative and computational effort on good designs. This study suggests how such collaborations should be composed for other domains, as well as how social and computational dynamics mutually influence one another during collaborative problem solving. PMID:26544199

  13. Computing Cluster for Large Scale Turbulence Simulations and Applications in Computational Aeroacoustics

    NASA Astrophysics Data System (ADS)

    Lele, Sanjiva K.

    2002-08-01

    Funds were received in April 2001 under the Department of Defense DURIP program for construction of a 48 processor high performance computing cluster. This report details the hardware which was purchased and how it has been used to enable and enhance research activities directly supported by, and of interest to, the Air Force Office of Scientific Research and the Department of Defense. The report is divided into two major sections. The first section after this summary describes the computer cluster, its setup, and some cluster performance benchmark results. The second section explains ongoing research efforts which have benefited from the cluster hardware, and presents highlights of those efforts since installation of the cluster.

  14. 38 CFR 21.184 - “Evaluation and planning” status.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... prior initial evaluations, or (ii) Current or previous individualized rehabilitation plans. (b... every reasonable effort to enable the veteran to complete the evaluation and planning phase of the...

  15. An interval logic for higher-level temporal reasoning

    NASA Technical Reports Server (NTRS)

    Schwartz, R. L.; Melliar-Smith, P. M.; Vogt, F. H.; Plaisted, D. A.

    1983-01-01

    Prior work explored temporal logics, based on classical modal logics, as a framework for specifying and reasoning about concurrent programs, distributed systems, and communications protocols, and reported on efforts using temporal reasoning primitives to express very high level abstract requirements that a program or system is to satisfy. Based on experience with those primitives, this report describes an Interval Logic that is more suitable for expressing such higher level temporal properties. The report provides a formal semantics for the Interval Logic, and several examples of its use. A description of decision procedures for the logic is also included.

  16. 77 FR 34063 - Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-08

    ... Phones and Tablet Computers, and Components Thereof Institution of Investigation AGENCY: U.S... computers, and components thereof by reason of infringement of certain claims of U.S. Patent No. 5,570,369... mobile phones and tablet computers, and components thereof that infringe one or more of claims 1-3 and 5...

  17. Eric Bonnema | NREL

    Science.gov Websites

    contributes to the research efforts for commercial buildings. This effort is dedicated to studying the commercial sector whole-building energy simulation, scientific computing, and software configuration and

  18. Access control and privacy in large distributed systems

    NASA Technical Reports Server (NTRS)

    Leiner, B. M.; Bishop, M.

    1986-01-01

    Large scale distributed systems consist of workstations, mainframe computers, supercomputers and other types of servers, all connected by a computer network. These systems are being used in a variety of applications including the support of collaborative scientific research. In such an environment, issues of access control and privacy arise. Access control is required for several reasons, including the protection of sensitive resources and cost control. Privacy is also required for similar reasons, including the protection of a researcher's proprietary results. A possible architecture for integrating available computer and communications security technologies into a system that meets these requirements is described. This architecture is meant as a starting point for discussion, rather than the final answer.

  19. Designing Computer Learning Environments for Engineering and Computer Science: The Scaffolded Knowledge Integration Framework.

    ERIC Educational Resources Information Center

    Linn, Marcia C.

    1995-01-01

    Describes a framework called scaffolded knowledge integration and illustrates how it guided the design of two successful course enhancements in the field of computer science and engineering: the LISP Knowledge Integration Environment and the spatial reasoning environment. (101 references) (Author/MKR)

  20. 77 FR 52759 - Certain Wireless Communication Devices, Portable Music and Data Processing Devices, Computers and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-30

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-745] Certain Wireless Communication Devices, Portable Music and Data Processing Devices, Computers and Components Thereof; Notice of... communication devices, portable music and data processing devices, computers and components thereof by reason of...

  1. Conversational Simulation in Computer-Assisted Language Learning: Potential and Reality.

    ERIC Educational Resources Information Center

    Coleman, D. Wells

    1988-01-01

    Addresses the potential of conversational simulations for computer-assisted language learning (CALL) and reasons why this potential is largely untapped. Topics discussed include artificial intelligence; microworlds; parsing; realism versus reality in computer software; intelligent tutoring systems; and criteria to clarify what kinds of CALL…

  2. Association between females' perceptions of college aerobic class motivational climates and their responses.

    PubMed

    Brown, Theresa C; Fry, Mary D

    2013-01-01

    The aim of this study was to examine the relationship between female college students' perceptions of the motivational climate in their aerobics classes and their adaptive exercise responses. Data were collected from university group exercise classes in spring 2008. The participants (N = 213) responded to a questionnaire measuring perceptions of the climate (i.e., caring, task-, and ego-involving), correlates of intrinsic motivation (i.e., interest/enjoyment, perceived competence, effort/importance, and tension/pressure), commitment to exercise, and reasons for exercising. Canonical correlation analyses revealed that participants who perceived a predominantly caring, task-involving climate reported higher interest/enjoyment, perceived competence, effort/importance, and commitment to exercise, as well as lower tension/pressure. Further, those who perceived a highly caring, task-involving, and low ego-involving climate were also more likely to report health-related reasons for exercising rather than appearance-focused reasons. Results suggested that important motivational benefits might exist when women perceive caring, task-involving climates in their aerobics class settings. Aerobics class instructors who intentionally create caring, task-involving climates may promote more adaptive motivational responses among their female participants.

  3. Study of the Use of Time-Mean Vortices to Generate Lift for MAV Applications

    DTIC Science & Technology

    2011-05-31

    A suspended microplate was fabricated via MEMS technology, and a Lorentz force was used to drive the suspended MEMS-based microplate to in-plane resonance. Computational effort centers around optimization of a range of parameters (geometry, frequency, amplitude of oscillation, etc.).

  4. Improving the learning of clinical reasoning through computer-based cognitive representation.

    PubMed

    Wu, Bian; Wang, Minhong; Johnson, Janice M; Grotzer, Tina A

    2014-01-01

    Objective: Clinical reasoning is usually taught using a problem-solving approach, which is widely adopted in medical education. However, learning through problem solving is difficult as a result of the contextualization and dynamic aspects of actual problems. Moreover, knowledge acquired from problem-solving practice tends to be inert and fragmented. This study proposed a computer-based cognitive representation approach that externalizes and facilitates the complex processes in learning clinical reasoning. The approach is operationalized in a computer-based cognitive representation tool that involves argument mapping to externalize the problem-solving process and concept mapping to reveal the knowledge constructed from the problems. Methods: Twenty-nine Year 3 or higher students from a medical school in east China participated in the study. Participants used the proposed approach implemented in an e-learning system to complete four learning cases in 4 weeks on an individual basis. For each case, students interacted with the problem to capture critical data, generate and justify hypotheses, make a diagnosis, recall relevant knowledge, and update their conceptual understanding of the problem domain. Meanwhile, students used the computer-based cognitive representation tool to articulate and represent the key elements and their interactions in the learning process. Results: A significant improvement was found in students' learning products from the beginning to the end of the study, consistent with students' report of close-to-moderate progress in developing problem-solving and knowledge-construction abilities. No significant differences were found between the pretest and posttest scores within the 4-week period. The cognitive representation approach was found to provide more formative assessment. Conclusions: The computer-based cognitive representation approach improved the learning of clinical reasoning in both problem solving and knowledge construction.

  5. Improving the learning of clinical reasoning through computer-based cognitive representation

    PubMed Central

    Wu, Bian; Wang, Minhong; Johnson, Janice M.; Grotzer, Tina A.

    2014-01-01

    Objective: Clinical reasoning is usually taught using a problem-solving approach, which is widely adopted in medical education. However, learning through problem solving is difficult as a result of the contextualization and dynamic aspects of actual problems. Moreover, knowledge acquired from problem-solving practice tends to be inert and fragmented. This study proposed a computer-based cognitive representation approach that externalizes and facilitates the complex processes in learning clinical reasoning. The approach is operationalized in a computer-based cognitive representation tool that involves argument mapping to externalize the problem-solving process and concept mapping to reveal the knowledge constructed from the problems. Methods: Twenty-nine Year 3 or higher students from a medical school in east China participated in the study. Participants used the proposed approach implemented in an e-learning system to complete four learning cases in 4 weeks on an individual basis. For each case, students interacted with the problem to capture critical data, generate and justify hypotheses, make a diagnosis, recall relevant knowledge, and update their conceptual understanding of the problem domain. Meanwhile, students used the computer-based cognitive representation tool to articulate and represent the key elements and their interactions in the learning process. Results: A significant improvement was found in students’ learning products from the beginning to the end of the study, consistent with students’ report of close-to-moderate progress in developing problem-solving and knowledge-construction abilities. No significant differences were found between the pretest and posttest scores within the 4-week period. The cognitive representation approach was found to provide more formative assessment. Conclusions: The computer-based cognitive representation approach improved the learning of clinical reasoning in both problem solving and knowledge construction. PMID:25518871

  6. Improving the learning of clinical reasoning through computer-based cognitive representation.

    PubMed

    Wu, Bian; Wang, Minhong; Johnson, Janice M; Grotzer, Tina A

    2014-01-01

    Clinical reasoning is usually taught using a problem-solving approach, which is widely adopted in medical education. However, learning through problem solving is difficult as a result of the contextualization and dynamic aspects of actual problems. Moreover, knowledge acquired from problem-solving practice tends to be inert and fragmented. This study proposed a computer-based cognitive representation approach that externalizes and facilitates the complex processes in learning clinical reasoning. The approach is operationalized in a computer-based cognitive representation tool that involves argument mapping to externalize the problem-solving process and concept mapping to reveal the knowledge constructed from the problems. Twenty-nine Year 3 or higher students from a medical school in east China participated in the study. Participants used the proposed approach implemented in an e-learning system to complete four learning cases in 4 weeks on an individual basis. For each case, students interacted with the problem to capture critical data, generate and justify hypotheses, make a diagnosis, recall relevant knowledge, and update their conceptual understanding of the problem domain. Meanwhile, students used the computer-based cognitive representation tool to articulate and represent the key elements and their interactions in the learning process. A significant improvement was found in students' learning products from the beginning to the end of the study, consistent with students' report of close-to-moderate progress in developing problem-solving and knowledge-construction abilities. No significant differences were found between the pretest and posttest scores within the 4-week period. The cognitive representation approach was found to provide more formative assessment. The computer-based cognitive representation approach improved the learning of clinical reasoning in both problem solving and knowledge construction.

  7. Combatting Prejudice: Understanding Media Prejudice Toward Muslims and Advocacy Organizations’ Efforts to Combat It

    DTIC Science & Technology

    2017-12-01

    The thesis examines the reasons behind American media’s promotion of prejudice in civil society, focusing on civil society groups that may be responsible for promoting stereotypes. The thesis suggests that Orientalism and efforts... reach wider audiences to effect change based on intergroup contact theory, which promotes interaction among different groups. Second, advocacy...

  8. The Reality of Assessment in Business Schools: Rejoinder to "Why Assessment Will Never Work at Many Business Schools: A Call for Better Utilization of Pedagogical Research"

    ERIC Educational Resources Information Center

    Burke-Smalley, Lisa A.

    2017-01-01

    Bacon and Stewart (2016) argued that assurance of learning efforts in most business schools are largely futile--a stance held by many faculty members, for a variety of reasons. They provided detailed evidence that most schools' data collection efforts for assessment, particularly in graduate or niche programs, suffer from insufficient statistical…

  9. A General Approach to Measuring Test-Taking Effort on Computer-Based Tests

    ERIC Educational Resources Information Center

    Wise, Steven L.; Gao, Lingyun

    2017-01-01

    There has been an increased interest in the impact of unmotivated test taking on test performance and score validity. This has led to the development of new ways of measuring test-taking effort based on item response time. In particular, Response Time Effort (RTE) has been shown to provide an assessment of effort down to the level of individual…
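    The Response Time Effort index mentioned above can be illustrated with a minimal sketch: assuming each item has a rapid-guessing time threshold, RTE is the proportion of a test taker's responses whose response time meets or exceeds the item's threshold (the thresholds and response times below are hypothetical, chosen only for illustration).

```python
def response_time_effort(response_times, thresholds):
    """Compute Response Time Effort (RTE): the proportion of a test
    taker's item responses whose response time (in seconds) meets or
    exceeds the item's rapid-guessing threshold, i.e., responses that
    look like solution behavior rather than rapid guessing."""
    if len(response_times) != len(thresholds):
        raise ValueError("one response time per item is required")
    solution_behavior = [t >= th for t, th in zip(response_times, thresholds)]
    return sum(solution_behavior) / len(solution_behavior)

# Hypothetical example: 5 items, each with a 3-second threshold.
# Times 12.0, 8.0, and 30.0 meet the threshold, so RTE = 3/5.
rte = response_time_effort([12.0, 1.5, 8.0, 2.0, 30.0], [3.0] * 5)
print(rte)  # 0.6
```

    Low RTE values flag examinees whose scores may not reflect their ability, which is the validity concern the abstract raises.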

  10. Image analysis in cytology: DNA-histogramming versus cervical smear prescreening.

    PubMed

    Bengtsson, E W; Nordin, B

    1993-01-01

    The visual inspection of cellular specimens and histological sections through a light microscope plays an important role in clinical medicine and biomedical research. The human visual system is very good at the recognition of various patterns but less efficient at quantitative assessment of these patterns. Some samples are prepared in great numbers, most notably the screening for cervical cancer, the so-called PAP-smears, which results in hundreds of millions of samples each year, creating a tedious mass inspection task. Numerous attempts have been made over the last 40 years to create systems that solve these two tasks, the quantitative supplement to the human visual system and the automation of mass screening. The most difficult task, the total automation, has received the greatest attention with many large scale projects over the decades. In spite of all these efforts, still no generally accepted automated prescreening device exists on the market. The main reason for this failure is the great pattern recognition capabilities needed to distinguish between cancer cells and all other kinds of objects found in the specimens: cellular clusters, debris, degenerate cells, etc. Improved algorithms, the ever-increasing processing power of computers and progress in biochemical specimen preparation techniques make it likely that eventually useful automated prescreening systems will become available. Meanwhile, much less effort has been put into the development of interactive cell image analysis systems. Still, some such systems have been developed and put into use at thousands of laboratories worldwide. In these the human pattern recognition capability is used to select the fields and objects that are to be analysed while the computational power of the computer is used for the quantitative analysis of cellular DNA content or other relevant markers. 
Numerous studies have shown that the quantitative information about the distribution of cellular DNA content is of prognostic significance in many types of cancer. Several laboratories are therefore putting these techniques into routine clinical use. The more advanced systems can also study many other markers and cellular features, some known to be of clinical interest, others useful in research. The advances in computer technology are making these systems more generally available through decreasing cost, increasing computational power and improved user interfaces. We have been involved in research and development of both automated and interactive cell analysis systems during the last 20 years. Here some experiences and conclusions from this work will be presented as well as some predictions about what can be expected in the near future.

  11. New frontiers for intelligent content-based retrieval

    NASA Astrophysics Data System (ADS)

    Benitez, Ana B.; Smith, John R.

    2001-01-01

    In this paper, we examine emerging frontiers in the evolution of content-based retrieval systems that rely on an intelligent infrastructure. Here, we refer to intelligence as the capabilities of the systems to build and maintain situational or world models, utilize dynamic knowledge representation, exploit context, and leverage advanced reasoning and learning capabilities. We argue that these elements are essential to producing effective systems for retrieving audio-visual content at semantic levels matching those of human perception and cognition. In this paper, we review relevant research on the understanding of human intelligence and the construction of intelligent systems in the fields of cognitive psychology, artificial intelligence, semiotics, and computer vision. We also discuss how some of the principal ideas from these fields lead to new opportunities and capabilities for content-based retrieval systems. Finally, we describe some of our efforts in these directions. In particular, we present MediaNet, a multimedia knowledge representation framework, and some MPEG-7 description tools that facilitate and enable intelligent content-based retrieval.

  12. New frontiers for intelligent content-based retrieval

    NASA Astrophysics Data System (ADS)

    Benitez, Ana B.; Smith, John R.

    2000-12-01

    In this paper, we examine emerging frontiers in the evolution of content-based retrieval systems that rely on an intelligent infrastructure. Here, we refer to intelligence as the capabilities of the systems to build and maintain situational or world models, utilize dynamic knowledge representation, exploit context, and leverage advanced reasoning and learning capabilities. We argue that these elements are essential to producing effective systems for retrieving audio-visual content at semantic levels matching those of human perception and cognition. In this paper, we review relevant research on the understanding of human intelligence and the construction of intelligent systems in the fields of cognitive psychology, artificial intelligence, semiotics, and computer vision. We also discuss how some of the principal ideas from these fields lead to new opportunities and capabilities for content-based retrieval systems. Finally, we describe some of our efforts in these directions. In particular, we present MediaNet, a multimedia knowledge representation framework, and some MPEG-7 description tools that facilitate and enable intelligent content-based retrieval.

  13. Ground terminal expert (GTEX). Part 2: Expert system diagnostics for a 30/20 Gigahertz satellite transponder

    NASA Technical Reports Server (NTRS)

    Durkin, John; Schlegelmilch, Richard; Tallo, Donald

    1992-01-01

    A research effort was undertaken to investigate how expert system technology could be applied to a satellite communications system. The focus of the expert system is the satellite earth station. A proof-of-concept expert system called the Ground Terminal Expert (GTEX) was developed at the University of Akron in collaboration with the NASA Lewis Research Center. With the increasing demand for satellite earth stations, maintenance is becoming a vital issue, and vendors of such systems will be looking for cost-effective means of maintaining them. The objective of GTEX is to aid in the diagnosis of faults occurring in the digital earth station. GTEX was developed on a personal computer using the Automated Reasoning Tool for Information Management (ART-IM) developed by the Inference Corporation. Developed for the Phase 2 digital earth station, GTEX is a part of the Systems Integration Test and Evaluation (SITE) facility located at the NASA Lewis Research Center.

  14. Turbine Vane External Heat Transfer. Volume 2. Numerical Solutions of the Navier-stokes Equations for Two- and Three-dimensional Turbine Cascades with Heat Transfer

    NASA Technical Reports Server (NTRS)

    Yang, R. J.; Weinberg, B. C.; Shamroth, S. J.; Mcdonald, H.

    1985-01-01

    The application of the time-dependent ensemble-averaged Navier-Stokes equations to transonic turbine cascade flow fields was examined. In particular, efforts focused on an assessment of the procedure in conjunction with a suitable turbulence model to calculate steady turbine flow fields using an O-type coordinate system. Three cascade configurations were considered. Comparisons were made between the predicted and measured surface pressures and heat transfer distributions wherever available. In general, the pressure predictions were in good agreement with the data. Heat transfer calculations also showed good agreement when an empirical transition model was used. However, further work in the development of laminar-turbulent transitional models is indicated. The calculations showed most of the known features associated with turbine cascade flow fields. These results indicate the ability of the Navier-Stokes analysis to predict, in reasonable amounts of computation time, the surface pressure distribution, heat transfer rates, and viscous flow development for turbine cascades operating at realistic conditions.

  15. Design knowledge capture for a corporate memory facility

    NASA Technical Reports Server (NTRS)

    Boose, John H.; Shema, David B.; Bradshaw, Jeffrey M.

    1990-01-01

    Currently, much of the information regarding decision alternatives and trade-offs made in the course of a major program development effort is not represented or retained in a way that permits computer-based reasoning over the life cycle of the program. The loss of this information results in problems in tracing design alternatives to requirements, in assessing the impact of changes in requirements, and in configuration management. To address these problems, the problem of building an intelligent, active corporate memory facility was studied; such a facility would provide for the capture of the requirements and standards of a program, analyze the design alternatives and trade-offs made over the program's lifetime, and examine relationships between requirements and design trade-offs. Early phases of the work have concentrated on design knowledge capture for the Space Station Freedom. Tools that help automate and document engineering trade studies are demonstrated and extended, and another tool is being developed to help designers interactively explore design alternatives and constraints.

  16. Task network models in the prediction of workload imposed by extravehicular activities during the Hubble Space Telescope servicing mission

    NASA Technical Reports Server (NTRS)

    Diaz, Manuel F.; Takamoto, Neal; Woolford, Barbara

    1994-01-01

    In a joint effort with Brooks AFB, Texas, the Flight Crew Support Division at JSC has begun a computer simulation and performance modeling program directed at establishing the predictive validity of software tools for modeling human performance during spaceflight. This paper addresses the utility of task network modeling for predicting the workload that astronauts are likely to encounter in extravehicular activities (EVA) during the Hubble Space Telescope (HST) repair mission. The intent of the study was to determine whether two EVA crewmembers and one intravehicular activity (IVA) crewmember could reasonably be expected to complete HST Wide Field/Planetary Camera (WFPC) replacement in the allotted time. Ultimately, examination of the points during HST servicing that may result in excessive workload will lead to recommendations to the HST Flight Systems and Servicing Project concerning (1) expectation of degraded performance, (2) the need to change task allocation across crewmembers, (3) the need to expand the timeline, and (4) the need to increase the number of EVA's.

  17. Automatising the analysis of stochastic biochemical time-series

    PubMed Central

    2015-01-01

    Background: Mathematical and computational modelling of biochemical systems has seen a lot of effort devoted to the definition and implementation of high-performance mechanistic simulation frameworks. Within these frameworks it is possible to analyse complex models under a variety of configurations, eventually selecting the best setting of, e.g., parameters for a target system. Motivation: This operational pipeline relies on the ability to interpret the predictions of a model, often represented as simulation time-series. Thus, an efficient data analysis pipeline is crucial to automatise time-series analyses, bearing in mind that errors in this phase might mislead the modeller's conclusions. Results: For this reason we have developed an intuitive framework-independent Python tool to automate analyses common to a variety of modelling approaches. These include assessment of useful non-trivial statistics for simulation ensembles, e.g., estimation of master equations. Intuitive and domain-independent batch scripts will allow the researcher to automatically prepare reports, thus speeding up the usual model-definition, testing and refinement pipeline. PMID:26051821
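    The kind of ensemble statistics such a tool automates can be sketched in a few lines of Python. The data layout below (one list of sampled values per stochastic simulation run, all runs sharing the same time grid) is an assumption for illustration, not the tool's actual API:

```python
def ensemble_stats(ensemble):
    """Given an ensemble of stochastic simulation time-series, all
    sampled on the same time grid (a list of runs, each a list of
    values), return the per-time-point mean and unbiased variance."""
    n_runs = len(ensemble)
    n_points = len(ensemble[0])
    means, variances = [], []
    for j in range(n_points):
        column = [run[j] for run in ensemble]  # all runs at time point j
        m = sum(column) / n_runs
        v = sum((x - m) ** 2 for x in column) / (n_runs - 1)
        means.append(m)
        variances.append(v)
    return means, variances

# Three hypothetical runs sampled at three time points.
runs = [[0, 1, 2], [0, 3, 4], [0, 2, 3]]
m, v = ensemble_stats(runs)
print(m)  # [0.0, 2.0, 3.0]
print(v)  # [0.0, 1.0, 1.0]
```

    A real pipeline would add further statistics (e.g., histograms per time point to approximate the master-equation distribution), but the per-time-point aggregation pattern is the same.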

  18. Assessment, development, and application of combustor aerothermal models

    NASA Technical Reports Server (NTRS)

    Holdeman, J. D.; Mongia, H. C.; Mularz, E. J.

    1989-01-01

    The gas turbine combustion system design and development effort is an engineering exercise to obtain an acceptable solution to the conflicting design trade-offs between combustion efficiency, gaseous emissions, smoke, ignition, restart, lean blowout, burner exit temperature quality, structural durability, and life cycle cost. For many years, these combustor design trade-offs have been carried out with the help of fundamental reasoning and extensive component and bench testing, backed by empirical and experience correlations. Recent advances in the capability of computational fluid dynamics codes have led to their application to complex 3-D flows such as those in the gas turbine combustor. A number of U.S. Government and industry sponsored programs have made significant contributions to the formulation, development, and verification of an analytical combustor design methodology which will better define the aerothermal loads in a combustor, and be a valuable tool for design of future combustion systems. The contributions made by NASA Hot Section Technology (HOST) sponsored Aerothermal Modeling and supporting programs are described.

  19. Integrated Planning for Telepresence With Time Delays

    NASA Technical Reports Server (NTRS)

    Johnston, Mark; Rabe, Kenneth

    2009-01-01

    A conceptual "intelligent assistant" and an artificial-intelligence computer program that implements the intelligent assistant have been developed to improve control exerted by a human supervisor over a robot that is so distant that communication between the human and the robot involves significant signal-propagation delays. The goal of the effort is not only to help the human supervisor monitor and control the state of the robot, but also to improve the efficiency of the robot by allowing the supervisor to "work ahead". The intelligent assistant is an integrated combination of an artificial-intelligence planner and a monitor of states of both the human supervisor and the remote robot. The novelty of the system lies in the way it uses the planner to reason about the states at both ends of the time delay. The purpose served by the assistant is to provide advice to the human supervisor about current and future activities, derived from a sequence of high-level goals to be achieved.

  20. New modified multi-level residue harmonic balance method for solving nonlinearly vibrating double-beam problem

    NASA Astrophysics Data System (ADS)

    Rahman, Md. Saifur; Lee, Yiu-Yin

    2017-10-01

    In this study, a new modified multi-level residue harmonic balance method is presented and adopted to investigate the forced nonlinear vibrations of axially loaded double beams. Although numerous nonlinear beam or linear double-beam problems have been tackled and solved, there have been few studies of this nonlinear double-beam problem. The geometric nonlinear formulations for a double-beam model are developed. The main advantage of the proposed method is that a set of decoupled nonlinear algebraic equations is generated at each solution level. This heavily reduces the computational effort compared with solving the coupled nonlinear algebraic equations generated in the classical harmonic balance method. The proposed method can generate the higher-level nonlinear solutions that are neglected by the previous modified harmonic balance method. The results from the proposed method agree reasonably well with those from the classical harmonic balance method. The effects of damping, axial force, and excitation magnitude on the nonlinear vibrational behaviour are examined.
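    As context for the harmonic balance family of methods, a minimal single-term sketch on the classical undamped Duffing oscillator (not the authors' multi-level double-beam formulation) shows how balancing the fundamental harmonic yields an amplitude-frequency relation:

```python
import math

def duffing_backbone(amplitude, epsilon=0.1, forcing=0.0):
    """One-term harmonic balance for the undamped Duffing equation
    x'' + x + eps*x**3 = F*cos(w*t).  Substituting x = A*cos(w*t),
    using cos**3 = (3*cos(w*t) + cos(3*w*t)) / 4, and balancing the
    cos(w*t) terms gives
        A*(1 - w**2) + (3/4)*eps*A**3 = F,
    so  w**2 = 1 + (3/4)*eps*A**2 - F/A   (for A != 0).
    Returns the response frequency w for a given amplitude A."""
    a = amplitude
    w_squared = 1.0 + 0.75 * epsilon * a * a - (forcing / a if forcing else 0.0)
    return math.sqrt(w_squared)

# Free vibration (F = 0): the hardening backbone curve w(A).
print(duffing_backbone(1.0))  # sqrt(1.075), about 1.0368
```

    A multi-level residue scheme refines such a first-level solution by solving for higher harmonics level by level; the appeal noted in the abstract is that the resulting algebraic equations decouple at each level.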

  1. Big data and visual analytics in anaesthesia and health care.

    PubMed

    Simpao, A F; Ahumada, L M; Rehman, M A

    2015-09-01

    Advances in computer technology, patient monitoring systems, and electronic health record systems have enabled rapid accumulation of patient data in electronic form (i.e. big data). Organizations such as the Anesthesia Quality Institute and Multicenter Perioperative Outcomes Group have spearheaded large-scale efforts to collect anaesthesia big data for outcomes research and quality improvement. Analytics--the systematic use of data combined with quantitative and qualitative analysis to make decisions--can be applied to big data for quality and performance improvements, such as predictive risk assessment, clinical decision support, and resource management. Visual analytics is the science of analytical reasoning facilitated by interactive visual interfaces, and it can facilitate performance of cognitive activities involving big data. Ongoing integration of big data and analytics within anaesthesia and health care will increase demand for anaesthesia professionals who are well versed in both the medical and the information sciences. © The Author 2015. Published by Oxford University Press on behalf of the British Journal of Anaesthesia. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  2. Real Time Computation of Kinetic Constraints to Support Equilibrium Reconstruction

    NASA Astrophysics Data System (ADS)

    Eggert, W. J.; Kolemen, E.; Eldon, D.

    2016-10-01

    A new method for quickly and automatically applying kinetic constraints to EFIT equilibrium reconstructions using readily available data is presented. The ultimate goal is to produce kinetic equilibrium reconstructions in real time and use them to constrain the DCON stability code as part of a disruption avoidance scheme. A first effort presented here replaces CPU-time expensive modules, such as the fast ion pressure profile calculation, with a simplified model. We show with a DIII-D database analysis that we can achieve reasonable predictions for selected applications by modeling the fast ion pressure profile and determining the fit parameters as functions of easily measured quantities including neutron rate and electron temperature on axis. Secondly, we present a strategy for treating Thomson scattering and Charge Exchange Recombination data to automatically form constraints for a kinetic equilibrium reconstruction, a process that historically was performed by hand. Work supported by US DOE DE-AC02-09CH11466 and DE-FC02-04ER54698.

  3. Web-based, GPU-accelerated, Monte Carlo simulation and visualization of indirect radiation imaging detector performance.

    PubMed

    Dong, Han; Sharma, Diksha; Badano, Aldo

    2014-12-01

    Monte Carlo simulations play a vital role in the understanding of the fundamental limitations, design, and optimization of existing and emerging medical imaging systems. Efforts in this area have resulted in the development of a wide variety of open-source software packages. One such package, hybridmantis, uses a novel hybrid concept to model indirect scintillator detectors by balancing the computational load using dual CPU and graphics processing unit (GPU) processors, obtaining computational efficiency with reasonable accuracy. In this work, the authors describe two open-source visualization interfaces, webmantis and visualmantis, to facilitate the setup of computational experiments via hybridmantis. The visualization tools visualmantis and webmantis enable the user to control simulation properties through a user interface. In the case of webmantis, control via a web browser allows access through mobile devices such as smartphones or tablets. webmantis acts as a server back-end and communicates with an NVIDIA GPU computing cluster that can support multiuser environments where users can execute different experiments in parallel. The output consists of the point response, pulse-height spectrum, and optical transport statistics generated by hybridmantis. Users can download the output images and statistics as a zip file for future reference. In addition, webmantis provides a visualization window that displays a few selected optical photon paths as they are transported through the detector columns and allows the user to trace the history of the optical photons. The visualization tools visualmantis and webmantis provide features such as on-the-fly generation of pulse-height spectra and response functions for microcolumnar x-ray imagers while allowing users to save simulation parameters and results from prior experiments. 
The graphical interfaces simplify the simulation setup and allow the user to go directly from specifying input parameters to receiving visual feedback for the model predictions.

  4. Computer-Based Tools for Evaluating Graphical User Interfaces

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.

    1997-01-01

    The user interface is the component of a software system that connects two very complex systems: humans and computers. Each of these two systems imposes certain requirements on the final product. The user is the judge of the usability and utility of the system; the computer software and hardware are the tools with which the interface is constructed. Mistakes are sometimes made in designing and developing user interfaces because the designers and developers have limited knowledge about human performance (e.g., problem solving, decision making, planning, and reasoning). Even those trained in user interface design make mistakes because they are unable to address all of the known requirements and constraints on design. Evaluation of the user interface is therefore a critical phase of the user interface development process. Evaluation should not be considered the final phase of design; rather, it should be part of an iterative design cycle, with the output of evaluation being fed back into design. The goal of this research was to develop a set of computer-based tools for objectively evaluating graphical user interfaces. The research was organized into three phases. The first phase resulted in the development of an embedded evaluation tool which evaluates the usability of a graphical user interface based on a user's performance. An expert system to assist in the design and evaluation of user interfaces based upon rules and guidelines was developed during the second phase. During the final phase of the research, an automatic layout tool to be used in the initial design of graphical interfaces was developed. The research was coordinated with NASA Marshall Space Flight Center's Mission Operations Laboratory's efforts in developing onboard payload display specifications for the Space Station.

  5. Expanded opportunities of THz passive camera for the detection of concealed objects

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.; Kuchik, Igor E.

    2013-10-01

    Among security problems, the detection of an object implanted into a human or animal body is an urgent one. At present, the main tool for detecting such objects is X-ray imaging only. However, X-rays are ionizing radiation and therefore cannot be used frequently. An alternative approach is passive THz imaging. In our opinion, a passive THz camera may, under certain conditions, help detect an object implanted in the human body. The physical basis for this possibility is the temperature trace on the human skin that results from the difference in temperature between the object and the surrounding parts of the body. Modern passive THz cameras do not have enough temperature resolution to see this difference directly. That is why we use computer processing to enhance the effective resolution of the passive THz camera for this application. After computer processing of images captured by the TS4 passive THz camera, developed by ThruVision Systems Ltd., we can see a pronounced temperature trace on the skin from water drunk, or other food eaten, by a person. Nevertheless, many difficulties remain on the way to a full solution of this problem. We also illustrate an improvement in the quality of images captured by commercially available passive THz cameras using computer processing. In some cases, one can fully suppress noise in the image without loss of quality. Using computer processing of THz images of objects concealed on the human body, one may improve them many times over. Consequently, the instrumental resolution of such devices may be increased without any additional engineering effort.
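    The abstract does not disclose the actual TS4 processing pipeline, but the generic idea it relies on (denoising followed by contrast stretching to make small temperature differences visible) can be sketched as follows. The function name, kernel, and percentile parameters are illustrative assumptions, not ThruVision's method:

```python
import numpy as np

def enhance_thz_frame(frame, sigma=1.0, low_pct=1.0, high_pct=99.0):
    """Denoise and contrast-stretch a THz intensity frame.

    Illustrative sketch only: a separable Gaussian blur suppresses
    sensor noise, then a percentile-based stretch expands the small
    remaining intensity (temperature) differences to fill [0, 1].
    """
    # Build a normalized 1-D Gaussian kernel.
    radius = max(1, int(3 * sigma))
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x**2 / (2 * sigma**2))
    kernel /= kernel.sum()

    # Separable blur: convolve each row, then each column.
    smoothed = np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 1, frame)
    smoothed = np.apply_along_axis(
        lambda col: np.convolve(col, kernel, mode="same"), 0, smoothed)

    # Percentile stretch: map [1st, 99th] percentile onto [0, 1].
    lo, hi = np.percentile(smoothed, [low_pct, high_pct])
    return np.clip((smoothed - lo) / max(hi - lo, 1e-12), 0.0, 1.0)
```

Denoising before stretching matters: stretching a raw noisy frame would amplify the noise along with the faint temperature trace.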

  6. Algorithmics - Is There Hope for a Unified Theory?

    NASA Astrophysics Data System (ADS)

    Hromkovič, Juraj

    Computer science was born with the formal definition of the notion of an algorithm. This definition provides clear limits of automatization, separating problems into algorithmically solvable ones and algorithmically unsolvable ones. The second big bang of computer science was the development of the concept of computational complexity. People recognized that problems that do not admit efficient algorithms are not solvable in practice. The search for a reasonable, clear and robust definition of the class of practically solvable algorithmic tasks started with the notion of the class {P} and of {NP}-completeness. In spite of the fact that this robust concept is still fundamental for judging the hardness of computational problems, a variety of approaches has been developed for solving instances of {NP}-hard problems in many applications. Our short, 40-year attempt to fix the fuzzy border between the practically solvable problems and the practically unsolvable ones is partly reminiscent of the never-ending search for the definition of "life" in biology, or for the definitions of matter and energy in physics. Can the search for the formal notion of "practical solvability" also become a never-ending story, or is there hope for a well-accepted, robust definition of it? Hopefully, it is not surprising that we are not able to answer this question in this invited talk. But dealing with this question is of crucial importance, because only through enormous effort do scientists get a better and better feeling for what fundamental notions of science like life and energy mean. In the flow of numerous technical results, we must not forget that most of the essential revolutionary contributions to science were made by defining new concepts and notions.

  7. Computational strategies for three-dimensional flow simulations on distributed computer systems

    NASA Technical Reports Server (NTRS)

    Sankar, Lakshmi N.; Weed, Richard A.

    1995-01-01

    This research effort is directed towards an examination of the issues involved in porting large computational fluid dynamics codes in use within industry to a distributed computing environment. The effort addresses strategies for implementing distributed computing in a device-independent fashion and for load balancing. A flow solver called TEAM, presently in use at Lockheed Aeronautical Systems Company, was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms, including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha workstations in the graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented; specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated; specifically, static load balancing, task-queue balancing, and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. (5) The implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.
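    The task-queue balancing strategy mentioned in the abstract can be sketched in miniature. This is a hypothetical illustration in Python, not the TEAM/PVM implementation: idle workers repeatedly pull the next unit of work from a shared queue, so faster workers naturally process more units and the load balances itself dynamically.

```python
import queue
import threading

def run_manager_worker(tasks, worker_fn, n_workers=4):
    """Dynamic load balancing via a shared task queue.

    The 'manager' enqueues all work units up front; each 'worker'
    thread pulls the next unit as soon as it finishes the previous
    one, so no static partitioning of the work is required.
    """
    task_q = queue.Queue()
    results = {}
    lock = threading.Lock()

    for i, task in enumerate(tasks):
        task_q.put((i, task))

    def worker():
        while True:
            try:
                i, task = task_q.get_nowait()
            except queue.Empty:
                return  # queue drained: this worker is done
            r = worker_fn(task)
            with lock:
                results[i] = r

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # Reassemble results in original task order.
    return [results[i] for i in range(len(tasks))]
```

In a real CFD solver the "tasks" would be grid blocks and `worker_fn` a per-block flow computation, with message passing (e.g., PVM or MPI) in place of shared memory.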

  8. Computational strategies for three-dimensional flow simulations on distributed computer systems

    NASA Astrophysics Data System (ADS)

    Sankar, Lakshmi N.; Weed, Richard A.

    1995-08-01

    This research effort is directed towards an examination of the issues involved in porting large computational fluid dynamics codes in use within industry to a distributed computing environment. The effort addresses strategies for implementing distributed computing in a device-independent fashion and for load balancing. A flow solver called TEAM, presently in use at Lockheed Aeronautical Systems Company, was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms, including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha workstations in the graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented; specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated; specifically, static load balancing, task-queue balancing, and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. (5) The implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.

  9. Increasing the impact of medical image computing using community-based open-access hackathons: The NA-MIC and 3D Slicer experience.

    PubMed

    Kapur, Tina; Pieper, Steve; Fedorov, Andriy; Fillion-Robin, J-C; Halle, Michael; O'Donnell, Lauren; Lasso, Andras; Ungi, Tamas; Pinter, Csaba; Finet, Julien; Pujol, Sonia; Jagadeesan, Jayender; Tokuda, Junichi; Norton, Isaiah; Estepar, Raul San Jose; Gering, David; Aerts, Hugo J W L; Jakab, Marianna; Hata, Nobuhiko; Ibanez, Luiz; Blezek, Daniel; Miller, Jim; Aylward, Stephen; Grimson, W Eric L; Fichtinger, Gabor; Wells, William M; Lorensen, William E; Schroeder, Will; Kikinis, Ron

    2016-10-01

    The National Alliance for Medical Image Computing (NA-MIC) was launched in 2004 with the goal of investigating and developing an open source software infrastructure for the extraction of information and knowledge from medical images using computational methods. Several leading research and engineering groups participated in this effort, which was funded by the US National Institutes of Health through a variety of infrastructure grants. This effort transformed 3D Slicer from an internal, Boston-based, academic research software application into a professionally maintained, robust, open source platform with international leadership and developer and user communities. Critical improvements to the widely used underlying open source libraries and tools (VTK, ITK, CMake, CDash, DCMTK) were an additional consequence of this effort. This project has contributed to close to a thousand peer-reviewed publications and a growing portfolio of US and internationally funded efforts that expand the use of these tools in new medical computing applications every year. In this editorial, we discuss what we believe are gaps in the way medical image computing is pursued today; how a well-executed research platform can enable discovery, innovation and reproducible science ("Open Science"); and how our quest to build such a software platform has evolved into a productive and rewarding social engineering exercise in building an open-access community with a shared vision. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Theory-Guided Technology in Computer Science.

    ERIC Educational Resources Information Center

    Ben-Ari, Mordechai

    2001-01-01

    Examines the history of major achievements in computer science as portrayed by winners of the prestigious Turing award and identifies a possibly unique activity called Theory-Guided Technology (TGT). Researchers develop TGT by using theoretical results to create practical technology. Discusses reasons why TGT is practical in computer science and…

  11. Computer-Based Education (CBE): Tomorrow's Traditional System.

    ERIC Educational Resources Information Center

    Rizza, Peter J., Jr.

    1981-01-01

    Examines the role of computer technology in education; discusses reasons for the slow evolution of Computer-Based Education (CBE); explores educational areas in which CBE can be used; presents barriers to widespread use of CBE; and describes the responsibilities of education, government, and business in supporting technology-oriented education.…

  12. Learner Assessment Methods Using a Computer Based Interactive Videodisc System.

    ERIC Educational Resources Information Center

    Ehrlich, Lisa R.

    This paper focuses on item design considerations faced by instructional designers and evaluators when using computer videodisc delivery systems as a means of assessing learner comprehension and competencies. Media characteristics of various interactive computer/videodisc training systems are briefly discussed as well as reasons for using such…

  13. Participation, Interaction and Social Presence: An Exploratory Study of Collaboration in Online Peer Review Groups

    ERIC Educational Resources Information Center

    Zhao, Huahui; Sullivan, Kirk P. H.; Mellenius, Ingmarie

    2014-01-01

    A key reason for using asynchronous computer conferencing in instruction is its potential for supporting collaborative learning. However, few studies have examined collaboration in computer conferencing. This study examined collaboration in six peer review groups within an asynchronous computer conferencing. Eighteen tertiary students participated…

  14. 14 CFR 389.14 - Locating and copying records and documents.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Records Service (NARS) of the General Services Administration or by computer service bureaus. (1) The..., will furnish the tapes for a reasonable length of time to a computer service bureau chosen by the applicant subject to the Director's approval. The computer service bureau shall assume the liability for the...

  15. 14 CFR 389.14 - Locating and copying records and documents.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Records Service (NARS) of the General Services Administration or by computer service bureaus. (1) The..., will furnish the tapes for a reasonable length of time to a computer service bureau chosen by the applicant subject to the Director's approval. The computer service bureau shall assume the liability for the...

  16. 14 CFR 389.14 - Locating and copying records and documents.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Records Service (NARS) of the General Services Administration or by computer service bureaus. (1) The..., will furnish the tapes for a reasonable length of time to a computer service bureau chosen by the applicant subject to the Director's approval. The computer service bureau shall assume the liability for the...

  17. Integrating Human and Computer Intelligence. Technical Report No. 32.

    ERIC Educational Resources Information Center

    Pea, Roy D.

    This paper explores the thesis that advances in computer applications and artificial intelligence have important implications for the study of development and learning in psychology. Current approaches to the use of computers as devices for problem solving, reasoning, and thinking--i.e., expert systems and intelligent tutoring systems--are…

  18. 75 FR 65656 - In the Matter of: Certain Notebook Computer Products and Components Thereof; Notice of Commission...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-26

    ... States after importation of certain notebook computer products and components thereof by reason of... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-705] In the Matter of: Certain Notebook Computer Products and Components Thereof; Notice of Commission Determination Not To Review an Initial...

  19. 1999 NCCS Highlights

    NASA Technical Reports Server (NTRS)

    Bennett, Jerome (Technical Monitor)

    2002-01-01

    The NASA Center for Computational Sciences (NCCS) is a high-performance scientific computing facility operated, maintained and managed by the Earth and Space Data Computing Division (ESDCD) of NASA Goddard Space Flight Center's (GSFC) Earth Sciences Directorate. The mission of the NCCS is to advance leading-edge science by providing the best people, computers, and data storage systems to NASA's Earth and space sciences programs and those of other U.S. Government agencies, universities, and private institutions. Among the many computationally demanding Earth science research efforts supported by the NCCS in Fiscal Year 1999 (FY99) are the NASA Seasonal-to-Interannual Prediction Project, the NASA Search and Rescue Mission, Earth gravitational model development efforts, the National Weather Service's North American Observing System program, Data Assimilation Office studies, a NASA-sponsored project at the Center for Ocean-Land-Atmosphere Studies, a NASA-sponsored microgravity project conducted by researchers at the City University of New York and the University of Pennsylvania, the completion of a satellite-derived global climate data set, simulations of a new geodynamo model, and studies of Earth's torque. This document presents highlights of these research efforts and an overview of the NCCS, its facilities, and its people.

  20. Intuitive and deliberate judgments are based on common principles.

    PubMed

    Kruglanski, Arie W; Gigerenzer, Gerd

    2011-01-01

    A popular distinction in cognitive and social psychology has been between intuitive and deliberate judgments. This juxtaposition has aligned, in dual-process theories of reasoning, associative, unconscious, effortless, heuristic, and suboptimal processes (assumed to foster intuitive judgments) versus rule-based, conscious, effortful, analytic, and rational processes (assumed to characterize deliberate judgments). In contrast, we provide convergent arguments and evidence for a unified theoretical approach to both intuitive and deliberative judgments. Both are rule-based, and in fact, the very same rules can underlie both intuitive and deliberate judgments. The important open question is that of rule selection, and we propose a 2-step process in which the task itself and the individual's memory constrain the set of applicable rules, whereas the individual's processing potential and the (perceived) ecological rationality of the rule for the task guide the final selection from that set. Deliberate judgments are not generally more accurate than intuitive judgments; in both cases, accuracy depends on the match between rule and environment: the rules' ecological rationality. Heuristics that are less effortful and that ignore part of the information can be more accurate than cognitive strategies that use more information and computation. The proposed framework adumbrates a unified approach that specifies the critical dimensions on which judgmental situations may vary and the environmental conditions under which rules can be expected to be successful.
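    The claim that heuristics which ignore part of the information can outperform more effortful strategies is commonly illustrated with Gigerenzer's take-the-best heuristic. A minimal sketch (the binary cue encoding and tie handling here are simplifying assumptions): compare two options cue by cue in order of validity, and decide on the first cue that discriminates, ignoring all remaining cues.

```python
def take_the_best(cues_a, cues_b, cue_order):
    """Take-the-best heuristic for a paired comparison.

    cues_a, cues_b: dicts mapping cue name -> 1 (positive), 0
    (negative), or missing/None (unknown).
    cue_order: cue names sorted by (assumed) validity, best first.
    Returns 'a', 'b', or None if no cue discriminates (guess).
    """
    for cue in cue_order:
        va, vb = cues_a.get(cue), cues_b.get(cue)
        if va is None or vb is None or va == vb:
            continue  # this cue does not discriminate; try the next
        return "a" if va > vb else "b"
    return None
```

The heuristic is "frugal" because it stops at the first discriminating cue; all lower-validity cues are never even looked up.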

  1. Development of a New Low-Cost Indoor Mapping System - System Design, System Calibration and First Results

    NASA Astrophysics Data System (ADS)

    Kersten, T. P.; Stallmann, D.; Tschirschwitz, F.

    2016-06-01

    For mapping of building interiors, various 2D and 3D indoor surveying systems are available today. These systems essentially differ from each other in price and accuracy, as well as in the effort required for fieldwork and post-processing. The Laboratory for Photogrammetry & Laser Scanning of HafenCity University (HCU) Hamburg has developed, as part of an industrial project, a low-cost indoor mapping system, which enables systematic inventory mapping of interior facilities with low staffing requirements and reduced, measurable expenditure of time and effort. The modelling and evaluation of the recorded data take place later in the office. The indoor mapping system of HCU Hamburg consists of the following components: laser range finder, panorama head (pan-tilt unit), single-board computer (Raspberry Pi) with digital camera, and battery power supply. The camera is pre-calibrated in a photogrammetric test field under laboratory conditions. However, remaining systematic image errors are corrected simultaneously within the generation of the panorama image. For cost reasons the camera and laser range finder are not coaxially arranged on the panorama head. Therefore, the eccentricity and alignment of the laser range finder against the camera must be determined in a system calibration. For verification of the system accuracy and the system calibration, the laser points were determined from measurements with total stations. The differences from the reference were 4-5 mm for individual coordinates.
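    The eccentric mounting described above implies a simple forward model: a laser point in the instrument frame is the calibrated lever-arm offset plus the measured range along the viewing axis, rotated by the tilt and pan angles. A hypothetical sketch follows; the offset values and axis conventions are illustrative assumptions, not HCU's calibrated parameters.

```python
import math

def laser_point(pan_deg, tilt_deg, range_m, offset=(0.05, 0.0, 0.02)):
    """Forward model for an eccentrically mounted range finder.

    Assumed conventions (illustrative): x is the viewing axis at
    zero pan/tilt, tilt rotates about y, pan rotates about z, and
    `offset` is the range finder's lever arm in the device frame.
    """
    pan, tilt = math.radians(pan_deg), math.radians(tilt_deg)
    # Point in the device frame: lever arm plus range along x.
    px = offset[0] + range_m
    py = offset[1]
    pz = offset[2]
    # Apply tilt (rotation about the y axis).
    x1 = math.cos(tilt) * px + math.sin(tilt) * pz
    z1 = -math.sin(tilt) * px + math.cos(tilt) * pz
    # Apply pan (rotation about the z axis).
    x2 = math.cos(pan) * x1 - math.sin(pan) * py
    y2 = math.sin(pan) * x1 + math.cos(pan) * py
    return (x2, y2, z1)
```

The system calibration described in the abstract amounts to estimating `offset` (and the angular alignment) by comparing points from this model against total-station reference measurements.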

  2. 48 CFR 42.801 - Notice of intent to disallow costs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... every reasonable effort to reach a satisfactory settlement through discussions with the contractor. (b... written decision, except when elements of indirect cost are involved, in which case the contracting...

  3. DoD Information Technology Acquisition: Delivering Information Technology Capabilities Expeditiously

    DTIC Science & Technology

    2013-09-01

    but they possess limited IT acquisition experience (Fryer-Biggs, 2012). According to Frank Kendall (2012), USD(AT&L), acquisition personnel need...we have a very long recovery time to correct it. (Fryer-Biggs, 2012) The DoD has the means to develop its own trained professionals but even that...John P. Kotter (1996) postulated what he considered to be the primary reasons why change efforts fail. Kotter offered eight reasons why this failure

  4. Teaching Bitter Lessons: China’s Use of Force in Territorial Disputes 1962-1988

    DTIC Science & Technology

    2013-06-01

    and risk mitigation efforts, followed by an overall analysis of Chinese behavior during the war. The Military Engagement...behave reasonably. In fact, China’s leader’s reasoning on strategic issues can be described as thoughtful, deliberative, and risk aware. Mao said...Chinese behavior during the Sino-Indo war of 1962 suggests the following: 1) China appeared to be risk averse in that it did not desire a conflict

  5. Should Computing Be Taught in Single-Sex Environments? An Analysis of the Computing Learning Environment of Upper Secondary Students

    ERIC Educational Resources Information Center

    Logan, Keri

    2007-01-01

    It has been well established in the literature that girls are turning their backs on computing courses at all levels of the education system. One reason given for this is that the computer learning environment is not conducive to girls, and it is often suggested that they would benefit from learning computing in a single-sex environment. The…

  6. Using Relational Reasoning Strategies to Help Improve Clinical Reasoning Practice.

    PubMed

    Dumas, Denis; Torre, Dario M; Durning, Steven J

    2018-05-01

    Clinical reasoning (the steps up to and including establishing a diagnosis and/or therapy) is a fundamentally important mental process for physicians. Unfortunately, mounting evidence suggests that errors in clinical reasoning lead to substantial problems for medical professionals and patients alike, including suboptimal care, malpractice claims, and rising health care costs. For this reason, cognitive strategies by which clinical reasoning may be improved, and that many expert clinicians are already using, are highly relevant for all medical professionals, educators, and learners. In this Perspective, the authors introduce one group of cognitive strategies, termed relational reasoning strategies, that have been empirically shown, through limited educational and psychological research, to improve the accuracy of learners' reasoning both within and outside of the medical disciplines. The authors contend that relational reasoning strategies may help clinicians to be metacognitive about their own clinical reasoning; such strategies may also be particularly well suited for explicitly organizing clinical reasoning instruction for learners. Because the particular curricular efforts that may improve the relational reasoning of medical students are not known at this point, the authors describe the nature of previous research on relational reasoning strategies to encourage the future design, implementation, and evaluation of instructional interventions for relational reasoning within the medical education literature. The authors also call for continued research on using relational reasoning strategies and their role in clinical practice and medical education, with the long-term goal of improving diagnostic accuracy.

  7. 16 CFR 1115.14 - Time computations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION CONSUMER PRODUCT SAFETY ACT REGULATIONS SUBSTANTIAL... spend a reasonable time for investigation and evaluation. (See § 1115.14(d).) (d) Time for investigation and evaluation. A subject firm may conduct a reasonably expeditious investigation in order to evaluate...

  8. Educational strategies for improving clinical reasoning.

    PubMed

    Cutrer, William B; Sullivan, William M; Fleming, Amy E

    2013-10-01

    Clinical reasoning serves as a crucial skill for all physicians regardless of their area of expertise. Helping trainees develop effective and appropriate clinical reasoning abilities is a central aim of medical education. Teaching clinical reasoning, however, can be a very difficult challenge for practicing physicians. Better understanding of the different cognitive processes involved in physician clinical reasoning provides a foundation from which to guide learner development of effective reasoning skills, while pairing assessment of learner reasoning abilities with understanding of different improvement strategies offers the opportunity to maximize educational efforts for learners. Clinical reasoning errors often occur as a result of one of four problems in trainees as well as practicing physicians: inadequate knowledge, faulty data gathering, faulty data processing, or faulty metacognition. Educators are encouraged to consider at which point a given learner's reasoning is breaking down. Experimentation with different strategies for improving clinical reasoning can help address learner struggles in each of these domains. In this chapter, various strategies for improving reasoning related to knowledge acquisition, data gathering, data processing, and clinician metacognition will be discussed. Understanding and gaining experience using the different educational strategies will provide practicing physicians with a toolbox of techniques for helping learners improve their reasoning abilities. © 2013 Mosby, Inc. All rights reserved.

  9. Computerizing the Accounting Curriculum.

    ERIC Educational Resources Information Center

    Nash, John F.; England, Thomas G.

    1986-01-01

    Discusses the use of computers in college accounting courses. Argues that the success of new efforts in using computers in teaching accounting is dependent upon increasing instructors' computer skills, and choosing appropriate hardware and software, including commercially available business software packages. (TW)

  10. Computers in Schools: White Boys Only?

    ERIC Educational Resources Information Center

    Hammett, Roberta F.

    1997-01-01

    Discusses the role of computers in today's world and the construction of computer use attitudes, such as gender gaps. Suggests how schools might close the gaps. Includes a brief explanation about how facility with computers is important for women in their efforts to gain equitable treatment in all aspects of their lives. (PA)

  11. Preparing Future Secondary Computer Science Educators

    ERIC Educational Resources Information Center

    Ajwa, Iyad

    2007-01-01

    Although nearly every college offers a major in computer science, many computer science teachers at the secondary level have received little formal training. This paper presents details of a project that could make a significant contribution to national efforts to improve computer science education by combining teacher education and professional…

  12. Understanding and preventing military suicide.

    PubMed

    Bryan, Craig J; Jennings, Keith W; Jobes, David A; Bradley, John C

    2012-01-01

    The continual rise in the U.S. military's suicide rate since 2004 is one of the most vexing issues currently facing military leaders, mental health professionals, and suicide experts. Despite considerable efforts to address this problem, however, suicide rates have not decreased. The authors consider possible reasons for this frustrating reality, and question common assumptions and approaches to military suicide prevention. They further argue that suicide prevention efforts that more explicitly embrace the military culture and implement evidence-based strategies across the full spectrum of prevention and treatment could improve success. Several recommendations for augmenting current efforts to prevent military suicide are proposed.

  13. The theory of reasoned action as parallel constraint satisfaction: towards a dynamic computational model of health behavior.

    PubMed

    Orr, Mark G; Thrush, Roxanne; Plaut, David C

    2013-01-01

    The reasoned action approach, although ubiquitous in health behavior theory (e.g., Theory of Reasoned Action/Planned Behavior), does not adequately address two key dynamical aspects of health behavior: learning and the effect of immediate social context (i.e., social influence). To remedy this, we put forth a computational implementation of the Theory of Reasoned Action (TRA) using artificial-neural networks. Our model re-conceptualized behavioral intention as arising from a dynamic constraint satisfaction mechanism among a set of beliefs. In two simulations, we show that constraint satisfaction can simultaneously incorporate the effects of past experience (via learning) with the effects of immediate social context to yield behavioral intention, i.e., intention is dynamically constructed from both an individual's pre-existing belief structure and the beliefs of others in the individual's social context. In a third simulation, we illustrate the predictive ability of the model with respect to empirically derived behavioral intention. As the first known computational model of health behavior, it represents a significant advance in theory towards understanding the dynamics of health behavior. Furthermore, our approach may inform the development of population-level agent-based models of health behavior that aim to incorporate psychological theory into models of population dynamics.
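    The constraint-satisfaction mechanism behind this model can be sketched generically: unit activations relax toward consistency with their weighted connections plus external input (here standing in for the beliefs of others in the immediate social context). This is a minimal illustration of the idea, not the authors' trained artificial-neural-network implementation; the update rule and parameters are assumptions.

```python
import numpy as np

def settle(weights, external, steps=200, tau=0.1):
    """Relax a parallel constraint satisfaction network.

    weights: symmetric (n, n) matrix of constraints among belief
    units (positive = mutually supporting, negative = conflicting).
    external: length-n vector of external input to each unit.
    Each step nudges activations toward the squashed net input,
    so the network settles into a mutually consistent state whose
    activations can be read off as, e.g., behavioral intention.
    """
    a = np.zeros(len(external))
    for _ in range(steps):
        net = weights @ a + external
        a += tau * (np.tanh(net) - a)  # relax toward squashed net input
    return a
```

With no internal constraints (a zero weight matrix), each unit simply settles to the squashed value of its external input; nonzero weights let supporting and conflicting beliefs pull one another's activations up or down, which is the "dynamic construction" of intention the abstract describes.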

  14. The Theory of Reasoned Action as Parallel Constraint Satisfaction: Towards a Dynamic Computational Model of Health Behavior

    PubMed Central

    Orr, Mark G.; Thrush, Roxanne; Plaut, David C.

    2013-01-01

    The reasoned action approach, although ubiquitous in health behavior theory (e.g., Theory of Reasoned Action/Planned Behavior), does not adequately address two key dynamical aspects of health behavior: learning and the effect of immediate social context (i.e., social influence). To remedy this, we put forth a computational implementation of the Theory of Reasoned Action (TRA) using artificial-neural networks. Our model re-conceptualized behavioral intention as arising from a dynamic constraint satisfaction mechanism among a set of beliefs. In two simulations, we show that constraint satisfaction can simultaneously incorporate the effects of past experience (via learning) with the effects of immediate social context to yield behavioral intention, i.e., intention is dynamically constructed from both an individual’s pre-existing belief structure and the beliefs of others in the individual’s social context. In a third simulation, we illustrate the predictive ability of the model with respect to empirically derived behavioral intention. As the first known computational model of health behavior, it represents a significant advance in theory towards understanding the dynamics of health behavior. Furthermore, our approach may inform the development of population-level agent-based models of health behavior that aim to incorporate psychological theory into models of population dynamics. PMID:23671603

  15. 41 CFR 105-8.171 - Complaints against an occupant agency.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... violation found. (b) GSA shall make reasonable efforts to follow the time frames for complaint resolution that go into effect under the notifying occupant agency's compliance procedures when it receives a...

  16. Possible Computer Vision Systems and Automated or Computer-Aided Edging and Trimming

    Treesearch

    Philip A. Araman

    1990-01-01

    This paper discusses research which is underway to help our industry reduce costs, increase product volume and value recovery, and market more accurately graded and described products. The research is part of a team effort to help the hardwood sawmill industry automate with computer vision systems, and computer-aided or computer controlled processing. This paper...

  17. Biomechanics of Head, Neck, and Chest Injury Prevention for Soldiers: Phase 2 and 3

    DTIC Science & Technology

    2016-08-01

    understanding of the biomechanics of the head and brain. Task 2.3 details the computational modeling efforts conducted to evaluate the response of the...section also details the progress made on the development of a testing apparatus to evaluate cervical spine implants in survivable loading scenarios...computational modeling efforts conducted to evaluate the response of the cervical spine and the effects of cervical arthrodesis and arthroplasty during

  18. Limits on fundamental limits to computation.

    PubMed

    Markov, Igor L

    2014-08-14

    An indispensable part of our personal and working lives, computing has also become essential to industries and governments. Steady improvements in computer hardware have been supported by periodic doubling of transistor densities in integrated circuits over the past fifty years. Such Moore scaling now requires ever-increasing efforts, stimulating research in alternative hardware and stirring controversy. To help evaluate emerging technologies and increase our understanding of integrated-circuit scaling, here I review fundamental limits to computation in the areas of manufacturing, energy, physical space, design and verification effort, and algorithms. To outline what is achievable in principle and in practice, I recapitulate how some limits were circumvented, and compare loose and tight limits. Engineering difficulties encountered by emerging technologies may indicate yet unknown limits.
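
    Among the energy limits such a review covers, Landauer's bound is easy to work out concretely: erasing one bit dissipates at least k_B·T·ln 2. A quick back-of-envelope computation (temperature and bit rate chosen for illustration):

    ```python
    import math

    K_B = 1.380649e-23   # Boltzmann constant, J/K (exact in SI since 2019)
    T = 300.0            # room temperature, K

    # Landauer's principle: erasing one bit costs at least k_B * T * ln(2).
    e_bit = K_B * T * math.log(2)
    print(f"{e_bit:.3e} J per bit")   # ~2.87e-21 J

    # Irreversibly flipping 1e15 bits per second at the limit:
    print(f"{e_bit * 1e15:.3e} J/s")  # ~2.87e-6 W, far below real chips
    ```

    The gap of many orders of magnitude between this bound and the dissipation of actual circuits is one reason the review distinguishes loose limits from tight ones.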

  19. Computational electronics and electromagnetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shang, C C

    The Computational Electronics and Electromagnetics thrust area serves as the focal point for Engineering R and D activities for developing computer-based design and analysis tools. Representative applications include design of particle accelerator cells and beamline components; design of transmission line components; engineering analysis and design of high-power (optical and microwave) components; photonics and optoelectronics circuit design; electromagnetic susceptibility analysis; and antenna synthesis. The FY-97 effort focuses on development and validation of (1) accelerator design codes; (2) 3-D massively parallel, time-dependent EM codes; (3) material models; (4) coupling and application of engineering tools for analysis and design of high-power components; and (5) development of beam control algorithms coupled to beam transport physics codes. These efforts are in association with technology development in the power conversion, nondestructive evaluation, and microtechnology areas. The efforts complement technology development in Lawrence Livermore National Laboratory programs.

  20. Office workers' computer use patterns are associated with workplace stressors.

    PubMed

    Eijckelhof, Belinda H W; Huysmans, Maaike A; Blatter, Birgitte M; Leider, Priscilla C; Johnson, Peter W; van Dieën, Jaap H; Dennerlein, Jack T; van der Beek, Allard J

    2014-11-01

    This field study examined associations between workplace stressors and office workers' computer use patterns. We collected keyboard and mouse activities of 93 office workers (68F, 25M) for approximately two work weeks. Linear regression analyses examined the associations between self-reported effort, reward, overcommitment, and perceived stress and software-recorded computer use duration, number of short and long computer breaks, and pace of input device usage. Daily duration of computer use was, on average, 30 min longer for workers with high compared to low levels of overcommitment and perceived stress. The number of short computer breaks (30 s-5 min long) was approximately 20% lower for those with high compared to low effort and for those with low compared to high reward. These outcomes support the hypothesis that office workers' computer use patterns vary across individuals with different levels of workplace stressors. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  1. Neurocomputational mechanisms underlying subjective valuation of effort costs

    PubMed Central

    Giehl, Kathrin; Sillence, Annie

    2017-01-01

    In everyday life, we have to decide whether it is worth exerting effort to obtain rewards. Effort can be experienced in different domains, with some tasks requiring significant cognitive demand and others being more physically effortful. The motivation to exert effort for reward is highly subjective and varies considerably across the different domains of behaviour. However, very little is known about the computational or neural basis of how different effort costs are subjectively weighed against rewards. Is there a common, domain-general system of brain areas that evaluates all costs and benefits? Here, we used computational modelling and functional magnetic resonance imaging (fMRI) to examine the mechanisms underlying value processing in both the cognitive and physical domains. Participants were trained on two novel tasks that parametrically varied either cognitive or physical effort. During fMRI, participants indicated their preferences between a fixed low-effort/low-reward option and a variable higher-effort/higher-reward offer for each effort domain. Critically, reward devaluation by both cognitive and physical effort was subserved by a common network of areas, including the dorsomedial and dorsolateral prefrontal cortex, the intraparietal sulcus, and the anterior insula. Activity within these domain-general areas also covaried negatively with reward and positively with effort, suggesting an integration of these parameters within these areas. Additionally, the amygdala appeared to play a unique, domain-specific role in processing the value of rewards associated with cognitive effort. These results are the first to reveal the neurocomputational mechanisms underlying subjective cost–benefit valuation across different domains of effort and provide insight into the multidimensional nature of motivation. PMID:28234892
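
    A common modelling choice in effort-based decision studies of this kind is to discount reward by a (often parabolic) function of effort and pass the resulting subjective values through a softmax choice rule. The sketch below uses that generic form with invented parameters; it is not the specific model fitted in this paper:

    ```python
    import math

    def subjective_value(reward, effort, k):
        """Parabolic effort discounting: SV = R - k * E^2 (one common form)."""
        return reward - k * effort ** 2

    def p_accept(sv_offer, sv_baseline, beta=1.0):
        """Softmax probability of choosing the higher-effort offer."""
        return 1.0 / (1.0 + math.exp(-beta * (sv_offer - sv_baseline)))

    # A participant with discount rate k = 0.5 weighing a higher-effort offer
    # (reward 10, effort 4) against a fixed low-effort baseline (reward 2, effort 1).
    sv_hi = subjective_value(10, 4, k=0.5)   # 10 - 0.5 * 16 = 2.0
    sv_lo = subjective_value(2, 1, k=0.5)    # 2 - 0.5 = 1.5
    print(p_accept(sv_hi, sv_lo))            # > 0.5: the offer is still worth it
    ```

    Fitting k separately for cognitive and physical tasks is how such studies quantify whether effort costs are weighed differently across domains.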

  2. Parallel computing for automated model calibration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burke, John S.; Danielson, Gary R.; Schulz, Douglas A.

    2002-07-29

    Natural resources model calibration is a significant burden on computing and staff resources in modeling efforts. Most assessments must consider multiple calibration objectives (for example, magnitude and timing of stream flow peak). An automated calibration process that allows real-time updating of data and models, freeing scientists to focus their effort on improving models, is needed. We are in the process of building a fully featured multi-objective calibration tool capable of processing multiple models cheaply and efficiently using null cycle computing. Our parallel processing and calibration software routines have been written generically, but our focus has been on natural resources model calibration. So far, the natural resources models have been friendly to parallel calibration efforts in that they require no inter-process communication, need only a small amount of input data, and output only a small amount of statistical information for each calibration run. A typical auto-calibration run might involve running a model 10,000 times with a variety of input parameters, each run producing summary statistical output. In the past, model calibration has been done against individual models for each data set. The individual model runs are relatively fast, ranging from seconds to minutes. The process was run on a single computer using a simple iterative process. We have completed two auto-calibration prototypes and are currently designing a more feature-rich tool. Our prototypes have focused on running the calibration in a distributed, cross-platform computing environment. They allow incorporation of "smart" calibration parameter generation (using artificial intelligence processing techniques). Null cycle computing, similar to SETI@home, has also been a focus of our efforts. This paper details the design of the latest prototype and discusses our plans for the next revision of the software.
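
    The embarrassingly parallel pattern the abstract describes (many independent model runs, each returning a small misfit statistic, keep the best) can be sketched as follows. The model function and parameter ranges are stand-ins, not the actual calibration tool:

    ```python
    from multiprocessing.pool import ThreadPool
    import random

    def run_model(params):
        """Stand-in for one model run: return a misfit statistic.
        A real calibration would launch the resource model here and
        score its output against observations."""
        a, b = params
        return (a - 3.0) ** 2 + (b + 1.0) ** 2   # toy misfit surface

    def calibrate(n_runs=1000, workers=4, seed=42):
        """Evaluate many parameter sets in parallel; keep the best.
        A ThreadPool keeps the sketch portable; CPU-bound model runs
        would use a process pool or distributed (null-cycle) workers."""
        rng = random.Random(seed)
        candidates = [(rng.uniform(-10, 10), rng.uniform(-10, 10))
                      for _ in range(n_runs)]
        with ThreadPool(workers) as pool:
            misfits = pool.map(run_model, candidates)
        return min(zip(misfits, candidates))   # (misfit, params)

    misfit, params = calibrate()
    print(misfit, params)   # params approach the toy optimum (3, -1)
    ```

    Because each run needs only its parameter vector in and a statistic out, the same structure maps directly onto SETI@home-style scavenged cycles, which is what makes these models "friendly" to parallel calibration.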

  3. A Quantitative Risk Assessment Model Involving Frequency and Threat Degree under Line-of-Business Services for Infrastructure of Emerging Sensor Networks.

    PubMed

    Jing, Xu; Hu, Hanwen; Yang, Huijun; Au, Man Ho; Li, Shuqin; Xiong, Naixue; Imran, Muhammad; Vasilakos, Athanasios V

    2017-03-21

    The prospect of Line-of-Business Services (LoBSs) for infrastructure of Emerging Sensor Networks (ESNs) is exciting. Access control remains a top challenge in this scenario, as the service provider's server contains many valuable resources. LoBSs' users are very diverse, as they may come from a wide range of locations with vastly different characteristics. The cost of joining could be low, and in many cases intruders are eligible users conducting malicious actions. As a result, user access should be adjusted dynamically. Assessing LoBSs' risk dynamically based on both the frequency and threat degree of malicious operations is therefore necessary. In this paper, we propose a Quantitative Risk Assessment Model (QRAM) involving frequency and threat degree based on value at risk. To quantify the threat degree as elementary intrusion effort, we amend the influence coefficient of risk indexes in the network security situation assessment model. To quantify threat frequency as intrusion trace effort, we make use of multiple behavior information fusion. Under the influence of the intrusion trace, we adapt the historical simulation method of value at risk to dynamically assess LoBSs' risk. Simulation based on existing data is used to select appropriate parameters for QRAM. Our simulation results show that the duration influence on elementary intrusion effort is reasonable when the normalized parameter is 1000. Likewise, the time window of the intrusion trace and the weight between objective risk and subjective risk can be set to 10 s and 0.5, respectively. While our focus is to develop QRAM for dynamically assessing the risk of LoBSs for infrastructure of ESNs involving frequency and threat degree, we believe it is also appropriate for other scenarios in cloud computing.
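
    The historical simulation method of value at risk that QRAM adapts is, in its generic form, just an empirical quantile of observed losses. A minimal sketch (the loss figures are invented; the paper's fusion of frequency and threat degree would supply them in practice):

    ```python
    import math

    def historical_var(losses, confidence=0.95):
        """Value at risk by historical simulation: the empirical
        quantile of observed losses at the given confidence level."""
        ordered = sorted(losses)
        idx = max(math.ceil(confidence * len(ordered)) - 1, 0)
        return ordered[idx]

    # Hypothetical per-window risk "losses" derived from intrusion scores.
    window_losses = [0.2, 0.1, 0.4, 0.9, 0.3, 0.2, 0.5, 1.2, 0.3, 0.1,
                     0.2, 0.6, 0.4, 0.2, 0.8, 0.3, 0.2, 0.4, 0.7, 0.2]
    print(historical_var(window_losses, 0.95))   # 0.9
    ```

    Re-estimating this quantile over a sliding time window (the paper settles on 10 s) is what makes the resulting risk assessment dynamic.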

  4. A Quantitative Risk Assessment Model Involving Frequency and Threat Degree under Line-of-Business Services for Infrastructure of Emerging Sensor Networks

    PubMed Central

    Jing, Xu; Hu, Hanwen; Yang, Huijun; Au, Man Ho; Li, Shuqin; Xiong, Naixue; Imran, Muhammad; Vasilakos, Athanasios V.

    2017-01-01

    The prospect of Line-of-Business Services (LoBSs) for infrastructure of Emerging Sensor Networks (ESNs) is exciting. Access control remains a top challenge in this scenario, as the service provider’s server contains many valuable resources. LoBSs’ users are very diverse, as they may come from a wide range of locations with vastly different characteristics. The cost of joining could be low, and in many cases intruders are eligible users conducting malicious actions. As a result, user access should be adjusted dynamically. Assessing LoBSs’ risk dynamically based on both the frequency and threat degree of malicious operations is therefore necessary. In this paper, we propose a Quantitative Risk Assessment Model (QRAM) involving frequency and threat degree based on value at risk. To quantify the threat degree as elementary intrusion effort, we amend the influence coefficient of risk indexes in the network security situation assessment model. To quantify threat frequency as intrusion trace effort, we make use of multiple behavior information fusion. Under the influence of the intrusion trace, we adapt the historical simulation method of value at risk to dynamically assess LoBSs’ risk. Simulation based on existing data is used to select appropriate parameters for QRAM. Our simulation results show that the duration influence on elementary intrusion effort is reasonable when the normalized parameter is 1000. Likewise, the time window of the intrusion trace and the weight between objective risk and subjective risk can be set to 10 s and 0.5, respectively. While our focus is to develop QRAM for dynamically assessing the risk of LoBSs for infrastructure of ESNs involving frequency and threat degree, we believe it is also appropriate for other scenarios in cloud computing. PMID:28335569

  5. Hindcasting of Storm Surges, Currents, and Waves at Lower Delaware Bay during Hurricane Isabel

    NASA Astrophysics Data System (ADS)

    Salehi, M.

    2017-12-01

    Hurricanes are a major threat to coastal communities and infrastructure, including nuclear power plants located in low-lying coastal zones. In response, their sensitive elements should be protected by smart design to withstand the drastic impact of such natural phenomena. Accurate and reliable estimates of hurricane attributes are the first step in that effort. Numerical models have advanced considerably over the past few years and are effective tools for modeling large-scale natural events such as hurricanes. The impact of low-probability hurricanes on the lower Delaware Bay is investigated using dynamically coupled meteorological, hydrodynamic, and wave components of the Delft3D software. Efforts are made to significantly reduce the computational burden of performing such analysis for the industry while keeping the same level of accuracy at the area of study (AOS). The model comprises overall and nested domains. The overall model domain includes portions of the Atlantic Ocean and the Delaware and Chesapeake bays. The nested model domain includes Delaware Bay, its floodplain, and a portion of the continental shelf. This study is part of a larger modeling effort to study the impact of low-probability hurricanes on sensitive infrastructure located in coastal zones prone to hurricane activity. The AOS is located on the east bank of Delaware Bay, about 16 miles upstream of its mouth. Model-generated wind speed, significant wave height, water surface elevation, and current are calibrated for Hurricane Isabel (2003). The model calibration results agreed reasonably well with field observations. Furthermore, the sensitivity of surge and wave responses to various hurricane parameters was tested. In line with findings from other researchers, the accuracy of the wind field played a major role in hindcasting the hurricane attributes.

  6. Temporal Imagery. An Approach to Reasoning about Time for Planning and Problem Solving.

    DTIC Science & Technology

    1985-10-01

    about protections... 3.6 Hypothesis generation and abductive inference... 3.7 Facilities for automatic projection and...events, and simultaneous actions. If you're not careful, you can waste a considerable amount of effort just determining whether or not two points are or...the planner may construct some plan, it may also ignore opportunities for merging tasks and consolidating effort. My main objection, however, is

  7. Towards a computational- and algorithmic-level account of concept blending using analogies and amalgams

    NASA Astrophysics Data System (ADS)

    Besold, Tarek R.; Kühnberger, Kai-Uwe; Plaza, Enric

    2017-10-01

    Concept blending - a cognitive process which allows for the combination of certain elements (and their relations) from originally distinct conceptual spaces into a new unified space combining these previously separate elements, and enables reasoning and inference over the combination - is taken as a key element of creative thought and combinatorial creativity. In this article, we summarise our work towards the development of a computational-level and algorithmic-level account of concept blending, combining approaches from computational analogy-making and case-based reasoning (CBR). We present the theoretical background, as well as an algorithmic proposal integrating higher-order anti-unification matching and generalisation from analogy with amalgams from CBR. The feasibility of the approach is then exemplified in two case studies.
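
    The generalisation step that feeds such a blending pipeline can be illustrated with plain first-order anti-unification (the paper uses a restricted higher-order variant; the terms below are invented toy conceptual descriptions):

    ```python
    def anti_unify(t1, t2, subst=None):
        """Least general generalisation of two first-order terms.
        Terms are nested tuples ('f', arg1, ...) or atoms; mismatching
        subterm pairs are replaced by shared variables '?0', '?1', ..."""
        if subst is None:
            subst = {}
        if t1 == t2:
            return t1
        if (isinstance(t1, tuple) and isinstance(t2, tuple)
                and t1[0] == t2[0] and len(t1) == len(t2)):
            return (t1[0],) + tuple(anti_unify(a, b, subst)
                                    for a, b in zip(t1[1:], t2[1:]))
        key = (t1, t2)
        if key not in subst:          # same mismatch -> same variable
            subst[key] = f"?{len(subst)}"
        return subst[key]

    # Generalising two conceptual descriptions before blending them:
    boat = ("carries", ("vehicle", "boat"), ("medium", "water"))
    car  = ("carries", ("vehicle", "car"),  ("medium", "road"))
    print(anti_unify(boat, car))
    # ('carries', ('vehicle', '?0'), ('medium', '?1'))
    ```

    In the amalgam-based account, such a shared generalisation anchors the blend: the two input spaces are then recombined by selectively re-instantiating the variables from either side.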

  8. Scientific Reasoning across Different Domains.

    ERIC Educational Resources Information Center

    Glaser, Robert; And Others

    This study seeks to establish which scientific reasoning skills are primarily domain-general and which appear to be domain-specific. The subjects, 12 university undergraduates, each participated in self-directed experimentation with three different content domains. The experimentation contexts were computer-based laboratories in d.c. circuits…

  9. Combining qualitative and quantitative spatial and temporal information in a hierarchical structure: Approximate reasoning for plan execution monitoring

    NASA Technical Reports Server (NTRS)

    Hoebel, Louis J.

    1993-01-01

    The problem of plan generation (PG) and the problem of plan execution monitoring (PEM), including updating, queries, and resource-bounded replanning, have different reasoning and representation requirements. PEM requires the integration of qualitative and quantitative information. PEM is the receiving of data about the world in which a plan or agent is executing. The problem is to quickly determine the relevance of the data, the consistency of the data with respect to the expected effects, and whether execution should continue. Only spatial and temporal aspects of the plan are addressed for relevance in this work. Current temporal reasoning systems are deficient in computational aspects or expressiveness. This work presents a hybrid qualitative and quantitative system that is fully expressive in its assertion language while offering certain computational efficiencies. In order to proceed, methods incorporating approximate reasoning using hierarchies, notions of locality, constraint expansion, and absolute parameters need to be used, and these are shown to be useful for the anytime nature of PEM.

  10. Information Security: Governmentwide Guidance Needed to Assist Agencies in Implementing Cloud Computing

    DTIC Science & Technology

    2010-07-01

    Cloud computing, an emerging form of computing in which users have access to scalable, on-demand capabilities that are provided through Internet... cloud computing, (2) the information security implications of using cloud computing services in the Federal Government, and (3) federal guidance and... efforts to address information security when using cloud computing. The complete report is titled Information Security: Federal Guidance Needed to

  11. Part-whole reasoning in medical ontologies revisited--introducing SEP triplets into classification-based description logics.

    PubMed

    Schulz, S; Romacker, M; Hahn, U

    1998-01-01

    The development of powerful and comprehensive medical ontologies that support formal reasoning on a large scale is one of the key requirements for clinical computing in the next millennium. Taxonomic medical knowledge, a major portion of these ontologies, is mainly characterized by generalization and part-whole relations between concepts. While reasoning in generalization hierarchies is quite well understood, no fully conclusive mechanism as yet exists for part-whole reasoning. The approach we take emulates part-whole reasoning via classification-based reasoning using SEP triplets, a special data structure for encoding part-whole relations that is fully embedded in the formal framework of standard description logics.
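
    The SEP-triplet encoding can be mimicked outside a description logic with a toy is-a graph (node names and the anatomy example are illustrative, not the paper's actual ontology):

    ```python
    class Taxonomy:
        """Minimal is-a hierarchy with SEP-style part-of emulation.
        For each concept C we create three nodes: C_S (structure),
        C_E (entity) and C_P (part), with C_E and C_P is-a C_S.
        'Y part-of X' is then encoded as Y_S is-a X_P, so part-whole
        transitivity falls out of ordinary subsumption reasoning."""

        def __init__(self):
            self.parents = {}                 # node -> set of direct parents

        def add_isa(self, child, parent):
            self.parents.setdefault(child, set()).add(parent)

        def add_concept(self, c):
            self.add_isa(f"{c}_E", f"{c}_S")
            self.add_isa(f"{c}_P", f"{c}_S")

        def add_part_of(self, part, whole):
            self.add_isa(f"{part}_S", f"{whole}_P")

        def subsumed_by(self, a, b):          # reflexive-transitive closure
            if a == b:
                return True
            return any(self.subsumed_by(p, b)
                       for p in self.parents.get(a, ()))

        def part_of(self, part, whole):
            return self.subsumed_by(f"{part}_S", f"{whole}_P")

    t = Taxonomy()
    for c in ("Hand", "Finger", "Fingertip"):
        t.add_concept(c)
    t.add_part_of("Finger", "Hand")
    t.add_part_of("Fingertip", "Finger")
    print(t.part_of("Fingertip", "Hand"))   # True, via chained subsumption
    ```

    The point of the triplet structure is exactly this reduction: a standard subsumption classifier, with no built-in part-whole operator, still derives transitive part-of chains.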

  12. Adversarial reasoning: challenges and approaches

    NASA Astrophysics Data System (ADS)

    Kott, Alexander; Ownby, Michael

    2005-05-01

    This paper defines adversarial reasoning as computational approaches to inferring and anticipating an enemy's perceptions, intents and actions. It argues that adversarial reasoning transcends the boundaries of game theory and must also leverage such disciplines as cognitive modeling, control theory, AI planning and others. To illustrate the challenges of applying adversarial reasoning to real-world problems, the paper explores the lessons learned in the CADET -- a battle planning system that focuses on brigade-level ground operations and involves adversarial reasoning. From this example of current capabilities, the paper proceeds to describe RAID -- a DARPA program that aims to build capabilities in adversarial reasoning, and how such capabilities would address practical requirements in Defense and other application areas.

  13. Part-whole reasoning in medical ontologies revisited--introducing SEP triplets into classification-based description logics.

    PubMed Central

    Schulz, S.; Romacker, M.; Hahn, U.

    1998-01-01

    The development of powerful and comprehensive medical ontologies that support formal reasoning on a large scale is one of the key requirements for clinical computing in the next millennium. Taxonomic medical knowledge, a major portion of these ontologies, is mainly characterized by generalization and part-whole relations between concepts. While reasoning in generalization hierarchies is quite well understood, no fully conclusive mechanism as yet exists for part-whole reasoning. The approach we take emulates part-whole reasoning via classification-based reasoning using SEP triplets, a special data structure for encoding part-whole relations that is fully embedded in the formal framework of standard description logics. PMID:9929335

  14. Razonamiento de Estudiantes Universitarios sobre Variabilidad e Intervalos de Confianza en un Contexto Inferencial Informal = University Students' Reasoning on Variability and Confidence Intervals in Inferential Informal Context

    ERIC Educational Resources Information Center

    Inzunsa Cazares, Santiago

    2016-01-01

    This article presents the results of a qualitative research with a group of 15 university students of social sciences on informal inferential reasoning developed in a computer environment on concepts involved in the confidence intervals. The results indicate that students developed a correct reasoning about sampling variability and visualized…

  15. Using Rasch Measurement to Develop a Computer Modeling-Based Instrument to Assess Students' Conceptual Understanding of Matter

    ERIC Educational Resources Information Center

    Wei, Silin; Liu, Xiufeng; Wang, Zuhao; Wang, Xingqiao

    2012-01-01

    Research suggests that difficulty in making connections among three levels of chemical representations--macroscopic, submicroscopic, and symbolic--is a primary reason for student alternative conceptions of chemistry concepts, and computer modeling is promising to help students make the connections. However, no computer modeling-based assessment…

  16. The Role of Context-Related Parameters in Adults' Mental Computational Acts

    ERIC Educational Resources Information Center

    Naresh, Nirmala; Presmeg, Norma

    2012-01-01

    Researchers who have carried out studies pertaining to mental computation and everyday mathematics point out that adults and children reason intuitively based upon experiences within specific contexts; they use invented strategies of their own to solve real-life problems. We draw upon research areas of mental computation and everyday mathematics…

  17. What Do Computer Science Students Think about Software Piracy?

    ERIC Educational Resources Information Center

    Konstantakis, Nikos I.; Palaigeorgiou, George E.; Siozos, Panos D.; Tsoukalas, Ioannis A.

    2010-01-01

    Today, software piracy is an issue of global importance. Computer science students are the future information and communication technologies professionals and it is important to study the way they approach this issue. In this article, we attempt to study attitudes, behaviours and the corresponding reasoning of computer science students in Greece…

  18. Using E-mail in a Math/Computer Core Course.

    ERIC Educational Resources Information Center

    Gurwitz, Chaya

    This paper notes the advantages of using e-mail in computer literacy classes, and discusses the results of incorporating an e-mail assignment in the "Introduction to Mathematical Reasoning and Computer Programming" core course at Brooklyn College (New York). The assignment consisted of several steps. The students first read and responded…

  19. 75 FR 4583 - In the Matter of: Certain Electronic Devices, Including Mobile Phones, Portable Music Players...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-28

    ..., Including Mobile Phones, Portable Music Players, and Computers; Notice of Investigation AGENCY: U.S... music players, and computers, by reason of infringement of certain claims of U.S. Patent Nos. 6,714,091... importation of certain electronic devices, including mobile phones, portable music players, or computers that...

  20. Forty years of collaborative computational crystallography.

    PubMed

    Agirre, Jon; Dodson, Eleanor

    2018-01-01

    A brief overview is provided of the history of collaborative computational crystallography, with an emphasis on the Collaborative Computational Project No. 4. The key steps in its development are outlined, with consideration also given to the underlying reasons which contributed, and ultimately led to, the unprecedented success of this venture. © 2017 The Protein Society.