Sample records for idea quantitative modeling

  1. Industry Software Trustworthiness Criterion Research Based on Business Trustworthiness

    NASA Astrophysics Data System (ADS)

    Zhang, Jin; Liu, Jun-fei; Jiao, Hai-xing; Shen, Yi; Liu, Shu-yuan

    To address the trustworthiness problem of industry software, an approach is proposed that constructs an industry software trustworthiness criterion around the business itself. Based on the triangle model of "trustworthy grade definition - trustworthy evidence model - trustworthy evaluation", the idea of business trustworthiness is embodied in different aspects of the trustworthy triangle model for a specific industry software system, a power producing management system (PPMS). Business trustworthiness is the center of the constructed industry trustworthy software criterion. By fusing international standards with industry rules, the constructed trustworthy criterion strengthens operability and reliability. A quantitative evaluation method makes the evaluation results intuitive and comparable.

  2. Quantitative Finance

    NASA Astrophysics Data System (ADS)

    James, Jessica

    2017-01-01

    Quantitative finance is a field that has risen to prominence over the last few decades. It encompasses the complex models and calculations that value financial contracts, particularly those which reference events in the future, and apply probabilities to these events. While adding greatly to the flexibility of the market available to corporations and investors, it has also been blamed for worsening the impact of financial crises. But what exactly does quantitative finance encompass, and where did these ideas and models originate? We show that the mathematics behind finance and behind games of chance have tracked each other closely over the centuries and that many well-known physicists and mathematicians have contributed to the field.

  3. Planner-Based Control of Advanced Life Support Systems

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Kortenkamp, David; Fry, Chuck; Bell, Scott

    2005-01-01

    The paper describes an approach to the integration of qualitative and quantitative modeling techniques for advanced life support (ALS) systems. Developing reliable control strategies that scale up to fully integrated life support systems requires augmenting quantitative models and control algorithms with the abstractions provided by qualitative, symbolic models and their associated high-level control strategies. This will allow for effective management of the combinatorics due to the integration of a large number of ALS subsystems. By focusing control actions at different levels of detail and reactivity, we can use faster, simpler responses at the lowest level and predictive but more complex responses at the higher levels of abstraction. In particular, methods from model-based planning and scheduling can provide effective resource management over long time periods. We describe a reference implementation of an advanced control system using the IDEA control architecture developed at NASA Ames Research Center. IDEA uses planning/scheduling as the sole reasoning method for predictive and reactive closed-loop control. We describe preliminary experiments in planner-based control of ALS carried out on an integrated ALS simulation developed at NASA Johnson Space Center.

  4. A Demonstration on Every Exam.

    ERIC Educational Resources Information Center

    Julian, Glenn M.

    1995-01-01

    Argues that inclusion of demonstrations on examinations increases students' ability to observe carefully the physical world around them, translate from observation in terms of models, and make quantitative estimates and physicist-type "back-of-the-envelope" calculations. Presents demonstration ideas covering the following topics:…

  5. A model of comprehensive unification

    NASA Astrophysics Data System (ADS)

    Reig, Mario; Valle, José W. F.; Vaquera-Araujo, C. A.; Wilczek, Frank

    2017-11-01

    Comprehensive - that is, gauge and family - unification using spinors has many attractive features, but it has been challenged to explain chirality. Here, by combining an orbifold construction with more traditional ideas, we address that difficulty. Our candidate model features three chiral families and leads to an acceptable result for quantitative unification of couplings. A potential target for accelerator and astronomical searches emerges.

  6. A simple approach to quantitative analysis using three-dimensional spectra based on selected Zernike moments.

    PubMed

    Zhai, Hong Lin; Zhai, Yue Yuan; Li, Pei Zhen; Tian, Yue Li

    2013-01-21

    A very simple approach to quantitative analysis is proposed based on digital image processing techniques, using three-dimensional (3D) spectra obtained by high-performance liquid chromatography coupled with a diode array detector (HPLC-DAD). As region-based shape features of a grayscale image, Zernike moments, with their inherent invariance properties, were employed to establish the linear quantitative models. This approach was applied to the quantitative analysis of three compounds in mixed samples using 3D HPLC-DAD spectra, and three linear models were obtained, respectively. The correlation coefficients (R²) for the training and test sets were greater than 0.999, and the statistical parameters and strict validation supported the reliability of the established models. The analytical results suggest that the Zernike moments selected by stepwise regression can be used in the quantitative analysis of target compounds. Our study provides a new idea for quantitative analysis using 3D spectra, which can be extended to the analysis of other 3D spectra obtained by different methods or instruments.
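As a concrete sketch of the pipeline described above (Zernike moments of a grayscale spectrum image as features, then a linear model against concentration), the following minimal Python example uses a simplified moment implementation and synthetic data; it is illustrative only, not the authors' code.

```python
import math
import numpy as np

def zernike_moment(img, n, m):
    """One Zernike moment Z_nm of a square grayscale image mapped
    onto the unit disk (simplified illustrative implementation)."""
    N = img.shape[0]
    ys, xs = np.mgrid[0:N, 0:N]
    x = (2 * xs - N + 1) / (N - 1)          # map pixel grid to [-1, 1]
    y = (2 * ys - N + 1) / (N - 1)
    r = np.hypot(x, y)
    theta = np.arctan2(y, x)
    inside = r <= 1.0
    R = np.zeros_like(r)                     # radial polynomial R_nm(r)
    for s in range((n - abs(m)) // 2 + 1):
        c = ((-1) ** s * math.factorial(n - s)
             / (math.factorial(s)
                * math.factorial((n + abs(m)) // 2 - s)
                * math.factorial((n - abs(m)) // 2 - s)))
        R += c * r ** (n - 2 * s)
    V = R * np.exp(-1j * m * theta)
    return (n + 1) / np.pi * np.sum(img[inside] * np.conj(V[inside]))

# Synthetic "spectrum images": a fixed peak shape scaled by concentration,
# so the moment magnitude is exactly linear in concentration.
ys, xs = np.mgrid[0:64, 0:64]
base = np.exp(-((ys - 32.0) ** 2 + (xs - 32.0) ** 2) / 100.0)
conc = np.array([1.0, 2.0, 3.0, 4.0])
feat = np.array([abs(zernike_moment(c * base, 2, 0)) for c in conc])

# one-feature linear quantitative model: concentration vs. moment
slope, intercept = np.polyfit(feat, conc, 1)
```

A real application would compute several moments, select among them by stepwise regression, and calibrate against known concentrations.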

  7. The Ether Wind and the Global Positioning System.

    ERIC Educational Resources Information Center

    Muller, Rainer

    2000-01-01

    Explains how students can perform a refutation of the ether theory using information from the Global Positioning System (GPS). Discusses the functioning of the GPS, qualitatively describes how position determination would be affected by an ether wind, and illustrates the pertinent ideas with a simple quantitative model. (WRM)

  8. Bridging the Gap between Theory and Practice in Educational Research: Methods at the Margins

    ERIC Educational Resources Information Center

    Winkle-Wagner, Rachelle, Ed.; Hunter, Cheryl A., Ed.; Ortloff, Debora Hinderliter, Ed.

    2009-01-01

    This book provides new ways of thinking about educational processes, using quantitative and qualitative methodologies. Concrete examples of research techniques are provided for those conducting research with marginalized populations or about marginalized ideas. This volume asserts theoretical models related to research methods and the study of…

  9. Children's ideas about the solar system and the chaos in learning science

    NASA Astrophysics Data System (ADS)

    Sharp, John G.; Kuerbis, Paul

    2006-01-01

    Findings from a quasi-experimental study of children's ideas about the solar system and how these ideas changed in response to a 10-week intervention period of formal astronomy teaching at a single primary school in England are presented in detail. Initial interviews with all of the 9- to 11-year-olds involved revealed a relatively poorly developed prior knowledge base, and this was reflected in the predominantly intuitive and transitional nature of the different mental models expressed and used when answering questions and completing tasks. Following intervention, progression was evident in many different forms and this could be described and measured both qualitatively and quantitatively. The routes and pathways toward scientific conceptualization were often direct, and most changes could be attributed largely to the processes of weak and radical knowledge restructuring. Together with the retention of newly formed ideas over time, learning outcomes were considered particularly encouraging. In order to explain findings more fully, evidence is presented which lends some support to the notion of chaos in cognition.

  10. Toward a Quantitative Theory of Intellectual Discovery (Especially in Physics).

    ERIC Educational Resources Information Center

    Fowler, Richard G.

    1987-01-01

    Studies time intervals in a list of critical ideas in physics. Infers that the rate of growth of ideas has been proportional to the totality of known ideas multiplied by the totality of people in the world. Indicates that the rate of discovery in physics has been decreasing. (CW)

  11. Using metal-ligand binding characteristics to predict metal toxicity: quantitative ion character-activity relationships (QICARs).

    PubMed Central

    Newman, M C; McCloskey, J T; Tatara, C P

    1998-01-01

    Ecological risk assessment can be enhanced with predictive models for metal toxicity. Modeling of published data was done under the simplifying assumption that intermetal trends in toxicity reflect relative metal-ligand complex stabilities. This idea has been invoked successfully since 1904 but has yet to be applied widely in quantitative ecotoxicology. Intermetal trends in toxicity were successfully modeled with ion characteristics reflecting metal binding to ligands for a wide range of effects. Most models were useful for predictive purposes based on an F-ratio criterion and cross-validation, but anomalous predictions did occur if speciation was ignored. In general, models for metals with the same valence (i.e., divalent metals) were better than those combining mono-, di-, and trivalent metals. The softness parameter (sigma p) and the absolute value of the log of the first hydrolysis constant (|log KOH|) were especially useful in model construction. Also, delta E0 contributed substantially to several of the two-variable models. In contrast, quantitative attempts to predict metal interactions in binary mixtures based on metal-ligand complex stabilities were not successful. PMID:9860900
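The kind of ion-characteristic regression the abstract describes can be sketched as follows; the ion characteristics and toxicity values below are invented for illustration, not the study's data, and the F-ratio screening step is omitted.

```python
import numpy as np

# Hypothetical ion characteristics (softness sigma_p, |log K_OH|) and
# log toxicity for five divalent metals; values are illustrative only.
X = np.array([[0.10, 11.7],
              [0.13, 10.6],
              [0.18,  9.0],
              [0.25,  7.7],
              [0.30,  6.3]])
y = np.array([-1.2, -0.8, -0.1, 0.6, 1.1])   # illustrative log(1/LC50)

A = np.column_stack([np.ones(len(X)), X])    # design matrix with intercept
coef, *_ = np.linalg.lstsq(A, y, rcond=None) # two-variable QICAR model

# leave-one-out cross-validation of the fitted model
errs = []
for i in range(len(y)):
    keep = np.ones(len(y), dtype=bool)
    keep[i] = False
    c, *_ = np.linalg.lstsq(A[keep], y[keep], rcond=None)
    errs.append(y[i] - A[i] @ c)             # prediction error on held-out metal
press = float(np.sum(np.square(errs)))       # PRESS statistic
```

With real data, a small PRESS relative to the total sum of squares would support the model's predictive value, as the cross-validation in the study does.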

  12. Compilation of a near-infrared library for the construction of quantitative models of amoxicillin and potassium clavulanate oral dosage forms

    NASA Astrophysics Data System (ADS)

    Zou, Wen-bo; Chong, Xiao-meng; Wang, Yan; Hu, Chang-qin

    2018-05-01

    The accuracy of NIR quantitative models depends on calibration samples with concentration variability. Conventional sample collection methods have shortcomings, especially their time consumption, which remains a bottleneck in the application of NIR models for Process Analytical Technology (PAT) control. A study was performed to solve the problem of sample collection for the construction of NIR quantitative models, using amoxicillin and potassium clavulanate oral dosage forms as examples. The aim was to find a general approach to rapidly construct NIR quantitative models from an NIR spectral library, based on the idea of a universal model. The NIR spectral library of amoxicillin and potassium clavulanate oral dosage forms consisted of spectra of 377 batches of samples produced by 26 domestic pharmaceutical companies, including tablets, dispersible tablets, chewable tablets, oral suspensions, and granules. The correlation coefficient (rT) was used to indicate the similarity of the spectra. The calibration sets were selected from the spectral library according to the median rT of the samples to be analyzed; the rT of the selected samples was close to the median, differing by 1.0% to 1.5%. We concluded that, compared with conventional methods of determining universal models, sample selection is not a problem when constructing NIR quantitative models from a spectral library. The sample spectra with a suitable concentration range in the NIR models were collected quickly, and the models constructed through this method were more easily targeted.
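A minimal sketch of the library-based selection step, assuming rT is an ordinary Pearson correlation between spectra (the function and the synthetic data here are illustrative, not the authors' implementation):

```python
import numpy as np

def select_calibration_set(library, target, k=5):
    """Pick the k library spectra whose correlation coefficient (rT)
    with the target spectrum is closest to the median rT."""
    r = np.array([np.corrcoef(s, target)[0, 1] for s in library])
    med = np.median(r)
    idx = np.argsort(np.abs(r - med))[:k]    # k spectra nearest the median
    return idx, r[idx]

# Synthetic "library": noisy variants of a reference spectrum.
rng = np.random.default_rng(0)
wavelengths = np.linspace(0.0, 1.0, 200)
target = np.exp(-((wavelengths - 0.4) ** 2) / 0.01)
library = [target + rng.normal(0.0, s, wavelengths.size)
           for s in np.linspace(0.01, 0.2, 30)]

idx, r_sel = select_calibration_set(library, target, k=5)
```

The selected indices would then define the calibration set used to build the quantitative model for the sample at hand.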

  13. Binaural signal detection - Equalization and cancellation theory.

    NASA Technical Reports Server (NTRS)

    Durlach, N. I.

    1972-01-01

    The improvement in masked-signal detection afforded by two ears (i.e., binaural unmasking) is explained on the basis of a descriptive model of the processing of binaural stimuli by a system consisting of two bandpass filters, an equalization and cancellation mechanism, and a decision device. The main ideas of the model are initially explained, and a general equation is derived for the purpose of making quantitative predictions. Comparisons are then made between various special cases of this equation and experimental data. Failures of the preliminary model in predicting the data are considered, and possible revisions are discussed.

  14. Infrasonic waves generated by supersonic auroral arcs

    NASA Astrophysics Data System (ADS)

    Pasko, Victor P.

    2012-10-01

    A finite-difference time-domain (FDTD) model of infrasound propagation in a realistic atmosphere is used to provide quantitative interpretation of infrasonic waves produced by auroral arcs moving with supersonic speed. The Lorentz force and Joule heating are discussed in the existing literature as the primary sources producing infrasound waves in the 0.1-0.01 Hz frequency range associated with the auroral electrojet. The results are consistent with the original ideas of Swift (1973) and demonstrate that synchronization of the speed of the auroral arc with the phase speed of the acoustic wave in the electrojet volume is an important condition for generating the magnitudes and frequency content of infrasonic waves observable on the ground. The reported modeling also allows accurate quantitative reproduction of previously observed complex infrasonic waveforms, including direct and reflected shock waves, which are refracted back to the Earth by the thermosphere.

  15. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
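The model-fitting techniques the standard names (linear, quadratic, and exponential models for time-series data) can be sketched in a few lines; the data below are synthetic, and the log-transform fit for the exponential is one common choice, not necessarily the standard's prescription.

```python
import numpy as np

t = np.arange(10, dtype=float)       # time index
y = 3.0 * np.exp(0.2 * t)            # synthetic trend data

lin = np.polyfit(t, y, 1)            # linear model: y ~ c1*t + c0
quad = np.polyfit(t, y, 2)           # quadratic model: y ~ c2*t^2 + c1*t + c0
# exponential model y = a * exp(b * t), fitted via a log transform
b, log_a = np.polyfit(t, np.log(y), 1)
a = np.exp(log_a)
```

Comparing residuals across the three fitted forms is a simple way to decide which trend model describes the data best.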

  16. Crater Lakes on Mars: Development of Quantitative Thermal and Geomorphic Models

    NASA Technical Reports Server (NTRS)

    Barnhart, C. J.; Tulaczyk, S.; Asphaug, E.; Kraal, E. R.; Moore, J.

    2005-01-01

    Impact craters on Mars have served as catchments for channel-eroding surface fluids, and hundreds of examples of candidate paleolakes are documented [1,2] (see Figure 1). Because these features show similarity to terrestrial shorelines, wave action has been hypothesized as the geomorphic agent responsible for the generation of these features [3]. Recent efforts have examined the potential for shoreline formation by wind-driven waves, in order to turn an important but controversial idea into a quantitative, falsifiable hypothesis. These studies have concluded that significant wave-action shorelines are unlikely to have formed commonly within craters on Mars, barring Earth-like weather for approx. 1000 years [4,5,6].

  17. CDMBE: A Case Description Model Based on Evidence

    PubMed Central

    Zhu, Jianlin; Yang, Xiaoping; Zhou, Jing

    2015-01-01

    By combining the advantages of argument maps and Bayesian networks, a case description model based on evidence (CDMBE), suitable for the continental law system, is proposed to describe criminal cases. The logic of the model adopts credibility logical reasoning and performs evidence-based reasoning quantitatively. To be consistent with practical inference rules, five types of relationships and a set of rules are defined to calculate the credibility of assumptions based on the credibility and supportability of the related evidence. Experiments show that the model can capture users' ideas in a figure and that the results calculated from CDMBE are in line with those from a Bayesian model. PMID:26421006

  18. A Collective Case Study of Secondary Students' Model-Based Inquiry on Natural Selection through Programming in an Agent-Based Modeling Environment

    NASA Astrophysics Data System (ADS)

    Xiang, Lin

    This is a collective case study seeking to develop detailed descriptions of how programming an agent-based simulation influences a group of 8th-grade students' model-based inquiry (MBI) by examining the students' agent-based programmable modeling (ABPM) processes and learning outcomes. The context of the present study was a biology unit on natural selection implemented in a charter school of a major California city during the spring semester of 2009. Eight 8th-grade students, two boys and six girls, participated in this study. All of them were of low socioeconomic status (SES). English was a second language for all of them, but they had been identified as fluent English speakers at least a year before the study. None of them had learned either natural selection or programming before the study. The study spanned 7 weeks and comprised two phases. In phase one, the students learned natural selection in the science classroom and learned programming in NetLogo, an ABPM tool, in a computer lab; in phase two, the students were asked to program a simulation of adaptation based on the natural selection model in NetLogo. Both qualitative and quantitative data were collected in this study. The data sources included (1) pre- and post-test questionnaires, (2) student in-class worksheets, (3) programming planning sheets, (4) code-conception matching sheets, (5) student NetLogo projects, (6) videotaped programming processes, (7) final interviews, and (8) the investigator's field notes. Both qualitative and quantitative approaches were applied to analyze the gathered data. The findings suggested that students made progress in understanding adaptation phenomena and natural selection by the end of the ABPM-supported MBI learning, but the progress was limited. These students still held some misconceptions in their conceptual models, such as the idea that animals need to "learn" to adapt to the environment.
    Besides, their models of natural selection appeared to be incomplete, and many relationships among the model ideas had not been well established by the end of the study. Most of them did not treat the natural selection model as a whole but focused only on some ideas within the model. Very few of them could scientifically apply the natural selection model to interpret other evolutionary phenomena. The findings about the participating students' programming processes revealed that these processes were composed of consecutive programming cycles. A cycle typically included posing a task, constructing and running program code, and examining the resulting simulation. Students held multiple ideas and applied various programming strategies in these cycles, and were involved in MBI at each step of a cycle. Three types of ideas, six programming strategies, and ten MBI actions were identified from the processes. The relationships among these ideas, strategies, and actions were also identified and described. Findings suggested that ABPM activities could support MBI by (1) exposing students' personal models and understandings, (2) provoking and supporting a series of model-based inquiry activities, such as elaborating target phenomena, abstracting patterns, and revising conceptual models, and (3) provoking and supporting tangible and productive conversations among students, as well as between the instructor and students. Findings also revealed three programming behaviors that appeared to impede productive MBI: (1) solely phenomenon-oriented programming, (2) transplanting program code, and (3) blindly running procedures. Based on the findings, I propose a general modeling process in ABPM activities, summarize the ways in which MBI can be supported in ABPM activities and constrained by multiple factors, and suggest the implications of this study for future ABPM-assisted science instructional design and research.

  19. Cosmology. A first course

    NASA Astrophysics Data System (ADS)

    Lachieze-Rey, Marc

    This book delivers a quantitative account of the science of cosmology, designed for a non-specialist audience. The basic principles are outlined using simple maths and physics, while still providing rigorous models of the Universe. It offers an ideal introduction to the key ideas in cosmology, without going into technical details. The approach used is based on the fundamental ideas of general relativity, such as the spacetime interval, comoving coordinates, and spacetime curvature. It provides an up-to-date and thoughtful discussion of the big bang and the crucial questions of structure and galaxy formation. Questions of method and philosophical approaches in cosmology are also briefly discussed. Advanced undergraduates in either physics or mathematics would benefit greatly from this book, either as a course text or as a supplementary guide to cosmology courses.

  20. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.

  1. A study of preservice elementary teachers enrolled in a discrepant-event-based physical science class

    NASA Astrophysics Data System (ADS)

    Lilly, James Edward

    This research evaluated the POWERFUL IDEAS IN PHYSICAL SCIENCE (PIiPS) curriculum model used to develop a physical science course taken by preservice elementary teachers. The focus was on the evaluation of discrepant events used to induce conceptual change in relation to students' ideas concerning heat, temperature, and specific heat. Both quantitative and qualitative methodologies were used for the analysis. Data were collected during the 1998 Fall semester using two classes of physical science for elementary school teachers. The traditionally taught class served as the control group, and the class using the PIiPS curriculum model was the experimental group. The PIiPS curriculum model was evaluated quantitatively for its influence on students' attitude toward science, anxiety toward teaching science, self-efficacy toward teaching science, and content knowledge. An analysis of covariance was performed on the quantitative data to test for significant differences between the means of the posttests for the control and experimental groups while controlling for the pretest. It was found that there were no significant differences between the means of the control and experimental groups with respect to changes in attitude toward science, anxiety toward teaching science, or self-efficacy toward teaching science. A significant difference between the means of the content examination was found (F(1,28) = 14.202, p = 0.001); however, this result is questionable. The heat and energy module was the target for qualitative scrutiny. Coding for discrepant events was adapted from Appleton's 1996 work on students' responses to discrepant-event science lessons.
    The following qualitative questions were posed for the investigation: (1) what were the ideas of the preservice elementary students regarding heat and energy prior to entering the classroom, (2) how effective were the discrepant events as presented in the PIiPS heat and energy module, and (3) to what extent did the "risk-taking factor" associated with not telling the students the answer right away affect the learning of the material. It was found that preservice elementary teachers harbor preconceptions similar to those of the general population reported in the literature. The discrepant events used in this module of the PIiPS curriculum model met with varied results. It appeared that those students who had not successfully confronted their preconceptions were less likely to accept the new concepts that were to be developed using the discrepant events. Lastly, students showed great improvement in content understanding and developed the ability to ask deep and probing questions.

  2. Jointly characterizing epigenetic dynamics across multiple human cell types

    PubMed Central

    An, Lin; Yue, Feng; Hardison, Ross C

    2016-01-01

    Advanced sequencing technologies have generated a plethora of data for many chromatin marks in multiple tissues and cell types, yet there is a lack of a generalized tool for making optimal use of those data. A major challenge is to quantitatively model the epigenetic dynamics across both the genome and many cell types in order to understand their impacts on differential gene regulation and disease. We introduce IDEAS, an integrative and discriminative epigenome annotation system, for jointly characterizing epigenetic landscapes in many cell types and detecting differential regulatory regions. A key distinction between our method and existing state-of-the-art algorithms is that IDEAS integrates epigenomes of many cell types simultaneously in a way that preserves position-dependent and cell type-specific information at fine scales, thereby greatly improving segmentation accuracy and producing comparable annotations across cell types. PMID:27095202

  3. Charles' Law of Gases.

    ERIC Educational Resources Information Center

    Petty, John T.

    1995-01-01

    Describes an experiment that uses air to test Charles' law. Reinforces the student's intuitive feel for Charles' law with quantitative numbers they can see, introduces the idea of extrapolating experimental data to obtain a theoretical value, and gives a physical quantitative meaning to the concept of absolute zero. (JRH)
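The extrapolation idea can be sketched as follows, with invented volume readings; the quantitative point is that the fitted line reaches zero volume near -273.15 degrees C.

```python
import numpy as np

# Hypothetical volume readings (mL) at several Celsius temperatures,
# generated here from the ideal-gas relation V = V0 * (1 + T / 273.15).
T = np.array([0.0, 20.0, 40.0, 60.0, 80.0, 100.0])
V0 = 50.0                              # assumed volume at 0 degrees C
V = V0 * (1 + T / 273.15)

slope, intercept = np.polyfit(T, V, 1) # linear fit of V against T
absolute_zero = -intercept / slope     # temperature where V extrapolates to 0
```

With real lab data the extrapolated intercept lands near, but not exactly at, -273.15 degrees C, which is itself a useful discussion point about measurement error.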

  4. Discussion of skill improvement in marine ecosystem dynamic models based on parameter optimization and skill assessment

    NASA Astrophysics Data System (ADS)

    Shen, Chengcheng; Shi, Honghua; Liu, Yongzhi; Li, Fen; Ding, Dewen

    2016-07-01

    Marine ecosystem dynamic models (MEDMs) are important tools for the simulation and prediction of marine ecosystems. This article summarizes the methods and strategies used for the improvement and assessment of MEDM skill, and it attempts to establish a technical framework to inspire further ideas concerning MEDM skill improvement. The skill of MEDMs can be improved by parameter optimization (PO), which is an important step in model calibration. An efficient approach to solve the problem of PO constrained by MEDMs is the global treatment of both sensitivity analysis and PO. Model validation is an essential step following PO, which validates the efficiency of model calibration by analyzing and estimating the goodness-of-fit of the optimized model. Additionally, by focusing on the degree of impact of various factors on model skill, model uncertainty analysis can supply model users with a quantitative assessment of model confidence. Research on MEDMs is ongoing; however, improvement in model skill still lacks global treatments and its assessment is not integrated. Thus, the predictive performance of MEDMs is not strong and model uncertainties lack quantitative descriptions, limiting their application. Therefore, a large number of case studies concerning model skill should be performed to promote the development of a scientific and normative technical framework for the improvement of MEDM skill.

  5. Mapping quantitative trait loci for binary trait in the F2:3 design.

    PubMed

    Zhu, Chengsong; Zhang, Yuan-Ming; Guo, Zhigang

    2008-12-01

    In the analysis of the inheritance of quantitative traits with low heritability, an F(2:3) design that genotypes plants in F(2) and phenotypes plants in the F(2:3) progeny is often used in plant genetics. Although statistical approaches for mapping quantitative trait loci (QTL) in the F(2:3) design have been well developed, those for binary traits of biological interest and economic importance are seldom addressed. In this study, an attempt was made to map binary trait loci (BTL) in the F(2:3) design. The fundamental idea was that the F(2) plants were genotyped, all phenotypic values of each F(2:3) progeny were measured for the binary trait, and these binary trait values and the marker genotype information were used to detect BTL under the penetrance and liability models. The proposed method was verified by a series of Monte Carlo simulation experiments. These results showed that maximum likelihood approaches under the penetrance and liability models provide accurate estimates of the effects and locations of BTL with high statistical power, even under low heritability. Moreover, the penetrance model is as efficient as the liability model, and the F(2:3) design is more efficient than the classical F(2) design, even though only a single progeny is collected from each F(2:3) family. With the maximum likelihood approaches under the penetrance and liability models developed in this study, we can map binary traits as we do quantitative traits in the F(2:3) design.

  6. A Cognitive Analysis of Students’ Mathematical Communication Ability on Geometry

    NASA Astrophysics Data System (ADS)

    Sari, D. S.; Kusnandi, K.; Suhendra, S.

    2017-09-01

    This study aims to analyze the difficulties in the mathematical communication ability of students at one secondary school on the "three-dimensional space" topic. The research was conducted using a quantitative approach with a descriptive method. The population was all students of that school, and the sample was thirty students chosen by a purposive sampling technique. Data on mathematical communication were collected through an essay test and analyzed descriptively. The results indicate the following percentages of achievement for the student mathematical communication indicators: 1) stating situations, ideas, and mathematical relations as images, graphics, or algebraic expressions, 35%; 2) stating daily experience in mathematical language/symbols or a mathematical model, 35%; and 3) associating images or diagrams with mathematical ideas, 53.3%. Based on the percentage of achievement on each indicator, it can be concluded that the level of students' mathematical communication ability is still low. This may be because the students were not accustomed to conveying or writing their mathematical ideas systematically. Therefore, students' mathematical communication ability needs to be improved.

  7. An Online Risk Monitor System (ORMS) to Increase Safety and Security Levels in Industry

    NASA Astrophysics Data System (ADS)

    Zubair, M.; Rahman, Khalil Ur; Hassan, Mehmood Ul

    2013-12-01

    The main idea of this research is to develop an Online Risk Monitor System (ORMS) based on Living Probabilistic Safety Assessment (LPSA). The article highlights the essential features and functions of ORMS. The basic models and modules, such as the Reliability Data Update Model (RDUM), running-time update, redundant-system unavailability update, Engineered Safety Features (ESF) unavailability update, and general system update, are described in this study. ORMS not only provides quantitative analysis but also highlights qualitative aspects of risk measures. It is capable of automatically updating the online risk models and the reliability parameters of equipment, and it can support the decision-making process of operators and managers in nuclear power plants.
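
    As an illustration of what a reliability-parameter update inside a module like the RDUM could look like (a generic conjugate Bayesian update is assumed here for the sketch, not ORMS's documented algorithm):

```python
def update_failure_rate(prior_shape, prior_rate, failures, exposure_hours):
    """Conjugate gamma-Poisson update of a failure-rate estimate.

    Prior: lambda ~ Gamma(prior_shape, prior_rate); data: `failures` events
    observed over `exposure_hours` of operation.
    """
    post_shape = prior_shape + failures
    post_rate = prior_rate + exposure_hours
    return post_shape, post_rate, post_shape / post_rate  # posterior mean last

# hypothetical pump: prior mean 1/1000 per hour, then 2 failures in 9000 h
shape, rate, mean = update_failure_rate(1.0, 1000.0, 2, 9000.0)
```

    Each new batch of plant operating experience simply shifts the posterior, which is what makes fully automatic online updating feasible.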

  8. The Testing Effect Is Alive and Well with Complex Materials

    ERIC Educational Resources Information Center

    Karpicke, Jeffrey D.; Aue, William R.

    2015-01-01

    Van Gog and Sweller (2015) claim that there is no testing effect--no benefit of practicing retrieval--for complex materials. We show that this claim is incorrect on several grounds. First, Van Gog and Sweller's idea of "element interactivity" is not defined in a quantitative, measurable way. As a consequence, the idea is applied…

  9. A Study Investigating Indian Middle School Students' Ideas of Design and Designers

    ERIC Educational Resources Information Center

    Ara, Farhat; Chunawala, Sugra; Natarajan, Chitra

    2011-01-01

    This paper reports on an investigation into middle school students' naive ideas about, and attitudes towards design and designers. The sample for the survey consisted of students from Classes 7 to 9 from a school located in Mumbai. The data were analysed qualitatively and quantitatively to look for trends in students' responses. Results show that…

  10. Safe uses of Hill's model: an exact comparison with the Adair-Klotz model

    PubMed Central

    2011-01-01

    Background: The Hill function and the related Hill model are used frequently to study processes in the living cell. There are very few studies investigating the situations in which the model can be safely used. For example, it has been shown, at the mean-field level, that the dose-response curve obtained from a Hill model agrees well with the dose-response curves obtained from a more complicated Adair-Klotz model, provided that the parameters of the Adair-Klotz model describe strongly cooperative binding. However, it has not been established whether such findings can be extended to other properties and non-mean-field (stochastic) versions of the same, or other, models. Results: In this work a rather generic quantitative framework for approaching such a problem is suggested. The main idea is to focus on comparing the particle number distribution functions for Hill's and Adair-Klotz's models instead of investigating a particular property (e.g. the dose-response curve). The approach is valid for any model that can be mathematically related to the Hill model. The Adair-Klotz model is used to illustrate the technique. One main and two auxiliary similarity measures were introduced to compare the distributions in a quantitative way. Both the time-dependent and the equilibrium properties of the similarity measures were studied. Conclusions: A strongly cooperative Adair-Klotz model can be replaced by a suitable Hill model in such a way that any property computed from the two models, even one describing stochastic features, is approximately the same. The quantitative analysis showed that the boundaries of the regions in parameter space where the models behave in the same way exhibit a rather rich structure. PMID:21521501
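
    The comparison idea can be illustrated with a toy sketch (my own minimal formulation, not the paper's exact similarity measures): a Hill dose-response curve plus an overlap coefficient that scores agreement between two particle-number distributions:

```python
def hill(x, K, n):
    """Hill dose-response: fractional saturation at ligand level x."""
    return x**n / (K**n + x**n)

def overlap(p, q):
    """Similarity of two discrete distributions: sum of pointwise minima.

    Equals 1.0 for identical distributions, 0.0 for disjoint supports.
    """
    support = set(p) | set(q)
    return sum(min(p.get(k, 0.0), q.get(k, 0.0)) for k in support)

p = {0: 0.2, 1: 0.5, 2: 0.3}    # hypothetical particle-number pmf, model A
q = {0: 0.25, 1: 0.45, 2: 0.3}  # hypothetical pmf from a second model
```

    Comparing whole distributions in this spirit, rather than a single derived property, is what lets the conclusion cover stochastic features as well.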

  11. The Pathway for Oxygen: Tutorial Modelling on Oxygen Transport from Air to Mitochondrion: The Pathway for Oxygen.

    PubMed

    Bassingthwaighte, James B; Raymond, Gary M; Dash, Ranjan K; Beard, Daniel A; Nolan, Margaret

    2016-01-01

    The 'Pathway for Oxygen' is captured in a set of models describing quantitative relationships between fluxes and driving forces for the flux of oxygen from the external air source to the mitochondrial sink at cytochrome oxidase. The intervening processes involve convection, membrane permeation, diffusion of free and heme-bound O2 and enzymatic reactions. While this system's basic elements are simple: ventilation, alveolar gas exchange with blood, circulation of the blood, perfusion of an organ, uptake by tissue, and consumption by chemical reaction, integration of these pieces quickly becomes complex. This complexity led us to construct a tutorial on the ideas and principles; these first PathwayO2 models are simple but quantitative and cover: (1) a 'one-alveolus lung' with airway resistance, lung volume compliance, (2) bidirectional transport of solute gasses like O2 and CO2, (3) gas exchange between alveolar air and lung capillary blood, (4) gas solubility in blood, and circulation of blood through the capillary syncytium and back to the lung, and (5) blood-tissue gas exchange in capillaries. These open-source models are at Physiome.org and provide background for the many respiratory models there.
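
    The series arrangement of transport steps can be sketched as conductances in series, a textbook simplification with hypothetical numbers rather than the Physiome models themselves:

```python
def series_conductance(conductances):
    """Overall conductance of transport steps in series (reciprocal-sum rule)."""
    return 1.0 / sum(1.0 / g for g in conductances)

# hypothetical step conductances (ml O2 / min / mmHg): ventilation,
# alveolar exchange, circulation, tissue diffusion
g_total = series_conductance([8.0, 8.0, 4.0, 2.0])

p_air, p_mito = 150.0, 5.0         # mmHg; approximate textbook values
flux = g_total * (p_air - p_mito)  # steady-state O2 flux, ml O2 / min
```

    The reciprocal-sum rule makes the integration problem visible: the smallest conductance in the chain dominates the overall flux, which is why the tutorial models treat each step quantitatively.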

  13. The inventory for déjà vu experiences assessment. Development, utility, reliability, and validity.

    PubMed

    Sno, H N; Schalken, H F; de Jonghe, F; Koeter, M W

    1994-01-01

    In this article the development, utility, reliability, and validity of the Inventory for Déjà vu Experiences Assessment (IDEA) are described. The IDEA is a 23-item self-administered questionnaire consisting of a general section of nine questions and a qualitative section of 14 questions. The latter questions comprise 48 topics. The questionnaire appeared to be a user-friendly instrument with satisfactory to good reliability and validity. The IDEA permits the study of quantitative and qualitative characteristics of déjà vu experiences.

  14. Applications of population genetics to animal breeding, from wright, fisher and lush to genomic prediction.

    PubMed

    Hill, William G

    2014-01-01

    Although animal breeding was practiced long before the science of genetics and the relevant disciplines of population and quantitative genetics were known, breeding programs have mainly relied on simply selecting and mating the best individuals on their own or relatives' performance. This is based on sound quantitative genetic principles, developed and expounded by Lush, who attributed much of his understanding to Wright, and formalized in Fisher's infinitesimal model. Analysis at the level of individual loci and gene frequency distributions has had relatively little impact. Now with access to genomic data, a revolution in which molecular information is being used to enhance response with "genomic selection" is occurring. The predictions of breeding value still utilize multiple loci throughout the genome and, indeed, are largely compatible with additive and specifically infinitesimal model assumptions. I discuss some of the history and genetic issues as applied to the science of livestock improvement, which has had and continues to have major spin-offs into ideas and applications in other areas.

  15. Electrostatic Estimation of Intercalant Jump-Diffusion Barriers Using Finite-Size Ion Models.

    PubMed

    Zimmermann, Nils E R; Hannah, Daniel C; Rong, Ziqin; Liu, Miao; Ceder, Gerbrand; Haranczyk, Maciej; Persson, Kristin A

    2018-02-01

    We report on a scheme for estimating intercalant jump-diffusion barriers that are typically obtained from demanding density functional theory-nudged elastic band calculations. The key idea is to relax a chain of states in the field of the electrostatic potential that is averaged over a spherical volume using different finite-size ion models. For magnesium migrating in typical intercalation materials such as transition-metal oxides, we find that the optimal model is a relatively large shell. This data-driven result parallels typical assumptions made in models based on Onsager's reaction field theory to quantitatively estimate electrostatic solvent effects. Because of its efficiency, our potential of electrostatics-finite ion size (PfEFIS) barrier estimation scheme will enable rapid identification of materials with good ionic mobility.
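
    The core PfEFIS step, evaluating a spherically averaged potential along a migration path, can be sketched with a toy analytic potential in place of a DFT-derived grid (the path, radius, and potential here are all hypothetical):

```python
import math

def sphere_average(potential, center, radius, n=500):
    """Average `potential` over n Fibonacci-lattice points on a sphere."""
    total = 0.0
    golden = math.pi * (3.0 - math.sqrt(5.0))
    for i in range(n):
        z = 1.0 - 2.0 * (i + 0.5) / n
        r = math.sqrt(1.0 - z * z)
        phi = golden * i
        point = (center[0] + radius * r * math.cos(phi),
                 center[1] + radius * r * math.sin(phi),
                 center[2] + radius * z)
        total += potential(point)
    return total / n

def barrier_along_path(potential, path, radius):
    """Migration barrier: max averaged potential minus its value at the start."""
    values = [sphere_average(potential, p, radius) for p in path]
    return max(values) - values[0]

# toy quadratic potential whose maximum along the path sits at x = 1
toy = lambda p: p[0] ** 2
path = [(0.1 * i, 0.0, 0.0) for i in range(11)]  # straight hypothetical path
barrier = barrier_along_path(toy, path, radius=0.3)
```

    The finite averaging radius is the "finite-size ion model": a larger shell smooths the potential, mimicking the screened, delocalized charge of the migrating ion.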

  16. Innovating in health delivery: The Penn medicine innovation tournament.

    PubMed

    Terwiesch, Christian; Mehta, Shivan J; Volpp, Kevin G

    2013-06-01

    Innovation tournaments can drive engagement and value generation by shifting problem-solving towards the end user. In health care, where the frontline workers have the most intimate understanding of patients' experience and the delivery process, encouraging them to generate and develop new approaches is critical to improving health care delivery. In many health care organizations, senior managers and clinicians retain control of innovation; frontline workers need to be engaged in the innovation process. Penn Medicine launched a system-wide innovation tournament with the goal of improving the patient experience. We set a quantitative goal of receiving 500 ideas and getting at least 1000 employees to participate in the tournament. A secondary goal was to involve various groups in the care process (doctors, nurses, clerical staff, transporters). The tournament was broken into three phases. During Phase 1, employees were encouraged to submit ideas. Submissions were judged by an expert panel and by crowdsourcing, based on their potential to improve the patient experience and their ability to be implemented within 6 months. During Phase 2, the best 200 ideas were pitched during a series of 5 workshops and ten finalists were selected. During Phase 3, the best 10 ideas were presented to and judged by an audience of about 200 interested employees and a judging panel of 15 administrators. Two winners were selected. A total of 1739 ideas were submitted and over 5000 employees participated in the innovation tournament. Patient convenience/amenities (21%) was the top category of submission, with other popular areas including technology optimization (11%), assistance with navigation within UPHS (10%), improving patient/family-centered care (9%), and care delivery models/transitions (9%). A combination of winning and other submitted ideas were implemented.

  17. Structure of the first- and second-neighbor shells of simulated water: Quantitative relation to translational and orientational order

    NASA Astrophysics Data System (ADS)

    Yan, Zhenyu; Buldyrev, Sergey V.; Kumar, Pradeep; Giovambattista, Nicolas; Debenedetti, Pablo G.; Stanley, H. Eugene

    2007-11-01

    We perform molecular dynamics simulations of water using the five-site transferable interaction potential (TIP5P) model to quantify structural order in both the first shell (defined by four nearest neighbors) and second shell (defined by twelve next-nearest neighbors) of a central water molecule. We find that the anomalous decrease of orientational order upon compression occurs in both shells, but the anomalous decrease of translational order upon compression occurs mainly in the second shell. The decreases of translational order and orientational order upon compression (called the “structural anomaly”) are thus correlated only in the second shell. Our findings quantitatively confirm the qualitative idea that the thermodynamic, structural, and hence dynamic anomalies of water are related to changes upon compression in the second shell.
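
    Orientational order of the kind measured here is commonly quantified by the tetrahedral order parameter q computed from a molecule's four nearest neighbors (the standard water-structure definition; the paper's exact conventions may differ slightly):

```python
def tetrahedral_order(cos_angles):
    """Orientational order q from the six angles among a molecule's four
    nearest neighbors: q = 1 for a perfect tetrahedron, ~0 for an ideal gas."""
    if len(cos_angles) != 6:
        raise ValueError("need the 6 neighbor-pair angle cosines")
    return 1.0 - (3.0 / 8.0) * sum((c + 1.0 / 3.0) ** 2 for c in cos_angles)

# perfect tetrahedral coordination: all angles are arccos(-1/3) ~ 109.47 deg
q_perfect = tetrahedral_order([-1.0 / 3.0] * 6)
```

    Tracking how a shell-resolved version of such an order parameter changes under compression is the kind of measurement that separates first-shell from second-shell behavior.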

  18. Analysis techniques for tracer studies of oxidation. M. S. Thesis Final Report

    NASA Technical Reports Server (NTRS)

    Basu, S. N.

    1984-01-01

    Analysis techniques to obtain quantitative diffusion data from tracer concentration profiles were developed. Mass balance ideas were applied to determine the mechanism of oxide growth and to separate the fractions of inward and outward growth of oxide scales. The process of inward oxygen diffusion with exchange was theoretically modelled, and the effects of lattice diffusivity, grain boundary diffusivity and grain size on the tracer concentration profile were studied. The development of the tracer concentration profile in a growing oxide scale was simulated. The double oxidation technique was applied to a FeCrAl-Zr alloy using O-18 as a tracer. SIMS was used to obtain the tracer concentration profile. The formation of lacey oxide on the alloy was discussed. Careful consideration was given to the quality of data required to obtain quantitative information.

  19. Thresholds and the rising pion inclusive cross section

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, S.T.

    In the context of the hypothesis of the Pomeron-f identity, it is shown that the rising pion inclusive cross section can be explained over a wide range of energies as a series of threshold effects. Low-mass thresholds are seen to be important. In order to understand the contributions of high-mass thresholds (flavoring), a simple two-channel multiperipheral model is examined. The analysis sheds light on the relation between thresholds and Mueller-Regge couplings. In particular, it is seen that inclusive- and total-cross-section threshold mechanisms may differ. A quantitative model based on this idea and utilizing previous total-cross-section fits is seen to agree well with experiment.

  20. An approach to computing direction relations between separated object groups

    NASA Astrophysics Data System (ADS)

    Yan, H.; Wang, Z.; Li, J.

    2013-06-01

    Direction relations between object groups play an important role in qualitative spatial reasoning, spatial computation, and spatial recognition. However, none of the existing models can be used to compute direction relations between object groups. To fill this gap, an approach to computing direction relations between separated object groups is proposed in this paper, theoretically based on Gestalt principles and the idea of multi-directions. The approach first triangulates the two object groups; it then constructs the Voronoi diagram between the two groups using the triangular network; after this, the normal of each Voronoi edge is calculated and the quantitative expression of the direction relations is constructed; finally, the quantitative direction relations are transformed into qualitative ones. The psychological experiments show that the proposed approach can obtain direction relations both between two single objects and between two object groups, and the results are correct from the point of view of spatial cognition.
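
    The last step, turning a quantitative direction such as a Voronoi-edge normal into a qualitative relation, can be sketched as a sector lookup (an eight-sector cone model chosen for illustration; the paper's discretization may differ):

```python
import math

SECTORS = ["east", "northeast", "north", "northwest",
           "west", "southwest", "south", "southeast"]

def qualitative_direction(dx, dy):
    """Map a direction vector to one of eight 45-degree qualitative sectors."""
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    return SECTORS[int(((angle + 22.5) % 360.0) // 45.0)]
```

    Aggregating such per-edge qualitative labels over all Voronoi edges between the groups would then yield a multi-direction description rather than a single direction.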

  2. Explanation Constraint Programming for Model-based Diagnosis of Engineered Systems

    NASA Technical Reports Server (NTRS)

    Narasimhan, Sriram; Brownston, Lee; Burrows, Daniel

    2004-01-01

    We can expect to see an increase in the deployment of unmanned air and land vehicles for autonomous exploration of space. In order to maintain autonomous control of such systems, it is essential to track the current state of the system. When the system includes safety-critical components, failures or faults in the system must be diagnosed as quickly as possible, and their effects compensated for so that control and safety are maintained under a variety of fault conditions. The Livingstone fault diagnosis and recovery kernel and its temporal extension L2 are examples of model-based reasoning engines for health management. Livingstone has been shown to be effective, it is in demand, and it is being further developed. It was part of the successful Remote Agent demonstration on Deep Space One in 1999. It has been and is being utilized by several projects involving groups from various NASA centers, including the In Situ Propellant Production (ISPP) simulation at Kennedy Space Center, the X-34 and X-37 experimental reusable launch vehicle missions, Techsat-21, and advanced life support projects. Model-based and consistency-based diagnostic systems like Livingstone work only with discrete and finite domain models. When quantitative and continuous behaviors are involved, these are abstracted to discrete form using some mapping. This mapping from the quantitative domain to the qualitative domain is sometimes very involved and requires the design of highly sophisticated and complex monitors. We propose a diagnostic methodology that deals directly with quantitative models and behaviors, thereby mitigating the need for these sophisticated mappings. Our work brings together ideas from model-based diagnosis systems like Livingstone and concurrent constraint programming concepts. The system uses explanations derived from the propagation of quantitative constraints to generate conflicts. 
Fast conflict generation algorithms are used to generate and maintain multiple candidates whose consistency can be tracked across multiple time steps.
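
    A minimal sketch of diagnosis with quantitative constraints (hypothetical component model and readings; the actual engine uses concurrent constraint programming with explanation tracking): propagate an interval through a component model and flag a conflict when an observation falls outside it:

```python
def propagate_gain(input_interval, gain_interval):
    """Interval product for a simple gain block: output = gain * input.

    Valid as written for non-negative intervals only.
    """
    (ilo, ihi), (glo, ghi) = input_interval, gain_interval
    return (ilo * glo, ihi * ghi)

def conflict(observation, interval, tol=1e-9):
    """A measurement outside the propagated interval is a diagnostic conflict."""
    lo, hi = interval
    return not (lo - tol <= observation <= hi + tol)

# hypothetical pump: commanded flow 2-3 units, healthy gain 0.9-1.1
predicted = propagate_gain((2.0, 3.0), (0.9, 1.1))
```

    Working directly on such quantitative intervals is what removes the need for hand-crafted monitors that discretize continuous behavior before diagnosis.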

  3. Overcoming pain thresholds with multilevel models-an example using quantitative sensory testing (QST) data.

    PubMed

    Hirschfeld, Gerrit; Blankenburg, Markus R; Süß, Moritz; Zernikow, Boris

    2015-01-01

    The assessment of somatosensory function is a cornerstone of research and clinical practice in neurology. Recent initiatives have developed novel protocols for quantitative sensory testing (QST). Application of these methods has led to intriguing findings, such as the presence of lower pain thresholds in healthy children compared to healthy adolescents. In this article, we (re-)introduce the basic concepts of signal detection theory (SDT) as a method to investigate such differences in somatosensory function in detail. SDT describes participants' responses according to two parameters: sensitivity and response bias. Sensitivity refers to individuals' ability to discriminate between painful and non-painful stimulations. Response bias refers to individuals' criterion for giving a "painful" response. We describe how multilevel models can be used to estimate these parameters and to overcome central critiques of these methods. To provide an example, we apply these methods to data from the mechanical pain sensitivity test of the QST protocol. The results show that adolescents are more sensitive to mechanical pain and contradict the idea that younger children simply use more lenient criteria to report pain. Overall, we hope that the wider use of multilevel modeling to describe somatosensory functioning may advance neurology research and practice.
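
    The two SDT parameters can be computed from hit and false-alarm rates with the standard equal-variance Gaussian formulas (the rates below are hypothetical; the article estimates these within a multilevel model rather than per participant):

```python
from statistics import NormalDist

def sdt_parameters(hit_rate, false_alarm_rate):
    """Equal-variance SDT: sensitivity d' and response criterion c."""
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(false_alarm_rate)
    criterion = -0.5 * (z(hit_rate) + z(false_alarm_rate))
    return d_prime, criterion

# hypothetical rates: 84% "painful" responses to painful stimuli,
# 16% "painful" responses to non-painful stimuli
d_prime, criterion = sdt_parameters(0.84, 0.16)
```

    With symmetric rates the criterion is zero (no bias) while d' is about 2; a more lenient criterion would appear as c < 0 even at identical sensitivity, which is exactly the distinction at stake in the children-versus-adolescents finding.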

  4. Statistical Physics Approaches to Microbial Ecology

    NASA Astrophysics Data System (ADS)

    Mehta, Pankaj

    The unprecedented ability to quantitatively measure and probe complex microbial communities has renewed interest in identifying the fundamental ecological principles governing community ecology in microbial ecosystems. Here, we present work from our group and others showing how ideas from statistical physics can help us uncover these ecological principles. Two major lessons emerge from this work. First, large ecosystems with many species often display new, emergent ecological behaviors that are absent in small ecosystems with just a few species. To paraphrase Nobel laureate Phil Anderson, ''More is Different'', especially in community ecology. Second, the lack of trophic-layer separation in microbial ecology fundamentally distinguishes it from classical paradigms of community ecology and leads to qualitatively different rules for community assembly in microbes. I illustrate these ideas using both theoretical modeling and novel experiments on large microbial ecosystems performed by our collaborators (Joshua Goldford and Alvaro Sanchez). Work supported by a Simons Investigator in MMLS award and NIH R35 GM119461.

  5. You can run, you can hide: The epidemiology and statistical mechanics of zombies

    NASA Astrophysics Data System (ADS)

    Alemi, Alexander A.; Bierbaum, Matthew; Myers, Christopher R.; Sethna, James P.

    2015-11-01

    We use a popular fictional disease, zombies, in order to introduce techniques used in modern epidemiology modeling, and ideas and techniques used in the numerical study of critical phenomena. We consider variants of zombie models, from fully connected continuous time dynamics to a full scale exact stochastic dynamic simulation of a zombie outbreak on the continental United States. Along the way, we offer a closed form analytical expression for the fully connected differential equation, and demonstrate that the single person per site two dimensional square lattice version of zombies lies in the percolation universality class. We end with a quantitative study of the full scale US outbreak, including the average susceptibility of different geographical regions.
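
    The fully connected (well-mixed) variant can be sketched as a three-compartment model integrated with Euler steps (parameter values are hypothetical; the paper derives a closed-form solution for this case):

```python
def szr_step(s, z, r, beta, kappa, dt):
    """One Euler step of the well-mixed SZR zombie model: bites convert
    susceptibles to zombies, kills convert zombies to removed."""
    bites = beta * s * z * dt
    kills = kappa * s * z * dt
    return s - bites, z + bites - kills, r + kills

s, z, r = 0.99, 0.01, 0.0  # fractions of the population
for _ in range(20000):     # integrate to t = 200 with dt = 0.01
    s, z, r = szr_step(s, z, r, beta=4.0, kappa=3.0, dt=0.01)
```

    Because the bite rate exceeds the kill rate here, the susceptible fraction collapses while the population fractions remain conserved; the lattice and stochastic versions studied in the paper add the spatial structure this mean-field sketch ignores.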

  6. Integrating art into science education: a survey of science teachers' practices

    NASA Astrophysics Data System (ADS)

    Turkka, Jaakko; Haatainen, Outi; Aksela, Maija

    2017-07-01

    Numerous case studies suggest that integrating art and science education could engage students with creative projects and encourage them to express science in a multitude of ways. However, little is known about art integration practices in everyday science teaching. With a qualitative e-survey, this study explores the art integration practices of science teachers (n = 66). A pedagogical model for science teachers' art integration emerged from a qualitative content analysis conducted on examples of art integration. In the model, art integration is characterised as integration through content and through activities. Whilst the links in the content were facilitated either directly between concepts and ideas or indirectly through themes or artefacts, integration through activity often connected an activity in one domain with a concept, idea or artefact in the other domain, with the exception of some activities that could belong to both domains. Moreover, the examples of art integration in the everyday classroom did not include the expression of emotions often associated with art. In addition, the quantitative part of the survey confirmed that integration is infrequent in all mapped areas. The findings of this study have implications for science teacher education, which should offer opportunities for more consistent art integration.

  7. Using Statistical Multivariable Models to Understand the Relationship Between Interplanetary Coronal Mass Ejecta and Magnetic Flux Ropes

    NASA Technical Reports Server (NTRS)

    Riley, P.; Richardson, I. G.

    2012-01-01

    In-situ measurements of interplanetary coronal mass ejections (ICMEs) display a wide range of properties. A distinct subset, "magnetic clouds" (MCs), are readily identifiable by a smooth rotation in an enhanced magnetic field, together with an unusually low solar wind proton temperature. In this study, we analyze Ulysses spacecraft measurements to systematically investigate five possible explanations for why some ICMEs are observed to be MCs and others are not: i) an observational selection effect; that is, all ICMEs do in fact contain MCs, but the trajectory of the spacecraft through the ICME determines whether the MC is actually encountered; ii) interactions of an erupting flux rope (FR) with itself or between neighboring FRs, which produce complex structures in which the coherent magnetic structure has been destroyed; iii) an evolutionary process, such as relaxation to a low plasma-beta state, that leads to the formation of an MC; iv) the existence of two (or more) intrinsic initiation mechanisms, some of which produce MCs and some that do not; or v) MCs are just an easily identifiable limit in an otherwise continuous spectrum of structures. We apply quantitative statistical models to assess these ideas. In particular, we use the Akaike information criterion (AIC) to rank the candidate models and a Gaussian mixture model (GMM) to uncover any intrinsic clustering of the data. Using a logistic regression, we find that plasma-beta, CME width, and the ratio O(sup 7)/O(sup 6) are the most significant predictor variables for the presence of an MC. Moreover, the propensity for an event to be identified as an MC decreases with heliocentric distance. These results tend to refute ideas ii) and iii). GMM clustering analysis further identifies three distinct groups of ICMEs, two of which match (at the 86% level) with events independently identified as MCs, and a third that matches with non-MCs (68% overlap). Thus, idea v) is not supported.
Choosing between ideas i) and iv) is more challenging, since they may effectively be indistinguishable from one another by a single in-situ spacecraft. We offer some suggestions on how future studies may address this.
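
    The AIC machinery used to rank candidate models is itself compact (the log-likelihoods and parameter counts below are hypothetical placeholders, not the paper's fitted values):

```python
def aic(log_likelihood, n_params):
    """Akaike information criterion: lower is better."""
    return 2.0 * n_params - 2.0 * log_likelihood

# hypothetical logistic-regression fits for "is this ICME an MC?"
models = {
    "beta only": (-120.3, 2),
    "beta + CME width": (-110.8, 3),
    "beta + width + O7/O6": (-101.7, 4),
}
ranked = sorted(models, key=lambda name: aic(*models[name]))
```

    In this made-up ranking the richest model wins despite its extra parameters, mirroring the finding that plasma-beta, width, and O(sup 7)/O(sup 6) together are the strongest predictors.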

  8. Engineering as a new frontier for translational medicine

    PubMed Central

    Chien, Shu; Bashir, Rashid; Nerem, Robert M.; Pettigrew, Roderic

    2015-01-01

    The inclusion of engineering ideas and approaches makes medicine a quantitative and systems-based discipline that facilitates precision diagnostics and therapeutics to improve health care delivery for all. PMID:25834106

  10. Quantitative brain tissue oximetry, phase spectroscopy and imaging the range of homeostasis in piglet brain.

    PubMed

    Chance, Britton; Ma, Hong Yan; Nioka, Shoko

    2003-01-01

    The quantification of tissue oxygen by frequency- or time-domain methods has been discussed in a number of prior publications, where the meaning of the tissue hemoglobin oxygen saturation was unclear and where CW instruments were unsuitable for proper quantitative measurements [1, 2]. The development of the IQ Phase Meter has greatly simplified and made reliable the difficult determination of precise phase and amplitude signals from the brain. This contribution reports on the calibration of the instrument in model systems and its use to measure tissue saturation (StO2) in a small animal model. In addition, a global interpretation of the meaning of tissue oxygen has been formulated, based on the idea that autoregulation will maintain tissue oxygen at a fixed value over a range of arterial and venous oxygen values, i.e., over the range of autoregulation. Beyond that range, the tissue oxygen is still correctly measured but, as expected, approaches the arterial saturation at low metabolic rates and the venous saturation at high metabolic rates of mitochondria.

  11. Frontiers of finance: evolution and efficient markets.

    PubMed

    Farmer, J D; Lo, A W

    1999-08-31

    In this review article, we explore several recent advances in the quantitative modeling of financial markets. We begin with the Efficient Markets Hypothesis and describe how this controversial idea has stimulated a number of new directions of research, some focusing on more elaborate mathematical models that are capable of rationalizing the empirical facts, others taking a completely different tack in rejecting rationality altogether. One of the most promising directions is to view financial markets from a biological perspective and, specifically, within an evolutionary framework in which markets, instruments, institutions, and investors interact and evolve dynamically according to the "law" of economic selection. Under this view, financial agents compete and adapt, but they do not necessarily do so in an optimal fashion. Evolutionary and ecological models of financial markets are truly a new frontier whose exploration has just begun.

  13. Performance of GUNGEN Idea Generation Support Groupware: Lessons from Over A Few Hundred Trial Sessions

    NASA Astrophysics Data System (ADS)

    Yuizono, Takaya; Munemori, Jun

    GUNGEN-DXII, a new version of the GUNGEN groupware, allows the users to process hundreds of qualitative data segments (phrases and sentences) and compose a coherent piece of text containing a number of emergent ideas. The idea generation process is guided by the KJ method, a leading idea generation technique in Japan. This paper describes functions of GUNGEN supporting three major sub-activities of idea generation, namely, brainstorming, idea clustering, and text composition, and also summarizes the results obtained from a few hundred trial sessions with the old and new GUNGEN systems in terms of some qualitative and quantitative measures. The results show that the sessions with GUNGEN yield intermediate and final products at least as good as those from the original paper-and-pencil KJ method sessions, in addition to the advantages of the online system, such as distance collaboration and digital storage of the products. Moreover, results from the new GUNGEN-DXII raise hope that users will be able to handle an extremely large number of qualitative data segments in the near future.

  14. Missing heritability in the tails of quantitative traits? A simulation study on the impact of slightly altered true genetic models.

    PubMed

    Pütter, Carolin; Pechlivanis, Sonali; Nöthen, Markus M; Jöckel, Karl-Heinz; Wichmann, Heinz-Erich; Scherag, André

    2011-01-01

    Genome-wide association studies have identified robust associations between single nucleotide polymorphisms and complex traits. As the proportion of phenotypic variance explained is still limited for most of the traits, larger and larger meta-analyses are being conducted to detect additional associations. Here we investigate the impact of the study design and the underlying assumption about the true genetic effect in a bimodal mixture situation on the power to detect associations. We performed simulations of quantitative phenotypes analysed by standard linear regression and dichotomized case-control data sets from the extremes of the quantitative trait analysed by standard logistic regression. Using linear regression, markers with an effect in the extremes of the traits were almost undetectable, whereas analysing extremes by case-control design had superior power even for much smaller sample sizes. Two real data examples are provided to support our theoretical findings and to explore our mixture and parameter assumption. Our findings support the idea of re-analysing the available meta-analysis data sets to detect new loci in the extremes. Moreover, our investigation offers an explanation for discrepant findings when analysing quantitative traits in the general population and in the extremes. Copyright © 2011 S. Karger AG, Basel.
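    The contrast described here, standard linear regression on the full quantitative trait versus an extreme-sampling case-control design, can be sketched with a toy simulation. The mixture fraction, effect size, and sample sizes below are illustrative assumptions, not the paper's parameters, and a chi-square test on genotype counts stands in for the logistic regression.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n = 20_000
    g = rng.binomial(2, 0.3, n)  # additive genotype: 0, 1, or 2 copies of the allele

    # Hypothetical mixture: the variant shifts the trait only in a small
    # subpopulation, so its effect concentrates in the distribution's tails.
    affected = rng.random(n) < 0.05
    y = rng.normal(0.0, 1.0, n) + np.where(affected, 0.8 * g, 0.0)

    # Design 1: standard linear regression on the full quantitative trait.
    p_linear = stats.linregress(g, y).pvalue

    # Design 2: extreme sampling - genotype counts in the top vs bottom decile
    # (a chi-square test stands in here for the logistic regression).
    lo, hi = np.quantile(y, [0.10, 0.90])
    cases, controls = g[y >= hi], g[y <= lo]
    table = [[int(np.sum(grp == k)) for k in (0, 1, 2)] for grp in (cases, controls)]
    chi2, p_extreme, dof, _ = stats.chi2_contingency(table)
    ```

    Comparing `p_linear` and `p_extreme` across repeated simulations is the kind of power comparison the study reports.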

  15. Electron energy loss spectrometry of interstellar diamonds

    NASA Technical Reports Server (NTRS)

    Bernatowicz, Thomas J.; Gibbons, Patrick C.; Lewis, Roy S.

    1990-01-01

    The results are reported of electron energy loss spectra (EELS) measurements on diamond residues from carbonaceous meteorites designed to elucidate the structure and composition of interstellar diamonds. Dynamic effective medium theory is used to model the dielectric properties of the diamonds and in particular to synthesize the observed spectra as mixtures of diamond and various pi-bonded carbons. The results are shown to be quantitatively consistent with the idea that diamonds and their surfaces are the only contributors to the electron energy loss spectra of the diamond residues and that these peculiar spectra are the result of the exceptionally small grain size and large specific surface area of the interstellar diamonds.

  16. A two-level structure for advanced space power system automation

    NASA Technical Reports Server (NTRS)

    Loparo, Kenneth A.; Chankong, Vira

    1990-01-01

    The tasks to be carried out during the three-year project period are: (1) performing extensive simulation using existing mathematical models to build a specific knowledge base of the operating characteristics of space power systems; (2) carrying out the necessary basic research on hierarchical control structures, real-time quantitative algorithms, and decision-theoretic procedures; (3) developing a two-level automation scheme for fault detection and diagnosis, maintenance and restoration scheduling, and load management; and (4) testing and demonstration. The outlines of the proposed system structure that served as a master plan for this project, work accomplished, concluding remarks, and ideas for future work are also addressed.

  17. Qualitative and quantitative descriptions of glenohumeral motion.

    PubMed

    Hill, A M; Bull, A M J; Wallace, A L; Johnson, G R

    2008-02-01

    Joint modelling plays an important role in qualitative and quantitative descriptions of both normal and abnormal joints, as well as predicting outcomes of alterations to joints in orthopaedic practice and research. Contemporary efforts in modelling have focussed upon the major articulations of the lower limb. Well-constrained arthrokinematics can form the basis of manageable kinetic and dynamic mathematical predictions. In order to contain computation of shoulder complex modelling, glenohumeral joint representations in both limited and complete shoulder girdle models have undergone a generic simplification. As such, glenohumeral joint models are often based upon kinematic descriptions of inadequate degrees of freedom (DOF) for clinical purposes and applications. Qualitative descriptions of glenohumeral motion range from the parody of a hinge joint to the complex realism of a spatial joint. In developing a model, a clear idea of the intended application is required. Clinical applicability of a model requires both descriptive and predictive output potentials, and as such, a high level of validation is required. Without sufficient appreciation of the clinical intention of the arthrokinematic foundation to a model, error is all too easily introduced. Mathematical description of joint motion serves to quantify all relevant clinical parameters. Commonly, both the Euler angle and helical (screw) axis methods have been applied to the glenohumeral joint, although concordance between these methods and classical anatomical appreciation of joint motion is limited, resulting in miscommunication between clinician and engineer. Compounding these inconsistencies in motion quantification are gimbal lock and sequence dependency.
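    The gimbal-lock problem mentioned above can be demonstrated in a few lines. This is a generic intrinsic z-y-x Euler-angle construction, not the convention of any particular glenohumeral model: at 90 degrees of the middle rotation, the first and third angles act about the same axis, so only their difference matters and one rotational degree of freedom is lost.

    ```python
    import numpy as np

    def euler_zyx(yaw, pitch, roll):
        """Rotation matrix from intrinsic z-y-x Euler angles (radians)."""
        cz, sz = np.cos(yaw), np.sin(yaw)
        cy, sy = np.cos(pitch), np.sin(pitch)
        cx, sx = np.cos(roll), np.sin(roll)
        Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
        Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
        Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
        return Rz @ Ry @ Rx

    # At pitch = 90 degrees, yaw and roll act about the same axis: two distinct
    # angle triples with the same roll-minus-yaw produce the identical rotation.
    R1 = euler_zyx(0.3, np.pi / 2, 0.1)
    R2 = euler_zyx(0.4, np.pi / 2, 0.2)  # same roll-minus-yaw difference
    assert np.allclose(R1, R2)
    ```

    Away from the singular pitch, the same two triples give different rotations, which is why angle-sequence choice matters clinically.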

  18. The Role of Introductory Geosciences in Students' Quantitative Literacy

    NASA Astrophysics Data System (ADS)

    Wenner, J. M.; Manduca, C.; Baer, E. M.

    2006-12-01

    Quantitative literacy is more than mathematics; it is about reasoning with data. Colleges and universities have begun to recognize the distinction between mathematics and quantitative literacy, modifying curricula to reflect the need for numerate citizens. Although students may view geology as 'rocks for jocks', the geosciences are truthfully rife with data, making introductory geoscience topics excellent context for developing the quantitative literacy of students with diverse backgrounds. In addition, many news items that deal with quantitative skills, such as the global warming phenomenon, have their basis in the Earth sciences and can serve as timely examples of the importance of quantitative literacy for all students in introductory geology classrooms. Participants at a workshop held in 2006, 'Infusing Quantitative Literacy into Introductory Geoscience Courses,' discussed and explored the challenges and opportunities associated with the inclusion of quantitative material and brainstormed about effective practices for imparting quantitative literacy to students with diverse backgrounds. The tangible results of this workshop add to the growing collection of quantitative materials available through the DLESE- and NSF-supported Teaching Quantitative Skills in the Geosciences website, housed at SERC. There, faculty can find a collection of pages devoted to the successful incorporation of quantitative literacy in introductory geoscience. The resources on the website are designed to help faculty to increase their comfort with presenting quantitative ideas to students with diverse mathematical abilities. A methods section on "Teaching Quantitative Literacy" (http://serc.carleton.edu/quantskills/methods/quantlit/index.html) focuses on connecting quantitative concepts with geoscience context and provides tips, trouble-shooting advice and examples of quantitative activities. The goal in this section is to provide faculty with material that can be readily incorporated into existing introductory geoscience courses. In addition, participants at the workshop (http://serc.carleton.edu/quantskills/workshop06/index.html) submitted and modified more than 20 activities and model courses (with syllabi) designed to use best practices for helping introductory geoscience students to become quantitatively literate. We present insights from the workshop and other sources for a framework that can aid in increasing quantitative literacy of students from a variety of backgrounds in the introductory geoscience classroom.

  19. Representational Similarity Analysis – Connecting the Branches of Systems Neuroscience

    PubMed Central

    Kriegeskorte, Nikolaus; Mur, Marieke; Bandettini, Peter

    2008-01-01

    A fundamental challenge for systems neuroscience is to quantitatively relate its three major branches of research: brain-activity measurement, behavioral measurement, and computational modeling. Using measured brain-activity patterns to evaluate computational network models is complicated by the need to define the correspondency between the units of the model and the channels of the brain-activity data, e.g., single-cell recordings or voxels from functional magnetic resonance imaging (fMRI). Similar correspondency problems complicate relating activity patterns between different modalities of brain-activity measurement (e.g., fMRI and invasive or scalp electrophysiology), and between subjects and species. In order to bridge these divides, we suggest abstracting from the activity patterns themselves and computing representational dissimilarity matrices (RDMs), which characterize the information carried by a given representation in a brain or model. Building on a rich psychological and mathematical literature on similarity analysis, we propose a new experimental and data-analytical framework called representational similarity analysis (RSA), in which multi-channel measures of neural activity are quantitatively related to each other and to computational theory and behavior by comparing RDMs. We demonstrate RSA by relating representations of visual objects as measured with fMRI in early visual cortex and the fusiform face area to computational models spanning a wide range of complexities. The RDMs are simultaneously related via second-level application of multidimensional scaling and tested using randomization and bootstrap techniques. We discuss the broad potential of RSA, including novel approaches to experimental design, and argue that these ideas, which have deep roots in psychology and neuroscience, will allow the integrated quantitative analysis of data from all three branches, thus contributing to a more unified systems neuroscience. PMID:19104670
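    The core computation of RSA, characterizing each representation by its representational dissimilarity matrix and then comparing RDMs rather than raw activity patterns, can be sketched with synthetic data. The sizes, seed, and noise level below are arbitrary illustrative choices.

    ```python
    import numpy as np
    from scipy.spatial.distance import pdist
    from scipy.stats import spearmanr

    rng = np.random.default_rng(1)
    n_stimuli, n_voxels, n_units = 12, 100, 40

    # Synthetic stand-ins: measured activity patterns (e.g. voxels) and model
    # unit activations for the same stimuli; note the channel counts differ.
    brain = rng.normal(size=(n_stimuli, n_voxels))
    model = brain[:, :n_units] + 0.5 * rng.normal(size=(n_stimuli, n_units))

    # RDM: pairwise correlation distance between stimulus-evoked patterns.
    # No unit-to-voxel correspondence is needed once patterns become RDMs.
    rdm_brain = pdist(brain, metric="correlation")
    rdm_model = pdist(model, metric="correlation")

    # Relate the two representations by rank-correlating their RDM entries.
    rho, p = spearmanr(rdm_brain, rdm_model)
    ```

    Because both RDMs index the same stimulus pairs, the comparison sidesteps the correspondence problems between measurement modalities that the abstract describes.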

  20. Identifying and modeling the structural discontinuities of human interactions

    NASA Astrophysics Data System (ADS)

    Grauwin, Sebastian; Szell, Michael; Sobolevsky, Stanislav; Hövel, Philipp; Simini, Filippo; Vanhoof, Maarten; Smoreda, Zbigniew; Barabási, Albert-László; Ratti, Carlo

    2017-04-01

    The idea of a hierarchical spatial organization of society lies at the core of seminal theories in human geography that have strongly influenced our understanding of social organization. Along the same line, the recent availability of large-scale human mobility and communication data has offered novel quantitative insights hinting at a strong geographical confinement of human interactions within neighboring regions, extending to local levels within countries. However, models of human interaction largely ignore this effect. Here, we analyze several country-wide networks of telephone calls - both mobile and landline - and in either case uncover a systematic decrease of communication induced by borders which we identify as the missing variable in state-of-the-art models. Using this empirical evidence, we propose an alternative modeling framework that naturally stylizes the damping effect of borders. We show that this new notion substantially improves the predictive power of widely used interaction models. This increases our ability to understand, model and predict social activities and to plan the development of infrastructures across multiple scales.

  1. Identifying and modeling the structural discontinuities of human interactions

    PubMed Central

    Grauwin, Sebastian; Szell, Michael; Sobolevsky, Stanislav; Hövel, Philipp; Simini, Filippo; Vanhoof, Maarten; Smoreda, Zbigniew; Barabási, Albert-László; Ratti, Carlo

    2017-01-01

    The idea of a hierarchical spatial organization of society lies at the core of seminal theories in human geography that have strongly influenced our understanding of social organization. Along the same line, the recent availability of large-scale human mobility and communication data has offered novel quantitative insights hinting at a strong geographical confinement of human interactions within neighboring regions, extending to local levels within countries. However, models of human interaction largely ignore this effect. Here, we analyze several country-wide networks of telephone calls - both mobile and landline - and in either case uncover a systematic decrease of communication induced by borders which we identify as the missing variable in state-of-the-art models. Using this empirical evidence, we propose an alternative modeling framework that naturally stylizes the damping effect of borders. We show that this new notion substantially improves the predictive power of widely used interaction models. This increases our ability to understand, model and predict social activities and to plan the development of infrastructures across multiple scales. PMID:28443647

  2. Identifying and modeling the structural discontinuities of human interactions.

    PubMed

    Grauwin, Sebastian; Szell, Michael; Sobolevsky, Stanislav; Hövel, Philipp; Simini, Filippo; Vanhoof, Maarten; Smoreda, Zbigniew; Barabási, Albert-László; Ratti, Carlo

    2017-04-26

    The idea of a hierarchical spatial organization of society lies at the core of seminal theories in human geography that have strongly influenced our understanding of social organization. Along the same line, the recent availability of large-scale human mobility and communication data has offered novel quantitative insights hinting at a strong geographical confinement of human interactions within neighboring regions, extending to local levels within countries. However, models of human interaction largely ignore this effect. Here, we analyze several country-wide networks of telephone calls - both mobile and landline - and in either case uncover a systematic decrease of communication induced by borders which we identify as the missing variable in state-of-the-art models. Using this empirical evidence, we propose an alternative modeling framework that naturally stylizes the damping effect of borders. We show that this new notion substantially improves the predictive power of widely used interaction models. This increases our ability to understand, model and predict social activities and to plan the development of infrastructures across multiple scales.
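    The modeling idea in this record, a gravity-style interaction law multiplied by a damping factor whenever the two locations fall on opposite sides of a border, can be sketched as follows. The power-law form, decay exponent, and damping value are illustrative assumptions, not the paper's fitted parameters.

    ```python
    # Gravity-style interaction with multiplicative border damping; all
    # constants here are hypothetical placeholders for fitted values.
    def interaction(pop_i, pop_j, distance_km, same_region, damping=0.7, decay=1.5):
        """Expected communication intensity between two locations."""
        g = pop_i * pop_j / distance_km ** decay
        return g if same_region else damping * g

    within = interaction(50_000, 80_000, 30.0, same_region=True)
    across = interaction(50_000, 80_000, 30.0, same_region=False)
    ```

    At equal population and distance, the cross-border pair interacts strictly less, which is the "missing variable" the abstract identifies in standard gravity models.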

  3. A Method for Quantifying, Visualising, and Analysing Gastropod Shell Form

    PubMed Central

    Liew, Thor-Seng; Schilthuizen, Menno

    2016-01-01

    Quantitative analysis of organismal form is an important component for almost every branch of biology. Although generally considered an easily-measurable structure, the quantification of gastropod shell form is still a challenge because many shells lack homologous structures and have a spiral form that is difficult to capture with linear measurements. In view of this, we adopt the idea of theoretical modelling of shell form, in which the shell form is the product of aperture ontogeny profiles in terms of aperture growth trajectory that is quantified as curvature and torsion, and of aperture form that is represented by size and shape. We develop a workflow for the analysis of shell forms based on the aperture ontogeny profile, starting from the procedure of data preparation (retopologising the shell model), via data acquisition (calculation of aperture growth trajectory, aperture form and ontogeny axis), and data presentation (qualitative comparison between shell forms) and ending with data analysis (quantitative comparison between shell forms). We evaluate our methods on representative shells of the genera Opisthostoma and Plectostoma, which exhibit great variability in shell form. The outcome suggests that our method is a robust, reproducible, and versatile approach for the analysis of shell form. Finally, we propose several potential applications of our methods in functional morphology, theoretical modelling, taxonomy, and evolutionary biology. PMID:27280463
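    One building block of the described workflow, quantifying the aperture growth trajectory by its curvature and torsion, can be sketched numerically. The helix below is a hypothetical stand-in for a shell's growth trajectory; for a helix with radius `a` and pitch parameter `b`, the analytic values are kappa = a/(a^2+b^2) and tau = b/(a^2+b^2), which the discrete estimate should approach away from the curve's ends.

    ```python
    import numpy as np

    def curvature_torsion(r):
        """Discrete curvature and torsion of a 3D curve sampled as an (n, 3) array."""
        d1 = np.gradient(r, axis=0)
        d2 = np.gradient(d1, axis=0)
        d3 = np.gradient(d2, axis=0)
        cross = np.cross(d1, d2)                 # r' x r'' at each sample
        speed = np.linalg.norm(d1, axis=1)
        kappa = np.linalg.norm(cross, axis=1) / speed**3
        tau = np.einsum("ij,ij->i", cross, d3) / np.einsum("ij,ij->i", cross, cross)
        return kappa, tau

    # Idealized helical trajectory with radius a and pitch parameter b.
    a, b = 1.0, 0.3
    t = np.linspace(0.0, 6.0 * np.pi, 2000)
    helix = np.column_stack([a * np.cos(t), a * np.sin(t), b * t])
    kappa, tau = curvature_torsion(helix)
    ```

    Both formulas are parameterization-invariant, so the curve need not be sampled at unit speed, which matters for real retopologised shell models with uneven sampling.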

  4. REVIEWS OF TOPICAL PROBLEMS: Nonlinear dynamics of the brain: emotion and cognition

    NASA Astrophysics Data System (ADS)

    Rabinovich, Mikhail I.; Muezzinoglu, M. K.

    2010-07-01

    Experimental investigations of neural system functioning and brain activity are standardly based on the assumption that perceptions, emotions, and cognitive functions can be understood by analyzing steady-state neural processes and static tomographic snapshots. The new approaches discussed in this review are based on the analysis of transient processes and metastable states. Transient dynamics is characterized by two basic properties, structural stability and information sensitivity. The ideas and methods that we discuss provide an explanation for the occurrence of and successive transitions between metastable states observed in experiments, and offer new approaches to behavior analysis. Models of the emotional and cognitive functions of the brain are suggested. The mathematical object that represents the observed transient brain processes in the phase space of the model is a structurally stable heteroclinic channel. The possibility of using the suggested models to construct a quantitative theory of some emotional and cognitive functions is illustrated.

  5. Retrograde spins of near-Earth asteroids from the Yarkovsky effect.

    PubMed

    La Spina, A; Paolicchi, P; Kryszczyńska, A; Pravec, P

    2004-03-25

    Dynamical resonances in the asteroid belt are the gateway for the production of near-Earth asteroids (NEAs). To generate the observed number of NEAs, however, requires the injection of many asteroids into those resonant regions. Collisional processes have long been claimed as a possible source, but difficulties with that idea have led to the suggestion that orbital drift arising from the Yarkovsky effect dominates the injection process. (The Yarkovsky effect is a force arising from differential heating: the 'afternoon' side of an asteroid is warmer than the 'morning' side.) The two models predict different rotational properties of NEAs: the usual collisional theories are consistent with a nearly isotropic distribution of rotation vectors, whereas the 'Yarkovsky model' predicts an excess of retrograde rotations. Here we report that the spin vectors of NEAs show a strong and statistically significant excess of retrograde rotations, quantitatively consistent with the theoretical expectations of the Yarkovsky model.

  6. Image-based characterization of thrombus formation in time-lapse DIC microscopy

    PubMed Central

    Brieu, Nicolas; Navab, Nassir; Serbanovic-Canic, Jovana; Ouwehand, Willem H.; Stemple, Derek L.; Cvejic, Ana; Groher, Martin

    2012-01-01

    The characterization of thrombus formation in time-lapse DIC microscopy is of increased interest for identifying genes which account for atherothrombosis and coronary artery diseases (CADs). In particular, we are interested in large-scale studies on zebrafish, which result in a large amount of data, and require automatic processing. In this work, we present an image-based solution for the automated extraction of parameters quantifying the temporal development of thrombotic plugs. Our system is based on the joint segmentation of thrombotic and aortic regions over time. This task is made difficult by the low contrast and the highly dynamic conditions observed in in vivo DIC microscopy. Our key idea is to perform this segmentation by distinguishing the different motion patterns in image time series rather than by solving standard image segmentation tasks in each image frame. Thus, we are able to compensate for the poor imaging conditions. We model motion patterns by energies based on the idea of dynamic textures, and regularize the model by two prior energies on the shape of the aortic region and on the topological relationship between the thrombus and the aorta. We demonstrate the performance of our segmentation algorithm by qualitative and quantitative experiments on synthetic examples as well as on real in vivo microscopic sequences. PMID:22482997

  7. Mullins effect in a filled elastomer under uniaxial tension

    DOE PAGES

    Maiti, A.; Small, W.; Gee, R. H.; ...

    2014-01-16

    Modulus softening and permanent set in filled polymeric materials due to cyclic loading and unloading, commonly known as the Mullins effect, can have a significant impact on their use as support cushions. The quantitative analysis of such behavior is essential to ensure the effectiveness of such materials in long-term deployment. In this work we combine existing ideas of filler-induced modulus enhancement, strain amplification, and irreversible deformation within a simple non-Gaussian constitutive model to quantitatively interpret recent measurements on a relevant PDMS-based elastomeric cushion. We also find that the experimental stress-strain data are consistent with the picture that during stretching (loading) two effects take place simultaneously: (1) the physical constraints (entanglements) initially present in the polymer network get disentangled, thus leading to a gradual decrease in the effective cross-link density, and (2) the effective filler volume fraction gradually decreases with increasing strain due to the irreversible pulling out of an initially occluded volume of the soft polymer domain.

  8. Developing a model for effective leadership in healthcare: a concept mapping approach.

    PubMed

    Hargett, Charles William; Doty, Joseph P; Hauck, Jennifer N; Webb, Allison Mb; Cook, Steven H; Tsipis, Nicholas E; Neumann, Julie A; Andolsek, Kathryn M; Taylor, Dean C

    2017-01-01

    Despite increasing awareness of the importance of leadership in healthcare, our understanding of the competencies of effective leadership remains limited. We used a concept mapping approach (a blend of qualitative and quantitative analysis of group processes to produce a visual composite of the group's ideas) to identify stakeholders' mental model of effective healthcare leadership, clarifying the underlying structure and importance of leadership competencies. Literature review, focus groups, and consensus meetings were used to derive a representative set of healthcare leadership competency statements. Study participants subsequently sorted and rank-ordered these statements based on their perceived importance in contributing to effective healthcare leadership in real-world settings. Hierarchical cluster analysis of individual sortings was used to develop a coherent model of effective leadership in healthcare. A diverse group of 92 faculty and trainees individually rank-sorted 33 leadership competency statements. The highest rated statements were "Acting with Personal Integrity", "Communicating Effectively", "Acting with Professional Ethical Values", "Pursuing Excellence", "Building and Maintaining Relationships", and "Thinking Critically". Combining the results from hierarchical cluster analysis with our qualitative data led to a healthcare leadership model based on the core principle of Patient Centeredness and the core competencies of Integrity, Teamwork, Critical Thinking, Emotional Intelligence, and Selfless Service. Using a mixed qualitative-quantitative approach, we developed a graphical representation of a shared leadership model derived in the healthcare setting. This model may enhance learning, teaching, and patient care in this important area, as well as guide future research.
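    The sorting-to-clusters step of concept mapping can be sketched with toy data: each participant's pile assignments yield a co-occurrence similarity matrix, which is then hierarchically clustered. The six statements and three sortings below are invented for illustration; the study used 33 statements and 92 participants.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from scipy.spatial.distance import squareform

    # Invented sortings: three participants each assign six statements (A-F)
    # to piles; a shared pile label means "sorted together".
    sortings = [
        [0, 0, 1, 1, 2, 2],
        [0, 0, 1, 1, 2, 2],
        [0, 1, 1, 1, 2, 2],  # participant 3 groups statement B with C and D
    ]
    n = 6
    co = np.zeros((n, n))
    for piles in sortings:
        for i in range(n):
            for j in range(n):
                co[i, j] += piles[i] == piles[j]

    # Dissimilarity = number of participants who did NOT sort the pair together.
    dist = squareform(len(sortings) - co, checks=False)
    clusters = fcluster(linkage(dist, method="average"), t=3, criterion="maxclust")
    ```

    Cutting the dendrogram at a chosen number of clusters recovers the statement groupings that, combined with importance ratings, form the concept map.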

  9. Communication and effectiveness in a US nursing home quality-improvement collaborative.

    PubMed

    Arling, Priscilla A; Abrahamson, Kathleen; Miech, Edward J; Inui, Thomas S; Arling, Greg

    2014-09-01

    In this study, we explored the relationship between changes in resident health outcomes, practitioner communication patterns, and practitioner perceptions of group effectiveness within a quality-improvement collaborative of nursing home clinicians. Survey and interview data were collected from nursing home clinicians participating in a quality-improvement collaborative. Quality-improvement outcomes were evaluated using US Federal and State minimum dataset measures. Models were specified evaluating the relationships between resident outcomes, staff perceptions of communication patterns, and staff perceptions of collaborative effectiveness. Interview data provided deeper understanding of the quantitative findings. Reductions in fall rates were highest in facilities where respondents experienced the highest levels of communication with collaborative members outside of scheduled meetings, and where respondents perceived that the collaborative kept them informed and provided new ideas. Clinicians observed that participation in a quality-improvement collaborative positively influenced the ability to share innovative ideas and expand the quality-improvement program within their nursing home. For practitioners, a high level of communication, both inside and outside of meetings, was key to making measurable gains in resident health outcomes. © 2013 Wiley Publishing Asia Pty Ltd.

  10. Using qualitative methods to develop a contextually tailored instrument: Lessons learned.

    PubMed

    Lee, Haeok; Kiang, Peter; Kim, Minjin; Semino-Asaro, Semira; Colten, Mary Ellen; Tang, Shirley S; Chea, Phala; Peou, Sonith; Grigg-Saito, Dorcas C

    2015-01-01

    To develop a population-specific instrument to inform hepatitis B virus (HBV) and human papilloma virus (HPV) prevention education and intervention based on data and evidence obtained from the targeted population of Khmer mothers reflecting their socio-cultural and health behaviors. The principles of community-based participatory research (CBPR) guided the development of a standardized survey interview. Four stages of development and testing of the survey instrument took place in order to inform the quantitative health survey used to collect data in stage five of the project. This article reports only on Stages 1-4. This process created a new quantitative measure of HBV and HPV prevention behavior based on the revised Network Episode Model and informed by the targeted population. The CBPR method facilitated the application and translation of abstract theoretical ideas of HBV and HPV prevention behavior into culturally-relevant words and expressions of Cambodian Americans (CAs). The design of an instrument development process that accounts for distinctive socio-cultural backgrounds of CA refugee/immigrant women provides a model for use in developing future health surveys that are intended to aid minority-serving health care professionals and researchers as well as targeted minority populations.

  11. Using qualitative methods to develop a contextually tailored instrument: Lessons learned

    PubMed Central

    Lee, Haeok; Kiang, Peter; Kim, Minjin; Semino-Asaro, Semira; Colten, Mary Ellen; Tang, Shirley S.; Chea, Phala; Peou, Sonith; Grigg-Saito, Dorcas C.

    2015-01-01

    Objective: To develop a population-specific instrument to inform hepatitis B virus (HBV) and human papilloma virus (HPV) prevention education and intervention based on data and evidence obtained from the targeted population of Khmer mothers reflecting their socio-cultural and health behaviors. Methods: The principles of community-based participatory research (CBPR) guided the development of a standardized survey interview. Four stages of development and testing of the survey instrument took place in order to inform the quantitative health survey used to collect data in stage five of the project. This article reports only on Stages 1-4. Results: This process created a new quantitative measure of HBV and HPV prevention behavior based on the revised Network Episode Model and informed by the targeted population. The CBPR method facilitated the application and translation of abstract theoretical ideas of HBV and HPV prevention behavior into culturally-relevant words and expressions of Cambodian Americans (CAs). Conclusions: The design of an instrument development process that accounts for distinctive socio-cultural backgrounds of CA refugee/immigrant women provides a model for use in developing future health surveys that are intended to aid minority-serving health care professionals and researchers as well as targeted minority populations. PMID:27981114

  12. Contraction and stress-dependent growth shape the forebrain of the early chicken embryo.

    PubMed

    Garcia, Kara E; Okamoto, Ruth J; Bayly, Philip V; Taber, Larry A

    2017-01-01

    During early vertebrate development, local constrictions, or sulci, form to divide the forebrain into the diencephalon, telencephalon, and optic vesicles. These partitions are maintained and exaggerated as the brain tube inflates, grows, and bends. Combining quantitative experiments on chick embryos with computational modeling, we investigated the biophysical mechanisms that drive these changes in brain shape. Chemical perturbations of contractility indicated that actomyosin contraction plays a major role in the creation of initial constrictions (Hamburger-Hamilton stages HH11-12), and fluorescent staining revealed that F-actin is circumferentially aligned at all constrictions. A finite element model based on these findings shows that the observed shape changes are consistent with circumferential contraction in these regions. To explain why sulci continue to deepen as the forebrain expands (HH12-20), we speculate that growth depends on wall stress. This idea was examined by including stress-dependent growth in a model with cerebrospinal fluid pressure and bending (cephalic flexure). The results given by the model agree with observed morphological changes that occur in the brain tube under normal and reduced eCSF pressure, quantitative measurements of relative sulcal depth versus time, and previously published patterns of cell proliferation. Taken together, our results support a biphasic mechanism for forebrain morphogenesis consisting of differential contractility (early) and stress-dependent growth (late). Copyright © 2016 Elsevier Ltd. All rights reserved.
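    The stress-dependent growth hypothesis above can be captured by a growth law of the form dg/dt = k (sigma/sigma0 - 1) g: tissue under above-baseline wall stress grows, tissue below baseline resorbs. The constants below are hypothetical, chosen only to show the qualitative behavior, not parameters of the authors' finite element model.

    ```python
    # Forward-Euler integration of a hypothetical stress-dependent growth law.
    def grow(stress, sigma0=1.0, k=0.05, steps=100, dt=0.1):
        g = 1.0  # growth stretch ratio (1.0 = no net growth)
        for _ in range(steps):
            g += dt * k * (stress / sigma0 - 1.0) * g
        return g

    g_stretched = grow(stress=1.5)  # e.g. wall segment stretched by eCSF pressure
    g_shielded = grow(stress=0.8)   # e.g. stress-shielded tissue within a sulcus
    ```

    Differential growth of this kind deepens constrictions as the surrounding wall expands, matching the biphasic picture of early contraction followed by stress-modulated growth.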

  13. QSAR DataBank - an approach for the digital organization and archiving of QSAR model information

    PubMed Central

    2014-01-01

    Background Research efforts in the field of descriptive and predictive Quantitative Structure-Activity Relationships or Quantitative Structure–Property Relationships produce around one thousand scientific publications annually. All the materials and results are mainly communicated using printed media. Printed media in their present form have obvious limitations when it comes to effectively representing mathematical models, including complex and non-linear ones, and large bodies of associated numerical chemical data. They do not support secondary information extraction or reuse efforts, while in silico studies pose additional requirements for accessibility, transparency and reproducibility of the research. This gap can and should be bridged by introducing domain-specific digital data exchange standards and tools. The current publication presents a formal specification of the quantitative structure-activity relationship data organization and archival format called the QSAR DataBank (QsarDB for shorter, or QDB for shortest). Results The article describes QsarDB data schema, which formalizes QSAR concepts (objects and relationships between them) and QsarDB data format, which formalizes their presentation for computer systems. The utility and benefits of QsarDB have been thoroughly tested by solving everyday QSAR and predictive modeling problems, with examples in the field of predictive toxicology, and can be applied for a wide variety of other endpoints. The work is accompanied with open source reference implementation and tools. Conclusions The proposed open data, open source, and open standards design is open to public and proprietary extensions on many levels. Selected use cases exemplify the benefits of the proposed QsarDB data format. General ideas for future development are discussed. PMID:24910716

  14. Model-Unified Planning and Execution for Distributed Autonomous System Control

    NASA Technical Reports Server (NTRS)

    Aschwanden, Pascal; Baskaran, Vijay; Bernardini, Sara; Fry, Chuck; Moreno, Maria; Muscettola, Nicola; Plaunt, Chris; Rijsman, David; Tompkins, Paul

    2006-01-01

    The Intelligent Distributed Execution Architecture (IDEA) is a real-time architecture that exploits artificial intelligence planning as the core reasoning engine for interacting autonomous agents. Rather than enforcing separate deliberation and execution layers, IDEA unifies them under a single planning technology. Deliberative and reactive planners reason about and act according to a single representation of the past, present and future domain state. The domain state evolves according to the rules dictated by a declarative model of the subsystem to be controlled, internal processes of the IDEA controller, and interactions with other agents. We present IDEA concepts - modeling, the IDEA core architecture, the unification of deliberation and reaction under planning - and illustrate its use in a simple example. Finally, we present several real-world applications of IDEA, and compare IDEA to other high-level control approaches.

  15. Social in, social out: How the brain responds to social language with more social language.

    PubMed

    O'Donnell, Matthew Brook; Falk, Emily B; Lieberman, Matthew D

    Social connection is a fundamental human need. As such, people's brains are sensitized to social cues, such as those carried by language, and to promoting social communication. The neural mechanisms of certain key building blocks in this process, such as receptivity to and reproduction of social language, however, are not known. We combined quantitative linguistic analysis and neuroimaging to connect neural activity in brain regions used to simulate the mental states of others with exposure to, and re-transmission of, social language. Our results link findings on successful idea transmission from communication science, sociolinguistics and cognitive neuroscience to prospectively predict the degree of social language that participants utilize when re-transmitting ideas as a function of 1) initial language inputs and 2) neural activity during idea exposure.

  16. Qualitative and quantitative reasoning about thermodynamics

    NASA Technical Reports Server (NTRS)

    Skorstad, Gordon; Forbus, Ken

    1989-01-01

    One goal of qualitative physics is to capture the tacit knowledge of engineers and scientists. It is shown how Qualitative Process theory can be used to express concepts of engineering thermodynamics. In particular, it is shown how to integrate qualitative and quantitative knowledge to solve textbook problems involving thermodynamic cycles, such as gas turbine plants and steam power plants. These ideas were implemented in a program called SCHISM. Its analysis of a sample textbook problem is described and plans for future work are discussed.

  17. Quantitative Aspects of Cyclosis in Plant Cells.

    ERIC Educational Resources Information Center

    Howells, K. F.; Fell, D. A.

    1979-01-01

    Describes an exercise which is currently used in a course in cell physiology at Oxford Polytechnic in England. This exercise can give students some idea of the molecular events involved in bringing about movement of chloroplasts (and other organelles) in plant cells. (HM)

  18. EVALUATING DISCONTINUITIES IN COMPLEX SYSTEMS: TOWARD QUANTITATIVE MEASURE OF RESILIENCE

    EPA Science Inventory

    The textural discontinuity hypothesis (TDH) is based on the observation that animal body mass distributions exhibit discontinuities that may reflect the texture of the landscape available for exploitation. This idea has been extended to other complex systems, hinting that the ide...

  19. Evaluation: Review of the Past, Preview of the Future.

    ERIC Educational Resources Information Center

    Smith, M. F.

    1994-01-01

    This paper summarized contributors' ideas about evaluation as a field and where it is going. Topics discussed were the qualitative versus quantitative debate; evaluation's purpose; professionalization; program failure; program development; evaluators as advocates; evaluation knowledge; evaluation expansion; and methodology and design. (SLD)

  20. The Quantitative Preparation of Future Geoscience Graduate Students

    NASA Astrophysics Data System (ADS)

    Manduca, C. A.; Hancock, G. S.

    2006-12-01

    Modern geoscience is a highly quantitative science. In February, a small group of faculty and graduate students from across the country met to discuss the quantitative preparation of geoscience majors for graduate school. The group included ten faculty supervising graduate students in quantitative areas spanning the earth, atmosphere, and ocean sciences; five current graduate students in these areas; and five faculty teaching undergraduate students in the spectrum of institutions preparing students for graduate work. Discussion focused on four key areas: Are incoming graduate students adequately prepared for the quantitative aspects of graduate geoscience programs? What are the essential quantitative skills required for success in graduate school? Which courses are perceived as important in preparing students for the quantitative aspects of graduate school? What programs/resources would be valuable in helping faculty/departments improve the quantitative preparation of students? The participants concluded that strengthening the quantitative preparation of undergraduate geoscience majors would increase their opportunities in graduate school. While specifics differed amongst disciplines, special importance was placed on developing the ability to use quantitative skills to solve geoscience problems. This requires the ability to pose problems so they can be addressed quantitatively, understand the relationship between quantitative concepts and physical representations, visualize mathematics, test the reasonableness of quantitative results, creatively move forward from existing models/techniques/approaches, and move between quantitative and verbal descriptions. A list of important quantitative competencies desirable in incoming graduate students includes mechanical skills in basic mathematics, functions, multi-variate analysis, statistics and calculus, as well as skills in logical analysis and the ability to learn independently in quantitative ways.
Calculus, calculus-based physics, chemistry, statistics, programming and linear algebra were viewed as important course preparation for a successful graduate experience. A set of recommendations for departments and for new community resources includes ideas for infusing quantitative reasoning throughout the undergraduate experience and mechanisms for learning from successful experiments in both geoscience and mathematics. A full list of participants, summaries of the meeting discussion and recommendations are available at http://serc.carleton.edu/quantskills/winter06/index.html. These documents, crafted by a small but diverse group, can serve as a starting point for broader community discussion of the quantitative preparation of future geoscience graduate students.

  1. Non-linear mixed effects modeling - from methodology and software development to driving implementation in drug development science.

    PubMed

    Pillai, Goonaseelan Colin; Mentré, France; Steimer, Jean-Louis

    2005-04-01

    Few scientific contributions have made significant impact unless there was a champion who had the vision to see the potential for its use in seemingly disparate areas-and who then drove active implementation. In this paper, we present a historical summary of the development of non-linear mixed effects (NLME) modeling up to the more recent extensions of this statistical methodology. The paper places strong emphasis on the pivotal role played by Lewis B. Sheiner (1940-2004), who used this statistical methodology to elucidate solutions to real problems identified in clinical practice and in medical research and on how he drove implementation of the proposed solutions. A succinct overview of the evolution of the NLME modeling methodology is presented as well as ideas on how its expansion helped to provide guidance for a more scientific view of (model-based) drug development that reduces empiricism in favor of critical quantitative thinking and decision making.

  2. Rethinking the learning of belief network probabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Musick, R.

    Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.
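    The baseline that this work improves upon, learning each conditional probability table directly by counting, can be sketched as follows. This is a generic maximum-likelihood estimate with Laplace smoothing, not the OSTI report's method, and the toy car-insurance-style variables are invented for illustration:

```python
from collections import Counter

def learn_cpt(samples, child, parents, smoothing=1.0):
    """Estimate P(child | parents) from data by smoothed counting.

    samples: list of dicts mapping variable name -> value.
    Returns a dict {(parent_values, child_value): probability}.
    """
    joint = Counter()      # counts of (parent config, child value)
    marginal = Counter()   # counts of each parent config
    child_values = set()
    for s in samples:
        pv = tuple(s[p] for p in parents)
        joint[(pv, s[child])] += 1
        marginal[pv] += 1
        child_values.add(s[child])
    k = len(child_values)  # Laplace smoothing spreads mass over k outcomes
    return {(pv, cv): (joint[(pv, cv)] + smoothing) / (marginal[pv] + smoothing * k)
            for pv in marginal for cv in child_values}

# Hypothetical car-insurance-style data: 3 young drivers with accidents,
# 7 without; 1 older driver with an accident, 9 without.
data = [{"young": y, "accident": a}
        for y, a, n in [(1, 1, 3), (1, 0, 7), (0, 1, 1), (0, 0, 9)]
        for _ in range(n)]
cpt = learn_cpt(data, "accident", ["young"])
# P(accident=1 | young=1) = (3 + 1) / (10 + 2)
```

    The paper's point is that this per-row counting can be replaced by any learner (e.g. a neural network regressing the conditional probabilities), which can generalize across sparse parent configurations instead of estimating each one independently.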

  3. Modeling and predicting abstract concept or idea introduction and propagation through geopolitical groups

    NASA Astrophysics Data System (ADS)

    Jaenisch, Holger M.; Handley, James W.; Hicklen, Michael L.

    2007-04-01

    This paper describes a novel capability for modeling known idea propagation transformations and predicting responses to new ideas from geopolitical groups. Ideas are captured using semantic words that are text based and bear cognitive definitions. We demonstrate a unique algorithm for converting these into analytical predictive equations. Using the illustrative idea of "proposing a gasoline price increase of $1 per gallon from $2" and its changing perceived impact throughout 5 demographic groups, we identify 13 cost of living Diplomatic, Information, Military, and Economic (DIME) features common across all 5 demographic groups. This enables the modeling and monitoring of Political, Military, Economic, Social, Information, and Infrastructure (PMESII) effects of each group to this idea and how their "perception" of this proposal changes. Our algorithm and results are summarized in this paper.

  4. Mathematization in introductory physics

    NASA Astrophysics Data System (ADS)

    Brahmia, Suzanne M.

    Mathematization is central to STEM disciplines as a cornerstone of the quantitative reasoning that characterizes these fields. Introductory physics is required for most STEM majors in part so that students develop expert-like mathematization. This dissertation describes coordinated research and curriculum development for strengthening mathematization in introductory physics; it blends scholarship in physics and mathematics education in the form of three papers. The first paper explores mathematization in the context of physics, and makes an original contribution to the measurement of physics students' struggle to mathematize. Instructors naturally assume students have a conceptual mastery of algebra before embarking on a college physics course because these students are enrolled in math courses beyond algebra. This paper provides evidence that refutes the validity of this assumption and categorizes some of the barriers students commonly encounter with quantification and representing ideas symbolically. The second paper develops a model of instruction that can help students progress from their starting points to their instructor's desired endpoints. Instructors recognize that the introductory physics course introduces new ideas at an astonishing rate. More than most physicists realize, however, the way that mathematics is used in the course is foreign to a large portion of class. This paper puts forth an instructional model that can move all students toward better quantitative and physical reasoning, despite the substantial variability of those students' initial states. The third paper describes the design and testing of curricular materials that foster mathematical creativity to prepare students to better understand physics reasoning. 
Few students enter introductory physics with experience generating equations in response to specific challenges involving unfamiliar quantities and units, yet this generative use of mathematics is typical of the thinking involved in doing physics. It contrasts with their more common experience with mathematics as the practice of specified procedures to improve efficiency. This paper describes new curricular materials, based on invention instruction, that provide students with opportunities to generate mathematical relationships in physics, and it presents preliminary evidence of the effectiveness of this method with mathematically underprepared engineering students.

  5. SCREENING LIFE CYCLE ASSESSMENT OF GASOLINE ADDITIVES

    EPA Science Inventory

    The EPA's ORD is conducting a screening of Life Cycle Assessment (LCA) of selected automotive fuel (i.e., gasoline) systems. Although no specific guidelines exist on how to conduct such a streamlined approach, the basic idea is to use a mix of qualitative and quantitative generi...

  6. Promoting Multicultural Awareness through Electronic Communication

    ERIC Educational Resources Information Center

    Huang, Hui-Ju

    2006-01-01

    This project utilized computer technology to establish an email discussion forum for communication and learning in which students shared information, ideas, and processes of learning multicultural education. This paper presents the quantitative count of email messages and qualitative analysis of students' perceptions of email discussions. It then…

  7. Manipulating Models and Grasping the Ideas They Represent

    NASA Astrophysics Data System (ADS)

    Bryce, T. G. K.; Blown, E. J.

    2016-03-01

    This article notes the convergence of recent thinking in neuroscience and grounded cognition regarding the way we understand mental representation and recollection: ideas are dynamic and multi-modal, actively created at the point of recall. Also, neurophysiologically, re-entrant signalling among cortical circuits allows non-conscious processing to support our deliberative thoughts and actions. The qualitative research we describe examines the exchanges occurring during semi-structured interviews with 360 children aged 3-13, including 294 from New Zealand (158 boys, 136 girls) and 66 from China (34 boys, 32 girls), concerning their understanding of the shape and motion of the Earth, Sun and Moon (ESM). We look closely at the relationships between what is revealed as children manipulate their own play-dough models and their apparent understandings of ESM concepts. In particular, we focus on the switching taking place between what is said, what is drawn and what is modelled. The evidence is supportive of Edelman's view that memory is non-representational and that concepts are the outcome of perceptual mappings, a view which is also in accord with Barsalou's notion that concepts are simulators or skills which operate consistently across several modalities. Quantitative data indicate that the dynamic structure of memory/concept creation is similar in both genders and common to the cultures/ethnicities compared (New Zealand European and Māori; Chinese Han) and that repeated interviews in this longitudinal research lead to more advanced modelling skills and/or more advanced shape and motion concepts, the results supporting hypotheses (Kolmogorov-Smirnov alpha levels .05; r_s: p < .001).

  8. Use of laser 3D surface digitizer in data collection and 3D modeling of anatomical structures

    NASA Astrophysics Data System (ADS)

    Tse, Kelly; Van Der Wall, Hans; Vu, Dzung H.

    2006-02-01

    A laser digitizer (Konica-Minolta Vivid 910) is used to obtain 3-dimensional surface scans of anatomical structures with a maximum resolution of 0.1 mm. Placing the specimen on a turntable allows multiple scans all around, because the scanner only captures data from the portion facing its lens. A computer model is generated using 3D modeling software such as Geomagic. The 3D model can be manipulated on screen for repeated analysis of anatomical features, a useful capability when the specimens are rare or inaccessible (museum collections, fossils, imprints in rock formations). As accurate measurements can be performed on the computer model, instead of only on the actual specimens at, for example, an archaeological excavation site, a variety of quantitative data can be obtained later on the computer model in the laboratory as new ideas come to mind. Our group had used a mechanical contact digitizer (Microscribe) for this purpose, but with the surface digitizer, we have been obtaining data sets more accurately and more quickly.

  9. Age and size at maturity: a quantitative review of diet-induced reaction norms in insects.

    PubMed

    Teder, Tiit; Vellau, Helen; Tammaru, Toomas

    2014-11-01

    Optimality models predict that diet-induced bivariate reaction norms for age and size at maturity can have diverse shapes, with the slope varying from negative to positive. To evaluate these predictions, we perform a quantitative review of relevant data, using a literature-derived database of body sizes and development times for over 200 insect species. We show that bivariate reaction norms with a negative slope prevail in nearly all taxonomic and ecological categories of insects as well as in some other ectotherm taxa with comparable life histories (arachnids and amphibians). In insects, positive slopes are largely limited to species that feed on discrete resource items, parasitoids in particular. By contrast, with virtually no meaningful exceptions, herbivorous and predatory insects display reaction norms with a negative slope. This is consistent with the idea that predictable resource depletion, a scenario selecting for positively sloped reaction norms, is not frequent for these insects. Another source of such selection, a positive correlation between resource levels and juvenile mortality rates, should similarly be rare among insects. Positive slopes can also be predicted by models which integrate life-history evolution and population dynamics. As bottom-up regulation is not common in most insect groups, such models may not be most appropriate for insects. © 2014 The Author(s). Evolution © 2014 The Society for the Study of Evolution.

  10. Computational studies of novel chymase inhibitors against cardiovascular and allergic diseases: mechanism and inhibition.

    PubMed

    Arooj, Mahreen; Thangapandian, Sundarapandian; John, Shalini; Hwang, Swan; Park, Jong K; Lee, Keun W

    2012-12-01

    To provide a new idea for drug design, a computational investigation is performed on chymase and its novel 1,4-diazepane-2,5-diones inhibitors that explores the crucial molecular features contributing to binding specificity. Molecular docking studies of inhibitors within the active site of chymase were carried out to rationalize the inhibitory properties of these compounds and understand their inhibition mechanism. The density functional theory method was used to optimize molecular structures with the subsequent analysis of highest occupied molecular orbital, lowest unoccupied molecular orbital, and molecular electrostatic potential maps, which revealed that negative potentials near the 1,4-diazepane-2,5-diones ring are essential for effective binding of inhibitors at the active site of the enzyme. The Bayesian model, with a receiver operating characteristic (ROC) statistic of 0.82, also identified arylsulfonyl and aminocarbonyl as the molecular features favoring and not favoring inhibition of chymase, respectively. Moreover, genetic function approximation was applied to construct 3D quantitative structure-activity relationships models. Two models (genetic function approximation model 1, r² = 0.812, and genetic function approximation model 2, r² = 0.783) performed better in terms of correlation coefficients and cross-validation analysis. In general, this study is used as an example to illustrate how the combined use of 2D/3D quantitative structure-activity relationships modeling techniques, molecular docking, frontier molecular orbital density fields (highest occupied molecular orbital and lowest unoccupied molecular orbital), and molecular electrostatic potential analysis may be useful to gain an insight into the binding mechanism between an enzyme and its inhibitors. © 2012 John Wiley & Sons A/S.

  11. Developing a model for effective leadership in healthcare: a concept mapping approach

    PubMed Central

    Hargett, Charles William; Doty, Joseph P; Hauck, Jennifer N; Webb, Allison MB; Cook, Steven H; Tsipis, Nicholas E; Neumann, Julie A; Andolsek, Kathryn M; Taylor, Dean C

    2017-01-01

    Purpose Despite increasing awareness of the importance of leadership in healthcare, our understanding of the competencies of effective leadership remains limited. We used a concept mapping approach (a blend of qualitative and quantitative analysis of group processes to produce a visual composite of the group’s ideas) to identify stakeholders’ mental model of effective healthcare leadership, clarifying the underlying structure and importance of leadership competencies. Methods Literature review, focus groups, and consensus meetings were used to derive a representative set of healthcare leadership competency statements. Study participants subsequently sorted and rank-ordered these statements based on their perceived importance in contributing to effective healthcare leadership in real-world settings. Hierarchical cluster analysis of individual sortings was used to develop a coherent model of effective leadership in healthcare. Results A diverse group of 92 faculty and trainees individually rank-sorted 33 leadership competency statements. The highest rated statements were “Acting with Personal Integrity”, “Communicating Effectively”, “Acting with Professional Ethical Values”, “Pursuing Excellence”, “Building and Maintaining Relationships”, and “Thinking Critically”. Combining the results from hierarchical cluster analysis with our qualitative data led to a healthcare leadership model based on the core principle of Patient Centeredness and the core competencies of Integrity, Teamwork, Critical Thinking, Emotional Intelligence, and Selfless Service. Conclusion Using a mixed qualitative-quantitative approach, we developed a graphical representation of a shared leadership model derived in the healthcare setting. This model may enhance learning, teaching, and patient care in this important area, as well as guide future research. PMID:29355249
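    The sort-and-cluster step of concept mapping can be sketched in miniature: build a co-occurrence similarity matrix from participants' pile sorts, then agglomerate with average linkage. This is a bare-bones stand-in for the hierarchical cluster analysis the authors used, and the four statements and three sortings below are invented:

```python
def cooccurrence(sortings, n):
    """Fraction of participants who sorted statements i and j into the same pile.

    sortings: one partition per participant, each a list of lists of statement ids.
    """
    sim = [[0.0] * n for _ in range(n)]
    for partition in sortings:
        for group in partition:
            for i in group:
                for j in group:
                    sim[i][j] += 1.0
    m = len(sortings)
    return [[v / m for v in row] for row in sim]

def cluster(sim, k):
    """Average-linkage agglomerative clustering down to k clusters."""
    clusters = [[i] for i in range(len(sim))]
    link = lambda a, b: sum(sim[i][j] for i in a for j in b) / (len(a) * len(b))
    while len(clusters) > k:
        a, b = max(((a, b) for a in range(len(clusters))
                    for b in range(a + 1, len(clusters))),
                   key=lambda ab: link(clusters[ab[0]], clusters[ab[1]]))
        clusters[a] = clusters[a] + clusters[b]
        del clusters[b]
    return clusters

# Three hypothetical participants sort four statements into piles:
sortings = [[[0, 1], [2, 3]], [[0, 1], [2, 3]], [[0, 1, 2], [3]]]
sim = cooccurrence(sortings, 4)
clusters = cluster(sim, 2)   # -> [[0, 1], [2, 3]]
```

    In the study itself the same idea is applied to 92 individual sortings of 33 statements, with the rank-order data layered on top to weight the resulting clusters by importance.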

  12. Social in, social out: How the brain responds to social language with more social language

    PubMed Central

    O’Donnell, Matthew Brook; Falk, Emily B.; Lieberman, Matthew D.

    2014-01-01

    Social connection is a fundamental human need. As such, people’s brains are sensitized to social cues, such as those carried by language, and to promoting social communication. The neural mechanisms of certain key building blocks in this process, such as receptivity to and reproduction of social language, however, are not known. We combined quantitative linguistic analysis and neuroimaging to connect neural activity in brain regions used to simulate the mental states of others with exposure to, and re-transmission of, social language. Our results link findings on successful idea transmission from communication science, sociolinguistics and cognitive neuroscience to prospectively predict the degree of social language that participants utilize when re-transmitting ideas as a function of 1) initial language inputs and 2) neural activity during idea exposure. PMID:27642220

  13. Application of scenario analysis and multiagent technique in land-use planning: a case study on Sanjiang wetlands.

    PubMed

    Yu, Huan; Ni, Shi-Jun; Kong, Bo; He, Zheng-Wei; Zhang, Cheng-Jiang; Zhang, Shu-Qing; Pan, Xin; Xia, Chao-Xu; Li, Xuan-Qiong

    2013-01-01

    Land-use planning has triggered debates on social and environmental values, in which two key questions are faced: one is how to view different planning simulation results instantaneously and apply the results back to interactively assist planning work; the other is how to ensure that the planning simulation result is scientific and accurate. To answer these questions, the objective of this paper is to analyze whether and how a bridge can be built between qualitative and quantitative approaches for land-use planning work, and to find a way to overcome the gap that exists between the ability to construct computer simulation models to aid integrated land-use plan making and the demand for them by planning professionals. The study presented a theoretical framework for land-use planning based on the integration of the scenario analysis (SA) method and multiagent system (MAS) simulation, and selected freshwater wetlands in the Sanjiang Plain of China as a case study area. Study results showed that the MAS simulation technique, emphasizing quantitative process, effectively compensated for the SA method, emphasizing qualitative process, which realized the organic combination of qualitative and quantitative land-use planning work and thus provided a new idea and method for land-use planning and the sustainable management of land resources.

  14. Application of Scenario Analysis and Multiagent Technique in Land-Use Planning: A Case Study on Sanjiang Wetlands

    PubMed Central

    Ni, Shi-Jun; He, Zheng-Wei; Zhang, Cheng-Jiang; Zhang, Shu-Qing; Pan, Xin; Xia, Chao-Xu; Li, Xuan-Qiong

    2013-01-01

    Land-use planning has triggered debates on social and environmental values, in which two key questions are faced: one is how to view different planning simulation results instantaneously and apply the results back to interactively assist planning work; the other is how to ensure that the planning simulation result is scientific and accurate. To answer these questions, the objective of this paper is to analyze whether and how a bridge can be built between qualitative and quantitative approaches for land-use planning work, and to find a way to overcome the gap that exists between the ability to construct computer simulation models to aid integrated land-use plan making and the demand for them by planning professionals. The study presented a theoretical framework for land-use planning based on the integration of the scenario analysis (SA) method and multiagent system (MAS) simulation, and selected freshwater wetlands in the Sanjiang Plain of China as a case study area. Study results showed that the MAS simulation technique, emphasizing quantitative process, effectively compensated for the SA method, emphasizing qualitative process, which realized the organic combination of qualitative and quantitative land-use planning work and thus provided a new idea and method for land-use planning and the sustainable management of land resources. PMID:23818816

  15. Nordic in Nature: Friluftsliv and Environmental Connectedness

    ERIC Educational Resources Information Center

    Beery, Thomas H.

    2013-01-01

    This study explored the question of whether a relationship exists between the Nordic cultural idea of friluftsliv and the psychological construct of environmental connectedness (EC). This quantitative study employed a correlational design with existing data from the Swedish Outdoor Recreation in Change national survey. Results indicate that there…

  16. Teachers' Knowledge of Special Education Policies and Practices

    ERIC Educational Resources Information Center

    Sanders, Pamela

    2015-01-01

    The Individuals with Disabilities Education Act (IDEA) greatly improved the educational opportunities for students with disabilities. Teachers require knowledge of the law to deliver necessary and appropriate services to students with disabilities. The purpose of this quantitative study was to examine teachers' knowledge of special education…

  17. Using Google Apps to Develop the Mathematical Practices

    ERIC Educational Resources Information Center

    Layton, Rebecca D.; Cady, Jo Ann; Layton, Christopher A.

    2017-01-01

    Recent recommendations for the teaching of mathematics place an emphasis on the Common Core's Standards for Mathematical Practice (SMP) (CCSSI 2010). The SMPs emphasize constructing viable arguments, critiquing the ideas of others, reasoning abstractly and quantitatively, and using computational procedures. These skills, including the use of…

  18. Probes, Surveys, and the Ontology of the Social

    ERIC Educational Resources Information Center

    Collins, Harry; Evans, Robert

    2017-01-01

    By distinguishing between a survey and--a newly introduced term--a "probe," we recast the relationship between qualitative and quantitative approaches to social science. The difference turns on the "uniformity" of the phenomenon being examined. Uniformity is a fundamental idea underlying all scientific research but is rarely…

  19. Direct Allocation Costing: Informed Management Decisions in a Changing Environment.

    ERIC Educational Resources Information Center

    Mancini, Cesidio G.; Goeres, Ernest R.

    1995-01-01

    It is argued that colleges and universities can use direct allocation costing to provide quantitative information needed for decision making. This method of analysis requires institutions to modify traditional ideas of costing, looking to the private sector for examples of accurate costing techniques. (MSE)

  20. Two-population model for medial temporal lobe neurons: The vast majority are almost silent

    NASA Astrophysics Data System (ADS)

    Magyar, Andrew; Collins, John

    2015-07-01

    Recordings in the human medial temporal lobe have found many neurons that respond to pictures (and related stimuli) of just one particular person among those presented. It has been proposed that these are concept cells, responding to just a single concept. However, a direct experimental test of the concept cell idea appears impossible, because it would require measuring each cell's response to enormous numbers of other stimuli. Here we propose a new statistical method for analysis of the data that gives a more powerful way to assess how close data are to the concept-cell idea. Central to the model is the neuronal sparsity, defined as the total fraction of stimuli that elicit an above-threshold response in the neuron. The model exploits the large number of sampled neurons to give sensitivity to situations where the average response sparsity is much less than one response for the number of presented stimuli. We show that a conventional model where a single sparsity is postulated for all neurons gives an extremely poor fit to the data. In contrast, a model with two dramatically different populations gives an excellent fit to data from the hippocampus and entorhinal cortex. In the hippocampus, one population has 7% of the cells with a 2.6% sparsity. But a much larger fraction (93%) respond to only 0.1% of the stimuli. This can result in an extreme bias in the responsiveness of reported neurons compared with a typical neuron. Finally, we show how to allow for the fact that some identified units correspond to multiple neurons and find that our conclusions at the neural level are quantitatively changed but strengthened, with an even stronger difference between the two populations.
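    The two-population idea can be checked numerically: if each neuron's response count over a session is binomial in the number of presented stimuli, a mixture of a small responsive population and a large near-silent one predicts both widespread silence and a strong reporting bias toward the responsive minority. The sketch below plugs in the hippocampal figures quoted above (7% of cells at 2.6% sparsity, 93% at 0.1%) under an assumed session of 100 stimuli; the session size is my assumption, not from the paper:

```python
import math

def binom_pmf(r, n, p):
    """Probability of r above-threshold responses out of n presented stimuli."""
    return math.comb(n, r) * p**r * (1 - p)**(n - r)

def response_dist(n_stimuli, populations):
    """P(r responses) under a mixture; populations: list of (fraction, sparsity)."""
    return [sum(f * binom_pmf(r, n_stimuli, a) for f, a in populations)
            for r in range(n_stimuli + 1)]

# Two-population fit reported for the hippocampus:
pops = [(0.07, 0.026), (0.93, 0.001)]
dist = response_dist(100, pops)

p_silent = dist[0]  # fraction of neurons showing no response in the session
mean_sparsity = sum(r * p for r, p in enumerate(dist)) / 100
# Mean sparsity among neurons with at least one response (the reporting bias):
mean_given_responsive = mean_sparsity / (1.0 - p_silent)
```

    With these numbers most sampled neurons are silent for the whole session, and the average sparsity of the neurons that do respond is several times the population average, which is the bias the abstract warns about when generalizing from reported cells to a typical neuron.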

  1. Vernier Caliper and Micrometer Computer Models Using Easy Java Simulation and Its Pedagogical Design Features--Ideas for Augmenting Learning with Real Instruments

    ERIC Educational Resources Information Center

    Wee, Loo Kang; Ning, Hwee Tiang

    2014-01-01

    This paper presents the customization of Easy Java Simulation models, used with actual laboratory instruments, to create active experiential learning for measurements. The laboratory instruments are the vernier caliper and the micrometer. Three computer model design ideas that complement real equipment are discussed. These ideas involve (1) a…

  2. Yeast Droplets

    NASA Astrophysics Data System (ADS)

    Nguyen, Baochi; Upadhyaya, Arpita; van Oudenaarden, Alexander; Brenner, Michael

    2002-11-01

    It is well known that the Young's law and surface tension govern the shape of liquid droplets on solid surfaces. Here we address through experiments and theory the shape of growing aggregates of yeast on agar substrates, and assess whether these ideas still hold. Experiments are carried out on Baker's yeast, with different levels of expressions of an adhesive protein governing cell-cell and cell-substrate adhesion. Changing either the agar concentration or the expression of this protein modifies the local contact angle of a yeast droplet. When the colony is small, the shape is a spherical cap with the contact angle obeying Young's law. However, above a critical volume this structure is unstable, and the droplet becomes nonspherical. We present a theoretical model where this instability is caused by bulk elastic effects. The model predicts that the transition depends on both volume and contact angle, in a manner quantitatively consistent with our experiments.

  3. Evidence for success in health promotion: suggestions for improvement.

    PubMed

    Macdonald, G; Veen, C; Tones, K

    1996-09-01

    This paper argues that health promotion needs to develop an approach to evaluation and effectiveness that values qualitative methodologies. It posits the idea that qualitative research could learn from the experience of quantitative researchers and promote more useful ways of measuring effectiveness by the use of intermediate and indirect indicators. It refers to a European-wide project designed to gather information on the effectiveness of health promotion interventions. This project discovered that there was a need for an instrument that allowed qualitative intervention methodologies to be assessed in the same way as quantitative methods.

  4. Integral evaluation of variants of renovation projects for Moscow city blocks

    NASA Astrophysics Data System (ADS)

    Kotov, Egor; Chulkov, Vitaly; Chulkov, Georgy

    2018-06-01

    The idea of renovation as a reorganization of urban areas, and the term "renovation" itself, were proposed at the Moscow State University of Civil Engineering at the peak of the period of "spot construction" within the existing town-planning composition of Moscow. The idea was actively supported, and the term was widely used in directive documents and Decrees of the Government of Moscow at the time. When the idea of "spot construction" later came under criticism, the term's popularity and frequency of use declined significantly. Recently the term "renovation", no longer associated with the "spot construction" of urban areas, has regained relevance in the activities of the Moscow Government in connection with the need to significantly improve the living standards and quality of life of the population. Renovation is now understood as a complex problem combining new construction; reconstruction; the social and transport aspects of life and work; the demolition of buildings and structures that have outlived their service life; the handling of construction waste; and the organizational and logistical issues of resettling Muscovites. Covering all of these aspects of renovation, as a multi-layered, multi-parameter socio-technical field of activity, requires examining a significant diversity of individual characteristics, revealing the degree of their significance and their interrelationships, and calls for new computer information technologies. Such technologies can not only interconnect individual characteristics and support diagnostics and monitoring of changes in the quantitative values of individual parameters, but also operate with integral evaluations of the interaction of all the above-mentioned aspects of renovation.
Russian construction science pays serious attention to the creation and application of models that "fold" individual parameters and characteristics into integrated comprehensive assessments, allowing managers at these levels to reason from such assessments when evaluating the quality of processes and of performance results. One such model, used in the analysis of stationary and mobile construction-production environments, is considered.


  5. Quantitative Analysis of Strategic Voting in Anonymous Voting Systems

    ERIC Educational Resources Information Center

    Wang, Tiance

    2016-01-01

    Democratically choosing a single preference from three or more candidate options is not a straightforward matter. There are many competing ideas on how to aggregate rankings of candidates. However, the Gibbard-Satterthwaite theorem implies that no fair voting system (equality among voters and equality among candidates) is immune to strategic…

  6. The Changing Discourse on Higher Education and the Nation-State, 1960-2010

    ERIC Educational Resources Information Center

    Buckner, Elizabeth S.

    2017-01-01

    This article examines changing ideas about the relationship between the nation-state and the university in international higher education development discourse through a quantitative content analysis of over 700 academic articles, conference proceedings and research reports published by the United Nations Educational, Scientific and Cultural…

  7. Knowledge Representation in a Physics Tutor. COINS Technical Report 86-37.

    ERIC Educational Resources Information Center

    Murray, Tom; Woolf, Beverly

    This paper is based on the idea that designing a knowledge representation for an intelligent physics computer tutoring system depends, in part, on the target behavior anticipated from the student. In addition, the document distinguishes between qualitative and quantitative competence in physics. These competencies are illustrated through questions…

  8. A Quantitative Assessment of Lareau's Qualitative Conclusions about Class, Race, and Parenting

    ERIC Educational Resources Information Center

    Cheadle, Jacob E.; Amato, Paul R.

    2011-01-01

    The authors used the Early Childhood Longitudinal Study, Kindergarten Class of 1998-1999, to test ideas from Lareau's qualitative study of social class differences in parenting. Consistent with Lareau, a confirmatory factor analysis supported the general concerted cultivation construct--a parenting strategy that subsumes parents' school…

  9. Modeling fixation locations using spatial point processes.

    PubMed

    Barthelmé, Simon; Trukenbrod, Hans; Engbert, Ralf; Wichmann, Felix

    2013-10-01

    Whenever eye movements are measured, a central part of the analysis has to do with where subjects fixate and why they fixated where they fixated. To a first approximation, a set of fixations can be viewed as a set of points in space; this implies that fixations are spatial data and that the analysis of fixation locations can be beneficially thought of as a spatial statistics problem. We argue that thinking of fixation locations as arising from point processes is a very fruitful framework for eye-movement data, helping turn qualitative questions into quantitative ones. We provide a tutorial introduction to some of the main ideas of the field of spatial statistics, focusing especially on spatial Poisson processes. We show how point processes help relate image properties to fixation locations. In particular we show how point processes naturally express the idea that image features' predictability for fixations may vary from one image to another. We review other methods of analysis used in the literature, show how they relate to point process theory, and argue that thinking in terms of point processes substantially extends the range of analyses that can be performed and clarifies their interpretation.
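    A minimal sketch of the point-process framing is a homogeneous spatial Poisson process on a viewing window: the total number of points is Poisson-distributed and locations are uniform. This is the simplest case only; real fixation modeling, as the tutorial describes, would use an inhomogeneous intensity driven by image features. The intensity and window size below are arbitrary.

```python
# Hedged sketch: homogeneous spatial Poisson process on a rectangular window.
import math
import random

def simulate_poisson_2d(intensity, width, height, rng):
    """Poisson total count (Knuth's inversion method), uniform locations."""
    mean = intensity * width * height
    l, k, p = math.exp(-mean), 0, 1.0
    while p > l:
        k += 1
        p *= rng.random()
    n = k - 1
    return [(rng.uniform(0, width), rng.uniform(0, height)) for _ in range(n)]

rng = random.Random(42)
points = simulate_poisson_2d(intensity=50, width=1.0, height=1.0, rng=rng)
# A defining property: counts in disjoint subregions are independent Poisson
# variables proportional to area; the left half has expectation 50 * 0.5 = 25.
left = sum(1 for x, y in points if x < 0.5)
print(len(points), left)
```

An inhomogeneous version would replace the uniform locations with thinning against a feature-based intensity map, which is how image properties enter the model.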

  10. Mixis and Diagnôsis: Aristotle and the "Chemistry" of the Sublunary World.

    PubMed

    Viano, Cristina

    2015-08-01

    In On Generation and Corruption 1.10, Aristotle introduces the new idea of "chemical mixture" (mixis) to explain the constitution of those homogeneous substances from which all things in the sublunary world are comprised. In a mixture, the ingredients interact with one another to give rise to a new substance, qualitatively different, yet preserving the original ingredients in potentia, so that they can be separated again. In Book IV of the Meteorologica, Aristotle further suggests that bodies may be "diagnosed" according to certain passive properties, such as the fusibility of metals. While his theory of mixture has often led historians of science to identify Aristotle as one of the precursors of chemical science, his ideas have also been criticised as archaic, and implicated in a qualitative conception of the cosmos that delayed progress towards quantifying natural phenomena. In this paper, I take up the defence of Aristotle's theory by showing that his concept of mixture is not an obstacle to the development of natural science and chemistry, but, on the contrary, opens the way by offering an advanced model of qualitative analysis which does not exclude the possibility of quantitative development.

  11. Application Study of Comprehensive Forecasting Model Based on Entropy Weighting Method on Trend of PM2.5 Concentration in Guangzhou, China

    PubMed Central

    Liu, Dong-jun; Li, Li

    2015-01-01

    PM2.5 is the main influencing factor in China's haze-fog pollution. The trend of PM2.5 concentration was analyzed from a qualitative point of view based on mathematical models and simulation in this study. The comprehensive forecasting model (CFM) was developed based on combination forecasting ideas. The Autoregressive Integrated Moving Average (ARIMA) model, Artificial Neural Networks (ANNs) and the Exponential Smoothing Method (ESM) were used to predict the time series data of PM2.5 concentration. The results of the comprehensive forecasting model were obtained by combining the results of the three methods using weights from the Entropy Weighting Method. The trend of PM2.5 concentration in Guangzhou, China was quantitatively forecasted with the comprehensive forecasting model. The results were compared with those of the three single models, and PM2.5 concentration values for the next ten days were predicted. The comprehensive forecasting model balanced the deviations of the single prediction methods and had better applicability. It offers a new prediction method for the field of air quality forecasting. PMID:26110332
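    One common formulation of the entropy-weighting step is sketched below with made-up accuracy and forecast numbers; the paper's exact normalization may differ. Each model's per-day accuracy series is normalized to a probability vector, its entropy is computed, and models whose accuracy series carry more information (lower entropy) receive larger combination weights.

```python
# Hedged sketch of entropy-based combination weights for three forecasters
# (e.g. ARIMA, ANN, ESM). Toy numbers; not the paper's data or normalization.
import math

def entropy_weights(accuracy):
    """accuracy[j][t]: accuracy of model j at time t (higher is better)."""
    m = len(accuracy[0])  # number of time points
    entropies = []
    for series in accuracy:
        total = sum(series)
        probs = [a / total for a in series]
        # normalized Shannon entropy in [0, 1]
        h = -sum(p * math.log(p) for p in probs if p > 0) / math.log(m)
        entropies.append(h)
    d = [1 - h for h in entropies]          # degree of divergence per model
    return [dj / sum(d) for dj in d]        # weights sum to 1

# toy per-day accuracies for three component models
acc = [[0.9, 0.8, 0.85, 0.9],
       [0.7, 0.9, 0.6, 0.8],
       [0.8, 0.8, 0.8, 0.8]]
w = entropy_weights(acc)
# combine two days of point forecasts from the three models
combined = [sum(w[j] * f for j, f in enumerate(day))
            for day in zip([55, 60], [50, 58], [53, 61])]
print([round(x, 3) for x in w], [round(c, 1) for c in combined])
```

Because the weights are a convex combination, each combined value lies between the smallest and largest of the three component forecasts for that day.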

  12. Application Study of Comprehensive Forecasting Model Based on Entropy Weighting Method on Trend of PM2.5 Concentration in Guangzhou, China.

    PubMed

    Liu, Dong-jun; Li, Li

    2015-06-23

    PM2.5 is the main influencing factor in China's haze-fog pollution. The trend of PM2.5 concentration was analyzed from a qualitative point of view based on mathematical models and simulation in this study. The comprehensive forecasting model (CFM) was developed based on combination forecasting ideas. The Autoregressive Integrated Moving Average (ARIMA) model, Artificial Neural Networks (ANNs) and the Exponential Smoothing Method (ESM) were used to predict the time series data of PM2.5 concentration. The results of the comprehensive forecasting model were obtained by combining the results of the three methods using weights from the Entropy Weighting Method. The trend of PM2.5 concentration in Guangzhou, China was quantitatively forecasted with the comprehensive forecasting model. The results were compared with those of the three single models, and PM2.5 concentration values for the next ten days were predicted. The comprehensive forecasting model balanced the deviations of the single prediction methods and had better applicability. It offers a new prediction method for the field of air quality forecasting.

  13. Mathematics Teachers' Ideas about Mathematical Models: A Diverse Landscape

    ERIC Educational Resources Information Center

    Bautista, Alfredo; Wilkerson-Jerde, Michelle H.; Tobin, Roger G.; Brizuela, Bárbara M.

    2014-01-01

    This paper describes the ideas that mathematics teachers (grades 5-9) have regarding mathematical models of real-world phenomena, and explores how teachers' ideas differ depending on their educational background. Participants were 56 United States in-service mathematics teachers. We analyzed teachers' written responses to three open-ended…

  14. Simple mathematical law benchmarks human confrontations.

    PubMed

    Johnson, Neil F; Medina, Pablo; Zhao, Guannan; Messinger, Daniel S; Horgan, John; Gill, Paul; Bohorquez, Juan Camilo; Mattson, Whitney; Gangi, Devon; Qi, Hong; Manrique, Pedro; Velasquez, Nicolas; Morgenstern, Ana; Restrepo, Elvira; Johnson, Nicholas; Spagat, Michael; Zarama, Roberto

    2013-12-10

    Many high-profile societal problems involve an individual or group repeatedly attacking another - from child-parent disputes, sexual violence against women, civil unrest, violent conflicts and acts of terror, to current cyber-attacks on national infrastructure and ultrafast cyber-trades attacking stockholders. There is an urgent need to quantify the likely severity and timing of such future acts, shed light on likely perpetrators, and identify intervention strategies. Here we present a combined analysis of multiple datasets across all these domains which account for >100,000 events, and show that a simple mathematical law can benchmark them all. We derive this benchmark and interpret it, using a minimal mechanistic model grounded by state-of-the-art fieldwork. Our findings provide quantitative predictions concerning future attacks; a tool to help detect common perpetrators and abnormal behaviors; insight into the trajectory of a 'lone wolf'; identification of a critical threshold for spreading a message or idea among perpetrators; an intervention strategy to erode the most lethal clusters; and more broadly, a quantitative starting point for cross-disciplinary theorizing about human aggression at the individual and group level, in both real and online worlds.
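    The abstract does not state the law's functional form; the authors' related work on conflict escalation describes a power-law "progress curve" for the interval between successive attacks, tau_n ≈ tau_1 * n**(-nu). Assuming that form, the escalation parameter nu can be recovered by ordinary least squares in log-log space, shown here on noiseless synthetic intervals.

```python
# Hedged sketch: fitting a power-law progress curve tau_n = tau_1 * n**(-nu)
# to a sequence of inter-event intervals. Synthetic, noiseless data.
import math

def fit_progress_curve(intervals):
    """Fit log(tau_n) = log(tau_1) - nu * log(n) by ordinary least squares."""
    xs = [math.log(n) for n in range(1, len(intervals) + 1)]
    ys = [math.log(t) for t in intervals]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope, math.exp(my - slope * mx)  # (nu, tau_1)

# synthetic escalation: first interval 40 days, escalation exponent 0.7
data = [40 * n ** -0.7 for n in range(1, 21)]
nu, tau1 = fit_progress_curve(data)
print(round(nu, 3), round(tau1, 1))
```

With real event data the log-log points scatter around the line, and the fitted nu summarizes how quickly a given perpetrator-target pair escalates.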

  15. Simple mathematical law benchmarks human confrontations

    NASA Astrophysics Data System (ADS)

    Johnson, Neil F.; Medina, Pablo; Zhao, Guannan; Messinger, Daniel S.; Horgan, John; Gill, Paul; Bohorquez, Juan Camilo; Mattson, Whitney; Gangi, Devon; Qi, Hong; Manrique, Pedro; Velasquez, Nicolas; Morgenstern, Ana; Restrepo, Elvira; Johnson, Nicholas; Spagat, Michael; Zarama, Roberto

    2013-12-01

    Many high-profile societal problems involve an individual or group repeatedly attacking another - from child-parent disputes, sexual violence against women, civil unrest, violent conflicts and acts of terror, to current cyber-attacks on national infrastructure and ultrafast cyber-trades attacking stockholders. There is an urgent need to quantify the likely severity and timing of such future acts, shed light on likely perpetrators, and identify intervention strategies. Here we present a combined analysis of multiple datasets across all these domains which account for >100,000 events, and show that a simple mathematical law can benchmark them all. We derive this benchmark and interpret it, using a minimal mechanistic model grounded by state-of-the-art fieldwork. Our findings provide quantitative predictions concerning future attacks; a tool to help detect common perpetrators and abnormal behaviors; insight into the trajectory of a 'lone wolf'; identification of a critical threshold for spreading a message or idea among perpetrators; an intervention strategy to erode the most lethal clusters; and more broadly, a quantitative starting point for cross-disciplinary theorizing about human aggression at the individual and group level, in both real and online worlds.

  16. The Upward Progress of Unusually Good Ideas in a Four-Tier Hierarchy.

    ERIC Educational Resources Information Center

    Dunning, Robert Scott; Sincoff, Michael Z.

    In long established research organizations, it is necessary to safeguard good research ideas originating at lower organizational levels. The upward progress of unusually good ideas in an organizational hierarchy may be compared with that of ordinary ideas by means of a mathematical model, with the assumption that ideas follow a Poisson…

  17. Students' Ideas and Radical Constructivism

    NASA Astrophysics Data System (ADS)

    Sánchez Gómez, Pedro J.

    2016-08-01

    In this article, I study, from the point of view of the analytic philosophy of mind, the compatibility of students' ideas studies (SIS) with radical constructivism (RC). I demonstrate that RC is based on a psychology of narrow mental states; that is, the idea that the mental content of an individual can be fully characterised without any reference external to her or him. I show that this fact imposes severe restrictions on how SIS can be incorporated into RC. In particular, I argue that only qualitative studies can comply with the requirement of narrowness. Nevertheless, I propose that quantitative works can be employed as sources of types in order to study token actual students. I use this type-token dichotomy to put forward an outline of a theory of the relation between school contents and mental contents. In this view, token mental contents regarding a given topic can be defined, and probed, only by resorting to typical school contents.

  18. Schools Can be Made Better: The Ideas, Models, and Tools of Robert Fox.

    ERIC Educational Resources Information Center

    Lippitt, Ronald; Johnson, Patricia L.

    The humanistic ideas, models, and tools of educator Robert Fox are presented in eight chapters. Chapter I summarizes Fox's ideas toward clarifying values, projecting possible goals and plans toward humane education, the balance and linkage between intellectual, socio-emotional, and citizenship development, the individualization of curriculum, and…

  19. Statistical mechanics of complex neural systems and high dimensional data

    NASA Astrophysics Data System (ADS)

    Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya

    2013-03-01

    Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks.

  20. Setting health research priorities using the CHNRI method: VI. Quantitative properties of human collective opinion

    PubMed Central

    Yoshida, Sachiyo; Rudan, Igor; Cousens, Simon

    2016-01-01

    Introduction Crowdsourcing has become an increasingly important tool to address many problems – from government elections in democracies and stock market prices to modern online tools such as TripAdvisor or the Internet Movie Database (IMDB). The CHNRI method (the acronym for the Child Health and Nutrition Research Initiative) for setting health research priorities has crowdsourcing as its major component, which it uses to generate, assess and prioritize among many competing health research ideas. Methods We conducted a series of analyses using data from a group of 91 scorers to explore the quantitative properties of their collective opinion. We were interested in the stability of their collective opinion as the sample size increases from 15 to 90. From a pool of 91 scorers who took part in a previous CHNRI exercise, we used sampling with replacement to generate multiple random samples of different sizes. First, for each sample generated, we identified the top 20 ranked research ideas, among the 205 that were proposed and scored, and calculated the concordance with the ranking generated by the 91 original scorers. Second, we used rank correlation coefficients to compare the ranks assigned to all 205 proposed research ideas when samples of different sizes are used. We also analysed the original pool of 91 scorers to look for evidence of scoring variations based on scorers' characteristics. Results The sample sizes investigated ranged from 15 to 90. The concordance for the top 20 scored research ideas increased with sample sizes up to about 55 experts. At this point, the median level of concordance stabilized at 15/20 top ranked questions (75%), with the interquartile range also generally stable (14–16). There was little further increase in overlap when the sample size increased from 55 to 90. 
When analysing the ranking of all 205 ideas, the rank correlation coefficient increased as the sample size increased, with a median correlation of 0.95 reached at the sample size of 45 experts (median of the rank correlation coefficient = 0.95; IQR 0.94–0.96). Conclusions Our analyses suggest that the collective opinion of an expert group on a large number of research ideas, expressed through categorical variables (Yes/No/Not Sure/Don't know), stabilises relatively quickly in terms of identifying the ideas that have most support. In the exercise we found that a high degree of reproducibility of the identified research priorities was achieved with as few as 45–55 experts. PMID:27350874
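    The resampling scheme described above can be sketched as follows, with synthetic scores standing in for the real CHNRI data (the scorer and idea counts match the study; the latent-quality model and noise level are invented for illustration):

```python
# Hedged sketch: bootstrap scorer samples of increasing size, re-rank ideas
# by mean score, and measure overlap of each resampled top-20 with the
# full-panel top-20. Synthetic scores, not the CHNRI dataset.
import random

def top_k(scores_by_scorer, k=20):
    """Indices of the k ideas with the highest mean score across scorers."""
    n_ideas = len(scores_by_scorer[0])
    means = [sum(s[i] for s in scores_by_scorer) / len(scores_by_scorer)
             for i in range(n_ideas)]
    return set(sorted(range(n_ideas), key=lambda i: -means[i])[:k])

rng = random.Random(1)
# 91 scorers x 205 ideas; idea i has latent quality i/205 plus scorer noise
pool = [[i / 205 + rng.gauss(0, 0.3) for i in range(205)] for _ in range(91)]
reference = top_k(pool)  # ranking by the full panel of 91 scorers

results = {}
for size in (15, 45, 90):
    overlaps = []
    for _ in range(50):  # bootstrap replicates (sampling with replacement)
        sample = [pool[rng.randrange(91)] for _ in range(size)]
        overlaps.append(len(top_k(sample) & reference))
    results[size] = sum(overlaps) / len(overlaps)
    print(size, results[size])
```

The mean overlap out of 20 should rise with sample size and flatten well before the full panel, which is the stabilization behavior the study reports.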

  1. Setting health research priorities using the CHNRI method: VI. Quantitative properties of human collective opinion.

    PubMed

    Yoshida, Sachiyo; Rudan, Igor; Cousens, Simon

    2016-06-01

    Crowdsourcing has become an increasingly important tool to address many problems - from government elections in democracies and stock market prices to modern online tools such as TripAdvisor or the Internet Movie Database (IMDB). The CHNRI method (the acronym for the Child Health and Nutrition Research Initiative) for setting health research priorities has crowdsourcing as its major component, which it uses to generate, assess and prioritize among many competing health research ideas. We conducted a series of analyses using data from a group of 91 scorers to explore the quantitative properties of their collective opinion. We were interested in the stability of their collective opinion as the sample size increases from 15 to 90. From a pool of 91 scorers who took part in a previous CHNRI exercise, we used sampling with replacement to generate multiple random samples of different sizes. First, for each sample generated, we identified the top 20 ranked research ideas, among the 205 that were proposed and scored, and calculated the concordance with the ranking generated by the 91 original scorers. Second, we used rank correlation coefficients to compare the ranks assigned to all 205 proposed research ideas when samples of different sizes are used. We also analysed the original pool of 91 scorers to look for evidence of scoring variations based on scorers' characteristics. The sample sizes investigated ranged from 15 to 90. The concordance for the top 20 scored research ideas increased with sample sizes up to about 55 experts. At this point, the median level of concordance stabilized at 15/20 top ranked questions (75%), with the interquartile range also generally stable (14-16). There was little further increase in overlap when the sample size increased from 55 to 90. 
When analysing the ranking of all 205 ideas, the rank correlation coefficient increased as the sample size increased, with a median correlation of 0.95 reached at the sample size of 45 experts (median of the rank correlation coefficient = 0.95; IQR 0.94-0.96). Our analyses suggest that the collective opinion of an expert group on a large number of research ideas, expressed through categorical variables (Yes/No/Not Sure/Don't know), stabilises relatively quickly in terms of identifying the ideas that have most support. In the exercise we found that a high degree of reproducibility of the identified research priorities was achieved with as few as 45-55 experts.

  2. Context Dependence of Students' Views about the Role of Equations in Understanding Biology

    ERIC Educational Resources Information Center

    Watkins, Jessica; Elby, Andrew

    2013-01-01

    Students' epistemological views about biology--their ideas about what "counts" as learning and understanding biology--play a role in how they approach their courses and respond to reforms. As introductory biology courses incorporate more physics and quantitative reasoning, student attitudes about the role of equations in biology become…

  3. Teaching Electrical Energy, Voltage and Current: An Alternative Approach.

    ERIC Educational Resources Information Center

    Licht, Pieter

    1991-01-01

    A program for teaching the concepts of electric energy, voltage, and current is proposed. The ideas and concepts are introduced in a sequence that places more emphasis on some aspects that are normally treated very briefly. A phenomenological orientation, qualitative and quantitative micro- and macroscopic treatments, and the inclusion of the…

  4. Louis Guttman's Contributions to Classical Test Theory

    ERIC Educational Resources Information Center

    Zimmerman, Donald W.; Williams, Richard H.; Zumbo, Bruno D.; Ross, Donald

    2005-01-01

    This article focuses on Louis Guttman's contributions to the classical theory of educational and psychological tests, one of the lesser known of his many contributions to quantitative methods in the social sciences. Guttman's work in this field provided a rigorous mathematical basis for ideas that, for many decades after Spearman's initial work,…

  5. Inertial Mass

    ERIC Educational Resources Information Center

    King, Kenneth P.

    2007-01-01

    The inertial balance is one device that can help students to quantify the quality of inertia--a body's resistance to a change in movement--in more generally understood terms of mass. In this hands-on activity, students use the inertial balance to develop a more quantitative idea of what mass means in an inertial sense. The activity also helps…

  6. Why did the apple fall? A new model to explain Einstein’s gravity

    NASA Astrophysics Data System (ADS)

    Stannard, Warren; Blair, David; Zadnik, Marjan; Kaur, Tejinder

    2017-01-01

    Newton described gravity as an attractive force between two masses but Einstein’s General Theory of Relativity provides a very different explanation. Implicit in Einstein’s theory is the idea that gravitational effects are the result of a distortion in the shape of space-time. Despite its elegance, Einstein’s concept of gravity is rarely encountered outside of an advanced physics course as it is often considered to be too complex and too mathematical. This paper describes a new conceptual and quantitative model of gravity based on General Relativity at a level most science students should be able to understand. The model illustrates geodesics using analogies with paths of navigation on the surface of the Earth. This is extended to space and time maps incorporating the time warping effects of General Relativity. Using basic geometry, the geodesic path of a falling object near the surface of the Earth is found. From this the acceleration of an object in free fall is calculated. The model presented in this paper can answer the question, ‘Why do things fall?’ without resorting to Newton’s gravitational force.

  7. Decision making model for Foreign Object Debris/Damage (FOD) elimination in aeronautics using quantitative modeling approach

    NASA Astrophysics Data System (ADS)

    Lafon, Jose J.

    Foreign Object Debris/Damage (FOD) has been a costly, everyday issue for commercial and military aircraft manufacturers on their production lines. FOD can put the lives of pilots, passengers and crews at high risk. FOD refers to any type of foreign object, particle, debris or agent in the manufacturing environment that could contaminate or damage the product or otherwise undermine quality standards. FOD is currently addressed with prevention programs, elimination techniques, designation of FOD areas, controlled access to those areas, restrictions on personal items entering designated areas, tool accountability, and similar measures. None of these efforts has produced a significant reduction in FOD occurrence in manufacturing processes. This research presents a decision-making model based on a logistic regression predictive model previously developed by other researchers. With a general idea of the FOD to be expected, elimination plans can be put in place to start eradicating the problem, minimizing the cost and time spent on the prediction, detection and/or removal of FOD.

  8. A Constructivist-Based Model for the Teaching of Dissolution of Gas in a Liquid

    ERIC Educational Resources Information Center

    Calik, Muammer; Ayas, Alipasa; Coll, Richard K.

    2006-01-01

    In this article we present details of a four-step constructivist-based teaching strategy, which helps students understand the dissolution of a gas in a liquid. The model derived from Ayas (1995) involves elicitation of pre-existing ideas, focusing on the target concept, challenging students' ideas, and applying newly constructed ideas to similar…

  9. Higher Education Development in Korea: Western University Ideas, Confucian Tradition, and Economic Development

    ERIC Educational Resources Information Center

    Shin, Jung Cheol

    2012-01-01

    The features of Korean higher education development are related to sociocultural tradition (the Confucian tradition), the model university ideas, and economic development in Korea. The modern university ideas adopted in Korea are based on the German model, which was established by the Japanese colonial government, and draw on the US university model…

  10. Models in biology: ‘accurate descriptions of our pathetic thinking’

    PubMed Central

    2014-01-01

    In this essay I will sketch some ideas for how to think about models in biology. I will begin by trying to dispel the myth that quantitative modeling is somehow foreign to biology. I will then point out the distinction between forward and reverse modeling and focus thereafter on the former. Instead of going into mathematical technicalities about different varieties of models, I will focus on their logical structure, in terms of assumptions and conclusions. A model is a logical machine for deducing the latter from the former. If the model is correct, then, if you believe its assumptions, you must, as a matter of logic, also believe its conclusions. This leads to consideration of the assumptions underlying models. If these are based on fundamental physical laws, then it may be reasonable to treat the model as ‘predictive’, in the sense that it is not subject to falsification and we can rely on its conclusions. However, at the molecular level, models are more often derived from phenomenology and guesswork. In this case, the model is a test of its assumptions and must be falsifiable. I will discuss three models from this perspective, each of which yields biological insights, and this will lead to some guidelines for prospective model builders. PMID:24886484

  11. Old wine in new bottles: decanting systemic family process research in the era of evidence-based practice.

    PubMed

    Rohrbaugh, Michael J

    2014-09-01

    Social cybernetic (systemic) ideas from the early Family Process era, though emanating from qualitative clinical observation, have underappreciated heuristic potential for guiding quantitative empirical research on problem maintenance and change. The old conceptual wines we have attempted to repackage in new, science-friendly bottles include ironic processes (when "solutions" maintain problems), symptom-system fit (when problems stabilize relationships), and communal coping (when we-ness helps people change). Both self-report and observational quantitative methods have been useful in tracking these phenomena, and together the three constructs inform a team-based family consultation approach to working with difficult health and behavior problems. In addition, a large-scale, quantitatively focused effectiveness trial of family therapy for adolescent drug abuse highlights the importance of treatment fidelity and qualitative approaches to examining it. In this sense, echoing the history of family therapy research, our experience with juxtaposing quantitative and qualitative methods has gone full circle, from qualitative to quantitative observation and back again. © 2014 FPI, Inc.

  12. Old Wine in New Bottles: Decanting Systemic Family Process Research in the Era of Evidence-Based Practice†

    PubMed Central

    Rohrbaugh, Michael J.

    2015-01-01

    Social cybernetic (systemic) ideas from the early Family Process era, though emanating from qualitative clinical observation, have underappreciated heuristic potential for guiding quantitative empirical research on problem maintenance and change. The old conceptual wines we have attempted to repackage in new, science-friendly bottles include ironic processes (when “solutions” maintain problems), symptom-system fit (when problems stabilize relationships), and communal coping (when we-ness helps people change). Both self-report and observational quantitative methods have been useful in tracking these phenomena, and together the three constructs inform a team-based family consultation (FAMCON) approach to working with difficult health and behavior problems. In addition, a large-scale, quantitatively focused effectiveness trial of family therapy for adolescent drug abuse highlights the importance of treatment fidelity and qualitative approaches to examining it. In this sense, echoing the history of family therapy research, our experience with juxtaposing quantitative and qualitative methods has gone full circle – from qualitative to quantitative observation and back again. PMID:24905101

  13. Using Active Learning to Teach Concepts and Methods in Quantitative Biology.

    PubMed

    Waldrop, Lindsay D; Adolph, Stephen C; Diniz Behn, Cecilia G; Braley, Emily; Drew, Joshua A; Full, Robert J; Gross, Louis J; Jungck, John A; Kohler, Brynja; Prairie, Jennifer C; Shtylla, Blerta; Miller, Laura A

    2015-11-01

    This article provides a summary of the ideas discussed at the 2015 Annual Meeting of the Society for Integrative and Comparative Biology society-wide symposium on Leading Students and Faculty to Quantitative Biology through Active Learning. It also includes a brief review of the recent advancements in incorporating active learning approaches into quantitative biology classrooms. We begin with an overview of recent literature that shows that active learning can improve students' outcomes in Science, Technology, Engineering and Math Education disciplines. We then discuss how this approach can be particularly useful when teaching topics in quantitative biology. Next, we describe some of the recent initiatives to develop hands-on activities in quantitative biology at both the graduate and the undergraduate levels. Throughout the article we provide resources for educators who wish to integrate active learning and technology into their classrooms. © The Author 2015. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved. For permissions please email: journals.permissions@oup.com.

  14. Nonlinear Chemical Dynamics and Synchronization

    NASA Astrophysics Data System (ADS)

    Li, Ning

    Alan Turing's work on morphogenesis, more than half a century ago, continues to motivate and inspire theoretical and experimental biologists even today. That said, there are very few experimental systems for which Turing's theory is applicable. In this thesis we present an experimental reaction-diffusion system ideally suited for testing Turing's ideas in synthetic "cells" consisting of microfluidically produced surfactant-stabilized emulsions in which droplets containing the Belousov-Zhabotinsky (BZ) oscillatory chemical reactants are dispersed in oil. The BZ reaction has become the prototype of nonlinear dynamics in chemistry and a preferred system for exploring the behavior of coupled nonlinear oscillators. Our system consists of a surfactant stabilized monodisperse emulsion of drops of aqueous BZ solution dispersed in a continuous phase of oil. In contrast to biology, here the chemistry is understood, rate constants are measured and interdrop coupling is purely diffusive. We explore a large set of parameters through control of rate constants, drop size, spacing, and spatial arrangement of the drops in lines and rings in one-dimension (1D) and hexagonal arrays in two-dimensions (2D). The Turing model is regarded as a metaphor for morphogenesis in biology but not for prediction. Here, we develop a quantitative and falsifiable reaction-diffusion model that we experimentally test with synthetic cells. We quantitatively establish the extent to which the Turing model in 1D describes both stationary pattern formation and temporal synchronization of chemical oscillators via reaction-diffusion and in 2D demonstrate that chemical morphogenesis drives physical differentiation in synthetic cells.

  15. Evolutionary and ecological approaches to the study of personality

    PubMed Central

    Réale, Denis; Dingemanse, Niels J.; Kazem, Anahita J. N.; Wright, Jonathan

    2010-01-01

    This introduction to the themed issue on Evolutionary and ecological approaches to the study of personality provides an overview of conceptual, theoretical and methodological progress in research on animal personalities over the last decade, and places the contributions to this volume in context. The issue has three main goals. First, we aimed to bring together theoreticians to contribute to the development of models providing adaptive explanations for animal personality that could guide empiricists, and stimulate exchange of ideas between the two groups of researchers. Second, we aimed to stimulate cross-fertilization between different scientific fields that study personality, namely behavioural ecology, psychology, genomics, quantitative genetics, neuroendocrinology and developmental biology. Third, we aimed to foster the application of an evolutionary framework to the study of personality. PMID:21078646

  16. Perspectives on Porous Media MR in Clinical MRI

    NASA Astrophysics Data System (ADS)

    Sigmund, E. E.

    2011-03-01

    Many goals and challenges of research in natural or synthetic porous media are mirrored in quantitative medical MRI. This review will describe examples where MR techniques used in porous media (particularly diffusion-weighted imaging (DWI)) are applied to physiological pathologies. Tissue microstructure is one area with great overlap with porous media science. Diffusion-weighting (esp. in neurological tissue) has motivated models with explicit physical dimensions, statistical parameters, empirical descriptors, or hybrids thereof. Another clinically relevant microscopic process is active flow. Renal (kidney) tissue possesses significant active vascular / tubular transport that manifests as "pseudodiffusion." Cancerous lesions involve anomalies in both structure and flow. The tools of magnetic resonance and their interpretation in porous media have had great impact on clinical MRI, and continued cross-fertilization of ideas can only enhance the progress of both fields.

  17. Image dehazing based on non-local saturation

    NASA Astrophysics Data System (ADS)

    Wang, Linlin; Zhang, Qian; Yang, Deyun; Hou, Yingkun; He, Xiaoting

    2018-04-01

    In this paper, a method based on a non-local saturation algorithm is proposed to avoid block and halo effects in single-image dehazing with the dark channel prior. First, we convert the original image from RGB color space into HSV color space using the idea of the non-local method, weighting image saturation equally over a fixed window whose size is set according to the image resolution. Second, we use the saturation to estimate the atmospheric light value and the transmission rate. Then, through the function of saturation and transmission, the haze-free image is obtained from the atmospheric scattering model. Compared with existing methods, our method restores image color and enhances contrast. We evaluate the proposed method both quantitatively and qualitatively; experiments show better visual quality with high efficiency.
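
    The final recovery step follows the standard atmospheric scattering model I = J·t + A(1 − t), inverted for the scene radiance J. The sketch below assumes the airlight A and the transmission map t have already been estimated (in the paper, from non-local saturation; that estimation is not reproduced here):

```python
import numpy as np

def dehaze(I, A, t, t_min=0.1):
    """Recover the haze-free image J from the atmospheric scattering model
    I = J * t + A * (1 - t), i.e. J = (I - A) / t + A.
    I: HxWx3 float image in [0, 1]; A: scalar or 3-vector airlight;
    t: HxW transmission map (assumed estimated elsewhere, e.g. from saturation)."""
    t = np.clip(t, t_min, 1.0)[..., None]   # lower bound avoids division blow-up in dense haze
    return np.clip((I - A) / t + A, 0.0, 1.0)
```

    The lower bound t_min is the usual safeguard from dark-channel-prior dehazing: where transmission is near zero the inversion would otherwise amplify noise without limit.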

  18. Accomplishing the Visions for Teacher Education Programs Advocated in the National Science Education Standards

    NASA Astrophysics Data System (ADS)

    Akcay, Hakan; Yager, Robert

    2010-10-01

    The purpose of this study was to investigate the advantages of an approach to instruction using current problems and issues as curriculum organizers and illustrating how teaching must change to accomplish real learning. The study sample consisted of 41 preservice science teachers (13 males and 28 females) in a model science teacher education program. Both qualitative and quantitative research methods were used to determine success with science discipline-specific “Societal and Educational Applications” courses as one part of a total science teacher education program at a large Midwestern university. Students were involved with idea generation, consideration of multiple points of views, collaborative inquiries, and problem solving. All of these factors promoted grounded instruction using constructivist perspectives that situated science with actual experiences in the lives of students.

  19. Productivity of "collisions generate heat" for reconciling an energy model with mechanistic reasoning: A case study

    NASA Astrophysics Data System (ADS)

    Scherr, Rachel E.; Robertson, Amy D.

    2015-06-01

    We observe teachers in professional development courses about energy constructing mechanistic accounts of energy transformations. We analyze a case in which teachers investigating adiabatic compression develop a model of the transformation of kinetic energy to thermal energy. Among their ideas is the idea that thermal energy is generated as a byproduct of individual particle collisions, which is represented in science education research literature as an obstacle to learning. We demonstrate that in this instructional context, the idea that individual particle collisions generate thermal energy is not an obstacle to learning, but instead is productive: it initiates intellectual progress. Specifically, this idea initiates the reconciliation of the teachers' energy model with mechanistic reasoning about adiabatic compression, and leads to a canonically correct model of the transformation of kinetic energy into thermal energy. We claim that the idea's productivity is influenced by features of our particular instructional context, including the instructional goals of the course, the culture of collaborative sense making, and the use of certain representations of energy.

  20. The adaptive nature of eye movements in linguistic tasks: how payoff and architecture shape speed-accuracy trade-offs.

    PubMed

    Lewis, Richard L; Shvartsman, Michael; Singh, Satinder

    2013-07-01

    We explore the idea that eye-movement strategies in reading are precisely adapted to the joint constraints of task structure, task payoff, and processing architecture. We present a model of saccadic control that separates a parametric control policy space from a parametric machine architecture, the latter based on a small set of assumptions derived from research on eye movements in reading (Engbert, Nuthmann, Richter, & Kliegl, 2005; Reichle, Warren, & McConnell, 2009). The eye-control model is embedded in a decision architecture (a machine and policy space) that is capable of performing a simple linguistic task integrating information across saccades. Model predictions are derived by jointly optimizing the control of eye movements and task decisions under payoffs that quantitatively express different desired speed-accuracy trade-offs. The model yields distinct eye-movement predictions for the same task under different payoffs, including single-fixation durations, frequency effects, accuracy effects, and list position effects, and their modulation by task payoff. The predictions are compared to, and found to accord with, eye-movement data obtained from human participants performing the same task under the same payoffs, but they accord less well when the assumptions concerning payoff optimization and processing architecture are varied. These results extend work on rational analysis of oculomotor control and adaptation of reading strategy (Bicknell & Levy; McConkie, Rayner, & Wilson, 1973; Norris, 2009; Wotschack, 2009) by providing evidence for adaptation at low levels of saccadic control that is shaped by quantitatively varying task demands and the dynamics of processing architecture. Copyright © 2013 Cognitive Science Society, Inc.

  1. Investigating Elementary Teachers' Thinking About and Learning to Notice Students' Science Ideas

    NASA Astrophysics Data System (ADS)

    Luna, Melissa Jo

    Children naturally use observations and everyday thinking to construct explanations as to why phenomena happen in the world. Science instruction can benefit by starting with these ideas to help children build coherent scientific understandings of how the physical world works. To do so, science teaching must involve attending to students' ideas so that those ideas become the basis for learning. Yet while science education reform requires teachers to pay close attention to their students' ideas, we know little about what teachers think this means in practice. To examine this issue, my dissertation research is two-fold. First, I examine teacher thinking by investigating how teachers understand what it means to pay attention to students' science ideas. Specifically, using new digital technology, three participating teachers captured moments of student thinking in the midst of instruction. Analysis of these moments reveals that teachers capture many different kinds of moments containing students' ideas and think about students' science ideas in different ways at different times. In particular, these three teachers most often think about students' ideas as being (a) from authority, (b) from experience, and (c) under construction. Second, I examine teacher learning through the development of an innovative science teaching video club model. The model differs from previous research on video clubs in several key ways in an attempt to focus teachers on student thinking in a sustained way. I investigate the ways in which this model was effective for engaging teachers in noticing and making sense of their students' science ideas during one implementation. Results indicate that teachers talked about student thinking early, often, and in meaningful ways. Science education leaders have recognized the potential of science teaching video clubs as a form of professional development, and the model presented in this work promotes the conditions for successful teacher learning. 
This work contributes to research on teacher cognition by advancing what we know about teachers' understanding of attending to students' science ideas. In addition, it provides practical information concerning the design of teacher professional development supporting their learning to attend closely to the ideas students raise about scientific phenomena.

  2. Kant on historiography and the use of regulative ideas.

    PubMed

    Kleingeld, Pauline

    2008-12-01

    In this paper, I examine Kant's methodological remarks in the 'Idea for a universal history' against the background of the Critique of pure reason. I argue that Kant's approach to the function of regulative ideas of human history as a whole may still be fruitful. This approach allows for regulative ideas that are grand in scope, but modest and fallibilistic in their epistemic status. Kant's methodological analysis should be distinguished from the specific teleological model of history he developed on its basis, however, because this model can no longer be appropriated for current purposes.

  3. Superfluid Fermi atomic gas as a quantum simulator for the study of the neutron-star equation of state in the low-density region

    NASA Astrophysics Data System (ADS)

    van Wyk, Pieter; Tajima, Hiroyuki; Inotani, Daisuke; Ohnishi, Akira; Ohashi, Yoji

    2018-01-01

    We propose a theoretical idea to use an ultracold Fermi gas as a quantum simulator for the study of the low-density region of a neutron-star interior. Our idea is different from the standard quantum simulator that heads for perfect replication of another system, such as the Hubbard model discussed in high-Tc cuprates. Instead, we use the similarity between two systems and theoretically make up for the difference between them. That is, (1) we first show that the strong-coupling theory developed by Nozières and Schmitt-Rink (NSR) can quantitatively explain the recent experiment on the equation of state (EoS) in a 6Li superfluid Fermi gas in the BCS (Bardeen-Cooper-Schrieffer) unitary limit far below the superfluid phase-transition temperature Tc. This region is considered to be very similar to the low-density region (crust regime) of a neutron star (where a nearly unitary s -wave neutron superfluid is expected). (2) We then theoretically compensate the difference that, while the effective range reff is negligibly small in a superfluid 6Li Fermi gas, it cannot be ignored (reff=2.7 fm) in a neutron star, by extending the NSR theory to include effects of reff. The calculated EoS when reff=2.7 fm is shown to agree well with the previous neutron-star EoS in the low-density region predicted in nuclear physics. Our idea indicates that an ultracold atomic gas may more flexibly be used as a quantum simulator for the study of other complicated quantum many-body systems, when we use not only the experimental high tunability, but also the recent theoretical development in this field. Since it is difficult to directly observe a neutron-star interior, our idea would provide a useful approach to the exploration for this mysterious astronomical object.

  4. Considerations for interpreting probabilistic estimates of uncertainty of forest carbon

    Treesearch

    James E. Smith; Linda S. Heath

    2000-01-01

    Quantitative estimates of carbon inventories are needed as part of nationwide attempts to reduce the net release of greenhouse gases and the associated climate forcing. Naturally, an appreciable amount of uncertainty is inherent in such large-scale assessments, especially since both science and policy issues are still evolving. Decision makers need an idea of the…

  5. Researching the Impact of Teacher Professional Development Programmes Based on Action Research, Constructivism, and Systems Theory

    ERIC Educational Resources Information Center

    Zehetmeier, Stefan; Andreitz, Irina; Erlacher, Willibald; Rauch, Franz

    2015-01-01

    This paper deals with the topic of professional development programmes' impact. Concepts and ideas of action research, constructivism, and systems theory are used as a theoretical framework and are combined to describe and analyse an exemplary professional development programme in Austria. Empirical findings from both quantitative and qualitative…

  6. Investigating Early Childhood Teachers' Understandings of and Practices in Education for Sustainability in Queensland: A Japan-Australia Research Collaboration

    ERIC Educational Resources Information Center

    Inoue, Michiko; O'Gorman, Lyndal; Davis, Julie

    2016-01-01

    In a study undertaken in Queensland, Australia, analysis of a survey that included both qualitative and quantitative questions revealed that, like their Japanese counterparts, early childhood teachers do not have well-developed ideas and practices in education for sustainability (EfS). Instead, they mainly practise traditional nature-based…

  7. The Brain Network for Deductive Reasoning: A Quantitative Meta-Analysis of 28 Neuroimaging Studies

    ERIC Educational Resources Information Center

    Prado, Jerome; Chadha, Angad; Booth, James R.

    2011-01-01

    Over the course of the past decade, contradictory claims have been made regarding the neural bases of deductive reasoning. Researchers have been puzzled by apparent inconsistencies in the literature. Some have even questioned the effectiveness of the methodology used to study the neural bases of deductive reasoning. However, the idea that…

  8. Communities of Practice: A Research Paradigm for the Mixed Methods Approach

    ERIC Educational Resources Information Center

    Denscombe, Martyn

    2008-01-01

    The mixed methods approach has emerged as a "third paradigm" for social research. It has developed a platform of ideas and practices that are credible and distinctive and that mark the approach out as a viable alternative to quantitative and qualitative paradigms. However, there are also a number of variations and inconsistencies within the mixed…

  9. Definition of Historical Models of Gene Function and Their Relation to Students' Understanding of Genetics

    ERIC Educational Resources Information Center

    Gericke, Niklas Markus; Hagberg, Mariana

    2007-01-01

    Models are often used when teaching science. In this paper historical models and students' ideas about genetics are compared. The historical development of the scientific idea of the gene and its function is described and categorized into five historical models of gene function. Differences and similarities between these historical models are made…

  10. Will I be able to do my work at 60? An analysis of working conditions that hinder active ageing.

    PubMed

    Barros, Carla; Carnide, Filomena; Cunha, Liliana; Santos, Marta; Silva, Catarina

    2015-01-01

    Most developed countries consider population ageing one of the economic challenges they need to overcome. Managing ageing has led to a number of policies in which it is essential to increase the employment rate of older workers. This study aims to analyze the working conditions that tend to be perceived as hindering continuity in the workplace at the age of 60. The sample comprised 1234 workers from different sectors and socio-professional categories (52% men and 48% women; 64.5% younger than 45 years old). A quantitative approach was adopted, using logistic regression models, and the INSAT (Work and Health Questionnaire) was administered. Apart from factors of great physical constraint, other less visible aspects play a role in workers' belief that they will not be able to continue working at the age of 60, namely factors linked to work organizational options and relationships with others. Working conditions have a great influence on the perceived inability to perform the same type of work at 60. This perception is not limited to older workers: even younger workers under certain working conditions hold the same view, raising social concerns that should be taken into account by public policies.

  11. Eliciting improved quantitative judgements using the IDEA protocol: A case study in natural resource management.

    PubMed

    Hemming, Victoria; Walshe, Terry V; Hanea, Anca M; Fidler, Fiona; Burgman, Mark A

    2018-01-01

    Natural resource management uses expert judgement to estimate facts that inform important decisions. Unfortunately, expert judgement is often derived by informal and largely untested protocols, despite evidence that the quality of judgements can be improved with structured approaches. We attribute the lack of uptake of structured protocols to the dearth of illustrative examples that demonstrate how they can be applied within pressing time and resource constraints, while also improving judgements. In this paper, we demonstrate how the IDEA protocol for structured expert elicitation may be deployed to overcome operational challenges while improving the quality of judgements. The protocol was applied to the estimation of 14 future abiotic and biotic events on the Great Barrier Reef, Australia. Seventy-six participants with varying levels of expertise related to the Great Barrier Reef were recruited and allocated randomly to eight groups. Each participant provided their judgements using the four-step question format of the IDEA protocol ('Investigate', 'Discuss', 'Estimate', 'Aggregate') through remote elicitation. When the events were realised, the participant judgements were scored in terms of accuracy, calibration and informativeness. The results demonstrate that the IDEA protocol provides a practical, cost-effective, and repeatable approach to the elicitation of quantitative estimates and uncertainty via remote elicitation. We emphasise that i) the aggregation of diverse individual judgements into pooled group judgements almost always outperformed individuals, and ii) use of a modified Delphi approach helped to remove linguistic ambiguity, and further improved individual and group judgements. Importantly, the protocol encourages review, critical appraisal and replication, each of which is required if judgements are to be used in place of data in a scientific context.
The results add to the growing body of literature that demonstrates the merit of using structured elicitation protocols. We urge decision-makers and analysts to use insights and examples to improve the evidence base of expert judgement in natural resource management.
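
    The 'Aggregate' step can be as simple as an unweighted mean of the second-round interval judgements, a pooling rule consistent with the finding that pooled group judgements almost always outperformed individuals. A minimal sketch (the exact aggregation used in the study is an assumption here):

```python
def aggregate_idea(estimates):
    """Pool individual second-round estimates with an unweighted mean.
    Each estimate is a (lower, best, upper) interval judgement from one expert;
    the pooled group judgement is the component-wise average."""
    n = len(estimates)
    lows, bests, highs = zip(*estimates)
    return (sum(lows) / n, sum(bests) / n, sum(highs) / n)
```

    More elaborate schemes (e.g. performance-weighted pooling) exist, but an equal-weight mean is a common, hard-to-beat baseline in the structured-elicitation literature.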

  12. Reinforced communication and social navigation: Remember your friends and remember yourself

    NASA Astrophysics Data System (ADS)

    Mirshahvalad, A.; Rosvall, M.

    2011-09-01

    In social systems, people communicate with each other and form groups based on their interests. The pattern of interactions, the network, and the ideas that flow on the network naturally evolve together. Researchers use simple models to capture the feedback between changing network patterns and ideas on the network, but little is understood about the role of past events in the feedback process. Here, we introduce a simple agent-based model to study the coupling between people's ideas and social networks, and to better understand the role of history in dynamic social networks. We measure how information about ideas can be recovered from information about network structure and, conversely, how information about network structure can be recovered from information about ideas. We find that it is, in general, easier to recover ideas from the network structure than vice versa.
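
    As a rough illustration of the kind of coupling described, the sketch below lets a randomly chosen agent either adopt a neighbor's idea or rewire a link toward a like-minded agent. These update rules are illustrative assumptions, not the authors' actual model:

```python
import random

def step(ideas, edges, p_rewire=0.1, rng=random):
    """One update of a minimal ideas-on-a-network model: pick a random agent,
    then either adopt a random neighbor's idea, or (with prob. p_rewire)
    redirect that link toward an agent already sharing the focal agent's idea.
    Duplicate edges are possible and ignored for brevity."""
    i = rng.randrange(len(ideas))
    nbrs = [j for a, b in edges for j in ((b,) if a == i else (a,) if b == i else ())]
    if not nbrs:
        return
    j = rng.choice(nbrs)
    if rng.random() < p_rewire:
        like_minded = [k for k in range(len(ideas)) if k != i and ideas[k] == ideas[i]]
        if like_minded:
            edges.remove((i, j) if (i, j) in edges else (j, i))
            edges.append((i, rng.choice(like_minded)))
    else:
        ideas[i] = ideas[j]
```

    Iterating this update makes network structure and idea distribution co-evolve, which is exactly the feedback the abstract's recovery experiments probe.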

  13. LANGUAGE DEVELOPMENT. The developmental dynamics of marmoset monkey vocal production.

    PubMed

    Takahashi, D Y; Fenley, A R; Teramoto, Y; Narayanan, D Z; Borjon, J I; Holmes, P; Ghazanfar, A A

    2015-08-14

    Human vocal development occurs through two parallel interactive processes that transform infant cries into more mature vocalizations, such as cooing sounds and babbling. First, natural categories of sounds change as the vocal apparatus matures. Second, parental vocal feedback sensitizes infants to certain features of those sounds, and the sounds are modified accordingly. Paradoxically, our closest living relatives, nonhuman primates, are thought to undergo few or no production-related acoustic changes during development, and any such changes are thought to be impervious to social feedback. Using early and dense sampling, quantitative tracking of acoustic changes, and biomechanical modeling, we showed that vocalizations in infant marmoset monkeys undergo dramatic changes that cannot be solely attributed to simple consequences of growth. Using parental interaction experiments, we found that contingent parental feedback influences the rate of vocal development. These findings overturn decades-old ideas about primate vocalizations and show that marmoset monkeys are a compelling model system for early vocal development in humans. Copyright © 2015, American Association for the Advancement of Science.

  14. In silico evolution of biochemical networks

    NASA Astrophysics Data System (ADS)

    Francois, Paul

    2010-03-01

    We use computational evolution to select models of genetic networks that can be built from a predefined set of parts to achieve a certain behavior. Selection is made with the help of a fitness function that defines biological function in a quantitative way. This fitness has to be specific to a process, but general enough to find processes common to many species. Computational evolution favors models that can be built by incremental improvements in fitness rather than via multiple neutral steps or transitions through less fit intermediates. With the help of these simulations, we propose a kinetic view of evolution, where networks are rapidly selected along a fitness gradient. This mathematics recapitulates Darwin's original insight that small changes in fitness can rapidly lead to the evolution of complex structures such as the eye, and explains the phenomenon of convergent/parallel evolution of similar structures in independent lineages. We will illustrate these ideas with networks implicated in embryonic development and patterning of vertebrates and primitive insects.

  15. Simple mathematical law benchmarks human confrontations

    PubMed Central

    Johnson, Neil F.; Medina, Pablo; Zhao, Guannan; Messinger, Daniel S.; Horgan, John; Gill, Paul; Bohorquez, Juan Camilo; Mattson, Whitney; Gangi, Devon; Qi, Hong; Manrique, Pedro; Velasquez, Nicolas; Morgenstern, Ana; Restrepo, Elvira; Johnson, Nicholas; Spagat, Michael; Zarama, Roberto

    2013-01-01

    Many high-profile societal problems involve an individual or group repeatedly attacking another – from child-parent disputes, sexual violence against women, civil unrest, violent conflicts and acts of terror, to current cyber-attacks on national infrastructure and ultrafast cyber-trades attacking stockholders. There is an urgent need to quantify the likely severity and timing of such future acts, shed light on likely perpetrators, and identify intervention strategies. Here we present a combined analysis of multiple datasets across all these domains which account for >100,000 events, and show that a simple mathematical law can benchmark them all. We derive this benchmark and interpret it, using a minimal mechanistic model grounded by state-of-the-art fieldwork. Our findings provide quantitative predictions concerning future attacks; a tool to help detect common perpetrators and abnormal behaviors; insight into the trajectory of a ‘lone wolf'; identification of a critical threshold for spreading a message or idea among perpetrators; an intervention strategy to erode the most lethal clusters; and more broadly, a quantitative starting point for cross-disciplinary theorizing about human aggression at the individual and group level, in both real and online worlds. PMID:24322528
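
    One simple law of the kind reported for escalating attack sequences is a power-law "progress curve" for the interval between successive events; whether this is the exact benchmark derived in the paper is an assumption here. A sketch of the curve and a log-log least-squares fit:

```python
import math

def predicted_interval(tau1, b, n):
    """Progress-curve benchmark: the interval before the n-th event is
    tau_n = tau_1 * n**(-b). b > 0 means escalation (events speed up)."""
    return tau1 * n ** (-b)

def fit_progress_curve(intervals):
    """Least-squares fit of log(tau_n) = log(tau_1) - b * log(n).
    Returns (tau_1, b); requires at least two observed intervals."""
    xs = [math.log(n) for n in range(1, len(intervals) + 1)]
    ys = [math.log(t) for t in intervals]
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return math.exp(my - slope * mx), -slope
```

    Fitting b per actor is what makes such a law usable as a benchmark: deviations from the fitted curve flag abnormal behavior, and b itself summarizes an actor's escalation rate.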

  16. Ecosystem functioning and maximum entropy production: a quantitative test of hypotheses.

    PubMed

    Meysman, Filip J R; Bruers, Stijn

    2010-05-12

    The idea that entropy production puts a constraint on ecosystem functioning is quite popular in ecological thermodynamics. Yet, until now, such claims have received little quantitative verification. Here, we examine three 'entropy production' hypotheses that have been put forward in the past. The first states that increased entropy production serves as a fingerprint of living systems. The other two hypotheses invoke stronger constraints. The state selection hypothesis states that when a system can attain multiple steady states, the stable state will show the highest entropy production rate. The gradient response principle requires that when the thermodynamic gradient increases, the system's new stable state should always be accompanied by a higher entropy production rate. We test these three hypotheses by applying them to a set of conventional food web models, each time calculating the entropy production rate associated with the stable state of the ecosystem. This analysis shows that the first hypothesis holds for all the food webs tested: the living state always shows increased entropy production relative to the abiotic state. In contrast, the state selection and gradient response hypotheses break down when the food web incorporates more than one trophic level, indicating that they are not generally valid.

  17. A simulation model of IT risk on program trading

    NASA Astrophysics Data System (ADS)

    Xia, Bingying; Jiang, Wenbao; Luo, Guangxuan

    2015-12-01

    The biggest difficulty in measuring the IT risk of program trading is the scarcity of loss data. Current practice is to collect reports of IT incidents at home and abroad from courts, the internet, and other public media, and to base quantitative analysis of IT risk losses on the resulting database. A database built this way, however, can only approximate the real situation and cannot explain it at a fundamental level. In this paper, building on the concepts and steps of Monte Carlo (MC) simulation, we apply the MC method within a program-trading simulation system developed by our team to mimic real program trading, and we generate IT risk loss data through controlled IT failure experiments; the validity of the experimental data is then verified. This approach overcomes the deficiencies of the traditional research method, addresses the lack of IT risk data for quantitative research, and provides researchers with a template for simulation-based study of the problem.
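
    The paper's simulation system is not public, so the following is only a generic sketch of the Monte Carlo idea the abstract describes: simulate IT failures over a trading year and aggregate their losses into a loss distribution. The failure probability and lognormal loss parameters are invented placeholders, not values from the study:

```python
import random
import statistics

random.seed(42)

# Hypothetical parameters -- illustrative stand-ins, not the paper's values.
P_FAILURE = 0.02                   # chance of an IT failure on a trading day
LOSS_MU, LOSS_SIGMA = 10.0, 1.2    # lognormal parameters of a per-failure loss
DAYS, TRIALS = 250, 20000          # trading days per year, Monte Carlo trials

def annual_loss():
    """One simulated year: Bernoulli failure each day, lognormal loss size."""
    total = 0.0
    for _ in range(DAYS):
        if random.random() < P_FAILURE:
            total += random.lognormvariate(LOSS_MU, LOSS_SIGMA)
    return total

losses = sorted(annual_loss() for _ in range(TRIALS))
mean_loss = statistics.fmean(losses)
var_99 = losses[int(0.99 * TRIALS)]   # crude 99% value-at-risk estimate
```

    The sorted trial losses give an empirical loss distribution from which risk measures such as value-at-risk can be read off, standing in for the loss database that real-world data cannot supply.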

  18. The Role of Narratives in Sociohydrological Models of Flood Behaviors

    NASA Astrophysics Data System (ADS)

    Leong, Ching

    2018-04-01

    While current efforts to model sociohydrologic phenomena provide crucial insight, critics argue that these do not fully reflect the complexity one observes empirically in the real world. The policy sciences, with their focus on the interaction between human agency and the institutions that constrain public choice, can complement such efforts by providing a narrative approach. This paper demonstrates this complementarity by investigating the idea of resilience in a community response to floods. Using the quantitative Q methodology, we trace the dynamics of a common sociohydrologic hypothesis—the "memory effect" and how it decreases vulnerability and, more crucially, the instances when such memory effects do not obtain. Our analysis of a flood-prone maladaptive community in Assam, India, finds four distinct narrative types: the Hardened Preparer, the Engineer, the Discontent, and the Pessimist. This paper puts forward an explicitly sociohydrological conception of resilience that takes into account the role of sociological indicators such as narrative types and perceptions. Such contextual understandings and narrative types can form the basis of generic resilience indicators which complement the anticipated outcomes of sociohydrologic models generally.

  19. Molecule-specific determination of atomic polarizabilities with the polarizable atomic multipole model.

    PubMed

    Woo Kim, Hyun; Rhee, Young Min

    2012-07-30

    Recently, many polarizable force fields have been devised to describe induction effects between molecules. In popular polarizable models based on induced dipole moments, atomic polarizabilities are the essential parameters and should be derived carefully. Here, we present a parameterization scheme for atomic polarizabilities using a minimization target function containing both molecular and atomic information. The main idea is to adopt reference data only from quantum chemical calculations, so that atomic polarizability parameterizations can be performed even when relevant experimental data are scarce, as in the case of electronically excited molecules. Specifically, our scheme assigns the atomic polarizabilities of any given molecule in such a way that its molecular polarizability tensor is well reproduced. We show that our scheme successfully works for various molecules in mimicking dipole responses not only in ground states but also in valence excited states. The electrostatic potential around a molecule with an externally perturbing nearby charge also exhibits near-quantitative agreement with the reference data from quantum chemical calculations. The limitation of the model with isotropic atoms is also discussed to examine the scope of its applicability. Copyright © 2012 Wiley Periodicals, Inc.
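
    The actual scheme fits polarizabilities inside an induced-dipole model against quantum-chemical reference tensors; the sketch below strips this down to a purely additive isotropic model with invented reference values, just to show the least-squares structure of such a parameterization:

```python
import numpy as np

# Invented reference data: mean molecular polarizabilities (arbitrary units)
# and atom counts per element. These are NOT quantum-chemical values.
elements = ["C", "H", "O"]
counts = np.array([
    [1, 4, 0],   # CH4
    [2, 6, 0],   # C2H6
    [0, 2, 1],   # H2O
    [1, 4, 1],   # CH3OH
    [2, 6, 1],   # C2H5OH
], dtype=float)
alpha_mol = np.array([16.5, 28.5, 9.8, 21.3, 33.3])

# Additive isotropic model: alpha_mol ~= counts @ alpha_atom.
# Least squares plays the role of the paper's minimization target function.
alpha_atom, *_ = np.linalg.lstsq(counts, alpha_mol, rcond=None)
pred = counts @ alpha_atom
```

    A real induced-dipole scheme couples the atomic sites, so the design matrix is no longer a simple count matrix, but the fit-to-reference structure is the same.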

  20. Factors that impact the stability of vitamin C at intermediate temperatures in a food matrix.

    PubMed

    Herbig, Anna-Lena; Renard, Catherine M G C

    2017-04-01

    The study comprises a systematic and quantitative evaluation of potential intrinsic and extrinsic factors that impact vitamin C degradation in a real food matrix. The supernatant of centrifuged apple purée was fortified with vitamin C, and degradation was followed without stirring. Model discrimination indicated a better fit for the zero-order model than for the first-order model; the zero-order model was hence chosen for determination of rate constants. pH strongly influenced vitamin C degradation in citrate-phosphate buffer but not in the apple purée serum. To get an idea of the impact of the food matrix, stability in apple purée serum was compared with that in carrot purée, where stability was slightly higher. Vitamin C degradation rates were not influenced by its initial concentration. The temperature effect was only marked in the range 40-60°C; in the range 60-80°C, the filling height of the tubes had the greatest impact. Copyright © 2016 Elsevier Ltd. All rights reserved.
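
    The zero-order versus first-order model discrimination mentioned above can be sketched as follows; the concentrations are synthetic (generated from a zero-order law plus noise) and the rate constants are illustrative, not the study's values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic concentrations (mg/100 g) from zero-order decay plus noise --
# illustrative values only, not the study's measurements.
t = np.linspace(0.0, 10.0, 11)          # time, hours
C0_true, k_true = 50.0, 2.0
C = C0_true - k_true * t + rng.normal(0.0, 0.2, t.size)

A = np.column_stack([np.ones_like(t), -t])

# Zero order: C = C0 - k*t (linear least squares).
(c0_z, k_z), *_ = np.linalg.lstsq(A, C, rcond=None)
rss_zero = float(np.sum((C - (c0_z - k_z * t)) ** 2))

# First order: ln C = ln C0 - k*t (valid while C > 0); residuals compared in C.
(lnc0_f, k_f), *_ = np.linalg.lstsq(A, np.log(C), rcond=None)
rss_first = float(np.sum((C - np.exp(lnc0_f - k_f * t)) ** 2))

zero_order_wins = rss_zero < rss_first
```

    Comparing residual sums of squares in the concentration domain is one simple discrimination criterion; the study may well have used a different statistic, but the logic of fitting both rate laws and keeping the better one is the same.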

  1. Structural relaxation in supercooled orthoterphenyl.

    PubMed

    Chong, S-H; Sciortino, F

    2004-05-01

    We report molecular-dynamics simulation results performed for a model of molecular liquid orthoterphenyl in supercooled states, which we then compare with both experimental data and mode-coupling-theory (MCT) predictions, aiming at a better understanding of structural relaxation in orthoterphenyl. We pay special attention to the wave number dependence of the collective dynamics. It is shown that the simulation results for the model share many features with experimental data for the real system, and that MCT captures the simulation results at the semiquantitative level except for intermediate wave numbers connected to the overall size of the molecule. Theoretical results at the intermediate wave number region are found to be improved by taking into account the spatial correlation of the molecule's geometrical center. This supports the idea that unusual dynamical properties at the intermediate wave numbers, reported previously in simulation studies for the model and discernible in coherent neutron-scattering experimental data, are basically due to the coupling of the rotational motion to the geometrical-center dynamics. However, there still remain qualitative as well as quantitative discrepancies between theoretical predictions and the corresponding simulation results at the intermediate wave numbers, which call for further theoretical investigation.

  2. Non-Markovianity in the collision model with environmental block

    NASA Astrophysics Data System (ADS)

    Jin, Jiasen; Yu, Chang-shui

    2018-05-01

    We present an extended collision model to simulate the dynamics of an open quantum system. In our model, the unit representing the environment is not a single particle but a block consisting of a number of environment particles. The introduced blocks enable us to study how different strategies of system–environment interaction and different states of the blocks affect the non-Markovianity. We demonstrate our idea in the Gaussian channels of an all-optical system and derive a necessary and sufficient condition of non-Markovianity for such channels. Moreover, we show the equivalence of our criterion to the non-Markovian quantum jump in the simulation of the pure damping process of a single-mode field. We also show that when the system collides with the environmental particles of each block in a certain order, the non-Markovianity of the channel is affected by the size of the block and the embedded entanglement, and that heating and squeezing the vacuum environmental state quantitatively enhance the non-Markovianity.

  3. Young Children's Reasoning About Physical & Behavioural Family Resemblance: Is There a Place for a Precursor Model of Inheritance?

    NASA Astrophysics Data System (ADS)

    Ergazaki, Marida; Alexaki, Aspa; Papadopoulou, Chrysa; Kalpakiori, Marieleni

    2014-02-01

    This paper aims at exploring (a) whether preschoolers recognize that offspring share physical traits with their parents due to birth and behavioural ones due to nurture, and (b) whether they seem ready to explain shared physical traits with a `pre-biological' causal model that includes the contribution of both parents and a rudimentary notion of genes. This exploration is supposed to provide evidence for our next step, which is the development of an early years' learning environment about inheritance. Conducting individual, semi-structured interviews with 90 preschoolers (age 4.5-5.5) of four public kindergartens in Patras, we attempted to trace their reasoning about (a) whether and why offspring share physical and behavioural traits with parents and (b) which mechanism could better explain the shared physical traits. The probes were a modified six-case version of Solomon et al.'s (Child Dev 67:151-171, 1996) `adoption task', as well as a three-case task based on Springer's (Child Dev 66:547-558, 1995) `mechanism task' and on Solomon and Johnson's (Br J Dev Psychol 18(1):81-96, 2000) idea of genes as a `conceptual placeholder'. The qualitative and quantitative analysis of the interviews showed overlapping reasoning about the origin of physical and behavioural family resemblance. Nevertheless, we did trace the `birth-driven' argument for the attribution of the offspring's physical traits to the biological parents, as well as a preference for the `pre-biological' model that introduces a rudimentary idea of genes in order to explain shared physical traits between parents and offspring. The findings of the study and the educational implications are thoroughly discussed.

  4. New reflective symmetry design capability in the JPL-IDEAS Structure Optimization Program

    NASA Technical Reports Server (NTRS)

    Strain, D.; Levy, R.

    1986-01-01

    The JPL-IDEAS antenna structure analysis and design optimization computer program was modified to process half structure models of symmetric structures subjected to arbitrary external static loads, synthesize the performance, and optimize the design of the full structure. Significant savings in computation time and cost (more than 50%) were achieved compared to the cost of full model computer runs. The addition of the new reflective symmetry analysis design capabilities to the IDEAS program allows processing of structure models whose size would otherwise prevent automated design optimization. The new program produced synthesized full model iterative design results identical to those of actual full model program executions at substantially reduced cost, time, and computer storage.

  5. Computational method for multi-modal microscopy based on transport of intensity equation

    NASA Astrophysics Data System (ADS)

    Li, Jiaji; Chen, Qian; Sun, Jiasong; Zhang, Jialin; Zuo, Chao

    2017-02-01

    In this paper, we develop the requisite theory to describe a hybrid virtual-physical multi-modal imaging system which yields quantitative phase, Zernike phase contrast, differential interference contrast (DIC), and light field moment imaging simultaneously based on the transport of intensity equation (TIE). We then give an experimental demonstration of these ideas by time-lapse imaging of live HeLa cell mitosis. Experimental results verify that a tunable lens based TIE system, combined with the appropriate post-processing algorithm, can achieve a variety of promising imaging modalities in parallel with the quantitative phase images for the dynamic study of cellular processes.

  6. Limits of quantitation - Yet another suggestion

    NASA Astrophysics Data System (ADS)

    Carlson, Jill; Wysoczanski, Artur; Voigtman, Edward

    2014-06-01

    The work presented herein suggests that the limit of quantitation concept may be rendered substantially less ambiguous and ultimately more useful as a figure of merit by basing it upon the significant figure and relative measurement error ideas due to Coleman, Auses and Gram, coupled with the correct instantiation of Currie's detection limit methodology. Simple theoretical results are presented for a linear, univariate chemical measurement system with homoscedastic Gaussian noise, and these are tested against both Monte Carlo computer simulations and laser-excited molecular fluorescence experimental results. Good agreement among experiment, theory and simulation is obtained and an easy extension to linearly heteroscedastic Gaussian noise is also outlined.
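
    A minimal sketch of the ingredients named above, i.e. a Currie-style decision level and a conventional "10-sigma" quantitation limit for a linear, univariate system with homoscedastic Gaussian noise, checked by Monte Carlo. The slope and noise level are placeholders, and the specific figure of merit proposed in the paper is more elaborate than this:

```python
import random
import statistics

random.seed(7)

SLOPE, SIGMA = 2.0, 0.1    # calibration slope and noise sd -- placeholders
Z = 1.645                  # one-sided 95% standard normal critical value
L_C = Z * SIGMA / SLOPE    # Currie-style decision level (content domain)
LOQ = 10 * SIGMA / SLOPE   # conventional 10-sigma quantitation limit

# Monte Carlo 1: blanks should exceed L_C about 5% of the time.
TRIALS = 100000
false_pos = sum(1 for _ in range(TRIALS)
                if random.gauss(0.0, SIGMA) / SLOPE > L_C)
fp_rate = false_pos / TRIALS

# Monte Carlo 2: at the LOQ, the relative sd of the estimated content is ~10%.
est = [LOQ + random.gauss(0.0, SIGMA) / SLOPE for _ in range(TRIALS)]
rsd = statistics.stdev(est) / statistics.fmean(est)
```

    The 10% relative standard deviation at the LOQ is what connects the quantitation limit to the relative-measurement-error and significant-figure ideas the abstract invokes.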

  7. The NAIMS cooperative pilot project: Design, implementation and future directions.

    PubMed

    Oh, Jiwon; Bakshi, Rohit; Calabresi, Peter A; Crainiceanu, Ciprian; Henry, Roland G; Nair, Govind; Papinutto, Nico; Constable, R Todd; Reich, Daniel S; Pelletier, Daniel; Rooney, William; Schwartz, Daniel; Tagge, Ian; Shinohara, Russell T; Simon, Jack H; Sicotte, Nancy L

    2017-10-01

    The North American Imaging in Multiple Sclerosis (NAIMS) Cooperative represents a network of 27 academic centers focused on accelerating the pace of magnetic resonance imaging (MRI) research in multiple sclerosis (MS) through idea exchange and collaboration. Recently, NAIMS completed its first project evaluating the feasibility of implementation and reproducibility of quantitative MRI measures derived from scanning a single MS patient using a high-resolution 3T protocol at seven sites. The results showed the feasibility of utilizing advanced quantitative MRI measures in multicenter studies and demonstrated the importance of careful standardization of scanning protocols, central image processing, and strategies to account for inter-site variability.

  8. Students' Development and Use of Models to Explain Electrostatic Interactions

    NASA Astrophysics Data System (ADS)

    Mayer, Kristin Elizabeth

    The National Research Council (2012) recently published A Framework for K-12 Science Education that describes a vision for science classrooms where students engage in three dimensions--scientific and engineering practices, crosscutting concepts, and disciplinary core ideas--to explain phenomena or observations they can make about the universe around them. This vision of science instruction is a significant shift from current classroom instruction. This dissertation provides detailed examples of how students developed and used models to build causal explanations of phenomena. I co-taught classes that focused on having students develop and revise models of electric fields and atomic structure using a curriculum that was designed to align with the three-dimensional vision of learning. I developed case studies of eleven students from these classes. I analyzed the students' responses and interviewed the students throughout the school year. By comparing and contrasting the analyses of the students' interviews, I identified four themes: 1) students could apply their ideas to explain novel and abstract phenomena; 2) students struggled to connect changes in their atomic models to evidence, but ended up with dynamic models of atomic structure that they could apply to explain phenomena; 3) students developed models of atomic structure that they applied to explain phenomena, but they did not use models of electric fields in this way; and 4) too much focus on details interfered with students' ability to apply their models to explain new phenomena. This dissertation highlights the importance of focusing on phenomena in classrooms that aim at aligning with three-dimensional learning. Students struggled to focus on specific content and apply their ideas to explain phenomena at the same time. In order to apply ideas to new contexts, students had to shift their focus from recalling ideas to applying the ideas they do have.
A focus on phenomena allowed students to show their understanding through applying their ideas to new contexts. During this transition, students struggled, and in particular, had a hard time using evidence from experiments to justify the changes they made to their models of atomic structure. While the changes students made looked unproductive at times, by the end of the semester, students had developed models of atomic structure that incorporated relationships among charged components that they could apply to explain complex phenomena. Asking students to explore and evaluate their own ideas supported their development of models that they could apply to explain new contexts they encounter in the future.

  9. The Foundation of Western Ideas: The Ancient Hebrews and Greeks. Grade 6 Model Lesson for Unit III. California History-Social Science Course Models.

    ERIC Educational Resources Information Center

    Zachlod, Michelle, Ed.

    Important ideas from the Judeo-Christian and Greco-Roman traditions are formally introduced to students in the sixth-grade course. Units three and five focus on people and ideas that form the roots of western civilization. The goal is to help students perceive how the Judeo-Christian and Greco-Roman viewpoints are congruent and divergent. Unit…

  10. We Will Learn Better Only If Some Things Were Different: Arab Student Voices about Their Performance in IELTS

    ERIC Educational Resources Information Center

    Aboudan, Rima

    2011-01-01

    Although quantitative studies of educational research usually suggest some links between conditions of learning and student learning outcome, behavior and performance, the idea of engaging students in discussions on teaching and learning has not had as much attention in the United Arab Emirates as in some other countries. This paper presents…

  11. A Paradigm Shift in Nuclear Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Westmeier, Wolfram; Siemon, Klaus

    2012-08-01

    An overview of the latest developments in quantitative spectrometry software is presented. New strategies and algorithms introduced are characterized by buzzwords “Physics, no numerology”, “Fuzzy logic” and “Repeated analyses”. With the implementation of these new ideas one arrives at software capabilities that were unreachable before and which are now realized in the GAMMA-W, SODIGAM and ALPS packages.

  12. Looking at Images with Human Figures: Comparison between Autistic and Normal Children.

    ERIC Educational Resources Information Center

    van der Geest, J. N.; Kemner, C.; Camfferman, G.; Verbaten, M. N.; van Engeland, H.

    2002-01-01

    In this study, the looking behavior of 16 autistic and 14 non-autistic children toward cartoon-like scenes that included a human figure was measured quantitatively using an infrared eye-tracking device. Fixation behavior of autistic children was similar to that of their age- and IQ-matched normal peers. Results do not support the idea that autistic…

  13. Picturing Science: The Who, What, and Where of Images in Children's Award-Winning Science Trade Books

    ERIC Educational Resources Information Center

    Neutze, Donna Lee

    2008-01-01

    Educators, students, and parents are among those who have stereotypical preconceived ideas about science and scientists. The study reports on a content analysis of graphic images in 303 of the "Outstanding Science Trade Books for Students K-12" from the years 1973 through 2005. Using quantitative and qualitative content analysis, all of the images…

  14. Breakthrough thinking from inside the box.

    PubMed

    Coyne, Kevin P; Clifford, Patricia Gorman; Dye, Renée

    2007-12-01

    Companies often begin their search for great ideas either by encouraging wild, outside-the-box thinking or by conducting quantitative analysis of existing market and financial data and customer opinions. Those approaches can produce middling ideas at best, say Coyne, founder of an executive-counseling firm in Atlanta, and Clifford and Dye, strategy experts at McKinsey. The problem with the first method is that few people are very good at unstructured, abstract brainstorming. The problems with the second are that databases are usually compiled to describe current--not future--offerings, and customers rarely can tell you whether they need or want a product if they've never seen it. The secret to getting your organization to regularly generate lots of good ideas, and occasionally some great ones, is deceptively simple: First, create new boxes for people to think within so that they don't get lost in the cosmos and they have a basis for offering ideas and knowing whether they're making progress in the brainstorming session. Second, redesign ideation processes to remove obstacles that interfere with the flow of ideas--such as most people's aversion to speaking in groups larger than ten. This article offers a tested approach that poses concrete questions. For example, what do Rollerblades, Häagen-Dazs ice cream, and Spider-Man movies have in common? The answer: Each is something that adults loved as children and that was reproduced in an expensive form for grown-ups. Asking brainstorming participants to ponder how their childhood passions could be recast as adult offerings might generate some fabulous ideas for new products or services.

  15. On Quantitative Comparative Research in Communication and Language Evolution

    PubMed Central

    Oller, D. Kimbrough; Griebel, Ulrike

    2014-01-01

    Quantitative comparison of human language and natural animal communication requires improved conceptualizations. We argue that an infrastructural approach to development and evolution incorporating an extended interpretation of the distinctions among illocution, perlocution, and meaning (Austin 1962; Oller and Griebel 2008) can help place the issues relevant to quantitative comparison in perspective. The approach can illuminate the controversy revolving around the notion of functional referentiality as applied to alarm calls, for example in the vervet monkey. We argue that referentiality offers a poor point of quantitative comparison across language and animal communication in the wild. Evidence shows that even newborn human cry could be deemed to show functional referentiality according to the criteria typically invoked by advocates of referentiality in animal communication. Exploring the essence of the idea of illocution, we illustrate an important realm of commonality among animal communication systems and human language, a commonality that opens the door to more productive, quantifiable comparisons. Finally, we delineate two examples of infrastructural communicative capabilities that should be particularly amenable to direct quantitative comparison across humans and our closest relatives. PMID:25285057

  16. On Quantitative Comparative Research in Communication and Language Evolution.

    PubMed

    Oller, D Kimbrough; Griebel, Ulrike

    2014-09-01

    Quantitative comparison of human language and natural animal communication requires improved conceptualizations. We argue that an infrastructural approach to development and evolution incorporating an extended interpretation of the distinctions among illocution, perlocution, and meaning (Austin 1962; Oller and Griebel 2008) can help place the issues relevant to quantitative comparison in perspective. The approach can illuminate the controversy revolving around the notion of functional referentiality as applied to alarm calls, for example in the vervet monkey. We argue that referentiality offers a poor point of quantitative comparison across language and animal communication in the wild. Evidence shows that even newborn human cry could be deemed to show functional referentiality according to the criteria typically invoked by advocates of referentiality in animal communication. Exploring the essence of the idea of illocution, we illustrate an important realm of commonality among animal communication systems and human language, a commonality that opens the door to more productive, quantifiable comparisons. Finally, we delineate two examples of infrastructural communicative capabilities that should be particularly amenable to direct quantitative comparison across humans and our closest relatives.

  17. Measuring the effectiveness and impact of an open innovation platform.

    PubMed

    Carroll, Glenn P; Srivastava, Sanjay; Volini, Adam S; Piñeiro-Núñez, Marta M; Vetman, Tatiana

    2017-05-01

    Today, most pharmaceutical companies complement their traditional R&D models with some variation on the Open Innovation (OI) approach in an effort to better access global scientific talent, ideas and hypotheses. Traditional performance indicators that measure economic returns from R&D through commercialization are often not applicable to the practical assessment of these OI approaches, particularly within the context of early drug discovery. This leaves OI programs focused on early R&D without a standard assessment framework from which to evaluate overall performance. This paper proposes a practical dashboard for such assessment, encompassing quantitative and qualitative elements, to enable decision-making and improvement of future performance. The use of this dashboard is illustrated using real-time data from the Lilly Open Innovation Drug Discovery (OIDD) program. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.

  18. Quantitative computer simulations of extraterrestrial processing operations

    NASA Technical Reports Server (NTRS)

    Vincent, T. L.; Nikravesh, P. E.

    1989-01-01

    The automation of a small, solid propellant mixer was studied. Temperature control is under investigation. A numerical simulation of the system is under development and will be tested using different control options. Control system hardware is currently being put into place. The construction of mathematical models and simulation techniques for understanding various engineering processes is also studied. Computer graphics packages were utilized for better visualization of the simulation results. The mechanical mixing of propellants is examined. Simulation of the mixing process is being done to study how one can control for chaotic behavior to meet specified mixing requirements. An experimental mixing chamber is also being built. It will allow visual tracking of particles under mixing. The experimental unit will be used to test ideas from chaos theory, as well as to verify simulation results. This project has applications to extraterrestrial propellant quality and reliability.

  19. Quantifying social development in autism.

    PubMed

    Volkmar, F R; Carter, A; Sparrow, S S; Cicchetti, D V

    1993-05-01

    This study was concerned with the development of quantitative measures of social development in autism. Multiple regression equations predicting social, communicative, and daily living skills on the Vineland Adaptive Behavior Scales were derived from a large, normative sample and applied to groups of autistic and nonautistic, developmentally disordered children. Predictive models included either mental or chronological age and other relevant variables. Social skills in the autistic group were more than two standard deviations below those predicted by their mental age; an index derived from the ratio of actual to predicted social skills correctly classified 94% of the autistic and 92% of the nonautistic, developmentally disordered cases. The findings are consistent with the idea that social disturbance is central in the definition of autism. The approach used in this study has potential advantages for providing more precise measures of social development in autism.
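
    The study's index is the ratio of actual to predicted Vineland social scores, with predictions coming from regressions on a normative sample. A toy sketch of that classification logic follows; the regression coefficients, scores, and cutoff are invented for illustration, not the study's values:

```python
# Toy version of the ratio index: predict a social score from mental age via a
# (hypothetical) normative regression, then classify by a cutoff on the ratio
# of actual to predicted scores. All numbers are invented.

def predicted_social(mental_age_months):
    return 6.0 + 0.9 * mental_age_months   # hypothetical regression line

def social_quotient(actual, mental_age_months):
    return actual / predicted_social(mental_age_months)

CUTOFF = 0.65   # hypothetical classification threshold

# Invented cases: (actual social score, mental age in months).
cases = [(30.0, 60.0), (58.0, 60.0)]
flagged = [social_quotient(a, m) < CUTOFF for a, m in cases]
```

    The first invented case falls well below its mental-age prediction and is flagged; the second is near expectation and is not, mirroring the two-standard-deviation gap the abstract reports for the autistic group.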

  20. Active mask segmentation of fluorescence microscope images.

    PubMed

    Srinivasa, Gowri; Fickus, Matthew C; Guo, Yusong; Linstedt, Adam D; Kovacević, Jelena

    2009-08-01

    We propose a new active mask algorithm for the segmentation of fluorescence microscope images of punctate patterns. It combines the (a) flexibility offered by active-contour methods, (b) speed offered by multiresolution methods, (c) smoothing offered by multiscale methods, and (d) statistical modeling offered by region-growing methods into a fast and accurate segmentation tool. The framework moves from the idea of the "contour" to that of "inside and outside," or masks, allowing for easy multidimensional segmentation. It adapts to the topology of the image through the use of multiple masks. The algorithm is almost invariant under initialization, allowing for random initialization, and uses a few easily tunable parameters. Experiments show that the active mask algorithm matches the ground truth well and outperforms the algorithm widely used in fluorescence microscopy, seeded watershed, both qualitatively and quantitatively.

  1. Introductory physics going soft

    NASA Astrophysics Data System (ADS)

    Langbeheim, Elon; Livne, Shelly; Safran, Samuel A.; Yerushalmi, Edit

    2012-01-01

    We describe an elective course on soft matter at the level of introductory physics. Soft matter physics serves as a context that motivates the presentation of basic ideas in statistical thermodynamics and their applications. It also is an example of a contemporary field that is interdisciplinary and touches on chemistry, biology, and physics. We outline a curriculum that uses the lattice gas model as a quantitative and visual tool, initially to introduce entropy, and later to facilitate the calculation of interactions. We demonstrate how free energy minimization can be used to teach students to understand the properties of soft matter systems such as the phases of fluid mixtures, wetting of interfaces, self-assembly of surfactants, and polymers. We discuss several suggested activities in the form of inquiry projects which allow students to apply the concepts they have learned to experimental systems.
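
    In its simplest mean-field (regular solution) form, the lattice gas model mentioned above gives a mixing free energy per site f(x) = x ln x + (1-x) ln(1-x) + chi*x(1-x) in units of kT, and demixing sets in where the curvature f''(x) turns negative. The sketch below uses this generic textbook form, consistent with the course description but not drawn from its actual materials:

```python
import math

def free_energy(x, chi):
    """Mixing free energy per lattice site, in units of kT (regular solution)."""
    return x * math.log(x) + (1 - x) * math.log(1 - x) + chi * x * (1 - x)

def curvature(x, chi):
    """f''(x); a negative value marks spinodal instability (demixing)."""
    return 1.0 / x + 1.0 / (1.0 - x) - 2.0 * chi

# At x = 0.5, f'' = 4 - 2*chi: the symmetric mixture is stable below chi = 2
# and spinodally unstable (phase separating) above it.
stable_mixture = curvature(0.5, 1.5) > 0
demixing = curvature(0.5, 3.0) < 0
```

    The entropy terms alone (chi = 0) always favor mixing; the interaction term chi*x(1-x) is what lets students derive phase separation by free energy minimization, as in the course's treatment of fluid mixtures.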

  2. Endogenous versus Exogenous Origins of Crises

    NASA Astrophysics Data System (ADS)

    Sornette, Didier

    Are large biological extinctions such as the Cretaceous/Tertiary KT boundary due to a meteorite, extreme volcanic activity or self-organized critical extinction cascades? Are commercial successes due to a progressive reputation cascade or the result of a well orchestrated advertisement? Determining the chain of causality for Xevents in complex systems requires disentangling interwoven exogenous and endogenous contributions with either no clear signature or too many signatures. Here, I review several efforts carried out with collaborators which suggest a general strategy for understanding the organizations of several complex systems under the dual effect of endogenous and exogenous fluctuations. The studied examples are: internet download shocks, book sale shocks, social shocks, financial volatility shocks, and financial crashes. Simple models are offered to quantitatively relate the endogenous organization to the exogenous response of the system. Suggestions for applications of these ideas to many other systems are offered.

  3. Snakes as hazards: modelling risk by chasing chimpanzees.

    PubMed

    McGrew, William C

    2015-04-01

    Snakes are presumed to be hazards to primates, including humans, by the snake detection hypothesis (Isbell in J Hum Evol 51:1-35, 2006; Isbell, The fruit, the tree, and the serpent. Why we see so well, 2009). Quantitative, systematic data to test this idea are lacking for the behavioural ecology of living great apes and human foragers. An alternative proxy is snakes encountered by primatologists seeking, tracking, and observing wild chimpanzees. We present 4 years of such data from Mt. Assirik, Senegal. We encountered 14 species of snakes a total of 142 times. Almost two-thirds of encounters were with venomous snakes. Encounters occurred most often in forest and least often in grassland, and more often in the dry season. The hypothesis seems to be supported, if frequency of encounter reflects selective risk of morbidity or mortality.

  4. Ecosystem stewardship: good idea, but how?

    USDA-ARS?s Scientific Manuscript database

    Ecosystem stewardship and resilience-based management are admirable concepts that remain largely conceptual. Beyond a suite of general ideas, including linkages among ecological models, monitoring, stakeholder engagement, and social learning, there is not a replicable method to use the ideas in the ...

  5. Selection of appropriate training and validation set chemicals for modelling dermal permeability by U-optimal design.

    PubMed

    Xu, G; Hughes-Oliver, J M; Brooks, J D; Yeatts, J L; Baynes, R E

    2013-01-01

    Quantitative structure-activity relationship (QSAR) models are being used increasingly in skin permeation studies. The main idea of QSAR modelling is to quantify the relationship between biological activities and chemical properties, and thus to predict the activity of chemical solutes. As a key step, the selection of a representative and structurally diverse training set is critical to the prediction power of a QSAR model. Early QSAR models selected training sets in a subjective way and solutes in the training set were relatively homogenous. More recently, statistical methods such as D-optimal design or space-filling design have been applied but such methods are not always ideal. This paper describes a comprehensive procedure to select training sets from a large candidate set of 4534 solutes. A newly proposed 'Baynes' rule', which is a modification of Lipinski's 'rule of five', was used to screen out solutes that were not qualified for the study. U-optimality was used as the selection criterion. A principal component analysis showed that the selected training set was representative of the chemical space. Gas chromatograph amenability was verified. A model built using the training set was shown to have greater predictive power than a model built using a previous dataset [1].
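    U-optimal design itself is detailed in the paper; as a rough, hypothetical illustration of the underlying goal of selecting a structurally diverse training set from a large candidate pool, a greedy maximin (space-filling) selection can be sketched in a few lines. Function names and data are illustrative, and this heuristic is not U-optimality:

```python
import math

def maximin_select(candidates, n_train):
    """Greedily pick n_train points (chemical descriptor vectors) from a
    pool so that each new point maximizes its distance to the set already
    selected -- a simple space-filling heuristic for training-set diversity."""
    dim = len(candidates[0])
    centroid = [sum(p[i] for p in candidates) / len(candidates) for i in range(dim)]
    # seed with the point farthest from the pool centroid
    selected = [max(candidates, key=lambda p: math.dist(p, centroid))]
    remaining = [p for p in candidates if p is not selected[0]]
    while len(selected) < n_train:
        best = max(remaining, key=lambda p: min(math.dist(p, s) for s in selected))
        selected.append(best)
        remaining.remove(best)
    return selected
```

    On a toy 2D pool the heuristic recovers the corners of the descriptor space, which is the qualitative behaviour one wants from a diverse training set.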

  6. A constitutive law for dense granular flows.

    PubMed

    Jop, Pierre; Forterre, Yoël; Pouliquen, Olivier

    2006-06-08

    A continuum description of granular flows would be of considerable help in predicting natural geophysical hazards or in designing industrial processes. However, the constitutive equations for dry granular flows, which govern how the material moves under shear, are still a matter of debate. One difficulty is that grains can behave like a solid (in a sand pile), a liquid (when poured from a silo) or a gas (when strongly agitated). For the two extreme regimes, constitutive equations have been proposed based on kinetic theory for collisional rapid flows, and soil mechanics for slow plastic flows. However, the intermediate dense regime, where the granular material flows like a liquid, still lacks a unified view and has motivated many studies over the past decade. The main characteristics of granular liquids are: a yield criterion (a critical shear stress below which flow is not possible) and a complex dependence on shear rate when flowing. In this sense, granular matter shares similarities with classical visco-plastic fluids such as Bingham fluids. Here we propose a new constitutive relation for dense granular flows, inspired by this analogy and recent numerical and experimental work. We then test our three-dimensional (3D) model through experiments on granular flows on a pile between rough sidewalls, in which a complex 3D flow pattern develops. We show that, without any fitting parameter, the model gives quantitative predictions for the flow shape and velocity profiles. Our results support the idea that a simple visco-plastic approach can quantitatively capture granular flow properties, and could serve as a basic tool for modelling more complex flows in geophysical or industrial applications.
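    The relation proposed here is the now-widely-used mu(I) rheology. A minimal sketch of the friction law, with parameter values typical of glass-bead fits (illustrative assumptions, not quoted from the paper's tables):

```python
import math

def inertial_number(shear_rate, d, pressure, rho):
    """Dimensionless inertial number I = gamma_dot * d / sqrt(P / rho),
    where d is the grain diameter and rho the grain density."""
    return shear_rate * d / math.sqrt(pressure / rho)

def mu_of_I(I, mu_s=0.38, mu_2=0.64, I0=0.279):
    """Friction coefficient mu(I) = mu_s + (mu_2 - mu_s) / (1 + I0 / I):
    it starts at the yield value mu_s for slow flows and saturates at mu_2
    for rapid flows. The shear stress then follows tau = mu(I) * P."""
    return mu_s + (mu_2 - mu_s) / (1.0 + I0 / I)
```

    The two limits encode the paper's main ingredients: a yield criterion (mu -> mu_s as I -> 0) and a rate-dependent, visco-plastic response at finite shear rate.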

  7. Speech motor development: Integrating muscles, movements, and linguistic units.

    PubMed

    Smith, Anne

    2006-01-01

    A fundamental problem for those interested in human communication is to determine how ideas and the various units of language structure are communicated through speaking. The physiological concepts involved in the control of muscle contraction and movement are theoretically distant from the processing levels and units postulated to exist in language production models. A review of the literature on adult speakers suggests that they engage complex, parallel processes involving many units, including sentence, phrase, syllable, and phoneme levels. Infants must develop multilayered interactions among language and motor systems. This discussion describes recent studies of speech motor performance relative to varying linguistic goals during the childhood, teenage, and young adult years. Studies of the developing interactions between speech motor and language systems reveal both qualitative and quantitative differences between the developing and the mature systems. These studies provide an experimental basis for a more comprehensive theoretical account of how mappings between units of language and units of action are formed and how they function. Readers will be able to: (1) understand the theoretical differences between models of speech motor control and models of language processing, as well as the nature of the concepts used in the two different kinds of models, (2) explain the concept of coarticulation and state why this phenomenon has confounded attempts to determine the role of linguistic units, such as syllables and phonemes, in speech production, (3) describe the development of speech motor performance skills and specify quantitative and qualitative differences between speech motor performance in children and adults, and (4) describe experimental methods that allow scientists to study speech and limb motor control, as well as compare units of action used to study non-speech and speech movements.

  8. Analogical Scaffolding and the Learning of Abstract Ideas in Physics: An Example from Electromagnetic Waves

    ERIC Educational Resources Information Center

    Podolefsky, Noah S.; Finkelstein, Noah D.

    2007-01-01

    This paper describes a model of analogy, analogical scaffolding, which explains present and prior results of student learning with analogies. We build on prior models of representation, blending, and layering of ideas. Extending this model's explanatory power, we propose ways in which the model can be applied to design a curriculum directed at…

  9. Markov model plus k-word distributions: a synergy that produces novel statistical measures for sequence comparison.

    PubMed

    Dai, Qi; Yang, Yanchun; Wang, Tianming

    2008-10-15

    Many proposed statistical measures can efficiently compare biological sequences to further infer their structures, functions and evolutionary information. They are related in spirit because all the ideas for sequence comparison try to use the information on the k-word distributions, the Markov model or both. Motivated by adding k-word distributions to the Markov model directly, we investigated two novel statistical measures for sequence comparison, called wre.k.r and S2.k.r. The proposed measures were tested by similarity search, evaluation on functionally related regulatory sequences and phylogenetic analysis. This offers a systematic and quantitative experimental assessment of our measures. Moreover, we compared our results with those of alignment-based and alignment-free methods. We grouped our experiments into two sets. The first one, performed via ROC (receiver operating characteristic) analysis, aims at assessing the intrinsic ability of our statistical measures to search for similar sequences in a database and to discriminate functionally related regulatory sequences from unrelated sequences. The second one aims at assessing how well our statistical measures perform in phylogenetic analysis. The experimental assessment demonstrates that our similarity measures, which incorporate k-word distributions into a Markov model, are more efficient.
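    The exact wre.k.r and S2.k.r formulas are defined in the paper; their shared ingredients, k-word counts and Markov-model expected counts estimated from the sequence itself, can be sketched as follows (DNA alphabet assumed, names illustrative):

```python
from itertools import product

def kword_counts(seq, k):
    """Observed counts of every k-word over the DNA alphabet in seq."""
    counts = {"".join(w): 0 for w in product("ACGT", repeat=k)}
    for i in range(len(seq) - k + 1):
        counts[seq[i:i + k]] += 1
    return counts

def markov1_expected(seq, k):
    """Expected k-word counts under a first-order Markov model estimated
    from the sequence itself: E[w] = c(w1 w2) * prod_i c(wi wi+1) / c(wi)."""
    c1 = kword_counts(seq, 1)
    c2 = kword_counts(seq, 2)
    expected = {}
    for w in kword_counts(seq, k):
        e = float(c2[w[0:2]])
        for i in range(1, k - 1):
            denom = c1[w[i]]
            e = e * c2[w[i:i + 2]] / denom if denom else 0.0
        expected[w] = e
    return expected
```

    A measure in this family then scores a sequence pair by comparing observed k-word vectors after correcting each by its Markov expectation.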

  10. Induced Currents in Multiple Resonant Scattering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruby, Stanley L

    We will describe here some results from a MRS scattering model designed to be appropriate for slow resonant scattering. This temporal model is based squarely on induced currents in individual nuclei; a natural consequence is that reradiation into 4π is natural, and does not involve special mechanisms like spin-flips or imperfections of the lattice. Driven by these ideas, we have been able to do experiments where the 4π-shine decay rate around the scattering (FS) slabs is measured simultaneously with the FS rate. Our SS scattering slabs are as simple as possible: no hyperfine fields, no crystal structure, and quite static in time. One obtains mainly the one important set of currents j_p, an associated FS field E_p, and finally an associated beamlike intensity R_fs(t). But in addition, each current, even j_p, contributes to the 4π-shine intensity. This gives quantitative agreement with R_4π(t), which is rather more complicated than the simple e^(-t) one might first expect. MRS predicts another set of currents j_u, with an associated 4π intensity R_4π(t); the subscripts u and p refer to unphased and phased currents, respectively. With static SS slabs, this branch is weak and can be neglected. Driven by these ideas, we have prepared scattering samples where the atoms holding the currents are being stirred about (by diffusion) rather rapidly. This provides a method for dephasing the j_p, but also provides a generation rate for the j_u. The experimental data are not of great quality at this early stage, but the present rough MRS calculations fit easily.

  11. Examining the Effects of Model-Based Inquiry on Conceptual Understanding and Engagement in Science

    NASA Astrophysics Data System (ADS)

    Baze, Christina L.

    Model-Based Inquiry (MBI) is an instructional model which engages students in the scientific practices of modeling, explanation, and argumentation while they work to construct explanations for natural phenomena. This instructional model has not been previously studied at the community college level. The purpose of this study is to better understand how MBI affects the development of community college students' conceptual understanding of evolution and engagement in the practices of science. Mixed-methods were employed to collect quantitative and qualitative data through the multiple-choice Concepts Inventory of Natural Selection, student artifacts, and semi-structured interviews. Participants were enrolled in Biology Concepts, an introductory class for non-science majors, at a small, rural community college in the southwestern United States. Preliminary data shows that conceptual understanding is not adversely affected by the implementation of MBI, and that students gain valuable insights into the practices of science. Specifically, students who participated in the MBI intervention group gained a better understanding of the role of models in explaining and predicting phenomena and experienced feeling ownership of their ideas, an appropriate depth of thinking, more opportunities for collaboration, and coherence and context within the unit. Implications of this study will be of interest to postsecondary science educators and researchers who seek to reform and improve science education.

  12. Achieving Liftoff

    ERIC Educational Resources Information Center

    Turley, Renee; Trotochaud, Alan; Campbell, Todd

    2016-01-01

    Sense-making has been described as working on and with ideas--both students' ideas and authoritative ideas in texts--to build coherent storylines, models, and/or explanations. This article describes the process for developing storyline units to support students' making sense of and explaining a rocket launch. The storyline approach, which aligns…

  13. He-3, Pierre Morel and Me—Early Work on Anisotropic Superfluidity

    NASA Astrophysics Data System (ADS)

    Anderson, Philip W.

    2011-08-01

    The idea that there are alternative, anisotropic solutions of the BCS equations, which might apply to He-3, surfaced independently in at least three places, one of which was Pierre Morel's thesis project (for the ENS, under me at Bell Labs). I was skeptical of quantitative estimates of transition temperatures and instead focused, with Pierre, on the conceptual and experimental properties of such states.

  14. Power Relations: Their Embodiment in Research.

    PubMed

    Florczak, Kristine L

    2016-07-01

    The purpose of this column is to consider the notion of power in research. To that end, the idea of power is considered from the perspective of philosophy and then linked to a nursing concept analysis that compares the differences between power over and power to. Then, the nature of power relations is compared and contrasted between quantitative and qualitative methodologies. © The Author(s) 2016.

  15. Qualitative to Quantitative and Spectrum to Report: An Instrument-Focused Research Methods Course for First-Year Students

    ERIC Educational Resources Information Center

    Thomas, Alyssa C.; Boucher, Michelle A.; Pulliam, Curtis R.

    2015-01-01

    Our Introduction to Research Methods course is a first-year majors course built around the idea of helping students learn to work like chemists, write like chemists, and think like chemists. We have developed this course as a hybrid hands-on/ lecture experience built around instrumentation use and report preparation. We take the product from one…

  16. The Role of Scientific Modeling Criteria in Advancing Students' Explanatory Ideas of Magnetism

    ERIC Educational Resources Information Center

    Cheng, Meng-Fei; Brown, David E.

    2015-01-01

    Student construction of models is a strong focus of current research and practice in science education. In order to study in detail the interactions between students' model generation and evaluation and their development of explanatory ideas to account for magnetic phenomena, a multi-session teaching experiment was conducted with a small number of…

  17. Advanced Civics for U.S. History Teachers: Professional Development Models Focusing on the Founding Documents. White Paper No. 139

    ERIC Educational Resources Information Center

    Lewis, Anders; Donovan, William

    2015-01-01

    The idea that the purpose of education, let alone history education, is to remove a student from the here and now and to get them to understand ideas and worlds beyond their immediate interests is anathema to proponents of today's trendy reform ideas. The idea, as well, that the stories of the past are intrinsically fascinating in and of…

  18. The Predicting Model of E-commerce Site Based on the Ideas of Curve Fitting

    NASA Astrophysics Data System (ADS)

    Tao, Zhang; Li, Zhang; Dingjun, Chen

    On the basis of the idea of second-order (quadratic) curve fitting, the number and scale of Chinese e-commerce sites are analyzed. A preventing-increase model is introduced in this paper, and the model parameters are solved with the Matlab software. The validity of the preventing-increase model is confirmed through a numerical experiment. The experimental results show that the precision of the preventing-increase model is ideal.
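    Assuming "second multiplication curve fitting" denotes second-order least squares, the fitting step can be sketched without Matlab via the normal equations (illustrative data, not the paper's site statistics):

```python
def quadratic_fit(xs, ys):
    """Least-squares fit of y = a + b*x + c*x**2 via the 3x3 normal
    equations, solved by Gaussian elimination (standard-library only)."""
    S = [sum(x**p for x in xs) for p in range(5)]            # moments of x
    T = [sum(y * x**p for x, y in zip(xs, ys)) for p in range(3)]
    A = [[S[0], S[1], S[2], T[0]],
         [S[1], S[2], S[3], T[1]],
         [S[2], S[3], S[4], T[2]]]                           # augmented matrix
    for col in range(3):                                     # forward elimination
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for row in range(col + 1, 3):
            f = A[row][col] / A[col][col]
            for c in range(col, 4):
                A[row][c] -= f * A[col][c]
    coeffs = [0.0, 0.0, 0.0]
    for row in (2, 1, 0):                                    # back substitution
        rhs = A[row][3] - sum(A[row][c] * coeffs[c] for c in range(row + 1, 3))
        coeffs[row] = rhs / A[row][row]
    return coeffs  # [a, b, c]
```

    On noise-free data the fit recovers the generating coefficients exactly, which is a convenient sanity check before fitting real growth data.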

  19. Assessment of access to primary health care among children and adolescents hospitalized due to avoidable conditions.

    PubMed

    Ferrer, Ana Paula Scoleze; Grisi, Sandra Josefina Ferraz Ellero

    2016-09-01

    Hospitalizations for ambulatory care-sensitive conditions (HACSC) are considered an indicator of the effectiveness of primary health care (PHC). High rates of HACSC represent problems in access to, or the quality of, health care. In Brazil, HACSC rates are high and there are few studies on the factors associated with them. The objective was to evaluate the access to PHC offered to children and adolescents hospitalized due to ACSC and to analyze its conditioning factors. Cross-sectional study with a quantitative and qualitative approach. Five hundred and one (501) users (guardians/caregivers) and 42 professionals of PHC units were interviewed over one year. Quantitative data were obtained using the Primary Care Assessment Tool validated in Brazil (PCATool-Brazil), while qualitative data were collected by semi-structured interview. The independent variables were: age, maternal education, family income, type of diagnosis, and model of care offered; the dependent variables were access and its components (accessibility and use of services). Sixty-five percent (65.2%) of hospitalizations were for ACSC. From the perspective of both users and professionals, access and its components presented low scores. Age, type of diagnosis, and model of care affected the results. The proportion of HACSC was high in this population. Access to services is inadequate due to barriers to access, overvaluation of emergency services, and attitudes towards health needs. Professional attitudes and opinions reinforce users' inadequate ideas, which is reflected in the pattern of service use.

  20. IDEA: Planning at the Core of Autonomous Reactive Agents

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Dorais, Gregory A.; Fry, Chuck; Levinson, Richard; Plaunt, Christian; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Several successful autonomous systems are separated into technologically diverse functional layers operating at different levels of abstraction. This diversity makes them difficult to implement and validate. In this paper, we present IDEA (Intelligent Distributed Execution Architecture), a unified planning and execution framework. In IDEA a layered system can be implemented as separate agents, one per layer, each representing its interactions with the world in a model. At all levels, the model representation primitives and their semantics are the same. Moreover, each agent relies on a single model, plan database, and plan runner, and on a variety of planners, both reactive and deliberative. The framework allows the specification of agents that operate within a guaranteed reaction time and supports flexible specification of reactive vs. deliberative agent behavior. Within the IDEA framework we are working to fully duplicate the functionality of the DS1 Remote Agent and extend it to domains of higher complexity than autonomous spacecraft control.

  1. Modeling Mathematical Ideas: Developing Strategic Competence in Elementary and Middle School

    ERIC Educational Resources Information Center

    Suh, Jennifer M.; Seshaiyer, Padmanabhan

    2016-01-01

    "Modeling Mathematical Ideas" combines current research and practical strategies to build teachers' and students' strategic competence in problem solving. This must-have book supports teachers in understanding learning progressions that address conceptual guiding posts as well as students' common misconceptions in investigating and…

  2. Knowledge-based modelling of historical surfaces using lidar data

    NASA Astrophysics Data System (ADS)

    Höfler, Veit; Wessollek, Christine; Karrasch, Pierre

    2016-10-01

    Currently in archaeological studies digital elevation models are mainly used, especially in the form of shaded reliefs, for the prospection of archaeological sites. Hesse (2010) provides a supporting software tool for the determination of local relief models during prospection using LiDAR scans; the search for relicts from WW2 is also a focus of his research. In James et al. (2006) determined contour lines were used to reconstruct the locations of archaeological artefacts such as buildings. This study goes further and presents an innovative workflow for determining historical high-resolution terrain surfaces using recent high-resolution terrain models and sedimentological expert knowledge. Based on archaeological field studies (Franconian Saale near Bad Neustadt in Germany), the sedimentological analyses show that archaeologically interesting horizons and geomorphological expert knowledge, in combination with particle-size analyses (Koehn, DIN ISO 11277), are useful components for reconstructing surfaces of the early Middle Ages. Furthermore, the paper traces how additional information extracted from a recent digital terrain model can support the determination of historical surfaces. Conceptually, this research is based on the methodology of geomorphometry and geostatistics. The basic idea is that the working procedure branches by input data: one branch tracks the quantitative data and the other processes the qualitative data. The quantitative data are thus available for further processing and are later combined with the qualitative data to convert them to historical heights. In the final stage of the workflow all gathered information is stored in a large data matrix for spatial interpolation using the geostatistical method of Kriging. Besides the historical surface, the algorithm also provides a first estimate of the accuracy of the modelling. The presented workflow is characterized by high flexibility and the opportunity to include newly available data in the process at any time.
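    The final Kriging step can be sketched in miniature. The covariance model and parameters here are illustrative assumptions, not those of the study:

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def ordinary_kriging(points, values, target, sill=1.0, rng=1.0):
    """Ordinary kriging with an exponential covariance C(h) = sill*exp(-h/rng).
    The weights solve the kriging system with a Lagrange multiplier that
    enforces sum(w) = 1 (unbiasedness)."""
    n = len(points)
    cov = lambda a, b: sill * math.exp(-math.dist(a, b) / rng)
    A = [[cov(points[i], points[j]) for j in range(n)] + [1.0] for i in range(n)]
    A.append([1.0] * n + [0.0])
    b = [cov(p, target) for p in points] + [1.0]
    w = solve(A, b)[:n]
    return sum(wi * vi for wi, vi in zip(w, values))
```

    Kriging is an exact interpolator: estimating at a data point returns that point's value, a property a workflow like this one can use as a self-check before predicting historical heights between boreholes.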

  3. Concept mapping as an approach for expert-guided model building: The example of health literacy.

    PubMed

    Soellner, Renate; Lenartz, Norbert; Rudinger, Georg

    2017-02-01

    Concept mapping served as the starting point for the aim of capturing the comprehensive structure of the construct of 'health literacy.' Ideas about health literacy were generated by 99 experts and resulted in 105 statements that were subsequently organized by 27 experts in an unstructured card sorting. Multidimensional scaling was applied to the sorting data, and two- and three-dimensional solutions were computed. The three-dimensional solution was used in a subsequent cluster analysis and resulted in a concept map of nine clusters: (1) self-regulation, (2) self-perception, (3) proactive approach to health, (4) basic literacy and numeracy skills, (5) information appraisal, (6) information search, (7) health care system knowledge and acting, (8) communication and cooperation, and (9) beneficial personality traits. Subsequently, this concept map served as a starting point for developing a "qualitative" structural model of health literacy and a questionnaire for the measurement of health literacy. On the basis of questionnaire data, a "quantitative" structural model was created by first applying exploratory factor analyses (EFA) and then cross-validating the model with confirmatory factor analyses (CFA). Concept mapping proved to be a highly valuable tool for the process of model building up to translational research in the "real world". Copyright © 2016 Elsevier Ltd. All rights reserved.
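    The card-sorting analysis can be sketched in miniature. The study applied multidimensional scaling before clustering; for brevity this hypothetical sketch clusters the sorting dissimilarities directly:

```python
def cosort_dissimilarity(sortings, n_items):
    """Dissimilarity between two statements = fraction of sorters who put
    them in different piles. Each sorting is a list of piles (sets of ids)."""
    together = [[0] * n_items for _ in range(n_items)]
    for piles in sortings:
        for pile in piles:
            for a in pile:
                for b in pile:
                    together[a][b] += 1
    m = len(sortings)
    return [[1.0 - together[a][b] / m for b in range(n_items)]
            for a in range(n_items)]

def average_linkage(D, n_clusters):
    """Agglomerative clustering: repeatedly merge the two clusters with the
    smallest average pairwise dissimilarity until n_clusters remain."""
    clusters = [[i] for i in range(len(D))]
    def avg(ca, cb):
        return sum(D[a][b] for a in ca for b in cb) / (len(ca) * len(cb))
    while len(clusters) > n_clusters:
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: avg(clusters[ij[0]], clusters[ij[1]]))
        clusters[i] += clusters.pop(j)
    return clusters
```

    Statements that many sorters pile together end up in the same cluster, which is the mechanism that produced the nine-cluster concept map.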

  4. Modifying ``Six Ideas that Shaped Physics'' for a Life-Science major audience at Hope College

    NASA Astrophysics Data System (ADS)

    Mader, Catherine

    2005-04-01

    The ``Six Ideas That Shaped Physics'' textbook has been adapted for use in the algebra-based introductory physics course for non-physics science majors at Hope College. The results of the first use will be presented. FCI pre- and post-test scores will be compared with results from 8 years of data from both the algebra-based course and the calculus-based course (dating from when we first adopted ``Six Ideas That Shaped Physics'' for the calculus-based course). In addition, comparisons of quantitative tests and homework problems with prior student groups will also be made. Because a large fraction of the audience in the algebra-based course is life-science majors, a goal of this project is to make the material relevant for these students. Supplemental materials that emphasize the connection between the life sciences and the fundamental physics concepts are being developed to accompany the new textbook. Samples of these materials and how they were used (and received) during class testing will be presented.

  5. Does drywall installers' innovative idea reduce the ergonomic exposures of ceiling installation: A field case study.

    PubMed

    Dasgupta, Priyadarshini Sengupta; Punnett, Laura; Moir, Susan; Kuhn, Sarah; Buchholz, Bryan

    2016-07-01

    The study was conducted to assess an intervention suggested by the workers to reduce the physical or ergonomic exposures of the drywall installation task. The drywall installers were asked to brainstorm on innovative ideas that could reduce their ergonomic exposures during the drywall installation work. The workers proposed the idea of using a 'deadman' (narrow panel piece) to hold the panels to the ceiling while installing them. The researcher collected quantitative exposure data (PATH, 3DSSPP) at the baseline and intervention phases and compared the phases to find out any change in the exposure while using the 'deadman'. Results showed that ergonomic exposures (such as overhead arm and awkward trunk postures and heavy load handling) were reduced at the intervention phase while using the 'deadman' with an electrically operated lift. The concept of the 'deadman', which was shown to help reduce musculoskeletal exposures during ceiling installation, can be used to fabricate a permanent ergonomic tool to support the ceiling drywall panel. Copyright © 2016 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  6. Stable elastic knots with no self-contact

    NASA Astrophysics Data System (ADS)

    Moulton, Derek E.; Grandgeorge, Paul; Neukirch, Sébastien

    2018-07-01

    We study an elastic rod bent into an open trefoil knot and clamped at both ends. The question we consider is whether there are stable configurations for which there are no points of self-contact. This idea can be fairly easily replicated with a thin strip of paper, but is more difficult or even impossible with a flexible wire. We search for such configurations within the space of three tuning parameters related to the degrees of freedom in a simple experiment. Mathematically, we show, both within standard Kirchhoff theory as well within an elastic strip theory, that stable and contact-free knotted configurations can be found, and we classify the corresponding parametric regions. Numerical results are complemented with an asymptotic analysis that demonstrates the presence of knots near the doubly-covered ring. In the case of the strip model, quantitative experiments of the region of good knots are also provided to validate the theory.

  7. Knowledge-base for interpretation of cerebrospinal fluid data patterns. Essentials in neurology and psychiatry.

    PubMed

    Reiber, Hansotto

    2016-06-01

    The physiological and biophysical knowledge base for the interpretation of cerebrospinal fluid (CSF) data, together with reference ranges, is essential for the clinical pathologist and neurochemist. With the now-established description of the CSF-flow-dependent barrier function, the dynamics and concentration gradients of blood-derived, brain-derived and leptomeningeal proteins in CSF, and the specificity-independent functions of B-lymphocytes in the brain, the neurologist, psychiatrist, neurosurgeon and neuropharmacologist may also find essentials for diagnosis, research or the development of therapies. This review may help to replace outdated ideas such as "leakage" models of the barriers, linear immunoglobulin index interpretations and CSF electrophoresis. Calculations, interpretations and analytical pitfalls are described for albumin quotients, the quantitation of immunoglobulin synthesis in Reibergrams, oligoclonal IgG, IgM analysis, the polyspecific (MRZ) antibody reaction, the statistical treatment of CSF data and general quality assessment in the CSF laboratory. The diagnostic relevance is documented in an accompanying review.
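    The basic quotient arithmetic behind such interpretations can be sketched as follows. The Reibergram discrimination-line constants are quoted from Reiber's published IgG formula (an assumption worth verifying against the review), and the linear index is included only for contrast, since the review regards purely linear interpretation as outdated:

```python
import math

def albumin_quotient(alb_csf, alb_serum):
    """QAlb = CSF albumin / serum albumin (same units for both)."""
    return alb_csf / alb_serum

def igg_index(igg_csf, igg_serum, alb_csf, alb_serum):
    """Linear IgG (Link) index QIgG / QAlb -- shown here for contrast;
    the review argues for the nonlinear Reibergram instead."""
    return (igg_csf / igg_serum) / albumin_quotient(alb_csf, alb_serum)

def q_lim_igg(q_alb):
    """Upper discrimination line of the IgG Reibergram,
    QLim = 0.93 * sqrt(QAlb^2 + 6e-6) - 1.7e-3 (constants as published by
    Reiber); QIgG values above this line indicate intrathecal synthesis."""
    return 0.93 * math.sqrt(q_alb**2 + 6e-6) - 1.7e-3
```

    Note the hyperbolic, not linear, dependence of QLim on QAlb: at low QAlb the line flattens, which is exactly where a linear index misclassifies patients.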

  8. Floral organ MADS-box genes in Cercidiphyllum japonicum (Cercidiphyllaceae): Implications for systematic evolution and bracts definition.

    PubMed

    Jin, Yupei; Wang, Yubing; Zhang, Dechun; Shen, Xiangling; Liu, Wen; Chen, Faju

    2017-01-01

    The dioecious relic Cercidiphyllum japonicum is one of two species of the sole genus Cercidiphyllum, with a tight inflorescence lacking an apparent perianth structure. In addition, its systematic placement has been much debated and, so far, research has mainly focused on its morphology and chloroplast genes. In our investigation, we identified 10 floral organ identity genes, including four A-class, three B-class, two C-class and one D-class. Phylogenetic analyses showed that all ten genes group with Saxifragales plants, which confirmed the phylogenetic placement of C. japonicum. Expression patterns of these genes were examined by quantitative reverse transcriptase PCR, with some variations that did not completely coincide with the ABCDE model, suggesting some subfunctionalization. Our morphological and molecular analyses of Cercidiphyllum japonicum also support the idea that the bract is actually a perianth.

  9. Floral organ MADS-box genes in Cercidiphyllum japonicum (Cercidiphyllaceae): Implications for systematic evolution and bracts definition

    PubMed Central

    Zhang, Dechun; Shen, Xiangling; Chen, Faju

    2017-01-01

    The dioecious relic Cercidiphyllum japonicum is one of two species of the sole genus Cercidiphyllum, with a tight inflorescence lacking an apparent perianth structure. In addition, its systematic placement has been much debated and, so far, research has mainly focused on its morphology and chloroplast genes. In our investigation, we identified 10 floral organ identity genes, including four A-class, three B-class, two C-class and one D-class. Phylogenetic analyses showed that all ten genes group with Saxifragales plants, which confirmed the phylogenetic placement of C. japonicum. Expression patterns of these genes were examined by quantitative reverse transcriptase PCR, with some variations that did not completely coincide with the ABCDE model, suggesting some subfunctionalization. Our morphological and molecular analyses of Cercidiphyllum japonicum also support the idea that the bract is actually a perianth. PMID:28562649

  10. Data mining for materials design: A computational study of single molecule magnet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dam, Hieu Chi; Faculty of Physics, Vietnam National University, 334 Nguyen Trai, Hanoi; Pham, Tien Lam

    2014-01-28

    We develop a method that combines data mining and first-principles calculation to guide the design of distorted cubane Mn(IV)Mn(III)3 single molecule magnets. The essential idea of the method is a process consisting of sparse regressions and cross-validation for analyzing calculated data of the materials. The method allows us to demonstrate that the exchange coupling between Mn(IV) and Mn(III) ions can be predicted from the electronegativities of the constituent ligands and the structural features of the molecule by a linear regression model with high accuracy. The relations between the structural features and magnetic properties of the materials are quantitatively and consistently evaluated and presented by a graph. We also discuss the properties of the materials and guide the material design based on the obtained results.
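    A minimal sketch of the general recipe (a sparse linear model checked by cross-validation) on synthetic data; the coordinate-descent LASSO and the data are illustrative, not the authors' code:

```python
def lasso_cd(X, y, lam, n_iter=200):
    """L1-regularised least squares via cyclic coordinate descent:
    minimise (1/2n)*||y - X.b||^2 + lam*||b||_1 with soft-thresholding."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # partial residual excluding feature j
            r = [y[i] - sum(X[i][k] * beta[k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n)) / n
            z = sum(X[i][j] ** 2 for i in range(n)) / n
            if rho > lam:
                beta[j] = (rho - lam) / z
            elif rho < -lam:
                beta[j] = (rho + lam) / z
            else:
                beta[j] = 0.0       # feature dropped: this is the sparsity
    return beta

def loo_cv_error(X, y, lam):
    """Leave-one-out cross-validation error for a given penalty lam."""
    err = 0.0
    for i in range(len(X)):
        b = lasso_cd(X[:i] + X[i + 1:], y[:i] + y[i + 1:], lam)
        pred = sum(X[i][j] * b[j] for j in range(len(b)))
        err += (y[i] - pred) ** 2
    return err / len(X)
```

    Scanning lam with loo_cv_error is the cross-validation step; the surviving nonzero coefficients identify which descriptors (here, hypothetically, ligand electronegativities and structural features) actually predict the coupling.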

  11. Mitochondrial network complexity emerges from fission/fusion dynamics.

    PubMed

    Zamponi, Nahuel; Zamponi, Emiliano; Cannas, Sergio A; Billoni, Orlando V; Helguera, Pablo R; Chialvo, Dante R

    2018-01-10

    Mitochondrial networks exhibit a variety of complex behaviors, including coordinated cell-wide oscillations of energy states as well as a phase transition (depolarization) in response to oxidative stress. Since functional and structural properties are often intertwined, here we characterized the structure of mitochondrial networks in mouse embryonic fibroblasts using network tools and percolation theory. Subsequently, we perturbed the system either by promoting the fusion of mitochondrial segments or by inducing mitochondrial fission. Quantitative analysis of mitochondrial clusters revealed that structural parameters of healthy mitochondria lay between the extremes of highly fragmented and fully fused networks. We confirmed our results by contrasting our empirical findings with the predictions of a recently described computational model of mitochondrial network emergence based on fission-fusion kinetics. Altogether these results offer not only an objective methodology to parametrize the complexity of this organelle but also support the idea that mitochondrial networks behave as critical systems and undergo structural phase transitions.
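
    The percolation-style structural analysis mentioned above boils down to measuring connected-cluster sizes. A minimal sketch (an assumed simplification to a binary grid with 4-connectivity, not the paper's image pipeline) showing how fragmented and fused networks separate cleanly by this statistic:

```python
# Label connected clusters in a binary grid by flood fill and report the
# cluster-size distribution, the basic quantity in a percolation analysis.

def cluster_sizes(grid):
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and not seen[r][c]:
                # flood fill from (r, c) over 4-connected neighbors
                stack, size = [(r, c)], 0
                seen[r][c] = True
                while stack:
                    i, j = stack.pop()
                    size += 1
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if (0 <= ni < rows and 0 <= nj < cols
                                and grid[ni][nj] and not seen[ni][nj]):
                            seen[ni][nj] = True
                            stack.append((ni, nj))
                sizes.append(size)
    return sorted(sizes, reverse=True)

# fragmented network: many small clusters; fused network: one large cluster
fragmented = [[1,0,1,0],[0,0,0,0],[1,0,1,0],[0,0,0,0]]
fused      = [[1,1,1,1],[1,0,0,1],[1,0,0,1],[1,1,1,1]]
print(cluster_sizes(fragmented))  # [1, 1, 1, 1]
print(cluster_sizes(fused))       # [12]
```

Healthy networks, per the abstract, would sit between these two extremes.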

  12. Constrained vertebrate evolution by pleiotropic genes.

    PubMed

    Hu, Haiyang; Uesaka, Masahiro; Guo, Song; Shimai, Kotaro; Lu, Tsai-Ming; Li, Fang; Fujimoto, Satoko; Ishikawa, Masato; Liu, Shiping; Sasagawa, Yohei; Zhang, Guojie; Kuratani, Shigeru; Yu, Jr-Kai; Kusakabe, Takehiro G; Khaitovich, Philipp; Irie, Naoki

    2017-11-01

    Despite morphological diversification of chordates over 550 million years of evolution, their shared basic anatomical pattern (or 'bodyplan') remains conserved by unknown mechanisms. The developmental hourglass model attributes this to phylum-wide conserved, constrained organogenesis stages that pattern the bodyplan (the phylotype hypothesis); however, there has been no quantitative testing of this idea with a phylum-wide comparison of species. Here, based on data from early-to-late embryonic transcriptomes collected from eight chordates, we suggest that the phylotype hypothesis would be better applied to vertebrates than chordates. Furthermore, we found that vertebrates' conserved mid-embryonic developmental programmes are intensively recruited to other developmental processes, and the degree of the recruitment positively correlates with their evolutionary conservation and essentiality for normal development. Thus, we propose that the intensively recruited genetic system during vertebrates' organogenesis period imposed constraints on its diversification through pleiotropic constraints, which ultimately led to the common anatomical pattern observed in vertebrates.

  13. Role of Discrepant Questioning Leading to Model Element Modification

    ERIC Educational Resources Information Center

    Rea-Ramirez, Mary Anne; Nunez-Oviedo, Maria Cecilia; Clement, John

    2009-01-01

    Discrepant questioning is a teaching technique that can help students "unlearn" misconceptions and process science ideas for deep understanding. Discrepant questioning is a technique in which teachers question students in a way that requires them to examine their ideas or models, without giving information prematurely to the student or passing…

  14. Children's Models about Colours in Nahuatl-Speaking Communities

    ERIC Educational Resources Information Center

    Gallegos-Cázares, Leticia; Flores-Camacho, Fernando; Calderón-Canales, Elena; Perrusquía-Máximo, Elvia; García-Rivera, Beatriz

    2014-01-01

    This paper presents the development and structure of indigenous children's ideas about mixing colours as well as their ideas about each colour, derived from their traditions. The children were interviewed both at school and outside it, and an educational proposal was implemented. Ideas expressed in the school context were analysed using the…

  15. Steady and transient sliding under rate-and-state friction

    NASA Astrophysics Data System (ADS)

    Putelat, Thibaut; Dawes, Jonathan H. P.

    2015-05-01

    The physics of dry friction is often modelled by assuming that static and kinetic frictional forces can be represented by a pair of coefficients usually referred to as μs and μk, respectively. In this paper we re-examine this discontinuous dichotomy and relate it quantitatively to the more general, and smooth, framework of rate-and-state friction. This is important because it enables us to link the ideas behind the widely used static and dynamic coefficients to the more complex concepts that lie behind the rate-and-state framework. Further, we introduce a generic framework for rate-and-state friction that unifies different approaches found in the literature. We consider specific dynamical models for the motion of a rigid block sliding on an inclined surface. In the Coulomb model with constant dynamic friction coefficient, sliding at constant velocity is not possible. In the rate-and-state formalism steady sliding states exist, and analysing their existence and stability enables us to show that the static friction coefficient μs should be interpreted as the local maximum at very small slip rates of the steady state rate-and-state friction law. Next, we revisit the often-cited experiments of Rabinowicz (J. Appl. Phys., 22:1373-1379, 1951). Rabinowicz further developed the idea of static and kinetic friction by proposing that the friction coefficient maintains its higher and static value μs over a persistence length before dropping to the value μk. We show that there is a natural identification of the persistence length with the distance that the block slips as measured along the stable manifold of the saddle point equilibrium in the phase space of the rate-and-state dynamics. This enables us explicitly to define μs in terms of the rate-and-state variables and hence link Rabinowicz's ideas to rate-and-state friction laws. 
This stable manifold naturally separates two basins of attraction in the phase space: initial conditions in the first one lead to the block eventually stopping, while in the second basin of attraction the sliding motion continues indefinitely. We show that a second definition of μs is possible, compatible with the first one, as the weighted average of the rate-and-state friction coefficient over the time the block is in motion.
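
    The steady-state law at the heart of this discussion can be evaluated directly. A hedged numerical sketch with illustrative parameter values (not taken from the paper), showing how velocity weakening makes low-rate friction exceed high-rate friction, i.e. how μs > μk emerges from the smooth rate-and-state framework:

```python
import math

# Steady-state rate-and-state friction: mu_ss(v) = mu0 + (a - b) ln(v / v0).
# For velocity-weakening parameters (a < b), friction at very small slip
# rates exceeds friction at fast sliding, recovering mu_s > mu_k.
# All parameter values below are illustrative.

def mu_ss(v, mu0=0.6, a=0.010, b=0.015, v0=1e-6):
    return mu0 + (a - b) * math.log(v / v0)

slow = mu_ss(1e-9)   # near-static slip rate -> plays the role of mu_s
fast = mu_ss(1e-3)   # kinetic sliding       -> plays the role of mu_k
print(slow > fast)   # True when a < b (velocity weakening)
```

In a full rate-and-state model the state variable evolves too; this sketch only shows the steady-state limit that the abstract identifies with the classical coefficients.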

  16. A numerical model of the physical and chemical evolution of Vesta

    NASA Astrophysics Data System (ADS)

    Mizzon, H.; Monnereau, M.; Toplis, M. J.; Prettyman, T. H.; McSween, H. Y.; Raymond, C. A.; Russell, C. T.

    2015-10-01

    Vesta is a 262 km radius asteroid that has been proposed as the parent body of the Howardite-Eucrite-Diogenite family of meteorites. The observations of the Dawn spacecraft confirm the idea that this protoplanet underwent magmatic differentiation, providing evidence for regions of the upper crust rich in basaltic (eucritic) lithologies, while regions that have experienced excavation related to large impacts (i.e. Rheasilvia) are richer in pyroxene-dominated (diogenitic) lithologies [1,2]. One of the most striking results of the Dawn mission is the absence of olivine at the near-surface, even in the deep Rheasilvia basin. This observation has been used to question the chondritic nature of bulk Vesta and/or question its status as an intact protoplanet [3]. From a geochemical point of view, the HED meteorites are consistent with chondritic precursors [4], but petrological models have met difficulties explaining both eucrites and diogenites in a unified way [5]. These models comprise two extreme endmember scenarios: the first considers the partial melting of the primitive mantle of Vesta, followed by melt extraction [6], while the second involves the solidification of an initially entirely molten magma ocean [7]. In the latter case, major-element chemistry of eucrites and diogenites can be reproduced [7], but not the extreme range of trace element concentrations observed in diogenites [8]. More importantly, the physics of melt migration seems to preclude the existence of a global magma ocean, assuming that 26Al is the only heat source capable of extensively producing melt in early small bodies. This is because plagioclase is one of the first phases to melt, thus early formed liquids are Al-rich. Rapid migration of such liquids redistributes 26Al, limiting melt production where liquid has been lost [9,10].
This idea was explored by [11], who qualitatively suggested that the first melts formed would migrate to the surface (as eucrites), while the lower mantle would become enriched in refractory olivine through its downward compaction. This last point potentially explains the lack of this mineral near Vesta's surface. The aim of the work presented here is to explore this idea quantitatively, by computing the mineralogy as a function of depth and time using a set of numerical solutions of the conservation equations and an appropriate phase diagram.

  17. Determining Whether Including Student-Generated Drawing before Narrative Writing Increases the Clarity and Description Used by Third-Grade Students

    ERIC Educational Resources Information Center

    Anderson, Susan M.

    2010-01-01

    The purpose of this quantitative study was to determine whether asking third grade students in one school district in the Metro Detroit area to draw a picture of a narrative story idea before writing about it would improve the points earned on the Michigan standardized writing assessment (MEAP test). One half, or four out of eight, of the…

  18. ["Human races": history of a dangerous illusion].

    PubMed

    Louryan, S

    2014-01-01

    The multiplication of offences prompted by racism and the increase in complaints about racism lead us to consider the illusory concept of "human races". This idea has run through history, reinforced by the discovery of remote tribes and human fossils and by the development of sociobiology and quantitative psychology. Deprived of any scientific basis, the theory of "races" must give way to the notions of genetic variation and the unity of mankind.

  19. Engaging Undergraduate Biology Students in Scientific Modeling: Analysis of Group Interactions, Sense-Making, and Justification

    PubMed Central

    Bierema, Andrea M.-K.; Schwarz, Christina V.; Stoltzfus, Jon R.

    2017-01-01

    National calls for improving science education (e.g., Vision and Change) emphasize the need to learn disciplinary core ideas through scientific practices. To address this need, we engaged small groups of students in developing diagrammatic models within two (one large-enrollment and one medium-enrollment) undergraduate introductory biology courses. During these activities, students developed scientific models of biological phenomena such as enhanced growth in genetically modified fish. To investigate whether undergraduate students productively engaged in scientific practices during these modeling activities, we recorded groups of students as they developed models and examined three characteristics: how students 1) interacted with one another, 2) made sense of phenomena, and 3) justified their ideas. Our analysis indicates that students spent most of the time on task, developing and evaluating their models. Moreover, they worked cooperatively to make sense of core ideas and justified their ideas to one another throughout the activities. These results demonstrate that, when provided with the opportunity to develop models during class, students in large-enrollment lecture courses can productively engage in scientific practices. We discuss potential reasons for these outcomes and suggest areas of future research to continue advancing knowledge regarding engaging students in scientific practices in large-enrollment lecture courses. PMID:29196429

  20. The Determinant Factors of Regional Development Toward Land Use Change in Deli Serdang

    NASA Astrophysics Data System (ADS)

    Lindarto, D.; Sirojuzilam; Badaruddin; Dwira

    2017-03-01

    The regional development concept of Mebidangro (Medan, Binjai, Deli Serdang, and Karo) creates a hinterland linking Medan city with Deli Serdang Regency, especially in Tembung village, Percut Sei Tuan District. The population structure of Tembung shows rural-urban change, visible in sprawling land use change. The aim of the study is to reveal the genius loci as one of the factors in land use change. The study used a quantitative approach to obtain variables describing the factors shaping land use change, complemented by a descriptive approach intended to give an idea, justification, and fact-finding with correct interpretation. Data were collected through purposive sampling of 300 respondents who had built houses between 2010 and 2014. Using the figure/ground overlay technique, scoring analysis, descriptive quantitative analysis and SEM (Structural Equation Modeling), we found that place character/genius loci (p=0.007) is potentially one of the main driving factors of land use change, alongside accessibility (p=0.039), infrastructure (p=0.005) and socio-economic factors (p=0.038); topography (p=0.663) was not a significant factor. The implication of these findings is that intensive control of space utilization is required, given the rapid land use transformation and its tendency toward the negative impacts of urban sprawl.

  1. Argument Based Science Inquiry (ABSI) Learning Model in Voltaic Cell Concept

    NASA Astrophysics Data System (ADS)

    Subarkah, C. Z.; Fadilah, A.; Aisyah, R.

    2017-09-01

    The Voltaic cell is a sub-concept of electrochemistry that learners find difficult to understand, which affects student activity in the learning process. Therefore, the Argument Based Science Inquiry (ABSI) learning model was applied to the Voltaic cell concept. This research aims to describe students' activities during the learning process using the ABSI model and to analyze students' competency in solving ABSI-based worksheets (LK) on the Voltaic cell concept. The method used was a mixed-method, quantitative-embedded design, with 39 second-semester students of the Chemistry Education study programme as subjects. Student activity was quite good during ABSI learning, and students' ability to complete the worksheet (LK) was good on average in every phase. The phase of exploration of post-instruction understanding was categorized as very good, while negotiation phase III (comparing science ideas to textbooks or other printed resources) reached only the 'sufficient' category. Thus, ABSI learning improved students' levels of activity and their competency in solving the ABSI-based worksheets (LK).

  2. A system dynamics optimization framework to achieve a population's desired average weight target

    NASA Astrophysics Data System (ADS)

    Abidin, Norhaslinda Zainal; Zulkepli, Jafri Haji; Zaibidi, Nerda Zura

    2017-11-01

    Obesity is becoming a serious problem in Malaysia, which has been rated as having the highest obesity prevalence among Asian countries. The aim of this paper is to propose a system dynamics (SD) optimization framework to achieve a population's desired weight target, based on changes in physical activity behavior and its association with weight and obesity. The system dynamics approach of stock-and-flow diagrams was used to quantitatively model the impact of both behaviors on the population's weight and obesity trends. This work brings these ideas together, highlighting the interdependence of the various aspects of eating and physical activity behavior within the complex human weight regulation system. The model was used as an experimentation vehicle to investigate the impacts of changes in physical activity on weight and the prevalence of obesity. This paper provides evidence of the usefulness of SD optimization as a strategic decision-making approach to assist in decisions related to obesity prevention. SD as applied in this research is relatively new in Malaysia and has high potential for application to any feedback model that addresses the behavioral causes of obesity.
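
    The stock-and-flow idea can be sketched in a few lines. This is a minimal illustration with textbook-style assumed parameters (roughly 7700 kcal stored per kg of body mass, ~22 kcal/kg/day resting expenditure), not the authors' calibrated model:

```python
# Body weight as a stock whose net daily flow is the energy imbalance
# between intake and expenditure; raising physical activity lowers the
# steady-state weight. Parameters are common approximations, not the
# paper's fitted values.

ENERGY_PER_KG = 7700.0  # kcal stored per kg of body mass (approximation)

def simulate_weight(w0, intake_kcal, activity_kcal, days):
    w = w0
    for _ in range(days):
        expenditure = 22.0 * w + activity_kcal   # resting + activity, kcal/day
        w += (intake_kcal - expenditure) / ENERGY_PER_KG  # daily net flow
    return w

sedentary = simulate_weight(90.0, 2400.0, 100.0, 365 * 3)
active    = simulate_weight(90.0, 2400.0, 500.0, 365 * 3)
print(round(sedentary, 1), round(active, 1))  # the active scenario ends lower
```

The feedback here (heavier bodies expend more at rest) is exactly the kind of loop that makes SD models useful for weight-target questions.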

  3. A Trade Secret Model for Genomic Biobanking

    PubMed Central

    Conley, John M.; Kenan, William Rand; Mitchell, Robert; Cadigan, R. Jean; Davis, Arlene M.; Dobson, Allison W.; Gladden, Ryan Q.

    2012-01-01

    Genomic biobanks present ethical challenges that are qualitatively unique and quantitatively unprecedented. Many critics have questioned whether the current system of informed consent can be meaningfully applied to genomic biobanking. Proposals for reform have come from many directions, but have tended to involve incremental change in current informed consent practice. This paper reports on our efforts to seek new ideas and approaches from those whom informed consent is designed to protect: research subjects. Our model emerged from semi-structured interviews with healthy volunteers who had been recruited to join either of two biobanks (some joined, some did not), and whom we encouraged to explain their concerns and how they understood the relationship between specimen contributors and biobanks. These subjects spoke about their DNA and the information it contains in ways that were strikingly evocative of the legal concept of the trade secret. They then described the terms and conditions under which they might let others study their DNA, and there was a compelling analogy to the commonplace practice of trade secret licensing. We propose a novel biobanking model based on this trade secret concept, and argue that it would be a practical, legal, and ethical improvement on the status quo. PMID:23061589

  4. Misconceptions About Sound Among Engineering Students

    NASA Astrophysics Data System (ADS)

    Pejuan, Arcadi; Bohigas, Xavier; Jaén, Xavier; Periago, Cristina

    2012-12-01

    Our first objective was to detect misconceptions about the microscopic nature of sound among senior university students enrolled in different engineering programmes (from chemistry to telecommunications). We sought to determine how these misconceptions are expressed (qualitative aspect) and, only very secondarily, to gain a general idea of the extent to which they are held (quantitative aspect). Our second objective was to explore other misconceptions about wave aspects of sound. We have also considered the degree of consistency in the model of sound used by each student. Forty students answered a questionnaire including open-ended questions. Based on their free, spontaneous answers, the main results were as follows: a large majority of students answered most of the questions regarding the microscopic model of sound according to the scientifically accepted model; however, only a small number answered consistently. The main model misconception found was the notion that sound is propagated through the travelling of air particles, even in solids. Misconceptions and mental-model inconsistencies tended to depend on the engineering programme in which the student was enrolled. However, students in general were inconsistent also in applying their model of sound to individual sound properties. The main conclusion is that our students have not truly internalised the scientifically accepted model that they have allegedly learnt. This implies a need to design learning activities that take these findings into account in order to be truly efficient.

  5. Mechanisms governing the visco-elastic responses of living cells assessed by foam and tensegrity models.

    PubMed

    Cañadas, P; Laurent, V M; Chabrand, P; Isabey, D; Wendling-Mansuy, S

    2003-11-01

    The visco-elastic properties of living cells, measured to date by various authors, vary considerably, depending on the experimental methods and/or on the theoretical models used. In the present study, two mechanisms thought to be involved in cellular visco-elastic responses were analysed, based on the idea that the cytoskeleton plays a fundamental role in cellular mechanical responses. For this purpose, the predictions of an open unit-cell model and a 30-element visco-elastic tensegrity model were tested, taking into consideration similar properties of the constitutive F-actin. The quantitative predictions of the time constant and viscosity modulus obtained by both models were compared with previously published experimental data obtained from living cells. The small viscosity modulus values (10(0)-10(3) Pa x s) predicted by the tensegrity model may reflect the combined contributions of the spatially rearranged constitutive filaments and the internal tension to the overall cytoskeleton response to external loading. In contrast, the high viscosity modulus values (10(3)-10(5) Pa x s) predicted by the unit-cell model may rather reflect the mechanical response of the cytoskeleton to the bending of the constitutive filaments and/or to the deformation of internal components. The present results suggest the existence of a close link between the overall visco-elastic response of micromanipulated cells and the underlying architecture.

  6. Multimodal computational microscopy based on transport of intensity equation

    NASA Astrophysics Data System (ADS)

    Li, Jiaji; Chen, Qian; Sun, Jiasong; Zhang, Jialin; Zuo, Chao

    2016-12-01

    The transport of intensity equation (TIE) is a powerful tool for phase retrieval and quantitative phase imaging, which requires intensity measurements only at axially closely spaced planes, without a separate reference beam. It does not require coherent illumination and works well on conventional bright-field microscopes. The quantitative phase reconstructed by TIE gives valuable information that has been encoded in the complex wave field by passage through a sample of interest. Such information may provide tremendous flexibility to emulate various microscopy modalities computationally without requiring specialized hardware components. We develop the requisite theory to describe such a hybrid computational multimodal imaging system, which yields quantitative phase, Zernike phase contrast, differential interference contrast, and light field moment imaging simultaneously, making varied observations of biomedical samples straightforward. We then give an experimental demonstration of these ideas through time-lapse imaging of live HeLa cell mitosis. Experimental results verify that a tunable lens-based TIE system, combined with the appropriate postprocessing algorithm, can achieve a variety of promising imaging modalities in parallel with the quantitative phase images for the dynamic study of cellular processes.
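
    Under strong simplifying assumptions (uniform in-focus intensity and periodic boundaries), the TIE reduces to a Poisson equation that inverts in one FFT step. A hedged sketch of that core idea only, not of the paper's tunable-lens system or its post-processing:

```python
import numpy as np

# Toy TIE phase retrieval. With uniform intensity I0 the TIE reads
#   dI/dz = -(lam / 2 pi) * I0 * laplacian(phi),
# so phi follows from one inverse Laplacian in Fourier space. Real TIE
# pipelines must handle non-uniform intensity and boundary conditions.

def tie_solve(dIdz, I0, lam, dx):
    n = dIdz.shape[0]
    k = 2 * np.pi * np.fft.fftfreq(n, d=dx)
    kx, ky = np.meshgrid(k, k)
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                      # avoid divide-by-zero at DC
    rhs = np.fft.fft2(dIdz) * (2 * np.pi) / (lam * I0)
    phi_hat = rhs / k2                  # inverse Laplacian
    phi_hat[0, 0] = 0.0                 # phase is defined up to a constant
    return np.real(np.fft.ifft2(phi_hat))

# ground-truth sinusoidal phase; its axial intensity derivative is analytic
n, dx, lam, I0 = 64, 1.0, 0.5, 1.0
x = np.arange(n) * dx
phi = np.sin(2 * np.pi * 3 * x / (n * dx))[None, :] * np.ones((n, 1))
kx0 = 2 * np.pi * 3 / (n * dx)
dIdz = (lam / (2 * np.pi)) * I0 * (kx0**2) * phi   # = -(lam/2pi) I0 lap(phi)
phi_rec = tie_solve(dIdz, I0, lam, dx)
print(np.max(np.abs(phi_rec - phi)) < 1e-8)  # True
```

The reconstruction is exact here only because the toy phase is periodic and the intensity uniform; those are the assumptions the sketch trades away for brevity.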

  7. Modeling the Residual Strength of a Fibrous Composite Using the Residual Daniels Function

    NASA Astrophysics Data System (ADS)

    Paramonov, Yu.; Cimanis, V.; Varickis, S.; Kleinhofs, M.

    2016-09-01

    The concept of a residual Daniels function (RDF) is introduced. Together with the concept of a Daniels sequence, the RDF is used for estimating the residual (after some preliminary fatigue loading) static strength of a unidirectional fibrous composite (UFC) and its S-N curve on the basis of test data. Usually, the residual strength is analyzed on the basis of a known S-N curve. In our work, an inverse approach is used: the S-N curve is derived from an analysis of the residual strength. This approach gives a good qualitative description of the process of decreasing residual strength and explains the existence of the fatigue limit. The estimates of parameters of the corresponding regression model can be interpreted as estimates of parameters of the local strength of components of the UFC. In order to approach the quantitative experimental estimates of the fatigue life, some ideas based on the mathematics of semi-Markovian processes are employed. Satisfactory results are obtained in processing experimental data on the fatigue life and residual strength of glass/epoxy laminates.

  8. Internet dynamics

    NASA Astrophysics Data System (ADS)

    Lukose, Rajan Mathew

    The World Wide Web and the Internet are rapidly expanding spaces, of great economic and social significance, which offer an opportunity to study many phenomena, often previously inaccessible, on an unprecedented scale and resolution with relative ease. These phenomena are measurable on the scale of tens of millions of users and hundreds of millions of pages. By virtue of nearly complete electronic mediation, it is possible in principle to observe the time and ``spatial'' evolution of nearly all choices and interactions. This cyber-space therefore provides a view into a number of traditional research questions (from many academic disciplines) and creates its own new phenomena accessible for study. Despite its largely self-organized and dynamic nature, a number of robust quantitative regularities are found in the aggregate statistics of interesting and useful quantities. These regularities can be understood with the help of models that draw on ideas from statistical physics as well as other fields such as economics, psychology and decision theory. This thesis develops models that can account for regularities found in the statistics of Internet congestion and user surfing patterns and discusses some practical consequences.

  9. Forum: The challenge of global change

    NASA Astrophysics Data System (ADS)

    Roederer, Juan G.

    1990-09-01

    How can we sustain a public sense of the common danger of global change while remaining honest in view of the realities of scientific uncertainty? How can we nurture this sense of common danger without making statements based on half-baked ideas, statistically unreliable results, or oversimplified models? How can we strike a balance between the need to overstate a case to attract the attention of the media and the obligation to adhere strictly to the ethos of science?The task of achieving a scientific understanding of the inner workings of the terrestrial environment is one of the most difficult and ambitious endeavors of humankind. It is full of traps, temptations and deceptions for the participating scientists. We are dealing with a horrendously complex, strongly interactive, highly non-linear system. Lessons learned from disciplines such as plasma physics and solid state physics which have been dealing with complex non-linear systems for decades, are not very encouraging. The first thing one learns is that there are intrinsic, physical limits to the quantitative predictability of a complex system that have nothing to do with the particular techniques employed to model it.

  10. Effect of crystallographic orientations of grains on the global mechanical properties of steel sheets by depth sensing indentation

    NASA Astrophysics Data System (ADS)

    Burik, P.; Pesek, L.; Kejzlar, P.; Andrsova, Z.; Zubko, P.

    2017-01-01

    The main idea of this work is to use a physical model to prepare a virtual material with required properties. The model is based on the relationship between microstructure and mechanical properties. The macroscopic (global) mechanical properties of steel are highly dependent upon the microstructure, the crystallographic orientation of grains, the distribution of each phase present, and so on. In multiphase materials we need to know the local mechanical properties of each phase separately. At the scale of the grain size, local mechanical properties govern the behavior. Nanomechanical testing using depth sensing indentation (DSI) provides a straightforward solution for quantitatively characterizing each phase in the microstructure, as it is a very powerful technique for characterizing materials in small volumes. The aims of this experimental investigation are: (i) to establish how the mixing rule works for local mechanical properties (indentation hardness HIT) at the microstructure scale, using the DSI technique on steel sheets with different microstructures; (ii) to compare the measured global properties with properties obtained from the mixing rule; (iii) to analyze the effect of the crystallographic orientations of grains on the mixing rule.
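
    The linear rule of mixtures referred to in aim (ii) is simple to state. A back-of-envelope sketch with illustrative phase values, not measured data from the study:

```python
# Global indentation hardness of a multiphase steel estimated as the
# volume-fraction-weighted average of each phase's local hardness.

def mixing_rule(fractions, hardness):
    assert abs(sum(fractions) - 1.0) < 1e-9, "volume fractions must sum to 1"
    return sum(f * h for f, h in zip(fractions, hardness))

# e.g. a ferrite (soft) + martensite (hard) dual-phase microstructure,
# with hypothetical local hardness values in GPa
H_global = mixing_rule([0.8, 0.2], [2.0, 6.0])
print(H_global)  # 2.8
```

The study's point is precisely to test how far this linear estimate holds once grain orientation effects enter.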

  11. Experimental Fault Diagnosis in Systems Containing Finite Elements of Plate of Kirchhoff by Using State Observers Methodology

    NASA Astrophysics Data System (ADS)

    Alegre, D. M.; Koroishi, E. H.; Melo, G. P.

    2015-07-01

    This paper presents a methodology for the detection and localization of faults using state observers. State observers can reconstruct states that are not measured, or values at points of difficult access in the system, so faults at those points can be detected without direct measurement and tracked through the reconstruction of their states. The methodology is applied to a system representing a simplified model of a vehicle, in which the chassis is represented by a flat plate divided into finite Kirchhoff plate elements, together with the car suspension (springs and dampers). A test rig was built and the developed methodology was used to detect and locate faults in this system. In the analyses, the idea is to introduce a specific fault into the system and then use the state observers to locate it, checking for a quantitative variation in the system parameter that caused the fault. The computational simulations were performed in MATLAB.
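
    The observer-residual idea can be illustrated on a scalar toy system (the paper uses a Kirchhoff-plate vehicle model in MATLAB; the dynamics, gain, and fault values below are arbitrary illustrations):

```python
# A Luenberger observer tracks the plant with a fault-free model; the
# output residual stays at zero until a fault is injected, at which point
# it grows, flagging and timing the fault.

a, b, L = 0.9, 1.0, 0.5          # plant dynamics and observer gain
x, xhat = 0.0, 0.0
residuals = []
for k in range(40):
    u = 1.0
    fault = 0.8 if k >= 20 else 0.0      # additive actuator fault at step 20
    xhat_pred = a * xhat + b * u         # observer prediction (no fault model)
    x = a * x + b * (u + fault)          # true plant, fault appears at step 20
    r = x - xhat_pred                    # output residual used for detection
    residuals.append(abs(r))
    xhat = xhat_pred + L * r             # Luenberger correction step

before = max(residuals[:20])
after = max(residuals[20:])
print(before < 1e-9, after > 0.5)        # residual jumps only after the fault
```

In the full methodology the same residual logic runs over the reconstructed states of each plate element, so the fault can be localized as well as detected.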

  12. Telepsychiatry: an overview for psychiatrists.

    PubMed

    Hilty, Donald M; Luo, John S; Morache, Chris; Marcelo, Divine A; Nesbitt, Thomas S

    2002-01-01

    Telepsychiatry, in the form of videoconferencing and other modalities, brings enormous opportunities for clinical care, education, research and administration to the field of medicine. A comprehensive review of the literature related to telepsychiatry - specifically videoconferencing - was conducted using the MEDLINE, Embase, Science Citation Index, Social Sciences Citation Index and Telemedicine Information Exchange databases (1965 to June 2001). The keywords used were telepsychiatry, telemedicine, videoconferencing, Internet, primary care, education, personal digital assistant and handheld computers. Studies were selected for review if they discussed videoconferencing for patient care, satisfaction, outcomes, education and costs, and provided models of facilitating clinical service delivery. Literature on other technologies was also assessed and compared with telepsychiatry to provide an idea of future applications of technology. Published data indicate that telepsychiatry is successfully used for a variety of clinical services and educational initiatives. Telepsychiatry is generally feasible, offers a number of models of care and consultation, in general satisfies patients and providers, and has positive and negative effects on interpersonal behaviour. More quantitative and qualitative research is warranted with regard to the use of telepsychiatry in clinical and educational programmes and interventions.

  13. Beta-amyloid immunotherapy prevents synaptic degeneration in a mouse model of Alzheimer's disease.

    PubMed

    Buttini, Manuel; Masliah, Eliezer; Barbour, Robin; Grajeda, Henry; Motter, Ruth; Johnson-Wood, Kelly; Khan, Karen; Seubert, Peter; Freedman, Stephen; Schenk, Dale; Games, Dora

    2005-10-05

    Alzheimer's disease neuropathology is characterized by key features that include the deposition of the amyloid beta peptide (Abeta) into plaques, the formation of neurofibrillary tangles, and the loss of neurons and synapses in specific brain regions. The loss of synapses, and particularly the associated presynaptic vesicle protein synaptophysin in the hippocampus and association cortices, has been widely reported to be one of the most robust correlates of Alzheimer's disease-associated cognitive decline. The beta-amyloid hypothesis supports the idea that Abeta is the cause of these pathologies. However, the hypothesis is still controversial, in part because the direct role of Abeta in synaptic degeneration awaits confirmation. In this study, we show that Abeta reduction by active or passive Abeta immunization protects against the progressive loss of synaptophysin in the hippocampal molecular layer and frontal neocortex of a transgenic mouse model of Alzheimer's disease. These results, substantiated by quantitative electron microscopic analysis of synaptic densities, strongly support a direct causative role of Abeta in the synaptic degeneration seen in Alzheimer's disease and strengthen the potential of Abeta immunotherapy as a treatment approach for this disease.

  14. Automatic Cell Segmentation in Fluorescence Images of Confluent Cell Monolayers Using Multi-object Geometric Deformable Model.

    PubMed

    Yang, Zhen; Bogovic, John A; Carass, Aaron; Ye, Mao; Searson, Peter C; Prince, Jerry L

    2013-03-13

With the rapid development of microscopy for cell imaging, there is a strong and growing demand for image analysis software to quantitatively study cell morphology. Automatic cell segmentation is an important step in image analysis. Despite substantial progress, there is still a need to improve accuracy, efficiency, and adaptability to different cell morphologies. In this paper, we propose a fully automatic method for segmenting cells in fluorescence images of confluent cell monolayers. This method addresses several challenges through a combination of ideas. 1) It realizes a fully automatic segmentation process by first detecting the cell nuclei as initial seeds and then using a multi-object geometric deformable model (MGDM) for final segmentation. 2) To deal with different defects in the fluorescence images, the cell junctions are enhanced by applying an order-statistic filter and a principal-curvature-based image operator. 3) The final segmentation using MGDM promotes robust and accurate results and guarantees no overlaps or gaps between neighboring cells. The automatic segmentation results are compared with manually delineated cells, and the average Dice coefficient over all distinguishable cells is 0.88.
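The Dice coefficient reported above is a standard overlap measure between an automatic and a manual segmentation. A minimal sketch of how it is computed (our illustration with toy pixel sets, not the paper's code):

```python
# Dice coefficient between two segmentation masks represented as
# collections of pixel coordinates: Dice = 2|A ∩ B| / (|A| + |B|).

def dice_coefficient(mask_a, mask_b):
    a, b = set(mask_a), set(mask_b)
    if not a and not b:
        return 1.0  # two empty masks agree perfectly by convention
    return 2.0 * len(a & b) / (len(a) + len(b))

# Two toy "cells" sharing three of their pixels:
auto = [(0, 0), (0, 1), (1, 0), (1, 1)]
manual = [(0, 0), (0, 1), (1, 0)]
print(dice_coefficient(auto, manual))  # 2*3/(4+3) ≈ 0.857
```

A value of 1.0 means perfect agreement and 0.0 means no overlap, so the reported average of 0.88 indicates close agreement with the manual delineations.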

  15. An electrophysiological validation of stochastic DCM for fMRI

    PubMed Central

    Daunizeau, J.; Lemieux, L.; Vaudano, A. E.; Friston, K. J.; Stephan, K. E.

    2013-01-01

    In this note, we assess the predictive validity of stochastic dynamic causal modeling (sDCM) of functional magnetic resonance imaging (fMRI) data, in terms of its ability to explain changes in the frequency spectrum of concurrently acquired electroencephalography (EEG) signal. We first revisit the heuristic model proposed in Kilner et al. (2005), which suggests that fMRI activation is associated with a frequency modulation of the EEG signal (rather than an amplitude modulation within frequency bands). We propose a quantitative derivation of the underlying idea, based upon a neural field formulation of cortical activity. In brief, dense lateral connections induce a separation of time scales, whereby fast (and high spatial frequency) modes are enslaved by slow (low spatial frequency) modes. This slaving effect is such that the frequency spectrum of fast modes (which dominate EEG signals) is controlled by the amplitude of slow modes (which dominate fMRI signals). We then use conjoint empirical EEG-fMRI data—acquired in epilepsy patients—to demonstrate the electrophysiological underpinning of neural fluctuations inferred from sDCM for fMRI. PMID:23346055

  16. Statistical Mechanical Theory of Coupled Slow Dynamics in Glassy Polymer-Molecule Mixtures

    NASA Astrophysics Data System (ADS)

    Zhang, Rui; Schweizer, Kenneth

The microscopic Elastically Collective Nonlinear Langevin Equation theory of activated relaxation in one-component supercooled liquids and glasses is generalized to polymer-molecule mixtures. The key idea is to account for dynamic coupling between molecule and polymer segment motion. For describing the molecule hopping event, a temporal causality condition is formulated to self-consistently determine a dimensionless degree of matrix distortion relative to the molecule jump distance based on the concept of coupled dynamic free energies. Implementation for real materials employs an established Kuhn sphere model of the polymer liquid and a quantitative mapping to a hard particle reference system guided by the experimental equation-of-state. The theory makes predictions for the mixture dynamic shear modulus, activated relaxation time and diffusivity of both species, and mixture glass transition temperature as a function of molecule-Kuhn segment size ratio and attraction strength, composition and temperature. Model calculations illustrate the dynamical behavior in three distinct mixture regimes (fully miscible, bridging, clustering) controlled by the molecule-polymer interaction or chi-parameter. Applications to specific experimental systems will be discussed.

  17. Qualitative Variation in Constructive Alignment in Curriculum Design

    ERIC Educational Resources Information Center

    Trigwell, Keith; Prosser, Michael

    2014-01-01

    Constructive alignment has emerged as a powerful curriculum design idea, but little is known of the extent to which the effectiveness of this idea is a function of qualitative variation. This article introduces a model of qualitative variation in constructive alignment, and uses the results from known alignment studies to test the model. The…

  18. Discrepant Questioning as a Tool To Build Complex Mental Models of Respiration.

    ERIC Educational Resources Information Center

    Rea-Ramirez, Mary Anne; Nunez-Oviedo, Maria C.

    Discrepant questioning is a teaching technique that can help students "unlearn" misconceptions and process science ideas for deep understanding. Discrepant questioning is a technique in which teachers question students in a way that requires them to examine their ideas or models, without giving information prematurely to the student or passing…

  19. Science IDEAS: A Research-Based K-5 Interdisciplinary Instructional Model Linking Science and Literacy

    ERIC Educational Resources Information Center

    Romance, Nancy R.; Vitale, Michael R.

    2012-01-01

    Science IDEAS is an evidence-based model that reflects interdisciplinary research findings that support the integration of literacy (e.g., reading comprehension) within science instruction in grades K-5. Presented is a framework for planning integrated science and literacy instruction in which six elements (hands-on investigations, reading,…

  20. Ecology of Mind: A Batesonian Systems Thinking Approach to Curriculum Enactment

    ERIC Educational Resources Information Center

    Bloom, Jeffrey W.

    2012-01-01

    This article proposes a Batesonian systems thinking and ecology of mind approach to enacting curriculum. The key ideas for the model include ecology of mind, relationships, systems, systems thinking, pattern thinking, abductive thinking, and context. These ideas provide a basis for a recursive, three-part model involving developing (a) depth of…

  1. Helping Students Revise Disruptive Experientially Supported Ideas about Thermodynamics: Computer Visualizations and Tactile Models

    ERIC Educational Resources Information Center

    Clark, Douglas; Jorde, Doris

    2004-01-01

    This study analyzes the impact of an integrated sensory model within a thermal equilibrium visualization. We hypothesized that this intervention would not only help students revise their disruptive experientially supported ideas about why objects feel hot or cold, but also increase their understanding of thermal equilibrium. The analysis…

  2. Payoff Information Biases a Fast Guess Process in Perceptual Decision Making under Deadline Pressure: Evidence from Behavior, Evoked Potentials, and Quantitative Model Comparison.

    PubMed

    Noorbaloochi, Sharareh; Sharon, Dahlia; McClelland, James L

    2015-08-05

    We used electroencephalography (EEG) and behavior to examine the role of payoff bias in a difficult two-alternative perceptual decision under deadline pressure in humans. The findings suggest that a fast guess process, biased by payoff and triggered by stimulus onset, occurred on a subset of trials and raced with an evidence accumulation process informed by stimulus information. On each trial, the participant judged whether a rectangle was shifted to the right or left and responded by squeezing a right- or left-hand dynamometer. The payoff for each alternative (which could be biased or unbiased) was signaled 1.5 s before stimulus onset. The choice response was assigned to the first hand reaching a squeeze force criterion and reaction time was defined as time to criterion. Consistent with a fast guess account, fast responses were strongly biased toward the higher-paying alternative and the EEG exhibited an abrupt rise in the lateralized readiness potential (LRP) on a subset of biased payoff trials contralateral to the higher-paying alternative ∼ 150 ms after stimulus onset and 50 ms before stimulus information influenced the LRP. This rise was associated with poststimulus dynamometer activity favoring the higher-paying alternative and predicted choice and response time. Quantitative modeling supported the fast guess account over accounts of payoff effects supported in other studies. Our findings, taken with previous studies, support the idea that payoff and prior probability manipulations produce flexible adaptations to task structure and do not reflect a fixed policy for the integration of payoff and stimulus information. Humans and other animals often face situations in which they must make choices based on uncertain sensory information together with information about expected outcomes (gains or losses) about each choice. 
We investigated how differences in payoffs between available alternatives affect neural activity, overt choice, and the timing of choice responses. In our experiment, in which participants were under strong time pressure, neural and behavioral findings together with model fitting suggested that our human participants often made a fast guess toward the higher reward rather than integrating stimulus and payoff information. Our findings, taken with findings from other studies, support the idea that payoff and prior probability manipulations produce flexible adaptations to task structure and do not reflect a fixed policy. Copyright © 2015 the authors 0270-6474/15/3510989-23$15.00/0.
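The fast-guess account described above can be caricatured as a mixture of two processes racing on each trial. A minimal simulation sketch (our illustration with invented parameters, not the authors' fitted model):

```python
import random

# Mixture/race sketch of the "fast guess" account: on a fraction of trials a
# payoff-biased guess fires quickly; on the rest, a noisy evidence
# accumulator drifts toward the stimulus-supported response.

def trial(p_guess=0.3, bias_right=0.8, drift=0.02, threshold=10.0, rng=random):
    if rng.random() < p_guess:                      # fast guess branch
        choice = "right" if rng.random() < bias_right else "left"
        return choice, 150 + 50 * rng.random()      # fast RT (ms)
    x, t = 0.0, 0
    while abs(x) < threshold:                       # evidence accumulation
        x += drift + rng.gauss(0.0, 1.0)
        t += 1
    return ("right" if x > 0 else "left"), 300.0 + t

random.seed(0)
trials = [trial() for _ in range(5000)]
fast = [choice for choice, rt in trials if rt < 250]
slow = [choice for choice, rt in trials if rt >= 250]
# Fast responses are dominated by the payoff bias; slower, stimulus-driven
# responses sit nearer the accumulator's own choice rate.
print(sum(c == "right" for c in fast) / len(fast))
print(sum(c == "right" for c in slow) / len(slow))
```

Sorting simulated choices by response time reproduces the qualitative signature in the data: early responses track the payoff, later ones track the stimulus.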

  3. Research-Supported Ideas for Implementing Reauthorized IDEA with Intelligent Professional Psychological Services

    ERIC Educational Resources Information Center

    Berninger, Virginia W.

    2006-01-01

    The recent reauthorization of the Individuals With Disabilities Education Improvement Act of 2004 (IDEA; 2004) is first discussed in its historical context. Then, a programmatic line of research (funded by the National Institute of Child Health and Human Development since 1989) is described that is relevant to a proposed model for universal…

  4. School Choice: How an Abstract Idea Became a Political Reality

    ERIC Educational Resources Information Center

    Viteritti, Joseph P.

    2005-01-01

    This paper traces the evolution of the choice idea over three generations, from a market model concerned with economic liberty, to a demand for social justice based on equality, to a political movement that translates the idea into policy. Focusing on the last generation, it explains why the market concept has lacked political appeal and how…

  5. Preliminary study of silica aerogel as a gas-equivalent material in ionization chambers

    NASA Astrophysics Data System (ADS)

    Caresana, M.; Zorloni, G.

    2017-12-01

Over the past two decades there has been renewed interest in aerogels. These peculiar materials show fairly unique properties and are under investigation for both scientific and commercial purposes, with new optimized production processes being studied. In this work, the possibility of using aerogel in the field of radiation detection is explored. The idea is to substitute the gas filling of an ionization chamber with aerogel. The material has a density about 100 times greater than that of air at ambient pressure, whereas the open-pore structure should still allow the charge carriers to move freely. Small hydrophobic silica aerogel samples were studied. A custom ionization chamber, capable of working both with aerogel and in the classic gas setup, was built. The response of the chamber in current mode was investigated using an X-ray tube. Under proper conditions, the results showed an enhancement of about 60 times in the current signal in the aerogel configuration with respect to the classic gas one. Moreover, some unusual behaviours were observed, i.e. time inertia of the signal and super-/sub-linear current response with respect to the dose rate. When high electric fields were tested, the aerogel configuration seemed to enhance Townsend effects. To represent the observed trends, a trapping-detrapping model is proposed that semi-empirically predicts the steady-state currents measured. The time evolution of the signal is semi-quantitatively represented by the same model. The coefficients estimated by the fits are in agreement with similar trapping problems in the literature. In particular, a direct comparison with the benchmark of FET silica gates endorses the idea that the same type of phenomenon occurs in the studied case.

  6. Localization and characterization of (/sup 3/H)desmethylimipramine binding sites in rat brain by quantitative autoradiography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biegon, A.; Rainbow, T.C.

    1983-05-01

The high affinity binding sites for the antidepressant desmethylimipramine (DMI) have been localized in rat brain by quantitative autoradiography. There are high concentrations of binding sites in the locus ceruleus, the anterior ventral thalamus, the ventral portion of the bed nucleus of the stria terminalis, and the paraventricular and dorsomedial nuclei of the hypothalamus. The distribution of DMI binding sites is in striking accord with the distribution of norepinephrine terminals. Pretreatment of rats with the neurotoxin 6-hydroxydopamine, which causes a selective degeneration of catecholamine terminals, results in a 60 to 90% decrease in DMI binding. These data support the idea that high affinity binding sites for DMI are located on presynaptic noradrenergic terminals.

  7. Rate Constants and Mechanisms of Protein–Ligand Binding

    PubMed Central

    Pang, Xiaodong; Zhou, Huan-Xiang

    2017-01-01

    Whereas protein–ligand binding affinities have long-established prominence, binding rate constants and binding mechanisms have gained increasing attention in recent years. Both new computational methods and new experimental techniques have been developed to characterize the latter properties. It is now realized that binding mechanisms, like binding rate constants, can and should be quantitatively determined. In this review, we summarize studies and synthesize ideas on several topics in the hope of providing a coherent picture of and physical insight into binding kinetics. The topics include microscopic formulation of the kinetic problem and its reduction to simple rate equations; computation of binding rate constants; quantitative determination of binding mechanisms; and elucidation of physical factors that control binding rate constants and mechanisms. PMID:28375732
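The "reduction to simple rate equations" mentioned in the review can be illustrated with the generic bimolecular scheme P + L ⇌ PL, where d[PL]/dt = kon·[P][L] − koff·[PL]. A minimal numerical sketch (textbook scheme with invented rate constants, not the review's specific systems):

```python
# Euler integration of the binding rate equation for P + L <-> PL:
#   d[PL]/dt = kon*(P0 - [PL])*(L0 - [PL]) - koff*[PL]
# with total concentrations P0, L0 conserved.

def integrate_binding(P0, L0, kon, koff, dt=1e-4, steps=200000):
    c = 0.0  # complex concentration [PL]
    for _ in range(steps):
        c += dt * (kon * (P0 - c) * (L0 - c) - koff * c)
    return c

# At steady state, kon*(P0-c)*(L0-c) = koff*c, so the dissociation
# constant Kd = koff/kon should equal (P0-c)*(L0-c)/c.
c_eq = integrate_binding(P0=1.0, L0=10.0, kon=1.0, koff=2.0)
print(c_eq, (1.0 - c_eq) * (10.0 - c_eq) / c_eq)  # second value ≈ Kd = 2.0
```

The same integration run to shorter times gives the relaxation kinetics, whose apparent rate (in the pseudo-first-order limit of excess ligand) is kobs ≈ kon·[L] + koff.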

  8. Guidelines for a graph-theoretic implementation of structural equation modeling

    USGS Publications Warehouse

    Grace, James B.; Schoolmaster, Donald R.; Guntenspergen, Glenn R.; Little, Amanda M.; Mitchell, Brian R.; Miller, Kathryn M.; Schweiger, E. William

    2012-01-01

Structural equation modeling (SEM) is increasingly being chosen by researchers as a framework for gaining scientific insights from the quantitative analyses of data. New ideas and methods emerging from the study of causality, influences from the field of graphical modeling, and advances in statistics are expanding the rigor, capability, and even purpose of SEM. Guidelines for implementing the expanded capabilities of SEM are currently lacking. In this paper we describe new developments in SEM that we believe constitute a third generation of the methodology. Most characteristic of this new approach is the generalization of the structural equation model as a causal graph. In this generalization, analyses are based on graph theoretic principles rather than analyses of matrices. Also, new devices such as metamodels and causal diagrams, as well as an increased emphasis on queries and probabilistic reasoning, are now included. Estimation under a graph theory framework permits the use of Bayesian or likelihood methods. The guidelines presented start from a declaration of the goals of the analysis. We then discuss how theory frames the modeling process, requirements for causal interpretation, model specification choices, selection of estimation method, model evaluation options, and use of queries, both to summarize retrospective results and for prospective analyses. The illustrative example presented involves monitoring data from wetlands on Mount Desert Island, home of Acadia National Park. Our presentation walks through the decision process involved in developing and evaluating models, as well as drawing inferences from the resulting prediction equations. In addition to evaluating hypotheses about the connections between human activities and biotic responses, we illustrate how the structural equation (SE) model can be queried to understand how interventions might take advantage of an environmental threshold to limit Typha invasions.
The guidelines presented provide for an updated definition of the SEM process that subsumes the historical matrix approach under a graph-theory implementation. The implementation is also designed to permit complex specifications and to be compatible with various estimation methods. Finally, they are meant to foster the use of probabilistic reasoning in both retrospective and prospective considerations of the quantitative implications of the results.
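The core of any SE model is a set of linked linear equations along a causal graph. A toy sketch (a three-variable chain invented for illustration, not the wetland example) shows how path coefficients are recovered and how a query such as "what is the indirect effect?" is answered by multiplying coefficients along the path:

```python
import random

# Toy linear structural equation model along the causal chain
#   activity (x) -> stressor (m) -> response (y),
# with each child regressed on its parent to recover the path coefficients.

random.seed(1)
n = 20000
b1, b2 = 0.8, -0.5                       # true path coefficients
x = [random.gauss(0, 1) for _ in range(n)]
m = [b1 * xi + random.gauss(0, 1) for xi in x]
y = [b2 * mi + random.gauss(0, 1) for mi in m]

def slope(u, v):
    """OLS slope of v on u, both mean-centered."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    num = sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v))
    den = sum((ui - mu) ** 2 for ui in u)
    return num / den

b1_hat, b2_hat = slope(x, m), slope(m, y)
print(b1_hat, b2_hat, b1_hat * b2_hat)   # indirect effect ≈ b1*b2 = -0.4
```

Graph-theoretic SEM generalizes exactly this idea: estimation proceeds equation by equation over the causal graph, and queries are computed from the fitted prediction equations rather than from a covariance matrix.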

  9. The IDEA model: A single equation approach to the Ebola forecasting challenge.

    PubMed

    Tuite, Ashleigh R; Fisman, David N

    2018-03-01

    Mathematical modeling is increasingly accepted as a tool that can inform disease control policy in the face of emerging infectious diseases, such as the 2014-2015 West African Ebola epidemic, but little is known about the relative performance of alternate forecasting approaches. The RAPIDD Ebola Forecasting Challenge (REFC) tested the ability of eight mathematical models to generate useful forecasts in the face of simulated Ebola outbreaks. We used a simple, phenomenological single-equation model (the "IDEA" model), which relies only on case counts, in the REFC. Model fits were performed using a maximum likelihood approach. We found that the model performed reasonably well relative to other more complex approaches, with performance metrics ranked on average 4th or 5th among participating models. IDEA appeared better suited to long- than short-term forecasts, and could be fit using nothing but reported case counts. Several limitations were identified, including difficulty in identifying epidemic peak (even retrospectively), unrealistically precise confidence intervals, and difficulty interpolating daily case counts when using a model scaled to epidemic generation time. More realistic confidence intervals were generated when case counts were assumed to follow a negative binomial, rather than Poisson, distribution. Nonetheless, IDEA represents a simple phenomenological model, easily implemented in widely available software packages that could be used by frontline public health personnel to generate forecasts with accuracy that approximates that which is achieved using more complex methodologies. Copyright © 2016 The Author(s). Published by Elsevier B.V. All rights reserved.
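The single-equation form that gives IDEA ("Incidence Decay with Exponential Adjustment") its simplicity can be sketched directly. In the published formulation, incidence at epidemic generation t is I(t) = (R0 / (1 + d)^t)^t, where R0 is the basic reproduction number and d is a discount factor capturing control and behavior change; the parameter values below are invented for illustration:

```python
# IDEA model: incidence per epidemic generation from two parameters.
# I(t) = (R0 / (1 + d)**t)**t -- exponential growth at rate R0, damped
# by a per-generation discount d, so the outbreak self-limits.

def idea_incidence(t, R0, d):
    return (R0 / (1.0 + d) ** t) ** t

R0, d = 2.0, 0.05
cases = [idea_incidence(t, R0, d) for t in range(25)]
peak_gen = max(range(25), key=lambda t: cases[t])
print(peak_gen)  # -> 7: growth stops once (1 + d)**t overtakes R0
```

In practice the model is fit to reported case counts per generation (e.g. by maximum likelihood, as in the REFC entry), which is what makes it usable with nothing but surveillance data.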

  10. A physical model for the acousto-ultrasonic method. Ph.D. Thesis Final Report

    NASA Technical Reports Server (NTRS)

    Kiernan, Michael T.; Duke, John C., Jr.

    1990-01-01

A basic physical explanation, a model, and comments on NDE application of the acousto-ultrasonic (AU) method for composite materials are presented. The basis of this work is a set of experiments in which a sending and a receiving piezoelectric transducer were both oriented normal to the surface, at different points, on aluminum plates, various composite plates, and a tapered aluminum plate. The purpose and basic idea are introduced, and general comments on the AU method are offered. A literature review covers pertinent areas such as composite materials, wave propagation, ultrasonics, and the AU method, with special emphasis on theory that is used later and on past experimental results that are important to the physical understanding of the AU method. The experimental set-up, procedure, and the ensuing analysis are described, and the experimental results are presented in both a quantitative and qualitative manner. A physical understanding of the experimental results, based on an elasticity solution, is furnished. Modeling and applications of the AU method are discussed for composite materials, and general conclusions are stated. The physical model of the AU method for composite materials is offered, something which has been much needed and sorely lacking. This physical understanding is possible due to the extensive set of experimental measurements, also reported.

  11. The scaling law of human travel - A message from George

    NASA Astrophysics Data System (ADS)

    Brockmann, Dirk; Hufnagel, Lars

The dispersal of individuals of a species is the key driving force of various spatiotemporal phenomena which occur on geographical scales. It can synchronize populations of interacting species, stabilize them, and diversify gene pools [1-3]. The geographic spread of human infectious diseases such as influenza, measles and the recent severe acute respiratory syndrome (SARS) is essentially promoted by human travel, which occurs on many length scales and is sustained by a variety of means of transportation [4-8]. In the light of increasing international trade, intensified human traffic, and an imminent influenza A pandemic, knowledge of the dynamical and statistical properties of human dispersal is of fundamental and acute importance [7,9,10]. A quantitative statistical theory for human travel and concomitant reliable forecasts would substantially improve and extend existing prevention strategies. Despite its crucial role, a quantitative assessment of human dispersal remains elusive, and the opinion that humans disperse diffusively still prevails in many models [11]. In this chapter we report on a recently developed technique which permits a solid and quantitative assessment of human dispersal on geographical scales [12]. The key idea is to infer the statistical properties of human travel by analysing the geographic circulation of individual bank notes, for which comprehensive datasets are collected at online bill-tracking websites. The analysis shows that the distribution of traveling distances decays as a power law, indicating that the movement of bank notes is reminiscent of superdiffusive, scale-free random walks known as Lévy flights [13]. Secondly, the probability of remaining in a small, spatially confined region for a time T is dominated by heavy tails which attenuate superdiffusive dispersal. We show that the dispersal of bank notes can be described on many spatiotemporal scales by a two-parameter continuous time random walk (CTRW) model to a surprising accuracy.
We provide a brief introduction to continuous time random walk theory [14] and show that human dispersal is an ambivalent, effectively superdiffusive process.
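The heavy-tailed jump lengths underlying a Lévy flight can be generated by inverse-transform sampling of a power law. A minimal sketch (the tail exponent is of the order reported for bank notes, but the cutoff and sample size are arbitrary choices for this illustration):

```python
import random

# Inverse-transform sampling of power-law jump lengths: if
# P(r > x) = (x / r_min) ** -beta, then r = r_min * u ** (-1 / beta)
# for u uniform on (0, 1].  With beta < 1 the mean diverges, so a
# handful of enormous jumps dominates the total distance traveled.

def levy_jump(beta=0.6, r_min=1.0, rng=random):
    return r_min * rng.random() ** (-1.0 / beta)

random.seed(42)
jumps = sorted(levy_jump() for _ in range(100000))
median = jumps[50000]
top_share = sum(jumps[-1000:]) / sum(jumps)  # share carried by top 1%
print(median, top_share)  # median stays small; rare huge jumps dominate
```

This contrast between a modest median and a dominant tail is exactly what separates superdiffusive, scale-free dispersal from the diffusive picture that a Gaussian jump-length distribution would give.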

  12. Confronting the caring crisis in clinical practice.

    PubMed

    Ma, Fang; Li, Jiping; Zhu, Dan; Bai, Yangjuan; Song, Jianhua

    2013-10-01

    In light of the call for humanistic caring in the contemporary health care system globally and in China, the issue of improving the caring skills that are essential to student success, high-quality nursing practice and positive patient outcomes is at the forefront of nursing education. The aim of this mixed-methods quantitative and qualitative study was to investigate baccalaureate nursing students' caring ability in the context of China and to explore the role of clinical practice learning in the development of students' caring skills. A two-phase, descriptive study utilising a mixed methodology consisting of a caring ability survey and focus group interviews was conducted. In the quantitative phase, 598 baccalaureate nursing students at two colleges in Yunnan Province in southwest China were surveyed using the Caring Ability Inventory (CAI). In the qualitative phase, 16 of the students who had participated in the quantitative phase were interviewed. Students obtained lower scores on the CAI than have been reported elsewhere by other researchers. In addition, students in the clinical stage of training scored lower than students in the pre-clinical stage. Three themes concerning facilitation by and three themes concerning the obstructive effects of clinical practice learning in the development of caring ability were identified. Themes pertaining to facilitation were: (i) promoting a sense of professional responsibility and ethics; (ii) providing an arena in which to practise caring, and (iii) learning from positive role models. Themes pertaining to obstruction were: (i) a critical practice learning environment; (ii) encountering inappropriate clinical teachers, and (iii) experiencing shock at the contrast between an idealised and the real environment. The key to developing students' ability to care lies in highlighting caring across the entire health care system. 
By diminishing exposure to negative role models, and adopting appropriate pedagogical ideas about education in caring, such as truth telling and helping students to think in a critical manner, educators can help students to improve their caring ability. © 2013 John Wiley & Sons Ltd.

  13. The influence of HOPE VI neighborhood revitalization on neighborhood-based physical activity: A mixed-methods approach.

    PubMed

    Dulin-Keita, Akilah; Clay, Olivio; Whittaker, Shannon; Hannon, Lonnie; Adams, Ingrid K; Rogers, Michelle; Gans, Kim

    2015-08-01

    This study uses a mixed methods approach to 1) identify surrounding residents' perceived expectations for Housing Opportunities for People Everywhere (HOPE VI) policy on physical activity outcomes and to 2) quantitatively examine the odds of neighborhood-based physical activity pre-/post-HOPE VI in a low socioeconomic status, predominantly African American community in Birmingham, Alabama. To address aim one, we used group concept mapping which is a structured approach for data collection and analyses that produces pictures/maps of ideas. Fifty-eight residents developed statements about potential influences of HOPE VI on neighborhood-based physical activity. In the quantitative study, we examined whether these potential influences increased the odds of neighborhood walking/jogging. We computed block entry logistic regression models with a larger cohort of residents at baseline (n = 184) and six-months (n = 142, 77% retention; n = 120 for all informative variables). We examined perceived neighborhood disorder (perceived neighborhood disorder scale), walkability and aesthetics (Neighborhood Environment Walkability Scale) and HOPE VI-related community safety and safety for physical activity as predictors. During concept mapping, residents generated statements that clustered into three distinct concepts, "Increased Leisure Physical Activity," "Safe Play Areas," and "Generating Health Promoting Resources." The quantitative analyses indicated that changes in neighborhood walkability increased the odds of neighborhood-based physical activity (p = 0.04). When HOPE VI-related safety for physical activity was entered into the model, it was associated with increased odds of physical activity (p = 0.04). Walkability was no longer statistically significant. These results suggest that housing policies that create walkable neighborhoods and that improve perceptions of safety for physical activity may increase neighborhood-based physical activity. 
However, the longer term impacts of neighborhood-level policies on physical activity require more longitudinal evidence to determine whether increased participation in physical activity is sustained. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Inhibition of Breast Cancer-Induced Angiogenesis by a Diverged Homeobox Gene

    DTIC Science & Technology

    2006-05-01

    and then harvested for flow cytometry using appropriate antibodies. Ad.Gax blocked the expression of VCAM-1, E-selectin, and ICAM-1. DOD Idea Award...solution. Bands were visualized by chemiluminescence using the ECL-Plus reagent (Amersham, Piscataway, NJ). Flow Cytometry Cells were harvested after...33), all of whose down-regulation we have confirmed using real time quantitative RT-PCR, Western blot, and flow cytometry (Fig. 5). Moreover, Gax

  15. The Influence of Financial, Cultural and Social Capital on the Likelihood of Success of Community College Students: A Quantitative Study Utilizing the Education Longitudinal Study of 2002-2012

    ERIC Educational Resources Information Center

    Iffland, Aaron R.

    2016-01-01

    Community colleges are an integral part of the postsecondary education system in the United States. Unfortunately, college completion rates continue to decline. Additionally, median income in the United States is also declining. The idea that each successive generation of students will do better than the previous one is quickly becoming a fantasy.…

  16. The IT in Secondary Science Book. A Compendium of Ideas for Using Computers and Teaching Science.

    ERIC Educational Resources Information Center

    Frost, Roger

    Scientists need to measure and communicate, to handle information, and model ideas. In essence, they need to process information. Young scientists have the same needs. Computers have become a tremendously important addition to the processing of information through database use, graphing and modeling and also in the collection of information…

  17. Arguing Like a Scientist: Engaging Students in Core Scientific Practices

    ERIC Educational Resources Information Center

    Chen, Ying-Chih; Steenhoek, Joshua

    2014-01-01

    Argumentation is now seen as a core practice for helping students engage with the construction and critique of scientific ideas and for making students scientifically literate. This article demonstrates a negotiation model to show how argumentation can be a vehicle to drive students to learn science's big ideas. The model has six phases:…

  18. Explaining Entrepreneurial Behavior: Dispositional Personality Traits, Growth of Personal Entrepreneurial Resources, and Business Idea Generation

    ERIC Educational Resources Information Center

    Obschonka, Martin; Silbereisen, Rainer K.; Schmitt-Rodermund, Eva

    2012-01-01

    Applying a life-span approach of human development and using the example of science-based business idea generation, the authors used structural equation modeling to test a mediation model for predicting entrepreneurial behavior in a sample of German scientists (2 measurement occasions; Time 1, N = 488). It was found that recalled early…

  19. Productivity of "Collisions Generate Heat" for Reconciling an Energy Model with Mechanistic Reasoning: A Case Study

    ERIC Educational Resources Information Center

    Scherr, Rachel E.; Robertson, Amy D.

    2015-01-01

    We observe teachers in professional development courses about energy constructing mechanistic accounts of energy transformations. We analyze a case in which teachers investigating adiabatic compression develop a model of the transformation of kinetic energy to thermal energy. Among their ideas is the idea that thermal energy is generated as a…

  20. Developing a new model for the invention and translation of neurotechnologies in academic neurosurgery.

    PubMed

    Leuthardt, Eric C

    2013-01-01

    There is currently an acceleration of new scientific and technical capabilities that create new opportunities for academic neurosurgery. To engage these changing dynamics, the Center for Innovation in Neuroscience and Technology (CINT) was created on the premise that successful innovation of device-related ideas relies on collaboration between multiple disciplines. The CINT has created a unique model that integrates scientific, medical, engineering, and legal/business experts to participate in the continuum from idea generation to translation. To detail the method by which this model has been implemented in the Department of Neurological Surgery at Washington University in St. Louis and the experience that has been accrued thus far. The workflow is structured to enable cross-disciplinary interaction, both intramurally and extramurally between academia and industry. This involves a structured method for generating, evaluating, and prototyping promising device concepts. The process begins with the "invention session," which consists of a structured exchange between inventors from diverse technical and medical backgrounds. Successful ideas, which pass a separate triage mechanism, are then sent to industry-sponsored multidisciplinary fellowships to create functioning prototypes. After 3 years, the CINT has engaged 32 clinical and nonclinical inventors, resulting in 47 ideas, 16 fellowships, and 12 patents, for which 7 have been licensed to industry. Financial models project that if commercially successful, device sales could have a notable impact on departmental revenue. The CINT is a model that supports an integrated approach from the time an idea is created through its translational development. To date, the approach has been successful in creating numerous concepts that have led to industry licenses. In the long term, this model will create a novel revenue stream to support the academic neurosurgical mission.

  1. Examples of finite element mesh generation using SDRC IDEAS

    NASA Technical Reports Server (NTRS)

    Zapp, John; Volakis, John L.

    1990-01-01

IDEAS (Integrated Design Engineering Analysis Software) offers a comprehensive package for mechanical design engineers. Owing to its multifaceted capabilities, however, it can also be adapted to serve the needs of electrical engineers. IDEAS can be used to perform the following tasks: system modeling, system assembly, kinematics, finite element pre/post processing, finite element solution, system dynamics, drafting, test data analysis, and project relational database management.

  2. The Impact of Semantic Relevance and Heterogeneity of Pictorial Stimuli on Individual Brainstorming: An Extension of the SIAM Model

    ERIC Educational Resources Information Center

    Guo, Jing; McLeod, Poppy Lauretta

    2014-01-01

    Drawing upon the Search for Ideas in Associative Memory (SIAM) model as the theoretical framework, the impact of heterogeneity and topic relevance of visual stimuli on ideation performance was examined. Results from a laboratory experiment showed that visual stimuli increased productivity and diversity of idea generation, that relevance to the…

  3. Quantum probability and cognitive modeling: some cautions and a promising direction in modeling physics learning.

    PubMed

    Franceschetti, Donald R; Gire, Elizabeth

    2013-06-01

    Quantum probability theory offers a viable alternative to classical probability, although there are some ambiguities inherent in transferring the quantum formalism to a less determined realm. A number of physicists are now looking at the applicability of quantum ideas to the assessment of physics learning, an area particularly suited to quantum probability ideas.

  4. MECHANICS OF THE LUNG IN THE 20TH CENTURY

    PubMed Central

    Mitzner, Wayne

    2015-01-01

    Major advances in respiratory mechanics occurred primarily in the latter half of the 20th century, and this is when much of our current understanding was secured. The earliest and ancient investigations involving respiratory physiology and mechanics were often done in conjunction with other scientific activities and often lacked the ability to make quantitative measurements. This situation changed rapidly in the 20th century, and this relatively recent history of lung mechanics has been greatly influenced by critical technological advances and applications, which have made quantitative experimental testing of ideas possible. From the spirometer of Hutchinson, to the pneumotachograph of Fleisch, to the measurement of esophageal pressure, to the use of the Wilhelmy balance by Clements, to the unassuming strain gauges for measuring pressure and rapid paper and electronic chart recorders, these enabling devices have generated numerous quantitative experimental studies with greatly increased physiologic understanding and validation of mechanistic theories of lung function in health and disease. PMID:23733695

  5. Opinion formation and distribution in a bounded-confidence model on various networks

    NASA Astrophysics Data System (ADS)

    Meng, X. Flora; Van Gorder, Robert A.; Porter, Mason A.

    2018-02-01

    In the social, behavioral, and economic sciences, it is important to predict which individual opinions eventually dominate in a large population, whether there will be a consensus, and how long it takes for a consensus to form. Such ideas have been studied heavily both in physics and in other disciplines, and the answers depend strongly both on how one models opinions and on the network structure on which opinions evolve. One model that was created to study consensus formation quantitatively is the Deffuant model, in which the opinion distribution of a population evolves via sequential random pairwise encounters. To consider heterogeneity of interactions in a population along with social influence, we study the Deffuant model on various network structures (deterministic synthetic networks, random synthetic networks, and social networks constructed from Facebook data). We numerically simulate the Deffuant model and conduct regression analyses to investigate the dependence of the time to reach steady states on various model parameters, including a confidence bound for opinion updates, the number of participating entities, and their willingness to compromise. We find that network structure and parameter values both have important effects on the convergence time and the number of steady-state opinion groups. For some network architectures, we observe that the relationship between the convergence time and model parameters undergoes a transition at a critical value of the confidence bound. For some networks, the steady-state opinion distribution also changes from consensus to multiple opinion groups at this critical value.
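The pairwise update rule of the Deffuant model described above can be sketched in a few lines. This is a minimal illustration, not the authors' simulation code; the ring network and parameter values are arbitrary choices for demonstration:

```python
import random

def deffuant(opinions, edges, conf=0.2, mu=0.5, steps=100_000, seed=0):
    """Simulate the Deffuant bounded-confidence model.

    opinions: initial opinions in [0, 1]; edges: list of (i, j) pairs;
    conf: confidence bound; mu: willingness to compromise.
    """
    rng = random.Random(seed)
    x = list(opinions)
    for _ in range(steps):
        i, j = rng.choice(edges)            # sequential random pairwise encounter
        if abs(x[i] - x[j]) < conf:         # only interact within the confidence bound
            shift = mu * (x[j] - x[i])
            x[i] += shift                   # both agents move toward each other
            x[j] -= shift
        # opinions differing by >= conf are left unchanged
    return x

# ring network of 50 agents with uniformly random initial opinions
n = 50
init_rng = random.Random(1)
ops = [init_rng.random() for _ in range(n)]
ring = [(i, (i + 1) % n) for i in range(n)]
final = deffuant(ops, ring, conf=0.5, mu=0.5)
```

Because each update shifts the two opinions symmetrically, the population mean is conserved; varying `conf` across runs exhibits the transition from many steady-state opinion groups to consensus that the abstract describes.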

  6. Investigating students' mental models and knowledge construction of microscopic friction. II. Implications for curriculum design and development

    NASA Astrophysics Data System (ADS)

    Corpuz, Edgar D.; Rebello, N. Sanjay

    2011-12-01

    Our previous research showed that students’ mental models of friction at the atomic level are significantly influenced by their macroscopic ideas. For most students, friction is due to the meshing of bumps and valleys and rubbing of atoms. The aforementioned results motivated us to further investigate how students can be helped to improve their present models of microscopic friction. Teaching interviews were conducted to study the dynamics of their model construction as they interacted with the interviewer, the scaffolding activities, and/or with each other. In this paper, we present the different scaffolding activities and the variation in the ideas that students generated as they did the hands-on and minds-on scaffolding activities. Results imply that through a series of carefully designed scaffolding activities, it is possible to facilitate the refinement of students’ ideas of microscopic friction.

  7. Vernier caliper and micrometer computer models using Easy Java Simulation and its pedagogical design features—ideas for augmenting learning with real instruments

    NASA Astrophysics Data System (ADS)

    Wee, Loo Kang; Tiang Ning, Hwee

    2014-09-01

    This paper presents the customization of Easy Java Simulation models, used with actual laboratory instruments, to create active experiential learning for measurements. The laboratory instruments are the vernier caliper and the micrometer. Three computer model design ideas that complement real equipment are discussed. These ideas involve (1) a simple two-dimensional view for learning from pen and paper questions and the real world; (2) hints, answers, different scale options and the inclusion of zero error; (3) assessment for learning feedback. The initial positive feedback from Singaporean students and educators indicates that these tools could be successfully shared and implemented in learning communities. Educators are encouraged to change the source code for these computer models to suit their own purposes; they have creative commons attribution licenses for the benefit of all.

  8. Punctuated equilibrium and power law in economic dynamics

    NASA Astrophysics Data System (ADS)

    Gupta, Abhijit Kar

    2012-02-01

    This work is primarily based on a recently proposed toy model by Thurner et al. (2010) [3] on Schumpeterian economic dynamics (inspired by the idea of economist Joseph Schumpeter [9]). Interestingly, punctuated equilibrium has been shown to emerge from the dynamics. The punctuated equilibrium and Power law are known to be associated with similar kinds of biologically relevant evolutionary models proposed in the past. The occurrence of the Power law is a signature of Self-Organised Criticality (SOC). In our view, power laws can be obtained by controlling the dynamics through incorporating the idea of feedback into the algorithm in some way. The so-called 'feedback' was achieved by introducing the idea of fitness and selection processes in the biological evolutionary models. Therefore, we examine the possible emergence of a power law by invoking the concepts of 'fitness' and 'selection' in the present model of economic evolution.
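The "fitness and selection" feedback invoked above is epitomized by the Bak-Sneppen model from the biological evolutionary literature, in which selecting and replacing the least-fit site drives the system to self-organized criticality. The sketch below is that classic biological model, not the Thurner et al. economic model itself:

```python
import random

def bak_sneppen(n=64, steps=5000, seed=0):
    """Minimal Bak-Sneppen model on a ring: replace the least-fit site
    and its two neighbors with fresh random fitness values each step."""
    rng = random.Random(seed)
    fit = [rng.random() for _ in range(n)]
    minima = []
    for _ in range(steps):
        i = min(range(n), key=fit.__getitem__)   # selection: least-fit site
        minima.append(fit[i])                    # record the activity threshold
        for j in (i - 1, i, (i + 1) % n):        # mutation of site + neighbors
            fit[j] = rng.random()                # (i - 1 wraps via negative index)
    return fit, minima

fit, minima = bak_sneppen()
```

After a transient, the recorded minima stay below a self-organized threshold, and the sizes of "avalanches" of below-threshold activity follow a power law, the SOC signature the author discusses.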

  9. Transgressive Hybrids as Hopeful Monsters.

    PubMed

    Dittrich-Reed, Dylan R; Fitzpatrick, Benjamin M

    2013-06-01

    The origin of novelty is a critical subject for evolutionary biologists. Early geneticists speculated about the sudden appearance of new species via special macromutations, epitomized by Goldschmidt's infamous "hopeful monster". Although these ideas were easily dismissed by the insights of the Modern Synthesis, a lingering fascination with the possibility of sudden, dramatic change has persisted. Recent work on hybridization and gene exchange suggests an underappreciated mechanism for the sudden appearance of evolutionary novelty that is entirely consistent with the principles of modern population genetics. Genetic recombination in hybrids can produce transgressive phenotypes, "monstrous" phenotypes beyond the range of parental populations. Transgressive phenotypes can be products of epistatic interactions or additive effects of multiple recombined loci. We compare several epistatic and additive models of transgressive segregation in hybrids and find that they are special cases of a general, classic quantitative genetic model. The Dobzhansky-Muller model predicts "hopeless" monsters, sterile and inviable transgressive phenotypes. The Bateson model predicts "hopeful" monsters with fitness greater than either parental population. The complementation model predicts both. Transgressive segregation after hybridization can rapidly produce novel phenotypes by recombining multiple loci simultaneously. Admixed populations will also produce many similar recombinant phenotypes at the same time, increasing the probability that recombinant "hopeful monsters" will establish true-breeding evolutionary lineages. Recombination is not the only (or even most common) process generating evolutionary novelty, but might be the most credible mechanism for sudden appearance of new forms.

  10. Holographic multiverse

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garriga, J.; Vilenkin, A., E-mail: jaume.garriga@ub.edu, E-mail: vilenkin@cosmos.phy.tufts.edu

    2009-01-15

    We explore the idea that the dynamics of the inflationary multiverse is encoded in its future boundary, where it is described by a lower dimensional theory which is conformally invariant in the UV. We propose that a measure for the multiverse, which is needed in order to extract quantitative probabilistic predictions, can be derived in terms of the boundary theory by imposing a UV cutoff. In the inflationary bulk, this is closely related (though not identical) to the so-called scale factor cutoff measure.

  11. Effectively Identifying eQTLs from Multiple Tissues by Combining Mixed Model and Meta-analytic Approaches

    PubMed Central

    Choi, Ted; Eskin, Eleazar

    2013-01-01

    Gene expression data, in conjunction with information on genetic variants, have enabled studies to identify expression quantitative trait loci (eQTLs) or polymorphic locations in the genome that are associated with expression levels. Moreover, recent technological developments and cost decreases have further enabled studies to collect expression data in multiple tissues. One advantage of multiple tissue datasets is that studies can combine results from different tissues to identify eQTLs more accurately than examining each tissue separately. The idea of aggregating results of multiple tissues is closely related to the idea of meta-analysis which aggregates results of multiple genome-wide association studies to improve the power to detect associations. In principle, meta-analysis methods can be used to combine results from multiple tissues. However, eQTLs may have effects in only a single tissue, in all tissues, or in a subset of tissues with possibly different effect sizes. This heterogeneity in terms of effects across multiple tissues presents a key challenge to detect eQTLs. In this paper, we develop a framework that leverages two popular meta-analysis methods that address effect size heterogeneity to detect eQTLs across multiple tissues. We show by using simulations and multiple tissue data from mouse that our approach detects many eQTLs undetected by traditional eQTL methods. Additionally, our method provides an interpretation framework that accurately predicts whether an eQTL has an effect in a particular tissue. PMID:23785294
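The building block behind aggregating per-tissue results is ordinary fixed-effects (inverse-variance) meta-analysis; the authors' framework layers heterogeneity-aware methods on top of this. The sketch below shows only that baseline, with hypothetical per-tissue numbers:

```python
import math

def fixed_effects_meta(betas, ses):
    """Inverse-variance fixed-effects meta-analysis of per-tissue
    effect estimates. Returns pooled effect, its SE, and a z-score."""
    w = [1.0 / se**2 for se in ses]                        # inverse-variance weights
    beta = sum(wi * b for wi, b in zip(w, betas)) / sum(w)  # pooled effect
    se = math.sqrt(1.0 / sum(w))                            # pooled standard error
    return beta, se, beta / se

# hypothetical effect estimates for one SNP-gene pair in three tissues;
# the third tissue shows essentially no effect (heterogeneity)
betas = [0.30, 0.25, 0.02]
ses = [0.10, 0.12, 0.11]
b, se, z = fixed_effects_meta(betas, ses)
```

The pooled standard error is smaller than any single tissue's, which is the power gain from combining tissues; the cost, as the abstract notes, is that a fixed-effects pool dilutes eQTLs active in only a subset of tissues, motivating the heterogeneity-aware alternatives.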

  12. Cultivating characters (moral value) through internalization strategy in science classroom

    NASA Astrophysics Data System (ADS)

    Ibrahim, M.; Abadi

    2018-01-01

It is still debated whether characters constitute an important learning outcome to be realized by design. Most people have assumed that characters are reached as a nurturance effect, i.e., that students who are knowledgeable and skillful will automatically have good characters. Evidence obtained lately shows that this assumption is not true: characters should be taught deliberately, by design. This study was designed to cultivate elementary school students' characters through the science classroom. The teaching-learning process was conducted to facilitate and bridge the students from the known (concrete images: science phenomena) to the unknown (abstract ideas: the characters of care and tolerance). Characters were observed five weeks before and after the intervention. Data were analyzed from observation of 24 students in internalization-strategy-based courses. Qualitative and quantitative data suggest that the internalization strategy, which uses science phenomena to represent abstract ideas (characters) in the science classroom, positively cultivates characters.

  13. Intramolecular Long-Distance Electron Transfer in Organic Molecules

    NASA Astrophysics Data System (ADS)

    Closs, Gerhard L.; Miller, John R.

    1988-04-01

Intramolecular long-distance electron transfer (ET) has been actively studied in recent years in order to test existing theories in a quantitative way and to provide the necessary constants for predicting ET rates from simple structural parameters. Theoretical predictions of an "inverted region," where increasing the driving force of the reaction will decrease its rate, have begun to be experimentally confirmed. A predicted nonlinear dependence of ET rates on the polarity of the solvent has also been confirmed. This work has implications for the design of efficient photochemical charge-separation devices. Other studies have been directed toward determining the distance dependence of ET reactions. Model studies on different series of compounds give similar distance dependences. When different stereochemical structures are compared, it becomes apparent that geometrical factors must be taken into account. Finally, the mechanism of coupling between donor and acceptor in weakly interacting systems has become of major importance. The theoretical and experimental evidence favors a model in which coupling is provided by the interaction with the orbitals of the intervening molecular fragments, although more experimental evidence is needed. Studies on intramolecular ET in organic model compounds have established that current theories give an adequate description of the process. The separation of electronic from nuclear coordinates is only a convenient approximation applied to many models, but in long-distance ET it works remarkably well. It is particularly gratifying to see Marcus' ideas finally confirmed after three decades of skepticism. By obtaining the numbers for quantitative correlations between rates and distances, these experiments have shown that saturated hydrocarbon fragments can "conduct" electrons over tens of angstroms.
A dramatic demonstration of this fact has recently been obtained by tunneling electron microscopy on Langmuir-Blodgett films, showing in a pictorial fashion that electrons prefer to travel from cathode to anode through the fatty-acid chains (46).
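The "inverted region" the abstract refers to follows directly from the classical Marcus rate expression for nonadiabatic electron transfer (the standard textbook form, not reproduced from this record):

```latex
% H_{AB}: donor-acceptor electronic coupling
% \lambda: reorganization energy, \Delta G^{0}: driving force
k_{ET} \;=\; \frac{2\pi}{\hbar}\,\lvert H_{AB}\rvert^{2}\,
  \frac{1}{\sqrt{4\pi\lambda k_{B}T}}\,
  \exp\!\left[-\,\frac{\left(\Delta G^{0}+\lambda\right)^{2}}{4\lambda k_{B}T}\right]
```

The rate is maximal when the driving force matches the reorganization energy, $\Delta G^{0} = -\lambda$; making $\Delta G^{0}$ still more negative increases the exponent again and slows the reaction, which is the inverted-region behavior the Closs-Miller experiments confirmed.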

  14. Quantitative Modeling of Entangled Polymer Rheology: Experiments, Tube Models and Slip-Link Simulations

    NASA Astrophysics Data System (ADS)

    Desai, Priyanka Subhash

    Rheology properties are sensitive indicators of molecular structure and dynamics. The relationship between rheology and polymer dynamics is captured in the constitutive model, which, if accurate and robust, would greatly aid molecular design and polymer processing. This dissertation is thus focused on building accurate and quantitative constitutive models that can help predict linear and non-linear viscoelasticity. In this work, we have used a multi-pronged approach based on the tube theory, coarse-grained slip-link simulations, and advanced polymeric synthetic and characterization techniques, to confront some of the outstanding problems in entangled polymer rheology. First, we modified simple tube based constitutive equations in extensional rheology and developed functional forms to test the effect of Kuhn segment alignment on a) tube diameter enlargement and b) monomeric friction reduction between subchains. We, then, used these functional forms to model extensional viscosity data for polystyrene (PS) melts and solutions. We demonstrated that the idea of reduction in segmental friction due to Kuhn alignment is successful in explaining the qualitative difference between melts and solutions in extension as revealed by recent experiments on PS. Second, we compiled literature data and used it to develop a universal tube model parameter set and prescribed their values and uncertainties for 1,4-PBd by comparing linear viscoelastic G' and G" mastercurves for 1,4-PBds of various branching architectures. The high frequency transition region of the mastercurves superposed very well for all the 1,4-PBds irrespective of their molecular weight and architecture, indicating universality in high frequency behavior. Therefore, all three parameters of the tube model were extracted from this high frequency transition region alone. 
Third, we compared predictions of two versions of the tube model, Hierarchical model and BoB model against linear viscoelastic data of blends of 1,4-PBd star and linear melts. The star was carefully synthesized and characterized. We found massive failures of tube models to predict the terminal relaxation behavior of the star/linear blends. In addition, these blends were also tested against a coarse-grained slip-link model, the "Cluster Fixed Slip-link Model (CFSM)" of Schieber and coworkers. The CFSM with only two parameters gave excellent agreement with all experimental data for the blends.

  15. Exploring the boundary of a specialist service for adults with intellectual disabilities using a Delphi study: a quantification of stakeholder participation.

    PubMed

    Hempe, Eva-Maria; Morrison, Cecily; Holland, Anthony

    2015-10-01

There are arguments that a specialist service for adults with intellectual disabilities is needed to address the health inequalities that this group experiences. The boundary of such a specialist service, however, is unclear, and definition is difficult, given the varying experiences of the multiple stakeholder groups. The study reported here quantitatively investigates divergence in stakeholders' views of what constitutes a good specialist service for people with intellectual disabilities. It is the first step of a larger project that aims to investigate the purpose, function and design of such a specialist service. The results are intended to support policy and service development. A Delphi study was carried out to elicit the requirements of this new specialist service from stakeholder groups. It consisted of three panels (carers, frontline health professionals, researchers and policymakers) and had three rounds. The quantification of stakeholder participation covers the number of unique ideas per panel, the value of these ideas as determined by the other panels and the level of agreement within and between panels. There is some overlap of ideas about what should constitute this specialist service, but both carers and frontline health professionals contributed unique ideas. Many of these were valued by the researchers and policymakers. Interestingly, carers generated more ideas regarding how to deliver services than what services to deliver. Regarding whether ideas are considered appropriate, the variation both within and between groups is small. On the other hand, the feasibility of solutions is much more contested, with large variations among carers. This study provides a quantified representation of the diversity of ideas among stakeholder groups regarding where the boundary of a specialist service for adults with learning disabilities should sit. The results can be used as a starting point for the design process. 
The study also offers one way to measure the impact of participation for those interested in participation as a mechanism for service improvement. © 2013 The Authors Health Expectations Published by John Wiley & Sons Ltd.

  16. A bottom-up evolution of terrestrial ecosystem modeling theory, and ideas toward global vegetation modeling

    NASA Technical Reports Server (NTRS)

    Running, Steven W.

    1992-01-01

A primary purpose of this review is to convey lessons learned in the development of a forest ecosystem modeling approach, from its origins in 1973 as a single-tree water balance model to the current regional applications. The second intent is to use this accumulated experience to offer ideas on how terrestrial ecosystem modeling can be taken to the global scale: earth systems modeling. A logic is suggested where mechanistic ecosystem models are not themselves operated globally, but rather are used to 'calibrate' much simplified models, primarily driven by remote sensing, that could be implemented in a semiautomated way globally, and in principle could interface with atmospheric general circulation models (GCM's).

  17. There Are Better Ways. Building Smaller, Safer, Effective and Efficient Public Schools. New Ideas for School Construction in North Carolina and a Model for Implementation. New Ideas, Number 1.

    ERIC Educational Resources Information Center

    Haynes, Doug; Hood, John

    This paper offers unconventional and innovative ideas for school planning and construction in North Carolina for creating smaller and safer community schools in response to rising enrollment, tight budgets, and dwindling school space. Often using examples from across the country, the paper discusses school construction costs and economy of scale…

  18. Classroom Ideas for Encouraging Thinking and Feeling: A Total Creativity Program for Individualizing and Humanizing the Learning Process. Volume Five.

    ERIC Educational Resources Information Center

    Williams, Frank E.

    This volume, the final one in the series, presents about 400 ideas which teachers can use to teach creative thinking. The ideas are classified according to teacher behavior (strategies or modes of teaching) and by types of pupil behavior, as described in the rationale for the cognitive-affective instructional (CAI) model presented in volume 2. The…

  19. Status, Emerging Ideas and Future Directions of Turbulence Modeling Research in Aeronautics

    NASA Technical Reports Server (NTRS)

    Duraisamy, Karthik; Spalart, Philippe R.; Rumsey, Christopher L.

    2017-01-01

    In July 2017, a three-day Turbulence Modeling Symposium sponsored by the University of Michigan and NASA was held in Ann Arbor, Michigan. This meeting brought together nearly 90 experts from academia, government and industry, with good international participation, to discuss the state of the art in turbulence modeling, emerging ideas, and to wrestle with questions surrounding its future. Emphasis was placed on turbulence modeling in a predictive context in complex problems, rather than on turbulence theory or descriptive modeling. This report summarizes many of the questions, discussions, and conclusions from the symposium, and suggests immediate next steps.

  20. Improving Measurement in Health Education and Health Behavior Research Using Item Response Modeling: Comparison with the Classical Test Theory Approach

    ERIC Educational Resources Information Center

    Wilson, Mark; Allen, Diane D.; Li, Jun Corser

    2006-01-01

    This paper compares the approach and resultant outcomes of item response models (IRMs) and classical test theory (CTT). First, it reviews basic ideas of CTT, and compares them to the ideas about using IRMs introduced in an earlier paper. It then applies a comparison scheme based on the AERA/APA/NCME "Standards for Educational and…

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnowitt, R.; Nath, P.

A survey is given of supersymmetry and supergravity and their phenomenology. Some of the topics discussed are the basic ideas of global supersymmetry, the minimal supersymmetric Standard Model (MSSM) and its phenomenology, the basic ideas of local supersymmetry (supergravity), grand unification, supersymmetry breaking in supergravity grand unified models, radiative breaking of SU(2) × U(1), proton decay, cosmological constraints, and predictions of supergravity grand unified models. While the number of detailed derivations is necessarily limited, a sufficient number of results are given so that a reader can get a working knowledge of this field.

  2. Space spin-offs: is technology transfer worth it?

    NASA Astrophysics Data System (ADS)

    Bush, Lance B.

Dual-uses, spin-offs, and technology transfer have all become part of the space lexicon, creating a cultural attitude toward space activity justification. From the very beginning of space activities in the late 1950s, this idea of secondary benefits became a major part of the space culture and its belief system. Technology transfer has played a central role in public and political debates over funding for space activities. Over the years, several studies of the benefits of space activities have been performed, with some estimates reaching as high as a 60:1 return to the economy for each dollar spent in space activities, though many of the models claiming high returns have been roundly criticized. More recent studies of technology transfer from federal laboratories to the private sector show a return on investment of 2.8:1, with little evidence of job increases. Yet a purely quantitative analysis is not sufficient, as there exist cultural and social benefits attainable only through case studies. Space projects tend to have a long life cycle, making it difficult to track metrics on their secondary benefits. Recent studies have begun to make inroads towards a better understanding of the benefits and drawbacks of investing in technology transfer activities related to space, but significant analysis remains to be performed, which must include a combination of quantitative and qualitative approaches.

  3. Overview of States' Use of Telehealth for the Delivery of Early Intervention (IDEA Part C) Services.

    PubMed

    Cason, Jana; Behl, Diane; Ringwalt, Sharon

    2012-01-01

    Early intervention (EI) services are designed to promote the development of skills and enhance the quality of life of infants and toddlers who have been identified as having a disability or developmental delay, enhance capacity of families to care for their child with special needs, reduce future educational costs, and promote independent living (NECTAC, 2011). EI services are regulated by Part C of the Individuals with Disabilities Education Improvement Act (IDEA); however, personnel shortages, particularly in rural areas, limit access for children who qualify. Telehealth is an emerging delivery model demonstrating potential to deliver EI services effectively and efficiently, thereby improving access and ameliorating the impact of provider shortages in underserved areas. The use of a telehealth delivery model facilitates inter-disciplinary collaboration, coordinated care, and consultation with specialists not available within a local community. A survey sent by the National Early Childhood Technical Assistance Center (NECTAC) to IDEA Part C coordinators assessed their utilization of telehealth within states' IDEA Part C programs. Reimbursement for provider type and services and barriers to implement a telehealth service delivery model were identified. Representatives from 26 states and one jurisdiction responded to the NECTAC telehealth survey. Of these, 30% (n=9) indicated that they are either currently using telehealth as an adjunct service delivery model (n=6) or plan to incorporate telehealth within the next 1-2 years (n=3). Identified telehealth providers included developmental specialists, teachers of the Deaf/Hard of Hearing (DHH), speech-language pathologists, occupational therapists, physical therapists, behavior specialists, audiologists, and interpreters. Reimbursement was variable and included use of IDEA Part C funding, Medicaid, and private insurance. 
Expressed barriers and concerns for the implementation of telehealth as a delivery model within Part C programming included security issues (40%; n=11); privacy issues (44%; n=12); concerns about quality of services delivered via telehealth (40%; n=11); and lack of evidence to support the effectiveness of a telehealth service delivery model within IDEA Part C programming (3%; n=1). Reimbursement policy and billing processes and technology infrastructure were also identified as barriers impacting the implementation of telehealth programming. Provider shortages impact the quantity and quality of services available for children with disabilities and developmental delay, particularly in rural areas. While many states are incorporating telehealth within their Early Intervention (IDEA Part C) services in order to improve access and overcome personnel shortages, barriers persist. Policy development, education of stakeholders, research, utilization of secure and private delivery platforms, and advocacy may facilitate more widespread adoption of telehealth within IDEA Part C programs across the country.

  4. Overview of States’ Use of Telehealth for the Delivery of Early Intervention (IDEA Part C) Services

    PubMed Central

    Cason, Jana; Behl, Diane; Ringwalt, Sharon

    2012-01-01

    Background: Early intervention (EI) services are designed to promote the development of skills and enhance the quality of life of infants and toddlers who have been identified as having a disability or developmental delay, enhance capacity of families to care for their child with special needs, reduce future educational costs, and promote independent living (NECTAC, 2011). EI services are regulated by Part C of the Individuals with Disabilities Education Improvement Act (IDEA); however, personnel shortages, particularly in rural areas, limit access for children who qualify. Telehealth is an emerging delivery model demonstrating potential to deliver EI services effectively and efficiently, thereby improving access and ameliorating the impact of provider shortages in underserved areas. The use of a telehealth delivery model facilitates inter-disciplinary collaboration, coordinated care, and consultation with specialists not available within a local community. Method: A survey sent by the National Early Childhood Technical Assistance Center (NECTAC) to IDEA Part C coordinators assessed their utilization of telehealth within states’ IDEA Part C programs. Reimbursement for provider type and services and barriers to implement a telehealth service delivery model were identified. Results: Representatives from 26 states and one jurisdiction responded to the NECTAC telehealth survey. Of these, 30% (n=9) indicated that they are either currently using telehealth as an adjunct service delivery model (n=6) or plan to incorporate telehealth within the next 1–2 years (n=3). Identified telehealth providers included developmental specialists, teachers of the Deaf/Hard of Hearing (DHH), speech-language pathologists, occupational therapists, physical therapists, behavior specialists, audiologists, and interpreters. Reimbursement was variable and included use of IDEA Part C funding, Medicaid, and private insurance. 
Expressed barriers and concerns for the implementation of telehealth as a delivery model within Part C programming included security issues (40%; n=11); privacy issues (44%; n=12); concerns about quality of services delivered via telehealth (40%; n=11); and lack of evidence to support the effectiveness of a telehealth service delivery model within IDEA Part C programming (3%; n=1). Reimbursement policy and billing processes and technology infrastructure were also identified as barriers impacting the implementation of telehealth programming. Conclusions: Provider shortages impact the quantity and quality of services available for children with disabilities and developmental delay, particularly in rural areas. While many states are incorporating telehealth within their Early Intervention (IDEA Part C) services in order to improve access and overcome personnel shortages, barriers persist. Policy development, education of stakeholders, research, utilization of secure and private delivery platforms, and advocacy may facilitate more widespread adoption of telehealth within IDEA Part C programs across the country. PMID:25945202

  5. Linking Adverse Outcome Pathways to Dynamic Energy Budgets: A Conceptual Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murphy, Cheryl; Nisbet, Roger; Antczak, Philipp

Ecological risk assessment quantifies the likelihood of undesirable impacts of stressors, primarily at high levels of biological organization. Data used to inform ecological risk assessments come primarily from tests on individual organisms or from suborganismal studies, indicating a disconnect between primary data and protection goals. We know how to relate individual responses to population dynamics using individual-based models, and there are emerging ideas on how to make connections to ecosystem services. However, there is no established methodology to connect effects seen at higher levels of biological organization with suborganismal dynamics, despite progress made in identifying Adverse Outcome Pathways (AOPs) that link molecular initiating events to ecologically relevant key events. This chapter is a product of a working group at the National Institute for Mathematical and Biological Synthesis (NIMBioS) that assessed the feasibility of using dynamic energy budget (DEB) models of individual organisms as a “pivot” connecting suborganismal processes to higher level ecological processes. AOP models quantify explicit molecular, cellular or organ-level processes, but do not offer a route to linking sub-organismal damage to adverse effects on individual growth, reproduction, and survival, which can be propagated to the population level through individual-based models. DEB models describe these processes, but use abstract variables with undetermined connections to suborganismal biology. We propose linking DEB and quantitative AOP models by interpreting AOP key events as measures of damage-inducing processes in a DEB model. Here, we present a conceptual model for linking AOPs to DEB models and review existing modeling tools available for both AOP and DEB.

  6. Complex systems as lenses on learning and teaching

    NASA Astrophysics Data System (ADS)

    Hurford, Andrew C.

    From metaphors to mathematized models, the complexity sciences are changing the ways disciplines view their worlds, and ideas borrowed from complexity are increasingly being used to structure conversations and guide research on teaching and learning. The purpose of this corpus of research is to further those conversations and to extend complex systems ideas, theories, and modeling to curricula and to research on learning and teaching. A review of the literatures of learning and of complexity science and a discussion of the intersections between those disciplines are provided. The work reported represents an evolving model of learning qua complex system and that evolution is the result of iterative cycles of design research. One of the signatures of complex systems is the presence of scale invariance and this line of research furnishes empirical evidence of scale invariant behaviors in the activity of learners engaged in participatory simulations. The offered discussion of possible causes for these behaviors and chaotic phase transitions in human learning favors real-time optimization of decision-making as the means for producing such behaviors. Beyond theoretical development and modeling, this work includes the development of teaching activities intended to introduce pre-service mathematics and science teachers to complex systems. While some of the learning goals for this activity focused on the introduction of complex systems as a content area, we also used complex systems to frame perspectives on learning. Results of scoring rubrics and interview responses from students illustrate attributes of the proposed model of complex systems learning and also how these pre-service teachers made sense of the ideas. Correlations between established theories of learning and a complex adaptive systems model of learning are established and made explicit, and a means for using complex systems ideas for designing instruction is offered. 
It is a fundamental assumption of this research and researcher that complex systems ideas and understandings can be appropriated from more complexity-developed disciplines and put to use modeling and building increasingly productive understandings of learning and teaching.

  7. Infrequent social interaction can accelerate the spread of a persuasive idea.

    PubMed

    Burridge, James; Gnacik, Michał

    2016-12-01

    We study the spread of a persuasive new idea through a population of continuous-time random walkers in one dimension. The idea spreads via social gatherings involving groups of nearby walkers who act according to a biased "majority rule": After each gathering, the group takes on the new idea if more than a critical fraction (1-ɛ)/2 < 1/2 of them already hold it; otherwise they all reject it. The boundary of a domain where the new idea has taken hold expands as a traveling wave in the density of new idea holders. Our walkers move by Lévy motion, and we compute the wave velocity analytically as a function of the frequency of social gatherings and the exponent of the jump distribution. When this distribution is sufficiently heavy tailed, then, counter to intuition, the idea can propagate faster if social gatherings are held less frequently. When jumps are truncated, a critical gathering frequency can emerge which maximizes propagation velocity. We explore our model by simulation, confirming our analytical results.
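The model described in this abstract can be sketched as a toy Monte Carlo simulation. This is an illustrative reconstruction, not the authors' code: heavy-tailed jumps are drawn from a symmetric Pareto-like distribution, and "nearby" gatherings are approximated by binning the line into fixed-width cells; all parameter values are arbitrary choices for demonstration.

```python
import random

def levy_jump(alpha, scale=0.1):
    # Symmetric power-law jump; small alpha gives a heavier tail.
    u = 1.0 - random.random()          # u in (0, 1], avoids division by zero
    step = scale * (u ** (-1.0 / alpha) - 1.0)
    return step if random.random() < 0.5 else -step

def simulate(n_walkers=200, steps=400, gather_every=5, cell_width=1.0,
             eps=0.4, alpha=1.5, seed=0):
    """Return the final fraction of walkers holding the new idea."""
    random.seed(seed)
    pos = [random.uniform(0.0, 20.0) for _ in range(n_walkers)]
    # Seed the idea in the leftmost 20% of walkers.
    holds = [False] * n_walkers
    for i in sorted(range(n_walkers), key=lambda i: pos[i])[: n_walkers // 5]:
        holds[i] = True
    threshold = (1.0 - eps) / 2.0      # biased rule: critical fraction < 1/2
    for t in range(steps):
        for i in range(n_walkers):
            pos[i] += levy_jump(alpha)
        if t % gather_every == 0:
            # Gather walkers sharing a cell and apply the majority rule.
            cells = {}
            for i in range(n_walkers):
                cells.setdefault(int(pos[i] // cell_width), []).append(i)
            for members in cells.values():
                if len(members) < 2:
                    continue
                frac = sum(holds[i] for i in members) / len(members)
                outcome = frac > threshold
                for i in members:
                    holds[i] = outcome
    return sum(holds) / n_walkers

adoption = simulate()
```

Varying `gather_every` and `alpha` in such a sketch is one way to probe the abstract's counterintuitive claim that, for sufficiently heavy-tailed jumps, less frequent gatherings can speed propagation.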

  8. Quantitative safety assessment of air traffic control systems through system control capacity

    NASA Astrophysics Data System (ADS)

    Guo, Jingjing

    Quantitative Safety Assessments (QSA) are essential to safety benefit verification and regulations of developmental changes in safety critical systems like the Air Traffic Control (ATC) systems. Effectiveness of the assessments is particularly desirable today in the safe implementations of revolutionary ATC overhauls like NextGen and SESAR. QSA of ATC systems are however challenged by system complexity and lack of accident data. Extending from the idea "safety is a control problem" in the literature, this research proposes to assess system safety from the control perspective, through quantifying a system's "control capacity". A system's safety performance correlates to this "control capacity" in the control of "safety critical processes". To examine this idea in QSA of the ATC systems, a Control-capacity Based Safety Assessment Framework (CBSAF) is developed which includes two control capacity metrics and a procedural method. The two metrics are Probabilistic System Control-capacity (PSC) and Temporal System Control-capacity (TSC); each addresses an aspect of a system's control capacity. The procedural method consists of three general stages: I) identification of safety critical processes, II) development of system control models and III) evaluation of system control capacity. The CBSAF was tested in two case studies. The first one assesses an en-route collision avoidance scenario and compares three hypothetical configurations. The CBSAF was able to capture the uncoordinated behavior between two means of control, as was observed in a historic midair collision accident. The second case study compares CBSAF with an existing risk based QSA method in assessing the safety benefits of introducing a runway incursion alert system. Similar conclusions are reached between the two methods, while the CBSAF has the advantage of simplicity and provides a new control-based perspective and interpretation to the assessments. 
The case studies are intended to investigate the potential and demonstrate the utilities of CBSAF and are not intended for thorough studies of collision avoidance and runway incursions safety, which are extremely challenging problems. Further development and thorough validations are required to allow CBSAF to reach implementation phases, e.g. addressing the issues of limited scalability and subjectivity.

  9. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast

    PubMed Central

    Pang, Wei; Coghill, George M.

    2015-01-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First, the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of the osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling dynamic biological systems is discussed. PMID:25864377

  10. Third-Kind Encounters in Biomedicine: Immunology Meets Mathematics and Informatics to Become Quantitative and Predictive.

    PubMed

    Eberhardt, Martin; Lai, Xin; Tomar, Namrata; Gupta, Shailendra; Schmeck, Bernd; Steinkasserer, Alexander; Schuler, Gerold; Vera, Julio

    2016-01-01

    The understanding of the immune response is right now at the center of biomedical research. There are growing expectations that immune-based interventions will in the midterm provide new, personalized, and targeted therapeutic options for many severe and highly prevalent diseases, from aggressive cancers to infectious and autoimmune diseases. To this end, immunology should surpass its current descriptive and phenomenological nature, and become quantitative, and thereby predictive. Immunology is an ideal field for deploying the tools, methodologies, and philosophy of systems biology, an approach that combines quantitative experimental data, computational biology, and mathematical modeling. This is because, from an organism-wide perspective, the immunity is a biological system of systems, a paradigmatic instance of a multi-scale system. At the molecular scale, the critical phenotypic responses of immune cells are governed by large biochemical networks, enriched in nested regulatory motifs such as feedback and feedforward loops. This network complexity confers on them the capacity for highly nonlinear behavior, including remarkable examples of homeostasis, ultra-sensitivity, hysteresis, and bistability. Moving up from the cellular level, different immune cell populations communicate with each other by direct physical contact or by receiving and secreting signaling molecules such as cytokines. Moreover, the interaction of the immune system with its potential targets (e.g., pathogens or tumor cells) is far from simple, as it involves a number of attack and counterattack mechanisms that ultimately constitute a tightly regulated multi-feedback loop system. 
From a more practical perspective, this leads to the consequence that today's immunologists are facing an ever-increasing challenge of integrating massive quantities of data from multiple platforms. In this chapter, we support the idea that the analysis of the immune system demands the use of systems-level approaches to ensure success in the search for more effective and personalized immune-based therapies.

  11. Euromech 579 Arpino 3-8 April 2017: Generalized and microstructured continua: new ideas in modeling and/or applications to structures with (nearly)inextensible fibers—a review of presentations and discussions

    NASA Astrophysics Data System (ADS)

    Laudato, Marco; Di Cosmo, Fabio

    2018-04-01

    In the present paper, a rational report on Euromech 579, Generalized and Microstructured Continua: New ideas in modeling and/or Applications to Structures with (nearly)inextensible fibers (Arpino 3-8 April 2017), is provided. The main aim of the colloquium was to provide a forum for experts in generalized and microstructured continua with inextensible fibers to exchange ideas and get informed about the latest research trends in the domain. The interested reader will find more details about the colloquium at the dedicated web page http://www.memocsevents.eu/euromech579.

  12. Developing and Evaluating an Eighth Grade Curriculum Unit That Links Foundational Chemistry to Biological Growth. Paper #1: Selecting Core Ideas and Practices -- An Iterative Process

    ERIC Educational Resources Information Center

    Roseman, Jo Ellen; Herrmann-Abell, Cari; Flanagan, Jean; Kruse, Rebecca; Howes, Elaine; Carlson, Janet; Roth, Kathy; Bourdelat-Parks, Brooke

    2013-01-01

    Researchers at AAAS and BSCS have developed a six-week unit that aims to help middle school students learn important chemistry ideas that can be used to explain growth and repair in animals and plants. By integrating core physical and life science ideas and engaging students in the science practices of modeling and constructing explanations, the…

  13. Tutorial: Physics and modeling of Hall thrusters

    NASA Astrophysics Data System (ADS)

    Boeuf, Jean-Pierre

    2017-01-01

    Hall thrusters are very efficient and competitive electric propulsion devices for satellites and are currently in use in a number of telecommunications and government spacecraft. Their power spans from 100 W to 20 kW, with thrust between a few mN and 1 N and specific impulse values between 1000 and 3000 s. The basic idea of Hall thrusters consists in generating a large local electric field in a plasma by using a transverse magnetic field to reduce the electron conductivity. This electric field can extract positive ions from the plasma and accelerate them to high velocity without extraction grids, providing the thrust. These principles are simple in appearance but the physics of Hall thrusters is very intricate and non-linear because of the complex electron transport across the magnetic field and its coupling with the electric field and the neutral atom density. This paper describes the basic physics of Hall thrusters and gives a (non-exhaustive) summary of the research efforts that have been devoted to the modelling and understanding of these devices in the last 20 years. Although the predictive capabilities of the models are still not sufficient for a full computer aided design of Hall thrusters, significant progress has been made in the qualitative and quantitative understanding of these devices.

  14. A trade secret model for genomic biobanking.

    PubMed

    Conley, John M; Mitchell, Robert; Cadigan, R Jean; Davis, Arlene M; Dobson, Allison W; Gladden, Ryan Q

    2012-01-01

    Genomic biobanks present ethical challenges that are qualitatively unique and quantitatively unprecedented. Many critics have questioned whether the current system of informed consent can be meaningfully applied to genomic biobanking. Proposals for reform have come from many directions, but have tended to involve incremental change in current informed consent practice. This paper reports on our efforts to seek new ideas and approaches from those whom informed consent is designed to protect: research subjects. Our model emerged from semi-structured interviews with healthy volunteers who had been recruited to join either of two biobanks (some joined, some did not), and whom we encouraged to explain their concerns and how they understood the relationship between specimen contributors and biobanks. These subjects spoke about their DNA and the information it contains in ways that were strikingly evocative of the legal concept of the trade secret. They then described the terms and conditions under which they might let others study their DNA, and there was a compelling analogy to the commonplace practice of trade secret licensing. We propose a novel biobanking model based on this trade secret concept, and argue that it would be a practical, legal, and ethical improvement on the status quo. © 2012 American Society of Law, Medicine & Ethics, Inc.

  15. Fast history matching of time-lapse seismic and production data for high resolution models

    NASA Astrophysics Data System (ADS)

    Jimenez Arismendi, Eduardo Antonio

    Integrated reservoir modeling has become an important part of day-to-day decision analysis in oil and gas management practices. A very attractive and promising technology is the use of time-lapse or 4D seismic as an essential component in subsurface modeling. Today, 4D seismic is enabling oil companies to optimize production and increase recovery through monitoring fluid movements throughout the reservoir. 4D seismic advances are also being driven by an increased need by the petroleum engineering community to become more quantitative and accurate in our ability to monitor reservoir processes. Qualitative interpretations of time-lapse anomalies are being replaced by quantitative inversions of 4D seismic data to produce accurate maps of fluid saturations, pore pressure, temperature, among others. Within all steps involved in this subsurface modeling process, the most demanding one is integrating the geologic model with dynamic field data, including 4D seismic when available. The validation of the geologic model with observed dynamic data is accomplished through a "history matching" (HM) process typically carried out with well-based measurements. Due to low resolution of production data, the validation process is severely limited in its reservoir areal coverage, compromising the quality of the model and any subsequent predictive exercise. This research will aim to provide a novel history matching approach that can use information from high-resolution seismic data to supplement the areally sparse production data. The proposed approach will utilize streamline-derived sensitivities as a means of relating the forward model performance with the prior geologic model. The essential ideas underlying this approach are similar to those used for high-frequency approximations in seismic wave propagation. In both cases, this leads to solutions that are defined along "streamlines" (fluid flow), or "rays" (seismic wave propagation). 
Synthetic and field data examples will be used extensively to demonstrate the value and contribution of this work. Our results show that the problem of non-uniqueness in this complex history matching problem is greatly reduced when constraints in the form of saturation maps from spatially closely sampled seismic data are included. Further on, our methodology can be used to quickly identify discrepancies between static and dynamic modeling. Reducing this gap will ensure robust and reliable models leading to accurate predictions and ultimately an optimum hydrocarbon extraction.

  16. TOWARDS OPERATIONAL FORECASTING OF LOWER ATMOSPHERE EFFECTS ON THE UPPER ATMOSPHERE AND IONOSPHERE: INTEGRATED DYNAMICS IN EARTH’S ATMOSPHERE (IDEA)

    NASA Astrophysics Data System (ADS)

    Akmaev, R. A.; Fuller-Rowell, T. J.; Wu, F.; Wang, H.; Juang, H.; Moorthi, S.; Iredell, M.

    2009-12-01

    The upper atmosphere and ionosphere exhibit variability and phenomena that have been associated with planetary and tidal waves originating in the lower atmosphere. To study and be able to predict the effects of these global-scale dynamical perturbations on the coupled thermosphere-ionosphere-electrodynamics system a new coupled model is being developed under the IDEA project. To efficiently cross the infamous R2O “death valley”, from the outset the IDEA project leverages the natural synergy between NOAA’s National Weather Service’s (NWS) Space Weather Prediction and Environmental Modeling Centers and a NOAA-University of Colorado cooperative institute (CIRES). IDEA interactively couples a Whole Atmosphere Model (WAM) with ionosphere-plasmasphere and electrodynamics models. WAM is a 150-layer general circulation model (GCM) based on NWS’s operational weather prediction Global Forecast System (GFS) extended from its nominal top altitude of 62 km to over 600 km. It incorporates relevant physical processes including those responsible for the generation of tidal and planetary waves in the troposphere and stratosphere. Long-term simulations reveal realistic seasonal variability of tidal waves with a substantial contribution from non-migrating tidal modes, recently implicated in the observed morphology of the ionosphere. Such phenomena as the thermospheric Midnight Temperature Maximum (MTM), previously associated with the tides, are also realistically simulated for the first time.

  17. The Dreyfus model of clinical problem-solving skills acquisition: a critical perspective

    PubMed Central

    Peña, Adolfo

    2010-01-01

    Context The Dreyfus model describes how individuals progress through various levels in their acquisition of skills and subsumes ideas with regard to how individuals learn. Such a model is being accepted almost without debate from physicians to explain the ‘acquisition’ of clinical skills. Objectives This paper reviews such a model, discusses several controversial points, clarifies what kind of knowledge the model is about, and examines its coherence in terms of problem-solving skills. Dreyfus' main idea that intuition is a major aspect of expertise is also discussed in some detail. Relevant scientific evidence from cognitive science, psychology, and neuroscience is reviewed to accomplish these aims. Conclusions Although the Dreyfus model may partially explain the ‘acquisition’ of some skills, it is debatable if it can explain the acquisition of clinical skills. The complex nature of clinical problem-solving skills and the rich interplay between the implicit and explicit forms of knowledge must be taken into consideration when we want to explain ‘acquisition’ of clinical skills. The idea that experts work from intuition, not from reason, should be evaluated carefully. PMID:20563279

  18. Interpersonal Harmony and Conflict for Chinese People: A Yin-Yang Perspective.

    PubMed

    Huang, Li-Li

    2016-01-01

    This article provides an overview on a series of original studies conducted by the author. The aim here is to present the ideas that the author reconstructed, based on the dialectics of harmonization, regarding harmony and conflict embodied in traditional Chinese thought, and to describe how a formal psychological theory/model on interpersonal harmony and conflict was developed based on the Yin-Yang perspective. The paper also details how essential theories on interpersonal harmony and conflict were constructed under this formal model by conducting a qualitative study involving in-depth interviews with 30 adults. Psychological research in Western society has, intriguingly, long been focused more on interpersonal conflict than on interpersonal harmony. By contrast, the author's work started from the viewpoint of a materialist conception of history and dialectics of harmonization in order to reinterpret traditional Chinese thought. Next, a "dynamic model of interpersonal harmony and conflict" was developed, as a formal psychological theory, based on the real-virtual notions in the Yin-Yang perspective. Under this model, interpersonal harmony and conflict can be classified into genuine versus superficial harmony and authentic versus virtual focus conflict, and implicit/hidden conflict is regarded as superficial harmony. Subsequently, the author conducted a series of quantitative studies on interpersonal harmony and conflict within parent-child, supervisor-subordinate, and friend-friend relationships in order to verify the construct validity and the predictive validity of the dynamic model of interpersonal harmony and conflict. The claim presented herein is that Chinese traditional thought and the psychological theory/model based on the Yin-Yang perspective can be combined. Accordingly, by combining qualitative and quantitative empirical research, the relevant substantive theory can be developed and the concepts can be validated. 
Thus, this work represents the realization of a series of modern Chinese indigenous psychological research studies rooted in traditional cultural thought and the Yin-Yang perspective. The work also mirrors the current conflict-management research that has incorporated the Chinese notion of harmony and adopted the Yin-Yang perspective on culture.

  19. Interpersonal Harmony and Conflict for Chinese People: A Yin–Yang Perspective

    PubMed Central

    Huang, Li-Li

    2016-01-01

    This article provides an overview on a series of original studies conducted by the author. The aim here is to present the ideas that the author reconstructed, based on the dialectics of harmonization, regarding harmony and conflict embodied in traditional Chinese thought, and to describe how a formal psychological theory/model on interpersonal harmony and conflict was developed based on the Yin–Yang perspective. The paper also details how essential theories on interpersonal harmony and conflict were constructed under this formal model by conducting a qualitative study involving in-depth interviews with 30 adults. Psychological research in Western society has, intriguingly, long been focused more on interpersonal conflict than on interpersonal harmony. By contrast, the author’s work started from the viewpoint of a materialist conception of history and dialectics of harmonization in order to reinterpret traditional Chinese thought. Next, a “dynamic model of interpersonal harmony and conflict” was developed, as a formal psychological theory, based on the real-virtual notions in the Yin–Yang perspective. Under this model, interpersonal harmony and conflict can be classified into genuine versus superficial harmony and authentic versus virtual focus conflict, and implicit/hidden conflict is regarded as superficial harmony. Subsequently, the author conducted a series of quantitative studies on interpersonal harmony and conflict within parent–child, supervisor–subordinate, and friend–friend relationships in order to verify the construct validity and the predictive validity of the dynamic model of interpersonal harmony and conflict. The claim presented herein is that Chinese traditional thought and the psychological theory/model based on the Yin–Yang perspective can be combined. Accordingly, by combining qualitative and quantitative empirical research, the relevant substantive theory can be developed and the concepts can be validated. 
Thus, this work represents the realization of a series of modern Chinese indigenous psychological research studies rooted in traditional cultural thought and the Yin–Yang perspective. The work also mirrors the current conflict-management research that has incorporated the Chinese notion of harmony and adopted the Yin–Yang perspective on culture. PMID:27375526

  20. Thermal and Electrical Investigation of Conductive Polylactic Acid Based Filaments

    NASA Astrophysics Data System (ADS)

    Dobre, R. A.; Marcu, A. E.; Drumea, A.; Vlădescu, M.

    2018-06-01

    Printed electronics gain momentum as the involved technologies become affordable. The ability to shape electrostatic dissipative materials in almost any form is useful. The idea to use a general-purpose 3D printer to manufacture the electrical interconnections for a circuit is very attractive. The advantages of using a 3D printed structure over other technologies are mainly the lower price, fewer requirements concerning storage and use conditions, and the capability to build thicker traces while maintaining flexibility. The main element allowing this to happen is a printing filament with conductive properties. The paper shows the experiments that were performed to determine the thermal and electrical properties of polylactic acid (PLA) based ESD dissipative filament. Quantitative results regarding the thermal behavior of the DC resistance and the variation of the equivalent parallel impedance model parameters (losses resistance, capacitance, impedance magnitude and phase angle) with frequency are shown. Using these results, new applications like printed temperature sensors can be imagined.

  1. A theoretically based evaluation of HIV/AIDS prevention campaigns along the trans-Africa highway in Kenya.

    PubMed

    Witte, K; Cameron, K A; Lapinski, M K; Nzyuko, S

    1998-01-01

    Print HIV/AIDS prevention campaign materials (e.g., posters, pamphlets, stickers) from 10 public health organizations in Kenya were evaluated according to the Extended Parallel Process Model (EPPM), a health behavior change theory based on the fear appeal literature, at various sites along the Trans-Africa Highway in Kenya. Three groups each of commercial sex workers (CSWs), truck drivers (TDs) and their assistants (ASSTs), and young men (YM) who live and work at the truck stops participated in focus group discussions where reactions to the campaign materials were gathered according to this theoretical base. Reactions to campaign materials varied substantially, according to the poster or pamphlet viewed. Overall, most participants wanted more detailed information about (a) the proper way to use condoms, (b) ideas for how to negotiate condom use with reluctant partners, and (c) accurate information on symptoms of AIDS and what to do once one contracted HIV. Both quantitative and qualitative analyses of the campaign materials are reported.

  2. Rethinking balance and impartiality in journalism? How the BBC attempted and failed to change the paradigm

    PubMed Central

    Wahl-Jorgensen, Karin; Berry, Mike; Garcia-Blanco, Iñaki; Bennett, Lucy; Cable, Jonathan

    2016-01-01

    This article reconsiders the concepts of balance and impartiality in journalism, in the context of a quantitative content analysis of sourcing patterns in BBC news programming on radio, television and online in 2007 and 2012. Impartiality is the cornerstone of principles of public service broadcasting at the BBC and other broadcasters modelled on it. However, the article suggests that in the case of the BBC, it is principally put into practice through juxtaposing the positions of the two main political parties – Conservative and Labour. On this basis, the article develops the idea of the ‘paradigm of impartiality-as-balance.’ This paradigm prevails despite the news organisation’s commitment to representing a broader range of opinion. The paradigm of impartiality-as-balance means that only a narrow range of views and voices are heard on the most contentious and important issues. Further, it results in reporting that focuses on party-political conflict, to the detriment of a journalism which provides much-needed context. PMID:29278243

  3. Research on Customer Value Based on Extension Data Mining

    NASA Astrophysics Data System (ADS)

    Chun-Yan, Yang; Wei-Hua, Li

    Extenics is a new discipline for dealing with contradiction problems using formalized models. Extension data mining (EDM) is a product combining Extenics with data mining. It explores acquiring knowledge based on extension transformations, called extension knowledge (EK), by taking advantage of extension methods and data mining technology. EK includes extensible classification knowledge, conductive knowledge and so on. Extension data mining technology (EDMT) is a new data mining technology that mines EK in databases or data warehouses. Customer value (CV) weighs the importance of a customer relationship for an enterprise, with the enterprise as the subject assessing value and the customers as the objects being assessed. CV varies continually. Mining the changing knowledge of CV in databases using EDMT, including quantitative-change knowledge and qualitative-change knowledge, can provide a foundation for an enterprise to decide its strategy of customer relationship management (CRM). It can also provide a new idea for studying CV.

  4. The application of systems thinking in health: why use systems thinking?

    PubMed

    Peters, David H

    2014-08-26

    This paper explores the question of what systems thinking adds to the field of global health. Observing that elements of systems thinking are already common in public health research, the article discusses which of the large body of theories, methods, and tools associated with systems thinking are more useful. The paper reviews the origins of systems thinking, describing a range of the theories, methods, and tools. A common thread is the idea that the behavior of systems is governed by common principles that can be discovered and expressed. They each address problems of complexity, which is a frequent challenge in global health. The different methods and tools are suited to different types of inquiry and involve both qualitative and quantitative techniques. The paper concludes by emphasizing that explicit models used in systems thinking provide new opportunities to understand and continuously test and revise our understanding of the nature of things, including how to intervene to improve people's health.

  5. Rethinking balance and impartiality in journalism? How the BBC attempted and failed to change the paradigm.

    PubMed

    Wahl-Jorgensen, Karin; Berry, Mike; Garcia-Blanco, Iñaki; Bennett, Lucy; Cable, Jonathan

    2017-08-01

    This article reconsiders the concepts of balance and impartiality in journalism, in the context of a quantitative content analysis of sourcing patterns in BBC news programming on radio, television and online in 2007 and 2012. Impartiality is the cornerstone of principles of public service broadcasting at the BBC and other broadcasters modelled on it. However, the article suggests that in the case of the BBC, it is principally put into practice through juxtaposing the positions of the two main political parties - Conservative and Labour. On this basis, the article develops the idea of the 'paradigm of impartiality-as-balance.' This paradigm prevails despite the news organisation's commitment to representing a broader range of opinion. The paradigm of impartiality-as-balance means that only a narrow range of views and voices are heard on the most contentious and important issues. Further, it results in reporting that focuses on party-political conflict, to the detriment of a journalism which provides much-needed context.

  6. Shock wave-free interface interaction

    NASA Astrophysics Data System (ADS)

    Frolov, Roman; Minev, Peter; Krechetnikov, Rouslan

    2016-11-01

    The problem of shock wave-free interface interaction has been widely studied in the context of compressible two-fluid flows using analytical, experimental, and numerical techniques. While various physical effects and possible interaction patterns for various geometries have been identified in the literature, the effects of viscosity and surface tension are usually neglected in such models. In our study, we apply a novel numerical algorithm for the simulation of viscous compressible two-fluid flows with surface tension to investigate the influence of these effects on the shock-interface interaction. The method combines the ideas of the Finite Volume adaptation of the invariant-domains-preserving algorithm for systems of hyperbolic conservation laws by Guermond and Popov with the ADI parallel solver for the viscous incompressible Navier-Stokes equations by Guermond and Minev. This combination has been further extended to the two-fluid flow case, including surface tension effects. Here we report on a quantitative study of how surface tension and viscosity affect the structure of the shock wave-free interface interaction region.

  7. Energy on the Home Front

    NASA Astrophysics Data System (ADS)

    Murphy, Thomas W.

    2011-11-01

    This article explores a variety of ways to measure, adjust, and augment home energy usage. Particular examples of using electricity and gas utility meters, power/energy meters for individual devices, whole-home energy monitoring, infrared cameras, and thermal measurements are discussed—leading to a factor-of-four reduction in home energy use in the case discussed. The net efficiency performance of a stand-alone photovoltaic system is also presented. Ideas for reducing one's energy/carbon footprint both within the home and in the larger community are quantitatively evaluated.

  8. Instantaneous Assessment Of Athletic Performance Using High Speed Video

    NASA Astrophysics Data System (ADS)

    Hubbard, Mont; Alaways, LeRoy W.

    1988-02-01

    We describe the use of high speed video to provide quantitative assessment of motion in athletic performance. Besides the normal requirement for accuracy, an essential feature is that the information be provided rapidly enough so that it may serve as valuable feedback in the learning process. The general considerations which must be addressed in the development of such a computer-based system are discussed. These ideas are illustrated specifically through the description of a prototype system which has been designed for the javelin throw.

  9. Purists need not apply: the case for pragmatism in mixed methods research.

    PubMed

    Florczak, Kristine L

    2014-10-01

    The purpose of this column is to describe several different ways of conducting mixed methods research. The paradigms that underpin both qualitative and quantitative research are also considered, along with a cursory review of classical pragmatism as it relates to conducting mixed methods studies. Finally, the idea of loosely coupled systems as a means to support mixed methods studies is proposed, along with several caveats to researchers who desire to use this new way of obtaining knowledge. © The Author(s) 2014.

  10. Comparison and Historical Evolution of Ancient Greek Cosmological Ideas and Mathematical Models

    NASA Astrophysics Data System (ADS)

    Pinotsis, Antonios D.

    2005-12-01

    We present a comparative study of the cosmological ideas and mathematical models in ancient Greece. We show that the heliocentric system introduced by Aristarchus of Samos was the outcome of much intellectual activity. Many Greek philosophers, mathematicians and astronomers such as Anaximander, Philolaus, Hicetas, Ecphantus and Heraclides of Pontus contributed to this. Also, Ptolemy was influenced by the cosmological model of Heraclides of Pontus for the explanation of the apparent motions of Mercury and Venus. Apollonius, who wrote the definitive work on conic sections, introduced the theory of eccentric circles and implemented them together with epicycles instead of considering that the celestial bodies travel in elliptic orbits. This is due to the deeply rooted belief that the orbits of the celestial bodies were normal circular motions around the Earth, which was still. There was also a variety of important ideas which are relevant to modern science. We present the ideas of Plato that are consistent with modern relativity theories, as well as Aristarchus' estimations of the size of the Universe in comparison with the size of the planetary system. As a first approximation, Hipparchus' theory of eccentric circles was equivalent to the first two laws of Kepler. The significance of the principle of independence and superposition of motions in the formulation of ancient cosmological models is also clarified.

  11. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.

    PubMed

    Pang, Wei; Coghill, George M

    2015-05-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and the output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  12. Non-parametric model selection for subject-specific topological organization of resting-state functional connectivity.

    PubMed

    Ferrarini, Luca; Veer, Ilya M; van Lew, Baldur; Oei, Nicole Y L; van Buchem, Mark A; Reiber, Johan H C; Rombouts, Serge A R B; Milles, J

    2011-06-01

    In recent years, graph theory has been successfully applied to study functional and anatomical connectivity networks in the human brain. Most of these networks have shown small-world topological characteristics: high efficiency in long distance communication between nodes, combined with highly interconnected local clusters of nodes. Moreover, functional studies performed at high resolutions have presented convincing evidence that resting-state functional connectivity networks exhibit (exponentially truncated) scale-free behavior. Such evidence, however, was mostly presented qualitatively, in terms of linear regressions of the degree distributions on log-log plots. Even when quantitative measures were given, these were usually limited to the r² correlation coefficient. However, the r² statistic is not an optimal estimator of explained variance when dealing with (truncated) power-law models. Recent developments in statistics have introduced new non-parametric approaches, based on the Kolmogorov-Smirnov test, for the problem of model selection. In this work, we have built on this idea to statistically tackle the issue of model selection for the degree distribution of functional connectivity at rest. The analysis, performed at voxel level and in a subject-specific fashion, confirmed the superiority of a truncated power-law model, showing high consistency across subjects. Moreover, the most highly connected voxels were found to be consistently part of the default mode network. Our results provide statistically sound support for the evidence previously presented in the literature for a truncated power-law model of resting-state functional connectivity. Copyright © 2010 Elsevier Inc. All rights reserved.
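
    The model-selection step described above, choosing between a pure and an exponentially truncated power law by their Kolmogorov-Smirnov (KS) distance to the empirical degree distribution, can be sketched as follows. This is an illustrative reimplementation, not the authors' code: the "degree" sample is synthetic, and the parameters of both candidate CDFs are assumed to have been fitted elsewhere (e.g. by maximum likelihood).

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_trunc_powerlaw(n, alpha=1.5, cutoff=50.0, xmin=1.0):
    """Rejection-sample p(x) ~ x**-alpha * exp(-x/cutoff) for x >= xmin."""
    out = []
    while len(out) < n:
        x = xmin * rng.random(n) ** (-1.0 / (alpha - 1.0))  # pure power-law draws
        keep = rng.random(n) < np.exp(-x / cutoff)          # exponential truncation
        out.extend(x[keep].tolist())
    return np.array(out[:n])

def ks_distance(sample, model_cdf):
    """KS distance between an empirical sample and a model CDF."""
    x = np.sort(sample)
    ecdf = np.arange(1, len(x) + 1) / len(x)
    return float(np.max(np.abs(ecdf - model_cdf(x))))

degrees = sample_trunc_powerlaw(5000)

def powerlaw_cdf(x, alpha=1.5, xmin=1.0):
    return 1.0 - (x / xmin) ** (1.0 - alpha)

def trunc_powerlaw_cdf(x, alpha=1.5, cutoff=50.0, xmin=1.0):
    # CDF by numerical integration of the unnormalized density on a grid.
    grid = np.linspace(xmin, x.max(), 20000)
    cdf = np.cumsum(grid ** (-alpha) * np.exp(-grid / cutoff))
    return np.interp(x, grid, cdf / cdf[-1])

d_pure = ks_distance(degrees, powerlaw_cdf)
d_trunc = ks_distance(degrees, trunc_powerlaw_cdf)
# The truncated model fits the (truncated) sample markedly better: d_trunc < d_pure.
```

    The smaller KS distance plays the role of the model-selection criterion here; non-parametric approaches of this kind typically also bootstrap the KS statistic to attach a significance level to the choice.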

  13. Consentaneous Agent-Based and Stochastic Model of the Financial Markets

    PubMed Central

    Gontis, Vygintas; Kononovicius, Aleksejus

    2014-01-01

    We are looking for an agent-based treatment of the financial markets, considering the necessity to build bridges between microscopic, agent-based, and macroscopic, phenomenological modeling. The acknowledgment that the agent-based modeling framework, which may provide qualitative and quantitative understanding of the financial markets, is very ambiguous emphasizes the exceptional value of well-defined, analytically tractable agent systems. Herding, one of the behavioral peculiarities considered in behavioral finance, is the main property of the agent interactions we deal with in this contribution. Looking for a consentaneous agent-based and macroscopic approach, we combine two origins of the noise: an exogenous one, related to the information flow, and an endogenous one, arising from the complex stochastic dynamics of the agents. As a result we propose a three-state agent-based herding model of the financial markets. From this agent-based model we derive a set of stochastic differential equations, which describes the underlying macroscopic dynamics of the agent population and the log price in the financial markets. The obtained solution is then subjected to the exogenous noise, which shapes instantaneous return fluctuations. We test both Gaussian and q-Gaussian noise as a source of the short-term fluctuations. The resulting model of the return in the financial markets, with the same set of parameters, reproduces the empirical probability and spectral densities of absolute return observed on the New York, Warsaw and NASDAQ OMX Vilnius Stock Exchanges. Our result confirms the prevalent idea in behavioral finance that herding interactions may be dominant over agent rationality and contribute towards bubble formation. PMID:25029364
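
    The passage from an agent-based herding model to a macroscopic stochastic differential equation can be illustrated with a minimal sketch. The authors' model has three agent states; below is instead the simplest one-dimensional Kirman-type herding SDE, integrated with the Euler-Maruyama scheme, and all parameter names and values (eps1, eps2, h) are chosen purely for illustration.

```python
import numpy as np

def simulate_herding(eps1=0.2, eps2=0.2, h=1.0, dt=1e-3, steps=50_000, seed=1):
    """Euler-Maruyama integration of a Kirman-type herding SDE:
        dx = (eps1*(1 - x) - eps2*x) dt + sqrt(2*h*x*(1 - x)) dW
    where x is the fraction of agents in one of two states."""
    rng = np.random.default_rng(seed)
    x = np.empty(steps)
    x[0] = 0.5
    for t in range(1, steps):
        drift = eps1 * (1.0 - x[t-1]) - eps2 * x[t-1]
        diffusion = np.sqrt(2.0 * h * x[t-1] * (1.0 - x[t-1]))
        x[t] = x[t-1] + drift * dt + diffusion * np.sqrt(dt) * rng.standard_normal()
        x[t] = min(max(x[t], 0.0), 1.0)  # keep the agent fraction in [0, 1]
    return x

path = simulate_herding()
```

    With eps1/h < 1, as here, the stationary distribution of x is bimodal, i.e. the population spends long stretches near all-one-state consensus: a toy version of the regime in which herding dominates over agent rationality.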

  14. Cross-Talk Limits of Highly Segmented Semiconductor Detectors

    NASA Astrophysics Data System (ADS)

    Pullia, Alberto; Weisshaar, Dirk; Zocca, Francesca; Bazzacco, Dino

    2011-06-01

    Cross-talk limits of monolithic highly-segmented semiconductor detectors for high-resolution X-gamma spectrometry are investigated. Cross-talk causes false signal components yielding amplitude losses and fold-dependent shifts of the spectral lines, which partially spoil the spectroscopic performance of the detector. Two complementary electrical models are developed, which describe quantitatively the inter-channel cross-talk of monolithic segmented detectors whose electrodes are read out by charge-sensitive preamplifiers. The first is designated here as the Cross-Capacitance (CC) model, the second as the Split-Charge (SC) model. The CC model is built around the parasitic capacitances Cij linking the preamplifier outputs and the neighbor channel inputs. The SC model is built around the finite value of the decoupling capacitance CC used to read out the high-voltage detector electrode. The key parameters of the models are identified and ideas are shown to minimize their impact. Using a quasi-coaxial germanium segmented detector it is found that the SC cross-talk becomes negligible for decoupling capacitances larger than 1 nF, where instead the CC cross-talk tends to dominate. The residual cross-talk may be reduced by minimization of the stray capacitances Cij, through a careful design of the layout of the Printed Circuit Board (PCB) where the input transistors are mounted. Cij can be made as low as 5 fF, but it is shown that even in this case the impact of the CC cross-talk on the detector performance is not negligible. Finally, an algorithm for cross-talk correction is presented and elaborated.

  15. Effect of low-level laser treatment on cochlea hair-cell recovery after ototoxic hearing loss

    NASA Astrophysics Data System (ADS)

    Rhee, Chung-Ku; He, Peijie; Jung, Jae Yun; Ahn, Jin-Chul; Chung, Phil-Sang; Lee, Min Young; Suh, Myung-Whan

    2013-12-01

    The primary cause of hearing loss includes damage to cochlear hair cells. Low-level laser therapy (LLLT) has become a popular treatment for damaged nervous systems. Based on the idea that cochlear hair cells and neural cells share the same developmental origin, the effect of LLLT on hearing loss in animal models is evaluated. Hearing-loss animal models were established, and the animals were irradiated with an 830-nm diode laser once a day for 10 days. The power density of the laser treatment was 900 mW/cm², and the fluence was 162 to 194 J. The tympanic membrane was evaluated after LLLT. Thresholds of auditory brainstem responses were evaluated before treatment, after gentamicin, and after 10 days of LLLT. Quantitative scanning electron microscopic (SEM) observations were made by counting the remaining hair cells. Tympanic membranes were intact at the end of the experiment. No adverse tissue reaction was found. On SEM images, LLLT significantly increased the number of hair cells in the middle and basal turns. Hearing was significantly improved by laser irradiation. After LLLT treatment, both the hearing threshold and the hair-cell count significantly improved.

  16. A new approach to preparation of standard LEDs for luminous intensity and flux measurement of LEDs

    NASA Astrophysics Data System (ADS)

    Park, Seung-Nam; Park, Seongchong; Lee, Dong-Hoon

    2006-09-01

    This work presents an alternative approach to preparing photometric standard LEDs, based on a novel functional seasoning method. The main idea of our seasoning method is to simultaneously monitor the light output and the junction voltage to obtain quantitative information on the temperature dependence and the aging effect of the LED emission. We suggested a general model describing the seasoning process, taking junction temperature variation and the aging effect into account, and implemented a fully automated seasoning facility capable of seasoning 12 LEDs at the same time. By independent measurements of the temperature dependence, we confirmed the discrepancy of the theoretical model to be less than 0.5 % and evaluated the uncertainty contribution of the functional seasoning to be less than 0.5 % for all the seasoned samples. To demonstrate assigning the reference value to a standard LED, the CIE averaged LED intensity (ALI) of the seasoned LEDs was measured with a spectroradiometer-based instrument and the measurement uncertainty was analyzed. The expanded uncertainty of the standard LED prepared by the new approach amounts to 4 % to 5 % (k=2) depending on color, without correction of spectral stray light in the spectroradiometer.

  17. Cultural evolutionary theory: How culture evolves and why it matters

    PubMed Central

    Creanza, Nicole; Kolodny, Oren; Feldman, Marcus W.

    2017-01-01

    Human cultural traits—behaviors, ideas, and technologies that can be learned from other individuals—can exhibit complex patterns of transmission and evolution, and researchers have developed theoretical models, both verbal and mathematical, to facilitate our understanding of these patterns. Many of the first quantitative models of cultural evolution were modified from existing concepts in theoretical population genetics because cultural evolution has many parallels with, as well as clear differences from, genetic evolution. Furthermore, cultural and genetic evolution can interact with one another and influence both transmission and selection. This interaction requires theoretical treatments of gene–culture coevolution and dual inheritance, in addition to purely cultural evolution. In addition, cultural evolutionary theory is a natural component of studies in demography, human ecology, and many other disciplines. Here, we review the core concepts in cultural evolutionary theory as they pertain to the extension of biology through culture, focusing on cultural evolutionary applications in population genetics, ecology, and demography. For each of these disciplines, we review the theoretical literature and highlight relevant empirical studies. We also discuss the societal implications of the study of cultural evolution and of the interactions of humans with one another and with their environment. PMID:28739941

  18. Evolution of FX Markets via Globalization of Capital

    NASA Astrophysics Data System (ADS)

    McCauley, Joseph L.

    This paper is about money, and why today's foreign exchange (FX) markets are unstable. According to the literature [1], FX markets were fundamentally different before and after WWI. Any attempt to discuss this topic within standard economic theory necessarily fails because money/liquidity/uncertainty is completely excluded from that theory [2]. Fortunately, our market dynamics models adequately serve our purpose. Eichengreen [1] has presented a stimulating history of the evolution of FX markets from the gold standard of the late nineteenth century through the Bretton Woods Agreement (post WWII-1971) and later the floating currencies of our present market deregulation era (1971-present). He asserts a change from stability to instability over the time interval of WWI. Making his argument precise, we describe how speculators could have made money systematically from a market in statistical equilibrium. Present-era normal liquid FX markets are, in contrast, very hard, to a first approximation impossible, to beat, and consequently are described as 'martingales'. The ideas of martingales and options/hedging were irrelevant in the pre-WWI era. I end my historical discussion with the empirical evidence for the stochastic model that describes FX market dynamics quantitatively and accurately over the last 7-17 years [3].

  19. Suppression of epidemic spreading in complex networks by local information based behavioral responses.

    PubMed

    Zhang, Hai-Feng; Xie, Jia-Rong; Tang, Ming; Lai, Ying-Cheng

    2014-12-01

    The interplay between individual behaviors and epidemic dynamics in complex networks is a topic of recent interest. In particular, individuals can obtain different types of information about the disease and respond by altering their behaviors, and this can affect the spreading dynamics, possibly in a significant way. We propose a model where individuals' behavioral response is based on a generic type of local information, i.e., the number of neighbors that have been infected with the disease. Mathematically, the response can be characterized by a reduction in the transmission rate by a factor that depends on the number of infected neighbors. Utilizing the standard susceptible-infected-susceptible and susceptible-infected-recovered dynamical models for epidemic spreading, we derive a theoretical formula for the epidemic threshold and provide numerical verification. Our analysis lays on a solid quantitative footing the intuition that individual behavioral response can in general suppress epidemic spreading. Furthermore, we find that the hub nodes play the role of a "double-edged sword" in that they can either suppress or promote outbreak, depending on their responses to the epidemic, providing additional support for the idea that these nodes are key to controlling epidemic spreading in complex networks.
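
    The behavioral-response mechanism, a transmission rate reduced by a factor that depends on the number of infected neighbors, can be sketched in a discrete-time SIS simulation. The specific reduction rule beta*(1-c)**n_inf, the Erdos-Renyi network, and all parameter values below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def sis_with_response(adj, beta=0.2, gamma=0.1, c=0.3, steps=200, seed=0):
    """Discrete-time SIS on a network: each susceptible node scales its
    per-contact infection probability by (1 - c) for every infected
    neighbor, i.e. beta_eff = beta * (1 - c)**n_inf (response rule)."""
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    infected = rng.random(n) < 0.1            # seed roughly 10% infected
    history = []
    for _ in range(steps):
        n_inf = adj @ infected                 # infected-neighbor counts
        beta_eff = beta * (1.0 - c) ** n_inf   # behavioral reduction
        p_inf = 1.0 - (1.0 - beta_eff) ** n_inf
        new_inf = ~infected & (rng.random(n) < p_inf)
        recovered = infected & (rng.random(n) < gamma)
        infected = (infected | new_inf) & ~recovered
        history.append(infected.mean())
    return np.array(history)

# Erdos-Renyi network with n = 500 nodes and mean degree about 6.
net_rng = np.random.default_rng(42)
n = 500
upper = np.triu(net_rng.random((n, n)) < 6 / n, 1)
adj = (upper | upper.T).astype(int)

prev_response = sis_with_response(adj, c=0.6)   # strong behavioral response
prev_baseline = sis_with_response(adj, c=0.0)   # no response
```

    Comparing the late-phase averages of prev_response and prev_baseline reproduces, at sketch level, the paper's conclusion that behavioral response suppresses epidemic spreading.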

  20. Suppression of epidemic spreading in complex networks by local information based behavioral responses

    NASA Astrophysics Data System (ADS)

    Zhang, Hai-Feng; Xie, Jia-Rong; Tang, Ming; Lai, Ying-Cheng

    2014-12-01

    The interplay between individual behaviors and epidemic dynamics in complex networks is a topic of recent interest. In particular, individuals can obtain different types of information about the disease and respond by altering their behaviors, and this can affect the spreading dynamics, possibly in a significant way. We propose a model where individuals' behavioral response is based on a generic type of local information, i.e., the number of neighbors that have been infected with the disease. Mathematically, the response can be characterized by a reduction in the transmission rate by a factor that depends on the number of infected neighbors. Utilizing the standard susceptible-infected-susceptible and susceptible-infected-recovered dynamical models for epidemic spreading, we derive a theoretical formula for the epidemic threshold and provide numerical verification. Our analysis lays on a solid quantitative footing the intuition that individual behavioral response can in general suppress epidemic spreading. Furthermore, we find that the hub nodes play the role of a "double-edged sword" in that they can either suppress or promote outbreak, depending on their responses to the epidemic, providing additional support for the idea that these nodes are key to controlling epidemic spreading in complex networks.

  1. Asteroid families: Current situation

    NASA Astrophysics Data System (ADS)

    Cellino, A.; Dell'Oro, A.; Tedesco, E. F.

    2009-02-01

    Being the products of energetic collisional events, asteroid families provide a fundamental body of evidence to test the predictions of theoretical and numerical models of catastrophic disruption phenomena. The goal is to obtain, from current physical and dynamical data, reliable inferences on the original disruption events that produced the observed families. The main problem in doing this is recognizing, and quantitatively assessing, the importance of evolutionary phenomena that have progressively changed the observable properties of families, due to physical processes unrelated to the original disruption events. Since the early 1990s, there has been a significant evolution in our interpretation of family properties. New ideas have been conceived, primarily as a consequence of the development of refined models of catastrophic disruption processes, and of the discovery of evolutionary processes that had not been accounted for in previous studies. The latter include primarily the Yarkovsky and Yarkovsky-O'Keefe-Radzievskii-Paddack (YORP) effects - radiation phenomena that can secularly change the semi-major axis and the rotation state. We present a brief review of the current state of the art in our understanding of asteroid families, point out some open problems, and discuss a few likely directions for future developments.

  2. Dynamic molecular structure retrieval from low-energy laser-induced electron diffraction spectra

    NASA Astrophysics Data System (ADS)

    Vu, Dinh-Duy T.; Phan, Ngoc-Loan T.; Hoang, Van-Hung; Le, Van-Hoang

    2017-12-01

    A recently developed quantitative rescattering theory showed that a laser-free elastic cross section can be separated from laser-induced electron diffraction (LIED) spectra. Based upon this idea, Blaga et al investigated the possibility of reconstructing molecular structure from LIED spectra (2012 Nature 483 7388). In that study, an independent atoms model (IAM) was used to interpret high-energy electron-molecule collisions induced by a mid-infrared laser. Our research aims to extend the application range of this structural retrieval method to low-energy spectra induced by more common near-infrared laser sources. The IAM is insufficient in this case, so we switch to a more comprehensive model, the multiple scattering (MS) theory. Starting from the original version, which concerns only neutral targets, we upgrade the model so that it is compatible with electron-ion collisions at low energy. With available LIED experimental data for CO2 and O2, the upgraded MS theory is shown to be greatly effective as a tool for molecular imaging from spectra induced by a near-infrared laser. The captured image is taken at about 2 fs after the ionization, shorter than the 4-6 fs achieved with the mid-infrared laser in Blaga's experiment.

  3. A quasi-QSPR modelling for the photocatalytic decolourization rate constants and cellular viability (CV%) of nanoparticles by CORAL.

    PubMed

    Toropova, A P; Toropov, A A; Benfenati, E

    2015-01-01

    Most quantitative structure-property/activity relationships (QSPRs/QSARs) predict various endpoints related to organic compounds. Gradually, the variety of organic compounds has been extended to inorganic, organometallic compounds and polymers. However, the so-called molecular descriptors cannot be defined for super-complex substances such as different nanomaterials and peptides, since there is no simple and clear representation of their molecular structure. Some possible ways to define approaches for a predictive model in the case of super-complex substances are discussed. The basic idea of the approach is to replace the traditionally used paradigm 'the endpoint is a mathematical function of the molecular structure' with another paradigm, 'the endpoint is a mathematical function of available eclectic information'. The eclectic data can be (i) conditions of a synthesis, (ii) technological attributes, (iii) size of nanoparticles, (iv) concentration, (v) attributes related to cell membranes, and so on. Two examples of quasi-QSPR/QSAR analyses are presented and discussed. These are (i) photocatalytic decolourization rate constants (DRC) (10⁻⁵/s) of different nanopowders; and (ii) the cellular viability under the effect of nano-SiO₂.

  4. Cultural evolutionary theory: How culture evolves and why it matters.

    PubMed

    Creanza, Nicole; Kolodny, Oren; Feldman, Marcus W

    2017-07-24

    Human cultural traits-behaviors, ideas, and technologies that can be learned from other individuals-can exhibit complex patterns of transmission and evolution, and researchers have developed theoretical models, both verbal and mathematical, to facilitate our understanding of these patterns. Many of the first quantitative models of cultural evolution were modified from existing concepts in theoretical population genetics because cultural evolution has many parallels with, as well as clear differences from, genetic evolution. Furthermore, cultural and genetic evolution can interact with one another and influence both transmission and selection. This interaction requires theoretical treatments of gene-culture coevolution and dual inheritance, in addition to purely cultural evolution. In addition, cultural evolutionary theory is a natural component of studies in demography, human ecology, and many other disciplines. Here, we review the core concepts in cultural evolutionary theory as they pertain to the extension of biology through culture, focusing on cultural evolutionary applications in population genetics, ecology, and demography. For each of these disciplines, we review the theoretical literature and highlight relevant empirical studies. We also discuss the societal implications of the study of cultural evolution and of the interactions of humans with one another and with their environment.

  5. A Review of Ideas Concerning Life Origin

    NASA Astrophysics Data System (ADS)

    Gindilis, L. M.

    2014-10-01

    Since the times of Antiquity the and for a long time the idea of self-origination of life was the dominant one. It reappeared again after microorganisms were discovered (XVII century). The possibility of abiogenesis at microbial level was discussed for more than a century. Pateur demonstrated that spontaneous origination of microorganisms in sterile broth was due to those same microorganisms transported by dust particles. Thus proving that every form of life originates from the parental life form. So the question arises: how did the first microorganisms appear on the Earth. There are three possible versions: 1) accidental origination of a viable form; 2) primal organisms were transported to the Earth from outer space; 3) they were formed on the Earth in the process of prebiotic chemical evolution. We discuss the problems of prebiotic evolution from simple monomers up to living cells. An important item of nowadays conceptions of life origination is the hypothesis of the ancient world of RNA as possible precursor of life on Earth. The discovery in carbonaceous chondrites of traces of bacterial life evidences the existence of life in the Solar System even before the formation of the Earth. The idea of life as brought to the Earth out of Cosmos originated under the impression of self-origination hypothesis downfall. It went through several stages (Helmholtz, W. Thompson, XIX century; Arrhenius, early XX century; Hoyle and Wickramasinghe, second half of XX century) and presently evokes constantly growing interest. The panspermia theory does not solve the problem of origination of life, only moves it onto other planets. According to V.A. Mazur, the probability of accidental formation of RNA molecule is negligible not only on the Earth, but in the whole Universe over all the time span of its existence. But it is practically equal to unit in the domain formed at the inflation stage of the evolution of the Universe. 
A.D. Panov considered panspermia in the Galaxy at the level of prebiotic evolution products. The quantitative model he put forward increases the probability of the origination of life by many orders of magnitude in comparison with any isolated planet. In this model, life originates simultaneously on all planets with suitable conditions, on the same molecular basis, with one and the same genetic code and the same chirality.

  6. Explaining sex differences in lifespan in terms of optimal energy allocation in the baboon.

    PubMed

    King, Annette M; Kirkwood, Thomas B L; Shanley, Daryl P

    2017-10-01

    We provide a quantitative test of the hypothesis that sex role specialization may account for sex differences in lifespan in baboons if such specialization causes the dependency of fitness upon longevity, and consequently the optimal resolution to an energetic trade-off between somatic maintenance and other physiological functions, to differ between males and females. We present a model in which females provide all offspring care and males compete for access to reproductive females and in which the partitioning of available energy between the competing fitness-enhancing functions of growth, maintenance, and reproduction is modeled as a dynamic behavioral game, with the optimal decision for each individual depending upon his/her state and the behavior of other members of the population. Our model replicates the sexual dimorphism in body size and sex differences in longevity and reproductive scheduling seen in natural populations of baboons. We show that this outcome is generally robust to perturbations in model parameters, an important finding given that the same behavior is seen across multiple populations and species in the wild. This supports the idea that sex differences in longevity result from differences in the value of somatic maintenance relative to other fitness-enhancing functions in keeping with the disposable soma theory. © 2017 The Author(s). Evolution © 2017 The Society for the Study of Evolution.

  7. Learning about static electricity and magnetism in a fourth-grade classroom

    NASA Astrophysics Data System (ADS)

    Henry, David Roy

    Students begin to develop mental models to explain electrostatic and magnetic phenomena throughout early childhood, middle childhood, and high school, although these mental models are often incoherent and unscientific (Borges, Tenico, & Gilbert, 1998; Maloney, 1985). This is a case study of a classroom of fourth-grade students and the mental models of magnetism and static electricity they used during a six-week science unit. The 22 students studied magnetism and static electricity using inquiry activities structured to create an environment where students would be likely to construct powerful scientific ideas (Goldberg & Bendall, 1995). Multiple data sources, including students' writing, student assessments, teacher interviews, student interviews, teacher journals, and classroom video and audio recordings, were used to uncover how the students made sense of static electricity and magnetism before, during, and after instruction. The data were analyzed using a social constructivist framework to determine whether students were able to develop target scientific ideas about static electricity and magnetism. In general, students were found to hold three core mental models prior to instruction: (1) static electricity and magnetism are the same "substance"; (2) this substance exists on the surface of a magnet or a charged object and can be rubbed off; and (3) opposite substances attract. During the activities, students had many opportunities to observe evidence that contradicted these core mental models. Using evidence from direct observations, the students practiced differentiating between evidence and ideas. Through group and class discussions, they developed evidence-based (scientific) ideas.
Final assessments revealed that students were able to construct target ideas such as: (1) static electricity and magnetism are fundamentally different; (2) there are two kinds of static "charge;" (3) magnet-rubbed wires act like a magnet; and (4) opposite substances move toward each other, like substances push away from each other. Some target ideas, such as "Magnetic materials are made up of magnetic domains that align to give an overall magnetic effect" were found to be difficult for students this age to develop. This case study will augment research about effective science teaching, teacher development and the support necessary for curriculum change.

  8. Direct Assessment of the Effect of the Gly380Arg Achondroplasia Mutation on FGFR3 Dimerization Using Quantitative Imaging FRET

    PubMed Central

    Placone, Jesse; Hristova, Kalina

    2012-01-01

    The Gly380Arg mutation in FGFR3 is the genetic cause for achondroplasia (ACH), the most common form of human dwarfism. The mutation has been proposed to increase FGFR3 dimerization, but the dimerization propensities of wild-type and mutant FGFR3 have not been compared. Here we use quantitative imaging FRET to characterize the dimerization of wild-type FGFR3 and the ACH mutant in plasma membrane-derived vesicles from HEK293T cells. We demonstrate a small, but statistically significant increase in FGFR3 dimerization due to the ACH mutation. The data are consistent with the idea that the ACH mutation causes a structural change which affects both the stability and the activity of FGFR3 dimers in the absence of ligand. PMID:23056398

  9. Quantitative analysis and comparative study of four cities green pattern in API system on the background of big data

    NASA Astrophysics Data System (ADS)

    Xin, YANG; Si-qi, WU; Qi, ZHANG

    2018-05-01

    Beijing, London, Paris, and New York are representative world cities, so a comparative study of their green patterns is important for identifying gaps and advantages and for mutual learning. The paper provides a basis and new ideas for the development of metropolises in China. Against the background of big data, API (Application Programming Interface) systems can supply extensive and accurate basic data for studying urban green patterns in different geographical environments at home and abroad. On this basis, the Average Nearest Neighbor, Kernel Density, and Standard Deviational Ellipse tools on the ArcGIS platform can process and summarize the data, enabling quantitative analysis of green patterns. The paper summarizes the distinctive features of the four cities' green patterns, and the reasons for their formation, on the basis of numerical comparison.
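    The Average Nearest Neighbor statistic used above can be sketched in a few lines. A minimal illustration (the point coordinates and study area below are hypothetical, not the paper's data): the Clark-Evans ratio divides the observed mean nearest-neighbor distance by the value expected under complete spatial randomness, so R < 1 suggests clustering, R ≈ 1 randomness, and R > 1 dispersion.

```python
import math

def nearest_neighbor_ratio(points, area):
    """Clark-Evans average nearest neighbor ratio R.

    R < 1: clustered, R ~ 1: random, R > 1: dispersed.
    """
    n = len(points)
    # mean distance from each point to its closest other point
    observed = sum(
        min(math.dist(p, q) for q in points if q is not p)
        for p in points
    ) / n
    # expected mean nearest-neighbor distance under complete
    # spatial randomness for n points in the given area
    expected = 0.5 / math.sqrt(n / area)
    return observed / expected

# four points at the corners of a unit square: strongly dispersed
print(nearest_neighbor_ratio([(0, 0), (0, 1), (1, 0), (1, 1)], area=1.0))  # 4.0
```

    ArcGIS's Average Nearest Neighbor tool additionally reports a z-score and p-value for this ratio; the sketch keeps only the ratio itself.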

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duane, Greg; Tsonis, Anastasios; Kocarev, Ljupco

    This collaborative research has several components, but the main idea is that when imperfect copies of a given nonlinear dynamical system are coupled, they may synchronize for some set of coupling parameters. This idea is to be tested for several IPCC-like models, each with its own formulation and each representing an "imperfect" copy of the true climate system. By computing the coupling parameters that lead the models to a synchronized state, a consensus on climate-change simulations may be achieved.
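    The core idea can be illustrated with a toy version of the setup: two copies of the Lorenz-63 system with a deliberate parameter error, standing in for two "imperfect" models of the same truth, are diffusively coupled through one variable. All numbers below are illustrative assumptions, not values from the actual climate-model experiments.

```python
def lorenz(s, r):
    """Lorenz-63 right-hand side with sigma = 10, b = 8/3."""
    x, y, z = s
    return [10.0 * (y - x), x * (r - z) - y, x * y - (8.0 / 3.0) * z]

def mean_x_error(k, steps=100_000, dt=0.002):
    """Mean |x_a - x_b| (after transients) for two mismatched
    Lorenz models coupled diffusively through x with strength k."""
    a, b = [1.0, 2.0, 20.0], [-5.0, -6.0, 30.0]
    total = 0.0
    for i in range(steps):
        fa, fb = lorenz(a, 28.0), lorenz(b, 28.1)  # mismatched r: imperfect copy
        fa[0] += k * (b[0] - a[0])  # nudge each model toward the other
        fb[0] += k * (a[0] - b[0])
        a = [a[j] + dt * fa[j] for j in range(3)]
        b = [b[j] + dt * fb[j] for j in range(3)]
        if i >= steps // 2:
            total += abs(a[0] - b[0])
    return total / (steps - steps // 2)

uncoupled, coupled = mean_x_error(0.0), mean_x_error(20.0)
print(uncoupled > 5 * coupled)  # coupling shrinks the error dramatically
```

    Uncoupled, the two chaotic trajectories wander independently; with sufficient coupling their separation collapses to a small residual set by the parameter mismatch, which is the signature of synchronization between imperfect models.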

  11. Writing to Learn by Learning to Write during the School Science Laboratory: Helping Middle and High School Students Develop Argumentative Writing Skills as They Learn Core Ideas

    ERIC Educational Resources Information Center

    Sampson, Victor; Enderle, Patrick; Grooms, Jonathon; Witte, Shelbie

    2013-01-01

    This study examined how students' science-specific argumentative writing skills and understanding of core ideas changed over the course of a school year as they participated in a series of science laboratories designed using the Argument-Driven Inquiry (ADI) instructional model. The ADI model is a student-centered and writing-intensive approach to…

  12. Reinterpretation of Students' Ideas When Reasoning about Particle Model Illustrations

    ERIC Educational Resources Information Center

    Langbeheim, Elon

    2015-01-01

    The article, "Using Animations in Identifying General Chemistry Students' Misconceptions and Evaluating Their Knowledge Transfer Relating to Particle Position in Physical Changes" (Smith and Villarreal, 2015), reports that a substantial proportion of undergraduate students expressed misconceived ideas regarding the motion of particles in…

  13. Simulating social-ecological systems: the Island Digital Ecosystem Avatars (IDEA) consortium.

    PubMed

    Davies, Neil; Field, Dawn; Gavaghan, David; Holbrook, Sally J; Planes, Serge; Troyer, Matthias; Bonsall, Michael; Claudet, Joachim; Roderick, George; Schmitt, Russell J; Zettler, Linda Amaral; Berteaux, Véronique; Bossin, Hervé C; Cabasse, Charlotte; Collin, Antoine; Deck, John; Dell, Tony; Dunne, Jennifer; Gates, Ruth; Harfoot, Mike; Hench, James L; Hopuare, Marania; Kirch, Patrick; Kotoulas, Georgios; Kosenkov, Alex; Kusenko, Alex; Leichter, James J; Lenihan, Hunter; Magoulas, Antonios; Martinez, Neo; Meyer, Chris; Stoll, Benoit; Swalla, Billie; Tartakovsky, Daniel M; Murphy, Hinano Teavai; Turyshev, Slava; Valdvinos, Fernanda; Williams, Rich; Wood, Spencer

    2016-01-01

    Systems biology promises to revolutionize medicine, yet human wellbeing is also inherently linked to healthy societies and environments (sustainability). The IDEA Consortium is a systems ecology open science initiative to conduct the basic scientific research needed to build use-oriented simulations (avatars) of entire social-ecological systems. Islands are the most scientifically tractable places for these studies and we begin with one of the best known: Moorea, French Polynesia. The Moorea IDEA will be a sustainability simulator modeling links and feedbacks between climate, environment, biodiversity, and human activities across a coupled marine-terrestrial landscape. As a model system, the resulting knowledge and tools will improve our ability to predict human and natural change on Moorea and elsewhere at scales relevant to management/conservation actions.

  14. Nominal group technique: a brainstorming tool for identifying areas to improve pain management in hospitalized patients.

    PubMed

    Peña, Adolfo; Estrada, Carlos A; Soniat, Debbie; Taylor, Benjamin; Burton, Michael

    2012-01-01

    Pain management in hospitalized patients remains a priority area for improvement; effective strategies for consensus development are needed to prioritize interventions. Objective: to identify challenges, barriers, and perspectives of healthcare providers in managing pain among hospitalized patients. Design: qualitative and quantitative group consensus using a brainstorming technique for quality improvement, the nominal group technique (NGT). Setting: one medical, one medical-surgical, and one surgical hospital unit at a large academic medical center. Participants: nurses, resident physicians, patient care technicians, and unit clerks. Measurements: responses and ranking to the NGT question: "What causes uncontrolled pain in your unit?" Twenty-seven health workers generated a total of 94 ideas. The ideas perceived as contributing to suboptimal pain control were grouped as system factors (timeliness, n = 18 ideas; communication, n = 11; pain assessment, n = 8), human factors (knowledge and experience, n = 16; provider bias, n = 8; patient factors, n = 19), and the interface of system and human factors (standardization, n = 14). Knowledge, timeliness, provider bias, and patient factors were the top-ranked themes. Knowledge and timeliness are considered the main priorities for improving pain control. NGT is an efficient tool for identifying general and context-specific priority areas for quality improvement; teams of healthcare providers should consider using NGT to address their own challenges and barriers. Copyright © 2011 Society of Hospital Medicine.

  15. Impedance biosensor based on interdigitated electrode array for detection of E.coli O157:H7 in food products

    NASA Astrophysics Data System (ADS)

    Ghosh Dastider, Shibajyoti; Barizuddin, Syed; Dweik, Majed; Almasri, Mahmoud F.

    2012-05-01

    An impedance biosensor was designed, fabricated, and tested for detection of viable Escherichia coli O157:H7 in food samples. The device consists of an interdigitated microelectrode array (IDEA), fabricated from a thin layer of sputtered gold, embedded under a polydimethylsiloxane (PDMS) microchannel. The array of electrodes is designed to detect viable E. coli in different food products. The active surface of the detection array was modified with goat anti-E. coli polyclonal IgG antibody. Contaminated food samples were tested by infusing the supernatant containing bacteria over the IDEAs through the microchannel. Antibody-antigen binding on the electrodes results in an impedance change. Four serial concentrations of E. coli-contaminated food samples (3x10^2 CFU mL^-1 to 3x10^5 CFU mL^-1) were tested. The biosensor successfully detected the E. coli samples, with a lower detection limit of 3x10^3 CFU mL^-1 (about 3 cells/μL). Comparison of the test results with an IDEA impedance biosensor without a microchannel (published elsewhere) indicates that this biosensor has two orders of magnitude higher sensitivity. The proposed biosensor provides qualitative and quantitative detection, and could potentially be used to detect other types of bacteria by immobilizing the specific antibody.

  16. Potential-of-mean-force description of ionic interactions and structural hydration in biomolecular systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hummer, G.; Garcia, A.E.; Soumpasis, D.M.

    1994-10-01

    To understand the functioning of living organisms on a molecular level, it is crucial to dissect the intricate interplay of the immense number of biological molecules. Most of the biochemical processes in cells occur in a liquid environment formed mainly by water and ions. This solvent environment plays an important role in biological systems. The potential-of-mean-force (PMF) formalism attempts to describe quantitatively the interactions of the solvent with biological macromolecules on the basis of an approximate statistical-mechanical representation. At its current stage of development, it deals with ionic effects on biomolecular structure and with the structural hydration of biomolecules. The underlying idea of the PMF formalism is to identify the dominant sources of interactions and incorporate them into the theoretical formalism using PMFs (or particle correlation functions) extracted from bulk-liquid systems. In the following, the authors briefly outline the statistical-mechanical foundation of the PMF formalism and introduce the PMF expansion formalism, which is intimately linked to superposition approximations for higher-order particle correlation functions. The authors then sketch applications describing the effects of the ionic environment on nucleic-acid structure. Finally, the authors present the more recent extension of the PMF idea to the quantitative description of the structural hydration of biomolecules. Results for the ice-water interface and for the hydration of deoxyribonucleic acid (DNA) are discussed.

  17. Quantitative Confocal Microscopy Analysis as a Basis for Search and Study of Potassium Kv1.x Channel Blockers

    NASA Astrophysics Data System (ADS)

    Feofanov, Alexey V.; Kudryashova, Kseniya S.; Nekrasova, Oksana V.; Vassilevski, Alexander A.; Kuzmenkov, Alexey I.; Korolkova, Yuliya V.; Grishin, Eugene V.; Kirpichnikov, Mikhail P.

    Artificial KcsA-Kv1.x (x = 1, 3) receptors were recently designed by transferring the ligand-binding site from human Kv1.x voltage-gated potassium channels into the corresponding domain of the bacterial KcsA channel. We found that KcsA-Kv1.x receptors expressed in E. coli cells are embedded in the cell membrane and bind ligands when the cells are converted to spheroplasts. We proposed that E. coli spheroplasts with membrane-embedded KcsA-Kv1.x and the fluorescently labeled ligand agitoxin-2 (R-AgTx2) can be used as elements of an advanced analytical system for the search for and study of Kv1-channel blockers. To realize this idea, special procedures were developed for the measurement and quantitative treatment of fluorescence signals obtained from spheroplast membranes using confocal laser scanning microscopy (CLSM). The resulting "mix and read" analytical systems, supported by quantitative CLSM analysis, were demonstrated to be a reliable alternative to radioligand and electrophysiology techniques in the search for and study of selective Kv1.x channel blockers of high scientific and medical importance.

  18. Context, Learning, and Extinction

    ERIC Educational Resources Information Center

    Gershman, Samuel J.; Blei, David M.; Niv, Yael

    2010-01-01

    A. Redish et al. (2007) proposed a reinforcement learning model of context-dependent learning and extinction in conditioning experiments, using the idea of "state classification" to categorize new observations into states. In the current article, the authors propose an interpretation of this idea in terms of normative statistical inference. They…

  19. A Handbook of Bright Ideas: Facilitating Giftedness.

    ERIC Educational Resources Information Center

    Cherry, Betty S., Ed.

    Presented is a manual developed by the Manatee, Florida, program for gifted students which includes articles by leading thinkers, information on J. Guilford's structure of the intellect model, the importance of cognitive and affective balance, creative development, checklists, games, and other ideas for teachers of gifted students. Articles…

  20. A Quantitative Human Spacecraft Design Evaluation Model for Assessing Crew Accommodation and Utilization

    NASA Astrophysics Data System (ADS)

    Fanchiang, Christine

    Crew performance, including both accommodation and utilization factors, is an integral part of every human spaceflight mission, from commercial space tourism to the demanding journey to Mars and beyond. Spacecraft were historically built by engineers and technologists trying to adapt the vehicle to cutting-edge rocketry, on the assumption that the astronauts could be trained and would adapt to the design. By and large, that is still the current state of the art. It is recognized, however, that poor human-machine design integration can lead to catastrophic and deadly mishaps. The premise of this work relies on the idea that if an accurate predictive model exists to forecast crew performance issues as a result of spacecraft design and operations, it can help designers and managers make better decisions throughout the design process, and ensure that the crewmembers are well-integrated with the system from the very start. The result should be a high-quality, user-friendly spacecraft that optimizes the utilization of the crew while keeping them alive, healthy, and happy during the course of the mission. Therefore, the goal of this work was to develop an integrative framework to quantitatively evaluate a spacecraft design from the crew performance perspective. The approach presented here is done at a very fundamental level, starting with identifying and defining basic terminology, and then builds up important axioms of human spaceflight that lay the foundation for how such a framework can be developed. With the framework established, a methodology for characterizing the outcome using a mathematical model was developed by pulling from existing metrics and data collected on human performance in space. Representative test scenarios were run to show what information could be garnered and how it could be applied as a useful, understandable metric for future spacecraft design. 
While the model is the primary tangible product from this research, the more interesting outcome of this work is the structure of the framework and what it tells future researchers in terms of where the gaps and limitations exist for developing a better framework. It also identifies metrics that can now be collected as part of future validation efforts for the model.

  1. Does leader-affective presence influence communication of creative ideas within work teams?

    PubMed

    Madrid, Hector P; Totterdell, Peter; Niven, Karen

    2016-09-01

    Affective presence is a novel, emotion-related personality trait, supported in experimental studies, concerning the extent to which a person makes his or her interaction partners feel the same way (Eisenkraft & Elfenbein, 2010). Applying this concept to an applied teamwork context, we proposed that team-leader-affective presence would influence team members' communication of creative ideas. Multilevel modeling analysis of data from a survey study conducted with teams from a consultancy firm confirmed that team-leader-affective presence interacted with team-member creative idea generation to predict inhibition of voicing their ideas. Specifically, withholding of ideas was less likely when team members generated creative ideas and their team leader had higher positive affective presence or lower negative affective presence. These findings contribute to emotion research by showing affective presence as a trait with interpersonal meaning, which can shape how cognition is translated into social behavior in applied performance contexts, such as teamwork in organizations. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  2. Baryon Spectroscopy and the Constituent Quark Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    A.W. Thomas; R.D. Young

    2005-07-26

    We explore further the idea that the lattice QCD data for hadron properties in the region m_pi^2 > 0.2 GeV^2 can be described by the constituent quark model. This leads to a natural explanation of the fact that nucleon excited states are generally stable for pion masses greater than their physical excitation energies. Finally, we apply these same ideas to the problem of how pentaquarks might behave in lattice QCD, with interesting conclusions.

  3. Economic demand predicts addiction-like behavior and therapeutic efficacy of oxytocin in the rat.

    PubMed

    Bentzley, Brandon S; Jhou, Thomas C; Aston-Jones, Gary

    2014-08-12

    Development of new treatments for drug addiction will depend on high-throughput screening in animal models. However, an addiction biomarker fit for rapid testing, and useful in both humans and animals, is not currently available. Economic models are promising candidates. They offer a structured quantitative approach to modeling behavior that is mathematically identical across species, and accruing evidence indicates economic-based descriptors of human behavior may be particularly useful biomarkers of addiction severity. However, economic demand has not yet been established as a biomarker of addiction-like behavior in animals, an essential final step in linking animal and human studies of addiction through economic models. We recently developed a mathematical approach for rapidly modeling economic demand in rats trained to self-administer cocaine. We show here that economic demand, as both a spontaneous trait and induced state, predicts addiction-like behavior, including relapse propensity, drug seeking in abstinence, and compulsive (punished) drug taking. These findings confirm economic demand as a biomarker of addiction-like behavior in rats. They also support the view that excessive motivation plays an important role in addiction while extending the idea that drug dependence represents a shift from initially recreational to compulsive drug use. Finally, we found that economic demand for cocaine predicted the efficacy of a promising pharmacotherapy (oxytocin) in attenuating cocaine-seeking behaviors across individuals, demonstrating that economic measures may be used to rapidly identify the clinical utility of prospective addiction treatments.
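    The abstract does not spell out the demand equation itself; a common formulation in this literature is the Hursh-Silberberg exponential demand function, log10 Q = log10 Q0 + k(e^(-alpha*Q0*C) - 1), where Q0 is consumption at zero cost and alpha measures how fast consumption falls as price rises (higher alpha = more elastic demand = lower motivation). The sketch below is illustrative only, with synthetic data and assumed parameter values, recovering alpha by a simple grid search:

```python
import math

def log_demand(cost, q0, alpha, k=2.0):
    """Hursh-Silberberg exponential demand: predicted log10 consumption."""
    return math.log10(q0) + k * (math.exp(-alpha * q0 * cost) - 1.0)

# synthetic self-administration data (assumed values, not the study's)
true_q0, true_alpha = 100.0, 0.003
costs = [1, 3, 10, 30, 100]
obs = [log_demand(c, true_q0, true_alpha) for c in costs]

# recover alpha by least-squares grid search over candidate values
best_alpha = min(
    (i * 1e-4 for i in range(1, 200)),
    key=lambda a: sum((log_demand(c, true_q0, a) - y) ** 2
                      for c, y in zip(costs, obs)),
)
print(round(best_alpha, 4))  # 0.003
```

    Fitting such a curve per animal yields a single scalar per subject, which is what makes it attractive as a rapid, cross-species biomarker.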

  4. Oscillatory Critical Amplitudes in Hierarchical Models and the Harris Function of Branching Processes

    NASA Astrophysics Data System (ADS)

    Costin, Ovidiu; Giacomin, Giambattista

    2013-02-01

    Oscillatory critical amplitudes have been repeatedly observed in hierarchical models and, in the cases that have been taken into consideration, these oscillations are so small as to be hardly detectable. Hierarchical models are tightly related to iteration of maps and, in fact, very similar phenomena have been repeatedly reported in many fields of mathematics, like combinatorial evaluations and discrete branching processes. It is precisely in the context of branching processes with bounded offspring that T. Harris, in 1948, first set forth the possibility that the logarithm of the moment generating function of the rescaled population size, in the super-critical regime, does not grow near infinity as a power, but has an oscillatory prefactor (the Harris function). These oscillations were observed numerically only much later and, while their origin is clearly tied to the discrete character of the iteration, the amplitude size is not so well understood. The purpose of this note is to reconsider the issue for hierarchical models in what is arguably the most elementary setting—the pinning model—which actually just boils down to iteration of polynomial maps (and, notably, quadratic maps). In this note we show that the oscillatory critical amplitude for pinning models and the Harris function coincide. Moreover, we make explicit the link between these oscillatory functions and the geometry of the Julia set of the map, making rigorous and quantitative some ideas set forth in Derrida et al. (Commun. Math. Phys. 94:115-132, 1984).

  5. A Model of Human Cooperation in Social Dilemmas

    PubMed Central

    Capraro, Valerio

    2013-01-01

    Social dilemmas are situations in which collective interests are at odds with private interests: pollution, depletion of natural resources, and intergroup conflicts are at their core social dilemmas. Because of their multidisciplinarity and their importance, social dilemmas have been studied by economists, biologists, psychologists, sociologists, and political scientists. These studies typically explain the tendency to cooperate by dividing people into proself and prosocial types, by appealing to forms of external control, or, in iterated social dilemmas, to long-term strategies. But recent experiments have shown that cooperation is possible even in one-shot social dilemmas without forms of external control, and that the rate of cooperation typically depends on the payoffs. This makes a predictive division between proself and prosocial people impossible and suggests that people are naturally predisposed to cooperate. The key innovation of this article is in fact to postulate that humans are naturally predisposed to cooperate and consequently do not act a priori as single agents, as assumed by standard economic models, but forecast how a social dilemma would evolve if they formed coalitions, and then act according to their most optimistic forecast. Formalizing this idea, we propose the first predictive model of human cooperation able to organize a number of different experimental findings that are not explained by the standard model. We also show that the model makes satisfactorily accurate quantitative predictions of population-average behavior in one-shot social dilemmas. PMID:24009679

  6. Economic demand predicts addiction-like behavior and therapeutic efficacy of oxytocin in the rat

    PubMed Central

    Bentzley, Brandon S.; Jhou, Thomas C.; Aston-Jones, Gary

    2014-01-01

    Development of new treatments for drug addiction will depend on high-throughput screening in animal models. However, an addiction biomarker fit for rapid testing, and useful in both humans and animals, is not currently available. Economic models are promising candidates. They offer a structured quantitative approach to modeling behavior that is mathematically identical across species, and accruing evidence indicates economic-based descriptors of human behavior may be particularly useful biomarkers of addiction severity. However, economic demand has not yet been established as a biomarker of addiction-like behavior in animals, an essential final step in linking animal and human studies of addiction through economic models. We recently developed a mathematical approach for rapidly modeling economic demand in rats trained to self-administer cocaine. We show here that economic demand, as both a spontaneous trait and induced state, predicts addiction-like behavior, including relapse propensity, drug seeking in abstinence, and compulsive (punished) drug taking. These findings confirm economic demand as a biomarker of addiction-like behavior in rats. They also support the view that excessive motivation plays an important role in addiction while extending the idea that drug dependence represents a shift from initially recreational to compulsive drug use. Finally, we found that economic demand for cocaine predicted the efficacy of a promising pharmacotherapy (oxytocin) in attenuating cocaine-seeking behaviors across individuals, demonstrating that economic measures may be used to rapidly identify the clinical utility of prospective addiction treatments. PMID:25071176

  7. An IDEA for Short Term Outbreak Projection: Nearcasting Using the Basic Reproduction Number

    PubMed Central

    Fisman, David N.; Hauck, Tanya S.; Tuite, Ashleigh R.; Greer, Amy L.

    2013-01-01

    Background: Communicable disease outbreaks of novel or existing pathogens threaten human health around the globe. It would be desirable to rapidly characterize such outbreaks and develop accurate projections of their duration and cumulative size even when limited preliminary data are available. Here we develop a mathematical model to aid public health authorities in tracking the expansion and contraction of outbreaks with explicit representation of factors (other than population immunity) that may slow epidemic growth. Methodology: The Incidence Decay and Exponential Adjustment (IDEA) model is a parsimonious function that uses the basic reproduction number R0, along with a discounting factor, to project the growth of outbreaks using only basic epidemiological information (e.g., daily incidence counts). Principal Findings: Compared to simulated data, IDEA provides highly accurate estimates of total size and duration for a given outbreak when R0 is low or moderate, and also identifies turning points or new waves. When tested with an outbreak of pandemic influenza A (H1N1), the model generates estimated incidence at the i+1th serial interval using data from the ith serial interval within an average of 20% of actual incidence. Conclusions and Significance: This model for communicable disease outbreaks provides rapid assessments of outbreak growth and public health interventions. Further evaluation in the context of real-world outbreaks will establish the utility of IDEA as a tool for front-line epidemiologists. PMID:24391797

  8. An IDEA for short term outbreak projection: nearcasting using the basic reproduction number.

    PubMed

    Fisman, David N; Hauck, Tanya S; Tuite, Ashleigh R; Greer, Amy L

    2013-01-01

    Communicable disease outbreaks of novel or existing pathogens threaten human health around the globe. It would be desirable to rapidly characterize such outbreaks and develop accurate projections of their duration and cumulative size even when limited preliminary data are available. Here we develop a mathematical model to aid public health authorities in tracking the expansion and contraction of outbreaks with explicit representation of factors (other than population immunity) that may slow epidemic growth. The Incidence Decay and Exponential Adjustment (IDEA) model is a parsimonious function that uses the basic reproduction number R0, along with a discounting factor to project the growth of outbreaks using only basic epidemiological information (e.g., daily incidence counts). Compared to simulated data, IDEA provides highly accurate estimates of total size and duration for a given outbreak when R0 is low or moderate, and also identifies turning points or new waves. When tested with an outbreak of pandemic influenza A (H1N1), the model generates estimated incidence at the i+1(th) serial interval using data from the i(th) serial interval within an average of 20% of actual incidence. This model for communicable disease outbreaks provides rapid assessments of outbreak growth and public health interventions. Further evaluation in the context of real-world outbreaks will establish the utility of IDEA as a tool for front-line epidemiologists.
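    The IDEA model itself is a single expression: incident cases at the t-th serial interval are I(t) = (R0 / (1 + d)^t)^t, where d is the discounting factor. A brief sketch with illustrative parameter values (not fitted to any real outbreak):

```python
def idea_incidence(r0, d, t):
    """IDEA model incidence at serial interval t:
    I(t) = (R0 / (1 + d)**t) ** t."""
    return (r0 / (1.0 + d) ** t) ** t

# with no discounting (d = 0) growth is plain exponential in R0
print(idea_incidence(2.0, 0.0, 3))  # 8.0

# any d > 0 eventually overwhelms R0, so incidence turns over;
# for R0 = 2 and d = 0.05 the peak falls at the 7th serial interval
peak = max(range(1, 60), key=lambda t: idea_incidence(2.0, 0.05, t))
print(peak)  # 7
```

    Because only R0 and d need to be estimated, the model can be refit after each new serial interval of incidence data, which is what makes "nearcasting" feasible with limited preliminary data.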

  9. Schizophrenia and the neurodevelopmental continuum: evidence from genomics

    PubMed Central

    Owen, Michael J.; O'Donovan, Michael C.

    2017-01-01

    The idea that disturbances occurring early in brain development contribute to the pathogenesis of schizophrenia, often referred to as the neurodevelopmental hypothesis, has become widely accepted. Despite this, the disorder is viewed as being distinct nosologically, and by implication pathophysiologically and clinically, from syndromes such as autism spectrum disorders, attention‐deficit/hyperactivity disorder (ADHD) and intellectual disability, which typically present in childhood and are grouped together as “neurodevelopmental disorders”. An alternative view is that neurodevelopmental disorders, including schizophrenia, rather than being etiologically discrete entities, are better conceptualized as lying on an etiological and neurodevelopmental continuum, with the major clinical syndromes reflecting the severity, timing and predominant pattern of abnormal brain development and resulting functional abnormalities. It has also been suggested that, within the neurodevelopmental continuum, severe mental illnesses occupy a gradient of decreasing neurodevelopmental impairment as follows: intellectual disability, autism spectrum disorders, ADHD, schizophrenia and bipolar disorder. Recent genomic studies have identified large numbers of specific risk DNA changes and offer a direct and robust test of the predictions of the neurodevelopmental continuum model and gradient hypothesis. These findings are reviewed in detail. They not only support the view that schizophrenia is a disorder whose origins lie in disturbances of brain development, but also that it shares genetic risk and pathogenic mechanisms with the early onset neurodevelopmental disorders (intellectual disability, autism spectrum disorders and ADHD). They also support the idea that these disorders lie on a gradient of severity, implying that they differ to some extent quantitatively as well as qualitatively. These findings have important implications for nosology, clinical practice and research. PMID:28941101

  10. Schizophrenia and the neurodevelopmental continuum: evidence from genomics.

    PubMed

    Owen, Michael J; O'Donovan, Michael C

    2017-10-01

    The idea that disturbances occurring early in brain development contribute to the pathogenesis of schizophrenia, often referred to as the neurodevelopmental hypothesis, has become widely accepted. Despite this, the disorder is viewed as being distinct nosologically, and by implication pathophysiologically and clinically, from syndromes such as autism spectrum disorders, attention-deficit/hyperactivity disorder (ADHD) and intellectual disability, which typically present in childhood and are grouped together as "neurodevelopmental disorders". An alternative view is that neurodevelopmental disorders, including schizophrenia, rather than being etiologically discrete entities, are better conceptualized as lying on an etiological and neurodevelopmental continuum, with the major clinical syndromes reflecting the severity, timing and predominant pattern of abnormal brain development and resulting functional abnormalities. It has also been suggested that, within the neurodevelopmental continuum, severe mental illnesses occupy a gradient of decreasing neurodevelopmental impairment as follows: intellectual disability, autism spectrum disorders, ADHD, schizophrenia and bipolar disorder. Recent genomic studies have identified large numbers of specific risk DNA changes and offer a direct and robust test of the predictions of the neurodevelopmental continuum model and gradient hypothesis. These findings are reviewed in detail. They not only support the view that schizophrenia is a disorder whose origins lie in disturbances of brain development, but also that it shares genetic risk and pathogenic mechanisms with the early onset neurodevelopmental disorders (intellectual disability, autism spectrum disorders and ADHD). They also support the idea that these disorders lie on a gradient of severity, implying that they differ to some extent quantitatively as well as qualitatively. These findings have important implications for nosology, clinical practice and research. 
© 2017 World Psychiatric Association.

  11. New Cosmic Scales as a Cornerstone for the Evolutionary Processes, Energetic Resources and Activity Phenomena of the Non-Stable Universe

    NASA Astrophysics Data System (ADS)

    Avetissian, A. K.

    2017-07-01

    New cosmic scales, completely different from the Planck scales, have been derived within the framework of the so-called “Non-Inflationary Cosmology” (NIC) developed by the author over the last decade. The proposed ideas expose some hidden inaccuracies in the role of the Planck scales in modern cosmology, and the new scales have accordingly been named the “NAIRI (New Alternative Ideas Regenerating Irregularities) Cosmic Scales” (NCS). The NCS are believed to be realistic because of their qualitative and quantitative correspondence with observational and experimental data. The basic concept of the NCS rests on two hypotheses: the cosmological time-evolution of Planck's constant and multi-photon processes. Together with the hypotheses that Bose statistics dominated in the early Universe and that a large-scale Bose condensate is possible, these predictions form the basis of an alternative theory of cosmology. The “Cosmic Small (Local) Bang” (CSB) phenomenon predicted by the author has been investigated in a model galaxy, and as a consequence of the CSB the possibility of a Super-Strong Shock Wave (SSW) has been postulated. Based on the CSB and SSW phenomena, NIC provides a non-accretion mechanism for the generation of galaxies and the super-massive black holes at their cores, as well as for the creation of supernovae and massive stars (including super-massive stars exceeding 100M⊙). The possibility of gravitational radiation (GR) from the central black hole of a galaxy, and even from the disk (or the whole galaxy!), has also been investigated.

  12. Children's Concepts of the Shape and Size of the Earth, Sun and Moon

    NASA Astrophysics Data System (ADS)

    Bryce, T. G. K.; Blown, E. J.

    2013-02-01

    Children's understandings of the shape and relative sizes of the Earth, Sun and Moon have been extensively researched in a variety of ways. Much is known about the confusions which arise as young people try to grasp ideas about the world and our neighbouring celestial bodies. Despite this, there remain uncertainties about the conceptual models which young people use and how they theorise in the process of acquiring more scientific conceptions. In this article, the relevant published research is reviewed critically and in-depth in order to frame a series of investigations using semi-structured interviews carried out with 248 participants aged 3-18 years from China and New Zealand. Analysis of qualitative and quantitative data concerning the reasoning of these subjects (involving cognitive categorisations and their rank ordering) confirmed that (a) concepts of Earth shape and size are embedded in a 'super-concept' or 'Earth notion' embracing ideas of physical shape, 'ground' and 'sky', habitation of and identity with Earth; (b) conceptual development is similar in cultures where teachers hold a scientific world view and (c) children's concepts of shape and size of the Earth, Sun and Moon can be usefully explored within an ethnological approach using multi-media interviews combined with observational astronomy. For these young people, concepts of the shape and size of the Moon and Sun were closely correlated with their Earth notion concepts and there were few differences between the cultures despite their contrasts. Analysis of the statistical data used Kolmogorov-Smirnov two-sample tests, with hypotheses confirmed at the K-S α = 0.05 level; r_s: p < 0.01.

  13. Introductory Statistics Students' Conceptual Understanding of Study Design and Conclusions

    NASA Astrophysics Data System (ADS)

    Fry, Elizabeth Brondos

    Recommended learning goals for students in introductory statistics courses include the ability to recognize and explain the key role of randomness in designing studies and in drawing conclusions from those studies involving generalizations to a population or causal claims (GAISE College Report ASA Revision Committee, 2016). The purpose of this study was to explore introductory statistics students' understanding of the distinct roles that random sampling and random assignment play in study design and the conclusions that can be made from each. A study design unit lasting two and a half weeks was designed and implemented in four sections of an undergraduate introductory statistics course based on modeling and simulation. The research question that this study attempted to answer is: How does introductory statistics students' conceptual understanding of study design and conclusions (in particular, unbiased estimation and establishing causation) change after participating in a learning intervention designed to promote conceptual change in these areas? In order to answer this research question, a forced-choice assessment called the Inferences from Design Assessment (IDEA) was developed as a pretest and posttest, along with two open-ended assignments, a group quiz and a lab assignment. Quantitative analysis of IDEA results and qualitative analysis of the group quiz and lab assignment revealed that overall, students' mastery of study design concepts significantly increased after the unit, and the great majority of students successfully made the appropriate connections between random sampling and generalization, and between random assignment and causal claims. However, a small, but noticeable portion of students continued to demonstrate misunderstandings, such as confusion between random sampling and random assignment.

  14. Insights into linearized rotor dynamics, Part 2

    NASA Astrophysics Data System (ADS)

    Adams, M. L.

    1987-01-01

    This paper builds upon its 1981 namesake to extend and propose ideas which focus on some unique problems at the current center of interest in rotor vibration technology. These problems pertain to the ongoing extension of the linearized rotor-bearing model to include other rotor-stator interactive forces such as seals and turbomachinery stages. A unified linear model is proposed and contains an axiom which requires the coefficient matrix of the highest order term, in an interactive force model, to be symmetric. The paper ends on a fundamental question, namely, the potential weakness inherent in the whole idea of mechanical impedance modeling of rotor-stator interactive fluid flow fields.

  15. Mapping Soil Age at Continental Scales

    NASA Astrophysics Data System (ADS)

    Slessarev, E.; Feng, X.

    2017-12-01

    Soil age controls the balance between weathered and unweathered minerals in soil, and thus strongly influences many of the biological, geochemical, and hydrological functions of the critical zone. However, most quantitative models of soil development do not represent soil age. Instead, they rely on a steady-state assumption: physical erosion controls the residence time of unweathered minerals in soil, and thus fixes the chemical weathering rate. This assumption may hold true in mountainous landscapes, where physical erosion rates are high. However, the steady-state assumption may fail in low-relief landscapes, where physical erosion rates have been insufficient to remove unweathered minerals left by glaciation and dust deposition since the Last Glacial Maximum (LGM). To test the applicability of the steady-state assumption at continental scales, we developed an empirical predictor for physical erosion, and then simulated soil development since LGM with a numerical model. We calibrated the physical erosion predictor using a compilation of watershed-scale sediment yield data, and in-situ 10Be denudation measurements corrected for weathering by Zr/Ti mass-balance. Physical erosion rates can be predicted using a power-law function of local relief and peak ground acceleration, a proxy for tectonic activity. Coupling physical erosion rates with the numerical model reveals that extensive low-relief areas of North America may depart from steady-state because they were glaciated, or received high dust fluxes during LGM. These LGM legacy effects are reflected in topsoil Ca:Al and Quartz:Feldspar ratios derived from United States Geological Survey data, and in a global compilation of soil pH measurements. Our results quantitatively support the classic idea that soils in the mid-high latitudes of the Northern Hemisphere are "young", in the sense that they are undergoing transient response to LGM conditions. 
Where they occur, such departures from steady-state likely increase mineral weathering rates and the supply of rock-derived nutrients to ecosystems.
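    The abstract's empirical predictor has the form of a power law in local relief and peak ground acceleration; a sketch of that form follows, with the coefficient and exponents as placeholders rather than the calibrated values, together with the steady-state residence-time relation it feeds.

```python
def physical_erosion_rate(relief, pga, k=1.0, a=1.0, b=1.0):
    """Power-law predictor of the form E = k * relief**a * PGA**b,
    as described in the abstract.  k, a and b are placeholder values
    here, not the fitted ones."""
    return k * relief ** a * pga ** b

def steady_state_residence_time(soil_thickness, relief, pga, **kw):
    """Under the steady-state assumption, the residence time of
    unweathered minerals is soil thickness divided by erosion rate;
    the assumption fails where post-LGM inputs dominate."""
    return soil_thickness / physical_erosion_rate(relief, pga, **kw)
```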

  16. Stochastic evolution in populations of ideas

    PubMed Central

    Nicole, Robin; Sollich, Peter; Galla, Tobias

    2017-01-01

    It is known that learning of players who interact in a repeated game can be interpreted as an evolutionary process in a population of ideas. These analogies have so far mostly been established in deterministic models, and memory loss in learning has been seen to act similarly to mutation in evolution. We here propose a representation of reinforcement learning as a stochastic process in finite ‘populations of ideas’. The resulting birth-death dynamics has absorbing states and allows for the extinction or fixation of ideas, marking a key difference to mutation-selection processes in finite populations. We characterize the outcome of evolution in populations of ideas for several classes of symmetric and asymmetric games. PMID:28098244

  17. Stochastic evolution in populations of ideas

    NASA Astrophysics Data System (ADS)

    Nicole, Robin; Sollich, Peter; Galla, Tobias

    2017-01-01

    It is known that learning of players who interact in a repeated game can be interpreted as an evolutionary process in a population of ideas. These analogies have so far mostly been established in deterministic models, and memory loss in learning has been seen to act similarly to mutation in evolution. We here propose a representation of reinforcement learning as a stochastic process in finite ‘populations of ideas’. The resulting birth-death dynamics has absorbing states and allows for the extinction or fixation of ideas, marking a key difference to mutation-selection processes in finite populations. We characterize the outcome of evolution in populations of ideas for several classes of symmetric and asymmetric games.
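    A minimal sketch of the class of process the abstract studies: a Moran-type birth-death dynamics in a finite population of two competing ideas. Without mutation the boundary states are absorbing, so one idea always fixes and the other goes extinct. This is an illustrative stand-in, not the authors' reinforcement-learning dynamics.

```python
import random

def fixates(n=40, i=20, w_a=1.0, w_b=1.0, seed=0):
    """One run of a Moran-type birth-death process: ideas A and B
    compete; each step a fitness-weighted copy reproduces and a
    uniformly chosen copy is replaced.  With no mutation, i = 0 and
    i = n are absorbing.  Returns True if idea A reaches fixation."""
    rng = random.Random(seed)
    while 0 < i < n:
        p_birth_a = w_a * i / (w_a * i + w_b * (n - i))
        births_a = rng.random() < p_birth_a  # reproducing copy is an A
        dies_a = rng.random() < i / n        # replaced copy is an A
        i += int(births_a) - int(dies_a)
    return i == n

# Neutral case (w_a == w_b): starting from i = n/2, each idea should
# fix in about half of independent runs.
share_a = sum(fixates(seed=s) for s in range(400)) / 400
```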

  18. Education Vouchers: Boon or Bane?

    ERIC Educational Resources Information Center

    Young, David G.

    The idea of educational vouchers goes back to Adam Smith in 1778, according to this examination of past and present discussions about vouchers. The author begins by defining educational vouchers and summarizing the idea's history, especially since its revival in 1955 by economist Milton Friedman. Seven models of voucher systems are briefly…

  19. Manipulating Models and Grasping the Ideas They Represent

    ERIC Educational Resources Information Center

    Bryce, T. G.; Blown, E. J.

    2016-01-01

    This article notes the convergence of recent thinking in neuroscience and grounded cognition regarding the way we understand mental representation and recollection: ideas are dynamic and multi-modal, actively created at the point of recall. Also, neurophysiologically, re-entrant signalling among cortical circuits allows non-conscious processing to…

  20. Age-Related Changes in Creative Thinking

    ERIC Educational Resources Information Center

    Roskos-Ewoldsen, Beverly; Black, Sheila R.; Mccown, Steven M.

    2008-01-01

    Age-related differences in cognitive processes were used to understand age-related declines in creativity. According to the Geneplore model (Finke, Ward, & Smith, 1992), there are two phases of creativity--generating an idea and exploring the implications of the idea--each with different underlying cognitive processes. These two phases are…

  1. Idea Bank.

    ERIC Educational Resources Information Center

    Science Teacher, 1993

    1993-01-01

    Presents a series of science teaching ideas with the following titles: When Demonstrations Are Misleading, Lasers and Refraction, An Improved Stair-Step Model, Correcting Your Compass, Seeing Is Not Believing, Food Coloring: From the Kitchen to the Lab, Punny Business, Portfolios in Science, Feathers or Gold: A Case for Using the Metric System,…

  2. A New "Idea of Nature" for Chemical Education

    ERIC Educational Resources Information Center

    Earley, Joseph E., Sr.

    2013-01-01

    "The idea of nature" (general model of how things work) that is accepted in a society strongly influences that group's social and technological progress. Currently, science education concentrates on "analysis" of stable pre-existing items to minimum constituents. This emphasis is consistent with an outlook that has been…

  3. Learning from Each Other

    ERIC Educational Resources Information Center

    Phillips, Vicki L.

    2011-01-01

    The idea behind public charter schools was to develop flexible models of public schools and to incubate innovative ideas that then could be shared with the district's public schools. Today, almost 20 years since the first public charter school opened its doors in Minnesota, one still does not see consistent, productive collaboration and shared…

  4. A system dynamics model of a large R&D program

    NASA Astrophysics Data System (ADS)

    Ahn, Namsung

    Organizations with large R&D activities must deal with a hierarchy of decisions regarding resource allocation. At the highest level, the decision concerns the total allocation to R&D as some portion of revenue. The middle level deals with the allocation among phases of the R&D process. The lowest level of decisions relates to the resource allocation to specific projects within a specific phase. This study focuses on developing an R&D model to deal with the middle level of allocation, i.e., the allocation among phases of research such as basic research, development, and demonstration. The methodology used to develop the R&D model is System Dynamics. Our modeling concept is innovative in representing each phase of R&D as consisting of two parts: projects under way, and an inventory of successful but not-yet-exploited projects. In a simple world, this concept can yield an exact analytical solution for the allocation of resources among phases; in the real world, the concept must be extended with more complex structures exhibiting nonlinear behaviors. Two particular nonlinear feedbacks are incorporated into the R&D model. The probability of success for any specific project is assumed to depend partly upon the resources allocated to the project. Further, the time required to reach a conclusion regarding the success or failure of a project is also assumed to depend upon the level of resources allocated. In addition, the number of successful projects partly depends on the inventory of potential ideas in the previous stage that can be exploited. This model can provide R&D management with insights into the effect of changing allocations to phases, whether those changes are internally or externally driven. With this model, it is possible to study the effectiveness of management decisions in a continuous fashion. Managers can predict payoffs for a host of different policies. In addition, as new research results accumulate, a reassessment of program goals can be implemented easily and allocations adjusted to continuously enhance the likelihood of success and to optimize payoffs. Finally, this model can give managers a quantitative rationale for program evaluation and permit the quantitative assessment of various externally imposed changes.
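    The two-part phase representation described above can be sketched as a small stock-and-flow simulation: one stock of projects under way and one inventory of successful-but-not-yet-exploited projects, with completion speed set by resources per active project. All rates here are illustrative placeholders, not the thesis's calibrated model.

```python
def simulate_rd_phase(steps=60, start_rate=5.0, resources=10.0,
                      p_success=0.4, exploit_rate=0.3, dt=1.0):
    """Euler-integrate one R&D phase: projects start at a fixed rate,
    conclude faster when more resources are available per project, and
    successes accumulate in an inventory drawn down by exploitation."""
    underway, inventory = 0.0, 0.0
    for _ in range(steps):
        # resources per active project set how fast projects conclude
        completion_time = max(1.0, underway / resources)
        completions = underway / completion_time
        underway += (start_rate - completions) * dt
        inventory += (p_success * completions - exploit_rate * inventory) * dt
    return underway, inventory

# In equilibrium (when starts do not exceed resource capacity),
# completions match starts and the inventory settles at
# p_success * start_rate / exploit_rate.
```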

  5. The effects of duration of exposure to the REAPS model in developing students' general creativity and creative problem solving in science

    NASA Astrophysics Data System (ADS)

    Alhusaini, Abdulnasser Alashaal F.

    The Real Engagement in Active Problem Solving (REAPS) model was developed in 2004 by C. June Maker and colleagues as an intervention for gifted students to develop creative problem solving ability through the use of real-world problems. The primary purpose of this study was to examine the effects of the REAPS model on developing students' general creativity and creative problem solving in science with two durations as independent variables. The long duration of the REAPS model implementation lasted five academic quarters or approximately 10 months; the short duration lasted two quarters or approximately four months. The dependent variables were students' general creativity and creative problem solving in science. The second purpose of the study was to explore which aspects of creative problem solving (i.e., generating ideas, generating different types of ideas, generating original ideas, adding details to ideas, generating ideas with social impact, finding problems, generating and elaborating on solutions, and classifying elements) were most affected by the long duration of the intervention. The REAPS model in conjunction with Amabile's (1983; 1996) model of creative performance provided the theoretical framework for this study. The study was conducted using data from the Project of Differentiation for Diverse Learners in Regular Classrooms (i.e., the Australian Project) in which one public elementary school in the eastern region of Australia cooperated with the DISCOVER research team at the University of Arizona. All students in the school from first to sixth grade participated in the study. The total sample was 360 students, of which 115 were exposed to a long duration and 245 to a short duration of the REAPS model. The principal investigators used a quasi-experimental research design in which all students in the school received the treatment for different durations. 
Students in both groups completed pre- and posttests using the Test of Creative Thinking-Drawing Production (TCT-DP) and the Test of Creative Problem Solving in Science (TCPS-S). A one-way analysis of covariance (ANCOVA) was conducted to control for differences between the two groups on pretest results. Statistically significant differences were not found between posttest scores on the TCT-DP for the two durations of REAPS model implementation. However, statistically significant differences were found between posttest scores on the TCPS-S. These findings are consistent with Amabile's (1983; 1996) model of creative performance, particularly her explanation that domain-specific creativity requires knowledge such as specific content and technical skills that must be learned prior to being applied creatively. The findings are also consistent with literature in which researchers have found that longer interventions typically result in expected positive growth in domain-specific creativity, while both longer and shorter interventions have been found effective in improving domain-general creativity. Change scores were also calculated between pre- and posttest scores on the 8 aspects of creativity (Maker, Jo, Alfaiz, & Alhusaini, 2015a), and a binary logistic regression was conducted to assess which were the most affected by the long duration of the intervention. The regression model was statistically significant, with aspects of generating ideas, adding details to ideas, and finding problems being the most affected by the long duration of the intervention. Based on these findings, the researcher believes that the REAPS model is a useful intervention to develop students' creativity. Future researchers should implement the model for longer durations if they are interested in developing students' domain-specific creative problem solving ability.

  6. Multilevel Modeling in Psychosomatic Medicine Research

    PubMed Central

    Myers, Nicholas D.; Brincks, Ahnalee M.; Ames, Allison J.; Prado, Guillermo J.; Penedo, Frank J.; Benedict, Catherine

    2012-01-01

    The primary purpose of this manuscript is to provide an overview of multilevel modeling for Psychosomatic Medicine readers and contributors. The manuscript begins with a general introduction to multilevel modeling. Multilevel regression modeling at two-levels is emphasized because of its prevalence in psychosomatic medicine research. Simulated datasets based on some core ideas from the Familias Unidas effectiveness study are used to illustrate key concepts including: communication of model specification, parameter interpretation, sample size and power, and missing data. Input and key output files from Mplus and SAS are provided. A cluster randomized trial with repeated measures (i.e., three-level regression model) is then briefly presented with simulated data based on some core ideas from a cognitive behavioral stress management intervention in prostate cancer. PMID:23107843
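    The core two-level ideas in this abstract (a cluster-level random intercept plus an individual-level residual, summarized by the intraclass correlation) can be illustrated with simulated data. This is a stdlib-only method-of-moments sketch, not the Mplus/SAS specifications the article provides; all parameter values are illustrative.

```python
import random
import statistics

def simulated_icc(n_clusters=200, n_per=20, tau=1.0, sigma=2.0, seed=1):
    """Simulate an intercept-only two-level model
        y_ij = gamma_00 + u_j + e_ij,  u_j ~ N(0, tau^2), e_ij ~ N(0, sigma^2),
    then recover the intraclass correlation ICC = tau^2 / (tau^2 + sigma^2)
    with a one-way ANOVA (method-of-moments) estimator."""
    rng = random.Random(seed)
    clusters = [
        [10.0 + u + rng.gauss(0, sigma) for _ in range(n_per)]
        for u in (rng.gauss(0, tau) for _ in range(n_clusters))
    ]
    within = statistics.fmean(statistics.variance(c) for c in clusters)
    between = statistics.variance(statistics.fmean(c) for c in clusters)
    tau2_hat = max(0.0, between - within / n_per)  # E[between] = tau^2 + sigma^2/n
    return tau2_hat / (tau2_hat + within)

# with tau = 1 and sigma = 2, the true ICC is 1.0 / (1.0 + 4.0) = 0.2
```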

  7. Anomalous diffusion and the structure of human transportation networks

    NASA Astrophysics Data System (ADS)

    Brockmann, D.

    2008-04-01

    The dispersal of individuals of a species is the key driving force of various spatiotemporal phenomena which occur on geographical scales. It can synchronise populations of interacting species, stabilise them, and diversify gene pools [1-3]. The geographic spread of human infectious diseases such as influenza, measles and the recent severe acute respiratory syndrome (SARS) is essentially promoted by human travel which occurs on many length scales and is sustained by a variety of means of transportation [4-8]. In the light of increasing international trade, intensified human traffic, and an imminent influenza A pandemic, knowledge of the dynamical and statistical properties of human dispersal is of fundamental and acute importance [7,9,10]. A quantitative statistical theory for human travel and concomitant reliable forecasts would substantially improve and extend existing prevention strategies. Despite its crucial role, a quantitative assessment of human dispersal remains elusive and the opinion that humans disperse diffusively still prevails in many models [11]. In this chapter I will report on a recently developed technique which permits a solid and quantitative assessment of human dispersal on geographical scales [11]. The key idea is to infer the statistical properties of human travel by analysing the geographic circulation of individual bank notes for which comprehensive datasets are collected at the online bill-tracking website www.wheresgeorge.com. The analysis shows that the distribution of travelling distances decays as a power law, indicating that the movement of bank notes is reminiscent of superdiffusive, scale-free random walks known as Lévy flights [13]. Secondly, the probability of remaining in a small, spatially confined region for a time T is dominated by heavy tails which attenuate superdiffusive dispersal. I will show that the dispersal of bank notes can be described on many spatiotemporal scales by a two-parameter continuous time random walk (CTRW) model to a surprising accuracy. To this end, I will provide a brief introduction to continuous time random walk theory [14] and will show that human dispersal is an ambivalent, effectively superdiffusive process.
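    A minimal CTRW of the kind sketched above can be simulated with inverse-transform sampling from Pareto tails: power-law jump lengths (beta < 2 gives Lévy-flight-like jumps) and power-law waiting times (alpha < 1 gives the heavy-tailed rests that attenuate superdiffusion). The exponents below are illustrative, not the values fitted in the bank-note study.

```python
import math
import random

def ctrw_path(n_steps=10_000, beta=1.5, alpha=0.6,
              r_min=1.0, t_min=1.0, seed=2):
    """2-D continuous-time random walk: each step draws a Pareto jump
    length with tail P(r > x) ~ x**(-beta), a Pareto waiting time with
    tail P(T > t) ~ t**(-alpha), and a uniform direction.  Returns a
    list of (time, x, y) tuples."""
    rng = random.Random(seed)
    x = y = t = 0.0
    path = [(t, x, y)]
    for _ in range(n_steps):
        # inverse-transform sampling: u ~ U(0,1], r = r_min * u**(-1/beta)
        r = r_min * (1.0 - rng.random()) ** (-1.0 / beta)
        wait = t_min * (1.0 - rng.random()) ** (-1.0 / alpha)
        angle = rng.uniform(0.0, 2.0 * math.pi)
        x, y, t = x + r * math.cos(angle), y + r * math.sin(angle), t + wait
        path.append((t, x, y))
    return path
```

    With beta = 1.5 the jump-length variance diverges, which is what makes the walk superdiffusive rather than Brownian.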

  8. Modeling commodity salam contract between two parties for discrete and continuous time series

    NASA Astrophysics Data System (ADS)

    Hisham, Azie Farhani Badrol; Jaffar, Maheran Mohd

    2017-08-01

    In order for Islamic finance to remain competitive with its conventional counterpart, new syariah-compliant products need to be developed, such as Islamic derivatives that can be used to manage risk. However, under syariah principles and regulations, financial instruments must not conflict with five prohibited elements: riba (interest paid), rishwah (corruption), gharar (uncertainty or unnecessary risk), maysir (speculation or gambling) and jahl (taking advantage of the counterparty's ignorance). This study proposes that a traditional Islamic contract, namely salam, can be built into an Islamic derivative product. Although many studies have discussed and proposed the implementation of the salam contract as an Islamic product, they deal mainly with qualitative and legal issues. Since quantitative studies of the salam contract are scarce, this study introduces mathematical models that can value the appropriate price for a commodity salam contract between two parties. In modeling the commodity salam contract, this study modifies the existing conventional derivative model with adjustments that comply with syariah rules and regulations. The cost of carry model is chosen as the foundation for developing the commodity salam model between two parties for discrete and continuous time series. However, the conventional time value of money derives from the concept of interest, which is prohibited in Islam. Therefore, this study adopts the Islamic time value of money, known as positive time preference, in modeling the commodity salam contract between two parties for discrete and continuous time series.
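    The cost-of-carry foundation mentioned above can be sketched for the discrete and continuous cases. Here a rate k, standing in for the positive-time-preference (profit) rate rather than interest, carries the spot price forward to delivery; this is an illustrative form, not the paper's exact model.

```python
import math

def salam_price_discrete(spot, k, periods):
    """Cost-of-carry deferred-delivery price, discrete compounding:
    F = S * (1 + k)**n, with k a profit/positive-time-preference rate
    (an illustrative placeholder for interest)."""
    return spot * (1.0 + k) ** periods

def salam_price_continuous(spot, k, maturity):
    """Continuous-time counterpart: F = S * exp(k * T)."""
    return spot * math.exp(k * maturity)
```

    For the same rate and horizon, continuous compounding carries slightly more than discrete, since exp(kT) > (1 + k)**T for k > 0.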

  9. FORMING CHONDRULES IN IMPACT SPLASHES. II. VOLATILE RETENTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dullemond, Cornelis Petrus; Harsono, Daniel; Stammler, Sebastian Markus

    2016-11-20

    Solving the mystery of the origin of chondrules is one of the most elusive goals in the field of meteoritics. Recently, the idea of planet(esimal) collisions releasing splashes of lava droplets, long considered out of favor, has been reconsidered as a possible origin of chondrules by several papers. One of the main problems with this idea is the lack of quantitative and simple models that can be used to test this scenario by directly comparing to the many known observables of chondrules. In Paper I of this series, we presented a simple thermal evolution model of a spherically symmetric expanding cloud of molten lava droplets that is assumed to emerge from a collision between two planetesimals. The production of lava could be either because the two planetesimals were already in a largely molten (or almost molten) state due to heating by ²⁶Al, or due to impact jetting at higher impact velocities. In the present paper, number II of this series, we use this model to calculate whether or not volatile elements such as Na and K will remain abundant in these droplets or whether they will get depleted due to evaporation. The high density of the droplet cloud (e.g., small distance between adjacent droplets) causes the vapor to quickly reach saturation pressure and thus shuts down further evaporation. We show to what extent, and under which conditions, this keeps the abundances of these elements high, as is seen in chondrules. We find that for most parameters of our model (cloud mass, expansion velocity, initial temperature) the volatile elements Mg, Si, and Fe remain entirely in the chondrules. The Na and K abundances inside the droplets will initially stay mostly at their initial values due to the saturation of the vapor pressure, but at some point start to drop due to the cloud expansion. However, as soon as the temperature starts to decrease, most or all of the vapor recondenses again. At the end, the Na and K elements retain most of their initial abundances, albeit occasionally somewhat reduced, depending on the parameters of the expanding cloud model. These findings appear to be qualitatively consistent with the analysis of Semarkona Type II chondrules by Hewins et al. who found evidence for sodium evaporation followed by recondensation.

  10. Pesticide risk perceptions and the differences between farmers and extensionists: Towards a knowledge-in-context model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ríos-González, Adriana, E-mail: adrianariosg@hotmail.com; The Africa and Latin America Research Groups Network; Jansen, Kees, E-mail: Kees.Jansen@wur.nl

    A growing body of literature analyzes farmer perceptions of pesticide risk, but much less attention has been given to differences in risk perception between farmers and technical experts. Furthermore, inconsistencies in knowledge have too easily been explained in terms of a lack of knowledge, rather than by exploring the underlying reasons for particular forms of thinking about pesticide risks. As a result, the division between expert and lay knowledge has been deepened rather than transcended. Objective: This study aims to understand differences and similarities among the perceptions of pesticide risks held by farmers, farm workers, and technical experts such as extensionists, by applying a social science approach to knowledge and risk attitudes. Methods: Semi-structured interviews and field observations were conducted with smallholders, farm workers, extensionists, health professionals and scientists involved in the use and handling of pesticides. Subsequently, a survey was carried out to quantify farmers' and extensionists' acceptance or rejection of typical assertions expressed in the semi-structured interviews. Results: Smallholders were found to gain knowledge from their own experience and to adapt their pesticide practices, which is a potential basis for transforming notions of pesticide safety and risk reduction strategies. Though extensionists have received formal education, they sometimes develop ideas deviating from the technical perspective. The risk perception of the studied actors appeared to vary according to their role in the agricultural labor process; it varied much less than expected according to their schooling level. Conclusions: Commitment to the technical perspective is not dramatically different for extensionists on the one hand and farmers and farm workers on the other. Ideas about a supposed lack of knowledge among farmers and the need for formal training are driven too much by a deficit model of knowledge. Further research on pesticide risk perceptions and the training of rural people will benefit from the development of a knowledge-in-context model. -- Highlights: • Researching perceptions of farmers, extensionists and other professionals. • Experts as well as farmers deviate from the technical perspective. • Blaming who is responsible for pesticide problems creates an expert-lay division. • Qualitative and quantitative methods, not as complementary but integrated. • Knowledge-in-context model as an alternative to the knowledge-deficit model.

  11. Observations, theoretical ideas and modeling of turbulent flows: Past, present and future

    NASA Technical Reports Server (NTRS)

    Chapman, G. T.; Tobak, M.

    1985-01-01

    Turbulence was analyzed in a historical context featuring the interactions between observations, theoretical ideas, and modeling within three successive movements. These are identified as predominantly statistical, structural and deterministic. The statistical movement is criticized for its failure to deal with the structural elements observed in turbulent flows. The structural movement is criticized for its failure to embody observed structural elements within a formal theory. The deterministic movement is described as having the potential of overcoming these deficiencies by allowing structural elements to exhibit chaotic behavior that is nevertheless embodied within a theory. Four major ideas of this movement are described: bifurcation theory, strange attractors, fractals, and the renormalization group. A framework for the future study of turbulent flows is proposed, based on the premises of the deterministic movement.

  12. Modeling the Transition from a Phenotypic to Genotypic Conceptualization of Genetics in a University-Level Introductory Biology Context

    NASA Astrophysics Data System (ADS)

    Todd, Amber; Romine, William L.; Correa-Menendez, Josefina

    2017-07-01

    Identifying contingencies between constructs in a multi-faceted learning progression (LP) is a challenging task. Often, there is not enough evidence in the literature to support connections, and once identified, they are difficult to empirically test. Here, we use causal model search to evaluate how connections between ideas in a genetics LP change over time in the context of an introductory biology course. We identify primary and secondary hub ideas and connections between concepts before and after instruction to illustrate how students moved from a phenotypic grounding of genetics knowledge to a more genotypic grounding of their genetics knowledge after instruction. We discuss our results in light of conceptual change and illustrate the importance of understanding students' idea structures within a domain.

  13. Modeling Languages Refine Vehicle Design

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Cincinnati, Ohio's TechnoSoft Inc. is a leading provider of object-oriented modeling and simulation technology used for commercial and defense applications. With funding from Small Business Innovation Research (SBIR) contracts issued by Langley Research Center, the company continued development on its adaptive modeling language, or AML, originally created for the U.S. Air Force. TechnoSoft then created what is now known as its Integrated Design and Engineering Analysis Environment, or IDEA, which can be used to design a variety of vehicles and machinery. IDEA's customers include clients in green industries, such as designers for power plant exhaust filtration systems and wind turbines.

  14. `Human nature': Chemical engineering students' ideas about human relationships with the natural world

    NASA Astrophysics Data System (ADS)

    Goldman, Daphne; Ben-Zvi Assaraf, Orit; Shemesh, Julia

    2014-05-01

    While the importance of environmental ethics, as a component of sustainable development, in preparing engineers is widely acknowledged, little research has addressed chemical engineers' environmental concerns. This study aimed to address this void by exploring chemical engineering students' values regarding human-nature relationships. The study was conducted with 247 3rd-4th year chemical engineering students at Israeli universities. It employed the New Ecological Paradigm (NEP) questionnaire, to which students added written explanations. Quantitative analysis of the NEP-scale results shows that the students demonstrated a moderately ecocentric orientation. Explanations of the NEP items reveal diverse, ambivalent ideas regarding the notions embodied in the NEP, a strong scientific orientation, and reliance on technology for addressing environmental challenges. Endorsing sustainability implies that today's engineers must be equipped with an ecological perspective. The capacity of higher education to enable engineers to develop dispositions about human-nature interrelationships requires adapting curricula towards multidisciplinary, integrative learning that addresses social-political-economic-ethical perspectives, and implementing critical thinking within the socio-scientific issues pedagogical approach.
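    The quantitative side of the NEP instrument reduces to averaging Likert responses after reverse-coding the anti-ecological items. A minimal sketch, assuming the common 15-item revised NEP convention in which even-numbered items are reverse-coded; the specific variant and coding used in the study are not stated, so these details are assumptions:

```python
def nep_score(responses, reverse_items=None):
    """Mean NEP score on a 1-5 Likert scale; higher means more ecocentric.

    responses: dict mapping item number (1-15) to a response (1-5).
    reverse_items: anti-ecological items, flipped as 6 - r. Even-numbered
    items are reversed here, following the common revised-NEP convention
    (an assumption, not a detail given in the abstract).
    """
    if reverse_items is None:
        reverse_items = {2, 4, 6, 8, 10, 12, 14}
    scored = [(6 - r) if item in reverse_items else r
              for item, r in responses.items()]
    return sum(scored) / len(scored)
```

    A mean score somewhat above the scale midpoint of 3 is the kind of result summarized as a "moderately ecocentric orientation".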

  15. The implementation of multi-task geophysical survey to locate Cleopatra Tomb at Tap-Osiris Magna, Borg El-Arab, Alexandria, Egypt “Phase II”

    NASA Astrophysics Data System (ADS)

    Abbas, Abbas M.; Khalil, Mohamed A.; Massoud, Usama; Santos, Fernando M.; Mesbah, Hany A.; Lethy, Ahmed; Soliman, Mamdouh; Ragab, El Said A.

    2012-06-01

    According to some new discoveries at the Tap-Osiris Magna temple (west of Alexandria), there is potential to uncover a remarkable archaeological finding at this site. Three years ago, significant archaeological evidence was discovered supporting the idea that the tomb of Cleopatra and Anthony may be found in the Osiris temple inside the Tap-Osiris Magna temple at a depth of 20 to 30 m. To confirm this idea, Phase I was conducted by the joint application of Ground Penetrating Radar (GPR), Electrical Resistivity Tomography (ERT) and magnetometry. The results obtained from Phase I could not confirm the existence of major tombs at this site. However, small possible cavities were strongly indicated, which encouraged us to proceed with the investigation of this site using another geophysical approach, the Very Low Frequency Electromagnetic (VLF-EM) technique. VLF-EM data were collected along parallel lines covering the investigated site with a line-to-line spacing of 1 m; a point-to-point distance of 1 m along each line was employed. The data were interpreted qualitatively by the Fraser filtering process and quantitatively by 2-D VLF inversion of tipper data and forward modeling. Results obtained from the VLF-EM interpretation are correlated with 2-D resistivity imaging and drilling information. The findings showed a highly resistive zone at a depth extending from about 25 to 45 m buried beneath the Osiris temple, which could indicate the tomb of Cleopatra and Anthony. This result is supported by the Fraser filtering and forward modeling results. The depth of archaeological findings as indicated by the geophysical survey correlates well with the depth expected by archaeologists, as well as with the depth of tombs discovered outside the Tap-Osiris Magna temple. This depth level has not been reached by drilling at this site. We hope that the site can be excavated in the future based on these geophysical results.

  16. Fostering Under-represented Minority Student Success and Interest in the Geosciences: Outcomes of the UNC-Chapel Hill Increasing Diversity and Enhancing Academia (IDEA) Program

    NASA Astrophysics Data System (ADS)

    Hughes, M. H.; Gray, K.; Drostin, M.

    2016-12-01

    For under-represented minority (URM) students, opportunities to meaningfully participate in academic communities and develop supportive relationships with faculty and peers influence persistence in STEM majors (Figueroa, Hurtado, & Wilkins, 2015; PCAST, 2012; Tsui, 2007). Creating such opportunities is even more important in the geosciences, where a lower percentage of post-secondary degrees are awarded to URM students than in other STEM fields (NSF, 2015; O'Connell & Holmes, 2011; NSF, 2011). Since 2011, Increasing Diversity and Enhancing Academia (IDEA), a program of the UNC-Chapel Hill Institute for the Environment (UNC-IE), has provided 39 undergraduates (predominantly URM and female students) with career-relevant research experiences and professional development opportunities, including a culminating experience of presenting their research at a campus-wide research symposium. External evaluation data have helped to characterize the effectiveness of the IDEA program. These data included pre- and post-surveys assessing students' interest in geosciences, knowledge of career pathways, and perceptions of their abilities related to a specific set of scientific research skills. Additionally, progress towards degrees and dissemination outcomes were tracked. In this presentation, we will share quantitative and qualitative data that demonstrate that participation in the IDEA program has influenced students' interest and persistence in geosciences research and careers. These data range from self-reported competencies in a variety of scientific skills (such as organizing and interpreting data and reading and interpreting science literature) to documentation of student participation in geoscience study and professions. About 69% of participants continued research begun during their internships beyond the internship; and about 38% pursued graduate degrees and secured jobs in geoscience and other STEM fields. (Nearly half are still in school.) 
Overall, these evaluation data have shown that the IDEA research experience, combined with program elements focused on professional development, reinforces students' sense of their science abilities, connects them to a network of supportive students and professionals and contributes to their sense of belonging within the geosciences.

  17. Challenges and opportunities for imaging journals: emerging from the shadows.

    PubMed

    Kressel, Herbert Y

    2011-09-01

    In this article, we will discuss the challenges and opportunities for imaging journals in the next decades. These include: the importance of optimizing online communication to enhance exchange of ideas in the sciences and the necessity to facilitate the development of quantitative tools that will aid in risk stratification, diagnosis, monitoring of therapy, and disease surveillance in an era of "P4 medicine". Journals will also need to promote the evidence-based evaluation of promising technologies so that the resources expended in healthcare may be most effectively used.

  18. How patients understand depression associated with chronic physical disease – a systematic review

    PubMed Central

    2012-01-01

    Background Clinicians are encouraged to screen people with chronic physical illness for depression. Screening alone may not improve outcomes, especially if the process is incompatible with patient beliefs. The aim of this research is to understand people’s beliefs about depression, particularly in the presence of chronic physical disease. Methods A mixed method systematic review involving a thematic analysis of qualitative studies and quantitative studies of beliefs held by people with current depressive symptoms. MEDLINE, EMBASE, PSYCHINFO, CINAHL, BIOSIS, Web of Science, The Cochrane Library, UKCRN portfolio, National Research Register Archive, Clinicaltrials.gov and OpenSIGLE were searched from database inception to 31st December 2010. A narrative synthesis of qualitative and quantitative data, based initially upon illness representations and extended to include other themes not compatible with that framework. Results A range of clinically relevant beliefs was identified from 65 studies including the difficulty in labeling depression, complex causal factors instead of the biological model, the roles of different treatments and negative views about the consequences of depression. We found other important themes less related to ideas about illness: the existence of a self-sustaining ‘depression spiral’; depression as an existential state; the ambiguous status of suicidal thinking; and the role of stigma and blame in depression. Conclusions Approaches to detection of depression in physical illness need to be receptive to the range of beliefs held by patients. Patient beliefs have implications for engagement with depression screening. PMID:22640234

  19. Quantitative Understanding on the Amplitude Decay Characteristic of the Evanescent Electromagnetic Waves Generated by Seismoelectric Conversion

    NASA Astrophysics Data System (ADS)

    Ren, Hengxin; Huang, Qinghua; Chen, Xiaofei

    2018-03-01

    We conduct numerical simulations and theoretical analyses to quantitatively study the amplitude decay characteristic of evanescent electromagnetic (EM) waves, which has been neglected in previous studies of the seismoelectric conversion occurring at a porous-porous interface. Time-slice snapshots of seismic and EM wave-fields generated by a vertical single-force point source in a two-layer porous model show that evanescent EM waves can be induced at a porous-porous interface. The seismic and EM wave-fields computed for a receiver array located along a vertical line near the interface are investigated in detail. In addition to the direct and interface-response radiation EM waves, we identify three groups of coseismic EM fields and evanescent EM waves associated with the direct P, refracted SV-P and direct SV waves, respectively. We then derive the mathematical expression of the amplitude decay factor of the evanescent EM waves; this expression is further validated by our numerical simulations. It turns out that the amplitude decay of the evanescent EM waves generated by seismoelectric conversion depends strongly on the horizontal wavenumber of the seismic waves. It is also found that the evanescent EM waves have higher detectability at a lower frequency range. This work provides a better understanding of the EM wave-fields generated by seismoelectric conversion, which may help improve the interpretation of seismoelectric coupling phenomena associated with natural earthquakes, or inspire new ideas on the application of the seismoelectric coupling effect.
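    The exponential character of such decay can be illustrated with the textbook form of an evanescent field, whose amplitude falls off at a rate set by how far the horizontal wavenumber exceeds the propagating wavenumber in the medium. This is a generic sketch of that relationship, not the authors' derived expression, and all values are illustrative:

```python
import math

def evanescent_amplitude(a0, k_x, k_medium, z):
    """Amplitude at distance z from the interface for an evanescent wave.

    Evanescence requires the horizontal wavenumber k_x to exceed the
    propagating wavenumber k_medium; the decay rate is then
    kappa = sqrt(k_x**2 - k_medium**2), so larger k_x means faster decay.
    """
    if k_x <= k_medium:
        raise ValueError("wave is propagating, not evanescent")
    kappa = math.sqrt(k_x**2 - k_medium**2)
    return a0 * math.exp(-kappa * z)

# Doubling the horizontal wavenumber (5 -> 10, with k_medium = 3) more than
# doubles the decay rate, consistent with a strong wavenumber dependence.
print(evanescent_amplitude(1.0, 5.0, 3.0, 1.0))   # exp(-4)
print(evanescent_amplitude(1.0, 10.0, 3.0, 1.0))  # exp(-sqrt(91))
```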

  20. The place character as land use change determinant in Deli Serdang

    NASA Astrophysics Data System (ADS)

    Lindarto, D.; Sirojuzilam; Badaruddin; Aulia, DN

    2018-03-01

    The Mebidangro development concept (Medan, Binjai, Deli Serdang, Karo) in Sumatera Utara is creating peri-urban areas in the hinterland of Medan city, especially in Tembung village, Percut Sei Tuan District. This peri-urban area is a conjunction of several rural-urban activities that forms a friendly atmosphere. The dynamics of the population structure show the occurrence of sprawling land use change. The site of this urban region shows a unique character of place. The aim of the study is to uncover place character as one of the determinant factors of land use change. The study was conducted with a quantitative approach intended to obtain variables describing the factors that form land use change; a descriptive approach provides ideas, justification, and fact-finding with correct interpretation. Data were collected through purposive sampling of 320 respondents who stayed on, and built on, their building and land between 2010 and 2014. Using the overlay figure/ground technique, scoring analysis, descriptive quantitative analysis and SEM (Structural Equation Models), the result is that urban heritage (p=0.008) is potentially one of the main driving factors of land use change, alongside accessibility (p=0.039), infrastructure (p=0.010) and social-economic factors (p=0.038), whereas the topographic factor (p=0.663) was not significant. The implication of the findings is that intensive attention is required toward the forms of place character (the mosque, the quarter, district activity, peri-urban city edges and the railway) as determinant factors of land use change, since they form the identity of the rapid land use transformation.
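    The reported SEM p-values can be screened against the conventional 0.05 significance threshold to separate the driving factors from the topographic one. A minimal sketch using the values quoted in the abstract (the function name is illustrative):

```python
def significant_factors(pvalues, alpha=0.05):
    """Return factor names with p < alpha, ordered from strongest evidence up."""
    return sorted((name for name, p in pvalues.items() if p < alpha),
                  key=pvalues.get)

# p-values as reported for the Deli Serdang SEM analysis.
factors = {"urban heritage": 0.008, "infrastructure": 0.010,
           "social-economic": 0.038, "accessibility": 0.039,
           "topographic": 0.663}
print(significant_factors(factors))
# → ['urban heritage', 'infrastructure', 'social-economic', 'accessibility']
```

    Only the topographic factor fails the threshold, matching the abstract's conclusion that it did not drive land use change.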

  1. The Dokuchaev hypothesis as a basis for predictive digital soil mapping (on the 125th anniversary of its publication)

    NASA Astrophysics Data System (ADS)

    Florinsky, I. V.

    2012-04-01

    Predictive digital soil mapping is widely used in soil science. Its objective is the prediction of the spatial distribution of soil taxonomic units and quantitative soil properties via the analysis of spatially distributed quantitative characteristics of soil-forming factors. Western pedometrists stress the scientific priority and principal importance of Hans Jenny's book (1941) for the emergence and development of predictive soil mapping. In this paper, we demonstrate that Vasily Dokuchaev explicitly defined the central idea and statement of the problem of contemporary predictive soil mapping in the year 1886. Then, we reconstruct the history of the soil formation equation from 1899 to 1941. We argue that Jenny adopted the soil formation equation from Sergey Zakharov, who published it in a well-known fundamental textbook in 1927. It is encouraging that this issue was clarified in 2011, the anniversary year for publications of Dokuchaev and Jenny.

  2. Evolution, Energy Landscapes and the Paradoxes of Protein Folding

    PubMed Central

    Wolynes, Peter G.

    2014-01-01

    Protein folding has been viewed as a difficult problem of molecular self-organization. The search problem involved in folding however has been simplified through the evolution of folding energy landscapes that are funneled. The funnel hypothesis can be quantified using energy landscape theory based on the minimal frustration principle. Strong quantitative predictions that follow from energy landscape theory have been widely confirmed both through laboratory folding experiments and from detailed simulations. Energy landscape ideas also have allowed successful protein structure prediction algorithms to be developed. The selection constraint of having funneled folding landscapes has left its imprint on the sequences of existing protein structural families. Quantitative analysis of co-evolution patterns allows us to infer the statistical characteristics of the folding landscape. These turn out to be consistent with what has been obtained from laboratory physicochemical folding experiments signalling a beautiful confluence of genomics and chemical physics. PMID:25530262

  3. From university research to commercial product (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Mathuis, Philip

    2016-03-01

    Ovizio Imaging Systems, a quantitative microscopic imaging spin-off of the Université Libre de Bruxelles, Belgium, was founded at the beginning of 2010 by Philip Mathuis, Serge Jooris, Prof. Frank Dubois and Dr. Catherine Yourassowky. The company has launched a range of specialized microscopy instruments for quantitative imaging, mainly focused on the bioprocessing and diagnostics fields within the life sciences market. During my talk I will present the story of how an idea that emerged from the research labs of the university made it to a manufactured and sold product. The talk will look at many aspects of entrepreneurship and setting up a company: finding funding for the project, attracting people, industrialization, product design and commercialization. It will also focus on choices one has to make during the start-up phase and methodologies that can be applied in many different settings.

  4. Modeling abundance using hierarchical distance sampling

    USGS Publications Warehouse

    Royle, Andy; Kery, Marc

    2016-01-01

    In this chapter, we provide an introduction to classical distance sampling ideas for point and line transect data, and for continuous and binned distance data. We introduce the conditional and the full likelihood, and we discuss Bayesian analysis of these models in BUGS using the idea of data augmentation, which we discussed in Chapter 7. We then extend the basic ideas to the problem of hierarchical distance sampling (HDS), where we have multiple point or transect sample units in space (or possibly in time). The benefit of HDS in practice is that it allows us to directly model spatial variation in population size among these sample units. This is a preeminent concern of most field studies that use distance sampling methods, but it is not a problem that has received much attention in the literature. We show how to analyze HDS models in both the unmarked package and in the BUGS language for point and line transects, and for continuous and binned distance data. We provide a case study of HDS applied to a survey of the island scrub-jay on Santa Cruz Island, California.
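    The basic estimator underlying these ideas, a detection function fitted to observed distances with abundance recovered as the count divided by average detectability, can be sketched with the half-normal model. The σ, truncation distance and count below are illustrative assumptions, not values from the island scrub-jay case study:

```python
import math

def halfnormal_p(d, sigma):
    """Half-normal detection function: probability of detecting an
    individual at perpendicular distance d."""
    return math.exp(-d**2 / (2 * sigma**2))

def average_detection_prob(truncation, sigma, n=10000):
    """Average p(d) over distances uniform on [0, truncation] (midpoint rule)."""
    step = truncation / n
    return sum(halfnormal_p((i + 0.5) * step, sigma) for i in range(n)) * step / truncation

def abundance_estimate(n_detected, truncation, sigma):
    """Canonical distance-sampling estimator: N-hat = n / p-bar."""
    return n_detected / average_detection_prob(truncation, sigma)

# With sigma = 50 m and a 100 m truncation distance, about 60% of individuals
# are detected on average, so 60 detections suggest about 100 in the strip.
print(round(abundance_estimate(60, 100.0, 50.0)))  # → 100
```

    The hierarchical extension discussed in the chapter would then let σ or the local abundance vary by sample unit through covariates.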

  5. Teaching Scientific Core Ideas through Immersing Students in Argument: Using Density as an Example

    ERIC Educational Resources Information Center

    Chen, Ying-Chih; Lin, Jia-Ling; Chen, Yen-Ting

    2014-01-01

    Argumentation is one of the central practices in science learning and helps deepen students' conceptual understanding. Students should learn how to communicate ideas including procedure tests, data interpretations, and investigation outcomes in verbal and written forms through argument structure. This article presents a negotiation model to…

  6. "The Fly on the Wall" Reflecting Team Supervision.

    ERIC Educational Resources Information Center

    Prest, Layne E.; And Others

    1990-01-01

    Adapts reflecting team concept, a practical application of constructivist ideas, for use in group supervision. Evolving model includes a focus on the unique "fly on the wall" perspective of the reflecting team. Trainees are introduced to a multiverse of new ideas and perspectives in a context which integrates some of the most challenging…

  7. The Idea of the Secondary School in Nineteenth-Century Europe

    ERIC Educational Resources Information Center

    Anderson, Robert

    2004-01-01

    The title echoes the well-known phrase "the idea of the university", and European universities have always been seen as institutions with a strong international dimension, developing according to common patterns. In their case, it was the "Humboldtian" model embodied in the University of Berlin founded in 1810 which prevailed.…

  8. Individual and Collective Reflection: How to Meet the Needs of Development in Teaching

    ERIC Educational Resources Information Center

    Nissila, Sade-Pirkko

    2005-01-01

    The following five core ideas explain how learning organizations function as wholes. The core ideas are central when school is examined as a learning organization. Personal mastery, mental models, team learning, shared visions and system thinking offer different angles to examine the organization. (1) Personal mastery. Without personal commitment,…

  9. Sensitive Technology Assessment of ACOT.

    ERIC Educational Resources Information Center

    Baker, Eva L.

    This paper explores the ideas and the model underlying the evaluation of the Apple Classroom of Tomorrow project (ACOT), a 2-year-old research and development project incorporating at least seven different grade levels which is located in five different school sites in four states. The major features of ACOT are identified as the ideas of computer…

  10. Ideas for a Teaching Sequence for the Concept of Energy

    ERIC Educational Resources Information Center

    Duit, Reinders; Neumann, Knut

    2014-01-01

    The energy concept is one of the most important ideas for students to understand. Looking at phenomena through the lens of energy provides powerful tools to model, analyse and predict phenomena in the scientific disciplines. The cross-disciplinary nature of the energy concept enables students to look at phenomena from different angles, helping…

  11. Let's Have Some Capatence Here

    ERIC Educational Resources Information Center

    Brown, Reva Berman; McCartney, Sean

    2003-01-01

    Defines two competitive ideas--competence and capability--and argues that neither deals adequately with the central issue of the present. Provides a model, to place these ideas in conceptual space--the vertical axis of which is bounded by the extremes of narrow and broad focus, and the horizontal axis by the past and the future. Suggests that…

  12. Sculptural Ideas with Sketches and Maquettes. Teaching Art with Art.

    ERIC Educational Resources Information Center

    Hubbard, Guy

    2000-01-01

    Discusses the use of sketches and maquettes (idea models usually made of clay) as a preparatory method for a final artwork. Suggests ways to teach students about the importance of planning an artwork. Discusses the sketches and maquettes of sculptural pieces by Henry Moore, Constantin Brancusi, and Antoine Pevsner. (CMK)

  13. How Do Students' Behaviors Relate to the Growth of Their Mathematical Ideas?

    ERIC Educational Resources Information Center

    Warner, Lisa B.

    2008-01-01

    The purpose of this study is to analyze the relationship between student behaviors and the growth of mathematical ideas (using the Pirie-Kieren model). This analysis was accomplished through a series of case studies, involving middle school students of varying ability levels, who were investigating a combinatorics problem in after-school…

  14. Building a scholar in writing (BSW): A model for developing students' critical writing skills.

    PubMed

    Bailey, Annette; Zanchetta, Margareth; Velasco, Divine; Pon, Gordon; Hassan, Aafreen

    2015-11-01

    Several authors have highlighted the importance of writing in developing reflective thinking skills, transforming knowledge, communicating expressions, and filling knowledge gaps. However, difficulties with higher order processing and critical analysis affect students' ability to write critical and thoughtful essays. The Building a Scholar in Writing (BSW) model is a 6-step process of increasing intricacies in critical writing development. Development of critical writing is proposed to occur in a processed manner that transitions from presenting simple ideas (just bones) in writing, to connecting ideas (connecting bones), to formulating a thesis and connecting key components (constructing a skeleton), to supporting ideas with evidence (adding muscle), to building creativity and originality (adding essential organs), and finally, developing strong, integrated, critical arguments (adding brain). This process symbolically represents the building of a scholar. The idea of building a scholar equates to progressively giving life and meaning to a piece of writing with unique scholarly characteristics. This progression involves a transformation in awareness, thinking, and understanding, as well as advancement in students' level of critical appraisal skills. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. The design of multi-core DSP parallel model based on message passing and multi-level pipeline

    NASA Astrophysics Data System (ADS)

    Niu, Jingyu; Hu, Jian; He, Wenjing; Meng, Fanrong; Li, Chuanrong

    2017-10-01

    Currently, the design of embedded signal processing systems is often based on a specific application, but this approach is not conducive to the rapid development of signal processing technology. In this paper, a parallel processing model architecture based on a multi-core DSP platform is designed, mainly suited to complex algorithms composed of different modules. This model combines the ideas of multi-level pipeline parallelism and message passing, and incorporates the advantages of the mainstream multi-core DSP models (the Master-Slave model and the Data Flow model), giving it better performance. This paper uses a three-dimensional image generation algorithm to validate the efficiency of the proposed model by comparing it with the Master-Slave and Data Flow models.
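    The combination the abstract describes, message passing organized as a multi-level pipeline, can be sketched generically with worker threads connected by queues. This is an illustrative analogue in Python, not the authors' multi-core DSP implementation:

```python
import queue
import threading

def stage(fn, inbox, outbox):
    """One pipeline level: receive a message, process it, pass it on."""
    while True:
        item = inbox.get()
        if item is None:            # sentinel: shut this stage down
            outbox.put(None)
            return
        outbox.put(fn(item))

def run_pipeline(items, fns):
    """Chain the given functions into a multi-level pipeline over message queues."""
    queues = [queue.Queue() for _ in range(len(fns) + 1)]
    workers = [threading.Thread(target=stage, args=(fn, queues[i], queues[i + 1]))
               for i, fn in enumerate(fns)]
    for w in workers:
        w.start()
    for item in items:
        queues[0].put(item)
    queues[0].put(None)             # propagate shutdown through every level
    results = []
    while (out := queues[-1].get()) is not None:
        results.append(out)
    for w in workers:
        w.join()
    return results

# Two levels: scale then offset, applied to a stream of samples.
print(run_pipeline([1, 2, 3], [lambda x: x * 2, lambda x: x + 1]))  # → [3, 5, 7]
```

    Each queue plays the role of a message channel between cores; a Master-Slave variant would instead fan items out from one coordinator to identical workers.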

  16. Phospho-Tau Accumulation and Structural Alterations of the Golgi Apparatus of Cortical Pyramidal Neurons in the P301S Tauopathy Mouse Model

    PubMed Central

    Antón-Fernández, Alejandro; Merchán-Rubira, Jesús; Avila, Jesús; Hernández, Félix; DeFelipe, Javier; Muñoz, Alberto

    2017-01-01

    The Golgi apparatus (GA) is a highly dynamic organelle involved in the processing and sorting of cellular proteins. In Alzheimer’s disease (AD), it has been shown to decrease in size and become fragmented in neocortical and hippocampal neuronal subpopulations. This fragmentation and decrease in size of the GA in AD has been related to the accumulation of hyperphosphorylated tau. However, the involvement of other pathological factors associated with the course of the disease, such as the extracellular accumulation of amyloid-β (Aβ) aggregates, cannot be ruled out, since both pathologies are present in AD patients. Here we use the P301S tauopathy mouse model to examine possible alterations of the GA in neurons that overexpress human tau (P301S mutated gene) in neocortical and hippocampal neurons, using double immunofluorescence techniques and confocal microscopy. Quantitative analysis revealed that neurofibrillary tangle (NFT)-bearing neurons had important morphological alterations and reductions in the surface area and volume of the GA compared with NFT-free neurons. Since in this mouse model there are no Aβ aggregates typical of AD, the present findings support the idea that the progressive accumulation of phospho-tau is associated with structural alterations of the GA, and that these changes may occur in the absence of Aβ pathology. PMID:28922155

  17. Phospho-Tau Accumulation and Structural Alterations of the Golgi Apparatus of Cortical Pyramidal Neurons in the P301S Tauopathy Mouse Model.

    PubMed

    Antón-Fernández, Alejandro; Merchán-Rubira, Jesús; Avila, Jesús; Hernández, Félix; DeFelipe, Javier; Muñoz, Alberto

    2017-01-01

    The Golgi apparatus (GA) is a highly dynamic organelle involved in the processing and sorting of cellular proteins. In Alzheimer's disease (AD), it has been shown to decrease in size and become fragmented in neocortical and hippocampal neuronal subpopulations. This fragmentation and decrease in size of the GA in AD has been related to the accumulation of hyperphosphorylated tau. However, the involvement of other pathological factors associated with the course of the disease, such as the extracellular accumulation of amyloid-β (Aβ) aggregates, cannot be ruled out, since both pathologies are present in AD patients. Here we use the P301S tauopathy mouse model to examine possible alterations of the GA in neurons that overexpress human tau (P301S mutated gene) in neocortical and hippocampal neurons, using double immunofluorescence techniques and confocal microscopy. Quantitative analysis revealed that neurofibrillary tangle (NFT)-bearing neurons had important morphological alterations and reductions in the surface area and volume of the GA compared with NFT-free neurons. Since in this mouse model there are no Aβ aggregates typical of AD, the present findings support the idea that the progressive accumulation of phospho-tau is associated with structural alterations of the GA, and that these changes may occur in the absence of Aβ pathology.

  18. The roadmap for estimation of cell-type-specific neuronal activity from non-invasive measurements

    PubMed Central

    Uhlirova, Hana; Kılıç, Kıvılcım; Tian, Peifang; Sakadžić, Sava; Thunemann, Martin; Desjardins, Michèle; Saisan, Payam A.; Nizar, Krystal; Yaseen, Mohammad A.; Hagler, Donald J.; Vandenberghe, Matthieu; Djurovic, Srdjan; Andreassen, Ole A.; Silva, Gabriel A.; Masliah, Eliezer; Vinogradov, Sergei; Buxton, Richard B.; Einevoll, Gaute T.; Boas, David A.; Dale, Anders M.; Devor, Anna

    2016-01-01

    The computational properties of the human brain arise from an intricate interplay between billions of neurons connected in complex networks. However, our ability to study these networks in healthy human brain is limited by the necessity to use non-invasive technologies. This is in contrast to animal models where a rich, detailed view of cellular-level brain function with cell-type-specific molecular identity has become available due to recent advances in microscopic optical imaging and genetics. Thus, a central challenge facing neuroscience today is leveraging these mechanistic insights from animal studies to accurately draw physiological inferences from non-invasive signals in humans. On the essential path towards this goal is the development of a detailed ‘bottom-up’ forward model bridging neuronal activity at the level of cell-type-specific populations to non-invasive imaging signals. The general idea is that specific neuronal cell types have identifiable signatures in the way they drive changes in cerebral blood flow, cerebral metabolic rate of O2 (measurable with quantitative functional Magnetic Resonance Imaging), and electrical currents/potentials (measurable with magneto/electroencephalography). This forward model would then provide the ‘ground truth’ for the development of new tools for tackling the inverse problem—estimation of neuronal activity from multimodal non-invasive imaging data. This article is part of the themed issue ‘Interpreting BOLD: a dialogue between cognitive and cellular neuroscience’. PMID:27574309

  19. Evaluating the Sustainability of School-Based Health Centers.

    PubMed

    Navarro, Stephanie; Zirkle, Dorothy L; Barr, Donald A

    2017-01-01

    The United States is facing a surge in the number of school-based health centers (SBHCs) owing to their success in delivering positive health outcomes and increasing access to care. To preserve this success, experts have developed frameworks for creating sustainable SBHCs; however, little research has affirmed or added to these models. This research seeks to analyze elements of sustainability in a case study of three SBHCs in San Diego, California, with the purpose of creating a research-based framework of SBHC sustainability to supplement expertly derived models. Using a mixed methods study design, data were collected from interviews with SBHC stakeholders, observations in SBHCs, and SBHC budgets. A grounded theory qualitative analysis and a quantitative budget analysis were completed to develop a theoretical framework for the sustainability of SBHCs. Forty-one interviews were conducted, 6 hours of observations were completed, and 3 years of SBHC budgets were analyzed to identify care coordination, community buy-in, community awareness, and SBHC partner cooperation as key themes of sustainability promoting patient retention for sustainable billing and reimbursement levels. These findings highlight the unique ways in which SBHCs gain community buy-in and awareness by becoming trusted sources of comprehensive and coordinated care within communities and among vulnerable populations. Findings also support ideas from expert models of SBHC sustainability calling for well-defined and executed community partnerships and quality coordinated care in the procurement of sustainable SBHC funding.

  20. Adaptation Measures Evaluation on Agriculture Under Future Climate and Land Use Scenarios in Central Chile

    NASA Astrophysics Data System (ADS)

    Henriquez Dole, L. E.; Vicuna, S.; Gironas, J. A.; Meza, F. J.

    2016-12-01

    Future climate change scenarios threaten current agricultural practices, and adaptation measures have therefore been proposed to overcome this possible situation. Regional to local ideas for all kinds of adaptation measures can be found in the literature for Central Chile, but their quantitative efficiency is rarely evaluated. Furthermore, land use changes are commonly neglected in such evaluations. This research uses the Water Evaluation and Planning (WEAP) model and the Plant Growth Model (PGM) to simulate weekly water distribution and consumption in Chile's rural areas up to 2050. Using information provided directly by the Water User Organizations (WUO), the developed model assesses possible future impacts on two crops (corn and plum) under 15 climate scenarios and land use trends. Results show that the WEAP-PGM tool can satisfactorily represent crop sensitivity to historic and future conditions. Nine scenarios satisfy average crop water demands, but all of them show diminished yield (1%-14%) and production (8%-20%). Six scenarios cannot meet crop water demands (40-70% reliability) if adaptation measures are not applied. Given this need, two adaptation measures were evaluated: a) using all water rights and b) irrigation improvements. The second option proved the most effective measure, satisfying crop water demands under all scenarios, although yield and production remained diminished.

  1. Networks as Renormalized Models for Emergent Behavior in Physical Systems

    NASA Astrophysics Data System (ADS)

    Paczuski, Maya

    2005-09-01

    Networks are paradigms for describing complex biological, social and technological systems. Here I argue that networks provide a coherent framework to construct coarse-grained models for many different physical systems. To elucidate these ideas, I discuss two long-standing problems. The first concerns the structure and dynamics of magnetic fields in the solar corona, as exemplified by sunspots that startled Galileo almost 400 years ago. We discovered that the magnetic structure of the corona embodies a scale-free network, with spots at all scales. A network model representing the three-dimensional geometry of magnetic fields, where links rewire and nodes merge when they collide in space, gives quantitative agreement with available data, and suggests new measurements. Seismicity is addressed in terms of relations between events without imposing space-time windows. A metric estimates the correlation between any two earthquakes. Linking strongly correlated pairs, and ignoring pairs with weak correlation, organizes the spatio-temporal process into a sparse, directed, weighted network. New scaling laws for seismicity are found. For instance, the aftershock decay rate decreases as ~1/t in time up to a correlation time, t_Omori. An estimate from the data gives t_Omori to be about one year for small magnitude 3 earthquakes, about 1400 years for the Landers event, and roughly 26,000 years for the earthquake causing the 2004 Asian tsunami. Our results confirm Kagan's conjecture that aftershocks can rumble on for centuries.
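    The seismicity-network construction, linking strongly correlated event pairs and discarding weak ones, can be sketched as follows; the correlation metric and threshold here are placeholders, not the paper's actual metric:

```python
def build_event_network(events, metric, threshold):
    # events: time-ordered event records; metric(a, b) estimates the
    # correlation between a pair of earthquakes (a placeholder for the
    # paper's metric). Weakly correlated pairs are ignored, yielding a
    # sparse, directed, weighted network from earlier to later events.
    edges = {}
    for i, earlier in enumerate(events):
        for later in events[i + 1:]:
            c = metric(earlier, later)
            if c >= threshold:
                edges[(earlier, later)] = c
    return edges
```

    Any pairwise correlation estimate can be plugged in; the threshold controls how sparse the resulting directed network is.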

  2. Network model of top-down influences on local gain and contextual interactions in visual cortex.

    PubMed

    Piëch, Valentin; Li, Wu; Reeke, George N; Gilbert, Charles D

    2013-10-22

    The visual system uses continuity as a cue for grouping oriented line segments that define object boundaries in complex visual scenes. Many studies support the idea that long-range intrinsic horizontal connections in early visual cortex contribute to this grouping. Top-down influences in primary visual cortex (V1) play an important role in the processes of contour integration and perceptual saliency, with contour-related responses being task dependent. This suggests an interaction between recurrent inputs to V1 and intrinsic connections within V1 that enables V1 neurons to respond differently under different conditions. We created a network model that simulates parametrically the control of local gain by hypothetical top-down modification of local recurrence. These local gain changes, as a consequence of network dynamics in our model, enable modulation of contextual interactions in a task-dependent manner. Our model displays contour-related facilitation of neuronal responses and differential foreground vs. background responses over the neuronal ensemble, accounting for the perceptual pop-out of salient contours. It quantitatively reproduces the results of single-unit recording experiments in V1, highlighting salient contours and replicating the time course of contextual influences. We show by means of phase-plane analysis that the model operates stably even in the presence of large inputs. Our model shows how a simple form of top-down modulation of the effective connectivity of intrinsic cortical connections among biophysically realistic neurons can account for some of the response changes seen in perceptual learning and task switching.
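    A minimal rate-network sketch of the core idea, top-down gain scaling local recurrence, is given below; the dynamics, parameters, and rectified-linear transfer function are illustrative assumptions, not the authors' biophysical model:

```python
def rate_dynamics(w, inp, gain, steps=200, dt=0.1):
    # Leaky rectified-rate network: r' = -r + relu(gain * W r + input).
    # `gain` scales the recurrent drive, a hypothetical stand-in for
    # top-down modulation of the effective intrinsic connectivity.
    n = len(inp)
    r = [0.0] * n
    for _ in range(steps):
        drive = [gain * sum(w[i][j] * r[j] for j in range(n)) + inp[i]
                 for i in range(n)]
        r = [r[i] + dt * (-r[i] + max(0.0, drive[i])) for i in range(n)]
    return r
```

    With the recurrent gain below the stability limit the rates settle to a fixed point; raising the gain amplifies responses to the same feedforward input, the qualitative effect the model attributes to top-down control.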

  3. Hidden Markov models for evolution and comparative genomics analysis.

    PubMed

    Bykova, Nadezda A; Favorov, Alexander V; Mironov, Andrey A

    2013-01-01

    The problem of reconstruction of ancestral states given a phylogeny and data from extant species arises in a wide range of biological studies. The continuous-time Markov model for the evolution of discrete states is generally used for the reconstruction of ancestral states. We modify this model to account for the case when the states of the extant species are uncertain. This situation appears, for example, if the states for extant species are predicted by some program and are thus known only with some level of reliability; this is common in the bioinformatics field. The main idea is the formulation of the problem as a hidden Markov model on a tree (tree HMM, tHMM), where the basic continuous-time Markov model is expanded with the introduction of emission probabilities of observed data (e.g. prediction scores) for each underlying discrete state. Our tHMM decoding algorithm allows us to predict states at the ancestral nodes as well as to refine states at the leaves on the basis of quantitative comparative genomics. The test on simulated data shows that the tHMM approach applied to the continuous variable reflecting the probabilities of the states (i.e. the prediction score) is more accurate than reconstruction from the discrete state assignment defined by the best score threshold. We provide examples of applying our model to the evolutionary analysis of N-terminal signal peptides and transcription factor binding sites in bacteria. The program is freely available at http://bioinf.fbb.msu.ru/~nadya/tHMM and via web-service at http://bioinf.fbb.msu.ru/treehmmweb.
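    The up-pass of such a tree HMM can be sketched as a pruning recursion in which leaves contribute emission probabilities rather than fixed states; branch-specific transition matrices are collapsed into a single matrix here for brevity:

```python
def uppass(node, trans, emit):
    # node: a leaf name, or a (left, right) tuple for an internal node.
    # emit[leaf][s] = P(observed score | hidden state s): the emission
    # probabilities that replace hard state assignments at the leaves.
    # trans[i][j] = P(child state j | parent state i); branch lengths
    # are folded into this single matrix for brevity.
    if not isinstance(node, tuple):
        return emit[node]
    left = uppass(node[0], trans, emit)
    right = uppass(node[1], trans, emit)
    n = len(trans)
    # Pruning step: conditional likelihood of the subtree below this
    # node, for each possible state i at the node.
    return [sum(trans[i][j] * left[j] for j in range(n)) *
            sum(trans[i][j] * right[j] for j in range(n))
            for i in range(n)]
```

    With emission vectors concentrated on a single state this reduces to the classical discrete reconstruction; soft prediction scores simply enter as spread-out emission vectors.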

  4. Self-evaluation System for Low carbon Industrial Park--A Case Study of TEDA Industrial Park in Tianjin

    NASA Astrophysics Data System (ADS)

    Wenyan, W.; Fanghua, H.; Ying, C.; Ouyang, W.; Yuan, Q.

    2013-12-01

    Massive fossil fuel burning caused by industrialization is one major driver of global climate change. After the Copenhagen climate summit, studies of the low-carbon city gained attention in many countries. On 25 November 2009, the State Council executive meeting announced that by 2020 China will reduce carbon dioxide emissions per unit of GDP by 40% to 45% compared with the 2005 level. Industrial parks, as an important part of cities, have developed rapidly in recent years and have become a key element and an alternative mechanism for achieving emission reduction targets. Thus, establishing a low-carbon development model for industrial parks is one of the most effective ways to build sustainable low-carbon cities. By adopting a self-evaluation system for low-carbon industrial parks, this research aims to translate the low-carbon concept into industrial park practice. According to The Guide for Low Carbon Industrial Development Zones, the quantitative evaluation system is divided into 4 separate categories with 23 quantitative indicators. The 4 categories are: 1) energy and GHG management (weight 60%), 2) circular economy and environmental protection (weight 15%), 3) administration and incentive mechanisms of industrial parks (weight 15%), and 4) planning and urban forms (weight 10%). By going through the necessary stages and driving continuous improvement, low-carbon development goals can be achieved. The Tianjin TEDA industrial park is selected as a case study to assess its low-carbon development condition. Tianjin TEDA Industrial Park is already an ecological demonstration industrial park in China, with good foundations in environmental protection, resource recycling, etc. Based on the self-evaluation system, indicators such as energy-use efficiency and the degree of land-intensive utilization are also analyzed and assessed.
Through field survey and data collection, in accordance with the quantitative self-evaluation system, the author scored and calculated the various indicators and identified the key points and bottleneck issues of low-carbon industrial park development. Combined with the actual situation of the existing park, the self-assessment system, as an aid tool, can help the park carry out low-carbon practice and propose scientific recommendations. By analyzing the case of a comprehensive industrial park, the author attempts to explain the implementation of low-carbon ideas in practice and to contribute to an increased understanding of the factors critical to low-carbon development in industrial parks. Although the main focus of the paper is TEDA, the results are relevant to other industrial parks that attempt to further develop their low-carbon ideas.
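The weighted aggregation of the four indicator categories can be sketched directly from the weights quoted above; how the 23 indicators roll up into a per-category score is a simplifying assumption here:

```python
WEIGHTS = {
    "energy_ghg": 0.60,        # energy and GHG management
    "circular_economy": 0.15,  # circular economy and environmental protection
    "administration": 0.15,    # administration and incentive mechanisms
    "planning": 0.10,          # planning and urban forms
}

def park_score(category_scores):
    # Weighted sum of per-category scores (a 0-100 scale is assumed).
    return sum(w * category_scores[k] for k, w in WEIGHTS.items())
```

A park scoring 100 in every category scores 100 overall; a weak energy/GHG category drags the total down hardest because of its 60% weight.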

  5. Report for the Office of Scientific and Technical Information: Population Modeling of the Emergence and Development of Scientific Fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bettencourt, L. M. A.; Castillo-Chavez, C.; Kaiser, D.

    2006-10-04

    The accelerated development of digital libraries and archives, in tandem with efficient search engines and the computational ability to retrieve and parse massive amounts of information, is making it possible to quantify the time evolution of scientific literatures. These data are but one piece of the tangible recorded evidence of the processes whereby scientists create and exchange information in their journeys towards the generation of knowledge. As such, these tools provide a proxy with which to study our ability to innovate. Innovation has often been linked with prosperity and growth and, consequently, trying to understand what drives scientific innovation is of extreme interest. Identifying sets of population characteristics, factors, and mechanisms that enable scientific communities to remain at the cutting edge, accelerate their growth, or increase their ability to re-organize around new themes or research topics is therefore of special significance. Yet generating a quantitative understanding of the factors that make scientific fields arise and/or become more or less productive is still in its infancy. This is precisely the type of knowledge most needed for promoting and sustaining innovation. Ideally, the efficient and strategic allocation of resources on the part of funding agencies and corporations would be driven primarily by knowledge of this type. Early steps have been taken toward such a quantitative understanding of scientific innovation. Some have focused on characterizing the broad properties of relevant time series, such as numbers of publications and authors in a given field. Others have focused on the structure and evolution of networks of coauthorship and citation. Together these types of studies provide much needed statistical analyses of the structure and evolution of scientific communities. Despite these efforts, however, crucial elements of prediction have remained elusive.
Building on many of these earlier insights, we provide here a coarse-grained approach to modeling the time-evolution of scientific fields mathematically, through adaptive models of contagion. That is, our models are inspired by epidemic contact processes, but take into account the social interactions and processes whereby scientific ideas spread - social interactions gleaned from close empirical study of historical cases. Variations in model parameters can increase or hamper the speed at which a field develops. In this way, models for the spread of 'infectious' ideas can be used to identify pressure points in the process of innovation that may allow for the evaluation of possible interventions by those responsible for promoting innovation, such as funding agencies. This report is organized as follows: Section 2 introduces and discusses the population model used here to describe the dynamics behind the establishment of scientific fields. The approach is based on a succinct (coarse) description of contact processes between scientists, and is a simplified version of a general class of models developed in the course of this work. We selected this model based primarily on its ability to treat a wide range of data patterns efficiently, across several different scientific fields. We also describe our methods for estimating parameter values, our optimization techniques used to match the model to data, and our method of generating error estimates. Section 3 presents brief accounts of six case studies of scientific evolution, measured by the growth in number of active authors over time, and shows the results of fitting our model to these data, including extrapolations to the near future. Section 4 discusses these results and provides some perspectives on the values and limitations of the models used. We also discuss topics for further research which should improve our ability to predict (and perhaps influence) the course of future scientific research. 
Section 5 provides more detail on the broad class of epidemic models developed as part of this project.
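A toy version of the contagion picture, susceptible scientists adopting a field's ideas through contacts with active authors, can be sketched as a forward-Euler SI process; the functional form and parameters are illustrative, not the report's fitted models:

```python
def field_growth(active0, population, beta, steps, dt=0.01):
    # Forward-Euler integration of a susceptible-infected contact process:
    # d(active)/dt = beta * susceptible * active / population.
    # A toy stand-in for the report's richer adaptive contagion models.
    active = active0
    series = [active]
    for _ in range(steps):
        susceptible = population - active
        active += dt * beta * susceptible * active / population
        series.append(active)
    return series
```

The curve grows slowly while few authors are active, accelerates as contacts multiply, and saturates as the susceptible pool empties, the S-shaped growth typical of field-establishment data.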

  6. Materials-by-design: computation, synthesis, and characterization from atoms to structures

    NASA Astrophysics Data System (ADS)

    Yeo, Jingjie; Jung, Gang Seob; Martín-Martínez, Francisco J.; Ling, Shengjie; Gu, Grace X.; Qin, Zhao; Buehler, Markus J.

    2018-05-01

    In the 50 years that followed Richard Feynman’s exposition of the idea that there is ‘plenty of room at the bottom’ for manipulating individual atoms in the synthesis and manufacturing of materials, the materials-by-design paradigm has been developed gradually through synergistic integration of experimental material synthesis and characterization with predictive computational modeling and optimization. This paper reviews how this paradigm creates the possibility of developing materials according to specific, rational designs from the molecular to the macroscopic scale. We discuss promising techniques in experimental small-scale material synthesis and large-scale fabrication methods to manipulate atomistic or macroscale structures, which can be designed by computational modeling. These include recombinant protein technology to produce peptides and proteins with tailored sequences encoded by recombinant DNA, self-assembly processes induced by conformational transition of proteins, additive manufacturing for designing complex structures, and qualitative and quantitative characterization of materials at different length scales. We describe important material characterization techniques using numerous methods of spectroscopy and microscopy. We detail numerous multi-scale computational modeling techniques that complement these experimental techniques: DFT at the atomistic scale; fully atomistic and coarse-grained molecular dynamics at the molecular to mesoscale; continuum modeling at the macroscale. Additionally, we present case studies that utilize experimental and computational approaches in an integrated manner to broaden our understanding of the properties of two-dimensional materials and materials based on silk and silk-elastin-like proteins.

  7. A mixed-methods study identifying key intervention targets to improve participation in daily living activities in primary Sjögren's syndrome patients.

    PubMed

    Hackett, Katie L; Deane, Katherine H O; Newton, Julia L; Deary, Vincent; Bowman, Simon; Rapley, Tim; Ng, Wan-Fai

    2018-02-06

    Functional ability and participation in life situations are compromised in many primary Sjögren's syndrome (PSS) patients. This study aims to identify the key barriers and priorities to participation in daily living activities, in order to develop potential future interventions. Group concept mapping (GCM), a semi-quantitative mixed-methods approach, was used to identify and structure ideas from UK PSS patients, adults living with a PSS patient (AHMs) and health care professionals (HCPs). Brainstorming generated ideas, which were summarised into a final set of statements. Participants individually arranged these statements into themes and rated each statement for importance. Multidimensional scaling and hierarchical cluster analysis were applied to the sorted and rated data to produce visual representations of the ideas (concept maps), enabling identification of agreed priority areas for interventions. 121 patients, 43 AHMs and 67 HCPs took part. 463 ideas were distilled into 94 statements. These statements were grouped into seven clusters: 'Patient empowerment', 'Symptoms', 'Wellbeing', 'Access and coordination of healthcare', 'Knowledge and support', 'Public awareness and support' and 'Family and friends'. Patient empowerment and Symptoms were rated as priority conceptual themes. Important statements within the priority clusters indicate that patients should be taken seriously and supported to self-manage symptoms of oral and ocular dryness, fatigue, pain and poor sleep. Our data highlighted that, in addition to managing PSS symptoms, interventions aiming to improve patient empowerment, general wellbeing, access to healthcare, patient education and social support are important to facilitate improved participation in daily living activities. This article is protected by copyright. All rights reserved.
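    The raw input to the multidimensional scaling and clustering steps in GCM is a statement-by-statement co-occurrence matrix built from participants' sortings; a minimal sketch (with hypothetical statement ids) is:

```python
def cooccurrence(sortings, statements):
    # sortings: one list of groups (sets of statement ids) per participant.
    # Entry [a][b] counts the participants who placed statements a and b
    # in the same pile; MDS and hierarchical clustering run on this matrix.
    idx = {s: k for k, s in enumerate(statements)}
    n = len(statements)
    m = [[0] * n for _ in range(n)]
    for groups in sortings:
        for g in groups:
            for a in g:
                for b in g:
                    m[idx[a]][idx[b]] += 1
    return m
```

    The diagonal simply counts participants, and off-diagonal entries near that count indicate statements the group consistently sorted together, which is what the cluster analysis formalizes.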

  8. Emerging areas of science: Recommendations for Nursing Science Education from the Council for the Advancement of Nursing Science Idea Festival.

    PubMed

    Henly, Susan J; McCarthy, Donna O; Wyman, Jean F; Heitkemper, Margaret M; Redeker, Nancy S; Titler, Marita G; McCarthy, Ann Marie; Stone, Patricia W; Moore, Shirley M; Alt-White, Anna C; Conley, Yvette P; Dunbar-Jacob, Jacqueline

    2015-01-01

    The Council for the Advancement of Nursing Science aims to "facilitate and recognize life-long nursing science career development" as an important part of its mission. In light of fast-paced advances in science and technology that are inspiring new questions and methods of investigation in the health sciences, the Council for the Advancement of Nursing Science convened the Idea Festival for Nursing Science Education and appointed the Idea Festival Advisory Committee (IFAC) to stimulate dialogue about linking PhD education with a renewed vision for preparation of the next generation of nursing scientists. Building on the 2005 National Research Council report Advancing The Nation's Health Needs and the 2010 American Association of Colleges of Nursing Position Statement on the Research-Focused Doctorate Pathways to Excellence, the IFAC specifically addressed the capacity of PhD programs to prepare nursing scientists to conduct cutting-edge research in the following key emerging and priority areas of health sciences research: omics and the microbiome; health behavior, behavior change, and biobehavioral science; patient-reported outcomes; big data, e-science, and informatics; quantitative sciences; translation science; and health economics. The purpose of this article is to (a) describe IFAC activities, (b) summarize 2014 discussions hosted as part of the Idea Festival, and (c) present IFAC recommendations for incorporating these emerging areas of science and technology into research-focused doctoral programs committed to preparing graduates for lifelong, competitive careers in nursing science. The recommendations address clearer articulation of program focus areas; inclusion of foundational knowledge in emerging areas of science in core courses on nursing science and research methods; faculty composition; prerequisite student knowledge and skills; and in-depth, interdisciplinary training in supporting area of science content and methods. 
Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Fostering a Sense of Wonder in the Science Classroom

    NASA Astrophysics Data System (ADS)

    Hadzigeorgiou, Yannis Petros

    2012-10-01

    This paper reports on a study undertaken with the primary aim of investigating the role of wonder in the learning process. The study was carried out by a 9th grade science teacher in collaboration with a university professor. The teacher taught two classrooms, of 27 and 30 students respectively, trying to evoke a sense of wonder in only one of them. To this end the teacher identified ideas and phenomena as potential sources of wonder and initiated instruction through these ideas and phenomena. Observation and, especially, optional student journals were the main instruments of the research. A quantitative analysis of journal entries made by the students of both classrooms provided evidence of higher involvement, for both male and female students, in the classroom where the teacher evoked a sense of wonder. An analysis of students' comments also provided evidence that wonder, experienced as astonishment and a shock of awareness, can help students change their outlook on natural phenomena. Moreover, two paper-and-pencil tests administered at the end of the school year provided additional evidence that wonder had an effect on students' ability to remember "wonder-full" ideas and on better understanding of at least three phenomena. This empirical evidence of better retention and understanding points to the role of wonder as an attention catcher and, more generally, to the role of affective factors in the learning process.

  10. Residual Stress Analysis Based on Acoustic and Optical Methods.

    PubMed

    Yoshida, Sanichiro; Sasaki, Tomohiro; Usui, Masaru; Sakamoto, Shuichi; Gurney, David; Park, Ik-Keun

    2016-02-16

    Co-application of acoustoelasticity and optical interferometry to residual stress analysis is discussed. The underlying idea is to combine the advantages of both methods. Acoustoelasticity is capable of evaluating residual stress absolutely, but it is a single-point measurement. Optical interferometry is able to measure deformation yielding two-dimensional, full-field data, but it is not suitable for absolute evaluation of residual stresses. By theoretically relating the deformation data to residual stresses, and calibrating them with the absolute residual stress evaluated at a reference point, it is possible to measure residual stresses quantitatively, nondestructively and two-dimensionally. The feasibility of the idea has been tested with a butt-jointed dissimilar plate specimen. A steel plate 18.5 mm wide, 50 mm long and 3.37 mm thick is braze-jointed to a cemented carbide plate of the same dimensions along the 18.5 mm side. Acoustoelasticity evaluates the elastic modulus at reference points via acoustic velocity measurement. A tensile load is applied to the specimen at a constant pulling rate in a stress range substantially lower than the yield stress. Optical interferometry measures the resulting acceleration field. Based on the theory of harmonic oscillation, the acceleration field is correlated qualitatively with compressive and tensile residual stresses. The acoustic and optical results show reasonable agreement in the compressive and tensile residual stresses, indicating the feasibility of the idea.
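    The calibration step, scaling the relative optical field so it matches the absolute acoustoelastic value at the reference point, can be sketched as follows under a linear-response assumption (the function name and the flattened 1-D field are illustrative):

```python
def calibrate_stress_field(relative_field, ref_index, ref_stress):
    # relative_field: full-field (here flattened to 1-D) deformation-derived
    # values from optical interferometry, proportional to stress but not
    # absolute. ref_stress: absolute residual stress from acoustoelasticity
    # at one reference point. Assuming linear response, a single scale
    # factor converts the whole field to quantitative stress values.
    scale = ref_stress / relative_field[ref_index]
    return [scale * v for v in relative_field]
```

    One absolute single-point measurement thus anchors the entire two-dimensional map, which is the complementarity the paper exploits.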

  11. `Teaching What I Learned': Exploring students' Earth and Space Science learning experiences in secondary school with a particular focus on their comprehension of the concept of `geologic time'

    NASA Astrophysics Data System (ADS)

    Yoon, Sae Yeol; Peate, David W.

    2015-06-01

    According to the national survey of science education, science educators in the USA currently face many challenges, such as a lack of qualified secondary Earth and Space Science (ESS) teachers. Less qualified teachers may have difficulty teaching ESS because of a lack of conceptual understanding, which leads to diminished confidence in content knowledge. More importantly, teachers' limited conceptual understanding of the core ideas in turn leads to a lack of pedagogical content knowledge. This mixed-methods study aims to explore the ways in which current secondary schooling, especially the small number of highly qualified ESS teachers in the USA, might influence students' learning of the discipline. To gain a better understanding of the current conditions of ESS education in secondary schools, in the first phase we qualitatively examined a sample middle and high school ESS textbook to explore how the big ideas of ESS, particularly geological time, are represented. In the second phase, we quantitatively analyzed the participating college students' conceptual understanding of geological time by comparing those who reported secondary school ESS learning experience with those who did not. Additionally, college students' perceptions of learning and teaching ESS are discussed. Findings from both the qualitative and quantitative phases indicate that participating students' ESS learning experience in secondary school seemed to have had limited or little influence on their conceptual understanding of the discipline. We believe that these results reflect the current status of ESS education, connected with the declining number of highly qualified ESS teachers in secondary schools.

  12. Examining Pre-Service Teachers' Use of Atomic Models in Explaining Subsequent Ionisation Energy Values

    ERIC Educational Resources Information Center

    Wheeldon, Ruth

    2012-01-01

    Chemistry students' explanations of ionisation energy phenomena often involve a number of non-scientific or inappropriate ideas being used to form causality arguments. Research has attributed this to many science teachers using these ideas themselves (Tan and Taber, in "J Chem Educ" 86(5):623-629, 2009). This research extends this work by…

  13. The Cat Sat on the Mat: Changing Minds, Changing Ideas

    ERIC Educational Resources Information Center

    Statham, Mick

    2013-01-01

    In this second installment of a two-part article, the author draws from coaching and research into children's learning in science to present a model for building pupils' scientific ideas quickly and effectively. In the first article, the author outlined how pupils' reading improves when they are more engaged in finding scientific…

  14. Effectiveness of Inquiry-Based Lessons Using Particulate Level Models to Develop High School Students' Understanding of Conceptual Stoichiometry

    ERIC Educational Resources Information Center

    Kimberlin, Stephanie; Yezierski, Ellen

    2016-01-01

    Students' inaccurate ideas about what is represented by chemical equations and concepts underlying stoichiometry are well documented; however, there are few classroom-ready instructional solutions to help students build scientifically accurate ideas about these topics central to learning chemistry. An intervention (two inquiry-based activities)…

  15. Facilitating Conceptual Change through Modeling in the Middle School Science Classroom

    ERIC Educational Resources Information Center

    Carrejo, David J.; Reinhartz, Judy

    2014-01-01

    Engaging students in both hands-on and minds-on experiences is needed for education that is relevant and complete. Many middle school students enter science classrooms with pre-conceived ideas about their world. Some of these ideas are misconceptions that hinder students from developing accepted concepts in science, such as those related to…

  16. Self-Esteem, Creativity, and Music: Implications and Directions for Research.

    ERIC Educational Resources Information Center

    VanderArk, Sherman

    1989-01-01

    This paper seeks to give potentially pertinent information and ideas for the development of a model and of hypotheses that are relevant in terms of combining the areas of self-concept and creativity. Selected sources from the areas of psychology, education, and music education are presented as the basis for ideas and thoughts for further research.…

  17. Gather 'Round the Campfire: Engaging Students and Creating Storytellers

    ERIC Educational Resources Information Center

    Higgins, Carrie

    2008-01-01

    In this article, the author describes the development of a storytelling unit she introduced to her school. She got the idea for the storytelling unit from the National Storytelling Festival she had attended several years ago in Jonesboro, Tennessee. When she proposed her idea of a storytelling unit culminating in a festival modeled on the national…

  18. Investigation of Mental Models of Turkish Pre-Service Physics Students for the Concept of "Spin"

    ERIC Educational Resources Information Center

    Özcan, Özgür

    2013-01-01

    Problem Statement: Difficulties in the learning process usually emerge from the problem of mental representations constructed by students in their interactions with the world. This previous knowledge and these ideas are in contradiction with scientific facts, and are known as misconceptions or alternative ideas. Thus, an analysis of the mental…

  19. Naïve Scientists and Conflict Analysis: Learning through Case Studies

    ERIC Educational Resources Information Center

    Ayres, R. Williams

    2016-01-01

    Much of our teaching about conflict relies on theoretical ideas and models that are delivered as finished products. This article explores the supposition that what students need is not already-formed theoretical ideas, but exposure to more real-world cases of conflict from which to build theory. The article presents an experiment in pedagogy:…

  20. Environmental Management Competitive Pressure Effect on SME Environmental Innovation Activities: A Green Supply Chain Perspective

    NASA Astrophysics Data System (ADS)

    Rashid, A. A.; Sidek, A. A.; Suffian, S. A.; Daud, M. R. C.

    2018-01-01

    The idea of assimilating a green supply chain is to integrate and establish environmental management within supply chain practices. The study aims to explore how environmental management competitive pressure influences SME companies in Malaysia to incorporate green supply chain integration, an efficient platform for developing environmental innovation. This study further advances green supply chain management research in Malaysia by quantitatively analyzing the developed model, with data collected from a sample of Malaysian SMEs in the manufacturing sector. The model developed in this study illustrates how environmental management competitive pressure from main competitors affects three fundamental dimensions of green supply chain integration. The research findings suggest that environmental management competitive pressure is a vital driving force for an SME to incorporate internal and external collaboration in developing green product innovation. From the analysis conducted, the study strongly demonstrated that the best way for a company to counteract a competitor's environmental management success is first to implement a strong internal green product development process and then to incorporate external environmental management innovation with its suppliers and customers. The findings also show that internal integration of green product innovation fully mediates the relationship between environmental management competitive pressure and the external integration of green product innovation.

  1. Data driven approaches vs. qualitative approaches in climate change impact and vulnerability assessment.

    NASA Astrophysics Data System (ADS)

    Zebisch, Marc; Schneiderbauer, Stefan; Petitta, Marcello

    2015-04-01

    In the last decade the scope of climate change science has broadened significantly. Fifteen years ago the focus was mainly on understanding climate change, providing climate change scenarios and giving ideas about potential climate change impacts. Today, adaptation to climate change has become an increasingly important field of politics, and one role of science is to inform and consult this process. Climate change science is therefore no longer focused solely on data-driven approaches (such as climate or climate impact models) but progressively applies and relies on qualitative approaches, including opinion and expertise acquired through interactive processes with local stakeholders and decision-makers. Furthermore, climate change science faces the challenge of normative questions, such as 'how important is a decrease of yield in a developed country where agriculture represents only 3% of GDP and the supply of agricultural products is strongly linked to global markets and depends less on local production?'. In this talk we will present examples from various applied research and consultancy projects on climate change vulnerabilities, spanning data-driven methods (e.g. remote sensing and modelling) to semi-quantitative and qualitative assessment approaches. Furthermore, we will discuss bottlenecks, pitfalls and opportunities in transferring climate change science to policy- and decision-maker-oriented climate services.

  2. Analysis of Science and Technology Trend Based on Word Usage in Digitized Books

    NASA Astrophysics Data System (ADS)

    Yun, Jinhyuk; Kim, Pan-Jun; Jeong, Hawoong

    2013-03-01

    Throughout mankind's history, forecasting and predicting the future has been a long-lasting interest of our society. Many fortune-tellers have tried to forecast the future with "divine" items, and sci-fi writers have imagined what the future would look like; however, most of these attempts have been illogical and unscientific. Meanwhile, scientists have also attempted to discover future trends in science, and many researchers have used quantitative models to study how new ideas are used and spread. Beyond these modeling works, in the early 21st century the rise of data science has provided another prospect for forecasting the future. However, many studies have focused on a very limited period or age, due to the limitations of their datasets, so many questions remain unanswered. Fortunately, Google released a new dataset, the "Google N-Gram Dataset." This dataset provides us with 5 million books' worth of literature dating from 1520 to 2008, nearly 4% of publications ever printed. With this new time-varying dataset, we studied the spread and development of technologies by searching "Science and Technology"-related words from 1800 to 2000. Through statistical analysis, some general scaling laws were discovered, and we determined the factors that strongly affect the lifecycle of a word.

  3. Modeling and replicating statistical topology and evidence for CMB nonhomogeneity

    PubMed Central

    Agami, Sarit

    2017-01-01

    Under the banner of “big data,” the detection and classification of structure in extremely large, high-dimensional, data sets are two of the central statistical challenges of our times. Among the most intriguing new approaches to this challenge is “TDA,” or “topological data analysis,” one of the primary aims of which is providing nonmetric, but topologically informative, preanalyses of data which make later, more quantitative, analyses feasible. While TDA rests on strong mathematical foundations from topology, in applications, it has faced challenges due to difficulties in handling issues of statistical reliability and robustness, often leading to an inability to make scientific claims with verifiable levels of statistical confidence. We propose a methodology for the parametric representation, estimation, and replication of persistence diagrams, the main diagnostic tool of TDA. The power of the methodology lies in the fact that even if only one persistence diagram is available for analysis—the typical case for big data applications—the replications permit conventional statistical hypothesis testing. The methodology is conceptually simple and computationally practical, and provides a broadly effective statistical framework for persistence diagram TDA analysis. We demonstrate the basic ideas on a toy example, and the power of the parametric approach to TDA modeling in an analysis of cosmic microwave background (CMB) nonhomogeneity. PMID:29078301

  4. KINETIC ENERGY AND MASS DISTRIBUTIONS FOR NUCLEAR FISSION AT MODERATE EXCITATION ENERGY (thesis)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burnett, D.S.

    1963-10-01

    Fission fragment kinetic energy measurements using semiconductor detectors were made for the alpha-induced fission of Au-197, Bi-209, Th-232, and U-238 at alpha energies of 21 to 65 MeV. The data were recorded as the number of events at fragment energies E1 and E2, N(E1, E2). The data were then transformed into mass-total kinetic energy maps and analyzed by means of moments. The Bi and Au data are in good agreement with quantitative theoretical predictions from the liquid drop model available for the lighter elements. The U and Th data are discussed in terms of qualitative ideas that have been proposed to explain the properties of the fission process for the heavier elements. The changes in the U and Th mass and total kinetic energy distributions with excitation energy are emphasized. Pulse-height energy relations for the detectors used were obtained by a detailed comparison of detector and time-of-flight results for the spontaneous fission of Cf-252. 54 references. (auth)

  5. How to detect fluctuating stripes in the high-temperature superconductors

    NASA Astrophysics Data System (ADS)

    Kivelson, S. A.; Bindloss, I. P.; Fradkin, E.; Oganesyan, V.; Tranquada, J. M.; Kapitulnik, A.; Howald, C.

    2003-10-01

    This article discusses fluctuating order in a quantum disordered phase proximate to a quantum critical point, with particular emphasis on fluctuating stripe order. Optimal strategies are derived for extracting information concerning such local order from experiments, with emphasis on neutron scattering and scanning tunneling microscopy. These ideas are tested by application to two model systems—an exactly solvable one-dimensional (1D) electron gas with an impurity, and a weakly interacting 2D electron gas. Experiments on the cuprate high-temperature superconductors which can be analyzed using these strategies are extensively reviewed. The authors adduce evidence that stripe correlations are widespread in the cuprates. They compare and contrast the advantages of two limiting perspectives on the high-temperature superconductor: weak coupling, in which correlation effects are treated as a perturbation on an underlying metallic (although renormalized) Fermi-liquid state, and strong coupling, in which the magnetism is associated with well-defined localized spins, and stripes are viewed as a form of micro phase separation. The authors present quantitative indicators that the latter view better accounts for the observed stripe phenomena in the cuprates.

  6. Origin of the correlations between exit times in pedestrian flows through a bottleneck

    NASA Astrophysics Data System (ADS)

    Nicolas, Alexandre; Touloupas, Ioannis

    2018-01-01

    Robust statistical features have emerged from the microscopic analysis of dense pedestrian flows through a bottleneck, notably with respect to the time gaps between successive passages. We pinpoint the mechanisms at the origin of these features thanks to simple models that we develop and analyse quantitatively. We disprove the idea that anticorrelations between successive time gaps (i.e. an alternation between shorter ones and longer ones) are a hallmark of a zipper-like intercalation of pedestrian lines and show that they simply result from the possibility that pedestrians from distinct ‘lines’ or directions cross the bottleneck within a short time interval. A second feature concerns the bursts of escapes, i.e. egresses that come in fast succession. Despite the ubiquity of exponential distributions of burst sizes, entailed by a Poisson process, we argue that anomalous (power-law) statistics arise if the bottleneck is nearly congested, albeit only in a tiny portion of parameter space. The generality of the proposed mechanisms implies that similar statistical features should also be observed for other types of particulate flows.
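    The disproof described above, that gap anticorrelations arise merely from two 'lines' merging at the bottleneck rather than from zipper-like intercalation, can be illustrated with a deterministic toy model. The headways and offset below are invented for illustration and are not taken from the paper:

    ```python
    # Two "lines" of pedestrians, each with a regular 1.0 s headway,
    # offset by 0.2 s. Merging them makes successive exit-time gaps
    # alternate between short and long, i.e. anticorrelate.
    line_a = [i * 1.0 for i in range(10)]
    line_b = [i * 1.0 + 0.2 for i in range(10)]
    exits = sorted(line_a + line_b)
    gaps = [b - a for a, b in zip(exits, exits[1:])]

    def pearson(x, y):
        """Plain Pearson correlation coefficient."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
        vx = sum((a - mx) ** 2 for a in x) / n
        vy = sum((b - my) ** 2 for b in y) / n
        return cov / (vx * vy) ** 0.5

    # Correlation between each gap and the next is strongly negative,
    # even though neither line intercalates with the other.
    print(pearson(gaps[:-1], gaps[1:]))
    ```

    Note that no coordination between the two streams is assumed; the anticorrelation is purely a consequence of short and long gaps alternating in the merged sequence.
    
    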

  7. Controllable rotational inversion in nanostructures with dual chirality.

    PubMed

    Dai, Lu; Zhu, Ka-Di; Shen, Wenzhong; Huang, Xiaojiang; Zhang, Li; Goriely, Alain

    2018-04-05

    Chiral structures play an important role in natural sciences due to their great variety and potential applications. A perversion connecting two helices with opposite chirality creates a dual-chirality helical structure. In this paper, we develop a novel model to explore quantitatively the mechanical behavior of normal, binormal and transversely isotropic helical structures with dual chirality and apply these ideas to known nanostructures. It is found that both direction and amplitude of rotation can be finely controlled by designing the cross-sectional shape. A peculiar rotational inversion of overwinding followed by unwinding, observed in some gourd and cucumber tendril perversions, not only exists in transversely isotropic dual-chirality helical nanobelts, but also in the binormal/normal ones when the cross-sectional aspect ratio is close to 1. Beyond this rotational inversion region, the binormal and normal dual-chirality helical nanobelts exhibit a fixed directional rotation of unwinding and overwinding, respectively. Moreover, in the binormal case, the rotation of these helical nanobelts is nearly linear, which is promising as a possible design for linear-to-rotary motion converters. The present work suggests new designs for nanoscale devices.

  8. Hopping Conduction and Bacteria: Transport Properties of Disordered Reaction-Diffusion Systems

    NASA Astrophysics Data System (ADS)

    Missel, Andrew; Dahmen, Karin

    2008-03-01

    Reaction-diffusion (RD) systems are used to model everything from the formation of animal coat patterns to the spread of genes in a population to the seasonal variation of plankton density in the ocean. In all of these problems, disorder plays a large role, but determining its effects on transport properties in RD systems has been a challenge. We present here both analytical and numerical studies of a particular disordered RD system consisting of particles which are allowed to diffuse and compete for resources (2A->A) with spatially homogeneous rates, reproduce (A->2A) in certain areas (``oases''), and die (A->0) everywhere else (the ``desert''). In the low oasis density regime, transport is mediated through rare ``hopping events'' in which a small number of particles diffuse through the desert from one oasis to another; the situation is mathematically analogous to hopping conduction in doped semiconductors, and this analogy, along with some ideas from first passage percolation theory, allows us to make some quantitative predictions about the transport properties of the system on a large scale.

  9. Comprehensive analysis and evaluation of big data for main transformer equipment based on PCA and Apriori

    NASA Astrophysics Data System (ADS)

    Guo, Lijuan; Yan, Haijun; Hao, Yongqi; Chen, Yun

    2018-01-01

    With the power supply level of urban power grids developing toward high reliability, it is necessary to adopt appropriate methods for the comprehensive evaluation of existing equipment. Considering the wide, multi-dimensional power system data, big data mining is used to explore the potential laws and value of power system equipment. Based on the monitoring data of main transformers and the records of defects and faults, this paper integrates data on the power grid equipment environment. Apriori is used as an association identification algorithm to extract the frequent correlation factors of the main transformer, and the potential dependencies in the big data are analyzed via support and confidence. Then the integrated data are analyzed by PCA, and an integrated quantitative scoring model is constructed. The evaluation algorithm and scheme are shown to be effective by validation on a test set. This paper provides a new idea for data fusion in smart grids, and a reference for the further evaluation of big data on power grid equipment.
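    The support and confidence measures behind the Apriori step can be sketched as follows. The "defect records" and tag names below are hypothetical, not taken from the paper; each record is treated as a transaction of co-occurring condition and fault tags:

    ```python
    # Toy transaction-style defect records for a main transformer.
    records = [
        {"high_load", "oil_temp_high", "winding_fault"},
        {"high_load", "oil_temp_high", "winding_fault"},
        {"high_load", "humid"},
        {"oil_temp_high", "winding_fault"},
    ]

    def support(itemset):
        """Fraction of records containing every item in the itemset."""
        return sum(itemset <= r for r in records) / len(records)

    def confidence(antecedent, consequent):
        """P(consequent | antecedent) estimated from the records."""
        return support(antecedent | consequent) / support(antecedent)

    rule = (frozenset({"oil_temp_high"}), frozenset({"winding_fault"}))
    print(support(rule[0] | rule[1]), confidence(*rule))
    ```

    Apriori proper adds a pruning step (only extend itemsets whose every subset is already frequent), but the support/confidence filtering shown here is what identifies the "frequent correlation factors" the abstract refers to.
    
    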

  10. Statistical physics of language dynamics

    NASA Astrophysics Data System (ADS)

    Loreto, Vittorio; Baronchelli, Andrea; Mukherjee, Animesh; Puglisi, Andrea; Tria, Francesca

    2011-04-01

    Language dynamics is a rapidly growing field that focuses on all processes related to the emergence, evolution, change and extinction of languages. Recently, the study of self-organization and evolution of language and meaning has led to the idea that a community of language users can be seen as a complex dynamical system, which collectively solves the problem of developing a shared communication framework through the back-and-forth signaling between individuals. We shall review some of the progress made in the past few years and highlight potential future directions of research in this area. In particular, the emergence of a common lexicon and of a shared set of linguistic categories will be discussed, as examples corresponding to the early stages of a language. The extent to which synthetic modeling is nowadays contributing to the ongoing debate in cognitive science will be pointed out. In addition, the burst of growth of the web is providing new experimental frameworks. It makes available a huge amount of resources, both as novel tools and data to be analyzed, allowing quantitative and large-scale analysis of the processes underlying the emergence of a collective information and language dynamics.

  11. The quantum computer game: citizen science

    NASA Astrophysics Data System (ADS)

    Damgaard, Sidse; Mølmer, Klaus; Sherson, Jacob

    2013-05-01

    Progress in the field of quantum computation is hampered by daunting technical challenges. Here we present an alternative approach to solving these by enlisting the aid of computer players around the world. We have previously examined a quantum computation architecture involving ultracold atoms in optical lattices and strongly focused tweezers of light. In The Quantum Computer Game (see http://www.scienceathome.org/), we have encapsulated the time-dependent Schrödinger equation for the problem in a graphical user interface allowing for easy user input. Players can then search the parameter space with real-time graphical feedback in a game context, with a global high score that rewards short gate times and robustness to experimental errors. The game, which is still in a demo version, has so far been tried by several hundred players. Extensions of the approach to other models such as Gross-Pitaevskii and Bose-Hubbard are currently under development. The game has also been incorporated into science education at high-school and university level as an alternative method for teaching quantum mechanics. Initial quantitative evaluation results are very positive. AU Ideas Center for Community Driven Research, CODER.

  12. Towards exaggerated emphysema stereotypes

    NASA Astrophysics Data System (ADS)

    Chen, C.; Sørensen, L.; Lauze, F.; Igel, C.; Loog, M.; Feragen, A.; de Bruijne, M.; Nielsen, M.

    2012-03-01

    Classification is widely used in the context of medical image analysis and in order to illustrate the mechanism of a classifier, we introduce the notion of an exaggerated image stereotype based on training data and trained classifier. The stereotype of some image class of interest should emphasize/exaggerate the characteristic patterns in an image class and visualize the information the employed classifier relies on. This is useful for gaining insight into the classification and serves for comparison with the biological models of disease. In this work, we build exaggerated image stereotypes by optimizing an objective function which consists of a discriminative term based on the classification accuracy, and a generative term based on the class distributions. A gradient descent method based on iterated conditional modes (ICM) is employed for optimization. We use this idea with Fisher's linear discriminant rule and assume a multivariate normal distribution for samples within a class. The proposed framework is applied to computed tomography (CT) images of lung tissue with emphysema. The synthesized stereotypes illustrate the exaggerated patterns of lung tissue with emphysema, which is underpinned by three different quantitative evaluation methods.

  13. Applying the concepts of innovation strategies to plastic surgery.

    PubMed

    Wang, Yirong; Kotsis, Sandra V; Chung, Kevin C

    2013-08-01

    Plastic surgery has a well-known history of innovative procedures and products. However, with the rise in competition, such as aesthetic procedures being performed by other medical specialties, there is a need for continued innovation in plastic surgery to create novel treatments to advance this specialty. Although many articles introduce innovative technologies and procedures, there is a paucity of publications to highlight the application of principles of innovation in plastic surgery. The authors review the literature regarding business strategies for innovation. The authors evaluate concepts of innovation, process of innovation (i.e., idea generation, idea evaluation, idea conversion, idea diffusion, and adoption), ethical issues, and application to plastic surgery. Adopting a business model of innovation is helpful for promoting a new paradigm of progress to propel plastic surgery to new avenues of creativity.

  14. Transatlantic Irritability: Brunonian sociology, America and mass culture in the nineteenth century.

    PubMed

    Budge, Gavin

    2014-01-01

    The widespread influence exerted by the medical theories of Scottish doctor, John Brown, whose eponymously named Brunonianism radically simplified the ideas of his mentor, William Cullen, has not been generally recognised. However, the very simplicity of the Brunonian medical model played a key role in ensuring the dissemination of medical ideas about nervous irritability and the harmful effects of overstimulation in the literary culture of the nineteenth century and shaped early sociological thinking. This chapter suggests the centrality of these medical ideas, as mediated by Brunonianism, to the understanding of Romanticism in the nineteenth century, and argues that Brunonian ideas shaped nineteenth-century thinking about the effects of mass print culture in ways which continue to influence contemporary thinking about the effects of media.

  15. Interpretation of protein quantitation using the Bradford assay: comparison with two calculation models.

    PubMed

    Ku, Hyung-Keun; Lim, Hyuk-Min; Oh, Kyong-Hwa; Yang, Hyo-Jin; Jeong, Ji-Seon; Kim, Sook-Kyung

    2013-03-01

    The Bradford assay is a simple method for protein quantitation, but variation in the results between proteins is a matter of concern. In this study, we compared and normalized quantitative values from two models for protein quantitation, where the residues in the protein that bind to anionic Coomassie Brilliant Blue G-250 comprise either Arg and Lys (Method 1, M1) or Arg, Lys, and His (Method 2, M2). Use of the M2 model yielded much more consistent quantitation values compared with use of the M1 model, which exhibited marked overestimations against protein standards. Copyright © 2012 Elsevier Inc. All rights reserved.
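    The residue-counting difference between the two models can be sketched directly. This only counts the residues each model assumes bind Coomassie Brilliant Blue G-250 (M1: Arg and Lys; M2: Arg, Lys, and His); the peptide and the normalization against standards are not from the paper:

    ```python
    def dye_binding_count(sequence, model="M2"):
        """Count residues assumed to bind Coomassie G-250 under each model:
        M1 counts Arg (R) and Lys (K); M2 additionally counts His (H)."""
        residues = {"M1": "RK", "M2": "RKH"}[model]
        return sum(sequence.count(aa) for aa in residues)

    peptide = "MKRHHGK"  # hypothetical one-letter-code sequence
    print(dye_binding_count(peptide, "M1"))  # counts R + K only
    print(dye_binding_count(peptide, "M2"))  # counts R + K + H
    ```

    For His-rich proteins the two counts diverge, which is consistent with the abstract's finding that the M1 model markedly overestimates relative to standards while M2 gives more consistent values.
    
    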

  16. Two Archetypes of Motor Control Research.

    PubMed

    Latash, Mark L

    2010-07-01

    This reply to the Commentaries is focused on two archetypes of motor control research, one based on physics and physiology and the other based on control theory and ideas of neural computations. The former approach, represented by the equilibrium-point hypothesis, strives to discover the physical laws and salient physiological variables that make purposeful coordinated movements possible. The latter approach, represented by the ideas of internal models and optimal control, tries to apply methods of control developed for man-made inanimate systems to the human body. Specific issues related to control with subthreshold membrane depolarization, motor redundancy, and the idea of synergies are briefly discussed.

  17. Unleashing creativity: The role of left temporoparietal regions in evaluating and inhibiting the generation of creative ideas.

    PubMed

    Mayseless, Naama; Aharon-Peretz, Judith; Shamay-Tsoory, Simone

    2014-11-01

    Human creativity is thought to entail two processes. One is idea generation, whereby ideas emerge in an associative manner, and the other is idea evaluation, whereby generated ideas are evaluated and screened. Thus far, neuroimaging studies have identified several brain regions as being involved in creativity, yet only a handful of studies have examined the neural basis underlying these two processes. We found that an individual with left temporoparietal hemorrhage who had no previous experience as an artist developed remarkable artistic creativity, which diminished as the hemorrhage receded. We thus hypothesized that damage to the evaluation network of creativity during the initial hematoma had a releasing effect on creativity by "freeing" the idea generation system. In line with this hypothesis, we conducted a subsequent fMRI study showing that decreased left temporal and parietal activations among healthy individuals as they evaluated creative ideas selectively predicted higher creativity. The current studies provide converging multi-method evidence suggesting that the left temporoparietal area is part of a neural network involved in evaluating creativity, and that as such may act as inhibitors of creativity. We propose an explanatory model of creativity centered upon the key role of the left temporoparietal regions in evaluating and inhibiting creativity. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Quantitative Characterization of Nanostructured Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dr. Frank

    The two-and-a-half day symposium on the "Quantitative Characterization of Nanostructured Materials" will be the first comprehensive meeting on this topic held under the auspices of a major U.S. professional society. Spring MRS Meetings provide a natural venue for this symposium as they attract a broad audience of researchers that represents a cross-section of the state-of-the-art regarding synthesis, structure-property relations, and applications of nanostructured materials. Close interactions among the experts in local structure measurements and materials researchers will help both to identify measurement needs pertinent to real-world materials problems and to familiarize the materials research community with the state-of-the-art local structure measurement techniques. We have chosen invited speakers that reflect the multidisciplinary and international nature of this topic and the need to continually nurture productive interfaces among university, government and industrial laboratories. The intent of the symposium is to provide an interdisciplinary forum for discussion and exchange of ideas on the recent progress in quantitative characterization of structural order in nanomaterials using different experimental techniques and theory. The symposium is expected to facilitate discussions on optimal approaches for determining atomic structure at the nanoscale using combined inputs from multiple measurement techniques.

  19. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods; for example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
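    The combined-metric prioritization can be sketched as a simple scoring pass. The ordinal scales, scenario names, and the particular combining formula below are all illustrative assumptions; the paper does not specify them here:

    ```python
    # Hypothetical ordinal scales (1 = low, 5 = high). Scenarios with high
    # severity and likelihood but LOW modeling difficulty rank first as the
    # best candidates for quantitative analysis.
    scenarios = [
        {"name": "wake encounter on parallel approach", "severity": 5, "likelihood": 3, "difficulty": 2},
        {"name": "runway incursion", "severity": 5, "likelihood": 2, "difficulty": 4},
        {"name": "minor taxi conflict", "severity": 2, "likelihood": 4, "difficulty": 1},
    ]

    def priority(s):
        # One plausible combination: risk (severity x likelihood)
        # discounted by how hard the scenario is to model.
        return s["severity"] * s["likelihood"] / s["difficulty"]

    for s in sorted(scenarios, key=priority, reverse=True):
        print(f"{priority(s):5.2f}  {s['name']}")
    ```

    Any monotone combination of the three metrics would serve the same screening purpose; the point is that modeling difficulty enters as a discount, not as a third risk factor.
    
    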

  20. An interdisciplinary approach for earthquake modelling and forecasting

    NASA Astrophysics Data System (ADS)

    Han, P.; Zhuang, J.; Hattori, K.; Ogata, Y.

    2016-12-01

    Earthquakes are among the most serious disasters and may cause heavy casualties and economic losses; in the past two decades especially, huge/mega earthquakes have hit many countries. Effective earthquake forecasting (including time, location, and magnitude) has become extremely important and urgent. To date, various heuristically derived algorithms have been developed for forecasting earthquakes. Generally, they can be classified into two types: catalog-based approaches and non-catalog-based approaches. Thanks to the rapid development of statistical seismology over the past 30 years, we are now able to evaluate the performance of these earthquake forecast approaches quantitatively. Although a certain amount of precursory information is available in both earthquake catalogs and non-catalog observations, earthquake forecasting is still far from satisfactory. In most cases, precursory phenomena have been studied individually. An earthquake model that combines self-exciting and mutually exciting elements was developed by Ogata and Utsu from the Hawkes process. The core idea of this combined model is that the status of the event at present is controlled by the event itself (self-exciting) and all the external factors (mutually exciting) in the past. In essence, the conditional intensity function is a time-varying Poisson process with rate λ(t), composed of the background rate, a self-exciting term (information from past seismic events), and an external excitation term (information from past non-seismic observations). This model shows us a way to integrate catalog-based and non-catalog-based forecasts. Against this background, we are trying to develop a new earthquake forecast model which combines catalog-based and non-catalog-based approaches.
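    The three-term conditional intensity described above can be sketched numerically. The exponential kernels and all parameter values below are assumptions for illustration; they are not the specific Ogata-Utsu formulation:

    ```python
    import math

    def intensity(t, events, external, mu=0.1, alpha=0.5, beta=1.0, gamma=0.3, delta=2.0):
        """Toy conditional intensity lambda(t) = background rate
        + self-excitation from past seismic events
        + mutual excitation from past non-seismic observations."""
        # Self-exciting term: exponential kernel over past event times.
        self_term = sum(alpha * math.exp(-beta * (t - ti)) for ti in events if ti < t)
        # External term: same form, driven by past non-catalog observations.
        ext_term = sum(gamma * math.exp(-delta * (t - tj)) for tj in external if tj < t)
        return mu + self_term + ext_term

    # The rate jumps just after an event, then decays back toward mu.
    print(intensity(1.01, events=[1.0], external=[]))
    print(intensity(5.0, events=[1.0], external=[]))
    ```

    In a full model the parameters would be fitted by maximizing the point-process log-likelihood, and the self-exciting kernel would typically be an Omori-type power law rather than the exponential used here for brevity.
    
    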

  1. Stat-tracks and mediotypes: powerful tools for modern ichnology based on 3D models

    PubMed Central

    Bennett, Matthew R.; Marty, Daniel; Budka, Marcin; Reynolds, Sally C.; Bakirov, Rashid

    2018-01-01

    Vertebrate tracks exhibit a wide distribution of morphological types. A single trackmaker may be associated with a range of tracks reflecting individual pedal anatomy and behavioural kinematics mediated through substrate properties which may vary both in space and time. Accordingly, the same trackmaker can leave substantially different morphotypes, something which must be considered when creating ichnotaxa. In modern practice this is often captured by the collection of a series of 3D track models. We introduce two concepts to help integrate these 3D models into ichnological analysis procedures. The mediotype is based on the idea of using statistically generated three-dimensional track models (median or mean) of the type specimens to create a composite track to support formal recognition of an ichnotype. A representative track (mean and/or median) is created from a set of individual reference tracks or from multiple examples from one or more trackways. In contrast, stat-tracks refer to other digitally generated tracks which may explore variance. For example, they are useful in: understanding the preservation variability of a given track sample; identifying characteristic or unusual track features; or simply as a quantitative comparison tool. Both concepts assist in making ichnotaxonomical interpretations and we argue that they should become part of the standard procedure when instituting new ichnotaxa. As three-dimensional models become a standard in publications on vertebrate ichnology, the mediotype and stat-track concepts have the potential to help guide a revolution in the study of vertebrate ichnology and ichnotaxonomy. PMID:29340246
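    In the simplest case, the mediotype computation reduces to a per-cell median over aligned track surfaces. The sketch below assumes the tracks are already registered onto a common depth grid, which is a strong assumption; real workflows need an alignment step first.

```python
import numpy as np

# Three toy 2x2 depth grids standing in for aligned 3D track models
# of the same trackmaker (values are made up for illustration).
tracks = np.stack([
    np.array([[0.0, 1.0], [2.0, 3.0]]),
    np.array([[0.2, 0.9], [2.1, 2.8]]),
    np.array([[0.1, 1.1], [1.9, 3.2]]),
])

mediotype = np.median(tracks, axis=0)  # cell-wise median composite track
mean_track = np.mean(tracks, axis=0)   # cell-wise mean alternative
```

The median composite damps outlier depths from any single poorly preserved track, which is one motivation for preferring it when defining a representative type specimen.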

  2. Combined registration of 3D tibia and femur implant models in 3D magnetic resonance images

    NASA Astrophysics Data System (ADS)

    Englmeier, Karl-Hans; Siebert, Markus; von Eisenhart-Rothe, Ruediger; Graichen, Heiko

    2008-03-01

    The most frequent reasons for revision of total knee arthroplasty are loosening and abnormal axial alignment leading to an unphysiological kinematic of the knee implant. To get an idea about the postoperative kinematic of the implant, it is essential to determine the position and orientation of the tibial and femoral prostheses. Therefore we developed a registration method for fitting 3D CAD models of knee joint prostheses into a 3D MR image. This rigid registration is the basis for a quantitative analysis of the kinematics of knee implants. Firstly, the surface data of the prosthesis models are converted into a voxel representation; a recursive algorithm determines all boundary voxels of the original triangular surface data. Secondly, an initial preconfiguration of the implants by the user is necessary for the following step: the user performs a rough preconfiguration of both prosthesis models so that the fine matching process gets a reasonable starting point. After that, an automated gradient-based fine matching process determines the best absolute position and orientation: this iterative process changes all 6 parameters (3 rotational and 3 translational) of a model by a minimal amount until a maximum value of the matching function is reached. To examine the spread of the final solutions of the registration, the interobserver variability was measured in a group of testers. This variability, calculated by the relative standard deviation, improved from about 50% (pure manual registration) to 0.5% (rough manual preconfiguration followed by the automatic fine matching process).
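    The iterative fine-matching step can be caricatured as a coordinate search over the six rigid-body parameters: nudge each parameter by a small step in whichever direction increases the matching score, and stop when no single-parameter change improves it. The toy quadratic score below stands in for the real image-driven matching function, which is not reproduced here.

```python
# Schematic sketch (assumptions throughout) of a 6-parameter fine match:
# 3 rotational + 3 translational parameters, adjusted stepwise until a
# maximum of the matching function is reached.

def fine_match(score, params, step=0.01, max_iter=1000):
    params = list(params)
    for _ in range(max_iter):
        improved = False
        for i in range(len(params)):
            for delta in (step, -step):
                trial = list(params)
                trial[i] += delta
                if score(trial) > score(params):
                    params, improved = trial, True
        if not improved:          # no single-parameter move helps: done
            break
    return params

# Toy score peaked at a known pose; a real score would compare the
# voxelized prosthesis model against the MR volume.
target = [0.1, -0.2, 0.0, 1.0, 2.0, -0.5]
score = lambda p: -sum((a - b) ** 2 for a, b in zip(p, target))
result = fine_match(score, [0.0] * 6)
```

This also shows why the rough manual preconfiguration matters: a greedy local search like this only finds the peak nearest its starting point.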

  3. Quantifying Volcanic Emissions of Trace Elements to the Atmosphere: Ideas Based on Past Studies

    NASA Astrophysics Data System (ADS)

    Rose, W. I.

    2003-12-01

    Extensive data exist from volcanological and geochemical studies about exotic elemental enrichments in volcanic emissions to the atmosphere, but quantitative data are quite rare. Advanced, highly sensitive techniques of analysis are needed to detect low concentrations of some minor elements, especially during major eruptions. I will present data from studies done during low levels of activity (incrustations and silica tube sublimates at high-temperature fumaroles, SEM studies of particle samples collected in volcanic plumes and volcanic clouds, geochemical analysis of volcanic gas condensates, and analysis of treated particle and gas filter packs) and a much smaller number that could reflect explosive activity (fresh ashfall leachate geochemistry, and thermodynamic codes modeling volatile emissions from magma). These data describe a highly variable pattern of elemental enrichments which are difficult to quantify, generalize and understand. Sampling in a routine way is difficult, and work in active craters has heightened our awareness of danger, which appropriately inhibits some sampling. There are numerous localized enrichments of minor elements that can be documented and others can be expected or inferred. There is a lack of systematic tools to measure minor element abundances in volcanic emissions. The careful combination of several methodologies listed above for the same volcanic vents can provide redundant data on multiple elements, which could lead to overall quantification of minor element fluxes, but there are challenging issues about detection. For quiescent plumes we can design combinations of measurements to quantify minor element emission rates. Developing a comparable methodology to measure minor element fluxes for significant eruptions will require new strategies and/or ideas.

  4. ‘Rowing against the current’: the policy process and effects of removing user fees for caesarean sections in Benin

    PubMed Central

    Cresswell, Jenny A; Makoutodé, Patrick; De Brouwere, Vincent; Witter, Sophie; Filippi, Veronique; Kanhonou, Lydie G; Goufodji, Sourou B; Lange, Isabelle L; Lawin, Lionel; Affo, Fabien; Marchal, Bruno

    2018-01-01

    Background In 2009, the Benin government introduced a user fee exemption policy for caesarean sections. We analyse this policy with regard to how the existing ideas and institutions related to user fees influenced key steps of the policy cycle and draw lessons that could inform the policy dialogue for universal health coverage in the West African region. Methods Following the policy stages model, we analyse the agenda setting, policy formulation and legitimation phase, and assess the implementation fidelity and policy results. We adopted an embedded case study design, using quantitative and qualitative data collected with 13 tools at the national level and in seven hospitals implementing the policy. Results We found that the initial political goal of the policy was not to reduce maternal mortality but to eliminate the detention in hospitals of mothers and newborns who cannot pay the user fees by exempting a comprehensive package of maternal health services. We found that the policy development process suffered from inadequate uptake of evidence and that the policy content and process were not completely in harmony with political and public health goals. The initial policy intention clashed with the neoliberal orientation of the political system, the fee recovery principles institutionalised since the Bamako Initiative and the prevailing ideas in favour of user fees. The policymakers did not take these entrenched factors into account. The resulting tension contributed to a benefit package covering only caesarean sections and to the variable implementation and effectiveness of the policy. Conclusion The influence of organisational culture in the decision-making processes in the health sector is often ignored but must be considered in the design and implementation of any policy aimed at achieving universal health coverage in West African countries. PMID:29564156

  5. Role of innovative institutional structures in integrated governance. A case study of integrating health and nutrition programs in Chhattisgarh, India.

    PubMed

    Kalita, Anuska; Mondal, Shinjini

    2012-01-01

    The aim of this paper is to highlight the significance of integrated governance in bringing about community participation, improved service delivery, accountability of public systems and human resource rationalisation. It discusses the strategies of innovative institutional structures in translating such integration in the areas of public health and nutrition for poor communities. The paper draws on experience of initiating integrated governance through innovations in health and nutrition programming in the resource-poor state of Chhattisgarh, India, at different levels of governance structures--hamlets, villages, clusters, blocks, districts and at the state. The study uses mixed methods--i.e. document analysis, interviews, discussions and quantitative data from facilities surveys--to present a case study analyzing the process and outcome of integration. The data indicate that integrated governance initiatives improved convergence between health and nutrition departments of the state at all levels. Also, innovative structures are important to implement the idea of integration, especially in contexts that do not have historical experience of such partnerships. Integration also contributed towards improved participation of communities in self-governance, community monitoring of government programs, and therefore, better services. As governments across the world, especially in developing countries, struggle towards achieving better governance, integration can serve as a desirable process to address this. Integration can affect the decentralisation of power, inclusion, efficiency, accountability and improved service quality in government programs. The institutional structures detailed in this paper can provide models for replication in other similar contexts for translating and sustaining the idea of integrated governance. 
This paper is one of the few to investigate innovative public institutions and community mobilisation to explore this important, and under-researched, topic.

  6. Model-based Executive Control through Reactive Planning for Autonomous Rovers

    NASA Technical Reports Server (NTRS)

    Finzi, Alberto; Ingrand, Felix; Muscettola, Nicola

    2004-01-01

    This paper reports on the design and implementation of a real-time executive for a mobile rover that uses a model-based, declarative approach. The control system is based on the Intelligent Distributed Execution Architecture (IDEA), an approach to planning and execution that provides a unified representational and computational framework for an autonomous agent. The basic hypothesis of IDEA is that a large control system can be structured as a collection of interacting agents, each with the same fundamental structure. We show that planning and real-time response are compatible if the executive minimizes the size of the planning problem. We detail the implementation of this approach on an exploration rover (Gromit, an RWI ATRV Junior at NASA Ames), presenting different IDEA controllers for the same domain and comparing them with more classical approaches. We demonstrate that the approach is scalable to the complex coordination of functional modules needed for autonomous navigation and exploration.

  7. Thermodynamics of Biological Processes

    PubMed Central

    Garcia, Hernan G.; Kondev, Jane; Orme, Nigel; Theriot, Julie A.; Phillips, Rob

    2012-01-01

    There is a long and rich tradition in biology of using ideas from both equilibrium thermodynamics and its microscopic partner theory, equilibrium statistical mechanics. In this chapter, we provide some background on the origins of the seemingly unreasonable effectiveness of ideas from both thermodynamics and statistical mechanics in biology. After describing these foundational issues, we turn to a series of case studies, primarily focused on binding, that are intended to illustrate the broad reach of equilibrium thinking in biology. These case studies include ligand-gated ion channels, thermodynamic models of transcription, and recent applications to the problem of bacterial chemotaxis. As part of the description of these case studies, we explore a number of different uses of the famed Monod-Wyman-Changeux (MWC) model as a generic tool for providing a mathematical characterization of two-state systems. These case studies should provide a template for tailoring equilibrium ideas to other problems of biological interest. PMID:21333788
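    As a concrete instance of the two-state thinking discussed above, the textbook MWC activity curve can be computed directly. The parameter values below are arbitrary illustrations, not taken from the chapter.

```python
# Standard two-state MWC calculation: probability that a receptor with n
# identical binding sites is in its active state, given ligand
# concentration c, dissociation constants for the active (Ka) and
# inactive (Ki) states, and the unliganded equilibrium constant L
# (ratio of inactive to active receptors with no ligand bound).

def mwc_p_active(c, n=4, L=1000.0, Ka=1.0, Ki=100.0):
    active = (1 + c / Ka) ** n
    inactive = L * (1 + c / Ki) ** n
    return active / (active + inactive)

p_low = mwc_p_active(0.0)     # mostly inactive when unliganded (large L)
p_high = mwc_p_active(100.0)  # ligand binding shifts the equilibrium
```

Because Ka < Ki in this illustration, ligand binds the active state preferentially and pulls the population toward activity, which is the generic MWC mechanism behind the ion-channel and chemotaxis case studies.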

  8. The Effects of Educational Diversity in a National Sample of Law Students: Fitting Multilevel Latent Variable Models in Data With Categorical Indicators.

    PubMed

    Gottfredson, Nisha C; Panter, A T; Daye, Charles E; Allen, Walter F; Wightman, Linda F

    2009-01-01

    Controversy surrounding the use of race-conscious admissions can be partially resolved with improved empirical knowledge of the effects of racial diversity in educational settings. We use a national sample of law students nested in 64 law schools to test the complex and largely untested theory regarding the effects of educational diversity on student outcomes. Social scientists who study these outcomes frequently encounter both latent variables and nested data within a single analysis. Yet, until recently, an appropriate modeling technique has been computationally infeasible, and consequently few applied researchers have estimated appropriate models to test their theories, sometimes limiting the scope of their research question. Our results, based on disaggregated multilevel structural equation models, show that racial diversity is related to a reduction in prejudiced attitudes and increased perceived exposure to diverse ideas and that these effects are mediated by more frequent interpersonal contact with diverse peers. These findings provide support for the idea that administrative manipulation of educational diversity may lead to improved student outcomes. Admitting a racially/ethnically diverse student body provides an educational experience that encourages increased exposure to diverse ideas and belief systems.

  9. Models of the Jovian Ring and Comparisions With Observations

    NASA Astrophysics Data System (ADS)

    Juhasz, A.; Horanyi, M.

    2008-12-01

    A number of in situ and remote sensing observations of the Jovian ring system exist, so we can now combine observations from Voyager, Pioneer, Galileo and Cassini, as well as ground-based and HST measurements. In this presentation we will compare this large body of observations to available theoretical models of the dust dynamics in the Jovian ring. Common to all models (Burns et al., 1985, 2001; Horanyi et al., 1996, 2004) is the basic idea that dust is being continuously produced by micro-meteoroid bombardment of the moons in this region. Also, the spatial distribution of dust in the halo region inward of the main ring is generally accepted to be a consequence of electrodynamic perturbations acting on small charged dust particles. However, the suggested theoretical models differ drastically in the time scale for orbital evolution. Burns et al. argue that, in the main ring, dust particles evolve inward very slowly due to Poynting-Robertson drag; a typical micron-sized grain is predicted to orbit Jupiter for 10^4 years before crashing into the atmosphere of Jupiter. Horanyi et al. argue that the radial transport is due to resonant charge variations, dictated by the plasma density distribution. In this model grains are transported on a time scale that is orders of magnitude shorter than predicted by PR drag. Here we use both of these models to generate brightness distributions and predict optical depth distributions for the same geometries and wavelengths as those of the observations. Quantitative comparisons of the modeled and real observations lead us to the conclusion that dust transport in the ring/halo region at Jupiter is mainly due to resonant charge variation.

  10. Rasmussen's legacy: A paradigm change in engineering for safety.

    PubMed

    Leveson, Nancy G

    2017-03-01

    This paper describes three applications of Rasmussen's ideas to systems engineering practice. The first is the application of the abstraction hierarchy to engineering specifications, particularly requirements specification. The second is the use of Rasmussen's ideas in safety modeling and analysis to create a new, more powerful type of accident causation model that extends traditional models to better handle human-operated, software-intensive, sociotechnical systems. Because this new model has a formal, mathematical foundation built on systems theory (as was Rasmussen's original model), new modeling and analysis tools become possible. The third application is to engineering hazard analysis. Engineers have traditionally either omitted humans from consideration in system hazard analysis or have treated them rather superficially, for example by assuming that they behave randomly. Applying Rasmussen's model of human error to a powerful new hazard analysis technique allows human behavior to be included in engineering hazard analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. The mathematics of cancer: integrating quantitative models.

    PubMed

    Altrock, Philipp M; Liu, Lin L; Michor, Franziska

    2015-12-01

    Mathematical modelling approaches have become increasingly abundant in cancer research. The complexity of cancer is well suited to quantitative approaches as it provides challenges and opportunities for new developments. In turn, mathematical modelling contributes to cancer research by helping to elucidate mechanisms and by providing quantitative predictions that can be validated. The recent expansion of quantitative models addresses many questions regarding tumour initiation, progression and metastases as well as intra-tumour heterogeneity, treatment responses and resistance. Mathematical models can complement experimental and clinical studies, but also challenge current paradigms, redefine our understanding of mechanisms driving tumorigenesis and shape future research in cancer biology.

  12. Generalized PSF modeling for optimized quantitation in PET imaging.

    PubMed

    Ashrafinia, Saeed; Mohy-Ud-Din, Hassan; Karakatsanis, Nicolas A; Jha, Abhinav K; Casey, Michael E; Kadrmas, Dan J; Rahmim, Arman

    2017-06-21

    Point-spread function (PSF) modeling offers the ability to account for resolution-degrading phenomena within the PET image generation framework. PSF modeling improves resolution and enhances contrast, but at the same time significantly alters image noise properties and induces an edge overshoot effect. Thus, studying the effect of PSF modeling on quantitation task performance can be very important. Frameworks explored in the past involved a dichotomy of PSF versus no-PSF modeling. By contrast, the present work focuses on quantitative performance evaluation of standard uptake value (SUV) PET images, while incorporating a wide spectrum of PSF models, including those that under- and over-estimate the true PSF, for the potential of enhanced quantitation of SUVs. The developed framework first analytically models the true PSF, considering a range of resolution-degradation phenomena (including photon non-collinearity, inter-crystal penetration and scattering) present in data acquisitions with modern commercial PET systems. In the context of oncologic liver FDG PET imaging, we generated 200 noisy datasets per image set (with clinically realistic noise levels) using an XCAT anthropomorphic phantom with liver tumours of varying sizes. These were subsequently reconstructed using the OS-EM algorithm with varying modeled PSF kernels. We focused on quantitation of both SUV_mean and SUV_max, including assessment of contrast recovery coefficients, as well as noise-bias characteristics (including both image roughness and coefficient of variability), for different tumours/iterations/PSF kernels. It was observed that an overestimated PSF yielded more accurate contrast recovery for a range of tumours, and typically improved quantitative performance. For a clinically reasonable number of iterations, edge enhancement due to PSF modeling (especially due to an over-estimated PSF) was in fact seen to lower SUV_mean bias in small tumours.
Overall, the results indicate that exactly matched PSF modeling does not offer optimized PET quantitation, and that PSF overestimation may provide enhanced SUV quantitation. Furthermore, generalized PSF modeling may provide a valuable approach for quantitative tasks such as treatment-response assessment and prognostication.
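    The role of the PSF in contrast recovery can be illustrated with a one-dimensional toy model (our construction, unrelated to the paper's XCAT/OS-EM pipeline): blur "tumour" profiles of different widths with a Gaussian PSF and compute the contrast recovery coefficient (CRC = measured max / true max). Small lesions lose the most contrast, which is why PSF handling matters most for them.

```python
import numpy as np

def gaussian_kernel(sigma, radius=15):
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()             # normalized discrete Gaussian PSF

def crc(tumour_width, sigma=2.0, n=128):
    profile = np.zeros(n)          # 1D "tumour" of unit uptake
    lo = n // 2 - tumour_width // 2
    profile[lo:lo + tumour_width] = 1.0
    blurred = np.convolve(profile, gaussian_kernel(sigma), mode="same")
    return blurred.max() / profile.max()

# CRC rises toward 1 as the lesion grows relative to the PSF width.
crcs = {w: crc(w) for w in (2, 4, 8, 16)}
```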

  13. Inhibitory Control as a Core Process of Creative Problem Solving and Idea Generation from Childhood to Adulthood

    ERIC Educational Resources Information Center

    Cassotti, Mathieu; Agogué, Marine; Camarda, Anaëlle; Houdé, Olivier; Borst, Grégoire

    2016-01-01

    Developmental cognitive neuroscience studies tend to show that the prefrontal brain regions (known to be involved in inhibitory control) are activated during the generation of creative ideas. In the present article, we discuss how a dual-process model of creativity--much like the ones proposed to account for decision making and reasoning--could…

  14. Why Don't Students Like School? Willingham, Perkins, and a Comprehensive Model of School Reform

    ERIC Educational Resources Information Center

    Jones, Jennifer L.; Jones, Karrie A.; Vermette, Paul J.

    2013-01-01

    As teachers attempt to sustain meaningful school change, the need for visionary yet feasible reform becomes profound. To help reach this end, this article examines the ideas of Willingham and Perkins for principles of effective teaching and learning. While seemingly divergent in their approaches for reform, examination of their collective ideas in…

  15. Studying Plant-Rhizobium Mutualism in the Biology Classroom: Connecting the Big Ideas in Biology through Inquiry

    ERIC Educational Resources Information Center

    Suwa, Tomomi; Williamson, Brad

    2014-01-01

    We present a guided-inquiry biology lesson, using the plant-rhizobium symbiosis as a model system. This system provides a rich environment for developing connections between the big ideas in biology as outlined in the College Board's new AP Biology Curriculum. Students gain experience with the practice of scientific investigation, from…

  16. An Investigation of University Student and K-12 Teacher Reasoning about Key Ideas in the Development of the Particulate Nature of Matter

    ERIC Educational Resources Information Center

    Robertson, Amy D.

    2011-01-01

    This dissertation describes a systematic investigation of university student and K-12 teacher reasoning about key ideas relevant to the development of a particulate model for matter. Written assessments and individual demonstration interviews have been used to study the reasoning of introductory and sophomore-level physics students, introductory…

  17. Studies for Emerging Electric Grid Cybersecurity Technologies using Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Francia, X.

    The Energy Commission is currently soliciting ideas and stakeholder input for the 2018 – 2020 EPIC Triennial Investment Plan. For those that would like to submit an idea for consideration in the 2018-2020 EPIC Triennial Plan, we ask that you complete the form below. Submittals are due by 5:00 p.m. on February 10, 2017.

  18. Planning for the Future: A Model for Using the Principles of Transition to Guide the Development of Behavior Intervention Plans

    ERIC Educational Resources Information Center

    Mueller, Tracy Gershwin; Bassett, Diane S.; Brewer, Robin D.

    2012-01-01

    The Individuals with Disabilities Education Act (IDEA) mandates the implementation of a behavior intervention plan based on a functional behavioral assessment when a student's behavior necessitates disciplinary actions. However, IDEA does not provide any clear guidelines as to what the plans should contain nor how they can address behaviors that…

  19. Light and dark: A survey of new physics ideas in the 1-100 MeV window

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pospelov, Maxim

    2013-11-07

    I review the set of theoretical ideas motivating experimental searches for light physics beyond the Standard Model using high-intensity electron beams. While the 'dark photon' is the chief example of such physics, other 'light and dark' states (e.g. 'dark Higgses') are also of interest. I discuss particle physics, cosmology and astrophysics applications.

  20. Seeing the World Anew: A Case Study of Ideas, Engagement, and Transfer in a 3 Year Old.

    ERIC Educational Resources Information Center

    Pugh, Kevin

    According to the philosophy of John Dewey, the goal of education is to provide students with an increased capacity for having worthwhile experiences. This paper draws on Dewey's writings to develop a theory of worthwhile experience, termed "idea-based experience." A model is proposed of how individuals are apprenticed into having an…

  1. Minding the Business of Business: Tools and Models to Design and Measure Wealth Creation

    ERIC Educational Resources Information Center

    Bernardez, Mariano L.

    2009-01-01

    What is the business of business? How can planners and investors anticipate the true chances of failure and success of a business idea? This article describes a rationale for developing successful new business on the basis of a simple, sensible idea: the business of any business is to make its clients successful enough to continue purchasing and…

  2. Health care system change and the cross-border transfer of ideas: influence of the Dutch model on the 2007 German health reform.

    PubMed

    Leiber, Simone; Gress, Stefan; Manouguian, Maral-Sonja

    2010-08-01

    To increase understanding of the cross-border transfer of ideas through a case study of the 2007 German health reform, this article draws on Kingdon's approach of streams and follows two main objectives: first, to understand the extent to which the German health reform was actually influenced by the Dutch model and, second, in theoretical terms, to inform inductively on how ideas from abroad enter government agendas. The results show that the streams of problem recognition and policy proposals have not been predominantly influenced by the cross-border transfer of ideas from the Netherlands to Germany. The Dutch experience was taken into consideration only after a policy window opened by a shift in politics in the third, the political, stream: the change of government in 2005. In many respects, the way Germany learned from the Netherlands in this case sharply contrasts with an image of solving policy problems by either lesson drawing or transnational deliberation. Instead, the process was dominated by problem solving in the sphere of politics, that is, finding a way to prove the grand coalition was capable of acting.

  3. Coupled oscillators and Feynman's three papers

    NASA Astrophysics Data System (ADS)

    Kim, Y. S.

    2007-05-01

    According to Richard Feynman, the adventure of our science of physics is a perpetual attempt to recognize that the different aspects of nature are really different aspects of the same thing. It is therefore interesting to combine some, if not all, of Feynman's papers into one. The first of his three papers is on the "rest of the universe" contained in his 1972 book on statistical mechanics. The second idea is Feynman's parton picture which he presented in 1969 at the Stony Brook conference on high-energy physics. The third idea is contained in the 1971 paper he published with his students, where they show that the hadronic spectra on Regge trajectories are manifestations of harmonic-oscillator degeneracies. In this report, we formulate these three ideas using the mathematics of two coupled oscillators. It is shown that the idea of entanglement is contained in his rest of the universe, and can be extended to a space-time entanglement. It is shown also that his parton model and the static quark model can be combined into one Lorentz-covariant entity. Furthermore, Einstein's special relativity, based on the Lorentz group, can also be formulated within the mathematical framework of two coupled oscillators.

  4. Projecting Event-Based Analysis Dates in Clinical Trials: An Illustration Based on the International Duration Evaluation of Adjuvant Chemotherapy (IDEA) Collaboration. Projecting analysis dates for the IDEA collaboration.

    PubMed

    Renfro, Lindsay A; Grothey, Axel M; Paul, James; Floriani, Irene; Bonnetain, Franck; Niedzwiecki, Donna; Yamanaka, Takeharu; Souglakos, Ioannis; Yothers, Greg; Sargent, Daniel J

    2014-12-01

    Clinical trials are expensive and lengthy, and the success of a given trial depends on observing a prospectively defined number of patient events required to answer the clinical question. The point at which this analysis time occurs depends on both patient accrual and primary event rates, which typically vary throughout the trial's duration. We demonstrate real-time analysis date projections using data from a collection of six clinical trials that are part of the IDEA collaboration, an international preplanned pooling of data from six trials testing the duration of adjuvant chemotherapy in stage III colon cancer, and we additionally consider the hypothetical impact of one trial's early termination of follow-up. In the absence of outcome data from IDEA, monthly accrual rates for each of the six IDEA trials were used to project subsequent trial-specific accrual, while historical data from similar Adjuvant Colon Cancer Endpoints (ACCENT) Group trials were used to construct a parametric model for IDEA's primary endpoint, disease-free survival, under the same treatment regimen. With this information and using the planned total accrual from each IDEA trial protocol, individual patient accrual and event dates were simulated and the overall IDEA interim and final analysis times projected. Projections were then compared with actual (previously undisclosed) trial-specific event totals at a recent census time for validation. The change in projected final analysis date assuming early termination of follow-up for one IDEA trial was also calculated. Trial-specific predicted event totals were close to the actual number of events per trial at the recent census date, with the overall IDEA projected number of events off by only eight patients. Potential early termination of follow-up by one IDEA trial was estimated to postpone the overall IDEA final analysis date by 9 months.
Real-time projection of the final analysis time during a trial, or the overall analysis time during a trial collaborative such as IDEA, has practical implications for trial feasibility when these projections are translated into additional time and resources required.
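    A toy version of such event-based projection can be simulated directly. This is our sketch, not the IDEA methodology: constant accrual and exponential times-to-event are simplifying assumptions standing in for the trial-specific accrual curves and parametric survival model described above.

```python
import math
import random

# Simulate steady patient accrual and exponential event times, then read
# off the calendar time (in months) at which the target event count is
# reached: a crude stand-in for projecting an analysis date.

def projected_analysis_time(n_patients, accrual_rate, median_event_time,
                            target_events, seed=0):
    rng = random.Random(seed)
    hazard = math.log(2) / median_event_time   # exponential hazard
    event_dates = []
    for i in range(n_patients):
        entry = i / accrual_rate               # steady accrual assumption
        event_dates.append(entry + rng.expovariate(hazard))
    event_dates.sort()
    return event_dates[target_events - 1]      # date of the Nth event

t = projected_analysis_time(n_patients=1000, accrual_rate=25.0,
                            median_event_time=36.0, target_events=300)
```

Repeating the simulation over many seeds would give a distribution of projected dates, which is how such projections convey uncertainty in practice.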

  5. Teaching 1H NMR Spectrometry Using Computer Modeling.

    ERIC Educational Resources Information Center

    Habata, Yoichi; Akabori, Sadatoshi

    2001-01-01

    Molecular modeling by computer is used to display stereochemistry, molecular orbitals, structure of transition states, and progress of reactions. Describes new ideas for teaching 1H NMR spectroscopy using computer modeling. (Contains 12 references.) (ASK)

  6. Human Reliability and Ship Stability

    DTIC Science & Technology

    2003-07-04

    models such as Miller (1957) and Broadbent (1959) is the idea of human beings as limited-capacity information processors with constraints on... 4.2.2 Outline of Some Key Models... Table 11: Generic Error Modeling System

  7. Quantitative structure-property relationship (correlation analysis) of phosphonic acid-based chelates in design of MRI contrast agent.

    PubMed

    Tiwari, Anjani K; Ojha, Himanshu; Kaul, Ankur; Dutta, Anupama; Srivastava, Pooja; Shukla, Gauri; Srivastava, Rakesh; Mishra, Anil K

    2009-07-01

    Nuclear magnetic resonance imaging is a very useful tool in modern medical diagnostics, especially when gadolinium(III)-based contrast agents are administered to the patient with the aim of increasing the image contrast between normal and diseased tissues. Using soft modelling techniques such as quantitative structure-activity relationship/quantitative structure-property relationship analysis after a suitable description of their molecular structure, we have studied a series of phosphonic acids for the design of new MRI contrast agents. Quantitative structure-property relationship studies with multiple linear regression analysis were applied to find correlations between different calculated molecular descriptors of the phosphonic acid-based chelating agents and their stability constants. The final quantitative structure-property relationship models were: Model 1 (phosphonic acid series): log K(ML) = 5.00243(±0.7102) - 0.0263(±0.540) MR; n = 12, |r| = 0.942, s = 0.183, F = 99.165. Model 2 (phosphonic acid series): log K(ML) = 5.06280(±0.3418) - 0.0252(±0.198) MR; n = 12, |r| = 0.956, s = 0.186, F = 99.256.
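
The form of such a one-descriptor regression model is easy to reproduce on synthetic data. The sketch below fits log K(ML) = b0 + b1·MR by ordinary least squares; the molar-refractivity values and stability constants are invented placeholders, not the paper's twelve chelates.

```python
import numpy as np

# Hypothetical (MR, log K) pairs standing in for the paper's 12 chelates.
MR = np.array([10.2, 12.5, 15.1, 18.3, 20.0, 22.7,
               25.4, 28.1, 30.9, 33.2, 35.8, 38.4])
logK = 5.0 - 0.026 * MR + np.random.default_rng(0).normal(0.0, 0.1, MR.size)

# Ordinary least squares: logK = b0 + b1 * MR
X = np.column_stack([np.ones_like(MR), MR])
beta, *_ = np.linalg.lstsq(X, logK, rcond=None)
pred = X @ beta
r = np.corrcoef(logK, pred)[0, 1]                        # correlation coefficient
s = np.sqrt(np.sum((logK - pred) ** 2) / (MR.size - 2))  # residual std error
print(f"log K(ML) = {beta[0]:.3f} + ({beta[1]:.4f})*MR;"
      f" n = {MR.size}, |r| = {abs(r):.3f}, s = {s:.3f}")
```

The n, |r| and s values quoted in the abstract are the same kind of fit statistics such a regression reports.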

  8. Data compression for sequencing data

    PubMed Central

    2013-01-01

    Post-Sanger sequencing methods produce tons of data, and there is a general agreement that the challenge to store and process them must be addressed with data compression. In this review we first answer the question “why compression” in a quantitative manner. Then we also answer the questions “what” and “how”, by sketching the fundamental compression ideas, describing the main sequencing data types and formats, and comparing the specialized compression algorithms and tools. Finally, we go back to the question “why compression” and give other, perhaps surprising answers, demonstrating the pervasiveness of data compression techniques in computational biology. PMID:24252160
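
One of the fundamental compression ideas such reviews sketch is that DNA's four-letter alphabet wastes most of an 8-bit byte. A minimal illustration (not any specific tool from the review) packs bases at 2 bits each, a guaranteed 4x reduction before any entropy coding; real sequencing compressors must additionally handle N bases, quality scores and read headers.

```python
def pack_dna(seq: str) -> bytes:
    """Pack an ACGT string into 2 bits per base (4x smaller than ASCII)."""
    code = {"A": 0, "C": 1, "G": 2, "T": 3}
    out = bytearray()
    buf = bits = 0
    for base in seq:
        buf = (buf << 2) | code[base]
        bits += 2
        if bits == 8:
            out.append(buf)
            buf = bits = 0
    if bits:
        out.append(buf << (8 - bits))  # left-align the final partial byte
    return bytes(out)

def unpack_dna(data: bytes, n: int) -> str:
    """Recover the first n bases from a 2-bit-packed buffer."""
    bases = "ACGT"
    out = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            out.append(bases[(byte >> shift) & 3])
    return "".join(out[:n])
```

A specialized compressor would follow this alphabet reduction with context modeling or general-purpose entropy coding, which is where the algorithms compared in the review differ.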

  9. What Family Support Specialists Do: Examining Service Delivery

    PubMed Central

    Wisdom, Jennifer P.; Lewandowski, R. Eric; Pollock, Michele; Acri, Mary; Shorter, Priscilla; Olin, S. Serene; Armusewicz, Kelsey; Horwitz, Sarah; Hoagwood, Kimberly E.

    2013-01-01

    This study describes services provided by family support specialists (FSS), peer advocates in programs for children with serious psychiatric conditions, to delineate differences between recommended components of FSS services and services actually provided. An analysis of qualitative interview and observational data and quantitative survey data from 63 staff at 21 mental health programs in New York identified that FSS and other staff have generally similar ideas about FSS services, and that these perceptions of activities are generally congruent with what FSS actually did. Implications of findings are discussed in the context of developing competencies and quality indicators for FSS. PMID:24174330

  10. Quantitative characterisation of audio data by ordinal symbolic dynamics

    NASA Astrophysics Data System (ADS)

    Aschenbrenner, T.; Monetti, R.; Amigó, J. M.; Bunk, W.

    2013-06-01

    Ordinal symbolic dynamics has developed into a valuable method to describe complex systems. Recently, using the concept of transcripts, the coupling behaviour of systems was assessed, combining the properties of the symmetric group with information theoretic ideas. In this contribution, methods from the field of ordinal symbolic dynamics are applied to the characterisation of audio data. Coupling complexity between frequency bands of solo violin music, as a fingerprint of the instrument, is used for classification purposes within a support vector machine scheme. Our results suggest that coupling complexity is able to capture essential characteristics, sufficient to distinguish among different violins.
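
The ordinal-symbolic machinery referred to here starts by mapping each short window of a signal to its rank-order pattern, an element of the symmetric group, and then measuring the complexity of the resulting symbol stream. The sketch below computes ordinal patterns and normalized permutation entropy for a single series; it is a generic illustration, not the authors' transcript-based coupling-complexity measure between frequency bands.

```python
import math

def ordinal_patterns(x, order=3):
    """Symbolize a series by the rank order of each length-`order` window."""
    return [tuple(sorted(range(order), key=lambda i: x[t + i]))
            for t in range(len(x) - order + 1)]

def permutation_entropy(x, order=3):
    """Normalized Shannon entropy of the ordinal-pattern distribution:
    0 for a fully predictable series, 1 for maximal pattern diversity."""
    pats = ordinal_patterns(x, order)
    counts = {}
    for p in pats:
        counts[p] = counts.get(p, 0) + 1
    n = len(pats)
    h = -sum((c / n) * math.log(c / n) for c in counts.values())
    return h / math.log(math.factorial(order))
```

A monotone ramp yields entropy 0 (a single pattern), while an oscillating series mixes patterns and scores higher; coupling measures build on joint statistics of such patterns across two channels.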

  11. Maths Meets Myths: Network Investigations of Ancient Narratives

    NASA Astrophysics Data System (ADS)

    Kenna, Ralph; Mac Carron, Pádraig

    2016-02-01

    Three years ago, we initiated a programme of research in which ideas and tools from statistical physics and network theory were applied to the field of comparative mythology. The eclecticism of the work, together with the perspectives it delivered, led to widespread media coverage and academic discussion. Here we review some aspects of the project, contextualised with a brief history of the long relationship between science and the humanities. We focus in particular on an Irish epic, summarising some of the outcomes of our quantitative investigation. We also describe the emergence of a new sub-discipline and our hopes for its future.

  12. RS CVn stars - Chromospheric phenomena

    NASA Technical Reports Server (NTRS)

    Bopp, B. W.

    1983-01-01

    The observational information regarding chromospheric emission features in surface-active RS CVn stars is reviewed. Three optical features are considered in detail: Ca II H and K, Balmer H-alpha and He I 10830 A. While the qualitative behavior of these lines is in accord with solar-analogy/rotation-activity ideas, the quantitative variation and scaling are very poorly understood. In many cases, the spectroscopic observations with sufficient SNR and resolution to decide these questions have simply not yet been made. The FK Com stars, in particular, present extreme examples of rotation that may well tax present understanding of surface activity to its limits.

  13. Phi optics: from image to knowledge (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Chiritescu, Catalin

    2016-03-01

    Optical microscopy of live cells and tissues provides the main insight for life science researchers in academia and bio-pharma. The cells have very small features, are transparent, and require long-term observations (hours to days) to measure the effects of drugs and diseases. New technologies - under the umbrella term of Quantitative Phase Imaging (QPI) - have come to light in the past decade to challenge and complement the current state-of-the-art solutions that use fluorophores. The Phi Optics talk will outline lessons learned in the process of bringing an academic idea to the commercial space.

  14. Usefulness of threshold dose to prevent damage of underlying tissue by PDT treatment: an in-vitro study on chondrocytes

    NASA Astrophysics Data System (ADS)

    Placzek, R.; Kempka, G.; Ruether, W.; Moser, Joerg G.

    1995-03-01

    Arthritic synovitis is best cured by total removal of the inflamed synovia (synovectomy). This can be performed by open surgery, by arthroscopy, or by radiosynoviorthesis, i.e., injection of beta-emitting rare earth metals into the joint cavity. All these procedures are more or less non-quantitative and may lead to a recurrence. The idea was to destroy the inflamed synovia by PDT without destruction of the underlying tissue (cartilage and bone). Therefore, the sensitivity of the cartilage-building cells, which can be grown in cell culture, has to be studied.

  15. Leaky Pipeline Myths: In Search of Gender Effects on the Job Market and Early Career Publishing in Philosophy

    PubMed Central

    Allen-Hermanson, Sean

    2017-01-01

    That philosophy is an outlier in the humanities when it comes to the underrepresentation of women has been the occasion for much discussion about possible effects of subtle forms of prejudice, including implicit bias and stereotype threat. While these ideas have become familiar to the philosophical community, there has only recently been a surge of interest in acquiring field-specific data. This paper adds to quantitative findings bearing on hypotheses about the effects of unconscious prejudice on two important stages along career pathways: tenure-track hiring and early career publishing. PMID:28659843

  16. Inhibitory Control as a Core Process of Creative Problem Solving and Idea Generation from Childhood to Adulthood.

    PubMed

    Cassotti, Mathieu; Agogué, Marine; Camarda, Anaëlle; Houdé, Olivier; Borst, Grégoire

    2016-01-01

    Developmental cognitive neuroscience studies tend to show that the prefrontal brain regions (known to be involved in inhibitory control) are activated during the generation of creative ideas. In the present article, we discuss how a dual-process model of creativity-much like the ones proposed to account for decision making and reasoning-could broaden our understanding of the processes involved in creative ideas generation. When generating creative ideas, children, adolescents, and adults tend to follow "the path of least resistance" and propose solutions that are built on the most common and accessible knowledge within a specific domain, leading to fixation effect. In line with recent theory of typical cognitive development, we argue that the ability to resist the spontaneous activation of design heuristics, to privilege other types of reasoning, might be critical to generate creative ideas at all ages. In the present review, we demonstrate that inhibitory control at all ages can actually support creativity. Indeed, the ability to think of something truly new and original requires first inhibiting spontaneous solutions that come to mind quickly and unconsciously and then exploring new ideas using a generative type of reasoning. © 2016 Wiley Periodicals, Inc.

  17. How well do middle school science programs measure up? Findings from Project 2061's curriculum review

    NASA Astrophysics Data System (ADS)

    Kesidou, Sofia; Roseman, Jo Ellen

    2002-08-01

    The purposes of this study were to examine how well middle school programs support the attainment of key scientific ideas specified in national science standards, and to identify typical strengths and weaknesses of these programs using research-based criteria. Nine widely used programs were examined by teams of teachers and specialists in research on teaching and learning. Reviewers found that whereas key ideas were generally present in the programs, they were typically buried between detailed or even unrelated ideas. Programs only rarely provided students with a sense of purpose for the units of study, took account of student beliefs that interfere with learning, engaged students with relevant phenomena to make abstract scientific ideas plausible, modeled the use of scientific knowledge so that students could apply what they learned in everyday situations, or scaffolded student efforts to make meaning of key phenomena and ideas presented in the programs. New middle school science programs that reflect findings from learning research are needed to support teachers better in helping students learn key ideas in science. The criteria and findings from this study on the inadequacies in existing programs could serve as guidelines in new curriculum development.

  18. Customer-experienced rapid prototyping

    NASA Astrophysics Data System (ADS)

    Zhang, Lijuan; Zhang, Fu; Li, Anbo

    2008-12-01

    In order to describe accurately and comprehend quickly the perfect GIS requirements, this article integrates the ideas of QFD (Quality Function Deployment) and UML (Unified Modeling Language), analyzes the deficiencies of the prototype development model, proposes the idea of Customer-Experienced Rapid Prototyping (CE-RP), and describes in detail the process and framework of the CE-RP from the perspective of the characteristics of modern GIS. The CE-RP is mainly composed of Customer Tool-Sets (CTS), Developer Tool-Sets (DTS) and a Barrier-Free Semantic Interpreter (BF-SI), and is performed by the two roles of customer and developer. The main purpose of the CE-RP is to produce unified and authorized requirements data models between customer and software developer.

  19. In science communication, why does the idea of the public deficit always return? Exploring key influences.

    PubMed

    Suldovsky, Brianne

    2016-05-01

    Despite mounting criticism, the deficit model remains an integral part of science communication research and practice. In this article, I advance three key factors that contribute to the idea of the public deficit in science communication, including the purpose of science communication, how communication processes and outcomes are conceptualized, and how science and scientific knowledge are defined. Affording science absolute epistemic privilege, I argue, is the most compelling factor contributing to the continued use of the deficit model. In addition, I contend that the deficit model plays a necessary, though not sufficient, role in science communication research and practice. Areas for future research are discussed. © The Author(s) 2016.

  20. Cyclic multiverses

    NASA Astrophysics Data System (ADS)

    Marosek, Konrad; Dąbrowski, Mariusz P.; Balcerzak, Adam

    2016-09-01

    Using the idea of regularization of singularities due to the variability of the fundamental constants in cosmology, we study cyclic universe models. We find two models of oscillating and non-singular mass density and pressure (`non-singular' bounce) regularized by a varying gravitational constant G, even though the scale factor evolution oscillates and has sharp turning points (`singular' bounce). Both violating (big-bang) and non-violating (phantom) null energy condition models appear. Then, we extend this idea onto the multiverse containing cyclic individual universes with either growing or decreasing entropy, though leaving the net entropy constant. In order to get an insight into the key idea, we consider the doubleverse with the same geometrical evolution of the two `parallel' universes, their physical evolution [physical coupling constants c(t) and G(t)] being different. An interesting point is that there is a possibility to exchange the universes at the point of maximum expansion, a fact which was already noticed in quantum cosmology. A similar scenario is also possible within the framework of Brans-Dicke theory, where varying G(t) is replaced by the dynamical Brans-Dicke field φ(t), though these theories are slightly different.

  1. Anticipatory dynamics of biological systems: from molecular quantum states to evolution

    NASA Astrophysics Data System (ADS)

    Igamberdiev, Abir U.

    2015-08-01

    Living systems possess anticipatory behaviour that is based on the flexibility of internal models generated by the system's embedded description. The idea was suggested by Aristotle and was explicitly introduced into theoretical biology by Rosen. The possibility of holding the embedded internal model is grounded in the principle of stable non-equilibrium (Bauer). From the quantum mechanical view, this principle aims to minimize energy dissipation at the expense of long relaxation times. The ideas of stable non-equilibrium were developed by Liberman, who viewed living systems as subdivided into the quantum regulator and the molecular computer supporting coherence of the regulator's internal quantum state. The computational power of the cell molecular computer is based on the possibility of molecular rearrangements according to molecular addresses. In evolution, the anticipatory strategies are realized both as a precession of phylogenesis by ontogenesis (Berg) and as the anticipatory search of genetic fixation of adaptive changes that incorporates them into the internal model of the genetic system. We discuss how the fundamental ideas of anticipation can be introduced into the basic foundations of theoretical biology.

  2. A Unique Digital Electrocardiographic Repository for the Development of Quantitative Electrocardiography and Cardiac Safety: The Telemetric and Holter ECG Warehouse (THEW)

    PubMed Central

    Couderc, Jean-Philippe

    2010-01-01

    The sharing of scientific data reinforces open scientific inquiry; it encourages diversity of analysis and opinion while promoting new research and facilitating the education of next generations of scientists. In this article, we present an initiative for the development of a repository containing continuous electrocardiographic information and its associated clinical information. This information is shared with the worldwide scientific community in order to improve quantitative electrocardiology and cardiac safety. First, we present the objectives of the initiative and its mission. Then, we describe the resources available in this initiative following three components: data, expertise and tools. The data available in the Telemetric and Holter ECG Warehouse (THEW) includes continuous ECG signals and associated clinical information. The initiative attracted various academic and private partners whose expertise covers a wide range of research areas related to quantitative electrocardiography; their contribution to the THEW promotes cross-fertilization of scientific knowledge, resources, and ideas that will advance the field of quantitative electrocardiography. Finally, the tools of the THEW include software and servers to access and review the data available in the repository. To conclude, the THEW is an initiative developed to benefit the scientific community and to advance the field of quantitative electrocardiography and cardiac safety. It is a new repository designed to complement existing ones such as Physionet, the AHA-BIH Arrhythmia Database, and the CSE database. The THEW hosts unique datasets from clinical trials and drug safety studies that, so far, were not available to the worldwide scientific community. PMID:20863512

  3. An Overview of NASA's Integrated Design and Engineering Analysis (IDEA) Environment

    NASA Technical Reports Server (NTRS)

    Robinson, Jeffrey S.

    2011-01-01

    Historically, the design of subsonic and supersonic aircraft has been divided into separate technical disciplines (such as propulsion, aerodynamics and structures), each of which performs design and analysis in relative isolation from others. This is possible, in most cases, either because the amount of interdisciplinary coupling is minimal, or because the interactions can be treated as linear. The design of hypersonic airbreathing vehicles, like NASA's X-43, is quite the opposite. Such systems are dominated by strong non-linear interactions between disciplines. The design of these systems demands that a multi-disciplinary approach be taken. Furthermore, increased analytical fidelity at the conceptual design phase is highly desirable, as many of the non-linearities are not captured by lower fidelity tools. Only when these systems are designed from a true multi-disciplinary perspective, can the real performance benefits be achieved and complete vehicle systems be fielded. Toward this end, the Vehicle Analysis Branch at NASA Langley Research Center has been developing the Integrated Design and Engineering Analysis (IDEA) Environment. IDEA is a collaborative environment for parametrically modeling conceptual and preliminary designs for launch vehicle and high speed atmospheric flight configurations using the Adaptive Modeling Language (AML) as the underlying framework. The environment integrates geometry, packaging, propulsion, trajectory, aerodynamics, aerothermodynamics, engine and airframe subsystem design, thermal and structural analysis, and vehicle closure into a generative, parametric, unified computational model where data is shared seamlessly between the different disciplines. Plans are also in place to incorporate life cycle analysis tools into the environment which will estimate vehicle operability, reliability and cost. 
IDEA is currently being funded by NASA's Hypersonics Project, a part of the Fundamental Aeronautics Program within the Aeronautics Research Mission Directorate. The environment is currently focused around a two-stage-to-orbit configuration with a turbine-based combined cycle (TBCC) first stage and a reusable rocket second stage. IDEA will be rolled out in generations, with each successive generation providing a significant increase in capability, either through increased analytic fidelity, expansion of vehicle classes considered, or by the inclusion of advanced modeling techniques. This paper provides the motivation behind the current effort, an overview of the development of the IDEA environment (including the contents and capabilities to be included in Generation 1 and Generation 2), and a description of the current status and details of future plans.

  4. Applying the Concepts of Innovation Strategies to Plastic Surgery

    PubMed Central

    Wang, Yirong; Kotsis, Sandra V.; Chung, Kevin C.

    2014-01-01

    Background: Plastic surgery has a well-known history of innovative procedures and products. However, with the rise in competition, such as aesthetic procedures being performed by other medical specialties, there is a need for continued innovation in plastic surgery to create novel treatments to advance this specialty. Although many articles introduce innovative technologies and procedures, there is a paucity of publications to highlight the application of principles of innovation in plastic surgery. Methods: We review the literature regarding business strategies for innovation. Results: We evaluate concepts of innovation, process of innovation (idea generation, idea evaluation, idea conversion, idea diffusion and adoption), ethical issues, and the application to plastic surgery. Conclusions: Adopting a business model of innovation is helpful to promote a new paradigm of progress to propel plastic surgery to new avenues of creativity. PMID:23897344

  5. An Assessment of the Quantitative Literacy of Undergraduate Students

    ERIC Educational Resources Information Center

    Wilkins, Jesse L. M.

    2016-01-01

    Quantitative literacy (QLT) represents an underlying higher-order construct that accounts for a person's willingness to engage in quantitative situations in everyday life. The purpose of this study is to retest the construct validity of a model of quantitative literacy (Wilkins, 2010). In this model, QLT represents a second-order factor that…

  6. Examining the Relationships Between Education, Social Networks and Democratic Support With ABM

    NASA Technical Reports Server (NTRS)

    Drucker, Nick; Campbell, Kenyth

    2011-01-01

    This paper introduces an agent-based model that explores the relationships between education, social networks, and support for democratic ideals. This study examines two factors that affect democratic support: education and social networks. Current theory concerning these two variables suggests that positive relationships exist between education and democratic support and between social networks and the spread of ideas. The model contains multiple variables of democratic support, two of which are evaluated through experimentation. The model allows individual entities within the system to make "decisions" about their democratic support independent of one another. The agent-based approach also allows entities to utilize their social networks to spread ideas. Current theory supports the experimentation results. In addition, these results show the model is capable of reproducing real-world outcomes. This paper addresses the model creation process and the experimentation procedure, as well as future research avenues and potential shortcomings of the model.
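
An agent-based model of this kind can be prototyped in very few lines. In the toy below (all parameters, functional forms and network choices are invented for illustration, not those of the paper), each agent starts from an education-correlated baseline level of democratic support and then drifts toward the mean support of its social-network neighbors.

```python
import random

def run_abm(n_agents=100, avg_degree=4, steps=50, social_weight=0.3, seed=1):
    """Toy ABM: democratic support drifts toward the mean support of each
    agent's neighbors on a random social network, starting from an
    education-correlated baseline. Illustrative only."""
    rng = random.Random(seed)
    education = [rng.random() for _ in range(n_agents)]
    support = [0.2 + 0.6 * e + rng.uniform(-0.1, 0.1) for e in education]
    # random (Erdos-Renyi-style) social network
    p = avg_degree / (n_agents - 1)
    nbrs = [[] for _ in range(n_agents)]
    for i in range(n_agents):
        for j in range(i + 1, n_agents):
            if rng.random() < p:
                nbrs[i].append(j)
                nbrs[j].append(i)
    for _ in range(steps):
        new = support[:]
        for i in range(n_agents):
            if nbrs[i]:
                peer = sum(support[j] for j in nbrs[i]) / len(nbrs[i])
                new[i] = (1 - social_weight) * support[i] + social_weight * peer
        support = new
    return support
```

Varying `social_weight` or the network density then probes the kind of education/network interactions the paper experiments with.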

  7. Towards Reproducible Descriptions of Neuronal Network Models

    PubMed Central

    Nordlie, Eilen; Gewaltig, Marc-Oliver; Plesser, Hans Ekkehard

    2009-01-01

    Progress in science depends on the effective exchange of ideas among scientists. New ideas can be assessed and criticized in a meaningful manner only if they are formulated precisely. This applies to simulation studies as well as to experiments and theories. But after more than 50 years of neuronal network simulations, we still lack a clear and common understanding of the role of computational models in neuroscience as well as established practices for describing network models in publications. This hinders the critical evaluation of network models as well as their re-use. We analyze here 14 research papers proposing neuronal network models of different complexity and find widely varying approaches to model descriptions, with regard to both the means of description and the ordering and placement of material. We further observe great variation in the graphical representation of networks and the notation used in equations. Based on our observations, we propose a good model description practice, composed of guidelines for the organization of publications, a checklist for model descriptions, templates for tables presenting model structure, and guidelines for diagrams of networks. The main purpose of this good practice is to trigger a debate about the communication of neuronal network models in a manner comprehensible to humans, as opposed to machine-readable model description languages. We believe that the good model description practice proposed here, together with a number of other recent initiatives on data-, model-, and software-sharing, may lead to a deeper and more fruitful exchange of ideas among computational neuroscientists in years to come. We further hope that work on standardized ways of describing—and thinking about—complex neuronal networks will lead the scientific community to a clearer understanding of high-level concepts in network dynamics, and will thus lead to deeper insights into the function of the brain. PMID:19662159

  8. Space-Time, Relativity, and Cosmology

    NASA Astrophysics Data System (ADS)

    Wudka, Jose

    2006-07-01

    Space-Time, Relativity and Cosmology provides a historical introduction to modern relativistic cosmology and traces its historical roots and evolution from antiquity to Einstein. The topics are presented in a non-mathematical manner, with the emphasis on the ideas that underlie each theory rather than their detailed quantitative consequences. A significant part of the book focuses on the Special and General theories of relativity. The tests and experimental evidence supporting the theories are explained together with their predictions and their confirmation. Other topics include a discussion of modern relativistic cosmology, the consequences of Hubble's observations leading to the Big Bang hypothesis, and an overview of the most exciting research topics in relativistic cosmology. This textbook is intended for introductory undergraduate courses on the foundations of modern physics. It is also accessible to advanced high school students, as well as non-science majors who are concerned with science issues.
    • Uses a historical perspective to describe the evolution of modern ideas about space and time
    • The main arguments are described using a completely non-mathematical approach
    • Ideal for physics undergraduates and high-school students, non-science majors and general readers

  9. Asymptotic analysis of discrete schemes for non-equilibrium radiation diffusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cui, Xia, E-mail: cui_xia@iapcm.ac.cn; Yuan, Guang-wei; Shen, Zhi-jun

    Motivated by providing well-behaved fully discrete schemes in practice, this paper extends the asymptotic analysis on time integration methods for non-equilibrium radiation diffusion in [2] to space discretizations. Therein, studies were carried out on a two-temperature model with Larsen's flux-limited diffusion operator, and both the implicitly balanced (IB) and linearly implicit (LI) methods were shown to be asymptotic-preserving. In this paper, we focus on asymptotic analysis for space discrete schemes in dimensions one and two. First, in construction of the schemes, in contrast to traditional first-order approximations, asymmetric second-order accurate spatial approximations are devised for flux-limiters on the boundary, and discrete schemes with second-order accuracy on the global spatial domain are acquired consequently. Then, by employing formal asymptotic analysis, the first-order asymptotic-preserving property for these schemes, and furthermore for the fully discrete schemes, is shown. Finally, with the help of manufactured solutions, numerical tests are performed, which demonstrate quantitatively that the fully discrete schemes with IB time evolution indeed have the accuracy and asymptotic convergence the theory predicts, and hence are well qualified for both non-equilibrium and equilibrium radiation diffusion.
    Highlights:
    • Provide AP fully discrete schemes for non-equilibrium radiation diffusion.
    • Propose second order accurate schemes by asymmetric approach for boundary flux-limiter.
    • Show first order AP property of spatially and fully discrete schemes with IB evolution.
    • Devise subtle artificial solutions; verify accuracy and AP property quantitatively.
    • Ideas can be generalized to 3-dimensional problems and higher order implicit schemes.
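
The flavor of the fully implicit updates discussed here can be seen on a much simpler model problem. The sketch below takes one backward-Euler step of linear 1-D diffusion with a tridiagonal (Thomas) solve; it is a generic illustration only, far simpler than the paper's two-temperature flux-limited system and its asymmetric boundary treatment.

```python
def implicit_diffusion_step(u, dt, dx, d=1.0):
    """One backward-Euler step of u_t = d * u_xx on a 1-D grid with
    zero-Dirichlet boundaries, solved with the Thomas algorithm."""
    n = len(u)
    r = d * dt / dx**2
    # tridiagonal system: -r*u_{i-1} + (1 + 2r)*u_i - r*u_{i+1} = u_i^old
    a = [-r] * n          # sub-diagonal
    b = [1 + 2 * r] * n   # diagonal
    c = [-r] * n          # super-diagonal
    cp = [0.0] * n
    dp = [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = u[0] / b[0]
    for i in range(1, n):  # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (u[i] - a[i] * dp[i - 1]) / m
    out = [0.0] * n
    out[-1] = dp[-1]
    for i in range(n - 2, -1, -1):  # back substitution
        out[i] = dp[i] - cp[i] * out[i + 1]
    return out
```

Unconditional stability and positivity of this linear implicit solve are the properties that make fully implicit evolution attractive for stiff radiation-diffusion problems.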

  10. Fidelity Failures in Brief Strategic Family Therapy for Adolescent Drug Abuse: A Clinical Analysis.

    PubMed

    Lebensohn-Chialvo, Florencia; Rohrbaugh, Michael J; Hasler, Brant P

    2018-04-30

    As evidence-based family treatments for adolescent substance use and conduct problems gain traction, cutting-edge research moves beyond randomized efficacy trials to address questions such as how these treatments work and how best to disseminate them to community settings. A key factor in effective dissemination is treatment fidelity, which refers to implementing an intervention in a manner consistent with an established manual. While most fidelity research is quantitative, this study offers a qualitative clinical analysis of fidelity failures in a large, multisite effectiveness trial of Brief Strategic Family Therapy (BSFT) for adolescent drug abuse, where BSFT developers trained community therapists to administer this intervention in their own agencies. Using case notes and video recordings of therapy sessions, an independent expert panel first rated 103 cases on quantitative fidelity scales grounded in the BSFT manual and the broader structural-strategic framework that informs BSFT intervention. Because fidelity was generally low, the panel reviewed all cases qualitatively to identify emergent types or categories of fidelity failure. Ten categories of failures emerged, characterized by therapist omissions (e.g., failure to engage key family members, failure to think in threes) and commissions (e.g., off-model, nonsystemic formulations/interventions). Of these, "failure to think in threes" appeared basic and particularly problematic, reflecting the central place of this idea in structural theory and therapy. Although subject to possible bias, our observations highlight likely stumbling blocks in exporting a complex family treatment like BSFT to community settings. These findings also underscore the importance of treatment fidelity in family therapy research. © 2018 Family Process Institute.

  11. A novel spectral imaging system for quantitative analysis of hypertrophic scar

    NASA Astrophysics Data System (ADS)

    Ghassemi, Pejhman; Shupp, Jeffrey W.; Moffatt, Lauren T.; Ramella-Roman, Jessica C.

    2013-03-01

    Scarring can lead to significant cosmetic, psychosocial, and functional consequences in patients with hypertrophic scars from burn and trauma injuries. Therefore, quantitative assessment of scar is needed in clinical diagnosis and treatment. The Vancouver Scar Scale (VSS), the accepted clinical scar assessment tool, was introduced in the nineties and relies solely on the physician's subjective evaluation of skin pliability, height, vascularity, and pigmentation. To date, no entirely objective method has been available for scar assessment, so there is a continued need for better techniques to monitor patients with scars. We introduce a new spectral imaging system combining out-of-plane Stokes polarimetry, Spatial Frequency Domain Imaging (SFDI), and three-dimensional (3D) reconstruction. The main idea behind this system is to estimate hemoglobin and melanin contents of scar using the SFDI technique, roughness and directional anisotropy features with Stokes polarimetry, and height and general shape with 3D reconstruction. Our proposed tool has several advantages compared to current methodologies. First and foremost, it is non-contact and non-invasive and thus can be used at any stage in wound healing without causing harm to the patient. Second, the height, pigmentation, and hemoglobin assessments are co-registered and are based on imaging rather than point measurement, allowing for more meaningful interpretation of the data. Finally, the algorithms used in the data analysis are physics-based, which will be very beneficial in the standardization of the technique. A swine model has also been developed for hypertrophic scarring, and an ongoing pre-clinical evaluation of the technique is being conducted.

  12. Fostering Students' Conceptual Knowledge in Biology in the Context of German National Education Standards

    NASA Astrophysics Data System (ADS)

    Förtsch, Christian; Dorfner, Tobias; Baumgartner, Julia; Werner, Sonja; von Kotzebue, Lena; Neuhaus, Birgit J.

    2018-04-01

    The German National Education Standards (NES) for biology were introduced in 2005. The content part of the NES emphasizes fostering conceptual knowledge. However, there are hardly any indications of what such an instructional implementation could look like. We introduce a theoretical framework for an instructional approach to fostering students' conceptual knowledge as demanded in the NES (Fostering Conceptual Knowledge), including instructional practices derived from research on single core ideas, general psychological theories, and biology-specific features of instructional quality. First, we aimed to develop a rating manual based on this theoretical framework. Second, we wanted to describe current German biology instruction according to this approach and to quantitatively analyze its effectiveness. Third, we aimed to provide qualitative examples of this approach to triangulate our findings. In a first step, we developed a theoretically devised rating manual to measure Fostering Conceptual Knowledge in videotaped lessons. Data for the quantitative analysis comprised 81 videotaped biology lessons from 28 biology teachers at different German secondary schools. Six hundred forty students completed a questionnaire on their situational interest after each lesson and an achievement test. Results from multilevel modeling showed significant positive effects of Fostering Conceptual Knowledge on students' achievement and situational interest. For the qualitative analysis, we contrasted the instruction of four teachers, two with high and two with low student achievement and situational interest, using the qualitative method of thematic analysis. The qualitative analysis revealed five main characteristics describing Fostering Conceptual Knowledge. Implementing Fostering Conceptual Knowledge in biology instruction therefore seems promising. Examples of how to implement Fostering Conceptual Knowledge in instruction are shown and discussed.

  13. Representation in development: from a model system to some general processes.

    PubMed

    Montuori, Luke M; Honey, Robert C

    2015-03-01

    The view that filial imprinting might serve as a useful model system for studying the neurobiological basis of memory was inspired, at least in part, by a simple idea: acquired filial preferences reflect the formation of a memory or representation of the imprinting object itself, as opposed to a change in the efficacy of stimulus-response pathways, for example. We provide a synthesis of the evidence that supports this idea and show that the processes of memory formation observed in filial imprinting find surprisingly close counterparts in other species, including our own. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Exponential vanishing of the ground-state gap of the quantum random energy model via adiabatic quantum computing

    NASA Astrophysics Data System (ADS)

    Adame, J.; Warzel, S.

    2015-11-01

    In this note, we use ideas of Farhi et al. [Int. J. Quantum. Inf. 6, 503 (2008) and Quantum Inf. Comput. 11, 840 (2011)], who link a lower bound on the run time of their quantum adiabatic search algorithm to an upper bound on the energy gap above the ground state of the generators of this algorithm. We apply these ideas to the quantum random energy model (QREM). Our main result is a simple proof of the conjectured exponential vanishing of the energy gap of the QREM.
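    The link this record invokes can be sketched schematically: the standard adiabatic condition ties the run time $T$ to the minimal spectral gap, so an exponentially small gap forces an exponentially long run time. The constants $C$ and $c$ here are schematic placeholders; the precise QREM gap bound is in the cited work, not derived here:

    ```latex
    % Adiabatic run time vs. spectral gap (schematic)
    T \;\gtrsim\; \frac{C}{\Delta_{\min}^{2}},
    \qquad
    \Delta_{\min}^{\mathrm{QREM}} \;\le\; e^{-cN}
    \;\Longrightarrow\;
    T \;\gtrsim\; C\, e^{2cN}.
    ```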

  15. Exponential vanishing of the ground-state gap of the quantum random energy model via adiabatic quantum computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adame, J.; Warzel, S., E-mail: warzel@ma.tum.de

    In this note, we use ideas of Farhi et al. [Int. J. Quantum. Inf. 6, 503 (2008) and Quantum Inf. Comput. 11, 840 (2011)], who link a lower bound on the run time of their quantum adiabatic search algorithm to an upper bound on the energy gap above the ground state of the generators of this algorithm. We apply these ideas to the quantum random energy model (QREM). Our main result is a simple proof of the conjectured exponential vanishing of the energy gap of the QREM.

  16. Teaching medical students about fair distribution of healthcare resources

    PubMed Central

    Leget, C; Hoedemaekers, R

    2007-01-01

    Healthcare package decisions are complex. Different judgements about effectiveness, cost-effectiveness and disease burden influence the decision-making process. Moreover, different concepts of justice generate different ideas about fair distribution of healthcare resources. This paper presents a decision model used in medical school to familiarise medical students with the different concepts of justice and the ethical dimension of making concrete choices. The model is based on the four-stage decision model developed in the Netherlands by the Dunning Committee and the discussion that followed its presentation in 1991. Working through 10 medical services with the model, students learn to discern and integrate four different ideas of distributive justice, organised in a flow chart: libertarian, communitarian, egalitarian and utilitarian. PMID:18055907

  17. Teaching medical students about fair distribution of healthcare resources.

    PubMed

    Leget, C; Hoedemaekers, R

    2007-12-01

    Healthcare package decisions are complex. Different judgements about effectiveness, cost-effectiveness and disease burden influence the decision-making process. Moreover, different concepts of justice generate different ideas about fair distribution of healthcare resources. This paper presents a decision model used in medical school to familiarise medical students with the different concepts of justice and the ethical dimension of making concrete choices. The model is based on the four-stage decision model developed in the Netherlands by the Dunning Committee and the discussion that followed its presentation in 1991. Working through 10 medical services with the model, students learn to discern and integrate four different ideas of distributive justice, organised in a flow chart: libertarian, communitarian, egalitarian and utilitarian.

  18. Beyond the "History of Ideas": The Issue of the "Ideological Origins of the Revolutions of Independence" Revisited.

    PubMed

    Palti, Elías

    2018-01-01

    This paper analyzes how Latin American historiography has addressed the issue of "the ideological origins of the revolution of independence," and how the formulation of that topic implies assumptions proper to the tradition of the history of ideas and leads to anachronistic conceptual transpositions. Halperín Donghi's work models a different approach, illuminating how a series of meaningful torsions within traditional languages provided the ideological framework for a result incompatible with those languages. This paradox forces a break with the frameworks of the history of ideas and the set of antinomies intrinsic to them, such as that between "tradition" and "modernity."

  19. The Revised Inventory of the Dimensions of Emerging Adulthood (IDEA-R) and Substance Use Among College Students.

    PubMed

    Allem, Jon-Patrick; Sussman, Steve; Unger, Jennifer B

    2017-12-01

    Transition-to-adulthood themes, or thoughts and feelings about emerging adulthood, have been measured by the Inventory of the Dimensions of Emerging Adulthood (IDEA) and found to be associated with substance use among emerging adults. It has been suggested, however, that the IDEA is lengthy and may not include the most unique and theoretically relevant constructs of emerging adulthood. The Revised Inventory of the Dimensions of Emerging Adulthood (IDEA-R) was developed as an alternative instrument, but research has yet to determine the relationship between the IDEA-R and substance use among emerging adults (ages 18-25 years). College students completed surveys indicating their identification with transition-to-adulthood themes and substance use. Logistic regression models examined the associations between transition-to-adulthood themes and marijuana use and binge drinking, respectively. Participants who felt emerging adulthood was a time of identity exploration were less likely to report marijuana use, while feelings of experimentation/possibility were positively associated with marijuana use and binge drinking. The IDEA-R may be useful for identifying correlates of substance use among emerging adults. Future research should evaluate the IDEA-R among representative samples of emerging adults to confirm the findings of this study. Health professionals working in substance use prevention may consider targeting the themes of identity exploration and experimentation/possibility in programs intended for emerging adults.
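    The regression step this record describes can be illustrated in toy form. A minimal logistic-regression sketch on synthetic data, fit by plain gradient ascent; the variable names, coefficients, and data below are assumptions for illustration, not the study's data or code:

    ```python
    import numpy as np

    # Hypothetical sketch: relate a single standardized transition-to-adulthood
    # theme score (e.g. experimentation/possibility) to a binary substance-use
    # outcome via logistic regression. All numbers are simulated.
    rng = np.random.default_rng(0)
    n = 500
    theme = rng.normal(size=n)                       # standardized theme score
    logit = -0.5 + 0.8 * theme                       # assumed true coefficients
    used = rng.random(n) < 1 / (1 + np.exp(-logit))  # binary use outcome

    X = np.column_stack([np.ones(n), theme])         # intercept + predictor
    beta = np.zeros(2)
    for _ in range(2000):                            # gradient ascent on the log-likelihood
        p = 1 / (1 + np.exp(-X @ beta))
        beta += 0.01 * X.T @ (used - p) / n

    odds_ratio = np.exp(beta[1])                     # > 1 indicates a positive association
    ```

    An odds ratio above 1 corresponds to the positive association reported for experimentation/possibility, and one below 1 to the negative association reported for identity exploration.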

  20. Secondary Students' Thinking about Familiar Phenomena: Learners' Explanations from a Curriculum Context Where "Particles" Is a Key Idea for Organising Teaching and Learning

    ERIC Educational Resources Information Center

    Garcia Franco, Alejandra; Taber, Keith S.

    2009-01-01

    Particle models of matter are widely recognised as being of fundamental importance in many branches of modern science, and particle ideas are commonly introduced and developed in the secondary school curriculum. However, research undertaken in a range of national contexts has identified significant learning difficulties in this topic, and suggests…
