Science.gov

Sample records for accounting approach based

  1. Accounting Control Technology Using SAP: A Case-Based Approach

    ERIC Educational Resources Information Center

    Ragan, Joseph; Puccio, Christopher; Talisesky, Brandon

    2014-01-01

    The Sarbanes-Oxley Act (SOX) revolutionized the accounting and audit industry. The use of preventative and process controls to evaluate the continuous audit process done via an SAP ERP ECC 6.0 system is key to compliance with SOX and managing costs. This paper can be used in a variety of ways to discuss issues associated with auditing and testing…

  2. A Simple Regression-based Approach to Account for Survival Bias in Birth Outcomes Research.

    PubMed

    Tchetgen Tchetgen, Eric J; Phiri, Kelesitse; Shapiro, Roger

    2015-07-01

    In perinatal epidemiology, birth outcomes such as small for gestational age (SGA) may not be observed for a pregnancy ending with a stillbirth. It is then said that SGA is truncated by stillbirth, which may give rise to survival bias when evaluating the effects on SGA of an exposure known also to influence the risk of a stillbirth. In this article, we consider the causal effects of maternal infection with human immunodeficiency virus (HIV) on the risk of SGA, in a sample of pregnant women in Botswana. We hypothesize that previously estimated effects of HIV on SGA may be understated because they fail to appropriately account for the over-representation of live births among HIV negative mothers, relative to HIV positive mothers. A simple yet novel regression-based approach is proposed to adjust effect estimates for survival bias for an outcome that is either continuous or binary. Under certain straightforward assumptions, the approach produces an estimate that may be interpreted as the survivor average causal effect of maternal HIV, that is, the average effect of maternal HIV on SGA among births that would be live irrespective of maternal HIV status. The approach is particularly appealing because it recovers an exposure effect which is robust to survival bias, even if the association between the risk of SGA and that of a stillbirth cannot be completely explained by adjusting for observed shared risk factors. The approach also gives a formal statistical test of the null hypothesis of no survival bias in the regression framework.
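The survival-bias mechanism the authors describe can be illustrated with a small simulation. This is not the paper's estimator; the coefficients, sample size, and the use of a continuous growth outcome are all invented for illustration. An unmeasured factor raises both stillbirth risk and growth restriction, so restricting the analysis to live births attenuates the naive exposure effect:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

hiv = rng.binomial(1, 0.3, n)   # exposure indicator
u = rng.normal(size=n)          # unmeasured risk factor shared by both outcomes

# Stillbirth risk rises with both HIV and the shared factor (illustrative numbers).
p_still = 1.0 / (1.0 + np.exp(-(-3.0 + 1.0 * hiv + 1.5 * u)))
live = rng.random(n) > p_still

# Continuous growth outcome with a true exposure effect of -0.5.
y = -0.5 * hiv - 0.8 * u + rng.normal(size=n)

# Naive analysis restricted to live births understates the effect, because
# survival selects harder against high-u pregnancies in the exposed group.
naive = y[live & (hiv == 1)].mean() - y[live & (hiv == 0)].mean()
print(round(naive, 2))
```

Because stillbirth removes more high-risk pregnancies from the HIV-positive group, the surviving exposed births look healthier than they should, which is exactly the understatement the abstract hypothesizes.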

  3. Greenhouse Gas Emissions Accounting of Urban Residential Consumption: A Household Survey Based Approach

    PubMed Central

    Lin, Tao; Yu, Yunjun; Bai, Xuemei; Feng, Ling; Wang, Jin

    2013-01-01

    Devising policies for a low carbon city requires a careful understanding of the characteristics of urban residential lifestyle and consumption. The production-based accounting approach based on top-down statistical data has a limited ability to reflect the total greenhouse gas (GHG) emissions from residential consumption. In this paper, we present a survey-based GHG emissions accounting methodology for urban residential consumption, and apply it in Xiamen City, a rapidly urbanizing coastal city in southeast China. On this basis, the main influencing factors determining residential GHG emissions at the household and community scales are identified, and the typical profiles of low, medium and high GHG emission households and communities are characterized. Up to 70% of household GHG emissions come from regional and national activities that support household consumption, including the supply of energy and building materials, while 17% come from urban-level basic services and supplies such as sewage treatment and solid waste management, and only 13% are direct emissions from household consumption. Housing area and household size are the two main factors determining GHG emissions from residential consumption at the household scale, while average housing area and building height are the main factors at the community scale. Our results show a large disparity in GHG emissions profiles among different households, with high GHG emissions households emitting about five times more than low GHG emissions households. Emissions from high GHG emissions communities are about twice as high as those from low GHG emissions communities. Our findings can contribute to better tailored and targeted policies aimed at reducing household GHG emissions and developing low GHG emissions residential communities in China. PMID:23405187

  4. Greenhouse gas emissions accounting of urban residential consumption: a household survey based approach.

    PubMed

    Lin, Tao; Yu, Yunjun; Bai, Xuemei; Feng, Ling; Wang, Jin

    2013-01-01

    Devising policies for a low carbon city requires a careful understanding of the characteristics of urban residential lifestyle and consumption. The production-based accounting approach based on top-down statistical data has a limited ability to reflect the total greenhouse gas (GHG) emissions from residential consumption. In this paper, we present a survey-based GHG emissions accounting methodology for urban residential consumption, and apply it in Xiamen City, a rapidly urbanizing coastal city in southeast China. On this basis, the main influencing factors determining residential GHG emissions at the household and community scales are identified, and the typical profiles of low, medium and high GHG emission households and communities are characterized. Up to 70% of household GHG emissions come from regional and national activities that support household consumption, including the supply of energy and building materials, while 17% come from urban-level basic services and supplies such as sewage treatment and solid waste management, and only 13% are direct emissions from household consumption. Housing area and household size are the two main factors determining GHG emissions from residential consumption at the household scale, while average housing area and building height are the main factors at the community scale. Our results show a large disparity in GHG emissions profiles among different households, with high GHG emissions households emitting about five times more than low GHG emissions households. Emissions from high GHG emissions communities are about twice as high as those from low GHG emissions communities. Our findings can contribute to better tailored and targeted policies aimed at reducing household GHG emissions and developing low GHG emissions residential communities in China.
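As a toy illustration of the reported breakdown, the three shares from the abstract (70% regional/national, 17% urban services, 13% direct) can be applied to a household's total footprint. The function name and the 10 t CO2e example total are hypothetical, not values from the study:

```python
# Shares reported in the abstract: indirect regional/national activities,
# urban-level services, and direct household emissions.
SHARES = {"regional_national": 0.70, "urban_services": 0.17, "direct": 0.13}

def split_household_emissions(total_t_co2e: float) -> dict:
    """Split a household's annual footprint (t CO2e) across the three scopes."""
    return {scope: round(total_t_co2e * s, 2) for scope, s in SHARES.items()}

print(split_household_emissions(10.0))
```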

  5. Examining air pollution in China using production- and consumption-based emissions accounting approaches.

    PubMed

    Huo, Hong; Zhang, Qiang; Guan, Dabo; Su, Xin; Zhao, Hongyan; He, Kebin

    2014-12-16

    Two important reasons for China's air pollution are the high emission factors (emission per unit of product) of pollution sources and the high emission intensity (emissions per unit of GDP) of the industrial structure. Therefore, a wide variety of policy measures, including both emission abatement technologies and economic adjustment, must be implemented. To support such measures, this study used the production- and consumption-based emissions accounting approaches to simulate the SO2, NOx, PM2.5, and VOC emissions flows among producers and consumers. This study analyzed the emissions and GDP performance of 36 production sectors. The results showed that the equipment, machinery, and devices manufacturing and construction sectors contributed more than 50% of air pollutant emissions, and most of their products were used for capital formation and export. The service sector had the lowest emission intensities, and its output was mainly consumed by households and the government. In China, the emission intensities of production activities triggered by capital formation and export were approximately twice that of the service sector triggered by final consumption expenditure. This study suggests that China should control air pollution using the following strategies: applying end-of-pipe abatement technologies and using cleaner fuels to further decrease the emission factors associated with rural cooking, electricity generation, and the transportation sector; continuing to limit highly emission-intensive but low value-added exports; developing a plan to reduce construction activities; and increasing the proportion of service GDP in the national economy.
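Consumption-based accounting of the kind used in this study is typically built on an environmentally extended input-output (Leontief) model: direct emission intensities are propagated through the inter-sector supply chain to the final demand that triggers them. A minimal sketch with an invented three-sector economy (the coefficients below are made up, not the study's data):

```python
import numpy as np

# Invented 3-sector economy: manufacturing, construction, services.
A = np.array([[0.20, 0.30, 0.10],   # technical coefficients: inputs per unit output
              [0.10, 0.10, 0.00],
              [0.10, 0.20, 0.20]])
f = np.array([8.0, 6.0, 1.0])       # direct emissions per unit of sector output
y = np.array([100.0, 50.0, 200.0])  # final demand by sector

L = np.linalg.inv(np.eye(3) - A)    # Leontief inverse
x = L @ y                           # gross output required to meet final demand

production_based = f * x            # emissions attributed to the producing sector
consumption_based = (f @ L) * y     # emissions embodied in each final-demand bundle

# Both perspectives must account for the same national total.
print(production_based.sum(), consumption_based.sum())
```

The two vectors allocate the same total differently: to producers in the first case, to the final consumers whose demand drives production in the second.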

  6. Reality-Based Learning and Interdisciplinary Teams: An Interactive Approach Integrating Accounting and Engineering Technology.

    ERIC Educational Resources Information Center

    Rogers, Robert L.; Stemkoski, Michael J.

    This paper describes a reality-based learning project in which sophomore accounting and engineering students collaborated in interdisciplinary teams to design and build a million-dollar waterslide park. Two weeks into the project, the teams received a briefing from an industrial panel of engineers, bankers, entrepreneurs, and other professionals.…

  7. Building Student Success Using Problem-Based Learning Approach in the Accounting Classroom

    ERIC Educational Resources Information Center

    Shawver, Todd A.

    2015-01-01

    A major area of concern in academia is that of student retention at the university, college, and departmental levels. As academics, there is a considerable amount that we can do to improve student retention, and reduce the attrition rates in our departments. One way to solve this is to take an innovative approach in the classroom to enhance the…

  8. Accounting for selection bias in species distribution models: An econometric approach on forested trees based on structural modeling

    NASA Astrophysics Data System (ADS)

    Ay, Jean-Sauveur; Guillemot, Joannès; Martin-StPaul, Nicolas K.; Doyen, Luc; Leadley, Paul

    2015-04-01

    Species distribution models (SDMs) are widely used to study and predict the outcome of global change on species. In human-dominated ecosystems the presence of a given species is the result of both its ecological suitability and the human footprint on nature, such as land use choices. Land use choices may thus be responsible for a selection bias in the presence/absence data used in SDM calibration. We present a structural modelling approach (i.e. based on structural equation modelling) that accounts for this selection bias. The new structural species distribution model (SSDM) estimates simultaneously land use choices and species responses to bioclimatic variables. A land use equation based on an econometric model of landowner choices was joined to an equation of species response to bioclimatic variables. The SSDM allows the residuals of both equations to be dependent, taking into account the possibility of shared omitted variables and measurement errors. We provide a general description of the statistical theory and a set of applications on forested trees over France using databases of climate and forest inventory at different spatial resolutions (from 2 km to 8 km). We also compared the output of the SSDM with outputs of a classical SDM in terms of bioclimatic response curves and potential distribution under current climate. Depending on the species and the spatial resolution of the calibration dataset, the shapes of the bioclimatic response curves and the modelled species distribution maps differed markedly between the SSDM and classical SDMs. The magnitude and direction of these differences depended on the correlations between the errors from both equations and were highest at higher spatial resolutions. A first conclusion is that the use of classical SDMs can potentially lead to strong misestimation of the actual and future probability of presence modelled. Beyond this selection bias, the SSDM we propose represents a crucial step to account for economic constraints on tree…
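The selection-bias story — land use choices correlated with the unobserved drivers of species response — can be mimicked with a small simulation. This is not the authors' SSDM estimator; it only shows how correlated errors between a land-use equation and a species-response equation bias a naive fit restricted to forested plots (all parameters are invented):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
climate = rng.normal(size=n)

# Shared omitted variables: correlated disturbances across the two equations.
e_land, e_sp = rng.multivariate_normal([0.0, 0.0],
                                       [[1.0, 0.6], [0.6, 1.0]], n).T

# Land-use equation: the plot stays forested (and yields data) if this index > 0.
forested = (0.8 * climate + e_land) > 0

# Species-response equation: the true climate effect on suitability is 1.0.
suitability = 1.0 * climate + e_sp

true_slope = np.polyfit(climate, suitability, 1)[0]
naive_slope = np.polyfit(climate[forested], suitability[forested], 1)[0]
print(round(true_slope, 2), round(naive_slope, 2))  # naive fit is biased downward
```

When the residual correlation is set to zero the two slopes coincide, which mirrors the paper's finding that the size of the divergence tracks the error correlation.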

  9. Starworld: Preparing Accountants for the Future: A Case-Based Approach to Teach International Financial Reporting Standards Using ERP Software

    ERIC Educational Resources Information Center

    Ragan, Joseph M.; Savino, Christopher J.; Parashac, Paul; Hosler, Jonathan C.

    2010-01-01

    International Financial Reporting Standards now constitute an important part of educating young professional accountants. This paper looks at a case based process to teach International Financial Reporting Standards using integrated Enterprise Resource Planning software. The case contained within the paper can be used within a variety of courses…

  10. Integrated Approach to User Account Management

    NASA Technical Reports Server (NTRS)

    Kesselman, Glenn; Smith, William

    2007-01-01

    IT environments consist of both Windows and other platforms. Providing user account management for this model has become increasingly difficult. If Microsoft's Active Directory could be enhanced to extend a Windows identity for authentication services for Unix, Linux, Java and Macintosh systems, then an integrated approach to user account management could be realized.

  11. Assessing Students' Accounting Knowledge: A Structural Approach.

    ERIC Educational Resources Information Center

    Boldt, Margaret N.

    2001-01-01

    Comparisons of students' representations of financial accounting concepts with the knowledge structures of experts were depicted using Pathfinder networks. This structural approach identified the level of students' understanding of concepts and knowledge gaps that need to be addressed. (SK)

  12. School Centered Evidence Based Accountability

    ERIC Educational Resources Information Center

    Milligan, Charles

    2015-01-01

    Achievement scores drive much of the effort in today's accountability system; however, there is much more that occurs in every school, every day. School Centered Evidence Based Accountability can be used from micro to macro, giving school boards and administrations a process for monitoring the results of the entire school operation effectively and…

  13. Approaches to accountability in long-term care.

    PubMed

    Berta, Whitney; Laporte, Audrey; Wodchis, Walter P

    2014-09-01

    This paper discusses the array of approaches to accountability in Ontario long-term care (LTC) homes. A focus group involving key informants from the LTC industry, including both for-profit and not-for-profit nursing home owners/operators, was used to identify stakeholders involved in formulating and implementing LTC accountability approaches and the relevant regulations, policies and initiatives relating to accountability in the LTC sector. These documents were then systematically reviewed. We found that the dominant mechanisms have been financial incentives and oversight, regulations and information; professionalism has played a minor role. More recently, measurement for accountability in LTC has grown to encompass an array of fiscal, clinical and public accountability measurement mechanisms. The goals of improved quality and accountability are likely more achievable using these historical regulatory approaches, but the recent rapid increase in data and measurability could also enable judicious application of market-based approaches.

  14. Approaches to Accountability in Long-Term Care

    PubMed Central

    Berta, Whitney; Laporte, Audrey; Wodchis, Walter P.

    2014-01-01

    This paper discusses the array of approaches to accountability in Ontario long-term care (LTC) homes. A focus group involving key informants from the LTC industry, including both for-profit and not-for-profit nursing home owners/operators, was used to identify stakeholders involved in formulating and implementing LTC accountability approaches and the relevant regulations, policies and initiatives relating to accountability in the LTC sector. These documents were then systematically reviewed. We found that the dominant mechanisms have been financial incentives and oversight, regulations and information; professionalism has played a minor role. More recently, measurement for accountability in LTC has grown to encompass an array of fiscal, clinical and public accountability measurement mechanisms. The goals of improved quality and accountability are likely more achievable using these historical regulatory approaches, but the recent rapid increase in data and measurability could also enable judicious application of market-based approaches. PMID:25305396

  15. Combining accounting approaches to practice valuation.

    PubMed

    Schwartzben, D; Finkler, S A

    1998-06-01

    Healthcare organizations that wish to acquire physician or ambulatory care practices can choose from a variety of practice valuation approaches. Basic accounting methods assess the value of a physician practice on the basis of a historical, balance-sheet description of tangible assets. Yet these methods alone are inadequate to determine the true financial value of a practice. By using a combination of accounting approaches to practice valuation that consider factors such as fair market value, opportunity cost, and discounted cash flow over a defined time period, organizations can more accurately assess a practice's actual value.
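Of the valuation approaches mentioned, discounted cash flow is the most mechanical: projected net cash flows are discounted back to a present value. A minimal sketch with invented cash flows and an invented discount rate (not figures from the article):

```python
# Hypothetical DCF component of a practice valuation.
def dcf_value(cash_flows, rate):
    """Present value of projected annual net cash flows (years 1..n)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Three years of projected net cash flow, discounted at 10%.
value = dcf_value([120_000, 130_000, 140_000], rate=0.10)
print(round(value))  # → 321713
```

In practice this present value would be combined with balance-sheet tangibles, fair market comparables, and opportunity-cost considerations, as the abstract suggests.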

  16. Teaching Financial Accounting via a Worksheet Approach.

    ERIC Educational Resources Information Center

    Vincent, Vern C.; Dietz, Elizabeth M.

    A classroom research study investigated the effectiveness of an approach to financial accounting instruction that uses worksheets to bring together the conceptual and practical aspects of the field. Students were divided into two groups, one taught by traditional lecture method and the other taught with worksheet exercises and lectures stressing…

  17. Accounting Ethics Education: An Interactive Approach

    ERIC Educational Resources Information Center

    White, Gwendolen B.

    2004-01-01

    An interactive and technological approach was used to discuss ethics with accounting students. Students responded anonymously to ethics questions using wireless transmitters. The students' responses were shown to the group. A customized DVD of movie scenes from "The Producers" and "Wall Street" and a still picture of Enron's…

  18. Restorative Justice as Strength-Based Accountability

    ERIC Educational Resources Information Center

    Ball, Robert

    2003-01-01

    This article compares strength-based and restorative justice philosophies for young people and their families. Restorative justice provides ways to respond to crime and harm that establish accountability while seeking to reconcile members of a community. Restorative approaches are an important subset of strength-based interventions.

  19. Teaching the Indirect Method of the Statement of Cash Flows in Introductory Financial Accounting: A Comprehensive, Problem-Based Approach

    ERIC Educational Resources Information Center

    Brickner, Daniel R.; McCombs, Gary B.

    2004-01-01

    In this article, the authors provide an instructional resource for presenting the indirect method of the statement of cash flows (SCF) in an introductory financial accounting course. The authors focus primarily on presenting a comprehensive example that illustrates the "why" of SCF preparation and show how journal entries and T-accounts can be…

  20. Storage-based approaches to build floodplain inundation modelling capability in river system models for water resources planning and accounting

    NASA Astrophysics Data System (ADS)

    Dutta, Dushmanta; Teng, Jin; Vaze, Jai; Lerat, Julien; Hughes, Justin; Marvanek, Steve

    2013-11-01

    We develop two innovative approaches for floodplain modelling in river system models. The two approaches can estimate floodplain fluxes and stores at the river-reach scale. The performance of the second approach is equivalent to that of a hydrodynamic model, and it is suitable for rapid inundation estimates at high spatial resolution. These developments enable river system models to improve environmental flow modelling.
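A storage-based floodplain scheme of the general kind described can be sketched as a simple conceptual water balance per reach. The bankfull threshold, the linear return-flow coefficient, and the inflow series below are all invented, not the authors' model:

```python
def route_reach(inflows, bankfull=100.0, k=0.3):
    """Route a flow series through one reach with a conceptual floodplain store.

    Flow above `bankfull` spills into the store; a fraction `k` of the store
    returns to the river each step. All parameters are illustrative.
    """
    storage, outflows = 0.0, []
    for q in inflows:
        overbank = max(q - bankfull, 0.0)   # river -> floodplain flux
        storage += overbank
        ret = k * storage                   # floodplain -> river return flow
        storage -= ret
        outflows.append(min(q, bankfull) + ret)
    return outflows, storage

flows, final_store = route_reach([80.0, 150.0, 120.0, 60.0, 40.0])
print([round(f, 2) for f in flows], round(final_store, 2))
```

Water is conserved by construction: the routed outflows plus the water left in the floodplain store equal the total inflow, which is the property a river-system accounting model needs.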

  1. A new approach to account for the medium-dependent effect in model-based dose calculations for kilovoltage x-rays.

    PubMed

    Pawlowski, Jason M; Ding, George X

    2011-07-07

    This study presents a new approach to accurately account for the medium-dependent effect in model-based dose calculations for kilovoltage (kV) x-rays. This approach is based on the hypothesis that the correction factors needed to convert dose from model-based dose calculations to absorbed dose-to-medium depend on both the attenuation characteristics of the absorbing media and the changes to the energy spectrum of the incident x-rays as they traverse media with an effective atomic number different than that of water. Using Monte Carlo simulation techniques, we obtained empirical medium-dependent correction factors that take both effects into account. We found that the correction factors can be expressed as a function of a single quantity, called the effective bone depth, which is a measure of the amount of bone that an x-ray beam must penetrate to reach a voxel. Since the effective bone depth can be calculated from volumetric patient CT images, the medium-dependent correction factors can be obtained for model-based dose calculations based on patient CT images. We tested the accuracy of this new approach on 14 patients for the case of calculating imaging dose from kilovoltage cone-beam computed tomography used for patient setup in radiotherapy, and compared it with the Monte Carlo method, which is regarded as the 'gold standard'. For all patients studied, the new approach resulted in mean dose errors of less than 3%. This is in contrast to current available inhomogeneity corrected methods, which have been shown to result in mean errors of up to -103% for bone and 8% for soft tissue. Since there is a huge gain in the calculation speed relative to the Monte Carlo method (∼two orders of magnitude) with an acceptable loss of accuracy, this approach provides an alternative accurate dose calculation method for kV x-rays.
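The ray-summed "effective bone depth" idea can be sketched as follows. Note that the correction-factor curve here is a placeholder monotone function purely for illustration, not the empirical Monte Carlo-derived factors from the paper:

```python
import numpy as np

def effective_bone_depth(bone_mask_along_ray, voxel_mm=1.0):
    """Bone path length (mm) the beam crosses before reaching the voxel of interest."""
    return float(np.sum(bone_mask_along_ray)) * voxel_mm

def correction_factor(depth_mm, k=0.02):
    """Placeholder monotone curve; the paper derives its factors from Monte Carlo."""
    return 1.0 + k * depth_mm

# Segmented voxels along one ray from the source to the dose point (1 = bone).
ray = np.array([0, 0, 1, 1, 1, 0, 1, 0])
d = effective_bone_depth(ray, voxel_mm=2.0)
print(d, correction_factor(d))  # → 8.0 1.16
```

The point of the paper's parameterization is precisely this reduction: once the per-voxel correction depends only on a scalar ray-sum over the CT volume, it can be applied at a small fraction of Monte Carlo's cost.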

  2. Bookkeeping and Accounting: The "Time" Approach to Teaching Accounting

    ERIC Educational Resources Information Center

    Mallue, Henry E., Jr.

    1977-01-01

    Describes the "time" approach, a non-traditional method for teaching Bookkeeping I, which redirects the general climate of the first week of class by not introducing crucial balance sheet and journal concepts, but makes use of sections 441 and 446 of the Internal Revenue Code, thereby permitting students to learn the important role "time"…

  3. The Effects of Different Teaching Approaches in Introductory Financial Accounting

    ERIC Educational Resources Information Center

    Chiang, Bea; Nouri, Hossein; Samanta, Subarna

    2014-01-01

    The purpose of the research is to examine the effect of the two different teaching approaches in the first accounting course on student performance in a subsequent finance course. The study compares 128 accounting and finance students who took introductory financial accounting by either a user approach or a traditional preparer approach to examine…

  4. The Cyclical Relationship Approach in Teaching Basic Accounting Principles.

    ERIC Educational Resources Information Center

    Golen, Steven

    1981-01-01

    Shows how teachers can provide a more meaningful presentation of various accounting principles by illustrating them through a cyclical relationship approach. Thus, the students see the entire accounting relationship as a result of doing business. (CT)

  5. Account-based plans curb costs and engage employees.

    PubMed

    Osterndorf, Dave

    2006-01-01

    A growing number of organizations are combining consumer-driven health plans with account-based approaches in order to limit health benefit costs, reinforce key consumerist messages and provide meaningful benefits to both actives and retirees. This article describes how account-based approaches work and can be used to motivate employees to invest in their health today and salt away funds for tomorrow. The author describes what employers can do to ensure that consumer-driven health plans and account-based approaches help employees accomplish their goals.

  6. Enhanced Student Learning in Accounting Utilising Web-Based Technology, Peer-Review Feedback and Reflective Practices: A Learning Community Approach to Assessment

    ERIC Educational Resources Information Center

    Taylor, Sue; Ryan, Mary; Pearce, Jon

    2015-01-01

    Higher education is becoming a major driver of economic competitiveness in an increasingly knowledge-driven global economy. Maintaining the competitive edge has seen an increase in public accountability of higher education institutions through the mechanism of ranking universities based on the quality of their teaching and learning outcomes. As a…

  7. Risk-Based Educational Accountability in Dutch Primary Education

    ERIC Educational Resources Information Center

    Timmermans, A. C.; de Wolf, I. F.; Bosker, R. J.; Doolaard, S.

    2015-01-01

    A recent development in educational accountability is a risk-based approach, in which intensity and frequency of school inspections vary across schools to make educational accountability more efficient and effective by enabling inspectorates to focus on organizations at risk. Characteristics relevant in predicting which schools are "at risk…

  8. Students' Approaches to Study in Introductory Accounting Courses

    ERIC Educational Resources Information Center

    Elias, Rafik Z.

    2005-01-01

    Significant education research has focused on the study approaches of students. Two study approaches have been clearly identified: deep and surface. In this study, the author examined the way in which students approach studying introductory accounting courses. In general, he found that GPA and expected course grade were correlated positively with…

  9. Students' Approaches to Learning in Problem-Based Learning: Taking into Account Professional Behavior in the Tutorial Groups, Self-Study Time, and Different Assessment Aspects

    ERIC Educational Resources Information Center

    Loyens, Sofie M. M.; Gijbels, David; Coertjens, Liesje; Cote, Daniel J.

    2013-01-01

    Problem-based learning (PBL) represents a major development in higher educational practice and is believed to promote deep learning in students. However, empirical findings on the promotion of deep learning in PBL remain unclear. The aim of the present study is to investigate the relationships between students' approaches to learning (SAL) and…

  10. Problem-Based Learning in Accounting

    ERIC Educational Resources Information Center

    Dockter, DuWayne L.

    2012-01-01

    Seasoned educators use an assortment of student-centered methods and tools to enhance their student's learning environment. In respects to methodologies used in accounting, educators have utilized and created new forms of problem-based learning exercises, including case studies, simulations, and other projects, to help students become more active…

  11. Standards-Based Accountability in South Africa

    ERIC Educational Resources Information Center

    Taylor, Nick

    2009-01-01

    The implementation of standards-based accountability (SBA) interventions aimed at improving school performance often focuses on the testing component, at the expense of capacity building. This was the case in South Africa when a SBA programme was instituted by government in 2000, which was accompanied by substantial rises in senior certificate…

  12. The Approaches to Studying of Portuguese Students of Introductory Accounting

    ERIC Educational Resources Information Center

    Teixeira, Cláudia; Gomes, Delfina; Borges, Janete

    2013-01-01

    The focus of this paper is an investigation into the approaches to studying of Portuguese students of introductory accounting using the short version of the ASSIST instrument. In doing so, it also examined the impact upon the strategy adopted of the discipline area of students and gender. The results validate the use of the inventory with students…

  13. Intergovernmental Approaches for Strengthening K-12 Accountability Systems

    ERIC Educational Resources Information Center

    Armour-Garb, Allison, Ed.

    2007-01-01

    This volume contains an edited transcript of the Rockefeller Institute's October 29, 2007 symposium (Chicago, IL) entitled "Intergovernmental Approaches to Strengthen K-12 Accountability Systems" as well as a framework paper circulated in preparation for the symposium. The transcript begins with a list of the forty state and federal education…

  14. Accountability systems and instructional approaches in youth volleyball training.

    PubMed

    Pereira, Felismina; Mesquita, Isabel; Graça, Amândio

    2009-01-01

    The purpose of this study was to examine accountability systems operating in youth volleyball training sessions and to understand how those systems vary according to the instructional tasks and the nature of the information provided by coaches. Additionally, the interactive effect of the players' age group on accountability systems and instructional tasks was inspected. Twenty-eight youth volleyball coaches (for under 14s and under 18s) were observed, one training session each. Systematic observation strategies were used to describe and analyse task presentation and task structure during practice. Results convey that the accountability systems implemented by coaches were mainly implicit and governed by opportunity rather than by explicit performance criteria imparted in task presentation. Remarks on the quality of performance occurred only during ongoing practice. More often than not, coaches showed no reaction when athletes did not accomplish the tasks, failing to convey consequential expectancy-demand-monitoring messages. The instructional approach was predominantly composed of informing tasks, of a technical nature and with general information, which can reflect a technique-oriented and generalist coach profile. These results indicate the presence of a weak and ambiguous accountability system, also corroborated by positive correlations of extending tasks with the "task presentation without exigency" category as well as with no reaction to unaccomplished tasks. There were no notable differences in accountability behaviours between players' age groups. Key points: Accountability systems implemented by coaches were mainly implicit and governed by opportunity rather than by explicit performance criteria imparted in task presentation. Only during practice did coaches remark on performance quality, followed by participation/effort. The instructional approach was predominantly composed of informing tasks, of a technical nature and with general information, which can reflect a technique-oriented and generalist…

  15. Accountability Systems and Instructional Approaches in Youth Volleyball Training

    PubMed Central

    Pereira, Felismina; Mesquita, Isabel; Graça, Amândio

    2009-01-01

    The purpose of this study was to examine accountability systems operating in youth volleyball training sessions and to understand how those systems vary according to the instructional tasks and the nature of the information provided by coaches. Additionally, the interactive effect of the players' age group on accountability systems and instructional tasks was inspected. Twenty-eight youth volleyball coaches (for under 14s and under 18s) were observed, one training session each. Systematic observation strategies were used to describe and analyse task presentation and task structure during practice. Results convey that the accountability systems implemented by coaches were mainly implicit and governed by opportunity rather than by explicit performance criteria imparted in task presentation. Remarks on the quality of performance occurred only during ongoing practice. More often than not, coaches showed no reaction when athletes did not accomplish the tasks, failing to convey consequential expectancy-demand-monitoring messages. The instructional approach was predominantly composed of informing tasks, of a technical nature and with general information, which can reflect a technique-oriented and generalist coach profile. These results indicate the presence of a weak and ambiguous accountability system, also corroborated by positive correlations of extending tasks with the "task presentation without exigency" category as well as with no reaction to unaccomplished tasks. There were no notable differences in accountability behaviours between players' age groups. Key points: Accountability systems implemented by coaches were mainly implicit and governed by opportunity rather than by explicit performance criteria imparted in task presentation. Only during practice did coaches remark on performance quality, followed by participation/effort. The instructional approach was predominantly composed of informing tasks, of a technical nature and with general information, which can reflect a technique-oriented and…

  16. Arkansas' Curriculum Guide. Competency Based Computerized Accounting.

    ERIC Educational Resources Information Center

    Arkansas State Dept. of Education, Little Rock. Div. of Vocational, Technical and Adult Education.

    This guide contains the essential parts of a total curriculum for a one-year secondary-level course in computerized accounting. Addressed in the individual sections of the guide are the following topics: the complete accounting cycle, computer operations for accounting, computerized accounting and general ledgers, computerized accounts payable,…

  17. [Stewart's acid-base approach].

    PubMed

    Funk, Georg-Christian

    2007-01-01

    In addition to paCO2, Stewart's acid-base model takes into account the influence of albumin, inorganic phosphate, electrolytes and lactate on acid-base equilibrium. It allows a comprehensive and quantitative analysis of acid-base disorders. In particular, simultaneous and mixed metabolic acid-base disorders, which are common in critically ill patients, can be assessed. Stewart's approach is therefore a valuable complement to the customary acid-base approach based on bicarbonate or base excess. However, some chemical aspects of Stewart's approach remain controversial.
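The quantitative bookkeeping Stewart's model enables can be sketched as the comparison of the apparent and effective strong ion difference. This is a minimal illustration using commonly cited Figge-style charge approximations; the coefficients are textbook values, not taken from this article:

```python
def apparent_sid(na, k, cl, lactate):
    """Apparent strong ion difference (mEq/L) from measured strong ions."""
    return na + k - cl - lactate

def effective_sid(ph, paco2_mmhg, albumin_g_l, phosphate_mmol_l):
    """Effective SID: bicarbonate plus the charge carried by albumin and
    inorganic phosphate (Figge-style approximations; coefficients are
    commonly cited textbook values, assumed here for illustration)."""
    hco3 = 0.0301 * paco2_mmhg * 10 ** (ph - 6.1)       # Henderson-Hasselbalch
    albumin = albumin_g_l * (0.123 * ph - 0.631)        # albumin anionic charge
    phosphate = phosphate_mmol_l * (0.309 * ph - 0.469)
    return hco3 + albumin + phosphate

# A strong ion gap well above zero suggests unmeasured anions (e.g. ketoacids)
sig = apparent_sid(140, 4.0, 105, 1.0) - effective_sid(7.40, 40, 40, 1.2)
```

For normal values the gap is close to zero, which is the sense in which the approach is "quantitative": each departure from equilibrium is attributed to a measured determinant.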

  18. Trade-based carbon sequestration accounting.

    PubMed

    King, Dennis M

    2004-04-01

    This article describes and illustrates an accounting method to assess and compare "early" carbon sequestration investments and trades on the basis of the number of standardized CO2 emission offset credits they will provide. The "gold standard" for such credits is assumed to be a relatively riskless credit based on a CO2 emission reduction that provides offsets against CO2 emissions on a one-for-one basis. The number of credits associated with carbon sequestration needs to account for time, risk, durability, permanence, additionality, and other factors that future trade regulators will most certainly use to assign "official" credits to sequestration projects. The method that is presented here uses established principles of natural resource accounting and conventional rules of asset valuation to "score" projects. A review of 20 "early" voluntary United States-based CO2 offset trades that involve carbon sequestration reveals that the assumptions that buyers, sellers, brokers, and traders are using to characterize the economic potential of their investments and trades vary enormously. The article develops a "universal carbon sequestration credit scoring equation" and uses two of these trades to illustrate the sensitivity of trade outcomes to various assumptions about how future trade auditors are likely to "score" carbon sequestration projects in terms of their "equivalency" with CO2 emission reductions. The article emphasizes the importance of using a standard credit scoring method that accounts for time and risk to assess and compare even unofficial prototype carbon sequestration trades. The scoring method illustrated in this article is a tool that can protect the integrity of carbon sequestration credit trading and can assist buyers and sellers in evaluating the real economic potential of prospective trades.
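The kind of time-and-risk adjustment the article calls for can be sketched as follows. This is an illustrative discounting scheme, not King's actual "universal carbon sequestration credit scoring equation"; all parameters are hypothetical:

```python
def sequestration_credits(annual_tonnes, years, discount_rate,
                          failure_prob, durability_factor):
    """Score a sequestration project in CO2-offset-credit equivalents.

    Illustrative only: discounts each year's sequestered tonnage for the
    time value of an offset and for the cumulative risk that the project
    fails (e.g. fire or land-use reversal), then applies a durability
    penalty for non-permanence. Parameter names are assumptions."""
    credits = 0.0
    for t in range(1, years + 1):
        survival = (1.0 - failure_prob) ** t            # project still intact
        credits += annual_tonnes * survival / (1.0 + discount_rate) ** t
    return credits * durability_factor
```

With zero discounting and zero risk the score reduces to raw tonnage, which is exactly the one-for-one "gold standard" baseline the article describes.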

  19. Accountability for Project-Based Collaborative Learning

    ERIC Educational Resources Information Center

    Jamal, Abu-Hussain; Essawi, Mohammad; Tilchin, Oleg

    2014-01-01

    One perspective model for the creation of the learning environment and engendering students' thinking development is the Project-Based Collaborative Learning (PBCL) model. This model organizes learning by collaborative performance of various projects. In this paper we describe an approach to enhancing the PBCL model through the creation of…

  20. The Value of Information: Approaches in Economics, Accounting, and Management Science.

    ERIC Educational Resources Information Center

    Repo, Aatto J.

    1989-01-01

    This review and analysis of research on the economics of information performed by economists, accounting researchers, and management scientists focuses on their approaches to describing and measuring the value of information. The discussion includes comparisons of research approaches based on cost effectiveness and on the value of information. (77…

  1. Validating a mass balance accounting approach to using 7Be measurements to estimate event-based erosion rates over an extended period at the catchment scale

    NASA Astrophysics Data System (ADS)

    Porto, Paolo; Walling, Des E.; Cogliandro, Vanessa; Callegari, Giovanni

    2016-07-01

    Use of the fallout radionuclides cesium-137 and excess lead-210 offers important advantages over traditional methods of quantifying erosion and soil redistribution rates. However, both radionuclides provide information on longer-term (i.e., 50-100 years) average rates of soil redistribution. Beryllium-7, with its half-life of 53 days, can provide a basis for documenting short-term soil redistribution and it has been successfully employed in several studies. However, the approach commonly used introduces several important constraints related to the timing and duration of the study period. A new approach proposed by the authors that overcomes these constraints has been successfully validated using an erosion plot experiment undertaken in southern Italy. Here, a further validation exercise undertaken in a small (1.38 ha) catchment is reported. The catchment was instrumented to measure event sediment yields and beryllium-7 measurements were employed to document the net soil loss for a series of 13 events that occurred between November 2013 and June 2015. In the absence of significant sediment storage within the catchment's ephemeral channel system and of a significant contribution from channel erosion to the measured sediment yield, the estimates of net soil loss for the individual events could be directly compared with the measured sediment yields to validate the former. The close agreement of the two sets of values is seen as successfully validating the use of beryllium-7 measurements and the new approach to obtain estimates of net soil loss for a sequence of individual events occurring over an extended period at the scale of a small catchment.
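The point-scale conversion that underlies such budgets can be illustrated with the exponential depth-distribution model commonly used with beryllium-7. This is a generic sketch of the standard inventory-deficit relation, not the authors' extended-period mass balance itself; the relaxation mass depth h0 and inventories are illustrative:

```python
import math

def be7_soil_loss(inventory, reference_inventory, h0):
    """Net soil loss (kg m^-2) at an eroding point from a 7Be inventory
    deficit, assuming 7Be declines exponentially with mass depth
    (h0 = relaxation mass depth, kg m^-2). Sites where the measured
    inventory meets or exceeds the reference are treated as
    non-eroding here; deposition sites need a different relation."""
    if inventory >= reference_inventory:
        return 0.0
    return h0 * math.log(reference_inventory / inventory)
```

Summing such point estimates over the catchment gives the net soil loss that the study compares against measured event sediment yields.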

  2. Trading Land: A Review of Approaches to Accounting for Upstream Land Requirements of Traded Products

    PubMed Central

    Haberl, Helmut; Kastner, Thomas; Wiedenhofer, Dominik; Eisenmenger, Nina; Erb, Karl-Heinz

    2015-01-01

    Land use is recognized as a pervasive driver of environmental impacts, including climate change and biodiversity loss. Global trade leads to "telecoupling" between the land use of production and the consumption of biomass-based goods and services. Telecoupling is captured by accounts of the upstream land requirements associated with traded products, also commonly referred to as land footprints. These accounts face challenges in two main areas: (1) the allocation of land to products traded and consumed and (2) the metrics to account for differences in land quality and land-use intensity. For two main families of accounting approaches (biophysical, factor-based and environmentally extended input-output analysis), this review discusses conceptual differences and compares results for land footprints. Biophysical approaches are able to capture a large number of products and different land uses, but suffer from a truncation problem. Economic approaches solve the truncation problem, but are hampered by the limited disaggregation of sectors and products. In light of the conceptual differences, the overall similarity of results generated by both types of approaches is remarkable. Diametrically opposed results for some of the world's largest producers and consumers of biomass-based products, however, make interpretation difficult. This review aims to provide clarity on some of the underlying conceptual issues of accounting for land footprints. PMID:27547028

  3. Trading Land: A Review of Approaches to Accounting for Upstream Land Requirements of Traded Products.

    PubMed

    Schaffartzik, Anke; Haberl, Helmut; Kastner, Thomas; Wiedenhofer, Dominik; Eisenmenger, Nina; Erb, Karl-Heinz

    2015-10-01

    Land use is recognized as a pervasive driver of environmental impacts, including climate change and biodiversity loss. Global trade leads to "telecoupling" between the land use of production and the consumption of biomass-based goods and services. Telecoupling is captured by accounts of the upstream land requirements associated with traded products, also commonly referred to as land footprints. These accounts face challenges in two main areas: (1) the allocation of land to products traded and consumed and (2) the metrics to account for differences in land quality and land-use intensity. For two main families of accounting approaches (biophysical, factor-based and environmentally extended input-output analysis), this review discusses conceptual differences and compares results for land footprints. Biophysical approaches are able to capture a large number of products and different land uses, but suffer from a truncation problem. Economic approaches solve the truncation problem, but are hampered by the limited disaggregation of sectors and products. In light of the conceptual differences, the overall similarity of results generated by both types of approaches is remarkable. Diametrically opposed results for some of the world's largest producers and consumers of biomass-based products, however, make interpretation difficult. This review aims to provide clarity on some of the underlying conceptual issues of accounting for land footprints.
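The environmentally extended input-output family of approaches mentioned here rests on the Leontief demand-pull relation: a land footprint is the direct land intensity of each sector propagated through the full supply chain. A minimal sketch of that arithmetic, with made-up two-sector numbers:

```python
import numpy as np

def land_footprint(A, land_use, output, final_demand):
    """Environmentally extended input-output land footprint
    f = e (I - A)^-1 y, where A is the technical coefficient matrix,
    e the direct land use per unit of sectoral output, and y the final
    demand vector. Generic EE-IO arithmetic; the matrices here are
    illustrative, not drawn from a real account."""
    e = np.asarray(land_use, dtype=float) / np.asarray(output, dtype=float)
    L = np.linalg.inv(np.eye(len(A)) - np.asarray(A))   # Leontief inverse
    return float(e @ L @ np.asarray(final_demand, dtype=float))
```

The truncation problem of biophysical accounts does not arise here because the Leontief inverse captures all upstream tiers at once; the cost, as the review notes, is the coarse sectoral disaggregation of the monetary tables.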

  4. Accountability.

    ERIC Educational Resources Information Center

    Lashway, Larry

    1999-01-01

    This issue reviews publications that provide a starting point for principals looking for a way through the accountability maze. Each publication views accountability differently, but collectively these readings argue that even in an era of state-mandated assessment, principals can pursue proactive strategies that serve students' needs. James A.…

  5. Challenges of "Thinking Differently" with Rhizoanalytic Approaches: A Reflexive Account

    ERIC Educational Resources Information Center

    Cumming, Tamara

    2015-01-01

    Growing numbers of educational researchers are using rhizoanalytic approaches based on the work of Deleuze and Guattari to think differently in their research practices. However, as those engaging in debates about post-qualitative research suggest, thinking differently is not without its challenges. This paper uses three complex challenges…

  6. Teaching Consolidations Accounting: An Approach to Easing the Challenge

    ERIC Educational Resources Information Center

    Murphy, Elizabeth A.; McCarthy, Mark A.

    2010-01-01

    Teaching and learning accounting for consolidations is a challenging endeavor. Students not only need to understand the conceptual underpinnings of the accounting requirements for consolidations, but also must master the complex accounting needed to prepare consolidated financial statements. To add to the challenge, the consolidation process is…

  7. Towards ecosystem accounting: a comprehensive approach to modelling multiple hydrological ecosystem services

    NASA Astrophysics Data System (ADS)

    Duku, C.; Rathjens, H.; Zwart, S. J.; Hein, L.

    2015-10-01

    Ecosystem accounting is an emerging field that aims to provide a consistent approach to analysing environment-economy interactions. One of the specific features of ecosystem accounting is the distinction between the capacity and the flow of ecosystem services. Ecohydrological modelling to support ecosystem accounting requires considering, among others, the physical and mathematical representation of ecohydrological processes, spatial heterogeneity of the ecosystem, temporal resolution, and required model accuracy. This study examines how a spatially explicit ecohydrological model can be used to analyse multiple hydrological ecosystem services in line with the ecosystem accounting framework. We use the Upper Ouémé watershed in Benin as a test case to demonstrate our approach. The Soil and Water Assessment Tool (SWAT), which has been configured with a grid-based landscape discretization and further enhanced to simulate water flow across the discretized landscape units, is used to simulate the ecohydrology of the Upper Ouémé watershed. Indicators consistent with the ecosystem accounting framework are used to map and quantify the capacities and the flows of multiple hydrological ecosystem services based on the model outputs. Biophysical ecosystem accounts are subsequently set up based on the spatial estimates of hydrological ecosystem services. In addition, we apply statistical trend tests to the biophysical ecosystem accounts to identify trends in the capacity of the watershed ecosystems to provide service flows. We show that the integration of hydrological ecosystem services into an ecosystem accounting framework provides relevant information on ecosystems and hydrological ecosystem services at appropriate scales suitable for decision-making.

  8. Accounting for Linkage Disequilibrium in genome scans for selection without individual genotypes: the local score approach.

    PubMed

    Fariello, María Inés; Boitard, Simon; Mercier, Sabine; Robelin, David; Faraut, Thomas; Arnould, Cécile; Recoquillay, Julien; Bouchez, Olivier; Salin, Gérald; Dehais, Patrice; Gourichon, David; Leroux, Sophie; Pitel, Frédérique; Leterrier, Christine; SanCristobal, Magali

    2017-04-10

    Detecting genomic footprints of selection is an important step in the understanding of evolution. Accounting for linkage disequilibrium in genome scans increases detection power, but haplotype-based methods require individual genotypes and are not applicable on pool-sequenced samples. We propose to take advantage of the local score approach to account for linkage disequilibrium in genome scans for selection, cumulating (possibly small) signals from single markers over a genomic segment, to clearly pinpoint a selection signal. Using computer simulations, we demonstrate that this approach detects selection with higher power than several state-of-the-art single marker, windowing or haplotype-based approaches. We illustrate this on two benchmark data sets including individual genotypes, for which we obtain similar results with the local score and one haplotype-based approach. Finally, we apply the local score approach to Pool-Seq data obtained from a divergent selection experiment on behavior in quail, and obtain precise and biologically coherent selection signals: while competing methods fail to highlight any clear selection signature, our method detects several regions involving genes known to act on social responsiveness or autistic traits. Although we focus here on the detection of positive selection from multiple population data, the local score approach is general and can be applied to other genome scans for selection or other genome-wide analyses such as GWAS.
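The core of a local score is a Lindley-style running sum: per-marker scores, shifted down by a tuning constant, are accumulated and reset at zero, so that a run of individually weak signals builds into one large excursion. A simplified sketch of that recursion (the shift constant xi and the single-excursion output are assumptions; the published method adds significance thresholds for the maximal excursion):

```python
def local_score(scores, xi):
    """Lindley recursion h_i = max(0, h_{i-1} + x_i - xi) over per-marker
    scores x_i (e.g. -log10 p-values). Returns the maximal excursion
    height, the index where it is reached, and the full trajectory."""
    h, best, best_end = 0.0, 0.0, -1
    trajectory = []
    for i, x in enumerate(scores):
        h = max(0.0, h + x - xi)
        trajectory.append(h)
        if h > best:
            best, best_end = h, i
    return best, best_end, trajectory
```

Because only per-marker scores are needed, the recursion applies equally to Pool-Seq allele frequencies, where haplotype-based statistics cannot be computed.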

  9. A Humanistic Approach to South African Accounting Education

    ERIC Educational Resources Information Center

    West, A.; Saunders, S.

    2006-01-01

    Humanistic psychologist Carl Rogers made a distinction between traditional approaches and humanistic "learner-centred" approaches to education. The traditional approach holds that educators impart their knowledge to willing and able recipients; whereas the humanistic approach holds that educators act as facilitators who assist learners…

  10. 12 CFR 563b.465 - Do account holders retain any voting rights based on their liquidation sub-accounts?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... based on their liquidation sub-accounts? 563b.465 Section 563b.465 Banks and Banking OFFICE OF THRIFT... Account § 563b.465 Do account holders retain any voting rights based on their liquidation sub-accounts... their liquidation sub-accounts....

  11. Accounting for pairwise distance restraints in FFT-based protein-protein docking.

    PubMed

    Xia, Bing; Vajda, Sandor; Kozakov, Dima

    2016-11-01

    ClusPro is a heavily used protein-protein docking server based on the fast Fourier transform (FFT) correlation approach. While FFT enables global docking, accounting for pairwise distance restraints using penalty terms in the scoring function is computationally expensive. We use a different approach and directly select low energy solutions that also satisfy the given restraints. As expected, accounting for restraints generally improves the rank of near native predictions, while retaining or even improving the numerical efficiency of FFT based docking.
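The select-then-filter idea described here can be sketched in a few lines: rank the globally docked poses by energy, then keep only those whose inter-atomic distances satisfy the restraints, instead of folding penalty terms into the FFT scoring function. The pose and restraint data layouts below are hypothetical:

```python
import math

def satisfies_restraints(coords, restraints, tol=1.0):
    """True if a pose meets every (i, j, dmin, dmax) distance restraint
    between atom indices i and j, within a tolerance in angstroms."""
    return all(dmin - tol <= math.dist(coords[i], coords[j]) <= dmax + tol
               for i, j, dmin, dmax in restraints)

def select_restrained(poses, restraints, keep=10):
    """Rank docking solutions by energy and keep the lowest-energy poses
    that also satisfy the restraints -- filtering after the global FFT
    search rather than penalising during it."""
    ranked = sorted(poses, key=lambda p: p["energy"])
    return [p for p in ranked
            if satisfies_restraints(p["coords"], restraints)][:keep]
```

Filtering after the search leaves the FFT stage untouched, which is why the abstract reports no loss of numerical efficiency.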

  12. Focus on the Customer: A New Approach to State-Level Accountability Reporting and Processes for Higher Education.

    ERIC Educational Resources Information Center

    Ruppert, Sandra S.

    This paper outlines the dimensions of a customer-focused system of accountability and describes approaches taken at the state level to respond to the information needs of a broader client base for higher education. Section 1 traces current trends in the development and implementation of state-level accountability policy for higher education. It…

  13. Evaluating an Accountability Mentoring Approach for School Counselors

    ERIC Educational Resources Information Center

    Milsom, Amy; McCormick, Katlyn

    2015-01-01

    School counselors are encouraged to use accountability in order to advocate for their programs and students, but many school counselors lack confidence to work with data. This project examined the effectiveness of an individualized mentoring intervention targeting data attitudes, self-efficacy, and behaviors. After participating in the…

  14. Educational Accountability: A Qualitatively Driven Mixed-Methods Approach

    ERIC Educational Resources Information Center

    Hall, Jori N.; Ryan, Katherine E.

    2011-01-01

    This article discusses the importance of mixed-methods research, in particular the value of qualitatively driven mixed-methods research for quantitatively driven domains like educational accountability. The article demonstrates the merits of qualitative thinking by describing a mixed-methods study that focuses on a middle school's system of…

  15. Community-Based School Finance and Accountability: A New Era for Local Control in Education Policy?

    ERIC Educational Resources Information Center

    Vasquez Heilig, Julian; Ward, Derrick R.; Weisman, Eric; Cole, Heather

    2014-01-01

    Top-down accountability policies have arguably had very limited impact over the past 20 years. Education stakeholders are now contemplating new forms of bottom-up accountability. In 2013, policymakers in California enacted a community-based approach that creates the Local Control Funding Formula (LCFF) process for school finance to increase…

  16. Two-stage decision approach to material accounting

    SciTech Connect

    Opelka, J.H.; Sutton, W.B.

    1982-01-01

    The validity of the alarm threshold 4σ has been checked for hypothetical large and small facilities using a two-stage decision model in which the diverter's strategic variable is the quantity diverted, and the defender's strategic variables are the alarm threshold and the effectiveness of the physical security and material control systems in the possible presence of a diverter. For large facilities, the material accounting system inherently appears not to be a particularly useful system for the deterrence of diversions, and essentially no improvement can be made by lowering the alarm threshold below 4σ. For small facilities, reduction of the threshold to 2σ or 3σ is a cost-effective change for the accounting system, but is probably less cost-effective than making improvements in the material control and physical security systems.
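The threshold trade-off at issue can be illustrated with the textbook material-balance detection model: if the material unaccounted for is normally distributed about the diverted quantity with measurement standard deviation σ, the alarm at k·σ fires with a probability that rises as k falls. This is a generic sketch, not the paper's game-theoretic two-stage model:

```python
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def detection_prob(diverted, sigma, k):
    """P(|MUF| > k*sigma) when `diverted` units have been removed and the
    accounting measurement error is N(0, sigma^2)."""
    z = diverted / sigma
    return (1.0 - norm_cdf(k - z)) + norm_cdf(-k - z)
```

With no diversion this same expression gives the false-alarm rate, which is the cost of lowering k; the two-stage model in the paper weighs that cost against the deterrence gained.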

  17. Test-Based Teacher Evaluations: Accountability vs. Responsibility

    ERIC Educational Resources Information Center

    Bolyard, Chloé

    2015-01-01

    Gert Biesta contends that managerial accountability, which focuses on efficiency and competition, dominates the current political arena in education. Such accountability has influenced states' developments of test-based teacher evaluations in an attempt to quantify teachers' efficacy on student learning. With numerous state policies requiring the…

  18. Student Accountability in Team-Based Learning Classes

    ERIC Educational Resources Information Center

    Stein, Rachel E.; Colyer, Corey J.; Manning, Jason

    2016-01-01

    Team-based learning (TBL) is a form of small-group learning that assumes stable teams promote accountability. Teamwork promotes communication among members; application exercises promote active learning. Students must prepare for each class; failure to do so harms their team's performance. Therefore, TBL promotes accountability. As part of the…

  19. Financial Accounting System Based Upon NCES Revised Handbook II.

    ERIC Educational Resources Information Center

    National Center for Education Statistics (DHEW), Washington, DC. Educational Data Standards Branch.

    This publication describes the development and implementation of a school district financial accounting system based on the concepts and guidelines of the National Center for Education Statistics Handbook II, Revised. The system described was designed by school district personnel to utilize computer equipment and to meet the accounting and…

  20. Major Approaches to Music Education: An Account of Method.

    ERIC Educational Resources Information Center

    Shehan, Patricia K.

    1986-01-01

    In a continuing effort to improve the music education of students in beginning stages, there is a need for the review of teaching techniques that motivate student learning behaviors. Historically, music methods actively engaged students in the music-making process. The approaches of Dalcroze, Orff, Suzuki, Kodaly, and Gordon continue that…

  1. Developing Accounting Students' Listening Skills: Barriers, Opportunities and an Integrated Stakeholder Approach

    ERIC Educational Resources Information Center

    Stone, Gerard; Lightbody, Margaret; Whait, Rob

    2013-01-01

    Accountants and employers of accounting graduates consider listening to be among the most important communication skills that graduates possess. However, accounting education practices that develop students' listening skills are uncommon. Further, in the case of listening development, the current approach of prescribing that educators do more to…

  2. A Road to Results: Results-Based Accountability in the Annie E. Casey Foundation's Education Program

    ERIC Educational Resources Information Center

    Manno, Bruno V.

    2006-01-01

    This report details the Annie E. Casey Foundation's four-year effort to develop a "results-based accountability" (RBA) approach to its K-12 education portfolio. Though still a work in progress, the Foundation's experience with RBA can help other philanthropic organizations and individual donors develop their own approaches to producing and…

  3. Measuring Resources in Education: From Accounting to the Resource Cost Model Approach. Working Paper Series.

    ERIC Educational Resources Information Center

    Chambers, Jay G.

    This report describes two alternative approaches to measuring resources in K-12 education. One approach relies heavily on traditional accounting data, whereas the other draws on detailed information about the jobs and assignments of individual school personnel. It outlines the differences between accounting and economics and discusses how each…

  4. The Impact of a Participant-Based Accounting Cycle Course on Student Performance in Intermediate Financial Accounting I

    ERIC Educational Resources Information Center

    Siagian, Ferdinand T.; Khan, Mohammad

    2016-01-01

    The authors investigated whether students in an Intermediate Financial Accounting I course who took a 1-credit, participant-based accounting cycle course performed better than students who did not take the accounting cycle course. Results indicate a higher likelihood of earning a better grade for students who took the accounting cycle course even…

  5. The utilization of activity-based cost accounting in hospitals.

    PubMed

    Emmett, Dennis; Forget, Robert

    2005-01-01

    Healthcare costs are being examined on all fronts. Healthcare accounts for 11% of the gross national product and will continue to rise as the "baby boomers" reach retirement age. While ascertaining costs is important, most research shows that costing methods have not been implemented in hospitals. This study is concerned with the use of costing methods, particularly activity-based cost accounting. A mail survey of CFOs was undertaken to determine the type of cost accounting method they use. In addition, they were asked whether they were aware of activity-based cost accounting and whether they had implemented it or were planning to implement it. Only 71.8% were aware of it and only 4.7% had implemented it. In addition, only 52% of all hospitals report using any cost accounting systems. Education needs to ensure that all healthcare executives are cognizant of activity-based accounting and its importance in determining costs. Only by determining costs can hospitals strive to contain them.
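The mechanics of activity-based costing are simple: each activity gets a cost-driver rate, and a cost object (here a patient stay) is charged for the driver units it consumes. A minimal sketch with hypothetical hospital activities and rates:

```python
def abc_cost(driver_rates, consumption):
    """Activity-based cost of a cost object: sum over activities of
    (cost-driver rate x driver units consumed). Activities and rates
    below are illustrative, not survey data."""
    return sum(rate * consumption.get(activity, 0)
               for activity, rate in driver_rates.items())

rates = {"admission": 120.0, "nursing_hour": 55.0, "lab_test": 18.0}
stay = {"admission": 1, "nursing_hour": 6, "lab_test": 4}
```

Traditional costing would instead spread overhead on a single volume base (e.g. patient days), which is precisely the imprecision ABC is meant to remove.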

  6. Accounting for Parameter Uncertainty in Reservoir Uncertainty Assessment: The Conditional Finite-Domain Approach

    SciTech Connect

    Babak, Olena; Deutsch, Clayton V.

    2009-03-15

    An important aim of modern geostatistical modeling is to quantify uncertainty in geological systems. Geostatistical modeling requires many input parameters. The input univariate distribution or histogram is perhaps the most important. A new method for assessing uncertainty in the histogram, particularly uncertainty in the mean, is presented. This method, referred to as the conditional finite-domain (CFD) approach, accounts for the size of the domain and the local conditioning data. It is a stochastic approach based on a multivariate Gaussian distribution. The CFD approach is shown to be convergent, design independent, and parameterization invariant. The performance of the CFD approach is illustrated in a case study focusing on the impact of the number of data and the range of correlation on the limiting uncertainty in the parameters. The spatial bootstrap method and CFD approach are compared. As the number of data increases, uncertainty in the sample mean decreases in both the spatial bootstrap and the CFD. Contrary to spatial bootstrap, uncertainty in the sample mean in the CFD approach decreases as the range of correlation increases. This is a direct result of the conditioning data being more correlated to unsampled locations in the finite domain. The sensitivity of the limiting uncertainty relative to the variogram and the variable limits are also discussed.
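The spatial-bootstrap behaviour that the abstract contrasts with the CFD approach can be sketched directly: the variance of the sample mean under a covariance model is the average of all pairwise covariances, so it grows with the correlation range. This is a generic geostatistics illustration (spherical covariance model assumed), not the CFD algorithm itself:

```python
import numpy as np

def mean_variance_spherical(coords, corr_range, sill=1.0):
    """Var(sample mean) = (1/n^2) * sum_ij C(h_ij) under a spherical
    covariance model C(h) = sill * (1 - 1.5h + 0.5h^3) for h <= range,
    zero beyond. Illustrates the spatial-bootstrap view in which mean
    uncertainty increases with the correlation range."""
    x = np.asarray(coords, dtype=float)
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    h = np.minimum(d / corr_range, 1.0)            # clip at the range
    cov = sill * (1.0 - 1.5 * h + 0.5 * h ** 3)
    return cov.sum() / len(x) ** 2
```

The CFD result described above runs the other way: conditioning on data that are correlated with the unsampled locations of a finite domain shrinks, rather than inflates, the uncertainty in the mean as the range grows.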

  7. Interpolation method taking into account inequality constraints. II. Practical approach

    SciTech Connect

    Kostov, C.; Dubrule, O.

    1986-01-01

    Common features of models for interpolation, consistent with a finite number of inequality constraints on the range of values of a variable z, are discussed. A method based on constrained quadratic minimization, yielding kriging estimates when no constraints exist, is presented. A computationally efficient formulation of quadratic minimization is obtained by using results on duality in quadratic programming. Relevant properties of the optimal interpolator are derived in a simple, self-contained way. The method is applied to mapping of horizon depth and estimation of thickness of an oil-bearing formation.

  8. A New Approach to Accountability: Creating Effective Learning Environments for Programs

    ERIC Educational Resources Information Center

    Surr, Wendy

    2012-01-01

    This article describes a new paradigm for accountability that envisions afterschool programs as learning organizations continually engaged in improving quality. Nearly 20 years into the era of results-based accountability, a new generation of afterschool accountability systems is emerging. Rather than aiming to test whether programs have produced…

  9. Knuckling Under? School Superintendents and Accountability-Based Educational Reform

    ERIC Educational Resources Information Center

    Feuerstein, Abe

    2013-01-01

    The goal of this article is to explore the various ways that superintendents have responded to accountability-based educational reform efforts such as No Child Left Behind, the factors that have influenced their responses, and the implications of these responses for current and future educational leaders. With respect to the first issue, empirical…

  10. Digital Game-Based Learning in Accounting and Business Education

    ERIC Educational Resources Information Center

    Carenys, Jordi; Moya, Soledad

    2016-01-01

    This article presents a review of the accounting and business literature on digital game-based learning (DGBL). The article classifies what is already settled in the literature about the theoretical foundations of DGBL's effectiveness and its practical use into three categories. The first comprises what is known about the evaluation of digital…

  11. Performance-Based Accountability in Qatar: A State in Progress

    ERIC Educational Resources Information Center

    Jaafar, Sonia Ben

    2011-01-01

    It has become a normative practice to include Performance-Based Accountability (PBA) policies in educational reforms to foster school changes that enhance student learning and success. There is considerable variation in PBA models that have an important impact on how they operate in schools. It is, therefore, important to characterize PBA models…

  12. Ontology-Based e-Assessment for Accounting Education

    ERIC Educational Resources Information Center

    Litherland, Kate; Carmichael, Patrick; Martínez-García, Agustina

    2013-01-01

    This summary reports on a pilot of a novel, ontology-based e-assessment system in accounting. The system, OeLe, uses emerging semantic technologies to offer an online assessment environment capable of marking students' free text answers to questions of a conceptual nature. It does this by matching their response with a "concept map" or…

  13. Water Accounting Plus (WA+) - a water accounting procedure for complex river basins based on satellite measurements

    NASA Astrophysics Data System (ADS)

    Karimi, P.; Bastiaanssen, W. G. M.; Molden, D.

    2012-11-01

    Coping with the issue of water scarcity and growing competition for water among different sectors requires proper water management strategies and decision processes. A pre-requisite is a clear understanding of the basin hydrological processes, manageable and unmanageable water flows, the interaction with land use, and opportunities to mitigate the negative effects and increase the benefits of water depletion on society. Currently, water professionals do not have a common framework that links hydrological flows to user groups of water and their benefits. The absence of a standard hydrological and water management summary is causing confusion and wrong decisions. The non-availability of water flow data is one of the underpinning reasons for not having operational water accounting systems for river basins in place. In this paper we introduce Water Accounting Plus (WA+), which is a new framework designed to provide explicit spatial information on water depletion and net withdrawal processes in complex river basins. The influence of land use on the water cycle is described explicitly by defining land use groups with common characteristics. Analogous to financial accounting, WA+ presents four sheets including (i) a resource base sheet, (ii) a consumption sheet, (iii) a productivity sheet, and (iv) a withdrawal sheet. Every sheet encompasses a set of indicators that summarize the overall water resources situation. The impact of external (e.g. climate change) and internal influences (e.g. infrastructure building) can be estimated by studying the changes in these WA+ indicators. Satellite measurements can be used for 3 out of the 4 sheets, but are not a precondition for implementing the WA+ framework. Data from hydrological models and water allocation models can also be used as inputs to WA+.
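The analogy with financial accounting suggests a balance-sheet computation. A toy version of a resource-base sheet is sketched below; the variable names and the indicator set are simplified assumptions, not the full WA+ sheet definitions:

```python
def resource_base_sheet(precipitation, surface_inflow, landscape_et,
                        utilized_flow, outflow):
    """Toy WA+ resource-base sheet (all terms in the same volume units,
    e.g. mm or km^3 per year): gross inflow is partitioned into
    depletion (landscape ET plus utilized withdrawals) and outflow,
    and a few summary indicators are reported."""
    gross_inflow = precipitation + surface_inflow
    depleted = landscape_et + utilized_flow
    return {
        "gross_inflow": gross_inflow,
        "depleted_water": depleted,
        "exploitable_water": gross_inflow - landscape_et,
        "depleted_fraction": depleted / gross_inflow,
        "closure_error": gross_inflow - depleted - outflow,  # ~0 if balanced
    }
```

The closure term is the accounting check: if the sheet's inflows, depletions, and outflows do not balance, either a flow is unmeasured or a term is misattributed, which is exactly the kind of inconsistency a standard summary is meant to expose.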

  14. Failing Tests: Commentary on "Adapting Educational Measurement to the Demands of Test-Based Accountability"

    ERIC Educational Resources Information Center

    Thissen, David

    2015-01-01

    In "Adapting Educational Measurement to the Demands of Test-Based Accountability" Koretz takes the time-honored engineering approach to educational measurement, identifying specific problems with current practice and proposing minimal modifications of the system to alleviate those problems. In response to that article, David Thissen…

  15. An Inter-Institutional Exploration of the Learning Approaches of Students Studying Accounting

    ERIC Educational Resources Information Center

    Byrne, Marann; Flood, Barbara; Willis, Pauline

    2009-01-01

    This paper provides a comparative analysis of the learning approaches of students taking their first course in accounting at a United States or an Irish university. The data for this study was gathered from 204 students in the U.S. and 309 in Ireland, using the Approaches and Study Skills Inventory for Students (ASSIST, 1997) which measures…

  16. Accounting for the effects of surface BRDF on satellite cloud and trace-gas retrievals: a new approach based on geometry-dependent Lambertian equivalent reflectivity applied to OMI algorithms

    NASA Astrophysics Data System (ADS)

    Vasilkov, Alexander; Qin, Wenhan; Krotkov, Nickolay; Lamsal, Lok; Spurr, Robert; Haffner, David; Joiner, Joanna; Yang, Eun-Su; Marchenko, Sergey

    2017-01-01

    Most satellite nadir ultraviolet and visible cloud, aerosol, and trace-gas algorithms make use of climatological surface reflectivity databases. For example, cloud and NO2 retrievals for the Ozone Monitoring Instrument (OMI) use monthly gridded surface reflectivity climatologies that do not depend upon the observation geometry. In reality, reflection of incoming direct and diffuse solar light from land or ocean surfaces is sensitive to the sun-sensor geometry. This dependence is described by the bidirectional reflectance distribution function (BRDF). To account for the BRDF, we propose to use a new concept of geometry-dependent Lambertian equivalent reflectivity (LER). Implementation within the existing OMI cloud and NO2 retrieval infrastructure requires changes only to the input surface reflectivity database. The geometry-dependent LER is calculated using a vector radiative transfer model with high spatial resolution BRDF information from the Moderate Resolution Imaging Spectroradiometer (MODIS) over land and the Cox-Munk slope distribution over ocean with a contribution from water-leaving radiance. We compare the geometry-dependent and climatological LERs for two wavelengths, 354 and 466 nm, that are used in OMI cloud algorithms to derive cloud fractions. A detailed comparison of the cloud fractions and pressures derived with climatological and geometry-dependent LERs is carried out. Geometry-dependent LER and corresponding retrieved cloud products are then used as inputs to our OMI NO2 algorithm. We find that replacing the climatological OMI-based LERs with geometry-dependent LERs can increase NO2 vertical columns by up to 50 % in highly polluted areas; the differences include both BRDF effects and biases between the MODIS and OMI-based surface reflectance data sets. Only minor changes to NO2 columns (within 5 %) are found over unpolluted and overcast areas.
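The sensitivity of a retrieved effective cloud fraction to the assumed surface LER can be sketched with the mixed Lambertian (independent pixel) model commonly used in UV/Vis cloud algorithms; all reflectance values here are invented for illustration:

```python
# Minimal sketch of how a geometry-dependent surface LER changes an
# effective cloud fraction retrieved with the mixed Lambertian model.
# Reflectance values are invented; the cloud reflectivity of 0.8 is the
# conventional Lambertian cloud assumption in such algorithms.

def effective_cloud_fraction(r_measured, r_clear, r_cloud=0.8):
    """Mixed-LER model: R = (1 - f) * r_clear + f * r_cloud, solved for f."""
    f = (r_measured - r_clear) / (r_cloud - r_clear)
    return min(max(f, 0.0), 1.0)  # clamp to the physical range

r_meas = 0.20
f_clim = effective_cloud_fraction(r_meas, r_clear=0.05)   # climatological LER
f_geom = effective_cloud_fraction(r_meas, r_clear=0.10)   # geometry-dependent LER
print(f_clim, f_geom)
```

A higher geometry-dependent surface LER attributes more of the measured reflectance to the surface, lowering the retrieved cloud fraction, which in turn propagates into the trace-gas air mass factors.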

  17. Accounting for the Effects of Surface BRDF on Satellite Cloud and Trace-Gas Retrievals: A New Approach Based on Geometry-Dependent Lambertian-Equivalent Reflectivity Applied to OMI Algorithms

    NASA Technical Reports Server (NTRS)

    Vasilkov, Alexander; Qin, Wenhan; Krotkov, Nickolay; Lamsal, Lok; Spurr, Robert; Haffner, David; Joiner, Joanna; Yang, Eun-Su; Marchenko, Sergey

    2017-01-01

Most satellite nadir ultraviolet and visible cloud, aerosol, and trace-gas algorithms make use of climatological surface reflectivity databases. For example, cloud and NO2 retrievals for the Ozone Monitoring Instrument (OMI) use monthly gridded surface reflectivity climatologies that do not depend upon the observation geometry. In reality, reflection of incoming direct and diffuse solar light from land or ocean surfaces is sensitive to the sun-sensor geometry. This dependence is described by the bidirectional reflectance distribution function (BRDF). To account for the BRDF, we propose to use a new concept of geometry-dependent Lambertian equivalent reflectivity (LER). Implementation within the existing OMI cloud and NO2 retrieval infrastructure requires changes only to the input surface reflectivity database. The geometry-dependent LER is calculated using a vector radiative transfer model with high spatial resolution BRDF information from the Moderate Resolution Imaging Spectroradiometer (MODIS) over land and the Cox-Munk slope distribution over ocean with a contribution from water-leaving radiance. We compare the geometry-dependent and climatological LERs for two wavelengths, 354 and 466 nm, that are used in OMI cloud algorithms to derive cloud fractions. A detailed comparison of the cloud fractions and pressures derived with climatological and geometry-dependent LERs is carried out. Geometry-dependent LER and corresponding retrieved cloud products are then used as inputs to our OMI NO2 algorithm. We find that replacing the climatological OMI-based LERs with geometry-dependent LERs can increase NO2 vertical columns by up to 50% in highly polluted areas; the differences include both BRDF effects and biases between the MODIS and OMI-based surface reflectance data sets. Only minor changes to NO2 columns (within 5%) are found over unpolluted and overcast areas.

  18. A Comparison of Four Approaches to Account for Method Effects in Latent State-Trait Analyses

    PubMed Central

    Geiser, Christian; Lockhart, Ginger

    2012-01-01

    Latent state-trait (LST) analysis is frequently applied in psychological research to determine the degree to which observed scores reflect stable person-specific effects, effects of situations and/or person-situation interactions, and random measurement error. Most LST applications use multiple repeatedly measured observed variables as indicators of latent trait and latent state residual factors. In practice, such indicators often show shared indicator-specific (or methods) variance over time. In this article, the authors compare four approaches to account for such method effects in LST models and discuss the strengths and weaknesses of each approach based on theoretical considerations, simulations, and applications to actual data sets. The simulation study revealed that the LST model with indicator-specific traits (Eid, 1996) and the LST model with M − 1 correlated method factors (Eid, Schneider, & Schwenkmezger, 1999) performed well, whereas the model with M orthogonal method factors used in the early work of Steyer, Ferring, and Schmitt (1992) and the correlated uniqueness approach (Kenny, 1976) showed limitations under conditions of either low or high method-specificity. Recommendations for the choice of an appropriate model are provided. PMID:22309958

  19. A comparison of four approaches to account for method effects in latent state-trait analyses.

    PubMed

    Geiser, Christian; Lockhart, Ginger

    2012-06-01

    Latent state-trait (LST) analysis is frequently applied in psychological research to determine the degree to which observed scores reflect stable person-specific effects, effects of situations and/or person-situation interactions, and random measurement error. Most LST applications use multiple repeatedly measured observed variables as indicators of latent trait and latent state residual factors. In practice, such indicators often show shared indicator-specific (or method) variance over time. In this article, the authors compare 4 approaches to account for such method effects in LST models and discuss the strengths and weaknesses of each approach based on theoretical considerations, simulations, and applications to actual data sets. The simulation study revealed that the LST model with indicator-specific traits (Eid, 1996) and the LST model with M - 1 correlated method factors (Eid, Schneider, & Schwenkmezger, 1999) performed well, whereas the model with M orthogonal method factors used in the early work of Steyer, Ferring, and Schmitt (1992) and the correlated uniqueness approach (Kenny, 1976) showed limitations under conditions of either low or high method-specificity. Recommendations for the choice of an appropriate model are provided.

  20. Ecological accounting based on extended exergy: a sustainability perspective.

    PubMed

    Dai, Jing; Chen, Bin; Sciubba, Enrico

    2014-08-19

Excessive energy consumption, environmental pollution, and ecological destruction have gradually become major obstacles to the development of societal-economic-natural complex ecosystems. For a national ecological-economic system, making resource accounting explicit, diagnosing resource conversion, and measuring the disturbance that environmental emissions impose on the system are the fundamental basis of sustainable development and coordinated management. This paper presents an extended exergy (EE) accounting that includes the material exergy and the exergy equivalent of externalities in a systematic process from production to consumption, and China in 2010 is chosen as a case study to foster an in-depth understanding of the conflict between high-speed development and the available resources. The whole society is decomposed into seven sectors (i.e., Agriculture, Extraction, Conversion, Industry, Transportation, Tertiary, and Domestic sectors) according to their distinct characteristics. An adaptive EE accounting database, which incorporates traditional energy, renewable energy, mineral elements, and other natural resources as well as resource-based secondary products, is constructed on the basis of the internal flows in the system. In addition, the environmental emission accounting has been adjusted to calculate the externalities-equivalent exergy. The results show that the EE value for the year 2010 in China was 1.80 × 10¹⁴ MJ, a substantial increase. Furthermore, an EE-based sustainability indices system has been established to provide an epitomized exploration for evaluating the performance of flows and storages within the system from a sustainability perspective. The value of the EE-based sustainability indicator was calculated to be 0.23, much lower than the critical value of 1, implying that China is still in a stage of high energy consumption and a low sustainability level.
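A toy aggregation in the spirit of the sectoral EE accounting can be sketched as follows; the sector values and the "sustainability index" ratio used here are hypothetical illustrations, not the paper's data or indicator definition:

```python
# Toy aggregation of extended-exergy flows by the seven sectors named in
# the abstract. All numbers are invented, and the index below is a
# hypothetical renewable-to-nonrenewable ratio, not the paper's indicator.
sectors = {
    "Agriculture": 1.2, "Extraction": 3.5, "Conversion": 4.1,
    "Industry": 5.0, "Transportation": 2.2, "Tertiary": 1.8,
    "Domestic": 0.9,
}  # extended exergy per sector, 10^13 MJ (illustrative)

total_ee = sum(sectors.values())
renewable_ee = 3.5  # assumed renewable share, 10^13 MJ (illustrative)
sustainability_index = renewable_ee / (total_ee - renewable_ee)
print(round(total_ee, 1), round(sustainability_index, 2))
```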

  1. Consumption-based accounting of CO2 emissions

    PubMed Central

    Davis, Steven J.; Caldeira, Ken

    2010-01-01

    CO2 emissions from the burning of fossil fuels are the primary cause of global warming. Much attention has been focused on the CO2 directly emitted by each country, but relatively little attention has been paid to the amount of emissions associated with the consumption of goods and services in each country. Consumption-based accounting of CO2 emissions differs from traditional, production-based inventories because of imports and exports of goods and services that, either directly or indirectly, involve CO2 emissions. Here, using the latest available data, we present a global consumption-based CO2 emissions inventory and calculations of associated consumption-based energy and carbon intensities. We find that, in 2004, 23% of global CO2 emissions, or 6.2 gigatonnes CO2, were traded internationally, primarily as exports from China and other emerging markets to consumers in developed countries. In some wealthy countries, including Switzerland, Sweden, Austria, the United Kingdom, and France, >30% of consumption-based emissions were imported, with net imports to many Europeans of >4 tons CO2 per person in 2004. Net import of emissions to the United States in the same year was somewhat less: 10.8% of total consumption-based emissions and 2.4 tons CO2 per person. In contrast, 22.5% of the emissions produced in China in 2004 were exported, on net, to consumers elsewhere. Consumption-based accounting of CO2 emissions demonstrates the potential for international carbon leakage. Sharing responsibility for emissions among producers and consumers could facilitate international agreement on global climate policy that is now hindered by concerns over the regional and historical inequity of emissions. PMID:20212122
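The accounting identity behind consumption-based inventories (consumption = production − exported embodied emissions + imported embodied emissions) can be sketched with a toy two-country ledger; the countries and numbers are invented, and the actual study relies on a global multi-regional input-output model rather than this simple bookkeeping:

```python
# Toy illustration of consumption- vs production-based CO2 accounting.
# Countries and numbers are invented for illustration.

production = {"A": 8.0, "B": 2.0}          # emissions emitted in each country (Gt)
# emissions embodied in trade: (exporter, importer) -> Gt
trade = {("A", "B"): 1.5, ("B", "A"): 0.3}

def consumption_based(country):
    exported = sum(v for (src, dst), v in trade.items() if src == country)
    imported = sum(v for (src, dst), v in trade.items() if dst == country)
    return production[country] - exported + imported

print(consumption_based("A"), consumption_based("B"))
```

Note that the reallocation conserves the global total: shifting responsibility from producers to consumers changes national inventories, not world emissions.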

  2. Consumption-based accounting of CO2 emissions.

    PubMed

    Davis, Steven J; Caldeira, Ken

    2010-03-23

CO2 emissions from the burning of fossil fuels are the primary cause of global warming. Much attention has been focused on the CO2 directly emitted by each country, but relatively little attention has been paid to the amount of emissions associated with the consumption of goods and services in each country. Consumption-based accounting of CO2 emissions differs from traditional, production-based inventories because of imports and exports of goods and services that, either directly or indirectly, involve CO2 emissions. Here, using the latest available data, we present a global consumption-based CO2 emissions inventory and calculations of associated consumption-based energy and carbon intensities. We find that, in 2004, 23% of global CO2 emissions, or 6.2 gigatonnes CO2, were traded internationally, primarily as exports from China and other emerging markets to consumers in developed countries. In some wealthy countries, including Switzerland, Sweden, Austria, the United Kingdom, and France, >30% of consumption-based emissions were imported, with net imports to many Europeans of >4 tons CO2 per person in 2004. Net import of emissions to the United States in the same year was somewhat less: 10.8% of total consumption-based emissions and 2.4 tons CO2 per person. In contrast, 22.5% of the emissions produced in China in 2004 were exported, on net, to consumers elsewhere. Consumption-based accounting of CO2 emissions demonstrates the potential for international carbon leakage. Sharing responsibility for emissions among producers and consumers could facilitate international agreement on global climate policy that is now hindered by concerns over the regional and historical inequity of emissions.

  3. Implications of Climate Change for State Bioassessment Programs and Approaches to Account for Effects (Final Report)

    EPA Science Inventory

    EPA announced the availability of the final report, Implications of Climate Change for State Bioassessment Programs and Approaches to Account for Effects. This report uses biological data collected by four states in wadeable rivers and streams to examine the components ...

  4. A Comparison of the Learning Approaches of Accounting and Science Students at an Irish University

    ERIC Educational Resources Information Center

    Byrne, Marann; Finlayson, Odilla; Flood, Barbara; Lyons, Orla; Willis, Pauline

    2010-01-01

    One of the major challenges facing accounting education is the creation of a learning environment that promotes high-quality learning. Comparative research across disciplines offers educators the opportunity to gain a better understanding of the influence of contextual and personal variables on students' learning approaches. Using the Approaches…

  5. The Army’s Approach to Property Accountability: A Strategic Assessment

    DTIC Science & Technology

    2012-02-07

The Army's Approach to Property Accountability: A Strategic Assessment, by Colonel Thomas Rivard, United States Army; Project Adviser: Dr. Richard M. Meinhart. This SRP is submitted in partial fulfillment of the requirements of the ... United States Army War College, Class of 2012. DISTRIBUTION STATEMENT A: Approved for Public Release; Distribution is Unlimited.

  6. 76 FR 20974 - Implications of Climate Change for Bioassessment Programs and Approaches To Account for Effects

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-14

    ... AGENCY Implications of Climate Change for Bioassessment Programs and Approaches To Account for Effects... by climate change. The study (1) Investigates the potential to identify biological response signals to climate change within existing bioassessment data from Maine, North Carolina, Ohio, and Utah;...

  7. Pension Accounting and Reporting with Other Comprehensive Income and Deferred Taxes: A Worksheet Approach

    ERIC Educational Resources Information Center

    Jackson, Robert E.; Sneathen, L. Dwight, Jr.; Veal, Timothy R.

    2012-01-01

    This instructional tool presents pension accounting using a worksheet approach where debits equal credits for both the employer and for the plan. Transactions associated with the initiation of the plan through the end of the second year of the plan are presented, including their impact on accumulated other comprehensive income and deferred taxes.…
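A minimal sketch of the worksheet invariant the instructional tool is built around (debits equal credits for every set of postings); the account names and amounts below are invented:

```python
# Hypothetical miniature of the worksheet idea: each pension-related
# posting is recorded as (account, debit, credit), and the worksheet
# must remain balanced. Accounts and amounts are invented.
entries = [
    ("pension expense",          120.0,   0.0),
    ("cash contributed to plan",   0.0, 100.0),
    ("pension liability",          0.0,  20.0),
]

total_debits = sum(d for _, d, _ in entries)
total_credits = sum(c for _, _, c in entries)
assert total_debits == total_credits  # the worksheet invariant
print(total_debits)
```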

  8. Impact of Entry Mode on Students' Approaches to Learning: A Study of Accounting Students

    ERIC Educational Resources Information Center

    Abhayawansa, Subhash; Tempone, Irene; Pillay, Soma

    2012-01-01

    This study examines the impact of prior learning experience on students' approaches to learning (SAL). It compares SAL of accounting students admitted to university in Australia on the basis of Institutes of Technical and Further Education (TAFE) qualifications (TAFE-to-university) and through direct entry mode (Year 12-to-university). The…

  9. An Internet-Based Accounting Information Systems Project

    ERIC Educational Resources Information Center

    Miller, Louise

    2012-01-01

    This paper describes a student project assignment used in an accounting information systems course. We are now truly immersed in the internet age, and while many required accounting information systems courses and textbooks introduce database design, accounting software development, cloud computing, and internet security, projects involving the…

  10. Skull base approaches in neurosurgery

    PubMed Central

    2010-01-01

Skull base surgery is one of the most demanding surgeries. Many structures can easily be injured when operating in the skull base, so it is very important for the neurosurgeon to choose the right approach in order to reach the lesion without harming the other intact structures. Due to the pioneering work of Cushing, Hirsch, Yasargil, Krause, Dandy and other dedicated neurosurgeons, it is possible to address tumors and other lesions in the anterior, midline and posterior cranial base. With the transsphenoidal, frontolateral, pterional and lateral suboccipital approaches, nearly every region of the skull base can be exposed. Many different skull base approaches have been described for various neurosurgical diseases during the last 20 years. The selection of an approach may differ from country to country; e.g., in the United States orbitozygomaticotomy for special lesions of the anterior skull base, or petrosectomy for clivus meningiomas, are found more frequently than in Europe. The reason for writing this review was the question: are there keyhole approaches with which one can deal with a vast variety of lesions in the neurosurgical field? In my opinion, the different surgical approaches mentioned above cover almost 95% of all skull base tumors and lesions. In the following text these approaches are described: 1) the pterional approach, 2) the frontolateral approach, 3) the transsphenoidal approach, and 4) the lateral suboccipital approach. These approaches can be extended and combined with each other, and in the following we want to enhance this philosophy. PMID:20602753

  11. Demetrius Cantemir: the first account of transabdominal approach to repair groin hernia.

    PubMed

    Nicolau, A E

    2009-01-01

The first description of the transabdominal approach for hernia repair was written by Demetrius Cantemir, Prince of Moldavia and encyclopedic scholar, in his 1716 Latin manuscript "Incrementa et decrementa Aulae Othomaniae" ("The History of the Growth and Decay of the Ottoman Empire"). This manuscript was one of the most important in Eastern Europe at the time. It was first translated into English in 1734 by N. Tyndal, and all subsequent translations into various other languages were based on this English version. The original manuscript now belongs to the Houghton Library of Harvard University, where it was rediscovered in 1984 by V. Candea. Our article presents for the first time the complete account of the surgical procedure performed by Albanian physicians in the prince's palace in Constantinople. The patient was the Prince's secretary. There is a detailed description of the operation, the postoperative therapy and the medical course to recovery. The text presented is translated into English from the Annotations of Volume Two, chapter four. We consider it worthwhile to present to the medical community this valuable but lesser-known contribution to the history of medicine.

  12. Communication: A combined periodic density functional and incremental wave-function-based approach for the dispersion-accounting time-resolved dynamics of ⁴He nanodroplets on surfaces: ⁴He/graphene

    SciTech Connect

    Lara-Castells, María Pilar de; Stoll, Hermann; Civalleri, Bartolomeo; Causà, Mauro; Voloshina, Elena; Mitrushchenkov, Alexander O.; Pi, Martí

    2014-10-21

In this work we propose a general strategy to calculate accurate He–surface interaction potentials. It extends the dispersionless density functional approach recently developed by Pernal et al. [Phys. Rev. Lett. 103, 263201 (2009)] to adsorbate-surface interactions by including periodic boundary conditions. We also introduce a scheme to parametrize the dispersion interaction by calculating two- and three-body dispersion terms at coupled cluster singles and doubles and perturbative triples (CCSD(T)) level via the method of increments [H. Stoll, J. Chem. Phys. 97, 8449 (1992)]. The performance of the composite approach is tested on ⁴He/graphene by determining the energies of the low-lying selective adsorption states, finding an excellent agreement with the best available theoretical data. Second, the capability of the approach to describe dispersionless correlation effects realistically is used to extract dispersion effects in time-dependent density functional simulations on the collision of ⁴He droplets with a single graphene sheet. It is found that dispersion effects play a key role in the fast spreading of the ⁴He nanodroplet, the evaporation-like process of helium atoms, and the formation of solid-like helium structures. These characteristics are expected to be quite general and highly relevant to explain experimental measurements with the newly developed helium droplet mediated deposition technique.

  13. Communication: A combined periodic density functional and incremental wave-function-based approach for the dispersion-accounting time-resolved dynamics of ⁴He nanodroplets on surfaces: ⁴He/graphene.

    PubMed

    de Lara-Castells, María Pilar; Stoll, Hermann; Civalleri, Bartolomeo; Causà, Mauro; Voloshina, Elena; Mitrushchenkov, Alexander O; Pi, Martí

    2014-10-21

In this work we propose a general strategy to calculate accurate He-surface interaction potentials. It extends the dispersionless density functional approach recently developed by Pernal et al. [Phys. Rev. Lett. 103, 263201 (2009)] to adsorbate-surface interactions by including periodic boundary conditions. We also introduce a scheme to parametrize the dispersion interaction by calculating two- and three-body dispersion terms at coupled cluster singles and doubles and perturbative triples (CCSD(T)) level via the method of increments [H. Stoll, J. Chem. Phys. 97, 8449 (1992)]. The performance of the composite approach is tested on ⁴He/graphene by determining the energies of the low-lying selective adsorption states, finding an excellent agreement with the best available theoretical data. Second, the capability of the approach to describe dispersionless correlation effects realistically is used to extract dispersion effects in time-dependent density functional simulations on the collision of ⁴He droplets with a single graphene sheet. It is found that dispersion effects play a key role in the fast spreading of the ⁴He nanodroplet, the evaporation-like process of helium atoms, and the formation of solid-like helium structures. These characteristics are expected to be quite general and highly relevant to explain experimental measurements with the newly developed helium droplet mediated deposition technique.
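The flavor of a two-body dispersion sum between an adsorbate and surface sites can be sketched as follows; the C6 coefficient and geometry are placeholders, not the CCSD(T)-derived increments of the paper, and the three-body terms are omitted:

```python
# Illustrative two-body dispersion sum E = -sum_i C6 / r_i^6 between one
# He atom and carbon sites of a small graphene-like patch. C6 and the
# geometry are placeholders, not the paper's incremental CCSD(T) values.

C6 = 1.0  # placeholder He-C dispersion coefficient (a.u.)

def dispersion_energy(he_pos, carbon_sites):
    e = 0.0
    for c in carbon_sites:
        r2 = sum((a - b) ** 2 for a, b in zip(he_pos, c))
        e -= C6 / r2 ** 3        # 1/r^6 written as 1/(r^2)^3
    return e

# a 3x3 grid of surface sites in the z = 0 plane
sites = [(x, y, 0.0) for x in (-1.0, 0.0, 1.0) for y in (-1.0, 0.0, 1.0)]
print(dispersion_energy((0.0, 0.0, 3.0), sites))
```

In the method of increments, such pair (and triple) contributions are instead evaluated quantum-chemically for localized orbital groups and summed; the loop structure over sites, however, looks just like this sketch.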

  14. How do the approaches to accountability compare for charities working in international development?

    PubMed

    Kirsch, David

    2014-09-01

    Approaches to accountability vary between charities working to reduce under-five mortality in underdeveloped countries, and healthcare workers and facilities in Canada. Comparison reveals key differences, similarities and trade-offs. For example, while health professionals are governed by legislation and healthcare facilities have a de facto obligation to be accredited, charities and other international organizations are not subject to mandatory international laws or guidelines or to de facto international standards. Charities have policy goals similar to those found in the Canadian substudies, including access, quality, cost control, cost-effectiveness and customer satisfaction. However, the relative absence of external policy tools means that these goals may not be realized. Accountability can be beneficial, but too much or the wrong kind of accountability can divert resources and diminish returns.

  15. Territory management an appropriate approach for taking into account dynamic risks

    NASA Astrophysics Data System (ADS)

    Fernandez, M.; Ruegg, J.

    2012-04-01

The territorial approach to risk analysis has become well established in the scientific literature in recent years, especially in francophone work. It is an especially appropriate approach for exploring the large number of criteria and factors that shape the composition of vulnerabilities and risks across a territory. In this sense, the approach is appropriate for identifying not only risks due to natural hazards but also social and environmental risks. Our case study explores a catastrophic landslide, a collapse of 6 million cubic meters of rock at Los Chorros, in the municipality of San Cristobal Verapaz, Guatemala, in January 2009. We demonstrate that the same natural hazard has different consequences within this territory and may also increase or even create new vulnerabilities and risks for the population. The analysis shows that the same event can endanger various aspects of the territory: resources, functions (agriculture or housing, for example) and allocations, and highlights the different types of vulnerabilities that land users (e.g., farmers, merchants, transport drivers) face. To resolve a post-disaster situation, the actors choose one vulnerability among a set of vulnerabilities (in a multi-vulnerability context), and with this choice they define their own acceptable risk limits. To give an example, the transport driver chooses to reduce economic vulnerability when going to the local market by crossing the landslide area, thereby accepting physical vulnerability. In the context of a developing country with weak development and limited resources, land users who become the risk managers after the disaster are compelled to prioritize among different actions for reducing risks. This study provides a novel approach to risk management by adding a political science and geography dimension through the territory approach, improving our understanding of multi-hazard and multi-risk management.
Based on findings from this case study, this work asserts that risk is not

  16. A behavior-analytic account of depression and a case report using acceptance-based procedures

    PubMed Central

    Dougher, Michael J.; Hackbert, Lucianne

    1994-01-01

Although roughly 6% of the general population is affected by depression at some time during their lifetime, the disorder has been relatively neglected by behavior analysts. The preponderance of research on the etiology and treatment of depression has been conducted by cognitive behavior theorists and biological psychiatrists and psychopharmacologists interested in the biological substrates of depression. These approaches have certainly been useful, but their reliance on cognitive and biological processes and their lack of attention to environment-behavior relations render them unsatisfactory from a behavior-analytic perspective. The purpose of this paper is to provide a behavior-analytic account of depression and to derive from this account several possible treatment interventions. In addition, case material is presented to illustrate an acceptance-based approach with a depressed client. PMID:22478195

  17. Integrating Mission-Based Values into Accounting Curriculum: Catholic Social Teaching and Introductory Accounting

    ERIC Educational Resources Information Center

    Hise, Joan Vane; Koeplin, John P.

    2010-01-01

    This paper presents several reasons why mission-based values, in this case Catholic Social Teaching (CST), should be incorporated into a university business curriculum. The CST tenets include the sanctity of human life; call to family, community, and participation; rights and responsibilities; option for the poor and vulnerable; the dignity of…

  18. Accounting Faculty Utilization of Web-Based Resources to Enhance In-Class Instruction

    ERIC Educational Resources Information Center

    Black, Thomas G.; Turetsky, Howard F.

    2010-01-01

    Our study examines the extent to which accounting faculty use web-based resources to augment classroom instruction. Moreover, we explore the effects of the institutional factors of accounting accreditation and the existence of an accounting Ph.D. program on internet use by accounting academics toward enhancing pedagogy, while controlling for the…

  19. Statistical approaches to account for false-positive errors in environmental DNA samples.

    PubMed

    Lahoz-Monfort, José J; Guillera-Arroita, Gurutzeta; Tingley, Reid

    2016-05-01

    Environmental DNA (eDNA) sampling is prone to both false-positive and false-negative errors. We review statistical methods to account for such errors in the analysis of eDNA data and use simulations to compare the performance of different modelling approaches. Our simulations illustrate that even low false-positive rates can produce biased estimates of occupancy and detectability. We further show that removing or classifying single PCR detections in an ad hoc manner under the suspicion that such records represent false positives, as sometimes advocated in the eDNA literature, also results in biased estimation of occupancy, detectability and false-positive rates. We advocate alternative approaches to account for false-positive errors that rely on prior information, or the collection of ancillary detection data at a subset of sites using a sampling method that is not prone to false-positive errors. We illustrate the advantages of these approaches over ad hoc classifications of detections and provide practical advice and code for fitting these models in maximum likelihood and Bayesian frameworks. Given the severe bias induced by false-negative and false-positive errors, the methods presented here should be more routinely adopted in eDNA studies.
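The mixture likelihood that such false-positive occupancy models maximize can be sketched for a single site; the Royle-Link-style mixture form below is a common formulation, and the parameter values are illustrative:

```python
# Sketch of a single-site occupancy likelihood with false positives:
# a mixture over the site being occupied (detections at rate p11) or
# empty (false positives at rate p10). Parameter values are illustrative.
from math import comb

def site_likelihood(y, K, psi, p11, p10):
    """P(y detections in K PCR replicates) under occupancy prob. psi,
    true-positive rate p11, and false-positive rate p10."""
    binom = lambda p: comb(K, y) * p**y * (1 - p) ** (K - y)
    return psi * binom(p11) + (1 - psi) * binom(p10)

# A single positive out of 3 replicates is ambiguous: it may be a true
# detection at an occupied site or a false positive at an empty one.
L = site_likelihood(y=1, K=3, psi=0.5, p11=0.8, p10=0.05)
print(round(L, 4))
```

Fitting proceeds by maximizing the product of such terms over sites (or by MCMC in a Bayesian framework); the ambiguity of single detections is exactly why ad hoc removal of them biases the estimates, as the abstract notes.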

  20. An enhanced nonlinear damping approach accounting for system constraints in active mass dampers

    NASA Astrophysics Data System (ADS)

    Venanzi, Ilaria; Ierimonti, Laura; Ubertini, Filippo

    2015-11-01

Active mass dampers are a viable solution for mitigating wind-induced vibrations in high-rise buildings and improving occupants' comfort. Such devices suffer in particular when they reach the force saturation of the actuators and the maximum extension of their stroke, which may occur under severe loading conditions (e.g. wind gusts and earthquakes). Exceeding the actuators' physical limits can impair the control performance of the system or even damage the devices, with the consequent need to repair or replace part of the control system. Controllers for active mass dampers should therefore account for these technological limits. Prior work by the authors addressed stroke issues and led to a nonlinear damping approach that is very easy to implement in practice: a modified skyhook algorithm complemented with a nonlinear braking force that reverses the direction of the mass before it reaches the stroke limit. This paper presents an enhanced version of that approach which also accounts for force saturation of the actuator while retaining the simplicity of implementation. This is achieved by modulating the control force with a nonlinear smooth function of the ratio between the actuator's force and its saturation limit. Results of a numerical investigation show that the proposed approach performs comparably to the method of the State Dependent Riccati Equation, a well-established technique for designing optimal controllers for constrained systems that is nevertheless very difficult to apply in practice.

  1. Designing Rules for Accounting Transaction Identification based on Indonesian NLP

    NASA Astrophysics Data System (ADS)

    Iswandi, I.; Suwardi, I. S.; Maulidevi, N. U.

    2017-03-01

The recording of accounting transactions is based on transaction evidence such as invoices, receipts, letters of intent, electricity bills, and telephone bills. In this paper, we propose a design of rules to identify the entities located on a sales invoice. Several entities are identified in a sales invoice, namely: invoice date, company name, invoice number, product id, product name, quantity and total price. These entities are identified using a named entity recognition method. The entities generated by the rules serve as a basis for automating data input into the accounting system.
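A rule-based extraction of this kind can be sketched as labelled regular expressions, one per entity (the rule patterns and the English-language invoice below are hypothetical stand-ins for the paper's Indonesian-language rules):

```python
import re

# Hypothetical rule set: one labelled pattern per invoice entity.
RULES = {
    "invoice_number": re.compile(r"Invoice\s*(?:No\.?|#)\s*:?\s*(\S+)", re.I),
    "invoice_date":   re.compile(r"Date\s*:?\s*(\d{1,2}/\d{1,2}/\d{4})", re.I),
    "total_price":    re.compile(r"Total\s*:?\s*\$?([\d,]+\.?\d*)", re.I),
}

def extract_entities(text):
    """Apply each rule and keep the first match per entity label."""
    out = {}
    for label, pattern in RULES.items():
        m = pattern.search(text)
        if m:
            out[label] = m.group(1)
    return out

invoice = "Invoice No: INV-2017-031  Date: 03/01/2017  Total: $1,250.00"
print(extract_entities(invoice))
# → {'invoice_number': 'INV-2017-031', 'invoice_date': '03/01/2017', 'total_price': '1,250.00'}
```

Real rules for entities such as company and product names would need dictionaries or positional cues rather than pure patterns, which is where a named entity recognition method earns its keep.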

  2. Accounting for estimated IQ in neuropsychological test performance with regression-based techniques.

    PubMed

    Testa, S Marc; Winicki, Jessica M; Pearlson, Godfrey D; Gordon, Barry; Schretlen, David J

    2009-11-01

    Regression-based normative techniques account for variability in test performance associated with multiple predictor variables and generate expected scores based on algebraic equations. Using this approach, we show that estimated IQ, based on oral word reading, accounts for 1-9% of the variability beyond that explained by individual differences in age, sex, race, and years of education for most cognitive measures. These results confirm that adding estimated "premorbid" IQ to demographic predictors in multiple regression models can incrementally improve the accuracy with which regression-based norms (RBNs) benchmark expected neuropsychological test performance in healthy adults. It remains to be seen whether the incremental variance in test performance explained by estimated "premorbid" IQ translates to improved diagnostic accuracy in patient samples. We describe these methods, and illustrate the step-by-step application of RBNs with two cases. We also discuss the rationale, assumptions, and caveats of this approach. More broadly, we note that adjusting test scores for age and other characteristics might actually decrease the accuracy with which test performance predicts absolute criteria, such as the ability to drive or live independently.
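The mechanics of a regression-based norm can be sketched in a few lines (simulated normative data and coefficients are hypothetical; the method is just the standard "predict from demographics + estimated IQ, then standardize the residual" approach the abstract describes):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated normative sample: age, years of education, estimated IQ.
n = 500
age = rng.uniform(20, 80, n)
educ = rng.uniform(8, 20, n)
iq = rng.normal(100, 15, n)
# Hypothetical test score depending on all three predictors plus noise.
score = 50 - 0.2 * age + 0.8 * educ + 0.3 * iq + rng.normal(0, 5, n)

# Fit the normative regression: expected score given predictors.
X = np.column_stack([np.ones(n), age, educ, iq])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
resid_sd = np.std(score - X @ beta, ddof=X.shape[1])

def rbn_z(obs, age, educ, iq):
    """Standardized deviation of an observed score from its RBN-expected value."""
    expected = beta @ np.array([1.0, age, educ, iq])
    return (obs - expected) / resid_sd

# A 65-year-old with 12 years of education and estimated IQ 110, scoring 60:
print(f"z = {rbn_z(60.0, 65, 12, 110):.2f}")
```

The resulting z-score benchmarks an individual against demographically similar peers of similar estimated premorbid ability, which is exactly the adjustment the authors caution may not always improve prediction of absolute functional criteria.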

  3. Pharmacokinetic Modeling of Manganese III. Physiological Approaches Accounting for Background and Tracer Kinetics

    SciTech Connect

    Teeguarden, Justin G.; Gearhart, Jeffrey; Clewell, III, H. J.; Covington, Tammie R.; Nong, Andy; Anderson, Melvin E.

    2007-01-01

Manganese (Mn) is an essential nutrient. Mn deficiency is associated with altered lipid (Kawano et al. 1987) and carbohydrate metabolism (Baly et al. 1984; Baly et al. 1985), abnormal skeletal cartilage development (Keen et al. 2000), decreased reproductive capacity, and brain dysfunction. Occupational and accidental inhalation exposures to aerosols containing high concentrations of Mn produce neurological symptoms with Parkinson-like characteristics in workers. At present, there is also concern about the use of the manganese-containing compound methylcyclopentadienyl manganese tricarbonyl (MMT) in unleaded gasoline as an octane enhancer. Combustion of MMT produces aerosols containing a mixture of manganese salts (Lynam et al. 1999). These Mn particulates may be inhaled at low concentrations by the general public in areas using MMT. Risk assessments for essential elements need to acknowledge that risks arise from both excesses and deficiencies, and that significant amounts of these nutrients are present in the body even in the absence of any exogenous exposure. With Mn there is an added complication: the primary risk is associated with inhalation, while Mn is an essential dietary nutrient. Exposure standards for inhaled Mn will therefore need to consider the substantial background uptake from normal ingestion. Andersen et al. (1999) suggested a generic approach for essential nutrient risk assessment. An acceptable exposure limit could be based on some 'tolerable' change in tissue concentration in normal and exposed individuals, i.e., a change somewhere from 10 to 25% of the individual variation in tissue concentration seen in a large human population. A reliable multi-route, multi-species pharmacokinetic model would be necessary for the implementation of this type of dosimetry-based risk assessment approach for Mn. Physiologically-based pharmacokinetic (PBPK) models for various xenobiotics have proven valuable in contributing to a variety of chemical specific risk

  4. 26 CFR 1.801-8 - Contracts with reserves based on segregated asset accounts.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... short-term capital losses of $10,000 attributable to its general asset accounts and realized short-term... based on a segregated asset account. (i) For purposes of part I, section 801(g)(1)(B) defines the term...) The term contract with reserves based on a segregated asset account includes a contract such as...

  5. An integrative mechanistic account of psychological distress, therapeutic change and recovery: the Perceptual Control Theory approach.

    PubMed

    Higginson, Sally; Mansell, Warren; Wood, Alex M

    2011-03-01

The exact nature and mechanisms of psychological change within psychological disorders remain unknown. This review aims to use a psychological framework known as Perceptual Control Theory (Powers, 1973, 2005; Powers, Clark, & McFarland, 1960) to integrate the diverse literature within psychotherapy research. The core principles of Perceptual Control Theory are explained, and key domains of psychotherapy are considered to explore how well they converge with these principles. The quantitative and qualitative empirical literature on the process of psychological change is reviewed to examine how it fits with predictions based on Perceptual Control Theory. Furthermore, the prerequisites for psychological change (client qualities, therapist qualities, the therapeutic alliance, and the shifting of awareness) are also considered to examine their consistency with a Perceptual Control Theory account. Finally, the strengths and limitations of a Perceptual Control Theory account in explaining the mechanism of psychological change are considered.

  6. Accounting for water management issues within hydrological simulation: Alternative modelling options and a network optimization approach

    NASA Astrophysics Data System (ADS)

    Efstratiadis, Andreas; Nalbantis, Ioannis; Rozos, Evangelos; Koutsoyiannis, Demetris

    2010-05-01

In mixed natural and artificialized river basins, many complexities arise from anthropogenic interventions in the hydrological cycle, including abstractions from surface water bodies, groundwater pumping or recharge, and water returns through drainage systems. Typical engineering approaches adopt a multi-stage modelling procedure, with the aim of handling the complexity of process interactions and the lack of measured abstractions. In this context, the entire hydrosystem is separated into natural and artificial sub-systems or components; the natural ones are modelled individually, and their predictions (i.e. hydrological fluxes) are transferred to the artificial components as inputs to a water management scheme. To account for the interactions between the various components, an iterative procedure is essential, whereby the outputs of the artificial sub-systems (i.e. abstractions) become inputs to the natural ones. However, this strategy suffers from multiple shortcomings, since it presupposes that purely natural sub-systems can be located and that sufficient information is available for each sub-system modelled, including suitable, i.e. "unmodified", data for calibrating the hydrological component. In addition, implementing such a strategy is ineffective when the entire scheme runs in stochastic simulation mode. To cope with the above drawbacks, we developed a generalized modelling framework following a network optimization approach. This originates from graph theory, which has been successfully implemented within some advanced computer packages for water resource systems analysis. The user formulates a unified system comprising the hydrographical network and the typical components of a water management network (aqueducts, pumps, junctions, demand nodes, etc.). Input data for the latter include hydraulic properties, constraints, targets, priorities and operation costs. The real-world system is described through a conceptual graph, whose dummy properties

  7. Meeting Performance-based Training Demands: Accountability in an Intervention-based Practicum.

    ERIC Educational Resources Information Center

    Barnett, David W.; Daly III, Edward J.; Hampshire, Ellen M.; Hines, Nancy Rovak; Maples, Kelly A.; Ostrom, Jennifer K.; Van Buren, Amy E.

    1999-01-01

    Describes accountability methods built into practicums for school psychology trainees. Results of intervention-based services were summed across individual cases developed by trainees as a means of examining the overall effectiveness of the practicum experiences. Outcomes are reported as procedural adherence to the model of service delivery,…

  8. Superconfiguration accounting approach versus average-atom model in local-thermodynamic-equilibrium highly ionized plasmas.

    PubMed

    Faussurier, G

    1999-06-01

Statistical methods for describing and simulating complex ionized plasmas require the development of reliable and computationally tractable models. In that spirit, we propose the screened-hydrogenic average atom, augmented with corrections resulting from fluctuations of the occupation probabilities around the mean-field equilibrium, as an approximation for calculating the grand potential and related statistical properties. Our main objective is to check the validity of this approach by comparing its predictions with those given by the superconfiguration accounting method. The latter is well suited to this purpose: it makes it possible to go beyond the mean-field model by using nonperturbative, analytic, and systematic techniques. Moreover, it allows us to establish the relationship between the detailed configuration accounting and average-atom methods. To our knowledge, this is the first time that the superconfiguration description has been used in this context. Finally, this study is also an occasion for presenting a powerful technique from analytic number theory for calculating superconfiguration-averaged quantities.

  9. Australian Rural Accountants' Views on How Locally Provided CPD Compares with City-Based Provision

    ERIC Educational Resources Information Center

    Halabi, Abdel K.

    2015-01-01

    This paper analyses Australian rural accountants' attitudes and levels of satisfaction with continuing professional development (CPD), based on whether the CPD was delivered by a professional accounting body in a rural or metropolitan area. The paper responds to prior research that finds rural accountants are dissatisfied with professional…

  10. A formal ideal-based account of typicality.

    PubMed

    Voorspoels, Wouter; Vanpaemel, Wolf; Storms, Gert

    2011-10-01

    Inspired by Barsalou's (Journal of Experimental Psychology: Learning, Memory, and Cognition, 11, 629-654, 1985) proposal that categories can be represented by ideals, we develop and test a computational model, the ideal dimension model (IDM). The IDM is tested in its account of the typicality gradient for 11 superordinate natural language concepts and, using Bayesian model evaluation, contrasted with a standard exemplar model and a central prototype model. The IDM is found to capture typicality better than do the exemplar model and the central tendency prototype model, in terms of both goodness of fit and generalizability. The present findings challenge the dominant view that exemplar representations are most successful and present compelling evidence that superordinate natural language categories can be represented using an abstract summary, in the form of ideal representations. Supplemental appendices for this article can be downloaded from http://mc.psychonomic-journals.org/content/supplemental.

  11. A user-friendly approach to cost accounting in laboratory animal facilities.

    PubMed

    Baker, David G

    2011-08-19

    Cost accounting is an essential management activity for laboratory animal facility management. In this report, the author describes basic principles of cost accounting and outlines steps for carrying out cost accounting in laboratory animal facilities. Methods of post hoc cost accounting analysis for maximizing the efficiency of facility operations are also described.

  12. The Need for Global Application of the Accountability for Reasonableness Approach to Support Sustainable Outcomes

    PubMed Central

    Byskov, Jens; Maluka, Stephen Oswald; Marchal, Bruno; Shayo, Elizabeth H.; Bukachi, Salome; Zulu, Joseph M.; Blas, Erik; Michelo, Charles; Ndawi, Benedict; Hurtig, Anna-Karin

    2017-01-01

The accountability for reasonableness (AFR) concept has been developed and discussed for over two decades. Its interpretation has been studied in several ways, guided in part by the specific settings and the researchers involved. This has in turn influenced the development of the concept, but has not led to universal application. Its potential use in health technology assessments (HTAs) has recently been identified by Daniels et al as yet another strong justification for AFR-based process guidance that draws on both qualitative and broader participatory input for HTA, but this has raised concerns from those who primarily value the consistency and objectivity of more quantitative and reproducible evidence. With reference to studies of AFR-based interventions, and the motivation for their consolidation that these studies have repeatedly documented, we argue that it can even be unethical not to take AFR conditions beyond their still mainly formative stage and test their application within routine health systems management, given their expected support for more sustainable health improvements. Ever-increasing evidence and technical expertise are necessary but at times contradictory, and do not in isolation lead to optimally accountable, fair and sustainable solutions. Technical experts, politicians, managers, service providers, community members, and beneficiaries each have their own values, expertise and preferences, which must be considered to secure the necessary buy-in and sustainability. Legitimacy, accountability and fairness do not come about without inclusive and agreed process guidance that can reconcile differences of opinion, and indeed differences in evidence, to arrive at a compromise that all understand and accept, though do not necessarily agree with, in the current context - until major premises for the decision change. AFR should be widely adopted in projects and services under close monitoring and frequent reviews.

  13. Accounting for non-independent detection when estimating abundance of organisms with a Bayesian approach

    USGS Publications Warehouse

    Martin, Julien; Royle, J. Andrew; MacKenzie, Darryl I.; Edwards, Holly H.; Kery, Marc; Gardner, Beth

    2011-01-01

    Summary 1. Binomial mixture models use repeated count data to estimate abundance. They are becoming increasingly popular because they provide a simple and cost-effective way to account for imperfect detection. However, these models assume that individuals are detected independently of each other. This assumption may often be violated in the field. For instance, manatees (Trichechus manatus latirostris) may surface in turbid water (i.e. become available for detection during aerial surveys) in a correlated manner (i.e. in groups). However, correlated behaviour, affecting the non-independence of individual detections, may also be relevant in other systems (e.g. correlated patterns of singing in birds and amphibians). 2. We extend binomial mixture models to account for correlated behaviour and therefore to account for non-independent detection of individuals. We simulated correlated behaviour using beta-binomial random variables. Our approach can be used to simultaneously estimate abundance, detection probability and a correlation parameter. 3. Fitting binomial mixture models to data that followed a beta-binomial distribution resulted in an overestimation of abundance even for moderate levels of correlation. In contrast, the beta-binomial mixture model performed considerably better in our simulation scenarios. We also present a goodness-of-fit procedure to evaluate the fit of beta-binomial mixture models. 4. We illustrate our approach by fitting both binomial and beta-binomial mixture models to aerial survey data of manatees in Florida. We found that the binomial mixture model did not fit the data, whereas there was no evidence of lack of fit for the beta-binomial mixture model. This example helps illustrate the importance of using simulations and assessing goodness-of-fit when analysing ecological data with N-mixture models. Indeed, both the simulations and the goodness-of-fit procedure highlighted the limitations of the standard binomial mixture model for aerial
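The core of the problem is extra-binomial variation: correlated detections inflate the variance of the counts well beyond what a binomial model allows, which is what biases binomial-mixture abundance estimates. A small simulation (hypothetical parameter values; not the manatee analysis itself) makes the inflation concrete, using the beta-binomial parameterized by its mean detection probability p and a correlation parameter rho:

```python
import numpy as np
from scipy.stats import betabinom

rng = np.random.default_rng(7)

# Repeated counts of a site holding N individuals, each detected with
# probability p, but with within-survey correlation rho: the count is
# beta-binomial rather than binomial.
N, p, rho, K = 50, 0.4, 0.3, 10_000
a = p * (1 - rho) / rho          # beta-binomial shape parameters implied
b = (1 - p) * (1 - rho) / rho    # by mean p and correlation rho = 1/(a+b+1)
counts = betabinom.rvs(N, a, b, size=K, random_state=rng)

print(f"mean            : {counts.mean():.1f}  (binomial: {N * p:.1f})")
print(f"variance        : {counts.var():.1f}  (binomial: {N * p * (1 - p):.1f})")
print(f"theoretical var : {N * p * (1 - p) * (1 + (N - 1) * rho):.1f}")
```

With rho = 0.3 the count variance is more than an order of magnitude above the binomial value at the same mean, which is the signal the beta-binomial mixture model absorbs and the standard binomial mixture misattributes to higher abundance.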

  14. 26 CFR 1.801-8 - Contracts with reserves based on segregated asset accounts.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 26 Internal Revenue 8 2014-04-01 2014-04-01 false Contracts with reserves based on segregated... Contracts with reserves based on segregated asset accounts. (a) Definitions—(1) Annuity contracts include...) defines the term contract with reserves based on a segregated asset account as a contract (individual...

  15. 26 CFR 1.801-8 - Contracts with reserves based on segregated asset accounts.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 26 Internal Revenue 8 2012-04-01 2012-04-01 false Contracts with reserves based on segregated... Contracts with reserves based on segregated asset accounts. (a) Definitions—(1) Annuity contracts include...) defines the term contract with reserves based on a segregated asset account as a contract (individual...

  16. 26 CFR 1.801-8 - Contracts with reserves based on segregated asset accounts.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 26 Internal Revenue 8 2013-04-01 2013-04-01 false Contracts with reserves based on segregated... Contracts with reserves based on segregated asset accounts. (a) Definitions—(1) Annuity contracts include...) defines the term contract with reserves based on a segregated asset account as a contract (individual...

  17. Statistical approaches to account for missing values in accelerometer data: Applications to modeling physical activity.

    PubMed

    Xu, Selene Yue; Nelson, Sandahl; Kerr, Jacqueline; Godbole, Suneeta; Patterson, Ruth; Merchant, Gina; Abramson, Ian; Staudenmayer, John; Natarajan, Loki

    2016-07-10

Physical inactivity is a recognized risk factor for many chronic diseases. Accelerometers are increasingly used as an objective means to measure daily physical activity. One challenge in using these devices is missing data due to device nonwear. We used a well-characterized cohort of 333 overweight postmenopausal breast cancer survivors to examine missing data patterns of accelerometer outputs over the day. Based on these observed missingness patterns, we created pseudo-simulated datasets with realistic missing data patterns. We developed statistical methods to design imputation and variance-weighting algorithms that account for missing data effects when fitting regression models. Bias and precision of each method were evaluated and compared. Our results indicated that not accounting for missing data in the analysis yielded unstable estimates in the regression analysis. Incorporating variance weights and/or subject-level imputation improved precision by >50%, compared to ignoring missing data. We recommend that these simple, easy-to-implement statistical tools be used to improve the analysis of accelerometer data.
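The effect of nonwear, and of a simple subject-level imputation, can be sketched on simulated hourly data (all values hypothetical; the paper's actual algorithms are more elaborate than this hour-of-day mean fill):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated hourly activity counts for one subject over 7 days,
# with nonwear gaps recorded as NaN.
days, hours = 7, 24
activity = rng.gamma(shape=2.0, scale=50.0, size=(days, hours))
nonwear = rng.random((days, hours)) < 0.2           # ~20% missing at random
observed = np.where(nonwear, np.nan, activity)

# Subject-level imputation: replace a missing hour with the subject's own
# mean activity for that hour of day, averaged over the days it was worn.
hourly_means = np.nanmean(observed, axis=0)          # one mean per hour of day
imputed = np.where(np.isnan(observed), hourly_means, observed)

naive_total = np.nansum(observed, axis=1).mean()     # treats nonwear as zero
imputed_total = imputed.sum(axis=1).mean()
true_total = activity.sum(axis=1).mean()
print(f"true {true_total:.0f}  naive {naive_total:.0f}  imputed {imputed_total:.0f}")
```

The naive daily total is biased downward by roughly the nonwear fraction, while the imputed total tracks the truth, which is the bias-reduction mechanism behind the recommended tools.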

  18. Mindfulness meditation-based pain relief: a mechanistic account.

    PubMed

    Zeidan, Fadel; Vago, David R

    2016-06-01

    Pain is a multidimensional experience that involves interacting sensory, cognitive, and affective factors, rendering the treatment of chronic pain challenging and financially burdensome. Further, the widespread use of opioids to treat chronic pain has led to an opioid epidemic characterized by exponential growth in opioid misuse and addiction. The staggering statistics related to opioid use highlight the importance of developing, testing, and validating fast-acting nonpharmacological approaches to treat pain. Mindfulness meditation is a technique that has been found to significantly reduce pain in experimental and clinical settings. The present review delineates findings from recent studies demonstrating that mindfulness meditation significantly attenuates pain through multiple, unique mechanisms-an important consideration for the millions of chronic pain patients seeking narcotic-free, self-facilitated pain therapy.

  19. Accountability to Public Stakeholders in Watershed-Based Restoration

    EPA Science Inventory

    There is an increasing push at the federal, state, and local levels for watershed-based conservation projects. These projects work to address water quality issues in degraded waterways through the implementation of a suite of best management practices on land throughout a watersh...

  20. School-Based Budgets: Getting, Spending, and Accounting.

    ERIC Educational Resources Information Center

    Herman, Jerry L.; Herman, Janice L.

    With the advent of large interest in school-based management came the task of inventing a different type of budgeting system--one that delegated the many tasks of developing a budget, expending the allocated funds, and controlling those expenditures in a way that did not exceed the allocation to the site level. This book explores the various means…

  1. One Paradox in District Accountability and Site-Based Management.

    ERIC Educational Resources Information Center

    Shellman, David W.

    The paradox of site-based school management with use of standardized tests or instructional management systems that restrict teacher choices was evident in one school district in North Carolina in which measurement of student success has centered on student performance on state-mandated tests. A study was conducted to see if students whose…

  2. Students' Concern about Indebtedness: A Rank Based Social Norms Account

    ERIC Educational Resources Information Center

    Aldrovandi, Silvio; Wood, Alex M.; Maltby, John; Brown, Gordon D. A.

    2015-01-01

    This paper describes a new model of students' concern about indebtedness within a rank-based social norms framework. Study 1 found that students hold highly variable beliefs about how much other students will owe at the end of their degree. Students' concern about their own anticipated debt--and their intention of taking on a part-time job during…

  3. Performance-Based Measurement: Action for Organizations and HPT Accountability

    ERIC Educational Resources Information Center

    Larbi-Apau, Josephine A.; Moseley, James L.

    2010-01-01

    Basic measurements and applications of six selected general but critical operational performance-based indicators--effectiveness, efficiency, productivity, profitability, return on investment, and benefit-cost ratio--are presented. With each measurement, goals and potential impact are explored. Errors, risks, limitations to measurements, and a…

  4. Test-Based Accountability: Potential Benefits and Pitfalls of Science Assessment with Student Diversity

    ERIC Educational Resources Information Center

    Penfield, Randall D.; Lee, Okhee

    2010-01-01

    Recent test-based accountability policy in the U.S. has involved annually assessing all students in core subjects and holding schools accountable for adequate progress of all students by implementing sanctions when adequate progress is not met. Despite its potential benefits, basing educational policy on assessments developed for a student…

  5. Impacts of Performance-Based Accountability on Institutional Performance in the U.S.

    ERIC Educational Resources Information Center

    Shin, Jung Cheol

    2010-01-01

    In the 1990s, most US states adopted new forms of performance-based accountability, e.g., performance-based budgeting, funding, or reporting. This study analyzed changes in institutional performance following the adoption of these new accountability standards. We measured institutional performance by representative education and research…

  6. Separation of time-based and trial-based accounts of the partial reinforcement extinction effect.

    PubMed

    Bouton, Mark E; Woods, Amanda M; Todd, Travis P

    2014-01-01

    Two appetitive conditioning experiments with rats examined time-based and trial-based accounts of the partial reinforcement extinction effect (PREE). In the PREE, the loss of responding that occurs in extinction is slower when the conditioned stimulus (CS) has been paired with a reinforcer on some of its presentations (partially reinforced) instead of every presentation (continuously reinforced). According to a time-based or "time-accumulation" view (e.g., Gallistel and Gibbon, 2000), the PREE occurs because the organism has learned in partial reinforcement to expect the reinforcer after a larger amount of time has accumulated in the CS over trials. In contrast, according to a trial-based view (e.g., Capaldi, 1967), the PREE occurs because the organism has learned in partial reinforcement to expect the reinforcer after a larger number of CS presentations. Experiment 1 used a procedure that equated partially and continuously reinforced groups on their expected times to reinforcement during conditioning. A PREE was still observed. Experiment 2 then used an extinction procedure that allowed time in the CS and the number of trials to accumulate differentially through extinction. The PREE was still evident when responding was examined as a function of expected time units to the reinforcer, but was eliminated when responding was examined as a function of expected trial units to the reinforcer. There was no evidence that the animal responded according to the ratio of time accumulated during the CS in extinction over the time in the CS expected before the reinforcer. The results thus favor a trial-based account over a time-based account of extinction and the PREE. This article is part of a Special Issue entitled: Associative and Temporal Learning.

  7. Measuring health system performance: A new approach to accountability and quality improvement in New Zealand.

    PubMed

    Ashton, Toni

    2015-08-01

    In February 2014, the New Zealand Ministry of Health released a new framework for measuring the performance of the New Zealand health system. The two key aims are to strengthen accountability to taxpayers and to lift the performance of the system's component parts using a 'whole-of-system' approach to performance measurement. Development of this new framework--called the Integrated Performance and Incentive Framework (IPIF)--was stimulated by a need for a performance management framework which reflects the health system as a whole, which encourages primary and secondary providers to work towards the same end, and which incorporates the needs and priorities of local communities. Measures within the IPIF will be set at two levels: the system level, where measures are set nationally, and the local district level, where measures which contribute towards the system level indicators will be selected by local health alliances. In the first year, the framework applies only at the system level and only to primary health care services. It will continue to be developed over time and will gradually be extended to cover a wide range of health and disability services. The success of the IPIF in improving health sector performance depends crucially on the willingness of health sector personnel to engage closely with the measurement process.

  8. A blue/green water-based accounting framework for assessment of water security

    NASA Astrophysics Data System (ADS)

    Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo M.

    2014-09-01

A comprehensive assessment of water security can incorporate several water-related concepts, while accounting for Blue and Green Water (BW and GW) types defined in accordance with the hydrological processes involved. Here we demonstrate how a quantitative analysis of the provision probability and use of BW and GW can be conducted, so as to provide indicators of water scarcity and vulnerability at the basin level. To illustrate the approach, we use the Soil and Water Assessment Tool (SWAT) to model the hydrology of an agricultural basin (291 km2) within the Cantareira Water Supply System in Brazil. To provide a more comprehensive basis for decision making, we analyze the BW and GW footprint components against probabilistic levels (50th and 30th percentiles) of freshwater availability for human activities over a 23-year period. Several contrasting situations of BW provision are distinguished using different hydrologically based methodologies for specifying monthly Environmental Flow Requirements (EFRs), and the risk of natural EFR violation is evaluated by means of a freshwater provision index. Our results reveal clear spatial and temporal patterns of water scarcity and vulnerability levels within the basin. Taking into account conservation targets for the basin, it appears that the more restrictive EFR methods are more appropriate than the method currently employed in the study basin. The blue/green water-based accounting framework developed here provides a useful integration of hydrologic, ecosystem and human-needs information on a monthly basis, thereby improving our understanding of how and where water-related threats to human and aquatic ecosystem security can arise.

  9. A New Approach to Account for the Correlations among Single Nucleotide Polymorphisms in Genome-Wide Association Studies

    PubMed Central

    Chen, Zhongxue; Liu, Qingzhong

    2011-01-01

    In genetic association studies, such as genome-wide association studies (GWAS), the number of single nucleotide polymorphisms (SNPs) can be as large as hundreds of thousands. Due to linkage disequilibrium, many SNPs are highly correlated; assuming they are independent is not valid. The commonly used multiple comparison methods, such as Bonferroni correction, are not appropriate and are too conservative when applied to GWAS. To overcome these limitations, many approaches have been proposed to estimate the so-called effective number of independent tests to account for the correlations among SNPs. However, many current effective number estimation methods are based on eigenvalues of the correlation matrix. When the dimension of the matrix is large, the numeric results may be unreliable or even unobtainable. To circumvent this obstacle and provide better estimates, we propose a new effective number estimation approach which is not based on the eigenvalues. We compare the new method with others through simulated and real data. The comparison results show that the proposed method has very good performance. PMID:21849789
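For context, the eigenvalue-based estimation that the proposed method avoids can be sketched as follows. This is a Li & Ji (2005)-style computation on a toy spectrum, not the authors' new estimator; the example exploits the closed-form eigenvalues of an equicorrelated block rather than decomposing a large matrix.

```python
import math

def m_eff_li_ji(eigenvalues):
    """Li & Ji (2005)-style effective number of independent tests from
    the eigenvalues of a SNP correlation matrix: each eigenvalue lam
    contributes I(lam >= 1) + (lam - floor(lam))."""
    return sum((1 if lam >= 1 else 0) + (lam - math.floor(lam))
               for lam in eigenvalues)

# Toy case with a closed-form spectrum: 3 SNPs equicorrelated at r = 0.99
# plus 1 independent SNP. The 3-SNP block has eigenvalues 1 + 2r and
# (1 - r) twice; the independent SNP contributes an eigenvalue of 1.
r = 0.99
eigs = [1 + 2 * r, 1 - r, 1 - r, 1.0]
m_eff = m_eff_li_ji(eigs)        # ~3 effective tests instead of 4
alpha_per_test = 0.05 / m_eff    # Bonferroni with the effective number
```

The correction is milder than a naive Bonferroni over all 4 SNPs, which is the point of effective-number methods; the paper's contribution is obtaining such an estimate without the eigendecomposition, which becomes numerically fragile for large matrices.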

  10. Cognitive fatigue: A Time-based Resource-sharing account.

    PubMed

    Borragán, Guillermo; Slama, Hichem; Bartolomei, Mario; Peigneux, Philippe

    2017-04-01

Cognitive Fatigue (CF) is an important confound impacting cognitive performance. How CF is triggered, and which features make a cognitive effort feel exhausting, remain unclear. Within the theoretical framework of the Time-based Resource-sharing (TBRS) model (Barrouillet et al., 2004), we hypothesized that CF is an outcome of increased cognitive load due to constrained time to process ongoing cognitive demands. We tested this cognitive-load-related CF hypothesis across two experiments manipulating both task complexity and the cognitive load induced by the processing time interval. To do so, we used the TloadDback paradigm, a working memory dual task in which high and low cognitive load levels can be individually adjusted. In Experiment 1, participants were administered high cognitive load (HCL; short processing time interval) and low cognitive load (LCL; long processing time interval) conditions while task complexity was kept constant (1-back dual task). In Experiment 2, two tasks featuring different levels of complexity were both administered at the individual's maximal processing speed capacity for each task (i.e., a short processing time interval). Results disclosed higher CF in the HCL than in the LCL condition in Experiment 1. By contrast, in Experiment 2, similar levels of CF were obtained for different levels of task complexity when the processing time interval was individually adjusted to induce an HCL condition. Altogether, our results indicate that processing-time-related cognitive load eventually leads to the subjective feeling of CF and to a decrease in alertness. In this framework, we propose that the development of CF can be envisioned as the result of sustained cognitive demands, irrespective of task complexity.

  11. Accounting for Heaping in Retrospectively Reported Event Data – A Mixture-Model Approach

    PubMed Central

    Bar, Haim Y.; Lillard, Dean R.

    2012-01-01

    When event data are retrospectively reported, more temporally distal events tend to get “heaped” on even multiples of reporting units. Heaping may introduce a type of attenuation bias because it causes researchers to mismatch time-varying right-hand side variables. We develop a model-based approach to estimate the extent of heaping in the data, and how it affects regression parameter estimates. We use smoking cessation data as a motivating example, but our method is general. It facilitates the use of retrospective data from the multitude of cross-sectional and longitudinal studies worldwide that collect and potentially could collect event data. PMID:22733577
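The heaping mechanism itself is easy to illustrate with a small simulation; the rounding model below (respondents heap onto multiples of five with some probability) is a hypothetical sketch, not the authors' mixture model.

```python
import random

def report(true_value, p_heap, unit, rng):
    """A respondent reports the exact value, or (with probability
    p_heap) heaps it onto the nearest multiple of `unit`."""
    if rng.random() < p_heap:
        return unit * round(true_value / unit)
    return true_value

rng = random.Random(0)
true_values = [v for v in range(1, 101)] * 100   # uniform on 1..100
reported = [report(v, p_heap=0.5, unit=5, rng=rng) for v in true_values]

# Only 20% of the values 1..100 are multiples of 5, but heaping
# inflates that share to roughly 0.5 + 0.5 * 0.2 = 0.6.
heaped_share = sum(1 for v in reported if v % 5 == 0) / len(reported)
```

A mixture model in this spirit treats each report as coming from either the exact-reporting or the heaped component and estimates the mixing probability from the data, which is what lets the regression be corrected for the mismatch.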

  12. A Comparative Study of the Effect of Web-Based versus In-Class Textbook Ethics Instruction on Accounting Students' Propensity to Whistle-Blow

    ERIC Educational Resources Information Center

    McManus, Lisa; Subramaniam, Nava; James, Wendy

    2012-01-01

    The authors examined whether accounting students' propensity to whistle-blow differed between those instructed through a web-based teaching module and those exposed to a traditional in-class textbook-focused approach. A total of 156 students from a second-year financial accounting course participated in the study. Ninety students utilized the…

  13. Does Participation in a Computer-Based Learning Program in Introductory Financial Accounting Course Lead to Choosing Accounting as a Major?

    ERIC Educational Resources Information Center

    Owhoso, Vincent; Malgwi, Charles A.; Akpomi, Margaret

    2014-01-01

    The authors examine whether students who completed a computer-based intervention program, designed to help them develop abilities and skills in introductory accounting, later declared accounting as a major. A sample of 1,341 students participated in the study, of which 74 completed the intervention program (computer-based assisted learning [CBAL])…

  14. Working toward More Engaged and Successful Accounting Students: A Balanced Scorecard Approach

    ERIC Educational Resources Information Center

    Fredin, Amy; Fuchsteiner, Peter; Portz, Kris

    2015-01-01

    Prior research indicates that student engagement is the key to student success, as measured by college grades, degree completion, and graduate school enrollment. We propose a set of goals and objectives for accounting students, in particular, to help them become engaged not only in the educational process, but also in the accounting profession.…

  15. A Total Quality Management Approach to Assurance of Learning in the Accounting Classroom: An Empirical Study

    ERIC Educational Resources Information Center

    Harvey, Mary Ellen; Eisner, Susan

    2011-01-01

    The research presented in this paper seeks to discern which combination of pedagogical tools most positively impact student learning of the introductory Accounting curriculum in the Principles of Accounting courses in a 4-year U.S. public college. This research topic is relevant because it helps address a quandary many instructors experience: how…

  16. A Practical Approach to Accountability in an Oklahoma School. Project SEEK.

    ERIC Educational Resources Information Center

    Southwest Oklahoma Region 14 Service Center, Elk City.

    This booklet presents the accountability program developed by the Elk City (Oklahoma) Public Schools. During the first year of the program ten broad educational goals were formulated through a series of administrator workshops, accountability committee meetings, informal surveys of the community, and questionnaires for teachers and students.…

  17. Discourse Surrounding the International Education Standards for Professional Accountants (IES): A Content Analysis Approach

    ERIC Educational Resources Information Center

    Sugahara, Satoshi; Wilson, Rachel

    2013-01-01

    The development and implementation of the International Education Standards (IES) for professional accountants is currently an important issue in accounting education and for educators interested in a shift toward international education standards more broadly. The purpose of this study is to investigate professional and research discourse…

  18. The Effect of Web-Based Collaborative Learning Methods to the Accounting Courses in Technical Education

    ERIC Educational Resources Information Center

    Cheng, K. W. Kevin

    2009-01-01

    This study mainly explored the effect of applying web-based collaborative learning instruction to the accounting curriculum on student's problem-solving attitudes in Technical Education. The research findings and proposed suggestions would serve as a reference for the development of accounting-related curricula and teaching strategies. To achieve…

  19. 24 CFR 990.280 - Project-based budgeting and accounting.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 4 2011-04-01 2011-04-01 false Project-based budgeting and... budgeting and accounting. (a) All PHAs covered by this subpart shall develop and maintain a system of budgeting and accounting for each project in a manner that allows for analysis of the actual revenues...

  20. Measure for Measure: How Proficiency-Based Accountability Systems Affect Inequality in Academic Achievement

    ERIC Educational Resources Information Center

    Jennings, Jennifer; Sohn, Heeju

    2014-01-01

    How do proficiency-based accountability systems affect inequality in academic achievement? This article reconciles mixed findings in the literature by demonstrating that three factors jointly determine accountability's impact. First, by analyzing student-level data from a large urban school district, we find that when educators face accountability…

  1. A network identity authentication protocol of bank account system based on fingerprint identification and mixed encryption

    NASA Astrophysics Data System (ADS)

    Zhu, Lijuan; Liu, Jingao

    2013-07-01

This paper describes a network identity authentication protocol for a bank account system based on fingerprint identification and mixed encryption. The protocol provides every bank user a safe and effective way to manage his or her own bank account, and can also effectively prevent hacker attacks and bank clerk crime, thereby guaranteeing the legitimate rights and interests of bank users.
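The abstract does not reproduce the protocol, but the general idea of authenticating without transmitting the biometric secret can be sketched as a generic HMAC challenge-response exchange. Everything below (function names, the key derivation) is an illustrative assumption, not the paper's scheme, and a real deployment would handle fingerprint templates with far more care.

```python
import hashlib
import hmac
import os

def enroll(template: bytes) -> bytes:
    """Derive and store a server-side key from the fingerprint template."""
    return hashlib.sha256(template).digest()

def challenge() -> bytes:
    """Server issues a fresh random nonce for each login attempt."""
    return os.urandom(16)

def respond(template: bytes, nonce: bytes) -> bytes:
    """Client proves possession of the template-derived key by
    HMAC-ing the nonce; the template itself never leaves the device."""
    key = hashlib.sha256(template).digest()
    return hmac.new(key, nonce, hashlib.sha256).digest()

def verify(stored_key: bytes, nonce: bytes, response: bytes) -> bool:
    expected = hmac.new(stored_key, nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

The fresh nonce prevents replay of an earlier response, which is one of the attack classes such a protocol must address.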

  2. 24 CFR 990.280 - Project-based budgeting and accounting.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... budgeting and accounting for each project in a manner that allows for analysis of the actual revenues and..., etc.). (b)(1) Financial information to be budgeted and accounted for at a project level shall include all data needed to complete project-based financial statements in accordance with...

  3. ASPEN--A Web-Based Application for Managing Student Server Accounts

    ERIC Educational Resources Information Center

    Sandvig, J. Christopher

    2004-01-01

    The growth of the Internet has greatly increased the demand for server-side programming courses at colleges and universities. Students enrolled in such courses must be provided with server-based accounts that support the technologies that they are learning. The process of creating, managing and removing large numbers of student server accounts is…

  4. Materiality in a Practice-Based Approach

    ERIC Educational Resources Information Center

    Svabo, Connie

    2009-01-01

    Purpose: The paper aims to provide an overview of the vocabulary for materiality which is used by practice-based approaches to organizational knowing. Design/methodology/approach: The overview is theoretically generated and is based on the anthology Knowing in Organizations: A Practice-based Approach edited by Nicolini, Gherardi and Yanow. The…

  5. A Review of Financial Accounting Fraud Detection based on Data Mining Techniques

    NASA Astrophysics Data System (ADS)

    Sharma, Anuj; Kumar Panigrahi, Prabin

    2012-02-01

With the upsurge in financial accounting fraud experienced in the current economic climate, financial accounting fraud detection (FAFD) has become an emerging topic of great importance for academia, research, and industry. The failure of organizations' internal auditing systems to identify accounting fraud has led to the use of specialized procedures to detect it, collectively known as forensic accounting. Data mining techniques provide great aid in financial accounting fraud detection, since the large volumes and complexity of financial data pose major challenges for forensic accounting. This paper presents a comprehensive review of the literature on the application of data mining techniques to the detection of financial accounting fraud and proposes a framework for data-mining-based accounting fraud detection. The systematic and comprehensive literature review of the data mining techniques applicable to financial accounting fraud detection may provide a foundation for future research in this field. The findings of this review show that data mining techniques such as logistic models, neural networks, Bayesian belief networks, and decision trees have been applied most extensively to provide primary solutions to the problems inherent in the detection and classification of fraudulent data.
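Of the techniques the review names, the logistic model is the simplest to sketch. The toy classifier below is trained by plain gradient descent on made-up ratio features; the feature names and data are illustrative, not drawn from the reviewed studies.

```python
import math

def train_logistic(X, y, lr=0.1, epochs=2000):
    """Tiny stochastic-gradient-descent logistic regression, a toy
    stand-in for the logistic models used in fraud classification."""
    w = [0.0] * (len(X[0]) + 1)   # bias followed by feature weights
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - yi                      # gradient of log-loss
            w[0] -= lr * g
            for j, xj in enumerate(xi):
                w[j + 1] -= lr * g * xj
    return w

def predict(w, xi):
    """Probability that an observation is fraudulent."""
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical features: (revenue-growth anomaly, ratio anomaly score)
X = [(0.1, 0.2), (0.2, 0.1), (0.9, 0.8), (0.8, 0.9)]
y = [0, 0, 1, 1]   # 1 = fraudulent
w = train_logistic(X, y)
```

In practice such a model would be fitted to engineered financial-statement ratios with far larger samples, but the interpretable coefficients are one reason logistic models recur throughout the FAFD literature.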

  6. Combating QR-Code-Based Compromised Accounts in Mobile Social Networks

    PubMed Central

    Guo, Dong; Cao, Jian; Wang, Xiaoqi; Fu, Qiang; Li, Qiang

    2016-01-01

Cyber Physical Social Sensing makes mobile social networks (MSNs) popular with users. However, attacks are rampant in which malicious URLs are spread covertly through quick response (QR) codes to take control of accounts in MSNs and propagate malicious messages. Currently, there are generally two types of methods to identify compromised accounts in MSNs: one is to analyze the potential threats on wireless access points and on handheld devices' operating systems so as to stop compromised accounts from spreading malicious messages; the other is to apply methods for detecting compromised accounts in online social networks to MSNs. Neither type of method focuses on the problems of MSNs themselves or on the interaction of sensors' messages, which leads to platform restrictiveness and oversimplified methods. To stop the spreading of compromised accounts in MSNs effectively, the attacks first have to be traced to their sources. Through sensors, users exchange information in MSNs and acquire information by scanning QR codes. Therefore, analyzing the traces of sensor-related information helps to identify compromised accounts in MSNs. This paper analyzes the differences in the message-sending modes of compromised and normal accounts, analyzes the regularity of GPS (Global Positioning System)-based location information, and introduces the concepts of entropy and conditional entropy to construct an entropy-based model built on machine learning strategies. To this end, about 500,000 Sina Weibo accounts and about 100 million corresponding messages were collected. Through validation, the accuracy rate of the model is shown to be as high as 87.6%, with a false positive rate of only 3.7%. Meanwhile, comparative experiments on the feature sets prove that sensor-based location information can be applied to detect compromised accounts in MSNs. PMID:27657071
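The entropy feature at the core of such a model can be sketched simply: an account driven by automation tends to post through a single channel, yielding low entropy over its message-sending modes. The mode labels below are hypothetical.

```python
import math
from collections import Counter

def entropy(seq):
    """Shannon entropy (bits) of a categorical sequence, e.g. the
    message-sending modes observed for one account."""
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in Counter(seq).values())

# A normal user mixes clients; a bot controlling a compromised account
# often posts through a single automated channel.
normal = ["web", "app", "web", "api", "app", "web"]
bot    = ["api"] * 6
```

Entropy-style features like this (and conditional entropy given, say, time of day or GPS cell) can then be fed to a standard classifier, which matches the machine-learning strategy the abstract describes.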

  7. Combating QR-Code-Based Compromised Accounts in Mobile Social Networks.

    PubMed

    Guo, Dong; Cao, Jian; Wang, Xiaoqi; Fu, Qiang; Li, Qiang

    2016-09-20

Cyber Physical Social Sensing makes mobile social networks (MSNs) popular with users. However, attacks are rampant in which malicious URLs are spread covertly through quick response (QR) codes to take control of accounts in MSNs and propagate malicious messages. Currently, there are generally two types of methods to identify compromised accounts in MSNs: one is to analyze the potential threats on wireless access points and on handheld devices' operating systems so as to stop compromised accounts from spreading malicious messages; the other is to apply methods for detecting compromised accounts in online social networks to MSNs. Neither type of method focuses on the problems of MSNs themselves or on the interaction of sensors' messages, which leads to platform restrictiveness and oversimplified methods. To stop the spreading of compromised accounts in MSNs effectively, the attacks first have to be traced to their sources. Through sensors, users exchange information in MSNs and acquire information by scanning QR codes. Therefore, analyzing the traces of sensor-related information helps to identify compromised accounts in MSNs. This paper analyzes the differences in the message-sending modes of compromised and normal accounts, analyzes the regularity of GPS (Global Positioning System)-based location information, and introduces the concepts of entropy and conditional entropy to construct an entropy-based model built on machine learning strategies. To this end, about 500,000 Sina Weibo accounts and about 100 million corresponding messages were collected. Through validation, the accuracy rate of the model is shown to be as high as 87.6%, with a false positive rate of only 3.7%. Meanwhile, comparative experiments on the feature sets prove that sensor-based location information can be applied to detect compromised accounts in MSNs.

  8. A Blue/Green Water-based Accounting Framework for Assessment of Water Security

    NASA Astrophysics Data System (ADS)

    Rodrigues, D. B.; Gupta, H. V.; Mendiondo, E. M.

    2013-12-01

A comprehensive assessment of water security can incorporate several water-related concepts, including provisioning and support for freshwater ecosystem services, water footprint, water scarcity, and water vulnerability, while accounting for Blue and Green Water (BW and GW) flows defined in accordance with the hydrological processes involved. Here, we demonstrate how a quantitative analysis of provisioning and demand (in terms of water footprint) for BW and GW ecosystem services can be conducted, so as to provide indicators of water scarcity and vulnerability at the basin level. To illustrate the approach, we use the Soil and Water Assessment Tool (SWAT) to model the hydrology of an agricultural basin (291 sq. km) within the Cantareira water supply system in Brazil. To provide a more comprehensive basis for decision-making, we compute the BW provision using three different hydrologically based methods for specifying monthly Environmental Flow Requirements (EFRs) over a 23-year period. The current BW-Footprint was defined using surface water rights for the reference year 2012. We then analyzed the BW- and GW-Footprints against long-term series of monthly values of freshwater availability. Our results reveal clear spatial and temporal patterns of water scarcity and vulnerability levels within the basin, and help to distinguish between human and natural (drought) causes of insecurity. The blue/green water-based accounting framework developed here can be benchmarked at a range of spatial scales, thereby improving our understanding of how and where water-related threats to human and aquatic ecosystem security can arise. Future investigation will be necessary to better understand the intra-annual variability of blue water demand and to evaluate the impacts of uncertainties associated with (a) the water rights database and (b) the effects of climate change projections on blue and green freshwater provision.

  9. Analyzing the Operation of Performance-Based Accountability Systems for Public Services. Technical Report

    ERIC Educational Resources Information Center

    Camm, Frank; Stecher, Brian M.

    2010-01-01

    Empirical evidence of the effects of performance-based public management is scarce. This report describes a framework used to organize available empirical information on one form of performance-based management, a performance-based accountability system (PBAS). Such a system identifies individuals or organizations that must change their behavior…

  10. Green function approach for the ab initio calculation of the optical and magneto-optical properties of solids:  Accounting for dynamical many-body effects

    NASA Astrophysics Data System (ADS)

    Perlov, A.; Chadov, S.; Ebert, H.

    2003-12-01

An approach for the calculation of the optical and magneto-optical properties of solids based on the one-particle Green function is introduced in the framework of the linear muffin-tin orbital method. The approach keeps all the advantages of the more accurate Korringa-Kohn-Rostoker scheme, such as the possibility of accounting for many-body effects in terms of a nonlocal, energy-dependent self-energy, but is numerically much more efficient. Application of various proposed model self-energies to the calculation of the optical properties of bulk Ni and Fe demonstrates the great potential of the new scheme.

  11. The Demand for Higher Education: A Static Structural Approach Accounting for Individual Heterogeneity and Nesting Patterns

    ERIC Educational Resources Information Center

    Flannery, Darragh; O'Donoghue, Cathal

    2013-01-01

    In this paper we estimate a structural model of higher education participation and labour choices in a static setting that accounts for individual heterogeneity and possible nesting structures in the decision process. We assume that young people that complete upper secondary education are faced with three choices, go to higher education, not go to…

  12. Peer-Mentoring Undergraduate Accounting Students: The Influence on Approaches to Learning and Academic Performance

    ERIC Educational Resources Information Center

    Fox, Alison; Stevenson, Lorna; Connelly, Patricia; Duff, Angus; Dunlop, Angela

    2010-01-01

    This article considers the impact of a student peer-mentoring programme (the Mentor Accountant Project, MAP) on first-year undergraduates' academic performance. The development of MAP was informed by reference to extant literature; it relies on the voluntary services of third-year students who then act as mentors to first-year student mentees in…

  13. Measuring Resources in Education: A Comparison of Accounting and the Resource Cost Model Approach.

    ERIC Educational Resources Information Center

    Chambers, Jay G.

    2000-01-01

    The need for programmatic cost information, data compatibility, and understanding input/output relationships are sparking efforts to improve standards for organizing and reporting educational-resource data. Unlike accountants, economists measure resources in real terms and organize information around service delivery, using a resource-cost model.…

  14. Financial Management Reforms in the Health Sector: A Comparative Study Between Cash-based and Accrual-based Accounting Systems

    PubMed Central

    Abolhallaje, Masoud; Jafari, Mehdi; Seyedin, Hesam; Salehi, Masoud

    2014-01-01

Background: Financial management and accounting reform in the public sector started in 2000. Moving from cash-based to accrual-based accounting is considered the key component of these reforms and adjustments in the public sector. Performing this reform in the health system is part of a bigger reform under the new public management. Objectives: The current study aimed to analyze the movement from cash-based to accrual-based accounting in the health sector in Iran. Patients and Methods: This comparative study was conducted in 2013 to compare financial management and the movement from cash-based to accrual-based accounting in the health sector in countries such as the United States, Britain, Canada, Australia, New Zealand, and Iran. Library resources and reputable databases such as Medline, Elsevier, Index Copernicus, DOAJ, EBSCO-CINAHL, SID, and Iranmedex were searched. Fish cards were used to collect the data. Data were compared and analyzed using comparative tables. Results: Developed countries have implemented accrual-based accounting and utilized the valid, reliable, and practical information of accrual-based reporting in different areas such as price and tariff setting, operational budgeting, public accounting, performance evaluation and comparison, and evidence-based decision making. In Iran, however, only a few public organizations, such as the municipalities and the universities of medical sciences, use accrual-based accounting; despite what is required by law, the other public organizations do not. Conclusions: There are advantages in applying accrual-based accounting in the public sector, which certainly depend on how the system is implemented in the sector. PMID:25763194

  15. Force field development for actinyl ions via quantum mechanical calculations: an approach to account for many body solvation effects.

    PubMed

    Rai, Neeraj; Tiwari, Surya P; Maginn, Edward J

    2012-09-06

Advances in computational algorithms and methodologies make it possible to use highly accurate quantum mechanical calculations to develop force fields (pair-wise additive intermolecular potentials) for condensed phase simulations. Despite these advances, this approach faces numerous hurdles for the case of actinyl ions, AcO2(n+) (high-oxidation-state actinide dioxo cations), mainly due to the complex electronic structure resulting from an interplay of s, p, d, and f valence orbitals. Traditional methods use a pair of molecules ("dimer") to generate a potential energy surface (PES) for force field parametrization, based on the assumption that many-body polarization effects are negligible. We show that this is a poor approximation for aqueous-phase uranyl ions and present an alternative approach for the development of actinyl ion force fields that includes important many-body solvation effects. Force fields are developed for the UO2(2+) ion with the SPC/Fw, TIP3P, TIP4P, and TIP5P water models and are validated by carrying out detailed molecular simulations on the uranyl aqua ion, one of the most characterized actinide systems. It is shown that the force fields faithfully reproduce available experimental structural data and hydration free energies. Failure to account for solvation effects when generating the PES leads to overbinding between UO2(2+) and water, resulting in incorrect hydration free energies and coordination numbers. A detailed analysis of the arrangement of water molecules in the first and second solvation shells of UO2(2+) is presented. The use of a simple functional form involving the sum of Lennard-Jones + Coulomb potentials makes the new force field compatible with a large number of available molecular simulation engines and common force fields.
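The Lennard-Jones + Coulomb functional form named at the end of the abstract can be written down directly; the constant and parameter values below are illustrative, not the published UO2(2+) parameters.

```python
def pair_energy(r, epsilon, sigma, q1, q2):
    """Lennard-Jones + Coulomb pair potential between two sites a
    distance r apart. Units here follow a GROMACS-like convention
    (kJ/mol, nm, elementary charges) purely for illustration."""
    COULOMB_K = 138.935458  # kJ mol^-1 nm e^-2 (1/(4*pi*eps0) in these units)
    lj = 4.0 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)
    coulomb = COULOMB_K * q1 * q2 / r
    return lj + coulomb
```

For neutral sites the LJ term has its minimum of -epsilon at r = 2^(1/6) * sigma; the simplicity of this form is what makes such a force field portable across common simulation engines.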

  16. 24 CFR 990.280 - Project-based budgeting and accounting.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... URBAN DEVELOPMENT THE PUBLIC HOUSING OPERATING FUND PROGRAM Asset Management § 990.280 Project-based... asset management fee may be charged only if the PHA performs all asset management activities described in this subpart (including project-based management, budgeting, and accounting). Asset...

  17. Beyond Traditional Literacy Instruction: Toward an Account-Based Literacy Training Curriculum in Libraries

    ERIC Educational Resources Information Center

    Cirella, David

    2012-01-01

    A diverse group, account-based services include a wide variety of sites commonly used by patrons, including online shopping sites, social networks, photo- and video-sharing sites, banking and financial sites, government services, and cloud-based storage. Whether or not a piece of information is obtainable online must be considered when creating…

  18. Cosmological constraints from Sunyaev-Zeldovich cluster counts: An approach to account for missing redshifts

    SciTech Connect

    Bonaldi, A.; Battye, R. A.; Brown, M. L.

    2014-05-10

The accumulation of redshifts presents a significant observational bottleneck when using galaxy cluster surveys to constrain cosmological parameters. We propose a simple method to allow the use of samples in which a fraction of the redshifts is unknown. The simplest assumption is that the missing redshifts are randomly extracted from the catalog, but the method also allows one to take into account known selection effects in the accumulation of redshifts. We quantify the reduction in statistical precision of cosmological parameter constraints as a function of the fraction of missing redshifts for simulated surveys, and also investigate the impact of making an incorrect assumption for the distribution of missing redshifts.

  19. The Usage of an Online Discussion Forum for the Facilitation of Case-Based Learning in an Intermediate Accounting Course: A New Zealand Case

    ERIC Educational Resources Information Center

    Weil, Sidney; McGuigan, Nicholas; Kern, Thomas

    2011-01-01

    This paper describes the implementation of an online discussion forum as a means of facilitating case-based learning in an intermediate financial accounting course. The paper commences with a review of case-based learning literature and the use of online discussions as a delivery platform, linking these pedagogical approaches to the emerging needs…

  20. [Orbitozygomatic approaches to the skull base].

    PubMed

    Cherekaev, V A; Gol'bin, D A; Belov, A I; Radchenkov, N S; Lasunin, N V; Vinokurov, A G

    2015-01-01

The paper is written in lecture format and is dedicated to one of the main basal approaches, the orbitozygomatic approach, which has been widely used by neurosurgeons for several decades. The authors describe the historical background of the approach's development and the features of the surgical technique, and also analyze the published data on application of the orbitozygomatic approach in surgery for skull base tumors and cerebral aneurysms.

  1. Situational Effects May Account for Gain Scores in Cognitive Ability Testing: A Longitudinal SEM Approach

    ERIC Educational Resources Information Center

    Matton, Nadine; Vautier, Stephane; Raufaste, Eric

    2009-01-01

    Mean gain scores for cognitive ability tests between two sessions in a selection setting are now a robust finding, yet not fully understood. Many authors do not attribute such gain scores to an increase in the target abilities. Our approach consists of testing a longitudinal SEM model suitable to this view. We propose to model the scores' changes…

  2. Accounting for Success and Failure: A Discursive Psychological Approach to Sport Talk

    ERIC Educational Resources Information Center

    Locke, Abigail

    2004-01-01

    In recent years, constructionist methodologies such as discursive psychology (Edwards & Potter, 1992) have begun to be used in sport research. This paper provides a practical guide to applying a discursive psychological approach to sport data. It discusses the assumptions and principles of discursive psychology and outlines the stages of a…

  3. A Bayesian Approach to Account for Misclassification and Overdispersion in Count Data

    PubMed Central

    Wu, Wenqi; Stamey, James; Kahle, David

    2015-01-01

    Count data are subject to considerable sources of what is often referred to as non-sampling error. Errors such as misclassification, measurement error and unmeasured confounding can lead to substantially biased estimators. It is strongly recommended that epidemiologists not only acknowledge these sorts of errors in data, but incorporate sensitivity analyses into part of the total data analysis. We extend previous work on Poisson regression models that allow for misclassification by thoroughly discussing the basis for the models and allowing for extra-Poisson variability in the form of random effects. Via simulation we show the improvements in inference that are brought about by accounting for both the misclassification and the overdispersion. PMID:26343704
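The bias that motivates the model can be demonstrated with a thinned-Poisson simulation: if each true event is independently recorded with probability s, the observed counts follow Poisson(s*lambda), so a naive rate estimate is attenuated by s. This is a sketch of the phenomenon, not the authors' Bayesian correction.

```python
import math
import random

def poisson_draw(rng, lam):
    """Knuth's method for sampling Poisson(lam); fine for small lam."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

rng = random.Random(1)
lam, sensitivity, n = 4.0, 0.8, 20000

observed = []
for _ in range(n):
    true_count = poisson_draw(rng, lam)
    # each true event is recorded only with probability `sensitivity`
    observed.append(sum(1 for _ in range(true_count)
                        if rng.random() < sensitivity))

naive_rate = sum(observed) / n            # estimates sensitivity * lam (~3.2)
corrected_rate = naive_rate / sensitivity  # recovers lam (~4.0) when
                                           # sensitivity is known
```

The division by a known sensitivity is the simplest possible correction; the paper's approach instead places priors on the misclassification parameters and adds random effects for extra-Poisson variability, propagating both uncertainties into the inference.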

  4. Accounting for Negative Automaintenance in Pigeons: A Dual Learning Systems Approach and Factored Representations

    PubMed Central

    Lesaint, Florian; Sigaud, Olivier; Khamassi, Mehdi

    2014-01-01

    Animals, including humans, are prone to developing persistent maladaptive and suboptimal behaviours. Some of these behaviours have been suggested to arise from interactions between brain systems of Pavlovian conditioning, the acquisition of responses to initially neutral stimuli previously paired with rewards, and instrumental conditioning, the acquisition of active behaviours leading to rewards. However, the mechanics of these systems and their interactions are still unclear. While extensively studied independently, few models have been developed to account for these interactions. In some experiments, pigeons have been observed to display a maladaptive behaviour suggested to involve conflicts between Pavlovian and instrumental conditioning. In a procedure referred to as negative automaintenance, a key light is paired with the subsequent delivery of food; however, any peck towards the key light results in the omission of the reward. Studies showed that in such a procedure some pigeons persisted in pecking to a substantial level despite its negative consequence, while others learned to refrain from pecking and maximized their cumulative rewards. Furthermore, the pigeons that were unable to refrain from pecking could nevertheless shift their pecks towards a harmless alternative key light. We confronted a computational model that combines dual learning systems and factored representations, recently developed to account for sign-tracking and goal-tracking behaviours in rats, with these negative automaintenance experimental data. We show that it can explain the variability of the observed behaviours and the capacity of alternative key lights to distract pigeons from their detrimental behaviours. These results confirm the proposed model as an interesting tool to reproduce experiments that could involve interactions between Pavlovian and instrumental conditioning. The model allows us to draw predictions that may be experimentally verified, which could help further investigate
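
    The dual-system idea in this abstract can be caricatured in a few lines: a model-free instrumental learner is blended with a Pavlovian value attached to the key light, and the blend weight omega decides whether maladaptive pecking persists under the omission schedule. Everything below (the blending scheme, parameter values, function names) is an illustrative sketch, not the authors' factored-representation model.

```python
import math
import random

def simulate(omega, trials=2000, alpha=0.1, beta=3.0, seed=0):
    """Negative automaintenance caricature: pecking the lit key cancels reward."""
    rng = random.Random(seed)
    q = {"peck": 0.0, "refrain": 0.0}  # model-free instrumental values
    v = 0.0                            # Pavlovian value of the key light
    pecks = 0
    for _ in range(trials):
        # The Pavlovian impulse biases responding towards the stimulus (pecking)
        u_peck = (1 - omega) * q["peck"] + omega * v
        u_refrain = (1 - omega) * q["refrain"]
        p_peck = 1.0 / (1.0 + math.exp(-beta * (u_peck - u_refrain)))
        a = "peck" if rng.random() < p_peck else "refrain"
        r = 0.0 if a == "peck" else 1.0  # omission schedule: pecks forfeit food
        q[a] += alpha * (r - q[a])       # instrumental update
        v += alpha * (r - v)             # Pavlovian (stimulus-reward) update
        pecks += a == "peck"
    return pecks / trials

# A stronger Pavlovian weight sustains the maladaptive pecking
print(simulate(0.1), simulate(0.9))
```

    With a small omega the instrumental system wins and pecking is suppressed; with a large omega pecking persists despite costing reward, mirroring the inter-individual variability the abstract describes.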

  5. Accounting for negative automaintenance in pigeons: a dual learning systems approach and factored representations.

    PubMed

    Lesaint, Florian; Sigaud, Olivier; Khamassi, Mehdi

    2014-01-01

    Animals, including humans, are prone to developing persistent maladaptive and suboptimal behaviours. Some of these behaviours have been suggested to arise from interactions between brain systems of Pavlovian conditioning, the acquisition of responses to initially neutral stimuli previously paired with rewards, and instrumental conditioning, the acquisition of active behaviours leading to rewards. However, the mechanics of these systems and their interactions are still unclear. While extensively studied independently, few models have been developed to account for these interactions. In some experiments, pigeons have been observed to display a maladaptive behaviour suggested to involve conflicts between Pavlovian and instrumental conditioning. In a procedure referred to as negative automaintenance, a key light is paired with the subsequent delivery of food; however, any peck towards the key light results in the omission of the reward. Studies showed that in such a procedure some pigeons persisted in pecking to a substantial level despite its negative consequence, while others learned to refrain from pecking and maximized their cumulative rewards. Furthermore, the pigeons that were unable to refrain from pecking could nevertheless shift their pecks towards a harmless alternative key light. We confronted a computational model that combines dual learning systems and factored representations, recently developed to account for sign-tracking and goal-tracking behaviours in rats, with these negative automaintenance experimental data. We show that it can explain the variability of the observed behaviours and the capacity of alternative key lights to distract pigeons from their detrimental behaviours. These results confirm the proposed model as an interesting tool to reproduce experiments that could involve interactions between Pavlovian and instrumental conditioning. The model allows us to draw predictions that may be experimentally verified, which could help further investigate

  6. A Multi-scale Approach for CO2 Accounting and Risk Analysis in CO2 Enhanced Oil Recovery Sites

    NASA Astrophysics Data System (ADS)

    Dai, Z.; Viswanathan, H. S.; Middleton, R. S.; Pan, F.; Ampomah, W.; Yang, C.; Jia, W.; Lee, S. Y.; McPherson, B. J. O. L.; Grigg, R.; White, M. D.

    2015-12-01

    Using carbon dioxide in enhanced oil recovery (CO2-EOR) is a promising technology for emissions management because CO2-EOR can dramatically reduce carbon sequestration costs in the absence of greenhouse gas emissions policies that include incentives for carbon capture and storage. This study develops a multi-scale approach to perform CO2 accounting and risk analysis for understanding CO2 storage potential within an EOR environment at the Farnsworth Unit of the Anadarko Basin in northern Texas. A set of geostatistical-based Monte Carlo simulations of CO2-oil-water flow and transport in the Morrow formation are conducted for global sensitivity and statistical analysis of the major risk metrics: CO2 injection rate, CO2 first breakthrough time, CO2 production rate, cumulative net CO2 storage, cumulative oil and CH4 production, and water injection and production rates. A global sensitivity analysis indicates that reservoir permeability, porosity, and thickness are the major intrinsic reservoir parameters that control net CO2 injection/storage and oil/CH4 recovery rates. The well spacing (the distance between the injection and production wells) and the sequence of alternating CO2 and water injection are the major operational parameters for designing an effective five-spot CO2-EOR pattern. The response surface analysis shows that the net CO2 injection rate increases with increasing reservoir thickness, permeability, and porosity. The oil/CH4 production rates are positively correlated with reservoir permeability, porosity, and thickness, but negatively correlated with the initial water saturation. The means and confidence intervals are estimated to quantify the uncertainty ranges of the risk metrics. The results from this study provide useful insights for understanding the CO2 storage potential and the corresponding risks of commercial-scale CO2-EOR fields.
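
    The global sensitivity workflow in this abstract (Monte Carlo sampling of reservoir parameters, then ranking their influence on a risk metric) can be illustrated with a toy surrogate standing in for the flow simulator. The parameter ranges and the response function below are invented for illustration, not Farnsworth Unit data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
# Toy priors for reservoir parameters (illustrative ranges, not site data)
perm = rng.lognormal(mean=3.0, sigma=0.5, size=n)   # permeability, mD
poro = rng.uniform(0.05, 0.25, size=n)              # porosity
thick = rng.uniform(10.0, 60.0, size=n)             # thickness, m
sw = rng.uniform(0.2, 0.6, size=n)                  # initial water saturation

# Toy surrogate for net CO2 injection rate (stands in for the flow simulator)
rate = perm * poro * thick * (1.0 - sw) + rng.normal(0.0, 5.0, size=n)

def spearman(x, y):
    """Rank correlation as a simple global sensitivity index."""
    rx = x.argsort().argsort().astype(float)
    ry = y.argsort().argsort().astype(float)
    return float(np.corrcoef(rx, ry)[0, 1])

for name, x in [("perm", perm), ("poro", poro), ("thick", thick), ("sw", sw)]:
    print(f"{name}: {spearman(x, rate):+.2f}")
```

    Permeability, porosity, and thickness come out strongly positive while initial water saturation is negative, the same qualitative ranking the abstract reports.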

  7. Evolutionary impact assessment: accounting for evolutionary consequences of fishing in an ecosystem approach to fisheries management.

    PubMed

    Laugen, Ane T; Engelhard, Georg H; Whitlock, Rebecca; Arlinghaus, Robert; Dankel, Dorothy J; Dunlop, Erin S; Eikeset, Anne M; Enberg, Katja; Jørgensen, Christian; Matsumura, Shuichi; Nusslé, Sébastien; Urbach, Davnah; Baulier, Loїc; Boukal, David S; Ernande, Bruno; Johnston, Fiona D; Mollet, Fabian; Pardoe, Heidi; Therkildsen, Nina O; Uusi-Heikkilä, Silva; Vainikka, Anssi; Heino, Mikko; Rijnsdorp, Adriaan D; Dieckmann, Ulf

    2014-03-01

    Managing fisheries resources to maintain healthy ecosystems is one of the main goals of the ecosystem approach to fisheries (EAF). While a number of international treaties call for the implementation of EAF, there are still gaps in the underlying methodology. One aspect that has received substantial scientific attention recently is fisheries-induced evolution (FIE). Increasing evidence indicates that intensive fishing has the potential to exert strong directional selection on life-history traits, behaviour, physiology, and morphology of exploited fish. Of particular concern is that reversing evolutionary responses to fishing can be much more difficult than reversing demographic or phenotypically plastic responses. Furthermore, like climate change, multiple agents cause FIE, with effects accumulating over time. Consequently, FIE may alter the utility derived from fish stocks, which in turn can modify the monetary value living aquatic resources provide to society. Quantifying and predicting the evolutionary effects of fishing is therefore important for both ecological and economic reasons. An important reason this is not happening is the lack of an appropriate assessment framework. We therefore describe the evolutionary impact assessment (EvoIA) as a structured approach for assessing the evolutionary consequences of fishing and evaluating the predicted evolutionary outcomes of alternative management options. EvoIA can contribute to EAF by clarifying how evolution may alter stock properties and ecological relations, support the precautionary approach to fisheries management by addressing a previously overlooked source of uncertainty and risk, and thus contribute to sustainable fisheries.

  8. Evolutionary impact assessment: accounting for evolutionary consequences of fishing in an ecosystem approach to fisheries management

    PubMed Central

    Laugen, Ane T; Engelhard, Georg H; Whitlock, Rebecca; Arlinghaus, Robert; Dankel, Dorothy J; Dunlop, Erin S; Eikeset, Anne M; Enberg, Katja; Jørgensen, Christian; Matsumura, Shuichi; Nusslé, Sébastien; Urbach, Davnah; Baulier, Loїc; Boukal, David S; Ernande, Bruno; Johnston, Fiona D; Mollet, Fabian; Pardoe, Heidi; Therkildsen, Nina O; Uusi-Heikkilä, Silva; Vainikka, Anssi; Heino, Mikko; Rijnsdorp, Adriaan D; Dieckmann, Ulf

    2014-01-01

    Managing fisheries resources to maintain healthy ecosystems is one of the main goals of the ecosystem approach to fisheries (EAF). While a number of international treaties call for the implementation of EAF, there are still gaps in the underlying methodology. One aspect that has received substantial scientific attention recently is fisheries-induced evolution (FIE). Increasing evidence indicates that intensive fishing has the potential to exert strong directional selection on life-history traits, behaviour, physiology, and morphology of exploited fish. Of particular concern is that reversing evolutionary responses to fishing can be much more difficult than reversing demographic or phenotypically plastic responses. Furthermore, like climate change, multiple agents cause FIE, with effects accumulating over time. Consequently, FIE may alter the utility derived from fish stocks, which in turn can modify the monetary value living aquatic resources provide to society. Quantifying and predicting the evolutionary effects of fishing is therefore important for both ecological and economic reasons. An important reason this is not happening is the lack of an appropriate assessment framework. We therefore describe the evolutionary impact assessment (EvoIA) as a structured approach for assessing the evolutionary consequences of fishing and evaluating the predicted evolutionary outcomes of alternative management options. EvoIA can contribute to EAF by clarifying how evolution may alter stock properties and ecological relations, support the precautionary approach to fisheries management by addressing a previously overlooked source of uncertainty and risk, and thus contribute to sustainable fisheries. PMID:26430388

  9. Safe Maneuvering Envelope Estimation Based on a Physical Approach

    NASA Technical Reports Server (NTRS)

    Lombaerts, Thomas J. J.; Schuet, Stefan R.; Wheeler, Kevin R.; Acosta, Diana; Kaneshige, John T.

    2013-01-01

    This paper discusses a computationally efficient algorithm for estimating the safe maneuvering envelope of damaged aircraft. The algorithm performs a robust reachability analysis through an optimal control formulation while making use of time-scale separation and taking into account uncertainties in the aerodynamic derivatives. This approach differs from others in that it is physically inspired. The more transparent formulation allows the data to be interpreted at each step, and it is assumed that these physical models, based upon flight dynamics theory, will therefore facilitate certification for future real-life applications.

  10. Historicizing and Contextualizing Global Policy Discourses: Test- and Standards-Based Accountabilities in Education

    ERIC Educational Resources Information Center

    Lingard, Bob

    2013-01-01

    This paper in commenting on the contributions to this special number demonstrates the necessity of historicizing and contextualizing the rise of test- and standards-based modes of accountability in contemporary education policy globally. Both are imperative for understanding specific national manifestations of what has become a globalized…

  11. Is Comprehension Necessary for Error Detection? A Conflict-Based Account of Monitoring in Speech Production

    ERIC Educational Resources Information Center

    Nozari, Nazbanou; Dell, Gary S.; Schwartz, Myrna F.

    2011-01-01

    Despite the existence of speech errors, verbal communication is successful because speakers can detect (and correct) their errors. The standard theory of speech-error detection, the perceptual-loop account, posits that the comprehension system monitors production output for errors. Such a comprehension-based monitor, however, cannot explain the…

  12. Toward an Episodic Context Account of Retrieval-Based Learning: Dissociating Retrieval Practice and Elaboration

    ERIC Educational Resources Information Center

    Lehman, Melissa; Smith, Megan A.; Karpicke, Jeffrey D.

    2014-01-01

    We tested the predictions of 2 explanations for retrieval-based learning; while the elaborative retrieval hypothesis assumes that the retrieval of studied information promotes the generation of semantically related information, which aids in later retrieval (Carpenter, 2009), the episodic context account proposed by Karpicke, Lehman, and Aue (in…

  13. The Influence of Performance-Based Accountability on the Distribution of Teacher Salary Increases

    ERIC Educational Resources Information Center

    Bifulco, Robert

    2010-01-01

    This study examines how aspects of a district's institutional and policy environment influence the distribution of teacher salary increases. The primary hypothesis tested is that statewide performance-based accountability policies influence the extent to which districts backload teacher salary increases. I use data on teacher salaries from the…

  14. Outcome-Based Education and Student Learning in Managerial Accounting in Hong Kong

    ERIC Educational Resources Information Center

    Lui, Gladie; Shum, Connie

    2012-01-01

    Although Outcome-based Education (OBE) has not been successful in public education in several countries, it has been successful in the medical fields of higher education in the U.S. The author implemented OBE in her Managerial Accounting course in Hong Kong. Intended learning outcomes were mapped against Bloom's Cognitive Domain. Teaching and learning…

  15. Dynamic model of production enterprises based on accounting registers and its identification

    NASA Astrophysics Data System (ADS)

    Sirazetdinov, R. T.; Samodurov, A. V.; Yenikeev, I. A.; Markov, D. S.

    2016-06-01

    The report focuses on the mathematical modeling of economic entities based on accounting registers. We develop a dynamic model of the financial and economic activity of an enterprise as a system of differential equations, create algorithms for identifying the parameters of the dynamic model, and construct and identify the model for Russian machine-building enterprises.
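
    As a sketch of the identification step described above, the snippet below fits the parameters of a one-state discretized linear model to synthetic monthly register series. The model form and all figures are invented for illustration, since the record gives no equations.

```python
import numpy as np

# Synthetic "accounting register" series generated from a known one-state
# linear model, then re-identified (all figures illustrative).
a_true, b_true = 0.05, 0.8             # monthly growth rate; cost/revenue coupling
revenue = [100.0]
for _ in range(23):                    # 24 monthly closings
    revenue.append(revenue[-1] * (1 + a_true))
R = np.array(revenue)
C = b_true * R                         # costs tied to revenue

# Discretized model  R[t+1] - R[t] = a * R[t]  ->  least-squares estimate of a
dR = R[1:] - R[:-1]
a_hat = float(np.linalg.lstsq(R[:-1, None], dR, rcond=None)[0][0])
b_hat = float(np.linalg.lstsq(R[:, None], C, rcond=None)[0][0])
print(round(a_hat, 4), round(b_hat, 4))
```

    On noise-free data the least-squares step recovers the generating parameters exactly; with real register data one would add noise terms and more state variables.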

  16. Preferences for Team Learning and Lecture-Based Learning among First-Year Undergraduate Accounting Students

    ERIC Educational Resources Information Center

    Opdecam, Evelien; Everaert, Patricia; Van Keer, Hilde; Buysschaert, Fanny

    2014-01-01

    This study investigates students' "preference" for team learning and its effectiveness, compared to lecture-based learning. A quasi-experiment was set up in a financial accounting course in the first-year undergraduate of the Economics and Business Administration Program, where students had to choose between one of the two learning…

  17. The Social Organization of School Counseling in the Era of Standards-Based Accountability

    ERIC Educational Resources Information Center

    Dorsey, Alexander C.

    2011-01-01

    The reform policies of standards-based accountability, as outlined in NCLB, impede the functioning of school counseling programs and the delivery of services to students. Although recent studies have focused on the transformation of the school counseling profession, a gap exists in the literature with regard to how the experiences of school…

  18. Toward a Culture of Consequences: Performance-Based Accountability Systems for Public Services. Monograph

    ERIC Educational Resources Information Center

    Stecher, Brian M.; Camm, Frank; Damberg, Cheryl L.; Hamilton, Laura S.; Mullen, Kathleen J.; Nelson, Christopher; Sorensen, Paul; Wachs, Martin; Yoh, Allison; Zellman, Gail L.

    2010-01-01

    Performance-based accountability systems (PBASs), which link incentives to measured performance as a means of improving services to the public, have gained popularity. While PBASs can vary widely across sectors, they share three main components: goals, incentives, and measures. Research suggests that PBASs influence provider behaviors, but little…

  19. The Inequality Footprints of Nations: A Novel Approach to Quantitative Accounting of Income Inequality

    PubMed Central

    Alsamawi, Ali; Murray, Joy; Lenzen, Manfred; Moran, Daniel; Kanemoto, Keiichiro

    2014-01-01

    In this study we use economic input-output analysis to calculate the inequality footprint of nations. An inequality footprint shows the link that each country's domestic economic activity has to income distribution elsewhere in the world. To this end we use employment and household income accounts for 187 countries and a historical time series dating back to 1990. Our results show that in 2010, most developed countries had an inequality footprint that was higher than their within-country inequality, meaning that in order to support domestic lifestyles, these countries source imports from more unequal economies. Among the exceptions are the United States and the United Kingdom, whose inequality footprints place them on a par with many developing countries. Russia has high within-country inequality; nevertheless, it has the lowest inequality footprint in the world, because of its trade connections with the Commonwealth of Independent States and Europe. Our findings show that inequality-intensive commodities, such as electronic components, chemicals, fertilizers, minerals, and agricultural products, often originate in developing countries characterized by high levels of inequality. Consumption of these commodities may implicate within-country inequality in both developing and developed countries. PMID:25353333
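
    The input-output mechanics behind an "inequality footprint" can be sketched with a toy two-region Leontief model: the total-requirements matrix links final demand to the gross output it induces abroad, and an inequality intensity is then weighted by that output. The numbers and the simple output-weighting are illustrative only; the study's actual method covers 187 countries.

```python
import numpy as np

# Toy 2-region inter-regional input-output table (illustrative figures)
A = np.array([[0.10, 0.20],    # technical coefficients: inputs per unit output
              [0.15, 0.05]])
y = np.array([100.0, 50.0])    # final demand served from each region

gini = np.array([0.30, 0.55])  # within-region income inequality of producers

# Leontief total requirements: gross output x solves x = A x + y
L = np.linalg.inv(np.eye(2) - A)
x = L @ y                      # gross output induced in each region

# Output-weighted inequality embodied in the consumption bundle
footprint = float(gini @ x / x.sum())
print(round(footprint, 3))
```

    Because part of the demand is ultimately produced in the more unequal region, the footprint (about 0.39 here) lands between the two regions' own inequality levels.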

  20. An approach to the impact of nanoscale vat coloration of cotton on reducing agent account.

    PubMed

    Hakeim, O A; Nassar, S H; Raghab, A A; Abdou, L A W

    2013-02-15

    Aqueous dispersions of nanoscale vat dyes were successfully prepared through ball milling and ultrasonication of three test dyes in the presence of a dispersing agent. Critical factors, including the duration of ball milling and ultrasonication and the molecular structure of the vat dyes, have been studied. The dispersions were characterized morphologically, with particle size determination, and their quality was evaluated by shelf-life stability using digital images. The nanoscale vat dyes have been applied in the dyeing and printing of cotton to evaluate the effect of nanoscale dispersion on the reducing agent requirement and the difference in coloration performance between nanoscale and conventionally dispersed vat dyes. Results showed that use of sodium dodecyl sulfate (SDS) maintained a high stability of the dispersion during storage. The size and stability of the nanoscale dispersions were greatly influenced by the molecular structure of the vat dyes. Ultrasonication was helpful in decreasing the average particle size. Nanoscale vat dye dispersions gave a much higher color yield than conventional vat dyes, and washing fastness properties were excellent. It is clear that coloration using nanoscale vat dye dispersions offers a number of advantages in terms of reducing agent requirement, improved appearance, and environmental protection.

  1. The inequality footprints of nations: a novel approach to quantitative accounting of income inequality.

    PubMed

    Alsamawi, Ali; Murray, Joy; Lenzen, Manfred; Moran, Daniel; Kanemoto, Keiichiro

    2014-01-01

    In this study we use economic input-output analysis to calculate the inequality footprint of nations. An inequality footprint shows the link that each country's domestic economic activity has to income distribution elsewhere in the world. To this end we use employment and household income accounts for 187 countries and a historical time series dating back to 1990. Our results show that in 2010, most developed countries had an inequality footprint that was higher than their within-country inequality, meaning that in order to support domestic lifestyles, these countries source imports from more unequal economies. Among the exceptions are the United States and the United Kingdom, whose inequality footprints place them on a par with many developing countries. Russia has high within-country inequality; nevertheless, it has the lowest inequality footprint in the world, because of its trade connections with the Commonwealth of Independent States and Europe. Our findings show that inequality-intensive commodities, such as electronic components, chemicals, fertilizers, minerals, and agricultural products, often originate in developing countries characterized by high levels of inequality. Consumption of these commodities may implicate within-country inequality in both developing and developed countries.

  2. A simulation model of hospital management based on cost accounting analysis according to disease.

    PubMed

    Tanaka, Koji; Sato, Junzo; Guo, Jinqiu; Takada, Akira; Yoshihara, Hiroyuki

    2004-12-01

    Since shortly before 2000, hospital cost accounting has been increasingly performed at Japanese national university hospitals. At Kumamoto University Hospital, for instance, departmental costs have been analyzed since 2000, and since 2003 the cost balance has been obtained for certain diseases in preparation for the Diagnosis-Related Group/Prospective Payment System. On the basis of these experiences, we have constructed a simulation model of hospital management. The program has worked correctly in repeated trials and with satisfactory speed. Although there is room for improvement in the detailed accounts and the cost accounting engine, the basic model has proved satisfactory. We have constructed a hospital management model based on the financial data of an existing hospital, and we will later improve the program's structure and use a wider variety of hospital management data. A prospective outlook may then be obtained for the practical application of this hospital management model.
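
    A minimal sketch of cost accounting by disease group, in the spirit of this abstract: direct costs per group plus departmental overhead allocated by length of stay, compared against prospective payments. All figures and the allocation key are invented for illustration.

```python
# Toy cost balance by disease group: direct costs plus overhead allocated
# proportionally to length of stay (all figures illustrative)
cases = {
    "DRG_A": {"direct": 3200.0, "los_days": 8,  "payment": 5000.0},
    "DRG_B": {"direct": 1500.0, "los_days": 3,  "payment": 2100.0},
    "DRG_C": {"direct": 7800.0, "los_days": 15, "payment": 7500.0},
}
overhead = 5200.0  # departmental indirect costs to distribute

total_days = sum(c["los_days"] for c in cases.values())
balances = {}
for name, c in cases.items():
    alloc = overhead * c["los_days"] / total_days   # allocation key: bed-days
    balances[name] = c["payment"] - (c["direct"] + alloc)
    print(f"{name}: balance {balances[name]:+.0f}")
```

    A management simulation would iterate such balances over time under different case-mix and cost scenarios; here the point is only the allocation arithmetic.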

  3. Measure for Measure: How Proficiency-Based Accountability Systems Affect Inequality in Academic Achievement

    PubMed Central

    Jennings, Jennifer; Sohn, Heeju

    2016-01-01

    How do proficiency-based accountability systems affect inequality in academic achievement? This paper reconciles mixed findings in the literature by demonstrating that three factors jointly determine accountability's impact. First, by analyzing student-level data from a large urban school district, we find that when educators face accountability pressure, they focus attention on students closest to proficiency. We refer to this practice as educational triage, and show that the difficulty of the proficiency standard affects whether lower or higher performing students gain most on high-stakes tests used to evaluate schools. Less difficult proficiency standards decrease inequality in high-stakes achievement, while more difficult ones increase it. Second, we show that educators emphasize test-specific skills with students near proficiency, a practice that we refer to as instructional triage. As a result, the effects of accountability pressure differ across high and low-stakes tests; we find no effects on inequality in low-stakes reading and math tests of similar skills. Finally, we provide suggestive evidence that instructional triage is most pronounced in the lowest performing schools. We conclude by discussing how these findings shape our understanding of accountability's impacts on educational inequality. PMID:27122642

  4. Measure for Measure: How Proficiency-Based Accountability Systems Affect Inequality in Academic Achievement.

    PubMed

    Jennings, Jennifer; Sohn, Heeju

    2014-04-01

    How do proficiency-based accountability systems affect inequality in academic achievement? This paper reconciles mixed findings in the literature by demonstrating that three factors jointly determine accountability's impact. First, by analyzing student-level data from a large urban school district, we find that when educators face accountability pressure, they focus attention on students closest to proficiency. We refer to this practice as educational triage, and show that the difficulty of the proficiency standard affects whether lower or higher performing students gain most on high-stakes tests used to evaluate schools. Less difficult proficiency standards decrease inequality in high-stakes achievement, while more difficult ones increase it. Second, we show that educators emphasize test-specific skills with students near proficiency, a practice that we refer to as instructional triage. As a result, the effects of accountability pressure differ across high and low-stakes tests; we find no effects on inequality in low-stakes reading and math tests of similar skills. Finally, we provide suggestive evidence that instructional triage is most pronounced in the lowest performing schools. We conclude by discussing how these findings shape our understanding of accountability's impacts on educational inequality.

  5. Narrative accounts of tracking the rural domestic violence survivors' journey: a feminist approach.

    PubMed

    Davis, K; Taylor, B; Furniss, D

    2001-06-01

    This research represents the first stage of a project to determine the level of use and effectiveness of informal support networks utilised by Australian rural women. We used a feminist narrative approach with semistructured interviews and a convenience sample of 26 rural women. Only 9 out of 12 women's stories are presented. We found that poverty and geographical, social, and emotional isolation resulted in the privatisation of abuse. Women were triggered to leave the family home when their children, friends, or family became victims of the abuse. They planned their escape by telephone with support of friends and family. Although they used these informal supports, the participants paradoxically expected a high level of expertise in domestic violence knowledge and skills. We recommend an integrated multilevel model of support for rural women in violent intimate relationships and their informal supporters.

  6. Standards-Based Accountability as a Tool for Making a Difference in Student Learning. A State and an Institutional Perspective on Standards-Based Accountability.

    ERIC Educational Resources Information Center

    Wilkerson, Judy R.

    This paper examines Florida's standards-driven performance assessment, emphasizing teacher preparation, and touching on K-12 accountability. Florida's educational reform and accountability efforts are driven by the Florida System of School Improvement and Accountability document. The system is derived from state goals similar to the national Goals…

  7. The Tutor's Approach in Base Groups (PBL)

    ERIC Educational Resources Information Center

    Silen, Charlotte

    2006-01-01

    In this article, the concept of approach related to tutor functioning in problem-based learning (PBL) is explored and the significance of a phenomenological perspective of the body in relation to learning and tutoring is investigated. The aim has been to understand the concept of approach in a context where the individual, thoughts, emotions and…

  8. Computer-Based Training: An Institutional Approach.

    ERIC Educational Resources Information Center

    Barker, Philip; Manji, Karim

    1992-01-01

    Discussion of issues related to computer-assisted learning (CAL) and computer-based training (CBT) describes approaches to electronic learning; principles underlying courseware development to support these approaches; and a plan for creation of a CAL/CBT development center, including its functional role, campus services, staffing, and equipment…

  9. An Inquiry-Based Approach of Traditional "Step-by-Step" Experiments

    ERIC Educational Resources Information Center

    Szalay, L.; Tóth, Z.

    2016-01-01

    This is the start of a road map for the effective introduction of inquiry-based learning in chemistry. Advantages of inquiry-based approaches to the development of scientific literacy are widely discussed in the literature. However, unless chemistry educators take account of teachers' reservations and identified disadvantages, such approaches will…

  10. Comparative approaches to studying strategy: towards an evolutionary account of primate decision making.

    PubMed

    Brosnan, Sarah F; Beran, Michael J; Parrish, Audrey E; Price, Sara A; Wilson, Bart J

    2013-07-18

    How do primates, humans included, deal with novel problems that arise in interactions with other group members? Despite much research regarding how animals and humans solve social problems, few studies have utilized comparable procedures, outcomes, or measures across different species. Thus, it is difficult to piece together the evolution of decision making, including the roots from which human economic decision making emerged. Recently, a comparative body of decision making research has emerged, relying largely on the methodology of experimental economics in order to address these questions in a cross-species fashion. Experimental economics is an ideal method of inquiry for this approach. It is a well-developed method for distilling complex decision making involving multiple conspecifics whose decisions are contingent upon one another into a series of simple decision choices. This allows these decisions to be compared across species and contexts. In particular, our group has used this approach to investigate coordination in New World monkeys, Old World monkeys, and great apes (including humans), using identical methods. We find that in some cases there are remarkable continuities of outcome, as when some pairs in all species solved a coordination game, the Assurance game. On the other hand, we also find that these similarities in outcomes are likely driven by differences in underlying cognitive mechanisms. New World monkeys required exogenous information about their partners' choices in order to solve the task, indicating that they were using a matching strategy. Old World monkeys, on the other hand, solved the task without exogenous cues, leading to investigations into what mechanisms may be underpinning their responses (e.g., reward maximization, strategy formation, etc.). Great apes showed a strong experience effect, with cognitively enriched apes following what appears to be a strategy. Finally, humans were able to solve the task with or without exogenous cues

  11. A dynamic water accounting framework based on marginal resource opportunity cost

    NASA Astrophysics Data System (ADS)

    Tilmant, A.; Marques, G.; Mohamed, Y.

    2014-10-01

    Many river basins throughout the world are increasingly under pressure as water demands keep rising due to population growth, industrialization, urbanization and rising living standards. In the past, the typical answer to meet those demands focused on the supply side and involved the construction of hydraulic infrastructures to capture more water from surface water bodies and from aquifers. As river basins have become more and more developed, downstream water users and ecosystems have become increasingly dependent on the management actions taken by upstream users. The increased interconnectedness between water users, aquatic ecosystems and the built environment is further compounded by climate change and its impact on the water cycle. Those pressures mean that it has become increasingly important to measure and account for changes in water fluxes and their corresponding economic value as they progress throughout the river system. Such basin water accounting should provide policy makers with important information regarding the relative contribution of each water user, infrastructure and management decision to the overall economic value of the river basin. This paper presents a dynamic water accounting approach whereby the entire river basin is considered as a value chain with multiple services including production and storage. Water users and reservoir operators are considered as economic agents who can exchange water with their hydraulic neighbours at a price corresponding to the marginal value of water. Effective water accounting is made possible by keeping track of all water fluxes and their corresponding hypothetical transactions using the results of a hydro-economic model. The proposed approach is illustrated with the Eastern Nile River basin in Africa.

  12. A dynamic water accounting framework based on marginal resource opportunity cost

    NASA Astrophysics Data System (ADS)

    Tilmant, A.; Marques, G.; Mohamed, Y.

    2015-03-01

    Many river basins throughout the world are increasingly under pressure as water demands keep rising due to population growth, industrialization, urbanization and rising living standards. In the past, the typical answer to meet those demands focused on the supply side and involved the construction of hydraulic infrastructures to capture more water from surface water bodies and from aquifers. As river basins have become more and more developed, downstream water users and ecosystems have become increasingly dependent on the management actions taken by upstream users. The increased interconnectedness between water users, aquatic ecosystems and the built environment is further compounded by climate change and its impact on the water cycle. Those pressures mean that it has become increasingly important to measure and account for changes in water fluxes and their corresponding economic value as they progress throughout the river system. Such basin water accounting should provide policy makers with important information regarding the relative contribution of each water user, infrastructure and management decision to the overall economic value of the river basin. This paper presents a dynamic water accounting approach whereby the entire river basin is considered as a value chain with multiple services including production and storage. Water users and reservoir operators are considered as economic agents who can exchange water with their hydraulic neighbors at a price corresponding to the marginal value of water. Effective water accounting is made possible by keeping track of all water fluxes and their corresponding hypothetical transactions using the results of a hydro-economic model. The proposed approach is illustrated with the Eastern Nile River basin in Africa.
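
    The "marginal value of water" that drives this accounting framework can be illustrated with a toy allocation problem: when users have decreasing marginal benefits, the basin-wide marginal value is the benefit of the last unit allocated, a stand-in for the shadow prices a full hydro-economic model would produce. The users and benefit figures below are invented.

```python
# Marginal-value-of-water sketch: users with decreasing marginal benefits
# compete for a fixed supply of water units (all figures illustrative).
def best_value(supply, marginal_benefits):
    """Optimal total benefit: with decreasing marginal benefits, a greedy
    allocation of units to the highest remaining marginal benefit is optimal."""
    units = sorted((mb for user in marginal_benefits for mb in user),
                   reverse=True)
    return sum(units[:supply])

users = [[9, 7, 4, 2],   # $/unit for successive units, user 1
         [8, 5, 3, 1]]   # $/unit for successive units, user 2

for supply in (3, 5, 7):
    mv = best_value(supply, users) - best_value(supply - 1, users)
    print(f"supply={supply}: marginal value = {mv}")
```

    As supply grows the marginal value falls (7, then 4, then 2 dollars per unit here), which is exactly the scarcity signal the accounting framework attaches to each exchange between hydraulic neighbours.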

  13. A Principles-Based Approach to Teaching International Financial Reporting Standards (IFRS)

    ERIC Educational Resources Information Center

    Persons, Obeua

    2014-01-01

    This article discusses the principles-based approach that emphasizes a "why" question by using the International Accounting Standards Board (IASB) "Conceptual Framework for Financial Reporting" to question and understand the basis for specific differences between IFRS and U.S. generally accepted accounting principles (U.S.…

  14. Toward a Culture of Consequences: Performance-Based Accountability Systems for Public Services.

    PubMed

    Stecher, Brian M; Camm, Frank; Damberg, Cheryl L; Hamilton, Laura S; Mullen, Kathleen J; Nelson, Christopher; Sorensen, Paul; Wachs, Martin; Yoh, Allison; Zellman, Gail L; Leuschner, Kristin J

    2012-01-01

    Performance-based accountability systems (PBASs), which link incentives to measured performance as a means of improving services to the public, have gained popularity. While PBASs can vary widely across sectors, they share three main components: goals, incentives, and measures. Research suggests that PBASs influence provider behaviors, but little is known about PBAS effectiveness at achieving performance goals or about government and agency experiences. This study examines nine PBASs that are drawn from five sectors: child care, education, health care, public health emergency preparedness, and transportation. In the right circumstances, a PBAS can be an effective strategy for improving service delivery. Optimum circumstances include having a widely shared goal, unambiguous observable measures, meaningful incentives for those with control over the relevant inputs and processes, few competing interests, and adequate resources to design, implement, and operate the PBAS. However, these conditions are rarely fully realized, so it is difficult to design and implement PBASs that are uniformly effective. PBASs represent a promising policy option for improving the quality of service-delivery activities in many contexts. The evidence supports continued experimentation with and adoption of this approach in appropriate circumstances. Even so, PBAS design and its prospects for success depend on the context in which it will operate. Also, ongoing system evaluation and monitoring are integral components of a PBAS; they inform refinements that improve system functioning over time. Empirical evidence of the effects of performance-based public management is scarce. This article also describes a framework used to evaluate a PBAS. Such a system identifies individuals or organizations that must change their behavior for the performance of an activity to improve, chooses an implicit or explicit incentive structure to motivate these organizations or individuals to change, and then

  15. Spatial pattern of nitrogen deposition flux over Czech forests: a novel approach accounting for unmeasured nitrogen species

    NASA Astrophysics Data System (ADS)

    Hůnová, Iva; Stoklasová, Petra; Kurfürst, Pavel; Vlček, Ondřej; Schovánková, Jana; Stráník, Vojtěch

    2015-04-01

    atmospheric nitrogen deposition flux over the Czech forests, collating all available data and model results. The aim of the presented study is to provide an improved, more reliable and more realistic estimate of the spatial pattern of nitrogen deposition flux over one country. Such estimates have so far typically been based on measurements of ambient N/NOx concentrations as a dry deposition proxy, and of N/NH4+ and N/NO3- as wet deposition proxies. To estimate the unmeasured species contributing to dry deposition, we used the Eulerian photochemical dispersion model CAMx, the Comprehensive Air Quality Model with extensions (ESSS, 2011), coupled with the high-resolution regional numerical weather prediction model Aladin (Vlček, Corbet, 2011). The contribution of fog was estimated using a geostatistical, data-driven model. Final maps accounting for unmeasured species clearly indicate that the approach used so far results in a substantial underestimation of nitrogen deposition flux. Substituting modeled values for unmeasured nitrogen species appears to be a plausible way to approximate total nitrogen deposition and to obtain a more realistic spatial pattern as input for further studies of likely nitrogen impacts on ecosystems. Acknowledgements: We would like to acknowledge the grants GA14-12262S - Effects of changing growth conditions on tree increment, stand production and vitality - danger or opportunity for the Central-European forestry?, and NAZV QI112A168 (ForSoil) of the Czech Ministry for Agriculture for support of this contribution. The input data used for the analysis were provided by the Czech Hydrometeorological Institute. References: Bobbink, R., Hicks, K., Galloway, J., Spranger, T., Alkemade, R. et al. (2010): Global Assessment of Nitrogen Deposition Effects on Terrestrial Plant Diversity: a Synthesis. Ecological Applications 20 (1), 30-59. Fowler, D., O'Donoghue, M., Muller, J.B.A., et al. (2005): A chronology of nitrogen deposition in the UK between 1900 and 2000. Water, Air & Soil Pollution: Focus

  16. Accounting Technology Associate Degree. Louisiana Technical Education Program and Course Standards. Competency-Based Postsecondary Curriculum Outline from Bulletin 1822.

    ERIC Educational Resources Information Center

    Louisiana State Dept. of Education, Baton Rouge. Div. of Vocational Education.

    This document outlines the curriculum of Louisiana's accounting technology associate degree program, which is a 6-term (77-credit hour) competency-based program designed to prepare students for employment as accounting technicians providing technical administrative support to professional accountants and other financial management personnel.…

  17. High Performance Controllers Based on Real Parameters to Account for Parameter Variations due to Iron Saturation

    DTIC Science & Technology

    2013-08-01

    DATES COVERED: 10-04-2013 to 15-07-2013. TITLE AND SUBTITLE: High Performance Controllers Based on Real Parameters to Account for Parameter Variations due to Iron Saturation. [Equations (3)-(5) are garbled in this extraction.] Unlike in the ideal model of equations (3) and (4), the flux linkages become coupled to both axis currents; this coupling effect can be modeled as follows

  18. Burned area, active fires and biomass burning - approaches to account for emissions from fires in Tanzania

    NASA Astrophysics Data System (ADS)

    Ruecker, Gernot; Hoffmann, Anja; Leimbach, David; Tiemann, Joachim; Ng'atigwa, Charles

    2013-04-01

    Eleven years of data from the globally available MODIS Burned Area and MODIS Active Fire products have been analysed for Tanzania in conjunction with GIS data on land use and cover to provide a baseline for fire activity in this East African country. The total fire radiative energy (FRE) emitted by fires that were picked up by the burned area and active fire products is estimated based on a spatio-temporal clustering algorithm over the burned areas, and integration of the fire radiative power from the MODIS Active Fires product over the time of burning and the area of each burned area cluster. The resulting biomass combusted per unit area, based on Wooster's scaling factor for converting FRE to biomass combusted, is compared to values found in the literature and to values found in the Global Fire Emissions Database (GFED). Pyrogenic emissions are then estimated using emission factors. According to our analysis, an average of 11 million ha burn annually (ranging between 8.5 and 12.9 million ha) in Tanzania, corresponding to between 10 and 14 % of Tanzania's land area. Most burned area is recorded in the months from May to October. The land cover types most affected are woodland and shrubland: they comprise almost 70 % of Tanzania's average annual burned area, or 6.8 million ha. Most burning occurs in gazetted land, with an annual average of 3.7 million ha in forest reserves, 3.3 million ha in game reserves and 1.46 million ha in national parks, totalling close to 8.5 million ha or 77 % of the annual average burned area of Tanzania. Annual variability of burned area is moderate for most of the analysed classes, and in most cases no clear trend can be detected in burned area, except for the Lindi region, where annual burned area appears to be increasing. Preliminary results regarding emissions from fires show that for larger fires that burn over a longer time, biomass burned derived through the FRP method compares well to literature values, while the integration over
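The FRE-to-emissions chain the abstract describes can be sketched as follows. The 0.368 kg/MJ biomass scaling factor is Wooster et al.'s (2005) published estimate; the FRP series, overpass times and CO emission factor are illustrative assumptions, not values from this study.

```python
# Sketch of the FRE -> biomass -> emissions chain (illustrative inputs).

def fre_mj(frp_mw, timestamps_s):
    """Integrate fire radiative power (MW) over time (s), trapezoid rule -> FRE in MJ."""
    total = 0.0
    for i in range(1, len(frp_mw)):
        dt = timestamps_s[i] - timestamps_s[i - 1]
        total += 0.5 * (frp_mw[i] + frp_mw[i - 1]) * dt  # MW * s = MJ
    return total

def biomass_burned_kg(fre, scaling_kg_per_mj=0.368):
    """Wooster et al. (2005) factor: ~0.368 kg dry matter combusted per MJ of FRE."""
    return fre * scaling_kg_per_mj

def emissions_kg(biomass_kg, emission_factor_g_per_kg):
    """Emission factors are conventionally given in g of species per kg dry matter."""
    return biomass_kg * emission_factor_g_per_kg / 1000.0

frp = [0.0, 50.0, 80.0, 30.0, 0.0]           # MW at successive overpasses (invented)
t = [0.0, 3600.0, 7200.0, 10800.0, 14400.0]  # seconds

fre = fre_mj(frp, t)
biomass = biomass_burned_kg(fre)
co_emitted = emissions_kg(biomass, 65.0)     # ~65 g CO / kg dry matter, savanna-like
```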

  19. A cloud model-based approach for water quality assessment.

    PubMed

    Wang, Dong; Liu, Dengfeng; Ding, Hao; Singh, Vijay P; Wang, Yuankun; Zeng, Xiankui; Wu, Jichun; Wang, Lachun

    2016-07-01

    Water quality assessment is essentially a multi-criteria decision-making process that must account for qualitative and quantitative uncertainties and their transformation. Considering the uncertainties of randomness and fuzziness in water quality evaluation, a cloud model-based assessment approach is proposed. The cognitive cloud model, derived from information science, can realize the transformation between a qualitative concept and quantitative data, based on probability and statistics and fuzzy set theory. When applying the cloud model to practical assessment, three technical issues are addressed in developing a complete cloud model-based approach: (1) a bilateral boundary formula with nonlinear boundary regression for parameter estimation, (2) a hybrid entropy-analytic hierarchy process technique for calculating weights, and (3) the mean of repeated simulations for determining the degree of final certainty. The cloud model-based approach is tested by evaluating the eutrophication status of 12 typical lakes and reservoirs in China and comparing it with four other methods: the Scoring Index method, the Variable Fuzzy Sets method, the Hybrid Fuzzy and Optimal model, and the Neural Networks method. The proposed approach yields membership information for each water quality status, which leads to the final status. The approach is found to be representative of the alternative methods and accurate.
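A minimal sketch of the final-certainty step (the mean of repeated simulations with a normal cloud) might look like the following; the status clouds (Ex, En, He) and the chlorophyll-a reading are invented for illustration and are not the paper's calibrated parameters.

```python
# Membership of a reading in a status cloud (Ex, En, He): for each simulation,
# draw a perturbed entropy En' ~ N(En, He) and evaluate a Gaussian certainty,
# then average over repetitions (the paper's third technical point).
import math
import random

def certainty(x, ex, en, he, n_sim=2000, rng=None):
    rng = rng or random.Random(0)  # fixed seed for reproducibility of the sketch
    total = 0.0
    for _ in range(n_sim):
        en_prime = abs(rng.gauss(en, he)) or 1e-12  # hyper-entropy perturbs entropy
        total += math.exp(-(x - ex) ** 2 / (2.0 * en_prime ** 2))
    return total / n_sim

# Illustrative eutrophication-status clouds for a chlorophyll-a indicator
clouds = {"mesotrophic": (4.0, 1.5, 0.1), "eutrophic": (10.0, 3.0, 0.2)}
reading = 9.0
memberships = {s: certainty(reading, *p) for s, p in clouds.items()}
status = max(memberships, key=memberships.get)  # status with highest membership
```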

  20. A spatial simulation approach to account for protein structure when identifying non-random somatic mutations

    PubMed Central

    2014-01-01

    Background Current research suggests that a small set of “driver” mutations are responsible for tumorigenesis while a larger body of “passenger” mutations occur in the tumor but do not progress the disease. Due to recent pharmacological successes in treating cancers caused by driver mutations, a variety of methodologies that attempt to identify such mutations have been developed. Based on the hypothesis that driver mutations tend to cluster in key regions of the protein, the development of cluster identification algorithms has become critical. Results We have developed a novel methodology, SpacePAC (Spatial Protein Amino acid Clustering), that identifies mutational clustering by considering the protein tertiary structure directly in 3D space. By combining the mutational data in the Catalogue of Somatic Mutations in Cancer (COSMIC) and the spatial information in the Protein Data Bank (PDB), SpacePAC is able to identify novel mutation clusters in many proteins such as FGFR3 and CHRM2. In addition, SpacePAC is better able to localize the most significant mutational hotspots as demonstrated in the cases of BRAF and ALK. The R package is available on Bioconductor at: http://www.bioconductor.org/packages/release/bioc/html/SpacePAC.html. Conclusion SpacePAC adds a valuable tool to the identification of mutational clusters while considering protein tertiary structure. PMID:24990767
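The core idea behind spatial mutation clustering can be illustrated as below: count mutated residues falling inside a sphere in 3D, which a method like SpacePAC then compares against a simulated null distribution. Coordinates and mutation counts are invented; this is not the SpacePAC algorithm itself.

```python
# Toy 3D clustering step: find residues within a sphere around a candidate
# hotspot and sum their observed mutation counts (illustrative data only).
import math

def residues_within(center, residues, radius):
    """Return ids of residues whose C-alpha coordinates lie within radius of center."""
    hits = []
    for rid, coord in residues.items():
        if math.dist(center, coord) <= radius:
            hits.append(rid)
    return sorted(hits)

# Invented C-alpha coordinates (angstroms) and mutation counts per residue
coords = {1: (0.0, 0.0, 0.0), 2: (1.0, 1.0, 0.0), 3: (8.0, 0.0, 0.0)}
mutations = {1: 5, 2: 3, 3: 1}

cluster = residues_within(coords[1], coords, radius=3.0)
cluster_mutations = sum(mutations[r] for r in cluster)
```

A significance test would then compare `cluster_mutations` against counts obtained by repeatedly scattering the same number of mutations uniformly over the structure.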

  1. A soil moisture accounting-procedure with a Richards' equation-based soil texture-dependent parameterization

    NASA Astrophysics Data System (ADS)

    Mathias, Simon A.; Skaggs, Todd H.; Quinn, Simon A.; Egan, Sorcha N. C.; Finch, Lucy E.; Oldham, Corinne D.

    2015-01-01

    Given a time series of potential evapotranspiration and rainfall data, there are at least two approaches for estimating vertical percolation rates. One approach involves solving Richards' equation (RE) with a plant uptake model. An alternative approach involves applying a simple soil moisture accounting procedure (SMAP) based on a set of conceptual stores and conditional statements. It is often desirable to parameterize distributed vertical percolation models using regional soil texture maps. This can be achieved using pedotransfer functions when applying RE. However, robust soil texture-based parameterizations for simpler SMAPs have not previously been available. This article presents a new SMAP designed to emulate the response of a one-dimensional homogeneous RE model. Model parameters for 231 different soil textures are obtained by calibrating the SMAP model to 20-year time series from equivalent RE model simulations. The results are then validated by comparison to an additional 13 years of simulated RE model data. The resulting work provides a new, simple, two-parameter (% sand and % silt) SMAP that yields vertical percolation estimates consistent with RE-based models. Results from the 231 numerical simulations are also found to be qualitatively consistent with intuitive ideas concerning soil texture and soil moisture dynamics. Vertical percolation rates are found to be highest in sandy soils. Sandy soils are found to provide less water for evapotranspiration. Surface runoff is found to be more important in soils with high clay content.
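A single-store SMAP of the kind described (a conceptual store plus conditional statements) can be sketched as follows; the store capacity and forcing values are illustrative, not the calibrated soil-texture parameters from the article.

```python
# Minimal one-bucket soil moisture accounting procedure (all quantities in mm):
# rainfall fills the store, evapotranspiration empties it, and any water above
# capacity leaves as vertical percolation.

def smap_step(storage, rain, pet, capacity):
    """One daily time step; returns (new storage, actual ET, percolation)."""
    storage += rain
    aet = min(pet, storage)                    # actual ET limited by stored water
    storage -= aet
    percolation = max(0.0, storage - capacity) # overflow above the store capacity
    storage -= percolation
    return storage, aet, percolation

storage = 50.0
series = [(10.0, 3.0), (0.0, 4.0), (80.0, 2.0)]  # (rain, PET) per day, illustrative
totals = {"aet": 0.0, "perc": 0.0}
for rain, pet in series:
    storage, aet, perc = smap_step(storage, rain, pet, capacity=100.0)
    totals["aet"] += aet
    totals["perc"] += perc
```

In the article's setup, parameters like the store capacity would be functions of % sand and % silt, calibrated against RE simulations.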

  2. The Symbol Grounding Problem Revisited: A Thorough Evaluation of the ANS Mapping Account and the Proposal of an Alternative Account Based on Symbol–Symbol Associations

    PubMed Central

    Reynvoet, Bert; Sasanguie, Delphine

    2016-01-01

    Recently, many studies in the domain of numerical cognition have been published demonstrating a robust association between numerical symbol processing and individual differences in mathematics achievement. Because numerical symbols are so important for mathematics achievement, many researchers want to answer the ‘symbol grounding problem,’ i.e., how does a symbol acquire its numerical meaning? The most popular account, the approximate number system (ANS) mapping account, assumes that a symbol acquires its numerical meaning by being mapped onto the non-verbal ANS. Here, we critically evaluate four arguments that are supposed to support this account, i.e., (1) there is an evolutionary system for approximate number processing, (2) non-symbolic and symbolic number processing show the same behavioral effects, (3) non-symbolic and symbolic numbers activate the same brain regions, which are also involved in more advanced calculation, and (4) non-symbolic comparison is related to performance on symbolic mathematics achievement tasks. Based on this evaluation, we conclude that all of these arguments, and consequently also the mapping account, are questionable. Next, we explore a less popular alternative, in which small numerical symbols are initially mapped onto a precise representation and then, in combination with increasing knowledge of the counting list, result in an independent and exact symbolic system based on order relations between symbols. We evaluate this account by reviewing evidence on order judgment tasks along the same four arguments. Although further research is necessary, the available evidence so far suggests that this symbol–symbol association account should be considered a worthy alternative account of how symbols acquire their meaning. PMID:27790179

  3. Generation of SEEAW asset accounts based on water resources management models

    NASA Astrophysics Data System (ADS)

    Pedro-Monzonís, María; Solera, Abel; Andreu, Joaquín

    2015-04-01

    One of the main challenges in the 21st century relates to the sustainable use of water, since water is an essential element for the life of all who inhabit our planet. In many cases, the lack of economic valuation of water resources leads to inefficient water use. In this regard, society expects policymakers and stakeholders to maximise the profit produced per unit of natural resources. Water planning and Integrated Water Resources Management (IWRM) represent the best way to achieve this goal. The System of Environmental-Economic Accounting for Water (SEEAW) is presented as a tool for water allocation which enables the building of water balances in a river basin. The main concern of the SEEAW is to provide a standard approach which allows policymakers to compare results between different territories. But building water accounts is a complex task due to the difficulty of collecting the required data. Because the components of the hydrological cycle are difficult to gauge, simulation models have become essential tools, extensively employed in recent decades. The target of this paper is to present the building of a database that enables the combined use of hydrological models and water resources models developed with the AQUATOOL DSS to fill in the SEEAW tables. This research is framed within the Water Accounting in a Multi-Catchment District (WAMCD) project, financed by the European Union. Its main goal is the development of water accounts in the Mediterranean Andalusian River Basin District, in Spain. This research aims to contribute to the objectives of the "Blueprint to Safeguard Europe's Water Resources". It is noteworthy that, in Spain, a large part of these methodological decisions is included in the Spanish Guideline of Water Planning, with normative status guaranteeing consistency and comparability of the results.

  4. Approaches to lunar base life support

    NASA Technical Reports Server (NTRS)

    Brown, M. F.; Edeen, M. A.

    1990-01-01

    Various approaches to reliable, low maintenance, low resupply regenerative long-term life support for lunar base application are discussed. The first approach utilizes Space Station Freedom physiochemical systems technology which has closed air and water loops with approximately 99 and 90 percent closure respectively, with minor subsystem changes to the SSF baseline improving the level of water resupply for the water loop. A second approach would be a physiochemical system, including a solid waste processing system and improved air and water loop closure, which would require only food and nitrogen for resupply. A hybrid biological/physiochemical life support system constitutes the third alternative, incorporating some level of food production via plant growth into the life support system. The approaches are described in terms of mass, power, and resupply requirements; and the potential evolution of a small, initial outpost to a large, self-sustaining base is discussed.

  5. Reexamining the language account of cross-national differences in base-10 number representations.

    PubMed

    Vasilyeva, Marina; Laski, Elida V; Ermakova, Anna; Lai, Weng-Feng; Jeong, Yoonkyung; Hachigian, Amy

    2015-01-01

    East Asian students consistently outperform students from other nations in mathematics. One explanation for this advantage is a language account; East Asian languages, unlike most Western languages, provide cues about the base-10 structure of multi-digit numbers, facilitating the development of base-10 number representations. To test this view, the current study examined how kindergartners represented two-digit numbers using single unit-blocks and ten-blocks. The participants (N=272) were from four language groups (Korean, Mandarin, English, and Russian) that vary in the extent of "transparency" of the base-10 structure. In contrast to previous findings with older children, kindergartners showed no cross-language variability in the frequency of producing base-10 representations. Furthermore, they showed a pattern of within-language variability that was not consistent with the language account and was likely attributable to experiential factors. These findings suggest that language might not play as critical a role in the development of base-10 representations as suggested in earlier research.

  6. Salience and Attention in Surprisal-Based Accounts of Language Processing

    PubMed Central

    Zarcone, Alessandra; van Schijndel, Marten; Vogels, Jorrig; Demberg, Vera

    2016-01-01

    The notion of salience has been singled out as the explanatory factor for a diverse range of linguistic phenomena. In particular, perceptual salience (e.g., visual salience of objects in the world, acoustic prominence of linguistic sounds) and semantic-pragmatic salience (e.g., prominence of recently mentioned or topical referents) have been shown to influence language comprehension and production. A different line of research has sought to account for behavioral correlates of cognitive load during comprehension as well as for certain patterns in language usage using information-theoretic notions, such as surprisal. Surprisal and salience both affect language processing at different levels, but the relationship between the two has not been adequately elucidated, and the question of whether salience can be reduced to surprisal / predictability is still open. Our review identifies two main challenges in addressing this question: terminological inconsistency and lack of integration between high and low levels of representations in salience-based accounts and surprisal-based accounts. We capitalize upon work in visual cognition in order to orient ourselves in surveying the different facets of the notion of salience in linguistics and their relation with models of surprisal. We find that work on salience highlights aspects of linguistic communication that models of surprisal tend to overlook, namely the role of attention and relevance to current goals, and we argue that the Predictive Coding framework provides a unified view which can account for the role played by attention and predictability at different levels of processing and which can clarify the interplay between low and high levels of processes and between predictability-driven expectation and attention-driven focus. PMID:27375525

  7. Automatic indexing of scanned documents: a layout-based approach

    NASA Astrophysics Data System (ADS)

    Esser, Daniel; Schuster, Daniel; Muthmann, Klemens; Berger, Michael; Schill, Alexander

    2012-01-01

    Archiving official written documents such as invoices, reminders and account statements in business and private area gets more and more important. Creating appropriate index entries for document archives like sender's name, creation date or document number is a tedious manual work. We present a novel approach to handle automatic indexing of documents based on generic positional extraction of index terms. For this purpose we apply the knowledge of document templates stored in a common full text search index to find index positions that were successfully extracted in the past.
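The positional-extraction idea can be sketched as follows, assuming a hypothetical template store that maps field names to bounding boxes learned from previously indexed documents of the same sender; field names and coordinates are invented.

```python
# Layout-based index extraction sketch: reuse the page position where a field
# was successfully extracted on a matching template in the past.

def extract_by_position(words, template_boxes):
    """words: list of (text, x, y) OCR tokens; template_boxes: field -> (x0, y0, x1, y1)."""
    result = {}
    for field, (x0, y0, x1, y1) in template_boxes.items():
        for text, x, y in words:
            if x0 <= x <= x1 and y0 <= y <= y1:
                result[field] = text  # take the first token inside the learned box
                break
    return result

# Boxes remembered for this sender's template (invented coordinates)
template = {"invoice_no": (400, 40, 560, 60), "date": (400, 70, 560, 90)}
# OCR tokens from the new scanned document: (text, x, y)
ocr_words = [("INV-2012-077", 410, 50), ("2012-03-01", 420, 80), ("ACME", 30, 20)]

fields = extract_by_position(ocr_words, template)
```

In the approach described, the matching template itself would first be retrieved from a full-text search index over previously indexed documents.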

  8. Deciding Who Decides Questions at the Intersection of School Finance Reform Litigation and Standards-Based Accountability Policies

    ERIC Educational Resources Information Center

    Superfine, Benjamin Michael

    2009-01-01

    Courts hearing school finance reform cases have recently begun to consider several issues related to standards-based accountability policies. This convergence of school finance reform litigation and standards-based accountability policies represents a chance for the courts to reallocate decision-making authority for each type of reform across the…

  9. Computer-based accountability system (Phase I) for special nuclear materials at Argonne-West

    SciTech Connect

    Ingermanson, R.S.; Proctor, A.E.

    1982-05-01

    An automated accountability system for special nuclear materials (SNM) is under development at Argonne National Laboratory-West. Phase I of the development effort has established the following basic features of the system: a unique file organization allows rapid updating or retrieval of the status of various SNM, based on batch numbers, storage location, serial number, or other attributes. Access to the program is controlled by an interactive user interface that can be easily understood by operators who have had no prior background in electronic data processing. Extensive use of structured programming techniques makes the software package easy to understand and to modify for specific applications. All routines are written in FORTRAN.
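The attribute-keyed retrieval described above can be sketched in modern terms as follows (here in Python, whereas the original system was written in FORTRAN); the record fields and values are invented for illustration.

```python
# Sketch of an SNM ledger with per-attribute indexes so that records can be
# looked up quickly by batch number, storage location, or serial number.

class SNMLedger:
    def __init__(self):
        self.records = []
        self.indexes = {"batch": {}, "location": {}, "serial": {}}

    def add(self, record):
        self.records.append(record)
        for key, index in self.indexes.items():
            index.setdefault(record[key], []).append(record)  # index by attribute

    def find(self, key, value):
        """Return all records whose attribute `key` equals `value`."""
        return self.indexes[key].get(value, [])

ledger = SNMLedger()
ledger.add({"batch": "B-101", "location": "vault-3", "serial": "S-9", "grams_pu": 12.5})
ledger.add({"batch": "B-101", "location": "vault-7", "serial": "S-10", "grams_pu": 8.0})

in_batch = ledger.find("batch", "B-101")
total_grams = sum(r["grams_pu"] for r in in_batch)  # material balance for the batch
```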

  10. Physics-based approach to haptic display

    NASA Technical Reports Server (NTRS)

    Brown, J. Michael; Colgate, J. Edward

    1994-01-01

    This paper addresses the implementation of complex multiple degree of freedom virtual environments for haptic display. We suggest that a physics based approach to rigid body simulation is appropriate for hand tool simulation, but that currently available simulation techniques are not sufficient to guarantee successful implementation. We discuss the desirable features of a virtual environment simulation, specifically highlighting the importance of stability guarantees.

  11. Effectiveness and Accountability of the Inquiry-Based Methodology in Middle School Science

    ERIC Educational Resources Information Center

    Hardin, Cade

    2009-01-01

    When teaching science, the time allowed for students to make discoveries on their own through the inquiry method directly conflicts with the mandated targets of a broad spectrum of curricula. Research shows that using an inquiry-based approach can encourage student motivation and increase academic achievement (Wolf & Fraser, 2008, Bryant, 2006,…

  12. Place-Based Pedagogy in the Era of Accountability: An Action Research Study

    ERIC Educational Resources Information Center

    Saracino, Peter C.

    2010-01-01

    Today's most common method of teaching biology--driven by calls for standardization and high-stakes testing--relies on a standards-based, de-contextualized approach to education. This results in "one size fits all" curriculums that ignore local contexts relevant to students' lives, discourage student engagement and ultimately work against a deep…

  13. Water accounting for stressed river basins based on water resources management models.

    PubMed

    Pedro-Monzonís, María; Solera, Abel; Ferrer, Javier; Andreu, Joaquín; Estrela, Teodoro

    2016-09-15

    Water planning and Integrated Water Resources Management (IWRM) represent the best way to help decision makers identify and choose the most adequate alternatives among the possible ones. The System of Environmental-Economic Accounting for Water (SEEA-W) is presented as a tool for building water balances in a river basin, providing a standard approach to achieve comparability of results between different territories. The target of this paper is to present the building of a tool that enables the combined use of hydrological models and water resources models to fill in the SEEA-W tables. At every step of the modelling chain, we are able to build the asset accounts and the physical water supply and use tables according to the SEEA-W approach, along with an estimation of water services costs. The case study is the Jucar River Basin District (RBD), located in the eastern part of the Iberian Peninsula in Spain, which, like many other Mediterranean basins, is currently water-stressed. To guide this work we have used the PATRICAL model in combination with the AQUATOOL Decision Support System (DSS). The results indicate that for the average year the total use of water in the district amounts to 15,143 hm³/year, while the total renewable water resources amount to 3909 hm³/year. On the other hand, the water service costs in the Jucar RBD amount to 1634 million € per year at constant 2012 prices. It is noteworthy that 9% of these costs correspond to non-conventional resources, such as desalinated water, reused water and water transferred from other regions.

  14. Accountability in Dispositions for Juvenile Drug Offenders. Monograph.

    ERIC Educational Resources Information Center

    Pacific Inst. for Research and Evaluation, Walnut Creek, CA.

    Guidelines for the general development and implementation of accountability-based approaches for juvenile drug offenders are presented in this monograph. These topics are discussed: (1) the accountability approach; (2) the relevance of the accountability approach to drug offenders and its relationship to drug abuse treatment; (3) surveys of chief…

  15. Advanced Approach of Multiagent Based Buoy Communication

    PubMed Central

    Gricius, Gediminas; Drungilas, Darius; Andziulis, Arunas; Dzemydiene, Dale; Voznak, Miroslav; Kurmis, Mindaugas; Jakovlev, Sergej

    2015-01-01

    Usually, a hydrometeorological information system is faced with great data flows, but the data levels are often excessive, depending on the observed region of the water. The paper presents advanced buoy communication technologies based on multiagent interaction and data exchange between several monitoring system nodes. The proposed management of buoy communication is based on a clustering algorithm, which enables the performance of the hydrometeorological information system to be enhanced. The experiment is based on the design and analysis of the inexpensive but reliable Baltic Sea autonomous monitoring network (buoys), which would be able to continuously monitor and collect temperature, waviness, and other required data. The proposed approach of multiagent based buoy communication enables all the data from the coastal-based station to be monitored with limited transition speed by setting different tasks for the agent-based buoy system according to the clustering information. PMID:26345197

  16. Facial Translocation Approach to the Cranial Base

    PubMed Central

    Arriaga, Moises A.; Janecka, Ivo P.

    1991-01-01

    Surgical exposure of the nasopharyngeal region of the cranial base is difficult because of its proximity to key anatomic structures. Our laboratory study outlines the anatomic basis for a new approach to this complex topography. Dissections were performed on eight cadaver halves and two fresh specimens injected with intravascular silicone rubber compound. By utilizing facial soft tissue translocation combined with craniofacial osteotomies, a wide surgical field can be obtained at the skull base. The accessible surgical field extends from the contralateral eustachian tube to the ipsilateral geniculate ganglion, including the nasopharynx, clivus, sphenoid and cavernous sinuses, the entire infratemporal fossa, and the superior orbital fissure. The facial translocation approach offers previously unavailable wide and direct exposure, with a potential for immediate reconstruction, of this complex region of the cranial base. PMID:17170817

  17. A network approach based on cliques

    NASA Astrophysics Data System (ADS)

    Fadigas, I. S.; Pereira, H. B. B.

    2013-05-01

    The characterization of complex networks is a procedure that is currently found in several research studies. Nevertheless, few studies present a discussion on networks in which the basic element is a clique. In this paper, we propose an approach based on a network of cliques. This approach consists not only of a set of new indices to capture the properties of a network of cliques but also of a method to characterize complex networks of cliques (i.e., some of the parameters are proposed to characterize the small-world phenomenon in networks of cliques). The results obtained are consistent with results from classical methods used to characterize complex networks.
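The basic construction of a network of cliques (cliques become nodes, linked when they share vertices) can be sketched as follows; the indices proposed in the paper go well beyond this toy example.

```python
# Toy network-of-cliques construction: two cliques are adjacent when they
# share at least one vertex of the underlying graph.
from itertools import combinations

def clique_network(cliques):
    """cliques: list of vertex sets; returns edges between clique indices."""
    edges = set()
    for i, j in combinations(range(len(cliques)), 2):
        if cliques[i] & cliques[j]:  # non-empty vertex overlap -> link cliques
            edges.add((i, j))
    return edges

cliques = [{"a", "b", "c"}, {"c", "d"}, {"e", "f"}]
edges = clique_network(cliques)
degree = {i: sum(1 for e in edges if i in e) for i in range(len(cliques))}
```

Indices such as those proposed in the paper (e.g., for small-world behavior) would then be computed on this clique-level graph rather than on the original vertex graph.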

  18. Is comprehension necessary for error detection? A conflict-based account of monitoring in speech production

    PubMed Central

    Nozari, Nazbanou; Dell, Gary S.; Schwartz, Myrna F.

    2011-01-01

    Despite the existence of speech errors, verbal communication is successful because speakers can detect (and correct) their errors. The standard theory of speech-error detection, the perceptual-loop account, posits that the comprehension system monitors production output for errors. Such a comprehension-based monitor, however, cannot explain the double dissociation between comprehension and error-detection ability observed in aphasic patients. We propose a new theory of speech-error detection that is instead based on the production process itself. The theory borrows from studies of forced-choice-response tasks the notion that error detection is accomplished by monitoring response conflict via a frontal brain structure, such as the anterior cingulate cortex. We adapt this idea to the two-step model of word production, and test the model-derived predictions on a sample of aphasic patients. Our results show a strong correlation between patients’ error-detection ability and the model’s characterization of their production skills, and no significant correlation between error detection and comprehension measures, thus supporting a production-based monitor generally, and the implemented conflict-based monitor in particular. The successful application of the conflict-based theory to error detection in linguistic as well as non-linguistic domains points to a domain-general monitoring system. PMID:21652015

  19. Place-based pedagogy in the era of accountability: An action research study

    NASA Astrophysics Data System (ADS)

    Saracino, Peter C.

    Today's most common method of teaching biology---driven by calls for standardization and high-stakes testing---relies on a standards-based, de-contextualized approach to education. This results in "one size fits all" curriculums that ignore local contexts relevant to students' lives, discourage student engagement and ultimately work against a deep and lasting understanding of content. In contrast, place-based education---a pedagogical paradigm grounded in situated cognition and the progressive education tradition of John Dewey---utilizes the community as an integrating context for learning. It encourages the growth of school-community partnerships with an eye towards raising student achievement while also drawing students into the economic, political, social and ecological life of their communities. Such an approach seeks to provide students with learning experiences that are both academically significant and valuable to their communities. This study explores how high school science teachers can capitalize on the rich affordances offered by a place-based approach despite the constraints imposed by a state-mandated curriculum and high-stakes testing. Using action research, I designed, implemented, evaluated and refined an intervention that grounded a portion of a Living Environment high school course I teach in a place-based experience. This experience served as a unique anchoring event to contextualize students' learning of other required core topics. The overarching question framing this study is: How can science teachers capitalize on the rich affordances offered by a place-based approach despite the constraints imposed by a state-mandated curriculum and high-stakes testing? The following more specific questions were explored within the context of the intervention: (1) Which elements of the place-based paradigm could I effectively integrate into a Living Environment course? (2) In what ways would this integration impact students' interest? 
(3) In what ways would…

  20. Systems Engineering Interfaces: A Model Based Approach

    NASA Technical Reports Server (NTRS)

    Fosse, Elyse; Delp, Christopher

    2013-01-01

    Currently: Ops Rev has developed and maintains a framework that includes interface-specific language, patterns, and Viewpoints. Ops Rev implements the framework to design MOS 2.0 and its 5 Mission Services. Implementation de-couples interfaces and instances of interaction. Future: A Mission MOSE implements the approach and uses the model-based artifacts for reviews. The framework extends further into the ground data layers and provides a unified methodology.

  1. A Commentary on "Updating the Duplex Design for Test-Based Accountability in the Twenty-First Century"

    ERIC Educational Resources Information Center

    Brandt, Steffen

    2010-01-01

    This article presents the author's commentary on "Updating the Duplex Design for Test-Based Accountability in the Twenty-First Century," in which Isaac I. Bejar and E. Aurora Graf propose applying a test design--the duplex design, proposed in 1988 by Bock and Mislevy--to current accountability assessments.…

  2. A genome-wide approach accounting for body mass index identifies genetic variants influencing fasting glycemic traits and insulin resistance.

    PubMed

    Manning, Alisa K; Hivert, Marie-France; Scott, Robert A; Grimsby, Jonna L; Bouatia-Naji, Nabila; Chen, Han; Rybin, Denis; Liu, Ching-Ti; Bielak, Lawrence F; Prokopenko, Inga; Amin, Najaf; Barnes, Daniel; Cadby, Gemma; Hottenga, Jouke-Jan; Ingelsson, Erik; Jackson, Anne U; Johnson, Toby; Kanoni, Stavroula; Ladenvall, Claes; Lagou, Vasiliki; Lahti, Jari; Lecoeur, Cecile; Liu, Yongmei; Martinez-Larrad, Maria Teresa; Montasser, May E; Navarro, Pau; Perry, John R B; Rasmussen-Torvik, Laura J; Salo, Perttu; Sattar, Naveed; Shungin, Dmitry; Strawbridge, Rona J; Tanaka, Toshiko; van Duijn, Cornelia M; An, Ping; de Andrade, Mariza; Andrews, Jeanette S; Aspelund, Thor; Atalay, Mustafa; Aulchenko, Yurii; Balkau, Beverley; Bandinelli, Stefania; Beckmann, Jacques S; Beilby, John P; Bellis, Claire; Bergman, Richard N; Blangero, John; Boban, Mladen; Boehnke, Michael; Boerwinkle, Eric; Bonnycastle, Lori L; Boomsma, Dorret I; Borecki, Ingrid B; Böttcher, Yvonne; Bouchard, Claude; Brunner, Eric; Budimir, Danijela; Campbell, Harry; Carlson, Olga; Chines, Peter S; Clarke, Robert; Collins, Francis S; Corbatón-Anchuelo, Arturo; Couper, David; de Faire, Ulf; Dedoussis, George V; Deloukas, Panos; Dimitriou, Maria; Egan, Josephine M; Eiriksdottir, Gudny; Erdos, Michael R; Eriksson, Johan G; Eury, Elodie; Ferrucci, Luigi; Ford, Ian; Forouhi, Nita G; Fox, Caroline S; Franzosi, Maria Grazia; Franks, Paul W; Frayling, Timothy M; Froguel, Philippe; Galan, Pilar; de Geus, Eco; Gigante, Bruna; Glazer, Nicole L; Goel, Anuj; Groop, Leif; Gudnason, Vilmundur; Hallmans, Göran; Hamsten, Anders; Hansson, Ola; Harris, Tamara B; Hayward, Caroline; Heath, Simon; Hercberg, Serge; Hicks, Andrew A; Hingorani, Aroon; Hofman, Albert; Hui, Jennie; Hung, Joseph; Jarvelin, Marjo-Riitta; Jhun, Min A; Johnson, Paul C D; Jukema, J Wouter; Jula, Antti; Kao, W H; Kaprio, Jaakko; Kardia, Sharon L R; Keinanen-Kiukaanniemi, Sirkka; Kivimaki, Mika; Kolcic, Ivana; Kovacs, Peter; Kumari, Meena; Kuusisto, Johanna; Kyvik, 
Kirsten Ohm; Laakso, Markku; Lakka, Timo; Lannfelt, Lars; Lathrop, G Mark; Launer, Lenore J; Leander, Karin; Li, Guo; Lind, Lars; Lindstrom, Jaana; Lobbens, Stéphane; Loos, Ruth J F; Luan, Jian'an; Lyssenko, Valeriya; Mägi, Reedik; Magnusson, Patrik K E; Marmot, Michael; Meneton, Pierre; Mohlke, Karen L; Mooser, Vincent; Morken, Mario A; Miljkovic, Iva; Narisu, Narisu; O'Connell, Jeff; Ong, Ken K; Oostra, Ben A; Palmer, Lyle J; Palotie, Aarno; Pankow, James S; Peden, John F; Pedersen, Nancy L; Pehlic, Marina; Peltonen, Leena; Penninx, Brenda; Pericic, Marijana; Perola, Markus; Perusse, Louis; Peyser, Patricia A; Polasek, Ozren; Pramstaller, Peter P; Province, Michael A; Räikkönen, Katri; Rauramaa, Rainer; Rehnberg, Emil; Rice, Ken; Rotter, Jerome I; Rudan, Igor; Ruokonen, Aimo; Saaristo, Timo; Sabater-Lleal, Maria; Salomaa, Veikko; Savage, David B; Saxena, Richa; Schwarz, Peter; Seedorf, Udo; Sennblad, Bengt; Serrano-Rios, Manuel; Shuldiner, Alan R; Sijbrands, Eric J G; Siscovick, David S; Smit, Johannes H; Small, Kerrin S; Smith, Nicholas L; Smith, Albert Vernon; Stančáková, Alena; Stirrups, Kathleen; Stumvoll, Michael; Sun, Yan V; Swift, Amy J; Tönjes, Anke; Tuomilehto, Jaakko; Trompet, Stella; Uitterlinden, Andre G; Uusitupa, Matti; Vikström, Max; Vitart, Veronique; Vohl, Marie-Claude; Voight, Benjamin F; Vollenweider, Peter; Waeber, Gerard; Waterworth, Dawn M; Watkins, Hugh; Wheeler, Eleanor; Widen, Elisabeth; Wild, Sarah H; Willems, Sara M; Willemsen, Gonneke; Wilson, James F; Witteman, Jacqueline C M; Wright, Alan F; Yaghootkar, Hanieh; Zelenika, Diana; Zemunik, Tatijana; Zgaga, Lina; Wareham, Nicholas J; McCarthy, Mark I; Barroso, Ines; Watanabe, Richard M; Florez, Jose C; Dupuis, Josée; Meigs, James B; Langenberg, Claudia

    2012-05-13

    Recent genome-wide association studies have described many loci implicated in type 2 diabetes (T2D) pathophysiology and β-cell dysfunction but have contributed little to the understanding of the genetic basis of insulin resistance. We hypothesized that genes implicated in insulin resistance pathways might be uncovered by accounting for differences in body mass index (BMI) and potential interactions between BMI and genetic variants. We applied a joint meta-analysis approach to test associations with fasting insulin and glucose on a genome-wide scale. We present six previously unknown loci associated with fasting insulin at P < 5 × 10(-8) in combined discovery and follow-up analyses of 52 studies comprising up to 96,496 non-diabetic individuals. Risk variants were associated with higher triglyceride and lower high-density lipoprotein (HDL) cholesterol levels, suggesting a role for these loci in insulin resistance pathways. The discovery of these loci will aid further characterization of the role of insulin resistance in T2D pathophysiology.

  3. A genome-wide approach accounting for body mass index identifies genetic variants influencing fasting glycemic traits and insulin resistance

    PubMed Central

    Manning, Alisa K.; Hivert, Marie-France; Scott, Robert A.; Grimsby, Jonna L.; Bouatia-Naji, Nabila; Chen, Han; Rybin, Denis; Liu, Ching-Ti; Bielak, Lawrence F.; Prokopenko, Inga; Amin, Najaf; Barnes, Daniel; Cadby, Gemma; Hottenga, Jouke-Jan; Ingelsson, Erik; Jackson, Anne U.; Johnson, Toby; Kanoni, Stavroula; Ladenvall, Claes; Lagou, Vasiliki; Lahti, Jari; Lecoeur, Cecile; Liu, Yongmei; Martinez-Larrad, Maria Teresa; Montasser, May E.; Navarro, Pau; Perry, John R. B.; Rasmussen-Torvik, Laura J.; Salo, Perttu; Sattar, Naveed; Shungin, Dmitry; Strawbridge, Rona J.; Tanaka, Toshiko; van Duijn, Cornelia M.; An, Ping; de Andrade, Mariza; Andrews, Jeanette S.; Aspelund, Thor; Atalay, Mustafa; Aulchenko, Yurii; Balkau, Beverley; Bandinelli, Stefania; Beckmann, Jacques S.; Beilby, John P.; Bellis, Claire; Bergman, Richard N.; Blangero, John; Boban, Mladen; Boehnke, Michael; Boerwinkle, Eric; Bonnycastle, Lori L.; Boomsma, Dorret I.; Borecki, Ingrid B.; Böttcher, Yvonne; Bouchard, Claude; Brunner, Eric; Budimir, Danijela; Campbell, Harry; Carlson, Olga; Chines, Peter S.; Clarke, Robert; Collins, Francis S.; Corbatón-Anchuelo, Arturo; Couper, David; de Faire, Ulf; Dedoussis, George V; Deloukas, Panos; Dimitriou, Maria; Egan, Josephine M; Eiriksdottir, Gudny; Erdos, Michael R.; Eriksson, Johan G.; Eury, Elodie; Ferrucci, Luigi; Ford, Ian; Forouhi, Nita G.; Fox, Caroline S; Franzosi, Maria Grazia; Franks, Paul W; Frayling, Timothy M; Froguel, Philippe; Galan, Pilar; de Geus, Eco; Gigante, Bruna; Glazer, Nicole L.; Goel, Anuj; Groop, Leif; Gudnason, Vilmundur; Hallmans, Göran; Hamsten, Anders; Hansson, Ola; Harris, Tamara B.; Hayward, Caroline; Heath, Simon; Hercberg, Serge; Hicks, Andrew A.; Hingorani, Aroon; Hofman, Albert; Hui, Jennie; Hung, Joseph; Jarvelin, Marjo Riitta; Jhun, Min A.; Johnson, Paul C.D.; Jukema, J Wouter; Jula, Antti; Kao, W.H.; Kaprio, Jaakko; Kardia, Sharon L. 
R.; Keinanen-Kiukaanniemi, Sirkka; Kivimaki, Mika; Kolcic, Ivana; Kovacs, Peter; Kumari, Meena; Kuusisto, Johanna; Kyvik, Kirsten Ohm; Laakso, Markku; Lakka, Timo; Lannfelt, Lars; Lathrop, G Mark; Launer, Lenore J.; Leander, Karin; Li, Guo; Lind, Lars; Lindstrom, Jaana; Lobbens, Stéphane; Loos, Ruth J. F.; Luan, Jian’an; Lyssenko, Valeriya; Mägi, Reedik; Magnusson, Patrik K. E.; Marmot, Michael; Meneton, Pierre; Mohlke, Karen L.; Mooser, Vincent; Morken, Mario A.; Miljkovic, Iva; Narisu, Narisu; O’Connell, Jeff; Ong, Ken K.; Oostra, Ben A.; Palmer, Lyle J.; Palotie, Aarno; Pankow, James S.; Peden, John F.; Pedersen, Nancy L.; Pehlic, Marina; Peltonen, Leena; Penninx, Brenda; Pericic, Marijana; Perola, Markus; Perusse, Louis; Peyser, Patricia A; Polasek, Ozren; Pramstaller, Peter P.; Province, Michael A.; Räikkönen, Katri; Rauramaa, Rainer; Rehnberg, Emil; Rice, Ken; Rotter, Jerome I.; Rudan, Igor; Ruokonen, Aimo; Saaristo, Timo; Sabater-Lleal, Maria; Salomaa, Veikko; Savage, David B.; Saxena, Richa; Schwarz, Peter; Seedorf, Udo; Sennblad, Bengt; Serrano-Rios, Manuel; Shuldiner, Alan R.; Sijbrands, Eric J.G.; Siscovick, David S.; Smit, Johannes H.; Small, Kerrin S.; Smith, Nicholas L.; Smith, Albert Vernon; Stančáková, Alena; Stirrups, Kathleen; Stumvoll, Michael; Sun, Yan V.; Swift, Amy J.; Tönjes, Anke; Tuomilehto, Jaakko; Trompet, Stella; Uitterlinden, Andre G.; Uusitupa, Matti; Vikström, Max; Vitart, Veronique; Vohl, Marie-Claude; Voight, Benjamin F.; Vollenweider, Peter; Waeber, Gerard; Waterworth, Dawn M; Watkins, Hugh; Wheeler, Eleanor; Widen, Elisabeth; Wild, Sarah H.; Willems, Sara M.; Willemsen, Gonneke; Wilson, James F.; Witteman, Jacqueline C.M.; Wright, Alan F.; Yaghootkar, Hanieh; Zelenika, Diana; Zemunik, Tatijana; Zgaga, Lina; Wareham, Nicholas J.; McCarthy, Mark I.; Barroso, Ines; Watanabe, Richard M.; Florez, Jose C.; Dupuis, Josée; Meigs, James B.; Langenberg, Claudia

    2013-01-01

    Recent genome-wide association studies have described many loci implicated in type 2 diabetes (T2D) pathophysiology and beta-cell dysfunction, but have contributed little to our understanding of the genetic basis of insulin resistance. We hypothesized that genes implicated in insulin resistance pathways may be uncovered by accounting for differences in body mass index (BMI) and potential interaction between BMI and genetic variants. We applied a novel joint meta-analytical approach to test associations with fasting insulin (FI) and glucose (FG) on a genome-wide scale. We present six previously unknown FI loci at P < 5×10−8 in combined discovery and follow-up analyses of 52 studies comprising up to 96,496 non-diabetic individuals. Risk variants were associated with higher triglyceride and lower HDL cholesterol levels, suggestive of a role for these FI loci in insulin resistance pathways. The localization of these additional loci will aid further characterization of the role of insulin resistance in T2D pathophysiology. PMID:22581228
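    The core of the joint approach can be sketched as a 2-degree-of-freedom Wald test that combines a variant's main effect with its BMI interaction effect. This is a minimal illustration with hypothetical summary statistics, assuming zero covariance between the two estimates for simplicity; the published method handles correlated estimates and meta-analysis across studies:

    ```python
    import math

    # Hypothetical per-variant summary statistics: main-effect and
    # SNP-by-BMI interaction estimates with their standard errors.
    beta_main, se_main = 0.045, 0.010
    beta_int,  se_int  = 0.020, 0.012

    # 2-df Wald statistic for the joint null H0: beta_main = 0 AND beta_int = 0
    # (assuming the two estimates are uncorrelated).
    chi2 = (beta_main / se_main) ** 2 + (beta_int / se_int) ** 2

    # The survival function of a chi-square with 2 df has the closed form exp(-x/2),
    # so no statistics library is needed for the p-value.
    p_value = math.exp(-chi2 / 2)

    print(f"chi2 = {chi2:.3f}, p = {p_value:.3g}")
    ```

    A variant with a modest main effect and a modest interaction effect can reach joint significance even when neither 1-df test would, which is what allows the approach to uncover insulin-resistance loci missed by standard scans.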

  4. Analysis of the Second E-Forum on Competency-Based Approaches

    ERIC Educational Resources Information Center

    Perez, Leticia

    2007-01-01

    As its title suggests, this is an account of "the Second E-Forum on Competency-based Approaches" and summarises the opinions and experiences expressed by the participants. The forum was based on a discussion paper prepared by the Canadian Observatory of Educational Reforms and a series of questions raised by Philippe Jonnaert were used to…

  5. Keeping Accountability Systems Accountable

    ERIC Educational Resources Information Center

    Foote, Martha

    2007-01-01

    The standards and accountability movement in education has undeniably transformed schooling throughout the United States. Even before President Bush signed the No Child Left Behind (NCLB) Act into law in January 2002, mandating annual public school testing in English and math for grades 3-8 and once in high school, most states had already…

  6. What is narrative therapy and what is it not?: the usefulness of Q methodology to explore accounts of White and Epston's (1990) approach to narrative therapy.

    PubMed

    Wallis, Jennifer; Burns, Jan; Capdevila, Rose

    2011-01-01

    OBJECTIVE. 'What is narrative therapy and how do you do it?' is a question that is repeatedly asked of narrative therapy, with little consistent response. This study aimed to explore and distil out the 'common themes' of practitioner definitions of White and Epston's approach to narrative therapy. DESIGN. This was an Internet-based study involving current UK practitioners of this type of narrative therapy using a unique combination of a Delphi Panel and Q methodology. METHOD. A group of experienced practitioners were recruited into the Delphi Poll and were asked two questions about what narrative therapy is and is not, and what techniques are and are not employed. These data combined with other information formed the statements of a Q-sort that was then administered to a wider range of narrative practitioners. FINDINGS. The Delphi Panel agreed on a number of key points relating to the theory, politics and practice of narrative therapy. The Q-sort produced eight distinct accounts of narrative therapy and a number of dimensions along which these different positions could be distinguished. These included narrative therapy as a political stance and integration with other approaches. CONCLUSIONS. For any therapeutic model to demonstrate its efficacy and attract proponents, an accepted definition of its components and practice should preferably be established. This study has provided some data for the UK application of White and Epston's narrative therapy, which may then assist in forming a firmer base for further research and practice.

  7. 26 CFR 1.801-8 - Contracts with reserves based on segregated asset accounts.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... general asset accounts and realized short-term capital gains of $12,000 attributable to its segregated asset accounts. For the taxable year 1962, the excess of the net short-term capital gain ($10,000+$12... attributable to its segregated asset accounts. For the taxable year 1962, the excess of the net...

  8. Patterns of Learning in the Accountancy Profession under an Output-Based Continuing Professional Development Scheme

    ERIC Educational Resources Information Center

    Lindsay, Hilary

    2012-01-01

    Since 2004, professional accountancy bodies in membership of the International Federation of Accountants (IFAC) have been required to adopt mandatory continuing professional development (CPD) schemes. This research explores the learning activities of members of the Institute of Chartered Accountants in England and Wales (ICAEW) which introduced an…

  9. How States Can Hold Schools Accountable: The Strong Schools Model of Standards-Based Reform.

    ERIC Educational Resources Information Center

    Brooks, Sarah R.

    Few states have overcome the political and practical obstacles to implementing a clear, feasible, comprehensive accountability system. The University of Washington's Center on Reinventing Public Education reviewed experiences from state accountability efforts. Workable accountability systems focus on results, clarify goals and roles, and…

  10. Risk-Based Probabilistic Approach to Aeropropulsion System Assessment

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2002-01-01

    In an era of shrinking development budgets and resources, where there is also an emphasis on reducing the product development cycle, the role of system assessment, performed in the early stages of an engine development program, becomes very critical to the successful development of new aeropropulsion systems. A reliable system assessment not only helps to identify the best propulsion system concept among several candidates, it can also identify which technologies are worth pursuing. This is particularly important for advanced aeropropulsion technology development programs, which require an enormous amount of resources. In the current practice of deterministic, or point-design, approaches, the uncertainties of design variables are either unaccounted for or accounted for by safety factors. This could often result in an assessment with unknown and unquantifiable reliability. Consequently, it would fail to provide additional insight into the risks associated with the new technologies, which are often needed by decision makers to determine the feasibility and return-on-investment of a new aircraft engine. In this work, an alternative approach based on the probabilistic method was described for a comprehensive assessment of an aeropropulsion system. The statistical approach quantifies the design uncertainties inherent in a new aeropropulsion system and their influences on engine performance. Because of this, it enhances the reliability of a system assessment. A technical assessment of a wave-rotor-enhanced gas turbine engine was performed to demonstrate the methodology. The assessment used probability distributions to account for the uncertainties that occur in component efficiencies and flows and in mechanical design variables. 
The approach taken in this effort was to integrate the thermodynamic cycle analysis embedded in the computer code NEPP (NASA Engine Performance Program) and the engine weight analysis embedded in the computer code WATE (Weight Analysis of Turbine
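    The contrast between point-design and probabilistic assessment can be shown with a minimal Monte Carlo sketch that propagates component-efficiency uncertainty to a performance metric. The `sfc` model, the distributions, and the requirement threshold below are hypothetical stand-ins for the NEPP/WATE analyses described above:

    ```python
    import random
    import statistics

    random.seed(42)

    def sfc(eta_c, eta_t):
        # Hypothetical performance model: specific fuel consumption rises as
        # component efficiencies fall (illustrative only, not the NEPP cycle code).
        return 1.0 / (eta_c * eta_t)

    # A point-design assessment would use eta_c = 0.88 and eta_t = 0.90 exactly;
    # the probabilistic approach samples them from assumed distributions instead.
    samples = [
        sfc(random.gauss(0.88, 0.01), random.gauss(0.90, 0.015))
        for _ in range(100_000)
    ]

    mean = statistics.mean(samples)
    sd = statistics.stdev(samples)
    # A risk metric: probability that SFC misses a (hypothetical) requirement.
    p_exceed = sum(s > 1.30 for s in samples) / len(samples)
    print(f"mean={mean:.4f}  sd={sd:.4f}  P(SFC>1.30)={p_exceed:.3f}")
    ```

    The point-design answer would be the single number 1/(0.88 × 0.90) ≈ 1.263; the probabilistic assessment additionally quantifies the spread and the probability of missing the requirement, which is the risk information decision makers need.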

  11. Matched filter based iterative adaptive approach

    NASA Astrophysics Data System (ADS)

    Nepal, Ramesh; Zhang, Yan Rockee; Li, Zhengzheng; Blake, William

    2016-05-01

    Matched-filter sidelobes arising from diversified LPI waveform design, and limited sensor resolution, are two important considerations in radars and active sensors in general. Matched-filter sidelobes can potentially mask weaker targets, and low sensor resolution not only causes a high margin of error but also limits sensing in target-rich environments. Both factors depend in part on the transmitted waveform and, consequently, on the pulse compression technique, so an adaptive pulse compression algorithm that can mitigate these limitations is desired. A new Matched Filter based Iterative Adaptive Approach (MF-IAA), an extension of the traditional Iterative Adaptive Approach (IAA), has been developed. MF-IAA takes the matched-filter output as its input; the motivation is to facilitate implementation of the Iterative Adaptive Approach without disrupting the processing chain of the traditional matched filter. Like IAA, MF-IAA is a user-parameter-free, iterative, weighted-least-squares spectral identification algorithm. This work focuses on the implementation of MF-IAA. Its feasibility is studied using a realistic airborne radar simulator as well as actual measured airborne radar data, and its performance is measured with different test waveforms and different signal-to-noise ratio (SNR) levels. In addition, range-Doppler super-resolution using MF-IAA is investigated. Sidelobe reduction as well as super-resolution enhancement is validated, and the robustness of MF-IAA with respect to different LPI waveforms and SNR levels is demonstrated.
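    The sidelobe problem that motivates MF-IAA is visible in the plain matched filter itself. The sketch below computes the matched-filter (pulse-compression) output for a Barker-13 code in pure Python; it shows only the matched-filter baseline whose sidelobes MF-IAA is designed to suppress further, not the MF-IAA algorithm:

    ```python
    # Barker-13 code: the classic binary phase code with minimal autocorrelation
    # sidelobes (peak sidelobe level of 1/13, about -22.3 dB).
    barker13 = [1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1]

    def matched_filter(rx, code):
        # Cross-correlate the received samples with the transmitted code,
        # over all lags (full correlation, length len(rx) + len(code) - 1).
        n, m = len(rx), len(code)
        return [
            sum(rx[k + i] * code[i] for i in range(m) if 0 <= k + i < n)
            for k in range(-(m - 1), n)
        ]

    # Noise-free single echo: the matched-filter output is the code's
    # aperiodic autocorrelation.
    out = matched_filter(barker13, barker13)
    peak = max(out)
    max_sidelobe = max(abs(v) for v in out if v != peak)
    print(peak, max_sidelobe)  # 13 1
    ```

    Even for this optimal binary code, the sidelobes sit only ~22 dB below the mainlobe, so a strong target can mask a weak one at a nearby range gate; iterative adaptive methods such as MF-IAA aim to push these sidelobes down further without user-tuned parameters.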

  12. An APEL Tool Based CPU Usage Accounting Infrastructure for Large Scale Computing Grids

    NASA Astrophysics Data System (ADS)

    Jiang, Ming; Novales, Cristina Del Cano; Mathieu, Gilles; Casson, John; Rogers, William; Gordon, John

    APEL (Accounting Processor for Event Logs) is the fundamental tool of the CPU usage accounting infrastructure deployed within the WLCG and EGEE Grids. In these Grids, jobs are submitted by users to computing resources via a Grid Resource Broker (e.g. the gLite Workload Management System). As a log processing tool, APEL interprets Grid gatekeeper logs (e.g. globus) and batch system logs (e.g. PBS, LSF, SGE and Condor) to produce CPU job accounting records identified with Grid identities. These records provide a complete description of the usage of computing resources by users' jobs. APEL publishes accounting records into an accounting record repository at a Grid Operations Centre (GOC) for access from a GUI web tool. The functions of log file parsing, record generation and publication are implemented by the APEL Parser, APEL Core, and APEL Publisher components respectively. Within the distributed accounting infrastructure, accounting records are transported from APEL Publishers at Grid sites to either a regionalised accounting system or the central one, by choice, via a common ActiveMQ message broker network. This provides an open transport layer through which other accounting systems can publish relevant accounting data to a central accounting repository via the unified interface provided by an APEL Publisher, and gives regional/National Grid Initiative (NGI) Grids flexibility in their choice of accounting system. The robust and secure delivery of accounting record messages at the NGI level, and between NGI accounting instances and the central one, is achieved by using configurable APEL Publishers and an ActiveMQ message broker network.

  13. Object recognition approach based on feature fusion

    NASA Astrophysics Data System (ADS)

    Wang, Runsheng

    2001-09-01

    Multi-sensor information fusion plays an important role in object recognition and many other application fields. Fusion performance depends strongly on the fusion level selected and the approach used. Of the three main fusion levels, feature-level fusion is promising but difficult. Two schemes are developed in this paper for key issues in feature-level fusion. For feature selection, a method is developed that analyzes the mutual relationships among the available features and uses this analysis to rank them. For object recognition, a multi-level recognition scheme is developed whose procedure can be controlled and updated by analyzing intermediate decision results, in order to achieve a reliable final result. The new approach is applied to recognizing work-piece objects of twelve classes in optical images, and open-country objects of four classes based on infrared image sequences and MMW radar. Experimental results are satisfactory.

  14. A statistical model-based technique for accounting for prostate gland deformation in endorectal coil-based MR imaging.

    PubMed

    Tahmasebi, Amir M; Sharifi, Reza; Agarwal, Harsh K; Turkbey, Baris; Bernardo, Marcelino; Choyke, Peter; Pinto, Peter; Wood, Bradford; Kruecker, Jochen

    2012-01-01

    In prostate brachytherapy procedures, combining high-resolution endorectal coil (ERC) MRI with computed tomography (CT) images has been shown to improve diagnostic specificity for malignant tumors. Despite this advantage, fusion of the two imaging modalities is complicated by deformation of the prostate in ERC-MRI. Conventionally, nonlinear deformable registration techniques have been used to account for this deformation. In this work, we present a model-based technique for accounting for the deformation of the prostate gland in ERC-MR imaging, in which a unique deformation vector is estimated for every point within the prostate gland. Modes of deformation for every point in the prostate are statistically identified using an MR-based training set (with and without ERC). Deformation of the prostate from a deformed state (ERC-MRI) to a non-deformed state in a different modality (CT) is then realized by first calculating partial deformation information for a limited number of points (such as surface points or anatomical landmarks) and then using the deformation calculated from this subset of points to determine the coefficient values for the modes of deformation provided by the statistical deformation model. Using leave-one-out cross-validation, our results demonstrated a mean estimation error of 1 mm for an MR-to-MR registration.

  15. LP based approach to optimal stable matchings

    SciTech Connect

    Teo, Chung-Piaw; Sethuraman, J.

    1997-06-01

    We study the classical stable marriage and stable roommates problems using a polyhedral approach. We propose a new LP formulation for the stable roommates problem. This formulation is non-empty if and only if the underlying roommates problem has a stable matching. Furthermore, for certain special weight functions on the edges, we construct a 2-approximation algorithm for the optimal stable roommates problem. Our technique uses a crucial geometry of the fractional solutions in this formulation. For the stable marriage problem, we show that a related geometry allows us to express any fractional solution in the stable marriage polytope as convex combination of stable marriage solutions. This leads to a genuinely simple proof of the integrality of the stable marriage polytope. Based on these ideas, we devise a heuristic to solve the optimal stable roommates problem. The heuristic combines the power of rounding and cutting-plane methods. We present some computational results based on preliminary implementations of this heuristic.
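    As background for the stable marriage problem the polytope describes, the classical Gale-Shapley proposal algorithm finds a (proposer-optimal) stable matching combinatorially. This sketch shows that baseline algorithm, not the LP/polyhedral machinery of the paper; the preference lists are hypothetical:

    ```python
    def gale_shapley(men_prefs, women_prefs):
        """Proposer-optimal stable matching via the classical Gale-Shapley algorithm."""
        free = list(men_prefs)                      # men not yet matched
        next_choice = {m: 0 for m in men_prefs}     # index of next woman to propose to
        rank = {w: {m: i for i, m in enumerate(p)}  # each woman's ranking of men
                for w, p in women_prefs.items()}
        engaged = {}                                # woman -> man
        while free:
            m = free.pop()
            w = men_prefs[m][next_choice[m]]
            next_choice[m] += 1
            if w not in engaged:
                engaged[w] = m
            elif rank[w][m] < rank[w][engaged[w]]:  # w prefers m to her current partner
                free.append(engaged[w])
                engaged[w] = m
            else:
                free.append(m)                      # w rejects m; he proposes again later
        return {m: w for w, m in engaged.items()}

    men = {"A": ["x", "y"], "B": ["y", "x"]}
    women = {"x": ["B", "A"], "y": ["A", "B"]}
    print(sorted(gale_shapley(men, women).items()))  # [('A', 'x'), ('B', 'y')]
    ```

    Gale-Shapley proves a stable matching always exists for the marriage problem; the roommates problem has no such guarantee, which is why the paper's LP feasibility characterization is the interesting step there.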

  16. An Ontology Based Approach to Information Security

    NASA Astrophysics Data System (ADS)

    Pereira, Teresa; Santos, Henrique

    The semantic structuring of knowledge based on ontology approaches has been increasingly adopted by experts from diverse domains. Recently, ontologies have moved from the philosophical and metaphysics disciplines to being used in the construction of models that describe a specific theory of a domain. The development and use of ontologies promote the creation of a unique standard for representing concepts within a specific knowledge domain. In the scope of information security systems, the use of an ontology to formalize and represent security concepts challenges the mechanisms and techniques currently used. This paper presents a conceptual implementation model of an ontology defined for the security domain. The model contains semantic concepts based on the information security standard ISO/IEC_JTC1, and their relationships to other concepts defined in a subset of the information security domain.

  17. Lunar base CELSS: A bioregenerative approach

    NASA Technical Reports Server (NTRS)

    Easterwood, G. W.; Street, J. J.; Sartain, J. B.; Hubbell, D. H.; Robitaille, H. A.

    1992-01-01

    During the twenty-first century, human habitation of a self-sustaining lunar base could become a reality. To achieve this goal, the occupants will need food, water, and an adequate atmosphere within a carefully designed environment. Advanced technology will be employed to support terrestrial life-sustaining processes on the Moon. One approach to a life support system based on food production, waste management and utilization, and product synthesis is outlined. Inputs include an atmosphere, water, plants, biodegradable substrates, and manufactured materials, such as fiberglass containment vessels, from lunar resources. Outputs include purification of air and water, food, and hydrogen (H2) generated from methane (CH4). Important criteria are as follows: (1) minimize resupply from Earth; and (2) recycle as efficiently as possible.

  18. Lessons Learned From Community-Based Approaches to Sodium Reduction

    PubMed Central

    Kane, Heather; Strazza, Karen; Losby, Jan L.; Lane, Rashon; Mugavero, Kristy; Anater, Andrea S.; Frost, Corey; Margolis, Marjorie; Hersey, James

    2017-01-01

    Purpose This article describes lessons from a Centers for Disease Control and Prevention initiative encompassing sodium reduction interventions in six communities. Design A multiple case study design was used. Setting This evaluation examined data from programs implemented in six communities located in New York (Broome County, Schenectady County, and New York City); California (Los Angeles County and Shasta County); and Kansas (Shawnee County). Subjects Participants (n = 80) included program staff, program directors, state-level staff, and partners. Measures Measures for this evaluation included challenges, facilitators, and lessons learned from implementing sodium reduction strategies. Analysis The project team conducted a document review of program materials and semistructured interviews 12 to 14 months after implementation. The team coded and analyzed data deductively and inductively. Results Five lessons for implementing community-based sodium reduction approaches emerged: (1) build relationships with partners to understand their concerns, (2) involve individuals knowledgeable about specific venues early, (3) incorporate sodium reduction efforts and messaging into broader nutrition efforts, (4) design the program to reduce sodium gradually to take into account consumer preferences and taste transitions, and (5) identify ways to address the cost of lower-sodium products. Conclusion The experiences of the six communities may assist practitioners in planning community-based sodium reduction interventions. Addressing sodium reduction using a community-based approach can foster meaningful change in dietary sodium consumption. PMID:24575726

  19. An agent-based simulation model to study accountable care organizations.

    PubMed

    Liu, Pai; Wu, Shinyi

    2016-03-01

    Creating accountable care organizations (ACOs) has been widely discussed as a strategy to control rapidly rising healthcare costs and improve quality of care; however, building an effective ACO is a complex process involving multiple stakeholders (payers, providers, patients) with their own interests. Also, implementation of an ACO is costly in terms of time and money. Immature design could cause safety hazards. Therefore, there is a need for analytical model-based decision-support tools that can predict the outcomes of different strategies to facilitate ACO design and implementation. In this study, an agent-based simulation model was developed to study ACOs that considers payers, healthcare providers, and patients as agents under the shared saving payment model of care for congestive heart failure (CHF), one of the most expensive causes of sometimes preventable hospitalizations. The agent-based simulation model has identified the critical determinants for the payment model design that can motivate provider behavior changes to achieve maximum financial and quality outcomes of an ACO. The results show nonlinear provider behavior change patterns corresponding to changes in payment model designs. The outcomes vary by providers with different quality or financial priorities, and are most sensitive to the cost-effectiveness of CHF interventions that an ACO implements. This study demonstrates an increasingly important method to construct a healthcare system analytics model that can help inform health policy and healthcare management decisions. The study also points out that the likely success of an ACO is interdependent with payment model design, provider characteristics, and cost and effectiveness of healthcare interventions.
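
    The shared-savings mechanics this entry describes can be sketched as a toy agent-based loop: a provider agent repeatedly adjusts its care-management effort in response to the payment model's incentives. Everything below (the Provider class, the payoff terms, and all parameter values) is a hypothetical illustration of the general idea, not the authors' model.

    ```python
    class Provider:
        """Hypothetical provider agent choosing a care-management effort level for CHF patients."""

        def __init__(self, quality_weight):
            self.quality_weight = quality_weight  # how strongly the provider values quality vs. profit
            self.effort = 0.5                     # fraction of patients enrolled in the CHF intervention

        def payoff(self, effort, benchmark, base_cost, intervention_cost, savings_rate, share):
            # Effort lowers hospitalization costs but costs money to deliver.
            actual = base_cost * (1 - savings_rate * effort) + intervention_cost * effort
            shared_savings = share * max(benchmark - actual, 0.0)  # ACO shared-savings payment
            quality = effort  # proxy: more enrolled patients -> higher quality score
            return shared_savings - intervention_cost * effort + self.quality_weight * quality

        def adapt(self, **params):
            # Myopic best response: try a small move up or down; keep it if payoff improves.
            for candidate in (self.effort + 0.1, self.effort - 0.1):
                candidate = min(max(candidate, 0.0), 1.0)
                if self.payoff(candidate, **params) > self.payoff(self.effort, **params):
                    self.effort = candidate

    params = dict(benchmark=100.0, base_cost=100.0, intervention_cost=5.0,
                  savings_rate=0.3, share=0.5)
    provider = Provider(quality_weight=2.0)
    for _ in range(50):
        provider.adapt(**params)
    # With these invented numbers, shared savings plus the quality term outweigh
    # intervention costs, so the provider ramps effort up to its maximum.
    print(round(provider.effort, 1))
    ```

    Sweeping `share`, `quality_weight`, or the intervention's cost-effectiveness in such a loop reproduces, in miniature, the kind of nonlinear behavior-change patterns the study reports.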

  20. Design of a Competency-Based Assessment Model in the Field of Accounting

    ERIC Educational Resources Information Center

    Ciudad-Gómez, Adelaida; Valverde-Berrocoso, Jesús

    2012-01-01

    This paper presents the phases involved in the design of a methodology to contribute both to the acquisition of competencies and to their assessment in the field of Financial Accounting, within the European Higher Education Area (EHEA) framework, which we call MANagement of COMpetence in the areas of Accounting (MANCOMA). Having selected and…

  1. Involving Diverse Communities of Practice to Minimize Unintended Consequences of Test-Based Accountability Systems

    ERIC Educational Resources Information Center

    Behizadeh, Nadia; Engelhard, George, Jr.

    2015-01-01

    In his focus article, Koretz (this issue) argues that accountability has become the primary function of large-scale testing in the United States. He then points out that tests being used for accountability purposes are flawed and that the high-stakes nature of these tests creates a context that encourages score inflation. Koretz is concerned about…

  2. Tracking online poker problem gamblers with player account-based gambling data only.

    PubMed

    Luquiens, Amandine; Tanguy, Marie-Laure; Benyamina, Amine; Lagadec, Marthylle; Aubin, Henri-Jean; Reynaud, Michel

    2016-12-01

    The aim was to develop and validate an instrument to track online problem poker gamblers with player account-based gambling data (PABGD). We emailed an invitation to all active poker gamblers on the online gambling service provider Winamax. The 14,261 participants completed the Problem Gambling Severity Index (PGSI). PGSI served as a gold standard to track problem gamblers (i.e., PGSI ≥ 5). We used a stepwise logistic regression to build a predictive model of problem gambling with PABGD, and validated it. Online poker problem gamblers made up 18% of the sample. The risk factors of problem gambling included in the predictive model were being male, compulsive, younger than 28 years, making a total deposit > 0 euros, having a mean loss per gambling session > 1.7 euros, losing a total of > 45 euros in the last 30 days, having a total stake > 298 euros, having > 60 gambling sessions in the last 30 days, and multi-tabling. The tracking instrument had a sensitivity of 80% and a specificity of 50%. The quality of the instrument was good. This study illustrates the feasibility of a method to develop and validate instruments to track online problem gamblers with PABGD only. Copyright © 2016 John Wiley & Sons, Ltd.
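
    The scoring step of such a tracking instrument can be illustrated with a logistic model over binary indicators for the risk-factor cutoffs listed in the abstract. The coefficients, intercept, and classification threshold below are made up for illustration; the study's fitted model is not reproduced in this record.

    ```python
    import math

    # Hypothetical coefficients for a subset of the reported risk factors
    # (the cutoffs come from the abstract; the weights and intercept are invented).
    COEFS = {
        "is_male": 0.5,
        "under_28": 0.4,
        "mean_loss_over_1_70": 0.6,
        "loss_30d_over_45": 0.7,
        "sessions_30d_over_60": 0.8,
        "multi_tabling": 0.3,
    }
    INTERCEPT = -2.0

    def risk_probability(player):
        """Logistic score from binary player-account indicators (1 = risk factor present)."""
        z = INTERCEPT + sum(COEFS[k] * player.get(k, 0) for k in COEFS)
        return 1 / (1 + math.exp(-z))

    def classify(player, threshold=0.35):
        return risk_probability(player) >= threshold

    def sensitivity_specificity(players, labels, threshold=0.35):
        """Compare predicted flags against a gold standard (PGSI >= 5 in the study)."""
        tp = sum(1 for p, y in zip(players, labels) if y == 1 and classify(p, threshold))
        fn = sum(1 for p, y in zip(players, labels) if y == 1 and not classify(p, threshold))
        tn = sum(1 for p, y in zip(players, labels) if y == 0 and not classify(p, threshold))
        fp = sum(1 for p, y in zip(players, labels) if y == 0 and classify(p, threshold))
        return tp / (tp + fn), tn / (tn + fp)

    high_risk = {"is_male": 1, "under_28": 1, "mean_loss_over_1_70": 1,
                 "loss_30d_over_45": 1, "sessions_30d_over_60": 1, "multi_tabling": 1}
    low_risk = {"is_male": 1}
    print(round(risk_probability(high_risk), 2), round(risk_probability(low_risk), 2))  # → 0.79 0.18
    ```

    In practice the threshold is tuned to trade sensitivity against specificity, which is how an instrument ends up at the reported 80%/50% operating point.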

  3. Grid-cell-based crop water accounting for the famine early warning system

    USGS Publications Warehouse

    Verdin, J.; Klaver, R.

    2002-01-01

    Rainfall monitoring is a regular activity of food security analysts for sub-Saharan Africa due to the potentially disastrous impact of drought. Crop water accounting schemes are used to track rainfall timing and amounts relative to phenological requirements, to infer water limitation impacts on yield. Unfortunately, many rain gauge reports are available only after significant delays, and the gauge locations leave large gaps in coverage. As an alternative, a grid-cell-based formulation for the water requirement satisfaction index (WRSI) was tested for maize in Southern Africa. Grids of input variables were obtained from remote sensing estimates of rainfall, meteorological models, and digital soil maps. The spatial WRSI was computed for the 1996-97 and 1997-98 growing seasons. Maize yields were estimated by regression and compared with a limited number of reports from the field for the 1996-97 season in Zimbabwe. Agreement at a useful level (r = 0.80) was observed. This is comparable to results from traditional analysis with station data. The findings demonstrate the complementary role that remote sensing, modelling, and geospatial analysis can play in an era when field data collection in sub-Saharan Africa is suffering an unfortunate decline. Published in 2002 by John Wiley & Sons, Ltd.
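
    The water-accounting idea behind the WRSI can be sketched as per-grid-cell soil-water bookkeeping over the season: rainfall recharges a soil bucket, the crop draws what the bucket can supply, and the index is the fraction of the seasonal requirement actually met. This is a simplified illustration; the operational WRSI also uses potential evapotranspiration, crop coefficients, and calibrated soil parameters, and all numbers below are invented.

    ```python
    def wrsi(rainfall, crop_water_need, soil_capacity=100.0):
        """Seasonal water requirement satisfaction index for one grid cell (sketch).

        rainfall, crop_water_need: per-dekad lists (mm); soil_capacity: max soil water (mm).
        Returns 100 when every dekad's requirement is met, lower as deficits accumulate.
        """
        soil = soil_capacity  # assume a full soil profile at the season's start
        supplied = 0.0
        for rain, need in zip(rainfall, crop_water_need):
            soil = min(soil + rain, soil_capacity)  # recharge; excess runs off
            use = min(need, soil)                   # crop takes what the soil can supply
            soil -= use
            supplied += use
        total_need = sum(crop_water_need)
        return 100.0 * supplied / total_need if total_need else 100.0

    # Wet vs. dry season against the same (invented) maize requirement profile:
    need = [20, 30, 40, 40, 30, 20]
    print(wrsi([25, 30, 45, 40, 30, 25], need))  # requirement fully met → 100.0
    print(wrsi([5, 10, 10, 5, 5, 0], need))      # drought-limited, well below 100
    ```

    Running this per cell over gridded rainfall estimates, rather than at scattered gauges, is what closes the coverage gaps the abstract describes.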

  4. New approaches in diffraction based optical metrology

    NASA Astrophysics Data System (ADS)

    Ebert, M.; Vanoppen, P.; Jak, M.; v. d. Zouw, G.; Cramer, H.; Nooitgedagt, T.; v. d. Laan, H.

    2016-03-01

    Requirements for on-product overlay, focus and CD uniformity continue to tighten in order to support the demands of 10nm and 7nm nodes. This results in the need for simultaneously accurate, robust and dense metrology data as input for closed-loop control solutions, thereby enabling wafer-level control and high-order corrections. In addition, the use of opaque materials and stringent design rules drives the need for expansion of the available measurement wavelengths and metrology target design space. Diffraction-based optical metrology has been established as the leading methodology for integrated as well as standalone optical metrology for overlay, focus and CD monitoring and control in state-of-the-art chip manufacturing. We present new approaches to diffraction-based optical metrology designed to meet the <=10nm node challenges. These approaches have been implemented in the latest addition to the YieldStar metrology platform, the YS350E, introducing a new way of acquiring and processing diffraction-based metrology signals. In this paper we present the new detection principle and its impact on key performance characteristics of overlay and focus measurements. We also describe the wide range of applications of a newly introduced increased measurement spot size, enabling significant improvements to accuracy and process robustness of overlay and focus measurements. With the YS350E, optical CD measurement capability is also extended to 10×10 μm² targets. We discuss the performance and value of small targets in after-develop and after-etch applications.

  5. Structure-Based Statistical Mechanical Model Accounts for the Causality and Energetics of Allosteric Communication

    PubMed Central

    Guarnera, Enrico; Berezovsky, Igor N.

    2016-01-01

    Allostery is one of the pervasive mechanisms through which proteins in living systems carry out enzymatic activity, cell signaling, and metabolism control. Effective modeling of the protein function regulation requires a synthesis of the thermodynamic and structural views of allostery. We present here a structure-based statistical mechanical model of allostery, allowing one to observe causality of communication between regulatory and functional sites, and to estimate per residue free energy changes. Based on the consideration of ligand free and ligand bound systems in the context of a harmonic model, corresponding sets of characteristic normal modes are obtained and used as inputs for an allosteric potential. This potential quantifies the mean work exerted on a residue due to the local motion of its neighbors. Subsequently, in a statistical mechanical framework the entropic contribution to allosteric free energy of a residue is directly calculated from the comparison of conformational ensembles in the ligand free and ligand bound systems. As a result, this method provides a systematic approach for analyzing the energetics of allosteric communication based on a single structure. The feasibility of the approach was tested on a variety of allosteric proteins, heterogeneous in terms of size, topology and degree of oligomerization. The allosteric free energy calculations show the diversity of ways and complexity of scenarios existing in the phenomenology of allosteric causality and communication. The presented model is a step forward in developing the computational techniques aimed at detecting allosteric sites and obtaining the discriminative power between agonistic and antagonistic effectors, which are among the major goals in allosteric drug design. PMID:26939022

  6. Toward a dynamic approach of THA planning based on ultrasound.

    PubMed

    Dardenne, Guillaume; Dusseau, Stéphane; Hamitouche, Chafiaâ; Lefèvre, Christian; Stindel, Eric

    2009-04-01

    The risk of dislocation after THA reportedly is minimized if the acetabular implant is oriented at 45 degrees inclination and 15 degrees anteversion with respect to the anterior pelvic plane. This reference plane now is used in computer-assisted protocols. However, this static approach may lead to postoperative instability because the dynamic variations of the pelvis influence effective cup orientation and are not taken into account in this approach. We propose an ultrasound tool to register the preoperative dynamics of the pelvis for THA planning during computer-assisted surgery. To assess this pelvic behavior and its consequences on implant orientation, we tested a new 2.5-dimensional ultrasound-based approach. The pelvic flexion was registered in sitting, standing, and supine positions in 20 subjects. The mean values were -25.2 degrees +/- 5.8 degrees (standard deviation), 2.4 degrees +/- 5.1 degrees, and 6.8 degrees +/- 3.5 degrees, respectively. The mean functional anteversion varied by 26 degrees and the mean functional inclination by 12 degrees depending on the pelvic flexion. We therefore recommend including dynamic pelvic behavior to minimize dislocation risk. The notion of a safe zone should be revisited and extended to include changes with activity.

  7. Toward a Dynamic Approach of THA Planning Based on Ultrasound

    PubMed Central

    Dusseau, Stéphane; Hamitouche, Chafiaâ; Lefèvre, Christian; Stindel, Eric

    2008-01-01

    The risk of dislocation after THA reportedly is minimized if the acetabular implant is oriented at 45° inclination and 15° anteversion with respect to the anterior pelvic plane. This reference plane now is used in computer-assisted protocols. However, this static approach may lead to postoperative instability because the dynamic variations of the pelvis influence effective cup orientation and are not taken into account in this approach. We propose an ultrasound tool to register the preoperative dynamics of the pelvis for THA planning during computer-assisted surgery. To assess this pelvic behavior and its consequences on implant orientation, we tested a new 2.5-dimensional ultrasound-based approach. The pelvic flexion was registered in sitting, standing, and supine positions in 20 subjects. The mean values were −25.2° ± 5.8° (standard deviation), 2.4° ± 5.1°, and 6.8° ± 3.5°, respectively. The mean functional anteversion varied by 26° and the mean functional inclination by 12° depending on the pelvic flexion. We therefore recommend including dynamic pelvic behavior to minimize dislocation risk. The notion of a safe zone should be revisited and extended to include changes with activity. PMID:18688691

  8. Future Performance Trend Indicators: A Current Value Approach to Human Resources Accounting. Report I. Internal Consistencies and Relationships to Performance By Site. Final Report.

    ERIC Educational Resources Information Center

    Pecorella, Patricia A.; Bowers, David G.

    Analyses preparatory to construction of a suitable file for generating a system of future performance trend indicators are described. Such a system falls into the category of a current value approach to human resources accounting. It requires that there be a substantial body of data which: (1) uses the work group or unit, not the individual, as…

  9. Peptide Based Radiopharmaceuticals: Specific Construct Approach

    SciTech Connect

    Som, P; Rhodes, B A; Sharma, S S

    1997-10-21

    The objective of this project was to develop receptor-based peptides for diagnostic imaging and therapy. A series of peptides related to cell adhesion molecules (CAM) and immune regulation were designed for radiolabeling with 99mTc and evaluated in animal models as potential diagnostic imaging agents for various disease conditions such as thrombus (clot), acute kidney failure, and infection/inflammation imaging. The peptides for this project were designed by the industrial partner, Palatin Technologies (formerly Rhomed, Inc.), using various peptide design approaches including a newly developed rational computer-assisted drug design (CADD) approach termed MIDAS (Metal ion Induced Distinctive Array of Structures). In this approach, the biological function domain and the 99mTc complexing domain are fused together so that structurally these domains are indistinguishable. This approach allows construction of conformationally rigid metallo-peptide molecules (similar to cyclic peptides) that are metabolically stable in vivo. All the newly designed peptides were screened in various in vitro receptor binding and functional assays to identify a lead compound. The lead compounds were formulated in a one-step 99mTc labeling kit form, which was studied by BNL for detailed in vivo imaging using various animal models of human disease. Two main peptides developed using the MIDAS approach were investigated: an RGD peptide for acute renal failure and an immunomodulatory peptide derived from tuftsin (RMT-1) for infection/inflammation imaging. Various RGD-based metallopeptides were designed, synthesized and assayed for their efficacy in inhibiting ADP-induced human platelet aggregation. Most of these peptides displayed biological activity in the 1-100 µM range. Based on previous work by others, RGD-I and RGD-II were evaluated in animal models of acute renal failure. These earlier studies showed that after acute ischemic injury the renal cortex displays

  10. [Demetrius Cantemir in the history of surgery: the first account of transabdominal approach to repair groin hernias].

    PubMed

    Nicolau, A E

    2008-01-01

    The first description of the transabdominal approach for hernia repair was written by Demetrius Cantemir, Prince of Moldavia and encyclopedic scholar, in his 1716 Latin manuscript "Incrementa et Decrementa Aulae Othmanicae". This manuscript was one of the most important of Eastern Europe at the time. It was first translated into English in 1734, and all subsequent translations into various other languages were based on this English version. The original manuscript now belongs to the Houghton Library of Harvard University, where it was rediscovered in 1984 by V. Candea. D. Sluşanschi has made the first Romanian translation of the first two volumes based on the original Latin manuscript. This translation is now in press. Our article presents for the first time a fragment of this Romanian translation from the Annotations of Volume two, chapter four. In this fragment, Demetrius Cantemir describes the surgical procedure practiced by Albanian physicians in the prince's palace in Constantinople. The patient was the secretary of the prince. There is a detailed description of the postsurgical therapy and the medical course to recovery. It was first partially reproduced by Mercy in his book on hernia published in 1892, and more recently by Meade in 1965. We consider it useful to present to the medical community this valuable but little-known contribution to the history of medicine.

  11. QNA-based 'Star Track' QSAR approach.

    PubMed

    Filimonov, D A; Zakharov, A V; Lagunin, A A; Poroikov, V V

    2009-10-01

    In existing quantitative structure-activity relationship (QSAR) methods, any molecule is represented as a single point in a many-dimensional space of molecular descriptors. We propose a new QSAR approach based on Quantitative Neighbourhoods of Atoms (QNA) descriptors, which characterize each atom of a molecule and depend on the whole molecule structure. In the 'Star Track' methodology, any molecule is represented as a set of points in a two-dimensional space of QNA descriptors. With our new method, the estimate of the target property of a chemical compound is calculated as the average value of a function of QNA descriptors evaluated at the points of the molecule's atoms in QNA descriptor space. In essence, we propose using only two descriptors rather than the more than 3000 molecular descriptors applied in existing QSAR methods. On the basis of this approach we have developed the computer program GUSAR and compared it with several widely used QSAR methods including CoMFA, CoMSIA, Golpe/GRID, HQSAR and others, using ten data sets representing various chemical series and diverse types of biological activity. We show that in the majority of cases the accuracy and predictivity of GUSAR models appear to be better than those of the reference QSAR methods. High predictive ability and robustness of GUSAR are also shown in the leave-20%-out cross-validation procedure.
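
    The 'Star Track' idea of averaging a function over per-atom points in a 2-D descriptor space can be sketched as follows. The descriptor definitions and the property function here are toy stand-ins (based on neighbour electronegativities), not the actual QNA formulas or the fitted functions used in GUSAR.

    ```python
    def qna_descriptors(molecule):
        """Toy stand-in for QNA descriptors: each atom gets a (P, Q) pair that depends
        on its neighbourhood (here, simply Pauling electronegativities of bonded atoms).
        The real QNA definitions differ; this only mirrors the per-atom 2-D shape."""
        eneg = {"C": 2.55, "N": 3.04, "O": 3.44, "H": 2.20}
        points = []
        for atom, neighbours in molecule:
            p = eneg[atom]
            q = sum(eneg[n] for n in neighbours) / len(neighbours)
            points.append((p, q))
        return points

    def predict(molecule, f):
        """'Star Track' style estimate: average f over the molecule's atom points in the
        2-D descriptor space, instead of one point in a high-dimensional space."""
        points = qna_descriptors(molecule)
        return sum(f(p, q) for p, q in points) / len(points)

    # Hypothetical fitted property function (in GUSAR this is learned from training data).
    f = lambda p, q: 0.8 * p + 0.2 * q

    # Toy heavy-atom skeleton of ethanol: (atom, bonded atoms)
    ethanol = [("C", ["C", "H"]), ("C", ["C", "O", "H"]), ("O", ["C", "H"])]
    print(round(predict(ethanol, f), 3))  # → 2.776
    ```

    Representing a molecule as a cloud of atom points rather than one descriptor vector is what lets the method get by with only two descriptors.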

  12. The financing of the health system in the Islamic Republic of Iran: A National Health Account (NHA) approach

    PubMed Central

    Zakeri, Mohammadreza; Olyaeemanesh, Alireza; Zanganeh, Marziee; Kazemian, Mahmoud; Rashidian, Arash; Abouhalaj, Masoud; Tofighi, Shahram

    2015-01-01

    Background: The National Health Accounts keep track of all healthcare-related activities from the beginning (i.e. resource provision) to the end (i.e. service provision). This study was conducted to address the following questions: How is the Iranian health system funded? Who distributes the funds? On what services are the funds spent? Which service providers receive the funds? Methods: The required study data were collected through a number of methods. The family health expenditure data were obtained through a cross-sectional multistage (seasonal) survey, while library and field study were used to collect the registered data. The collected data fell into the following three categories: household health expenditure (sample size: 10,200 urban households and 6,800 rural households; four rounds of questioning), financial agents' data, and the medical universities' financial performance data. Results: The total health expenditure of Iranian households was 201,496,172 million Rials in 2008, a 34.4% increase compared to 2007. The total health expenditure was 6.2% of GDP. The share of the public sector showed a decreasing trend between 2003 and 2008, while the share of the private sector, of which 95.77% was paid by households, showed an increasing trend within the same period. Out-of-pocket expenditure was 53.79% of the total health expenditure. The total health expenditure per capita was US$ 284.00 based on the official exchange rate and US$ 683.1 based on the international exchange rate (exchange rate: US$1 = 9,988 Rials). Conclusion: The share of the public and private sectors in financing the health system was imbalanced and did not meet international standards. The public share of the total health expenditures has increased in recent years despite the 4th and 5th Development Plans. 
The inclusion of household health insurance fees and other service related expenses increases the public contribution to 73% of the

  13. Sepsis management: An evidence-based approach.

    PubMed

    Baig, Muhammad Akbar; Shahzad, Hira; Jamil, Bushra; Hussain, Erfan

    2016-03-01

    The Surviving Sepsis Campaign (SSC) guidelines have outlined an early goal directed therapy (EGDT) which demonstrates a standardized approach to ensure prompt and effective management of sepsis. Having said that, there are barriers associated with the application of evidence-based practice, which often lead to an overall poorer adherence to guidelines. Considering the global burden of disease, data from low- to middle-income countries is scarce. Asia is the largest continent but most Asian countries do not have a well-developed healthcare system and compliance rates to resuscitation and management bundles are as low as 7.6% and 3.5%, respectively. Intensive care units are not adequately equipped and financial concerns limit implementation of expensive treatment strategies. Healthcare policy-makers should be notified in order to alleviate financial restrictions and ensure delivery of standard care to septic patients.

  14. Nanotechnology-based approaches in anticancer research.

    PubMed

    Jabir, Nasimudeen R; Tabrez, Shams; Ashraf, Ghulam Md; Shakil, Shazi; Damanhouri, Ghazi A; Kamal, Mohammad A

    2012-01-01

    Cancer is a highly complex disease to understand, because it entails multiple cellular physiological systems. The most common cancer treatments are restricted to chemotherapy, radiation and surgery. Moreover, the early recognition and treatment of cancer remains a technological bottleneck. There is an urgent need to develop new and innovative technologies that could help to delineate tumor margins, identify residual tumor cells and micrometastases, and determine whether a tumor has been completely removed or not. Nanotechnology has witnessed significant progress in the past few decades, and its effect is widespread nowadays in every field. Nanoparticles can be modified in numerous ways to prolong circulation, enhance drug localization, increase drug efficacy, and potentially decrease chances of multidrug resistance by the use of nanotechnology. Recently, research in the field of cancer nanotechnology has made remarkable advances. The present review summarizes the application of various nanotechnology-based approaches towards the diagnostics and therapeutics of cancer.

  15. Functional phosphoproteomic mass spectrometry-based approaches

    PubMed Central

    2012-01-01

    Mass Spectrometry (MS)-based phosphoproteomics tools are crucial for understanding the structure and dynamics of signaling networks. Approaches such as affinity purification followed by MS have also been used to elucidate relevant biological questions in health and disease. The study of proteomes and phosphoproteomes as linked systems, rather than research studies of individual proteins, are necessary to understand the functions of phosphorylated and un-phosphorylated proteins under spatial and temporal conditions. Phosphoproteome studies also facilitate drug target protein identification which may be clinically useful in the near future. Here, we provide an overview of general principles of signaling pathways versus phosphorylation. Likewise, we detail chemical phosphoproteomic tools, including pros and cons with examples where these methods have been applied. In addition, basic clues of electrospray ionization and collision induced dissociation fragmentation are detailed in a simple manner for successful phosphoproteomic clinical studies. PMID:23369623

  16. Strategic approaches to planetary base development

    NASA Technical Reports Server (NTRS)

    Roberts, Barney B.

    1992-01-01

    The evolutionary development of a planetary expansionary outpost is considered in the light of both technical and economic issues. The outline of a partnering taxonomy is set forth which encompasses both institutional and temporal issues related to establishing shared interests and investments. The purely technical issues are discussed in terms of the program components which include nonaerospace technologies such as construction engineering. Five models are proposed in which partnership and autonomy for participants are approached in different ways including: (1) the standard customer/provider relationship; (2) a service-provider scenario; (3) the joint venture; (4) a technology joint-development model; and (5) a redundancy model for reduced costs. Based on the assumed characteristics of planetary surface systems the cooperative private/public models are championed with coordinated design by NASA to facilitate outside cooperation.

  17. A Social Justice Perspective on Strengths-Based Approaches: Exploring Educators' Perspectives and Practices

    ERIC Educational Resources Information Center

    Gardner, Morgan K. A.; Toope, Deborah Florence

    2011-01-01

    What does it mean to engage in strengths-based (SB) approaches from a social justice perspective? In this paper we explore the accounts of educators who work with youth experiencing social and educational barriers to describe what it might mean to engage in SB practices from a social justice perspective. Using data generated from interviews, we…

  18. The accountability for reasonableness approach to guide priority setting in health systems within limited resources – findings from action research at district level in Kenya, Tanzania, and Zambia

    PubMed Central

    2014-01-01

    Background Priority-setting decisions are based on an important, but not sufficient set of values and thus lead to disagreement on priorities. Accountability for Reasonableness (AFR) is an ethics-based approach to a legitimate and fair priority-setting process that builds upon four conditions: relevance, publicity, appeals, and enforcement, which facilitate agreement on priority-setting decisions and gain support for their implementation. This paper focuses on the assessment of AFR within the project REsponse to ACcountable priority setting for Trust in health systems (REACT). Methods This intervention study applied an action research methodology to assess implementation of AFR in one district each in Kenya, Tanzania, and Zambia. The assessments focused on selected disease, program, and managerial areas. An implementing action research team of core health team members and supporting researchers was formed to implement, and continually assess and improve the application of the four conditions. Researchers evaluated the intervention using qualitative and quantitative data collection and analysis methods. Results The values underlying the AFR approach were, in all three districts, well aligned with general values expressed by both service providers and community representatives. There was some variation in the interpretations and actual use of the AFR in the decision-making processes in the three districts, and its effect ranged from an increase in awareness of the importance of fairness to a broadened engagement of health team members and other stakeholders in priority setting and other decision-making processes. Conclusions District stakeholders were able to take greater charge of closing the gap between nationally set planning and the local realities and demands of the served communities within the limited resources at hand. 
This study thus indicates that the operationalization of the four broadly defined and linked conditions is both possible and seems to

  19. Women of Courage: A Personal Account of a Wilderness-Based Experiential Group for Survivors of Abuse

    ERIC Educational Resources Information Center

    Kelly, Virginia A.

    2006-01-01

    Adventure-based therapy has grown in both scope and popularity. These groups are frequently utilized in the treatment of adolescents with behavioral or substance abuse issues. Less evident is the use of this modality with other populations. Described here is a personal account of the author's participation in a wilderness-based group for women.…

  20. A novel Bayesian approach to accounting for uncertainty in fMRI-derived estimates of cerebral oxygen metabolism fluctuations

    PubMed Central

    Simon, Aaron B.; Dubowitz, David J.; Blockley, Nicholas P.; Buxton, Richard B.

    2016-01-01

    Calibrated blood oxygenation level dependent (BOLD) imaging is a multimodal functional MRI technique designed to estimate changes in cerebral oxygen metabolism from measured changes in cerebral blood flow and the BOLD signal. This technique addresses fundamental ambiguities associated with quantitative BOLD signal analysis; however, its dependence on biophysical modeling creates uncertainty in the resulting oxygen metabolism estimates. In this work, we developed a Bayesian approach to estimating the oxygen metabolism response to a neural stimulus and used it to examine the uncertainty that arises in calibrated BOLD estimation due to the presence of unmeasured model parameters. We applied our approach to estimate the CMRO2 response to a visual task using the traditional hypercapnia calibration experiment as well as to estimate the metabolic response to both a visual task and hypercapnia using the measurement of baseline apparent R2′ as a calibration technique. Further, in order to examine the effects of cerebral spinal fluid (CSF) signal contamination on the measurement of apparent R2′, we examined the effects of measuring this parameter with and without CSF-nulling. We found that the two calibration techniques provided consistent estimates of the metabolic response on average, with a median R2′-based estimate of the metabolic response to CO2 of 1.4%, and R2′- and hypercapnia-calibrated estimates of the visual response of 27% and 24%, respectively. However, these estimates were sensitive to different sources of estimation uncertainty. The R2′-calibrated estimate was highly sensitive to CSF contamination and to uncertainty in unmeasured model parameters describing flow-volume coupling, capillary bed characteristics, and the iso-susceptibility saturation of blood. The hypercapnia-calibrated estimate was relatively insensitive to these parameters but highly sensitive to the assumed metabolic response to CO2. PMID:26790354

  1. A novel Bayesian approach to accounting for uncertainty in fMRI-derived estimates of cerebral oxygen metabolism fluctuations.

    PubMed

    Simon, Aaron B; Dubowitz, David J; Blockley, Nicholas P; Buxton, Richard B

    2016-04-01

    Calibrated blood oxygenation level dependent (BOLD) imaging is a multimodal functional MRI technique designed to estimate changes in cerebral oxygen metabolism from measured changes in cerebral blood flow and the BOLD signal. This technique addresses fundamental ambiguities associated with quantitative BOLD signal analysis; however, its dependence on biophysical modeling creates uncertainty in the resulting oxygen metabolism estimates. In this work, we developed a Bayesian approach to estimating the oxygen metabolism response to a neural stimulus and used it to examine the uncertainty that arises in calibrated BOLD estimation due to the presence of unmeasured model parameters. We applied our approach to estimate the CMRO2 response to a visual task using the traditional hypercapnia calibration experiment as well as to estimate the metabolic response to both a visual task and hypercapnia using the measurement of baseline apparent R2' as a calibration technique. Further, in order to examine the effects of cerebral spinal fluid (CSF) signal contamination on the measurement of apparent R2', we examined the effects of measuring this parameter with and without CSF-nulling. We found that the two calibration techniques provided consistent estimates of the metabolic response on average, with a median R2'-based estimate of the metabolic response to CO2 of 1.4%, and R2'- and hypercapnia-calibrated estimates of the visual response of 27% and 24%, respectively. However, these estimates were sensitive to different sources of estimation uncertainty. The R2'-calibrated estimate was highly sensitive to CSF contamination and to uncertainty in unmeasured model parameters describing flow-volume coupling, capillary bed characteristics, and the iso-susceptibility saturation of blood. The hypercapnia-calibrated estimate was relatively insensitive to these parameters but highly sensitive to the assumed metabolic response to CO2.
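
    The marginalization idea behind this kind of Bayesian approach can be sketched with the standard Davis calibration model: the unmeasured calibration parameter M is integrated out over a prior grid rather than fixed at one value, so its uncertainty propagates into the CMRO2 estimate. All values below (α, β, grids, noise level, and the simulated measurement) are illustrative assumptions, not the paper's.

    ```python
    import math

    # Davis-model sketch: dBOLD = M * (1 - f**(alpha - beta) * r**beta), where
    # f = measured CBF ratio, r = CMRO2 ratio (the quantity of interest), and
    # M = unmeasured calibration parameter. Alpha/beta values are illustrative.
    ALPHA, BETA = 0.2, 1.3

    def forward(r, f, M):
        return M * (1.0 - f ** (ALPHA - BETA) * r ** BETA)

    def posterior_mean_r(dbold_meas, f_meas, sigma=0.002):
        """Posterior mean of r, marginalizing M over a flat prior grid (coarse sketch)."""
        rs = [1.0 + 0.01 * i for i in range(51)]  # CMRO2 ratio grid: 1.00 .. 1.50
        Ms = [0.04 + 0.01 * i for i in range(9)]  # M prior grid: 0.04 .. 0.12
        num = den = 0.0
        for r in rs:
            weight = 0.0
            for M in Ms:
                resid = dbold_meas - forward(r, f_meas, M)
                weight += math.exp(-0.5 * (resid / sigma) ** 2)  # Gaussian likelihood
            num += r * weight
            den += weight
        return num / den

    # Simulated measurement generated with true r = 1.25, f = 1.5, M = 0.08:
    dbold = forward(1.25, 1.5, 0.08)
    print(round(posterior_mean_r(dbold, 1.5), 3))
    ```

    Because many (r, M) pairs explain the same BOLD change, the posterior over r stays broad; measuring a calibration quantity (hypercapnia or R2') effectively narrows the prior on M, which is the role the two calibration techniques play in the study.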

  2. PPDF-based method to account for atmospheric light scattering in observations of carbon dioxide from space

    NASA Astrophysics Data System (ADS)

    Oshchepkov, Sergey; Bril, Andrey; Yokota, Tatsuya

    2008-12-01

    We present an original method that accounts for thin clouds in carbon dioxide retrievals from space-based reflected sunlight observations in near-infrared regions. This approach involves a reasonable, simple parameterization of effective transmittance using a set of parameters that describe the path-length modification caused by clouds. The complete retrieval scheme included the following: estimation of cloud parameters from the 0.76-μm O2 A-band and from the H2O-saturated absorption area of the 2.0-μm band; a necessary correction to utilize these parameters at the target CO2 1.58-μm band using estimated ground surface albedo outside of gas absorption lines in this band; and retrieval of CO2 amount at the 1.58-μm band using a maximum a posteriori method of inversion. The primary retrieved parameters refer to the CO2 volume mixing ratio vertical profile that is then transformed to an averaged-column amount under a pre-defined increment of pressure. A set of numerical simulations with synthetic radiance spectra particular to Greenhouse Gases Observing Satellite (GOSAT) observations showed that the proposed method provides acceptably accurate CO2 retrievals from an atmosphere that includes thin cirrus clouds. The efficiency of the aerosol and cloud corrections was demonstrated by comparison with a modified iterative maximum a posteriori DOAS (IMAP-DOAS) method that neglects path-length modifications due to scattering effects.
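The maximum a posteriori inversion step mentioned above has, for a linear forward model with Gaussian prior and noise, a standard closed form; this sketch shows it with a toy two-layer profile (the matrices and mixing-ratio values are hypothetical, not GOSAT parameters):

```python
import numpy as np

def map_retrieval(y, K, x_a, S_a, S_e):
    """Closed-form MAP estimate for a linear forward model
    y = K x + noise, with prior x ~ N(x_a, S_a) and measurement
    noise ~ N(0, S_e)."""
    Se_inv = np.linalg.inv(S_e)
    Sa_inv = np.linalg.inv(S_a)
    S_hat = np.linalg.inv(K.T @ Se_inv @ K + Sa_inv)  # posterior covariance
    return x_a + S_hat @ K.T @ Se_inv @ (y - K @ x_a)

# Toy example: two-layer CO2 profile observed in three spectral channels
K = np.array([[1.0, 0.5], [0.8, 0.9], [0.3, 1.2]])   # hypothetical Jacobian
x_true = np.array([390.0, 385.0])                    # ppm, synthetic truth
x_a = np.array([380.0, 380.0])                       # prior profile
y = K @ x_true                                       # noise-free measurement
x_hat = map_retrieval(y, K, x_a, np.eye(2) * 100.0, np.eye(3) * 0.01)
```

With a tight noise covariance and a loose prior, the retrieval is data-dominated and `x_hat` lands close to `x_true`; in a real retrieval, K comes from a radiative-transfer model and the covariances encode spectral noise and prior profile statistics.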

  3. Accounting for linkage in family-based tests of association with missing parental genotypes.

    PubMed

    Martin, Eden R; Bass, Meredyth P; Hauser, Elizabeth R; Kaplan, Norman L

    2003-11-01

    In studies of complex diseases, a common paradigm is to conduct association analysis at markers in regions identified by linkage analysis, to attempt to narrow the region of interest. Family-based tests for association based on parental transmissions to affected offspring are often used in fine-mapping studies. However, for diseases with late onset, parental genotypes are often missing. Without parental genotypes, family-based tests either compare allele frequencies in affected individuals with those in their unaffected siblings or use siblings to infer missing parental genotypes. An example of the latter approach is the score test implemented in the computer program TRANSMIT. The inference of missing parental genotypes in TRANSMIT assumes that transmissions from parents to affected siblings are independent, which is appropriate when there is no linkage. However, using computer simulations, we show that, when the marker and disease locus are linked and the data set consists of families with multiple affected siblings, this assumption leads to a bias in the score statistic under the null hypothesis of no association between the marker and disease alleles. This bias leads to an inflated type I error rate for the score test in regions of linkage. We present a novel test for association in the presence of linkage (APL) that correctly infers missing parental genotypes in regions of linkage by estimating identity-by-descent parameters, to adjust for correlation between parental transmissions to affected siblings. In simulated data, we demonstrate the validity of the APL test under the null hypothesis of no association and show that the test can be more powerful than the pedigree disequilibrium test and family-based association test. As an example, we compare the performance of the tests in a candidate-gene study in families with Parkinson disease.
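For contrast with the APL test's IBD-adjusted inference, the simplest family-based association statistic, the classic TDT based on parental transmissions (not the APL statistic itself), can be written in a few lines:

```python
def tdt_chi2(b, c):
    """McNemar-type TDT statistic. b = number of heterozygous parents
    transmitting the candidate allele to an affected child,
    c = number transmitting the other allele. Under the null of no
    association the statistic is asymptotically chi-square with 1 df."""
    return (b - c) ** 2 / (b + c)

# Example: 60 transmissions of the candidate allele vs. 40 of the other
chi2 = tdt_chi2(60, 40)   # 4.0, above the 5% critical value of 3.84
```

The APL test goes further by inferring missing parental genotypes while estimating identity-by-descent parameters, which is what keeps its type I error rate correct in linked regions with multiple affected siblings.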

  4. Toward a Direct Realist Account of Observation.

    ERIC Educational Resources Information Center

    Sievers, K. H.

    1999-01-01

    Criticizes the account of observation given by Alan Chalmers in "What Is This Thing Called Science?" and provides an alternative based on direct realist approaches to perception. Contains 15 references. (Author/WRM)

  5. Accounting Curriculum.

    ERIC Educational Resources Information Center

    Prickett, Charlotte

    This curriculum guide describes the accounting curriculum in the following three areas: accounting clerk, bookkeeper, and nondegreed accountant. The competencies and tasks complement the Arizona validated listing in these areas. The guide lists 24 competencies for nondegreed accountants, 10 competencies for accounting clerks, and 11 competencies…

  6. Development of prototype induced-fission-based Pu accountancy instrument for safeguards applications.

    PubMed

    Seo, Hee; Lee, Seung Kyu; An, Su Jung; Park, Se-Hwan; Ku, Jeong-Hoe; Menlove, Howard O; Rael, Carlos D; LaFleur, Adrienne M; Browne, Michael C

    2016-09-01

    A prototype safeguards instrument for nuclear material accountancy (NMA) of uranium/transuranic (U/TRU) products that could be produced in a future advanced PWR fuel processing facility has been developed and characterized. This is a new, hybrid neutron measurement system based on fast neutron energy multiplication (FNEM) and passive neutron albedo reactivity (PNAR) methods. The FNEM method is sensitive to the fission rate induced by fast neutrons, while the PNAR method is sensitive to the fission rate induced by thermal neutrons in the sample to be measured. The induced fission rate is proportional to the total amount of fissile material, especially plutonium (Pu), in the U/TRU product; hence, the Pu amount can be calibrated as a function of the induced fission rate, which can be measured using either the FNEM or PNAR method. In the present study, the prototype system was built using six ³He tubes, and its performance was evaluated for various detector parameters including the high-voltage (HV) plateau, efficiency profiles, dead time, and stability. The system's capability to measure the difference in average neutron energy for the FNEM signature also was evaluated, using AmLi, PuBe, and ²⁵²Cf sources, as well as four Pu-oxide sources, each with a different impurity (Al, F, Mg, and B) producing (α,n) neutrons with different average energies. Future work will measure the hybrid signature (i.e., FNEM×PNAR) for a Pu source with an external interrogating neutron source after enlarging the cavity size of the prototype system to accommodate a large Pu source (~600 g Pu).

  7. Appearance questions can be misleading: a discourse-based account of the appearance-reality problem.

    PubMed

    Hansen, Mikkel B; Markman, Ellen M

    2005-05-01

    Preschoolers' success on the appearance-reality task is a milestone in theory-of-mind development. On the standard task children see a deceptive object, such as a sponge that looks like a rock, and are asked, "What is this really?" and "What does this look like?" Children below 4½ years of age fail, saying that the object not only is a sponge but also looks like a sponge. We propose that young children's difficulty stems from ambiguity in the meaning of "looks like." This locution can refer to outward appearance ("Peter looks like Paul") but in fact often refers to likely reality ("That looks like Jim"). We propose that "looks like" is taken to refer to likely reality unless the reality is already part of the common ground of the conversation. Because this joint knowledge is unclear to young children on the appearance-reality task, they mistakenly think the appearance question is about likely reality. Study 1 analyzed everyday conversations from the CHILDES database and documented that 2- and 3-year-olds are familiar with these two different uses of the locution. To disambiguate the meaning of "looks like," Study 2 clarified that reality was shared knowledge as part of the appearance question, e.g., "What does the sponge look like?" Study 3 used a non-linguistic measure to emphasize the shared knowledge of the reality in the appearance question. Study 4 asked children on their own to articulate the contrast between appearance and reality. At 91%, 85%, and 81% correct responses, children were at near-ceiling levels in each of our manipulations while they failed the standard versions of the tasks. Moreover, we show how this discourse-based explanation accounts for findings in the literature. Thus children master the appearance-reality distinction by the age of 3, but the standard task masks this understanding because of the discourse structure involved in talking about appearances.

  8. A performance weighting procedure for GCMs based on explicit probabilistic models and accounting for observation uncertainty

    NASA Astrophysics Data System (ADS)

    Renard, Benjamin; Vidal, Jean-Philippe

    2016-04-01

    In recent years, the climate modeling community has put a lot of effort into releasing the outputs of multimodel experiments for use by the wider scientific community. In such experiments, several structurally distinct GCMs are run using the same observed forcings (for the historical period) or the same projected forcings (for the future period). In addition, several members are produced for a single given model structure, by running each GCM with slightly different initial conditions. This multiplicity of GCM outputs offers many opportunities in terms of uncertainty quantification or GCM comparisons. In this presentation, we propose a new procedure to weight GCMs according to their ability to reproduce the observed climate. Such weights can be used to combine the outputs of several models in a way that rewards good-performing models and discards poorly-performing ones. The proposed procedure has the following main properties: 1. It is based on explicit probabilistic models describing the time series produced by the GCMs and the corresponding historical observations, 2. It can use several members whenever available, 3. It accounts for the uncertainty in observations, 4. It assigns a weight to each GCM (all weights summing to one), 5. It can also assign a weight to the "H0 hypothesis" that all GCMs in the multimodel ensemble are not compatible with observations. The application of the weighting procedure is illustrated with several case studies including synthetic experiments, simple cases where the target GCM output is a simple univariate variable and more realistic cases where the target GCM output is a multivariate and/or a spatial variable. These case studies illustrate the generality of the procedure, which can be applied in a wide range of situations, as long as the analyst is prepared to make an explicit probabilistic assumption on the target variable. Moreover, these case studies highlight several interesting properties of the weighting procedure.
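A likelihood-based weighting of this kind can be sketched in a few lines: score each GCM by the Gaussian likelihood of the observations under its ensemble, inflating the variance by the observation-error variance, then normalize. All inputs below are synthetic, and the abstract's richer features (the H0 weight, multivariate targets) are omitted:

```python
import numpy as np

def gcm_weights(obs, obs_var, model_runs):
    """Weight each GCM by the Gaussian likelihood of the observed
    series under that model's ensemble of members, adding the
    observation-error variance to the ensemble spread."""
    log_liks = []
    for runs in model_runs:                  # runs has shape (members, time)
        mu = runs.mean(axis=0)
        var = runs.var(axis=0) + obs_var     # model spread + obs uncertainty
        ll = -0.5 * np.sum((obs - mu) ** 2 / var + np.log(2 * np.pi * var))
        log_liks.append(ll)
    log_liks = np.asarray(log_liks)
    w = np.exp(log_liks - log_liks.max())    # stabilized softmax
    return w / w.sum()

# Synthetic check: one GCM close to the observations, one biased by +3
rng = np.random.default_rng(0)
obs = np.zeros(10)
close_model = rng.normal(0.0, 1.0, (5, 10))
biased_model = rng.normal(3.0, 1.0, (5, 10))
w = gcm_weights(obs, obs_var=0.1, model_runs=[close_model, biased_model])
```

As expected, nearly all of the weight goes to the unbiased model; the actual procedure in the abstract builds the probabilistic model explicitly rather than using plug-in ensemble moments.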

  9. A Self-Study Guide to Implementation of Inclusive Assessment and Accountability Systems. A Best Practice Approach.

    ERIC Educational Resources Information Center

    Quenemoen, Rachel F.; Thompson, Sandra J.; Thurlow, Martha L.; Lehr, Camilla A.

    The National Center on Educational Outcomes has identified six core principles of assessment and accountability systems that include all students, specifically students with disabilities, and they have developed brief statements of rationale for each of the principles. These principles, which reflect best practices, are consistent with the…

  10. Civil Society and School Accountability: A Human Rights Approach to Parent and Community Participation in NYC Schools.

    ERIC Educational Resources Information Center

    Sullivan, Elizabeth

    This paper asserts that while many factors contribute to the poor quality of education in New York City public schools, one of the primary obstacles to guaranteeing the right to education is a widespread lack of accountability by school officials. This lack of accountability is tied to the school system's failure to ensure effective participation…

  11. Human Rights Education in Japan: An Historical Account, Characteristics and Suggestions for a Better-Balanced Approach

    ERIC Educational Resources Information Center

    Takeda, Sachiko

    2012-01-01

    Although human rights are often expressed as universal tenets, the concept was conceived in a particular socio-political and historical context. Conceptualisations and practice of human rights vary across societies, and face numerous challenges. After providing an historical account of the conceptualisation of human rights in Japanese society,…

  12. Approach to proliferation risk assessment based on multiple objective analysis framework

    SciTech Connect

    Andrianov, A.; Kuptsov, I.

    2013-07-01

    The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows taking into account the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities). Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Second, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Third, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.

  13. Accountable Professional Practice in ELT

    ERIC Educational Resources Information Center

    Farmer, Frank

    2006-01-01

    Professionalism is widely thought to be desirable in ELT, and at the same time institutions are taking seriously the need to evaluate their teachers. This article presents a general approach to professionalism focused on the accountability of the professional to the client based on TESOL's (2000) classification of adult ELT within eight general…

  14. The Impact of a Home-Based Palliative Care Program in an Accountable Care Organization

    PubMed Central

    Mudra, Mitchell; Romano, Carole; Lukoski, Ed; Chang, Andy; Mittelberger, James; Scherr, Terry; Cooper, David

    2017-01-01

    Abstract Background: People with advanced illness usually want their healthcare where they live—at home—not in the hospital. Innovative models of palliative care that better meet the needs of seriously ill people at lower cost should be explored. Objectives: We evaluated the impact of a home-based palliative care (HBPC) program implemented within an Accountable Care Organization (ACO) on cost and resource utilization. Methods: This was a retrospective analysis to quantify cost savings associated with an HBPC program in a Medicare Shared Savings Program ACO where total cost of care is available. We studied 651 decedents; 82 enrolled in an HBPC program compared to 569 receiving usual care in three New York counties who died between October 1, 2014, and March 31, 2016. We also compared hospital admissions, ER visits, and hospice utilization rates in the final months of life. Results: The cost per patient during the final three months of life was $12,000 lower with HBPC than with usual care ($20,420 vs. $32,420; p = 0.0002); largely driven by a 35% reduction in Medicare Part A ($16,892 vs. $26,171; p = 0.0037). HBPC also resulted in a 37% reduction in Medicare Part B in the final three months of life compared to usual care ($3,114 vs. $4,913; p = 0.0008). Hospital admissions were reduced by 34% in the final month of life for patients enrolled in HBPC. The number of admissions per 1000 beneficiaries per year was 3073 with HBPC and 4640 with usual care (p = 0.0221). HBPC resulted in a 35% increased hospice enrollment rate (p = 0.0005) and a 240% increased median hospice length of stay compared to usual care (34 days vs. 10 days; p < 0.0001). Conclusion: HBPC within an ACO was associated with significant cost savings, fewer hospitalizations, and increased hospice use in the final months of life. PMID:27574868

  15. A spatiotemporal dengue fever early warning model accounting for nonlinear associations with meteorological factors: a Bayesian maximum entropy approach

    NASA Astrophysics Data System (ADS)

    Lee, Chieh-Han; Yu, Hwa-Lung; Chien, Lung-Chang

    2014-05-01

    Dengue fever has been identified as one of the most widespread vector-borne diseases in tropical and sub-tropical regions. In the last decade, dengue has been an emerging infectious disease in Taiwan, especially in the southern area, where incidence is high every year. For the purpose of disease prevention and control, an early warning system is urgently needed. Previous studies have shown significant relationships between climate variables, in particular rainfall and temperature, and the temporal epidemic patterns of dengue cases; however, because the transmission of dengue fever is a complex interactive process, they have mostly understated its composite space-time effects. This study proposes a one-week-ahead warning system for dengue fever epidemics in southern Taiwan that considers nonlinear associations between weekly dengue cases and meteorological factors across space and time. The early warning system is based on an integration of a distributed lag nonlinear model (DLNM) and stochastic Bayesian Maximum Entropy (BME) analysis. The study identified the most significant meteorological measures, including weekly minimum temperature and maximum 24-hour rainfall, with lagged effects of up to 15 weeks on dengue case variation under conditions of uncertainty. Subsequently, the combination of nonlinear lagged effects of climate variables and a space-time dependence function is implemented via a Bayesian framework to predict dengue fever occurrences in southern Taiwan during 2012. The results show that the early warning system provides useful spatio-temporal predictions of potential dengue fever outbreaks. In conclusion, the proposed approach can provide a practical disease control tool for environmental regulators seeking more effective strategies for dengue fever prevention.
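A full DLNM places basis functions over both the lag and predictor dimensions; the lagged-regression structure underneath it can be conveyed with a minimal linear sketch (all data and coefficients below are synthetic, chosen only to mirror the 15-week lag window mentioned in the abstract):

```python
import numpy as np

def lagged_design(x, max_lag):
    """Build a distributed-lag design matrix: column j holds x lagged
    by j steps; rows without a full lag history are dropped."""
    n = len(x) - max_lag
    return np.column_stack([x[max_lag - j : max_lag - j + n]
                            for j in range(max_lag + 1)])

# Synthetic weekly minimum-temperature series and case counts
rng = np.random.default_rng(1)
min_temp = rng.normal(25.0, 3.0, 120)            # hypothetical weekly minima
X = lagged_design(min_temp, max_lag=15)          # lags 0..15 weeks
true_lag_weights = np.linspace(0.5, 0.0, 16)     # effect decaying with lag
cases = X @ true_lag_weights + rng.normal(0.0, 1.0, len(X))

# Ordinary least squares fit of the lag coefficients (plus intercept)
beta, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(X)), X]),
                           cases, rcond=None)
```

The DLNM replaces the raw lag columns with smooth cross-bases (so the lag-response curve is nonlinear and regularized), and the BME step then models the residual space-time dependence, neither of which is attempted here.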

  16. Subtraction-based approach for enhancing the depth sensitivity of time-resolved NIRS

    PubMed Central

    Milej, Daniel; Abdalmalak, Androu; McLachlan, Peter; Diop, Mamadou; Liebert, Adam; St. Lawrence, Keith.

    2016-01-01

    The aim of this study was to evaluate the enhancement of the depth sensitivity of time-resolved near-infrared spectroscopy with a subtraction-based approach. Because of the complexity of light propagation in heterogeneous media, and to prove the validity of the proposed method in a heterogeneous turbid medium, we conducted a broad analysis taking into account a number of parameters related to the method as well as various parameters of the medium. The results of these experiments confirm that the depth sensitivity of the subtraction-based approach is better than that of classical approaches using continuous-wave or time-resolved methods. Furthermore, the results showed that the subtraction-based approach has a unique, selective sensitivity to a layer at a specific depth. In vivo application of the proposed method resulted in a greater magnitude of the hemodynamic changes during functional activation than with the standard approach. PMID:27895992

  17. Nanotechnology-Based Approaches for Guiding Neural Regeneration.

    PubMed

    Shah, Shreyas; Solanki, Aniruddh; Lee, Ki-Bum

    2016-01-19

    The mammalian brain is a phenomenal piece of "organic machinery" that has fascinated scientists and clinicians for centuries. The intricate network of tens of billions of neurons dispersed in a mixture of chemical and biochemical constituents gives rise to thoughts, feelings, memories, and life as we know it. In turn, subtle imbalances or damage to this system can cause severe complications in physical, motor, psychological, and cognitive function. Moreover, the inevitable loss of nerve tissue caused by degenerative diseases and traumatic injuries is particularly devastating because of the limited regenerative capabilities of the central nervous system (i.e., the brain and spinal cord). Among current approaches, stem-cell-based regenerative medicine has shown the greatest promise toward repairing and regenerating destroyed neural tissue. However, establishing controlled and reliable methodologies to guide stem cell differentiation into specialized neural cells of interest (e.g., neurons and oligodendrocytes) has been a prevailing challenge in the field. In this Account, we summarize the nanotechnology-based approaches our group has recently developed to guide stem-cell-based neural regeneration. We focus on three overarching strategies that were adopted to selectively control this process. First, soluble microenvironmental factors play a critical role in directing the fate of stem cells. Multiple factors have been developed in the form of small-molecule drugs, biochemical analogues, and DNA/RNA-based vectors to direct neural differentiation. However, the delivery of these factors with high transfection efficiency and minimal cytotoxicity has been challenging, especially to sensitive cell lines such as stem cells. In our first approach, we designed nanoparticle-based systems for the efficient delivery of such soluble factors to control neural differentiation. Our nanoparticles, comprising either organic or inorganic elements, were biocompatible and offered

  18. Volatility in School Test Scores: Implications for Test-Based Accountability Systems

    ERIC Educational Resources Information Center

    Kane, Thomas J.; Staiger, Douglas O.

    2002-01-01

    By the spring of 2000, forty states had begun using student test scores to rate school performance. Twenty states have gone a step further and are attaching explicit monetary rewards or sanctions to a school's test performance. In this paper, the authors focus on accountability programs in which states measure the effectiveness of individual…

  19. Adapting Educational Measurement to the Demands of Test-Based Accountability

    ERIC Educational Resources Information Center

    Koretz, Daniel

    2015-01-01

    Accountability has become a primary function of large-scale testing in the United States. The pressure on educators to raise scores is vastly greater than it was several decades ago. Research has shown that high-stakes testing can generate behavioral responses that inflate scores, often severely. I argue that because of these responses, using…

  20. Accounting for Teamwork: A Critical Study of Group-Based Systems of Organizational Control.

    ERIC Educational Resources Information Center

    Ezzamel, Mahmoud; Willmott, Hugh

    1998-01-01

    Examines the role of accounting calculations in reorganizing manufacturing capabilities of a vertically integrated global retailing company. Introducing teamwork to replace line work extended traditional, hierarchical management control systems. Teamwork's self-managing demands contravened workers' established sense of self-identity as…

  1. Performance-Based Incentives and the Behavior of Accounting Academics: Responding to Changes

    ERIC Educational Resources Information Center

    Moya, Soledad; Prior, Diego; Rodríguez-Pérez, Gonzalo

    2015-01-01

    When laws change the rules of the game, it is important to observe the effects on the players' behavior. Some effects can be anticipated while others are difficult to enunciate before the law comes into force. In this paper we have analyzed articles authored by Spanish accounting academics between 1996 and 2005 to assess the impact of a change in…

  2. Hamlet without the Prince: Shortcomings of an Activity-Based Account of Joint Attention

    ERIC Educational Resources Information Center

    Hobson, R. Peter

    2007-01-01

    In this commentary, I consider several strengths of the position adopted by Racine and Carpendale (2007), but suggest that the authors are in danger of overstating their case. In doing so, they appear to sideline an issue that should be pivotal for accounts of joint attention: how does a child come to arrive at an understanding that people, both…

  3. School-Based Accountability and the Distribution of Teacher Quality across Grades in Elementary School

    ERIC Educational Resources Information Center

    Fuller, Sarah C.; Ladd, Helen F.

    2013-01-01

    We use North Carolina data to explore whether the quality of teachers in the lower elementary grades (K-2) falls short of teacher quality in the upper grades (3-5) and to examine the hypothesis that school accountability pressures contribute to such quality shortfalls. Our concern with the early grades arises from recent studies highlighting how…

  4. Teachers' Perceptions of the Impact of Performance-Based Accountability on Teacher Efficacy

    ERIC Educational Resources Information Center

    Gantt, Phyllis Elizabeth Crowley

    2012-01-01

    Implementation of state and federal high-stakes accountability measures such as end-of-course tests (EoCTs) has contributed to increased teacher stress in the classroom, decreased teacher creativity and autonomy, and reduced effectiveness. Prior research focused primarily on the elementary and middle school levels, so this study sought to examine…

  5. A process-based approach to estimate point snow instability

    NASA Astrophysics Data System (ADS)

    Reuter, B.; Schweizer, J.; van Herwijnen, A.

    2015-05-01

    Snow instability data provide information about the mechanical state of the snow cover and are essential for forecasting snow avalanches. So far, direct observations of instability (recent avalanches, shooting cracks or whumpf sounds) are complemented with field tests such as the rutschblock test, since no measurement method for instability exists. We propose a new approach based on snow mechanical properties derived from the snow micro-penetrometer that takes into account the two essential processes during dry-snow avalanche release: failure initiation and crack propagation. To estimate the propensity of failure initiation we define a stress-based failure criterion, whereas the propensity of crack propagation is described by the critical cut length as obtained with a propagation saw test. The input parameters include layer thickness, snow density, effective elastic modulus, strength and specific fracture energy of the weak layer - all derived from the penetration-force signal acquired with the snow micro-penetrometer. Both instability measures were validated with independent field data and correlated well with results from field tests. Comparisons with observed signs of instability clearly indicated that a snowpack is only prone to avalanche if the two separate conditions for failure initiation and crack propagation are fulfilled. To our knowledge, this is the first time that an objective method for estimating snow instability has been proposed. The approach can either be used directly based on field measurements with the snow micro-penetrometer, or be implemented in numerical snow cover models. With an objective measure of instability at hand, the problem of spatial variations of instability and its causes can now be tackled.
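The two-condition logic described above, failure initiation and crack propagation both required for instability, can be sketched directly; the thresholds below are illustrative placeholders, not the paper's calibrated values:

```python
def point_snow_instability(skier_stress, weak_layer_strength,
                           critical_cut_length,
                           stress_threshold=1.0, cut_threshold=0.4):
    """Combine the two processes of dry-snow avalanche release:
    - failure initiation: additional (skier-induced) stress reaches
      the weak-layer strength (ratio >= stress_threshold),
    - crack propagation: critical cut length at or below a
      threshold length in metres (cut_threshold).
    The point is unstable only if BOTH conditions hold."""
    initiation = skier_stress / weak_layer_strength >= stress_threshold
    propagation = critical_cut_length <= cut_threshold
    return initiation and propagation

# A layer that can fail but not propagate is not rated unstable
print(point_snow_instability(1.2, 1.0, 0.8))
```

In the actual approach the inputs (layer thickness, density, elastic modulus, strength, fracture energy) are all derived from the snow micro-penetrometer signal rather than supplied directly.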

  6. Concurrency-based approaches to parallel programming

    NASA Technical Reports Server (NTRS)

    Kale, L.V.; Chrisochoides, N.; Kohl, J.; Yelick, K.

    1995-01-01

    The inevitable transition to parallel programming can be facilitated by appropriate tools, including languages and libraries. After describing the needs of applications developers, this paper presents three specific approaches aimed at development of efficient and reusable parallel software for irregular and dynamic-structured problems. A salient feature of all three approaches is their exploitation of concurrency within a processor. Benefits of individual approaches such as these can be leveraged by an interoperability environment which permits modules written using different approaches to co-exist in single applications.

  7. Concurrency-based approaches to parallel programming

    SciTech Connect

    Kale, L.V.; Chrisochoides, N.; Kohl, J.

    1995-07-17

    The inevitable transition to parallel programming can be facilitated by appropriate tools, including languages and libraries. After describing the needs of applications developers, this paper presents three specific approaches aimed at development of efficient and reusable parallel software for irregular and dynamic-structured problems. A salient feature of all three approaches is their exploitation of concurrency within a processor. Benefits of individual approaches such as these can be leveraged by an interoperability environment which permits modules written using different approaches to co-exist in single applications.

  8. Pattern recognition tool based on complex network-based approach

    NASA Astrophysics Data System (ADS)

    Casanova, Dalcimar; Backes, André Ricardo; Martinez Bruno, Odemir

    2013-02-01

    This work proposes a generalization of the method introduced by the authors in 'A complex network-based approach for boundary shape analysis'. Instead of modelling a contour as a graph and using complex network rules to characterize it, we generalize the technique into a mathematical tool for characterizing signals, curves, and sets of points. To evaluate the descriptive power of the proposal, an experiment on plant identification based on leaf vein images is conducted. Leaf venation is a taxonomic characteristic used for plant identification, and these structures are complex and difficult to represent as signals or curves, and thus to analyze with classical pattern recognition approaches. Here, we model the veins as a set of points and represent them as graphs. As features, we use degree and joint-degree measurements under a dynamic evolution. The results demonstrate that the technique has good discriminative power and can be used for plant identification, as well as for other complex pattern recognition tasks.
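The dynamic-evolution idea, building a sequence of graphs from a point set by growing a distance threshold and tracking degree statistics, can be sketched simply (this is a simplified illustration of the general technique, not the authors' exact feature vector):

```python
import numpy as np

def degree_features(points, radii):
    """Model a point set as a sequence of graphs: connect every pair
    of points closer than each radius, and record the mean normalized
    degree as the threshold evolves."""
    pts = np.asarray(points, dtype=float)
    # Pairwise Euclidean distance matrix via broadcasting
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    feats = []
    for r in radii:
        adj = (d <= r) & ~np.eye(len(pts), dtype=bool)   # no self-loops
        feats.append(adj.sum(axis=1).mean() / (len(pts) - 1))
    return feats

# Four corners of a unit square: edges appear as the radius grows
signature = degree_features([[0, 0], [0, 1], [1, 0], [1, 1]],
                            radii=[0.5, 1.0, 1.5])
```

For the unit square the signature goes from an empty graph through the four sides to the complete graph; applied to leaf-vein points, such evolution curves serve as shape descriptors for classification.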

  9. Standards-Based Accountability under No Child Left Behind: Experiences of Teachers and Administrators in Three States. MG-589-NSF

    ERIC Educational Resources Information Center

    Hamilton, Laura S.; Stecher, Brian M.; Marsh, Julie A.; McCombs, Jennifer Sloan; Robyn, Abby; Russell, Jennifer; Naftel, Scott; Barney, Heather

    2007-01-01

    Since 2001-2002, standards-based accountability (SBA) provisions of the No Child Left Behind Act of 2001 (NCLB) have shaped the work of public school teachers and administrators in the United States. NCLB requires each state to develop content and achievement standards in several subjects, administer tests to measure students' progress toward…

  10. An Empirical Research of Chinese Learners' Acquisition of the English Article System--Based on Syntactic Misanalysis Account

    ERIC Educational Resources Information Center

    Jian, Shi

    2013-01-01

    In the field of applied linguistics, the English article is the acknowledged teaching and learning difficulty and receives lots of attention in second language acquisition (SLA). This paper, based on the Syntactic Misanalysis Account (SMA) advocated by Trenkic in which L2 articles are analyzed as adjectives by L2ers, proposes the English article…

  11. The Effects of Project Based Learning on 21st Century Skills and No Child Left Behind Accountability Standards

    ERIC Educational Resources Information Center

    Holmes, Lisa Marie

    2012-01-01

    The purpose of this study was to determine ways "Digital Biographies," a Project Based Learning Unit, developed 21st century skills while simultaneously supporting NCLB accountability standards. The main goal of this study was to inform professional practice by exploring ways to address two separate, seemingly opposing, demands of…

  12. Distorting Value Added: The Use of Longitudinal, Vertically Scaled Student Achievement Data for Growth-Based, Value-Added Accountability

    ERIC Educational Resources Information Center

    Martineau, Joseph A.

    2006-01-01

    Longitudinal, student performance-based, value-added accountability models have become popular of late and continue to enjoy increasing popularity. Such models require student data to be vertically scaled across wide grade and developmental ranges so that the value added to student growth/achievement by teachers, schools, and districts may be…

  13. Reuse: A knowledge-based approach

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui

    1992-01-01

    This paper describes our research in automating the reuse process through the use of application domain models. Application domain models are explicit formal representations of the application knowledge necessary to understand, specify, and generate application programs. Furthermore, they provide a unified repository for the operational structure, rules, policies, and constraints of a specific application area. In our approach, domain models are expressed in terms of a transaction-based meta-modeling language. This paper has described in detail the creation and maintenance of hierarchical structures. These structures are created through a process that includes reverse engineering of data models with supplementary enhancement from application experts. Source code is also reverse engineered but is not a major source of domain model instantiation at this time. In the second phase of the software synthesis process, program specifications are interactively synthesized from an instantiated domain model. These specifications are currently integrated into a manual programming process but will eventually be used to derive executable code with mechanically assisted transformations. This research is performed within the context of programming-in-the-large types of systems. Although our goals are ambitious, we are implementing the synthesis system in an incremental manner through which we can realize tangible results. The client/server architecture is capable of supporting 16 simultaneous X/Motif users and tens of thousands of attributes and classes. Domain models have been partially synthesized from five different application areas. As additional domain models are synthesized and additional knowledge is gathered, we will inevitably add to and modify our representation. However, our current experience indicates that it will scale and expand to meet our modeling needs.

  14. Concept Based Approach for Adaptive Personalized Course Learning System

    ERIC Educational Resources Information Center

    Salahli, Mehmet Ali; Özdemir, Muzaffer; Yasar, Cumali

    2013-01-01

    One of the most important factors for improving the personalization aspects of learning systems is to enable adaptive properties to them. The aim of the adaptive personalized learning system is to offer the most appropriate learning path and learning materials to learners by taking into account their profiles. In this paper, a new approach to…

  15. Accounting for ecosystem services in Life Cycle Assessment, Part II: toward an ecologically based LCA.

    PubMed

    Zhang, Yi; Baral, Anil; Bakshi, Bhavik R

    2010-04-01

    Despite the essential role of ecosystem goods and services in sustaining all human activities, they are often ignored in engineering decision making, even in methods that are meant to encourage sustainability. For example, conventional Life Cycle Assessment focuses on the impact of emissions and consumption of some resources. While aggregation and interpretation methods are quite advanced for emissions, similar methods for resources have been lagging, and most ignore the role of nature. Such oversight may even result in perverse decisions that encourage reliance on deteriorating ecosystem services. This article presents a step toward including the direct and indirect role of ecosystems in LCA, and a hierarchical scheme to interpret their contribution. The resulting Ecologically Based LCA (Eco-LCA) includes a large number of provisioning, regulating, and supporting ecosystem services as inputs to a life cycle model at the process or economy scale. These resources are represented in diverse physical units and may be compared via their mass, fuel value, industrial cumulative exergy consumption, or ecological cumulative exergy consumption or by normalization with total consumption of each resource or their availability. Such results at a fine scale provide insight about relative resource use and the risk and vulnerability to the loss of specific resources. Aggregate indicators are also defined to obtain indices such as renewability, efficiency, and return on investment. An Eco-LCA model of the 1997 economy is developed and made available via the web (www.resilience.osu.edu/ecolca). An illustrative example comparing paper and plastic cups provides insight into the features of the proposed approach. The need for further work in bridging the gap between knowledge about ecosystem services and their direct and indirect role in supporting human activities is discussed as an important area for future work.

  16. Oregon’s Medicaid Transformation: An Innovative Approach To Holding A Health System Accountable For Spending Growth

    PubMed Central

    McConnell, K. John; Chang, Anna Marie; Cohen, Deborah J.; Wallace, Neal; Chernew, Michael E.; Kautz, Glenn; McCarty, Dennis; McFarland, Bentson; Wright, Bill; Smith, Jeanene

    2014-01-01

    In 2012, Oregon initiated a significant transformation of its Medicaid program, catalyzed in part through an innovative arrangement with the Centers for Medicare and Medicaid Services (CMS), which provided an upfront investment of $1.9 billion to the state. In exchange, Oregon agreed to reduce the rate of Medicaid spending growth by 2 percentage points without degrading quality. A failure to meet these targets triggers penalties on the order of hundreds of millions of dollars from CMS. We describe the novel arrangement with CMS and how Oregon's coordinated care organization (CCO) structure compares to Accountable Care Organizations (ACOs) and managed care organizations (MCOs). PMID:25540719

  17. Cost unit accounting based on a clinical pathway: a practical tool for DRG implementation.

    PubMed

    Feyrer, R; Rösch, J; Weyand, M; Kunzmann, U

    2005-10-01

    Setting up a reliable cost unit accounting system in a hospital is a fundamental necessity for economic survival, given the current general conditions in the healthcare system. Definition of a suitable cost unit is a crucial factor for success. We present here the development and use of a clinical pathway as a cost unit as an alternative to the DRG. Elective coronary artery bypass grafting was selected as an example. Development of the clinical pathway was conducted according to a modular concept that mirrored all the treatment processes across various levels and modules. Using service records and analyses, the process algorithms of the clinical pathway were developed and visualized with Corel iGrafx Process 2003. A detailed process cost record constituted the basis of the pathway costing, in which financial evaluation of the treatment processes was performed. The result of this study was a structured clinical pathway for coronary artery bypass grafting together with a cost calculation in the form of cost unit accounting. The use of a clinical pathway as a cost unit offers considerable advantages compared to the DRG or clinical case. The variance in the diagnoses and procedures within a pathway is minimal, so the consumption of resources is homogeneous. This leads to a considerable improvement in the value of cost unit accounting as a strategic control instrument in hospitals.
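The modular pathway-costing idea can be sketched minimally as below; the module names and cost figures are invented, not the paper's data:

```python
# A clinical pathway as a cost unit: the pathway is a sequence of
# treatment modules, each carrying its own process costs, and the cost
# unit is the sum over all modules and cost categories.
pathway = {
    "admission":      {"staff": 120.0, "materials": 30.0},
    "surgery":        {"staff": 2400.0, "materials": 1800.0, "theatre": 950.0},
    "intensive_care": {"staff": 1600.0, "materials": 400.0},
    "ward_discharge": {"staff": 500.0, "materials": 80.0},
}

def pathway_cost(pathway):
    """Total cost of one pathway instance across all modules."""
    return sum(cost for module in pathway.values() for cost in module.values())

total = pathway_cost(pathway)
```

Because every case on the pathway runs through the same modules, the per-case cost is homogeneous, which is the advantage over the DRG claimed in the abstract.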

  18. Minimally invasive surgery of the anterior skull base: transorbital approaches

    PubMed Central

    Gassner, Holger G.; Schwan, Franziska; Schebesch, Karl-Michael

    2016-01-01

    Minimally invasive approaches are becoming increasingly popular to access the anterior skull base. With interdisciplinary cooperation, in particular endonasal endoscopic approaches have seen an impressive expansion of indications over the past decades. The more recently described transorbital approaches represent minimally invasive alternatives with a differing spectrum of access corridors. The purpose of the present paper is to discuss transorbital approaches to the anterior skull base in the light of the current literature. The transorbital approaches allow excellent exposure of areas that are difficult to reach like the anterior and posterior wall of the frontal sinus; working angles may be more favorable and the paranasal sinus system can be preserved while exposing the skull base. Because of their minimal morbidity and the cosmetically excellent results, the transorbital approaches represent an important addition to established endonasal endoscopic and open approaches to the anterior skull base. Their execution requires an interdisciplinary team approach. PMID:27453759

  19. Allocating physicians' overhead costs to services: an econometric/accounting-activity based-approach.

    PubMed

    Peden, Al; Baker, Judith J

    2002-01-01

    Using the optimizing properties of econometric analysis, this study analyzes how physician overhead costs (OC) can be allocated to multiple activities to maximize precision in reimbursing the costs of services. Drawing on work by Leibenstein and Friedman, the analysis also shows that allocating OC to multiple activities unbiased by revenue requires controlling for revenue when making the estimates. Further econometric analysis shows that it is possible to save about 10 percent of OC by paying only for those that are necessary.
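A hedged sketch of the econometric idea with synthetic data: regress overhead costs on activity volumes while controlling for revenue, so the activity coefficients give per-unit allocations unbiased by revenue. The coefficients and noise levels below are invented for illustration:

```python
# Synthetic example: overhead depends on two activities (visits,
# procedures) plus a revenue component; including revenue as a control
# lets least squares recover the per-activity overhead rates.
import numpy as np

rng = np.random.default_rng(0)
n = 200
visits = rng.uniform(500, 1500, n)        # activity 1
procedures = rng.uniform(100, 400, n)     # activity 2
revenue = 150 * visits + 600 * procedures + rng.normal(0, 5000, n)
overhead = 20 * visits + 80 * procedures + 0.05 * revenue + rng.normal(0, 500, n)

X = np.column_stack([visits, procedures, revenue])
beta, *_ = np.linalg.lstsq(X, overhead, rcond=None)  # [per-visit, per-procedure, revenue share]
```

Omitting the revenue column would bias the activity coefficients upward, since revenue is correlated with both activities.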

  20. Nanotechnology based approaches in cancer therapeutics

    NASA Astrophysics Data System (ADS)

    Kumer Biswas, Amit; Reazul Islam, Md; Sadek Choudhury, Zahid; Mostafa, Asif; Fahim Kadir, Mohammad

    2014-12-01

    The current decades are marked not by the development of new molecules for the cure of various diseases but rather by the development of new delivery methods for optimum treatment outcomes. Nanomedicine is perhaps playing the biggest role in this regard. Nanomedicine offers numerous advantages over conventional drug delivery approaches and is a particularly hot topic in anticancer research. Nanoparticles (NPs) have many unique properties that enable them to be incorporated into anticancer therapy. This topical review aims to look at the properties and various forms of NPs and their use in anticancer treatment, recent developments in the process of identifying new delivery approaches, as well as progress in clinical trials with these newer approaches. Although the outcome of cancer therapy can be improved using nanomedicine, there are still many disadvantages of using this approach. We aim to discuss all these issues in this review.

  1. CLAIM (CLinical Accounting InforMation)--an XML-based data exchange standard for connecting electronic medical record systems to patient accounting systems.

    PubMed

    Guo, Jinqiu; Takada, Akira; Tanaka, Koji; Sato, Junzo; Suzuki, Muneou; Takahashi, Kiwamu; Daimon, Hiroyuki; Suzuki, Toshiaki; Nakashima, Yusei; Araki, Kenji; Yoshihara, Hiroyuki

    2005-08-01

    With the evolving and diverse electronic medical record (EMR) systems, there appears to be an ever greater need to link EMR systems and patient accounting systems with a standardized data exchange format. To this end, the CLinical Accounting InforMation (CLAIM) data exchange standard was developed. CLAIM is subordinate to the Medical Markup Language (MML) standard, which allows the exchange of medical data among different medical institutions. CLAIM uses eXtensible Markup Language (XML) as a meta-language. The current version, 2.1, inherited the basic structure of MML 2.x and contains two modules including information related to registration, appointment, procedure and charging. CLAIM 2.1 was implemented successfully in Japan in 2001. Consequently, it was confirmed that CLAIM could be used as an effective data exchange format between EMR systems and patient accounting systems.
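For a flavor of what an XML charging message looks like, here is a hypothetical illustration only: the element names below are invented to show the general shape, not CLAIM's actual schema (which is defined as a module under the MML standard):

```python
# Build a toy billing message with the standard library; tag names,
# attributes, and codes are all placeholders.
import xml.etree.ElementTree as ET

claim = ET.Element("ClaimModule", version="2.1")
ET.SubElement(claim, "PatientId").text = "P-0001"
proc = ET.SubElement(claim, "Procedure", code="K552")
ET.SubElement(proc, "Charge", currency="JPY").text = "124000"

xml_text = ET.tostring(claim, encoding="unicode")
```

The point of a standard like CLAIM is that both the EMR system and the accounting system agree on such a schema, so either side can parse the other's messages.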

  2. Comparison of Methods to Account for Relatedness in Genome-Wide Association Studies with Family-Based Data

    PubMed Central

    Eu-ahsunthornwattana, Jakris; Miller, E. Nancy; Fakiola, Michaela; Jeronimo, Selma M. B.; Blackwell, Jenefer M.; Cordell, Heather J.

    2014-01-01

    Approaches based on linear mixed models (LMMs) have recently gained popularity for modelling population substructure and relatedness in genome-wide association studies. In the last few years, a bewildering variety of different LMM methods/software packages have been developed, but it is not always clear how (or indeed whether) any newly-proposed method differs from previously-proposed implementations. Here we compare the performance of several LMM approaches (and software implementations, including EMMAX, GenABEL, FaST-LMM, Mendel, GEMMA and MMM) via their application to a genome-wide association study of visceral leishmaniasis in 348 Brazilian families comprising 3626 individuals (1972 genotyped). The implementations differ in precise details of methodology implemented and through various user-chosen options such as the method and number of SNPs used to estimate the kinship (relatedness) matrix. We investigate sensitivity to these choices and the success (or otherwise) of the approaches in controlling the overall genome-wide error-rate for both real and simulated phenotypes. We compare the LMM results to those obtained using traditional family-based association tests (based on transmission of alleles within pedigrees) and to alternative approaches implemented in the software packages MQLS, ROADTRIPS and MASTOR. We find strong concordance between the results from different LMM approaches, and all are successful in controlling the genome-wide error rate (except for some approaches when applied naively to longitudinal data with many repeated measures). We also find high correlation between LMMs and alternative approaches (apart from transmission-based approaches when applied to SNPs with small or non-existent effects). We conclude that LMM approaches perform well in comparison to competing approaches. 
Given their strong concordance, in most applications, the choice of precise LMM implementation cannot be based on power/type I error considerations but must instead be
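All of the LMM approaches compared above rely on an estimated kinship (relatedness) matrix; a minimal sketch of one standard estimator, with a toy genotype matrix, is:

```python
# Standardize an n x m genotype matrix (0/1/2 minor-allele counts)
# column-wise and average over SNPs to estimate pairwise relatedness.
import numpy as np

def kinship(G):
    """G: n individuals x m SNPs, coded 0/1/2."""
    p = G.mean(axis=0) / 2.0                      # allele frequencies
    Z = (G - 2 * p) / np.sqrt(2 * p * (1 - p))    # standardized genotypes
    return Z @ Z.T / G.shape[1]

G = np.array([[0, 1, 2, 1],
              [0, 1, 2, 2],
              [2, 1, 0, 0]], dtype=float)
K = kinship(G)
```

The user-chosen options mentioned in the abstract (which SNPs, how many, which standardization) all enter through this step, which is why the packages can differ despite sharing the same mixed-model core.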

  3. Stochastic Turing patterns: analysis of compartment-based approaches.

    PubMed

    Cao, Yang; Erban, Radek

    2014-12-01

    Turing patterns can be observed in reaction-diffusion systems where chemical species have different diffusion constants. In recent years, several studies investigated the effects of noise on Turing patterns and showed that the parameter regimes, for which stochastic Turing patterns are observed, can be larger than the parameter regimes predicted by deterministic models, which are written in terms of partial differential equations (PDEs) for species concentrations. A common stochastic reaction-diffusion approach is written in terms of compartment-based (lattice-based) models, where the domain of interest is divided into artificial compartments and the number of molecules in each compartment is simulated. In this paper, the dependence of stochastic Turing patterns on the compartment size is investigated. It has previously been shown (for relatively simpler systems) that a modeler should not choose compartment sizes which are too small or too large, and that the optimal compartment size depends on the diffusion constant. Taking these results into account, we propose and study a compartment-based model of Turing patterns where each chemical species is described using a different set of compartments. It is shown that the parameter regions where spatial patterns form are different from the regions obtained by classical deterministic PDE-based models, but they are also different from the results obtained for the stochastic reaction-diffusion models which use a single set of compartments for all chemical species. In particular, it is argued that some previously reported results on the effect of noise on Turing patterns in biological systems need to be reinterpreted.
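The compartment discretization at the heart of the paper can be illustrated with a small sketch (not the paper's full model): a 1-D domain of length L is split into K compartments of width h = L/K, and a molecule diffuses by jumping to a neighbouring compartment with rate d = D/h². The paper's proposal is to let each chemical species use its own K:

```python
# Jump rate for compartment-based diffusion; values below are
# illustrative, chosen only to show two species on different grids.
def jump_rate(D, L, K):
    h = L / K
    return D / h ** 2

rate_u = jump_rate(D=1e-4, L=1.0, K=40)   # species u on its own grid
rate_v = jump_rate(D=1e-2, L=1.0, K=10)   # species v on a different grid
```

Because the rate scales as K², the compartment size directly shapes the simulated noise, which is why the choice of K per species matters for whether stochastic Turing patterns appear.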

  4. A clinical approach to acid-base conundrums.

    PubMed

    Garrubba, Carl; Truscott, Judy

    2016-04-01

    Acid-base disorders can provide essential clues to underlying patient conditions. This article provides a simple, practical approach to identifying simple acid-base disorders and their compensatory mechanisms. Using this stepwise approach, clinicians can quickly identify and appropriately treat acid-base disorders.
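The stepwise reading of an arterial blood gas described above can be sketched as a small decision routine; thresholds are the textbook normal ranges (pH 7.35-7.45, pCO2 35-45 mmHg, HCO3 22-26 mEq/L), and real interpretation also requires compensation formulas and clinical context:

```python
# Step 1: is the pH acidemic or alkalemic?  Step 2: does pCO2 (respiratory)
# or HCO3 (metabolic) explain the shift?
def primary_disorder(ph, pco2, hco3):
    if ph < 7.35:                                  # acidemia
        if pco2 > 45:
            return "respiratory acidosis"
        if hco3 < 22:
            return "metabolic acidosis"
        return "mixed/indeterminate"
    if ph > 7.45:                                  # alkalemia
        if pco2 < 35:
            return "respiratory alkalosis"
        if hco3 > 26:
            return "metabolic alkalosis"
        return "mixed/indeterminate"
    return "normal pH (possibly compensated)"

result = primary_disorder(ph=7.28, pco2=60, hco3=26)
```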

  5. Constitutive Description of 7075 Aluminum Alloy During Hot Deformation by Apparent and Physically-Based Approaches

    NASA Astrophysics Data System (ADS)

    Mirzadeh, Hamed

    2015-03-01

    Hot flow stress of 7075 aluminum alloy during compressive hot deformation was correlated to the Zener-Hollomon parameter through constitutive analyses based on the apparent approach and the proposed physically-based approach which accounts for the dependence of the Young's modulus and the self-diffusion coefficient of aluminum on temperature. It was shown that the latter approach not only results in a more reliable constitutive equation, but also significantly simplifies the constitutive analysis, which in turn makes it possible to conduct comparative hot working studies. It was also demonstrated that the theoretical exponent of 5 and the lattice self-diffusion activation energy of aluminum (142 kJ/mol) can be set in the hyperbolic sine law to describe the peak flow stresses and the resulting constitutive equation was found to be consistent with that resulted from the proposed physically-based approach.
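The hyperbolic sine constitutive analysis can be made concrete with a short worked sketch. Q = 142 kJ/mol and n = 5 are taken from the abstract; alpha and A below are placeholder material constants, not the paper's fitted values:

```python
# Z = strain_rate * exp(Q / (R*T)); inverting the sinh law
# strain_rate = A * sinh(alpha*sigma)**n * exp(-Q/(R*T)) gives
# sigma = (1/alpha) * asinh((Z/A)**(1/n)).
import math

R = 8.314          # J/(mol K)
Q = 142_000.0      # J/mol, lattice self-diffusion activation energy of Al
n = 5.0
alpha = 0.02       # 1/MPa, placeholder
A = 1e10           # 1/s, placeholder

def zener_hollomon(strain_rate, T):
    return strain_rate * math.exp(Q / (R * T))

def peak_stress(strain_rate, T):
    Z = zener_hollomon(strain_rate, T)
    return math.asinh((Z / A) ** (1.0 / n)) / alpha

sigma = peak_stress(strain_rate=0.01, T=673.0)   # 400 degrees C
```

Lower temperatures or higher strain rates raise Z and hence the predicted peak flow stress, which is the behavior the constitutive equation is fitted to capture.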

  6. Randomly Accountable

    ERIC Educational Resources Information Center

    Kane, Thomas J.; Staiger, Douglas O.; Geppert, Jeffrey

    2002-01-01

    The accountability debate tends to devolve into a battle between the pro-testing and anti-testing crowds. When it comes to the design of a school accountability system, the devil is truly in the details. A well-designed accountability plan may go a long way toward giving school personnel the kinds of signals they need to improve performance.…

  7. Is an attention-based associative account of adjacent and nonadjacent dependency learning valid?

    PubMed

    Pacton, Sébastien; Sobaco, Amélie; Perruchet, Pierre

    2015-05-01

    Pacton and Perruchet (2008) reported that participants who were asked to process adjacent elements located within a sequence of digits learned adjacent dependencies but did not learn nonadjacent dependencies and conversely, participants who were asked to process nonadjacent digits learned nonadjacent dependencies but did not learn adjacent dependencies. In the present study, we showed that when participants were simply asked to read aloud the same sequences of digits, a task demand that did not require the intentional processing of specific elements as in standard statistical learning tasks, only adjacent dependencies were learned. The very same pattern was observed when digits were replaced by syllables. These results show that the perfect symmetry found in Pacton and Perruchet was not due to the fact that the processing of digits is less sensitive to their distance than the processing of syllables, tones, or visual shapes used in most statistical learning tasks. Moreover, the present results, completed with a reanalysis of the data collected in Pacton and Perruchet (2008), demonstrate that participants are highly sensitive to violations involving the spacing between paired elements. Overall, these results are consistent with the Pacton and Perruchet's single-process account of adjacent and nonadjacent dependencies, in which the joint attentional processing of the two events is a necessary and sufficient condition for learning the relation between them, irrespective of their distance. However, this account should be completed to encompass the notion that the presence or absence of an intermediate event is an intrinsic component of the representation of an association.

  8. Hierarchical modeling of contingency-based source monitoring: a test of the probability-matching account.

    PubMed

    Arnold, Nina R; Bayen, Ute J; Kuhlmann, Beatrice G; Vaterrodt, Bianca

    2013-04-01

    According to the probability-matching account of source guessing (Spaniol & Bayen, Journal of Experimental Psychology: Learning, Memory, and Cognition 28:631-651, 2002), when people do not remember the source of an item in a source-monitoring task, they match the source-guessing probabilities to the perceived contingencies between sources and item types. In a source-monitoring experiment, half of the items presented by each of two sources were consistent with schematic expectations about this source, whereas the other half of the items were consistent with schematic expectations about the other source. Participants' source schemas were activated either at the time of encoding or just before the source-monitoring test. After test, the participants judged the contingency of the item type and source. Individual parameter estimates of source guessing were obtained via beta-multinomial processing tree modeling (beta-MPT; Smith & Batchelder, Journal of Mathematical Psychology 54:167-183, 2010). We found a significant correlation between the perceived contingency and source guessing, as well as a correlation between the deviation of the guessing bias from the true contingency and source memory when participants did not receive the schema information until retrieval. These findings support the probability-matching account.

  9. Decompensation: A Novel Approach to Accounting for Stress Arising From the Effects of Ideology and Social Norms.

    PubMed

    Riggs, Damien W; Treharne, Gareth J

    2017-01-01

    To date, research that has drawn on Meyer's (2003) minority stress model has largely taken for granted the premises underpinning it. In this article we provide a close reading of how "stress" is conceptualized in the model and suggest that aspects of the model do not attend to the institutionalized nature of stressors experienced by people with marginalized identities, particularly lesbian, gay, bisexual, and transgender individuals. As a counter to this, we highlight the importance of a focus on the effects of ideology and social norms in terms of stress, and we argue why an intersectional approach is necessary to ensure recognition of multiple axes of marginalization and privilege. The article then outlines the concept of decompensation and suggests that it may offer one way to reconsider the effects of ideology and social norms. The decompensation approach centers on the need for social change rather than solely relying on individuals to be resilient.

  10. Bare Forms and Lexical Insertions in Code-Switching: A Processing-Based Account

    ERIC Educational Resources Information Center

    Owens, Jonathan

    2005-01-01

    Bare forms (or Ø forms), uninflected lexical L2 insertions in contexts where the matrix language expects morphological marking, have been recognized as an anomaly in different approaches to code-switching. Myers-Scotton (1997, 2002) has explained their existence in terms of structural incongruity between the matrix and embedded…

  11. A unified account of tilt illusions, association fields, and contour detection based on elastica.

    PubMed

    Keemink, Sander W; van Rossum, Mark C W

    2016-09-01

    As expressed in the Gestalt law of good continuation, human perception tends to associate stimuli that form smooth continuations. Contextual modulation in primary visual cortex, in the form of association fields, is believed to play an important role in this process. Yet a unified and principled account of the good continuation law on the neural level is lacking. In this study we introduce a population model of primary visual cortex. Its contextual interactions depend on the elastica curvature energy of the smoothest contour connecting oriented bars. As expected, this model leads to association fields consistent with data. However, in addition the model displays tilt-illusions for stimulus configurations with grating and single bars that closely match psychophysics. Furthermore, the model explains not only pop-out of contours amid a variety of backgrounds, but also pop-out of single targets amid a uniform background. We thus propose that elastica is a unifying principle of the visual cortical network.
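The elastica energy on which the model's contextual interactions are based can be approximated for a discrete contour; the polyline discretization below is an illustrative sketch, not the paper's population model:

```python
# Approximate E = integral of curvature^2 ds for a polyline by summing
# squared turning angles per unit segment length: smoother continuations
# give lower energy, as the Gestalt good-continuation law suggests.
import math

def elastica_energy(points):
    E = 0.0
    for i in range(1, len(points) - 1):
        (x0, y0), (x1, y1), (x2, y2) = points[i - 1], points[i], points[i + 1]
        a1 = math.atan2(y1 - y0, x1 - x0)
        a2 = math.atan2(y2 - y1, x2 - x1)
        turn = (a2 - a1 + math.pi) % (2 * math.pi) - math.pi   # wrapped angle
        ds = math.hypot(x2 - x1, y2 - y1)
        E += (turn / ds) ** 2 * ds                              # kappa^2 * ds
    return E

straight = elastica_energy([(0, 0), (1, 0), (2, 0), (3, 0)])
bent = elastica_energy([(0, 0), (1, 0), (1, 1), (0, 1)])
```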

  12. Virial based equations of state with account of three-body interaction for noble gases and their mixtures

    NASA Astrophysics Data System (ADS)

    Akhmatov, Z. A.; Khokonov, A. Kh; Khokonov, M. Kh

    2016-11-01

    Within the framework of molecular dynamics, the equations of state of noble gases and their mixtures have been obtained by means of a time-averaging procedure for the virial-based equation, with three-body interactions taken into account. It has been shown that the equations of state can be extrapolated by van der Waals-type equations. The corresponding parameters have been calculated. A visible foliation of the Xe and Kr components of the Kob-Andersen mixture has been found.
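For orientation, a truncated virial equation of state has the form p = (RT/Vm)(1 + B/Vm + C/Vm²), where three-body interactions first enter through the third virial coefficient C. The coefficient values below are placeholders, not the coefficients computed in the paper:

```python
# Truncated virial equation of state for one mole of gas.
R = 8.314  # J/(mol K)

def virial_pressure(T, Vm, B, C):
    """Pressure in Pa for temperature T (K) and molar volume Vm (m^3/mol)."""
    return (R * T / Vm) * (1.0 + B / Vm + C / Vm ** 2)

# The ideal-gas law is recovered when B = C = 0.
p_ideal = virial_pressure(300.0, 0.024, B=0.0, C=0.0)
p_real = virial_pressure(300.0, 0.024, B=-1.5e-5, C=1.2e-9)
```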

  13. Effects of a social accountability approach, CARE’s Community Score Card, on reproductive health-related outcomes in Malawi: A cluster-randomized controlled evaluation

    PubMed Central

    Galavotti, Christine; Sebert Kuhlmann, Anne; Msiska, Thumbiko; Hastings, Phil; Marti, C. Nathan

    2017-01-01

    Background Social accountability approaches, which emphasize mutual responsibility and accountability by community members, health care workers, and local health officials for improving health outcomes in the community, are increasingly being employed in low-resource settings. We evaluated the effects of a social accountability approach, CARE’s Community Score Card (CSC), on reproductive health outcomes in Ntcheu district, Malawi using a cluster-randomized control design. Methods We matched 10 pairs of communities, randomly assigning one from each pair to intervention and control arms. We conducted two independent cross-sectional surveys of women who had given birth in the last 12 months, at baseline and at two years post-baseline. Using difference-in-difference (DiD) and local average treatment effect (LATE) estimates, we evaluated the effects on outcomes including modern contraceptive use, antenatal and postnatal care service utilization, and service satisfaction. We also evaluated changes in indicators developed by community members and service providers in the intervention areas. Results DiD analyses showed significantly greater improvements in the proportion of women receiving a home visit during pregnancy (B = 0.20, P < .01), receiving a postnatal visit (B = 0.06, P = .01), and overall service satisfaction (B = 0.16, P < .001) in intervention compared to control areas. LATE analyses estimated significant effects of the CSC intervention on home visits by health workers (114% higher in intervention compared to control) (B = 1.14, P < .001) and current use of modern contraceptives (57% higher) (B = 0.57, P < .01). All 13 community- and provider-developed indicators improved, with 6 of them showing significant improvements. Conclusions By facilitating the relationship between community members, health service providers, and local government officials, the CSC contributed to important improvements in reproductive health-related outcomes. Further, the CSC builds
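The difference-in-difference estimator used in the evaluation can be sketched in a few lines; the numbers below are invented, not the study's data:

```python
# DiD = (change in intervention arm) - (change in control arm),
# which nets out secular trends common to both arms.
def did(intervention_pre, intervention_post, control_pre, control_post):
    return (intervention_post - intervention_pre) - (control_post - control_pre)

# e.g. proportion of women receiving a home visit during pregnancy
effect = did(intervention_pre=0.15, intervention_post=0.40,
             control_pre=0.18, control_post=0.23)
```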

  14. Assessment of Person Fit Using Resampling-Based Approaches

    ERIC Educational Resources Information Center

    Sinharay, Sandip

    2016-01-01

    De la Torre and Deng suggested a resampling-based approach for person-fit assessment (PFA). The approach involves the use of the [math equation unavailable] statistic, a corrected expected a posteriori estimate of the examinee ability, and the Monte Carlo (MC) resampling method. The Type I error rate of the approach was closer to the nominal level…

  15. A performance-based approach to landslide risk analysis

    NASA Astrophysics Data System (ADS)

    Romeo, R. W.

    2009-04-01

    An approach for risk assessment based on a probabilistic analysis of the performance of structures threatened by landslides is shown and discussed. The risk is a possible loss due to the occurrence of a potentially damaging event. Analytically, the risk is the probability convolution of hazard, which defines the frequency of occurrence of the event (i.e., the demand), and fragility, which defines the capacity of the system to withstand the event given its characteristics (i.e., severity) and those of the exposed goods (vulnerability), that is: Risk = p(D >= d | S, V). The inequality sets a damage (or loss) threshold beyond which the system's performance is no longer met. Therefore a consistent approach to risk assessment should: 1) adopt a probabilistic model which takes into account all the uncertainties of the involved variables (capacity and demand), 2) follow a performance approach based on given loss or damage thresholds. The proposed method belongs to the category of the semi-empirical ones: the theoretical component is given by the probabilistic capacity-demand model; the empirical component is given by the observed statistical behaviour of structures damaged by landslides. Two landslide properties alone are required: the areal extent and the type (or kinematism). All other properties required to determine the severity of landslides (such as depth, speed and frequency) are derived via probabilistic methods. The severity (or intensity) of landslides, in terms of kinetic energy, is the demand of resistance; the resistance capacity is given by the cumulative distribution functions of the limit state performance (fragility functions) assessed via damage surveys and cards compilation. The investigated limit states are aesthetic (of nominal concern alone), functional (interruption of service) and structural (economic and social losses). 
The damage probability is the probabilistic convolution of hazard (the probability mass function of the frequency of occurrence of given
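The hazard-fragility convolution behind Risk = p(D >= d | S, V) can be sketched discretely; the severity classes and probabilities below are illustrative, not values from the study:

```python
# Total damage probability = sum over severity classes of
# P(severity) * P(damage threshold exceeded | severity).
def risk(hazard_pmf, fragility):
    """hazard_pmf[s] = P(severity s); fragility[s] = P(D >= d | s)."""
    return sum(hazard_pmf[s] * fragility[s] for s in hazard_pmf)

hazard_pmf = {"small": 0.70, "medium": 0.25, "large": 0.05}
fragility = {"small": 0.02, "medium": 0.30, "large": 0.90}   # one limit state
annual_risk = risk(hazard_pmf, fragility)
```

Each limit state (aesthetic, functional, structural) would get its own fragility function, giving one risk figure per performance threshold.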

  16. How to Tell the Truth with Statistics: The Case for Accountable Data Analyses in Team-based Science

    PubMed Central

    Gelfond, Jonathan A. L.; Klugman, Craig M.; Welty, Leah J.; Heitman, Elizabeth; Louden, Christopher; Pollock, Brad H.

    2015-01-01

    Data analysis is essential to translational medicine, epidemiology, and the scientific process. Although recent advances in promoting reproducibility and reporting standards have made some improvements, the data analysis process remains insufficiently documented and susceptible to avoidable errors, bias, and even fraud. Comprehensively accounting for the full analytical process requires not only records of the statistical methodology used, but also records of communications among the research team. In this regard, the data analysis process can benefit from the principle of accountability that is inherent in other disciplines such as clinical practice. We propose a novel framework for capturing the analytical narrative called the Accountable Data Analysis Process (ADAP), which allows the entire research team to participate in the analysis in a supervised and transparent way. The framework is analogous to an electronic health record in which the dataset is the “patient” and actions related to the dataset are recorded in a project management system. We discuss the design, advantages, and challenges in implementing this type of system in the context of academic health centers, where team based science increasingly demands accountability. PMID:26290897

  17. Evaluating a pivot-based approach for bilingual lexicon extraction.

    PubMed

    Kim, Jae-Hoon; Kwon, Hong-Seok; Seo, Hyeong-Won

    2015-01-01

    A pivot-based approach for bilingual lexicon extraction is based on the similarity of context vectors represented by words in a pivot language such as English. In this paper, to demonstrate the validity and usability of the pivot-based approach, we evaluate it together with two different methods for estimating context vectors: one estimates them from two parallel corpora based on word association between source words (resp., target words) and pivot words, and the other estimates them from two parallel corpora based on word alignment tools for statistical machine translation. Empirical results on two language pairs (Korean-Spanish and Korean-French) show that the pivot-based approach is very promising for resource-poor languages, confirming its validity and usability. Furthermore, our method also performs well for words with low frequency.
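    The core ranking step of a pivot-based approach can be sketched with toy context vectors over a shared English pivot vocabulary. The words, counts, and the cosine measure below are illustrative assumptions; the paper's two estimation methods for building the vectors are not reproduced:

```python
import math

# Toy context vectors over a shared English pivot vocabulary
# ("water", "drink", "fire"); all words and counts are invented.
source_vec = [8, 5, 1]                  # hypothetical Korean word for "water"
target_vecs = {                         # hypothetical Spanish candidates
    "agua": [9, 4, 2],
    "fuego": [1, 0, 7],
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u) * sum(b * b for b in v))
    return dot / norm if norm else 0.0

def best_translation(src, candidates):
    """Rank target words by context-vector similarity through the pivot."""
    return max(candidates, key=lambda w: cosine(src, candidates[w]))

print(best_translation(source_vec, target_vecs))  # agua
```

    In practice the pivot vocabulary has thousands of dimensions and the vectors come from association scores or word alignments, but the ranking principle is the same.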

  18. Evaluating a Pivot-Based Approach for Bilingual Lexicon Extraction

    PubMed Central

    Kim, Jae-Hoon; Kwon, Hong-Seok; Seo, Hyeong-Won

    2015-01-01

    A pivot-based approach for bilingual lexicon extraction is based on the similarity of context vectors represented by words in a pivot language such as English. In this paper, to demonstrate the validity and usability of the pivot-based approach, we evaluate it together with two different methods for estimating context vectors: one estimates them from two parallel corpora based on word association between source words (resp., target words) and pivot words, and the other estimates them from two parallel corpora based on word alignment tools for statistical machine translation. Empirical results on two language pairs (Korean-Spanish and Korean-French) show that the pivot-based approach is very promising for resource-poor languages, confirming its validity and usability. Furthermore, our method also performs well for words with low frequency. PMID:25983745

  19. Knowledge Based Approach to OOTW Coalition Formation

    DTIC Science & Technology

    2002-04-01

    the OOTW planning. A multi-agent system consists of a number of agents that group themselves in various, temporary coalitions (each solving a specific...it in order to plan an optimal mission. The developed approach has been tested on the CPlanT multi-agent system implementation. CPlanT System...Architecture CPlanT is a multi-agent system for planning humanitarian relief operations where any agent can initiate the planning process. Classical

  20. Probabilistic-based approach to optimal filtering

    PubMed

    Hannachi

    2000-04-01

    The signal-to-noise ratio maximizing approach in optimal filtering provides a robust tool to detect signals in the presence of colored noise. The method fails, however, when the data present a regimelike behavior. An approach is developed in this manuscript to recover local (in phase space) behavior in an intermittent, regimelike behaving system. The method is first formulated in its general form within a Gaussian framework, given an estimate of the noise covariance, and demands that the signal correspond to minimizing the noise probability distribution for any given value of the data probability distribution, i.e., on its isosurfaces. The extension to the non-Gaussian case is provided through the use of finite mixture models for data that show regimelike behavior. The method yields the correct signal when applied in a simplified manner to synthetic time series with and without regimes, compared to the signal-to-noise ratio approach, and helps identify the right frequency of the oscillation spells in the classical Lorenz system and its variants.

  1. Prior knowledge-based approach for associating ...

    EPA Pesticide Factsheets

    Evaluating the potential human health and/or ecological risks associated with exposures to complex chemical mixtures in the ambient environment is one of the central challenges of chemical safety assessment and environmental protection. There is a need for approaches that can help to integrate chemical monitoring and bio-effects data to evaluate risks associated with chemicals present in the environment. We used prior knowledge about chemical-gene interactions to develop a knowledge assembly model for detected chemicals at five locations near two wastewater treatment plants. The assembly model was used to generate hypotheses about the biological impacts of the chemicals at each location. The hypotheses were tested using empirical hepatic gene expression data from fathead minnows exposed for 12 d at each location. Empirical gene expression data were also mapped to the assembly models to statistically evaluate the likelihood of a chemical contributing to the observed biological responses. The prior knowledge approach was able to reasonably hypothesize the biological impacts at one site but not the other. Chemicals most likely contributing to the observed biological responses were identified at each location. Despite limitations to the approach, knowledge assembly models have strong potential for associating chemical occurrence with potential biological effects and providing a foundation for hypothesis generation to guide research and/or monitoring efforts relat

  2. Site-Based Management versus Systems-Based Thinking: The Impact of Data-Driven Accountability and Reform

    ERIC Educational Resources Information Center

    Mette, Ian M.; Bengtson, Ed

    2015-01-01

    This case was written to help prepare building-level and central office administrators who are expected to effectively lead schools and systems in an often tumultuous world of educational accountability and reform. The intent of this case study is to allow educators to examine the impact data management has on the types of thinking required when…

  3. General random matrix approach to account for the effect of static disorder on the spectral properties of light harvesting systems.

    PubMed

    Sener, Melih K; Schulten, Klaus

    2002-03-01

    We develop a random matrix model approach to study static disorder in pigment-protein complexes in photosynthetic organisms. As a case study, we examine the ring of B850 bacteriochlorophylls in the peripheral light-harvesting complex of Rhodospirillum molischianum, formulated in terms of an effective Hamiltonian describing the collective electronic excitations of the system. We numerically examine and compare various models of disorder and observe that both the density of states and the absorption spectrum of the model show remarkable spectral universality. For the case of unitary disorder, we develop a method to analytically evaluate the density of states of the ensemble using the supersymmetric formulation of random matrix theory. Succinct formulas that can be readily applied in future studies are provided in an appendix.
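    The general recipe of ensemble-averaging over realizations of static disorder can be sketched with a generic nearest-neighbour ring Hamiltonian and Gaussian site-energy (diagonal) disorder. The size, coupling, and disorder width below are placeholders, not the parameters of the B850 effective Hamiltonian:

```python
import numpy as np

rng = np.random.default_rng(0)

def ring_hamiltonian(n=16, coupling=-1.0, sigma=0.3):
    """Nearest-neighbour exciton ring with Gaussian diagonal (site-energy)
    disorder. Parameter values are placeholders, not those of the B850 ring."""
    h = np.zeros((n, n))
    idx = np.arange(n)
    h[idx, (idx + 1) % n] = coupling      # couple each site to its neighbour
    h[(idx + 1) % n, idx] = coupling      # keep the matrix symmetric
    h[idx, idx] = rng.normal(0.0, sigma, n)
    return h

# Ensemble-averaged density of states from eigenvalue histograms
eigs = np.concatenate([np.linalg.eigvalsh(ring_hamiltonian())
                       for _ in range(200)])
dos, edges = np.histogram(eigs, bins=40, density=True)
print(eigs.shape)  # (3200,)
```

    Repeating this for different disorder models (diagonal, off-diagonal, unitary) and comparing the resulting histograms is the numerical counterpart of the spectral-universality comparison described in the abstract.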

  4. When Creative Problem Solving Strategy Meets Web-Based Cooperative Learning Environment in Accounting Education

    ERIC Educational Resources Information Center

    Cheng, Kai Wen

    2011-01-01

    Background: Facing highly competitive and changing environment, cultivating citizens with problem-solving attitudes is one critical vision of education. In brief, the importance of education is to cultivate students with practical abilities. Realizing the advantages of web-based cooperative learning (web-based CL) and creative problem solving…

  5. Negotiating Accountability during Student Teaching: The Influence of an Inquiry-Based Student Teaching Seminar

    ERIC Educational Resources Information Center

    Cuenca, Alexander

    2014-01-01

    Drawing on the work of Russian literary critic, Mikhail Bakhtin, this article explores how an inquiry-based social studies student teaching seminar helped three preservice teachers negotiate the pressures of standards-based reforms during student teaching. The author first examines how initial perceptions of standardization and high-stakes testing…

  6. Accounting & Computing Curriculum Guide.

    ERIC Educational Resources Information Center

    Avani, Nathan T.; And Others

    This curriculum guide consists of materials for use in teaching a competency-based accounting and computing course that is designed to prepare students for employability in the following occupational areas: inventory control clerk, invoice clerk, payroll clerk, traffic clerk, general ledger bookkeeper, accounting clerk, account information clerk,…

  7. A Carbon Monitoring System Approach to US Coastal Wetland Carbon Fluxes: Progress Towards a Tier II Accounting Method with Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Windham-Myers, L.; Holmquist, J. R.; Bergamaschi, B. A.; Byrd, K. B.; Callaway, J.; Crooks, S.; Drexler, J. Z.; Feagin, R. A.; Ferner, M. C.; Gonneea, M. E.; Kroeger, K. D.; Megonigal, P.; Morris, J. T.; Schile, L. M.; Simard, M.; Sutton-Grier, A.; Takekawa, J.; Troxler, T.; Weller, D.; Woo, I.

    2015-12-01

    Despite their high rates of long-term carbon (C) sequestration when compared to upland ecosystems, coastal C accounting is only recently receiving the attention of policy makers and carbon markets. Assessing accuracy and uncertainty in net C flux estimates requires both direct and derived measurements based on both short- and long-term dynamics in key drivers, particularly soil accretion rates and soil organic content. We are testing the ability of remote sensing products and national scale datasets to estimate biomass and soil stocks and fluxes over a wide range of spatial and temporal scales. For example, the 2013 Wetlands Supplement to the 2006 IPCC GHG national inventory reporting guidelines requests information on development of Tier I-III reporting, which express increasing levels of detail. We report progress toward development of a Carbon Monitoring System for "blue carbon" that may be useful for IPCC reporting guidelines at Tier II levels. Our project uses a current dataset of publicly available and contributed field-based measurements to validate models of changing soil C stocks, across a broad range of U.S. tidal wetland types and land-use conversions. Additionally, development of biomass algorithms for both radar and spectral datasets will be tested and used to determine the "price of precision" of different satellite products. We discuss progress in calculating Tier II estimates focusing on variation introduced by the different input datasets. These include the USFWS National Wetlands Inventory, NOAA Coastal Change Analysis Program, and combinations to calculate tidal wetland area. We also assess the use of different attributes and depths from the USDA-SSURGO database to map soil C density. Finally, we examine the relative benefit of radar, spectral and hybrid approaches to biomass mapping in tidal marshes and mangroves. 
While the US currently plans to report GHG emissions at a Tier I level, we argue that a Tier II analysis is possible due to national

  8. EFL Reading Instruction: Communicative Task-Based Approach

    ERIC Educational Resources Information Center

    Sidek, Harison Mohd

    2012-01-01

    The purpose of this study was to examine the overarching framework of EFL (English as a Foreign Language) reading instructional approach reflected in an EFL secondary school curriculum in Malaysia. Based on such analysis, a comparison was made if Communicative Task-Based Language is the overarching instructional approach for the Malaysian EFL…

  9. A Strength-Based Approach to Teacher Professional Development

    ERIC Educational Resources Information Center

    Zwart, Rosanne C.; Korthagen, Fred A. J.; Attema-Noordewier, Saskia

    2015-01-01

    Based on positive psychology, self-determination theory and a perspective on teacher quality, this study proposes and examines a strength-based approach to teacher professional development. A mixed method pre-test/post-test design was adopted to study perceived outcomes of the approach for 93 teachers of six primary schools in the Netherlands and…

  10. The Methodology of the Module: A Content-Based Approach.

    ERIC Educational Resources Information Center

    Martin, Ian

    A discussion of the use of thematic modules in content-based second language instruction argues that the approach has a number of advantages over others. Thematic modules are defined as longer than lessons and shorter than a course, and it is suggested that in a content-based approach, the module constitutes the basic unit of study. Content-based…

  11. Investigative Primary Science: A Problem-Based Learning Approach

    ERIC Educational Resources Information Center

    Etherington, Matthew B.

    2011-01-01

    This study reports on the success of using a problem-based learning approach (PBL) as a pedagogical mode of learning open inquiry science within a traditional four-year undergraduate elementary teacher education program. In 2010, a problem-based learning approach to teaching primary science replaced the traditional content driven syllabus. During…

  12. An energy-based model accounting for snow accumulation and snowmelt in a coniferous forest and in an open area

    NASA Astrophysics Data System (ADS)

    Matějka, Ondřej; Jeníček, Michal

    2016-04-01

    An energy balance approach was used to simulate snow water equivalent (SWE) evolution in an open area, a forest clearing and a coniferous forest during the winter seasons 2011/12 and 2012/13 in the Bystřice River basin (Krušné Mountains, Czech Republic). The aim was to describe the impact of vegetation on snow accumulation and snowmelt under different forest canopy structures and tree densities. Hemispherical photographs were used to describe the forest canopy structure. An energy balance model of snow accumulation and melt was set up and adjusted to account for the effects of the forest canopy on the driving meteorological variables. Leaf area index derived from 32 hemispherical photographs of vegetation and sky was used to implement the forest influence in the snow model. The model was evaluated using snow depth and SWE data measured at 16 localities in the winter seasons from 2011 to 2013. The model was able to reproduce the SWE evolution in both winter seasons beneath the forest canopy, in the forest clearing and in the open area. The SWE maximum at forest sites was 18% lower than in open areas and forest clearings. The contribution of shortwave radiation to the snowmelt rate was 50% lower in forest areas than in open areas due to the shading effect. The importance of turbulent fluxes was 30% lower at forest sites compared to openings because wind speed was reduced to as little as 10% of the values at corresponding open areas. An indirect estimate of interception rates was derived: between 14 and 60% of snowfall was intercepted and sublimated in the forest canopy in both winter seasons. Based on the model results, an underestimation of solid precipitation (measured with a heated precipitation gauge) at the Hřebečná weather station was revealed: snowfall was underestimated by 40% in winter season 2011/12 and by 13% in winter season 2012/13. Although the model formulation appeared sufficient for both analysed winter seasons, canopy effects on the longwave radiation and ground heat flux were not
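    One common way to adjust sub-canopy shortwave radiation with leaf area index, and possibly similar in spirit to the correction described above, is a Beer-Lambert transmissivity term. The extinction coefficient and the functional form here are assumptions, not necessarily the study's formulation:

```python
import math

def subcanopy_shortwave(sw_open, lai, k=0.5):
    """Beer-Lambert style canopy transmissivity:
    SW_forest = SW_open * exp(-k * LAI).
    The extinction coefficient k and the form itself are assumptions."""
    return sw_open * math.exp(-k * lai)

sw_open = 400.0                     # incoming shortwave at the open site (W m-2)
for lai in (0.0, 1.0, 3.0):
    print(lai, round(subcanopy_shortwave(sw_open, lai), 1))
# 0.0 400.0
# 1.0 242.6
# 3.0 89.3
```

    Denser canopies (higher LAI) thus receive a much smaller shortwave input, consistent with the ~50% lower shortwave contribution to melt reported for the forest sites.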

  13. RESULTS FROM A DEMONSTRATION OF RF-BASED UF6 CYLINDER ACCOUNTING AND TRACKING SYSTEM INSTALLED AT A USEC FACILITY

    SciTech Connect

    Pickett, Chris A; Kovacic, Donald N; Morgan, Jim; Younkin, James R; Carrick, Bernie; Whittle, Ken; Johns, R E

    2008-09-01

    add tamper-indicating and data authentication features to some of the pertinent system components. Future efforts will focus on these needs along with implementing protocols relevant to IAEA safeguards. The work detailed in this report demonstrates the feasibility of constructing RF devices that can survive the operational rigors associated with the transportation, storage, and processing of UF6 cylinders. The system software specially designed for this project is called Cylinder Accounting and Tracking System (CATS). This report details the elements of the CATS rules-based architecture and its use in safeguards-monitoring and asset-tracking applications. Information is also provided on improvements needed to make the technology ready, as well as options for improving the safeguards aspects of the technology. The report also includes feedback from personnel involved in the testing, as well as individuals who could utilize an RF-based system in supporting the performance of their work. The system software was set up to support a Mailbox declaration, where a declaration can be made either before or after cylinder movements take place. When the declaration is made before cylinders move, the operators must enter this information into CATS. If the IAEA then shows up unexpectedly at the facility, they can see how closely the operational condition matches the declaration. If the declaration is made after the cylinders move, this provides greater operational flexibility when schedules are interrupted or are changed, by allowing operators to declare what moves have been completed. The IAEA can then compare where cylinders are with where CATS or the system says they are located. The ability of CATS to automatically generate Mailbox declarations is seen by the authors as a desirable feature. The Mailbox approach is accepted by the IAEA but has not been widely implemented (and never in enrichment facilities). 
During the course of this project, we have incorporated alternative

  14. A biomimetic vision-based hovercraft accounts for bees' complex behaviour in various corridors.

    PubMed

    Roubieu, Frédéric L; Serres, Julien R; Colonnier, Fabien; Franceschini, Nicolas; Viollet, Stéphane; Ruffier, Franck

    2014-09-01

    Here we present the first systematic comparison between the visual guidance behaviour of a biomimetic robot and that of honeybees flying in similar environments. We built a miniature hovercraft which can travel safely along corridors with various configurations. For the first time, we implemented on a real physical robot the 'lateral optic flow regulation autopilot', which we had previously studied in computer simulations. This autopilot, inspired by the results of experiments on various species of hymenoptera, consists of two intertwined feedback loops, the speed and lateral control loops, each of which has its own optic flow (OF) set-point. A heading-lock system makes the robot move straight ahead as fast as 69 cm s(-1) with a clearance from one wall as small as 31 cm, giving an unusually high translational OF value (125° s(-1)). Our biomimetic robot was found to navigate safely along straight, tapered and bent corridors, and to react appropriately to perturbations such as the lack of texture on one wall, the presence of a tapering or non-stationary section of the corridor and even a sloping terrain equivalent to a wind disturbance. The front end of the visual system consists of only two local motion sensors (LMS), one on each side. This minimalistic visual system measuring the lateral OF suffices to control both the robot's forward speed and its clearance from the walls without ever measuring any speeds or distances. We added two additional LMSs oriented at ±45° to improve the robot's performance in steeply tapered corridors. The simple control system accounts for worker bees' ability to navigate safely in six challenging environments: straight corridors, single walls, tapered corridors, straight corridors with part of one wall moving or missing, as well as in the presence of wind.
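    The two intertwined feedback loops can be sketched in a toy 1-D corridor simulation: the speed loop holds the sum of the two lateral OFs at one set-point, while the lateral loop holds the larger unilateral OF at another. The gains, set-points, and discrete update below are invented for illustration; only the control principle follows the abstract:

```python
# Toy 1-D corridor simulation of the dual optic-flow (OF) regulator.
# Gains, set-points and the discrete update are invented for illustration.

def step(state, width=2.0, dt=0.05,
         of_sum_sp=2.0, of_side_sp=1.25, k_speed=0.5, k_lat=0.2):
    x, v = state                          # x: distance from left wall (m), v: speed (m/s)
    of_left, of_right = v / x, v / (width - x)
    # Speed loop: regulate the sum of the two lateral OFs
    v += k_speed * (of_sum_sp - (of_left + of_right)) * dt
    # Lateral loop: regulate the larger unilateral OF (steer away from the near wall)
    err = max(of_left, of_right) - of_side_sp
    x += k_lat * err * dt * (1.0 if of_left > of_right else -1.0)
    return x, max(v, 0.01)

state = (0.6, 0.5)                        # start off-centre and slow
for _ in range(2000):
    state = step(state)
print(round(state[0], 2), round(state[1], 2))  # 0.75 0.94
```

    Note that no speed or distance is ever measured: the controller settles at a wall-following equilibrium (here v/x = 1.25 s⁻¹) purely by driving the two OF measurements to their set-points.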

  15. LIFE CLIMATREE project: A novel approach for accounting and monitoring carbon sequestration of tree crops and their potential as carbon sink areas

    NASA Astrophysics Data System (ADS)

    Stergiou, John; Tagaris, Efthimios; -Eleni Sotiropoulou, Rafaella

    2016-04-01

    Climate change mitigation is one of the most important objectives of the Kyoto Convention and is mostly oriented towards reducing GHG emissions. However, carbon sink capacity is retained only in the calculation for forests, since agricultural land and farmers' practices for securing carbon stored in soils have not been recognized in GHG accounting, possibly resulting in incorrect estimations of the carbon dioxide balance in the atmosphere. The agricultural sector, which is a key sector in the EU, has had a consistent strategic framework since 1954 in the form of the Common Agricultural Policy (CAP). In its latest reform of 2013 (reg. (EU) 1305/13), the CAP recognized the significance of agriculture as a key player in climate change policy. To fill this gap, the "LIFE ClimaTree" project was recently funded by the European Commission, aiming to provide a novel method for including tree crop cultivations in the LULUCF accounting rules for GHG emissions and removals. In the framework of the "LIFE ClimaTree" project, the carbon sink within the EU, including the calculated tree crop capacity, will be assessed for both current and future climatic conditions by the 2050s using the GISS-WRF modeling system at a very fine scale (i.e., 9km x 9km) under the RCP8.5 and RCP4.5 climate scenarios. Acknowledgement: LIFE CLIMATREE project "A novel approach for accounting and monitoring carbon sequestration of tree crops and their potential as carbon sink areas" (LIFE14 CCM/GR/000635).

  16. Business Approach To Lunar Base Activation

    NASA Astrophysics Data System (ADS)

    Schmitt, Harrison H.

    2003-01-01

    It remains unlikely that any government or group of governments will make the long-term funding commitments necessary to return to the Moon in support of scientific goals or resource production. If a lunar base is to be established within the foreseeable future, it will be in support of commercial production and use of unique energy resources. Business plan development for commercial production and use of lunar Helium-3 requires a number of major steps, including identification of the required investor base and development of fusion power technology, through a series of business bridges that provide required rates of return.

  17. Discriminating bot accounts based solely on temporal features of microblog behavior

    NASA Astrophysics Data System (ADS)

    Pan, Junshan; Liu, Ying; Liu, Xiang; Hu, Hanping

    2016-05-01

    As the largest microblog service in China, Sina Weibo has attracted numerous automated applications (known as bots) due to its popularity and open architecture. We classify the active users from Sina Weibo into human, bot-based and hybrid groups based solely on the study of temporal features of their posting behavior. The anomalous burstiness parameter and time-interval entropy value are exploited to characterize automation. We also reveal different behavior patterns among the three types of users regarding their reposting ratio, daily rhythm and active days. Our findings may help Sina Weibo manage a better community and should be considered for dynamic models of microblog behaviors.
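    The two temporal features named above are straightforward to compute from a user's inter-post intervals. A sketch using the common burstiness parameter B = (σ − μ)/(σ + μ) and the Shannon entropy of binned intervals; the bin width and sample timelines are invented for illustration:

```python
import math
import statistics as st
from collections import Counter

def burstiness(intervals):
    """B = (sigma - mu) / (sigma + mu): -1 for perfectly regular (bot-like)
    posting, near 0 for Poisson-like, toward +1 for bursty human sessions."""
    mu, sigma = st.mean(intervals), st.pstdev(intervals)
    return (sigma - mu) / (sigma + mu)

def interval_entropy(intervals, bin_width=60):
    """Shannon entropy (bits) of inter-post intervals binned to the minute."""
    counts = Counter(int(t // bin_width) for t in intervals)
    n = len(intervals)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

bot = [300.0] * 50                       # posts exactly every 5 minutes
human = [10, 40, 35, 3600, 25, 7200, 90, 15, 5400, 60] * 5  # irregular sessions

print(round(burstiness(bot), 2), round(interval_entropy(bot), 2))          # -1.0 0.0
print(burstiness(human) > burstiness(bot), interval_entropy(human) > 1.0)  # True True
```

    An automated account posting on a fixed schedule collapses to B ≈ −1 and near-zero entropy, which is the separation the classification above exploits.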

  18. Context-Based Chemistry: The Salters Approach

    ERIC Educational Resources Information Center

    Bennett, Judith; Lubben, Fred

    2006-01-01

    This paper describes briefly the development and key features of one of the major context-based courses for upper high school students, Salters Advanced Chemistry. It goes on to consider the research evidence on the impact of the course, focusing on teachers' views, and, in particular, on students' affective and cognitive responses. The research…

  19. A hybrid agent-based approach for modeling microbiological systems.

    PubMed

    Guo, Zaiyi; Sloot, Peter M A; Tay, Joc Cing

    2008-11-21

    Models for systems biology commonly adopt Differential Equations or Agent-Based modeling approaches for simulating the processes as a whole. Models based on differential equations presuppose phenomenological intracellular behavioral mechanisms, while models based on the Multi-Agent approach often use directly translated, and quantitatively less precise, if-then logical rule constructs. We propose an extendible systems model based on a hybrid agent-based approach where biological cells are modeled as individuals (agents) while molecules are represented by quantities. This hybridization in entity representation entails a combined modeling strategy with agent-based behavioral rules and differential equations, thereby balancing the requirements of extendible model granularity with computational tractability. We demonstrate the efficacy of this approach with models of chemotaxis involving an assay of 10^3 cells and 1.2×10^6 molecules. The model produces cell migration patterns that are comparable to laboratory observations.
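    The hybridization can be sketched on a 1-D lattice: the chemoattractant is a per-site quantity updated by a finite-difference diffusion-decay equation, while cells are discrete agents following an if-then gradient-climbing rule. All parameters below are illustrative assumptions, not the paper's chemotaxis model:

```python
import random

# Molecules as per-site quantities (diffusion-decay PDE, finite differences);
# cells as discrete agents with an if-then gradient-climbing rule.
random.seed(1)
N = 50
chem = [0.0] * N
chem[N - 1] = 100.0                                # attractant source at the right edge
cells = [random.randrange(5) for _ in range(30)]   # agents start near the left wall
D, decay, dt = 0.2, 0.01, 1.0                      # keep D * dt <= 0.5 for stability

for _ in range(400):
    # Quantity update: explicit diffusion + first-order decay
    new = chem[:]
    for i in range(1, N - 1):
        new[i] += dt * (D * (chem[i - 1] - 2 * chem[i] + chem[i + 1]) - decay * chem[i])
    new[N - 1] = 100.0                             # source held constant
    chem = new
    # Agent update: step up the local gradient, with occasional random tumbles
    for k, x in enumerate(cells):
        left = chem[x - 1] if x > 0 else chem[x]
        right = chem[x + 1] if x < N - 1 else chem[x]
        if random.random() < 0.1:
            cells[k] = min(N - 1, max(0, x + random.choice((-1, 1))))
        elif right > left:
            cells[k] = min(N - 1, x + 1)
        elif left > right:
            cells[k] = max(0, x - 1)

print(sum(cells) / len(cells) > 25)                # True: cells migrated toward the source
```

    The design point is visible in the loop structure: the molecular field is updated with a numerical scheme for a differential equation, while each cell only needs cheap local rules, so cell-level detail can be extended without rewriting the field model.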

  20. Adventure-Based Service Learning: University Students' Self-Reflection Accounts of Service with Children

    ERIC Educational Resources Information Center

    Quezada, Reyes L.; Christopherson, Richard W.

    2005-01-01

    The need to provide alternative and exciting community service-learning experiences with university students has been a challenge to institutions of higher education. One institution was able to capitalize on an idea of integrating challenge and adventure-based activities as a form of community service. This article focuses on undergraduate…

  1. Bringing Technology to Students' Proximity: A Sociocultural Account of Technology-Based Learning Projects

    ERIC Educational Resources Information Center

    Mukama, Evode

    2014-01-01

    This paper depicts a study carried out in Rwanda concerning university students who participated in a contest to produce short documentary films. The purpose of this research is to conceptualize these kinds of technology-based learning projects (TBLPs) through a sociocultural perspective. The methodology included focus-group discussions and field…

  3. ACCOUNTING FOR BIOLOGICAL AND ANTHROPOGENIC FACTORS IN NATIONAL LAND-BASED CARBON BUDGETS

    EPA Science Inventory

    Efforts to quantify net greenhouse gas emissions at the national scale, as required by the United Nations Framework Convention on Climate Change, must include both industrial emissions and the net flux associated with the land base. In this study, data on current land use, rates ...

  4. The Social Foundation of Team-Based Learning: Students Accountable to Students

    ERIC Educational Resources Information Center

    Sweet, Michael; Pelton-Sweet, Laura M.

    2008-01-01

    As one form of small group learning, team-based learning's (TBL's) unique sequence of individual and group work with immediate feedback enables and encourages students to engage course content and each other in remarkable ways. Specifically, TBL creates an environment where students can fulfill their human need to belong in the process of…

  5. Frequency Affects Object Relative Clause Processing: Some Evidence in Favor of Usage-Based Accounts

    ERIC Educational Resources Information Center

    Reali, Florencia

    2014-01-01

    The processing difficulty of nested grammatical structure has been explained by different psycholinguistic theories. Here I provide corpus and behavioral evidence in favor of usage-based models, focusing on the case of object relative clauses in Spanish as a first language. A corpus analysis of spoken Spanish reveals that, as in English, the…

  6. Audit Culture: Unintended Consequences of Accountability Practices in Evidence-Based Programs

    ERIC Educational Resources Information Center

    Owczarzak, Jill; Broaddus, Michelle; Pinkerton, Steven

    2016-01-01

    Evaluation has become expected within the nonprofit sector, including HIV prevention service delivery through community-based organizations (CBOs). While staff and directors at CBOs may acknowledge the potential contribution of evaluation data to the improvement of agency services, the results of evaluation are often used to demonstrate fiscal…

  7. Temporal Adverbials, Negation and Finiteness in Dutch as a Second Language: A Scope-Based Account

    ERIC Educational Resources Information Center

    Verhagen, Josje

    2009-01-01

    This study investigates the acquisition of post-verbal (temporal) adverbials and post-verbal negation in L2 Dutch. It is based on previous findings for L2 French that post-verbal negation poses less of a problem for L2 learners than post-verbal adverbial placement (Hawkins, Towell, Bazergui, Second Language Research 9: 189-233, 1993; Herschensohn,…

  8. Theory-Based Approaches to the Concept of Life

    ERIC Educational Resources Information Center

    El-Hani, Charbel Nino

    2008-01-01

    In this paper, I argue that characterisations of life through lists of properties have several shortcomings and should be replaced by theory-based accounts that explain the coexistence of a set of properties in living beings. The concept of life should acquire its meaning from its relationships with other concepts inside a theory. I illustrate…

  9. Approach for classification and taxonomy within family Rickettsiaceae based on the Formal Order Analysis.

    PubMed

    Shpynov, S; Pozdnichenko, N; Gumenuk, A

    2015-01-01

    Genome sequences of 36 Rickettsia and Orientia strains were analyzed using Formal Order Analysis (FOA). This approach takes into account the arrangement of nucleotides in each sequence. A numerical characteristic, the average distance (remoteness) "g", was used to compare genomes. Our results corroborated the previous separation of three groups within the genus Rickettsia, comprising the typhus group, the classic spotted fever group, and the ancestral group, with Orientia as a separate genus. Rickettsia felis URRWXCal2 and R. akari Hartford did not fall in the same group based on FOA; therefore, the designation of a so-called transitional Rickettsia group could not be confirmed with this approach.
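    The abstract does not define the average remoteness "g" exactly. One plausible arrangement-sensitive reading, given here purely as a hypothetical illustration and not necessarily the FOA definition, is the mean gap between successive occurrences of each nucleotide, averaged over the four bases:

```python
# Hedged sketch: mean gap between successive occurrences of each nucleotide,
# averaged over the four bases. An illustrative, arrangement-sensitive
# statistic, not necessarily the exact FOA definition of "g".

def average_remoteness(seq, alphabet="ACGT"):
    gaps = []
    for base in alphabet:
        positions = [i for i, c in enumerate(seq) if c == base]
        gaps.extend(b - a for a, b in zip(positions, positions[1:]))
    return sum(gaps) / len(gaps) if gaps else 0.0

# Identical base composition, different arrangement -> different value
print(average_remoteness("ACGTACGTACGT"))  # 4.0
print(average_remoteness("AAACCCGGGTTT"))  # 1.0
```

    Two sequences with identical base composition but different arrangement give different values, which is the property that distinguishes this kind of statistic from composition measures such as GC content.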

  10. Nanotechnology-Based Approach in Tuberculosis Treatment

    PubMed Central

    Neyaz, Md. Kausar; Das, Shilpi

    2017-01-01

    Tuberculosis, commonly known as TB, is the second most fatal infectious disease after AIDS, caused by the bacterium Mycobacterium tuberculosis. Prolonged treatment, high pill burden, low compliance, and stiff administration schedules are factors responsible for the emergence of MDR and XDR cases of tuberculosis. To date, only the BCG vaccine is available, and it is ineffective against adult pulmonary TB, which is the most common form of the disease. Various unique antibodies have been developed to overcome drug resistance, reduce the treatment regimen, and elevate compliance to treatment. Therefore, we need an effective and robust system to overcome technological drawbacks and improve the effectiveness of therapeutic drugs, which still remains a major challenge for pharmaceutical technology. Nanoparticle-based approaches have shown convincing treatment and promising outcomes for chronic infectious diseases. Different types of nanocarriers have been evaluated as promising drug delivery systems for various administration routes. Controlled and sustained release of drugs is one of the advantages of nanoparticle-based antituberculosis drugs over free drug. It also reduces the dosage frequency and resolves the difficulty of poor compliance. This paper reviews various nanotechnology-based therapies which can be used for the treatment of TB. PMID:28210505

  11. Design of an activity landscape view taking compound-based feature probabilities into account.

    PubMed

    Zhang, Bijun; Vogt, Martin; Bajorath, Jürgen

    2014-09-01

    Activity landscapes (ALs) of compound data sets are rationalized as graphical representations that integrate similarity and potency relationships between active compounds. ALs enable the visualization of structure-activity relationship (SAR) information and are thus computational tools of interest for medicinal chemistry. For AL generation, similarity and potency relationships are typically evaluated in a pairwise manner and major AL features are assessed at the level of compound pairs. In this study, we add a conditional probability formalism to AL design that makes it possible to quantify the probability of individual compounds to contribute to characteristic AL features. Making this information graphically accessible in a molecular network-based AL representation is shown to further increase AL information content and helps to quickly focus on SAR-informative compound subsets. This feature probability-based AL variant extends the current spectrum of AL representations for medicinal chemistry applications.
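
    A minimal sketch of the compound-level idea described above: instead of only flagging pairs, score each compound by the fraction of its pairs that form "activity cliffs" (high structural similarity, large potency gap). The similarity/potency thresholds, the set-based fingerprints, and the frequency-based probability are illustrative assumptions, not the paper's actual conditional-probability formalism.

```python
from itertools import combinations

def tanimoto(a, b):
    """Tanimoto similarity of two fingerprint feature sets (toy input)."""
    return len(a & b) / len(a | b)

def cliff_probability(compounds, sim_cut=0.5, pot_cut=2.0):
    """Per-compound fraction of pairs that are activity cliffs.

    compounds: list of (name, fingerprint_set, pPotency) tuples.
    A pair is a cliff when similarity >= sim_cut and the potency
    difference >= pot_cut (both thresholds are illustrative).
    """
    counts = {name: [0, 0] for name, _, _ in compounds}  # [cliffs, pairs]
    for (n1, fp1, p1), (n2, fp2, p2) in combinations(compounds, 2):
        is_cliff = tanimoto(fp1, fp2) >= sim_cut and abs(p1 - p2) >= pot_cut
        for n in (n1, n2):
            counts[n][0] += is_cliff
            counts[n][1] += 1
    return {n: c / t for n, (c, t) in counts.items()}
```

    Annotating each node of a molecular network with such a score is one way to make SAR-informative compounds stand out visually, in the spirit of the feature-probability AL variant.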

  13. What Part of "No" Do Children Not Understand? A Usage-Based Account of Multiword Negation

    ERIC Educational Resources Information Center

    Cameron-Faulkner, Thea; Lieven, Elena; Theakston, Anna

    2007-01-01

    The study investigates the development of English multiword negation, in particular the negation of zero marked verbs (e.g. "no sleep", "not see", "can't reach") from a usage-based perspective. The data was taken from a dense database consisting of the speech of an English-speaking child (Brian) aged 2;3-3;4 (MLU 2.05-3.1) and his mother. The…

  14. Liaison acquisition, word segmentation and construction in French: a usage-based account.

    PubMed

    Chevrot, Jean-Pierre; Dugua, Celine; Fayol, Michel

    2009-06-01

    In the linguistics field, liaison in French is interpreted as an indicator of interactions between the various levels of language organization. The current study examines the same issue while adopting a developmental perspective. Five experiments involving children aged two to six years provide evidence for a developmental scenario which interrelates a number of different issues: the acquisition of phonological alternations, the segmentation of new words, the long-term stabilization of the word form in the lexicon and the formation of item-based constructions. According to this scenario, children favour the presence of initial CV syllables when segmenting stored chunks of speech of the type word1-liaison-word2 (les arbres 'the trees' is segmented as /le/+/zarbr/). They cope with the variation of the liaison in the input by memorizing multiple exemplars of the same word2 (/zarbr/, /narbr/). They learn the correct relations between the word1s and the word2 exemplars through exposure to the well-formed sequence (un+/narbr/, deux+/zarbr/). They generalize the relation between a word1 and a class of word2 exemplars beginning with a specific liaison consonant by integrating this information into an item-based schema (e.g. un+/nX/, deux+/zX/). This model is based on the idea that the segmentation of new words and the development of syntactic schemas are two aspects of the same process.

  15. Accounting for failure: risk-based regulation and the problems of ensuring healthcare quality in the NHS.

    PubMed

    Beaussier, Anne-Laure; Demeritt, David; Griffiths, Alex; Rothstein, Henry

    2016-05-18

    In this paper, we examine why risk-based policy instruments have failed to improve the proportionality, effectiveness, and legitimacy of healthcare quality regulation in the National Health Service (NHS) in England. Rather than trying to prevent all possible harms, risk-based approaches promise to rationalise and manage the inevitable limits of what regulation can hope to achieve by focusing regulatory standard-setting and enforcement activity on the highest priority risks, as determined through formal assessments of their probability and consequences. As such, risk-based approaches have been enthusiastically adopted by healthcare quality regulators over the last decade. However, by drawing on historical policy analysis and in-depth interviews with 15 high-level UK informants in 2013-2015, we identify a series of practical problems in using risk-based policy instruments for defining, assessing, and ensuring compliance with healthcare quality standards. Based on our analysis, we go on to consider why, despite a succession of failures, healthcare regulators remain committed to developing and using risk-based approaches. We conclude by identifying several preconditions for successful risk-based regulation: goals must be clear and trade-offs between them amenable to agreement; regulators must be able to reliably assess the probability and consequences of adverse outcomes; regulators must have a range of enforcement tools that can be deployed in proportion to risk; and there must be political tolerance for adverse outcomes.

  17. Emerging accounting trends: accounting for leases.

    PubMed

    Valletta, Robert; Huggins, Brian

    2010-12-01

    A new model for lease accounting can have a significant impact on hospitals and healthcare organizations. The new approach proposes a "right-of-use" model that involves complex estimates and significant administrative burden. Hospitals and health systems that draw heavily on lease arrangements should start preparing for the new approach now even though guidance and a final rule are not expected until mid-2011. This article highlights a number of considerations from the lessee point of view.

  18. Wavelet-based approach to character skeleton.

    PubMed

    You, Xinge; Tang, Yuan Yan

    2007-05-01

    Character skeleton plays a significant role in character recognition. The strokes of a character may consist of two regions, i.e., singular and regular regions. The intersections and junctions of the strokes belong to the singular region, while the straight and smooth parts of the strokes are categorized as the regular region. Therefore, a skeletonization method requires two different processes to treat the skeletons in these two regions. All traditional skeletonization algorithms are based on the symmetry analysis technique. The major problems of these methods are as follows. 1) The computation of the primary skeleton in the regular region is indirect, so its implementation is sophisticated and costly. 2) The extracted skeleton cannot be exactly located on the central line of the stroke. 3) The captured skeleton in the singular region may be distorted by artifacts and branches. To overcome these problems, a novel scheme for extracting the skeleton of a character based on the wavelet transform is presented in this paper. This scheme consists of two main steps, namely: a) extraction of the primary skeleton in the regular region and b) amendment of the primary skeletons and their connection in the singular region. A direct technique is used in the first step, where a new wavelet-based symmetry analysis is developed for finding the central line of the stroke directly. A novel method called smooth interpolation is designed in the second step, where a smoothing operation is applied to the primary skeleton and, thereafter, an interpolation compensation technique is proposed to link the primary skeletons, so that the skeleton in the singular region can be produced. Experiments are conducted and positive results are achieved, which show that the proposed skeletonization scheme is applicable not only to binary images but also to gray-level images, and that the skeleton is robust against noise and affine transforms.

  19. Workflow-based approaches to neuroimaging analysis.

    PubMed

    Fissell, Kate

    2007-01-01

    Analysis of functional and structural magnetic resonance imaging (MRI) brain images requires a complex sequence of data processing steps to proceed from raw image data to the final statistical tests. Neuroimaging researchers have begun to apply workflow-based computing techniques to automate data analysis tasks. This chapter discusses eight major components of workflow management systems (WFMSs): the workflow description language, editor, task modules, data access, verification, client, engine, and provenance, and their implementation in the Fiswidgets neuroimaging workflow system. Neuroinformatics challenges involved in applying workflow techniques in the domain of neuroimaging are discussed.

  20. INDIVIDUAL BASED MODELLING APPROACH TO THERMAL ...

    EPA Pesticide Factsheets

    Diadromous fish populations in the Pacific Northwest face challenges along their migratory routes from declining habitat quality, harvest, and barriers to longitudinal connectivity. Changes in river temperature regimes are producing an additional challenge for upstream migrating adult salmon and steelhead, species that are sensitive to absolute and cumulative thermal exposure. Adult salmon populations have been shown to utilize cold water patches along migration routes when mainstem river temperatures exceed thermal optimums. We are employing an individual based model (IBM) to explore the costs and benefits of spatially-distributed cold water refugia for adult migrating salmon. Our model, developed in the HexSim platform, is built around a mechanistic behavioral decision tree that drives individual interactions with their spatially explicit simulated environment. Population-scale responses to dynamic thermal regimes, coupled with other stressors such as disease and harvest, become emergent properties of the spatial IBM. Other model outputs include arrival times, species-specific survival rates, body energetic content, and reproductive fitness levels. Here, we discuss the challenges associated with parameterizing an individual based model of salmon and steelhead in a section of the Columbia River. Many rivers and streams in the Pacific Northwest are currently listed as impaired under the Clean Water Act as a result of high summer water temperatures. Adverse effec

  1. A new computational account of cognitive control over reinforcement-based decision-making: Modeling of a probabilistic learning task.

    PubMed

    Zendehrouh, Sareh

    2015-11-01

    Recent work in the decision-making field offers an account of dual-system theory for the decision-making process. This theory holds that the process is conducted by two main controllers: a goal-directed system and a habitual system. In the reinforcement learning (RL) domain, habitual behaviors are connected with model-free methods, in which appropriate actions are learned through trial-and-error experience. Goal-directed behaviors, in contrast, are associated with model-based methods of RL, in which actions are selected using a model of the environment. Studies on cognitive control also suggest that during processes like decision-making, some cortical and subcortical structures work in concert to monitor the consequences of decisions and to adjust control according to current task demands. Here a computational model is presented based on dual-system theory and the cognitive control perspective on decision-making. The proposed model is used to simulate human performance on a variant of a probabilistic learning task. The basic proposal is that the brain implements a dual controller, while an accompanying monitoring system detects several kinds of conflict, including a hypothetical cost conflict. The simulation results address existing theories about two event-related potentials, namely error-related negativity (ERN) and feedback-related negativity (FRN), and explore the best account of them. Based on the results, some testable predictions are also presented.
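
    The habitual/goal-directed split maps directly onto model-free versus model-based RL. A toy sketch of the two controllers, with a hypothetical two-action state space; the transition model T and reward model R are assumptions introduced purely for illustration, not the paper's task.

```python
ACTIONS = [0, 1]

def model_free_update(Q, s, a, r, s_next, alpha=0.5, gamma=0.9):
    """Habitual controller: trial-and-error TD (Q-learning) update."""
    q = Q.get((s, a), 0.0)
    target = r + gamma * max(Q.get((s_next, b), 0.0) for b in ACTIONS)
    Q[(s, a)] = q + alpha * (target - q)
    return Q

def model_based_value(s, a, T, R, gamma=0.9, depth=3):
    """Goal-directed controller: plan by unrolling a known model.

    T[s][a] -> next state, R[s][a] -> immediate reward (deterministic toy).
    """
    if depth == 0:
        return 0.0
    s_next = T[s][a]
    return R[s][a] + gamma * max(
        model_based_value(s_next, b, T, R, gamma, depth - 1) for b in ACTIONS
    )
```

    A dual-controller model in the spirit of the abstract would arbitrate between these two value estimates, with a monitoring system adjusting control when their recommendations conflict.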

  2. Ethics education for health professionals: a values based approach.

    PubMed

    Godbold, Rosemary; Lees, Amanda

    2013-11-01

    It is now widely accepted that ethics is an essential part of educating health professionals. Despite a clear mandate to educators, there are differing approaches, in particular, how and where ethics is positioned in training programmes, underpinning philosophies and optimal modes of assessment. This paper explores varying practices and argues for a values based approach to ethics education. It then explores the possibility of using a web-based technology, the Values Exchange, to facilitate a values based approach. It uses the findings of a small scale study to signal the potential of the Values Exchange for engaging, meaningful and applied ethics education.

  3. Towards Project-Based Learning: An Autoethnographic Account of One Assistant Professor's Struggle to Be a Better Teacher

    ERIC Educational Resources Information Center

    Greer, Wil

    2016-01-01

    This paper outlines an approach to incorporating project-based learning (PBL) in a master's level educational administration diversity course. It draws on the qualitative methodology of autoethnography, and details the characteristics of this technique. In alignment with that method, the author discusses his positionality and engages in…

  4. Multiresolution approach based on projection matrices

    SciTech Connect

    Vargas, Javier; Quiroga, Juan Antonio

    2009-03-01

    Active triangulation measurement systems with a rigid geometric configuration are inappropriate for scanning large objects with low measuring tolerances. The reason is that the ratio between the depth recovery error and the lateral extension is a constant that depends on the geometric setup. As a consequence, measuring large areas with low depth recovery error requires the use of multiresolution techniques. We propose a multiresolution technique based on a camera-projector system previously calibrated. The method consists of changing the camera or projector's parameters in order to increase the system depth sensitivity. A subpixel retroprojection error in the self-calibration process and a decrease of approximately one order of magnitude in the depth recovery error can be achieved using the proposed method.

  5. Ameliorated GA approach for base station planning

    NASA Astrophysics Data System (ADS)

    Wang, Andong; Sun, Hongyue; Wu, Xiaomin

    2011-10-01

    In this paper, we aim to locate base stations (BSs) rationally so as to satisfy the most customers using the fewest BSs. An ameliorated GA is proposed to search for the optimum solution. In the algorithm, we mesh the area to be planned according to the least overlap length derived from the coverage radius, introduce an isometric grid encoding method to represent the BS distribution as well as the BS count, and develop selection, crossover, and mutation operators tailored to our requirements. We also construct a comprehensive objective function that synthesizes coverage ratio, overlap ratio, population, and geographical conditions. Finally, after importing an electronic map of the area to be planned, a recommended strategy draft is exported correspondingly. We use Hong Kong, China as a simulation case and obtain a satisfactory solution.
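
    A compact sketch of the GA idea on a toy problem. The grid of demand points, the candidate BS sites, the coverage radius, and the objective weights are all hypothetical; the paper's isometric grid encoding and map import are replaced here by a simple bit-per-site genome with elitist selection, one-point crossover, and bit-flip mutation.

```python
import random

random.seed(0)

GRID = [(x, y) for x in range(6) for y in range(6)]   # demand points (toy)
SITES = [(1, 1), (1, 4), (4, 1), (4, 4), (3, 3)]      # candidate BS sites
RADIUS = 2.5

def covered(point, genome):
    """Number of active base stations covering a demand point."""
    return sum(1 for g, s in zip(genome, SITES)
               if g and (point[0] - s[0]) ** 2 + (point[1] - s[1]) ** 2
               <= RADIUS ** 2)

def fitness(genome):
    # Reward coverage, penalise overlap and BS count (toy objective
    # standing in for the paper's coverage/overlap/population function).
    cov = sum(1 for p in GRID if covered(p, genome) >= 1)
    overlap = sum(1 for p in GRID if covered(p, genome) >= 2)
    return cov - 0.5 * overlap - 0.25 * sum(genome)

def evolve(pop_size=20, gens=40, pmut=0.1):
    pop = [[random.randint(0, 1) for _ in SITES] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]             # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(SITES))  # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < pmut else g for g in child]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

    Swapping the toy objective for one weighted by population density and terrain, and the site list for grid cells derived from a real map, recovers the structure of the planning approach described above.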

  6. Accounting: "Balancing Out" the Accounting Program.

    ERIC Educational Resources Information Center

    Babcock, Coleen

    1979-01-01

    The vocational accounting laboratory is a viable, meaningful educational experience for high school seniors, due to the uniqueness of its educational approach and the direct involvement of the professional and business community. A balance of experiences is provided to match individual needs and goals of students. (CT)

  7. An Individual-Based Model of Zebrafish Population Dynamics Accounting for Energy Dynamics

    PubMed Central

    Beaudouin, Rémy; Goussen, Benoit; Piccini, Benjamin; Augustine, Starrlight; Devillers, James; Brion, François; Péry, Alexandre R. R.

    2015-01-01

    Developing population dynamics models for zebrafish is crucial in order to extrapolate from toxicity data measured at the organism level to biological levels relevant to support and enhance ecological risk assessment. To achieve this, a dynamic energy budget for individual zebrafish (DEB model) was coupled to an individual based model of zebrafish population dynamics (IBM model). Next, we fitted the DEB model to new experimental data on zebrafish growth and reproduction thus improving existing models. We further analysed the DEB-model and DEB-IBM using a sensitivity analysis. Finally, the predictions of the DEB-IBM were compared to existing observations on natural zebrafish populations and the predicted population dynamics are realistic. While our zebrafish DEB-IBM model can still be improved by acquiring new experimental data on the most uncertain processes (e.g. survival or feeding), it can already serve to predict the impact of compounds at the population level. PMID:25938409

  8. Beyond simple models of self-control to circuit-based accounts of adolescent behavior.

    PubMed

    Casey, B J

    2015-01-03

    Adolescence is the transition from childhood to adulthood that begins around the onset of puberty and ends with relative independence from the parent. This developmental period is one when an individual is probably stronger, of higher reasoning capacity, and more resistant to disease than ever before, yet when mortality rates increase by 200%. These untimely deaths are not due to disease but to preventable deaths associated with adolescents putting themselves in harm's way (e.g., accidental fatalities). We present evidence that these alarming health statistics are in part due to diminished self-control--the ability to inhibit inappropriate desires, emotions, and actions in favor of appropriate ones. Findings of adolescent-specific changes in self-control and underlying brain circuitry are considered in terms of how evolutionarily based biological constraints and experiences shape the brain to adapt to the unique intellectual, physical, sexual, and social challenges of adolescence.

  9. Accounting for Proof Test Data in a Reliability Based Design Optimization Framework

    NASA Technical Reports Server (NTRS)

    Venter, Gerhard; Scotti, Stephen J.

    2012-01-01

    This paper investigates the use of proof (or acceptance) test data during the reliability based design optimization of structural components. It is assumed that every component will be proof tested and that the component will only enter into service if it passes the proof test. The goal is to reduce the component weight, while maintaining high reliability, by exploiting the proof test results during the design process. The proposed procedure results in the simultaneous design of the structural component and the proof test itself and provides the designer with direct control over the probability of failing the proof test. The procedure is illustrated using two analytical example problems and the results indicate that significant weight savings are possible when exploiting the proof test results during the design process.
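
    The core reliability mechanism can be illustrated with a small Monte Carlo sketch: proof testing truncates the weak tail of the strength distribution, so in-service reliability is a conditional probability given proof-test survival. The normal strength distribution and the specific load levels below are assumptions for illustration, not the paper's structural models.

```python
import random

random.seed(1)

def reliability_with_proof_test(n=50_000, proof_load=1.1, service_load=1.0,
                                mean_strength=1.5, cov=0.2):
    """Monte Carlo estimate of in-service reliability among components
    that passed a proof test (components failing the proof test are
    never fielded, which truncates the weak tail of the distribution)."""
    survivors = failures = 0
    for _ in range(n):
        strength = random.gauss(mean_strength, cov * mean_strength)
        if strength < proof_load:          # fails proof test, not fielded
            continue
        survivors += 1
        if strength < service_load:        # fails in service
            failures += 1
    return 1.0 - failures / survivors

r_proof = reliability_with_proof_test(proof_load=1.1)
r_no_proof = reliability_with_proof_test(proof_load=float("-inf"))
```

    With the proof load above the service load, every fielded component is guaranteed to survive the (deterministic) service load in this toy model; in the paper's setting, designing the component and the proof load together lets weight come down while the conditional reliability stays high.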

  10. [Management of large marine ecosystem based on ecosystem approach].

    PubMed

    Chu, Jian-song

    2011-09-01

    Large marine ecosystem (LME) is a large area of ocean characterized by distinct oceanology and ecology. Its natural characteristics require management based on ecosystem approach. A series of international treaties and regulations definitely or indirectly support that it should adopt ecosystem approach to manage LME to achieve the sustainable utilization of marine resources. In practices, some countries such as Canada, Australia, and U.S.A. have adopted ecosystem-based approach to manage their oceans, and some international organizations such as global environment fund committee have carried out a number of LME programs based on ecosystem approach. Aiming at the sustainable development of their fisheries, the regional organizations such as Caribbean Community have established regional fisheries mechanism. However, the adoption of ecosystem approach to manage LME is not only a scientific and legal issue, but also a political matter largely depending on the political will and the mutual cooperation degree of related countries.

  11. Enuresis in children: a case based approach.

    PubMed

    Baird, Drew C; Seehusen, Dean A; Bode, David V

    2014-10-15

    Enuresis is defined as intermittent urinary incontinence during sleep in a child at least five years of age. Approximately 5% to 10% of all seven-year-olds have enuresis, and an estimated 5 to 7 million children in the United States have enuresis. The pathophysiology of primary nocturnal enuresis involves the inability to awaken from sleep in response to a full bladder, coupled with excessive nighttime urine production or a decreased functional capacity of the bladder. Initial evaluation should include a history, physical examination, and urinalysis. Several conditions, such as constipation, obstructive sleep apnea, diabetes mellitus, diabetes insipidus, chronic kidney disease, and psychiatric disorders, are associated with enuresis. If identified, these conditions should be evaluated and treated. Treatment of primary monosymptomatic enuresis (i.e., the only symptom is nocturnal bed-wetting in a child who has never been dry) begins with counseling the child and parents on effective behavioral modifications. First-line treatments for enuresis include bed alarm therapy and desmopressin. The choice of therapy is based on the child's age and nighttime voiding patterns, and the desires of the child and family. Referral to a pediatric urologist is indicated for children with primary enuresis refractory to standard and combination therapies, and for children with some secondary causes of enuresis, including urinary tract malformations, recurrent urinary tract infections, or neurologic disorders.

  12. Accounting for phase drifts in SSVEP-based BCIs by means of biphasic stimulation.

    PubMed

    Wu, Hung-Yi; Lee, Po-Lei; Chang, Hsiang-Chih; Hsieh, Jen-Chuen

    2011-05-01

    This study proposes a novel biphasic stimulation technique to solve the issue of phase drifts in steady-state visual evoked potential (SSVEPs) in phase-tagged systems. Phase calibration was embedded in stimulus sequences using a biphasic flicker, which is driven by a sequence with alternating reference and phase-shift states. Nine subjects were recruited to participate in off-line and online tests. Signals were bandpass filtered and segmented by trigger signals into reference and phase-shift epochs. Frequency components of SSVEP in the reference and phase-shift epochs were extracted using the Fourier method with a 50% overlapped sliding window. The real and imaginary parts of the SSVEP frequency components were organized into complex vectors in each epoch. Hotelling's t-square test was used to determine the significances of nonzero mean vectors. The rejection of noisy data segments and the validation of gaze detections were made based on p values. The phase difference between the valid mean vectors of reference and phase-shift epochs was used to identify user's gazed targets in this system. Data showed an average information transfer rate of 44.55 and 38.21 bits/min in off-line and online tests, respectively.
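
    The phase-comparison step can be sketched with a single-bin DFT: extract the complex amplitude of each epoch at the flicker frequency, average the reference and phase-shift epochs separately, and take the angle between the two mean vectors. The sampling rate and flicker frequency below are assumptions, and the sketch omits the Hotelling's t-square validity test used in the study.

```python
import cmath
import math

FS = 250.0       # sampling rate in Hz (assumed)
F_STIM = 12.0    # flicker frequency in Hz (assumed)

def fourier_component(epoch, freq=F_STIM, fs=FS):
    """Complex amplitude of `epoch` at `freq` (single-bin DFT)."""
    n = len(epoch)
    return sum(x * cmath.exp(-2j * math.pi * freq * k / fs)
               for k, x in enumerate(epoch)) * 2 / n

def phase_difference(ref_epochs, shift_epochs):
    """Angle between the mean complex vectors of the reference and
    phase-shift epochs; this phase difference identifies the gazed
    target in a phase-tagged SSVEP system."""
    mean_ref = sum(map(fourier_component, ref_epochs)) / len(ref_epochs)
    mean_shift = sum(map(fourier_component, shift_epochs)) / len(shift_epochs)
    return cmath.phase(mean_shift / mean_ref)  # radians, in (-pi, pi]
```

    Because the reference state is embedded in the same biphasic stimulus sequence, any slow phase drift affects both epoch types equally and cancels in the ratio, which is the point of the calibration scheme described above.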

  13. Simulation-Based Constructivist Approach for Education Leaders

    ERIC Educational Resources Information Center

    Shapira-Lishchinsky, Orly

    2015-01-01

    The purpose of this study was to reflect the leadership strategies that may arise using a constructivist approach based on organizational learning. This approach involved the use of simulations that focused on ethical tensions in school principals' daily experiences, and the development of codes of ethical conduct to reduce these tensions. The…

  14. Theory Based Approaches to Learning. Implications for Adult Educators.

    ERIC Educational Resources Information Center

    Bolton, Elizabeth B.; Jones, Edward V.

    This paper presents a codification of theory-based approaches that are applicable to adult learning situations. It also lists some general guidelines that can be used when selecting a particular approach or theory as a basis for planning instruction. Adult education's emphasis on practicality and the relationship between theory and practice is…

  15. Interteaching: An Evidence-Based Approach to Instruction

    ERIC Educational Resources Information Center

    Brown, Thomas Wade; Killingsworth, Kenneth; Alavosius, Mark P.

    2014-01-01

    This paper describes "interteaching" as an evidence-based method of instruction. Instructors often rely on more traditional approaches, such as lectures, as means to deliver instruction. Despite high usage, these methods are ineffective at achieving desirable academic outcomes. We discuss an innovative approach to delivering instruction…

  16. A values-based approach to medical leadership.

    PubMed

    Moen, Charlotte; Prescott, Patricia

    2016-11-02

    Integrity, trust and authenticity are essential characteristics of an effective leader, demonstrated through a values-based approach to leadership. This article explores whether Covey's (1989) principle-centred leadership model is a useful approach to developing doctors' leadership qualities and skills.

  17. Effects of storm runoff on acid-base accounting of mine drainage

    SciTech Connect

    Sjoegren, D.R.; Olyphant, G.A.; Harper, D.

    1997-12-31

    Pre-reclamation conditions were documented at an abandoned mine site in an upland area at the headwaters of a small perennial stream in southwestern Indiana. Stream discharge and chemistry were monitored from April to October 1995, in an effort to assess the total acid-base budget of outflows from the site. The chemistry of three lakes, a shallow aquifer, and flooded mine voids was also monitored. During the period of monitoring, thirty-five rainfall-runoff events occurred, producing a total storm discharge of approximately 6.12 × 10^7 L. Baseflow during the monitoring period was approximately 1.10 × 10^8 L and was characterized by water chemistry that was similar to that of a spring that issued from the flooded mine voids. Analysis of the discharge and chemistry associated with an isolated thunderstorm revealed fluctuations in acidity that were not congruent with fluctuations in the total discharge hydrograph. For example, acidity increased rapidly during the initial phase of hydrograph rise, but dropped significantly as the storm hydrograph peaked. A second, more subdued, rise in acidity occurred during a second rain pulse, and the acidity gradually decreased to pre-storm levels during hydrograph recession. The trends are interpreted to reflect different sources of storm runoff associated with various components of the total discharge hydrograph. Preliminary calculations indicate that the total quantity of acidity that is discharged during stormflow is about eight times higher than that which is discharged during a comparable period under baseflow conditions. While the lower acid concentrations generated during storm events are ecologically favorable, the increase in total quantities of acidity can have implications for the buffering capacities of receiving water bodies.

  18. A Comparison of Seven Cox Regression-Based Models to Account for Heterogeneity Across Multiple HIV Treatment Cohorts in Latin America and the Caribbean

    PubMed Central

    Giganti, Mark J.; Luz, Paula M.; Caro-Vega, Yanink; Cesar, Carina; Padgett, Denis; Koenig, Serena; Echevarria, Juan; McGowan, Catherine C.; Shepherd, Bryan E.

    2015-01-01

    Many studies of HIV/AIDS aggregate data from multiple cohorts to improve power and generalizability. There are several analysis approaches to account for cross-cohort heterogeneity; we assessed how different approaches can impact results from an HIV/AIDS study investigating predictors of mortality. Using data from 13,658 HIV-infected patients starting antiretroviral therapy from seven Latin American and Caribbean cohorts, we illustrate the assumptions of seven readily implementable approaches to account for across cohort heterogeneity with Cox proportional hazards models, and we compare hazard ratio estimates across approaches. As a sensitivity analysis, we modify cohort membership to generate specific heterogeneity conditions. Hazard ratio estimates varied slightly between the seven analysis approaches, but differences were not clinically meaningful. Adjusted hazard ratio estimates for the association between AIDS at treatment initiation and death varied from 2.00 to 2.20 across approaches that accounted for heterogeneity; the adjusted hazard ratio was estimated as 1.73 in analyses that ignored across cohort heterogeneity. In sensitivity analyses with more extreme heterogeneity, we noted a slightly greater distinction between approaches. Despite substantial heterogeneity between cohorts, the impact of the specific approach to account for heterogeneity was minimal in our case study. Our results suggest that it is important to account for across cohort heterogeneity in analyses, but that the specific technique for addressing heterogeneity may be less important. Because of their flexibility in accounting for cohort heterogeneity, we prefer stratification or meta-analysis methods, but we encourage investigators to consider their specific study conditions and objectives. PMID:25647087
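
    Stratification, the approach the authors prefer, amounts to forming Cox risk sets within each cohort so that every cohort has its own baseline hazard. A minimal stdlib sketch for a single binary covariate (Breslow handling of ties, toy data format); real analyses would use a survival library rather than this illustration.

```python
import math

def cox_partial_loglik(data, beta, stratify=True):
    """Cox partial log-likelihood for one covariate.

    data: list of (time, event, x, cohort) tuples. With stratify=True,
    risk sets are formed within each cohort (cohort-specific baseline
    hazards); with stratify=False, cohort membership is ignored.
    """
    strata = {}
    for rec in data:
        key = rec[3] if stratify else 0
        strata.setdefault(key, []).append(rec)
    ll = 0.0
    for recs in strata.values():
        recs = sorted(recs)                        # order by event time
        for t, event, x, _ in recs:
            if not event:
                continue                           # censored: no term
            risk = [r for r in recs if r[0] >= t]  # at risk at time t
            ll += beta * x - math.log(
                sum(math.exp(beta * r[2]) for r in risk))
    return ll
```

    Maximizing the stratified version over beta gives a hazard ratio estimate that is robust to arbitrary differences in the cohorts' baseline hazards, which is why stratification is a flexible default when pooling cohorts.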

  19. Combining U.S.-based prioritization tools to improve screening level accountability for environmental impact: the case of the chemical manufacturing industry.

    PubMed

    Zhou, Xiaoying; Schoenung, Julie M

    2009-12-15

    There are two quantitative indicators that are most widely used to assess the extent of compliance of industrial facilities with environmental regulations: the quantity of hazardous waste generated and the amount of toxics released. These indicators, albeit useful in terms of some environmental monitoring, fail to account for direct or indirect effects on human and environmental health, especially when aggregating total quantity of releases for a facility or industry sector. Thus, there is a need for a more comprehensive approach that can prioritize a particular chemical (or industry sector) on the basis of its relevant environmental performance and impact on human health. Accordingly, the objective of the present study is to formulate an aggregation of tools that can simultaneously capture multiple effects and several environmental impact categories. This approach allows us to compare and combine results generated with the aid of select U.S.-based quantitative impact assessment tools, thereby supplementing compliance-based metrics such as data from the U.S. Toxic Release Inventory. A case study, which presents findings for the U.S. chemical manufacturing industry, is presented to illustrate the aggregation of these tools. Environmental impacts due to both upstream and manufacturing activities are also evaluated for each industry sector. The proposed combinatorial analysis allows for a more robust evaluation for rating and prioritizing the environmental impacts of industrial waste.

  20. Branch-Based Model for the Diameters of the Pulmonary Airways: Accounting for Departures From Self-Consistency and Registration Errors

    SciTech Connect

    Neradilek, Moni B.; Polissar, Nayak L.; Einstein, Daniel R.; Glenny, Robb W.; Minard, Kevin R.; Carson, James P.; Jiao, Xiangmin; Jacob, Richard E.; Cox, Timothy C.; Postlethwait, Edward M.; Corley, Richard A.

    2012-04-24

    We examine a previously published branch-based approach to modeling airway diameters that is predicated on the assumption of self-consistency across all levels of the tree. We mathematically formulate this assumption, propose a method to test it, and develop a more general model to be used when the assumption is violated. We discuss the effect of measurement error on the estimated models and propose methods that account for it. The methods are illustrated on data from MRI and CT images of silicone casts of two rats, two normal monkeys and one ozone-exposed monkey. Our results showed substantial departures from self-consistency in all five subjects. When departures from self-consistency exist we do not recommend using the self-consistency model, even as an approximation, as we have shown that it is likely to lead to an incorrect representation of the diameter geometry. Measurement error has an important impact on the estimated morphometry models and needs to be accounted for in the analysis.

  1. Sensor-independent approach to recognition: the object-based approach

    NASA Astrophysics Data System (ADS)

    Morrow, Jim C.; Hossain, Sqama

    1994-03-01

    This paper introduces a fundamentally different approach to recognition -- the object-based approach -- which is inherently knowledge-based and sensor independent. The paper begins with a description of an object-based recognition system, contrasting it with the image-based approach. Next, the multilevel stage of the system, incorporating several sensor data sources, is described. From these sources, elements of the situation hypothesis are generated as directed by the recognition goal. Depending on the degree of correspondence between the sensor-fed elements and the object-model-fed elements, a hypothetical element is created. The hypothetical element is further employed to develop evidence for the sensor-fed element through the inclusion of secondary sensor outputs. The sensor-fed element is thus modeled in more detail, and further evidence is added to the hypothetical element. Several levels of reasoning and data integration are involved in this overall process; further, a self-adjusting correction mechanism is included through the feedback from the hypothetical element to the sensors, thus defining secondary output connections to the sensor-fed element. Some preliminary work based on this approach has been carried out and initial results show improvements over the conventional image-based approach.

  2. [Global brain metastases management strategy: a multidisciplinary-based approach].

    PubMed

    Métellus, P; Tallet, A; Dhermain, F; Reyns, N; Carpentier, A; Spano, J-P; Azria, D; Noël, G; Barlési, F; Taillibert, S; Le Rhun, É

    2015-02-01

    Brain metastases management has evolved over the last fifteen years and may use varying strategies, including more or less aggressive treatments, sometimes combined, leading to an improvement in patient's survival and quality of life. The therapeutic decision is subject to a multidisciplinary analysis, taking into account established prognostic factors including patient's general condition, extracerebral disease status and clinical and radiological presentation of lesions. In this article, we propose a management strategy based on the state of current knowledge and available therapeutic resources.

  3. A relaxation-based approach to damage modeling

    NASA Astrophysics Data System (ADS)

    Junker, Philipp; Schwarz, Stephan; Makowski, Jerzy; Hackl, Klaus

    2017-01-01

    Material models that include softening effects due to, for example, damage and localizations share the problem of ill-posed boundary value problems that yield mesh-dependent finite element results. It is thus necessary to apply regularization techniques that couple the local behavior, described for example by internal variables, at a spatial level, for instance by taking account of the gradient of the internal variable, to yield mesh-independent finite element results. In this paper, we present a new approach to damage modeling that does not use common field functions, inclusion of gradients or complex integration techniques: appropriate modifications of the relaxed (condensed) energy offer the same advantages as other methods, but with much less numerical effort. We start with the theoretical derivation and then discuss the numerical treatment. Finally, we present finite element results that demonstrate empirically how the new approach works.

  4. An agent-based simulation model of patient choice of health care providers in accountable care organizations.

    PubMed

    Alibrahim, Abdullah; Wu, Shinyi

    2016-10-04

    Accountable care organizations (ACO) in the United States show promise in controlling health care costs while preserving patients' choice of providers. Understanding the effects of patient choice is critical in novel payment and delivery models like ACO that depend on continuity of care and accountability. The financial, utilization, and behavioral implications associated with a patient's decision to forego local health care providers for more distant ones to access higher quality care remain unknown. To study this question, we used an agent-based simulation model of a health care market composed of providers able to form ACO serving patients and embedded it in a conditional logit decision model to examine patients capable of choosing their care providers. This simulation focuses on Medicare beneficiaries and their congestive heart failure (CHF) outcomes. We place the patient agents in an ACO delivery system model in which provider agents decide if they remain in an ACO and perform a quality improving CHF disease management intervention. Illustrative results show that allowing patients to choose their providers reduces the yearly payment per CHF patient by $320, reduces mortality rates by 0.12 percentage points and hospitalization rates by 0.44 percentage points, and marginally increases provider participation in ACO. This study demonstrates a model capable of quantifying the effects of patient choice in a theoretical ACO system and provides a potential tool for policymakers to understand implications of patient choice and assess potential policy controls.
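The conditional logit choice component described above can be sketched as follows. The providers, their attributes, and the utility coefficients are hypothetical, not the study's calibrated values:

```python
import math, random

random.seed(7)

# Hypothetical providers: (name, distance in miles, quality score).
providers = [("local", 2.0, 0.60), ("regional", 15.0, 0.75), ("distant", 40.0, 0.90)]
BETA_DIST, BETA_QUAL = -0.05, 4.0   # disutility of travel, utility of quality (assumed)

def choice_probabilities(providers):
    """Conditional logit: P(choose j) = exp(V_j) / sum_k exp(V_k)."""
    utils = [BETA_DIST * d + BETA_QUAL * q for _, d, q in providers]
    m = max(utils)                          # subtract max for numerical stability
    expu = [math.exp(u - m) for u in utils]
    z = sum(expu)
    return [e / z for e in expu]

probs = choice_probabilities(providers)
for (name, _, _), p in zip(providers, probs):
    print(f"{name}: {p:.3f}")

# One simulated patient agent draws a provider according to these probabilities.
chosen = random.choices([name for name, _, _ in providers], weights=probs)[0]
print("chosen:", chosen)
```

In an agent-based run, each patient agent would redraw these probabilities as provider quality and ACO membership evolve.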

  5. Dependency Resolution Difficulty Increases with Distance in Persian Separable Complex Predicates: Evidence for Expectation and Memory-Based Accounts.

    PubMed

    Safavi, Molood S; Husain, Samar; Vasishth, Shravan

    2016-01-01

    Delaying the appearance of a verb in a noun-verb dependency tends to increase processing difficulty at the verb; one explanation for this locality effect is decay and/or interference of the noun in working memory. Surprisal, an expectation-based account, predicts that delaying the appearance of a verb either renders it no more predictable or more predictable, leading respectively to a prediction of no effect of distance or a facilitation. Recently, Husain et al. (2014) suggested that when the exact identity of the upcoming verb is predictable (strong predictability), increasing argument-verb distance leads to facilitation effects, which is consistent with surprisal; but when the exact identity of the upcoming verb is not predictable (weak predictability), locality effects are seen. We investigated Husain et al.'s proposal using Persian complex predicates (CPs), which consist of a non-verbal element (a noun in the current study) and a verb. In CPs, once the noun has been read, the exact identity of the verb is highly predictable (strong predictability); this was confirmed using a sentence completion study. In two self-paced reading (SPR) and two eye-tracking (ET) experiments, we delayed the appearance of the verb by interposing a relative clause (Experiments 1 and 3) or a long PP (Experiments 2 and 4). We also included a simple Noun-Verb predicate configuration with the same distance manipulation; here, the exact identity of the verb was not predictable (weak predictability). Thus, the design crossed Predictability Strength and Distance. We found that, consistent with surprisal, the verb in the strong predictability conditions was read faster than in the weak predictability conditions. Furthermore, greater verb-argument distance led to slower reading times; strong predictability did not neutralize or attenuate the locality effects. As regards the effect of distance on dependency resolution difficulty, these four experiments present evidence in favor of working memory
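The surprisal account can be illustrated numerically: surprisal is the negative log probability of a word given its context, so a highly predictable continuation (like the verb after a CP noun) carries low surprisal. A toy bigram model over an invented mini-corpus:

```python
import math
from collections import Counter

# Invented mini-corpus, purely for illustration of the surprisal computation.
corpus = "the cook gave a kiss the cook gave a hug the cook took a nap".split()
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus[:-1])

def surprisal(context, word):
    """Surprisal in bits: -log2 P(word | context) under the bigram model."""
    return -math.log2(bigrams[(context, word)] / unigrams[context])

print(surprisal("gave", "a"))     # fully predictable continuation -> 0 bits
print(surprisal("cook", "took"))  # less predictable -> higher surprisal
```

Real studies estimate these probabilities from large corpora or language models; the mechanics are the same.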

  6. Dependency Resolution Difficulty Increases with Distance in Persian Separable Complex Predicates: Evidence for Expectation and Memory-Based Accounts

    PubMed Central

    Safavi, Molood S.; Husain, Samar; Vasishth, Shravan

    2016-01-01

    Delaying the appearance of a verb in a noun-verb dependency tends to increase processing difficulty at the verb; one explanation for this locality effect is decay and/or interference of the noun in working memory. Surprisal, an expectation-based account, predicts that delaying the appearance of a verb either renders it no more predictable or more predictable, leading respectively to a prediction of no effect of distance or a facilitation. Recently, Husain et al. (2014) suggested that when the exact identity of the upcoming verb is predictable (strong predictability), increasing argument-verb distance leads to facilitation effects, which is consistent with surprisal; but when the exact identity of the upcoming verb is not predictable (weak predictability), locality effects are seen. We investigated Husain et al.'s proposal using Persian complex predicates (CPs), which consist of a non-verbal element—a noun in the current study—and a verb. In CPs, once the noun has been read, the exact identity of the verb is highly predictable (strong predictability); this was confirmed using a sentence completion study. In two self-paced reading (SPR) and two eye-tracking (ET) experiments, we delayed the appearance of the verb by interposing a relative clause (Experiments 1 and 3) or a long PP (Experiments 2 and 4). We also included a simple Noun-Verb predicate configuration with the same distance manipulation; here, the exact identity of the verb was not predictable (weak predictability). Thus, the design crossed Predictability Strength and Distance. We found that, consistent with surprisal, the verb in the strong predictability conditions was read faster than in the weak predictability conditions. Furthermore, greater verb-argument distance led to slower reading times; strong predictability did not neutralize or attenuate the locality effects. As regards the effect of distance on dependency resolution difficulty, these four experiments present evidence in favor of working

  7. An Efficient Deterministic Approach to Model-based Prediction Uncertainty Estimation

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew J.; Saxena, Abhinav; Goebel, Kai

    2012-01-01

    Prognostics deals with the prediction of the end of life (EOL) of a system. EOL is a random variable, due to the presence of process noise and uncertainty in the future inputs to the system. Prognostics algorithms must account for this inherent uncertainty. In addition, these algorithms never know the exact state of the system at the desired time of prediction, or the exact model describing the future evolution of the system, which accumulates additional uncertainty into the predicted EOL. Prediction algorithms that do not account for these sources of uncertainty misrepresent the EOL and can lead to poor decisions based on their results. In this paper, we explore the impact of uncertainty in the prediction problem. We develop a general model-based prediction algorithm that incorporates these sources of uncertainty, and propose a novel approach to efficiently handle uncertainty in the future input trajectories of a system by using the unscented transformation. Using this approach, we are not only able to reduce the computational load but also to estimate the bounds of uncertainty in a deterministic manner, which can be useful during decision-making. Using a lithium-ion battery as a case study, we perform several simulation-based experiments to explore these issues, and validate the overall approach using experimental data from a battery testbed.
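A minimal sketch of the unscented transformation used to propagate a Gaussian deterministically through a nonlinear function; the example function, mean, and covariance are illustrative, not the paper's battery model:

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1.0, beta=2.0, kappa=0.0):
    """Propagate (mean, cov) through nonlinear f via 2n+1 sigma points."""
    n = len(mean)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)   # columns give symmetric perturbations
    sigma = [mean] + [mean + S[:, i] for i in range(n)] + [mean - S[:, i] for i in range(n)]
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1.0 - alpha**2 + beta)
    Y = np.array([f(s) for s in sigma])       # deterministic evaluations, no sampling
    mean_y = wm @ Y
    dy = Y - mean_y
    cov_y = (wc[:, None] * dy).T @ dy
    return mean_y, cov_y

# Example: a mildly nonlinear map of a 2-D Gaussian.
f = lambda x: np.array([x[0]**2 + x[1], np.sin(x[1])])
m, P = np.array([1.0, 0.5]), np.diag([0.01, 0.02])
my, Py = unscented_transform(m, P, f)
print(my, Py)
```

Only 2n+1 function evaluations are needed, which is the computational saving over Monte Carlo sampling that the abstract alludes to.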

  8. Accounting Specialist.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This publication identifies 20 subjects appropriate for use in a competency list for the occupation of accounting specialist, 1 of 12 occupations within the business/computer technologies cluster. Each unit consists of a number of competencies; a list of competency builders is provided for each competency. Titles of the 20 units are as follows:…

  9. Painless Accountability.

    ERIC Educational Resources Information Center

    Brown, R. W.; And Others

    The computerized Painless Accountability System is a performance objective system from which instructional programs are developed. Three main simplified behavioral response levels characterize this system: (1) cognitive, (2) psychomotor, and (3) affective domains. Each of these objectives are classified by one of 16 descriptors. The second major…

  10. Accountability Overboard

    ERIC Educational Resources Information Center

    Chieppo, Charles D.; Gass, James T.

    2009-01-01

    This article reports that special interest groups opposed to charter schools and high-stakes testing have hijacked Massachusetts's once-independent board of education and stand poised to water down the Massachusetts Comprehensive Assessment System (MCAS) tests and the accountability system they support. President Barack Obama and Massachusetts…

  11. Component-Based Approach in Learning Management System Development

    ERIC Educational Resources Information Center

    Zaitseva, Larisa; Bule, Jekaterina; Makarov, Sergey

    2013-01-01

    The paper describes a component-based approach (CBA) to learning management system development. Learning objects as components of e-learning courses, and their metadata, are considered. The architecture of a learning management system based on CBA being developed at Riga Technical University, namely its elements and possibilities, is…

  12. Integration of Task-Based Approaches in a TESOL Course

    ERIC Educational Resources Information Center

    Chien, Chin-Wen

    2014-01-01

    Under task-based language teaching (TBLT), language learners engage in purposeful, problem-oriented, and outcome-driven tasks that are comparable to real-world activities. This qualitative case study discusses the integration of a task-based approach into a TESOL course in a language teacher education program in Taiwan with regard to 39…

  13. The use of mindfulness-based approaches for suicidal patients.

    PubMed

    Williams, J Mark G; Swales, Michaela

    2004-01-01

    Mindfulness-based approaches are becoming more widely used for individuals at risk of suicidal behavior: in the treatment of borderline personality disorder (in Dialectical Behavior Therapy), and as a way to reduce relapse in recurrent major depression (in Mindfulness-based Cognitive Therapy). This article describes and examines the commonalities and differences in the use of mindfulness in these two treatments. The reasons for considering the use of mindfulness-based approaches with suicidal individuals more widely are considered and potential risks outlined. The article closes with case examples to illustrate the use of mindfulness in the treatment of suicidal thoughts and behaviors.

  14. System identification based approach to dynamic weighing revisited

    NASA Astrophysics Data System (ADS)

    Niedźwiecki, Maciej; Meller, Michał; Pietrzak, Przemysław

    2016-12-01

    Dynamic weighing, i.e., weighing of objects in motion, without stopping them on the weighing platform, allows one to increase the rate of operation of automatic weighing systems, used in industrial production processes, without compromising their accuracy. Since the classical identification-based approach to dynamic weighing, based on the second-order mass-spring-damper model of the weighing system, does not yield satisfactory results when applied to conveyor belt type checkweighers, several extensions of this technique are examined. Experiments confirm that when appropriately modified the identification-based approach becomes a reliable tool for dynamic mass measurement in checkweighers.
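The identification-based idea can be illustrated with a toy second-order mass-spring-damper weighing model: simulate the unsettled transient, then recover the unknown mass by fitting the model to it. All parameter values are invented, and the plain least-squares fit on finite-difference derivatives is a simplification, not the authors' extended techniques:

```python
import numpy as np

# Illustrative platform model: (m + w)*x'' + c*x' + k*x = w*g, w = unknown object mass.
g, dt = 9.81, 1e-3
m_plat, c, k, w_true = 0.5, 8.0, 2000.0, 1.2   # assumed plant parameters

n = 2000
x = np.zeros(n); v = 0.0
for i in range(1, n):                  # semi-implicit Euler simulation of the transient
    a = (w_true * g - c * v - k * x[i - 1]) / (m_plat + w_true)
    v += a * dt
    x[i] = x[i - 1] + v * dt

# Identification: fit x'' = p0 + p1*x + p2*x' by least squares on the
# still-oscillating signal, then recover the weight from the fitted coefficients.
xd = np.gradient(x, dt)
xdd = np.gradient(xd, dt)
A = np.column_stack([np.ones(n), x, xd])
p, *_ = np.linalg.lstsq(A, xdd, rcond=None)
# p0 = w*g/(m+w), p1 = -k/(m+w)  =>  m+w = -k/p1, then w = p0*(m+w)/g
total = -k / p[1]
w_est = p[0] * total / g
print(f"estimated weight: {w_est:.3f} kg (true {w_true} kg)")
```

The point of dynamic weighing is visible here: the mass is recovered from the transient, without waiting for the platform to settle.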

  15. Would GAAP - Based Accounting Practices Improve Financial Management and Decision-Making in the Department of Defense?

    DTIC Science & Technology

    1990-08-01

    Accounting Procedures Act of 1950 and the FMFIA are carried out are OMB Circular A-123, Internal Control Systems, and OMB Circular A-127, Financial...Office. Standards For Internal Controls In The Federal Government, Accounting Series, 1983. U.S. Office of Management and Budget. OMB Circular A-123...defined by FASB 125 for the...Young, Ronald S., Director, Accounting Principles and Standards Group, Accounting and Financial Management Division, U.S.

  16. Cluster Guide. Accounting Occupations.

    ERIC Educational Resources Information Center

    Beaverton School District 48, OR.

    Based on a recent task inventory of key occupations in the accounting cluster taken in the Portland, Oregon, area, this curriculum guide is intended to assist administrators and teachers in the design and implementation of high school accounting cluster programs. The guide is divided into four major sections: program organization and…

  17. THE FUTURE OF COMPUTER-BASED TOXICITY PREDICTION: MECHANISM-BASED MODELS VS. INFORMATION MINING APPROACHES

    EPA Science Inventory


    The Future of Computer-Based Toxicity Prediction:
    Mechanism-Based Models vs. Information Mining Approaches

    When we speak of computer-based toxicity prediction, we are generally referring to a broad array of approaches which rely primarily upon chemical structure ...

  18. Biomarker Discovery by Novel Sensors Based on Nanoproteomics Approaches

    PubMed Central

    Dasilva, Noelia; Díez, Paula; Matarraz, Sergio; González-González, María; Paradinas, Sara; Orfao, Alberto; Fuentes, Manuel

    2012-01-01

    During the last years, proteomics has facilitated biomarker discovery by coupling high-throughput techniques with novel nanosensors. In the present review, we focus on the study of label-based and label-free detection systems, as well as nanotechnology approaches, indicating their advantages and applications in biomarker discovery. In addition, several disease biomarkers are shown in order to display the clinical importance of the improvement of sensitivity and selectivity by using nanoproteomics approaches as novel sensors. PMID:22438764

  19. Healthcare information system approaches based on middleware concepts.

    PubMed

    Holena, M; Blobel, B

    1997-01-01

    To meet the challenges for efficient and high-level quality, health care systems must implement the "Shared Care" paradigm of distributed co-operating systems. To this end, both the newly developed and legacy applications must be fully integrated into the care process. These requirements can be fulfilled by information systems based on middleware concepts. In the paper, the middleware approaches HL7, DHE, and CORBA are described. The relevance of those approaches to the healthcare domain is documented. The description presented here is complemented through two other papers in this volume, concentrating on the evaluation of the approaches, and on their security threats and solutions.

  20. A Link-Based Approach to the Cluster Ensemble Problem.

    PubMed

    Iam-On, Natthakan; Boongoen, Tossapon; Garrett, Simon; Price, Chris

    2011-12-01

    Cluster ensembles have recently emerged as a powerful alternative to standard cluster analysis, aggregating several input data clusterings to generate a single output clustering, with improved robustness and stability. From the early work, these techniques held great promise; however, most of them generate the final solution based on incomplete information of a cluster ensemble. The underlying ensemble-information matrix reflects only cluster-data point relations, while those among clusters are generally overlooked. This paper presents a new link-based approach to improve the conventional matrix. It achieves this using the similarity between clusters that are estimated from a link network model of the ensemble. In particular, three new link-based algorithms are proposed for the underlying similarity assessment. The final clustering result is generated from the refined matrix using two different consensus functions of feature-based and graph-based partitioning. This approach is the first to address and explicitly employ the relationship between input partitions, which has not been emphasized by recent studies of matrix refinement. The effectiveness of the link-based approach is empirically demonstrated over 10 data sets (synthetic and real) and three benchmark evaluation measures. The results suggest the new approach is able to efficiently extract information embedded in the input clusterings, and regularly illustrate higher clustering quality in comparison to several state-of-the-art techniques.
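As a baseline for the matrix the paper refines, the conventional co-association (ensemble-information) matrix can be sketched as follows. The input partitions are toy data, and the paper's link-based cluster-to-cluster similarity refinement is not reproduced here:

```python
import numpy as np

# Three hypothetical input clusterings (a label per data point) of 6 points.
partitions = [
    [0, 0, 0, 1, 1, 1],
    [0, 0, 1, 1, 2, 2],
    [0, 0, 0, 0, 1, 1],
]

def coassociation(partitions):
    """Co-association matrix: fraction of input clusterings in which each
    pair of points lands in the same cluster (cluster-cluster relations,
    which the link-based approach adds, are ignored here)."""
    n = len(partitions[0])
    M = np.zeros((n, n))
    for labels in partitions:
        L = np.asarray(labels)
        M += (L[:, None] == L[None, :]).astype(float)
    return M / len(partitions)

M = coassociation(partitions)
print(np.round(M, 2))
```

A consensus clustering is then obtained by partitioning this matrix; the paper's contribution is to raise the zero and low entries using link similarity between clusters before that step.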

  1. Mindfulness-based approaches: are they all the same?

    PubMed

    Chiesa, Alberto; Malinowski, Peter

    2011-04-01

    Mindfulness-based approaches are increasingly employed as interventions for treating a variety of psychological, psychiatric and physical problems. Such approaches include ancient Buddhist mindfulness meditations such as Vipassana and Zen meditations, modern group-based standardized meditations, such as mindfulness-based stress reduction and mindfulness-based cognitive therapy, and further psychological interventions, such as dialectical behavioral therapy and acceptance and commitment therapy. We review commonalities and differences of these interventions regarding philosophical background, main techniques, aims, outcomes, neurobiology and psychological mechanisms. In sum, the currently applied mindfulness-based interventions show large differences in the way mindfulness is conceptualized and practiced. The decision to consider such practices as unitary or as distinct phenomena will probably influence the direction of future research.

  2. An Open Science Approach to Gis-Based Paleoenvironment Data

    NASA Astrophysics Data System (ADS)

    Willmes, C.; Becker, D.; Verheul, J.; Yener, Y.; Zickel, M.; Bolten, A.; Bubenzer, O.; Bareth, G.

    2016-06-01

    Paleoenvironmental studies and the corresponding information (data) are abundantly published and available in the scientific record. However, GIS-based paleoenvironmental information and datasets are comparably rare. Here, we present an Open Science approach for creating GIS-based data and maps of paleoenvironments and publishing them Open Access in a web-based Spatial Data Infrastructure (SDI), for access by the archaeology and paleoenvironment communities. We introduce an approach to gather and create GIS datasets from published non-GIS-based facts and information (data), such as analogue maps, textual information or figures in scientific publications. These collected and created geo-datasets and maps are then published, including a Digital Object Identifier (DOI) to facilitate scholarly reuse and citation of the data, in a web-based Open Access Research Data Management Infrastructure. The geo-datasets are additionally published in an Open Geospatial Consortium (OGC) standards-compliant SDI and are available for GIS integration via OGC Open Web Services (OWS).

  3. How do control-based approaches enter into biology?

    PubMed

    LeDuc, Philip R; Messner, William C; Wikswo, John P

    2011-08-15

    Control is intrinsic to biological organisms, whose cells are in a constant state of sensing and response to numerous external and self-generated stimuli. Diverse means are used to study the complexity through control-based approaches in these cellular systems, including through chemical and genetic manipulations, input-output methodologies, feedback approaches, and feed-forward approaches. We first discuss what happens in control-based approaches when we are not actively examining or manipulating cells. We then present potential methods to determine what the cell is doing during these times and to reverse-engineer the cellular system. Finally, we discuss how we can control the cell's extracellular and intracellular environments, both to probe the response of the cells using defined experimental engineering-based technologies and to anticipate what might be achieved by applying control-based approaches to affect cellular processes. Much work remains to apply simplified control models and develop new technologies to aid researchers in studying and utilizing cellular and molecular processes.

  4. A Market-Based Approach to Multi-factory Scheduling

    NASA Astrophysics Data System (ADS)

    Vytelingum, Perukrishnen; Rogers, Alex; MacBeth, Douglas K.; Dutta, Partha; Stranjak, Armin; Jennings, Nicholas R.

    In this paper, we report on the design of a novel market-based approach for decentralised scheduling across multiple factories. Specifically, because of the limitations of scheduling in a centralised manner - which requires a center to have complete and perfect information for optimality and the truthful revelation of potentially commercially private preferences to that center - we advocate an informationally decentralised approach that is both agile and dynamic. In particular, this work adopts a market-based approach for decentralised scheduling by considering the different stakeholders representing different factories as self-interested, profit-motivated economic agents that trade resources for the scheduling of jobs. The overall schedule of these jobs is then an emergent behaviour of the strategic interaction of these trading agents bidding for resources in a market based on limited information and their own preferences. Using a simple (zero-intelligence) bidding strategy, we empirically demonstrate that our market-based approach achieves a lower bound efficiency of 84%. This represents a trade-off between a reasonable level of efficiency (compared to a centralised approach) and the desirable benefits of a decentralised solution.
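The zero-intelligence bidding idea can be sketched with a toy double auction in which buyers (jobs) and sellers (factory resources) quote random prices within their private limits. All numbers are illustrative, and this is a generic ZI market, not the authors' specific mechanism:

```python
import random

random.seed(42)

def zi_double_auction(buyer_limits, seller_costs, rounds=1000):
    """Zero-intelligence traders: each round one buyer and one seller quote
    uniformly random prices within their private limits; crossing quotes trade."""
    trades = []
    for _ in range(rounds):
        b = random.choice(buyer_limits)      # buyer's private limit price
        s = random.choice(seller_costs)      # seller's private cost
        bid = random.uniform(0, b)           # ZI buyer: any price up to its limit
        ask = random.uniform(s, 2 * max(buyer_limits))  # ZI seller: any price above cost
        if bid >= ask:                       # crossing quotes clear at the midpoint
            trades.append(((bid + ask) / 2, b - s))
    return trades

trades = zi_double_auction([10, 12, 15], [4, 6, 9])
surplus = sum(gain for _, gain in trades)
print(f"{len(trades)} trades, total surplus {surplus:.1f}")
```

Because trades only occur when bid >= ask, every executed trade has non-negative surplus; efficiency is then the realized fraction of the surplus a centralised scheduler could extract.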

  5. E-Learning Personalization Using Triple-Factor Approach in Standard-Based Education

    NASA Astrophysics Data System (ADS)

    Laksitowening, K. A.; Santoso, H. B.; Hasibuan, Z. A.

    2017-01-01

    E-Learning can be a tool for monitoring the learning process and progress towards a targeted competency. Process and progress can differ from one learner to another, since every learner may have a different learning type. The learning type itself can be identified by taking into account learning style, motivation, and knowledge ability. This study explores personalization of learning type based on the Triple-Factor Approach. Considering that the factors in the Triple-Factor Approach are dynamic, the personalization system needs to accommodate the changes that may occur. Motivated by this issue, this study proposes personalization that guides learner progression dynamically through the stages of the learning process. The personalization is implemented in the form of interventions that trigger learners to access learning contents and discussion forums more often, as well as to improve their level of knowledge ability based on their state of learning type.

  6. Grid-based electronic structure calculations: The tensor decomposition approach

    SciTech Connect

    Rakhuba, M.V.; Oseledets, I.V.

    2016-05-01

    We present a fully grid-based approach for solving Hartree–Fock and all-electron Kohn–Sham equations based on a low-rank approximation of the three-dimensional electron orbitals. Due to the low-rank structure, the total complexity of the algorithm scales linearly with the one-dimensional grid size. Linear complexity allows the use of fine grids, e.g. 8192³, and thus a cheap extrapolation procedure. We test the proposed approach on closed-shell atoms up to argon, several molecules, and clusters of hydrogen atoms. All tests show systematic convergence to the required accuracy.
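A toy analogue of the grid-based low-rank idea: sample a smooth, orbital-like function on a 3-D grid and compress it with a truncated HOSVD (Tucker) decomposition. The grid size, ranks, and test function are illustrative; the paper's own solver uses far finer grids and tensor formats tailored to the eigenproblem:

```python
import numpy as np

n, r = 32, 6                                  # coarse grid and small rank, for illustration
t = np.linspace(-5, 5, n)
X, Y, Z = np.meshgrid(t, t, t, indexing="ij")
F = np.exp(-np.sqrt(X**2 + Y**2 + Z**2))      # hydrogen-like 1s orbital shape

def tucker_compress(F, r):
    """Truncated HOSVD: factor matrices from mode unfoldings, then the core."""
    U = []
    for mode in range(3):
        M = np.moveaxis(F, mode, 0).reshape(F.shape[mode], -1)
        u, _, _ = np.linalg.svd(M, full_matrices=False)
        U.append(u[:, :r])
    G = np.einsum("ijk,ia,jb,kc->abc", F, U[0], U[1], U[2])   # r x r x r core
    return G, U

G, U = tucker_compress(F, r)
F_hat = np.einsum("abc,ia,jb,kc->ijk", G, U[0], U[1], U[2])
rel_err = np.linalg.norm(F - F_hat) / np.linalg.norm(F)
print(f"storage: {G.size + 3 * n * r} numbers vs {F.size}; relative error {rel_err:.2e}")
```

Storage and per-operation cost in the compressed format grow linearly in n (via the n-by-r factors) rather than cubically, which is the source of the linear scaling claimed in the abstract.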

  7. Use of the ‘Accountability for Reasonableness’ Approach to Improve Fairness in Accessing Dialysis in a Middle-Income Country

    PubMed Central

    Maree, Jonathan David; Chirehwa, Maxwell T.; Benatar, Solomon R.

    2016-01-01

    Universal access to renal replacement therapy is beyond the economic capability of most low and middle-income countries due to large patient numbers and the high recurrent cost of treating end stage kidney disease. In countries where limited access is available, no systems exist that allow for optimal use of the scarce dialysis facilities. We previously reported that using national guidelines to select patients for renal replacement therapy resulted in biased allocation. We reengineered selection guidelines using the ‘Accountability for Reasonableness’ (procedural fairness) framework in collaboration with relevant stakeholders, applying these in a novel way to categorize and prioritize patients in a unique hierarchical fashion. The guidelines were primarily premised on patients being transplantable. We examined whether the revised guidelines enhanced fairness of dialysis resource allocation. This is a descriptive study of 1101 end stage kidney failure patients presenting to a tertiary renal unit in a middle-income country, evaluated for dialysis treatment over a seven-year period. The Assessment Committee used the accountability for reasonableness-based guidelines to allocate patients to one of three assessment groups. Category 1 patients were guaranteed renal replacement therapy, Category 3 patients were palliated, and Category 2 were offered treatment if resources allowed. Only 25.2% of all end stage kidney disease patients assessed were accepted for renal replacement treatment. The majority of patients (48%) were allocated to Category 2. Of 134 Category 1 patients, 98% were accepted for treatment while 438 (99.5%) Category 3 patients were excluded. Compared with those palliated, patients accepted for dialysis treatment were almost 10 years younger, employed, married with children and not diabetic. Compared with our previous selection process our current method of priority setting based on procedural fairness arguably resulted in more equitable allocation of

  8. A modified Shockley equation taking into account the multi-element nature of light emitting diodes based on nanowire ensembles.

    PubMed

    Musolino, M; Tahraoui, A; Treeck, D van; Geelhaar, L; Riechert, H

    2016-07-08

    In this work we study how the multi-element nature of light emitting diodes (LEDs) based on nanowire (NW) ensembles influences their current voltage (I-V) characteristics. We systematically address critical issues of the fabrication process that can result in significant fluctuations of the electrical properties among the individual NWs in such LEDs, paying particular attention to the planarization step. Electroluminescence (EL) maps acquired for two nominally identical NW-LEDs reveal that small processing variations can result in a large difference in the number of individual nano-devices emitting EL. The lower number of EL spots in one of the LEDs is caused by its inhomogeneous electrical properties. The I-V characteristics of this LED cannot be described well by the classical Shockley model. We are able to take into account the multi-element nature of such LEDs and fit the I-V characteristics in the forward bias regime by employing an ad hoc adjusted version of the Shockley equation. More specifically, we introduce a bias dependence of the ideality factor. The basic considerations of our model should remain valid also for other types of devices based on ensembles of interconnected p-n junctions with inhomogeneous electrical properties, regardless of the employed material system.
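The abstract above states only that a bias-dependent ideality factor is introduced into the Shockley equation, without giving its functional form. The sketch below compares the classical Shockley diode equation with a hypothetical linear form n(V) = n0 + alpha·V, purely to illustrate how a growing ideality factor flattens the forward I-V curve of an ensemble of inhomogeneous junctions; the parameter values and the linear form are assumptions, not the paper's model.

```python
import math

def shockley_current(v, i_s=1e-12, n=2.0, t=300.0):
    """Classical Shockley diode equation I = I_s * (exp(qV / (n*k*T)) - 1)."""
    vt = 1.380649e-23 * t / 1.602176634e-19  # thermal voltage kT/q, ~25.9 mV at 300 K
    return i_s * (math.exp(v / (n * vt)) - 1.0)

def modified_shockley_current(v, i_s=1e-12, n0=2.0, alpha=1.5, t=300.0):
    """Same equation with a bias-dependent ideality factor n(V) = n0 + alpha*V.
    The linear dependence is an illustrative assumption; the abstract only
    says that n depends on bias."""
    vt = 1.380649e-23 * t / 1.602176634e-19
    n_eff = n0 + alpha * v
    return i_s * (math.exp(v / (n_eff * vt)) - 1.0)

# At higher forward bias the growing ideality factor suppresses the current
# relative to the classical model, mimicking an ensemble of parallel p-n
# junctions that turn on at slightly different voltages.
for v in (0.5, 1.0, 1.5):
    print(v, shockley_current(v), modified_shockley_current(v))
```

The qualitative point is that a single fixed-n Shockley fit cannot reproduce this flattening, which is why the authors report the bias-dependent form.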

  9. A modified Shockley equation taking into account the multi-element nature of light emitting diodes based on nanowire ensembles

    NASA Astrophysics Data System (ADS)

    Musolino, M.; Tahraoui, A.; van Treeck, D.; Geelhaar, L.; Riechert, H.

    2016-07-01

    In this work we study how the multi-element nature of light emitting diodes (LEDs) based on nanowire (NW) ensembles influences their current voltage (I-V) characteristics. We systematically address critical issues of the fabrication process that can result in significant fluctuations of the electrical properties among the individual NWs in such LEDs, paying particular attention to the planarization step. Electroluminescence (EL) maps acquired for two nominally identical NW-LEDs reveal that small processing variations can result in a large difference in the number of individual nano-devices emitting EL. The lower number of EL spots in one of the LEDs is caused by its inhomogeneous electrical properties. The I-V characteristics of this LED cannot be described well by the classical Shockley model. We are able to take into account the multi-element nature of such LEDs and fit the I-V characteristics in the forward bias regime by employing an ad hoc adjusted version of the Shockley equation. More specifically, we introduce a bias dependence of the ideality factor. The basic considerations of our model should remain valid also for other types of devices based on ensembles of interconnected p-n junctions with inhomogeneous electrical properties, regardless of the employed material system.

  10. A proven knowledge-based approach to prioritizing process information

    NASA Technical Reports Server (NTRS)

    Corsberg, Daniel R.

    1991-01-01

    Many space-related processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect is rapid analysis of the changing process information. During a disturbance, this task can overwhelm humans as well as computers. Humans deal with this by applying heuristics in determining significant information. A simple, knowledge-based approach to prioritizing information is described. The approach models those heuristics that humans would use in similar circumstances. The approach described has received two patents and was implemented in the Alarm Filtering System (AFS) at the Idaho National Engineering Laboratory (INEL). AFS was first developed for application in a nuclear reactor control room. It has since been used in chemical processing applications, where it has had a significant impact on control room environments. The approach uses knowledge-based heuristics to analyze data from process instrumentation and respond to that data according to knowledge encapsulated in objects and rules. While AFS cannot perform the complete diagnosis and control task, it has proven to be extremely effective at filtering and prioritizing information. AFS was used for over two years as a first level of analysis for human diagnosticians. Given the approach's proven track record in a wide variety of practical applications, it should be useful in both ground- and space-based systems.
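The AFS internals are not described beyond "knowledge encapsulated in objects and rules," so the following is only a minimal sketch of the kind of heuristic alarm prioritization the abstract describes: suppressing alarms that are expected consequences of another active alarm and boosting safety-related tags. The tag names, rule table, and severity scheme are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Alarm:
    tag: str           # instrument tag
    severity: int      # raw severity from process instrumentation
    priority: int = 0  # assigned by the heuristic rules below

# Hypothetical rule knowledge: FLOW_LOW is an expected consequence of an
# active PUMP_TRIP, and REACTOR_TEMP_HIGH is safety-related.
CONSEQUENCE_OF = {"FLOW_LOW": "PUMP_TRIP"}
SAFETY_TAGS = {"REACTOR_TEMP_HIGH"}

def prioritize(alarms):
    """Apply the rules and return alarms sorted from most to least important."""
    active = {a.tag for a in alarms}
    for a in alarms:
        cause = CONSEQUENCE_OF.get(a.tag)
        if cause in active:
            a.priority = 0                  # filtered: expected consequence
        elif a.tag in SAFETY_TAGS:
            a.priority = a.severity + 10    # boosted: safety-related
        else:
            a.priority = a.severity
    return sorted(alarms, key=lambda a: a.priority, reverse=True)

alarms = [Alarm("FLOW_LOW", 5), Alarm("PUMP_TRIP", 4),
          Alarm("REACTOR_TEMP_HIGH", 3)]
ranked = prioritize(alarms)
print([a.tag for a in ranked])  # safety alarm first, filtered consequence last
```

As in AFS, the point is not diagnosis but filtering: the operator sees the causally primary and safety-critical alarms first.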

  11. A simple microviscometric approach based on Brownian motion tracking

    NASA Astrophysics Data System (ADS)

    Hnyluchová, Zuzana; Bjalončíková, Petra; Karas, Pavel; Mravec, Filip; Halasová, Tereza; Pekař, Miloslav; Kubala, Lukáš; Víteček, Jan

    2015-02-01

    Viscosity—an integral property of a liquid—is traditionally determined by mechanical instruments. The most pronounced disadvantage of such an approach is the requirement of a large sample volume, which poses a serious obstacle, particularly in biology and biophysics when working with limited samples. Scaling down the required volume by means of microviscometry based on tracking the Brownian motion of particles can provide a reasonable alternative. In this paper, we report a simple microviscometric approach which can be conducted with common laboratory equipment. The core of this approach consists in a freely available standalone script to process particle trajectory data based on a Newtonian model. In our study, this setup allowed the sample to be scaled down to 10 μl. The utility of the approach was demonstrated using model solutions of glycerine, hyaluronate, and mouse blood plasma. Therefore, this microviscometric approach based on a newly developed freely available script can be suggested for determination of the viscosity of small biological samples (e.g., body fluids).
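The physics behind such particle-tracking microviscometry is the Stokes-Einstein relation: for 2D tracking, the mean squared displacement grows as MSD = 4Dt, and D = kT/(6πηr) links the diffusion coefficient to viscosity. The sketch below shows the back-calculation; the bead size and MSD slope are illustrative numbers, not values from the paper, and the authors' actual script is not reproduced here.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def viscosity_from_msd(msd_slope, radius, temperature=298.15):
    """Estimate dynamic viscosity (Pa*s) from the slope of the 2D mean
    squared displacement (m^2/s) of tracked tracer particles, via the
    Stokes-Einstein relation D = kT / (6*pi*eta*r)."""
    diffusion = msd_slope / 4.0  # MSD = 4*D*t for 2D Brownian tracking
    return K_B * temperature / (6.0 * math.pi * diffusion * radius)

# Illustrative example: a 1-um-diameter tracer bead with a water-like
# diffusion coefficient of ~4.4e-13 m^2/s.
eta = viscosity_from_msd(msd_slope=4 * 4.4e-13, radius=0.5e-6)
print(f"{eta * 1000:.2f} mPa*s")  # roughly 1 mPa*s, i.e. water-like
```

This is why only trajectory data and a known tracer radius are needed, which is what makes the 10 μl sample volume possible.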

  12. Achieving Excellence: Accountability Report, 2002-2003.

    ERIC Educational Resources Information Center

    Wisconsin Univ. System, Madison.

    This report, intended to be issued annually, represents the University of Wisconsin (UW) System's effort to provide the citizens of Wisconsin with broad-based accountability of its largest public higher education system. The report focuses on two distinct approaches to the measurement of university performance. First, it presents the UW System's…

  13. An innovative approach to capability-based emergency operations planning

    PubMed Central

    Keim, Mark E

    2013-01-01

    This paper describes the innovative use of information technology to assist disaster planners with an easily accessible method for writing and improving evidence-based emergency operations plans. This process is used to identify all key objectives of the emergency response according to the capabilities of the institution, community or society. The approach then uses a standardized, objective-based format, along with a consensus-based method, for drafting capability-based operational-level plans. This information is then integrated within a relational database to allow for ease of access and enhanced functionality to search, sort and filter an emergency operations plan according to user need and technological capacity. This integrated approach is offered as an effective option for integrating best practices of planning with the efficiency, scalability and flexibility of modern information and communication technology.
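The relational-database idea above can be sketched with a small in-memory table of objective-based plan elements that can be searched, sorted, and filtered by capability. The schema, table name, and rows are hypothetical illustrations, not the paper's actual database design.

```python
import sqlite3

# Hypothetical schema: one row per plan objective, tagged by capability.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE plan_objective (
    id INTEGER PRIMARY KEY,
    capability TEXT,    -- e.g. 'evacuation', 'mass care'
    objective TEXT,
    responsible TEXT)""")
con.executemany(
    "INSERT INTO plan_objective (capability, objective, responsible) "
    "VALUES (?, ?, ?)",
    [("evacuation", "Identify transport routes", "Public Works"),
     ("evacuation", "Notify at-risk population", "Emergency Mgmt"),
     ("mass care", "Open shelters within 6 h", "Red Cross")])

# Search/sort/filter the plan by capability, as the abstract describes.
rows = con.execute(
    "SELECT objective, responsible FROM plan_objective "
    "WHERE capability = ? ORDER BY objective",
    ("evacuation",)).fetchall()
for objective, responsible in rows:
    print(f"{objective} -> {responsible}")
```

Storing objectives as rows rather than prose is what makes the plan queryable according to user need.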

  14. A Generic Approach for Pen-Based User Interface Development

    NASA Astrophysics Data System (ADS)

    Macé, Sébastien; Anquetil, Éric

    Pen-based interaction is an intuitive way to produce hand-drawn structured documents, but few applications take advantage of it. Indeed, the interpretation of the user's hand-drawn strokes in the context of the document is a complex problem. In this paper, we propose a new generic approach to developing such systems based on three independent components. The first one is a set of graphical and editing functions adapted to pen interaction. The second one is a rule-based formalism that models structured document composition and the corresponding interpretation process. The last one is a hand-drawn stroke analyzer that is able to interpret strokes progressively, directly while the user is drawing. We highlight in particular the human-computer interaction induced by this progressive interpretation process. Thanks to this generic approach, three pen-based system prototypes have already been developed, for musical score editing, for graph editing, and for UML class diagram editing.

  15. [Internet-based approaches in the therapy of eating disorders].

    PubMed

    Fichter, M M; Quadflieg, N; Nisslmüller, K; Lindner, S; Voderholzer, U; Wünsch-Leiteritz, W; Osen, B; Huber, T; Zahn, S; Meermann, R; Irrgang, V; Bleichner, F

    2011-09-01

    Recent technological developments of communication media offer new approaches to diagnostic and therapeutic interactions with patients. One major development is Internet-based primary prevention in vulnerable individuals not yet suffering from an eating disorder, as well as the development of new therapeutic approaches for affected individuals based on the experience of guided self-help through CD, DVD or bibliotherapy. The eating disorder literature shows several interesting, partly controlled and randomized, studies on bulimia nervosa, a few studies on binge eating disorder and no studies on anorexia nervosa. As part of the German Eating Disorder Network on Psychotherapy (EDNET), a 9-month Internet-based relapse prevention program for patients with anorexia nervosa after inpatient treatment was evaluated. The conception, first experiences and first results of the Internet-based relapse prevention program for anorexia nervosa are reported.

  16. Small molecule-based approaches to adult stem cell therapies.

    PubMed

    Lairson, Luke L; Lyssiotis, Costas A; Zhu, Shoutian; Schultz, Peter G

    2013-01-01

    There is considerable interest in the development of stem cell-based strategies for the treatment of a broad range of human diseases, including neurodegenerative, autoimmune, cardiovascular, and musculoskeletal diseases. To date, such regenerative approaches have focused largely on the development of cell transplantation therapies using cells derived from pluripotent embryonic stem cells (ESCs). Although there have been exciting preliminary reports describing the efficacy of ESC-derived replacement therapies, approaches involving ex vivo manipulated ESCs are hindered by issues of mutation, immune rejection, and ethical controversy. An alternative approach involves direct in vivo modulation or ex vivo expansion of endogenous adult stem cell populations using drug-like small molecules. Here we describe chemical approaches to the regulation of somatic stem cell biology that are yielding new biological insights and that may ultimately lead to innovative new medicines.

  17. From equation to inequality using a function-based approach

    NASA Astrophysics Data System (ADS)

    Verikios, Petros; Farmaki, Vassiliki

    2010-06-01

    This article presents features of a qualitative research study concerning the teaching and learning of school algebra using a function-based approach in a grade 8 class, of 23 students, in 26 lessons, in a state school of Athens, in the school year 2003-2004. In this article, we are interested in the inequality concept and our aim is to investigate if and how our approach could facilitate students to comprehend inequality and to solve problems related to this concept. Data analysis showed that, in order to comprehend the new concept, the students should make a transition from equation to inequality. The role of the situation context proved decisive in this transition and in making sense of involved symbols. Also, students used function representations as problem-solving strategies in problems that included inequalities. However, the extension of the function-based approach in solving an abstract equation or inequality proved problematic for the students.

  18. Tennis: Applied Examples of a Game-Based Teaching Approach

    ERIC Educational Resources Information Center

    Crespo, Miguel; Reid, Machar M.; Miley, Dave

    2004-01-01

    In this article, the authors reveal that tennis has been increasingly taught with a tactical model or game-based approach, which emphasizes learning through practice in match-like drills and actual play, rather than in practicing strokes for exact technical execution. Its goal is to facilitate the player's understanding of the tactical, physical…

  19. A Theoretical Approach to School-based HIV Prevention.

    ERIC Educational Resources Information Center

    DeMuth, Diane; Symons, Cynthia Wolford

    1989-01-01

    Presents examples of appropriate intervention strategies for professionals working with school-based human immunodeficiency virus (HIV) prevention among adolescents. A multidisciplinary approach is advisable because influencing adolescent sexual behavior is a complex matter. Consistent, continuous messages through multiple channels and by multiple…

  20. Graphene metamaterials based tunable terahertz absorber: effective surface conductivity approach.

    PubMed

    Andryieuski, Andrei; Lavrinenko, Andrei V

    2013-04-08

    In this paper we present the efficient design of functional thin-film metamaterial devices with the effective surface conductivity approach. As an example, we demonstrate a graphene based perfect absorber. After formulating the requirements to the perfect absorber in terms of surface conductivity we investigate the properties of graphene wire medium and graphene fishnet metamaterials and demonstrate both narrowband and broadband tunable absorbers.
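A useful reference point for the surface-conductivity framing above is the textbook result for a free-standing conductive sheet at normal incidence: its absorbance peaks at 50% when the normalized sheet conductivity equals 2/Z0, which is why a back reflector (Salisbury-screen configuration) is needed for the perfect absorption targeted in the paper. The sketch below computes that standard sheet-impedance result; it is a generic model, not the paper's specific graphene metamaterial design.

```python
Z0 = 376.73  # impedance of free space, ohms

def absorbance_freestanding(sigma_s):
    """Absorbance of a free-standing conductive sheet at normal incidence,
    from the standard sheet-impedance boundary conditions (real-valued
    sheet conductivity sigma_s in siemens assumed)."""
    s = Z0 * sigma_s            # normalized sheet conductivity
    r = -s / (2.0 + s)          # amplitude reflection coefficient
    t = 2.0 / (2.0 + s)         # amplitude transmission coefficient
    return 1.0 - r**2 - t**2    # what is neither reflected nor transmitted

# Maximum absorption of a free-standing sheet is 50%, at sigma_s = 2/Z0.
best = absorbance_freestanding(2.0 / Z0)
print(best)  # 0.5
```

Formulating the absorber requirement as a target effective surface conductivity, as the authors do, reduces the metamaterial design problem to tuning the graphene pattern until its sheet response hits that target.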

  1. Evaluation Theory in Problem-Based Learning Approach.

    ERIC Educational Resources Information Center

    Hsu, Yu-chen

    The purpose of this paper is to review evaluation theories and techniques in both the medical and educational fields and to propose an evaluation theory to explain the condition variables, the method variables, and the outcome variables of student assessment in a problem-based learning (PBL) approach. The PBL definition and process are presented,…

  2. Evaluation of a Blog Based Parent Involvement Approach by Parents

    ERIC Educational Resources Information Center

    Ozcinar, Zehra; Ekizoglu, Nihat

    2013-01-01

    Despite the well-known benefits of parent involvement in children's education, research clearly shows that it is difficult to effectively involve parents. This study aims to capture parents' views of a Blog Based Parent Involvement Approach (BPIA) designed to secure parent involvement in education by strengthening school-parent communication. Data…

  3. Content Based Image Retrieval and Information Theory: A General Approach.

    ERIC Educational Resources Information Center

    Zachary, John; Iyengar, S. S.; Barhen, Jacob

    2001-01-01

    Proposes an alternative real valued representation of color based on the information theoretic concept of entropy. A theoretical presentation of image entropy is accompanied by a practical description of the merits and limitations of image entropy compared to color histograms. Results suggest that image entropy is a promising approach to image…

  4. Agent-based Approaches to Dynamic Team Simulation

    DTIC Science & Technology

    2008-09-01

    behavior. The second section reviews agent-based models of teamwork describing work involving both teamwork approaches to design of multiagent systems...there is less direct evidence for teams. Hough (1992), for example, found that ratings on conscientiousness, emotional stability, and agreeableness...Peeters, Rutte, Tuijl, and Reymen (2006) who found agreeableness and emotional stability positively related to satisfaction with the team make

  5. Training Team Problem Solving Skills: An Event-Based Approach.

    ERIC Educational Resources Information Center

    Oser, R. L.; Gualtieri, J. W.; Cannon-Bowers, J. A.; Salas, E.

    1999-01-01

    Discusses how to train teams in problem-solving skills. Topics include team training, the use of technology, instructional strategies, simulations and training, theoretical framework, and an event-based approach for training teams to perform in naturalistic environments. Contains 68 references. (Author/LRW)

  6. From Equation to Inequality Using a Function-Based Approach

    ERIC Educational Resources Information Center

    Verikios, Petros; Farmaki, Vassiliki

    2010-01-01

    This article presents features of a qualitative research study concerning the teaching and learning of school algebra using a function-based approach in a grade 8 class, of 23 students, in 26 lessons, in a state school of Athens, in the school year 2003-2004. In this article, we are interested in the inequality concept and our aim is to…

  7. Economic Dispatch Using Genetic Algorithm Based Hybrid Approach

    SciTech Connect

    Tahir Nadeem Malik; Aftab Ahmad; Shahab Khushnood

    2006-07-01

    Power Economic Dispatch (ED) is a vital and essential daily optimization procedure in system operation. Present-day large power generating units with multi-valve steam turbines exhibit large variation in their input-output characteristic functions; thus non-convexity appears in the characteristic curves. Various mathematical and optimization techniques have been developed and applied to solve the economic dispatch (ED) problem. Most of these are calculus-based optimization algorithms that rely on successive linearization and use the first- and second-order derivatives of the objective function and its constraint equations as the search direction. They usually require the heat-input/power-output characteristics of generators to be monotonically increasing or piecewise linear. These simplifying assumptions result in an inaccurate dispatch. Genetic algorithms have been used to solve the economic dispatch problem independently and in conjunction with other AI tools and mathematical programming approaches. Genetic algorithms have an inherent ability to reach the global minimum region of the search space in a short time, but they then take longer to converge to the solution. GA-based hybrid approaches get around this problem and produce encouraging results. This paper presents a brief survey of hybrid approaches for economic dispatch, an architecture of an extensible computational framework as a common environment for conventional, genetic algorithm and hybrid solutions to power economic dispatch, and the implementation of three algorithms in the developed framework. The framework was tested on standard test systems for performance evaluation. (authors)
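To make the dispatch problem concrete, here is a bare-bones evolutionary sketch: three units with quadratic fuel costs, a power-balance constraint handled by a penalty term, and a simple truncation-selection/Gaussian-mutation loop. The cost coefficients, unit limits, and demand are invented for illustration, and this stripped-down loop stands in for the paper's GA and hybrid algorithms, which are not reproduced here.

```python
import random

random.seed(1)

# Quadratic fuel-cost coefficients (a + b*P + c*P^2) and limits (Pmin, Pmax)
# for three illustrative units; all numbers are made up for the sketch.
UNITS = [(100, 2.0, 0.010, 50, 200),
         (120, 1.5, 0.015, 50, 150),
         (80,  1.8, 0.012, 50, 180)]
DEMAND = 300.0  # MW

def cost(dispatch):
    """Total fuel cost plus a heavy penalty for power-balance violation."""
    fuel = sum(a + b * p + c * p * p
               for (a, b, c, _, _), p in zip(UNITS, dispatch))
    penalty = 1e4 * abs(sum(dispatch) - DEMAND)
    return fuel + penalty

def random_dispatch():
    return [random.uniform(lo, hi) for (_, _, _, lo, hi) in UNITS]

def mutate(d):
    """Gaussian mutation, clamped to each unit's operating limits."""
    return [min(hi, max(lo, p + random.gauss(0, 5)))
            for p, (_, _, _, lo, hi) in zip(d, UNITS)]

# Plain evolutionary loop: keep the 20 cheapest dispatches, refill by mutation.
pop = [random_dispatch() for _ in range(60)]
for _ in range(300):
    pop.sort(key=cost)
    parents = pop[:20]
    pop = parents + [mutate(random.choice(parents)) for _ in range(40)]
best = min(pop, key=cost)
print(best, sum(best), cost(best))
```

The penalty weight makes balance violations dominate fuel-cost differences, so selection drives the population onto (near-)feasible dispatches before refining cost, which is the behavior hybrid schemes then accelerate with a local search.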

  8. An Evidence-Based Approach to Introductory Chemistry

    ERIC Educational Resources Information Center

    Johnson, Philip

    2014-01-01

    Drawing on research into students' understanding, this article argues that the customary approach to introductory chemistry has created difficulties for students. Instead of being based on the notion of "solids, liquids and gases", introductory chemistry should be structured to develop the concept of a substance. The concept of a…

  9. Assessing the Success of a Discipline-Based Communication Skills Development and Enhancement Program in a Graduate Accounting Course

    ERIC Educational Resources Information Center

    Barratt, Catherine; Hanlon, Dean; Rankin, Michaela

    2011-01-01

    In this paper we present results of the impact diagnostic testing and associated context-specific workshops have on students' written communication skills in a graduate-level accounting course. We find that students who undertook diagnostic testing performed better in their first semester accounting subject. This improvement is positively…

  10. Statistical Properties of Accountability Measures Based on ACT's Educational Planning and Assessment System. ACT Research Report Series, 2009-1

    ERIC Educational Resources Information Center

    Allen, Jeff; Bassiri, Dina; Noble, Julie

    2009-01-01

    Educational accountability has grown substantially over the last decade, due in large part to the No Child Left Behind Act of 2001. Accordingly, educational researchers and policymakers are interested in the statistical properties of accountability models used for NCLB, such as status, improvement, and growth models; as well as others that are not…

  11. Revising a design course from a lecture approach to a project-based learning approach

    NASA Astrophysics Data System (ADS)

    Kunberger, Tanya

    2013-06-01

    In order to develop the evaluative skills necessary for successful performance of design, a senior geotechnical engineering course was revised to immerse students in the complexity of the design process utilising a project-based learning (PBL) approach to instruction. The student-centred approach stresses self-directed group learning, which focuses on the process rather than the result and underscores not only the theoretical but also the practical constraints of a problem. The shift in course emphasis, to skills over concepts, results in reduced content coverage but increased student ability to independently acquire a breadth of knowledge.

  12. An Efficient Soft Set-Based Approach for Conflict Analysis

    PubMed Central

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, etc. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in conflict data sets, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handling conflict situations, based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we evaluate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach reduces computational time by up to 3.9% compared to rough set theory. PMID:26928627
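A boolean-valued soft set can be represented directly as a mapping from agents to the set of parameters they support, after which co-occurrence counts fall out of simple set operations. The toy voting data below are invented for illustration and are not the paper's Indonesian Parliament dataset; the agreement measure shown is a plain Jaccard index, one natural choice rather than necessarily the authors' exact formula.

```python
# Toy soft-set representation of a voting conflict: each agent maps to the
# set of issues it supports (value 1 in the boolean soft set).
votes = {
    "party_A": {"issue1", "issue2"},
    "party_B": {"issue1"},
    "party_C": {"issue3"},
}

def cooccurrence(p, q):
    """Number of agents supporting both parameters p and q."""
    return sum(1 for issues in votes.values() if p in issues and q in issues)

def agreement(x, y):
    """Fraction of issues on which two agents agree (Jaccard index)."""
    a, b = votes[x], votes[y]
    return len(a & b) / len(a | b)

print(cooccurrence("issue1", "issue2"))  # 1 (only party_A supports both)
print(agreement("party_A", "party_B"))   # 0.5
print(agreement("party_A", "party_C"))   # 0.0 (fully conflicting)
```

Because these are set intersections rather than induced decision rules, the per-query work stays small, which is the intuition behind the reported speedup over rough-set analysis.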

  13. A hybrid LSSVR/HMM-based prognostic approach.

    PubMed

    Liu, Zhijuan; Li, Qing; Liu, Xianhui; Mu, Chundi

    2013-04-26

    In a health management system, prognostics, the engineering discipline that predicts a system's future health, is an important aspect, yet there is currently limited research in this field. In this paper, a hybrid approach for prognostics is proposed. The approach combines least squares support vector regression (LSSVR) with the hidden Markov model (HMM). Features extracted from sensor signals are used to train HMMs, which represent different health levels. An LSSVR algorithm is used to predict the feature trends. The LSSVR training and prediction algorithms are modified by adding new data and deleting old data, and the probabilities of the predicted features for each HMM are calculated with forward or backward algorithms. Based on these probabilities, one can determine a system's future health state and estimate the remaining useful life (RUL). To evaluate the proposed approach, a test was carried out using bearing vibration signals. Simulation results show that the LSSVR/HMM approach can forecast faults long before they occur and can predict the RUL. Therefore, the LSSVR/HMM approach is very promising in the field of prognostics.
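The HMM half of the approach can be sketched with the standard forward algorithm: each health level gets its own HMM, and the model that assigns the (predicted) feature sequence the highest likelihood names the future health state. The two-state models, the quantized vibration observations, and every probability below are invented for illustration; the LSSVR trend-prediction half is omitted.

```python
import math

def forward_loglike(obs, start, trans, emit):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the standard forward algorithm."""
    n = len(start)
    alpha = [start[s] * emit[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        alpha = [emit[s][o] * sum(alpha[r] * trans[r][s] for r in range(n))
                 for s in range(n)]
    return math.log(sum(alpha))

# One toy HMM per health level; observations are quantized features
# (0 = low vibration, 1 = high vibration).
healthy  = dict(start=[0.9, 0.1],
                trans=[[0.9, 0.1], [0.2, 0.8]],
                emit=[{0: 0.9, 1: 0.1}, {0: 0.4, 1: 0.6}])
degraded = dict(start=[0.2, 0.8],
                trans=[[0.6, 0.4], [0.1, 0.9]],
                emit=[{0: 0.5, 1: 0.5}, {0: 0.1, 1: 0.9}])

obs = [1, 1, 0, 1, 1]  # mostly high vibration, as a bearing degrades
scores = {name: forward_loglike(obs, **m)
          for name, m in [("healthy", healthy), ("degraded", degraded)]}
print(max(scores, key=scores.get))  # the "degraded" model wins
```

Running this classification on features predicted forward in time by the regression model is what turns state recognition into a prognostic RUL estimate.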

  14. Accounting for non-photosynthetic vegetation in remote-sensing-based estimates of carbon flux in wetlands

    USGS Publications Warehouse

    Schile, Lisa M.; Byrd, Kristin B.; Windham-Myers, Lisamarie; Kelly, Maggi

    2013-01-01

    Monitoring productivity in coastal wetlands is important due to their high carbon sequestration rates and potential role in climate change mitigation. We tested agricultural- and forest-based methods for estimating the fraction of absorbed photosynthetically active radiation (f APAR), a key parameter for modelling gross primary productivity (GPP), in a restored, managed wetland with a dense litter layer of non-photosynthetic vegetation, and we compared the difference in canopy light transmission between a tidally influenced wetland and the managed wetland. The presence of litter reduced correlations between spectral vegetation indices and f APAR. In the managed wetland, a two-band vegetation index incorporating simulated World View-2 or Hyperion green and near-infrared bands, collected with a field spectroradiometer, significantly correlated with f APAR only when measured above the litter layer, not at the ground where measurements typically occur. Measures of GPP in these systems are difficult to capture via remote sensing, and require an investment of sampling effort, practical methods for measuring green leaf area and accounting for background effects of litter and water.
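The two-band index described above can be sketched as a normalized difference of near-infrared and green reflectance, with f APAR then estimated from a linear fit to field measurements. The reflectance values and regression coefficients below are invented; the abstract reports the correlation but gives no numbers.

```python
def ndi(nir, green):
    """Normalized difference index from NIR and green band reflectances."""
    return (nir - green) / (nir + green)

# Hypothetical linear calibration against field-measured fAPAR,
# fAPAR = M * NDI + B, clipped to the physical range [0, 1].
M, B = 1.2, -0.1

def fapar(nir, green):
    return max(0.0, min(1.0, M * ndi(nir, green) + B))

print(fapar(nir=0.45, green=0.08))  # dense green canopy: high fAPAR
print(fapar(nir=0.30, green=0.25))  # litter-dominated signal: low fAPAR
```

The second case illustrates the paper's point: a bright litter layer pushes the green reflectance up, collapsing the index even when green leaves are present underneath, which is why above-litter measurements correlated while ground-level ones did not.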

  15. An Intention-Based Account of Perspective-Taking: Why Perspective-Taking Can Both Decrease and Increase Moral Condemnation.

    PubMed

    Lucas, Brian J; Galinsky, Adam D; Murnighan, Keith J

    2016-09-20

    Perspective-taking often increases generosity in behavior and attributions. We present an intentions-based account to explain how perspective-taking can both decrease and increase moral condemnation. Consistent with past research, we predicted that perspective-taking would reduce condemnation when the perspective-taker initially attributed benevolent intent to a transgressor. However, we predicted that perspective-taking would increase condemnation when malevolent intentions were initially attributed to the wrongdoer. We propose that perspective-taking amplifies the intentions initially attributed to a transgressor. Three studies measured and manipulated intention attributions and found that perspective-taking increased condemnation when malevolent intentions were initially attributed to a transgressor. Perspective-taking also increased costly punishment of a transgressor, an effect mediated by malevolent intentions. In contrast, empathy did not increase punitive responses, supporting its conceptual distinction from perspective-taking. Whether perspective-taking leads to forgiveness or condemnation depends on the intentions the perspective-taker initially attributes to a transgressor.

  16. A Hydrological Model To Bridge The Gap Between Conceptual and Physically Based Approaches

    NASA Astrophysics Data System (ADS)

    Lempert, M.; Ostrowski, M.; Blöschl, G.

    In the last decade it has become evident that models need to account for more realistic physical assumptions and to exploit improved data availability and computational facilities. In general, the dominant objectives are to better account for nonlinearity and to reduce uncertainty in parameter identification, which also allows application to ungaged catchments. To meet these objectives under improved computational boundary conditions, a new model has been developed, tested and validated at Darmstadt University of Technology. The model is quasi-nonlinear; it uses GIS-provided data and includes physically based (not physical) model parameters that are readily available from digitally stored information. Surface runoff, determined after physically based nonlinear soil moisture modelling, is routed with the kinematic cascade approach according to digital elevation grid models, while sub-surface flow is routed through linear conceptual modules. The model uses generally accepted parameters for soil moisture modelling, including vegetation canopy, such as total porosity, field capacity, wilting point, hydraulic conductivities, leaf area index and canopy coverage. The model has been successfully applied to several test sites and catchments at local, micro and lower macro scales. The objectives of the paper are to explain the background of the model development, briefly explain the algorithms, discuss model parameter identification, and present case study results.
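The conceptual routing modules mentioned above are typically linear reservoirs: storage proportional to outflow, S = kQ, which gives the recursion Q[t+1] = Q[t] + (dt/k)(inflow - Q[t]). The sketch below routes a rectangular inflow pulse through one such reservoir; the recession constant and inflow series are illustrative, not the model's calibrated values.

```python
def linear_reservoir(inflow, k=5.0, dt=1.0, q0=0.0):
    """Route an inflow series (m^3/s) through a linear reservoir S = k*Q.
    Explicit update: Q[t+1] = Q[t] + (dt/k) * (inflow[t] - Q[t])."""
    q, out = q0, []
    for i in inflow:
        q = q + (dt / k) * (i - q)
        out.append(q)
    return out

# Rectangular inflow pulse of 10 m^3/s for three steps, then zero.
hydrograph = linear_reservoir([0, 10, 10, 10, 0, 0, 0, 0])
print([round(q, 2) for q in hydrograph])
```

The output shows the two signatures of conceptual routing: the peak is attenuated below the inflow rate, and the outflow recesses exponentially after the inflow stops, in contrast to the physically based kinematic cascade used for surface runoff.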

  17. Assessment of acid-base balance. Stewart's approach.

    PubMed

    Fores-Novales, B; Diez-Fores, P; Aguilera-Celorrio, L J

    2016-04-01

    The study of acid-base equilibrium, its regulation and its interpretation have been a source of debate since the beginning of the 20th century. The most accepted and commonly used analyses are based on pH, a notion first introduced by Sorensen in 1909, and on the Henderson-Hasselbalch equation (1916). Since then, new concepts have been developed in order to complete and ease the understanding of acid-base disorders. In the early 1980s, Peter Stewart called the traditional interpretation of acid-base disturbances into question and proposed a new method. This innovative approach seems more suitable for studying acid-base abnormalities in critically ill patients. The aim of this paper is to update acid-base concepts, methods, limitations and applications.
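A centerpiece of Stewart's approach is the apparent strong ion difference (SIDa), the net charge of the fully dissociated "strong" ions, normally around 40 mEq/L; a reduced SID marks a metabolic acidosis independent of pH and bicarbonate. The sketch below uses the common simplified formula with illustrative electrolyte values (a teaching simplification; full Stewart analysis also involves PCO2 and weak acids).

```python
def sid_apparent(na, k, cl, lactate=0.0):
    """Apparent strong ion difference (simplified):
    SIDa = [Na+] + [K+] - [Cl-] - [lactate-], all in mEq/L."""
    return na + k - cl - lactate

# Illustrative values, not patient data.
normal = sid_apparent(na=140, k=4, cl=102, lactate=2)
hyperchloremic = sid_apparent(na=140, k=4, cl=115, lactate=2)
print(normal, hyperchloremic)  # 40 vs 27
```

The second case shows the approach's appeal in critical care: a saline-induced chloride rise lowers the SID and explains an acidosis that a bicarbonate-centered reading attributes less transparently.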

  18. Colorimetry-based edge preservation approach for color image enhancement

    NASA Astrophysics Data System (ADS)

    Suresh, Merugu; Jain, Kamal

    2016-07-01

    "Subpixel-based downsampling" is an approach that can implicitly enhance perceptible image resolution of a downsampled image by managing subpixel-level representation preferably with individual pixel. A subpixel-level representation for color image sample at edge region and color image representation is focused with the problem of directional filtration based on horizontal and vertical orientations using colorimetric color space with the help of saturation and desaturation pixels. A diagonal tracing algorithm and an edge preserving approach with colorimetric color space were used for color image enhancement. Since, there exist high variations at the edge regions, it could not be considered as constant or zero, and when these variations are random the need to compensate these to minimum value and then process for image representation. Finally, the results of the proposed method show much better image information as compared with traditional direct pixel-based methods with increased luminance and chrominance resolutions.

  19. SimTool - An object based approach to simulation construction

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Yazbeck, Marwan E.; Edwards, H. C.; Barnette, Randall D.

    1993-01-01

    The creation and maintenance of large complex simulations can be a difficult and error prone task. A number of interactive and automated tools have been developed to aid in simulation construction and maintenance. Many of these tools are based upon object oriented analysis and design concepts. One such tool, SimTool, is an object based integrated tool set for the development, maintenance, and operation of large, complex and long lived simulations. This paper discusses SimTool's object based approach to simulation design, construction and execution. It also discusses the services provided to various levels of SimTool users to assist them in a wide range of simulation tasks. Also, with the aid of an implemented and working simulation example, this paper discusses SimTool's key design and operational features. Finally, this paper presents a condensed discussion of SimTool's Entity-Relationship-Attribute (ERA) modeling approach.

  20. Attitudes toward a game-based approach to mental health.

    PubMed

    Kreutzer, Christine P; Bowers, Clint A

    2015-01-01

    Based on preliminary research, game-based treatments appear to be a promising approach to post-traumatic stress disorder (PTSD). However, attitudes toward this novel approach must be better understood. Thus, the objective of this study was to determine if video game self-efficacy mediates the relationship between expectations and reactions to a game-based treatment for PTSD. Participants played the serious game "Walk in My Shoes" (Novonics Corp., Orlando, FL) and completed a series of scales to measure attitudes toward the intervention. Video game self-efficacy was found to be a partial mediator of expectancies and reactions. These results suggest that enhancing attitudes via self-efficacy in a clinical setting may maximize treatment effectiveness.

  1. Risk-based approach to petroleum hydrocarbon remediation. Research study

    SciTech Connect

    Miller, R.N.; Haas, P.; Faile, M.; Taffinder, S.

    1994-12-31

    The risk-based approach utilizes tools developed under the BTEX, Intrinsic Remediation (natural attenuation), Bioslurper, and Bioventing Initiatives of the Air Force Center for Environmental Excellence Technology Transfer Division (AFCEE/ERT) to construct a risk-based, cost-effective approach to the cleanup of petroleum-contaminated sites. The AFCEE Remediation Matrix (Enclosure 1) identifies natural attenuation as the first remediation alternative for soil and ground water contaminated with petroleum hydrocarbons. The intrinsic remediation (natural attenuation) alternative requires a scientifically defensible risk assessment based on contaminant sources, pathways, and receptors. For fuel-contaminated sites, the first step is to determine the contaminants of interest. For the ground water pathway (usually considered most important by regulators), these will normally be the most soluble, mobile, and toxic compounds, namely benzene, toluene, ethylbenzene, and o-, m-, and p-xylenes (BTEX).

  2. Human Rights-Based Approaches to Mental Health

    PubMed Central

    Bradley, Valerie J.; Sahakian, Barbara J.

    2016-01-01

    Abstract The incidence of human rights violations in mental health care across nations has been described as a “global emergency” and an “unresolved global crisis.” The relationship between mental health and human rights is complex and bidirectional. Human rights violations can negatively impact mental health. Conversely, respecting human rights can improve mental health. This article reviews cases where an explicitly human rights-based approach was used in mental health care settings. Although the included studies did not exhibit a high level of methodological rigor, the qualitative information obtained was considered useful and informative for future studies. All studies reviewed suggest that human-rights based approaches can lead to clinical improvements at relatively low costs. Human rights-based approaches should be utilized for legal and moral reasons, since human rights are fundamental pillars of justice and civilization. The fact that such approaches can contribute to positive therapeutic outcomes and, potentially, cost savings, is additional reason for their implementation. However, the small sample size and lack of controlled, quantitative measures limit the strength of conclusions drawn from included studies. More objective, high quality research is needed to ascertain the true extent of benefits to service users and providers. PMID:27781015

  3. Searching for adaptive traits in genetic resources - phenology based approach

    NASA Astrophysics Data System (ADS)

    Bari, Abdallah

    2015-04-01

    Abdallah Bari, Kenneth Street, Eddy De Pauw, Jalal Eddin Omari, and Chandra M. Biradar, International Center for Agricultural Research in the Dry Areas, Rabat Institutes, Rabat, Morocco. Phenology is an important plant trait, not only for assessing and forecasting food production but also for searching genebanks for adaptive traits. Among the phenological parameters we have been considering in the search for such adaptive and rare traits are the onset (sowing period) and the seasonality (growing period). An application is currently being developed as part of the focused identification of germplasm strategy (FIGS) approach that uses climatic data to identify crop growing seasons and characterize them in terms of onset and duration. These approximations of growing-period characteristics can then be used to estimate flowering and maturity dates for dryland crops, such as wheat, barley, faba bean, lentil and chickpea, and to assess, among others, phenology-related traits such as days to heading (dhe) and grain-filling period (gfp). The approach followed here first calculates long-term average daily temperatures by fitting a curve to the monthly data over days from the beginning of the year. Prior to the identification of these phenological stages, the onset is extracted from onset integer raster GIS layers developed from a model of the growing period that considers both moisture and temperature limitations. The paper presents some examples of real applications of the approach to search for rare and adaptive traits.
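The curve-fitting step described above can be sketched as a least-squares fit of a single annual harmonic to long-term monthly means, giving a smooth daily temperature estimate (the monthly temperatures below are invented for illustration, not the paper's data):

```python
import numpy as np

# Hypothetical long-term monthly mean temperatures (deg C), Jan..Dec,
# and the approximate mid-month day-of-year for each month.
monthly = np.array([8, 9, 12, 15, 19, 23, 26, 26, 23, 18, 13, 9], dtype=float)
mid_days = np.array([15, 45, 74, 105, 135, 166, 196, 227, 258, 288, 319, 349])

# Fit T(d) = a + b*cos(2*pi*d/365) + c*sin(2*pi*d/365) by least squares.
w = 2 * np.pi * mid_days / 365.0
X = np.column_stack([np.ones_like(w), np.cos(w), np.sin(w)])
coef, *_ = np.linalg.lstsq(X, monthly, rcond=None)

def daily_temp(day):
    """Long-term average temperature estimate for a given day of year."""
    wd = 2 * np.pi * day / 365.0
    return coef[0] + coef[1] * np.cos(wd) + coef[2] * np.sin(wd)

print(round(daily_temp(196), 1))  # mid-July estimate, near the annual peak
```

The fitted daily curve can then be thresholded to delimit the growing period and to place phenological stages within it.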

  4. Health-Based Audible Noise Guidelines Account for Infrasound and Low-Frequency Noise Produced by Wind Turbines

    PubMed Central

    Berger, Robert G.; Ashtiani, Payam; Ollson, Christopher A.; Whitfield Aslund, Melissa; McCallum, Lindsay C.; Leventhall, Geoff; Knopper, Loren D.

    2015-01-01

    Setbacks for wind turbines have been established in many jurisdictions to address potential health concerns associated with audible noise. However, in recent years, it has been suggested that infrasound (IS) and low-frequency noise (LFN) could be responsible for the onset of adverse health effects self-reported by some individuals living in proximity to wind turbines, even when audible noise limits are met. The purpose of this paper was to investigate whether current audible noise-based guidelines for wind turbines account for the protection of human health, given the levels of IS and LFN typically produced by wind turbines. New field measurements of indoor IS and outdoor LFN at locations between 400 and 900 m from the nearest turbine, which were previously underrepresented in the scientific literature, are reported and put into context with existing published works. Our analysis showed that indoor IS levels were below auditory threshold levels while LFN levels at distances >500 m were similar to background LFN levels. A clear contribution to LFN due to wind turbine operation (i.e., measured with turbines on in comparison to with turbines off) was noted at a distance of 480 m. However, this corresponded to an increase in overall audible sound measures as reported in dB(A), supporting the hypothesis that controlling audible sound produced by normally operating wind turbines will also control for LFN. Overall, the available data from this and other studies suggest that health-based audible noise wind turbine siting guidelines provide an effective means to evaluate, monitor, and protect potential receptors from audible noise as well as IS and LFN. PMID:25759808

  5. Health-based audible noise guidelines account for infrasound and low-frequency noise produced by wind turbines.

    PubMed

    Berger, Robert G; Ashtiani, Payam; Ollson, Christopher A; Whitfield Aslund, Melissa; McCallum, Lindsay C; Leventhall, Geoff; Knopper, Loren D

    2015-01-01

    Setbacks for wind turbines have been established in many jurisdictions to address potential health concerns associated with audible noise. However, in recent years, it has been suggested that infrasound (IS) and low-frequency noise (LFN) could be responsible for the onset of adverse health effects self-reported by some individuals living in proximity to wind turbines, even when audible noise limits are met. The purpose of this paper was to investigate whether current audible noise-based guidelines for wind turbines account for the protection of human health, given the levels of IS and LFN typically produced by wind turbines. New field measurements of indoor IS and outdoor LFN at locations between 400 and 900 m from the nearest turbine, which were previously underrepresented in the scientific literature, are reported and put into context with existing published works. Our analysis showed that indoor IS levels were below auditory threshold levels while LFN levels at distances >500 m were similar to background LFN levels. A clear contribution to LFN due to wind turbine operation (i.e., measured with turbines on in comparison to with turbines off) was noted at a distance of 480 m. However, this corresponded to an increase in overall audible sound measures as reported in dB(A), supporting the hypothesis that controlling audible sound produced by normally operating wind turbines will also control for LFN. Overall, the available data from this and other studies suggest that health-based audible noise wind turbine siting guidelines provide an effective means to evaluate, monitor, and protect potential receptors from audible noise as well as IS and LFN.

  6. Gilmore-Perelomov symmetry based approach to photonic lattices.

    PubMed

    Vergara, Liliana Villanueva; Rodríguez-Lara, B M

    2015-08-24

    We revisit electromagnetic field propagation through tight-binding arrays of coupled photonic waveguides, with properties independent of the propagation distance, and recast it as a symmetry problem. We focus our analysis on photonic lattices with underlying symmetries given by three well-known groups, SU(2), SU(1,1) and Heisenberg-Weyl, to show that dispersion relations, normal states and impulse functions can be constructed following a Gilmore-Perelomov coherent state approach. Furthermore, this symmetry-based approach can be followed for each and every lattice with an underlying symmetry given by a dynamical group.
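The group-theoretic construction itself is beyond a snippet, but the tight-binding propagation the abstract refers to can be sketched numerically: light in a waveguide array evolves under the unitary exp(iHz) of the coupling matrix H (a uniform 5-guide lattice with unit coupling is an illustrative assumption, not one of the paper's symmetry classes):

```python
import numpy as np

# Tight-binding coupling matrix for a 5-waveguide array with uniform
# nearest-neighbour coupling c (illustrative lattice).
n, c = 5, 1.0
H = c * (np.eye(n, k=1) + np.eye(n, k=-1))

# Propagation over distance z is the unitary exp(i H z); the impulse
# response ("discrete diffraction") is its action on a single excited guide.
z = 2.0
vals, vecs = np.linalg.eigh(H)
U = vecs @ np.diag(np.exp(1j * vals * z)) @ vecs.conj().T

inp = np.zeros(n)
inp[2] = 1.0                          # excite the central waveguide
out = U @ inp
print(np.round(np.abs(out) ** 2, 3))  # power spreads symmetrically
```

The eigendecomposition used here is the brute-force counterpart of what the coherent-state approach delivers in closed form for lattices with SU(2), SU(1,1) or Heisenberg-Weyl symmetry.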

  7. Style: A Computational and Conceptual Blending-Based Approach

    NASA Astrophysics Data System (ADS)

    Goguen, Joseph A.; Harrell, D. Fox

    This chapter proposes a new approach to style, arising from our work on computational media using structural blending, which enriches the conceptual blending of cognitive linguistics with structure building operations in order to encompass syntax and narrative as well as metaphor. We have implemented both conceptual and structural blending, and conducted initial experiments with poetry, including interactive multimedia poetry, although the approach generalizes to other media. The central idea is to generate multimedia content and analyze style in terms of blending principles, based on our finding that different principles from those of common sense blending are often needed for some contemporary poetic metaphors.

  8. Network Medicine: A Network-based Approach to Human Diseases

    NASA Astrophysics Data System (ADS)

    Ghiassian, Susan Dina

    With the availability of large-scale data, it is now possible to systematically study the underlying interaction maps of many complex systems in multiple disciplines. Statistical physics has a long and successful history in modeling and characterizing systems with a large number of interacting individuals. Indeed, numerous approaches that were first developed in the context of statistical physics, such as the notion of random walks and diffusion processes, have been applied successfully to study and characterize complex systems in the context of network science. Based on these tools, network science has made important contributions to our understanding of many real-world, self-organizing systems, for example in computer science, sociology and economics. Biological systems are no exception. Indeed, recent studies reflect the necessity of applying statistical and network-based approaches in order to understand complex biological systems, such as cells. In these approaches, a cell is viewed as a complex network consisting of interactions among cellular components, such as genes and proteins. Given the cellular network as a platform, machinery, functionality and failure of a cell can be studied with network-based approaches, a field known as systems biology. Here, we apply network-based approaches to explore human diseases and their associated genes within the cellular network. This dissertation is divided in three parts: (i) A systematic analysis of the connectivity patterns among disease proteins within the cellular network. The quantification of these patterns inspires the design of an algorithm which predicts a disease-specific subnetwork containing yet unknown disease associated proteins. (ii) We apply the introduced algorithm to explore the common underlying mechanism of many complex diseases. We detect a subnetwork from which inflammatory processes initiate and result in many autoimmune diseases. (iii) The last chapter of this dissertation describes the
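The connectivity analysis in part (i) rests on the observation that disease-associated proteins tend to form connected modules within the interactome. A toy sketch on a hand-made network (all proteins, edges and the disease-gene set are invented) measures the largest connected component induced by the disease proteins:

```python
from collections import deque

# Toy interactome as adjacency lists, and a hypothetical disease-gene set.
graph = {
    "A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B", "D"],
    "D": ["C", "E"], "E": ["D"], "F": ["G"], "G": ["F"],
}
disease = {"A", "B", "C", "G"}

def largest_disease_module(graph, disease):
    """Size of the largest connected component of the subgraph induced
    by the disease proteins (BFS restricted to disease nodes)."""
    seen, best = set(), 0
    for start in disease:
        if start in seen:
            continue
        comp, queue = set(), deque([start])
        while queue:
            node = queue.popleft()
            if node in comp:
                continue
            comp.add(node)
            queue.extend(nb for nb in graph[node] if nb in disease)
        seen |= comp
        best = max(best, len(comp))
    return best

print(largest_disease_module(graph, disease))  # 3: A, B, C cluster together
```

Comparing such module sizes against randomized gene sets is the usual way to quantify whether disease proteins cluster more than chance predicts.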

  9. A data base approach for prediction of deforestation-induced mass wasting events

    NASA Technical Reports Server (NTRS)

    Logan, T. L.

    1981-01-01

    A major topic of concern in timber management is determining the impact of clear-cutting on slope stability. Deforestation treatments on steep mountain slopes have often resulted in a high frequency of major mass wasting events. The Geographic Information System (GIS) is a potentially useful tool for predicting the location of mass wasting sites. With a raster-based GIS, digitally encoded maps of slide hazard parameters can be overlayed and modeled to produce new maps depicting high probability slide areas. The present investigation has the objective to examine the raster-based information system as a tool for predicting the location of the clear-cut mountain slopes which are most likely to experience shallow soil debris avalanches. A literature overview is conducted, taking into account vegetation, roads, precipitation, soil type, slope-angle and aspect, and models predicting mass soil movements. Attention is given to a data base approach and aspects of slide prediction.
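The raster overlay modeling described above can be sketched as a weighted sum of co-registered hazard layers followed by thresholding; the layers, weights and threshold below are illustrative assumptions, not calibrated values:

```python
import numpy as np

# Hypothetical hazard-factor rasters on a common grid, each scaled 0..1:
# steeper slopes, wetter soils and recent clear-cuts raise the hazard.
slope     = np.array([[0.2, 0.8], [0.9, 0.4]])
soil_wet  = np.array([[0.5, 0.7], [0.9, 0.1]])
clear_cut = np.array([[0.0, 1.0], [1.0, 0.0]])

# Weighted overlay of the encoded map layers.
weights = {"slope": 0.5, "soil": 0.2, "cut": 0.3}
hazard = (weights["slope"] * slope
          + weights["soil"] * soil_wet
          + weights["cut"] * clear_cut)

high_risk = hazard > 0.6  # cells flagged as "high probability slide areas"
print(high_risk)
```

Real GIS models would add more layers (roads, precipitation, aspect) and derive the weights from observed slide inventories rather than assuming them.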

  10. Comparison of CTT and Rasch-based approaches for the analysis of longitudinal Patient Reported Outcomes.

    PubMed

    Blanchin, Myriam; Hardouin, Jean-Benoit; Le Neel, Tanguy; Kubis, Gildas; Blanchard, Claire; Mirallié, Eric; Sébille, Véronique

    2011-04-15

    Health sciences frequently deal with Patient Reported Outcomes (PRO) data for the evaluation of concepts, in particular health-related quality of life, which cannot be directly measured and are often called latent variables. Two approaches are commonly used for the analysis of such data: Classical Test Theory (CTT) and Item Response Theory (IRT). Longitudinal data are often collected to analyze the evolution of an outcome over time. The most adequate strategy to analyze longitudinal latent variables, which can be either based on CTT or IRT models, remains to be identified. This strategy must take into account the latent characteristic of what PROs are intended to measure as well as the specificity of longitudinal designs. A simple and widely used IRT model is the Rasch model. The purpose of our study was to compare CTT and Rasch-based approaches to analyze longitudinal PRO data regarding type I error, power, and time effect estimation bias. Four methods were compared: the Score and Mixed models (SM) method based on the CTT approach, the Rasch and Mixed models (RM), the Plausible Values (PV), and the Longitudinal Rasch model (LRM) methods all based on the Rasch model. All methods have shown comparable results in terms of type I error, all close to 5 per cent. LRM and SM methods presented comparable power and unbiased time effect estimations, whereas RM and PV methods showed low power and biased time effect estimations. This suggests that RM and PV methods should be avoided to analyze longitudinal latent variables.

  11. A Kalman-Filter-Based Approach to Combining Independent Earth-Orientation Series

    NASA Technical Reports Server (NTRS)

    Gross, Richard S.; Eubanks, T. M.; Steppe, J. A.; Freedman, A. P.; Dickey, J. O.; Runge, T. F.

    1998-01-01

    An approach, based upon the use of a Kalman filter, that is currently employed at the Jet Propulsion Laboratory (JPL) for combining independent measurements of the Earth's orientation is presented. Since changes in the Earth's orientation can be described as a randomly excited stochastic process, the uncertainty in our knowledge of the Earth's orientation grows rapidly in the absence of measurements. The Kalman-filter methodology allows for an objective accounting of this uncertainty growth, thereby facilitating the intercomparison of measurements taken at different epochs (not necessarily uniformly spaced in time) and with different precisions. As an example of this approach to combining Earth-orientation series, a description is given of a combination, SPACE95, that has been generated recently at JPL.
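As a sketch of the methodology (not JPL's actual filter), a scalar Kalman filter can combine two noisy series observing a random-walk process: between measurements the state variance grows by the process noise, and each series is weighted by its own measurement variance. All noise levels below are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
truth = np.cumsum(rng.normal(0, 0.1, n))   # random-walk "orientation" proxy
series_a = truth + rng.normal(0, 0.5, n)   # technique A, sigma = 0.5
series_b = truth + rng.normal(0, 0.3, n)   # technique B, sigma = 0.3

q = 0.1 ** 2                # process-noise variance per step
x, p = 0.0, 10.0            # state estimate and its variance
est = []
for za, zb in zip(series_a, series_b):
    p += q                  # predict: uncertainty grows without data
    for z, r in ((za, 0.5 ** 2), (zb, 0.3 ** 2)):
        k = p / (p + r)     # Kalman gain weights precise series more
        x += k * (z - x)
        p *= (1 - k)
    est.append(x)

err_combined = np.mean((np.array(est) - truth) ** 2)
err_b_alone = np.mean((series_b - truth) ** 2)
print(err_combined < err_b_alone)  # the combination beats the best series
```

The same machinery handles non-uniform epochs by growing p in proportion to the gap between measurements, which is the point emphasized in the abstract.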

  12. A novel image fusion approach based on compressive sensing

    NASA Astrophysics Data System (ADS)

    Yin, Hongpeng; Liu, Zhaodong; Fang, Bin; Li, Yanxia

    2015-11-01

    Image fusion integrates complementary and relevant information from source images captured by multiple sensors into a single synthetic image. The compressive sensing-based (CS) fusion approach can greatly reduce processing time and guarantee the quality of the fused image by integrating fewer non-zero coefficients. However, there are two main limitations in the conventional CS-based fusion approach. First, directly fusing sensing measurements may produce more uncertain results with high reconstruction error. Second, using a single fusion rule may result in blocking artifacts and poor fidelity. In this paper, a novel image fusion approach based on CS is proposed to solve these problems. The non-subsampled contourlet transform (NSCT) is utilized to decompose the source images. A dual-layer Pulse Coupled Neural Network (PCNN) model is used to integrate the low-pass subbands, while an edge-retention-based fusion rule is proposed to fuse the high-pass subbands. The sparse coefficients are fused before being measured by a Gaussian matrix. The fused image is accurately reconstructed by the Compressive Sampling Matched Pursuit (CoSaMP) algorithm. Experimental results demonstrate that the fused image contains abundant detailed content and preserves the saliency structure, and indicate that our proposed method achieves better visual quality than current state-of-the-art methods.

  13. Modeling approaches for ligand-based 3D similarity.

    PubMed

    Tresadern, Gary; Bemporad, Daniele

    2010-10-01

    3D ligand-based similarity approaches are widely used in the early phases of drug discovery for tasks such as hit finding by virtual screening or compound design with quantitative structure-activity relationships. Herein we review widely used software for performing such tasks. Some techniques are based on relatively mature technology, shape-based similarity for instance. Typically, these methods remained in the realm of the expert user, the experienced modeler. However, advances in implementation and speed have improved usability and allow these methods to be applied to databases comprising millions of compounds. There are now many reports of such methods impacting drug-discovery projects. As such, the medicinal chemistry community has become the intended market for some of these new tools, yet its members may consider the wide array of approaches somewhat disconcerting. Each method has subtle differences and is better suited to certain tasks than others. In this article we review some of the widely used computational methods via application, provide straightforward background on the underlying theory, and provide examples for the interested reader to pursue in more detail. In the new era of preclinical drug discovery there will be ever more pressure to move faster and more efficiently, and computational approaches based on 3D ligand similarity will play an increasing role in this process.
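As a toy illustration of shape-based 3D similarity (a hard-sphere voxel model with made-up coordinates; production tools use smooth Gaussian volume overlaps and pose optimization), two molecules can be compared by the Tanimoto ratio of their occupied volumes:

```python
import numpy as np

def voxel_occupancy(coords, radius=1.7, lo=-5.0, hi=5.0, n=40):
    """Occupied-voxel mask for a set of atom centers (hard-sphere model)."""
    axis = np.linspace(lo, hi, n)
    gx, gy, gz = np.meshgrid(axis, axis, axis, indexing="ij")
    grid = np.stack([gx, gy, gz], axis=-1)
    occ = np.zeros((n, n, n), dtype=bool)
    for c in coords:
        occ |= np.linalg.norm(grid - c, axis=-1) <= radius
    return occ

def shape_tanimoto(a, b):
    """Shape Tanimoto of two occupancy masks: |A and B| / |A or B|."""
    return (a & b).sum() / (a | b).sum()

mol1 = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])   # invented coordinates
mol2 = np.array([[0.2, 0.0, 0.0], [1.7, 0.0, 0.0]])   # slightly shifted copy
print(round(shape_tanimoto(voxel_occupancy(mol1), voxel_occupancy(mol2)), 2))
```

A score of 1.0 means identical occupied volume; real screening pipelines maximize this overlap over rotations and translations before scoring.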

  14. Future Performance Trend Indicators: A Current Value Approach to Human Resources Accounting. Report II: Internal Consistencies and Relationships to Performance in Organization VI. Technical Report.

    ERIC Educational Resources Information Center

    Pecorella, Patricia A.; Bowers, David G.

    Conventional accounting systems provide no indication as to what conditions and events lead to reported outcomes, since they traditionally do not include measurements of the human organization and its relationship to events at the outcome stage. Human resources accounting is used to measure these additional types of data. This research is…

  15. A fuzzy behaviorist approach to sensor-based robot control

    SciTech Connect

    Pin, F.G.

    1996-05-01

    Sensor-based operation of autonomous robots in unstructured and/or outdoor environments has proven to be an extremely challenging problem, mainly because of the difficulties encountered when attempting to represent the many uncertainties which are always present in the real world. These uncertainties are primarily due to sensor imprecision and unpredictability of the environment, i.e., lack of full knowledge of the environment's characteristics and dynamics. An approach, which we have named the "Fuzzy Behaviorist Approach" (FBA), is proposed in an attempt to remedy some of these difficulties. This approach is based on the representation of the system's uncertainties using Fuzzy Set Theory-based approximations and on the representation of the reasoning and control schemes as sets of elemental behaviors. Using the FBA, a formalism for rule base development and an automated generator of fuzzy rules have been developed. This automated system can construct the set of membership functions corresponding to fuzzy behaviors once these have been expressed in qualitative terms by the user. The system also checks for completeness of the rule base and for non-redundancy of the rules (which has traditionally been a major hurdle in rule base development). Two major conceptual features, the suppression and inhibition mechanisms which allow a dominance between behaviors to be expressed, are discussed in detail. Some experimental results obtained with the automated fuzzy rule generator, applied to the domain of sensor-based navigation in a priori unknown environments using one of our autonomous test-bed robots as well as a real car in outdoor environments, are then reviewed and discussed to illustrate the feasibility of large-scale automatic fuzzy rule generation using the "Fuzzy Behaviorist" concepts.
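A minimal sketch of the flavor of elemental behaviors with a suppression mechanism (the membership shapes, thresholds and behavior names are invented for illustration, not the FBA's generated rule base):

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def navigate(goal_bearing_deg, obstacle_dist_m):
    """Two elemental behaviors; the more strongly firing one wins."""
    seek = tri(goal_bearing_deg, -90, 0, 90)     # "goal roughly ahead"
    avoid = tri(obstacle_dist_m, 0.0, 0.0, 2.0)  # "obstacle near"
    # Suppression: a strongly firing avoidance behavior dominates seeking.
    if avoid > seek:
        return "turn_away", avoid
    return "go_to_goal", seek

print(navigate(60, 5.0))  # no obstacle nearby: goal seeking fires
print(navigate(60, 0.3))  # obstacle close: avoidance suppresses seeking
```

In the FBA the memberships are generated automatically from qualitative user statements, and the completeness/non-redundancy checks run over the resulting rule base.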

  16. Choosing the Best Method to Introduce Accounting.

    ERIC Educational Resources Information Center

    Guerrieri, Donald J.

    1988-01-01

    Of the traditional approaches to teaching accounting--single entry, journal, "T" account, balance sheet, and accounting equation--the author recommends the accounting equation approach. It is the foundation of the double entry system, new material is easy to introduce, and it provides students with a rationale for understanding basic concepts.…
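The accounting equation approach can be illustrated with a toy ledger in which every posted transaction must keep Assets = Liabilities + Owner's Equity in balance (the amounts and transactions are invented):

```python
# Toy double-entry ledger built on the accounting equation.
ledger = {"assets": 0.0, "liabilities": 0.0, "equity": 0.0}

def post(assets=0.0, liabilities=0.0, equity=0.0):
    """Post a transaction; the equation must balance after every posting."""
    ledger["assets"] += assets
    ledger["liabilities"] += liabilities
    ledger["equity"] += equity
    assert round(ledger["assets"], 2) == round(
        ledger["liabilities"] + ledger["equity"], 2)

post(assets=10000, equity=10000)      # owner invests cash
post(assets=5000, liabilities=5000)   # buy equipment on credit
post(assets=-2000, liabilities=-2000) # repay part of the loan
print(ledger)
```

The invariant enforced by the assertion is exactly the rationale the approach gives students: every entry has an equal and offsetting effect somewhere in the equation.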

  17. Thinking about Accountability

    PubMed Central

    Deber, Raisa B.

    2014-01-01

    Accountability is a key component of healthcare reforms, in Canada and internationally, but there is increasing recognition that one size does not fit all. A more nuanced understanding begins with clarifying what is meant by accountability, including specifying for what, by whom, to whom and how. These papers arise from a Partnership for Health System Improvement (PHSI), funded by the Canadian Institutes of Health Research (CIHR), on approaches to accountability that examined accountability across multiple healthcare subsectors in Ontario. The partnership features collaboration among an interdisciplinary team, working with senior policy makers, to clarify what is known about best practices to achieve accountability under various circumstances. This paper presents our conceptual framework. It examines potential approaches (policy instruments) and postulates that their outcomes may vary by subsector depending upon (a) the policy goals being pursued, (b) governance/ownership structures and relationships and (c) the types of goods and services being delivered, and their production characteristics (e.g., contestability, measurability and complexity). PMID:25305385

  18. Branch-based model for the diameters of the pulmonary airways: accounting for departures from self-consistency and registration errors.

    PubMed

    Neradilek, Moni B; Polissar, Nayak L; Einstein, Daniel R; Glenny, Robb W; Minard, Kevin R; Carson, James P; Jiao, Xiangmin; Jacob, Richard E; Cox, Timothy C; Postlethwait, Edward M; Corley, Richard A

    2012-06-01

    We examine a previously published branch-based approach for modeling airway diameters that is predicated on the assumption of self-consistency across all levels of the tree. We mathematically formulate this assumption, propose a method to test it and develop a more general model to be used when the assumption is violated. We discuss the effect of measurement error on the estimated models and propose methods that take account of error. The methods are illustrated on data from MRI and CT images of silicone casts of two rats, two normal monkeys, and one ozone-exposed monkey. Our results showed substantial departures from self-consistency in all five subjects. When departures from self-consistency exist, we do not recommend using the self-consistency model, even as an approximation, as we have shown that it may likely lead to an incorrect representation of the diameter geometry. The new variance model can be used instead. Measurement error has an important impact on the estimated morphometry models and needs to be addressed in the analysis.

  19. Steady shear flow thermodynamics based on a canonical distribution approach.

    PubMed

    Taniguchi, Tooru; Morriss, Gary P

    2004-11-01

    A nonequilibrium steady-state thermodynamics to describe shear flow is developed using a canonical distribution approach. We construct a canonical distribution for shear flow based on the energy in the moving frame using the Lagrangian formalism of the classical mechanics. From this distribution, we derive the Evans-Hanley shear flow thermodynamics, which is characterized by the first law of thermodynamics dE = T dS - Q dγ, relating infinitesimal changes in energy E, entropy S, and shear rate γ with kinetic temperature T. Our central result is that the coefficient Q is given by Helfand's moment for viscosity. This approach leads to thermodynamic stability conditions for shear flow, one of which is equivalent to the positivity of the correlation function for Q. We show the consistency of this approach with the Kawasaki distribution function for shear flow, from which a response formula for viscosity is derived in the form of a correlation function for the time-derivative of Q. We emphasize the role of the external work required to sustain the steady shear flow in this approach, and show theoretically that the ensemble average of its power W must be non-negative. A nonequilibrium entropy, increasing in time, is introduced, so that the amount of heat based on this entropy is equal to the average of W. Numerical results from nonequilibrium molecular-dynamics simulation of two-dimensional many-particle systems with soft-core interactions are presented which support our interpretation.

  20. A microfabrication-based approach to quantitative isothermal titration calorimetry.

    PubMed

    Wang, Bin; Jia, Yuan; Lin, Qiao

    2016-04-15

    Isothermal titration calorimetry (ITC) directly measures heat evolved in a chemical reaction to determine equilibrium binding properties of biomolecular systems. Conventional ITC instruments are expensive, use complicated design and construction, and require long analysis times. Microfabricated calorimetric devices are promising, although they have yet to allow accurate, quantitative ITC measurements of biochemical reactions. This paper presents a microfabrication-based approach to integrated, quantitative ITC characterization of biomolecular interactions. The approach integrates microfabricated differential calorimetric sensors with microfluidic titration. Biomolecules and reagents are introduced at each of a series of molar ratios, mixed, and allowed to react. The reaction thermal power is differentially measured, and used to determine the thermodynamic profile of the biomolecular interactions. Implemented in a microdevice featuring thermally isolated, well-defined reaction volumes with minimized fluid evaporation as well as highly sensitive thermoelectric sensing, the approach enables accurate and quantitative ITC measurements of protein-ligand interactions under different isothermal conditions. Using the approach, we demonstrate ITC characterization of the binding of 18-Crown-6 with barium chloride, and the binding of ribonuclease A with cytidine 2'-monophosphate within reaction volumes of approximately 0.7 µL and at concentrations down to 2 mM. For each binding system, the ITC measurements were completed with considerably reduced analysis times and material consumption, and yielded a complete thermodynamic profile of the molecular interaction in agreement with published data. This demonstrates the potential usefulness of our approach for biomolecular characterization in biomedical applications.
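The measurement model behind an ITC titration can be sketched with a one-site binding isotherm: at each molar ratio, the heat evolved is proportional to the change in complex concentration times the binding enthalpy. All parameters below are invented for illustration (only the ~0.7 µL volume echoes the abstract); this is not the paper's instrument model:

```python
import numpy as np

def bound(P, L, Kd):
    """Complex concentration [PL] for one-site binding (exact quadratic root)."""
    s = P + L + Kd
    return (s - np.sqrt(s * s - 4 * P * L)) / 2

# Illustrative parameters: cell species at 1 mM, Kd = 0.05 mM,
# binding enthalpy -20 kJ/mol, reaction volume 0.7 uL.
P, Kd, dH, V = 1.0, 0.05, -20.0, 0.7e-6
L_tot = np.linspace(0.1, 2.0, 20)   # total ligand after each injection (mM)

PL = bound(P, L_tot, Kd)
# Heat per injection tracks the increment in bound complex (illustrative units).
q = dH * V * np.diff(np.concatenate([[0.0], PL]))
print(len(q), q[0] < 0)  # exothermic steps that shrink toward saturation
```

Fitting this curve to the differentially measured heats is what yields the Kd, ΔH (and hence ΔS) profile the abstract refers to.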

  1. Wind-Disturbance-Based Control Approach for Blimp Robots

    NASA Astrophysics Data System (ADS)

    Furukawa, Hayato; Shimada, Akira

    Blimps have some advantages, for example, that they do not need driving forces to float and can move in three-dimensional space. However, it is not easy to control them, since they are underactuated systems with nonholonomic constraints. Some papers have presented control technologies, but these technologies require impractical conditions. Furukawa et al. presented a blimp control technology that considers a nonholonomic situation, but it cannot be used to control blimps in the presence of wind disturbances. This paper introduces a wind-observer-based control approach that takes wind disturbances into consideration. A controller based on the proposed approach can generate a driving force in the direction opposite to that of a wind disturbance and help a blimp move against the wind.
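The paper's controller is not reproduced here, but the core idea of estimating a wind force and driving against it can be sketched in one dimension: an integral term converges to the unknown constant wind force, and the commanded drive cancels it. All dynamics and gains below are illustrative assumptions:

```python
def simulate(steps=2000, dt=0.01, mass=2.0, wind=1.5):
    """1-D blimp: mass * dv/dt = u + wind. The controller's integral term
    accumulates velocity error and converges to the wind force estimate."""
    v, integ = 0.0, 0.0
    kp, ki = 4.0, 2.0
    for _ in range(steps):
        u = -kp * v - ki * integ   # drive opposes motion and estimated wind
        integ += dt * v
        v += dt * (u + wind) / mass
    return v, ki * integ           # final velocity, estimated wind force

v_final, wind_est = simulate()
print(round(v_final, 4), round(wind_est, 3))  # v ~ 0, estimate ~ wind
```

The estimated force is then applied with opposite sign, which is the "drive against the wind" behavior described in the abstract.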

  2. English to Sanskrit Machine Translation Using Transfer Based approach

    NASA Astrophysics Data System (ADS)

    Pathak, Ganesh R.; Godse, Sachin P.

    2010-11-01

    Translation is one of the needs of a global society for communicating the thoughts and ideas of one country to another. Translation is the process of interpreting the meaning of a text and subsequently producing an equivalent text that communicates the same message in another language. In this paper we give detailed information on how to convert source-language text into target-language text using the transfer-based approach to machine translation. We implemented an English-to-Sanskrit machine translator using this approach. English is a global language used for business and communication, but a large part of the population of India does not use or understand English. Sanskrit is an ancient language of India, and most Indian languages are derived from it. Sanskrit can therefore act as an intermediate language for multilingual translation.
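The transfer pipeline can be sketched in miniature: analyze the source sentence, transfer words through a bilingual lexicon, and apply a structural rule reordering English SVO into Sanskrit SOV. The three-word lexicon and transliterations are illustrative assumptions, not the paper's system:

```python
# Tiny bilingual lexicon for the toy fragment (loosely transliterated Sanskrit).
lexicon = {"rama": "ramah", "reads": "pathati", "book": "pustakam"}

def translate(sentence):
    """Toy transfer-based translation: analysis (tokenize, assume S V O),
    lexical transfer, then the structural rule SVO -> SOV."""
    subj, verb, obj = sentence.lower().split()
    words = [lexicon[subj], lexicon[obj], lexicon[verb]]  # transfer + reorder
    return " ".join(words)

print(translate("Rama reads book"))  # -> "ramah pustakam pathati"
```

A full transfer system replaces the tuple unpacking with real parsing and adds morphological generation (case endings, verb agreement) on the target side.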

  3. An SQL-based approach to physics analysis

    NASA Astrophysics Data System (ADS)

    Limper, Maaike, Dr

    2014-06-01

    As part of the CERN openlab collaboration a study was made into the possibility of performing analysis of the data collected by the experiments at the Large Hadron Collider (LHC) through SQL-queries on data stored in a relational database. Currently LHC physics analysis is done using data stored in centrally produced "ROOT-ntuple" files that are distributed through the LHC computing grid. The SQL-based approach to LHC physics analysis presented in this paper allows calculations in the analysis to be done at the database and can make use of the database's in-built parallelism features. Using this approach it was possible to reproduce results for several physics analysis benchmarks. The study shows the capability of the database to handle complex analysis tasks but also illustrates the limits of using row-based storage for storing physics analysis data, as performance was limited by the I/O read speed of the system.
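The flavor of the approach can be sketched with SQLite standing in for the production relational database; the table layout and kinematic cuts below are invented, not the study's schema:

```python
import sqlite3

# Toy "ntuple" table standing in for per-object physics data.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE muons (event_id INTEGER, pt REAL, eta REAL)")
con.executemany("INSERT INTO muons VALUES (?, ?, ?)",
                [(1, 45.0, 0.3), (1, 38.0, -1.1), (2, 12.0, 2.0), (3, 60.0, 0.1)])

# The selection runs inside the database instead of a per-event ROOT loop:
# keep events with at least two muons passing pt and eta cuts.
rows = con.execute("""
    SELECT event_id, COUNT(*) AS n_mu
    FROM muons
    WHERE pt > 20.0 AND ABS(eta) < 2.5
    GROUP BY event_id
    HAVING COUNT(*) >= 2
""").fetchall()
print(rows)  # [(1, 2)]
```

Because the cut is expressed declaratively, the database can parallelize it, which is the in-built parallelism the abstract credits for the approach's performance.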

  4. An Extended Normalization Model of Attention Accounts for Feature-Based Attentional Enhancement of Both Response and Coherence Gain

    PubMed Central

    Krishna, B. Suresh; Treue, Stefan

    2016-01-01

    Paying attention to a sensory feature improves its perception and impairs that of others. Recent work has shown that a Normalization Model of Attention (NMoA) can account for a wide range of physiological findings and the influence of different attentional manipulations on visual performance. A key prediction of the NMoA is that attention to a visual feature like an orientation or a motion direction will increase the response of neurons preferring the attended feature (response gain) rather than increase the sensory input strength of the attended stimulus (input gain). This effect of feature-based attention on neuronal responses should translate to similar patterns of improvement in behavioral performance, with psychometric functions showing response gain rather than input gain when attention is directed to the task-relevant feature. In contrast, we report here that when human subjects are cued to attend to one of two motion directions in a transparent motion display, attentional effects manifest as a combination of input and response gain. Further, the impact on input gain is greater when attention is directed towards a narrow range of motion directions than when it is directed towards a broad range. These results are captured by an extended NMoA, which either includes a stimulus-independent attentional contribution to normalization or utilizes direction-tuned normalization. The proposed extensions are consistent with the feature-similarity gain model of attention and the attentional modulation in extrastriate area MT, where neuronal responses are enhanced and suppressed by attention to preferred and non-preferred motion directions respectively. PMID:27977679

  5. Graph-based and statistical approaches for detecting spectrally variable target materials

    NASA Astrophysics Data System (ADS)

    Ziemann, Amanda K.; Theiler, James

    2016-05-01

    In discriminating target materials from background clutter in hyperspectral imagery, one must contend with variability in both. Most algorithms focus on the clutter variability, but for some materials there is considerable variability in the spectral signatures of the target. This is especially the case for solid target materials, whose signatures depend on morphological properties (particle size, packing density, etc.) that are rarely known a priori. In this paper, we investigate detection algorithms that explicitly take into account the diversity of signatures for a given target. In particular, we investigate variable target detectors when applied to new representations of the hyperspectral data: a manifold learning based approach, and a residual based approach. The graph theory and manifold learning based approach incorporates multiple spectral signatures of the target material of interest; this is built upon previous work that used a single target spectrum. In this approach, we first build an adaptive nearest neighbors (ANN) graph on the data and target spectra, and use a biased locally linear embedding (LLE) transformation to perform nonlinear dimensionality reduction. This biased transformation results in a lower-dimensional representation of the data that better separates the targets from the background. The residual approach uses an annulus based computation to represent each pixel after an estimate of the local background is removed, which suppresses local backgrounds and emphasizes the target-containing pixels. We will show detection results in the original spectral space, the dimensionality-reduced space, and the residual space, all using subspace detectors: ranked spectral angle mapper (rSAM), subspace adaptive matched filter (ssAMF), and subspace adaptive cosine/coherence estimator (ssACE). Results of this exploratory study will be shown on a ground-truthed hyperspectral image with variable target spectra and both full and mixed pixel targets.

  6. Computer-based Approaches for Training Interactive Digital Map Displays

    DTIC Science & Technology

    2005-09-01

Subject matter POC: Jean L. Dyer. Abstract (extracted, truncated): Five computer-based training approaches for learning digital skills … the other extreme of letting Soldiers learn a digital interface on their own. The research reported here examined these two conditions and three other … Keywords: training assessment, exploratory learning, guided exploratory training, guided discovery.

  7. Science based integrated approach to advanced nuclear fuel development - vision, approach, and overview

    SciTech Connect

    Unal, Cetin; Pasamehmetoglu, Kemal; Carmack, Jon

    2010-01-01

Advancing the performance of Light Water Reactors, Advanced Nuclear Fuel Cycles, and Advanced Reactors, such as the Next Generation Nuclear Power Plants, requires enhancing our fundamental understanding of fuel and materials behavior under irradiation. The capability to accurately model nuclear fuel systems is critical. In order to understand specific aspects of nuclear fuel, fully coupled fuel simulation codes are required to achieve licensing of specific nuclear fuel designs for operation. The backbone of these codes, models, and simulations is a fundamental understanding of, and predictive capability for simulating, the phase and microstructural behavior of the nuclear fuel system materials and matrices. The purpose of this paper is to identify the modeling and simulation approach needed to deliver predictive tools for advanced fuels development. Coordination between experimental nuclear fuel design and development technical experts and computational fuel modeling and simulation technical experts is a critical aspect of the approach; it naturally leads to an integrated, goal-oriented, science-based R&D approach and strengthens both the experimental and computational efforts. The Advanced Fuels Campaign (AFC) and the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Fuels Integrated Performance and Safety Code (IPSC) are working together to determine experimental data and modeling needs. The primary objective of the NEAMS Fuels IPSC project is to deliver a coupled, three-dimensional, predictive computational platform for modeling the fabrication and both normal and abnormal operation of nuclear fuel pins and assemblies, applicable to both existing and future reactor fuel designs. The science-based program is pursuing the development of an integrated multi-scale, multi-physics modeling and simulation platform for nuclear fuels. This overview paper discusses the vision, goals, and approaches for developing and implementing the new approach.

  8. Knowledge-based approach to video content classification

    NASA Astrophysics Data System (ADS)

    Chen, Yu; Wong, Edward K.

    2001-01-01

A framework for video content classification using a knowledge-based approach is herein proposed. This approach is motivated by the fact that videos are rich in semantic content, which can best be interpreted and analyzed by human experts. We demonstrate the concept by implementing a prototype video classification system using the rule-based programming language CLIPS 6.05. Knowledge for video classification is encoded as a set of rules in the rule base. The left-hand sides of rules contain high-level and low-level features, while the right-hand sides of rules contain intermediate results or conclusions. Our current implementation includes features computed from motion, color, and text extracted from video frames. Our current rule set allows us to classify input video into one of five classes: news, weather reporting, commercial, basketball, and football. We use MYCIN's inexact reasoning method for combining evidence and for handling the uncertainties in the features and in the classification results. We obtained good results in a preliminary experiment, which demonstrated the validity of the proposed approach.

  9. Knowledge-based approach to video content classification

    NASA Astrophysics Data System (ADS)

    Chen, Yu; Wong, Edward K.

    2000-12-01

A framework for video content classification using a knowledge-based approach is herein proposed. This approach is motivated by the fact that videos are rich in semantic content, which can best be interpreted and analyzed by human experts. We demonstrate the concept by implementing a prototype video classification system using the rule-based programming language CLIPS 6.05. Knowledge for video classification is encoded as a set of rules in the rule base. The left-hand sides of rules contain high-level and low-level features, while the right-hand sides of rules contain intermediate results or conclusions. Our current implementation includes features computed from motion, color, and text extracted from video frames. Our current rule set allows us to classify input video into one of five classes: news, weather reporting, commercial, basketball, and football. We use MYCIN's inexact reasoning method for combining evidence and for handling the uncertainties in the features and in the classification results. We obtained good results in a preliminary experiment, which demonstrated the validity of the proposed approach.
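The rule-based classification scheme described above can be caricatured in plain Python rather than CLIPS: each rule maps feature facts to a class label with a certainty factor, and evidence for the same class is pooled with MYCIN's combination rule CF = cf1 + cf2·(1 − cf1) for positive factors. The specific rules, features, and numbers below are invented for illustration, not taken from the paper:

```python
# Each rule: (condition over the feature facts, (class label, certainty factor)).
RULES = [
    (lambda f: f["motion"] == "low" and f["text_overlay"], ("news", 0.6)),
    (lambda f: f["dominant_color"] == "green" and f["motion"] == "high", ("football", 0.7)),
    (lambda f: f["text_overlay"] and f["dominant_color"] == "blue", ("weather", 0.5)),
]

def classify(features):
    cf = {}
    for condition, (label, factor) in RULES:
        if condition(features):
            prev = cf.get(label, 0.0)
            cf[label] = prev + factor * (1 - prev)   # MYCIN combination of evidence
    return max(cf, key=cf.get) if cf else "unknown"

print(classify({"motion": "low", "text_overlay": True, "dominant_color": "gray"}))
```

The certainty factors keep the classification graceful under uncertain or conflicting features: several weak rules firing for the same class accumulate toward, but never exceed, full certainty.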

  10. Physiologically based computational approach to camouflage and masking patterns

    NASA Astrophysics Data System (ADS)

    Irvin, Gregg E.; Dowler, Michael G.

    1992-09-01

    A computational system was developed to integrate both Fourier image processing techniques and biologically based image processing techniques. The Fourier techniques allow the spatially global manipulation of phase and amplitude spectra. The biologically based techniques allow for spatially localized manipulation of phase, amplitude and orientation independently on multiple spatial frequency scales. These techniques combined with a large variety of basic image processing functions allow for a versatile and systematic approach to be taken toward the development of specialized patterning and visual textures. Current applications involve research for the development of 2-dimensional spatial patterning that can function as effective camouflage patterns and masking patterns for the human visual system.
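The spatially global Fourier manipulation described above amounts to splitting an image into amplitude and phase spectra, operating on them separately, and resynthesizing. A minimal sketch with NumPy; the random-phase texture is an invented stand-in for the paper's specific patterning operations:

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((64, 64))

spectrum = np.fft.fft2(image)
amplitude, phase = np.abs(spectrum), np.angle(spectrum)

# Recombining the untouched amplitude and phase reproduces the original image.
recon = np.fft.ifft2(amplitude * np.exp(1j * phase)).real

# Replacing the phase with noise keeps the power spectrum (second-order
# statistics) while destroying the spatial layout: a texture, not the image.
noise = rng.uniform(-np.pi, np.pi, phase.shape)
texture = np.fft.ifft2(amplitude * np.exp(1j * noise)).real

print(np.allclose(recon, image))
```

Masking and camouflage patterning then comes down to which spectral components are preserved, scrambled, or reshaped before resynthesis.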

  11. A mindfulness-based approach to the treatment of insomnia.

    PubMed

    Ong, Jason; Sholtes, David

    2010-11-01

    Mindfulness meditation has emerged as a novel approach to emotion regulation and stress reduction that has several health benefits. Preliminary work has been conducted on mindfulness-based therapy for insomnia (MBT-I), a meditation-based program for individuals suffering from chronic sleep disturbance. This treatment integrates behavioral treatments for insomnia with the principles and practices of mindfulness meditation. A case illustration of a chronic insomnia sufferer demonstrates the application of mindfulness principles for developing adaptive ways of working with the nocturnal symptoms and waking consequences of chronic insomnia.

  12. The FPGA based L1 track finding Tracklet approach

    NASA Astrophysics Data System (ADS)

    Kyriacou, Savvas; CMS Collaboration

    2017-01-01

The High-Luminosity upgraded LHC is expected to deliver proton-proton collisions every 25 ns, with an estimated 140-200 pile-up interactions per bunch crossing. Ultrafast track finding is vital for handling trigger rates in such conditions. An FPGA-based road search algorithm, the Tracklet approach, one of a few currently under consideration for the CMS L1 trigger system, is being developed. Based on low/high transverse momentum track discrimination and designed for the HL upgraded outer tracker, the algorithm achieves microsecond-scale track reconstruction in the expected high-track-multiplicity environment. The Tracklet method overview, implementation, hardware demonstrator, and performance results are presented and discussed.
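The kernel of tracklet seeding can be caricatured in a few lines: two stubs in nearby tracker layers define a small bend angle Δφ, the small-angle helix approximation gives a track radius R ≈ (r2 − r1)/(2·Δφ), and the transverse momentum follows as pT [GeV] ≈ 0.3·B[T]·R[m]. The layer radii and the floating-point arithmetic below are illustrative only; the actual firmware works in fixed point with many more refinements:

```python
import math

def tracklet_pt(r1, phi1, r2, phi2, b_field=3.8):
    """Estimate pT (GeV) from two stubs; r in metres, phi in radians."""
    radius = (r2 - r1) / (2 * (phi2 - phi1))   # small-angle helix approximation
    return 0.3 * b_field * abs(radius)

# Synthetic stubs from a pT = 2 GeV track: a helix through the origin has
# phi(r) = phi0 + asin(r / (2R)), with R = pT / (0.3 B).
r_track = 2.0 / (0.3 * 3.8)
r1, r2 = 0.25, 0.35
phi1, phi2 = (math.asin(r / (2 * r_track)) for r in (r1, r2))
print(round(tracklet_pt(r1, phi1, r2, phi2), 2))
```

Thresholding this estimate is the low/high transverse-momentum discrimination mentioned above: stub pairs whose implied pT falls below the trigger threshold are dropped before any expensive road search.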

  13. A New Approach to Image Fusion Based on Cokriging

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess; LeMoigne, Jacqueline; Mount, David M.; Morisette, Jeffrey T.

    2005-01-01

We consider the image fusion problem involving remotely sensed data. We introduce cokriging as a method to perform fusion. We investigate the advantages of fusing Hyperion with ALI. The evaluation is performed by comparing the classification of the fused data with that of the input images and by calculating well-chosen quantitative fusion quality metrics. We consider the Invasive Species Forecasting System (ISFS) project as our fusion application. The fusion of ALI with Hyperion data is studied using PCA and wavelet-based fusion. We then propose utilizing a geostatistical interpolation method called cokriging as a new approach to image fusion.

  14. A Model-Based Prognostics Approach Applied to Pneumatic Valves

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew J.; Goebel, Kai

    2011-01-01

    Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain knowledge of the system, its components, and how they fail by casting the underlying physical phenomena in a physics-based model that is derived from first principles. Uncertainty cannot be avoided in prediction, therefore, algorithms are employed that help in managing these uncertainties. The particle filtering algorithm has become a popular choice for model-based prognostics due to its wide applicability, ease of implementation, and support for uncertainty management. We develop a general model-based prognostics methodology within a robust probabilistic framework using particle filters. As a case study, we consider a pneumatic valve from the Space Shuttle cryogenic refueling system. We develop a detailed physics-based model of the pneumatic valve, and perform comprehensive simulation experiments to illustrate our prognostics approach and evaluate its effectiveness and robustness. The approach is demonstrated using historical pneumatic valve data from the refueling system.
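The particle-filtering machinery referred to above can be sketched in a few lines: propagate a cloud of state hypotheses through the process model, weight them by the likelihood of each new measurement, and resample. The linear damage-growth model below is an assumed stand-in for the paper's physics-based valve model; all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n_particles, n_steps = 2000, 50
growth, q, r = 0.1, 0.02, 0.2          # damage rate, process noise, measurement noise

true_state = 0.0
particles = np.zeros(n_particles)

for _ in range(n_steps):
    true_state += growth + rng.normal(0, q)                  # hidden damage evolves
    z = true_state + rng.normal(0, r)                        # noisy measurement
    particles += growth + rng.normal(0, q, n_particles)      # propagate hypotheses
    weights = np.exp(-0.5 * ((z - particles) / r) ** 2)      # Gaussian likelihood
    weights /= weights.sum()
    particles = particles[rng.choice(n_particles, n_particles, p=weights)]  # resample

estimate = particles.mean()
print(round(true_state, 2), round(estimate, 2))
```

For prognostics, the same particle cloud is then propagated forward without measurements until each hypothesis crosses a failure threshold, which yields a distribution over remaining useful life rather than a point prediction.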

  15. A machine vision based approach for timber knots detection

    NASA Astrophysics Data System (ADS)

    Hittawe, Mohamad Mazen; Sidibé, Désiré; Mériaudeau, Fabrice

    2015-04-01

    Wood singularities detection is a primary step in wood grading enhancement. Our approach is purely machine vision based. The main objective is to compute physical properties like density, modulus of elasticity (MOE) and modulus of rupture (MOR) given wood surface images. Knots are one of the main singularities which directly affect the wood strength. Hence, our target is to detect knots and classify them into transverse and non-transverse ones. Then the Knots Depth Ratio (KDR) is computed based on all found transverse knots. Afterwards, KDR is used for the wood mechanical model improvement. Our technique is based on colour image analysis where the knots are detected by means of contrast intensity transformation and morphological operations. Then KDR computations are based on transverse knots and clear wood densities. Finally, MOE and MOR are computed using KDR images. The accuracy of number of knots found, their locations, MOE and MOR has been validated using a dataset of 252 images. In our dataset, these values were manually calculated. To the best of our knowledge our approach is the first purely machine vision based method to compute KDR, MOE and MOR.

  16. Model-based approach for elevator performance estimation

    NASA Astrophysics Data System (ADS)

    Esteban, E.; Salgado, O.; Iturrospe, A.; Isasa, I.

    2016-02-01

In this paper, a dynamic model of an elevator installation is presented in the state space domain. The model comprises both the mechanical and the electrical subsystems, including the electrical machine and a closed-loop field-oriented control. The proposed model is employed for monitoring the condition of the elevator installation. The adopted model-based approach employs a Kalman filter as an observer, which estimates the elevator car acceleration, the quantity that determines ride quality, based solely on the machine control signature and the encoder signal. Five elevator key performance indicators are then calculated from the estimated car acceleration. The proposed procedure is experimentally evaluated by comparing the key performance indicators calculated from the estimated car acceleration with the values obtained from actual acceleration measurements on a test bench. Finally, the proposed procedure is compared with a sliding mode observer.
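The Kalman-observer idea above can be sketched with a constant-acceleration state-space model: the encoder provides a noisy position signal, and the filter reconstructs velocity and acceleration from it. The matrices and noise levels below are illustrative assumptions, not the paper's elevator model:

```python
import numpy as np

dt = 0.01
F = np.array([[1, dt, 0.5 * dt**2], [0, 1, dt], [0, 0, 1]])  # state: pos, vel, acc
H = np.array([[1.0, 0.0, 0.0]])                              # encoder measures position only
Q = 1e-4 * np.eye(3)                                         # process noise
R = np.array([[1e-4]])                                       # encoder noise (var of 1 cm std)

rng = np.random.default_rng(2)
true_acc = 1.2                       # m/s^2, constant over this ride segment
t = np.arange(0, 2, dt)
positions = 0.5 * true_acc * t**2 + rng.normal(0, 0.01, t.size)

x, P = np.zeros(3), np.eye(3)
for z in positions:
    x, P = F @ x, F @ P @ F.T + Q                      # predict
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)       # Kalman gain
    x = x + K @ (np.array([z]) - H @ x)                # update state
    P = (np.eye(3) - K @ H) @ P                        # update covariance

print(round(x[2], 2))   # estimated car acceleration
```

Ride-quality indicators (peak acceleration, jerk, vibration levels) can then be computed from the estimated acceleration trace instead of a dedicated accelerometer.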

  17. Optimizing Dendritic Cell-Based Approaches for Cancer Immunotherapy

    PubMed Central

    Datta, Jashodeep; Terhune, Julia H.; Lowenfeld, Lea; Cintolo, Jessica A.; Xu, Shuwen; Roses, Robert E.; Czerniecki, Brian J.

    2014-01-01

Dendritic cells (DC) are professional antigen-presenting cells uniquely suited for cancer immunotherapy. They induce primary immune responses, potentiate the effector functions of previously primed T-lymphocytes, and orchestrate communication between innate and adaptive immunity. The remarkable diversity of cytokine activation regimens, DC maturation states, and antigen-loading strategies employed in current DC-based vaccine design reflect an evolving, but incomplete, understanding of optimal DC immunobiology. In the clinical realm, existing DC-based cancer immunotherapy efforts have yielded encouraging but inconsistent results. Despite recent U.S. Food and Drug Administration (FDA) approval of DC-based sipuleucel-T for metastatic castration-resistant prostate cancer, clinically effective DC immunotherapy as monotherapy for a majority of tumors remains a distant goal. Recent work has identified strategies that may allow for more potent “next-generation” DC vaccines. Additionally, multimodality approaches incorporating DC-based immunotherapy may improve clinical outcomes. PMID:25506283

  18. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Sullivan, E J; Candy, J V

    2007-08-13

Sequential detection theory has been known for a long time: it evolved in the late 1940s with the work of Wald, followed by Middleton's classic exposition in the 1960s, coupled with the concurrent enabling technology of digital computer systems and the development of sequential processors. This development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.

  19. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Candy, J V

    2008-12-08

Sequential detection theory has been known for a long time: it evolved in the late 1940s with the work of Wald, followed by Middleton's classic exposition in the 1960s, coupled with the concurrent enabling technology of digital computer systems and the development of sequential processors. This development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.
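Wald's sequential probability ratio test (SPRT), the 1940s core on which the chapter's sequential detectors build, accumulates the log-likelihood ratio sample by sample and stops as soon as it crosses an upper or lower threshold. A minimal sketch for a Gaussian shift-in-mean model; the signal level and error rates are illustrative:

```python
import math
import random

def sprt(samples, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    upper = math.log((1 - beta) / alpha)    # decide H1 (signal) above this
    lower = math.log(beta / (1 - alpha))    # decide H0 (noise) below this
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # log-likelihood ratio increment for N(mu1, sigma^2) vs N(0, sigma^2)
        llr += (mu1 * x - mu1 ** 2 / 2) / sigma ** 2
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", n

random.seed(7)
print(sprt(random.gauss(1.0, 1.0) for _ in range(1000)))   # data with signal
print(sprt(random.gauss(0.0, 1.0) for _ in range(1000)))   # noise-only data
```

The appeal for non-stationary, model-based problems is that the likelihood increments need not come from i.i.d. samples: a state-space processor can supply innovations whose likelihoods feed the same stopping rule.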

  20. Use of an ecologically relevant modelling approach to improve remote sensing-based schistosomiasis risk profiling.

    PubMed

    Walz, Yvonne; Wegmann, Martin; Leutner, Benjamin; Dech, Stefan; Vounatsou, Penelope; N'Goran, Eliézer K; Raso, Giovanna; Utzinger, Jürg

    2015-11-30

Schistosomiasis is a widespread water-based disease that puts close to 800 million people at risk of infection, with more than 250 million infected, mainly in sub-Saharan Africa. Transmission is governed by the spatial distribution of specific freshwater snails that act as intermediate hosts and by the frequency, duration and extent of human bodies exposed to infested water sources. Remote sensing data have been utilized for spatially explicit risk profiling of schistosomiasis. However, such risk profiling inherits a conceptual drawback if school-based disease prevalence data are related directly to remote sensing measurements extracted at the location of the school, because disease transmission usually does not occur at the school itself. We therefore took the local environment around the schools into account by explicitly linking ecologically relevant environmental information on potential disease transmission sites to survey measurements of disease prevalence. Our models were validated at two sites with different landscapes in Côte d'Ivoire using high- and moderate-resolution remote sensing data, based on random forest and partial least squares regression. We found that the ecologically relevant modelling approach explained up to 70% of the variation in Schistosoma infection prevalence and performed better than a purely pixel-based modelling approach. Furthermore, our study showed that model performance increased as the school catchment area was enlarged, confirming the hypothesis that suitable environments for schistosomiasis transmission rarely occur at the location of survey measurements.

  1. Assisting students for lecture preparation: A Web-based approach

    NASA Astrophysics Data System (ADS)

    Herrick, Brad Jay

Students continue to arrive at universities with poor study and time-management skills: they are not proactive in their studies, while professors are not willing to hold them accountable for their shortcomings. The result is a 'dumbing down' of the course. This can be countered by student preparation prior to attending lecture, especially in very large lecture classrooms (N>400); in fact, it provides a process to 'dumb up' the course. A Web-based system for providing content-specific lecture preparations (termed 'Previews') was developed and tested in three courses at a large southwestern research institution. Significant differences in final course achievement were found across treatment levels, including variations by the total number of participations in the lecture preparations. The method of implementation and results are discussed, including future considerations.

  2. A soil moisture accounting-procedure with a Richards' equation-based soil texture-dependent parameterization

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Given a time series of potential evapotranspiration and rainfall data, there are at least two approaches for estimating vertical percolation rates. One approach involves solving Richards' equation (RE) with a plant uptake model. An alternative approach involves applying a simple soil moisture accoun...

  3. Dependent component analysis based approach to robust demarcation of skin tumors

    NASA Astrophysics Data System (ADS)

    Kopriva, Ivica; Peršin, Antun; Puizina-Ivić, Neira; Mirić, Lina

    2009-02-01

A method for robust demarcation of basal cell carcinoma (BCC) is presented, employing a novel dependent component analysis (DCA)-based approach to unsupervised segmentation of red-green-blue (RGB) fluorescent images of the BCC. It exploits spectral diversity between the BCC and the surrounding tissue. DCA represents an extension of independent component analysis (ICA) and is necessary to account for the statistical dependence induced by spectral similarity between the BCC and surrounding tissue. Robustness to intensity fluctuation is due to the scale-invariance property of DCA algorithms. Through comparative performance analysis with state-of-the-art image segmentation methods such as active contours (level set), K-means clustering, non-negative matrix factorization and ICA, we experimentally demonstrate good performance of DCA-based BCC demarcation in a demanding scenario where the intensity of the fluorescent image has been varied by almost two orders of magnitude.

  4. Parkinson's disease prediction using diffusion-based atlas approach

    NASA Astrophysics Data System (ADS)

    Teodorescu, Roxana O.; Racoceanu, Daniel; Smit, Nicolas; Cretu, Vladimir I.; Tan, Eng K.; Chan, Ling L.

    2010-03-01

We study Parkinson's disease (PD) using an automatic specialized diffusion-based atlas. A total of 47 subjects, among whom 22 were patients clinically diagnosed with PD and 25 were controls, underwent DTI imaging. The EPIs have lower resolution but provide essential anisotropy information for the fiber-tracking process. The two volumes of interest (VOI), the substantia nigra and the putamen, are detected on the EPI and FA images, respectively. We use the VOIs for geometry-based registration, and fuse the anatomical detail of the putamen volume detected on the FA image with the EPI. After 3D fiber growing on the two volumes, we compute the fiber density (FD) and the fiber volume (FV). Furthermore, we compare patients based on the extracted fibers and evaluate them according to the Hoehn & Yahr (H&Y) scale. This paper introduces the method used for automatic volume detection and evaluates the fiber-growing method on these volumes. Our approach is important from the clinical standpoint, providing a new tool for neurologists to evaluate and predict PD evolution. From the technical point of view, the fusion approach deals with tensor-based information (EPI) and the extraction of anatomical detail (FA and EPI).

  5. Energy function-based approaches to graph coloring.

    PubMed

    Di Blas, A; Jagota, A; Hughey, R

    2002-01-01

    We describe an approach to optimization based on a multiple-restart quasi-Hopfield network where the only problem-specific knowledge is embedded in the energy function that the algorithm tries to minimize. We apply this method to three different variants of the graph coloring problem: the minimum coloring problem, the spanning subgraph k-coloring problem, and the induced subgraph k-coloring problem. Though Hopfield networks have been applied in the past to the minimum coloring problem, our encoding is more natural and compact than almost all previous ones. In particular, we use k-state neurons while almost all previous approaches use binary neurons. This reduces the number of connections in the network from (Nk)(2) to N(2) asymptotically and also circumvents a problem in earlier approaches, that of multiple colors being assigned to a single vertex. Experimental results show that our approach compares favorably with other algorithms, even nonneural ones specifically developed for the graph coloring problem.
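The k-state-neuron encoding described above can be sketched directly: each vertex is one neuron with k states (colors), the energy is the number of monochromatic edges, and a multiple-restart greedy descent stands in for the paper's quasi-Hopfield dynamics. The graph and parameters below are invented for illustration:

```python
import random

def energy(coloring, edges):
    return sum(coloring[u] == coloring[v] for u, v in edges)

def color_graph(n, edges, k, restarts=20, sweeps=50, seed=4):
    rng = random.Random(seed)
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    best, best_e = None, float("inf")
    for _ in range(restarts):
        coloring = [rng.randrange(k) for _ in range(n)]
        for _ in range(sweeps):
            for v in range(n):   # asynchronous update: pick the state of minimum local energy
                neighbor_colors = [coloring[u] for u in adj[v]]
                coloring[v] = min(range(k), key=neighbor_colors.count)
            e = energy(coloring, edges)
            if e < best_e:
                best, best_e = coloring[:], e
        if best_e == 0:
            break
    return best, best_e

# A 5-cycle needs 3 colors; 2 colors would always leave a conflicting edge.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
coloring, conflicts = color_graph(5, edges, k=3)
print(coloring, conflicts)
```

Because a vertex holds a single k-valued state, the "one color per vertex" constraint of binary encodings is satisfied by construction, which mirrors the compactness argument made in the abstract.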

  6. Synchronization-based approach for detecting functional activation of brain

    NASA Astrophysics Data System (ADS)

    Hong, Lei; Cai, Shi-Min; Zhang, Jie; Zhuo, Zhao; Fu, Zhong-Qian; Zhou, Pei-Ling

    2012-09-01

    In this paper, we investigate a synchronization-based, data-driven clustering approach for the analysis of functional magnetic resonance imaging (fMRI) data, and specifically for detecting functional activation from fMRI data. We first define a new measure of similarity between all pairs of data points (i.e., time series of voxels) integrating both complete phase synchronization and amplitude correlation. These pairwise similarities are taken as the coupling between a set of Kuramoto oscillators, which in turn evolve according to a nearest-neighbor rule. As the network evolves, similar data points naturally synchronize with each other, and distinct clusters will emerge. The clustering behavior of the interaction network of the coupled oscillators, therefore, mirrors the clustering property of the original multiple time series. The clustered regions whose cross-correlation coefficients are much greater than other regions are considered as the functionally activated brain regions. The analysis of fMRI data in auditory and visual areas shows that the recognized brain functional activations are in complete correspondence with those from the general linear model of statistical parametric mapping, but with a significantly lower time complexity. We further compare our results with those from traditional K-means approach, and find that our new clustering approach can distinguish between different response patterns more accurately and efficiently than the K-means approach, and therefore more suitable in detecting functional activation from event-related experimental fMRI data.
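A toy version of the synchronization-based clustering above: phase oscillators coupled by the pairwise similarity of their time series, so that strongly similar series pull each other's phases together and clusters emerge as groups of phase-locked oscillators. Two synthetic signal groups replace real fMRI voxel time series; the similarity measure (plain correlation, not the paper's combined phase/amplitude measure) and all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)
t = np.linspace(0, 10, 200)
series = np.vstack(
    [np.sin(t) + 0.1 * rng.normal(size=t.size) for _ in range(5)]
    + [np.cos(3 * t) + 0.1 * rng.normal(size=t.size) for _ in range(5)]
)

coupling = np.corrcoef(series)      # pairwise similarity used as coupling strengths
coupling[coupling < 0] = 0          # keep only attractive coupling

theta = rng.uniform(0, 2 * np.pi, 10)
for _ in range(500):                # Kuramoto update, identical natural frequencies
    theta += 0.05 * (coupling * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)

# Phase differences inside each similarity group collapse toward zero.
def spread(group):
    return np.abs(np.angle(np.exp(1j * (group[:, None] - group[None, :])))).max()

print(round(spread(theta[:5]), 3), round(spread(theta[5:]), 3))
```

Reading off clusters then reduces to grouping oscillators whose final phases coincide, which is what makes the scheme attractive compared to iterative centroid methods like K-means.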

  7. A Thermodynamically-consistent FBA-based Approach to Biogeochemical Reaction Modeling

    NASA Astrophysics Data System (ADS)

    Shapiro, B.; Jin, Q.

    2015-12-01

    Microbial rates are critical to understanding biogeochemical processes in natural environments. Recently, flux balance analysis (FBA) has been applied to predict microbial rates in aquifers and other settings. FBA is a genome-scale constraint-based modeling approach that computes metabolic rates and other phenotypes of microorganisms. This approach requires a prior knowledge of substrate uptake rates, which is not available for most natural microbes. Here we propose to constrain substrate uptake rates on the basis of microbial kinetics. Specifically, we calculate rates of respiration (and fermentation) using a revised Monod equation; this equation accounts for both the kinetics and thermodynamics of microbial catabolism. Substrate uptake rates are then computed from the rates of respiration, and applied to FBA to predict rates of microbial growth. We implemented this method by linking two software tools, PHREEQC and COBRA Toolbox. We applied this method to acetotrophic methanogenesis by Methanosarcina barkeri, and compared the simulation results to previous laboratory observations. The new method constrains acetate uptake by accounting for the kinetics and thermodynamics of methanogenesis, and predicted well the observations of previous experiments. In comparison, traditional methods of dynamic-FBA constrain acetate uptake on the basis of enzyme kinetics, and failed to reproduce the experimental results. These results show that microbial rate laws may provide a better constraint than enzyme kinetics for applying FBA to biogeochemical reaction modeling.
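A common form of the thermodynamically revised Monod rate used in this kind of modeling multiplies the kinetic Monod term by a thermodynamic potential factor F_T = 1 − exp((ΔG + m·ΔG_ATP)/(χ·R·T)), which drives respiration to zero as the catabolic reaction approaches equilibrium. The sketch below uses that form with invented parameter values, not those of the Methanosarcina barkeri study:

```python
import math

def respiration_rate(s, dG, vmax=1.0, ks=0.1, m=0.5, dG_atp=45e3, chi=2, T=298.15):
    """Rate = vmax * Monod(s) * F_T; s in mol/L, free energies in J/mol."""
    R = 8.314                                   # gas constant, J/(mol K)
    kinetic = s / (ks + s)                      # Monod kinetic term
    ft = 1 - math.exp((dG + m * dG_atp) / (chi * R * T))  # thermodynamic factor
    return vmax * kinetic * max(ft, 0.0)        # no reverse catabolism

# Far from equilibrium the rate is purely Monod; near equilibrium it vanishes.
print(respiration_rate(1.0, dG=-120e3))
print(respiration_rate(1.0, dG=-23e3))
```

It is this F_T term, absent from enzyme-kinetics uptake constraints, that lets the substrate uptake rate fed into FBA respond to the energy available in the environment.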

  8. A novel rules based approach for estimating software birthmark.

    PubMed

    Nazir, Shah; Shahzad, Sara; Khan, Sher Afzal; Alias, Norma Binti; Anwar, Sajid

    2015-01-01

A software birthmark is a unique characteristic of a program that can be used to detect software theft. Comparing the birthmarks of two programs can tell us whether one is a copy of the other. Software theft and piracy are rapidly increasing problems of copying, stealing, and misusing software without the permission specified in the license agreement. Estimating a birthmark plays a key role in understanding its effectiveness. In this paper, a new technique is presented to evaluate and estimate a software birthmark based on the two most sought-after properties of birthmarks, credibility and resilience. For this purpose, soft computing concepts such as probabilistic and fuzzy computing are taken into account, and fuzzy logic is used to estimate the properties of the birthmark. The proposed fuzzy rule-based technique is validated through a case study, and the results show that the technique is successful in assessing the specified properties of the birthmark, its resilience and credibility. This, in turn, shows how much effort will be required to detect the originality of software based on its birthmark.

  10. Matching sensors to missions using a knowledge-based approach

    NASA Astrophysics Data System (ADS)

    Preece, Alun; Gomez, Mario; de Mel, Geeth; Vasconcelos, Wamberto; Sleeman, Derek; Colley, Stuart; Pearson, Gavin; Pham, Tien; La Porta, Thomas

    2008-04-01

    Making decisions on how best to utilise limited intelligence, surveillance and reconnaisance (ISR) resources is a key issue in mission planning. This requires judgements about which kinds of available sensors are more or less appropriate for specific ISR tasks in a mission. A methodological approach to addressing this kind of decision problem in the military context is the Missions and Means Framework (MMF), which provides a structured way to analyse a mission in terms of tasks, and assess the effectiveness of various means for accomplishing those tasks. Moreover, the problem can be defined as knowledge-based matchmaking: matching the ISR requirements of tasks to the ISR-providing capabilities of available sensors. In this paper we show how the MMF can be represented formally as an ontology (that is, a specification of a conceptualisation); we also represent knowledge about ISR requirements and sensors, and then use automated reasoning to solve the matchmaking problem. We adopt the Semantic Web approach and the Web Ontology Language (OWL), allowing us to import elements of existing sensor knowledge bases. Our core ontologies use the description logic subset of OWL, providing efficient reasoning. We describe a prototype tool as a proof-of-concept for our approach. We discuss the various kinds of possible sensor-mission matches, both exact and inexact, and how the tool helps mission planners consider alternative choices of sensors.
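    The subsumption-based matchmaking the ontology enables can be sketched in miniature: a toy capability taxonomy stands in for the OWL class hierarchy, and a walk up the hierarchy stands in for the description-logic reasoner. All names below are illustrative assumptions, not terms from the authors' ontologies.

```python
# A toy capability taxonomy (child -> parent), standing in for the kind of
# class hierarchy an OWL ontology would provide (names are illustrative).
TAXONOMY = {
    "Imaging": "ISRCapability",
    "VisibleImaging": "Imaging",
    "InfraredImaging": "Imaging",
    "AcousticSensing": "ISRCapability",
}

def subsumes(general, specific):
    """True if `specific` is `general` or a descendant of it in the taxonomy
    (the role a description-logic reasoner plays for real OWL ontologies)."""
    while specific is not None:
        if specific == general:
            return True
        specific = TAXONOMY.get(specific)
    return False

def match(required, provided):
    """Classify a sensor-task match: 'exact' when capabilities coincide,
    'inexact' when one capability subsumes the other, else 'disjoint'."""
    if required == provided:
        return "exact"
    if subsumes(required, provided) or subsumes(provided, required):
        return "inexact"
    return "disjoint"
```

    This mirrors the distinction the paper draws between exact and inexact sensor-mission matches: an infrared camera is an inexact but usable match for a task that merely requires "Imaging".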

  11. Pedestrian detection from thermal images: A sparse representation based approach

    NASA Astrophysics Data System (ADS)

    Qi, Bin; John, Vijay; Liu, Zheng; Mita, Seiichi

    2016-05-01

    Pedestrian detection, a key technology in computer vision, plays a paramount role in advanced driver assistance systems (ADASs) and autonomous vehicles. The objective of pedestrian detection is to identify and locate people in a dynamic environment so that accidents can be avoided. With significant variations introduced by illumination, occlusion, articulated pose, and complex backgrounds, pedestrian detection is a challenging task for visual perception. Unlike visible images, thermal images are captured and presented as intensity maps based on objects' emissivity, and thus have an enhanced spectral range that makes human beings perceptible against the cool background. In this study, a sparse representation based approach is proposed for pedestrian detection from thermal images. We first adopt the histogram of sparse codes to represent image features and then detect pedestrians with the extracted features in unimodal and multimodal frameworks, respectively. In the unimodal framework, two types of dictionaries, i.e. a joint dictionary and individual dictionaries, are built by learning from prepared training samples. In the multimodal framework, a weighted fusion scheme is proposed to further highlight the contributions from features with higher separability. To validate the proposed approach, experiments were conducted to compare it with three widely used features: Haar wavelets (HWs), histogram of oriented gradients (HOG), and histogram of phase congruency (HPC), as well as two classification methods, AdaBoost and support vector machine (SVM). Experimental results on a publicly available data set demonstrate the superiority of the proposed approach.
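    The core feature pipeline, sparse-coding image patches over a dictionary and pooling the codes into a histogram, can be sketched as follows. This uses greedy matching pursuit as the sparse coder over a unit-norm dictionary; the coder, dictionary, and pooling details are illustrative assumptions, not the paper's exact method.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matching_pursuit(signal, dictionary, n_nonzero=2):
    """Greedy sparse coding: repeatedly pick the (unit-norm) atom most
    correlated with the residual and subtract its projection."""
    residual = list(signal)
    code = [0.0] * len(dictionary)
    for _ in range(n_nonzero):
        scores = [dot(residual, atom) for atom in dictionary]
        k = max(range(len(scores)), key=lambda i: abs(scores[i]))
        code[k] += scores[k]
        residual = [r - scores[k] * a for r, a in zip(residual, dictionary[k])]
    return code

def sparse_code_histogram(patches, dictionary):
    """Pool the sparse codes of image patches into a normalized histogram of
    absolute atom activations -- the feature vector fed to the classifier."""
    hist = [0.0] * len(dictionary)
    for patch in patches:
        for i, c in enumerate(matching_pursuit(patch, dictionary)):
            hist[i] += abs(c)
    total = sum(hist) or 1.0
    return [h / total for h in hist]
```

    In a full system, the dictionary would be learned from training patches (the joint or individual dictionaries of the paper) and the resulting histograms passed to AdaBoost or an SVM.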

  12. Probabilistic Risk-Based Approach to Aeropropulsion System Assessment Developed

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2001-01-01

    In an era of shrinking development budgets and resources, where there is also an emphasis on reducing the product development cycle, the role of system assessment, performed in the early stages of an engine development program, becomes very critical to the successful development of new aeropropulsion systems. A reliable system assessment not only helps to identify the best propulsion system concept among several candidates, it can also identify which technologies are worth pursuing. This is particularly important for advanced aeropropulsion technology development programs, which require an enormous amount of resources. In the current practice of deterministic, or point-design, approaches, the uncertainties of design variables are either unaccounted for or accounted for by safety factors. This often results in an assessment with unknown and unquantifiable reliability. Consequently, it fails to provide additional insight into the risks associated with the new technologies, which is often needed by decision makers to determine the feasibility and return on investment of a new aircraft engine.
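    A probabilistic alternative to the point-design practice described above can be sketched with a simple Monte Carlo propagation of design-variable uncertainty. The surrogate performance model and the distributions below are hypothetical illustrations, not NASA's assessment tools or data.

```python
import random

def specific_fuel_consumption(efficiency, pressure_ratio):
    """Hypothetical surrogate performance model (illustrative only):
    lower SFC is better; it improves with component efficiency and
    overall pressure ratio."""
    return 1.0 / (efficiency * pressure_ratio ** 0.25)

def probabilistic_assessment(n_samples=10_000, seed=42):
    """Propagate Gaussian uncertainty in the design variables through the
    model, instead of evaluating a single deterministic point design."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        eff = rng.gauss(0.90, 0.02)   # component efficiency (assumed spread)
        opr = rng.gauss(40.0, 2.0)    # overall pressure ratio (assumed spread)
        samples.append(specific_fuel_consumption(eff, opr))
    samples.sort()
    mean = sum(samples) / n_samples
    p95 = samples[int(0.95 * n_samples)]  # 95th-percentile (risk) value
    return mean, p95
```

    The output is a distribution rather than a point value, so a decision maker can read off the probability that a candidate concept misses its performance target.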

  13. Predicting dispersal distance in mammals: a trait-based approach.

    PubMed

    Whitmee, Sarah; Orme, C David L

    2013-01-01

    Dispersal is one of the principal mechanisms influencing ecological and evolutionary processes but quantitative empirical data are unfortunately scarce. As dispersal is likely to influence population responses to climate change, whether by adaptation or by migration, there is an urgent need to obtain estimates of dispersal distance. Cross-species correlative approaches identifying predictors of dispersal distance can provide much-needed insights into this data-scarce area. Here, we describe the compilation of a new data set of natal dispersal distances and use it to test life-history predictors of dispersal distance in mammals and examine the strength of the phylogenetic signal in dispersal distance. We find that both maximum and median dispersal distances have strong phylogenetic signals. No single model performs best in describing either maximum or median dispersal distances when phylogeny is taken into account but many models show high explanatory power, suggesting that dispersal distance per generation can be estimated for mammals with comparatively little data availability. Home range area, geographic range size and body mass are identified as the most important terms across models. Cross-validation of models supports the ability of these variables to predict dispersal distances, suggesting that models may be extended to species where dispersal distance is unknown.

  14. Rights-Based Approaches to Ensure Sustainable Nutrition Security.

    PubMed

    Banerjee, Sweta

    2016-01-01

    In India, a rights-based approach has been used to address large-scale malnutrition, including both micro- and macro-level nutrition deficiencies. Stunting, which is an intergenerational chronic consequence of malnutrition, is especially widespread in India (38% among children under 5 years old). To tackle this problem, the government of India has designed interventions for the first 1,000 days, a critical period of the life cycle, through a number of community-based programs to fulfill the rights to food and life. However, the entitlements providing these rights have not yet produced the necessary changes in the malnutrition status of people, especially women and children. The government of India has already implemented laws and drafted a constitution that covers the needs of its citizens, but corruption, bureaucracy, lack of awareness of rights and entitlements and social discrimination limit people's access to basic rights and services. To address this crisis, Welthungerhilfe India, working in remote villages of the most backward states in India, has shifted from a welfare-based approach to a rights-based approach. The Fight Hunger First Initiative, started by Welthungerhilfe in 2011, is designed on the premise that in the long term, poor people can only leave poverty behind if adequate welfare systems are in place and if basic rights are fulfilled; these rights include access to proper education, sufficient access to adequate food and income, suitable health services and equal rights. Only then can the next generation of disadvantaged populations look forward to a new and better future and can growth benefit the entire society. The project, co-funded by the Federal Ministry for Economic Cooperation and Development, is a long-term multi-sectoral program that involves institution-building and empowerment.

  15. New approaches to addiction treatment based on learning and memory.

    PubMed

    Kiefer, Falk; Dinter, Christina

    2013-01-01

    Preclinical studies suggest that physiological learning processes are similar to changes observed in addicts at the molecular, neuronal, and structural levels. Based on the importance of classical and instrumental conditioning in the development and maintenance of addictive disorders, many have suggested cue-exposure-based extinction training of conditioned, drug-related responses as a potential new treatment of addiction. It may also be possible to facilitate this extinction training with pharmacological compounds that strengthen memory consolidation during cue exposure. Another potential therapeutic intervention would be based on the so-called reconsolidation theory. According to this hypothesis, already-consolidated memories return to a labile state when reactivated, allowing them to undergo another phase of consolidation-reconsolidation, which can be pharmacologically manipulated. These approaches suggest that the extinction of drug-related memories may represent a viable treatment strategy in the future treatment of addiction.

  16. Bridging Ayurveda with evidence-based scientific approaches in medicine.

    PubMed

    Patwardhan, Bhushan

    2014-01-01

    This article reviews contemporary approaches for bridging Ayurveda with evidence-based medicine. In doing so, the author presents a pragmatic assessment of quality, methodology and extent of scientific research in Ayurvedic medicine. The article discusses the meaning of evidence and indicates the need to adopt epistemologically sensitive methods and rigorous experimentation using modern science. The author critically analyzes the status of Ayurvedic medicine based on personal observations, peer interactions and published research. This review article concludes that traditional knowledge systems like Ayurveda and modern scientific evidence-based medicine should be integrated. The author advocates that Ayurvedic researchers should develop strategic collaborations with innovative initiatives like 'Horizon 2020' involving predictive, preventive and personalized medicine (PPPM).

  17. Clinical engineering internships: a regional hospital-based approach.

    PubMed

    Bronzino, J D

    1985-01-01

    Clinical engineering has been defined as that branch of applied science that is concerned with solving problems associated with the clinical aspects of health care delivery and patient care using principles, methods and approaches drawn from engineering science and technology. To prepare individuals for this type of activity requires that they be exposed to the clinical environment during their academic programs. Such an experience permits the student to observe not only the operation of specific medical instruments, but also the environment in which they are used and the people who use them. The nature of this clinical experience may vary in terms of its duration and specificity, but it must occur. Consequently, all clinical engineering programs must contain, as an integral part of their activity, a significant internship experience. This article presents the activities of a regional, hospital-based clinical engineering internship program that has been in operation during the past decade, and highlights the major arguments for the internship approach.

  18. Nucleic Acid-Based Therapy Approaches for Huntington's Disease

    PubMed Central

    Vagner, Tatyana; Young, Deborah; Mouravlev, Alexandre

    2012-01-01

    Huntington's disease (HD) is caused by a dominant mutation that results in an unstable expansion of a CAG repeat in the huntingtin gene, leading to a toxic gain of function in the huntingtin protein, which causes massive neurodegeneration, mainly in the striatum, along with the clinical symptoms associated with the disease. Since the mutation has multiple effects in the cell and the precise mechanism of the disease remains to be elucidated, gene therapy approaches have been developed that intervene in different aspects of the condition. These approaches include increasing the expression of growth factors, decreasing levels of mutant huntingtin, and restoring cell metabolism and transcriptional balance. The aim of this paper is to outline the nucleic acid-based therapeutic strategies that have been tested to date. PMID:22288011

  19. The acne continuum: an age-based approach to therapy.

    PubMed

    Friedlander, Sheila Fallon; Baldwin, Hilary E; Mancini, Anthony J; Yan, Albert C; Eichenfield, Lawrence F

    2011-09-01

    Acne vulgaris is classically considered a disease of adolescence. Although it most commonly occurs and has been best studied in that age group, it can develop at any time during childhood. It is important that health care practitioners recognize the manifestations of neonatal, infantile and childhood acne, as well as the differential diagnosis and best therapeutic approach in the younger child. Acneiform eruptions in infants and toddlers can occasionally be associated with scarring or with other significant disorders that may be life-threatening. In this article, the authors draw on their own clinical experience as well as the available literature to suggest an age-based approach to managing acne in children from the neonatal period through age 11 years.

  20. A scale-based approach to interdisciplinary research and expertise in sports.

    PubMed

    Ibáñez-Gijón, Jorge; Buekers, Martinus; Morice, Antoine; Rao, Guillaume; Mascret, Nicolas; Laurin, Jérome; Montagne, Gilles

    2017-02-01

    More than 20 years after the introduction of ecological and dynamical approaches in sports research, their promise for interdisciplinary research has yet to be fulfilled. The complexity of the research process and the theoretical and empirical difficulties associated with an integrated ecological-dynamical approach have been the major factors hindering the generalisation of interdisciplinary projects in sports sciences. To facilitate this generalisation, we integrate the major concepts from the ecological and dynamical approaches to study behaviour as a multi-scale process. Our integration gravitates around the distinction between functional (ecological) and execution (organic) scales, and their reciprocal intra- and inter-scale constraints. We propose an (epistemological) scale-based definition of constraints that accounts for the concept of synergies as emergent coordinative structures. To illustrate how we can operationalise the notion of multi-scale synergies, we use an interdisciplinary model of locomotor pointing. To conclude, we show the value of this approach for interdisciplinary research in sport sciences, as we discuss two examples of task-specific dimensionality reduction techniques in the context of an ongoing project that aims to unveil the determinants of expertise in basketball free throw shooting. These techniques provide relevant empirical evidence to help bootstrap the challenging modelling efforts required in sport sciences.
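    A dimensionality reduction step of the kind mentioned above can be sketched with the simplest case: extracting the first principal component of movement data by power iteration on the covariance matrix. This is a generic illustration of the idea, not the authors' task-specific technique.

```python
def first_principal_component(data, n_iter=200):
    """Power iteration on the covariance matrix to find the direction of
    maximal variance -- the simplest dimensionality-reduction step."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    centered = [[row[j] - means[j] for j in range(d)] for row in data]
    # Sample covariance matrix (d x d)
    cov = [[sum(r[i] * r[j] for r in centered) / (n - 1) for j in range(d)]
           for i in range(d)]
    v = [1.0] * d
    for _ in range(n_iter):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v  # unit vector along the dominant mode of variation

def project(data, component):
    """Reduce each observation to its score on the extracted component."""
    return [sum(a * b for a, b in zip(row, component)) for row in data]
```

    Applied to, say, joint-angle time series from free throw trials, the scores along such components give a low-dimensional description of a coordinative structure that can be compared across expertise levels.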