Science.gov

Sample records for accounting approach based

  1. Accounting Control Technology Using SAP: A Case-Based Approach

    ERIC Educational Resources Information Center

    Ragan, Joseph; Puccio, Christopher; Talisesky, Brandon

    2014-01-01

    The Sarbanes-Oxley Act (SOX) revolutionized the accounting and audit industry. The use of preventative and process controls to evaluate the continuous audit process done via an SAP ERP ECC 6.0 system is key to compliance with SOX and managing costs. This paper can be used in a variety of ways to discuss issues associated with auditing and testing…

  2. Back to the Future: Implementing a Broad Economic, Inquiry-Based Approach to Accounting Education

    ERIC Educational Resources Information Center

    Frecka, Thomas J.; Morris, Michael H.; Ramanan, Ramachandran

    2004-01-01

    Motivated by concerns about the quality of accounting education and calls for a broader, more active approach to learning by numerous accounting educators and practitioners over the past 2 decades, the authors of this article sought to provide a framework and example materials to address those issues. The framework makes use of broad, economic…

  3. Greenhouse Gas Emissions Accounting of Urban Residential Consumption: A Household Survey Based Approach

    PubMed Central

    Lin, Tao; Yu, Yunjun; Bai, Xuemei; Feng, Ling; Wang, Jin

    2013-01-01

Devising policies for a low carbon city requires a careful understanding of the characteristics of urban residential lifestyle and consumption. The production-based accounting approach based on top-down statistical data has a limited ability to reflect the total greenhouse gas (GHG) emissions from residential consumption. In this paper, we present a survey-based GHG emissions accounting methodology for urban residential consumption, and apply it in Xiamen City, a rapidly urbanizing coastal city in southeast China. On this basis, the main factors determining residential GHG emissions at the household and community scales are identified, and typical profiles of low, medium and high GHG emission households and communities are characterized. Up to 70% of household GHG emissions come from regional and national activities that support household consumption, including the supply of energy and building materials; 17% come from urban-level basic services and supplies such as sewage treatment and solid waste management; and only 13% are direct emissions from household consumption. Housing area and household size are the two main factors determining GHG emissions from residential consumption at the household scale, while average housing area and building height are the main factors at the community scale. Our results show a large disparity in GHG emissions profiles among households: high-emission households emit about five times more than low-emission households, and emissions from high-emission communities are about twice those from low-emission communities. Our findings can contribute to better tailored and targeted policies aimed at reducing household GHG emissions and developing low-emission residential communities in China. PMID:23405187

  4. Greenhouse gas emissions accounting of urban residential consumption: a household survey based approach.

    PubMed

    Lin, Tao; Yu, Yunjun; Bai, Xuemei; Feng, Ling; Wang, Jin

    2013-01-01

Devising policies for a low carbon city requires a careful understanding of the characteristics of urban residential lifestyle and consumption. The production-based accounting approach based on top-down statistical data has a limited ability to reflect the total greenhouse gas (GHG) emissions from residential consumption. In this paper, we present a survey-based GHG emissions accounting methodology for urban residential consumption, and apply it in Xiamen City, a rapidly urbanizing coastal city in southeast China. On this basis, the main factors determining residential GHG emissions at the household and community scales are identified, and typical profiles of low, medium and high GHG emission households and communities are characterized. Up to 70% of household GHG emissions come from regional and national activities that support household consumption, including the supply of energy and building materials; 17% come from urban-level basic services and supplies such as sewage treatment and solid waste management; and only 13% are direct emissions from household consumption. Housing area and household size are the two main factors determining GHG emissions from residential consumption at the household scale, while average housing area and building height are the main factors at the community scale. Our results show a large disparity in GHG emissions profiles among households: high-emission households emit about five times more than low-emission households, and emissions from high-emission communities are about twice those from low-emission communities. Our findings can contribute to better tailored and targeted policies aimed at reducing household GHG emissions and developing low-emission residential communities in China. PMID:23405187

  5. The Job-Oriented Approach to Beginning Accounting

    ERIC Educational Resources Information Center

    Spanswick, Ralph

    1976-01-01

    An instructional approach for high school students, based on employment opportunities, is described in four phases: exploring accounting jobs, the accounting cycle, job training, and job placement. (MS)

  6. Building Student Success Using Problem-Based Learning Approach in the Accounting Classroom

    ERIC Educational Resources Information Center

    Shawver, Todd A.

    2015-01-01

A major area of concern in academia is student retention at the university, college, and departmental levels. As academics, we can do a considerable amount to improve student retention and reduce the attrition rates in our departments. One way to do this is to take an innovative approach in the classroom to enhance the…

  7. Information theoretic approach for accounting classification

    NASA Astrophysics Data System (ADS)

    Ribeiro, E. M. S.; Prataviera, G. A.

    2014-12-01

In this paper we consider an information theoretic approach to the accounting classification process. We propose a matrix formalism and an algorithm for calculating information theoretic measures associated with accounting classification. The formalism may be useful for further generalizations and computer-based implementation. Two information theoretic measures, mutual information and symmetric uncertainty, were evaluated for daily transactions recorded in the chart of accounts of a small company over two years. Variation in the information measures due to the aggregation of data during accounting classification is observed. In particular, symmetric uncertainty appears to be a useful parameter for comparing companies over time, across sectors, or under different accounting choices and standards.
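The two measures named in this abstract can be sketched directly from a joint-frequency table of transactions versus account classes. The counts below are invented for illustration; only the formulas (mutual information and symmetric uncertainty, SU = 2·I(X;Y)/(H(X)+H(Y))) come from standard information theory, not from the paper's specific formalism.

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits of a probability vector."""
    return -sum(p * log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y), H(X), H(Y) from a joint count matrix: rows = X, cols = Y."""
    total = sum(sum(row) for row in joint)
    px = [sum(row) / total for row in joint]
    py = [sum(col) / total for col in zip(*joint)]
    mi = 0.0
    for i, row in enumerate(joint):
        for j, n in enumerate(row):
            if n:
                pxy = n / total
                mi += pxy * log2(pxy / (px[i] * py[j]))
    return mi, entropy(px), entropy(py)

def symmetric_uncertainty(joint):
    """SU = 2*I(X;Y) / (H(X) + H(Y)), normalized to [0, 1]."""
    mi, hx, hy = mutual_information(joint)
    return 2 * mi / (hx + hy) if hx + hy else 0.0

# Hypothetical counts: transaction types (rows) vs. account classes (cols)
counts = [[30, 5, 0],
          [2, 40, 3],
          [0, 4, 16]]
su = symmetric_uncertainty(counts)
```

Because SU is normalized, it can be compared across companies or periods even when the underlying charts of accounts differ in size, which is presumably why the authors single it out.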

  8. Accounting for selection bias in species distribution models: An econometric approach on forested trees based on structural modeling

    NASA Astrophysics Data System (ADS)

    Ay, Jean-Sauveur; Guillemot, Joannès; Martin-StPaul, Nicolas K.; Doyen, Luc; Leadley, Paul

    2015-04-01

Species distribution models (SDMs) are widely used to study and predict the outcome of global change on species. In human-dominated ecosystems, the presence of a given species is the result of both its ecological suitability and the human footprint on nature, such as land use choices. Land use choices may thus be responsible for a selection bias in the presence/absence data used in SDM calibration. We present a structural modelling approach (i.e. based on structural equation modelling) that accounts for this selection bias. The new structural species distribution model (SSDM) estimates simultaneously land use choices and species responses to bioclimatic variables. A land use equation based on an econometric model of landowner choices was joined to an equation of species response to bioclimatic variables. The SSDM allows the residuals of both equations to be dependent, taking into account the possibility of shared omitted variables and measurement errors. We provide a general description of the statistical theory and a set of applications on forested trees over France, using databases of climate and forest inventory at different spatial resolutions (from 2 km to 8 km). We also compared the output of the SSDM with the output of a classical SDM in terms of bioclimatic response curves and potential distribution under current climate. Depending on the species and the spatial resolution of the calibration dataset, the shapes of the bioclimatic response curves and the modelled species distribution maps differed markedly between the SSDM and classical SDMs. The magnitude and direction of these differences depended on the correlations between the errors from both equations and were highest at higher spatial resolutions. A first conclusion is that the use of classical SDMs can potentially lead to strong misestimation of the actual and future probability of presence modelled. Beyond this selection bias, the SSDM we propose represents a crucial step toward accounting for economic constraints on tree…

  9. Starworld: Preparing Accountants for the Future: A Case-Based Approach to Teach International Financial Reporting Standards Using ERP Software

    ERIC Educational Resources Information Center

    Ragan, Joseph M.; Savino, Christopher J.; Parashac, Paul; Hosler, Jonathan C.

    2010-01-01

    International Financial Reporting Standards now constitute an important part of educating young professional accountants. This paper looks at a case based process to teach International Financial Reporting Standards using integrated Enterprise Resource Planning software. The case contained within the paper can be used within a variety of courses…

  10. School Centered Evidence Based Accountability

    ERIC Educational Resources Information Center

    Milligan, Charles

    2015-01-01

Achievement scores drive much of the effort in today's accountability system; however, there is much more that occurs in every school, every day. School Centered Evidence Based Accountability can be used at levels from micro to macro, giving school boards and administrators a process for monitoring the results of the entire school operation effectively and…

  11. Integrated Approach to User Account Management

    NASA Technical Reports Server (NTRS)

    Kesselman, Glenn; Smith, William

    2007-01-01

IT environments consist of both Windows and other platforms. Providing user account management for this model has become increasingly difficult. If Microsoft's Active Directory could be enhanced to extend a Windows identity for authentication services for Unix, Linux, Java and Macintosh systems, then an integrated approach to user account management could be realized.

  12. Assessing Students' Accounting Knowledge: A Structural Approach.

    ERIC Educational Resources Information Center

    Boldt, Margaret N.

    2001-01-01

    Comparisons of students' representations of financial accounting concepts with the knowledge structures of experts were depicted using Pathfinder networks. This structural approach identified the level of students' understanding of concepts and knowledge gaps that need to be addressed. (SK)

  13. Competency-Based Accounting Instruction

    ERIC Educational Resources Information Center

    Graham, John E.

    1977-01-01

    Shows how the proposed model (an individualized competency based learning system) can be used effectively to produce a course in accounting principles which adapts to different entering competencies and to different rates and styles of learning. (TA)

  14. The National Stream Quality Accounting Network: a flux-based approach to monitoring the water quality of large rivers

    NASA Astrophysics Data System (ADS)

    Hooper, Richard P.; Aulenbach, Brent T.; Kelly, Valerie J.

    2001-05-01

    Estimating the annual mass flux at a network of fixed stations is one approach to characterizing water quality of large rivers. The interpretive context provided by annual flux includes identifying source and sink areas for constituents and estimating the loadings to receiving waters, such as reservoirs or the ocean. Since 1995, the US Geological Survey's National Stream Quality Accounting Network (NASQAN) has employed this approach at a network of 39 stations in four of the largest river basins of the USA: the Mississippi, the Columbia, the Colorado and the Rio Grande. In this paper, the design of NASQAN is described and its effectiveness at characterizing the water quality of these rivers is evaluated using data from the first 3 years of operation. A broad range of constituents was measured by NASQAN, including trace organic and inorganic chemicals, major ions, sediment and nutrients. Where possible, a regression model relating concentration to discharge and season was used to interpolate between chemical observations for flux estimation. For water-quality network design, the most important finding from NASQAN was the importance of having a specific objective (that is, estimating annual mass flux) and, from that, an explicitly stated data analysis strategy, namely the use of regression models to interpolate between observations. The use of such models aided in the design of sampling strategy and provided a context for data review. The regression models essentially form null hypotheses for concentration variation that can be evaluated by the observed data. The feedback between network operation and data collection established by the hypothesis tests places the water-quality network on a firm scientific footing. Published in 2001 by John Wiley & Sons, Ltd.
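The regression-interpolation strategy described above is commonly implemented as a log-linear rating-curve model. The sketch below assumes a minimal form, ln C = b0 + b1 ln Q + b2 sin(2πt) + b3 cos(2πt), fitted by least squares; the data, coefficients, and unit conversion are synthetic illustrations, not NASQAN's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "observed" record: ~36 sampling dates over 3 years
t_obs = np.sort(rng.uniform(0, 3, 36))            # decimal years
q_obs = np.exp(rng.normal(4, 0.5, t_obs.size))    # discharge, m3/s
true_conc = 8.0 * q_obs**-0.2 * np.exp(0.3 * np.sin(2 * np.pi * t_obs))
c_obs = true_conc * np.exp(rng.normal(0, 0.05, t_obs.size))  # mg/L

# Design matrix: ln C = b0 + b1 ln Q + b2 sin(2*pi*t) + b3 cos(2*pi*t)
X = np.column_stack([np.ones_like(t_obs), np.log(q_obs),
                     np.sin(2 * np.pi * t_obs), np.cos(2 * np.pi * t_obs)])
beta, *_ = np.linalg.lstsq(X, np.log(c_obs), rcond=None)

# Interpolate concentration to a daily series and sum C*Q for annual flux
t_day = np.arange(365) / 365
q_day = np.exp(rng.normal(4, 0.5, t_day.size))    # daily discharge record
Xd = np.column_stack([np.ones_like(t_day), np.log(q_day),
                      np.sin(2 * np.pi * t_day), np.cos(2 * np.pi * t_day)])
c_day = np.exp(Xd @ beta)                         # predicted mg/L
flux = np.sum(c_day * q_day) * 86400 * 1e-6       # tonnes/yr (unit sketch)
```

The fitted model doubles as the "null hypothesis" the abstract mentions: residuals between observed concentrations and the regression predictions flag samples worth reviewing.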

  15. Approaches to accountability in long-term care.

    PubMed

    Berta, Whitney; Laporte, Audrey; Wodchis, Walter P

    2014-09-01

    This paper discusses the array of approaches to accountability in Ontario long-term care (LTC) homes. A focus group involving key informants from the LTC industry, including both for-profit and not-for-profit nursing home owners/operators, was used to identify stakeholders involved in formulating and implementing LTC accountability approaches and the relevant regulations, policies and initiatives relating to accountability in the LTC sector. These documents were then systematically reviewed. We found that the dominant mechanisms have been financial incentives and oversight, regulations and information; professionalism has played a minor role. More recently, measurement for accountability in LTC has grown to encompass an array of fiscal, clinical and public accountability measurement mechanisms. The goals of improved quality and accountability are likely more achievable using these historical regulatory approaches, but the recent rapid increase in data and measurability could also enable judicious application of market-based approaches. PMID:25305396

  16. Approaches to Accountability in Long-Term Care

    PubMed Central

    Berta, Whitney; Laporte, Audrey; Wodchis, Walter P.

    2014-01-01

    This paper discusses the array of approaches to accountability in Ontario long-term care (LTC) homes. A focus group involving key informants from the LTC industry, including both for-profit and not-for-profit nursing home owners/operators, was used to identify stakeholders involved in formulating and implementing LTC accountability approaches and the relevant regulations, policies and initiatives relating to accountability in the LTC sector. These documents were then systematically reviewed. We found that the dominant mechanisms have been financial incentives and oversight, regulations and information; professionalism has played a minor role. More recently, measurement for accountability in LTC has grown to encompass an array of fiscal, clinical and public accountability measurement mechanisms. The goals of improved quality and accountability are likely more achievable using these historical regulatory approaches, but the recent rapid increase in data and measurability could also enable judicious application of market-based approaches. PMID:25305396

  17. School District Program Cost Accounting: An Alternative Approach

    ERIC Educational Resources Information Center

    Hentschke, Guilbert C.

    1975-01-01

    Discusses the value for school districts of a program cost accounting system and examines different approaches to generating program cost data, with particular emphasis on the "cost allocation to program system" (CAPS) and the traditional "transaction-based system." (JG)

  18. Restorative Justice as Strength-Based Accountability

    ERIC Educational Resources Information Center

    Ball, Robert

    2003-01-01

    This article compares strength-based and restorative justice philosophies for young people and their families. Restorative justice provides ways to respond to crime and harm that establish accountability while seeking to reconcile members of a community. Restorative approaches are an important subset of strength-based interventions.

  19. Teaching the Indirect Method of the Statement of Cash Flows in Introductory Financial Accounting: A Comprehensive, Problem-Based Approach

    ERIC Educational Resources Information Center

    Brickner, Daniel R.; McCombs, Gary B.

    2004-01-01

    In this article, the authors provide an instructional resource for presenting the indirect method of the statement of cash flows (SCF) in an introductory financial accounting course. The authors focus primarily on presenting a comprehensive example that illustrates the "why" of SCF preparation and show how journal entries and T-accounts can be…

  20. Accounting Ethics Education: An Interactive Approach

    ERIC Educational Resources Information Center

    White, Gwendolen B.

    2004-01-01

    An interactive and technological approach was used to discuss ethics with accounting students. Students responded anonymously to ethics questions using wireless transmitters. The students' responses were shown to the group. A customized DVD of movie scenes from "The Producers" and "Wall Street" and a still picture of Enron's…

  1. Teaching Financial Accounting via a Worksheet Approach.

    ERIC Educational Resources Information Center

    Vincent, Vern C.; Dietz, Elizabeth M.

    A classroom research study investigated the effectiveness of an approach to financial accounting instruction that uses worksheets to bring together the conceptual and practical aspects of the field. Students were divided into two groups, one taught by traditional lecture method and the other taught with worksheet exercises and lectures stressing…

  2. A new approach to account for the medium-dependent effect in model-based dose calculations for kilovoltage x-rays

    NASA Astrophysics Data System (ADS)

    Pawlowski, Jason M.; Ding, George X.

    2011-07-01

    This study presents a new approach to accurately account for the medium-dependent effect in model-based dose calculations for kilovoltage (kV) x-rays. This approach is based on the hypothesis that the correction factors needed to convert dose from model-based dose calculations to absorbed dose-to-medium depend on both the attenuation characteristics of the absorbing media and the changes to the energy spectrum of the incident x-rays as they traverse media with an effective atomic number different than that of water. Using Monte Carlo simulation techniques, we obtained empirical medium-dependent correction factors that take both effects into account. We found that the correction factors can be expressed as a function of a single quantity, called the effective bone depth, which is a measure of the amount of bone that an x-ray beam must penetrate to reach a voxel. Since the effective bone depth can be calculated from volumetric patient CT images, the medium-dependent correction factors can be obtained for model-based dose calculations based on patient CT images. We tested the accuracy of this new approach on 14 patients for the case of calculating imaging dose from kilovoltage cone-beam computed tomography used for patient setup in radiotherapy, and compared it with the Monte Carlo method, which is regarded as the 'gold standard'. For all patients studied, the new approach resulted in mean dose errors of less than 3%. This is in contrast to current available inhomogeneity corrected methods, which have been shown to result in mean errors of up to -103% for bone and 8% for soft tissue. Since there is a huge gain in the calculation speed relative to the Monte Carlo method (~two orders of magnitude) with an acceptable loss of accuracy, this approach provides an alternative accurate dose calculation method for kV x-rays.
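The effective-bone-depth idea above can be illustrated as a ray sum through a voxel grid: accumulate the bone path length upstream of each voxel, then map it to a dose correction factor. The grid, materials, and correction curve below are invented stand-ins for the paper's Monte Carlo-derived factors.

```python
import numpy as np

voxel_mm = 2.0
# Toy 1-D beam path: 0 = water-like tissue, 1 = bone (hypothetical layout)
material = np.array([0, 0, 1, 1, 0, 1, 0, 0, 0, 0])

# Effective bone depth at each voxel = bone path length upstream of it
bone_depth_mm = np.concatenate([[0.0], np.cumsum(material[:-1]) * voxel_mm])

# Hypothetical correction curve: dose-to-medium factor vs. bone depth,
# standing in for the paper's precomputed Monte Carlo points
depth_pts = np.array([0.0, 2.0, 4.0, 8.0])
factor_pts = np.array([1.00, 1.08, 1.15, 1.27])
correction = np.interp(bone_depth_mm, depth_pts, factor_pts)

dose_model = np.ones(material.size)          # model-based dose, arbitrary units
dose_to_medium = dose_model * correction     # corrected dose per voxel
```

In the actual method the depths come from patient CT images and the factors from Monte Carlo simulation; the point of the sketch is only that the correction reduces to a per-voxel lookup on a single scalar, which is what makes it ~100x faster than full Monte Carlo.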

  3. A partnership approach to learning about accountability.

    PubMed

    Plant, Nigel; Pitt, Richard; Troke, Ben

    Clinicians and healthcare providers are frequently reminded that they are 'accountable' practitioners - but what is the definition of accountability, and how does it apply in a practical and legal context? To clarify these issues, the University of Nottingham School of Nursing has formed a partnership with Browne Jacobson Solicitors. Together they have developed a 7-stage training programme for nursing students which covers the key aspects of accountability, including ethical concepts, the law of negligence, and scenario-based training on being called as a witness in an investigation. This article introduces the implications of accountability and describes the structure and syllabus of the programme, including participants' feedback on the benefits of the experience. PMID:20622781

  4. Bookkeeping and Accounting: The "Time" Approach to Teaching Accounting

    ERIC Educational Resources Information Center

    Mallue, Henry E., Jr.

    1977-01-01

    Describes the "time" approach, a non-traditional method for teaching Bookkeeping I, which redirects the general climate of the first week of class by not introducing crucial balance sheet and journal concepts, but makes use of sections 441 and 446 of the Internal Revenue Code, thereby permitting students to learn the important role "time"…

  5. The Effects of Different Teaching Approaches in Introductory Financial Accounting

    ERIC Educational Resources Information Center

    Chiang, Bea; Nouri, Hossein; Samanta, Subarna

    2014-01-01

    The purpose of the research is to examine the effect of the two different teaching approaches in the first accounting course on student performance in a subsequent finance course. The study compares 128 accounting and finance students who took introductory financial accounting by either a user approach or a traditional preparer approach to examine…

  6. Enhanced Student Learning in Accounting Utilising Web-Based Technology, Peer-Review Feedback and Reflective Practices: A Learning Community Approach to Assessment

    ERIC Educational Resources Information Center

    Taylor, Sue; Ryan, Mary; Pearce, Jon

    2015-01-01

    Higher education is becoming a major driver of economic competitiveness in an increasingly knowledge-driven global economy. Maintaining the competitive edge has seen an increase in public accountability of higher education institutions through the mechanism of ranking universities based on the quality of their teaching and learning outcomes. As a…

  7. The Cyclical Relationship Approach in Teaching Basic Accounting Principles.

    ERIC Educational Resources Information Center

    Golen, Steven

    1981-01-01

    Shows how teachers can provide a more meaningful presentation of various accounting principles by illustrating them through a cyclical relationship approach. Thus, the students see the entire accounting relationship as a result of doing business. (CT)

  8. Risk-Based Educational Accountability in Dutch Primary Education

    ERIC Educational Resources Information Center

    Timmermans, A. C.; de Wolf, I. F.; Bosker, R. J.; Doolaard, S.

    2015-01-01

    A recent development in educational accountability is a risk-based approach, in which intensity and frequency of school inspections vary across schools to make educational accountability more efficient and effective by enabling inspectorates to focus on organizations at risk. Characteristics relevant in predicting which schools are "at risk…

  9. Students' Approaches to Learning in Problem-Based Learning: Taking into Account Professional Behavior in the Tutorial Groups, Self-Study Time, and Different Assessment Aspects

    ERIC Educational Resources Information Center

    Loyens, Sofie M. M.; Gijbels, David; Coertjens, Liesje; Cote, Daniel J.

    2013-01-01

    Problem-based learning (PBL) represents a major development in higher educational practice and is believed to promote deep learning in students. However, empirical findings on the promotion of deep learning in PBL remain unclear. The aim of the present study is to investigate the relationships between students' approaches to learning (SAL) and…

  10. Students' Approaches to Study in Introductory Accounting Courses

    ERIC Educational Resources Information Center

    Elias, Rafik Z.

    2005-01-01

    Significant education research has focused on the study approaches of students. Two study approaches have been clearly identified: deep and surface. In this study, the author examined the way in which students approach studying introductory accounting courses. In general, he found that GPA and expected course grade were correlated positively with…

  11. Problem-Based Learning in Accounting

    ERIC Educational Resources Information Center

    Dockter, DuWayne L.

    2012-01-01

Seasoned educators use an assortment of student-centered methods and tools to enhance their students' learning environment. With respect to methodologies used in accounting, educators have utilized and created new forms of problem-based learning exercises, including case studies, simulations, and other projects, to help students become more active…

  12. Accounting for Endowment Losses: An Examination of Two Approaches.

    ERIC Educational Resources Information Center

    Jones, M. Paul; Swieringa, Robert J.

    1996-01-01

    Two accounting strategies for classifying college or university endowment losses are compared: reduction of permanently restricted net assets and reduction of unrestricted net assets. The approaches differ in their effects on classification of net assets only when capital losses on endowment investments bring the fund below the level required in…

  13. Intergovernmental Approaches for Strengthening K-12 Accountability Systems

    ERIC Educational Resources Information Center

    Armour-Garb, Allison, Ed.

    2007-01-01

    This volume contains an edited transcript of the Rockefeller Institute's October 29, 2007 symposium (Chicago, IL) entitled "Intergovernmental Approaches to Strengthen K-12 Accountability Systems" as well as a framework paper circulated in preparation for the symposium. The transcript begins with a list of the forty state and federal education…

  14. Arkansas' Curriculum Guide. Competency Based Computerized Accounting.

    ERIC Educational Resources Information Center

    Arkansas State Dept. of Education, Little Rock. Div. of Vocational, Technical and Adult Education.

    This guide contains the essential parts of a total curriculum for a one-year secondary-level course in computerized accounting. Addressed in the individual sections of the guide are the following topics: the complete accounting cycle, computer operations for accounting, computerized accounting and general ledgers, computerized accounts payable,…

  15. Trade-based carbon sequestration accounting.

    PubMed

    King, Dennis M

    2004-04-01

This article describes and illustrates an accounting method to assess and compare "early" carbon sequestration investments and trades on the basis of the number of standardized CO2 emission offset credits they will provide. The "gold standard" for such credits is assumed to be a relatively riskless credit based on a CO2 emission reduction that provides offsets against CO2 emissions on a one-for-one basis. The number of credits associated with carbon sequestration needs to account for time, risk, durability, permanence, additionality, and other factors that future trade regulators will most certainly use to assign "official" credits to sequestration projects. The method presented here uses established principles of natural resource accounting and conventional rules of asset valuation to "score" projects. A review of 20 "early" voluntary United States-based CO2 offset trades that involve carbon sequestration reveals that the assumptions that buyers, sellers, brokers, and traders are using to characterize the economic potential of their investments and trades vary enormously. The article develops a "universal carbon sequestration credit scoring equation" and uses two of these trades to illustrate the sensitivity of trade outcomes to various assumptions about how future trade auditors are likely to "score" carbon sequestration projects in terms of their "equivalency" with CO2 emission reductions. The article emphasizes the importance of using a standard credit scoring method that accounts for time and risk to assess and compare even unofficial prototype carbon sequestration trades. The scoring method illustrated in this article is a tool that can protect the integrity of carbon sequestration credit trading and can assist buyers and sellers in evaluating the real economic potential of prospective trades. PMID:15453408
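The kind of "credit scoring equation" the article describes can be sketched as follows: discount each year's projected sequestration for reversal risk, leakage, and time before summing to offset-equivalent credits. The parameter names and values below are illustrative assumptions, not the article's actual equation.

```python
def score_credits(annual_tonnes, discount_rate=0.05,
                  risk_of_reversal=0.10, leakage=0.15):
    """Present-value CO2-offset-equivalent credits for a
    sequestration schedule (tonnes CO2 per year)."""
    credits = 0.0
    for year, tonnes in enumerate(annual_tonnes, start=1):
        # Haircut for non-permanence and leakage, then discount for time
        effective = tonnes * (1 - risk_of_reversal) * (1 - leakage)
        credits += effective / (1 + discount_rate) ** year
    return credits

# 10 years of 100 tCO2/yr sequestration scores well below the
# nominal 1,000 tonnes once risk and time are accounted for
credits = score_credits([100.0] * 10)
```

This is why the article stresses standardization: two traders applying different discount rates or risk haircuts to the same physical project will report very different credit counts.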

  16. Accountability for Project-Based Collaborative Learning

    ERIC Educational Resources Information Center

    Jamal, Abu-Hussain; Essawi, Mohammad; Tilchin, Oleg

    2014-01-01

One promising model for creating a learning environment that fosters the development of students' thinking is the Project-Based Collaborative Learning (PBCL) model. This model organizes learning around the collaborative performance of various projects. In this paper we describe an approach to enhancing the PBCL model through the creation of…

  17. The Value of Information: Approaches in Economics, Accounting, and Management Science.

    ERIC Educational Resources Information Center

    Repo, Aatto J.

    1989-01-01

    This review and analysis of research on the economics of information performed by economists, accounting researchers, and management scientists focuses on their approaches to describing and measuring the value of information. The discussion includes comparisons of research approaches based on cost effectiveness and on the value of information. (77…

  18. Accountability, Student Assessment, and the Need for a Comprehensive Approach

    ERIC Educational Resources Information Center

    Volante, Louis

    2005-01-01

    Accountability has become synonymous with standardized testing in many Western countries such as Canada, the United States, Great Britain and New Zealand. Schools and districts are increasingly ranked based on their students' performance on standardized tests. Unfortunately, standardized testing measures possess a number of limitations that…

  19. An application of model-based reasoning to accounting systems

    SciTech Connect

    Nado, R.; Chams, M.; Delisio, J.; Hamscher, W.

    1996-12-31

An important problem faced by auditors is gauging how much reliance can be placed on the accounting systems that process millions of transactions to produce the numbers summarized in a company's financial statements. Accounting systems contain internal controls, procedures designed to detect and correct errors and irregularities that may occur in the processing of transactions. In a complex accounting system, it can be an extremely difficult task for the auditor to anticipate the possible errors that can occur and to evaluate the effectiveness of the controls at detecting them. An accurate analysis must take into account the unique features of each company's business processes. To cope with this complexity and variability, the Comet system applies a model-based reasoning approach to the analysis of accounting systems and their controls. An auditor uses Comet to create a hierarchical flowchart model that describes the intended processing of business transactions by an accounting system and the operation of its controls. Comet uses the constructed model to automatically analyze the effectiveness of the controls in detecting potential errors. Price Waterhouse auditors have used Comet on a variety of real audits in several countries around the world.

  20. A Comparative Study of a Traditional Approach and a Multimedia Approach to Teaching the Accounting Cycle.

    ERIC Educational Resources Information Center

    Squizzero, William E.

    Because of course difficulty and dryness, students of elementary accounting often have poor grades and manifest low interest and a 40-50% dropout rate. In answer to these problems, an experimental multimedia approach to teaching the accounting cycle was tested with 58 students against a control group of 62. Both groups consisted of male and female…

  1. Accounting for Errors in Model Analysis Theory: A Numerical Approach

    NASA Astrophysics Data System (ADS)

    Sommer, Steven R.; Lindell, Rebecca S.

    2004-09-01

By studying the patterns of a group of individuals' responses to a series of multiple-choice questions, researchers can utilize Model Analysis Theory to create a probability distribution of mental models for a student population. The eigenanalysis of this distribution yields information about what mental models the students possess, as well as how consistently they utilize said mental models. Although the theory considers the probabilistic distribution to be fundamental, there exist opportunities for random errors to occur. In this paper we will discuss a numerical approach for mathematically accounting for these random errors. As an example of this methodology, analysis of data obtained from the Lunar Phases Concept Inventory will be presented. Limitations and applicability of this numerical approach will be discussed.
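The density-matrix construction that underlies Model Analysis Theory can be sketched as follows. The response fractions below are hypothetical, and this simplified version omits the error-accounting extension the paper proposes: each student is represented by the unit vector of square roots of their model-use fractions, the class density matrix is the average of outer products, and eigenanalysis reveals the dominant model mixture and how consistently it is used.

```python
import numpy as np

# Hypothetical data: per student, the fraction of questions answered
# using model 1, model 2, or model 3 (rows sum to 1).
model_fractions = np.array([
    [0.8, 0.2, 0.0],
    [0.5, 0.5, 0.0],
    [0.1, 0.7, 0.2],
    [0.9, 0.1, 0.0],
])

# Each student's "model state" is the unit vector of square roots of
# these fractions; the class density matrix averages the outer products.
states = np.sqrt(model_fractions)                 # each row has unit norm
density = states.T @ states / states.shape[0]     # 3x3 symmetric, trace = 1

eigvals, eigvecs = np.linalg.eigh(density)        # ascending eigenvalues
# The largest eigenvalue measures how consistently the class uses its
# dominant mixture of models; its eigenvector gives that mixture.
dominant = eigvecs[:, np.argmax(eigvals)]
print(np.isclose(np.trace(density), 1.0))  # True
```

The trace of the density matrix is 1 by construction, and a largest eigenvalue near 1 would indicate a class using a single model consistently.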

  2. Trading Land: A Review of Approaches to Accounting for Upstream Land Requirements of Traded Products

    PubMed Central

    Haberl, Helmut; Kastner, Thomas; Wiedenhofer, Dominik; Eisenmenger, Nina; Erb, Karl‐Heinz

    2015-01-01

    Summary Land use is recognized as a pervasive driver of environmental impacts, including climate change and biodiversity loss. Global trade leads to “telecoupling” between the land use of production and the consumption of biomass‐based goods and services. Telecoupling is captured by accounts of the upstream land requirements associated with traded products, also commonly referred to as land footprints. These accounts face challenges in two main areas: (1) the allocation of land to products traded and consumed and (2) the metrics to account for differences in land quality and land‐use intensity. For two main families of accounting approaches (biophysical, factor‐based and environmentally extended input‐output analysis), this review discusses conceptual differences and compares results for land footprints. Biophysical approaches are able to capture a large number of products and different land uses, but suffer from a truncation problem. Economic approaches solve the truncation problem, but are hampered by the limited disaggregation of sectors and products. In light of the conceptual differences, the overall similarity of results generated by both types of approaches is remarkable. Diametrically opposed results for some of the world's largest producers and consumers of biomass‐based products, however, make interpretation difficult. This review aims to provide clarity on some of the underlying conceptual issues of accounting for land footprints. PMID:27547028

  3. Accountability.

    ERIC Educational Resources Information Center

    The Newsletter of the Comprehensive Center-Region VI, 1999

    1999-01-01

    Controversy surrounding the accountability movement is related to how the movement began in response to dissatisfaction with public schools. Opponents see it as one-sided, somewhat mean-spirited, and a threat to the professional status of teachers. Supporters argue that all other spheres of the workplace have accountability systems and that the…

  4. Accountability.

    ERIC Educational Resources Information Center

    Lashway, Larry

    1999-01-01

    This issue reviews publications that provide a starting point for principals looking for a way through the accountability maze. Each publication views accountability differently, but collectively these readings argue that even in an era of state-mandated assessment, principals can pursue proactive strategies that serve students' needs. James A.…

  5. Teaching Consolidations Accounting: An Approach to Easing the Challenge

    ERIC Educational Resources Information Center

    Murphy, Elizabeth A.; McCarthy, Mark A.

    2010-01-01

    Teaching and learning accounting for consolidations is a challenging endeavor. Students not only need to understand the conceptual underpinnings of the accounting requirements for consolidations, but also must master the complex accounting needed to prepare consolidated financial statements. To add to the challenge, the consolidation process is…

  6. Incentives and Test-Based Accountability in Education

    ERIC Educational Resources Information Center

    Hout, Michael, Ed.; Elliott, Stuart W., Ed.

    2011-01-01

    In recent years there have been increasing efforts to use accountability systems based on large-scale tests of students as a mechanism for improving student achievement. The federal No Child Left Behind Act (NCLB) is a prominent example of such an effort, but it is only the continuation of a steady trend toward greater test-based accountability in…

  7. A Humanistic Approach to South African Accounting Education

    ERIC Educational Resources Information Center

    West, A.; Saunders, S.

    2006-01-01

    Humanistic psychologist Carl Rogers made a distinction between traditional approaches and humanistic "learner-centred" approaches to education. The traditional approach holds that educators impart their knowledge to willing and able recipients; whereas the humanistic approach holds that educators act as facilitators who assist learners in their…

  8. Community-Based School Finance and Accountability: A New Era for Local Control in Education Policy?

    ERIC Educational Resources Information Center

    Vasquez Heilig, Julian; Ward, Derrick R.; Weisman, Eric; Cole, Heather

    2014-01-01

    Top-down accountability policies have arguably had very limited impact over the past 20 years. Education stakeholders are now contemplating new forms of bottom-up accountability. In 2013, policymakers in California enacted a community-based approach that creates the Local Control Funding Formula (LCFF) process for school finance to increase…

  9. Student Accountability in Team-Based Learning Classes

    ERIC Educational Resources Information Center

    Stein, Rachel E.; Colyer, Corey J.; Manning, Jason

    2016-01-01

    Team-based learning (TBL) is a form of small-group learning that assumes stable teams promote accountability. Teamwork promotes communication among members; application exercises promote active learning. Students must prepare for each class; failure to do so harms their team's performance. Therefore, TBL promotes accountability. As part of the…

  10. Test-Based Teacher Evaluations: Accountability vs. Responsibility

    ERIC Educational Resources Information Center

    Bolyard, Chloé

    2015-01-01

    Gert Biesta contends that managerial accountability, which focuses on efficiency and competition, dominates the current political arena in education. Such accountability has influenced states' developments of test-based teacher evaluations in an attempt to quantify teachers' efficacy on student learning. With numerous state policies requiring the…

  11. Individual Health Accounts: An Alternative Health Care Financing Approach

    PubMed Central

    Stano, Miron

    1981-01-01

    After examining the major determinants of inefficiency in health care markets and several recent proposals to correct these problems, this paper introduces a market-oriented alternative which could be highly efficient while meeting all the established goals of a national health plan. To achieve these objectives, traditional forms of insurance would be replaced by a system with the following characteristics: (1) Instead of buying insurance, individuals and their employers would be required to contribute into individual health accounts from which each family would pay for medical care; (2) Once accumulations attain a designated level, any excess accumulations are distributed to the individual; and (3) A national health fund is established to support those without regular accumulations or those whose accounts have been depleted. This paper develops these principles to show how everyone would have access to care as well as the financial security normally associated with comprehensive insurance. But, by inducing many patients to behave as if they were paying for the full cost of care through reductions in potential earnings from their accounts, the paper explains how significant savings in total spending could also be achieved. PMID:10309471
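The three design rules of the proposed individual health accounts can be sketched as a toy yearly update. The ordering of the rules within a year and all dollar amounts are assumptions for illustration, not details from the paper.

```python
# Toy simulation of the individual health account rules described above.
# Contribution, cap, and expense figures are illustrative assumptions.

def simulate_year(balance, contribution, expenses, cap):
    """Apply one year of the rules; return (balance, payout, fund_support)."""
    balance += contribution                 # rule 1: mandatory contribution
    fund_support = 0.0
    if expenses > balance:                  # rule 3: national fund backstop
        fund_support = expenses - balance
        balance = 0.0
    else:
        balance -= expenses                 # family pays care from its account
    payout = 0.0
    if balance > cap:                       # rule 2: excess returned to holder
        payout = balance - cap
        balance = cap
    return balance, payout, fund_support

bal = 0.0
for expenses in [500.0, 200.0, 6000.0]:     # three illustrative years
    bal, payout, support = simulate_year(bal, 2000.0, expenses, cap=3000.0)
print(bal, payout, support)  # 0.0 0.0 1000.0
```

The third year shows the intended incentive: the family draws down its own accumulation before the national fund covers the remaining 1000.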

  12. Future Performance Trend Indicators: A Current Value Approach to Human Resources Accounting. Report V: The Value Attribution Process. Technical Report.

    ERIC Educational Resources Information Center

    Lapointe, Jean B.; And Others

    The development of future performance trend indicators is based on the current value approach to human resource accounting. The value attribution portion of the current value approach is used to estimate the dollar value of observed changes in the state of the human organization. The procedure for value attribution includes: prediction of changes…

  13. SNM accounting systems: dBase versus C

    SciTech Connect

    Bearse, R.C.; Tisinger, R.M.; Ballmann, J.S.

    1989-01-01

The Fuel Manufacturing Facility (FMF) at Argonne National Laboratory-West (ANL-W) in Idaho Falls accomplishes its internal special nuclear material accounting with a PC-based DYnamic Material ACcounting (PC/DYMAC) system developed as a collaboration between FMF and Los Alamos National Laboratory staff members. This system comprises four computers communicating via floppy disks containing transfer information. The accounting software was written in dBase and compiled under Clipper. The decision was made to network the computers and to speed the accounting process. Moreover, it was decided to extend the collaboration to Sandia National Laboratories staff and to incorporate their recently developed CAMUS and WATCH systems to automate data input and to provide a measure of material control. The current version of the code is being translated into the C language. The implications of such a change will be discussed. 9 refs., 3 figs.

  14. 12 CFR 563b.465 - Do account holders retain any voting rights based on their liquidation sub-accounts?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 5 2011-01-01 2011-01-01 false Do account holders retain any voting rights based on their liquidation sub-accounts? 563b.465 Section 563b.465 Banks and Banking OFFICE OF THRIFT SUPERVISION, DEPARTMENT OF THE TREASURY CONVERSIONS FROM MUTUAL TO STOCK FORM Standard Conversions Liquidation Account § 563b.465 Do account...

  15. 12 CFR 192.465 - Do account holders retain any voting rights based on their liquidation sub-accounts?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 1 2013-01-01 2013-01-01 false Do account holders retain any voting rights based on their liquidation sub-accounts? 192.465 Section 192.465 Banks and Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY CONVERSIONS FROM MUTUAL TO STOCK FORM Standard Conversions Liquidation Account § 192.465 Do account holders...

  16. The utilization of activity-based cost accounting in hospitals.

    PubMed

    Emmett, Dennis; Forget, Robert

    2005-01-01

Healthcare costs are being examined on all fronts. Healthcare accounts for 11% of the gross national product, a share that will continue to rise as the "baby boomers" reach retirement age. While ascertaining costs is important, most research shows that costing methods have not been implemented in hospitals. This study is concerned with the use of costing methods, particularly activity-based cost accounting. A mail survey of CFOs was undertaken to determine the type of cost accounting method they use. In addition, they were asked whether they were aware of activity-based cost accounting and whether they had implemented it or were planning to implement it. Only 71.8% were aware of it and only 4.7% had implemented it. In addition, only 52% of all hospitals report using any cost accounting system. Education needs to ensure that all healthcare executives are cognizant of activity-based accounting and its importance in determining costs. Only by determining costs can hospitals strive to contain them. PMID:16201419

  17. Uncertainty-accounted calculational-experimental approach for improved conservative evaluations of VVER RPV radiation loading parameters

    SciTech Connect

    Borodkin, P.G.; Borodkin, G.I.; Khrennikov, N.N.

    2011-07-01

An improved, uncertainty-accounted approach to conservative evaluation of VVER (vodo-vodyanoi energetichesky reactor) reactor pressure vessel (RPV) radiation loading parameters has been proposed. This approach is based on a calculational-experimental procedure that takes into account the C/E ratio, depending on over- or underestimation, and the uncertainties of measured and calculated results. An application of the elaborated approach to full-scale ex-vessel neutron dosimetry experiments on Russian VVERs, combined with neutron-transport calculations, is demonstrated in the paper. (authors)

  18. IT Metrics and Money: One Approach to Public Accountability

    ERIC Educational Resources Information Center

    Daigle, Stephen L.

    2004-01-01

    Performance measurement can be a difficult political as well as technical challenge for educational institutions at all levels. Performance-based budgeting can raise the stakes still higher by linking resource allocation to a public "report card." The 23-campus system of the California State University (CSU) accepted each of these accountability…

  19. Accounting for Parameter Uncertainty in Reservoir Uncertainty Assessment: The Conditional Finite-Domain Approach

    SciTech Connect

    Babak, Olena Deutsch, Clayton V.

    2009-03-15

    An important aim of modern geostatistical modeling is to quantify uncertainty in geological systems. Geostatistical modeling requires many input parameters. The input univariate distribution or histogram is perhaps the most important. A new method for assessing uncertainty in the histogram, particularly uncertainty in the mean, is presented. This method, referred to as the conditional finite-domain (CFD) approach, accounts for the size of the domain and the local conditioning data. It is a stochastic approach based on a multivariate Gaussian distribution. The CFD approach is shown to be convergent, design independent, and parameterization invariant. The performance of the CFD approach is illustrated in a case study focusing on the impact of the number of data and the range of correlation on the limiting uncertainty in the parameters. The spatial bootstrap method and CFD approach are compared. As the number of data increases, uncertainty in the sample mean decreases in both the spatial bootstrap and the CFD. Contrary to spatial bootstrap, uncertainty in the sample mean in the CFD approach decreases as the range of correlation increases. This is a direct result of the conditioning data being more correlated to unsampled locations in the finite domain. The sensitivity of the limiting uncertainty relative to the variogram and the variable limits are also discussed.
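The spatial bootstrap that the abstract compares against can be sketched as follows: correlated Gaussian values are simulated at the data locations under an assumed covariance model, and the spread of the simulated sample means is taken as the uncertainty in the mean. The unit-sill exponential model, the domain size, and all ranges below are illustrative assumptions, not the case-study parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def spatial_bootstrap_std(coords, corr_range, n_sims=2000):
    """Std. dev. of the sample mean under an exponential covariance model."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    cov = np.exp(-3.0 * d / corr_range)        # exponential model, unit sill
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(coords)))
    sims = L @ rng.standard_normal((len(coords), n_sims))
    return sims.mean(axis=0).std()

coords = rng.uniform(0, 100, size=(50, 2))     # 50 data in a 100x100 domain
# In the spatial bootstrap a longer correlation range makes the data more
# redundant, so uncertainty in the mean increases; this is the behaviour
# the CFD approach is reported to reverse.
print(spatial_bootstrap_std(coords, 10.0) < spatial_bootstrap_std(coords, 80.0))  # True
```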

  20. Accounting for Recoil Effects in Geochronometers: A New Model Approach

    NASA Astrophysics Data System (ADS)

    Lee, V. E.; Huber, C.

    2012-12-01

    dated grain is a major control on the magnitude of recoil loss, the first feature is the ability to calculate recoil effects on isotopic compositions for realistic, complex grain shapes and surface roughnesses. This is useful because natural grains may have irregular shapes that do not conform to simple geometric descriptions. Perhaps more importantly, the surface area over which recoiled nuclides are lost can be significantly underestimated when grain surface roughness is not accounted for, since the recoil distances can be of similar characteristic lengthscales to surface roughness features. The second key feature is the ability to incorporate dynamical geologic processes affecting grain surfaces in natural settings, such as dissolution and crystallization. We describe the model and its main components, and point out implications for the geologically-relevant chronometers mentioned above.
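The surface-area effect described above can be illustrated with the classical geometric estimate of alpha-recoil loss, f = (R/4)(S/V) for recoil distance R, grain surface area S, and volume V. The grain radius, recoil distance, and roughness factor below are illustrative assumptions, not values from the paper's model.

```python
import math

def recoil_loss_fraction(surface_area, volume, recoil_distance):
    """Fraction of recoiled nuclides ejected: f = (R/4) * (S/V),
    the classical flat-surface geometric estimate (consistent units)."""
    return 0.25 * recoil_distance * surface_area / volume

r_um = 10.0                       # 10 micrometre spherical grain (illustrative)
S = 4 * math.pi * r_um**2         # surface area, um^2
V = (4 / 3) * math.pi * r_um**3   # volume, um^3
R = 0.03                          # ~30 nm recoil distance, in um

smooth = recoil_loss_fraction(S, V, R)
# Roughness at the recoil lengthscale raises the effective surface area;
# here a doubling of S doubles the estimated loss.
rough = recoil_loss_fraction(2.0 * S, V, R)
print(round(smooth, 5), rough / smooth)  # 0.00225 2.0
```

Even a modest roughness factor changes the loss fraction proportionally, which is why ignoring surface roughness can significantly underestimate recoil loss.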

  1. A distributed approach to accounting for carbon in wood products

    SciTech Connect

    Marland, Eric; Stellar, Kirk; Marland, Gregg

    2010-01-01

With an evolving political environment of commitments to limit emissions of greenhouse gases, and of markets to trade in emissions permits, there is growing scientific, political, and economic need to accurately evaluate carbon (C) stocks and flows, especially those related to human activities. One component of the global carbon cycle that has been contentious is the stock of carbon that is physically held in harvested wood products. The carbon stored in wood products has sometimes been overlooked, but the amount of carbon contained in wood products is not trivial; it is increasing with time, and it is significant to some Parties. This paper is concerned with accurate treatment of harvested wood products in inventories of CO2 emissions to the atmosphere. The methodologies outlined demonstrate a flexible way to expand current methods beyond the assumption of a simple, first-order decay to include the use of more accurate and detailed data while retaining the convenience of simple formulas. The paper demonstrates that a more accurate representation of decay time can have significant economic implications in a system where emissions are taxed or emissions permits are traded. The method can be easily applied using only data on annual production of wood products and two parameters to characterize their expected lifetime. These methods are not specific to wood products but can be applied to long-lived, carbon-containing products from sources other than wood, e.g. long-lived petrochemical products. A single unifying approach that is both simple and flexible has the potential to be more accurate in its results, more efficient in its implementation, and economically important to some Parties.
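The simple first-order-decay baseline that the paper generalizes can be sketched as follows, summing the surviving carbon from each year's production cohort. The production rate and half-life are illustrative numbers, not values from the paper.

```python
import math

def remaining_stock(annual_production, half_life, years):
    """Carbon (same units as production) still stored after `years`,
    assuming the simple first-order decay the paper seeks to improve on."""
    k = math.log(2) / half_life           # first-order decay constant
    stock = 0.0
    for age in range(years):              # sum surviving cohorts
        stock += annual_production * math.exp(-k * (years - 1 - age))
    return stock

# 1 unit of carbon enters the product pool each year with a 30-year
# half-life (illustrative): the stock grows toward a steady state of
# roughly 1/k (about 43 units) rather than without bound.
print(remaining_stock(1.0, 30.0, 100))
```

Under a constant production rate the stock saturates, which is why the choice of decay parameters, and any departure from first-order decay, matters economically when emissions are taxed or traded.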

  2. On School Choice and Test-Based Accountability

    ERIC Educational Resources Information Center

    Betebenner, Damian W.; Howe, Kenneth R.; Foster, Samara S.

    2005-01-01

    Among the two most prominent school reform measures currently being implemented in The United States are school choice and test-based accountability. Until recently, the two policy initiatives remained relatively distinct from one another. With the passage of the No Child Left Behind Act of 2001 (NCLB), a mutualism between choice and…

  3. Knuckling Under? School Superintendents and Accountability-Based Educational Reform

    ERIC Educational Resources Information Center

    Feuerstein, Abe

    2013-01-01

    The goal of this article is to explore the various ways that superintendents have responded to accountability-based educational reform efforts such as No Child Left Behind, the factors that have influenced their responses, and the implications of these responses for current and future educational leaders. With respect to the first issue, empirical…

  4. Ontology-Based e-Assessment for Accounting Education

    ERIC Educational Resources Information Center

    Litherland, Kate; Carmichael, Patrick; Martínez-García, Agustina

    2013-01-01

    This summary reports on a pilot of a novel, ontology-based e-assessment system in accounting. The system, OeLe, uses emerging semantic technologies to offer an online assessment environment capable of marking students' free text answers to questions of a conceptual nature. It does this by matching their response with a "concept map" or…

  5. A New Approach to Accountability: Creating Effective Learning Environments for Programs

    ERIC Educational Resources Information Center

    Surr, Wendy

    2012-01-01

    This article describes a new paradigm for accountability that envisions afterschool programs as learning organizations continually engaged in improving quality. Nearly 20 years into the era of results-based accountability, a new generation of afterschool accountability systems is emerging. Rather than aiming to test whether programs have produced…

  6. Water Accounting Plus (WA+) - a water accounting procedure for complex river basins based on satellite measurements

    NASA Astrophysics Data System (ADS)

    Karimi, P.; Bastiaanssen, W. G. M.; Molden, D.

    2012-11-01

Coping with the issue of water scarcity and growing competition for water among different sectors requires proper water management strategies and decision processes. A pre-requisite is a clear understanding of the basin hydrological processes, manageable and unmanageable water flows, the interaction with land use and opportunities to mitigate the negative effects and increase the benefits of water depletion on society. Currently, water professionals do not have a common framework that links hydrological flows to user groups of water and their benefits. The absence of a standard hydrological and water management summary is causing confusion and wrong decisions. The non-availability of water flow data is one of the underpinning reasons for not having operational water accounting systems for river basins in place. In this paper we introduce Water Accounting Plus (WA+), which is a new framework designed to provide explicit spatial information on water depletion and net withdrawal processes in complex river basins. The influence of land use on the water cycle is described explicitly by defining land use groups with common characteristics. Analogous to financial accounting, WA+ presents four sheets including (i) a resource base sheet, (ii) a consumption sheet, (iii) a productivity sheet, and (iv) a withdrawal sheet. Every sheet encompasses a set of indicators that summarize the overall water resources situation. The impact of external (e.g. climate change) and internal influences (e.g. infrastructure building) can be estimated by studying the changes in these WA+ indicators. Satellite measurements can be used for three of the four sheets, but are not a precondition for implementing the WA+ framework. Data from hydrological models and water allocation models can also be used as inputs to WA+.
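As a rough illustration of the sheet-and-indicator idea, the fragment below balances a basin's annual flows and computes one depleted-fraction indicator. The flow names and km³ values are illustrative assumptions, not the official WA+ sheet definitions.

```python
# Minimal sketch of a WA+-style resource-base balance for one basin-year.
# All names and km^3 values are invented; the real WA+ sheet defines many
# more terms (exploitable water, reserved flows, utilizable outflow, etc.).
flows = {
    "precipitation": 120.0,      # gross inflow into the basin
    "inter_basin_in": 5.0,
    "landscape_et": 70.0,        # depletion by land use (rain-fed)
    "incremental_et": 20.0,      # depletion by withdrawals (irrigation etc.)
    "outflow": 35.0,             # river discharge leaving the basin
}

gross_inflow = flows["precipitation"] + flows["inter_basin_in"]
depletion = flows["landscape_et"] + flows["incremental_et"]
balance_residual = gross_inflow - depletion - flows["outflow"]  # storage change

depleted_fraction = depletion / gross_inflow   # one summary indicator
print(round(depleted_fraction, 2), balance_residual)  # 0.72 0.0
```

Tracking how such indicators change between years is how external influences (e.g. climate) and internal ones (e.g. new infrastructure) would show up in the accounts.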

  7. Serving Public Interests in Educational Accountability: Alternative Approaches to Democratic Evaluation

    ERIC Educational Resources Information Center

    Ryan, Katherine E.

    2004-01-01

    Today, educational evaluation theory and practice face a critical juncture with the kind of educational accountability evaluation legislated by No Child Left Behind. While the goal of this kind of educational accountability is to improve education, it is characterized by a hierarchical, top-down approach to improving educational achievement…

  8. Accountability and a Systems Approach to Marital Counseling in the University.

    ERIC Educational Resources Information Center

    Paulson, Donald L., Jr.

The purpose of this paper was to bring together the concept of educational accountability and a systems approach to delivering marital counseling services to a university community. In so doing a heavy emphasis was placed on outlining the basic assumptions underlying the current movement for educational accountability and presenting one, very…

  9. Failing Tests: Commentary on "Adapting Educational Measurement to the Demands of Test-Based Accountability"

    ERIC Educational Resources Information Center

    Thissen, David

    2015-01-01

    In "Adapting Educational Measurement to the Demands of Test-Based Accountability" Koretz takes the time-honored engineering approach to educational measurement, identifying specific problems with current practice and proposing minimal modifications of the system to alleviate those problems. In response to that article, David Thissen…

  10. Improving hospital cost accounting with activity-based costing.

    PubMed

    Chan, Y C

    1993-01-01

    In this article, activity-based costing, an approach that has proved to be an improvement over the conventional costing system in product costing, is introduced. By combining activity-based costing with standard costing, health care administrators can better plan and control the costs of health services provided while ensuring that the organization's bottom line is healthy. PMID:8444618
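A minimal sketch of the activity-based costing idea the article introduces: overhead is traced to activity cost pools, each pool gets a rate per unit of its cost driver, and a service line is costed by its driver consumption. The pools, rates, and volumes below are invented for illustration, not figures from the article.

```python
# Sketch of activity-based costing with invented hospital numbers.
cost_pools = {                 # pool -> (total cost, total driver volume)
    "admissions": (50_000.0, 1_000),    # driver: number of admissions
    "lab_tests":  (80_000.0, 4_000),    # driver: number of tests
    "nursing":    (210_000.0, 7_000),   # driver: nursing hours
}
# Rate per driver unit for each activity pool.
rates = {pool: cost / volume for pool, (cost, volume) in cost_pools.items()}

def service_cost(driver_use):
    """Cost assigned to one service line given its driver consumption."""
    return sum(rates[pool] * use for pool, use in driver_use.items())

cardiology = {"admissions": 120, "lab_tests": 900, "nursing": 1_500}
print(service_cost(cardiology))  # 6000 + 18000 + 45000 = 69000.0
```

Combined with standard costing, comparing these assigned costs against budgeted rates and volumes is what lets administrators plan and control service costs.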

  11. Skull base approaches in neurosurgery

    PubMed Central

    2010-01-01

Skull base surgery is one of the most demanding surgeries. Many different structures can easily be injured when operating in the skull base. It is very important for the neurosurgeon to choose the right approach in order to reach the lesion without harming the other intact structures. Due to the pioneering work of Cushing, Hirsch, Yasargil, Krause, Dandy and other dedicated neurosurgeons, it is possible to address tumors and other lesions in the anterior, the mid-line and the posterior cranial base. With the transsphenoidal, the frontolateral, the pterional and the lateral suboccipital approaches, nearly every region of the skull base is exposable. Many different skull base approaches have been described for various neurosurgical diseases during the last 20 years. The selection of an approach may differ from country to country; e.g., in the United States, orbitozygomaticotomy for special lesions of the anterior skull base or petrosectomy for clivus meningiomas is found more frequently than in Europe. The reason for writing this review was the question: are there keyhole approaches with which someone can deal with a vast variety of lesions in the neurosurgical field? In my opinion the different surgical approaches mentioned above cover almost 95% of all skull base tumors and lesions. In the following text these approaches are described: 1) pterional approach 2) frontolateral approach 3) transsphenoidal approach 4) suboccipital lateral approach. These approaches can be extended and combined with each other. In the following we want to enhance this philosophy. PMID:20602753

  12. Ecological accounting based on extended exergy: a sustainability perspective.

    PubMed

    Dai, Jing; Chen, Bin; Sciubba, Enrico

    2014-08-19

The excessive energy consumption, environmental pollution, and ecological destruction problems have gradually become huge obstacles for the development of societal-economic-natural complex ecosystems. For a national ecological-economic system, making resource accounting explicit, diagnosing resource conversion, and measuring the disturbance of environmental emissions to the system are the fundamental basis of sustainable development and coordinated management. This paper presents an extended exergy (EE) accounting that includes material exergy and the exergy equivalent of externalities in a systematic process from production to consumption, and China in 2010 is chosen as a case study to foster an in-depth understanding of the conflict between high-speed development and the available resources. The whole society is decomposed into seven sectors (i.e., Agriculture, Extraction, Conversion, Industry, Transportation, Tertiary, and Domestic sectors) according to their distinct characteristics. An adaptive EE accounting database, which incorporates traditional energy, renewable energy, mineral elements, and other natural resources as well as resource-based secondary products, is constructed on the basis of the internal flows in the system. In addition, the environmental emission accounting has been adjusted to calculate the externality-equivalent exergy. The results show that the EE value for the year 2010 in China was 1.80 × 10(14) MJ, a large increase over earlier years. Furthermore, an EE-based sustainability index system has been established to evaluate the performance of flows and storages within the system from a sustainability perspective. The value of the EE-based sustainability indicator was calculated to be 0.23, much lower than the critical value of 1, implying that China is still in a stage of high energy consumption and a low sustainability level. PMID:25062284

  13. Water Accounting Plus (WA+) - a water accounting procedure for complex river basins based on satellite measurements

    NASA Astrophysics Data System (ADS)

    Karimi, P.; Bastiaanssen, W. G. M.; Molden, D.

    2013-07-01

Coping with water scarcity and growing competition for water among different sectors requires proper water management strategies and decision processes. A pre-requisite is a clear understanding of the basin hydrological processes, manageable and unmanageable water flows, the interaction with land use and opportunities to mitigate the negative effects and increase the benefits of water depletion on society. Currently, water professionals do not have a common framework that links depletion to user groups of water and their benefits. The absence of a standard hydrological and water management summary is causing confusion and wrong decisions. The non-availability of water flow data is one of the underpinning reasons for not having operational water accounting systems for river basins in place. In this paper, we introduce Water Accounting Plus (WA+), which is a new framework designed to provide explicit spatial information on water depletion and net withdrawal processes in complex river basins. The influence of land use and landscape evapotranspiration on the water cycle is described explicitly by defining land use groups with common characteristics. WA+ presents four sheets including (i) a resource base sheet, (ii) an evapotranspiration sheet, (iii) a productivity sheet, and (iv) a withdrawal sheet. Every sheet encompasses a set of indicators that summarise the overall water resources situation. The impact of external (e.g., climate change) and internal influences (e.g., infrastructure building) can be estimated by studying the changes in these WA+ indicators. Satellite measurements can be used to acquire a vast amount of the required data but are not a precondition for implementing the WA+ framework. Data from hydrological models and water allocation models can also be used as inputs to WA+.

  14. An Inter-Institutional Exploration of the Learning Approaches of Students Studying Accounting

    ERIC Educational Resources Information Center

    Byrne, Marann; Flood, Barbara; Willis, Pauline

    2009-01-01

    This paper provides a comparative analysis of the learning approaches of students taking their first course in accounting at a United States or an Irish university. The data for this study was gathered from 204 students in the U.S. and 309 in Ireland, using the Approaches and Study Skills Inventory for Students (ASSIST, 1997) which measures…

  15. Consumption-based accounting of CO2 emissions

    PubMed Central

    Davis, Steven J.; Caldeira, Ken

    2010-01-01

    CO2 emissions from the burning of fossil fuels are the primary cause of global warming. Much attention has been focused on the CO2 directly emitted by each country, but relatively little attention has been paid to the amount of emissions associated with the consumption of goods and services in each country. Consumption-based accounting of CO2 emissions differs from traditional, production-based inventories because of imports and exports of goods and services that, either directly or indirectly, involve CO2 emissions. Here, using the latest available data, we present a global consumption-based CO2 emissions inventory and calculations of associated consumption-based energy and carbon intensities. We find that, in 2004, 23% of global CO2 emissions, or 6.2 gigatonnes CO2, were traded internationally, primarily as exports from China and other emerging markets to consumers in developed countries. In some wealthy countries, including Switzerland, Sweden, Austria, the United Kingdom, and France, >30% of consumption-based emissions were imported, with net imports to many Europeans of >4 tons CO2 per person in 2004. Net import of emissions to the United States in the same year was somewhat less: 10.8% of total consumption-based emissions and 2.4 tons CO2 per person. In contrast, 22.5% of the emissions produced in China in 2004 were exported, on net, to consumers elsewhere. Consumption-based accounting of CO2 emissions demonstrates the potential for international carbon leakage. Sharing responsibility for emissions among producers and consumers could facilitate international agreement on global climate policy that is now hindered by concerns over the regional and historical inequity of emissions. PMID:20212122
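The accounting identity behind this kind of inventory, consumption emissions = production emissions - emissions embodied in exports + emissions embodied in imports, can be sketched as follows. The two-country figures are invented for illustration, not the paper's results.

```python
# Sketch of consumption-based CO2 accounting with invented GtCO2 figures.
production = {"A": 6.0, "B": 3.0}
# embodied[i][j]: emissions occurring in i for goods finally consumed in j
embodied = {"A": {"B": 1.2}, "B": {"A": 0.3}}

def consumption_emissions(c):
    """Production emissions adjusted for trade in embodied emissions."""
    exported = sum(embodied.get(c, {}).values())
    imported = sum(flows.get(c, 0.0) for src, flows in embodied.items() if src != c)
    return production[c] - exported + imported

totals = {c: consumption_emissions(c) for c in production}
print({c: round(v, 2) for c, v in totals.items()})  # {'A': 5.1, 'B': 3.9}
```

Note that the reallocation conserves the global total: emissions only move between countries' accounts, which is why consumption-based inventories expose carbon leakage without changing the worldwide sum.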

  16. Building performance-based accountability with limited empirical evidence: performance measurement for public health preparedness.

    PubMed

    Shelton, Shoshana R; Nelson, Christopher D; McLees, Anita W; Mumford, Karen; Thomas, Craig

    2013-08-01

    Efforts to respond to performance-based accountability mandates for public health emergency preparedness have been hindered by a weak evidence base linking preparedness activities with response outcomes. We describe an approach to measure development that was successfully implemented in the Centers for Disease Control and Prevention Public Health Emergency Preparedness Cooperative Agreement. The approach leverages insights from process mapping and experts to guide measure selection, and provides mechanisms for reducing performance-irrelevant variation in measurement data. Also, issues are identified that need to be addressed to advance the science of measurement in public health emergency preparedness. PMID:24229520

  17. Desired emotions across cultures: A value-based account.

    PubMed

    Tamir, Maya; Schwartz, Shalom H; Cieciuch, Jan; Riediger, Michaela; Torres, Claudio; Scollon, Christie; Dzokoto, Vivian; Zhou, Xiaolu; Vishkin, Allon

    2016-07-01

    Values reflect how people want to experience the world; emotions reflect how people actually experience the world. Therefore, we propose that across cultures people desire emotions that are consistent with their values. Whereas prior research focused on the desirability of specific affective states or 1 or 2 target emotions, we offer a broader account of desired emotions. After reporting initial evidence for the potential causal effects of values on desired emotions in a preliminary study (N = 200), we tested the predictions of our proposed model in 8 samples (N = 2,328) from distinct world cultural regions. Across cultural samples, we found that people who endorsed values of self-transcendence (e.g., benevolence) wanted to feel more empathy and compassion, people who endorsed values of self-enhancement (e.g., power) wanted to feel more anger and pride, people who endorsed values of openness to change (e.g., self-direction) wanted to feel more interest and excitement, and people who endorsed values of conservation (e.g., tradition) wanted to feel more calmness and less fear. These patterns were independent of differences in emotional experience. We discuss the implications of our value-based account of desired emotions for understanding emotion regulation, culture, and other individual differences. (PsycINFO Database Record) PMID:26524003

  18. A School-Based Work Experience for Accounting Students.

    ERIC Educational Resources Information Center

    Lannan, Beverly

    2001-01-01

    The accounting lab at the Pinkerton Academy in Derry, New Hampshire, encompasses the learning of theory, concepts, skills, technology, and work-related competencies, and gives students the opportunity to explore the accounting field. (JOW)

  19. Use of Resources, People and Approaches by Accounting Students in a Blended Learning Environment

    ERIC Educational Resources Information Center

    O'Keefe, Patricia; Rienks, Jane H.; Smith, Bernadette

    2014-01-01

    This research investigates how students used or "blended" the various learning resources, including people, while studying a compulsory, first year accounting unit. The unit design incorporated a blended learning approach. The study was motivated by perceived low rates of attendance and low levels of communication with lecturers which…

  20. Pension Accounting and Reporting with Other Comprehensive Income and Deferred Taxes: A Worksheet Approach

    ERIC Educational Resources Information Center

    Jackson, Robert E.; Sneathen, L. Dwight, Jr.; Veal, Timothy R.

    2012-01-01

    This instructional tool presents pension accounting using a worksheet approach where debits equal credits for both the employer and for the plan. Transactions associated with the initiation of the plan through the end of the second year of the plan are presented, including their impact on accumulated other comprehensive income and deferred taxes.…
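
The worksheet's organizing principle, that debits equal credits for every recorded pension transaction, can be illustrated with a toy entry set (all account names and amounts below are hypothetical, not the article's example):

```python
# Toy illustration of the worksheet's balancing constraint; account names
# and amounts are hypothetical, not the article's example.
entries = [
    # (account, debit, credit) -- a hypothetical year-1 employer-side entry set
    ("pension expense",     100.0,   0.0),
    ("cash (contribution)",   0.0,  80.0),
    ("pension liability",     0.0,  20.0),
]

debits = sum(d for _, d, _ in entries)
credits = sum(c for _, _, c in entries)
print(debits == credits)   # True: the worksheet balances
```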

  1. A Comparison of the Learning Approaches of Accounting and Science Students at an Irish University

    ERIC Educational Resources Information Center

    Byrne, Marann; Finlayson, Odilla; Flood, Barbara; Lyons, Orla; Willis, Pauline

    2010-01-01

    One of the major challenges facing accounting education is the creation of a learning environment that promotes high-quality learning. Comparative research across disciplines offers educators the opportunity to gain a better understanding of the influence of contextual and personal variables on students' learning approaches. Using the Approaches…

  2. 76 FR 20974 - Implications of Climate Change for Bioassessment Programs and Approaches To Account for Effects

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-14

    ...EPA is announcing that Eastern Research Group, Inc. (ERG), an EPA contractor for external scientific peer review, will convene an independent panel of experts and organize and conduct an external peer review workshop to review the external review draft report titled, "Implications of Climate Change for Bioassessment Programs and Approaches to Account for Effects" (EPA/600/R-11/036A) and its…

  3. An Internet-Based Accounting Information Systems Project

    ERIC Educational Resources Information Center

    Miller, Louise

    2012-01-01

    This paper describes a student project assignment used in an accounting information systems course. We are now truly immersed in the internet age, and while many required accounting information systems courses and textbooks introduce database design, accounting software development, cloud computing, and internet security, projects involving the…

  4. Communication: A combined periodic density functional and incremental wave-function-based approach for the dispersion-accounting time-resolved dynamics of ⁴He nanodroplets on surfaces: ⁴He/graphene

    SciTech Connect

    Lara-Castells, María Pilar de; Stoll, Hermann; Civalleri, Bartolomeo; Causà, Mauro; Voloshina, Elena; Mitrushchenkov, Alexander O.; Pi, Martí

    2014-10-21

    In this work we propose a general strategy to calculate accurate He–surface interaction potentials. It extends the dispersionless density functional approach recently developed by Pernal et al. [Phys. Rev. Lett. 103, 263201 (2009)] to adsorbate-surface interactions by including periodic boundary conditions. We also introduce a scheme to parametrize the dispersion interaction by calculating two- and three-body dispersion terms at coupled cluster singles and doubles and perturbative triples (CCSD(T)) level via the method of increments [H. Stoll, J. Chem. Phys. 97, 8449 (1992)]. The performance of the composite approach is tested on ⁴He/graphene by determining the energies of the low-lying selective adsorption states, finding an excellent agreement with the best available theoretical data. Second, the capability of the approach to describe dispersionless correlation effects realistically is used to extract dispersion effects in time-dependent density functional simulations on the collision of ⁴He droplets with a single graphene sheet. It is found that dispersion effects play a key role in the fast spreading of the ⁴He nanodroplet, the evaporation-like process of helium atoms, and the formation of solid-like helium structures. These characteristics are expected to be quite general and highly relevant to explain experimental measurements with the newly developed helium droplet mediated deposition technique.

  5. A behavior-analytic account of depression and a case report using acceptance-based procedures

    PubMed Central

    Dougher, Michael J.; Hackbert, Lucianne

    1994-01-01

    Although roughly 6% of the general population is affected by depression at some time during their lifetime, the disorder has been relatively neglected by behavior analysts. The preponderance of research on the etiology and treatment of depression has been conducted by cognitive behavior theorists and biological psychiatrists and psychopharmacologists interested in the biological substrates of depression. These approaches have certainly been useful, but their reliance on cognitive and biological processes and their lack of attention to environment—behavior relations render them unsatisfactory from a behavior-analytic perspective. The purpose of this paper is to provide a behavior-analytic account of depression and to derive from this account several possible treatment interventions. In addition, case material is presented to illustrate an acceptance-based approach with a depressed client. PMID:22478195

  6. How do the approaches to accountability compare for charities working in international development?

    PubMed

    Kirsch, David

    2014-09-01

    Approaches to accountability vary between charities working to reduce under-five mortality in underdeveloped countries, and healthcare workers and facilities in Canada. Comparison reveals key differences, similarities and trade-offs. For example, while health professionals are governed by legislation and healthcare facilities have a de facto obligation to be accredited, charities and other international organizations are not subject to mandatory international laws or guidelines or to de facto international standards. Charities have policy goals similar to those found in the Canadian substudies, including access, quality, cost control, cost-effectiveness and customer satisfaction. However, the relative absence of external policy tools means that these goals may not be realized. Accountability can be beneficial, but too much or the wrong kind of accountability can divert resources and diminish returns. PMID:25305397

  7. Creating Meaningful Accountability through Web-Based Electronic NCATE Exhibits.

    ERIC Educational Resources Information Center

    Salzman, Stephanie; Zimmerly, Chuck

    This paper presents the proactive steps taken by Idaho State University to address accountability in teacher education. The university addressed accountability mandates and new accreditation standards through a Web site (http://www.ed.isu.edu) that includes electronic documents providing evidence of meeting National Council for the Accreditation…

  8. Performance-Based Accountability: Newark's Charter School Experience.

    ERIC Educational Resources Information Center

    Callahan, Kathe; Sadovnik, Alan; Visconti, Louisa

    This study assessed how New Jersey's state accountability system encouraged or thwarted charter school success, how effectively performance standards were defined and enacted by authorizing agents, and how individual charter schools were developing accountability processes that made them more or less successful than their charter school…

  9. Computer-Based Instruction in Accounting Using the CREATE System.

    ERIC Educational Resources Information Center

    Henkle, Edward B.; Robertson, Kenneth W.

    The Graduate Logistics program of the United States Air Force (USAF) Institute of Technology has required that prospective students show a satisfactory level of competence in basic accounting procedures before entering the program. The purpose of this thesis was to develop accounting case problems for use with the CREATE computer system that would…

  10. Comparison of joint versus postprocessor approaches for hydrological uncertainty estimation accounting for error autocorrelation and heteroscedasticity

    NASA Astrophysics Data System (ADS)

    Evin, Guillaume; Thyer, Mark; Kavetski, Dmitri; McInerney, David; Kuczera, George

    2014-03-01

    The paper appraises two approaches for the treatment of heteroscedasticity and autocorrelation in residual errors of hydrological models. Both approaches use weighted least squares (WLS), with heteroscedasticity modeled as a linear function of predicted flows and autocorrelation represented using an AR(1) process. In the first approach, heteroscedasticity and autocorrelation parameters are inferred jointly with hydrological model parameters. The second approach is a two-stage "postprocessor" scheme, where Stage 1 infers the hydrological parameters ignoring autocorrelation and Stage 2 conditionally infers the heteroscedasticity and autocorrelation parameters. These approaches are compared to a WLS scheme that ignores autocorrelation. Empirical analysis is carried out using daily data from 12 US catchments from the MOPEX set using two conceptual rainfall-runoff models, GR4J, and HBV. Under synthetic conditions, the postprocessor and joint approaches provide similar predictive performance, though the postprocessor approach tends to underestimate parameter uncertainty. However, the MOPEX results indicate that the joint approach can be nonrobust. In particular, when applied to GR4J, it often produces poor predictions due to strong multiway interactions between a hydrological water balance parameter and the error model parameters. The postprocessor approach is more robust precisely because it ignores these interactions. Practical benefits of accounting for error autocorrelation are demonstrated by analyzing streamflow predictions aggregated to a monthly scale (where ignoring daily-scale error autocorrelation leads to significantly underestimated predictive uncertainty), and by analyzing one-day-ahead predictions (where accounting for the error autocorrelation produces clearly higher precision and better tracking of observed data). Including autocorrelation into the residual error model also significantly affects calibrated parameter values and uncertainty estimates. 
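
A minimal sketch of the residual-error model the abstract describes, assuming the common formulation in which the error standard deviation is linear in predicted flow and autocorrelation follows an AR(1) process on the standardized residuals (parameter values are illustrative, not the study's):

```python
# Sketch of the residual-error model: sigma_t = a + b * q_t (heteroscedastic,
# linear in predicted flow) with AR(1) dependence on standardized residuals.
# Parameter values are illustrative, not the study's.
import random

def simulate_residuals(predicted_flows, a=0.1, b=0.3, rho=0.7, seed=1):
    random.seed(seed)
    errors, z_prev = [], 0.0
    for q in predicted_flows:
        sigma = a + b * q  # heteroscedasticity: std dev grows with predicted flow
        # AR(1) with unit marginal variance for the standardized residual
        z = rho * z_prev + random.gauss(0.0, (1 - rho**2) ** 0.5)
        errors.append(sigma * z)
        z_prev = z
    return errors

errs = simulate_residuals([1.0, 2.0, 5.0, 3.0, 0.5])
print(len(errs))   # 5
```

In the joint approach, `a`, `b` and `rho` would be inferred together with the hydrological parameters; in the postprocessor approach they are inferred conditionally in a second stage.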

  11. A shape-based account for holistic face processing.

    PubMed

    Zhao, Mintao; Bülthoff, Heinrich H; Bülthoff, Isabelle

    2016-04-01

    Faces are processed holistically, so selective attention to 1 face part without any influence of the others often fails. In this study, 3 experiments investigated what type of facial information (shape or surface) underlies holistic face processing and whether generalization of holistic processing to nonexperienced faces requires extensive discrimination experience. Results show that facial shape information alone is sufficient to elicit the composite face effect (CFE), 1 of the most convincing demonstrations of holistic processing, whereas facial surface information is unnecessary (Experiment 1). The CFE is eliminated when faces differ only in surface but not shape information, suggesting that variation of facial shape information is necessary to observe holistic face processing (Experiment 2). Removing 3-dimensional (3D) facial shape information also eliminates the CFE, indicating the necessity of 3D shape information for holistic face processing (Experiment 3). Moreover, participants show similar holistic processing for faces with and without extensive discrimination experience (i.e., own- and other-race faces), suggesting that generalization of holistic processing to nonexperienced faces requires facial shape information, but does not necessarily require further individuation experience. These results provide compelling evidence that facial shape information underlies holistic face processing. This shape-based account not only offers a consistent explanation for previous studies of holistic face processing, but also suggests a new ground-in addition to expertise-for the generalization of holistic processing to different types of faces and to nonface objects. (PsycINFO Database Record) PMID:26371495

  12. Acid/base account and minesoils: A review

    SciTech Connect

    Hossner, L.R.; Brandt, J.E.

    1997-12-31

    Generation of acidity from the oxidation of iron sulfides (FeS₂) is a common feature of geological materials exposed to the atmosphere by mining activities. Acid/base accounting (ABA) has been the primary method to evaluate the acid- or alkaline-potential of geological materials and to predict if weathering of these materials will have an adverse effect on terrestrial and aquatic environments. The ABA procedure has also been used to evaluate minesoils at different stages of weathering and, in some cases, to estimate lime requirements. Conflicting assessments of the methodology have been reported in the literature. The ABA is the fastest and easiest way to evaluate the acid-forming characteristics of overburden materials; however, accurate evaluations sometimes require that ABA data be examined in conjunction with additional sample information and results from other analytical procedures. The end use of ABA data, whether it be for minesoil evaluation or water quality prediction, will dictate the method's interpretive criteria. Reaction kinetics and stoichiometry may vary and are not clearly defined for all situations. There is an increasing awareness of the potential for interfering compounds, particularly siderite (FeCO₃), to be present in geological materials associated with coal mines. Hardrock mines, with possible mixed sulfide mineralogy, offer a challenge to the ABA, since acid generation may be caused by minerals other than pyrite. A combination of methods, static and kinetic, is appropriate to properly evaluate the presence of acid-forming materials.
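
The core ABA arithmetic can be sketched as follows, using the conventional estimate of maximum potential acidity from total sulfur (31.25 tonnes CaCO3-equivalent per 1000 tonnes of material per percent S); this convention is assumed here rather than taken from the review:

```python
# Back-of-envelope ABA sketch (standard convention, assumed rather than
# taken from the review): maximum potential acidity (MPA) is estimated
# from total sulfur as 31.25 * %S, and net neutralization potential (NNP)
# is NP - MPA, all in t CaCO3-equivalent per 1000 t of material.

def net_neutralization_potential(neutralization_potential, percent_sulfur):
    mpa = 31.25 * percent_sulfur   # acidity attributable to pyritic sulfur
    return neutralization_potential - mpa

# A hypothetical overburden sample: NP = 20, 1.0 % total S
print(net_neutralization_potential(20.0, 1.0))   # -11.25 -> net acid-forming
```

As the abstract cautions, a negative NNP is only a screening result; siderite interference and slow reaction kinetics can require kinetic tests before the material is classified.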

  13. Territory management: an appropriate approach for taking into account dynamic risks

    NASA Astrophysics Data System (ADS)

    Fernandez, M.; Ruegg, J.

    2012-04-01

    The territorial approach to risk analysis has become well established in recent years, especially in the francophone literature. It is particularly appropriate for exploring the large number of criteria and factors that shape, across a territory, the composition of vulnerabilities and risks. In this sense, the approach is suited to identifying not only risks due to natural hazards but also social and environmental risks. Our case study explores a catastrophic landslide, a collapse of 6 million cubic meters of rock at Los Chorros, in the municipality of San Cristobal Verapaz, Guatemala, in January 2009. We demonstrate that the same natural hazard has different consequences within this territory and may also increase or even create new vulnerabilities and risks for the population. The analysis shows that a single event can endanger various aspects of the territory: resources, functions (agricultural or housing uses, for example) and allocations, and highlights the different types of vulnerabilities that land users (i.e., farmers, merchants, transport drivers) face. To resolve a post-disaster situation, actors choose one vulnerability among a set of vulnerabilities (in a multi-vulnerability context) and, with this choice, define their own limits of acceptable risk. For example, transport drivers choose to reduce their economic vulnerability when going to the local market by crossing the landslide (a physical vulnerability). In the context of a developing country with weak development and limited resources, land users who become the risk managers after a disaster are compelled to prioritize between different actions for reducing risks. This study provides a novel approach to risk management by adding political science and geography dimensions, through the territory approach, to improve our understanding of multi-hazard and multi-risk management. Based on findings from this case study, this work asserts that risk is not…

  14. A Grid storage accounting system based on DGAS and HLRmon

    NASA Astrophysics Data System (ADS)

    Cristofori, A.; Fattibene, E.; Gaido, L.; Guarise, A.; Veronesi, P.

    2012-12-01

    Accounting in a production-level Grid infrastructure is of paramount importance in order to measure the utilization of the available resources. While several CPU accounting systems are deployed within the European Grid Infrastructure (EGI), storage accounting systems, stable enough to be adopted in a production environment, are not yet available. As a consequence, there is a growing interest in storage accounting and work on this is being carried out in the Open Grid Forum (OGF) where a Usage Record (UR) definition suitable for storage resources has been proposed for standardization. In this paper we present a storage accounting system which is composed of three parts: a sensor layer, a data repository with a transport layer (Distributed Grid Accounting System - DGAS) and a web portal providing graphical and tabular reports (HLRmon). The sensor layer is responsible for the creation of URs according to the schema (described in this paper) that is currently being discussed within OGF. DGAS is one of the CPU accounting systems used within EGI, in particular by the Italian Grid Infrastructure (IGI) and some other National Grid Initiatives (NGIs) and projects. DGAS architecture is evolving in order to collect Usage Records for different types of resources. This improvement allows DGAS to be used as a ‘general’ data repository and transport layer. HLRmon is the web portal acting as an interface to DGAS. It has been improved to retrieve storage accounting data from the DGAS repository and create reports in an easy way. This is very useful not only for the Grid users and administrators but also for the stakeholders.
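
A record along the lines of the storage Usage Record under discussion might look like the following sketch; the field names are illustrative guesses, not the schema proposed to OGF:

```python
# Hypothetical sketch of a storage usage record in the spirit of the OGF
# proposal discussed above; field names are illustrative, not the
# standardized schema.
from dataclasses import dataclass, asdict

@dataclass
class StorageUsageRecord:
    record_id: str
    storage_system: str            # e.g. a storage-element endpoint
    group: str                     # VO owning the data
    resource_capacity_bytes: int   # space consumed over the interval
    start_time: str                # ISO 8601
    end_time: str

rec = StorageUsageRecord("ur-001", "se01.example.org", "atlas",
                         2_000_000_000, "2012-01-01T00:00:00Z",
                         "2012-01-02T00:00:00Z")
print(asdict(rec)["group"])   # atlas
```

In the architecture described above, a sensor layer would emit records like this, DGAS would transport and store them, and HLRmon would aggregate them into per-site and per-VO reports.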

  15. Accounting Faculty Utilization of Web-Based Resources to Enhance In-Class Instruction

    ERIC Educational Resources Information Center

    Black, Thomas G.; Turetsky, Howard F.

    2010-01-01

    Our study examines the extent to which accounting faculty use web-based resources to augment classroom instruction. Moreover, we explore the effects of the institutional factors of accounting accreditation and the existence of an accounting Ph.D. program on internet use by accounting academics toward enhancing pedagogy, while controlling for the…

  16. Integrating Mission-Based Values into Accounting Curriculum: Catholic Social Teaching and Introductory Accounting

    ERIC Educational Resources Information Center

    Hise, Joan Vane; Koeplin, John P.

    2010-01-01

    This paper presents several reasons why mission-based values, in this case Catholic Social Teaching (CST), should be incorporated into a university business curriculum. The CST tenets include the sanctity of human life; call to family, community, and participation; rights and responsibilities; option for the poor and vulnerable; the dignity of…

  17. Standards-Based Accountability: Reification, Responsibility and the Ethical Subject

    ERIC Educational Resources Information Center

    Kostogriz, Alex; Doecke, Brenton

    2011-01-01

    Over the last two decades, teachers in Australia have witnessed multiple incarnations of the idea of "educational accountability" and its enactment. Research into this phenomenon of educational policy and practice has revealed various layers of the concept, particularly its professional, bureaucratic, political and cultural dimensions that are…

  18. Holding Schools Accountable: Performance-Based Reform in Education.

    ERIC Educational Resources Information Center

    Ladd, Helen F., Ed.

    Many people believe that future reforms of education should focus on the primary mission of elementary and secondary schools and that these schools must be held more accountable for the academic performance of their students. This book brings together researchers from various disciplines--most notably economics, educational policy and management,…

  19. Statistical approaches to account for false-positive errors in environmental DNA samples.

    PubMed

    Lahoz-Monfort, José J; Guillera-Arroita, Gurutzeta; Tingley, Reid

    2016-05-01

    Environmental DNA (eDNA) sampling is prone to both false-positive and false-negative errors. We review statistical methods to account for such errors in the analysis of eDNA data and use simulations to compare the performance of different modelling approaches. Our simulations illustrate that even low false-positive rates can produce biased estimates of occupancy and detectability. We further show that removing or classifying single PCR detections in an ad hoc manner under the suspicion that such records represent false positives, as sometimes advocated in the eDNA literature, also results in biased estimation of occupancy, detectability and false-positive rates. We advocate alternative approaches to account for false-positive errors that rely on prior information, or the collection of ancillary detection data at a subset of sites using a sampling method that is not prone to false-positive errors. We illustrate the advantages of these approaches over ad hoc classifications of detections and provide practical advice and code for fitting these models in maximum likelihood and Bayesian frameworks. Given the severe bias induced by false-negative and false-positive errors, the methods presented here should be more routinely adopted in eDNA studies. PMID:26558345
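
The bias the authors warn about is easy to reproduce in a toy simulation (all rates hypothetical): even a small per-survey false-positive probability inflates the naive estimate of occupancy.

```python
# Toy simulation of false-positive bias (parameters hypothetical, not the
# paper's): the naive proportion of sites with at least one detection
# overestimates true occupancy when p_fp > 0.
import random

def naive_occupancy(psi=0.3, p_det=0.5, p_fp=0.05, n_sites=10000,
                    n_surveys=3, seed=42):
    random.seed(seed)
    detected_sites = 0
    for _ in range(n_sites):
        occupied = random.random() < psi
        p = p_det if occupied else p_fp   # false positives at unoccupied sites
        if any(random.random() < p for _ in range(n_surveys)):
            detected_sites += 1
    return detected_sites / n_sites

est = naive_occupancy()
print(est > 0.3)   # True: naive estimate exceeds the true occupancy of 0.3
```

The occupancy-detection models the authors advocate estimate `p_fp` explicitly (from prior information or ancillary error-free surveys) instead of absorbing it into apparent occupancy.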

  20. An enhanced nonlinear damping approach accounting for system constraints in active mass dampers

    NASA Astrophysics Data System (ADS)

    Venanzi, Ilaria; Ierimonti, Laura; Ubertini, Filippo

    2015-11-01

    Active mass dampers are a viable solution for mitigating wind-induced vibrations in high-rise buildings and improving occupants' comfort. Such devices suffer particularly when they reach force saturation of the actuators and the maximum extension of their stroke, which may occur under severe loading conditions (e.g. wind gusts and earthquakes). Exceeding the actuators' physical limits can impair the control performance of the system or even lead to device damage, with a consequent need for repair or replacement of part of the control system. Controllers for active mass dampers should therefore account for these technological limits. Prior work of the authors was devoted to stroke issues and led to the definition of a nonlinear damping approach that is very easy to implement in practice. It consisted of a modified skyhook algorithm complemented with a nonlinear braking force to reverse the direction of the mass before it reaches the stroke limit. This paper presents an enhanced version of this approach that also accounts for force saturation of the actuator while keeping the simplicity of implementation. This is achieved by modulating the control force by a nonlinear smooth function depending on the ratio between the actuator's force and its saturation limit. Results of a numerical investigation show that the proposed approach provides results similar to the method of the State Dependent Riccati Equation, a well-established technique for designing optimal controllers for constrained systems, yet one that is very difficult to apply in practice.
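
The force-modulation idea can be sketched as follows; the smooth function used here (a tanh roll-off) is an assumption for illustration, not the authors' exact formulation:

```python
# Sketch of smooth force modulation (functional form assumed, not the
# authors'): the commanded force is scaled by a smooth function of the
# ratio between requested force and the actuator's saturation limit, so
# the output rolls off gradually instead of clipping abruptly.
import math

def modulated_force(requested, f_max):
    ratio = abs(requested) / f_max
    # tanh(ratio)/ratio -> 1 for small requests, shrinks as saturation nears;
    # output magnitude is f_max * tanh(ratio) < f_max for any request.
    scale = math.tanh(ratio) / ratio if ratio > 0 else 1.0
    return requested * scale

print(abs(modulated_force(10.0, 5.0)) < 5.0)   # True: never exceeds the limit
```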

  1. Pharmacokinetic Modeling of Manganese III. Physiological Approaches Accounting for Background and Tracer Kinetics

    SciTech Connect

    Teeguarden, Justin G.; Gearhart, Jeffrey; Clewell, III, H. J.; Covington, Tammie R.; Nong, Andy; Anderson, Melvin E.

    2007-01-01

    Manganese (Mn) is an essential nutrient. Mn deficiency is associated with altered lipid (Kawano et al. 1987) and carbohydrate metabolism (Baly et al. 1984; Baly et al. 1985), abnormal skeletal cartilage development (Keen et al. 2000), decreased reproductive capacity, and brain dysfunction. Occupational and accidental inhalation exposures to aerosols containing high concentrations of Mn produce neurological symptoms with Parkinson-like characteristics in workers. At present, there is also concern about the use of the manganese-containing compound methylcyclopentadienyl manganese tricarbonyl (MMT) in unleaded gasoline as an octane enhancer. Combustion of MMT produces aerosols containing a mixture of manganese salts (Lynam et al. 1999). These Mn particulates may be inhaled at low concentrations by the general public in areas using MMT. Risk assessments for essential elements need to acknowledge that risks occur with either excesses or deficiencies, and that significant amounts of these nutrients are present in the body even in the absence of any exogenous exposure. With Mn there is an added complication: the primary risk is associated with inhalation, while Mn is an essential dietary nutrient. Exposure standards for inhaled Mn will need to consider the substantial background uptake from normal ingestion. Andersen et al. (1999) suggested a generic approach for essential nutrient risk assessment. An acceptable exposure limit could be based on some ‘tolerable’ change in tissue concentration in normal and exposed individuals, i.e., a change somewhere from 10 to 25% of the individual variation in tissue concentration seen in a large human population. A reliable multi-route, multi-species pharmacokinetic model would be necessary for the implementation of this type of dosimetry-based risk assessment approach for Mn. Physiologically based pharmacokinetic (PBPK) models for various xenobiotics have proven valuable in contributing to a variety of chemical-specific risk…

  2. Australian Rural Accountants' Views on How Locally Provided CPD Compares with City-Based Provision

    ERIC Educational Resources Information Center

    Halabi, Abdel K.

    2015-01-01

    This paper analyses Australian rural accountants' attitudes and levels of satisfaction with continuing professional development (CPD), based on whether the CPD was delivered by a professional accounting body in a rural or metropolitan area. The paper responds to prior research that finds rural accountants are dissatisfied with professional…

  3. HLRmon: a role-based grid accounting report web tool

    NASA Astrophysics Data System (ADS)

    Pra, S. D.; Fattibene, E.; Misurelli, G.; Pescarmona, F.; Gaido, L.

    2008-07-01

    Both Grid users and Grid operators need ways to get CPU usage statistics about jobs executed in a given time period at various levels of detail, depending on their specific Grid role and rights. While a Grid user is interested in reports about his or her own jobs and should not get access to others' data, a Site, Virtual Organization (VO) or Regional Operation Centre (ROC) manager would also like to see how resources are used across the Grid on a per-site or per-VO basis, or both. The whole set of different reports turns out to be quite large, and the various existing tools made to create them tend to serve one category of user well, often at the expense of another. HLRmon [1] results from our efforts to generate suitable reports for all existing categories and has been designed to serve them within a unified layout. Thanks to its ability to authenticate clients through certificates and related authorization rights, it can restrict a priori the range of selectable items offered to the web user, so that sensitive information is only provided to specifically enabled people. Information is gathered by HLRmon from a Home Location Register (HLR), which stores complete accounting data on a per-job basis. Depending on the kind of reports to be generated, it either queries the HLR server directly using an ad hoc Distributed Grid Accounting System (DGAS [2]) query tool (typically for user-level detail), or a local RDBMS table with daily aggregate information on a per-day, per-site and per-VO basis, thus saving connection delay time and needless load on the HLR server.

  4. Accounting for water management issues within hydrological simulation: Alternative modelling options and a network optimization approach

    NASA Astrophysics Data System (ADS)

    Efstratiadis, Andreas; Nalbantis, Ioannis; Rozos, Evangelos; Koutsoyiannis, Demetris

    2010-05-01

    In mixed natural and artificialized river basins, many complexities arise due to anthropogenic interventions in the hydrological cycle, including abstractions from surface water bodies, groundwater pumping or recharge, and water returns through drainage systems. Typical engineering approaches adopt a multi-stage modelling procedure, with the aim of handling the complexity of process interactions and the lack of measured abstractions. In this context, the entire hydrosystem is separated into natural and artificial sub-systems or components; the natural ones are modelled individually, and their predictions (i.e. hydrological fluxes) are transferred to the artificial components as inputs to a water management scheme. To account for the interactions between the various components, an iterative procedure is essential, whereby the outputs of the artificial sub-systems (i.e. abstractions) become inputs to the natural ones. However, this strategy suffers from multiple shortcomings, since it presupposes that purely natural sub-systems can be located and that sufficient information is available for each sub-system modelled, including suitable, i.e. "unmodified", data for calibrating the hydrological component. In addition, implementing such a strategy is ineffective when the entire scheme runs in stochastic simulation mode. To cope with the above drawbacks, we developed a generalized modelling framework, following a network optimization approach. This originates from graph theory, which has been successfully implemented within some advanced computer packages for water resource systems analysis. The user formulates a unified system comprising the hydrographical network and the typical components of a water management network (aqueducts, pumps, junctions, demand nodes etc.). Input data for the latter include hydraulic properties, constraints, targets, priorities and operation costs. The real-world system is described through a conceptual graph, whose dummy properties…
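
The unified-network idea can be caricatured with a tiny graph; the node names, capacities and the greedy allocator below are hypothetical stand-ins for a proper minimum-cost network-flow solver:

```python
# Toy unified network: hydrographical and water-management links form one
# edge set, each edge carrying capacity and unit-cost attributes (all names
# and values hypothetical).
edges = {
    ("reservoir", "junction"):   {"capacity": 500.0, "unit_cost": 0.0},
    ("junction", "city_demand"): {"capacity": 300.0, "unit_cost": 1.0},
    ("junction", "irrigation"):  {"capacity": 400.0, "unit_cost": 2.0},
}
demands = [("city_demand", 350.0, 1.0), ("irrigation", 400.0, 2.0)]  # (node, need, cost)

def allocate(available, demands, edges):
    """Greedy cheapest-first allocation: a toy stand-in for the min-cost
    network-flow optimization such packages actually perform."""
    out = {}
    for node, need, _ in sorted(demands, key=lambda d: d[2]):
        cap = edges[("junction", node)]["capacity"]
        take = min(need, cap, available)
        out[node] = take
        available -= take
    return out

print(allocate(500.0, demands, edges))   # {'city_demand': 300.0, 'irrigation': 200.0}
```

Costs and priorities on the edges are how targets and operating rules enter the optimization, which is what lets one graph represent both the river network and the man-made components.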

  5. Updating the Duplex Design for Test-Based Accountability in the Twenty-First Century

    ERIC Educational Resources Information Center

    Bejar, Isaac I.; Graf, E. Aurora

    2010-01-01

    The duplex design by Bock and Mislevy for school-based testing is revisited and evaluated as a potential platform in test-based accountability assessments today. We conclude that the model could be useful in meeting the many competing demands of today's test-based accountability assessments, although many research questions will need to be…

  6. Attributing patients to accountable care organizations: performance year approach aligns stakeholders' interests.

    PubMed

    Lewis, Valerie A; McClurg, Asha Belle; Smith, Jeremy; Fisher, Elliott S; Bynum, Julie P W

    2013-03-01

    The accountable care organization (ACO) model of health care delivery is rapidly being implemented under government and private-sector initiatives. The model requires that each ACO have a defined patient population for which the ACO will be held accountable for both total cost of care and quality performance. However, there is no empirical evidence about the best way to define how patients are assigned to these groups of doctors, hospitals, and other health care providers. We examined the two major methods of defining, or attributing, patient populations to ACOs: the prospective method and the performance year method. The prospective method uses data from one year to assign patients to an ACO for the following performance year. The performance year method assigns patients to an ACO at the end of the performance year based on the population served during the performance year. We used Medicare fee-for-service claims data from 2008 and 2009 to simulate a set of ACOs to compare the two methods. Although both methods have benefits and drawbacks, we found that attributing patients using the performance year method yielded greater overlap of attributed patients and patients treated during the performance year and resulted in a higher proportion of care concentrated within an accountable care organization. Together, these results suggest that performance year attribution may more fully and accurately reflect an ACO's patient population and may better position an ACO to achieve shared savings. PMID:23459739
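
    The difference between the two attribution rules reduces to set arithmetic over patient populations; the sketch below uses hypothetical patient IDs rather than the Medicare claims data of the study:

```python
def attribution_overlap(prior_year_patients, performance_year_patients):
    """Share of patients actually treated during the performance year that
    each attribution method assigns to the ACO."""
    prospective = set(prior_year_patients)        # assigned from prior-year data
    performance = set(performance_year_patients)  # assigned at year end
    treated = set(performance_year_patients)
    return {
        "prospective": len(prospective & treated) / len(treated),
        "performance_year": len(performance & treated) / len(treated),
    }

# p1 left the ACO's providers after year one; p4 arrived during the
# performance year (illustrative IDs)
overlap = attribution_overlap(["p1", "p2", "p3"], ["p2", "p3", "p4"])
```

    By construction the performance-year method covers the treated population exactly, which corresponds to the greater overlap the authors report; prospective attribution misses patients who switch providers between years.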

  7. Materiality in a Practice-Based Approach

    ERIC Educational Resources Information Center

    Svabo, Connie

    2009-01-01

    Purpose: The paper aims to provide an overview of the vocabulary for materiality which is used by practice-based approaches to organizational knowing. Design/methodology/approach: The overview is theoretically generated and is based on the anthology Knowing in Organizations: A Practice-based Approach edited by Nicolini, Gherardi and Yanow. The…

  8. Accounting for cross-sectoral linkages of climate change impacts based on multi-model projections

    NASA Astrophysics Data System (ADS)

    Frieler, Katja

    2013-04-01

    Understanding how natural and human systems will be affected by climate change is not possible without accounting for cascading effects across different sectors. However, cross-sectoral inter-linkages remain strongly underrepresented in model-based assessments of climate change impacts. Based on the currently unique cross-sectoral multi-model data set generated for ISI-MIP (the first Inter-Sectoral Impact Model Intercomparison Project), we investigate climate-induced adaptation pressures on the global food production system, taking into account cross-sectoral co-limitations and response options, and quantifying uncertainties due to different model categories involved (climate-, crop-, hydrology-, ecosystem-models). Results from 7 global crop models are synthesised to analyse changes in global wheat, maize, rice, and soy production as a function of global mean warming, on current agricultural land. To integrate constraints on the availability of water we propose a simple approach to estimate the maximum possible increase in global production based on limitations of renewable irrigation water as projected by 11 global hydrological models. The effect is compared to the production increase due to land-use changes as suggested by the demand-fulfilling agro-economic model MAgPIE. While providing production increases, the expansion of farmland exerts strong pressure on natural vegetation systems. This pressure is in turn compared to the pressure on natural vegetation induced by climate change itself. The analysis will provide a cross-sectoral synthesis of the ISI-MIP results.

  9. Accounting for image uncertainty in SAR-based flood mapping

    NASA Astrophysics Data System (ADS)

    Giustarini, L.; Vernieuwe, H.; Verwaeren, J.; Chini, M.; Hostache, R.; Matgen, P.; Verhoest, N. E. C.; De Baets, B.

    2015-02-01

    Operational flood mitigation and flood modeling activities benefit from a rapid and automated flood mapping procedure. A valuable information source for such a flood mapping procedure is remote sensing synthetic aperture radar (SAR) data. For the resulting flood maps to be reliable, an objective characterization of their associated uncertainty is required. This work focuses on the speckle uncertainty associated with the SAR data and introduces the use of a non-parametric bootstrap method to account for this uncertainty in the resulting flood maps. From several synthetic images, constructed through bootstrapping the original image, flood maps are delineated. The accuracy of these flood maps is evaluated with respect to an independent validation data set, yielding, in the two test cases analyzed in this paper, F-values (i.e. values of the Jaccard coefficient) ranging between 0.50 and 0.65. This method is further compared to an image segmentation method for speckle analysis, with which similar results are obtained. The uncertainty analysis of the ensemble of bootstrapped synthetic images was found to be representative of image speckle, with the advantage that no segmentation or speckle estimation is required. Furthermore, this work assesses to what extent the bootstrap ensemble size can be reduced while remaining representative of the original ensemble, as operational applications would clearly benefit from such reduced ensemble sizes.
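
    A minimal, non-spatial sketch of the bootstrap idea follows: resample pixel values to mimic alternative speckle realisations, threshold each synthetic image, and measure the spread of the mapped flood extent. The backscatter values and threshold are invented, and unlike the published method this sketch ignores spatial structure:

```python
import random

def jaccard(map_a, map_b):
    """The F measure referred to above: flooded-pixel intersection over union."""
    inter = sum(1 for a, b in zip(map_a, map_b) if a and b)
    union = sum(1 for a, b in zip(map_a, map_b) if a or b)
    return inter / union if union else 1.0

def bootstrap_flood_fractions(backscatter_db, threshold_db, n_boot=200, seed=42):
    """Resample pixel values with replacement to build synthetic images, then
    threshold each (low backscatter = flooded, i.e. smooth open water) and
    record the flooded fraction; the spread of these fractions expresses the
    speckle uncertainty of the delineated flood extent."""
    rng = random.Random(seed)
    n = len(backscatter_db)
    fractions = []
    for _ in range(n_boot):
        synthetic = [backscatter_db[rng.randrange(n)] for _ in range(n)]
        fractions.append(sum(v < threshold_db for v in synthetic) / n)
    return fractions

# Invented backscatter transect (dB): half the pixels read as flooded
fractions = bootstrap_flood_fractions([-18, -17, -16, -9, -8, -7] * 10, -12.0)
```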

  10. A user-friendly approach to cost accounting in laboratory animal facilities.

    PubMed

    Baker, David G

    2011-09-01

    Cost accounting is an essential management activity for laboratory animal facility management. In this report, the author describes basic principles of cost accounting and outlines steps for carrying out cost accounting in laboratory animal facilities. Methods of post hoc cost accounting analysis for maximizing the efficiency of facility operations are also described. PMID:21857645
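
    The basic rate computation underlying such cost accounting is simple to state; the cost categories and figures below are illustrative, not taken from the article:

```python
def per_diem_rate(annual_costs, animal_care_days):
    """Per-diem housing rate: total annual facility costs divided by the
    total number of animal care days provided in the same year."""
    return sum(annual_costs.values()) / animal_care_days

# Hypothetical annual cost pools for a rodent facility
costs = {"labor": 250_000.0, "supplies": 60_000.0, "overhead": 90_000.0}
rate = per_diem_rate(costs, 400_000)  # e.g. 400,000 mouse-days in the year
```

    In practice the cost pools would be split per species and per service (cage changes, veterinary care), but each rate is this same ratio.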

  11. Accounting for non-independent detection when estimating abundance of organisms with a Bayesian approach

    USGS Publications Warehouse

    Martin, Julien; Royle, J. Andrew; MacKenzie, Darryl I.; Edwards, Holly H.; Kery, Marc; Gardner, Beth

    2011-01-01

    Summary 1. Binomial mixture models use repeated count data to estimate abundance. They are becoming increasingly popular because they provide a simple and cost-effective way to account for imperfect detection. However, these models assume that individuals are detected independently of each other. This assumption may often be violated in the field. For instance, manatees (Trichechus manatus latirostris) may surface in turbid water (i.e. become available for detection during aerial surveys) in a correlated manner (i.e. in groups). However, correlated behaviour, resulting in non-independent detection of individuals, may also be relevant in other systems (e.g. correlated patterns of singing in birds and amphibians). 2. We extend binomial mixture models to account for correlated behaviour and therefore to account for non-independent detection of individuals. We simulated correlated behaviour using beta-binomial random variables. Our approach can be used to simultaneously estimate abundance, detection probability and a correlation parameter. 3. Fitting binomial mixture models to data that followed a beta-binomial distribution resulted in an overestimation of abundance even for moderate levels of correlation. In contrast, the beta-binomial mixture model performed considerably better in our simulation scenarios. We also present a goodness-of-fit procedure to evaluate the fit of beta-binomial mixture models. 4. We illustrate our approach by fitting both binomial and beta-binomial mixture models to aerial survey data of manatees in Florida. We found that the binomial mixture model did not fit the data, whereas there was no evidence of lack of fit for the beta-binomial mixture model. This example helps illustrate the importance of using simulations and assessing goodness-of-fit when analysing ecological data with N-mixture models. Indeed, both the simulations and the goodness-of-fit procedure highlighted the limitations of the standard binomial mixture model for aerial
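
    The overdispersion that motivates the beta-binomial extension is easy to reproduce in simulation: in the sketch below, each survey shares one detection probability drawn from a Beta distribution, which induces the correlated detections described above. The abundance, detection and correlation values are illustrative, not from the paper:

```python
import random
import statistics

def simulate_counts(n_ind, p_mean, rho, n_surveys, seed=1):
    """Repeated-count data under correlated detection: a per-survey detection
    probability is drawn from a Beta with mean p_mean, so individuals within a
    survey are detected non-independently; rho -> 0 recovers the ordinary
    binomial case."""
    rng = random.Random(seed)
    a = p_mean * (1 - rho) / rho
    b = (1 - p_mean) * (1 - rho) / rho
    counts = []
    for _ in range(n_surveys):
        p = rng.betavariate(a, b)  # shared "availability" this survey
        counts.append(sum(rng.random() < p for _ in range(n_ind)))
    return counts

correlated = simulate_counts(50, 0.5, 0.3, 500)    # strongly correlated
independent = simulate_counts(50, 0.5, 1e-6, 500)  # essentially binomial
```

    The beta-binomial variance N p (1 - p) (1 + (N - 1) rho) greatly exceeds the binomial N p (1 - p), which is why a binomial mixture model fitted to such data misreads the extra spread and overestimates abundance.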

  12. Mindfulness meditation-based pain relief: a mechanistic account.

    PubMed

    Zeidan, Fadel; Vago, David R

    2016-06-01

    Pain is a multidimensional experience that involves interacting sensory, cognitive, and affective factors, rendering the treatment of chronic pain challenging and financially burdensome. Further, the widespread use of opioids to treat chronic pain has led to an opioid epidemic characterized by exponential growth in opioid misuse and addiction. The staggering statistics related to opioid use highlight the importance of developing, testing, and validating fast-acting nonpharmacological approaches to treat pain. Mindfulness meditation is a technique that has been found to significantly reduce pain in experimental and clinical settings. The present review delineates findings from recent studies demonstrating that mindfulness meditation significantly attenuates pain through multiple, unique mechanisms-an important consideration for the millions of chronic pain patients seeking narcotic-free, self-facilitated pain therapy. PMID:27398643

  13. Accountability to Public Stakeholders in Watershed-Based Restoration

    EPA Science Inventory

    There is an increasing push at the federal, state, and local levels for watershed-based conservation projects. These projects work to address water quality issues in degraded waterways through the implementation of a suite of best management practices on land throughout a watersh...

  14. School-Based Budgets: Getting, Spending, and Accounting.

    ERIC Educational Resources Information Center

    Herman, Jerry L.; Herman, Janice L.

    With the advent of large interest in school-based management came the task of inventing a different type of budgeting system--one that delegated the many tasks of developing a budget, expending the allocated funds, and controlling those expenditures in a way that did not exceed the allocation to the site level. This book explores the various means…

  15. Performance-Based Measurement: Action for Organizations and HPT Accountability

    ERIC Educational Resources Information Center

    Larbi-Apau, Josephine A.; Moseley, James L.

    2010-01-01

    Basic measurements and applications of six selected general but critical operational performance-based indicators--effectiveness, efficiency, productivity, profitability, return on investment, and benefit-cost ratio--are presented. With each measurement, goals and potential impact are explored. Errors, risks, limitations to measurements, and a…

  16. Students' Concern about Indebtedness: A Rank Based Social Norms Account

    ERIC Educational Resources Information Center

    Aldrovandi, Silvio; Wood, Alex M.; Maltby, John; Brown, Gordon D. A.

    2015-01-01

    This paper describes a new model of students' concern about indebtedness within a rank-based social norms framework. Study 1 found that students hold highly variable beliefs about how much other students will owe at the end of their degree. Students' concern about their own anticipated debt--and their intention of taking on a part-time job during…

  17. One Paradox in District Accountability and Site-Based Management.

    ERIC Educational Resources Information Center

    Shellman, David W.

    The paradox of site-based school management with use of standardized tests or instructional management systems that restrict teacher choices was evident in one school district in North Carolina in which measurement of student success has centered on student performance on state-mandated tests. A study was conducted to see if students whose…

  18. Impacts of Performance-Based Accountability on Institutional Performance in the U.S.

    ERIC Educational Resources Information Center

    Shin, Jung Cheol

    2010-01-01

    In the 1990s, most US states adopted new forms of performance-based accountability, e.g., performance-based budgeting, funding, or reporting. This study analyzed changes in institutional performance following the adoption of these new accountability standards. We measured institutional performance by representative education and research…

  19. Left behind By Design: Proficiency Counts and Test-Based Accountability. NBER Working Paper No. 13293

    ERIC Educational Resources Information Center

    Neal, Derek; Schanzenbach, Diane Whitmore

    2007-01-01

    Many test-based accountability systems, including the No Child Left Behind Act of 2001 (NCLB), place great weight on the numbers of students who score at or above specified proficiency levels in various subjects. Accountability systems based on these metrics often provide incentives for teachers and principals to target children near current…

  20. A blue/green water-based accounting framework for assessment of water security

    NASA Astrophysics Data System (ADS)

    Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo M.

    2014-09-01

    A comprehensive assessment of water security can incorporate several water-related concepts, while accounting for Blue and Green Water (BW and GW) types defined in accordance with the hydrological processes involved. Here we demonstrate how a quantitative analysis of provision probability and use of BW and GW can be conducted, so as to provide indicators of water scarcity and vulnerability at the basin level. To illustrate the approach, we use the Soil and Water Assessment Tool (SWAT) to model the hydrology of an agricultural basin (291 km2) within the Cantareira Water Supply System in Brazil. To provide a more comprehensive basis for decision making, we analyze the BW and GW-Footprint components against probabilistic levels (50th and 30th percentile) of freshwater availability for human activities, during a 23-year period. Several contrasting situations of BW provision are distinguished, using different hydrologically based methodologies for specifying monthly Environmental Flow Requirements (EFRs), and the risk of natural EFR violation is evaluated by use of a freshwater provision index. Our results reveal clear spatial and temporal patterns of water scarcity and vulnerability levels within the basin. Taking into account conservation targets for the basin, it appears that the more restrictive EFR methods are more appropriate than the method currently employed at the study basin. The blue/green water-based accounting framework developed here provides a useful integration of hydrologic, ecosystem and human needs information on a monthly basis, thereby improving our understanding of how and where water-related threats to human and aquatic ecosystem security can arise.
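
    The probabilistic scarcity check at the heart of this framework (compare the blue-water footprint with a percentile of historical availability) can be sketched as follows; the monthly figures are invented for illustration:

```python
def percentile(values, q):
    """q-th quantile (0..1) by linear interpolation between sorted values."""
    s = sorted(values)
    k = (len(s) - 1) * q
    lo = int(k)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (k - lo)

def scarce_months(availability, footprint, q=0.3):
    """Flag months in which the blue-water footprint exceeds the q-th
    percentile of the historical freshwater availability series."""
    return {m: footprint[m] > percentile(series, q)
            for m, series in availability.items()}

# Invented monthly availability series (hm3) and footprints
flags = scarce_months({"jan": [10.0, 20.0, 30.0], "jul": [2.0, 4.0, 6.0]},
                      {"jan": 25.0, "jul": 1.0}, q=0.5)
```

    Lower q (e.g. the 30th rather than the 50th percentile) makes the availability estimate more conservative, so more months are flagged as scarce.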

  1. A Comparative Study of the Effect of Web-Based versus In-Class Textbook Ethics Instruction on Accounting Students' Propensity to Whistle-Blow

    ERIC Educational Resources Information Center

    McManus, Lisa; Subramaniam, Nava; James, Wendy

    2012-01-01

    The authors examined whether accounting students' propensity to whistle-blow differed between those instructed through a web-based teaching module and those exposed to a traditional in-class textbook-focused approach. A total of 156 students from a second-year financial accounting course participated in the study. Ninety students utilized the…

  2. A New Approach to Account for the Correlations among Single Nucleotide Polymorphisms in Genome-Wide Association Studies

    PubMed Central

    Chen, Zhongxue; Liu, Qingzhong

    2011-01-01

    In genetic association studies, such as genome-wide association studies (GWAS), the number of single nucleotide polymorphisms (SNPs) can be as large as hundreds of thousands. Due to linkage disequilibrium, many SNPs are highly correlated; assuming they are independent is not valid. The commonly used multiple comparison methods, such as Bonferroni correction, are not appropriate and are too conservative when applied to GWAS. To overcome these limitations, many approaches have been proposed to estimate the so-called effective number of independent tests to account for the correlations among SNPs. However, many current effective number estimation methods are based on eigenvalues of the correlation matrix. When the dimension of the matrix is large, the numeric results may be unreliable or even unobtainable. To circumvent this obstacle and provide better estimates, we propose a new effective number estimation approach which is not based on the eigenvalues. We compare the new method with others through simulated and real data. The comparison results show that the proposed method has very good performance. PMID:21849789
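
    For context, the eigenvalue-based estimators the authors set out to replace can be sketched. Below is a Nyholt-style estimator; to avoid a numerical eigendecomposition, the example feeds it the closed-form eigenvalues of an exchangeable (constant-correlation) matrix. This is a sketch of the baseline approach the abstract critiques, not of the paper's new method:

```python
def nyholt_meff(eigenvalues):
    """Eigenvalue-based effective number of independent tests:
    M_eff = 1 + (M - 1) * (1 - Var(lambda) / M)."""
    m = len(eigenvalues)
    mean = sum(eigenvalues) / m
    var = sum((lam - mean) ** 2 for lam in eigenvalues) / (m - 1)
    return 1 + (m - 1) * (1 - var / m)

def exchangeable_eigenvalues(m, r):
    """Closed-form eigenvalues of an m x m correlation matrix whose
    off-diagonal entries all equal r: one of 1+(m-1)r and m-1 of 1-r."""
    return [1 + (m - 1) * r] + [1 - r] * (m - 1)

# Bonferroni threshold corrected by the effective number of tests
alpha_eff = 0.05 / nyholt_meff(exchangeable_eigenvalues(10, 0.5))
```

    With r = 0 the estimator returns M (fully independent SNPs); with r = 1 it returns 1 (one effective test). For real GWAS-scale matrices the eigenvalues must be computed numerically, which is exactly where the abstract notes such methods become unreliable.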

  3. Accounting for Heaping in Retrospectively Reported Event Data – A Mixture-Model Approach

    PubMed Central

    Bar, Haim Y.; Lillard, Dean R.

    2012-01-01

    When event data are retrospectively reported, more temporally distal events tend to get “heaped” on even multiples of reporting units. Heaping may introduce a type of attenuation bias because it causes researchers to mismatch time-varying right-hand side variables. We develop a model-based approach to estimate the extent of heaping in the data, and how it affects regression parameter estimates. We use smoking cessation data as a motivating example, but our method is general. It facilitates the use of retrospective data from the multitude of cross-sectional and longitudinal studies worldwide that collect and potentially could collect event data. PMID:22733577
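
    A quick diagnostic for the heaping the authors model (though not their mixture estimator itself) is the excess mass observed at round reporting units; the example durations are invented:

```python
def heaping_excess(durations, unit=12):
    """Observed share of reports falling on exact multiples of `unit`, minus
    the 1/unit share expected if durations were reported exactly; values well
    above zero signal heaping."""
    on_heap = sum(1 for d in durations if d % unit == 0)
    return on_heap / len(durations) - 1 / unit

# Months since smoking cessation, say: 5 of 8 reports sit on multiples of 12
reports = [12, 24, 24, 36, 7, 13, 25, 12]
excess = heaping_excess(reports)  # ~0.54, a strong heaping signal
```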

  4. ASPEN--A Web-Based Application for Managing Student Server Accounts

    ERIC Educational Resources Information Center

    Sandvig, J. Christopher

    2004-01-01

    The growth of the Internet has greatly increased the demand for server-side programming courses at colleges and universities. Students enrolled in such courses must be provided with server-based accounts that support the technologies that they are learning. The process of creating, managing and removing large numbers of student server accounts is…

  5. Performance-Based Accountability Program: 1993-94 School Year Report.

    ERIC Educational Resources Information Center

    North Carolina State Board of Education, Raleigh.

    The Performance-Based Accountability Program was enacted by the North Carolina General Assembly in 1989 as part of the School Improvement and Accountability Act. This report addresses five components of the program and includes appendixes that provide detailed data for several of the components. The Flexible Funding Provision is made for the…

  6. 24 CFR 990.280 - Project-based budgeting and accounting.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... in this subpart (including project-based management, budgeting, and accounting). Asset management... accounting. 990.280 Section 990.280 Housing and Urban Development REGULATIONS RELATING TO HOUSING AND URBAN... URBAN DEVELOPMENT THE PUBLIC HOUSING OPERATING FUND PROGRAM Asset Management § 990.280...

  7. 24 CFR 990.280 - Project-based budgeting and accounting.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... in this subpart (including project-based management, budgeting, and accounting). Asset management... accounting. 990.280 Section 990.280 Housing and Urban Development Regulations Relating to Housing and Urban... URBAN DEVELOPMENT THE PUBLIC HOUSING OPERATING FUND PROGRAM Asset Management § 990.280...

  8. 24 CFR 990.280 - Project-based budgeting and accounting.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... in this subpart (including project-based management, budgeting, and accounting). Asset management... accounting. 990.280 Section 990.280 Housing and Urban Development REGULATIONS RELATING TO HOUSING AND URBAN... URBAN DEVELOPMENT THE PUBLIC HOUSING OPERATING FUND PROGRAM Asset Management § 990.280...

  9. 24 CFR 990.280 - Project-based budgeting and accounting.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... in this subpart (including project-based management, budgeting, and accounting). Asset management... accounting. 990.280 Section 990.280 Housing and Urban Development REGULATIONS RELATING TO HOUSING AND URBAN... URBAN DEVELOPMENT THE PUBLIC HOUSING OPERATING FUND PROGRAM Asset Management § 990.280...

  10. 24 CFR 990.280 - Project-based budgeting and accounting.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... in this subpart (including project-based management, budgeting, and accounting). Asset management... accounting. 990.280 Section 990.280 Housing and Urban Development REGULATIONS RELATING TO HOUSING AND URBAN... URBAN DEVELOPMENT THE PUBLIC HOUSING OPERATING FUND PROGRAM Asset Management § 990.280...

  11. The Effect of Web-Based Collaborative Learning Methods to the Accounting Courses in Technical Education

    ERIC Educational Resources Information Center

    Cheng, K. W. Kevin

    2009-01-01

    This study mainly explored the effect of applying web-based collaborative learning instruction to the accounting curriculum on student's problem-solving attitudes in Technical Education. The research findings and proposed suggestions would serve as a reference for the development of accounting-related curricula and teaching strategies. To achieve…

  12. Measure for Measure: How Proficiency-Based Accountability Systems Affect Inequality in Academic Achievement

    ERIC Educational Resources Information Center

    Jennings, Jennifer; Sohn, Heeju

    2014-01-01

    How do proficiency-based accountability systems affect inequality in academic achievement? This article reconciles mixed findings in the literature by demonstrating that three factors jointly determine accountability's impact. First, by analyzing student-level data from a large urban school district, we find that when educators face…

  13. The Tension between Teacher Accountability and Flexibility: The Paradox of Standards-Based Reform

    ERIC Educational Resources Information Center

    Nadelson, Louis S.; Fuller, Michael; Briggs, Pamela; Hammons, David; Bubak, Katie; Sass, Margaret

    2012-01-01

    The anticipated constraints imposed by the accountability process associated with standards-based reform on teachers' practice suggest a tension between teachers' desire for flexibility and the accountability mandates associated with reform initiatives. In particular, we posited that the teachers would negatively perceive the influence of…

  14. A network identity authentication protocol of bank account system based on fingerprint identification and mixed encryption

    NASA Astrophysics Data System (ADS)

    Zhu, Lijuan; Liu, Jingao

    2013-07-01

    This paper describes a network identity authentication protocol for a bank account system based on fingerprint identification and mixed encryption. The protocol provides every bank user a safe and effective way to manage his or her own bank account, and can effectively prevent hacker attacks and bank clerk crime, thereby guaranteeing the legitimate rights and interests of bank users.

  15. Does Participation in a Computer-Based Learning Program in Introductory Financial Accounting Course Lead to Choosing Accounting as a Major?

    ERIC Educational Resources Information Center

    Owhoso, Vincent; Malgwi, Charles A.; Akpomi, Margaret

    2014-01-01

    The authors examine whether students who completed a computer-based intervention program, designed to help them develop abilities and skills in introductory accounting, later declared accounting as a major. A sample of 1,341 students participated in the study, of which 74 completed the intervention program (computer-based assisted learning [CBAL])…

  16. A Review of Financial Accounting Fraud Detection based on Data Mining Techniques

    NASA Astrophysics Data System (ADS)

    Sharma, Anuj; Kumar Panigrahi, Prabin

    2012-02-01

    With the upsurge in financial accounting fraud experienced in the current economic scenario, financial accounting fraud detection (FAFD) has become an emerging topic of great importance for academia, research and industry. The failure of organizations' internal auditing systems to identify accounting fraud has led to the use of specialized procedures to detect financial accounting fraud, collectively known as forensic accounting. Data mining techniques are providing great aid in financial accounting fraud detection, since dealing with the large volumes and complexities of financial data is a big challenge for forensic accounting. This paper presents a comprehensive review of the literature on the application of data mining techniques for the detection of financial accounting fraud and proposes a framework for data mining-based accounting fraud detection. The systematic and comprehensive literature review of the data mining techniques applicable to financial accounting fraud detection may provide a foundation for future research in this field. The findings of this review show that data mining techniques like logistic models, neural networks, Bayesian belief networks, and decision trees have been applied most extensively to provide primary solutions to the problems inherent in the detection and classification of fraudulent data.
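
    Of the model families named in the review, a logistic model is the simplest to sketch; the two features and the toy transaction records below are invented for illustration:

```python
import math

def train_logistic(X, y, lr=0.1, epochs=500):
    """Plain stochastic-gradient logistic regression: one of the classifier
    families the review identifies for flagging fraudulent records."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1 / (1 + math.exp(-z))
            g = p - yi  # gradient of the log-loss w.r.t. z
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def fraud_probability(w, b, xi):
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1 / (1 + math.exp(-z))

# Hypothetical features: [amount z-score, posted outside business hours?]
X = [[0.1, 0], [0.2, 0], [0.9, 1], [0.8, 1]]
y = [0, 0, 1, 1]  # 1 = known fraudulent record
w, b = train_logistic(X, y)
```

    A production system would train on audited historical ledgers with far richer features; the point here is only the shape of the technique.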

  17. [Orbitozygomatic approaches to the skull base].

    PubMed

    Cherekaev, V A; Gol'bin, D A; Belov, A I; Radchenkov, N S; Lasunin, N V; Vinokurov, A G

    2015-01-01

    The paper is written in the lecture format and dedicated to one of the main basal approaches, the orbitozygomatic approach, that has been widely used by neurosurgeons for several decades. The authors describe the historical background of the approach development and the surgical technique features and also analyze the published data about application of the orbitozygomatic approach in surgery for skull base tumors and cerebral aneurysms. PMID:26529627

  18. A Total Quality Management Approach to Assurance of Learning in the Accounting Classroom: An Empirical Study

    ERIC Educational Resources Information Center

    Harvey, Mary Ellen; Eisner, Susan

    2011-01-01

    The research presented in this paper seeks to discern which combination of pedagogical tools most positively impact student learning of the introductory Accounting curriculum in the Principles of Accounting courses in a 4-year U.S. public college. This research topic is relevant because it helps address a quandary many instructors experience: how…

  19. Working toward More Engaged and Successful Accounting Students: A Balanced Scorecard Approach

    ERIC Educational Resources Information Center

    Fredin, Amy; Fuchsteiner, Peter; Portz, Kris

    2015-01-01

    Prior research indicates that student engagement is the key to student success, as measured by college grades, degree completion, and graduate school enrollment. We propose a set of goals and objectives for accounting students, in particular, to help them become engaged not only in the educational process, but also in the accounting profession.…

  20. A Blue/Green Water-based Accounting Framework for Assessment of Water Security

    NASA Astrophysics Data System (ADS)

    Rodrigues, D. B.; Gupta, H. V.; Mendiondo, E. M.

    2013-12-01

    A comprehensive assessment of water security can incorporate several water-related concepts, including provisioning and support for freshwater ecosystem services, water footprint, water scarcity, and water vulnerability, while accounting for Blue and Green Water (BW and GW) flows defined in accordance with the hydrological processes involved. Here, we demonstrate how a quantitative analysis of provisioning and demand (in terms of water footprint) for BW and GW ecosystem services can be conducted, so as to provide indicators of water scarcity and vulnerability at the basin level. To illustrate the approach, we use the Soil and Water Assessment Tool (SWAT) to model the hydrology of an agricultural basin (291 sq.km) within the Cantareira water supply system in Brazil. To provide a more comprehensive basis for decision-making, we compute the BW provision using three different hydrologically based methods for specifying monthly Environmental Flow Requirements (EFRs) over a 23-year period. The current BW-Footprint was defined using surface water rights for the reference year 2012. We then analyzed the BW- and GW-Footprints against long-term series of monthly values of freshwater availability. Our results reveal clear spatial and temporal patterns of water scarcity and vulnerability levels within the basin, and help to distinguish between human and natural reasons (drought) for conditions of insecurity. The Blue/Green water-based accounting framework developed here can be benchmarked at a range of spatial scales, thereby improving our understanding of how and where water-related threats to human and aquatic ecosystem security can arise. Future investigation will be necessary to better understand the intra-annual variability of blue water demand and to evaluate the impacts of uncertainties associated with a) the water rights database, and b) the effects of climate change projections on blue and green freshwater provision.

  1. Analyzing the Operation of Performance-Based Accountability Systems for Public Services. Technical Report

    ERIC Educational Resources Information Center

    Camm, Frank; Stecher, Brian M.

    2010-01-01

    Empirical evidence of the effects of performance-based public management is scarce. This report describes a framework used to organize available empirical information on one form of performance-based management, a performance-based accountability system (PBAS). Such a system identifies individuals or organizations that must change their behavior…

  2. Financial Management Reforms in the Health Sector: A Comparative Study Between Cash-based and Accrual-based Accounting Systems

    PubMed Central

    Abolhallaje, Masoud; Jafari, Mehdi; Seyedin, Hesam; Salehi, Masoud

    2014-01-01

    Background: Financial management and accounting reform in the public sector was started in 2000. Moving from cash-based to accrual-based accounting is considered the key component of these reforms and adjustments in the public sector. Performing this reform in the health system is part of a bigger reform under the new public management. Objectives: The current study aimed to analyze the movement from cash-based to accrual-based accounting in the health sector in Iran. Patients and Methods: This comparative study was conducted in 2013 to compare financial management and the movement from cash-based to accrual-based accounting in the health sector in countries such as the United States, Britain, Canada, Australia, New Zealand, and Iran. Library resources and reputable databases such as Medline, Elsevier, Index Copernicus, DOAJ, EBSCO-CINAHL, SID, and Iranmedex were searched. Fish cards were used to collect the data. Data were compared and analyzed using comparative tables. Results: Developed countries have implemented accrual-based accounting and utilized the valid, reliable and practical information in accrual-based reporting in different areas such as price and tariff setting, operational budgeting, public accounting, performance evaluation and comparison, and evidence based decision making. In Iran, however, only a few public organizations such as the municipalities and the universities of medical sciences use accrual-based accounting, but despite what is required by law, the other public organizations do not use accrual-based accounting. Conclusions: There are advantages in applying accrual-based accounting in the public sector, which certainly depend on how this system is implemented in the sector. PMID:25763194

  3. Safe Maneuvering Envelope Estimation Based on a Physical Approach

    NASA Technical Reports Server (NTRS)

    Lombaerts, Thomas J. J.; Schuet, Stefan R.; Wheeler, Kevin R.; Acosta, Diana; Kaneshige, John T.

    2013-01-01

    This paper discusses a computationally efficient algorithm for estimating the safe maneuvering envelope of damaged aircraft. The algorithm performs a robust reachability analysis through an optimal control formulation while making use of time scale separation and taking into account uncertainties in the aerodynamic derivatives. This approach differs from others in that it is physically inspired. The more transparent approach allows the data to be interpreted at each step, and it is assumed that these physical models, based upon flight dynamics theory, will therefore facilitate certification for future real-life applications.

  4. Contrasting motivational orientation and evaluative coding accounts: on the need to differentiate the effectors of approach/avoidance responses.

    PubMed

    Kozlik, Julia; Neumann, Roland; Lozo, Ljubica

    2015-01-01

    Several emotion theorists suggest that valenced stimuli automatically trigger motivational orientations and thereby facilitate corresponding behavior. Positive stimuli were thought to activate approach motivational circuits which in turn primed approach-related behavioral tendencies whereas negative stimuli were supposed to activate avoidance motivational circuits so that avoidance-related behavioral tendencies were primed (motivational orientation account). However, recent research suggests that typically observed affective stimulus-response compatibility phenomena might be entirely explained in terms of theories accounting for mechanisms of general action control instead of assuming motivational orientations to mediate the effects (evaluative coding account). In what follows, we explore to what extent this notion is applicable. We present literature suggesting that evaluative coding mechanisms indeed influence a wide variety of affective stimulus-response compatibility phenomena. However, the evaluative coding account does not seem to be sufficient to explain affective S-R compatibility effects. Instead, several studies provide clear evidence in favor of the motivational orientation account that seems to operate independently of evaluative coding mechanisms. Implications for theoretical developments and future research designs are discussed. PMID:25983718

  5. Beyond Traditional Literacy Instruction: Toward an Account-Based Literacy Training Curriculum in Libraries

    ERIC Educational Resources Information Center

    Cirella, David

    2012-01-01

    A diverse group, account-based services include a wide variety of sites commonly used by patrons, including online shopping sites, social networks, photo- and video-sharing sites, banking and financial sites, government services, and cloud-based storage. Whether or not a piece of information is obtainable online must be considered when creating…

  6. Accounting for water-column variability in shallow-water waveguide characterizations based on modal eigenvalues

    SciTech Connect

    Becker, Kyle M.; Ballard, Megan S.

    2010-09-06

    The influence of water-column variability on the characterization of shallow-water waveguides using modal eigenvalue information is considered. This work is based on the relationship between the acoustic pressure field in shallow water and the depth-dependent Green's function through the Hankel transform. In many practical situations, the Hankel transform can be approximated by a Fourier transform, in which case the Green's function is approximated by a horizontal wave number spectrum with discrete peaks corresponding to individual modal eigenvalues. In turn, the wave number data can be used in inverse algorithms to determine geoacoustic properties of the waveguide. Wave number spectra are estimated from measurements of a point-source acoustic field on a horizontal aperture array in the water column. For range-dependent waveguides, techniques analogous to a short-time Fourier transform are employed to estimate range-dependent wave number spectra. In this work, water-column variability due to linear internal waves and mesoscale features is considered. It will be shown that these two types of variability impact the estimation of range-dependent modal eigenvalues in different ways. Approaches for accounting for these different types of variability will be discussed as they apply to waveguide characterization.
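    The peak-picking idea in this abstract can be illustrated with a small synthetic example (all values hypothetical): two trapped modes are sampled on a horizontal array, cylindrical spreading is removed, and the modal eigenvalues appear as discrete peaks in the wavenumber spectrum.

```python
import numpy as np

# Synthetic illustration (hypothetical values): a shallow-water pressure field on a
# horizontal array, modeled as two modes p(r) = sum_m a_m * exp(i*k_m*r)/sqrt(r).
# The horizontal wavenumber spectrum of sqrt(r)*p shows discrete peaks near the
# modal eigenvalues k_m, which is the quantity fed to the inverse algorithms.
k_modes = [0.42, 0.40]                       # assumed modal eigenvalues (rad/m)
amps = [1.0, 0.6]
dr = 2.0                                     # hydrophone spacing (m)
r = np.arange(200.0, 5000.0, dr)             # horizontal aperture (m)
p = sum(a * np.exp(1j * k * r) / np.sqrt(r) for a, k in zip(amps, k_modes))

# Remove cylindrical spreading, window to control sidelobes, and zero-pad the FFT
# so the wavenumber grid is fine enough to localize the peaks.
x = np.sqrt(r) * p * np.hanning(r.size)
n = 1 << 16
spec = np.abs(np.fft.fft(x, n))
k_axis = 2.0 * np.pi * np.fft.fftfreq(n, d=dr)

# The strongest peak sits at the eigenvalue of the dominant mode.
k_peak = abs(k_axis[int(np.argmax(spec))])
print(round(k_peak, 3))
```

    With a ~4.8 km aperture the two eigenvalues (separated by 0.02 rad/m) are well resolved; the zero-padding only refines the wavenumber grid, it does not improve the underlying resolution.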

  7. Market-Based and Authorizer-Based Accountability Demands and the Implications for Charter School Leadership

    ERIC Educational Resources Information Center

    Blitz, Mark H.

    2011-01-01

    Charter school research has examined the relationship between charter school mission and issues of school accountability. However, there is a lack of research focusing on how charter school leaders frame and solve problems regarding multiple accountability demands. Given this gap, I investigate the question: How do charter school leaders…

  8. Peer-Mentoring Undergraduate Accounting Students: The Influence on Approaches to Learning and Academic Performance

    ERIC Educational Resources Information Center

    Fox, Alison; Stevenson, Lorna; Connelly, Patricia; Duff, Angus; Dunlop, Angela

    2010-01-01

    This article considers the impact of a student peer-mentoring programme (the Mentor Accountant Project, MAP) on first-year undergraduates' academic performance. The development of MAP was informed by reference to extant literature; it relies on the voluntary services of third-year students who then act as mentors to first-year student mentees in…

  9. The Demand for Higher Education: A Static Structural Approach Accounting for Individual Heterogeneity and Nesting Patterns

    ERIC Educational Resources Information Center

    Flannery, Darragh; O'Donoghue, Cathal

    2013-01-01

    In this paper we estimate a structural model of higher education participation and labour choices in a static setting that accounts for individual heterogeneity and possible nesting structures in the decision process. We assume that young people that complete upper secondary education are faced with three choices, go to higher education, not go to…

  10. Using NCATE 2000 Standards as State Accreditation Standards: A Beginner's and a Veteran's Approach to Accountability.

    ERIC Educational Resources Information Center

    Bennett, J. Phillip; Gage, Martha S.; Craven, Lonnie; Morrison, Gail

    This paper discusses how the 2000 standards of the National Council for the Accreditation of Teacher Education (NCATE) can be used in state accreditation standards to promote accountability. It describes standards 1 and 2: candidate knowledge, skills, and dispositions and assessment system and unit evaluation, then presents a few examples of…

  11. Bursar Accounts, Payroll Deduction, and Debt Collection: A Three-Channel Approach to Lost Item Reimbursement

    ERIC Educational Resources Information Center

    Snowman, Ann MacKay

    2005-01-01

    In 2003, Penn State Libraries implemented payroll deduction and collection agency programs to gain better control of accounts receivable. The author reports on the implementation processes and first year outcomes of the programs. She recommends careful consideration of several questions before implementing such measures.

  12. The Usage of an Online Discussion Forum for the Facilitation of Case-Based Learning in an Intermediate Accounting Course: A New Zealand Case

    ERIC Educational Resources Information Center

    Weil, Sidney; McGuigan, Nicholas; Kern, Thomas

    2011-01-01

    This paper describes the implementation of an online discussion forum as a means of facilitating case-based learning in an intermediate financial accounting course. The paper commences with a review of case-based learning literature and the use of online discussions as a delivery platform, linking these pedagogical approaches to the emerging needs…

  13. Survival analysis approach to account for non-exponential decay rate effects in lifetime experiments

    NASA Astrophysics Data System (ADS)

    Coakley, K. J.; Dewey, M. S.; Huber, M. G.; Huffer, C. R.; Huffman, P. R.; Marley, D. E.; Mumm, H. P.; O'Shaughnessy, C. M.; Schelhammer, K. W.; Thompson, A. K.; Yue, A. T.

    2016-03-01

    In experiments that measure the lifetime of trapped particles, in addition to loss mechanisms with exponential survival probability functions, particles can be lost by mechanisms with non-exponential survival probability functions. Failure to account for such loss mechanisms produces systematic measurement error and associated systematic uncertainties in these measurements. In this work, we develop a general competing risks survival analysis method to account for the joint effect of loss mechanisms with either exponential or non-exponential survival probability functions, and a method to quantify the size of systematic effects and associated uncertainties for lifetime estimates. As a case study, we apply our survival analysis formalism and method to the Ultra Cold Neutron lifetime experiment at NIST. In this experiment, neutrons can escape a magnetic trap before they decay due to a wall loss mechanism with an associated non-exponential survival probability function.
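    The competing-risks idea can be sketched numerically. The wall-loss survival form below (a stretched exponential) and all parameter values are assumptions for illustration, not the NIST analysis:

```python
import numpy as np

# Minimal competing-risks sketch (assumed functional forms and values).
# Independent loss mechanisms multiply: S_total(t) = S_beta(t) * S_wall(t),
# with S_beta exponential and S_wall non-exponential (here stretched exponential).
tau_beta = 880.0                                   # illustrative "true" lifetime (s)

def s_wall(t, tau_w=5000.0, alpha=0.5):
    # assumed non-exponential wall-loss survival function
    return np.exp(-(t / tau_w) ** alpha)

def s_total(t):
    return np.exp(-t / tau_beta) * s_wall(t)

# Naive analysis: fit a single exponential between two storage times while
# ignoring wall losses -> lifetime estimate biased low (systematic error).
t0, t1 = 500.0, 3000.0
tau_naive = (t1 - t0) / np.log(s_total(t0) / s_total(t1))

# Competing-risks correction: divide out the wall-loss survival first.
tau_corr = (t1 - t0) / np.log((s_total(t0) / s_wall(t0)) /
                              (s_total(t1) / s_wall(t1)))
print(round(tau_naive, 1), round(tau_corr, 1))
```

    The gap between the naive and corrected estimates is exactly the kind of systematic effect the paper's formalism is designed to quantify.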

  14. A Principles-Based Approach to Teaching International Financial Reporting Standards (IFRS)

    ERIC Educational Resources Information Center

    Persons, Obeua

    2014-01-01

    This article discusses the principles-based approach that emphasizes a "why" question by using the International Accounting Standards Board (IASB) "Conceptual Framework for Financial Reporting" to question and understand the basis for specific differences between IFRS and U.S. generally accepted accounting principles (U.S.…

  15. Evaluating the effectiveness of community health partnerships: a stakeholder accountability approach.

    PubMed

    Weech-Maldonado, Robert; Benson, Keith J; Gamm, Larry D

    2003-01-01

    Community health partnerships (CHPs) are promoted as effective cooperative interorganizational relationships to improve community health status while conserving resources. However, relatively little is known about the effectiveness of these partnerships in achieving their goals. Using concepts from a network effectiveness framework (Provan and Milward, 2001) and a network accountability framework (Gamm, 1998), the authors propose that successful CHPs are those that are effective at multiple levels (community, network, organization/participants) and/or accountability dimensions (political, commercial, clinical/patient, and community). The combined frameworks serve to identify a number of community health stakeholders and associated interests that vary according to accountability dimensions to which CHPs respond. Using survey data from over 400 participants in 25 Community Care Networks, the authors assess the usefulness of the conceptual framework in evaluating CHP effectiveness. The results suggest that CHP participants recognize three different levels of analysis in their evaluation of network effectiveness: community, network, and organization/participant. Furthermore, the results show that respondents distinguish between two different organization/participant benefits: enabling and client services. While respondents rated the intangible resources or enabling benefits (e.g., legitimacy and learning) of partnership participation most highly, client services resulting from CHP participation (e.g., client services and referrals) received the lowest ratings. Community benefit (e.g., improving community health status) and network effectiveness (e.g., ability to provide efficient, high quality health and human services) received ratings that fall between the enabling and client services. Given the relatively good scores (above 60%) received by CHPs on all four effectiveness dimensions considered here, it appears that the majority of respondents find at least some

  16. Spatial pattern of nitrogen deposition flux over Czech forests: a novel approach accounting for unmeasured nitrogen species

    NASA Astrophysics Data System (ADS)

    Hůnová, Iva; Stoklasová, Petra; Kurfürst, Pavel; Vlček, Ondřej; Schovánková, Jana; Stráník, Vojtěch

    2015-04-01

    atmospheric nitrogen deposition flux over the Czech forests, collating all available data and model results. The aim of the presented study is to provide an improved, more reliable and more realistic estimate of the spatial pattern of nitrogen deposition flux over one country. Such estimates have so far been based on measurements of ambient N/NOx concentrations as a dry deposition proxy, and N/NH4+ and N/NO3- as a wet deposition proxy. To estimate the unmeasured species contributing to dry deposition, we used an Eulerian photochemical dispersion model, CAMx, the Comprehensive Air Quality Model with extensions (ESSS, 2011), coupled with a high-resolution regional numeric weather prediction model, Aladin (Vlček, Corbet, 2011). The contribution of fog was estimated using a geostatistical data-driven model. Final maps accounting for unmeasured species clearly indicate that the approach used so far results in substantial underestimation of nitrogen deposition flux. Substitution of unmeasured nitrogen species by modeled values seems to be a plausible way to approximate total nitrogen deposition and to obtain a more realistic spatial pattern as input for further studies of likely nitrogen impacts on ecosystems. Acknowledgements: We would like to acknowledge the grants GA14-12262S - Effects of changing growth conditions on tree increment, stand production and vitality - danger or opportunity for the Central-European forestry?, and NAZV QI112A168 (ForSoil) of the Czech Ministry for Agriculture for support of this contribution. The input data used for the analysis were provided by the Czech Hydrometeorological Institute. References: Bobbink, R., Hicks, K., Galloway, J., Spranger, T., Alkemade, R. et al. (2010): Global Assessment of Nitrogen Deposition Effects on Terrestrial Plant Diversity: a Synthesis. Ecological Applications 20 (1), 30-59. Fowler D., O'Donoghue M., Muller J.B.A, et al. (2005): A chronology of nitrogen deposition in the UK between 1900 and 2000. Water, Air & Soil Pollution: Focus

  17. A cloud model-based approach for water quality assessment.

    PubMed

    Wang, Dong; Liu, Dengfeng; Ding, Hao; Singh, Vijay P; Wang, Yuankun; Zeng, Xiankui; Wu, Jichun; Wang, Lachun

    2016-07-01

    Water quality assessment entails essentially a multi-criteria decision-making process accounting for qualitative and quantitative uncertainties and their transformation. Considering uncertainties of randomness and fuzziness in water quality evaluation, a cloud model-based assessment approach is proposed. The cognitive cloud model, derived from information science, can realize the transformation between qualitative concepts and quantitative data, based on probability and statistics and fuzzy set theory. When applying the cloud model to practical assessment, three technical issues are considered before the development of a complete cloud model-based approach: (1) a bilateral boundary formula with nonlinear boundary regression for parameter estimation, (2) a hybrid entropy-analytic hierarchy process technique for calculation of weights, and (3) the mean of repeated simulations for determining the degree of final certainty. The cloud model-based approach is tested by evaluating the eutrophication status of 12 typical lakes and reservoirs in China and comparing it with four other methods: the Scoring Index method, the Variable Fuzzy Sets method, the Hybrid Fuzzy and Optimal model, and the Neural Networks method. The proposed approach yields membership information for each water quality status, which leads to the final status, and is found to be consistent with the alternative methods and accurate. PMID:26995351
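    The forward normal cloud generator underlying such assessments can be sketched as follows; the grade parameters (Ex, En, He) and the observation are hypothetical, and the averaging step corresponds to issue (3) in the abstract:

```python
import numpy as np

# Sketch of the normal cloud model's forward generator (standard formulation in the
# cloud-model literature; all parameter values here are hypothetical). Each quality
# grade is a cloud (Ex, En, He); an observation's membership in a grade is the mean
# certainty degree over repeated stochastic simulations.
def cloud_certainty(x, Ex, En, He, n_sim=5000, seed=0):
    rng = np.random.default_rng(seed)
    En_i = rng.normal(En, He, n_sim)          # per-droplet entropy (randomness + fuzziness)
    En_i = np.where(np.abs(En_i) < 1e-12, 1e-12, En_i)
    mu = np.exp(-((x - Ex) ** 2) / (2.0 * En_i ** 2))
    return mu.mean()                          # mean of repeated simulations

# Hypothetical eutrophication grades for one indicator (e.g., chlorophyll-a):
grades = {"oligotrophic": (1.0, 0.5, 0.05),
          "mesotrophic": (4.0, 1.5, 0.1),
          "eutrophic": (10.0, 3.0, 0.2)}
obs = 4.2
memberships = {g: cloud_certainty(obs, *p) for g, p in grades.items()}
status = max(memberships, key=memberships.get)
print(status)
```

    The final status is simply the grade with the highest mean certainty; the He parameter is what distinguishes this from a plain fuzzy membership function, since it randomizes the entropy itself.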

  18. [Methodological Approaches to the Organization of Counter Measures Taking into Account Landscape Features of Radioactively Contaminated Territories].

    PubMed

    Kuznetsov, V K; Sanzharova, N I

    2016-01-01

    Methodological approaches to the organization of countermeasures are considered, taking into account the landscape features of radioactively contaminated territories. The current status of, and new requirements for, the organization of countermeasures in contaminated agricultural areas are analyzed. The basic principles, objectives and problems of forming countermeasures with regard to the landscape characteristics of the territory are presented; the organization and optimization of countermeasures in radioactively contaminated agricultural landscapes are also substantiated. PMID:27245009

  19. An Energy Approach to a Micromechanics Model Accounting for Nonlinear Interface Debonding.

    SciTech Connect

    Tan, H.; Huang, Y.; Geubelle, P. H.; Liu, C.; Breitenfeld, M. S.

    2005-01-01

    We developed a micromechanics model to study the effect of nonlinear interface debonding on the constitutive behavior of composite materials. While implementing this micromechanics model in a large simulation code for solid rockets, we were challenged by problems such as tension/shear coupling and the nonuniform distribution of the displacement jump at the particle/matrix interfaces. We therefore propose an energy approach to solve these problems. This energy approach calculates the potential energy of the representative volume element, including the contribution from the interface debonding. By minimizing the potential energy with respect to the variation of the interface displacement jump, the traction-balanced interface debonding can be found and the macroscopic constitutive relations established. This energy approach can treat different load conditions in a unified way, and the interface cohesive law can take any arbitrary form. In this paper, the energy approach is verified to give the same constitutive behaviors as reported before.

  20. Cosmological constraints from Sunyaev-Zeldovich cluster counts: An approach to account for missing redshifts

    SciTech Connect

    Bonaldi, A.; Battye, R. A.; Brown, M. L.

    2014-05-10

    The accumulation of redshifts provides a significant observational bottleneck when using galaxy cluster surveys to constrain cosmological parameters. We propose a simple method to allow the use of samples where there is a fraction of the redshifts that are not known. The simplest assumption is that the missing redshifts are randomly extracted from the catalog, but the method also allows one to take into account known selection effects in the accumulation of redshifts. We quantify the reduction in statistical precision of cosmological parameter constraints as a function of the fraction of missing redshifts for simulated surveys, and also investigate the impact of making an incorrect assumption for the distribution of missing redshifts.
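    The core assumption, that missing redshifts are randomly extracted from the catalog, can be illustrated with a toy rescaling experiment (synthetic redshift distribution; not the paper's full likelihood):

```python
import numpy as np

# Toy sketch: if missing redshifts are randomly extracted from the catalog, the
# known-z subsample preserves the shape of the redshift distribution, so the true
# binned counts can be estimated by rescaling by the completeness fraction;
# statistical precision degrades as completeness drops.
rng = np.random.default_rng(1)
z_true = rng.rayleigh(0.4, size=20000)            # hypothetical cluster redshifts
completeness = 0.7                                 # 30% of redshifts missing
has_z = rng.uniform(size=z_true.size) < completeness

bins = np.linspace(0.0, 1.2, 7)
n_true, _ = np.histogram(z_true, bins)
n_known, _ = np.histogram(z_true[has_z], bins)
n_est = n_known / completeness                     # rescaled estimate of true counts

frac_err = np.abs(n_est - n_true) / np.maximum(n_true, 1)
print(frac_err.max())
```

    Known selection effects would replace the single completeness factor with a redshift-dependent one; assuming the wrong dependence is exactly the incorrect-distribution scenario the paper investigates.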

  1. A subgrid based approach for morphodynamic modelling

    NASA Astrophysics Data System (ADS)

    Volp, N. D.; van Prooijen, B. C.; Pietrzak, J. D.; Stelling, G. S.

    2016-07-01

    To improve the accuracy and the efficiency of morphodynamic simulations, we present a subgrid based approach for a morphodynamic model. This approach is well suited for areas characterized by sub-critical flow, such as estuaries, coastal areas and lowland rivers. This new method uses a different grid resolution to compute the hydrodynamics and the morphodynamics. The hydrodynamic computations are carried out with a subgrid based, two-dimensional, depth-averaged model. This model uses a coarse computational grid in combination with a subgrid. The subgrid contains high resolution bathymetry and roughness information to compute volumes, friction and advection. The morphodynamic computations are carried out entirely on a high resolution grid, the bed grid. It is key to find a link between the information defined on the different grids in order to guarantee the feedback between the hydrodynamics and the morphodynamics. This link is made by using a new physics-based interpolation method. The method interpolates water levels and velocities from the coarse grid to the high resolution bed grid. The morphodynamic solution improves significantly when using the subgrid based method compared to a full coarse grid approach. The Exner equation is discretised with an upwind method based on the direction of the bed celerity. This ensures a stable solution for the Exner equation. By means of three examples, it is shown that the subgrid based approach offers a significant improvement at a minimal computational cost.
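    The upwind discretisation of the Exner equation mentioned above can be sketched in one dimension; the transport law, the rigid-lid hydrodynamics, and all parameter values are illustrative assumptions, not the paper's subgrid scheme:

```python
import numpy as np

# 1-D sketch of an upwind Exner update: dz/dt = -1/(1-p) * dq_s/dx, with an
# assumed power-law transport rate q_s = a*|u|**b. The spatial difference is
# taken upwind with respect to the sign of the bed celerity, which for this
# uni-directional flow (u > 0) is positive, i.e. a backward difference.
def exner_step(z, u, dx, dt, a=1e-2, b=3.0, porosity=0.4):
    qs = a * np.sign(u) * np.abs(u) ** b
    dqs = np.zeros_like(qs)
    dqs[1:] = qs[1:] - qs[:-1]        # backward (upwind) difference
    dqs[0] = 0.0                      # fixed upstream boundary
    return z - dt / ((1.0 - porosity) * dx) * dqs

# Hypothetical test case: steady unit discharge over a small bed hump; the flow
# accelerates over the hump, so the transport diverges on the upstream face and
# the hump migrates downstream, which the upwind scheme handles stably.
x = np.linspace(0.0, 100.0, 201)
z = 0.2 * np.exp(-((x - 30.0) / 5.0) ** 2)         # initial bed hump (m)
h0, q = 2.0, 2.0                                    # water level datum, unit discharge
for _ in range(200):
    u = q / (h0 - z)                                # rigid-lid velocity over the hump
    z = exner_step(z, u, dx=0.5, dt=1.0)

print(round(x[int(np.argmax(z))], 1))
```

    Flipping the difference direction against the bed celerity would make the scheme unstable, which is why the paper ties the upwind direction to the celerity sign.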

  2. Historicizing and Contextualizing Global Policy Discourses: Test- and Standards-Based Accountabilities in Education

    ERIC Educational Resources Information Center

    Lingard, Bob

    2013-01-01

    This paper in commenting on the contributions to this special number demonstrates the necessity of historicizing and contextualizing the rise of test- and standards-based modes of accountability in contemporary education policy globally. Both are imperative for understanding specific national manifestations of what has become a globalized…

  3. Constrained Professionalism: Dilemmas of Teaching in the Face of Test-Based Accountability

    ERIC Educational Resources Information Center

    Wills, John S.; Sandholtz, Judith Haymore

    2009-01-01

    Background/Context: In response to state-level test-based accountability and the federal No Child Left Behind Act, school administrators increasingly view centralized curriculum and prescribed instructional strategies as the most direct means of increasing student performance. This movement toward standardization reduces teachers' autonomy and…

  4. Are Performance-Based Accountability Systems Effective? Evidence from Five Sectors. Research Brief

    ERIC Educational Resources Information Center

    Leuschner, Kristin J.

    2010-01-01

    During the past two decades, performance-based accountability systems (PBASs), which link financial or other incentives to measured performance as a means of improving services, have gained popularity among policymakers. Although PBASs can vary widely across sectors, they share three main components: goals (i.e., one or more long-term outcomes to…

  5. Toward a Culture of Consequences: Performance-Based Accountability Systems for Public Services. Monograph

    ERIC Educational Resources Information Center

    Stecher, Brian M.; Camm, Frank; Damberg, Cheryl L.; Hamilton, Laura S.; Mullen, Kathleen J.; Nelson, Christopher; Sorensen, Paul; Wachs, Martin; Yoh, Allison; Zellman, Gail L.

    2010-01-01

    Performance-based accountability systems (PBASs), which link incentives to measured performance as a means of improving services to the public, have gained popularity. While PBASs can vary widely across sectors, they share three main components: goals, incentives, and measures. Research suggests that PBASs influence provider behaviors, but little…

  6. Consultation to Support Inclusive Accountability and Standards-Based Reform: Facilitating Access, Equity, and Empowerment

    ERIC Educational Resources Information Center

    Roach, Andrew T.; Elliott, Stephen N.

    2009-01-01

    Current federal legislation (i.e., No Child Left Behind (NCLB)) requires states to set rigorous academic standards, ensure classroom instruction addresses these standards, and measure and report students' progress via large-scale assessments. NCLB assumes that inclusive accountability systems and standards-based reform will result in improved…

  7. Is Comprehension Necessary for Error Detection? A Conflict-Based Account of Monitoring in Speech Production

    ERIC Educational Resources Information Center

    Nozari, Nazbanou; Dell, Gary S.; Schwartz, Myrna F.

    2011-01-01

    Despite the existence of speech errors, verbal communication is successful because speakers can detect (and correct) their errors. The standard theory of speech-error detection, the perceptual-loop account, posits that the comprehension system monitors production output for errors. Such a comprehension-based monitor, however, cannot explain the…

  8. Dynamic model of production enterprises based on accounting registers and its identification

    NASA Astrophysics Data System (ADS)

    Sirazetdinov, R. T.; Samodurov, A. V.; Yenikeev, I. A.; Markov, D. S.

    2016-06-01

    The report focuses on the mathematical modeling of economic entities based on accounting registers. A dynamic model of the financial and economic activity of an enterprise is developed as a system of differential equations, and algorithms for identifying the parameters of the dynamic model are created. The model is then constructed and identified for Russian machine-building enterprises.

  9. Educational Accountability: The Issue of Competency-Based Education. Informational Bulletin 77-IB-4.

    ERIC Educational Resources Information Center

    Watchke, Gary A.

    After a brief discussion of the issues prompting concern with competency-based education (CBE) and an explanation of what it is and why it is so popular, this document goes on to discuss action Wisconsin and other states have taken to implement CBE. Very brief accounts are given of the minimal competency-type programs implemented in Sparta,…

  10. Contest for Jurisdiction: An Occupational Analysis of Principals' Responses to Test-Based Accountability

    ERIC Educational Resources Information Center

    Rutledge, Stacey A.

    2010-01-01

    This case study uses a theory of occupational ecology to understand why test-based accountability has been successful at redirecting principals' work toward high-stakes standards and assessments. The principals and English teachers at two Chicago high schools were interviewed annually over a four-year period. The study finds that test-based…

  11. Preferences for Team Learning and Lecture-Based Learning among First-Year Undergraduate Accounting Students

    ERIC Educational Resources Information Center

    Opdecam, Evelien; Everaert, Patricia; Van Keer, Hilde; Buysschaert, Fanny

    2014-01-01

    This study investigates students' "preference" for team learning and its effectiveness, compared to lecture-based learning. A quasi-experiment was set up in a financial accounting course in the first-year undergraduate of the Economics and Business Administration Program, where students had to choose between one of the two learning…

  12. Toward an Episodic Context Account of Retrieval-Based Learning: Dissociating Retrieval Practice and Elaboration

    ERIC Educational Resources Information Center

    Lehman, Melissa; Smith, Megan A.; Karpicke, Jeffrey D.

    2014-01-01

    We tested the predictions of 2 explanations for retrieval-based learning; while the elaborative retrieval hypothesis assumes that the retrieval of studied information promotes the generation of semantically related information, which aids in later retrieval (Carpenter, 2009), the episodic context account proposed by Karpicke, Lehman, and Aue (in…

  13. The Social Organization of School Counseling in the Era of Standards-Based Accountability

    ERIC Educational Resources Information Center

    Dorsey, Alexander C.

    2011-01-01

    The reform policies of standards-based accountability, as outlined in NCLB, impede the functioning of school counseling programs and the delivery of services to students. Although recent studies have focused on the transformation of the school counseling profession, a gap exists in the literature with regard to how the experiences of school…

  14. Outcome-Based Education and Student Learning in Managerial Accounting in Hong Kong

    ERIC Educational Resources Information Center

    Lui, Gladie; Shum, Connie

    2012-01-01

    Although Outcome-based Education has not been successful in public education in several countries, it has been successful in the medical fields in higher education in the U.S. The author implemented OBE in her Managerial Accounting course in H.K. Intended learning outcomes were mapped against Bloom's Cognitive Domain. Teaching and learning…

  15. An engineering based approach for hydraulic computations in river flows

    NASA Astrophysics Data System (ADS)

    Di Francesco, S.; Biscarini, C.; Pierleoni, A.; Manciola, P.

    2016-06-01

    This paper presents an engineering based approach for hydraulic risk evaluation. The aim of the research is to identify criteria for choosing the simplest appropriate model to use in different scenarios, varying the characteristics of the main river channel. The complete flow field, generally expressed in terms of pressure, velocities and accelerations, can be described through a three-dimensional approach that considers all the flow properties varying in all directions. In many practical applications for river flow studies, however, the greatest changes occur only in two dimensions or even only in one. In these cases the use of simplified approaches can lead to accurate results, with easy-to-build and faster simulations. The study has been conducted taking into account a dimensionless channel parameter, the ratio of the curvature radius to the channel width (R/B).

  16. Accounting for Success and Failure: A Discursive Psychological Approach to Sport Talk

    ERIC Educational Resources Information Center

    Locke, Abigail

    2004-01-01

    In recent years, constructionist methodologies such as discursive psychology (Edwards & Potter, 1992) have begun to be used in sport research. This paper provides a practical guide to applying a discursive psychological approach to sport data. It discusses the assumptions and principles of discursive psychology and outlines the stages of a…

  17. Situational Effects May Account for Gain Scores in Cognitive Ability Testing: A Longitudinal SEM Approach

    ERIC Educational Resources Information Center

    Matton, Nadine; Vautier, Stephane; Raufaste, Eric

    2009-01-01

    Mean gain scores for cognitive ability tests between two sessions in a selection setting are now a robust finding, yet not fully understood. Many authors do not attribute such gain scores to an increase in the target abilities. Our approach consists of testing a longitudinal SEM model suitable to this view. We propose to model the scores' changes…

  18. A fatal fall from a balcony? A biomechanical approach to resolving conflicting witness accounts.

    PubMed

    Jones, M D; Cory, C Z

    2006-07-01

    An adult male was found below a third floor balcony having sustained fatal head injuries. An account provided by a witness described how the deceased had been in high spirits and had engaged in swinging from the third floor balcony rail in an attempt to swing onto a lower second floor balcony and whilst doing so had lost his grip and fallen (10.67 metres) to the ground below. A conflicting account was provided, some weeks later, by a second witness, who claimed to have observed an argument between two men on a third floor balcony, during which one had vigorously pushed the other over the balcony rail. The push, it was alleged, caused the man to move very quickly over the balcony rail and fall in an 'upturned crucifix' position to the ground. This paper describes a series of biomechanical experiments, conducted on a reconstruction of the third floor balcony and the second floor balcony rail, during which a volunteer was subjected to the two fall scenarios, in an attempt to resolve the conflicting witness accounts. Analysis of human movement was performed using a 3-D motion analysis system, markers were placed at the volunteer's key joint centres and were tracked to determine physical parameters. The parameter values were used to calculate what dynamic movements may have occurred had the volunteer been allowed to fall, not just a distance equivalent to the lower balcony rail but a greater distance, equivalent to that between the balcony and the ground. Calculations indicate that during the hanging-fall scenario a range of body rotation was produced between 159 degrees and 249 degrees, that is, an upturned head-first body orientation, consistent with that required to produce the described injuries and consistent with the description provided by the first witness. The push-fall scenario, however, produced a greater estimated body rotation of between 329 degrees and 530 degrees, equal to the body rotating, from the point of free-fall to the moment of impact, between

  19. A Multi-scale Approach for CO2 Accounting and Risk Analysis in CO2 Enhanced Oil Recovery Sites

    NASA Astrophysics Data System (ADS)

    Dai, Z.; Viswanathan, H. S.; Middleton, R. S.; Pan, F.; Ampomah, W.; Yang, C.; Jia, W.; Lee, S. Y.; McPherson, B. J. O. L.; Grigg, R.; White, M. D.

    2015-12-01

Using carbon dioxide in enhanced oil recovery (CO2-EOR) is a promising technology for emissions management because CO2-EOR can dramatically reduce carbon sequestration costs in the absence of greenhouse gas emissions policies that include incentives for carbon capture and storage. This study develops a multi-scale approach to perform CO2 accounting and risk analysis for understanding CO2 storage potential within an EOR environment at the Farnsworth Unit of the Anadarko Basin in northern Texas. A set of geostatistical-based Monte Carlo simulations of CO2-oil-water flow and transport in the Morrow formation is conducted for global sensitivity and statistical analysis of the major risk metrics: CO2 injection rate, CO2 first breakthrough time, CO2 production rate, cumulative net CO2 storage, cumulative oil and CH4 production, and water injection and production rates. A global sensitivity analysis indicates that reservoir permeability, porosity, and thickness are the major intrinsic reservoir parameters that control net CO2 injection/storage and oil/CH4 recovery rates. The well spacing (the distance between the injection and production wells) and the sequence of alternating CO2 and water injection are the major operational parameters for designing an effective five-spot CO2-EOR pattern. The response surface analysis shows that net CO2 injection rate increases with reservoir thickness, permeability, and porosity. The oil/CH4 production rates are positively correlated with reservoir permeability, porosity and thickness, but negatively correlated with the initial water saturation. The mean and confidence intervals are estimated for quantifying the uncertainty ranges of the risk metrics. The results from this study provide useful insights for understanding the CO2 storage potential and the corresponding risks of commercial-scale CO2-EOR fields.

  20. Accounting for negative automaintenance in pigeons: a dual learning systems approach and factored representations.

    PubMed

    Lesaint, Florian; Sigaud, Olivier; Khamassi, Mehdi

    2014-01-01

Animals, including humans, are prone to develop persistent maladaptive and suboptimal behaviours. Some of these behaviours have been suggested to arise from interactions between brain systems of Pavlovian conditioning, the acquisition of responses to initially neutral stimuli previously paired with rewards, and instrumental conditioning, the acquisition of active behaviours leading to rewards. However, the mechanics of these systems and their interactions are still unclear. While extensively studied independently, few models have been developed to account for these interactions. In some experiments, pigeons have been observed to display a maladaptive behaviour that has been suggested to involve conflicts between Pavlovian and instrumental conditioning. In a procedure referred to as negative automaintenance, a key light is paired with the subsequent delivery of food; however, any peck towards the key light results in the omission of the reward. Studies showed that in such a procedure some pigeons persisted in pecking to a substantial level despite its negative consequence, while others learned to refrain from pecking and maximized their cumulative rewards. Furthermore, the pigeons that were unable to refrain from pecking could nevertheless shift their pecks towards a harmless alternative key light. We confronted a computational model that combines dual-learning systems and factored representations, recently developed to account for sign-tracking and goal-tracking behaviours in rats, with these negative automaintenance experimental data. We show that it can explain the variability of the observed behaviours and the capacity of alternative key lights to distract pigeons from their detrimental behaviours. These results confirm the proposed model as an interesting tool for reproducing experiments that could involve interactions between Pavlovian and instrumental conditioning. 
The model allows us to draw predictions that may be experimentally verified, which could help further investigate

  1. Accounting for Negative Automaintenance in Pigeons: A Dual Learning Systems Approach and Factored Representations

    PubMed Central

    Lesaint, Florian; Sigaud, Olivier; Khamassi, Mehdi

    2014-01-01

Animals, including humans, are prone to develop persistent maladaptive and suboptimal behaviours. Some of these behaviours have been suggested to arise from interactions between brain systems of Pavlovian conditioning, the acquisition of responses to initially neutral stimuli previously paired with rewards, and instrumental conditioning, the acquisition of active behaviours leading to rewards. However, the mechanics of these systems and their interactions are still unclear. While extensively studied independently, few models have been developed to account for these interactions. In some experiments, pigeons have been observed to display a maladaptive behaviour that has been suggested to involve conflicts between Pavlovian and instrumental conditioning. In a procedure referred to as negative automaintenance, a key light is paired with the subsequent delivery of food; however, any peck towards the key light results in the omission of the reward. Studies showed that in such a procedure some pigeons persisted in pecking to a substantial level despite its negative consequence, while others learned to refrain from pecking and maximized their cumulative rewards. Furthermore, the pigeons that were unable to refrain from pecking could nevertheless shift their pecks towards a harmless alternative key light. We confronted a computational model that combines dual-learning systems and factored representations, recently developed to account for sign-tracking and goal-tracking behaviours in rats, with these negative automaintenance experimental data. We show that it can explain the variability of the observed behaviours and the capacity of alternative key lights to distract pigeons from their detrimental behaviours. These results confirm the proposed model as an interesting tool for reproducing experiments that could involve interactions between Pavlovian and instrumental conditioning. 
The model allows us to draw predictions that may be experimentally verified, which could help further investigate

  2. Accounting for Accountability.

    ERIC Educational Resources Information Center

    Colorado State Dept. of Education, Denver. Cooperative Accountability Project.

    This publication reports on two Regional Educational Accountability Conferences on Techniques sponsored by the Cooperative Accountability Project. Accountability is described as an "emotionally-charged issue" and an "operationally demanding concept." Overviewing accountability, major speakers emphasized that accountability is a means toward…

  3. Physics-based approach to haptic display

    NASA Technical Reports Server (NTRS)

    Brown, J. Michael; Colgate, J. Edward

    1994-01-01

This paper addresses the implementation of complex multiple degree of freedom virtual environments for haptic display. We suggest that a physics-based approach to rigid body simulation is appropriate for hand tool simulation, but that currently available simulation techniques are not sufficient to guarantee successful implementation. We discuss the desirable features of a virtual environment simulation, specifically highlighting the importance of stability guarantees.

  4. A Serial Risk Score Approach to Disease Classification that Accounts for Accuracy and Cost

    PubMed Central

    Huynh, Dat; Laeyendecker, Oliver; Brookmeyer, Ron

    2016-01-01

The performance of diagnostic tests for disease classification is often measured by accuracy (e.g. sensitivity or specificity); however, costs of the diagnostic test are a concern as well. Combinations of multiple diagnostic tests may improve accuracy, but incur additional costs. Here we consider serial testing approaches that maintain accuracy while controlling costs of the diagnostic tests. We present a serial risk score classification approach. The basic idea is to sequentially test with additional diagnostic tests just until persons are classified. In this way, it is not necessary to test all persons with all tests. The methods are studied in simulations and compared with logistic regression. We applied the methods to data from HIV cohort studies to identify HIV infected individuals who are recently infected (< 1 year) by testing with assays for multiple biomarkers. We find that the serial risk score classification approach can maintain accuracy while achieving a reduction in cost compared to testing all individuals with all assays. PMID:25156309
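The serial idea above can be sketched as a running risk score that stops ordering further assays as soon as a person leaves the indeterminate band. Everything below (the tuple layout, weights, thresholds, and labels) is an illustrative assumption, not the authors' implementation:

```python
def serial_classify(tests, lower, upper):
    """Apply diagnostic tests in sequence, updating a cumulative risk score.

    `tests` is an ordered list of (weight, result, cost) tuples, cheapest first.
    Classification stops as soon as the running score leaves the (lower, upper)
    indeterminate band, so later, costlier tests are only run for individuals
    who remain unclassified. Returns (label, total_cost); the label is None
    if the person is still unclassified after all tests.
    """
    score, spent = 0.0, 0.0
    for weight, result, cost in tests:
        score += weight * result
        spent += cost
        if score >= upper:
            return "recent", spent
        if score <= lower:
            return "non-recent", spent
    return None, spent
```

For example, a person whose first cheap assay already pushes the score past the upper threshold never incurs the cost of the remaining assays, which is the source of the cost savings reported in the abstract.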

  5. Evolutionary impact assessment: accounting for evolutionary consequences of fishing in an ecosystem approach to fisheries management

    PubMed Central

    Laugen, Ane T; Engelhard, Georg H; Whitlock, Rebecca; Arlinghaus, Robert; Dankel, Dorothy J; Dunlop, Erin S; Eikeset, Anne M; Enberg, Katja; Jørgensen, Christian; Matsumura, Shuichi; Nusslé, Sébastien; Urbach, Davnah; Baulier, Loїc; Boukal, David S; Ernande, Bruno; Johnston, Fiona D; Mollet, Fabian; Pardoe, Heidi; Therkildsen, Nina O; Uusi-Heikkilä, Silva; Vainikka, Anssi; Heino, Mikko; Rijnsdorp, Adriaan D; Dieckmann, Ulf

    2014-01-01

    Managing fisheries resources to maintain healthy ecosystems is one of the main goals of the ecosystem approach to fisheries (EAF). While a number of international treaties call for the implementation of EAF, there are still gaps in the underlying methodology. One aspect that has received substantial scientific attention recently is fisheries-induced evolution (FIE). Increasing evidence indicates that intensive fishing has the potential to exert strong directional selection on life-history traits, behaviour, physiology, and morphology of exploited fish. Of particular concern is that reversing evolutionary responses to fishing can be much more difficult than reversing demographic or phenotypically plastic responses. Furthermore, like climate change, multiple agents cause FIE, with effects accumulating over time. Consequently, FIE may alter the utility derived from fish stocks, which in turn can modify the monetary value living aquatic resources provide to society. Quantifying and predicting the evolutionary effects of fishing is therefore important for both ecological and economic reasons. An important reason this is not happening is the lack of an appropriate assessment framework. We therefore describe the evolutionary impact assessment (EvoIA) as a structured approach for assessing the evolutionary consequences of fishing and evaluating the predicted evolutionary outcomes of alternative management options. EvoIA can contribute to EAF by clarifying how evolution may alter stock properties and ecological relations, support the precautionary approach to fisheries management by addressing a previously overlooked source of uncertainty and risk, and thus contribute to sustainable fisheries. PMID:26430388

  6. Standards-Based Accountability as a Tool for Making a Difference in Student Learning. A State and an Institutional Perspective on Standards-Based Accountability.

    ERIC Educational Resources Information Center

    Wilkerson, Judy R.

    This paper examines Florida's standards-driven performance assessment, emphasizing teacher preparation, and touching on K-12 accountability. Florida's educational reform and accountability efforts are driven by the Florida System of School Improvement and Accountability document. The system is derived from state goals similar to the national Goals…

  7. A dynamic water accounting framework based on marginal resource opportunity cost

    NASA Astrophysics Data System (ADS)

    Tilmant, A.; Marques, G.; Mohamed, Y.

    2014-10-01

Many river basins throughout the world are increasingly under pressure as water demands keep rising due to population growth, industrialization, urbanization and rising living standards. In the past, the typical answer to meet those demands focused on the supply side and involved the construction of hydraulic infrastructures to capture more water from surface water bodies and from aquifers. As river basins became increasingly developed, downstream water users and ecosystems have become increasingly dependent on the management actions taken by upstream users. The increased interconnectedness between water users, aquatic ecosystems and the built environment is further compounded by climate change and its impact on the water cycle. Those pressures mean that it has become increasingly important to measure and account for changes in water fluxes and their corresponding economic value as they progress throughout the river system. Such basin water accounting should provide policy makers with important information regarding the relative contribution of each water user, infrastructure and management decision to the overall economic value of the river basin. This paper presents a dynamic water accounting approach whereby the entire river basin is considered as a value chain with multiple services including production and storage. Water users and reservoir operators are considered as economic agents who can exchange water with their hydraulic neighbours at a price corresponding to the marginal value of water. Effective water accounting is made possible by keeping track of all water fluxes and their corresponding hypothetical transactions using the results of a hydro-economic model. The proposed approach is illustrated with the Eastern Nile River basin in Africa.

  8. A dynamic water accounting framework based on marginal resource opportunity cost

    NASA Astrophysics Data System (ADS)

    Tilmant, A.; Marques, G.; Mohamed, Y.

    2015-03-01

    Many river basins throughout the world are increasingly under pressure as water demands keep rising due to population growth, industrialization, urbanization and rising living standards. In the past, the typical answer to meet those demands focused on the supply side and involved the construction of hydraulic infrastructures to capture more water from surface water bodies and from aquifers. As river basins have become more and more developed, downstream water users and ecosystems have become increasingly dependent on the management actions taken by upstream users. The increased interconnectedness between water users, aquatic ecosystems and the built environment is further compounded by climate change and its impact on the water cycle. Those pressures mean that it has become increasingly important to measure and account for changes in water fluxes and their corresponding economic value as they progress throughout the river system. Such basin water accounting should provide policy makers with important information regarding the relative contribution of each water user, infrastructure and management decision to the overall economic value of the river basin. This paper presents a dynamic water accounting approach whereby the entire river basin is considered as a value chain with multiple services including production and storage. Water users and reservoir operators are considered as economic agents who can exchange water with their hydraulic neighbors at a price corresponding to the marginal value of water. Effective water accounting is made possible by keeping track of all water fluxes and their corresponding hypothetical transactions using the results of a hydro-economic model. The proposed approach is illustrated with the Eastern Nile River basin in Africa.

  9. The Inequality Footprints of Nations: A Novel Approach to Quantitative Accounting of Income Inequality

    PubMed Central

    Alsamawi, Ali; Murray, Joy; Lenzen, Manfred; Moran, Daniel; Kanemoto, Keiichiro

    2014-01-01

In this study we use economic input-output analysis to calculate the inequality footprint of nations. An inequality footprint shows the link that each country's domestic economic activity has to income distribution elsewhere in the world. To this end we use employment and household income accounts for 187 countries and a historical time series dating back to 1990. Our results show that in 2010, most developed countries had an inequality footprint that was higher than their within-country inequality, meaning that in order to support domestic lifestyles, these countries source imports from more unequal economies. Exceptions include the United States and the United Kingdom, whose within-country inequality places them on a par with many developing countries. Russia has high within-country inequality; nevertheless, it has the lowest inequality footprint in the world, because of its trade connections with the Commonwealth of Independent States and Europe. Our findings show that inequality-intensive commodities, such as electronic components, chemicals, fertilizers, minerals, and agricultural products, often originate in developing countries characterized by high levels of inequality. Consumption of these commodities may implicate within-country inequality in both developing and developed countries. PMID:25353333
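The input-output machinery behind such footprints is the standard Leontief model: total output x = (I - A)^-1 y induced by final demand y is weighted by a per-unit intensity vector. A toy two-sector sketch with entirely made-up numbers (the intensity vector q is an assumed stand-in for the paper's employment and income accounts):

```python
def leontief_output(A, y):
    """Total output x solving x = A x + y for a two-sector economy,
    i.e. x = (I - A)^{-1} y, via direct 2x2 inversion."""
    a, b = 1.0 - A[0][0], -A[0][1]
    c, d = -A[1][0], 1.0 - A[1][1]
    det = a * d - b * c
    return [(d * y[0] - b * y[1]) / det,
            (a * y[1] - c * y[0]) / det]

A = [[0.1, 0.2], [0.3, 0.1]]  # inter-industry requirements per unit output (toy)
y = [100.0, 50.0]             # final demand of the consuming country (toy)
q = [0.02, 0.05]              # assumed "inequality intensity" per unit output

x = leontief_output(A, y)
footprint = sum(qi * xi for qi, xi in zip(q, x))  # inequality embodied in y
```

The footprint attributes upstream intensity to the final consumer, which is how domestic consumption becomes linked to inequality in exporting economies.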

  10. Advanced Approach of Multiagent Based Buoy Communication

    PubMed Central

    Gricius, Gediminas; Drungilas, Darius; Andziulis, Arunas; Dzemydiene, Dale; Voznak, Miroslav; Kurmis, Mindaugas; Jakovlev, Sergej

    2015-01-01

Usually, a hydrometeorological information system is faced with great data flows, but the data levels are often excessive, depending on the observed region of the water. The paper presents advanced buoy communication technologies based on multiagent interaction and data exchange between several monitoring system nodes. The proposed management of buoy communication is based on a clustering algorithm, which enables the performance of the hydrometeorological information system to be enhanced. The experiment is based on the design and analysis of the inexpensive but reliable Baltic Sea autonomous monitoring network (buoys), which would be able to continuously monitor and collect temperature, waviness, and other required data. The proposed approach of multiagent based buoy communication enables all the data from the coastal-based station to be monitored with limited transition speed by setting different tasks for the agent-based buoy system according to the clustering information. PMID:26345197

  11. Advanced Approach of Multiagent Based Buoy Communication.

    PubMed

    Gricius, Gediminas; Drungilas, Darius; Andziulis, Arunas; Dzemydiene, Dale; Voznak, Miroslav; Kurmis, Mindaugas; Jakovlev, Sergej

    2015-01-01

Usually, a hydrometeorological information system is faced with great data flows, but the data levels are often excessive, depending on the observed region of the water. The paper presents advanced buoy communication technologies based on multiagent interaction and data exchange between several monitoring system nodes. The proposed management of buoy communication is based on a clustering algorithm, which enables the performance of the hydrometeorological information system to be enhanced. The experiment is based on the design and analysis of the inexpensive but reliable Baltic Sea autonomous monitoring network (buoys), which would be able to continuously monitor and collect temperature, waviness, and other required data. The proposed approach of multiagent based buoy communication enables all the data from the coastal-based station to be monitored with limited transition speed by setting different tasks for the agent-based buoy system according to the clustering information. PMID:26345197

  12. A Bayesian, exemplar-based approach to hierarchical shape matching.

    PubMed

    Gavrila, Dariu M

    2007-08-01

    This paper presents a novel probabilistic approach to hierarchical, exemplar-based shape matching. No feature correspondence is needed among exemplars, just a suitable pairwise similarity measure. The approach uses a template tree to efficiently represent and match the variety of shape exemplars. The tree is generated offline by a bottom-up clustering approach using stochastic optimization. Online matching involves a simultaneous coarse-to-fine approach over the template tree and over the transformation parameters. The main contribution of this paper is a Bayesian model to estimate the a posteriori probability of the object class, after a certain match at a node of the tree. This model takes into account object scale and saliency and allows for a principled setting of the matching thresholds such that unpromising paths in the tree traversal process are eliminated early on. The proposed approach was tested in a variety of application domains. Here, results are presented on one of the more challenging domains: real-time pedestrian detection from a moving vehicle. A significant speed-up is obtained when comparing the proposed probabilistic matching approach with a manually tuned nonprobabilistic variant, both utilizing the same template tree structure. PMID:17568144
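The coarse-to-fine traversal with early elimination can be sketched as a tree search that prunes a branch as soon as its prototype matches poorly. The dictionary-based tree layout and the plain similarity threshold (standing in for the paper's Bayesian a-posteriori test) are illustrative assumptions, not the authors' implementation:

```python
def match(tree, observation, similarity, threshold):
    """Depth-first coarse-to-fine search over a template tree.

    Each node holds a prototype shape under key "proto"; children refine it.
    A branch is pruned as soon as its prototype's similarity to the
    observation falls below `threshold`, so unpromising paths are eliminated
    early, as in the paper. Returns (best_leaf_prototype, best_score).
    """
    best = (None, float("-inf"))
    stack = [tree]
    while stack:
        node = stack.pop()
        s = similarity(node["proto"], observation)
        if s < threshold:              # unpromising path: eliminate early
            continue
        children = node.get("children")
        if children:
            stack.extend(children)     # refine: descend to finer templates
        elif s > best[1]:
            best = (node["proto"], s)  # leaf: candidate final match
    return best
```

In the paper the pruning decision is principled (a posterior probability of the object class given the match, accounting for scale and saliency) rather than a fixed similarity cutoff; the control flow, however, is the same.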

  13. Accounting Technology Associate Degree. Louisiana Technical Education Program and Course Standards. Competency-Based Postsecondary Curriculum Outline from Bulletin 1822.

    ERIC Educational Resources Information Center

    Louisiana State Dept. of Education, Baton Rouge. Div. of Vocational Education.

    This document outlines the curriculum of Louisiana's accounting technology associate degree program, which is a 6-term (77-credit hour) competency-based program designed to prepare students for employment as accounting technicians providing technical administrative support to professional accountants and other financial management personnel.…

  14. Systems Engineering Interfaces: A Model Based Approach

    NASA Technical Reports Server (NTRS)

    Fosse, Elyse; Delp, Christopher

    2013-01-01

Currently: Ops Rev has developed and maintains a framework that includes interface-specific language, patterns, and Viewpoints. Ops Rev implements the framework to design MOS 2.0 and its 5 Mission Services. Implementation de-couples interfaces and instances of interaction. Future: A Mission MOSE implements the approach and uses the model-based artifacts for reviews. The framework extends further into the ground data layers and provides a unified methodology.

  15. Spatial pattern of nitrogen deposition flux over Czech forests: a novel approach accounting for unmeasured nitrogen species

    NASA Astrophysics Data System (ADS)

    Hůnová, Iva; Stoklasová, Petra; Kurfürst, Pavel; Vlček, Ondřej; Schovánková, Jana; Stráník, Vojtěch

    2015-04-01

This study estimates atmospheric nitrogen deposition flux over the Czech forests by collating all available data and model results. The aim of the presented study is to provide an improved, more reliable and more realistic estimate of the spatial pattern of nitrogen deposition flux over one country. Such estimates have so far been based on measurements of ambient N/NOx concentrations as a dry deposition proxy, and of N/NH4+ and N/NO3- as wet deposition proxies. To estimate the unmeasured species contributing to dry deposition, we used an Eulerian photochemical dispersion model, CAMx, the Comprehensive Air Quality Model with extensions (ESSS, 2011), coupled with the high-resolution regional numerical weather prediction model Aladin (Vlček, Corbet, 2011). The contribution of fog was estimated using a geostatistical data-driven model. The final maps accounting for unmeasured species clearly indicate that the approach used so far results in a substantial underestimation of nitrogen deposition flux. Substituting modeled values for unmeasured nitrogen species seems to be a plausible way to approximate total nitrogen deposition and to obtain a more realistic spatial pattern as input for further studies of likely nitrogen impacts on ecosystems. Acknowledgements: We would like to acknowledge the grants GA14-12262S - Effects of changing growth conditions on tree increment, stand production and vitality - danger or opportunity for the Central-European forestry?, and NAZV QI112A168 (ForSoil) of the Czech Ministry for Agriculture for support of this contribution. The input data used for the analysis were provided by the Czech Hydrometeorological Institute. References: Bobbink, R., Hicks, K., Galloway, J., Spranger, T., Alkemade, R. et al. (2010): Global Assessment of Nitrogen Deposition Effects on Terrestrial Plant Diversity: a Synthesis. Ecological Applications 20 (1), 30-59. Fowler D., O'Donoghue M., Muller J.B.A, et al. (2005): A chronology of nitrogen deposition in the UK between 1900 and 2000. 
Water, Air & Soil Pollution: Focus

  16. Generation of SEEAW asset accounts based on water resources management models

    NASA Astrophysics Data System (ADS)

    Pedro-Monzonís, María; Solera, Abel; Andreu, Joaquín

    2015-04-01

One of the main challenges of the 21st century is the sustainable use of water, because water is an essential element for the life of all who inhabit our planet. In many cases, the lack of economic valuation of water resources leads to inefficient water use. In this regard, society expects policymakers and stakeholders to maximise the profit produced per unit of natural resources. Water planning and Integrated Water Resources Management (IWRM) represent the best way to achieve this goal. The System of Environmental-Economic Accounting for Water (SEEAW) is presented as a tool for water allocation which enables the building of water balances in a river basin. The main concern of the SEEAW is to provide a standard approach which allows policymakers to compare results between different territories. But building water accounts is a complex task because the required data are difficult to collect. Because the components of the hydrological cycle are difficult to gauge, the use of simulation models has become an essential tool, extensively employed in recent decades. The target of this paper is to present the building of a database that enables the combined use of hydrological models and water resources models developed with AQUATOOL DSSS to fill in the SEEAW tables. This research is framed within the Water Accounting in a Multi-Catchment District (WAMCD) project, financed by the European Union. Its main goal is the development of water accounts in the Mediterranean Andalusian River Basin District, in Spain. This research aims to contribute to the objectives of the "Blueprint to safeguard Europe's water resources". It is noteworthy that, in Spain, a large part of these methodological decisions are included in the Spanish Guideline of Water Planning with normative status, guaranteeing consistency and comparability of the results.

  17. Basin-wide water accounting based on remote sensing data: an application for the Indus Basin

    NASA Astrophysics Data System (ADS)

    Karimi, P.; Bastiaanssen, W. G. M.; Molden, D.; Cheema, M. J. M.

    2013-07-01

The paper demonstrates the application of a new water accounting plus (WA+) framework to produce information on depletion of water resources, storage change, and land and water productivity in the Indus basin. It shows how satellite-derived estimates of land use, rainfall, evaporation (E), transpiration (T), interception (I) and biomass production can be used in addition to measured basin outflow, for water accounting with WA+. It is demonstrated how the accounting results can be interpreted to identify existing issues and examine solutions for the future. The results for one selected year (2007) showed that total annual water depletion in the basin (501 km3) plus outflows (21 km3) exceeded total precipitation (482 km3). The water storage systems that were affected are groundwater storage (30 km3), surface water storage (9 km3), and glaciers and snow storage (2 km3). Evapotranspiration of rainfall or "landscape ET" was 344 km3 (69% of total depletion). "Incremental ET" due to utilized flow was 157 km3 (31% of total depletion). Agriculture depleted 297 km3, or 59% of the total depletion, of which 85% (254 km3) was through irrigated agriculture and the remaining 15% (44 km3) through rainfed systems. Due to excessive soil evaporation in agricultural areas, half of all water depletion in the basin was non-beneficial. Based on the results of this accounting exercise, loss of storage, low beneficial depletion, and low land and water productivity were identified as the main water resources management issues. Future scenarios to address these issues were chosen and their impacts on the Indus Basin water accounts were tested using the new WA+ framework.
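The headline figures in this balance can be checked directly; the snippet below only re-derives the storage gap and the depletion shares from the numbers quoted above (all in km3 for 2007):

```python
# Headline 2007 figures for the Indus basin, as quoted in the abstract (km3/yr)
precipitation  = 482
depletion      = 501
outflow        = 21
landscape_et   = 344   # evapotranspiration of rainfall
incremental_et = 157   # ET due to utilized flow

# Depletion plus outflow exceeds precipitation; the gap is met from storage
storage_drawdown = depletion + outflow - precipitation   # 40 km3

# Landscape ET and incremental ET together account for all depletion
assert landscape_et + incremental_et == depletion
landscape_share = round(100 * landscape_et / depletion)  # 69 (% of depletion)
```

The quoted storage components (groundwater 30 + surface 9 + glaciers and snow 2 km3) close this 40 km3 gap to within rounding.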

  18. Reexamining the language account of cross-national differences in base-10 number representations.

    PubMed

    Vasilyeva, Marina; Laski, Elida V; Ermakova, Anna; Lai, Weng-Feng; Jeong, Yoonkyung; Hachigian, Amy

    2015-01-01

    East Asian students consistently outperform students from other nations in mathematics. One explanation for this advantage is a language account; East Asian languages, unlike most Western languages, provide cues about the base-10 structure of multi-digit numbers, facilitating the development of base-10 number representations. To test this view, the current study examined how kindergartners represented two-digit numbers using single unit-blocks and ten-blocks. The participants (N=272) were from four language groups (Korean, Mandarin, English, and Russian) that vary in the extent of "transparency" of the base-10 structure. In contrast to previous findings with older children, kindergartners showed no cross-language variability in the frequency of producing base-10 representations. Furthermore, they showed a pattern of within-language variability that was not consistent with the language account and was likely attributable to experiential factors. These findings suggest that language might not play as critical a role in the development of base-10 representations as suggested in earlier research. PMID:25240152
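The canonical base-10 block representation the kindergartners were scored against is simple to state: decompose a two-digit number into ten-blocks and leftover unit-blocks. A minimal sketch (the function name is illustrative):

```python
def block_representation(n):
    """Canonical base-10 representation of a two-digit number:
    (number of ten-blocks, number of single unit-blocks)."""
    tens, units = divmod(n, 10)
    return tens, units
```

For example, `block_representation(42)` gives four ten-blocks and two unit-blocks, whereas a child relying on a unitary representation would lay out 42 single unit-blocks instead.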

  19. Burned area, active fires and biomass burning - approaches to account for emissions from fires in Tanzania

    NASA Astrophysics Data System (ADS)

    Ruecker, Gernot; Hoffmann, Anja; Leimbach, David; Tiemann, Joachim; Ng'atigwa, Charles

    2013-04-01

Eleven years of data from the globally available MODIS burned area product and the MODIS Active Fire Product have been analysed for Tanzania in conjunction with GIS data on land use and cover to provide a baseline for fire activity in this East African country. The total fire radiated energy (FRE) emitted by fires that were picked up by the burned area and active fire products is estimated based on a spatio-temporal clustering algorithm over the burned areas, and integration of the fire radiative power from the MODIS Active Fires product over the time of burning and the area of each burned area cluster. The resulting biomass combusted per unit area, based on Wooster's scaling factor relating FRE to biomass combusted, is compared to values found in the literature and to values found in the Global Fire Emissions Database (GFED). Pyrogenic emissions are then estimated using emission factors. According to our analysis, an average of 11 million ha burns annually (ranging between 8.5 and 12.9 million ha) in Tanzania, corresponding to between 10 and 14% of Tanzania's land area. Most burned area is recorded in the months from May to October. The land cover types most affected are woodland and shrubland cover types: they comprise almost 70% of Tanzania's average annual burned area, or 6.8 million ha. Most burning occurs in gazetted land, with an annual average of 3.7 million ha in forest reserves, 3.3 million ha in game reserves and 1.46 million ha in national parks, totalling close to 8.5 million ha or 77% of the annual average burned area of Tanzania. Annual variability of burned area is moderate for most of the analysed classes, and in most cases there is no clear trend to be detected in burned area, except for the Lindi region, where annual burned area appears to be increasing. Preliminary results regarding emissions from fires show that for larger fires that burn over a longer time, biomass burned derived through the FRP method compares well to literature values, while the integration over
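The emissions chain described above (FRE to biomass combusted to pyrogenic emissions) can be sketched in a few lines. The biomass conversion factor follows Wooster et al. (0.368 kg of dry biomass combusted per MJ of fire radiative energy); the CO2 emission factor is an assumed, illustrative value in the style of published savanna-burning factors, not one taken from this study:

```python
KG_PER_MJ = 0.368  # dry biomass combusted per MJ of FRE (Wooster et al.)
EF_CO2 = 1.61      # kg CO2 per kg dry biomass burned (assumed, illustrative)

def emissions_from_fre(fre_mj):
    """CO2 emitted (kg) for a fire cluster with total radiated energy fre_mj (MJ)."""
    biomass_kg = KG_PER_MJ * fre_mj  # step 1: FRE -> biomass combusted
    return EF_CO2 * biomass_kg       # step 2: biomass -> pyrogenic CO2
```

In the study, `fre_mj` for each cluster comes from integrating MODIS fire radiative power over the burn duration and burned area; per-unit-area biomass then follows by dividing by the cluster's burned area.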

  20. The role of society in engineering risk analysis: a capabilities-based approach.

    PubMed

    Murphy, Colleen; Gardoni, Paolo

    2006-08-01

    This article proposes a new conceptual framework in engineering risk analysis to account for the net impact of hazards on individuals in a society. It analyzes four limitations of prevailing approaches to risk analysis and suggests a way to overcome them. These limitations are a result of how societal impacts are characteristically accounted for and valued. Prevailing approaches typically focus too narrowly on the consequences of natural or man-made hazards, not accounting for the broader societal impacts of such hazards. Such approaches lack a uniform and consistent metric for accounting for the impact of nonquantifiable consequences (like psychological trauma or societal impacts) and rely upon implicit and potentially inaccurate value judgments when evaluating risks. To overcome these limitations, we propose an alternative, Capabilities-Based Approach to the treatment of society in risk analysis. A similar approach is currently used by the United Nations to quantitatively measure the degree of development in countries around the world. In a Capabilities-Based Approach, the potential benefits and losses due to a hazard are measured and compared in a uniform way by using individual capabilities (functionings individuals are able, still able, or unable to achieve) as a metric. This Capabilities-Based Approach provides a foundation for identifying and quantifying the broader, complex societal consequences of hazards and is based on explicit value judgments. The Capabilities-Based Approach can accommodate different methods or techniques for risk determination and for risk evaluation, and can be used to assess risks from diverse types of hazards (natural or man-made) whose magnitudes range from minor to catastrophic. In addition, implementing a Capabilities-Based Approach contributes to the development of a single standard for public policy decision making, since a Capabilities-Based Approach is already in use in development economics and policy.

  1. Salience and Attention in Surprisal-Based Accounts of Language Processing

    PubMed Central

    Zarcone, Alessandra; van Schijndel, Marten; Vogels, Jorrig; Demberg, Vera

    2016-01-01

    The notion of salience has been singled out as the explanatory factor for a diverse range of linguistic phenomena. In particular, perceptual salience (e.g., visual salience of objects in the world, acoustic prominence of linguistic sounds) and semantic-pragmatic salience (e.g., prominence of recently mentioned or topical referents) have been shown to influence language comprehension and production. A different line of research has sought to account for behavioral correlates of cognitive load during comprehension as well as for certain patterns in language usage using information-theoretic notions, such as surprisal. Surprisal and salience both affect language processing at different levels, but the relationship between the two has not been adequately elucidated, and the question of whether salience can be reduced to surprisal / predictability is still open. Our review identifies two main challenges in addressing this question: terminological inconsistency and lack of integration between high and low levels of representations in salience-based accounts and surprisal-based accounts. We capitalize upon work in visual cognition in order to orient ourselves in surveying the different facets of the notion of salience in linguistics and their relation with models of surprisal. We find that work on salience highlights aspects of linguistic communication that models of surprisal tend to overlook, namely the role of attention and relevance to current goals, and we argue that the Predictive Coding framework provides a unified view which can account for the role played by attention and predictability at different levels of processing and which can clarify the interplay between low and high levels of processes and between predictability-driven expectation and attention-driven focus. PMID:27375525

  3. Probabilistic approaches to accounting for data variability in the practical application of bioavailability in predicting aquatic risks from metals.

    PubMed

    Ciffroy, Philippe; Charlatchka, Rayna; Ferreira, Daniel; Marang, Laura

    2013-07-01

    The biotic ligand model (BLM) theoretically enables the derivation of environmental quality standards that are based on true bioavailable fractions of metals. Several physicochemical variables (especially pH, major cations, dissolved organic carbon, and dissolved metal concentrations) must, however, be assigned to run the BLM, but they are highly variable in time and space in natural systems. This article describes probabilistic approaches for integrating such variability during the derivation of risk indexes. To describe each variable using a probability density function (PDF), several methods were combined to 1) treat censored data (i.e., data below the limit of detection), 2) incorporate the uncertainty of the solid-to-liquid partitioning of metals, and 3) detect outliers. From a probabilistic perspective, 2 alternative approaches that are based on log-normal and Γ distributions were tested to estimate the probability of the predicted environmental concentration (PEC) exceeding the predicted non-effect concentration (PNEC), i.e., p(PEC/PNEC>1). The probabilistic approach was tested on 4 real-case studies based on Cu-related data collected from stations on the Loire and Moselle rivers. The approach described in this article is based on BLM tools that are freely available for end-users (i.e., the Bio-Met software) and on accessible statistical data treatments. This approach could be used by stakeholders who are involved in risk assessments of metals for improving site-specific studies. PMID:23505250
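    The exceedance probability at the core of this approach can be sketched with a small Monte Carlo simulation: draw PEC and PNEC from log-normal distributions fitted to site data and estimate p(PEC/PNEC > 1) as the fraction of draws where the ratio exceeds one. The distribution parameters below are illustrative, not values from the Loire or Moselle case studies.

```python
import random

def p_exceedance(mu_pec, sigma_pec, mu_pnec, sigma_pnec, n=100_000, seed=42):
    """Monte Carlo estimate of p(PEC/PNEC > 1) under log-normal PEC and PNEC."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        pec = rng.lognormvariate(mu_pec, sigma_pec)
        pnec = rng.lognormvariate(mu_pnec, sigma_pnec)
        if pec / pnec > 1.0:
            hits += 1
    return hits / n

# Illustrative scenario: exposure typically below the no-effect level,
# but with enough spread that exceedance has non-zero probability.
p = p_exceedance(mu_pec=0.0, sigma_pec=0.5, mu_pnec=1.0, sigma_pnec=0.3)
```

    The same structure accommodates the article's refinements — censored-data treatment, partitioning uncertainty, outlier screening — by changing how the input distributions are fitted, not how the probability is computed.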

  4. Heutagogy: An alternative practice based learning approach.

    PubMed

    Bhoyrub, John; Hurley, John; Neilson, Gavin R; Ramsay, Mike; Smith, Margaret

    2010-11-01

    Education has explored and utilised multiple approaches in attempts to enhance the learning and teaching opportunities available to adult learners. Traditional pedagogy has been both directly and indirectly affected by andragogy and transformational learning, consequently widening our understanding of, and approaches toward, teaching and learning. Within the context of nurse education, a major challenge has been to effectively apply these educational approaches to the complex, unpredictable and challenging environment of practice based learning. While not offered as a panacea to such challenges, heutagogy is offered in this discussion paper as an emerging and potentially highly congruent educational framework to place around practice based learning. Being an emergent theory, its conceptual underpinnings and possible applications to nurse education need to be explored and theoretically applied. Through placing the adult learner at the foreground of grasping learning opportunities as they unpredictably emerge from a sometimes chaotic environment, heutagogy can be argued to offer the potential to minimise many of the well publicised difficulties of coordinating practice based learning with faculty teaching and learning. PMID:20554249

  5. Risk-Based Probabilistic Approach to Aeropropulsion System Assessment

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2002-01-01

    In an era of shrinking development budgets and resources, where there is also an emphasis on reducing the product development cycle, the role of system assessment, performed in the early stages of an engine development program, becomes very critical to the successful development of new aeropropulsion systems. A reliable system assessment not only helps to identify the best propulsion system concept among several candidates, it can also identify which technologies are worth pursuing. This is particularly important for advanced aeropropulsion technology development programs, which require an enormous amount of resources. In the current practice of deterministic, or point-design, approaches, the uncertainties of design variables are either unaccounted for or accounted for by safety factors. This could often result in an assessment with unknown and unquantifiable reliability. Consequently, it would fail to provide additional insight into the risks associated with the new technologies, which are often needed by decision makers to determine the feasibility and return-on-investment of a new aircraft engine. In this work, an alternative approach based on the probabilistic method was described for a comprehensive assessment of an aeropropulsion system. The statistical approach quantifies the design uncertainties inherent in a new aeropropulsion system and their influences on engine performance. Because of this, it enhances the reliability of a system assessment. A technical assessment of a wave-rotor-enhanced gas turbine engine was performed to demonstrate the methodology. The assessment used probability distributions to account for the uncertainties that occur in component efficiencies and flows and in mechanical design variables. 
The approach taken in this effort was to integrate the thermodynamic cycle analysis embedded in the computer code NEPP (NASA Engine Performance Program) and the engine weight analysis embedded in the computer code WATE (Weight Analysis of Turbine Engines).
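    The probabilistic idea can be illustrated in miniature: instead of assigning point values to component efficiencies, draw them from distributions and propagate each draw through the performance model, yielding a distribution over the figure of merit rather than a single number. The `performance` function below is a deliberately toy stand-in — the actual assessment used the NEPP and WATE codes — and the efficiency spreads are assumptions for illustration.

```python
import random
import statistics

def performance(eta_comp, eta_turb):
    """Toy figure of merit: product of component efficiencies (NOT a real cycle model)."""
    return eta_comp * eta_turb

def assess(n=50_000, seed=1):
    """Propagate assumed efficiency uncertainty; return mean and spread of the metric."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        eta_c = rng.gauss(0.85, 0.01)  # compressor efficiency, assumed spread
        eta_t = rng.gauss(0.90, 0.01)  # turbine efficiency, assumed spread
        samples.append(performance(eta_c, eta_t))
    return statistics.mean(samples), statistics.stdev(samples)

mean_perf, sd_perf = assess()
```

    The spread `sd_perf` is exactly the information a deterministic, point-design assessment discards, and it is what lets decision makers attach a risk to a new technology's predicted benefit.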

  6. Deciding Who Decides Questions at the Intersection of School Finance Reform Litigation and Standards-Based Accountability Policies

    ERIC Educational Resources Information Center

    Superfine, Benjamin Michael

    2009-01-01

    Courts hearing school finance reform cases have recently begun to consider several issues related to standards-based accountability policies. This convergence of school finance reform litigation and standards-based accountability policies represents a chance for the courts to reallocate decision-making authority for each type of reform across the…

  7. Administrators' Perceptions of Outcome-Based Education: Outputs, Outcomes and Professional Accountability.

    ERIC Educational Resources Information Center

    Furman, Gail Chase

    This case study explores the impact of outcome-based education (OBE) in one school district 5 years after its adoption. The study is guided by constructivist theory and a perspective that policy studies can have an important problem-finding function. The basic assumption of the OBE approach, that educational improvement depends on a shift in focus…

  8. Place-Based Pedagogy in the Era of Accountability: An Action Research Study

    ERIC Educational Resources Information Center

    Saracino, Peter C.

    2010-01-01

    Today's most common method of teaching biology--driven by calls for standardization and high-stakes testing--relies on a standards-based, de-contextualized approach to education. This results in "one size fits all" curriculums that ignore local contexts relevant to students' lives, discourage student engagement and ultimately work against a deep…

  9. Effectiveness and Accountability of the Inquiry-Based Methodology in Middle School Science

    ERIC Educational Resources Information Center

    Hardin, Cade

    2009-01-01

    When teaching science, the time allowed for students to make discoveries on their own through the inquiry method directly conflicts with the mandated targets of a broad spectrum of curricula. Research shows that using an inquiry-based approach can encourage student motivation and increase academic achievement (Wolf & Fraser, 2008, Bryant, 2006,…

  10. The impact of activity based cost accounting on health care capital investment decisions.

    PubMed

    Greene, J K; Metwalli, A

    2001-01-01

    For the future survival of the rural hospitals in the U.S., there is a need to make sound financial decisions. The Activity Based Cost Accounting (ABC) provides more accurate and detailed cost information to make an informed capital investment decision taking into consideration all the costs and revenue reimbursement from third party payors. The paper analyzes, evaluates and compares two scenarios of acquiring capital equipment and attempts to show the importance of utilizing the ABC method in making a sound financial decision as compared to the traditional cost method. PMID:11794757

  11. Computer-based accountability system (Phase I) for special nuclear materials at Argonne-West

    SciTech Connect

    Ingermanson, R.S.; Proctor, A.E.

    1982-05-01

    An automated accountability system for special nuclear materials (SNM) is under development at Argonne National Laboratory-West. Phase I of the development effort has established the following basic features of the system: a unique file organization allows rapid updating or retrieval of the status of various SNM, based on batch numbers, storage location, serial number, or other attributes. Access to the program is controlled by an interactive user interface that can be easily understood by operators who have had no prior background in electronic data processing. Extensive use of structured programming techniques make the software package easy to understand and to modify for specific applications. All routines are written in FORTRAN.

  12. An Analytical Approach to Model Heterogeneous Recrystallization Kinetics Taking into Account the Natural Spatial Inhomogeneity of Deformation

    NASA Astrophysics Data System (ADS)

    Luo, Haiwen; van der Zwaag, Sybrand

    2016-01-01

    The classical Johnson-Mehl-Avrami-Kolmogorov equation was modified to take into account the normal local strain distribution in deformed samples. This new approach is not only able to describe the influence of the local heterogeneity of recrystallization but also to produce an average apparent Avrami exponent to characterize the entire recrystallization process. In particular, it predicts that the apparent Avrami exponent should be within a narrow range of 1 to 2 and converges to 1 when the local strain varies greatly. Moreover, the apparent Avrami exponent is predicted to be insensitive to temperature and deformation conditions. These predictions are in excellent agreement with the experimental observations on static recrystallization after hot deformation in different steels and other metallic alloys.
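    The mechanism can be sketched numerically. Below, each material point follows classical JMAK kinetics X = 1 − exp(−k(ε)·tⁿ) with a local exponent of 2 and a rate constant that grows with the local strain ε; averaging over a normal strain distribution broadens the overall curve, so the apparent Avrami exponent fitted to the averaged data drops below the local one, consistent with the 1–2 range the paper predicts. All parameter values (k₀, the strain sensitivity c, the strain spread σ) are illustrative assumptions, not taken from the paper.

```python
import math

def averaged_fraction(t, n_local=2.0, k0=1.0, c=2.0, sigma=0.8, m=41):
    """JMAK fraction averaged over a normal strain distribution (truncated at +/-3 sigma)."""
    total, wsum = 0.0, 0.0
    for i in range(m):
        eps = -3.0 * sigma + 6.0 * sigma * i / (m - 1)
        w = math.exp(-eps * eps / (2.0 * sigma * sigma))
        x_local = 1.0 - math.exp(-k0 * math.exp(c * eps) * t ** n_local)
        total += w * x_local
        wsum += w
    return total / wsum

def apparent_avrami_exponent():
    """Least-squares slope of ln(-ln(1-X)) vs ln(t): the apparent Avrami exponent."""
    pts = []
    for t in (0.2, 0.3, 0.5, 0.8, 1.2, 2.0):
        x = averaged_fraction(t)
        pts.append((math.log(t), math.log(-math.log(1.0 - x))))
    n = len(pts)
    mx = sum(u for u, _ in pts) / n
    my = sum(v for _, v in pts) / n
    num = sum((u - mx) * (v - my) for u, v in pts)
    den = sum((u - mx) ** 2 for u, _ in pts)
    return num / den

n_app = apparent_avrami_exponent()  # lies below the local exponent of 2
```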

  13. Water accounting for stressed river basins based on water resources management models.

    PubMed

    Pedro-Monzonís, María; Solera, Abel; Ferrer, Javier; Andreu, Joaquín; Estrela, Teodoro

    2016-09-15

    Water planning and Integrated Water Resources Management (IWRM) represent the best way to help decision makers identify and choose the most adequate alternatives among the possible ones. The System of Environmental-Economic Accounting for Water (SEEA-W) is presented as a tool for building water balances in a river basin, providing a standard approach to achieve comparability of results between different territories. The aim of this paper is to present the development of a tool that enables the combined use of hydrological models and water resources models to fill in the SEEA-W tables. At every step of the modelling chain, we are able to build the asset accounts and the physical water supply and use tables according to the SEEA-W approach, along with an estimation of the water service costs. The case study is the Jucar River Basin District (RBD), located in the eastern part of the Iberian Peninsula in Spain, which, like many other Mediterranean basins, is currently water-stressed. To guide this work we have used the PATRICAL model in combination with the AQUATOOL Decision Support System (DSS). The results indicate that for the average year the total use of water in the district amounts to 15,143 hm³/year, with Total Renewable Water Resources of 3909 hm³/year. On the other hand, the water service costs in the Jucar RBD amount to 1634 million € per year at constant 2012 prices. It is noteworthy that 9% of these costs correspond to non-conventional resources, such as desalinated water, reused water and water transferred from other regions. PMID:27161139

  14. Lunar base CELSS: A bioregenerative approach

    NASA Technical Reports Server (NTRS)

    Easterwood, G. W.; Street, J. J.; Sartain, J. B.; Hubbell, D. H.; Robitaille, H. A.

    1992-01-01

    During the twenty-first century, human habitation of a self-sustaining lunar base could become a reality. To achieve this goal, the occupants will have to have food, water, and an adequate atmosphere within a carefully designed environment. Advanced technology will be employed to support terrestrial life-sustaining processes on the Moon. One approach to a life support system based on food production, waste management and utilization, and product synthesis is outlined. Inputs include an atmosphere, water, plants, biodegradable substrates, and manufactured materials such as fiberglass containment vessels from lunar resources. Outputs include purification of air and water, food, and hydrogen (H2) generated from methane (CH4). Important criteria are as follows: (1) minimize resupply from Earth; and (2) recycle as efficiently as possible.

  15. An Ontology Based Approach to Information Security

    NASA Astrophysics Data System (ADS)

    Pereira, Teresa; Santos, Henrique

    The semantic structuring of knowledge based on ontology approaches has been increasingly adopted by experts from diverse domains. Recently, ontologies have moved from the philosophical and metaphysical disciplines to being used in the construction of models that describe a specific theory of a domain. The development and use of ontologies promote the creation of a unique standard to represent concepts within a specific knowledge domain. In the scope of information security systems, the use of an ontology to formalize and represent the concepts of security information challenges the mechanisms and techniques currently used. This paper presents a conceptual implementation model of an ontology defined in the security domain. The model contains the semantic concepts based on the information security standard ISO/IEC_JTC1, and their relationships to other concepts, defined in a subset of the information security domain.

  16. LP based approach to optimal stable matchings

    SciTech Connect

    Teo, Chung-Piaw; Sethuraman, J.

    1997-06-01

    We study the classical stable marriage and stable roommates problems using a polyhedral approach. We propose a new LP formulation for the stable roommates problem. This formulation is non-empty if and only if the underlying roommates problem has a stable matching. Furthermore, for certain special weight functions on the edges, we construct a 2-approximation algorithm for the optimal stable roommates problem. Our technique uses a crucial geometry of the fractional solutions in this formulation. For the stable marriage problem, we show that a related geometry allows us to express any fractional solution in the stable marriage polytope as convex combination of stable marriage solutions. This leads to a genuinely simple proof of the integrality of the stable marriage polytope. Based on these ideas, we devise a heuristic to solve the optimal stable roommates problem. The heuristic combines the power of rounding and cutting-plane methods. We present some computational results based on preliminary implementations of this heuristic.
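    For background, the stable matchings that this LP formulation characterizes polyhedrally are the same objects produced combinatorially by the classical Gale-Shapley deferred-acceptance algorithm — sketched below for the stable marriage case (the LP and rounding machinery of the paper is not reproduced here, and the preference lists are illustrative).

```python
def gale_shapley(men_prefs, women_prefs):
    """Deferred acceptance: return a stable matching as a dict woman -> man."""
    free = list(men_prefs)                  # men not yet matched
    next_choice = {m: 0 for m in men_prefs}
    rank = {w: {m: i for i, m in enumerate(prefs)}
            for w, prefs in women_prefs.items()}
    engaged = {}                            # woman -> man
    while free:
        m = free.pop()
        w = men_prefs[m][next_choice[m]]    # m proposes to his next choice
        next_choice[m] += 1
        if w not in engaged:
            engaged[w] = m
        elif rank[w][m] < rank[w][engaged[w]]:
            free.append(engaged[w])         # w trades up; old partner freed
            engaged[w] = m
        else:
            free.append(m)                  # w rejects m
    return engaged

men = {"a": ["x", "y"], "b": ["y", "x"]}
women = {"x": ["a", "b"], "y": ["b", "a"]}
match = gale_shapley(men, women)            # x matched to a, y matched to b
```

    No blocking pair exists in the output: no man and woman prefer each other to their assigned partners, which is exactly the constraint the stable marriage polytope encodes linearly.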

  17. Structuring a Competency-Based Accounting Communication Course at the Graduate Level

    ERIC Educational Resources Information Center

    Sharifi, Mohsen; McCombs, Gary B.; Fraser, Linda Lussy; McCabe, Robert K.

    2009-01-01

    The authors describe a graduate capstone accounting class as a basis for building communication skills desired by both accounting practitioners and accounting faculty. An academic service-learning (ASL) component is included. Adopted as a required class for a master of science degree in accounting at two universities, this course supports…

  18. Synthetic aperture elastography: a GPU based approach

    NASA Astrophysics Data System (ADS)

    Verma, Prashant; Doyley, Marvin M.

    2014-03-01

    Synthetic aperture (SA) ultrasound imaging produces highly accurate axial and lateral displacement estimates; however, low frame rates and large data volumes can hamper its clinical use. This paper describes a real-time SA-imaging-based ultrasound elastography system that we have recently developed to overcome this limitation. In this system, we implemented both beamforming and 2D cross-correlation echo tracking on an Nvidia GTX 480 graphics processing unit (GPU). We used one thread per pixel for beamforming, whereas one block per pixel was used for echo tracking. We compared the quality of elastograms computed with our real-time system relative to those computed using our standard single-threaded elastographic imaging methodology. In all studies, we used conventional measures of image quality such as the elastographic signal-to-noise ratio (SNRe). Specifically, the SNRe of axial and lateral strain elastograms computed with the real-time system were 36 dB and 23 dB, respectively, numerically equal to those computed with our standard approach. We achieved a frame rate of 6 frames per second using our GPU-based approach for 16 transmits and a kernel size of 60 × 60 pixels, which is 400 times faster than that achieved using our standard protocol.
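    The 2D cross-correlation echo-tracking stage can be sketched on the CPU: for a small kernel in the pre-deformation frame, exhaustively search a window in the post-deformation frame for the integer shift maximizing normalized cross-correlation (this serial loop is what the paper assigns to one GPU block per pixel; the synthetic frames and sizes below are illustrative).

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom else 0.0

def track(pre, post, y, x, k=4, search=3):
    """Best integer displacement (dy, dx) for the k x k kernel at (y, x)."""
    kernel = pre[y:y + k, x:x + k]
    best, best_d = -2.0, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = post[y + dy:y + dy + k, x + dx:x + dx + k]
            score = ncc(kernel, cand)
            if score > best:
                best, best_d = score, (dy, dx)
    return best_d

# Synthetic frames: post is pre shifted down 2 px and right 1 px.
yy, xx = np.mgrid[0:32, 0:32]
pre = np.sin(0.7 * yy) + np.cos(1.3 * xx)
post = np.roll(np.roll(pre, 2, axis=0), 1, axis=1)
dy, dx = track(pre, post, y=12, x=12)
```

    On the GPU, the two `dy`/`dx` loops map naturally onto threads within a block, which is why one block per pixel was the chosen decomposition.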

  19. Accountability in Dispositions for Juvenile Drug Offenders. Monograph.

    ERIC Educational Resources Information Center

    Pacific Inst. for Research and Evaluation, Walnut Creek, CA.

    Guidelines for the general development and implementation of accountability-based approaches for juvenile drug offenders are presented in this monograph. These topics are discussed: (1) the accountability approach; (2) the relevance of the accountability approach to drug offenders and its relationship to drug abuse treatment; (3) surveys of chief…

  20. Is comprehension necessary for error detection? A conflict-based account of monitoring in speech production

    PubMed Central

    Nozari, Nazbanou; Dell, Gary S.; Schwartz, Myrna F.

    2011-01-01

    Despite the existence of speech errors, verbal communication is successful because speakers can detect (and correct) their errors. The standard theory of speech-error detection, the perceptual-loop account, posits that the comprehension system monitors production output for errors. Such a comprehension-based monitor, however, cannot explain the double dissociation between comprehension and error-detection ability observed in aphasic patients. We propose a new theory of speech-error detection which is instead based on the production process itself. The theory borrows from studies of forced-choice-response tasks the notion that error detection is accomplished by monitoring response conflict via a frontal brain structure, such as the anterior cingulate cortex. We adapt this idea to the two-step model of word production, and test the model-derived predictions on a sample of aphasic patients. Our results show a strong correlation between patients' error-detection ability and the model's characterization of their production skills, and no significant correlation between error detection and comprehension measures, thus supporting a production-based monitor generally, and the implemented conflict-based monitor in particular. The successful application of the conflict-based theory to error detection in linguistic as well as non-linguistic domains points to a domain-general monitoring system. PMID:21652015

  1. Measuring students' school context exposures: A trajectory-based approach.

    PubMed

    Halpern-Manners, Andrew

    2016-07-01

    Studies of school effects on children's outcomes usually use single time-point measures. I argue that this approach fails to account for (1) age-based variation in children's sensitivity to their surroundings; (2) differential effects stemming from differences in the length of young people's exposures; and (3) moves between contexts and endogenous changes over time within them. To evaluate the merits of this argument, I specify and test a longitudinal model of school effects on children's academic performance. Drawing on recent advances in finite mixture modeling, I identify a series of distinct school context trajectories that extend across a substantial portion of respondents' elementary and secondary school years. I find that these trajectories vary significantly with respect to shape, with some students experiencing significant changes in their environments over time. I then show that students' trajectories of exposure are related to their 8th grade achievement, even after controlling for point-in-time measures of school context. PMID:27194656

  2. Peptide Based Radiopharmaceuticals: Specific Construct Approach

    SciTech Connect

    Som, P; Rhodes, B A; Sharma, S S

    1997-10-21

    The objective of this project was to develop receptor based peptides for diagnostic imaging and therapy. A series of peptides related to cell adhesion molecules (CAM) and immune regulation were designed for radiolabeling with 99mTc and evaluated in animal models as potential diagnostic imaging agents for various disease conditions such as thrombus (clot), acute kidney failure, and infection/inflammation imaging. The peptides for this project were designed by the industrial partner, Palatin Technologies (formerly Rhomed, Inc.), using various peptide design approaches, including a newly developed rational computer assisted drug design (CADD) approach termed MIDAS (Metal ion Induced Distinctive Array of Structures). In this approach, the biological function domain and the 99mTc complexing domain are fused together so that structurally these domains are indistinguishable. This approach allows construction of conformationally rigid metallo-peptide molecules (similar to cyclic peptides) that are metabolically stable in-vivo. All the newly designed peptides were screened in various in vitro receptor binding and functional assays to identify a lead compound. The lead compounds were formulated in a one-step 99mTc labeling kit form and were studied by BNL for detailed in-vivo imaging using various animal models of human disease. Two main peptides using the MIDAS approach evolved and were investigated: an RGD peptide for acute renal failure and an immunomodulatory peptide derived from tuftsin (RMT-1) for infection/inflammation imaging. Various RGD based metallopeptides were designed, synthesized and assayed for their efficacy in inhibiting ADP-induced human platelet aggregation. Most of these peptides displayed biological activity in the 1-100 µM range. Based on previous work by others, RGD-I and RGD-II were evaluated in animal models of acute renal failure. These earlier studies showed that after acute ischemic injury the renal cortex displays

  3. Place-based pedagogy in the era of accountability: An action research study

    NASA Astrophysics Data System (ADS)

    Saracino, Peter C.

    Today's most common method of teaching biology---driven by calls for standardization and high-stakes testing---relies on a standards-based, de-contextualized approach to education. This results in "one size fits all" curriculums that ignore local contexts relevant to students' lives, discourage student engagement and ultimately work against a deep and lasting understanding of content. In contrast, place-based education---a pedagogical paradigm grounded in situated cognition and the progressive education tradition of John Dewey---utilizes the community as an integrating context for learning. It encourages the growth of school-community partnerships with an eye towards raising student achievement while also drawing students into the economic, political, social and ecological life of their communities. Such an approach seeks to provide students with learning experiences that are both academically significant and valuable to their communities. This study explores how high school science teachers can capitalize on the rich affordances offered by a place-based approach despite the constraints imposed by a state-mandated curriculum and high-stakes testing. Using action research, I designed, implemented, evaluated and refined an intervention that grounded a portion of a Living Environment high school course I teach in a place-based experience. This experience served as a unique anchoring event to contextualize students' learning of other required core topics. The overarching question framing this study is: How can science teachers capitalize on the rich affordances offered by a place-based approach despite the constraints imposed by a state-mandated curriculum and high-stakes testing? The following more specific questions were explored within the context of the intervention: (1) Which elements of the place-based paradigm could I effectively integrate into a Living Environment course? (2) In what ways would this integration impact students' interest? 
(3) In what ways would

  4. ECG biometric identification: A compression based approach.

    PubMed

    Bras, Susana; Pinho, Armando J

    2015-08-01

    Using the electrocardiogram signal (ECG) to identify and/or authenticate persons are problems still lacking satisfactory solutions. Yet, ECG possesses characteristics that are unique or difficult to get from other signals used in biometrics: (1) it requires contact and liveliness for acquisition; (2) it changes under stress, rendering it potentially useless if acquired under threat. Our main objective is to present an innovative and robust solution to the above-mentioned problem. To successfully conduct this goal, we rely on information-theoretic data models for data compression and on similarity metrics related to the approximation of the Kolmogorov complexity. The proposed measure allows the comparison of two (or more) ECG segments, without having to follow traditional approaches that require heartbeat segmentation (described as highly influenced by external or internal interferences). As a first approach, the method was able to cluster the data in three groups: identical record, same participant, different participant, by the stratification of the proposed measure, with values near 0 for the same participant and closer to 1 for different participants. A leave-one-out strategy was implemented in order to identify the participant in the database based on his/her ECG. A 1NN classifier was implemented, using as distance measure the method proposed in this work. The classifier was able to correctly identify almost all participants, with an accuracy of 99% in the database used. PMID:26737619
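    The compression-based similarity described above can be sketched with the normalized compression distance (NCD), a practical approximation to the uncomputable Kolmogorov-complexity metric, using zlib as the compressor. Real ECG segments would enter as byte strings; the signals below are synthetic stand-ins, and the exact measure used in the paper may differ in its details.

```python
import zlib

def c(x: bytes) -> int:
    """Compressed length as a stand-in for Kolmogorov complexity."""
    return len(zlib.compress(x, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: near 0 for similar inputs, near 1 for unrelated ones."""
    cx, cy = c(x), c(y)
    return (c(x + y) - min(cx, cy)) / max(cx, cy)

# Two segments from the same "source" (identical repeating pattern)
# vs a structurally different signal.
seg_a = bytes(80 + 10 * (i % 7) for i in range(600))
seg_b = bytes(80 + 10 * (i % 7) for i in range(600))   # same source
seg_c = bytes((i * 37 + i * i) % 256 for i in range(600))  # different source
d_same = ncd(seg_a, seg_b)   # small: concatenation compresses almost for free
d_diff = ncd(seg_a, seg_c)   # larger: neither signal helps compress the other
```

    A 1NN classifier over such distances — assign each test segment to the participant of its nearest database segment — is exactly the identification scheme evaluated in the paper, and it needs no heartbeat segmentation.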

  5. Surrogate Motherhood: A Trust-Based Approach.

    PubMed

    Beier, Katharina

    2015-12-01

    Because it is often argued that surrogacy should not be treated as contractual, the question arises in which terms this practice might then be couched. In this article, I argue that a phenomenology of surrogacy centering on the notion of trust provides a description that is illuminating from the moral point of view. My thesis is that surrogacy establishes a complex and extended reproductive unit--the "surrogacy triad" consisting of the surrogate mother, the child, and the intending parents--whose constituents are bound together by mutual trustful commitments. Even though a trust-based approach does not provide an ultimate answer to whether surrogacy should be sanctioned or prohibited, it allows for at least some practical suggestions. In particular, I will argue that, under certain conditions, surrogacy is tenable within familial or other significant relationships, and I will stress the necessity of acknowledging the new relationships and moral commitments that result from this practice. PMID:26449234

  6. Nanotechnology-based approaches in anticancer research

    PubMed Central

    Jabir, Nasimudeen R; Tabrez, Shams; Ashraf, Ghulam Md; Shakil, Shazi; Damanhouri, Ghazi A; Kamal, Mohammad A

    2012-01-01

    Cancer is a highly complex disease to understand, because it entails multiple cellular physiological systems. The most common cancer treatments are restricted to chemotherapy, radiation and surgery. Moreover, the early recognition and treatment of cancer remains a technological bottleneck. There is an urgent need to develop new and innovative technologies that could help to delineate tumor margins, identify residual tumor cells and micrometastases, and determine whether a tumor has been completely removed or not. Nanotechnology has witnessed significant progress in the past few decades, and its effect is widespread nowadays in every field. Nanoparticles can be modified in numerous ways to prolong circulation, enhance drug localization, increase drug efficacy, and potentially decrease chances of multidrug resistance by the use of nanotechnology. Recently, research in the field of cancer nanotechnology has made remarkable advances. The present review summarizes the application of various nanotechnology-based approaches towards the diagnostics and therapeutics of cancer. PMID:22927757

  7. VATS-based approach for robotic lobectomy.

    PubMed

    Melfi, Franca M A; Fanucchi, Olivia; Davini, Federico; Mussi, Alfredo

    2014-05-01

    Lobectomy with systematic lymph node sampling or dissection remains the mainstay of treatment of early stage non-small cell lung cancer. The use of video-assisted thoracic surgery (VATS) to perform lobectomy was first reported in 1992. Advantages of VATS include less trauma and pain, shorter chest drainage duration, decreased hospital stay, and preservation of short-term pulmonary function. However, VATS is characterized by loss of binocular vision and a limited maneuverability of thoracoscopic instruments, an unstable camera platform, and poor ergonomics for the surgeon. To overcome these limitations, robotic systems were developed during the last decades. This article reviews the technical aspects of robotic lobectomy using a VATS-based approach. PMID:24780417

  8. Sepsis management: An evidence-based approach.

    PubMed

    Baig, Muhammad Akbar; Shahzad, Hira; Jamil, Bushra; Hussain, Erfan

    2016-03-01

    The Surviving Sepsis Campaign (SSC) guidelines have outlined an early goal directed therapy (EGDT) which demonstrates a standardized approach to ensure prompt and effective management of sepsis. Having said that, there are barriers associated with the application of evidence-based practice, which often lead to an overall poorer adherence to guidelines. Considering the global burden of disease, data from low- to middle-income countries is scarce. Asia is the largest continent but most Asian countries do not have a well-developed healthcare system and compliance rates to resuscitation and management bundles are as low as 7.6% and 3.5%, respectively. Intensive care units are not adequately equipped and financial concerns limit implementation of expensive treatment strategies. Healthcare policy-makers should be notified in order to alleviate financial restrictions and ensure delivery of standard care to septic patients. PMID:26968289

  9. Strategic approaches to planetary base development

    NASA Technical Reports Server (NTRS)

    Roberts, Barney B.

    1992-01-01

    The evolutionary development of a planetary expansionary outpost is considered in the light of both technical and economic issues. The outline of a partnering taxonomy is set forth which encompasses both institutional and temporal issues related to establishing shared interests and investments. The purely technical issues are discussed in terms of the program components which include nonaerospace technologies such as construction engineering. Five models are proposed in which partnership and autonomy for participants are approached in different ways including: (1) the standard customer/provider relationship; (2) a service-provider scenario; (3) the joint venture; (4) a technology joint-development model; and (5) a redundancy model for reduced costs. Based on the assumed characteristics of planetary surface systems the cooperative private/public models are championed with coordinated design by NASA to facilitate outside cooperation.

  10. Integrating software into PRA: a test-based approach.

    PubMed

    Li, Bin; Li, Ming; Smidts, Carol

    2005-08-01

    Probabilistic risk assessment (PRA) is a methodology to assess the probability of failure or success of a system's operation. PRA has proven to be a systematic, logical, and comprehensive technique for risk assessment. Software plays an increasing role in modern safety-critical systems, and a significant number of failures can be attributed to software. Unfortunately, current probabilistic risk assessment concentrates on representing the behavior of hardware systems, humans, and their contributions (to a limited extent) to risk, but neglects the contributions of software due to a lack of understanding of software failure phenomena. It is thus imperative to consider and model the impact of software to reflect the risk in current and future systems. The objective of our research is to develop a methodology to account for the impact of software on system failure that can be used in the classical PRA analysis process. A test-based approach for integrating software into PRA is discussed in this article. This approach includes identification of the software functions to be modeled in the PRA and modeling of the software contributions in the event sequence diagram (ESD) and fault tree. The approach also introduces the concepts of input tree and output tree and proposes a quantification strategy that uses a software safety testing technique. The method is applied to an example system, PACS. PMID:16268949
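
    The fault-tree quantification step mentioned in the abstract can be illustrated with a minimal gate-level sketch. All failure probabilities below are hypothetical; in the article's method, the software term would come from the proposed safety-testing-based quantification rather than being assumed.

```python
def and_gate(probs):
    """AND gate: the output fails only if all inputs fail (independent events)."""
    p = 1.0
    for x in probs:
        p *= x
    return p

def or_gate(probs):
    """OR gate: the output fails if any input fails (independent events)."""
    q = 1.0
    for x in probs:
        q *= 1.0 - x
    return 1.0 - q

# Hypothetical top event: the system fails if the software fails OR both
# redundant hardware channels fail.
p_software = 1e-3    # would be estimated via software safety testing
p_hw_channel = 1e-2
p_top = or_gate([p_software, and_gate([p_hw_channel, p_hw_channel])])
```

    The same two gates compose into arbitrarily deep trees, which is how software contributions slot into an existing hardware fault tree.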

  11. A Commentary on "Updating the Duplex Design for Test-Based Accountability in the Twenty-First Century"

    ERIC Educational Resources Information Center

    Brandt, Steffen

    2010-01-01

    This article presents the author's commentary on "Updating the Duplex Design for Test-Based Accountability in the Twenty-First Century," in which Isaac I. Bejar and E. Aurora Graf propose applying a test design--the duplex design, proposed in 1988 by Bock and Mislevy--to current accountability assessments.…

  12. NuMas: A LAN-based materials control and accounting system in production

    SciTech Connect

    Strickland, T.W.; Bracey, J.T.; McMahon, S.A.

    1995-12-31

    A state-of-the-art Nuclear Materials Control and Accounting (NMC and A) System has been implemented and is fully operational at the Paducah Gaseous Diffusion Plant (PGDP) as of September 1994. The uranium enrichment facility is currently regulated by the Department of Energy (DOE) and is in the process of obtaining Nuclear Regulatory Commission (NRC) certification. Implementation of this system has resulted in a tremendous cost savings to the facility as well as improvements to the overall efficiency of the NMC and A department. This paper outlines the benefits of implementing a Personal Computer/Local Area Network (PC/LAN)-based system in hopes of attracting other facilities to explore and utilize its application at their sites.

  13. What is narrative therapy and what is it not?: the usefulness of Q methodology to explore accounts of White and Epston's (1990) approach to narrative therapy.

    PubMed

    Wallis, Jennifer; Burns, Jan; Capdevila, Rose

    2011-01-01

    OBJECTIVE. 'What is narrative therapy and how do you do it?' is a question that is repeatedly asked of narrative therapy, with little consistent response. This study aimed to explore and distil out the 'common themes' of practitioner definitions of White and Epston's approach to narrative therapy. DESIGN. This was an Internet-based study involving current UK practitioners of this type of narrative therapy using a unique combination of a Delphi Panel and Q methodology. METHOD. A group of experienced practitioners were recruited into the Delphi Poll and were asked two questions about what narrative therapy is and is not, and what techniques are and are not employed. These data combined with other information formed the statements of a Q-sort that was then administered to a wider range of narrative practitioners. FINDINGS. The Delphi Panel agreed on a number of key points relating to the theory, politics and practice of narrative therapy. The Q-sort produced eight distinct accounts of narrative therapy and a number of dimensions along which these different positions could be distinguished. These included narrative therapy as a political stance and integration with other approaches. CONCLUSIONS. For any therapeutic model to demonstrate its efficacy and attract proponents, an accepted definition of its components and practice should preferably be established. This study has provided some data for the UK application of White and Epston's narrative therapy, which may then assist in forming a firmer base for further research and practice. PMID:20806421

  14. A genome-wide approach accounting for body mass index identifies genetic variants influencing fasting glycemic traits and insulin resistance.

    PubMed

    Manning, Alisa K; Hivert, Marie-France; Scott, Robert A; Grimsby, Jonna L; Bouatia-Naji, Nabila; Chen, Han; Rybin, Denis; Liu, Ching-Ti; Bielak, Lawrence F; Prokopenko, Inga; Amin, Najaf; Barnes, Daniel; Cadby, Gemma; Hottenga, Jouke-Jan; Ingelsson, Erik; Jackson, Anne U; Johnson, Toby; Kanoni, Stavroula; Ladenvall, Claes; Lagou, Vasiliki; Lahti, Jari; Lecoeur, Cecile; Liu, Yongmei; Martinez-Larrad, Maria Teresa; Montasser, May E; Navarro, Pau; Perry, John R B; Rasmussen-Torvik, Laura J; Salo, Perttu; Sattar, Naveed; Shungin, Dmitry; Strawbridge, Rona J; Tanaka, Toshiko; van Duijn, Cornelia M; An, Ping; de Andrade, Mariza; Andrews, Jeanette S; Aspelund, Thor; Atalay, Mustafa; Aulchenko, Yurii; Balkau, Beverley; Bandinelli, Stefania; Beckmann, Jacques S; Beilby, John P; Bellis, Claire; Bergman, Richard N; Blangero, John; Boban, Mladen; Boehnke, Michael; Boerwinkle, Eric; Bonnycastle, Lori L; Boomsma, Dorret I; Borecki, Ingrid B; Böttcher, Yvonne; Bouchard, Claude; Brunner, Eric; Budimir, Danijela; Campbell, Harry; Carlson, Olga; Chines, Peter S; Clarke, Robert; Collins, Francis S; Corbatón-Anchuelo, Arturo; Couper, David; de Faire, Ulf; Dedoussis, George V; Deloukas, Panos; Dimitriou, Maria; Egan, Josephine M; Eiriksdottir, Gudny; Erdos, Michael R; Eriksson, Johan G; Eury, Elodie; Ferrucci, Luigi; Ford, Ian; Forouhi, Nita G; Fox, Caroline S; Franzosi, Maria Grazia; Franks, Paul W; Frayling, Timothy M; Froguel, Philippe; Galan, Pilar; de Geus, Eco; Gigante, Bruna; Glazer, Nicole L; Goel, Anuj; Groop, Leif; Gudnason, Vilmundur; Hallmans, Göran; Hamsten, Anders; Hansson, Ola; Harris, Tamara B; Hayward, Caroline; Heath, Simon; Hercberg, Serge; Hicks, Andrew A; Hingorani, Aroon; Hofman, Albert; Hui, Jennie; Hung, Joseph; Jarvelin, Marjo-Riitta; Jhun, Min A; Johnson, Paul C D; Jukema, J Wouter; Jula, Antti; Kao, W H; Kaprio, Jaakko; Kardia, Sharon L R; Keinanen-Kiukaanniemi, Sirkka; Kivimaki, Mika; Kolcic, Ivana; Kovacs, Peter; Kumari, Meena; Kuusisto, Johanna; Kyvik, 
Kirsten Ohm; Laakso, Markku; Lakka, Timo; Lannfelt, Lars; Lathrop, G Mark; Launer, Lenore J; Leander, Karin; Li, Guo; Lind, Lars; Lindstrom, Jaana; Lobbens, Stéphane; Loos, Ruth J F; Luan, Jian'an; Lyssenko, Valeriya; Mägi, Reedik; Magnusson, Patrik K E; Marmot, Michael; Meneton, Pierre; Mohlke, Karen L; Mooser, Vincent; Morken, Mario A; Miljkovic, Iva; Narisu, Narisu; O'Connell, Jeff; Ong, Ken K; Oostra, Ben A; Palmer, Lyle J; Palotie, Aarno; Pankow, James S; Peden, John F; Pedersen, Nancy L; Pehlic, Marina; Peltonen, Leena; Penninx, Brenda; Pericic, Marijana; Perola, Markus; Perusse, Louis; Peyser, Patricia A; Polasek, Ozren; Pramstaller, Peter P; Province, Michael A; Räikkönen, Katri; Rauramaa, Rainer; Rehnberg, Emil; Rice, Ken; Rotter, Jerome I; Rudan, Igor; Ruokonen, Aimo; Saaristo, Timo; Sabater-Lleal, Maria; Salomaa, Veikko; Savage, David B; Saxena, Richa; Schwarz, Peter; Seedorf, Udo; Sennblad, Bengt; Serrano-Rios, Manuel; Shuldiner, Alan R; Sijbrands, Eric J G; Siscovick, David S; Smit, Johannes H; Small, Kerrin S; Smith, Nicholas L; Smith, Albert Vernon; Stančáková, Alena; Stirrups, Kathleen; Stumvoll, Michael; Sun, Yan V; Swift, Amy J; Tönjes, Anke; Tuomilehto, Jaakko; Trompet, Stella; Uitterlinden, Andre G; Uusitupa, Matti; Vikström, Max; Vitart, Veronique; Vohl, Marie-Claude; Voight, Benjamin F; Vollenweider, Peter; Waeber, Gerard; Waterworth, Dawn M; Watkins, Hugh; Wheeler, Eleanor; Widen, Elisabeth; Wild, Sarah H; Willems, Sara M; Willemsen, Gonneke; Wilson, James F; Witteman, Jacqueline C M; Wright, Alan F; Yaghootkar, Hanieh; Zelenika, Diana; Zemunik, Tatijana; Zgaga, Lina; Wareham, Nicholas J; McCarthy, Mark I; Barroso, Ines; Watanabe, Richard M; Florez, Jose C; Dupuis, Josée; Meigs, James B; Langenberg, Claudia

    2012-06-01

    Recent genome-wide association studies have described many loci implicated in type 2 diabetes (T2D) pathophysiology and β-cell dysfunction but have contributed little to the understanding of the genetic basis of insulin resistance. We hypothesized that genes implicated in insulin resistance pathways might be uncovered by accounting for differences in body mass index (BMI) and potential interactions between BMI and genetic variants. We applied a joint meta-analysis approach to test associations with fasting insulin and glucose on a genome-wide scale. We present six previously unknown loci associated with fasting insulin at P < 5 × 10^-8 in combined discovery and follow-up analyses of 52 studies comprising up to 96,496 non-diabetic individuals. Risk variants were associated with higher triglyceride and lower high-density lipoprotein (HDL) cholesterol levels, suggesting a role for these loci in insulin resistance pathways. The discovery of these loci will aid further characterization of the role of insulin resistance in T2D pathophysiology. PMID:22581228
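
    The joint test of a SNP main effect and a SNP × BMI interaction can be caricatured as a 2-degree-of-freedom chi-square statistic. This sketch assumes the two effect estimates are independent, whereas the published joint meta-analysis accounts for their covariance across studies; the effect sizes below are made up for illustration.

```python
import math

def joint_2df_pvalue(beta_snp, se_snp, beta_inter, se_inter):
    """Illustrative 2-df joint test of a SNP main effect and a SNP-by-BMI
    interaction effect, treating the two estimates as independent.
    For a chi-square variable with 2 df, the survival function is exp(-x/2)."""
    chi2 = (beta_snp / se_snp) ** 2 + (beta_inter / se_inter) ** 2
    return math.exp(-chi2 / 2.0)

p_null = joint_2df_pvalue(0.0, 1.0, 0.0, 1.0)    # no signal in either term
p_hit = joint_2df_pvalue(0.30, 0.05, 0.20, 0.05)  # strong signal in both
```

    A variant can reach genome-wide significance (P < 5 × 10^-8) on this joint statistic even when neither the main effect nor the interaction alone would.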

  15. A genome-wide approach accounting for body mass index identifies genetic variants influencing fasting glycemic traits and insulin resistance

    PubMed Central

    Manning, Alisa K.; Hivert, Marie-France; Scott, Robert A.; Grimsby, Jonna L.; Bouatia-Naji, Nabila; Chen, Han; Rybin, Denis; Liu, Ching-Ti; Bielak, Lawrence F.; Prokopenko, Inga; Amin, Najaf; Barnes, Daniel; Cadby, Gemma; Hottenga, Jouke-Jan; Ingelsson, Erik; Jackson, Anne U.; Johnson, Toby; Kanoni, Stavroula; Ladenvall, Claes; Lagou, Vasiliki; Lahti, Jari; Lecoeur, Cecile; Liu, Yongmei; Martinez-Larrad, Maria Teresa; Montasser, May E.; Navarro, Pau; Perry, John R. B.; Rasmussen-Torvik, Laura J.; Salo, Perttu; Sattar, Naveed; Shungin, Dmitry; Strawbridge, Rona J.; Tanaka, Toshiko; van Duijn, Cornelia M.; An, Ping; de Andrade, Mariza; Andrews, Jeanette S.; Aspelund, Thor; Atalay, Mustafa; Aulchenko, Yurii; Balkau, Beverley; Bandinelli, Stefania; Beckmann, Jacques S.; Beilby, John P.; Bellis, Claire; Bergman, Richard N.; Blangero, John; Boban, Mladen; Boehnke, Michael; Boerwinkle, Eric; Bonnycastle, Lori L.; Boomsma, Dorret I.; Borecki, Ingrid B.; Böttcher, Yvonne; Bouchard, Claude; Brunner, Eric; Budimir, Danijela; Campbell, Harry; Carlson, Olga; Chines, Peter S.; Clarke, Robert; Collins, Francis S.; Corbatón-Anchuelo, Arturo; Couper, David; de Faire, Ulf; Dedoussis, George V; Deloukas, Panos; Dimitriou, Maria; Egan, Josephine M; Eiriksdottir, Gudny; Erdos, Michael R.; Eriksson, Johan G.; Eury, Elodie; Ferrucci, Luigi; Ford, Ian; Forouhi, Nita G.; Fox, Caroline S; Franzosi, Maria Grazia; Franks, Paul W; Frayling, Timothy M; Froguel, Philippe; Galan, Pilar; de Geus, Eco; Gigante, Bruna; Glazer, Nicole L.; Goel, Anuj; Groop, Leif; Gudnason, Vilmundur; Hallmans, Göran; Hamsten, Anders; Hansson, Ola; Harris, Tamara B.; Hayward, Caroline; Heath, Simon; Hercberg, Serge; Hicks, Andrew A.; Hingorani, Aroon; Hofman, Albert; Hui, Jennie; Hung, Joseph; Jarvelin, Marjo Riitta; Jhun, Min A.; Johnson, Paul C.D.; Jukema, J Wouter; Jula, Antti; Kao, W.H.; Kaprio, Jaakko; Kardia, Sharon L. 
R.; Keinanen-Kiukaanniemi, Sirkka; Kivimaki, Mika; Kolcic, Ivana; Kovacs, Peter; Kumari, Meena; Kuusisto, Johanna; Kyvik, Kirsten Ohm; Laakso, Markku; Lakka, Timo; Lannfelt, Lars; Lathrop, G Mark; Launer, Lenore J.; Leander, Karin; Li, Guo; Lind, Lars; Lindstrom, Jaana; Lobbens, Stéphane; Loos, Ruth J. F.; Luan, Jian’an; Lyssenko, Valeriya; Mägi, Reedik; Magnusson, Patrik K. E.; Marmot, Michael; Meneton, Pierre; Mohlke, Karen L.; Mooser, Vincent; Morken, Mario A.; Miljkovic, Iva; Narisu, Narisu; O’Connell, Jeff; Ong, Ken K.; Oostra, Ben A.; Palmer, Lyle J.; Palotie, Aarno; Pankow, James S.; Peden, John F.; Pedersen, Nancy L.; Pehlic, Marina; Peltonen, Leena; Penninx, Brenda; Pericic, Marijana; Perola, Markus; Perusse, Louis; Peyser, Patricia A; Polasek, Ozren; Pramstaller, Peter P.; Province, Michael A.; Räikkönen, Katri; Rauramaa, Rainer; Rehnberg, Emil; Rice, Ken; Rotter, Jerome I.; Rudan, Igor; Ruokonen, Aimo; Saaristo, Timo; Sabater-Lleal, Maria; Salomaa, Veikko; Savage, David B.; Saxena, Richa; Schwarz, Peter; Seedorf, Udo; Sennblad, Bengt; Serrano-Rios, Manuel; Shuldiner, Alan R.; Sijbrands, Eric J.G.; Siscovick, David S.; Smit, Johannes H.; Small, Kerrin S.; Smith, Nicholas L.; Smith, Albert Vernon; Stančáková, Alena; Stirrups, Kathleen; Stumvoll, Michael; Sun, Yan V.; Swift, Amy J.; Tönjes, Anke; Tuomilehto, Jaakko; Trompet, Stella; Uitterlinden, Andre G.; Uusitupa, Matti; Vikström, Max; Vitart, Veronique; Vohl, Marie-Claude; Voight, Benjamin F.; Vollenweider, Peter; Waeber, Gerard; Waterworth, Dawn M; Watkins, Hugh; Wheeler, Eleanor; Widen, Elisabeth; Wild, Sarah H.; Willems, Sara M.; Willemsen, Gonneke; Wilson, James F.; Witteman, Jacqueline C.M.; Wright, Alan F.; Yaghootkar, Hanieh; Zelenika, Diana; Zemunik, Tatijana; Zgaga, Lina; Wareham, Nicholas J.; McCarthy, Mark I.; Barroso, Ines; Watanabe, Richard M.; Florez, Jose C.; Dupuis, Josée; Meigs, James B.; Langenberg, Claudia

    2013-01-01

    Recent genome-wide association studies have described many loci implicated in type 2 diabetes (T2D) pathophysiology and beta-cell dysfunction, but contributed little to our understanding of the genetic basis of insulin resistance. We hypothesized that genes implicated in insulin resistance pathways may be uncovered by accounting for differences in body mass index (BMI) and potential interaction between BMI and genetic variants. We applied a novel joint meta-analytical approach to test associations with fasting insulin (FI) and glucose (FG) on a genome-wide scale. We present six previously unknown FI loci at P < 5 × 10^-8 in combined discovery and follow-up analyses of 52 studies comprising up to 96,496 non-diabetic individuals. Risk variants were associated with higher triglyceride and lower HDL cholesterol levels, suggestive of a role for these FI loci in insulin resistance pathways. The localization of these additional loci will aid further characterization of the role of insulin resistance in T2D pathophysiology. PMID:22581228

  16. Knowledge-based approach to system integration

    NASA Technical Reports Server (NTRS)

    Blokland, W.; Krishnamurthy, C.; Biegl, C.; Sztipanovits, J.

    1988-01-01

    To solve complex problems one can often use the decomposition principle. However, a problem is seldom decomposable into completely independent subproblems. System integration deals with the problem of resolving these interdependencies and integrating the subsolutions. A natural method of decomposition is the hierarchical one: high-level specifications are broken down into lower-level specifications until they can be transformed into solutions relatively easily. By automating the hierarchical decomposition and solution generation, an integrated system is obtained in which declaring high-level specifications is enough to solve the problem. We offer a knowledge-based approach to integrating the development and building of control systems. Process modeling is supported by graphical editors. The user selects and connects icons that represent subprocesses and may refer to prewritten programs. The graphical editor assists the user in selecting parameters for each subprocess and allows the testing of a specific configuration. Next, from the definitions created by the graphical editor, the actual control program is built. Fault-diagnosis routines are generated automatically as well. Since the user is not required to write program code and knowledge about the process is present in the development system, the user is not required to have expertise in many fields.

  17. Evaluating face trustworthiness: a model based approach

    PubMed Central

    Baron, Sean G.; Oosterhof, Nikolaas N.

    2008-01-01

    Judgments of trustworthiness from faces determine basic approach/avoidance responses and approximate the valence evaluation of faces that runs across multiple person judgments. Here, based on trustworthiness judgments and using a computer model for face representation, we built a model for representing face trustworthiness (study 1). Using this model, we generated novel faces with an increased range of trustworthiness and used these faces as stimuli in a functional Magnetic Resonance Imaging study (study 2). Although participants did not engage in explicit evaluation of the faces, the amygdala response changed as a function of face trustworthiness. An area in the right amygdala showed a negative linear response—as the untrustworthiness of faces increased so did the amygdala response. Areas in the left and right putamen, the latter area extended into the anterior insula, showed a similar negative linear response. The response in the left amygdala was quadratic—strongest for faces on both extremes of the trustworthiness dimension. The medial prefrontal cortex and precuneus also showed a quadratic response, but their response was strongest to faces in the middle range of the trustworthiness dimension. PMID:19015102

  18. Pattern recognition tool based on complex network-based approach

    NASA Astrophysics Data System (ADS)

    Casanova, Dalcimar; Backes, André Ricardo; Martinez Bruno, Odemir

    2013-02-01

    This work proposes a generalization of the authors' earlier method, 'A complex network-based approach for boundary shape analysis'. Instead of modelling a contour as a graph and using complex-network measures to characterize it, we generalize the technique into a mathematical tool for characterizing signals, curves, and sets of points. To evaluate the descriptive power of the proposal, an experiment on plant identification based on leaf-vein images was conducted. Leaf venation is a taxonomic characteristic used for plant identification, and these structures are complex and difficult to represent as signals or curves, and hence to analyze with classical pattern recognition approaches. Here, we model the veins as a set of points and represent them as graphs. As features, we use degree and joint-degree measurements over a dynamic evolution of the network. The results demonstrate that the technique has good discriminative power and can be used for plant identification, as well as for other complex pattern recognition tasks.
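
    The point-set-to-network modelling with degree measurements under a dynamic evolution can be sketched as follows. The threshold radii and toy point sets are illustrative, not the paper's parameters: points within the current radius are connected, and the mean degree is recorded as the radius grows.

```python
import math

def degree_features(points, thresholds):
    """Model a point set as a complex network: connect points closer than a
    threshold, then record the mean degree as the threshold (radius) evolves.
    The resulting vector is a shape descriptor for the point set."""
    n = len(points)
    dist = [[math.dist(p, q) for q in points] for p in points]
    feats = []
    for t in thresholds:
        deg = [sum(1 for j in range(n) if j != i and dist[i][j] <= t)
               for i in range(n)]
        feats.append(sum(deg) / n)
    return feats

# Two toy point sets with distinct geometry:
square = [(0, 0), (0, 1), (1, 0), (1, 1)]
line = [(0, 0), (1, 0), (2, 0), (3, 0)]

feat_square = degree_features(square, [1.1, 1.5])  # → [2.0, 3.0]
feat_line = degree_features(line, [1.1, 1.5])      # → [1.5, 1.5]
```

    The two shapes yield different degree-evolution vectors, which is the property the descriptor exploits for classification.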

  19. How Educators in Three States Are Responding to Standards-Based Accountability under No Child Left Behind. Research Brief

    ERIC Educational Resources Information Center

    Hamilton, Laura S.; Stecher, Brian M.; Marsh, Julie A.; McCombs, Jennifer Sloan; Robyn, Abby; Russell, Jennifer; Naftel, Scott; Barney, Heather

    2007-01-01

    In 2002, the RAND Corporation launched a project to understand how educators are responding to the new accountability requirements in California, Georgia, and Pennsylvania--three states that represent a range of approaches, regions, and student populations. The researchers aimed to identify the factors that enhance the implementation of SBA…

  20. Toward an Human Resource Accounting (HRA)-Based Model for Designing an Organizational Effectiveness Audit in Education.

    ERIC Educational Resources Information Center

    Myroon, John L.

    The major purpose of this paper was to develop a Human Resource Accounting (HRA) macro-model that could be used for designing a school organizational effectiveness audit. Initially, the paper reviewed the advent and definition of HRA. In order to develop the proposed model, the different approaches to measuring effectiveness were reviewed,…

  1. Use of Alternate Assessment Results in Reporting and Accountability Systems: Conditions for Use Based on Research and Practice. Synthesis Report.

    ERIC Educational Resources Information Center

    Quenemoen, Rachel; Rigney, Susan; Thurlow, Martha

    State assessment systems must address both technical and policy issues as assessments and accountability practices are developed and implemented. These technical and policy issues have been expanded from traditional large-scale assessment to new alternative assessment approaches required by law and developed in every state. The primary purpose of…

  2. Concurrency-based approaches to parallel programming

    NASA Technical Reports Server (NTRS)

    Kale, L.V.; Chrisochoides, N.; Kohl, J.; Yelick, K.

    1995-01-01

    The inevitable transition to parallel programming can be facilitated by appropriate tools, including languages and libraries. After describing the needs of applications developers, this paper presents three specific approaches aimed at the development of efficient and reusable parallel software for irregular and dynamic-structured problems. A salient feature of all three approaches is their exploitation of concurrency within a processor. Benefits of individual approaches such as these can be leveraged by an interoperability environment which permits modules written using different approaches to co-exist in single applications.

  3. Concurrency-based approaches to parallel programming

    SciTech Connect

    Kale, L.V.; Chrisochoides, N.; Kohl, J.

    1995-07-17

    The inevitable transition to parallel programming can be facilitated by appropriate tools, including languages and libraries. After describing the needs of applications developers, this paper presents three specific approaches aimed at the development of efficient and reusable parallel software for irregular and dynamic-structured problems. A salient feature of all three approaches is their exploitation of concurrency within a processor. Benefits of individual approaches such as these can be leveraged by an interoperability environment which permits modules written using different approaches to co-exist in single applications.

  4. Nanotechnology-Based Approaches for Guiding Neural Regeneration.

    PubMed

    Shah, Shreyas; Solanki, Aniruddh; Lee, Ki-Bum

    2016-01-19

    The mammalian brain is a phenomenal piece of "organic machinery" that has fascinated scientists and clinicians for centuries. The intricate network of tens of billions of neurons dispersed in a mixture of chemical and biochemical constituents gives rise to thoughts, feelings, memories, and life as we know it. In turn, subtle imbalances or damage to this system can cause severe complications in physical, motor, psychological, and cognitive function. Moreover, the inevitable loss of nerve tissue caused by degenerative diseases and traumatic injuries is particularly devastating because of the limited regenerative capabilities of the central nervous system (i.e., the brain and spinal cord). Among current approaches, stem-cell-based regenerative medicine has shown the greatest promise toward repairing and regenerating destroyed neural tissue. However, establishing controlled and reliable methodologies to guide stem cell differentiation into specialized neural cells of interest (e.g., neurons and oligodendrocytes) has been a prevailing challenge in the field. In this Account, we summarize the nanotechnology-based approaches our group has recently developed to guide stem-cell-based neural regeneration. We focus on three overarching strategies that were adopted to selectively control this process. First, soluble microenvironmental factors play a critical role in directing the fate of stem cells. Multiple factors have been developed in the form of small-molecule drugs, biochemical analogues, and DNA/RNA-based vectors to direct neural differentiation. However, the delivery of these factors with high transfection efficiency and minimal cytotoxicity has been challenging, especially to sensitive cell lines such as stem cells. In our first approach, we designed nanoparticle-based systems for the efficient delivery of such soluble factors to control neural differentiation. Our nanoparticles, comprising either organic or inorganic elements, were biocompatible and offered

  5. An agent-based simulation model to study accountable care organizations.

    PubMed

    Liu, Pai; Wu, Shinyi

    2016-03-01

    Creating accountable care organizations (ACOs) has been widely discussed as a strategy to control rapidly rising healthcare costs and improve quality of care; however, building an effective ACO is a complex process involving multiple stakeholders (payers, providers, patients) with their own interests. Also, implementation of an ACO is costly in terms of time and money. Immature design could cause safety hazards. Therefore, there is a need for analytical model-based decision-support tools that can predict the outcomes of different strategies to facilitate ACO design and implementation. In this study, an agent-based simulation model was developed to study ACOs that considers payers, healthcare providers, and patients as agents under the shared saving payment model of care for congestive heart failure (CHF), one of the most expensive causes of sometimes preventable hospitalizations. The agent-based simulation model has identified the critical determinants for the payment model design that can motivate provider behavior changes to achieve maximum financial and quality outcomes of an ACO. The results show nonlinear provider behavior change patterns corresponding to changes in payment model designs. The outcomes vary by providers with different quality or financial priorities, and are most sensitive to the cost-effectiveness of CHF interventions that an ACO implements. This study demonstrates an increasingly important method to construct a healthcare system analytics model that can help inform health policy and healthcare management decisions. The study also points out that the likely success of an ACO is interdependent with payment model design, provider characteristics, and cost and effectiveness of healthcare interventions. PMID:24715674
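
    The shared-savings mechanics at the heart of such a model can be sketched in toy form. All numbers below (benchmark, effect size, intervention cost) are hypothetical stand-ins for the calibrated CHF parameters a real study would use, and a full agent-based model would iterate this payment logic over interacting payer, provider, and patient agents.

```python
import random

def simulate_aco(sharing_rate, effort, benchmark=10000.0, n_patients=500, seed=1):
    """Toy shared-savings simulation (hypothetical parameters):
    provider `effort` (0..1) on CHF care management lowers expected cost per
    patient by up to 15% but adds a per-patient intervention cost; the
    provider keeps `sharing_rate` of any savings versus the benchmark."""
    rng = random.Random(seed)
    intervention_cost = 300.0 * effort
    total_cost = 0.0
    for _ in range(n_patients):
        base = rng.gauss(10000.0, 2000.0)  # patient's baseline annual cost
        total_cost += max(0.0, base * (1.0 - 0.15 * effort)) + intervention_cost
    savings = benchmark * n_patients - total_cost
    return sharing_rate * max(0.0, savings)  # provider's shared-savings payment

payment_low = simulate_aco(0.5, effort=0.0)
payment_high = simulate_aco(0.5, effort=1.0)
```

    Sweeping `sharing_rate` and `effort` in a grid is the simplest way to see the nonlinear provider-response patterns the abstract refers to.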

  6. Approach to proliferation risk assessment based on multiple objective analysis framework

    SciTech Connect

    Andrianov, A.; Kuptsov, I.

    2013-07-01

An approach to the assessment of proliferation risk using methods of multi-criteria decision making and multi-objective optimization is presented. The approach takes into account the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities). Three examples of applying the approach are shown. First, it has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Second, it has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU through theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Third, it has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, with the analysis based on indicators of proliferation risk.

  7. Prognostics and Health Management for Complex system Based on Fusion of Model-based approach and Data-driven approach

    NASA Astrophysics Data System (ADS)

    Hong-feng, Wang

Prognostics and Health Management (PHM) has become an effective technology for increasing efficiency and reducing cost in complex systems. Of its two major categories of methods, both model-based approaches and data-driven approaches have merits and drawbacks. This paper presents a class of fusion approaches that integrate model-based and data-driven approaches, and details the fusion structure, in order to make full use of their advantages and overcome their limitations.

  8. Reuse: A knowledge-based approach

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui

    1992-01-01

This paper describes our research in automating the reuse process through the use of application domain models. Application domain models are explicit formal representations of the application knowledge necessary to understand, specify, and generate application programs. Furthermore, they provide a unified repository for the operational structure, rules, policies, and constraints of a specific application area. In our approach, domain models are expressed in terms of a transaction-based meta-modeling language. The paper describes in detail the creation and maintenance of hierarchical structures. These structures are created through a process that includes reverse engineering of data models with supplementary enhancement from application experts. Source code is also reverse engineered but is not a major source of domain model instantiation at this time. In the second phase of the software synthesis process, program specifications are interactively synthesized from an instantiated domain model. These specifications are currently integrated into a manual programming process but will eventually be used to derive executable code with mechanically assisted transformations. This research is performed within the context of programming-in-the-large types of systems. Although our goals are ambitious, we are implementing the synthesis system in an incremental manner through which we can realize tangible results. The client/server architecture is capable of supporting 16 simultaneous X/Motif users and tens of thousands of attributes and classes. Domain models have been partially synthesized from five different application areas. As additional domain models are synthesized and additional knowledge is gathered, we will inevitably add to and modify our representation. However, our current experience indicates that it will scale and expand to meet our modeling needs.

  9. Stereoscopic ground-based determination of the cloud base height: theory of camera position calibration with account for lens distortion

    NASA Astrophysics Data System (ADS)

    Chulichkov, Alexey I.; Postylyakov, Oleg V.

    2016-05-01

For the reconstruction of some geometrical characteristics of clouds, a method was developed based on taking pictures of the sky with a pair of digital photo cameras and subsequently processing the obtained sequence of stereo frames to obtain the height of the cloud base. Since the directions of the optical axes of the stereo cameras are not exactly known, a procedure for adjusting the obtained frames was developed which uses photographs of the night starry sky. In the second step, the method of morphological image analysis is used to determine the relative shift of the coordinates of a cloud fragment, and this shift is then used to estimate the sought cloud base height. The proposed method can be used for automatic processing of stereo data to obtain the cloud base height. An earlier paper described a mathematical model of the stereophotography measurement, and posed and solved the problem of adjusting the optical axes of the cameras in the paraxial (first-order geometric optics) approximation, which was applied to the central part of the sky frames. This paper describes a model of the experiment that takes lens distortion into account in the Seidel approximation (depending on the third order of the distance from the optical axis). We developed a procedure for simultaneous camera position calibration and estimation of the parameters of lens distortion in the Seidel approximation.
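Once the frames are co-registered and the fragment shift is measured, the paraxial height estimate follows the standard stereo relation H = B·f/d. A minimal sketch with illustrative numbers, ignoring the Seidel distortion correction the paper develops:

```python
def cloud_base_height(disparity_px, baseline_m, focal_px):
    """Estimate cloud base height (m) from the pixel shift of a cloud
    fragment between two upward-looking cameras, using the standard
    paraxial stereo relation H = B * f / d. Illustrative only: real
    processing must first co-register the frames (star-field calibration)
    and correct the lens distortion treated in the paper."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_m * focal_px / disparity_px

# e.g., 50 m baseline, 2000 px focal length, 80 px measured shift
print(cloud_base_height(80, 50, 2000))  # -> 1250.0
```

The relation also shows why distortion matters: an error of a few pixels in the measured shift maps directly into tens of meters of height error at typical baselines.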

  10. Design of a Competency-Based Assessment Model in the Field of Accounting

    ERIC Educational Resources Information Center

    Ciudad-Gómez, Adelaida; Valverde-Berrocoso, Jesús

    2012-01-01

    This paper presents the phases involved in the design of a methodology to contribute both to the acquisition of competencies and to their assessment in the field of Financial Accounting, within the European Higher Education Area (EHEA) framework, which we call MANagement of COMpetence in the areas of Accounting (MANCOMA). Having selected and…

  11. Involving Diverse Communities of Practice to Minimize Unintended Consequences of Test-Based Accountability Systems

    ERIC Educational Resources Information Center

    Behizadeh, Nadia; Engelhard, George, Jr.

    2015-01-01

    In his focus article, Koretz (this issue) argues that accountability has become the primary function of large-scale testing in the United States. He then points out that tests being used for accountability purposes are flawed and that the high-stakes nature of these tests creates a context that encourages score inflation. Koretz is concerned about…

  12. Structure-Based Statistical Mechanical Model Accounts for the Causality and Energetics of Allosteric Communication

    PubMed Central

    Guarnera, Enrico; Berezovsky, Igor N.

    2016-01-01

    Allostery is one of the pervasive mechanisms through which proteins in living systems carry out enzymatic activity, cell signaling, and metabolism control. Effective modeling of the protein function regulation requires a synthesis of the thermodynamic and structural views of allostery. We present here a structure-based statistical mechanical model of allostery, allowing one to observe causality of communication between regulatory and functional sites, and to estimate per residue free energy changes. Based on the consideration of ligand free and ligand bound systems in the context of a harmonic model, corresponding sets of characteristic normal modes are obtained and used as inputs for an allosteric potential. This potential quantifies the mean work exerted on a residue due to the local motion of its neighbors. Subsequently, in a statistical mechanical framework the entropic contribution to allosteric free energy of a residue is directly calculated from the comparison of conformational ensembles in the ligand free and ligand bound systems. As a result, this method provides a systematic approach for analyzing the energetics of allosteric communication based on a single structure. The feasibility of the approach was tested on a variety of allosteric proteins, heterogeneous in terms of size, topology and degree of oligomerization. The allosteric free energy calculations show the diversity of ways and complexity of scenarios existing in the phenomenology of allosteric causality and communication. The presented model is a step forward in developing the computational techniques aimed at detecting allosteric sites and obtaining the discriminative power between agonistic and antagonistic effectors, which are among the major goals in allosteric drug design. PMID:26939022

  13. Structure-Based Statistical Mechanical Model Accounts for the Causality and Energetics of Allosteric Communication.

    PubMed

    Guarnera, Enrico; Berezovsky, Igor N

    2016-03-01

    Allostery is one of the pervasive mechanisms through which proteins in living systems carry out enzymatic activity, cell signaling, and metabolism control. Effective modeling of the protein function regulation requires a synthesis of the thermodynamic and structural views of allostery. We present here a structure-based statistical mechanical model of allostery, allowing one to observe causality of communication between regulatory and functional sites, and to estimate per residue free energy changes. Based on the consideration of ligand free and ligand bound systems in the context of a harmonic model, corresponding sets of characteristic normal modes are obtained and used as inputs for an allosteric potential. This potential quantifies the mean work exerted on a residue due to the local motion of its neighbors. Subsequently, in a statistical mechanical framework the entropic contribution to allosteric free energy of a residue is directly calculated from the comparison of conformational ensembles in the ligand free and ligand bound systems. As a result, this method provides a systematic approach for analyzing the energetics of allosteric communication based on a single structure. The feasibility of the approach was tested on a variety of allosteric proteins, heterogeneous in terms of size, topology and degree of oligomerization. The allosteric free energy calculations show the diversity of ways and complexity of scenarios existing in the phenomenology of allosteric causality and communication. The presented model is a step forward in developing the computational techniques aimed at detecting allosteric sites and obtaining the discriminative power between agonistic and antagonistic effectors, which are among the major goals in allosteric drug design. PMID:26939022

  14. Grid-cell-based crop water accounting for the famine early warning system

    USGS Publications Warehouse

    Verdin, J.; Klaver, R.

    2002-01-01

    Rainfall monitoring is a regular activity of food security analysts for sub-Saharan Africa due to the potentially disastrous impact of drought. Crop water accounting schemes are used to track rainfall timing and amounts relative to phenological requirements, to infer water limitation impacts on yield. Unfortunately, many rain gauge reports are available only after significant delays, and the gauge locations leave large gaps in coverage. As an alternative, a grid-cell-based formulation for the water requirement satisfaction index (WRSI) was tested for maize in Southern Africa. Grids of input variables were obtained from remote sensing estimates of rainfall, meteorological models, and digital soil maps. The spatial WRSI was computed for the 1996-97 and 1997-98 growing seasons. Maize yields were estimated by regression and compared with a limited number of reports from the field for the 1996-97 season in Zimbabwe. Agreement at a useful level (r = 0.80) was observed. This is comparable to results from traditional analysis with station data. The findings demonstrate the complementary role that remote sensing, modelling, and geospatial analysis can play in an era when field data collection in sub-Saharan Africa is suffering an unfortunate decline. Published in 2002 by John Wiley & Sons, Ltd.
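The WRSI accumulation described above can be illustrated with a minimal sketch: compare water supplied against the crop's phenological requirement over the season. This simplified version omits the soil-moisture bucket that the operational model tracks, and the dekadal numbers are invented for illustration:

```python
def wrsi(rainfall_mm, requirement_mm):
    """Water requirement satisfaction index (WRSI) sketch: the percentage
    of the crop's seasonal phenological water requirement actually met,
    accumulated over reporting periods (dekads). Simplified: ignores
    soil-moisture carryover between periods."""
    supplied = sum(min(r, w) for r, w in zip(rainfall_mm, requirement_mm))
    return 100.0 * supplied / sum(requirement_mm)

# three dekads of rainfall vs. maize water requirement (mm)
print(round(wrsi([30, 10, 50], [25, 40, 45]), 1))  # -> 72.7
```

In the grid-cell formulation, the same accumulation runs independently in every cell, with rainfall taken from remote sensing estimates and requirements from meteorological grids rather than station reports.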

  15. Acid-base accounting assessment of mine wastes using the chromium reducible sulfur method.

    PubMed

    Schumann, Russell; Stewart, Warwick; Miller, Stuart; Kawashima, Nobuyuki; Li, Jun; Smart, Roger

    2012-05-01

The acid-base account (ABA), commonly used in the assessment of mine waste materials, relies in part on calculation of potential acidity from total sulfur measurements. However, potential acidity is overestimated where organic sulfur, sulfate sulfur, and some sulfide compounds make up a substantial portion of the sulfur content. The chromium reducible sulfur (CRS) method has been widely applied to assess reduced inorganic sulfur forms in sediments and acid sulfate soils, but not in ABA assessment of mine wastes. This paper reports the application of the CRS method to measuring forms of sulfur commonly found in mine waste materials. A number of individual sulfur-containing minerals and real waste materials were analyzed using both CRS and total S, and the potential acidity estimates were compared with actual acidity measured from net acid generation (NAG) tests and column leach tests. The results of the CRS analysis made on individual minerals demonstrate good assessment of sulfur from a range of sulfides. No sulfur was measured using the CRS method in a number of sulfate salts, including the jarosite and melanterite typically found in weathered waste rocks, or from dibenzothiophene, characteristic of organic sulfur compounds common to coal wastes. Comparison of ABA values for a number of coal waste samples demonstrated that acidity predicted from CRS analysis agreed much better with actual acidity than did acidity predicted from total S analysis. It also resulted in reclassification of most samples tested from potentially acid forming (PAF) to non-acid forming (NAF). Similar comparisons on base metal sulfide wastes generally resulted in overestimation of the acid potential by total S and underestimation of the acid potential by CRS in comparison to acidity measured during NAG tests, but did not generally result in reclassification. In all the cases examined, the best estimate of potential acidity included acidity calculated from both CRS and jarositic S. PMID:22444067
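The ABA arithmetic behind these comparisons can be sketched as follows, using the standard stoichiometric factor of 30.6 kg H2SO4 per tonne per %S (which assumes all sulfur occurs as pyrite). The sample numbers are hypothetical, not the paper's data:

```python
def potential_acidity(sulfur_pct):
    """Maximum potential acidity (kg H2SO4 per tonne) from a sulfur assay,
    assuming all measured sulfur occurs as pyrite: MPA = 30.6 x %S.
    Feeding in CRS sulfur rather than total sulfur excludes sulfate and
    organic sulfur from the estimate."""
    return 30.6 * sulfur_pct

def napp(sulfur_pct, anc):
    """Net acid producing potential: NAPP = MPA - ANC (kg H2SO4/t).
    Negative values suggest a non-acid-forming (NAF) classification."""
    return potential_acidity(sulfur_pct) - anc

# hypothetical coal waste: total S 1.2% vs. CRS S 0.3%, ANC 15 kg H2SO4/t
print(napp(1.2, 15.0))  # positive -> classified PAF using total S
print(napp(0.3, 15.0))  # negative -> reclassified NAF using CRS
```

The hypothetical sample shows the reclassification effect reported in the abstract: when organic and sulfate sulfur inflate the total S assay, the total-S-based NAPP is positive while the CRS-based NAPP is negative.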

  16. Concept Based Approach for Adaptive Personalized Course Learning System

    ERIC Educational Resources Information Center

    Salahli, Mehmet Ali; Özdemir, Muzaffer; Yasar, Cumali

    2013-01-01

    One of the most important factors for improving the personalization aspects of learning systems is to enable adaptive properties to them. The aim of the adaptive personalized learning system is to offer the most appropriate learning path and learning materials to learners by taking into account their profiles. In this paper, a new approach to…

  17. Nanotechnology based approaches in cancer therapeutics

    NASA Astrophysics Data System (ADS)

    Kumer Biswas, Amit; Reazul Islam, Md; Sadek Choudhury, Zahid; Mostafa, Asif; Fahim Kadir, Mohammad

    2014-12-01

The current decades are marked not by the development of new molecules for the cure of various diseases but rather by the development of new delivery methods for optimum treatment outcomes. Nanomedicine is perhaps playing the biggest role in this regard. Nanomedicine offers numerous advantages over conventional drug delivery approaches and is a particularly hot topic in anticancer research. Nanoparticles (NPs) have many unique properties that enable them to be incorporated into anticancer therapy. This topical review looks at the properties and various forms of NPs and their use in anticancer treatment, recent developments in identifying new delivery approaches, and progress in clinical trials with these newer approaches. Although the outcomes of cancer therapy can be improved using nanomedicine, this approach still has many disadvantages. We discuss all of these issues in this review.

  18. Minimally invasive surgery of the anterior skull base: transorbital approaches

    PubMed Central

    Gassner, Holger G.; Schwan, Franziska; Schebesch, Karl-Michael

    2016-01-01

    Minimally invasive approaches are becoming increasingly popular to access the anterior skull base. With interdisciplinary cooperation, in particular endonasal endoscopic approaches have seen an impressive expansion of indications over the past decades. The more recently described transorbital approaches represent minimally invasive alternatives with a differing spectrum of access corridors. The purpose of the present paper is to discuss transorbital approaches to the anterior skull base in the light of the current literature. The transorbital approaches allow excellent exposure of areas that are difficult to reach like the anterior and posterior wall of the frontal sinus; working angles may be more favorable and the paranasal sinus system can be preserved while exposing the skull base. Because of their minimal morbidity and the cosmetically excellent results, the transorbital approaches represent an important addition to established endonasal endoscopic and open approaches to the anterior skull base. Their execution requires an interdisciplinary team approach. PMID:27453759

  19. Women of Courage: A Personal Account of a Wilderness-Based Experiential Group for Survivors of Abuse

    ERIC Educational Resources Information Center

    Kelly, Virginia A.

    2006-01-01

    Adventure-based therapy has grown in both scope and popularity. These groups are frequently utilized in the treatment of adolescents with behavioral or substance abuse issues. Less evident is the use of this modality with other populations. Described here is a personal account of the author's participation in a wilderness-based group for women.…

  20. PPDF-based method to account for atmospheric light scattering in observations of carbon dioxide from space

    NASA Astrophysics Data System (ADS)

    Oshchepkov, Sergey; Bril, Andrey; Yokota, Tatsuya

    2008-12-01

We present an original method that accounts for thin clouds in carbon dioxide retrievals from space-based reflected-sunlight observations in near-infrared regions. This approach involves a reasonable, simple parameterization of effective transmittance using a set of parameters that describe the path-length modification caused by clouds. The complete retrieval scheme includes the following: estimation of cloud parameters from the 0.76-μm O2 A-band and from the H2O-saturated absorption area of the 2.0-μm band; a correction needed to utilize these parameters at the target CO2 1.58-μm band, using ground surface albedo estimated outside of the gas absorption lines in this band; and retrieval of the CO2 amount at the 1.58-μm band using a maximum a posteriori method of inversion. The primary retrieved parameters refer to the CO2 volume mixing ratio vertical profile, which is then transformed to an averaged-column amount under a pre-defined increment of pressure. A set of numerical simulations with synthetic radiance spectra particular to Greenhouse Gases Observing Satellite (GOSAT) observations showed that the proposed method provides acceptably accurate CO2 retrievals from an atmosphere that includes thin cirrus clouds. The efficiency of the aerosol and cloud corrections was demonstrated by comparison with a modified iterative maximum a posteriori DOAS (IMAP-DOAS) method that neglects path-length modifications due to scattering effects.
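The maximum a posteriori inversion step named in the retrieval scheme can be sketched in the generic linearized optimal-estimation form; this is a standard textbook formulation assumed here for illustration, not the authors' GOSAT-specific code:

```python
import numpy as np

def map_retrieval(y, K, x_a, S_a, S_e):
    """Maximum a posteriori (optimal estimation) update for a linearized
    forward model y ~ K x, given a prior state x_a with covariance S_a
    and measurement-noise covariance S_e. A generic sketch of the
    inversion step, not the paper's implementation."""
    Se_inv = np.linalg.inv(S_e)
    Sa_inv = np.linalg.inv(S_a)
    S_hat = np.linalg.inv(K.T @ Se_inv @ K + Sa_inv)    # posterior covariance
    x_hat = x_a + S_hat @ K.T @ Se_inv @ (y - K @ x_a)  # retrieved state
    return x_hat, S_hat

# with a very weak prior the estimate approaches ordinary least squares
y = np.array([2.0, 2.1])
K = np.array([[1.0], [1.0]])
x_hat, _ = map_retrieval(y, K, np.array([0.0]), np.array([[1e6]]), np.eye(2))
print(x_hat)  # close to the sample mean 2.05
```

In the retrieval described above, x would hold the CO2 volume mixing ratio profile (plus path-length parameters), y the measured radiances in the 1.58-μm band, and K the Jacobian of the radiative transfer model.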

  1. 26 CFR 1.801-8 - Contracts with reserves based on segregated asset accounts.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... segregated asset account includes a contract under which the reflection of investment return and market value...) provides that for purposes of section 801(b)(1)(A), the reflection of the investment return and the...

  2. 26 CFR 1.801-8 - Contracts with reserves based on segregated asset accounts.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... segregated asset account includes a contract under which the reflection of investment return and market value...) provides that for purposes of section 801(b)(1)(A), the reflection of the investment return and the...

  3. 26 CFR 1.801-8 - Contracts with reserves based on segregated asset accounts.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... segregated asset account includes a contract under which the reflection of investment return and market value...) provides that for purposes of section 801(b)(1)(A), the reflection of the investment return and the...

  4. 26 CFR 1.801-8 - Contracts with reserves based on segregated asset accounts.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... segregated asset account includes a contract under which the reflection of investment return and market value...) provides that for purposes of section 801(b)(1)(A), the reflection of the investment return and the...

  5. Future Performance Trend Indicators: A Current Value Approach to Human Resources Accounting. Report I. Internal Consistencies and Relationships to Performance By Site. Final Report.

    ERIC Educational Resources Information Center

    Pecorella, Patricia A.; Bowers, David G.

    Analyses preparatory to construction of a suitable file for generating a system of future performance trend indicators are described. Such a system falls into the category of a current value approach to human resources accounting. It requires that there be a substantial body of data which: (1) uses the work group or unit, not the individual, as…

  6. Development of prototype induced-fission-based Pu accountancy instrument for safeguards applications.

    PubMed

    Seo, Hee; Lee, Seung Kyu; An, Su Jung; Park, Se-Hwan; Ku, Jeong-Hoe; Menlove, Howard O; Rael, Carlos D; LaFleur, Adrienne M; Browne, Michael C

    2016-09-01

A prototype safeguards instrument for nuclear material accountancy (NMA) of uranium/transuranic (U/TRU) products that could be produced in a future advanced PWR fuel processing facility has been developed and characterized. This is a new, hybrid neutron measurement system based on the fast neutron energy multiplication (FNEM) and passive neutron albedo reactivity (PNAR) methods. The FNEM method is sensitive to the fission rate induced by fast neutrons, while the PNAR method is sensitive to the fission rate induced by thermal neutrons in the sample to be measured. The induced fission rate is proportional to the total amount of fissile material, especially plutonium (Pu), in the U/TRU product; hence, the Pu amount can be calibrated as a function of the induced fission rate, which can be measured using either the FNEM or PNAR method. In the present study, the prototype system was built using six ³He tubes, and its performance was evaluated for various detector parameters, including the high-voltage (HV) plateau, efficiency profiles, dead time, and stability. The system's capability to measure the difference in average neutron energy for the FNEM signature was also evaluated using AmLi, PuBe, and ²⁵²Cf sources, as well as four Pu-oxide sources, each with a different impurity (Al, F, Mg, and B) producing (α,n) neutrons with different average energies. Future work will measure the hybrid signature (i.e., FNEM×PNAR) for a Pu source with an external interrogating neutron source, after enlarging the cavity of the prototype system to accommodate a large Pu source (~600 g Pu). PMID:27337652

  7. Appearance questions can be misleading: a discourse-based account of the appearance-reality problem.

    PubMed

    Hansen, Mikkel B; Markman, Ellen M

    2005-05-01

Preschoolers' success on the appearance-reality task is a milestone in theory-of-mind development. On the standard task children see a deceptive object, such as a sponge that looks like a rock, and are asked, "What is this really?" and "What does this look like?" Children below 4½ years of age fail, saying that the object not only is a sponge but also looks like a sponge. We propose that young children's difficulty stems from ambiguity in the meaning of "looks like." This locution can refer to outward appearance ("Peter looks like Paul") but in fact often refers to likely reality ("That looks like Jim"). We propose that "looks like" is taken to refer to likely reality unless the reality is already part of the common ground of the conversation. Because this joint knowledge is unclear to young children on the appearance-reality task, they mistakenly think the appearance question is about likely reality. Study 1 analyzed everyday conversations from the CHILDES database and documented that 2- and 3-year-olds are familiar with these two different uses of the locution. To disambiguate the meaning of "looks like," Study 2 clarified that the reality was shared knowledge as part of the appearance question, e.g., "What does the sponge look like?" Study 3 used a non-linguistic measure to emphasize the shared knowledge of the reality in the appearance question. Study 4 asked children on their own to articulate the contrast between appearance and reality. At 91%, 85%, and 81% correct responses, children were at near-ceiling levels in each of our manipulations, while they failed the standard versions of the tasks. Moreover, we show how this discourse-based explanation accounts for findings in the literature. Thus children master the appearance-reality distinction by the age of 3, but the standard task masks this understanding because of the discourse structure involved in talking about appearances. PMID:15826611

  8. Cropland carbon fluxes in the United States: increasing geospatial resolution of inventory-based carbon accounting.

    PubMed

    West, Tristram O; Brandt, Craig C; Baskaran, Latha M; Hellwinckel, Chad M; Mueller, Richard; Bernacchi, Carl J; Bandaru, Varaprasad; Yang, Bai; Wilson, Bradly S; Marland, Gregg; Nelson, Richard G; De la Torre Ugarte, Daniel G; Post, Wilfred M

    2010-06-01

Net annual soil carbon change, fossil fuel emissions from cropland production, and cropland net primary production were estimated and spatially distributed using land cover defined by NASA's moderate resolution imaging spectroradiometer (MODIS) and by the USDA National Agricultural Statistics Service (NASS) cropland data layer (CDL). Spatially resolved estimates of net ecosystem exchange (NEE) and net ecosystem carbon balance (NECB) were developed. The purpose of generating spatial estimates of carbon fluxes, and the primary objective of this research, was to develop a method of carbon accounting that is consistent from field to national scales. NEE represents net on-site vertical fluxes of carbon. NECB represents all on-site and off-site carbon fluxes associated with crop production. Estimates of cropland NEE using moderate resolution (approximately 1 km²) land cover data were generated for the conterminous United States and compared with higher resolution (30-m) estimates of NEE and with direct measurements of CO2 flux from croplands in Illinois and Nebraska, USA. Estimates of NEE using the CDL (30-m resolution) had a higher correlation with eddy covariance flux tower estimates compared with estimates of NEE using MODIS. Estimates of NECB are primarily driven by net soil carbon change, fossil fuel emissions associated with crop production, and CO2 emissions from the application of agricultural lime. NEE and NECB for U.S. croplands were -274 and 7 Tg C/yr for 2004, respectively. Use of moderate- to high-resolution satellite-based land cover data enables improved estimates of cropland carbon dynamics. PMID:20597291

  9. The accountability for reasonableness approach to guide priority setting in health systems within limited resources – findings from action research at district level in Kenya, Tanzania, and Zambia

    PubMed Central

    2014-01-01

    Background Priority-setting decisions are based on an important, but not sufficient set of values and thus lead to disagreement on priorities. Accountability for Reasonableness (AFR) is an ethics-based approach to a legitimate and fair priority-setting process that builds upon four conditions: relevance, publicity, appeals, and enforcement, which facilitate agreement on priority-setting decisions and gain support for their implementation. This paper focuses on the assessment of AFR within the project REsponse to ACcountable priority setting for Trust in health systems (REACT). Methods This intervention study applied an action research methodology to assess implementation of AFR in one district in Kenya, Tanzania, and Zambia, respectively. The assessments focused on selected disease, program, and managerial areas. An implementing action research team of core health team members and supporting researchers was formed to implement, and continually assess and improve the application of the four conditions. Researchers evaluated the intervention using qualitative and quantitative data collection and analysis methods. Results The values underlying the AFR approach were in all three districts well-aligned with general values expressed by both service providers and community representatives. There was some variation in the interpretations and actual use of the AFR in the decision-making processes in the three districts, and its effect ranged from an increase in awareness of the importance of fairness to a broadened engagement of health team members and other stakeholders in priority setting and other decision-making processes. Conclusions District stakeholders were able to take greater charge of closing the gap between nationally set planning and the local realities and demands of the served communities within the limited resources at hand. 
This study thus indicates that the operationalization of the four broadly defined and linked conditions is both possible and seems to

  10. A performance weighting procedure for GCMs based on explicit probabilistic models and accounting for observation uncertainty

    NASA Astrophysics Data System (ADS)

    Renard, Benjamin; Vidal, Jean-Philippe

    2016-04-01

    In recent years, the climate modeling community has put a lot of effort into releasing the outputs of multimodel experiments for use by the wider scientific community. In such experiments, several structurally distinct GCMs are run using the same observed forcings (for the historical period) or the same projected forcings (for the future period). In addition, several members are produced for a single given model structure, by running each GCM with slightly different initial conditions. This multiplicity of GCM outputs offers many opportunities in terms of uncertainty quantification or GCM comparisons. In this presentation, we propose a new procedure to weight GCMs according to their ability to reproduce the observed climate. Such weights can be used to combine the outputs of several models in a way that rewards good-performing models and discards poorly-performing ones. The proposed procedure has the following main properties: 1. It is based on explicit probabilistic models describing the time series produced by the GCMs and the corresponding historical observations, 2. It can use several members whenever available, 3. It accounts for the uncertainty in observations, 4. It assigns a weight to each GCM (all weights summing up to one), 5. It can also assign a weight to the "H0 hypothesis" that all GCMs in the multimodel ensemble are not compatible with observations. The application of the weighting procedure is illustrated with several case studies including synthetic experiments, simple cases where the target GCM output is a simple univariate variable and more realistic cases where the target GCM output is a multivariate and/or a spatial variable. These case studies illustrate the generality of the procedure which can be applied in a wide range of situations, as long as the analyst is prepared to make an explicit probabilistic assumption on the target variable. Moreover, these case studies highlight several interesting properties of the weighting procedure. In
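Properties 4 and 5 of the weighting procedure (weights summing to one, with a weight reserved for the "H0 hypothesis") can be realized by normalizing per-model likelihoods of the observations. This Bayesian-style normalization is an assumption about the general flavor of such a procedure, not the authors' exact formulation:

```python
import math

def gcm_weights(log_liks, log_lik_h0=None):
    """Turn per-GCM log-likelihoods of the observed climate into weights
    summing to one, optionally appending an 'H0' alternative that no GCM
    in the ensemble is compatible with observations. Generic sketch only."""
    lls = list(log_liks) + ([log_lik_h0] if log_lik_h0 is not None else [])
    m = max(lls)  # subtract the max before exponentiating, for stability
    ws = [math.exp(ll - m) for ll in lls]
    total = sum(ws)
    return [w / total for w in ws]

print(gcm_weights([-10.0, -12.0, -11.0]))  # best-fitting GCM gets the largest weight
```

In the procedure described above, each log-likelihood would come from an explicit probabilistic model of the GCM's historical time series that also integrates over observation uncertainty and multiple members, rather than from a single deterministic comparison.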

  11. Model-based imputation approach for data analysis in the presence of non-detects.

    PubMed

    Krishnamoorthy, K; Mallick, Avishek; Mathew, Thomas

    2009-04-01

    A model-based multiple imputation approach for analyzing sample data with non-detects is proposed. The imputation approach involves randomly generating observations below the detection limit using the detected sample values and then analyzing the data using complete sample techniques, along with suitable adjustments to account for the imputation. The method is described for the normal case and is illustrated for constructing prediction limits and tolerance limits, for setting an upper bound on an exceedance probability, and for interval estimation of a log-normal mean. Two imputation approaches are investigated in the paper: one uses approximate maximum likelihood estimates (MLEs) of the parameters and a second uses simple ad hoc estimates that were developed for the specific purpose of imputation. The accuracy of the approaches is verified using Monte Carlo simulation. Simulation studies show that both approaches are very satisfactory for small to moderately large sample sizes, but only the MLE-based approach is satisfactory for large sample sizes. The MLE-based approach can be calibrated to perform very well for large samples. Applicability of the method to the log-normal distribution and the gamma distribution (via a cube root transformation) is outlined. Simulation studies also show that the imputation approach works well for constructing tolerance limits and prediction limits for a gamma distribution. The approach is illustrated using a few practical examples. PMID:19181626
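
    For the normal case, the imputation step can be sketched by inverse-CDF sampling from the fitted distribution truncated above at the detection limit. This is a sketch only: `mu` and `sigma` stand in for the approximate MLEs or ad hoc estimates the paper derives from the detected values, and the subsequent adjustments for imputation are not shown.

```python
import random
from statistics import NormalDist

def impute_nondetects(detected, n_below, detection_limit, mu, sigma, seed=0):
    """Impute n_below censored observations by inverse-CDF sampling from
    a Normal(mu, sigma) truncated above at the detection limit.

    mu, sigma : parameter estimates (e.g. approximate MLEs) supplied
                by the caller; estimation itself is not sketched here.
    """
    rng = random.Random(seed)
    dist = NormalDist(mu, sigma)
    p_limit = dist.cdf(detection_limit)   # probability mass below the limit
    # u ~ Uniform(0, p_limit) maps to a draw below the detection limit
    imputed = [dist.inv_cdf(rng.random() * p_limit) for _ in range(n_below)]
    return detected + imputed
```

    After imputation the combined sample is analyzed with complete-sample techniques, with suitable adjustments (not shown) to account for the imputation.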

  12. The financing of the health system in the Islamic Republic of Iran: A National Health Account (NHA) approach

    PubMed Central

    Zakeri, Mohammadreza; Olyaeemanesh, Alireza; Zanganeh, Marziee; Kazemian, Mahmoud; Rashidian, Arash; Abouhalaj, Masoud; Tofighi, Shahram

    2015-01-01

    Background: The National Health Accounts keep track of all healthcare related activities from the beginning (i.e. resource provision) to the end (i.e. service provision). This study was conducted to address the following questions: How is the Iranian health system funded? Who distributes the funds? What services are the funds spent on? Which service providers receive the funds? Methods: The required study data were collected through a number of methods. The family health expenditure data were obtained through a cross-sectional multistage (seasonal) survey, while library and field study were used to collect the registered data. The collected data fell into the following three categories: household health expenditure (sample size: 10,200 urban households and 6,800 rural households; four rounds of questioning), financial agents data, and the financial performance data of the medical universities. Results: The total health expenditure of Iranian households was 201,496,172 million Rials in 2008, a 34.4% increase compared to 2007. Total health expenditure amounted to 6.2% of GDP. The share of the public sector showed a decreasing trend between 2003 and 2008, while the share of the private sector, of which 95.77% was paid by households, showed an increasing trend within the same period. Out-of-pocket expenditure was 53.79% of total health expenditure. The total health expenditure per capita was US$284.00 based on the official exchange rate and US$683.10 based on the international exchange rate (1 US$ = 9,988 Rials). Conclusion: The share of the public and private sectors in financing the health system was imbalanced and did not meet international standards. The public share of total health expenditures has increased in recent years despite the 4th and 5th Development Plans. 
The inclusion of household health insurance fees and other service related expenses increases the public contribution to 73% of the

  13. Constitutive Description of 7075 Aluminum Alloy During Hot Deformation by Apparent and Physically-Based Approaches

    NASA Astrophysics Data System (ADS)

    Mirzadeh, Hamed

    2015-03-01

    Hot flow stress of 7075 aluminum alloy during compressive hot deformation was correlated to the Zener-Hollomon parameter through constitutive analyses based on the apparent approach and the proposed physically-based approach, which accounts for the dependence of the Young's modulus and the self-diffusion coefficient of aluminum on temperature. It was shown that the latter approach not only results in a more reliable constitutive equation, but also significantly simplifies the constitutive analysis, which in turn makes it possible to conduct comparative hot working studies. It was also demonstrated that the theoretical exponent of 5 and the lattice self-diffusion activation energy of aluminum (142 kJ/mol) can be set in the hyperbolic sine law to describe the peak flow stresses, and the resulting constitutive equation was found to be consistent with the one obtained from the proposed physically-based approach.

  14. A novel Bayesian approach to accounting for uncertainty in fMRI-derived estimates of cerebral oxygen metabolism fluctuations.

    PubMed

    Simon, Aaron B; Dubowitz, David J; Blockley, Nicholas P; Buxton, Richard B

    2016-04-01

    Calibrated blood oxygenation level dependent (BOLD) imaging is a multimodal functional MRI technique designed to estimate changes in cerebral oxygen metabolism from measured changes in cerebral blood flow and the BOLD signal. This technique addresses fundamental ambiguities associated with quantitative BOLD signal analysis; however, its dependence on biophysical modeling creates uncertainty in the resulting oxygen metabolism estimates. In this work, we developed a Bayesian approach to estimating the oxygen metabolism response to a neural stimulus and used it to examine the uncertainty that arises in calibrated BOLD estimation due to the presence of unmeasured model parameters. We applied our approach to estimate the CMRO2 response to a visual task using the traditional hypercapnia calibration experiment as well as to estimate the metabolic response to both a visual task and hypercapnia using the measurement of baseline apparent R2' as a calibration technique. Further, to examine the effects of cerebrospinal fluid (CSF) signal contamination on the measurement of apparent R2', we measured this parameter with and without CSF-nulling. We found that the two calibration techniques provided consistent estimates of the metabolic response on average, with a median R2'-based estimate of the metabolic response to CO2 of 1.4%, and R2'- and hypercapnia-calibrated estimates of the visual response of 27% and 24%, respectively. However, these estimates were sensitive to different sources of estimation uncertainty. The R2'-calibrated estimate was highly sensitive to CSF contamination and to uncertainty in unmeasured model parameters describing flow-volume coupling, capillary bed characteristics, and the iso-susceptibility saturation of blood. The hypercapnia-calibrated estimate was relatively insensitive to these parameters but highly sensitive to the assumed metabolic response to CO2. PMID:26790354

  15. Medical Researchers' Ancillary Care Obligations: The Relationship-Based Approach.

    PubMed

    Olson, Nate W

    2016-06-01

    In this article, I provide a new account of the basis of medical researchers' ancillary care obligations. Ancillary care in medical research, or medical care that research participants need but that is not required for the validity or safety of a study or to redress research injuries, is a topic that has drawn increasing attention in research ethics over the last ten years. My view, the relationship-based approach, improves on the main existing theory, Richardson and Belsky's 'partial-entrustment model', by avoiding its problematic restriction on the scope of health needs for which researchers could be obligated to provide ancillary care. Instead, it grounds ancillary care obligations in a wide range of morally relevant features of the researcher-participant relationship, including the level of engagement between researchers and participants, and weighs these factors against each other. I argue that the level of engagement, that is, the duration and intensity of interactions, between researchers and participants matters for ancillary care because of its connection to the meaningfulness of a relationship, and I suggest that other morally relevant features can be grounded in researchers' role obligations. PMID:26424512

  16. Measuring neuronal branching patterns using model-based approach.

    PubMed

    Luczak, Artur

    2010-01-01

    Neurons have complex branching systems which allow them to communicate with thousands of other neurons. Thus understanding neuronal geometry is clearly important for determining connectivity within the network and how this shapes neuronal function. One of the difficulties in uncovering relationships between neuronal shape and its function is the problem of quantifying complex neuronal geometry. Even by using multiple measures such as dendritic length, distribution of segments, and direction of branches, a description of three-dimensional neuronal embedding remains incomplete. To help alleviate this problem, here we propose a new measure, a shape diffusiveness index (SDI), to quantify spatial relations between branches at the local and global scale. It has been shown that growth of neuronal trees can be modeled using a diffusion limited aggregation (DLA) process. By measuring how easily the analyzed shape can be reproduced by the DLA algorithm, one can measure how "diffusive" that shape is. Intuitively, "diffusiveness" measures how tree-like a given shape is. For example, shapes like an oak tree will have high values of SDI. This measure captures an important feature of dendritic tree geometry which is difficult to assess with other measures. This approach also presents a paradigm shift from well-defined deterministic measures to model-based measures, which estimate how well a model with specific properties can account for features of the analyzed shape. PMID:21079752
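
    The DLA growth process the SDI builds on can be illustrated with a toy 2D lattice simulation. This sketch shows only the aggregation model, not the SDI itself (scoring how reproducible a given neuronal shape is under DLA); all parameters are illustrative.

```python
import math
import random

def dla_cluster(n_particles=25, launch_radius=15, seed=0):
    """Toy 2D lattice diffusion-limited aggregation (DLA): particles
    random-walk in from a launch ring and stick on first contact with
    the growing cluster, producing the branched, tree-like shapes that
    DLA is known for."""
    rng = random.Random(seed)
    cluster = {(0, 0)}                       # seed cell at the origin
    steps = ((1, 0), (-1, 0), (0, 1), (0, -1))
    kill_radius_sq = (2 * launch_radius) ** 2
    for _ in range(n_particles):
        stuck = False
        while not stuck:
            # launch from a random point on a circle around the cluster
            angle = rng.uniform(0.0, 2.0 * math.pi)
            x = round(launch_radius * math.cos(angle))
            y = round(launch_radius * math.sin(angle))
            while True:
                dx, dy = rng.choice(steps)
                x, y = x + dx, y + dy
                if x * x + y * y > kill_radius_sq:
                    break                    # wandered too far: relaunch
                if any((x + sx, y + sy) in cluster for sx, sy in steps):
                    cluster.add((x, y))      # touched the cluster: stick
                    stuck = True
                    break
    return cluster
```

    Repeatedly reproducing an observed branching shape with runs like this, and scoring how well the runs match, is the intuition behind the "diffusiveness" measure.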

  17. A Time-Based Account of the Perception of Odor Objects and Valences

    PubMed Central

    Olofsson, Jonas K.; Bowman, Nicholas E.; Khatibi, Katherine; Gottfried, Jay A.

    2013-01-01

    Is human odor perception guided by memory or emotion? Object-centered accounts predict that recognition of unique odor qualities precedes valence decoding. Valence-centered accounts predict the opposite: that stimulus-driven valence responses precede and guide identification. In a speeded response time study, participants smelled paired odors, presented sequentially, and indicated whether the second odor in each pair belonged to the same category as the first (object evaluation task) or whether the second odor was more pleasant than the first (valence evaluation task). Object evaluation was faster and more accurate than valence evaluation. In a complementary experiment, participants performed an identification task, in which they indicated whether an odor matched the previously presented word label. Responses were quicker for odors preceded by semantically matching, rather than nonmatching, word labels, but results showed no evidence of interference from valence on nonmatching trials. These results are in accordance with object-centered accounts of odor perception. PMID:22961773

  18. Assessment of Person Fit Using Resampling-Based Approaches

    ERIC Educational Resources Information Center

    Sinharay, Sandip

    2016-01-01

    De la Torre and Deng suggested a resampling-based approach for person-fit assessment (PFA). The approach involves the use of the [math equation unavailable] statistic, a corrected expected a posteriori estimate of the examinee ability, and the Monte Carlo (MC) resampling method. The Type I error rate of the approach was closer to the nominal level…

  19. Evaluating a pivot-based approach for bilingual lexicon extraction.

    PubMed

    Kim, Jae-Hoon; Kwon, Hong-Seok; Seo, Hyeong-Won

    2015-01-01

    A pivot-based approach for bilingual lexicon extraction is based on the similarity of context vectors represented by words in a pivot language like English. In this paper, in order to show the validity and usability of the pivot-based approach, we evaluate it in combination with two different methods for estimating context vectors: one estimates them from two parallel corpora based on word association between source words (resp., target words) and pivot words, and the other estimates them from two parallel corpora based on word alignment tools for statistical machine translation. Empirical results on two language pairs (Korean-Spanish and Korean-French) show that the pivot-based approach is very promising for resource-poor languages, confirming its validity and usability. Furthermore, our method also performs well for low-frequency words. PMID:25983745
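
    The core similarity step can be sketched as follows. The toy association scores below are hypothetical; in the paper they would be estimated from parallel corpora via word association or word alignment tools.

```python
import math

def cosine(u, v):
    """Cosine similarity of two sparse context vectors
    (dicts keyed by pivot-language word)."""
    dot = sum(u[w] * v[w] for w in u if w in v)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def best_translations(src_vec, tgt_vecs, k=2):
    """Rank target-language candidates for one source word by the
    similarity of their context vectors over shared pivot words."""
    ranked = sorted(tgt_vecs.items(),
                    key=lambda kv: cosine(src_vec, kv[1]), reverse=True)
    return ranked[:k]
```

    A source word and its correct translation tend to co-occur with the same pivot words, so their context vectors point in similar directions even though the two languages share no corpus directly.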

  1. A performance-based approach to landslide risk analysis

    NASA Astrophysics Data System (ADS)

    Romeo, R. W.

    2009-04-01

    An approach for the risk assessment based on a probabilistic analysis of the performance of structures threatened by landslides is shown and discussed. The risk is a possible loss due to the occurrence of a potentially damaging event. Analytically, the risk is the probability convolution of hazard, which defines the frequency of occurrence of the event (i.e., the demand), and fragility, which defines the capacity of the system to withstand the event given its characteristics (i.e., severity) and those of the exposed goods (vulnerability), that is: Risk = P(D ≥ d | S, V). The inequality sets a damage (or loss) threshold beyond which the system's performance is no longer met. Therefore a consistent approach to risk assessment should: 1) adopt a probabilistic model which takes into account all the uncertainties of the involved variables (capacity and demand), 2) follow a performance approach based on given loss or damage thresholds. The proposed method belongs to the category of semi-empirical ones: the theoretical component is given by the probabilistic capacity-demand model; the empirical component is given by the observed statistical behaviour of structures damaged by landslides. Two landslide properties alone are required: the area-extent and the type (or kinematism). All other properties required to determine the severity of landslides (such as depth, speed and frequency) are derived via probabilistic methods. The severity (or intensity) of landslides, in terms of kinetic energy, is the demand of resistance; the resistance capacity is given by the cumulative distribution functions of the limit state performance (fragility functions) assessed via damage surveys and card compilation. The investigated limit states are aesthetic (of nominal concern alone), functional (interruption of service) and structural (economic and social losses). 
The damage probability is the probabilistic convolution of hazard (the probability mass function of the frequency of occurrence of given
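
    In discrete form, the convolution of hazard and fragility reduces to a severity-weighted sum. The severity levels and probabilities below are hypothetical numbers for illustration only.

```python
def risk_of_exceedance(hazard_pmf, fragility):
    """Discrete convolution of hazard and fragility:
    Risk = sum over s of P(severity = s) * P(damage >= threshold | s).

    hazard_pmf : dict severity level -> probability of occurrence
    fragility  : dict severity level -> probability that the performance
                 threshold is exceeded at that severity
    """
    return sum(p * fragility.get(s, 0.0) for s, p in hazard_pmf.items())
```

    Each limit state (aesthetic, functional, structural) would have its own fragility function, yielding a separate risk figure per performance threshold.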

  2. Mission vs. Mandate: How Charter School Leaders Conceptualize and Address Market-Based and Performance-Based Accountability Demands

    ERIC Educational Resources Information Center

    Blitz, Mark H.

    2011-01-01

    Charter school research has examined the relationship between charter school mission and issues of school accountability. However, there is a lack of research focusing on how charter school leaders frame and solve problems regarding multiple accountability demands. Given this gap, I investigate the question: How do charter school leaders…

  3. School-Based Accountability and the Distribution of Teacher Quality across Grades in Elementary School

    ERIC Educational Resources Information Center

    Fuller, Sarah C.; Ladd, Helen F.

    2013-01-01

    We use North Carolina data to explore whether the quality of teachers in the lower elementary grades (K-2) falls short of teacher quality in the upper grades (3-5) and to examine the hypothesis that school accountability pressures contribute to such quality shortfalls. Our concern with the early grades arises from recent studies highlighting how…

  4. Teachers' Perceptions of the Impact of Performance-Based Accountability on Teacher Efficacy

    ERIC Educational Resources Information Center

    Gantt, Phyllis Elizabeth Crowley

    2012-01-01

    Implementation of state and federal high-stakes accountability measures such as end-of-course tests (EoCTs) has contributed to increased teacher stress in the classroom, decreased teacher creativity and autonomy, and reduced effectiveness. Prior research focused primarily on the elementary and middle school levels, so this study sought to examine…

  5. Accounting for Teamwork: A Critical Study of Group-Based Systems of Organizational Control.

    ERIC Educational Resources Information Center

    Ezzamel, Mahmoud; Willmott, Hugh

    1998-01-01

    Examines the role of accounting calculations in reorganizing manufacturing capabilities of a vertically integrated global retailing company. Introducing teamwork to replace line work extended traditional, hierarchical management control systems. Teamwork's self-managing demands contravened workers' established sense of self-identity as…

  6. Adapting Educational Measurement to the Demands of Test-Based Accountability

    ERIC Educational Resources Information Center

    Koretz, Daniel

    2015-01-01

    Accountability has become a primary function of large-scale testing in the United States. The pressure on educators to raise scores is vastly greater than it was several decades ago. Research has shown that high-stakes testing can generate behavioral responses that inflate scores, often severely. I argue that because of these responses, using…

  7. Performance-Based Incentives and the Behavior of Accounting Academics: Responding to Changes

    ERIC Educational Resources Information Center

    Moya, Soledad; Prior, Diego; Rodríguez-Pérez, Gonzalo

    2015-01-01

    When laws change the rules of the game, it is important to observe the effects on the players' behavior. Some effects can be anticipated while others are difficult to enunciate before the law comes into force. In this paper we have analyzed articles authored by Spanish accounting academics between 1996 and 2005 to assess the impact of a change in…

  8. A Strength-Based Approach to Teacher Professional Development

    ERIC Educational Resources Information Center

    Zwart, Rosanne C.; Korthagen, Fred A. J.; Attema-Noordewier, Saskia

    2015-01-01

    Based on positive psychology, self-determination theory and a perspective on teacher quality, this study proposes and examines a strength-based approach to teacher professional development. A mixed method pre-test/post-test design was adopted to study perceived outcomes of the approach for 93 teachers of six primary schools in the Netherlands and…

  9. EFL Reading Instruction: Communicative Task-Based Approach

    ERIC Educational Resources Information Center

    Sidek, Harison Mohd

    2012-01-01

    The purpose of this study was to examine the overarching framework of EFL (English as a Foreign Language) reading instructional approach reflected in an EFL secondary school curriculum in Malaysia. Based on such analysis, a comparison was made if Communicative Task-Based Language is the overarching instructional approach for the Malaysian EFL…

  10. Human Rights Education in Japan: An Historical Account, Characteristics and Suggestions for a Better-Balanced Approach

    ERIC Educational Resources Information Center

    Takeda, Sachiko

    2012-01-01

    Although human rights are often expressed as universal tenets, the concept was conceived in a particular socio-political and historical context. Conceptualisations and practice of human rights vary across societies, and face numerous challenges. After providing an historical account of the conceptualisation of human rights in Japanese society,…

  11. 20 CFR 404.408 - Reduction of benefits based on disability on account of receipt of certain other disability...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... insurance benefit is also entitled to periodic benefits under a workers' compensation law or plan of the... periodic benefit (including workers' compensation or any other payments based on a work relationship) on account of a total or partial disability (whether or not permanent) under a law or plan of the...

  12. 20 CFR 404.408 - Reduction of benefits based on disability on account of receipt of certain other disability...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... insurance benefit is also entitled to periodic benefits under a workers' compensation law or plan of the... periodic benefit (including workers' compensation or any other payments based on a work relationship) on account of a total or partial disability (whether or not permanent) under a law or plan of the...

  13. 20 CFR 404.408 - Reduction of benefits based on disability on account of receipt of certain other disability...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... insurance benefit is also entitled to periodic benefits under a workers' compensation law or plan of the... periodic benefit (including workers' compensation or any other payments based on a work relationship) on account of a total or partial disability (whether or not permanent) under a law or plan of the...

  14. 20 CFR 404.408 - Reduction of benefits based on disability on account of receipt of certain other disability...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... insurance benefit is also entitled to periodic benefits under a workers' compensation law or plan of the... periodic benefit (including workers' compensation or any other payments based on a work relationship) on account of a total or partial disability (whether or not permanent) under a law or plan of the...

  15. 20 CFR 404.408 - Reduction of benefits based on disability on account of receipt of certain other disability...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... insurance benefit is also entitled to periodic benefits under a workers' compensation law or plan of the... periodic benefit (including workers' compensation or any other payments based on a work relationship) on account of a total or partial disability (whether or not permanent) under a law or plan of the...

  16. Crafting Coherence from Complex Policy Messages: Educators' Perceptions of Special Education and Standards-Based Accountability Policies

    ERIC Educational Resources Information Center

    Russell, Jennifer Lin; Bray, Laura E.

    2013-01-01

    Federal special education and accountability policies require that educators individualize instruction for students with disabilities, while simultaneously ensuring that the vast majority of these students meet age-based grade-level standards and assessment targets. In this paper, we examine this dynamic interplay between policies through…

  17. The Effects of Project Based Learning on 21st Century Skills and No Child Left Behind Accountability Standards

    ERIC Educational Resources Information Center

    Holmes, Lisa Marie

    2012-01-01

    The purpose of this study was to determine ways "Digital Biographies," a Project Based Learning Unit, developed 21st century skills while simultaneously supporting NCLB accountability standards. The main goal of this study was to inform professional practice by exploring ways to address two separate, seemingly opposing, demands of…

  18. Component design bases - A template approach

    SciTech Connect

    Pabst, L.F.; Strickland, K.M.

    1991-01-01

    A well-documented nuclear plant design basis can enhance plant safety and availability. Older plants, however, often lack historical evidence of the original design intent, particularly for individual components. Most plant documentation describes the actual design (what is) rather than the bounding limits of the design. Without knowledge of these design limits, information from system descriptions and equipment specifications is often interpreted as inviolate design requirements. Such interpretations may lead to unnecessary design conservatism in plant modifications and unnecessary restrictions on plant operation. In 1986, Florida Power & Light Company's (FP&L's) Turkey Point plant embarked on one of the first design basis reconstitution programs in the United States to catalog the true design requirements. As the program developed, design basis users expressed a need for additional information at the component level. This paper outlines a structured (template) approach to develop useful component design basis information (including the WHYs behind the design).

  19. Context-Based Chemistry: The Salters Approach

    ERIC Educational Resources Information Center

    Bennett, Judith; Lubben, Fred

    2006-01-01

    This paper describes briefly the development and key features of one of the major context-based courses for upper high school students, Salters Advanced Chemistry. It goes on to consider the research evidence on the impact of the course, focusing on teachers' views, and, in particular, on students' affective and cognitive responses. The research…

  20. PBL Approach in Web-Based Instruction

    ERIC Educational Resources Information Center

    ChanLin, Lih-Juan; Chan, Kung-Chi

    2004-01-01

    Web-Based Instruction is increasingly being recognized as a means of teaching and learning. In dietetics, the interactions between drugs and nutrients are complex due to the wide variety of drugs and their mechanisms and interactions with nutrients. How to help student professionals acquire the necessary skills and knowledge is important in a dietetic…

  1. A model-based multisensor data fusion knowledge management approach

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2014-06-01

    A variety of approaches exist for combining data from multiple sensors. The model-based approach combines data based on its support for or refutation of elements of the model, which in turn can be used to evaluate an experimental thesis. This paper presents a collection of algorithms for mapping various types of sensor data onto a thesis-based model and evaluating the truth or falsity of the thesis, based on the model. The use of this approach for autonomously arriving at findings and for prioritizing data is considered. Techniques for updating the model (instead of arriving at a true/false assertion) are also discussed.
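
    A minimal sketch of the support/refutation idea, under stated assumptions: the weighted model elements, the [-1, 1] evidence scale, and the threshold rule are illustrative choices, not the paper's actual algorithms.

```python
def evaluate_thesis(model, observations, threshold=0.0):
    """Map sensor observations onto a thesis-based model: each model
    element accumulates support (positive) or refutation (negative)
    evidence, and the thesis is judged true if the net weighted
    support exceeds the threshold.

    model        : dict element -> weight of that element in the thesis
    observations : list of (element, evidence) pairs, evidence in [-1, 1]
    Returns (verdict, score).
    """
    score = sum(model.get(elem, 0.0) * ev for elem, ev in observations)
    return score > threshold, score
```

    Data that bears on heavily weighted model elements moves the score most, which is also the basis for prioritizing which data to collect next.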

  2. A spatiotemporal dengue fever early warning model accounting for nonlinear associations with meteorological factors: a Bayesian maximum entropy approach

    NASA Astrophysics Data System (ADS)

    Lee, Chieh-Han; Yu, Hwa-Lung; Chien, Lung-Chang

    2014-05-01

    Dengue fever has been identified as one of the most widespread vector-borne diseases in tropical and sub-tropical regions. In the last decade, dengue has been an emerging infectious disease epidemic in Taiwan, especially in the southern area, which has high annual incidence. For the purpose of disease prevention and control, an early warning system is urgently needed. Previous studies have shown significant relationships between climate variables, in particular rainfall and temperature, and the temporal epidemic patterns of dengue cases. However, the transmission of dengue fever is a complex interactive process whose composite space-time effects have mostly been understated. This study proposes a one-week-ahead warning system for dengue fever epidemics in southern Taiwan that considers nonlinear associations between weekly dengue cases and meteorological factors across space and time. The early warning system is based on an integration of a distributed lag nonlinear model (DLNM) and stochastic Bayesian Maximum Entropy (BME) analysis. The study identified the most significant meteorological measures, including weekly minimum temperature and maximum 24-hour rainfall, with lagged effects of up to 15 weeks on the variation in dengue cases under conditions of uncertainty. Subsequently, the combination of nonlinear lagged effects of climate variables and a space-time dependence function is implemented via a Bayesian framework to predict dengue fever occurrences in southern Taiwan during 2012. The results show the early warning system is useful for providing potential outbreak spatio-temporal predictions of the dengue fever distribution. In conclusion, the proposed approach can provide a practical disease control tool for environmental regulators seeking more effective strategies for dengue fever prevention.
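
    The lag structure such a model smooths over can be illustrated by building a distributed-lag design matrix from a weekly meteorological series. This is only the feature-construction step, a sketch assuming a simple list input; the DLNM smoothing and BME space-time estimation are not shown.

```python
def lagged_features(series, max_lag):
    """Build a distributed-lag design matrix: row t holds the values of
    the meteorological series at lags 1..max_lag (e.g. 15 weeks), the
    inputs a DLNM-style model would apply nonlinear lag smoothing to.
    Rows start at t = max_lag so every lag is observed."""
    rows = []
    for t in range(max_lag, len(series)):
        rows.append([series[t - lag] for lag in range(1, max_lag + 1)])
    return rows
```

    Each row pairs the current week's case count with the preceding weeks' weather, so a regression over these rows can estimate how effects are distributed across the lag window.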

  3. Accountable Professional Practice in ELT

    ERIC Educational Resources Information Center

    Farmer, Frank

    2006-01-01

    Professionalism is widely thought to be desirable in ELT, and at the same time institutions are taking seriously the need to evaluate their teachers. This article presents a general approach to professionalism focused on the accountability of the professional to the client based on TESOL's (2000) classification of adult ELT within eight general…

  4. Increasing accountability to drive improvement.

    PubMed

    Abbott, John G

    2012-01-01

    Governments need to be more strategic in their approach to healthcare and ensure greater accountability for the performance of their health systems. They can start by agreeing on a pan-Canadian vision for the health of Canadians and the services to be provided, accompanied by explicit policy goals, evidence-based performance targets, and more transparent public reporting. PMID:23387134

  5. Cost unit accounting based on a clinical pathway: a practical tool for DRG implementation.

    PubMed

    Feyrer, R; Rösch, J; Weyand, M; Kunzmann, U

    2005-10-01

    Setting up a reliable cost unit accounting system in a hospital is a fundamental necessity for economic survival, given the current general conditions in the healthcare system. Definition of a suitable cost unit is a crucial factor for success. We present here the development and use of a clinical pathway as a cost unit as an alternative to the DRG. Elective coronary artery bypass grafting was selected as an example. Development of the clinical pathway was conducted according to a modular concept that mirrored all the treatment processes across various levels and modules. Using service records and analyses, the process algorithms of the clinical pathway were developed and visualized with Corel iGrafx Process 2003. A detailed process cost record constituted the basis of the pathway costing, in which financial evaluation of the treatment processes was performed. The result of this study was a structured clinical pathway for coronary artery bypass grafting together with a cost calculation in the form of cost unit accounting. The use of a clinical pathway as a cost unit offers considerable advantages compared to the DRG or clinical case. The variance in the diagnoses and procedures within a pathway is minimal, so the consumption of resources is homogeneous. This leads to a considerable improvement in the value of cost unit accounting as a strategic control instrument in hospitals. PMID:16208610
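
    The modular costing idea can be sketched as summing financially evaluated resource items per process module and then over the pathway. The module names, resource items, and prices below are hypothetical; the paper's detailed process cost records are far richer.

```python
def pathway_cost(modules):
    """Cost-unit accounting over a clinical pathway.

    modules : dict module name -> list of (resource, quantity, unit_cost)
    Returns (cost per module, total pathway cost), i.e. the pathway
    serves as the cost unit instead of the DRG or clinical case.
    """
    module_costs = {name: sum(qty * unit_cost for _, qty, unit_cost in items)
                    for name, items in modules.items()}
    return module_costs, sum(module_costs.values())
```

    Because the procedures within one pathway are homogeneous, these per-module sums stay comparable across patients, which is what makes the pathway useful as a control instrument.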

  6. [Evidence-based medicine: an epistemological approach].

    PubMed

    Henao, Daniel Eduardo; Jaimes, Fabián Alberto

    2009-03-01

    Evidence-based medicine combines the physician's experience with the best scientific evidence to make medical decisions. This proposal has been widely promulgated by medical opinion leaders. Despite a large literature supporting this practice, a formal discussion has not been established regarding its epistemological consequences in daily medical work. The main proposal of evidence-based medicine consists of choosing the best medical decision according to the best available results from scientific studies. Herein, the goal was to highlight the inappropriate application to clinical science of the scientific method used in physics. The inaccuracy resides in describing health and disease in strictly numeric equivalents that can be homogenized on a continuous scale. Finally, the authors consider each diseased human being as a complex system, unique and particular, defined by a historical background as well as a current context. Therefore, evidence-based medicine possesses certain limitations that must be recognized in order to provide better health care to patients. PMID:19753837

  7. Wavelet-based approach to character skeleton.

    PubMed

    You, Xinge; Tang, Yuan Yan

    2007-05-01

    Character skeletons play a significant role in character recognition. The strokes of a character may consist of two regions, i.e., singular and regular regions. The intersections and junctions of the strokes belong to the singular region, while the straight and smooth parts of the strokes are categorized as the regular region. Therefore, a skeletonization method requires two different processes to treat the skeletons in these two different regions. All traditional skeletonization algorithms are based on the symmetry analysis technique. The major problems of these methods are as follows. 1) The computation of the primary skeleton in the regular region is indirect, so its implementation is sophisticated and costly. 2) The extracted skeleton cannot be exactly located on the central line of the stroke. 3) The captured skeleton in the singular region may be distorted by artifacts and branches. To overcome these problems, a novel scheme for extracting the skeleton of a character based on the wavelet transform is presented in this paper. This scheme consists of two main steps, namely: a) extraction of the primary skeleton in the regular region and b) amendment processing of the primary skeletons and connection of them in the singular region. A direct technique is used in the first step, where a new wavelet-based symmetry analysis is developed for finding the central line of the stroke directly. A novel method called smooth interpolation is designed in the second step, where a smooth operation is applied to the primary skeleton and, thereafter, an interpolation compensation technique is proposed to link the primary skeleton, so that the skeleton in the singular region can be produced. Experiments are conducted and positive results are achieved, which show that the proposed skeletonization scheme is applicable not only to binary images but also to gray-level images, and that the skeleton is robust against noise and affine transforms. PMID:17491454

  8. Theory-Based Approaches to the Concept of Life

    ERIC Educational Resources Information Center

    El-Hani, Charbel Nino

    2008-01-01

    In this paper, I argue that characterisations of life through lists of properties have several shortcomings and should be replaced by theory-based accounts that explain the coexistence of a set of properties in living beings. The concept of life should acquire its meaning from its relationships with other concepts inside a theory. I illustrate…

  9. Arts-based and creative approaches to dementia care.

    PubMed

    McGreevy, Jessica

    2016-02-01

    This article presents a review of arts-based and creative approaches to dementia care as an alternative to antipsychotic medications. While use of antipsychotics may be appropriate for some people, the literature highlights the success of creative approaches and the benefit that they lack the negative side effects associated with antipsychotics. The focus is the use of biographical approaches, music, dance and movement to improve wellbeing, enhance social networks, support inclusive practice and enable participation. Staff must be trained to use these approaches. A case study is presented to demonstrate how creative approaches can be implemented in practice and the outcomes that can be expected when used appropriately. PMID:26938607

  10. Differentiating between rights-based and relational ethical approaches.

    PubMed

    Trobec, Irena; Herbst, Majda; Zvanut, Bostjan

    2009-05-01

    When forced treatment in mental health care is under consideration, two approaches guide clinicians in their actions: the dominant rights-based approach and the relational ethical approach. We hypothesized that nurses with bachelor's degrees differentiate better between the two approaches than nurses without a degree. To test this hypothesis a survey was performed in major Slovenian health institutions. We found that nurses emphasize the importance of ethics and personal values, but 55.4% of all the nurse participants confused the two approaches. The results confirmed our hypothesis and indicate the importance of nurses' formal education, especially when caring for patients with mental illness. PMID:19372123

  11. Union Exon Based Approach for RNA-Seq Gene Quantification: To Be or Not to Be?

    PubMed Central

    Zhao, Shanrong; Xi, Li; Zhang, Baohong

    2015-01-01

    In recent years, RNA-seq has emerged as a powerful technology for estimating gene and/or transcript expression, and RPKM (Reads Per Kilobase per Million reads) is widely used to represent the relative abundance of mRNAs for a gene. In general, methods for gene quantification can be divided into two categories: the transcript-based approach and the ‘union exon’-based approach. The transcript-based approach is intrinsically more difficult because different isoforms of a gene typically have a high proportion of genomic overlap. On the other hand, the ‘union exon’-based approach is much simpler and thus widely used in RNA-seq gene quantification. Biologically, a gene is expressed in one or more transcript isoforms; the transcript-based approach is therefore the more meaningful of the two. Although gene quantification is a fundamental task in most RNA-seq studies, it remains unclear whether the ‘union exon’-based approach is good practice or not. In this paper, we carried out a side-by-side comparison of the ‘union exon’-based approach and the transcript-based method in RNA-seq gene quantification. We found that gene expression levels are significantly underestimated by the ‘union exon’-based approach: the average RPKM from the ‘union exon’-based method is less than 50% of the mean expression obtained from the transcript-based approach. The difference between the two approaches is primarily affected by the number of transcripts in a gene. We performed differential analysis at both the gene and transcript levels and found that more insights, such as isoform switches, are gained from isoform differential analysis. The accuracy of isoform quantification would improve if the read coverage pattern and exon-exon spanning reads were taken into account and incorporated into the EM (Expectation Maximization) algorithm. Our investigation discourages the use of
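
    The RPKM normalization at the center of this comparison is straightforward to state in code. The sketch below uses made-up read counts and lengths (not data from the paper) to show why a ‘union exon’ length, which is at least as long as any single isoform, systematically lowers the reported RPKM for the same number of reads:

```python
# RPKM = reads * 10^9 / (feature length in bp * total mapped reads).
# All numbers below are illustrative, not from the study.

def rpkm(read_count, feature_length_bp, total_mapped_reads):
    """Reads Per Kilobase of feature per Million mapped reads."""
    return read_count * 1e9 / (feature_length_bp * total_mapped_reads)

total_reads = 20_000_000   # library size (mapped reads)
gene_reads = 500           # reads assigned to the gene

union_exon_len = 4000      # merged ('union exon') length of all exons, bp
isoform_len = 1500         # length of the dominant transcript, bp

print(rpkm(gene_reads, union_exon_len, total_reads))  # 6.25
print(rpkm(gene_reads, isoform_len, total_reads))     # ~16.7
```

    Dividing the same read count by the longer union-exon length is exactly the mechanism behind the underestimation the authors report.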

  12. Ameliorated GA approach for base station planning

    NASA Astrophysics Data System (ADS)

    Wang, Andong; Sun, Hongyue; Wu, Xiaomin

    2011-10-01

    In this paper, we aim to locate base stations (BS) rationally so as to satisfy the most customers using the fewest BSs. An ameliorated GA is proposed to search for the optimum solution. In the algorithm, we mesh the area to be planned according to the least overlap length derived from the coverage radius, introduce an isometric grid encoding method to represent the BS distribution as well as its number, and develop selection, crossover and mutation operators to serve our specific needs. We also construct a comprehensive objective function that synthesizes coverage ratio, overlap ratio, population and geographical conditions. Finally, after importing an electronic map of the area to be planned, a recommended strategy draft is exported correspondingly. We use Hong Kong, China, for simulation and obtain a satisfactory solution.
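
    A composite objective of the kind described above can be sketched as follows. This is a generic illustration, not the paper's actual function: the grid, radius, and weights are assumptions, and the population and geographical terms are omitted.

```python
# Sketch of a coverage/overlap objective on a meshed planning area.
# Weights, grid, and radius are illustrative assumptions.
import numpy as np

def objective(stations, grid_x, grid_y, radius, w_cover=1.0, w_overlap=0.3):
    """stations: array of (x, y) BS positions; returns a score to maximize."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    # Count how many base stations cover each grid cell.
    counts = np.zeros_like(gx, dtype=int)
    for sx, sy in stations:
        counts += (np.hypot(gx - sx, gy - sy) <= radius)
    coverage = (counts >= 1).mean()   # fraction of cells covered at all
    overlap = (counts >= 2).mean()    # fraction covered more than once
    # Reward coverage, penalize wasteful overlap.
    return w_cover * coverage - w_overlap * overlap

grid = np.linspace(0.0, 10.0, 101)   # 10 x 10 area meshed into cells
score = objective(np.array([[3.0, 5.0], [7.0, 5.0]]), grid, grid, radius=3.0)
```

    A GA would then evolve the `stations` array (its length encoding the BS count) against this score.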

  13. Engineering application based on fuzzy approach

    NASA Astrophysics Data System (ADS)

    Pislaru, Marius; Avasilcai, Silvia; Trandabat, Alexandru

    2011-12-01

    The article focuses on an application in chemical engineering: a fuzzy modeling methodology designed to determine two characteristics of a chemical compound (a ferrocenylsiloxane polyamide) relevant to self-assembly, surface tension and maximum UV absorbance, measured as functions of temperature and concentration. One of the most important features of a fuzzy rule-based inference system for determining the polyamide solution characteristics is that it allows the knowledge contained in the model to be interpreted and improved with a priori knowledge. The results obtained through the proposed method are highly accurate, and the method can be optimized by utilizing the information available during the modeling process. The results showed that applying a Mamdani fuzzy inference system to the estimation of the optical and surface properties of a polyamide solution is feasible in theory and reliable in calculation.

  14. Engineering application based on fuzzy approach

    NASA Astrophysics Data System (ADS)

    Pislaru, Marius; Avasilcai, Silvia; Trandabat, Alexandru

    2012-01-01

    The article focuses on an application in chemical engineering: a fuzzy modeling methodology designed to determine two characteristics of a chemical compound (a ferrocenylsiloxane polyamide) relevant to self-assembly, surface tension and maximum UV absorbance, measured as functions of temperature and concentration. One of the most important features of a fuzzy rule-based inference system for determining the polyamide solution characteristics is that it allows the knowledge contained in the model to be interpreted and improved with a priori knowledge. The results obtained through the proposed method are highly accurate, and the method can be optimized by utilizing the information available during the modeling process. The results showed that applying a Mamdani fuzzy inference system to the estimation of the optical and surface properties of a polyamide solution is feasible in theory and reliable in calculation.

  15. Multiresolution approach based on projection matrices

    SciTech Connect

    Vargas, Javier; Quiroga, Juan Antonio

    2009-03-01

    Active triangulation measurement systems with a rigid geometric configuration are inappropriate for scanning large objects with low measuring tolerances. The reason is that the ratio between the depth recovery error and the lateral extension is a constant that depends on the geometric setup. As a consequence, measuring large areas with low depth recovery error requires the use of multiresolution techniques. We propose a multiresolution technique based on a previously calibrated camera-projector system. The method consists of changing the camera's or projector's parameters in order to increase the system's depth sensitivity. A subpixel retroprojection error in the self-calibration process and a decrease of approximately one order of magnitude in the depth recovery error can be achieved using the proposed method.

  16. Systems-based approaches toward wound healing

    PubMed Central

    Buganza-Tepole, Adrian; Kuhl, Ellen

    2013-01-01

    Wound healing in the pediatric patient is of utmost clinical and social importance, since hypertrophic scarring can have aesthetic and psychological sequelae, from early childhood to late adolescence. Wound healing is a well-orchestrated reparative response affecting the damaged tissue at the cellular, tissue, organ, and system scales. While tremendous progress has been made towards understanding wound healing at the individual temporal and spatial scales, its effects across the scales remain severely understudied and poorly understood. Here we discuss the critical need for systems-based computational modeling of wound healing across the scales, from short-term to long-term and from small to large. We illustrate the state of the art in systems modeling by means of three key signaling mechanisms: oxygen tension regulating angiogenesis and revascularization; TGF-β kinetics controlling collagen deposition; and mechanical stretch stimulating cellular mitosis and extracellular matrix remodeling. The complex network of biochemical and biomechanical signaling mechanisms and the multi-scale character of the healing process make systems modeling an integral tool in exploring personalized strategies for wound repair. A better mechanistic understanding of wound healing in the pediatric patient could open new avenues in treating children with skin disorders such as birth defects, skin cancer, wounds, and burn injuries. PMID:23314298

  17. Enuresis in children: a case based approach.

    PubMed

    Baird, Drew C; Seehusen, Dean A; Bode, David V

    2014-10-15

    Enuresis is defined as intermittent urinary incontinence during sleep in a child at least five years of age. Approximately 5% to 10% of all seven-year-olds have enuresis, and an estimated 5 to 7 million children in the United States have enuresis. The pathophysiology of primary nocturnal enuresis involves the inability to awaken from sleep in response to a full bladder, coupled with excessive nighttime urine production or a decreased functional capacity of the bladder. Initial evaluation should include a history, physical examination, and urinalysis. Several conditions, such as constipation, obstructive sleep apnea, diabetes mellitus, diabetes insipidus, chronic kidney disease, and psychiatric disorders, are associated with enuresis. If identified, these conditions should be evaluated and treated. Treatment of primary monosymptomatic enuresis (i.e., the only symptom is nocturnal bed-wetting in a child who has never been dry) begins with counseling the child and parents on effective behavioral modifications. First-line treatments for enuresis include bed alarm therapy and desmopressin. The choice of therapy is based on the child's age and nighttime voiding patterns, and the desires of the child and family. Referral to a pediatric urologist is indicated for children with primary enuresis refractory to standard and combination therapies, and for children with some secondary causes of enuresis, including urinary tract malformations, recurrent urinary tract infections, or neurologic disorders. PMID:25369644

  18. Modeling Site-Based Decision Making: School Practices in the Age of Accountability

    ERIC Educational Resources Information Center

    Bauer, Scott C.; Bogotch, Ira E.

    2006-01-01

    Purpose: The primary purpose is to present empirical measures of variables relating to practices engaged in by site-based teams, and then to use these variables to test a model predicting significant outcomes of site-based decision making. The practice variables of site-based management (SBM) teams are essential in promoting research within a…

  19. Airframe integrity based on Bayesian approach

    NASA Astrophysics Data System (ADS)

    Hurtado Cahuao, Jose Luis

    Aircraft aging has become an immense challenge in terms of ensuring the safety of the fleet while controlling life cycle costs. One of the major concerns in aircraft structures is the development of fatigue cracks in the fastener holes. A probabilistic-based method has been proposed to manage this problem. In this research, Bayes' theorem is used to assess airframe integrity by updating generic data with airframe inspection data as such data are compiled. This research discusses the methodology developed for assessment of loss of airframe integrity due to fatigue cracking in the fastener holes of an aging platform. The methodology requires a probability density function (pdf) at the end of SAFE life. Subsequently, a crack growth regime begins. As the Bayesian analysis requires information on a prior initial crack size pdf, such a pdf is assumed and verified to be lognormally distributed. The prior distribution of crack size as cracks grow is modeled through a combined Inverse Power Law (IPL) model and lognormal relationships. The first set of inspections is used as the evidence for updating the crack size distribution at the various stages of aircraft life. Moreover, the materials used in the structural parts of the aircraft have variations in their properties due to calibration errors and machine alignment. A Matlab routine (PCGROW) is developed to calculate the crack distribution growth through three different crack growth models. As the first step, the material properties and the initial crack size are sampled; a standard Monte Carlo simulation is employed for this sampling process. At the corresponding aircraft age, the crack observed during the inspections is used to update the crack size distribution and proceed in time. After the updating, it is possible to estimate the probability of structural failure as a function of flight hours for a given aircraft in the future. The results show very accurate and useful values related to the reliability
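
    The core Bayesian step, updating a lognormal crack-size prior with inspection evidence, can be sketched on a grid. All numbers below (prior parameters, measurement, error model) are hypothetical and only illustrate the mechanism, not the study's PCGROW routine:

```python
# Grid-based Bayes update: posterior ∝ prior * likelihood, then normalize.
# Prior: lognormal crack size; evidence: one inspection with Gaussian error.
import numpy as np

sizes = np.linspace(0.01, 5.0, 2000)   # crack-size grid, mm (illustrative)
dx = sizes[1] - sizes[0]

def lognorm_pdf(x, mu, sigma):
    """Lognormal density with log-mean mu and log-sd sigma."""
    return (np.exp(-(np.log(x) - mu) ** 2 / (2 * sigma ** 2))
            / (x * sigma * np.sqrt(2 * np.pi)))

prior = lognorm_pdf(sizes, np.log(0.5), 0.4)   # median 0.5 mm (assumed)

meas, meas_sd = 0.8, 0.15                      # inspection: 0.8 mm +/- 0.15
likelihood = np.exp(-(sizes - meas) ** 2 / (2 * meas_sd ** 2))

posterior = prior * likelihood
posterior /= posterior.sum() * dx              # normalize to a density
post_mean = (sizes * posterior).sum() * dx     # pulled from 0.5 toward 0.8
```

    Repeating this update at each inspection, with the prior propagated forward by a crack growth model between inspections, gives the evolving failure probability described in the abstract.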

  20. Simulation-Based Constructivist Approach for Education Leaders

    ERIC Educational Resources Information Center

    Shapira-Lishchinsky, Orly

    2015-01-01

    The purpose of this study was to reflect the leadership strategies that may arise using a constructivist approach based on organizational learning. This approach involved the use of simulations that focused on ethical tensions in school principals' daily experiences, and the development of codes of ethical conduct to reduce these tensions. The…

  1. Interteaching: An Evidence-Based Approach to Instruction

    ERIC Educational Resources Information Center

    Brown, Thomas Wade; Killingsworth, Kenneth; Alavosius, Mark P.

    2014-01-01

    This paper describes "interteaching" as an evidence-based method of instruction. Instructors often rely on more traditional approaches, such as lectures, as means to deliver instruction. Despite high usage, these methods are ineffective at achieving desirable academic outcomes. We discuss an innovative approach to delivering instruction…

  2. A Competency-Based Approach to Business Exploration.

    ERIC Educational Resources Information Center

    Demaio, Genene; Ackley, R. Jon

    1982-01-01

    The importance of active student involvement is stressed in this discussion of a competency-based approach to career exploration. Five steps used in developing a one-semester, eighth-grade business exploration course are presented and described. Methods of teaching such a unit and advantages to this approach are discussed. (CT)

  3. Theory Based Approaches to Learning. Implications for Adult Educators.

    ERIC Educational Resources Information Center

    Bolton, Elizabeth B.; Jones, Edward V.

    This paper presents a codification of theory-based approaches that are applicable to adult learning situations. It also lists some general guidelines that can be used when selecting a particular approach or theory as a basis for planning instruction. Adult education's emphasis on practicality and the relationship between theory and practice is…

  4. Bare Forms and Lexical Insertions in Code-Switching: A Processing-Based Account

    ERIC Educational Resources Information Center

    Owens, Jonathan

    2005-01-01

    Bare forms (or [slashed O] forms), uninflected lexical L2 insertions in contexts where the matrix language expects morphological marking, have been recognized as an anomaly in different approaches to code-switching. Myers-Scotton (1997, 2002) has explained their existence in terms of structural incongruity between the matrix and embedded…

  5. Reimagining Kindergarten: Restoring a Developmental Approach when Accountability Demands Are Pushing Formal Instruction on the Youngest Learners

    ERIC Educational Resources Information Center

    Graue, Elizabeth

    2009-01-01

    The traditional kindergarten program often reflected a rich but generic approach with creative contexts for typical kindergartners organized around materials (manipulatives or dramatic play) or a developmental area (fine motor or language). The purpose of kindergarten reflected beliefs about how children learn, specialized training for…

  6. Site-Based Management versus Systems-Based Thinking: The Impact of Data-Driven Accountability and Reform

    ERIC Educational Resources Information Center

    Mette, Ian M.; Bengtson, Ed

    2015-01-01

    This case was written to help prepare building-level and central office administrators who are expected to effectively lead schools and systems in an often tumultuous world of educational accountability and reform. The intent of this case study is to allow educators to examine the impact data management has on the types of thinking required when…

  7. A unified account of tilt illusions, association fields, and contour detection based on elastica.

    PubMed

    Keemink, Sander W; van Rossum, Mark C W

    2016-09-01

    As expressed in the Gestalt law of good continuation, human perception tends to associate stimuli that form smooth continuations. Contextual modulation in primary visual cortex, in the form of association fields, is believed to play an important role in this process. Yet a unified and principled account of the good continuation law on the neural level is lacking. In this study we introduce a population model of primary visual cortex. Its contextual interactions depend on the elastica curvature energy of the smoothest contour connecting oriented bars. As expected, this model leads to association fields consistent with data. In addition, however, the model displays tilt illusions for stimulus configurations with gratings and single bars that closely match psychophysics. Furthermore, the model explains not only pop-out of contours amid a variety of backgrounds, but also pop-out of single targets amid a uniform background. We thus propose that elastica is a unifying principle of the visual cortical network. PMID:26232611
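
    The elastica curvature energy underlying the model is E = ∫ κ(s)² ds along a contour. A minimal numerical sketch (a generic discretization, not the paper's implementation): for a circle of radius R, κ = 1/R everywhere, so E = 2πR / R² = 2π/R, which the discrete estimate should recover.

```python
# Discrete elastica energy of a sampled planar curve (x(t), y(t)).
import numpy as np

def elastica_energy(x, y):
    """Approximate ∫ κ² ds via per-sample derivatives (np.gradient)."""
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    speed = np.hypot(dx, dy)                     # ds per sample
    kappa = (dx * ddy - dy * ddx) / speed ** 3   # signed curvature
    return float(np.sum(kappa ** 2 * speed))

t = np.linspace(0.0, 2.0 * np.pi, 4001)
R = 2.0
E = elastica_energy(R * np.cos(t), R * np.sin(t))  # expect about 2*pi/R = pi
```

    In the model, the smoother (lower-energy) the contour connecting two oriented bars, the stronger their contextual interaction.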

  8. Spectral Neugebauer-based color halftone prediction model accounting for paper fluorescence.

    PubMed

    Hersch, Roger David

    2014-08-20

    We present a spectral model for predicting the fluorescent emission and the total reflectance of color halftones printed on optically brightened paper. By relying on extended Neugebauer models, the proposed model accounts for the attenuation by the ink halftones of both the incident exciting light in the UV wavelength range and the emerging fluorescent emission in the visible wavelength range. The total reflectance is predicted by adding the predicted fluorescent emission relative to the incident light and the pure reflectance predicted with an ink-spreading enhanced Yule-Nielsen modified Neugebauer reflectance prediction model. The predicted fluorescent emission spectrum as a function of the amounts of cyan, magenta, and yellow inks is very accurate. It can be useful to paper and ink manufacturers who would like to study in detail the contribution of the fluorescent brighteners and the attenuation of the fluorescent emission by ink halftones. PMID:25321109
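
    The Yule-Nielsen modified spectral Neugebauer prediction that the reflectance part of this model builds on is R(λ) = (Σᵢ aᵢ · Rᵢ(λ)^(1/n))ⁿ, summing over Neugebauer primaries with fractional coverages aᵢ. The sketch below uses toy three-band spectra and an assumed n; it does not include the paper's ink-spreading or fluorescence terms:

```python
# Yule-Nielsen modified spectral Neugebauer (YNSN) reflectance sketch.
# Spectra and n-value are illustrative, not measured data from the paper.
import numpy as np

def ynsn_reflectance(areas, colorant_reflectances, n):
    """areas: fractional coverages of the Neugebauer primaries (sum to 1);
    colorant_reflectances: per-primary spectra, shape (primaries, bands)."""
    R = np.asarray(colorant_reflectances, dtype=float)
    a = np.asarray(areas, dtype=float)[:, None]
    return (a * R ** (1.0 / n)).sum(axis=0) ** n

# Two primaries (bare paper and a solid ink) at a 50% halftone:
paper = np.array([0.9, 0.9, 0.9])
ink = np.array([0.1, 0.4, 0.8])
halftone = ynsn_reflectance([0.5, 0.5], [paper, ink], n=2.0)
```

    The full model then adds the predicted fluorescent emission (relative to the incident light) to this pure-reflectance term to obtain the total reflectance.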

  9. How to Tell the Truth with Statistics: The Case for Accountable Data Analyses in Team-based Science

    PubMed Central

    Gelfond, Jonathan A. L.; Klugman, Craig M.; Welty, Leah J.; Heitman, Elizabeth; Louden, Christopher; Pollock, Brad H.

    2015-01-01

    Data analysis is essential to translational medicine, epidemiology, and the scientific process. Although recent advances in promoting reproducibility and reporting standards have made some improvements, the data analysis process remains insufficiently documented and susceptible to avoidable errors, bias, and even fraud. Comprehensively accounting for the full analytical process requires not only records of the statistical methodology used, but also records of communications among the research team. In this regard, the data analysis process can benefit from the principle of accountability that is inherent in other disciplines such as clinical practice. We propose a novel framework for capturing the analytical narrative called the Accountable Data Analysis Process (ADAP), which allows the entire research team to participate in the analysis in a supervised and transparent way. The framework is analogous to an electronic health record in which the dataset is the “patient” and actions related to the dataset are recorded in a project management system. We discuss the design, advantages, and challenges in implementing this type of system in the context of academic health centers, where team-based science increasingly demands accountability. PMID:26290897

  10. Expanded Endoscopic Endonasal Approaches to Skull Base Meningiomas

    PubMed Central

    Prosser, J. Drew; Vender, John R.; Alleyne, Cargill H.; Solares, C. Arturo

    2012-01-01

    Anterior cranial base meningiomas have traditionally been addressed via frontal or frontolateral approaches. However, with the advances in endoscopic endonasal treatment of pituitary lesions, the transsphenoidal approach is being expanded to address lesions of the petrous ridge, anterior clinoid, clivus, sella, parasellar region, tuberculum, planum, olfactory groove, and crista galli regions. The expanded endoscopic endonasal approach (EEEA) has the advantage of limiting brain retraction and resultant brain edema, as well as minimizing manipulation of neural structures. Herein, we describe the techniques of transclival, transsphenoidal, transplanum, and transcribriform resections of anterior skull base meningiomas. Selected cases are presented. PMID:23730542

  11. When Creative Problem Solving Strategy Meets Web-Based Cooperative Learning Environment in Accounting Education

    ERIC Educational Resources Information Center

    Cheng, Kai Wen

    2011-01-01

    Background: Facing highly competitive and changing environment, cultivating citizens with problem-solving attitudes is one critical vision of education. In brief, the importance of education is to cultivate students with practical abilities. Realizing the advantages of web-based cooperative learning (web-based CL) and creative problem solving…

  12. Negotiating Accountability during Student Teaching: The Influence of an Inquiry-Based Student Teaching Seminar

    ERIC Educational Resources Information Center

    Cuenca, Alexander

    2014-01-01

    Drawing on the work of Russian literary critic, Mikhail Bakhtin, this article explores how an inquiry-based social studies student teaching seminar helped three preservice teachers negotiate the pressures of standards-based reforms during student teaching. The author first examines how initial perceptions of standardization and high-stakes testing…

  13. An energy-based model accounting for snow accumulation and snowmelt in a coniferous forest and in an open area

    NASA Astrophysics Data System (ADS)

    Matějka, Ondřej; Jeníček, Michal

    2016-04-01

    An energy balance approach was used to simulate snow water equivalent (SWE) evolution in an open area, a forest clearing and a coniferous forest during the winter seasons 2011/12 and 2012/13 in the Bystřice River basin (Krušné Mountains, Czech Republic). The aim was to describe the impact of vegetation on snow accumulation and snowmelt under different forest canopy structures and tree densities. Hemispherical photographs were used to describe the forest canopy structure. An energy balance model of snow accumulation and melt was set up and adjusted to account for the effects of the forest canopy on the driving meteorological variables. Leaf area index derived from 32 hemispherical photographs of vegetation and sky was used to implement the forest influence in the snow model. The model was evaluated using snow depth and SWE data measured at 16 localities in the winter seasons from 2011 to 2013. The model was able to reproduce the SWE evolution in both winter seasons beneath the forest canopy, in the forest clearing and in the open area. The SWE maximum at forest sites was 18% lower than in open areas and forest clearings. The contribution of shortwave radiation to the snowmelt rate was 50% lower in forest areas than in open areas due to the shading effect. The importance of turbulent fluxes was 30% lower at forest sites compared to openings because wind speed was reduced to as little as 10% of the values at corresponding open areas. An indirect estimate of interception rates was derived: between 14 and 60% of snowfall was intercepted and sublimated in the forest canopy in both winter seasons. Based on the model results, an underestimation of solid precipitation at the weather station Hřebečná (measured with a heated precipitation gauge) was revealed: snowfall was underestimated by 40% in winter season 2011/12 and by 13% in winter season 2012/13. Although the model formulation appeared sufficient for both analysed winter seasons, canopy effects on the longwave radiation and ground heat flux were not
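
    The core of an energy-balance melt model is converting the net energy flux at the snow surface into a melt rate via the latent heat of fusion. The toy step below illustrates the shading effect reported above (forest shortwave input roughly halved); the fluxes are invented numbers and the study's canopy parameterizations are not reproduced:

```python
# Convert net surface energy flux (W/m^2) into snowmelt (mm of water per
# hour); 1 kg/m^2 of meltwater equals 1 mm depth. Fluxes are illustrative.
LF = 334000.0  # latent heat of fusion of ice, J/kg

def melt_rate_mm_per_hour(sw_net, lw_net, sensible, latent):
    """Melt from the sum of energy balance terms; negative balance -> no melt."""
    q = sw_net + lw_net + sensible + latent   # net flux, W/m^2 = J/(s*m^2)
    return max(q, 0.0) * 3600.0 / LF          # mm of meltwater per hour

# Canopy shading cuts shortwave input roughly in half at the forest site:
open_site = melt_rate_mm_per_hour(300.0, -40.0, 30.0, 10.0)
forest    = melt_rate_mm_per_hour(150.0, -40.0, 30.0, 10.0)
```

    The same structure, with canopy-adjusted radiation and wind-reduced turbulent fluxes, reproduces the slower melt beneath the forest canopy described in the abstract.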

  14. Retrieval-Based Model Accounts for Striking Profile of Episodic Memory and Generalization.

    PubMed

    Banino, Andrea; Koster, Raphael; Hassabis, Demis; Kumaran, Dharshan

    2016-01-01

    A fundamental theoretical tension exists between the role of the hippocampus in generalizing across a set of related episodes, and in supporting memory for individual episodes. Whilst the former requires an appreciation of the commonalities across episodes, the latter emphasizes the representation of the specifics of individual experiences. We developed a novel version of the hippocampal-dependent paired associate inference (PAI) paradigm, which afforded us the unique opportunity to investigate the relationship between episodic memory and generalization in parallel. Across four experiments, we provide surprising evidence that the overlap between object pairs in the PAI paradigm results in a marked loss of episodic memory. Critically, however, we demonstrate that superior generalization ability was associated with stronger episodic memory. Through computational simulations we show that this striking profile of behavioral findings is best accounted for by a mechanism by which generalization occurs at the point of retrieval, through the recombination of related episodes on the fly. Taken together, our study offers new insights into the intricate relationship between episodic memory and generalization, and constrains theories of the mechanisms by which the hippocampus supports generalization. PMID:27510579

  15. Retrieval-Based Model Accounts for Striking Profile of Episodic Memory and Generalization

    PubMed Central

    Banino, Andrea; Koster, Raphael; Hassabis, Demis; Kumaran, Dharshan

    2016-01-01

    A fundamental theoretical tension exists between the role of the hippocampus in generalizing across a set of related episodes, and in supporting memory for individual episodes. Whilst the former requires an appreciation of the commonalities across episodes, the latter emphasizes the representation of the specifics of individual experiences. We developed a novel version of the hippocampal-dependent paired associate inference (PAI) paradigm, which afforded us the unique opportunity to investigate the relationship between episodic memory and generalization in parallel. Across four experiments, we provide surprising evidence that the overlap between object pairs in the PAI paradigm results in a marked loss of episodic memory. Critically, however, we demonstrate that superior generalization ability was associated with stronger episodic memory. Through computational simulations we show that this striking profile of behavioral findings is best accounted for by a mechanism by which generalization occurs at the point of retrieval, through the recombination of related episodes on the fly. Taken together, our study offers new insights into the intricate relationship between episodic memory and generalization, and constrains theories of the mechanisms by which the hippocampus supports generalization. PMID:27510579

  16. RESULTS FROM A DEMONSTRATION OF RF-BASED UF6 CYLINDER ACCOUNTING AND TRACKING SYSTEM INSTALLED AT A USEC FACILITY

    SciTech Connect

    Pickett, Chris A; Kovacic, Donald N; Morgan, Jim; Younkin, James R; Carrick, Bernie; Ken, Whittle; Johns, R E

    2008-09-01

    add tamper-indicating and data authentication features to some of the pertinent system components. Future efforts will focus on these needs along with implementing protocols relevant to IAEA safeguards. The work detailed in this report demonstrates the feasibility of constructing RF devices that can survive the operational rigors associated with the transportation, storage, and processing of UF6 cylinders. The system software specially designed for this project is called Cylinder Accounting and Tracking System (CATS). This report details the elements of the CATS rules-based architecture and its use in safeguards-monitoring and asset-tracking applications. Information is also provided on improvements needed to make the technology ready, as well as options for improving the safeguards aspects of the technology. The report also includes feedback from personnel involved in the testing, as well as individuals who could utilize an RF-based system in supporting the performance of their work. The system software was set up to support a Mailbox declaration, where a declaration can be made either before or after cylinder movements take place. When the declaration is made before cylinders move, the operators must enter this information into CATS. If the IAEA then shows up unexpectedly at the facility, they can see how closely the operational condition matches the declaration. If the declaration is made after the cylinders move, this provides greater operational flexibility when schedules are interrupted or are changed, by allowing operators to declare what moves have been completed. The IAEA can then compare where cylinders are with where CATS or the system says they are located. The ability of CATS to automatically generate Mailbox declarations is seen by the authors as a desirable feature. The Mailbox approach is accepted by the IAEA but has not been widely implemented (and never in enrichment facilities). During the course of this project, we have incorporated alternative

  17. Better ILP-Based Approaches to Haplotype Assembly.

    PubMed

    Chen, Zhi-Zhong; Deng, Fei; Shen, Chao; Wang, Yiji; Wang, Lusheng

    2016-07-01

    Haplotype assembly aims to construct the haplotypes of an individual directly from sequence fragments (reads) of that individual. Although a number of programs have been designed for computing optimal or heuristic solutions to the haplotype assembly problem, computing an optimal solution may take days or even months, while computing a heuristic solution usually requires a trade-off between speed and accuracy. This article refines a previously known integer linear programming-based (ILP-based) approach to the haplotype assembly problem in two ways. First, the read-matrices of some datasets (such as NA12878) come with a quality for each base in the reads. We propose to utilize these qualities in the ILP-based approach. Second, we propose to use the ILP-based approach to improve the output of any heuristic program for the problem. Experiments with both real and simulated datasets show that the qualities of read-matrices help us find more accurate solutions without significant loss of speed. Moreover, our experimental results show that the proposed hybrid approach improves the output of ReFHap (the current leading heuristic) significantly (by almost 25% of the QAN50 score) without significant loss of speed, and can even find optimal solutions in much shorter time than the original ILP-based approach. Our program is available upon request to the authors. PMID:27347882
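    The minimum-error-correction (MEC) style objective that such ILP formulations optimize can be conveyed with a tiny brute-force sketch (illustrative only; the article's method solves an ILP, and real instances are far too large for exhaustive search). Each read covers a few heterozygous sites; a candidate haplotype is scored by how many base flips are needed to assign every read to it or to its complement.

```python
from itertools import product

def mec_score(haplotype, reads):
    """Minimum error corrections to fit all reads to the haplotype pair."""
    comp = tuple(1 - a for a in haplotype)
    total = 0
    for read in reads:  # read: dict {site_index: observed_allele}
        d_h = sum(haplotype[s] != a for s, a in read.items())
        d_c = sum(comp[s] != a for s, a in read.items())
        total += min(d_h, d_c)  # assign the read to the nearer haplotype
    return total

def assemble(reads, n_sites):
    """Exhaustive search over binary haplotypes (fine for toy instances)."""
    return min(product((0, 1), repeat=n_sites),
               key=lambda h: mec_score(h, reads))

# Four toy reads over three heterozygous sites
reads = [{0: 0, 1: 0}, {1: 0, 2: 1}, {0: 1, 1: 1}, {2: 0}]
best = assemble(reads, 3)
```

    Here `best` is a conflict-free haplotype (MEC score 0); an ILP expresses the same objective with binary variables for base corrections so that large instances can be handed to an exact solver.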

  18. Knowledge-based biomedical word sense disambiguation: comparison of approaches

    PubMed Central

    2010-01-01

    Background Word sense disambiguation (WSD) algorithms attempt to select the proper sense of ambiguous terms in text. Resources like the UMLS provide a reference thesaurus to be used to annotate the biomedical literature. Statistical learning approaches have produced good results, but the size of the UMLS makes the production of training data infeasible for the whole domain. Methods We present research on existing WSD approaches based on knowledge bases, which complement the studies performed on statistical learning. We compare four approaches which rely on the UMLS Metathesaurus as the source of knowledge. The first approach compares the overlap of the context of the ambiguous word to the candidate senses based on a representation built out of the definitions, synonyms and related terms. The second approach collects training data for each of the candidate senses to perform WSD based on queries built using monosemous synonyms and related terms. These queries are used to retrieve MEDLINE citations. Then, a machine learning approach is trained on this corpus. The third approach is a graph-based method which exploits the structure of the Metathesaurus network of relations to perform unsupervised WSD. This approach ranks nodes in the graph according to their relative structural importance. The last approach uses the semantic types assigned to the concepts in the Metathesaurus to perform WSD. The context of the ambiguous word and semantic types of the candidate concepts are mapped to Journal Descriptors. These mappings are compared to decide among the candidate concepts. Results are provided estimating the accuracy of the different methods on the WSD test collection available from the NLM. Conclusions We have found that the last approach achieves better results compared to the other methods. The graph-based approach, using the structure of the Metathesaurus network to estimate the relevance of the Metathesaurus concepts, does not perform well compared to the first two
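    The first approach described (comparing the context of the ambiguous word against a bag of words built from each candidate sense's definitions, synonyms and related terms) amounts to a Lesk-style overlap measure. A minimal sketch, with made-up sense inventories rather than real Metathesaurus entries:

```python
def disambiguate(context_words, candidate_senses):
    """Pick the sense whose definition/synonym bag overlaps the context
    most (a simple Lesk-style overlap, as in the first approach)."""
    context = set(w.lower() for w in context_words)

    def overlap(sense):
        return len(context & set(w.lower() for w in sense["terms"]))

    return max(candidate_senses, key=overlap)["id"]

# Hypothetical sense inventory for the ambiguous token "cold"
senses = [
    {"id": "cold_temp", "terms": ["cold", "temperature", "low", "chill"]},
    {"id": "cold_illness", "terms": ["cold", "common", "virus", "infection"]},
]
ctx = "patient presents with virus infection symptoms".split()
sense_id = disambiguate(ctx, senses)
```

    With the context containing "virus" and "infection", the illness sense wins the overlap count; in the real setting the term bags would be built from UMLS definitions and related concepts.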

  19. Assessing a New Approach to Class-Based Affirmative Action

    ERIC Educational Resources Information Center

    Gaertner, Matthew N.

    2011-01-01

    In November, 2008, Colorado and Nebraska voted on amendments that sought to end race-based affirmative action at public universities. In anticipation of the vote, Colorado's flagship public institution--The University of Colorado at Boulder (CU)--explored statistical approaches to support class-based affirmative action. This paper details CU's…

  20. Assessing a New Approach to Class-Based Affirmative Action

    ERIC Educational Resources Information Center

    Gaertner, Matthew Newman

    2011-01-01

    In November, 2008, Colorado and Nebraska voted on amendments that sought to end race-based affirmative action at public universities in those states. In anticipation of the vote, the University of Colorado at Boulder (CU) explored statistical approaches to support class-based (i.e., socioeconomic) affirmative action. This dissertation introduces…

  1. Zero base approach to fiscal management of the laboratory.

    PubMed

    Boudreau, D A; Majonos, J S

    1985-08-01

    Lab administrators who face the challenge of providing quality care while cutting costs need a way to periodically re-evaluate all lab functions and services. The guidelines presented here, based on the Zero Base Budget approach, formulate a management strategy for the lab that could lead to better fiscal planning. PMID:10272519

  2. Component-Based Approach in Learning Management System Development

    ERIC Educational Resources Information Center

    Zaitseva, Larisa; Bule, Jekaterina; Makarov, Sergey

    2013-01-01

    The paper describes a component-based approach (CBA) for learning management system development. Learning objects as components of e-learning courses, and their metadata, are considered. A learning management system based on CBA being developed at Riga Technical University, namely its architecture, elements and possibilities, are…

  3. Accounting for Sampling Error When Inferring Population Synchrony from Time-Series Data: A Bayesian State-Space Modelling Approach with Applications

    PubMed Central

    Santin-Janin, Hugues; Hugueny, Bernard; Aubry, Philippe; Fouchet, David; Gimenez, Olivier; Pontier, Dominique

    2014-01-01

    Background Data collected to inform time variations in natural population size are tainted by sampling error. Ignoring sampling error in population dynamics models induces bias in parameter estimators, e.g., density-dependence. In particular, when sampling errors are independent among populations, the classical estimator of the synchrony strength (zero-lag correlation) is biased downward. However, this bias is rarely taken into account in synchrony studies, although it may lead to overemphasizing the role of intrinsic factors (e.g., dispersal) with respect to extrinsic factors (the Moran effect) in generating population synchrony, as well as to underestimating the extinction risk of a metapopulation. Methodology/Principal findings The aim of this paper is first to illustrate the extent of the bias that can be encountered in empirical studies when sampling error is neglected. Second, we present a state-space modelling approach that explicitly accounts for sampling error when quantifying population synchrony. Third, we exemplify our approach with datasets for which sampling variance (i) has been previously estimated, and (ii) has to be jointly estimated with population synchrony. Finally, we compare our results to those of a standard approach neglecting sampling variance. We show that ignoring sampling variance can mask a synchrony pattern whatever its true value, and that the common practice of averaging a few replicates of population size estimates performs poorly at decreasing the bias of the classical estimator of the synchrony strength. Conclusion/Significance The state-space model used in this study provides a flexible way of accurately quantifying the strength of synchrony patterns from most population size data encountered in field studies, including over-dispersed count data. We provide a user-friendly R-program and a tutorial example to encourage further studies aiming at quantifying the strength of population synchrony to account for uncertainty in
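    The downward bias of the classical synchrony estimator under independent sampling errors is easy to reproduce in a small simulation (a sketch with arbitrary noise levels, not the paper's state-space model): adding independent observation noise to two correlated series deflates their sample correlation.

```python
import random

random.seed(42)

def correlation(x, y):
    """Plain Pearson correlation (the classical synchrony estimator)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Two populations driven by a common (Moran-type) environmental signal
common = [random.gauss(0, 1) for _ in range(500)]
pop1 = [c + random.gauss(0, 0.5) for c in common]
pop2 = [c + random.gauss(0, 0.5) for c in common]

# Independent sampling error contaminates each observed series
obs1 = [v + random.gauss(0, 1) for v in pop1]
obs2 = [v + random.gauss(0, 1) for v in pop2]

true_sync = correlation(pop1, pop2)          # ~0.8 by construction
observed_sync = correlation(obs1, obs2)      # biased downward
```

    The state-space approach in the paper recovers the true synchrony by modelling the observation noise explicitly instead of correlating the raw counts.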

  4. A biomimetic vision-based hovercraft accounts for bees' complex behaviour in various corridors.

    PubMed

    Roubieu, Frédéric L; Serres, Julien R; Colonnier, Fabien; Franceschini, Nicolas; Viollet, Stéphane; Ruffier, Franck

    2014-09-01

    Here we present the first systematic comparison between the visual guidance behaviour of a biomimetic robot and that of honeybees flying in similar environments. We built a miniature hovercraft which can travel safely along corridors with various configurations. For the first time, we implemented on a real physical robot the 'lateral optic flow regulation autopilot', which we had previously studied in computer simulations. This autopilot, inspired by the results of experiments on various species of hymenoptera, consists of two intertwined feedback loops, the speed and lateral control loops, each of which has its own optic flow (OF) set-point. A heading-lock system makes the robot move straight ahead as fast as 69 cm s(-1) with a clearance from one wall as small as 31 cm, giving an unusually high translational OF value (125° s(-1)). Our biomimetic robot was found to navigate safely along straight, tapered and bent corridors, and to react appropriately to perturbations such as the lack of texture on one wall, the presence of a tapering or non-stationary section of the corridor and even a sloping terrain equivalent to a wind disturbance. The front end of the visual system consists of only two local motion sensors (LMS), one on each side. This minimalistic visual system measuring the lateral OF suffices to control both the robot's forward speed and its clearance from the walls without ever measuring any speeds or distances. We added two additional LMSs oriented at +/-45° to improve the robot's performance in steeply tapered corridors. The simple control system accounts for worker bees' ability to navigate safely in six challenging environments: straight corridors, single walls, tapered corridors, straight corridors with part of one wall moving or missing, as well as in the presence of wind. PMID:24615558
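    The dual optic-flow regulation scheme can be caricatured in a few lines (a toy discrete-time sketch with assumed set-points, gains and corridor geometry, not the robot's actual controller): forward speed is adjusted to hold the sum of the two lateral OFs at one set-point, while lateral position is adjusted to hold the larger unilateral OF at another, all without ever measuring speed or distance directly.

```python
def step(v, y, width, of_sum_sp, of_side_sp, k_v=0.05, k_y=0.05):
    """One iteration of a dual optic-flow regulator (toy sketch).
    v: forward speed, y: distance to the left wall."""
    of_left, of_right = v / y, v / (width - y)  # translational OF
    # Speed loop: regulate the sum of the two lateral OFs
    v += k_v * (of_sum_sp - (of_left + of_right))
    # Positioning loop: steer away from the wall producing the larger OF
    if of_left >= of_right:
        y += k_y * (of_left - of_side_sp)
    else:
        y -= k_y * (of_right - of_side_sp)
    return v, min(max(y, 0.1 * width), 0.9 * width)

v, y = 0.2, 0.3
for _ in range(3000):
    v, y = step(v, y, width=1.0, of_sum_sp=4.0, of_side_sp=2.5)
```

    At equilibrium the larger OF sits at its set-point and the OF sum at its own, which jointly pin down both the forward speed and the wall clearance, mirroring how the two intertwined loops substitute for explicit speed and range sensing.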

  5. A Carbon Monitoring System Approach to US Coastal Wetland Carbon Fluxes: Progress Towards a Tier II Accounting Method with Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Windham-Myers, L.; Holmquist, J. R.; Bergamaschi, B. A.; Byrd, K. B.; Callaway, J.; Crooks, S.; Drexler, J. Z.; Feagin, R. A.; Ferner, M. C.; Gonneea, M. E.; Kroeger, K. D.; Megonigal, P.; Morris, J. T.; Schile, L. M.; Simard, M.; Sutton-Grier, A.; Takekawa, J.; Troxler, T.; Weller, D.; Woo, I.

    2015-12-01

    Despite their high rates of long-term carbon (C) sequestration when compared to upland ecosystems, coastal C accounting is only recently receiving the attention of policy makers and carbon markets. Assessing accuracy and uncertainty in net C flux estimates requires both direct and derived measurements based on both short and long term dynamics in key drivers, particularly soil accretion rates and soil organic content. We are testing the ability of remote sensing products and national scale datasets to estimate biomass and soil stocks and fluxes over a wide range of spatial and temporal scales. For example, the 2013 Wetlands Supplement to the 2006 IPCC GHG national inventory reporting guidelines requests information on development of Tier I-III reporting, which express increasing levels of detail. We report progress toward development of a Carbon Monitoring System for "blue carbon" that may be useful for IPCC reporting guidelines at Tier II levels. Our project uses a current dataset of publicly available and contributed field-based measurements to validate models of changing soil C stocks, across a broad range of U.S. tidal wetland types and land-use conversions. Additionally, development of biomass algorithms for both radar and spectral datasets will be tested and used to determine the "price of precision" of different satellite products. We discuss progress in calculating Tier II estimates focusing on variation introduced by the different input datasets. These include the USFWS National Wetlands Inventory, NOAA Coastal Change Analysis Program, and combinations to calculate tidal wetland area. We also assess the use of different attributes and depths from the USDA-SSURGO database to map soil C density. Finally, we examine the relative benefit of radar, spectral and hybrid approaches to biomass mapping in tidal marshes and mangroves. While the US currently plans to report GHG emissions at a Tier I level, we argue that a Tier II analysis is possible due to national

  6. Discriminating bot accounts based solely on temporal features of microblog behavior

    NASA Astrophysics Data System (ADS)

    Pan, Junshan; Liu, Ying; Liu, Xiang; Hu, Hanping

    2016-05-01

    As the largest microblog service in China, Sina Weibo has attracted numerous automated applications (known as bots) due to its popularity and open architecture. We classify the active users from Sina Weibo into human, bot-based and hybrid groups based solely on the study of temporal features of their posting behavior. The anomalous burstiness parameter and time-interval entropy value are exploited to characterize automation. We also reveal different behavior patterns among the three types of users regarding their reposting ratio, daily rhythm and active days. Our findings may help Sina Weibo manage a better community and should be considered for dynamic models of microblog behaviors.
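    Two of the temporal features named above have standard closed forms: the burstiness parameter B = (σ − μ)/(σ + μ) of the inter-post intervals, and the Shannon entropy of their binned distribution. A sketch with fabricated interval data (the bin size and the example series are assumptions, not values from the paper):

```python
import math

def burstiness(intervals):
    """B = (sigma - mu) / (sigma + mu): -1 strictly periodic,
    0 Poisson-like, approaching 1 for highly bursty activity."""
    n = len(intervals)
    mu = sum(intervals) / n
    sigma = (sum((t - mu) ** 2 for t in intervals) / n) ** 0.5
    return (sigma - mu) / (sigma + mu)

def interval_entropy(intervals, bin_size=60):
    """Shannon entropy of the binned inter-post interval distribution;
    very low entropy suggests automated (bot-like) regularity."""
    counts = {}
    for t in intervals:
        b = int(t // bin_size)
        counts[b] = counts.get(b, 0) + 1
    n = len(intervals)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

bot_like = [300] * 50  # posts exactly every 5 minutes
human_like = [30, 700, 90, 2400, 15, 60, 5400, 120, 45, 900]
```

    A perfectly periodic poster scores B = -1 with zero entropy, while human-like heavy-tailed intervals give positive burstiness and high entropy, which is the separation the classifier exploits.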

  7. Randomly Accountable

    ERIC Educational Resources Information Center

    Kane, Thomas J.; Staiger, Douglas O.; Geppert, Jeffrey

    2002-01-01

    The accountability debate tends to devolve into a battle between the pro-testing and anti-testing crowds. When it comes to the design of a school accountability system, the devil is truly in the details. A well-designed accountability plan may go a long way toward giving school personnel the kinds of signals they need to improve performance.…

  8. School Accountability.

    ERIC Educational Resources Information Center

    Evers, Williamson M., Ed.; Walberg, Herbert J., Ed.

    This book presents the perspectives of experts from the fields of history, economics, political science, and psychology on what is known about accountability, what still needs to be learned, what should be done right now, and what should be avoided in devising accountability systems. The common myths about accountability are dispelled and how it…

  9. Colorful Accounting

    ERIC Educational Resources Information Center

    Warrick, C. Shane

    2006-01-01

    As instructors of accounting, we should take an abstract topic (at least to most students) and connect it to content known by students to help increase the effectiveness of our instruction. In a recent semester, ordinary items such as colors, a basketball, and baseball were used to relate the subject of accounting. The accounting topics of account…

  10. Frequency Affects Object Relative Clause Processing: Some Evidence in Favor of Usage-Based Accounts

    ERIC Educational Resources Information Center

    Reali, Florencia

    2014-01-01

    The processing difficulty of nested grammatical structure has been explained by different psycholinguistic theories. Here I provide corpus and behavioral evidence in favor of usage-based models, focusing on the case of object relative clauses in Spanish as a first language. A corpus analysis of spoken Spanish reveals that, as in English, the…

  11. Adventure-Based Service Learning: University Students' Self-Reflection Accounts of Service with Children

    ERIC Educational Resources Information Center

    Quezada, Reyes L.; Christopherson, Richard W.

    2005-01-01

    The need to provide alternative and exciting community service-learning experiences with university students has been a challenge to institutions of higher education. One institution was able to capitalize on an idea of integrating challenge and adventure-based activities as a form of community service. This article focuses on undergraduate…

  12. Working Memory Span Development: A Time-Based Resource-Sharing Model Account

    ERIC Educational Resources Information Center

    Barrouillet, Pierre; Gavens, Nathalie; Vergauwe, Evie; Gaillard, Vinciane; Camos, Valerie

    2009-01-01

    The time-based resource-sharing model (P. Barrouillet, S. Bernardin, & V. Camos, 2004) assumes that during complex working memory span tasks, attention is frequently and surreptitiously switched from processing to reactivate decaying memory traces before their complete loss. Three experiments involving children from 5 to 14 years of age…

  13. Accounting for Equity: Performance-Based Budgeting and Fiscal Equity in Florida

    ERIC Educational Resources Information Center

    Mullin, Christopher M.; Honeyman, David S.

    2008-01-01

    Institutional performance was a topic given considerable attention by the Commission on the Future of Higher Education. Florida's Community College System responded to the challenge by committing to increase performance-based funding allocations from less than 2% to 5% of total state appropriations. Results of the analysis indicated that…

  14. The Social Foundation of Team-Based Learning: Students Accountable to Students

    ERIC Educational Resources Information Center

    Sweet, Michael; Pelton-Sweet, Laura M.

    2008-01-01

    As one form of small group learning, team-based learning's (TBL's) unique sequence of individual and group work with immediate feedback enables and encourages students to engage course content and each other in remarkable ways. Specifically, TBL creates an environment where students can fulfill their human need to belong in the process of…

  15. Bringing Technology to Students' Proximity: A Sociocultural Account of Technology-Based Learning Projects

    ERIC Educational Resources Information Center

    Mukama, Evode

    2014-01-01

    This paper depicts a study carried out in Rwanda concerning university students who participated in a contest to produce short documentary films. The purpose of this research is to conceptualize these kinds of technology-based learning projects (TBLPs) through a sociocultural perspective. The methodology included focus group discussions and field…

  16. ACCOUNTING FOR BIOLOGICAL AND ANTHROPOGENIC FACTORS IN NATIONAL LAND-BASED CARBON BUDGETS

    EPA Science Inventory

    Efforts to quantify net greenhouse gas emissions at the national scale, as required by the United Nations Framework Convention on Climate Change, must include both industrial emissions and the net flux associated with the land base. In this study, data on current land use, rates ...

  17. THE FUTURE OF COMPUTER-BASED TOXICITY PREDICTION: MECHANISM-BASED MODELS VS. INFORMATION MINING APPROACHES

    EPA Science Inventory


    The Future of Computer-Based Toxicity Prediction:
    Mechanism-Based Models vs. Information Mining Approaches

    When we speak of computer-based toxicity prediction, we are generally referring to a broad array of approaches which rely primarily upon chemical structure ...

  18. An Efficient Deterministic Approach to Model-based Prediction Uncertainty Estimation

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew J.; Saxena, Abhinav; Goebel, Kai

    2012-01-01

    Prognostics deals with the prediction of the end of life (EOL) of a system. EOL is a random variable, due to the presence of process noise and uncertainty in the future inputs to the system. Prognostics algorithms must account for this inherent uncertainty. In addition, these algorithms never know exactly the state of the system at the desired time of prediction, or the exact model describing the future evolution of the system, adding further uncertainty to the predicted EOL. Prediction algorithms that do not account for these sources of uncertainty misrepresent the EOL and can lead to poor decisions based on their results. In this paper, we explore the impact of uncertainty in the prediction problem. We develop a general model-based prediction algorithm that incorporates these sources of uncertainty, and propose a novel approach to efficiently handle uncertainty in the future input trajectories of a system by using the unscented transformation. Using this approach, we are not only able to reduce the computational load but also estimate the bounds of uncertainty in a deterministic manner, which can be useful to consider during decision-making. Using a lithium-ion battery as a case study, we perform several simulation-based experiments to explore these issues, and validate the overall approach using experimental data from a battery testbed.
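    The unscented transformation at the heart of the proposed approach propagates uncertainty deterministically: a small set of weighted sigma points is pushed through the nonlinearity instead of Monte Carlo samples. A one-dimensional sketch with a made-up EOL function (the paper applies this to future input trajectories of a battery model, not to this toy formula):

```python
import math

def unscented_transform(mean, var, f, kappa=2.0):
    """Deterministic sigma-point propagation of a 1-D uncertain input
    through a nonlinear function f (sketch of the unscented transform)."""
    n = 1
    spread = math.sqrt((n + kappa) * var)
    sigma_pts = [mean, mean + spread, mean - spread]
    weights = [kappa / (n + kappa), 0.5 / (n + kappa), 0.5 / (n + kappa)]
    ys = [f(x) for x in sigma_pts]
    y_mean = sum(w * y for w, y in zip(weights, ys))
    y_var = sum(w * (y - y_mean) ** 2 for w, y in zip(weights, ys))
    return y_mean, y_var

# Hypothetical EOL model: time to cutoff shrinks nonlinearly with load
eol = lambda load: 100.0 / (load ** 2)
m, v = unscented_transform(mean=2.0, var=0.04, f=eol)
```

    Three function evaluations give the predicted EOL mean and variance; because `eol` is convex, the sigma-point mean correctly lands above the naive `eol(2.0) = 25`, capturing the skew that a point estimate would miss.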

  19. Accounting for failure: risk-based regulation and the problems of ensuring healthcare quality in the NHS

    PubMed Central

    Beaussier, Anne-Laure; Demeritt, David; Griffiths, Alex; Rothstein, Henry

    2016-01-01

    In this paper, we examine why risk-based policy instruments have failed to improve the proportionality, effectiveness, and legitimacy of healthcare quality regulation in the National Health Service (NHS) in England. Rather than trying to prevent all possible harms, risk-based approaches promise to rationalise and manage the inevitable limits of what regulation can hope to achieve by focusing regulatory standard-setting and enforcement activity on the highest priority risks, as determined through formal assessments of their probability and consequences. As such, risk-based approaches have been enthusiastically adopted by healthcare quality regulators over the last decade. However, by drawing on historical policy analysis and in-depth interviews with 15 high-level UK informants in 2013–2015, we identify a series of practical problems in using risk-based policy instruments for defining, assessing, and ensuring compliance with healthcare quality standards. Based on our analysis, we go on to consider why, despite a succession of failures, healthcare regulators remain committed to developing and using risk-based approaches. We conclude by identifying several preconditions for successful risk-based regulation: goals must be clear and trade-offs between them amenable to agreement; regulators must be able to reliably assess the probability and consequences of adverse outcomes; regulators must have a range of enforcement tools that can be deployed in proportion to risk; and there must be political tolerance for adverse outcomes. PMID:27499677

  20. Community-Based Participatory Evaluation: The Healthy Start Approach

    PubMed Central

    Braithwaite, Ronald L.; McKenzie, Robetta D.; Pruitt, Vikki; Holden, Kisha B.; Aaron, Katrina; Hollimon, Chavone

    2013-01-01

    The use of community-based participatory research has gained momentum as a viable approach to academic and community engagement for research over the past 20 years. This article discusses an approach for extending the process with an emphasis on evaluation of a community partnership–driven initiative and thus advances the concept of conducting community-based participatory evaluation (CBPE) through a model used by the Healthy Start project of the Augusta Partnership for Children, Inc., in Augusta, Georgia. Application of the CBPE approach advances the importance of bilateral engagements with consumers and academic evaluators. The CBPE model shows promise as a reliable and credible evaluation approach for community-level assessment of health promotion programs. PMID:22461687

  1. Can we reconcile atmospheric estimates of the Northern terrestrial carbon sink with land-based accounting?

    SciTech Connect

    Ciais, Philippe; Luyssaert, Sebastiaan; Chevallier, Fredric; Poussi, Zegbeu; Peylin, Philippe; Breon, Francois-Marie; Canadell, J.G.; Shvidenko, Anatoly; Jonas, Matthias; King, Anthony Wayne; Schulze, E.-D.; Roedenbeck, Christian; Piao, Shilong; Peters, Wouter

    2010-10-01

    We estimate the northern hemisphere (NH) terrestrial carbon sink by comparing four recent atmospheric inversions with land-based C accounting data for six large northern regions. The mean NH terrestrial CO2 sink from the inversion models is 1.7 Pg C year-1 over the period 2000-2004. The uncertainty of this estimate is based on the typical individual (1-sigma) precision of one inversion (0.9 Pg C year-1) and is consistent with the min-max range of the four inversion mean estimates (0.8 Pg C year-1). Inversions agree within their uncertainty for the distribution of the NH sink of CO2 in longitude, with Russia being the largest sink. The land-based accounting estimate of the NH carbon sink is 1.7 Pg C year-1 for the sum of the six regions studied. The 1-sigma uncertainty of the land-based estimate (0.3 Pg C year-1) is smaller than that of the atmospheric inversions, but no independent land-based flux estimate is available to derive a between-accounting-model uncertainty. Encouragingly, the top-down atmospheric and the bottom-up land-based methods converge to consistent mean estimates within their respective errors, increasing the confidence in the overall budget. These results also confirm the continued critical role of NH terrestrial ecosystems in slowing down the atmospheric accumulation of anthropogenic CO2

  2. What Part of "No" Do Children Not Understand? A Usage-Based Account of Multiword Negation

    ERIC Educational Resources Information Center

    Cameron-Faulkner, Thea; Lieven, Elena; Theakston, Anna

    2007-01-01

    The study investigates the development of English multiword negation, in particular the negation of zero marked verbs (e.g. "no sleep", "not see", "can't reach") from a usage-based perspective. The data was taken from a dense database consisting of the speech of an English-speaking child (Brian) aged 2;3-3;4 (MLU 2.05-3.1) and his mother. The…

  3. Healthcare information system approaches based on middleware concepts.

    PubMed

    Holena, M; Blobel, B

    1997-01-01

    To meet the challenges of efficiency and high-level quality, health care systems must implement the "Shared Care" paradigm of distributed co-operating systems. To this end, both newly developed and legacy applications must be fully integrated into the care process. These requirements can be fulfilled by information systems based on middleware concepts. In the paper, the middleware approaches HL7, DHE, and CORBA are described, and the relevance of those approaches to the healthcare domain is documented. The description presented here is complemented by two other papers in this volume, concentrating on the evaluation of the approaches, and on their security threats and solutions. PMID:10175361

  4. An information-based neural approach to constraint satisfaction.

    PubMed

    Jönsson, H; Söderberg, B

    2001-08-01

    A novel artificial neural network approach to constraint satisfaction problems is presented. Based on information-theoretical considerations, it differs from a conventional mean-field approach in the form of the resulting free energy. The method, implemented as an annealing algorithm, is numerically explored on a testbed of K-SAT problems. The performance shows a dramatic improvement over that of a conventional mean-field approach and is comparable to that of a state-of-the-art dedicated heuristic (GSAT+walk). The real strength of the method, however, lies in its generality. With minor modifications, it is applicable to arbitrary types of discrete constraint satisfaction problems. PMID:11506672
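    For reference, the dedicated heuristic the authors benchmark against (GSAT+walk) can be sketched in a few lines. This is the baseline, not the information-based neural method itself, and the 3-SAT instance below is a toy:

```python
import random

def gsat_walk(clauses, n_vars, p_walk=0.5, max_flips=10000, seed=1):
    """GSAT+walk sketch: flip variables in unsatisfied clauses, mixing
    greedy moves with random-walk moves to escape local minima."""
    rng = random.Random(seed)
    assign = [rng.choice([False, True]) for _ in range(n_vars)]
    sat = lambda lit: assign[abs(lit) - 1] == (lit > 0)
    for _ in range(max_flips):
        unsat = [c for c in clauses if not any(sat(l) for l in c)]
        if not unsat:
            return assign          # all clauses satisfied
        clause = rng.choice(unsat)
        if rng.random() < p_walk:
            var = abs(rng.choice(clause)) - 1   # random-walk move
        else:
            # greedy: flip the variable leaving the fewest unsat clauses
            def cost(v):
                assign[v] = not assign[v]
                c = sum(not any(sat(l) for l in cl) for cl in clauses)
                assign[v] = not assign[v]
                return c
            var = min((abs(l) - 1 for l in clause), key=cost)
        assign[var] = not assign[var]
    return None

# Toy 3-SAT instance: positive literal i means variable i must be True
clauses = [(1, 2, 3), (-1, 2, -3), (1, -2, 3), (-1, -2, -3), (1, 2, -3)]
model = gsat_walk(clauses, 3)
```

    The mean-field annealing method of the abstract replaces these discrete flips with continuous, information-theoretically motivated updates, but both approaches score candidate assignments by the number of unsatisfied clauses.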

  5. LIFE CLIMATREE project: A novel approach for accounting and monitoring carbon sequestration of tree crops and their potential as carbon sink areas

    NASA Astrophysics Data System (ADS)

    Stergiou, John; Tagaris, Efthimios; -Eleni Sotiropoulou, Rafaella

    2016-04-01

    Climate Change Mitigation is one of the most important objectives of the Kyoto Convention, and is mostly oriented towards reducing GHG emissions. However, carbon sinks are retained only in the calculation of forest capacity, since agricultural land and farmers' practices for securing carbon stored in soils have not been recognized in GHG accounting, possibly resulting in incorrect estimations of the carbon dioxide balance in the atmosphere. The agricultural sector, which is a key sector in the EU, has presented a consistent strategic framework since 1954, in the form of the Common Agricultural Policy (CAP). In its latest reform of 2013 (reg. (EU) 1305/13), the CAP recognized the significance of agriculture as a key player in climate change policy. In order to fill this gap, the "LIFE ClimaTree" project has recently been funded by the European Commission, aiming to provide a novel method for including tree crop cultivations in the LULUCF's accounting rules for GHG emissions and removals. In the framework of the "LIFE ClimaTree" project, the carbon sink within the EU through the inclusion of the calculated tree crop capacity will be assessed for both current and future climatic conditions by the 2050s, using the GISS-WRF modeling system at a very fine scale (i.e., 9 km x 9 km) under the RCP8.5 and RCP4.5 climate scenarios. Acknowledgement: LIFE CLIMATREE project "A novel approach for accounting and monitoring carbon sequestration of tree crops and their potential as carbon sink areas" (LIFE14 CCM/GR/000635).

  6. Nondestructive vision-based approaches for condition assessment of structures

    NASA Astrophysics Data System (ADS)

    Jahanshahi, Mohammad R.; Masri, Sami F.

    2011-04-01

    Nondestructive evaluation techniques, including the use of optical instrumentation (e.g., digital cameras), image processing and computer vision, are promising approaches for structural health monitoring that complement sensor-based approaches. This study applies and evaluates the underlying technical elements for the development of an integrated inspection tool based on commercially available digital cameras. The proposed system can help an inspector visually assess a target structure remotely, without having to travel to the bridge site and without the traffic detouring such site visits require. Also, a contact-less vision-based crack detection methodology is introduced and evaluated. Illustrative examples are provided to demonstrate the capabilities as well as the limitations of the proposed vision-based approaches.
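    The flavor of contact-less, vision-based crack detection can be conveyed with a crude sketch (illustrative only; the paper's methodology is more sophisticated than dark-pixel thresholding): crack pixels are markedly darker than their local surroundings in a grayscale image.

```python
def detect_crack_pixels(image, threshold=80):
    """Flag pixels much darker than the mean of their 8 neighbors
    (a crude stand-in for a vision-based crack detector)."""
    h, w = len(image), len(image[0])
    cracks = set()
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            neigh = [image[i + di][j + dj]
                     for di in (-1, 0, 1) for dj in (-1, 0, 1)
                     if (di, dj) != (0, 0)]
            if sum(neigh) / 8 - image[i][j] > threshold:
                cracks.add((i, j))
    return cracks

# Synthetic 5x5 patch: bright concrete (200) with one dark crack pixel (20)
patch = [[200] * 5 for _ in range(5)]
patch[2][2] = 20
```

    Real systems add perspective correction, noise filtering and morphological analysis on top of such local contrast cues, which is where the capabilities and limitations discussed in the abstract come in.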

  7. Image coding approach based on multiscale matching pursuits operation

    NASA Astrophysics Data System (ADS)

    Li, Hui; Wolff, Ingo

    1998-12-01

    A new image coding technique based on the Multiscale Matching Pursuits (MMP) approach is presented. Using a pre-defined dictionary set consisting of a limited number of elements, the MMP approach can decompose/encode images on different image scales and reconstruct/decode the image with the same dictionary. The MMP approach can be used to represent image texture at different scales as well as the whole image. Instead of a pixel-based image representation, the MMP method represents image texture as indices into a dictionary and can thereby encode the image with a low data volume. Based on the MMP operation, the image content can be coded in order from global to local detail.
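    The greedy decomposition at the core of matching pursuits can be sketched in a few lines. This is plain 1-D matching pursuit over a toy orthonormal dictionary, not the paper's multiscale image dictionary; all names and values below are illustrative.

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_iter):
    """Greedy matching pursuit: repeatedly pick the unit-norm dictionary
    atom most correlated with the residual and subtract its projection."""
    residual = signal.astype(float).copy()
    indices, coeffs = [], []
    for _ in range(n_iter):
        corr = dictionary.T @ residual          # correlation with each atom
        k = int(np.argmax(np.abs(corr)))
        indices.append(k)
        coeffs.append(float(corr[k]))
        residual = residual - corr[k] * dictionary[:, k]
    return indices, coeffs, residual

# Toy dictionary: three orthonormal atoms in R^4 (illustrative only).
D = np.eye(4)[:, :3]
x = 2.0 * D[:, 0] + 0.5 * D[:, 2]
idx, c, r = matching_pursuit(x, D, n_iter=2)    # idx == [0, 2]
```

    The encoder then transmits only the atom indices and coefficients, which is the "index of a dictionary" representation the abstract describes.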

  8. A Variance Based Active Learning Approach for Named Entity Recognition

    NASA Astrophysics Data System (ADS)

    Hassanzadeh, Hamed; Keyvanpour, Mohammadreza

    The cost of manually annotating corpora is one of the significant issues in many text-based tasks such as text mining, semantic annotation, and information extraction in general. Active learning is an approach that deals with the reduction of labeling costs. In this paper we propose an effective active learning approach based on minimal variance that reduces manual annotation cost by using a small number of manually labeled examples. In our approach we use a confidence measure based on the model's variance that reaches considerable accuracy for annotating entities. A Conditional Random Field (CRF) is chosen as the underlying learning model due to its promising performance in many sequence labeling tasks. The experiments show that the proposed method needs considerably fewer manually labeled samples to produce a desirable result.
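    The selection step of a variance-based active learner can be sketched generically. Here the uncertainty score is the variance of predictions across a hypothetical committee of models, a stand-in for the paper's CRF-specific variance measure; the probabilities below are made up.

```python
import numpy as np

def select_by_variance(committee_probs, n_query):
    """Rank unlabeled samples by the variance of predicted probabilities
    across a committee of models; the highest-variance (most disputed)
    samples are sent to a human annotator."""
    variance = committee_probs.var(axis=0)
    return list(np.argsort(variance)[::-1][:n_query])

# Three hypothetical models scoring four unlabeled samples.
probs = np.array([[0.9, 0.2, 0.5, 0.1],
                  [0.8, 0.9, 0.5, 0.1],
                  [0.9, 0.1, 0.5, 0.2]])
query = select_by_variance(probs, n_query=1)    # sample 1 is most disputed
```

    Samples the committee agrees on (such as sample 2, where every model says 0.5) contribute no variance and are left unlabeled.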

  9. Accounting & Computing Curriculum Guide.

    ERIC Educational Resources Information Center

    Avani, Nathan T.; And Others

    This curriculum guide consists of materials for use in teaching a competency-based accounting and computing course that is designed to prepare students for employability in the following occupational areas: inventory control clerk, invoice clerk, payroll clerk, traffic clerk, general ledger bookkeeper, accounting clerk, account information clerk,…

  10. A new computational account of cognitive control over reinforcement-based decision-making: Modeling of a probabilistic learning task.

    PubMed

    Zendehrouh, Sareh

    2015-11-01

    Recent work in the decision-making field offers an account of dual-system theory for the decision-making process. This theory holds that the process is conducted by two main controllers: a goal-directed system and a habitual system. In the reinforcement learning (RL) domain, habitual behaviors are connected with model-free methods, in which appropriate actions are learned through trial-and-error experiences. Goal-directed behaviors, in contrast, are associated with model-based methods of RL, in which actions are selected using a model of the environment. Studies on cognitive control also suggest that during processes like decision-making, some cortical and subcortical structures work in concert to monitor the consequences of decisions and to adjust control according to current task demands. Here a computational model is presented based on dual-system theory and the cognitive control perspective of decision-making. The proposed model is used to simulate human performance on a variant of a probabilistic learning task. The basic proposal is that the brain implements a dual controller, while an accompanying monitoring system detects some kinds of conflict, including a hypothetical cost-conflict one. The simulation results address existing theories about two event-related potentials, namely error-related negativity (ERN) and feedback-related negativity (FRN), and explore the best account of them. Based on the results, some testable predictions are also presented. PMID:26339919
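    The model-free (habitual) half of the dual-system account is, in RL terms, a trial-and-error value update. A minimal Q-learning sketch follows; the one-state task and all parameters are illustrative, not the paper's model.

```python
from collections import defaultdict
import random

def q_learning_step(Q, state, actions, reward_fn, next_state_fn,
                    alpha=0.1, gamma=0.95, epsilon=0.1):
    """One model-free (habitual) update: act epsilon-greedily, observe the
    reward, and nudge Q(state, action) toward the bootstrapped target."""
    if random.random() < epsilon:
        action = random.choice(actions)          # explore
    else:
        action = max(actions, key=lambda a: Q[(state, a)])  # exploit
    reward = reward_fn(state, action)
    nxt = next_state_fn(state, action)
    target = reward + gamma * max(Q[(nxt, a)] for a in actions)
    Q[(state, action)] += alpha * (target - Q[(state, action)])
    return nxt

# One-state task where action 'b' pays off: the habit system learns to
# prefer it from raw experience, with no model of the environment.
random.seed(0)
Q = defaultdict(float)
for _ in range(500):
    q_learning_step(Q, 's', ['a', 'b'],
                    reward_fn=lambda s, a: 1.0 if a == 'b' else 0.0,
                    next_state_fn=lambda s, a: 's')
```

    A model-based (goal-directed) controller would instead plan over an explicit transition/reward model; the abstract's monitoring system arbitrates between the two.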

  11. Q-Axis Flux-Based Sensorless Vector Control of Induction Motor Taking into Account Iron Loss

    NASA Astrophysics Data System (ADS)

    Tsuji, Mineo; Chen, Shuo; Kai, Toshihiro; Hamasaki, Shin-Ichi

    This paper presents a sensorless vector control system for induction motors that takes iron loss into account, in which a flux-observer-based method is applied. Since the flux observer is constructed in a synchronously rotating reference frame with respect to the rotor flux of a current model, and the iron loss resistance of a parallel exciting circuit is used, the proposed system is very simple, and compensation of the iron loss related to the operating frequency is realized directly while calculating the rotor fluxes and slip frequency. The accuracies of the estimated torque and speed are improved. The effectiveness of the proposed system has been verified by digital simulation and experimentation.

  12. An Open Science Approach to Gis-Based Paleoenvironment Data

    NASA Astrophysics Data System (ADS)

    Willmes, C.; Becker, D.; Verheul, J.; Yener, Y.; Zickel, M.; Bolten, A.; Bubenzer, O.; Bareth, G.

    2016-06-01

    Paleoenvironmental studies and the corresponding information (data) are abundantly published and available in the scientific record. However, GIS-based paleoenvironmental information and datasets are comparably rare. Here, we present an Open Science approach for creating GIS-based data and maps of paleoenvironments and publishing them Open Access in a web-based Spatial Data Infrastructure (SDI), for access by the archaeology and paleoenvironment communities. We introduce an approach to gather and create GIS datasets from published non-GIS-based facts and information (data), such as analogue maps, textual information, or figures in scientific publications. These collected and created geo-datasets and maps are then published, including a Digital Object Identifier (DOI) to facilitate scholarly reuse and citation of the data, in a web-based Open Access research data management infrastructure. The geo-datasets are additionally published in an Open Geospatial Consortium (OGC) standards-compliant SDI and are available for GIS integration via OGC Open Web Services (OWS).

  13. Novel linear analysis for a gyrotron oscillator based on a spectral approach

    NASA Astrophysics Data System (ADS)

    Genoud, J.; Tran, T. M.; Alberti, S.; Braunmueller, F.; Hogge, J.-Ph.; Tran, M. Q.; Guss, W. C.; Temkin, R. J.

    2016-04-01

    With the aim of gaining better physical insight into linear regimes in gyrotrons, a new linear model was developed. This model is based on a spectral approach for solving the self-consistent system of equations describing the wave-particle interaction in the cavity of a gyrotron oscillator. Taking wall losses into account self-consistently and including the main system inhomogeneities in the cavity geometry and in the magnetic field, the model is appropriate for considering real system parameters. The main advantage of the spectral approach, compared with a time-dependent approach, is the possibility of describing all of the stable and unstable modes, with negative and positive growth rates respectively. This makes it possible to reveal the existence of a new set of eigenmodes, in addition to the usual eigenmodes originating from cold-cavity modes. The proposed model can also be used for studying other instabilities such as, for instance, backward waves potentially excited in gyrotron beam tunnels.

  14. A Market-Based Approach to Multi-factory Scheduling

    NASA Astrophysics Data System (ADS)

    Vytelingum, Perukrishnen; Rogers, Alex; MacBeth, Douglas K.; Dutta, Partha; Stranjak, Armin; Jennings, Nicholas R.

    In this paper, we report on the design of a novel market-based approach for decentralised scheduling across multiple factories. Specifically, because of the limitations of scheduling in a centralised manner - which requires a center to have complete and perfect information for optimality and the truthful revelation of potentially commercially private preferences to that center - we advocate an informationally decentralised approach that is both agile and dynamic. In particular, this work adopts a market-based approach for decentralised scheduling by considering the different stakeholders representing different factories as self-interested, profit-motivated economic agents that trade resources for the scheduling of jobs. The overall schedule of these jobs is then an emergent behaviour of the strategic interaction of these trading agents bidding for resources in a market based on limited information and their own preferences. Using a simple (zero-intelligence) bidding strategy, we empirically demonstrate that our market-based approach achieves a lower bound efficiency of 84%. This represents a trade-off between a reasonable level of efficiency (compared to a centralised approach) and the desirable benefits of a decentralised solution.
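    The zero-intelligence strategy the authors benchmark can be sketched as a toy continuous double auction. The values, costs, and clearing rule below are illustrative, not the paper's market design; efficiency here means realized surplus over the maximum achievable surplus, the same ratio behind the reported 84% lower bound.

```python
import random

def zi_double_auction(buyer_values, seller_costs, rounds=2000, seed=1):
    """Zero-intelligence trading: each round a random remaining buyer bids
    uniformly below its private value, a random remaining seller asks
    uniformly above its private cost, and a trade clears when bid >= ask."""
    rng = random.Random(seed)
    buyers, sellers = list(buyer_values), list(seller_costs)
    surplus = 0.0
    while buyers and sellers and rounds > 0:
        rounds -= 1
        b, s = rng.randrange(len(buyers)), rng.randrange(len(sellers))
        bid = rng.uniform(0.0, buyers[b])
        ask = rng.uniform(sellers[s], 1.0)
        if bid >= ask:
            surplus += buyers[b] - sellers[s]   # gains from this trade
            buyers.pop(b)
            sellers.pop(s)
    return surplus

# Efficiency = realized surplus / maximum possible surplus.
values, costs = [0.9, 0.8, 0.7], [0.1, 0.2, 0.3]
best = sum(v - c for v, c in zip(sorted(values, reverse=True), sorted(costs)))
efficiency = zi_double_auction(values, costs) / best
```

    Even with random (zero-intelligence) bids, the budget constraints alone push the emergent schedule toward an efficient allocation, which is the paper's point.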

  15. The Transconjunctival Transorbital Approach: A Keyhole Approach to the Midline Anterior Skull Base

    PubMed Central

    Raza, Shaan M.; Quinones-Hinojosa, Alfredo; Lim, Michael; Owusu Boahene, Kofi D.

    2015-01-01

    OBJECTIVE To report an initial experience with a medial transorbital approach to the midline skull base performed via a transconjunctival incision. METHODS The authors retrospectively reviewed their clinical experience with this approach in the management of benign cranial base pathology. Preoperative imaging, intraoperative records, hospitalization charts, and postoperative records were reviewed for relevant data. RESULTS During the period 2009–2011, six patients underwent a transconjunctival craniotomy performed by a neurosurgeon and otolaryngologist–head and neck surgeon working together. The indications for surgery were esthesioneuroblastoma in one patient, juvenile angiofibroma in one patient, Paget disease in one patient, and recalcitrant cerebrospinal fluid leaks in three patients. Three patients had prior cranial base surgery (either open craniotomy or an endonasal approach) done at another institution. The mean length of stay was 3.8 days; mean follow-up was 6 months. Surgery was considered successful in all cases (negative margins or no leak recurrence); diplopia was noted in one patient postoperatively. CONCLUSIONS The transconjunctival medial orbital craniectomy provides a minimally invasive keyhole approach to lesions located anteriorly along the anterior cranial fossa that are in the midline with lateral extension over the orbital roof. Based on our initial experience with this technique, the working space afforded limits complex surgical dissection; this approach is primarily well suited for less extensive pathology. PMID:22722037

  16. A storm event-based approach to TMDL development.

    PubMed

    Hsu, Tsung-Hung; Lin, Jen-Yang; Lee, Tsu-Chuan; Zhang, Harry X; Yu, Shaw L

    2010-04-01

    It is vitally important to define the critical condition for a receiving water body in the total maximum daily load (TMDL) development process. One of the major disadvantages of using a continuous simulation approach is that there is no guarantee that the most critical condition will be covered within the subjectively selected representative hydrologic period, which is usually several years depending on the availability of data. Another limitation of the continuous simulation approach, compared to a design storm approach, is the lack of an estimate of the risk involved. Because of the above limitations, a storm event-based critical flow-storm (CFS) approach was previously developed to explicitly address the critical condition as a combination of a prescribed stream flow and a storm event of certain magnitude, both having a certain frequency of occurrence and when combined, would create a critical condition. The CFS approach was tested successfully in a TMDL study for Muddy Creek in Virginia. The present paper reports results of a comparative study on the applicability of the CFS approach in Taiwan. The Dy-yu creek watershed in northern Taiwan differs significantly from Muddy Creek in terms of climate, hydrology, terrain, and other characteristics. Results show that the critical condition for different watersheds might be also different, and that the CFS approach could clearly define that critical condition and should be considered as an alternative method for TMDL development to a continuous simulation approach. PMID:19266300
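    The pairing of a flow frequency with a storm frequency at the heart of the CFS approach can be illustrated with a small joint-probability sketch. The independence assumption and the example return periods are mine, for illustration only; the study derives its frequencies from monitoring data.

```python
def combined_return_period(flow_return_years, storm_return_years):
    """Joint recurrence of a prescribed stream flow and a storm of a given
    magnitude, assuming (as a simplification) that the two are independent:
    the joint annual exceedance probability is the product of the marginal
    exceedance probabilities."""
    joint_p = (1.0 / flow_return_years) * (1.0 / storm_return_years)
    return 1.0 / joint_p

# A 2-year flow condition paired with a 5-year storm (illustrative numbers).
rp = combined_return_period(2, 5)   # combined condition recurs ~every 10 years
```

    This is what gives the CFS approach its explicit risk estimate: the critical condition carries a quantifiable frequency of occurrence, which a subjectively chosen continuous-simulation period does not.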

  17. Grid-based electronic structure calculations: The tensor decomposition approach

    NASA Astrophysics Data System (ADS)

    Rakhuba, M. V.; Oseledets, I. V.

    2016-05-01

    We present a fully grid-based approach for solving Hartree-Fock and all-electron Kohn-Sham equations based on a low-rank approximation of the three-dimensional electron orbitals. Due to the low-rank structure, the total complexity of the algorithm scales linearly with the one-dimensional grid size. Linear complexity allows for the use of fine grids, e.g. 8192³, and thus a cheap extrapolation procedure. We test the proposed approach on closed-shell atoms up to argon, several molecules, and clusters of hydrogen atoms. All tests show systematic convergence to the required accuracy.
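    The linear-in-grid-size claim comes from storing orbitals in factored form rather than as full n³ arrays. A minimal numpy sketch of the idea, using a plain rank-1 separable function rather than the paper's tensor format:

```python
import numpy as np

# A separable (rank-1) function on an n^3 grid: f(x,y,z) = a(x) a(y) a(z).
n = 64
t = np.linspace(-5.0, 5.0, n)
a = np.exp(-np.abs(t))                       # one 1-D factor of length n
f = a[:, None, None] * a[None, :, None] * a[None, None, :]

# The low-rank format stores 3n numbers instead of n**3 grid values,
# which is the source of the linear-in-grid-size complexity.
storage_full, storage_lowrank = n ** 3, 3 * n

# The mode-1 unfolding of an exactly separable tensor has rank 1:
s = np.linalg.svd(f.reshape(n, n * n), compute_uv=False)
numerical_rank = int((s > 1e-10 * s[0]).sum())
```

    Real orbitals are only approximately low-rank, so in practice a small number of such factor triples is kept, with the rank controlling the accuracy/cost trade-off.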

  18. An Individual-Based Model of Zebrafish Population Dynamics Accounting for Energy Dynamics

    PubMed Central

    Beaudouin, Rémy; Goussen, Benoit; Piccini, Benjamin; Augustine, Starrlight; Devillers, James; Brion, François; Péry, Alexandre R. R.

    2015-01-01

    Developing population dynamics models for zebrafish is crucial in order to extrapolate from toxicity data measured at the organism level to biological levels relevant to support and enhance ecological risk assessment. To achieve this, a dynamic energy budget model for individual zebrafish (DEB model) was coupled to an individual-based model of zebrafish population dynamics (IBM). Next, we fitted the DEB model to new experimental data on zebrafish growth and reproduction, thus improving existing models. We further analysed the DEB model and the DEB-IBM using a sensitivity analysis. Finally, the predictions of the DEB-IBM were compared to existing observations on natural zebrafish populations, and the predicted population dynamics are realistic. While our zebrafish DEB-IBM can still be improved by acquiring new experimental data on the most uncertain processes (e.g. survival or feeding), it can already serve to predict the impact of compounds at the population level. PMID:25938409

  19. Accounting for Proof Test Data in a Reliability Based Design Optimization Framework

    NASA Technical Reports Server (NTRS)

    Venter, Gerhard; Scotti, Stephen J.

    2012-01-01

    This paper investigates the use of proof (or acceptance) test data during the reliability based design optimization of structural components. It is assumed that every component will be proof tested and that the component will only enter into service if it passes the proof test. The goal is to reduce the component weight, while maintaining high reliability, by exploiting the proof test results during the design process. The proposed procedure results in the simultaneous design of the structural component and the proof test itself and provides the designer with direct control over the probability of failing the proof test. The procedure is illustrated using two analytical example problems and the results indicate that significant weight savings are possible when exploiting the proof test results during the design process.

  20. The gamesmanship of sex: a model based on African American adolescent accounts.

    PubMed

    Eyre, S L; Hoffman, V; Millstein, S G

    1998-12-01

    This article examines adolescent understanding of the social context of sexual behavior. Using grounded theory to interpret interviews with 39 African American male and female adolescents, the article builds a model of sex-related behavior as a set of interrelated games. A courtship game involves communication of sexual or romantic interest and, over time, formation of a romantic relationship. A duplicity game draws on conventions of a courtship game to trick a partner into having sex. A disclosure game spreads stories about one's own and others' sex-related activities to peers in a gossip network. Finally, a prestige game builds social reputation in the eyes of peers, typically based on gender-specific standards. The article concludes by examining the meanings that sex-related behavior may have for adolescents and the potential use of social knowledge for facilitating adolescent health. PMID:9884994

  1. Monte Carlo-based adaptive EPID dose kernel accounting for different field size responses of imagers

    PubMed Central

    Wang, Song; Gardner, Joseph K.; Gordon, John J.; Li, Weidong; Clews, Luke; Greer, Peter B.; Siebers, Jeffrey V.

    2009-01-01

    The aim of this study is to present an efficient method to generate imager-specific Monte Carlo (MC)-based dose kernels for amorphous silicon-based electronic portal imaging device dose prediction and to determine the effective backscattering thicknesses for such imagers. EPID field size-dependent responses were measured for five matched Varian accelerators from three institutions with 6 MV beams at a source to detector distance (SDD) of 105 cm. For two imagers, measurements were made with and without the imager mounted on the robotic supporting arm. Monoenergetic energy deposition kernels with 0–2.5 cm of water backscattering thickness were simultaneously computed by MC to a high precision. For each imager, the backscattering thickness required to match measured field size responses was determined. The monoenergetic kernel method was validated by comparing measured and predicted field size responses at 150 cm SDD, 10×10 cm² multileaf collimator (MLC) sliding window fields created with 5, 10, 20, and 50 mm gaps, and a head-and-neck (H&N) intensity modulated radiation therapy (IMRT) patient field. Field size responses for the five different imagers deviated by up to 1.3%. When imagers were removed from the robotic arms, response deviations were reduced to 0.2%. All imager field size responses were captured by using between 1.0 and 1.6 cm backscatter. The predicted field size responses by the imager-specific kernels matched measurements for all involved imagers with a maximal deviation of 0.34%. The maximal deviation between the predicted and measured field size responses at 150 cm SDD is 0.39%. The maximal deviation between the predicted and measured MLC sliding window fields is 0.39%. For the patient field, gamma analysis yielded that 99.0% of the pixels have γ<1 under the 2%, 2 mm criterion with a 3% dose threshold. Tunable imager-specific kernels can be generated rapidly and accurately in a single MC simulation. The resultant kernels are imager position
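    The 2%/2 mm gamma criterion used in the patient-field comparison can be sketched in one dimension as follows. The dose profiles and the brute-force search are illustrative, not the study's code; real gamma analysis runs over 2-D images with interpolation.

```python
import numpy as np

def gamma_pass_rate(ref, meas, x, dose_tol=0.02, dist_tol=2.0, threshold=0.03):
    """Simplified 1-D gamma analysis: a reference point passes when some
    measured point lies within the combined dose/distance tolerance
    (gamma < 1). Dose difference is taken relative to the global maximum,
    and points below the dose threshold are excluded."""
    dmax = ref.max()
    passed = total = 0
    for xi, di in zip(x, ref):
        if di < threshold * dmax:
            continue                      # low-dose region, as in the study
        dd = (meas - di) / (dose_tol * dmax)   # normalized dose differences
        dx = (x - xi) / dist_tol               # normalized distances
        if np.sqrt(dd ** 2 + dx ** 2).min() < 1.0:
            passed += 1
        total += 1
    return passed / total

x = np.linspace(0.0, 50.0, 501)                  # positions in mm
ref = np.exp(-((x - 25.0) / 10.0) ** 2)          # reference dose profile
meas = 1.01 * np.exp(-((x - 25.5) / 10.0) ** 2)  # 1% rescale, 0.5 mm shift
rate = gamma_pass_rate(ref, meas, x)             # well within 2%/2 mm
```

    A 0.5 mm shift with a 1% rescale sits comfortably inside the tolerance ellipse, so every evaluated point passes.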

  2. GO-Bayes: Gene Ontology-based overrepresentation analysis using a Bayesian approach

    PubMed Central

    Zhang, Song; Cao, Jing; Kong, Y. Megan; Scheuermann, Richard H.

    2010-01-01

    Motivation: A typical approach for the interpretation of high-throughput experiments, such as gene expression microarrays, is to produce groups of genes based on certain criteria (e.g. genes that are differentially expressed). To gain more mechanistic insights into the underlying biology, overrepresentation analysis (ORA) is often conducted to investigate whether gene sets associated with particular biological functions, for example, as represented by Gene Ontology (GO) annotations, are statistically overrepresented in the identified gene groups. However, the standard ORA, which is based on the hypergeometric test, analyzes each GO term in isolation and does not take into account the dependence structure of the GO-term hierarchy. Results: We have developed a Bayesian approach (GO-Bayes) to measure overrepresentation of GO terms that incorporates the GO dependence structure by taking into account evidence not only from individual GO terms, but also from their related terms (i.e. parents, children, siblings, etc.). The Bayesian framework borrows information across related GO terms to strengthen the detection of overrepresentation signals. As a result, this method tends to identify sets of closely related GO terms rather than individual isolated GO terms. The advantage of the GO-Bayes approach is demonstrated with a simulation study and an application example. Contact: song.zhang@utsouthwestern.edu; richard.scheuermann@utsouthwestern.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20176581
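    For contrast with GO-Bayes, the standard hypergeometric ORA that the abstract describes can be computed directly. The gene counts below are made up for illustration.

```python
from math import comb

def ora_pvalue(N, K, n, k):
    """Standard (non-Bayesian) ORA for one GO term in isolation: the
    hypergeometric upper-tail probability of seeing >= k annotated genes
    in a selected group of n, when K of the N genes in the reference set
    carry the annotation."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# 20 of 1000 genes carry a term; 5 of the 50 selected genes do.
p = ora_pvalue(N=1000, K=20, n=50, k=5)
```

    GO-Bayes departs from this by letting evidence flow between related terms in the GO hierarchy instead of testing each term with an isolated tail probability like the one above.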

  3. Cropland carbon fluxes in the United States: increasing Geospatial Resolution of Inventory-Based Carbon Accounting

    SciTech Connect

    West, Tristram O.; Brandt, Craig C; Baskaran, Latha Malar; Hellwinckel, Chad M; Marland, Gregg; Nelson, Richard G; De La Torre Ugarte, Daniel G; Post, Wilfred M

    2010-01-01

    Net annual soil carbon change, fossil fuel emissions from cropland production, and cropland net primary productivity were estimated and spatially distributed using land cover defined by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by the Cropland Data Layer (CDL). Spatially resolved estimates of net ecosystem exchange (NEE) and net ecosystem carbon balance (NECB) were developed. NEE represents net on-site vertical fluxes of carbon. NECB represents all on-site and off-site carbon fluxes associated with crop production. Estimates of cropland NEE using moderate resolution (~1 km²) land cover data were generated for the conterminous US and compared with higher resolution (30 m) estimates of NEE and with direct measurements of CO2 flux from croplands in Illinois and Nebraska. Estimates of NEE using the CDL (30 m resolution) had a higher correlation with eddy covariance flux tower estimates compared with estimates of NEE using MODIS. Estimates of NECB are primarily driven by net soil carbon change, fossil-fuel emissions associated with crop production, and CO2 emissions from the application of agricultural lime. NEE and NECB for US croplands were -274 and 7 Tg C yr⁻¹ for 2004, respectively. Use of moderate to high resolution satellite-based land cover data enables improved estimates of cropland carbon dynamics.

  4. A simple microviscometric approach based on Brownian motion tracking.

    PubMed

    Hnyluchová, Zuzana; Bjalončíková, Petra; Karas, Pavel; Mravec, Filip; Halasová, Tereza; Pekař, Miloslav; Kubala, Lukáš; Víteček, Jan

    2015-02-01

    Viscosity, an integral property of a liquid, is traditionally determined by mechanical instruments. The most pronounced disadvantage of such an approach is the requirement of a large sample volume, which poses a serious obstacle, particularly in biology and biophysics when working with limited samples. Scaling down the required volume by means of microviscometry based on tracking the Brownian motion of particles can provide a reasonable alternative. In this paper, we report a simple microviscometric approach which can be conducted with common laboratory equipment. The core of this approach consists in a freely available standalone script to process particle trajectory data based on a Newtonian model. In our study, this setup allowed the sample to be scaled down to 10 μl. The utility of the approach was demonstrated using model solutions of glycerine, hyaluronate, and mouse blood plasma. Therefore, this microviscometric approach based on a newly developed freely available script can be suggested for determination of the viscosity of small biological samples (e.g., body fluids). PMID:25725855
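    A minimal sketch of the Newtonian-model calculation behind such particle-tracking microviscometry, assuming 2-D tracking and the Stokes-Einstein relation. The bead diameter, temperature, and simulated track parameters are illustrative; this is not the authors' published script.

```python
import numpy as np

KB = 1.380649e-23  # Boltzmann constant, J/K

def viscosity_from_tracks(tracks, dt, diameter, temperature=298.15):
    """Estimate viscosity from 2-D Brownian trajectories via the
    Stokes-Einstein relation: MSD(t) = 4 D t and eta = kB T / (3 pi d D).
    `tracks` is a list of (n_steps, 2) position arrays in metres."""
    # Pool single-step squared displacements to estimate MSD(dt) = 4 D dt.
    sq = np.concatenate([np.sum(np.diff(tr, axis=0) ** 2, axis=1)
                         for tr in tracks])
    D = sq.mean() / (4.0 * dt)
    return KB * temperature / (3.0 * np.pi * diameter * D)

# Synthetic check: simulate 1-um beads in a water-like liquid (~1 mPa s).
rng = np.random.default_rng(0)
eta_true, d, T, dt = 1.0e-3, 1e-6, 298.15, 0.01
D_true = KB * T / (3 * np.pi * d * eta_true)
step = np.sqrt(2 * D_true * dt)                  # per-axis step std. dev.
tracks = [np.cumsum(rng.normal(0, step, size=(1000, 2)), axis=0)
          for _ in range(20)]
eta_est = viscosity_from_tracks(tracks, dt, d, T)
```

    Because viscosity enters only through the diffusion coefficient, a camera, a stage, and a tracking script are the whole instrument, which is what makes the 10 μl sample volume feasible.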

  5. A simple microviscometric approach based on Brownian motion tracking

    NASA Astrophysics Data System (ADS)

    Hnyluchová, Zuzana; Bjalončíková, Petra; Karas, Pavel; Mravec, Filip; Halasová, Tereza; Pekař, Miloslav; Kubala, Lukáš; Víteček, Jan

    2015-02-01

    Viscosity—an integral property of a liquid—is traditionally determined by mechanical instruments. The most pronounced disadvantage of such an approach is the requirement of a large sample volume, which poses a serious obstacle, particularly in biology and biophysics when working with limited samples. Scaling down the required volume by means of microviscometry based on tracking the Brownian motion of particles can provide a reasonable alternative. In this paper, we report a simple microviscometric approach which can be conducted with common laboratory equipment. The core of this approach consists in a freely available standalone script to process particle trajectory data based on a Newtonian model. In our study, this setup allowed the sample to be scaled down to 10 μl. The utility of the approach was demonstrated using model solutions of glycerine, hyaluronate, and mouse blood plasma. Therefore, this microviscometric approach based on a newly developed freely available script can be suggested for determination of the viscosity of small biological samples (e.g., body fluids).

  6. A proven knowledge-based approach to prioritizing process information

    NASA Technical Reports Server (NTRS)

    Corsberg, Daniel R.

    1991-01-01

    Many space-related processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect is rapid analysis of the changing process information. During a disturbance, this task can overwhelm humans as well as computers. Humans deal with this by applying heuristics in determining significant information. A simple, knowledge-based approach to prioritizing information is described. The approach models those heuristics that humans would use in similar circumstances. The approach described has received two patents and was implemented in the Alarm Filtering System (AFS) at the Idaho National Engineering Laboratory (INEL). AFS was first developed for application in a nuclear reactor control room. It has since been used in chemical processing applications, where it has had a significant impact on control room environments. The approach uses knowledge-based heuristics to analyze data from process instrumentation and respond to that data according to knowledge encapsulated in objects and rules. While AFS cannot perform the complete diagnosis and control task, it has proven to be extremely effective at filtering and prioritizing information. AFS was used for over two years as a first level of analysis for human diagnosticians. Given the approach's proven track record in a wide variety of practical applications, it should be useful in both ground- and space-based systems.

  7. Effects of storm runoff on acid-base accounting of mine drainage

    SciTech Connect

    Sjoegren, D.R.; Olyphant, G.A.; Harper, D.

    1997-12-31

    Pre-reclamation conditions were documented at an abandoned mine site in an upland area at the headwaters of a small perennial stream in southwestern Indiana. Stream discharge and chemistry were monitored from April to October 1995, in an effort to assess the total acid-base budget of outflows from the site. The chemistry of three lakes, a shallow aquifer, and flooded mine voids was also monitored. During the period of monitoring, thirty-five rainfall-runoff events occurred, producing a total storm discharge of approximately 6.12 × 10⁷ L. Baseflow during the monitoring period was approximately 1.10 × 10⁸ L and was characterized by water chemistry that was similar to that of a spring that issued from the flooded mine voids. Analysis of the discharge and chemistry associated with an isolated thunderstorm revealed fluctuations in acidity that were not congruent with fluctuations in the total discharge hydrograph. For example, acidity increased rapidly during the initial phase of hydrograph rise, but dropped significantly as the storm hydrograph peaked. A second, more subdued, rise in acidity occurred during a second rain pulse, and the acidity gradually decreased to pre-storm levels during hydrograph recession. The trends are interpreted to reflect different sources of storm runoff associated with various components of the total discharge hydrograph. Preliminary calculations indicate that the total quantity of acidity that is discharged during stormflow is about eight times higher than that which is discharged during a comparable period under baseflow conditions. While the lower acid concentrations generated during storm events are ecologically favorable, the increase in total quantities of acidity can have implications for the buffering capacities of receiving water bodies.
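    The acid-base budget bookkeeping reduces to discharge times concentration. A toy calculation with purely illustrative numbers shows how stormflow can carry several times the baseflow load over a comparable period even at lower concentrations, as the abstract reports.

```python
def acidity_load_kg(discharge_litres, acidity_mg_per_l):
    """Acidity load (kg) carried by a flow component: volume (L) times
    concentration (mg/L), converted from mg to kg."""
    return discharge_litres * acidity_mg_per_l / 1.0e6

# Illustrative values only (not the monitored data): over a comparable
# period, stormflow moves far more water at a somewhat lower acid
# concentration, so its total load dominates.
storm = acidity_load_kg(1.0e7, 80.0)    # high flow, lower concentration
base = acidity_load_kg(1.0e6, 100.0)    # low flow, higher concentration
ratio = storm / base                    # ~8x, echoing the abstract
```

    The same arithmetic, applied event by event, is what the study's "total quantity of acidity" comparison rests on.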

  8. Feasibility of a perfluorocarbon tracer based network to support monitoring, verification, and accounting of sequestered CO₂.

    PubMed

    Watson, Thomas B; Sullivan, Terrence

    2012-02-01

    Carbon capture and sequestration (CCS) will act as a bridging technology necessary to facilitate a transition from fossil fuels to an economy based on sustainable energy. The Department of Energy (DOE) target leak rate for sequestration reservoirs is 1% of total sequestered CO₂ over the lifetime of the reservoir. This is 0.001% per year for a 1000-year lifetime of a storage reservoir. Effective detection of CO₂ leaks at the surface may require incorporation of a tracer tag into the sequestered CO₂. We applied a simple Gaussian plume model to predict dispersion of a direct leak into the atmosphere and used the results to examine the requirements for designing a perfluorocarbon tracer (PFT) monitoring network and tracer tagging strategy. Careful consideration must be given to the climate implications of using these compounds. The quantity of PFTs needed for tagging sequestered CO₂ is too large to be practical for routine monitoring. Tagging at a level that will result in 1.5 times background at a sampler 1 km from a leak of 0.01% per year will require 625 kg per year of PFT. This is a leak rate 10 times greater than the 1000-year DOE requirement and will require 19 tons of injected PFT over the 30-year lifetime of a 1000-megawatt coal-fired plant. The utility of PFTs or any other tracer will be lost if the background levels are allowed to rise indiscriminately. A better use of PFTs is as a tool in sequestration research. Instead, geological surveys of sequestration sites will be necessary to locate potential direct pathways and to develop targeted monitoring strategies. A global agreement on the use of tracers for monitoring CCS projects should be developed. PMID:22243211
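    A simple Gaussian plume calculation of the kind the study applied might look as follows. The source strength, wind speed, and dispersion parameters are placeholders; in practice sigma_y and sigma_z would come from stability-class curves at the downwind distance of interest.

```python
import numpy as np

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Steady-state Gaussian plume concentration (g/m^3) for a continuous
    point source of strength Q (g/s) in wind speed u (m/s), with ground
    reflection. sigma_y and sigma_z are the lateral/vertical dispersion
    parameters evaluated at the receptor's downwind distance."""
    lateral = np.exp(-y ** 2 / (2.0 * sigma_y ** 2))
    vertical = (np.exp(-(z - H) ** 2 / (2.0 * sigma_z ** 2))
                + np.exp(-(z + H) ** 2 / (2.0 * sigma_z ** 2)))
    return Q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Centerline, ground-level concentration from a tagged surface leak; the
# sigma values stand in for stability-class curves at ~1 km downwind.
c = gaussian_plume(Q=1e-3, u=3.0, y=0.0, z=0.0, H=0.0,
                   sigma_y=80.0, sigma_z=40.0)
```

    Requiring this concentration to exceed 1.5 times the ambient tracer background at the sampler is what drives the large tagging quantities the abstract reports.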

  9. [Internet-based approaches in the therapy of eating disorders].

    PubMed

    Fichter, M M; Quadflieg, N; Nisslmüller, K; Lindner, S; Voderholzer, U; Wünsch-Leiteritz, W; Osen, B; Huber, T; Zahn, S; Meermann, R; Irrgang, V; Bleichner, F

    2011-09-01

    Recent technological developments in communication media offer new approaches to diagnostic and therapeutic interactions with patients. One major development is Internet-based primary prevention in vulnerable individuals who are not yet affected, as well as the development of new therapeutic approaches for affected individuals, based on experience with guided self-help through CDs, DVDs, or bibliotherapy. The eating disorder literature shows several interesting, partly controlled and randomized studies on bulimia nervosa, a few studies on binge eating disorder, and no studies on anorexia nervosa. As part of the German Eating Disorder Network on Psychotherapy (EDNET), a 9-month Internet-based relapse prevention program for patients with anorexia nervosa after inpatient treatment was evaluated. The conception, first experiences, and first results of this Internet-based relapse prevention program for anorexia nervosa are reported. PMID:21755336

  10. A Comparison of Seven Cox Regression-Based Models to Account for Heterogeneity Across Multiple HIV Treatment Cohorts in Latin America and the Caribbean

    PubMed Central

    Giganti, Mark J.; Luz, Paula M.; Caro-Vega, Yanink; Cesar, Carina; Padgett, Denis; Koenig, Serena; Echevarria, Juan; McGowan, Catherine C.; Shepherd, Bryan E.

    2015-01-01

    Many studies of HIV/AIDS aggregate data from multiple cohorts to improve power and generalizability. There are several analysis approaches to account for cross-cohort heterogeneity; we assessed how different approaches can impact results from an HIV/AIDS study investigating predictors of mortality. Using data from 13,658 HIV-infected patients starting antiretroviral therapy in seven Latin American and Caribbean cohorts, we illustrate the assumptions of seven readily implementable approaches to account for across-cohort heterogeneity with Cox proportional hazards models, and we compare hazard ratio estimates across approaches. As a sensitivity analysis, we modify cohort membership to generate specific heterogeneity conditions. Hazard ratio estimates varied slightly between the seven analysis approaches, but differences were not clinically meaningful. Adjusted hazard ratio estimates for the association between AIDS at treatment initiation and death varied from 2.00 to 2.20 across approaches that accounted for heterogeneity; the adjusted hazard ratio was estimated as 1.73 in analyses that ignored across-cohort heterogeneity. In sensitivity analyses with more extreme heterogeneity, we noted a slightly greater distinction between approaches. Despite substantial heterogeneity between cohorts, the impact of the specific approach to account for heterogeneity was minimal in our case study. Our results suggest that it is important to account for across-cohort heterogeneity in analyses, but that the specific technique for addressing heterogeneity may be less important. Because of their flexibility in accounting for cohort heterogeneity, we prefer stratification or meta-analysis methods, but we encourage investigators to consider their specific study conditions and objectives. PMID:25647087
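    The stratification the authors prefer can be sketched with a stratified Cox partial log-likelihood, in which each cohort contributes its own risk sets (and hence its own baseline hazard). The toy data and single binary covariate below are invented for illustration, not taken from the study.

```python
import math

def stratified_cox_loglik(beta, data):
    """Stratified Cox partial log-likelihood for one covariate.
    data: list of (time, event, x, stratum); risk sets are formed
    within each stratum, so each cohort keeps its own baseline hazard."""
    ll = 0.0
    strata = {s for (_, _, _, s) in data}
    for s in strata:
        rows = [(t, e, x) for (t, e, x, st) in data if st == s]
        for t_i, e_i, x_i in rows:
            if not e_i:
                continue  # censored observations contribute only to risk sets
            risk = [x for (t, _, x) in rows if t >= t_i]
            ll += beta * x_i - math.log(sum(math.exp(beta * x) for x in risk))
    return ll

# Toy two-cohort data: (time, death indicator, AIDS-at-baseline, cohort)
data = [(2, 1, 1, "A"), (4, 1, 0, "A"), (5, 0, 1, "A"),
        (1, 1, 1, "B"), (3, 0, 0, "B"), (6, 1, 0, "B")]

# Crude grid search for the maximum partial-likelihood estimate of beta.
beta_hat = max((b / 10 for b in range(-30, 31)),
               key=lambda b: stratified_cox_loglik(b, data))
```

    In practice one would use a survival library's stratified Cox fitter rather than a grid search; the point here is only that the baseline hazard cancels within, not across, cohorts.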

  12. [Marine environmental assessment approaches based on biomarker index: a review].

    PubMed

    Meng, Fan-Ping; Yang, Fei-Fei; Cheng, Feng-Lian

    2012-04-01

    Biomarkers are applied worldwide in marine environmental assessment owing to their "early warning" response to chemical pollutants. Several integrative index approaches, such as the multi-marker pollution index (MPI), integrated biomarker response (IBR), bioeffect assessment index (BAI), biomarker response index (BRI), and health assessment index (HAI), have been developed based on biomarkers. By transforming the complex alterations of biomarkers into a single class or value, these approaches have so far been useful tools for assessing environmental quality. This review summarized the establishment of the evaluation indicator system, the calculation of the integrative index, the grading of pollution status, and the practical applications of each approach, and discussed the existing problems in biomarker-index-based marine environmental assessment. Some suggestions for improvement were also proposed. PMID:22803485
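    One of the indices the review covers, the integrated biomarker response (IBR), can be sketched as follows: each biomarker is Z-scored across stations, shifted to a non-negative score, and the scores are summed as the area of a star plot. This is a simplified sketch, and the station data are invented.

```python
import math
from statistics import mean, pstdev

def ibr_scores(stations):
    """Simplified integrated biomarker response, one value per station.
    stations: per-station biomarker readings, same biomarker order."""
    n_bio = len(stations[0])
    # Z-score each biomarker across stations
    z = []
    for j in range(n_bio):
        col = [st[j] for st in stations]
        m, sd = mean(col), pstdev(col)
        z.append([(v - m) / sd for v in col])
    # shift each biomarker by its minimum across stations => scores >= 0
    shifted = [[z[j][i] - min(z[j]) for j in range(n_bio)]
               for i in range(len(stations))]

    def star_area(s):
        # star-plot area: sum of triangles between consecutive axes
        n = len(s)
        return sum(s[k] * s[(k + 1) % n] * math.sin(2 * math.pi / n) / 2
                   for k in range(n))

    return [star_area(s) for s in shifted]

stations = [[1.0, 10.0, 0.2],   # least impacted site
            [2.0, 12.0, 0.4],
            [3.0, 14.0, 0.6]]   # most impacted site
scores = ibr_scores(stations)
```

    The single value per station is what makes such indices convenient for grading pollution status, at the cost of hiding which biomarker drives the score.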

  13. Evaluation of a Blog Based Parent Involvement Approach by Parents

    ERIC Educational Resources Information Center

    Ozcinar, Zehra; Ekizoglu, Nihat

    2013-01-01

    Despite the well-known benefits of parent involvement in children's education, research clearly shows that it is difficult to effectively involve parents. This study aims to capture parents' views of a Blog Based Parent Involvement Approach (BPIA) designed to secure parent involvement in education by strengthening school-parent communication. Data…

  14. Augmented approach to desirability function based on MM estimator

    NASA Astrophysics Data System (ADS)

    Midi, Habshah; Mustafa, Mohd Shafie; Fitrianto, Anuar

    2013-04-01

    The desirability function approach is commonly used in industry to tackle multiple response optimization problems. The shortcoming of this approach is that the variability in each predicted response is ignored. It is now evident that the actual response may fall outside the acceptable region even though the predicted response at the optimal solution has a high overall desirability score. An augmented approach to the desirability function (AADF) has been put forward to rectify this problem. Nevertheless, the AADF is easily affected by outliers, since it is constructed from the ordinary least squares (OLS) estimate, which is not resistant to outliers. As an alternative, we propose a robust MM-estimator to estimate the parameters of the response surface model (RSM) and incorporate the estimated parameters in the augmented-approach framework. A numerical example is presented to assess the performance of the AADF-MM-based method. The numerical results signify that the AADF-MM-based method is more efficient than the AADF-OLS-based method.
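    The underlying desirability function (Derringer-Suich style) can be sketched as below: each response gets a score in [0, 1] and the overall desirability is their geometric mean. The response values and limits are illustrative, and the robust MM-estimation step of the paper is omitted.

```python
def desirability_larger_is_better(y, low, high, weight=1.0):
    """Derringer-Suich one-sided desirability: 0 below `low`,
    1 above `high`, a power curve in between."""
    if y <= low:
        return 0.0
    if y >= high:
        return 1.0
    return ((y - low) / (high - low)) ** weight

def overall_desirability(ds):
    """Overall desirability D: geometric mean of individual scores."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# Two predicted responses from a fitted response-surface model
# (numbers are illustrative, not from the paper).
d1 = desirability_larger_is_better(78.0, low=60.0, high=90.0)
d2 = desirability_larger_is_better(0.8, low=0.5, high=1.0)
D = overall_desirability([d1, d2])
```

    Because D is a geometric mean, any single response with zero desirability drives the overall score to zero, which is the intended "veto" behavior.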

  15. Content Based Image Retrieval and Information Theory: A General Approach.

    ERIC Educational Resources Information Center

    Zachary, John; Iyengar, S. S.; Barhen, Jacob

    2001-01-01

    Proposes an alternative real valued representation of color based on the information theoretic concept of entropy. A theoretical presentation of image entropy is accompanied by a practical description of the merits and limitations of image entropy compared to color histograms. Results suggest that image entropy is a promising approach to image…
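    The image-entropy idea can be sketched directly from the definition: the Shannon entropy of the intensity histogram. The pixel data below are invented.

```python
import math
from collections import Counter

def image_entropy(pixels):
    """Shannon entropy (bits/pixel) of an image's intensity histogram.
    A flat histogram gives maximal entropy; a constant image gives 0."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

flat = [0, 1, 2, 3] * 16   # uniform over 4 levels -> 2 bits/pixel
constant = [7] * 64        # single level -> 0 bits/pixel
```

    A single real number per image (or per color channel) is far more compact than a full histogram, which is the trade-off the abstract weighs against histogram-based retrieval.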

  16. School-Based HIV Prevention: A Multidisciplinary Approach.

    ERIC Educational Resources Information Center

    Kerr, Dianne L.; And Others

    This manual was written to help school-based professionals implement school health education programs to prevent the spread of the human immunodeficiency virus (HIV). The manual provides a framework and plan to promote an interdisciplinary approach to HIV education in schools. The manual begins with a review of basic facts about acquired immune…

  17. Tennis: Applied Examples of a Game-Based Teaching Approach

    ERIC Educational Resources Information Center

    Crespo, Miguel; Reid, Machar M.; Miley, Dave

    2004-01-01

    In this article, the authors reveal that tennis has been increasingly taught with a tactical model or game-based approach, which emphasizes learning through practice in match-like drills and actual play, rather than in practicing strokes for exact technical execution. Its goal is to facilitate the player's understanding of the tactical, physical…

  18. An Evidence-Based Approach to Introductory Chemistry

    ERIC Educational Resources Information Center

    Johnson, Philip

    2014-01-01

    Drawing on research into students' understanding, this article argues that the customary approach to introductory chemistry has created difficulties for students. Instead of being based on the notion of "solids, liquids and gases", introductory chemistry should be structured to develop the concept of a substance. The concept of a…

  19. Evaluation Theory in Problem-Based Learning Approach.

    ERIC Educational Resources Information Center

    Hsu, Yu-chen

    The purpose of this paper is to review evaluation theories and techniques in both the medical and educational fields and to propose an evaluation theory to explain the condition variables, the method variables, and the outcome variables of student assessment in a problem-based learning (PBL) approach. The PBL definition and process are presented,…

  20. IIM Digital Library System: Consortia-Based Approach.

    ERIC Educational Resources Information Center

    Pandian, M. Paul; Jambhekar, Ashok; Karisiddappa, C. R.

    2002-01-01

    Provides a framework for the design and development of an intranet model based on a consortia approach by the Indian Institutes of Management (IIM) digital library system that will facilitate information access and use by providing a single Web-enabled window to users to their own resources and to sources in other participating institutions.…

  1. From Equation to Inequality Using a Function-Based Approach

    ERIC Educational Resources Information Center

    Verikios, Petros; Farmaki, Vassiliki

    2010-01-01

    This article presents features of a qualitative research study concerning the teaching and learning of school algebra using a function-based approach in a grade 8 class, of 23 students, in 26 lessons, in a state school of Athens, in the school year 2003-2004. In this article, we are interested in the inequality concept and our aim is to…

  2. A Problem-Based Learning Approach to Entrepreneurship Education

    ERIC Educational Resources Information Center

    Tan, Siok San; Ng, C. K. Frank

    2006-01-01

    Purpose: While it is generally acknowledged that entrepreneurship can be taught, many differ in their opinions about the appropriate methodologies to teach and equip students with the requisite entrepreneurial skills. This paper presents a case to suggest that a problem-based learning (PBL) approach practised at the Republic Polytechnic in…

  3. A Genre-Based Approach to Teaching EFL Summary Writing

    ERIC Educational Resources Information Center

    Chen, Yuan-Shan; Su, Shao-Wen

    2012-01-01

    This study utilizes a pre-test/post-test assessment to investigate the instructional efficacy of a genre-based approach to teaching summary writing. Forty-one EFL university students in Taiwan were asked before and after the instruction to summarize a simplified version of The Adventures of Tom Sawyer in a maximum of 500 words. All the students'…

  4. Economic Dispatch Using Genetic Algorithm Based Hybrid Approach

    SciTech Connect

    Tahir Nadeem Malik; Aftab Ahmad; Shahab Khushnood

    2006-07-01

    Power economic dispatch (ED) is a vital daily optimization procedure in power system operation. Present-day large generating units with multi-valve steam turbines exhibit large variation in their input-output characteristic functions, so non-convexity appears in the characteristic curves. Various mathematical and optimization techniques have been developed and applied to solve the economic dispatch problem. Most of these are calculus-based optimization algorithms that rely on successive linearization and use the first- and second-order derivatives of the objective function and its constraint equations as the search direction. They usually require the heat-input/power-output characteristics of generators to be monotonically increasing or piecewise linear. These simplifying assumptions result in an inaccurate dispatch. Genetic algorithms have been used to solve the economic dispatch problem both independently and in conjunction with other AI tools and mathematical programming approaches. Genetic algorithms have an inherent ability to reach the global minimum region of the search space in a short time, but then take longer to converge to the solution. GA-based hybrid approaches get around this problem and produce encouraging results. This paper presents a brief survey of hybrid approaches for economic dispatch, an architecture for an extensible computational framework as a common environment for conventional, genetic-algorithm, and hybrid solutions to power economic dispatch, and the implementation of three algorithms in the developed framework. The framework was tested on standard test systems for performance evaluation. (authors)
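    A minimal generational GA for economic dispatch, assuming quadratic fuel-cost curves and a penalty term for power-balance violation; the unit data are illustrative, and this is a generic GA sketch, not the paper's hybrid framework.

```python
import random

random.seed(42)  # deterministic run for illustration

# Quadratic fuel-cost curves c_i(P) = a + b*P + c*P^2 (illustrative units)
UNITS = [(100.0, 2.0, 0.010, 50.0, 200.0),   # (a, b, c, Pmin, Pmax)
         (120.0, 1.8, 0.012, 50.0, 200.0),
         (80.0, 2.2, 0.008, 50.0, 200.0)]
DEMAND = 450.0

def cost(p):
    fuel = sum(a + b * x + c * x * x
               for (a, b, c, _, _), x in zip(UNITS, p))
    return fuel + 1e3 * abs(sum(p) - DEMAND)  # power-balance penalty

def random_solution():
    return [random.uniform(pmin, pmax) for (_, _, _, pmin, pmax) in UNITS]

def mutate(p):
    q = [x + random.gauss(0, 5) for x in p]
    return [min(max(x, pmin), pmax)          # clamp to unit limits
            for x, (_, _, _, pmin, pmax) in zip(q, UNITS)]

def crossover(p, q):
    return [x if random.random() < 0.5 else y for x, y in zip(p, q)]

pop = [random_solution() for _ in range(40)]
for _ in range(200):                          # generational loop with elitism
    pop.sort(key=cost)
    elite = pop[:10]
    pop = elite + [mutate(crossover(random.choice(elite),
                                    random.choice(elite)))
                   for _ in range(30)]
best = min(pop, key=cost)
```

    Because the GA only evaluates the cost function, it needs none of the convexity or monotonicity assumptions the abstract criticizes; the hybrid methods surveyed then hand the GA's region estimate to a local optimizer for fast final convergence.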

  5. [Global brain metastases management strategy: a multidisciplinary-based approach].

    PubMed

    Métellus, P; Tallet, A; Dhermain, F; Reyns, N; Carpentier, A; Spano, J-P; Azria, D; Noël, G; Barlési, F; Taillibert, S; Le Rhun, É

    2015-02-01

    Brain metastases management has evolved over the last fifteen years and may use varying strategies, including more or less aggressive treatments, sometimes combined, leading to improvements in patients' survival and quality of life. The therapeutic decision is subject to a multidisciplinary analysis, taking into account established prognostic factors including the patient's general condition, extracerebral disease status, and the clinical and radiological presentation of lesions. In this article, we propose a management strategy based on the state of current knowledge and available therapeutic resources. PMID:25649388

  6. A Dynamic Path Planning Approach for Multirobot Sensor-Based Coverage Considering Energy Constraints.

    PubMed

    Yazici, Ahmet; Kirlik, Gokhan; Parlaktuna, Osman; Sipahioglu, Aydin

    2014-03-01

    Multirobot sensor-based coverage path planning determines a tour for each robot in a team such that every point in a given workspace is covered by at least one robot using its sensors. In sensor-based coverage of narrow spaces, i.e., where obstacles lie within the sensor range, a generalized Voronoi diagram (GVD)-based graph can be used to model the environment. A complete sensor-based coverage path plan for the robot team can be obtained by applying capacitated arc routing problem solution methods to the GVD-based graph. Unlike the capacitated arc routing problem, however, the sensor-based coverage problem requires considering two types of edge demands. Therefore, a modified Ulusoy algorithm is used to obtain mobile robot tours by taking into account two different energy consumption cases during sensor-based coverage. However, due to the partially unknown nature of the environment, the robots may encounter obstacles on their tours. This requires a replanning process that considers the remaining energy capacities and the current positions of the robots. In this paper, the modified Ulusoy algorithm is extended to incorporate this dynamic planning problem. A dynamic path-planning approach is proposed for multirobot sensor-based coverage of narrow environments that considers the energy capacities of the mobile robots. The approach is tested in a laboratory environment using Pioneer 3-DX mobile robots. Simulations are also conducted for a larger test environment. PMID:23757551
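    The tour-splitting step at the heart of Ulusoy-type methods can be sketched as a dynamic program over one giant route: cut it into subtours whose energy demand fits the robot's capacity, paying a fixed depot-return cost per subtour. Segment demands and costs below are invented, and the paper's two edge-demand types are collapsed into one for brevity.

```python
def split_route(demands, capacity, return_cost=2.0):
    """Ulusoy-style route splitting (simplified): partition one giant
    route's coverage segments into minimum-cost feasible subtours.
    Equivalent to a shortest path on an auxiliary DAG, solved by DP."""
    n = len(demands)
    INF = float("inf")
    best = [INF] * (n + 1)   # best[j]: min cost to cover segments 0..j-1
    cut = [0] * (n + 1)      # predecessor for subtour recovery
    best[0] = 0.0
    for j in range(1, n + 1):
        load = 0.0
        for i in range(j, 0, -1):        # candidate subtour: segments i..j
            load += demands[i - 1]
            if load > capacity:
                break                    # energy capacity exceeded
            c = best[i - 1] + load + return_cost
            if c < best[j]:
                best[j], cut[j] = c, i - 1
    # recover the subtours from the predecessor array
    tours, j = [], n
    while j > 0:
        tours.append(list(range(cut[j], j)))
        j = cut[j]
    return best[n], tours[::-1]

cost_total, tours = split_route([3.0, 4.0, 2.0, 5.0, 1.0], capacity=8.0)
```

    In a replanning step, the same DP can simply be rerun from the robot's current position with its remaining energy as the capacity.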

  7. Branch-Based Model for the Diameters of the Pulmonary Airways: Accounting for Departures From Self-Consistency and Registration Errors

    SciTech Connect

    Neradilek, Moni B.; Polissar, Nayak L.; Einstein, Daniel R.; Glenny, Robb W.; Minard, Kevin R.; Carson, James P.; Jiao, Xiangmin; Jacob, Richard E.; Cox, Timothy C.; Postlethwait, Edward M.; Corley, Richard A.

    2012-04-24

    We examine a previously published branch-based approach to modeling airway diameters that is predicated on the assumption of self-consistency across all levels of the tree. We mathematically formulate this assumption, propose a method to test it, and develop a more general model to be used when the assumption is violated. We discuss the effect of measurement error on the estimated models and propose methods that account for it. The methods are illustrated on data from MRI and CT images of silicone casts of two rats, two normal monkeys, and one ozone-exposed monkey. Our results showed substantial departures from self-consistency in all five subjects. When departures from self-consistency exist, we do not recommend using the self-consistency model, even as an approximation, as we have shown that it is likely to lead to an incorrect representation of the diameter geometry. Measurement error has an important impact on the estimated morphometry models and needs to be accounted for in the analysis.

  8. Simulation-Based Approach for Site-Specific Optimization of Hydrokinetic Turbine Arrays

    NASA Astrophysics Data System (ADS)

    Sotiropoulos, F.; Chawdhary, S.; Yang, X.; Khosronejad, A.; Angelidis, D.

    2014-12-01

    A simulation-based approach has been developed to enable site-specific optimization of tidal and current turbine arrays in real-life waterways. The computational code is based on the St. Anthony Falls Laboratory Virtual StreamLab (VSL3D), which is able to carry out high-fidelity simulations of turbulent flow and sediment transport processes in rivers and streams taking into account the arbitrary geometrical complexity characterizing natural waterways. The computational framework can be used either in turbine-resolving mode, to take into account all geometrical details of the turbine, or with the turbines parameterized as actuator disks or actuator lines. Locally refined grids are employed to dramatically increase the resolution of the simulation and enable efficient simulations of multi-turbine arrays. Turbine/sediment interactions are simulated using the coupled hydro-morphodynamic module of VSL3D. The predictive capabilities of the resulting computational framework will be demonstrated by applying it to simulate turbulent flow past a tri-frame configuration of hydrokinetic turbines in a rigid-bed turbulent open channel flow as well as turbines mounted on mobile bed open channels to investigate turbine/sediment interactions. The utility of the simulation-based approach for guiding the optimal development of turbine arrays in real-life waterways will also be discussed and demonstrated. This work was supported by NSF grant IIP-1318201. Simulations were carried out at the Minnesota Supercomputing Institute.

  9. A Unified Model of Time Perception Accounts for Duration-Based and Beat-Based Timing Mechanisms

    PubMed Central

    Teki, Sundeep; Grube, Manon; Griffiths, Timothy D.

    2011-01-01

    Accurate timing is an integral aspect of sensory and motor processes such as the perception of speech and music and the execution of skilled movement. Neuropsychological studies of time perception in patient groups and functional neuroimaging studies of timing in normal participants suggest common neural substrates for perceptual and motor timing. A timing system is implicated in core regions of the motor network such as the cerebellum, inferior olive, basal ganglia, pre-supplementary, and supplementary motor area, pre-motor cortex as well as higher-level areas such as the prefrontal cortex. In this article, we assess how distinct parts of the timing system subserve different aspects of perceptual timing. We previously established brain bases for absolute, duration-based timing and relative, beat-based timing in the olivocerebellar and striato-thalamo-cortical circuits respectively (Teki et al., 2011). However, neurophysiological and neuroanatomical studies provide a basis to suggest that timing functions of these circuits may not be independent. Here, we propose a unified model of time perception based on coordinated activity in the core striatal and olivocerebellar networks that are interconnected with each other and the cerebral cortex through multiple synaptic pathways. Timing in this unified model is proposed to involve serial beat-based striatal activation followed by absolute olivocerebellar timing mechanisms. PMID:22319477

  10. A Collaborative Approach to Preparing Field-Based Teachers/Supervisors for Standards-Based Accountability Systems in Teacher Education

    ERIC Educational Resources Information Center

    Powell, Beverlee-Ann; Szlosek, Peggy; Flaherty, Thomas; Ryan, Lynne

    2007-01-01

    Preparing college supervisors/cooperating teachers to support teacher candidates' performance using standards is a challenge for teacher preparation programs. This paper will describe a professional development program collaboratively developed by representatives of the state department of education, institutions of higher education, and K-12…

  11. A Hybrid LSSVR/HMM-Based Prognostic Approach

    PubMed Central

    Liu, Zhijuan; Li, Qing; Liu, Xianhui; Mu, Chundi

    2013-01-01

    In a health management system, prognostics, an engineering discipline that predicts a system's future health, is an important aspect, yet research in this field is currently limited. In this paper, a hybrid approach for prognostics is proposed. The approach combines least squares support vector regression (LSSVR) with the hidden Markov model (HMM). Features extracted from sensor signals are used to train HMMs, which represent different health levels. An LSSVR algorithm is used to predict the feature trends. The LSSVR training and prediction algorithms are modified by adding new data and deleting old data, and the probabilities of the predicted features for each HMM are calculated based on forward or backward algorithms. Based on these probabilities, one can determine a system's future health state and estimate the remaining useful life (RUL). To evaluate the proposed approach, a test was carried out using bearing vibration signals. Simulation results show that the LSSVR/HMM approach can forecast faults long before they occur and can predict the RUL. Therefore, the LSSVR/HMM approach is very promising in the field of prognostics. PMID:23624688
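    The HMM side of the approach rests on the forward algorithm, which scores an observation sequence under each health-level model; the model giving the highest probability indicates the current health state. A minimal sketch with invented states and parameters:

```python
def forward_prob(obs, states, start_p, trans_p, emit_p):
    """HMM forward algorithm: probability of an observation sequence
    under one health-level model."""
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: emit_p[s][o] * sum(alpha[r] * trans_p[r][s]
                                       for r in states)
                 for s in states}
    return sum(alpha.values())

# Invented two-state health model over discretized vibration features.
states = ("steady", "degrading")
start = {"steady": 0.8, "degrading": 0.2}
trans = {"steady": {"steady": 0.9, "degrading": 0.1},
         "degrading": {"steady": 0.0, "degrading": 1.0}}
emit = {"steady": {"low": 0.7, "high": 0.3},
        "degrading": {"low": 0.2, "high": 0.8}}

p = forward_prob(("low", "low", "high"), states, start, trans, emit)
```

    In the hybrid scheme, the sequence scored this way is not the observed history but the LSSVR-predicted feature trend, which is what lets the method look ahead to a future health state and an RUL estimate.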

  12. An Efficient Soft Set-Based Approach for Conflict Analysis.

    PubMed

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, etc. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in the conflict data set, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handling conflict situations, based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we evaluate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach reduces computational time by up to 3.9% compared to rough set theory. PMID:26928627
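    The co-occurrence idea can be sketched by mapping each (issue, option) parameter of a soft set to the set of agents who chose it, and measuring conflict between two agents as the fraction of issues on which they never co-occur in a parameter's agent set. The voting data below are invented, not the Indonesian Parliament dataset, and this is a simplified reading of the method.

```python
def soft_set(votes):
    """Build a soft set: parameter (issue, option) -> set of agents."""
    f = {}
    for agent, choices in votes.items():
        for issue, option in choices.items():
            f.setdefault((issue, option), set()).add(agent)
    return f

def conflict_degree(f, a, b, issues):
    """Fraction of issues on which agents a and b do not co-occur
    in the same parameter's agent set."""
    disagree = 0
    for issue in issues:
        together = any(a in agents and b in agents
                       for (iss, _), agents in f.items() if iss == issue)
        disagree += not together
    return disagree / len(issues)

votes = {"A": {"bill1": "+", "bill2": "-", "bill3": "+"},
         "B": {"bill1": "+", "bill2": "+", "bill3": "-"},
         "C": {"bill1": "-", "bill2": "+", "bill3": "-"}}
f = soft_set(votes)
issues = ("bill1", "bill2", "bill3")
```

    Building the parameter-to-agents map once and reusing it for all agent pairs is what avoids the repeated rule induction of the rough-set formulation.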

  14. A combined forecasting approach based on fuzzy soft sets

    NASA Astrophysics Data System (ADS)

    Xiao, Zhi; Gong, Ke; Zou, Yan

    2009-06-01

    Forecasting export and import volumes in international trade is a prerequisite of a government's policy-making and guidance for healthier international trade development. However, an individual forecast may not always perform satisfactorily, while a combination of forecasts may result in a better forecast than any component forecast. We believe the component forecasts employed in combined forecasts are descriptions of the actual time series, which is fuzzy. This paper attempts to use forecasting accuracy as the criterion of the fuzzy membership function, and proposes a combined forecasting approach based on fuzzy soft sets. This paper also examines the method with international trade data from 1993 to 2006 for the Chongqing Municipality of China and compares it with a combined forecasting approach based on rough sets and with each individual forecast. The experimental results show that the combined approach provided in this paper improves the forecasting performance of each individual forecast and is free from the restrictions of a rough sets approach as well. It is a promising forecasting approach and a new application of soft set theory.
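    A minimal sketch of the accuracy-as-membership idea: each component forecast's membership grade is derived from its in-sample accuracy and normalized into a combination weight. The series, fitted values, and next-step forecasts are invented, and the MAPE-based grade is an illustrative choice, not necessarily the paper's exact criterion.

```python
def accuracy_membership(actual, predicted):
    """Fuzzy membership grade of a component forecast: 1 minus its
    mean absolute percentage error, floored at 0."""
    mape = sum(abs(a - p) / abs(a)
               for a, p in zip(actual, predicted)) / len(actual)
    return max(0.0, 1.0 - mape)

def combine(forecasts, memberships):
    """Combine next-step forecasts with membership grades
    normalized into weights."""
    total = sum(memberships)
    weights = [m / total for m in memberships]
    return sum(w * f for w, f in zip(weights, forecasts))

history = [100.0, 110.0, 120.0]       # observed series
model_a_fit = [98.0, 111.0, 119.0]    # accurate component model
model_b_fit = [80.0, 130.0, 100.0]    # noisy component model
m_a = accuracy_membership(history, model_a_fit)
m_b = accuracy_membership(history, model_b_fit)
next_point = combine([130.0, 150.0], [m_a, m_b])
```

    The combined forecast is pulled toward the historically more accurate model, which is the mechanism by which the combination outperforms each component.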

  15. Program Accounting for Indiana Schools.

    ERIC Educational Resources Information Center

    Copeland, Jack; Costerison, Dennis

    This booklet outlines the conversion of the Western Wayne (Indiana) Schools from a traditional school accounting and budgeting system to a program accounting and budgeting system. The Western Wayne Schools became the first district to adopt Indiana's new program accounting and budgeting system in 1975. The Indiana approach to program accounting is…

  16. Passive localization in ocean acoustics: A model-based approach

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    1995-09-01

    A model-based approach is developed to solve the passive localization problem in ocean acoustics using the state-space formulation for the first time. It is shown that the inherent structure of the resulting processor consists of a parameter estimator coupled to a nonlinear optimization scheme. The parameter estimator is designed using the model-based approach, in which an ocean acoustic propagation model is used in developing the model-based processor required for localization. Recall that model-based signal processing is a well-defined methodology enabling the inclusion of environmental (propagation) models, measurement (sensor array) models, and noise (shipping, measurement) models in a sophisticated processing algorithm. Here the parameter estimator, or more appropriately the model-based identifier (MBID), is designed for a propagation model developed from a shallow-water ocean experiment. After simulation, it is applied to a set of experimental data, demonstrating the applicability of this approach. © 1995 Acoustical Society of America.

  17. Attitudes toward a game-based approach to mental health.

    PubMed

    Kreutzer, Christine P; Bowers, Clint A

    2015-01-01

    Based on preliminary research, game-based treatments appear to be a promising approach to post-traumatic stress disorder (PTSD). However, attitudes toward this novel approach must be better understood. Thus, the objective of this study was to determine if video game self-efficacy mediates the relationship between expectations and reactions to a game-based treatment for PTSD. Participants played the serious game "Walk in My Shoes" (Novonics Corp., Orlando, FL) and completed a series of scales to measure attitudes toward the intervention. Video game self-efficacy was found to be a partial mediator of expectancies and reactions. These results suggest that enhancing attitudes via self-efficacy in a clinical setting may maximize treatment effectiveness. PMID:25584727

  18. An Evidential Approach To Model-Based Satellite Diagnosis

    NASA Astrophysics Data System (ADS)

    Bickmore, Timothy W.; Yoshimoto, Glenn M.

    1987-10-01

    Satellite diagnosis presents many unusual problems in the application of current knowledge-based diagnosis technology. The operation of satellite systems involves expertise that spans a large variety of systems, hardware, and software design areas. This expertise includes knowledge of design rationale and sensitivities, development history, test methods, test history, fault history and other indications of pedigree, and operational scenarios and environments. We have developed an approach to satellite diagnosis which can integrate evidence from a variety of diagnostic strategies encompassing this expertise. The system utilizes a structural and behavioral model of the satellite, and uses a form of spreading activation to perform the diagnostic procedures on the model. The various sources of diagnostic evidence are combined using a specially-tailored Dempster-Shafer based utility for modelling uncertainty. A prototype of a diagnostic system based on this approach has been implemented.
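    The evidence-combination step can be sketched with the standard Dempster rule of combination over mass functions; the paper's specially tailored utility is not described in detail, so this is the textbook rule, and the frame of discernment and masses below are invented.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose
    focal elements are frozensets over the frame of discernment."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb          # mass assigned to empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    k = 1.0 / (1.0 - conflict)               # renormalization factor
    return {a: v * k for a, v in combined.items()}

# Two diagnostic strategies weighing in on a fault (invented example).
FRAME = frozenset({"sensor", "bus", "software"})
m_test = {frozenset({"sensor"}): 0.6, FRAME: 0.4}            # test-history evidence
m_design = {frozenset({"sensor", "bus"}): 0.7, FRAME: 0.3}   # design-sensitivity evidence
belief = dempster_combine(m_test, m_design)
```

    Mass on the full frame represents ignorance, which is what lets each diagnostic strategy commit only as far as its evidence warrants before the results are fused.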

  19. Searching for adaptive traits in genetic resources - phenology based approach

    NASA Astrophysics Data System (ADS)

    Bari, Abdallah

    2015-04-01

    Searching for adaptive traits in genetic resources - phenology based approach. Abdallah Bari, Kenneth Street, Eddy De Pauw, Jalal Eddin Omari, and Chandra M. Biradar, International Center for Agricultural Research in the Dry Areas, Rabat Institutes, Rabat, Morocco. Phenology is an important plant trait, not only for assessing and forecasting food production but also for searching genebanks for adaptive traits. Among the phenological parameters we have been considering in the search for such adaptive and rare traits are the onset (sowing period) and the seasonality (growing period). Currently an application is being developed, as part of the focused identification of germplasm strategy (FIGS) approach, to use climatic data to identify crop growing seasons and characterize them in terms of onset and duration. These approximations of growing-period characteristics can then be used to estimate flowering and maturity dates for dryland crops, such as wheat, barley, faba bean, lentils and chickpea, and to assess, among others, phenology-related traits such as days to heading [dhe] and grain filling period [gfp]. The approach followed here is based on first calculating long-term average daily temperatures by fitting a curve to the monthly data over days from the beginning of the year. Prior to the identification of these phenological stages, the onset is first extracted from onset integer raster GIS layers developed from a model of the growing period that considers both moisture and temperature limitations. The paper presents some examples of real applications of the approach to search for rare and adaptive traits.
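    The curve-fitting step described above can be sketched as a first-harmonic Fourier fit to twelve monthly mean temperatures, from which a crude growing-season onset is read off as the first day exceeding a base temperature. The monthly means and the base temperature are illustrative assumptions; the paper's onset layers additionally account for moisture limitation, which is omitted here.

```python
import math

def daily_temperature_curve(monthly_means):
    """Fit a first-harmonic Fourier curve to 12 monthly mean
    temperatures; return a function of day-of-year (1..365)."""
    n = 12
    a0 = sum(monthly_means) / n
    a1 = 2 / n * sum(t * math.cos(2 * math.pi * k / n)
                     for k, t in enumerate(monthly_means))
    b1 = 2 / n * sum(t * math.sin(2 * math.pi * k / n)
                     for k, t in enumerate(monthly_means))

    def temp(day):
        x = 2 * math.pi * (day - 15) / 365   # January centered mid-month
        return a0 + a1 * math.cos(x) + b1 * math.sin(x)

    return temp

def onset_day(temp, base=5.0):
    """First day of year on which the fitted temperature exceeds the
    base temperature -- a crude temperature-limited onset."""
    for day in range(1, 366):
        if temp(day) > base:
            return day
    return None

# Illustrative northern-hemisphere monthly means (deg C), Jan..Dec
monthly = [-2, 0, 4, 9, 14, 18, 20, 19, 15, 9, 4, 0]
temp = daily_temperature_curve(monthly)
start = onset_day(temp, base=5.0)
```

    Given onset and season duration per site, flowering and maturity dates can be approximated, which is what allows traits such as days to heading to be screened across a genebank.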

  20. Dependency Resolution Difficulty Increases with Distance in Persian Separable Complex Predicates: Evidence for Expectation and Memory-Based Accounts.

    PubMed

    Safavi, Molood S; Husain, Samar; Vasishth, Shravan

    2016-01-01

    Delaying the appearance of a verb in a noun-verb dependency tends to increase processing difficulty at the verb; one explanation for this locality effect is decay and/or interference of the noun in working memory. Surprisal, an expectation-based account, predicts that delaying the appearance of a verb renders it either no more predictable or more predictable, leading respectively to a prediction of no effect of distance or of facilitation. Recently, Husain et al. (2014) suggested that when the exact identity of the upcoming verb is predictable (strong predictability), increasing argument-verb distance leads to facilitation effects, which is consistent with surprisal; but when the exact identity of the upcoming verb is not predictable (weak predictability), locality effects are seen. We investigated Husain et al.'s proposal using Persian complex predicates (CPs), which consist of a non-verbal element (a noun in the current study) and a verb. In CPs, once the noun has been read, the exact identity of the verb is highly predictable (strong predictability); this was confirmed using a sentence completion study. In two self-paced reading (SPR) and two eye-tracking (ET) experiments, we delayed the appearance of the verb by interposing a relative clause (Experiments 1 and 3) or a long PP (Experiments 2 and 4). We also included a simple noun-verb predicate configuration with the same distance manipulation; here, the exact identity of the verb was not predictable (weak predictability). Thus, the design crossed Predictability Strength and Distance. We found that, consistent with surprisal, the verb in the strong predictability conditions was read faster than in the weak predictability conditions. Furthermore, greater verb-argument distance led to slower reading times; strong predictability did not neutralize or attenuate the locality effects. As regards the effect of distance on dependency resolution difficulty, these four experiments present evidence in favor of working memory accounts.

  2. Style: A Computational and Conceptual Blending-Based Approach

    NASA Astrophysics Data System (ADS)

    Goguen, Joseph A.; Harrell, D. Fox

    This chapter proposes a new approach to style, arising from our work on computational media using structural blending, which enriches the conceptual blending of cognitive linguistics with structure building operations in order to encompass syntax and narrative as well as metaphor. We have implemented both conceptual and structural blending, and conducted initial experiments with poetry, including interactive multimedia poetry, although the approach generalizes to other media. The central idea is to generate multimedia content and analyze style in terms of blending principles, based on our finding that different principles from those of common sense blending are often needed for some contemporary poetic metaphors.

  3. Revising a design course from a lecture approach to a project-based learning approach

    NASA Astrophysics Data System (ADS)

    Kunberger, Tanya

    2013-06-01

    In order to develop the evaluative skills necessary for successful design performance, a senior geotechnical engineering course was revised to immerse students in the complexity of the design process utilising a project-based learning (PBL) approach to instruction. The student-centred approach stresses self-directed group learning, which focuses on the process rather than the result and underscores not only the theoretical but also the practical constraints of a problem. The shift in course emphasis, to skills over concepts, results in reduced content coverage but increased student ability to independently acquire a breadth of knowledge.

  4. Estimation of the shadow prices of pollutants with production/environment inefficiency taken into account: a nonparametric directional distance function approach.

    PubMed

    Lee, Jeong-Dong; Park, Jong-Bok; Kim, Tai-Yoo

    2002-04-01

    This paper deals with the estimation of the shadow prices of pollutants with a nonparametric directional distance function approach in which, unlike previous studies, the inefficiency involved in the production process is taken into account. The directional vector, which is critical to the estimation and subject to the criterion for an appropriate efficiency rule proposed here, is calculated using the annual production and environmental plans of power plants. In an empirical study of Korea's electric power industry during the period 1990-1995, we find that the average shadow prices of sulfur oxides (SOx), nitrogen oxides (NOx), and total suspended particulates (TSP) are approximately 10% lower than those calculated under the assumption of full efficiency. The methodology we propose and the findings of the empirical study allow better decision-making over a broad range of environmental policy issues. PMID:12141157

  5. Network Medicine: A Network-based Approach to Human Diseases

    NASA Astrophysics Data System (ADS)

    Ghiassian, Susan Dina

    With the availability of large-scale data, it is now possible to systematically study the underlying interaction maps of many complex systems across multiple disciplines. Statistical physics has a long and successful history of modeling and characterizing systems with a large number of interacting individuals. Indeed, numerous approaches first developed in the context of statistical physics, such as the notion of random walks and diffusion processes, have been applied successfully to study and characterize complex systems in the context of network science. Based on these tools, network science has made important contributions to our understanding of many real-world, self-organizing systems, for example in computer science, sociology and economics. Biological systems are no exception. Indeed, recent studies reflect the necessity of applying statistical and network-based approaches in order to understand complex biological systems, such as cells. In these approaches, a cell is viewed as a complex network consisting of interactions among cellular components, such as genes and proteins. Given the cellular network as a platform, the machinery, functionality and failure of a cell can be studied with network-based approaches, a field known as systems biology. Here, we apply network-based approaches to explore human diseases and their associated genes within the cellular network. This dissertation is divided into three parts: (i) A systematic analysis of the connectivity patterns among disease proteins within the cellular network. The quantification of these patterns inspires the design of an algorithm which predicts a disease-specific subnetwork containing yet unknown disease-associated proteins. (ii) We apply the introduced algorithm to explore the common underlying mechanism of many complex diseases. We detect a subnetwork from which inflammatory processes initiate and result in many autoimmune diseases. (iii) The last chapter of this dissertation describes the

  6. Involving decision-makers in the research process: Challenges of implementing the accountability for reasonableness approach to priority setting at the district level in Tanzania.

    PubMed

    Maluka, Stephen; Kamuzora, Peter; Ndawi, Benedict; Hurtig, Anna-Karin

    2014-01-01

    The past two decades have seen a growing call for researchers, policy-makers and health care providers to collaborate in efforts to bridge the gaps between research, policy and practice. However, little attention has been focused on documenting the challenges of dealing with decision-makers in the course of implementing a research project. This paper highlights a collaborative research project aiming to implement the accountability for reasonableness (AFR) approach to priority setting in accordance with the Response to Accountable Priority Setting for Trust in Health Systems (REACT) project in Tanzania. Specifically, the paper examines the challenges of dealing with decision-makers during the project-implementation process and shows how the researchers dealt with the decision-makers to facilitate the implementation of the REACT project. Key informant interviews were conducted with the Council Health Management Team (CHMT), local government officials and other stakeholders, using a semi-structured interview guide. Minutes of the Action Research Team and CHMT were analysed. Additionally, project-implementation reports were analysed and group priority-setting processes in the district were observed. The findings show that the characteristics of the REACT research project, the novelty of some aspects of the AFR approach (such as publicity and appeals), the Action Research methodology used to implement the project, and the traditional cultural contexts within which the project was implemented created challenges for both researchers and decision-makers, which consequently slowed down the implementation of the REACT project. While collaboration between researchers and decision-makers is important in bridging gaps between research and practice, it is imperative to understand the challenges of dealing with decision-makers in the course of implementing a collaborative research project. Such analyses are crucial in designing proper strategies for improved communication

  7. PoMo: An Allele Frequency-Based Approach for Species Tree Estimation

    PubMed Central

    De Maio, Nicola; Schrempf, Dominik; Kosiol, Carolin

    2015-01-01

    Incomplete lineage sorting can cause incongruencies of the overall species-level phylogenetic tree with the phylogenetic trees for individual genes or genomic segments. If these incongruencies are not accounted for, it is possible to incur several biases in species tree estimation. Here, we present a simple maximum likelihood approach that accounts for ancestral variation and incomplete lineage sorting. We use a POlymorphisms-aware phylogenetic MOdel (PoMo) that we have recently shown to efficiently estimate mutation rates and fixation biases from within- and between-species variation data. We extend this model to perform efficient estimation of species trees. We test the performance of PoMo in several different scenarios of incomplete lineage sorting using simulations and compare it with existing methods in both accuracy and computational speed. In contrast to other approaches, our model does not use coalescent theory but is allele frequency based. We show that PoMo is well suited for genome-wide species tree estimation and that on such data it is more accurate than previous approaches. PMID:26209413

  8. A novel image fusion approach based on compressive sensing

    NASA Astrophysics Data System (ADS)

    Yin, Hongpeng; Liu, Zhaodong; Fang, Bin; Li, Yanxia

    2015-11-01

    Image fusion can integrate complementary and relevant information from source images captured by multiple sensors into a unitary synthetic image. The compressive sensing (CS) based fusion approach can greatly reduce processing time and guarantee the quality of the fused image by integrating fewer non-zero coefficients. However, there are two main limitations in the conventional CS-based fusion approach. First, directly fusing sensing measurements may yield highly uncertain results with large reconstruction error. Second, using a single fusion rule may cause blocking artifacts and poor fidelity. In this paper, a novel CS-based image fusion approach is proposed to solve these problems. The non-subsampled contourlet transform (NSCT) is used to decompose the source images. A dual-layer Pulse Coupled Neural Network (PCNN) model is used to fuse the low-pass subbands, while an edge-retention based fusion rule is proposed to fuse the high-pass subbands. The sparse coefficients are fused before being measured by a Gaussian matrix. The fused image is accurately reconstructed by the Compressive Sampling Matched Pursuit (CoSaMP) algorithm. Experimental results demonstrate that the fused image contains abundant detail and preserves the salient structure, and that the proposed method achieves better visual quality than current state-of-the-art methods.
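The core compressive sensing step, measuring a sparse coefficient vector with a Gaussian matrix and reconstructing it, can be sketched as follows (an illustrative toy, not the paper's NSCT/PCNN pipeline; plain orthogonal matching pursuit is used here instead of CoSaMP for brevity):

```python
# Compressively sample a sparse signal with a Gaussian matrix, then
# recover it by greedy orthogonal matching pursuit (OMP).
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 80, 5                 # signal length, measurements, sparsity
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)

Phi = rng.normal(0, 1 / np.sqrt(m), (m, n))   # Gaussian measurement matrix
y = Phi @ x                                   # m << n compressed measurements

def omp(Phi, y, k):
    residual, support = y.copy(), []
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        sub = Phi[:, support]
        coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
        residual = y - sub @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(Phi, y, k)
rel_err = np.linalg.norm(x_hat - x) / np.linalg.norm(x)
```

With far fewer measurements than signal samples, the sparse vector is recovered almost exactly, which is the property the fusion pipeline relies on.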

  9. Facilitating Experience Reuse: Towards a Task-Based Approach

    NASA Astrophysics Data System (ADS)

    Du, Ying; Chen, Liming; Hu, Bo; Patterson, David; Wang, Hui

    This paper proposes a task-based approach to facilitate experience reuse in knowledge-intensive work environments, such as the domain of technical support. We first present a real-world motivating scenario, product technical support in a global IT enterprise, from which key characteristics of the application domain and user requirements are drawn and analysed. We then develop the associated architecture for enabling the work-experience reuse process to address the issues identified from the motivating scenario. Central to the approach is the task ontology that seamlessly integrates the different components of the architecture. Work experience reuse amounts to the discovery and retrieval of task instances. In order to compare task instances, we introduce a dynamic weighted task similarity measure that tunes the similarity value against dynamically changing task context. A case study has been carried out to evaluate the proposed approach.
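A dynamically weighted similarity of this kind can be sketched as follows (a hedged illustration; the attribute names and weighting scheme are invented, not the paper's ontology):

```python
# Weighted mean of per-attribute similarities, with weights that adapt
# to the context of the query task.
def weighted_similarity(task_a, task_b, weights):
    """Return a similarity in [0, 1] as a weighted mean of attribute matches."""
    total = sum(weights.values())
    score = 0.0
    for attr, w in weights.items():
        sim = 1.0 if task_a.get(attr) == task_b.get(attr) else 0.0
        score += w * sim
    return score / total

def contextual_weights(task):
    # Dynamically boost attributes that are filled in the query task,
    # so the richer parts of the context dominate the comparison.
    return {attr: (2.0 if task.get(attr) else 1.0)
            for attr in ("product", "symptom", "platform")}

query = {"product": "router-x", "symptom": "packet loss", "platform": None}
past  = {"product": "router-x", "symptom": "packet loss", "platform": "v2"}
sim = weighted_similarity(query, past, contextual_weights(query))
```

Here the two filled attributes carry double weight, so the mismatch on the unfilled `platform` field only mildly lowers the score.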

  10. Cellular and Antibody Based Approaches for Pediatric Cancer Immunotherapy

    PubMed Central

    Huang, Michael A.; Krishnadas, Deepa K.; Lucas, Kenneth G.

    2015-01-01

    Progress in the use of traditional chemotherapy and radiation-based strategies for the treatment of pediatric malignancies has plateaued in the past decade, particularly for patients with relapsing or therapy refractory disease. As a result, cellular and humoral immunotherapy approaches have been investigated for several childhood cancers. Several monoclonal antibodies are now FDA approved and commercially available, some of which are currently considered standard of practice. There are also several new cellular immunotherapy approaches under investigation, including chimeric antigen receptor (CAR) modified T cells, cancer vaccines and adjuvants, and natural killer (NK) cell therapies. In this review, we will discuss previous studies on pediatric cancer immunotherapy and new approaches that are currently being investigated in clinical trials. PMID:26587548

  11. A fuzzy behaviorist approach to sensor-based robot control

    SciTech Connect

    Pin, F.G.

    1996-05-01

    Sensor-based operation of autonomous robots in unstructured and/or outdoor environments has proven to be an extremely challenging problem, mainly because of the difficulties encountered when attempting to represent the many uncertainties which are always present in the real world. These uncertainties are primarily due to sensor imprecision and unpredictability of the environment, i.e., lack of full knowledge of the environment's characteristics and dynamics. An approach, which we have named the "Fuzzy Behaviorist Approach" (FBA), is proposed in an attempt to remedy some of these difficulties. This approach is based on the representation of the system's uncertainties using Fuzzy Set Theory-based approximations and on the representation of the reasoning and control schemes as sets of elemental behaviors. Using the FBA, a formalism for rule base development and an automated generator of fuzzy rules have been developed. This automated system can construct the set of membership functions corresponding to fuzzy behaviors once these have been expressed in qualitative terms by the user. The system also checks the rule base for completeness and for non-redundancy of the rules (which has traditionally been a major hurdle in rule base development). Two major conceptual features, the suppression and inhibition mechanisms which allow a dominance between behaviors to be expressed, are discussed in detail. Some experimental results obtained with the automated fuzzy rule generator, applied to the domain of sensor-based navigation in a priori unknown environments using one of our autonomous test-bed robots as well as a real car in outdoor environments, are then reviewed and discussed to illustrate the feasibility of large-scale automatic fuzzy rule generation using the "Fuzzy Behaviorist" concepts.
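As a minimal illustration of fuzzy elemental behaviors (the membership functions, distances and steering outputs below are invented, not the FBA generator's output):

```python
# A triangular fuzzy membership function and a two-behavior blend,
# defuzzified by a weighted average of the behaviors' outputs.
def tri(x, a, b, c):
    """Triangular membership: rises on [a, b], falls on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

d = 2.5                            # obstacle distance reading (metres)
near = tri(d, 0.0, 1.0, 3.0)       # degree of "obstacle near"
far  = tri(d, 2.0, 5.0, 8.0)       # degree of "path clear"

# Each behavior proposes a steering output; blend by membership weight.
turn_near, turn_far = 30.0, 0.0    # degrees away from the obstacle
turn = (near * turn_near + far * turn_far) / (near + far)
```

With `d = 2.5` both behaviors fire partially and the defuzzified command is a compromise between them, which is the essence of blending elemental fuzzy behaviors.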

  12. A data base approach for prediction of deforestation-induced mass wasting events

    NASA Technical Reports Server (NTRS)

    Logan, T. L.

    1981-01-01

    A major topic of concern in timber management is determining the impact of clear-cutting on slope stability. Deforestation treatments on steep mountain slopes have often resulted in a high frequency of major mass wasting events. The Geographic Information System (GIS) is a potentially useful tool for predicting the location of mass wasting sites. With a raster-based GIS, digitally encoded maps of slide hazard parameters can be overlaid and modeled to produce new maps depicting high-probability slide areas. The present investigation examines the raster-based information system as a tool for predicting the location of the clear-cut mountain slopes most likely to experience shallow soil debris avalanches. A literature overview is conducted, taking into account vegetation, roads, precipitation, soil type, slope angle and aspect, and models predicting mass soil movements. Attention is given to a data base approach and aspects of slide prediction.
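The raster-overlay idea can be sketched in a few lines (a toy example; the layers, thresholds and hazard classes are invented for illustration):

```python
# Overlay co-registered raster layers to flag high-probability slide cells:
# steep slope AND recent clear-cut AND hazardous soil class.
import numpy as np

slope_deg   = np.array([[10, 35, 40], [25, 38, 12], [33, 5, 42]], float)
clear_cut   = np.array([[0, 1, 1], [1, 1, 0], [1, 0, 1]], bool)
soil_hazard = np.array([[1, 2, 3], [2, 3, 1], [3, 1, 3]])   # 1 low .. 3 high

slide_prone = (slope_deg > 30) & clear_cut & (soil_hazard >= 2)
n_flagged = int(slide_prone.sum())
```

Each boolean layer is a digitally encoded hazard map; their elementwise conjunction is the overlay model producing the new "high-probability slide area" map.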

  13. Performance evaluation of cost-based vs. fuzzy-logic-based prediction approaches in PRIDE

    NASA Astrophysics Data System (ADS)

    Kootbally, Z.; Schlenoff, C.; Madhavan, R.; Foufou, S.

    2008-04-01

    PRIDE (PRediction In Dynamic Environments) is a hierarchical multi-resolutional framework for moving object prediction. PRIDE incorporates multiple prediction algorithms into a single, unifying framework. To date, we have applied this framework to predict the future location of autonomous vehicles during on-road driving. In this paper, we describe two different approaches to compute long-term predictions (on the order of seconds into the future) within PRIDE. The first is a cost-based approach that uses a discretized set of vehicle motions and costs associated with states and actions to compute probabilities of vehicle motion. The cost-based approach is the first prediction approach we have been using within PRIDE. The second is a fuzzy-logic-based approach that deals with the pervasive presence of uncertainty in the environment to negotiate complex traffic situations. Using the high-fidelity physics-based framework for the Unified System for Automation and Robot Simulation (USARSim), we will compare the performance of the two approaches in different driving situations at traffic intersections. Consequently, we will show how the two approaches complement each other and how their combination performs better than the cost-based approach only.
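One way to turn discretized action costs into motion probabilities, as the cost-based approach does, is a Boltzmann (softmax) weighting; this is an illustrative sketch, and PRIDE's exact scheme may differ:

```python
# Map per-action costs to a probability distribution over vehicle motions:
# lower-cost actions receive exponentially higher probability.
import math

def motion_probabilities(costs, temperature=1.0):
    weights = {a: math.exp(-c / temperature) for a, c in costs.items()}
    z = sum(weights.values())
    return {a: w / z for a, w in weights.items()}

# Invented costs for a vehicle approaching an intersection.
costs = {"go_straight": 1.0, "turn_left": 2.5, "brake": 0.5}
probs = motion_probabilities(costs)
```

The `temperature` parameter (an assumption of this sketch) controls how sharply the distribution concentrates on the cheapest action.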

  14. A novel logic-based approach for quantitative toxicology prediction.

    PubMed

    Amini, Ata; Muggleton, Stephen H; Lodhi, Huma; Sternberg, Michael J E

    2007-01-01

    There is a pressing need for accurate in silico methods to predict the toxicity of molecules that are being introduced into the environment or are being developed into new pharmaceuticals. Predictive toxicology is in the realm of structure-activity relationships (SAR), and many approaches have been used to derive such SAR. Previous work has shown that inductive logic programming (ILP) is a powerful approach that circumvents several major difficulties, such as molecular superposition, faced by some other SAR methods. The ILP approach reasons with chemical substructures within a relational framework and yields chemically understandable rules. Here, we report a general new approach, support vector inductive logic programming (SVILP), which extends the essentially qualitative ILP-based SAR to quantitative modeling. First, ILP is used to learn rules, the predictions of which are then used within a novel kernel to derive a support-vector generalization model. For a highly heterogeneous dataset of 576 molecules with known fathead minnow fish toxicity, the cross-validated squared correlation coefficients (R²) from a chemical descriptor method (CHEM) and SVILP are 0.52 and 0.66, respectively. The ILP, CHEM, and SVILP approaches correctly predict 55, 58, and 73%, respectively, of toxic molecules. On a set of 165 unseen molecules, the R² values from the commercial software TOPKAT and SVILP are 0.26 and 0.57, respectively. In all calculations, SVILP showed significant improvements over the other methods. The SVILP approach has a major advantage in that it uses ILP automatically and consistently to derive rules, mostly novel, describing fragments that are toxicity alerts. SVILP is a general machine-learning approach and has the potential to tackle many problems relevant to chemoinformatics, including in silico drug design. PMID:17451225

  15. A Kalman-Filter-Based Approach to Combining Independent Earth-Orientation Series

    NASA Technical Reports Server (NTRS)

    Gross, Richard S.; Eubanks, T. M.; Steppe, J. A.; Freedman, A. P.; Dickey, J. O.; Runge, T. F.

    1998-01-01

    An approach based upon the use of a Kalman filter, currently employed at the Jet Propulsion Laboratory (JPL) for combining independent measurements of the Earth's orientation, is presented. Since changes in the Earth's orientation can be described as a randomly excited stochastic process, the uncertainty in our knowledge of the Earth's orientation grows rapidly in the absence of measurements. The Kalman-filter methodology allows for an objective accounting of this uncertainty growth, thereby facilitating the intercomparison of measurements taken at different epochs (not necessarily uniformly spaced in time) and with different precision. As an example of this approach to combining Earth-orientation series, a description is given of a combination, SPACE95, that has been generated recently at JPL.
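A one-dimensional analogue of this predict/update cycle can be sketched as follows (illustrative values, not JPL's implementation; the process noise `q` models the random-walk uncertainty growth between measurements):

```python
# Random-walk Kalman filter: uncertainty grows at every step, and
# measurements of differing precision are blended by their variances.
def kalman_step(x, p, z=None, r=None, q=0.1):
    p = p + q                      # predict: uncertainty grows without data
    if z is not None:              # update: weight measurement by precision
        gain = p / (p + r)
        x = x + gain * (z - x)
        p = (1.0 - gain) * p
    return x, p

# Irregularly spaced measurements from two techniques: (value, variance),
# with None marking epochs where no measurement is available.
obs = [(1.0, 0.5), None, (1.3, 0.1), None, None, (1.1, 0.1)]
x, p = 0.0, 1.0
for o in obs:
    x, p = kalman_step(x, p, *(o if o else (None, None)))
```

After the final precise measurement the state estimate sits between the observations and the variance has shrunk well below its prior, despite the gaps in coverage.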

  16. A microfabrication-based approach to quantitative isothermal titration calorimetry.

    PubMed

    Wang, Bin; Jia, Yuan; Lin, Qiao

    2016-04-15

    Isothermal titration calorimetry (ITC) directly measures heat evolved in a chemical reaction to determine equilibrium binding properties of biomolecular systems. Conventional ITC instruments are expensive, use complicated design and construction, and require long analysis times. Microfabricated calorimetric devices are promising, although they have yet to allow accurate, quantitative ITC measurements of biochemical reactions. This paper presents a microfabrication-based approach to integrated, quantitative ITC characterization of biomolecular interactions. The approach integrates microfabricated differential calorimetric sensors with microfluidic titration. Biomolecules and reagents are introduced at each of a series of molar ratios, mixed, and allowed to react. The reaction thermal power is differentially measured, and used to determine the thermodynamic profile of the biomolecular interactions. Implemented in a microdevice featuring thermally isolated, well-defined reaction volumes with minimized fluid evaporation as well as highly sensitive thermoelectric sensing, the approach enables accurate and quantitative ITC measurements of protein-ligand interactions under different isothermal conditions. Using the approach, we demonstrate ITC characterization of the binding of 18-Crown-6 with barium chloride, and the binding of ribonuclease A with cytidine 2'-monophosphate within reaction volumes of approximately 0.7 µL and at concentrations down to 2 mM. For each binding system, the ITC measurements were completed with considerably reduced analysis times and material consumption, and yielded a complete thermodynamic profile of the molecular interaction in agreement with published data. This demonstrates the potential usefulness of our approach for biomolecular characterization in biomedical applications. PMID:26655185

  17. Nucleic acid-based approaches to STAT inhibition.

    PubMed

    Sen, Malabika; Grandis, Jennifer R

    2012-10-01

    Silencing of abnormally activated genes can be accomplished in a highly specific manner using nucleic acid-based approaches. The focus of this review includes the different nucleic acid-based inhibition strategies, such as antisense oligodeoxynucleotides, small interfering RNA (siRNA), dominant-negative constructs, G-quartet oligonucleotides and decoy oligonucleotides, their mechanisms of action, and the effectiveness of these approaches in targeting the STAT (signal transducer and activator of transcription) proteins in cancer. Among the STAT proteins, STAT3, followed by STAT5, is the most frequently activated oncogenic STAT and has emerged as a plausible therapeutic cancer target. Both STAT3 and STAT5 have been shown to regulate numerous oncogenic signaling pathways including proliferation, survival, angiogenesis and migration/invasion. PMID:24058785

  18. English to Sanskrit Machine Translation Using Transfer Based approach

    NASA Astrophysics Data System (ADS)

    Pathak, Ganesh R.; Godse, Sachin P.

    2010-11-01

    Translation is one of the needs of a global society for communicating the thoughts and ideas of one country to another. Translation is the process of interpreting the meaning of a text and subsequently producing an equivalent text that communicates the same message in another language. In this paper we give detailed information on how to convert source-language text into target-language text using the transfer-based approach to machine translation, and we implement an English-to-Sanskrit machine translator using this approach. English is a global language used for business and communication, but a large part of the population in India does not use or understand English. Sanskrit is an ancient language of India, and most Indian languages are derived from it. Sanskrit can therefore act as an intermediate language for multilingual translation.
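The two stages of a transfer-based system, structural transfer (reordering) followed by lexical transfer (dictionary lookup), can be sketched with a toy micro-lexicon (the entries and the single SVO-to-SOV rule are invented; real English-Sanskrit transfer requires full morphological generation):

```python
# Toy transfer-based translation: reorder English SVO into Sanskrit-style
# SOV, then substitute words through a bilingual dictionary.
lexicon = {"ram": "ramah", "reads": "pathati", "book": "pustakam"}

def transfer_translate(sentence):
    words = sentence.lower().split()
    # Structural transfer: subject-verb-object -> subject-object-verb.
    if len(words) == 3:
        words = [words[0], words[2], words[1]]
    # Lexical transfer: look each word up, passing unknowns through.
    return " ".join(lexicon.get(w, w) for w in words)

out = transfer_translate("Ram reads book")
```

A real system would replace the length-3 heuristic with a parse of the source sentence, but the separation into structural and lexical transfer is the defining feature of the approach.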

  19. Meniere's disease: an evidence based approach to assessment and management.

    PubMed

    Syed, I; Aldren, C

    2012-02-01

    Menière's disease (MD) is frequently over-diagnosed in both primary and secondary care. This is unfortunate given the significant medical and social implications of such a diagnosis. Difficulties may arise in differentiating the patient with true MD from those individuals with less clearly defined disorders of cochleo-vestibular function. In this review, we suggest a practical evidence based approach to assessment and management of the patient with MD. PMID:22257041

  20. Protecting the Smart Grid: A Risk Based Approach

    SciTech Connect

    Clements, Samuel L.; Kirkham, Harold; Elizondo, Marcelo A.; Lu, Shuai

    2011-10-10

    This paper describes a risk-based approach to security that has been used for years in protecting physical assets, and shows how it could be modified to help secure the digital aspects of the smart grid and control systems in general. One way the smart grid has been said to be vulnerable is that mass load fluctuations could be created by quickly turning large quantities of smart meters off and on. We investigate the plausibility of this scenario.

  1. Intramuscular injection technique: an evidence-based approach.

    PubMed

    Ogston-Tuck, Sherri

    2014-09-30

    Intramuscular injections require a thorough and meticulous approach to patient assessment and injection technique. This article, the second in a series of two, reviews the evidence base to inform safer practice and to consider the evidence for nursing practice in this area. A framework for safe practice is included, identifying important points for safe technique, patient care and clinical decision making. It also highlights the ongoing debate in selection of intramuscular injection sites, predominantly the ventrogluteal and dorsogluteal muscles. PMID:25249123

  2. A New Approach to Image Fusion Based on Cokriging

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess; LeMoigne, Jacqueline; Mount, David M.; Morisette, Jeffrey T.

    2005-01-01

    We consider the image fusion problem involving remotely sensed data. We introduce cokriging as a method to perform fusion. We investigate the advantages of fusing Hyperion with ALI. The evaluation is performed by comparing the classification of the fused data with that of the input images and by calculating well-chosen quantitative fusion quality metrics. We consider the Invasive Species Forecasting System (ISFS) project as our fusion application. The fusion of ALI with Hyperion data is first studied using PCA and wavelet-based fusion. We then propose utilizing a geostatistical interpolation method called cokriging as a new approach to image fusion.

  3. An alternative approach to achieving water quality-based limits

    SciTech Connect

    Hart, C.M.; Graeser, W.C.

    1995-12-01

    Since May 1982, members of the iron and steel industry have been required to meet effluent limits based on Best Available Technology (BAT) for process water discharged to a receiving stream. US Steel Clairton Works has been successful in meeting these limits over the last three years; however, the current regulatory thrust is toward more stringent limits based on water quality. In the case of smaller streams, such as the receiving stream for Clairton Works' process outfall, these limits can be very rigid. This paper discusses the alternative approaches investigated to meet the new, more stringent limits, including the solution chosen.

  4. Machine learning algorithms for damage detection: Kernel-based approaches

    NASA Astrophysics Data System (ADS)

    Santos, Adam; Figueiredo, Eloi; Silva, M. F. M.; Sales, C. S.; Costa, J. C. W. A.

    2016-02-01

    This paper presents four kernel-based algorithms for damage detection under varying operational and environmental conditions, namely based on one-class support vector machine, support vector data description, kernel principal component analysis and greedy kernel principal component analysis. Acceleration time-series from an array of accelerometers were obtained from a laboratory structure and used for performance comparison. The main contribution of this study is the demonstration of the applicability of the proposed algorithms for damage detection, as well as a comparison of their classification performance with that of four other algorithms already considered reliable approaches in the literature. All proposed algorithms proved to have better classification performance than the previously established ones.
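
    The kernel methods above share a common mechanic that can be illustrated with a minimal kernel novelty detector: the score below is the squared feature-space distance from a test point to the mean of baseline (healthy-state) training data under an RBF kernel, a uniform-weight simplification of support vector data description. All data and parameter values here are invented for illustration; the paper itself works with multi-channel acceleration time-series.

```python
import math
import random

def rbf(a, b, gamma=0.5):
    """Gaussian (RBF) kernel between two scalar features."""
    return math.exp(-gamma * (a - b) ** 2)

def novelty_score(x, train, gamma=0.5):
    """Squared distance in feature space from x to the mean of the
    training set: k(x,x) - (2/n) sum_i k(x,x_i) + (1/n^2) sum_ij k(x_i,x_j)."""
    n = len(train)
    cross = 2.0 / n * sum(rbf(x, t, gamma) for t in train)
    const = sum(rbf(s, t, gamma) for s in train for t in train) / n ** 2
    return rbf(x, x, gamma) - cross + const

random.seed(0)
baseline = [random.gauss(0.0, 0.3) for _ in range(50)]  # healthy-state features
print(novelty_score(0.1, baseline))  # low score: consistent with baseline
print(novelty_score(3.0, baseline))  # high score: candidate damage indication
```

    A threshold on this score, set from baseline data alone, turns it into the kind of unsupervised detector the paper compares.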

  5. Biomaterial Approaches for Stem Cell-Based Myocardial Tissue Engineering

    PubMed Central

    Cutts, Josh; Nikkhah, Mehdi; Brafman, David A

    2015-01-01

    Adult and pluripotent stem cells represent a ready supply of cellular raw materials that can be used to generate the functionally mature cells needed to replace damaged or diseased heart tissue. However, the use of stem cells for cardiac regenerative therapies is limited by the low efficiency by which stem cells are differentiated in vitro to cardiac lineages as well as the inability to effectively deliver stem cells and their derivatives to regions of damaged myocardium. In this review, we discuss the various biomaterial-based approaches that are being implemented to direct stem cell fate both in vitro and in vivo. First, we discuss the stem cell types available for cardiac repair and the engineering of naturally and synthetically derived biomaterials to direct their in vitro differentiation to the cell types that comprise heart tissue. Next, we describe biomaterial-based approaches that are being implemented to enhance the in vivo integration and differentiation of stem cells delivered to areas of cardiac damage. Finally, we present emerging trends of using stem cell-based biomaterial approaches to deliver pro-survival factors and fully vascularized tissue to the damaged and diseased cardiac tissue. PMID:26052226

  6. Knowledge-based approach to video content classification

    NASA Astrophysics Data System (ADS)

    Chen, Yu; Wong, Edward K.

    2001-01-01

    A framework for video content classification using a knowledge-based approach is herein proposed. This approach is motivated by the fact that videos are rich in semantic content, which can best be interpreted and analyzed by human experts. We demonstrate the concept by implementing a prototype video classification system using the rule-based programming language CLIPS 6.05. Knowledge for video classification is encoded as a set of rules in the rule base. The left-hand sides of rules contain high-level and low-level features, while the right-hand sides of rules contain intermediate results or conclusions. Our current implementation includes features computed from motion, color, and text extracted from video frames. Our current rule set allows us to classify input video into one of five classes: news, weather reporting, commercial, basketball and football. We use MYCIN's inexact reasoning method for combining evidence and for handling the uncertainties in the features and in the classification results. We obtained good results in a preliminary experiment, demonstrating the validity of the proposed approach.
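
    The evidence-combination step mentioned above uses MYCIN's certainty-factor calculus. For two rules that both support the same conclusion with positive certainty factors, the standard combining rule is a one-liner; the example rule CF values below are invented.

```python
def combine_cf(cf1, cf2):
    """MYCIN's combining rule for two positive certainty factors.
    The result stays in [0, 1] and is independent of the order in
    which the evidence arrives."""
    return cf1 + cf2 * (1 - cf1)

# two fired rules support the class "commercial" with CFs 0.6 and 0.5
print(round(combine_cf(0.6, 0.5), 3))
```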

  7. Knowledge-based approach to video content classification

    NASA Astrophysics Data System (ADS)

    Chen, Yu; Wong, Edward K.

    2000-12-01

    A framework for video content classification using a knowledge-based approach is herein proposed. This approach is motivated by the fact that videos are rich in semantic content, which can best be interpreted and analyzed by human experts. We demonstrate the concept by implementing a prototype video classification system using the rule-based programming language CLIPS 6.05. Knowledge for video classification is encoded as a set of rules in the rule base. The left-hand sides of rules contain high-level and low-level features, while the right-hand sides of rules contain intermediate results or conclusions. Our current implementation includes features computed from motion, color, and text extracted from video frames. Our current rule set allows us to classify input video into one of five classes: news, weather reporting, commercial, basketball and football. We use MYCIN's inexact reasoning method for combining evidence and for handling the uncertainties in the features and in the classification results. We obtained good results in a preliminary experiment, demonstrating the validity of the proposed approach.

  8. A Model-Based Prognostics Approach Applied to Pneumatic Valves

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew J.; Goebel, Kai

    2011-01-01

    Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain knowledge of the system, its components, and how they fail by casting the underlying physical phenomena in a physics-based model that is derived from first principles. Uncertainty cannot be avoided in prediction, therefore, algorithms are employed that help in managing these uncertainties. The particle filtering algorithm has become a popular choice for model-based prognostics due to its wide applicability, ease of implementation, and support for uncertainty management. We develop a general model-based prognostics methodology within a robust probabilistic framework using particle filters. As a case study, we consider a pneumatic valve from the Space Shuttle cryogenic refueling system. We develop a detailed physics-based model of the pneumatic valve, and perform comprehensive simulation experiments to illustrate our prognostics approach and evaluate its effectiveness and robustness. The approach is demonstrated using historical pneumatic valve data from the refueling system.
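
    The particle-filter machinery described above can be sketched in one dimension: a hidden degradation state evolves under an assumed linear wear model, particles are weighted by a Gaussian observation likelihood and resampled, and remaining useful life is read off by projecting the state to a failure threshold. The wear model, noise levels and threshold are all invented for illustration and are far simpler than the pneumatic-valve physics in the paper.

```python
import math
import random

def particle_filter_rul(observations, n_particles=500, wear_rate=0.05,
                        process_noise=0.02, obs_noise=0.1, threshold=2.0):
    """Track a scalar degradation state with a particle filter, then
    project the posterior mean to a failure threshold to get a crude
    remaining-useful-life (RUL) estimate."""
    particles = [random.gauss(0.0, 0.1) for _ in range(n_particles)]
    for z in observations:
        # predict: apply the assumed wear model plus process noise
        particles = [p + wear_rate + random.gauss(0.0, process_noise)
                     for p in particles]
        # update: weight each particle by the Gaussian observation likelihood
        # (the tiny floor guards against an all-zero weight vector)
        weights = [math.exp(-((z - p) ** 2) / (2 * obs_noise ** 2)) + 1e-300
                   for p in particles]
        # resample in proportion to the weights
        particles = random.choices(particles, weights=weights, k=n_particles)
    mean_state = sum(particles) / n_particles
    rul = max(0.0, (threshold - mean_state) / wear_rate)
    return mean_state, rul

random.seed(42)
obs = [0.05 * (t + 1) + random.gauss(0.0, 0.1) for t in range(20)]
state, rul = particle_filter_rul(obs)
print(round(state, 2), round(rul, 1))
```

    A production prognoser would use systematic resampling and propagate the full particle cloud through the failure projection to get an RUL distribution rather than a point estimate.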

  9. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Sullivan, E J; Candy, J V

    2007-08-13

    Sequential detection theory has a long history, originating with Wald's work in the late 1940s and Middleton's classic exposition in the 1960s, and developing alongside the concurrent enabling technology of digital computer systems and sequential processors. This development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.
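
    The classical starting point of the theory reviewed above is Wald's sequential probability ratio test (SPRT). A minimal sketch for deciding between two Gaussian means with known variance, using Wald's threshold approximations, is below; the test data are synthetic.

```python
import math
import random

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test for H0: mean mu0 vs
    H1: mean mu1. Returns the decision and the number of samples used."""
    upper = math.log((1 - beta) / alpha)   # crossing -> accept H1
    lower = math.log(beta / (1 - alpha))   # crossing -> accept H0
    llr = 0.0
    for n, x in enumerate(samples, 1):
        # log-likelihood-ratio increment for a Gaussian observation
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(samples)

random.seed(1)
shifted = [random.gauss(1.5, 1.0) for _ in range(100)]  # data favouring H1
null = [random.gauss(-0.5, 1.0) for _ in range(100)]    # data favouring H0
print(sprt(shifted), sprt(null))
```

    Typically only a handful of samples are needed before a threshold is crossed, which is the appeal of sequential detection over fixed-sample-size tests.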

  10. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Candy, J V

    2008-12-08

    Sequential detection theory has a long history, originating with Wald's work in the late 1940s and Middleton's classic exposition in the 1960s, and developing alongside the concurrent enabling technology of digital computer systems and sequential processors. This development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.

  11. Model-based approach for elevator performance estimation

    NASA Astrophysics Data System (ADS)

    Esteban, E.; Salgado, O.; Iturrospe, A.; Isasa, I.

    2016-02-01

    In this paper, a dynamic model for an elevator installation is presented in the state space domain. The model comprises both the mechanical and the electrical subsystems, including the electrical machine and a closed-loop field oriented control. The proposed model is employed for monitoring the condition of the elevator installation. The adopted model-based approach for monitoring employs the Kalman filter as an observer. A Kalman observer estimates the elevator car acceleration, which determines the elevator ride quality, based solely on the machine control signature and the encoder signal. Then, five elevator key performance indicators are calculated based on the estimated car acceleration. The proposed procedure is experimentally evaluated by comparing the key performance indicators calculated from the estimated car acceleration with the values obtained from actual acceleration measurements in a test bench. Finally, the proposed procedure is compared with the sliding mode observer.
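
    The observer mechanics can be illustrated with a generic two-state Kalman filter: a constant-velocity model with position-only measurements. This is far simpler than the paper's coupled electromechanical elevator model, but it shows how a state that is not measured directly (here velocity) is estimated from a noisy, indirect signal. All numbers below are invented.

```python
import random

def kalman_track(zs, dt=0.1, q=1e-4, r=0.25):
    """Two-state (position, velocity) Kalman filter with a constant-
    velocity model and position-only measurements (H = [1, 0])."""
    x = [0.0, 0.0]                       # state estimate [pos, vel]
    P = [[1.0, 0.0], [0.0, 1.0]]         # estimate covariance
    for z in zs:
        # predict: x <- F x, P <- F P F' + Q, with F = [[1, dt], [0, 1]]
        x = [x[0] + dt * x[1], x[1]]
        p00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q
        p01 = P[0][1] + dt * P[1][1]
        p10 = P[1][0] + dt * P[1][1]
        p11 = P[1][1] + q
        # update with the position measurement z
        s = p00 + r                      # innovation variance
        k0, k1 = p00 / s, p10 / s        # Kalman gain
        y = z - x[0]                     # innovation
        x = [x[0] + k0 * y, x[1] + k1 * y]
        P = [[(1 - k0) * p00, (1 - k0) * p01],
             [p10 - k1 * p00, p11 - k1 * p01]]
    return x

random.seed(3)
true_vel = 0.5
zs = [true_vel * t * 0.1 + random.gauss(0.0, 0.5) for t in range(200)]
pos, vel = kalman_track(zs)
print(round(pos, 2), round(vel, 2))
```

    The filter recovers the unmeasured velocity from noisy positions; in the paper the same principle yields car acceleration, and hence ride-quality indicators, from the control signature and encoder signal alone.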

  12. Optimizing Dendritic Cell-Based Approaches for Cancer Immunotherapy

    PubMed Central

    Datta, Jashodeep; Terhune, Julia H.; Lowenfeld, Lea; Cintolo, Jessica A.; Xu, Shuwen; Roses, Robert E.; Czerniecki, Brian J.

    2014-01-01

    Dendritic cells (DC) are professional antigen-presenting cells uniquely suited for cancer immunotherapy. They induce primary immune responses, potentiate the effector functions of previously primed T-lymphocytes, and orchestrate communication between innate and adaptive immunity. The remarkable diversity of cytokine activation regimens, DC maturation states, and antigen-loading strategies employed in current DC-based vaccine design reflect an evolving, but incomplete, understanding of optimal DC immunobiology. In the clinical realm, existing DC-based cancer immunotherapy efforts have yielded encouraging but inconsistent results. Despite recent U.S. Food and Drug Administration (FDA) approval of DC-based sipuleucel-T for metastatic castration-resistant prostate cancer, clinically effective DC immunotherapy as monotherapy for a majority of tumors remains a distant goal. Recent work has identified strategies that may allow for more potent “next-generation” DC vaccines. Additionally, multimodality approaches incorporating DC-based immunotherapy may improve clinical outcomes. PMID:25506283

  13. Science based integrated approach to advanced nuclear fuel development - vision, approach, and overview

    SciTech Connect

    Unal, Cetin; Pasamehmetoglu, Kemal; Carmack, Jon

    2010-01-01

    Advancing the performance of Light Water Reactors, Advanced Nuclear Fuel Cycles, and Advanced Reactors, such as the Next Generation Nuclear Power Plants, requires enhancing our fundamental understanding of fuel and materials behavior under irradiation. The capability to accurately model the nuclear fuel systems is critical. In order to understand specific aspects of the nuclear fuel, fully coupled fuel simulation codes are required to achieve licensing of specific nuclear fuel designs for operation. The backbone of these codes, models, and simulations is a fundamental understanding and predictive capability for simulating the phase and microstructural behavior of the nuclear fuel system materials and matrices. The purpose of this paper is to identify the modeling and simulation approach in order to deliver predictive tools for advanced fuels development. The coordination between experimental nuclear fuel design, development technical experts, and computational fuel modeling and simulation technical experts is a critical aspect of the approach and naturally leads to an integrated, goal-oriented science-based R & D approach and strengthens both the experimental and computational efforts. The Advanced Fuels Campaign (AFC) and Nuclear Energy Advanced Modeling and Simulation (NEAMS) Fuels Integrated Performance and Safety Code (IPSC) are working together to determine experimental data and modeling needs. The primary objective of the NEAMS fuels IPSC project is to deliver a coupled, three-dimensional, predictive computational platform for modeling the fabrication and both normal and abnormal operation of nuclear fuel pins and assemblies, applicable to both existing and future reactor fuel designs. The science-based program is pursuing the development of an integrated multi-scale and multi-physics modeling and simulation platform for nuclear fuels. This overview paper discusses the vision and goals of the program and the approach to developing and implementing it.

  14. Wave-Particle Duality: An Information-Based Approach

    NASA Astrophysics Data System (ADS)

    Angelo, R. M.; Ribeiro, A. D.

    2015-11-01

    Recently, Bohr's complementarity principle was assessed in setups involving delayed choices. These works argued in favor of a reformulation of the aforementioned principle so as to account for situations in which a quantum system would simultaneously behave as wave and particle. Here we defend a framework that, supported by well-known experimental results and consistent with the decoherence paradigm, allows us to interpret complementarity in terms of correlations between the system and an informer. Our proposal offers formal definition and operational interpretation for the dual behavior in terms of both nonlocal resources and the couple work-information. Most importantly, our results provide a generalized information-based trade-off for the wave-particle duality and a causal interpretation for delayed-choice experiments.

  15. A modified Shockley equation taking into account the multi-element nature of light emitting diodes based on nanowire ensembles

    NASA Astrophysics Data System (ADS)

    Musolino, M.; Tahraoui, A.; van Treeck, D.; Geelhaar, L.; Riechert, H.

    2016-07-01

    In this work we study how the multi-element nature of light emitting diodes (LEDs) based on nanowire (NW) ensembles influences their current voltage (I–V) characteristics. We systematically address critical issues of the fabrication process that can result in significant fluctuations of the electrical properties among the individual NWs in such LEDs, paying particular attention to the planarization step. Electroluminescence (EL) maps acquired for two nominally identical NW-LEDs reveal that small processing variations can result in a large difference in the number of individual nano-devices emitting EL. The lower number of EL spots in one of the LEDs is caused by its inhomogeneous electrical properties. The I–V characteristics of this LED cannot be described well by the classical Shockley model. We are able to take into account the multi-element nature of such LEDs and fit the I–V characteristics in the forward bias regime by employing an ad hoc adjusted version of the Shockley equation. More specifically, we introduce a bias dependence of the ideality factor. The basic considerations of our model should remain valid also for other types of devices based on ensembles of interconnected p–n junctions with inhomogeneous electrical properties, regardless of the employed material system.

  16. A modified Shockley equation taking into account the multi-element nature of light emitting diodes based on nanowire ensembles.

    PubMed

    Musolino, M; Tahraoui, A; Treeck, D van; Geelhaar, L; Riechert, H

    2016-07-01

    In this work we study how the multi-element nature of light emitting diodes (LEDs) based on nanowire (NW) ensembles influences their current voltage (I-V) characteristics. We systematically address critical issues of the fabrication process that can result in significant fluctuations of the electrical properties among the individual NWs in such LEDs, paying particular attention to the planarization step. Electroluminescence (EL) maps acquired for two nominally identical NW-LEDs reveal that small processing variations can result in a large difference in the number of individual nano-devices emitting EL. The lower number of EL spots in one of the LEDs is caused by its inhomogeneous electrical properties. The I-V characteristics of this LED cannot be described well by the classical Shockley model. We are able to take into account the multi-element nature of such LEDs and fit the I-V characteristics in the forward bias regime by employing an ad hoc adjusted version of the Shockley equation. More specifically, we introduce a bias dependence of the ideality factor. The basic considerations of our model should remain valid also for other types of devices based on ensembles of interconnected p-n junctions with inhomogeneous electrical properties, regardless of the employed material system. PMID:27232449
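
    The modelling idea in the two records above can be sketched numerically. The classical Shockley equation is I = I_s (exp(V / (n V_T)) - 1) with a constant ideality factor n; the paper instead lets n depend on the applied bias. The linear form n(V) = n0 + c V used below is an assumption for illustration only, as is every parameter value.

```python
import math

def diode_current(v, i_s=1e-12, n0=1.5, c=0.8, v_t=0.02585):
    """Shockley equation with an illustrative bias-dependent ideality
    factor n(V) = n0 + c*V (the exact functional form in the paper may
    differ); v_t is the thermal voltage at room temperature."""
    n = n0 + c * v
    return i_s * (math.exp(v / (n * v_t)) - 1.0)

for v in (0.2, 0.4, 0.6):
    print(f"V = {v:.1f} V -> I = {diode_current(v):.3e} A")
```

    Because n(V) grows with bias, the forward current rises more slowly than the classical exponential, mimicking an ensemble of parallel p-n junctions with spread electrical properties.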

  17. Use of an ecologically relevant modelling approach to improve remote sensing-based schistosomiasis risk profiling.

    PubMed

    Walz, Yvonne; Wegmann, Martin; Leutner, Benjamin; Dech, Stefan; Vounatsou, Penelope; N'Goran, Eliézer K; Raso, Giovanna; Utzinger, Jürg

    2015-01-01

    Schistosomiasis is a widespread water-based disease that puts close to 800 million people at risk of infection with more than 250 million infected, mainly in sub-Saharan Africa. Transmission is governed by the spatial distribution of specific freshwater snails that act as intermediate hosts and the frequency, duration and extent of human bodies exposed to infested water sources during human water contact. Remote sensing data have been utilized for spatially explicit risk profiling of schistosomiasis. However, schistosomiasis risk profiling based on remote sensing data has a conceptual drawback when school-based disease prevalence data are related directly to remote sensing measurements extracted at the location of the school, because disease transmission usually does not occur at the school itself. We therefore took the local environment around the schools into account by explicitly linking ecologically relevant environmental information of potential disease transmission sites to survey measurements of disease prevalence. Our models were validated at two sites with different landscapes in Côte d'Ivoire using high- and moderate-resolution remote sensing data based on random forest and partial least squares regression. We found that the ecologically relevant modelling approach explained up to 70% of the variation in Schistosoma infection prevalence and performed better compared to a purely pixel-based modelling approach. Furthermore, our study showed that model performance increased as a function of enlarging the school catchment area, confirming the hypothesis that suitable environments for schistosomiasis transmission rarely occur at the location of survey measurements. PMID:26618326

  18. Parkinson's disease prediction using diffusion-based atlas approach

    NASA Astrophysics Data System (ADS)

    Teodorescu, Roxana O.; Racoceanu, Daniel; Smit, Nicolas; Cretu, Vladimir I.; Tan, Eng K.; Chan, Ling L.

    2010-03-01

    We study Parkinson's disease (PD) using an automatic specialized diffusion-based atlas. A total of 47 subjects, comprising 22 patients clinically diagnosed with PD and 25 controls, underwent DTI imaging. The EPIs have lower resolution but provide essential anisotropy information for the fiber tracking process. The two volumes of interest (VOIs), the substantia nigra and the putamen, are detected on the EPI and FA respectively. We use the VOIs for the geometry-based registration. We fuse the anatomical detail detected on the FA image for the putamen volume with the EPI. After growing 3D fibers on the two volumes, we compute the fiber density (FD) and the fiber volume (FV). Furthermore, we compare patients based on the extracted fibers and evaluate them according to the Hoehn & Yahr (H&Y) scale. This paper introduces the method used for automatic volume detection and evaluates the fiber growing method on these volumes. Our approach is important from the clinical standpoint, providing a new tool for neurologists to evaluate and predict PD evolution. From the technical point of view, the fusion approach deals with the tensor-based information (EPI) and the extraction of the anatomical detail (FA and EPI).

  19. Synchronization-based approach for detecting functional activation of brain

    NASA Astrophysics Data System (ADS)

    Hong, Lei; Cai, Shi-Min; Zhang, Jie; Zhuo, Zhao; Fu, Zhong-Qian; Zhou, Pei-Ling

    2012-09-01

    In this paper, we investigate a synchronization-based, data-driven clustering approach for the analysis of functional magnetic resonance imaging (fMRI) data, and specifically for detecting functional activation from fMRI data. We first define a new measure of similarity between all pairs of data points (i.e., time series of voxels) integrating both complete phase synchronization and amplitude correlation. These pairwise similarities are taken as the coupling between a set of Kuramoto oscillators, which in turn evolve according to a nearest-neighbor rule. As the network evolves, similar data points naturally synchronize with each other, and distinct clusters will emerge. The clustering behavior of the interaction network of the coupled oscillators, therefore, mirrors the clustering property of the original multiple time series. The clustered regions whose cross-correlation coefficients are much greater than those of other regions are considered the functionally activated brain regions. The analysis of fMRI data in auditory and visual areas shows that the recognized brain functional activations are in complete correspondence with those from the general linear model of statistical parametric mapping, but with a significantly lower time complexity. We further compare our results with those from the traditional K-means approach, and find that our new clustering approach can distinguish between different response patterns more accurately and efficiently than the K-means approach, and is therefore more suitable for detecting functional activation from event-related experimental fMRI data.
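
    The clustering mechanism can be sketched with a toy version: one Kuramoto phase oscillator per time series, coupled through a pairwise similarity matrix, so that similar series pull each other's phases together. The six-point similarity matrix below is invented; in the paper the couplings come from phase synchronization and amplitude correlation between voxel time series.

```python
import math
import random

def kuramoto_phases(similarity, steps=200, k=0.5, dt=0.1, seed=0):
    """Evolve identical-frequency Kuramoto oscillators coupled by the
    given similarity matrix; strongly coupled points synchronize."""
    random.seed(seed)
    n = len(similarity)
    theta = [random.uniform(0.0, 2.0 * math.pi) for _ in range(n)]
    for _ in range(steps):
        theta = [theta[i] + dt * k * sum(similarity[i][j] *
                                         math.sin(theta[j] - theta[i])
                                         for j in range(n))
                 for i in range(n)]
    return theta

# two blocks of three similar points with weak cross-block coupling
sim = [[1.0 if (i < 3) == (j < 3) else 0.05 for j in range(6)]
       for i in range(6)]
theta = kuramoto_phases(sim)
spread = lambda group: max(group) - min(group)
print(round(spread(theta[:3]), 3), round(spread(theta[3:]), 3))
```

    Reading off which oscillators end up phase-locked recovers the clusters, mirroring how the paper identifies functionally activated regions.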

  20. Energy function-based approaches to graph coloring.

    PubMed

    Di Blas, A; Jagota, A; Hughey, R

    2002-01-01

    We describe an approach to optimization based on a multiple-restart quasi-Hopfield network where the only problem-specific knowledge is embedded in the energy function that the algorithm tries to minimize. We apply this method to three different variants of the graph coloring problem: the minimum coloring problem, the spanning subgraph k-coloring problem, and the induced subgraph k-coloring problem. Though Hopfield networks have been applied in the past to the minimum coloring problem, our encoding is more natural and compact than almost all previous ones. In particular, we use k-state neurons while almost all previous approaches use binary neurons. This reduces the number of connections in the network from (Nk)^2 to N^2 asymptotically and also circumvents a problem in earlier approaches, that of multiple colors being assigned to a single vertex. Experimental results show that our approach compares favorably with other algorithms, even nonneural ones specifically developed for the graph coloring problem. PMID:18244411
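
    The k-state formulation can be sketched as zero-temperature energy descent: the energy counts monochromatic edges, and each vertex (one k-state unit) repeatedly moves to the colour minimising its local conflicts. This is a plain local-search analogue of the quasi-Hopfield dynamics, without the multiple-restart machinery; the example graph is a 5-cycle, which needs three colours.

```python
import random

def color_graph(edges, n_nodes, k, sweeps=100, seed=0):
    """Greedy energy descent with one k-state unit per vertex: each
    sweep moves every vertex to the colour with the fewest conflicts
    among its neighbours, strictly decreasing the energy (number of
    monochromatic edges) until a local minimum is reached."""
    random.seed(seed)
    color = [random.randrange(k) for _ in range(n_nodes)]
    adj = [[] for _ in range(n_nodes)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    for _ in range(sweeps):
        changed = False
        for u in range(n_nodes):
            counts = [0] * k
            for v in adj[u]:
                counts[color[v]] += 1
            best = min(range(k), key=lambda c: counts[c])
            if counts[best] < counts[color[u]]:
                color[u] = best
                changed = True
        if not changed:
            break
    conflicts = sum(1 for u, v in edges if color[u] == color[v])
    return color, conflicts

edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]  # 5-cycle
coloring, conflicts = color_graph(edges, n_nodes=5, k=3)
print(coloring, conflicts)
```

    On harder graphs such descent gets stuck in local minima, which is exactly what the paper's multiple-restart strategy addresses.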

  1. Pedestrian detection from thermal images: A sparse representation based approach

    NASA Astrophysics Data System (ADS)

    Qi, Bin; John, Vijay; Liu, Zheng; Mita, Seiichi

    2016-05-01

    Pedestrian detection, a key technology in computer vision, plays a paramount role in the applications of advanced driver assistant systems (ADASs) and autonomous vehicles. The objective of pedestrian detection is to identify and locate people in a dynamic environment so that accidents can be avoided. With significant variations introduced by illumination, occlusion, articulated pose, and complex background, pedestrian detection is a challenging task for visual perception. Different from visible images, thermal images are captured and presented with intensity maps based objects' emissivity, and thus have an enhanced spectral range to make human beings perceptible from the cool background. In this study, a sparse representation based approach is proposed for pedestrian detection from thermal images. We first adopted the histogram of sparse code to represent image features and then detect pedestrian with the extracted features in an unimodal and a multimodal framework respectively. In the unimodal framework, two types of dictionaries, i.e. joint dictionary and individual dictionary, are built by learning from prepared training samples. In the multimodal framework, a weighted fusion scheme is proposed to further highlight the contributions from features with higher separability. To validate the proposed approach, experiments were conducted to compare with three widely used features: Haar wavelets (HWs), histogram of oriented gradients (HOG), and histogram of phase congruency (HPC) as well as two classification methods, i.e. AdaBoost and support vector machine (SVM). Experimental results on a publicly available data set demonstrate the superiority of the proposed approach.

  2. A triangulation-based approach to automatically repair GIS polygons

    NASA Astrophysics Data System (ADS)

    Ledoux, Hugo; Arroyo Ohori, Ken; Meijers, Martijn

    2014-05-01

    Although the validation of a single GIS polygon can be considered as a solved issue, the repair of an invalid polygon has not received much attention and is still in practice a semi-manual and time-consuming task. We investigate in this paper algorithms to automatically repair a single polygon. Automated repair algorithms can be considered as interpreting ambiguous or ill-defined polygons and returning a coherent and clearly defined output (the definition of the international standards in our case). We present a novel approach, based on the use of a constrained triangulation, to automatically repair invalid polygons. Our approach is conceptually simple and easy to implement as it is mostly based on labelling triangles. It is also flexible: it permits us to implement different repair paradigms (we describe two in the paper). We have implemented our algorithms, and we report on experiments made with large real-world polygons that are often used by practitioners in different disciplines. We show that our approach is faster and more scalable than alternative tools.

  3. Covariance-based approaches to aeroacoustic noise source analysis.

    PubMed

    Du, Lin; Xu, Luzhou; Li, Jian; Guo, Bin; Stoica, Petre; Bahr, Chris; Cattafesta, Louis N

    2010-11-01

    In this paper, several covariance-based approaches are proposed for aeroacoustic noise source analysis under the assumptions of a single dominant source and all observers contaminated solely by uncorrelated noise. The Cramér-Rao Bounds (CRB) of the unbiased source power estimates are also derived. The proposed methods are evaluated using both simulated data as well as data acquired from an airfoil trailing edge noise experiment in an open-jet aeroacoustic facility. The numerical examples show that the covariance-based algorithms significantly outperform an existing least-squares approach and provide accurate power estimates even under low signal-to-noise ratio (SNR) conditions. Furthermore, the mean-squared-errors (MSEs) of the so-obtained estimates are close to the corresponding CRB especially for a large number of data samples. The experimental results show that the power estimates of the proposed approaches are consistent with one another as long as the core analysis assumptions are obeyed. PMID:21110583
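
    The core assumption above, a single dominant source with observers contaminated only by uncorrelated noise, can be exploited directly: the off-diagonal entries of the observer covariance matrix are unaffected by the sensor noise, so averaging them estimates the source power. The sketch below assumes unit propagation (steering) from the source to every observer, a simplification the paper's estimators do not require; all signals are synthetic.

```python
import random

def source_power_estimate(obs):
    """Average the off-diagonal sample covariances of the observer
    channels; uncorrelated sensor noise only inflates the diagonal,
    so it drops out of this estimate."""
    m = len(obs)                 # number of observers
    n = len(obs[0])              # number of snapshots
    means = [sum(ch) / n for ch in obs]
    def cov(i, j):
        return sum((obs[i][t] - means[i]) * (obs[j][t] - means[j])
                   for t in range(n)) / n
    off = [cov(i, j) for i in range(m) for j in range(m) if i != j]
    return sum(off) / len(off)

random.seed(5)
n_snapshots = 5000
source = [random.gauss(0.0, 1.0) for _ in range(n_snapshots)]  # unit power
obs = [[s + random.gauss(0.0, 0.7) for s in source] for _ in range(4)]
est = source_power_estimate(obs)
print(round(est, 2))
```

    With 0.7-sigma sensor noise, a naive per-channel variance would overestimate the unit source power by roughly 50%, while the off-diagonal average does not.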

  4. A computer vision-based approach for structural displacement measurement

    NASA Astrophysics Data System (ADS)

    Ji, Yunfeng

    2010-04-01

    Along with the incessant advancement in optics, electronics and computer technologies during the last three decades, commercial digital video cameras have experienced a remarkable evolution and can now be employed to measure complex motions of objects with sufficient accuracy, greatly assisting structural displacement measurement in civil engineering. This paper proposes a computer vision-based approach for the dynamic measurement of structures. One digital camera is used to capture image sequences of planar targets mounted on vibrating structures. The mathematical relationship between the image plane and real space is established based on computer vision theory. Then, the structural dynamic displacement at the target locations can be quantified using point reconstruction rules. Compared with traditional displacement measurement methods using sensors such as accelerometers, linear variable differential transformers (LVDTs) and the global positioning system (GPS), the proposed approach offers the main advantages of great flexibility, a non-contact working mode and ease of increasing the number of measurement points. To validate it, four tests are performed: sinusoidal motion of a point, free vibration of a cantilever beam, a wind tunnel test of a cross-section bridge model, and a field test of bridge displacement measurement. Results show that the proposed approach can attain excellent accuracy compared with analytical results or measurements using conventional transducers, and proves to deliver an innovative and low-cost solution to structural displacement measurement.
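
    In the simplest fronto-parallel case, the point-reconstruction idea reduces to a scale factor obtained from a planar target of known physical size; the full approach in the paper uses a projective camera model. The pixel track and target dimensions below are invented.

```python
def displacement_mm(pixel_track, target_px, target_mm):
    """Convert a tracked pixel trajectory of a planar target into
    physical displacement via a constant mm-per-pixel scale (the
    camera axis is assumed perpendicular to the target plane)."""
    scale = target_mm / target_px          # mm per pixel at the target
    x0, y0 = pixel_track[0]
    return [((x - x0) * scale, (y - y0) * scale) for x, y in pixel_track]

# a 100 mm wide target spans 400 px; the structure sways horizontally
track = [(200, 300), (210, 300), (220, 301), (205, 300)]
disp = displacement_mm(track, target_px=400, target_mm=100)
print(disp)
```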

  5. A Novel Rules Based Approach for Estimating Software Birthmark

    PubMed Central

    Binti Alias, Norma; Anwar, Sajid

    2015-01-01

    Software birthmark is a unique quality of software that can be used to detect software theft. Comparing the birthmarks of two programs can tell us whether one is a copy of the other. Software theft and piracy, that is, the copying, stealing, and misuse of software without the permission specified in the license agreement, are rapidly increasing problems. The estimation of a birthmark can play a key role in understanding its effectiveness. In this paper, a new technique is presented to evaluate and estimate software birthmark based on the two most sought-after properties of birthmarks, that is, credibility and resilience. For this purpose, the concept of soft computing such as probabilistic and fuzzy computing has been taken into account and fuzzy logic is used to estimate the properties of the birthmark. The proposed fuzzy rule based technique is validated through a case study and the results show that the technique is successful in assessing the specified properties of the birthmark, its resilience and credibility. This, in turn, shows how much effort will be required to detect the originality of the software based on its birthmark. PMID:25945363

  6. A novel rules based approach for estimating software birthmark.

    PubMed

    Nazir, Shah; Shahzad, Sara; Khan, Sher Afzal; Alias, Norma Binti; Anwar, Sajid

    2015-01-01

    Software birthmark is a unique quality of software that can be used to detect software theft. Comparing the birthmarks of two programs can tell us whether one is a copy of the other. Software theft and piracy, that is, the copying, stealing, and misuse of software without the permission specified in the license agreement, are rapidly increasing problems. The estimation of a birthmark can play a key role in understanding its effectiveness. In this paper, a new technique is presented to evaluate and estimate software birthmark based on the two most sought-after properties of birthmarks, that is, credibility and resilience. For this purpose, the concept of soft computing such as probabilistic and fuzzy computing has been taken into account and fuzzy logic is used to estimate the properties of the birthmark. The proposed fuzzy rule based technique is validated through a case study and the results show that the technique is successful in assessing the specified properties of the birthmark, its resilience and credibility. This, in turn, shows how much effort will be required to detect the originality of the software based on its birthmark. PMID:25945363
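
    The fuzzy-rule estimation can be sketched with a toy two-rule system over the two properties the paper names, credibility and resilience. The membership functions, rule base and output levels below are all invented; they only illustrate the fuzzify, fire-rules (min as AND, max as OR), defuzzify pipeline.

```python
def tri(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def birthmark_strength(credibility, resilience):
    """Fire two illustrative fuzzy rules and defuzzify with a weighted
    average of the rule output levels (0.5 if no rule fires)."""
    low = lambda x: tri(x, -0.5, 0.0, 0.6)
    high = lambda x: tri(x, 0.4, 1.0, 1.5)
    # Rule 1: IF credibility is high AND resilience is high THEN strength = 0.9
    r1 = min(high(credibility), high(resilience))
    # Rule 2: IF credibility is low OR resilience is low THEN strength = 0.2
    r2 = max(low(credibility), low(resilience))
    total = r1 + r2
    return 0.5 if total == 0 else (0.9 * r1 + 0.2 * r2) / total

print(round(birthmark_strength(0.9, 0.8), 2))  # strong birthmark
print(round(birthmark_strength(0.2, 0.3), 2))  # weak birthmark
```

    A higher output indicates a birthmark that would demand more effort to defeat, which is how the paper reads its estimates.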

  7. Rights-Based Approaches to Ensure Sustainable Nutrition Security.

    PubMed

    Banerjee, Sweta

    2016-01-01

    In India, a rights-based approach has been used to address large-scale malnutrition, including both micro- and macro-level nutrition deficiencies. Stunting, which is an intergenerational chronic consequence of malnutrition, is especially widespread in India (38% among children under 5 years old). To tackle this problem, the government of India has designed interventions for the first 1,000 days, a critical period of the life cycle, through a number of community-based programs to fulfill the rights to food and life. However, the entitlements providing these rights have not yet produced the necessary changes in the malnutrition status of people, especially women and children. The government of India has already implemented laws and drafted a constitution that covers the needs of its citizens, but corruption, bureaucracy, lack of awareness of rights and entitlements and social discrimination limit people's access to basic rights and services. To address this crisis, Welthungerhilfe India, working in remote villages of the most backward states in India, has shifted from a welfare-based approach to a rights-based approach. The Fight Hunger First Initiative, started by Welthungerhilfe in 2011, is designed on the premise that in the long term, poor people can only leave poverty behind if adequate welfare systems are in place and if basic rights are fulfilled; these rights include access to proper education, sufficient access to adequate food and income, suitable health services and equal rights. Only then can the next generation of disadvantaged populations look forward to a new and better future and can growth benefit the entire society. The project, co-funded by the Federal Ministry for Economic Cooperation and Development, is a long-term multi-sectoral program that involves institution-building and empowerment. PMID:27198153

  8. Pleomorphic Adenoma of Base of Tongue: Is Midline Mandibulotomy Necessary for Approaching Benign Base Tongue Lesions?

    PubMed Central

    Bansal, Sandeep; Kalsotra, Gopika; Mohammed, Abdul Wadood; Bahl, Amanjit; Gupta, Ashok K.

    2012-01-01

    Objective. To report a rare presentation of pleomorphic adenoma at the base of the tongue, excised surgically by a transoral midline glossotomy technique without mandibulotomy. Case Report. Pleomorphic adenoma is a benign tumor of the salivary gland rarely found in the base of the tongue. Surgery is the definitive treatment for this tumor, and different approaches have been described in the literature. In our case we excised the tumor by a transoral midline glossotomy technique without mandibulotomy, combining the cosmetic advantage of the transoral technique with the exposure advantage of a glossotomy. Discussion. We discuss the different approaches to the oropharynx and their advantages and disadvantages. A primary transoral approach provides better cosmesis but less exposure, whereas a median labiomandibuloglossotomy approach provides more exposure but is cosmetically unacceptable. Conclusion. A transoral midline glossotomy approach without mandibulotomy provides wide exposure with acceptable cosmesis. PMID:22953125

  9. Probabilistic Risk-Based Approach to Aeropropulsion System Assessment Developed

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2001-01-01

    In an era of shrinking development budgets and resources, where there is also an emphasis on reducing the product development cycle, the role of system assessment, performed in the early stages of an engine development program, becomes very critical to the successful development of new aeropropulsion systems. A reliable system assessment not only helps to identify the best propulsion system concept among several candidates, it can also identify which technologies are worth pursuing. This is particularly important for advanced aeropropulsion technology development programs, which require an enormous amount of resources. In the current practice of deterministic, or point-design, approaches, the uncertainties of design variables are either unaccounted for or accounted for by safety factors. This can result in an assessment with unknown and unquantifiable reliability. Consequently, it fails to provide additional insight into the risks associated with the new technologies, which is often needed by decision makers to determine the feasibility and return on investment of a new aircraft engine.

  10. Dependent component analysis based approach to robust demarcation of skin tumors

    NASA Astrophysics Data System (ADS)

    Kopriva, Ivica; Peršin, Antun; Puizina-Ivić, Neira; Mirić, Lina

    2009-02-01

    A method for robust demarcation of basal cell carcinoma (BCC) is presented, employing a novel dependent component analysis (DCA)-based approach to unsupervised segmentation of the red-green-blue (RGB) fluorescent image of the BCC. It exploits spectral diversity between the BCC and the surrounding tissue. DCA is an extension of independent component analysis (ICA) and is necessary to account for the statistical dependence induced by spectral similarity between the BCC and surrounding tissue. Robustness to intensity fluctuation is due to the scale-invariance property of DCA algorithms. By comparative performance analysis with state-of-the-art image segmentation methods such as active contours (level set), K-means clustering, non-negative matrix factorization, and ICA, we experimentally demonstrate good performance of DCA-based BCC demarcation in a demanding scenario where the intensity of the fluorescent image is varied by almost two orders of magnitude.
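
    The scale-invariance property credited above with robustness to intensity fluctuation can be illustrated with a simple spectral normalization; this sketch demonstrates the property only and is not the DCA algorithm itself.

```python
import numpy as np

def chromaticity(rgb):
    """Normalize each pixel's RGB vector by its total intensity, so that
    scaling a pixel by any positive factor leaves its spectral signature
    unchanged; segmentation on such features ignores brightness changes."""
    total = rgb.sum(axis=-1, keepdims=True)
    return rgb / np.where(total == 0, 1, total)
```

    Two pixels with the same tissue spectrum but intensities differing by orders of magnitude map to identical feature vectors, which is exactly what a demarcation step needs when illumination fluctuates.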

  11. An Information-Based Learning Approach to Dual Control.

    PubMed

    Alpcan, Tansu; Shames, Iman

    2015-11-01

    Dual control aims to concurrently learn and control an unknown system. However, actively learning the system conflicts directly with any given control objective for it will disturb the system during exploration. This paper presents a receding horizon approach to dual control, where a multiobjective optimization problem is solved repeatedly and subject to constraints representing system dynamics. Balancing a standard finite-horizon control objective, a knowledge gain objective is defined to explicitly quantify the information acquired when learning the system dynamics. Measures from information theory, such as entropy-based uncertainty, Fisher information, and relative entropy, are studied and used to quantify the knowledge gained as a result of the control actions. The resulting iterative framework is applied to Markov decision processes and discrete-time nonlinear systems. Thus, the broad applicability and usefulness of the presented approach is demonstrated in diverse problem settings. The framework is illustrated with multiple numerical examples. PMID:25730828
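
    The entropy-based knowledge gain mentioned above can be illustrated with a toy Bayesian update over a discrete set of candidate models; this is a generic information-gain sketch, not the paper's formulation.

```python
import math

def entropy(p):
    """Shannon entropy (in nats) of a discrete distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def knowledge_gain(prior, likelihoods):
    """Entropy reduction of the belief over candidate system models after
    one observation, via a Bayesian update with the given likelihoods."""
    post = [p * l for p, l in zip(prior, likelihoods)]
    z = sum(post)
    post = [p / z for p in post]
    return entropy(prior) - entropy(post), post
```

    A dual controller would trade this gain off against the control cost: an informative action (likelihoods far apart) yields positive gain, while an uninformative one leaves the belief, and hence the entropy, unchanged.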

  12. A symptom-based approach to pharmacologic management of fibromyalgia.

    PubMed

    Boomershine, Chad S; Crofford, Leslie J

    2009-04-01

    Fibromyalgia is a prevalent disorder that is characterized by widespread pain along with numerous other symptoms, including fatigue, poor sleep, mood disorders, and stiffness. Previous guidelines for the management of fibromyalgia recommended an approach that integrates pharmacologic and nonpharmacologic therapies selected according to the symptoms experienced by individual patients. However, they offered no recommendations for a system of patient assessment that would provide a basis for individualized treatment selection. We present a simple, rapid and easily remembered system for symptom quantitation and pharmacologic management of fibromyalgia that combines visual analogue scale symptom scores from a modified form of the disease-neutral Fibromyalgia Impact Questionnaire, with a review of medications that can be used to treat the individual symptoms. This symptom-based approach is amenable to caring for patients with fibromyalgia in a busy clinical practice. PMID:19337283

  13. Unit testing-based approach for reconfigurable logic controllers verification

    NASA Astrophysics Data System (ADS)

    Doligalski, Michał; Tkacz, Jacek; Bukowiec, Arkadiusz; Gratkowski, Tomasz

    2015-09-01

    The paper presents a unit-testing-based approach to in-circuit verification of FPGA designs. The methodology is dedicated to modular reconfigurable logic controllers, but other IP cores and systems can be verified as well. The speed and reproducibility of tests are key for rapid system prototyping, where the quality and reliability of the system are significant. Typically, FPGAs are programmed by means of a single (full) bitstream, although specific devices can also be reconfigured partially. Usually, partial reconfiguration is part of the design functionality: it enables the minimization of used resources or provides specific capabilities such as system adaptation. The paper presents the use of partial reconfiguration as a tool for the designer. The unit-testing approach, well known from software engineering, was adapted to modular logic controller development. The simulation process produces waveform files, and these waveforms can be used to generate synthesizable test benches.

  14. Hypercompetitive Environments: An Agent-based model approach

    NASA Astrophysics Data System (ADS)

    Dias, Manuel; Araújo, Tanya

    Information technology (IT) environments are characterized by complex changes and rapid evolution. Globalization and the spread of technological innovation have increased the need for new strategic information resources, both from individual firms and from management environments. Improvements in multidisciplinary methods and, particularly, the availability of powerful computational tools are giving researchers an increasing opportunity to investigate management environments in their true complex nature. The adoption of a complex systems approach allows for modeling business strategies from a bottom-up perspective, understood as resulting from repeated and local interaction of economic agents, without disregarding the consequences of those strategies for the individual behavior of enterprises and for the emergence of interaction patterns between firms and management environments. Agent-based models are the leading approach in this attempt.
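
    A minimal sketch of the bottom-up dynamics described above, in which aggregate behavior emerges from repeated local interaction; the payoff function and imitation rule are invented purely for illustration.

```python
import random

def simulate_firms(n=20, steps=50, seed=1):
    """Each firm holds a scalar strategy; every step it compares itself with
    a random peer and imitates the peer (plus a small mutation) whenever the
    peer's payoff is higher. The payoff optimum at 0.7 is never told to the
    agents; the population discovers it through local interaction alone."""
    random.seed(seed)
    payoff = lambda s: 1.0 - (s - 0.7) ** 2
    strategies = [random.random() for _ in range(n)]
    for _ in range(steps):
        for i in range(n):
            j = random.randrange(n)
            if payoff(strategies[j]) > payoff(strategies[i]):
                strategies[i] = strategies[j] + random.gauss(0.0, 0.01)
    return strategies
```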

  15. A component based approach to scientific workflow management

    NASA Astrophysics Data System (ADS)

    Baker, N.; Brooks, P.; Kovacs, Z.; LeGoff, J.-M.; McClatchey, R.

    2001-08-01

    CRISTAL is a distributed scientific workflow system used in the manufacturing and production phases of HEP experiment construction at CERN. The CRISTAL project has studied the use of a description driven approach, using meta-modelling techniques, to manage the evolving needs of a large physics community. Interest from such diverse communities as bio-informatics and manufacturing has motivated the CRISTAL team to re-engineer the system to customize functionality according to end user requirements but maximize software reuse in the process. The next generation CRISTAL vision is to build a generic component architecture from which a complete software product line can be generated according to the particular needs of the target enterprise. This paper discusses the issues of adopting a component product line based approach and our experiences of software reuse.

  16. Integrated Systems-Based Approach to Monitoring Environmental Remediation - 13211

    SciTech Connect

    Truex, Mike; Oostrom, Mart; Carroll, K.C.; Bunn, Amoret; Wellman, Dawn

    2013-07-01

    The US Department of Energy (DOE) is responsible for risk reduction and cleanup of its nuclear weapons complex. Remediation strategies for some of the existing contamination use techniques that mitigate risk, but leave contaminants in place. Monitoring to verify remedy performance and long-term mitigation of risk is a key element for implementing these strategies and can be a large portion of the total cost of remedy implementation. Especially in these situations, there is a need for innovative monitoring approaches that move away from the cost and labor intensive point-source monitoring. A systems-based approach to monitoring design focuses monitoring on controlling features and processes to enable effective interpretation of remedy performance. (authors)

  17. Integrated Systems-Based Approach to Monitoring Environmental Remediation

    SciTech Connect

    Bunn, Amoret L.; Truex, Michael J.; Oostrom, Martinus; Carroll, Kenneth C.; Wellman, Dawn M.

    2013-02-24

    The US Department of Energy (DOE) is responsible for risk reduction and cleanup of its nuclear weapons complex. Remediation strategies for some of the existing contamination use techniques that mitigate risk, but leave contaminants in place. Monitoring to verify remedy performance and long-term mitigation of risk is a key element for implementing these strategies and can be a large portion of the total cost of remedy implementation. Especially in these situations, there is a need for innovative monitoring approaches that move away from the cost and labor intensive point-source monitoring. A systems-based approach to monitoring design focuses monitoring on controlling features and processes to enable effective interpretation of remedy performance.

  18. A Cloud-based Approach to Medical NLP

    PubMed Central

    Chard, Kyle; Russell, Michael; Lussier, Yves A.; Mendonça, Eneida A; Silverstein, Jonathan C.

    2011-01-01

    Natural Language Processing (NLP) enables access to deep content embedded in medical texts. To date, NLP has not fulfilled its promise of enabling robust clinical encoding, clinical use, quality improvement, and research. We submit that this is in part due to poor accessibility, scalability, and flexibility of NLP systems. We describe here an approach and system which leverages cloud-based approaches such as virtual machines and Representational State Transfer (REST) to extract, process, synthesize, mine, compare/contrast, explore, and manage medical text data in a flexibly secure and scalable architecture. Available architectures in which our Smntx (pronounced as semantics) system can be deployed include: virtual machines in a HIPAA-protected hospital environment, brought up to run analysis over bulk data and destroyed in a local cloud; a commercial cloud for a large complex multi-institutional trial; and within other architectures such as caGrid, i2b2, or NHIN. PMID:22195072

  19. Accountability Overboard

    ERIC Educational Resources Information Center

    Chieppo, Charles D.; Gass, James T.

    2009-01-01

    This article reports that special interest groups opposed to charter schools and high-stakes testing have hijacked Massachusetts's once-independent board of education and stand poised to water down the Massachusetts Comprehensive Assessment System (MCAS) tests and the accountability system they support. President Barack Obama and Massachusetts…

  20. Accounting Specialist.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This publication identifies 20 subjects appropriate for use in a competency list for the occupation of accounting specialist, 1 of 12 occupations within the business/computer technologies cluster. Each unit consists of a number of competencies; a list of competency builders is provided for each competency. Titles of the 20 units are as follows:…

  1. Accountability and primary healthcare.

    PubMed

    Mukhi, Shaheena; Barnsley, Jan; Deber, Raisa B

    2014-09-01

    This paper examines the accountability structures within primary healthcare (PHC) in Ontario; in particular, who is accountable for what and to whom, and the policy tools being used. Ontario has implemented a series of incremental reforms, using expenditure policy instruments, enforced through contractual agreements to provide a defined set of publicly financed services that are privately delivered, most often by family physicians. The findings indicate that reporting, funding, evaluation and governance accountability requirements vary across service provider models. Accountability to the funder and patients is most common. Agreements, incentives and compensation tools have been used but may be insufficient to ensure parties are being held responsible for their activities related to stated goals. Clear definitions of various governance structures, a cohesive approach to monitoring critical performance indicators and associated improvement strategies are important elements in operationalizing accountability and determining whether goals are being met. PMID:25305392

  2. Accountability and Primary Healthcare

    PubMed Central

    Mukhi, Shaheena; Barnsley, Jan; Deber, Raisa B.

    2014-01-01

    This paper examines the accountability structures within primary healthcare (PHC) in Ontario; in particular, who is accountable for what and to whom, and the policy tools being used. Ontario has implemented a series of incremental reforms, using expenditure policy instruments, enforced through contractual agreements to provide a defined set of publicly financed services that are privately delivered, most often by family physicians. The findings indicate that reporting, funding, evaluation and governance accountability requirements vary across service provider models. Accountability to the funder and patients is most common. Agreements, incentives and compensation tools have been used but may be insufficient to ensure parties are being held responsible for their activities related to stated goals. Clear definitions of various governance structures, a cohesive approach to monitoring critical performance indicators and associated improvement strategies are important elements in operationalizing accountability and determining whether goals are being met. PMID:25305392

  3. Practical and scientifically based approaches for cleanup and site restoration.

    PubMed

    Till, John E; McBaugh, Debra

    2005-11-01

    This paper presents practical and scientific approaches for cleanup and site restoration following terrorist events. Both approaches are required in actual emergency situations and are complementary. The practical examples are taken from the May 2003 second biannual national emergency exercise, Top Officials 2 (TOPOFF 2), which occurred in Chicago, Illinois, and Seattle, Washington. The scientific examples are taken from the Department of Energy sites at Rocky Flats, Fernald, and Los Alamos where cleanup initiatives based on scientific approaches and community input are underway. Three examples are provided to explain, from a practical standpoint, how decisions during the exercise had to be made quickly, even though the alternatives were not always clear. These examples illustrate how scientific approaches can be integrated into the resolution of these dilemmas. The examples are (1) use of water to wash city roads and freeways contaminated with plutonium, Am, and Cs; (2) decontamination of large public ferries that passed through a radioactive plume; and (3) handling of wastewater following decontamination within a city. Each of these situations posed the need for an immediate decision by authorities in charge, without the benefit of community input or time for an analysis of the important pathways of exposure. It is evident there is a need to merge the practical knowledge gained in emergency response with scientific knowledge learned from cleanup and site restoration. The development of some basic scientific approaches ahead of time in the form of easy-to-use tools will allow practical decisions to be made more quickly and effectively should an actual terrorist event occur. PMID:16217202

  4. A Thermodynamically-consistent FBA-based Approach to Biogeochemical Reaction Modeling

    NASA Astrophysics Data System (ADS)

    Shapiro, B.; Jin, Q.

    2015-12-01

    Microbial rates are critical to understanding biogeochemical processes in natural environments. Recently, flux balance analysis (FBA) has been applied to predict microbial rates in aquifers and other settings. FBA is a genome-scale constraint-based modeling approach that computes metabolic rates and other phenotypes of microorganisms. This approach requires a prior knowledge of substrate uptake rates, which is not available for most natural microbes. Here we propose to constrain substrate uptake rates on the basis of microbial kinetics. Specifically, we calculate rates of respiration (and fermentation) using a revised Monod equation; this equation accounts for both the kinetics and thermodynamics of microbial catabolism. Substrate uptake rates are then computed from the rates of respiration, and applied to FBA to predict rates of microbial growth. We implemented this method by linking two software tools, PHREEQC and COBRA Toolbox. We applied this method to acetotrophic methanogenesis by Methanosarcina barkeri, and compared the simulation results to previous laboratory observations. The new method constrains acetate uptake by accounting for the kinetics and thermodynamics of methanogenesis, and predicted well the observations of previous experiments. In comparison, traditional methods of dynamic-FBA constrain acetate uptake on the basis of enzyme kinetics, and failed to reproduce the experimental results. These results show that microbial rate laws may provide a better constraint than enzyme kinetics for applying FBA to biogeochemical reaction modeling.
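
    A minimal sketch of a revised Monod rate law in the spirit described above: the kinetic (substrate saturation) term is multiplied by a thermodynamic potential factor that vanishes as the catabolic energy yield approaches zero. The functional form and parameter values are assumptions for illustration, not the paper's exact equation.

```python
import math

R = 8.314e-3  # gas constant, kJ/(mol*K)

def thermo_monod_rate(v_max, s, k_s, delta_g, chi=1.0, temp=298.15):
    """Respiration rate = Monod kinetic term * thermodynamic factor F_T.
    delta_g is the (negative) free energy of catabolism in kJ/mol; as it
    approaches zero (equilibrium), F_T -> 0 and the rate shuts down."""
    kinetic = s / (k_s + s)
    f_t = max(0.0, 1.0 - math.exp(delta_g / (chi * R * temp)))
    return v_max * kinetic * f_t
```

    In the linked PHREEQC/COBRA workflow described above, a rate of this shape would set the substrate (acetate) uptake bound that FBA then uses to predict growth.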

  5. Big data analytics in immunology: a knowledge-based approach.

    PubMed

    Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of normally time-consuming processes of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow. PMID:25045677

  6. Big Data Analytics in Immunology: A Knowledge-Based Approach

    PubMed Central

    Zhang, Guang Lan

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of normally time-consuming processes of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow. PMID:25045677

  7. Gene-based vaccine approaches for respiratory syncytial virus.

    PubMed

    Loomis, Rebecca J; Johnson, Philip R

    2013-01-01

    A respiratory syncytial virus (RSV) vaccine has remained elusive for decades, largely due to the failure of a formalin-inactivated RSV vaccine in the 1960s that resulted in enhanced disease upon RSV exposure in the immunized individuals. Vaccine development has also been hindered by the incomplete immunity conferred by natural infection allowing for re-infection at any time, and the immature immune system and circulating maternal antibodies present in the neonate, the primary target for a vaccine. This chapter will review the use of gene delivery, both nonviral and viral, as a potential vaccine approach for human RSV. Many of these gene-based vaccines vectors elicit protective immune responses in animal models. None of the RSV gene-based platforms have progressed into clinical trials, mostly due to uncertainty regarding the direct translation of animal model results to humans and the hesitancy to invest in costly clinical trials with the potential for unclear and complicated immune responses. The continued development of RSV vaccine gene-based approaches is warranted because of their inherent flexibility with regard to composition and administration. It is likely that multiple candidate vaccines will reach human testing in the next few years. PMID:24362696

  8. MRI upsampling using feature-based nonlocal means approach.

    PubMed

    Jafari-Khouzani, Kourosh

    2014-10-01

    In magnetic resonance imaging (MRI), spatial resolution is limited by several factors such as acquisition time, short physiological phenomena, and organ motion. The acquired image usually has higher resolution in two dimensions (the acquisition plane) than in the third dimension, resulting in highly anisotropic voxel size. Interpolation of these low-resolution (LR) images using standard techniques, such as linear or spline interpolation, results in distorted edges in the planes perpendicular to the acquisition plane. This poses limitations on conducting quantitative analyses of LR images, particularly on their voxel-wise analysis and registration. We have proposed a new non-local means feature-based technique that uses structural information of a high-resolution (HR) image with a different contrast to interpolate the LR image. In this approach, the similarity between voxels is estimated using a feature vector that characterizes the laminar pattern of the brain structures, resulting in a more accurate similarity measure than the conventional patch-based approach. This technique can be applied to LR images with both anisotropic and isotropic voxel sizes. Experimental results on brain MRI scans of patients with brain tumors, multiple sclerosis, and epilepsy, as well as schizophrenic patients and normal controls, show that the proposed method is more accurate, requires fewer computations, and thus is significantly faster than a previous state-of-the-art patch-based technique. We also show how the proposed method may be used to upsample regions of interest drawn on LR images. PMID:24951680
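
    The non-local means principle described above, weighting samples by the similarity of their neighborhood feature vectors, can be sketched in one dimension; the paper's feature vectors and similarity measure are more elaborate than this toy version.

```python
import numpy as np

def nlm_1d(signal, patch=1, h=0.5):
    """For each sample, weight every other sample by the Gaussian similarity
    of their neighborhood feature vectors and take the weighted average.
    Unlike linear smoothing, dissimilar neighborhoods get near-zero weight,
    so edges are preserved."""
    n = len(signal)
    padded = np.pad(signal, patch, mode="edge")
    feats = np.array([padded[i:i + 2 * patch + 1] for i in range(n)])
    out = np.empty(n)
    for i in range(n):
        d2 = ((feats - feats[i]) ** 2).sum(axis=1)
        w = np.exp(-d2 / (h * h))
        out[i] = (w * signal).sum() / w.sum()
    return out
```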

  9. Risk based tiered approach (RBTASM) for pollution prevention.

    PubMed

    Elves, R G; Sweeney, L M; Tomljanovic, C

    1997-11-01

    Effective management of human health and ecological hazards in the manufacturing and maintenance environment can be achieved by focusing on the risks associated with these operations. The NDCEE Industrial Health Risk Assessment (IHRA) Program is developing a comprehensive approach to risk analysis that is applied to existing processes and used to evaluate alternatives. The IHRA Risk-Based Tiered Approach (RBTASM) builds on the American Society for Testing and Materials (ASTM) Risk-Based Corrective Action (RBCA) effort to remediate underground storage tanks. Using readily available information, a semi-quantitative ranking of alternatives based on environmental, safety, and occupational health criteria was produced. A Rapid Screening Assessment of alternative corrosion protection products was performed on behalf of the Joint Group on Acquisition Pollution Prevention (JG-APP). Using the RBTASM in pollution prevention alternative selection required higher-tiered analysis and more detailed assessment of human health risks under site-specific conditions. This example illustrates the RBTASM for an organic finishing line using three different products (one conventional spray and two alternative powder coats). The human health risk information developed using the RBTASM is considered along with product performance, regulatory, and cost information by risk managers downselecting alternatives for implementation or further analysis. PMID:9433667

  10. Exploring polypharmacology using a ROCS-based target fishing approach.

    PubMed

    AbdulHameed, Mohamed Diwan M; Chaudhury, Sidhartha; Singh, Narender; Sun, Hongmao; Wallqvist, Anders; Tawa, Gregory J

    2012-02-27

    Polypharmacology has emerged as a new theme in drug discovery. In this paper, we studied polypharmacology using a ligand-based target fishing (LBTF) protocol. To implement the protocol, we first generated a chemogenomic database that links individual protein targets with a specified set of drugs or target representatives. Target profiles were then generated for a given query molecule by computing maximal shape/chemistry overlap between the query molecule and the drug sets assigned to each protein target. The overlap was computed using the program ROCS (Rapid Overlay of Chemical Structures). We validated this approach using the Directory of Useful Decoys (DUD). DUD contains 2950 active compounds, each with 36 property-matched decoys, against 40 protein targets. We chose a set of known drugs to represent each DUD target, and we carried out ligand-based virtual screens using data sets of DUD actives seeded into DUD decoys for each target. We computed Receiver Operating Characteristic (ROC) curves and associated area under the curve (AUC) values. For the majority of targets studied, the AUC values were significantly better than for a random selection of compounds. In a second test, the method successfully identified off-targets for drugs such as rimantadine, propranolol, and domperidone that were consistent with those identified by recent experiments. The results from our ROCS-based target fishing approach are promising and have potential application in drug repurposing for single and multiple targets, identifying targets for orphan compounds, and adverse effect prediction. PMID:22196353
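
    The reported AUC values follow from the rank interpretation of the ROC curve: AUC equals the probability that a randomly chosen active is scored above a randomly chosen decoy. A generic self-contained sketch, not tied to ROCS or DUD:

```python
def roc_auc(actives, decoys):
    """AUC as the Mann-Whitney U statistic: the fraction of (active, decoy)
    pairs in which the active scores higher, counting ties as half."""
    wins = 0.0
    for a in actives:
        for d in decoys:
            if a > d:
                wins += 1.0
            elif a == d:
                wins += 0.5
    return wins / (len(actives) * len(decoys))
```

    A score of 0.5 corresponds to the random-selection baseline the abstract compares against; perfect separation of actives from decoys gives 1.0.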

  11. A practical approach to object based requirements analysis

    NASA Technical Reports Server (NTRS)

    Drew, Daniel W.; Bishop, Michael

    1988-01-01

    Presented here is an approach developed at the Unisys Houston Operation Division, which supports the early identification of objects. This domain oriented analysis and development concept is based on entity relationship modeling and object data flow diagrams. These modeling techniques, based on the GOOD methodology developed at the Goddard Space Flight Center, support the translation of requirements into objects which represent the real-world problem domain. The goal is to establish a solid foundation of understanding before design begins, thereby giving greater assurance that the system will do what is desired by the customer. The transition from requirements to object oriented design is also promoted by having requirements described in terms of objects. Presented is a five step process by which objects are identified from the requirements to create a problem definition model. This process involves establishing a base line requirements list from which an object data flow diagram can be created. Entity-relationship modeling is used to facilitate the identification of objects from the requirements. An example is given of how semantic modeling may be used to improve the entity-relationship model and a brief discussion on how this approach might be used in a large scale development effort.

  12. A Personalized Collaborative Recommendation Approach Based on Clustering of Customers

    NASA Astrophysics Data System (ADS)

    Wang, Pu

    Collaborative filtering is known to be among the most successful recommender techniques in recommendation systems. Collaborative methods recommend items based on aggregated user ratings of those items, and these techniques do not depend on the availability of textual descriptions. They share the common goal of assisting in users' search for items of interest, and thus attempt to address one of the key research problems of information overload. Collaborative filtering systems can deal with large numbers of customers and with many different products. However, the set of ratings is typically sparse, so that any two customers will most likely have only a few co-rated products. The high-dimensional sparsity of the rating matrix and the problem of scalability result in low-quality recommendations. In this paper, a personalized collaborative recommendation approach based on clustering of customers is presented. This method uses clustering technology to form the customer centers. The personalized collaborative filtering approach based on clustering of customers can alleviate the scalability problem in collaborative recommendation.
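    The clustering step described above can be sketched with a plain k-means over customer rating vectors. The ratings below are hypothetical, and predicting a customer's unrated items from the nearest cluster center is one simple way (among several) to use the customer centers:

```python
# Minimal sketch of clustering-based collaborative filtering on
# hypothetical data: customers are clustered by their rating vectors,
# and unrated items are filled in from the nearest cluster center.

def kmeans(points, k, iters=20):
    centers = points[:k]  # naive init: first k points as seeds
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            dists = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            clusters[dists.index(min(dists))].append(p)
        centers = [
            [sum(col) / len(cl) for col in zip(*cl)] if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
    return centers

# Rows: customers; columns: items; 0.0 marks an unrated item.
ratings = [
    [5.0, 4.0, 0.0, 1.0],
    [4.0, 5.0, 1.0, 1.0],
    [1.0, 1.0, 5.0, 4.0],
    [1.0, 2.0, 4.0, 5.0],
]
centers = kmeans(ratings, k=2)

def predict(customer, centers):
    """Fill a customer's unrated items from the nearest cluster center."""
    dists = [sum((a - b) ** 2 for a, b in zip(customer, c)) for c in centers]
    center = centers[dists.index(min(dists))]
    return [r if r > 0 else c for r, c in zip(customer, center)]

print(predict([5.0, 5.0, 0.0, 0.0], centers))
```

    Because similarity is computed against a handful of cluster centers rather than against every other customer, this is one way the clustering step addresses the scalability problem the abstract describes.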

  13. Assessing the Success of a Discipline-Based Communication Skills Development and Enhancement Program in a Graduate Accounting Course

    ERIC Educational Resources Information Center

    Barratt, Catherine; Hanlon, Dean; Rankin, Michaela

    2011-01-01

    In this paper we present results of the impact that diagnostic testing and associated context-specific workshops have on students' written communication skills in a graduate-level accounting course. We find that students who undertook diagnostic testing performed better in their first-semester accounting subject. This improvement is positively…

  14. A developmental, mentalization-based approach to the understanding and treatment of borderline personality disorder.

    PubMed

    Fonagy, Peter; Luyten, Patrick

    2009-01-01

    The precise nature and etiopathogenesis of borderline personality disorder (BPD) continues to elude researchers and clinicians. Yet, increasing evidence from various strands of research converges to suggest that affect dysregulation, impulsivity, and unstable relationships constitute the core features of BPD. Over the last two decades, the mentalization-based approach to BPD has attempted to provide a theoretically consistent way of conceptualizing the interrelationship between these core features of BPD, with the aim of providing clinicians with a conceptually sound and empirically supported approach to BPD and its treatment. This paper presents an extended version of this approach to BPD based on recently accumulated data. In particular, we suggest that the core features of BPD reflect impairments in different facets of mentalization, each related to impairments in relatively distinct neural circuits underlying these facets. Hence, we provide a comprehensive account of BPD by showing how its core features are related to each other in theoretically meaningful ways. More specifically, we argue that BPD is primarily associated with a low threshold for the activation of the attachment system and deactivation of controlled mentalization, linked to impairments in the ability to differentiate mental states of self and other, which lead to hypersensitivity and increased susceptibility to contagion by other people's mental states, and poor integration of cognitive and affective aspects of mentalization. The combination of these impairments may explain BPD patients' propensity for vicious interpersonal cycles, and their high levels of affect dysregulation and impulsivity. Finally, the implications of this expanded mentalization-based approach to BPD for mentalization-based treatment and treatment of BPD more generally are discussed. PMID:19825272

  15. Scapular dyskinesia: evolution towards a systems-based approach

    PubMed Central

    Smith, Michael J

    2015-01-01

    Historically, scapular dyskinesia has been used to describe an isolated clinical entity whereby an abnormality in positioning, movement or function of the scapula is present. Based upon this, treatment approaches have focused on addressing local isolated muscle activity. Recently, however, there has been a progressive move towards viewing the scapula as being part of a wider system of movement that is regulated and controlled by multiple factors, including the wider kinetic chain and individual patient-centred requirements. We therefore propose a paradigm shift whereby scapular dyskinesia is seen not in isolation but is considered within the broader context of patient-centred care and an entire neuromuscular system. PMID:27583003

  16. Kinect-based rehabilitation exercises system: therapist involved approach.

    PubMed

    Yao, Li; Xu, Hui; Li, Andong

    2014-01-01

    Kinect-based physical rehabilitation is receiving increasing recognition as an approach that provides convenience for patients who would otherwise need therapy delivered in person by health professionals. Most previous studies have been driven from the patients' point of view. This paper proposes a system aiming to simplify the delivery of recovery instructions from therapists and to increase patients' motivation to participate in rehabilitation exercises. Furthermore, the architecture for developing such a rehabilitation system is designed around motion capture, human action recognition, and a standard-exercise prototype using the Kinect device. PMID:25226964

  17. A design approach for systems based on magnetic pulse compression.

    PubMed

    Kumar, D Durga Praveen; Mitra, S; Senthil, K; Sharma, D K; Rajan, Rehim N; Sharma, Archana; Nagesh, K V; Chakravarthy, D P

    2008-04-01

    A design approach is presented that gives the optimum number of stages in a magnetic pulse compression circuit and the gain per stage. The limitation on the maximum gain per stage is discussed. The total system volume is minimized by considering the energy storage capacitor volume and the magnetic core volume at each stage. At the end of this paper, the design of a magnetic-pulse-compression-based linear induction accelerator of 200 kV, 5 kA, and 100 ns with a repetition rate of 100 Hz is discussed together with its experimental results. PMID:18447549
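    As a back-of-the-envelope illustration of the stage-count trade-off (this is not the authors' volume model, and the numbers are hypothetical): a pulse compressed by a total ratio G across n identical stages needs a per-stage gain of g = G**(1/n), so adding stages lowers the gain demanded of each one:

```python
# Toy illustration of the stage-count trade-off in magnetic pulse
# compression (hypothetical numbers, not the authors' volume model):
# n identical stages compressing a pulse by a total ratio G each need
# a per-stage gain of g = G ** (1 / n).

def gain_per_stage(G, n):
    """Per-stage compression gain for n identical stages, total gain G."""
    return G ** (1.0 / n)

G = 1000.0  # e.g. compressing a ~100 us charging pulse toward ~100 ns
for n in range(1, 6):
    print(f"{n} stage(s): gain per stage = {gain_per_stage(G, n):.2f}")
```

    The design problem the abstract describes is then to pick n so that each stage stays below its maximum achievable gain while the summed capacitor and core volumes remain minimal.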

  18. Tip-based nanofabrication: an approach to true nanotechnology

    NASA Astrophysics Data System (ADS)

    Schofield, Adam R.; Bloschock, Kristen P.; Kenny, Thomas W.

    2010-03-01

    In order to unlock the true potential of nanotechnology, the development of controlled nanomanufacturing techniques for individual structures is critical. While the capability to grow, deposit, and manipulate nanostructures currently exists, the ability to reliably fabricate these devices with controlled differences in size, shape, and orientation at various substrate positions does not exist. To bridge this gap, the Defense Advanced Research Projects Agency (DARPA) launched the Tip-Based Nanofabrication (TBN) research program with the intent of achieving controlled nanomanufacturing of nanowires, nanotubes and quantum dots using functionalized AFM cantilevers and tips. This work describes the background, goals, and current approaches being explored during the multi-year TBN program.

  19. Infections on Temporal Networks—A Matrix-Based Approach

    PubMed Central

    Koher, Andreas; Lentz, Hartmut H. K.; Hövel, Philipp; Sokolov, Igor M.

    2016-01-01

    We extend the concept of accessibility in temporal networks to model infections with a finite infectious period such as the susceptible-infected-recovered (SIR) model. This approach is entirely based on elementary matrix operations and unifies the disease and network dynamics within one algebraic framework. We demonstrate the potential of this formalism for three examples of networks with high temporal resolution: networks of social contacts, sexual contacts, and livestock-trade. Our investigations provide a new methodological framework that can be used, for instance, to estimate the epidemic threshold, a quantity that determines disease parameters, for which a large-scale outbreak can be expected. PMID:27035128
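    The accessibility formalism can be illustrated with boolean matrix products over the sequence of contact snapshots. The toy network below is hypothetical, and the sketch covers only the basic time-respecting accessibility part, not the finite infectious period of the full SIR extension:

```python
# Sketch of matrix-based accessibility on a temporal network (toy data,
# not the paper's code): A_t[i][j] = 1 if i contacts j at time step t.
# A path must respect the time ordering of the contact snapshots.

def matmul_bool(A, B):
    n = len(A)
    return [[1 if any(A[i][k] and B[k][j] for k in range(n)) else 0
             for j in range(n)] for i in range(n)]

def accessibility(snapshots, n):
    # Start from the identity: every node can reach itself.
    P = [[1 if i == j else 0 for j in range(n)] for i in range(n)]
    for A in snapshots:
        # Multiply by (I + A_t): at each step, stay put or take an edge.
        step = [[1 if (i == j or A[i][j]) else 0 for j in range(n)]
                for i in range(n)]
        P = matmul_bool(P, step)
    return P

# Three nodes, two time steps: contact 0->1 at t=0, then 1->2 at t=1.
snapshots = [
    [[0, 1, 0], [0, 0, 0], [0, 0, 0]],
    [[0, 0, 0], [0, 0, 1], [0, 0, 0]],
]
P = accessibility(snapshots, 3)
print(P[0][2])  # node 0 reaches node 2 via the time-respecting path
```

    Reversing the snapshot order destroys the path from 0 to 2, which is the essential difference between temporal and static accessibility; the paper's SIR extension further prunes paths whose intermediate nodes have recovered.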
