Science.gov

Sample records for accounting approach based

  1. Accounting Control Technology Using SAP: A Case-Based Approach

    ERIC Educational Resources Information Center

    Ragan, Joseph; Puccio, Christopher; Talisesky, Brandon

    2014-01-01

    The Sarbanes-Oxley Act (SOX) revolutionized the accounting and audit industry. The use of preventive and process controls to evaluate the continuous audit process, conducted via an SAP ERP ECC 6.0 system, is key to SOX compliance and to managing costs. This paper can be used in a variety of ways to discuss issues associated with auditing and testing…

  2. Greenhouse gas emissions accounting of urban residential consumption: a household survey based approach.

    PubMed

    Lin, Tao; Yu, Yunjun; Bai, Xuemei; Feng, Ling; Wang, Jin

    2013-01-01

    Devising policies for a low carbon city requires a careful understanding of the characteristics of urban residential lifestyle and consumption. The production-based accounting approach based on top-down statistical data has a limited ability to reflect the total greenhouse gas (GHG) emissions from residential consumption. In this paper, we present a survey-based GHG emissions accounting methodology for urban residential consumption, and apply it in Xiamen City, a rapidly urbanizing coastal city in southeast China. On this basis, the main influencing factors determining residential GHG emissions at the household and community scales are identified, and typical profiles of low, medium and high GHG emission households and communities are characterized. Up to 70% of household GHG emissions are from regional and national activities that support household consumption, including the supply of energy and building materials, while 17% are from urban-level basic services and supplies such as sewage treatment and solid waste management, and only 13% are direct emissions from household consumption. Housing area and household size are the two main factors determining GHG emissions from residential consumption at the household scale, while average housing area and building height are the main factors at the community scale. Our results show a large disparity in GHG emissions profiles among different households, with high GHG emissions households emitting about five times more than low GHG emissions households. Emissions from high GHG emissions communities are about twice as high as from low GHG emissions communities. Our findings can contribute to better tailored and targeted policies aimed at reducing household GHG emissions, and developing low GHG emissions residential communities in China.

  3. Greenhouse Gas Emissions Accounting of Urban Residential Consumption: A Household Survey Based Approach

    PubMed Central

    Lin, Tao; Yu, Yunjun; Bai, Xuemei; Feng, Ling; Wang, Jin

    2013-01-01

    Devising policies for a low carbon city requires a careful understanding of the characteristics of urban residential lifestyle and consumption. The production-based accounting approach based on top-down statistical data has a limited ability to reflect the total greenhouse gas (GHG) emissions from residential consumption. In this paper, we present a survey-based GHG emissions accounting methodology for urban residential consumption, and apply it in Xiamen City, a rapidly urbanizing coastal city in southeast China. On this basis, the main influencing factors determining residential GHG emissions at the household and community scales are identified, and typical profiles of low, medium and high GHG emission households and communities are characterized. Up to 70% of household GHG emissions are from regional and national activities that support household consumption, including the supply of energy and building materials, while 17% are from urban-level basic services and supplies such as sewage treatment and solid waste management, and only 13% are direct emissions from household consumption. Housing area and household size are the two main factors determining GHG emissions from residential consumption at the household scale, while average housing area and building height are the main factors at the community scale. Our results show a large disparity in GHG emissions profiles among different households, with high GHG emissions households emitting about five times more than low GHG emissions households. Emissions from high GHG emissions communities are about twice as high as from low GHG emissions communities. Our findings can contribute to better tailored and targeted policies aimed at reducing household GHG emissions, and developing low GHG emissions residential communities in China. PMID:23405187
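
The bottom-up logic of the survey-based accounting described above can be sketched in a few lines: per-category consumption is multiplied by an emission factor and the results are aggregated into shares. The category names and factor values below are invented for illustration and are not taken from the study.

```python
# Bottom-up household GHG accounting sketch (survey-based approach).
# Consumption categories and emission factors are hypothetical.

def household_ghg(consumption, factors):
    """Per-category GHG emissions (kg CO2e) = activity level x emission factor."""
    return {cat: qty * factors[cat] for cat, qty in consumption.items()}

def shares(emissions):
    """Fraction of the household total contributed by each category."""
    total = sum(emissions.values())
    return {cat: e / total for cat, e in emissions.items()}

# Invented annual survey data for one household
consumption = {"electricity_kwh": 3000, "gas_m3": 200, "waste_kg": 400}
factors = {"electricity_kwh": 0.9, "gas_m3": 2.0, "waste_kg": 0.5}  # kg CO2e per unit

em = household_ghg(consumption, factors)
sh = shares(em)
```

With these made-up inputs, the category shares sum to 1 by construction, mirroring the paper's percentage breakdown of household emissions.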

  4. Examining air pollution in China using production- and consumption-based emissions accounting approaches.

    PubMed

    Huo, Hong; Zhang, Qiang; Guan, Dabo; Su, Xin; Zhao, Hongyan; He, Kebin

    2014-12-16

    Two important reasons for China's air pollution are the high emission factors (emissions per unit of product) of pollution sources and the high emission intensity (emissions per unit of GDP) of the industrial structure. Therefore, a wide variety of policy measures, including both emission abatement technologies and economic adjustment, must be implemented. To support such measures, this study used the production- and consumption-based emissions accounting approaches to simulate the SO2, NOx, PM2.5, and VOC emissions flows among producers and consumers. This study analyzed the emissions and GDP performance of 36 production sectors. The results showed that the equipment, machinery, and devices manufacturing and construction sectors contributed more than 50% of air pollutant emissions, and most of their products were used for capital formation and export. The service sector had the lowest emission intensities, and its output was mainly consumed by households and the government. In China, the emission intensities of production activities triggered by capital formation and export were approximately twice those of activities triggered by final consumption expenditure. This study suggests that China should control air pollution using the following strategies: applying end-of-pipe abatement technologies and using cleaner fuels to further decrease the emission factors associated with rural cooking, electricity generation, and the transportation sector; continuing to limit highly emission-intensive but low value-added exports; developing a plan to reduce construction activities; and increasing the proportion of service GDP in the national economy.
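
The relationship between production- and consumption-based accounting exploited in studies like this one is conventionally expressed through an environmentally extended input-output model: consumption-based emissions are obtained by tracing final demand through the Leontief inverse. The two-sector coefficients below are invented; the sketch only shows that both attribution rules allocate the same total emissions.

```python
import numpy as np

# Two-sector environmentally extended input-output sketch (invented numbers).
# A: technical coefficients, y: final demand, f: direct emissions per unit output.
A = np.array([[0.1, 0.2],
              [0.3, 0.1]])
y = np.array([100.0, 50.0])          # final demand by sector
f = np.array([0.5, 0.05])            # emissions per unit of gross output

L = np.linalg.inv(np.eye(2) - A)     # Leontief inverse
x = L @ y                            # gross output required to meet final demand
production_based = f * x             # emissions attributed to producing sectors
consumption_based = (f @ L) * y      # emissions embodied in each sector's final demand

# Both views reallocate the same physical emissions, so the totals must match.
```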

  5. Reality-Based Learning and Interdisciplinary Teams: An Interactive Approach Integrating Accounting and Engineering Technology.

    ERIC Educational Resources Information Center

    Rogers, Robert L.; Stemkoski, Michael J.

    This paper describes a reality-based learning project in which sophomore accounting and engineering students collaborated in interdisciplinary teams to design and build a million-dollar waterslide park. Two weeks into the project, the teams received a briefing from an industrial panel of engineers, bankers, entrepreneurs, and other professionals.…

  6. Consequences of Assessment and Accountability Systems Are Integral to the Argument-Based Approach to Validity

    ERIC Educational Resources Information Center

    Lane, Suzanne

    2012-01-01

    Considering consequences in the evaluation of validity is not new although it is still debated by Paul E. Newton and others. The argument-based approach to validity entails an interpretative argument that explicitly identifies the proposed interpretations and uses of test scores and a validity argument that provides a structure for evaluating the…

  7. Building Student Success Using Problem-Based Learning Approach in the Accounting Classroom

    ERIC Educational Resources Information Center

    Shawver, Todd A.

    2015-01-01

    A major area of concern in academia is that of student retention at the university, college, and departmental levels. As academics, there is a considerable amount that we can do to improve student retention, and reduce the attrition rates in our departments. One way to solve this is to take an innovative approach in the classroom to enhance the…

  8. Accounting for selection bias in species distribution models: An econometric approach on forested trees based on structural modeling

    NASA Astrophysics Data System (ADS)

    Ay, Jean-Sauveur; Guillemot, Joannès; Martin-StPaul, Nicolas K.; Doyen, Luc; Leadley, Paul

    2015-04-01

    Species distribution models (SDMs) are widely used to study and predict the outcome of global change on species. In human-dominated ecosystems, the presence of a given species is the result of both its ecological suitability and the human footprint on nature, such as land use choices. Land use choices may thus be responsible for a selection bias in the presence/absence data used in SDM calibration. We present a structural modelling approach (i.e. based on structural equation modelling) that accounts for this selection bias. The new structural species distribution model (SSDM) estimates land use choices and species responses to bioclimatic variables simultaneously. A land use equation based on an econometric model of landowner choices was joined to an equation of species response to bioclimatic variables. The SSDM allows the residuals of both equations to be dependent, taking into account the possibility of shared omitted variables and measurement errors. We provide a general description of the statistical theory and a set of applications to forested trees over France using databases of climate and forest inventory at different spatial resolutions (from 2 km to 8 km). We also compared the output of the SSDM with outputs of a classical SDM in terms of bioclimatic response curves and potential distribution under current climate. Depending on the species and the spatial resolution of the calibration dataset, the shapes of the bioclimatic response curves and the modelled species distribution maps differed markedly between the SSDM and classical SDMs. The magnitude and direction of these differences depended on the correlations between the errors from both equations and were highest at higher spatial resolutions. A first conclusion is that the use of classical SDMs can potentially lead to strong misestimation of the actual and future probability of presence modelled. Beyond this selection bias, the SSDM we propose represents a crucial step to account for economic constraints on tree
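
The selection bias that motivates the SSDM can be illustrated with a hypothetical simulation: when an omitted variable drives both the landowner's choice (which determines where presence/absence is observed) and the species' presence, prevalence estimated from surveyed plots alone is biased. All coefficients below are invented and are not the authors' model.

```python
import numpy as np

# Hypothetical simulation of the selection bias the SSDM targets: a shared
# omitted variable u drives both the landowner's choice (forested -> surveyed)
# and the species' presence. All coefficients are invented.
rng = np.random.default_rng(0)
n = 100_000
clim = rng.normal(size=n)        # bioclimatic covariate
u = rng.normal(size=n)           # omitted variable shared by both equations

def logistic(z):
    return 1.0 / (1.0 + np.exp(-z))

# Species presence responds to climate and to u
present = rng.random(n) < logistic(0.5 * clim + u)
# Landowner choice: plots are forested (and hence surveyed) non-randomly
forested = rng.random(n) < logistic(1.0 - clim + u)

# A "classical SDM" sees only the surveyed plots; compare prevalences
naive_prevalence = present[forested].mean()
true_prevalence = present.mean()
bias = naive_prevalence - true_prevalence   # nonzero: selection distorts calibration
```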

  9. Starworld: Preparing Accountants for the Future: A Case-Based Approach to Teach International Financial Reporting Standards Using ERP Software

    ERIC Educational Resources Information Center

    Ragan, Joseph M.; Savino, Christopher J.; Parashac, Paul; Hosler, Jonathan C.

    2010-01-01

    International Financial Reporting Standards now constitute an important part of educating young professional accountants. This paper looks at a case based process to teach International Financial Reporting Standards using integrated Enterprise Resource Planning software. The case contained within the paper can be used within a variety of courses…

  10. Integrated Approach to User Account Management

    NASA Technical Reports Server (NTRS)

    Kesselman, Glenn; Smith, William

    2007-01-01

    IT environments consist of both Windows and other platforms. Providing user account management for this model has become increasingly difficult. If Microsoft's Active Directory could be enhanced to extend a Windows identity for authentication services for Unix, Linux, Java and Macintosh systems, then an integrated approach to user account management could be realized.

  11. Assessing Students' Accounting Knowledge: A Structural Approach.

    ERIC Educational Resources Information Center

    Boldt, Margaret N.

    2001-01-01

    Comparisons of students' representations of financial accounting concepts with the knowledge structures of experts were depicted using Pathfinder networks. This structural approach identified the level of students' understanding of concepts and knowledge gaps that need to be addressed. (SK)

  12. An Innovative Approach to Resident Scheduling: Use of a Point-Based System to Account for Resident Preferences

    PubMed Central

    Chow, Robert Tao-Ping; Tamhane, Shrikant; Zhang, Manling; Fisher, Lori-Ann; Yoon, Jenni; Sehgal, Sameep; Lumbres, Madel; Han, Ma Ai Thanda; Win, Tiffany

    2015-01-01

    Background: The scheduling of residents for rotation assignments and on-call responsibilities is a time-consuming process that challenges the resources of residency programs. Assignment of schedules is traditionally done by chief residents or program administration with variable input from the residents involved. Intervention: We introduced an innovative point-based scheduling system to increase transparency in the scheduling process, foster a sense of fairness and equality in scheduling, and increase resident ownership for making judicious scheduling choices. Methods: We devised a point-based system in which each resident in our 40-member program was allocated an equal number of points. The residents assigned these points to their preferred choices of rotations. Residents were then surveyed anonymously on their perceptions of this new scheduling system and were asked to compare it with their traditional scheduling system. Results: The schedule was successfully implemented, and it allowed residents to express their scheduling preferences using an innovative point-based approach. Residents were generally satisfied with the new system, would recommend it to other programs, and perceived a greater sense of involvement. However, resident satisfaction with the new system was not significantly greater compared with the previous approach to scheduling (P = .20). Chief residents expressed satisfaction with the new scheduling model. Conclusions: Residents were equally satisfied with the traditional preference-based scheduling approach and the new point-based system. Chief residents' feedback on the new system reflected reduced stress and time commitment in the new point-based system. PMID:26457154
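
The allocation step in such a point-based system can be sketched as a sealed-bid rule in which each rotation's open slots go to the residents who bid the most points on it. The abstract does not detail the program's actual rules, so the names, bids, capacity, and tie-breaking below are all assumptions.

```python
# Sketch of a point-bid rotation lottery: each resident spreads a fixed point
# budget over preferred rotations; slots go to the highest bidders.
# Names, bids, and capacities are invented for illustration.

def allocate(bids, capacity):
    """bids: {resident: points bid on this rotation}; capacity: open slots.
    Returns the set of residents awarded the rotation (ties broken by name)."""
    ranked = sorted(bids.items(), key=lambda kv: (-kv[1], kv[0]))
    return {resident for resident, _ in ranked[:capacity]}

bids = {"Ava": 40, "Ben": 25, "Caro": 25, "Dan": 10}
winners = allocate(bids, capacity=2)   # Ava outbids everyone; Ben wins the tie
```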

  13. School Centered Evidence Based Accountability

    ERIC Educational Resources Information Center

    Milligan, Charles

    2015-01-01

    Achievement scores drive much of the effort in today's accountability system; however, much more occurs in every school, every day. School Centered Evidence Based Accountability can be used from micro to macro, giving School Boards and Administration a process for monitoring the results of the entire school operation effectively and…

  14. Approaches to Accountability in Long-Term Care

    PubMed Central

    Berta, Whitney; Laporte, Audrey; Wodchis, Walter P.

    2014-01-01

    This paper discusses the array of approaches to accountability in Ontario long-term care (LTC) homes. A focus group involving key informants from the LTC industry, including both for-profit and not-for-profit nursing home owners/operators, was used to identify stakeholders involved in formulating and implementing LTC accountability approaches and the relevant regulations, policies and initiatives relating to accountability in the LTC sector. These documents were then systematically reviewed. We found that the dominant mechanisms have been financial incentives and oversight, regulations and information; professionalism has played a minor role. More recently, measurement for accountability in LTC has grown to encompass an array of fiscal, clinical and public accountability measurement mechanisms. The goals of improved quality and accountability are likely more achievable using these historical regulatory approaches, but the recent rapid increase in data and measurability could also enable judicious application of market-based approaches. PMID:25305396

  15. School District Program Cost Accounting: An Alternative Approach

    ERIC Educational Resources Information Center

    Hentschke, Guilbert C.

    1975-01-01

    Discusses the value for school districts of a program cost accounting system and examines different approaches to generating program cost data, with particular emphasis on the "cost allocation to program system" (CAPS) and the traditional "transaction-based system." (JG)

  16. Teaching the Indirect Method of the Statement of Cash Flows in Introductory Financial Accounting: A Comprehensive, Problem-Based Approach

    ERIC Educational Resources Information Center

    Brickner, Daniel R.; McCombs, Gary B.

    2004-01-01

    In this article, the authors provide an instructional resource for presenting the indirect method of the statement of cash flows (SCF) in an introductory financial accounting course. The authors focus primarily on presenting a comprehensive example that illustrates the "why" of SCF preparation and show how journal entries and T-accounts can be…

  17. Teaching Financial Accounting via a Worksheet Approach.

    ERIC Educational Resources Information Center

    Vincent, Vern C.; Dietz, Elizabeth M.

    A classroom research study investigated the effectiveness of an approach to financial accounting instruction that uses worksheets to bring together the conceptual and practical aspects of the field. Students were divided into two groups, one taught by the traditional lecture method and the other taught with worksheet exercises and lectures stressing…

  18. A partnership approach to learning about accountability.

    PubMed

    Plant, Nigel; Pitt, Richard; Troke, Ben

    Clinicians and healthcare providers are frequently reminded that they are 'accountable' practitioners - but what is the definition of accountability, and how does it apply in a practical and legal context? To clarify these issues, the University of Nottingham School of Nursing has formed a partnership with Browne Jacobson Solicitors. Together they have developed a 7-stage training programme for nursing students which covers the key aspects of accountability, including ethical concepts, the law of negligence, and scenario-based training on being called as a witness in an investigation. This article introduces the implications of accountability and describes the structure and syllabus of the programme, including participants' feedback on the benefits of the experience. PMID:20622781

  19. Bookkeeping and Accounting: The "Time" Approach to Teaching Accounting

    ERIC Educational Resources Information Center

    Mallue, Henry E., Jr.

    1977-01-01

    Describes the "time" approach, a non-traditional method for teaching Bookkeeping I, which redirects the general climate of the first week of class by not introducing crucial balance sheet and journal concepts, but makes use of sections 441 and 446 of the Internal Revenue Code, thereby permitting students to learn the important role "time"…

  20. The Effects of Different Teaching Approaches in Introductory Financial Accounting

    ERIC Educational Resources Information Center

    Chiang, Bea; Nouri, Hossein; Samanta, Subarna

    2014-01-01

    The purpose of the research is to examine the effect of the two different teaching approaches in the first accounting course on student performance in a subsequent finance course. The study compares 128 accounting and finance students who took introductory financial accounting by either a user approach or a traditional preparer approach to examine…

  1. How good is the turbid medium-based approach for accounting for light partitioning in contrasted grass–legume intercropping systems?

    PubMed Central

    Barillot, Romain; Louarn, Gaëtan; Escobar-Gutiérrez, Abraham J.; Huynh, Pierre; Combes, Didier

    2011-01-01

    Background and Aims: Most studies dealing with light partitioning in intercropping systems have used statistical models based on the turbid medium approach, thus assuming homogeneous canopies. However, these models could not be directly validated, although spatial heterogeneities could arise in such canopies. The aim of the present study was to assess the ability of the turbid medium approach to accurately estimate light partitioning within grass–legume mixed canopies. Methods: Three contrasting mixtures of wheat–pea, tall fescue–alfalfa and tall fescue–clover were sown according to various patterns and densities. Three-dimensional plant mock-ups were derived from magnetic digitizations carried out at different stages of development. The benchmarks for light interception efficiency (LIE) estimates were provided by the combination of a light projective model and plant mock-ups, which also provided the inputs of a turbid medium model (SIRASCA), i.e. leaf area index and inclination. SIRASCA was set to gradually account for vertical heterogeneity of the foliage, i.e. the canopy was described as one, two or ten horizontal layers of leaves. Key Results: Mixtures exhibited various and heterogeneous profiles of foliar distribution, leaf inclination and component species height. Nevertheless, most of the LIE was satisfactorily predicted by SIRASCA. Biased estimations were, however, observed for (1) grass species and (2) tall fescue–alfalfa mixtures grown at high density. Most of the discrepancies were due to vertical heterogeneities and were corrected by increasing the vertical description of canopies, although, in practice, this would require time-consuming measurements. Conclusions: The turbid medium analogy could be successfully used in a wide range of canopies. However, a more detailed description of the canopy is required for mixtures exhibiting vertical stratifications and inter-/intra-species foliage overlapping. Architectural models remain a relevant tool for
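
The turbid medium approach referred to above rests on Beer-Lambert extinction, with intercepted light commonly partitioned among the component species in proportion to each species' extinction coefficient times leaf area index (k x LAI). A minimal sketch with illustrative coefficients, not values from the study:

```python
import math

# Turbid-medium sketch: Beer-Lambert extinction in a mixed canopy, with
# intercepted light partitioned in proportion to each species' k*LAI.
# The k values and LAIs are illustrative.

def light_partition(k_lai):
    """k_lai: {species: extinction coefficient * leaf area index}.
    Returns (total LIE, {species: fraction of incident light intercepted})."""
    total_klai = sum(k_lai.values())
    lie = 1.0 - math.exp(-total_klai)          # canopy light interception efficiency
    return lie, {sp: lie * x / total_klai for sp, x in k_lai.items()}

# e.g. grass: k=0.5, LAI=2.0; legume: k=0.8, LAI=1.5
lie, parts = light_partition({"grass": 0.5 * 2.0, "legume": 0.8 * 1.5})
```

The per-species fractions sum to the canopy-level interception by construction; heterogeneity enters only through the single homogeneous-layer assumption, which is exactly what the study tested.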

  2. The Cyclical Relationship Approach in Teaching Basic Accounting Principles.

    ERIC Educational Resources Information Center

    Golen, Steven

    1981-01-01

    Shows how teachers can provide a more meaningful presentation of various accounting principles by illustrating them through a cyclical relationship approach. Thus, the students see the entire accounting relationship as a result of doing business. (CT)

  3. Enhanced Student Learning in Accounting Utilising Web-Based Technology, Peer-Review Feedback and Reflective Practices: A Learning Community Approach to Assessment

    ERIC Educational Resources Information Center

    Taylor, Sue; Ryan, Mary; Pearce, Jon

    2015-01-01

    Higher education is becoming a major driver of economic competitiveness in an increasingly knowledge-driven global economy. Maintaining the competitive edge has seen an increase in public accountability of higher education institutions through the mechanism of ranking universities based on the quality of their teaching and learning outcomes. As a…

  4. The Coach-Team Approach: An Introductory Accounting Instructional Alternative

    ERIC Educational Resources Information Center

    Wood, Lynette I.

    2012-01-01

    Many students approach the introductory accounting course with a great deal of apprehension. For the most part, the course is populated by non-accounting majors who often perceive accounting to be extremely difficult and may view the instructor-student relationship as adversarial. As a result, such students may be inclined to express their…

  5. Accountable clinical management: an integrated approach.

    PubMed

    Potash, David L

    2011-10-01

    Hospitals should move from the traditional siloed approach to managing the clinical side of the enterprise, where finance leaders and clinicians play distinctly different roles without coordination, to an integrated approach that assembles a multidisciplinary team to focus coordinated attention on identifying and pursuing opportunities for clinical process improvement. Senior executives should lead this top-down effort to establish goals and set priorities for action, using an integrated, high-level reporting dashboard that shows overall performance in terms of quality, efficiency, and patient experience. Implementing integrated clinical management requires a clear, consistent communications plan and messaging for physicians and managers to show why it is increasingly necessary for both hospitals and physicians. PMID:22053648

  6. Risk-Based Educational Accountability in Dutch Primary Education

    ERIC Educational Resources Information Center

    Timmermans, A. C.; de Wolf, I. F.; Bosker, R. J.; Doolaard, S.

    2015-01-01

    A recent development in educational accountability is a risk-based approach, in which intensity and frequency of school inspections vary across schools to make educational accountability more efficient and effective by enabling inspectorates to focus on organizations at risk. Characteristics relevant in predicting which schools are "at risk…

  7. Moving to micro-based cost accounting.

    PubMed

    Baird, P J; Kazamek, T J

    1988-03-01

    Cost accounting information is needed for flexible budgeting, productivity management, contracting with third party payors, physician evaluation, and investment and divestment decisions. This article discusses why an increasing number of hospital managers are turning to microcomputer-based accounting systems as solutions to these growing needs.

  8. Standards-Based Accountability Systems. Policy Brief.

    ERIC Educational Resources Information Center

    Stapleman, Jan

    This policy brief summarizes research results and provides guidance regarding decisions associated with school accountability. Unlike previous notions of accountability, a standards-based system examines outputs, such as student performance and graduation rates, as well as inputs like the amount of instructional time or the number of books in the…

  9. Students' Approaches to Learning in Problem-Based Learning: Taking into Account Professional Behavior in the Tutorial Groups, Self-Study Time, and Different Assessment Aspects

    ERIC Educational Resources Information Center

    Loyens, Sofie M. M.; Gijbels, David; Coertjens, Liesje; Cote, Daniel J.

    2013-01-01

    Problem-based learning (PBL) represents a major development in higher educational practice and is believed to promote deep learning in students. However, empirical findings on the promotion of deep learning in PBL remain unclear. The aim of the present study is to investigate the relationships between students' approaches to learning (SAL) and…

  10. Students' Approaches to Study in Introductory Accounting Courses

    ERIC Educational Resources Information Center

    Elias, Rafik Z.

    2005-01-01

    Significant education research has focused on the study approaches of students. Two study approaches have been clearly identified: deep and surface. In this study, the author examined the way in which students approach studying introductory accounting courses. In general, he found that GPA and expected course grade were correlated positively with…

  11. Toward Accountability. A Report on the Mesa Approach to Career Guidance, Counseling, and Placement.

    ERIC Educational Resources Information Center

    Mesa Public Schools, AZ.

    This report describes a systems approach toward an accountability program for a high school guidance department, the primary objective of which was to reduce the "guidance universe" to a manageable size and to be responsible, i.e., accountable. A commitment was developed to move toward a model of accountability, based not only upon what…

  12. Implementing Outcomes Based Accountability in Children's Services. Case Studies

    ERIC Educational Resources Information Center

    Bergeron, Caroline; Chamberlain, Tamsin; George, Nalia; Golden, Sarah; Mundy, Ellie; Southcott, Clare; Walker, Fiona

    2010-01-01

    Outcomes Based Accountability (OBA) is an approach that Children's Trusts and Children's Services can use to assist with planning services and assessing their performance. The OBA approach focuses on outcomes that are desired and monitoring and evidencing progress towards those desired outcomes. OBA makes a distinction between two types of…

  13. The Implementation of Accountability Systems: A Psycho-Social Approach.

    ERIC Educational Resources Information Center

    Levine, Donald M.

    This report reviews several approaches to organizational analysis and discusses problems in implementing an accountability system. Chapter 1 surveys socio-technical and technical-behavioral systems. Chapter 2 discusses the importance of psychoanalysis and organization analysis. Chapter 3 presents problems in implementing an accountability system…

  14. Accountability.

    ERIC Educational Resources Information Center

    Mullen, David J., Ed.

    This monograph, prepared to assist Georgia elementary principals to better understand accountability and its implications for educational improvement, sets forth many of the theoretical and philosophical bases from which accountability is being considered. Leon M. Lessinger begins this 5-paper presentation by describing the need for accountability…

  15. Using Problem-Based Learning in Accounting

    ERIC Educational Resources Information Center

    Hansen, James D.

    2006-01-01

    In this article, the author describes the process of writing a problem-based learning (PBL) problem and shows how a typical end-of-chapter accounting problem can be converted to a PBL problem. PBL uses complex, real-world problems to motivate students to identify and research the concepts and principles they need to know to solve these problems.…

  16. Problem-Based Learning in Accounting

    ERIC Educational Resources Information Center

    Dockter, DuWayne L.

    2012-01-01

    Seasoned educators use an assortment of student-centered methods and tools to enhance their student's learning environment. In respects to methodologies used in accounting, educators have utilized and created new forms of problem-based learning exercises, including case studies, simulations, and other projects, to help students become more active…

  17. Intergovernmental Approaches for Strengthening K-12 Accountability Systems

    ERIC Educational Resources Information Center

    Armour-Garb, Allison, Ed.

    2007-01-01

    This volume contains an edited transcript of the Rockefeller Institute's October 29, 2007 symposium (Chicago, IL) entitled "Intergovernmental Approaches to Strengthen K-12 Accountability Systems" as well as a framework paper circulated in preparation for the symposium. The transcript begins with a list of the forty state and federal education…

  18. Arkansas' Curriculum Guide. Competency Based Computerized Accounting.

    ERIC Educational Resources Information Center

    Arkansas State Dept. of Education, Little Rock. Div. of Vocational, Technical and Adult Education.

    This guide contains the essential parts of a total curriculum for a one-year secondary-level course in computerized accounting. Addressed in the individual sections of the guide are the following topics: the complete accounting cycle, computer operations for accounting, computerized accounting and general ledgers, computerized accounts payable,…

  19. Trade-based carbon sequestration accounting.

    PubMed

    King, Dennis M

    2004-04-01

    This article describes and illustrates an accounting method to assess and compare "early" carbon sequestration investments and trades on the basis of the number of standardized CO2 emission offset credits they will provide. The "gold standard" for such credits is assumed to be a relatively riskless credit based on a CO2 emission reduction that provides offsets against CO2 emissions on a one-for-one basis. The number of credits associated with carbon sequestration needs to account for time, risk, durability, permanence, additionality, and other factors that future trade regulators will most certainly use to assign "official" credits to sequestration projects. The method presented here uses established principles of natural resource accounting and conventional rules of asset valuation to "score" projects. A review of 20 "early" voluntary United States-based CO2 offset trades that involve carbon sequestration reveals that the assumptions that buyers, sellers, brokers, and traders are using to characterize the economic potential of their investments and trades vary enormously. The article develops a "universal carbon sequestration credit scoring equation" and uses two of these trades to illustrate the sensitivity of trade outcomes to various assumptions about how future trade auditors are likely to "score" carbon sequestration projects in terms of their "equivalency" with CO2 emission reductions. The article emphasizes the importance of using a standard credit scoring method that accounts for time and risk to assess and compare even unofficial prototype carbon sequestration trades. The scoring method illustrated in this article is a tool that can protect the integrity of carbon sequestration credit trading and can assist buyers and sellers in evaluating the real economic potential of prospective trades.
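
A generic version of such a credit scoring calculation discounts each year's sequestration for time and for the risk of reversal before comparing it with an immediate, riskless emission reduction. The discount rate, risk rate, and tonnages below are hypothetical; this is a sketch of the general idea, not the article's own equation.

```python
# Sketch of scoring a sequestration project in emission-reduction-equivalent
# credits: each year's sequestered tonnage is discounted for time and for the
# cumulative risk of reversal. All rates and tonnages are hypothetical.

def score_credits(annual_tonnes, discount_rate, annual_reversal_risk):
    """Present-value CO2e credits relative to an immediate, riskless offset."""
    credits = 0.0
    survival = 1.0
    for t, tonnes in enumerate(annual_tonnes, start=1):
        survival *= 1.0 - annual_reversal_risk     # chance carbon is still stored
        credits += tonnes * survival / (1.0 + discount_rate) ** t
    return credits

# 10 years of 100 tCO2/yr, 5% discount rate, 2% annual reversal risk:
credits = score_credits([100.0] * 10, 0.05, 0.02)  # well below the 1000 t nominal total
```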

  20. Accountability for Project-Based Collaborative Learning

    ERIC Educational Resources Information Center

    Jamal, Abu-Hussain; Essawi, Mohammad; Tilchin, Oleg

    2014-01-01

One promising model for creating a learning environment and fostering the development of students' thinking is the Project-Based Collaborative Learning (PBCL) model. This model organizes learning through the collaborative performance of various projects. In this paper we describe an approach to enhancing the PBCL model through the creation of…

  1. Validating a mass balance accounting approach to using 7Be measurements to estimate event-based erosion rates over an extended period at the catchment scale

    NASA Astrophysics Data System (ADS)

    Porto, Paolo; Walling, Des E.; Cogliandro, Vanessa; Callegari, Giovanni

    2016-07-01

    Use of the fallout radionuclides cesium-137 and excess lead-210 offers important advantages over traditional methods of quantifying erosion and soil redistribution rates. However, both radionuclides provide information on longer-term (i.e., 50-100 years) average rates of soil redistribution. Beryllium-7, with its half-life of 53 days, can provide a basis for documenting short-term soil redistribution and it has been successfully employed in several studies. However, the approach commonly used introduces several important constraints related to the timing and duration of the study period. A new approach proposed by the authors that overcomes these constraints has been successfully validated using an erosion plot experiment undertaken in southern Italy. Here, a further validation exercise undertaken in a small (1.38 ha) catchment is reported. The catchment was instrumented to measure event sediment yields and beryllium-7 measurements were employed to document the net soil loss for a series of 13 events that occurred between November 2013 and June 2015. In the absence of significant sediment storage within the catchment's ephemeral channel system and of a significant contribution from channel erosion to the measured sediment yield, the estimates of net soil loss for the individual events could be directly compared with the measured sediment yields to validate the former. The close agreement of the two sets of values is seen as successfully validating the use of beryllium-7 measurements and the new approach to obtain estimates of net soil loss for a sequence of individual events occurring over an extended period at the scale of a small catchment.
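For readers unfamiliar with how beryllium-7 inventories translate into soil loss, the following is a minimal sketch of a Walling-style profile-distribution conversion. The parameter values are hypothetical, and the study's actual mass balance approach additionally accounts for fallout input and radioactive decay between events.

```python
# Illustrative sketch: because 7Be concentrates near the soil surface with
# an (assumed) exponential depth profile, a point's net soil loss can be
# estimated from the ratio of its inventory to a reference inventory.
import math

def soil_loss_per_area(A_ref, A_point, h0):
    """Net soil loss (kg/m^2) at a sampling point.

    A_ref:   7Be inventory at an undisturbed reference site (Bq/m^2).
    A_point: measured 7Be inventory at the eroding point (Bq/m^2).
    h0:      relaxation mass depth of the exponential 7Be profile (kg/m^2).
    """
    if A_point >= A_ref:
        return 0.0  # no net loss; deposition is handled separately
    return h0 * math.log(A_ref / A_point)

# A point retaining half the reference inventory:
print(round(soil_loss_per_area(400.0, 200.0, 4.0), 2))  # 4.0 * ln 2, i.e. ~2.77
```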

  2. Responding Effectively to Test-Based Accountability

    ERIC Educational Resources Information Center

    Hamilton, Laura; Stecher, Brian

    2004-01-01

    The No Child Left Behind (NCLB) Act of 2001 has focused the attention of educators, policy makers, and the public on accountability for performance in public education. Yet many of those who will be responsible for improving school performance lack guidance on how to proceed in the brave new world of NCLB accountability. For the most part, state…

  3. Trading Land: A Review of Approaches to Accounting for Upstream Land Requirements of Traded Products

    PubMed Central

    Haberl, Helmut; Kastner, Thomas; Wiedenhofer, Dominik; Eisenmenger, Nina; Erb, Karl‐Heinz

    2015-01-01

    Summary Land use is recognized as a pervasive driver of environmental impacts, including climate change and biodiversity loss. Global trade leads to “telecoupling” between the land use of production and the consumption of biomass‐based goods and services. Telecoupling is captured by accounts of the upstream land requirements associated with traded products, also commonly referred to as land footprints. These accounts face challenges in two main areas: (1) the allocation of land to products traded and consumed and (2) the metrics to account for differences in land quality and land‐use intensity. For two main families of accounting approaches (biophysical, factor‐based and environmentally extended input‐output analysis), this review discusses conceptual differences and compares results for land footprints. Biophysical approaches are able to capture a large number of products and different land uses, but suffer from a truncation problem. Economic approaches solve the truncation problem, but are hampered by the limited disaggregation of sectors and products. In light of the conceptual differences, the overall similarity of results generated by both types of approaches is remarkable. Diametrically opposed results for some of the world's largest producers and consumers of biomass‐based products, however, make interpretation difficult. This review aims to provide clarity on some of the underlying conceptual issues of accounting for land footprints. PMID:27547028

  4. Challenges of "Thinking Differently" with Rhizoanalytic Approaches: A Reflexive Account

    ERIC Educational Resources Information Center

    Cumming, Tamara

    2015-01-01

    Growing numbers of educational researchers are using rhizoanalytic approaches based on the work of Deleuze and Guattari to think differently in their research practices. However, as those engaging in debates about post-qualitative research suggest, thinking differently is not without its challenges. This paper uses three complex challenges…

  5. Teaching Consolidations Accounting: An Approach to Easing the Challenge

    ERIC Educational Resources Information Center

    Murphy, Elizabeth A.; McCarthy, Mark A.

    2010-01-01

    Teaching and learning accounting for consolidations is a challenging endeavor. Students not only need to understand the conceptual underpinnings of the accounting requirements for consolidations, but also must master the complex accounting needed to prepare consolidated financial statements. To add to the challenge, the consolidation process is…

  6. Accountability.

    ERIC Educational Resources Information Center

    Lashway, Larry

    1999-01-01

    This issue reviews publications that provide a starting point for principals looking for a way through the accountability maze. Each publication views accountability differently, but collectively these readings argue that even in an era of state-mandated assessment, principals can pursue proactive strategies that serve students' needs. James A.…

  7. Accountability.

    ERIC Educational Resources Information Center

    The Newsletter of the Comprehensive Center-Region VI, 1999

    1999-01-01

    Controversy surrounding the accountability movement is related to how the movement began in response to dissatisfaction with public schools. Opponents see it as one-sided, somewhat mean-spirited, and a threat to the professional status of teachers. Supporters argue that all other spheres of the workplace have accountability systems and that the…

  8. 12 CFR 563b.465 - Do account holders retain any voting rights based on their liquidation sub-accounts?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 5 2011-01-01 2011-01-01 false Do account holders retain any voting rights... Account § 563b.465 Do account holders retain any voting rights based on their liquidation sub-accounts? Eligible account holders or supplemental eligible account holders do not retain any voting rights based...

  9. 12 CFR 563b.465 - Do account holders retain any voting rights based on their liquidation sub-accounts?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Do account holders retain any voting rights... Account § 563b.465 Do account holders retain any voting rights based on their liquidation sub-accounts? Eligible account holders or supplemental eligible account holders do not retain any voting rights based...

  10. Applications and rewards of cost accounting: a practical approach.

    PubMed

    Gottlieb, J A; Suskin, S W

    1986-07-01

    This article is organized to present the full process of cost accounting. Cost behavior characteristics will be explained to provide a foundation for classifying specific types of cost. An overview of cost accounting applications is presented with discussions of productivity monitoring, contract pricing, program evaluation and strategic planning.

  11. A Humanistic Approach to South African Accounting Education

    ERIC Educational Resources Information Center

    West, A.; Saunders, S.

    2006-01-01

    Humanistic psychologist Carl Rogers made a distinction between traditional approaches and humanistic "learner-centred" approaches to education. The traditional approach holds that educators impart their knowledge to willing and able recipients; whereas the humanistic approach holds that educators act as facilitators who assist learners in their…

  12. Improving hospital budgeting and accountability: a best practice approach.

    PubMed

    Clark, Jonathan J

    2005-07-01

    Best practices in setting and managing healthcare organization budgets include: Using comparative benchmarks. Setting accurate, high-performance department budgets. Establishing a culture of accountability. Managing expenses. Monitoring variances and requiring corrective action plans. Employing a balanced scorecard. PMID:16060103

  13. Incentives and Test-Based Accountability in Education

    ERIC Educational Resources Information Center

    Hout, Michael, Ed.; Elliott, Stuart W., Ed.

    2011-01-01

    In recent years there have been increasing efforts to use accountability systems based on large-scale tests of students as a mechanism for improving student achievement. The federal No Child Left Behind Act (NCLB) is a prominent example of such an effort, but it is only the continuation of a steady trend toward greater test-based accountability in…

  14. Community-Based School Finance and Accountability: A New Era for Local Control in Education Policy?

    ERIC Educational Resources Information Center

    Vasquez Heilig, Julian; Ward, Derrick R.; Weisman, Eric; Cole, Heather

    2014-01-01

    Top-down accountability policies have arguably had very limited impact over the past 20 years. Education stakeholders are now contemplating new forms of bottom-up accountability. In 2013, policymakers in California enacted a community-based approach that creates the Local Control Funding Formula (LCFF) process for school finance to increase…

  15. Educational Accountability: A Qualitatively Driven Mixed-Methods Approach

    ERIC Educational Resources Information Center

    Hall, Jori N.; Ryan, Katherine E.

    2011-01-01

    This article discusses the importance of mixed-methods research, in particular the value of qualitatively driven mixed-methods research for quantitatively driven domains like educational accountability. The article demonstrates the merits of qualitative thinking by describing a mixed-methods study that focuses on a middle school's system of…

  16. Evaluating an Accountability Mentoring Approach for School Counselors

    ERIC Educational Resources Information Center

    Milsom, Amy; McCormick, Katlyn

    2015-01-01

    School counselors are encouraged to use accountability in order to advocate for their programs and students, but many school counselors lack confidence to work with data. This project examined the effectiveness of an individualized mentoring intervention targeting data attitudes, self-efficacy, and behaviors. After participating in the…

  17. Two-stage decision approach to material accounting

    SciTech Connect

    Opelka, J.H.; Sutton, W.B.

    1982-01-01

    The validity of the 4σ alarm threshold has been checked for hypothetical large and small facilities using a two-stage decision model in which the diverter's strategic variable is the quantity diverted, and the defender's strategic variables are the alarm threshold and the effectiveness of the physical security and material control systems in the possible presence of a diverter. For large facilities, the material accounting system inherently appears not to be a particularly useful system for the deterrence of diversions, and essentially no improvement can be made by lowering the alarm threshold below 4σ. For small facilities, reduction of the threshold to 2σ or 3σ is a cost-effective change for the accounting system, but is probably less cost-effective than making improvements in the material control and physical security systems.
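The role of the alarm threshold can be illustrated with a simple detection-probability calculation. This is a toy, not the paper's two-stage model: it assumes the material-balance measurement error is normally distributed with standard deviation sigma.

```python
# Illustrative sketch: probability that a diversion triggers an alarm when
# the material-balance error is N(0, sigma) and the threshold is k*sigma.
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def detection_prob(diverted, sigma, k):
    """P(alarm) when 'diverted' units are removed, threshold = k*sigma."""
    return 1.0 - norm_cdf(k - diverted / sigma)

# Lowering the threshold from 4-sigma to 2-sigma for a 2-sigma diversion:
print(round(detection_prob(2.0, 1.0, 4), 3))  # ~0.023
print(round(detection_prob(2.0, 1.0, 2), 3))  # 0.5
# ...at the cost of a higher false-alarm rate when nothing is diverted:
print(round(detection_prob(0.0, 1.0, 2), 3))  # ~0.023
```

The trade-off the abstract weighs is visible here: a lower threshold raises detection probability and the false-alarm rate together, which is why the cost-effectiveness comparison with physical security matters.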

  18. Student Accountability in Team-Based Learning Classes

    ERIC Educational Resources Information Center

    Stein, Rachel E.; Colyer, Corey J.; Manning, Jason

    2016-01-01

    Team-based learning (TBL) is a form of small-group learning that assumes stable teams promote accountability. Teamwork promotes communication among members; application exercises promote active learning. Students must prepare for each class; failure to do so harms their team's performance. Therefore, TBL promotes accountability. As part of the…

  19. Financial Accounting System Based Upon NCES Revised Handbook II.

    ERIC Educational Resources Information Center

    National Center for Education Statistics (DHEW), Washington, DC. Educational Data Standards Branch.

    This publication describes the development and implementation of a school district financial accounting system based on the concepts and guidelines of the National Center for Education Statistics Handbook II, Revised. The system described was designed by school district personnel to utilize computer equipment and to meet the accounting and…

  20. Test-Based Teacher Evaluations: Accountability vs. Responsibility

    ERIC Educational Resources Information Center

    Bolyard, Chloé

    2015-01-01

    Gert Biesta contends that managerial accountability, which focuses on efficiency and competition, dominates the current political arena in education. Such accountability has influenced states' developments of test-based teacher evaluations in an attempt to quantify teachers' efficacy on student learning. With numerous state policies requiring the…

  1. School-Based Management and Accountability Procedures Manual

    ERIC Educational Resources Information Center

    North Carolina Department of Public Instruction, 2004

    2004-01-01

    From the mission, several principles were developed to guide the School-Based Management and Accountability Program (the ABCs). (1) The ABCs sets standards for student performance and growth in the basics that are the foundation for further learning and achievement; (2) The accountability system in the ABCs plan is designed to result in improved…

  2. Future Performance Trend Indicators: A Current Value Approach to Human Resources Accounting. Report V: The Value Attribution Process. Technical Report.

    ERIC Educational Resources Information Center

    Lapointe, Jean B.; And Others

    The development of future performance trend indicators is based on the current value approach to human resource accounting. The value attribution portion of the current value approach is used to estimate the dollar value of observed changes in the state of the human organization. The procedure for value attribution includes: prediction of changes…

  3. Developing Accounting Students' Listening Skills: Barriers, Opportunities and an Integrated Stakeholder Approach

    ERIC Educational Resources Information Center

    Stone, Gerard; Lightbody, Margaret; Whait, Rob

    2013-01-01

    Accountants and employers of accounting graduates consider listening to be among the most important communication skills that graduates possess. However, accounting education practices that develop students' listening skills are uncommon. Further, in the case of listening development, the current approach of prescribing that educators do more to…

  4. The Impact of a Participant-Based Accounting Cycle Course on Student Performance in Intermediate Financial Accounting I

    ERIC Educational Resources Information Center

    Siagian, Ferdinand T.; Khan, Mohammad

    2016-01-01

    The authors investigated whether students in an Intermediate Financial Accounting I course who took a 1-credit, participant-based accounting cycle course performed better than students who did not take the accounting cycle course. Results indicate a higher likelihood of earning a better grade for students who took the accounting cycle course even…

  5. Accounting for Parameter Uncertainty in Reservoir Uncertainty Assessment: The Conditional Finite-Domain Approach

    SciTech Connect

    Babak, Olena Deutsch, Clayton V.

    2009-03-15

    An important aim of modern geostatistical modeling is to quantify uncertainty in geological systems. Geostatistical modeling requires many input parameters. The input univariate distribution or histogram is perhaps the most important. A new method for assessing uncertainty in the histogram, particularly uncertainty in the mean, is presented. This method, referred to as the conditional finite-domain (CFD) approach, accounts for the size of the domain and the local conditioning data. It is a stochastic approach based on a multivariate Gaussian distribution. The CFD approach is shown to be convergent, design independent, and parameterization invariant. The performance of the CFD approach is illustrated in a case study focusing on the impact of the number of data and the range of correlation on the limiting uncertainty in the parameters. The spatial bootstrap method and CFD approach are compared. As the number of data increases, uncertainty in the sample mean decreases in both the spatial bootstrap and the CFD. Contrary to spatial bootstrap, uncertainty in the sample mean in the CFD approach decreases as the range of correlation increases. This is a direct result of the conditioning data being more correlated to unsampled locations in the finite domain. The sensitivity of the limiting uncertainty relative to the variogram and the variable limits are also discussed.
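The bootstrap baseline against which the CFD approach is compared can be sketched in simplified form. This is illustrative only: a true spatial bootstrap resamples via simulation that honors the variogram, whereas this version ignores spatial correlation entirely.

```python
# Illustrative sketch: resample the data with replacement and take the
# spread of the resampled means as the uncertainty in the sample mean.
import random
import statistics

def bootstrap_mean_uncertainty(data, n_boot=2000, seed=42):
    rng = random.Random(seed)
    means = [statistics.fmean(rng.choices(data, k=len(data)))
             for _ in range(n_boot)]
    return statistics.stdev(means)

# Two synthetic data sets drawn from the same distribution:
rng_a = random.Random(1)
data_small = [rng_a.gauss(10.0, 2.0) for _ in range(25)]
rng_b = random.Random(2)
data_large = [rng_b.gauss(10.0, 2.0) for _ in range(400)]

# Uncertainty in the sample mean shrinks as the number of data grows:
print(bootstrap_mean_uncertainty(data_small) > bootstrap_mean_uncertainty(data_large))  # True
```

The CFD result highlighted in the abstract is the part this sketch cannot reproduce: with conditioning data correlated to unsampled locations, uncertainty in the mean also falls as the range of correlation grows.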

  6. The utilization of activity-based cost accounting in hospitals.

    PubMed

    Emmett, Dennis; Forget, Robert

    2005-01-01

    Healthcare costs are being examined on all fronts. Healthcare accounts for 11% of the gross national product and will continue to rise as the "baby boomers" reach retirement age. While ascertaining costs is important, most research shows that costing methods have not been implemented in hospitals. This study is concerned with the use of costing methods, particularly activity-based cost accounting. A mail survey of CFOs was undertaken to determine the type of cost accounting method they use. In addition, they were asked whether they were aware of activity-based cost accounting and whether they had implemented it or were planning to implement it. Only 71.8% were aware of it and only 4.7% had implemented it. In addition, only 52% of all hospitals report using any cost accounting system. Education needs to ensure that all healthcare executives are cognizant of activity-based accounting and its importance in determining costs. Only by determining costs can hospitals strive to contain them.

  7. Accounting for Recoil Effects in Geochronometers: A New Model Approach

    NASA Astrophysics Data System (ADS)

    Lee, V. E.; Huber, C.

    2012-12-01

    Because the size and shape of the dated grain are a major control on the magnitude of recoil loss, the first feature is the ability to calculate recoil effects on isotopic compositions for realistic, complex grain shapes and surface roughnesses. This is useful because natural grains may have irregular shapes that do not conform to simple geometric descriptions. Perhaps more importantly, the surface area over which recoiled nuclides are lost can be significantly underestimated when grain surface roughness is not accounted for, since the recoil distances can be of similar characteristic lengthscales to surface roughness features. The second key feature is the ability to incorporate dynamical geologic processes affecting grain surfaces in natural settings, such as dissolution and crystallization. We describe the model and its main components, and point out implications for the geologically relevant chronometers mentioned above.

  8. A distributed approach to accounting for carbon in wood products

    SciTech Connect

    Marland, Eric; Stellar, Kirk; Marland, Gregg

    2010-01-01

    With an evolving political environment of commitments to limit emissions of greenhouse gases, and of markets to trade in emissions permits, there is a growing scientific, political, and economic need to accurately evaluate carbon (C) stocks and flows, especially those related to human activities. One component of the global carbon cycle that has been contentious is the stock of carbon that is physically held in harvested wood products. The carbon stored in wood products has sometimes been overlooked, but the amount of carbon contained in wood products is not trivial, it is increasing with time, and it is significant to some Parties. This paper is concerned with accurate treatment of harvested wood products in inventories of CO2 emissions to the atmosphere. The methodologies outlined demonstrate a flexible way to expand current methods beyond the assumption of a simple, first-order decay to include the use of more accurate and detailed data while retaining the simplicity of the basic formulas. The paper demonstrates that a more accurate representation of decay time can have significant economic implications in a system where emissions are taxed or emissions permits are traded. The method can be easily applied using only data on annual production of wood products and two parameters to characterize their expected lifetime. These methods are not specific to wood products but can be applied to long-lived, carbon-containing products from sources other than wood, e.g. long-lived petrochemical products. A single unifying approach that is both simple and flexible has the potential to be more accurate in its results, more efficient in its implementation, and economically important to some Parties.
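The simple first-order decay assumption that the paper generalizes can be written down in a few lines. This is a hedged sketch: the half-life value is a hypothetical product-class parameter, not a figure from the paper.

```python
# Illustrative sketch of first-order decay of the carbon held in a
# wood-product pool. Half-life is a hypothetical product-class parameter.
import math

def stock_remaining(initial_tC, half_life_years, years):
    """Carbon (tC) still stored 'years' after production, first-order decay."""
    k = math.log(2.0) / half_life_years
    return initial_tC * math.exp(-k * years)

# 100 tC of lumber with a 30-year half-life, checked at the half-life:
print(round(stock_remaining(100.0, 30.0, 30.0), 1))  # 50.0
```

The paper's point is economic as much as physical: under an emissions tax or permit market, replacing this single-exponential assumption with a more realistic lifetime distribution changes when emissions are booked, and therefore what they cost.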

  9. Serving Public Interests in Educational Accountability: Alternative Approaches to Democratic Evaluation

    ERIC Educational Resources Information Center

    Ryan, Katherine E.

    2004-01-01

    Today, educational evaluation theory and practice face a critical juncture with the kind of educational accountability evaluation legislated by No Child Left Behind. While the goal of this kind of educational accountability is to improve education, it is characterized by a hierarchical, top-down approach to improving educational achievement…

  10. Knuckling Under? School Superintendents and Accountability-Based Educational Reform

    ERIC Educational Resources Information Center

    Feuerstein, Abe

    2013-01-01

    The goal of this article is to explore the various ways that superintendents have responded to accountability-based educational reform efforts such as No Child Left Behind, the factors that have influenced their responses, and the implications of these responses for current and future educational leaders. With respect to the first issue, empirical…

  11. Ontology-Based e-Assessment for Accounting Education

    ERIC Educational Resources Information Center

    Litherland, Kate; Carmichael, Patrick; Martínez-García, Agustina

    2013-01-01

    This summary reports on a pilot of a novel, ontology-based e-assessment system in accounting. The system, OeLe, uses emerging semantic technologies to offer an online assessment environment capable of marking students' free text answers to questions of a conceptual nature. It does this by matching their response with a "concept map" or…

  12. Challenges in Building Disease-Based National Health Accounts

    PubMed Central

    Rosen, Allison B.; Cutler, David M.

    2012-01-01

    Background Measuring spending on diseases is critical to assessing the value of medical care. Objective To review the current state of cost of illness (COI) estimation methods, identifying their strengths, limitations and uses. We briefly describe the current National Health Expenditure Accounts (NHEA), and then go on to discuss the addition of COI estimation to the NHEA. Conclusion Recommendations are made for future research aimed at identifying the best methods for developing and using disease-based national health accounts to optimize the information available to policymakers as they struggle with difficult resource allocation decisions. PMID:19536017

  13. Water Accounting Plus (WA+) - a water accounting procedure for complex river basins based on satellite measurements

    NASA Astrophysics Data System (ADS)

    Karimi, P.; Bastiaanssen, W. G. M.; Molden, D.

    2012-11-01

    Coping with the issue of water scarcity and growing competition for water among different sectors requires proper water management strategies and decision processes. A pre-requisite is a clear understanding of the basin hydrological processes, manageable and unmanageable water flows, the interaction with land use and opportunities to mitigate the negative effects and increase the benefits of water depletion on society. Currently, water professionals do not have a common framework that links hydrological flows to user groups of water and their benefits. The absence of a standard hydrological and water management summary is causing confusion and poor decisions. The non-availability of water flow data is one of the underpinning reasons for not having operational water accounting systems for river basins in place. In this paper we introduce Water Accounting Plus (WA+), which is a new framework designed to provide explicit spatial information on water depletion and net withdrawal processes in complex river basins. The influence of land use on the water cycle is described explicitly by defining land use groups with common characteristics. Analogous to financial accounting, WA+ presents four sheets including (i) a resource base sheet, (ii) a consumption sheet, (iii) a productivity sheet, and (iv) a withdrawal sheet. Every sheet encompasses a set of indicators that summarize the overall water resources situation. The impact of external (e.g. climate change) and internal influences (e.g. infrastructure building) can be estimated by studying the changes in these WA+ indicators. Satellite measurements can be used for three of the four sheets, but they are not a precondition for implementing the WA+ framework. Data from hydrological models and water allocation models can also be used as inputs to WA+.
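The sheet-and-indicator structure can be illustrated with a simplified fraction-type calculation. The indicator names and formulas below are schematic stand-ins, not the official WA+ definitions.

```python
# Schematic stand-in for a WA+-style "resource base sheet": basin flows
# are condensed into fraction-type indicators. Simplified illustration.

def resource_base_indicators(gross_inflow, storage_change,
                             et_landscape, et_utilized, outflow):
    net_inflow = gross_inflow + storage_change   # water available this period
    depletion = et_landscape + et_utilized       # water consumed in the basin
    return {
        "net_inflow": net_inflow,
        "depleted_fraction": depletion / net_inflow,
        "outflow_fraction": outflow / net_inflow,
    }

# Hypothetical basin, all flows in km^3/yr:
ind = resource_base_indicators(gross_inflow=100.0, storage_change=-5.0,
                               et_landscape=40.0, et_utilized=30.0,
                               outflow=25.0)
print(round(ind["depleted_fraction"], 2))  # 0.74
```

Tracking such fractions over time is how the framework separates external influences (a change in gross inflow) from internal ones (a change in utilized depletion after infrastructure building).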

  14. Failing Tests: Commentary on "Adapting Educational Measurement to the Demands of Test-Based Accountability"

    ERIC Educational Resources Information Center

    Thissen, David

    2015-01-01

    In "Adapting Educational Measurement to the Demands of Test-Based Accountability" Koretz takes the time-honored engineering approach to educational measurement, identifying specific problems with current practice and proposing minimal modifications of the system to alleviate those problems. In response to that article, David Thissen…

  15. Implementing Outcomes-Based Accountability in Children's Services: An Overview of the Process and Impact. LG Group Research Report

    ERIC Educational Resources Information Center

    Chamberlain, Tamsin; Golden, Sarah; Walker, Fiona

    2010-01-01

    Is your local authority using outcomes-based accountability for planning and managing the performance of services? If yes, does it lead to an improvement? The Outcomes Based Accountability (OBA) approach uses performance management categories that distinguish between "How much did we do?", "How well did we do it?" and "Is anyone better off?" Based…

  16. An Inter-Institutional Exploration of the Learning Approaches of Students Studying Accounting

    ERIC Educational Resources Information Center

    Byrne, Marann; Flood, Barbara; Willis, Pauline

    2009-01-01

    This paper provides a comparative analysis of the learning approaches of students taking their first course in accounting at a United States or an Irish university. The data for this study was gathered from 204 students in the U.S. and 309 in Ireland, using the Approaches and Study Skills Inventory for Students (ASSIST, 1997) which measures…

  17. Accounting based risk measures for not-for-profit hospitals.

    PubMed

    Smith, D G; Wheeler, J R

    1989-11-01

    This paper discusses the issues involved with determining an appropriate discount rate for not-for-profit hospitals and develops a method for computing measures of systematic risk based on a hospital's own accounting data. Data on four hospital management companies are used to demonstrate the method. Results indicate the need for sensitivity analysis in the selection of estimation methods and in the final determination of a discount rate.
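The idea of deriving systematic risk from a hospital's own accounting data can be sketched as an "accounting beta" regression. This is illustrative only: the return series below are hypothetical, and the paper's estimation methods differ in detail.

```python
# Illustrative "accounting beta": ordinary least-squares slope of a
# hospital's return on assets (ROA) against an industry-average series.
# All figures are hypothetical.

def accounting_beta(hospital_roa, industry_roa):
    n = len(hospital_roa)
    mh = sum(hospital_roa) / n
    mi = sum(industry_roa) / n
    cov = sum((h - mh) * (i - mi)
              for h, i in zip(hospital_roa, industry_roa)) / (n - 1)
    var = sum((i - mi) ** 2 for i in industry_roa) / (n - 1)
    return cov / var

hospital = [0.06, 0.08, 0.05, 0.09, 0.07]   # ROA by year
industry = [0.05, 0.07, 0.04, 0.08, 0.06]
print(round(accounting_beta(hospital, industry), 2))  # 1.0
```

One common way to use such a beta is a CAPM-style discount rate, r = r_f + beta * (r_m - r_f), which is why the abstract stresses sensitivity analysis before settling on a final rate.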

  18. A Comparison of Four Approaches to Account for Method Effects in Latent State-Trait Analyses

    PubMed Central

    Geiser, Christian; Lockhart, Ginger

    2012-01-01

    Latent state-trait (LST) analysis is frequently applied in psychological research to determine the degree to which observed scores reflect stable person-specific effects, effects of situations and/or person-situation interactions, and random measurement error. Most LST applications use multiple repeatedly measured observed variables as indicators of latent trait and latent state residual factors. In practice, such indicators often show shared indicator-specific (or methods) variance over time. In this article, the authors compare four approaches to account for such method effects in LST models and discuss the strengths and weaknesses of each approach based on theoretical considerations, simulations, and applications to actual data sets. The simulation study revealed that the LST model with indicator-specific traits (Eid, 1996) and the LST model with M − 1 correlated method factors (Eid, Schneider, & Schwenkmezger, 1999) performed well, whereas the model with M orthogonal method factors used in the early work of Steyer, Ferring, and Schmitt (1992) and the correlated uniqueness approach (Kenny, 1976) showed limitations under conditions of either low or high method-specificity. Recommendations for the choice of an appropriate model are provided. PMID:22309958

  19. Improving hospital cost accounting with activity-based costing.

    PubMed

    Chan, Y C

    1993-01-01

    In this article, activity-based costing, an approach that has proved to be an improvement over the conventional costing system in product costing, is introduced. By combining activity-based costing with standard costing, health care administrators can better plan and control the costs of health services provided while ensuring that the organization's bottom line is healthy.
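The two-stage allocation that defines activity-based costing can be shown in a few lines. The activities, cost drivers, and figures below are hypothetical.

```python
# Illustrative two-stage activity-based costing: overhead is first assigned
# to activities, then charged to services in proportion to each service's
# consumption of the activity's cost driver. Figures are hypothetical.

activity_costs = {"patient_admission": 60000.0, "lab_setup": 40000.0}  # $/yr
driver_totals = {"patient_admission": 1200, "lab_setup": 800}          # driver units/yr

def abc_cost(driver_usage):
    """Cost charged to one service, given its driver consumption per activity."""
    return sum(activity_costs[a] * driver_usage[a] / driver_totals[a]
               for a in driver_usage)

# A service consuming 300 admissions and 100 lab setups:
print(abc_cost({"patient_admission": 300, "lab_setup": 100}))  # 20000.0
```

Combining this with standard costing, as the article suggests, would mean comparing these activity-driven costs against predetermined standards and investigating the variances.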

  20. Ecological accounting based on extended exergy: a sustainability perspective.

    PubMed

    Dai, Jing; Chen, Bin; Sciubba, Enrico

    2014-08-19

    Excessive energy consumption, environmental pollution, and ecological destruction have gradually become major obstacles to the development of societal-economic-natural complex ecosystems. For a national ecological-economic system, making resource accounting explicit, diagnosing resource conversion, and measuring the disturbance that environmental emissions impose on the system are the fundamental basis of sustainable development and coordinated management. This paper presents an extended exergy (EE) accounting that includes material exergy and the exergy equivalents of externalities across the whole process from production to consumption, and China in 2010 is chosen as a case study to foster an in-depth understanding of the conflict between high-speed development and the available resources. The whole society is decomposed into seven sectors (i.e., Agriculture, Extraction, Conversion, Industry, Transportation, Tertiary, and Domestic sectors) according to their distinct characteristics. An adaptive EE accounting database, which incorporates traditional energy, renewable energy, mineral elements, and other natural resources as well as resource-based secondary products, is constructed on the basis of the internal flows in the system. In addition, the environmental emission accounting has been adjusted to calculate the externalities-equivalent exergy. The results show that the EE value for China in 2010 was 1.80 × 10^14 MJ, a substantial increase. Furthermore, a system of EE-based sustainability indices has been established to evaluate the performance of flows and storages within the system from a sustainability perspective. The value of the EE-based sustainability indicator was calculated to be 0.23, much lower than the critical value of 1, implying that China is still in a stage of high energy consumption and a low sustainability level.

  1. Water Accounting Plus (WA+) - a water accounting procedure for complex river basins based on satellite measurements

    NASA Astrophysics Data System (ADS)

    Karimi, P.; Bastiaanssen, W. G. M.; Molden, D.

    2013-07-01

    Coping with water scarcity and growing competition for water among different sectors requires proper water management strategies and decision processes. A pre-requisite is a clear understanding of the basin hydrological processes, manageable and unmanageable water flows, the interaction with land use and opportunities to mitigate the negative effects and increase the benefits of water depletion on society. Currently, water professionals do not have a common framework that links depletion to user groups of water and their benefits. The absence of a standard hydrological and water management summary is causing confusion and wrong decisions. The non-availability of water flow data is one of the underpinning reasons for not having operational water accounting systems for river basins in place. In this paper, we introduce Water Accounting Plus (WA+), which is a new framework designed to provide explicit spatial information on water depletion and net withdrawal processes in complex river basins. The influence of land use and landscape evapotranspiration on the water cycle is described explicitly by defining land use groups with common characteristics. WA+ presents four sheets including (i) a resource base sheet, (ii) an evapotranspiration sheet, (iii) a productivity sheet, and (iv) a withdrawal sheet. Every sheet encompasses a set of indicators that summarise the overall water resources situation. The impact of external (e.g., climate change) and internal influences (e.g., infrastructure building) can be estimated by studying the changes in these WA+ indicators. Satellite measurements can be used to acquire a vast amount of the required data but are not a precondition for implementing the WA+ framework. Data from hydrological models and water allocation models can also be used as inputs to WA+.

  2. The Bottom-Up Simple Approach to School Accountability and Improvement.

    ERIC Educational Resources Information Center

    Carr, John; Artman, Elaine M.

    The School BUS (bottom-up simple) model for school accountability uses local standards-based assessment data to improve schoolwide performance during the year, and from year to year. This system derives its ideas from abundant professional literature about successful school renewal. A local accountability process should link changes in school…

  3. Pension Accounting and Reporting with Other Comprehensive Income and Deferred Taxes: A Worksheet Approach

    ERIC Educational Resources Information Center

    Jackson, Robert E.; Sneathen, L. Dwight, Jr.; Veal, Timothy R.

    2012-01-01

    This instructional tool presents pension accounting using a worksheet approach where debits equal credits for both the employer and for the plan. Transactions associated with the initiation of the plan through the end of the second year of the plan are presented, including their impact on accumulated other comprehensive income and deferred taxes.…
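A minimal sketch of the worksheet idea described above: every pension-related entry is posted so that debits equal credits on the employer's books. The account names and dollar figures below are hypothetical illustrations, not taken from the article.

```python
# Minimal sketch of a balanced pension worksheet. Every posting keeps total
# debits equal to total credits. All figures are hypothetical.

def post(worksheet, account, debit=0.0, credit=0.0):
    """Post one line to the worksheet as a running (debit, credit) pair."""
    d, c = worksheet.get(account, (0.0, 0.0))
    worksheet[account] = (d + debit, c + credit)

employer = {}
# Year 1 (illustrative): service cost recognized as pension expense,
# partly funded by a cash contribution, remainder accrued as a liability.
post(employer, "pension expense", debit=50_000)
post(employer, "cash", credit=40_000)
post(employer, "pension liability", credit=10_000)

total_debits = sum(d for d, _ in employer.values())
total_credits = sum(c for _, c in employer.values())
assert total_debits == total_credits  # the worksheet must balance
print(total_debits, total_credits)
```

The same pattern can be repeated for the plan's side of the transactions, with the balance check serving as a built-in self-audit at each step.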

  4. 76 FR 20974 - Implications of Climate Change for Bioassessment Programs and Approaches To Account for Effects

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-14

    ... AGENCY Implications of Climate Change for Bioassessment Programs and Approaches To Account for Effects... external peer review workshop to review the external review draft report titled, ``Implications of Climate... by climate change. The study (1) Investigates the potential to identify biological response...

  5. Use of Resources, People and Approaches by Accounting Students in a Blending Learning Environment

    ERIC Educational Resources Information Center

    O'Keefe, Patricia; Rienks, Jane H.; Smith, Bernadette

    2014-01-01

    This research investigates how students used or "blended" the various learning resources, including people, while studying a compulsory, first year accounting unit. The unit design incorporated a blended learning approach. The study was motivated by perceived low rates of attendance and low levels of communication with lecturers which…

  6. A Comparison of the Learning Approaches of Accounting and Science Students at an Irish University

    ERIC Educational Resources Information Center

    Byrne, Marann; Finlayson, Odilla; Flood, Barbara; Lyons, Orla; Willis, Pauline

    2010-01-01

    One of the major challenges facing accounting education is the creation of a learning environment that promotes high-quality learning. Comparative research across disciplines offers educators the opportunity to gain a better understanding of the influence of contextual and personal variables on students' learning approaches. Using the Approaches…

  7. Desired emotions across cultures: A value-based account.

    PubMed

    Tamir, Maya; Schwartz, Shalom H; Cieciuch, Jan; Riediger, Michaela; Torres, Claudio; Scollon, Christie; Dzokoto, Vivian; Zhou, Xiaolu; Vishkin, Allon

    2016-07-01

    Values reflect how people want to experience the world; emotions reflect how people actually experience the world. Therefore, we propose that across cultures people desire emotions that are consistent with their values. Whereas prior research focused on the desirability of specific affective states or 1 or 2 target emotions, we offer a broader account of desired emotions. After reporting initial evidence for the potential causal effects of values on desired emotions in a preliminary study (N = 200), we tested the predictions of our proposed model in 8 samples (N = 2,328) from distinct world cultural regions. Across cultural samples, we found that people who endorsed values of self-transcendence (e.g., benevolence) wanted to feel more empathy and compassion, people who endorsed values of self-enhancement (e.g., power) wanted to feel more anger and pride, people who endorsed values of openness to change (e.g., self-direction) wanted to feel more interest and excitement, and people who endorsed values of conservation (e.g., tradition) wanted to feel more calmness and less fear. These patterns were independent of differences in emotional experience. We discuss the implications of our value-based account of desired emotions for understanding emotion regulation, culture, and other individual differences.

  8. Communication: A combined periodic density functional and incremental wave-function-based approach for the dispersion-accounting time-resolved dynamics of ⁴He nanodroplets on surfaces: ⁴He/graphene

    PubMed

    de Lara-Castells, María Pilar; Stoll, Hermann; Civalleri, Bartolomeo; Causà, Mauro; Voloshina, Elena; Mitrushchenkov, Alexander O; Pi, Martí

    2014-10-21

    In this work we propose a general strategy to calculate accurate He-surface interaction potentials. It extends the dispersionless density functional approach recently developed by Pernal et al. [Phys. Rev. Lett. 103, 263201 (2009)] to adsorbate-surface interactions by including periodic boundary conditions. We also introduce a scheme to parametrize the dispersion interaction by calculating two- and three-body dispersion terms at coupled cluster singles and doubles and perturbative triples (CCSD(T)) level via the method of increments [H. Stoll, J. Chem. Phys. 97, 8449 (1992)]. The performance of the composite approach is tested on ⁴He/graphene by determining the energies of the low-lying selective adsorption states, finding an excellent agreement with the best available theoretical data. Second, the capability of the approach to describe dispersionless correlation effects realistically is used to extract dispersion effects in time-dependent density functional simulations on the collision of ⁴He droplets with a single graphene sheet. It is found that dispersion effects play a key role in the fast spreading of the ⁴He nanodroplet, the evaporation-like process of helium atoms, and the formation of solid-like helium structures. These characteristics are expected to be quite general and highly relevant to explain experimental measurements with the newly developed helium droplet mediated deposition technique.

  9. Communication: A combined periodic density functional and incremental wave-function-based approach for the dispersion-accounting time-resolved dynamics of ⁴He nanodroplets on surfaces: ⁴He/graphene

    SciTech Connect

    Lara-Castells, María Pilar de; Stoll, Hermann; Civalleri, Bartolomeo; Causà, Mauro; Voloshina, Elena; Mitrushchenkov, Alexander O.; Pi, Martí

    2014-10-21

    In this work we propose a general strategy to calculate accurate He–surface interaction potentials. It extends the dispersionless density functional approach recently developed by Pernal et al. [Phys. Rev. Lett. 103, 263201 (2009)] to adsorbate-surface interactions by including periodic boundary conditions. We also introduce a scheme to parametrize the dispersion interaction by calculating two- and three-body dispersion terms at coupled cluster singles and doubles and perturbative triples (CCSD(T)) level via the method of increments [H. Stoll, J. Chem. Phys. 97, 8449 (1992)]. The performance of the composite approach is tested on ⁴He/graphene by determining the energies of the low-lying selective adsorption states, finding an excellent agreement with the best available theoretical data. Second, the capability of the approach to describe dispersionless correlation effects realistically is used to extract dispersion effects in time-dependent density functional simulations on the collision of ⁴He droplets with a single graphene sheet. It is found that dispersion effects play a key role in the fast spreading of the ⁴He nanodroplet, the evaporation-like process of helium atoms, and the formation of solid-like helium structures. These characteristics are expected to be quite general and highly relevant to explain experimental measurements with the newly developed helium droplet mediated deposition technique.

  10. Faith-based evaluation: accountable to whom, for what?

    PubMed

    O'Connor, Mary Katherine; Netting, F Ellen

    2008-11-01

    Findings, issues, and lessons learned about program evaluation are examined from a national qualitative study of 15 faith-based human service programs targeting those in need in urban areas. Using a grounded theory design, five properties emerge as part of the evaluation network: (1) philosophy of accountability, (2) legitimacy, (3) evaluation design, (4) feedback loop, and (5) barriers to evaluation. While funders expect measurable outcomes to evaluate service effectiveness, respondents acknowledge other competing expectations of multiple constituents in religious and secular communities. What emerges is an excellent example of managing multiple program evaluation demands in programs that are particularly facile at process evaluation in the interest of quality service and relationship building. The article concludes with important lessons learned about the process of program evaluation.

  11. An Internet-Based Accounting Information Systems Project

    ERIC Educational Resources Information Center

    Miller, Louise

    2012-01-01

    This paper describes a student project assignment used in an accounting information systems course. We are now truly immersed in the internet age, and while many required accounting information systems courses and textbooks introduce database design, accounting software development, cloud computing, and internet security, projects involving the…

  12. How do the approaches to accountability compare for charities working in international development?

    PubMed

    Kirsch, David

    2014-09-01

    Approaches to accountability vary between charities working to reduce under-five mortality in underdeveloped countries, and healthcare workers and facilities in Canada. Comparison reveals key differences, similarities and trade-offs. For example, while health professionals are governed by legislation and healthcare facilities have a de facto obligation to be accredited, charities and other international organizations are not subject to mandatory international laws or guidelines or to de facto international standards. Charities have policy goals similar to those found in the Canadian substudies, including access, quality, cost control, cost-effectiveness and customer satisfaction. However, the relative absence of external policy tools means that these goals may not be realized. Accountability can be beneficial, but too much or the wrong kind of accountability can divert resources and diminish returns. PMID:25305397

  13. Territory management an appropriate approach for taking into account dynamic risks

    NASA Astrophysics Data System (ADS)

    Fernandez, M.; Ruegg, J.

    2012-04-01

    The territorial approach to risk analysis has become well established in scientific communications in recent years, especially in the francophone literature. It is an especially appropriate approach for exploring the large number of criteria and factors that shape, across a territory, the composition of vulnerabilities and risks. In this sense, the approach is appropriate for identifying not only risks due to natural hazards but also social and environmental risks. Our case study explores a catastrophic landslide, the collapse of 6 million cubic meters of rock at Los Chorros, in the municipality of San Cristobal Verapaz, Guatemala, in January 2009. We demonstrate that the same natural hazard has different consequences within this territory and may also increase or even create new vulnerabilities and risks for the population. The analysis shows that a single event can endanger various aspects of the territory: resources, functions (agriculture, or the use of houses, for example) and allocations, and it highlights the different types of vulnerabilities that land users (i.e., farmers, merchants, transport drivers) face. To resolve a post-disaster situation, the actors choose one vulnerability among a set of vulnerabilities (in a multi-vulnerability context), and with this choice they define their own limits of acceptable risk. To give an example, a transport driver chooses to reduce economic vulnerability when going to the local market by crossing the landslide (physical vulnerability). In the context of a developing country with weak development and limited resources, the land users who become risk managers after the disaster are compelled to prioritize between different actions for reducing risks. This study provides a novel approach to risk management by adding a political science and geography dimension, through the territorial approach, to improve our understanding of multi-hazard and multi-risk management. Based on findings from this case study, this work asserts that risk is not

  14. A behavior-analytic account of depression and a case report using acceptance-based procedures

    PubMed Central

    Dougher, Michael J.; Hackbert, Lucianne

    1994-01-01

    Although roughly 6% of the general population is affected by depression at some time during their lifetime, the disorder has been relatively neglected by behavior analysts. The preponderance of research on the etiology and treatment of depression has been conducted by cognitive behavior theorists and biological psychiatrists and psychopharmacologists interested in the biological substrates of depression. These approaches have certainly been useful, but their reliance on cognitive and biological processes and their lack of attention to environment—behavior relations render them unsatisfactory from a behavior-analytic perspective. The purpose of this paper is to provide a behavior-analytic account of depression and to derive from this account several possible treatment interventions. In addition, case material is presented to illustrate an acceptance-based approach with a depressed client. PMID:22478195

  15. Creating Meaningful Accountability through Web-Based Electronic NCATE Exhibits.

    ERIC Educational Resources Information Center

    Salzman, Stephanie; Zimmerly, Chuck

    This paper presents the proactive steps taken by Idaho State University to address accountability in teacher education. The university addressed accountability mandates and new accreditation standards through a Web site (http://www.ed.isu.edu) that includes electronic documents providing evidence of meeting National Council for the Accreditation…

  16. Performance-Based Accountability: Newark's Charter School Experience.

    ERIC Educational Resources Information Center

    Callahan, Kathe; Sadovnik, Alan; Visconti, Louisa

    This study assessed how New Jersey's state accountability system encouraged or thwarted charter school success, how effectively performance standards were defined and enacted by authorizing agents, and how individual charter schools were developing accountability processes that made them more or less successful than their charter school…

  17. Acid/base account and minesoils: A review

    SciTech Connect

    Hossner, L.R.; Brandt, J.E.

    1997-12-31

    Generation of acidity from the oxidation of iron sulfides (FeS₂) is a common feature of geological materials exposed to the atmosphere by mining activities. Acid/base accounting (ABA) has been the primary method to evaluate the acid- or alkaline-potential of geological materials and to predict if weathering of these materials will have an adverse effect on terrestrial and aquatic environments. The ABA procedure has also been used to evaluate minesoils at different stages of weathering and, in some cases, to estimate lime requirements. Conflicting assessments of the methodology have been reported in the literature. The ABA is the fastest and easiest way to evaluate the acid-forming characteristics of overburden materials; however, accurate evaluations sometimes require that ABA data be examined in conjunction with additional sample information and results from other analytical procedures. The end use of ABA data, whether it be for minesoil evaluation or water quality prediction, will dictate the method's interpretive criteria. Reaction kinetics and stoichiometry may vary and are not clearly defined for all situations. There is an increasing awareness of the potential for interfering compounds, particularly siderite (FeCO₃), to be present in geological materials associated with coal mines. Hardrock mines, with possible mixed sulfide mineralogy, offer a challenge to the ABA, since acid generation may be caused by minerals other than pyrite. A combination of methods, static and kinetic, is appropriate to properly evaluate the presence of acid-forming materials.
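The core ABA arithmetic can be sketched in a few lines. The conversion below uses the conventional assumption that all measured sulfur occurs as pyrite, giving a maximum potential acidity of 31.25 × %S in tons of CaCO₃ equivalent per 1000 tons of material; the sample values and the screening interpretation are hypothetical, and siderite interference (noted in the abstract) can bias the measured neutralization potential.

```python
# Hedged sketch of acid/base accounting (ABA) arithmetic.
# Assumes all sulfur is pyritic: MPA = 31.25 * %S
# (tons CaCO3 equivalent per 1000 tons of material).

def net_neutralization_potential(percent_sulfur, np_caco3):
    """NNP = neutralization potential (NP) - maximum potential acidity (MPA)."""
    mpa = 31.25 * percent_sulfur   # t CaCO3 eq. per 1000 t
    return np_caco3 - mpa

# Hypothetical overburden sample: 0.8 % total S, NP of 12 t CaCO3 eq. / 1000 t
nnp = net_neutralization_potential(0.8, 12.0)
print(nnp)  # -13.0 -> negative NNP suggests acid-forming material
```

A strongly negative NNP flags likely acid generation, but as the review stresses, static numbers like this should be read alongside kinetic tests and mineralogical information.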

  18. A shape-based account for holistic face processing.

    PubMed

    Zhao, Mintao; Bülthoff, Heinrich H; Bülthoff, Isabelle

    2016-04-01

    Faces are processed holistically, so selective attention to 1 face part without any influence of the others often fails. In this study, 3 experiments investigated what type of facial information (shape or surface) underlies holistic face processing and whether generalization of holistic processing to nonexperienced faces requires extensive discrimination experience. Results show that facial shape information alone is sufficient to elicit the composite face effect (CFE), 1 of the most convincing demonstrations of holistic processing, whereas facial surface information is unnecessary (Experiment 1). The CFE is eliminated when faces differ only in surface but not shape information, suggesting that variation of facial shape information is necessary to observe holistic face processing (Experiment 2). Removing 3-dimensional (3D) facial shape information also eliminates the CFE, indicating the necessity of 3D shape information for holistic face processing (Experiment 3). Moreover, participants show similar holistic processing for faces with and without extensive discrimination experience (i.e., own- and other-race faces), suggesting that generalization of holistic processing to nonexperienced faces requires facial shape information, but does not necessarily require further individuation experience. These results provide compelling evidence that facial shape information underlies holistic face processing. This shape-based account not only offers a consistent explanation for previous studies of holistic face processing, but also suggests a new ground-in addition to expertise-for the generalization of holistic processing to different types of faces and to nonface objects.

  19. A Grid storage accounting system based on DGAS and HLRmon

    NASA Astrophysics Data System (ADS)

    Cristofori, A.; Fattibene, E.; Gaido, L.; Guarise, A.; Veronesi, P.

    2012-12-01

    Accounting in a production-level Grid infrastructure is of paramount importance in order to measure the utilization of the available resources. While several CPU accounting systems are deployed within the European Grid Infrastructure (EGI), storage accounting systems, stable enough to be adopted in a production environment are not yet available. As a consequence, there is a growing interest in storage accounting and work on this is being carried out in the Open Grid Forum (OGF) where a Usage Record (UR) definition suitable for storage resources has been proposed for standardization. In this paper we present a storage accounting system which is composed of three parts: a sensor layer, a data repository with a transport layer (Distributed Grid Accounting System - DGAS) and a web portal providing graphical and tabular reports (HLRmon). The sensor layer is responsible for the creation of URs according to the schema (described in this paper) that is currently being discussed within OGF. DGAS is one of the CPU accounting systems used within EGI, in particular by the Italian Grid Infrastructure (IGI) and some other National Grid Initiatives (NGIs) and projects. DGAS architecture is evolving in order to collect Usage Records for different types of resources. This improvement allows DGAS to be used as a ‘general’ data repository and transport layer. HLRmon is the web portal acting as an interface to DGAS. It has been improved to retrieve storage accounting data from the DGAS repository and create reports in an easy way. This is very useful not only for the Grid users and administrators but also for the stakeholders.
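The sensor layer's job, as described above, is to emit one storage Usage Record per measurement interval for the repository to collect. A minimal sketch follows; the field names are illustrative, loosely inspired by the OGF storage-UR discussion the paper mentions, and are not the final standardized schema.

```python
# Sketch of a storage Usage Record (UR) emitter. Field names are
# illustrative placeholders, not the ratified OGF schema.
from datetime import datetime, timezone
import json

def make_storage_ur(storage_system, user, used_bytes, start, end):
    return {
        "RecordIdentity": f"{storage_system}:{user}:{start.isoformat()}",
        "StorageSystem": storage_system,
        "UserIdentity": user,
        "ResourceCapacityUsed": used_bytes,   # bytes used during the interval
        "StartTime": start.isoformat(),
        "EndTime": end.isoformat(),
    }

ur = make_storage_ur(
    "example-se.example.org", "user-dn-placeholder", 42_000_000_000,
    datetime(2012, 1, 1, tzinfo=timezone.utc),
    datetime(2012, 1, 2, tzinfo=timezone.utc),
)
print(json.dumps(ur, indent=2))
```

In the architecture described, records like this would be shipped to the DGAS repository over its transport layer and aggregated by HLRmon into per-user and per-site reports.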

  20. Accounting Faculty Utilization of Web-Based Resources to Enhance In-Class Instruction

    ERIC Educational Resources Information Center

    Black, Thomas G.; Turetsky, Howard F.

    2010-01-01

    Our study examines the extent to which accounting faculty use web-based resources to augment classroom instruction. Moreover, we explore the effects of the institutional factors of accounting accreditation and the existence of an accounting Ph.D. program on internet use by accounting academics toward enhancing pedagogy, while controlling for the…

  1. An enhanced nonlinear damping approach accounting for system constraints in active mass dampers

    NASA Astrophysics Data System (ADS)

    Venanzi, Ilaria; Ierimonti, Laura; Ubertini, Filippo

    2015-11-01

    Active mass dampers are a viable solution for mitigating wind-induced vibrations in high-rise buildings and improving occupants' comfort. Such devices are particularly strained when they reach the force saturation of their actuators or the maximum extension of their stroke, which may occur under severe loading conditions (e.g. wind gusts and earthquakes). Exceeding the actuators' physical limits can impair the control performance of the system or even lead to device damage, with a consequent need for repair or replacement of part of the control system. Controllers for active mass dampers should account for these technological limits. Prior work by the authors was devoted to stroke issues and led to the definition of a nonlinear damping approach that is very easy to implement in practice. It consisted of a modified skyhook algorithm complemented with a nonlinear braking force to reverse the direction of the mass before reaching the stroke limit. This paper presents an enhanced version of this approach, also accounting for force saturation of the actuator while keeping the simplicity of implementation. This is achieved by modulating the control force by a nonlinear smooth function depending on the ratio between the actuator's force and its saturation limit. Results of a numerical investigation show that the proposed approach performs comparably to the State-Dependent Riccati Equation method, a well-established technique for designing optimal controllers for constrained systems, yet one that is very difficult to apply in practice.
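The modulation idea can be sketched concretely: scale the commanded force by a smooth function of its ratio to the saturation limit, so the command approaches the limit asymptotically instead of clipping abruptly. The tanh shaping below is one common smooth choice assumed for illustration; the paper's exact modulation function is not reproduced here.

```python
# Sketch of smooth force saturation for an active mass damper command.
# tanh is an illustrative smooth shaping function, not the paper's exact one.
import math

def smooth_saturate(u_command, u_max):
    """Smoothly bounded force: |output| < u_max for any commanded input."""
    return u_max * math.tanh(u_command / u_max)

u_max = 10.0  # kN, hypothetical actuator force limit
for u in (2.0, 8.0, 20.0, 50.0):
    print(round(smooth_saturate(u, u_max), 3))
```

For small commands the output tracks the input almost exactly, while large commands are bounded strictly below the actuator limit, which is the behavior the enhanced controller relies on.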

  2. Statistical approaches to account for false-positive errors in environmental DNA samples.

    PubMed

    Lahoz-Monfort, José J; Guillera-Arroita, Gurutzeta; Tingley, Reid

    2016-05-01

    Environmental DNA (eDNA) sampling is prone to both false-positive and false-negative errors. We review statistical methods to account for such errors in the analysis of eDNA data and use simulations to compare the performance of different modelling approaches. Our simulations illustrate that even low false-positive rates can produce biased estimates of occupancy and detectability. We further show that removing or classifying single PCR detections in an ad hoc manner under the suspicion that such records represent false positives, as sometimes advocated in the eDNA literature, also results in biased estimation of occupancy, detectability and false-positive rates. We advocate alternative approaches to account for false-positive errors that rely on prior information, or the collection of ancillary detection data at a subset of sites using a sampling method that is not prone to false-positive errors. We illustrate the advantages of these approaches over ad hoc classifications of detections and provide practical advice and code for fitting these models in maximum likelihood and Bayesian frameworks. Given the severe bias induced by false-negative and false-positive errors, the methods presented here should be more routinely adopted in eDNA studies.
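The bias the authors describe is easy to reproduce in a small simulation: score a site "occupied" if any PCR replicate is positive, and even a modest false-positive rate inflates the naive occupancy estimate well above the true value. The rates below are illustrative, not taken from the paper.

```python
# Simulation of false-positive bias in naive eDNA occupancy estimation.
# Parameter values are illustrative.
import random

random.seed(1)
n_sites, n_reps = 1000, 6
psi, p_detect, p_false = 0.3, 0.5, 0.05  # true occupancy, detection, false-positive

naive_occupied = 0
for _ in range(n_sites):
    occupied = random.random() < psi
    # A site is naively scored occupied if ANY replicate is positive.
    hit = any(
        random.random() < (p_detect if occupied else p_false)
        for _ in range(n_reps)
    )
    naive_occupied += hit

print(naive_occupied / n_sites)  # substantially above the true psi = 0.3
```

With six replicates, an unoccupied site has a 1 − 0.95⁶ ≈ 26% chance of at least one false positive, which is why the naive estimate drifts far from the true occupancy and why the model-based corrections the authors advocate matter.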

  3. Integrating Mission-Based Values into Accounting Curriculum: Catholic Social Teaching and Introductory Accounting

    ERIC Educational Resources Information Center

    Hise, Joan Vane; Koeplin, John P.

    2010-01-01

    This paper presents several reasons why mission-based values, in this case Catholic Social Teaching (CST), should be incorporated into a university business curriculum. The CST tenets include the sanctity of human life; call to family, community, and participation; rights and responsibilities; option for the poor and vulnerable; the dignity of…

  4. A Goal-Based Approach to African Language Instruction.

    ERIC Educational Resources Information Center

    Folarin-Schleicher, Antonia

    1999-01-01

    Discusses a goal-based approach to African language instruction, which seeks to integrate the overall interests and specific goals of the learner with the language learning and teaching objectives. Shows why content-based and language-across-the-curriculum programs may not adequately account for students' needs and interests. (Author/VWL)

  5. Pharmacokinetic Modeling of Manganese III. Physiological Approaches Accounting for Background and Tracer Kinetics

    SciTech Connect

    Teeguarden, Justin G.; Gearhart, Jeffrey; Clewell, III, H. J.; Covington, Tammie R.; Nong, Andy; Anderson, Melvin E.

    2007-01-01

    Manganese (Mn) is an essential nutrient. Mn deficiency is associated with altered lipid (Kawano et al. 1987) and carbohydrate metabolism (Baly et al. 1984; Baly et al. 1985), abnormal skeletal cartilage development (Keen et al. 2000), decreased reproductive capacity, and brain dysfunction. Occupational and accidental inhalation exposures to aerosols containing high concentrations of Mn produce neurological symptoms with Parkinson-like characteristics in workers. At present, there is also concern about use of the manganese-containing compound, methylcyclopentadienyl manganese tricarbonyl (MMT), in unleaded gasoline as an octane enhancer. Combustion of MMT produces aerosols containing a mixture of manganese salts (Lynam et al. 1999). These Mn particulates may be inhaled at low concentrations by the general public in areas using MMT. Risk assessments for essential elements need to acknowledge that risks occur with either excesses or deficiencies, and that significant amounts of these nutrients are present in the body even in the absence of any exogenous exposures. With Mn there is an added complication, i.e., the primary risk is associated with inhalation while Mn is an essential dietary nutrient. Exposure standards for inhaled Mn will need to consider the substantial background uptake from normal ingestion. Andersen et al. (1999) suggested a generic approach for essential nutrient risk assessment. An acceptable exposure limit could be based on some 'tolerable' change in tissue concentration in normal and exposed individuals, i.e., a change somewhere from 10 to 25% of the individual variation in tissue concentration seen in a large human population. A reliable multi-route, multi-species pharmacokinetic model would be necessary for the implementation of this type of dosimetry-based risk assessment approach for Mn. Physiologically-based pharmacokinetic (PBPK) models for various xenobiotics have proven valuable in contributing to a variety of chemical-specific risk

  6. Holding Schools Accountable: Performance-Based Reform in Education.

    ERIC Educational Resources Information Center

    Ladd, Helen F., Ed.

    Many people believe that future reforms of education should focus on the primary mission of elementary and secondary schools and that these schools must be held more accountable for the academic performance of their students. This book brings together researchers from various disciplines--most notably economics, educational policy and management,…

  7. Standards-Based Accountability: Reification, Responsibility and the Ethical Subject

    ERIC Educational Resources Information Center

    Kostogriz, Alex; Doecke, Brenton

    2011-01-01

    Over the last two decades, teachers in Australia have witnessed multiple incarnations of the idea of "educational accountability" and its enactment. Research into this phenomenon of educational policy and practice has revealed various layers of the concept, particularly its professional, bureaucratic, political and cultural dimensions that are…

  8. Accounting for water management issues within hydrological simulation: Alternative modelling options and a network optimization approach

    NASA Astrophysics Data System (ADS)

    Efstratiadis, Andreas; Nalbantis, Ioannis; Rozos, Evangelos; Koutsoyiannis, Demetris

    2010-05-01

    In mixed natural and artificialized river basins, many complexities arise due to anthropogenic interventions in the hydrological cycle, including abstractions from surface water bodies, groundwater pumping or recharge, and water returns through drainage systems. Typical engineering approaches adopt a multi-stage modelling procedure, with the aim of handling the complexity of process interactions and the lack of measured abstractions. In such a context, the entire hydrosystem is separated into natural and artificial sub-systems or components; the natural ones are modelled individually, and their predictions (i.e. hydrological fluxes) are transferred to the artificial components as inputs to a water management scheme. To account for the interactions between the various components, an iterative procedure is essential, whereby the outputs of the artificial sub-systems (i.e. abstractions) become inputs to the natural ones. However, this strategy suffers from multiple shortcomings, since it presupposes that pure natural sub-systems can be located and that sufficient information is available for each sub-system modelled, including suitable, i.e. "unmodified", data for calibrating the hydrological component. In addition, implementing such a strategy is ineffective when the entire scheme runs in stochastic simulation mode. To cope with the above drawbacks, we developed a generalized modelling framework, following a network optimization approach. This originates from graph theory, which has been successfully implemented within some advanced computer packages for water resource systems analysis. The user formulates a unified system which is comprised of the hydrographical network and the typical components of a water management network (aqueducts, pumps, junctions, demand nodes etc.). Input data for the latter include hydraulic properties, constraints, targets, priorities and operation costs. The real-world system is described through a conceptual graph, whose dummy properties

  9. The Promise of Dynamic Systems Approaches for an Integrated Account of Human Development.

    ERIC Educational Resources Information Center

    Lewis, Marc D.

    2000-01-01

    Argues that dynamic systems approaches may provide an explanatory framework based on general scientific principles for developmental psychology, using principles of self-organization to explain how novel forms emerge without predetermination and become increasingly complex with development. Contends that self-organization provides a single…

  10. Australian Rural Accountants' Views on How Locally Provided CPD Compares with City-Based Provision

    ERIC Educational Resources Information Center

    Halabi, Abdel K.

    2015-01-01

    This paper analyses Australian rural accountants' attitudes and levels of satisfaction with continuing professional development (CPD), based on whether the CPD was delivered by a professional accounting body in a rural or metropolitan area. The paper responds to prior research that finds rural accountants are dissatisfied with professional…

  11. A user-friendly approach to cost accounting in laboratory animal facilities.

    PubMed

    Baker, David G

    2011-08-19

    Cost accounting is an essential management activity for laboratory animal facility management. In this report, the author describes basic principles of cost accounting and outlines steps for carrying out cost accounting in laboratory animal facilities. Methods of post hoc cost accounting analysis for maximizing the efficiency of facility operations are also described.

  12. Accounting for non-independent detection when estimating abundance of organisms with a Bayesian approach

    USGS Publications Warehouse

    Martin, Julien; Royle, J. Andrew; MacKenzie, Darryl I.; Edwards, Holly H.; Kery, Marc; Gardner, Beth

    2011-01-01

    Summary 1. Binomial mixture models use repeated count data to estimate abundance. They are becoming increasingly popular because they provide a simple and cost-effective way to account for imperfect detection. However, these models assume that individuals are detected independently of each other. This assumption may often be violated in the field. For instance, manatees (Trichechus manatus latirostris) may surface in turbid water (i.e. become available for detection during aerial surveys) in a correlated manner (i.e. in groups). Correlated behaviour, which leads to non-independent detections of individuals, may also be relevant in other systems (e.g. correlated patterns of singing in birds and amphibians). 2. We extend binomial mixture models to account for correlated behaviour and therefore for non-independent detection of individuals. We simulated correlated behaviour using beta-binomial random variables. Our approach can be used to simultaneously estimate abundance, detection probability and a correlation parameter. 3. Fitting binomial mixture models to data that followed a beta-binomial distribution resulted in an overestimation of abundance even for moderate levels of correlation. In contrast, the beta-binomial mixture model performed considerably better in our simulation scenarios. We also present a goodness-of-fit procedure to evaluate the fit of beta-binomial mixture models. 4. We illustrate our approach by fitting both binomial and beta-binomial mixture models to aerial survey data of manatees in Florida. We found that the binomial mixture model did not fit the data, whereas there was no evidence of lack of fit for the beta-binomial mixture model. This example helps illustrate the importance of using simulations and assessing goodness-of-fit when analysing ecological data with N-mixture models. Indeed, both the simulations and the goodness-of-fit procedure highlighted the limitations of the standard binomial mixture model for aerial
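    A minimal simulation of the beta-binomial construction the abstract describes (a survey-level detection probability drawn from a Beta distribution, shared by all individuals in that survey) shows the overdispersion that biases the plain binomial model. All parameter values are illustrative.

```python
import random

def simulate_counts(N, mean_p, rho, n_surveys, rng):
    """Repeated counts of N individuals. Correlated detection is induced by
    drawing a survey-level detection probability from a Beta distribution
    with mean mean_p; rho -> 0 recovers independent (binomial) detection."""
    a = mean_p * (1 - rho) / rho
    b = (1 - mean_p) * (1 - rho) / rho
    counts = []
    for _ in range(n_surveys):
        p = rng.betavariate(a, b)
        counts.append(sum(rng.random() < p for _ in range(N)))
    return counts

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

rng = random.Random(42)
corr = simulate_counts(100, 0.5, 0.3, 2000, rng)     # grouped surfacing
indep = simulate_counts(100, 0.5, 1e-6, 2000, rng)   # near-independent detection
var_corr, var_indep = var(corr), var(indep)
# The correlated counts are strongly overdispersed relative to the binomial,
# which is what pushes the standard binomial mixture model to overestimate N.
```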

  13. Materiality in a Practice-Based Approach

    ERIC Educational Resources Information Center

    Svabo, Connie

    2009-01-01

    Purpose: The paper aims to provide an overview of the vocabulary for materiality which is used by practice-based approaches to organizational knowing. Design/methodology/approach: The overview is theoretically generated and is based on the anthology Knowing in Organizations: A Practice-based Approach edited by Nicolini, Gherardi and Yanow. The…

  14. Memory-Based Approaches and Beyond

    ERIC Educational Resources Information Center

    Sanford, Anthony J.; Garrod, Simon C.

    2005-01-01

    In this article, we discuss 2 issues that we believe any theory of discourse comprehension has to take account of-accessing irrelevant information and granularity. Along the lines that have been suggested as demonstrating the memory-based account, we describe some work in favor of the recruitment of apparently irrelevant information from memory…

  15. Accounting for the tongue-and-groove effect using a robust direct aperture optimization approach

    SciTech Connect

    Salari, Ehsan; Men Chunhua; Romeijn, H. Edwin

    2011-03-15

    Purpose: Accounting for the tongue-and-groove effect due to the multileaf collimator architecture in intensity-modulated radiation therapy (IMRT) has traditionally been deferred to the leaf sequencing stage. The authors propose a new direct aperture optimization method for IMRT treatment planning that explicitly incorporates dose calculation inaccuracies due to the tongue-and-groove effect into the treatment plan optimization stage. Methods: The authors avoid having to accurately estimate the dosimetric effects of the tongue-and-groove architecture by using lower and upper bounds on the dose distribution delivered to the patient. They then develop a model that yields a treatment plan that is robust with respect to the corresponding dose calculation inaccuracies. Results: Tests on a set of ten clinical head-and-neck cancer cases demonstrate the effectiveness of the new method in developing robust treatment plans with tight dose distributions in targets and critical structures. This is contrasted with the very loose bounds on the dose distribution that are obtained by solving a traditional treatment plan optimization model that ignores tongue-and-groove effects in the treatment planning stage. Conclusions: A robust direct aperture optimization approach is proposed to account for the dosimetric inaccuracies caused by the tongue-and-groove effect. The experiments validate the ability of the proposed approach to design robust treatment plans regardless of the exact consequences of the tongue-and-groove architecture.
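    The bound-based robustness idea reduces, in spirit, to a simple feasibility test: the pessimistic dose bounds must still satisfy the clinical goals. This is a sketch of the check, not the authors' actual optimization model; the dose values and limits are hypothetical.

```python
def robust_feasible(target_lo, rx_dose, oar_hi, oar_limit):
    """A plan is robust to tongue-and-groove uncertainty if the *lower* dose
    bound still covers the target prescription and the *upper* dose bound
    still respects the critical-structure (OAR) limit."""
    return min(target_lo) >= rx_dose and max(oar_hi) <= oar_limit

# Hypothetical per-voxel dose bounds (Gy)
ok = robust_feasible(target_lo=[70.2, 70.8, 71.0], rx_dose=70.0,
                     oar_hi=[24.0, 25.5, 23.1], oar_limit=26.0)
```

    Embedding such bounds directly in the optimization, rather than checking them after leaf sequencing, is what makes the resulting plan robust by construction.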

  16. Mindfulness meditation-based pain relief: a mechanistic account.

    PubMed

    Zeidan, Fadel; Vago, David R

    2016-06-01

    Pain is a multidimensional experience that involves interacting sensory, cognitive, and affective factors, rendering the treatment of chronic pain challenging and financially burdensome. Further, the widespread use of opioids to treat chronic pain has led to an opioid epidemic characterized by exponential growth in opioid misuse and addiction. The staggering statistics related to opioid use highlight the importance of developing, testing, and validating fast-acting nonpharmacological approaches to treat pain. Mindfulness meditation is a technique that has been found to significantly reduce pain in experimental and clinical settings. The present review delineates findings from recent studies demonstrating that mindfulness meditation significantly attenuates pain through multiple, unique mechanisms-an important consideration for the millions of chronic pain patients seeking narcotic-free, self-facilitated pain therapy. PMID:27398643

  17. One Paradox in District Accountability and Site-Based Management.

    ERIC Educational Resources Information Center

    Shellman, David W.

    The paradox of site-based school management combined with standardized tests or instructional management systems that restrict teacher choices was evident in one school district in North Carolina, where measurement of student success has centered on student performance on state-mandated tests. A study was conducted to see if students whose…

  18. Performance-Based Measurement: Action for Organizations and HPT Accountability

    ERIC Educational Resources Information Center

    Larbi-Apau, Josephine A.; Moseley, James L.

    2010-01-01

    Basic measurements and applications of six selected general but critical operational performance-based indicators--effectiveness, efficiency, productivity, profitability, return on investment, and benefit-cost ratio--are presented. With each measurement, goals and potential impact are explored. Errors, risks, limitations to measurements, and a…

  19. Students' Concern about Indebtedness: A Rank Based Social Norms Account

    ERIC Educational Resources Information Center

    Aldrovandi, Silvio; Wood, Alex M.; Maltby, John; Brown, Gordon D. A.

    2015-01-01

    This paper describes a new model of students' concern about indebtedness within a rank-based social norms framework. Study 1 found that students hold highly variable beliefs about how much other students will owe at the end of their degree. Students' concern about their own anticipated debt--and their intention of taking on a part-time job during…

  20. Accountability to Public Stakeholders in Watershed-Based Restoration

    EPA Science Inventory

    There is an increasing push at the federal, state, and local levels for watershed-based conservation projects. These projects work to address water quality issues in degraded waterways through the implementation of a suite of best management practices on land throughout a watersh...

  1. School-Based Budgets: Getting, Spending, and Accounting.

    ERIC Educational Resources Information Center

    Herman, Jerry L.; Herman, Janice L.

    With the surge of interest in school-based management came the task of inventing a different type of budgeting system--one that delegated to the site level the many tasks of developing a budget, expending the allocated funds, and controlling those expenditures so that they did not exceed the allocation. This book explores the various means…

  2. Measuring health system performance: A new approach to accountability and quality improvement in New Zealand.

    PubMed

    Ashton, Toni

    2015-08-01

    In February 2014, the New Zealand Ministry of Health released a new framework for measuring the performance of the New Zealand health system. The two key aims are to strengthen accountability to taxpayers and to lift the performance of the system's component parts using a 'whole-of-system' approach to performance measurement. Development of this new framework--called the Integrated Performance and Incentive Framework (IPIF)--was stimulated by a need for a performance management framework which reflects the health system as a whole, which encourages primary and secondary providers to work towards the same end, and which incorporates the needs and priorities of local communities. Measures within the IPIF will be set at two levels: the system level, where measures are set nationally, and the local district level, where measures which contribute towards the system level indicators will be selected by local health alliances. In the first year, the framework applies only at the system level and only to primary health care services. It will continue to be developed over time and will gradually be extended to cover a wide range of health and disability services. The success of the IPIF in improving health sector performance depends crucially on the willingness of health sector personnel to engage closely with the measurement process.

  3. A blue/green water-based accounting framework for assessment of water security

    NASA Astrophysics Data System (ADS)

    Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo M.

    2014-09-01

    A comprehensive assessment of water security can incorporate several water-related concepts, while accounting for Blue and Green Water (BW and GW) types defined in accordance with the hydrological processes involved. Here we demonstrate how a quantitative analysis of the provision probability and use of BW and GW can be conducted, so as to provide indicators of water scarcity and vulnerability at the basin level. To illustrate the approach, we use the Soil and Water Assessment Tool (SWAT) to model the hydrology of an agricultural basin (291 km2) within the Cantareira Water Supply System in Brazil. To provide a more comprehensive basis for decision making, we analyze the BW- and GW-Footprint components against probabilistic levels (50th and 30th percentiles) of freshwater availability for human activities during a 23-year period. Several contrasting situations of BW provision are distinguished, using different hydrologically based methodologies for specifying monthly Environmental Flow Requirements (EFRs), and the risk of natural EFR violation is evaluated by use of a freshwater provision index. Our results reveal clear spatial and temporal patterns of water scarcity and vulnerability levels within the basin. Taking into account conservation targets for the basin, it appears that the more restrictive EFR methods are more appropriate than the method currently employed in the study basin. The blue/green water-based accounting framework developed here provides a useful integration of hydrologic, ecosystem and human needs information on a monthly basis, thereby improving our understanding of how and where water-related threats to human and aquatic ecosystem security can arise.
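    The freshwater provision index mentioned in the abstract can be sketched in a simple form: the share of months in which blue-water flow meets the Environmental Flow Requirement. This is an assumed simplification with hypothetical monthly values, not the study's exact formulation.

```python
def freshwater_provision_index(blue_water, efr):
    """Share of months in which blue-water flow meets the Environmental
    Flow Requirement; 1.0 means the EFR was never violated."""
    ok = sum(q >= e for q, e in zip(blue_water, efr))
    return ok / len(blue_water)

flows = [12.0, 8.0, 5.0, 3.0, 9.0, 15.0]   # hypothetical monthly BW (m3/s)
efrs = [4.0] * 6                           # hypothetical constant monthly EFR
fpi = freshwater_provision_index(flows, efrs)   # 5 of 6 months meet the EFR
```

    More restrictive EFR methods raise the threshold series, lowering the index and flagging more months as insecure.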

  4. A Comparative Study of the Effect of Web-Based versus In-Class Textbook Ethics Instruction on Accounting Students' Propensity to Whistle-Blow

    ERIC Educational Resources Information Center

    McManus, Lisa; Subramaniam, Nava; James, Wendy

    2012-01-01

    The authors examined whether accounting students' propensity to whistle-blow differed between those instructed through a web-based teaching module and those exposed to a traditional in-class textbook-focused approach. A total of 156 students from a second-year financial accounting course participated in the study. Ninety students utilized the…

  5. Accounting for Heaping in Retrospectively Reported Event Data – A Mixture-Model Approach

    PubMed Central

    Bar, Haim Y.; Lillard, Dean R.

    2012-01-01

    When event data are retrospectively reported, more temporally distal events tend to get “heaped” on even multiples of reporting units. Heaping may introduce a type of attenuation bias because it causes researchers to mismatch time-varying right-hand side variables. We develop a model-based approach to estimate the extent of heaping in the data, and how it affects regression parameter estimates. We use smoking cessation data as a motivating example, but our method is general. It facilitates the use of retrospective data from the multitude of cross-sectional and longitudinal studies worldwide that collect and potentially could collect event data. PMID:22733577
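    Heaping is easy to see in a small simulation: if some respondents round to the nearest multiple of 5, the share of reports landing on multiples of 5 exceeds what exact reporting would give, and that excess identifies the heaping probability. The mixture below is a toy illustration of the phenomenon, not the authors' estimator, and all numbers are made up.

```python
import random

def report(true_value, heap_prob, rng):
    """Retrospective report: with probability heap_prob the respondent
    'heaps' on the nearest multiple of 5; otherwise reports exactly."""
    if rng.random() < heap_prob:
        return 5 * round(true_value / 5)
    return true_value

rng = random.Random(0)
truth = [rng.randint(18, 62) for _ in range(10_000)]  # 45 values; exactly 1 in 5 is a multiple of 5
reports = [report(t, 0.4, rng) for t in truth]

share_on_5 = sum(r % 5 == 0 for r in reports) / len(reports)
# Without heaping, 20% of reports would land on a multiple of 5; the excess
# identifies the heaping rate: share = lam + (1 - lam) * 0.2.
lam_hat = (share_on_5 - 0.2) / 0.8
```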

  6. Accounting for heaping in retrospectively reported event data - a mixture-model approach.

    PubMed

    Bar, Haim Y; Lillard, Dean R

    2012-11-30

    When event data are retrospectively reported, more temporally distal events tend to get 'heaped' on even multiples of reporting units. Heaping may introduce a type of attenuation bias because it causes researchers to mismatch time-varying right-hand side variables. We develop a model-based approach to estimate the extent of heaping in the data and how it affects regression parameter estimates. We use smoking cessation data as a motivating example, but our method is general. It facilitates the use of retrospective data from the multitude of cross-sectional and longitudinal studies worldwide that collect and potentially could collect event data.

  7. Working toward More Engaged and Successful Accounting Students: A Balanced Scorecard Approach

    ERIC Educational Resources Information Center

    Fredin, Amy; Fuchsteiner, Peter; Portz, Kris

    2015-01-01

    Prior research indicates that student engagement is the key to student success, as measured by college grades, degree completion, and graduate school enrollment. We propose a set of goals and objectives for accounting students, in particular, to help them become engaged not only in the educational process, but also in the accounting profession.…

  8. A Practical Approach to Accountability in an Oklahoma School. Project SEEK.

    ERIC Educational Resources Information Center

    Southwest Oklahoma Region 14 Service Center, Elk City.

    This booklet presents the accountability program developed by the Elk City (Oklahoma) Public Schools. During the first year of the program ten broad educational goals were formulated through a series of administrator workshops, accountability committee meetings, informal surveys of the community, and questionnaires for teachers and students.…

  9. A Total Quality Management Approach to Assurance of Learning in the Accounting Classroom: An Empirical Study

    ERIC Educational Resources Information Center

    Harvey, Mary Ellen; Eisner, Susan

    2011-01-01

    The research presented in this paper seeks to discern which combination of pedagogical tools most positively impact student learning of the introductory Accounting curriculum in the Principles of Accounting courses in a 4-year U.S. public college. This research topic is relevant because it helps address a quandary many instructors experience: how…

  10. Measure for Measure: How Proficiency-Based Accountability Systems Affect Inequality in Academic Achievement

    ERIC Educational Resources Information Center

    Jennings, Jennifer; Sohn, Heeju

    2014-01-01

    How do proficiency-based accountability systems affect inequality in academic achievement? This article reconciles mixed findings in the literature by demonstrating that three factors jointly determine accountability's impact. First, by analyzing student-level data from a large urban school district, we find that when educators face…

  11. A network identity authentication protocol of bank account system based on fingerprint identification and mixed encryption

    NASA Astrophysics Data System (ADS)

    Zhu, Lijuan; Liu, Jingao

    2013-07-01

    This paper describes a network identity authentication protocol for a bank account system based on fingerprint identification and mixed encryption. The protocol provides every bank user with a safe and effective way to manage his or her own bank account, and can effectively prevent hacker attacks and bank clerk crime, thereby guaranteeing the legitimate rights and interests of bank users.

  12. ASPEN--A Web-Based Application for Managing Student Server Accounts

    ERIC Educational Resources Information Center

    Sandvig, J. Christopher

    2004-01-01

    The growth of the Internet has greatly increased the demand for server-side programming courses at colleges and universities. Students enrolled in such courses must be provided with server-based accounts that support the technologies that they are learning. The process of creating, managing and removing large numbers of student server accounts is…

  13. 24 CFR 990.280 - Project-based budgeting and accounting.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... in this subpart (including project-based management, budgeting, and accounting). Asset management... accounting. 990.280 Section 990.280 Housing and Urban Development REGULATIONS RELATING TO HOUSING AND URBAN... URBAN DEVELOPMENT THE PUBLIC HOUSING OPERATING FUND PROGRAM Asset Management § 990.280...

  14. 24 CFR 990.280 - Project-based budgeting and accounting.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... in this subpart (including project-based management, budgeting, and accounting). Asset management... accounting. 990.280 Section 990.280 Housing and Urban Development REGULATIONS RELATING TO HOUSING AND URBAN... URBAN DEVELOPMENT THE PUBLIC HOUSING OPERATING FUND PROGRAM Asset Management § 990.280...

  15. 24 CFR 990.280 - Project-based budgeting and accounting.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... in this subpart (including project-based management, budgeting, and accounting). Asset management... accounting. 990.280 Section 990.280 Housing and Urban Development REGULATIONS RELATING TO HOUSING AND URBAN... URBAN DEVELOPMENT THE PUBLIC HOUSING OPERATING FUND PROGRAM Asset Management § 990.280...

  16. 24 CFR 990.280 - Project-based budgeting and accounting.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... in this subpart (including project-based management, budgeting, and accounting). Asset management... accounting. 990.280 Section 990.280 Housing and Urban Development REGULATIONS RELATING TO HOUSING AND URBAN... URBAN DEVELOPMENT THE PUBLIC HOUSING OPERATING FUND PROGRAM Asset Management § 990.280...

  17. The Tension between Teacher Accountability and Flexibility: The Paradox of Standards-Based Reform

    ERIC Educational Resources Information Center

    Nadelson, Louis S.; Fuller, Michael; Briggs, Pamela; Hammons, David; Bubak, Katie; Sass, Margaret

    2012-01-01

    The anticipated constraints imposed by the accountability process associated with standards-based reform on teachers' practice suggest a tension between teachers' desire for flexibility and the accountability mandates associated with reform initiatives. In particular, we posited that the teachers would negatively perceive the influence of…

  18. The Effect of Web-Based Collaborative Learning Methods to the Accounting Courses in Technical Education

    ERIC Educational Resources Information Center

    Cheng, K. W. Kevin

    2009-01-01

    This study mainly explored the effect of applying web-based collaborative learning instruction to the accounting curriculum on student's problem-solving attitudes in Technical Education. The research findings and proposed suggestions would serve as a reference for the development of accounting-related curricula and teaching strategies. To achieve…

  19. Does Participation in a Computer-Based Learning Program in Introductory Financial Accounting Course Lead to Choosing Accounting as a Major?

    ERIC Educational Resources Information Center

    Owhoso, Vincent; Malgwi, Charles A.; Akpomi, Margaret

    2014-01-01

    The authors examine whether students who completed a computer-based intervention program, designed to help them develop abilities and skills in introductory accounting, later declared accounting as a major. A sample of 1,341 students participated in the study, of which 74 completed the intervention program (computer-based assisted learning [CBAL])…

  20. A Review of Financial Accounting Fraud Detection based on Data Mining Techniques

    NASA Astrophysics Data System (ADS)

    Sharma, Anuj; Kumar Panigrahi, Prabin

    2012-02-01

    With the upsurge in financial accounting fraud in the current economic scenario, financial accounting fraud detection (FAFD) has become an emerging topic of great importance for academia, research and industry. The failure of organizations' internal auditing systems to identify accounting fraud has led to the use of specialized procedures to detect it, collectively known as forensic accounting. Data mining techniques are providing great aid in financial accounting fraud detection, since dealing with the large volumes and complexity of financial data is a big challenge for forensic accounting. This paper presents a comprehensive review of the literature on the application of data mining techniques to the detection of financial accounting fraud and proposes a framework for data mining-based accounting fraud detection. The systematic and comprehensive literature review of the data mining techniques applicable to financial accounting fraud detection may provide a foundation for future research in this field. The findings of this review show that data mining techniques such as logistic models, neural networks, Bayesian belief networks, and decision trees have been applied most extensively to provide primary solutions to the problems inherent in the detection and classification of fraudulent data.
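    Logistic models, the first family the review names, reduce at prediction time to a weighted score pushed through a sigmoid. The sketch below shows only that scoring step; the feature names and weights are invented for illustration, whereas a real model would be fitted to labelled financial-statement data.

```python
from math import exp

def fraud_score(features, weights, bias):
    """Logistic score in [0, 1]: sigmoid of a weighted sum of red-flag
    features. Higher scores indicate a higher estimated fraud risk."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + exp(-z))

# Hypothetical red-flag ratios, e.g. receivables growth, margin change, accruals
features = [0.9, 0.2, 0.7]
weights = [2.0, -1.0, 1.5]
score = fraud_score(features, weights, bias=-1.0)
flagged = score > 0.5   # route to forensic review
```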

  1. Combating QR-Code-Based Compromised Accounts in Mobile Social Networks

    PubMed Central

    Guo, Dong; Cao, Jian; Wang, Xiaoqi; Fu, Qiang; Li, Qiang

    2016-01-01

    Cyber Physical Social Sensing makes mobile social networks (MSNs) popular with users. However, attacks are rampant in which malicious URLs are spread covertly through quick response (QR) codes to control compromised accounts in MSNs and propagate malicious messages. Currently, there are generally two types of methods to identify compromised accounts in MSNs: one is to analyze the potential threats on wireless access points and on handheld devices' operating systems so as to stop compromised accounts from spreading malicious messages; the other is to apply methods for detecting compromised accounts in online social networks to MSNs. These methods focus neither on the problems of MSNs themselves nor on the interaction of sensors' messages, which restricts them to particular platforms and to overly simple techniques. In order to stop the spread of compromised accounts in MSNs effectively, the attacks first have to be traced to their sources. Through sensors, users exchange information in MSNs and acquire information by scanning QR codes. Therefore, analyzing the traces of sensor-related information helps to identify compromised accounts in MSNs. This paper analyzes the different information-sending modes of compromised and normal accounts, analyzes the regularity of GPS (Global Positioning System)-based location information, and introduces the concepts of entropy and conditional entropy so as to construct an entropy-based model built on machine learning strategies. To achieve this goal, about 500,000 Sina Weibo accounts and about 100 million corresponding messages were collected. Through validation, the accuracy of the model is shown to be as high as 87.6%, with a false positive rate of only 3.7%. Meanwhile, comparative experiments on the feature sets prove that sensor-based location information can be applied to detect compromised accounts in MSNs. PMID:27657071
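    The entropy feature at the heart of such a model is straightforward: a normal user's behavior (sending modes, locations) is varied and thus high-entropy, while an automated, compromised account is repetitive and low-entropy. The sequences below are illustrative, not real Weibo data.

```python
from collections import Counter
from math import log2

def entropy(symbols):
    """Shannon entropy (bits) of the empirical distribution of a symbol
    sequence, e.g. the sending mode or coarse GPS cell of each message."""
    n = len(symbols)
    return -sum(c / n * log2(c / n) for c in Counter(symbols).values())

# A normal user mixes sending modes; a hijacked account posts through a
# single automated channel, yielding zero entropy.
normal = ["web", "app", "app", "share", "web", "app", "qr", "web"]
compromised = ["qr"] * 8
h_normal, h_comp = entropy(normal), entropy(compromised)
```

    A classifier then uses such entropy (and conditional entropy) values as features rather than thresholding them directly.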

  2. A Blue/Green Water-based Accounting Framework for Assessment of Water Security

    NASA Astrophysics Data System (ADS)

    Rodrigues, D. B.; Gupta, H. V.; Mendiondo, E. M.

    2013-12-01

    A comprehensive assessment of water security can incorporate several water-related concepts, including provisioning and support for freshwater ecosystem services, water footprint, water scarcity, and water vulnerability, while accounting for Blue and Green Water (BW and GW) flows defined in accordance with the hydrological processes involved. Here, we demonstrate how a quantitative analysis of provisioning and demand (in terms of water footprint) for BW and GW ecosystem services can be conducted, so as to provide indicators of water scarcity and vulnerability at the basin level. To illustrate the approach, we use the Soil and Water Assessment Tool (SWAT) to model the hydrology of an agricultural basin (291 sq.km) within the Cantareira water supply system in Brazil. To provide a more comprehensive basis for decision-making, we compute the BW provision using three different hydrologically based methods for specifying monthly Environmental Flow Requirements (EFRs) over a 23-year period. The current BW-Footprint was defined using surface water rights for the reference year 2012. We then analyzed the BW- and GW-Footprints against long-term series of monthly values of freshwater availability. Our results reveal clear spatial and temporal patterns of water scarcity and vulnerability levels within the basin, and help to distinguish between human and natural (drought) causes of insecurity. The blue/green water-based accounting framework developed here can be benchmarked at a range of spatial scales, thereby improving our understanding of how and where water-related threats to human and aquatic ecosystem security can arise. Future investigation will be necessary to better understand the intra-annual variability of blue water demand and to evaluate the impacts of uncertainties associated with a) the water rights database, and b) the effects of climate change projections on blue and green freshwater provision.

  3. Analyzing the Operation of Performance-Based Accountability Systems for Public Services. Technical Report

    ERIC Educational Resources Information Center

    Camm, Frank; Stecher, Brian M.

    2010-01-01

    Empirical evidence of the effects of performance-based public management is scarce. This report describes a framework used to organize available empirical information on one form of performance-based management, a performance-based accountability system (PBAS). Such a system identifies individuals or organizations that must change their behavior…

  4. Contrasting motivational orientation and evaluative coding accounts: on the need to differentiate the effectors of approach/avoidance responses

    PubMed Central

    Kozlik, Julia; Neumann, Roland; Lozo, Ljubica

    2015-01-01

    Several emotion theorists suggest that valenced stimuli automatically trigger motivational orientations and thereby facilitate corresponding behavior. Positive stimuli were thought to activate approach motivational circuits which in turn primed approach-related behavioral tendencies whereas negative stimuli were supposed to activate avoidance motivational circuits so that avoidance-related behavioral tendencies were primed (motivational orientation account). However, recent research suggests that typically observed affective stimulus–response compatibility phenomena might be entirely explained in terms of theories accounting for mechanisms of general action control instead of assuming motivational orientations to mediate the effects (evaluative coding account). In what follows, we explore to what extent this notion is applicable. We present literature suggesting that evaluative coding mechanisms indeed influence a wide variety of affective stimulus–response compatibility phenomena. However, the evaluative coding account does not seem to be sufficient to explain affective S–R compatibility effects. Instead, several studies provide clear evidence in favor of the motivational orientation account that seems to operate independently of evaluative coding mechanisms. Implications for theoretical developments and future research designs are discussed. PMID:25983718

  7. Safe Maneuvering Envelope Estimation Based on a Physical Approach

    NASA Technical Reports Server (NTRS)

    Lombaerts, Thomas J. J.; Schuet, Stefan R.; Wheeler, Kevin R.; Acosta, Diana; Kaneshige, John T.

    2013-01-01

    This paper discusses a computationally efficient algorithm for estimating the safe maneuvering envelope of damaged aircraft. The algorithm performs a robust reachability analysis through an optimal control formulation while making use of time scale separation and taking into account uncertainties in the aerodynamic derivatives. This approach differs from others in that it is physically inspired. This more transparent approach allows the data to be interpreted at each step, and it is expected that these physical models, based upon flight dynamics theory, will facilitate certification for future real-life applications.

  8. Financial Management Reforms in the Health Sector: A Comparative Study Between Cash-based and Accrual-based Accounting Systems

    PubMed Central

    Abolhallaje, Masoud; Jafari, Mehdi; Seyedin, Hesam; Salehi, Masoud

    2014-01-01

    Background: Financial management and accounting reform in the public sector was started in 2000. Moving from cash-based to accrual-based accounting is considered the key component of these reforms and adjustments in the public sector. Performing this reform in the health system is part of a bigger reform under the new public management. Objectives: The current study aimed to analyze the movement from cash-based to accrual-based accounting in the health sector in Iran. Patients and Methods: This comparative study was conducted in 2013 to compare financial management and the movement from cash-based to accrual-based accounting in the health sector in countries such as the United States, Britain, Canada, Australia, New Zealand, and Iran. Library resources and reputable databases such as Medline, Elsevier, Index Copernicus, DOAJ, EBSCO-CINAHL, SID, and Iranmedex were searched. Fiche cards were used to collect the data. Data were compared and analyzed using comparative tables. Results: Developed countries have implemented accrual-based accounting and utilized the valid, reliable and practical information in accrual-based reporting in different areas such as price and tariff setting, operational budgeting, public accounting, performance evaluation and comparison, and evidence-based decision making. In Iran, however, only a few public organizations such as the municipalities and the universities of medical sciences use accrual-based accounting; despite what is required by law, the other public organizations do not. Conclusions: There are advantages in applying accrual-based accounting in the public sector, which certainly depend on how this system is implemented in the sector. PMID:25763194

  9. Toxin-Based Therapeutic Approaches

    PubMed Central

    Shapira, Assaf; Benhar, Itai

    2010-01-01

    Protein toxins confer a defense against predation/grazing or a superior pathogenic competence upon the producing organism. Such toxins have been perfected through evolution in poisonous animals/plants and pathogenic bacteria. Over the past five decades, a lot of effort has been invested in studying their mechanism of action, the way they contribute to pathogenicity, and the development of antidotes that neutralize their action. In parallel, many research groups turned to explore the pharmaceutical potential of such toxins when used to efficiently impair essential cellular processes and/or damage the integrity of their target cells. The following review summarizes major advances in the field of toxin-based therapeutics and offers a comprehensive description of the mode of action of each applied toxin. PMID:22069564

  10. The Demand for Higher Education: A Static Structural Approach Accounting for Individual Heterogeneity and Nesting Patterns

    ERIC Educational Resources Information Center

    Flannery, Darragh; O'Donoghue, Cathal

    2013-01-01

    In this paper we estimate a structural model of higher education participation and labour choices in a static setting that accounts for individual heterogeneity and possible nesting structures in the decision process. We assume that young people who complete upper secondary education are faced with three choices: go to higher education, not go to…

  11. Peer-Mentoring Undergraduate Accounting Students: The Influence on Approaches to Learning and Academic Performance

    ERIC Educational Resources Information Center

    Fox, Alison; Stevenson, Lorna; Connelly, Patricia; Duff, Angus; Dunlop, Angela

    2010-01-01

    This article considers the impact of a student peer-mentoring programme (the Mentor Accountant Project, MAP) on first-year undergraduates' academic performance. The development of MAP was informed by reference to extant literature; it relies on the voluntary services of third-year students who then act as mentors to first-year student mentees in…

  12. Bursar Accounts, Payroll Deduction, and Debt Collection: A Three-Channel Approach to Lost Item Reimbursement

    ERIC Educational Resources Information Center

    Snowman, Ann MacKay

    2005-01-01

    In 2003, Penn State Libraries implemented payroll deduction and collection agency programs to gain better control of accounts receivable. The author reports on the implementation processes and first year outcomes of the programs. She recommends careful consideration of several questions before implementing such measures.

  13. Beyond Traditional Literacy Instruction: Toward an Account-Based Literacy Training Curriculum in Libraries

    ERIC Educational Resources Information Center

    Cirella, David

    2012-01-01

    A diverse group, account-based services include a wide variety of sites commonly used by patrons, including online shopping sites, social networks, photo- and video-sharing sites, banking and financial sites, government services, and cloud-based storage. Whether or not a piece of information is obtainable online must be considered when creating…

  14. Survival analysis approach to account for non-exponential decay rate effects in lifetime experiments

    NASA Astrophysics Data System (ADS)

    Coakley, K. J.; Dewey, M. S.; Huber, M. G.; Huffer, C. R.; Huffman, P. R.; Marley, D. E.; Mumm, H. P.; O'Shaughnessy, C. M.; Schelhammer, K. W.; Thompson, A. K.; Yue, A. T.

    2016-03-01

    In experiments that measure the lifetime of trapped particles, in addition to loss mechanisms with exponential survival probability functions, particles can be lost by mechanisms with non-exponential survival probability functions. Failure to account for such loss mechanisms produces systematic measurement error and associated systematic uncertainties in these measurements. In this work, we develop a general competing risks survival analysis method to account for the joint effect of loss mechanisms with either exponential or non-exponential survival probability functions, and a method to quantify the size of systematic effects and associated uncertainties for lifetime estimates. As a case study, we apply our survival analysis formalism and method to the Ultra Cold Neutron lifetime experiment at NIST. In this experiment, neutrons can escape a magnetic trap before they decay due to a wall loss mechanism with an associated non-exponential survival probability function.
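
    For independent loss mechanisms like those described above, the overall survival probability is the product of the per-mechanism survival functions. The sketch below illustrates this competing-risks structure with one exponential and one non-exponential channel; the parameter values and the wall-loss form are hypothetical, not those of the NIST experiment:

```python
import numpy as np

# Competing risks: with independent loss mechanisms, the probability of
# surviving to time t is the product of the per-mechanism survival functions.
def s_decay(t, tau=880.0):
    """Exponential survival (e.g. beta decay), mean lifetime tau in seconds."""
    return np.exp(-t / tau)

def s_wall(t, a=1e-4, p=0.5):
    """Hypothetical non-exponential survival for a wall-loss mechanism."""
    return np.exp(-a * t ** (1 + p))

def s_total(t):
    """Joint survival function under both loss mechanisms."""
    return s_decay(t) * s_wall(t)

t = np.linspace(0.0, 3000.0, 4)
print(s_total(t))
```

    Fitting a pure exponential to counts governed by s_total would bias the lifetime estimate, which is the kind of systematic effect the competing-risks formalism is designed to quantify.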

  15. Spatial pattern of nitrogen deposition flux over Czech forests: a novel approach accounting for unmeasured nitrogen species

    NASA Astrophysics Data System (ADS)

    Hůnová, Iva; Stoklasová, Petra; Kurfürst, Pavel; Vlček, Ondřej; Schovánková, Jana; Stráník, Vojtěch

    2015-04-01

    atmospheric nitrogen deposition flux over the Czech forests, collating all available data and model results. The aim of the presented study is to provide an improved, more reliable and more realistic estimate of the spatial pattern of nitrogen deposition flux over one country. This has so far been standardly based on measurements of ambient N/NOx concentrations as a dry deposition proxy, and N/NH4+ and N/NO3- as wet deposition proxies. To estimate unmeasured species contributing to dry deposition, we used an Eulerian photochemical dispersion model, CAMx, the Comprehensive Air Quality Model with extensions (ESSS, 2011), coupled with a high-resolution regional numeric weather prediction model, Aladin (Vlček, Corbet, 2011). The contribution of fog was estimated using a geostatistical data-driven model. Final maps accounting for unmeasured species clearly indicate that the approach used so far results in substantial underestimation of nitrogen deposition flux. Substituting unmeasured nitrogen species with modeled values seems to be a plausible way to approximate total nitrogen deposition and to obtain a more realistic spatial pattern as input for further studies of likely nitrogen impacts on ecosystems. Acknowledgements: We would like to acknowledge the grants GA14-12262S - Effects of changing growth conditions on tree increment, stand production and vitality - danger or opportunity for the Central-European forestry?, and NAZV QI112A168 (ForSoil) of the Czech Ministry for Agriculture for support of this contribution. The input data used for the analysis were provided by the Czech Hydrometeorological Institute. References: Bobbink, R., Hicks, K., Galloway, J., Spranger, T., Alkemade, R. et al. (2010): Global Assessment of Nitrogen Deposition Effects on Terrestrial Plant Diversity: a Synthesis. Ecological Applications 20 (1), 30-59. Fowler D., O'Donoghue M., Muller J.B.A, et al. (2005): A chronology of nitrogen deposition in the UK between 1900 and 2000. Water, Air, & Soil Pollution: Focus

  16. [Methodological Approaches to the Organization of Counter Measures Taking into Account Landscape Features of Radioactively Contaminated Territories].

    PubMed

    Kuznetsov, V K; Sanzharova, N I

    2016-01-01

    Methodological approaches to the organization of countermeasures are considered, taking into account the landscape features of the radioactively contaminated territories. The current status of and new requirements for the organization of countermeasures in the contaminated agricultural areas are analyzed. The basic principles, objectives and problems of the formation of countermeasures with regard to the landscape characteristics of the territory are presented; the organization and optimization of countermeasures in radioactively contaminated agricultural landscapes are also substantiated.

  18. The Usage of an Online Discussion Forum for the Facilitation of Case-Based Learning in an Intermediate Accounting Course: A New Zealand Case

    ERIC Educational Resources Information Center

    Weil, Sidney; McGuigan, Nicholas; Kern, Thomas

    2011-01-01

    This paper describes the implementation of an online discussion forum as a means of facilitating case-based learning in an intermediate financial accounting course. The paper commences with a review of case-based learning literature and the use of online discussions as a delivery platform, linking these pedagogical approaches to the emerging needs…

  19. An Inquiry-Based Approach of Traditional "Step-by-Step" Experiments

    ERIC Educational Resources Information Center

    Szalay, L.; Tóth, Z.

    2016-01-01

    This is the start of a road map for the effective introduction of inquiry-based learning in chemistry. Advantages of inquiry-based approaches to the development of scientific literacy are widely discussed in the literature. However, unless chemistry educators take account of teachers' reservations and identified disadvantages such approaches will…

  20. An Energy Approach to a Micromechanics Model Accounting for Nonlinear Interface Debonding.

    SciTech Connect

    Tan, H.; Huang, Y.; Geubelle, P. H.; Liu, C.; Breitenfeld, M. S.

    2005-01-01

    We developed a micromechanics model to study the effect of nonlinear interface debonding on the constitutive behavior of composite materials. While implementing this micromechanics model into a large simulation code on solid rockets, we are challenged by problems such as tension/shear coupling and the nonuniform distribution of the displacement jump at the particle/matrix interfaces. We therefore propose an energy approach to solve these problems. This energy approach calculates the potential energy of the representative volume element, including the contribution from the interface debonding. By minimizing the potential energy with respect to the variation of the interface displacement jump, the traction-balanced interface debonding can be found and the macroscopic constitutive relations established. This energy approach has the ability to treat different load conditions in a unified way, and the interface cohesive law can take any arbitrary form. In this paper, the energy approach is verified to give the same constitutive behaviors as reported before.
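
    The minimization idea in the abstract can be reduced to a one-dimensional sketch: a linear "bulk" spring in series with a cohesive interface, where minimizing the total potential energy over the displacement jump recovers traction balance at the interface. All parameter values and the exponential cohesive law below are hypothetical illustrations, not the paper's model:

```python
import numpy as np

k, u = 10.0, 0.5         # bulk stiffness and applied displacement (hypothetical)
t_max, d_c = 2.0, 0.3    # cohesive strength and characteristic opening (hypothetical)

def cohesive_energy(d):
    """Energy of an exponential cohesive law; traction peaks at t_max when d == d_c."""
    x = d / d_c
    return t_max * d_c * np.e * (1.0 - (1.0 + x) * np.exp(-x))

def potential(d):
    """Elastic energy of the bulk spring plus energy stored in the interface."""
    return 0.5 * k * (u - d) ** 2 + cohesive_energy(d)

# Minimize the potential energy over the displacement jump d (grid search).
d = np.linspace(0.0, u, 20001)
d_star = d[np.argmin(potential(d))]

spring_traction = k * (u - d_star)
cohesive_traction = t_max * np.e * (d_star / d_c) * np.exp(-d_star / d_c)
print(d_star, spring_traction, cohesive_traction)
```

    At the minimizer the spring traction equals the interface traction, which is the traction-balanced debonding state the abstract refers to.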

  1. [Transnasal endoscopic approaches to the cranial base].

    PubMed

    Lysoń, Tomasz; Sieśkiewicz, Andrzej; Rutkowski, Robert; Kochanowicz, Jan; Turek, Grzegorz; Rogowski, Marek; Mariak, Zenon

    2013-01-01

    Recent advances in surgical endoscopy have made it possible to reach nearly the whole cranial base through a transnasal approach. These 'expanded approaches' lead to the frontal sinuses, the cribriform plate and planum sphenoidale, the suprasellar space, the clivus, odontoid and atlas. By pointing the endoscope laterally, the surgeon can explore structures in the coronal plane such as the cavernous sinuses, the pyramid and Meckel cave, the sphenopalatine and subtemporal fossae, and even the middle fossa and the orbit. The authors of this contribution use most of these approaches in their endoscopic skull base surgery. The purpose of this contribution is to review the hitherto established endoscopic approaches to the skull base and to illustrate them with photographs obtained during self-performed procedures and/or cadaver studies. PMID:23487296

  2. A Principles-Based Approach to Teaching International Financial Reporting Standards (IFRS)

    ERIC Educational Resources Information Center

    Persons, Obeua

    2014-01-01

    This article discusses the principles-based approach that emphasizes a "why" question by using the International Accounting Standards Board (IASB) "Conceptual Framework for Financial Reporting" to question and understand the basis for specific differences between IFRS and U.S. generally accepted accounting principles (U.S.…

  3. Cosmological constraints from Sunyaev-Zeldovich cluster counts: An approach to account for missing redshifts

    SciTech Connect

    Bonaldi, A.; Battye, R. A.; Brown, M. L.

    2014-05-10

    The accumulation of redshifts provides a significant observational bottleneck when using galaxy cluster surveys to constrain cosmological parameters. We propose a simple method to allow the use of samples where there is a fraction of the redshifts that are not known. The simplest assumption is that the missing redshifts are randomly extracted from the catalog, but the method also allows one to take into account known selection effects in the accumulation of redshifts. We quantify the reduction in statistical precision of cosmological parameter constraints as a function of the fraction of missing redshifts for simulated surveys, and also investigate the impact of making an incorrect assumption for the distribution of missing redshifts.
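
    The randomly-missing assumption can be illustrated with a toy Monte Carlo showing how the scatter of a catalog statistic grows as the fraction of missing redshifts increases. This is purely illustrative (a sample mean over a synthetic catalog, not the paper's cosmological likelihood):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy catalog: N clusters with known "true" redshifts.
N = 10000
z = rng.uniform(0.1, 1.0, size=N)

def scatter_of_mean(frac_missing, trials=300):
    """Std. dev. of the estimated mean redshift when a random fraction
    of the redshifts is missing from the catalog."""
    n_obs = int(N * (1 - frac_missing))
    means = [rng.choice(z, size=n_obs, replace=False).mean() for _ in range(trials)]
    return float(np.std(means))

for f in (0.0, 0.3, 0.6):
    print(f, scatter_of_mean(f))
```

    The scatter grows with the missing fraction, mirroring the loss of statistical precision the authors quantify; a non-random selection function would additionally bias the estimate, which is why the method also allows known selection effects to be modeled.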

  4. A cloud model-based approach for water quality assessment.

    PubMed

    Wang, Dong; Liu, Dengfeng; Ding, Hao; Singh, Vijay P; Wang, Yuankun; Zeng, Xiankui; Wu, Jichun; Wang, Lachun

    2016-07-01

    Water quality assessment entails essentially a multi-criteria decision-making process accounting for qualitative and quantitative uncertainties and their transformation. Considering uncertainties of randomness and fuzziness in water quality evaluation, a cloud model-based assessment approach is proposed. The cognitive cloud model, derived from information science, can realize the transformation between qualitative concept and quantitative data, based on probability and statistics and fuzzy set theory. When applying the cloud model to practical assessment, three technical issues are considered before the development of a complete cloud model-based approach: (1) bilateral boundary formula with nonlinear boundary regression for parameter estimation, (2) hybrid entropy-analytic hierarchy process technique for calculation of weights, and (3) mean of repeated simulations for determining the degree of final certainty. The cloud model-based approach is tested by evaluating the eutrophication status of 12 typical lakes and reservoirs in China and comparing with other four methods, which are Scoring Index method, Variable Fuzzy Sets method, Hybrid Fuzzy and Optimal model, and Neural Networks method. The proposed approach yields information concerning membership for each water quality status which leads to the final status. The approach is found to be representative of other alternative methods and accurate. PMID:26995351
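
    The qualitative-quantitative transformation mentioned above is commonly implemented with the forward normal cloud generator: a concept is described by an expectation Ex, entropy En and hyper-entropy He, and "cloud drops" with certainty degrees are sampled from doubly-nested normal distributions. A minimal sketch with hypothetical parameter values:

```python
import numpy as np

rng = np.random.default_rng(0)

def cloud_drops(ex, en, he, n=1000):
    """Forward normal cloud generator: sample n drops and their
    certainty degrees for a concept described by (Ex, En, He)."""
    en_prime = np.abs(rng.normal(en, he, size=n))       # entropy perturbed by hyper-entropy
    x = rng.normal(ex, en_prime)                        # cloud drops
    mu = np.exp(-(x - ex) ** 2 / (2 * en_prime ** 2))   # certainty degree of each drop
    return x, mu

x, mu = cloud_drops(ex=5.0, en=1.0, he=0.1)
print(x.mean(), mu.min(), mu.max())
```

    Assessment then proceeds by comparing a sample's certainty degrees against the clouds representing each water-quality grade; the hyper-entropy He captures the fuzziness of the grade boundaries themselves.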

  6. Accounting for Success and Failure: A Discursive Psychological Approach to Sport Talk

    ERIC Educational Resources Information Center

    Locke, Abigail

    2004-01-01

    In recent years, constructionist methodologies such as discursive psychology (Edwards & Potter, 1992) have begun to be used in sport research. This paper provides a practical guide to applying a discursive psychological approach to sport data. It discusses the assumptions and principles of discursive psychology and outlines the stages of a…

  7. Situational Effects May Account for Gain Scores in Cognitive Ability Testing: A Longitudinal SEM Approach

    ERIC Educational Resources Information Center

    Matton, Nadine; Vautier, Stephane; Raufaste, Eric

    2009-01-01

    Mean gain scores for cognitive ability tests between two sessions in a selection setting are now a robust finding, yet not fully understood. Many authors do not attribute such gain scores to an increase in the target abilities. Our approach consists of testing a longitudinal SEM model suitable to this view. We propose to model the scores' changes…

  8. A fatal fall from a balcony? A biomechanical approach to resolving conflicting witness accounts.

    PubMed

    Jones, M D; Cory, C Z

    2006-07-01

    An adult male was found below a third floor balcony having sustained fatal head injuries. An account provided by a witness described how the deceased had been in high spirits and had engaged in swinging from the third floor balcony rail in an attempt to swing onto a lower second floor balcony and whilst doing so had lost his grip and fallen (10.67 metres) to the ground below. A conflicting account was provided, some weeks later, by a second witness, who claimed to have observed an argument between two men on a third floor balcony, during which one had vigorously pushed the other over the balcony rail. The push, it was alleged, caused the man to move very quickly over the balcony rail and fall in an 'upturned crucifix' position to the ground. This paper describes a series of biomechanical experiments, conducted on a reconstruction of the third floor balcony and the second floor balcony rail, during which a volunteer was subjected to the two fall scenarios, in an attempt to resolve the conflicting witness accounts. Analysis of human movement was performed using a 3-D motion analysis system, markers were placed at the volunteer's key joint centres and were tracked to determine physical parameters. The parameter values were used to calculate what dynamic movements may have occurred had the volunteer been allowed to fall, not just a distance equivalent to the lower balcony rail but a greater distance, equivalent to that between the balcony and the ground. Calculations indicate that during the hanging-fall scenario a range of body rotation was produced between 159 degrees and 249 degrees, that is, an upturned head-first body orientation, consistent with that required to produce the described injuries and consistent with the description provided by the first witness. The push-fall scenario, however, produced a greater estimated body rotation of between 329 degrees and 530 degrees, equal to the body rotating, from the point of free-fall to the moment of impact, between

  10. A Multi-scale Approach for CO2 Accounting and Risk Analysis in CO2 Enhanced Oil Recovery Sites

    NASA Astrophysics Data System (ADS)

    Dai, Z.; Viswanathan, H. S.; Middleton, R. S.; Pan, F.; Ampomah, W.; Yang, C.; Jia, W.; Lee, S. Y.; McPherson, B. J. O. L.; Grigg, R.; White, M. D.

    2015-12-01

    Using carbon dioxide in enhanced oil recovery (CO2-EOR) is a promising technology for emissions management because CO2-EOR can dramatically reduce carbon sequestration costs in the absence of greenhouse gas emissions policies that include incentives for carbon capture and storage. This study develops a multi-scale approach to perform CO2 accounting and risk analysis for understanding CO2 storage potential within an EOR environment at the Farnsworth Unit of the Anadarko Basin in northern Texas. A set of geostatistical-based Monte Carlo simulations of CO2-oil-water flow and transport in the Morrow formation are conducted for global sensitivity and statistical analysis of the major risk metrics: CO2 injection rate, CO2 first breakthrough time, CO2 production rate, cumulative net CO2 storage, cumulative oil and CH4 production, and water injection and production rates. A global sensitivity analysis indicates that reservoir permeability, porosity, and thickness are the major intrinsic reservoir parameters that control net CO2 injection/storage and oil/CH4 recovery rates. The well spacing (the distance between the injection and production wells) and the sequence of alternating CO2 and water injection are the major operational parameters for designing an effective five-spot CO2-EOR pattern. The response surface analysis shows that net CO2 injection rate increases with increasing reservoir thickness, permeability, and porosity. The oil/CH4 production rates are positively correlated to reservoir permeability, porosity and thickness, but negatively correlated to the initial water saturation. The mean and confidence intervals are estimated for quantifying the uncertainty ranges of the risk metrics. The results from this study provide useful insights for understanding the CO2 storage potential and the corresponding risks of commercial-scale CO2-EOR fields.

  11. Accounting for negative automaintenance in pigeons: a dual learning systems approach and factored representations.

    PubMed

    Lesaint, Florian; Sigaud, Olivier; Khamassi, Mehdi

    2014-01-01

    Animals, including humans, are prone to develop persistent maladaptive and suboptimal behaviours. Some of these behaviours have been suggested to arise from interactions between brain systems of Pavlovian conditioning, the acquisition of responses to initially neutral stimuli previously paired with rewards, and instrumental conditioning, the acquisition of active behaviours leading to rewards. However, the mechanics of these systems and their interactions are still unclear. While extensively studied independently, few models have been developed to account for these interactions. In some experiments, pigeons have been observed to display a maladaptive behaviour that some suggest involves conflicts between Pavlovian and instrumental conditioning. In a procedure referred to as negative automaintenance, a key light is paired with the subsequent delivery of food; however, any peck towards the key light results in the omission of the reward. Studies showed that in such a procedure some pigeons persisted in pecking to a substantial level despite its negative consequence, while others learned to refrain from pecking and maximized their cumulative rewards. Furthermore, the pigeons that were unable to refrain from pecking could nevertheless shift their pecks towards a harmless alternative key light. We confronted a computational model that combines dual-learning systems and factored representations, recently developed to account for sign-tracking and goal-tracking behaviours in rats, with these negative automaintenance experimental data. We show that it can explain the variability of the observed behaviours and the capacity of alternative key lights to distract pigeons from their detrimental behaviours. These results confirm the proposed model as an interesting tool to reproduce experiments that could involve interactions between Pavlovian and instrumental conditioning. The model allows us to draw predictions that may be experimentally verified, which could help further investigate

  12. Accounting for Negative Automaintenance in Pigeons: A Dual Learning Systems Approach and Factored Representations

    PubMed Central

    Lesaint, Florian; Sigaud, Olivier; Khamassi, Mehdi

    2014-01-01

    Animals, including Humans, are prone to develop persistent maladaptive and suboptimal behaviours. Some of these behaviours have been suggested to arise from interactions between brain systems of Pavlovian conditioning, the acquisition of responses to initially neutral stimuli previously paired with rewards, and instrumental conditioning, the acquisition of active behaviours leading to rewards. However the mechanics of these systems and their interactions are still unclear. While extensively studied independently, few models have been developed to account for these interactions. On some experiment, pigeons have been observed to display a maladaptive behaviour that some suggest to involve conflicts between Pavlovian and instrumental conditioning. In a procedure referred as negative automaintenance, a key light is paired with the subsequent delivery of food, however any peck towards the key light results in the omission of the reward. Studies showed that in such procedure some pigeons persisted in pecking to a substantial level despite its negative consequence, while others learned to refrain from pecking and maximized their cumulative rewards. Furthermore, the pigeons that were unable to refrain from pecking could nevertheless shift their pecks towards a harmless alternative key light. We confronted a computational model that combines dual-learning systems and factored representations, recently developed to account for sign-tracking and goal-tracking behaviours in rats, to these negative automaintenance experimental data. We show that it can explain the variability of the observed behaviours and the capacity of alternative key lights to distract pigeons from their detrimental behaviours. These results confirm the proposed model as an interesting tool to reproduce experiments that could involve interactions between Pavlovian and instrumental conditioning. 
The model allows us to draw predictions that may be experimentally verified, which could help further investigate
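
    The dual-systems account in this abstract can be illustrated with a toy action-selection rule. The sketch below is hypothetical (names, weights and values are mine, not the authors' published model): a model-free instrumental value is mixed with a Pavlovian-like, stimulus-bound advantage term before a softmax choice.

```python
import math

def softmax_policy(q_mf, adv_mb, omega=0.5, beta=3.0):
    """Mix model-free Q-values with a Pavlovian/model-based advantage
    term (weight omega), then map to action probabilities by softmax."""
    combined = {a: q_mf[a] + omega * adv_mb[a] for a in q_mf}
    z = sum(math.exp(beta * v) for v in combined.values())
    return {a: math.exp(beta * combined[a]) / z for a in combined}

# Negative automaintenance: pecking the lit key omits the reward, so the
# instrumental (model-free) system devalues it, while the Pavlovian-like
# term keeps attributing value to the reward-predicting key light.
q_mf   = {'peck': -0.2, 'refrain': 0.8}   # learned from omitted rewards
adv_mb = {'peck': 1.2,  'refrain': 0.0}   # stimulus-bound attraction

p = softmax_policy(q_mf, adv_mb, omega=1.0)
```

    With the Pavlovian weight high, 'peck' keeps the highest choice probability even though the instrumental system has learned that pecking costs the reward; with the weight near zero, 'refrain' dominates, mirroring the inter-individual variability the abstract describes.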

  13. Accounting for negative automaintenance in pigeons: a dual learning systems approach and factored representations.

    PubMed

    Lesaint, Florian; Sigaud, Olivier; Khamassi, Mehdi

    2014-01-01

    Animals, including humans, are prone to developing persistent, maladaptive and suboptimal behaviours. Some of these behaviours have been suggested to arise from interactions between brain systems of Pavlovian conditioning, the acquisition of responses to initially neutral stimuli previously paired with rewards, and instrumental conditioning, the acquisition of active behaviours leading to rewards. However, the mechanics of these systems and their interactions are still unclear. While extensively studied independently, few models have been developed to account for these interactions. In some experiments, pigeons have been observed to display a maladaptive behaviour suggested to involve conflicts between Pavlovian and instrumental conditioning. In a procedure referred to as negative automaintenance, a key light is paired with the subsequent delivery of food; however, any peck towards the key light results in the omission of the reward. Studies showed that in such a procedure some pigeons persisted in pecking at a substantial level despite its negative consequences, while others learned to refrain from pecking and maximized their cumulative rewards. Furthermore, the pigeons that were unable to refrain from pecking could nevertheless shift their pecks towards a harmless alternative key light. We confronted a computational model that combines dual learning systems and factored representations, recently developed to account for sign-tracking and goal-tracking behaviours in rats, with these negative automaintenance experimental data. We show that it can explain the variability of the observed behaviours and the capacity of alternative key lights to distract pigeons from their detrimental behaviours. These results confirm the proposed model as an interesting tool for reproducing experiments that could involve interactions between Pavlovian and instrumental conditioning.
The model allows us to draw predictions that may be experimentally verified, which could help further investigate

  14. An engineering based approach for hydraulic computations in river flows

    NASA Astrophysics Data System (ADS)

    Di Francesco, S.; Biscarini, C.; Pierleoni, A.; Manciola, P.

    2016-06-01

    This paper presents an engineering-based approach to hydraulic risk evaluation. The aim of the research is to identify criteria for choosing the simplest appropriate model to use in different scenarios, as the characteristics of the main river channel vary. The complete flow field, generally expressed in terms of pressure, velocity and acceleration, can be described through a three-dimensional approach that considers all flow properties varying in every direction. In many practical applications of river flow studies, however, the greatest changes occur in only two dimensions, or even one. In these cases the use of simplified approaches can lead to accurate results, with simulations that are easier to build and faster to run. The study has been conducted taking into account a dimensionless channel parameter, the ratio of the curvature radius to the width of the channel (R/B).

  15. A subgrid based approach for morphodynamic modelling

    NASA Astrophysics Data System (ADS)

    Volp, N. D.; van Prooijen, B. C.; Pietrzak, J. D.; Stelling, G. S.

    2016-07-01

    To improve the accuracy and the efficiency of morphodynamic simulations, we present a subgrid based approach for a morphodynamic model. This approach is well suited for areas characterized by sub-critical flow, such as estuaries, coastal areas and lowland rivers. The new method uses a different grid resolution to compute the hydrodynamics and the morphodynamics. The hydrodynamic computations are carried out with a subgrid based, two-dimensional, depth-averaged model. This model uses a coarse computational grid in combination with a subgrid. The subgrid contains high resolution bathymetry and roughness information to compute volumes, friction and advection. The morphodynamic computations are carried out entirely on a high resolution grid, the bed grid. It is key to find a link between the information defined on the different grids in order to guarantee the feedback between the hydrodynamics and the morphodynamics. This link is made by using a new physics-based interpolation method. The method interpolates water levels and velocities from the coarse grid to the high resolution bed grid. The morphodynamic solution improves significantly when using the subgrid based method compared to a full coarse grid approach. The Exner equation is discretised with an upwind method based on the direction of the bed celerity. This ensures a stable solution for the Exner equation. By means of three examples, it is shown that the subgrid based approach offers a significant improvement at a minimal computational cost.
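
    The upwind discretisation of the Exner equation mentioned in this abstract can be sketched in one dimension. This is a minimal illustration under simplifying assumptions (explicit time stepping, fixed boundary cells, hypothetical porosity), not the authors' two-dimensional subgrid scheme.

```python
def exner_upwind_step(z, qs, celerity, dx, dt, porosity=0.4):
    """One explicit step of the 1-D Exner equation
    (1 - p) dz/dt + d(qs)/dx = 0, with the spatial difference of the
    sediment transport qs taken upwind according to the sign of the
    bed celerity. Boundary cells are kept fixed."""
    n = len(z)
    z_new = z[:]
    for i in range(1, n - 1):
        if celerity[i] >= 0:            # bed forms travel downstream
            dqs = qs[i] - qs[i - 1]
        else:                           # bed forms travel upstream
            dqs = qs[i + 1] - qs[i]
        z_new[i] = z[i] - dt / ((1.0 - porosity) * dx) * dqs
    return z_new

# Uniform transport implies no bed change in the interior.
z_flat = exner_upwind_step([0.0] * 5, [1.0] * 5, [0.5] * 5, dx=1.0, dt=0.1)
```

    Choosing the difference stencil by the bed-celerity sign is what keeps the update stable: the scheme only uses information from the direction the bed perturbation actually travels.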

  16. Contest for Jurisdiction: An Occupational Analysis of Principals' Responses to Test-Based Accountability

    ERIC Educational Resources Information Center

    Rutledge, Stacey A.

    2010-01-01

    This case study uses a theory of occupational ecology to understand why test-based accountability has been successful at redirecting principals' work toward high-stakes standards and assessments. The principals and English teachers at two Chicago high schools were interviewed annually over a four-year period. The study finds that test-based…

  17. Outcome-Based Education and Student Learning in Managerial Accounting in Hong Kong

    ERIC Educational Resources Information Center

    Lui, Gladie; Shum, Connie

    2012-01-01

    Although Outcome-based Education has not been successful in public education in several countries, it has been successful in the medical fields in higher education in the U.S. The author implemented OBE in her Managerial Accounting course in Hong Kong. Intended learning outcomes were mapped against Bloom's Cognitive Domain. Teaching and learning…

  18. Dynamic model of production enterprises based on accounting registers and its identification

    NASA Astrophysics Data System (ADS)

    Sirazetdinov, R. T.; Samodurov, A. V.; Yenikeev, I. A.; Markov, D. S.

    2016-06-01

    The report focuses on the mathematical modeling of economic entities based on accounting registers. A dynamic model of the financial and economic activity of an enterprise is developed as a system of differential equations, and algorithms are created for identifying the parameters of the dynamic model. The model is then constructed and identified for Russian machine-building enterprises.

  19. Principals' Inviting Leadership Behaviors in a Time of Test-Based Accountability

    ERIC Educational Resources Information Center

    Egley, Robert J.; Jones, Brett D.

    2005-01-01

    We queried Florida elementary teachers about how they perceived their principals' professional and personal inviting leadership behaviors during a time when many teachers and principals experienced considerable pressure due to test-based accountability. Despite the pressure, teachers reported that their principals demonstrated fairly high levels of inviting…

  20. Preferences for Team Learning and Lecture-Based Learning among First-Year Undergraduate Accounting Students

    ERIC Educational Resources Information Center

    Opdecam, Evelien; Everaert, Patricia; Van Keer, Hilde; Buysschaert, Fanny

    2014-01-01

    This study investigates students' "preference" for team learning and its effectiveness, compared to lecture-based learning. A quasi-experiment was set up in a financial accounting course in the first-year undergraduate of the Economics and Business Administration Program, where students had to choose between one of the two learning…

  1. The Social Organization of School Counseling in the Era of Standards-Based Accountability

    ERIC Educational Resources Information Center

    Dorsey, Alexander C.

    2011-01-01

    The reform policies of standards-based accountability, as outlined in NCLB, impede the functioning of school counseling programs and the delivery of services to students. Although recent studies have focused on the transformation of the school counseling profession, a gap exists in the literature with regard to how the experiences of school…

  2. Are Performance-Based Accountability Systems Effective? Evidence from Five Sectors. Research Brief

    ERIC Educational Resources Information Center

    Leuschner, Kristin J.

    2010-01-01

    During the past two decades, performance-based accountability systems (PBASs), which link financial or other incentives to measured performance as a means of improving services, have gained popularity among policymakers. Although PBASs can vary widely across sectors, they share three main components: goals (i.e., one or more long-term outcomes to…

  3. Is Comprehension Necessary for Error Detection? A Conflict-Based Account of Monitoring in Speech Production

    ERIC Educational Resources Information Center

    Nozari, Nazbanou; Dell, Gary S.; Schwartz, Myrna F.

    2011-01-01

    Despite the existence of speech errors, verbal communication is successful because speakers can detect (and correct) their errors. The standard theory of speech-error detection, the perceptual-loop account, posits that the comprehension system monitors production output for errors. Such a comprehension-based monitor, however, cannot explain the…

  4. Historicizing and Contextualizing Global Policy Discourses: Test- and Standards-Based Accountabilities in Education

    ERIC Educational Resources Information Center

    Lingard, Bob

    2013-01-01

    This paper in commenting on the contributions to this special number demonstrates the necessity of historicizing and contextualizing the rise of test- and standards-based modes of accountability in contemporary education policy globally. Both are imperative for understanding specific national manifestations of what has become a globalized…

  5. Toward an Episodic Context Account of Retrieval-Based Learning: Dissociating Retrieval Practice and Elaboration

    ERIC Educational Resources Information Center

    Lehman, Melissa; Smith, Megan A.; Karpicke, Jeffrey D.

    2014-01-01

    We tested the predictions of 2 explanations for retrieval-based learning; while the elaborative retrieval hypothesis assumes that the retrieval of studied information promotes the generation of semantically related information, which aids in later retrieval (Carpenter, 2009), the episodic context account proposed by Karpicke, Lehman, and Aue (in…

  6. Toward a Culture of Consequences: Performance-Based Accountability Systems for Public Services. Monograph

    ERIC Educational Resources Information Center

    Stecher, Brian M.; Camm, Frank; Damberg, Cheryl L.; Hamilton, Laura S.; Mullen, Kathleen J.; Nelson, Christopher; Sorensen, Paul; Wachs, Martin; Yoh, Allison; Zellman, Gail L.

    2010-01-01

    Performance-based accountability systems (PBASs), which link incentives to measured performance as a means of improving services to the public, have gained popularity. While PBASs can vary widely across sectors, they share three main components: goals, incentives, and measures. Research suggests that PBASs influence provider behaviors, but little…

  7. Evolutionary impact assessment: accounting for evolutionary consequences of fishing in an ecosystem approach to fisheries management

    PubMed Central

    Laugen, Ane T; Engelhard, Georg H; Whitlock, Rebecca; Arlinghaus, Robert; Dankel, Dorothy J; Dunlop, Erin S; Eikeset, Anne M; Enberg, Katja; Jørgensen, Christian; Matsumura, Shuichi; Nusslé, Sébastien; Urbach, Davnah; Baulier, Loїc; Boukal, David S; Ernande, Bruno; Johnston, Fiona D; Mollet, Fabian; Pardoe, Heidi; Therkildsen, Nina O; Uusi-Heikkilä, Silva; Vainikka, Anssi; Heino, Mikko; Rijnsdorp, Adriaan D; Dieckmann, Ulf

    2014-01-01

    Managing fisheries resources to maintain healthy ecosystems is one of the main goals of the ecosystem approach to fisheries (EAF). While a number of international treaties call for the implementation of EAF, there are still gaps in the underlying methodology. One aspect that has received substantial scientific attention recently is fisheries-induced evolution (FIE). Increasing evidence indicates that intensive fishing has the potential to exert strong directional selection on life-history traits, behaviour, physiology, and morphology of exploited fish. Of particular concern is that reversing evolutionary responses to fishing can be much more difficult than reversing demographic or phenotypically plastic responses. Furthermore, like climate change, multiple agents cause FIE, with effects accumulating over time. Consequently, FIE may alter the utility derived from fish stocks, which in turn can modify the monetary value living aquatic resources provide to society. Quantifying and predicting the evolutionary effects of fishing is therefore important for both ecological and economic reasons. An important reason this is not happening is the lack of an appropriate assessment framework. We therefore describe the evolutionary impact assessment (EvoIA) as a structured approach for assessing the evolutionary consequences of fishing and evaluating the predicted evolutionary outcomes of alternative management options. EvoIA can contribute to EAF by clarifying how evolution may alter stock properties and ecological relations, support the precautionary approach to fisheries management by addressing a previously overlooked source of uncertainty and risk, and thus contribute to sustainable fisheries. PMID:26430388

  8. The inequality footprints of nations: a novel approach to quantitative accounting of income inequality.

    PubMed

    Alsamawi, Ali; Murray, Joy; Lenzen, Manfred; Moran, Daniel; Kanemoto, Keiichiro

    2014-01-01

    In this study we use economic input-output analysis to calculate the inequality footprint of nations. An inequality footprint shows the link that each country's domestic economic activity has to income distribution elsewhere in the world. To this end we use employment and household income accounts for 187 countries and a historical time series dating back to 1990. Our results show that in 2010, most developed countries had an inequality footprint that was higher than their within-country inequality, meaning that, in order to support domestic lifestyles, these countries source imports from more unequal economies. Exceptions include the United States and the United Kingdom, whose footprints place them on a par with many developing countries. Russia has high within-country inequality; nevertheless, it has the lowest inequality footprint in the world, owing to its trade connections with the Commonwealth of Independent States and Europe. Our findings show that inequality-intensive commodities, such as electronic components, chemicals, fertilizers, minerals, and agricultural products, often originate in developing countries characterized by high levels of inequality. Consumption of these commodities may implicate within-country inequality in both developing and developed countries.
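
    The input-output machinery behind a footprint calculation can be sketched with a two-sector Leontief model. The numbers below are purely illustrative, not the paper's 187-country data; the intensities `q` stand in for whatever per-unit indicator is being traced through supply chains.

```python
def leontief_footprint(A, q, y):
    """Footprint = q . (I - A)^-1 . y for a 2x2 technical-coefficient
    matrix A, indicator intensities q, and final demand y (plain lists).
    (I - A)^-1 is the Leontief inverse, computed analytically here."""
    m = [[1.0 - A[0][0], -A[0][1]],
         [-A[1][0], 1.0 - A[1][1]]]
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    inv = [[ m[1][1] / det, -m[0][1] / det],
           [-m[1][0] / det,  m[0][0] / det]]
    # Total output x required to satisfy final demand y.
    x = [inv[0][0] * y[0] + inv[0][1] * y[1],
         inv[1][0] * y[0] + inv[1][1] * y[1]]
    return q[0] * x[0] + q[1] * x[1]

A = [[0.1, 0.2], [0.3, 0.1]]   # inter-sector input coefficients
q = [0.5, 2.0]                  # indicator intensity per unit output
y = [100.0, 50.0]               # final demand
fp = leontief_footprint(A, q, y)
```

    The Leontief inverse captures the indirect requirements: demand in one sector pulls output, and hence the traced indicator, from its suppliers as well.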

  9. The Inequality Footprints of Nations: A Novel Approach to Quantitative Accounting of Income Inequality

    PubMed Central

    Alsamawi, Ali; Murray, Joy; Lenzen, Manfred; Moran, Daniel; Kanemoto, Keiichiro

    2014-01-01

    In this study we use economic input-output analysis to calculate the inequality footprint of nations. An inequality footprint shows the link that each country's domestic economic activity has to income distribution elsewhere in the world. To this end we use employment and household income accounts for 187 countries and a historical time series dating back to 1990. Our results show that in 2010, most developed countries had an inequality footprint that was higher than their within-country inequality, meaning that, in order to support domestic lifestyles, these countries source imports from more unequal economies. Exceptions include the United States and the United Kingdom, whose footprints place them on a par with many developing countries. Russia has high within-country inequality; nevertheless, it has the lowest inequality footprint in the world, owing to its trade connections with the Commonwealth of Independent States and Europe. Our findings show that inequality-intensive commodities, such as electronic components, chemicals, fertilizers, minerals, and agricultural products, often originate in developing countries characterized by high levels of inequality. Consumption of these commodities may implicate within-country inequality in both developing and developed countries. PMID:25353333

  11. A Bayesian approach for interpreting shoemark evidence in forensic casework: accounting for wear features.

    PubMed

    Skerrett, James; Neumann, Cedric; Mateos-Garcia, Ismael

    2011-07-15

    Shoemark evidence remains a cornerstone of forensic crime investigation. Shoemarks can be used at a crime scene to reconstruct the course of events; they can be used as a forensic intelligence tool to establish links between crime scenes; and, when control material is available, they can help infer the participation of given individuals in the commission of a crime. Nevertheless, as for most other impression evidence, the current process used to evaluate and report the weight of shoemark evidence is under extreme scrutiny. Building on previous research, this paper proposes a model to evaluate shoemark evidence in a more transparent manner. The model is currently limited to sole pattern and wear characteristics; it does not formally account for cuts and other accidental damage. Furthermore, it requires the acquisition of relevant shoemark datasets and the development of automated comparison algorithms to deploy its full benefits, and these are not currently available. Instead, we demonstrate, using casework examples, that a pragmatic consideration of the model's variables already allows us to evaluate shoemark evidence in a more transparent way and thus begin to address the current scientific and legal concerns.
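
    A Bayesian evaluation of this kind ultimately reduces to a likelihood ratio. The sketch below is a minimal illustration assuming independence between the pattern and wear components; the probabilities are invented for the example, not taken from the paper's model.

```python
def likelihood_ratio(components):
    """Combine per-feature likelihood ratios under an independence
    assumption: LR = prod over features of P(E_i|Hp) / P(E_i|Hd),
    where Hp/Hd are the prosecution and defence hypotheses."""
    lr = 1.0
    for p_hp, p_hd in components:
        lr *= p_hp / p_hd
    return lr

# Illustrative values: the observed sole pattern is fairly common
# (modest support), but the observed wear features would be rare on a
# random shoe, so they carry most of the evidential weight.
pattern = (0.95, 0.20)   # P(correspondence | same shoe), P(... | random shoe)
wear    = (0.90, 0.05)
lr = likelihood_ratio([pattern, wear])   # LR > 1 supports Hp
```

    Separating the components makes the evaluation transparent: each feature's contribution to the overall weight of evidence can be reported and challenged on its own.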

  12. A simulation model of hospital management based on cost accounting analysis according to disease.

    PubMed

    Tanaka, Koji; Sato, Junzo; Guo, Jinqiu; Takada, Akira; Yoshihara, Hiroyuki

    2004-12-01

    Since shortly before 2000, hospital cost accounting has been increasingly performed at Japanese national university hospitals. At Kumamoto University Hospital, for instance, departmental costs have been analyzed since 2000, and since 2003 the cost balance has been obtained by disease in preparation for Diagnosis-Related Groups and the Prospective Payment System. On the basis of these experiences, we have constructed a simulation model of hospital management based on the financial data of an existing hospital. The program has worked correctly in repeated trials and with satisfactory speed. Although there is room for improvement in the detailed accounts and the cost accounting engine, the basic model has proved satisfactory. We will later improve the program's structure and incorporate a wider variety of hospital management data, which should yield a prospective outlook for the practical application of this hospital management model.

  13. Measure for Measure: How Proficiency-Based Accountability Systems Affect Inequality in Academic Achievement

    PubMed Central

    Jennings, Jennifer; Sohn, Heeju

    2016-01-01

    How do proficiency-based accountability systems affect inequality in academic achievement? This paper reconciles mixed findings in the literature by demonstrating that three factors jointly determine accountability's impact. First, by analyzing student-level data from a large urban school district, we find that when educators face accountability pressure, they focus attention on students closest to proficiency. We refer to this practice as educational triage, and show that the difficulty of the proficiency standard affects whether lower or higher performing students gain most on high-stakes tests used to evaluate schools. Less difficult proficiency standards decrease inequality in high-stakes achievement, while more difficult ones increase it. Second, we show that educators emphasize test-specific skills with students near proficiency, a practice that we refer to as instructional triage. As a result, the effects of accountability pressure differ across high and low-stakes tests; we find no effects on inequality in low-stakes reading and math tests of similar skills. Finally, we provide suggestive evidence that instructional triage is most pronounced in the lowest performing schools. We conclude by discussing how these findings shape our understanding of accountability's impacts on educational inequality. PMID:27122642

  14. Ugi-based approaches to quinoxaline libraries.

    PubMed

    Azuaje, Jhonny; El Maatougui, Abdelaziz; García-Mera, Xerardo; Sotelo, Eddy

    2014-08-11

    An expedient and concise Ugi-based unified approach for the rapid assembly of quinoxaline frameworks has been developed. This convergent and versatile method uses readily available commercial reagents, does not require advanced intermediates, and exhibits excellent bond-forming efficiency, thus exemplifying the operationally simple synthesis of quinoxaline libraries.

  15. Physics-based approach to haptic display

    NASA Technical Reports Server (NTRS)

    Brown, J. Michael; Colgate, J. Edward

    1994-01-01

    This paper addresses the implementation of complex, multiple-degree-of-freedom virtual environments for haptic display. We suggest that a physics-based approach to rigid body simulation is appropriate for hand tool simulation, but that currently available simulation techniques are not sufficient to guarantee successful implementation. We discuss the desirable features of a virtual environment simulation, specifically highlighting the importance of stability guarantees.

  16. Standards-Based Accountability as a Tool for Making a Difference in Student Learning. A State and an Institutional Perspective on Standards-Based Accountability.

    ERIC Educational Resources Information Center

    Wilkerson, Judy R.

    This paper examines Florida's standards-driven performance assessment, emphasizing teacher preparation, and touching on K-12 accountability. Florida's educational reform and accountability efforts are driven by the Florida System of School Improvement and Accountability document. The system is derived from state goals similar to the national Goals…

  17. A dynamic water accounting framework based on marginal resource opportunity cost

    NASA Astrophysics Data System (ADS)

    Tilmant, A.; Marques, G.; Mohamed, Y.

    2014-10-01

    Many river basins throughout the world are increasingly under pressure as water demands keep rising due to population growth, industrialization, urbanization and rising living standards. In the past, the typical answer to meet those demands focused on the supply side and involved the construction of hydraulic infrastructures to capture more water from surface water bodies and from aquifers. As river basins became more and more developed, downstream water users and ecosystems have become increasingly dependent on the management actions taken by upstream users. The increased interconnectedness between water users, aquatic ecosystems and the built environment is further compounded by climate change and its impact on the water cycle. Those pressures mean that it has become increasingly important to measure and account for changes in water fluxes and their corresponding economic value as they progress throughout the river system. Such basin water accounting should provide policy makers with important information regarding the relative contribution of each water user, infrastructure and management decision to the overall economic value of the river basin. This paper presents a dynamic water accounting approach whereby the entire river basin is considered as a value chain with multiple services including production and storage. Water users and reservoir operators are considered as economic agents who can exchange water with their hydraulic neighbours at a price corresponding to the marginal value of water. Effective water accounting is made possible by keeping track of all water fluxes and their corresponding hypothetical transactions using the results of a hydro-economic model. The proposed approach is illustrated with the Eastern Nile River basin in Africa.
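
    The bookkeeping behind such hypothetical transactions can be sketched very simply: each transfer between hydraulic neighbours is valued at a marginal water value taken from a hydro-economic model. The fluxes and values below are illustrative, not Eastern Nile results.

```python
def account_transactions(fluxes, marginal_values):
    """Value each water transfer between hydraulic neighbours at the
    receiving agent's marginal water value; return the per-link values
    and the basin total (a toy version of dynamic water accounting)."""
    values = [f * mv for f, mv in zip(fluxes, marginal_values)]
    return values, sum(values)

# Three links of a river value chain: flux in Mm3, value in $/m3.
fluxes = [120.0, 80.0, 30.0]    # upstream -> downstream transfers
mv     = [0.02, 0.05, 0.10]     # marginal value rises with scarcity
per_link, total = account_transactions(fluxes, mv)
```

    In a full hydro-economic model the marginal values are shadow prices that change with storage and inflows, so the same flux can be worth more in a dry year; the accounting identity above stays the same.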

  18. A dynamic water accounting framework based on marginal resource opportunity cost

    NASA Astrophysics Data System (ADS)

    Tilmant, A.; Marques, G.; Mohamed, Y.

    2015-03-01

    Many river basins throughout the world are increasingly under pressure as water demands keep rising due to population growth, industrialization, urbanization and rising living standards. In the past, the typical answer to meet those demands focused on the supply side and involved the construction of hydraulic infrastructures to capture more water from surface water bodies and from aquifers. As river basins have become more and more developed, downstream water users and ecosystems have become increasingly dependent on the management actions taken by upstream users. The increased interconnectedness between water users, aquatic ecosystems and the built environment is further compounded by climate change and its impact on the water cycle. Those pressures mean that it has become increasingly important to measure and account for changes in water fluxes and their corresponding economic value as they progress throughout the river system. Such basin water accounting should provide policy makers with important information regarding the relative contribution of each water user, infrastructure and management decision to the overall economic value of the river basin. This paper presents a dynamic water accounting approach whereby the entire river basin is considered as a value chain with multiple services including production and storage. Water users and reservoir operators are considered as economic agents who can exchange water with their hydraulic neighbors at a price corresponding to the marginal value of water. Effective water accounting is made possible by keeping track of all water fluxes and their corresponding hypothetical transactions using the results of a hydro-economic model. The proposed approach is illustrated with the Eastern Nile River basin in Africa.

  19. Advanced Approach of Multiagent Based Buoy Communication

    PubMed Central

    Gricius, Gediminas; Drungilas, Darius; Andziulis, Arunas; Dzemydiene, Dale; Voznak, Miroslav; Kurmis, Mindaugas; Jakovlev, Sergej

    2015-01-01

    Usually, a hydrometeorological information system is faced with great data flows, but the data levels are often excessive, depending on the observed region of the water. The paper presents advanced buoy communication technologies based on multiagent interaction and data exchange between several monitoring system nodes. The proposed management of buoy communication is based on a clustering algorithm, which enables the performance of the hydrometeorological information system to be enhanced. The experiment is based on the design and analysis of an inexpensive but reliable Baltic Sea autonomous monitoring network (buoys), which would be able to continuously monitor and collect temperature, waviness, and other required data. The proposed approach of multiagent based buoy communication enables all the data from the coast-based station to be monitored at limited transmission speed by setting different tasks for the agent-based buoy system according to the clustering information. PMID:26345197

  20. Spatial pattern of nitrogen deposition flux over Czech forests: a novel approach accounting for unmeasured nitrogen species

    NASA Astrophysics Data System (ADS)

    Hůnová, Iva; Stoklasová, Petra; Kurfürst, Pavel; Vlček, Ondřej; Schovánková, Jana; Stráník, Vojtěch

    2015-04-01

    The aim of the presented study is to provide an improved, more reliable and more realistic estimate of the spatial pattern of atmospheric nitrogen deposition flux over the Czech forests, collating all available data and model results. Such estimates have so far been based on measurements of ambient N/NOx concentrations as a dry deposition proxy, and of N/NH4+ and N/NO3- as wet deposition proxies. To estimate the unmeasured species contributing to dry deposition, we used an Eulerian photochemical dispersion model, CAMx, the Comprehensive Air Quality Model with extensions (ESSS, 2011), coupled with a high-resolution regional numerical weather prediction model, Aladin (Vlček, Corbet, 2011). The contribution of fog was estimated using a geostatistical data-driven model. The final maps accounting for unmeasured species clearly indicate that the approach used so far results in a substantial underestimation of nitrogen deposition flux. Substituting unmeasured nitrogen species with modeled values appears to be a plausible way to approximate total nitrogen deposition and to obtain a more realistic spatial pattern as input for further studies of likely nitrogen impacts on ecosystems.

  1. Accounting Technology Associate Degree. Louisiana Technical Education Program and Course Standards. Competency-Based Postsecondary Curriculum Outline from Bulletin 1822.

    ERIC Educational Resources Information Center

    Louisiana State Dept. of Education, Baton Rouge. Div. of Vocational Education.

    This document outlines the curriculum of Louisiana's accounting technology associate degree program, which is a 6-term (77-credit hour) competency-based program designed to prepare students for employment as accounting technicians providing technical administrative support to professional accountants and other financial management personnel.…

  2. A recency-based account of the primacy effect in free recall.

    PubMed

    Tan, L; Ward, G

    2000-11-01

    Seven experiments investigated the role of rehearsal in free recall to determine whether accounts of recency effects based on the ratio rule could be extended to provide an account of primacy effects based on the number, distribution, and recency of the rehearsals of the study items. Primacy items were rehearsed more often and further toward the end of the list than middle items, particularly with a slow presentation rate (Experiment 1) and with high-frequency words (Experiment 2). Recency, but not primacy, was reduced by a filled delay (Experiment 3), although significant recency survived a filled retention interval when a fixed-rehearsal strategy was used (Experiment 4). Experimenter-presented schedules of rehearsals resulted in similar serial position curves to those observed with participant-generated rehearsals (Experiment 5) and were used to confirm the main findings in Experiments 6 and 7.

  3. Initial experience with a first-to-market member accountability-based insurance product.

    PubMed

    Woll, Douglas R; Nelson, David R

    2010-10-01

    We describe the initial experience with a first-to-market health insurance product design based on principles of both member and purchaser accountability. Two benefit levels were offered, enhanced and standard. Qualification for the enhanced benefit level was obtained through members' commitment to follow their physicians' recommended treatment plan. Employers were offered a discount of 10% in exchange for offering this new product and promoting a healthy work environment. Membership in the product grew beyond expectations, and several health improvements were noted.

  4. Frame-Based Approach To Database Management

    NASA Astrophysics Data System (ADS)

    Voros, Robert S.; Hillman, Donald J.; Decker, D. Richard; Blank, Glenn D.

    1989-03-01

    Practical knowledge-based systems need to reason in terms of knowledge that is already available in databases. This type of knowledge is usually represented as tables acquired from external databases and published reports. Knowledge based systems provide a means for reasoning about entities at a higher level of abstraction. What is needed in many of today's expert systems is a link between the knowledge base and external databases. One such approach is a frame-based database management system. Package Expert (PEx) designs packages for integrated circuits. The thrust of our work is to bring together diverse technologies, data and design knowledge in a coherent system. PEx uses design rules to reason about properties of chips and potential packages, including dimensions, possible materials and packaging requirements. This information is available in existing databases. PEx needs to deal with the following types of information consistently: material databases which are in several formats; technology databases, also in several formats; and parts files which contain dimensional information. It is inefficient and inelegant to have rules access the database directly. Instead, PEx uses a frame-based hierarchical knowledge management approach to databases. Frames serve as the interface between rule-based knowledge and databases. We describe PEx and the use of frames in database retrieval. We first give an overview and the design evolution of the expert system. Next, we describe the system implementation. Finally, we describe how the rules in the expert system access the databases via frames.

  5. Burned area, active fires and biomass burning - approaches to account for emissions from fires in Tanzania

    NASA Astrophysics Data System (ADS)

    Ruecker, Gernot; Hoffmann, Anja; Leimbach, David; Tiemann, Joachim; Ng'atigwa, Charles

    2013-04-01

    Eleven years of data from the globally available MODIS burned area product and the MODIS Active Fire Product have been analysed for Tanzania in conjunction with GIS data on land use and cover to provide a baseline for fire activity in this East African country. The total fire radiative energy (FRE) emitted by fires that were picked up by the burned area and active fire products is estimated based on a spatio-temporal clustering algorithm over the burned areas, and integration of the fire radiative power from the MODIS Active Fires product over the time of burning and the area of each burned area cluster. The resulting biomass combusted per unit area, based on Wooster's scaling factor from FRE to biomass combusted, is compared to values found in the literature and to values found in the Global Fire Emissions Database (GFED). Pyrogenic emissions are then estimated using emission factors. According to our analysis, an average of 11 million ha burn annually (ranging between 8.5 and 12.9 million ha) in Tanzania, corresponding to between 10 and 14% of Tanzania's land area. Most burned area is recorded in the months from May to October. The land cover types most affected are woodland and shrubland: they comprise almost 70% of Tanzania's average annual burned area, or 6.8 million ha. Most burning occurs in gazetted land, with an annual average of 3.7 million ha in forest reserves, 3.3 million ha in game reserves and 1.46 million ha in national parks, totalling close to 8.5 million ha, or 77% of the annual average burned area of Tanzania. Annual variability of burned area is moderate for most of the analysed classes, and in most cases no clear trend can be detected in burned area, except for the Lindi region, where annual burned area appears to be increasing. Preliminary results regarding emissions from fires show that for larger fires that burn over a longer time, biomass burned derived through the FRP method compares well to literature values, while the integration over
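    The FRE-to-emissions chain sketched in the abstract (integrate fire radiative power over the burn, scale to biomass combusted, apply emission factors) can be illustrated roughly as follows. The 0.368 kg/MJ FRE-to-biomass factor is Wooster et al.'s published scaling; the function names and the emission factor in the test below are illustrative, not taken from this study:

```python
# Hedged sketch of an FRE-based emissions estimate:
#   FRP time series -> FRE (MJ) -> biomass combusted (kg) -> species emission (kg).

def fre_from_frp(frp_mw, timestamps_s):
    """Trapezoidal integration of fire radiative power (MW) over time (s).

    Returns total fire radiative energy in MJ (MW * s == MJ).
    """
    fre = 0.0
    for i in range(1, len(frp_mw)):
        dt = timestamps_s[i] - timestamps_s[i - 1]
        fre += 0.5 * (frp_mw[i] + frp_mw[i - 1]) * dt
    return fre

def emissions_kg(fre_mj, emission_factor_g_per_kg, beta_kg_per_mj=0.368):
    """Biomass combusted via Wooster's scaling, then an emission factor (g/kg)."""
    biomass_kg = beta_kg_per_mj * fre_mj
    return biomass_kg * emission_factor_g_per_kg / 1000.0
```

For instance, a cluster radiating a constant 10 MW for 100 s yields an FRE of 1,000 MJ, i.e. 368 kg of biomass combusted under this scaling.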

  6. Probabilistic approaches to accounting for data variability in the practical application of bioavailability in predicting aquatic risks from metals.

    PubMed

    Ciffroy, Philippe; Charlatchka, Rayna; Ferreira, Daniel; Marang, Laura

    2013-07-01

    The biotic ligand model (BLM) theoretically enables the derivation of environmental quality standards that are based on true bioavailable fractions of metals. Several physicochemical variables (especially pH, major cations, dissolved organic carbon, and dissolved metal concentrations) must, however, be assigned to run the BLM, but they are highly variable in time and space in natural systems. This article describes probabilistic approaches for integrating such variability during the derivation of risk indexes. To describe each variable using a probability density function (PDF), several methods were combined to 1) treat censored data (i.e., data below the limit of detection), 2) incorporate the uncertainty of the solid-to-liquid partitioning of metals, and 3) detect outliers. From a probabilistic perspective, 2 alternative approaches that are based on log-normal and Γ distributions were tested to estimate the probability of the predicted environmental concentration (PEC) exceeding the predicted non-effect concentration (PNEC), i.e., p(PEC/PNEC>1). The probabilistic approach was tested on 4 real-case studies based on Cu-related data collected from stations on the Loire and Moselle rivers. The approach described in this article is based on BLM tools that are freely available for end-users (i.e., the Bio-Met software) and on accessible statistical data treatments. This approach could be used by stakeholders who are involved in risk assessments of metals for improving site-specific studies. PMID:23505250
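    As a rough illustration of the probabilistic step, the sketch below estimates p(PEC/PNEC > 1) by Monte Carlo sampling from log-normal distributions. All parameter values and the function name are invented for illustration; the study's full treatment (censored data, partitioning uncertainty, outlier detection, Γ-distribution alternative) is not reproduced here:

```python
# Monte Carlo sketch: probability that PEC exceeds PNEC when both are
# described by log-normal probability density functions.
import random

def prob_risk(mu_pec, sigma_pec, mu_pnec, sigma_pnec, n=100_000, seed=42):
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n):
        pec = rng.lognormvariate(mu_pec, sigma_pec)    # predicted env. concentration
        pnec = rng.lognormvariate(mu_pnec, sigma_pnec) # predicted no-effect conc.
        if pec / pnec > 1.0:
            exceed += 1
    return exceed / n
```

With identical PEC and PNEC distributions the estimate hovers around 0.5, as expected; shifting the PEC median upward drives it toward 1.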

  7. Systems Engineering Interfaces: A Model Based Approach

    NASA Technical Reports Server (NTRS)

    Fosse, Elyse; Delp, Christopher

    2013-01-01

    Currently: Ops Rev develops and maintains a framework that includes interface-specific language, patterns, and Viewpoints. Ops Rev implements the framework to design MOS 2.0 and its 5 Mission Services. Implementation de-couples interfaces and instances of interaction. Future: A Mission MOSE implements the approach and uses the model-based artifacts for reviews. The framework extends further into the ground data layers and provides a unified methodology.

  8. Generation of SEEAW asset accounts based on water resources management models

    NASA Astrophysics Data System (ADS)

    Pedro-Monzonís, María; Solera, Abel; Andreu, Joaquín

    2015-04-01

    One of the main challenges of the 21st century is the sustainable use of water, since water is an essential element for the life of all who inhabit our planet. In many cases, the lack of economic valuation of water resources leads to inefficient water use. In this regard, society expects policymakers and stakeholders to maximise the benefit produced per unit of natural resources. Water planning and Integrated Water Resources Management (IWRM) represent the best way to achieve this goal. The System of Environmental-Economic Accounting for Water (SEEAW) is presented as a tool for water allocation which enables the building of water balances in a river basin. The main concern of the SEEAW is to provide a standard approach which allows policymakers to compare results between different territories. But building water accounts is a complex task due to the difficulty of collecting the required data. Because gauging the components of the hydrological cycle is difficult, the use of simulation models has become an essential tool, extensively employed in recent decades. The target of this paper is to present the building up of a database that enables the combined use of hydrological models and water resources models developed with the AQUATOOL DSS to fill in the SEEAW tables. This research is framed within the Water Accounting in a Multi-Catchment District (WAMCD) project, financed by the European Union. Its main goal is the development of water accounts in the Mediterranean Andalusian River Basin District, in Spain. This research aims to contribute to the objectives of the "Blueprint to safeguard Europe's water resources". It is noteworthy that, in Spain, a large part of these methodological decisions are included in the Spanish Guideline of Water Planning with normative status, guaranteeing consistency and comparability of the results.

  9. The Symbol Grounding Problem Revisited: A Thorough Evaluation of the ANS Mapping Account and the Proposal of an Alternative Account Based on Symbol–Symbol Associations

    PubMed Central

    Reynvoet, Bert; Sasanguie, Delphine

    2016-01-01

    Recently, a lot of studies in the domain of numerical cognition have been published demonstrating a robust association between numerical symbol processing and individual differences in mathematics achievement. Because numerical symbols are so important for mathematics achievement, many researchers want to provide an answer to the ‘symbol grounding problem,’ i.e., how does a symbol acquire its numerical meaning? The most popular account, the approximate number system (ANS) mapping account, assumes that a symbol acquires its numerical meaning by being mapped onto a non-verbal ANS. Here, we critically evaluate four arguments that are supposed to support this account, i.e., (1) there is an evolutionary system for approximate number processing, (2) non-symbolic and symbolic number processing show the same behavioral effects, (3) non-symbolic and symbolic numbers activate the same brain regions, which are also involved in more advanced calculation, and (4) non-symbolic comparison is related to performance on symbolic mathematics achievement tasks. Based on this evaluation, we conclude that all of these arguments, and consequently also the mapping account, are questionable. Next we explore a less popular alternative, where small numerical symbols are initially mapped onto a precise representation and then, in combination with increasing knowledge of the counting list, result in an independent and exact symbolic system based on order relations between symbols. We evaluate this account by reviewing evidence on order judgment tasks following the same four arguments. Although further research is necessary, the available evidence so far suggests that this symbol–symbol association account should be considered a worthy alternative account of how symbols acquire their meaning. PMID:27790179

  10. Reexamining the language account of cross-national differences in base-10 number representations.

    PubMed

    Vasilyeva, Marina; Laski, Elida V; Ermakova, Anna; Lai, Weng-Feng; Jeong, Yoonkyung; Hachigian, Amy

    2015-01-01

    East Asian students consistently outperform students from other nations in mathematics. One explanation for this advantage is a language account; East Asian languages, unlike most Western languages, provide cues about the base-10 structure of multi-digit numbers, facilitating the development of base-10 number representations. To test this view, the current study examined how kindergartners represented two-digit numbers using single unit-blocks and ten-blocks. The participants (N=272) were from four language groups (Korean, Mandarin, English, and Russian) that vary in the extent of "transparency" of the base-10 structure. In contrast to previous findings with older children, kindergartners showed no cross-language variability in the frequency of producing base-10 representations. Furthermore, they showed a pattern of within-language variability that was not consistent with the language account and was likely attributable to experiential factors. These findings suggest that language might not play as critical a role in the development of base-10 representations as suggested in earlier research. PMID:25240152

  11. Salience and Attention in Surprisal-Based Accounts of Language Processing

    PubMed Central

    Zarcone, Alessandra; van Schijndel, Marten; Vogels, Jorrig; Demberg, Vera

    2016-01-01

    The notion of salience has been singled out as the explanatory factor for a diverse range of linguistic phenomena. In particular, perceptual salience (e.g., visual salience of objects in the world, acoustic prominence of linguistic sounds) and semantic-pragmatic salience (e.g., prominence of recently mentioned or topical referents) have been shown to influence language comprehension and production. A different line of research has sought to account for behavioral correlates of cognitive load during comprehension as well as for certain patterns in language usage using information-theoretic notions, such as surprisal. Surprisal and salience both affect language processing at different levels, but the relationship between the two has not been adequately elucidated, and the question of whether salience can be reduced to surprisal / predictability is still open. Our review identifies two main challenges in addressing this question: terminological inconsistency and lack of integration between high and low levels of representations in salience-based accounts and surprisal-based accounts. We capitalize upon work in visual cognition in order to orient ourselves in surveying the different facets of the notion of salience in linguistics and their relation with models of surprisal. We find that work on salience highlights aspects of linguistic communication that models of surprisal tend to overlook, namely the role of attention and relevance to current goals, and we argue that the Predictive Coding framework provides a unified view which can account for the role played by attention and predictability at different levels of processing and which can clarify the interplay between low and high levels of processes and between predictability-driven expectation and attention-driven focus. PMID:27375525

  12. Salience and Attention in Surprisal-Based Accounts of Language Processing.

    PubMed

    Zarcone, Alessandra; van Schijndel, Marten; Vogels, Jorrig; Demberg, Vera

    2016-01-01

    The notion of salience has been singled out as the explanatory factor for a diverse range of linguistic phenomena. In particular, perceptual salience (e.g., visual salience of objects in the world, acoustic prominence of linguistic sounds) and semantic-pragmatic salience (e.g., prominence of recently mentioned or topical referents) have been shown to influence language comprehension and production. A different line of research has sought to account for behavioral correlates of cognitive load during comprehension as well as for certain patterns in language usage using information-theoretic notions, such as surprisal. Surprisal and salience both affect language processing at different levels, but the relationship between the two has not been adequately elucidated, and the question of whether salience can be reduced to surprisal / predictability is still open. Our review identifies two main challenges in addressing this question: terminological inconsistency and lack of integration between high and low levels of representations in salience-based accounts and surprisal-based accounts. We capitalize upon work in visual cognition in order to orient ourselves in surveying the different facets of the notion of salience in linguistics and their relation with models of surprisal. We find that work on salience highlights aspects of linguistic communication that models of surprisal tend to overlook, namely the role of attention and relevance to current goals, and we argue that the Predictive Coding framework provides a unified view which can account for the role played by attention and predictability at different levels of processing and which can clarify the interplay between low and high levels of processes and between predictability-driven expectation and attention-driven focus. PMID:27375525

  13. Matched filter based iterative adaptive approach

    NASA Astrophysics Data System (ADS)

    Nepal, Ramesh; Zhang, Yan Rockee; Li, Zhengzheng; Blake, William

    2016-05-01

    Matched filter sidelobes arising from diversified LPI waveform design, and sensor resolution, are two important considerations in radars and active sensors in general. Matched filter sidelobes can potentially mask weaker targets, and low sensor resolution not only causes a high margin of error but also limits sensing in target-rich environments. Improving both factors depends, in part, on the transmitted waveform and consequently on the pulse compression technique. An adaptive pulse compression algorithm is hence desired that can mitigate the aforementioned limitations. A new Matched Filter based Iterative Adaptive Approach, MF-IAA, has been developed as an extension of the traditional Iterative Adaptive Approach, IAA. MF-IAA takes the matched filter output as its input. The motivation here is to facilitate implementation of the Iterative Adaptive Approach without disrupting the processing chain of the traditional matched filter. Similar to IAA, MF-IAA is a user-parameter-free, iterative, weighted-least-squares-based spectral identification algorithm. This work focuses on the implementation of MF-IAA. The feasibility of MF-IAA is studied using a realistic airborne radar simulator as well as actual measured airborne radar data. The performance of MF-IAA is measured with different test waveforms and different signal-to-noise ratio (SNR) levels. In addition, range-Doppler super-resolution using MF-IAA is investigated. Sidelobe reduction as well as super-resolution enhancement is validated. The robustness of MF-IAA with respect to different LPI waveforms and SNR levels is also demonstrated.
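    MF-IAA itself is beyond a short sketch, but its input, the traditional matched filter, and the sidelobe problem it targets can be illustrated with a Barker-13 code, whose aperiodic autocorrelation has a mainlobe of 13 and sidelobe magnitudes of at most 1 (the zero-padded, noiseless echo below is purely illustrative):

```python
# Minimal matched-filter sketch (not MF-IAA): cross-correlate the received
# samples with the transmitted code and observe mainlobe vs. sidelobes.

BARKER13 = [1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1]

def matched_filter(received, code):
    """Cross-correlation of a real received sequence with a real code."""
    n, m = len(received), len(code)
    return [sum(received[lag + k] * code[k] for k in range(m))
            for lag in range(n - m + 1)]

# Noiseless echo of the code, zero-padded on both sides.
received = [0] * 5 + BARKER13 + [0] * 5
output = matched_filter(received, BARKER13)   # peak of 13 at lag 5, sidelobes <= 1
```

Even in this ideal case the sidelobes are nonzero; with strong and weak targets present, such sidelobes are exactly what can mask the weaker return.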

  14. An Analytical Approach to Model Heterogeneous Recrystallization Kinetics Taking into Account the Natural Spatial Inhomogeneity of Deformation

    NASA Astrophysics Data System (ADS)

    Luo, Haiwen; van der Zwaag, Sybrand

    2016-01-01

    The classical Johnson-Mehl-Avrami-Kolmogorov equation was modified to take into account the normal local strain distribution in deformed samples. This new approach is not only able to describe the influence of the local heterogeneity of recrystallization but also to produce an average apparent Avrami exponent to characterize the entire recrystallization process. In particular, it predicts that the apparent Avrami exponent should be within a narrow range of 1 to 2 and converges to 1 when the local strain varies greatly. Moreover, the apparent Avrami exponent is predicted to be insensitive to temperature and deformation conditions. These predictions are in excellent agreement with the experimental observations on static recrystallization after hot deformation in different steels and other metallic alloys.
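    The averaging idea can be sketched numerically: sample local strains from a normal distribution, apply a local JMAK law X = 1 - exp(-k(ε) tⁿ) to each, and average. The quadratic strain dependence of the rate constant and all numerical values below are illustrative assumptions, not the paper's fitted model:

```python
# Sketch: average local JMAK recrystallization kinetics over a normal
# distribution of local strain (all parameters are illustrative).
import math
import random

def mean_recrystallized_fraction(t, n_avrami=2.0, strain_mean=0.5,
                                 strain_sd=0.15, samples=20_000, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        strain = max(rng.gauss(strain_mean, strain_sd), 1e-6)  # clip at ~zero
        k = strain ** 2        # assumed strain dependence of the rate constant
        total += 1.0 - math.exp(-k * t ** n_avrami)
    return total / samples
```

Fitting an apparent Avrami exponent to the resulting averaged curve, rather than to any single local curve, is the quantity the analytical model above predicts to fall between 1 and 2.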

  15. Analysis of the Second E-Forum on Competency-Based Approaches

    ERIC Educational Resources Information Center

    Perez, Leticia

    2007-01-01

    As its title suggests, this is an account of "the Second E-Forum on Competency-based Approaches" and summarises the opinions and experiences expressed by the participants. The forum was based on a discussion paper prepared by the Canadian Observatory of Educational Reforms and a series of questions raised by Philippe Jonnaert were used to…

  16. Electrochemical Approaches to Aptamer-Based Sensing

    NASA Astrophysics Data System (ADS)

    Xiao, Yi; Plaxco, Kevin W.

    Motivated by the potential convenience of electronic detection, a wide range of electrochemical, aptamer-based sensors have been reported since the first was described only in 2005. Although many of these are simply electrochemical, aptamer-based equivalents of traditional immunochemical approaches (e.g., sandwich and competition assays employing electroactive signaling moieties), others exploit the unusual physical properties of aptamers, properties that render them uniquely well suited for application to impedance and folding-based electrochemical sensors. In particular, the ability of electrode-bound aptamers to undergo reversible, binding-induced folding provides a robust, reagentless means of transducing target binding into an electronic signal that is largely impervious to nonspecific signals arising from contaminants. This capability enables the direct detection of specific proteins at physiologically relevant, picomolar concentrations in blood serum and other complex, contaminant-ridden sample matrices.

  17. Risk-Based Probabilistic Approach to Aeropropulsion System Assessment

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2002-01-01

    In an era of shrinking development budgets and resources, where there is also an emphasis on reducing the product development cycle, the role of system assessment, performed in the early stages of an engine development program, becomes very critical to the successful development of new aeropropulsion systems. A reliable system assessment not only helps to identify the best propulsion system concept among several candidates, it can also identify which technologies are worth pursuing. This is particularly important for advanced aeropropulsion technology development programs, which require an enormous amount of resources. In the current practice of deterministic, or point-design, approaches, the uncertainties of design variables are either unaccounted for or accounted for by safety factors. This could often result in an assessment with unknown and unquantifiable reliability. Consequently, it would fail to provide additional insight into the risks associated with the new technologies, which are often needed by decision makers to determine the feasibility and return-on-investment of a new aircraft engine. In this work, an alternative approach based on the probabilistic method was described for a comprehensive assessment of an aeropropulsion system. The statistical approach quantifies the design uncertainties inherent in a new aeropropulsion system and their influences on engine performance. Because of this, it enhances the reliability of a system assessment. A technical assessment of a wave-rotor-enhanced gas turbine engine was performed to demonstrate the methodology. The assessment used probability distributions to account for the uncertainties that occur in component efficiencies and flows and in mechanical design variables. 
The approach taken in this effort was to integrate the thermodynamic cycle analysis embedded in the computer code NEPP (NASA Engine Performance Program) and the engine weight analysis embedded in the computer code WATE (Weight Analysis of Turbine Engines).
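    The contrast with a point-design assessment can be sketched as follows: instead of fixed component efficiencies, assumed normal distributions are sampled and propagated to an overall metric (here simply a product of efficiencies standing in for the NEPP/WATE analyses; all means and standard deviations are invented):

```python
# Monte Carlo sketch of a probabilistic system assessment: propagate
# assumed uncertainty in component efficiencies instead of point values.
import random
import statistics

def sampled_overall_efficiency(n=50_000, seed=7):
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        fan  = min(rng.gauss(0.90, 0.010), 1.0)  # assumed mean / std. dev.
        comp = min(rng.gauss(0.88, 0.015), 1.0)
        turb = min(rng.gauss(0.91, 0.010), 1.0)
        samples.append(fan * comp * turb)        # stand-in overall metric
    # Return the mean and the 5th percentile (a simple risk measure).
    return statistics.mean(samples), statistics.quantiles(samples, n=100)[4]
```

The spread between the mean and the lower percentile is the kind of quantified risk information a deterministic, safety-factor-based assessment cannot provide.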

  18. Acid-base accounting to predict post-mining drainage quality on surface mines.

    PubMed

    Skousen, J; Simmons, J; McDonald, L M; Ziemkiewicz, P

    2002-01-01

    Acid-base accounting (ABA) is an analytical procedure that provides values to help assess the acid-producing and acid-neutralizing potential of overburden rocks prior to coal mining and other large-scale excavations. This procedure was developed by West Virginia University scientists during the 1960s. After the passage of laws requiring an assessment of surface mining on water quality, ABA became a preferred method to predict post-mining water quality, and permitting decisions for surface mines are largely based on the values determined by ABA. To predict the post-mining water quality, the amount of acid-producing rock is compared with the amount of acid-neutralizing rock, and a prediction of the water quality at the site (whether acid or alkaline) is obtained. We gathered geologic and geographic data for 56 mined sites in West Virginia, which allowed us to estimate total overburden amounts, and values were determined for maximum potential acidity (MPA), neutralization potential (NP), net neutralization potential (NNP), and NP to MPA ratios for each site based on ABA. These values were correlated to post-mining water quality from springs or seeps on the mined property. Overburden mass was determined by three methods, with the method used by Pennsylvania researchers showing the most accurate results for overburden mass. A poor relationship existed between MPA and post-mining water quality, NP was intermediate, and NNP and the NP to MPA ratio showed the best prediction accuracy. In this study, NNP and the NP to MPA ratio gave identical water quality prediction results. Therefore, with NP to MPA ratios, values were separated into categories: <1 should produce acid drainage, between 1 and 2 can produce either acid or alkaline water conditions, and >2 should produce alkaline water. On our 56 surface mined sites, NP to MPA ratios varied from 0.1 to 31, and six sites (11%) did not fit the expected pattern using this category approach. Two sites with ratios <1 did not
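    The screening categories described above translate directly into a small classifier; the function and its units are an illustrative sketch of the stated NP:MPA thresholds (<1 acid, 1 to 2 uncertain, >2 alkaline), not the authors' code:

```python
# Sketch of acid-base accounting screening: NNP = NP - MPA, and the
# NP:MPA ratio decides the expected post-mining drainage quality.

def classify_drainage(np_value, mpa):
    """NP and MPA in tonnes CaCO3 equivalent per 1000 tonnes of overburden."""
    nnp = np_value - mpa
    ratio = np_value / mpa if mpa > 0 else float("inf")
    if ratio < 1.0:
        verdict = "likely acid drainage"
    elif ratio <= 2.0:
        verdict = "uncertain: acid or alkaline"
    else:
        verdict = "likely alkaline drainage"
    return nnp, ratio, verdict
```

For example, an overburden with NP of 30 and MPA of 10 gives NNP = 20 and a ratio of 3, i.e. alkaline drainage would be expected.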

  19. Deciding Who Decides Questions at the Intersection of School Finance Reform Litigation and Standards-Based Accountability Policies

    ERIC Educational Resources Information Center

    Superfine, Benjamin Michael

    2009-01-01

    Courts hearing school finance reform cases have recently begun to consider several issues related to standards-based accountability policies. This convergence of school finance reform litigation and standards-based accountability policies represents a chance for the courts to reallocate decision-making authority for each type of reform across the…

  20. Administrators' Perceptions of Outcome-Based Education: Outputs, Outcomes and Professional Accountability.

    ERIC Educational Resources Information Center

    Furman, Gail Chase

    This case study explores the impact of outcome-based education (OBE) in one school district 5 years after its adoption. The study is guided by constructivist theory and a perspective that policy studies can have an important problem-finding function. The basic assumption of the OBE approach, that educational improvement depends on a shift in focus…

  1. Place-Based Pedagogy in the Era of Accountability: An Action Research Study

    ERIC Educational Resources Information Center

    Saracino, Peter C.

    2010-01-01

    Today's most common method of teaching biology--driven by calls for standardization and high-stakes testing--relies on a standards-based, de-contextualized approach to education. This results in "one size fits all" curriculums that ignore local contexts relevant to students' lives, discourage student engagement and ultimately work against a deep…

  2. Computer-based accountability system (Phase I) for special nuclear materials at Argonne-West

    SciTech Connect

    Ingermanson, R.S.; Proctor, A.E.

    1982-05-01

    An automated accountability system for special nuclear materials (SNM) is under development at Argonne National Laboratory-West. Phase I of the development effort has established the following basic features of the system: a unique file organization allows rapid updating or retrieval of the status of various SNM, based on batch numbers, storage location, serial number, or other attributes. Access to the program is controlled by an interactive user interface that can be easily understood by operators who have had no prior background in electronic data processing. Extensive use of structured programming techniques makes the software package easy to understand and to modify for specific applications. All routines are written in FORTRAN.
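    The multi-attribute retrieval that the file organization supports can be sketched, in Python rather than the system's FORTRAN, as a record store with secondary indexes; the class and field names below are hypothetical:

```python
# Sketch: records keyed by serial number, with secondary indexes so status
# can be retrieved by batch number, storage location, or both.
from collections import defaultdict

class SNMInventory:
    def __init__(self):
        self.records = {}                     # serial number -> record
        self.by_batch = defaultdict(set)      # batch number  -> serial numbers
        self.by_location = defaultdict(set)   # location      -> serial numbers

    def add(self, serial, batch, location, grams):
        self.records[serial] = {"serial": serial, "batch": batch,
                                "location": location, "grams": grams}
        self.by_batch[batch].add(serial)
        self.by_location[location].add(serial)

    def lookup(self, batch=None, location=None):
        """Intersect the indexes for whichever attributes were given."""
        hits = set(self.records)
        if batch is not None:
            hits &= self.by_batch[batch]
        if location is not None:
            hits &= self.by_location[location]
        return sorted(hits)
```

Each index answers one attribute query in roughly constant time, which is the "rapid retrieval by batch, location, or serial number" property the abstract describes.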

  3. The impact of activity based cost accounting on health care capital investment decisions.

    PubMed

    Greene, J K; Metwalli, A

    2001-01-01

    For the future survival of rural hospitals in the U.S., there is a need to make sound financial decisions. Activity Based Cost Accounting (ABC) provides more accurate and detailed cost information with which to make an informed capital investment decision, taking into consideration all the costs and revenue reimbursement from third-party payors. The paper analyzes, evaluates and compares two scenarios for acquiring capital equipment, and attempts to show the importance of utilizing the ABC method, as compared with the traditional cost method, in making a sound financial decision. PMID:11794757

  4. Water accounting for stressed river basins based on water resources management models.

    PubMed

    Pedro-Monzonís, María; Solera, Abel; Ferrer, Javier; Andreu, Joaquín; Estrela, Teodoro

    2016-09-15

    Water planning and Integrated Water Resources Management (IWRM) represent the best way to help decision makers identify and choose the most adequate alternatives among the possible ones. The System of Environmental-Economic Accounting for Water (SEEA-W) is presented as a tool for the building of water balances in a river basin, providing a standard approach to achieve comparability of results between different territories. The target of this paper is to present the building up of a tool that enables the combined use of hydrological models and water resources models to fill in the SEEA-W tables. At every step of the modelling chain, we are able to build the asset accounts and the physical water supply and use tables according to the SEEA-W approach, along with an estimation of the water services costs. The case study is the Jucar River Basin District (RBD), located in the eastern part of the Iberian Peninsula in Spain, which, like many other Mediterranean basins, is currently water-stressed. To guide this work we have used the PATRICAL model in combination with the AQUATOOL Decision Support System (DSS). The results indicate that for the average year the total use of water in the district amounts to 15,143 hm³/year, with total renewable water resources of 3,909 hm³/year. On the other hand, the water service costs in the Jucar RBD amount to 1,634 million € per year at constant 2012 prices. It is noteworthy that 9% of these costs correspond to non-conventional resources, such as desalinated water, reused water and water transferred from other regions.


  6. LP based approach to optimal stable matchings

    SciTech Connect

    Teo, Chung-Piaw; Sethuraman, J.

    1997-06-01

    We study the classical stable marriage and stable roommates problems using a polyhedral approach. We propose a new LP formulation for the stable roommates problem, which is feasible if and only if the underlying roommates instance has a stable matching. Furthermore, for certain special weight functions on the edges, we construct a 2-approximation algorithm for the optimal stable roommates problem. Our technique exploits a crucial geometric property of the fractional solutions of this formulation. For the stable marriage problem, we show that a related geometry allows us to express any fractional solution in the stable marriage polytope as a convex combination of stable marriage solutions. This leads to a genuinely simple proof of the integrality of the stable marriage polytope. Based on these ideas, we devise a heuristic for the optimal stable roommates problem that combines the power of rounding and cutting-plane methods. We present some computational results based on preliminary implementations of this heuristic.
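The abstract's objects of study, stable matchings, can be illustrated by brute force (this is not the paper's LP approach, and the preference lists are invented): a matching is stable when no blocking pair exists, and the stable marriage polytope is the convex hull of exactly these matchings.

```python
from itertools import permutations

# Brute-force enumeration of the stable matchings of a tiny, hypothetical
# stable marriage instance. Each list ranks partners best-first.
men_pref = {
    "m1": ["w1", "w2", "w3"],
    "m2": ["w2", "w1", "w3"],
    "m3": ["w1", "w2", "w3"],
}
women_pref = {
    "w1": ["m2", "m1", "m3"],
    "w2": ["m1", "m2", "m3"],
    "w3": ["m1", "m2", "m3"],
}

def prefers(pref, a, b):
    return pref.index(a) < pref.index(b)

def is_stable(matching):
    """Stable iff no man and woman mutually prefer each other to their
    assigned partners (i.e., no blocking pair)."""
    wife = dict(matching)
    husband = {w: m for m, w in matching}
    for m, prefs in men_pref.items():
        for w in prefs:
            if w == wife[m]:
                break  # women ranked below the wife cannot block with m
            if prefers(women_pref[w], m, husband[w]):
                return False
    return True

men = sorted(men_pref)
stable = []
for ws in permutations(sorted(women_pref)):
    matching = list(zip(men, ws))
    if is_stable(matching):
        stable.append(matching)
print(len(stable))  # number of vertices of this instance's stable polytope
```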

  7. Lunar base CELSS: A bioregenerative approach

    NASA Technical Reports Server (NTRS)

    Easterwood, G. W.; Street, J. J.; Sartain, J. B.; Hubbell, D. H.; Robitaille, H. A.

    1992-01-01

    During the twenty-first century, human habitation of a self-sustaining lunar base could become a reality. To achieve this goal, the occupants will need food, water, and an adequate atmosphere within a carefully designed environment. Advanced technology will be employed to support terrestrial life-sustaining processes on the Moon. One approach to a life support system based on food production, waste management and utilization, and product synthesis is outlined. Inputs include an atmosphere, water, plants, biodegradable substrates, and materials manufactured from lunar resources, such as fiberglass containment vessels. Outputs include purification of air and water, food, and hydrogen (H2) generated from methane (CH4). Important criteria are as follows: (1) minimize resupply from Earth; and (2) recycle as efficiently as possible.

  8. Accountability in Dispositions for Juvenile Drug Offenders. Monograph.

    ERIC Educational Resources Information Center

    Pacific Inst. for Research and Evaluation, Walnut Creek, CA.

    Guidelines for the general development and implementation of accountability-based approaches for juvenile drug offenders are presented in this monograph. These topics are discussed: (1) the accountability approach; (2) the relevance of the accountability approach to drug offenders and its relationship to drug abuse treatment; (3) surveys of chief…

  9. Synthetic aperture elastography: a GPU based approach

    NASA Astrophysics Data System (ADS)

    Verma, Prashant; Doyley, Marvin M.

    2014-03-01

    Synthetic aperture (SA) ultrasound imaging produces highly accurate axial and lateral displacement estimates; however, low frame rates and large data volumes can hamper its clinical use. This paper describes a real-time SA-imaging-based ultrasound elastography system that we have recently developed to overcome this limitation. In this system, we implemented both beamforming and 2D cross-correlation echo tracking on an Nvidia GTX 480 graphics processing unit (GPU). We used one thread per pixel for beamforming, whereas one block per pixel was used for echo tracking. We compared the quality of elastograms computed with our real-time system with those computed using our standard single-threaded elastographic imaging methodology. In all studies, we used conventional measures of image quality such as the elastographic signal-to-noise ratio (SNRe). Specifically, the SNRe of axial and lateral strain elastograms computed with the real-time system were 36 dB and 23 dB, respectively, numerically equal to those computed with our standard approach. We achieved a frame rate of 6 frames per second using our GPU-based approach for 16 transmits and a kernel size of 60 × 60 pixels, which is 400 times faster than our standard protocol.
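The echo-tracking step can be sketched on the CPU (the paper's implementation runs 2D cross-correlation in GPU kernels; this 1D, pure-Python version on synthetic data only illustrates the principle):

```python
import math

def ncc(a, b):
    """Normalized cross-correlation coefficient of two equal-length windows."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def track_shift(pre, post, max_lag):
    """Return the non-negative lag of `post` that best matches `pre`
    (displacement estimate = location of the cross-correlation peak)."""
    n = len(pre)
    return max(range(max_lag + 1),
               key=lambda lag: ncc(pre, post[lag:lag + n]))

pre = [0, 1, 4, 9, 4, 1, 0, 0]      # synthetic pre-compression echo
post = [0, 0, 0] + pre + [0, 0, 0]  # same echo displaced by 3 samples
print(track_shift(pre, post, max_lag=3))
```

In the real system this search runs per pixel over 2D kernels (e.g., 60 × 60), which is why mapping one GPU block per pixel pays off.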

  10. Structuring a Competency-Based Accounting Communication Course at the Graduate Level

    ERIC Educational Resources Information Center

    Sharifi, Mohsen; McCombs, Gary B.; Fraser, Linda Lussy; McCabe, Robert K.

    2009-01-01

    The authors describe a graduate capstone accounting class as a basis for building communication skills desired by both accounting practitioners and accounting faculty. An academic service-learning (ASL) component is included. Adopted as a required class for a master of science degree in accounting at two universities, this course supports…

  11. Peptide Based Radiopharmaceuticals: Specific Construct Approach

    SciTech Connect

    Som, P; Rhodes, B A; Sharma, S S

    1997-10-21

    The objective of this project was to develop receptor-based peptides for diagnostic imaging and therapy. A series of peptides related to cell adhesion molecules (CAM) and immune regulation were designed for radiolabeling with 99mTc and evaluated in animal models as potential diagnostic imaging agents for various disease conditions such as thrombus (clot), acute kidney failure, and infection/inflammation imaging. The peptides for this project were designed by the industrial partner, Palatin Technologies (formerly Rhomed, Inc.), using various peptide design approaches, including a newly developed rational computer-assisted drug design (CADD) approach termed MIDAS (Metal ion Induced Distinctive Array of Structures). In this approach, the biological function domain and the 99mTc complexing domain are fused together so that structurally these domains are indistinguishable. This allows construction of conformationally rigid metallo-peptide molecules (similar to cyclic peptides) that are metabolically stable in vivo. All the newly designed peptides were screened in various in vitro receptor binding and functional assays to identify a lead compound. The lead compounds were formulated as one-step 99mTc labeling kits, which were studied by BNL for detailed in vivo imaging using various animal models of human disease. Two main peptides using the MIDAS approach evolved and were investigated: an RGD peptide for acute renal failure and an immunomodulatory peptide derived from tuftsin (RMT-1) for infection/inflammation imaging. Various RGD-based metallopeptides were designed, synthesized and assayed for their efficacy in inhibiting ADP-induced human platelet aggregation. Most of these peptides displayed biological activity in the 1-100 µM range. Based on previous work by others, RGD-I and RGD-II were evaluated in animal models of acute renal failure. These earlier studies showed that after acute ischemic injury the renal cortex displays

  12. Is comprehension necessary for error detection? A conflict-based account of monitoring in speech production

    PubMed Central

    Nozari, Nazbanou; Dell, Gary S.; Schwartz, Myrna F.

    2011-01-01

    Despite the existence of speech errors, verbal communication is successful because speakers can detect (and correct) their errors. The standard theory of speech-error detection, the perceptual-loop account, posits that the comprehension system monitors production output for errors. Such a comprehension-based monitor, however, cannot explain the double dissociation between comprehension and error-detection ability observed in aphasic patients. We propose a new theory of speech-error detection that is instead based on the production process itself. The theory borrows from studies of forced-choice response tasks the notion that error detection is accomplished by monitoring response conflict via a frontal brain structure, such as the anterior cingulate cortex. We adapt this idea to the two-step model of word production and test the model-derived predictions on a sample of aphasic patients. Our results show a strong correlation between patients' error-detection ability and the model's characterization of their production skills, and no significant correlation between error detection and comprehension measures, supporting a production-based monitor in general and the implemented conflict-based monitor in particular. The successful application of the conflict-based theory to error detection in linguistic as well as non-linguistic domains points to a domain-general monitoring system. PMID:21652015
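A heavily simplified, hypothetical illustration of the conflict idea (not the authors' implemented two-step model): an error is flagged whenever the runner-up word candidate is nearly as active as the winner, with no appeal to comprehension.

```python
# Toy conflict-based error monitor. Activation values and the threshold
# are invented; real models derive conflict from network dynamics.

def conflict(activations):
    """Conflict as the ratio of runner-up to winner activation,
    a crude stand-in for energy-based conflict measures."""
    top, runner_up = sorted(activations.values(), reverse=True)[:2]
    return runner_up / top

def flag_error(activations, threshold=0.5):
    """Detect an error when response conflict exceeds the threshold."""
    return conflict(activations) > threshold

clean = {"cat": 0.9, "dog": 0.2, "rat": 0.1}    # clear winner: low conflict
slip  = {"cat": 0.55, "dog": 0.50, "rat": 0.1}  # near-tie: high conflict
print(flag_error(clean), flag_error(slip))
```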

  13. Integrating software into PRA: a test-based approach.

    PubMed

    Li, Bin; Li, Ming; Smidts, Carol

    2005-08-01

    Probabilistic risk assessment (PRA) is a methodology for assessing the probability of failure or success of a system's operation. PRA has proven to be a systematic, logical, and comprehensive technique for risk assessment. Software plays an increasing role in modern safety-critical systems, and a significant number of failures can be attributed to software. Unfortunately, current probabilistic risk assessment concentrates on representing the behavior of hardware systems, humans, and (to a limited extent) their contributions to risk, but neglects the contributions of software due to a lack of understanding of software failure phenomena. It is thus imperative to consider and model the impact of software to reflect the risk in current and future systems. The objective of our research is to develop a methodology to account for the impact of software on system failure that can be used in the classical PRA analysis process. A test-based approach for integrating software into PRA is discussed in this article. This approach includes identification of the software functions to be modeled in the PRA and modeling of the software contributions in the event sequence diagram (ESD) and fault tree. The approach also introduces the concepts of input tree and output tree and proposes a quantification strategy that uses a software safety testing technique. The method is applied to an example system, PACS. PMID:16268949
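The fault-tree side of such modeling can be sketched as follows, assuming independent basic events with hypothetical probabilities (the article's ESD and input/output-tree machinery is not shown):

```python
# Fault-tree quantification sketch: combine basic-event failure
# probabilities through AND/OR gates, assuming independent events.

def gate_and(*probs):
    """All inputs must fail: multiply the probabilities."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def gate_or(*probs):
    """Any input failing suffices: complement of all-inputs-succeed."""
    p = 1.0
    for q in probs:
        p *= (1 - q)
    return 1 - p

# Hypothetical top event: the system fails if the software fails OR
# both redundant hardware channels fail.
p_software = 1e-3
p_channel = 1e-2
p_system = gate_or(p_software, gate_and(p_channel, p_channel))
print(f"{p_system:.6f}")
```

A test-based approach like the article's would estimate `p_software` from safety-testing results rather than assume it, then propagate it through gates exactly as above.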

  14. ECG biometric identification: A compression based approach.

    PubMed

    Bras, Susana; Pinho, Armando J

    2015-08-01

    Using the electrocardiogram (ECG) signal to identify and/or authenticate persons is a problem still lacking satisfactory solutions. Yet, the ECG possesses characteristics that are unique or difficult to obtain from other signals used in biometrics: (1) it requires contact and liveliness for acquisition; (2) it changes under stress, rendering it potentially useless if acquired under threat. Our main objective is to present an innovative and robust solution to the above-mentioned problem. To achieve this goal, we rely on information-theoretic data models for data compression and on similarity metrics related to the approximation of the Kolmogorov complexity. The proposed measure allows the comparison of two (or more) ECG segments without having to follow traditional approaches that require heartbeat segmentation (described as highly influenced by external or internal interferences). As a first approach, the method was able to cluster the data into three groups: identical record, same participant, and different participant, by stratification of the proposed measure, with values near 0 for the same participant and closer to 1 for different participants. A leave-one-out strategy was implemented in order to identify each participant in the database based on his/her ECG. A 1NN classifier was implemented, using as its distance measure the method proposed in this work. The classifier was able to correctly identify almost all participants, with an accuracy of 99% on the database used. PMID:26737619
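A sketch in the spirit of the proposed measure, using the Normalized Compression Distance (NCD) with Python's zlib as the compressor (the authors' exact compressor and data model may differ):

```python
import zlib

# NCD approximates Kolmogorov-complexity-based similarity with a real
# compressor: near 0 for very similar inputs, near 1 for unrelated ones.

def c(data: bytes) -> int:
    """Compressed size as a stand-in for Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
    cx, cy = c(x), c(y)
    return (c(x + y) - min(cx, cy)) / max(cx, cy)

import random
random.seed(0)
same = b"abcde" * 200                                  # highly regular "segment"
other = bytes(random.randrange(256) for _ in range(1000))  # unrelated noise
print(ncd(same, same) < ncd(same, other))  # similar pair scores lower
```

A 1NN classifier then simply assigns each query segment the label of the database segment with the smallest NCD, which is how the leave-one-out identification above can be reproduced.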

  15. Place-based pedagogy in the era of accountability: An action research study

    NASA Astrophysics Data System (ADS)

    Saracino, Peter C.

    Today's most common method of teaching biology, driven by calls for standardization and high-stakes testing, relies on a standards-based, de-contextualized approach to education. This results in "one size fits all" curricula that ignore local contexts relevant to students' lives, discourage student engagement, and ultimately work against a deep and lasting understanding of content. In contrast, place-based education, a pedagogical paradigm grounded in situated cognition and the progressive education tradition of John Dewey, utilizes the community as an integrating context for learning. It encourages the growth of school-community partnerships with an eye towards raising student achievement while also drawing students into the economic, political, social and ecological life of their communities. Such an approach seeks to provide students with learning experiences that are both academically significant and valuable to their communities. This study explores how high school science teachers can capitalize on the rich affordances offered by a place-based approach despite the constraints imposed by a state-mandated curriculum and high-stakes testing. Using action research, I designed, implemented, evaluated and refined an intervention that grounded a portion of a Living Environment high school course I teach in a place-based experience. This experience served as a unique anchoring event to contextualize students' learning of other required core topics. The overarching question framing this study is: How can science teachers capitalize on the rich affordances offered by a place-based approach despite the constraints imposed by a state-mandated curriculum and high-stakes testing? The following more specific questions were explored within the context of the intervention: (1) Which elements of the place-based paradigm could I effectively integrate into a Living Environment course? (2) In what ways would this integration impact students' interest? (3) In what ways would

  16. Nanotechnology-based approaches in anticancer research.

    PubMed Central

    Jabir, Nasimudeen R; Tabrez, Shams; Ashraf, Ghulam Md; Shakil, Shazi; Damanhouri, Ghazi A; Kamal, Mohammad A

    2012-01-01

    Cancer is a highly complex disease to understand, because it involves multiple cellular physiological systems. The most common cancer treatments are restricted to chemotherapy, radiation and surgery. Moreover, the early recognition and treatment of cancer remains a technological bottleneck. There is an urgent need to develop new and innovative technologies that could help to delineate tumor margins, identify residual tumor cells and micrometastases, and determine whether a tumor has been completely removed or not. Nanotechnology has witnessed significant progress in the past few decades, and its effect is widespread nowadays in every field. Nanoparticles can be modified in numerous ways to prolong circulation, enhance drug localization, increase drug efficacy, and potentially decrease the chances of multidrug resistance. Recently, research in the field of cancer nanotechnology has made remarkable advances. The present review summarizes the application of various nanotechnology-based approaches towards the diagnostics and therapeutics of cancer. PMID:22927757

  18. Strategic approaches to planetary base development

    NASA Technical Reports Server (NTRS)

    Roberts, Barney B.

    1992-01-01

    The evolutionary development of an expansionary planetary outpost is considered in light of both technical and economic issues. The outline of a partnering taxonomy is set forth, encompassing both the institutional and temporal issues involved in establishing shared interests and investments. The purely technical issues are discussed in terms of the program components, which include non-aerospace technologies such as construction engineering. Five models are proposed in which partnership and autonomy for participants are approached in different ways: (1) the standard customer/provider relationship; (2) a service-provider scenario; (3) the joint venture; (4) a technology joint-development model; and (5) a redundancy model for reduced costs. Based on the assumed characteristics of planetary surface systems, the cooperative private/public models are championed, with coordinated design by NASA to facilitate outside cooperation.

  19. Sepsis management: An evidence-based approach.

    PubMed

    Baig, Muhammad Akbar; Shahzad, Hira; Jamil, Bushra; Hussain, Erfan

    2016-03-01

    The Surviving Sepsis Campaign (SSC) guidelines outline an early goal-directed therapy (EGDT) that provides a standardized approach to ensure prompt and effective management of sepsis. That said, there are barriers to the application of evidence-based practice, which often lead to poorer overall adherence to the guidelines. Considering the global burden of disease, data from low- and middle-income countries are scarce. Asia is the largest continent, but most Asian countries do not have well-developed healthcare systems, and compliance rates with the resuscitation and management bundles are as low as 7.6% and 3.5%, respectively. Intensive care units are not adequately equipped, and financial concerns limit the implementation of expensive treatment strategies. Healthcare policy-makers should be notified in order to alleviate financial restrictions and ensure delivery of standard care to septic patients.


  1. Surrogate Motherhood: A Trust-Based Approach.

    PubMed

    Beier, Katharina

    2015-12-01

    Because it is often argued that surrogacy should not be treated as contractual, the question arises in which terms this practice might then be couched. In this article, I argue that a phenomenology of surrogacy centering on the notion of trust provides a description that is illuminating from the moral point of view. My thesis is that surrogacy establishes a complex and extended reproductive unit--the "surrogacy triad" consisting of the surrogate mother, the child, and the intending parents--whose constituents are bound together by mutual trustful commitments. Even though a trust-based approach does not provide an ultimate answer to whether surrogacy should be sanctioned or prohibited, it allows for at least some practical suggestions. In particular, I will argue that, under certain conditions, surrogacy is tenable within familial or other significant relationships, and I will stress the necessity of acknowledging the new relationships and moral commitments that result from this practice. PMID:26449234

  2. A Commentary on "Updating the Duplex Design for Test-Based Accountability in the Twenty-First Century"

    ERIC Educational Resources Information Center

    Brandt, Steffen

    2010-01-01

    This article presents the author's commentary on "Updating the Duplex Design for Test-Based Accountability in the Twenty-First Century," in which Isaac I. Bejar and E. Aurora Graf propose the application of a test design--the duplex design (which was proposed in 1988 by Bock and Mislevy) for application in current accountability assessments.…

  3. A genome-wide approach accounting for body mass index identifies genetic variants influencing fasting glycemic traits and insulin resistance.

    PubMed

    Manning, Alisa K; Hivert, Marie-France; Scott, Robert A; Grimsby, Jonna L; Bouatia-Naji, Nabila; Chen, Han; Rybin, Denis; Liu, Ching-Ti; Bielak, Lawrence F; Prokopenko, Inga; Amin, Najaf; Barnes, Daniel; Cadby, Gemma; Hottenga, Jouke-Jan; Ingelsson, Erik; Jackson, Anne U; Johnson, Toby; Kanoni, Stavroula; Ladenvall, Claes; Lagou, Vasiliki; Lahti, Jari; Lecoeur, Cecile; Liu, Yongmei; Martinez-Larrad, Maria Teresa; Montasser, May E; Navarro, Pau; Perry, John R B; Rasmussen-Torvik, Laura J; Salo, Perttu; Sattar, Naveed; Shungin, Dmitry; Strawbridge, Rona J; Tanaka, Toshiko; van Duijn, Cornelia M; An, Ping; de Andrade, Mariza; Andrews, Jeanette S; Aspelund, Thor; Atalay, Mustafa; Aulchenko, Yurii; Balkau, Beverley; Bandinelli, Stefania; Beckmann, Jacques S; Beilby, John P; Bellis, Claire; Bergman, Richard N; Blangero, John; Boban, Mladen; Boehnke, Michael; Boerwinkle, Eric; Bonnycastle, Lori L; Boomsma, Dorret I; Borecki, Ingrid B; Böttcher, Yvonne; Bouchard, Claude; Brunner, Eric; Budimir, Danijela; Campbell, Harry; Carlson, Olga; Chines, Peter S; Clarke, Robert; Collins, Francis S; Corbatón-Anchuelo, Arturo; Couper, David; de Faire, Ulf; Dedoussis, George V; Deloukas, Panos; Dimitriou, Maria; Egan, Josephine M; Eiriksdottir, Gudny; Erdos, Michael R; Eriksson, Johan G; Eury, Elodie; Ferrucci, Luigi; Ford, Ian; Forouhi, Nita G; Fox, Caroline S; Franzosi, Maria Grazia; Franks, Paul W; Frayling, Timothy M; Froguel, Philippe; Galan, Pilar; de Geus, Eco; Gigante, Bruna; Glazer, Nicole L; Goel, Anuj; Groop, Leif; Gudnason, Vilmundur; Hallmans, Göran; Hamsten, Anders; Hansson, Ola; Harris, Tamara B; Hayward, Caroline; Heath, Simon; Hercberg, Serge; Hicks, Andrew A; Hingorani, Aroon; Hofman, Albert; Hui, Jennie; Hung, Joseph; Jarvelin, Marjo-Riitta; Jhun, Min A; Johnson, Paul C D; Jukema, J Wouter; Jula, Antti; Kao, W H; Kaprio, Jaakko; Kardia, Sharon L R; Keinanen-Kiukaanniemi, Sirkka; Kivimaki, Mika; Kolcic, Ivana; Kovacs, Peter; Kumari, Meena; Kuusisto, Johanna; Kyvik, 
Kirsten Ohm; Laakso, Markku; Lakka, Timo; Lannfelt, Lars; Lathrop, G Mark; Launer, Lenore J; Leander, Karin; Li, Guo; Lind, Lars; Lindstrom, Jaana; Lobbens, Stéphane; Loos, Ruth J F; Luan, Jian'an; Lyssenko, Valeriya; Mägi, Reedik; Magnusson, Patrik K E; Marmot, Michael; Meneton, Pierre; Mohlke, Karen L; Mooser, Vincent; Morken, Mario A; Miljkovic, Iva; Narisu, Narisu; O'Connell, Jeff; Ong, Ken K; Oostra, Ben A; Palmer, Lyle J; Palotie, Aarno; Pankow, James S; Peden, John F; Pedersen, Nancy L; Pehlic, Marina; Peltonen, Leena; Penninx, Brenda; Pericic, Marijana; Perola, Markus; Perusse, Louis; Peyser, Patricia A; Polasek, Ozren; Pramstaller, Peter P; Province, Michael A; Räikkönen, Katri; Rauramaa, Rainer; Rehnberg, Emil; Rice, Ken; Rotter, Jerome I; Rudan, Igor; Ruokonen, Aimo; Saaristo, Timo; Sabater-Lleal, Maria; Salomaa, Veikko; Savage, David B; Saxena, Richa; Schwarz, Peter; Seedorf, Udo; Sennblad, Bengt; Serrano-Rios, Manuel; Shuldiner, Alan R; Sijbrands, Eric J G; Siscovick, David S; Smit, Johannes H; Small, Kerrin S; Smith, Nicholas L; Smith, Albert Vernon; Stančáková, Alena; Stirrups, Kathleen; Stumvoll, Michael; Sun, Yan V; Swift, Amy J; Tönjes, Anke; Tuomilehto, Jaakko; Trompet, Stella; Uitterlinden, Andre G; Uusitupa, Matti; Vikström, Max; Vitart, Veronique; Vohl, Marie-Claude; Voight, Benjamin F; Vollenweider, Peter; Waeber, Gerard; Waterworth, Dawn M; Watkins, Hugh; Wheeler, Eleanor; Widen, Elisabeth; Wild, Sarah H; Willems, Sara M; Willemsen, Gonneke; Wilson, James F; Witteman, Jacqueline C M; Wright, Alan F; Yaghootkar, Hanieh; Zelenika, Diana; Zemunik, Tatijana; Zgaga, Lina; Wareham, Nicholas J; McCarthy, Mark I; Barroso, Ines; Watanabe, Richard M; Florez, Jose C; Dupuis, Josée; Meigs, James B; Langenberg, Claudia

    2012-06-01

    Recent genome-wide association studies have described many loci implicated in type 2 diabetes (T2D) pathophysiology and β-cell dysfunction but have contributed little to the understanding of the genetic basis of insulin resistance. We hypothesized that genes implicated in insulin resistance pathways might be uncovered by accounting for differences in body mass index (BMI) and potential interactions between BMI and genetic variants. We applied a joint meta-analysis approach to test associations with fasting insulin and glucose on a genome-wide scale. We present six previously unknown loci associated with fasting insulin at P < 5 × 10⁻⁸ in combined discovery and follow-up analyses of 52 studies comprising up to 96,496 non-diabetic individuals. Risk variants were associated with higher triglyceride and lower high-density lipoprotein (HDL) cholesterol levels, suggesting a role for these loci in insulin resistance pathways. The discovery of these loci will aid further characterization of the role of insulin resistance in T2D pathophysiology. PMID:22581228
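The joint-test idea can be sketched for a single pair of estimates (hypothetical numbers; the study's actual joint meta-analysis aggregates such statistics across 52 cohorts). With 2 degrees of freedom, the chi-square survival function has the closed form p = exp(-W/2):

```python
import math

def joint_wald_p(beta, cov):
    """2-df Wald test of beta = [SNP main effect, SNP-by-BMI interaction]:
    W = beta' * inv(cov) * beta, p-value = exp(-W / 2) for chi-square(2)."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]  # 2x2 matrix inverse
    w = sum(beta[i] * inv[i][j] * beta[j]
            for i in range(2) for j in range(2))
    return math.exp(-w / 2)

# Hypothetical effect estimates and their covariance matrix:
beta = [0.12, 0.08]
cov = [[4e-4, 1e-4], [1e-4, 4e-4]]
p = joint_wald_p(beta, cov)
print(p < 5e-8)  # does the pair jointly reach genome-wide significance?
```

Testing the main effect and the interaction jointly is what lets loci with modest marginal effects but appreciable BMI interactions surface, which is the rationale of the approach described above.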

  4. A genome-wide approach accounting for body mass index identifies genetic variants influencing fasting glycemic traits and insulin resistance.

    PubMed

    Manning, Alisa K; Hivert, Marie-France; Scott, Robert A; Grimsby, Jonna L; Bouatia-Naji, Nabila; Chen, Han; Rybin, Denis; Liu, Ching-Ti; Bielak, Lawrence F; Prokopenko, Inga; Amin, Najaf; Barnes, Daniel; Cadby, Gemma; Hottenga, Jouke-Jan; Ingelsson, Erik; Jackson, Anne U; Johnson, Toby; Kanoni, Stavroula; Ladenvall, Claes; Lagou, Vasiliki; Lahti, Jari; Lecoeur, Cecile; Liu, Yongmei; Martinez-Larrad, Maria Teresa; Montasser, May E; Navarro, Pau; Perry, John R B; Rasmussen-Torvik, Laura J; Salo, Perttu; Sattar, Naveed; Shungin, Dmitry; Strawbridge, Rona J; Tanaka, Toshiko; van Duijn, Cornelia M; An, Ping; de Andrade, Mariza; Andrews, Jeanette S; Aspelund, Thor; Atalay, Mustafa; Aulchenko, Yurii; Balkau, Beverley; Bandinelli, Stefania; Beckmann, Jacques S; Beilby, John P; Bellis, Claire; Bergman, Richard N; Blangero, John; Boban, Mladen; Boehnke, Michael; Boerwinkle, Eric; Bonnycastle, Lori L; Boomsma, Dorret I; Borecki, Ingrid B; Böttcher, Yvonne; Bouchard, Claude; Brunner, Eric; Budimir, Danijela; Campbell, Harry; Carlson, Olga; Chines, Peter S; Clarke, Robert; Collins, Francis S; Corbatón-Anchuelo, Arturo; Couper, David; de Faire, Ulf; Dedoussis, George V; Deloukas, Panos; Dimitriou, Maria; Egan, Josephine M; Eiriksdottir, Gudny; Erdos, Michael R; Eriksson, Johan G; Eury, Elodie; Ferrucci, Luigi; Ford, Ian; Forouhi, Nita G; Fox, Caroline S; Franzosi, Maria Grazia; Franks, Paul W; Frayling, Timothy M; Froguel, Philippe; Galan, Pilar; de Geus, Eco; Gigante, Bruna; Glazer, Nicole L; Goel, Anuj; Groop, Leif; Gudnason, Vilmundur; Hallmans, Göran; Hamsten, Anders; Hansson, Ola; Harris, Tamara B; Hayward, Caroline; Heath, Simon; Hercberg, Serge; Hicks, Andrew A; Hingorani, Aroon; Hofman, Albert; Hui, Jennie; Hung, Joseph; Jarvelin, Marjo-Riitta; Jhun, Min A; Johnson, Paul C D; Jukema, J Wouter; Jula, Antti; Kao, W H; Kaprio, Jaakko; Kardia, Sharon L R; Keinanen-Kiukaanniemi, Sirkka; Kivimaki, Mika; Kolcic, Ivana; Kovacs, Peter; Kumari, Meena; Kuusisto, Johanna; Kyvik, 
Kirsten Ohm; Laakso, Markku; Lakka, Timo; Lannfelt, Lars; Lathrop, G Mark; Launer, Lenore J; Leander, Karin; Li, Guo; Lind, Lars; Lindstrom, Jaana; Lobbens, Stéphane; Loos, Ruth J F; Luan, Jian'an; Lyssenko, Valeriya; Mägi, Reedik; Magnusson, Patrik K E; Marmot, Michael; Meneton, Pierre; Mohlke, Karen L; Mooser, Vincent; Morken, Mario A; Miljkovic, Iva; Narisu, Narisu; O'Connell, Jeff; Ong, Ken K; Oostra, Ben A; Palmer, Lyle J; Palotie, Aarno; Pankow, James S; Peden, John F; Pedersen, Nancy L; Pehlic, Marina; Peltonen, Leena; Penninx, Brenda; Pericic, Marijana; Perola, Markus; Perusse, Louis; Peyser, Patricia A; Polasek, Ozren; Pramstaller, Peter P; Province, Michael A; Räikkönen, Katri; Rauramaa, Rainer; Rehnberg, Emil; Rice, Ken; Rotter, Jerome I; Rudan, Igor; Ruokonen, Aimo; Saaristo, Timo; Sabater-Lleal, Maria; Salomaa, Veikko; Savage, David B; Saxena, Richa; Schwarz, Peter; Seedorf, Udo; Sennblad, Bengt; Serrano-Rios, Manuel; Shuldiner, Alan R; Sijbrands, Eric J G; Siscovick, David S; Smit, Johannes H; Small, Kerrin S; Smith, Nicholas L; Smith, Albert Vernon; Stančáková, Alena; Stirrups, Kathleen; Stumvoll, Michael; Sun, Yan V; Swift, Amy J; Tönjes, Anke; Tuomilehto, Jaakko; Trompet, Stella; Uitterlinden, Andre G; Uusitupa, Matti; Vikström, Max; Vitart, Veronique; Vohl, Marie-Claude; Voight, Benjamin F; Vollenweider, Peter; Waeber, Gerard; Waterworth, Dawn M; Watkins, Hugh; Wheeler, Eleanor; Widen, Elisabeth; Wild, Sarah H; Willems, Sara M; Willemsen, Gonneke; Wilson, James F; Witteman, Jacqueline C M; Wright, Alan F; Yaghootkar, Hanieh; Zelenika, Diana; Zemunik, Tatijana; Zgaga, Lina; Wareham, Nicholas J; McCarthy, Mark I; Barroso, Ines; Watanabe, Richard M; Florez, Jose C; Dupuis, Josée; Meigs, James B; Langenberg, Claudia

    2012-05-13

    Recent genome-wide association studies have described many loci implicated in type 2 diabetes (T2D) pathophysiology and β-cell dysfunction but have contributed little to the understanding of the genetic basis of insulin resistance. We hypothesized that genes implicated in insulin resistance pathways might be uncovered by accounting for differences in body mass index (BMI) and potential interactions between BMI and genetic variants. We applied a joint meta-analysis approach to test associations with fasting insulin and glucose on a genome-wide scale. We present six previously unknown loci associated with fasting insulin at P < 5 × 10⁻⁸ in combined discovery and follow-up analyses of 52 studies comprising up to 96,496 non-diabetic individuals. Risk variants were associated with higher triglyceride and lower high-density lipoprotein (HDL) cholesterol levels, suggesting a role for these loci in insulin resistance pathways. The discovery of these loci will aid further characterization of the role of insulin resistance in T2D pathophysiology.
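    The core of the joint main-effect-plus-interaction idea can be illustrated with a minimal numpy sketch: fit a model with SNP and SNP × BMI terms and test both jointly with a 2-degree-of-freedom F-test. The data below are simulated and the effect sizes are illustrative; the study itself combined study-level estimates in a meta-analysis, which this sketch does not reproduce.

    ```python
    import numpy as np

    def joint_2df_ftest(y, bmi, snp):
        """F-statistic for H0: SNP main effect = SNP x BMI interaction = 0."""
        n = len(y)
        X_full = np.column_stack([np.ones(n), bmi, snp, snp * bmi])
        X_null = np.column_stack([np.ones(n), bmi])

        def rss(X):
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ beta
            return resid @ resid

        rss_full, rss_null = rss(X_full), rss(X_null)
        q = X_full.shape[1] - X_null.shape[1]          # 2 restrictions
        return ((rss_null - rss_full) / q) / (rss_full / (n - X_full.shape[1]))

    rng = np.random.default_rng(0)
    n = 5000
    bmi = rng.normal(27, 4, n)
    snp = rng.binomial(2, 0.3, n)                      # additive 0/1/2 genotype coding
    # simulated fasting insulin: small main effect plus a BMI-dependent SNP effect
    y = 1.0 + 0.02 * bmi + 0.05 * snp + 0.01 * snp * (bmi - 27) + rng.normal(0, 0.5, n)

    f_stat = joint_2df_ftest(y, bmi, snp)
    print(f_stat)      # compare to an F(2, n-4) distribution; ~6.9 corresponds to P < 0.001
    ```

    The joint test gains power when a variant's effect is partly masked by, or varies with, BMI, which is exactly the scenario motivating the search for insulin-resistance loci.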

  5. What is narrative therapy and what is it not?: the usefulness of Q methodology to explore accounts of White and Epston's (1990) approach to narrative therapy.

    PubMed

    Wallis, Jennifer; Burns, Jan; Capdevila, Rose

    2011-01-01

    OBJECTIVE. 'What is narrative therapy and how do you do it?' is a question repeatedly asked of narrative therapy, with little consistent response. This study aimed to explore and distil the 'common themes' in practitioner definitions of White and Epston's approach to narrative therapy. DESIGN. This was an Internet-based study of current UK practitioners of this type of narrative therapy, using a unique combination of a Delphi Panel and Q methodology. METHOD. A group of experienced practitioners was recruited into the Delphi Panel and asked two questions about what narrative therapy is and is not, and what techniques are and are not employed. These data, combined with other information, formed the statements of a Q-sort that was then administered to a wider range of narrative practitioners. FINDINGS. The Delphi Panel agreed on a number of key points relating to the theory, politics and practice of narrative therapy. The Q-sort produced eight distinct accounts of narrative therapy and a number of dimensions along which these different positions could be distinguished, including narrative therapy as a political stance and integration with other approaches. CONCLUSIONS. For any therapeutic model to demonstrate its efficacy and attract proponents, an accepted definition of its components and practice should preferably be established. This study has provided such data for the UK application of White and Epston's narrative therapy, which may assist in forming a firmer base for further research and practice.

  6. Binomial Mixture Model Based Association Testing to Account for Genetic Heterogeneity for GWAS.

    PubMed

    Xu, Zhiyuan; Pan, Wei

    2016-04-01

    Genome-wide association studies (GWAS) have confirmed the ubiquitous existence of genetic heterogeneity for common diseases: multiple common genetic variants have been identified to be associated, while many more are expected to be uncovered. However, the single-SNP (single-nucleotide polymorphism) trend test (or its variants) that has dominated GWAS contrasts the allele frequency difference between the case and control groups, completely ignoring possible genetic heterogeneity. In spite of the widely accepted notion of genetic heterogeneity, we are not aware of any previous attempt to apply genetic-heterogeneity-motivated methods in GWAS. Here, to explicitly account for unknown genetic heterogeneity, we applied a mixture model based single-SNP test to the Wellcome Trust Case Control Consortium (WTCCC) GWAS data for Crohn's disease, bipolar disorder, coronary artery disease, and type 2 diabetes, identifying much larger numbers of significant SNPs and risk loci for each trait than the popular trend test, demonstrating the potential power gain of the mixture model based test.
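    As a rough illustration of the heterogeneity idea (not the authors' actual test statistic), one can fit a two-component binomial mixture to case genotype counts by EM and compare its maximized likelihood to a single-binomial fit. All parameter values below are illustrative.

    ```python
    import numpy as np

    def binom2_pmf(k, p):
        """Binomial(2, p) pmf for genotype counts k in {0, 1, 2}."""
        return np.where(k == 0, (1 - p) ** 2, np.where(k == 1, 2 * p * (1 - p), p ** 2))

    def mixture_loglik(k, iters=300):
        """EM fit of a two-component Binomial(2, p) mixture; returns max log-likelihood."""
        w, p1, p2 = 0.5, 0.15, 0.45                    # crude starting values
        for _ in range(iters):
            r1 = w * binom2_pmf(k, p1)
            r2 = (1 - w) * binom2_pmf(k, p2)
            g = r1 / (r1 + r2)                         # responsibilities of component 1
            w = g.mean()
            p1 = (g * k).sum() / (2 * g.sum())
            p2 = ((1 - g) * k).sum() / (2 * (1 - g).sum())
        return np.log(w * binom2_pmf(k, p1) + (1 - w) * binom2_pmf(k, p2)).sum()

    # heterogeneous cases: 30% belong to a subtype with risk-allele frequency 0.5,
    # the rest carry the background frequency 0.2
    rng = np.random.default_rng(0)
    subtype = rng.random(2000) < 0.3
    k = np.where(subtype, rng.binomial(2, 0.5, 2000), rng.binomial(2, 0.2, 2000))

    p_hat = k.mean() / 2                               # single-binomial MLE
    ll0 = np.log(binom2_pmf(k, p_hat)).sum()
    lrt = 2 * (mixture_loglik(k) - ll0)
    print(lrt)   # large values indicate a heterogeneous genotype distribution
    ```

    Note that the null distribution of this likelihood-ratio statistic is non-standard (the mixing weight sits on a boundary under the null), so in practice calibration requires care; the sketch only shows why a mixture can detect structure a single-frequency contrast misses.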

  7. Accountable priority setting for trust in health systems - the need for research into a new approach for strengthening sustainable health action in developing countries

    PubMed Central

    Byskov, Jens; Bloch, Paul; Blystad, Astrid; Hurtig, Anna-Karin; Fylkesnes, Knut; Kamuzora, Peter; Kombe, Yeri; Kvåle, Gunnar; Marchal, Bruno; Martin, Douglas K; Michelo, Charles; Ndawi, Benedict; Ngulube, Thabale J; Nyamongo, Isaac; Olsen, Øystein E; Onyango-Ouma, Washington; Sandøy, Ingvild F; Shayo, Elizabeth H; Silwamba, Gavin; Songstad, Nils Gunnar; Tuba, Mary

    2009-01-01

    Despite multiple efforts to strengthen health systems in low and middle income countries, intended sustainable improvements in health outcomes have not been shown. To date most priority setting initiatives in health systems have mainly focused on technical approaches involving information derived from burden of disease statistics, cost effectiveness analysis, and published clinical trials. However, priority setting involves value-laden choices and these technical approaches do not equip decision-makers to address a broader range of relevant values - such as trust, equity, accountability and fairness - that are of concern to other partners and, not least, the populations concerned. A new focus for priority setting is needed. Accountability for Reasonableness (AFR) is an explicit ethical framework for legitimate and fair priority setting that provides guidance for decision-makers who must identify and consider the full range of relevant values. AFR consists of four conditions: i) relevance to the local setting, decided by agreed criteria; ii) publicizing priority-setting decisions and the reasons behind them; iii) the establishment of revisions/appeal mechanisms for challenging and revising decisions; iv) the provision of leadership to ensure that the first three conditions are met. REACT - "REsponse to ACcountable priority setting for Trust in health systems" is an EU-funded five-year intervention study started in 2006, which is testing the application and effects of the AFR approach in one district each in Kenya, Tanzania and Zambia. The objectives of REACT are to describe and evaluate district-level priority setting, to develop and implement improvement strategies guided by AFR and to measure their effect on quality, equity and trust indicators. Effects are monitored within selected disease and programme interventions and services and within human resources and health systems management. Qualitative and quantitative methods are being applied in an action research

  8. A recency-based account of the list length effect in free recall.

    PubMed

    Ward, Geoff

    2002-09-01

    Free recall was examined using the overt rehearsal methodology with lists of 10, 20, and 30 words. The standard list length effects were obtained: as list length increased, there was an increase in the number and a decrease in the proportion of words recalled. There were significant primacy and recency effects at all list lengths. However, when the data were replotted in terms of when the words were last rehearsed, recall was characterized by extended recency effects, and the data from the different list lengths were superimposed upon one another. These findings support a recency-based account of episodic memory. The list length effect reflects two facts: unrehearsed words are less recent with longer lists, and with longer lists a reduced proportion of primacy and middle items may be rehearsed to later positions.

  9. 24 CFR 990.280 - Project-based budgeting and accounting.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... accounting. 990.280 Section 990.280 Housing and Urban Development Regulations Relating to Housing and Urban... budgeting and accounting. (a) All PHAs covered by this subpart shall develop and maintain a system of budgeting and accounting for each project in a manner that allows for analysis of the actual revenues...

  10. Putting retrieval-induced forgetting in context: an inhibition-free, context-based account.

    PubMed

    Jonker, Tanya R; Seli, Paul; MacLeod, Colin M

    2013-10-01

    We present a new theoretical account of retrieval-induced forgetting (RIF) together with new experimental evidence that fits this account and challenges the dominant inhibition account. RIF occurs when the retrieval of some material from memory produces later forgetting of related material. The inhibition account asserts that RIF is the result of an inhibition mechanism that acts during retrieval to suppress the representations of interfering competitors. This inhibition is enduring, such that the suppressed material is difficult to access on a later test and is, therefore, recalled more poorly than baseline material. Although the inhibition account is widely accepted, a growing body of research challenges its fundamental assumptions. Our alternative account of RIF instead emphasizes the role of context in remembering. According to this context account, both of 2 tenets must be met for RIF to occur: (a) A context change must occur between study and subsequent retrieval practice, and (b) the retrieval practice context must be the active context during the final test when testing practiced categories. The results of 3 experiments, which directly test the divergent predictions of the 2 accounts, support the context account but cannot be explained by the inhibition account. In an extensive discussion, we survey the literature on RIF and apply our context account to the key findings, demonstrating the explanatory power of context.

  11. An APEL Tool Based CPU Usage Accounting Infrastructure for Large Scale Computing Grids

    NASA Astrophysics Data System (ADS)

    Jiang, Ming; Novales, Cristina Del Cano; Mathieu, Gilles; Casson, John; Rogers, William; Gordon, John

    APEL (Accounting Processor for Event Logs) is the fundamental tool of the CPU usage accounting infrastructure deployed within the WLCG and EGEE Grids. In these Grids, jobs are submitted by users to computing resources via a Grid Resource Broker (e.g. the gLite Workload Management System). As a log processing tool, APEL interprets Grid gatekeeper logs (e.g. globus) and batch system logs (e.g. PBS, LSF, SGE and Condor) to produce CPU job accounting records identified with Grid identities. These records provide a complete description of the usage of computing resources by users' jobs. APEL publishes accounting records into an accounting record repository at a Grid Operations Centre (GOC), where they can be accessed from a GUI web tool. The functions of log file parsing, record generation and publication are implemented by the APEL Parser, APEL Core, and APEL Publisher components respectively. Within the distributed accounting infrastructure, accounting records are transported from APEL Publishers at Grid sites to either a regionalised accounting system or the central one, by choice, via a common ActiveMQ message broker network. This provides an open transport layer for other accounting systems to publish relevant accounting data to a central accounting repository via the unified interface provided by an APEL Publisher, and also gives regional/National Grid Initiative (NGI) Grids flexibility in their choice of accounting system. The robust and secure delivery of accounting record messages at the NGI level, and between NGI accounting instances and the central one, is achieved by using configurable APEL Publishers and an ActiveMQ message broker network.
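    The parsing-to-record step can be pictured with a small sketch: extract job fields from a batch-system log line and emit a usage record. The log format and the record field names below are hypothetical, not APEL's actual schema or code.

    ```python
    import re

    # Hypothetical batch-system log line (format is illustrative only)
    LOG = "2009-03-01T12:00:05 jobid=42137 user=alice queue=grid cput=5321s wall=6000s"

    PATTERN = re.compile(
        r"(?P<ts>\S+) jobid=(?P<jobid>\d+) user=(?P<user>\w+) "
        r"queue=(?P<queue>\w+) cput=(?P<cput>\d+)s wall=(?P<wall>\d+)s"
    )

    def parse_usage_record(line, site="EXAMPLE-SITE"):
        """Turn one batch log line into a CPU usage accounting record (dict)."""
        m = PATTERN.match(line)
        if m is None:
            raise ValueError("unparseable log line")
        f = m.groupdict()
        return {
            "Site": site,
            "LocalJobId": f["jobid"],
            "LocalUserId": f["user"],
            "Queue": f["queue"],
            "CpuDuration": int(f["cput"]),      # seconds
            "WallDuration": int(f["wall"]),     # seconds
            "EndTime": f["ts"],
        }

    record = parse_usage_record(LOG)
    print(record["CpuDuration"])   # 5321
    ```

    In the real infrastructure such records would additionally be joined with gatekeeper logs to attach Grid identities, then serialized and handed to a publisher for transport over the message broker network.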

  12. Evaluating face trustworthiness: a model based approach

    PubMed Central

    Baron, Sean G.; Oosterhof, Nikolaas N.

    2008-01-01

    Judgments of trustworthiness from faces determine basic approach/avoidance responses and approximate the valence evaluation of faces that runs across multiple person judgments. Here, based on trustworthiness judgments and using a computer model for face representation, we built a model for representing face trustworthiness (study 1). Using this model, we generated novel faces with an increased range of trustworthiness and used these faces as stimuli in a functional Magnetic Resonance Imaging study (study 2). Although participants did not engage in explicit evaluation of the faces, the amygdala response changed as a function of face trustworthiness. An area in the right amygdala showed a negative linear response—as the untrustworthiness of faces increased so did the amygdala response. Areas in the left and right putamen, the latter area extended into the anterior insula, showed a similar negative linear response. The response in the left amygdala was quadratic—strongest for faces on both extremes of the trustworthiness dimension. The medial prefrontal cortex and precuneus also showed a quadratic response, but their response was strongest to faces in the middle range of the trustworthiness dimension. PMID:19015102
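    The linear and quadratic response profiles described above amount to regressing a region's signal on trustworthiness and its square. A toy numpy sketch with simulated data (the U-shaped signal and all coefficients are illustrative, not the study's measurements):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    trust = np.linspace(-3, 3, 60)                     # face trustworthiness scores
    # simulated signal that is strongest at both extremes, plus noise
    bold = 0.5 * trust ** 2 - 0.1 * trust + rng.normal(0, 0.3, 60)

    # regress the signal on linear and quadratic trustworthiness terms
    X = np.column_stack([np.ones_like(trust), trust, trust ** 2])
    beta, *_ = np.linalg.lstsq(X, bold, rcond=None)
    print(beta)   # beta[1]: linear profile; beta[2] > 0: extremes-strongest quadratic profile
    ```

    A negative linear coefficient would correspond to the right-amygdala pattern (stronger response to less trustworthy faces), while a positive quadratic coefficient corresponds to the left-amygdala pattern.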

  13. Evaluating face trustworthiness: a model based approach.

    PubMed

    Todorov, Alexander; Baron, Sean G; Oosterhof, Nikolaas N

    2008-06-01

    Judgments of trustworthiness from faces determine basic approach/avoidance responses and approximate the valence evaluation of faces that runs across multiple person judgments. Here, based on trustworthiness judgments and using a computer model for face representation, we built a model for representing face trustworthiness (study 1). Using this model, we generated novel faces with an increased range of trustworthiness and used these faces as stimuli in a functional Magnetic Resonance Imaging study (study 2). Although participants did not engage in explicit evaluation of the faces, the amygdala response changed as a function of face trustworthiness. An area in the right amygdala showed a negative linear response: as the untrustworthiness of faces increased so did the amygdala response. Areas in the left and right putamen, the latter area extended into the anterior insula, showed a similar negative linear response. The response in the left amygdala was quadratic: strongest for faces on both extremes of the trustworthiness dimension. The medial prefrontal cortex and precuneus also showed a quadratic response, but their response was strongest to faces in the middle range of the trustworthiness dimension. PMID:19015102

  14. Pattern recognition tool based on complex network-based approach

    NASA Astrophysics Data System (ADS)

    Casanova, Dalcimar; Backes, André Ricardo; Martinez Bruno, Odemir

    2013-02-01

    This work proposes a generalization of the authors' earlier method, 'A complex network-based approach for boundary shape analysis'. Instead of modelling a contour as a graph and using complex network measures to characterize it, the technique is generalized into a mathematical tool for characterizing signals, curves and sets of points. To evaluate the descriptive power of the proposal, an experiment on plant identification based on leaf vein images was conducted. Leaf venation is a taxonomic characteristic used for plant identification, and these structures are complex: they are difficult to represent as signals or curves, and hence difficult to analyze with classical pattern recognition approaches. Here, the veins are modelled as a set of points and represented as graphs. As features, the degree and joint degree measurements are used in a dynamic evolution. The results demonstrate that the technique has good discriminative power and can be used for plant identification, as well as for other complex pattern recognition tasks.
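    The "dynamic evolution" device used in this family of methods can be sketched as follows: connect points whose distance falls below a threshold, sweep the threshold, and record a degree statistic at each step as the feature vector. This is a minimal sketch of the general idea, not the authors' exact measurements (which also include joint degree).

    ```python
    import numpy as np

    def network_signature(points, radii):
        """Degree-based features of a point set modelled as a graph.

        An edge joins two points whose distance is below a threshold; the
        threshold is swept over `radii` (the "dynamic evolution") and the
        mean normalized degree at each step forms the feature vector.
        """
        pts = np.asarray(points, dtype=float)
        dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        n = len(pts)
        feats = []
        for r in radii:
            adj = (dist < r) & ~np.eye(n, dtype=bool)    # no self-loops
            feats.append(adj.sum(axis=1).mean() / (n - 1))
        return np.array(feats)

    rng = np.random.default_rng(1)
    sig = network_signature(rng.random((50, 2)), radii=[0.1, 0.2, 0.4, 0.8])
    print(sig)        # non-decreasing mean degree as the threshold grows
    ```

    Two point sets with different spatial organization (e.g. different venation patterns) trace out different signatures, which can then feed any standard classifier.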

  15. Toward a Human Resource Accounting (HRA)-Based Model for Designing an Organizational Effectiveness Audit in Education.

    ERIC Educational Resources Information Center

    Myroon, John L.

    The major purpose of this paper was to develop a Human Resource Accounting (HRA) macro-model that could be used for designing a school organizational effectiveness audit. Initially, the paper reviewed the advent and definition of HRA. In order to develop the proposed model, the different approaches to measuring effectiveness were reviewed,…

  16. Nanotechnology-Based Approaches for Guiding Neural Regeneration.

    PubMed

    Shah, Shreyas; Solanki, Aniruddh; Lee, Ki-Bum

    2016-01-19

    The mammalian brain is a phenomenal piece of "organic machinery" that has fascinated scientists and clinicians for centuries. The intricate network of tens of billions of neurons dispersed in a mixture of chemical and biochemical constituents gives rise to thoughts, feelings, memories, and life as we know it. In turn, subtle imbalances or damage to this system can cause severe complications in physical, motor, psychological, and cognitive function. Moreover, the inevitable loss of nerve tissue caused by degenerative diseases and traumatic injuries is particularly devastating because of the limited regenerative capabilities of the central nervous system (i.e., the brain and spinal cord). Among current approaches, stem-cell-based regenerative medicine has shown the greatest promise toward repairing and regenerating destroyed neural tissue. However, establishing controlled and reliable methodologies to guide stem cell differentiation into specialized neural cells of interest (e.g., neurons and oligodendrocytes) has been a prevailing challenge in the field. In this Account, we summarize the nanotechnology-based approaches our group has recently developed to guide stem-cell-based neural regeneration. We focus on three overarching strategies that were adopted to selectively control this process. First, soluble microenvironmental factors play a critical role in directing the fate of stem cells. Multiple factors have been developed in the form of small-molecule drugs, biochemical analogues, and DNA/RNA-based vectors to direct neural differentiation. However, the delivery of these factors with high transfection efficiency and minimal cytotoxicity has been challenging, especially to sensitive cell lines such as stem cells. In our first approach, we designed nanoparticle-based systems for the efficient delivery of such soluble factors to control neural differentiation. Our nanoparticles, comprising either organic or inorganic elements, were biocompatible and offered

  18. Concurrency-based approaches to parallel programming

    SciTech Connect

    Kale, L.V.; Chrisochoides, N.; Kohl, J.

    1995-07-17

    The inevitable transition to parallel programming can be facilitated by appropriate tools, including languages and libraries. After describing the needs of applications developers, this paper presents three specific approaches aimed at development of efficient and reusable parallel software for irregular and dynamic-structured problems. A salient feature of all three approaches is their exploitation of concurrency within a processor. Benefits of individual approaches such as these can be leveraged by an interoperability environment which permits modules written using different approaches to co-exist in single applications.

  19. Concurrency-based approaches to parallel programming

    NASA Technical Reports Server (NTRS)

    Kale, L.V.; Chrisochoides, N.; Kohl, J.; Yelick, K.

    1995-01-01

    The inevitable transition to parallel programming can be facilitated by appropriate tools, including languages and libraries. After describing the needs of applications developers, this paper presents three specific approaches aimed at development of efficient and reusable parallel software for irregular and dynamic-structured problems. A salient feature of all three approaches is their exploitation of concurrency within a processor. Benefits of individual approaches such as these can be leveraged by an interoperability environment which permits modules written using different approaches to co-exist in single applications.

  20. Approach to proliferation risk assessment based on multiple objective analysis framework

    SciTech Connect

    Andrianov, A.; Kuptsov, I.

    2013-07-01

    The approach to the assessment of proliferation risk using methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows taking into account the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities). Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Second, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Third, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.
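    At its simplest, a multi-criteria attractiveness comparison reduces to scoring scenarios against weighted criteria and ranking the results. The scenarios, criteria, scores and weights below are entirely illustrative and are not taken from the paper:

    ```python
    import numpy as np

    # Hypothetical attractiveness criteria for three proliferation scenarios
    # (rows); columns are oriented so that higher = more attractive to a
    # proliferator. Scores and weights are illustrative only.
    scores = np.array([
        [0.9, 0.4, 0.3],   # clandestine HEU enrichment
        [0.6, 0.7, 0.5],   # theft/diversion from fuel-cycle facilities
        [0.4, 0.8, 0.7],   # undeclared Pu production in thermal reactors
    ])
    weights = np.array([0.5, 0.3, 0.2])   # relative importance of each criterion

    attractiveness = scores @ weights      # weighted-sum aggregation
    ranking = np.argsort(attractiveness)[::-1]
    print(attractiveness, ranking)
    ```

    Real multi-objective analyses go beyond a single weighted sum (e.g. Pareto fronts, sensitivity to weights), but the weighted aggregation above is the basic building block.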

  1. Reuse: A knowledge-based approach

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui

    1992-01-01

    This paper describes our research in automating the reuse process through the use of application domain models. Application domain models are explicit formal representations of the application knowledge necessary to understand, specify, and generate application programs. Furthermore, they provide a unified repository for the operational structure, rules, policies, and constraints of a specific application area. In our approach, domain models are expressed in terms of a transaction-based meta-modeling language. This paper has described in detail the creation and maintenance of hierarchical structures. These structures are created through a process that includes reverse engineering of data models with supplementary enhancement from application experts. Source code is also reverse engineered but is not a major source of domain model instantiation at this time. In the second phase of the software synthesis process, program specifications are interactively synthesized from an instantiated domain model. These specifications are currently integrated into a manual programming process but will eventually be used to derive executable code with mechanically assisted transformations. This research is performed within the context of programming-in-the-large types of systems. Although our goals are ambitious, we are implementing the synthesis system in an incremental manner through which we can realize tangible results. The client/server architecture is capable of supporting 16 simultaneous X/Motif users and tens of thousands of attributes and classes. Domain models have been partially synthesized from five different application areas. As additional domain models are synthesized and additional knowledge is gathered, we will inevitably add to and modify our representation. However, our current experience indicates that it will scale and expand to meet our modeling needs.

  2. Involving Diverse Communities of Practice to Minimize Unintended Consequences of Test-Based Accountability Systems

    ERIC Educational Resources Information Center

    Behizadeh, Nadia; Engelhard, George, Jr.

    2015-01-01

    In his focus article, Koretz (this issue) argues that accountability has become the primary function of large-scale testing in the United States. He then points out that tests being used for accountability purposes are flawed and that the high-stakes nature of these tests creates a context that encourages score inflation. Koretz is concerned about…

  3. Design of a Competency-Based Assessment Model in the Field of Accounting

    ERIC Educational Resources Information Center

    Ciudad-Gómez, Adelaida; Valverde-Berrocoso, Jesús

    2012-01-01

    This paper presents the phases involved in the design of a methodology to contribute both to the acquisition of competencies and to their assessment in the field of Financial Accounting, within the European Higher Education Area (EHEA) framework, which we call MANagement of COMpetence in the areas of Accounting (MANCOMA). Having selected and…

  4. Future Performance Trend Indicators: A Current Value Approach to Human Resources Accounting. Report I. Internal Consistencies and Relationships to Performance By Site. Final Report.

    ERIC Educational Resources Information Center

    Pecorella, Patricia A.; Bowers, David G.

    Analyses preparatory to construction of a suitable file for generating a system of future performance trend indicators are described. Such a system falls into the category of a current value approach to human resources accounting. It requires that there be a substantial body of data which: (1) uses the work group or unit, not the individual, as…

  5. Structure-Based Statistical Mechanical Model Accounts for the Causality and Energetics of Allosteric Communication.

    PubMed

    Guarnera, Enrico; Berezovsky, Igor N

    2016-03-01

    Allostery is one of the pervasive mechanisms through which proteins in living systems carry out enzymatic activity, cell signaling, and metabolism control. Effective modeling of the protein function regulation requires a synthesis of the thermodynamic and structural views of allostery. We present here a structure-based statistical mechanical model of allostery, allowing one to observe causality of communication between regulatory and functional sites, and to estimate per residue free energy changes. Based on the consideration of ligand free and ligand bound systems in the context of a harmonic model, corresponding sets of characteristic normal modes are obtained and used as inputs for an allosteric potential. This potential quantifies the mean work exerted on a residue due to the local motion of its neighbors. Subsequently, in a statistical mechanical framework the entropic contribution to allosteric free energy of a residue is directly calculated from the comparison of conformational ensembles in the ligand free and ligand bound systems. As a result, this method provides a systematic approach for analyzing the energetics of allosteric communication based on a single structure. The feasibility of the approach was tested on a variety of allosteric proteins, heterogeneous in terms of size, topology and degree of oligomerization. The allosteric free energy calculations show the diversity of ways and complexity of scenarios existing in the phenomenology of allosteric causality and communication. The presented model is a step forward in developing the computational techniques aimed at detecting allosteric sites and obtaining the discriminative power between agonistic and antagonistic effectors, which are among the major goals in allosteric drug design. PMID:26939022
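    The "harmonic model → normal modes" step at the heart of this approach can be sketched with a toy elastic network model (a standard ANM-style construction, not the authors' allosteric potential): build the Hessian of a spring network over residue coordinates and eigendecompose it; the six zero-eigenvalue modes are rigid-body motions and the remainder are the internal normal modes that would feed the allosteric potential.

    ```python
    import numpy as np

    def enm_hessian(coords, cutoff=10.0, k=1.0):
        """Hessian of an elastic network model: harmonic springs of stiffness
        k between all residue pairs closer than `cutoff`."""
        n = len(coords)
        H = np.zeros((3 * n, 3 * n))
        for i in range(n):
            for j in range(i + 1, n):
                d = coords[j] - coords[i]
                r2 = d @ d
                if r2 < cutoff ** 2:
                    block = -k * np.outer(d, d) / r2
                    H[3*i:3*i+3, 3*j:3*j+3] = block
                    H[3*j:3*j+3, 3*i:3*i+3] = block
                    H[3*i:3*i+3, 3*i:3*i+3] -= block
                    H[3*j:3*j+3, 3*j:3*j+3] -= block
        return H

    rng = np.random.default_rng(4)
    coords = rng.random((5, 3))                # toy 5-residue "protein"
    w, V = np.linalg.eigh(enm_hessian(coords))
    internal = w[w > 1e-8]                     # drop the 6 rigid-body zero modes
    print(len(internal))                       # 3n - 6 = 9 internal normal modes
    ```

    Comparing such mode sets for ligand-free and ligand-bound systems, and turning per-residue fluctuation changes into entropic free-energy estimates, is where the paper's specific statistical mechanical machinery takes over.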

  7. Structure-Based Statistical Mechanical Model Accounts for the Causality and Energetics of Allosteric Communication

    PubMed Central

    Guarnera, Enrico; Berezovsky, Igor N.

    2016-01-01

    Allostery is one of the pervasive mechanisms through which proteins in living systems carry out enzymatic activity, cell signaling, and metabolism control. Effective modeling of the protein function regulation requires a synthesis of the thermodynamic and structural views of allostery. We present here a structure-based statistical mechanical model of allostery, allowing one to observe causality of communication between regulatory and functional sites, and to estimate per residue free energy changes. Based on the consideration of ligand free and ligand bound systems in the context of a harmonic model, corresponding sets of characteristic normal modes are obtained and used as inputs for an allosteric potential. This potential quantifies the mean work exerted on a residue due to the local motion of its neighbors. Subsequently, in a statistical mechanical framework the entropic contribution to allosteric free energy of a residue is directly calculated from the comparison of conformational ensembles in the ligand free and ligand bound systems. As a result, this method provides a systematic approach for analyzing the energetics of allosteric communication based on a single structure. The feasibility of the approach was tested on a variety of allosteric proteins, heterogeneous in terms of size, topology and degree of oligomerization. The allosteric free energy calculations show the diversity of ways and complexity of scenarios existing in the phenomenology of allosteric causality and communication. The presented model is a step forward in developing the computational techniques aimed at detecting allosteric sites and obtaining the discriminative power between agonistic and antagonistic effectors, which are among the major goals in allosteric drug design. PMID:26939022

  8. A Monte Carlo-based model of gold nanoparticle radiosensitization accounting for increased radiobiological effectiveness.

    PubMed

    Lechtman, E; Mashouf, S; Chattopadhyay, N; Keller, B M; Lai, P; Cai, Z; Reilly, R M; Pignol, J-P

    2013-05-21

    Radiosensitization using gold nanoparticles (AuNPs) has been shown to vary widely with cell line, irradiation energy, AuNP size, concentration and intracellular localization. We developed a Monte Carlo-based AuNP radiosensitization predictive model (ARP), which takes into account the detailed energy deposition at the nano-scale. This model was compared to experimental cell survival and macroscopic dose enhancement predictions. PC-3 prostate cancer cell survival was characterized after irradiation using a 300 kVp photon source with and without AuNPs present in the cell culture media. Detailed Monte Carlo simulations were conducted, producing individual tracks of photoelectric products escaping AuNPs and energy deposition was scored in nano-scale voxels in a model cell nucleus. Cell survival in our predictive model was calculated by integrating the radiation induced lethal event density over the nucleus volume. Experimental AuNP radiosensitization was observed with a sensitizer enhancement ratio (SER) of 1.21 ± 0.13. SERs estimated using the ARP model and the macroscopic enhancement model were 1.20 ± 0.12 and 1.07 ± 0.10 respectively. In the hypothetical case of AuNPs localized within the nucleus, the ARP model predicted a SER of 1.29 ± 0.13, demonstrating the influence of AuNP intracellular localization on radiosensitization.

  9. Grid-cell-based crop water accounting for the famine early warning system

    USGS Publications Warehouse

    Verdin, J.; Klaver, R.

    2002-01-01

    Rainfall monitoring is a regular activity of food security analysts for sub-Saharan Africa due to the potentially disastrous impact of drought. Crop water accounting schemes are used to track rainfall timing and amounts relative to phenological requirements, to infer water limitation impacts on yield. Unfortunately, many rain gauge reports are available only after significant delays, and the gauge locations leave large gaps in coverage. As an alternative, a grid-cell-based formulation for the water requirement satisfaction index (WRSI) was tested for maize in Southern Africa. Grids of input variables were obtained from remote sensing estimates of rainfall, meteorological models, and digital soil maps. The spatial WRSI was computed for the 1996-97 and 1997-98 growing seasons. Maize yields were estimated by regression and compared with a limited number of reports from the field for the 1996-97 season in Zimbabwe. Agreement at a useful level (r = 0.80) was observed. This is comparable to results from traditional analysis with station data. The findings demonstrate the complementary role that remote sensing, modelling, and geospatial analysis can play in an era when field data collection in sub-Saharan Africa is suffering an unfortunate decline. Published in 2002 by John Wiley & Sons, Ltd.
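    The WRSI described above tracks rainfall against crop water requirements through the growing season. A minimal bucket-model sketch; the soil-capacity handling and final scaling here are illustrative assumptions, not the operational FEWS formulation:

```python
def wrsi(rain, et_req, soil_capacity=100.0):
    """Water Requirement Satisfaction Index over a growing season.
    rain, et_req: per-dekad rainfall and crop water requirement (mm).
    Returns a value in [0, 100]; 100 means requirements fully met."""
    soil = soil_capacity          # start with a full soil-moisture bucket
    total_deficit = 0.0
    for r, req in zip(rain, et_req):
        available = soil + r
        if available >= req:
            # surplus recharges the soil bucket; any excess runs off
            soil = min(available - req, soil_capacity)
        else:
            total_deficit += req - available
            soil = 0.0
    return max(0.0, 100.0 * (1.0 - total_deficit / sum(et_req)))
```

    Run on gridded rainfall and requirement surfaces cell by cell, this yields the spatial WRSI fields that the study regressed against reported maize yields.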

  10. Stereoscopic ground-based determination of the cloud base height: theory of camera position calibration with account for lens distortion

    NASA Astrophysics Data System (ADS)

    Chulichkov, Alexey I.; Postylyakov, Oleg V.

    2016-05-01

    For the reconstruction of some geometrical characteristics of clouds, a method was developed based on taking pictures of the sky with a pair of digital cameras and processing the resulting sequence of stereo frames to obtain the cloud base height. Since the directions of the optical axes of the stereo cameras are not exactly known, a procedure for adjusting the obtained frames was developed that uses photographs of the night starry sky. In the second step, morphological image analysis is used to determine the relative shift of the coordinates of a cloud fragment; this shift is used to estimate the cloud base height. The proposed method can be used for automatic processing of stereo data to retrieve the cloud base height. An earlier paper described a mathematical model of the stereophotography measurement and posed and solved the problem of adjusting the optical axes of the cameras in the paraxial (first-order geometric optics) approximation, applied to the central part of the sky frames. This paper describes a model of the experiment that takes lens distortion into account in the Seidel approximation (terms of third order in the distance from the optical axis). We developed a procedure for simultaneous camera position calibration and estimation of the lens distortion parameters in the Seidel approximation.
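    Once the frames are adjusted, the cloud base height follows from the stereo parallax. A minimal pinhole-camera sketch, assuming parallel zenith-pointing optical axes and ignoring the lens distortion that the paper models; all parameter values in the example are hypothetical:

```python
def cloud_base_height(baseline_m, focal_mm, pixel_um, disparity_px):
    """Cloud base height from stereo disparity for two pinhole cameras
    pointed at zenith with parallel optical axes (paraxial approximation).
    A feature at height h produces an image-plane disparity f*B/h."""
    focal_m = focal_mm * 1e-3
    shift_m = disparity_px * pixel_um * 1e-6   # disparity on the sensor
    return baseline_m * focal_m / shift_m
```

    For example, a 100 m baseline, 50 mm lens, 5 µm pixels, and a measured 500-pixel disparity correspond to a cloud base near 2 km; lens distortion correction matters precisely because a few pixels of error translate into large height errors at small disparities.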

  11. Nanotechnology based approaches in cancer therapeutics

    NASA Astrophysics Data System (ADS)

    Kumer Biswas, Amit; Reazul Islam, Md; Sadek Choudhury, Zahid; Mostafa, Asif; Fahim Kadir, Mohammad

    2014-12-01

    The current decades are marked not by the development of new molecules for the cure of various diseases but rather by the development of new delivery methods for optimal treatment outcomes. Nanomedicine is perhaps playing the biggest role in this regard. Nanomedicine offers numerous advantages over conventional drug delivery approaches and is a particularly hot topic in anticancer research. Nanoparticles (NPs) have many unique properties that enable them to be incorporated in anticancer therapy. This topical review aims to look at the properties and various forms of NPs and their use in anticancer treatment, recent development of the process of identifying new delivery approaches, as well as progress in clinical trials with these newer approaches. Although the outcome of cancer therapy can be improved using nanomedicine, there are still many disadvantages to this approach. We aim to discuss all these issues in this review.

  12. Acid-base accounting assessment of mine wastes using the chromium reducible sulfur method.

    PubMed

    Schumann, Russell; Stewart, Warwick; Miller, Stuart; Kawashima, Nobuyuki; Li, Jun; Smart, Roger

    2012-05-01

    The acid-base account (ABA), commonly used in the assessment of mine waste materials, relies in part on calculation of potential acidity from total sulfur measurements. However, potential acidity is overestimated where organic sulfur, sulfate sulfur and some sulfide compounds make up a substantial portion of the sulfur content. The chromium reducible sulfur (CRS) method has been widely applied to assess reduced inorganic sulfur forms in sediments and acid sulfate soils, but not in ABA assessment of mine wastes. This paper reports the application of the CRS method to measuring forms of sulfur commonly found in mine waste materials. A number of individual sulfur-containing minerals and real waste materials were analyzed using both CRS and total S, and the potential acidity estimates were compared with actual acidity measured from net acid generation (NAG) tests and column leach tests. The results of the CRS analysis made on individual minerals demonstrate good assessment of sulfur from a range of sulfides. No sulfur was measured using the CRS method in a number of sulfate salts, including jarosite and melanterite typically found in weathered waste rocks, or from dibenzothiophene, characteristic of organic sulfur compounds common to coal wastes. Comparison of ABA values for a number of coal waste samples demonstrated much better agreement of acidity predicted from CRS analysis than total S analysis with actual acidity. It also resulted in reclassification of most samples tested from potentially acid forming (PAF) to non-acid forming (NAF). Similar comparisons on base metal sulfide wastes generally resulted in overestimation of the acid potential by total S and underestimation of the acid potential by CRS in comparison to acidity measured during NAG tests, but did not generally result in reclassification. In all the cases examined, the best estimate of potential acidity included acidity calculated from both CRS and jarositic S. PMID:22444067
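    The reclassification effect described above follows directly from how potential acidity is computed. A sketch using the standard stoichiometric conversion of 30.6 kg H2SO4 per tonne per wt% sulfide sulfur; the sample values in the test are hypothetical:

```python
ACID_FACTOR = 30.6  # kg H2SO4 per tonne per wt% S (stoichiometric pyrite oxidation)

def napp(sulfur_pct, anc):
    """Net Acid Producing Potential (kg H2SO4/t): potential acidity
    from the sulfur measurement minus acid neutralizing capacity (ANC)."""
    return sulfur_pct * ACID_FACTOR - anc

def classify(total_s_pct, crs_s_pct, anc):
    """Classify a sample with potential acidity derived from total S
    versus CRS sulfur: PAF if NAPP > 0, otherwise NAF."""
    by_total = "PAF" if napp(total_s_pct, anc) > 0 else "NAF"
    by_crs = "PAF" if napp(crs_s_pct, anc) > 0 else "NAF"
    return by_total, by_crs
```

    A coal waste with, say, 1.2% total S but only 0.2% CRS sulfide sulfur flips from PAF to NAF once the organic and sulfate sulfur is excluded, which is the behavior the paper reports.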

  14. Minimally invasive surgery of the anterior skull base: transorbital approaches

    PubMed Central

    Gassner, Holger G.; Schwan, Franziska; Schebesch, Karl-Michael

    2016-01-01

    Minimally invasive approaches are becoming increasingly popular for accessing the anterior skull base. With interdisciplinary cooperation, endonasal endoscopic approaches in particular have seen an impressive expansion of indications over the past decades. The more recently described transorbital approaches represent minimally invasive alternatives with a differing spectrum of access corridors. The purpose of the present paper is to discuss transorbital approaches to the anterior skull base in the light of the current literature. The transorbital approaches allow excellent exposure of areas that are difficult to reach, such as the anterior and posterior walls of the frontal sinus; working angles may be more favorable, and the paranasal sinus system can be preserved while exposing the skull base. Because of their minimal morbidity and cosmetically excellent results, the transorbital approaches represent an important addition to established endonasal endoscopic and open approaches to the anterior skull base. Their execution requires an interdisciplinary team approach. PMID:27453759

  15. The accountability for reasonableness approach to guide priority setting in health systems within limited resources – findings from action research at district level in Kenya, Tanzania, and Zambia

    PubMed Central

    2014-01-01

    Background Priority-setting decisions are based on an important, but not sufficient, set of values and thus lead to disagreement on priorities. Accountability for Reasonableness (AFR) is an ethics-based approach to a legitimate and fair priority-setting process that builds upon four conditions: relevance, publicity, appeals, and enforcement, which facilitate agreement on priority-setting decisions and gain support for their implementation. This paper focuses on the assessment of AFR within the project REsponse to ACcountable priority setting for Trust in health systems (REACT). Methods This intervention study applied an action research methodology to assess implementation of AFR in one district each in Kenya, Tanzania, and Zambia. The assessments focused on selected disease, program, and managerial areas. An implementing action research team of core health team members and supporting researchers was formed to implement, and continually assess and improve, the application of the four conditions. Researchers evaluated the intervention using qualitative and quantitative data collection and analysis methods. Results The values underlying the AFR approach were well-aligned in all three districts with general values expressed by both service providers and community representatives. There was some variation in the interpretations and actual use of AFR in the decision-making processes in the three districts, and its effect ranged from an increase in awareness of the importance of fairness to a broadened engagement of health team members and other stakeholders in priority setting and other decision-making processes. Conclusions District stakeholders were able to take greater charge of closing the gap between nationally set planning and the local realities and demands of the served communities within the limited resources at hand. This study thus indicates that the operationalization of the four broadly defined and linked conditions is both possible and seems to

  16. The financing of the health system in the Islamic Republic of Iran: A National Health Account (NHA) approach

    PubMed Central

    Zakeri, Mohammadreza; Olyaeemanesh, Alireza; Zanganeh, Marziee; Kazemian, Mahmoud; Rashidian, Arash; Abouhalaj, Masoud; Tofighi, Shahram

    2015-01-01

    Background: The National Health Accounts keep track of all healthcare-related activities from the beginning (i.e. resource provision) to the end (i.e. service provision). This study was conducted to address the following questions: How is the Iranian health system funded? Who distributes the funds? What services are the funds spent on? Which service providers receive the funds? Methods: The required study data were collected through a number of methods. The family health expenditure data were obtained through a cross-sectional multistage (seasonal) survey, while library and field studies were used to collect the registered data. The collected data fell into the following three categories: household health expenditure (sample size: 10,200 urban households and 6,800 rural households; four rounds of questioning), financial agents data, and the financial performance data of the medical universities. Results: The total health expenditure of Iranian households was 201,496,172 million Rials in 2008, a 34.4% increase compared to 2007. The total health expenditure was 6.2% of GDP. The share of the public sector showed a decreasing trend between 2003 and 2008, while the share of the private sector, of which 95.77% was paid by households, showed an increasing trend over the same period. Out-of-pocket expenditure was 53.79% of the total health expenditure. The total health expenditure per capita was US$ 284.00 based on the official US$ exchange rate and US$ 683.1 based on the international US$ exchange rate (exchange rate: US$ 1 = 9,988 Rials). Conclusion: The shares of the public and private sectors in financing the health system were imbalanced and did not meet international standards. The public share of total health expenditures has increased in recent years despite the 4th and 5th Development Plans. The inclusion of household health insurance fees and other service-related expenses increases the public contribution to 73% of the

  17. Stochastic Turing patterns: analysis of compartment-based approaches.

    PubMed

    Cao, Yang; Erban, Radek

    2014-12-01

    Turing patterns can be observed in reaction-diffusion systems where chemical species have different diffusion constants. In recent years, several studies investigated the effects of noise on Turing patterns and showed that the parameter regimes, for which stochastic Turing patterns are observed, can be larger than the parameter regimes predicted by deterministic models, which are written in terms of partial differential equations (PDEs) for species concentrations. A common stochastic reaction-diffusion approach is written in terms of compartment-based (lattice-based) models, where the domain of interest is divided into artificial compartments and the number of molecules in each compartment is simulated. In this paper, the dependence of stochastic Turing patterns on the compartment size is investigated. It has previously been shown (for relatively simpler systems) that a modeler should not choose compartment sizes which are too small or too large, and that the optimal compartment size depends on the diffusion constant. Taking these results into account, we propose and study a compartment-based model of Turing patterns where each chemical species is described using a different set of compartments. It is shown that the parameter regions where spatial patterns form are different from the regions obtained by classical deterministic PDE-based models, but they are also different from the results obtained for the stochastic reaction-diffusion models which use a single set of compartments for all chemical species. In particular, it is argued that some previously reported results on the effect of noise on Turing patterns in biological systems need to be reinterpreted.

  18. 26 CFR 1.801-8 - Contracts with reserves based on segregated asset accounts.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... segregated asset account includes a contract under which the reflection of investment return and market value...) provides that for purposes of section 801(b)(1)(A), the reflection of the investment return and the...

  19. 26 CFR 1.801-8 - Contracts with reserves based on segregated asset accounts.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... segregated asset account includes a contract under which the reflection of investment return and market value...) provides that for purposes of section 801(b)(1)(A), the reflection of the investment return and the...

  20. 26 CFR 1.801-8 - Contracts with reserves based on segregated asset accounts.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... segregated asset account includes a contract under which the reflection of investment return and market value...) provides that for purposes of section 801(b)(1)(A), the reflection of the investment return and the...

  1. 26 CFR 1.801-8 - Contracts with reserves based on segregated asset accounts.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... segregated asset account includes a contract under which the reflection of investment return and market value...) provides that for purposes of section 801(b)(1)(A), the reflection of the investment return and the...

  2. A New Approach: Competency-Based Education

    ERIC Educational Resources Information Center

    Hall, Katherine B.

    1976-01-01

    Describes competency-based education, discusses the alleged strengths and weaknesses as presented by its supporters and critics, and points up the impact of competency-based education on the home economist, particularly the home economics educator. (TA)

  3. Appearance questions can be misleading: a discourse-based account of the appearance-reality problem.

    PubMed

    Hansen, Mikkel B; Markman, Ellen M

    2005-05-01

    Preschoolers' success on the appearance-reality task is a milestone in theory-of-mind development. On the standard task children see a deceptive object, such as a sponge that looks like a rock, and are asked, "What is this really?" and "What does this look like?" Children below 4½ years of age fail, saying that the object not only is a sponge but also looks like a sponge. We propose that young children's difficulty stems from ambiguity in the meaning of "looks like." This locution can refer to outward appearance ("Peter looks like Paul") but in fact often refers to likely reality ("That looks like Jim"). We propose that "looks like" is taken to refer to likely reality unless the reality is already part of the common ground of the conversation. Because this joint knowledge is unclear to young children on the appearance-reality task, they mistakenly think the appearance question is about likely reality. Study 1 analyzed everyday conversations from the CHILDES database and documented that 2- and 3-year-olds are familiar with these two different uses of the locution. To disambiguate the meaning of "looks like," Study 2 clarified that reality was shared knowledge as part of the appearance question, e.g., "What does the sponge look like?" Study 3 used a non-linguistic measure to emphasize the shared knowledge of the reality in the appearance question. Study 4 asked children on their own to articulate the contrast between appearance and reality. At 91%, 85%, and 81% correct responses, children were at near-ceiling levels in each of our manipulations while they failed the standard versions of the tasks. Moreover, we show how this discourse-based explanation accounts for findings in the literature. Thus children master the appearance-reality distinction by the age of 3, but the standard task masks this understanding because of the discourse structure involved in talking about appearances. PMID:15826611

  4. Development of prototype induced-fission-based Pu accountancy instrument for safeguards applications.

    PubMed

    Seo, Hee; Lee, Seung Kyu; An, Su Jung; Park, Se-Hwan; Ku, Jeong-Hoe; Menlove, Howard O; Rael, Carlos D; LaFleur, Adrienne M; Browne, Michael C

    2016-09-01

    A prototype safeguards instrument for nuclear material accountancy (NMA) of uranium/transuranic (U/TRU) products that could be produced in a future advanced PWR fuel processing facility has been developed and characterized. This is a new, hybrid neutron measurement system based on the fast neutron energy multiplication (FNEM) and passive neutron albedo reactivity (PNAR) methods. The FNEM method is sensitive to the fission rate induced by fast neutrons, while the PNAR method is sensitive to the fission rate induced by thermal neutrons in the sample to be measured. The induced fission rate is proportional to the total amount of fissile material, especially plutonium (Pu), in the U/TRU product; hence, the Pu amount can be calibrated as a function of the induced fission rate, which can be measured using either the FNEM or PNAR method. In the present study, the prototype system was built using six (3)He tubes, and its performance was evaluated for various detector parameters including high-voltage (HV) plateau, efficiency profiles, dead time, and stability. The system's capability to measure the difference in average neutron energy for the FNEM signature was also evaluated using AmLi, PuBe, and (252)Cf sources, as well as four Pu-oxide sources, each with a different impurity (Al, F, Mg, and B) producing (α,n) neutrons with different average energies. Future work will measure the hybrid signature (i.e., FNEM×PNAR) for a Pu source with an external interrogating neutron source after enlarging the cavity size of the prototype system to accommodate a large-size Pu source (~600 g Pu). PMID:27337652

  6. A performance weighting procedure for GCMs based on explicit probabilistic models and accounting for observation uncertainty

    NASA Astrophysics Data System (ADS)

    Renard, Benjamin; Vidal, Jean-Philippe

    2016-04-01

    In recent years, the climate modeling community has put a lot of effort into releasing the outputs of multimodel experiments for use by the wider scientific community. In such experiments, several structurally distinct GCMs are run using the same observed forcings (for the historical period) or the same projected forcings (for the future period). In addition, several members are produced for a single given model structure, by running each GCM with slightly different initial conditions. This multiplicity of GCM outputs offers many opportunities in terms of uncertainty quantification or GCM comparisons. In this presentation, we propose a new procedure to weight GCMs according to their ability to reproduce the observed climate. Such weights can be used to combine the outputs of several models in a way that rewards good-performing models and discards poorly-performing ones. The proposed procedure has the following main properties: 1. It is based on explicit probabilistic models describing the time series produced by the GCMs and the corresponding historical observations, 2. It can use several members whenever available, 3. It accounts for the uncertainty in observations, 4. It assigns a weight to each GCM (all weights summing up to one), 5. It can also assign a weight to the "H0 hypothesis" that all GCMs in the multimodel ensemble are not compatible with observations. The application of the weighting procedure is illustrated with several case studies including synthetic experiments, simple cases where the target GCM output is a simple univariate variable and more realistic cases where the target GCM output is a multivariate and/or a spatial variable. These case studies illustrate the generality of the procedure which can be applied in a wide range of situations, as long as the analyst is prepared to make an explicit probabilistic assumption on the target variable. Moreover, these case studies highlight several interesting properties of the weighting procedure. In
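    The procedure above builds on explicit probabilistic models of each GCM's output. The core step of turning per-model fits into weights that sum to one might look like the following likelihood-weighting sketch; this is a simplification that omits ensemble members, observation uncertainty, and the H0 hypothesis described in the abstract:

```python
import math

def gcm_weights(log_likelihoods):
    """Turn per-GCM log-likelihoods of the historical observations into
    normalized weights summing to one (log-sum-exp for numerical stability)."""
    m = max(log_likelihoods)
    exps = [math.exp(ll - m) for ll in log_likelihoods]
    z = sum(exps)
    return [e / z for e in exps]
```

    Models whose probabilistic description assigns higher likelihood to the observed climate receive larger weights; adding the H0 hypothesis would amount to one extra term in the same normalization.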

  7. Human Rights Education in Japan: An Historical Account, Characteristics and Suggestions for a Better-Balanced Approach

    ERIC Educational Resources Information Center

    Takeda, Sachiko

    2012-01-01

    Although human rights are often expressed as universal tenets, the concept was conceived in a particular socio-political and historical context. Conceptualisations and practice of human rights vary across societies, and face numerous challenges. After providing an historical account of the conceptualisation of human rights in Japanese society,…

  8. Measuring neuronal branching patterns using model-based approach.

    PubMed

    Luczak, Artur

    2010-01-01

    Neurons have complex branching systems which allow them to communicate with thousands of other neurons. Thus understanding neuronal geometry is clearly important for determining connectivity within the network and how this shapes neuronal function. One of the difficulties in uncovering relationships between neuronal shape and function is the problem of quantifying complex neuronal geometry. Even using multiple measures, such as dendritic length, distribution of segments, and direction of branches, a description of three-dimensional neuronal embedding remains incomplete. To help alleviate this problem, here we propose a new measure, a shape diffusiveness index (SDI), to quantify spatial relations between branches at the local and global scale. It was shown that growth of neuronal trees can be modeled using a diffusion limited aggregation (DLA) process. By measuring "how easy" it is to reproduce the analyzed shape using the DLA algorithm, one can measure how "diffusive" that shape is. Intuitively, "diffusiveness" measures how tree-like a given shape is; for example, shapes like an oak tree will have high values of SDI. This measure captures an important feature of dendritic tree geometry that is difficult to assess with other measures. This approach also presents a paradigm shift from well-defined deterministic measures to model-based measures, which estimate how well a model with specific properties can account for features of the analyzed shape. PMID:21079752

  9. Assessment of Person Fit Using Resampling-Based Approaches

    ERIC Educational Resources Information Center

    Sinharay, Sandip

    2016-01-01

    De la Torre and Deng suggested a resampling-based approach for person-fit assessment (PFA). The approach involves the use of the [math equation unavailable] statistic, a corrected expected a posteriori estimate of the examinee ability, and the Monte Carlo (MC) resampling method. The Type I error rate of the approach was closer to the nominal level…

  10. Sideline emergencies: an evidence-based approach.

    PubMed

    Fitch, R Warne; Cox, Charles L; Hannah, Gene A; Diamond, Alex B; Gregory, Andrew J M; Wilson, Kristina M

    2011-01-01

    As participation in athletics continues to increase, so too will the occurrence of on-field injuries and medical emergencies. The field of sports medicine continues to advance and many events will have on-site medical staff present. This article reviews the most catastrophic injuries and medical emergencies that are encountered in sports and presents the highest level evidence in regards to on-field approach and management of the athlete.

  11. Evaluating a Pivot-Based Approach for Bilingual Lexicon Extraction

    PubMed Central

    Kim, Jae-Hoon; Kwon, Hong-Seok; Seo, Hyeong-Won

    2015-01-01

    A pivot-based approach for bilingual lexicon extraction is based on the similarity of context vectors represented by words in a pivot language like English. In this paper, in order to show the validity and usability of the pivot-based approach, we evaluate it together with two different methods for estimating context vectors: one estimates them from two parallel corpora based on word association between source words (resp., target words) and pivot words, and the other estimates them from two parallel corpora based on word alignment tools for statistical machine translation. Empirical results on two language pairs (Korean-Spanish and Korean-French) show that the pivot-based approach is very promising for resource-poor languages, confirming its validity and usability. Furthermore, our method also performs well for words with low frequency. PMID:25983745
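    The approach ranks candidate translations by the similarity of their context vectors over pivot-language words. A minimal sketch with cosine similarity; the toy vectors in the usage example are hypothetical and stand in for association scores against pivot words:

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two context vectors over the pivot vocabulary."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sqrt(sum(a * a for a in u))
    nv = sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def best_translation(src_vec, tgt_vecs):
    """Return the target word whose pivot-space context vector is most
    similar to the source word's vector."""
    return max(tgt_vecs, key=lambda w: cosine(src_vec, tgt_vecs[w]))
```

    With vectors over pivots like ("water", "dog", "run"), a source word co-occurring mostly with "water" is matched to the target word with the same pivot profile, regardless of how the vectors were estimated (word association or alignment tools).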

  12. A Time-Based Account of the Perception of Odor Objects and Valences

    PubMed Central

    Olofsson, Jonas K.; Bowman, Nicholas E.; Khatibi, Katherine; Gottfried, Jay A.

    2013-01-01

    Is human odor perception guided by memory or emotion? Object-centered accounts predict that recognition of unique odor qualities precedes valence decoding. Valence-centered accounts predict the opposite: that stimulus-driven valence responses precede and guide identification. In a speeded response time study, participants smelled paired odors, presented sequentially, and indicated whether the second odor in each pair belonged to the same category as the first (object evaluation task) or whether the second odor was more pleasant than the first (valence evaluation task). Object evaluation was faster and more accurate than valence evaluation. In a complementary experiment, participants performed an identification task, in which they indicated whether an odor matched the previously presented word label. Responses were quicker for odors preceded by semantically matching, rather than nonmatching, word labels, but results showed no evidence of interference from valence on nonmatching trials. These results are in accordance with object-centered accounts of odor perception. PMID:22961773

  13. A spatiotemporal dengue fever early warning model accounting for nonlinear associations with meteorological factors: a Bayesian maximum entropy approach

    NASA Astrophysics Data System (ADS)

    Lee, Chieh-Han; Yu, Hwa-Lung; Chien, Lung-Chang

    2014-05-01

Dengue fever has been identified as one of the most widespread vector-borne diseases in tropical and sub-tropical regions. In the last decade, dengue has been an emerging infectious disease in Taiwan, especially in the southern area, where incidence is high every year. For the purpose of disease prevention and control, an early warning system is urgently needed. Previous studies have shown significant relationships between climate variables, in particular rainfall and temperature, and the temporal epidemic patterns of dengue cases. However, the transmission of dengue fever is a complex interactive process, and most studies have understated its composite space-time effects. This study proposes a one-week-ahead warning system for dengue fever epidemics in southern Taiwan that considers nonlinear associations between weekly dengue cases and meteorological factors across space and time. The early warning system is based on an integration of a distributed lag nonlinear model (DLNM) and stochastic Bayesian Maximum Entropy (BME) analysis. The study identified the meteorological measures most significantly associated with variation in dengue cases under conditions of uncertainty, namely weekly minimum temperature and maximum 24-hour rainfall with lags of up to 15 weeks. Subsequently, the combination of nonlinear lagged effects of climate variables and a space-time dependence function is implemented via a Bayesian framework to predict dengue fever occurrences in southern Taiwan during 2012. The results show that the early warning system is useful for providing spatio-temporal predictions of potential dengue fever outbreaks. In conclusion, the proposed approach can provide a practical disease control tool for environmental regulators seeking more effective strategies for dengue fever prevention.

  14. Investigative Primary Science: A Problem-Based Learning Approach

    ERIC Educational Resources Information Center

    Etherington, Matthew B.

    2011-01-01

    This study reports on the success of using a problem-based learning approach (PBL) as a pedagogical mode of learning open inquiry science within a traditional four-year undergraduate elementary teacher education program. In 2010, a problem-based learning approach to teaching primary science replaced the traditional content driven syllabus. During…

  15. A Strength-Based Approach to Teacher Professional Development

    ERIC Educational Resources Information Center

    Zwart, Rosanne C.; Korthagen, Fred A. J.; Attema-Noordewier, Saskia

    2015-01-01

    Based on positive psychology, self-determination theory and a perspective on teacher quality, this study proposes and examines a strength-based approach to teacher professional development. A mixed method pre-test/post-test design was adopted to study perceived outcomes of the approach for 93 teachers of six primary schools in the Netherlands and…

  16. EFL Reading Instruction: Communicative Task-Based Approach

    ERIC Educational Resources Information Center

    Sidek, Harison Mohd

    2012-01-01

    The purpose of this study was to examine the overarching framework of EFL (English as a Foreign Language) reading instructional approach reflected in an EFL secondary school curriculum in Malaysia. Based on such analysis, a comparison was made if Communicative Task-Based Language is the overarching instructional approach for the Malaysian EFL…

  17. Adapting Educational Measurement to the Demands of Test-Based Accountability

    ERIC Educational Resources Information Center

    Koretz, Daniel

    2015-01-01

    Accountability has become a primary function of large-scale testing in the United States. The pressure on educators to raise scores is vastly greater than it was several decades ago. Research has shown that high-stakes testing can generate behavioral responses that inflate scores, often severely. I argue that because of these responses, using…

  18. Teachers' Perceptions of the Impact of Performance-Based Accountability on Teacher Efficacy

    ERIC Educational Resources Information Center

    Gantt, Phyllis Elizabeth Crowley

    2012-01-01

    Implementation of state and federal high-stakes accountability measures such as end-of-course tests (EoCTs) has contributed to increased teacher stress in the classroom, decreased teacher creativity and autonomy, and reduced effectiveness. Prior research focused primarily on the elementary and middle school levels, so this study sought to examine…

  19. School-Based Accountability and the Distribution of Teacher Quality across Grades in Elementary School

    ERIC Educational Resources Information Center

    Fuller, Sarah C.; Ladd, Helen F.

    2013-01-01

    We use North Carolina data to explore whether the quality of teachers in the lower elementary grades (K-2) falls short of teacher quality in the upper grades (3-5) and to examine the hypothesis that school accountability pressures contribute to such quality shortfalls. Our concern with the early grades arises from recent studies highlighting how…

  20. Hamlet without the Prince: Shortcomings of an Activity-Based Account of Joint Attention

    ERIC Educational Resources Information Center

    Hobson, R. Peter

    2007-01-01

    In this commentary, I consider several strengths of the position adopted by Racine and Carpendale (2007), but suggest that the authors are in danger of overstating their case. In doing so, they appear to sideline an issue that should be pivotal for accounts of joint attention: how does a child come to arrive at an understanding that people, both…

  1. Accounting for Teamwork: A Critical Study of Group-Based Systems of Organizational Control.

    ERIC Educational Resources Information Center

    Ezzamel, Mahmoud; Willmott, Hugh

    1998-01-01

    Examines the role of accounting calculations in reorganizing manufacturing capabilities of a vertically integrated global retailing company. Introducing teamwork to replace line work extended traditional, hierarchical management control systems. Teamwork's self-managing demands contravened workers' established sense of self-identity as…

  2. Performance-Based Incentives and the Behavior of Accounting Academics: Responding to Changes

    ERIC Educational Resources Information Center

    Moya, Soledad; Prior, Diego; Rodríguez-Pérez, Gonzalo

    2015-01-01

    When laws change the rules of the game, it is important to observe the effects on the players' behavior. Some effects can be anticipated while others are difficult to enunciate before the law comes into force. In this paper we have analyzed articles authored by Spanish accounting academics between 1996 and 2005 to assess the impact of a change in…

  3. Crafting Coherence from Complex Policy Messages: Educators' Perceptions of Special Education and Standards-Based Accountability Policies

    ERIC Educational Resources Information Center

    Russell, Jennifer Lin; Bray, Laura E.

    2013-01-01

Federal special education and accountability policies require that educators individualize instruction for students with disabilities, while simultaneously ensuring that the vast majority of these students meet age-based grade-level standards and assessment targets. In this paper, we examine this dynamic interplay between policies through…

  4. Computer-based Approaches to Patient Education

    PubMed Central

    Lewis, Deborah

    1999-01-01

    All articles indexed in MEDLINE or CINAHL, related to the use of computer technology in patient education, and published in peer-reviewed journals between 1971 and 1998 were selected for review. Sixty-six articles, including 21 research-based reports, were identified. Forty-five percent of the studies were related to the management of chronic disease. Thirteen studies described an improvement in knowledge scores or clinical outcomes when computer-based patient education was compared with traditional instruction. Additional articles examined patients' computer experience, socioeconomic status, race, and gender and found no significant differences when compared with program outcomes. Sixteen of the 21 research-based studies had effect sizes greater than 0.5, indicating a significant change in the described outcome when the study subjects participated in computer-based patient education. The findings from this review support computer-based education as an effective strategy for transfer of knowledge and skill development for patients. The limited number of research studies (N = 21) points to the need for additional research. Recommendations for new studies include cost-benefit analysis and the impact of these new technologies on health outcomes over time. PMID:10428001

  5. Allocating physicians' overhead costs to services: an econometric/accounting activity-based approach.

    PubMed

    Peden, Al; Baker, Judith J

    2002-01-01

    Using the optimizing properties of econometric analysis, this study analyzes how physician overhead costs (OC) can be allocated to multiple activities to maximize precision in reimbursing the costs of services. Drawing on work by Leibenstein and Friedman, the analysis also shows that allocating OC to multiple activities unbiased by revenue requires controlling for revenue when making the estimates. Further econometric analysis shows that it is possible to save about 10 percent of OC by paying only for those that are necessary.

  6. An evidence-based approach to genioplasty.

    PubMed

    Sati, Shawkat; Havlik, Robert J

    2011-02-01

    The Maintenance of Certification module series is designed to help the clinician structure his or her study in specific areas appropriate to his or her clinical practice. This article is prepared to accompany practice-based assessment of preoperative assessment, anesthesia, surgical treatment plan, perioperative management, and outcomes. In this format, the clinician is invited to compare his or her methods of patient assessment and treatment, outcomes, and complications, with authoritative, information-based references. This information base is then used for self-assessment and benchmarking in parts II and IV of the Maintenance of Certification process of the American Board of Plastic Surgery. This article is not intended to be an exhaustive treatise on the subject. Rather, it is designed to serve as a reference point for further in-depth study by review of the reference articles presented.

  7. Business Approach To Lunar Base Activation

    NASA Astrophysics Data System (ADS)

    Schmitt, Harrison H.

    2003-01-01

It remains unlikely that any government or group of governments will make the long-term funding commitments necessary to return to the Moon in support of scientific goals or resource production. If a lunar base is to be established within the foreseeable future, it will be in support of commercial production and use of unique energy resources. Business plan development for commercial production and use of lunar Helium-3 requires a number of major steps, including identification of the required investor base and development of fusion power technology, through a series of business bridges that provide required rates of return.

  8. Component design bases - A template approach

    SciTech Connect

    Pabst, L.F.; Strickland, K.M.

    1991-01-01

    A well-documented nuclear plant design basis can enhance plant safety and availability. Older plants, however, often lack historical evidence of the original design intent, particularly for individual components. Most plant documentation describes the actual design (what is) rather than the bounding limits of the design. Without knowledge of these design limits, information from system descriptions and equipment specifications is often interpreted as inviolate design requirements. Such interpretations may lead to unnecessary design conservatism in plant modifications and unnecessary restrictions on plant operation. In 1986, Florida Power and Light Company's (FP and L's) Turkey Point plant embarked on one of the first design basis reconstitution programs in the United States to catalog the true design requirements. As the program developed, design basis users expressed a need for additional information at the component level. This paper outlines a structured (template) approach to develop useful component design basis information (including the WHYs behind the design).

  9. A model-based multisensor data fusion knowledge management approach

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2014-06-01

A variety of approaches exist for combining data from multiple sensors. The model-based approach combines data based on its support for or refutation of elements of the model, which in turn can be used to evaluate an experimental thesis. This paper presents a collection of algorithms for mapping various types of sensor data onto a thesis-based model and evaluating the truth or falsity of the thesis, based on the model. The use of this approach for autonomously arriving at findings and for prioritizing data is considered. Techniques for updating the model (instead of arriving at a true/false assertion) are also discussed.

  10. PBL Approach in Web-Based Instruction

    ERIC Educational Resources Information Center

    ChanLin, Lih-Juan; Chan, Kung-Chi

    2004-01-01

Web-Based Instruction is increasingly being recognized as a means of teaching and learning. In dietetics, interactions between drugs and nutrients are complex because of the wide variety of drugs, their mechanisms, and their interactions with nutrients. How to help student professionals acquire the necessary skills and knowledge is important in a dietetic…

  11. Context-Based Chemistry: The Salters Approach

    ERIC Educational Resources Information Center

    Bennett, Judith; Lubben, Fred

    2006-01-01

    This paper describes briefly the development and key features of one of the major context-based courses for upper high school students, Salters Advanced Chemistry. It goes on to consider the research evidence on the impact of the course, focusing on teachers' views, and, in particular, on students' affective and cognitive responses. The research…

  12. Accounting for ecosystem services in Life Cycle Assessment, Part II: toward an ecologically based LCA.

    PubMed

    Zhang, Yi; Baral, Anil; Bakshi, Bhavik R

    2010-04-01

    Despite the essential role of ecosystem goods and services in sustaining all human activities, they are often ignored in engineering decision making, even in methods that are meant to encourage sustainability. For example, conventional Life Cycle Assessment focuses on the impact of emissions and consumption of some resources. While aggregation and interpretation methods are quite advanced for emissions, similar methods for resources have been lagging, and most ignore the role of nature. Such oversight may even result in perverse decisions that encourage reliance on deteriorating ecosystem services. This article presents a step toward including the direct and indirect role of ecosystems in LCA, and a hierarchical scheme to interpret their contribution. The resulting Ecologically Based LCA (Eco-LCA) includes a large number of provisioning, regulating, and supporting ecosystem services as inputs to a life cycle model at the process or economy scale. These resources are represented in diverse physical units and may be compared via their mass, fuel value, industrial cumulative exergy consumption, or ecological cumulative exergy consumption or by normalization with total consumption of each resource or their availability. Such results at a fine scale provide insight about relative resource use and the risk and vulnerability to the loss of specific resources. Aggregate indicators are also defined to obtain indices such as renewability, efficiency, and return on investment. An Eco-LCA model of the 1997 economy is developed and made available via the web (www.resilience.osu.edu/ecolca). An illustrative example comparing paper and plastic cups provides insight into the features of the proposed approach. The need for further work in bridging the gap between knowledge about ecosystem services and their direct and indirect role in supporting human activities is discussed as an important area for future work.

  13. Digital phonocardiography: a PDA-based approach.

    PubMed

    Brusco, Matias; Nazeran, Homer

    2004-01-01

In this paper we demonstrate the applicability of advanced digital signal processing algorithms to the analysis of heart sound signals and describe the development of a PDA-based biomedical instrument capable of acquisition, processing, and analysis of heart sounds. Fourier transform-based spectral analysis of heart sounds was carried out first to show the differences in the frequency content of normal and abnormal heart sounds. Because the time-varying nature of heart sounds calls for techniques capable of analyzing such signals, short-time Fourier transform (STFT), or spectrogram, analysis was performed next. This method performed remarkably well in displaying frequency, magnitude, and time information of the heart sounds, providing robust parameters for accurate diagnosis. With continuous technological advancements in computing and biomedical instrumentation, and the concurrent popularity of handheld instruments in the medical community, we introduce the concept of PDA-based digital phonocardiography. A prototype system comprises a digital stethoscope and a pocket PC. Heart sounds are recorded and displayed on the pocket PC screen. Advanced signal processing algorithms are implemented using the combined capabilities of software tools such as LabVIEW and embedded Visual C++.
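The spectrogram analysis can be sketched with a naive, dependency-free STFT. This is a textbook illustration, not the authors' implementation: a real system would use an FFT routine, and the frame length, hop size, and test tone here are arbitrary choices:

```python
import math

def spectrogram(signal, frame_len=64, hop=32):
    """Naive STFT magnitude spectrogram: Hann-windowed frames plus a
    direct DFT, giving frequency, magnitude, and time information."""
    hann = [0.5 - 0.5 * math.cos(2 * math.pi * n / (frame_len - 1))
            for n in range(frame_len)]
    frames = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = [signal[start + n] * hann[n] for n in range(frame_len)]
        mags = []
        for k in range(frame_len // 2 + 1):            # one-sided spectrum
            re = sum(x * math.cos(2 * math.pi * k * n / frame_len)
                     for n, x in enumerate(frame))
            im = -sum(x * math.sin(2 * math.pi * k * n / frame_len)
                      for n, x in enumerate(frame))
            mags.append(math.hypot(re, im))
        frames.append(mags)
    return frames  # frames[t][k]: magnitude at time frame t, frequency bin k

# 125 Hz test tone sampled at 1 kHz: with frame_len=64 it falls in bin 8
fs, f = 1000, 125
tone = [math.sin(2 * math.pi * f * n / fs) for n in range(512)]
S = spectrogram(tone)
peak_bin = max(range(len(S[0])), key=lambda k: S[0][k])
print(peak_bin)  # → 8
```

The frame-by-frame structure is what lets the spectrogram track how heart sound energy shifts in frequency over the cardiac cycle, which a single whole-signal Fourier transform cannot show.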

  14. Cost unit accounting based on a clinical pathway: a practical tool for DRG implementation.

    PubMed

    Feyrer, R; Rösch, J; Weyand, M; Kunzmann, U

    2005-10-01

Setting up a reliable cost unit accounting system in a hospital is a fundamental necessity for economic survival, given the current general conditions in the healthcare system. Definition of a suitable cost unit is a crucial factor for success. We present here the development and use of a clinical pathway as a cost unit as an alternative to the DRG. Elective coronary artery bypass grafting was selected as an example. Development of the clinical pathway was conducted according to a modular concept that mirrored all the treatment processes across various levels and modules. Using service records and analyses, the process algorithms of the clinical pathway were developed and visualized with Corel iGrafx Process 2003. A detailed process cost record constituted the basis of the pathway costing, in which financial evaluation of the treatment processes was performed. The result of this study was a structured clinical pathway for coronary artery bypass grafting together with a cost calculation in the form of cost unit accounting. The use of a clinical pathway as a cost unit offers considerable advantages compared to the DRG or clinical case. The variance in the diagnoses and procedures within a pathway is minimal, so the consumption of resources is homogeneous. This leads to a considerable improvement in the value of cost unit accounting as a strategic control instrument in hospitals.

  15. A model-based approach to selection of tag SNPs

    PubMed Central

    Nicolas, Pierre; Sun, Fengzhu; Li, Lei M

    2006-01-01

Background Single Nucleotide Polymorphisms (SNPs) are the most common type of polymorphisms found in the human genome. Effective genetic association studies require the identification of sets of tag SNPs that capture as much haplotype information as possible. Tag SNP selection is analogous to the problem of data compression in information theory. According to Shannon's framework, the optimal tag set maximizes the entropy of the tag SNPs subject to constraints on the number of SNPs. This approach requires an appropriate probabilistic model. Compared to simple measures of Linkage Disequilibrium (LD), a good model of haplotype sequences can more accurately account for LD structure. It also provides machinery for predicting tagged SNPs, and thereby for assessing the performance of tag sets through their ability to predict larger SNP sets. Results Here, we compute the description code-lengths of SNP data for an array of models and we develop tag SNP selection methods based on these models and the strategy of entropy maximization. Using data sets from the HapMap and ENCODE projects, we show that the hidden Markov model introduced by Li and Stephens outperforms the other models in several aspects: description code-length of SNP data, information content of tag sets, and prediction of tagged SNPs. This is the first use of this model in the context of tag SNP selection. Conclusion Our study provides strong evidence that the tag sets selected by our best method, based on the Li and Stephens model, outperform those chosen by several existing methods. The results also suggest that information content evaluated with a good model is more sensitive for assessing the quality of a tagging set than the correct prediction rate of tagged SNPs. In addition, we show that haplotype phase uncertainty has an almost negligible impact on the ability of good tag sets to predict tagged SNPs. This justifies the selection of tag SNPs on the basis of haplotype informativeness, although genotyping
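The entropy-maximization strategy can be illustrated with a greedy sketch. This is a deliberate simplification: it scores tag sets by the empirical entropy of observed haplotypes, whereas the paper's best method evaluates entropy under the Li and Stephens hidden Markov model; the toy haplotypes are invented:

```python
import math

def entropy(haplotypes, tag_idx):
    """Empirical entropy (bits) of the haplotypes restricted to the tag SNPs."""
    counts = {}
    for h in haplotypes:
        key = tuple(h[i] for i in tag_idx)
        counts[key] = counts.get(key, 0) + 1
    n = len(haplotypes)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def greedy_tags(haplotypes, k):
    """Greedily add the SNP that most increases the entropy of the tag set,
    up to the budget of k tags (the constraint in Shannon's framing)."""
    n_snps = len(haplotypes[0])
    chosen = []
    for _ in range(k):
        best = max((i for i in range(n_snps) if i not in chosen),
                   key=lambda i: entropy(haplotypes, chosen + [i]))
        chosen.append(best)
    return sorted(chosen)

# Toy haplotypes: SNP 2 duplicates SNP 0 exactly, so it adds no information
# and should never be picked before an informative SNP.
haps = [
    (0, 0, 0, 1),
    (0, 1, 0, 0),
    (1, 0, 1, 1),
    (1, 1, 1, 0),
]
print(greedy_tags(haps, 2))  # → [0, 1]
```

The compression analogy is visible here: the redundant SNP contributes zero additional entropy, exactly as a repeated symbol adds no new code-length.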

  16. Practical approach to physical-chemical acid-base management. Stewart at the bedside.

    PubMed

    Magder, Sheldon; Emami, Ali

    2015-01-01

The late Peter Stewart developed an approach to the analysis of acid-base disturbances in biological systems based on basic physical-chemical principles. His key argument was that the traditional carbon dioxide/bicarbonate analysis with just the use of the Henderson-Hasselbalch equation does not account for the important role in the regulation of H(+) concentration played by strong ions, weak acids and water itself. Acceptance of his analysis has been limited because it requires a complicated set of calculations to account for all the variables and it does not provide simple clinical guidance. However, the analysis can be made more pragmatic by using a series of simple equations to quantify the major processes in acid-base disturbances. These include the traditional PCO2 component and the addition of four metabolic processes, which we classify as "water-effects," "chloride-effects," "albumin effects," and "others." Six values are required for the analysis: [Na(+)], [Cl(-)], pH, PCO2, albumin concentration, and base excess. The advantage of this approach is that it gives a better understanding of the mechanisms behind acid-base abnormalities and more readily leads to clinical actions that can prevent or correct the abnormalities. We have developed a simple free mobile app that can be used to input the necessary values to use this approach at the bedside (Physical/Chemical Acid Base Calculator).
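The four metabolic components can be sketched numerically. The coefficients below follow a widely cited bedside simplification of the Fencl-Stewart approach and are not necessarily the exact equations in the authors' app; the example values are invented:

```python
def partition_base_excess(na, cl, albumin_g_l, base_excess):
    """Partition standard base excess into bedside Fencl-Stewart components.
    na, cl in mmol/L; albumin in g/L; base_excess in mEq/L.
    Coefficients are from a common published simplification (illustrative)."""
    water_effect = 0.3 * (na - 140.0)        # free-water (dilution) effect
    cl_corrected = cl * 140.0 / na           # chloride corrected for free water
    chloride_effect = 102.0 - cl_corrected
    albumin_effect = 0.25 * (42.0 - albumin_g_l)
    # whatever base excess is left is attributed to "others" (unmeasured anions)
    others = base_excess - (water_effect + chloride_effect + albumin_effect)
    return {
        "water": round(water_effect, 1),
        "chloride": round(chloride_effect, 1),
        "albumin": round(albumin_effect, 1),
        "others": round(others, 1),
    }

# Hyperchloremic acidosis example: normal Na and albumin, high Cl, BE of -10
print(partition_base_excess(na=140, cl=112, albumin_g_l=42, base_excess=-10))
```

In this example the entire base deficit is explained by the chloride effect, with nothing left over for unmeasured anions, which is the kind of mechanistic attribution the approach is designed to surface.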

  17. Theory-Based Approaches to the Concept of Life

    ERIC Educational Resources Information Center

    El-Hani, Charbel Nino

    2008-01-01

    In this paper, I argue that characterisations of life through lists of properties have several shortcomings and should be replaced by theory-based accounts that explain the coexistence of a set of properties in living beings. The concept of life should acquire its meaning from its relationships with other concepts inside a theory. I illustrate…

  18. Arts-based and creative approaches to dementia care.

    PubMed

    McGreevy, Jessica

    2016-02-01

This article presents a review of arts-based and creative approaches to dementia care as an alternative to antipsychotic medications. While use of antipsychotics may be appropriate for some people, the literature highlights the success of creative approaches and the benefit that they lack the negative side effects associated with antipsychotics. The focus is the use of biographical approaches, music, dance and movement to improve wellbeing, enhance social networks, support inclusive practice and enable participation. Staff must be trained to use these approaches. A case study is presented to demonstrate how creative approaches can be implemented in practice and the outcomes that can be expected when used appropriately. PMID:26938607

  19. Wavelet-based approach to character skeleton.

    PubMed

    You, Xinge; Tang, Yuan Yan

    2007-05-01

Character skeleton plays a significant role in character recognition. The strokes of a character may consist of two regions, i.e., singular and regular regions. The intersections and junctions of the strokes belong to the singular region, while the straight and smooth parts of the strokes are categorized as the regular region. Therefore, a skeletonization method requires two different processes to treat the skeletons in these two different regions. All traditional skeletonization algorithms are based on the symmetry analysis technique. The major problems of these methods are as follows. 1) The computation of the primary skeleton in the regular region is indirect, so its implementation is sophisticated and costly. 2) The extracted skeleton cannot be exactly located on the central line of the stroke. 3) The captured skeleton in the singular region may be distorted by artifacts and branches. To overcome these problems, a novel scheme for extracting the character skeleton based on the wavelet transform is presented in this paper. This scheme consists of two main steps, namely: a) extraction of the primary skeleton in the regular region and b) amendment processing of the primary skeletons and their connection in the singular region. A direct technique is used in the first step, where a new wavelet-based symmetry analysis is developed for finding the central line of the stroke directly. A novel method called smooth interpolation is designed in the second step, where a smooth operation is applied to the primary skeleton and, thereafter, an interpolation compensation technique is proposed to link the primary skeleton, so that the skeleton in the singular region can be produced. Experiments are conducted and positive results are achieved, which show that the proposed skeletonization scheme is applicable not only to binary images but also to gray-level images, and that the skeleton is robust against noise and affine transforms.

  20. Approach for classification and taxonomy within family Rickettsiaceae based on the Formal Order Analysis.

    PubMed

    Shpynov, S; Pozdnichenko, N; Gumenuk, A

    2015-01-01

Genome sequences of 36 Rickettsia and Orientia were analyzed using Formal Order Analysis (FOA). This approach takes into account the arrangement of nucleotides in each sequence. A numerical characteristic, the average distance (remoteness) "g", was used to compare genomes. Our results corroborated the previous separation of three groups within the genus Rickettsia, comprising the typhus group, the classic spotted fever group, and the ancestral group, and of Orientia as a separate genus. Rickettsia felis URRWXCal2 and R. akari Hartford were not in the same group based on FOA; therefore, the designation of a so-called transitional Rickettsia group could not be confirmed with this approach.

  1. Reimagining Kindergarten: Restoring a Developmental Approach when Accountability Demands Are Pushing Formal Instruction on the Youngest Learners

    ERIC Educational Resources Information Center

    Graue, Elizabeth

    2009-01-01

    The traditional kindergarten program often reflected a rich but generic approach with creative contexts for typical kindergartners organized around materials (manipulatives or dramatic play) or a developmental area (fine motor or language). The purpose of kindergarten reflected beliefs about how children learn, specialized training for…

  2. Ameliorated GA approach for base station planning

    NASA Astrophysics Data System (ADS)

    Wang, Andong; Sun, Hongyue; Wu, Xiaomin

    2011-10-01

In this paper, we aim to locate base stations (BSs) rationally so as to satisfy as many customers as possible using the fewest BSs. An ameliorated GA is proposed to search for the optimum solution. In the algorithm, we mesh the area to be planned according to the least overlap length derived from the coverage radius, introduce an isometric grid encoding method to represent the BS distribution as well as the number of BSs, and develop selection, crossover, and mutation operators to serve our particular needs. We also construct a comprehensive objective function that synthesizes coverage ratio, overlap ratio, population, and geographical conditions. Finally, after importing an electronic map of the area to be planned, a recommended strategy draft is exported correspondingly. We use Hong Kong, China, as a simulation case and obtain a satisfactory solution.
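A minimal version of such a GA can be sketched as follows. The grid size, coverage model, penalty weight, and operators are simplifications invented for illustration; the paper's algorithm additionally weights overlap ratio, population, and geographical conditions in its objective function:

```python
import random

random.seed(1)
GRID = 10      # cells per side; each cell is a candidate BS site
RADIUS = 2     # coverage radius in cells (Chebyshev distance, for simplicity)
DEMAND = [(random.randrange(GRID), random.randrange(GRID)) for _ in range(40)]

def covered(genome, pt):
    """True if demand point pt lies within RADIUS of any active BS."""
    x, y = pt
    return any(genome[i] and max(abs(i // GRID - x), abs(i % GRID - y)) <= RADIUS
               for i in range(GRID * GRID))

def fitness(genome):
    """Coverage ratio minus a small penalty per BS (prefer fewer stations)."""
    cov = sum(covered(genome, p) for p in DEMAND) / len(DEMAND)
    return cov - 0.02 * sum(genome)

def evolve(pop_size=30, gens=40, p_mut=0.02):
    """Simple GA over grid-encoded BS layouts."""
    pop = [[random.random() < 0.1 for _ in range(GRID * GRID)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        nxt = pop[:2]                          # elitism: keep the two best
        while len(nxt) < pop_size:
            a, b = random.sample(pop[:10], 2)  # truncation selection
            cut = random.randrange(GRID * GRID)
            child = a[:cut] + b[cut:]          # one-point crossover
            child = [g ^ (random.random() < p_mut) for g in child]  # mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
print(round(fitness(best), 3), "BSs used:", sum(best))
```

The grid encoding makes one genome carry both the number and the placement of base stations, so the penalty term in the fitness function is what drives the search toward covering the demand points with as few stations as possible.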

  3. [Management of large marine ecosystem based on ecosystem approach].

    PubMed

    Chu, Jian-song

    2011-09-01

A large marine ecosystem (LME) is a large area of ocean characterized by distinct oceanography and ecology. Its natural characteristics require management based on an ecosystem approach. A series of international treaties and regulations directly or indirectly support adopting an ecosystem approach to manage LMEs so as to achieve the sustainable utilization of marine resources. In practice, some countries such as Canada, Australia, and the U.S.A. have adopted an ecosystem-based approach to manage their oceans, and some international organizations such as the Global Environment Facility have carried out a number of LME programs based on the ecosystem approach. Aiming at the sustainable development of their fisheries, regional organizations such as the Caribbean Community have established regional fisheries mechanisms. However, the adoption of an ecosystem approach to manage LMEs is not only a scientific and legal issue, but also a political matter, depending largely on the political will and the degree of mutual cooperation of the countries concerned.

  4. Enuresis in children: a case-based approach.

    PubMed

    Baird, Drew C; Seehusen, Dean A; Bode, David V

    2014-10-15

    Enuresis is defined as intermittent urinary incontinence during sleep in a child at least five years of age. Approximately 5% to 10% of all seven-year-olds have enuresis, and an estimated 5 to 7 million children in the United States have enuresis. The pathophysiology of primary nocturnal enuresis involves the inability to awaken from sleep in response to a full bladder, coupled with excessive nighttime urine production or a decreased functional capacity of the bladder. Initial evaluation should include a history, physical examination, and urinalysis. Several conditions, such as constipation, obstructive sleep apnea, diabetes mellitus, diabetes insipidus, chronic kidney disease, and psychiatric disorders, are associated with enuresis. If identified, these conditions should be evaluated and treated. Treatment of primary monosymptomatic enuresis (i.e., the only symptom is nocturnal bed-wetting in a child who has never been dry) begins with counseling the child and parents on effective behavioral modifications. First-line treatments for enuresis include bed alarm therapy and desmopressin. The choice of therapy is based on the child's age and nighttime voiding patterns, and the desires of the child and family. Referral to a pediatric urologist is indicated for children with primary enuresis refractory to standard and combination therapies, and for children with some secondary causes of enuresis, including urinary tract malformations, recurrent urinary tract infections, or neurologic disorders.

  5. Systems-based approaches toward wound healing

    PubMed Central

    Buganza-Tepole, Adrian; Kuhl, Ellen

    2013-01-01

    Wound healing in the pediatric patient is of utmost clinical and social importance, since hypertrophic scarring can have aesthetic and psychological sequelae, from early childhood to late adolescence. Wound healing is a well-orchestrated reparative response affecting the damaged tissue at the cellular, tissue, organ, and system scales. While tremendous progress has been made towards understanding wound healing at the individual temporal and spatial scales, its effects across the scales remain severely understudied and poorly understood. Here we discuss the critical need for systems-based computational modeling of wound healing across the scales, from short-term to long-term and from small to large. We illustrate the state of the art in systems modeling by means of three key signaling mechanisms: oxygen tension regulating angiogenesis and revascularization; TGF-β kinetics controlling collagen deposition; and mechanical stretch stimulating cellular mitosis and extracellular matrix remodeling. The complex network of biochemical and biomechanical signaling mechanisms and the multi-scale character of the healing process make systems modeling an integral tool in exploring personalized strategies for wound repair. A better mechanistic understanding of wound healing in the pediatric patient could open new avenues in treating children with skin disorders such as birth defects, skin cancer, wounds, and burn injuries. PMID:23314298

  6. Revising a Design Course from a Lecture Approach to a Project-Based Learning Approach

    ERIC Educational Resources Information Center

    Kunberger, Tanya

    2013-01-01

    In order to develop the evaluative skills necessary for successful performance of design, a senior, Geotechnical Engineering course was revised to immerse students in the complexity of the design process utilising a project-based learning (PBL) approach to instruction. The student-centred approach stresses self-directed group learning, which…

  7. A Novel Approach for Deriving Force Field Torsion Angle Parameters Accounting for Conformation-Dependent Solvation Effects.

    PubMed

    Zgarbová, Marie; Luque, F Javier; Šponer, Jiří; Otyepka, Michal; Jurečka, Petr

    2012-09-11

    A procedure for deriving force field torsion parameters including certain previously neglected solvation effects is suggested. In contrast to the conventional in vacuo approaches, the dihedral parameters are obtained from the difference between the quantum-mechanical self-consistent reaction field and Poisson-Boltzmann continuum solvation models. An analysis of the solvation contributions shows that two major effects neglected when torsion parameters are derived in vacuo are (i) conformation-dependent solute polarization and (ii) solvation of conformation-dependent charge distribution. Using the glycosidic torsion as an example, we demonstrate that the corresponding correction for the torsion potential is substantial and important. Our approach avoids double counting of solvation effects and provides parameters that may be used in combination with any of the widely used nonpolarizable discrete solvent models, such as TIPnP or SPC/E, or with continuum solvent models. Differences between our model and the previously suggested solvation models are discussed. Improvements were demonstrated for the latest AMBER RNA χOL3 parameters derived with inclusion of solvent effects in a previous publication (Zgarbova et al. J. Chem. Theory Comput. 2011, 7, 2886). The described procedure may help to provide consistently better force field parameters than the currently used parametrization approaches.
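
The fitting step behind torsion parameterization is easy to illustrate. The sketch below is not the authors' procedure: it fits coefficients k_n of the standard cosine-series form V(φ) = Σ_n k_n (1 + cos nφ) to a target energy profile by linear least squares; in the paper's scheme the target would be a difference between solvation treatments, whereas here it is generated from invented coefficients so the fit can be checked.

```python
import numpy as np

# Hypothetical target profile, generated from known coefficients so the
# recovery can be verified (in practice it would come from QM calculations).
true_k = np.array([1.2, -0.4, 0.8])                  # k_n for n = 1, 2, 3
phi = np.linspace(-np.pi, np.pi, 73)                 # dihedral grid (radians)
target = sum(k * (1.0 + np.cos(n * phi)) for n, k in enumerate(true_k, 1))

# Design matrix with one column per torsion term (1 + cos(n*phi)).
A = np.column_stack([1.0 + np.cos(n * phi) for n in (1, 2, 3)])

# Linear least-squares fit of the torsion coefficients.
k_fit, *_ = np.linalg.lstsq(A, target, rcond=None)
rmsd = float(np.sqrt(np.mean((A @ k_fit - target) ** 2)))
```

Because the target lies exactly in the span of the basis functions here, the fit recovers the coefficients essentially exactly; real profiles would leave a residual.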

  8. Interteaching: An Evidence-Based Approach to Instruction

    ERIC Educational Resources Information Center

    Brown, Thomas Wade; Killingsworth, Kenneth; Alavosius, Mark P.

    2014-01-01

    This paper describes "interteaching" as an evidence-based method of instruction. Instructors often rely on more traditional approaches, such as lectures, as means to deliver instruction. Despite high usage, these methods are ineffective at achieving desirable academic outcomes. We discuss an innovative approach to delivering instruction…

  9. Simulation-Based Constructivist Approach for Education Leaders

    ERIC Educational Resources Information Center

    Shapira-Lishchinsky, Orly

    2015-01-01

    The purpose of this study was to reflect the leadership strategies that may arise using a constructivist approach based on organizational learning. This approach involved the use of simulations that focused on ethical tensions in school principals' daily experiences, and the development of codes of ethical conduct to reduce these tensions. The…

  10. A unified account of tilt illusions, association fields, and contour detection based on elastica.

    PubMed

    Keemink, Sander W; van Rossum, Mark C W

    2016-09-01

    As expressed in the Gestalt law of good continuation, human perception tends to associate stimuli that form smooth continuations. Contextual modulation in primary visual cortex, in the form of association fields, is believed to play an important role in this process. Yet a unified and principled account of the good continuation law on the neural level is lacking. In this study we introduce a population model of primary visual cortex. Its contextual interactions depend on the elastica curvature energy of the smoothest contour connecting oriented bars. As expected, this model leads to association fields consistent with data. However, in addition the model displays tilt-illusions for stimulus configurations with grating and single bars that closely match psychophysics. Furthermore, the model explains not only pop-out of contours amid a variety of backgrounds, but also pop-out of single targets amid a uniform background. We thus propose that elastica is a unifying principle of the visual cortical network.

  11. A Bayesian hierarchical method to account for random effects in cytogenetic dosimetry based on calibration curves.

    PubMed

    Mano, Shuhei; Suto, Yumiko

    2014-11-01

    The dicentric chromosome assay (DCA) is one of the most sensitive and reliable methods of inferring doses of radiation exposure in patients. In DCA, a calibration curve is prepared in advance by in vitro irradiation of blood samples from one, or sometimes multiple, healthy donors in consideration of possible inter-individual variability. Although the standard method has been demonstrated to be quite accurate for actual dose estimates, it cannot account for random effects arising from sources such as the blood donor used to prepare the calibration curve, the radiation-exposed patient, and the examiners. To date, it is unknown how these random effects affect the standard method of dose estimation. We propose a novel Bayesian hierarchical method that incorporates random effects into the dose estimation. To demonstrate dose estimation by the proposed method and to assess the impact of inter-individual variability in samples from multiple donors, peripheral blood samples from 13 occupationally non-exposed, non-smoking, healthy individuals were collected and irradiated with gamma rays. The results clearly showed significant inter-individual variability, and the standard method using a sample from a single donor gave an anti-conservative confidence interval for the irradiated dose. In contrast, the Bayesian credible interval for the irradiated dose calculated by the proposed method using samples from multiple donors properly covered the actual doses. Although the classical confidence interval of the calibration curve accounting for inter-individual variability in samples from multiple donors roughly coincided with the Bayesian credible interval, the proposed method has better reasoning and potential for extensions.
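
As a rough illustration of the calibration-curve workflow (not the paper's Bayesian hierarchical model, which additionally models donor- and examiner-level random effects), the sketch below fits the standard linear-quadratic dicentric yield curve to synthetic Poisson counts and inverts it for a point dose estimate. All dose points, cell counts, and coefficients are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical calibration data: dicentric yield per cell versus dose (Gy),
# following the usual linear-quadratic model Y = c + a*D + b*D**2.
c_true, a_true, b_true = 0.001, 0.02, 0.06
doses = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 4.0, 5.0])
cells = 20000                                    # cells scored per dose point
expected = (c_true + a_true * doses + b_true * doses**2) * cells
yields = rng.poisson(expected) / cells           # simulated scored yields

# Fit the calibration curve by ordinary least squares.
b_fit, a_fit, c_fit = np.polyfit(doses, yields, 2)

def estimate_dose(observed_yield):
    """Invert the fitted curve: solve b*D**2 + a*D + (c - Y) = 0 for D >= 0."""
    disc = a_fit**2 - 4.0 * b_fit * (c_fit - observed_yield)
    return (-a_fit + np.sqrt(disc)) / (2.0 * b_fit)

# A sample whose true dose is 2.5 Gy.
patient_yield = c_true + a_true * 2.5 + b_true * 2.5**2
dose_hat = estimate_dose(patient_yield)
```

A weighted fit (Poisson counts have dose-dependent variance) and interval estimation are the natural next refinements; the paper's contribution is propagating the random effects into those intervals.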

  12. Site-Based Management versus Systems-Based Thinking: The Impact of Data-Driven Accountability and Reform

    ERIC Educational Resources Information Center

    Mette, Ian M.; Bengtson, Ed

    2015-01-01

    This case was written to help prepare building-level and central office administrators who are expected to effectively lead schools and systems in an often tumultuous world of educational accountability and reform. The intent of this case study is to allow educators to examine the impact data management has on the types of thinking required when…

  13. Accounting for Sampling Error When Inferring Population Synchrony from Time-Series Data: A Bayesian State-Space Modelling Approach with Applications

    PubMed Central

    Santin-Janin, Hugues; Hugueny, Bernard; Aubry, Philippe; Fouchet, David; Gimenez, Olivier; Pontier, Dominique

    2014-01-01

    Background Data collected to inform time variations in natural population size are tainted by sampling error. Ignoring sampling error in population dynamics models induces bias in parameter estimators, e.g., density-dependence. In particular, when sampling errors are independent among populations, the classical estimator of the synchrony strength (zero-lag correlation) is biased downward. However, this bias is rarely taken into account in synchrony studies, although it may lead to overemphasizing the role of intrinsic factors (e.g., dispersal) with respect to extrinsic factors (the Moran effect) in generating population synchrony, as well as to underestimating the extinction risk of a metapopulation. Methodology/Principal findings The aim of this paper was first to illustrate the extent of the bias that can be encountered in empirical studies when sampling error is neglected. Second, we presented a state-space modelling approach that explicitly accounts for sampling error when quantifying population synchrony. Third, we exemplified our approach with datasets for which sampling variance (i) has been previously estimated, and (ii) has to be jointly estimated with population synchrony. Finally, we compared our results to those of a standard approach neglecting sampling variance. We showed that ignoring sampling variance can mask a synchrony pattern whatever its true value, and that the common practice of averaging a few replicates of population size estimates performed poorly at reducing the bias of the classical estimator of the synchrony strength. Conclusion/Significance The state-space model used in this study provides a flexible way of accurately quantifying the strength of synchrony patterns from most population size data encountered in field studies, including over-dispersed count data. We provided a user-friendly R-program and a tutorial example to encourage further studies aiming at quantifying the strength of population synchrony to account for uncertainty in
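
The downward bias of the naive synchrony estimator under independent sampling error is easy to demonstrate numerically. The sketch below uses synthetic data (not the paper's state-space model or datasets): two populations share a common driver, so their true zero-lag correlation is known, and adding independent observation noise attenuates the estimated correlation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Two populations driven by a shared environmental signal (the Moran effect);
# values are synthetic log-abundances with known true synchrony.
T = 500
common = rng.normal(0.0, 1.0, T)
x_true = common + rng.normal(0.0, 0.5, T)
y_true = common + rng.normal(0.0, 0.5, T)
true_sync = np.corrcoef(x_true, y_true)[0, 1]     # close to 1/1.25 = 0.8

# Independent sampling error added by the survey process.
x_obs = x_true + rng.normal(0.0, 1.0, T)
y_obs = y_true + rng.normal(0.0, 1.0, T)
naive_sync = np.corrcoef(x_obs, y_obs)[0, 1]      # attenuated toward zero
```

A state-space approach recovers the true correlation by modelling the observation noise explicitly rather than correlating the raw observations.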

  14. Library support for problem-based learning: an algorithmic approach.

    PubMed

    Ispahany, Nighat; Torraca, Kathren; Chilov, Marina; Zimbler, Elaine R; Matsoukas, Konstantina; Allen, Tracy Y

    2007-01-01

    Academic health sciences libraries can take various approaches to support the problem-based learning component of the curriculum. This article presents one such approach taken to integrate information navigation skills into the small group discussion part of the Pathophysiology course in the second year of the Dental school curriculum. Along with presenting general resources for the course, the Library Toolkit introduced an algorithmic approach to finding answers to sample clinical case questions. While elements of Evidence-Based Practice were introduced, the emphasis was on teaching students to navigate relevant resources and apply various database search techniques to find answers to the clinical problems presented.

  15. Background Subtraction Approach based on Independent Component Analysis

    PubMed Central

    Jiménez-Hernández, Hugo

    2010-01-01

    In this work, a new approach to background subtraction based on independent component analysis is presented. This approach assumes that background and foreground information are mixed in a given sequence of images. Foreground and background components can then be identified if their probability density functions are separable in the mixed space. The component estimation process consists of calculating an unmixing matrix, which is obtained with a fast ICA algorithm implemented as a Newton-Raphson maximization. The motion components are then represented by the mid-significant eigenvalues of the unmixing matrix. Finally, the results show the capability of the approach to detect motion efficiently in outdoor and indoor scenarios, and that it is robust to changes in luminance conditions in the scene. PMID:22219704
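
The unmixing step can be illustrated with a minimal numpy-only FastICA (whitening, tanh nonlinearity, symmetric decorrelation) applied to two synthetic 1-D sources standing in for "background" and "foreground" components. This is a generic textbook sketch, not the paper's implementation or its image-sequence pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two independent non-Gaussian sources, linearly mixed.
n = 5000
s1 = np.sign(np.sin(np.linspace(0.0, 40.0 * np.pi, n)))  # square wave
s2 = rng.uniform(-1.0, 1.0, n)                           # uniform noise
S = np.vstack([s1, s2])
A = np.array([[0.7, 0.3], [0.4, 0.6]])                   # mixing matrix
X = A @ S                                                # observed mixtures

# Whiten the observations.
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(Xc @ Xc.T / n)
Z = E @ np.diag(d ** -0.5) @ E.T @ Xc

# FastICA: tanh nonlinearity plus symmetric decorrelation.
W = rng.normal(size=(2, 2))
for _ in range(200):
    G = np.tanh(W @ Z)
    W = (G @ Z.T) / n - np.diag((1.0 - G**2).mean(axis=1)) @ W
    wd, wE = np.linalg.eigh(W @ W.T)
    W = wE @ np.diag(wd ** -0.5) @ wE.T @ W    # W <- (W W^T)^(-1/2) W

Y = W @ Z    # recovered (unmixed) components, up to sign and order
```

In the paper's setting the rows of X would be vectorized frames of the image sequence, and the recovered components would separate static background from moving foreground.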

  16. A Carbon Monitoring System Approach to US Coastal Wetland Carbon Fluxes: Progress Towards a Tier II Accounting Method with Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Windham-Myers, L.; Holmquist, J. R.; Bergamaschi, B. A.; Byrd, K. B.; Callaway, J.; Crooks, S.; Drexler, J. Z.; Feagin, R. A.; Ferner, M. C.; Gonneea, M. E.; Kroeger, K. D.; Megonigal, P.; Morris, J. T.; Schile, L. M.; Simard, M.; Sutton-Grier, A.; Takekawa, J.; Troxler, T.; Weller, D.; Woo, I.

    2015-12-01

    Despite their high rates of long-term carbon (C) sequestration when compared to upland ecosystems, coastal C accounting is only recently receiving the attention of policy makers and carbon markets. Assessing accuracy and uncertainty in net C flux estimates requires both direct and derived measurements based on both short and long term dynamics in key drivers, particularly soil accretion rates and soil organic content. We are testing the ability of remote sensing products and national scale datasets to estimate biomass and soil stocks and fluxes over a wide range of spatial and temporal scales. For example, the 2013 Wetlands Supplement to the 2006 IPCC GHG national inventory reporting guidelines requests information on development of Tier I-III reporting, which express increasing levels of detail. We report progress toward development of a Carbon Monitoring System for "blue carbon" that may be useful for IPCC reporting guidelines at Tier II levels. Our project uses a current dataset of publicly available and contributed field-based measurements to validate models of changing soil C stocks, across a broad range of U.S. tidal wetland types and land-use conversions. Additionally, development of biomass algorithms for both radar and spectral datasets will be tested and used to determine the "price of precision" of different satellite products. We discuss progress in calculating Tier II estimates focusing on variation introduced by the different input datasets. These include the USFWS National Wetlands Inventory, NOAA Coastal Change Analysis Program, and combinations to calculate tidal wetland area. We also assess the use of different attributes and depths from the USDA-SSURGO database to map soil C density. Finally, we examine the relative benefit of radar, spectral and hybrid approaches to biomass mapping in tidal marshes and mangroves. 
While the US currently plans to report GHG emissions at a Tier I level, we argue that a Tier II analysis is possible due to national

  17. Negotiating Accountability during Student Teaching: The Influence of an Inquiry-Based Student Teaching Seminar

    ERIC Educational Resources Information Center

    Cuenca, Alexander

    2014-01-01

    Drawing on the work of Russian literary critic, Mikhail Bakhtin, this article explores how an inquiry-based social studies student teaching seminar helped three preservice teachers negotiate the pressures of standards-based reforms during student teaching. The author first examines how initial perceptions of standardization and high-stakes testing…

  18. When Creative Problem Solving Strategy Meets Web-Based Cooperative Learning Environment in Accounting Education

    ERIC Educational Resources Information Center

    Cheng, Kai Wen

    2011-01-01

    Background: Facing highly competitive and changing environment, cultivating citizens with problem-solving attitudes is one critical vision of education. In brief, the importance of education is to cultivate students with practical abilities. Realizing the advantages of web-based cooperative learning (web-based CL) and creative problem solving…

  19. Educational Accountability

    ERIC Educational Resources Information Center

    Pincoffs, Edmund L.

    1973-01-01

    Discusses educational accountability as the paradigm of performance contracting, presents some arguments for and against accountability, and discusses the goals of education and the responsibility of the teacher. (Author/PG)

  20. Local fractal dimension based approaches for colonic polyp classification.

    PubMed

    Häfner, Michael; Tamaki, Toru; Tanaka, Shinji; Uhl, Andreas; Wimmer, Georg; Yoshida, Shigeto

    2015-12-01

    This work introduces texture analysis methods that are based on computing the local fractal dimension (LFD; also called the local density function) and applies them to colonic polyp classification. The methods are tested on 8 HD-endoscopic image databases, where each database is acquired using a different imaging modality (Pentax's i-Scan technology combined with or without staining the mucosa), and on a zoom-endoscopic image database using narrow band imaging. In this paper, we present three novel extensions to an LFD based approach. These extensions additionally extract shape and/or gradient information of the image to enhance the discriminativity of the original approach. To compare the results of the LFD based approaches with those of other approaches, five state-of-the-art approaches for colonic polyp classification are applied to the employed databases. Experiments show that LFD based approaches are well suited for colonic polyp classification, especially the three proposed extensions, which are the best performing methods, or at least among the best performing methods, for each of the employed databases. The methods are additionally tested by means of a public texture image database, the UIUCtex database. With this database, the viewpoint invariance of the methods is assessed, an important feature for the employed endoscopic image databases. Results imply that most of the LFD based methods are more viewpoint invariant than the other methods. However, the shape, size and orientation adapted LFD approaches (which are especially designed to enhance the viewpoint invariance) are in general not more viewpoint invariant than the other LFD based approaches.
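
Box counting, a close relative of the local density function used in such methods, gives a quick feel for fractal-dimension estimation. The sketch below is generic (not the paper's LFD features): it estimates the global box-counting dimension of a binary mask as the slope of log(box count) against log(1/box size), and checks it on shapes with known dimension.

```python
import numpy as np

def box_counting_dimension(mask, sizes=(1, 2, 4, 8, 16, 32)):
    """Estimate the fractal (box-counting) dimension of a binary image mask."""
    counts = []
    for s in sizes:
        h, w = mask.shape
        # Count boxes of side s containing at least one foreground pixel.
        boxed = mask[:h - h % s, :w - w % s].reshape(h // s, s, w // s, s)
        counts.append(boxed.any(axis=(1, 3)).sum())
    # Slope of log(count) vs log(1/size) estimates the dimension.
    coeffs = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return coeffs[0]

# A filled square has dimension ~2; a straight line ~1.
square = np.ones((64, 64), dtype=bool)
line = np.zeros((64, 64), dtype=bool)
line[32, :] = True
```

A local variant evaluates the same slope in a neighbourhood around each pixel, which is the kind of per-pixel descriptor the texture features build on.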

  1. An energy-based model accounting for snow accumulation and snowmelt in a coniferous forest and in an open area

    NASA Astrophysics Data System (ADS)

    Matějka, Ondřej; Jeníček, Michal

    2016-04-01

    An energy balance approach was used to simulate snow water equivalent (SWE) evolution in an open area, a forest clearing, and a coniferous forest during the winter seasons 2011/12 and 2012/13 in the Bystřice River basin (Krušné Mountains, Czech Republic). The aim was to describe the impact of vegetation on snow accumulation and snowmelt under different forest canopy structures and tree densities. Hemispherical photographs were used to describe the forest canopy structure. An energy balance model of snow accumulation and melt was set up and adjusted to account for the effects of the forest canopy on the driving meteorological variables. Leaf area index derived from 32 hemispherical photographs of vegetation and sky was used to implement the forest influence in the snow model. The model was evaluated using snow depth and SWE data measured at 16 localities in the winter seasons from 2011 to 2013. The model was able to reproduce the SWE evolution in both winter seasons beneath the forest canopy, in the forest clearing, and in the open area. The SWE maximum at forest sites was 18% lower than in open areas and forest clearings. The contribution of shortwave radiation to the snowmelt rate was 50% lower in forest areas than in open areas due to the shading effect. The contribution of turbulent fluxes was 30% lower at forest sites compared to openings because wind speed was reduced to as little as 10% of the values at corresponding open areas. An indirect estimate of interception rates was derived: between 14 and 60% of snowfall was intercepted and sublimated in the forest canopy in both winter seasons. Based on the model results, an underestimation of solid precipitation (measured with a heated precipitation gauge) at the Hřebečná weather station was revealed: snowfall was underestimated by 40% in winter season 2011/12 and by 13% in winter season 2012/13. Although the model formulation appeared sufficient for both analysed winter seasons, canopy effects on the longwave radiation and ground heat flux were not
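
The accumulation/melt bookkeeping at the core of an energy-balance SWE model can be sketched in a few lines. The toy step below uses synthetic forcing and omits the canopy adjustments described in the abstract (interception scaling of snowfall, shading of shortwave input); it simply converts positive net energy into melt via the latent heat of fusion.

```python
LF = 0.334e6      # latent heat of fusion, J/kg
RHO_W = 1000.0    # water density, kg/m^3

def step_swe(swe_mm, snowfall_mm, net_energy_wm2, dt=86400.0):
    """One daily step: accumulate snowfall, then melt if net energy > 0."""
    swe_mm += snowfall_mm
    if net_energy_wm2 > 0.0:
        melt_m = net_energy_wm2 * dt / (RHO_W * LF)   # metres of water
        swe_mm = max(0.0, swe_mm - melt_m * 1000.0)   # convert to mm w.e.
    return swe_mm

# Synthetic season: 30 days of accumulation, then 30 days of melt.
swe = 0.0
for day in range(30):
    swe = step_swe(swe, snowfall_mm=5.0, net_energy_wm2=-10.0)
peak = swe
for day in range(30):
    swe = step_swe(swe, snowfall_mm=0.0, net_energy_wm2=50.0)
```

In a forest run, a canopy factor derived from leaf area index would reduce both `snowfall_mm` (interception) and the shortwave part of `net_energy_wm2` (shading), reproducing the lower forest SWE maxima reported above.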

  2. Retrieval-Based Model Accounts for Striking Profile of Episodic Memory and Generalization

    PubMed Central

    Banino, Andrea; Koster, Raphael; Hassabis, Demis; Kumaran, Dharshan

    2016-01-01

    A fundamental theoretical tension exists between the role of the hippocampus in generalizing across a set of related episodes, and in supporting memory for individual episodes. Whilst the former requires an appreciation of the commonalities across episodes, the latter emphasizes the representation of the specifics of individual experiences. We developed a novel version of the hippocampal-dependent paired associate inference (PAI) paradigm, which afforded us the unique opportunity to investigate the relationship between episodic memory and generalization in parallel. Across four experiments, we provide surprising evidence that the overlap between object pairs in the PAI paradigm results in a marked loss of episodic memory. Critically, however, we demonstrate that superior generalization ability was associated with stronger episodic memory. Through computational simulations we show that this striking profile of behavioral findings is best accounted for by a mechanism by which generalization occurs at the point of retrieval, through the recombination of related episodes on the fly. Taken together, our study offers new insights into the intricate relationship between episodic memory and generalization, and constrains theories of the mechanisms by which the hippocampus supports generalization. PMID:27510579

  3. Creep lifetime prediction of oxide-dispersion-strengthened nickel-base superalloys: A micromechanically based approach

    NASA Astrophysics Data System (ADS)

    Heilmaier, M.; Reppich, B.

    1996-12-01

    The high-temperature creep behavior of the oxide-dispersion-strengthened (ODS) nickel-base superalloys MA 754 and MA 6000 has been investigated at temperatures up to 1273 K and lifetimes of approximately 4000 hours using monotonic creep tests at constant true stress σ, as well as constant extension rate tests (CERTs) at constant strain rate ε̇. The derivation of creep rupture-lifetime diagrams is usually performed with conventional engineering parametric methods, according to Sherby and Dorn or Larson and Miller. In contrast, an alternative method is presented that is based on a more microstructural approach. In order to describe creep, the effective stress model takes into account the hardening contribution σ_p caused by the presence of second-phase particles, as well as the classical Taylor back stress caused by dislocations. The modeled strain rate-stress dependence can be transferred directly into creep-rupture stress-lifetime diagrams using a modified Monkman-Grant (MG) relationship, which adequately describes the interrelation between ε̇, representing dislocation motion, and lifetime t_f, representing creep failure. The comparison with measured creep-rupture data proves the validity of the proposed micromechanical concept.
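
The Monkman-Grant step that links strain rate to lifetime is simple to state: with the modified relation (ε̇_min)^m · t_f = C, the rupture lifetime follows directly from the modelled minimum creep rate. The sketch below uses hypothetical constants, not the paper's fitted values for MA 754 or MA 6000.

```python
# Modified Monkman-Grant relation: (min creep rate)^m * t_f = C.
M_EXP = 1.0    # MG exponent (hypothetical value)
C_MG = 0.05    # MG constant (hypothetical value, strain units)

def lifetime_hours(min_creep_rate_per_h, m=M_EXP, C=C_MG):
    """Predict creep rupture lifetime from the minimum creep rate."""
    return C / (min_creep_rate_per_h ** m)
```

With these invented constants, a minimum creep rate of 1e-5 per hour maps to a 5000-hour lifetime; the key point is the monotonic trade-off between creep rate and life.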

  4. RESULTS FROM A DEMONSTRATION OF RF-BASED UF6 CYLINDER ACCOUNTING AND TRACKING SYSTEM INSTALLED AT A USEC FACILITY

    SciTech Connect

    Pickett, Chris A; Kovacic, Donald N; Morgan, Jim; Younkin, James R; Carrick, Bernie; Whittle, Ken; Johns, R E

    2008-09-01

    add tamper-indicating and data authentication features to some of the pertinent system components. Future efforts will focus on these needs along with implementing protocols relevant to IAEA safeguards. The work detailed in this report demonstrates the feasibility of constructing RF devices that can survive the operational rigors associated with the transportation, storage, and processing of UF6 cylinders. The system software specially designed for this project is called Cylinder Accounting and Tracking System (CATS). This report details the elements of the CATS rules-based architecture and its use in safeguards-monitoring and asset-tracking applications. Information is also provided on improvements needed to make the technology ready, as well as options for improving the safeguards aspects of the technology. The report also includes feedback from personnel involved in the testing, as well as individuals who could utilize an RF-based system in supporting the performance of their work. The system software was set up to support a Mailbox declaration, where a declaration can be made either before or after cylinder movements take place. When the declaration is made before cylinders move, the operators must enter this information into CATS. If the IAEA then shows up unexpectedly at the facility, they can see how closely the operational condition matches the declaration. If the declaration is made after the cylinders move, this provides greater operational flexibility when schedules are interrupted or are changed, by allowing operators to declare what moves have been completed. The IAEA can then compare where cylinders are with where CATS or the system says they are located. The ability of CATS to automatically generate Mailbox declarations is seen by the authors as a desirable feature. The Mailbox approach is accepted by the IAEA but has not been widely implemented (and never in enrichment facilities). 
During the course of this project, we have incorporated alternative

  5. Better ILP-Based Approaches to Haplotype Assembly.

    PubMed

    Chen, Zhi-Zhong; Deng, Fei; Shen, Chao; Wang, Yiji; Wang, Lusheng

    2016-07-01

    Haplotype assembly is the direct construction of the haplotypes of an individual from that individual's sequence fragments (reads). Although a number of programs have been designed for computing optimal or heuristic solutions to the haplotype assembly problem, computing an optimal solution may take days or even months, while computing a heuristic solution usually requires a trade-off between speed and accuracy. This article refines a previously known integer linear programming-based (ILP-based) approach to the haplotype assembly problem in two ways. First, the read-matrices of some datasets (such as NA12878) come with a quality score for each base in the reads; we propose to utilize these qualities in the ILP-based approach. Second, we propose to use the ILP-based approach to improve the output of any heuristic program for the problem. Experiments with both real and simulated datasets show that the quality scores of read-matrices help us find more accurate solutions without significant loss of speed. Moreover, our experimental results show that the proposed hybrid approach improves the output of ReFHap (the current leading heuristic) significantly (by almost 25% in QAN50 score) without significant loss of speed, and can even find optimal solutions in much shorter time than the original ILP-based approach. Our program is available upon request to the authors. PMID:27347882
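
Haplotype assembly is commonly formalized as minimum error correction (MEC): choose a pair of complementary haplotypes that minimizes the number of read entries that must be flipped to make every read consistent with one of the two. The toy sketch below solves MEC by brute force on an invented read matrix; the paper's ILP approach optimizes the same objective at realistic scale.

```python
from itertools import product

# Tiny read matrix over 4 heterozygous SNP sites; None marks an uncovered site.
reads = [
    (0, 0, 1, None),
    (None, 0, 1, 1),
    (1, 1, None, 0),
    (1, 1, 0, 0),
    (0, 0, 1, 1),
]

def mismatches(read, hap):
    return sum(1 for r, h in zip(read, hap) if r is not None and r != h)

def mec(hap):
    """MEC score: each read is charged its distance to the closer of the two
    complementary haplotypes (all sites assumed heterozygous)."""
    comp = tuple(1 - h for h in hap)
    return sum(min(mismatches(r, hap), mismatches(r, comp)) for r in reads)

# Exhaustive search over all 2^4 candidate haplotypes (feasible only for toy
# instances; real solvers use ILP or heuristics such as ReFHap).
best = min(product((0, 1), repeat=4), key=mec)
```

Here the reads are error-free and consistent with the pair (0,0,1,1)/(1,1,0,0), so the optimum has MEC score 0; base-quality weighting, as proposed in the article, replaces the unit mismatch cost with a per-base penalty.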

  7. A relaxation-based approach to damage modeling

    NASA Astrophysics Data System (ADS)

    Junker, Philipp; Schwarz, Stephan; Makowski, Jerzy; Hackl, Klaus

    2016-10-01

    Material models, including softening effects due to, for example, damage and localizations, share the problem of ill-posed boundary value problems that yield mesh-dependent finite element results. It is thus necessary to apply regularization techniques that couple local behavior described, for example, by internal variables, at a spatial level. This can take account of the gradient of the internal variable to yield mesh-independent finite element results. In this paper, we present a new approach to damage modeling that does not use common field functions, inclusion of gradients or complex integration techniques: Appropriate modifications of the relaxed (condensed) energy hold the same advantage as other methods, but with much less numerical effort. We start with the theoretical derivation and then discuss the numerical treatment. Finally, we present finite element results that prove empirically how the new approach works.

  8. Assessing a New Approach to Class-Based Affirmative Action

    ERIC Educational Resources Information Center

    Gaertner, Matthew N.

    2011-01-01

    In November, 2008, Colorado and Nebraska voted on amendments that sought to end race-based affirmative action at public universities. In anticipation of the vote, Colorado's flagship public institution--The University of Colorado at Boulder (CU)--explored statistical approaches to support class-based affirmative action. This paper details CU's…

  9. Component-Based Approach in Learning Management System Development

    ERIC Educational Resources Information Center

    Zaitseva, Larisa; Bule, Jekaterina; Makarov, Sergey

    2013-01-01

    The paper describes a component-based approach (CBA) to learning management system development. Learning objects as components of e-learning courses, and their metadata, are considered. The learning management system based on CBA being developed at Riga Technical University, namely its architecture, elements and possibilities, are…

  10. Assessing a New Approach to Class-Based Affirmative Action

    ERIC Educational Resources Information Center

    Gaertner, Matthew Newman

    2011-01-01

    In November, 2008, Colorado and Nebraska voted on amendments that sought to end race-based affirmative action at public universities in those states. In anticipation of the vote, the University of Colorado at Boulder (CU) explored statistical approaches to support class-based (i.e., socioeconomic) affirmative action. This dissertation introduces…

  11. LIFE CLIMATREE project: A novel approach for accounting and monitoring carbon sequestration of tree crops and their potential as carbon sink areas

    NASA Astrophysics Data System (ADS)

    Stergiou, John; Tagaris, Efthimios; -Eleni Sotiropoulou, Rafaella

    2016-04-01

    Climate change mitigation is one of the most important objectives of the Kyoto Convention, and is mostly oriented towards reducing GHG emissions. However, carbon sinks are retained only in the calculation of forest capacity, since agricultural land and farming practices that secure carbon stored in soils have not been recognized in GHG accounting, possibly resulting in incorrect estimations of the carbon dioxide balance in the atmosphere. The agricultural sector, a key sector in the EU, has had a consistent strategic framework since 1954 in the form of the Common Agricultural Policy (CAP). In its latest reform of 2013 (Reg. (EU) 1305/13), the CAP recognized the significance of agriculture as a key player in climate change policy. To fill this gap, the "LIFE ClimaTree" project has recently been funded by the European Commission, aiming to provide a novel method for including tree crop cultivations in the LULUCF accounting rules for GHG emissions and removals. In the framework of the "LIFE ClimaTree" project, the carbon sink within the EU will be estimated, with the calculated tree crop capacity included, for both current and future climatic conditions up to the 2050s, using the GISS-WRF modeling system at a very fine scale (9 km x 9 km) under the RCP8.5 and RCP4.5 climate scenarios. Acknowledgement: LIFE CLIMATREE project "A novel approach for accounting and monitoring carbon sequestration of tree crops and their potential as carbon sink areas" (LIFE14 CCM/GR/000635).

  12. Requirements for independent community-based quality assessment and accountability practices in humanitarian assistance and disaster relief activities.

    PubMed

    Kirsch, Thomas D; Perrin, Paul; Burkle, Frederick M; Canny, William; Purdin, Susan; Lin, William; Sauer, Lauren

    2012-06-01

    During responses to disasters, the credibility of humanitarian agencies can be threatened by perceptions of poor quality in the responses. Many initiatives have been introduced over the last two decades to help address these issues and enhance the overall quality of humanitarian response, often with limited success. Important gaps and deficiencies remain in quality assurance efforts, including potential conflicts of interest. While many definitions of quality exist, a common component is that meeting the needs of the "beneficiary" or "client" is the ultimate determinant of quality. This paper examines the current status of assessment and accountability practices in the humanitarian response community, identifies gaps, and recommends timely, concise, and population-based assessments to elicit the affected population's perspective on quality performance and accountability. Direct and independent surveys of the disaster-affected population will help to redirect ongoing aid efforts, and generate more effective and comparable methods for assessing the quality of humanitarian practices and assistance activities.

  13. Component-based approach to robot vision for computational efficiency

    NASA Astrophysics Data System (ADS)

    Lee, Junhee; Kim, Dongsun; Park, Yeonchool; Park, Sooyong; Lee, Sukhan

    2007-12-01

    The purpose of this paper is to show the merit and feasibility of the component-based approach to robot system integration. Many methodologies, such as the 'component-based approach' and the 'middleware-based approach', have been suggested for efficiently integrating various complex functions on a robot system. However, these methodologies have not been broadly adopted in robot function development, because such 'top-down' methodologies were modeled and researched in the software engineering field, which differs from robot function research, and so have not been trusted by function developers. The developers' main concern about these methodologies is the performance decrease that originates from the overhead of a framework. This paper counters this misunderstanding by showing a time-performance increase in an experiment using the 'Self-Healing, Adaptive and Growing softwarE (SHAGE)' framework, one of the component-based frameworks. Visual object recognition is chosen as an example of a real robot function for the experiment.

  14. A biomimetic vision-based hovercraft accounts for bees' complex behaviour in various corridors.

    PubMed

    Roubieu, Frédéric L; Serres, Julien R; Colonnier, Fabien; Franceschini, Nicolas; Viollet, Stéphane; Ruffier, Franck

    2014-09-01

    Here we present the first systematic comparison between the visual guidance behaviour of a biomimetic robot and that of honeybees flying in similar environments. We built a miniature hovercraft which can travel safely along corridors with various configurations. For the first time, we implemented on a real physical robot the 'lateral optic flow regulation autopilot', which we previously studied in computer simulations. This autopilot, inspired by the results of experiments on various species of hymenoptera, consists of two intertwined feedback loops, the speed and lateral control loops, each of which has its own optic flow (OF) set-point. A heading-lock system makes the robot move straight ahead as fast as 69 cm s(-1) with a clearance from one wall as small as 31 cm, giving an unusually high translational OF value (125° s(-1)). Our biomimetic robot was found to navigate safely along straight, tapered and bent corridors, and to react appropriately to perturbations such as the lack of texture on one wall, the presence of a tapering or non-stationary section of the corridor, and even a sloping terrain equivalent to a wind disturbance. The front end of the visual system consists of only two local motion sensors (LMS), one on each side. This minimalistic visual system, which measures the lateral OF, suffices to control both the robot's forward speed and its clearance from the walls without ever measuring any speeds or distances. We added two additional LMSs oriented at +/-45° to improve the robot's performance in steeply tapered corridors. The simple control system accounts for worker bees' ability to navigate safely in six challenging environments: straight corridors, single walls, tapered corridors, straight corridors with part of one wall moving or missing, as well as in the presence of wind. PMID:24615558
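
    The dual optic-flow regulation principle described above can be sketched in a few lines of Python (an illustrative toy with invented 1-D kinematics, gains and set-points, not the authors' controller): the speed loop holds the sum of the two side OFs at one set-point, while the lateral loop holds the larger side OF at another, without ever measuring speed or distance directly.

```python
def simulate_of_regulation(D=1.0, y=0.2, v=0.1, of_sum_sp=3.0, of_side_sp=2.0,
                           k_v=0.2, k_y=0.1, dt=0.01, steps=20000):
    """Corridor of width D; robot at lateral position y with forward
    speed v. Each side OF ~ v / distance to that wall; only the two
    OF signals are fed back, never v or y themselves."""
    for _ in range(steps):
        of_left, of_right = v / y, v / (D - y)
        # speed loop: regulate the total (sum of both sides) optic flow
        v += dt * k_v * (of_sum_sp - (of_left + of_right))
        # lateral loop: regulate the larger unilateral optic flow
        if of_left >= of_right:   # left wall closer: steer away if its OF is too high
            y += dt * k_y * (of_left - of_side_sp)
        else:                     # right wall closer
            y -= dt * k_y * (of_right - of_side_sp)
        y = min(max(y, 0.05), D - 0.05)   # keep off the walls
        v = max(v, 0.01)
    return y, v

y, v = simulate_of_regulation()
```

    With these (hypothetical) set-points the sketch settles where the larger side OF equals 2.0 and the total OF equals 3.0, i.e. near y = 1/3 and v = 2/3 for a unit-width corridor: clearance and speed emerge jointly from the two OF set-points.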

  16. An Efficient Deterministic Approach to Model-based Prediction Uncertainty Estimation

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew J.; Saxena, Abhinav; Goebel, Kai

    2012-01-01

    Prognostics deals with predicting the end of life (EOL) of a system. EOL is a random variable, due to the presence of process noise and uncertainty in the future inputs to the system. Prognostics algorithms must account for this inherent uncertainty. In addition, these algorithms never know exactly the state of the system at the desired time of prediction, or the exact model describing the future evolution of the system, which accumulates additional uncertainty into the predicted EOL. Prediction algorithms that do not account for these sources of uncertainty misrepresent the EOL and can lead to poor decisions based on their results. In this paper, we explore the impact of uncertainty in the prediction problem. We develop a general model-based prediction algorithm that incorporates these sources of uncertainty, and propose a novel approach to efficiently handle uncertainty in the future input trajectories of a system by using the unscented transformation. Using this approach, we are not only able to reduce the computational load but can also estimate the bounds of uncertainty in a deterministic manner, which can be useful during decision-making. Using a lithium-ion battery as a case study, we perform several simulation-based experiments to explore these issues, and validate the overall approach using experimental data from a battery testbed.
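
    The unscented transformation mentioned above can be illustrated with a minimal scalar version (a generic sketch of the standard scaled unscented transform, not the paper's full prediction algorithm): a Gaussian is propagated through a nonlinearity using 2n+1 deterministic sigma points instead of Monte Carlo sampling.

```python
import math

def unscented_transform(mean, var, f, alpha=1.0, beta=2.0, kappa=2.0):
    """Scalar (n = 1) scaled unscented transform: propagate a Gaussian
    (mean, var) through a nonlinearity f using three deterministically
    chosen sigma points, then recombine with fixed weights."""
    n = 1
    lam = alpha ** 2 * (n + kappa) - n
    spread = math.sqrt((n + lam) * var)
    sigmas = [mean, mean + spread, mean - spread]
    wm = [lam / (n + lam), 0.5 / (n + lam), 0.5 / (n + lam)]   # mean weights
    wc = [wm[0] + (1.0 - alpha ** 2 + beta), wm[1], wm[2]]     # covariance weights
    ys = [f(x) for x in sigmas]
    y_mean = sum(w * y for w, y in zip(wm, ys))
    y_var = sum(w * (y - y_mean) ** 2 for w, y in zip(wc, ys))
    return y_mean, y_var
```

    For a linear map the transform is exact, which makes it easy to sanity-check; the deterministic sigma points are what allow uncertainty bounds to be obtained without random sampling, as the abstract emphasizes.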

  17. Discriminating bot accounts based solely on temporal features of microblog behavior

    NASA Astrophysics Data System (ADS)

    Pan, Junshan; Liu, Ying; Liu, Xiang; Hu, Hanping

    2016-05-01

    As the largest microblog service in China, Sina Weibo has attracted numerous automated applications (known as bots) due to its popularity and open architecture. We classify the active users from Sina Weibo into human, bot-based and hybrid groups based solely on the study of temporal features of their posting behavior. The anomalous burstiness parameter and time-interval entropy value are exploited to characterize automation. We also reveal different behavior patterns among the three types of users regarding their reposting ratio, daily rhythm and active days. Our findings may help Sina Weibo manage a better community and should be considered for dynamic models of microblog behaviors.
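
    The two temporal features named above are straightforward to compute from a sequence of inter-post intervals (a minimal sketch; the Goh-Barabasi burstiness form B = (sigma - mu) / (sigma + mu) and the one-minute bin size are standard choices, not necessarily the paper's exact parameters):

```python
import math
from collections import Counter

def burstiness(intervals):
    """Burstiness B = (sigma - mu) / (sigma + mu) of inter-post time
    intervals: B -> -1 for clock-like periodic posting (bot-like),
    B ~ 0 for a Poisson process, B -> 1 for very bursty human sessions."""
    n = len(intervals)
    mu = sum(intervals) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in intervals) / n)
    return (sigma - mu) / (sigma + mu)

def interval_entropy(intervals, bin_size=60):
    """Shannon entropy (bits) of the binned interval distribution;
    near-zero entropy flags highly regular, automation-like timing."""
    counts = Counter(int(x // bin_size) for x in intervals)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())
```

    An account posting exactly once a minute scores B = -1 and zero entropy, while irregular human posting pushes both measures up.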

  18. Accounting & Computing Curriculum Guide.

    ERIC Educational Resources Information Center

    Avani, Nathan T.; And Others

    This curriculum guide consists of materials for use in teaching a competency-based accounting and computing course that is designed to prepare students for employability in the following occupational areas: inventory control clerk, invoice clerk, payroll clerk, traffic clerk, general ledger bookkeeper, accounting clerk, account information clerk,…

  19. Community-Based Participatory Evaluation: The Healthy Start Approach

    PubMed Central

    Braithwaite, Ronald L.; McKenzie, Robetta D.; Pruitt, Vikki; Holden, Kisha B.; Aaron, Katrina; Hollimon, Chavone

    2013-01-01

    The use of community-based participatory research has gained momentum as a viable approach to academic and community engagement for research over the past 20 years. This article discusses an approach for extending the process with an emphasis on evaluation of a community partnership–driven initiative and thus advances the concept of conducting community-based participatory evaluation (CBPE) through a model used by the Healthy Start project of the Augusta Partnership for Children, Inc., in Augusta, Georgia. Application of the CBPE approach advances the importance of bilateral engagements with consumers and academic evaluators. The CBPE model shows promise as a reliable and credible evaluation approach for community-level assessment of health promotion programs. PMID:22461687

  20. Frequency Affects Object Relative Clause Processing: Some Evidence in Favor of Usage-Based Accounts

    ERIC Educational Resources Information Center

    Reali, Florencia

    2014-01-01

    The processing difficulty of nested grammatical structure has been explained by different psycholinguistic theories. Here I provide corpus and behavioral evidence in favor of usage-based models, focusing on the case of object relative clauses in Spanish as a first language. A corpus analysis of spoken Spanish reveals that, as in English, the…

  1. ACCOUNTING FOR BIOLOGICAL AND ANTHROPOGENIC FACTORS IN NATIONAL LAND-BASED CARBON BUDGETS

    EPA Science Inventory

    Efforts to quantify net greenhouse gas emissions at the national scale, as required by the United Nations Framework Convention on Climate Change, must include both industrial emissions and the net flux associated with the land base. In this study, data on current land use, rates ...

  2. Audit Culture: Unintended Consequences of Accountability Practices in Evidence-Based Programs

    ERIC Educational Resources Information Center

    Owczarzak, Jill; Broaddus, Michelle; Pinkerton, Steven

    2016-01-01

    Evaluation has become expected within the nonprofit sector, including HIV prevention service delivery through community-based organizations (CBOs). While staff and directors at CBOs may acknowledge the potential contribution of evaluation data to the improvement of agency services, the results of evaluation are often used to demonstrate fiscal…

  3. Bringing Technology to Students' Proximity: A Sociocultural Account of Technology-Based Learning Projects

    ERIC Educational Resources Information Center

    Mukama, Evode

    2014-01-01

    This paper depicts a study carried out in Rwanda concerning university students who participated in a contest to produce short documentary films. The purpose of this research is to conceptualize these kinds of technology-based learning projects (TBLPs) through a sociocultural perspective. The methodology included focus group discussions and field…

  4. Making Adequate Yearly Progress: Teacher Learning in School-Based Accountability Contexts

    ERIC Educational Resources Information Center

    Rinke, Carol; Valli, Linda

    2010-01-01

    Context: This study addresses recent changes in professional development policy, practice, and theory, in which professional development has increasingly become continual, collaborative, and school based. We consider both traditional notions of structure and content as well as context in developing a more complete understanding of professional…

  5. Working Memory Span Development: A Time-Based Resource-Sharing Model Account

    ERIC Educational Resources Information Center

    Barrouillet, Pierre; Gavens, Nathalie; Vergauwe, Evie; Gaillard, Vinciane; Camos, Valerie

    2009-01-01

    The time-based resource-sharing model (P. Barrouillet, S. Bernardin, & V. Camos, 2004) assumes that during complex working memory span tasks, attention is frequently and surreptitiously switched from processing to reactivate decaying memory traces before their complete loss. Three experiments involving children from 5 to 14 years of age…

  6. THE FUTURE OF COMPUTER-BASED TOXICITY PREDICTION: MECHANISM-BASED MODELS VS. INFORMATION MINING APPROACHES

    EPA Science Inventory


    The Future of Computer-Based Toxicity Prediction:
    Mechanism-Based Models vs. Information Mining Approaches

    When we speak of computer-based toxicity prediction, we are generally referring to a broad array of approaches which rely primarily upon chemical structure ...

  7. Accounting for failure: risk-based regulation and the problems of ensuring healthcare quality in the NHS

    PubMed Central

    Beaussier, Anne-Laure; Demeritt, David; Griffiths, Alex; Rothstein, Henry

    2016-01-01

    In this paper, we examine why risk-based policy instruments have failed to improve the proportionality, effectiveness, and legitimacy of healthcare quality regulation in the National Health Service (NHS) in England. Rather than trying to prevent all possible harms, risk-based approaches promise to rationalise and manage the inevitable limits of what regulation can hope to achieve by focusing regulatory standard-setting and enforcement activity on the highest priority risks, as determined through formal assessments of their probability and consequences. As such, risk-based approaches have been enthusiastically adopted by healthcare quality regulators over the last decade. However, by drawing on historical policy analysis and in-depth interviews with 15 high-level UK informants in 2013–2015, we identify a series of practical problems in using risk-based policy instruments for defining, assessing, and ensuring compliance with healthcare quality standards. Based on our analysis, we go on to consider why, despite a succession of failures, healthcare regulators remain committed to developing and using risk-based approaches. We conclude by identifying several preconditions for successful risk-based regulation: goals must be clear and trade-offs between them amenable to agreement; regulators must be able to reliably assess the probability and consequences of adverse outcomes; regulators must have a range of enforcement tools that can be deployed in proportion to risk; and there must be political tolerance for adverse outcomes. PMID:27499677

  8. Biomarker Discovery by Novel Sensors Based on Nanoproteomics Approaches

    PubMed Central

    Dasilva, Noelia; Díez, Paula; Matarraz, Sergio; González-González, María; Paradinas, Sara; Orfao, Alberto; Fuentes, Manuel

    2012-01-01

    During the last years, proteomics has facilitated biomarker discovery by coupling high-throughput techniques with novel nanosensors. In the present review, we focus on the study of label-based and label-free detection systems, as well as nanotechnology approaches, indicating their advantages and applications in biomarker discovery. In addition, several disease biomarkers are shown in order to display the clinical importance of the improvement of sensitivity and selectivity by using nanoproteomics approaches as novel sensors. PMID:22438764

  9. Healthcare information system approaches based on middleware concepts.

    PubMed

    Holena, M; Blobel, B

    1997-01-01

    To meet the challenges of efficiency and high-level quality, health care systems must implement the "Shared Care" paradigm of distributed co-operating systems. To this end, both newly developed and legacy applications must be fully integrated into the care process. These requirements can be fulfilled by information systems based on middleware concepts. In this paper, the middleware approaches HL7, DHE, and CORBA are described, and the relevance of these approaches to the healthcare domain is documented. The description presented here is complemented by two other papers in this volume, concentrating on the evaluation of the approaches and on their security threats and solutions. PMID:10175361

  10. A Novel Concept Acquisition Approach Based on Formal Contexts

    PubMed Central

    Qian, Ting; Wei, Ling

    2014-01-01

    As an important tool for data analysis and knowledge processing, formal concept analysis (FCA) has been applied to many fields. In this paper, we introduce a new method to find all formal concepts based on formal contexts. The method reduces the amount of intent calculation, and the corresponding algorithm of our approach is proposed. The main theorems and the corresponding algorithm are illustrated by examples. Finally, several real-life databases are analyzed to demonstrate the application of the proposed approach. Experimental results show that the proposed approach is simple and effective. PMID:25147834
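
    For concreteness, a brute-force baseline for finding all formal concepts of a small context looks as follows (an exponential-time illustration of the task the paper's method speeds up, not the proposed algorithm itself): every intent is an intersection of object intents, and each extent is the set of objects whose attributes contain that intent.

```python
from itertools import combinations

def formal_concepts(context):
    """All formal concepts (extent, intent) of a context given as
    {object: set of attributes}. Every intent is an intersection of
    object intents (the empty intersection giving the full attribute
    set); the matching extent is the set of objects containing it."""
    all_attrs = frozenset(a for attrs in context.values() for a in attrs)
    intents = {all_attrs}
    for r in range(1, len(context) + 1):
        for objs in combinations(context, r):
            intents.add(frozenset(set.intersection(*(set(context[g]) for g in objs))))
    return {(frozenset(g for g, attrs in context.items() if intent <= attrs), intent)
            for intent in intents}
```

    On a three-object toy context this enumerates the full concept lattice; the cost of forming all these intent intersections is exactly what a smarter algorithm must avoid on real-life databases.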

  11. An information-based neural approach to constraint satisfaction.

    PubMed

    Jönsson, H; Söderberg, B

    2001-08-01

    A novel artificial neural network approach to constraint satisfaction problems is presented. Based on information-theoretical considerations, it differs from a conventional mean-field approach in the form of the resulting free energy. The method, implemented as an annealing algorithm, is numerically explored on a testbed of K-SAT problems. The performance shows a dramatic improvement over that of a conventional mean-field approach and is comparable to that of a state-of-the-art dedicated heuristic (GSAT+walk). The real strength of the method, however, lies in its generality. With minor modifications, it is applicable to arbitrary types of discrete constraint satisfaction problems. PMID:11506672

  12. The Transconjunctival Transorbital Approach: A Keyhole Approach to the Midline Anterior Skull Base

    PubMed Central

    Raza, Shaan M.; Quinones-Hinojosa, Alfredo; Lim, Michael; Owusu Boahene, Kofi D.

    2015-01-01

    OBJECTIVE To report an initial experience with a medial transorbital approach to the midline skull base performed via a transconjunctival incision. METHODS The authors retrospectively reviewed their clinical experience with this approach in the management of benign cranial base pathology. Preoperative imaging, intraoperative records, hospitalization charts, and postoperative records were reviewed for relevant data. RESULTS During the period 2009–2011, six patients underwent a transconjunctival craniotomy performed by a neurosurgeon and otolaryngologist–head and neck surgeon working together. The indications for surgery were esthesioneuroblastoma in one patient, juvenile angiofibroma in one patient, Paget disease in one patient, and recalcitrant cerebrospinal fluid leaks in three patients. Three patients had prior cranial base surgery (either open craniotomy or an endonasal approach) done at another institution. The mean length of stay was 3.8 days; mean follow-up was 6 months. Surgery was considered successful in all cases (negative margins or no leak recurrence); diplopia was noted in one patient postoperatively. CONCLUSIONS The transconjunctival medial orbital craniectomy provides a minimally invasive keyhole approach to lesions located anteriorly along the anterior cranial fossa that are in the midline with lateral extension over the orbital roof. Based on our initial experience with this technique, the working space afforded limits complex surgical dissection; this approach is primarily well suited for less extensive pathology. PMID:22722037

  13. Can we reconcile atmospheric estimates of the Northern terrestrial carbon sink with land-based accounting?

    SciTech Connect

    Ciais, Philippe; Luyssaert, Sebastiaan; Chevallier, Fredric; Poussi, Zegbeu; Peylin, Philippe; Breon, Francois-Marie; Canadell, J.G.; Shvidenko, Anatoly; Jonas, Matthias; King, Anthony Wayne; Schulze, E.-D.; Roedenbeck, Christian; Piao, Shilong; Peters, Wouter

    2010-10-01

    We estimate the northern hemisphere (NH) terrestrial carbon sink by comparing four recent atmospheric inversions with land-based C accounting data for six large northern regions. The mean NH terrestrial CO2 sink from the inversion models is 1.7 Pg C year(-1) over the period 2000-2004. The uncertainty of this estimate is based on the typical individual (1-sigma) precision of one inversion (0.9 Pg C year(-1)) and is consistent with the min-max range of the four inversion mean estimates (0.8 Pg C year(-1)). Inversions agree within their uncertainty for the distribution of the NH sink of CO2 in longitude, with Russia being the largest sink. The land-based accounting estimate of the NH carbon sink is 1.7 Pg C year(-1) for the sum of the six regions studied. The 1-sigma uncertainty of the land-based estimate (0.3 Pg C year(-1)) is smaller than that of atmospheric inversions, but no independent land-based flux estimate is available to derive a between-accounting-model uncertainty. Encouragingly, the top-down atmospheric and the bottom-up land-based methods converge to consistent mean estimates within their respective errors, increasing the confidence in the overall budget. These results also confirm the continued critical role of NH terrestrial ecosystems in slowing down the atmospheric accumulation of anthropogenic CO2.

  14. What Part of "No" Do Children Not Understand? A Usage-Based Account of Multiword Negation

    ERIC Educational Resources Information Center

    Cameron-Faulkner, Thea; Lieven, Elena; Theakston, Anna

    2007-01-01

    The study investigates the development of English multiword negation, in particular the negation of zero marked verbs (e.g. "no sleep", "not see", "can't reach") from a usage-based perspective. The data was taken from a dense database consisting of the speech of an English-speaking child (Brian) aged 2;3-3;4 (MLU 2.05-3.1) and his mother. The…

  15. Liaison acquisition, word segmentation and construction in French: a usage-based account.

    PubMed

    Chevrot, Jean-Pierre; Dugua, Celine; Fayol, Michel

    2009-06-01

    In the linguistics field, liaison in French is interpreted as an indicator of interactions between the various levels of language organization. The current study examines the same issue while adopting a developmental perspective. Five experiments involving children aged two to six years provide evidence for a developmental scenario which interrelates a number of different issues: the acquisition of phonological alternations, the segmentation of new words, the long-term stabilization of the word form in the lexicon and the formation of item-based constructions. According to this scenario, children favour the presence of initial CV syllables when segmenting stored chunks of speech of the type word1-liaison-word2 (les arbres 'the trees' is segmented as /le/+/zarbr/). They cope with the variation of the liaison in the input by memorizing multiple exemplars of the same word2 (/zarbr/, /narbr/). They learn the correct relations between the word1s and the word2 exemplars through exposure to the well-formed sequence (un+/narbr/, deux+/zarbr/). They generalize the relation between a word1 and a class of word2 exemplars beginning with a specific liaison consonant by integrating this information into an item-based schema (e.g. un+/nX/, deux+/zX/). This model is based on the idea that the segmentation of new words and the development of syntactic schemas are two aspects of the same process. PMID:18947442

  16. Long-term consumption-based accounting of CO2 (1948-2011) in global view

    NASA Astrophysics Data System (ADS)

    Yang, Z.; Chou, J.

    2013-12-01

    Accompanying the boom of the global economy, CO2 emissions have soared during the past decades, with the developing group of countries showing higher growth rates than the developed group. Geographical separation of production and consumption, or so-called carbon leakage, is suggested to be one of the main reasons. Based on detailed input-output data, existing studies have derived consumption-based emission series spanning some 20 years, together with combustion-based ones. In order to broaden the data coverage, which existing studies fail to do due to data availability, we develop a new model embracing the years from 1948 to 2011. The results basically coincide with existing studies, and display emission estimations at three levels: an upper bound, a lower bound and a mean. Moreover, the model is succinct enough to be easily expanded for future estimations of GHGs or other ecological footprint analyses considering material flows. The calculated emission transfers and consumption-based emissions are rising significantly, and a synthetic consideration of historical emissions in both views (producer and consumer, respectively) is recommended in order to achieve solid progress in future climate policies, especially for a future ADP scheme to which all parties are subject.
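
    The basic consumption-based accounting identity underlying such series is simple (a generic sketch; the paper's model additionally produces upper-bound, lower-bound and mean estimates from input-output data):

```python
def consumption_based_emissions(territorial, embodied_in_exports, embodied_in_imports):
    """Consumption-based emissions re-attribute the emissions embodied
    in trade: start from territorial (production) emissions, subtract
    what is emitted at home on behalf of foreign consumers, and add
    what is emitted abroad on behalf of domestic consumers."""
    return territorial - embodied_in_exports + embodied_in_imports

# A country emitting 10 Gt at home, of which 3 Gt serves exports,
# while its imports embody 1.5 Gt emitted abroad:
print(consumption_based_emissions(10.0, 3.0, 1.5))  # -> 8.5
```

    The gap between the two views (10 Gt produced vs. 8.5 Gt consumed here) is exactly the emission transfer, or carbon leakage, that the abstract discusses.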

  17. [Rational bases for new approaches to the therapy of pediatric solid tumors: immunotherapy and gene therapy].

    PubMed

    Pistoia, V; Prigione, I; Facchetti, P; Corrias, M V

    1994-01-01

    Neuroblastoma is one of the commonest solid tumors in children. Conventional therapeutic approaches, such as surgery, chemotherapy and radiotherapy, fail to control tumor progression in stage III and IV patients. The search for novel therapeutic strategies should necessarily take into account immunotherapy and gene therapy. Here the theoretical bases for the development of such approaches are discussed. Studies carried out with neuroblastoma (NB) cell lines have shown that neoplastic cells express a wide array of potential tumor associated antigens (TAA) but are devoid of HLA molecules which are necessary for TAA presentation to the host immune system. Transfection of NB cells with the interferon gamma gene appears a promising approach, since this cytokine up-regulates the expression of class I HLA molecules in NB cells. Other cytokines of potential interest for gene transfer studies are interleukin 2 (IL2) and interleukin 12 (IL12).

  18. A Variance Based Active Learning Approach for Named Entity Recognition

    NASA Astrophysics Data System (ADS)

    Hassanzadeh, Hamed; Keyvanpour, Mohammadreza

    The cost of manually annotating corpora is one of the significant issues in many text-based tasks such as text mining, semantic annotation and, more generally, information extraction. Active learning is an approach that deals with the reduction of labeling costs. In this paper we propose an effective active learning approach based on minimal variance that reduces manual annotation cost by using a small number of manually labeled examples. In our approach we use a confidence measure based on the model's variance that reaches a considerable accuracy for annotating entities. A conditional random field (CRF) is chosen as the underlying learning model due to its promising performance in many sequence labeling tasks. The experiments show that the proposed method needs considerably fewer manually labeled samples to produce a desirable result.
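
    The selection step of such an approach can be sketched as follows (illustrative only: the paper's confidence measure is CRF-specific, whereas here the Bernoulli output variance p(1 - p) stands in as the uncertainty score):

```python
def select_queries(probs, k):
    """Variance-based active learning selection: rank unlabeled items
    by the variance of the model's prediction, p * (1 - p), and return
    the indices of the k most uncertain ones for manual annotation."""
    ranked = sorted(range(len(probs)),
                    key=lambda i: probs[i] * (1.0 - probs[i]),
                    reverse=True)
    return ranked[:k]
```

    Items the model is nearly sure about (p close to 0 or 1) have low variance and are skipped, so the annotation budget is spent only where labels are most informative.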

  19. An Open Science Approach to Gis-Based Paleoenvironment Data

    NASA Astrophysics Data System (ADS)

    Willmes, C.; Becker, D.; Verheul, J.; Yener, Y.; Zickel, M.; Bolten, A.; Bubenzer, O.; Bareth, G.

    2016-06-01

    Paleoenvironmental studies and according information (data) are abundantly published and available in the scientific record. However, GIS-based paleoenvironmental information and datasets are comparably rare. Here, we present an Open Science approach for creating GIS-based data and maps of paleoenvironments, and Open Access publishing them in a web based Spatial Data Infrastructure (SDI), for access by the archaeology and paleoenvironment communities. We introduce an approach to gather and create GIS datasets from published non-GIS based facts and information (data), such as analogous maps, textual information or figures in scientific publications. These collected and created geo-datasets and maps are then published, including a Digital Object Identifier (DOI) to facilitate scholarly reuse and citation of the data, in a web based Open Access Research Data Management Infrastructure. The geo-datasets are additionally published in an Open Geospatial Consortium (OGC) standards compliant SDI, and available for GIS integration via OGC Open Web Services (OWS).

  20. A Market-Based Approach to Multi-factory Scheduling

    NASA Astrophysics Data System (ADS)

    Vytelingum, Perukrishnen; Rogers, Alex; MacBeth, Douglas K.; Dutta, Partha; Stranjak, Armin; Jennings, Nicholas R.

    In this paper, we report on the design of a novel market-based approach for decentralised scheduling across multiple factories. Specifically, because of the limitations of scheduling in a centralised manner - which requires a center to have complete and perfect information for optimality and the truthful revelation of potentially commercially private preferences to that center - we advocate an informationally decentralised approach that is both agile and dynamic. In particular, this work adopts a market-based approach for decentralised scheduling by considering the different stakeholders representing different factories as self-interested, profit-motivated economic agents that trade resources for the scheduling of jobs. The overall schedule of these jobs is then an emergent behaviour of the strategic interaction of these trading agents bidding for resources in a market based on limited information and their own preferences. Using a simple (zero-intelligence) bidding strategy, we empirically demonstrate that our market-based approach achieves a lower bound efficiency of 84%. This represents a trade-off between a reasonable level of efficiency (compared to a centralised approach) and the desirable benefits of a decentralised solution.
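
    The zero-intelligence bidding experiment can be sketched as a toy double auction (an illustration under invented values and costs, not the authors' multi-factory mechanism): job "buyers" bid uniformly below their private values, machine-slot "sellers" ask uniformly above their private costs, and a trade clears whenever the bid covers the ask.

```python
import random

def zi_market(values, costs, rounds=10000, seed=42):
    """Zero-intelligence-constrained traders: each round a random buyer
    bids uniformly below its private value and a random seller asks
    uniformly above its private cost; a trade clears when bid >= ask.
    Returns allocative efficiency = realised surplus / maximum surplus."""
    rng = random.Random(seed)
    # maximum surplus: match the highest values with the lowest costs
    best = sum(max(v - c, 0.0)
               for v, c in zip(sorted(values, reverse=True), sorted(costs)))
    buyers, sellers = list(values), list(costs)
    surplus = 0.0
    for _ in range(rounds):
        if not buyers or not sellers:
            break
        v, c = rng.choice(buyers), rng.choice(sellers)
        bid, ask = rng.uniform(0.0, v), rng.uniform(c, 1.0)
        if bid >= ask:          # bid covers ask: the pair trades and leaves
            surplus += v - c
            buyers.remove(v)
            sellers.remove(c)
    return surplus / best

eff = zi_market([0.9, 0.8, 0.7, 0.6], [0.1, 0.2, 0.3, 0.4])
```

    Even with no intelligence at all in the bids, most of the available surplus is typically extracted as an emergent property of the market, echoing the lower-bound efficiency the paper reports for its decentralised scheduler.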

  1. A new computational account of cognitive control over reinforcement-based decision-making: Modeling of a probabilistic learning task.

    PubMed

    Zendehrouh, Sareh

    2015-11-01

    Recent work in the decision-making field offers a dual-system account of the decision-making process. This theory holds that the process is conducted by two main controllers: a goal-directed system and a habitual system. In the reinforcement learning (RL) domain, habitual behaviors are associated with model-free methods, in which appropriate actions are learned through trial-and-error experience, whereas goal-directed behaviors are associated with model-based methods of RL, in which actions are selected using a model of the environment. Studies on cognitive control also suggest that during processes like decision-making, some cortical and subcortical structures work in concert to monitor the consequences of decisions and to adjust control according to current task demands. Here a computational model is presented based on dual-system theory and the cognitive-control perspective on decision-making. The proposed model is used to simulate human performance on a variant of a probabilistic learning task. The basic proposal is that the brain implements a dual controller, while an accompanying monitoring system detects several kinds of conflict, including a hypothetical cost conflict. The simulation results address existing theories about two event-related potentials, namely error-related negativity (ERN) and feedback-related negativity (FRN), and explore which account of them fits best. Based on the results, some testable predictions are also presented. PMID:26339919
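
The model-free ("habitual") half of such a dual-system account can be sketched as tabular trial-and-error learning on a two-choice probabilistic task (an illustrative toy, not the paper's model; the reward probabilities and learning parameters are assumed):

```python
import random

def q_learning_bandit(p_reward, episodes=5000, alpha=0.1, eps=0.1, seed=1):
    """Model-free ('habitual') learner on a two-choice probabilistic task.

    Action values are learned purely from trial-and-error reward
    prediction errors, with no model of the task structure.
    """
    rng = random.Random(seed)
    q = [0.0, 0.0]
    for _ in range(episodes):
        # epsilon-greedy choice between the two options
        a = rng.randrange(2) if rng.random() < eps else max((0, 1), key=lambda i: q[i])
        r = 1.0 if rng.random() < p_reward[a] else 0.0
        q[a] += alpha * (r - q[a])      # delta-rule update on the prediction error
    return q

q = q_learning_bandit([0.8, 0.2])   # option 0 rewarded 80% of the time
```

A model-based (goal-directed) controller would instead plan over an explicit model of the task; the contrast between the two is the core of the dual-system account.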

  3. Accounting for Genetic Architecture Improves Sequence Based Genomic Prediction for a Drosophila Fitness Trait

    PubMed Central

    Magwire, Michael; Schlather, Martin; Simianer, Henner; Mackay, Trudy F. C.

    2015-01-01

    The ability to predict quantitative trait phenotypes from molecular polymorphism data will revolutionize evolutionary biology, medicine and human biology, and animal and plant breeding. Efforts to map quantitative trait loci have yielded novel insights into the biology of quantitative traits, but the combination of individually significant quantitative trait loci typically has low predictive ability. Utilizing all segregating variants can give good predictive ability in plant and animal breeding populations, but gives little insight into trait biology. Here, we used the Drosophila Genetic Reference Panel to perform both a genome wide association analysis and genomic prediction for the fitness-related trait chill coma recovery time. We found substantial total genetic variation for chill coma recovery time, with a genetic architecture that differs between males and females, a small number of molecular variants with large main effects, and evidence for epistasis. Although the top additive variants explained 36% (17%) of the genetic variance among lines in females (males), the predictive ability using genomic best linear unbiased prediction and a relationship matrix using all common segregating variants was very low for females and zero for males. We hypothesized that the low predictive ability was due to the mismatch between the infinitesimal genetic architecture assumed by the genomic best linear unbiased prediction model and the true genetic architecture of chill coma recovery time. Indeed, we found that the predictive ability of the genomic best linear unbiased prediction model is markedly improved when we combine quantitative trait locus mapping with genomic prediction by only including the top variants associated with main and epistatic effects in the relationship matrix. This trait-associated prediction approach has the advantage that it yields biologically interpretable prediction models. PMID:25950439
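
The relationship matrices referred to here can be sketched from first principles (a minimal genomic relationship matrix with made-up 0/1/2 genotype scores; the study's full GBLUP machinery is not reproduced):

```python
def grm(genotypes, keep=None):
    """Genomic relationship matrix G = Z Z' / m from column-centered
    genotype scores (0/1/2 minor-allele counts).

    Passing `keep` restricts G to a subset of marker indices, mimicking
    the trait-associated relationship matrix built from top variants.
    """
    idx = list(range(len(genotypes[0]))) if keep is None else keep
    n = len(genotypes)
    means = [sum(row[j] for row in genotypes) / n for j in idx]
    # center each retained marker column
    z = [[row[j] - mu for j, mu in zip(idx, means)] for row in genotypes]
    m = len(idx)
    return [[sum(a * b for a, b in zip(zi, zj)) / m for zj in z] for zi in z]

genos = [[0, 1, 2, 0], [1, 1, 0, 2], [2, 0, 1, 1]]  # 3 lines x 4 markers
G_all = grm(genos)                # all segregating variants
G_top = grm(genos, keep=[0, 2])   # hypothetical 'top associated' markers only
```

Swapping `G_all` for `G_top` inside genomic best linear unbiased prediction is, in spirit, the trait-associated prediction approach the abstract describes.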

  4. Grid-based electronic structure calculations: The tensor decomposition approach

    NASA Astrophysics Data System (ADS)

    Rakhuba, M. V.; Oseledets, I. V.

    2016-05-01

    We present a fully grid-based approach for solving the Hartree-Fock and all-electron Kohn-Sham equations based on a low-rank approximation of the three-dimensional electron orbitals. Due to the low-rank structure, the total complexity of the algorithm scales linearly with the one-dimensional grid size. This linear complexity allows for the use of fine grids, e.g. 8192³, and thus a cheap extrapolation procedure. We test the proposed approach on closed-shell atoms up to argon, on several molecules, and on clusters of hydrogen atoms. All tests show systematic convergence to the required accuracy.
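
The linear scaling exploited here can be illustrated with the simplest possible low-rank case: a separable (rank-1) function on a 3D grid. This is only an illustration of why factored formats scale with the one-dimensional grid size; the paper's tensor decompositions are far more general, and the Gaussian factor is an assumption:

```python
import math

n = 64                                 # one-dimensional grid size
h = 10.0 / (n - 1)
grid = [-5.0 + i * h for i in range(n)]

# A separable (rank-1) orbital-like function f(x,y,z) = g(x) g(y) g(z):
# storing three length-n factors replaces a dense n**3 array, so memory
# and per-evaluation cost grow linearly with n rather than cubically.
g = [math.exp(-x * x) for x in grid]

def f(i, j, k):
    """Evaluate the 3D function from its 1D factors."""
    return g[i] * g[j] * g[k]

full_entries = n ** 3        # storage for a dense 3D grid
low_rank_entries = 3 * n     # storage for the factored (rank-1) form
```

For genuinely low-rank orbitals, a sum of a few such separable terms suffices, which is what makes grids like 8192³ affordable.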

  5. Promoting accountability in obstetric care: use of criteria-based audit in Viet Nam.

    PubMed

    Bailey, P E; Binh, H T; Bang, H T

    2010-01-01

    Audits can improve clinical and managerial practices, enhance the rational use of limited resources, and improve staff morale and motivation. Staff at five hospitals in Thanh Hoa and Quang Tri provinces (Viet Nam) used criteria-based audit (CBA) as a tool to improve the quality of emergency obstetric and newborn care. CBA compares current practice with standards based on the best available evidence and the local context. The audit cycle begins with a known problem, proceeds with an initial assessment and data collection, analysis of those data, formulation and implementation of an action plan, and a re-evaluation of the topic initially assessed. Teams found that clinical protocols for treating major obstetric complications were not followed, even though national guidelines had been issued in 2002. In an audit of facility organisation, staff addressed obstacles to the timely treatment of obstetric emergencies during off hours. In each audit, teams devised mechanisms to correct problems that resulted in significant improvements when the audit cycle was repeated. CBA improved adherence to national guidelines, improved record-keeping, heightened teamwork, and showed staff that they could identify and solve many of their own problems.

  6. Applying human rights to maternal health: UN Technical Guidance on rights-based approaches.

    PubMed

    Yamin, Alicia Ely

    2013-05-01

    In the last few years there have been several critical milestones in acknowledging the centrality of human rights to sustainably addressing the scourge of maternal death and morbidity around the world, including from the United Nations Human Rights Council. In 2012, the Council adopted a resolution welcoming a Technical Guidance on rights-based approaches to maternal mortality and morbidity, and calling for a report on its implementation in 2 years. The present paper provides an overview of the contents and significance of the Guidance. It reviews how the Guidance can assist policymakers in improving women's health and their enjoyment of rights by setting out the implications of adopting a human rights-based approach at each step of the policy cycle, from planning and budgeting, to ensuring implementation, to monitoring and evaluation, to fostering accountability mechanisms. The Guidance should also prove useful to clinicians in understanding rights frameworks as applied to maternal health.

  7. An Individual-Based Model of Zebrafish Population Dynamics Accounting for Energy Dynamics

    PubMed Central

    Beaudouin, Rémy; Goussen, Benoit; Piccini, Benjamin; Augustine, Starrlight; Devillers, James; Brion, François; Péry, Alexandre R. R.

    2015-01-01

    Developing population dynamics models for zebrafish is crucial in order to extrapolate from toxicity data measured at the organism level to biological levels relevant to support and enhance ecological risk assessment. To achieve this, a dynamic energy budget for individual zebrafish (DEB model) was coupled to an individual-based model of zebrafish population dynamics (IBM model). Next, we fitted the DEB model to new experimental data on zebrafish growth and reproduction, thus improving on existing models. We further analysed the DEB model and the DEB-IBM using a sensitivity analysis. Finally, the predictions of the DEB-IBM were compared to existing observations on natural zebrafish populations, and the predicted population dynamics were realistic. While our zebrafish DEB-IBM can still be improved by acquiring new experimental data on the most uncertain processes (e.g. survival or feeding), it can already serve to predict the impact of compounds at the population level. PMID:25938409

  8. The gamesmanship of sex: a model based on African American adolescent accounts.

    PubMed

    Eyre, S L; Hoffman, V; Millstein, S G

    1998-12-01

    This article examines adolescent understanding of the social context of sexual behavior. Using grounded theory to interpret interviews with 39 African American male and female adolescents, the article builds a model of sex-related behavior as a set of interrelated games. A courtship game involves communication of sexual or romantic interest and, over time, formation of a romantic relationship. A duplicity game draws on conventions of a courtship game to trick a partner into having sex. A disclosure game spreads stories about one's own and other's sex-related activities to peers in a gossip network. Finally, a prestige game builds social reputation in the eyes of peers, typically based on gender-specific standards. The article concludes by examining the meanings that sex-related behavior may have for adolescents and the potential use of social knowledge for facilitating adolescent health.

  9. Accounting for Proof Test Data in a Reliability Based Design Optimization Framework

    NASA Technical Reports Server (NTRS)

    Venter, Gerhard; Scotti, Stephen J.

    2012-01-01

    This paper investigates the use of proof (or acceptance) test data during the reliability based design optimization of structural components. It is assumed that every component will be proof tested and that the component will only enter into service if it passes the proof test. The goal is to reduce the component weight, while maintaining high reliability, by exploiting the proof test results during the design process. The proposed procedure results in the simultaneous design of the structural component and the proof test itself and provides the designer with direct control over the probability of failing the proof test. The procedure is illustrated using two analytical example problems and the results indicate that significant weight savings are possible when exploiting the proof test results during the design process.
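
The idea of exploiting proof-test survival during design can be sketched with a small Monte Carlo experiment: passing the proof test truncates the weak tail of the strength distribution, so the in-service failure probability of surviving components drops. The distributions and parameters below are invented for illustration, not taken from the paper:

```python
import random

def service_failure_prob(proof_load, trials=100000, seed=2):
    """Monte Carlo sketch: in-service failure probability for components
    that survived a deterministic proof test at `proof_load`.

    Strength is lognormal and the service load is Gaussian (illustrative
    units); the proof test screens out components weaker than proof_load.
    """
    rng = random.Random(seed)
    passed = failed = 0
    for _ in range(trials):
        strength = rng.lognormvariate(0.0, 0.2)
        if strength < proof_load:
            continue                      # rejected by the proof test
        passed += 1
        load = rng.gauss(0.8, 0.1)        # random in-service load
        if load > strength:
            failed += 1
    return failed / passed

p_with_proof = service_failure_prob(proof_load=0.9)
p_no_proof = service_failure_prob(proof_load=0.0)
```

In a design-optimization loop this conditional reliability gain is what allows weight to be traded away while the in-service reliability target is still met.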

  10. Pragmatic versus form-based accounts of referential contrast: evidence for effects of informativity expectations.

    PubMed

    Sedivy, Julie C

    2003-01-01

    Characterizing the relationship between form-based linguistic knowledge and representation of context has long been of importance in the study of on-line language processing. Recent experimental research has shown evidence of very rapid effects of referential context in resolving local indeterminacies on-line. However, there has been no consensus regarding the nature of these context effects. The current paper summarizes recent work covering a range of phenomena for which referential contrast has been shown to influence on-line processing, including prenominal and post-nominal modification, focus operators, and intonational focus. The results of the body of work suggest that referential context effects are not limited to situations in which the linguistic form of the utterance directly specifies the point of contact with context. Rather, context effects of a pragmatic, Gricean nature appear to be possible, suggesting the relationship between linguistic form and context in rapid on-line processing can be of a very indirect nature.

  11. A proven knowledge-based approach to prioritizing process information

    NASA Technical Reports Server (NTRS)

    Corsberg, Daniel R.

    1991-01-01

    Many space-related processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect is rapid analysis of the changing process information. During a disturbance, this task can overwhelm humans as well as computers. Humans deal with this by applying heuristics in determining significant information. A simple, knowledge-based approach to prioritizing information is described. The approach models those heuristics that humans would use in similar circumstances. The approach described has received two patents and was implemented in the Alarm Filtering System (AFS) at the Idaho National Engineering Laboratory (INEL). AFS was first developed for application in a nuclear reactor control room. It has since been used in chemical processing applications, where it has had a significant impact on control room environments. The approach uses knowledge-based heuristics to analyze data from process instrumentation and respond to that data according to knowledge encapsulated in objects and rules. While AFS cannot perform the complete diagnosis and control task, it has proven to be extremely effective at filtering and prioritizing information. AFS was used for over two years as a first level of analysis for human diagnosticians. Given the approach's proven track record in a wide variety of practical applications, it should be useful in both ground- and space-based systems.

  12. A Comparison of Seven Cox Regression-Based Models to Account for Heterogeneity Across Multiple HIV Treatment Cohorts in Latin America and the Caribbean

    PubMed Central

    Giganti, Mark J.; Luz, Paula M.; Caro-Vega, Yanink; Cesar, Carina; Padgett, Denis; Koenig, Serena; Echevarria, Juan; McGowan, Catherine C.; Shepherd, Bryan E.

    2015-01-01

    Many studies of HIV/AIDS aggregate data from multiple cohorts to improve power and generalizability. There are several analysis approaches to account for cross-cohort heterogeneity; we assessed how different approaches can impact results from an HIV/AIDS study investigating predictors of mortality. Using data from 13,658 HIV-infected patients starting antiretroviral therapy from seven Latin American and Caribbean cohorts, we illustrate the assumptions of seven readily implementable approaches to account for across-cohort heterogeneity with Cox proportional hazards models, and we compare hazard ratio estimates across approaches. As a sensitivity analysis, we modify cohort membership to generate specific heterogeneity conditions. Hazard ratio estimates varied slightly between the seven analysis approaches, but differences were not clinically meaningful. Adjusted hazard ratio estimates for the association between AIDS at treatment initiation and death varied from 2.00 to 2.20 across approaches that accounted for heterogeneity; the adjusted hazard ratio was estimated as 1.73 in analyses that ignored across-cohort heterogeneity. In sensitivity analyses with more extreme heterogeneity, we noted a slightly greater distinction between approaches. Despite substantial heterogeneity between cohorts, the impact of the specific approach to account for heterogeneity was minimal in our case study. Our results suggest that it is important to account for across-cohort heterogeneity in analyses, but that the specific technique for addressing heterogeneity may be less important. Because of their flexibility in accounting for cohort heterogeneity, we prefer stratification or meta-analysis methods, but we encourage investigators to consider their specific study conditions and objectives. PMID:25647087

  14. Accounting for phase drifts in SSVEP-based BCIs by means of biphasic stimulation.

    PubMed

    Wu, Hung-Yi; Lee, Po-Lei; Chang, Hsiang-Chih; Hsieh, Jen-Chuen

    2011-05-01

    This study proposes a novel biphasic stimulation technique to solve the issue of phase drifts in steady-state visual evoked potential (SSVEPs) in phase-tagged systems. Phase calibration was embedded in stimulus sequences using a biphasic flicker, which is driven by a sequence with alternating reference and phase-shift states. Nine subjects were recruited to participate in off-line and online tests. Signals were bandpass filtered and segmented by trigger signals into reference and phase-shift epochs. Frequency components of SSVEP in the reference and phase-shift epochs were extracted using the Fourier method with a 50% overlapped sliding window. The real and imaginary parts of the SSVEP frequency components were organized into complex vectors in each epoch. Hotelling's t-square test was used to determine the significances of nonzero mean vectors. The rejection of noisy data segments and the validation of gaze detections were made based on p values. The phase difference between the valid mean vectors of reference and phase-shift epochs was used to identify user's gazed targets in this system. Data showed an average information transfer rate of 44.55 and 38.21 bits/min in off-line and online tests, respectively. PMID:21193370
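
The extraction of a complex SSVEP frequency component and the phase difference between reference and phase-shift epochs can be sketched as a single-frequency Fourier sum (synthetic cosine signals; the sampling rate, flicker frequency, and 90° shift are assumptions for illustration):

```python
import cmath
import math

def freq_component(signal, fs, f):
    """Complex Fourier component of `signal` at frequency f (Hz); its
    real and imaginary parts form the vectors analyzed in the paper."""
    n = len(signal)
    return sum(x * cmath.exp(-2j * math.pi * f * k / fs)
               for k, x in enumerate(signal)) / n

fs, f = 250.0, 10.0                     # sampling rate, flicker frequency
t = [k / fs for k in range(500)]        # 2 s window = 20 full cycles
ref = [math.cos(2 * math.pi * f * tk) for tk in t]                   # reference epoch
shift = [math.cos(2 * math.pi * f * tk - math.pi / 2) for tk in t]   # phase-shift epoch

# phase of the shift epoch relative to the reference epoch
phase_diff = cmath.phase(freq_component(shift, fs, f) /
                         freq_component(ref, fs, f))
```

With noisy EEG epochs the paper averages such complex vectors and applies Hotelling's t-square test before trusting the phase; this sketch shows only the noiseless geometry.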

  15. A Multi-Faceted Approach to Inquiry-Based Learning

    NASA Astrophysics Data System (ADS)

    Brudzinski, M. R.; Sikorski, J.

    2009-12-01

    In order to fully attain the benefits of inquiry-based learning, instructors who typically employ the traditional lecture format need to make several adjustments to their approach. This change in styles can be intimidating and logistically difficult to overcome. A stepwise approach to this transformation is likely to be more manageable for individual faculty or departments. In this session, we will describe several features that we are implementing in our introductory geology course with the ultimate goal of converting to an entirely inquiry-based approach. Our project is part of the Miami University initiative in the top 25 enrolled courses to move towards the “student as scholar” model for engaged learning. Some of the features we developed for our course include: student learning outcomes, student development outcomes, out-of-class content quizzes, in-class conceptests, pre-/post-course assessment, reflective knowledge surveys, and daily group activities.

  16. Covariance-based approaches to aeroacoustic noise source analysis.

    PubMed

    Du, Lin; Xu, Luzhou; Li, Jian; Guo, Bin; Stoica, Petre; Bahr, Chris; Cattafesta, Louis N

    2010-11-01

    In this paper, several covariance-based approaches are proposed for aeroacoustic noise source analysis under the assumptions of a single dominant source and all observers contaminated solely by uncorrelated noise. The Cramér-Rao Bounds (CRB) of the unbiased source power estimates are also derived. The proposed methods are evaluated using both simulated data as well as data acquired from an airfoil trailing edge noise experiment in an open-jet aeroacoustic facility. The numerical examples show that the covariance-based algorithms significantly outperform an existing least-squares approach and provide accurate power estimates even under low signal-to-noise ratio (SNR) conditions. Furthermore, the mean-squared-errors (MSEs) of the so-obtained estimates are close to the corresponding CRB especially for a large number of data samples. The experimental results show that the power estimates of the proposed approaches are consistent with one another as long as the core analysis assumptions are obeyed.
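
The core assumption (one dominant source, observers contaminated only by mutually uncorrelated noise) implies that every off-diagonal covariance equals the source power, which a covariance-based estimator can exploit. A minimal sketch with invented parameters, not any specific algorithm from the paper:

```python
import random

def cov(a, b):
    """Sample covariance of two equal-length sequences."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

def source_power_estimate(n_obs=8, n_samp=4000, source_sd=1.0,
                          noise_sd=2.0, seed=3):
    """One source seen by n_obs observers, each with independent noise:
    off-diagonal covariances equal the source power, so averaging them
    recovers it even when the per-observer SNR is low (-6 dB here)."""
    rng = random.Random(seed)
    rows = []
    for _ in range(n_samp):
        s = rng.gauss(0.0, source_sd)
        rows.append([s + rng.gauss(0.0, noise_sd) for _ in range(n_obs)])
    cols = list(zip(*rows))
    est, pairs = 0.0, 0
    for i in range(n_obs):
        for j in range(i + 1, n_obs):
            est += cov(cols[i], cols[j])   # noise cancels off the diagonal
            pairs += 1
    return est / pairs

power = source_power_estimate()   # true source power is 1.0
```

The diagonal entries, by contrast, are inflated by the noise power, which is why purely diagonal (auto-spectral) estimates degrade at low SNR.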

  17. Effects of storm runoff on acid-base accounting of mine drainage

    SciTech Connect

    Sjoegren, D.R.; Olyphant, G.A.; Harper, D.

    1997-12-31

    Pre-reclamation conditions were documented at an abandoned mine site in an upland area at the headwaters of a small perennial stream in southwestern Indiana. Stream discharge and chemistry were monitored from April to October 1995, in an effort to assess the total acid-base budget of outflows from the site. The chemistry of three lakes, a shallow aquifer, and flooded mine voids was also monitored. During the period of monitoring, thirty-five rainfall-runoff events occurred, producing a total storm discharge of approximately 6.12 × 10^7 L. Baseflow during the monitoring period was approximately 1.10 × 10^8 L and was characterized by water chemistry that was similar to that of a spring that issued from the flooded mine voids. Analysis of the discharge and chemistry associated with an isolated thunderstorm revealed fluctuations in acidity that were not congruent with fluctuations in the total discharge hydrograph. For example, acidity increased rapidly during the initial phase of hydrograph rise, but dropped significantly as the storm hydrograph peaked. A second, more subdued, rise in acidity occurred during a second rain pulse, and the acidity gradually decreased to pre-storm levels during hydrograph recession. The trends are interpreted to reflect different sources of storm runoff associated with various components of the total discharge hydrograph. Preliminary calculations indicate that the total quantity of acidity that is discharged during stormflow is about eight times higher than that which is discharged during a comparable period under baseflow conditions. While the lower acid concentrations generated during storm events are ecologically favorable, the increase in total quantities of acidity can have implications for the buffering capacities of receiving water bodies.

  18. Emerging accounting trends accounting for leases.

    PubMed

    Valletta, Robert; Huggins, Brian

    2010-12-01

    A new model for lease accounting can have a significant impact on hospitals and healthcare organizations. The new approach proposes a "right-of-use" model that involves complex estimates and significant administrative burden. Hospitals and health systems that draw heavily on lease arrangements should start preparing for the new approach now even though guidance and a final rule are not expected until mid-2011. This article highlights a number of considerations from the lessee point of view.

  19. A Problem-Based Learning Approach to Entrepreneurship Education

    ERIC Educational Resources Information Center

    Tan, Siok San; Ng, C. K. Frank

    2006-01-01

    Purpose: While it is generally acknowledged that entrepreneurship can be taught, many differ in their opinions about the appropriate methodologies to teach and equip students with the requisite entrepreneurial skills. This paper presents a case to suggest that a problem-based learning (PBL) approach practised at the Republic Polytechnic in…

  20. From Equation to Inequality Using a Function-Based Approach

    ERIC Educational Resources Information Center

    Verikios, Petros; Farmaki, Vassiliki

    2010-01-01

    This article presents features of a qualitative research study concerning the teaching and learning of school algebra using a function-based approach in a grade 8 class, of 23 students, in 26 lessons, in a state school of Athens, in the school year 2003-2004. In this article, we are interested in the inequality concept and our aim is to…

  1. Content Based Image Retrieval and Information Theory: A General Approach.

    ERIC Educational Resources Information Center

    Zachary, John; Iyengar, S. S.; Barhen, Jacob

    2001-01-01

    Proposes an alternative real valued representation of color based on the information theoretic concept of entropy. A theoretical presentation of image entropy is accompanied by a practical description of the merits and limitations of image entropy compared to color histograms. Results suggest that image entropy is a promising approach to image…
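
Image entropy as a single real-valued summary of color content follows directly from the Shannon-entropy definition applied to the pixel histogram (a generic illustration, not the authors' exact formulation; the sample "images" are invented):

```python
import math
from collections import Counter

def image_entropy(pixels):
    """Shannon entropy (bits) of the pixel-value histogram: a compact
    real-valued feature usable for content-based image retrieval."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

flat = [128] * 64          # uniform patch: a one-bin histogram, zero entropy
varied = list(range(64))   # 64 distinct values: maximal entropy, log2(64) bits
```

Unlike a full color histogram, the entropy collapses the distribution to one number, which is both its merit (compactness) and its limitation (very different histograms can share an entropy value).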

  2. An Evidence-Based Approach to Introductory Chemistry

    ERIC Educational Resources Information Center

    Johnson, Philip

    2014-01-01

    Drawing on research into students' understanding, this article argues that the customary approach to introductory chemistry has created difficulties for students. Instead of being based on the notion of "solids, liquids and gases", introductory chemistry should be structured to develop the concept of a substance. The concept of a…

  3. A Genre-Based Approach to Teaching EFL Summary Writing

    ERIC Educational Resources Information Center

    Chen, Yuan-Shan; Su, Shao-Wen

    2012-01-01

    This study utilizes a pre-test/post-test assessment to investigate the instructional efficacy of a genre-based approach to teaching summary writing. Forty-one EFL university students in Taiwan were asked before and after the instruction to summarize a simplified version of The Adventures of Tom Sawyer in a maximum of 500 words. All the students'…

  4. Microscopic Approach to Species Coexistence Based on Evolutionary Game Dynamics

    NASA Astrophysics Data System (ADS)

    Grebogi, Celso; Lai, Ying-Cheng; Wang, Wen-Xu

    2014-12-01

    An outstanding problem in complex systems and mathematical biology is to explore and understand the fundamental mechanisms of species coexistence. Existing approaches are based on niche partitioning, dispersal, chaotic evolutionary dynamics, and more recently, evolutionary games. Here we briefly review a number of fundamental issues associated with the coexistence of mobile species under cyclic competitions in the framework of evolutionary games.
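
Cyclic competition of the rock-paper-scissors type discussed here can be sketched with a tiny well-mixed imitation dynamic (an illustrative toy; the reviewed models, e.g. those with explicit mobility and spatial structure, are considerably richer):

```python
import random

def rps_dynamics(n=300, steps=20000, seed=4):
    """Well-mixed cyclic competition: in each step a random pair of
    individuals interacts and the loser adopts the winner's strategy.
    Species 0 beats 1, 1 beats 2, and 2 beats 0."""
    rng = random.Random(seed)
    beats = {0: 1, 1: 2, 2: 0}
    pop = [i % 3 for i in range(n)]       # start with equal abundances
    for _ in range(steps):
        a, b = rng.randrange(n), rng.randrange(n)
        if beats[pop[a]] == pop[b]:
            pop[b] = pop[a]               # a wins, b imitates a
        elif beats[pop[b]] == pop[a]:
            pop[a] = pop[b]               # b wins, a imitates b
    return [pop.count(s) for s in range(3)]

counts = rps_dynamics()
```

In well-mixed populations such dynamics fluctuate and can eventually fix on a single species, whereas spatial versions can sustain long-lived coexistence patterns, which is one of the central issues the review examines.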

  5. Tennis: Applied Examples of a Game-Based Teaching Approach

    ERIC Educational Resources Information Center

    Crespo, Miguel; Reid, Machar M.; Miley, Dave

    2004-01-01

    In this article, the authors reveal that tennis has been increasingly taught with a tactical model or game-based approach, which emphasizes learning through practice in match-like drills and actual play, rather than in practicing strokes for exact technical execution. Its goal is to facilitate the player's understanding of the tactical, physical…

  6. Evaluation of a Blog Based Parent Involvement Approach by Parents

    ERIC Educational Resources Information Center

    Ozcinar, Zehra; Ekizoglu, Nihat

    2013-01-01

    Despite the well-known benefits of parent involvement in children's education, research clearly shows that it is difficult to effectively involve parents. This study aims to capture parents' views of a Blog Based Parent Involvement Approach (BPIA) designed to secure parent involvement in education by strengthening school-parent communication. Data…

  7. IIM Digital Library System: Consortia-Based Approach.

    ERIC Educational Resources Information Center

    Pandian, M. Paul; Jambhekar, Ashok; Karisiddappa, C. R.

    2002-01-01

    Provides a framework for the design and development of an intranet model based on a consortia approach by the Indian Institutes of Management (IIM) digital library system that will facilitate information access and use by providing a single Web-enabled window to users to their own resources and to sources in other participating institutions.…

  8. School-Based HIV Prevention: A Multidisciplinary Approach.

    ERIC Educational Resources Information Center

    Kerr, Dianne L.; And Others

    This manual was written to help school-based professionals implement school health education programs to prevent the spread of the human immunodeficiency virus (HIV). The manual provides a framework and plan to promote an interdisciplinary approach to HIV education in schools. The manual begins with a review of basic facts about acquired immune…

  9. Combining U.S.-based prioritization tools to improve screening level accountability for environmental impact: the case of the chemical manufacturing industry.

    PubMed

    Zhou, Xiaoying; Schoenung, Julie M

    2009-12-15

    There are two quantitative indicators that are most widely used to assess the extent of compliance of industrial facilities with environmental regulations: the quantity of hazardous waste generated and the amount of toxics released. These indicators, albeit useful in terms of some environmental monitoring, fail to account for direct or indirect effects on human and environmental health, especially when aggregating total quantity of releases for a facility or industry sector. Thus, there is a need for a more comprehensive approach that can prioritize a particular chemical (or industry sector) on the basis of its relevant environmental performance and impact on human health. Accordingly, the objective of the present study is to formulate an aggregation of tools that can simultaneously capture multiple effects and several environmental impact categories. This approach allows us to compare and combine results generated with the aid of select U.S.-based quantitative impact assessment tools, thereby supplementing compliance-based metrics such as data from the U.S. Toxic Release Inventory. A case study, which presents findings for the U.S. chemical manufacturing industry, is presented to illustrate the aggregation of these tools. Environmental impacts due to both upstream and manufacturing activities are also evaluated for each industry sector. The proposed combinatorial analysis allows for a more robust evaluation for rating and prioritizing the environmental impacts of industrial waste.

  10. A Dynamic Path Planning Approach for Multirobot Sensor-Based Coverage Considering Energy Constraints.

    PubMed

    Yazici, Ahmet; Kirlik, Gokhan; Parlaktuna, Osman; Sipahioglu, Aydin

    2014-03-01

    Multirobot sensor-based coverage path planning determines a tour for each robot in a team such that every point in a given workspace is covered by at least one robot using its sensors. In sensor-based coverage of narrow spaces, i.e., where obstacles lie within the sensor range, a generalized Voronoi diagram (GVD)-based graph can be used to model the environment. A complete sensor-based coverage path plan for the robot team can be obtained by applying capacitated arc routing problem solution methods to the GVD-based graph. Unlike the capacitated arc routing problem, however, the sensor-based coverage problem requires two types of edge demands to be considered. Therefore, a modified Ulusoy algorithm is used to obtain mobile robot tours that take into account two different energy consumption cases during sensor-based coverage. However, due to the partially unknown nature of the environment, the robots may encounter obstacles on their tours. This requires a replanning process that considers the remaining energy capacities and the current positions of the robots. In this paper, the modified Ulusoy algorithm is extended to handle this dynamic planning problem. A dynamic path-planning approach is proposed for multirobot sensor-based coverage of narrow environments that considers the energy capacities of the mobile robots. The approach is tested in a laboratory environment using Pioneer 3-DX mobile robots. Simulations are also conducted for a larger test environment. PMID:23757551
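
    As a rough illustration of the split step underlying Ulusoy-style tour construction (the paper's modified algorithm additionally handles two edge-demand types on the GVD-based graph), the sketch below partitions a fixed giant coverage tour into the fewest contiguous subtours whose total energy demand fits a robot's capacity; the demand values and capacity are hypothetical.

```python
def split_tour(demands, capacity):
    """Split a giant coverage tour (edge energy demands, in order) into
    the fewest contiguous subtours that each fit within `capacity`.
    Dynamic program: best[i] = min subtours covering the first i edges."""
    n = len(demands)
    best = [float("inf")] * (n + 1)
    cut = [0] * (n + 1)          # cut[i]: start index of the last subtour
    best[0] = 0
    for i in range(1, n + 1):
        load = 0
        for j in range(i, 0, -1):        # candidate subtour demands[j-1:i]
            load += demands[j - 1]
            if load > capacity:
                break
            if best[j - 1] + 1 < best[i]:
                best[i] = best[j - 1] + 1
                cut[i] = j - 1
    tours, i = [], n
    while i > 0:                          # walk the cuts back to the depot
        tours.append(list(range(cut[i], i)))
        i = cut[i]
    return best[n], tours[::-1]
```

    A replanning call after an unexpected obstacle would simply rerun the split on the uncovered edges with the robots' remaining capacities.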

  11. Simulation-Based Approach for Site-Specific Optimization of Hydrokinetic Turbine Arrays

    NASA Astrophysics Data System (ADS)

    Sotiropoulos, F.; Chawdhary, S.; Yang, X.; Khosronejad, A.; Angelidis, D.

    2014-12-01

    A simulation-based approach has been developed to enable site-specific optimization of tidal and current turbine arrays in real-life waterways. The computational code is based on the St. Anthony Falls Laboratory Virtual StreamLab (VSL3D), which is able to carry out high-fidelity simulations of turbulent flow and sediment transport processes in rivers and streams taking into account the arbitrary geometrical complexity characterizing natural waterways. The computational framework can be used either in turbine-resolving mode, to take into account all geometrical details of the turbine, or with the turbines parameterized as actuator disks or actuator lines. Locally refined grids are employed to dramatically increase the resolution of the simulation and enable efficient simulations of multi-turbine arrays. Turbine/sediment interactions are simulated using the coupled hydro-morphodynamic module of VSL3D. The predictive capabilities of the resulting computational framework will be demonstrated by applying it to simulate turbulent flow past a tri-frame configuration of hydrokinetic turbines in a rigid-bed turbulent open channel flow as well as turbines mounted on mobile bed open channels to investigate turbine/sediment interactions. The utility of the simulation-based approach for guiding the optimal development of turbine arrays in real-life waterways will also be discussed and demonstrated. This work was supported by NSF grant IIP-1318201. Simulations were carried out at the Minnesota Supercomputing Institute.
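
    When the turbines are parameterized as actuator disks rather than geometrically resolved, each disk exerts a thrust on the flow of the form T = ½ρAC_TU². A minimal sketch of that parameterization (the coefficient values and function name are illustrative assumptions, not taken from VSL3D):

```python
import math

RHO_WATER = 1000.0  # kg/m^3, illustrative freshwater density

def actuator_disk(diameter_m, u_inflow, c_t=0.8, c_p=0.45):
    """Thrust [N] and extracted power [W] of a hydrokinetic turbine
    modeled as an actuator disk with assumed thrust/power coefficients."""
    area = math.pi * (diameter_m / 2.0) ** 2
    thrust = 0.5 * RHO_WATER * area * c_t * u_inflow ** 2
    power = 0.5 * RHO_WATER * area * c_p * u_inflow ** 3
    return thrust, power
```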

  12. Branch-Based Model for the Diameters of the Pulmonary Airways: Accounting for Departures From Self-Consistency and Registration Errors

    SciTech Connect

    Neradilek, Moni B.; Polissar, Nayak L.; Einstein, Daniel R.; Glenny, Robb W.; Minard, Kevin R.; Carson, James P.; Jiao, Xiangmin; Jacob, Richard E.; Cox, Timothy C.; Postlethwait, Edward M.; Corley, Richard A.

    2012-04-24

    We examine a previously published branch-based approach to modeling airway diameters that is predicated on the assumption of self-consistency across all levels of the tree. We mathematically formulate this assumption, propose a method to test it, and develop a more general model to be used when the assumption is violated. We discuss the effect of measurement error on the estimated models and propose methods that account for it. The methods are illustrated on data from MRI and CT images of silicone casts of two rats, two normal monkeys, and one ozone-exposed monkey. Our results showed substantial departures from self-consistency in all five subjects. When departures from self-consistency exist, we do not recommend using the self-consistency model, even as an approximation, as we have shown that it is likely to lead to an incorrect representation of the diameter geometry. Measurement error has an important impact on the estimated morphometry models and needs to be accounted for in the analysis.

  13. Comparison of individual-based modeling and population approaches for prediction of foodborne pathogens growth.

    PubMed

    Augustin, Jean-Christophe; Ferrier, Rachel; Hezard, Bernard; Lintz, Adrienne; Stahl, Valérie

    2015-02-01

    An individual-based modeling (IBM) approach combined with microenvironment modeling of vacuum-packed cold-smoked salmon was more effective in describing the variability of the growth of a few Listeria monocytogenes cells contaminating irradiated salmon slices than traditional population models. The IBM approach was particularly relevant for predicting the absence of growth in 25% (5 among 20) of artificially contaminated cold-smoked salmon samples stored at 8 °C. These results confirmed similar observations obtained with smear soft cheese (Ferrier et al., 2013). These two different food models were used to compare the IBM/microscale and population/macroscale modeling approaches in more global exposure and risk assessment frameworks taking into account the variability and/or the uncertainty of the factors influencing the growth of L. monocytogenes. We observed that the traditional population models significantly overestimate exposure and risk estimates in comparison to the IBM approach when contamination of foods occurs with a low number of cells (<100 per serving). Moreover, the exposure estimates obtained with the population model were characterized by great uncertainty. The overestimation was mainly linked to the ability of IBM to predict no-growth situations rather than to the consideration of the microscale environment. On the other hand, when the aim of quantitative risk assessment studies is only to assess the relative impact of changes in control measures affecting the growth of foodborne bacteria, the two modeling approaches gave similar results, and the simpler population approach was suitable.
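
    The scale difference can be sketched as follows: a deterministic population model always predicts growth, whereas an individual-based sketch in which each cell draws its own lag time can yield near-zero growth when few cells are present. The exponential lag distribution and its mean are illustrative assumptions, not the distributions fitted in the study.

```python
import math
import random

def population_model(n0, mu, t):
    """Deterministic population-scale prediction: N(t) = n0 * exp(mu * t)."""
    return n0 * math.exp(mu * t)

def ibm_model(n0, mu, t, rng, mean_lag=5.0):
    """Individual-based sketch: each cell draws its own lag time
    (assumed exponential) and only grows once its lag has elapsed."""
    total = 0.0
    for _ in range(n0):
        lag = rng.expovariate(1.0 / mean_lag)
        total += math.exp(mu * max(0.0, t - lag))
    return total
```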

  14. An Efficient Soft Set-Based Approach for Conflict Analysis

    PubMed Central

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, and so on. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in conflict data sets, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handling conflict situations, based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we evaluate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach reduces computational time by up to 3.9% compared to rough set theory. PMID:26928627
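
    The co-occurrence idea can be sketched by representing a soft set as a mapping from objects (e.g., parliament members) to the parameters (e.g., supported issues) they possess; the data below are invented for illustration.

```python
from collections import defaultdict
from itertools import combinations

def cooccurrence(soft_set):
    """For each unordered pair of parameters, count how many objects
    possess both -- the basic quantity the soft-set approach uses in
    place of rough-set decision rules."""
    counts = defaultdict(int)
    for params in soft_set.values():
        for a, b in combinations(sorted(params), 2):
            counts[(a, b)] += 1
    return dict(counts)
```

    Parameter pairs with high co-occurrence indicate agents voting together (alliance) rather than in conflict.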

  15. A Hybrid LSSVR/HMM-Based Prognostic Approach

    PubMed Central

    Liu, Zhijuan; Li, Qing; Liu, Xianhui; Mu, Chundi

    2013-01-01

    In a health management system, prognostics, an engineering discipline that predicts a system's future health, is an important aspect, yet research in this field is currently limited. In this paper, a hybrid approach for prognostics is proposed. The approach combines least squares support vector regression (LSSVR) with the hidden Markov model (HMM). Features extracted from sensor signals are used to train HMMs, which represent different health levels. An LSSVR algorithm is used to predict the feature trends. The LSSVR training and prediction algorithms are modified by adding new data and deleting old data, and the probabilities of the predicted features for each HMM are calculated using forward or backward algorithms. Based on these probabilities, one can determine a system's future health state and estimate the remaining useful life (RUL). To evaluate the proposed approach, a test was carried out using bearing vibration signals. Simulation results show that the LSSVR/HMM approach can forecast faults long before they occur and can predict the RUL. Therefore, the LSSVR/HMM approach is very promising in the field of prognostics. PMID:23624688
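
    The probability computation the approach depends on can be sketched with the standard forward algorithm for a discrete HMM; the states and parameters would come from training one HMM per health level.

```python
import math

def forward_loglik(obs, start, trans, emit):
    """Log-likelihood of an observation sequence under a discrete HMM,
    via the forward algorithm with per-step scaling to avoid underflow."""
    states = list(start)
    alpha = {s: start[s] * emit[s][obs[0]] for s in states}
    loglik = 0.0
    for o in obs[1:]:
        scale = sum(alpha.values())
        loglik += math.log(scale)
        alpha = {s: alpha[s] / scale for s in states}
        alpha = {j: sum(alpha[i] * trans[i][j] for i in states) * emit[j][o]
                 for j in states}
    return loglik + math.log(sum(alpha.values()))
```

    Scoring a predicted feature sequence against each health-level HMM and taking the highest log-likelihood gives the most probable future health state.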

  17. The cortex-based alignment approach to TMS coil positioning.

    PubMed

    Duecker, Felix; Frost, Martin A; de Graaf, Tom A; Graewe, Britta; Jacobs, Christianne; Goebel, Rainer; Sack, Alexander T

    2014-10-01

    TMS allows noninvasive manipulation of brain activity in healthy participants and patients. The effectiveness of TMS experiments critically depends on precise TMS coil positioning, which, for most brain areas, is best achieved when a frameless stereotactic system is used to target activation foci based on individual fMRI data. From a purely scientific perspective, individual fMRI-guided TMS is thus the method of choice to ensure optimal TMS efficiency. Yet, from a more practical perspective, such individual functional data are not always available, and therefore alternative TMS coil positioning approaches are often applied, for example, based on functional group data reported in Talairach coordinates. We here propose a novel method for TMS coil positioning that is based on functional group data, yet only requires individual anatomical data. We used cortex-based alignment (CBA) to transform individual anatomical data to an atlas brain that includes probabilistic group maps of two functional regions (FEF and hMT+/V5). Then, these functional group maps were back-transformed to the individual brain anatomy, preserving functional-anatomical correspondence. As a proof of principle, the resulting CBA-based functional targets in individual brain space were compared with individual FEF and hMT+/V5 hotspots as conventionally localized with individual fMRI data, and with targets based on Talairach coordinates as commonly done in TMS research when only individual anatomical data are available. The CBA-based approach significantly improved localization of functional brain areas compared with traditional Talairach-based targeting. Given the widespread availability of CBA schemes and preexisting functional group data, the proposed procedure is easy to implement and incurs no additional measurement cost.
However, the accuracy of individual fMRI-guided TMS remains unparalleled, and the CBA-based approach should only be the method of choice when individual functional data cannot be obtained or

  18. Assessment of acid-base balance. Stewart's approach.

    PubMed

    Fores-Novales, B; Diez-Fores, P; Aguilera-Celorrio, L J

    2016-04-01

    The study of acid-base equilibrium, its regulation and its interpretation has been a source of debate since the beginning of the 20th century. The most widely accepted and commonly used analyses are based on pH, a notion first introduced by Sorensen in 1909, and on the Henderson-Hasselbalch equation (1916). Since then, new concepts have been developed to complete and ease the understanding of acid-base disorders. In the early 1980s, Peter Stewart called the traditional interpretation of acid-base disturbances into question and proposed a new method. This innovative approach seems more suitable for studying acid-base abnormalities in critically ill patients. The aim of this paper is to update acid-base concepts, methods, limitations and applications.
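
    The traditional analysis referred to above rests on the Henderson-Hasselbalch equation; a minimal sketch using the customary constants (pKa 6.1 and a CO2 solubility of 0.03 mmol/L per mmHg):

```python
import math

def henderson_hasselbalch_ph(hco3_mmol_l, pco2_mmhg, pka=6.1, sol=0.03):
    """Plasma pH from bicarbonate and pCO2:
    pH = pKa + log10([HCO3-] / (sol * pCO2))."""
    return pka + math.log10(hco3_mmol_l / (sol * pco2_mmhg))
```

    With normal values (HCO3- 24 mmol/L, pCO2 40 mmHg) this gives a pH of about 7.40; Stewart's approach instead treats pH as dependent on the strong ion difference, total weak acids, and pCO2.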

  19. Revising a design course from a lecture approach to a project-based learning approach

    NASA Astrophysics Data System (ADS)

    Kunberger, Tanya

    2013-06-01

    In order to develop the evaluative skills necessary for successful performance of design, a senior-level geotechnical engineering course was revised to immerse students in the complexity of the design process, utilising a project-based learning (PBL) approach to instruction. The student-centred approach stresses self-directed group learning, which focuses on the process rather than the result and underscores not only the theoretical but also the practical constraints of a problem. The shift in course emphasis, to skills over concepts, results in reduced content coverage but increased student ability to independently acquire a breadth of knowledge.

  20. Searching for adaptive traits in genetic resources - phenology based approach

    NASA Astrophysics Data System (ADS)

    Bari, Abdallah

    2015-04-01

    Abdallah Bari, Kenneth Street, Eddy De Pauw, Jalal Eddin Omari, and Chandra M. Biradar, International Center for Agricultural Research in the Dry Areas, Rabat Institutes, Rabat, Morocco. Phenology is an important plant trait not only for assessing and forecasting food production but also for searching genebanks for adaptive traits. Among the phenological parameters we have been considering in the search for such adaptive and rare traits are the onset (sowing period) and the seasonality (growing period). Currently, an application is being developed as part of the focused identification of germplasm strategy (FIGS) approach to use climatic data to identify crop growing seasons and characterize them in terms of onset and duration. These approximations of growing-period characteristics can then be used to estimate flowering and maturity dates for dryland crops, such as wheat, barley, faba bean, lentil and chickpea, and to assess, among others, phenology-related traits such as days to heading [dhe] and grain filling period [gfp]. The approach followed here is based on first calculating long-term average daily temperatures by fitting a curve to the monthly data over days from the beginning of the year. Prior to the identification of these phenological stages, the onset is first extracted from onset integer raster GIS layers developed from a model of the growing period that considers both moisture and temperature limitations. The paper presents some examples of real applications of the approach to search for rare and adaptive traits.
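
    The first step described, deriving long-term average daily temperatures from monthly data by curve fitting, can be sketched with a first-harmonic (annual-cycle) fit; for evenly spaced monthly means the least-squares Fourier coefficients reduce to simple averages. The 365-day year and month-midpoint placement are simplifying assumptions.

```python
import math

def daily_temperature_curve(monthly_means):
    """Fit T(x) = a0 + a1*cos(2*pi*x) + b1*sin(2*pi*x), with x the
    fraction of the year, to 12 monthly means and return a function
    mapping day of year to long-term average temperature."""
    n = len(monthly_means)
    mid = [(m + 0.5) / n for m in range(n)]       # month midpoints
    a0 = sum(monthly_means) / n
    a1 = 2.0 / n * sum(t * math.cos(2 * math.pi * x)
                       for t, x in zip(monthly_means, mid))
    b1 = 2.0 / n * sum(t * math.sin(2 * math.pi * x)
                       for t, x in zip(monthly_means, mid))
    def temp(day):                                 # day in [1, 365]
        x = 2 * math.pi * (day - 0.5) / 365.0
        return a0 + a1 * math.cos(x) + b1 * math.sin(x)
    return temp
```

    Onset and season duration could then be read off this curve, e.g., as the days on which it crosses a crop-specific temperature threshold.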

  1. Human Rights-Based Approaches to Mental Health

    PubMed Central

    Bradley, Valerie J.; Sahakian, Barbara J.

    2016-01-01

    Abstract The incidence of human rights violations in mental health care across nations has been described as a “global emergency” and an “unresolved global crisis.” The relationship between mental health and human rights is complex and bidirectional. Human rights violations can negatively impact mental health. Conversely, respecting human rights can improve mental health. This article reviews cases where an explicitly human rights-based approach was used in mental health care settings. Although the included studies did not exhibit a high level of methodological rigor, the qualitative information obtained was considered useful and informative for future studies. All studies reviewed suggest that human-rights based approaches can lead to clinical improvements at relatively low costs. Human rights-based approaches should be utilized for legal and moral reasons, since human rights are fundamental pillars of justice and civilization. The fact that such approaches can contribute to positive therapeutic outcomes and, potentially, cost savings, is additional reason for their implementation. However, the small sample size and lack of controlled, quantitative measures limit the strength of conclusions drawn from included studies. More objective, high quality research is needed to ascertain the true extent of benefits to service users and providers. PMID:27781015

  2. Style: A Computational and Conceptual Blending-Based Approach

    NASA Astrophysics Data System (ADS)

    Goguen, Joseph A.; Harrell, D. Fox

    This chapter proposes a new approach to style, arising from our work on computational media using structural blending, which enriches the conceptual blending of cognitive linguistics with structure building operations in order to encompass syntax and narrative as well as metaphor. We have implemented both conceptual and structural blending, and conducted initial experiments with poetry, including interactive multimedia poetry, although the approach generalizes to other media. The central idea is to generate multimedia content and analyze style in terms of blending principles, based on our finding that different principles from those of common sense blending are often needed for some contemporary poetic metaphors.

  3. A complexity-based approach to batterer intervention programmes.

    PubMed

    Medina-Maldonado, Venus E; Medina-Maldonado, Rossana; Parada-Cores, Germán

    2014-01-01

    This paper offers an opinion, adopting a complexity-based approach, on coordinating nursing science and psychology in psycho-educational interventions for batterers regarding their partner or ex-partner. Improving the interrelationship between the two disciplines should facilitate implementing relevant action, thereby engendering motivation for change in participants and modifying sexist attitudes and beliefs. The paper analyzes the importance of coordinating the disciplines' scientific action, defines guidelines for an intervention approach, and highlights implications for practice and research.

  4. Network Medicine: A Network-based Approach to Human Diseases

    NASA Astrophysics Data System (ADS)

    Ghiassian, Susan Dina

    With the availability of large-scale data, it is now possible to systematically study the underlying interaction maps of many complex systems in multiple disciplines. Statistical physics has a long and successful history in modeling and characterizing systems with a large number of interacting individuals. Indeed, numerous approaches that were first developed in the context of statistical physics, such as the notion of random walks and diffusion processes, have been applied successfully to study and characterize complex systems in the context of network science. Based on these tools, network science has made important contributions to our understanding of many real-world, self-organizing systems, for example in computer science, sociology and economics. Biological systems are no exception. Indeed, recent studies reflect the necessity of applying statistical and network-based approaches in order to understand complex biological systems, such as cells. In these approaches, a cell is viewed as a complex network consisting of interactions among cellular components, such as genes and proteins. Given the cellular network as a platform, machinery, functionality and failure of a cell can be studied with network-based approaches, a field known as systems biology. Here, we apply network-based approaches to explore human diseases and their associated genes within the cellular network. This dissertation is divided in three parts: (i) A systematic analysis of the connectivity patterns among disease proteins within the cellular network. The quantification of these patterns inspires the design of an algorithm which predicts a disease-specific subnetwork containing yet unknown disease associated proteins. (ii) We apply the introduced algorithm to explore the common underlying mechanism of many complex diseases. We detect a subnetwork from which inflammatory processes initiate and result in many autoimmune diseases. (iii) The last chapter of this dissertation describes the
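
    The diffusion-style tools mentioned above can be sketched as a random walk with restart over a toy interaction network; the graph, seed set, and restart probability are illustrative.

```python
def random_walk_with_restart(adj, seeds, restart=0.3, iters=100):
    """Rank nodes by diffusing probability from seed (disease) nodes:
    at each step the walker either restarts at a seed (prob `restart`)
    or moves to a uniformly chosen neighbor. Assumes no dangling nodes."""
    nodes = list(adj)
    p0 = {n: (1.0 / len(seeds) if n in seeds else 0.0) for n in nodes}
    p = dict(p0)
    for _ in range(iters):
        nxt = {n: restart * p0[n] for n in nodes}
        for n in nodes:
            share = (1.0 - restart) * p[n] / len(adj[n])
            for m in adj[n]:
                nxt[m] += share
        p = nxt
    return p
```

    Non-seed nodes that accumulate high probability are candidate members of the disease-specific subnetwork.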

  5. Dependency Resolution Difficulty Increases with Distance in Persian Separable Complex Predicates: Evidence for Expectation and Memory-Based Accounts

    PubMed Central

    Safavi, Molood S.; Husain, Samar; Vasishth, Shravan

    2016-01-01

    Delaying the appearance of a verb in a noun-verb dependency tends to increase processing difficulty at the verb; one explanation for this locality effect is decay and/or interference of the noun in working memory. Surprisal, an expectation-based account, predicts that delaying the appearance of a verb either renders it no more predictable or more predictable, leading respectively to a prediction of no effect of distance or a facilitation. Recently, Husain et al. (2014) suggested that when the exact identity of the upcoming verb is predictable (strong predictability), increasing argument-verb distance leads to facilitation effects, which is consistent with surprisal; but when the exact identity of the upcoming verb is not predictable (weak predictability), locality effects are seen. We investigated Husain et al.'s proposal using Persian complex predicates (CPs), which consist of a non-verbal element—a noun in the current study—and a verb. In CPs, once the noun has been read, the exact identity of the verb is highly predictable (strong predictability); this was confirmed using a sentence completion study. In two self-paced reading (SPR) and two eye-tracking (ET) experiments, we delayed the appearance of the verb by interposing a relative clause (Experiments 1 and 3) or a long PP (Experiments 2 and 4). We also included a simple Noun-Verb predicate configuration with the same distance manipulation; here, the exact identity of the verb was not predictable (weak predictability). Thus, the design crossed Predictability Strength and Distance. We found that, consistent with surprisal, the verb in the strong predictability conditions was read faster than in the weak predictability conditions. Furthermore, greater verb-argument distance led to slower reading times; strong predictability did not neutralize or attenuate the locality effects. As regards the effect of distance on dependency resolution difficulty, these four experiments present evidence in favor of working

  6. Dependency Resolution Difficulty Increases with Distance in Persian Separable Complex Predicates: Evidence for Expectation and Memory-Based Accounts.

    PubMed

    Safavi, Molood S; Husain, Samar; Vasishth, Shravan

    2016-01-01

    Delaying the appearance of a verb in a noun-verb dependency tends to increase processing difficulty at the verb; one explanation for this locality effect is decay and/or interference of the noun in working memory. Surprisal, an expectation-based account, predicts that delaying the appearance of a verb either renders it no more predictable or more predictable, leading respectively to a prediction of no effect of distance or a facilitation. Recently, Husain et al. (2014) suggested that when the exact identity of the upcoming verb is predictable (strong predictability), increasing argument-verb distance leads to facilitation effects, which is consistent with surprisal; but when the exact identity of the upcoming verb is not predictable (weak predictability), locality effects are seen. We investigated Husain et al.'s proposal using Persian complex predicates (CPs), which consist of a non-verbal element-a noun in the current study-and a verb. In CPs, once the noun has been read, the exact identity of the verb is highly predictable (strong predictability); this was confirmed using a sentence completion study. In two self-paced reading (SPR) and two eye-tracking (ET) experiments, we delayed the appearance of the verb by interposing a relative clause (Experiments 1 and 3) or a long PP (Experiments 2 and 4). We also included a simple Noun-Verb predicate configuration with the same distance manipulation; here, the exact identity of the verb was not predictable (weak predictability). Thus, the design crossed Predictability Strength and Distance. We found that, consistent with surprisal, the verb in the strong predictability conditions was read faster than in the weak predictability conditions. Furthermore, greater verb-argument distance led to slower reading times; strong predictability did not neutralize or attenuate the locality effects. 
As regards the effect of distance on dependency resolution difficulty, these four experiments present evidence in favor of working memory
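
    Surprisal, the quantity behind the expectation-based account, can be sketched from corpus counts as -log2 P(word | context); the add-one-smoothed bigram model and toy corpus below are illustrative stand-ins for the completion-study probabilities used in such work.

```python
import math
from collections import Counter

def bigram_surprisal(corpus, context, word):
    """Surprisal (bits) of `word` after `context` under an add-one
    smoothed bigram model estimated from `corpus` (a token list)."""
    bigrams = Counter(zip(corpus, corpus[1:]))
    unigrams = Counter(corpus)
    vocab = len(set(corpus))
    p = (bigrams[(context, word)] + 1) / (unigrams[context] + vocab)
    return -math.log2(p)
```

    A strongly predictable verb has low surprisal, so under this account its reading time should not increase with added distance.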

  8. PoMo: An Allele Frequency-Based Approach for Species Tree Estimation

    PubMed Central

    De Maio, Nicola; Schrempf, Dominik; Kosiol, Carolin

    2015-01-01

    Incomplete lineage sorting can cause incongruencies of the overall species-level phylogenetic tree with the phylogenetic trees for individual genes or genomic segments. If these incongruencies are not accounted for, it is possible to incur several biases in species tree estimation. Here, we present a simple maximum likelihood approach that accounts for ancestral variation and incomplete lineage sorting. We use a POlymorphisms-aware phylogenetic MOdel (PoMo) that we have recently shown to efficiently estimate mutation rates and fixation biases from within and between-species variation data. We extend this model to perform efficient estimation of species trees. We test the performance of PoMo in several different scenarios of incomplete lineage sorting using simulations and compare it with existing methods both in accuracy and computational speed. In contrast to other approaches, our model does not use coalescent theory but is allele frequency based. We show that PoMo is well suited for genome-wide species tree estimation and that on such data it is more accurate than previous approaches. PMID:26209413

  9. A novel image fusion approach based on compressive sensing

    NASA Astrophysics Data System (ADS)

    Yin, Hongpeng; Liu, Zhaodong; Fang, Bin; Li, Yanxia

    2015-11-01

    Image fusion integrates complementary and relevant information from source images captured by multiple sensors into a unitary synthetic image. The compressive sensing-based (CS) fusion approach can greatly reduce processing time and guarantee the quality of the fused image by integrating fewer non-zero coefficients. However, there are two main limitations in the conventional CS-based fusion approach. First, directly fusing sensing measurements may yield more uncertain results with high reconstruction error. Second, using a single fusion rule may lead to blocking artifacts and poor fidelity. In this paper, a novel image fusion approach based on CS is proposed to solve these problems. The non-subsampled contourlet transform (NSCT) is used to decompose the source images. A dual-layer Pulse Coupled Neural Network (PCNN) model is used to fuse the low-pass subbands, while an edge-retention-based fusion rule is proposed to fuse the high-pass subbands. The sparse coefficients are fused before being measured by a Gaussian matrix. The fused image is accurately reconstructed by the Compressive Sampling Matched Pursuit algorithm (CoSaMP). Experimental results demonstrate that the fused image contains abundant detail and preserves the saliency structure. They also indicate that the proposed method achieves better visual quality than current state-of-the-art methods.

  10. Genome-based approaches to develop vaccines against bacterial pathogens.

    PubMed

    Serruto, Davide; Serino, Laura; Masignani, Vega; Pizza, Mariagrazia

    2009-05-26

    Bacterial infectious diseases remain the single most important threat to health worldwide. Although conventional vaccinology approaches were successful in conferring protection against several diseases, they failed to provide efficacious solutions against many others. The advent of whole-genome sequencing changed the way to think about vaccine development, enabling the targeting of possible vaccine candidates starting from the genomic information of a single bacterial isolate, with a process named reverse vaccinology. As the genomic era progressed, reverse vaccinology has evolved with a pan-genome approach and multi-strain genome analysis became fundamental for the design of universal vaccines. This review describes the applications of genome-based approaches in the development of new vaccines against bacterial pathogens.

  11. A data base approach for prediction of deforestation-induced mass wasting events

    NASA Technical Reports Server (NTRS)

    Logan, T. L.

    1981-01-01

    A major topic of concern in timber management is determining the impact of clear-cutting on slope stability. Deforestation treatments on steep mountain slopes have often resulted in a high frequency of major mass wasting events. The Geographic Information System (GIS) is a potentially useful tool for predicting the location of mass wasting sites. With a raster-based GIS, digitally encoded maps of slide hazard parameters can be overlaid and modeled to produce new maps depicting high probability slide areas. The present investigation examines the raster-based information system as a tool for predicting the location of the clear-cut mountain slopes which are most likely to experience shallow soil debris avalanches. A literature overview is conducted, taking into account vegetation, roads, precipitation, soil type, slope-angle and aspect, and models predicting mass soil movements. Attention is given to a data base approach and aspects of slide prediction.
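
    In its simplest form, the raster overlay modeling described above amounts to combining co-registered hazard-parameter grids cell by cell. The toy grids and thresholds below are purely illustrative assumptions, not values from the study.

```python
import numpy as np

# Toy 3x3 co-registered rasters of slide-hazard parameters (illustrative)
slope_deg = np.array([[35, 20, 40],
                      [10, 45, 33],
                      [25, 38, 15]])
clear_cut = np.array([[1, 0, 1],
                      [0, 1, 0],
                      [1, 1, 0]], dtype=bool)
soil_depth_m = np.array([[0.5, 2.0, 1.5],
                         [0.8, 0.6, 0.4],
                         [1.2, 0.7, 2.5]])

# Overlay model: clear-cut, steep slopes with shallow soils -> high hazard
hazard = clear_cut & (slope_deg > 30) & (soil_depth_m < 1.0)
n_hazard = int(hazard.sum())
```

    Real overlay models would weight and combine many more layers (roads, precipitation, aspect), but the cell-wise logic is the same.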

  12. A fuzzy behaviorist approach to sensor-based robot control

    SciTech Connect

    Pin, F.G.

    1996-05-01

    Sensor-based operation of autonomous robots in unstructured and/or outdoor environments has proven to be an extremely challenging problem, mainly because of the difficulties encountered when attempting to represent the many uncertainties which are always present in the real world. These uncertainties are primarily due to sensor imprecision and the unpredictability of the environment, i.e., lack of full knowledge of the environment's characteristics and dynamics. An approach which we have named the "Fuzzy Behaviorist Approach" (FBA) is proposed in an attempt to remedy some of these difficulties. This approach is based on the representation of the system's uncertainties using Fuzzy Set Theory-based approximations and on the representation of the reasoning and control schemes as sets of elemental behaviors. Using the FBA, a formalism for rule base development and an automated generator of fuzzy rules have been developed. This automated system can construct the set of membership functions corresponding to fuzzy behaviors once these have been expressed in qualitative terms by the user. The system also checks for completeness of the rule base and for non-redundancy of the rules (which has traditionally been a major hurdle in rule base development). Two major conceptual features, the suppression and inhibition mechanisms which allow a dominance between behaviors to be expressed, are discussed in detail. Some experimental results obtained with the automated fuzzy rule generator applied to the domain of sensor-based navigation in a priori unknown environments, using one of our autonomous test-bed robots as well as a real car in outdoor environments, are then reviewed and discussed to illustrate the feasibility of large-scale automatic fuzzy rule generation using the "Fuzzy Behaviorist" concepts.
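
    A minimal sketch of the elemental-behavior idea, assuming one "obstacle near" membership function, two behaviors, and a suppression threshold. All shapes and values are invented for illustration; this is not the FBA rule generator itself.

```python
def mu_near(d):
    """Membership of 'obstacle is near' (assumed ramp shape, 0 beyond 2 m)."""
    return max(0.0, 1.0 - d / 2.0)

def fuse_behaviors(obstacle_dist, goal_bearing, suppress_at=0.8):
    # elemental behavior 1: avoid (turn +40 deg), activation from fuzzy set
    a_avoid, u_avoid = mu_near(obstacle_dist), 40.0
    # elemental behavior 2: steer toward the goal, always active
    a_goto, u_goto = 1.0, goal_bearing
    if a_avoid >= suppress_at:          # suppression: avoidance dominates
        return u_avoid
    total = a_avoid + a_goto            # otherwise blend by activation
    return (a_avoid * u_avoid + a_goto * u_goto) / total

steer_near = fuse_behaviors(0.2, -10.0)   # obstacle close: avoidance wins
steer_far = fuse_behaviors(2.5, -10.0)    # no obstacle: head to goal
```

    The suppression branch is the dominance mechanism the abstract describes; without it, the two behaviors are simply blended by their fuzzy activations.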

  13. A Kalman-Filter-Based Approach to Combining Independent Earth-Orientation Series

    NASA Technical Reports Server (NTRS)

    Gross, Richard S.; Eubanks, T. M.; Steppe, J. A.; Freedman, A. P.; Dickey, J. O.; Runge, T. F.

    1998-01-01

    An approach, based upon the use of a Kalman filter, that is currently employed at the Jet Propulsion Laboratory (JPL) for combining independent measurements of the Earth's orientation is presented. Since changes in the Earth's orientation can be described as a randomly excited stochastic process, the uncertainty in our knowledge of the Earth's orientation grows rapidly in the absence of measurements. The Kalman-filter methodology allows for an objective accounting of this uncertainty growth, thereby facilitating the intercomparison of measurements taken at different epochs (not necessarily uniformly spaced in time) and with different precision. As an example of this approach to combining Earth-orientation series, a description is given of a combination, SPACE95, that has been generated recently at JPL.
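
    The uncertainty-growth bookkeeping can be sketched with a scalar random-walk Kalman filter. The process noise, epochs, and measurement variances below are invented illustrative numbers, not values from SPACE95.

```python
# Scalar random-walk Kalman filter fusing irregularly spaced measurements
q = 0.01            # process-noise growth rate (illustrative units^2/day)
x, P = 0.0, 1.0     # state estimate and its variance (vague prior)
observations = [    # (epoch in days, value, measurement variance)
    (0, 0.10, 0.04), (1, 0.12, 0.09), (3, 0.20, 0.04), (4, 0.18, 0.09),
]
t_prev = 0
for t, z, r in observations:
    P += q * (t - t_prev)        # uncertainty grows between measurements
    k = P / (P + r)              # Kalman gain
    x += k * (z - x)             # measurement update
    P *= (1.0 - k)
    t_prev = t
```

    After the four updates the combined variance is smaller than that of any single measurement, which is what makes combining heterogeneous series worthwhile.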

  14. A novel logic-based approach for quantitative toxicology prediction.

    PubMed

    Amini, Ata; Muggleton, Stephen H; Lodhi, Huma; Sternberg, Michael J E

    2007-01-01

    There is a pressing need for accurate in silico methods to predict the toxicity of molecules that are being introduced into the environment or are being developed into new pharmaceuticals. Predictive toxicology is in the realm of structure activity relationships (SAR), and many approaches have been used to derive such SAR. Previous work has shown that inductive logic programming (ILP) is a powerful approach that circumvents several major difficulties, such as molecular superposition, faced by some other SAR methods. The ILP approach reasons with chemical substructures within a relational framework and yields chemically understandable rules. Here, we report a general new approach, support vector inductive logic programming (SVILP), which extends the essentially qualitative ILP-based SAR to quantitative modeling. First, ILP is used to learn rules, the predictions of which are then used within a novel kernel to derive a support-vector generalization model. For a highly heterogeneous dataset of 576 molecules with known fathead minnow fish toxicity, the cross-validated correlation coefficients (R2CV) from a chemical descriptor method (CHEM) and SVILP are 0.52 and 0.66, respectively. The ILP, CHEM, and SVILP approaches correctly predict 55, 58, and 73%, respectively, of toxic molecules. In a set of 165 unseen molecules, the R2 values from the commercial software TOPKAT and SVILP are 0.26 and 0.57, respectively. In all calculations, SVILP showed significant improvements in comparison with the other methods. The SVILP approach has a major advantage in that it uses ILP automatically and consistently to derive rules, mostly novel, describing fragments that are toxicity alerts. The SVILP is a general machine-learning approach and has the potential of tackling many problems relevant to chemoinformatics including in silico drug design.

  15. A microfabrication-based approach to quantitative isothermal titration calorimetry.

    PubMed

    Wang, Bin; Jia, Yuan; Lin, Qiao

    2016-04-15

    Isothermal titration calorimetry (ITC) directly measures heat evolved in a chemical reaction to determine equilibrium binding properties of biomolecular systems. Conventional ITC instruments are expensive, involve complicated design and construction, and require long analysis times. Microfabricated calorimetric devices are promising, although they have yet to allow accurate, quantitative ITC measurements of biochemical reactions. This paper presents a microfabrication-based approach to integrated, quantitative ITC characterization of biomolecular interactions. The approach integrates microfabricated differential calorimetric sensors with microfluidic titration. Biomolecules and reagents are introduced at each of a series of molar ratios, mixed, and allowed to react. The reaction thermal power is differentially measured, and used to determine the thermodynamic profile of the biomolecular interactions. Implemented in a microdevice featuring thermally isolated, well-defined reaction volumes with minimized fluid evaporation as well as highly sensitive thermoelectric sensing, the approach enables accurate and quantitative ITC measurements of protein-ligand interactions under different isothermal conditions. Using the approach, we demonstrate ITC characterization of the binding of 18-Crown-6 with barium chloride, and the binding of ribonuclease A with cytidine 2'-monophosphate within reaction volumes of approximately 0.7 µL and at concentrations down to 2 mM. For each binding system, the ITC measurements were completed with considerably reduced analysis times and material consumption, and yielded a complete thermodynamic profile of the molecular interaction in agreement with published data. This demonstrates the potential usefulness of our approach for biomolecular characterization in biomedical applications. PMID:26655185

  17. An SQL-based approach to physics analysis

    NASA Astrophysics Data System (ADS)

    Limper, Maaike, Dr

    2014-06-01

    As part of the CERN openlab collaboration a study was made into the possibility of performing analysis of the data collected by the experiments at the Large Hadron Collider (LHC) through SQL-queries on data stored in a relational database. Currently LHC physics analysis is done using data stored in centrally produced "ROOT-ntuple" files that are distributed through the LHC computing grid. The SQL-based approach to LHC physics analysis presented in this paper allows calculations in the analysis to be done at the database and can make use of the database's in-built parallelism features. Using this approach it was possible to reproduce results for several physics analysis benchmarks. The study shows the capability of the database to handle complex analysis tasks but also illustrates the limits of using row-based storage for storing physics analysis data, as performance was limited by the I/O read speed of the system.
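
    The flavor of pushing an analysis cut into the database can be shown with a tiny self-contained example (sqlite here purely for illustration; the study used a parallel relational database and real ntuple-derived tables, and the table and cut below are invented):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE muons (event_id INTEGER, pt REAL, eta REAL)")
con.executemany("INSERT INTO muons VALUES (?, ?, ?)", [
    (1, 28.5, 0.3), (1, 31.0, -1.2), (2, 12.0, 2.1), (3, 45.2, 0.0),
])

# Selection cut expressed as SQL: events with muons above 25 GeV,
# counted inside the database rather than in a loop over ntuple files
rows = con.execute(
    "SELECT event_id, COUNT(*) AS n_mu FROM muons "
    "WHERE pt > 25 GROUP BY event_id ORDER BY event_id"
).fetchall()
```

    The database evaluates the cut and aggregation itself, which is what lets the engine's built-in parallelism do the work that is otherwise done in user analysis code.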

  18. A Computationally Based Approach to Homogenizing Advanced Alloys

    SciTech Connect

    Jablonski, P D; Cowen, C J

    2011-02-27

    We have developed a computationally based approach to optimizing the homogenization heat treatment of complex alloys. The Scheil module within the Thermo-Calc software is used to predict the as-cast segregation present within alloys, and DICTRA (Diffusion Controlled TRAnsformations) is used to model the homogenization kinetics as a function of time, temperature and microstructural scale. We will discuss this approach as it is applied to both Ni-based superalloys as well as the (computationally) more complex case of alloys that solidify with more than one matrix phase as a result of segregation. Such is the case typically observed in martensitic steels. With these alloys it is doubly important to homogenize them correctly, especially at the laboratory scale, since they are austenitic at high temperature and thus constituent elements will diffuse slowly. The computationally designed heat treatment and the subsequent verification on real castings are presented.
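
    The scaling behind such heat-treatment design can be illustrated with the textbook first-order estimate for decay of a sinusoidal segregation profile. This is a back-of-envelope sketch, not DICTRA; D0, Q, and the dendrite arm spacing are assumed illustrative values.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def diffusivity(D0, Q, T):
    """Arrhenius diffusivity; D0 in m^2/s, Q in J/mol, T in K."""
    return D0 * math.exp(-Q / (R * T))

def homogenization_time(lam, D, residual=0.01):
    """Time for a sinusoidal segregation profile of wavelength lam (m)
    to decay to `residual` of its initial amplitude (1-D estimate)."""
    return -math.log(residual) * lam ** 2 / (4 * math.pi ** 2 * D)

# Illustrative values for a slow substitutional solute (assumed, not fit)
D0, Q = 1.0e-4, 280e3        # m^2/s, J/mol
lam = 100e-6                 # ~100 um secondary dendrite arm spacing
t_1100C = homogenization_time(lam, diffusivity(D0, Q, 1373.15))
t_1200C = homogenization_time(lam, diffusivity(D0, Q, 1473.15))
```

    The exponential temperature dependence is why a modest temperature increase shortens the required hold time several-fold, and why accurate mobilities (as in DICTRA) matter for the real optimization.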

  19. English to Sanskrit Machine Translation Using Transfer Based approach

    NASA Astrophysics Data System (ADS)

    Pathak, Ganesh R.; Godse, Sachin P.

    2010-11-01

    Translation is one of the needs of a global society for communicating the thoughts and ideas of one country to another. Translation is the process of interpreting the meaning of a text and subsequently producing an equivalent text that communicates the same message in another language. In this paper we give detailed information on how to convert source-language text into target-language text using a transfer-based approach to machine translation. We implemented an English-to-Sanskrit machine translator using the transfer-based approach. English is a global language used for business and communication, but a large part of the population in India does not use or understand English. Sanskrit is an ancient language of India, and most Indian languages are derived from it. Sanskrit can therefore act as an intermediate language for multilingual translation.
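
    At its simplest, transfer-based MT is three stages: analyze the source sentence, transfer its structure, then transfer its words. The toy lexicon and transliterations below are invented for illustration and do not reflect the authors' system or real Sanskrit morphology.

```python
# Toy transfer-based translator: analyze (split), structural transfer
# (English SVO -> Sanskrit SOV), then lexical transfer via a lexicon.
# The lexicon entries and transliterations are assumptions for the sketch.
LEXICON = {"rama": "ramah", "reads": "pathati", "book": "pustakam"}

def transfer_translate(sentence):
    subject, verb, obj = sentence.lower().split()   # assume simple SVO
    reordered = (subject, obj, verb)                # structural transfer
    return " ".join(LEXICON[w] for w in reordered)  # lexical transfer

out = transfer_translate("Rama reads book")
```

    A real transfer system replaces each stage with full machinery: morphological analysis and parsing, transfer rules over parse trees, and target-side generation with agreement and sandhi handling.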

  20. Protecting the Smart Grid: A Risk Based Approach

    SciTech Connect

    Clements, Samuel L.; Kirkham, Harold; Elizondo, Marcelo A.; Lu, Shuai

    2011-10-10

    This paper describes a risk-based approach to security that has been used for years in protecting physical assets, and shows how it could be modified to help secure the digital aspects of the smart grid and control systems in general. One way the smart grid has been said to be vulnerable is that mass load fluctuations could be created by quickly turning large quantities of smart meters off and on. We investigate the plausibility of this threat.

  1. Biomaterial Approaches for Stem Cell-Based Myocardial Tissue Engineering

    PubMed Central

    Cutts, Josh; Nikkhah, Mehdi; Brafman, David A

    2015-01-01

    Adult and pluripotent stem cells represent a ready supply of cellular raw materials that can be used to generate the functionally mature cells needed to replace damaged or diseased heart tissue. However, the use of stem cells for cardiac regenerative therapies is limited by the low efficiency by which stem cells are differentiated in vitro to cardiac lineages as well as the inability to effectively deliver stem cells and their derivatives to regions of damaged myocardium. In this review, we discuss the various biomaterial-based approaches that are being implemented to direct stem cell fate both in vitro and in vivo. First, we discuss the stem cell types available for cardiac repair and the engineering of naturally and synthetically derived biomaterials to direct their in vitro differentiation to the cell types that comprise heart tissue. Next, we describe biomaterial-based approaches that are being implemented to enhance the in vivo integration and differentiation of stem cells delivered to areas of cardiac damage. Finally, we present emerging trends of using stem cell-based biomaterial approaches to deliver pro-survival factors and fully vascularized tissue to the damaged and diseased cardiac tissue. PMID:26052226

  2. Knowledge-based approach to video content classification

    NASA Astrophysics Data System (ADS)

    Chen, Yu; Wong, Edward K.

    2001-01-01

    A framework for video content classification using a knowledge-based approach is herein proposed. This approach is motivated by the fact that videos are rich in semantic contents, which can best be interpreted and analyzed by human experts. We demonstrate the concept by implementing a prototype video classification system using the rule-based programming language CLIPS 6.05. Knowledge for video classification is encoded as a set of rules in the rule base. The left-hand sides of rules contain high-level and low-level features, while the right-hand sides of rules contain intermediate results or conclusions. Our current implementation includes features computed from motion, color, and text extracted from video frames. Our current rule set allows us to classify input video into one of five classes: news, weather reporting, commercial, basketball, and football. We use MYCIN's inexact reasoning method for combining evidence, and to handle the uncertainties in the features and in the classification results. We obtained good results in a preliminary experiment, which demonstrated the validity of the proposed approach.
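
    MYCIN's evidence-combination rule, which the system uses to merge uncertain rule conclusions, is compact enough to state directly. The specific certainty factors below are illustrative, not taken from the paper's rule base.

```python
def combine_cf(cf1, cf2):
    """MYCIN certainty-factor combination for values in [-1, 1]."""
    if cf1 >= 0 and cf2 >= 0:
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

# Two rules supporting "news" (CF 0.6 and 0.5) and one weakly against (-0.2)
cf_news = combine_cf(combine_cf(0.6, 0.5), -0.2)
```

    The combination is order-independent for same-sign evidence and saturates toward +/-1, which is what lets many weak rules accumulate into a confident classification.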

  4. Emissions trading -- Market-based approaches offer pollution control incentives

    SciTech Connect

    Tombach, I.

    1994-06-01

    In the last several years, "market-based" strategies for achieving air quality goals have joined the traditional "command-and-control" approach to air pollution management. The premise behind market approaches is that the "right" to emit air pollutants provided by a permit has monetary value. A market-based approach provides facility operators with incentives to take advantage of the monetary value associated with reducing emissions below permitted levels. It has been recognized for some time that applying such a mechanism can be a cost-effective regional air quality management strategy. To date, economic incentives have been exploited somewhat in emissions reduction credit programs operating in non-attainment areas, but transactions have been tightly controlled by regulatory agencies. Two recently implemented programs have taken the management of emissions from specific sources to the marketplace. One is incorporated in the acid rain mitigation provisions of Title 4 of the Clean Air Act (CAA) Amendments; the other is the Regional Clean Air Incentives Market (RECLAIM), a program aimed at reducing ozone. Both Title 4 and RECLAIM are intended to achieve substantial reductions during the next decade in emissions of sulfur dioxide (SO2) and oxides of nitrogen (NOx) from selected larger sources.

  5. Fragment-based approaches and computer-aided drug discovery.

    PubMed

    Rognan, Didier

    2012-01-01

    Fragment-based design has significantly modified drug discovery strategies and paradigms in the last decade. Besides technological advances and novel therapeutic avenues, one of the most significant changes brought by this new discipline has occurred in the minds of drug designers. Fragment-based approaches have markedly impacted rational computer-aided design both in method development and in applications. The present review illustrates the importance of molecular fragments in many aspects of rational ligand design, and discusses how thinking in "fragment space" has boosted computational biology and chemistry. PMID:21710380

  6. A mindfulness-based approach to the treatment of insomnia.

    PubMed

    Ong, Jason; Sholtes, David

    2010-11-01

    Mindfulness meditation has emerged as a novel approach to emotion regulation and stress reduction that has several health benefits. Preliminary work has been conducted on mindfulness-based therapy for insomnia (MBT-I), a meditation-based program for individuals suffering from chronic sleep disturbance. This treatment integrates behavioral treatments for insomnia with the principles and practices of mindfulness meditation. A case illustration of a chronic insomnia sufferer demonstrates the application of mindfulness principles for developing adaptive ways of working with the nocturnal symptoms and waking consequences of chronic insomnia.

  7. A New Approach to Image Fusion Based on Cokriging

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess; LeMoigne, Jacqueline; Mount, David M.; Morisette, Jeffrey T.

    2005-01-01

    We consider the image fusion problem involving remotely sensed data. We introduce cokriging as a method to perform fusion. We investigate the advantages of fusing Hyperion with ALI. The evaluation is performed by comparing the classification of the fused data with that of input images and by calculating well-chosen quantitative fusion quality metrics. We consider the Invasive Species Forecasting System (ISFS) project as our fusion application. The fusion of ALI with Hyperion data is studied using PCA and wavelet-based fusion. We then propose utilizing a geostatistics-based interpolation method called cokriging as a new approach for image fusion.

  8. Science based integrated approach to advanced nuclear fuel development - vision, approach, and overview

    SciTech Connect

    Unal, Cetin; Pasamehmetoglu, Kemal; Carmack, Jon

    2010-01-01

    Advancing the performance of Light Water Reactors, Advanced Nuclear Fuel Cycles, and Advanced Reactors, such as the Next Generation Nuclear Power Plants, requires enhancing our fundamental understanding of fuel and materials behavior under irradiation. The capability to accurately model nuclear fuel systems is critical. In order to understand specific aspects of the nuclear fuel, fully coupled fuel simulation codes are required to achieve licensing of specific nuclear fuel designs for operation. The backbone of these codes, models, and simulations is a fundamental understanding and predictive capability for simulating the phase and microstructural behavior of the nuclear fuel system materials and matrices. The purpose of this paper is to identify the modeling and simulation approach needed to deliver predictive tools for advanced fuels development. The coordination between experimental nuclear fuel design and development technical experts and computational fuel modeling and simulation technical experts is a critical aspect of the approach; it naturally leads to an integrated, goal-oriented, science-based R&D approach and strengthens both the experimental and computational efforts. The Advanced Fuels Campaign (AFC) and the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Fuels Integrated Performance and Safety Code (IPSC) are working together to determine experimental data and modeling needs. The primary objective of the NEAMS fuels IPSC project is to deliver a coupled, three-dimensional, predictive computational platform for modeling the fabrication and both normal and abnormal operation of nuclear fuel pins and assemblies, applicable to both existing and future reactor fuel designs. The science-based program is pursuing the development of an integrated multi-scale and multi-physics modeling and simulation platform for nuclear fuels. This overview paper discusses the vision, goals, and approaches for developing and implementing this new approach.

  9. Application of a swarm-based approach for phase unwrapping

    NASA Astrophysics Data System (ADS)

    da S. Maciel, Lucas; Albertazzi G., Armando, Jr.

    2014-07-01

    An algorithm for phase unwrapping based on swarm intelligence is proposed. The novel approach is based on the emergent behavior of swarms. This behavior is the result of the interactions between independent agents following a simple set of rules and is regarded as fast, flexible and robust. The rules here were designed with two purposes. Firstly, the collective behavior must result in a reliable map of the unwrapped phase. The unwrapping reliability was evaluated by each agent during run-time, based on the quality of the neighboring pixels. In addition, the rule set must result in a behavior that focuses on wrapped regions. Stigmergy and communication rules were implemented in order to enable each agent to seek less worked areas of the image. The agents were modeled as Finite-State Machines. Based on the availability of unwrappable pixels, each agent assumed a different state in order to better adapt itself to the surroundings. The implemented rule set was able to fulfill the requirements on reliability and focused unwrapping. The unwrapped phase map was comparable to those from established methods as the agents were able to reliably evaluate each pixel quality. Also, the unwrapping behavior, being observed in real time, was able to focus on workable areas as the agents communicated in order to find less traveled regions. The results were very positive for such a new approach to the phase unwrapping problem. Finally, the authors see great potential for future developments concerning the flexibility, robustness and processing times of the swarm-based algorithm.
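
    Each agent's local decision ultimately reduces to the standard 2-pi correction rule of phase unwrapping. The 1-D sketch below shows that core rule only; the swarm coordination, quality evaluation, and stigmergy described in the abstract are not modeled here.

```python
import numpy as np

def unwrap_1d(phase):
    """Sequential phase unwrapping: each step removes the 2*pi multiple
    that best explains the jump from the previous (already fixed) sample."""
    out = phase.copy()
    for i in range(1, len(out)):
        jump = out[i] - out[i - 1]
        out[i] -= 2 * np.pi * np.round(jump / (2 * np.pi))
    return out

true_phase = np.linspace(0.0, 6 * np.pi, 50)    # smooth ramp, 3 full turns
wrapped = np.angle(np.exp(1j * true_phase))     # wrapped into (-pi, pi]
recovered = unwrap_1d(wrapped)
max_err = float(np.max(np.abs(recovered - true_phase)))
```

    In 2-D the order in which pixels are visited matters, which is exactly where the agents' quality-guided, decentralized traversal replaces this fixed left-to-right sweep.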

  10. A Model-Based Prognostics Approach Applied to Pneumatic Valves

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew J.; Goebel, Kai

    2011-01-01

    Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain knowledge of the system, its components, and how they fail by casting the underlying physical phenomena in a physics-based model that is derived from first principles. Uncertainty cannot be avoided in prediction, therefore, algorithms are employed that help in managing these uncertainties. The particle filtering algorithm has become a popular choice for model-based prognostics due to its wide applicability, ease of implementation, and support for uncertainty management. We develop a general model-based prognostics methodology within a robust probabilistic framework using particle filters. As a case study, we consider a pneumatic valve from the Space Shuttle cryogenic refueling system. We develop a detailed physics-based model of the pneumatic valve, and perform comprehensive simulation experiments to illustrate our prognostics approach and evaluate its effectiveness and robustness. The approach is demonstrated using historical pneumatic valve data from the refueling system.
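
    The predict-update-prognose loop of particle-filter prognostics can be sketched for a generic scalar damage model. The wear model, failure threshold, and measurement below are invented illustrative assumptions, not the paper's valve physics.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Particle cloud over an uncertain damage state and wear-rate parameter;
# the component "fails" when damage reaches 1.0 (all values illustrative)
damage = np.full(n, 0.30)
wear = rng.normal(0.02, 0.005, n)

# One predict/update cycle against a noisy damage measurement z
z, sigma = 0.32, 0.02
damage_pred = damage + wear
w = np.exp(-0.5 * ((z - damage_pred) / sigma) ** 2)  # Gaussian likelihood
w /= w.sum()
idx = rng.choice(n, size=n, p=w)                     # importance resampling
damage, wear = damage_pred[idx], wear[idx]

# Prognosis: propagate each particle to the failure threshold to get a
# distribution of remaining useful life (in cycles), not a point estimate
rul = np.ceil((1.0 - damage) / np.maximum(wear, 1e-6))
rul_median = float(np.median(rul))
```

    The output is a RUL distribution, which is the uncertainty-management benefit the abstract attributes to particle filtering: predictions carry their own spread rather than a single failure time.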

  11. Optimizing Dendritic Cell-Based Approaches for Cancer Immunotherapy

    PubMed Central

    Datta, Jashodeep; Terhune, Julia H.; Lowenfeld, Lea; Cintolo, Jessica A.; Xu, Shuwen; Roses, Robert E.; Czerniecki, Brian J.

    2014-01-01

    Dendritic cells (DC) are professional antigen-presenting cells uniquely suited for cancer immunotherapy. They induce primary immune responses, potentiate the effector functions of previously primed T-lymphocytes, and orchestrate communication between innate and adaptive immunity. The remarkable diversity of cytokine activation regimens, DC maturation states, and antigen-loading strategies employed in current DC-based vaccine design reflect an evolving, but incomplete, understanding of optimal DC immunobiology. In the clinical realm, existing DC-based cancer immunotherapy efforts have yielded encouraging but inconsistent results. Despite recent U.S. Food and Drug Administration (FDA) approval of DC-based sipuleucel-T for metastatic castration-resistant prostate cancer, clinically effective DC immunotherapy as monotherapy for a majority of tumors remains a distant goal. Recent work has identified strategies that may allow for more potent “next-generation” DC vaccines. Additionally, multimodality approaches incorporating DC-based immunotherapy may improve clinical outcomes. PMID:25506283

  12. Model-based approach for elevator performance estimation

    NASA Astrophysics Data System (ADS)

    Esteban, E.; Salgado, O.; Iturrospe, A.; Isasa, I.

    2016-02-01

    In this paper, a dynamic model for an elevator installation is presented in the state space domain. The model comprises both the mechanical and the electrical subsystems, including the electrical machine and a closed-loop field oriented control. The proposed model is employed for monitoring the condition of the elevator installation. The adopted model-based approach for monitoring employs the Kalman filter as an observer. A Kalman observer estimates the elevator car acceleration, which determines the elevator ride quality, based solely on the machine control signature and the encoder signal. Five elevator key performance indicators are then calculated based on the estimated car acceleration. The proposed procedure is experimentally evaluated by comparing the key performance indicators calculated based on the estimated car acceleration and the values obtained from actual acceleration measurements in a test bench. Finally, the proposed procedure is compared with the sliding mode observer.

  13. Sequential Bayesian Detection: A Model-Based Approach

    SciTech Connect

    Sullivan, E J; Candy, J V

    2007-08-13

    Sequential detection theory has been known for a long time, originating in the late 1940s with Wald and followed by Middleton's classic exposition in the 1960s, alongside the concurrent enabling technology of digital computer systems and the development of sequential processors. Its development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.
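
    Wald's sequential probability ratio test, the starting point of the chapter, is simple to state for Gaussian means with known variance. The thresholds use the standard Wald approximations; the sample values and hypothesis parameters are illustrative.

```python
import math

def sprt(samples, mu0, mu1, sigma, alpha=0.05, beta=0.05):
    """Wald's SPRT for H0: mean=mu0 vs H1: mean=mu1, known sigma."""
    upper = math.log((1 - beta) / alpha)   # declare H1 above this
    lower = math.log(beta / (1 - alpha))   # declare H0 below this
    llr = 0.0
    for k, x in enumerate(samples, start=1):
        # log-likelihood ratio increment for one Gaussian sample
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return "H1", k
        if llr <= lower:
            return "H0", k
    return "undecided", len(samples)

# Samples drawn near 1.0, so the test should stop early and declare H1
decision, n_used = sprt([1.1, 0.9, 1.2, 1.0], mu0=0.0, mu1=1.0, sigma=0.5)
```

    The point of the sequential form is visible here: the decision is reached after only a few samples rather than a fixed-size batch, and model-based processors supply the likelihoods when the data are not i.i.d.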

  15. Graph-based and statistical approaches for detecting spectrally variable target materials

    NASA Astrophysics Data System (ADS)

    Ziemann, Amanda K.; Theiler, James

    2016-05-01

    In discriminating target materials from background clutter in hyperspectral imagery, one must contend with variability in both. Most algorithms focus on the clutter variability, but for some materials there is considerable variability in the spectral signatures of the target. This is especially the case for solid target materials, whose signatures depend on morphological properties (particle size, packing density, etc.) that are rarely known a priori. In this paper, we investigate detection algorithms that explicitly take into account the diversity of signatures for a given target. In particular, we investigate variable target detectors when applied to new representations of the hyperspectral data: a manifold learning based approach, and a residual based approach. The graph theory and manifold learning based approach incorporates multiple spectral signatures of the target material of interest; this is built upon previous work that used a single target spectrum. In this approach, we first build an adaptive nearest neighbors (ANN) graph on the data and target spectra, and use a biased locally linear embedding (LLE) transformation to perform nonlinear dimensionality reduction. This biased transformation results in a lower-dimensional representation of the data that better separates the targets from the background. The residual approach uses an annulus based computation to represent each pixel after an estimate of the local background is removed, which suppresses local backgrounds and emphasizes the target-containing pixels. We will show detection results in the original spectral space, the dimensionality-reduced space, and the residual space, all using subspace detectors: ranked spectral angle mapper (rSAM), subspace adaptive matched filter (ssAMF), and subspace adaptive cosine/coherence estimator (ssACE). Results of this exploratory study will be shown on a ground-truthed hyperspectral image with variable target spectra and both full and mixed pixel targets.
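
    As a baseline for the subspace detectors named above, the classical single-spectrum spectral angle mapper (SAM) statistic can be sketched as follows; the rSAM, ssAMF, and ssACE variants used in the paper are more elaborate, and the cube shape and threshold here are assumptions for illustration.

```python
import numpy as np

def spectral_angle(pixel, target):
    """Angle (radians) between a pixel spectrum and a target spectrum."""
    cos = np.dot(pixel, target) / (np.linalg.norm(pixel) * np.linalg.norm(target))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def sam_detect(cube, target, threshold):
    """Flag pixels of an (rows, cols, bands) cube whose spectral angle to the
    target spectrum falls below the detection threshold."""
    rows, cols, bands = cube.shape
    flat = cube.reshape(-1, bands)
    angles = np.array([spectral_angle(p, target) for p in flat])
    return (angles < threshold).reshape(rows, cols)
```

    A key property of SAM is invariance to illumination scaling: a scaled copy of the target spectrum has angle zero to the original.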

  16. Assisting students for lecture preparation: A Web-based approach

    NASA Astrophysics Data System (ADS)

    Herrick, Brad Jay

    Students continue to arrive at universities with poor study and time-management skills: they are not proactive in their studies, while professors are unwilling to hold them accountable for their shortcomings. The result is a 'dumbing down' of the course. This can be defeated by student preparation prior to attending lecture, especially in very large-lecture classrooms (N>400). In fact, it provides a process to 'dumb up' the course. A Web-based system for providing content-specific lecture preparations (termed 'Previews') was developed and tested in three courses at a large southwestern research institution. Significance was found in final course achievement by treatment levels, including variations by the total number of participations in the lecture preparations. Method of implementation and results are discussed, including future considerations.

  17. Use of the ‘Accountability for Reasonableness’ Approach to Improve Fairness in Accessing Dialysis in a Middle-Income Country

    PubMed Central

    Maree, Jonathan David; Chirehwa, Maxwell T.; Benatar, Solomon R.

    2016-01-01

    Universal access to renal replacement therapy is beyond the economic capability of most low and middle-income countries due to large patient numbers and the high recurrent cost of treating end stage kidney disease. In countries where limited access is available, no systems exist that allow for optimal use of the scarce dialysis facilities. We previously reported that using national guidelines to select patients for renal replacement therapy resulted in biased allocation. We reengineered selection guidelines using the ‘Accountability for Reasonableness’ (procedural fairness) framework in collaboration with relevant stakeholders, applying these in a novel way to categorize and prioritize patients in a unique hierarchical fashion. The guidelines were primarily premised on patients being transplantable. We examined whether the revised guidelines enhanced fairness of dialysis resource allocation. This is a descriptive study of 1101 end stage kidney failure patients presenting to a tertiary renal unit in a middle-income country, evaluated for dialysis treatment over a seven-year period. The Assessment Committee used the accountability for reasonableness-based guidelines to allocate patients to one of three assessment groups. Category 1 patients were guaranteed renal replacement therapy, Category 3 patients were palliated, and Category 2 were offered treatment if resources allowed. Only 25.2% of all end stage kidney disease patients assessed were accepted for renal replacement treatment. The largest group of patients (48%) was allocated to Category 2. Of 134 Category 1 patients, 98% were accepted for treatment while 438 (99.5%) Category 3 patients were excluded. Compared with those palliated, patients accepted for dialysis treatment were almost 10 years younger, employed, married with children and not diabetic. Compared with our previous selection process, our current method of priority setting based on procedural fairness arguably resulted in more equitable allocation of scarce dialysis resources.

  18. Use of an ecologically relevant modelling approach to improve remote sensing-based schistosomiasis risk profiling.

    PubMed

    Walz, Yvonne; Wegmann, Martin; Leutner, Benjamin; Dech, Stefan; Vounatsou, Penelope; N'Goran, Eliézer K; Raso, Giovanna; Utzinger, Jürg

    2015-11-30

    Schistosomiasis is a widespread water-based disease that puts close to 800 million people at risk of infection with more than 250 million infected, mainly in sub-Saharan Africa. Transmission is governed by the spatial distribution of specific freshwater snails that act as intermediate hosts and the frequency, duration and extent of human bodies exposed to infested water sources during human water contact. Remote sensing data have been utilized for spatially explicit risk profiling of schistosomiasis. However, such risk profiling inherits a conceptual drawback when school-based disease prevalence data are related directly to remote sensing measurements extracted at the location of the school, because disease transmission usually does not occur at the school itself. We therefore took the local environment around the schools into account by explicitly linking ecologically relevant environmental information from potential disease transmission sites to survey measurements of disease prevalence. Our models were validated at two sites with different landscapes in Côte d'Ivoire using high- and moderate-resolution remote sensing data based on random forest and partial least squares regression. We found that the ecologically relevant modelling approach explained up to 70% of the variation in Schistosoma infection prevalence and performed better compared to a purely pixel-based modelling approach. Furthermore, our study showed that model performance increased as a function of enlarging the school catchment area, confirming the hypothesis that suitable environments for schistosomiasis transmission rarely occur at the location of survey measurements.
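
    The core idea above, summarising environmental predictors over a catchment around each school rather than sampling the single school pixel, can be sketched as follows; the raster and radius are hypothetical stand-ins for the remote sensing data used in the study.

```python
import numpy as np

def catchment_mean(raster, row, col, radius):
    """Mean raster value within a circular catchment centred on (row, col),
    mimicking aggregation of environmental data around a school rather than
    sampling the single pixel at the school itself."""
    rr, cc = np.ogrid[:raster.shape[0], :raster.shape[1]]
    mask = (rr - row) ** 2 + (cc - col) ** 2 <= radius ** 2
    return raster[mask].mean()

# Toy 5x5 "environmental" raster; enlarging the radius changes the feature
# handed to the regression model (random forest or PLS in the study).
raster = np.arange(25, dtype=float).reshape(5, 5)
feature_small = catchment_mean(raster, 1, 1, 1)
feature_large = catchment_mean(raster, 1, 1, 3)
```

    Sweeping the radius and re-fitting the model is exactly the experiment that showed performance improving with catchment size.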

  19. Accountability and Primary Healthcare

    PubMed Central

    Mukhi, Shaheena; Barnsley, Jan; Deber, Raisa B.

    2014-01-01

    This paper examines the accountability structures within primary healthcare (PHC) in Ontario; in particular, who is accountable for what and to whom, and the policy tools being used. Ontario has implemented a series of incremental reforms, using expenditure policy instruments, enforced through contractual agreements to provide a defined set of publicly financed services that are privately delivered, most often by family physicians. The findings indicate that reporting, funding, evaluation and governance accountability requirements vary across service provider models. Accountability to the funder and patients is most common. Agreements, incentives and compensation tools have been used but may be insufficient to ensure parties are being held responsible for their activities related to stated goals. Clear definitions of various governance structures, a cohesive approach to monitoring critical performance indicators and associated improvement strategies are important elements in operationalizing accountability and determining whether goals are being met. PMID:25305392

  20. Accountability and primary healthcare.

    PubMed

    Mukhi, Shaheena; Barnsley, Jan; Deber, Raisa B

    2014-09-01

    This paper examines the accountability structures within primary healthcare (PHC) in Ontario; in particular, who is accountable for what and to whom, and the policy tools being used. Ontario has implemented a series of incremental reforms, using expenditure policy instruments, enforced through contractual agreements to provide a defined set of publicly financed services that are privately delivered, most often by family physicians. The findings indicate that reporting, funding, evaluation and governance accountability requirements vary across service provider models. Accountability to the funder and patients is most common. Agreements, incentives and compensation tools have been used but may be insufficient to ensure parties are being held responsible for their activities related to stated goals. Clear definitions of various governance structures, a cohesive approach to monitoring critical performance indicators and associated improvement strategies are important elements in operationalizing accountability and determining whether goals are being met. PMID:25305392

  1. Accountability Overboard

    ERIC Educational Resources Information Center

    Chieppo, Charles D.; Gass, James T.

    2009-01-01

    This article reports that special interest groups opposed to charter schools and high-stakes testing have hijacked Massachusetts's once-independent board of education and stand poised to water down the Massachusetts Comprehensive Assessment System (MCAS) tests and the accountability system they support. President Barack Obama and Massachusetts…

  2. Painless Accountability.

    ERIC Educational Resources Information Center

    Brown, R. W.; And Others

    The computerized Painless Accountability System is a performance objective system from which instructional programs are developed. Three main simplified behavioral response levels characterize this system: (1) cognitive, (2) psychomotor, and (3) affective domains. Each of these objectives is classified by one of 16 descriptors. The second major…

  3. Accounting Specialist.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This publication identifies 20 subjects appropriate for use in a competency list for the occupation of accounting specialist, 1 of 12 occupations within the business/computer technologies cluster. Each unit consists of a number of competencies; a list of competency builders is provided for each competency. Titles of the 20 units are as follows:…

  4. Bifurcation-based approach reveals synergism and optimal combinatorial perturbation.

    PubMed

    Liu, Yanwei; Li, Shanshan; Liu, Zengrong; Wang, Ruiqi

    2016-06-01

    Cells accomplish the process of fate decisions and form terminal lineages through a series of binary choices in which cells switch stable states from one branch to another as the interacting strengths of regulatory factors continuously vary. Various combinatorial effects may occur because almost all regulatory processes are managed in a combinatorial fashion. Combinatorial regulation is crucial for cell fate decisions because it may effectively integrate many different signaling pathways to meet the higher regulation demand during cell development. However, whether combinatorial regulation contributes more to the state transition than a single perturbation does and, if so, what the optimal combination strategy is, are significant questions from the point of view of both biology and mathematics. Using the approaches of combinatorial perturbations and bifurcation analysis, we provide a general framework for the quantitative analysis of synergism in molecular networks. Different from the known methods, the bifurcation-based approach depends only on stable state responses to stimuli because the state transition induced by combinatorial perturbations occurs between stable states. More importantly, an optimal combinatorial perturbation strategy can be determined by investigating the relationship between the bifurcation curve of a synergistic perturbation pair and the level set of a specific objective function. The approach is applied to two models, i.e., a theoretical multistable decision model and a biologically realistic CREB model, to show its validity, although the approach holds for a general class of biological systems.
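
    The stable-state bookkeeping behind such a bifurcation analysis can be illustrated on a canonical one-dimensional bistable system rather than the paper's CREB model; the dynamics dx/dt = r + x - x^3 and the swept parameter values below are illustrative.

```python
import numpy as np

def stable_fixed_points(r):
    """Real, stable fixed points of dx/dt = r + x - x**3, a canonical
    bistable system; x* is stable when f'(x*) = 1 - 3*x***2 < 0."""
    roots = np.roots([-1.0, 0.0, 1.0, r])        # -x^3 + x + r = 0
    real = roots[np.abs(roots.imag) < 1e-9].real
    return [x for x in sorted(real) if 1 - 3 * x ** 2 < 0]

# Sweeping the control parameter reveals the bistable window between the
# two saddle-node bifurcations at r = -2/(3*sqrt(3)) and r = +2/(3*sqrt(3)).
counts = {r: len(stable_fixed_points(r)) for r in (-1.0, 0.0, 1.0)}
```

    In the paper's setting the same counting is carried out along a bifurcation curve in a multi-parameter perturbation space, where a combinatorial perturbation moves the system across the curve.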

  5. A modified Shockley equation taking into account the multi-element nature of light emitting diodes based on nanowire ensembles.

    PubMed

    Musolino, M; Tahraoui, A; Treeck, D van; Geelhaar, L; Riechert, H

    2016-07-01

    In this work we study how the multi-element nature of light emitting diodes (LEDs) based on nanowire (NW) ensembles influences their current-voltage (I-V) characteristics. We systematically address critical issues of the fabrication process that can result in significant fluctuations of the electrical properties among the individual NWs in such LEDs, paying particular attention to the planarization step. Electroluminescence (EL) maps acquired for two nominally identical NW-LEDs reveal that small processing variations can result in a large difference in the number of individual nano-devices emitting EL. The lower number of EL spots in one of the LEDs is caused by its inhomogeneous electrical properties. The I-V characteristics of this LED cannot be described well by the classical Shockley model. We are able to take into account the multi-element nature of such LEDs and fit the I-V characteristics in the forward bias regime by employing an ad hoc adjusted version of the Shockley equation. More specifically, we introduce a bias dependence of the ideality factor. The basic considerations of our model should remain valid also for other types of devices based on ensembles of interconnected p-n junctions with inhomogeneous electrical properties, regardless of the employed material system.

  6. A modified Shockley equation taking into account the multi-element nature of light emitting diodes based on nanowire ensembles

    NASA Astrophysics Data System (ADS)

    Musolino, M.; Tahraoui, A.; van Treeck, D.; Geelhaar, L.; Riechert, H.

    2016-07-01

    In this work we study how the multi-element nature of light emitting diodes (LEDs) based on nanowire (NW) ensembles influences their current-voltage (I-V) characteristics. We systematically address critical issues of the fabrication process that can result in significant fluctuations of the electrical properties among the individual NWs in such LEDs, paying particular attention to the planarization step. Electroluminescence (EL) maps acquired for two nominally identical NW-LEDs reveal that small processing variations can result in a large difference in the number of individual nano-devices emitting EL. The lower number of EL spots in one of the LEDs is caused by its inhomogeneous electrical properties. The I-V characteristics of this LED cannot be described well by the classical Shockley model. We are able to take into account the multi-element nature of such LEDs and fit the I-V characteristics in the forward bias regime by employing an ad hoc adjusted version of the Shockley equation. More specifically, we introduce a bias dependence of the ideality factor. The basic considerations of our model should remain valid also for other types of devices based on ensembles of interconnected p-n junctions with inhomogeneous electrical properties, regardless of the employed material system.
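
    A minimal sketch of the modification described above: a Shockley-type diode law whose ideality factor is allowed to depend on bias, standing in for an ensemble of inhomogeneous nanowire p-n junctions. The linear form n(V) = n0 + slope*V and all parameter values are assumptions for illustration, not the authors' fitted model.

```python
import numpy as np

K_B_T_Q = 0.02585   # thermal voltage kT/q at room temperature, in volts

def shockley_current(v, i_sat, n):
    """Classical Shockley equation with a constant ideality factor n."""
    return i_sat * (np.exp(v / (n * K_B_T_Q)) - 1.0)

def modified_shockley_current(v, i_sat, n0, slope):
    """Same law with a bias-dependent ideality factor n(V) = n0 + slope*V,
    an ad hoc way to describe the forward I-V of a multi-element device."""
    n_v = n0 + slope * v
    return i_sat * (np.exp(v / (n_v * K_B_T_Q)) - 1.0)
```

    For slope > 0 the effective ideality factor grows with bias, so the forward current rises more slowly than the classical law predicts, which is the qualitative behaviour such a multi-element fit captures.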

  7. Pedestrian detection from thermal images: A sparse representation based approach

    NASA Astrophysics Data System (ADS)

    Qi, Bin; John, Vijay; Liu, Zheng; Mita, Seiichi

    2016-05-01

    Pedestrian detection, a key technology in computer vision, plays a paramount role in the applications of advanced driver assistant systems (ADASs) and autonomous vehicles. The objective of pedestrian detection is to identify and locate people in a dynamic environment so that accidents can be avoided. With significant variations introduced by illumination, occlusion, articulated pose, and complex background, pedestrian detection is a challenging task for visual perception. Different from visible images, thermal images are captured and presented as intensity maps based on objects' emissivity, and thus have an enhanced spectral range that makes human beings perceptible against the cool background. In this study, a sparse representation based approach is proposed for pedestrian detection from thermal images. We first adopted histograms of sparse codes to represent image features and then detect pedestrians with the extracted features in unimodal and multimodal frameworks, respectively. In the unimodal framework, two types of dictionaries, i.e. joint dictionary and individual dictionary, are built by learning from prepared training samples. In the multimodal framework, a weighted fusion scheme is proposed to further highlight the contributions from features with higher separability. To validate the proposed approach, experiments were conducted to compare with three widely used features: Haar wavelets (HWs), histogram of oriented gradients (HOG), and histogram of phase congruency (HPC) as well as two classification methods, i.e. AdaBoost and support vector machine (SVM). Experimental results on a publicly available data set demonstrate the superiority of the proposed approach.
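
    The histogram-of-sparse-codes feature can be sketched with a greedy matching-pursuit coder over a fixed dictionary; the dictionary, patches, and sparsity level below are illustrative, and the paper's learned joint and individual dictionaries are more elaborate.

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_nonzero=3):
    """Greedy matching pursuit: repeatedly pick the atom most correlated
    with the residual. dictionary: (n_atoms, dim) with unit-norm rows.
    Returns a sparse coefficient vector."""
    residual = signal.astype(float).copy()
    coeffs = np.zeros(len(dictionary))
    for _ in range(n_nonzero):
        corr = dictionary @ residual
        k = int(np.argmax(np.abs(corr)))
        coeffs[k] += corr[k]
        residual -= corr[k] * dictionary[k]
    return coeffs

def histogram_of_sparse_codes(patches, dictionary, n_nonzero=3):
    """Feature vector counting how often each dictionary atom is activated
    across the patches of an image region."""
    hist = np.zeros(len(dictionary))
    for patch in patches:
        hist += matching_pursuit(patch, dictionary, n_nonzero) != 0
    return hist / max(len(patches), 1)
```

    The resulting histogram is what a classifier such as AdaBoost or an SVM would consume in place of HOG or Haar features.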

  8. Contingent approach to Internet-based supply network integration

    NASA Astrophysics Data System (ADS)

    Ho, Jessica; Boughton, Nick; Kehoe, Dennis; Michaelides, Zenon

    2001-10-01

    The Internet is playing an increasingly important role in enhancing the operations of supply networks as many organizations begin to recognize the benefits of Internet-enabled supply arrangements. However, the developments and applications to date do not extend significantly beyond the dyadic model, whereas the real advantages are to be made with the external and network models to support a coordinated and collaborative based approach. The DOMAIN research group at the University of Liverpool is currently defining new Internet-enabled approaches to enable greater collaboration across supply chains. Different e-business models and tools are focusing on different applications. Using inappropriate e-business models, tools or techniques will bring negative results instead of benefits to all the tiers in the supply network. Thus there are a number of issues to be considered before addressing Internet-based supply network integration, in particular an understanding of supply chain management, the emergent business models and evaluating the effects of deploying e-business to the supply network or a particular tier. It is important to utilize a contingent approach to selecting the right e-business model to meet the specific supply chain requirements. This paper addresses the issues and provides a case study on the indirect materials supply networks.

  9. A Novel Rules Based Approach for Estimating Software Birthmark

    PubMed Central

    Binti Alias, Norma; Anwar, Sajid

    2015-01-01

    A software birthmark is a unique characteristic of a program that can be used to detect software theft. Comparing birthmarks can tell us whether one program is a copy of another. Software theft and piracy are rapidly increasing problems of copying, stealing, and misusing software without proper permission, as mentioned in the desired license agreement. The estimation of a birthmark can play a key role in understanding its effectiveness. In this paper, a new technique is presented to evaluate and estimate software birthmark based on the two most sought-after properties of birthmarks, that is, credibility and resilience. For this purpose, the concept of soft computing such as probabilistic and fuzzy computing has been taken into account and fuzzy logic is used to estimate properties of birthmark. The proposed fuzzy rule-based technique is validated through a case study and the results show that the technique is successful in assessing the specified properties of the birthmark, its resilience and credibility. This, in turn, shows how much effort will be required to detect the originality of the software based on its birthmark. PMID:25945363

  10. A novel rules based approach for estimating software birthmark.

    PubMed

    Nazir, Shah; Shahzad, Sara; Khan, Sher Afzal; Alias, Norma Binti; Anwar, Sajid

    2015-01-01

    A software birthmark is a unique characteristic of a program that can be used to detect software theft. Comparing birthmarks can tell us whether one program is a copy of another. Software theft and piracy are rapidly increasing problems of copying, stealing, and misusing software without proper permission, as mentioned in the desired license agreement. The estimation of a birthmark can play a key role in understanding its effectiveness. In this paper, a new technique is presented to evaluate and estimate software birthmark based on the two most sought-after properties of birthmarks, that is, credibility and resilience. For this purpose, the concept of soft computing such as probabilistic and fuzzy computing has been taken into account and fuzzy logic is used to estimate properties of birthmark. The proposed fuzzy rule-based technique is validated through a case study and the results show that the technique is successful in assessing the specified properties of the birthmark, its resilience and credibility. This, in turn, shows how much effort will be required to detect the originality of the software based on its birthmark.
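
    The flavour of the fuzzy rule-based estimation can be sketched with a tiny Mamdani-style rule base over the two birthmark properties; the triangular membership functions, breakpoints, and rule outputs below are hypothetical, not those of the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b over the support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def birthmark_quality(credibility, resilience):
    """Tiny rule base (hypothetical breakpoints), inputs in [0, 1]:
    IF credibility high AND resilience high THEN quality high, etc.
    Returns a crisp quality score via a weighted average of rule outputs."""
    cred_high = tri(credibility, 0.4, 1.0, 1.6)   # membership peaks at 1.0
    res_high = tri(resilience, 0.4, 1.0, 1.6)
    cred_low = tri(credibility, -0.6, 0.0, 0.6)   # membership peaks at 0.0
    res_low = tri(resilience, -0.6, 0.0, 0.6)

    rules = [
        (min(cred_high, res_high), 1.0),   # both strong -> high quality
        (min(cred_high, res_low), 0.5),    # mixed -> medium quality
        (min(cred_low, res_high), 0.5),
        (min(cred_low, res_low), 0.0),     # both weak -> low quality
    ]
    total = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / total if total else 0.0
```

    The crisp score plays the role of the paper's estimated birthmark quality: the stronger the credibility and resilience memberships, the higher the output.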

  11. Rights-Based Approaches to Ensure Sustainable Nutrition Security.

    PubMed

    Banerjee, Sweta

    2016-01-01

    In India, a rights-based approach has been used to address large-scale malnutrition, including both micro- and macro-level nutrition deficiencies. Stunting, which is an intergenerational chronic consequence of malnutrition, is especially widespread in India (38% among children under 5 years old). To tackle this problem, the government of India has designed interventions for the first 1,000 days, a critical period of the life cycle, through a number of community-based programs to fulfill the rights to food and life. However, the entitlements providing these rights have not yet produced the necessary changes in the malnutrition status of people, especially women and children. The government of India has already implemented laws and drafted a constitution that covers the needs of its citizens, but corruption, bureaucracy, lack of awareness of rights and entitlements and social discrimination limit people's access to basic rights and services. To address this crisis, Welthungerhilfe India, working in remote villages of the most backward states in India, has shifted from a welfare-based approach to a rights-based approach. The Fight Hunger First Initiative, started by Welthungerhilfe in 2011, is designed on the premise that in the long term, poor people can only leave poverty behind if adequate welfare systems are in place and if basic rights are fulfilled; these rights include access to proper education, sufficient access to adequate food and income, suitable health services and equal rights. Only then can the next generation of disadvantaged populations look forward to a new and better future and can growth benefit the entire society. The project, co-funded by the Federal Ministry for Economic Cooperation and Development, is a long-term multi-sectoral program that involves institution-building and empowerment. PMID:27198153

  12. A risk-based approach for a national assessment

    SciTech Connect

    Whelan, Gene; Laniak, Gerard F.

    1998-10-18

    The need for environmental systems modeling is growing rapidly because of 1) the combination of increasing technical scope and complexity related to questions of risk-based cause and effect and 2) the need to explicitly address cost effectiveness in both the development and implementation of environmental regulations. The nature of risk assessments is evolving as they grow more complex, assessing individual sites and collections of sites and addressing regional or national regulatory needs. These assessments require the integration of existing tools and the development of new databases and models, based on a comprehensive and holistic view of the risk assessment problem. To meet these environmental regulatory needs, multiple-media-based assessments are formulated to view and assess risks from a comprehensive environmental systems perspective, crossing the boundaries of several scientific disciplines. Given these considerations and the advanced state of computer hardware and software, it is possible to design a software system that facilitates the development and integration of assessment tools (e.g., databases and models). In this paper, a risk-based approach for supporting national risk assessments is presented. This approach combines 1) databases, 2) multiple-media models combining source-term, fate and transport, exposure, and risk/hazard modeling, and 3) sensitivity/uncertainty capabilities within a software system capable of growing within the science of risk assessment. The design and linkages of the system are discussed. This paper also provides the rationale behind the design of the framework, as there is a recognized need to develop more holistic approaches to risk assessment.

  13. Rights-Based Approaches to Ensure Sustainable Nutrition Security.

    PubMed

    Banerjee, Sweta

    2016-01-01

    In India, a rights-based approach has been used to address large-scale malnutrition, including both micro- and macro-level nutrition deficiencies. Stunting, which is an intergenerational chronic consequence of malnutrition, is especially widespread in India (38% among children under 5 years old). To tackle this problem, the government of India has designed interventions for the first 1,000 days, a critical period of the life cycle, through a number of community-based programs to fulfill the rights to food and life. However, the entitlements providing these rights have not yet produced the necessary changes in the malnutrition status of people, especially women and children. The government of India has already implemented laws and drafted a constitution that covers the needs of its citizens, but corruption, bureaucracy, lack of awareness of rights and entitlements and social discrimination limit people's access to basic rights and services. To address this crisis, Welthungerhilfe India, working in remote villages of the most backward states in India, has shifted from a welfare-based approach to a rights-based approach. The Fight Hunger First Initiative, started by Welthungerhilfe in 2011, is designed on the premise that in the long term, poor people can only leave poverty behind if adequate welfare systems are in place and if basic rights are fulfilled; these rights include access to proper education, sufficient access to adequate food and income, suitable health services and equal rights. Only then can the next generation of disadvantaged populations look forward to a new and better future and can growth benefit the entire society. The project, co-funded by the Federal Ministry for Economic Cooperation and Development, is a long-term multi-sectoral program that involves institution-building and empowerment.

  14. [Community-based approaches in the fight against Buruli ulcer : review of the literature].

    PubMed

    Ndongo, Paule Yolande; Fond-Harmant, Laurence; Deccache, Alain

    2014-01-01

    Buruli ulcer (BU) is an infectious skin disease caused by Mycobacterium ulcerans. It mainly affects poor communities living close to bodies of water. In the absence of early treatment, this "neglected" disease can cause lasting deformities and may require limb amputation. It is reported in 34 countries and is the third most common mycobacterial disease in immunocompetent patients. Considerable progress has been made in treatment and prevention. The Cotonou Declaration (2009) describes the recommended control strategies. Although effective, current control strategies are limited because they do not take into account all the factors that influence emergence, prevention and cure of the disease. The control of Buruli ulcer mainly depends on intervention on social, cultural and psychosocial factors that influence preventive and self-care behaviour. The health promotion approach requires collaboration with populations in order to perform simultaneous actions on BU factors in the community setting. Although effective on many health problems, health promotion is not applied in the fight against BU due to the absence of action on all factors such as poverty. This article presents a review of the literature on BU strategies and community approaches. 407 relevant articles published in the 1998-2013 period were examined. Eleven programmes are based on a top-down approach, which does not include populations in decision-making processes, unlike the bottom-up participatory approaches recommended in health promotion. Three health promotion programmes and 6 community-based participatory approaches were identified and examined. Community participation and empowerment constitute the basis for a community approach in the fight against Buruli ulcer.

  15. Relativistic three-body quark model of light baryons based on hypercentral approach

    NASA Astrophysics Data System (ADS)

    Aslanzadeh, M.; Rajabi, A. A.

    2015-05-01

    In this paper, we have treated the light baryons as a relativistic three-body bound system. Inspired by lattice QCD calculations, we treated baryons as a spin-independent three-quark system within a relativistic three-quark model based on the three-particle Klein-Gordon equation. We presented the analytical solution of the three-body Klein-Gordon equation, employing a constituent quark model based on a hypercentral approach through which two- and three-body forces are taken into account. In this way, the average energies of the multiplets containing up, down and strange quarks are reproduced. To describe the hyperfine structure of the baryon, the splittings within the SU(6)-multiplets are produced by the generalized Gürsey-Radicati mass formula. The considered SU(6)-invariant potential is the popular "Coulomb-plus-linear" potential, and the strange and non-strange baryon spectra are in general well reproduced.
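
    For reference, the Gürsey-Radicati mass formula invoked above is commonly written, in its original form, as

```latex
M = M_0 + A\,S(S+1) + B\,Y + C\left[I(I+1) - \tfrac{1}{4}Y^2\right]
```

    where S, I and Y are the spin, isospin and hypercharge of the baryon and M_0, A, B, C are coefficients fitted to the spectrum; the generalized version used in such studies adds further SU(6)-breaking terms.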

  16. New approaches to addiction treatment based on learning and memory.

    PubMed

    Kiefer, Falk; Dinter, Christina

    2013-01-01

    Preclinical studies suggest that physiological learning processes are similar to changes observed in addicts at the molecular, neuronal, and structural levels. Based on the importance of classical and instrumental conditioning in the development and maintenance of addictive disorders, many have suggested cue-exposure-based extinction training of conditioned, drug-related responses as a potential new treatment of addiction. It may also be possible to facilitate this extinction training with pharmacological compounds that strengthen memory consolidation during cue exposure. Another potential therapeutic intervention would be based on the so-called reconsolidation theory. According to this hypothesis, already-consolidated memories return to a labile state when reactivated, allowing them to undergo another phase of consolidation-reconsolidation, which can be pharmacologically manipulated. These approaches suggest that the extinction of drug-related memories may represent a viable treatment strategy in the future treatment of addiction.

  18. Probabilistic Risk-Based Approach to Aeropropulsion System Assessment Developed

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2001-01-01

    In an era of shrinking development budgets and resources, and an emphasis on reducing the product development cycle, the role of system assessment, performed in the early stages of an engine development program, becomes critical to the successful development of new aeropropulsion systems. A reliable system assessment not only helps to identify the best propulsion system concept among several candidates but also identifies which technologies are worth pursuing. This is particularly important for advanced aeropropulsion technology development programs, which require an enormous amount of resources. In the current practice of deterministic, or point-design, approaches, the uncertainties of design variables are either unaccounted for or accounted for by safety factors. This often results in an assessment with unknown and unquantifiable reliability. Consequently, it fails to provide additional insight into the risks associated with the new technologies, which is often needed by decision makers to determine the feasibility and return on investment of a new aircraft engine.
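
    The difference between a point-design and a probabilistic assessment can be illustrated with a minimal Monte Carlo sketch. Everything below is hypothetical: the surrogate formula, the input distributions and the 2% requirement are illustrative stand-ins, not values from the NASA study.

```python
import random

random.seed(42)

def sfc(efficiency, pressure_ratio):
    # Toy surrogate for specific fuel consumption (lower is better).
    # The formula is illustrative only, not an engine model.
    return 1.0 / (efficiency * pressure_ratio ** 0.25)

# Point design: nominal values only, no notion of risk.
nominal = sfc(0.90, 40.0)

# Probabilistic design: propagate uncertainty in the design variables
# and estimate the probability of missing a requirement.
requirement = nominal * 1.02  # must not exceed 2% above nominal
misses = 0
trials = 100_000
for _ in range(trials):
    eff = random.gauss(0.90, 0.02)  # uncertain component efficiency
    pr = random.gauss(40.0, 2.0)    # uncertain overall pressure ratio
    if sfc(eff, pr) > requirement:
        misses += 1

risk = misses / trials
print(f"probability of exceeding the SFC requirement: {risk:.3f}")
```

    Where the point design reports a single number with no reliability attached, the Monte Carlo loop quantifies how likely the concept is to miss its requirement.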

  19. Predicting dispersal distance in mammals: a trait-based approach.

    PubMed

    Whitmee, Sarah; Orme, C David L

    2013-01-01

    Dispersal is one of the principal mechanisms influencing ecological and evolutionary processes but quantitative empirical data are unfortunately scarce. As dispersal is likely to influence population responses to climate change, whether by adaptation or by migration, there is an urgent need to obtain estimates of dispersal distance. Cross-species correlative approaches identifying predictors of dispersal distance can provide much-needed insights into this data-scarce area. Here, we describe the compilation of a new data set of natal dispersal distances and use it to test life-history predictors of dispersal distance in mammals and examine the strength of the phylogenetic signal in dispersal distance. We find that both maximum and median dispersal distances have strong phylogenetic signals. No single model performs best in describing either maximum or median dispersal distances when phylogeny is taken into account but many models show high explanatory power, suggesting that dispersal distance per generation can be estimated for mammals with comparatively little data availability. Home range area, geographic range size and body mass are identified as the most important terms across models. Cross-validation of models supports the ability of these variables to predict dispersal distances, suggesting that models may be extended to species where dispersal distance is unknown.

  20. A cloud-based approach to medical NLP.

    PubMed

    Chard, Kyle; Russell, Michael; Lussier, Yves A; Mendonça, Eneida A; Silverstein, Jonathan C

    2011-01-01

    Natural Language Processing (NLP) enables access to deep content embedded in medical texts. To date, NLP has not fulfilled its promise of enabling robust clinical encoding, clinical use, quality improvement, and research. We submit that this is in part due to poor accessibility, scalability, and flexibility of NLP systems. We describe here an approach and system which leverages cloud-based approaches such as virtual machines and Representational State Transfer (REST) to extract, process, synthesize, mine, compare/contrast, explore, and manage medical text data in a flexibly secure and scalable architecture. Available architectures in which our Smntx (pronounced as semantics) system can be deployed include: virtual machines in a HIPAA-protected hospital environment, brought up to run analysis over bulk data and destroyed in a local cloud; a commercial cloud for a large complex multi-institutional trial; and within other architectures such as caGrid, i2b2, or NHIN.

  2. Integrated Systems-Based Approach to Monitoring Environmental Remediation

    SciTech Connect

    Bunn, Amoret L.; Truex, Michael J.; Oostrom, Martinus; Carroll, Kenneth C.; Wellman, Dawn M.

    2013-02-24

    The US Department of Energy (DOE) is responsible for risk reduction and cleanup of its nuclear weapons complex. Remediation strategies for some of the existing contamination use techniques that mitigate risk, but leave contaminants in place. Monitoring to verify remedy performance and long-term mitigation of risk is a key element for implementing these strategies and can be a large portion of the total cost of remedy implementation. Especially in these situations, there is a need for innovative monitoring approaches that move away from cost- and labor-intensive point-source monitoring. A systems-based approach to monitoring design focuses monitoring on controlling features and processes to enable effective interpretation of remedy performance.

  4. Evolutionary modeling-based approach for model errors correction

    NASA Astrophysics Data System (ADS)

    Wan, S. Q.; He, W. P.; Wang, L.; Jiang, W.; Zhang, W.

    2012-08-01

    The inverse problem of using the information in historical data to estimate model errors is a frontier research topic. In this study, we investigate such a problem using the classic Lorenz (1963) equations as the prediction model and the Lorenz equations with a periodic evolutionary function as an accurate representation of reality to generate "observational data." On the basis of the intelligent features of evolutionary modeling (EM), including self-organization, self-adaptation and self-learning, the dynamic information contained in the historical data can be identified and extracted automatically by computer. On this basis, we propose a new approach to estimating model errors based on EM. Numerical tests demonstrate the ability of the new approach to correct model structural errors; in effect, it combines statistics and dynamics to a certain extent.
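
    The idea can be sketched in a drastically simplified form: the Lorenz system is replaced by a scalar toy model, and full evolutionary modeling by a (1+1) evolution strategy, so every function and value below is an illustrative assumption rather than the authors' method.

```python
import math
import random

random.seed(0)

# "Reality": a base model plus a periodic model error, echoing the
# paper's synthetic setup (reduced here to a scalar toy problem).
def reality(t):
    return 2.0 * t + 0.8 * math.sin(1.3 * t)

def base_model(t):  # the imperfect prediction model
    return 2.0 * t

ts = [0.05 * i for i in range(200)]
residuals = [reality(t) - base_model(t) for t in ts]  # "historical data"

def cost(params):
    amp, freq = params
    return sum((r - amp * math.sin(freq * t)) ** 2
               for t, r in zip(ts, residuals))

# (1+1) evolution strategy: mutate, keep the child if it is no worse.
parent = [random.uniform(0, 2), random.uniform(0.5, 2)]
for _ in range(5000):
    child = [p + random.gauss(0, 0.05) for p in parent]
    if cost(child) <= cost(parent):
        parent = child

amp, freq = parent
print(f"estimated error term: {amp:.2f} * sin({freq:.2f} * t)")
```

    The evolved correction term should fit the residuals better than no correction at all, which is the sense in which the historical data "teach" the model its own structural error.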

  5. A Thermodynamically-consistent FBA-based Approach to Biogeochemical Reaction Modeling

    NASA Astrophysics Data System (ADS)

    Shapiro, B.; Jin, Q.

    2015-12-01

    Microbial rates are critical to understanding biogeochemical processes in natural environments. Recently, flux balance analysis (FBA) has been applied to predict microbial rates in aquifers and other settings. FBA is a genome-scale constraint-based modeling approach that computes metabolic rates and other phenotypes of microorganisms. This approach requires a prior knowledge of substrate uptake rates, which is not available for most natural microbes. Here we propose to constrain substrate uptake rates on the basis of microbial kinetics. Specifically, we calculate rates of respiration (and fermentation) using a revised Monod equation; this equation accounts for both the kinetics and thermodynamics of microbial catabolism. Substrate uptake rates are then computed from the rates of respiration, and applied to FBA to predict rates of microbial growth. We implemented this method by linking two software tools, PHREEQC and COBRA Toolbox. We applied this method to acetotrophic methanogenesis by Methanosarcina barkeri, and compared the simulation results to previous laboratory observations. The new method constrains acetate uptake by accounting for the kinetics and thermodynamics of methanogenesis, and predicted well the observations of previous experiments. In comparison, traditional methods of dynamic-FBA constrain acetate uptake on the basis of enzyme kinetics, and failed to reproduce the experimental results. These results show that microbial rate laws may provide a better constraint than enzyme kinetics for applying FBA to biogeochemical reaction modeling.
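
    The general shape of a thermodynamically revised Monod rate law can be sketched as follows. The functional form follows the kinetics-plus-thermodynamics idea described above, but every parameter value is illustrative, not a fitted value from the M. barkeri study.

```python
import math

R = 8.314e-3  # gas constant, kJ/(mol*K)

def revised_monod(k, biomass, s, K_s, dG_reaction, dG_ATP, m, chi, T=298.15):
    """Respiration rate limited by both kinetics and thermodynamics.

    k: rate constant; biomass: cell concentration; s: substrate conc.;
    K_s: half-saturation constant; dG_reaction: free energy of catabolism
    (kJ/mol, negative when favorable); dG_ATP: energy conserved per ATP;
    m: ATP yield per reaction; chi: average stoichiometric number.
    """
    kinetic = s / (K_s + s)
    # Thermodynamic factor: near 1 far from equilibrium, 0 once the
    # available energy no longer covers ATP synthesis.
    f = dG_reaction + m * dG_ATP
    thermodynamic = max(0.0, 1.0 - math.exp(f / (chi * R * T)))
    return k * biomass * kinetic * thermodynamic

# Illustrative numbers for acetotrophic methanogenesis (not fitted values):
rate = revised_monod(k=1.0, biomass=0.1, s=1e-3, K_s=5e-4,
                     dG_reaction=-30.0, dG_ATP=45.0, m=0.5, chi=2.0)
print(f"respiration rate: {rate:.4f}")
```

    In a dynamic-FBA coupling, a rate computed this way would constrain the substrate uptake flux; a classical Monod law is recovered by dropping the thermodynamic factor.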

  6. Practical and scientifically based approaches for cleanup and site restoration.

    PubMed

    Till, John E; McBaugh, Debra

    2005-11-01

    This paper presents practical and scientific approaches for cleanup and site restoration following terrorist events. Both approaches are required in actual emergency situations and are complementary. The practical examples are taken from the May 2003 second biannual national emergency exercise, Top Officials 2 (TOPOFF 2), which occurred in Chicago, Illinois, and Seattle, Washington. The scientific examples are taken from the Department of Energy sites at Rocky Flats, Fernald, and Los Alamos, where cleanup initiatives based on scientific approaches and community input are underway. Three examples are provided to explain, from a practical standpoint, how decisions during the exercise had to be made quickly, even though the alternatives were not always clear. These examples illustrate how scientific approaches can be integrated into the resolution of these dilemmas. The examples are (1) use of water to wash city roads and freeways contaminated with plutonium, americium, and cesium; (2) decontamination of large public ferries that passed through a radioactive plume; and (3) handling of wastewater following decontamination within a city. Each of these situations posed the need for an immediate decision by authorities in charge, without the benefit of community input or time for an analysis of the important pathways of exposure. It is evident there is a need to merge the practical knowledge gained in emergency response with scientific knowledge learned from cleanup and site restoration. The development of some basic scientific approaches ahead of time, in the form of easy-to-use tools, will allow practical decisions to be made more quickly and effectively should an actual terrorist event occur. PMID:16217202

  7. A graph-based approach for designing extensible pipelines

    PubMed Central

    2012-01-01

    Background: In bioinformatics, it is important to build extensible and low-maintenance systems that are able to deal with the new tools and data formats that are constantly being developed. The traditional and simplest implementation of pipelines involves hardcoding the execution steps into programs or scripts. This approach can lead to problems when a pipeline is expanding, because the incorporation of new tools is often error prone and time consuming. Current approaches to pipeline development, such as workflow management systems, focus on analysis tasks that are systematically repeated without significant changes in their course of execution, such as genome annotation. However, more dynamism in pipeline composition is necessary when each execution requires a different combination of steps. Results: We propose a graph-based approach to implementing extensible and low-maintenance pipelines that is suitable for pipeline applications with multiple functionalities requiring different combinations of steps in each execution. Here, pipelines are composed automatically by compiling a specialised set of tools on demand, depending on the functionality required, instead of specifying every sequence of tools in advance. We represent the connectivity of pipeline components with a directed graph in which components are the graph edges, their inputs and outputs are the graph nodes, and the paths through the graph are pipelines. To that end, we developed special data structures and a pipeline system algorithm. We demonstrate the applicability of our approach by implementing a format conversion pipeline for the fields of population genetics and genetic epidemiology, but our approach is also helpful in other fields where multiple software tools must be combined to perform comprehensive analyses, such as gene expression and proteomics analyses. The project code, documentation and the Java executables are available under an open source license at http
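
    The edge-labelled graph idea lends itself to a compact sketch: formats are nodes, tools are edges, and composing a pipeline is a path search. The formats and tool names below are made up for illustration; this is not the paper's implementation.

```python
from collections import deque

# Components are graph edges: (input format -> output format) -> tool.
# These tool names are hypothetical.
components = {
    ("vcf", "ped"): "vcf2ped",
    ("ped", "bed"): "ped2bed",
    ("vcf", "csv"): "vcf2csv",
    ("bed", "csv"): "bed2csv",
}

def compose_pipeline(source, target):
    """BFS over the format graph; the path of edges *is* the pipeline."""
    graph = {}
    for (src, dst), tool in components.items():
        graph.setdefault(src, []).append((dst, tool))
    queue = deque([(source, [])])
    seen = {source}
    while queue:
        fmt, tools = queue.popleft()
        if fmt == target:
            return tools
        for nxt, tool in graph.get(fmt, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, tools + [tool]))
    return None  # no pipeline exists

print(compose_pipeline("vcf", "bed"))  # → ['vcf2ped', 'ped2bed']
```

    Adding a new tool is a single dictionary entry; every pipeline that can use it is then discovered automatically, which is the low-maintenance property the abstract describes.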

  8. Big Data Analytics in Immunology: A Knowledge-Based Approach

    PubMed Central

    Zhang, Guang Lan

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of normally time-consuming processes of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow. PMID:25045677

  9. A practical approach to object based requirements analysis

    NASA Technical Reports Server (NTRS)

    Drew, Daniel W.; Bishop, Michael

    1988-01-01

    Presented here is an approach developed at the Unisys Houston Operation Division, which supports the early identification of objects. This domain oriented analysis and development concept is based on entity relationship modeling and object data flow diagrams. These modeling techniques, based on the GOOD methodology developed at the Goddard Space Flight Center, support the translation of requirements into objects which represent the real-world problem domain. The goal is to establish a solid foundation of understanding before design begins, thereby giving greater assurance that the system will do what is desired by the customer. The transition from requirements to object oriented design is also promoted by having requirements described in terms of objects. Presented is a five step process by which objects are identified from the requirements to create a problem definition model. This process involves establishing a base line requirements list from which an object data flow diagram can be created. Entity-relationship modeling is used to facilitate the identification of objects from the requirements. An example is given of how semantic modeling may be used to improve the entity-relationship model and a brief discussion on how this approach might be used in a large scale development effort.

  10. A Personalized Collaborative Recommendation Approach Based on Clustering of Customers

    NASA Astrophysics Data System (ADS)

    Wang, Pu

    Collaborative filtering is known to be among the most successful recommender techniques in recommendation systems. Collaborative methods recommend items based on aggregated user ratings of those items, and these techniques do not depend on the availability of textual descriptions. They share the common goal of assisting in the users' search for items of interest, and thus attempt to address one of the key research problems of information overload. Collaborative filtering systems can deal with large numbers of customers and with many different products. However, the set of ratings is typically sparse, such that any two customers will most likely have only a few co-rated products. The high-dimensional sparsity of the rating matrix and the problem of scalability result in low-quality recommendations. In this paper, a personalized collaborative recommendation approach based on clustering of customers is presented. This method uses clustering to form customer centers. The personalized collaborative filtering approach based on clustering of customers can alleviate the scalability problem in collaborative recommendation.
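
    A minimal sketch of the idea, assuming a toy rating matrix and plain k-means clustering (the paper's actual clustering procedure and data are not reproduced): customers are grouped by their rating vectors, and recommendations come from the nearest cluster center rather than from a scan of all customers.

```python
import random

random.seed(1)

# Rows: customers, columns: items; 0 means "not rated". Toy data.
ratings = [
    [5, 4, 0, 1, 0],
    [4, 5, 1, 0, 0],
    [0, 1, 5, 4, 0],
    [1, 0, 4, 5, 1],
    [5, 5, 0, 0, 1],
]

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, iters=20):
    centers = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: distance(p, centers[c]))
            clusters[i].append(p)
        for i, members in enumerate(clusters):
            if members:
                centers[i] = [sum(col) / len(members) for col in zip(*members)]
    return centers

def recommend(customer, centers, n=1):
    """Recommend the customer's unrated items that score highest
    in the nearest cluster center."""
    center = min(centers, key=lambda c: distance(customer, c))
    unrated = [j for j, r in enumerate(customer) if r == 0]
    return sorted(unrated, key=lambda j: center[j], reverse=True)[:n]

centers = kmeans(ratings, k=2)
print(recommend(ratings[0], centers))  # one of customer 0's unrated items (2 or 4)
```

    At recommendation time only k centers are compared instead of all customers, which is how clustering addresses the scalability problem noted above.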

  13. Tool use and affordance: Manipulation-based versus reasoning-based approaches.

    PubMed

    Osiurak, François; Badets, Arnaud

    2016-10-01

    Tool use is a defining feature of the human species. Therefore, a fundamental issue is to understand the cognitive bases of human tool use. Given that people cannot use tools without manipulating them, proponents of the manipulation-based approach have argued that tool use might be supported by the simulation of past sensorimotor experiences, also sometimes called affordances. Meanwhile, however, evidence has accumulated demonstrating the critical role of mechanical knowledge in tool use (i.e., the reasoning-based approach). The major goal of the present article is to examine the validity of the assumptions derived from the manipulation-based versus the reasoning-based approach. To do so, we identified 3 key issues on which the 2 approaches differ, namely, (a) the reference frame issue, (b) the intention issue, and (c) the action domain issue. These issues are addressed in light of studies in experimental psychology and neuropsychology that have provided valuable contributions to the topic (i.e., tool-use interaction, orientation effect, object-size effect, utilization behavior and anarchic hand, tool use and perception, apraxia of tool use, transport vs. use actions). To anticipate our conclusions, the reasoning-based approach seems promising for understanding the current literature, even if it is not fully satisfactory because a certain number of findings are easier to interpret in terms of the manipulation-based approach. A new avenue for future research might be to develop a framework accommodating both approaches, thereby shedding new light on the cognitive bases of human tool use and affordances.

  15. Contamination of Current Accountability Systems

    ERIC Educational Resources Information Center

    McGill-Franzen, Anne; Allington, Richard

    2006-01-01

    As public employees, educators should expect to be held accountable for their use of public funds. Nonetheless, the various state governments and now the U.S. Department of Education have implemented high-stakes achievement testing as the nearly singular approach to accountability. While these accountability efforts vary in a number of ways,…

  16. Scapular dyskinesia: evolution towards a systems-based approach.

    PubMed

    Willmore, Elaine G; Smith, Michael J

    2016-01-01

    Historically, scapular dyskinesia has been used to describe an isolated clinical entity whereby an abnormality in positioning, movement or function of the scapula is present. Based upon this, treatment approaches have focused on addressing local isolated muscle activity. Recently, however, there has been a progressive move towards viewing the scapula as being part of a wider system of movement that is regulated and controlled by multiple factors, including the wider kinetic chain and individual patient-centred requirements. We therefore propose a paradigm shift whereby scapular dyskinesia is seen not in isolation but is considered within the broader context of patient-centred care and an entire neuromuscular system. PMID:27583003

  17. A Clock Fingerprints-Based Approach for Wireless Transmitter Identification

    NASA Astrophysics Data System (ADS)

    Zhao, Caidan; Xie, Liang; Huang, Lianfen; Yao, Yan

    Cognitive radio (CR) has been proposed as a promising solution to low spectrum utilization. However, security problems such as the primary user emulation (PUE) attack severely limit its applications. In this paper, we propose a clock fingerprints-based authentication approach to prevent PUE attacks in CR networks with the help of curve fitting and a classifier. An experimental setup was constructed using WLAN cards and software radio devices, and the corresponding results show that satisfactory identification of wireless transmitters can be achieved.

  18. Infections on Temporal Networks—A Matrix-Based Approach

    PubMed Central

    Koher, Andreas; Lentz, Hartmut H. K.; Hövel, Philipp; Sokolov, Igor M.

    2016-01-01

    We extend the concept of accessibility in temporal networks to model infections with a finite infectious period such as the susceptible-infected-recovered (SIR) model. This approach is entirely based on elementary matrix operations and unifies the disease and network dynamics within one algebraic framework. We demonstrate the potential of this formalism for three examples of networks with high temporal resolution: networks of social contacts, sexual contacts, and livestock-trade. Our investigations provide a new methodological framework that can be used, for instance, to estimate the epidemic threshold, a quantity that determines disease parameters, for which a large-scale outbreak can be expected. PMID:27035128
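
    The core algebraic idea, accessibility as an ordered product of per-snapshot adjacency matrices, can be sketched with Boolean matrices. This shows plain accessibility (an SI-like process); the paper's extension to a finite infectious period additionally limits how long a path may wait at a node.

```python
def bool_matmul(a, b):
    n = len(a)
    return [[any(a[i][k] and b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def identity_plus(a):
    # I + A_t: a node can stay put or follow an edge in this snapshot.
    n = len(a)
    return [[a[i][j] or (i == j) for j in range(n)] for i in range(n)]

def accessibility(snapshots):
    """P = (I + A_1)(I + A_2)...(I + A_T): P[i][j] is True iff a
    time-respecting path leads from i to j."""
    n = len(snapshots[0])
    p = [[i == j for j in range(n)] for i in range(n)]
    for a in snapshots:
        p = bool_matmul(p, identity_plus(a))
    return p

# Three nodes; edge 0->1 exists at t=1 and edge 1->2 at t=2, so 0 can
# reach 2 -- but not if the snapshots come in the reverse order.
A1 = [[0, 1, 0], [0, 0, 0], [0, 0, 0]]
A2 = [[0, 0, 0], [0, 0, 1], [0, 0, 0]]
print(accessibility([A1, A2])[0][2])  # True: causal order respected
print(accessibility([A2, A1])[0][2])  # False: edges in the wrong order
```

    The asymmetry between the two orderings is exactly what distinguishes temporal accessibility from the reachability of the aggregated static network.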

  19. Kinect-based rehabilitation exercises system: therapist involved approach.

    PubMed

    Yao, Li; Xu, Hui; Li, Andong

    2014-01-01

    Kinect-based physical rehabilitation is receiving increasing recognition as a convenient option for patients who would otherwise need therapy delivered by health professionals. Most previous studies approached the problem from the patients' point of view. This paper proposes a system aiming to simplify recovery instruction by therapists and to increase patients' motivation to participate in rehabilitation exercises. Furthermore, the architecture for developing such a rehabilitation system is designed around motion capture, human action recognition and a standard exercise prototype using the Kinect device. PMID:25226964

  20. An RBF-PSO based approach for modeling prostate cancer

    NASA Astrophysics Data System (ADS)

    Perracchione, Emma; Stura, Ilaria

    2016-06-01

    Prostate cancer is one of the most common cancers in men; it grows slowly and can be diagnosed at an early stage by measuring the Prostate Specific Antigen (PSA) level. However, a relapse after the primary therapy arises in 25-30% of cases, and different growth characteristics of the new tumor are observed. In order to get a better understanding of the phenomenon, a two-parameter growth model is considered. To estimate the parameter values identifying the disease risk level, a novel approach based on combining Particle Swarm Optimization (PSO) with meshfree interpolation methods is proposed.
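
    The parameter-estimation step can be illustrated with a generic PSO sketch. The exponential growth law, the synthetic data, the search bounds and all PSO settings below are illustrative assumptions; the paper's actual growth model and its RBF interpolation component are not reproduced.

```python
import math
import random

random.seed(3)

# Synthetic "PSA" series from a two-parameter growth law y = y0 * exp(c*t).
true_y0, true_c = 1.5, 0.25
data = [(t, true_y0 * math.exp(true_c * t)) for t in range(10)]

LO, HI = (0.1, 0.0), (5.0, 1.0)  # search bounds for (y0, c)

def loss(params):
    y0, c = params
    return sum((y0 * math.exp(c * t) - y) ** 2 for t, y in data)

def pso(n=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(LO[d], HI[d]) for d in range(2)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in pos]              # per-particle best position
    gbest = min(pbest, key=loss)[:]          # swarm best position
    for _ in range(iters):
        for i in range(n):
            for d in range(2):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # keep particles inside the bounds (avoids overflow in exp)
                pos[i][d] = min(max(pos[i][d] + vel[i][d], LO[d]), HI[d])
            if loss(pos[i]) < loss(pbest[i]):
                pbest[i] = pos[i][:]
                if loss(pbest[i]) < loss(gbest):
                    gbest = pbest[i][:]
    return gbest

y0, c = pso()
print(f"estimated parameters: y0={y0:.2f}, c={c:.2f} (true: 1.5, 0.25)")
```

    The recovered parameter pair places a patient on the growth curve that best matches the measurements, which is the quantity used to grade the disease risk level.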