Averaging business cycles vs. myopia: Do we need a long term vision when developing IRP?
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDonald, C.; Gupta, P.C.
1995-05-01
Utility demand forecasting is inherently imprecise due to the number of uncertainties resulting from business cycles, policy making, technology breakthroughs, national and international political upheavals, and the limitations of the forecasting tools. This implies that revisions based primarily on recent experience could lead to unstable forecasts. Moreover, new planning tools are required that provide an explicit consideration of uncertainty and lead to flexible and robust planning decisions.
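A toy numerical illustration of the abstract's point (all growth figures below are invented, not utility data): a forecast revised from only the most recent years swings with each boom or downturn, while an average over a full business cycle stays comparatively stable.

```python
# Hypothetical sketch: myopic revision vs. averaging over a business cycle.
demand_growth = [3.1, 2.4, -0.8, 0.5, 2.9, 3.3, -1.1, 0.9]  # % per year, two cycles

def myopic_forecast(history, window=2):
    """Extrapolate from only the most recent `window` years."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def cycle_average_forecast(history, cycle_len=4):
    """Average over at least one full business cycle."""
    span = history[-cycle_len:]
    return sum(span) / len(span)

# The myopic forecast chases the latest downturn or boom...
myopic = [myopic_forecast(demand_growth[:k]) for k in range(4, len(demand_growth) + 1)]
# ...while the cycle average barely moves.
averaged = [cycle_average_forecast(demand_growth[:k]) for k in range(4, len(demand_growth) + 1)]

spread = lambda xs: max(xs) - min(xs)
print(f"myopic spread: {spread(myopic):.2f}, cycle-average spread: {spread(averaged):.2f}")
```

On these made-up figures the myopic forecast ranges over about 3 percentage points between revisions while the cycle average moves by a quarter of a point, which is the instability the abstract warns about.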
The Analytic Hierarchy Process and Participatory Decisionmaking
Daniel L. Schmoldt; Daniel L. Peterson; Robert L. Smith
1995-01-01
Managing natural resource lands requires social, as well as biophysical, considerations. Unfortunately, it is extremely difficult to accurately assess and quantify changing social preferences, and to aggregate conflicting opinions held by diverse social groups. The Analytic Hierarchy Process (AHP) provides a systematic, explicit, rigorous, and robust mechanism for...
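The core AHP computation referred to above can be sketched briefly. In this hedged example the 3x3 pairwise-comparison judgments are invented for illustration; priority weights come from the principal eigenvector of the reciprocal matrix, with Saaty's consistency ratio as a sanity check.

```python
# Sketch of the Analytic Hierarchy Process weight derivation.
import numpy as np

# A[i][j] = how strongly criterion i is preferred over j (Saaty's 1-9 scale);
# the matrix must be reciprocal: A[j][i] == 1 / A[i][j]. Values are invented.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)              # principal (Perron) eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                 # normalized priorities

# Consistency index CI = (lambda_max - n) / (n - 1); Saaty's rule of thumb
# accepts judgments when CI / RI < 0.1 (RI ~ 0.58 for n = 3).
n = A.shape[0]
lambda_max = eigvals[k].real
CI = (lambda_max - n) / (n - 1)
CR = CI / 0.58

print("priorities:", np.round(weights, 3), "consistency ratio:", round(CR, 3))
```

Aggregating conflicting group opinions, as the abstract mentions, is typically done by taking geometric means of individual judgment matrices before this eigenvector step.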
Understanding the spatial distribution of environmental amenities requires consideration of social and biogeophysical factors, and how they interact to produce patterns of environmental justice or injustice. In this study, we explicitly account for terrain, a key local environmen...
Changes in Determining Gifted Eligibility for Underrepresented Students. Research Note. Volume 0802
ERIC Educational Resources Information Center
Froman, Terry
2008-01-01
There are a number of detailed requirements for gifted screening, referral and eligibility specified under State Board Rules and School Board Rules. In all of these areas, explicit mention is made of the Florida Comprehensive Assessment Test (FCAT) Norm Referenced Test (NRT) scores as being integral components. Due to budget considerations, the…
Value-Based Requirements Traceability: Lessons Learned
NASA Astrophysics Data System (ADS)
Egyed, Alexander; Grünbacher, Paul; Heindl, Matthias; Biffl, Stefan
Traceability from requirements to code is mandated by numerous software development standards. These standards, however, are not explicit about the appropriate level of quality of trace links. From a technical perspective, trace quality should meet the needs of the intended trace utilizations. Unfortunately, long-term trace utilizations are typically unknown at the time of trace acquisition which represents a dilemma for many companies. This chapter suggests ways to balance the cost and benefits of requirements traceability. We present data from three case studies demonstrating that trace acquisition requires broad coverage but can tolerate imprecision. With this trade-off our lessons learned suggest a traceability strategy that (1) provides trace links more quickly, (2) refines trace links according to user-defined value considerations, and (3) supports the later refinement of trace links in case the initial value consideration has changed over time. The scope of our work considers the entire life cycle of traceability instead of just the creation of trace links.
Qualitative Secondary Analysis: A Case Exemplar.
Tate, Judith Ann; Happ, Mary Beth
Qualitative secondary analysis (QSA) is the use of qualitative data that was collected by someone else or was collected to answer a different research question. Secondary analysis of qualitative data provides an opportunity to maximize data utility, particularly with difficult-to-reach patient populations. However, qualitative secondary analysis methods require careful consideration and explicit description to best understand, contextualize, and evaluate the research results. In this article, we describe methodologic considerations using a case exemplar to illustrate challenges specific to qualitative secondary analysis and strategies to overcome them. Copyright © 2017 National Association of Pediatric Nurse Practitioners. Published by Elsevier Inc. All rights reserved.
Topology optimization for nonlinear dynamic problems: Considerations for automotive crashworthiness
NASA Astrophysics Data System (ADS)
Kaushik, Anshul; Ramani, Anand
2014-04-01
Crashworthiness of automotive structures is most often engineered after an optimal topology has been arrived at using other design considerations. This study is an attempt to incorporate crashworthiness requirements upfront in the topology synthesis process using a mathematically consistent framework. It proposes the use of equivalent linear systems from the nonlinear dynamic simulation in conjunction with a discrete-material topology optimizer. Velocity and acceleration constraints are consistently incorporated in the optimization set-up. Issues specific to crash problems due to the explicit solution methodology employed, nature of the boundary conditions imposed on the structure, etc. are discussed and possible resolutions are proposed. A demonstration of the methodology on two-dimensional problems that address some of the structural requirements and the types of loading typical of frontal and side impact is provided in order to show that this methodology has the potential for topology synthesis incorporating crashworthiness requirements.
Review and Assessment of JPL's Thermal Margins
NASA Technical Reports Server (NTRS)
Siebes, G.; Kingery, C.; Farguson, C.; White, M.; Blakely, M.; Nunes, J.; Avila, A.; Man, K.; Hoffman, A.; Forgrave, J.
2012-01-01
JPL has captured its experience from over four decades of robotic space exploration into a set of design rules. These rules have gradually changed into explicit requirements and are now formally implemented and verified. Over an extended period of time, the initial understanding of intent and rationale for these rules has faded and rules are now frequently applied without further consideration. In the meantime, mission classes and their associated risk postures have evolved, coupled with resource constraints and growing design diversity, bringing into question the current "one size fits all" thermal margin approach. This paper offers a systematic review of the heat flow path from an electronic junction to the eventual heat rejection to space. This includes the identification of different regimes along this path and the associated requirements. The work resulted in a renewed understanding of the intent behind JPL requirements for hot thermal margins and a framework for relevant considerations, which in turn enables better decision making when a deviation to these requirements is considered.
More than a feeling: Pervasive influences of memory without awareness of retrieval
Voss, Joel L.; Lucas, Heather D.; Paller, Ken A.
2015-01-01
The subjective experiences of recollection and familiarity have featured prominently in the search for neurocognitive mechanisms of memory. However, these two explicit expressions of memory, which involve conscious awareness of memory retrieval, are distinct from an entire category of implicit expressions of memory that do not entail such awareness. This review summarizes recent evidence showing that neurocognitive processing related to implicit memory can powerfully influence the behavioral and neural measures typically associated with explicit memory. Although there are striking distinctions between the neurocognitive processing responsible for implicit versus explicit memory, tests designed to measure only explicit memory nonetheless often capture implicit memory processing as well. In particular, the evidence described here suggests that investigations of familiarity memory are prone to the accidental capture of implicit memory processing. These findings have considerable implications for neurocognitive accounts of memory, as they suggest that many neural and behavioral measures often accepted as signals of explicit memory instead reflect the distinct operation of implicit memory mechanisms that are only sometimes related to explicit memory expressions. Proper identification of the explicit and implicit mechanisms for memory is vital to understanding the normal operation of memory, in addition to the disrupted memory capabilities associated with many neurological disorders and mental illnesses. We suggest that future progress requires utilizing neural, behavioral, and subjective evidence to dissociate implicit and explicit memory processing so as to better understand their distinct mechanisms as well as their potential relationships. When searching for the neurocognitive mechanisms of memory, it is important to keep in mind that memory involves more than a feeling. PMID:24171735
Memory-Efficient Analysis of Dense Functional Connectomes.
Loewe, Kristian; Donohue, Sarah E; Schoenfeld, Mircea A; Kruse, Rudolf; Borgelt, Christian
2016-01-01
The functioning of the human brain relies on the interplay and integration of numerous individual units within a complex network. To identify network configurations characteristic of specific cognitive tasks or mental illnesses, functional connectomes can be constructed based on the assessment of synchronous fMRI activity at separate brain sites, and then analyzed using graph-theoretical concepts. In most previous studies, relatively coarse parcellations of the brain were used to define regions as graphical nodes. Such parcellated connectomes are highly dependent on parcellation quality because regional and functional boundaries need to be relatively consistent for the results to be interpretable. In contrast, dense connectomes are not subject to this limitation, since the parcellation inherent to the data is used to define graphical nodes, also allowing for a more detailed spatial mapping of connectivity patterns. However, dense connectomes are associated with considerable computational demands in terms of both time and memory requirements. The memory required to explicitly store dense connectomes in main memory can render their analysis infeasible, especially when considering high-resolution data or analyses across multiple subjects or conditions. Here, we present an object-based matrix representation that achieves a very low memory footprint by computing matrix elements on demand instead of explicitly storing them. In doing so, memory required for a dense connectome is reduced to the amount needed to store the underlying time series data. Based on theoretical considerations and benchmarks, different matrix object implementations and additional programs (based on available Matlab functions and Matlab-based third-party software) are compared with regard to their computational efficiency. 
The matrix implementation based on on-demand computations has very low memory requirements, thus enabling analyses that would be otherwise infeasible to conduct due to insufficient memory. An open source software package containing the created programs is available for download. PMID:27965565
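A minimal sketch of the on-demand idea described above (not the authors' Matlab code): correlation entries are computed from pre-normalized time series only when accessed, so memory usage stays at the size of the time-series array rather than the full N x N connectome.

```python
# Hedged sketch: an object-based matrix whose entries are computed on demand.
import numpy as np

class OnDemandConnectome:
    def __init__(self, timeseries):
        # timeseries: (n_nodes, n_timepoints) array of fMRI signals.
        ts = np.asarray(timeseries, dtype=float)
        # Normalize once so each matrix entry is a single dot product.
        ts = ts - ts.mean(axis=1, keepdims=True)
        ts /= np.linalg.norm(ts, axis=1, keepdims=True)
        self.z = ts

    def __getitem__(self, idx):
        i, j = idx
        # Pearson correlation of nodes i and j, computed when asked for.
        return float(self.z[i] @ self.z[j])

rng = np.random.default_rng(0)
data = rng.standard_normal((1000, 200))   # 1000 nodes, 200 timepoints
conn = OnDemandConnectome(data)

# Memory: ~1000*200 floats for the series vs ~1000*1000 for an explicit matrix.
print(conn[3, 3], conn[3, 7])
```

Graph-theoretical analyses that stream over entries (degree, thresholded edge counts) can then run without ever materializing the dense matrix.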
NASA Astrophysics Data System (ADS)
Gerck, Ed
We present a new, comprehensive framework to qualitatively improve election outcome trustworthiness, where voting is modeled as an information transfer process. Although voting is deterministic (all ballots are counted), information is treated stochastically using Information Theory. Error considerations, including faults, attacks, and threats by adversaries, are explicitly included. The influence of errors may be corrected to achieve an election outcome error as close to zero as desired (error-free), with a provably optimal design that is applicable to any type of voting, with or without ballots. Sixteen voting system requirements, including functional, performance, environmental and non-functional considerations, are derived and rated, meeting or exceeding current public-election requirements. The voter and the vote are unlinkable (secret ballot) although each is identifiable. The Witness-Voting System (Gerck, 2001) is extended as a conforming implementation of the provably optimal design that is error-free, transparent, simple, scalable, robust, receipt-free, universally-verifiable, 100% voter-verified, and end-to-end audited.
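A hedged illustration of the error-correction claim, not Gerck's actual construction: if an election outcome is attested by several independent records, each wrong with probability p < 1/2, a majority decision drives the outcome error toward zero as records are added, which is the information-theoretic intuition behind "error as close to zero as desired".

```python
# Toy model: outcome error of a majority vote over k independent records.
from math import comb

def majority_error(p, k):
    # Probability that more than half of k independent records are wrong,
    # assuming each record errs independently with probability p.
    return sum(comb(k, m) * p**m * (1 - p)**(k - m)
               for m in range((k // 2) + 1, k + 1))

for k in (1, 3, 9, 21):
    print(k, round(majority_error(0.1, k), 8))
```

The independence assumption is the hard part in practice; correlated faults or coordinated attacks are exactly what the paper's explicit threat considerations address.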
Implicit time accurate simulation of unsteady flow
NASA Astrophysics Data System (ADS)
van Buuren, René; Kuerten, Hans; Geurts, Bernard J.
2001-03-01
Implicit time integration was studied in the context of unsteady shock-boundary layer interaction flow. With an explicit second-order Runge-Kutta scheme, a reference solution was determined for comparison with the implicit second-order Crank-Nicolson scheme. The time step in the explicit scheme is restricted by both temporal accuracy and stability requirements, whereas in the A-stable implicit scheme, the time step has to obey temporal resolution requirements and numerical convergence conditions. The non-linear discrete equations for each time step are solved iteratively by adding a pseudo-time derivative. The quasi-Newton approach is adopted and the linear systems that arise are approximately solved with a symmetric block Gauss-Seidel solver. As a guiding principle for properly setting numerical time integration parameters that yield an efficient time-accurate capturing of the solution, the global error caused by the temporal integration is compared with the error resulting from the spatial discretization. The focus is on the sensitivity of properties of the solution to the time step. Numerical simulations show that the time step needed for acceptable accuracy can be considerably larger than the explicit stability time step; typical ratios range from 20 to 80. At large time steps, convergence problems may occur that are closely related to a highly complex structure of the basins of attraction of the iterative method.
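The stability contrast at the heart of this abstract can be shown on the scalar stiff test problem y' = -lam*y (a sketch, not the shock-boundary layer solver): the explicit RK2 step blows up beyond its stability limit, while the A-stable Crank-Nicolson step stays bounded at the same step size.

```python
# Explicit RK2 vs. implicit Crank-Nicolson on a stiff linear test equation.
import math

lam = 50.0

def rk2_step(y, dt):
    # Explicit second-order Runge-Kutta (midpoint) step.
    k1 = -lam * y
    k2 = -lam * (y + 0.5 * dt * k1)
    return y + dt * k2

def cn_step(y, dt):
    # Implicit Crank-Nicolson step; the problem is linear, so the implicit
    # equation is solved in closed form instead of iteratively.
    return y * (1 - 0.5 * dt * lam) / (1 + 0.5 * dt * lam)

def integrate(step, dt, t_end=1.0):
    y, n = 1.0, round(t_end / dt)
    for _ in range(n):
        y = step(y, dt)
    return y

dt_large = 0.1   # far above the explicit stability limit dt <= 2 / lam
print("explicit RK2:     ", integrate(rk2_step, dt_large))
print("Crank-Nicolson:   ", integrate(cn_step, dt_large))
```

As the abstract notes, being stable is not the same as being accurate: at such large steps the implicit solution is bounded but poorly resolved, which is why the authors compare temporal and spatial error rather than relying on stability alone.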
Schwartz, Jennifer A T; Pearson, Steven D
2013-06-24
Despite increasing concerns regarding the cost of health care, the consideration of costs in the development of clinical guidance documents by physician specialty societies has received little analysis. To evaluate the approach to consideration of cost in publicly available clinical guidance documents and methodological statements produced between 2008 and 2012 by the 30 largest US physician specialty societies. Qualitative document review. Whether costs are considered in clinical guidance development, mechanism of cost consideration, and the way that cost issues were used in support of specific clinical practice recommendations. Methodological statements for clinical guidance documents indicated that 17 of 30 physician societies (57%) explicitly integrated costs, 4 (13%) implicitly considered costs, 3 (10%) intentionally excluded costs, and 6 (20%) made no mention. Of the 17 societies that explicitly integrated costs, 9 (53%) consistently used a formal system in which the strength of recommendation was influenced in part by costs, whereas 8 (47%) were inconsistent in their approach or failed to mention the exact mechanism for considering costs. Among the 138 specific recommendations in these guidance documents that included cost as part of the rationale, the most common form of recommendation (50 [36%]) encouraged the use of a specific medical service because of equal effectiveness and lower cost. Slightly more than half of the largest US physician societies explicitly consider costs in developing their clinical guidance documents; among these, approximately half use an explicit mechanism for integrating costs into the strength of recommendations. Many societies remain vague in their approach. Physician specialty societies should demonstrate greater transparency and rigor in their approach to cost consideration in documents meant to influence care decisions.
Hill, Suzanne R; Olson, Leslie G; Falck-Ytter, Yngve; Cruz, Alvaro A; Atkins, David; Baumann, Michael; Jaeschke, Roman; Woitalla, Thomas; Schünemann, Holger J
2012-12-01
Professional societies, like many other organizations around the world, have recognized the need to use rigorous processes to ensure that health care recommendations are based on the best available research evidence. This is the sixth of a series of 14 articles prepared to advise guideline developers for respiratory and other diseases on how to achieve this goal. In this article, we focused on integrating cost and resource information in guideline development and formulating recommendations focusing on four key questions. We addressed the following specific questions. (1) When is it important to incorporate costs, and/or resource implications, and/or cost-effectiveness, and/or affordability considerations in guidelines? (2) Which costs and which resource use should be considered in guidelines? (3) What sources of evidence should be used to estimate costs, resource use, and cost-effectiveness? (4) How can cost-effectiveness, resource implications, and affordability be taken into account explicitly? Our work was based on a prior review on this topic and our conclusions are based on available evidence, consideration of what guideline developers are doing, and workshop discussions. Many authorities suggest that there is a need to include explicit consideration of costs, resource use, and affordability during guideline development. Where drug use is at issue, "explicit consideration" may need to involve only noting whether the price (easily determined and usually the main component of "acquisition cost") of a drug is high or low. Complex interventions such as rehabilitation services are to a greater degree setting- and system-dependent. Resources used, and the costs of those resources, will vary among systems, and formal identification by a guideline group of the resource requirements of a complex intervention is essential. A clinical guideline usually contains multiple recommendations, and in some cases there are hundreds. 
Defining costs and resource use for all of them-especially for multiple settings-is unlikely to be feasible. At present, disaggregated resource utilization accompanied by some cost information seems to be the most promising approach. The method for assigning values to costs, including external or indirect cost (such as time off work), can have a significant impact on the outcome of any economic evaluation. The perspective that the guideline assumes should be made explicit. Standards for evidence for clinical data are usually good-quality trials reporting a relevant endpoint that should be summarized in a systematic review. Like others, we are therefore proposing that the ideal sources of evidence for cost and resource utilization data for guideline development are systematic reviews of randomized controlled trials that report resource utilization, with direct comparisons between the interventions of interest.
Are History Textbooks More "Considerate" after 20 Years?
ERIC Educational Resources Information Center
Berkeley, Sheri; King-Sears, Margaret E.; Hott, Brittany L.; Bradley-Black, Katherine
2014-01-01
Features of eighth-grade history textbooks were examined through replication of a 20-year-old study that investigated "considerateness" of textbooks. Considerate texts provide clear, coherent information and include features that promote students' comprehension, such as explicit use of organizational structures, a range of question types…
What Does It Take for an Infant to Learn How to Use a Tool by Observation?
Fagard, Jacqueline; Rat-Fischer, Lauriane; Esseily, Rana; Somogyi, Eszter; O’Regan, J. K.
2016-01-01
Observational learning is probably one of the most powerful factors determining progress during child development. When learning a new skill, infants rely on their own exploration; but they also frequently benefit from an adult’s verbal support or from demonstration by an adult modeling the action. At what age and under what conditions does adult demonstration really help the infant to learn a novel behavior? In this review, we summarize recently published work we have conducted on the acquisition of tool use during the second year of life. In particular, we consider under what conditions and to what extent seeing a demonstration from an adult advances an infant’s understanding of how to use a tool to obtain an out-of-reach object. Our results show that classic demonstration starts being helpful at 18 months of age. When adults explicitly show their intention prior to demonstration, even 16-month-old infants learn from the demonstration. On the other hand, providing an explicit demonstration (“look at how I do it”) is not very useful before infants are ready to succeed by themselves anyway. In contrast, repeated observations of the required action in a social context, without explicit reference to this action, considerably advances the age of success and the usefulness of providing a demonstration. We also show that the effect of demonstration can be enhanced if the demonstration makes the baby laugh. Taken together, the results from this series of studies on observational learning of tool use in infants suggest, first, that when observing a demonstration, infants do not know what to pay attention to: demonstration must be accompanied by rich social cues to be effective; second, infants’ attention is inhibited rather than enhanced by an explicit demand of “look at what I do”; and finally a humorous situation considerably helps infants understand the demonstration. PMID:26973565
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wieder, William R.; Allison, Steven D.; Davidson, Eric A.
Microbes influence soil organic matter (SOM) decomposition and the long-term stabilization of carbon (C) in soils. We contend that by revising the representation of microbial processes and their interactions with the physicochemical soil environment, Earth system models (ESMs) may make more realistic global C cycle projections. Explicit representation of microbial processes presents considerable challenges due to the scale at which these processes occur. Thus, applying microbial theory in ESMs requires a framework to link micro-scale process-level understanding and measurements to macro-scale models used to make decadal- to century-long projections. Here, we review the diversity, advantages, and pitfalls of simulating soil biogeochemical cycles using microbial-explicit modeling approaches. We present a roadmap for how to begin building, applying, and evaluating reliable microbial-explicit model formulations that can be applied in ESMs. Drawing from experience with traditional decomposition models we suggest: (1) guidelines for common model parameters and output that can facilitate future model intercomparisons; (2) development of benchmarking and model-data integration frameworks that can be used to effectively guide, inform, and evaluate model parameterizations with data from well-curated repositories; and (3) the application of scaling methods to integrate microbial-explicit soil biogeochemistry modules within ESMs. With contributions across scientific disciplines, we feel this roadmap can advance our fundamental understanding of soil biogeochemical dynamics and more realistically project likely soil C response to environmental change at global scales.
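A hedged illustration of what "microbial-explicit" means in a decomposition model: SOM loss depends on microbial biomass through Michaelis-Menten kinetics rather than on a fixed first-order turnover constant. The model form is a generic two-pool sketch and every parameter value below is invented, not taken from any calibrated ESM.

```python
# Minimal microbial-explicit decomposition model, forward-Euler time stepping.

def step(C_som, C_mic, dt=0.1,
         v_max=0.6, K_m=50.0, cue=0.4, death=0.02, inputs=1.0):
    # Microbially mediated decomposition flux (Michaelis-Menten in substrate):
    # the flux scales with living biomass, unlike first-order k*C_som decay.
    decomp = v_max * C_mic * C_som / (K_m + C_som)
    dC_som = inputs + death * C_mic - decomp     # litter in, necromass recycled
    dC_mic = cue * decomp - death * C_mic        # growth at carbon-use efficiency
    return C_som + dt * dC_som, C_mic + dt * dC_mic

C_som, C_mic = 100.0, 2.0
for _ in range(50000):            # integrate toward steady state
    C_som, C_mic = step(C_som, C_mic)

print(f"steady state: SOM ~ {C_som:.2f}, microbial biomass ~ {C_mic:.2f}")
```

In this formulation the steady-state SOM stock is set by microbial traits (v_max, K_m, carbon-use efficiency) rather than by inputs alone, which is one of the behavioral differences that makes benchmarking microbial-explicit modules against data so important.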
NASA Astrophysics Data System (ADS)
Vaidya, Bhargav; Prasad, Deovrat; Mignone, Andrea; Sharma, Prateek; Rickler, Luca
2017-12-01
An important ingredient in numerical modelling of high-temperature magnetized astrophysical plasmas is the anisotropic transport of heat along magnetic field lines from higher to lower temperatures. Magnetohydrodynamics typically involves solving the hyperbolic set of conservation equations along with the induction equation. Incorporating anisotropic thermal conduction requires also treating parabolic terms arising from the diffusion operator. An explicit treatment of parabolic terms will considerably reduce the simulation time step due to its dependence on the square of the grid resolution (Δx) for stability. Although an implicit scheme relaxes the constraint on stability, it is difficult to distribute efficiently on a parallel architecture. Treating parabolic terms with accelerated super-time-stepping (STS) methods has been discussed in the literature, but these methods suffer from poor accuracy (first order in time) and also have difficult-to-choose tuneable stability parameters. In this work, we highlight a second-order (in time) Runge-Kutta-Legendre (RKL) scheme (first described by Meyer, Balsara & Aslam 2012) that is robust, fast and accurate in treating parabolic terms alongside the hyperbolic conservation laws. We demonstrate its superiority over the first-order STS schemes with standard tests and astrophysical applications. We also show that explicit conduction is particularly robust in handling saturated thermal conduction. Parallel scaling of explicit conduction using the RKL scheme is demonstrated up to more than 10^4 processors.
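The Δx² constraint the abstract mentions can be demonstrated on plain 1D diffusion, u_t = D u_xx (a sketch of the motivation, not the paper's RKL implementation): the explicit FTCS update is stable only for dt <= dx²/(2D), so each grid refinement roughly quadruples the step count, which is exactly the cost that STS/RKL schemes are designed to avoid.

```python
# Explicit (FTCS) diffusion: step count scales as 1/dx^2.
import numpy as np

def diffuse(u, D, dx, t_end):
    dt = 0.4 * dx * dx / D                  # safely below the dx^2/(2D) limit
    steps = int(np.ceil(t_end / dt))
    for _ in range(steps):
        u[1:-1] += (D * dt / dx**2) * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return u, steps

D, t_end = 1.0, 0.01
results = {}
for n in (64, 128):
    x = np.linspace(0.0, 1.0, n + 1)
    u0 = np.sin(np.pi * x)                  # decays as exp(-pi^2 * D * t)
    u, steps = diffuse(u0.copy(), D, 1.0 / n, t_end)
    err = np.max(np.abs(u - np.exp(-np.pi**2 * D * t_end) * u0))
    results[n] = (steps, err)
    print(f"n={n}: {steps} steps, max error {err:.1e}")
```

Doubling the resolution from 64 to 128 cells roughly quadruples the number of explicit steps for the same physical time, while an RKL scheme would need only on the order of the square root of that many stages.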
Not explicit but implicit memory is influenced by individual perception style
Hine, Kyoko; Tsushima, Yoshiaki
2018-01-01
Not only explicit but also implicit memory has considerable influence on our daily life. However, it is still unclear whether explicit and implicit memories are sensitive to individual differences. Here, we investigated how individual perception style (global or local) correlates with implicit and explicit memory. As a result, we found that not explicit but implicit memory was affected by the perception style: local perception style people more greatly used implicit memory than global perception style people. These results help us to make the new effective application adapting to individual perception style and understand some clinical symptoms such as autistic spectrum disorder. Furthermore, this finding might give us new insight of memory involving consciousness and unconsciousness as well as relationship between implicit/explicit memory and individual perception style. PMID:29370212
Economic evaluation of targeted cancer interventions: critical review and recommendations.
Elkin, Elena B; Marshall, Deborah A; Kulin, Nathalie A; Ferrusi, Ilia L; Hassett, Michael J; Ladabaum, Uri; Phillips, Kathryn A
2011-10-01
Scientific advances have improved our ability to target cancer interventions to individuals who will benefit most and spare the risks and costs to those who will derive little benefit or even be harmed. Several approaches are currently used for targeting interventions for cancer risk reduction, screening, and treatment, including risk prediction algorithms for identifying high-risk subgroups and diagnostic tests for tumor markers and germline genetic mutations. Economic evaluation can inform decisions about the use of targeted interventions, which may be more costly than traditional strategies. However, assessing the impact of a targeted intervention on costs and health outcomes requires explicit consideration of the method of targeting. In this study, we describe the importance of this principle by reviewing published cost-effectiveness analyses of targeted interventions in breast cancer. Few studies we identified explicitly evaluated the relationships among the method of targeting, the accuracy of the targeting test, and outcomes of the targeted intervention. Those that did found that characteristics of targeting tests had a substantial impact on outcomes. We posit that the method of targeting and the outcomes of a targeted intervention are inextricably linked and recommend that cost-effectiveness analyses of targeted interventions explicitly consider costs and outcomes of the method of targeting.
ERIC Educational Resources Information Center
Mason, Andrew J.; Bertram, Charles A.
2018-01-01
When considering performing an Introductory Physics for Life Sciences course transformation for one's own institution, life science majors' achievement goals are a necessary consideration to ensure the pedagogical transformation will be effective. However, achievement goals are rarely an explicit consideration in physics education research topics…
Experiments with explicit filtering for LES using a finite-difference method
NASA Technical Reports Server (NTRS)
Lund, T. S.; Kaltenbach, H. J.
1995-01-01
The equations for large-eddy simulation (LES) are derived formally by applying a spatial filter to the Navier-Stokes equations. The filter width as well as the details of the filter shape are free parameters in LES, and these can be used both to control the effective resolution of the simulation and to establish the relative importance of different portions of the resolved spectrum. An analogous, but less well justified, approach to filtering is more or less universally used in conjunction with LES using finite-difference methods. In this approach, the finite support provided by the computational mesh as well as the wavenumber-dependent truncation errors associated with the finite-difference operators are assumed to define the filter operation. This approach has the advantage that it is 'automatic' in the sense that no explicit filtering operations need to be performed. While it is certainly convenient to avoid the explicit filtering operation, there are some practical considerations associated with finite-difference methods that favor the use of an explicit filter. Foremost among these considerations is the issue of truncation error. All finite-difference approximations have an associated truncation error that increases with increasing wavenumber. These errors can be quite severe for the smallest resolved scales, and they will interfere with the dynamics of the small eddies if no corrective action is taken. Years of experience at CTR with a second-order finite-difference scheme for high Reynolds number LES have repeatedly indicated that truncation errors must be minimized in order to obtain acceptable simulation results. While the potential advantages of explicit filtering are rather clear, there is a significant cost associated with its implementation. In particular, explicit filtering reduces the effective resolution of the simulation compared with that afforded by the mesh.
The resolution requirements for LES are usually set by the need to capture most of the energy-containing eddies, and if explicit filtering is used, the mesh must be enlarged so that these motions are passed by the filter. Given the high cost of explicit filtering, the following interesting question arises. Since the mesh must be expanded in order to perform the explicit filter, might it be better to take advantage of the increased resolution and simply perform an unfiltered simulation on the larger mesh? The cost of the two approaches is roughly the same, but the philosophy is rather different. In the filtered simulation, resolution is sacrificed in order to minimize the various forms of numerical error. In the unfiltered simulation, the errors are left intact, but they are concentrated at very small scales that could be dynamically unimportant from an LES perspective. Very little is known about this tradeoff, and the objective of this work is to study it in high Reynolds number channel flow simulations using a second-order finite-difference method.
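The growth of finite-difference truncation error with wavenumber, which motivates explicit filtering in this abstract, can be sketched with a standard modified-wavenumber analysis. The second-order central difference and unit grid spacing below are illustrative assumptions, not parameters taken from the paper:

```python
import numpy as np

# Modified-wavenumber analysis of the second-order central difference:
# d/dx acts on exp(i*k*x) as i*k'(k) with k'(k) = sin(k*h)/h, rather than
# the exact i*k. The relative error grows toward the grid cutoff k = pi/h,
# which is exactly the truncation error at the smallest resolved scales
# that the abstract says must be minimized or masked by an explicit filter.
h = 1.0                                  # grid spacing (illustrative)
k = np.linspace(0.01, np.pi, 200) / h    # resolved wavenumbers up to the cutoff
k_mod = np.sin(k * h) / h                # modified wavenumber of the FD operator
rel_err = np.abs(k_mod - k) / k

low = rel_err[0]      # error at well-resolved scales: tiny
high = rel_err[-1]    # error at the grid cutoff: order one
```

For a spectral method the modified wavenumber equals k exactly, so the gap between `k_mod` and `k` is the finite-difference-specific error that an explicit filter with a widened mesh is designed to keep away from the energy-containing eddies.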
Implicit and explicit processing of emotional facial expressions in Parkinson's disease.
Wagenbreth, Caroline; Wattenberg, Lena; Heinze, Hans-Jochen; Zaehle, Tino
2016-04-15
Besides motor problems, Parkinson's disease (PD) is associated with detrimental emotional and cognitive functioning. Deficient explicit emotional processing has been observed, whilst patients also show impaired Theory of Mind (ToM) abilities. However, it is unclear whether this ToM deficit in PD patients is based on an inability to infer others' emotional states or whether it is due to explicit emotional processing deficits. We investigated implicit and explicit emotional processing in PD with an affective priming paradigm in which we used pictures of human eyes as emotional primes and a lexical decision task (LDT) with emotionally connoted words as target stimuli. First, sixteen PD patients and sixteen matched healthy controls performed an LDT combined with an emotional priming paradigm providing emotional information through the facial eye region, to assess implicit emotional processing. Second, participants explicitly evaluated the emotional status of the eyes and words used in the implicit task. Compared to controls, implicit emotional processing abilities were generally preserved in PD, although with considerable alterations in happiness and disgust processing. Furthermore, we observed a general impairment of patients in the explicit evaluation of emotional stimuli, which was augmented for the rating of facial expressions. This is the first study reporting results for affective priming with facial eye expressions in PD patients. Our findings indicate largely preserved implicit emotional processing, with specifically altered processing of disgust and happiness. Explicit emotional processing was considerably impaired for semantic and especially for facial stimulus material. Poor ToM abilities in PD patients might be based on deficient explicit emotional processing, with a preserved ability to implicitly infer other people's feelings. Copyright © 2016 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Waring, Hansun Zhang
2013-01-01
Despite the push for fostering reflective practices in teacher education in the last 20 years, true reflection remains rare (Farr, 2011). Based on a detailed analysis of four mentor-teacher meetings in a graduate TESOL program, I show how specific mentor practices generate teacher reflection without explicit solicitations. Findings of this study…
Quantum Monte Carlo studies of solvated systems
NASA Astrophysics Data System (ADS)
Schwarz, Kathleen; Letchworth Weaver, Kendra; Arias, T. A.; Hennig, Richard G.
2011-03-01
Solvation qualitatively alters the energetics of diverse processes from protein folding to reactions on catalytic surfaces. An explicit description of the solvent in quantum-mechanical calculations requires both a large number of electrons and exploration of a large number of configurations in the phase space of the solvent. These problems can be circumvented by including the effects of solvent through a rigorous classical density-functional description of the liquid environment, thereby yielding free energies and thermodynamic averages directly, while eliminating the need for explicit consideration of the solvent electrons. We have implemented and tested this approach within the CASINO Quantum Monte Carlo code. Our method is suitable for calculations in any basis within CASINO, including b-spline and plane wave trial wavefunctions, and is equally applicable to molecules, surfaces, and crystals. For our preliminary test calculations, we use a simplified description of the solvent in terms of an isodensity continuum dielectric solvation approach, though the method is fully compatible with more reliable descriptions of the solvent we shall employ in the future.
Structural kinetic modeling of metabolic networks.
Steuer, Ralf; Gross, Thilo; Selbig, Joachim; Blasius, Bernd
2006-08-08
To develop and investigate detailed mathematical models of metabolic processes is one of the primary challenges in systems biology. However, despite considerable advances in the topological analysis of metabolic networks, kinetic modeling is still often severely hampered by inadequate knowledge of the enzyme-kinetic rate laws and their associated parameter values. Here we propose a method that aims to give a quantitative account of the dynamical capabilities of a metabolic system, without requiring any explicit information about the functional form of the rate equations. Our approach is based on constructing a local linear model at each point in parameter space, such that each element of the model is either directly experimentally accessible or amenable to a straightforward biochemical interpretation. This ensemble of local linear models, encompassing all possible explicit kinetic models, then allows for a statistical exploration of the comprehensive parameter space. The method is exemplified on two paradigmatic metabolic systems: the glycolytic pathway of yeast and a realistic-scale representation of the photosynthetic Calvin cycle.
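The "ensemble of local linear models" idea can be sketched numerically: build the Jacobian at a steady state from the stoichiometry and sampled normalized saturation (elasticity) parameters, then check stability across the ensemble. The two-metabolite linear pathway and parameter ranges below are illustrative assumptions, not the yeast or Calvin-cycle models from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 2-metabolite chain: v1 -> S1 -> v2 -> S2 -> v3.
# The local linear model at the steady state is J = N @ theta, where N is
# the stoichiometric matrix and theta holds the sampled normalized
# elasticities d v_j / d S_i, each in a bounded interval, with no rate-law
# functional form ever specified.
N = np.array([[1.0, -1.0, 0.0],
              [0.0, 1.0, -1.0]])      # metabolites x reactions
trials, stable = 1000, 0
for _ in range(trials):
    theta = rng.uniform(0.05, 1.0, size=(3, 2))
    theta[0, :] = 0.0   # v1 is a constant input flux
    theta[1, 1] = 0.0   # v2 depends only on S1
    theta[2, 0] = 0.0   # v3 depends only on S2
    J = N @ theta       # one member of the ensemble of local linear models
    if np.all(np.linalg.eigvals(J).real < 0):
        stable += 1
```

For this simple unregulated chain every sampled Jacobian is stable; the paper's point is that in larger networks with feedback this need not hold, and the fraction of stable ensemble members becomes a statistically explorable quantity.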
Explicit accounting of electronic effects on the Hugoniot of porous materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nayak, Bishnupriya; Menon, S. V. G., E-mail: menon.svg98@gmail.com
2016-03-28
A generalized enthalpy-based equation of state, which includes thermal electron excitations explicitly, is formulated from simple considerations. Its application to obtain the Hugoniot of materials requires simultaneous evaluation of the pressure-volume curve and temperature, the latter requiring solution of a differential equation. The errors involved in two recent papers [Huayun et al., J. Appl. Phys. 92, 5917 (2002); 92, 5924 (2002)], which employed this approach, are brought out and discussed. In addition to developing the correct set of equations, the present work also provides a numerical method to implement this approach. The constant-pressure specific heats of ions and electrons and the ionic enthalpy parameter, needed for applications, are calculated using a three-component equation of state. The method is applied to porous Cu with different initial porosities. Comparison of results with experimental data shows good agreement. It is found that temperatures along the Hugoniot of porous materials are significantly modified due to electronic effects.
Peristalsis of nonconstant viscosity Jeffrey fluid with nanoparticles
NASA Astrophysics Data System (ADS)
Alvi, N.; Latif, T.; Hussain, Q.; Asghar, S.
Mixed convective peristaltic activity of variable-viscosity nanofluids is addressed. Unlike the conventional assumption of constant viscosity, the viscosity is taken as temperature dependent. Constitutive relations for a linear viscoelastic Jeffrey fluid are employed, and a uniform magnetic field is applied in the transverse direction. For nanofluids, the formulation accounts for Brownian motion, thermophoresis, viscous dissipation, and Joule heating. Treating the viscosity as temperature dependent is not a modeling choice but a realistic requirement imposed by the wall temperature and the heat generated by viscous dissipation. The well-established large-wavelength and small-Reynolds-number approximations are invoked. The nonlinear coupled system is solved analytically for convergent series solutions, with the interval of convergence identified explicitly. A comparative study between the analytical and numerical solutions is made for verification. The influence of the parameters describing the problem is pointed out and its physics explained.
An ordinal classification approach for CTG categorization.
Georgoulas, George; Karvelis, Petros; Gavrilis, Dimitris; Stylios, Chrysostomos D; Nikolakopoulos, George
2017-07-01
Evaluation of the cardiotocogram (CTG) is a standard approach employed during pregnancy and delivery. However, its interpretation requires high-level expertise to decide whether a recording is Normal, Suspicious, or Pathological. Therefore, a number of attempts have been made over the past three decades to develop sophisticated automated systems. These are usually (multiclass) classification systems that assign a category to the respective CTG. However, most of these systems do not take into consideration the natural ordering of the categories associated with CTG recordings. In this work, an algorithm that explicitly takes into consideration the ordering of CTG categories, based on a binary decomposition method, is investigated. Results achieved using the C4.5 decision tree as the base classifier show that the ordinal classification approach is marginally better than the traditional multiclass classification approach, which utilizes the standard C4.5 algorithm, for several performance criteria.
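The binary decomposition over ordered labels can be sketched as follows: for K ordered classes, train K−1 binary problems of the form "is the class greater than k?" and combine their votes. The one-dimensional feature and the threshold "stump" standing in for C4.5 are illustrative assumptions, not the paper's CTG features or classifier:

```python
import numpy as np

# Ordinal decomposition for ordered classes 0 < 1 < 2
# (e.g. Normal < Suspicious < Pathological): one binary problem
# "y > k?" per boundary, here solved by a trivial threshold stump.

def fit_stump(x, y_bin):
    """Pick the threshold t minimizing training error of 'x > t -> 1'."""
    best_t, best_err = None, np.inf
    for t in np.unique(x):
        err = np.mean((x > t).astype(int) != y_bin)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

def predict_ordinal(x_new, thresholds):
    """Predicted class = number of 'greater-than' boundaries passed."""
    return sum(int(x_new > t) for t in thresholds)

# Toy 1-D training data with three ordered classes (illustrative).
x = np.array([1.0, 2.0, 3.0, 6.0, 7.0, 8.0, 11.0, 12.0, 13.0])
y = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])
thresholds = [fit_stump(x, (y > k).astype(int)) for k in (0, 1)]
pred = predict_ordinal(9.5, thresholds)
```

Unlike a flat multiclass scheme, each binary problem here respects the ordering, so an error tends to land in an adjacent category rather than an arbitrary one, which is the motivation the abstract describes.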
Knowledge translation within a population health study: how do you do it?
2013-01-01
Background: Despite the considerable and growing body of knowledge translation (KT) literature, there are few methodologies sufficiently detailed to guide an integrated KT research approach for a population health study. This paper argues for a clearly articulated collaborative KT approach to be embedded within the research design from the outset. Discussion: Population health studies are complex in their own right, and strategies to engage the local community in adopting new interventions are often fraught with considerable challenges. In order to maximise the impact of population health research, more explicit KT strategies need to be developed from the outset. We present four propositions, arising from our work in developing a KT framework for a population health study. These cover the need for an explicit theory-informed conceptual framework; formalizing collaborative approaches within the design; making explicit the roles of both the stakeholders and the researchers; and clarifying what counts as evidence. From our deliberations on these propositions, our own co-creating (co-KT) Framework emerged in which KT is defined as both a theoretical and practical framework for actioning the intent of researchers and communities to co-create, refine, implement and evaluate the impact of new knowledge that is sensitive to the context (values, norms and tacit knowledge) where it is generated and used. The co-KT Framework has five steps. These include initial contact and framing the issue; refining and testing knowledge; interpreting, contextualising and adapting knowledge to the local context; implementing and evaluating; and finally, the embedding and translating of new knowledge into practice. Summary: Although descriptions of how to incorporate KT into research designs are increasing, current theoretical and operational frameworks do not generally span a holistic process from knowledge co-creation to knowledge application and implementation within one project.
Population health studies may have greater health impact when KT is incorporated early and explicitly into the research design. This, we argue, will require that particular attention be paid to collaborative approaches, stakeholder identification and engagement, the nature and sources of evidence used, and the role of the research team working with the local study community. PMID:23694753
Large numbers hypothesis. II - Electromagnetic radiation
NASA Technical Reports Server (NTRS)
Adams, P. J.
1983-01-01
This paper develops the theory of electromagnetic radiation in the units covariant formalism incorporating Dirac's large numbers hypothesis (LNH). A direct field-to-particle technique is used to obtain the photon propagation equation which explicitly involves the photon replication rate. This replication rate is fixed uniquely by requiring that the form of a free-photon distribution function be preserved, as required by the 2.7 K cosmic radiation. One finds that with this particular photon replication rate the units covariant formalism developed in Paper I actually predicts that the ratio of photon number to proton number in the universe varies as t^(1/4), precisely in accord with LNH. The cosmological red-shift law is also derived and is shown to differ considerably from the standard form νR = const.
NASA Technical Reports Server (NTRS)
Maddalon, J. M.; Hayhurst, K. J.; Neogi, N. A.; Verstynen, H. A.; Clothier, R. A.
2016-01-01
One of the key challenges to the development of a commercial Unmanned Aircraft System (UAS) market is the lack of explicit consideration of UAS in the current regulatory framework. Despite recent progress, additional steps are needed to enable broad UAS types and operational models. This paper discusses recent research that examines how a risk-based approach for safety might change the process and substance of airworthiness requirements for UAS. The project proposed risk-centric airworthiness requirements for a midsize unmanned rotorcraft used for agricultural spraying and also identified factors that may contribute to distinguishing safety risk among different UAS types and operational concepts. Lessons learned regarding how a risk-based approach can expand the envelope of UAS certification are discussed.
What do we know about implicit false-belief tracking?
Schneider, Dana; Slaughter, Virginia P; Dux, Paul E
2015-02-01
There is now considerable evidence that neurotypical individuals track the internal cognitions of others, even in the absence of instructions to do so. This finding has prompted the suggestion that humans possess an implicit mental state tracking system (implicit Theory of Mind, ToM) that exists alongside a system that allows the deliberate and explicit analysis of the mental states of others (explicit ToM). Here we evaluate the evidence for this hypothesis and assess the extent to which implicit and explicit ToM operations are distinct. We review evidence showing that adults can indeed engage in ToM processing even without being conscious of doing so. However, at the same time, there is evidence that explicit and implicit ToM operations share some functional features, including drawing on executive resources. Based on the available evidence, we propose that implicit and explicit ToM operations overlap and should only be considered partially distinct.
Adarkwah, Charles Christian; Sadoghi, Amirhossein; Gandjour, Afschin
2016-02-01
There has been a debate on whether cost-effectiveness analysis should consider the cost of consumption and leisure time activities when using the quality-adjusted life year as a measure of health outcome under a societal perspective. The purpose of this study was to investigate whether the effects of ill health on consumptive activities are spontaneously considered in a health state valuation exercise and how much this matters. The survey enrolled patients with inflammatory bowel disease in Germany (n = 104). Patients were randomized to explicit and no explicit instruction for the consideration of consumption and leisure effects in a time trade-off (TTO) exercise. Explicit instruction to consider non-health-related utility in TTO exercises did not influence TTO scores. However, spontaneous consideration of non-health-related utility in patients without explicit instruction (60% of respondents) led to significantly lower TTO scores. Results suggest an inclusion of consumption costs in the numerator of the cost-effectiveness ratio, at least for those respondents who spontaneously consider non-health-related utility from treatment. Results also suggest that exercises eliciting health valuations from the general public may include a description of the impact of disease on consumptive activities. Copyright © 2015 John Wiley & Sons, Ltd.
Putting Public Health Ethics into Practice: A Systematic Framework
Marckmann, Georg; Schmidt, Harald; Sofaer, Neema; Strech, Daniel
2015-01-01
It is widely acknowledged that public health practice raises ethical issues that require a different approach than traditional biomedical ethics. Several frameworks for public health ethics (PHE) have been proposed; however, none of them provides a practice-oriented combination of the two necessary components: (1) a set of normative criteria based on an explicit ethical justification and (2) a structured methodological approach for applying the resulting normative criteria to concrete public health (PH) issues. Building on prior work in the field and integrating valuable elements of other approaches to PHE, we present a systematic ethical framework that shall guide professionals in planning, conducting, and evaluating PH interventions. Based on a coherentist model of ethical justification, the proposed framework contains (1) an explicit normative foundation with five substantive criteria and seven procedural conditions to guarantee a fair decision process, and (2) a six-step methodological approach for applying the criteria and conditions to the practice of PH and health policy. The framework explicitly ties together ethical analysis and empirical evidence, thus striving for evidence-based PHE. It can provide normative guidance to those who analyze the ethical implications of PH practice including academic ethicists, health policy makers, health technology assessment bodies, and PH professionals. It will enable those who implement a PH intervention and those affected by it (i.e., the target population) to critically assess whether and how the required ethical considerations have been taken into account. Thereby, the framework can contribute to assuring the quality of ethical analysis in PH. Whether the presented framework will be able to achieve its goals has to be determined by evaluating its practical application. PMID:25705615
Microbial contributions to climate change through carbon cycle feedbacks.
Bardgett, Richard D; Freeman, Chris; Ostle, Nicholas J
2008-08-01
There is considerable interest in understanding the biological mechanisms that regulate carbon exchanges between the land and atmosphere, and how these exchanges respond to climate change. An understanding of soil microbial ecology is central to our ability to assess terrestrial carbon cycle-climate feedbacks, but the complexity of the soil microbial community and the many ways that it can be affected by climate and other global changes hampers our ability to draw firm conclusions on this topic. In this paper, we argue that to understand the potential negative and positive contributions of soil microbes to land-atmosphere carbon exchange and global warming requires explicit consideration of both direct and indirect impacts of climate change on microorganisms. Moreover, we argue that this requires consideration of complex interactions and feedbacks that occur between microbes, plants and their physical environment in the context of climate change, and the influence of other global changes which have the capacity to amplify climate-driven effects on soil microbes. Overall, we emphasize the urgent need for greater understanding of how soil microbial ecology contributes to land-atmosphere carbon exchange in the context of climate change, and identify some challenges for the future. In particular, we highlight the need for a multifactor experimental approach to understand how soil microbes and their activities respond to climate change and consequences for carbon cycle feedbacks.
Sample size requirements for the design of reliability studies: precision consideration.
Shieh, Gwowen
2014-09-01
In multilevel modeling, the intraclass correlation coefficient based on the one-way random-effects model is routinely employed to measure the reliability or degree of resemblance among group members. To facilitate the advocated practice of reporting confidence intervals in future reliability studies, this article presents exact sample size procedures for precise interval estimation of the intraclass correlation coefficient under various allocation and cost structures. Although the suggested approaches do not admit explicit sample size formulas and require special algorithms for carrying out iterative computations, they are more accurate than the closed-form formulas constructed from large-sample approximations with respect to the expected width and assurance probability criteria. This investigation notes the deficiency of existing methods and expands the sample size methodology for the design of reliability studies that have not previously been discussed in the literature.
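The intraclass correlation coefficient that these sample-size procedures target is computed from one-way ANOVA mean squares. The toy balanced design below is an illustrative assumption; the paper's exact interval procedures additionally involve F-distribution quantiles and iterative computation:

```python
import numpy as np

# One-way random-effects ICC(1) from ANOVA mean squares:
#   ICC = (MSB - MSW) / (MSB + (k - 1) * MSW)
# on a toy balanced design of n = 3 groups with k = 3 members each.
data = np.array([[1.0, 2.0, 3.0],
                 [4.0, 5.0, 6.0],
                 [7.0, 8.0, 9.0]])     # rows = groups (illustrative values)
n, k = data.shape                       # groups, members per group
grand = data.mean()
group_means = data.mean(axis=1)
msb = k * np.sum((group_means - grand) ** 2) / (n - 1)           # between groups
msw = np.sum((data - group_means[:, None]) ** 2) / (n * (k - 1)) # within groups
icc = (msb - msw) / (msb + (k - 1) * msw)
```

Here the groups are far apart relative to their internal spread, so the ICC is close to 1; a sample-size procedure of the kind the abstract describes asks how many groups and members per group are needed before a confidence interval around such an estimate attains a prescribed width.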
Principles and Benefits of Explicitly Designed Medical Device Safety Architecture.
Larson, Brian R; Jones, Paul; Zhang, Yi; Hatcliff, John
The complexity of medical devices and the processes by which they are developed pose considerable challenges to producing safe designs and regulatory submissions that are amenable to effective reviews. Designing an appropriate and clearly documented architecture can be an important step in addressing this complexity. Best practices in medical device design embrace the notion of a safety architecture organized around distinct operation and safety requirements. By explicitly separating many safety-related monitoring and mitigation functions from operational functionality, the aspects of a device most critical to safety can be localized into a smaller and simpler safety subsystem, thereby enabling easier verification and more effective reviews of claims that causes of hazardous situations are detected and handled properly. This article defines medical device safety architecture, describes its purpose and philosophy, and provides an example. Although many of the presented concepts may be familiar to those with experience in realization of safety-critical systems, this article aims to distill the essence of the approach and provide practical guidance that can potentially improve the quality of device designs and regulatory submissions.
O'Connor, Nick; Paton, Michael
2008-04-01
A framework developed to promote the understanding and application of clinical governance principles in an area mental health service is described. The framework is operationalized through systems, processes, roles and responsibilities. The development of an explicit and operationalizable framework for clinical governance arose from the authors' experiences in leading and managing mental health services. There is a particular emphasis on improvement of quality of care and patient safety. The framework is informed by recent developments in thinking about clinical governance, including key documents from Australia and the United Kingdom. The operational nature of the framework allows for key components of clinical governance to be described explicitly, communicated effectively, and continually tested and improved. Further consideration and assessment of the value of differing approaches to this task are required. For example, a general, illustrative approach to raise clinician awareness can be contrasted with prescriptive and specified approaches which progressively encompass the many functions and processes of a mental health service. Mental health clinicians and managers can be guided by a framework that will ensure safe, high quality and continually improving processes of care.
Neutron coincidence measurements when nuclear parameters vary during the multiplication process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Ming-Shih; Teichmann, T.
1995-07-01
In a recent paper, a physical/mathematical model was developed for neutron coincidence counting, taking explicit account of neutron absorption and leakage, and using dual probability generating functions to derive explicit formulae for the single and multiple count-rates in terms of the physical parameters of the system. The results of this modeling proved very successful in a number of cases in which the system parameters (neutron reaction cross-sections, detection probabilities, etc.) remained the same at the various stages of the process (i.e. from collision to collision). However, there are practical circumstances in which such system parameters change from collision to collision, and it is necessary to accommodate these, too, in a general theory applicable to such situations. For instance, in the case of the neutron coincidence collar (NCC), the parameters for the initial, spontaneous fission neutrons are not the same as those for the succeeding induced fission neutrons, and similar situations can be envisaged for certain other experimental configurations. The present document shows how the previous considerations can be elaborated to embrace these more general requirements.
Making the Implicit Explicit: Towards an Assurance Case for DO-178C
NASA Technical Reports Server (NTRS)
Holloway, C. Michael
2013-01-01
For about two decades, compliance with Software Considerations in Airborne Systems and Equipment Certification (DO-178B) has been the primary means for receiving regulatory approval for using software on commercial airplanes. A new edition of the standard, DO-178C, was published in December 2011, and regulatory bodies have started the process towards recognizing this edition. The stated purpose of DO-178C remains unchanged from its predecessor: providing guidance “for the production of software for airborne systems and equipment that performs its intended function with a level of confidence in safety that complies with airworthiness requirements.” Within the text of the guidance, little or no rationale is given for how a particular objective or collection of objectives contributes to achieving this purpose. Thus the assurance case for the document is implicit. This paper discusses a current effort to make the implicit explicit. In particular, the paper describes the current status of the research seeking to identify the specific arguments contained in, or implied by, the DO-178C guidance that implicitly justify the assumption that the document meets its stated purpose.
Business or pleasure? Utilitarian versus hedonic considerations in emotion regulation.
Tamir, Maya; Chiu, Chi-Yue; Gross, James J
2007-08-01
It is widely accepted that emotions have utilitarian as well as hedonic consequences. Nevertheless, it is typically assumed that individuals regulate emotions to obtain hedonic, rather than utilitarian, benefits. In this study, the authors tested whether individuals represent the utility of pleasant and unpleasant emotions and whether they would be motivated to experience unpleasant emotions if they believed they could be useful. First, findings revealed that participants explicitly viewed approach emotions (e.g., excitement) as useful for obtaining rewards, but viewed avoidance emotions (e.g., worry) as useful for avoiding threats. Second, this pattern was replicated in implicit representations of emotional utility, which were dissociated from explicit ones. Third, implicit, but not explicit, representations of emotional utility predicted motives for emotion regulation. When anticipating a threatening task, participants who viewed emotions such as worry and fear as useful for avoiding threats preferred to engage in activities that were likely to increase worry and fear (vs. excitement) before the task. These findings demonstrate that utilitarian considerations play an important, if underappreciated, role in emotion regulation. ((c) 2007 APA, all rights reserved).
Closing unprofitable services: ethical issues and management responses.
Summers, James W
1985-01-01
Closing unprofitable services often requires as much analysis, public relations, marketing, and planning as any expansion. Further, issues about ethics, indigents, and the hospital mission force the consideration of values explicitly if a marketing fiasco is to be avoided. By integrating values analysis with more traditional management tasks, the challenges of service closure can be converted into opportunities to demonstrate how your institution has met or exceeded its ethical obligations. A case involving OB is developed to show how ethical and management issues blend into one another. Specific strategies for consensus building and marketing of the legitimacy of the hospital's position are given. Institutional ethics committees are one primary mechanism for developing a plan to benefit from unpleasant decisions.
NASA Astrophysics Data System (ADS)
Sader, John E.; Uchihashi, Takayuki; Higgins, Michael J.; Farrell, Alan; Nakayama, Yoshikazu; Jarvis, Suzanne P.
2005-03-01
Use of the atomic force microscope (AFM) in quantitative force measurements inherently requires a theoretical framework enabling conversion of the observed deflection properties of the cantilever to an interaction force. In this paper, the theoretical foundations of using frequency modulation atomic force microscopy (FM-AFM) in quantitative force measurements are examined and rigorously elucidated, with consideration being given to both 'conservative' and 'dissipative' interactions. This includes a detailed discussion of the underlying assumptions involved in such quantitative force measurements, the presentation of globally valid explicit formulae for evaluation of so-called 'conservative' and 'dissipative' forces, discussion of the origin of these forces, and analysis of the applicability of FM-AFM to quantitative force measurements in liquid.
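The paper's globally valid formulas are not reproduced here, but the widely used small-amplitude limit connecting the FM-AFM frequency shift to the force gradient gives the flavor of such conversions (a standard textbook result, not specific to this work):

```latex
\frac{\Delta f}{f_0} \;\approx\; -\frac{1}{2k}\,\frac{\partial F}{\partial z}
```

where $k$ is the cantilever stiffness, $f_0$ the unperturbed resonance frequency, and $F(z)$ the tip-sample force; it holds only when the oscillation amplitude is small compared with the length scale of the interaction, which is one of the assumptions the paper examines.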
The nature of declarative and nondeclarative knowledge for implicit and explicit learning.
Kirkhart, M W
2001-10-01
Using traditional implicit and explicit artificial-grammar learning tasks, the author investigated the similarities and differences between the acquisition of declarative knowledge under implicit and explicit learning conditions and the functions of the declarative knowledge during testing. Results suggested that declarative knowledge was not predictive of or required for implicit learning but was related to consistency in implicit learning performance. In contrast, declarative knowledge was predictive of and required for explicit learning and was related to consistency in performance. For explicit learning, the declarative knowledge functioned as a guide for other behavior. In contrast, for implicit learning, the declarative knowledge did not serve as a guide for behavior but was instead a post hoc description of the most commonly seen stimuli.
Paulus, Markus; Murillo, Esther; Sodian, Beate
2016-08-01
A considerable amount of research has examined children's ability to rely on explicit social cues such as pointing to understand others' referential intentions. Yet, skillful social interaction also requires reliance on and learning from implicit cues (i.e., cues that are not displayed with the explicit intention to teach or inform someone). From an embodied point of view, orienting movements and body orientation are salient cues that reveal something about a person's intentional relations without being explicit communicative cues. In three experiments, the current study investigated the development of the ability to use body information in a word learning situation. To this end, we presented 2-year-old children, 3.5-year-old children, and adults with movies on an eye-tracking screen in which an actor oriented her upper body to one of two objects while uttering a novel word. The results show that the 3.5-year-old children and adults, but not the 2-year-old children, related the novel word to the referred object (Experiments 1 and 2). Yet, when the actor oriented her body to one object while pointing to the other object, children of both age groups relied on the pointing cue (Experiment 3). This suggests that by 3.5 years children use another's body orientation as an indicator of her intentional relations but that they prioritize explicit social cues over the implicit body posture cues. Overall, the study supports theoretical views that an appreciation of others' intentional relations does not emerge as an all-or-nothing ability but rather emerges gradually during the course of early development. Copyright © 2016 Elsevier Inc. All rights reserved.
Characteristics of implicit chaining in cotton-top tamarins (Saguinus oedipus).
Locurto, Charles; Gagne, Matthew; Nutile, Lauren
2010-07-01
In human cognition there has been considerable interest in observing the conditions under which subjects learn material without explicit instructions to learn. In the present experiments, we adapted this issue to nonhumans by asking what subjects learn in the absence of explicit reinforcement for correct responses. Two experiments examined the acquisition of sequence information by cotton-top tamarins (Saguinus oedipus) when such learning was not demanded by the experimental contingencies. An implicit chaining procedure was used in which visual stimuli were presented serially on a touchscreen. Subjects were required to touch one stimulus to advance to the next stimulus. Stimulus presentations followed a pattern, but learning the pattern was not necessary for reinforcement. In Experiment 1 the chain consisted of five different visual stimuli that were presented in the same order on each trial. Each stimulus could occur at any one of six touchscreen positions. In Experiment 2 the same visual element was presented serially in the same five locations on each trial, thereby allowing a behavioral pattern to be correlated with the visual pattern. In this experiment two new tests, a Wild-Card test and a Running-Start test, were used to assess what was learned in this procedure. Results from both experiments indicated that tamarins acquired more information from an implicit chain than was required by the contingencies of reinforcement. These results contribute to the developing literature on nonhuman analogs of implicit learning.
Technology Considerations for Inclusion of Survivability in MDAO
NASA Technical Reports Server (NTRS)
Alexandrov, Natalia M.
2017-01-01
Rising traffic density, along with autonomy and diversity of vehicles in the air, will fundamentally change the safety environment of the future air transportation system. The change in risk is two-fold: increasing chances of mid-air collisions with non-cooperative objects and increasing chances of crashes over highly populated areas. The changing nature of the vehicles populating the airspace means that civilian aircraft design must now explicitly include considerations of survivability in the event of collision with other vehicles, as well as prevention of damage to people, animals and property on the ground, to a much greater extent than today. This paper offers a preliminary perspective on how MDAO could contribute toward these goals. One of the conclusions is that, in contrast to traditional vehicle design, to accommodate the complexity of the future airspace safely and efficiently, vehicle design requirements, modeling, and design optimization must be closely connected to the properties of the airspace, including those of other vehicles in the air. Thus, the total measure of a vehicle's survivability should include the traditional survivability in malfunction scenarios, combined with new considerations of survivability in collisions and survivability of the public on the ground.
Parallelization of implicit finite difference schemes in computational fluid dynamics
NASA Technical Reports Server (NTRS)
Decker, Naomi H.; Naik, Vijay K.; Nicoules, Michel
1990-01-01
Implicit finite difference schemes are often the preferred numerical schemes in computational fluid dynamics, requiring less stringent stability bounds than the explicit schemes. Each iteration in an implicit scheme involves global data dependencies in the form of second and higher order recurrences. Efficient parallel implementations of such iterative methods are considerably more difficult to construct than those of explicit schemes, and are often non-intuitive. The parallelization of the implicit schemes that are used for solving the Euler and the thin layer Navier-Stokes equations and that require inversions of large linear systems in the form of block tri-diagonal and/or block penta-diagonal matrices is discussed. Three-dimensional cases are emphasized and schemes that minimize the total execution time are presented. Partitioning and scheduling schemes for alleviating the effects of the global data dependencies are described. An analysis of the communication and the computation aspects of these methods is presented. The effect of the boundary conditions on the parallel schemes is also discussed.
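In serial form, the block tri-diagonal inversions mentioned above amount to a block Thomas algorithm; its forward-elimination recurrence is exactly the global data dependency that the paper's partitioning and scheduling schemes work around. A minimal serial sketch in generic NumPy (not the paper's implementation):

```python
import numpy as np

def block_thomas(A, B, C, d):
    """Solve a block tri-diagonal system by forward elimination and back
    substitution. A[i], B[i], C[i] are the sub-, main- and super-diagonal
    blocks of row i (each m x m); d[i] is the right-hand-side block.
    A[0] and C[-1] are unused."""
    n = len(B)
    Bp, dp = [None] * n, [None] * n
    Bp[0], dp[0] = B[0].copy(), d[0].copy()
    # Forward elimination: each step needs the result of the previous
    # one -- this serial recurrence is the global data dependency
    # discussed in the abstract.
    for i in range(1, n):
        L = A[i] @ np.linalg.inv(Bp[i - 1])
        Bp[i] = B[i] - L @ C[i - 1]
        dp[i] = d[i] - L @ dp[i - 1]
    # Back substitution: another serial recurrence, in the other direction.
    x = [None] * n
    x[-1] = np.linalg.solve(Bp[-1], dp[-1])
    for i in range(n - 2, -1, -1):
        x[i] = np.linalg.solve(Bp[i], dp[i] - C[i] @ x[i + 1])
    return np.array(x)
```

Parallel variants (e.g., cyclic reduction or the pipelined partitionings analyzed in the paper) trade extra arithmetic and communication for concurrency across these recurrences.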
NASA Astrophysics Data System (ADS)
Al-Saidi, Ahmed Mohammad
The purpose of this study was to examine the effect of an explicit versus an implicit instructional approach during technology-based curriculum on students' understanding of the nature of science (NOS) within an introductory biology course. The study emphasized the inferential and tentative nature of science. The intervention or explicit group was involved in inquiry activities followed by discussions that were directly geared towards the target aspects of NOS. The implicit group was engaged in the same activities but received instruction devoid of direct reference to the NOS aspects. Students in both groups spent an identical amount of time on task. Selected items of the Views of Nature of Science Questionnaire (VNOS) together with semi-structured interviews were used to evaluate students' NOS conceptions before and at the end of the intervention, which lasted two weeks. A quantitative analysis using chi-square of students' pre-intervention NOS views as provided by the VNOS questionnaires revealed that there was not a statistically significant difference between implicit and explicit groups in both targeted NOS aspects, with (p = 0.18) and (p = 0.34) for inferential and tentative NOS, respectively. However, the same analysis indicated a statistically significant post-intervention difference between the implicit and explicit groups, yielding (p < 0.02) and (p < 0.002) for inferential and tentative NOS, respectively. A qualitative analysis of students' pre and post-intervention views of the target aspects of NOS as well as semi-structured interviews for both groups was also conducted. Before intervention, the number of informed NOS responses in both groups was not considerably different. However, analysis of post-intervention NOS views indicated that more students in the explicit group demonstrated informed views of the NOS aspects than in the implicit group.
Therefore, the analysis of the data indicated that, in this particular study, engaging students in inquiry-based activities followed by explicit discussion that is geared toward NOS aspects was more effective than merely involving them in implicit, inquiry-based instruction. An important finding of the present study is the evidence that teaching the NOS can be achieved through short, intensive discussion and does not necessarily require separate and independent courses.
Explicit solutions of a gravity-induced film flow along a convectively heated vertical wall.
Raees, Ammarah; Xu, Hang
2013-01-01
The gravity-driven film flow has been analyzed along a vertical wall subjected to a convective boundary condition. The Boussinesq approximation is applied to simplify the buoyancy term, and similarity transformations are used on the mathematical model of the problem under consideration to obtain a set of coupled ordinary differential equations. Then the reduced equations are solved explicitly by using the homotopy analysis method (HAM). The resulting solutions are investigated for heat transfer effects on velocity and temperature profiles.
Colangelo, Annette; Buchanan, Lori
2006-12-01
The failure of inhibition hypothesis posits a theoretical distinction between implicit and explicit access in deep dyslexia. Specifically, the effects of failure of inhibition are assumed only in conditions that have an explicit selection requirement in the context of production (i.e., aloud reading). In contrast, the failure of inhibition hypothesis proposes that implicit processing and explicit access to semantic information without production demands are intact in deep dyslexia. Evidence for intact implicit and explicit access requires that performance in deep dyslexia parallels that observed in neurologically intact participants on tasks based on implicit and explicit processes. In other words, deep dyslexics should produce normal effects in conditions with implicit task demands (i.e., lexical decision) and on tasks based on explicit access without production (i.e., forced choice semantic decisions) because failure of inhibition does not impact the availability of lexical information, only explicit retrieval in the context of production. This research examined the distinction between implicit and explicit processes in deep dyslexia using semantic blocking in lexical decision and forced choice semantic decisions as a test for the failure of inhibition hypothesis. The results of the semantic blocking paradigm support the distinction between implicit and explicit processing and provide evidence for failure of inhibition as an explanation for semantic errors in deep dyslexia.
Implicit and Explicit Representations of Hand Position in Tool Use
Rand, Miya K.; Heuer, Herbert
2013-01-01
Understanding the interactions of visual and proprioceptive information in tool use is important as it is the basis for learning of the tool's kinematic transformation and thus skilled performance. This study investigated how the CNS combines seen cursor positions and felt hand positions under a visuo-motor rotation paradigm. Young and older adult participants performed aiming movements on a digitizer while looking at rotated visual feedback on a monitor. After each movement, they judged either the proprioceptively sensed hand direction or the visually sensed cursor direction. We identified asymmetric mutual biases with a strong visual dominance. Furthermore, we found a number of differences between explicit and implicit judgments of hand directions. The explicit judgments had considerably larger variability than the implicit judgments. The bias toward the cursor direction for the explicit judgments was about twice as strong as for the implicit judgments. The individual biases of explicit and implicit judgments were uncorrelated. Biases of these judgments exhibited opposite sequential effects. Moreover, age-related changes were also different between these judgments. The judgment variability was decreased and the bias toward the cursor direction was increased with increasing age only for the explicit judgments. These results indicate distinct explicit and implicit neural representations of hand direction, similar to the notion of distinct visual systems. PMID:23894307
What values in design? The challenge of incorporating moral values into design.
Manders-Huits, Noëmi
2011-06-01
Recently, there is increased attention to the integration of moral values into the conception, design, and development of emerging IT. The most reviewed approach for this purpose in ethics and technology so far is Value-Sensitive Design (VSD). This article considers VSD as the prime candidate for implementing normative considerations into design. Its methodology is considered from a conceptual, analytical, normative perspective. The focus here is on the suitability of VSD for integrating moral values into the design of technologies in a way that joins in with an analytical perspective on ethics of technology. Despite its promising character, it turns out that VSD falls short in several respects: (1) VSD does not have a clear methodology for identifying stakeholders, (2) the integration of empirical methods with conceptual research within the methodology of VSD is obscure, (3) VSD runs the risk of committing the naturalistic fallacy when using empirical knowledge for implementing values in design, (4) the concept of values, as well as their realization, is left undetermined and (5) VSD lacks a complementary or explicit ethical theory for dealing with value trade-offs. For the normative evaluation of a technology, I claim that an explicit and justified ethical starting point or principle is required. Moreover, explicit attention should be given to the value aims and assumptions of a particular design. The criteria of adequacy for such an approach or methodology follow from the evaluation of VSD as the prime candidate for implementing moral values in design.
Structural design considerations for micromachined solid-oxide fuel cells
NASA Astrophysics Data System (ADS)
Srikar, V. T.; Turner, Kevin T.; Andrew Ie, Tze Yung; Spearing, S. Mark
Micromachined solid-oxide fuel cells (μSOFCs) are among a class of devices being investigated for portable power generation. Optimization of the performance and reliability of such devices requires robust, scale-dependent, design methodologies. In this first analysis, we consider the structural design of planar, electrolyte-supported, μSOFCs from the viewpoints of electrochemical performance, mechanical stability and reliability, and thermal behavior. The effect of electrolyte thickness on fuel cell performance is evaluated using a simple analytical model. Design diagrams that account explicitly for thermal and intrinsic residual stresses are presented to identify geometries that are resistant to fracture and buckling. Analysis of energy loss due to in-plane heat conduction highlights the importance of efficient thermal isolation in microscale fuel cell design.
Implicit aversive memory under anaesthesia in animal models: a narrative review.
Samuel, N; Taub, A H; Paz, R; Raz, A
2018-07-01
Explicit memory after anaesthesia has gained considerable attention because of its negative implications, while implicit memory, which is more elusive and lacks patients' explicit recall, has received less attention and dedicated research. This is despite the likely impact of implicit memory on postoperative long-term well-being and behaviour. Given the scarcity of human data, fear conditioning in animals offers a reliable model of implicit learning, and importantly, one where we already have a good understanding of the underlying neural circuitry in awake conditions. Animal studies provide evidence that fear conditioning occurs under anaesthesia. The effects of different anaesthetics on memory are complex, with different drugs interacting at different stages of learning. Modulatory suppressive effects can be because of context, specific drugs, and dose dependency. In some cases, low doses of general anaesthetics can actually lead to a paradoxical opposite effect. The underlying mechanisms involve several neurotransmitter systems, acting mainly in the amygdala, hippocampus, and neocortex. Here, we review animal studies of aversive conditioning under anaesthesia, discuss the complex picture that arises, identify the gaps in knowledge that require further investigation, and highlight the potential translational relevance of the models. Copyright © 2018 British Journal of Anaesthesia. Published by Elsevier Ltd. All rights reserved.
Chen, G.; Chacón, L.
2015-08-11
For decades, the Vlasov–Darwin model has been recognized to be attractive for particle-in-cell (PIC) kinetic plasma simulations in non-radiative electromagnetic regimes, to avoid radiative noise issues and gain computational efficiency. However, the Darwin model results in an elliptic set of field equations that renders conventional explicit time integration unconditionally unstable. We explore a fully implicit PIC algorithm for the Vlasov–Darwin model in multiple dimensions, which overcomes many difficulties of traditional semi-implicit Darwin PIC algorithms. The finite-difference scheme for Darwin field equations and particle equations of motion is space–time-centered, employing particle sub-cycling and orbit-averaging. This algorithm conserves total energy, local charge, and canonical momentum in the ignorable direction, and preserves the Coulomb gauge exactly. An asymptotically well-posed fluid preconditioner allows efficient use of large cell sizes, which are determined by accuracy considerations, not stability, and can be orders of magnitude larger than required in a standard explicit electromagnetic PIC simulation. Finally, we demonstrate the accuracy and efficiency properties of the algorithm with various numerical experiments in 2D–3V.
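Far short of the full Darwin PIC algorithm, the space–time-centered idea behind such fully implicit schemes can be illustrated with an implicit-midpoint push for a single particle, solved by fixed-point (Picard) iteration; for a linear field this discretization conserves the discrete energy to machine precision. All parameters below are illustrative:

```python
def implicit_push(x, v, dt, efield, n_iter=20):
    """One time-centered (implicit-midpoint) step for a unit-charge,
    unit-mass particle, solved by Picard iteration:
        x1 = x + dt*(v + v1)/2,   v1 = v + dt*E((x + x1)/2)
    A minimal sketch of the space-time-centered idea only."""
    x1, v1 = x, v
    for _ in range(n_iter):
        v1 = v + dt * efield(0.5 * (x + x1))
        x1 = x + dt * 0.5 * (v + v1)
    return x1, v1

# For a linear field E(x) = -x (harmonic well), the midpoint scheme
# conserves the discrete energy v**2/2 + x**2/2 up to roundoff,
# independent of the time step -- no explicit stability bound.
x, v = 1.0, 0.0
e0 = 0.5 * (v * v + x * x)
for _ in range(1000):
    x, v = implicit_push(x, v, 0.1, lambda s: -s)
drift = abs(0.5 * (v * v + x * x) - e0)
print(drift)
```

The production algorithm replaces this scalar Picard loop with a preconditioned nonlinear solve coupling all particles and fields, but the exact conservation under a centered discretization is the same mechanism.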
Sorrentino, Jasmin; Augoustinos, Martha
2016-09-01
Former Prime Minister Julia Gillard's speech in the Australian parliament on sexism and misogyny received considerable public attention and controversy. However, less attention has been paid to how Gillard attended and oriented to issues related to her status as a woman during the period between her elevation to the position of Prime Minister in June 2010 and the delivery of the misogyny speech in October 2012. Using a discursive psychological approach, this article examines a corpus of interview transcripts in which gender was occasioned both explicitly and implicitly by speakers, thus requiring Gillard to attend to her gender identity. The analysis demonstrates that far from making gender a salient and relevant membership category, Gillard worked strategically to mitigate her gender as merely inconsequential to her role as Prime Minister. These findings are discussed in relation to existing research examining how gender is oriented to, negotiated, and resisted in talk to accomplish social actions, and more specifically what may be at stake for women in leadership positions who explicitly orient to gender as an identity category. © 2016 The British Psychological Society.
Robust fault-tolerant tracking control design for spacecraft under control input saturation.
Bustan, Danyal; Pariz, Naser; Sani, Seyyed Kamal Hosseini
2014-07-01
In this paper, a continuous globally stable tracking control algorithm is proposed for a spacecraft in the presence of unknown actuator failure, control input saturation, uncertainty in inertial matrix and external disturbances. The design method is based on variable structure control and has the following properties: (1) fast and accurate response in the presence of bounded disturbances; (2) robust to the partial loss of actuator effectiveness; (3) explicit consideration of control input saturation; and (4) robust to uncertainty in inertial matrix. In contrast to traditional fault-tolerant control methods, the proposed controller does not require knowledge of the actuator faults and is implemented without explicit fault detection and isolation processes. In the proposed controller a single parameter is adjusted dynamically in such a way that it is possible to prove that both attitude and angular velocity errors will tend to zero asymptotically. The stability proof is based on a Lyapunov analysis and the properties of the singularity free quaternion representation of spacecraft dynamics. Results of numerical simulations state that the proposed controller is successful in achieving high attitude performance in the presence of external disturbances, actuator failures, and control input saturation. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Sotiropoulou, Rafaella-Eleni P.; Nenes, Athanasios; Adams, Peter J.; Seinfeld, John H.
2007-01-01
In situ observations of aerosol and cloud condensation nuclei (CCN) and the GISS GCM Model II' with an online aerosol simulation and explicit aerosol-cloud interactions are used to quantify the uncertainty in radiative forcing and autoconversion rate from application of Köhler theory. Simulations suggest that application of Köhler theory introduces a 10-20% uncertainty in global average indirect forcing and 2-11% uncertainty in autoconversion. Regionally, the uncertainty in indirect forcing ranges between 10-20%, and 5-50% for autoconversion. These results are insensitive to the range of updraft velocity and water vapor uptake coefficient considered. This study suggests that Köhler theory (as implemented in climate models) is not a significant source of uncertainty for aerosol indirect forcing but can be substantial for assessments of aerosol effects on the hydrological cycle in climatically sensitive regions of the globe. This implies that improvements in the representation of GCM subgrid processes and aerosol size distribution will mostly benefit indirect forcing assessments. Predictions of autoconversion, by nature, will be subject to considerable uncertainty; its reduction may require explicit representation of size-resolved aerosol composition and mixing state.
Stellar feedback strongly alters the amplification and morphology of galactic magnetic fields
NASA Astrophysics Data System (ADS)
Su, Kung-Yi; Hayward, Christopher C.; Hopkins, Philip F.; Quataert, Eliot; Faucher-Giguère, Claude-André; Kereš, Dušan
2018-01-01
Using high-resolution magnetohydrodynamic simulations of idealized, non-cosmological galaxies, we investigate how cooling, star formation and stellar feedback affect galactic magnetic fields. We find that the amplification histories, saturation values and morphologies of the magnetic fields vary considerably depending on the baryonic physics employed, primarily because of differences in the gas density distribution. In particular, adiabatic runs and runs with a subgrid (effective equation of state) stellar feedback model yield lower saturation values and morphologies that exhibit greater large-scale order compared with runs that adopt explicit stellar feedback and runs with cooling and star formation but no feedback. The discrepancies mostly lie in gas denser than the galactic average, which requires cooling and explicit fragmentation to capture. Independent of the baryonic physics included, the magnetic field strength scales with gas density as B ∝ n^(2/3), suggesting isotropic flux freezing or equipartition between the magnetic and gravitational energies during the field amplification. We conclude that accurate treatments of cooling, star formation and stellar feedback are crucial for obtaining the correct magnetic field strength and morphology in dense gas, which, in turn, is essential for properly modelling other physical processes that depend on the magnetic field, such as cosmic ray feedback.
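The B ∝ n^(2/3) scaling quoted above is the kind of relation typically checked by fitting a slope in log-log space across simulation cells. A sketch with synthetic data (not the paper's simulation output):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10.0 ** rng.uniform(-2, 3, size=500)      # gas densities (arbitrary units)
B = 2.0 * n ** (2.0 / 3.0)                    # ideal flux-freezing scaling
B *= 10.0 ** rng.normal(0.0, 0.05, size=500)  # small log-normal scatter
# Power-law exponent = slope of log B vs log n
slope, intercept = np.polyfit(np.log10(n), np.log10(B), 1)
print(slope)
```

With modest scatter over several decades in density, the fitted slope recovers the 2/3 exponent closely; in real simulation data the scatter and any density dependence of the exponent carry the physical information.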
Sexual and reproductive health care for adolescents: legal rights and policy challenges.
English, Abigail
2007-12-01
Laws developed over the past half century have significantly improved adolescents' access to essential sexual and reproductive health care. These laws allow many adolescent minors to give their own consent, protect confidentiality, and provide financial support for the care. The consent requirements for adolescents to receive health care are contained primarily in state court decisions and in statutes known as "state minor consent laws," which are based on either the minor's status or the services sought. Confidentiality protections for adolescents' health information are contained in these minor consent laws, in the federal medical privacy regulations known as the "HIPAA Privacy Rule," and in state medical privacy laws. Other significant laws include statutes providing for the emancipation of minors, court decisions delineating the mature minor doctrine, regulations protecting adolescents' access to confidential family planning services in publicly funded programs, and court decisions interpreting the constitutional right of privacy. Special considerations apply to consent and confidentiality questions pertaining to family planning, contraception, and pregnancy-related care for minors. In addition to the explicit provisions of state minor consent laws, many of the most important considerations are articulated in court decisions based on the constitutional right of privacy and the confidentiality requirements that are part of the federal Title X Family Planning Program and Medicaid.
Felton, Adam; Ranius, Thomas; Roberge, Jean-Michel; Öhman, Karin; Lämås, Tomas; Hynynen, Jari; Juutinen, Artti; Mönkkönen, Mikko; Nilsson, Urban; Lundmark, Tomas; Nordin, Annika
2017-07-15
A variety of modeling approaches can be used to project the future development of forest systems, and help to assess the implications of different management alternatives for biodiversity and ecosystem services. This diversity of approaches does however present both an opportunity and an obstacle for those trying to decide which modeling technique to apply, and interpreting the management implications of model output. Furthermore, the breadth of issues relevant to addressing key questions related to forest ecology, conservation biology, silviculture, and economics requires insights stemming from a number of distinct scientific disciplines. As forest planners, conservation ecologists, ecological economists and silviculturalists, experienced with modeling trade-offs and synergies between biodiversity and wood biomass production, we identified fifteen key considerations relevant to assessing the pros and cons of alternative modeling approaches. Specifically we identified key considerations linked to study question formulation, modeling forest dynamics, forest processes, study landscapes, spatial and temporal aspects, and the key response metrics - biodiversity and wood biomass production - as well as dealing with trade-offs and uncertainties. We also provide illustrative examples from the modeling literature stemming from the key considerations assessed. We use our findings to reiterate the need for explicitly addressing and conveying the limitations and uncertainties of any modeling approach taken, and the need for interdisciplinary research efforts when addressing the conservation of biodiversity and sustainable use of environmental resources. Copyright © 2017 Elsevier Ltd. All rights reserved.
Explicit and implicit assessment of gender roles.
Fernández, Juan; Quiroga, M Ángeles; Escorial, Sergio; Privado, Jesús
2014-05-01
Gender roles have been assessed by explicit measures and, recently, by implicit measures. In the former case, the theoretical assumptions have been questioned by empirical results. To resolve this contradiction, we carried out two concatenated studies based on a relatively well-founded theoretical and empirical approach. The first study was designed to obtain a sample of genderized activities of the domestic sphere by means of an explicit assessment. Forty-two adults (22 women and 20 men, balanced on age, sex, and level of education) took part as raters. In the second study, an implicit assessment of gender roles was carried out, focusing on the response time given to the sample activities obtained from the first study. A total of 164 adults (90 women and 74 men, mean age = 43), with experience in living with a partner and balanced on age, sex, and level of education, participated. Taken together, results show that explicit and implicit assessment converge. The current social reality shows that there is still no equity in some gender roles in the domestic sphere. These consistent results show considerable theoretical and empirical robustness, due to the double implicit and explicit assessment.
Impact of mesophyll diffusion on estimated global land CO 2 fertilization
Sun, Ying; Gu, Lianhong; Dickinson, Robert E.; ...
2014-10-13
In C3 plants, CO2 concentrations drop considerably along mesophyll diffusion pathways from substomatal cavities to chloroplasts where CO2 assimilation occurs. Global carbon cycle models have not explicitly represented this internal drawdown and so overestimate CO2 available for carboxylation and underestimate photosynthetic responsiveness to atmospheric CO2. An explicit consideration of mesophyll diffusion increases the modeled cumulative CO2 fertilization effect (CFE) for global gross primary production (GPP) from 915 PgC to 1057 PgC for the period of 1901 to 2010. This increase represents a 16% correction, large enough to explain the persistent overestimation of growth rates of historical atmospheric CO2 by Earth System Models. Without this correction, the CFE for global GPP is underestimated by 0.05 PgC yr^-1 ppm^-1. This finding implies that the contemporary terrestrial biosphere is more CO2-limited than previously thought.
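The internal drawdown described above follows from Fick's law across the mesophyll: Cc = Ci - A/gm, so a finite mesophyll conductance gm lowers the CO2 concentration at the chloroplast below the substomatal value that many models use directly. A sketch with illustrative (not paper-specific) numbers:

```python
def chloroplast_co2(ci, assim, gm):
    """Fick's-law drawdown across the mesophyll: Cc = Ci - A/gm.
    ci    : substomatal CO2 (umol/mol)
    assim : net assimilation rate (umol m^-2 s^-1)
    gm    : mesophyll conductance (mol m^-2 s^-1)"""
    return ci - assim / gm

# A finite gm lowers the CO2 actually seen by Rubisco, so models that
# assume Cc = Ci overestimate the substrate available for carboxylation.
ci, a = 280.0, 15.0
for gm in (0.2, 0.4, float("inf")):
    print(gm, chloroplast_co2(ci, a, gm))
```

The infinite-conductance case corresponds to the common modeling shortcut the paper corrects: no drawdown, chloroplast CO2 equal to the substomatal value.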
Babin, Volodymyr; Roland, Christopher; Darden, Thomas A.; Sagui, Celeste
2007-01-01
There is considerable interest in developing methodologies for the accurate evaluation of free energies, especially in the context of biomolecular simulations. Here, we report on a reexamination of the recently developed metadynamics method, which is explicitly designed to probe “rare events” and areas of phase space that are typically difficult to access with a molecular dynamics simulation. Specifically, we show that the accuracy of the free energy landscape calculated with the metadynamics method may be considerably improved when combined with umbrella sampling techniques. As test cases, we have studied the folding free energy landscape of two prototypical peptides: Ace-(Gly)2-Pro-(Gly)3-Nme in vacuo and trialanine solvated by both implicit and explicit water. The method has been implemented in the classical biomolecular code AMBER and is to be distributed in the next scheduled release of the code. © 2006 American Institute of Physics. PMID:17144742
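The metadynamics method reexamined above deposits Gaussians along the trajectory so that a history-dependent bias gradually fills visited free-energy minima, driving the system over barriers. A minimal 1D sketch on a double-well potential (illustrative parameters; not the AMBER implementation or the peptide systems of the paper):

```python
import numpy as np

def metadynamics_1d(force, steps=5000, dt=1e-3, w=0.05, sigma=0.1,
                    stride=50, beta=5.0, seed=0):
    """Gaussians of height w and width sigma are deposited every `stride`
    steps along an overdamped Langevin trajectory; their summed bias
    gradually fills the free-energy minima the walker visits."""
    rng = np.random.default_rng(seed)
    x, centers = 0.0, []
    for step in range(steps):
        # Force from the accumulated bias, -dV_bias/dx
        fb = sum(w * (x - c) / sigma**2 * np.exp(-0.5 * ((x - c) / sigma) ** 2)
                 for c in centers)
        x += dt * (force(x) + fb) + np.sqrt(2.0 * dt / beta) * rng.standard_normal()
        if step % stride == 0:
            centers.append(x)
    return x, centers

# Double well U(x) = (x**2 - 1)**2, so force = -dU/dx = -4x(x**2 - 1)
x_final, centers = metadynamics_1d(lambda x: -4.0 * x * (x * x - 1.0))
```

The deposited bias itself estimates the negative of the free energy along the collective variable; the paper's point is that refining this estimate with umbrella sampling considerably improves its accuracy.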
A comparison of two central difference schemes for solving the Navier-Stokes equations
NASA Technical Reports Server (NTRS)
Maksymiuk, C. M.; Swanson, R. C.; Pulliam, T. H.
1990-01-01
Five viscous transonic airfoil cases were computed by two significantly different computational fluid dynamics codes: an explicit finite-volume algorithm with multigrid, and an implicit finite-difference approximate-factorization method with eigenvector diagonalization. Both methods are described in detail, and their performance on the test cases is compared. The codes used the same grids, turbulence model, and computer to provide the truest test of the algorithms. The two approaches produce very similar results, which, for attached flows, also agree well with experimental results; however, the explicit code is considerably faster.
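The explicit-versus-implicit trade-off reported above can be seen in miniature on the 1D heat equation (a sketch with made-up grid sizes, not the airfoil codes): the explicit update is cheap per step but stable only for r = Δt/Δx² ≤ 1/2, while the implicit backward-Euler step requires a tridiagonal solve yet remains stable for any r.

```python
def step_explicit(u, r):
    # FTCS update u_i <- u_i + r (u_{i+1} - 2 u_i + u_{i-1}); stable only for r <= 1/2
    return ([u[0]]
            + [u[i] + r * (u[i + 1] - 2.0 * u[i] + u[i - 1])
               for i in range(1, len(u) - 1)]
            + [u[-1]])

def step_implicit(u, r):
    # backward Euler: solve (I - r D2) u_new = u with the Thomas algorithm
    n = len(u)
    a = [-r] * n              # sub-diagonal
    b = [1.0 + 2.0 * r] * n   # main diagonal
    c = [-r] * n              # super-diagonal
    b[0] = b[-1] = 1.0        # Dirichlet boundary rows: end values are held fixed
    c[0] = a[-1] = 0.0
    d = u[:]
    for i in range(1, n):     # forward elimination
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    x = [0.0] * n             # back substitution
    x[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        x[i] = (d[i] - c[i] * x[i + 1]) / b[i]
    return x
```

Driving the explicit scheme past its stability limit makes a point disturbance blow up, while the implicit scheme damps it smoothly even at twenty times that limit.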
Valuing flexibilities in the design of urban water management systems.
Deng, Yinghan; Cardin, Michel-Alexandre; Babovic, Vladan; Santhanakrishnan, Deepak; Schmitter, Petra; Meshgi, Ali
2013-12-15
Climate change and rapid urbanization require decision-makers to develop a long-term forward assessment of sustainable urban water management projects. This is further complicated by the difficulty of assessing sustainable designs and various design scenarios from an economic standpoint. Conventional valuation approaches for urban water management projects, like Discounted Cash Flow (DCF) analysis, fail to incorporate uncertainties, such as the amount of rainfall, the unit cost of water, and other uncertainties associated with future changes in technological domains. Such approaches also fail to include the value of flexibility, which enables managers to adapt and reconfigure systems over time as uncertainty unfolds. This work describes an integrated framework to value investments in urban water management systems under uncertainty. It also extends conventional DCF analysis through explicit consideration of flexibility in systems design and management. The approach incorporates flexibility as intelligent decision-making mechanisms that enable systems to avoid future downside risks and increase opportunities for upside gains over a range of possible futures. A water catchment area in Singapore was chosen to assess the value of a flexible extension of standard drainage canals and a flexible deployment of a novel water catchment technology based on green roofs and porous pavements. Results show that integrating uncertainty and flexibility explicitly into the decision-making process can reduce initial capital expenditure, improve value for investment, and enable decision-makers to learn more about system requirements during the lifetime of the project. Copyright © 2013 Elsevier Ltd. All rights reserved.
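The contrast between a static DCF valuation and a flexibility-aware one can be sketched with a toy Monte Carlo model. All cash-flow figures, growth ranges, and the expansion rule below are invented for illustration and are not from the Singapore case study: the flexible design pays a small premium up front for the right to expand capacity later, exercised only when simulated demand approaches the current limit.

```python
import random, statistics

def npv(cashflows, rate=0.05):
    # standard discounted cash flow over yearly cash flows (year 0 first)
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def simulate(flexible, years=20, seed=None):
    rng = random.Random(seed)
    demand, capacity = 100.0, 120.0            # illustrative units
    # flexible design: small premium up front buys an expandable platform
    cash = [-1050.0 if flexible else -1000.0]
    for _ in range(years):
        demand *= rng.uniform(0.97, 1.09)      # uncertain demand growth
        if flexible and demand > 0.9 * capacity:
            capacity += 60.0                   # exercise the expansion option
            cash.append(min(demand, capacity) * 8.0 - 450.0)
        else:
            cash.append(min(demand, capacity) * 8.0)
    return npv(cash)

def mean_npv(flexible, n=2000, seed=42):
    rng = random.Random(seed)
    return statistics.mean(simulate(flexible, seed=rng.random()) for _ in range(n))
```

Under these assumptions the rigid design caps revenue once demand outgrows the fixed capacity, while the flexible design defers expansion cost until uncertainty has partly resolved — the source of the option value the abstract describes.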
ERIC Educational Resources Information Center
van Velzen, Joke H.
2016-01-01
Theoretically, it has been argued that a conscious understanding of metacognitive knowledge requires that this knowledge is explicit and systematic. The purpose of this descriptive study was to obtain a better understanding of explicitness and systematicity in knowledge of the mathematical problem-solving process. Eighteen 11th-grade…
Sleep Increases Explicit Solutions and Reduces Intuitive Judgments of Semantic Coherence
ERIC Educational Resources Information Center
Zander, Thea; Volz, Kirsten G.; Born, Jan; Diekelmann, Susanne
2017-01-01
Sleep fosters the generation of explicit knowledge. Whether sleep also benefits implicit intuitive decisions about underlying patterns is unclear. We examined sleep's role in explicit and intuitive semantic coherence judgments. Participants encoded sets of three words and after a sleep or wake period were required to judge the potential…
Age effects on explicit and implicit memory
Ward, Emma V.; Berry, Christopher J.; Shanks, David R.
2013-01-01
It is well-documented that explicit memory (e.g., recognition) declines with age. In contrast, many argue that implicit memory (e.g., priming) is preserved in healthy aging. For example, priming on tasks such as perceptual identification is often not statistically different in groups of young and older adults. Such observations are commonly taken as evidence for distinct explicit and implicit learning/memory systems. In this article we discuss several lines of evidence that challenge this view. We describe how patterns of differential age-related decline may arise from differences in the ways in which the two forms of memory are commonly measured, and review recent research suggesting that under improved measurement methods, implicit memory is not age-invariant. Formal computational models are of considerable utility in revealing the nature of underlying systems. We report the results of applying single- and multiple-systems models to data on age effects in implicit and explicit memory. Model comparison clearly favors the single-system view. Implications for the memory systems debate are discussed. PMID:24065942
Bibby, Anna C; Torgerson, David J; Leach, Samantha; Lewis-White, Helen; Maskell, Nick A
2018-01-08
The 'trials within cohorts' (TwiC) design is a pragmatic approach to randomised trials in which trial participants are randomly selected from an existing cohort. The design has multiple potential benefits, including the option of conducting multiple trials within the same cohort. To date, the TwiC design methodology has been used in numerous clinical settings but has never been applied to a clinical trial of an investigational medicinal product (CTIMP). We have recently secured the necessary approvals to undertake the first CTIMP using the TwiC design. In this paper, we describe some of the considerations and modifications required to ensure such a trial is compliant with Good Clinical Practice and international clinical trials regulations. We advocate using a two-stage consent process and using the consent stages to explicitly differentiate between trial participants and cohort participants who are providing control data. This distinction ensured compliance but had consequences with respect to costings, recruitment and the trial assessment schedule. We have demonstrated that it is possible to secure ethical and regulatory approval for a CTIMP TwiC. By including certain considerations at the trial design stage, we believe this pragmatic and efficient methodology could be utilised in other CTIMPs in future.
Setting conservation priorities.
Wilson, Kerrie A; Carwardine, Josie; Possingham, Hugh P
2009-04-01
A generic framework for setting conservation priorities based on the principles of classic decision theory is provided. This framework encapsulates the key elements of any problem, including the objective, the constraints, and knowledge of the system. Within the context of this framework the broad array of approaches for setting conservation priorities are reviewed. While some approaches prioritize assets or locations for conservation investment, it is concluded here that prioritization is incomplete without consideration of the conservation actions required to conserve the assets at particular locations. The challenges associated with prioritizing investments through time in the face of threats (and also spatially and temporally heterogeneous costs) can be aided by proper problem definition. Using the authors' general framework for setting conservation priorities, multiple criteria can be rationally integrated and where, how, and when to invest conservation resources can be scheduled. Trade-offs are unavoidable in priority setting when there are multiple considerations, and budgets are almost always finite. The authors discuss how trade-offs, risks, uncertainty, feedbacks, and learning can be explicitly evaluated within their generic framework for setting conservation priorities. Finally, they suggest ways that current priority-setting approaches may be improved.
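A minimal instance of the point above — that priorities attach to actions, not just assets — is cost-effectiveness ranking under a budget constraint. The action names, benefits, and costs below are hypothetical:

```python
def prioritize(actions, budget):
    """Greedy benefit-per-cost ranking of (name, benefit, cost) actions under a budget.
    A simple instance of decision-theoretic prioritisation: objective = total benefit,
    constraint = budget, knowledge = the benefit and cost estimates supplied."""
    ranked = sorted(actions, key=lambda a: a[1] / a[2], reverse=True)
    chosen, spent = [], 0.0
    for name, benefit, cost in ranked:
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen, spent
```

Greedy ranking is only a heuristic for the underlying knapsack problem, but it makes the trade-off explicit: with a finite budget, a cheap moderate-benefit action can outrank an expensive high-benefit one.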
Bayesian analysis of input uncertainty in hydrological modeling: 2. Application
NASA Astrophysics Data System (ADS)
Kavetski, Dmitri; Kuczera, George; Franks, Stewart W.
2006-03-01
The Bayesian total error analysis (BATEA) methodology directly addresses both input and output errors in hydrological modeling, requiring the modeler to make explicit, rather than implicit, assumptions about the likely extent of data uncertainty. This study considers a BATEA assessment of two North American catchments: (1) French Broad River and (2) Potomac basins. It assesses the performance of the conceptual Variable Infiltration Capacity (VIC) model with and without accounting for input (precipitation) uncertainty. The results show the considerable effects of precipitation errors on the predicted hydrographs (especially the prediction limits) and on the calibrated parameters. In addition, the performance of BATEA in the presence of severe model errors is analyzed. While BATEA allows a very direct treatment of input uncertainty and yields some limited insight into model errors, it requires the specification of valid error models, which are currently poorly understood and require further work. Moreover, it leads to computationally challenging, high-dimensional problems. For some types of models, including the VIC implemented using robust numerical methods, the computational cost of BATEA can be reduced using Newton-type methods.
An efficient and flexible Abel-inversion method for noisy data
NASA Astrophysics Data System (ADS)
Antokhin, Igor I.
2016-12-01
We propose an efficient and flexible method for solving the Abel integral equation of the first kind, frequently appearing in many fields of astrophysics, physics, chemistry, and applied sciences. This equation represents an ill-posed problem, thus solving it requires some kind of regularization. Our method is based on solving the equation on a so-called compact set of functions and/or using Tikhonov's regularization. A priori constraints on the unknown function, defining a compact set, are very loose and can be set using simple physical considerations. Tikhonov's regularization in itself does not require any explicit a priori constraints on the unknown function and can be used independently of such constraints or in combination with them. Various target degrees of smoothness of the unknown function may be set, as required by the problem at hand. The advantage of the method, apart from its flexibility, is that it gives uniform convergence of the approximate solution to the exact solution, as the errors of input data tend to zero. The method is illustrated on several simulated models with known solutions. An example of astrophysical application of the method is also given.
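The Tikhonov branch of such a method can be sketched in a few lines. This is a simplified illustration, not the authors' code: a midpoint-rule discretisation of the Abel transform, followed by the regularised normal equations with an assumed regularisation parameter `lam`.

```python
import math

def abel_matrix(n, R=1.0):
    # midpoint-rule discretisation of F(y) = 2 * integral_y^R f(r) r / sqrt(r^2 - y^2) dr;
    # cell centres r_i avoid the integrable singularity at r = y
    h = R / n
    r = [(i + 0.5) * h for i in range(n)]
    y = [j * h for j in range(n)]
    A = [[2.0 * r[i] * h / math.sqrt(r[i] ** 2 - y[j] ** 2) if r[i] > y[j] else 0.0
          for i in range(n)] for j in range(n)]
    return A, r

def solve(M, rhs):
    # plain Gaussian elimination with partial pivoting on the augmented matrix
    n = len(rhs)
    M = [row[:] + [rhs[j]] for j, row in enumerate(M)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            m = M[i][k] / M[k][k]
            for c in range(k, n + 1):
                M[i][c] -= m * M[k][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def tikhonov_invert(A, b, lam=1e-4):
    # minimise ||A x - b||^2 + lam ||x||^2  via  (A^T A + lam I) x = A^T b
    n = len(A)
    AtA = [[sum(A[k][i] * A[k][j] for k in range(n)) + (lam if i == j else 0.0)
            for j in range(n)] for i in range(n)]
    Atb = [sum(A[k][i] * b[k] for k in range(n)) for i in range(n)]
    return solve(AtA, Atb)
```

Setting `lam = 0` recovers the unregularised least-squares solution, whose noise amplification grows as the grid is refined — the ill-posedness the abstract refers to.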
ERIC Educational Resources Information Center
Rose, Homer C., Jr.; Hample, Stephen R.
1982-01-01
Considerations that can help colleges and universities develop institutionally specific strategies for planning faculty reductions are addressed. It is suggested that an institution can provide a fair and workable reduction plan if it: thoroughly explores alternatives to faculty layoffs; develops explicit standards and procedures for reduction…
NASA Astrophysics Data System (ADS)
He, Hongxing; Meyer, Astrid; Jansson, Per-Erik; Svensson, Magnus; Rütting, Tobias; Klemedtsson, Leif
2018-02-01
The symbiosis between plants and ectomycorrhizal fungi (ECM) has been shown to considerably influence the carbon (C) and nitrogen (N) fluxes between the soil, rhizosphere, and plants in boreal forest ecosystems. However, ECM are either neglected or represented as an implicit, undynamic term in most ecosystem models, which can potentially reduce the predictive power of the models.
In order to investigate the necessity of an explicit consideration of ECM in ecosystem models, we implement the previously developed MYCOFON model into a detailed process-based, soil-plant-atmosphere model, Coup-MYCOFON, which explicitly describes the C and N fluxes between ECM and roots. This new Coup-MYCOFON model approach (ECM explicit) is compared with two simpler model approaches: one containing ECM implicitly as a dynamic uptake of organic N considering the plant roots to represent the ECM (ECM implicit), and the other a static N approach in which plant growth is limited to a fixed N level (nonlim). Parameter uncertainties are quantified using Bayesian calibration in which the model outputs are constrained to current forest growth and soil C / N ratio for four forest sites along a climate and N deposition gradient in Sweden and simulated over a 100-year period.
The nonlim approach could not describe the soil C / N ratio, due to a large overestimation of soil N sequestration, but simulated forest growth reasonably well. The ECM implicit and ECM explicit approaches both describe the soil C / N ratio well but slightly underestimate forest growth. The implicit approach simulated lower litter production and soil respiration than the explicit approach. The ECM explicit Coup-MYCOFON model provides a more detailed description of internal ecosystem fluxes and feedbacks of C and N between plants, soil, and ECM. Our modeling highlights the need to incorporate ECM and organic N uptake into ecosystem models; the nonlim approach is not recommended for future long-term soil C and N predictions. We also provide a key set of posterior fungal parameters that can be further investigated and evaluated in future ECM studies.
A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems
NASA Astrophysics Data System (ADS)
Li, Yu; Oberweis, Andreas
Aiming to increase the flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations, and thereby to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development, with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets, as a basic methodology for business process modeling, and an XML net-based software toolset providing comprehensive functionality for POIS development.
Developing interventions for chronically ill patients: is coping a helpful concept?
de Ridder, D; Schreurs, K
2001-03-01
In this review, the role of coping in the development of psychosocial interventions for chronically ill patients is discussed. After summarizing the theoretical issues involved in the translation of the coping concept into an intervention, a review is undertaken of 35 studies concerned with the impact of interventions aimed at improving coping on patients' quality of life. These studies cover seven different chronic disease types (AIDS, asthma, cancer, cardiovascular diseases, chronic pain, diabetes, and rheumatoid arthritis) and show that explicit consideration of attempts to manage illness in terms of coping is rare. Many studies nevertheless address the equivalent of coping, namely behaviors and/or cognitions intended to deal with an illness situation appraised as stressful. The results of these studies are encouraging, although largely limited to the improvement of one or two particular coping strategies, problem-focused strategies in particular. It is argued that in order to expand on these initially positive findings, greater and more explicit consideration should be given to the potential of the coping concept for intervention with the chronically ill. The appraisal of stressful situations, the use of coping resources, and the strategic application of particular coping strategies should, for example, be given more careful consideration.
The influence of vertical motor responses on explicit and incidental processing of power words.
Jiang, Tianjiao; Sun, Lining; Zhu, Lei
2015-07-01
There is increasing evidence demonstrating that power judgment is affected by vertical information. Such interaction between vertical space and power (i.e., response facilitation under space-power congruent conditions) is generally elicited in paradigms that require participants to explicitly evaluate the power of the presented words. The current research explored the possibility that explicit evaluative processing is not a prerequisite for the emergence of this effect. Here we compared the influence of vertical information on a standard explicit power evaluation task with influence on a task that linked power with stimuli in a more incidental manner, requiring participants to report whether the words represented people or animals or the font of the words. The results revealed that although the effect is more modest, the interaction between responses and power is also evident in an incidental task. Furthermore, we also found that explicit semantic processing is a prerequisite to ensure such an effect. Copyright © 2015 Elsevier Inc. All rights reserved.
Schultz, Douglas H.; Balderston, Nicholas L.; Geiger, Jennifer A.; Helmstetter, Fred J.
2014-01-01
The nature of the relationship between explicit and implicit learning is a topic of considerable debate. In order to investigate this relationship we conducted two experiments on postconditioning revaluation of the unconditional stimulus (UCS) in human fear conditioning. In Experiment 1, the intensity of the UCS was decreased following acquisition for one group (devaluation) and held constant for another group (control). A subsequent test revealed that even though both groups exhibited similar levels of UCS expectancy, the devaluation group had significantly smaller conditional skin conductance responses. The devaluation effect was not explained by differences in the explicit estimates of UCS probability or explicit knowledge that the UCS intensity had changed. In Experiment 2, the value of the UCS was increased following acquisition for one group (inflation) and held constant for another group (control). Test performance revealed that UCS inflation did not alter expectancy ratings, but the inflation group exhibited larger learned skin conductance responses than the control group. The inflation effect was not explained by differences in the explicit estimates of UCS probability or explicit knowledge that the UCS intensity had changed. The SCR revaluation effect was not dependent on explicit memory processes in either experiment. In both experiments we found differences on an implicit measure of learning in the absence of changes in explicit measures. Together, the differences observed between expectancy measures and skin conductance support the idea that these responses might reflect different types of memory formed during the same training procedure and be supported by separate neural systems. PMID:23731073
The importance of values in evidence-based medicine.
Kelly, Michael P; Heath, Iona; Howick, Jeremy; Greenhalgh, Trisha
2015-10-12
Evidence-based medicine (EBM) has always required integration of patient values with 'best' clinical evidence. It is widely recognized that scientific practices and discoveries, including those of EBM, are value-laden. But to date, the science of EBM has focused primarily on methods for reducing bias in the evidence, while the role of values in the different aspects of the EBM process has been almost completely ignored. In this paper, we address this gap by demonstrating how a consideration of values can enhance every aspect of EBM, including: prioritizing which tests and treatments to investigate, selecting research designs and methods, assessing effectiveness and efficiency, supporting patient choice and taking account of the limited time and resources available to busy clinicians. Since values are integral to the practice of EBM, it follows that the highest standards of EBM require values to be made explicit, systematically explored, and integrated into decision making. Through 'values based' approaches, EBM's connection to the humanitarian principles upon which it was founded will be strengthened.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gledhill, Jonathan D.; Tozer, David J., E-mail: d.j.tozer@durham.ac.uk
Density scaling considerations are used to derive an exchange–correlation explicit density functional that is appropriate for the electron deficient side of the integer and which recovers the exact r → ∞ asymptotic behaviour of the exchange–correlation potential. The functional has an unconventional mathematical form with parameters that are system-dependent; the parameters for an N-electron system are determined in advance from generalised gradient approximation (GGA) calculations on the N- and (N − 1)-electron systems. Compared to GGA results, the functional yields similar exchange–correlation energies, but HOMO energies that are an order of magnitude closer to the negative of the vertical ionisation potential; for anions, the HOMO energies are negative, as required. Rydberg excitation energies are also notably improved and the exchange–correlation potential is visibly lowered towards the near-exact potential. Further development is required to improve valence excitations, static isotropic polarisabilities, and the shape of the potential in non-asymptotic regions. The functional is fundamentally different to conventional approximations.
What is in a name? Is food addiction a misnomer?
Vella, Shae-Leigh; Pai, Nagesh
2017-02-01
Interest in the phenomenon of food addiction has increased substantially since the inclusion of gambling disorder in the DSM-5. However, the phenomenon of food addiction remains controversial, and the designation continues to lack clear consideration. Few researchers have offered an explicit theoretical definition of the phenomenon, which is fundamental: it not only pertains to the aetiology but also directs research and management of the phenomenon. This review therefore explores 'what is in a name'. Specifically, possible aetiologies of food addiction, eating addiction, and food addiction as an eating disorder are reviewed, and the potential DSM-5 classification is espoused. It is evident that the phenomenon requires further research and evaluation in order to delineate whether it constitutes a disorder and, if it is found to be a valid entity, the most appropriate designation. As it is too early to draw definitive conclusions regarding the concept, all plausible designations and the associated aetiologies require further investigation. Copyright © 2016 Elsevier B.V. All rights reserved.
Annual Irrigation Dynamics in the U.S. Northern High Plains Derived from Landsat Satellite Data
NASA Astrophysics Data System (ADS)
Deines, Jillian M.; Kendall, Anthony D.; Hyndman, David W.
2017-09-01
Sustainable management of agricultural water resources requires improved understanding of irrigation patterns in space and time. We produced annual, high-resolution (30 m) irrigation maps for 1999-2016 by combining all available Landsat satellite imagery with climate and soil covariables in Google Earth Engine. Random forest classification had accuracies from 92 to 100% and generally agreed with county statistics (
Steven T. Knick; Steven E. Hanser; Matthias Leu; Cameron L. Aldridge; Scott E. Neilsen; Mary M. Rowland; Sean P. Finn; Michael J. Wisdom
2011-01-01
We conducted an ecoregional assessment of sagebrush (Artemisia spp.) ecosystems in the Wyoming Basins and surrounding regions (WBEA) to determine broad-scale species-environmental relationships. Our goal was to assess the potential influence from threats to the sagebrush ecosystem on associated wildlife through the use of spatially explicit...
Default contagion risks in Russian interbank market
NASA Astrophysics Data System (ADS)
Leonidov, A. V.; Rumyantsev, E. L.
2016-06-01
Systemic risks of default contagion in the Russian interbank market are investigated. The analysis is based on the bow-tie structure of the weighted oriented graph describing the structure of the interbank loans. A probabilistic model of interbank contagion is developed that explicitly takes into account the empirical bow-tie structure, which reflects the functionality of the corresponding nodes (borrowers, lenders, or both simultaneously), as well as the degree distributions and disassortativity of the interbank network under consideration, all based on empirical data. The characteristics of contagion-related systemic risk calculated with this model are shown to be in agreement with those of explicit stress tests.
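The default-contagion mechanism itself reduces to a simple fixed-point iteration over the loan graph. This is a toy sketch with hypothetical exposures and capital buffers; the paper's model is probabilistic and built on the empirical bow-tie structure:

```python
def cascade(exposures, capital, initially_defaulted):
    """Deterministic default cascade: bank j fails once its losses on loans to
    defaulted counterparties exceed its capital buffer.
    exposures[i][j] = amount bank j has lent to bank i (lost if i defaults)."""
    defaulted = set(initially_defaulted)
    changed = True
    while changed:                       # iterate to the fixed point of the cascade
        changed = False
        for j in capital:
            if j in defaulted:
                continue
            loss = sum(exposures.get(i, {}).get(j, 0.0) for i in defaulted)
            if loss > capital[j]:
                defaulted.add(j)
                changed = True
    return defaulted
```

In the bow-tie picture, pure borrowers can only trigger such cascades, pure lenders can only absorb them, and the intermediating core both receives and transmits losses.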
Welch, Vivian A; Akl, Elie A; Guyatt, Gordon; Pottie, Kevin; Eslava-Schmalbach, Javier; Ansari, Mohammed T; de Beer, Hans; Briel, Matthias; Dans, Tony; Dans, Inday; Hultcrantz, Monica; Jull, Janet; Katikireddi, Srinivasa Vittal; Meerpohl, Joerg; Morton, Rachael; Mosdol, Annhild; Petkovic, Jennifer; Schünemann, Holger J; Sharaf, Ravi N; Singh, Jasvinder A; Stanev, Roger; Tonia, Thomy; Tristan, Mario; Vitols, Sigurd; Watine, Joseph; Tugwell, Peter
2017-10-01
This article introduces the rationale and methods for explicitly considering health equity in the Grading of Recommendations Assessment, Development and Evaluation (GRADE) methodology for development of clinical, public health, and health system guidelines. We searched for guideline methodology articles, conceptual articles about health equity, and examples of guidelines that considered health equity explicitly. We held three meetings with GRADE Working Group members and invited comments from the GRADE Working Group listserve. We developed three articles on incorporating equity considerations into the overall approach to guideline development, rating certainty, and assembling the evidence base and evidence to decision and/or recommendation. Clinical and public health guidelines have a role to play in promoting health equity by explicitly considering equity in the process of guideline development. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Content relatedness in the social web based on social explicit semantic analysis
NASA Astrophysics Data System (ADS)
Ntalianis, Klimis; Otterbacher, Jahna; Mastorakis, Nikolaos
2017-06-01
In this paper a novel content relatedness algorithm for social media content is proposed, based on the Explicit Semantic Analysis (ESA) technique. The proposed scheme takes social interactions into consideration. In particular, starting from the vector space representation model, similarity is expressed as a summation of term weight products. Here, term weights are estimated by a social computing method, where the strength of each term is calculated from the attention the term receives. To this end, each post is split into two parts, the title and the comments area, while attention is defined by the number of social interactions such as likes and shares. The overall approach is named Social Explicit Semantic Analysis. Experimental results on real data show the advantages and limitations of the proposed approach, while an initial comparison between ESA and S-ESA is very promising.
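A stripped-down sketch of the idea (illustrative only: the weighting and similarity choices below are assumptions, not the paper's exact formulas) scales each term's weight by the attention a post receives and compares posts by a normalised summation of term-weight products:

```python
import math

def social_weights(post, likes, shares):
    """Attention-weighted term strengths: each term's weight is its frequency
    scaled by the post's social attention (1 + likes + shares, an assumed form)."""
    attention = 1.0 + likes + shares
    weights = {}
    for term in post.lower().split():
        weights[term] = weights.get(term, 0.0) + attention
    return weights

def relatedness(w1, w2):
    # cosine-style similarity: summation of term-weight products, normalised
    dot = sum(w * w2.get(t, 0.0) for t, w in w1.items())
    n1 = math.sqrt(sum(w * w for w in w1.values()))
    n2 = math.sqrt(sum(w * w for w in w2.values()))
    return dot / (n1 * n2) if n1 and n2 else 0.0
```

Because attention multiplies every term of a post uniformly, it changes how much a highly shared post contributes when aggregated against others, while the cosine normalisation keeps single-post self-similarity at one.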
Children exhibit different performance patterns in explicit and implicit theory of mind tasks.
Oktay-Gür, Nese; Schulz, Alexandra; Rakoczy, Hannes
2018-04-01
Three studies tested scope and limits of children's implicit and explicit theory of mind. In Studies 1 and 2, three- to six-year-olds (N = 84) were presented with closely matched explicit false belief tasks that differed in whether or not they required an understanding of aspectuality. Results revealed that children performed equally well in the different tasks, and performance was strongly correlated. Study 3 tested two-year-olds (N = 81) in implicit interactive versions of these tasks and found evidence for dis-unity: children performed competently only in those tasks that did not require an understanding of aspectuality. Taken together, the present findings suggest that early implicit and later explicit theory of mind tasks may tap different forms of cognitive capacities. Copyright © 2018 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Colangelo, Annette; Buchanan, Lori
2006-01-01
The failure of inhibition hypothesis posits a theoretical distinction between implicit and explicit access in deep dyslexia. Specifically, the effects of failure of inhibition are assumed only in conditions that have an explicit selection requirement in the context of production (i.e., aloud reading). In contrast, the failure of inhibition…
Modeling Physiological Processes That Relate Toxicant Exposure and Bacterial Population Dynamics
Klanjscek, Tin; Nisbet, Roger M.; Priester, John H.; Holden, Patricia A.
2012-01-01
Quantifying effects of toxicant exposure on metabolic processes is crucial to predicting microbial growth patterns in different environments. Mechanistic models, such as those based on Dynamic Energy Budget (DEB) theory, can link physiological processes to microbial growth. Here we expand the DEB framework to include explicit consideration of the role of reactive oxygen species (ROS). Extensions considered are: (i) additional terms in the equation for the “hazard rate” that quantifies mortality risk; (ii) a variable representing environmental degradation; (iii) a mechanistic description of toxic effects linked to increase in ROS production and aging acceleration, and to non-competitive inhibition of transport channels; (iv) a new representation of the “lag time” based on energy required for acclimation. We estimate model parameters using calibrated Pseudomonas aeruginosa optical density growth data for seven levels of cadmium exposure. The model reproduces growth patterns for all treatments with a single common parameter set, and bacterial growth for treatments of up to 150 mg(Cd)/L can be predicted reasonably well using parameters estimated from cadmium treatments of 20 mg(Cd)/L and lower. Our approach is an important step towards connecting levels of biological organization in ecotoxicology. The presented model reveals possible connections between processes that are not obvious from purely empirical considerations, enables validation and hypothesis testing by creating testable predictions, and identifies research required to further develop the theory. PMID:22328915
Magnetic design for the PediaFlow ventricular assist device.
Noh, Myounggyu D; Antaki, James F; Ricci, Michael; Gardiner, Jeff; Paden, Dave; Wu, Jingchun; Prem, Ed; Borovetz, Harvey; Paden, Bradley E
2008-02-01
This article describes a design process for a new pediatric ventricular assist device, the PediaFlow. The pump is embodied in a magnetically levitated turbodynamic design that was developed explicitly based on the requirements for chronic support of infants and small children. The procedure entailed the consideration of multiple pump topologies, from which an axial mixed-flow configuration was chosen for further development. The magnetic design includes permanent-magnet (PM) passive bearings for radial support of the rotor, an actively controlled thrust actuator for axial support, and a brushless direct current (DC) motor for rotation. These components are closely coupled both geometrically and magnetically, and were therefore optimized in parallel, using electromagnetic, rotordynamic, and fluid models, in consideration of hydrodynamic requirements. Multiple design objectives were considered, including efficiency, size, and margin between critical speeds and operating speed. The latter depends upon the radial and yaw stiffnesses of the PM bearings. Analytical expressions for the stiffnesses were derived and verified through finite element analysis (FEA). A toroidally wound motor was designed for high efficiency and minimal additional negative radial stiffness. The design process relies heavily on optimization at the component level and system level. The results of this preliminary design optimization yielded a pump design with an overall stability margin of 15%, based on a pressure rise of 100 mm Hg at 0.5 lpm running at 16,000 rpm.
Replication and robustness in developmental research.
Duncan, Greg J; Engel, Mimi; Claessens, Amy; Dowsett, Chantelle J
2014-11-01
Replications and robustness checks are key elements of the scientific method and a staple in many disciplines. However, leading journals in developmental psychology rarely include explicit replications of prior research conducted by different investigators, and few require authors to establish in their articles or online appendices that their key results are robust across estimation methods, data sets, and demographic subgroups. This article makes the case for prioritizing both explicit replications and, especially, within-study robustness checks in developmental psychology. It provides evidence on variation in effect sizes in developmental studies and documents strikingly different replication and robustness-checking practices in a sample of journals in developmental psychology and a sister behavioral science-applied economics. Our goal is not to show that any one behavioral science has a monopoly on best practices, but rather to show how journals from a related discipline address vital concerns of replication and generalizability shared by all social and behavioral sciences. We provide recommendations for promoting graduate training in replication and robustness-checking methods and for editorial policies that encourage these practices. Although some of our recommendations may shift the form and substance of developmental research articles, we argue that they would generate considerable scientific benefits for the field. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
Functionals of Gegenbauer polynomials and D-dimensional hydrogenic momentum expectation values
NASA Astrophysics Data System (ADS)
Van Assche, W.; Yáñez, R. J.; González-Férez, R.; Dehesa, Jesús S.
2000-09-01
The system of Gegenbauer or ultraspherical polynomials {C_n^λ(x); n = 0, 1, …} is a classical family of polynomials orthogonal with respect to the weight function ω_λ(x) = (1 - x²)^(λ - 1/2) on the support interval [-1, +1]. Integral functionals of Gegenbauer polynomials with integrand f(x) [C_n^λ(x)]² ω_λ(x), where f(x) is an arbitrary function that does not depend on n or λ, are considered in this paper. First, a general recursion formula for these functionals is obtained. Then, explicit expressions for some specific functionals of this type are found in closed and compact form; namely, for the functionals with f(x) equal to (1 - x)^α (1 + x)^β, log(1 - x²), and (1 + x) log(1 + x), which appear in numerous physico-mathematical problems. Finally, these functionals are used in the explicit evaluation of the momentum expectation values
and are given by means of a terminating ₅F₄ hypergeometric function with unit argument, which is a considerable improvement with respect to Hey's expression (the only one existing up to now), which requires a double sum.
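For readers who want to check such functionals numerically, the integral can be evaluated directly with `scipy` and compared against the classical Gegenbauer orthogonality norm (the case f(x) = 1). The norm formula below is the standard textbook one, not an expression taken from this paper.

```python
import numpy as np
from math import pi, gamma, factorial
from scipy.integrate import quad
from scipy.special import eval_gegenbauer

def gegenbauer_functional(f, n, lam):
    """Numerically evaluate the integral of
    f(x) * [C_n^lam(x)]^2 * (1 - x^2)^(lam - 1/2) over [-1, 1]."""
    integrand = lambda x: (f(x) * eval_gegenbauer(n, lam, x) ** 2
                           * (1.0 - x * x) ** (lam - 0.5))
    val, _ = quad(integrand, -1.0, 1.0)
    return val

# Sanity check with f(x) = 1: the functional reduces to the known norm
#   pi * 2^(1-2*lam) * Gamma(n + 2*lam) / (n! * (n + lam) * Gamma(lam)^2)
n, lam = 3, 1.5
numeric = gegenbauer_functional(lambda x: 1.0, n, lam)
closed = (pi * 2 ** (1 - 2 * lam) * gamma(n + 2 * lam)
          / (factorial(n) * (n + lam) * gamma(lam) ** 2))
print(numeric, closed)
```

Other choices of f, such as f(x) = (1 - x)^α (1 + x)^β, can be checked the same way against the closed forms the paper derives.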
Shang, Barry Z; Voulgarakis, Nikolaos K; Chu, Jhih-Wei
2012-07-28
This work illustrates that fluctuating hydrodynamics (FHD) simulations can be used to capture the thermodynamic and hydrodynamic responses of molecular fluids at the nanoscale, including those associated with energy and heat transfer. Using all-atom molecular dynamics (MD) trajectories as the reference data, the atomistic coordinates of each snapshot are mapped onto mass, momentum, and energy density fields on Eulerian grids to generate a corresponding field trajectory. The molecular length-scale associated with finite molecule size is explicitly imposed during this coarse-graining by requiring that the variances of density fields scale inversely with the grid volume. From the fluctuations of field variables, the response functions and transport coefficients encoded in the all-atom MD trajectory are computed. By using the extracted fluid properties in FHD simulations, we show that the fluctuations and relaxation of hydrodynamic fields quantitatively match with those observed in the reference all-atom MD trajectory, hence establishing compatibility between the atomistic and field representations. We also show that inclusion of energy transfer in the FHD equations can more accurately capture the thermodynamic and hydrodynamic responses of molecular fluids. The results indicate that the proposed MD-to-FHD mapping with explicit consideration of finite molecule size provides a robust framework for coarse-graining the solution phase of complex molecular systems.
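The mapping step described above, binning atomistic coordinates onto Eulerian density fields, can be sketched generically as follows. This is a plain binning illustration with synthetic particle data, not the authors' MD-to-FHD code.

```python
import numpy as np

def coarse_grain_density(positions, masses, box, n_cells):
    """Map particle positions in a cubic box of side `box` onto an
    Eulerian mass-density field with n_cells bins per dimension."""
    edges = [np.linspace(0.0, box, n_cells + 1)] * 3
    hist, _ = np.histogramdd(positions, bins=edges, weights=masses)
    cell_vol = (box / n_cells) ** 3
    return hist / cell_vol  # mass density per cell

rng = np.random.default_rng(0)
N, box = 10_000, 10.0
pos = rng.uniform(0.0, box, size=(N, 3))   # synthetic "snapshot"
m = np.full(N, 18.0)                        # arbitrary particle mass

rho = coarse_grain_density(pos, m, box, n_cells=8)
# Binning conserves total mass exactly; the variance of rho across cells
# grows as cell volume shrinks, the finite-size scaling the paper imposes.
print(rho.sum() * (box / 8) ** 3, m.sum())
```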
NASA Technical Reports Server (NTRS)
Gulden, L. E.; Rosero, E.; Yang, Z.-L.; Rodell, Matthew; Jackson, C. S.; Niu, G.-Y.; Yeh, P. J.-F.; Famiglietti, J. S.
2007-01-01
Land surface models (LSMs) are computer programs, similar to weather and climate prediction models, which simulate the storage and movement of water (including soil moisture, snow, evaporation, and runoff) after it falls to the ground as precipitation. It is not currently possible to measure all of the variables of interest everywhere on Earth with sufficient accuracy. Hence LSMs have been developed to integrate the available information, including satellite observations, using powerful computers, in order to track water storage and redistribution. The maps are used to improve weather forecasts, support water resources and agricultural applications, and study the Earth's water cycle and climate variability. Recently, the models have begun to simulate groundwater storage. In this paper, we compare several possible approaches, and examine the pitfalls associated with trying to estimate aquifer parameters (such as porosity) that are required by the models. We find that explicit representation of groundwater, as opposed to the addition of deeper soil layers, considerably decreases the sensitivity of modeled terrestrial water storage to aquifer parameter choices. We also show that approximate knowledge of parameter values is not sufficient to guarantee realistic model performance: because interaction among parameters is significant, they must be prescribed as a harmonious set.
Seidl, Rupert; Lexer, Manfred J
2013-01-15
The unabated continuation of anthropogenic greenhouse gas emissions and the lack of an international consensus on a stringent climate change mitigation policy underscore the importance of adaptation for coping with the all but inevitable changes in the climate system. Adaptation measures in forestry have particularly long lead times. A timely implementation is thus crucial for reducing the considerable climate vulnerability of forest ecosystems. However, since future environmental conditions as well as future societal demands on forests are inherently uncertain, a core requirement for adaptation is robustness to a wide variety of possible futures. Here we explicitly address the roles of climatic and social uncertainty in forest management, and tackle the question of robustness of adaptation measures in the context of multi-objective sustainable forest management (SFM). We used the Austrian Federal Forests (AFF) as a case study, and employed a comprehensive vulnerability assessment framework based on ecosystem modeling, multi-criteria decision analysis, and practitioner participation. We explicitly considered climate uncertainty by means of three climate change scenarios, and accounted for uncertainty in future social demands by means of three societal preference scenarios regarding SFM indicators. We found that the effects of climatic and social uncertainty on the projected performance of management were in the same order of magnitude, underlining the notion that climate change adaptation requires an integrated social-ecological perspective. Furthermore, our analysis of adaptation measures revealed considerable trade-offs between reducing adverse impacts of climate change and facilitating adaptive capacity. This finding implies that prioritization between these two general aims of adaptation is necessary in management planning, which we suggest can draw on uncertainty analysis: Where the variation induced by social-ecological uncertainty renders measures aiming to reduce climate change impacts statistically insignificant (i.e., for approximately one third of the investigated management units of the AFF case study), fostering adaptive capacity is suggested as the preferred pathway for adaptation. We conclude that climate change adaptation needs to balance between anticipating expected future conditions and building the capacity to address unknowns and surprises. Copyright © 2012 Elsevier Ltd. All rights reserved.
The ABCs of Student Engagement
ERIC Educational Resources Information Center
Parsons, Seth A.; Nuland, Leila Richey; Parsons, Allison Ward
2014-01-01
Student engagement is an important consideration for teachers and administrators because it is explicitly associated with achievement. The authors outline what they call the ABCs of engagement: Affective engagement, Behavioral engagement, and Cognitive engagement. They also present "Three Things Every Teacher Needs to Know about…
Finite Element Analysis of the Maximum Stress at the Joints of the Transmission Tower
NASA Astrophysics Data System (ADS)
Itam, Zarina; Beddu, Salmia; Liyana Mohd Kamal, Nur; Bamashmos, Khaled H.
2016-03-01
Transmission towers are tall structures, usually steel lattice towers, used to support overhead power lines. Usually, transmission towers are analyzed as frame-truss systems, and the members are assumed to be pin-connected without explicitly considering the effects of joints on the tower behavior. In this research, an engineering example of a joint is analyzed with consideration of the joint detailing to investigate how it affects the tower analysis. A static analysis using STAAD Pro was conducted to identify the joint with the maximum stress. This joint was then explicitly analyzed in ANSYS using the Finite Element Method. Three approaches were used in the software: a simple plate model, bonded contact with no bolts, and beam element bolts. Results from the joint analysis show that stress values increased when joint details were considered. This proves that joints and connections play an important role in the distribution of stress within the transmission tower.
Mrozek, Piotr
2011-08-01
A numerical model explicitly considering the space-charge density evolved both under the mask and in the region of optical structure formation was used to predict the profiles of Ag concentration during field-assisted Ag(+)-Na(+) ion exchange channel waveguide fabrication. The influence of the unequal values of diffusion constants and mobilities of incoming and outgoing ions, the value of a correlation factor (Haven ratio), and particularly space-charge density induced during the ion exchange, on the resulting profiles of Ag concentration was analyzed and discussed. It was shown that the incorporation into the numerical model of a small quantity of highly mobile ions other than exclusively Ag(+) and Na(+) may considerably affect the range and shape of calculated Ag profiles in the multicomponent glass. The Poisson equation was used to predict the electric field spread evolution in the glass substrate. The results of the numerical analysis were verified by the experimental data of Ag concentration in a channel waveguide fabricated using a field-assisted process.
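A minimal numerical sketch of the simplest ingredient of such a model, binary Ag+/Na+ interdiffusion with the standard concentration-dependent interdiffusion coefficient D(c) = D_Ag / (1 - α c), where α = 1 - D_Ag/D_Na, is shown below. It deliberately omits the paper's space-charge and Poisson-equation terms; all numbers are arbitrary illustration values.

```python
import numpy as np

# 1-D explicit finite-difference sketch of binary ion interdiffusion.
# Textbook model only: D(c) = D_Ag / (1 - alpha*c), alpha = 1 - D_Ag/D_Na.
D_Ag, D_Na = 1.0, 5.0            # mobilities differ (arbitrary units)
alpha = 1.0 - D_Ag / D_Na
nx, nt, dx = 100, 2000, 1.0
dt = 0.2 * dx * dx / D_Na        # safely under the explicit stability limit

c = np.zeros(nx)                  # normalized Ag concentration
c[0] = 1.0                        # glass surface held at full exchange
for _ in range(nt):
    D = D_Ag / (1.0 - alpha * c)
    flux = -0.5 * (D[1:] + D[:-1]) * np.diff(c) / dx   # face-averaged D
    c[1:-1] -= dt * np.diff(flux) / dx
    c[0] = 1.0                    # Dirichlet boundary at the surface

print(c[0], c[-1])                # profile decays monotonically into the glass
```

Because D(c) grows with c here, the resulting profile is steeper than a constant-D error-function profile, one of the qualitative effects the abstract discusses before adding field-assisted drift and space charge.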
The pragmatist's guide to comparative effectiveness research.
Chandra, Amitabh; Jena, Anupam B; Skinner, Jonathan S
2011-01-01
Following an acrimonious health care reform debate involving charges of "death panels," Congress in 2010 explicitly forbade the use of cost-effectiveness analysis in government programs under the Patient Protection and Affordable Care Act. In this context, comparative effectiveness research emerged as an alternative strategy to understand better what works in health care. Put simply, comparative effectiveness research compares the efficacy of two or more diagnostic tests, treatments, or health care delivery methods without any explicit consideration of costs. To economists, the omission of costs from an assessment might seem nonsensical, but we argue that comparative effectiveness research still holds promise. First, it sidesteps one problem facing cost-effectiveness analysis: the widespread political resistance to the idea of using prices in health care. Second, there is little or no evidence on comparative effectiveness for a vast array of treatments: for example, we don't know whether proton-beam therapy, a very expensive treatment for prostate cancer (which requires building a cyclotron and a facility the size of a football field), offers any advantage over conventional approaches. Most drug studies compare new drugs to placebos rather than "head-to-head" with other drugs on the market, leaving a vacuum as to which drug works best. Finally, comparative effectiveness research can prove a useful first step even in the absence of cost information if it provides key estimates of treatment effects. After all, such effects are typically expensive to determine and require years or even decades of data. Costs are much easier to measure and can be appended at a later date as financial Armageddon draws closer.
Spatially explicit multi-criteria decision analysis for managing vector-borne diseases
2011-01-01
The complex epidemiology of vector-borne diseases creates significant challenges in the design and delivery of prevention and control strategies, especially in light of rapid social and environmental changes. Spatial models for predicting disease risk based on environmental factors such as climate and landscape have been developed for a number of important vector-borne diseases. The resulting risk maps have proven value for highlighting areas for targeting public health programs. However, these methods generally only offer technical information on the spatial distribution of disease risk itself, which may be incomplete for making decisions in a complex situation. In prioritizing surveillance and intervention strategies, decision-makers often also need to consider spatially explicit information on other important dimensions, such as the regional specificity of public acceptance, population vulnerability, resource availability, intervention effectiveness, and land use. There is a need for a unified strategy for supporting public health decision making that integrates available data for assessing spatially explicit disease risk, with other criteria, to implement effective prevention and control strategies. Multi-criteria decision analysis (MCDA) is a decision support tool that allows for the consideration of diverse quantitative and qualitative criteria using both data-driven and qualitative indicators for evaluating alternative strategies with transparency and stakeholder participation. Here we propose a MCDA-based approach to the development of geospatial models and spatially explicit decision support tools for the management of vector-borne diseases. We describe the conceptual framework that MCDA offers as well as technical considerations, approaches to implementation and expected outcomes. We conclude that MCDA is a powerful tool that offers tremendous potential for use in public health decision-making in general and vector-borne disease management in particular. 
PMID:22206355
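The weighted-sum scoring at the core of many MCDA implementations can be sketched in a few lines. The strategies, criteria, scores, and weights below are purely hypothetical placeholders, not values from the study.

```python
import numpy as np

# Hypothetical criteria scores for three vector-control strategies, 0-1 scale.
# Columns: risk reduction, public acceptance, cost-effectiveness, feasibility.
scores = np.array([
    [0.9, 0.4, 0.3, 0.6],   # larviciding campaign
    [0.6, 0.8, 0.7, 0.8],   # community education + source reduction
    [0.7, 0.5, 0.6, 0.4],   # targeted indoor residual spraying
])
weights = np.array([0.4, 0.2, 0.2, 0.2])   # stakeholder-elicited, sum to 1

totals = scores @ weights                   # weighted-sum aggregation
ranking = np.argsort(totals)[::-1]          # best strategy first
for i in ranking:
    print(i, round(float(totals[i]), 3))
```

In a spatially explicit setting, the same aggregation is applied cell by cell over raster layers of criteria, producing a prioritization map rather than a single ranking.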
Bevelhimer, Mark S.; DeRolph, Christopher R.; Schramm, Michael P.
2016-06-06
Uncertainty about environmental mitigation needs at existing and proposed hydropower projects makes it difficult for stakeholders to minimize environmental impacts. Hydropower developers and operators desire tools to better anticipate mitigation requirements, while natural resource managers and regulators need tools to evaluate different mitigation scenarios and order effective mitigation. Here we sought to examine the feasibility of using a suite of multidisciplinary explanatory variables within a spatially explicit modeling framework to fit predictive models for future environmental mitigation requirements at hydropower projects across the conterminous U.S. Using a database comprised of mitigation requirements from more than 300 hydropower project licenses, we were able to successfully fit models for nearly 50 types of environmental mitigation and to apply the predictive models to a set of more than 500 non-powered dams identified as having hydropower potential. The results demonstrate that mitigation requirements have been a result of a range of factors, from biological and hydrological to political and cultural. Furthermore, project developers can use these models to inform cost projections and design considerations, while regulators can use the models to more quickly identify likely environmental issues and potential solutions, hopefully resulting in more timely and more effective decisions on environmental mitigation.
DeRolph, Christopher R; Schramm, Michael P; Bevelhimer, Mark S
2016-10-01
Uncertainty about environmental mitigation needs at existing and proposed hydropower projects makes it difficult for stakeholders to minimize environmental impacts. Hydropower developers and operators desire tools to better anticipate mitigation requirements, while natural resource managers and regulators need tools to evaluate different mitigation scenarios and order effective mitigation. Here we sought to examine the feasibility of using a suite of multi-faceted explanatory variables within a spatially explicit modeling framework to fit predictive models for future environmental mitigation requirements at hydropower projects across the conterminous U.S. Using a database comprised of mitigation requirements from more than 300 hydropower project licenses, we were able to successfully fit models for nearly 50 types of environmental mitigation and to apply the predictive models to a set of more than 500 non-powered dams identified as having hydropower potential. The results demonstrate that mitigation requirements are functions of a range of factors, from biophysical to socio-political. Project developers can use these models to inform cost projections and design considerations, while regulators can use the models to more quickly identify likely environmental issues and potential solutions, hopefully resulting in more timely and more effective decisions on environmental mitigation. Copyright © 2016 Elsevier B.V. All rights reserved.
Extinguishing trace fear engages the retrosplenial cortex rather than the amygdala
Kwapis, Janine L.; Jarome, Timothy J.; Lee, Jonathan L.; Gilmartin, Marieke R.; Helmstetter, Fred J.
2013-01-01
Extinction learning underlies the treatment for a variety of anxiety disorders. Most of what is known about the neurobiology of extinction is based on standard “delay” fear conditioning, in which awareness is not required for learning. Little is known about how complex, explicit associations extinguish, however. “Trace” conditioning is considered to be a rodent model of explicit fear because it relies on both the cortex and hippocampus and requires explicit contingency awareness in humans. Here, we explore the neural circuit supporting trace fear extinction in order to better understand how complex memories extinguish. We first show that the amygdala is selectively involved in delay fear extinction; blocking intra-amygdala glutamate receptors disrupted delay, but not trace extinction. Further, ERK phosphorylation was increased in the amygdala after delay, but not trace extinction. We then identify the retrosplenial cortex (RSC) as a key structure supporting trace extinction. ERK phosphorylation was selectively increased in the RSC following trace extinction and blocking intra-RSC NMDA receptors impaired trace, but not delay extinction. These findings indicate that delay and trace extinction require different neural circuits; delay extinction requires plasticity in the amygdala whereas trace extinction requires the RSC. Anxiety disorders linked to explicit memory may therefore depend on cortical processes that have not been traditionally targeted by extinction studies based on delay fear. PMID:24055593
Donor Behavior and Voluntary Support for Higher Education Institutions.
ERIC Educational Resources Information Center
Leslie, Larry L.; Ramey, Garey
Voluntary support of higher education in America is investigated through regression analysis of institutional characteristics at two points in time. The assumption of donor rationality together with explicit consideration of interorganizational relationships offers a coherent framework for the analysis of voluntary support by the major…
The History Major and Liberal Education
ERIC Educational Resources Information Center
Liberal Education, 2009
2009-01-01
All disciplines and fields have something important to contribute to liberal learning. History, however, provides something distinctive. This contribution can be enhanced by a more explicit understanding of the relationship between the history major and the broader goals and processes of liberal learning, and through consideration of that…
Henriksen, Niel M.; Roe, Daniel R.; Cheatham, Thomas E.
2013-01-01
Molecular dynamics force field development and assessment requires a reliable means for obtaining a well-converged conformational ensemble of a molecule in both a time-efficient and cost-effective manner. This remains a challenge for RNA because its rugged energy landscape results in slow conformational sampling and accurate results typically require explicit solvent which increases computational cost. To address this, we performed both traditional and modified replica exchange molecular dynamics simulations on a test system (alanine dipeptide) and an RNA tetramer known to populate A-form-like conformations in solution (single-stranded rGACC). A key focus is on providing the means to demonstrate that convergence is obtained, for example by investigating replica RMSD profiles and/or detailed ensemble analysis through clustering. We found that traditional replica exchange simulations still require prohibitive time and resource expenditures, even when using GPU accelerated hardware, and our results are not well converged even at 2 microseconds of simulation time per replica. In contrast, a modified version of replica exchange, reservoir replica exchange in explicit solvent, showed much better convergence and proved to be both a cost-effective and reliable alternative to the traditional approach. We expect this method will be attractive for future research that requires quantitative conformational analysis from explicitly solvated simulations. PMID:23477537
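The exchange step at the heart of replica exchange (parallel tempering) is a Metropolis test on the energies of two replicas. Below is a minimal sketch of the conventional temperature-REMD acceptance criterion; the reservoir variant the authors study modifies this rule, so this is the generic textbook form only.

```python
import math

def swap_probability(E_i, E_j, T_i, T_j, kB=0.0019872041):
    """Metropolis acceptance probability for swapping configurations
    between replicas at temperatures T_i and T_j (kB in kcal/mol/K).
    Standard T-REMD criterion: min(1, exp[(beta_i - beta_j)(E_i - E_j)])."""
    beta_i = 1.0 / (kB * T_i)
    beta_j = 1.0 / (kB * T_j)
    return min(1.0, math.exp((beta_i - beta_j) * (E_i - E_j)))

# Cold replica sitting at higher energy than the hot one: always swap.
p_downhill = swap_probability(-100.0, -105.0, 300.0, 310.0)
# Reverse situation: swap accepted only with probability < 1.
p_uphill = swap_probability(-105.0, -100.0, 300.0, 310.0)
print(p_downhill, p_uphill)
```

Closely spaced temperatures keep the energy-distribution overlap, and hence the acceptance rate, high; the cost of maintaining many explicit-solvent replicas is exactly the expense the abstract's reservoir method is designed to reduce.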
Henriksen, Niel M; Roe, Daniel R; Cheatham, Thomas E
2013-04-18
Molecular dynamics force field development and assessment requires a reliable means for obtaining a well-converged conformational ensemble of a molecule in both a time-efficient and cost-effective manner. This remains a challenge for RNA because its rugged energy landscape results in slow conformational sampling and accurate results typically require explicit solvent which increases computational cost. To address this, we performed both traditional and modified replica exchange molecular dynamics simulations on a test system (alanine dipeptide) and an RNA tetramer known to populate A-form-like conformations in solution (single-stranded rGACC). A key focus is on providing the means to demonstrate that convergence is obtained, for example, by investigating replica RMSD profiles and/or detailed ensemble analysis through clustering. We found that traditional replica exchange simulations still require prohibitive time and resource expenditures, even when using GPU accelerated hardware, and our results are not well converged even at 2 μs of simulation time per replica. In contrast, a modified version of replica exchange, reservoir replica exchange in explicit solvent, showed much better convergence and proved to be both a cost-effective and reliable alternative to the traditional approach. We expect this method will be attractive for future research that requires quantitative conformational analysis from explicitly solvated simulations.
Resource allocation in health care and the role of personal autonomy.
Gandjour, A
2015-03-01
Resource allocation decisions in health care require the consideration of ethical values. Major ethical theories include Amartya Sen's capability approach, Norman Daniels's theory of justice for health, and preference utilitarian theory. This paper argues that while only preference utilitarian theory explicitly considers the impact of an individual's actions on others, all 3 theories agree in providing for individual autonomy. Furthermore, it shows that all 3 theories emphasise the role of informed preferences in securing individual autonomy. Still, stressing personal autonomy has limited direct implications for priority setting. Two priority rules for resource allocation could be identified: 1) to give priority to patients with mental disability (over those with purely physical disability); and 2) to give priority to patients with a large expected loss of autonomy without treatment. © Georg Thieme Verlag KG Stuttgart · New York.
Area law from loop quantum gravity
NASA Astrophysics Data System (ADS)
Hamma, Alioscia; Hung, Ling-Yan; Marcianò, Antonino; Zhang, Mingyi
2018-03-01
We explore the constraints following from requiring the area law in the entanglement entropy in the context of loop quantum gravity. We find a unique solution to the single-link wave function in the large j limit, believed to be appropriate in the semiclassical limit. We then generalize our considerations to multilink coherent states, and find that the area law is preserved very generically using our single-link wave function as a building block. Finally, we develop the framework that generates families of multilink states that preserve the area law while avoiding macroscopic entanglement, the space-time analogue of "Schrödinger's cat." We note that these states, defined on a given set of graphs, are the ground states of some local Hamiltonian that can be constructed explicitly. This can potentially shed light on the construction of the appropriate Hamiltonian constraints in the LQG framework.
NASA Technical Reports Server (NTRS)
Shih, T. I.-P.; Roelke, R. J.; Steinthorsson, E.
1991-01-01
To study numerically the details of the flow and heat transfer within the coolant passages of turbine blades, a method must first be developed to generate grid systems within the very complicated geometries involved. In this study, a grid generation package was developed that is capable of generating the required grid systems. The package is based on an algebraic grid generation technique that gives the user considerable and very explicit control over how grid points are distributed. These controls include orthogonality of grid lines next to boundary surfaces and the ability to cluster about arbitrary points, lines, and surfaces. This paper describes the grid generation package and shows, via an example, how it can be used to generate grid systems within complicated coolant-passage geometries.
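The clustering control described above can be illustrated with a one-dimensional algebraic stretching. The tanh formula below is a common textbook choice for wall clustering, not necessarily the specific transformation used in the package.

```python
import numpy as np

def clustered_grid(n, beta=2.0):
    """1-D algebraic grid on [0, 1] with points clustered near x = 0
    via a tanh stretching of uniform computational coordinates.
    (Illustrative formula; larger beta gives stronger clustering.)"""
    eta = np.linspace(0.0, 1.0, n)                 # uniform coordinates
    return 1.0 - np.tanh(beta * (1.0 - eta)) / np.tanh(beta)

x = clustered_grid(21, beta=2.5)
dx = np.diff(x)
print(x[0], x[-1])        # endpoints preserved: 0.0 and 1.0
print(dx[0] < dx[-1])     # finest spacing next to the clustered wall
```

Two-sided or interior clustering works the same way with different stretching functions, and tensor products of such 1-D maps give the multi-dimensional grids needed for passage geometries.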
The need for speed: informed land acquisitions for conservation in a dynamic property market.
McDonald-Madden, Eve; Bode, Michael; Game, Edward T; Grantham, Hedley; Possingham, Hugh P
2008-11-01
Land acquisition is a common approach to biodiversity conservation but is typically subject to property availability on the public market. Consequently, conservation plans are often unable to be implemented as intended. When properties come on the market, conservation agencies must make a choice: purchase immediately, often without a detailed knowledge of its biodiversity value; survey the parcel and accept the risk that it may be removed from the market during this process; or not purchase and hope a better parcel comes on the market at a later date. We describe both an optimal method, using stochastic dynamic programming, and a simple rule of thumb for making such decisions. The solutions to this problem illustrate how optimal conservation is necessarily dynamic and requires explicit consideration of both the time period allowed for implementation and the availability of properties.
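The stochastic-dynamic-programming flavor of the buy-now/wait decision can be conveyed with a drastically simplified finite-horizon toy: one parcel per period, value known on arrival, no survey option, no budget constraint. This illustrates the method class only, not the authors' model.

```python
import numpy as np

# Toy finite-horizon SDP: each period one parcel appears with conservation
# value drawn uniformly from the set below; buying ends the process,
# waiting discards the parcel. All numbers are illustrative.
values = np.linspace(0.0, 1.0, 11)   # possible parcel values
T = 10                                # decision periods available

V_next = 0.0                          # value of reaching the horizon unbought
thresholds = []
for t in range(T):
    # Buy iff the parcel's value beats the continuation value V_next.
    V_now = np.mean(np.maximum(values, V_next))
    thresholds.append(V_next)
    V_next = V_now
thresholds.reverse()                  # thresholds[0]: most periods remaining
print(thresholds)
```

The optimal policy is a declining acceptance threshold: with many periods left the agency should hold out for high-value parcels, but as the deadline nears it should accept progressively poorer ones, the same "speed matters" intuition the abstract's rule of thumb captures.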
The explicit and implicit dance in psychoanalytic change.
Fosshage, James L
2004-02-01
How the implicit/non-declarative and explicit/declarative cognitive domains interact is centrally important in considering how change is effected within the psychoanalytic arena. Stern et al. (1998) declare that long-lasting change occurs in the domain of implicit relational knowledge. In the view of this author, the implicit and explicit domains are intricately intertwined in an interactive dance within a psychoanalytic process. The author holds that a spirit of inquiry (Lichtenberg, Lachmann & Fosshage, 2002) serves as the foundation of the psychoanalytic process. Analyst and patient strive to explore, understand and communicate and, thereby, create a 'spirit' of interaction that contributes, through gradual incremental learning, to new implicit relational knowledge. This spirit, as part of the implicit relational interaction, is a cornerstone of the analytic relationship. The 'inquiry' more directly brings explicit/declarative processing to the foreground in the joint attempt to explore and understand. The spirit of inquiry in the psychoanalytic arena highlights both the autobiographical scenarios of the explicit memory system and the mental models of the implicit memory system as each contributes to a sense of self, other, and self with other. This process facilitates the extrication and suspension of old models, so that new models based on current relational experience can be gradually integrated into both memory systems for lasting change.
Ballesteros, Soledad; Reales, José M; García, Eulalio; Carrasco, Marisa
2006-02-01
Three experiments investigated the effects of two variables, selective attention during encoding and the delay between study and test, on implicit (picture fragment completion and object naming) and explicit (free recall and recognition) memory tests. Experiments 1 and 2 consistently indicated that (a) at all delays (immediate to 1 month), the picture-fragment identification threshold was lower for attended than for unattended pictures; (b) attended pictures were recalled and recognized better than unattended ones; and (c) attention and delay interacted in both memory tests. For implicit memory, performance decreased as delay increased for both attended and unattended pictures, but priming was more pronounced and lasted longer for attended pictures; it was still present after a 1-month delay. For explicit memory, performance decreased as delay increased for attended pictures, whereas for unattended pictures performance remained stable across delays. Using a perceptual object naming task, Experiment 3 showed reliable implicit and explicit memory for attended but not for unattended pictures. This study indicates that picture repetition priming requires attention at the time of study and that neither delay nor attention dissociates performance in explicit and implicit memory tests; both types of memory require attention, but explicit memory does so to a larger degree.
Putting an Ethical Lens on Learning Analytics
ERIC Educational Resources Information Center
West, Deborah; Huijser, Henk; Heath, David
2016-01-01
As learning analytics activity has increased, a variety of ethical implications and considerations have emerged, though a significant research gap remains in explicitly investigating the views of key stakeholders, such as academic staff. This paper draws on ethics-related findings from an Australian study featuring two surveys, one of…
ERIC Educational Resources Information Center
Binder, P.-M.; Richert, A.
2011-01-01
A series of papers have recently addressed the mechanism by which a siphon works. While all this started as an effort to clarify words--namely, dictionary definitions--the authors feel that words, along with the misguided use of physical concepts, are currently contributing to considerable confusion and casuistry on this subject. They wish to make…
Research on golden-winged warblers: recent progress and current needs
Henry M. Streby; Ronald W. Rohrbaugh; David A. Buehler; David E. Andersen; Rachel Vallender; David I. King; Tom Will
2016-01-01
Considerable advances have been made in knowledge about Golden-winged Warblers (Vermivora chrysoptera) in the past decade. Recent employment of molecular analysis, stable-isotope analysis, telemetry-based monitoring of survival and behavior, and spatially explicit modeling techniques has added to, and revised, an already broad base of published...
Influence of Familiar Features on Diagnosis: Instantiated Features in an Applied Setting
ERIC Educational Resources Information Center
Dore, Kelly L.; Brooks, Lee R.; Weaver, Bruce; Norman, Geoffrey R.
2012-01-01
Medical diagnosis can be viewed as a categorization task. There are two mechanisms whereby humans make categorical judgments: "analytical reasoning," based on explicit consideration of features and "nonanalytical reasoning," an unconscious holistic process of matching against prior exemplars. However, there is evidence that prior experience can…
ERIC Educational Resources Information Center
Huang, Tzu-Hua; Liu, Yuan-Chen
2017-01-01
This paper reflects thorough consideration of cultural perspectives in the establishment of science curriculum development principles in Taiwan. The authority explicitly states that education measures and activities of aboriginal peoples' ethnic group should be implemented consistently to incorporate their history, language, art, living customs,…
President's Education Aims Aired
ERIC Educational Resources Information Center
McNeil, Michele; Klein, Alyson
2009-01-01
By explicitly naming education as one of three top priority areas in his first joint congressional address and in his first federal budget proposal, President Barack Obama is putting considerable political weight--and even more money--behind the agenda he laid out during his campaign. Certain themes he struck in the February 24…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-15
... designed to prevent conflicts of interest; (iv) Any business or personal relationship of the compensation... business or personal relationship of the compensation consultant, legal counsel, other adviser or the... factors should explicitly include consideration of the personal and business relationships between...
Considering Fees in Psychodynamic Psychotherapy: Opportunities for Residents
ERIC Educational Resources Information Center
Newman, Stewart S.
2005-01-01
OBJECTIVE: The topic of money is ubiquitous to psychodynamic therapy work, yet often neglected in residency training programs. Residency allows a unique opportunity to address issues pertaining to money and their impact on therapy. METHOD: Through the experience of the author, the need for a more explicit and systematic consideration within…
Fast but fleeting: adaptive motor learning processes associated with aging and cognitive decline.
Trewartha, Kevin M; Garcia, Angeles; Wolpert, Daniel M; Flanagan, J Randall
2014-10-01
Motor learning has been shown to depend on multiple interacting learning processes. For example, learning to adapt when moving grasped objects with novel dynamics involves a fast process that adapts and decays quickly (and has been linked to explicit memory) and a slower process that adapts and decays more gradually. Each process is characterized by a learning rate that controls how strongly motor memory is updated based on experienced errors and a retention factor determining the movement-to-movement decay in motor memory. Here we examined whether fast and slow motor learning processes involved in learning novel dynamics differ between younger and older adults. In addition, we investigated how age-related decline in explicit memory performance influences learning and retention parameters. Although the groups adapted equally well, they did so with markedly different underlying processes. Whereas the groups had similar fast processes, they had different slow processes. Specifically, the older adults exhibited decreased retention in their slow process compared with younger adults. Within the older group, who exhibited considerable variation in explicit memory performance, we found that poor explicit memory was associated with reduced retention in the fast process, as well as the slow process. These findings suggest that explicit memory resources are a determining factor in impairments in both the fast and slow processes for motor learning but that aging effects on the slow process are independent of explicit memory declines.
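The fast and slow processes the abstract refers to are commonly formalized as a two-state linear model in which each state has its own retention factor and learning rate. The sketch below is a generic illustration of that model class, not the authors' fitted model; the function name and all parameter values are illustrative assumptions.

```python
def simulate(trials=200, perturbation=1.0,
             a_fast=0.6, b_fast=0.25,    # fast: low retention, fast learning
             a_slow=0.99, b_slow=0.02):  # slow: high retention, slow learning
    """Two-state adaptation model: each state keeps a fraction (retention
    factor a) of its value and learns from the error with rate b.
    Parameter values are illustrative, not the paper's estimates."""
    x_fast = x_slow = 0.0
    adaptation = []
    for _ in range(trials):
        net = x_fast + x_slow            # total motor output correction
        error = perturbation - net       # experienced movement error
        x_fast = a_fast * x_fast + b_fast * error
        x_slow = a_slow * x_slow + b_slow * error
        adaptation.append(net)
    return adaptation

adaptation = simulate()
```

With these values the fast state dominates early adaptation while the slow state carries most of the retained learning, which is the pattern the aging and explicit-memory results speak to.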
NASA Technical Reports Server (NTRS)
Kreider, Kevin L.; Baumeister, Kenneth J.
1996-01-01
An explicit finite difference real time iteration scheme is developed to study harmonic sound propagation in aircraft engine nacelles. To reduce storage requirements for future large 3D problems, the time dependent potential form of the acoustic wave equation is used. To ensure that the finite difference scheme is both explicit and stable for a harmonic monochromatic sound field, a parabolic (in time) approximation is introduced to reduce the order of the governing equation. The analysis begins with a harmonic sound source radiating into a quiescent duct. This fully explicit iteration method then calculates stepwise in time to obtain the 'steady state' harmonic solutions of the acoustic field. For stability, application of conventional impedance boundary conditions requires coupling to explicit hyperbolic difference equations at the boundary. The introduction of the time parameter eliminates the large matrix storage requirements normally associated with frequency domain solutions, and time marching attains the steady state quickly enough to make the method favorable when compared to frequency domain methods. For validation, this transient-frequency domain method is applied to sound propagation in a 2D hard wall duct with plug flow.
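The basic pattern described here, an explicit update marched stepwise in time subject to a stability limit on the time step, can be sketched with a 1-D analogue. This is not the paper's scheme (which solves a potential form of the acoustic equation in a duct with impedance boundaries); it is a minimal explicit finite-difference example with made-up parameters.

```python
import math

def march(nx=101, nt=300, c=1.0, dx=0.01, cfl=0.9):
    """Explicit time-marching for the 1-D wave equation u_tt = c^2 u_xx
    with fixed (u = 0) ends: a minimal analogue of the explicit
    stepwise-in-time iteration described above, not the paper's scheme."""
    dt = cfl * dx / c                    # stability requires c*dt/dx <= 1
    r2 = (c * dt / dx) ** 2
    length = (nx - 1) * dx
    # initial standing mode, started from rest
    u_prev = [math.sin(math.pi * i * dx / length) for i in range(nx)]
    u = u_prev[:]
    for _ in range(nt):
        u_next = [0.0] * nx              # hard-wall ends stay at zero
        for i in range(1, nx - 1):
            u_next[i] = (2.0 * u[i] - u_prev[i]
                         + r2 * (u[i + 1] - 2.0 * u[i] + u[i - 1]))
        u_prev, u = u, u_next
    return u

final = march()
```

No matrix is ever assembled or stored: each step only touches the two previous time levels, which is the storage saving the abstract emphasizes.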
Metcalfe, J. D.; Le Quesne, W. J. F.; Cheung, W. W. L.; Righton, D. A.
2012-01-01
Physiological studies focus on the responses of cells, tissues and individuals to stressors, usually in laboratory situations. Conservation and management, on the other hand, focus on populations. The field of conservation physiology addresses the question of how abiotic drivers of physiological responses at the level of the individual alter requirements for successful conservation and management of populations. To achieve this, impacts of physiological effects at the individual level need to be scaled to impacts on population dynamics, which requires consideration of ecology. Successfully realizing the potential of conservation physiology requires interdisciplinary studies incorporating physiology and ecology, and requires that a constructive dialogue develops between these traditionally disparate fields. To encourage this dialogue, we consider the increasingly explicit incorporation of physiology into ecological models applied to marine fish conservation and management. Conservation physiology is further challenged as the physiology of an individual revealed under laboratory conditions is unlikely to reflect realized responses to the complex variable stressors to which it is exposed in the wild. Telemetry technology offers the capability to record an animal's behaviour while simultaneously recording environmental variables to which it is exposed. We consider how the emerging insights from telemetry can strengthen the incorporation of physiology into ecology. PMID:22566680
Meng, Xianwei; Murakami, Taro; Hashiya, Kazuhide
2017-01-01
Understanding the referent of another's utterance by referring to contextual information helps in smooth communication. Although this pragmatic referential process can be observed even in infants, its underlying mechanism and the abilities it relies on remain unclear. This study aimed to clarify the basis of the referential process by investigating whether the phonological loop affects referent assignment. A total of 76 children (43 girls) aged 3-5 years participated in a reference assignment task in which an experimenter asked them to answer explicit (e.g., "What color is this?") and ambiguous (e.g., "What about this?") questions about colorful objects. Phonological loop capacity was measured using the forward digit span task, in which children were required to repeat numbers as an experimenter uttered them. The results showed that forward digit span scores positively predicted correct responses to explicit questions and to part of the ambiguous questions. That is, phonological loop capacity did not affect referent assignment for ambiguous questions that were asked after a topic shift from the explicit questions, and that thus required a backward reference to the preceding explicit questions to detect the intent of the current ambiguous question. These results suggest that although phonological loop capacity could overtly enhance the storage of verbal information, it does not seem to directly contribute to the pragmatic referential process, which might require further social-cognitive processes.
Weimholt, Josef
2015-01-01
One might expect--given the vastly different look, feel, and function of the ubiquitous (and innocuous) Nutrition Facts panel and the "inflammatory" graphic warning labels for cigarettes--that the statutes establishing such disclosure requirements would exhibit similar disparities. In fact, the relevant provisions of the Nutrition Labeling and Education Act of 1990 and the Family Smoking Prevention and Tobacco Control Act of 2009 are quite analogous. Like other mandated disclosures, the nutrition label and the cigarette graphic warnings seek to simultaneously inform and influence consumer decisions. Both statutes grant FDA considerable discretion in the implementation of the labeling requirements, generally allowing the agency to alter the format and content of the labels as necessary to promote the statutory goals. Thus, the differences in the nutrition and cigarette warning labels are not the product of the statutory schemes alone; rather, they reflect important differences in FDA's interpretation and prioritization of the dual regulatory goals, and in the agency's implicit or explicit assumptions about human behavior.
The Purpose of Analytical Models from the Perspective of a Data Provider.
ERIC Educational Resources Information Center
Sheehan, Bernard S.
The purpose of analytical models is to reduce complex institutional management problems and situations to simpler proportions and compressed time frames so that human skills of decision makers can be brought to bear most effectively. Also, modeling cultivates the art of management by forcing explicit and analytical consideration of important…
Perspectives on Complexity, Its Definition and Applications in the Field
ERIC Educational Resources Information Center
Koopmans, Matthijs
2017-01-01
There is considerable variation in the dynamical literature in how the term "complexity" is used. While there have been several attempts to describe from an educational perspective what complexity encompasses, the term is frequently used without an explicit definition. To forge a shared understanding of what complexity means, the purpose…
Research-Based Worksheets on Using Multiple Representations in Science Classrooms
ERIC Educational Resources Information Center
Hill, Matthew; Sharma, Manjula
2015-01-01
The ability to represent the world like a scientist is difficult to teach; it is more than simply knowing the representations (e.g., graphs, words, equations and diagrams). For meaningful science learning to take place, consideration needs to be given to explicitly integrating representations into instructional methods, linked to the content, and…
Child Labour, Education Policy and Governance in Cambodia
ERIC Educational Resources Information Center
Kim, Chae-Young
2011-01-01
This paper considers how the issue of child labour is located in Cambodian education policy debates and how it is affected by the major constraints surrounding the Cambodian education sector. In particular, it asks why Cambodian policy makers have not sought to address the issue explicitly despite its considerable, and adverse, impact on…
ERIC Educational Resources Information Center
Concannon-Gibney, Tara; Murphy, Brian
2012-01-01
Despite a wealth of international research indicating the importance but also the dearth of explicit reading comprehension instruction in classrooms, current classroom reading pedagogy does not appear to have acknowledged and addressed this shortcoming to any significant degree. This is cause for some considerable concern, as today's students…
Toward Modeling the Learner's Personality Using Educational Games
ERIC Educational Resources Information Center
Essalmi, Fathi; Tlili, Ahmed; Ben Ayed, Leila Jemni; Jemmi, Mohamed
2017-01-01
Learner modeling is a crucial step in the learning personalization process. It allows taking into consideration the learner's profile to make the learning process more efficient. Most studies rely on an explicit method, namely questionnaires, to model learners. Questionnaires are time consuming and may not be motivating for learners. Thus, this…
Family Policy in Canada: Some Theoretical Considerations and a Practical Application.
ERIC Educational Resources Information Center
Hepworth, H. Philip
Frequently implicit in Canadian social policy addressing other issues, family policy is generally assumed to be a good thing, is bound up with social structure, and, when made explicit, is prescriptive and potentially embarrassing to government. Historically important as a forerunner of more recent income assistance programs, the provision of…
Timescales and the management of ecological systems.
Hastings, Alan
2016-12-20
Human management of ecological systems, including issues like fisheries, invasive species, and restoration, often must be undertaken with limited information. This means that developing general principles and heuristic approaches is important. Here, I focus on one aspect, the importance of an explicit consideration of time, which arises because of the inherent limitations in the response of ecological systems. I focus mainly on simple systems and models, beginning with systems without density dependence, which are therefore linear. Even for these systems, it is important to recognize the necessary delays in the response of the ecological system to management. Here, I also provide details for optimization that show how general results emerge and emphasize how delays due to demography and life histories can change the optimal management approach. A brief discussion of systems with density dependence and tipping points shows that the same themes emerge: namely, that when considering issues of restoration or management to change the state of an ecological system, timescales need explicit consideration and may change the optimal approach in important ways.
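The role of demographic delay in a linear (density-independent) system can be made concrete with a toy two-stage model: a management action that raises fecundity changes the juvenile stage immediately, but the adult stage responds only after the boosted cohort matures. All numbers below are illustrative assumptions, not drawn from the paper.

```python
def project(years=20, boost_year=5):
    """Linear two-stage population. Raising fecundity at `boost_year`
    affects juveniles at once, but adults only a step later: the kind of
    built-in demographic lag the paper argues management must anticipate.
    Values are made up for illustration."""
    juveniles, adults = 0.0, 100.0
    history = []
    for t in range(years):
        fecundity = 1.2 if t >= boost_year else 0.8   # management action
        juveniles, adults = (fecundity * adults,
                             0.5 * juveniles + 0.9 * adults)
        history.append(adults)
    return history

with_action = project()
baseline = project(boost_year=10**9)   # the action never happens
```

Comparing the two runs shows the adult trajectory is identical through the year after the action and only then diverges: the management signal is delayed by exactly the maturation lag.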
The effect of articulatory suppression on implicit and explicit false memory in the DRM paradigm.
Van Damme, Ilse; Menten, Jan; d'Ydewalle, Gery
2010-11-01
Several studies have shown that reliable implicit false memory can be obtained in the DRM paradigm. There has been considerable debate, however, about whether or not conscious activation of critical lures during study is a necessary condition for this. Recent findings have revealed that articulatory suppression prevents subsequent false priming in an anagram task (Lovden & Johansson, 2003). The present experiment sought to replicate and extend these findings to an implicit word stem completion task, and to additionally investigate the effect of articulatory suppression on explicit false memory. Results showed an inhibitory effect of articulatory suppression on veridical memory, as well as on implicit false memory, whereas the level of explicit false memory was heightened. This suggests that articulatory suppression did not merely eliminate conscious lure activation, but had a more general capacity-delimiting effect. The drop in veridical memory can be attributed to diminished encoding of item-specific information. Superficial encoding also limited the spreading of semantic activation during study, which inhibited later false priming. In addition, the lack of item-specific and phenomenological details caused impaired source monitoring at test, resulting in heightened explicit false memory.
Optimal routing of hazardous substances in time-varying, stochastic transportation networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woods, A.L.; Miller-Hooks, E.; Mahmassani, H.S.
This report is concerned with the selection of routes in a network along which to transport hazardous substances, taking into consideration several key factors pertaining to the cost of transport and the risk of population exposure in the event of an accident. Furthermore, the fact that travel time and the risk measures are not constant over time is explicitly recognized in the routing decisions. Existing approaches typically assume static conditions, possibly resulting in inefficient route selection and unnecessary risk exposure. The report describes the application of recent advances in network analysis methodologies to the problem of routing hazardous substances. Several specific problem formulations are presented, reflecting different degrees of risk aversion on the part of the decision-maker, as well as different possible operational scenarios. All procedures explicitly consider travel times and travel costs (including risk measures) to be stochastic time-varying quantities. The procedures include both exact algorithms, which may require extensive computational effort in some situations, as well as more efficient heuristics that may not guarantee a Pareto-optimal solution. All procedures are systematically illustrated for an example application using the Texas highway network, for both normal and incident condition scenarios. The application illustrates the trade-offs between the information obtained in the solution and computational efficiency, and highlights the benefits of incorporating these procedures in a decision-support system for hazardous substance shipment routing decisions.
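One deterministic ingredient of such procedures, earliest-arrival routing when travel times vary with departure time, can be sketched as a time-dependent Dijkstra search (valid when links are FIFO, i.e., leaving later never means arriving earlier). The report's algorithms additionally handle stochastic costs and multiple risk criteria; the network and travel-time functions below are hypothetical.

```python
import heapq

def td_earliest_arrival(graph, source, t0):
    """Earliest-arrival times when edge travel time depends on departure
    time. `graph[u]` is a list of (v, travel_time_fn) pairs, where
    travel_time_fn(t) gives the link time if you depart u at time t.
    A FIFO-network sketch, not the report's stochastic procedures."""
    arrival = {source: t0}
    heap = [(t0, source)]
    while heap:
        t, u = heapq.heappop(heap)
        if t > arrival.get(u, float("inf")):
            continue                      # stale heap entry
        for v, travel_time in graph.get(u, []):
            ta = t + travel_time(t)
            if ta < arrival.get(v, float("inf")):
                arrival[v] = ta
                heapq.heappush(heap, (ta, v))
    return arrival

# hypothetical network: the depot->junction link is slower after t = 60
graph = {
    "depot": [("junction", lambda t: 10 if t < 60 else 20)],
    "junction": [("site", lambda t: 15)],
}
arrival = td_earliest_arrival(graph, "depot", 0)
```

Because link costs are functions of time, the same search naturally captures the report's point that a route that is best under static assumptions can stop being best once conditions change.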
NASA Astrophysics Data System (ADS)
Shi, X.; Utada, H.; Jiaying, W.
2009-12-01
The vector finite-element method combined with divergence corrections based on the magnetic field H, referred to as the VFEH++ method, is developed to simulate the magnetotelluric (MT) responses of 3-D conductivity models. The advantages of the new VFEH++ method are the use of edge elements to eliminate vector parasites and divergence corrections to explicitly guarantee the divergence-free conditions in the whole modeling domain. 3-D MT topographic responses are modeled using the new VFEH++ method and compared with those calculated by other numerical methods. The results show that MT responses can be modeled with high accuracy using the VFEH++ method. The VFEH++ algorithm is also employed for 3-D MT data inversion incorporating topography. The 3-D MT inverse problem is formulated as a minimization problem of the regularized misfit function. To avoid the huge memory requirement and long computation time of the Jacobian sensitivity matrix in the Gauss-Newton method, we employ the conjugate gradient (CG) approach to solve the inversion equation. In each iteration of the CG algorithm, the costly computation is the product of the Jacobian sensitivity matrix with a model vector x, or of its transpose with a data vector y, each of which can be transformed into two pseudo-forward modeling runs. This avoids explicit calculation and storage of the full Jacobian matrix, which leads to considerable savings in the memory required by the inversion program on a PC. The performance of the CG algorithm is illustrated by several typical 3-D models with horizontal and topographic earth surfaces. The results show that the VFEH++ and CG algorithms can be effectively employed for 3-D MT field data inversion.
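The memory-saving device described here, solving the Gauss-Newton system with conjugate gradients so that the Jacobian enters only through matrix-vector products, can be sketched generically. In the code below `matvec` stands in for the two pseudo-forward modeling runs; the diagonal test operator is a made-up symmetric positive definite stand-in, not an MT sensitivity matrix.

```python
def cg_solve(matvec, b, tol=1e-10, max_iter=500):
    """Conjugate gradients for A x = b where the symmetric positive
    definite operator A is available only through `matvec`. Pure-Python
    sketch of the matrix-free idea; a real inversion code would use
    optimized linear algebra."""
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    x = [0.0] * len(b)
    r = [bi - axi for bi, axi in zip(b, matvec(x))]   # initial residual
    p = r[:]
    rs = dot(r, r)
    for _ in range(max_iter):
        ap = matvec(p)
        alpha = rs / dot(p, ap)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, ap)]
        rs_new = dot(r, r)
        if rs_new < tol ** 2:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# the operator is never stored as a matrix, only applied to vectors
diag = [1.0 + 0.5 * i for i in range(20)]             # made-up SPD operator
matvec = lambda v: [d * vi for d, vi in zip(diag, v)]
solution = cg_solve(matvec, [1.0] * 20)
```

The point mirrored from the abstract is that only `matvec` is needed per iteration, so the full Jacobian never has to be formed or held in memory.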
Environmental decision-making and the influences of various stressors, such as landscape and climate changes on water quantity and quality, requires the application of environmental modeling. Spatially explicit environmental and watershed-scale models using GIS as a base framewor...
Gravity discharge vessel revisited: An explicit Lambert W function solution
NASA Astrophysics Data System (ADS)
Digilov, Rafael M.
2017-07-01
Based on the generalized Poiseuille equation modified by a kinetic energy correction, an explicit solution for the time evolution of a liquid column draining under gravity through an exit capillary tube is derived in terms of the Lambert W function. In contrast to the conventional exponential behavior implied by the Poiseuille law, the new analytical solution gives a full account of the volumetric flow rate of a fluid through a capillary of any length and improves the precision of viscosity determination. The theoretical consideration may be of interest to students as an example of how implicit equations in the field of physics can be solved analytically using the Lambert W function.
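For readers who want to evaluate such a closed-form solution numerically, the principal branch of the Lambert W function can be computed with a few Newton iterations on w * exp(w) = x. This is a generic numerical sketch; in the paper's drainage solution the argument of W is built from the apparatus parameters, which are not reproduced here.

```python
import math

def lambert_w(x, tol=1e-12):
    """Principal branch W0 for x >= 0, via Newton iteration on
    f(w) = w*exp(w) - x, with f'(w) = exp(w)*(w + 1).
    A generic numerical helper, not code from the paper."""
    w = math.log(x + 1.0)          # decent starting guess for x >= 0
    for _ in range(100):
        ew = math.exp(w)
        step = (w * ew - x) / (ew * (w + 1.0))
        w -= step
        if abs(step) < tol:
            break
    return w

omega = lambert_w(1.0)             # the omega constant, W(1) ~ 0.567
```

The defining identity W(x) * exp(W(x)) = x makes the result easy to check, e.g. W(e) = 1.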
NASA Astrophysics Data System (ADS)
Vorholzer, Andreas; von Aufschnaiter, Claudia; Boone, William J.
2018-02-01
Inquiry-based teaching is considered to contribute to content-related, procedural, and epistemic learning goals of science education. In this study, a quasi-experimental research design was utilized to investigate to what extent embedding inquiry activities in an explicit and an implicit instructional approach fosters students' ability to engage in three practices of scientific investigation (POSI): (1) formulating questions and hypotheses, (2) planning investigations, and (3) analyzing and interpreting data. Both approaches were implemented in a classroom-based intervention conducted in a German upper secondary school (N = 222). Students' procedural knowledge of the three POSI was assessed with a paper-pencil test before and after the intervention; their content knowledge and dispositional factors (e.g., cognitive abilities) were gathered once. Results show that not only explicit but also implicit instruction fosters students' knowledge of POSI. While overall explicit instruction was found to be more effective, the findings indicate that the effectiveness depends considerably on the practice addressed. Moreover, findings suggest that both approaches were equally beneficial for all students regardless of their prior content knowledge and their prior procedural knowledge of POSI. Potential conditions for the success of explicit and implicit approaches as well as implications for instruction on POSI in science classrooms and for future research are discussed.
Analytical basis for planetary quarantine.
NASA Technical Reports Server (NTRS)
Schalkowsky, S.; Kline, R. C., Jr.
1971-01-01
The attempt is made to investigate quarantine constraints, and alternatives for meeting them, in sufficient detail for identifying those courses of action which compromise neither the quarantine nor the space mission objectives. Mathematical models pertinent to this goal are formulated at three distinct levels. The first level of mission constraint models pertains to the quarantine goals considered necessary by the international scientific community. The principal emphasis of modeling at this level is to quantify international considerations and to produce well-defined mission constraints. Such constraints must be translated into explicit implementation requirements by the operational agency of the launching nation. This produces the second level of implementation system modeling. However, because of the multitude of factors entering into the implementation models, it is convenient to consider these factors at the third level of implementation parameter models. These models are intentionally limited to the inclusion of only those factors which can be quantified realistically, either now or in the near future.
Do People Use the Shortest Path? An Empirical Test of Wardrop’s First Principle
Zhu, Shanjiang; Levinson, David
2015-01-01
Most recent route choice models, following either the random utility maximization or rule-based paradigm, require explicit enumeration of feasible routes. The quality of model estimation and prediction is sensitive to the appropriateness of the consideration set. However, few empirical studies of revealed route characteristics have been reported in the literature. This study evaluates the widely applied shortest path assumption by evaluating routes followed by residents of the Minneapolis-St. Paul metropolitan area. Accurate Global Positioning System (GPS) and Geographic Information System (GIS) data were employed to reveal routes people used over an eight- to thirteen-week period. Most people did not choose the shortest path. Using three weeks of that data, we find that current route choice set generation algorithms do not reveal the majority of paths that individuals took. Findings from this study may guide future efforts in building better route choice models. PMID:26267756
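The benchmark against which the observed GPS routes are judged is a standard shortest-path computation. A minimal Dijkstra implementation over a hypothetical toy network (not the Twin Cities network used in the study) looks like this:

```python
import heapq

def dijkstra(graph, source):
    """Shortest travel times from `source` over a directed graph given as
    {node: [(neighbor, cost), ...]}. Generic benchmark computation; the
    study's network and cost data are not reproduced here."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# hypothetical three-node network
toy = {"a": [("b", 1.0), ("c", 4.0)], "b": [("c", 2.0)]}
shortest = dijkstra(toy, "a")
```

Comparing such computed shortest paths against revealed GPS traces is exactly the kind of test the study performs at metropolitan scale.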
NASA Astrophysics Data System (ADS)
Lewis, Ray A.; Modanese, Giovanni
Vibrating media offer an important testing ground for reconciling conflicts between General Relativity, Quantum Mechanics and other branches of physics. For sources like a Weber bar, the standard covariant formalism for elastic bodies can be applied. The vibrating string, however, is a source of gravitational waves which requires novel computational techniques, based on the explicit construction of a conserved and renormalized energy-momentum tensor. Renormalization (in a classical sense) is necessary to take into account the effect of external constraints, which affect the emission considerably. Our computation also relaxes usual simplifying assumptions like far-field approximation, spherical or plane wave symmetry, TT gauge and absence of internal interference. In a further step towards unification, the method is then adapted to give the radiation field of a transversal Alfven wave in a rarefied astrophysical plasma, where the tension is produced by an external static magnetic field.
The motivating operation and negatively reinforced problem behavior: a systematic review.
Langthorne, Paul; McGill, Peter; Oliver, Chris
2014-01-01
The concept of motivational operations exerts an increasing influence on the understanding and assessment of problem behavior in people with intellectual and developmental disability. In this systematic review of 59 methodologically robust studies of the influence of motivational operations in negative reinforcement paradigms in this population, we identify themes related to situational and biological variables that have implications for assessment, intervention, and further research. There is now good evidence that motivational operations of differing origins influence negatively reinforced problem behavior, and that these might be subject to manipulation to facilitate favorable outcomes. There is also good evidence that some biological variables warrant consideration in assessment procedures as they predispose the person's behavior to be influenced by specific motivational operations. The implications for assessment and intervention are made explicit with reference to variables that are open to manipulation or that require further research and conceptualization within causal models.
Cascade Distillation System Design for Safety and Mission Assurance
NASA Technical Reports Server (NTRS)
Sargusingh, Miriam J.; Callahan, Michael R.
2015-01-01
Per the NASA Human Health, Life Support and Habitation System Technology Area 06 report, "crewed missions venturing beyond Low-Earth Orbit (LEO) will require technologies with improved reliability, reduced mass, self-sufficiency, and minimal logistical needs as an emergency or quick-return option will not be feasible." To meet this need, the development team of the second generation Cascade Distillation System (CDS 2.0) opted for a development approach that explicitly incorporates consideration of safety, mission assurance, and autonomy. The CDS 2.0 preliminary design focused on establishing a functional baseline that meets the CDS core capabilities and performance. The critical design phase is now focused on incorporating features through a deliberative process of establishing the system's failure modes and effects, identifying mitigative strategies, and evaluating the merit of the proposed actions through analysis and test. This paper details results of this effort on the CDS 2.0 design.
Cascade Distillation System Design for Safety and Mission Assurance
NASA Technical Reports Server (NTRS)
Sarguisingh, Miriam; Callahan, Michael R.; Okon, Shira
2015-01-01
Per the NASA Human Health, Life Support and Habitation System Technology Area 06 report, "crewed missions venturing beyond Low-Earth Orbit (LEO) will require technologies with improved reliability, reduced mass, self-sufficiency, and minimal logistical needs as an emergency or quick-return option will not be feasible".1 To meet this need, the development team of the second generation Cascade Distillation System (CDS 2.0) chose a development approach that explicitly incorporates consideration of safety, mission assurance, and autonomy. The CDS 2.0 preliminary design focused on establishing a functional baseline that meets the CDS core capabilities and performance. The critical design phase is now focused on incorporating features through a deliberative process of establishing the system's failure modes and effects, identifying mitigation strategies, and evaluating the merit of the proposed actions through analysis and test. This paper details results of this effort on the CDS 2.0 design.
Science and policy: valuing framing, language and listening.
Forbes, Stephen
2011-01-01
This paper considers the context for science contributing to policy development and explores some critical issues that should inform science advocacy and influence with policy makers. The paper argues that the key challenges are at least as much in educating conservation scientists and science communicators about society and policy making as they are in educating society and policy makers about science. The importance of developing processes to ensure that scientists and science communicators invest in the development of relationships based on respect and understanding of their audience in both communities and amongst policy makers provides a critical first step. The objectives of the Global Strategy for Plant Conservation acknowledge the importance of developing the capacities and public engagement necessary to implement the Strategy, including knowledge transfer and community capacity building. However, the development of targets to equip institutions and plant conservation professionals to explicitly address the barriers to influencing policy development through knowledge transfer and integration requires further consideration.
Assistive Technologies and Issues Relating to Privacy, Ethics and Security
NASA Astrophysics Data System (ADS)
Martin, Suzanne; Bengtsson, Johan E.; Dröes, Rose-Marie
Emerging technologies provide the opportunity to develop innovative sustainable service models, capable of supporting adults with dementia at home. Devices range from simple stand-alone components that can generate a responsive alarm call to complex interoperable systems that can even be remotely controlled. From these complex systems the paradigm of the ubiquitous or ambient smart home has emerged, integrating technology, environmental design and traditional care provision. The service context is often complex, involving a variety of stakeholders and a range of interested agencies. Against this backdrop, as anecdotal evidence and government policies spawn further innovation, it is critical that due consideration is given to the potential ethical ramifications at an individual, organisational and societal level. Well-grounded ethical thinking and proactive ethical responses to this innovation are required. Explicit policy and practice should therefore emerge which engender confidence in existing supported living schemes for adults with dementia and inform further innovation.
Optimization-based manufacturing scheduling with multiple resources and setup requirements
NASA Astrophysics Data System (ADS)
Chen, Dong; Luh, Peter B.; Thakur, Lakshman S.; Moreno, Jack, Jr.
1998-10-01
The increasing demand for on-time delivery and low prices forces manufacturers to seek effective schedules that improve the coordination of multiple resources and reduce product internal costs associated with labor, setup and inventory. This study describes the design and implementation of a scheduling system for J. M. Product Inc., whose manufacturing is characterized by the need to consider machines and operators simultaneously, where an operator may attend several operations at the same time, and by the presence of machines requiring significant setup times. Scheduling problems with these characteristics are typical for many manufacturers, very difficult to handle, and have not been adequately addressed in the literature. In this study, both machines and operators are modeled as resources with finite capacities to obtain efficient coordination between them, and an operator's time can be shared by several operations at once to make full use of the operator. Setups are explicitly modeled following our previous work, with additional penalties on excessive setups to reduce setup costs and avoid possible scrap. An integer formulation with a separable structure is developed to maximize on-time delivery of products while maintaining low inventory and a small number of setups. Within the Lagrangian relaxation framework, the problem is decomposed into individual subproblems that are effectively solved using dynamic programming with the additional penalties embedded in state transitions. A heuristic is then developed, following our previous work, to obtain a feasible schedule with a new mechanism to satisfy operator capacity constraints. The method has been implemented in the object-oriented programming language C++ with a user-friendly interface, and numerical testing shows that it generates high-quality schedules in a timely fashion.
Through the simultaneous consideration of machines and operators, the two resources are well coordinated, facilitating the smooth flow of parts through the system. The explicit modeling of setups and the associated penalties clusters parts with the same setup requirements together, avoiding excessive setups.
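The Lagrangian relaxation loop this abstract describes can be sketched on a toy problem: relax the machine-capacity coupling constraint with multipliers, solve each job's subproblem independently, and update the multipliers by a subgradient step. The two-job instance, costs, and diminishing step rule below are illustrative assumptions, not the paper's formulation.

```python
# Toy Lagrangian relaxation: two jobs each pick one time slot; each slot
# holds at most one job (the relaxed machine-capacity constraint).
JOBS = {"A": {0: 1.0, 1: 3.0}, "B": {0: 2.0, 1: 2.5}}  # job -> slot -> cost
SLOTS = [0, 1]
CAPACITY = 1

lam = {t: 0.0 for t in SLOTS}  # multipliers ("prices") on slot capacity
assign = {}
for it in range(50):
    # Subproblems decouple: each job minimizes its cost plus the slot price.
    assign = {j: min(SLOTS, key=lambda t: costs[t] + lam[t])
              for j, costs in JOBS.items()}
    # Subgradient of the dual: slot usage minus capacity.
    g = {t: sum(1 for j in assign if assign[j] == t) - CAPACITY for t in SLOTS}
    # Diminishing step size; multipliers stay non-negative.
    step = 1.0 / (it + 1)
    lam = {t: max(0.0, lam[t] + step * g[t]) for t in SLOTS}

print(assign)  # -> {'A': 0, 'B': 1}: the slot prices separate the jobs
```

Both jobs initially want slot 0; raising that slot's price pushes job B (whose costs are nearly equal) to slot 1, resolving the conflict without a centralized search.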
Conflicting and complementary ethics of animal welfare considerations in reintroductions.
Harrington, Lauren A; Moehrenschlager, Axel; Gelling, Merryl; Atkinson, Rob P D; Hughes, Joelene; Macdonald, David W
2013-06-01
Despite differences in focus, goals, and strategies between conservation biology and animal welfare, both are inextricably linked in many ways, and greater consideration of animal welfare, although important in its own right, also has considerable potential to contribute to conservation success. Nevertheless, animal welfare and animal ethics are not always considered explicitly within conservation practice. We systematically reviewed the recent scientific peer-reviewed and online gray literature on reintroductions of captive-bred and wild-caught animals (mammals, birds, amphibians, and reptiles) to quantify the occurrence of animal welfare issues. We considered monitoring that could be indicative of the animal's welfare status and supportive management actions that could improve animal welfare (regardless of whether the aim was explicitly animal-welfare orientated). Potential welfare issues (of variable nature and extent) were recorded in 67% of 199 projects reviewed; the most common were mortality >50%, dispersal or loss of animals, disease, and human conflict. Most (>70%) projects monitored survival, 18% assessed body condition, and 2% monitored stress levels. Animal welfare, explicitly, was referred to in 6% of projects. Supportive actions, most commonly use of on-site prerelease pens and provision of supplemental food or water, were implemented in 79% of projects, although the extent and duration of support varied. Practitioners can address animal-welfare issues in reintroductions by considering the potential implications for individual animals at all stages of the release process using the decision tree presented. We urge practitioners to report potential animal-welfare issues, describe mitigation actions, and evaluate their efficacy to facilitate transparent evaluation of common moral dilemmas and to advance communal strategies for dealing with them. 
Currently, comparative mortality rates, health risks, postrelease stress, effectiveness of supportive measures, and behavior of individuals warrant further research to improve animal welfare in reintroductions and to increase success of such projects. © 2013 Society for Conservation Biology.
Explicit frequency equations of free vibration of a nonlocal Timoshenko beam with surface effects
NASA Astrophysics Data System (ADS)
Zhao, Hai-Sheng; Zhang, Yao; Lie, Seng-Tjhen
2018-02-01
Considerations of nonlocal elasticity and surface effects in micro- and nanoscale beams are both important for the accurate prediction of natural frequency. In this study, the governing equation of a nonlocal Timoshenko beam with surface effects is established by taking into account three types of boundary conditions: hinged-hinged, clamped-clamped and clamped-hinged ends. For a hinged-hinged beam, an exact and explicit natural frequency equation is obtained. However, for clamped-clamped and clamped-hinged beams, the solutions of corresponding frequency equations must be determined numerically due to their transcendental nature. Hence, the Fredholm integral equation approach coupled with a curve fitting method is employed to derive the approximate fundamental frequency equations, which can predict the frequency values with high accuracy. In short, explicit frequency equations of the Timoshenko beam for three types of boundary conditions are proposed to exhibit directly the dependence of the natural frequency on the nonlocal elasticity, surface elasticity, residual surface stress, shear deformation and rotatory inertia, avoiding the complicated numerical computation.
Testing the Use of Implicit Solvent in the Molecular Dynamics Modelling of DNA Flexibility
NASA Astrophysics Data System (ADS)
Mitchell, J.; Harris, S.
DNA flexibility controls packaging, looping and in some cases sequence specific protein binding. Molecular dynamics simulations carried out with a computationally efficient implicit solvent model are potentially a powerful tool for studying larger DNA molecules than can be currently simulated when water and counterions are represented explicitly. In this work we compare DNA flexibility at the base pair step level modelled using an implicit solvent model to that previously determined from explicit solvent simulations and database analysis. Although much of the sequence dependent behaviour is preserved in implicit solvent, the DNA is considerably more flexible when the approximate model is used. In addition we test the ability of the implicit solvent to model stress induced DNA disruptions by simulating a series of DNA minicircle topoisomers which vary in size and superhelical density. When compared with previously run explicit solvent simulations, we find that while the levels of DNA denaturation are similar using both computational methodologies, the specific structural form of the disruptions is different.
Comparison of three explicit multigrid methods for the Euler and Navier-Stokes equations
NASA Technical Reports Server (NTRS)
Chima, Rodrick V.; Turkel, Eli; Schaffer, Steve
1987-01-01
Three explicit multigrid methods, Ni's method, Jameson's finite-volume method, and a finite-difference method based on Brandt's work, are described and compared for two model problems. All three methods use an explicit multistage Runge-Kutta scheme on the fine grid, and this scheme is also described. Convergence histories for inviscid flow over a bump in a channel for the fine-grid scheme alone show that convergence rate is proportional to Courant number and that implicit residual smoothing can significantly accelerate the scheme. Ni's method was slightly slower than the implicitly-smoothed scheme alone. Brandt's and Jameson's methods are shown to be equivalent in form but differ in their node versus cell-centered implementations. They are about 8.5 times faster than Ni's method in terms of CPU time. Results for an oblique shock/boundary layer interaction problem verify the accuracy of the finite-difference code. All methods slowed considerably on the stretched viscous grid but Brandt's method was still 2.1 times faster than Ni's method.
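The explicit multistage Runge-Kutta fine-grid scheme shared by all three methods can be sketched on a 1-D model problem. The stage coefficients (1/4, 1/3, 1/2, 1), the first-order upwind residual, and all grid parameters below are illustrative assumptions, not the paper's exact schemes.

```python
import numpy as np

# Explicit multistage Runge-Kutta stepping, u^(k) = u^n - a_k * dt * R(u^(k-1)),
# applied to 1-D linear advection u_t + a u_x = 0 on a periodic grid.
a, nx, cfl, nsteps = 1.0, 100, 0.8, 200
dx = 1.0 / nx
dt = cfl * dx / a
x = np.linspace(0.0, 1.0, nx, endpoint=False)
u = np.exp(-100.0 * (x - 0.5) ** 2)  # Gaussian pulse initial condition

def residual(v):
    # First-order upwind spatial residual R(u) ~ a * u_x, periodic ends.
    return a * (v - np.roll(v, 1)) / dx

for _ in range(nsteps):
    u0 = u.copy()
    for alpha in (0.25, 1.0 / 3.0, 0.5, 1.0):  # multistage coefficients
        u = u0 - alpha * dt * residual(u)
```

Each multistage pass advances the solution one time step; in the multigrid methods compared above, analogous passes on coarsened grids are what accelerate convergence of this fine-grid scheme.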
[Nutrition or industry. Experiences with nutritional considerations in the agricultural policy].
Botten, G
1991-06-30
The need to export health considerations to various sector policies is regarded as important in health promotion. Health is generally a highly appreciated benefit; thus many sectors seek to use health as an argument for their policy. This article describes the relation between nutrition and agricultural policy in Norway. In areas where nutrition and agriculture had mutual interests, health considerations were easily exported. However, when interests diverged the issue became more complicated. Much effort was focused upon achieving correct use of nutritional arguments. Before negotiating and weighing respective viewpoints it is essential to clarify each sector's standpoint and interest. Conflicts and negotiations are linked to strategies which seek explicitly to integrate health premises into sectors outside the health service itself.
Watt, S; Shores, E A; Kinoshita, S
1999-07-01
Implicit and explicit memory were examined in individuals with severe traumatic brain injury (TBI) under conditions of full and divided attention. Participants included 12 individuals with severe TBI and 12 matched controls. In Experiment 1, participants carried out an implicit test of word-stem completion and an explicit test of cued recall. Results demonstrated that TBI participants exhibited impaired explicit memory but preserved implicit memory. In Experiment 2, a significant reduction in the explicit memory performance of both TBI and control participants, as well as a significant decrease in the implicit memory performance of TBI participants, was achieved by reducing attentional resources at encoding. These results indicated that performance on an implicit task of word-stem completion may require the availability of additional attentional resources that are not preserved after severe TBI.
Murakami, Taro; Hashiya, Kazuhide
2017-01-01
Understanding the referent of another's utterance by referring to contextual information helps in smooth communication. Although this pragmatic referential process can be observed even in infants, its underlying mechanism and the abilities involved remain unclear. This study aimed to comprehend the background of the referential process by investigating whether the phonological loop affected referent assignment. A total of 76 children (43 girls) aged 3–5 years participated in a reference assignment task in which an experimenter asked them to answer explicit (e.g., "What color is this?") and ambiguous (e.g., "What about this?") questions about colorful objects. Phonological loop capacity was measured using the forward digit span task, in which children were required to repeat numbers as an experimenter uttered them. The results showed that the scores of the forward digit span task positively predicted correct responses to explicit questions and part of the ambiguous questions. That is, phonological loop capacity did not affect referent assignment in response to ambiguous questions that were asked after a topic shift of the explicit questions, which thus required a backward reference to the preceding explicit questions to detect the intent of the current ambiguous questions. These results suggest that although phonological loop capacity could overtly enhance the storage of verbal information, it does not seem to directly contribute to the pragmatic referential process, which might require further social cognitive processes. PMID:29088282
Koch, Marianne; Riss, Paul; Umek, Wolfgang; Hanzal, Engelbert
2016-03-01
Poor reporting of research may limit critical appraisal and reproducibility, whereas adherence to reporting guidelines (RG) can guarantee completeness and transparency. We aimed to determine the explicit citation of RGs (CONSORT, PRISMA, STROBE) in urogynecology articles in 2013, the requirements of relevant journals, and a potential difference between urogynecology and general gynecology journals. All urogynecologic articles published between January and December 2013 in the journals NAU, IUJ, FPMRS, GREEN, AJOG, and BJOG were included. Issues were searched for systematic reviews, RCTs, cohort studies, case-control studies and cross-sectional studies. Each electronic article was searched for the term PRISMA, CONSORT, or STROBE according to the study design. Instructions to Authors of the six journals were screened for requirements to use RGs. We included 296 articles (243 observational studies, 40 RCTs, and 13 systematic reviews). The use of PRISMA guidelines was explicitly declared in 54% of systematic reviews, CONSORT guidelines were referenced in 25% of RCTs and STROBE in 1.2% of observational studies. The use of CONSORT is required by all journals except FPMRS. PRISMA and STROBE are only compulsory in the journals GREEN, AJOG, and BJOG. The overall rate of explicit mentioning of RGs comparing urogynecology and general gynecology journals was 6.7% versus 7.1%, respectively. The explicit mentioning of RGs was at a relatively low level. A slightly higher adherence was recognized among general gynecology journals compared to urogynecology journals. Stronger efforts should be taken to further promote the use of RGs in urogynecology. © 2015 Wiley Periodicals, Inc.
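A quick consistency check of the adherence figures reported above, reconstructing article counts from the rounded percentages and study totals in the abstract. The reconstruction is an assumption on my part; rounding differences of an article or two are possible.

```python
# Reconstruct the number of articles explicitly citing a reporting guideline
# from the abstract's rounded percentages and study counts.
n_reviews, n_rcts, n_obs = 13, 40, 243        # reviews, RCTs, observational
cited = (round(0.54 * n_reviews)              # PRISMA: 54% of 13 reviews -> 7
         + round(0.25 * n_rcts)               # CONSORT: 25% of 40 RCTs -> 10
         + round(0.012 * n_obs))              # STROBE: 1.2% of 243 studies -> 3
overall = cited / (n_reviews + n_rcts + n_obs)
print(cited, f"{overall:.1%}")  # 20 articles, 6.8% overall
```

The reconstructed overall rate of about 6.8% sits between the 6.7% and 7.1% per-journal-category rates quoted in the abstract, which suggests the reported figures are internally consistent.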
Color-Blind Leadership: A Critical Race Theory Analysis of the ISLLC and ELCC Standards
ERIC Educational Resources Information Center
Davis, Bradley W.; Gooden, Mark A.; Micheaux, Donna J.
2015-01-01
Purpose: Working from the driving research question--"is the explicit consideration of race present in the ISLLC and ELCC standards?"--this article explores the implications of a school leadership landscape reliant on a collection of color-blind leadership standards to guide the preparation and practice of school leaders. In doing so, we…
ERIC Educational Resources Information Center
Frawley, Rebecca Glenn
2013-01-01
Since equipping students for service to God and others is either an implicit or explicit element of the mission statement of every Christ-centered college and university, academic officers at such institutions should give serious consideration to making service-learning one of their regular pedagogical strategies. This paper presents the…
Simulating the effects of the southern pine beetle on regional dynamics 60 years into the future
Jennifer K. Costanza; Jiri Hulcr; Frank H. Koch; Todd Earnhardt; Alexa J. McKerrow; Rob R. Dunn; Jaime A. Collazo
2012-01-01
We developed a spatially explicit model that simulated future southern pine beetle (Dendroctonus frontalis, SPB) dynamics and pine forest management for a real landscape over 60 years to inform regional forest management. The SPB has a considerable effect on forest dynamics in the Southeastern United States, especially in loblolly pine (...
University Social Responsibility and Brand Image of Private Universities in Bangkok
ERIC Educational Resources Information Center
Plungpongpan, Jirawan; Tiangsoongnern, Leela; Speece, Mark
2016-01-01
Purpose: The purpose of this paper is to examine the effects of university social responsibility (USR) on the brand image of private universities in Thailand. Brand image is important for entry into the consideration set as prospective students evaluate options for university study. USR activities may be implicit or explicit, i.e., actively…
Scaling laws and complexity in fire regimes [Chapter 2
Donald McKenzie; Maureen Kennedy
2011-01-01
Use of scaling terminology and concepts in ecology evolved rapidly from rare occurrences in the early 1980s to a central idea by the early 1990s (Allen and Hoekstra 1992; Levin 1992; Peterson and Parker 1998). In landscape ecology, use of "scale" frequently connotes explicitly spatial considerations (Dungan et al. 2002), notably grain and extent. More...
NASA Astrophysics Data System (ADS)
Scherer, Artur; Valiron, Benoît; Mau, Siun-Chuon; Alexander, Scott; van den Berg, Eric; Chapuran, Thomas E.
2017-03-01
We provide a detailed estimate for the logical resource requirements of the quantum linear-system algorithm (Harrow et al. in Phys Rev Lett 103:150502, 2009) including the recently described elaborations and application to computing the electromagnetic scattering cross section of a metallic target (Clader et al. in Phys Rev Lett 110:250504, 2013). Our resource estimates are based on the standard quantum-circuit model of quantum computation; they comprise circuit width (related to parallelism), circuit depth (total number of steps), the number of qubits and ancilla qubits employed, and the overall number of elementary quantum gate operations as well as more specific gate counts for each elementary fault-tolerant gate from the standard set {X, Y, Z, H, S, T, CNOT}. In order to perform these estimates, we used an approach that combines manual analysis with automated estimates generated via the Quipper quantum programming language and compiler. Our estimates pertain to the explicit example problem size N = 332,020,680, beyond which, according to a crude big-O complexity comparison, the quantum linear-system algorithm is expected to run faster than the best known classical linear-system solving algorithm. For this problem size, a desired calculation accuracy of ε = 0.01 requires an approximate circuit width of 340 and a circuit depth of order 10^25 if oracle costs are excluded, and a circuit width and circuit depth of order 10^8 and 10^29, respectively, if the resource requirements of oracles are included, indicating that the commonly ignored oracle resources are considerable. In addition to providing detailed logical resource estimates, it is also the purpose of this paper to demonstrate explicitly (using a fine-grained approach rather than relying on coarse big-O asymptotic approximations) how these impressively large numbers arise with an actual circuit implementation of a quantum algorithm.
While our estimates may prove to be conservative as more efficient advanced quantum-computation techniques are developed, they nevertheless provide a valid baseline for research targeting a reduction of the algorithmic-level resource requirements, implying that a reduction by many orders of magnitude is necessary for the algorithm to become practical.
Somatic Markers and Explicit Knowledge Are both Involved in Decision-Making
ERIC Educational Resources Information Center
Guillaume, Sebastien; Jollant, Fabrice; Jaussent, Isabelle; Lawrence, Natalia; Malafosse, Alain; Courtet, Philippe
2009-01-01
In 1994, it was proposed that decision-making requires emotion-related signals, known as somatic markers. In contrast, some authors argued that conscious knowledge of contingencies is sufficient for advantageous decision-making. We aimed to investigate the respective roles of somatic markers and explicit knowledge in decision-making. Thirty…
A Conceptual Model for the Design and Delivery of Explicit Thinking Skills Instruction
ERIC Educational Resources Information Center
Kassem, Cherrie L.
2005-01-01
Developing student thinking skills is an important goal for most educators. However, due to time constraints and weighty content standards, thinking skills instruction is often embedded in subject matter, implicit and incidental. For best results, thinking skills instruction requires a systematic design and explicit teaching strategies. The…
Quantum corrections for the cubic Galileon in the covariant language
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saltas, Ippocratis D.; Vitagliano, Vincenzo, E-mail: isaltas@fc.ul.pt, E-mail: vincenzo.vitagliano@ist.utl.pt
We present for the first time an explicit exposition of quantum corrections within the cubic Galileon theory, including the effect of quantum gravity, in a background- and gauge-invariant manner, employing the field-reparametrisation approach of the covariant effective action at 1-loop. We show that the consideration of gravitational effects in combination with the non-linear derivative structure of the theory reveals new interactions at the perturbative level, which manifest themselves as higher-order operators in the associated effective action, whose relevance is controlled by appropriate ratios of the cosmological vacuum and the Galileon mass scale. The significance and concept of the covariant approach in this context is discussed, while all calculations are explicitly presented.
Ethical Design of Intelligent Assistive Technologies for Dementia: A Descriptive Review.
Ienca, Marcello; Wangmo, Tenzin; Jotterand, Fabrice; Kressig, Reto W; Elger, Bernice
2017-09-22
The use of Intelligent Assistive Technology (IAT) in dementia care opens the prospects of reducing the global burden of dementia and enabling novel opportunities to improve the lives of dementia patients. However, with current adoption rates being reportedly low, the potential of IATs might remain under-expressed as long as the reasons for suboptimal adoption remain unaddressed. Among these, ethical and social considerations are critical. This article reviews the spectrum of IATs for dementia and investigates the prevalence of ethical considerations in the design of current IATs. Our screening shows that a significant portion of current IATs is designed in the absence of explicit ethical considerations. These results suggest that the lack of ethical consideration might be a codeterminant of current structural limitations in the translation of IATs from designing labs to bedside. Based on these data, we call for a coordinated effort to proactively incorporate ethical considerations early in the design and development of new products.
Tacit knowledge as the unifying factor in evidence based medicine and clinical judgement.
Thornton, Tim
2006-03-17
The paper outlines the role that tacit knowledge plays in what might seem to be an area of knowledge that can be made fully explicit or codified and which forms a central element of Evidence Based Medicine. Appeal to the role of tacit knowledge in science provides a way to unify the tripartite definition of Evidence Based Medicine given by Sackett et al: the integration of best research evidence with clinical expertise and patient values. Each of these three elements, crucially including research evidence, rests on an ineliminable and irreducible notion of uncodified good judgement. The paper focuses on research evidence, drawing first on the work of Kuhn to suggest that tacit knowledge contributes, as a matter of fact, to puzzle solving within what he calls normal science. A stronger argument that it must play a role in research is first motivated by looking to Collins' first-hand account of replication in applied physics and then broader considerations of replication in justifying knowledge claims in scientific research. Finally, consideration of an argument from Wittgenstein shows that whatever explicit guidelines can be drawn up to guide judgement, the specification of what counts as correctly following them has to remain implicit. Overall, the paper sets out arguments for the claim that even though explicit guidelines and codifications can play a practical role in informing clinical practice, they rest on a body of tacit or implicit skill that is in principle ineliminable. It forms the bedrock of good judgement and unites the integration of research, expertise and values.
NASA Technical Reports Server (NTRS)
Klaus, David M.; Benoit, Michael R.; Nelson, Emily S.; Hammond, Timmothy G.
2004-01-01
Conducting biological research in space requires consideration be given to isolating appropriate control parameters. For in vitro cell cultures, numerous environmental factors can adversely affect data interpretation. A biological response attributed to microgravity can, in theory, be explicitly correlated to a specific lack of weight or gravity-driven motion occurring to, within or around a cell. Weight can be broken down to include the formation of hydrostatic gradients, structural load (stress) or physical deformation (strain). Gravitationally induced motion within or near individual cells in a fluid includes sedimentation (or buoyancy) of the cell and associated shear forces, displacement of cytoskeleton or organelles, and factors associated with intra- or extracellular mass transport. Finally, and of particular importance for cell culture experiments, the collective effects of gravity must be considered for the overall system consisting of the cells, their environment and the device in which they are contained. This does not, however, rule out other confounding variables such as launch acceleration, on orbit vibration, transient acceleration impulses or radiation, which can be isolated using onboard centrifuges or vibration isolation techniques. A framework is offered for characterizing specific cause-and-effect relationships for gravity-dependent responses as a function of the above parameters.
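One of the gravity-driven motions listed above, sedimentation of a cell in culture medium, can be estimated with the Stokes settling velocity. The cell radius, densities, and viscosity below are illustrative assumptions, not values from the paper.

```python
# Stokes settling velocity v = 2 (rho_p - rho_f) g r^2 / (9 mu) for a
# single cell in a water-like medium; all parameter values are illustrative.
g = 9.81             # gravitational acceleration, m/s^2
r = 7.5e-6           # cell radius, m
rho_cell = 1050.0    # cell density, kg/m^3
rho_medium = 1000.0  # medium density, kg/m^3
mu = 1.0e-3          # dynamic viscosity of the medium, Pa*s

v = 2.0 * (rho_cell - rho_medium) * g * r ** 2 / (9.0 * mu)
print(f"{v * 1e6:.1f} um/s")  # a few micrometres per second at 1 g
```

Even a few micrometres per second is comparable to a cell diameter per second, which is why sedimentation and its associated shear are plausible confounders in 1-g control cultures and vanish in microgravity.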
Multistate approaches in computational protein design
Davey, James A; Chica, Roberto A
2012-01-01
Computational protein design (CPD) is a useful tool for protein engineers. It has been successfully applied towards the creation of proteins with increased thermostability, improved binding affinity, novel enzymatic activity, and altered ligand specificity. Traditionally, CPD calculations search and rank sequences using a single fixed protein backbone template in an approach referred to as single-state design (SSD). While SSD has enjoyed considerable success, certain design objectives require the explicit consideration of multiple conformational and/or chemical states. Cases where a “multistate” approach may be advantageous over the SSD approach include designing conformational changes into proteins, using native ensembles to mimic backbone flexibility, and designing ligand or oligomeric association specificities. These design objectives can be efficiently tackled using multistate design (MSD), an emerging methodology in CPD that considers any number of protein conformational or chemical states as inputs instead of a single protein backbone template, as in SSD. In this review article, recent examples of the successful design of a desired property into proteins using MSD are described. These studies employing MSD are divided into two categories—those that utilized multiple conformational states, and those that utilized multiple chemical states. In addition, the scoring of competing states during negative design is discussed as a current challenge for MSD. PMID:22811394
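The SSD/MSD contrast described above can be sketched with toy energies: SSD ranks sequences on one fixed template, while one common MSD objective ranks them by their worst energy over all modeled states. The energies and the min-max objective here are illustrative assumptions, not a specific CPD package's scoring function.

```python
# Toy contrast between single-state design (SSD) and multistate design (MSD).
ENERGY = {  # sequence -> energy (lower is better) in each modeled state
    "SEQ1": {"stateA": -10.0, "stateB": -2.0},
    "SEQ2": {"stateA": -8.0, "stateB": -7.5},
}

def ssd_rank(template):
    """Rank sequences by energy on a single backbone template."""
    return sorted(ENERGY, key=lambda s: ENERGY[s][template])

def msd_rank():
    """Rank sequences by their worst (highest) energy over all states."""
    return sorted(ENERGY, key=lambda s: max(ENERGY[s].values()))

print(ssd_rank("stateA"))  # ['SEQ1', 'SEQ2']: SSD favours SEQ1 on state A alone
print(msd_rank())          # ['SEQ2', 'SEQ1']: MSD rewards balance across states
```

The reversal of the ranking is the point: a sequence that excels on one backbone but fails on another (here SEQ1) is penalized only when the competing states are scored explicitly, which is what MSD adds over SSD.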
PRA and Risk Informed Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernsen, Sidney A.; Simonen, Fredric A.; Balkey, Kenneth R.
2006-01-01
The Boiler and Pressure Vessel Code (BPVC) of the American Society of Mechanical Engineers (ASME) has introduced a risk-based approach into Section XI, which covers Rules for Inservice Inspection of Nuclear Power Plant Components. The risk-based approach requires the application of probabilistic risk assessment (PRA). Because no industry consensus standard existed for PRAs, ASME has developed a standard to evaluate the quality level of an available PRA needed to support a given risk-based application. The paper describes the PRA standard, Section XI applications of PRAs, and plans for broader applications of PRAs to other ASME nuclear codes and standards. The paper addresses several specific topics of interest to Section XI. An important consideration is the special methods (surrogate components) used to overcome the lack of treatment of passive components in PRAs. The approach allows calculation of conditional core damage probabilities both for component failures that cause initiating events and for failures in standby systems that decrease the availability of those systems. The paper relates the explicit risk-based methods of the new Section XI code cases to the implicit consideration of risk used in the development of Section XI. Other topics include the needed interactions of ISI engineers, plant operating staff, PRA specialists, and members of expert panels that review the risk-based programs.
NASA Astrophysics Data System (ADS)
Dykema, J. A.; Anderson, J. G.
2014-12-01
Measuring water vapor at the highest spatial and temporal resolution at all vertical levels and at arbitrary times requires strategic utilization of disparate observations from satellites, ground-based remote sensing, and in situ measurements. These different measurement types have different response times and very different spatial averaging properties, both horizontally and vertically. Accounting for these different measurement properties, and explicit propagation of the associated uncertainties, is necessary to test particular scientific hypotheses, especially in cases of detection of weak signals in the presence of natural fluctuations, and for process studies with small ensembles. This is also true where ancillary data from meteorological analyses are required, which have their own sampling limitations and uncertainties. This study will review two investigations pertaining to measurements of water vapor in the mid-troposphere and lower stratosphere that mix satellite observations with observations from other sources. The focus of the mid-troposphere analysis is to obtain improved estimates of water vapor at the instant of a sounding satellite overpass. The lower stratosphere work examines the uncertainty inherent in a small ensemble of anomalously elevated lower stratospheric water vapor observations when meteorological analysis products and aircraft in situ observations are required for interpretation.
Greenhouse gas implications of a 32 billion gallon bioenergy landscape in the US
NASA Astrophysics Data System (ADS)
DeLucia, E. H.; Hudiburg, T. W.; Wang, W.; Khanna, M.; Long, S.; Dwivedi, P.; Parton, W. J.; Hartman, M. D.
2015-12-01
Sustainable bioenergy for transportation fuel and greenhouse gas (GHG) emission reductions may require considerable changes in land use. Perennial grasses have been proposed because of their potential to yield substantial biomass on marginal lands without displacing food production and to reduce GHG emissions by storing soil carbon. Here, we implemented an integrated approach to planning bioenergy landscapes by combining spatially explicit ecosystem and economic models to predict a least-cost land allocation for a 32 billion gallon (121 billion liter) renewable fuel mandate in the US. We find that 2022 GHG transportation emissions decrease by 7% when 3.9 million hectares of eastern US land are converted to perennial grasses supplemented with corn residue to meet cellulosic ethanol requirements, largely because of gasoline displacement and soil carbon storage. If renewable fuel production is accompanied by a cellulosic biofuel tax credit, CO2-equivalent emissions could be reduced by 12%, because the credit induces more cellulosic biofuel production and more land under perennial grasses (10 million hectares) than the mandate alone. While GHG-reducing bioenergy landscapes that meet Renewable Fuel Standard (RFS) requirements and do not displace food are possible, the reductions in GHG emissions are 50% less than previous estimates that did not account for economically feasible land allocation.
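A least-cost allocation under a volume mandate can be sketched in its simplest (greedy) form. The parcels, yields, and costs below are invented; the study's coupled ecosystem-economic models are far richer, accounting for soil carbon and spatial land attributes.

```python
# Toy least-cost allocation toward a fuel-volume mandate (illustration only).

def allocate(parcels, mandate_liters):
    """Pick parcels in order of cost per liter until the mandate is met."""
    chosen, supplied = [], 0.0
    for p in sorted(parcels, key=lambda p: p["cost"] / p["yield_liters"]):
        if supplied >= mandate_liters:
            break
        chosen.append(p["name"])
        supplied += p["yield_liters"]
    return chosen, supplied

parcels = [
    {"name": "A", "yield_liters": 40.0, "cost": 20.0},
    {"name": "B", "yield_liters": 50.0, "cost": 60.0},
    {"name": "C", "yield_liters": 30.0, "cost": 12.0},
]
print(allocate(parcels, 60.0))
```

Because allocation follows economics rather than GHG benefit, the land that actually converts may differ from the land a pure carbon optimization would pick, which is the paper's explanation for the 50% smaller emission reductions.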
Implicit memory. Retention without remembering.
Roediger, H L
1990-09-01
Explicit measures of human memory, such as recall or recognition, reflect conscious recollection of the past. Implicit tests of retention measure transfer (or priming) from past experience on tasks that do not require conscious recollection of recent experiences for their performance. The article reviews research on the relation between explicit and implicit memory. The evidence points to substantial differences between standard explicit and implicit tests, because many variables create dissociations between these tests. For example, although pictures are remembered better than words on explicit tests, words produce more priming than do pictures on several implicit tests. These dissociations may implicate different memory systems that subserve distinct memorial functions, but the present argument is that many dissociations can be understood by appealing to general principles that apply to both explicit and implicit tests. Phenomena studied under the rubric of implicit memory may have important implications in many other fields, including social cognition, problem solving, and cognitive development.
Efficient Use of Distributed Systems for Scientific Applications
NASA Technical Reports Server (NTRS)
Taylor, Valerie; Chen, Jian; Canfield, Thomas; Richard, Jacques
2000-01-01
Distributed computing has been regarded as the future of high performance computing. Nationwide high speed networks such as vBNS are becoming widely available to interconnect high-speed computers, virtual environments, scientific instruments and large data sets. One of the major issues to be addressed with distributed systems is the development of computational tools that facilitate the efficient execution of parallel applications on such systems. These tools must exploit the heterogeneous resources (networks and compute nodes) in distributed systems. This paper presents a tool, called PART, which addresses this issue for mesh partitioning. PART takes advantage of the following heterogeneous system features: (1) processor speed; (2) number of processors; (3) local network performance; and (4) wide area network performance. Further, different finite element applications under consideration may have different computational complexities, different communication patterns, and different element types, which also must be taken into consideration when partitioning. PART uses parallel simulated annealing to partition the domain, taking into consideration network and processor heterogeneity. The results of using PART for an explicit finite element application executing on two IBM SPs (located at Argonne National Laboratory and the San Diego Supercomputer Center) indicate an increase in efficiency of up to 36% as compared to METIS, a widely used mesh partitioning tool. The input to METIS was modified to take into consideration heterogeneous processor performance; METIS does not take into consideration heterogeneous networks. The execution times for these applications were reduced by up to 30% as compared to METIS. These results are given in Figure 1 for four irregular meshes ranging in size from 11,451 elements (the Barth4 mesh) to 30,269 elements (the Barth5 mesh).
Future work with PART entails using the tool with an integrated application requiring distributed systems. In particular, this application, illustrated in the document, entails an integration of finite element and fluid dynamics simulations to address the cooling of turbine blades in a gas turbine engine design. It is not uncommon to encounter high-temperature, film-cooled turbine airfoils with millions of degrees of freedom; this results from the complexity of the various components of the airfoils, which requires fine-grained meshing for accuracy. Additional information is contained in the original.
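The processor-speed side of PART's heterogeneity-aware partitioning can be sketched minimally: give each processor a share of mesh elements proportional to its relative speed. PART itself uses parallel simulated annealing and also weighs local and wide area network performance; the function below only illustrates the speed-weighting idea and is not PART's algorithm.

```python
# Minimal sketch: element shares proportional to processor speed (assumption:
# load balance means equal compute *time*, not equal element counts).

def target_shares(n_elements, speeds):
    raw = [n_elements * s / sum(speeds) for s in speeds]
    shares = [int(r) for r in raw]                       # truncate
    shares[raw.index(max(raw))] += n_elements - sum(shares)  # fix rounding
    return shares

# e.g. the 30,269-element Barth5 mesh over processors of relative speed 1:1:2
print(target_shares(30269, [1.0, 1.0, 2.0]))
```

A tool like METIS can approximate this by supplying weighted partition sizes, which is essentially the modification to METIS's input the paper describes; network heterogeneity additionally penalizes edges cut across the slow wide area link.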
Symmetry considerations in the scattering of identical composite bodies
NASA Technical Reports Server (NTRS)
Norbury, J. W.; Townsend, L. W.; Deutchman, P. A.
1986-01-01
Previous studies of the interactions between composite particles were extended to the case in which the composites are identical. The form of the total interaction potential matrix elements was obtained, and guidelines for their explicit evaluation were given. For the case of elastic scattering of identical composites, the matrix element approach was shown to be equivalent to the scattering amplitude method.
ERIC Educational Resources Information Center
Murphy, John W.
2004-01-01
On June 11, 1962, President John F. Kennedy addressed the economy at Yale University. This essay explains the symbolic charge of his economic rhetoric, a persuasive campaign that enjoyed considerable success and marked the first time that a president took explicit responsibility for the nation's economic performance. I argue that the president…
The Integration of Delta Prime (f) in a Multidimensional Space
NASA Technical Reports Server (NTRS)
Farassat, F.
1999-01-01
Consideration is given to the thickness noise term of the Ffowcs Williams-Hawkings equation when the time derivative is taken explicitly. An interpretation is presented of the integral I = ∫ φ(x) δ′(f) dx, where it is initially assumed that |∇f| is not equal to 1 on the surface f = 0.
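The one-dimensional prototype of this integral fixes the intuition: integrating δ′ against a test function transfers a derivative. The multidimensional statement below is a hedged outline, not the paper's full result.

```latex
% One-dimensional identity underlying the integral above:
\int_{-\infty}^{\infty} \phi(x)\,\delta'(x)\,dx \;=\; -\,\phi'(0).
% In several dimensions one rewrites I = \int \phi(\mathbf{x})\,\delta'(f)\,d\mathbf{x}
% in local coordinates (distance s along the normal to the surface f = 0).
% When |\nabla f| = 1 the integral reduces to a surface integral of
% -\partial\phi/\partial n plus contributions from the curvature-dependent
% Jacobian of the coordinate change; when |\nabla f| \neq 1, additional
% factors of |\nabla f| and its normal derivative appear.
```

The subtlety the paper addresses is precisely that δ′(f), unlike δ(f), is sensitive to these curvature and |∇f| terms, so naive surface-integral interpretations fail.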
A Primer for DoD Reliability, Maintainability and Safety Standards
1988-03-02
the project engineer and the concurrence of their respective managers. The primary consideration in such cases is the thoroughness of the ...basic approaches to the application of environmental stress screening. In one approach, the government explicitly specifies the screens and screening...TO USE DOD-HDBK-344 (USAF) There are two basic approaches to the application of environmental stress
ERIC Educational Resources Information Center
Boselovic, Joseph L.
2014-01-01
Although considerable work has been done around the supposed successes and failures of education reform in post-Katrina New Orleans, concerns about the public/private qualities of new policies are often not discussed explicitly. In kind, this article serves to investigate theoretical conceptions of the public as they relate to education while…
Hess, Jeremy J.; Ebi, Kristie L.; Markandya, Anil; Balbus, John M.; Wilkinson, Paul; Haines, Andy; Chalabi, Zaid
2014-01-01
Background: Policy decisions regarding climate change mitigation are increasingly incorporating the beneficial and adverse health impacts of greenhouse gas emission reduction strategies. Studies of such co-benefits and co-harms involve modeling approaches requiring a range of analytic decisions that affect the model output. Objective: Our objective was to assess analytic decisions regarding model framework, structure, choice of parameters, and handling of uncertainty when modeling health co-benefits, and to make recommendations for improvements that could increase policy uptake. Methods: We describe the assumptions and analytic decisions underlying models of mitigation co-benefits, examining their effects on modeling outputs, and consider tools for quantifying uncertainty. Discussion: There is considerable variation in approaches to valuation metrics, discounting methods, uncertainty characterization and propagation, and assessment of low-probability/high-impact events. There is also variable inclusion of adverse impacts of mitigation policies, and limited extension of modeling domains to include implementation considerations. Going forward, co-benefits modeling efforts should be carried out in collaboration with policy makers; these efforts should include the full range of positive and negative impacts and critical uncertainties, as well as a range of discount rates, and should explicitly characterize uncertainty. We make recommendations to improve the rigor and consistency of modeling of health co-benefits. Conclusion: Modeling health co-benefits requires systematic consideration of the suitability of model assumptions, of what should be included and excluded from the model framework, and how uncertainty should be treated. 
Increased attention to these and other analytic decisions has the potential to increase the policy relevance and application of co-benefits modeling studies, potentially helping policy makers to maximize mitigation potential while simultaneously improving health. Citation: Remais JV, Hess JJ, Ebi KL, Markandya A, Balbus JM, Wilkinson P, Haines A, Chalabi Z. 2014. Estimating the health effects of greenhouse gas mitigation strategies: addressing parametric, model, and valuation challenges. Environ Health Perspect 122:447–455; http://dx.doi.org/10.1289/ehp.1306744 PMID:24583270
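Two of the analytic decisions the review highlights, discounting of future health benefits and propagation of parameter uncertainty, can be made concrete in a few lines. All numbers below are invented for illustration; the functions are a generic sketch, not any model from the reviewed studies.

```python
# Hedged sketch: present value of a benefit stream under a range of discount
# rates, plus simple Monte Carlo propagation of benefit uncertainty.
import random

def present_value(annual_benefit, years, rate):
    return sum(annual_benefit / (1 + rate) ** t for t in range(1, years + 1))

def monte_carlo_pv(mean_benefit, sd_benefit, years, rate, n=10000, seed=1):
    rng = random.Random(seed)
    draws = [present_value(rng.gauss(mean_benefit, sd_benefit), years, rate)
             for _ in range(n)]
    return sum(draws) / n

# Reporting a range of discount rates, as the review recommends:
for rate in (0.0, 0.03, 0.07):
    print(rate, round(present_value(100.0, 30, rate), 1))
```

Even this toy shows why discounting choices dominate: over 30 years the valuation roughly halves between a 0% and a 7% rate, which is why the authors recommend reporting results across a range of rates rather than a single value.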
Code of Federal Regulations, 2011 CFR
2011-04-01
...” include stated eligibility requirements such as income, as well as other explicit or implicit requirements...” means functions such as caring for one's self, performing manual tasks, walking, seeing, hearing...
Code of Federal Regulations, 2010 CFR
2010-04-01
...” include stated eligibility requirements such as income, as well as other explicit or implicit requirements...” means functions such as caring for one's self, performing manual tasks, walking, seeing, hearing...
40 CFR 63.830 - Reporting requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... specified in paragraphs (b)(1) through (b)(6) of this section to the Administrator: (1) An initial... approved unless explicitly disapproved, or unless comments received from the Administrator require... into EPA's WebFIRE database. (2) All reports required by this subpart not subject to the requirements...
40 CFR 63.830 - Reporting requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... specified in paragraphs (b)(1) through (b)(6) of this section to the Administrator: (1) An initial... approved unless explicitly disapproved, or unless comments received from the Administrator require... into EPA's WebFIRE database. (2) All reports required by this subpart not subject to the requirements...
40 CFR 63.830 - Reporting requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... specified in paragraphs (b)(1) through (b)(6) of this section to the Administrator: (1) An initial... approved unless explicitly disapproved, or unless comments received from the Administrator require... into EPA's WebFIRE database. (2) All reports required by this subpart not subject to the requirements...
40 CFR 63.830 - Reporting requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... specified in paragraphs (b)(1) through (b)(6) of this section to the Administrator: (1) An initial... approved unless explicitly disapproved, or unless comments received from the Administrator require... into EPA's WebFIRE database. (2) All reports required by this subpart not subject to the requirements...
On the effects of scale for ecosystem services mapping
Grêt-Regamey, Adrienne; Weibel, Bettina; Bagstad, Kenneth J.; Ferrari, Marika; Geneletti, Davide; Klug, Hermann; Schirpke, Uta; Tappeiner, Ulrike
2014-01-01
Ecosystems provide life-sustaining services upon which human civilization depends, but their degradation largely continues unabated. Spatially explicit information on ecosystem services (ES) provision is required to better guide decision making, particularly for mountain systems, which are characterized by vertical gradients and isolation with high topographic complexity, making them particularly sensitive to global change. But while spatially explicit ES quantification and valuation allows the identification of areas of abundant or limited supply of and demand for ES, the accuracy and usefulness of the information varies considerably depending on the scale and methods used. Using four case studies from mountainous regions in Europe and the U.S., we quantify information gains and losses when mapping five ES - carbon sequestration, flood regulation, agricultural production, timber harvest, and scenic beauty - at coarse and fine resolution (250 m vs. 25 m in Europe and 300 m vs. 30 m in the U.S.). We analyze the effects of scale on ES estimates and their spatial pattern and show how these effects are related to different ES, terrain structure and model properties. ES estimates differ substantially between the fine and coarse resolution analyses in all case studies and across all services. This scale effect is not equally strong for all ES. We show that spatially explicit information about non-clustered, isolated ES tends to be lost at coarse resolution and against expectation, mainly in less rugged terrain, which calls for finer resolution assessments in such contexts. The effect of terrain ruggedness is also related to model properties such as dependency on land use-land cover data. We close with recommendations for mapping ES to make the resulting maps more comparable, and suggest a four-step approach to address the issue of scale when mapping ES that can deliver information to support ES-based decision making with greater accuracy and reliability.
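The scale effect the study measures, loss of information about isolated, non-clustered ES supply at coarse resolution, can be demonstrated with block-averaging of a synthetic grid. The grid values below are invented; real analyses would use 25 m vs 250 m rasters.

```python
# Sketch: coarsen a fine raster by block-averaging and compare how much
# spatial variance (a proxy for mappable information) survives.

def block_average(grid, k):
    n = len(grid)
    return [[sum(grid[i + di][j + dj] for di in range(k) for dj in range(k)) / k**2
             for j in range(0, n, k)] for i in range(0, n, k)]

def variance(grid):
    vals = [v for row in grid for v in row]
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

# An isolated high-supply cell: its signal is diluted at coarse resolution.
fine = [[0.0] * 4 for _ in range(4)]
fine[1][2] = 8.0
coarse = block_average(fine, 2)
print(variance(fine), variance(coarse))
```

The isolated hotspot's variance collapses under aggregation, whereas a spatially clustered pattern of the same total supply would survive coarsening much better, matching the paper's finding that non-clustered ES are the ones lost at coarse resolution.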
Miller, Brian W.; Breckheimer, Ian; McCleary, Amy L.; Guzmán-Ramirez, Liza; Caplow, Susan C.; Jones-Smith, Jessica C.; Walsh, Stephen J.
2010-01-01
Agent Based Models (ABMs) are powerful tools for population-environment research but are subject to trade-offs between model complexity and abstraction. This study strikes a compromise between abstract and highly specified ABMs by designing a spatially explicit, stylized ABM and using it to explore policy scenarios in a setting that is facing substantial conservation and development challenges. Specifically, we present an ABM that reflects key Land Use / Land Cover (LULC) dynamics and livelihood decisions on Isabela Island in the Galápagos Archipelago of Ecuador. We implement the model using the NetLogo software platform, a free program that requires relatively little programming experience. The landscape is composed of a satellite-derived distribution of a problematic invasive species (common guava) and a stylized representation of the Galápagos National Park, the community of Puerto Villamil, the agricultural zone, and the marine area. The agent module is based on publicly available data and household interviews, and represents the primary livelihoods of the population in the Galápagos Islands – tourism, fisheries, and agriculture. We use the model to enact hypothetical agricultural subsidy scenarios aimed at controlling invasive guava and assess the resulting population and land cover dynamics. Findings suggest that spatially explicit, stylized ABMs have considerable utility, particularly during preliminary stages of research, as platforms for (1) sharpening conceptualizations of population-environment systems, (2) testing alternative scenarios, and (3) uncovering critical data gaps. PMID:20539752
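The kind of rule a stylized ABM encodes can be sketched in a few lines, here in Python rather than NetLogo, with hypothetical names and payoffs (the actual model's decision rules and parameters come from household interviews and public data).

```python
# Hypothetical sketch of one ABM decision rule: a household picks the most
# profitable livelihood each step; a subsidy scenario shifts the choice.

def choose_livelihood(returns, subsidy=0.0):
    adjusted = dict(returns)                 # copy so inputs are unchanged
    adjusted["agriculture"] += subsidy
    return max(adjusted, key=adjusted.get)   # highest adjusted return wins

returns = {"tourism": 1.0, "fisheries": 0.8, "agriculture": 0.7}
print(choose_livelihood(returns))                # baseline scenario
print(choose_livelihood(returns, subsidy=0.5))   # agricultural subsidy scenario
```

Iterating such rules over a spatial grid of guava-invaded and cleared patches is what lets the model trace how a subsidy policy propagates into land cover change.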
Spatial patterns of agricultural expansion determine impacts on biodiversity and carbon storage.
Chaplin-Kramer, Rebecca; Sharp, Richard P; Mandle, Lisa; Sim, Sarah; Johnson, Justin; Butnar, Isabela; Milà I Canals, Llorenç; Eichelberger, Bradley A; Ramler, Ivan; Mueller, Carina; McLachlan, Nikolaus; Yousefi, Anahita; King, Henry; Kareiva, Peter M
2015-06-16
The agricultural expansion and intensification required to meet growing food and agri-based product demand present important challenges to future levels and management of biodiversity and ecosystem services. Influential actors such as corporations, governments, and multilateral organizations have made commitments to meeting future agricultural demand sustainably and preserving critical ecosystems. Current approaches to predicting the impacts of agricultural expansion involve calculation of total land conversion and assessment of the impacts on biodiversity or ecosystem services on a per-area basis, generally assuming a linear relationship between impact and land area. However, the impacts of continuing land development are often not linear and can vary considerably with spatial configuration. We demonstrate what could be gained by spatially explicit analysis of agricultural expansion at a large scale compared with the simple measure of total area converted, with a focus on the impacts on biodiversity and carbon storage. Using simple modeling approaches for two regions of Brazil, we find that for the same amount of land conversion, the declines in biodiversity and carbon storage can vary two- to fourfold depending on the spatial pattern of conversion. Impacts increase most rapidly in the earliest stages of agricultural expansion and are more pronounced in scenarios where conversion occurs in forest interiors compared with expansion into forests from their edges. This study reveals the importance of spatially explicit information in the assessment of land-use change impacts and for future land management and conservation.
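Why the same converted area can have very different impacts is easy to show with a toy fragmentation metric. The grid and the "core habitat" definition below (a forest cell whose four neighbors are all forest) are illustrative assumptions, not the paper's biodiversity or carbon models.

```python
# Toy illustration: interior conversion destroys more core habitat than
# edge conversion of the same number of cells.

def core_cells(grid):
    n = len(grid)
    return sum(
        1
        for i in range(1, n - 1)
        for j in range(1, n - 1)
        if grid[i][j] and all((grid[i-1][j], grid[i+1][j],
                               grid[i][j-1], grid[i][j+1]))
    )

def forest(n):
    return [[1] * n for _ in range(n)]

edge = forest(6)
edge[0] = [0] * 6                 # clear 6 cells along one edge
interior = forest(6)
for j in (1, 2, 3):               # clear 6 cells from the forest interior
    interior[2][j] = 0
    interior[3][j] = 0
print(core_cells(edge), core_cells(interior))
```

Identical conversion totals, very different residual core habitat, which is the paper's central point about nonlinear, configuration-dependent impacts.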
Analytical steady-state solutions for water-limited cropping systems using saline irrigation water
NASA Astrophysics Data System (ADS)
Skaggs, T. H.; Anderson, R. G.; Corwin, D. L.; Suarez, D. L.
2014-12-01
Due to the diminishing availability of good quality water for irrigation, it is increasingly important that irrigation and salinity management tools be able to target submaximal crop yields and support the use of marginal-quality waters. In this work, we present a steady-state irrigated systems modeling framework that accounts for reduced plant water uptake due to root zone salinity. Two explicit, closed-form analytical solutions for the root zone solute concentration profile are obtained, corresponding to two alternative functional forms of the uptake reduction function. The solutions express a general relationship between irrigation water salinity, irrigation rate, crop salt tolerance, crop transpiration, and (using standard approximations) crop yield. Example applications are illustrated, including the calculation of irrigation requirements for obtaining targeted submaximal yields, and the generation of crop-water production functions for varying irrigation waters, irrigation rates, and crops. Model predictions are shown to be mostly consistent with existing models and available experimental data. Yet the new solutions possess advantages over available alternatives, including: (i) the solutions were derived from a complete physical-mathematical description of the system, rather than based on an ad hoc formulation; (ii) the analytical solutions are explicit and can be evaluated without iterative techniques; (iii) the solutions permit consideration of two common functional forms of salinity-induced reductions in crop water uptake, rather than being tied to one particular representation; and (iv) the utilized modeling framework is compatible with leading transient-state numerical models.
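The steady-state mass balance underlying such solutions can be sketched simply: with salt carried downward by the water flux and excluded by roots, the product q(z)·c(z) is constant, so concentration rises as transpiration depletes the flux. This is an illustration of the general principle, not the paper's closed-form expressions (which additionally couple uptake to salinity), and the numbers are invented.

```python
# Steady-state root zone salt mass balance (illustration; arbitrary units).

def concentration(c_irr, q_irr, uptake_to_depth):
    """Solute concentration at a depth above which the cumulative root water
    uptake is uptake_to_depth; q_irr is the infiltrating water flux."""
    return c_irr * q_irr / (q_irr - uptake_to_depth)

# Irrigate 10 mm/d at concentration 2.0; 40% of the water is taken up above
# mid-profile, 80% overall (i.e. a leaching fraction of 0.2).
print(concentration(2.0, 10.0, 4.0))  # mid-profile
print(concentration(2.0, 10.0, 8.0))  # drainage: c_irr / leaching fraction
```

The drainage-water result recovers the familiar leaching-fraction rule (drainage concentration = irrigation concentration divided by leaching fraction), the anchor point against which such steady-state profiles are usually checked.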
Three Dimensional Aerodynamic Analysis of a High-Lift Transport Configuration
NASA Technical Reports Server (NTRS)
Dodbele, Simha S.
1993-01-01
Two computational methods, a surface panel method and an Euler method employing unstructured grid methodology, were used to analyze a subsonic transport aircraft in cruise and high-lift conditions. The computational results were compared with two separate sets of flight data obtained for the cruise and high-lift configurations. For the cruise configuration, the surface pressures obtained by the panel method and the Euler method agreed fairly well with results from flight test. However, for the high-lift configuration considerable differences were observed when the computational surface pressures were compared with the results from high-lift flight test. On the lower surface of all the elements with the exception of the slat, both the panel and Euler methods predicted pressures which were in good agreement with flight data. On the upper surface of all the elements the panel method predicted slightly higher suction compared to the Euler method. On the upper surface of the slat, pressure coefficients obtained by both the Euler and panel methods did not agree with the results of the flight tests. A sensitivity study of the upward deflection of the slat from the 40 deg. flap setting suggested that the differences in the slat deflection between the computational model and the flight configuration could be one of the sources of this discrepancy. The computation time for the implicit version of the Euler code was about 1/3 the time taken by the explicit version though the implicit code required 3 times the memory taken by the explicit version.
Are adverse effects incorporated in economic models? An initial review of current practice.
Craig, D; McDaid, C; Fonseca, T; Stock, C; Duffy, S; Woolacott, N
2009-12-01
To identify methodological research on the incorporation of adverse effects in economic models and to review current practice. Major electronic databases (Cochrane Methodology Register, Health Economic Evaluations Database, NHS Economic Evaluation Database, EconLit, EMBASE, Health Management Information Consortium, IDEAS, MEDLINE and Science Citation Index) were searched from inception to September 2007. Health technology assessment (HTA) reports commissioned by the National Institute for Health Research (NIHR) HTA programme and published between 2004 and 2007 were also reviewed. The reviews of methodological research on the inclusion of adverse effects in decision models and of current practice were carried out according to standard methods. Data were summarised in a narrative synthesis. Of the 719 potentially relevant references in the methodological research review, five met the inclusion criteria; however, they contained little information of direct relevance to the incorporation of adverse effects in models. Of the 194 HTA monographs published from 2004 to 2007, 80 were reviewed, covering a range of research and therapeutic areas. In total, 85% of the reports included adverse effects in the clinical effectiveness review and 54% of the decision models included adverse effects in the model; 49% included adverse effects in the clinical review and model. The link between adverse effects in the clinical review and model was generally weak; only 3/80 (< 4%) used the results of a meta-analysis from the systematic review of clinical effectiveness and none used only data from the review without further manipulation. Of the models including adverse effects, 67% used a clinical adverse effects parameter, 79% used a cost of adverse effects parameter, 86% used one of these and 60% used both. 
Most models (83%) used utilities, but only two (2.5%) used solely utilities to incorporate adverse effects and were explicit that the utility captured relevant adverse effects; 53% of those models that included utilities derived them from patients on treatment and could therefore be interpreted as capturing adverse effects. In total, 30% of the models that included adverse effects used withdrawals related to drug toxicity and therefore might be interpreted as using withdrawals to capture adverse effects, but this was explicitly stated in only three reports. Of the 37 models that did not include adverse effects, 18 provided justification for this omission, most commonly lack of data; 19 appeared to make no explicit consideration of adverse effects in the model. There is an implicit assumption within modelling guidance that adverse effects are very important but there is a lack of clarity regarding how they should be dealt with and considered in modelling. In many cases a lack of clear reporting in the HTAs made it extremely difficult to ascertain what had actually been carried out in consideration of adverse effects. The main recommendation is for much clearer and explicit reporting of adverse effects, or their exclusion, in decision models and for explicit recognition in future guidelines that 'all relevant outcomes' should include some consideration of adverse events.
48 CFR 46.301 - Contractor inspection requirements.
Code of Federal Regulations, 2013 CFR
2013-10-01
... simplified acquisition threshold and (a) inclusion of the clause is necessary to ensure an explicit understanding of the contractor's inspection responsibilities, or (b) inclusion of the clause is required under...
48 CFR 46.301 - Contractor inspection requirements.
Code of Federal Regulations, 2012 CFR
2012-10-01
... simplified acquisition threshold and (a) inclusion of the clause is necessary to ensure an explicit understanding of the contractor's inspection responsibilities, or (b) inclusion of the clause is required under...
48 CFR 46.301 - Contractor inspection requirements.
Code of Federal Regulations, 2014 CFR
2014-10-01
... simplified acquisition threshold and (a) inclusion of the clause is necessary to ensure an explicit understanding of the contractor's inspection responsibilities, or (b) inclusion of the clause is required under...
48 CFR 46.301 - Contractor inspection requirements.
Code of Federal Regulations, 2011 CFR
2011-10-01
... simplified acquisition threshold and (a) inclusion of the clause is necessary to ensure an explicit understanding of the contractor's inspection responsibilities, or (b) inclusion of the clause is required under...
NASA Technical Reports Server (NTRS)
Arnold, S. M.; Saleeb, A. F.; Tan, H. Q.; Zhang, Y.
1993-01-01
The issue of developing effective and robust schemes to implement a class of the Ogden-type hyperelastic constitutive models is addressed. To this end, special purpose functions (running under MACSYMA) are developed for the symbolic derivation, evaluation, and automatic FORTRAN code generation of explicit expressions for the corresponding stress function and material tangent stiffness tensors. These explicit forms are valid over the entire deformation range, since the singularities resulting from repeated principal-stretch values have been theoretically removed. The required computational algorithms are outlined, and the resulting FORTRAN computer code is presented.
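The symbolic derive-then-generate workflow described above predates modern computer algebra systems, but the same pattern can be sketched today with SymPy standing in for MACSYMA. The one-term Ogden energy and all symbol names below are illustrative, not the paper's actual expressions:

```python
import sympy as sp

# Principal stretches and one-term Ogden parameters (illustrative names only)
l1, l2, l3, mu, alpha = sp.symbols('lambda1 lambda2 lambda3 mu alpha', positive=True)

# One-term Ogden strain-energy density (incompressible, illustrative form)
W = (mu / alpha) * (l1**alpha + l2**alpha + l3**alpha - 3)

# A principal stress follows by symbolic differentiation: sigma_i = l_i * dW/dl_i
sigma1 = sp.simplify(l1 * sp.diff(W, l1))

# Automatic Fortran code generation, analogous to the MACSYMA workflow
fortran_src = sp.fcode(sigma1, assign_to='sigma1', source_format='free')
print(sigma1)
print(fortran_src)
```

SymPy's `fcode` plays the role of the automatic FORTRAN generation step; a production implementation would also have to remove the repeated-principal-stretch singularities analytically, as the paper describes.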
RigFit: a new approach to superimposing ligand molecules.
Lemmen, C; Hiller, C; Lengauer, T
1998-09-01
If structural knowledge of a receptor under consideration is lacking, drug design approaches focus on similarity or dissimilarity analysis of putative ligands. In this context the mutual superposition of ligands is of utmost importance. Methods that are rapid enough to facilitate interactive use, that allow sets of conformers to be processed, and that enable database screening are of special interest here. The ability to superpose molecular fragments instead of entire molecules has also proven helpful. The RIGFIT approach meets these requirements and has several additional advantages. In three distinct test applications, we evaluated how closely we can approximate the observed relative orientation for a set of known crystal structures, we employed RIGFIT as a fragment placement procedure, and we performed a fragment-based database screening. The run time of RIGFIT can be traded off against its accuracy. To be competitive in accuracy with another state-of-the-art alignment tool, with which we compare our method explicitly, computing times of about 6 s per superposition on an ordinary workstation are required. If longer run times can be afforded, the accuracy increases significantly. RIGFIT is part of the flexible superposition software FLEXS, which can be accessed on the WWW [http://cartan.gmd.de/FlexS].
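RIGFIT itself superposes Gaussian descriptions of molecular shape; as a hedged illustration of the core operation, the classic Kabsch algorithm computes the optimal rigid superposition of two sets of matched atom coordinates. The coordinates below are invented for the example:

```python
import numpy as np

def kabsch_superpose(P, Q):
    """Return a rotated+translated copy of P least-squares aligned onto Q.

    P, Q: (n, 3) arrays of matched atom coordinates (hypothetical input).
    """
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)   # center both point sets
    U, S, Vt = np.linalg.svd(Pc.T @ Qc)               # SVD of the covariance
    d = np.sign(np.linalg.det(Vt.T @ U.T))            # avoid improper rotation
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T           # optimal rotation matrix
    return (R @ Pc.T).T + Q.mean(axis=0)

# Example: Q is P rotated 90 degrees about z and shifted; alignment recovers Q
P = np.array([[1.0, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]])
theta = np.pi / 2
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
Q = (Rz @ P.T).T + np.array([2.0, -1.0, 0.5])
rmsd = np.sqrt(((kabsch_superpose(P, Q) - Q) ** 2).sum(axis=1).mean())
print(rmsd)  # ~0 for an exact rigid transform
```

This point-matching formulation assumes a known atom correspondence, which RIGFIT's Gaussian-overlap approach deliberately avoids; the sketch only shows what "mutual ligand superposition" computes.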
Hildebrand, J; Maycock, B; Comfort, J; Burns, S; Adams, E; Howat, P
2015-12-01
Mature minor consent only became available in Australia in 2007. There is neither an explicitly defined protocol, nor a clear definition evident in the literature, relating to use of the mature minor concept in health research. Due to difficulties in mapping fixed age ranges to varying levels of maturity and vulnerability, there is a lack of clarity surrounding when it might be reasonable and ethical to apply for or grant a waiver of parental consent. This paper describes the challenges faced and solutions created when gaining approval for use of mature minor consent in a respondent-driven sampling (RDS) study exploring social norms and alcohol consumption among 14-17-year-old adolescents (n = 1012) in the community. The University's Human Research Ethics Committee granted mature minor consent for this study, and the techniques applied enabled recruitment of adolescents from community-based settings through use of RDS to achieve the required sample. This paper has relevance for research that requires a waiver of parental consent; it presents a case study for assessing mature minors and makes recommendations on how ethical guidelines can be improved to assist human research ethics application processes.
Low-Outgassing Photogrammetry Targets for Use in Outer Space
NASA Technical Reports Server (NTRS)
Gross, Jason N.; Sampler, Henry; Reed, Benjamin B.
2011-01-01
A short document discusses an investigation of materials for photogrammetry targets for highly sensitive optical scientific instruments to be operated in outer space and in an outer-space-environment-simulating thermal vacuum chamber on Earth. A key consideration in the selection of photogrammetry-target materials for vacuum environments is the need to prevent contamination that could degrade the optical responses of the instruments. Therefore, in addition to the high levels and uniformity of reflectivity required of photogrammetry-target materials suitable for use in air, the materials sought must exhibit minimal outgassing. Commercially available photogrammetry targets were found to outgas excessively under the thermal and vacuum conditions of interest; this finding prompted the investigators to consider optically equivalent or superior, lower-outgassing alternative target materials. The document lists several materials found to satisfy the requirements, but does not state explicitly whether the materials can be used individually or must be combined in the proper sequence into layered target structures. The materials in question are an aluminized polyimide tape, an acrylic pressure-sensitive adhesive, a 500-Å-thick layer of vapor-deposited aluminum, and spherical barium titanate glass beads having various diameters from 20 to 63 microns.
Stochastic soil water balance under seasonal climates
Feng, Xue; Porporato, Amilcare; Rodriguez-Iturbe, Ignacio
2015-01-01
The analysis of soil water partitioning in seasonally dry climates necessarily requires careful consideration of the periodic climatic forcing at the intra-annual timescale in addition to daily scale variabilities. Here, we introduce three new extensions to a stochastic soil moisture model which yields seasonal evolution of soil moisture and relevant hydrological fluxes. These approximations allow seasonal climatic forcings (e.g. rainfall and potential evapotranspiration) to be fully resolved, extending the analysis of soil water partitioning to account explicitly for the seasonal amplitude and the phase difference between the climatic forcings. The results provide accurate descriptions of probabilistic soil moisture dynamics under seasonal climates without requiring extensive numerical simulations. We also find that the transfer of soil moisture from the wet to the dry season is responsible for hysteresis in the hydrological response, showing asymmetrical trajectories in the mean soil moisture and in the transient Budyko's curves during the 'dry-down' versus the 'rewetting' phases of the year. Furthermore, in some dry climates where rainfall and potential evapotranspiration are in phase, annual evapotranspiration can be shown to increase because of inter-seasonal soil moisture transfer, highlighting the importance of soil water storage in the seasonal context. PMID:25663808
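A minimal numerical sketch of such a seasonally forced stochastic soil water balance is a daily bucket model with Poisson rainfall arrivals whose frequency varies sinusoidally through the year. All parameter values below are illustrative, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not the paper's values)
n_days = 3650                 # ten years, daily steps
w_max = 100.0                 # soil storage capacity (mm)
et_max = 4.0                  # maximum daily evapotranspiration (mm)
lam0, lam_amp = 0.3, 0.25     # mean rain-day frequency and seasonal amplitude
mean_depth = 12.0             # mean rainfall depth on rain days (mm)

t = np.arange(n_days)
lam = lam0 + lam_amp * np.sin(2 * np.pi * t / 365.0)  # seasonal rainfall forcing

w = np.empty(n_days)
w[0] = 0.5 * w_max
for i in range(1, n_days):
    rain = rng.exponential(mean_depth) if rng.random() < lam[i] else 0.0
    et = et_max * w[i - 1] / w_max                     # loss grows with moisture
    w[i] = min(max(w[i - 1] + rain - et, 0.0), w_max)  # excess lost as runoff

print(w.mean(), w.min(), w.max())
```

Lagging the moisture trajectory against the rainfall forcing in such a simulation is one way to visualize the inter-seasonal storage carry-over and hysteresis the paper derives analytically.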
Characterizing model uncertainties in the life cycle of lignocellulose-based ethanol fuels.
Spatari, Sabrina; MacLean, Heather L
2010-11-15
Renewable and low carbon fuel standards being developed at federal and state levels require an estimation of the life cycle carbon intensity (LCCI) of candidate fuels that can substitute for gasoline, such as second generation bioethanol. Estimating the LCCI of such fuels with a high degree of confidence requires the use of probabilistic methods to account for known sources of uncertainty. We construct life cycle models for the bioconversion of agricultural residue (corn stover) and energy crops (switchgrass) and explicitly examine uncertainty using Monte Carlo simulation. Using statistical methods to identify significant model variables from public data sets and Aspen Plus chemical process models, we estimate stochastic life cycle greenhouse gas (GHG) emissions for the two feedstocks combined with two promising fuel conversion technologies. The approach can be generalized to other biofuel systems. Our results show potentially high and uncertain GHG emissions for switchgrass-ethanol due to uncertain CO₂ flux from land use change and N₂O flux from N fertilizer. However, corn stover-ethanol, with its low-in-magnitude, tight-in-spread LCCI distribution, shows considerable promise for reducing life cycle GHG emissions relative to gasoline and corn-ethanol. Coproducts are important for reducing the LCCI of all ethanol fuels we examine.
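The probabilistic approach can be sketched as a Monte Carlo propagation of uncertain life-cycle stage emissions into an LCCI distribution. The stage names, distributions, and numbers below are invented for illustration and are not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo draws

# Hypothetical life-cycle stages (gCO2e/MJ); all distributions are illustrative
farming = rng.normal(10.0, 2.0, n)                     # field operations, inputs
n2o_flux = rng.lognormal(mean=1.0, sigma=0.6, size=n)  # fertilizer N2O, skewed
conversion = rng.triangular(5.0, 8.0, 14.0, n)         # biorefinery energy use
transport = rng.normal(3.0, 0.5, n)

lcci = farming + n2o_flux + conversion + transport     # per-draw carbon intensity

gasoline = 94.0  # illustrative reference carbon intensity (gCO2e/MJ)
reduction = 1.0 - lcci / gasoline
print(np.percentile(lcci, [5, 50, 95]))  # report a spread, not a point estimate
print((reduction > 0.5).mean())          # P(>50% GHG reduction vs gasoline)
```

Reporting percentiles and exceedance probabilities, rather than a single value, is exactly what lets a "tight-in-spread" feedstock be distinguished from a high-variance one.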
Towards a minimal stochastic model for a large class of diffusion-reactions on biological membranes.
Chevalier, Michael W; El-Samad, Hana
2012-08-28
Diffusion of biological molecules on 2D biological membranes can play an important role in the behavior of stochastic biochemical reaction systems. Yet, we still lack a fundamental understanding of circumstances where explicit accounting of the diffusion and spatial coordinates of molecules is necessary. In this work, we illustrate how time-dependent, non-exponential reaction probabilities naturally arise when explicitly accounting for the diffusion of molecules. We use the analytical expression of these probabilities to derive a novel algorithm which, while ignoring the exact position of the molecules, can still accurately capture diffusion effects. We investigate the regions of validity of the algorithm and show that for most parameter regimes, it constitutes an accurate framework for studying these systems. We also document scenarios where large spatial fluctuation effects mandate explicit consideration of all the molecules and their positions. Taken together, our results derive a fundamental understanding of the role of diffusion and spatial fluctuations in these systems. Simultaneously, they provide a general computational methodology for analyzing a broad class of biological networks whose behavior is influenced by diffusion on membranes.
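One ingredient such an algorithm needs is sampling reaction waiting times from a time-dependent (hence non-exponential) propensity. A hedged sketch using the standard thinning (rejection) method, with an invented decaying encounter rate standing in for the paper's diffusion-derived probabilities:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_reaction_time(rate_fn, rate_max, t_end, rng):
    """Draw the next-reaction waiting time for a time-dependent propensity
    rate_fn(t) <= rate_max, by thinning (rejection sampling).

    A constant rate recovers the usual exponential waiting time; a decaying
    rate mimics diffusion-limited encounters becoming less likely over time.
    """
    t = 0.0
    while t < t_end:
        t += rng.exponential(1.0 / rate_max)        # candidate event time
        if rng.random() < rate_fn(t) / rate_max:    # accept with prob rate/max
            return t
    return np.inf                                   # no reaction before t_end

# Illustrative propensity: encounter rate decaying as reactants diffuse apart
decay_rate = lambda t: 2.0 * np.exp(-0.5 * t)
times = [sample_reaction_time(decay_rate, 2.0, 50.0, rng) for _ in range(5000)]
finite = [x for x in times if np.isfinite(x)]
print(len(finite) / len(times))  # some trajectories never react, as expected
```

Because the integrated rate here is finite (4.0), a fraction of about e⁻⁴ of trajectories never react, which is the kind of non-exponential behavior a naive constant-propensity Gillespie step cannot capture.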
Benjamin A. Crabb; James A. Powell; Barbara J. Bentz
2012-01-01
Forecasting spatial patterns of mountain pine beetle (MPB) population success requires spatially explicit information on host pine distribution. We developed a means of producing spatially explicit datasets of pine density at 30-m resolution using existing geospatial datasets of vegetation composition and structure. Because our ultimate goal is to model MPB population...
Indirect Goal Priming Is More Powerful than Explicit Instruction in Children
ERIC Educational Resources Information Center
Kesek, Amanda; Cunningham, William A.; Packer, Dominic J.; Zelazo, Philip David
2011-01-01
This study examined the relative efficacy of explicit instruction and indirect priming on young children's behavior in a task that required a series of choices between a small immediate reward and a larger delayed reward. One hundred and six 4-year-old children were randomly assigned to one of four conditions involving one of two goals (maximize…
ERIC Educational Resources Information Center
Granena, Gisela
2012-01-01
Very high-level, functional ability in foreign languages is increasingly important in many walks of life. It is also very rare, and likely requires an early start and/or a special aptitude. This study investigated the extent to which aptitude for explicit learning, defined as "analytic ability" and aptitude for implicit learning, defined…
Maudsley, Gillian
2011-01-01
Some important research questions in medical education and health services research need 'mixed methods research' (particularly synthesizing quantitative and qualitative findings). The approach is not new, but should be more explicitly reported. The broad search question here, of a disjointed literature, was thus: what is mixed methods research, and how should it relate to medical education research? The focus was on explicit acknowledgement of 'mixing'. Literature searching focused on Web of Knowledge, supplemented by other databases across disciplines. Five main messages emerged:
- Thinking quantitative and qualitative, not quantitative versus qualitative
- Appreciating that mixed methods research blends different knowledge claims, enquiry strategies, and methods
- Using a 'horses for courses' [whatever works] approach to the question, and clarifying the mix
- Appreciating how medical education research competes with the 'evidence-based' movement, health services research, and the 'RCT'
- Being more explicit about the role of mixed methods in medical education research, and the required expertise
Mixed methods research is valuable, yet the literature relevant to medical education is fragmented and poorly indexed. The required time, effort, expertise, and techniques deserve better recognition. More write-ups should explicitly discuss the 'mixing' (particularly of findings), rather than report separate components.
NASA Technical Reports Server (NTRS)
Jaggers, R. F.
1977-01-01
A derivation of an explicit solution to the two-point boundary-value problem of exoatmospheric guidance and trajectory optimization is presented. Fixed initial conditions and continuous-burn, multistage thrusting are assumed. Any number of end conditions from one to six (throttling is required in the case of six) can be satisfied in an explicit and practically optimal manner. The explicit equations converge for off-nominal conditions such as engine failure, abort, target switch, etc. The self-starting predictor/corrector solution involves no Newton-Raphson iterations, numerical integration, or first-guess values, and converges rapidly if physically possible. A form of this algorithm has been chosen for onboard guidance, as well as real-time and preflight ground targeting and trajectory shaping, for the NASA Space Shuttle Program.
Stakeholders apply the GRADE evidence-to-decision framework to facilitate coverage decisions.
Dahm, Philipp; Oxman, Andrew D; Djulbegovic, Benjamin; Guyatt, Gordon H; Murad, M Hassan; Amato, Laura; Parmelli, Elena; Davoli, Marina; Morgan, Rebecca L; Mustafa, Reem A; Sultan, Shahnaz; Falck-Ytter, Yngve; Akl, Elie A; Schünemann, Holger J
2017-06-01
Coverage decisions are complex and require the consideration of many factors. A well-defined, transparent process could improve decision-making and facilitate decision-maker accountability. We surveyed key US-based stakeholders regarding their current approaches for coverage decisions. Then, we held a workshop to test an evidence-to-decision (EtD) framework for coverage based on the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) criteria. A total of 42 individuals (including 19 US stakeholders as well as international health policymakers and GRADE working group members) attended the workshop. Of the 19 stakeholders, 14 (74%) completed the survey before the workshop. Almost all of their organizations (13 of 14; 93%) used systematic reviews for coverage decision-making; few (2 of 14; 14%) developed their own evidence synthesis; a majority (9 of 14; 64%) rated the certainty of evidence (using various systems); almost all (13 of 14; 93%) denied formal consideration of resource use; and half (7 of 14; 50%) reported explicit criteria for decision-making. At the workshop, stakeholders successfully applied the EtD framework to four case studies and provided narrative feedback, which centered on contextual factors affecting coverage decisions in the United States, the need for reliable data on subgroups of patients, and the challenge of decision-making without formal consideration of resource use. Stakeholders successfully applied the EtD framework to four case studies and highlighted contextual factors affecting coverage decisions and affirmed its value. Their input informed the further development of a revised EtD framework, now publicly available (http://gradepro.org/). Published by Elsevier Inc.
Singing over the Wall: Legal and Ethical Considerations for Sacred Music in the Public Schools
ERIC Educational Resources Information Center
Drummond, Tim
2014-01-01
Music with sacred texts is integral to the historical and modern development of the music education field, yet many who teach in public schools find themselves limited or banned from using sacred music. School divisions do not have a consensus opinion on how to handle this sensitive topic, and the law is not explicit. In this article, I provide an…
ERIC Educational Resources Information Center
Segall, Avner; Burke, Kevin
2013-01-01
While it is true that following various Supreme Court decisions in the last century, religion is, in most cases, no longer explicitly taught in public school classrooms, we use this article to explore the ways in which implicit religious understandings regarding curriculum and pedagogy still remain prevalent in current public education. Building…
ERIC Educational Resources Information Center
Mellor, Liz
2011-01-01
This article aims to make explicit the evolving ecology of ideas in the field of community music and higher education that are particular to a context yet transferable across respective fields of enquiry--music education, community music, music therapy and community music therapy. This is contextualized in two ways: (1) through a consideration of…
78 FR 69606 - Record Requirements in the Mechanical Power Presses Standard
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-20
... press is implicit in the requirement in existing paragraph (e)(1)(i), which specifies that the employer... believes that adding an explicit requirement to perform necessary maintenance and repair will ensure that... weekly inspections and tests required by existing paragraph (e)(1)(ii) serve the following functions: (i...
Requirements for the formal representation of pathophysiology mechanisms by clinicians
Helvensteijn, M.; Kokash, N.; Martorelli, I.; Sarwar, D.; Islam, S.; Grenon, P.; Hunter, P.
2016-01-01
Knowledge of multiscale mechanisms in pathophysiology is the bedrock of clinical practice. If quantitative methods, predicting patient-specific behaviour of these pathophysiology mechanisms, are to be brought to bear on clinical decision-making, the Human Physiome community and Clinical community must share a common computational blueprint for pathophysiology mechanisms. A number of obstacles stand in the way of this sharing—not least the technical and operational challenges that must be overcome to ensure that (i) the explicit biological meanings of the Physiome's quantitative methods to represent mechanisms are open to articulation, verification and study by clinicians, and that (ii) clinicians are given the tools and training to explicitly express disease manifestations in direct contribution to modelling. To this end, the Physiome and Clinical communities must co-develop a common computational toolkit, based on this blueprint, to bridge the representation of knowledge of pathophysiology mechanisms (a) that is implicitly depicted in electronic health records and the literature, with (b) that found in mathematical models explicitly describing mechanisms. In particular, this paper makes use of a step-wise description of a specific disease mechanism as a means to elicit the requirements of representing pathophysiological meaning explicitly. The computational blueprint developed from these requirements addresses the Clinical community goals to (i) organize and manage healthcare resources in terms of relevant disease-related knowledge of mechanisms and (ii) train the next generation of physicians in the application of quantitative methods relevant to their research and practice. PMID:27051514
[Hierarchy structuring for mammography technique by interpretive structural modeling method].
Kudo, Nozomi; Kurowarabi, Kunio; Terashita, Takayoshi; Nishimoto, Naoki; Ogasawara, Katsuhiko
2009-10-20
Participation in screening mammography is currently desired in Japan because of the increase in breast cancer morbidity. However, the pain and discomfort of mammography is recognized as a significant deterrent for women considering this examination. Thus quick procedures, sufficient experience, and advanced skills are required of radiologic technologists. The aim of this study was to make the key points of the imaging technique explicit and to aid understanding of this complicated procedure. We interviewed 3 technologists who were highly skilled in mammography, and 14 factors were retrieved by using brainstorming and the KJ method. We then applied Interpretive Structural Modeling (ISM) to the factors and developed a hierarchical concept structure. The result showed a six-layer hierarchy whose top node was explanation of the entire mammography procedure. Male technologists were identified as a negative factor. Factors concerned with explanation were at the upper nodes. We gave particular attention to X-ray techniques and related considerations. The findings will help beginners improve their skills.
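The ISM step can be sketched in code: compute the reachability matrix (the Boolean transitive closure of the pairwise influence relation) and then peel off hierarchy levels where each factor's reachability set is contained in its antecedent set. The 3-factor adjacency matrix below is a toy example, not the study's 14-factor data:

```python
import numpy as np

def reachability(adj):
    """Transitive closure (with self-reachability) of a Boolean influence matrix."""
    n = adj.shape[0]
    R = adj | np.eye(n, dtype=bool)
    while True:
        R2 = ((R.astype(int) @ R.astype(int)) > 0) | R  # Boolean matrix square
        if (R2 == R).all():
            return R
        R = R2

def ism_levels(adj):
    """Partition factors into ISM hierarchy levels (top level first)."""
    R = reachability(adj)
    remaining = set(range(adj.shape[0]))
    levels = []
    while remaining:
        # A factor is at the current level when its reachability set (within the
        # remaining factors) is a subset of its antecedent set.
        level = {i for i in remaining
                 if {j for j in remaining if R[i, j]}
                 <= {j for j in remaining if R[j, i]}}
        levels.append(sorted(level))
        remaining -= level
    return levels

# Toy influence structure: 0 -> 1 -> 2 (factor 2 ends up at the top level)
adj = np.array([[0, 1, 0],
                [0, 0, 1],
                [0, 0, 0]], dtype=bool)
print(ism_levels(adj))  # [[2], [1], [0]]
```

The study's interview-derived factor relations would enter as a 14x14 adjacency matrix in place of the toy one.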
Timpka, Toomas; Eriksson, Henrik; Strömgren, Magnus; Eriksson, Olle; Ekberg, Joakim; Grimvall, Anders; Nyce, James; Gursky, Elin; Holm, Einar
2010-01-01
The global spread of a novel A (H1N1) influenza virus in 2009 has highlighted the possibility of a devastating pandemic similar to the 'Spanish flu' of 1917-1918. Responding to such pandemics requires careful planning for the early phases, where there is no availability of pandemic vaccine. We set out to compute a Neighborhood Influenza Susceptibility Index (NISI) describing the vulnerability of local communities of different geo-socio-physical structure to a pandemic influenza outbreak. We used a spatially explicit geo-physical model of Linköping municipality (pop. 136,240) in Sweden, and employed an ontology-modeling tool to define simulation models and transmission settings. We found considerable differences in NISI between neighborhoods corresponding to primary care areas with regard to early progress of the outbreak, as well as in terms of the total accumulated share of infected residents counted after the outbreak. The NISI can be used in local preparations of physical response measures during pandemics. PMID:21347087
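The idea of a neighborhood-level susceptibility index can be illustrated, in a much-simplified deterministic form, by running an SIR model with neighborhood-specific contact intensities and comparing final attack rates. The neighborhood names and R0 values below are invented; the actual study used a spatially explicit, ontology-driven simulation model:

```python
def attack_rate(r0, days=1000, i0=1e-3):
    """Final infected share from a discrete-time SIR with basic reproduction r0."""
    gamma = 0.25            # recovery rate (1/4 days), illustrative
    beta = r0 * gamma       # transmission rate implied by r0
    s, i, r = 1.0 - i0, i0, 0.0
    for _ in range(days):
        new_inf = beta * s * i
        s, i, r = s - new_inf, i + new_inf - gamma * i, r + gamma * i
    return r

# Hypothetical neighborhoods differing in contact structure (invented r0 values)
neighborhoods = {'dense_urban': 1.8, 'suburban': 1.4, 'rural': 1.1}
nisi = {name: round(attack_rate(r0), 3) for name, r0 in neighborhoods.items()}
print(nisi)  # higher-contact neighborhoods accumulate a larger infected share
```

Ranking neighborhoods by such an outcome measure is, in spirit, what an index like the NISI summarizes for local preparedness planning.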
NASA Astrophysics Data System (ADS)
Sehati, N.; Tavassoly, M. K.
2017-08-01
Inspired by the scheme proposed in (Zheng, Phys. Rev. A 69:064302, 2004), our aim is to teleport an unknown atomic qubit state using the cavity QED method without an explicit Bell-state measurement, so that no additional atom is required. Two identical Λ-type three-level atoms interact separately and subsequently with a two-mode quantized cavity field, where each mode is prepared in a single-photon field state. The interaction between atoms and field is well described by the Jaynes-Cummings model. It is then shown how, if the atomic detection yields a particular state of atom 1, an unknown state can be appropriately teleported from atom 1 to atom 2. This teleportation procedure leads to a fidelity F (success probability P_g) in the range 69% ≲ F ≲ 100% (0.14 ≲ P_g ≲ 0.56). Finally, we illustrate that our scheme considerably improves on similar previous proposals.
A curvilinear, fully implicit, conservative electromagnetic PIC algorithm in multiple dimensions
Chacon, L.; Chen, G.
2016-04-19
Here, we extend a recently proposed fully implicit PIC algorithm for the Vlasov–Darwin model in multiple dimensions (Chen and Chacón (2015) [1]) to curvilinear geometry. As in the Cartesian case, the approach is based on a potential formulation (Φ, A), and overcomes many difficulties of traditional semi-implicit Darwin PIC algorithms. Conservation theorems for local charge and global energy are derived in curvilinear representation, and then enforced discretely by a careful choice of the discretization of field and particle equations. Additionally, the algorithm conserves canonical momentum in any ignorable direction, and preserves the Coulomb gauge ∇ · A = 0 exactly. An asymptotically well-posed fluid preconditioner allows efficient use of large cell sizes, which are determined by accuracy considerations, not stability, and can be orders of magnitude larger than required in a standard explicit electromagnetic PIC simulation. We demonstrate the accuracy and efficiency properties of the algorithm with numerical experiments in mapped meshes in 1D-3V and 2D-3V.
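The stability-versus-accuracy point can be illustrated on a toy problem: at a step size far beyond the explicit stability limit, an implicit (midpoint) integrator remains stable while forward Euler blows up. This is only an analogy for the PIC time-stepping argument, not the paper's scheme:

```python
import numpy as np

# Test problem: harmonic oscillator x'' = -w^2 x, stepped far above the
# explicit stability limit, to show why implicit schemes let the step size
# be chosen by accuracy rather than stability.
w = 10.0
dt = 0.5          # forward Euler stability would require dt << 1/w
steps = 100

def forward_euler(x, v):
    return x + dt * v, v - dt * w**2 * x

def implicit_midpoint(x, v):
    # Solve the 2x2 linear midpoint system exactly; for a linear oscillator
    # this scheme conserves the quadratic energy.
    a = 0.5 * dt
    A = np.array([[1.0, -a], [a * w**2, 1.0]])
    b = np.array([x + a * v, v - a * w**2 * x])
    return tuple(np.linalg.solve(A, b))

xe, ve = 1.0, 0.0
xi, vi = 1.0, 0.0
for _ in range(steps):
    xe, ve = forward_euler(xe, ve)
    xi, vi = implicit_midpoint(xi, vi)

energy = lambda x, v: 0.5 * v**2 + 0.5 * w**2 * x**2
print(energy(xe, ve))   # blows up: explicit scheme is unstable at this dt
print(energy(xi, vi))   # stays at the initial value 0.5*w**2 = 50
```

The implicit PIC algorithm makes the analogous trade at scale: a nonlinear solve per step (here, a 2x2 linear solve) buys freedom from the explicit stability constraint.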
A curvilinear, fully implicit, conservative electromagnetic PIC algorithm in multiple dimensions
NASA Astrophysics Data System (ADS)
Chacón, L.; Chen, G.
2016-07-01
We extend a recently proposed fully implicit PIC algorithm for the Vlasov-Darwin model in multiple dimensions (Chen and Chacón (2015) [1]) to curvilinear geometry. As in the Cartesian case, the approach is based on a potential formulation (ϕ, A), and overcomes many difficulties of traditional semi-implicit Darwin PIC algorithms. Conservation theorems for local charge and global energy are derived in curvilinear representation, and then enforced discretely by a careful choice of the discretization of field and particle equations. Additionally, the algorithm conserves canonical momentum in any ignorable direction, and preserves the Coulomb gauge ∇ · A = 0 exactly. An asymptotically well-posed fluid preconditioner allows efficient use of large cell sizes, which are determined by accuracy considerations, not stability, and can be orders of magnitude larger than required in a standard explicit electromagnetic PIC simulation. We demonstrate the accuracy and efficiency properties of the algorithm with numerical experiments in mapped meshes in 1D-3V and 2D-3V.
Kinetic description of large-scale low pressure glow discharges
NASA Astrophysics Data System (ADS)
Kortshagen, Uwe; Heil, Brian
1997-10-01
In recent years the so-called "nonlocal approximation" to the solution of the electron Boltzmann equation has attracted considerable attention as an extremely efficient method for the kinetic modeling of low pressure discharges. However, it appears that modern discharges, which are optimized to provide large-scale plasma uniformity, are explicitly designed to work in a regime in which the nonlocal approximation is no longer strictly valid. In the presentation we discuss results of a hybrid model, which is based on the natural division of the electron distribution function into a nonlocal body, which is determined by elastic collisions only, and a high-energy part which requires a more complete treatment due to the action of inelastic collisions and wall losses of electrons. The method is applied to an inductively coupled low pressure discharge. We discuss the transition from plasma density profiles maximal on the discharge axis to plasma density profiles with off-center maxima, which has been observed in experiments. A positive feedback mechanism involved in this transition is pointed out.
Emotional engineers: toward morally responsible design.
Roeser, Sabine
2012-03-01
Engineers are normally seen as the archetype of people who make decisions in a rational and quantitative way. However, technological design is not value neutral. The way a technology is designed determines its possibilities, which can, for better or for worse, have consequences for human wellbeing. This leads various scholars to the claim that engineers should explicitly take into account ethical considerations. They are at the cradle of new technological developments and can thereby influence the possible risks and benefits more directly than anybody else. I have argued elsewhere that emotions are an indispensable source of insight into the ethical aspects of risk. In this paper I argue that this means that engineers should also include emotional reflection in their work. This requires a new understanding of the competencies of engineers: they should not be unemotional calculators; quite the opposite, they should work to cultivate their moral emotions and sensitivity, in order to be engaged in morally responsible engineering. © The Author(s) 2010. This article is published with open access at Springerlink.com
ERIC Educational Resources Information Center
Smith, Herbert A.
This study involved examining an instructional unit with regard to its concept content and appropriateness for its target audience. The study attempted to determine (1) what concepts are treated explicitly or implicitly, (2) whether there is a hierarchical conceptual structure within the unit, (3) what level of sophistication is required to…
DoD Product Line Practice Workshop Report
1998-05-01
capability. The essential enterprise management practices include ensuring sound business goals providing an appropriate funding model performing...business. This way requires vision and explicit support at the organizational level. There must be an explicit funding model to support the development...the same group seems to work best in smaller organizations. A funding model for core asset development also needs to be developed because the core
ERIC Educational Resources Information Center
Colangelo, A.; Buchanan, L.
2005-01-01
We report evidence for dissociation between explicit and implicit access to word representations in a deep dyslexic patient (JO). JO read aloud a series of ambiguous (e.g., bank) and unambiguous (e.g., food) words and performed a lexical decision task using these same items. When required to explicitly access the items (i.e., naming), JO showed…
Siegel, Jason T; Navarro, Mario A; Thomson, Andrew L
2015-10-01
Investigations conducted through Amazon's Mechanical Turk (MTurk) sometimes explicitly note eligibility requirements when recruiting participants; however, the impact of this practice on data integrity is relatively unexplored within the MTurk context. Contextualized in the organ donor registration domain, the current study assessed whether overtly listing eligibility requirements impairs the accuracy of data collected on MTurk. On day 1, the first and third rounds of data collection did not list eligibility requirements; the second and fourth rounds overtly listed a qualification requirement: status as a non-registered organ donor. On day 2, the approach was identical, except the order was reversed: the first and third rounds overtly listed the study qualifications, while the second and fourth did not. These procedures provided eight different waves of data. In addition, all participants were randomly assigned to read an elevating (i.e., morally inspiring) story or a story not intended to elicit any emotion. Regardless of recruitment approach, only participants who were not registered as donors were included in the analysis. Results indicated that the recruitment script that explicitly requested non-registered donors yielded participants with higher mean intention scores than the script that did not overtly list the eligibility requirements. Further, even though the elevation induction increased intentions to register as a donor, there was no significant interaction between recruitment approach and the influence of the elevation manipulation on registration intentions. Explicitly listing eligibility requirements can influence the accuracy of estimates derived from data collected through MTurk. Copyright © 2015 Elsevier Ltd. All rights reserved.
Impact of negation salience and cognitive resources on negation during attitude formation.
Boucher, Kathryn L; Rydell, Robert J
2012-10-01
Because of the increased cognitive resources required to process negations, past research has shown that explicit attitude measures are more sensitive to negations than implicit attitude measures. The current work demonstrated that the differential impact of negations on implicit and explicit attitude measures was moderated by (a) the extent to which the negation was made salient and (b) the amount of cognitive resources available during attitude formation. When negations were less visually salient, explicit but not implicit attitude measures reflected the intended valence of the negations. When negations were more visually salient, both explicit and implicit attitude measures reflected the intended valence of the negations, but only when perceivers had ample cognitive resources during encoding. Competing models of negation processing, schema-plus-tag and fusion, were examined to determine how negation salience impacts the processing of negations.
Development of iterative techniques for the solution of unsteady compressible viscous flows
NASA Technical Reports Server (NTRS)
Sankar, Lakshmi N.; Hixon, Duane
1991-01-01
Efficient iterative solution methods are being developed for the numerical solution of two- and three-dimensional compressible Navier-Stokes equations. Iterative time marching methods have several advantages over classical multi-step explicit time marching schemes and non-iterative implicit time marching schemes. Iterative schemes have better stability characteristics than non-iterative explicit and implicit schemes, and the extra work they require can be structured to perform efficiently on current and future generations of scalable, massively parallel machines. An obvious candidate for iteratively solving the system of coupled nonlinear algebraic equations arising in CFD applications is the Newton method. Newton's method was implemented in existing finite difference and finite volume methods. Depending on the complexity of the problem, however, the number of Newton iterations needed per time step to solve the discretized system of equations can vary dramatically from a few to several hundred. Another popular approach, based on the classical conjugate gradient method and known as the GMRES (Generalized Minimum Residual) algorithm, is also investigated. The GMRES algorithm has been used in the past by a number of researchers for solving steady viscous and inviscid flow problems with considerable success. Here, its suitability is investigated for solving the system of nonlinear equations that arises in unsteady Navier-Stokes solvers at each time step. Unlike the Newton method, which attempts to drive the error in the solution at each and every node down to zero, the GMRES algorithm only seeks to minimize the L2 norm of the error. In the GMRES algorithm, the changes in the flow properties from one time step to the next are assumed to be the sum of a set of orthogonal vectors.
By keeping the number of vectors to a reasonably small value N (between 5 and 20), the work required to advance the solution from one time step to the next may be kept to (N+1) times that of a non-iterative scheme. Many of the operations required by the GMRES algorithm, such as matrix-vector multiplications and matrix additions and subtractions, can be vectorized and parallelized efficiently.
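The core of the approach the abstract describes, building an orthogonal set of Krylov vectors and minimising the L2 norm of the residual over their span, can be sketched compactly. The following is an illustrative single-cycle GMRES in NumPy, not the authors' flow solver; the function name and the small test matrix are invented for the example.

```python
import numpy as np

def gmres_cycle(A, b, x0, n_vectors=10):
    """One (non-restarted) GMRES cycle: minimise ||b - A x|| over the
    affine Krylov subspace x0 + span{r0, A r0, ..., A^(m-1) r0}.
    Illustrative sketch only."""
    r0 = b - A @ x0
    beta = np.linalg.norm(r0)
    if beta == 0.0:                 # x0 already solves the system
        return x0
    m = n_vectors
    Q = np.zeros((len(b), m + 1))   # orthonormal Krylov basis (Arnoldi)
    H = np.zeros((m + 1, m))        # upper Hessenberg projection of A
    Q[:, 0] = r0 / beta
    for j in range(m):
        w = A @ Q[:, j]
        for i in range(j + 1):      # modified Gram-Schmidt orthogonalisation
            H[i, j] = Q[:, i] @ w
            w = w - H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-14:     # breakdown: Krylov space is invariant
            m = j + 1
            break
        Q[:, j + 1] = w / H[j + 1, j]
    # small projected least-squares problem: min ||beta * e1 - H y||
    e1 = np.zeros(m + 1)
    e1[0] = beta
    y, *_ = np.linalg.lstsq(H[:m + 1, :m], e1, rcond=None)
    return x0 + Q[:, :m] @ y
```

For the unsteady solvers described above, one such cycle with N between 5 and 20 vectors would be applied at each time step, which is where the (N+1)-times-work estimate comes from.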
Bashir, Ali; Bansal, Vikas; Bafna, Vineet
2010-06-18
Massively parallel DNA sequencing technologies have enabled the sequencing of several individual human genomes. These technologies are also being used in novel ways for mRNA expression profiling, genome-wide discovery of transcription-factor binding sites, small RNA discovery, etc. The multitude of sequencing platforms, each with its unique characteristics, poses a number of design challenges regarding the technology to be used and the depth of sequencing required for a particular sequencing application. Here we describe a number of analytical and empirical results to address design questions for two applications: detection of structural variations from paired-end sequencing and estimating mRNA transcript abundance. For structural variation, our results provide explicit trade-offs between the detection and resolution of rearrangement breakpoints, and the optimal mix of paired-read insert lengths. Specifically, we prove that optimal detection and resolution of breakpoints is achieved using a mix of exactly two insert library lengths. Furthermore, we derive explicit formulae to determine these insert length combinations, enabling a 15% improvement in breakpoint detection at the same experimental cost. On empirical short read data, these predictions show good concordance with Illumina 200 bp and 2 kbp insert length libraries. For transcriptome sequencing, we determine the sequencing depth needed to detect rare transcripts from a small pilot study. With only 1 million reads, we derive corrections that enable almost perfect prediction of the underlying expression probability distribution, and use this to predict the sequencing depth required to detect low expressed genes with greater than 95% probability. Together, our results form a generic framework for many design considerations related to high-throughput sequencing. 
We provide software tools http://bix.ucsd.edu/projects/NGS-DesignTools to derive platform independent guidelines for designing sequencing experiments (amount of sequencing, choice of insert length, mix of libraries) for novel applications of next generation sequencing.
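The transcript-detection question lends itself to a back-of-the-envelope calculation. The sketch below assumes a plain binomial sampling model rather than the paper's empirically corrected distribution, and estimates the sequencing depth needed to observe a transcript with a given per-read sampling probability; the function name is illustrative.

```python
import math

def reads_needed(p_read, detect_prob=0.95, min_reads=1):
    """Smallest total read count N such that a transcript sampled with
    per-read probability p_read is seen in at least min_reads reads
    with probability >= detect_prob, under a binomial sampling model.
    Illustrative sketch, not the paper's corrected estimator."""
    if min_reads == 1:
        # closed form for "at least one read": 1 - (1 - p)^N >= detect_prob
        return math.ceil(math.log(1.0 - detect_prob) / math.log(1.0 - p_read))
    N = min_reads
    while True:
        # P(fewer than min_reads hits) under Binomial(N, p_read)
        p_miss = sum(math.comb(N, k) * p_read**k * (1.0 - p_read)**(N - k)
                     for k in range(min_reads))
        if 1.0 - p_miss >= detect_prob:
            return N
        N += 1
```

For a transcript contributing one read in a million, roughly three million reads are needed to see it at least once with 95% probability, which illustrates why detecting rare transcripts dominates depth requirements.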
What is presumed when we presume consent?
Pierscionek, Barbara K
2008-01-01
Background The organ donor shortfall in the UK has prompted calls to introduce legislation to allow for presumed consent: if there is no explicit objection to donation of an organ, consent should be presumed. The current debate has not taken into account accepted meanings of presumption in law and science, or the consequences for rights of ownership that would arise should presumed consent become law. In addition, arguments revolve around the rights of the competent autonomous adult but do not always consider the more serious implications for children or the disabled. Discussion Any action or decision made on a presumption is accepted in law and science as one based on judgement of a provisional situation. It should therefore allow the possibility of reversing the action or decision. Presumed consent to organ donation will not permit such reversal. Placing prime importance on the functionality of body organs and their capacity to sustain life rather than on explicit consent of the individual will lead to further debate about rights of ownership and potentially to questions about financial incentives and to whom benefits should accrue. Factors that influence donor rates are not fully understood and attitudes of the public to presumed consent require further investigation. Presuming consent will also necessitate considering how such a measure would be applied in situations involving children and mentally incompetent adults. Summary The presumption of consent to organ donation cannot be understood in the same way as is presumption when applied to science or law. Consideration should be given to the consequences of presuming consent and to the questions of ownership and organ monetary value, as these questions are likely to arise should presumed consent be permitted. 
In addition, the implications of presumed consent for children and adults who are unable to object to organ donation require serious contemplation if these most vulnerable members of society are to be protected. PMID:18439242
Modelling virus- and host-limitation in vectored plant disease epidemics.
Jeger, M J; van den Bosch, F; Madden, L V
2011-08-01
Models of plant virus epidemics have received less attention than those of epidemics caused by fungal pathogens. Intuitively, the fact that virus diseases are systemic means that the individual diseased plant can be considered the population unit, which simplifies modelling. However, because a vector is required for virus transmission in the vast majority of cases, the vector must either be considered explicitly, or its involvement in the transmission process must be considered implicitly. In the latter case it is also important that within-plant processes, such as virus multiplication and systemic movement, are taken into account. In this paper we propose an approach based on linking transmission at the population level with virus multiplication within plants. The resulting models are parameter-sparse and hence simple. However, the range of model outcomes is representative of field observations relating to the apparent limitation of epidemic development in populations of healthy susceptible plants. We propose that epidemic development can be constrained by virus limitation in the early stages of an epidemic, when the availability of healthy susceptible hosts is not limiting. There is an inverse relationship between levels of transmission in the population and the mean virus titre per infected plant. In the case of competition between viruses, both virus and host limitation are likely to be important in determining whether one virus can displace another or whether both viruses can co-exist in a plant population. Lotka-Volterra type equations are derived to describe density-dependent competition between two viruses multiplying within plants, embedded within a population-level epidemiological model. Explicit expressions determining displacement or co-existence of the viruses are obtained. 
Unlike the classical Lotka-Volterra competition equations, the co-existence requirement that both competition coefficients be less than 1 can be relaxed. Copyright © 2011 Elsevier B.V. All rights reserved.
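The classical Lotka-Volterra competition structure referred to above can be illustrated numerically. This sketch integrates the textbook two-species competition equations by forward Euler; it is not the authors' embedded within-plant model, and all parameter names are illustrative. With equal carrying capacities, coexistence is expected when both competition coefficients are below 1, and competitive exclusion otherwise.

```python
def lv_competition(x0, y0, r=(1.0, 1.0), K=(1.0, 1.0), a=(0.5, 0.5),
                   dt=0.01, steps=20000):
    """Forward-Euler integration of the classical two-species
    Lotka-Volterra competition equations:
        dx/dt = r1 * x * (1 - (x + a12 * y) / K1)
        dy/dt = r2 * y * (1 - (y + a21 * x) / K2)
    Returns the densities (x, y) after steps * dt time units.
    Illustrative textbook form only."""
    x, y = x0, y0
    for _ in range(steps):
        dx = r[0] * x * (1.0 - (x + a[0] * y) / K[0])
        dy = r[1] * y * (1.0 - (y + a[1] * x) / K[1])
        x += dt * dx
        y += dt * dy
    return x, y
```

With symmetric coefficients a12 = a21 = 0.5 both species settle at 2/3 of carrying capacity; raising a12 above 1 (while a21 stays below 1) lets species 2 displace species 1.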
On lattice chiral gauge theories
NASA Technical Reports Server (NTRS)
Maiani, L.; Rossi, G. C.; Testa, M.
1991-01-01
The Smit-Swift-Aoki formulation of a lattice chiral gauge theory is presented. In this formulation the Wilson and other non-invariant terms in the action are made gauge invariant by coupling to a nonlinear auxiliary scalar field, omega. It is shown that omega decouples from the physical states only if appropriate parameters are tuned so as to satisfy a set of BRST identities. In addition, explicit ghost fields are necessary to ensure decoupling. These theories can give rise to the correct continuum limit. Similar considerations apply to schemes with mirror fermions. Simpler cases with a global chiral symmetry are discussed, and it is shown that the theory becomes free at decoupling. Recent numerical simulations agree with these considerations.
Commentary: Writing and Evaluating Qualitative Research Reports
Thompson, Deborah; Aroian, Karen J.; McQuaid, Elizabeth L.; Deatrick, Janet A.
2016-01-01
Objective To provide an overview of qualitative methods, particularly for reviewers and authors who may be less familiar with qualitative research. Methods A question and answer format is used to address considerations for writing and evaluating qualitative research. Results and Conclusions When producing qualitative research, individuals are encouraged to address the qualitative research considerations raised and to explicitly identify the systematic strategies used to ensure rigor in study design and methods, analysis, and presentation of findings. Increasing capacity for review and publication of qualitative research within pediatric psychology will advance the field’s ability to gain a better understanding of the specific needs of pediatric populations, tailor interventions more effectively, and promote optimal health. PMID:27118271
76 FR 8923 - Explosive Siting Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-16
....regulations.gov , including any personal information you provide. Using the search function of the docket Web... requirements, but do not impose explicit separation requirements. The employer must guarantee the mechanical... ensure these systems are adequate for their functions and must maintain the components. 29 CFR 1910.119(j...
77 FR 29247 - Federal Motor Vehicle Safety Standards; Occupant Crash Protection
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-17
...). ACTION: Final rule; technical amendments. SUMMARY: This final rule makes technical amendments to Federal... advanced air bag requirements. As written now, the general warning label requirements contain an explicit... equipment requirements for restraint systems. This document makes technical amendments to several of the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ji Zhengfeng; Feng Yuan; Ying Mingsheng
Local quantum operations and classical communication (LOCC) put considerable constraints on many quantum information processing tasks such as cloning and discrimination. Surprisingly, however, discrimination of any two pure states survives such constraints in some sense. We show that cloning is not so lucky; namely, probabilistic LOCC cloning of two product states is strictly less efficient than global cloning. We prove our result by explicitly giving the efficiency formula for local cloning of any two product states.
A discussion supporting presumed consent for posthumous sperm procurement and conception.
Tremellen, Kelton; Savulescu, Julian
2015-01-01
Conception of a child using cryopreserved sperm from a deceased man is generally considered ethically sound provided explicit consent for its use has been made, thereby protecting the man's autonomy. When death is sudden (trauma, unexpected illness), explicit consent is not possible, thereby preventing posthumous sperm procurement (PSP) and conception according to current European Society of Human Reproduction and Embryology and the American Society for Reproductive Medicine guidelines. Here, we argue that autonomy of a deceased person should not be considered the paramount ethical concern, but rather consideration of the welfare of the living (widow and prospective child) should be the primary focus. Posthumous conception can bring significant advantages to the widow and her resulting child, with most men supporting such practice. We suggest that a deceased man can benefit from posthumous conception (continuation of his 'bloodline', allowing his widow's wishes for a child to be satisfied), and has a moral duty to allow his widow access to his sperm, if she so wishes, unless he clearly indicated that he did not want children when alive. We outline the arguments favouring presumed consent over implied or proxy consent, plus practical considerations for recording men's wishes to opt-out of posthumous conception. Copyright © 2014 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Garcia, Elena
The demand for air travel is expanding beyond the capacity of the existing National Airspace System. Excess traffic results in delays and compromised safety. Thus, a number of initiatives to improve airspace capacity have been proposed. To assess the impact of these technologies on air traffic, one must move beyond the vehicle to a system-of-systems point of view. This top-level perspective must include consideration of the aircraft, airports, air traffic control and airlines that make up the airspace system. In addition to these components and their interactions, economics, safety and government regulations must also be considered. Furthermore, the air transportation system is inherently variable, with changes in everything from fuel prices to the weather. The development of a modeling environment that enables a comprehensive probabilistic evaluation of technological impacts was the subject of this thesis. The final modeling environment used economics as the thread to tie the airspace components together. Airport capacities and delays were calculated explicitly with due consideration to the impacts of air traffic control. The delay costs were then calculated for an entire fleet, and an airline economic analysis, considering the impact of these costs, was carried out. Airline return on investment was considered the metric of choice since it brings together all costs and revenues, including the cost of delays, landing fees for airport use and aircraft financing costs. Safety was found to require a level of detail unsuitable for a system-of-systems approach and was relegated to future airspace studies. Environmental concerns were considered to be incorporated into airport regulations and procedures and were not explicitly modeled. A deterministic case study was developed to test this modeling environment. The Atlanta airport operations for the year 2000 were used for validation purposes. 
A 2005 baseline was used as a basis for comparing the four technologies considered: a very large aircraft, Terminal Area Productivity air traffic control technologies, smoothing of an airline schedule, and the addition of a runway. A case including all four technologies simultaneously was also considered. Unfortunately, the complexity of the system prevented full exploration of the probabilistic aspects of the National Airspace System.
Pre-gilbertian conceptions of terrestrial magnetism
Smith, P.J.
1968-01-01
It is now well known that William Gilbert, in his De Magnete of 1600, first suggested that the earth behaves as a great magnet. By their very nature, however, such explicit statements tend, in retrospect, to be emphasised at the expense of less explicit antecedent ideas and experiments, with the result that, in the example under consideration here, the impression has sometimes been given that before Gilbert there was not the slightest suspicion that the earth exerts influence on the magnetic needle. In fact, Gilbert's conclusion represented the culmination of many centuries of thought and experimentation on the subject. This essay traces the main steps in the evolutionary process from the idea that magnetic 'virtue' derived from the heavens, through the gradual realisation that magnetism is closely associated with the earth, up to the time of Gilbert's definite statement. © 1968.
Using effort information with change-in-ratio data for population estimation
Udevitz, Mark S.; Pollock, Kenneth H.
1995-01-01
Most change-in-ratio (CIR) methods for estimating fish and wildlife population sizes have been based only on assumptions about how encounter probabilities vary among population subclasses. When information on sampling effort is available, it is also possible to derive CIR estimators based on assumptions about how encounter probabilities vary over time. This paper presents a generalization of previous CIR models that allows explicit consideration of a range of assumptions about the variation of encounter probabilities among subclasses and over time. Explicit estimators are derived under this model for specific sets of assumptions about the encounter probabilities. Numerical methods are presented for obtaining estimators under the full range of possible assumptions. Likelihood ratio tests for these assumptions are described. Emphasis is on obtaining estimators based on assumptions about variation of encounter probabilities over time.
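For orientation, the classical two-subclass CIR estimator that such models generalize can be written in a few lines. This is the textbook closed form (Paulik-Robson type), not the paper's generalized model, and the function name is illustrative.

```python
def cir_estimate(p1, p2, r_x, r_total):
    """Classical two-subclass change-in-ratio estimate of pre-removal
    population size.
    p1, p2  : proportion of subclass x in samples before/after removal
    r_x     : number of subclass-x animals removed
    r_total : total number of animals removed
    Derivation: with N1 animals and X1 = p1 * N1 in subclass x,
    p2 = (p1 * N1 - r_x) / (N1 - r_total); solving for N1 gives
    N1 = (r_x - p2 * r_total) / (p1 - p2)."""
    if p1 == p2:
        raise ValueError("subclass proportions must change between surveys")
    return (r_x - p2 * r_total) / (p1 - p2)
```

For example, removing 200 animals, all of subclass x, from a population of 1000 with an initial subclass proportion of 0.5 shifts the observed proportion to 0.375, and the estimator recovers the original 1000. The generalization in the paper replaces the fixed-encounter-probability assumptions behind this formula with explicit models of how encounter probabilities vary among subclasses and over time.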
A Robust Semi-Parametric Test for Detecting Trait-Dependent Diversification.
Rabosky, Daniel L; Huang, Huateng
2016-03-01
Rates of species diversification vary widely across the tree of life and there is considerable interest in identifying organismal traits that correlate with rates of speciation and extinction. However, it has been challenging to develop methodological frameworks for testing hypotheses about trait-dependent diversification that are robust to phylogenetic pseudoreplication and to directionally biased rates of character change. We describe a semi-parametric test for trait-dependent diversification that explicitly requires replicated associations between character states and diversification rates to detect effects. To use the method, diversification rates are reconstructed across a phylogenetic tree with no consideration of character states. A test statistic is then computed to measure the association between species-level traits and the corresponding diversification rate estimates at the tips of the tree. The empirical value of the test statistic is compared to a null distribution that is generated by structured permutations of evolutionary rates across the phylogeny. The test is applicable to binary discrete characters as well as continuous-valued traits and can accommodate extremely sparse sampling of character states at the tips of the tree. We apply the test to several empirical data sets and demonstrate that the method has acceptable Type I error rates. © The Author(s) 2015. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
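The permutation logic of such a test can be sketched simply. The toy version below shuffles a binary trait against tip-level rate estimates; the published method instead permutes whole rate regimes across the phylogeny to avoid pseudoreplication, so this illustrates only the test-statistic-versus-null-distribution idea, with invented names.

```python
import random

def permutation_test(rates, traits, n_perm=999, seed=1):
    """Two-sided permutation test for association between tip-level
    diversification rates and a binary trait. Test statistic: the
    difference in mean rate between trait states. The null
    distribution comes from permuting trait labels; the p-value uses
    the standard (extreme + 1) / (n_perm + 1) correction."""
    def stat(labels):
        g1 = [r for r, t in zip(rates, labels) if t == 1]
        g0 = [r for r, t in zip(rates, labels) if t == 0]
        return sum(g1) / len(g1) - sum(g0) / len(g0)
    rng = random.Random(seed)
    observed = stat(traits)
    extreme = 0
    for _ in range(n_perm):
        shuffled = list(traits)
        rng.shuffle(shuffled)
        if abs(stat(shuffled)) >= abs(observed):
            extreme += 1
    return observed, (extreme + 1) / (n_perm + 1)
```

Because closely related species share both traits and rates, plain shuffles like these inflate Type I error on real phylogenies, which is exactly why the paper's structured permutations of evolutionary rates are needed.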
Silke, Charlotte; Swords, Lorraine; Heary, Caroline
2017-11-01
Research indicates that adolescents who experience mental health difficulties are frequently stigmatised by their peers. Stigmatisation is associated with a host of negative social and psychological effects, which impacts a young person's well-being. As a result, the development of effective anti-stigma strategies is considered a major research priority. However, in order to design effective stigma reduction strategies, researchers must be informed by an understanding of the factors that influence the expression of stigma. Although evidence suggests that empathy and social norms have a considerable effect on adolescents' social attitudes and behaviours, research has yet to examine whether these factors significantly influence adolescents' responses toward their peers with mental health difficulties. Thus, this study aims to examine whether empathy (cognitive and affective) and peer norms (descriptive and injunctive) influence adolescents' implicit and explicit stigmatising responses toward peers with mental health problems. A total of 570 (221 male and 348 female; 1 non-specified) adolescents, aged between 13 and 18 years (M = 15.51, SD = 1.13), participated in this research. Adolescents read vignettes describing male/female depressed and 'typically developing' peers. Adolescents answered questions assessing their stigmatising responses toward each target, as well as their empathic responding and normative perceptions. A sub-sample of participants (n=173) also completed an IAT assessing their implicit stigmatising responses. Results showed that descriptive norms exerted a substantial effect on adolescents' explicit responses. Cognitive empathy, affective empathy and injunctive norms exerted more limited effects on explicit responses. No significant effects were observed for implicit stigma. 
Overall, empathy was found to have limited effects on adolescents' explicit and implicit stigmatising responses, which may suggest that other contextual variables moderate the effects of dispositional empathy on responding. In conclusion, these findings suggest that tackling the perception of negative descriptive norms may be an effective strategy for reducing explicit stigmatising responses among adolescents. Copyright © 2017 Elsevier B.V. All rights reserved.
Making the Grade: Describing Inherent Requirements for the Initial Teacher Education Practicum
ERIC Educational Resources Information Center
Sharplin, Elaine; Peden, Sanna; Marais, Ida
2016-01-01
This study explores the development, description, and illustration of inherent requirement (IR) statements to make explicit the requirements for performance on an initial teacher education (ITE) practicum. Through consultative group processes with stakeholders involved in ITE, seven IR domains were identified. From interviews with academics,…
78 FR 69543 - Record Requirements in the Mechanical Power Presses Standard
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-20
... perform maintenance and repair necessary for the safe operation of the entire press is implicit in the... of general industry, but OSHA believes that adding an explicit requirement to perform necessary... required by existing paragraph (e)(1)(ii) serve the following functions: (i) Remind employers to inspect...
Thinking Together: Modeling Clinical Decision-Support as a Sociotechnical System
Hussain, Mustafa I.; Reynolds, Tera L.; Mousavi, Fatemeh E.; Chen, Yunan; Zheng, Kai
2017-01-01
Computerized clinical decision-support systems are members of larger sociotechnical systems, composed of human and automated actors, who send, receive, and manipulate artifacts. Sociotechnical consideration is rare in the literature. This makes it difficult to comparatively evaluate the success of CDS implementations, and it may also indicate that sociotechnical context receives inadequate consideration in practice. To facilitate sociotechnical consideration, we developed the Thinking Together model, a flexible diagrammatical means of representing CDS systems as sociotechnical systems. To develop this model, we examined the literature with the lens of Distributed Cognition (DCog) theory. We then present two case studies of vastly different CDSSs, one almost fully automated and the other with minimal automation, to illustrate the flexibility of the Thinking Together model. We show that this model, informed by DCog and the CDS literature, is capable of supporting both research, by enabling comparative evaluation, and practice, by facilitating explicit sociotechnical planning and communication. PMID:29854164
Modeling trends from North American Breeding Bird Survey data: a spatially explicit approach
Bled, Florent; Sauer, John R.; Pardieck, Keith L.; Doherty, Paul; Royle, J. Andy
2013-01-01
Population trends, defined as interval-specific proportional changes in population size, are often used to help identify species of conservation interest. Efficient modeling of such trends depends on the consideration of the correlation of population changes with key spatial and environmental covariates. This can provide insights into causal mechanisms and allow spatially explicit summaries at scales that are of interest to management agencies. We expand the hierarchical modeling framework used in the North American Breeding Bird Survey (BBS) by developing a spatially explicit model of temporal trend using a conditional autoregressive (CAR) model. By adopting a formal spatial model for abundance, we produce spatially explicit abundance and trend estimates. Analyses based on large-scale geographic strata such as Bird Conservation Regions (BCR) can suffer from basic imbalances in spatial sampling. Our approach addresses this issue by providing an explicit weighting based on the fundamental sample allocation unit of the BBS. We applied the spatial model to three species from the BBS. Species were chosen for their well-known population change patterns, which allows us to evaluate the quality of our model and the biological meaning of our estimates. We also compare our results with those obtained for BCRs using a nonspatial hierarchical model (Sauer and Link 2011). Globally, estimates for mean trends are consistent between the two approaches, but the spatial model provides much more precise trend estimates in regions on the edges of species ranges that were poorly estimated in non-spatial analyses. Incorporating a spatial component in the analysis not only allows us to obtain relevant and biologically meaningful estimates for population trends, but also enables us to provide a flexible framework in order to obtain trend estimates for any area.
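The conditional autoregressive component mentioned above is built from a stratum adjacency structure. Below is a minimal sketch of the intrinsic CAR (ICAR) precision matrix, a common formulation though not necessarily the exact one used in the paper; the function name is illustrative.

```python
import numpy as np

def icar_precision(adjacency):
    """Precision matrix Q = D - W of an intrinsic conditional
    autoregressive (ICAR) prior: W is the binary stratum adjacency
    matrix and D the diagonal matrix of neighbour counts. Under this
    prior, each stratum's spatial effect, conditional on its
    neighbours, is normal with mean equal to the neighbour average and
    precision proportional to the neighbour count. Rows of Q sum to
    zero, so the prior is improper and needs a sum-to-zero constraint
    in practice."""
    W = np.asarray(adjacency, dtype=float)
    D = np.diag(W.sum(axis=1))
    return D - W
```

In a BBS-style analysis, a matrix like this over the survey's sample allocation units would define the spatial prior on route-level trend effects, smoothing estimates toward neighbouring units where data are sparse.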
Georges, Carrie; Hoffmann, Danielle; Schiltz, Christine
2018-01-01
Behavioral evidence for the link between numerical and spatial representations comes from the spatial-numerical association of response codes (SNARC) effect, consisting in faster reaction times to small/large numbers with the left/right hand respectively. The SNARC effect is, however, characterized by considerable intra- and inter-individual variability. It depends not only on the explicit or implicit nature of the numerical task, but also relates to interference control. To determine whether the prevalence of the latter relation in the elderly could be ascribed to younger individuals’ ceiling performances on executive control tasks, we determined whether the SNARC effect related to Stroop and/or Flanker effects in 26 young adults with ADHD. We observed a divergent pattern of correlation depending on the type of numerical task used to assess the SNARC effect and the type of interference control measure involved in number-space associations. Namely, stronger number-space associations during parity judgments involving implicit magnitude processing related to weaker interference control in the Stroop but not Flanker task. Conversely, stronger number-space associations during explicit magnitude classifications tended to be associated with better interference control in the Flanker but not Stroop paradigm. The association of stronger parity and magnitude SNARC effects with weaker and better interference control respectively indicates that different mechanisms underlie these relations. Activation of the magnitude-associated spatial code is irrelevant and potentially interferes with parity judgments, but in contrast assists explicit magnitude classifications. Altogether, the present study confirms the contribution of interference control to number-space associations also in young adults. 
It suggests that magnitude-associated spatial codes in implicit and explicit tasks are monitored by different interference control mechanisms, thereby explaining task-related intra-individual differences in number-space associations. PMID:29881363
Smith, Peter C; Chalkidou, Kalipso
2017-01-01
A fundamental debate in the transition towards universal health coverage concerns whether to establish an explicit health benefits package to which all citizens are entitled, and the level of detail in which to specify that package. At one extreme, the treatments to be funded, and the circumstances in which patients qualify for the treatment, might be specified in great detail, and be entirely mandatory. This would make clinicians little more than automata, carrying out prescribed practice. At the other extreme, priorities may be expressed in very broad terms, with no compulsion or other incentives to encourage adherence. The paper examines the arguments for and against setting an explicit benefits package, and discusses the circumstances in which increased detail in specification is most appropriate. The English National Health Service is used as a case study, based on institutional history, official documents and research literature. Although the English NHS does not explicitly specify a health benefits package, it is in some respects establishing an 'intelligent' package, based on instruments such as an essential medicines list, clinical guidelines, provider payment and performance reporting, which acknowledges gaps in evidence and variations in local resource constraints. Further moves towards a more explicit specification are likely to yield substantial benefits in most health systems. Considerations in determining the 'hardness' of benefits package specification might include the quality of information about the costs and benefits of treatments, the heterogeneity of patient needs and preferences, the financing regime in place, and the nature of supply side constraints. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Validity for an integrated laboratory analogue of sexual aggression and bystander intervention.
Parrott, Dominic J; Tharp, Andra Teten; Swartout, Kevin M; Miller, Cameron A; Hall, Gordon C Nagayama; George, William H
2012-01-01
This study sought to develop and validate an integrated laboratory paradigm of sexual aggression and bystander intervention. Participants were a diverse community sample (54% African American) of heterosexual males (N = 156) between 21 and 35 years of age who were recruited to complete the study with a male friend and an ostensibly single, heterosexual female who reported a strong dislike of sexual content in the media. Participants viewed a sexually explicit or nonsexually explicit film clip as part of a contrived media rating task and made individual choices of which film clip to show the female confederate. Immediately thereafter, participants were required to reach consensus on a group decision of which film clip to show the female confederate. Subjecting a target to an unwanted experience with a sexual connotation was operationalized as selection of the sexually explicit video, whereas successful bystander intervention was operationalized as the event of one partner individually selecting the sexually explicit video but then selecting the nonsexually explicit video for the group choice. Results demonstrated that a 1-year history of sexual aggression and endorsement of pertinent misogynistic attitudes significantly predicted selection of the sexually explicit video. In addition, bystander efficacy significantly predicted men's successful prevention of their male peer's intent to show the female confederate a sexually explicit video. Discussion focused on how these data inform future research and bystander intervention programming for sexual aggression. © 2012 Wiley Periodicals, Inc.
Some findings on zero-inflated and hurdle poisson models for disease mapping.
Corpas-Burgos, Francisca; García-Donato, Gonzalo; Martinez-Beneito, Miguel A
2018-05-27
Zero excess in the study of geographically referenced mortality data sets has been the focus of considerable attention in the literature, with zero-inflation being the most common procedure to handle this lack of fit. Although hurdle models have also been used in disease mapping studies, their use is rarer. We show in this paper that models using particular treatments of zero excesses are often required for achieving appropriate fits in regular mortality studies since, otherwise, geographical units with low expected counts are oversmoothed. However, as also shown, an indiscriminate treatment of zero excess may be unnecessary and has a problematic implementation. In this regard, we find that naive zero-inflation and hurdle models, without an explicit modeling of the probabilities of zeroes, do not fix zero-excess problems well enough and are clearly unsatisfactory. Results sharply suggest the need for an explicit modeling of these probabilities, which should vary across areal units. Unfortunately, these more flexible modeling strategies can easily lead to improper posterior distributions, as we prove in several theoretical results. Those procedures have been repeatedly used in the disease mapping literature, and one should bear these issues in mind in order to propose valid models. We finally propose several valid modeling alternatives, in accordance with the results mentioned, that are suitable for fitting zero excesses. We show that those proposals fix zero-excess problems and correct the mentioned oversmoothing of risks in low-populated units, depicting geographic patterns more suited to the data. Copyright © 2018 John Wiley & Sons, Ltd.
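The distinction between zero-inflated and hurdle Poisson models discussed above can be made concrete through their single-observation log-likelihoods. This is a minimal Python sketch of the two mixture structures only, not the authors' spatial (disease-mapping) models, and the function names are placeholders:

```python
import math

def zip_loglik(y, lam, pi):
    """Zero-inflated Poisson: a zero can come either from the inflation
    component (probability pi) or from the Poisson part itself."""
    if y == 0:
        return math.log(pi + (1 - pi) * math.exp(-lam))
    return math.log(1 - pi) + y * math.log(lam) - lam - math.lgamma(y + 1)

def hurdle_loglik(y, lam, p0):
    """Hurdle Poisson: every zero comes from the hurdle (probability p0);
    positive counts follow a zero-truncated Poisson."""
    if y == 0:
        return math.log(p0)
    return (math.log(1 - p0) + y * math.log(lam) - lam
            - math.lgamma(y + 1) - math.log(1 - math.exp(-lam)))
```

In the paper's terms, letting pi or p0 vary across areal units, rather than fixing one common value, is what distinguishes the flexible, explicitly modeled variants from the naive ones.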
Lander, Jonas; Hainz, Tobias; Hirschberg, Irene; Bossert, Sabine; Strech, Daniel
2016-01-01
Public involvement activities (PIAs) may contribute to the governance of ethically challenging biomedical research and innovation by informing, consulting with and engaging the public in developments and decision-making processes. For PIAs to capture a population's preferences (e.g. on issues in whole genome sequencing, biobanks or genome editing), a central methodological requirement is to involve a sufficiently representative subgroup of the general public. While the existing literature focusses on theoretical and normative aspects of 'representation', this study assesses empirically how such considerations are implemented in practice. It evaluates how PIA reports describe representation objectives, the recruitment process and levels of representation achieved. PIA reports were included from a systematic literature search if they directly reported a PIA conducted in a relevant discipline such as genomics, biobanks, biotechnology or others. PIA reports were analyzed with thematic text analysis. The text analysis was guided by an assessment matrix based on PIA-specific guidelines and frameworks. We included 46 relevant reports, most focusing on issues in genomics. 27 reports (59%) explicitly described representation objectives, though mostly without adjusting eligibility criteria and recruiting methods to the specific objective. 11 reports (24%) explicitly reported to have achieved the intended representation; the rest either reported failure or were silent on this issue. Representation of study samples in PIAs in biomedical research and innovation is currently not reported systematically. Improved reporting on representation would not only improve the validity and value of PIAs, but could also contribute to PIA results being used more often in relevant policy and decision-making processes. © 2016 S. Karger AG, Basel.
Is lorazepam-induced amnesia specific to the type of memory or to the task used to assess it?
File, S E; Sharma, R; Shaffer, J
1992-01-01
Retrieval tasks can be classified along a continuum from conceptually driven (relying on the encoded meaning of the material) to data driven (relying on the perceptual record and surface features of the material). Since most explicit memory tests are conceptually driven and most implicit memory tests are data driven there has been considerable confounding of the memory system being assessed and the processing required by the retrieval task. The purpose of the present experiment was to investigate the effects of lorazepam on explicit memory, using both types of retrieval task. Lorazepam (2.5 mg) or matched placebo was administered to healthy volunteers and changes in subjective mood ratings and in performance in tests of memory were measured. Lorazepam made subjects significantly more drowsy, feeble, clumsy, muzzy, lethargic and mentally slow. Lorazepam significantly impaired recognition memory for slides, impaired the number of words remembered when the retrieval was cued by the first two letters and reduced the number of pictures remembered when retention was cued with picture fragments. Thus episodic memory was impaired whether the task used was conceptually driven (as in slide recognition) or data driven, as in the other two tasks. Analyses of covariance indicated that the memory impairments were independent of increased sedation, as assessed by self-ratings. In contrast to the deficits in episodic memory, there were no lorazepam-induced impairments in tests of semantic memory, whether this was measured in the conceptually driven task of category generation or in the data-driven task of wordstem completion.
Policies for patient access to clinical data via PHRs: current state and recommendations.
Collins, Sarah A; Vawdrey, David K; Kukafka, Rita; Kuperman, Gilad J
2011-12-01
Healthcare delivery organizations are increasingly using online personal health records (PHRs) to provide patients with direct access to their clinical information; however, there may be a lack of consistency in the data made available. We aimed to understand the general use and functionality of PHRs and the organizational policies and decision-making structures for making data available to patients. A cross-sectional survey was administered by structured telephone interview to 21 organizations to determine the types of data made available to patients through PHRs and the presence of explicit governance for PHR data release. Organizations were identified based on a review of the literature, PHR experts, and snowball sampling. Organizations that did not provide patients with electronic access to their data via a PHR were excluded. Interviews were conducted with 17 organizations, for a response rate of 81%. Half of the organizations had explicit governance in the form of a written policy that outlined the data types made available to patients. Overall, 88% of the organizations used a committee structure for the decision-making process and included senior management and information services. All organizations sought input from clinicians. There was considerable variability in the types of clinical data and the time frames for releasing these data to patients. Variability in data release policies may have implications for PHR use and adoption. Future policy activities, such as requirement specification for the latter stages of Meaningful Use, should be leveraged as an opportunity to encourage standardization of functionality and broad deployment of PHRs.
NASA Technical Reports Server (NTRS)
Crouch, P. E.; Grossman, Robert
1992-01-01
This note is concerned with the explicit symbolic computation of expressions involving differential operators and their actions on functions. The derivation of specialized numerical algorithms, the explicit symbolic computation of integrals of motion, and the explicit computation of normal forms for nonlinear systems all require such computations. More precisely, if R = k(x_1, ..., x_N), where k = R or C, F denotes a differential operator with coefficients from R, and g is a member of R, we describe data structures and algorithms for efficiently computing the action of F on g. The basic idea is to impose a multiplicative structure on the vector space whose basis is the set of finite rooted trees with nodes labeled by the coefficients of the differential operators. Cancellation of two trees with r + 1 nodes translates into cancellation of O(N^r) expressions involving the coefficient functions and their derivatives.
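As a toy illustration of the kind of expression involved (using SymPy's generic symbolic engine rather than the labeled-tree data structure the note develops), one can apply a first-order differential operator with coefficients from R to a function and iterate the action; the operator and function here are made up for the example:

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
g = sp.sin(x1) * x2**2

# A first-order differential operator F = x2*d/dx1 + x1*d/dx2
# with coefficients from R = k(x1, x2).
def apply_F(f):
    return x2 * sp.diff(f, x1) + x1 * sp.diff(f, x2)

Fg = sp.expand(apply_F(g))     # F acting on g
FFg = sp.expand(apply_F(Fg))   # iterated action F(F(g))
```

Iterated applications like F(F(g)) are where naive expansion blows up; the rooted-tree representation exists to organize, and cancel, the O(N^r) coefficient-derivative terms that appear at depth r.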
Carpenter, Kathryn L; Wills, Andy J; Benattayallah, Abdelmalek; Milton, Fraser
2016-10-01
The influential competition between verbal and implicit systems (COVIS) model proposes that category learning is driven by two competing neural systems-an explicit, verbal, system, and a procedural-based, implicit, system. In the current fMRI study, participants learned either a conjunctive, rule-based (RB), category structure that is believed to engage the explicit system, or an information-integration category structure that is thought to preferentially recruit the implicit system. The RB and information-integration category structures were matched for participant error rate, the number of relevant stimulus dimensions, and category separation. Under these conditions, considerable overlap in brain activation, including the prefrontal cortex, basal ganglia, and the hippocampus, was found between the RB and information-integration category structures. Contrary to the predictions of COVIS, the medial temporal lobes and in particular the hippocampus, key regions for explicit memory, were found to be more active in the information-integration condition than in the RB condition. No regions were more activated in RB than information-integration category learning. The implications of these results for theories of category learning are discussed. Hum Brain Mapp 37:3557-3574, 2016. © 2016 Wiley Periodicals, Inc. © 2016 Wiley Periodicals, Inc.
A Geographically Explicit Genetic Model of Worldwide Human-Settlement History
Liu, Hua; Prugnolle, Franck; Manica, Andrea; Balloux, François
2006-01-01
Currently available genetic and archaeological evidence is generally interpreted as supportive of a recent single origin of modern humans in East Africa. However, this is where the near consensus on human settlement history ends, and considerable uncertainty clouds any more detailed aspect of human colonization history. Here, we present a dynamic genetic model of human settlement history coupled with explicit geographical distances from East Africa, the likely origin of modern humans. We search for the best-supported parameter space by fitting our analytical prediction to genetic data that are based on 52 human populations analyzed at 783 autosomal microsatellite markers. This framework allows us to jointly estimate the key parameters of the expansion of modern humans. Our best estimates suggest an initial expansion of modern humans ∼56,000 years ago from a small founding population of ∼1,000 effective individuals. Our model further points to high growth rates in newly colonized habitats. The general fit of the model with the data is excellent. This suggests that coupling analytical genetic models with explicit demography and geography provides a powerful tool for making inferences on human-settlement history. PMID:16826514
Reder, Lynne M.; Park, Heekyeong; Kieffaber, Paul D.
2009-01-01
There is a popular hypothesis that performance on implicit and explicit memory tasks reflects 2 distinct memory systems. Explicit memory is said to store those experiences that can be consciously recollected, and implicit memory is said to store experiences and affect subsequent behavior but to be unavailable to conscious awareness. Although this division based on awareness is a useful taxonomy for memory tasks, the authors review the evidence that the unconscious character of implicit memory does not necessitate that it be treated as a separate system of human memory. They also argue that some implicit and explicit memory tasks share the same memory representations and that the important distinction is whether the task (implicit or explicit) requires the formation of a new association. The authors review and critique dissociations from the behavioral, amnesia, and neuroimaging literatures that have been advanced in support of separate explicit and implicit memory systems by highlighting contradictory evidence and by illustrating how the data can be accounted for using a simple computational memory model that assumes the same memory representation for those disparate tasks. PMID:19210052
NASA Astrophysics Data System (ADS)
Cavaglieri, Daniele; Bewley, Thomas
2015-04-01
Implicit/explicit (IMEX) Runge-Kutta (RK) schemes are effective for time-marching ODE systems with both stiff and nonstiff terms on the RHS; such schemes implement an (often A-stable or better) implicit RK scheme for the stiff part of the ODE, which is often linear, and, simultaneously, a (more convenient) explicit RK scheme for the nonstiff part of the ODE, which is often nonlinear. Low-storage RK schemes are especially effective for time-marching high-dimensional ODE discretizations of PDE systems on modern (cache-based) computational hardware, in which memory management is often the most significant computational bottleneck. In this paper, we develop and characterize eight new low-storage implicit/explicit RK schemes which have higher accuracy and better stability properties than the only low-storage implicit/explicit RK scheme available previously, the venerable second-order Crank-Nicolson/Runge-Kutta-Wray (CN/RKW3) algorithm that has dominated the DNS/LES literature for the last 25 years, while requiring similar storage (two, three, or four registers of length N) and comparable floating-point operations per timestep.
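The IMEX splitting can be illustrated with its simplest (first-order, non-low-storage) member: an implicit/explicit Euler step for du/dt = lam*u + N(u). The schemes developed in the paper are higher-order, low-storage RK analogues of this same idea; the nonlinearity below is an arbitrary stand-in:

```python
def imex_euler_step(u, dt, lam, N):
    """One step of du/dt = lam*u + N(u): the stiff linear term is
    advanced by backward (implicit) Euler and the nonstiff term N by
    forward (explicit) Euler.
    (u_new - u)/dt = lam*u_new + N(u)  =>  solve for u_new."""
    return (u + dt * N(u)) / (1.0 - dt * lam)

# Usage sketch: very stiff decay (dt*|lam| = 10, far beyond the explicit
# stability limit) with a mild, hypothetical nonlinearity.
lam = -1000.0
N = lambda v: v**2
u = 1.0
for _ in range(100):
    u = imex_euler_step(u, 1e-2, lam, N)   # stays stable and decays
```

An explicit Euler step at this dt would diverge immediately; the implicit treatment of the stiff term is what buys the large stable timestep, at the cost of a (here trivial, in PDE practice linear) solve per step.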
Wieseler, Beate; Kaiser, Thomas; Thomas, Stefanie; Bender, Ralf; Windeler, Jürgen; Lange, Stefan
2016-01-01
At the beginning of 2011, the early benefit assessment of new drugs was introduced in Germany with the Act on the Reform of the Market for Medicinal Products (AMNOG). The Federal Joint Committee (G‐BA) generally commissions the Institute for Quality and Efficiency in Health Care (IQWiG) with this type of assessment, which examines whether a new drug shows an added benefit (a positive patient‐relevant treatment effect) over the current standard therapy. IQWiG is required to assess the extent of added benefit on the basis of a dossier submitted by the pharmaceutical company responsible. In this context, IQWiG was faced with the task of developing a transparent and plausible approach for operationalizing how to determine the extent of added benefit. In the case of an added benefit, the law specifies three main extent categories (minor, considerable, major). To restrict value judgements to a minimum in the first stage of the assessment process, an explicit and abstract operationalization was needed. The present paper is limited to the situation of binary data (analysis of 2 × 2 tables), using the relative risk as an effect measure. For the treatment effect to be classified as a minor, considerable, or major added benefit, the methodological approach stipulates that the (two‐sided) 95% confidence interval of the effect must exceed a specified distance to the zero effect. In summary, we assume that our approach provides a robust, transparent, and thus predictable foundation to determine minor, considerable, and major treatment effects on binary outcomes in the early benefit assessment of new drugs in Germany. After a decision on the added benefit of a new drug by G‐BA, the classification of added benefit is used to inform pricing negotiations between the umbrella organization of statutory health insurance and the pharmaceutical companies. PMID:26134089
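The operationalization sketched above, requiring the two-sided 95% confidence interval of the relative risk to clear a prespecified distance from no effect, can be illustrated as follows. The CI uses the standard log-normal approximation; the threshold values are illustrative placeholders, not IQWiG's actual cutoffs:

```python
import math

def rr_ci(a, b, c, d, z=1.96):
    """Relative risk and two-sided ~95% CI from a 2x2 table
    (a events / b non-events under treatment, c / d under control),
    via the usual log-normal approximation."""
    rr = (a / (a + b)) / (c / (c + d))
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    return rr, math.exp(math.log(rr) - z * se), math.exp(math.log(rr) + z * se)

def classify(upper, thresholds=(0.95, 0.90, 0.75)):
    """Grade a beneficial effect (RR < 1) by how far the CI's upper
    limit stays below 1. Threshold values are hypothetical examples."""
    minor, considerable, major = thresholds
    if upper < major:
        return "major"
    if upper < considerable:
        return "considerable"
    if upper < minor:
        return "minor"
    return "no added benefit shown"
```

The key design point carried over from the abstract: the classification depends on the whole confidence interval, not the point estimate, so an effect only counts as "major" when even its least favorable plausible value is far from the zero effect.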
Process Modeling Applied to Metal Forming and Thermomechanical Processing
1984-09-01
...the flow stress of structural alloys decreases with temperature. It is well accepted that the homologous temperature, the ratio of the absolute... hardening coefficient y reducing to the value y = 1. This is simply the well-known Considère condition. The influence of strain rate sensitivity on... obtained by..., well understood [6]. It is also important to note that... rate effects explicitly in the Hill theory. Thus, comparisons of the...
Ballistic Missile Defense in the European Theater: Political, Military and Technical Considerations
2007-04-15
...the European Security Strategy. These strategic documents implicitly and explicitly build a strong case for bolstering missile defense capabilities... states. The NMS advocates the building of a defense in depth by extending defensive capabilities well beyond United States borders, and uses the...
Yu, Fengqiong; Zhou, Xiaoqing; Qing, Wu; Li, Dan; Li, Jing; Chen, Xingui; Ji, Gongjun; Dong, Yi; Luo, Yuejia; Zhu, Chunyan; Wang, Kai
2017-01-30
The present study aimed to investigate the neural substrates of response inhibition to sad faces across explicit and implicit tasks in depressed female patients. Event-related potentials were obtained while participants performed modified explicit and implicit emotional go/no-go tasks. Compared to controls, depressed patients showed decreased discrimination accuracy and decreased amplitudes of the original and nogo-go difference waves at the P3 interval during response inhibition to sad faces in explicit and implicit tasks. The P3 difference wave was positively correlated with discrimination accuracy and was independent of clinical assessment. Activation of the right dorsal prefrontal cortex was larger for the implicit than for the explicit task in the sad condition in healthy controls, but was similar for the two tasks in depressed patients. The present study indicated that the selective impairment of response inhibition to sad faces in depressed female patients occurred at the behavioral inhibition stage across implicit and explicit tasks and may be a trait-like marker of depression. Longitudinal studies are required to determine whether decreased response inhibition to sad faces increases the risk for future depressive episodes so that appropriate treatment can be administered to patients. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
State analysis requirements database for engineering complex embedded systems
NASA Technical Reports Server (NTRS)
Bennett, Matthew B.; Rasmussen, Robert D.; Ingham, Michel D.
2004-01-01
It has become clear that spacecraft system complexity is reaching a threshold where customary methods of control are no longer affordable or sufficiently reliable. At the heart of this problem are the conventional approaches to systems and software engineering based on subsystem-level functional decomposition, which fail to scale in the tangled web of interactions typically encountered in complex spacecraft designs. Furthermore, there is a fundamental gap between the requirements on software specified by systems engineers and the implementation of these requirements by software engineers. Software engineers must perform the translation of requirements into software code, hoping to accurately capture the systems engineer's understanding of the system behavior, which is not always explicitly specified. This gap opens up the possibility for misinterpretation of the systems engineer's intent, potentially leading to software errors. This problem is addressed by a systems engineering tool called the State Analysis Database, which provides a tool for capturing system and software requirements in the form of explicit models. This paper describes how requirements for complex aerospace systems can be developed using the State Analysis Database.
Marckmann, G; In der Schmitten, J
2014-05-01
Under the current conditions in the health care system, physicians inevitably have to take responsibility for the cost dimension of their decisions at the level of single cases. This article, therefore, discusses the question of how physicians can integrate cost considerations into their clinical decisions at the microlevel in a medically rational and ethically justified way. We propose a four-step model for "ethical cost-consciousness": (1) forego ineffective interventions as required by good evidence-based medicine, (2) respect individual patient preferences, (3) minimize the diagnostic and therapeutic effort needed to achieve a given treatment goal, and (4) forego expensive interventions that have only a small or unlikely (net) benefit for the patient. Steps 1-3 are ethically justified by the principles of beneficence, nonmaleficence, and respect for autonomy; step 4 by the principle of justice. For decisions on step 4, explicit cost-conscious guidelines should be developed locally or regionally. Following the four-step model can contribute to ethically defensible, cost-conscious decision-making at the microlevel. In addition, physicians' rationing decisions should meet basic standards of procedural fairness. Regular cost-case discussions and clinical ethics consultation should be available as decision support. Implementing step 4, however, first of all requires clear political legitimation and a corresponding legal framework.
Spatial modeling of cell signaling networks.
Cowan, Ann E; Moraru, Ion I; Schaff, James C; Slepchenko, Boris M; Loew, Leslie M
2012-01-01
The shape of a cell, the sizes of subcellular compartments, and the spatial distribution of molecules within the cytoplasm can all control how molecules interact to produce a cellular behavior. This chapter describes how these spatial features can be included in mechanistic mathematical models of cell signaling. The Virtual Cell computational modeling and simulation software is used to illustrate the considerations required to build a spatial model. An explanation of how to appropriately choose between physical formulations that implicitly or explicitly account for cell geometry and between deterministic versus stochastic formulations for molecular dynamics is provided, along with a discussion of their respective strengths and weaknesses. As a first step toward constructing a spatial model, the geometry needs to be specified and associated with the molecules, reactions, and membrane flux processes of the network. Initial conditions, diffusion coefficients, velocities, and boundary conditions complete the specifications required to define the mathematics of the model. The numerical methods used to solve reaction-diffusion problems both deterministically and stochastically are then described and some guidance is provided in how to set up and run simulations. A study of cAMP signaling in neurons ends the chapter, providing an example of the insights that can be gained in interpreting experimental results through the application of spatial modeling. Copyright © 2012 Elsevier Inc. All rights reserved.
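In its simplest deterministic, spatially explicit form, the kind of model described above reduces to a reaction-diffusion equation. The sketch below shows one explicit finite-difference diffusion step in 1-D; Virtual Cell's actual solvers handle full 3-D geometries, membrane fluxes, and stochastic formulations, so this is only a minimal illustration of the numerical core:

```python
def diffuse_step(c, D, dx, dt):
    """One forward-Euler diffusion step on a 1-D concentration profile
    with reflecting (no-flux) boundaries; stable for D*dt/dx**2 <= 0.5."""
    r = D * dt / dx**2
    n = len(c)
    new = list(c)
    for i in range(n):
        left = c[i - 1] if i > 0 else c[0]      # reflecting boundary
        right = c[i + 1] if i < n - 1 else c[-1]
        new[i] = c[i] + r * (left - 2 * c[i] + right)
    return new

# Usage: a point source spreads out while total mass is conserved.
c = [0.0] * 21
c[10] = 1.0
for _ in range(200):
    c = diffuse_step(c, D=1.0, dx=1.0, dt=0.4)
```

The no-flux boundaries conserve total mass, which is a useful sanity check for any discretization; reaction terms and membrane transport would be added inside the update loop.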
Beyond the Shannon–Khinchin formulation: The composability axiom and the universal-group entropy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tempesta, Piergiulio, E-mail: p.tempesta@fis.ucm.es
2016-02-15
The notion of entropy is ubiquitous both in natural and social sciences. In the last two decades, a considerable effort has been devoted to the study of new entropic forms, which generalize the standard Boltzmann–Gibbs (BG) entropy and could be applicable in thermodynamics, quantum mechanics and information theory. In Khinchin (1957), by extending previous ideas of Shannon (1948) and Shannon and Weaver (1949), Khinchin proposed a characterization of the BG entropy, based on four requirements, nowadays known as the Shannon–Khinchin (SK) axioms. The purpose of this paper is twofold. First, we show that there exists an intrinsic group-theoretical structure behind the notion of entropy. It comes from the requirement of composability of an entropy with respect to the union of two statistically independent systems, that we propose in an axiomatic formulation. Second, we show that there exists a simple universal family of trace-form entropies. This class contains many well known examples of entropies and infinitely many new ones, a priori multi-parametric. Due to its specific relation with Lazard’s universal formal group of algebraic topology, the new general entropy introduced in this work will be called the universal-group entropy. A new example of multi-parametric entropy is explicitly constructed.
Supercomputer implementation of finite element algorithms for high speed compressible flows
NASA Technical Reports Server (NTRS)
Thornton, E. A.; Ramakrishnan, R.
1986-01-01
Prediction of compressible flow phenomena using the finite element method is of recent origin and considerable interest. Two shock-capturing finite element formulations for high speed compressible flows are described. A Taylor-Galerkin formulation uses a Taylor series expansion in time coupled with a Galerkin weighted residual statement. The Taylor-Galerkin algorithms use explicit artificial dissipation, and the performance of three dissipation models is compared. A Petrov-Galerkin algorithm has as its basis the concepts of streamline upwinding. Vectorization strategies are developed to implement the finite element formulations on the NASA Langley VPS-32. The vectorization scheme results in finite element programs that use vectors of length of the order of the number of nodes or elements. The use of the vectorization procedure speeds up processing rates by over two orders of magnitude. The Taylor-Galerkin and Petrov-Galerkin algorithms are evaluated for 2D inviscid flows on criteria such as solution accuracy, shock resolution, computational speed and storage requirements. The convergence rates for both algorithms are enhanced by local time-stepping schemes. Extensions of the vectorization procedure for predicting 2D viscous and 3D inviscid flows are demonstrated. Conclusions are drawn regarding the applicability of the finite element procedures for realistic problems that require hundreds of thousands of nodes.
Asif, Irfan M; Wiederman, Michael; Kapur, Rahul
2017-11-01
Journal club is a pervasive component of graduate medical education, yet there is no gold standard as to format and logistics. Survey of primary care sports medicine fellowship directors in the United States. Sixty-nine program directors completed the online questionnaire (40% response rate). There were some common aspects to journal club exhibited by a majority of programs, including the general format, required attendance by fellows and expected or required attendance by faculty, the expectation that participants had at least read the article before the meeting, and that meetings occurred during the workday in the work setting without provision of food. There was considerable variation on other aspects, including the objectives of journal club, who had primary responsibility for organizing the session, the criteria for selection of articles, who was invited to attend, and the perceived problems with journal club. This is the first survey investigating the current state of journal club in primary care sports medicine fellowship programs. Several opportunities for educational enhancements exist within journal clubs in primary care sports medicine, including the use of structured tools to guide discussion, providing mechanisms to evaluate the journal club experience as a whole, inviting multidisciplinary team members (eg, statisticians) to discussions, and ensuring that objectives are explicitly stated to participants.
Multirobot Lunar Excavation and ISRU Using Artificial-Neural-Tissue Controllers
NASA Astrophysics Data System (ADS)
Thangavelautham, Jekanthan; Smith, Alexander; Abu El Samid, Nader; Ho, Alexander; Boucher, Dale; Richard, Jim; D'Eleuterio, Gabriele M. T.
2008-01-01
Automation of site preparation and resource utilization on the Moon with teams of autonomous robots holds considerable promise for establishing a lunar base. Such multirobot autonomous systems would require limited human support infrastructure, complement necessary manned operations and reduce overall mission risk. We present an Artificial Neural Tissue (ANT) architecture as a control system for autonomous multirobot excavation tasks. An ANT approach requires much less human supervision and pre-programmed human expertise than previous techniques. Only a single global fitness function and a set of allowable basis behaviors need be specified. An evolutionary (Darwinian) selection process is used to `breed' controllers for the task at hand in simulation and the fittest controllers are transferred onto hardware for further validation and testing. ANT facilitates `machine creativity', with the emergence of novel functionality through a process of self-organized task decomposition of mission goals. ANT based controllers are shown to exhibit self-organization, employ stigmergy (communication mediated through the environment) and make use of templates (unlabeled environmental cues). With lunar in-situ resource utilization (ISRU) efforts in mind, ANT controllers have been tested on a multirobot excavation task in which teams of robots with no explicit supervision can successfully avoid obstacles, interpret excavation blueprints, perform layered digging, avoid burying or trapping other robots and clear/maintain digging routes.
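The Darwinian 'breeding' loop the abstract describes — only a global fitness function is specified, and controller parameters evolve by selection and mutation — can be sketched in a few lines. The fitness function here (matching a fixed target vector) is an invented stand-in for the excavation task, and the population sizes are arbitrary:

```python
import numpy as np

# Toy evolutionary selection loop: keep the fittest controllers, resample
# parents from the elite, and mutate. Only the global fitness function is
# task-specific; no per-behavior programming is needed.

rng = np.random.default_rng(0)
target = rng.normal(size=8)                # stand-in for task performance

def fitness(genome):
    return -np.sum((genome - target) ** 2)  # higher is better

pop = rng.normal(size=(30, 8))             # initial controller population
for generation in range(200):
    scores = np.array([fitness(g) for g in pop])
    elite = pop[np.argsort(scores)[-10:]]           # selection
    parents = elite[rng.integers(0, 10, size=30)]   # resample parents
    pop = parents + rng.normal(scale=0.05, size=parents.shape)  # mutation

best = pop[np.argmax([fitness(g) for g in pop])]
```

In the ANT work the genome encodes a neural-tissue controller and fitness is evaluated in a physics simulation before transfer to hardware; the selection/mutation skeleton is the same.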
Generalized Born Models of Macromolecular Solvation Effects
NASA Astrophysics Data System (ADS)
Bashford, Donald; Case, David A.
2000-10-01
It would often be useful in computer simulations to use a simple description of solvation effects, instead of explicitly representing the individual solvent molecules. Continuum dielectric models often work well in describing the thermodynamic aspects of aqueous solvation, and approximations to such models that avoid the need to solve the Poisson equation are attractive because of their computational efficiency. Here we give an overview of one such approximation, the generalized Born model, which is simple and fast enough to be used for molecular dynamics simulations of proteins and nucleic acids. We discuss its strengths and weaknesses, both for its fidelity to the underlying continuum model and for its ability to replace explicit consideration of solvent molecules in macromolecular simulations. We focus particularly on versions of the generalized Born model that have a pair-wise analytical form, and therefore fit most naturally into conventional molecular mechanics calculations.
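A minimal sketch of the pairwise analytical form the abstract highlights, in the common Still et al. functional form of the generalized Born energy. The charges, coordinates, and effective Born radii below are invented, and the Coulomb constant is omitted, so units are arbitrary:

```python
import numpy as np

# Pairwise generalized Born polarization energy. f_GB interpolates
# smoothly between the interatomic distance (far apart) and the Born
# radius (self term), avoiding any Poisson solve.

def gb_energy(q, xyz, R, eps_in=1.0, eps_out=78.5):
    """Generalized Born solvation (polarization) energy, pairwise form."""
    pref = -0.5 * (1.0 / eps_in - 1.0 / eps_out)
    E = 0.0
    for i in range(len(q)):
        for j in range(len(q)):
            r2 = np.sum((xyz[i] - xyz[j]) ** 2)
            f = np.sqrt(r2 + R[i] * R[j] * np.exp(-r2 / (4.0 * R[i] * R[j])))
            E += pref * q[i] * q[j] / f
    return E

q = np.array([0.4, -0.4])            # partial charges (e), invented
xyz = np.array([[0.0, 0.0, 0.0],
                [3.0, 0.0, 0.0]])    # coordinates (Angstrom), invented
R = np.array([1.5, 1.7])             # effective Born radii, invented
print(gb_energy(q, xyz, R))          # negative: solvation is stabilizing
```

Because every term is a closed-form pairwise function of distances, energies and forces fit naturally into a molecular mechanics loop, which is what makes the model fast enough for dynamics.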
GPU-accelerated simulations of isolated black holes
NASA Astrophysics Data System (ADS)
Lewis, Adam G. M.; Pfeiffer, Harald P.
2018-05-01
We present a port of the numerical relativity code SpEC which is capable of running on NVIDIA GPUs. Since this code must be maintained in parallel with SpEC itself, a primary design consideration is to perform as few explicit code changes as possible. We therefore rely on a hierarchy of automated porting strategies. At the highest level we use TLoops, a C++ library of our design, to automatically emit CUDA code equivalent to tensorial expressions written into C++ source using a syntax similar to analytic calculation. Next, we trace out and cache explicit matrix representations of the numerous linear transformations in the SpEC code, which allows these to be performed on the GPU using pre-existing matrix-multiplication libraries. We port the few remaining important modules by hand. In this paper we detail the specifics of our port, and present benchmarks of it simulating isolated black hole spacetimes on several generations of NVIDIA GPU.
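The 'trace out and cache explicit matrix representations' strategy can be illustrated generically: any black-box linear transformation can be probed once with basis vectors to build its matrix, after which applying it is a plain matrix multiply that standard (GPU) BLAS libraries handle. The operator below is a toy stand-in, not SpEC code:

```python
import numpy as np

# Probe a linear operator with basis vectors to cache its explicit
# matrix; column k of the matrix is the operator applied to e_k.

def make_operator(n):
    def apply(u):                      # black-box linear transformation
        return np.gradient(u)          # toy stand-in for a SpEC transform
    return apply

def cache_matrix(apply, n):
    eye = np.eye(n)
    return np.column_stack([apply(eye[:, k]) for k in range(n)])

n = 64
op = make_operator(n)
M = cache_matrix(op, n)                # built once, reused many times

u = np.sin(np.linspace(0.0, 2.0 * np.pi, n))
assert np.allclose(M @ u, op(u))       # matmul reproduces the operator
```

The one-time probing cost is amortized over the many time steps of an evolution, and the cached matrices can be handed to cuBLAS-style libraries without touching the original operator code.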
Special-case closed form of the Baker-Campbell-Hausdorff formula
NASA Astrophysics Data System (ADS)
Van-Brunt, Alexander; Visser, Matt
2015-06-01
The Baker-Campbell-Hausdorff formula is a general result for the quantity Z(X,Y) = ln(e^X e^Y), where X and Y are not necessarily commuting. For completely general commutation relations between X and Y (the free Lie algebra), the general result is somewhat unwieldy. However, in specific physics applications the commutator [X,Y], while non-zero, might often be relatively simple, which sometimes leads to explicit closed form results. We consider the special case [X,Y] = uX + vY + cI, and show that in this case the general result reduces to a closed form involving a symmetric function f(u,v) = f(v,u), which we evaluate explicitly and relate to previously known results. For instance this result includes, but is considerably more general than, results obtained from either the Heisenberg commutator [P,Q] = -iℏI or the creation-destruction commutator [a,a†] = I.
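The structural claim — that for [X,Y] = uX + vY + cI the series Z = ln(e^X e^Y) collapses to X + Y + f(u,v)[X,Y] for a scalar f — can be checked numerically. The 2x2 matrices below realize [X,Y] = -X (u = -1, v = 0, c = 0); the value of f is extracted from the computation rather than asserted from the paper's formula:

```python
import numpy as np
from scipy.linalg import expm, logm

# X nilpotent, Y a projector: [X, Y] = XY - YX = -X, a special case of
# [X, Y] = uX + vY + cI with u = -1, v = 0, c = 0.
X = np.array([[0.0, 1.0], [0.0, 0.0]])
Y = np.array([[1.0, 0.0], [0.0, 0.0]])
C = X @ Y - Y @ X                       # equals -X here

Z = logm(expm(X) @ expm(Y))
residual = Z - X - Y                    # should be proportional to [X, Y]

f = (residual[0, 1] / C[0, 1]).real     # the scalar coefficient f(u, v)
assert np.allclose(residual, f * C)
print(f)                                # (e - 2)/(e - 1), about 0.418
```

For this example the matrix logarithm can also be done by hand (the product is upper triangular), which confirms f = (e-2)/(e-1).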
NASA Astrophysics Data System (ADS)
Yamataka, Hiroshi; Aida, Misako
1998-06-01
Ab initio MO calculations (HF/3-21G, HF/6-31G, HF/6-31+G* and MP2/6-31+G*) were carried out on the hydrolysis of CH3Cl in which up to 13 water solvent molecules were explicitly considered. For n ⩾ 3, three important stationary points (cmp1, TS, and cmp2) were detected in the course of the reaction. The calculations for the n = 13 system at the HF/6-31+G* level reproduced the experimental activation enthalpy and the secondary deuterium kinetic isotope effect. The two reacting bond lengths in the transition state are 1.975 Å (O-C) and 2.500 Å (C-Cl), and CH3Cl is surrounded by 13 water molecules without any apparent vacant space. The proton transfer from the attacking water to the water cluster occurs after TS is reached.
Guidelines for appropriate care: the importance of empirical normative analysis.
Berg, M; Meulen, R T; van den Burg, M
2001-01-01
The Royal Dutch Medical Association recently completed a research project aimed at investigating how guidelines for 'appropriate medical care' should be construed. The project took as a starting point that explicit attention should be given to ethical and political considerations in addition to data about costs and effectiveness. In the project, two research groups set out to design guidelines and cost-effectiveness analyses (CEAs) for two circumscribed medical areas (angina pectoris and major depression). Our third group was responsible for the normative analysis. We undertook an explorative, qualitative pilot study of the normative considerations that played a role in constructing the guidelines and CEAs, and simultaneously interviewed specialists about the normative considerations that guided their diagnostic and treatment decisions. Explicating normative considerations, we argue, is important democratically: the issues at stake should not be left to decision analysts and guideline developers to decide. Moreover, it is a necessary condition for a successful implementation of such tools: those who draw upon these tools will only accept them when they can recognize themselves in the considerations implied. Empirical normative analysis, we argue, is a crucial tool in developing guidelines for appropriate medical care.
2017-01-01
Objective The objectives of the study were to determine (1) parental and professional views regarding the type of consent required for common neonatal interventions and (2) whether there has been a change in professional understanding regarding the requirements of consent since the last UK survey in 2003. Design Cohort study of (1) parents of babies admitted to a single-centre tertiary neonatal unit and (2) healthcare professionals. Methods The views of 8 parents of former neonatal patients and 69 neonatal professionals were sought using online and telephone survey methodology regarding 20 neonatal interventions and whether implied consent, explicit verbal consent or explicit written consent should be obtained. Results Agreement, defined as both parental and professional consensus on the type of consent required, was present in 12/20 of the interventions. Comparison with professional views from 2003 demonstrated a change regarding the type of consent for 50% of interventions, with a shift towards obtaining explicit written consent for certain treatments. Conclusions The study indicates areas of consensus that exist between parents and professionals regarding consent for common neonatal interventions and a change in professional views regarding consent since the last UK survey in 2003. These data might help inform the development of national guidance for how professionals should obtain consent in neonatology. PMID:29637148
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patton, A.D.; Ayoub, A.K.; Singh, C.
1982-07-01
Existing methods for generating capacity reliability evaluation do not explicitly recognize a number of operating considerations which may have important effects on system reliability performance. Thus, current methods may yield estimates of system reliability which differ appreciably from actual observed reliability. Further, current methods offer no means of accurately studying or evaluating alternatives which may differ in one or more operating considerations. Operating considerations which are considered to be important in generating capacity reliability evaluation include: unit duty cycles as influenced by load cycle shape, reliability performance of other units, unit commitment policy, and operating reserve policy; unit start-up failures distinct from unit running failures; unit start-up times; and unit outage postponability and the management of postponable outages. A detailed Monte Carlo simulation computer model called GENESIS and two analytical models called OPCON and OPPLAN have been developed which are capable of incorporating the effects of many operating considerations, including those noted above. These computer models have been used to study a variety of actual and synthetic systems and are available from EPRI. The new models are shown to produce system reliability indices which differ appreciably from index values computed using traditional models which do not recognize operating considerations.
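A stripped-down Monte Carlo sketch in the spirit of such generating-capacity models (not the GENESIS code itself): sample two-state unit outages against an hourly load cycle and estimate expected loss-of-load hours. Unit sizes, forced outage rates, and the load shape are invented:

```python
import numpy as np

# Toy generating-capacity reliability simulation: units are either
# available or on forced outage each hour; a loss-of-load hour occurs
# when available capacity falls below load.

rng = np.random.default_rng(1)
capacity = np.array([200.0, 200.0, 150.0, 100.0, 50.0])          # MW
forced_outage_rate = np.array([0.05, 0.05, 0.08, 0.10, 0.12])

hours = 8760
load = 450.0 + 100.0 * np.sin(2.0 * np.pi * np.arange(hours) / 24.0)

n_trials = 200
lolp_hours = 0
for _ in range(n_trials):
    up = rng.random((hours, len(capacity))) > forced_outage_rate
    available = up.astype(float) @ capacity       # MW available each hour
    lolp_hours += np.count_nonzero(available < load)

lole = lolp_hours / n_trials     # expected loss-of-load hours per year
print(lole)
```

The operating considerations the abstract lists (duty cycles, commitment policy, start-up failures, postponable outages) would enter as state-dependent sampling rules inside the trial loop, which is exactly what a simple two-state model like this omits.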
A conceptual framework for evaluating data suitability for observational studies.
Shang, Ning; Weng, Chunhua; Hripcsak, George
2017-09-08
To contribute a conceptual framework for evaluating data suitability to satisfy the research needs of observational studies. Suitability considerations were derived from a systematic literature review on researchers' common data needs in observational studies and a scoping review on frequent clinical database design considerations, and were harmonized to construct a suitability conceptual framework using a bottom-up approach. The relationships among the suitability categories are explored from the perspective of 4 facets of data: intrinsic, contextual, representational, and accessible. A web-based national survey of domain experts was conducted to validate the framework. Data suitability for observational studies hinges on the following key categories: Explicitness of Policy and Data Governance, Relevance, Availability of Descriptive Metadata and Provenance Documentation, Usability, and Quality. We describe 16 measures and 33 sub-measures. The survey uncovered the relevance of all categories, with a 5-point Likert importance score of 3.9 ± 1.0 for Explicitness of Policy and Data Governance, 4.1 ± 1.0 for Relevance, 3.9 ± 0.9 for Availability of Descriptive Metadata and Provenance Documentation, 4.2 ± 1.0 for Usability, and 4.0 ± 0.9 for Quality. The suitability framework evaluates a clinical data source's fitness for research use. Its construction reflects both researchers' points of view and data custodians' design features. The feedback from domain experts rated Usability, Relevance, and Quality categories as the most important considerations. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
Research in digital adaptive flight controllers
NASA Technical Reports Server (NTRS)
Kaufman, H.
1976-01-01
A design study of adaptive control logic suitable for implementation in modern airborne digital flight computers was conducted. Both explicit controllers, which directly utilize parameter identification, and implicit controllers, which do not require identification, were considered. Extensive analytical and simulation efforts resulted in the recommendation of two explicit digital adaptive flight controllers. Weighted least squares estimation procedures were interfaced with control logic based either on optimal regulator theory or on single stage performance indices.
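The explicit-adaptive idea — identify plant parameters online, then hand the estimates to the control law — is typically implemented with exponentially weighted recursive least squares. The first-order plant, noise level, and forgetting factor below are illustrative choices, not the report's design:

```python
import numpy as np

# Weighted (forgetting-factor) recursive least squares identifying the
# parameters of a scalar plant y[k+1] = a*y[k] + b*u[k] + noise.
# An explicit adaptive controller would feed theta into its control law.

rng = np.random.default_rng(2)
a_true, b_true = 0.9, 0.5

theta = np.zeros(2)                      # parameter estimates [a, b]
P = np.eye(2) * 100.0                    # estimate covariance
lam = 0.98                               # forgetting factor (weighting)

y = 0.0
for k in range(500):
    u = rng.normal()                     # persistently exciting input
    phi = np.array([y, u])               # regressor
    y_next = a_true * y + b_true * u + 0.01 * rng.normal()
    K = P @ phi / (lam + phi @ P @ phi)          # gain
    theta = theta + K * (y_next - phi @ theta)   # innovation update
    P = (P - np.outer(K, phi) @ P) / lam         # covariance update
    y = y_next

print(theta)                             # converges near [0.9, 0.5]
```

The forgetting factor lam < 1 down-weights old data, which is what lets the estimator track slowly varying flight-condition-dependent parameters.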
2014-01-01
Background The affective personality trait ‘harm avoidance’ (HA) from Cloninger’s psychobiological personality model determines how an individual deals with emotional stimuli. Emotional stimuli are processed by a neural network that includes the left and right amygdalae as important key nodes. Explicit, implicit and passive processing of affective stimuli are known to activate the amygdalae differently, reflecting differences in attention, level of detailed analysis of the stimuli and the cognitive control needed to perform the required task. Previous studies revealed that implicit processing or passive viewing of affective stimuli induces a left amygdala response that correlates with HA. In this new study we tried to extend these findings to the situation in which subjects were required to explicitly process emotional stimuli. Methods A group of healthy female participants was asked to rate the valence of positive and negative stimuli while undergoing fMRI. Afterwards the neural responses of the participants to the positive and to the negative stimuli were separately correlated to their HA scores and compared between the low and high HA participants. Results Both analyses revealed increased neural activity in the left laterobasal (LB) amygdala of the high HA participants while they were rating the positive and the negative stimuli. Conclusions Our results indicate that the left amygdala response to explicit processing of affective stimuli does correlate with HA. PMID:24884791
Cumulative emission budgets and their implications: the case for SAFE carbon
NASA Astrophysics Data System (ADS)
Allen, Myles; Bowerman, Niel; Frame, David; Mason, Charles
2010-05-01
The risk of dangerous long-term climate change due to anthropogenic carbon dioxide emissions is predominantly determined by cumulative emissions over all time, not the rate of emission in any given year or commitment period. This has profound implications for climate mitigation policy: emission targets for specific years such as 2020 or 2050 provide no guarantee of meeting any overall cumulative emission budget. By focusing attention on short-term measures to reduce the flow of emissions, they may even exacerbate the overall long-term stock. Here we consider how climate policies might be designed explicitly to limit cumulative emissions to, for example, one trillion tonnes of carbon, a figure that has been estimated to give a most likely warming of two degrees above pre-industrial, with a likely range of 1.6-2.6 degrees. Three approaches are considered: tradable emission permits with the possibility of indefinite emission banking, carbon taxes explicitly linked to cumulative emissions and mandatory carbon sequestration. Framing mitigation policy around cumulative targets alleviates the apparent tension between climate protection and short-term consumption that bedevils any attempt to forge global agreement. We argue that the simplest and hence potentially the most effective approach might be a mandatory requirement on the fossil fuel industry to ensure that a steadily increasing fraction of fossil carbon extracted from the ground is artificially removed from the active carbon cycle through some form of sequestration. We define Sequestered Adequate Fraction of Extracted (SAFE) carbon as a source in which this sequestered fraction is anchored to cumulative emissions, increasing smoothly to reach 100% before we release the trillionth tonne. While adopting the use of SAFE carbon would increase the cost of fossil energy much as a system of emission permits or carbon taxes would, it could do so with much less explicit government intervention. 
We contrast this proposal with, for example, the WBGU budget approach which also recognises the importance of cumulative emissions, noting their different implications for global equity and development considerations. The implications of cumulative emissions for the issue of historical responsibility for adaptation costs will also be discussed.
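The quantitative core of the SAFE proposal — a sequestered fraction anchored to cumulative emissions and rising smoothly to 100% before the trillionth tonne — can be written as a simple schedule. The linear ramp and its start point below are assumptions for illustration, not the authors' prescription:

```python
# Illustrative SAFE-carbon schedule: required sequestered fraction of
# newly extracted fossil carbon as a function of cumulative emissions.
# The ramp shape (linear) and start point (500 GtC) are invented.

TRILLION_TONNES = 1000.0   # cumulative budget, GtC
RAMP_START = 500.0         # GtC emitted when the ramp begins (assumed)

def sequestered_fraction(cumulative_emitted_gtc):
    """Required sequestered fraction of extracted carbon (0 to 1)."""
    if cumulative_emitted_gtc <= RAMP_START:
        return 0.0
    frac = (cumulative_emitted_gtc - RAMP_START) / (TRILLION_TONNES - RAMP_START)
    return min(1.0, frac)

for emitted in (400.0, 750.0, 1000.0):
    print(emitted, sequestered_fraction(emitted))  # 0.0, 0.5, 1.0
```

Because the mandate depends only on the cumulative total, short-term emission fluctuations do not alter the long-term stock constraint, which is the point the abstract makes against year-specific targets.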
Solution Methods for Certain Evolution Equations
NASA Astrophysics Data System (ADS)
Vega-Guzman, Jose Manuel
Solution methods for certain linear and nonlinear evolution equations are presented in this dissertation. Emphasis is placed mainly on the analytical treatment of nonautonomous differential equations, which are challenging to solve despite the numerical and symbolic computational software available. Ideas from transformation theory are adopted, allowing one to solve the problems under consideration from a non-traditional perspective. First, the Cauchy initial value problem is considered for a class of nonautonomous and inhomogeneous linear diffusion-type equations on the entire real line. Explicit transformations are used to reduce the equations under study to their corresponding standard forms, emphasizing natural relations with certain Riccati- (and/or Ermakov-) type systems. These relations give solvability results for the Cauchy problem of the parabolic equation considered. The superposition principle allows one to solve this problem formally from an unconventional point of view. An eigenfunction expansion approach is also considered for this general evolution equation. Examples considered to corroborate the efficacy of the proposed solution methods include the Fokker-Planck equation, the Black-Scholes model and the one-factor Gaussian Hull-White model. The results obtained in the first part are used to solve the Cauchy initial value problem for certain inhomogeneous Burgers-type equations. The connection between linear (diffusion-type) and nonlinear (Burgers-type) parabolic equations is stressed in order to establish a strong commutative relation. Traveling wave solutions of a nonautonomous Burgers equation are also investigated. Finally, the minimum-uncertainty squeezed states for quantum harmonic oscillators are constructed explicitly. They are derived by the action of the corresponding maximal kinematical invariance group on the standard ground state solution.
It is shown that the product of the variances attains the required minimum value only at the instances when one variance is a minimum and the other is a maximum, that is, when the squeezing of one of the variances occurs. Such explicit construction is possible due to the relation between the diffusion-type equation studied in the first part and the time-dependent Schrodinger equation. A modification of the radiation field operators for squeezed photons in a perfect cavity is also suggested with the help of a nonstandard solution of Heisenberg's equation of motion.
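A small numerical companion to the standard-form reduction described above: once a diffusion-type equation is reduced to u_t = u_xx, its Cauchy problem on the real line is solved by convolution with the heat kernel. A Gaussian initial condition makes the exact answer explicit, so the quadrature can be checked:

```python
import numpy as np

# Heat-kernel solution of the Cauchy problem for u_t = u_xx:
# u(x, t) = integral of K(x - y, t) u0(y) dy, with K the Gaussian kernel.

def heat_kernel_solution(x, t, x0, u0):
    dy = x0[1] - x0[0]
    K = (np.exp(-(x[:, None] - x0[None, :]) ** 2 / (4.0 * t))
         / np.sqrt(4.0 * np.pi * t))
    return K @ u0 * dy                     # quadrature of the convolution

x0 = np.linspace(-20.0, 20.0, 4001)
u0 = np.exp(-x0 ** 2)                      # Gaussian initial data
x = np.linspace(-3.0, 3.0, 7)
t = 0.5

numeric = heat_kernel_solution(x, t, x0, u0)
# Exact: a Gaussian stays Gaussian, u(x,t) = exp(-x^2/(1+4t))/sqrt(1+4t)
exact = np.exp(-x ** 2 / (1.0 + 4.0 * t)) / np.sqrt(1.0 + 4.0 * t)
assert np.allclose(numeric, exact, atol=1e-6)
```

For the nonautonomous equations of the dissertation, the explicit transformations map the problem to exactly this standard form, so the same kernel solves the transformed problem.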
Heinrichs, Julie; Aldridge, Cameron L.; O'Donnell, Michael; Schumaker, Nathan
2017-01-01
Prioritizing habitats for conservation is a challenging task, particularly for species with fluctuating populations and seasonally dynamic habitat needs. Although the use of resource selection models to identify and prioritize habitat for conservation is increasingly common, their ability to characterize important long-term habitats for dynamic populations is variable. To examine how habitats might be prioritized differently if resource selection was directly and dynamically linked with population fluctuations and movement limitations among seasonal habitats, we constructed a spatially explicit individual-based model for a dramatically fluctuating population requiring temporally varying resources. Using greater sage-grouse (Centrocercus urophasianus) in Wyoming as a case study, we used resource selection function maps to guide seasonal movement and habitat selection, but emergent population dynamics and simulated movement limitations modified long-term habitat occupancy. We compared priority habitats in RSF maps to long-term simulated habitat use. We examined the circumstances under which the explicit consideration of movement limitations, in combination with population fluctuations and trends, are likely to alter predictions of important habitats. In doing so, we assessed the future occupancy of protected areas under alternative population and habitat conditions. Habitat prioritizations based on resource selection models alone predicted high use in isolated parcels of habitat and in areas with low connectivity among seasonal habitats. In contrast, results based on more biologically-informed simulations emphasized central and connected areas near high-density populations, sometimes in areas predicted to have low selection value. Dynamic models of habitat use can provide additional biological realism that can extend, and in some cases, contradict habitat use predictions generated from short-term or static resource selection analyses.
The explicit inclusion of population dynamics and movement propensities via spatial simulation modeling frameworks may provide an informative means of predicting long-term habitat use, particularly for fluctuating populations with complex seasonal habitat needs. Importantly, our results indicate the possible need to consider habitat selection models as a starting point rather than the common end point for refining and prioritizing habitats for protection for cyclic and highly variable populations.
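The paper's central contrast — static habitat-quality scores versus occupancy emerging from a movement-limited individual-based simulation — can be illustrated with a toy model. The 1-D landscape, quality scores, and nearest-neighbor movement rule below are all invented:

```python
import numpy as np

# Toy spatially explicit individual-based model: agents on a 1-D
# landscape choose among adjacent cells (a movement limitation) with
# probability proportional to an RSF-like quality score. Long-term
# occupancy then reflects both quality and connectivity.

rng = np.random.default_rng(3)
quality = np.array([0.9, 0.2, 0.1, 0.1, 0.8, 0.7, 0.6, 0.1])
# Cell 0 is high quality but isolated behind low-quality cells 1-3;
# cells 4-6 form a connected high-quality cluster.

n_agents, steps = 200, 400
pos = rng.integers(0, len(quality), size=n_agents)
occupancy = np.zeros(len(quality))

for _ in range(steps):
    for a in range(n_agents):
        lo, hi = max(pos[a] - 1, 0), min(pos[a] + 1, len(quality) - 1)
        options = np.arange(lo, hi + 1)          # only adjacent cells
        probs = quality[options] / quality[options].sum()
        pos[a] = rng.choice(options, p=probs)
    occupancy += np.bincount(pos, minlength=len(quality))

occupancy /= occupancy.sum()
print(np.round(occupancy, 3))   # the simulation's long-term habitat use
```

Comparing the occupancy vector against the raw quality ranking shows how connectivity and movement limits reshape long-term use relative to a static RSF prioritization, which is the qualitative point of the study.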
Spatially Explicit Simulation of Mesotopographic Controls on Peatland Hydrology and Carbon Fluxes
NASA Astrophysics Data System (ADS)
Sonnentag, O.; Chen, J. M.; Roulet, N. T.
2006-12-01
A number of field carbon flux measurements, paleoecological records, and model simulations have acknowledged the importance of northern peatlands in terrestrial carbon cycling and methane emissions. An important parameter in peatlands that influences both net primary productivity (the net gain of carbon through photosynthesis) and decomposition under aerobic and anaerobic conditions is the position of the water table. Biological and physical processes involved in peatland carbon dynamics and their hydrological controls operate at different spatial scales. The highly variable hydraulic characteristics of the peat profile and the overall shape of the peat body, as defined by its surface topography at the mesoscale (10^4 m^2), are of major importance for peatland water table dynamics. Common types of peatlands include bogs with a slightly domed centre. As a result of the convex profile, their water supply is restricted to atmospheric inputs, and water is mainly shed by shallow subsurface flow. From a modelling perspective, the influence of mesotopographic controls on peatland hydrology and thus carbon balance requires that process-oriented models that examine the links between peatland hydrology, ecosystem functioning, and climate must incorporate some form of lateral subsurface flow consideration. Most hydrological and ecological modelling studies in complex terrain explicitly account for the topographic controls on lateral subsurface flow through digital elevation models. However, modelling studies in peatlands often employ simple empirical parameterizations of lateral subsurface flow, neglecting the influence of peatlands' low-relief mesoscale topography. Our objective is to explicitly simulate the mesotopographic controls on peatland hydrology and carbon fluxes using the Boreal Ecosystem Productivity Simulator (BEPS) adapted to northern peatlands.
BEPS is a process-oriented ecosystem model in a remote sensing framework that takes into account peatlands' multi-layer canopy through a vertically stratified mapped leaf area index. Model outputs are validated against multi-year measurements taken at an eddy-covariance flux tower located within Mer Bleue bog, a typical raised bog near Ottawa, Ontario, Canada. Model results for seasonal water table dynamics and evapotranspiration at daily time steps in 2003 are in good agreement with measurements, with R2=0.74 and R2=0.79, respectively, and indicate the suitability of our pursued approach.
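The missing ingredient the abstract argues for — lateral subsurface flow over a domed mesotopography — can be sketched with a 1-D Darcy-type water-table model. The geometry, conductivity, and time stepping below are invented and K stands in for a depth-integrated transmissivity; this is not BEPS itself:

```python
import numpy as np

# Toy lateral subsurface flow on a domed bog transect: Darcy-type flux
# between adjacent soil columns lowers the water table at the dome
# centre and sheds water toward the margins, with no-flux boundaries.

nx = 51
x = np.linspace(-100.0, 100.0, nx)            # m, transect across the bog
surface = 2.0 - (x / 100.0) ** 2 * 2.0        # slightly domed surface (m)
wt = surface - 0.3                            # initial water table (m)

K = 1e-3          # depth-integrated transmissivity (m^2/s), assumed
porosity = 0.4
dx = x[1] - x[0]
dt = 600.0        # s (satisfies the explicit stability limit here)

for step in range(2000):
    q = -K * np.diff(wt) / dx                 # flux between columns, rightward
    div = np.zeros(nx)
    div[1:] += q                              # inflow from the left
    div[:-1] -= q                             # outflow to the right
    wt = wt + dt * div / (porosity * dx)
    wt = np.minimum(wt, surface)              # table cannot rise above surface

# The convex profile drains outward: the centre-to-edge head difference
# shrinks from its initial value as water is shed toward the margins.
print(wt[nx // 2] - wt[0])
```

In a model like BEPS this lateral term would be evaluated each time step alongside the vertical water balance, replacing the purely empirical water-table parameterizations the abstract criticizes.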
Explicit formulation of second and third order optical nonlinearity in the FDTD framework
NASA Astrophysics Data System (ADS)
Varin, Charles; Emms, Rhys; Bart, Graeme; Fennel, Thomas; Brabec, Thomas
2018-01-01
The finite-difference time-domain (FDTD) method is a flexible and powerful technique for rigorously solving Maxwell's equations. However, three-dimensional optical nonlinearity in current commercial and research FDTD softwares requires solving iteratively an implicit form of Maxwell's equations over the entire numerical space and at each time step. Reaching numerical convergence demands significant computational resources and practical implementation often requires major modifications to the core FDTD engine. In this paper, we present an explicit method to include second and third order optical nonlinearity in the FDTD framework based on a nonlinear generalization of the Lorentz dispersion model. A formal derivation of the nonlinear Lorentz dispersion equation is equally provided, starting from the quantum mechanical equations describing nonlinear optics in the two-level approximation. With the proposed approach, numerical integration of optical nonlinearity and dispersion in FDTD is intuitive, transparent, and fully explicit. A strong-field formulation is also proposed, which opens an interesting avenue for FDTD-based modelling of the extreme nonlinear optics phenomena involved in laser filamentation and femtosecond micromachining of dielectrics.
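The fully explicit coupling of a dispersive polarization to the field update can be sketched with a 1-D auxiliary-differential-equation (ADE) FDTD loop. The cubic restoring-force term below is an illustrative stand-in for the paper's nonlinear Lorentz generalization, not its exact equation, and all parameters are invented (normalized units, c = eps0 = 1):

```python
import numpy as np

# 1-D ADE-FDTD with a Lorentz oscillator: P is advanced explicitly from
# its own second-order update, then E is advanced from curl H minus the
# polarization current. No iteration or implicit solve is needed.

nx, nt = 400, 800
dx, dt = 1.0, 0.5                       # within the 1-D Courant limit

w0, wp, gamma = 0.3, 0.4, 0.01          # Lorentz parameters (invented)
a_nl = 0.05                             # illustrative cubic nonlinearity

E = np.zeros(nx); H = np.zeros(nx)
P = np.zeros(nx); P_old = np.zeros(nx)

for n in range(nt):
    H[:-1] += dt / dx * (E[1:] - E[:-1])            # magnetic update
    # Explicit leapfrog update of P'' + gamma P' + w0^2 (P + a P^3) = wp^2 E
    P_new = (2.0 * P - (1.0 - 0.5 * gamma * dt) * P_old
             + dt**2 * (wp**2 * E - w0**2 * (P + a_nl * P**3))
             ) / (1.0 + 0.5 * gamma * dt)
    # Electric update from curl H and the polarization current dP/dt
    E[1:] += dt / dx * (H[1:] - H[:-1]) - (P_new[1:] - P[1:])
    P_old, P = P, P_new
    E[nx // 4] += np.exp(-((n - 60) / 20.0) ** 2)   # soft Gaussian source

assert np.all(np.isfinite(E))           # the explicit scheme stays stable
```

Every update uses only known past values, which is the property the abstract emphasizes: the nonlinearity rides inside the oscillator equation, so the core FDTD engine is untouched.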
ERIC Educational Resources Information Center
Baum, David A.; Offner, Susan
2008-01-01
Phylogenetic trees, which are depictions of the inferred evolutionary relationships among a set of species, now permeate almost all branches of biology and are appearing in increasing numbers in biology textbooks. While few state standards explicitly require knowledge of phylogenetics, most require some knowledge of evolutionary biology, and many…
Coincident Extraction of Line Objects from Stereo Image Pairs.
1983-09-01
4.4.3 Reconstruction of intersections; 4.5 Final result processing; 5. Presentation of the results; 5.1 FIM image processing system; 5.2 Extraction results in ... image. To achieve this goal, the existing software system had to be modified and extended considerably. The following sections of this report will give ... 8000 pixels of each image without explicit loading of subimages could not yet be performed due to computer system software problems.
On 'the fear of death' as the primary anxiety: how and why Klein differs from Freud.
Blass, Rachel B
2014-08-01
It is well known that Melanie Klein held the view that 'fear of death' is the primary source of anxiety and that her position is explicitly opposed to that of Sigmund Freud, who maintained that that fear cannot in any way or form be a source of anxiety. In a previous article on Freud's Inhibitions, Symptoms and Anxiety (Blass, 2013), the author argued that, counter to what is commonly portrayed in the literature, Freud's considerations for rejecting the fear of death as a source of anxiety were based on relational and experiential factors that are usually associated with Kleinian psychoanalysis. In light of this affinity of Freud with Klein a question arises as to the actual source of their differences in this context. The present paper offers an answer to this question. The author first presents some of her earlier findings on what led Freud to reject the fear of death as a source of anxiety and then turns to investigate Klein's considerations for accepting it. This takes us beyond her explicit statements on this matter and sheds new light on the relationship of her views regarding death and anxiety and those of Freud. In turn this deepens the understanding of the relationship of Freud and Klein's conceptualizations of the psyche and its internal object relations, pointing to both surprising common ground and foundational differences. Copyright © 2014 Institute of Psychoanalysis.
Activity-Centered Domain Characterization for Problem-Driven Scientific Visualization
Marai, G. Elisabeta
2018-01-01
Although visualization design models exist in the literature in the form of higher-level methodological frameworks, these models do not present a clear methodological prescription for the domain characterization step. This work presents a framework and end-to-end model for requirements engineering in problem-driven visualization application design. The framework and model are based on the activity-centered design paradigm, which is an enhancement of human-centered design. The proposed activity-centered approach focuses on user tasks and activities, and allows an explicit link between the requirements engineering process and the abstraction stage—and its evaluation—of existing, higher-level visualization design models. In a departure from existing visualization design models, the resulting model: assigns value to a visualization based on user activities; ranks user tasks before the user data; partitions requirements in activity-related capabilities and nonfunctional characteristics and constraints; and explicitly incorporates the user workflows into the requirements process. A further merit of this model is its explicit integration of functional specifications, a concept this work adapts from the software engineering literature, into the visualization design nested model. A quantitative evaluation using two sets of interdisciplinary projects supports the merits of the activity-centered model. The result is a practical roadmap to the domain characterization step of visualization design for problem-driven data visualization. Following this domain characterization model can help remove a number of pitfalls that have been identified multiple times in the visualization design literature. PMID:28866550
An image-based skeletal dosimetry model for the ICRP reference adult male—internal electron sources
NASA Astrophysics Data System (ADS)
Hough, Matthew; Johnson, Perry; Rajon, Didier; Jokisch, Derek; Lee, Choonsik; Bolch, Wesley
2011-04-01
In this study, a comprehensive electron dosimetry model of the adult male skeletal tissues is presented. The model is constructed using the University of Florida adult male hybrid phantom of Lee et al (2010 Phys. Med. Biol. 55 339-63) and the EGSnrc-based Paired Image Radiation Transport code of Shah et al (2005 J. Nucl. Med. 46 344-53). Target tissues include the active bone marrow, associated with radiogenic leukemia, and the total shallow marrow, associated with radiogenic bone cancer. Monoenergetic electron emissions are considered over the energy range 1 keV to 10 MeV for the following sources: bone marrow (active and inactive), trabecular bone (surfaces and volumes), and cortical bone (surfaces and volumes). Specific absorbed fractions are computed according to the MIRD schema, and are given as skeletal-averaged values in the paper, with site-specific values reported in both tabular and graphical format in an electronic annex available from http://stacks.iop.org/0031-9155/56/2309/mmedia. The distribution of cortical bone and spongiosa at the macroscopic dimensions of the phantom, as well as the distribution of trabecular bone and marrow tissues at the microscopic dimensions of the phantom, is imposed through detailed analyses of whole-body ex vivo CT images (1 mm resolution) and spongiosa-specific ex vivo microCT images (30 µm resolution), respectively, taken from a 40 year old male cadaver. The method utilized in this work includes: (1) explicit accounting for changes in marrow self-dose with variations in marrow cellularity, (2) explicit accounting for electron escape from spongiosa, (3) explicit consideration of spongiosa cross-fire from cortical bone, and (4) explicit consideration of the ICRP's change in the surrogate tissue region defining the location of the osteoprogenitor cells (from a 10 µm endosteal layer covering the trabecular and cortical surfaces to a 50 µm shallow marrow layer covering trabecular and medullary cavity surfaces).
Skeletal-averaged values of absorbed fraction in the present model are found to be closely consistent with those weighted by the skeletal tissue distributions found in the ICRP Publication 110 adult male and female voxel phantoms, but are in many cases inconsistent with the values used in current and widely implemented internal dosimetry software.
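As a rough illustration of the MIRD-schema bookkeeping used above (every function name and number below is a hypothetical illustration, not a value from the paper): the specific absorbed fraction (SAF) is the absorbed fraction divided by the target tissue mass, and a skeletal-averaged SAF weights site-specific values by each site's share of the source tissue.

```python
# Illustrative sketch of the MIRD schema's specific absorbed fraction (SAF):
# SAF = (absorbed fraction) / (target tissue mass), with skeletal-averaged
# values weighting site-specific SAFs by each skeletal site's share of the
# source tissue. All numbers are hypothetical, not data from the paper.

def saf(absorbed_fraction, target_mass_kg):
    """Specific absorbed fraction in kg^-1."""
    return absorbed_fraction / target_mass_kg

def skeletal_average(site_safs, source_weights):
    """Average site-specific SAFs, weighted by the fraction of the source
    tissue (e.g. active marrow) located at each skeletal site."""
    assert abs(sum(source_weights) - 1.0) < 1e-9
    return sum(s * w for s, w in zip(site_safs, source_weights))

# Three hypothetical skeletal sites: (absorbed fraction, target mass in kg)
site_safs = [saf(0.60, 0.10), saf(0.45, 0.25), saf(0.55, 0.40)]
weights = [0.2, 0.3, 0.5]  # hypothetical distribution of active marrow
skeletal_saf = skeletal_average(site_safs, weights)
```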
2012-06-07
scheme for the VOF requires the use of the explicit solver to advance the solution in time. The drawback of using the explicit solver is that such an approach requires much smaller time steps to guarantee that a converged and stable solution is obtained during each fractional time step. Comparable results were obtained for the solutions with the RSM model.
Towards automated assistance for operating home medical devices.
Gao, Zan; Detyniecki, Marcin; Chen, Ming-Yu; Wu, Wen; Hauptmann, Alexander G; Wactlar, Howard D
2010-01-01
To detect errors when subjects operate a home medical device, we observe them with multiple cameras. We then perform action recognition with a robust approach that explicitly encodes motion information: the algorithm detects interest points and encodes not only their local appearance but also an explicit model of their local motion. Our goal is to recognize individual human actions in the operation of a home medical device, to determine whether the patient has correctly performed the required actions in the prescribed sequence. Using a specific infusion pump as a test case, requiring 22 operation steps from 6 action classes, our best classifier selects high-likelihood action estimates from 4 available cameras to obtain an average class recognition rate of 69%.
Delivering Faster Congestion Feedback with the Mark-Front Strategy
NASA Technical Reports Server (NTRS)
Liu, Chunlei; Jain, Raj
2001-01-01
Computer networks use congestion feedback from routers and destinations to control the transmission load. Delivering timely congestion feedback is essential to the performance of networks: reaction to congestion can be more effective when faster feedback is provided. Current TCP/IP networks use timeout, duplicate acknowledgement packets (ACKs) and explicit congestion notification (ECN) to deliver congestion feedback, each providing faster feedback than the previous method. In this paper, we propose a mark-front strategy that delivers even faster congestion feedback. With analytical and simulation results, we show that the mark-front strategy reduces the buffer size requirement, improves link efficiency and provides better fairness among users. Keywords: Explicit Congestion Notification, mark-front, congestion control, buffer size requirement, fairness.
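A minimal sketch of the mark-front idea described above; the threshold, queue discipline, and packet format are assumptions for illustration, not details from the paper:

```python
from collections import deque

# When the buffer exceeds a congestion threshold, mark-front sets the ECN bit
# on the packet at the front of the queue (the next to depart), so the signal
# skips the queueing delay of every packet already buffered; classic tail
# marking instead stamps the packet that just arrived.

THRESHOLD = 4  # assumed: congestion once more than 4 packets are buffered

def enqueue(queue, pkt, mark_front=True):
    queue.append(pkt)
    if len(queue) > THRESHOLD:
        target = queue[0] if mark_front else queue[-1]
        target["ecn"] = True

q = deque()
for i in range(6):
    enqueue(q, {"seq": i, "ecn": False})

# With mark-front the congestion signal rides on packet 0, the next to
# depart, rather than on a packet behind the whole backlog.
first_marked = next(p["seq"] for p in q if p["ecn"])  # 0 with mark-front
```

With tail marking in the same scenario, the first marked packet would be the fifth arrival, which must wait behind the entire backlog before it can carry the signal to the receiver.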
NASA Astrophysics Data System (ADS)
Firdausi, N.; Prabawa, H. W.; Sutarno, H.
2017-02-01
In an effort to maximize a student's academic growth, one of the tools available to educators is explicit instruction. Explicit instruction is marked by a series of supports or scaffolds: students are guided through the learning process with a clear statement of purpose and a rationale for learning the new skill, a clear explanation and demonstration of the learning target, and supported, independent practice with feedback until mastery is achieved. Today's trends in technology development require a corresponding adjustment in the development of learning objects that support the achievement of explicit instruction targets. This is where gamification comes in. As a pedagogical strategy, the use of gamification in the classroom is still relatively new. Gamification not only uses game elements and game-design techniques in non-game contexts, but also empowers and engages learners, motivating them in the learning approach while maintaining a relaxed atmosphere. Using Research and Development methods, this paper presents the integration of technology (in this case, the concept of gamification) into explicit instruction settings and its impact on the improvement of students' understanding.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Yuqi; Wang, Jinan; Shao, Qiang, E-mail: qshao@mail.shcnc.ac.cn, E-mail: Jiye.Shi@ucb.com, E-mail: wlzhu@mail.shcnc.ac.cn
2015-03-28
The application of temperature replica exchange molecular dynamics (REMD) simulation to protein motion is limited by its huge requirement of computational resources, particularly when an explicit solvent model is implemented. In a previous study, we developed a velocity-scaling optimized hybrid explicit/implicit solvent REMD method with the hope of reducing the number of temperatures (replicas) while maintaining high sampling efficiency. In this study, we utilized this method to characterize and energetically identify the conformational transition pathway of a protein model, the N-terminal domain of calmodulin. In comparison to the standard explicit solvent REMD simulation, the hybrid REMD is much less computationally expensive but, meanwhile, gives an accurate evaluation of the structural and thermodynamic properties of the conformational transition, in good agreement with the standard REMD simulation. Therefore, the hybrid REMD can greatly increase computational efficiency and thus expand the application of REMD simulation to larger protein systems.
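For readers unfamiliar with REMD, the core replica-swap step that both the standard and hybrid variants share can be sketched as follows; the energies and temperatures are illustrative, and the solvent model only changes how the potential energies are computed, not the acceptance rule.

```python
import math
import random

# Hedged sketch of the temperature REMD swap step: replicas at neighboring
# temperatures exchange configurations with Metropolis probability
#   p = min(1, exp[(1/kT_i - 1/kT_j) * (E_i - E_j)]).
# Values below are illustrative.

KB = 0.0019872041  # Boltzmann constant in kcal/(mol*K)

def swap_probability(e_i, e_j, t_i, t_j, kb=KB):
    """Metropolis acceptance probability for swapping replicas i and j."""
    delta = (1.0 / (kb * t_i) - 1.0 / (kb * t_j)) * (e_i - e_j)
    return min(1.0, math.exp(delta))

def attempt_swap(e_i, e_j, t_i, t_j, rng=random.random):
    return rng() < swap_probability(e_i, e_j, t_i, t_j)

# Moving the lower-energy configuration to the colder replica is always
# accepted; the reverse direction is accepted only occasionally.
p_favorable = swap_probability(e_i=-100.0, e_j=-105.0, t_i=300.0, t_j=310.0)  # == 1.0
p_unfavorable = swap_probability(e_i=-105.0, e_j=-100.0, t_i=300.0, t_j=310.0)
```

Reducing the replica count, as the hybrid method aims to do, matters because adjacent temperatures must be close enough for swap probabilities like these to stay usefully high.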
Isingrini, M; Vazou, F; Leroy, P
1995-07-01
In this article, we report an experiment that provides further evidence concerning the differences between explicit and implicit measures of memory. The effects of age and divided attention on the implicit conceptual test of category exemplar generation (CEG) were compared with their effects on the explicit test of cued recall, where the category names served as cues in both tasks. Four age groups (20-35, 40-55, 60-75, and 76-90) were compared. Half of the subjects were also required to carry out a secondary letter-detection task during the learning phase. Cued recall performance was significantly impaired by increased age and by imposition of the secondary task. In contrast, the CEG task was unaffected by these two factors. These results suggest that implicit conceptual tasks and explicit memory tasks are mediated by different processes. This conclusion contrasts with those of previous studies showing that experimental manipulations (level of processing, generation, organization) influenced these two kinds of memory tests in a similar way.
New Approaches to Final Cooling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neuffer, David
2014-11-10
A high-energy muon collider scenario requires a "final cooling" system that reduces transverse emittances by a factor of ~10 while allowing the longitudinal emittance to increase. The baseline approach uses low-energy transverse cooling within high-field solenoids, with strong longitudinal heating. This approach and its recent simulation are discussed. Alternative approaches that more explicitly include emittance exchange are also presented. Round-to-flat beam transforms, transverse slicing, and longitudinal bunch coalescence are possible components of the alternative approach. A more explicit understanding of solenoidal cooling beam dynamics is introduced.
NASA Technical Reports Server (NTRS)
Melis, Matthew E.
2003-01-01
NASA Glenn Research Center's Structural Mechanics Branch has years of expertise in using explicit finite element methods to predict the outcome of ballistic impact events. Shuttle engineers from the NASA Marshall Space Flight Center and the NASA Kennedy Space Center required assistance in assessing the structural loads that a newly proposed thrust vector control system for the space shuttle solid rocket booster (SRB) aft skirt would experience during recovery splashdown.
Voss, Joel L; Paller, Ken A
2008-11-01
A comprehensive understanding of human memory requires cognitive and neural descriptions of memory processes along with a conception of how memory processing drives behavioral responses and subjective experiences. One serious challenge to this endeavor is that an individual memory process is typically operative within a mix of other contemporaneous memory processes. This challenge is particularly disquieting in the context of implicit memory, which, unlike explicit memory, transpires without the subject necessarily being aware of memory retrieval. Neural correlates of implicit memory and neural correlates of explicit memory are often investigated in different experiments using very different memory tests and procedures. This strategy poses difficulties for elucidating the interactions between the two types of memory process that may result in explicit remembering, and for determining the extent to which certain neural processing events uniquely contribute to only one type of memory. We review recent studies that have succeeded in separately assessing neural correlates of both implicit memory and explicit memory within the same paradigm using event-related brain potentials (ERPs) and functional magnetic resonance imaging (fMRI), with an emphasis on studies from our laboratory. The strategies we describe provide a methodological framework for achieving valid assessments of memory processing, and the findings support an emerging conceptualization of the distinct neurocognitive events responsible for implicit and explicit memory.
A neurocomputational theory of how explicit learning bootstraps early procedural learning.
Paul, Erick J; Ashby, F Gregory
2013-01-01
It is widely accepted that human learning and memory is mediated by multiple memory systems that are each best suited to different requirements and demands. Within the domain of categorization, at least two systems are thought to facilitate learning: an explicit (declarative) system depending largely on the prefrontal cortex, and a procedural (non-declarative) system depending on the basal ganglia. Substantial evidence suggests that each system is optimally suited to learn particular categorization tasks. However, it remains unknown precisely how these systems interact to produce optimal learning and behavior. In order to investigate this issue, the present research evaluated the progression of learning through simulation of categorization tasks using COVIS, a well-known model of human category learning that includes both explicit and procedural learning systems. Specifically, the model's parameter space was thoroughly explored in procedurally learned categorization tasks across a variety of conditions and architectures to identify plausible interaction architectures. The simulation results support the hypothesis that one-way interaction between the systems occurs such that the explicit system "bootstraps" learning early on in the procedural system. Thus, the procedural system initially learns a suboptimal strategy employed by the explicit system and later refines its strategy. This bootstrapping could be from cortical-striatal projections that originate in premotor or motor regions of cortex, or possibly by the explicit system's control of motor responses through basal ganglia-mediated loops.
Verburgh, L; Scherder, E J A; van Lange, P A M; Oosterlaan, J
2016-09-01
In sports, fast and accurate execution of movements is required. It has been shown that implicitly learned movements may be less vulnerable than explicitly learned movements to the stressful and fast-changing circumstances that exist at the elite sports level. The present study provides insight into explicit and implicit motor learning in youth soccer players with different expertise levels. Twenty-seven youth elite soccer players and 25 non-elite soccer players (aged 10-12) performed a serial reaction time task (SRTT). In the SRTT, one of the sequences had to be learned explicitly, while the other was learned implicitly. No main effect of group was found for implicit and explicit learning on mean reaction time (MRT) and accuracy. However, for MRT, an interaction was found between learning condition, learning phase and group. Analyses showed no group effects for the explicit learning condition, but youth elite soccer players showed better learning in the implicit learning condition. In particular, during implicit motor learning youth elite soccer players showed faster MRTs in the early learning phase and reached asymptotic performance in terms of MRT earlier. The present findings may be important for sports because children with superior implicit learning abilities in early learning phases may be able to learn more (durable) motor skills in a shorter time period than other children.
Implicit learning in cotton-top tamarins (Saguinus oedipus) and pigeons (Columba livia).
Locurto, Charles; Fox, Maura; Mazzella, Andrea
2015-06-01
There is considerable interest in the conditions under which human subjects learn patterned information without explicit instructions to learn that information. This form of learning, termed implicit or incidental learning, can be approximated in nonhumans by exposing subjects to patterned information but delivering reinforcement randomly, thereby not requiring the subjects to learn the information in order to be reinforced. Following acquisition, nonhuman subjects are queried as to what they have learned about the patterned information. In the present experiment, we extended the study of implicit learning in nonhumans by comparing two species, cotton-top tamarins (Saguinus oedipus) and pigeons (Columba livia), on an implicit learning task that used an artificial grammar to generate the patterned elements for training. We equated the conditions of training and testing as much as possible between the two species. The results indicated that both species demonstrated approximately the same magnitude of implicit learning, judged both by a random test and by choice tests between pairs of training elements. This finding suggests that the ability to extract patterned information from situations in which such learning is not demanded is of longstanding origin.
Frische, Tobias; Bachmann, Jean; Frein, Daniel; Juffernholz, Tanja; Kehrer, Anja; Klein, Anita; Maack, Gerd; Stock, Frauke; Stolzenberg, Hans-Christian; Thierbach, Claudia; Walter-Rohde, Susanne
2013-12-16
A discussion paper was developed by a panel of experts of the German Federal Environment Agency (UBA) contributing to the on-going debate on the identification, assessment and management of endocrine disruptors with a view to protect wildlife according to the EU substance legislation (plant protection products, biocides, industrial chemicals). Based on a critical synthesis of the state-of-the-art regarding regulatory requirements, testing methods, assessment schemes, decision-making criteria and risk management options, we advise an appropriate and consistent implementation of this important subject into existing chemicals legislation in Europe. Our proposal for a balanced risk management of endocrine disruptors essentially advocates transparent regulatory decision making based on a scientifically robust weight of evidence approach and an adequate risk management consistent across different legislations. With respect to the latter, a more explicit consideration of the principle of proportionality of regulatory decision making and socio-economic benefits in the on-going debate is further encouraged. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Alvertos, Nicolas; Dcunha, Ivan
1992-01-01
Pose and orientation of an object is one of the central issues in 3-D recognition problems. Most of today's available techniques require considerable pre-processing such as detecting edges or joints, fitting curves or surfaces to segment images, and trying to extract higher order features from the input images. We present a method based on analytical geometry, whereby all the rotation parameters of any quadric surface are determined and subsequently eliminated. This procedure is iterative in nature and was found to converge to the desired results in as few as three iterations. The approach enables us to position the quadric surface in a desired coordinate system, and then to utilize the presented shape information to explicitly represent and recognize the 3-D surface. Experiments were conducted with simulated data for objects such as hyperboloid of one and two sheets, elliptic and hyperbolic paraboloid, elliptic and hyperbolic cylinders, ellipsoids, and quadric cones. Real data of quadric cones and cylinders were also utilized. Both of these sets yielded excellent results.
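The abstract does not reproduce the authors' iterative procedure, but the underlying idea of determining and eliminating a quadric's rotation parameters can be sketched with a standard eigen-decomposition of its quadratic form (a hedged illustration, not the paper's algorithm):

```python
import numpy as np

# A quadric x^T A x + b^T x + c = 0 with symmetric A can be freed of its
# rotation parameters via the eigen-decomposition A = R D R^T: in the rotated
# frame the cross terms vanish and the diagonal of D exposes the surface type
# (ellipsoid, hyperboloid, cylinder, cone, ...).

def align_quadric(A, b, c):
    """Rotate a quadric into principal-axis form; returns (D, b_rot, c)."""
    A = 0.5 * (A + A.T)             # enforce symmetry
    eigvals, R = np.linalg.eigh(A)  # columns of R are the principal axes
    return np.diag(eigvals), R.T @ b, c

# Example: elliptic cylinder x^2 + 4y^2 = 1, rotated 30 degrees about z.
theta = np.pi / 6
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
A = Rz @ np.diag([1.0, 4.0, 0.0]) @ Rz.T
D, b_rot, c = align_quadric(A, np.zeros(3), -1.0)
# np.diag(D) recovers the axis coefficients {0, 1, 4} (ascending order)
```

A single symmetric eigen-decomposition suffices when the full quadratic form is known; an iterative scheme like the paper's becomes relevant when the coefficients must be estimated from noisy range data.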
Teaching scientific thinking skills: Students and computers coaching each other
NASA Astrophysics Data System (ADS)
Reif, Frederick; Scott, Lisa A.
1999-09-01
Our attempts to improve physics instruction have led us to analyze the thought processes needed to apply scientific principles to problems, and to recognize that reliable performance requires the basic cognitive functions of deciding, implementing, and assessing. Using a reciprocal-teaching strategy to teach such thought processes explicitly, we have developed computer programs called PALs (Personal Assistants for Learning) in which computers and students alternately coach each other. These computer-implemented tutorials make it practically feasible to provide students with the individual guidance and feedback ordinarily unavailable in most courses. We constructed PALs specifically designed to teach the application of Newton's laws. In a comparative experimental study these computer tutorials were found to be nearly as effective as individual tutoring by expert teachers, and considerably more effective than the instruction provided in a well-taught physics class. Furthermore, almost all of the students using the PALs perceived them as very helpful to their learning. These results suggest that the proposed instructional approach could fruitfully be extended to improve instruction in various practically realistic contexts.
Drought and resprouting plants
Zeppel, Melanie J. B.; Harrison, Sandy P.; Adams, Henry D.; ...
2014-12-17
Many species have the ability to resprout vegetatively after a substantial loss of biomass induced by environmental stress, including drought. Many of the regions characterised by ecosystems where resprouting is common are projected to experience more frequent and intense drought during the 21st century. However, in assessments of ecosystem response to drought disturbance there has been scant consideration of the resilience and post-drought recovery of resprouting species. Systematic differences in hydraulic and allocation traits suggest that resprouting species are more resilient to drought-stress than nonresprouting species. Evidence suggests that ecosystems dominated by resprouters recover from disturbance more quickly than ecosystems dominated by nonresprouters. The ability of resprouters to avoid mortality and withstand drought, coupled with their ability to recover rapidly, suggests that the impact of increased drought stress in ecosystems dominated by these species may be small. Furthermore, the strategy of resprouting needs to be modelled explicitly to improve estimates of future climate-change impacts on the carbon cycle, but this will require several important knowledge gaps to be filled before resprouting can be properly implemented.
Maximizing propulsive thrust of a driven filament at low Reynolds number via variable flexibility.
Peng, Zhiwei; Elfring, Gwynn J; Pak, On Shun
2017-03-22
At low Reynolds numbers the locomotive capability of a body can be dramatically hindered by the absence of inertia. In this work, we show how propulsive performance in this regime can be significantly enhanced by employing spatially varying flexibility. As a prototypical example, we consider the propulsive thrust generated by a filament periodically driven at one end. The rigid case leads to zero propulsion, as so constrained by Purcell's scallop theorem, while for uniform filaments there exists a bending stiffness maximizing the propulsive force at a given frequency; here we demonstrate explicitly how considerable further improvement can be achieved by simply varying the stiffness along the filament. The optimal flexibility distribution is strongly configuration-dependent: while increasing the flexibility towards the tail-end enhances the propulsion of a clamped filament, for a hinged filament decreasing the flexibility towards the tail-end is instead favorable. The results reveal new design principles for maximizing propulsion at low Reynolds numbers, potentially useful for developing synthetic micro-swimmers requiring large propulsive force for various biomedical applications.
Cowell, Rosemary A; Bussey, Timothy J; Saksida, Lisa M
2012-11-01
We describe how computational models can be useful to cognitive and behavioral neuroscience, and discuss some guidelines for deciding whether a model is useful. We emphasize that because instantiating a cognitive theory as a computational model requires specification of an explicit mechanism for the function in question, it often produces clear and novel behavioral predictions to guide empirical research. However, computational modeling in cognitive and behavioral neuroscience remains somewhat rare, perhaps because of misconceptions concerning the use of computational models (in particular, connectionist models) in these fields. We highlight some common misconceptions, each of which relates to an aspect of computational models: the problem space of the model, the level of biological organization at which the model is formulated, and the importance (or not) of biological plausibility, parsimony, and model parameters. Careful consideration of these aspects of a model by empiricists, along with careful delineation of them by modelers, may facilitate communication between the two disciplines and promote the use of computational models for guiding cognitive and behavioral experiments. Copyright © 2012 Elsevier Ltd. All rights reserved.
Population density estimated from locations of individuals on a passive detector array
Efford, Murray G.; Dawson, Deanna K.; Borchers, David L.
2009-01-01
The density of a closed population of animals occupying stable home ranges may be estimated from detections of individuals on an array of detectors, using newly developed methods for spatially explicit capture–recapture. Likelihood-based methods provide estimates for data from multi-catch traps or from devices that record presence without restricting animal movement ("proximity" detectors such as camera traps and hair snags). As originally proposed, these methods require multiple sampling intervals. We show that equally precise and unbiased estimates may be obtained from a single sampling interval, using only the spatial pattern of detections. This considerably extends the range of possible applications, and we illustrate the potential by estimating density from simulated detections of bird vocalizations on a microphone array. Acoustic detection can be defined as occurring when received signal strength exceeds a threshold. We suggest detection models for binary acoustic data, and for continuous data comprising measurements of all signals above the threshold. While binary data are often sufficient for density estimation, modeling signal strength improves precision when the microphone array is small.
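The binary signal-strength detection model suggested above can be sketched as follows; all parameter values are illustrative assumptions, not estimates from the study.

```python
import math

# Expected received signal strength decays with distance d from the
# microphone, and a detection occurs when strength plus Gaussian noise
# exceeds a threshold c. Parameter values are illustrative.

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def detection_prob(d, beta0=80.0, beta1=-0.3, sigma=5.0, c=60.0):
    """P(detect a call at distance d metres):
    1 - Phi((c - mu(d)) / sigma), with mu(d) = beta0 + beta1 * d in dB."""
    mu = beta0 + beta1 * d
    return 1.0 - norm_cdf((c - mu) / sigma)

# Detection probability decays with distance from the microphone:
probs = [detection_prob(d) for d in (0.0, 50.0, 100.0)]
```

In the continuous-data variant, the measured strengths of all signals above the threshold enter the likelihood directly, which is what the abstract reports as improving precision for small arrays.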
Assessing health impact assessment: multidisciplinary and international perspectives
Krieger, N; Northridge, M; Gruskin, S; Quinn, M; Kriebel, D; Davey, S; Bassett, M; Rehkopf, D; Miller, C
2003-01-01
Health impact assessment (HIA) seeks to expand evaluation of policy and programmes in all sectors, both private and public, to include their impact on population health. While the idea that the public's health is affected by a broad array of social and economic policies is not new, and dates back well over two centuries, what is new is the notion, increasingly adopted by major health institutions such as the World Health Organisation (WHO) and the United Kingdom National Health Service (NHS), that health should be an explicit consideration when evaluating all public policies. In this article, it is argued that while HIA has the potential to enhance recognition of societal determinants of health and of intersectoral responsibility for health, its pitfalls warrant critical attention. Greater clarity is required regarding criteria for initiating, conducting, and completing HIA, including rules pertaining to decision making, enforcement, and compliance, as well as paying for their conduct. Critical debate over the promise, process, and pitfalls of HIA needs to be informed by multiple disciplines and perspectives from diverse people and regions of the world. PMID:12933768
Drought and resprouting plants.
Zeppel, Melanie J B; Harrison, Sandy P; Adams, Henry D; Kelley, Douglas I; Li, Guangqi; Tissue, David T; Dawson, Todd E; Fensham, Rod; Medlyn, Belinda E; Palmer, Anthony; West, Adam G; McDowell, Nate G
2015-04-01
Many species have the ability to resprout vegetatively after a substantial loss of biomass induced by environmental stress, including drought. Many of the regions characterised by ecosystems where resprouting is common are projected to experience more frequent and intense drought during the 21st Century. However, in assessments of ecosystem response to drought disturbance there has been scant consideration of the resilience and post-drought recovery of resprouting species. Systematic differences in hydraulic and allocation traits suggest that resprouting species are more resilient to drought-stress than nonresprouting species. Evidence suggests that ecosystems dominated by resprouters recover from disturbance more quickly than ecosystems dominated by nonresprouters. The ability of resprouters to avoid mortality and withstand drought, coupled with their ability to recover rapidly, suggests that the impact of increased drought stress in ecosystems dominated by these species may be small. The strategy of resprouting needs to be modelled explicitly to improve estimates of future climate-change impacts on the carbon cycle, but this will require several important knowledge gaps to be filled before resprouting can be properly implemented. © 2014 The Authors. New Phytologist © 2014 New Phytologist Trust.
Azria, E; Tsatsaris, V; Moriette, G; Hirsch, E; Schmitz, T; Cabrol, D; Goffinet, F
2007-05-01
The long-term prognosis of extremely premature children is becoming increasingly well known; even when resuscitation at birth is possible, it guarantees neither survival nor survival free of disability. Uncertainty regarding the individual prognosis and outcome of these children remains considerable. In this field we are at the frontier of medical knowledge, and the answer to the question "how should ante- and postnatal care be decided?" is crucial. This work focuses on the problem of decision-making in the context of extreme prematurity. It attempts to deconstruct this concept and to make its stakes explicit. Thus, drawing on medical sources and philosophical debates, we tried to build a decision-making procedure that complies with the ethical requirements of medical care: accuracy, justice and equity. This decision-making procedure is primarily concerned with the singularity of each decision situation, and it intends to link that singularity closely to the notions of rationality and responsibility.
Kevany, Sebastian; Benatar, Solomon R; Fleischer, Theodore
2013-01-01
The escalating expenditure on patients with HIV/AIDS within an inadequately funded public health system is tending towards crowding out care for patients with non-HIV illnesses. Priority-setting decisions are thus required and should increasingly be based on an explicit, transparent and accountable process to facilitate sustainability. South Africa's public health system is eroding, even though the government has received extensive donor financing for specific conditions, such as HIV/AIDS. The South African government's 2007 HIV plan anticipated costs exceeding 20% of the annual health budget with a strong focus on treatment interventions, while the recently announced 2012-2016 National Strategic HIV plan could cost up to US$16 billion. Conversely, the total non-HIV health budget has remained static in recent years, effectively reducing the supply of health care for other diseases. While the South African government cannot meet all demands for health care simultaneously, health funders should attempt to allocate health resources in a fair, efficient, transparent and accountable manner, in order to ensure that publicly funded health care is delivered in a reasonable and non-discriminatory fashion. We recommend a process for resource allocation that includes ethical, economic, legal and policy considerations. This process, adapted for use by South Africa's policy-makers, could bring health, political, economic and ethical gains, whilst allaying a social crisis as mounting treatment commitments generated by HIV have the potential to overwhelm the health system.
Welch, Vivian A; Akl, Elie A; Pottie, Kevin; Ansari, Mohammed T; Briel, Matthias; Christensen, Robin; Dans, Antonio; Dans, Leonila; Eslava-Schmalbach, Javier; Guyatt, Gordon; Hultcrantz, Monica; Jull, Janet; Katikireddi, Srinivasa Vittal; Lang, Eddy; Matovinovic, Elizabeth; Meerpohl, Joerg J; Morton, Rachael L; Mosdol, Annhild; Murad, M Hassan; Petkovic, Jennifer; Schünemann, Holger; Sharaf, Ravi; Shea, Bev; Singh, Jasvinder A; Solà, Ivan; Stanev, Roger; Stein, Airton; Thabaneii, Lehana; Tonia, Thomy; Tristan, Mario; Vitols, Sigurd; Watine, Joseph; Tugwell, Peter
2017-10-01
The aim of this paper is to describe a conceptual framework for how to consider health equity in the Grading of Recommendations Assessment, Development and Evaluation (GRADE) guideline development process. Consensus-based guidance developed by the GRADE working group members and other methodologists. We developed consensus-based guidance to help address health equity when rating the certainty of synthesized evidence (i.e., quality of evidence). When health inequity is determined to be a concern by stakeholders, we propose five methods for explicitly assessing health equity: (1) include health equity as an outcome; (2) consider patient-important outcomes relevant to health equity; (3) assess differences in the relative effect size of the treatment; (4) assess differences in baseline risk and the differing impacts on absolute effects; and (5) assess indirectness of evidence to disadvantaged populations and/or settings. The most important priority for research on health inequity and guidelines is to identify and document examples where health equity has been considered explicitly in guidelines. Although there is a weak scientific evidence base for assessing health equity, this should not discourage the explicit consideration of how guidelines and recommendations affect the most vulnerable members of society. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Wiedemair, W.; Tuković, Ž.; Jasak, H.; Poulikakos, D.; Kurtcuoglu, V.
2012-02-01
The complex interaction between an ultrasound-driven microbubble and an enclosing capillary microvessel is investigated by means of a coupled, multi-domain numerical model using the finite volume formulation. This system is of interest in the study of transient blood-brain barrier disruption (BBBD) for drug delivery applications. The compliant vessel structure is incorporated explicitly as a distinct domain described by a dedicated physical model. Red blood cells (RBCs) are taken into account as elastic solids in the blood plasma. We report the temporal and spatial development of transmural pressure (Ptm) and wall shear stress (WSS) at the luminal endothelial interface, both of which are candidates for the yet unknown mediator of BBBD. The explicit introduction of RBCs shapes the Ptm and WSS distributions and their derivatives markedly. While the peak values of these mechanical wall parameters are not affected considerably by the presence of RBCs, a pronounced increase in their spatial gradients is observed compared to a configuration with blood plasma alone. The novelty of our work lies in the explicit treatment of the vessel wall, and in the modelling of blood as a composite fluid, which we show to be relevant for the mechanical processes at the endothelium.
[Application of spatially explicit landscape model in soil loss study in Huzhong area].
Xu, Chonggang; Hu, Yuanman; Chang, Yu; Li, Xiuzhen; Bu, Renchang; He, Hongshi; Leng, Wenfang
2004-10-01
The Universal Soil Loss Equation (USLE) has been widely used to estimate average annual soil loss. In most previous work on soil loss evaluation on forestland, the cover management factor was calculated from a static forest landscape. The advent of spatially explicit forest landscape models in the last decade, which explicitly simulate forest succession dynamics under natural and anthropogenic disturbances (fire, wind, harvest and so on) on heterogeneous landscapes, makes it possible to take the change of forest cover into consideration and to dynamically simulate soil loss in different years (e.g., 10 or 20 years after the current year). In this study, we linked a spatially explicit landscape model (LANDIS) with the USLE to simulate soil loss dynamics under two scenarios: fire without harvest, and fire with harvest. We also simulated soil loss with neither fire nor harvest as a control. The results showed that soil loss varied periodically with simulation year, and that the amplitude of change was lowest under the control scenario and highest under the fire-without-harvest scenario. The effect of harvest on soil loss could not be easily identified on the map; however, the cumulative effect of harvest on soil loss was larger than that of fire. Decreasing the harvest area and the percentage of bare soil created by harvest could significantly reduce soil loss, but had no significant effect on the dynamics of soil loss. Although harvest increased annual soil loss, it tended to decrease the variability of soil loss between simulation years.
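The USLE described above is a simple multiplicative product of factors, with the cover management factor C supplied dynamically by the landscape model rather than held static. A minimal sketch of that coupling, with all factor values invented for illustration (the study's actual LANDIS-derived values are not given in the abstract):

```python
# Hypothetical illustration of the USLE: A = R * K * LS * C * P.
# All factor values below are invented for demonstration purposes.
def usle_soil_loss(R, K, LS, C, P):
    """Average annual soil loss A from the product of USLE factors:
    rainfall erosivity R, soil erodibility K, slope length/steepness LS,
    cover management C, and support practice P."""
    return R * K * LS * C * P

# Dynamic C factor for one cell, as a landscape model might supply it
# at successive simulation years (values assumed, not from the study):
cover_by_year = {0: 0.004, 10: 0.012, 20: 0.007}
for year, C in cover_by_year.items():
    A = usle_soil_loss(R=1200.0, K=0.25, LS=1.8, C=C, P=1.0)
    print(f"year {year}: A = {A:.2f}")
```

The point of the coupling is only that C changes per cell and per simulation year as the simulated forest succeeds or is disturbed, instead of being read once from a static cover map.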
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, J.; Yu, T.; Papajak, E.
2011-01-01
Many methods for correcting harmonic partition functions for the presence of torsional motions employ some form of one-dimensional torsional treatment to replace the harmonic contribution of a specific normal mode. However, torsions are often strongly coupled to other degrees of freedom, especially other torsions and low-frequency bending motions, and this coupling can make assigning torsions to specific normal modes problematic. Here, we present a new class of methods, called multi-structural (MS) methods, that circumvents the need for such assignments by instead adjusting the harmonic results by torsional correction factors that are determined using internal coordinates. We present three versions of the MS method: (i) MS-AS based on including all structures (AS), i.e., all conformers generated by internal rotations; (ii) MS-ASCB based on all structures augmented with explicit conformational barrier (CB) information, i.e., including explicit calculations of all barrier heights for internal-rotation barriers between the conformers; and (iii) MS-RS based on including all conformers generated from a reference structure (RS) by independent torsions. In the MS-AS scheme, one has two options for obtaining the local periodicity parameters, one based on consideration of the nearly separable limit and one based on strongly coupled torsions. The latter involves assigning the local periodicities on the basis of Voronoi volumes. The methods are illustrated with calculations for ethanol, 1-butanol, and 1-pentyl radical as well as two one-dimensional torsional potentials. The MS-AS method is particularly interesting because it does not require any information about conformational barriers or about the paths that connect the various structures.
Zheng, Jingjing; Yu, Tao; Papajak, Ewa; Alecu, I M; Mielke, Steven L; Truhlar, Donald G
2011-06-21
Many methods for correcting harmonic partition functions for the presence of torsional motions employ some form of one-dimensional torsional treatment to replace the harmonic contribution of a specific normal mode. However, torsions are often strongly coupled to other degrees of freedom, especially other torsions and low-frequency bending motions, and this coupling can make assigning torsions to specific normal modes problematic. Here, we present a new class of methods, called multi-structural (MS) methods, that circumvents the need for such assignments by instead adjusting the harmonic results by torsional correction factors that are determined using internal coordinates. We present three versions of the MS method: (i) MS-AS based on including all structures (AS), i.e., all conformers generated by internal rotations; (ii) MS-ASCB based on all structures augmented with explicit conformational barrier (CB) information, i.e., including explicit calculations of all barrier heights for internal-rotation barriers between the conformers; and (iii) MS-RS based on including all conformers generated from a reference structure (RS) by independent torsions. In the MS-AS scheme, one has two options for obtaining the local periodicity parameters, one based on consideration of the nearly separable limit and one based on strongly coupled torsions. The latter involves assigning the local periodicities on the basis of Voronoi volumes. The methods are illustrated with calculations for ethanol, 1-butanol, and 1-pentyl radical as well as two one-dimensional torsional potentials. The MS-AS method is particularly interesting because it does not require any information about conformational barriers or about the paths that connect the various structures.
Yassi, Annalee; O’Hara, Lyndsay Michelle; Engelbrecht, Michelle C.; Uebel, Kerry; Nophale, Letshego Elizabeth; Bryce, Elizabeth Ann; Buxton, Jane A; Siegel, Jacob; Spiegel, Jerry Malcolm
2014-01-01
Background Community-based cluster-randomized controlled trials (RCTs) are increasingly being conducted to address pressing global health concerns. Preparations for clinical trials are well-described, as are the steps for multi-component health service trials. However, guidance is lacking for addressing the ethical and logistic challenges in (cluster) RCTs of population health interventions in low- and middle-income countries. Objective We aimed to identify the factors that population health researchers must explicitly consider when planning RCTs within North–South partnerships. Design We reviewed our experiences and identified key ethical and logistic issues encountered during the pre-trial phase of a recently implemented RCT. This trial aimed to improve tuberculosis (TB) and Human Immunodeficiency Virus (HIV) prevention and care for health workers by enhancing workplace assessment capability, addressing concerns about confidentiality and stigma, and providing onsite counseling, testing, and treatment. An iterative framework was used to synthesize this analysis with lessons taken from other studies. Results The checklist of critical factors was grouped into eight categories: 1) Building trust and shared ownership; 2) Conducting feasibility studies throughout the process; 3) Building capacity; 4) Creating an appropriate information system; 5) Conducting pilot studies; 6) Securing stakeholder support, with a view to scale-up; 7) Continuously refining methodological rigor; and 8) Explicitly addressing all ethical issues both at the start and continuously as they arise. Conclusion Researchers should allow for the significant investment of time and resources required for successful implementation of population health RCTs within North–South collaborations, recognize the iterative nature of the process, and be prepared to revise protocols as challenges emerge. PMID:24802561
Conson, Massimiliano; Volpicella, Francesco; De Bellis, Francesco; Orefice, Agnese; Trojano, Luigi
2017-10-01
A key point in the motor imagery literature is that judging hands in palm view recruits sensory-motor information to a higher extent than judging hands in back view, due to the greater biomechanical complexity implied in rotating hands depicted from palm than from back. We took advantage of this solid evidence to test the nature of a phenomenon known as the self-advantage, i.e. the advantage in implicitly recognizing self vs. others' hand images. The self-advantage has actually been found when implicitly but not explicitly judging self-hands, likely due to a dissociation between implicit and explicit body representations. However, such a finding might be related to the extent to which motor imagery is recruited during implicit and explicit processing of hand images. We tested this hypothesis in two behavioural experiments. In Experiment 1, right-handed participants judged the laterality of either self or others' hands, whereas in Experiment 2, an explicit recognition of one's own hands was required. Crucially, in both experiments participants were randomly presented with hand images viewed from back or from palm. The main result of both experiments was a self-advantage when participants judged hands from palm view. This novel finding demonstrates that increasing the "motor imagery load" during processing of self vs. others' hands can elicit a self-advantage in explicit recognition tasks as well. Future studies testing the possible dissociation between implicit and explicit visual body representations should take into account the modulatory effect of motor imagery load on self-hand processing. Copyright © 2017. Published by Elsevier B.V.
Engineering Complex Embedded Systems with State Analysis and the Mission Data System
NASA Technical Reports Server (NTRS)
Ingham, Michel D.; Rasmussen, Robert D.; Bennett, Matthew B.; Moncada, Alex C.
2004-01-01
It has become clear that spacecraft system complexity is reaching a threshold where customary methods of control are no longer affordable or sufficiently reliable. At the heart of this problem are the conventional approaches to systems and software engineering based on subsystem-level functional decomposition, which fail to scale in the tangled web of interactions typically encountered in complex spacecraft designs. Furthermore, there is a fundamental gap between the requirements on software specified by systems engineers and the implementation of these requirements by software engineers. Software engineers must perform the translation of requirements into software code, hoping to accurately capture the systems engineer's understanding of the system behavior, which is not always explicitly specified. This gap opens up the possibility for misinterpretation of the systems engineer's intent, potentially leading to software errors. This problem is addressed by a systems engineering methodology called State Analysis, which provides a process for capturing system and software requirements in the form of explicit models. This paper describes how requirements for complex aerospace systems can be developed using State Analysis and how these requirements inform the design of the system software, using representative spacecraft examples.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-05
... referenced: The existing CPSC Standard Operating Procedure for Determining Lead (Pb) in Paint and Other... explicitly require the use of a particular standard operating procedure. Additionally, the following....'' It is still based on standard test procedures, such as ASTM International (formerly the American...
The mere exposure effect and recognition depend on the way you look!
Willems, Sylvie; Dedonder, Jonathan; Van der Linden, Martial
2010-01-01
In line with Whittlesea and Price (2001), we investigated whether the memory effect measured with an implicit memory paradigm (mere exposure effect) and an explicit recognition task depended on perceptual processing strategies, regardless of whether the task required intentional retrieval. We found that manipulation intended to prompt functional implicit-explicit dissociation no longer had a differential effect when we induced similar perceptual strategies in both tasks. Indeed, the results showed that prompting a nonanalytic strategy ensured performance above chance on both tasks. Conversely, inducing an analytic strategy drastically decreased both explicit and implicit performance. Furthermore, we noted that the nonanalytic strategy involved less extensive gaze scanning than the analytic strategy and that memory effects under this processing strategy were largely independent of gaze movement.
Exponential localization of Wannier functions in insulators.
Brouder, Christian; Panati, Gianluca; Calandra, Matteo; Mourougane, Christophe; Marzari, Nicola
2007-01-26
The exponential localization of Wannier functions in two or three dimensions is proven for all insulators that display time-reversal symmetry, settling a long-standing conjecture. Our proof relies on the equivalence between the existence of analytic quasi-Bloch functions and the nullity of the Chern numbers (or of the Hall current) for the system under consideration. The same equivalence implies that Chern insulators cannot display exponentially localized Wannier functions. An explicit condition for the reality of the Wannier functions is identified.
Implications of a quadratic stream definition in radiative transfer theory.
NASA Technical Reports Server (NTRS)
Whitney, C.
1972-01-01
An explicit definition of the radiation-stream concept is stated and applied to approximate the integro-differential equation of radiative transfer with a set of twelve coupled differential equations. Computational efficiency is enhanced by distributing the corresponding streams in three-dimensional space in a totally symmetric way. Polarization is then incorporated in this model. A computer program based on the model is briefly compared with a Monte Carlo program for simulation of horizon scans of the earth's atmosphere. It is found to be considerably faster.
Constructing increment-decrement life tables.
Schoen, R
1975-05-01
A life table model which can recognize increments (or entrants) as well as decrements has proven to be of considerable value in the analysis of marital status patterns, labor force participation patterns, and other areas of substantive interest. Nonetheless, relatively little work has been done on the methodology of increment-decrement (or combined) life tables. The present paper reviews the general, recursive solution of Schoen and Nelson (1974), develops explicit solutions for three cases of particular interest, and compares alternative approaches to the construction of increment-decrement tables.
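Increment-decrement tables of the kind discussed above carry a survivorship vector forward across states rather than a single survivorship column. A toy two-state projection, with invented annual transition probabilities (this is an illustration of the general idea, not Schoen and Nelson's recursive solution or Schoen's explicit solutions):

```python
import numpy as np

# Hypothetical two-state increment-decrement sketch (states: unmarried, married).
# Probabilities are invented; real tables are built from observed rates.
P = np.array([[0.90, 0.08],   # unmarried: stay / marry  (remainder 0.02: death)
              [0.05, 0.93]])  # married: dissolve / stay (remainder 0.02: death)

def project(l, years):
    """Carry the state-specific survivorship vector l forward year by year,
    applying increments (entries into a state) and decrements together."""
    for _ in range(years):
        l = l @ P
    return l

l0 = np.array([100000.0, 0.0])  # radix: everyone starts unmarried
l1 = project(l0, 1)             # after one year: [90000, 8000]
```

The "increment" feature is visible in the second component: the married state gains entrants (here 8000 after one year) even though its own survivorship started at zero, which an ordinary single-decrement table cannot represent.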
Tailoring the Psychotherapy to the Borderline Patient
HORWITZ, LEONARD; GABBARD, GLEN O.; ALLEN, JON G.; COLSON, DONALD B.; FRIESWYK, SIEBOLT; NEWSOM, GAVIN E.; COYNE, LOLAFAYE
1996-01-01
Views still differ as to the optimal psychodynamic treatment of borderline patients. Recommendations range from psychoanalysis and exploratory psychotherapy to an explicitly supportive treatment aimed at strengthening adaptive defenses. The authors contend that no single approach is appropriate for all patients in this wide-ranging diagnostic category, which spans a continuum from close-to-neurotic to close-to-psychotic levels of functioning. Careful differentiations based on developmental considerations, ego structures, and relationship patterns provide the basis for the optimal treatment approach. PMID:22700301
Uncertainty-accounting environmental policy and management of water systems.
Baresel, Christian; Destouni, Georgia
2007-05-15
Environmental policies for water quality and ecosystem management do not commonly require explicit stochastic accounts of uncertainty and risk associated with the quantification and prediction of waterborne pollutant loads and abatement effects. In this study, we formulate and investigate a possible environmental policy that does require an explicit stochastic uncertainty account. We compare both the environmental and economic resource allocation performance of such an uncertainty-accounting environmental policy with that of deterministic, risk-prone and risk-averse environmental policies under a range of different hypothetical, yet still possible, scenarios. The comparison indicates that a stochastic uncertainty-accounting policy may perform better than deterministic policies over a range of different scenarios. Even in the absence of reliable site-specific data, reported literature values appear to be useful for such a stochastic account of uncertainty.
A short note on the use of the red-black tree in Cartesian adaptive mesh refinement algorithms
NASA Astrophysics Data System (ADS)
Hasbestan, Jaber J.; Senocak, Inanc
2017-12-01
Mesh adaptivity is an indispensable capability for tackling multiphysics problems with large disparities in time and length scales. With the availability of powerful supercomputers, there is a pressing need to extend time-proven computational techniques to extreme-scale problems. Cartesian adaptive mesh refinement (AMR) is one such method that enables simulation of multiscale, multiphysics problems. AMR is based on the construction of octrees. Originally, an explicit tree data structure was used to generate and manipulate an adaptive Cartesian mesh. At least eight pointers are required in an explicit approach to construct an octree. Parent-child relationships are then used to traverse the tree. An explicit octree, however, is expensive in terms of memory usage and the time it takes to traverse the tree to access a specific node. For these reasons, implicit pointerless methods have been pioneered within the computer graphics community, motivated by applications requiring interactivity and realistic three-dimensional visualization. Lewiner et al. [1] provide a concise review of pointerless approaches to generating an octree. The use of a hash table and a Z-order curve are two key concepts in pointerless methods that we briefly discuss next.
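The Z-order curve mentioned above linearizes 3D cell coordinates into a single integer key by interleaving their bits, so octree nodes can live in a hash table instead of a pointer-linked tree. A minimal sketch (the key layout and the node payloads are illustrative choices, not the paper's implementation):

```python
def morton3(x, y, z, bits=10):
    """Interleave the bits of integer coordinates (x, y, z)
    into one Z-order (Morton) key."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (3 * i)        # x occupies bits 0, 3, 6, ...
        key |= ((y >> i) & 1) << (3 * i + 1)    # y occupies bits 1, 4, 7, ...
        key |= ((z >> i) & 1) << (3 * i + 2)    # z occupies bits 2, 5, 8, ...
    return key

# Pointerless octree: nodes keyed by (level, Morton key) in a hash table.
# A child of the node (L, k) sits at (L + 1, (k << 3) | octant), so parent
# and child keys are recovered by bit shifts instead of stored pointers.
octree = {(0, morton3(0, 0, 0)): "root"}
octant = 5                                       # child index, 0..7
octree[(1, (morton3(0, 0, 0) << 3) | octant)] = "leaf"
```

Because neighbouring keys are computed arithmetically, refinement and coarsening only insert or delete hash-table entries, avoiding the eight-pointer-per-node overhead of the explicit tree.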
Synchronization of spontaneous eyeblinks while viewing video stories
Nakano, Tamami; Yamamoto, Yoshiharu; Kitajo, Keiichi; Takahashi, Toshimitsu; Kitazawa, Shigeru
2009-01-01
Blinks are generally suppressed during a task that requires visual attention and tend to occur immediately before or after the task when the timing of its onset and offset are explicitly given. During the viewing of video stories, blinks are expected to occur at explicit breaks such as scene changes. However, given that the scene length is unpredictable, there should also be appropriate timing for blinking within a scene to prevent temporal loss of critical visual information. Here, we show that spontaneous blinks were highly synchronized between and within subjects when they viewed the same short video stories, but were not explicitly tied to the scene breaks. Synchronized blinks occurred during scenes that required less attention such as at the conclusion of an action, during the absence of the main character, during a long shot and during repeated presentations of a similar scene. In contrast, blink synchronization was not observed when subjects viewed a background video or when they listened to a story read aloud. The results suggest that humans share a mechanism for controlling the timing of blinks that searches for an implicit timing that is appropriate to minimize the chance of losing critical information while viewing a stream of visual events. PMID:19640888
Cognitive conflict without explicit conflict monitoring in a dynamical agent.
Ward, Robert; Ward, Ronnie
2006-11-01
We examine mechanisms for resolving cognitive conflict in an embodied, situated, and dynamic agent, developed through an evolutionary learning process. The agent was required to solve problems of response conflict in a dual-target "catching" task, focusing response on one of the targets while ignoring the other. Conflict in the agent was revealed at the behavioral level in terms of increased latencies to the second target. This behavioral interference was correlated to peak violations of the network's stable state equation. At the level of the agent's neural network, peak violations were also correlated to periods of disagreement in source inputs to the agent's motor effectors. Despite observing conflict at these numerous levels, we did not find any explicit conflict monitoring mechanisms within the agent. We instead found evidence of a distributed conflict management system, characterized by competitive sources within the network. In contrast to the conflict monitoring hypothesis [Botvinick, M. M., Braver, T. S., Barch, D. M., Carter, C. S., & Cohen, J. D. (2001). Conflict monitoring and cognitive control. Psychological Review, 108(3), 624-652], this agent demonstrates that resolution of cognitive conflict does not require explicit conflict monitoring. We consider the implications of our results for the conflict monitoring hypothesis.
Explicit and spontaneous retrieval of emotional scenes: electrophysiological correlates.
Weymar, Mathias; Bradley, Margaret M; El-Hinnawi, Nasryn; Lang, Peter J
2013-10-01
When event-related potentials (ERP) are measured during a recognition task, items that have previously been presented typically elicit a larger late (400-800 ms) positive potential than new items. Recent data, however, suggest that emotional, but not neutral, pictures show ERP evidence of spontaneous retrieval when presented in a free-viewing task (Ferrari, Bradley, Codispoti, Karlsson, & Lang, 2012). In two experiments, we further investigated the brain dynamics of implicit and explicit retrieval. In Experiment 1, brain potentials were measured during a semantic categorization task, which did not explicitly probe episodic memory, but which, like a recognition task, required an active decision and a button press, and were compared to those elicited during recognition and free viewing. Explicit recognition prompted a late enhanced positivity for previously presented, compared with new, pictures regardless of hedonic content. In contrast, only emotional pictures showed an old-new difference when the task did not explicitly probe episodic memory, either when making an active categorization decision regarding picture content, or when simply viewing pictures. In Experiment 2, however, neutral pictures did prompt a significant old-new ERP difference during subsequent free viewing when emotionally arousing pictures were not included in the encoding set. These data suggest that spontaneous retrieval is heightened for salient cues, perhaps reflecting heightened attention and elaborative processing at encoding.
Explicit and spontaneous retrieval of emotional scenes: Electrophysiological correlates
Weymar, Mathias; Bradley, Margaret M.; El-Hinnawi, Nasryn; Lang, Peter J.
2014-01-01
When event-related potentials are measured during a recognition task, items that have previously been presented typically elicit a larger late (400–800 ms) positive potential than new items. Recent data, however, suggest that emotional, but not neutral, pictures show ERP evidence of spontaneous retrieval when presented in a free-viewing task (Ferrari, Bradley, Codispoti & Lang, 2012). In two experiments, we further investigated the brain dynamics of implicit and explicit retrieval. In Experiment 1, brain potentials were measured during a semantic categorization task, which did not explicitly probe episodic memory, but which, like a recognition task, required an active decision and a button press, and were compared to those elicited during recognition and free viewing. Explicit recognition prompted a late enhanced positivity for previously presented, compared to new, pictures regardless of hedonic content. In contrast, only emotional pictures showed an old-new difference when the task did not explicitly probe episodic memory, either when making an active categorization decision regarding picture content, or when simply viewing pictures. In Experiment 2, however, neutral pictures did prompt a significant old-new ERP difference during subsequent free viewing when emotionally arousing pictures were not included in the encoding set. These data suggest that spontaneous retrieval is heightened for salient cues, perhaps reflecting heightened attention and elaborative processing at encoding. PMID:23795588
Effect of explicit dimension instruction on speech category learning
Chandrasekaran, Bharath; Yi, Han-Gyol; Smayda, Kirsten E.; Maddox, W. Todd
2015-01-01
Learning non-native speech categories is often considered a challenging task in adulthood. This difficulty is driven by cross-language differences in weighting critical auditory dimensions that differentiate speech categories. For example, previous studies have shown that differentiating Mandarin tonal categories requires attending to dimensions related to pitch height and direction. Relative to native speakers of Mandarin, the pitch direction dimension is under-weighted by native English speakers. In the current study, we examined the effect of explicit instructions (dimension instruction) on native English speakers' Mandarin tone category learning within the framework of a dual-learning systems (DLS) model. This model predicts that successful speech category learning is initially mediated by an explicit, reflective learning system that frequently utilizes unidimensional rules, with an eventual switch to a more implicit, reflexive learning system that utilizes multidimensional rules. Participants were explicitly instructed to focus on and/or ignore the pitch height dimension or the pitch direction dimension, or were given no explicit prime. Our results show that instructions directing participants to focus on pitch direction, and instructions diverting attention away from pitch height, resulted in enhanced tone categorization. Computational modeling of participant responses suggested that instruction related to pitch direction led to faster and more frequent use of multidimensional reflexive strategies, and enhanced perceptual selectivity along the previously under-weighted pitch direction dimension. PMID:26542400
Telerobot operator control station requirements
NASA Technical Reports Server (NTRS)
Kan, Edwin P.
1988-01-01
The operator control station of a telerobot system has unique functional and human factors requirements. It has to satisfy the needs of a truly interactive and user-friendly complex system, a telerobot system being a hybrid between a teleoperated and an autonomous system. These functional, hardware and software requirements are discussed, with explicit reference to the design objectives and constraints of the JPL/NASA Telerobot Demonstrator System.
Multiple systems of category learning.
Smith, Edward E; Grossman, Murray
2008-01-01
We review neuropsychological and neuroimaging evidence for the existence of three qualitatively different categorization systems. These categorization systems are themselves based on three distinct memory systems: working memory (WM), explicit long-term memory (explicit LTM), and implicit long-term memory (implicit LTM). We first contrast categorization based on WM with that based on explicit LTM, where the former typically involves applying rules to a test item and the latter involves determining the similarity between stored exemplars or prototypes and a test item. Neuroimaging studies show differences between brain activity in normal participants as a function of whether they are instructed to categorize novel test items by rule or by similarity to known category members. Rule instructions typically lead to more activation in frontal or parietal areas, associated with WM and selective attention, whereas similarity instructions may activate parietal areas associated with the integration of perceptual features. Studies with neurological patients in the same paradigms provide converging evidence, e.g., patients with Alzheimer's disease, who have damage in prefrontal regions, are more impaired with rule than similarity instructions. Our second contrast is between categorization based on explicit LTM with that based on implicit LTM. Neuropsychological studies with patients with medial-temporal lobe damage show that patients are impaired on tasks requiring explicit LTM, but perform relatively normally on an implicit categorization task. Neuroimaging studies provide converging evidence: whereas explicit categorization is mediated by activation in numerous frontal and parietal areas, implicit categorization is mediated by a deactivation in posterior cortex.
NASA Astrophysics Data System (ADS)
Aubree, Nathan
Since 1990, constitutive concrete model EPM3D (Multiaxial Progressive Damage in 3 Dimensions) has been developed at Polytechnique Montreal. Bouzaiene and Massicotte (1995) choose the hypoelastic approach with the concept of equivalent deformation and the implementation of a scalar damage parameter to represent the microcracking of concrete in pre-peak compression. The post-peak softening behaviour, in tension and in compression, is based on the concept of conservation of the fracture energy. In the finite elements context, it requires defining a localisation limiter acting on the softening modulus depending on the element size. The formulation of EPM3D model in the case of the post-peak compression required revisions. Mesh-dependence problems and the absence of the consideration of the confinement effect were the most important points to improve, with as main goal the modelling of the fracture of the reinforced concrete columns. With a complete literature review, we try to establish an exhaustive list of the numerous parameters having an influence on the softening behavior under uniaxial and multiaxial loads. In the second part of this review, we exhibit the difficulties of modelling a softening material with finite elements theory and the principle of the set up localization limiter. Inspired by models we met in literature, modifications of the previously established relation are proposed by focusing on a more adequate representation of the behavior under confinement loads. Then we proceed to the validation of the model by means of simple analyses with the software ABAQUS and the module of explicit dynamic resolution, called Explicit. Also we present its specificities compared with a classic implicit static resolution. We supply some advice to the reader and future students who are susceptible to model real reinforced concrete columns with EPM3D. 
Finally, we conducted an experimental program to characterize the post-peak behaviour in uniaxial compression of a fibre-reinforced concrete (FRC) mixture, with the aim of assessing whether the model can be extrapolated to FRC.
The Impact of Debt Limitations and Referenda Requirements on the Cost of School District Bond Issues
ERIC Educational Resources Information Center
Harris, Mary H.; Munley, Vincent G.
2011-01-01
One distinction between the markets for corporate and municipal bonds involves institutional constraints that apply to some municipal bond issues. This research focuses on how public finance institutions, in particular explicit debt limits and referenda requirements, affect the borrowing cost of individual school district bond issues. The…
It Takes Time and Experience to Learn How to Interpret Gaze in Mentalistic Terms
ERIC Educational Resources Information Center
Leavens, David A.
2006-01-01
What capabilities are required for an organism to evince an "explicit" understanding of gaze as a mentalistic phenomenon? One possibility is that mentalistic interpretations of gaze, like concepts of unseen, supernatural beings, are culturally-specific concepts, acquired through cultural learning. These abstract concepts may either require a…
Integrating Spatial Components into FIA Models of Forest Resources: Some Technical Aspects
Pat Terletzky; Tracey Frescino
2005-01-01
We examined two software packages to determine their feasibility of implementing spatially explicit, forest resource models that integrate Forest Inventory and Analysis data (FIA). ARCINFO and Interactive Data Language (IDL) were examined for their input requirements, speed of processing, storage requirements, and flexibility of implementing. Implementations of two...
Kent, C D; Mashour, G A; Metzger, N A; Posner, K L; Domino, K B
2013-03-01
Anaesthetic awareness is a recognized complication of general anaesthesia (GA) and is associated with post-traumatic stress disorder (PTSD). Although complete amnesia for intraprocedural events during sedation and regional anaesthesia (RA) may occur, explicit recall is expected by anaesthesia providers. Consequently, the possibility that there could be psychological consequences associated with unexpected explicit recall of events during sedation and RA has not been investigated. This study investigated the psychological sequelae of unexpected explicit recall of events during sedation/RA that was reported to the Anesthesia Awareness Registry. The Registry recruited subjects who self-identified as having had anaesthetic awareness. Inclusion criteria were a patient-reported awareness experience in 1990 or later and availability of medical records. The sensations experienced by the subjects during their procedure and the acute and persistent psychological sequelae attributed to this explicit recall were assessed for patients receiving sedation/RA and those receiving GA. Among the patients fulfilling the inclusion criteria, medical record review identified 27 sedation/RA and 50 GA cases. Most patients experienced distress (78% of sedation/RA vs 94% of GA). Approximately 40% of patients with sedation/RA had persistent psychological sequelae, similar to GA patients. Some sedation/RA patients reported an adverse impact on their job performance (15%), family relationships (11%), and friendships (11%), and 15% reported being diagnosed with PTSD. Patients who self-reported to the Registry unexpected explicit recall of events during sedation/RA experienced distress and persistent psychological sequelae comparable with those who had reported anaesthetic awareness during GA. Further study is warranted to determine if patients reporting distress with explicit recall after sedation/RA require psychiatric follow-up.
Nakamura, Shinichiro; Kondo, Yasushi; Matsubae, Kazuyo; Nakajima, Kenichi; Tasaki, Tomohiro; Nagasaka, Tetsuya
2012-09-04
Metals can in theory be infinitely recycled in a closed-loop without any degradation in quality. In reality, however, open-loop recycling is more typical for metal scrap recovered from end-of-life (EoL) products because mixing of different metal species results in scrap quality that no longer matches the originals. Further losses occur when meeting the quality requirement of the target product requires dilution of the secondary material by adding high purity materials. Standard LCA usually does not address these losses. This paper presents a novel approach to quantifying quality- and dilution losses, by means of hybrid input-output analysis. We focus on the losses associated with the recycling of ferrous materials from end-of-life vehicle (ELV) due to the mixing of copper, a typical contaminant in steel recycling. Given the quality of scrap in terms of copper density, the model determines the ratio by which scrap needs to be diluted in an electric arc furnace (EAF), and the amount of demand for EAF steel including those quantities needed for dilution. Application to a high-resolution Japanese IO table supplemented with data on ferrous materials including different grades of scrap indicates that a nationwide avoidance of these losses could result in a significant reduction of CO(2) emissions.
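The dilution loss described in this abstract follows from a simple mass balance: if recovered scrap carries more copper than the target steel grade tolerates, high-purity material must be added until the blend meets the specification. A minimal sketch of that calculation (illustrative only; the paper's hybrid input-output model is far richer, and the numbers below are hypothetical, not taken from the study):

```python
def dilution_ratio(scrap_cu_frac, target_cu_frac, diluent_cu_frac=0.0):
    """Mass of high-purity diluent needed per unit mass of scrap so that
    the blended melt meets the target copper tolerance.

    Mass balance: (scrap_cu + r * diluent_cu) / (1 + r) <= target_cu,
    solved for r, the ratio of diluent mass to scrap mass.
    """
    if scrap_cu_frac <= target_cu_frac:
        return 0.0  # scrap already within spec, no dilution needed
    return (scrap_cu_frac - target_cu_frac) / (target_cu_frac - diluent_cu_frac)

# Hypothetical example: scrap at 0.4 wt% Cu, product tolerance 0.1 wt% Cu
r = dilution_ratio(0.004, 0.001)  # roughly three parts diluent per part scrap
```

Every unit of such dilution is extra primary material demand, which is exactly the kind of hidden loss the authors' input-output model traces through the economy.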
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Xiaodong; Xia, Yidong; Luo, Hong
A comparative study of two classes of third-order implicit time integration schemes is presented for a third-order hierarchical WENO reconstructed discontinuous Galerkin (rDG) method to solve the 3D unsteady compressible Navier-Stokes equations: 1) the explicit first stage, single diagonally implicit Runge-Kutta (ESDIRK3) scheme, and 2) the Rosenbrock-Wanner (ROW) schemes based on the differential algebraic equations (DAEs) of index 2. Compared with the ESDIRK3 scheme, a remarkable feature of the ROW schemes is that they require only one approximate Jacobian matrix calculation per time step, thus considerably reducing the overall computational cost. A variety of test cases, ranging from inviscid flows to DNS of turbulent flows, are presented to assess the performance of these schemes. Numerical experiments demonstrate that the third-order ROW scheme for the DAEs of index 2 not only achieves the designed formal order of temporal convergence accuracy in a benchmark test, but also requires significantly less computing time than its ESDIRK3 counterpart to converge to the same level of discretization errors in all of the flow simulations in this study, indicating that the ROW methods provide an attractive alternative for the higher-order time-accurate integration of the unsteady compressible Navier-Stokes equations.
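The practical appeal of Rosenbrock-Wanner methods noted above is that they are linearly implicit: each step needs one Jacobian evaluation and linear solves, with no Newton iteration as in a DIRK-type scheme. A first-order sketch (the linearly implicit Euler method, not the third-order scheme studied in the paper; assumes NumPy) illustrates the structure:

```python
import numpy as np

def rosenbrock_euler_step(f, jac, y, dt):
    """One step of the simplest Rosenbrock-type (linearly implicit Euler) method.

    Unlike a diagonally implicit RK stage, no Newton iteration is needed:
    one Jacobian evaluation and one linear solve per step.
        (I - dt*J) k = dt * f(y);    y_new = y + k
    """
    n = y.size
    J = jac(y)                       # single Jacobian per step: the ROW hallmark
    A = np.eye(n) - dt * J
    k = np.linalg.solve(A, dt * f(y))
    return y + k

# Stiff scalar test problem: y' = -1000 * (y - 1), exact fixed point y = 1
f = lambda y: -1000.0 * (y - 1.0)
jac = lambda y: np.array([[-1000.0]])

y = np.array([0.0])
for _ in range(10):
    y = rosenbrock_euler_step(f, jac, y, dt=0.1)
```

Even with dt = 0.1 and a stiffness of 10^3, where explicit Euler would diverge, the iteration relaxes stably toward y = 1, which is why linearly implicit schemes are attractive for stiff unsteady flow problems.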
Liu, Xiaodong; Xia, Yidong; Luo, Hong; ...
2016-10-05
Kersten, Paula; Dudley, Margaret; Nayar, Shoba; Elder, Hinemoa; Robertson, Heather; Tauroa, Robyn; McPherson, Kathryn M
2016-10-12
Screening children for behavioural difficulties requires the use of a tool that is culturally valid. We explored the cross-cultural acceptability and utility of the Strengths and Difficulties Questionnaire for pre-school children (aged 3-5) as perceived by families in New Zealand. A qualitative interpretive descriptive study (focus groups and interviews) in which 65 participants from five key ethnic groups (New Zealand European, Māori, Pacific, Asian and other immigrant parents) took part. Thematic analysis using an inductive approach, in which the themes identified are strongly linked to the data, was employed. Many parents reported they were unclear about the purpose of the tool, affecting its perceived value. Participants reported not understanding the context in which they should consider the questions and had difficulty understanding some questions and response options. Māori parents generally did not support the questionnaire based approach, preferring face to face interaction. Parents from Māori, Pacific Island, Asian, and new immigrant groups reported the tool lacked explicit consideration of children in their cultural context. Parents discussed the importance of timing and multiple perspectives when interpreting scores from the tool. In summary, this study posed a number of challenges to the use of the Strengths and Difficulties Questionnaire in New Zealand. Further work is required to develop a tool that is culturally appropriate with good content validity.
The Urban Food-Water Nexus: Modeling Water Footprints of Urban Agriculture using CityCrop
NASA Astrophysics Data System (ADS)
Tooke, T. R.; Lathuilliere, M. J.; Coops, N. C.; Johnson, M. S.
2014-12-01
Urban agriculture provides a potential contribution towards more sustainable food production and mitigating some of the human impacts that accompany volatility in regional and global food supply. When considering the capacity of urban landscapes to produce food products, the impact of urban water demand required for food production in cities is often neglected. Urban agricultural studies also tend to be undertaken at broad spatial scales, overlooking the heterogeneity of urban form that exerts an extreme influence on the urban energy balance. As a result, urban planning and management practitioners require, but often do not have, spatially explicit and detailed information to support informed urban agricultural policy, especially as it relates to potential conflicts with sustainability goals targeting water-use. In this research we introduce a new model, CityCrop, a hybrid evapotranspiration-plant growth model that incorporates detailed digital representations of the urban surface and biophysical impacts of the built environment and urban trees to account for the daily variations in net surface radiation. The model enables very fine-scale (sub-meter) estimates of water footprints of potential urban agricultural production. Results of the model are demonstrated for an area in the City of Vancouver, Canada and compared to aspatial model estimates, demonstrating the unique considerations and sensitivities for current and future water footprints of urban agriculture and the implications for urban water planning and policy.
NASA Astrophysics Data System (ADS)
Dalzell, B. J.; Pennington, D.; Nelson, E.; Mulla, D.; Polasky, S.; Taff, S.
2012-12-01
This study links a spatially-explicit biophysical model (SWAT) with an economic model (InVEST) to identify the economically optimum allocation of conservation practices on the landscape. Combining biophysical and economic analysis allows assessment of the benefits and costs of alternative policy choices through consideration of direct costs and benefits as measured by market transactions as well as non-market benefits and costs from changes in environmental conditions that lead to changes in the provision of ecosystem services. When applied to an agricultural watershed located in South-Central Minnesota, this approach showed that: (1) some modest gains (20% improvement, relative to baseline conditions) in water quality can be achieved without diminishing current economic returns, but that (2) more dramatic reductions in sediment and phosphorus required to meet water quality goals (50% reductions in loadings) will require transitioning land from row crops into perennial vegetation. This shift in land cover will result in a reduction in economic returns unless non-market ecosystem services are also valued. Further results showed that traditional best management practices such as conservation tillage and reduced fertilizer application rates are not sufficient to achieve water quality goals by themselves. Finally, if crop prices drop to pre-2007 levels or valuation of ecosystem services increases, then achieving water quality goals can occur with less of an economic impact to the watershed.
Taylor, J Eric T; Lam, Timothy K; Chasteen, Alison L; Pratt, Jay
2015-01-01
Embodied cognition holds that abstract concepts are grounded in perceptual-motor simulations. If a given embodied metaphor maps onto a spatial representation, then thinking of that concept should bias the allocation of attention. In this study, we used positive and negative self-esteem words to examine two properties of conceptual cueing. First, we tested the orientation-specificity hypothesis, which predicts that conceptual cues should selectively activate certain spatial axes (in this case, valenced self-esteem concepts should activate vertical space), instead of any spatial continuum. Second, we tested whether conceptual cueing requires semantic processing, or if it can be achieved with shallow visual processing of the cue words. Participants viewed centrally presented words consisting of high or low self-esteem traits (e.g., brave, timid) before detecting a target above or below the cue in the vertical condition, or on the left or right of the word in the horizontal condition. Participants were faster to detect targets when their location was compatible with the valence of the word cues, but only in the vertical condition. Moreover, this effect was observed when participants processed the semantics of the word, but not when processing its orthography. The results show that conceptual cueing by spatial metaphors is orientation-specific, and that an explicit consideration of the word cues' semantics is required for conceptual cueing to occur.
Hanefeld, Johanna; Bond, Virginia; Seeley, Janet; Lees, Shelley; Desmond, Nicola
2015-12-01
Increasing attention is being paid to the potential of anti-retroviral treatment (ART) for HIV prevention. The possibility of eliminating HIV from a population through a universal test and treat intervention, where all people within a population are tested for HIV and all positive people immediately initiated on ART, as part of a wider prevention intervention, was first proposed in 2009. Several clinical trials testing this idea are now in inception phase. An intervention which relies on universally testing the entire population for HIV will pose challenges to human rights, including obtaining genuine consent to testing and treatment. It also requires a context in which people can live free from fear of stigma, discrimination and violence, and can access services they require. These challenges are distinct from the field of medical ethics which has traditionally governed clinical trials and focuses primarily on patient researcher relationship. This paper sets out the potential impact of a population wide treatment as prevention intervention on human rights. It identifies five human right principles of particular relevance: participation, accountability, the right to health, non-discrimination and equality, and consent and confidentiality. The paper proposes that explicit attention to human rights can strengthen a treatment as prevention intervention, contribute to mediating likely health systems challenges and offer insights on how to reach all sections of the population. © 2013 John Wiley & Sons Ltd.
Harris, Claire; Allen, Kelly; King, Richard; Ramsey, Wayne; Kelly, Cate; Thiagarajan, Malar
2017-05-05
This is the second in a series of papers reporting a program of Sustainability in Health care by Allocating Resources Effectively (SHARE) in a local healthcare setting. Rising healthcare costs, continuing advances in health technologies and recognition of ineffective practices and systematic waste are driving disinvestment of health technologies and clinical practices that offer little or no benefit in order to maximise outcomes from existing resources. However there is little information to guide regional health services or individual facilities in how they might approach disinvestment locally. This paper outlines the investigation of potential settings and methods for decision-making about disinvestment in the context of an Australian health service. Methods include a literature review on the concepts and terminology relating to disinvestment, a survey of national and international researchers, and interviews and workshops with local informants. A conceptual framework was drafted and refined with stakeholder feedback. There is a lack of common terminology regarding definitions and concepts related to disinvestment and no guidance for an organisation-wide systematic approach to disinvestment in a local healthcare service. A summary of issues from the literature and respondents highlight the lack of theoretical knowledge and practical experience and provide a guide to the information required to develop future models or methods for disinvestment in the local context. A conceptual framework was developed. Three mechanisms that provide opportunities to introduce disinvestment decisions into health service systems and processes were identified. 
Presented in order of complexity, time to achieve outcomes, and resources required, they include: 1) explicit consideration of potential disinvestment in routine decision-making; 2) proactive decision-making about disinvestment driven by available evidence from published research and local data; and 3) specific exercises in priority setting and system redesign. This framework identifies potential opportunities to initiate disinvestment activities in a systematic integrated approach that can be applied across a whole organisation using transparent, evidence-based methods. Incorporating considerations for disinvestment into existing decision-making systems and processes might be achieved quickly with minimal cost; however, establishment of new systems requires research into appropriate methods and provision of appropriate skills and resources to deliver them.
Predicate calculus, artificial intelligence, and workers' compensation.
Harber, P; McCoy, J M
1989-05-01
Application of principles of predicate calculus (PC) and artificial intelligence (AI) search methods to occupational medicine can meet several goals. First, they can improve understanding of the diagnostic process and recognition of the sources of uncertainty in knowledge and in case-specific information. Second, PC provides a rational means of resolving differences in conclusions based upon the same premises. Third, understanding of these principles allows separation of knowledge (facts) from the process by which it is used, and therefore facilitates development of AI-based expert systems. Application of PC to recognizing causation of pulmonary fibrosis is demonstrated in this paper, providing a method that can be generalized to other problems in occupational medicine. This approach is useful in the diagnosis of occupational lung disease and may be particularly valuable in workers' compensation considerations, wherein explicit statement of the rationale and inferential process is necessary.
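The explicit inferential chains the abstract calls for can be mimicked with a toy forward-chaining engine: rules are (premises, conclusion) pairs, and every derived fact is traceable to the rules that fired. All predicate names below are hypothetical placeholders for illustration, not the paper's actual rule base:

```python
# Each rule: (set of premise facts, derived conclusion)
rules = [
    ({"asbestos_exposure", "pulmonary_fibrosis"}, "possible_asbestosis"),
    ({"possible_asbestosis", "latency_over_15_years"}, "probable_asbestosis"),
]

def infer(facts, rules):
    """Forward-chain: repeatedly fire any rule whose premises all hold,
    until no new facts are derived. The chain of fired rules makes the
    rationale for each conclusion explicit."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

derived = infer({"asbestos_exposure", "pulmonary_fibrosis",
                 "latency_over_15_years"}, rules)
```

Separating the rule base (knowledge) from the `infer` procedure (process) is the PC/AI distinction the paper emphasizes: two experts who disagree can locate the disagreement in a specific premise or rule rather than in an opaque judgment.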
Hawkins, Carlee Beth; Nosek, Brian A
2012-11-01
Reporting an Independent political identity does not guarantee the absence of partisanship. Independents demonstrated considerable variability in relative identification with Republicans versus Democrats as measured by an Implicit Association Test (IAT; M = 0.10, SD = 0.47). To test whether this variation predicted political judgment, participants read a newspaper article describing two competing welfare (Study 1) or special education (Study 2) policies. The authors manipulated which policy was proposed by which party. Among self-proclaimed Independents, those who were implicitly Democratic preferred the liberal welfare plan, and those who were implicitly Republican preferred the conservative welfare plan. Regardless of the policy details, these implicit partisans preferred the policy proposed by "their" party, and this effect occurred more strongly for implicit than explicit plan preference. The authors suggest that implicitly partisan Independents may consciously override some partisan influence when making explicit political judgments, and Independents may identify as such to appear objective even when they are not.
Transient analysis of a thermal storage unit involving a phase change material
NASA Technical Reports Server (NTRS)
Griggs, E. I.; Pitts, D. R.; Humphries, W. R.
1974-01-01
The transient response of a single cell of a typical phase change material type thermal capacitor has been modeled using numerical conductive heat transfer techniques. The cell consists of a base plate, an insulated top, and two vertical walls (fins) forming a two-dimensional cavity filled with a phase change material. Both explicit and implicit numerical formulations are outlined. A mixed explicit-implicit scheme which treats the fin implicitly while treating the phase change material explicitly is discussed. A banded-matrix algorithm is used to reduce computer storage requirements for the implicit approach while retaining a relatively fine grid. All formulations are presented in dimensionless form, thereby enabling application to geometrically similar problems. Typical parametric results are graphically presented for the case of melting with constant heat input to the base of the cell.
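The explicit/implicit distinction in this abstract comes down to stability: an explicit update is cheap per step but limited by the Fourier number, while an implicit update needs a (banded) linear solve but allows larger steps. A minimal explicit FTCS sketch for 1D conduction, without the latent-heat bookkeeping a real phase-change model needs (assumes NumPy; the grid and coefficients are illustrative, not from the paper):

```python
import numpy as np

def explicit_heat_step(T, alpha, dx, dt):
    """One explicit (FTCS) update of the 1D heat equation dT/dt = alpha*d2T/dx2.

    Stable only when the Fourier number r = alpha*dt/dx**2 <= 0.5; an
    implicit scheme trades this restriction for a banded linear solve.
    """
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit step violates the stability limit"
    T_new = T.copy()  # endpoints held fixed (Dirichlet boundaries)
    T_new[1:-1] = T[1:-1] + r * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    return T_new

# Unit-length rod, hot left end (T=1), cold right end (T=0)
T = np.zeros(11)
T[0] = 1.0
for _ in range(100):
    T = explicit_heat_step(T, alpha=1.0, dx=0.1, dt=0.004)  # r = 0.4
```

After enough steps the profile relaxes toward the linear steady state; doubling `dt` here would break the r <= 0.5 limit, which is exactly why the paper's implicit treatment of the fin is attractive.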
NASA Astrophysics Data System (ADS)
Schindelegger, Michael; Quinn, Katherine J.; Ponte, Rui M.
2017-04-01
Numerical modeling of non-tidal variations in ocean currents and bottom pressure has played a key role in closing the excitation budget of Earth's polar motion for a wide range of periodicities. Non-negligible discrepancies between observations and model accounts of pole position changes prevail, however, on sub-monthly time scales and call for examination of hydrodynamic effects usually omitted in general circulation models. Specifically, complete hydrodynamic cores must incorporate self-attraction and loading (SAL) feedbacks on redistributed water masses, an effect that produces ocean bottom pressure perturbations of typically about 10% relative to the computed mass variations. Here, we report on a benchmark simulation with a near-global, barotropic forward model forced by wind stress, atmospheric pressure, and a properly calculated SAL term. The latter is obtained by decomposing ocean mass anomalies on a 30-minute grid into spherical harmonics at each time step and applying Love numbers to account for seafloor deformation and changed gravitational attraction. The increase in computational time at each time step is on the order of 50%. Preliminary results indicate that the explicit consideration of SAL in the forward runs increases the fidelity of modeled polar motion excitations, in particular on time scales shorter than 5 days, as evident from cross-spectral comparisons with geodetic excitation. Definite conclusions regarding the relevance of SAL in simulating rapid polar motion are, however, still hampered by the model's incomplete domain representation, which excludes parts of the highly energetic Arctic Ocean.
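The SAL term described above applies, to each spherical-harmonic degree n of the ocean mass anomaly, an admittance built from load Love numbers. A sketch of the standard loading-theory factor (textbook formula; the model's actual implementation may differ in conventions and constants, and the Love numbers below are approximate illustrative values):

```python
def sal_admittance(n, k_load, h_load, rho_ocean=1025.0, rho_earth=5517.0):
    """Degree-n factor converting an ocean mass-anomaly harmonic into its
    self-attraction-and-loading contribution (standard loading theory):

        alpha_n = (3 * rho_ocean / rho_earth) * (1 + k'_n - h'_n) / (2n + 1)

    k'_n and h'_n are load Love numbers capturing the perturbed gravitational
    attraction and the elastic seafloor deformation, respectively.
    """
    return 3.0 * rho_ocean / rho_earth * (1.0 + k_load - h_load) / (2 * n + 1)

# Degree-2 example with commonly tabulated (approximate) load Love numbers
alpha2 = sal_admittance(2, k_load=-0.31, h_load=-1.00)
```

Because the factor is applied per degree at every time step, the harmonic decomposition of the full 30-minute grid dominates the roughly 50% cost increase the authors report.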
Fully implicit Particle-in-cell algorithms for multiscale plasma simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chacon, Luis
The outline of the paper is as follows: Particle-in-cell (PIC) methods for fully ionized collisionless plasmas; explicit vs. implicit PIC; 1D electrostatic implicit PIC (charge and energy conservation, moment-based acceleration); and generalization to multi-D electromagnetic PIC with the Vlasov-Darwin model (review and motivation for the Darwin model; conservation properties for energy, charge, and canonical momenta; and numerical benchmarks). The author demonstrates a fully implicit, fully nonlinear, multidimensional PIC formulation that features exact local charge conservation (via a novel particle mover strategy), exact global energy conservation (no particle self-heating or self-cooling), an adaptive particle orbit integrator to control errors in momentum conservation, and conservation of canonical momenta (EM-PIC only, reduced dimensionality). The approach is free of numerical instabilities even when ω_pe Δt >> 1 and Δx >> λ_D. It requires many fewer degrees of freedom (vs. explicit PIC) for comparable accuracy in challenging problems, and significant CPU gains over explicit PIC have been demonstrated. The method has much potential for efficiency gains vs. explicit PIC in long-time-scale applications. Moment-based acceleration is effective in minimizing N_FE, leading to an optimal algorithm.
NASA Technical Reports Server (NTRS)
Kumar, A.; Rudy, D. H.; Drummond, J. P.; Harris, J. E.
1982-01-01
Several two- and three-dimensional external and internal flow problems solved on the STAR-100 and CYBER-203 vector processing computers are described. The flow field was described by the full Navier-Stokes equations, which were solved by explicit finite-difference algorithms. Problem results and computer system requirements are presented. Program organization and database structure for three-dimensional computer codes, which eliminate or reduce page faulting, are discussed. Storage requirements for three-dimensional codes are reduced by calculating transformation metric data at each step. As a result, the number of in-core grid points was increased by 50% to 150,000, with a 10% increase in execution time. An assessment of current and future machine requirements shows that even on the CYBER-205 computer only a few problems can be solved realistically. Estimates reveal that the present situation is more storage-limited than compute-rate-limited, but advancements in both storage and speed are essential to realistically calculate three-dimensional flow.
Mazor, Tessa; Possingham, Hugh P.; Edelist, Dori; Brokovich, Eran; Kark, Salit
2014-01-01
Successful implementation of marine conservation plans is largely inhibited by inadequate consideration of the broader social and economic context within which conservation operates. Marine waters and their biodiversity are shared by a host of stakeholders, such as commercial fishers, recreational users and offshore developers. Hence, to improve implementation success of conservation plans, we must incorporate other marine activities while explicitly examining trade-offs that may be required. In this study, we test how the inclusion of multiple marine activities can shape conservation plans. We used the entire Mediterranean territorial waters of Israel as a case study to compare four planning scenarios with increasing levels of complexity, where additional zones, threats and activities were added (e.g., commercial fisheries, hydrocarbon exploration interests, aquaculture, and shipping lanes). We applied the marine zoning decision support tool Marxan to each planning scenario and tested a) the ability of each scenario to reach biodiversity targets, b) the change in opportunity cost and c) the alteration of spatial conservation priorities. We found that by including increasing numbers of marine activities and zones in the planning process, greater compromises are required to reach conservation objectives. Complex plans with more activities incurred greater opportunity cost and did not reach biodiversity targets as easily as simplified plans with less marine activities. We discovered that including hydrocarbon data in the planning process significantly alters spatial priorities. For the territorial waters of Israel we found that in order to protect at least 10% of the range of 166 marine biodiversity features there would be a loss of ∼15% of annual commercial fishery revenue and ∼5% of prospective hydrocarbon revenue. 
This case study follows an illustrated framework for adopting a transparent systematic process to balance biodiversity goals and economic considerations within a country's territorial waters. PMID:25102177
NASA Astrophysics Data System (ADS)
Riley, W. J.; Dwivedi, D.; Ghimire, B.; Hoffman, F. M.; Pau, G. S. H.; Randerson, J. T.; Shen, C.; Tang, J.; Zhu, Q.
2015-12-01
Numerical model representations of decadal- to centennial-scale soil-carbon dynamics are a dominant cause of uncertainty in climate change predictions. Recent attempts by some Earth System Model (ESM) teams to integrate previously unrepresented soil processes (e.g., explicit microbial processes, abiotic interactions with mineral surfaces, vertical transport), poor performance of many ESM land models against large-scale and experimental manipulation observations, and complexities associated with spatial heterogeneity highlight the nascent nature of our community's ability to accurately predict future soil carbon dynamics. I will present recent work from our group to develop a modeling framework to integrate pore-, column-, watershed-, and global-scale soil process representations into an ESM (ACME), and apply the International Land Model Benchmarking (ILAMB) package for evaluation. At the column scale and across a wide range of sites, observed depth-resolved carbon stocks and their 14C derived turnover times can be explained by a model with explicit representation of two microbial populations, a simple representation of mineralogy, and vertical transport. Integrating soil and plant dynamics requires a 'process-scaling' approach, since all aspects of the multi-nutrient system cannot be explicitly resolved at ESM scales. I will show that one approach, the Equilibrium Chemistry Approximation, improves predictions of forest nitrogen and phosphorus experimental manipulations and leads to very different global soil carbon predictions. Translating model representations from the site- to ESM-scale requires a spatial scaling approach that either explicitly resolves the relevant processes, or more practically, accounts for fine-resolution dynamics at coarser scales. To that end, I will present recent watershed-scale modeling work that applies reduced order model methods to accurately scale fine-resolution soil carbon dynamics to coarse-resolution simulations. 
Finally, we contend that creating believable soil carbon predictions requires a robust, transparent, and community-available benchmarking framework. I will present an ILAMB evaluation of several of the above-mentioned approaches in ACME, and attempt to motivate community adoption of this evaluation approach.
Lorenz, Marco; Fürst, Christine; Thiel, Enrico
2013-09-01
Given increasing pressures from global societal and climate change, the assessment of the impact of land use and land management practices on land degradation and the related decrease in the sustainable provision of ecosystem services is gaining interest. Existing approaches to assessing agricultural practices focus on single crops or statistical data, because spatially explicit information on practically applied crop rotations is mostly unavailable. This provokes considerable uncertainties in crop production models, as regional specifics must be neglected or cannot be considered in an appropriate way. In a case study in Saxony, we developed an approach to (i) derive representative regional crop rotations by combining different data sources and expert knowledge. This includes the integration of innovative crop sequences related to bio-energy production or organic farming, and of different soil tillage, soil management and soil protection techniques. Furthermore, (ii) we developed a regionalization approach for transferring crop rotations and related soil management strategies on the basis of statistical data and spatially explicit data taken from so-called field blocks. These field blocks are the smallest spatial entity for which agricultural practices must be reported when applying for agricultural funding within the frame of the European Agricultural Fund for Rural Development (EAFRD) program. The information was finally integrated into the spatial decision support tool GISCAME to assess and visualize, in a spatially explicit manner, the impact of alternative agricultural land use strategies on soil erosion risk and ecosystem services provision. The objective of this paper is to present the approach for creating spatially explicit information on agricultural management practices for a study area around Dresden, the capital of the German Federal State of Saxony. Copyright © 2013 Elsevier Ltd. All rights reserved.
Dupas, Laura; Massire, Aurélien; Amadon, Alexis; Vignaud, Alexandre; Boulant, Nicolas
2015-06-01
The spokes method combined with parallel transmission is a promising technique to mitigate the B1(+) inhomogeneity at ultra-high field in 2D imaging. To date, however, spokes placement optimization combined with magnitude least squares pulse design has never been performed in direct conjunction with the explicit Specific Absorption Rate (SAR) and hardware constraints. In this work, the joint optimization of 2-spoke trajectories and RF subpulse weights is performed under these constraints explicitly and in the small tip angle regime. The problem is first considerably simplified by observing that only the vector between the 2 spokes is relevant in the magnitude least squares cost function, thereby reducing the size of the parameter space and allowing a more exhaustive search. The algorithm starts from a set of initial k-space candidates and, for all of them in parallel, optimizes the RF subpulse weights and the k-space locations simultaneously, under explicit SAR and power constraints, using an active-set algorithm. Because the dimensionality of the spoke placement parameter space is low, the RF pulse performance is computed for every location in k-space to study the robustness of the proposed approach with respect to initialization, by examining the probability of converging towards a possible global minimum. Moreover, the optimization of the spoke placement is repeated with an increased pulse bandwidth in order to investigate the impact of the constraints on the result. Bloch simulations and in vivo T2(∗)-weighted images acquired at 7 T validate the approach. The algorithm returns simulated normalized root mean square errors systematically smaller than 5% in 10 s. Copyright © 2015 Elsevier Inc. All rights reserved.
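The magnitude least squares subproblem described here, fitting a target flip-angle magnitude while leaving its phase free, under an explicit power constraint, can be sketched as a small constrained optimization. The system matrix, target profile, and power bound below are illustrative placeholders, not the paper's 7 T data or its active-set solver:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_vox, n_ch = 50, 8                       # voxels, transmit channels
A = rng.normal(size=(n_vox, n_ch)) + 1j * rng.normal(size=(n_vox, n_ch))
target = np.ones(n_vox)                   # desired |flip angle| profile (a.u.)
p_max = 4.0                               # illustrative total-power bound

def weights(x):
    return x[:n_ch] + 1j * x[n_ch:]       # real parametrization of complex weights

def cost(x):
    # magnitude least squares: only the magnitude of the profile is fitted
    return np.sum((np.abs(A @ weights(x)) - target) ** 2)

def power_margin(x):
    w = weights(x)
    return p_max - np.real(w.conj() @ w)  # >= 0 when the power constraint holds

x0 = np.zeros(2 * n_ch)
x0[0] = 0.1                               # small feasible starting point
res = minimize(cost, x0, method="SLSQP",
               constraints=[{"type": "ineq", "fun": power_margin}])
```

Dropping the phase constraint is what makes the problem non-convex, which is why the paper's exhaustive search over the low-dimensional spoke-placement space matters for avoiding local minima.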
Commentary: Writing and Evaluating Qualitative Research Reports.
Wu, Yelena P; Thompson, Deborah; Aroian, Karen J; McQuaid, Elizabeth L; Deatrick, Janet A
2016-06-01
To provide an overview of qualitative methods, particularly for reviewers and authors who may be less familiar with qualitative research. A question and answer format is used to address considerations for writing and evaluating qualitative research. When producing qualitative research, individuals are encouraged to address the qualitative research considerations raised and to explicitly identify the systematic strategies used to ensure rigor in study design and methods, analysis, and presentation of findings. Increasing capacity for review and publication of qualitative research within pediatric psychology will advance the field's ability to gain a better understanding of the specific needs of pediatric populations, tailor interventions more effectively, and promote optimal health. © The Author 2016. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
A Global Mitigation Hierarchy for Nature Conservation
Bull, Joseph W; Addison, Prue F E; Burgass, Michael J; Gianuca, Dimas; Gorham, Taylor M; Jacob, Céline; Watson, James E M; Wilcox, Chris; Milner-Gulland, E J
2018-01-01
Efforts to conserve biodiversity comprise a patchwork of international goals, national-level plans, and local interventions that, overall, are failing. We discuss the potential utility of applying the mitigation hierarchy, widely used during economic development activities, to all negative human impacts on biodiversity. Evaluating all biodiversity losses and gains through the mitigation hierarchy could help prioritize consideration of conservation goals and drive the empirical evaluation of conservation investments through the explicit consideration of counterfactual trends and ecosystem dynamics across scales. We explore the challenges in using this framework to achieve global conservation goals, including operationalization and monitoring and compliance, and we discuss solutions and research priorities. The mitigation hierarchy's conceptual power and ability to clarify thinking could provide the step change needed to integrate the multiple elements of conservation goals and interventions in order to achieve successful biodiversity outcomes. PMID:29731513
NASA Technical Reports Server (NTRS)
Poniatowski, Karen
2005-01-01
Contents include the following: Overview/Introduction. Roadmap Approach/Considerations. Roadmap Timeline/Spirals. Requirements Development. Spaceport/Range Capabilities. Mixed Range Architecture. User Requirements/Customer Considerations. Manifest Considerations. Emerging Launch User Requirements. Capability Breakdown Structure/Assessment. Roadmap Team Observations. Transformational Range Test Concept. Roadmap Team Conclusions. Next Steps.
NASA Astrophysics Data System (ADS)
Valorso, Richard; Raventos-Duran, Teresa; Aumont, Bernard; Camredon, Marie; Ng, Nga L.; Seinfeld, John H.
2010-05-01
The evaluation of the impacts of secondary organics on pollution episodes, climate and the tropospheric oxidizing capacity requires modelling tools that track the identity and reactivity of organic carbon through the various stages down to the ultimate oxidation products. The fully explicit representation of hydrocarbon oxidation, from the initial compounds to the final product CO2, requires a very large number of chemical reactions and intermediate species, far in excess of the number that can reasonably be written manually. We developed a "self-generating approach" to explicitly describe (i) the gas phase oxidation schemes of organic compounds under general tropospheric conditions and (ii) the partitioning of secondary organics between gas and condensed phases. This approach was codified in a computer program, GECKO-A (Generator for Explicit Chemistry and Kinetics of Organics in the Atmosphere). This method allows prediction of the multiphase mass budget from first principles. However, due to computational limitations, fully explicit chemical schemes can only be generated for species up to C8. We recently implemented a reduction protocol in GECKO-A to allow the generation of oxidation schemes for long chain organics. This protocol was applied to develop highly detailed oxidation schemes for biogenic compounds. The relevance of the generated schemes was assessed using experiments performed in the Caltech smog chamber under various NOx conditions. The first results show a systematic overestimation of the simulated SOA concentrations by GECKO-A. Several hypotheses were tested to find the origin of the discrepancies between model and measurements.
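The gas-particle partitioning step can be illustrated with the standard equilibrium absorptive-partitioning relation (Pankow-type), in which each species' particle-phase fraction depends on its effective saturation concentration relative to the total organic aerosol mass. The numbers below are illustrative, not GECKO-A output:

```python
import numpy as np

def particle_fraction(c_star, c_oa):
    # Equilibrium fraction of a species in the condensed phase:
    # F_p = 1 / (1 + C*_i / C_OA)
    return 1.0 / (1.0 + c_star / c_oa)

c_star = np.array([0.1, 1.0, 10.0, 100.0])  # effective saturation conc. (ug/m^3)
fp = particle_fraction(c_star, c_oa=10.0)   # at 10 ug/m^3 organic aerosol mass
```

A species with C* equal to the organic aerosol loading sits half in each phase; more volatile species (larger C*) stay predominantly in the gas phase.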
Global agriculture and carbon trade-offs
Johnson, Justin Andrew; Runge, Carlisle Ford; Senauer, Benjamin; Foley, Jonathan; Polasky, Stephen
2014-01-01
Feeding a growing and increasingly affluent world will require expanded agricultural production, which may require converting grasslands and forests into cropland. Such conversions can reduce carbon storage, habitat provision, and other ecosystem services, presenting difficult societal trade-offs. In this paper, we use spatially explicit data on agricultural productivity and carbon storage in a global analysis to find where agricultural extensification should occur to meet growing demand while minimizing carbon emissions from land use change. Selective extensification saves ∼6 billion metric tons of carbon compared with a business-as-usual approach, with a value of approximately $1 trillion (2012 US dollars) using recent estimates of the social cost of carbon. This type of spatially explicit geospatial analysis can be expanded to include other ecosystem services and other industries to analyze how to minimize conflicts between economic development and environmental sustainability. PMID:25114254
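The paper's core idea, meeting a production target while minimizing carbon loss from land conversion, can be sketched as ranking candidate parcels by carbon emitted per unit of production and converting the cheapest first. The parcel data below are synthetic stand-ins, not the study's global productivity and carbon-storage grids:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
yield_t = rng.uniform(1.0, 10.0, n)    # attainable production per parcel (a.u.)
carbon_t = rng.uniform(5.0, 100.0, n)  # carbon released if the parcel is converted (a.u.)
demand = 0.2 * yield_t.sum()           # extra production required

# Selective extensification: convert parcels in order of carbon emitted
# per unit of production until the demand is met.
order = np.argsort(carbon_t / yield_t)
k = int(np.searchsorted(np.cumsum(yield_t[order]), demand)) + 1
chosen = order[:k]
smart_carbon = carbon_t[chosen].sum()

# Business-as-usual stand-in: convert parcels in arbitrary (index) order.
k_bau = int(np.searchsorted(np.cumsum(yield_t), demand)) + 1
bau_carbon = carbon_t[:k_bau].sum()
```

The gap between `smart_carbon` and `bau_carbon` is the analogue of the paper's ~6 billion metric ton saving, here on toy data.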
Awareness-based game-theoretic space resource management
NASA Astrophysics Data System (ADS)
Chen, Genshe; Chen, Huimin; Pham, Khanh; Blasch, Erik; Cruz, Jose B., Jr.
2009-05-01
Over recent decades, the space environment has become more complex, with a significant increase in space debris and a greater density of spacecraft, posing great difficulties to efficient and reliable space operations. In this paper we present a Hierarchical Sensor Management (HSM) method for space operations that (a) accommodates awareness modeling and updating and (b) supports collaborative search and tracking of space objects. The basic approach is as follows. First, partition the relevant region of interest into district cells. Second, initialize and model the dynamics of each cell with awareness and object covariance according to prior information. Third, explicitly assign sensing resources to objects with user-specified requirements. Note that when an object responds intelligently to the sensing event, the sensor assigned to observe it may switch from time to time between a strong, active signal mode and a passive mode to maximize the total amount of information obtained over a multi-step time horizon while avoiding risk. Fourth, if all explicitly specified requirements are satisfied and sensing resources remain available, assign the additional resources to objects without explicitly specified requirements via an information-based approach. Finally, apply sensor scheduling to each sensor-object or sensor-cell pair according to the object type. We demonstrate our method on a realistic space resource management scenario using NASA's General Mission Analysis Tool (GMAT) for space object search and tracking with multiple space-borne observers.
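The information-based assignment of leftover sensing resources (the fourth step above) can be sketched with a greedy rule that repeatedly gives the next sensor to the object whose Gaussian state entropy would shrink the most. The object variances and noise level below are invented for illustration, not the paper's awareness model:

```python
import numpy as np

rng = np.random.default_rng(2)
n_objects, n_sensors = 6, 3
prior_var = rng.uniform(1.0, 25.0, n_objects)  # prior state variance per object
meas_var = 0.5                                 # scalar sensor noise variance

def info_gain(v):
    # Entropy reduction of a Gaussian state after one scalar measurement:
    # 0.5 * log(prior variance / posterior variance).
    post = 1.0 / (1.0 / v + 1.0 / meas_var)
    return 0.5 * np.log(v / post)

var = prior_var.copy()
assignment = []
for _ in range(n_sensors):
    j = int(np.argmax(info_gain(var)))              # most informative object next
    assignment.append(j)
    var[j] = 1.0 / (1.0 / var[j] + 1.0 / meas_var)  # Kalman-type variance update
```

Because the gain is monotone in the current variance, the greedy rule naturally spreads sensors across the most uncertain objects first.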
Arnold, E N
1990-05-22
Phylogenies based on morphology vary considerably in their quality: some are robust and explicit with little conflict in the data set, whereas others are far more tenuous, with much conflict and many possible alternatives. The primary reasons for untrue or inexplicit morphological phylogenies are: not enough characters developed between branching points; uncertain character polarity; poorly differentiated character states; homoplasy caused by parallelism or reversal; and extinction, which may remove species entirely from consideration and can make originally conflicting data sets misleadingly compatible, increasing congruence at the expense of truth. Extinction differs from the other confounding factors in not being apparent either in the data set or in subsequent analysis. One possibility is that variation in the quality of morphological phylogenies results from exposure to different ecological situations. To investigate this, it is necessary to compare the histories of the clades concerned. In the case of explicit morphological phylogenies, ecological and behavioural data can be integrated with them, and it may then be possible to decide whether morphological characters are likely to have been elicited by the environments through which the clade has passed. The credibility of such results depends not only on the phylogeny being robust but also on its detailed topology: a pectinate phylogeny will often allow more certain and more explicit statements to be made about historical events. In the case of poor phylogenies, it is not possible to produce detailed histories, but they can be compared with robust phylogenies in the range of ecological situations occupied, and in whether they occupy novel situations in comparison with their outgroups. LeQuesne testing can give information about niche homoplasy, and it may also be possible to see whether morphological features are functionally associated with ecological parameters, even if the direction of change is unknown.
Examination of the robust and explicit phylogeny of the semaphore geckoes (Pristurus) suggests that its quality does stem from a variety of environmental factors. The group has progressed along an ecological continuum, passing through a series of increasingly severe niches that appear to have elicited many morphological changes. The fact that niches are progressively filled reduces the likelihood of species reinvading a previous one with related character reversal. Because the niches of advanced Pristurus are virtually unique within the Gekkonidae, the morphological changes produced are also very rare and therefore easy to polarize. Ecological changes on the main stem of the phylogeny are abrupt and associated character states consequently well differentiated.
Practical considerations in the development of hemoglobin-based oxygen therapeutics.
Kim, Hae Won; Estep, Timothy N
2012-09-01
The development of hemoglobin-based oxygen therapeutics (HBOCs) requires consideration of a number of factors. While the enabling technology derives from fundamental research on protein biochemistry and biological interactions, translation of these research insights into usable medical therapeutics demands considerable technical expertise and the reconciliation of a myriad of manufacturing, medical, and regulatory requirements. The HBOC development challenge is further exacerbated by the extremely high intravenous doses required for many of the indications contemplated for these products, which in turn imply that an extremely high level of purity is required. This communication discusses several of the important product configuration and developmental considerations that affect the translation of fundamental research discoveries on HBOCs into usable medical therapeutics.
ERIC Educational Resources Information Center
Siegfried, William D.
1982-01-01
To determine effectiveness of instructions designed to reduce sex discrimination in employment interviews, students were asked to rate resumes for a male and a female applicant under different instructional conditions. Results suggested that: legal warnings may bias ratings in favor of male applicants; and specifying job requirements reduces…
Mahar, Alyson L.; Compton, Carolyn; McShane, Lisa M.; Halabi, Susan; Asamura, Hisao; Rami-Porta, Ramon; Groome, Patti A.
2015-01-01
Introduction: Accurate, individualized prognostication for lung cancer patients requires the integration of standard patient and pathologic factors with biologic, genetic, and other molecular characteristics of the tumor. Clinical prognostic tools aim to aggregate information on an individual patient to predict disease outcomes such as overall survival, but little is known about their clinical utility and accuracy in lung cancer. Methods: A systematic search of the scientific literature for clinical prognostic tools in lung cancer published Jan 1, 1996-Jan 27, 2015 was performed. In addition, web-based resources were searched. A priori criteria determined by the Molecular Modellers Working Group of the American Joint Committee on Cancer were used to investigate the quality and usefulness of tools. Criteria included clinical presentation, model development approaches, validation strategies, and performance metrics. Results: Thirty-two prognostic tools were identified. Patients with metastases were the most frequently considered population in non-small cell lung cancer. All tools for small cell lung cancer covered that entire patient population. Included prognostic factors varied considerably across tools. Internal validity was not formally evaluated for most tools, and only eleven were evaluated for external validity. Two key considerations were highlighted for tool development: identification of an explicit purpose related to a relevant clinical population and clear decision points, and prioritized inclusion of established prognostic factors over emerging factors. Conclusions: Prognostic tools will contribute more meaningfully to the practice of personalized medicine if better study design and analysis approaches are used in their development and validation. PMID:26313682
Iskrov, Georgi; Dermendzhiev, Svetlan; Miteva-Katrandzhieva, Tsonka; Stefanov, Rumen
2016-01-01
Assessment and appraisal of new medical technologies require a balance between the interests of different stakeholders. The final decision should take into account the societal value of new therapies. This perspective paper discusses the socio-economic burden of disease as a specific reimbursement decision-making criterion and calls for its inclusion as a counterbalance to the cost-effectiveness and budget impact criteria. Socio-economic burden is a decision-making criterion that accounts for the diseases for which the assessed medical technology is indicated. This indicator is usually researched through cost-of-illness studies that systematically quantify the socio-economic burden of diseases on the individual and on society. This is a very important consideration, as it illustrates the direct budgetary consequences of diseases in the health system and the indirect costs associated with patient or carer productivity losses. By measuring and comparing the socio-economic burden of different diseases to society, health authorities and payers could benefit in optimizing priority setting and resource allocation. New medical technologies, especially innovative therapies, present an excellent case study for the inclusion of socio-economic burden in reimbursement decision-making. Assessment and appraisal have so far concentrated largely on cost-effectiveness and budget impact, marginalizing all other considerations. In this context, data on disease burden and the inclusion of an explicit criterion of socio-economic burden in reimbursement decision-making may be highly beneficial. Realizing the magnitude of the lost socio-economic contribution resulting from the diseases in question could be a reasonable way for policy makers to accept a higher valuation of innovative therapies.
Development and necessary norms of reasoning
Markovits, Henry
2014-01-01
The question of whether reasoning can, or should, be described by a single normative model is an important one. In the following, I combine epistemological considerations taken from Piaget's notion of genetic epistemology, a hypothesis about the role of reasoning in communication, and developmental data to argue that some basic logical principles are in fact highly normative. I argue here that explicit, analytic human reasoning, in contrast to intuitive reasoning, uniformly relies on a form of validity that allows distinguishing between valid and invalid arguments based on the existence of counterexamples to conclusions. PMID:24904501
Anthropology and affect: a consideration of the idiosyncratic dimension of human behaviour.
Izmirlian, H
1977-01-01
In this paper a theoretical perspective is presented in which affect occupies a central position and behaviour is viewed in terms of different degrees of affective expression. Such behaviour is conceptualized in terms of three models: a structural model, a rational model and a psychological model. While the first two models are frequently encountered in the literature, the psychological model has not received explicit formulation, although, as shown here, it is crucial in understanding certain forms of idiosyncratic behaviour that have political and social relevance.
Robust root clustering for linear uncertain systems using generalized Lyapunov theory
NASA Technical Reports Server (NTRS)
Yedavalli, R. K.
1993-01-01
Consideration is given to the problem of matrix root clustering in subregions of a complex plane for linear state space models with real parameter uncertainty. The nominal matrix root clustering theory of Gutman and Jury (1981) using the generalized Lyapunov equation is extended to the perturbed matrix case, and bounds are derived on the perturbation to maintain root clustering inside a given region. The theory makes it possible to obtain an explicit relationship between the parameters of the root clustering region and the uncertainty range of the parameter space.
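For the simplest clustering region, a shifted left half-plane Re(λ) < -α, the Lyapunov-based test reduces to checking a standard Lyapunov equation for the shifted matrix: eigenvalues lie in the region exactly when the equation has a symmetric positive definite solution. A minimal sketch (the matrix and shift are illustrative, and the paper's perturbation bounds are not reproduced):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A = np.array([[-3.0, 1.0],
              [0.0, -4.0]])
alpha = 1.0                      # clustering region: Re(lambda) < -alpha

# Eigenvalues of A lie in the region iff A + alpha*I is Hurwitz, i.e. the
# Lyapunov equation (A + alpha I)^T P + P (A + alpha I) = -Q, Q > 0, has a
# symmetric positive definite solution P.
A_shift = A + alpha * np.eye(2)
Q = np.eye(2)
P = solve_continuous_lyapunov(A_shift.T, -Q)   # solves A_shift^T P + P A_shift = -Q
in_region = bool(np.all(np.linalg.eigvalsh((P + P.T) / 2) > 0))
```

Here the eigenvalues of `A` are -3 and -4, both left of -1, so the test succeeds; other regions (discs, sectors) lead to generalized Lyapunov equations of the kind the paper extends to the uncertain case.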
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakano, Hiroshi; Elements Strategy Initiative for Catalysts and Batteries, Kyoto University, Kyoto 615-8245
2015-12-31
Electronic polarization effects of a medium can have a significant impact on a chemical reaction in condensed phases. We discuss the effects on the charge transfer excitation of a chromophore, N,N-dimethyl-4-nitroaniline, in various solvents using the mean-field QM/MM method with a polarizable force field. The results show that the explicit consideration of the solvent electronic polarization effects is important especially for a solvent with a low dielectric constant when we study the solvatochromism of the chromophore.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Furuuchi, Kazuyuki; Sperling, Marcus, E-mail: kazuyuki.furuuchi@manipal.edu, E-mail: marcus.sperling@univie.ac.at
2017-05-01
We study quantum tunnelling in Dante's Inferno model of large field inflation. Such a tunnelling process, which will terminate inflation, becomes problematic if the tunnelling rate is rapid compared to the Hubble time scale at the time of inflation. Consequently, we constrain the parameter space of Dante's Inferno model by demanding a suppressed tunnelling rate during inflation. The constraints are derived and explicit numerical bounds are provided for representative examples. Our considerations are at the level of an effective field theory; hence, the presented constraints have to hold regardless of any UV completion.
Moore, Quianta L.; Paul, Mary E.; McGuire, Amy L.
2016-01-01
Whether adolescents can participate in clinical trials of pharmacologic therapies for HIV prevention, such as preexposure prophylaxis, without parental permission hinges on state minor consent laws. Very few of these laws explicitly authorize adolescents to consent to preventive services for HIV and other sexually transmitted infections. Unclear state laws may lead to research cessation. We have summarized legal, ethical, and policy considerations related to adolescents’ participation in HIV and sexually transmitted infection prevention research in the United States, and we have explored strategies for facilitating adolescents’ access. PMID:26562103
1979-01-01
Chen, Jianyong; Wang, Xuan; Zhang, Meng; Zhang, Feng; Shen, Mowei
2015-06-01
While drug-related contexts have been shown to influence drug users' implicit and explicit drug-related cognitions, this has been minimally explored in heroin abusers. This study examined the effect of heroin-related cue exposure on implicit and explicit valence and arousal-sedation associations with heroin use in abstinent heroin abusers. In Experiment 1, 39 male abstinent heroin abusers were exposed to heroin-related words and reported cravings before and after cue exposure. They subsequently performed two Extrinsic Affective Simon Tasks (EASTs), which were used to assess implicit valence and arousal-sedation associations with heroin use. Thirty-six male abstinent heroin abusers (controls) performed only the two EASTs. All participants completed measures of explicit expectancy regarding heroin use. In Experiment 2, 28 newly recruited abstinent heroin abusers were exposed to heroin-related pictures and completed the same implicit and explicit measures used in Experiment 1. A non-significant increase in craving after cue exposure was observed. While participants exposed to heroin-related words or pictures exhibited more positive implicit heroin use associations (relative to negative associations), a trend not observed in controls, this difference was not significant across groups. Participants still indicated negative explicit associations with heroin use after cue exposure. Exposure to cues significantly accelerated arousal and sedation responses. Whether cue exposure can change self-reported craving requires further study in abstinent heroin abusers. The exclusively male sample limits generalization of the results. The present findings extend the evidence on whether implicit and explicit heroin-related cognitions are susceptible to context. Copyright © 2014 Elsevier Ltd. All rights reserved.
Jump conditions in transonic equilibria
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guazzotto, L.; Betti, R.; Jardin, S. C.
2013-04-15
In the present paper, the numerical calculation of transonic equilibria, first introduced with the FLOW code in Guazzotto et al. [Phys. Plasmas 11, 604 (2004)], is critically reviewed. In particular, the necessity and effect of imposing explicit jump conditions at the transonic discontinuity are investigated. It is found that "standard" (low-β, large aspect ratio) transonic equilibria satisfy the correct jump condition to a very good approximation even if the jump condition is not explicitly imposed. On the other hand, it is also found that high-β, low aspect ratio equilibria require the correct jump condition to be explicitly imposed. Various numerical approaches are described to modify FLOW to include the jump condition. It is proved that the new methods converge to the correct solution even in extreme cases of very large β, while they agree with the results obtained with the old implementation of FLOW in lower-β equilibria.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Guangye; Chacon, Luis; Knoll, Dana Alan
2015-07-31
A multi-rate PIC formulation was developed that employs large timesteps for slow field evolution, and small (adaptive) timesteps for particle orbit integrations. The implementation is based on a JFNK solver with nonlinear elimination and moment preconditioning. The approach is free of numerical instabilities (ω_pe Δt >> 1, and Δx >> λ_D), and requires many fewer degrees of freedom (vs. explicit PIC) for comparable accuracy in challenging problems. Significant gains (vs. conventional explicit PIC) may be possible for large scale simulations. The paper is organized as follows: Vlasov-Maxwell particle-in-cell (PIC) methods for plasmas; explicit, semi-implicit, and implicit time integrations; the implicit PIC formulation (Jacobian-free Newton-Krylov (JFNK) with nonlinear elimination allows different treatments of disparate scales, and discrete conservation properties (energy, charge, canonical momentum, etc.)); some numerical examples; and a summary.
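A Jacobian-free Newton-Krylov solve of an implicit time step can be illustrated on a toy problem: the implicit midpoint rule for a harmonic oscillator stays stable and conserves the quadratic energy at a step size far above the explicit leapfrog stability limit (dt = 3 versus dt < 2). This is a stand-in sketch, not the paper's multi-rate PIC formulation or its moment preconditioner:

```python
import numpy as np
from scipy.optimize import newton_krylov

def implicit_midpoint_step(u, dt):
    # Residual of the implicit midpoint rule for u' = f(u), with
    # f(x, v) = (v, -x) (harmonic oscillator); solved Jacobian-free.
    def residual(u_new):
        um = 0.5 * (u + u_new)
        f = np.array([um[1], -um[0]])
        return u_new - u - dt * f
    return newton_krylov(residual, u)

u = np.array([1.0, 0.0])
energy0 = 0.5 * float(u @ u)
for _ in range(20):
    u = implicit_midpoint_step(u, 3.0)   # dt = 3: explicit leapfrog would blow up
energy = 0.5 * float(u @ u)
```

The Krylov iteration inside `newton_krylov` only ever applies the residual, never forms a Jacobian, which is the property that lets implicit PIC treat disparate scales without assembling the full field-particle coupling.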
NASA Astrophysics Data System (ADS)
Zhang, Ruili; Wang, Yulei; He, Yang; Xiao, Jianyuan; Liu, Jian; Qin, Hong; Tang, Yifa
2018-02-01
Relativistic dynamics of a charged particle in time-dependent electromagnetic fields has theoretical significance and a wide range of applications. The numerical simulation of relativistic dynamics is often multi-scale and requires accurate long-term numerical simulations. Therefore, explicit symplectic algorithms are much preferable to non-symplectic methods and implicit symplectic algorithms. In this paper, we employ the proper time and express the Hamiltonian as the sum of exactly solvable terms and product-separable terms in space-time coordinates. Then, we give explicit symplectic algorithms based on generating functions of orders 2 and 3 for the relativistic dynamics of a charged particle. The methodology itself is not new, having been applied to the non-relativistic dynamics of charged particles, but the algorithm for relativistic dynamics has much significance in practical simulations, such as the secular simulation of runaway electrons in tokamaks.
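The sum-of-solvable-terms idea can be illustrated in the simplest non-relativistic setting: for a separable H(q, p) = p²/2 + V(q), each sub-Hamiltonian generates an exactly solvable flow, and composing them (Strang splitting, i.e. leapfrog) yields an explicit symplectic integrator with bounded long-term energy error. The pendulum below is illustrative only, not the paper's relativistic proper-time algorithm:

```python
import numpy as np

def strang_step(q, p, dt, dV):
    p = p - 0.5 * dt * dV(q)   # half kick: exact flow of H_V = V(q)
    q = q + dt * p             # drift: exact flow of H_T = p**2 / 2
    p = p - 0.5 * dt * dV(q)   # half kick
    return q, p

dV = np.sin                     # pendulum: V(q) = -cos(q), so V'(q) = sin(q)
q, p, dt = 1.0, 0.0, 0.05
e0 = 0.5 * p**2 - np.cos(q)
for _ in range(20000):          # integrate to t = 1000
    q, p = strang_step(q, p, dt, dV)
drift = abs((0.5 * p**2 - np.cos(q)) - e0)
```

Unlike a non-symplectic scheme, the energy error here oscillates at the O(dt²) level without secular growth, which is what makes such methods attractive for the long runaway-electron simulations mentioned above.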
Meromorphic solutions of recurrence relations and DRA method for multicomponent master integrals
NASA Astrophysics Data System (ADS)
Lee, Roman N.; Mingulov, Kirill T.
2018-04-01
We formulate a method to find the meromorphic solutions of higher-order recurrence relations in the form of the sum over poles with coefficients defined recursively. Several explicit examples of the application of this technique are given. The main advantage of the described approach is that the analytical properties of the solutions are very clear (the position of poles is explicit, the behavior at infinity can be easily determined). These are exactly the properties that are required for the application of the multiloop calculation method based on dimensional recurrence relations and analyticity (the DRA method).
NASA Technical Reports Server (NTRS)
Kim, Jonnathan H.
1995-01-01
Humans can perform many complicated tasks without explicit rules. This inherent and advantageous capability becomes a hurdle when a task is to be automated. Modern computers and numerical calculations require explicit rules and discrete numerical values. In order to bridge the gap between human knowledge and automating tools, a knowledge model is proposed. Knowledge modeling techniques are discussed and utilized to automate a labor and time intensive task of detecting anomalous bearing wear patterns in the Space Shuttle Main Engine (SSME) High Pressure Oxygen Turbopump (HPOTP).
Discrimination of tonal and atonal music in congenital amusia: The advantage of implicit tasks.
Tillmann, Barbara; Lalitte, Philippe; Albouy, Philippe; Caclin, Anne; Bigand, Emmanuel
2016-05-01
Congenital amusia is a neurodevelopmental disorder of music perception and production, which has been attributed to a major deficit in pitch processing. While most studies and diagnosis tests have used explicit investigation methods, recent studies using implicit investigation approaches have revealed some unimpaired pitch structure processing in congenital amusia. The present study investigated amusic individuals' processing of tonal structures (e.g., musical structures respecting the Western tonal system) via three different questions. Amusic participants and their matched controls judged tonal versions (original musical excerpts) and atonal versions (with manipulated pitch content to remove tonal structures) of 12 musical pieces. For each piece, participants answered three questions that required judgments from different perspectives: an explicit structural one, a personal, emotional one and a more social one (judging the perception of others). Results revealed that amusic individuals' judgments differed between tonal and atonal versions. However, the question type influenced the extent of the revealed structure processing: while amusic individuals were impaired for the question requiring explicit structural judgments, they performed as well as their matched controls for the two other questions. Together with other recent studies, these findings suggest that congenital amusia might be related to a disorder of the conscious access to music processing rather than music processing per se. Copyright © 2016 Elsevier Ltd. All rights reserved.
Emotional valence and physical space: limits of interaction.
de la Vega, Irmgard; de Filippis, Mónica; Lachmair, Martin; Dudschig, Carolin; Kaup, Barbara
2012-04-01
According to the body-specificity hypothesis, people associate positive things with the side of space that corresponds to their dominant hand and negative things with the side corresponding to their nondominant hand. Our aim was to find out whether this association also holds true for a response time study using linguistic stimuli, and whether such an association is activated automatically. Four experiments explored this association using positive and negative words. In Exp. 1, right-handers made a lexical judgment by pressing a left or right key. Attention was not explicitly drawn to the valence of the stimuli. No valence-by-side interaction emerged. In Exp. 2 and 3, right-handers and left-handers made a valence judgment by pressing a left or a right key. A valence-by-side interaction emerged: for positive words, responses were faster when participants responded with their dominant hand, whereas for negative words, responses were faster for the nondominant hand. Exp. 4 required a valence judgment without stating an explicit mapping of valence and side. No valence-by-side interaction emerged. The experiments provide evidence for an association between response side and valence which, however, does not seem to be activated automatically, but rather requires a task with an explicit response mapping in order to occur.
Replica exchange with solute tempering: A method for sampling biological systems in explicit water
NASA Astrophysics Data System (ADS)
Liu, Pu; Kim, Byungchan; Friesner, Richard A.; Berne, B. J.
2005-09-01
An innovative replica exchange (parallel tempering) method called replica exchange with solute tempering (REST) for the efficient sampling of aqueous protein solutions is presented here. The method bypasses the poor scaling with system size of standard replica exchange and thus reduces the number of replicas (parallel processes) that must be used. This reduction is accomplished by deforming the Hamiltonian function for each replica in such a way that the acceptance probability for the exchange of replica configurations does not depend on the number of explicit water molecules in the system. For proof of concept, REST is compared with standard replica exchange for an alanine dipeptide molecule in water. The comparisons confirm that REST greatly reduces the number of CPUs required by regular replica exchange and increases the sampling efficiency. This method reduces the CPU time required for calculating thermodynamic averages and for the ab initio folding of proteins in explicit water. Author contributions: B.J.B. designed research; P.L. and B.K. performed research; P.L. and B.K. analyzed data; and P.L., B.K., R.A.F., and B.J.B. wrote the paper. Abbreviations: REST, replica exchange with solute tempering; REM, replica exchange method; MD, molecular dynamics. *P.L. and B.K. contributed equally to this work.
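The scaling argument above can be made concrete. In REST, the Hamiltonian is deformed so that the water-water energy cancels out of the replica-swap criterion, leaving only protein-protein and protein-water terms. A minimal sketch of that Metropolis swap test in Python; the function and variable names are ours and the energies are placeholders, not code or data from the paper:

```python
import math
import random

def rest_exchange_accept(beta_m, beta_n, E_pp_m, E_pw_m, E_pp_n, E_pw_n,
                         rng=random.random):
    """Metropolis test for swapping REST replicas m and n.

    The swap criterion involves only the protein-protein (E_pp) and
    protein-water (E_pw) energies of the two configurations; the
    water-water term cancels, so acceptance does not depend on how many
    explicit water molecules the system contains.
    """
    delta = (beta_m - beta_n) * ((E_pp_n - E_pp_m) + 0.5 * (E_pw_n - E_pw_m))
    if delta <= 0:
        return True  # always accept downhill swaps
    return rng() < math.exp(-delta)  # accept uphill swaps with Boltzmann probability
```

Because the water-water term never enters `delta`, adding more explicit water molecules leaves the swap statistics unchanged, which is the source of the method's favorable scaling.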
Human Expertise Helps Computer Classify Images
NASA Technical Reports Server (NTRS)
Rorvig, Mark E.
1991-01-01
Two-domain method of computational classification of images requires less computation than other methods for computational recognition, matching, or classification of images or patterns. Does not require explicit computational matching of features, and incorporates human expertise without requiring translation of mental processes of classification into language comprehensible to computer. Conceived to "train" computer to analyze photomicrographs of microscope-slide specimens of leucocytes from human peripheral blood to distinguish between specimens from healthy and specimens from traumatized patients.
Reduced attentional blink for gambling-related stimuli in problem gamblers.
Brevers, Damien; Cleeremans, Axel; Tibboel, Helen; Bechara, Antoine; Kornreich, Charles; Verbanck, Paul; Noël, Xavier
2011-09-01
Although there is considerable information concerning the attentional biases in psychoactive substance use and misuse, much less is known about the contribution of attentional processing in problem gambling. The aim of this study was to examine whether problem gamblers (PrG) exhibit attentional bias at the level of the encoding processing stage. Forty PrG and 35 controls participated in an attentional blink (AB) paradigm in which they were required to identify both gambling and neutral words that appeared in a rapid serial visual presentation. Explicit motivation (e.g., intrinsic/arousal, extrinsic, amotivation) toward the gambling cues was recorded. A diminished AB effect for gambling-related words compared to neutral targets was identified in PrG. In contrast, AB was similar when either gambling-related or neutral words were presented to controls. Furthermore, there was a significant positive correlation between the reduced AB for gambling-related words and the sub-score of intrinsic/arousal motivation to gamble in PrG. Such findings suggest that the PrG group exhibits an enhanced ability to process gambling-related information, which is associated with their desire to gamble for arousal reasons. Theoretical and clinical implications of these results are discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.
Wing, Steve; Richardson, David B; Hoffmann, Wolfgang
2011-04-01
In April 2010, the U.S. Nuclear Regulatory Commission asked the National Academy of Sciences to update a 1990 study of cancer risks near nuclear facilities. Prior research on this topic has suffered from problems in hypothesis formulation and research design. We review epidemiologic principles used in studies of generic exposure-response associations and in studies of specific sources of exposure. We then describe logical problems with assumptions, formation of testable hypotheses, and interpretation of evidence in previous research on cancer risks near nuclear facilities. Advancement of knowledge about cancer risks near nuclear facilities depends on testing specific hypotheses grounded in physical and biological mechanisms of exposure and susceptibility while considering sample size and ability to adequately quantify exposure, ascertain cancer cases, and evaluate plausible confounders. Next steps in advancing knowledge about cancer risks near nuclear facilities require studies of childhood cancer incidence, focus on in utero and early childhood exposures, use of specific geographic information, and consideration of pathways for transport and uptake of radionuclides. Studies of cancer mortality among adults, cancers with long latencies, large geographic zones, and populations that reside at large distances from nuclear facilities are better suited for public relations than for scientific purposes.
Computational screening of biomolecular adsorption and self-assembly on nanoscale surfaces.
Heinz, Hendrik
2010-05-01
The quantification of binding properties of ions, surfactants, biopolymers, and other macromolecules to nanometer-scale surfaces is often difficult experimentally and a recurring challenge in molecular simulation. A simple and computationally efficient method is introduced to compute quantitatively the energy of adsorption of solute molecules on a given surface. Highly accurate summation of Coulomb energies as well as precise control of temperature and pressure is required to extract the small energy differences in complex environments characterized by a large total energy. The method involves the simulation of four systems, the surface-solute-solvent system, the solute-solvent system, the solvent system, and the surface-solvent system, with equal molecular volumes of each component under NVT conditions, using standard molecular dynamics or Monte Carlo algorithms. Particularly in chemically detailed systems including thousands of explicit solvent molecules and specific concentrations of ions and organic solutes, the method takes into account the effect of complex nonbond interactions and rotational isomeric states on the adsorption behavior on surfaces. As a numerical example, the adsorption of a dodecapeptide on the Au {111} and mica {001} surfaces is described in aqueous solution. Copyright 2009 Wiley Periodicals, Inc.
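The four-system protocol described above is, at bottom, a thermodynamic-cycle subtraction of average total energies. A minimal sketch, assuming the common sign convention that negative values indicate favorable adsorption; the function name, convention, and placeholder numbers are ours, not the paper's:

```python
def adsorption_energy(E_surf_solute_solv, E_solute_solv, E_surf_solv, E_solv):
    """Adsorption energy from the average total energies of the four NVT
    systems named in the abstract: (1) surface + solute + solvent,
    (2) solute + solvent, (3) surface + solvent, and (4) pure solvent.

    Under the assumed sign convention, a negative result means that moving
    the solute from bulk solution onto the surface lowers the energy,
    i.e. adsorption is favorable.
    """
    return (E_surf_solute_solv + E_solv) - (E_solute_solv + E_surf_solv)

# Placeholder averages (e.g. kcal/mol); real values come from long MD/MC runs.
dE = adsorption_energy(-1050.0, -400.0, -620.0, -10.0)
```

Because each term is a large total energy and the difference is small, the highly accurate Coulomb summation and tight temperature and pressure control stressed in the abstract are what make this subtraction meaningful.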
Paradigm shifts and the interplay between state, business and civil sectors.
Encarnação, Sara; Santos, Fernando P; Santos, Francisco C; Blass, Vered; Pacheco, Jorge M; Portugali, Juval
2016-12-01
The recent rise of the civil sector as a main player of socio-political actions, next to public and private sectors, has largely increased the complexity underlying the interplay between different sectors of our society. From urban planning to global governance, analysis of these complex interactions requires new mathematical and computational approaches. Here, we develop a novel framework, grounded on evolutionary game theory, to envisage situations in which each of these sectors is confronted with the dilemma of deciding between maintaining a status quo scenario or shifting towards a new paradigm. We consider multisector conflicts regarding environmentally friendly policies as an example of application, but the framework developed here has a considerably broader scope. We show that the public sector is crucial in initiating the shift, and determine explicitly under which conditions the civil sector-reflecting the emergent reality of civil society organizations playing an active role in modern societies-may influence the decision-making processes accruing to other sectors, while fostering new routes towards a paradigm shift of the society as a whole. Our results are shown to be robust to a wide variety of assumptions and model parametrizations.
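The paper's framework couples three sectors; as a much-simplified illustration of its evolutionary game theoretic underpinning, here is single-population, two-strategy replicator dynamics for the status-quo versus new-paradigm dilemma. The payoff matrix is invented for illustration (a coordination game in which shifting pays off only once enough others have shifted) and is not taken from the paper:

```python
def replicator_step(x, payoff, dt=0.01):
    """One Euler step of two-strategy replicator dynamics.

    x is the population share adopting the new paradigm (strategy 0);
    payoff[i][j] is the payoff of strategy i against strategy j.
    """
    f_new = x * payoff[0][0] + (1 - x) * payoff[0][1]
    f_old = x * payoff[1][0] + (1 - x) * payoff[1][1]
    f_avg = x * f_new + (1 - x) * f_old
    return x + dt * x * (f_new - f_avg)

# Illustrative coordination game: the shift pays off once enough others shift.
payoff = [[3.0, 0.5],
          [1.0, 2.0]]

x = 0.6  # initial share already favouring the shift (above the tipping point)
for _ in range(2000):
    x = replicator_step(x, payoff)
```

With these payoffs the tipping point sits at x ≈ 0.43: populations starting above it converge to the new paradigm, while those below relapse to the status quo, echoing the paper's point that some sector must seed the shift before others follow.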
Potential impacts of global warming on water resources in southern California.
Beuhler, M
2003-01-01
Global warming will have a significant impact on water resources within the 20 to 90-year planning period of many water projects. Arid and semi-arid regions such as Southern California are especially vulnerable to anticipated negative impacts of global warming on water resources. Long-range water facility planning must consider global climate change in the recommended mix of new facilities needed to meet future water requirements. The generally accepted impacts of global warming include rising temperatures, rising sea levels, more frequent and severe floods and droughts, and a shift from snowfall to rain. Precipitation changes are more difficult to predict. For Southern California, these impacts will be especially severe on surface water supplies. Additionally, rising sea levels will exacerbate salt-water intrusion into freshwater and impact the quality of surface water supplies. Integrated water resources planning is emerging as a tool to develop water supplies and demand management strategies that are less vulnerable to the impacts of global warming. These tools include water conservation, conjunctive use of surface and groundwater and desalination of brackish water and possibly seawater. Additionally, planning for future water needs should include explicit consideration of the potential range of global warming impacts through techniques such as scenario planning.
Task-set inertia and memory-consolidation bottleneck in dual tasks.
Koch, Iring; Rumiati, Raffaella I
2006-11-01
Three dual-task experiments examined the influence of processing a briefly presented visual object for deferred verbal report on performance in an unrelated auditory-manual reaction time (RT) task. RT was increased at short stimulus-onset asynchronies (SOAs) relative to long SOAs, showing that memory consolidation processes can produce a functional processing bottleneck in dual-task performance. In addition, the experiments manipulated the spatial compatibility of the orientation of the visual object and the side of the speeded manual response. This cross-task compatibility produced relative RT benefits only when the instruction for the visual task emphasized overlap at the level of response codes across the task sets (Experiment 1). However, once the effective task set was in place, it continued to produce cross-task compatibility effects even in single-task situations ("ignore" trials in Experiment 2) and when instructions for the visual task did not explicitly require spatial coding of object orientation (Experiment 3). Taken together, the data suggest a considerable degree of task-set inertia in dual-task performance, which is also reinforced by finding costs of switching task sequences (e.g., AC --> BC vs. BC --> BC) in Experiment 3.
Cumulative environmental impacts and integrated coastal management: the case of Xiamen, China.
Xue, Xiongzhi; Hong, Huasheng; Charles, Anthony T
2004-07-01
This paper examines the assessment of cumulative environmental impacts and the implementation of integrated coastal management within the harbour of Xiamen, China, an urban region in which the coastal zone is under increasing pressure as a result of very rapid economic growth. The first stage of analysis incorporates components of a cumulative effects assessment, including (a) identification of sources of environmental impacts, notably industrial expansion, port development, shipping, waste disposal, aquaculture and coastal construction, (b) selection of a set of valued ecosystem components, focusing on circulation and siltation, water quality, sediment, the benthic community, and mangrove forests, and (c) use of a set of key indicators to examine cumulative impacts arising from the aggregate of human activities. In the second stage of analysis, the paper describes and assesses the development of an institutional framework for integrated coastal management in Xiamen, one that combines policy and planning (including legislative and enforcement mechanisms) with scientific and monitoring mechanisms (including an innovative 'marine functional zoning' system). The paper concludes that the integrated coastal management framework in Xiamen has met all relevant requirements for 'integration' as laid out in the literature, and has explicitly incorporated consideration of cumulative impacts within its management and monitoring processes.
MaTrace: tracing the fate of materials over time and across products in open-loop recycling.
Nakamura, Shinichiro; Kondo, Yasushi; Kagawa, Shigemi; Matsubae, Kazuyo; Nakajima, Kenichi; Nagasaka, Tetsuya
2014-07-01
Even for metals, open-loop recycling is more common than closed-loop recycling due, among other factors, to the degradation of quality in the end-of-life (EoL) phase. Open-loop recycling is subject to loss of functionality of original materials and to dissipation in forms that are difficult to recover, and recovered metals might need dilution with primary metals to meet quality requirements. Sustainable management of metal resources calls for the minimization of these losses. Imperative to this is quantitative tracking of the fate of materials across different stages, products, and losses. A new input-output analysis (IO) based model of dynamic material flow analysis (MFA) is presented that can trace the fate of materials over time and across products in open-loop recycling, taking explicit account of losses and the quality of scrap. Application to car steel recovered from EoL vehicles (ELV) showed that after 50 years around 80% of the steel is used in products, mostly buildings and civil engineering (infrastructure), with the rest mostly residing in unrecovered obsolete infrastructure and refinery losses. Sensitivity analysis was conducted to evaluate the effects of changes in product lifespan and the quality of scrap.
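The fate-tracing at the heart of such a model can be illustrated as a Markov-style allocation: each period, the material in each product or loss category is redistributed according to a row-stochastic matrix derived from lifespans, recovery rates, and process yields. The states and transition shares below are invented for illustration and are not the paper's data:

```python
# States: 0 = vehicles (in use), 1 = buildings/infrastructure, 2 = losses.
# transition[i][j]: share of the material in state i found in state j one
# period later. Rows sum to one, so mass is conserved; state 2 is absorbing.
transition = [
    [0.90, 0.08, 0.02],  # most car steel stays in use; some is recycled into construction
    [0.00, 0.98, 0.02],  # long-lived construction stock with a small annual loss
    [0.00, 0.00, 1.00],  # dissipated and refinery losses are never recovered
]

def trace(stock, transition, years):
    """Propagate an initial material stock vector through `years` periods."""
    for _ in range(years):
        stock = [sum(stock[i] * transition[i][j] for i in range(len(stock)))
                 for j in range(len(transition[0]))]
    return stock

# One unit of car steel traced over 50 years.
final = trace([1.0, 0.0, 0.0], transition, 50)
```

Because each row sums to one, total mass is conserved across periods, and the losses accumulate irreversibly in the absorbing state, mirroring the paper's finding that after decades most surviving car steel resides in long-lived infrastructure.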
Where do Students Go Wrong in Applying the Scientific Method?
NASA Astrophysics Data System (ADS)
Rubbo, Louis; Moore, Christopher
2015-04-01
Non-science majors completing a liberal arts degree are frequently required to take a science course. Ideally with the completion of a required science course, liberal arts students should demonstrate an improved capability in the application of the scientific method. In previous work we have demonstrated that this is possible if explicit instruction is spent on the development of scientific reasoning skills. However, even with explicit instruction, students still struggle to apply the scientific process. Counter to our expectations, the difficulty is not isolated to a single issue such as stating a testable hypothesis, designing an experiment, or arriving at a supported conclusion. Instead students appear to struggle with every step in the process. This talk summarizes our work looking at and identifying where students struggle in the application of the scientific method. This material is based upon work supported by the National Science Foundation under Grant No. 1244801.
Emotion regulation and mania risk: Differential responses to implicit and explicit cues to regulate.
Ajaya, Yatrika; Peckham, Andrew D; Johnson, Sheri L
2016-03-01
People prone to mania use emotion regulation (ER) strategies well when explicitly coached to do so in laboratory settings, but they find these strategies ineffective in daily life. We hypothesized that, compared with control participants, mania-prone people would show ER deficits when they received implicit, but not explicit, cues to use ER. Undergraduates (N = 66) completed the Hypomanic Personality Scale (HPS) and were randomly assigned to one of three experimental conditions: automatic ER (scrambled sentence primes), deliberate ER (verbal instructions), or control (no priming or instructions to use ER). Then, participants played a videogame designed to evoke anger. Emotion responses were measured with a multi-modal assessment of self-reported affect, psychophysiology, and facial expressions. Respiratory sinus arrhythmia (RSA) was used to index ER. The videogame effectively elicited subjective anger, angry facial expressions, and heart rate increases when keys malfunctioned. As hypothesized, persons who were more mania prone showed greater RSA increases in the deliberate ER condition than in the automatic or control conditions. One potential limitation is the use of an analog sample. Findings suggest that those at risk for mania require more explicit instruction to engage ER effectively. Copyright © 2015 Elsevier Ltd. All rights reserved.
Hughey, Laura; Wheaton, Lewis A
2016-01-01
Loss of an upper extremity and the resulting rehabilitation often require individuals to learn how to use a prosthetic device for activities of daily living. It remains unclear how prostheses affect motor learning outcomes. The authors' aim was to evaluate whether incidental motor learning and explicit recall are affected in intact persons using either prostheses (n = 10) or the sound limb (n = 10), and in a chronic amputee, on a modified serial reaction time task. Latency and accuracy of task completion were recorded over six blocks, with a distractor task between blocks 5 and 6. Participants were also asked to recall the sequence immediately following the study and at a 24-hr follow-up. Prosthesis users demonstrated patterns consistent with implicit learning, with sustained error patterns with the distal terminal device. More intact individuals were able to explicitly recall the sequence initially; however, there was no significant difference 24 hr following the study. Acute incidental motor learning does not appear to diminish task-related error patterns or to be accompanied by explicit recall in prosthesis users, which could present limitations for acute training of prosthesis use in amputees. This suggests differing mechanisms of visuospatial sequential learning and motor control with prostheses.
Brown, Jessica A; Hux, Karen; Knollman-Porter, Kelly; Wallace, Sarah E
2016-01-01
Concomitant visual and cognitive impairments following traumatic brain injuries (TBIs) may be problematic when the visual modality serves as a primary source for receiving information. Further difficulties comprehending visual information may occur when interpretation requires processing inferential rather than explicit content. The purpose of this study was to compare the accuracy with which people with and without severe TBI interpreted information in contextually rich drawings. Fifteen adults with and 15 adults without severe TBI. Repeated-measures between-groups design. Participants were asked to match images to sentences that either conveyed explicit (ie, main action or background) or inferential (ie, physical or mental inference) information. The researchers compared accuracy between participant groups and among stimulus conditions. Participants with TBI demonstrated significantly poorer accuracy than participants without TBI extracting information from images. In addition, participants with TBI demonstrated significantly higher response accuracy when interpreting explicit rather than inferential information; however, no significant difference emerged between sentences referencing main action versus background information or sentences providing physical versus mental inference information for this participant group. Difficulties gaining information from visual environmental cues may arise for people with TBI given their difficulties interpreting inferential content presented through the visual modality.
48 CFR 29.304 - Matters requiring special consideration.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Matters requiring special consideration. 29.304 Section 29.304 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL CONTRACTING REQUIREMENTS TAXES State and Local Taxes 29.304 Matters requiring special...
48 CFR 29.304 - Matters requiring special consideration.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Matters requiring special consideration. 29.304 Section 29.304 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL CONTRACTING REQUIREMENTS TAXES State and Local Taxes 29.304 Matters requiring special...
48 CFR 29.304 - Matters requiring special consideration.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Matters requiring special consideration. 29.304 Section 29.304 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL CONTRACTING REQUIREMENTS TAXES State and Local Taxes 29.304 Matters requiring special...
48 CFR 29.304 - Matters requiring special consideration.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Matters requiring special consideration. 29.304 Section 29.304 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL CONTRACTING REQUIREMENTS TAXES State and Local Taxes 29.304 Matters requiring special...
48 CFR 29.304 - Matters requiring special consideration.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Matters requiring special consideration. 29.304 Section 29.304 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL CONTRACTING REQUIREMENTS TAXES State and Local Taxes 29.304 Matters requiring special...
SPIN CORRELATIONS OF THE FINAL LEPTONS IN THE TWO-PHOTON PROCESSES γγ → e+e-, μ+μ-, τ+τ-
NASA Astrophysics Data System (ADS)
Lyuboshitz, Valery V.; Lyuboshitz, Vladimir L.
2014-12-01
The spin structure of the process γγ → e+e- is theoretically investigated. It is shown that, if the primary photons are unpolarized, the final electron and positron are unpolarized as well, but their spins are strongly correlated. For the final (e+e-) system, explicit expressions for the components of the correlation tensor are derived, and the relative fractions of singlet and triplet states are found. It is demonstrated that in the process γγ → e+e- one of the Bell-type incoherence inequalities for the correlation tensor components is always violated; thus, the spin correlations of the electron and positron in this process have a strongly pronounced quantum character. The same analysis applies equally to the two-photon processes γγ → μ+μ- and γγ → τ+τ-, which become possible at considerably higher energies.
Consideration and Checkboxes: Incorporating Ethics and Science into the 3Rs
Landi, Margaret S; Shriver, Adam J; Mueller, Anne
2015-01-01
Members of the research community aim to both produce high-quality research and ensure that harm is minimized in animals. The primary means of ensuring these goals are both met is the 3Rs framework of replacement, reduction, and refinement. However, some approaches to the 3Rs may result in a ‘check box mentality’ in which IACUC members, researchers, administrators, and caretakers check off a list of tasks to evaluate a protocol. We provide reasons for thinking that the 3Rs approach could be enhanced with more explicit discussion of the ethical assumptions used to arrive at an approved research protocol during IACUC review. Here we suggest that the notion of moral considerability, and all of the related issues it gives rise to, should be incorporated into IACUC discussions of 3Rs deliberations during protocol review to ensure that animal wellbeing is enhanced within the constraints of scientific investigation. PMID:25836970
When should psychiatrists seek criminal prosecution of assaultive psychiatric inpatients?
Ho, Justin; Ralston, D Christopher; McCullough, Laurence B; Coverdale, John H
2009-08-01
This Open Forum commentary reviews the ethical considerations relevant to the question of prosecuting assaultive psychiatric patients, with particular attention to the significance that should be attached to the arguments generated by those considerations. A comprehensive literature search was conducted incorporating the terms "assaultive patients," "ethics," "psychiatric inpatients," and "law." The literature of professional medical ethics was applied to identify relevant domains of ethical argument. Five domains were identified: fiduciary obligations of physicians to assaultive and other patients; obligations to staff members; professional virtues of compassion, self-sacrifice, and self-effacement; retributive justice; and the patient's right to confidentiality. The content of each domain is explained, and guidance is provided on how to assess the relative strengths of ethical argument within each domain. All five domains must be explicitly addressed in order to make ethically disciplined judgments about whether to seek prosecution. A distinctive feature of this ethical analysis is the central importance of the professional virtues.
NASA Technical Reports Server (NTRS)
Wei, Jiangfeng; Dirmeyer, Paul A.; Wisser, Dominik; Bosilovich, Michael G.; Mocko, David M.
2013-01-01
Irrigation is an important human activity that may impact local and regional climate, but current climate model simulations and data assimilation systems generally do not explicitly include it. The European Centre for Medium-Range Weather Forecasts (ECMWF) Interim Re-Analysis (ERA-Interim) shows more irrigation signal in surface evapotranspiration (ET) than the Modern-Era Retrospective Analysis for Research and Applications (MERRA) because ERA-Interim adjusts soil moisture according to the observed surface temperature and humidity while MERRA has no explicit consideration of irrigation at the surface. But, when compared with the results from a hydrological model with detailed considerations of agriculture, the ET from both reanalyses show large deficiencies in capturing the impact of irrigation. Here, a back-trajectory method is used to estimate the contribution of irrigation to precipitation over local and surrounding regions, using MERRA with observation-based corrections and added irrigation-caused ET increase from the hydrological model. Results show substantial contributions of irrigation to precipitation over heavily irrigated regions in Asia, but the precipitation increase is much less than the ET increase over most areas, indicating that irrigation could lead to water deficits over these regions. For the same increase in ET, precipitation increases are larger over wetter areas where convection is more easily triggered, but the percentage increase in precipitation is similar for different areas. There are substantial regional differences in the patterns of irrigation impact, but, for all the studied regions, the highest percentage contribution to precipitation is over local land.
32 CFR Enclosure 1 - Requirements for Environmental Considerations-Global Commons
Code of Federal Regulations, 2011 CFR
2011-07-01
... 32 National Defense 1 2011-07-01 2011-07-01 false Requirements for Environmental Considerations... ENVIRONMENT ENVIRONMENTAL EFFECTS ABROAD OF MAJOR DEPARTMENT OF DEFENSE ACTIONS Information requirements. Pt. 187, Encl. 1 Enclosure 1—Requirements for Environmental Considerations—Global Commons A. General. This...
32 CFR Enclosure 1 - Requirements for Environmental Considerations-Global Commons
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 1 2012-07-01 2012-07-01 false Requirements for Environmental Considerations... ENVIRONMENT ENVIRONMENTAL EFFECTS ABROAD OF MAJOR DEPARTMENT OF DEFENSE ACTIONS Information requirements. Pt. 187, Encl. 1 Enclosure 1—Requirements for Environmental Considerations—Global Commons A. General. This...
Watch what you type: the role of visual feedback from the screen and hands in skilled typewriting.
Snyder, Kristy M; Logan, Gordon D; Yamaguchi, Motonori
2015-01-01
Skilled typing is controlled by two hierarchically structured processing loops (Logan & Crump, 2011): The outer loop, which produces words, commands the inner loop, which produces keystrokes. Here, we assessed the interplay between the two loops by investigating how visual feedback from the screen (responses either were or were not echoed on the screen) and the hands (the hands either were or were not covered with a box) influences the control of skilled typing. Our results indicated, first, that the reaction time of the first keystroke was longer when responses were not echoed than when they were. Also, the interkeystroke interval (IKSI) was longer when the hands were covered than when they were visible, and the IKSI for responses that were not echoed was longer when explicit error monitoring was required (Exp. 2) than when it was not required (Exp. 1). Finally, explicit error monitoring was more accurate when response echoes were present than when they were absent, and implicit error monitoring (i.e., posterror slowing) was not influenced by visual feedback from the screen or the hands. These findings suggest that the outer loop adjusts the inner-loop timing parameters to compensate for reductions in visual feedback. We suggest that these adjustments are preemptive control strategies designed to execute keystrokes more cautiously when visual feedback from the hands is absent, to generate more cautious motor programs when visual feedback from the screen is absent, and to enable enough time for the outer loop to monitor keystrokes when visual feedback from the screen is absent and explicit error reports are required.
Geometric multigrid for an implicit-time immersed boundary method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guy, Robert D.; Philip, Bobby; Griffith, Boyce E.
2014-10-12
The immersed boundary (IB) method is an approach to fluid-structure interaction that uses Lagrangian variables to describe the deformations and resulting forces of the structure and Eulerian variables to describe the motion and forces of the fluid. Explicit time stepping schemes for the IB method require solvers only for Eulerian equations, for which fast Cartesian grid solution methods are available. Such methods are relatively straightforward to develop and are widely used in practice but often require very small time steps to maintain stability. Implicit-time IB methods permit the stable use of large time steps, but efficient implementations of such methods require significantly more complex solvers that effectively treat both Lagrangian and Eulerian variables simultaneously. Moreover, several different approaches to solving the coupled Lagrangian-Eulerian equations have been proposed, but a complete understanding of this problem is still emerging. This paper presents a geometric multigrid method for an implicit-time discretization of the IB equations. This multigrid scheme uses a generalization of box relaxation that is shown to handle problems in which the physical stiffness of the structure is very large. Numerical examples are provided to illustrate the effectiveness and efficiency of the algorithms described herein. Finally, these tests show that using multigrid as a preconditioner for a Krylov method yields improvements in both robustness and efficiency as compared to using multigrid as a solver. They also demonstrate that with a time step 100–1000 times larger than that permitted by an explicit IB method, the multigrid-preconditioned implicit IB method is approximately 50–200 times more efficient than the explicit method.
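The explicit-vs-implicit stability trade-off motivating this work can be illustrated with a minimal sketch (this is NOT the paper's IB solver; all names and values below are illustrative assumptions, using the 1D heat equation with fixed boundaries as a stand-in stiff problem):

```python
# Sketch: explicit time stepping blows up beyond its stability limit, while an
# implicit scheme stays stable at a 100x larger step, at the cost of a solve.
import numpy as np

def step_explicit(u, dt, dx):
    """Forward Euler (FTCS): stable only for dt <= dx**2 / 2."""
    un = u.copy()
    un[1:-1] += dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return un

def step_implicit(u, dt, dx):
    """Backward Euler: unconditionally stable, but requires a linear solve."""
    n = len(u)
    r = dt / dx**2
    A = (1.0 + 2.0 * r) * np.eye(n) - r * (np.eye(n, k=1) + np.eye(n, k=-1))
    A[0, :], A[-1, :] = 0.0, 0.0
    A[0, 0] = A[-1, -1] = 1.0  # keep boundary values fixed
    return np.linalg.solve(A, u)

n = 51
dx = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
u0 = np.sin(np.pi * x)
u0[n // 2] += 1e-6          # tiny perturbation to seed high-frequency modes

dt = 100 * dx**2 / 2        # 100x beyond the explicit stability limit
u_exp, u_imp = u0.copy(), u0.copy()
for _ in range(21):
    u_exp = step_explicit(u_exp, dt, dx)   # grows without bound at this dt
    u_imp = step_implicit(u_imp, dt, dx)   # decays smoothly, as it should
```

In the paper's setting the implicit solve is far harder (coupled Lagrangian-Eulerian equations rather than a tridiagonal system), which is exactly why an efficient multigrid preconditioner matters.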
Amoretti, M Cristina; Lalumera, Elisabetta
2018-05-30
The general concept of mental disorder specified in the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders is definitional in character: a mental disorder might be identified with a harmful dysfunction. The manual also contains the explicit claim that each individual mental disorder should meet the requirements posed by the definition. The aim of this article is two-fold. First, we shall analyze the definition of the superordinate concept of mental disorder to better understand what necessary (and sufficient) criteria actually characterize such a concept. Second, we shall consider the concepts of some individual mental disorders and show that they are in tension with the definition of the superordinate concept, taking pyromania and narcissistic personality disorder as case studies. Our main point is that including in the general definition a dysfunction requirement that is unexplained and not operationalized, while the diagnostic criteria of specific mental disorders systematically violate it, is a logical error. Either we unpack and operationalize the dysfunction requirement and include explicit diagnostic criteria that can actually meet it, or we simply drop it.
The impacts of welfare reform on rural public transportation patronage
DOT National Transportation Integrated Search
1999-12-01
This study examines alternative means of forecasting rural public transportation patronage with explicit attention to persons likely to be affected by welfare to work requirements. Original data gathered on rural transit and auto commuters provides a...
77 FR 73611 - Notice of Public Information Collection Requirements Submitted to OMB for Review
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-11
... the operation of a program compared to a set of explicit or implicit standards, as a means of..., filing of petitions and applications and agency statements of organization and functions are examples...
EXTINCTION DEBT OF PROTECTED AREAS IN DEVELOPING LANDSCAPES
To conserve biological diversity, protected-area networks must be based not only upon current species distributions but also the landscape's long-term capacity to support populations. We used spatially-explicit population models requiring detailed habitat and demographic data to ...
Explicit Pharmacokinetic Modeling: Tools for Documentation, Verification, and Portability
Quantitative estimates of tissue dosimetry of environmental chemicals due to multiple exposure pathways require the use of complex mathematical models, such as physiologically-based pharmacokinetic (PBPK) models. The process of translating the abstract mathematics of a PBPK mode...
NASA Astrophysics Data System (ADS)
Perdigão, R. A. P.
2017-12-01
Predictability assessments are traditionally made on a case-by-case basis, often by running the particular model of interest with randomly perturbed initial/boundary conditions and parameters, producing computationally expensive ensembles. These approaches provide a lumped statistical view of uncertainty evolution, without eliciting the fundamental processes and interactions at play in the uncertainty dynamics. In order to address these limitations, we introduce a systematic dynamical framework for predictability assessment and forecast, analytically deriving governing equations of predictability in terms of the fundamental architecture of dynamical systems, independent of any particular problem under consideration. The framework further relates multiple uncertainty sources along with their coevolutionary interplay, enabling a comprehensive and explicit treatment of uncertainty dynamics over time, without requiring the actual model to be run. In doing so, computational resources are freed, and a quick and effective a priori systematic evaluation is made of predictability evolution and its challenges, including aspects of the model architecture and intervening variables that may require optimization before any model runs are initiated. The framework further brings out universal dynamic features of the error dynamics that are elusive to any case-specific treatment, ultimately shedding fundamental light on the challenging issue of predictability. The formulated approach, framed with broad mathematical physics generality in mind, is then implemented in dynamic models of nonlinear geophysical systems with various degrees of complexity, in order to evaluate their limitations and provide informed assistance on how to optimize their design and improve their predictability in fundamental dynamical terms.
Principle considerations for the risk assessment of sprayed consumer products.
Steiling, W; Bascompta, M; Carthew, P; Catalano, G; Corea, N; D'Haese, A; Jackson, P; Kromidas, L; Meurice, P; Rothe, H; Singal, M
2014-05-16
In recent years, the official regulation of chemicals and chemical products has intensified. For spray products in particular, enhanced requirements for assessing consumer and professional exposure have been introduced. Relevant here are the Aerosol Dispensers Directive (75/324/EEC), which governs the marketing of aerosol dispensers, and the Cosmetic Products Regulation (1223/2009/EC), which mandates a safety assessment. Both enactments, similar to the REACH regulation (1907/2006/EC), require a robust chemical safety assessment. From such an assessment, appropriate risk management measures may be identified to adequately control the risk these chemicals/products pose to human health and the environment when used. Currently, the above-mentioned regulations lack guidance on which data are needed to prepare a proper hazard analysis and safety assessment of spray products. Mandatory in the process of inhalation risk and safety assessment is the determination and quantification of the actual exposure to the spray product and, more specifically, its ingredients. In this respect the current article, prepared by the European Aerosol Federation (FEA, Brussels) task force "Inhalation Toxicology", introduces toxicological principles and the state of the art in currently available exposure models adapted to typical application scenarios. This review of current methodologies is intended to guide safety assessors in better estimating inhalation exposure by using the most relevant data. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
Sawbridge, Jenny L; Qureshi, Haseeb K; Boyd, Matthew J; Brown, Angus M
2014-09-01
The ability to understand and carry out the molarity and dilution calculations routinely undertaken in the laboratory is an essential skill that all students entering an undergraduate Life Sciences degree should possess. However, it is increasingly recognized that the majority of these students are ill equipped to reliably perform such calculations. Several factors conspire against students' understanding of this topic, two key examples being the alien concept of the mole in relation to the mass of compounds and the engineering notation required to express the relatively small quantities typically involved. In this report, we highlight teaching methods delivered via revision workshops to undergraduate Life Sciences students at the University of Nottingham. Workshops were designed to 1) expose and remedy deficiencies in basic numeracy skills, 2) introduce molarity and dilution calculations and illustrate their workings step by step, and 3) allow students to appreciate the magnitude of numbers. Preworkshop-to-postworkshop comparisons demonstrated a considerable improvement in students' performance, which attenuated with time. The findings of our study suggest that an ability to carry out laboratory calculations cannot be assumed in students entering Life Sciences degrees in the United Kingdom, but that explicit instruction in the form of workshops improves proficiency to a level of competence that allows students to prosper in the laboratory environment. Copyright © 2014 The American Physiological Society.
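The two calculations the workshops target can be sketched directly (a minimal illustration; the function names and the NaCl example are assumptions, not taken from the study):

```python
# Sketch of routine lab calculations: molarity (moles = mass / molar mass;
# concentration = moles / volume) and the dilution relation C1 * V1 = C2 * V2.

def molarity(mass_g, molar_mass_g_per_mol, volume_l):
    """Concentration in mol/L of a weighed mass dissolved to a final volume."""
    return (mass_g / molar_mass_g_per_mol) / volume_l

def stock_volume_needed(c_stock, c_final, v_final):
    """V1 = C2 * V2 / C1: volume of stock required for the target dilution."""
    return c_final * v_final / c_stock

# 5.844 g of NaCl (molar mass 58.44 g/mol) made up to 1 L gives a 0.1 M stock
c_stock = molarity(5.844, 58.44, 1.0)
# To prepare 50 mL of a 10 mM working solution from that stock:
v1 = stock_volume_needed(c_stock, 0.010, 0.050)   # litres of stock to pipette
```

The engineering-notation stumbling block the abstract mentions shows up here too: keeping every quantity in base SI units (g, mol, L) before converting to mM or µL is exactly the habit the workshops aim to instil.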
Goyal, Puja; Qian, Hu-Jun; Irle, Stephan; Lu, Xiya; Roston, Daniel; Mori, Toshifumi; Elstner, Marcus; Cui, Qiang
2014-09-25
We discuss the description of water and hydration effects that employs an approximate density functional theory, DFTB3, in either a full QM or QM/MM framework. The goal is to explore, with the current formulation of DFTB3, the performance of this method for treating water in different chemical environments, the magnitude and nature of changes required to improve its performance, and factors that dictate its applicability to reactions in the condensed phase in a QM/MM framework. A relatively minor change (on the scale of kBT) in the O-H repulsive potential is observed to substantially improve the structural properties of bulk water under ambient conditions; modest improvements are also seen in dynamic properties of bulk water. This simple change also improves the description of protonated water clusters, a solvated proton, and to a more limited degree, a solvated hydroxide. By comparing results from DFTB3 models that differ in the description of water, we confirm that proton transfer energetics are adequately described by the standard DFTB3/3OB model for meaningful mechanistic analyses. For QM/MM applications, a robust parametrization of QM-MM interactions requires an explicit consideration of condensed phase properties, for which an efficient sampling technique was developed recently and is reviewed here. The discussions help make clear the value and limitations of DFTB3 based simulations, as well as the developments needed to further improve the accuracy and transferability of the methodology.
Adaptive Flood Risk Management Under Climate Change Uncertainty Using Real Options and Optimization.
Woodward, Michelle; Kapelan, Zoran; Gouldby, Ben
2014-01-01
It is well recognized that adaptive and flexible flood risk strategies are required to account for future uncertainties. Development of such strategies is, however, a challenge. Climate change alone is a significant complication but, in addition, complexities exist in trying to identify the most appropriate set of mitigation measures, or interventions. There is a range of economic and environmental performance measures that require consideration, and the spatial and temporal aspects of evaluating their performance are complex. All these elements pose severe difficulties to decisionmakers. This article describes a decision support methodology that has the capability to assess the most appropriate set of interventions to make in a flood system and the opportune time to make these interventions, given the future uncertainties. The flood risk strategies have been explicitly designed to allow for flexible adaptive measures by capturing the concepts of real options and multiobjective optimization to evaluate potential flood risk management opportunities. A state-of-the-art flood risk analysis tool is employed to evaluate the risk associated with each strategy over future points in time, and a multiobjective genetic algorithm is utilized to search for the optimal adaptive strategies. The modeling system has been applied to a reach on the Thames Estuary (London, England), and initial results show the inclusion of flexibility is advantageous, while the outputs provide decisionmakers with supplementary knowledge that previously has not been considered. © 2013 HR Wallingford Ltd.
A Physical Based Formula for Calculating the Critical Stress of Snow Movement
NASA Astrophysics Data System (ADS)
He, S.; Ohara, N.
2016-12-01
In snow redistribution modeling, one of the most important parameters is the critical stress of snow movement, which is difficult to estimate from field data because it is influenced by various factors. In this study, a new formula for calculating the critical stress of snow movement was derived based on modeling of the ice particle sintering process and the moment balance on a snow particle. Through this formula, the influences of snow particle size, air temperature, and deposition time on the critical stress were explicitly taken into consideration. A sensitivity analysis using Sobol's method found that some of the model parameters were sensitive for the critical stress estimate. The two sensitive parameters of the sintering process model were determined by a calibration-validation procedure using snow flux data observed via FlowCapt. Based on the snow flux and meteorological data observed at the ISAW stations (http://www.iav.ch), it was shown that the formula describes well the evolution of the minimum friction wind speed required for snow motion. The new formula suggests that when snow has just reached the surface, smaller snowflakes can move more easily than larger particles. However, smaller snow particles require more force to move as sintering between the snowflakes progresses. This implies that compact snow with small particles may be harder for wind to erode, although smaller particles may have a higher chance of being suspended once they take off.
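The abstract does not reproduce the new formula itself. As a hedged point of reference, the fresh-snow limit it describes (before sintering takes hold) behaves like the classic Bagnold-type threshold, in which larger grains need a stronger wind to move; the coefficient and densities below are assumed illustrative values, not the paper's calibrated parameters:

```python
# Bagnold-type threshold friction velocity for unsintered particles:
# u*_t = A * sqrt(((rho_p - rho_a) / rho_a) * g * d).
# This captures only the particle-size dependence; the paper's formula adds
# temperature and deposition-time (sintering) effects on top.
import math

def threshold_friction_velocity(d, rho_p=917.0, rho_a=1.2, g=9.81, A=0.2):
    """Threshold friction velocity (m/s) for a grain of diameter d (m).

    rho_p: ice density, rho_a: air density; A is an empirical coefficient,
    here an assumed placeholder value.
    """
    return A * math.sqrt((rho_p - rho_a) / rho_a * g * d)

# Freshly deposited snow: larger grains require a stronger wind,
# consistent with the "smaller snowflakes move more easily" regime.
u_small = threshold_friction_velocity(1.0e-4)  # 0.1 mm grain
u_large = threshold_friction_velocity(5.0e-4)  # 0.5 mm grain
```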
Consideration of social values in the establishment of accountable care organizations in the USA.
Keren, Ron; Littlejohns, Peter
2012-01-01
The purpose of this paper is to introduce the new US health organizations called accountable care organizations (ACOs), which are expected to improve the quality and reduce the cost of healthcare for Medicare enrolees. It assesses the importance of ACOs defining and articulating the values that will underpin their strategic and clinical decision making. This paper uses a social values framework developed by Clark and Weale to consider the values relevant to ACOs. It is likely that social values could be made more explicit in a US setting than they have ever been before, via the new ACOs. Social values could start to form part of a local health economy's marketing strategy. ACOs are very new. This paper identifies that they will need to be very explicit about the values relevant to them. The development of ACOs and the articulation of social values therein may even form the basis of a meaningful dialogue on the importance of assessing value for money or cost-effectiveness in the wider US health policy environment.
Explicit formulae for Yang-Mills-Einstein amplitudes from the double copy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chiodaroli, Marco; Günaydin, Murat; Johansson, Henrik
2017-07-03
Using the double-copy construction of Yang-Mills-Einstein theories formulated in our earlier work, we obtain compact presentations for single-trace Yang-Mills-Einstein tree amplitudes with up to five external gravitons and an arbitrary number of gluons. These are written as linear combinations of color-ordered Yang-Mills trees, where the coefficients are given by color/kinematics-satisfying numerators in a Yang-Mills + φ3 theory. The construction outlined in this paper holds in general dimension and extends straightforwardly to supergravity theories. For one, two, and three external gravitons, our expressions give identical or simpler presentations of amplitudes already constructed through string-theory considerations or the scattering equations formalism. Our results are based on color/kinematics duality and gauge invariance, and strongly hint at a recursive structure underlying the single-trace amplitudes with an arbitrary number of gravitons. We also present explicit expressions for all-loop single-graviton Einstein-Yang-Mills amplitudes in terms of Yang-Mills amplitudes and, through gauge invariance, derive new all-loop amplitude relations for Yang-Mills theory.
Mothers, Intrinsic Math Motivation, Arithmetic Skills, and Math Anxiety in Elementary School
Daches Cohen, Lital; Rubinsten, Orly
2017-01-01
Math anxiety is influenced by environmental, cognitive, and personal factors. Yet, the concurrent relationships between these factors have not been examined. To this end, the current study investigated how the math anxiety of 30 sixth graders is affected by: (a) mother’s math anxiety and maternal behaviors (environmental factors); (b) children’s arithmetic skills (cognitive factors); and (c) intrinsic math motivation (personal factor). A rigorous assessment of children’s math anxiety was made by using both explicit and implicit measures. The results indicated that accessible self-representations of math anxiety, as reflected by the explicit self-report questionnaire, were strongly affected by arithmetic skills. However, unconscious cognitive constructs of math anxiety, as reflected by the numerical dot-probe task, were strongly affected by environmental factors, such as maternal behaviors and mothers’ attitudes toward math. Furthermore, the present study provided preliminary evidence of intergenerational transmission of math anxiety. The conclusions are that in order to better understand the etiology of math anxiety, multiple facets of parenting and children’s skills should be taken into consideration. Implications for researchers, parents, and educators are discussed. PMID:29180973
The Impact of Aerosol Microphysical Representation in Models on the Direct Radiative Effect
NASA Astrophysics Data System (ADS)
Ridley, D. A.; Heald, C. L.
2017-12-01
Aerosol impacts the radiative balance of the atmosphere both directly and indirectly. There is considerable uncertainty remaining in the aerosol direct radiative effect (DRE), hampering understanding of the present magnitude of anthropogenic aerosol forcing and how future changes in aerosol loading will influence climate. Computationally expensive explicit aerosol microphysics are usually reserved for modelling of the aerosol indirect radiative effects that depend upon aerosol particle number. However, the direct radiative effects of aerosol are also strongly dependent upon the aerosol size distribution, especially particles between 0.2µm - 2µm diameter. In this work, we use a consistent model framework and consistent emissions to explore the impact of prescribed size distributions (bulk scheme) relative to explicit microphysics (sectional scheme) on the aerosol radiative properties. We consider the difference in aerosol burden, water uptake, and extinction efficiency resulting from the two representations, highlighting when and where the bulk and sectional schemes diverge significantly in their estimates of the DRE. Finally, we evaluate the modelled size distributions using in-situ measurements over a range of regimes to provide constraints on both the accumulation and coarse aerosol sizes.
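The contrast between the two representations can be sketched in a few lines (a hedged illustration; the mode parameters and bin layout below are assumptions, not the model's actual configuration):

```python
# A "bulk" scheme prescribes one lognormal mode; a "sectional" scheme resolves
# the same mode across discrete log-spaced size bins, so the size-dependent
# optics (critical in the 0.2-2 um diameter range) can be evaluated per bin.
import numpy as np

def dN_dlnD(d, n_tot, d_g, sigma_g):
    """Lognormal number distribution dN/dlnD (same units as n_tot)."""
    return (n_tot / (np.sqrt(2.0 * np.pi) * np.log(sigma_g))
            * np.exp(-0.5 * (np.log(d / d_g) / np.log(sigma_g)) ** 2))

# 30 log-spaced bins spanning 0.01-10 um diameter
edges = np.logspace(np.log10(0.01), np.log10(10.0), 31)
mids = np.sqrt(edges[:-1] * edges[1:])   # geometric bin midpoints
dlnD = np.diff(np.log(edges))

# Sectional number per bin; summing over bins recovers the bulk mode's total
n_per_bin = dN_dlnD(mids, n_tot=1000.0, d_g=0.2, sigma_g=1.8) * dlnD
n_total = float(n_per_bin.sum())          # close to 1000
```

Differences in DRE between the schemes arise when processes (growth, water uptake, removal) shift the sectional bins away from the shape the bulk scheme prescribes.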
Implicit-explicit (IMEX) Runge-Kutta methods for non-hydrostatic atmospheric models
NASA Astrophysics Data System (ADS)
Gardner, David J.; Guerra, Jorge E.; Hamon, François P.; Reynolds, Daniel R.; Ullrich, Paul A.; Woodward, Carol S.
2018-04-01
The efficient simulation of non-hydrostatic atmospheric dynamics requires time integration methods capable of overcoming the explicit stability constraints on time step size arising from acoustic waves. In this work, we investigate various implicit-explicit (IMEX) additive Runge-Kutta (ARK) methods for evolving acoustic waves implicitly to enable larger time step sizes in a global non-hydrostatic atmospheric model. The IMEX formulations considered include horizontally explicit - vertically implicit (HEVI) approaches as well as splittings that treat some horizontal dynamics implicitly. In each case, the impact of solving nonlinear systems in each implicit ARK stage in a linearly implicit fashion is also explored. The accuracy and efficiency of the IMEX splittings, ARK methods, and solver options are evaluated on a gravity wave and baroclinic wave test case. HEVI splittings that treat some vertical dynamics explicitly do not show a benefit in solution quality or run time over the most implicit HEVI formulation. While splittings that implicitly evolve some horizontal dynamics increase the maximum stable step size of a method, the gains are insufficient to overcome the additional cost of solving a globally coupled system. Solving implicit stage systems in a linearly implicit manner limits the solver cost but this is offset by a reduction in step size to achieve the desired accuracy for some methods. Overall, the third-order ARS343 and ARK324 methods performed the best, followed by the second-order ARS232 and ARK232 methods.
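The core IMEX idea can be reduced to a hedged scalar sketch (first-order IMEX Euler, far simpler than the ARK schemes benchmarked here; the rate constants are illustrative assumptions):

```python
# A stiff term, standing in for acoustic waves, is treated implicitly while a
# slow term is stepped explicitly, on the test problem
# u' = lam_stiff * u + lam_slow * u.

def imex_euler_step(u, dt, lam_stiff, lam_slow):
    """Explicit in the slow term, implicit in the stiff term:
    (1 - dt * lam_stiff) * u_next = u + dt * lam_slow * u."""
    return (1.0 + dt * lam_slow) * u / (1.0 - dt * lam_stiff)

lam_stiff, lam_slow = -1.0e4, -1.0
dt = 0.01   # fully explicit Euler would need |dt * lam_stiff| <= 2, i.e. dt <= 2e-4
u = 1.0
for _ in range(100):
    u = imex_euler_step(u, dt, lam_stiff, lam_slow)
# u decays toward 0, mirroring the exact solution, despite a step size
# 50x beyond the explicit stability limit for the stiff term
```

The HEVI splittings discussed above make the same trade at the PDE level: the vertical (acoustically stiff) dynamics go into the implicit operator, and the horizontal dynamics stay explicit.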
Processing of false belief passages during natural story comprehension: An fMRI study.
Kandylaki, Katerina D; Nagels, Arne; Tune, Sarah; Wiese, Richard; Bornkessel-Schlesewsky, Ina; Kircher, Tilo
2015-11-01
The neural correlates of theory of mind (ToM) are typically studied using paradigms which require participants to draw explicit, task-related inferences (e.g., in the false belief task). In a natural setup, such as listening to stories, false belief mentalizing occurs incidentally as part of narrative processing. In our experiment, participants listened to auditorily presented stories with false belief passages (implicit false belief processing) and immediately after each story answered comprehension questions (explicit false belief processing), while neural responses were measured with functional magnetic resonance imaging (fMRI). All stories included (among other situations) one false belief condition and one closely matched control condition. For implicit ToM processing, we modeled the hemodynamic response during the false belief passages in the story and compared it to the hemodynamic response during the closely matched control passages. For implicit mentalizing, we found activation in typical ToM processing regions, that is, the angular gyrus (AG), superior medial frontal gyrus (SmFG), precuneus (PCUN), and middle temporal gyrus (MTG), as well as the inferior frontal gyrus (IFG) bilaterally. For explicit ToM, we found only AG activation. The conjunction analysis highlighted the left AG and MTG as well as the bilateral IFG as ToM processing regions common to both implicit and explicit modes. Implicit ToM processing during listening to false belief passages recruits the left SmFG and bilateral PCUN in addition to the "mentalizing network" known from explicit processing tasks. © 2015 Wiley Periodicals, Inc.
Teaching Scientists to Communicate: Evidence-based assessment for undergraduate science education
NASA Astrophysics Data System (ADS)
Mercer-Mapstone, Lucy; Kuchel, Louise
2015-07-01
Communication skills are one of five nationally recognised learning outcomes for an Australian Bachelor of Science (BSc) degree. Previous evidence indicates that the communication skills taught in Australian undergraduate science degrees are not developed sufficiently to meet the requirements of the modern-day workplace, a problem also faced in the UK and USA. Curriculum development in this area, however, hinges on first evaluating how communication skills are currently taught as a base from which to make effective changes. This study aimed to quantify the current standard of communication education within BSc degrees at Australian research-intensive universities, establishing a detailed evidential baseline for not only what but also how communication skills are being taught. We quantified which communication skills were taught and assessed explicitly, taught implicitly, or absent in a range of undergraduate science assessment tasks (n = 35) from four research-intensive Australian universities. Results indicate that 10 of the 12 core science communication skills used for evaluation were absent from more than 50% of assessment tasks, and 77.14% of all assessment tasks taught fewer than five core communication skills explicitly. The design of assessment tasks significantly affected whether communication skills were taught explicitly. Prominent trends were that communication skills in tasks aimed at non-scientific audiences were taught more explicitly than in tasks aimed at scientific audiences, and that the majority of group and multimedia tasks taught communication elements more explicitly than individual, written, or oral tasks. Implications for science communication in the BSc and further research are discussed.
Batterink, Laura; Neville, Helen
2011-11-01
The vast majority of word meanings are learned simply by extracting them from context rather than by rote memorization or explicit instruction. Although this skill is remarkable, little is known about the brain mechanisms involved. In the present study, ERPs were recorded as participants read stories in which pseudowords were presented multiple times, embedded in consistent, meaningful contexts (referred to as meaning condition, M+) or inconsistent, meaningless contexts (M-). Word learning was then assessed implicitly using a lexical decision task and explicitly through recall and recognition tasks. Overall, during story reading, M- words elicited a larger N400 than M+ words, suggesting that participants were better able to semantically integrate M+ words than M- words throughout the story. In addition, M+ words whose meanings were subsequently correctly recognized and recalled elicited a more positive ERP in a later time window compared with M+ words whose meanings were incorrectly remembered, consistent with the idea that the late positive component is an index of encoding processes. In the lexical decision task, no behavioral or electrophysiological evidence for implicit priming was found for M+ words. In contrast, during the explicit recognition task, M+ words showed a robust N400 effect. The N400 effect was dependent upon recognition performance, such that only correctly recognized M+ words elicited an N400. This pattern of results provides evidence that the explicit representations of word meanings can develop rapidly, whereas implicit representations may require more extensive exposure or more time to emerge.
Kleynen, Melanie; Braun, Susy M.; Bleijlevens, Michel H.; Lexis, Monique A.; Rasquin, Sascha M.; Halfens, Jos; Wilson, Mark R.; Beurskens, Anna J.; Masters, Rich S. W.
2014-01-01
Background Motor learning is central to domains such as sports and rehabilitation; however, often terminologies are insufficiently uniform to allow effective sharing of experience or translation of knowledge. A study using a Delphi technique was conducted to ascertain level of agreement between experts from different motor learning domains (i.e., therapists, coaches, researchers) with respect to definitions and descriptions of a fundamental conceptual distinction within motor learning, namely implicit and explicit motor learning. Methods A Delphi technique was embedded in multiple rounds of a survey designed to collect and aggregate informed opinions of 49 international respondents with expertise related to motor learning. The survey was administered via an online survey program and accompanied by feedback after each round. Consensus was considered to be reached if ≥70% of the experts agreed on a topic. Results Consensus was reached with respect to definitions of implicit and explicit motor learning, and seven common primary intervention strategies were identified in the context of implicit and explicit motor learning. Consensus was not reached with respect to whether the strategies promote implicit or explicit forms of learning. Discussion The definitions and descriptions agreed upon may aid translation and transfer of knowledge between domains in the field of motor learning. Empirical and clinical research is required to confirm the accuracy of the definitions and to explore the feasibility of the strategies that were identified in research, everyday practice and education. PMID:24968228
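The ≥70% agreement rule used in the Delphi rounds above reduces to a one-line computation. The sketch below is a minimal illustration; the vote counts are hypothetical.

```python
# Minimal sketch of the >=70% Delphi consensus rule; votes are hypothetical.

def consensus_reached(votes, threshold=0.70):
    """votes: list of True/False agreement responses from one Delphi round."""
    agreement = sum(votes) / len(votes)
    return agreement >= threshold, agreement

# 37 of 49 experts agree (~75.5%): consensus reached.
reached, level = consensus_reached([True] * 37 + [False] * 12)
# 30 of 49 experts agree (~61.2%): consensus not reached.
not_reached, _ = consensus_reached([True] * 30 + [False] * 19)
```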
Challenging some assumptions about empathy.
Gallagher, Peter; Moriarty, Helen; Huthwaite, Mark; Lim, Bee
2017-12-01
In New Zealand, little if any nursing or medical curriculum time is specifically devoted to the enhancement of empathy. If being empathic is important in the context of patient care, it is a quality that is either already present in students or is learned by students during their practicum in the company of experienced clinicians. This study aimed to compare self-reported empathy ratings between different groups of medical students and one cohort of nursing students who were either exposed or not exposed to explicit empathy training or learning in clinical settings in the presence of patients. The Jefferson Scale of Physician Empathy (JSPE) was completed before and after groups of medical and nursing students had been exposed to various extended periods of practicum. Some medical student cohorts undertook brief empathy training, whereas others had no exposure. The nursing student cohort had no formal, explicit empathy training. Irrespective of profession, length of practicum or exposure to specific empathy training, there were no significant differences in the self-reported JSPE scores across the seven different cohorts of students, suggesting that empathy is a quality that is already present in students or is learned during their practicum. If empathy is caught rather than taught, then brief efforts to enhance empathy may be futile. To optimise the inherent empathic qualities of aspirant health professionals, explicit consideration should be given to how empathy is influenced by the practicum experience. © 2017 John Wiley & Sons Ltd and The Association for the Study of Medical Education.
Improving our legacy: Incorporation of adaptive management into state wildlife action plans
Fontaine, J.J.
2011-01-01
The loss of biodiversity is a mounting concern, but despite numerous attempts there are few large scale conservation efforts that have proven successful in reversing current declines. Given the challenge of biodiversity conservation, there is a need to develop strategic conservation plans that address species declines even with the inherent uncertainty in managing multiple species in complex environments. In 2002, the State Wildlife Grant program was initiated to fulfill this need, and while not explicitly outlined by Congress follows the fundamental premise of adaptive management, 'Learning by doing'. When action is necessary, but basic biological information and an understanding of appropriate management strategies are lacking, adaptive management enables managers to be proactive in spite of uncertainty. However, regardless of the strengths of adaptive management, the development of an effective adaptive management framework is challenging. In a review of 53 State Wildlife Action Plans, I found a keen awareness by planners that adaptive management was an effective method for addressing biodiversity conservation, but the development and incorporation of explicit adaptive management approaches within each plan remained elusive. Only ~25% of the plans included a framework for how adaptive management would be implemented at the project level within their state. There was, however, considerable support across plans for further development and implementation of adaptive management. By furthering the incorporation of adaptive management principles in conservation plans and explicitly outlining the decision making process, states will be poised to meet the pending challenges to biodiversity conservation.
Improving Explicit Congestion Notification with the Mark-Front Strategy
NASA Technical Reports Server (NTRS)
Liu, Chunlei; Jain, Raj
2001-01-01
Delivering congestion signals is essential to the performance of networks. Current TCP/IP networks use packet losses to signal congestion. Packet losses not only reduce TCP performance but also add large delays. Explicit Congestion Notification (ECN) delivers a faster indication of congestion and has better performance. However, current ECN implementations mark the packet from the tail of the queue. In this paper, we propose the mark-front strategy to send an even faster congestion signal. We show that the mark-front strategy reduces buffer size requirements, improves link efficiency, and provides better fairness among users. Simulation results that verify our analysis are also presented.
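The intuition behind mark-front is simple queueing arithmetic: a mark placed on the packet at the head of the queue departs on the next transmission, while a tail mark must wait behind every buffered packet. A toy sketch, with hypothetical queue lengths:

```python
# Toy illustration of mark-front vs. conventional tail marking for ECN:
# the congestion signal's delivery delay is the number of packets served
# before the marked packet departs. All numbers are hypothetical.

from collections import deque

def signal_delay(queue_len, strategy, service_time=1.0):
    """Service times until the ECN mark reaches the receiver."""
    queue = deque(range(queue_len))      # packets already buffered
    if strategy == "mark-front":
        marked = queue[0]                # mark the packet about to depart
    else:                                # tail marking
        marked = queue[-1]
    return (queue.index(marked) + 1) * service_time

front = signal_delay(20, "mark-front")   # 1 service time
tail = signal_delay(20, "mark-tail")     # 20 service times
```

The gap grows with queue length, which is exactly when the congestion signal is most urgent.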
Third-order 2N-storage Runge-Kutta schemes with error control
NASA Technical Reports Server (NTRS)
Carpenter, Mark H.; Kennedy, Christopher A.
1994-01-01
A family of four-stage third-order explicit Runge-Kutta schemes is derived that requires only two storage locations and has desirable stability characteristics. Error control is achieved by embedding a second-order scheme within the four-stage procedure. Certain schemes are identified that are as efficient and accurate as conventional embedded schemes of comparable order and require fewer storage locations.
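The 2N-storage mechanism can be sketched with Williamson's classic three-stage, third-order scheme (not the four-stage embedded schemes derived in the report): each stage overwrites a single accumulator, so only the solution array and that accumulator are stored.

```python
# Sketch of a 2N-storage Runge-Kutta step. Coefficients are Williamson's
# well-known three-stage, third-order scheme, used here only to illustrate
# the two-storage-location structure.

import math

A = [0.0, -5.0 / 9.0, -153.0 / 128.0]
B = [1.0 / 3.0, 15.0 / 16.0, 8.0 / 15.0]

def rk3_2n_step(f, y, h):
    """One step for an autonomous ODE y' = f(y), using one accumulator."""
    dy = 0.0                          # the second of the two storage locations
    for a, b in zip(A, B):
        dy = a * dy + h * f(y)
        y = y + b * dy
    return y

# Accuracy check on y' = y, y(0) = 1: the exact value at t = 1 is e.
y, h = 1.0, 0.01
for _ in range(100):
    y = rk3_2n_step(lambda u: u, y, h)
error = abs(y - math.e)               # roughly 1e-7 at this step size
```

For PDE semi-discretizations where the state vector is large, replacing the scalars `y` and `dy` with arrays preserves the same two-array footprint, which is the scheme family's main selling point.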
2013-06-01
ABBREVIATIONS: ANSI, American National Standards Institute; ASIS, American Society of Industrial Security; CCTV, Closed Circuit Television; CONOPS, ...is globally recognized for the development and maintenance of standards. ASTM defines a specification as an explicit set of requirements...www.rkb.us/saver/. One of the SAVER reports, titled CCTV Technology Handbook, has a chapter on system design. The report uses terms like functional
Code of Federal Regulations, 2014 CFR
2014-07-01
... evaluate requirements with respect to the environment; d. Ensure consideration of: (1) Requirements of... other similar broad-gauge descriptive factors; b. Identification of the important environmental issues... in its internal affairs and its prerogative to evaluate requirements with respect to the environment...
Code of Federal Regulations, 2013 CFR
2013-07-01
... evaluate requirements with respect to the environment; d. Ensure consideration of: (1) Requirements of... other similar broad-gauge descriptive factors; b. Identification of the important environmental issues... in its internal affairs and its prerogative to evaluate requirements with respect to the environment...
Höfener, Sebastian; Bischoff, Florian A; Glöss, Andreas; Klopper, Wim
2008-06-21
In recent years, Slater-type geminals (STGs) have been used with great success to expand the first-order wave function in an explicitly-correlated perturbation theory. The present work reports on this theory's implementation in the framework of the Turbomole suite of programs. A formalism is presented for evaluating all of the necessary molecular two-electron integrals by means of the Obara-Saika recurrence relations, which can be applied when the STG is expressed as a linear combination of a small number (n) of Gaussians (STG-nG geminal basis). In the Turbomole implementation of the theory, density fitting is employed and a complementary auxiliary basis set (CABS) is used for the resolution-of-the-identity (RI) approximation of explicitly-correlated theory. By virtue of this RI approximation, the calculation of molecular three- and four-electron integrals is avoided. An approximation is invoked to avoid the two-electron integrals over the commutator between the operators of kinetic energy and the STG. This approximation consists of computing commutators between matrices in place of operators. Integrals over commutators between operators would have occurred if the theory had been formulated and implemented as proposed originally. The new implementation in Turbomole was tested by performing a series of calculations on rotational conformers of the alkanols n-propanol through n-pentanol. Basis-set requirements concerning the orbital basis, the auxiliary basis set for density fitting and the CABS were investigated. Furthermore, various (constrained) optimizations of the amplitudes of the explicitly-correlated double excitations were studied. These amplitudes can be optimized in orbital-variant and orbital-invariant manners, or they can be kept fixed at the values governed by the rational generator approach, that is, by the electron cusp conditions.
Electron-correlation effects beyond the level of second-order perturbation theory were accounted for by conventional coupled-cluster calculations with single, double and perturbative triple excitations [CCSD(T)]. The explicitly-correlated perturbation theory results were combined with CCSD(T) results and compared with literature data obtained by basis-set extrapolation.
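The STG-nG expansion mentioned above, a Slater-type geminal approximated by a small linear combination of Gaussians in the interelectronic distance, can be sketched as a least-squares fit. The exponents below are an illustrative even-tempered choice, not the optimized values used in Turbomole.

```python
# Sketch of the STG-nG idea: fit exp(-gamma*r) by a linear combination of
# Gaussians exp(-alpha_i * r**2). Exponents are hypothetical, even-tempered;
# linear coefficients come from an ordinary least-squares fit on a grid.

import numpy as np

gamma = 1.0
r = np.linspace(0.05, 4.0, 200)
stg = np.exp(-gamma * r)                     # the Slater-type geminal

# STG-6G: six illustrative even-tempered Gaussian exponents.
alphas = np.array([0.05, 0.2, 0.8, 3.2, 12.8, 51.2])
G = np.exp(-np.outer(r**2, alphas))          # design matrix, shape (200, 6)
coef, *_ = np.linalg.lstsq(G, stg, rcond=None)
fit = G @ coef

max_err = np.max(np.abs(fit - stg))          # small away from the r=0 cusp
```

The fit degrades near r = 0, where the Slater function has a cusp that no finite Gaussian sum reproduces; in practice the exponents and the number n are chosen to balance accuracy against integral cost.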
Neuropathology analysis as an endpoint during nonclinical efficacy and toxicity studies is a challenging prospect that requires trained personnel and particular equipment to achieve optimal results. Accordingly, many regulatory agencies have produced explicit guidelines for desig...
76 FR 60838 - Debarment, Suspension, and Ineligibility of Contractors
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-30
... GOVERNMENT ACCOUNTABILITY OFFICE Debarment, Suspension, and Ineligibility of Contractors AGENCY... of government contractors. As a legislative branch agency, GAO is not subject to the requirements of... (hereinafter, contractors) who are responsible. However, GAO's Procurement Order has not explicitly referenced...
Understanding the vertical equity judgements underpinning health inequality measures.
Allanson, Paul; Petrie, Dennis
2014-11-01
The choice of income-related health inequality measures in comparative studies is often determined by custom and analytical concerns, without much explicit consideration of the vertical equity judgements underlying alternative measures. This note employs an inequality map to illustrate how these judgements determine the ranking of populations by health inequality. In particular, it is shown that relative indices of inequality in health attainments and shortfalls embody distinct vertical equity judgments, where each may represent ethically defensible positions in specific contexts. Further research is needed to explore people's preferences over distributions of income and health. Copyright © 2013 John Wiley & Sons, Ltd.
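The note's central point, that attainment-based and shortfall-based relative inequality measures can rank the same populations differently, is easy to see numerically. The populations and the relative-range summary below are hypothetical illustrations, not the indices analyzed in the paper.

```python
# Hypothetical illustration: inequality in health attainments vs. shortfalls
# can reverse the ranking of two populations. Health is scored on [0, 1];
# inequality is summarized by the relative range (max - min) / mean.

def rel_range(x):
    return (max(x) - min(x)) / (sum(x) / len(x))

pop_a = [0.4, 0.6]                    # lower mean health, moderate spread
pop_b = [0.58, 0.82]                  # higher mean health, larger spread

att_a, att_b = rel_range(pop_a), rel_range(pop_b)
short_a = rel_range([1 - h for h in pop_a])   # shortfalls 1 - h
short_b = rel_range([1 - h for h in pop_b])

# Attainment view: B looks LESS unequal than A.
# Shortfall view:  B looks MORE unequal than A.
```

Because the two views embody different vertical equity judgments, neither ranking is "the" correct one, which is the paper's argument for making the judgment explicit.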
New, more accurate calculations of the ground state potential energy surface of H3(+).
Pavanello, Michele; Tung, Wei-Cheng; Leonarski, Filip; Adamowicz, Ludwik
2009-02-21
Explicitly correlated Gaussian functions with floating centers have been employed to recalculate the ground state potential energy surface (PES) of the H3(+) ion with much higher accuracy than was previously achieved. The nonlinear parameters of the Gaussians (i.e., the exponents and the centers) have been variationally optimized with a procedure employing the analytical gradient of the energy with respect to these parameters. The basis sets for calculating new PES points were guessed from the points already calculated. This allowed us to considerably speed up the calculations and achieve very high accuracy of the results.
The any particle molecular orbital grid-based Hartree-Fock (APMO-GBHF) approach
NASA Astrophysics Data System (ADS)
Posada, Edwin; Moncada, Félix; Reyes, Andrés
2018-02-01
The any particle molecular orbital grid-based Hartree-Fock approach (APMO-GBHF) is proposed as an initial step to perform multi-component post-Hartree-Fock, explicitly correlated, and density functional theory methods without basis set errors. The method has been applied to a number of electronic and multi-species molecular systems. Results of these calculations show that the APMO-GBHF total energies are comparable with those obtained at the APMO-HF complete basis set limit. In addition, results reveal a considerable improvement in the description of the nuclear cusps of electronic and non-electronic densities.
NASA Astrophysics Data System (ADS)
Bañados, Máximo; Düring, Gustavo; Faraggi, Alberto; Reyes, Ignacio A.
2017-08-01
We study the thermodynamic phase diagram of three-dimensional sl(N;R) higher spin black holes. By analyzing the semiclassical partition function we uncover a rich structure that includes Hawking-Page transitions to the AdS3 vacuum, first order phase transitions among black hole states, and a second order critical point. Our analysis is explicit for N = 4 but we extrapolate some of our conclusions to arbitrary N. In particular, we argue that even N is stable in the ensemble under consideration but odd N is not.
NASA Technical Reports Server (NTRS)
Chang, C. I.
1989-01-01
An account is given of approaches that have emerged as useful in the incorporation of thermal loading considerations into advanced composite materials-based aerospace structural design practices. Sources of structural heating encompass not only propulsion system heat and aerodynamic surface heating at supersonic speeds, but the growing possibility of intense thermal fluxes from directed-energy weapons. The composite materials in question range from intrinsically nonheat-resistant polymer matrix systems to metal-matrix composites, and increasingly to such ceramic-matrix composites as carbon/carbon, which are explicitly intended for elevated temperature operation.
A Green's function formulation for a nonlinear potential flow solution applicable to transonic flow
NASA Technical Reports Server (NTRS)
Baker, A. J.; Fox, C. H., Jr.
1977-01-01
Routine determination of inviscid subsonic flow fields about wing-body-tail configurations employing a Green's function approach for numerical solution of the perturbation velocity potential equation is successfully extended into the high subsonic subcritical flow regime and into the shock-free supersonic flow regime. A modified Green's function formulation, valid throughout a range of Mach numbers including transonic, that takes an explicit accounting of the intrinsic nonlinearity in the parent governing partial differential equations is developed. Some considerations pertinent to flow field predictions in the transonic flow regime are discussed.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. Senate Committee on the Judiciary.
This hearing addressed Senate Bill 1384, which deals with the copyright issue and seeks to alter the 5-to-4 decision of the Supreme Court of the United States in the Mills Music case. The question under consideration is whether the law should be made explicit to the effect that the class of intended beneficiaries of all royalties under the…
Automated detection of slum area change in Hyderabad, India using multitemporal satellite imagery
NASA Astrophysics Data System (ADS)
Kit, Oleksandr; Lüdeke, Matthias
2013-09-01
This paper presents an approach to automated identification of slum area change patterns in Hyderabad, India, using multi-year and multi-sensor very high resolution satellite imagery. It relies upon a lacunarity-based slum detection algorithm, combined with Canny- and LSD-based imagery pre-processing routines. This method outputs plausible and spatially explicit slum locations for the whole urban agglomeration of Hyderabad in years 2003 and 2010. The results indicate a considerable growth of area occupied by slums between these years and allow identification of trends in slum development in this urban agglomeration.
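The lacunarity measure underlying the slum detection algorithm is a gliding-box statistic: at scale r it is Lambda(r) = E[S^2] / E[S]^2 over the box masses S, equal to 1 for a perfectly homogeneous pattern and larger for clumped built-up textures. A minimal sketch on toy binary maps (not the paper's full pipeline, which also includes Canny- and LSD-based pre-processing):

```python
# Sketch of gliding-box lacunarity on a binary built-up map. The images
# are toy data; the published method applies this to pre-processed
# satellite imagery.

import numpy as np

def lacunarity(img, r):
    """Lambda(r) = E[S^2] / E[S]^2 for gliding boxes of side r."""
    h, w = img.shape
    masses = np.array([img[i:i + r, j:j + r].sum()
                       for i in range(h - r + 1)
                       for j in range(w - r + 1)], dtype=float)
    return (masses**2).mean() / masses.mean()**2

uniform = np.ones((16, 16), dtype=int)        # homogeneous cover
clustered = np.zeros((16, 16), dtype=int)
clustered[:8, :8] = 1                         # one dense block

lam_uniform = lacunarity(uniform, 4)          # exactly 1.0
lam_clustered = lacunarity(clustered, 4)      # > 1.0: gappy, clumped texture
```

Thresholding such lacunarity maps by scale is what lets the method label morphologically irregular, dense areas as candidate slums.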
Axisymmetric buckling of the circular graphene sheets with the nonlocal continuum plate model
NASA Astrophysics Data System (ADS)
Farajpour, A.; Mohammadi, M.; Shahidi, A. R.; Mahzoon, M.
2011-08-01
In this article, the buckling behavior of nanoscale circular plates under uniform radial compression is studied. Small-scale effect is taken into consideration. Using nonlocal elasticity theory the governing equations are derived for the circular single-layered graphene sheets (SLGS). Explicit expressions for the buckling loads are obtained for clamped and simply supported boundary conditions. It is shown that nonlocal effects play an important role in the buckling of circular nanoplates. The effects of the small scale on the buckling loads considering various parameters such as the radius of the plate and mode numbers are investigated.
NASA Technical Reports Server (NTRS)
Hazra, Rajeeb; Viles, Charles L.; Park, Stephen K.; Reichenbach, Stephen E.; Sieracki, Michael E.
1992-01-01
Consideration is given to a model-based method for estimating the spatial frequency response of a digital-imaging system (e.g., a CCD camera) that is modeled as a linear, shift-invariant image acquisition subsystem that is cascaded with a linear, shift-variant sampling subsystem. The method characterizes the 2D frequency response of the image acquisition subsystem to beyond the Nyquist frequency by accounting explicitly for insufficient sampling and the sample-scene phase. Results for simulated systems and a real CCD-based epifluorescence microscopy system are presented to demonstrate the accuracy of the method.
Updating the OMERACT filter: discrimination and feasibility.
Wells, George; Beaton, Dorcas E; Tugwell, Peter; Boers, Maarten; Kirwan, John R; Bingham, Clifton O; Boonen, Annelies; Brooks, Peter; Conaghan, Philip G; D'Agostino, Maria-Antonietta; Dougados, Maxime; Furst, Daniel E; Gossec, Laure; Guillemin, Francis; Helliwell, Philip; Hewlett, Sarah; Kvien, Tore K; Landewé, Robert B; March, Lyn; Mease, Philip J; Ostergaard, Mikkel; Simon, Lee; Singh, Jasvinder A; Strand, Vibeke; van der Heijde, Désirée M
2014-05-01
The "Discrimination" part of the OMERACT Filter asks whether a measure discriminates between situations that are of interest. "Feasibility" in the OMERACT Filter encompasses the practical considerations of using an instrument, including its ease of use, time to complete, monetary costs, and interpretability of the question(s) included in the instrument. Both the Discrimination and Feasibility parts of the filter have been helpful but were agreed on primarily by consensus of OMERACT participants rather than through explicit evidence-based guidelines. In Filter 2.0 we wanted to improve this definition and provide specific guidance and advice to participants.
Understanding and preventing military suicide.
Bryan, Craig J; Jennings, Keith W; Jobes, David A; Bradley, John C
2012-01-01
The continual rise in the U.S. military's suicide rate since 2004 is one of the most vexing issues currently facing military leaders, mental health professionals, and suicide experts. Despite considerable efforts to address this problem, however, suicide rates have not decreased. The authors consider possible reasons for this frustrating reality, and question common assumptions and approaches to military suicide prevention. They further argue that suicide prevention efforts that more explicitly embrace the military culture and implement evidence-based strategies across the full spectrum of prevention and treatment could improve success. Several recommendations for augmenting current efforts to prevent military suicide are proposed.
How to Connect Cardiac Excitation to the Atomic Interactions of Ion Channels.
Silva, Jonathan R
2018-01-23
Many have worked to create cardiac action potential models that explicitly represent atomic-level details of ion channel structure. Such models have the potential to define new therapeutic directions and to show how nanoscale perturbations to channel function predispose patients to deadly cardiac arrhythmia. However, there have been significant experimental and theoretical barriers that have limited model usefulness. Recently, many of these barriers have come down, suggesting that considerable progress toward creating these long-sought models may be possible in the near term. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
An implicit dispersive transport algorithm for the US Geological Survey MOC3D solute-transport model
Kipp, K.L.; Konikow, Leonard F.; Hornberger, G.Z.
1998-01-01
This report documents an extension to the U.S. Geological Survey MOC3D transport model that incorporates an implicit-in-time difference approximation for the dispersive transport equation, including source/sink terms. The original MOC3D transport model (Version 1) uses the method of characteristics to solve the transport equation on the basis of the velocity field. The original MOC3D solution algorithm incorporates particle tracking to represent advective processes and an explicit finite-difference formulation to calculate dispersive fluxes. The new implicit procedure eliminates several stability criteria required for the previous explicit formulation. This allows much larger transport time increments to be used in dispersion-dominated problems. The decoupling of advective and dispersive transport in MOC3D, however, is unchanged. With the implicit extension, the MOC3D model is upgraded to Version 2. A description of the numerical method of the implicit dispersion calculation, the data-input requirements and output options, and the results of simulator testing and evaluation are presented. Version 2 of MOC3D was evaluated for the same set of problems used for verification of Version 1. These test results indicate that the implicit calculation of Version 2 matches the accuracy of Version 1, yet is more efficient than the explicit calculation for transport problems that are characterized by a grid Peclet number less than about 1.0.
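The stability advantage that motivates the implicit extension can be demonstrated on a one-dimensional analogue: explicit (forward-in-time, centered-in-space) diffusion blows up once the diffusion number d*dt/dx^2 exceeds 1/2, while backward-Euler remains stable at the same step. The grid and coefficients below are hypothetical, not MOC3D's.

```python
# Explicit vs. implicit time stepping for 1D diffusion (hypothetical grid).
# With D = d*dt/dx**2 = 1.0 > 0.5 the explicit scheme is unstable; the
# backward-Euler scheme remains bounded at the same step size.

import numpy as np

n, d, dx, dt = 20, 1.0, 1.0, 1.0
u0 = np.zeros(n); u0[n // 2] = 1.0        # initial concentration spike

lap = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
       + np.diag(np.ones(n - 1), -1))     # 1D Laplacian, zero boundaries

u_exp, u_imp = u0.copy(), u0.copy()
A = np.eye(n) - d * dt / dx**2 * lap      # backward-Euler system matrix
for _ in range(50):
    u_exp = u_exp + d * dt / dx**2 * lap @ u_exp      # explicit update
    u_imp = np.linalg.solve(A, u_imp)                 # implicit solve

explicit_blows_up = np.max(np.abs(u_exp)) > 1e3
implicit_stable = np.max(np.abs(u_imp)) <= 1.0
```

The price of the implicit step is the linear solve each time step, which is why the report emphasizes that the gain matters most for dispersion-dominated (low grid Peclet number) problems.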
The Role Of Nonprofit Hospitals In Identifying And Addressing Health Inequities In Cities.
Carroll-Scott, Amy; Henson, Rosie Mae; Kolker, Jennifer; Purtle, Jonathan
2017-06-01
For nonprofit hospitals to maintain their tax-exempt status, the Affordable Care Act requires them to conduct a community health needs assessment, in which they evaluate the health needs of the community they serve, and to create an implementation strategy, in which they propose ways to address these needs. We explored the extent to which nonprofit urban hospitals identified equity among the health needs of their communities and proposed health equity strategies to address this need. We conducted a content analysis of publicly available community health needs assessments and implementation strategies from 179 hospitals in twenty-eight US cities in the period August-December 2016. All of the needs assessments included at least one implicit health equity term (such as disparities, disadvantage, poor, or minorities), while 65 percent included at least one explicit health equity term (equity, health equity, inequity, or health inequity). Thirty-five percent of implementation strategies included one or more explicit health equity terms, but only 9 percent included an explicit activity to promote health equity. While needs assessment reporting requirements have the potential to encourage urban nonprofit hospitals to address health inequities in their communities, hospitals need incentives and additional capacity to invest in strategies that address the underlying structural social and economic conditions that cause health inequities. Project HOPE—The People-to-People Health Foundation, Inc.
Hysteretic behavior using the explicit material point method
NASA Astrophysics Data System (ADS)
Sofianos, Christos D.; Koumousis, Vlasis K.
2018-05-01
The material point method (MPM) is an advancement of the particle-in-cell method, in which Lagrangian bodies are discretized by a number of material points that hold all the properties and the state of the material. All internal variables (stress, strain, velocity, etc.) that specify the current state and are required to advance the solution are stored in the material points. A background grid is employed to solve the governing equations by interpolating the material point data to the grid. The derived momentum conservation equations are solved at the grid nodes and information is transferred back to the material points; the background grid is then reset, ready for the next iteration. In this work, the standard explicit MPM is extended to account for smooth elastoplastic material behavior with mixed isotropic and kinematic hardening and stiffness and strength degradation. The strains are decomposed into an elastic and an inelastic part according to the strain decomposition rule. To account for the different phases during elastic loading or unloading and to smooth the transition from the elastic to the inelastic regime, two Heaviside-type functions are introduced. These act as switches and incorporate the yield function and the hardening laws to control the whole cyclic behavior. A single expression is thus established for the plastic multiplier over the whole range of stresses. This obviates the need for a piecewise approach and a demanding bookkeeping mechanism, especially for multilinear models that account for stiffness and strength degradation. The final form of the constitutive stress rate-strain rate relation incorporates the tangent modulus of elasticity, which now includes the Heaviside functions and gathers all the governing behavior, considerably facilitating the simulation of nonlinear response in the MPM framework.
Numerical results are presented that validate the proposed formulation in the context of the MPM in comparison with finite element method and experimental results.
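The Heaviside-switch idea can be sketched in one dimension: a single stress-update expression covers both the elastic and the plastic phase, with the switch H replacing the usual piecewise branch. The material constants below are hypothetical, and only linear isotropic hardening is included, a much simpler setting than the paper's mixed-hardening, degradation-capable model.

```python
# 1D sketch of a Heaviside-switched elastoplastic stress update with linear
# isotropic hardening. Constants are hypothetical illustration values.

E, H_mod, sigma_y = 200.0, 20.0, 1.0   # elastic modulus, hardening, yield

def stress_update(sigma, alpha, d_eps):
    trial = sigma + E * d_eps                        # elastic trial stress
    f = abs(trial) - (sigma_y + H_mod * alpha)       # yield function
    H = 1.0 if f > 0.0 else 0.0                      # Heaviside switch
    d_lambda = H * f / (E + H_mod)                   # single expression,
    sign = 1.0 if trial >= 0.0 else -1.0             # valid in both phases
    return trial - E * d_lambda * sign, alpha + d_lambda

s, a = stress_update(0.0, 0.0, 0.004)    # trial 0.8 < 1.0: purely elastic
s2, a2 = stress_update(s, a, 0.004)      # trial 1.6 > 1.0: plastic return
```

In the elastic phase H = 0 kills the plastic multiplier; in the plastic phase the same formula performs the return mapping, so no branch bookkeeping is needed.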
NASA Technical Reports Server (NTRS)
Jost, Gabriele; Labarta, Jesus; Gimenez, Judit
2004-01-01
With the current trend in parallel computer architectures towards clusters of shared memory symmetric multi-processors, parallel programming techniques have evolved that support parallelism beyond a single level. When comparing the performance of applications based on different programming paradigms, it is important to differentiate between the influence of the programming model itself and other factors, such as implementation specific behavior of the operating system (OS) or architectural issues. Rewriting a large scientific application to employ a new programming paradigm is usually a time-consuming and error-prone task. Before embarking on such an endeavor it is important to determine that there is really a gain that would not be possible with the current implementation. A detailed performance analysis is crucial to clarify these issues. The multilevel programming paradigms considered in this study are hybrid MPI/OpenMP, MLP, and nested OpenMP. The hybrid MPI/OpenMP approach is based on using MPI [7] for the coarse grained parallelization and OpenMP [9] for fine grained loop level parallelism. The MPI programming paradigm assumes a private address space for each process. Data is transferred by explicitly exchanging messages via calls to the MPI library. This model was originally designed for distributed memory architectures but is also suitable for shared memory systems. The second paradigm under consideration is MLP, which was developed by Taft. The approach is similar to MPI/OpenMP, using a mix of coarse grain process level parallelization and loop level OpenMP parallelization. As is the case with MPI, a private address space is assumed for each process. The MLP approach was developed for ccNUMA architectures and explicitly takes advantage of the availability of shared memory. A shared memory arena which is accessible by all processes is required. Communication is done by reading from and writing to the shared memory.
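The two communication styles contrasted above, explicit message passing (MPI-style) versus a commonly visible shared arena (MLP/OpenMP-style), can be sketched with threads standing in for processes. This is only an analogy; the real paradigms involve separate processes, MPI library calls, and OS-level shared memory segments.

```python
# Toy contrast of the two paradigms, with threads standing in for processes.
# Message passing: data moves only through an explicit send/receive channel.
# Shared memory: workers write directly into a commonly visible arena.

import threading
import queue

# --- message passing (MPI-style) ---
channel = queue.Queue()

def mp_worker():
    channel.put(sum(range(100)))      # "send" a partial result

t = threading.Thread(target=mp_worker)
t.start()
t.join()
mp_result = channel.get()             # "receive" it

# --- shared memory (MLP/OpenMP-style) ---
arena = [0, 0]                        # the shared arena

def shm_worker(slot):
    arena[slot] = sum(range(100))     # write in place, no message

workers = [threading.Thread(target=shm_worker, args=(i,)) for i in range(2)]
for w in workers:
    w.start()
for w in workers:
    w.join()
shm_result = sum(arena)
```

The performance question the paper studies is precisely which of these data-movement disciplines (and their hybrids) maps best onto clusters of shared-memory nodes.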
Spectrum analysis on quality requirements consideration in software design documents.
Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji
2013-12-01
Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source codes and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the consideration of quality requirements in a requirements document is carried over into its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
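The paper's spectrum technique is not detailed in this abstract, but the general idea — representing a document numerically by how much it mentions each quality characteristic, so that a requirements document and a design document can be compared — can be sketched as follows. The term list (a small ISO/IEC 25010-style vocabulary) and the cosine comparison are illustrative assumptions, not the authors' actual method.

```python
import math
import re
from collections import Counter

# Illustrative quality-characteristic vocabulary (an assumption, not the
# paper's actual term set).
QUALITY_TERMS = ["security", "performance", "usability",
                 "reliability", "portability"]

def spectrum(document):
    """Return a normalized frequency vector (a 'spectrum') over the
    quality-characteristic terms found in the document text."""
    words = re.findall(r"[a-z]+", document.lower())
    counts = Counter(w for w in words if w in QUALITY_TERMS)
    total = sum(counts.values()) or 1  # avoid division by zero
    return [counts[t] / total for t in QUALITY_TERMS]

def spectrum_similarity(req_doc, design_doc):
    """Cosine similarity between the two spectra: 1.0 means the design
    document emphasizes quality requirements in the same proportions
    as the requirements document; 0.0 means no overlap."""
    a, b = spectrum(req_doc), spectrum(design_doc)
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    if na == 0.0 or nb == 0.0:
        return 0.0
    return dot / (na * nb)
```

A low similarity would flag a design document that neglects quality concerns the requirements document emphasized, which is the kind of check the method automates.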
Feedback-related brain activity predicts learning from feedback in multiple-choice testing.
Ernst, Benjamin; Steinhauser, Marco
2012-06-01
Different event-related potentials (ERPs) have been shown to correlate with learning from feedback in decision-making tasks and with learning in explicit memory tasks. In the present study, we investigated which ERPs predict learning from corrective feedback in a multiple-choice test, which combines elements from both paradigms. Participants worked through sets of multiple-choice items of a Swahili-German vocabulary task. Whereas the initial presentation of an item required the participants to guess the answer, corrective feedback could be used to learn the correct response. Initial analyses revealed that corrective feedback elicited components related to reinforcement learning (FRN), as well as to explicit memory processing (P300) and attention (early frontal positivity). However, only the P300 and early frontal positivity were positively correlated with successful learning from corrective feedback, whereas the FRN was even larger when learning failed. These results suggest that learning from corrective feedback crucially relies on explicit memory processing and attentional orienting to corrective feedback, rather than on reinforcement learning.
NASA Astrophysics Data System (ADS)
Kang, S.; Muralikrishnan, S.; Bui-Thanh, T.
2017-12-01
We propose IMEX HDG-DG schemes for Euler systems on the cubed sphere. Of interest is subsonic flow, where the speed of the acoustic waves is faster than that of the nonlinear advection. In order to simulate these flows efficiently, we split the governing system into a stiff part describing the fast waves and a non-stiff part associated with nonlinear advection. The former is discretized implicitly with an HDG method, while an explicit Runge-Kutta DG discretization is employed for the latter. The proposed IMEX HDG-DG framework: 1) facilitates high-order solutions in both time and space; 2) avoids overly small time step sizes; 3) requires only one linear system solve per time step; and 4) relative to DG, generates a smaller and sparser linear system while promoting further parallelism owing to the HDG discretization. Numerical results for various test cases demonstrate that our methods are comparable to explicit Runge-Kutta DG schemes in terms of accuracy, while allowing for much larger time step sizes.
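The IMEX idea behind this scheme — treat the stiff fast-wave part implicitly and the non-stiff advective part explicitly, paying one linear solve per step — can be illustrated on the scalar model problem y' = lam*y + N(y). The sketch below uses first-order IMEX Euler on this toy problem, not the paper's HDG-DG discretization; the splitting into `lam` (stiff, implicit) and `nonlinear` (non-stiff, explicit) mirrors the splitting described in the abstract.

```python
def imex_euler(y0, lam, nonlinear, dt, nsteps):
    """First-order IMEX Euler for y' = lam*y + N(y).

    The stiff linear term lam*y is treated implicitly (backward Euler)
    and the non-stiff term N(y) explicitly (forward Euler), so each step
    solves the linear update
        (1 - dt*lam) * y_new = y + dt*N(y)
    -- a single linear solve per time step, as in the IMEX framework."""
    y = y0
    for _ in range(nsteps):
        y = (y + dt * nonlinear(y)) / (1.0 - dt * lam)
    return y
```

The payoff is stability at large steps: with lam = -1000 and dt = 0.01, fully explicit Euler has amplification factor |1 + dt*lam| = 9 and blows up, while the implicit treatment damps the stiff mode, so the step size can be chosen for the slow nonlinear dynamics alone.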
Hicks, Lindsey L; McNulty, James K; Meltzer, Andrea L; Olson, Michael A
2016-06-01
A strong predisposition to engage in sexual intercourse likely evolved in humans because sex is crucial to reproduction. Given that meeting interpersonal preferences tends to promote positive relationship evaluations, sex within a relationship should be positively associated with relationship satisfaction. Nevertheless, prior research has been inconclusive in demonstrating such a link, with longitudinal and experimental studies showing no association between sexual frequency and relationship satisfaction. Crucially, though, all prior research has utilized explicit reports of satisfaction, which reflect deliberative processes that may override the more automatic implications of phylogenetically older evolved preferences. Accordingly, capturing the implications of sexual frequency for relationship evaluations may require implicit measurements that bypass deliberative reasoning. Consistent with this idea, one cross-sectional and one 3-year study of newlywed couples revealed a positive association between sexual frequency and automatic partner evaluations but not explicit satisfaction. These findings highlight the importance of automatic measurements to understanding interpersonal relationships. © The Author(s) 2016.
Hicks, Lindsey L.; McNulty, James K.; Meltzer, Andrea L.; Olson, Michael A.
2016-01-01
Sex is crucial to reproduction, and thus humans likely evolved a strong predisposition to engage in sexual intercourse. Given that meeting interpersonal preferences tends to promote positive relationship evaluations, sex within a relationship should be positively associated with relationship satisfaction. Nevertheless, prior research has been inconclusive in demonstrating such a link, with longitudinal and experimental studies showing no association between sexual frequency and relationship satisfaction. Crucially, though, all prior research has utilized explicit reports of satisfaction, which reflect deliberative processes that may override the more automatic implications of phylogenetically older evolved preferences. Accordingly, capturing the implications of sexual frequency for relationship evaluations may require implicit measurements that bypass deliberative reasoning. Consistent with this idea, one cross-sectional and one three-year study of newlywed couples revealed a positive association between sexual frequency and automatic partner evaluations but not explicit satisfaction. These findings highlight the importance of automatic measurements to understanding interpersonal relationships. PMID:27084851
Schroeder, Susan A.; Cornicelli, Louis; Fulton, David C.; Merchant, Steven S.
2018-01-01
Although research has advanced methods for clarifying factors that relate to customer satisfaction, these methods have not been embraced by leisure researchers. Using results from a survey of wild turkey hunters, we applied traditional and revised importance-performance analysis (IPA/RIPA), importance-grid analysis (IGA), and penalty-reward-contrast analysis (PRCA) to examine how activity-specific factors influenced satisfaction. Results suggested differences between the explicit and implicit importance of factors related to turkey hunting. Opportunities to kill turkeys were explicitly rated as less important than seeing, hearing, or calling in turkeys, but opportunities for harvest had relatively higher levels of implicit importance. PRCA identified “calling turkeys in” and “hearing gobbling” as minimum requirements that cause dissatisfaction if not fulfilled, but do not provide satisfaction, whereas “seeing turkeys” and an “opportunity to kill a turkey” related to both satisfaction and dissatisfaction. RIPA, IGA, and PRCA could provide valuable insights about factors that may improve satisfaction for leisure participants.
Effects of Divided Attention at Retrieval on Conceptual Implicit Memory
Prull, Matthew W.; Lawless, Courtney; Marshall, Helen M.; Sherman, Annabella T. K.
2016-01-01
This study investigated whether conceptual implicit memory is sensitive to process-specific interference at the time of retrieval. Participants performed the implicit memory test of category exemplar generation (CEG; Experiments 1 and 3), or the matched explicit memory test of category-cued recall (Experiment 2), both of which are conceptually driven memory tasks, under one of two divided attention (DA) conditions in which participants simultaneously performed a distracting task. The distracting task was either syllable judgments (dissimilar processes), or semantic judgments (similar processes) on unrelated words. Compared to full attention (FA) in which no distracting task was performed, DA had no effect on CEG priming overall, but reduced category-cued recall similarly regardless of distractor task. Analyses of distractor task performance also revealed differences between implicit and explicit memory retrieval. The evidence suggests that, whereas explicit memory retrieval requires attentional resources and is disrupted by semantic and phonological distracting tasks, conceptual implicit memory is automatic and unaffected even when distractor and memory tasks involve similar processes. PMID:26834678
Analogy motor learning by young children: a study of rope skipping.
Tse, Andy C Y; Fong, Shirley S M; Wong, Thomson W L; Masters, Rich
2017-03-01
Research in psychology suggests that instruction by analogy can enhance acquisition and understanding of knowledge. Limited research has tested this proposition in motor learning by children. The purpose of the present study was to examine the feasibility of analogy instructions for motor skill acquisition by children. Thirty-two children were randomly assigned to one of two instruction protocols, analogy or explicit instruction, for two weeks of rope-skipping training. Each participant completed a pretest (Lesson 1), three practice sessions (Lessons 2-4), and a posttest and secondary-task test (Lesson 5). Children in the analogy protocol displayed better rope-skipping performance than those in the explicit instruction protocol (p < .001). Moreover, a cognitive secondary-task test indicated that children in the analogy protocol performed more effectively, whereas children in the explicit protocol displayed decrements in performance. Analogy learning may help children acquire complex motor skills and has potential benefits related to reduced cognitive processing requirements.
Agriculture and natural resources in a changing world - the role of irrigation
NASA Astrophysics Data System (ADS)
Sauer, T.; Havlík, P.; Schneider, U. A.; Kindermann, G.; Obersteiner, M.
2009-04-01
Fertile land and fresh water constitute two of the most fundamental resources for food production. These resources are affected by environmental, political, economic, and technical developments. Regional impacts may transmit to the world through increased trade. With a global forest and agricultural sector model, we quantify the impacts of increased demand for food due to population growth and economic development on potential land and water use. In particular, we investigate producer adaptation regarding crop and irrigation choice, agricultural market adjustments, and changes in the values of land and water. Against the background of resource sustainability and food security, this study integrates the spatial and operational heterogeneity of irrigation management into a global land use model. It represents a first large-scale assessment of agricultural water use under explicit consideration of alternative irrigation options in their particular biophysical, economic, and technical context, accounting for international trade, motivation-based farming, and quantified aggregated impacts on land scarcity, water scarcity, and food supply. The inclusion of technical and economic aspects of irrigation choice into an integrated land use modeling framework provides new insights into the interdisciplinary trade-offs between determinants of global land use change. Agricultural responses to population and economic growth include considerable increases in irrigated area and agricultural water use, but reductions in the average water intensity. Different irrigation systems are preferred under different exogenous biophysical and socioeconomic conditions. Neglecting these adaptations would bias estimates of the burden that development places on land and water scarcity. Without technical progress in agriculture, predicted population and income levels for 2030 would require substantial price adjustments for land, water, and food to equilibrate supply and demand.
[Internet Addiction, Suicidality and Non-Suicidal Self-Harming Behavior - A Systematic Review].
Steinbüchel, Toni Andreas; Herpertz, Stephan; Külpmann, Ina; Kehyayan, Aram; Dieris-Hirche, Jan; Te Wildt, Bert Theodor
2017-11-23
Background Internet addiction (IA) is associated with a high rate of co-morbid mental disorders, especially depression, anxiety disorders, ADHD and personality disorders, and with a considerable level of psychological strain. In terms of risk assessment, the present work reviews the current research literature on suicidal behavior and non-suicidal self-injurious behavior (NSSI). Methods We performed a systematic literature search in 14 databases at title and abstract level for the most common keywords for IA, NSSI and suicidality. After removal of duplicates, 2334 articles remained. They were filtered according to inclusion and exclusion criteria. We identified studies that examined the relationship between IA, NSSI and suicidality, assessed by validated psychometric instruments. This allowed a total of 15 studies to be included. Results The relationship between IA and suicidality was examined in 10 studies, four studies examined the relationship of IA, suicidality, and NSSI, and one study focused exclusively on IA and NSSI. All studies showed higher prevalences of NSSI and suicidality, respectively, in subjects with IA compared to subjects without IA, with point prevalence varying considerably between 1.6% and 18.7%. Discussion The results of the included publications suggest that Internet addiction is associated with an increased rate of non-suicidal self-harming behavior and increased suicidality, with suicidal ideation being more closely related to IA than suicidal actions. In order to develop a better understanding of causal relationships between IA, NSSI and suicidality, further longitudinal studies are required. Conclusion Against the background of the presented studies, NSSI and suicidality need to be explicitly addressed in the assessment and treatment of IA patients. © Georg Thieme Verlag KG Stuttgart · New York.
Connectivity, biodiversity conservation and the design of marine reserve networks for coral reefs
NASA Astrophysics Data System (ADS)
Almany, G. R.; Connolly, S. R.; Heath, D. D.; Hogan, J. D.; Jones, G. P.; McCook, L. J.; Mills, M.; Pressey, R. L.; Williamson, D. H.
2009-06-01
Networks of no-take reserves are important for protecting coral reef biodiversity from climate change and other human impacts. Ensuring that reserve populations are connected to each other and non-reserve populations by larval dispersal allows for recovery from disturbance and is a key aspect of resilience. In general, connectivity between reserves should increase as the distance between them decreases. However, enhancing connectivity may often trade off against a network’s ability to representatively sample the system’s natural variability. This “representation” objective is typically measured in terms of species richness or diversity of habitats, but has other important elements (e.g., minimizing the risk that multiple reserves will be impacted by catastrophic events). Such representation objectives tend to be better achieved as reserves become more widely spaced. Thus, optimizing the location, size and spacing of reserves requires both an understanding of larval dispersal and explicit consideration of how well the network represents the broader system; indeed, the lack of an integrated theory for optimizing tradeoffs between connectivity and representation objectives has inhibited the incorporation of connectivity into reserve selection algorithms. This article addresses these issues by (1) updating general recommendations for the location, size and spacing of reserves based on emerging data on larval dispersal in corals and reef fishes, and on considerations for maintaining genetic diversity; (2) using a spatial analysis of the Great Barrier Reef Marine Park to examine potential tradeoffs between connectivity and representation of biodiversity and (3) describing a framework for incorporating environmental fluctuations into the conceptualization of the tradeoff between connectivity and representation, and that expresses both in a common, demographically meaningful currency, thus making optimization possible.
16 CFR 1051.5 - Requirements and recommendations for petitions.
Code of Federal Regulations, 2010 CFR
2010-01-01
... injury data; or a research study); and (5) Contain an explicit request to initiate Commission rulemaking... hazardous product; and (3) Supply or reference any known documentation, engineering studies, technical studies, reports of injuries, medical findings, legal analyses, economic analyses and environmental impact...
16 CFR 1051.5 - Requirements and recommendations for petitions.
Code of Federal Regulations, 2011 CFR
2011-01-01
... injury data; or a research study); and (5) Contain an explicit request to initiate Commission rulemaking... hazardous product; and (3) Supply or reference any known documentation, engineering studies, technical studies, reports of injuries, medical findings, legal analyses, economic analyses and environmental impact...
Evolutionary Software Development (Developpement Evolutionnaire de Logiciels)
2008-08-01
development processes. While this may be true, frequently it is not. MIL-STD-498 was explicitly introduced to encourage iterative development; ISO/IEC 12207 was carefully worded not to prohibit iterative development. Yet both standards were widely interpreted as requiring waterfall development, as
Teaching Multiplication with Regrouping to Students with Learning Disabilities
ERIC Educational Resources Information Center
Flores, Margaret M.; Hinton, Vanessa M.; Schweck, Kelly B.
2014-01-01
The Common Core Standards require demonstration of conceptual knowledge of numbers, operations, and relations between mathematical concepts. Supplemental instruction should explicitly guide students with specific learning disabilities (SLD) in these skills. In this article, we illustrate implementation of the concrete-representational-abstract…